
Data Ingestion and Transformation I

Explore how to implement reliable data ingestion and transformation pipelines using AWS services including DynamoDB Streams, AWS Lambda, Amazon Kinesis Data Streams, and AWS Glue. Learn techniques for event-driven processing, scheduling Glue crawlers, and optimizing real-time data workloads. This lesson prepares you to manage near real-time data updates, set up automated workflows, and reduce processing latency in AWS environments.

Question 1

A retail company captures item-level changes from an Amazon DynamoDB table that stores order information. The company needs to process these changes in near real time to update inventory dashboards and send notifications. The data engineer must configure the ingestion pipeline to capture changes reliably.

Which two combinations of services correctly capture and process DynamoDB item-level changes? (Select two.)

A. Enable DynamoDB Streams on the table and configure an AWS Lambda function as a stream consumer to process changes.

B. Configure Amazon EventBridge to capture DynamoDB item-level changes and trigger an AWS Lambda function for processing.

C. Enable Amazon Kinesis Data Streams as the change data capture destination for the DynamoDB table and use an AWS Lambda function to consume records from the Kinesis stream.

D. Use AWS Database Migration Service (DMS) with DynamoDB as the source and ...
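To make option A concrete, here is a minimal sketch of an AWS Lambda handler consuming DynamoDB Streams records. The event shape follows the standard DynamoDB Streams payload delivered to Lambda; the attribute names (`OrderId`, `Quantity`) and the `update_dashboard` helper are hypothetical stand-ins for this retail scenario, not part of any AWS API.

```python
def update_dashboard(order_id, quantity):
    """Hypothetical stand-in for the inventory-dashboard update call."""
    print(f"dashboard: order {order_id} -> qty {quantity}")


def handler(event, context):
    """Process DynamoDB Streams records delivered by the event source mapping."""
    processed = []
    for record in event.get("Records", []):
        # Only INSERT and MODIFY events carry a NewImage; REMOVE does not.
        if record.get("eventName") not in ("INSERT", "MODIFY"):
            continue
        new_image = record["dynamodb"].get("NewImage", {})
        # DynamoDB stream images use typed attribute values, e.g. {"S": "..."}.
        order_id = new_image.get("OrderId", {}).get("S")
        quantity = int(new_image.get("Quantity", {}).get("N", "0"))
        update_dashboard(order_id, quantity)
        processed.append(order_id)
    return {"processed": processed}
```

The same handler code works for option C as well, except that Kinesis records arrive base64-encoded under `record["kinesis"]["data"]` and must be decoded and parsed before use.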