Amazon DynamoDB Streams
- DynamoDB Streams provides a time-ordered sequence of item-level changes made to data in a table.
- DynamoDB Streams stores the data for the last 24 hours, after which it is erased.
- DynamoDB Streams maintains an ordered sequence of events per item; however, the sequence across items is not maintained.
- Example
- For example, suppose that you have a DynamoDB table tracking high scores for a game and that each item in the table represents an individual player. If you make the following three updates in this order:
- Update 1: Change Player 1’s high score to 100 points
- Update 2: Change Player 2’s high score to 50 points
- Update 3: Change Player 1’s high score to 125 points
- DynamoDB Streams will maintain the order of the Player 1 score events. However, it does not maintain order across players, so the Player 2 score event is not guaranteed to appear between the two Player 1 events.
- Applications can access this log and view the data items as they appeared before and after they were modified, in near-real time.
- DynamoDB Streams APIs help developers consume updates and receive the item-level data before and after items are changed.
- Streams allow reads at up to twice the rate of the provisioned write capacity of the DynamoDB table.
- Streams have to be enabled on a per-table basis. When enabled on a table, DynamoDB captures information about every modification to data items in the table.
- Streams support Encryption at rest to encrypt the data.
- Streams are designed for No Duplicates so that every update made to the table will be represented exactly once in the stream.
- Streams write stream records in near-real time so that applications can consume these streams and take action based on the contents.
- Streams can be used for multi-region replication to keep other data stores up-to-date with the latest changes to DynamoDB or to take actions based on the changes made to the table
- Stream records can be processed using Kinesis Data Streams, AWS Lambda, or a KCL application, as sketched below.
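A minimal sketch of enabling a stream and reading its change records with the AWS SDK for Python (boto3). The table name GameScores is a hypothetical placeholder; in practice you would wait for the stream to become active and keep paging through shard iterators rather than reading each shard once.

```python
import boto3

TABLE_NAME = "GameScores"  # hypothetical table name

dynamodb = boto3.client("dynamodb")
streams = boto3.client("dynamodbstreams")

# Enable a stream on an existing table (streams are a per-table setting).
# NEW_AND_OLD_IMAGES captures the item as it appeared before and after each change.
resp = dynamodb.update_table(
    TableName=TABLE_NAME,
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
stream_arn = resp["TableDescription"]["LatestStreamArn"]

# Walk the stream's shards and read the change records.
description = streams.describe_stream(StreamArn=stream_arn)["StreamDescription"]
for shard in description["Shards"]:
    iterator = streams.get_shard_iterator(
        StreamArn=stream_arn,
        ShardId=shard["ShardId"],
        ShardIteratorType="TRIM_HORIZON",  # oldest record still retained (24 hours)
    )["ShardIterator"]

    for record in streams.get_records(ShardIterator=iterator)["Records"]:
        # eventName is INSERT, MODIFY, or REMOVE; OldImage/NewImage hold the item
        # before and after the change, depending on the stream view type.
        change = record["dynamodb"]
        print(record["eventName"], change.get("OldImage"), change.get("NewImage"))
```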
AWS Certification Exam Practice Questions
- Questions are collected from the Internet and the answers are marked as per my knowledge and understanding (which might differ from yours).
- AWS services are updated every day and both the answers and questions might be outdated soon, so research accordingly.
- AWS exam questions are not updated to keep pace with AWS updates, so even if the underlying feature has changed, the question might not be updated.
- Open to further feedback, discussion and correction.
- An application currently writes a large number of records to a DynamoDB table in one region. There is a requirement for a secondary application to retrieve new records written to the DynamoDB table every 2 hours and process the updates accordingly. Which of the following is an ideal way to ensure that the secondary application gets the relevant changes from the DynamoDB table?
- Insert a timestamp for each record and then scan the entire table for the timestamp as per the last 2 hours.
- Create another DynamoDB table with the records modified in the last 2 hours.
- Use DynamoDB Streams to monitor the changes in the DynamoDB table.
- Transfer records to S3 which were modified in the last 2 hours.
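For the Streams-based approach in the third option, the secondary application consumes only the new change records instead of scanning the whole table. A minimal sketch of a Lambda handler attached to the table's stream, assuming the stream view type includes the new item image; process_update is a hypothetical placeholder for the secondary application's logic.

```python
def process_update(new_image):
    # Placeholder for the secondary application's processing logic.
    print("processing", new_image)

def lambda_handler(event, context):
    """Invoked by the DynamoDB stream event source mapping with a batch of change records."""
    for record in event["Records"]:
        change = record["dynamodb"]
        if record["eventName"] in ("INSERT", "MODIFY"):
            # NewImage holds the item after the change
            # (requires NEW_IMAGE or NEW_AND_OLD_IMAGES view type).
            process_update(change["NewImage"])
```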