Developing Event-driven Applications with Apache Kafka and Red Hat AMQ Streams
Abstract
| Goal | Review tasks from Developing Event-driven Applications with Apache Kafka and Red Hat AMQ Streams |
| Objectives | |
| Sections | |
| Lab | |
Comprehensive Review of AD482 |
After completing this section, you should be able to demonstrate knowledge and skills learned in Developing Event-driven Applications with Apache Kafka and Red Hat AMQ Streams.
Before beginning the comprehensive review for this course, you should be comfortable with the topics covered in each chapter.
You can refer to earlier sections in the textbook for extra study.
Describe the principles of event-driven applications.

- Describe event-driven applications.
- Build applications with basic read and write messaging capabilities.
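Building an application with basic read and write capabilities can be sketched with the Kafka Java client. This is a minimal illustration, not the course's reference solution: the broker address `localhost:9092`, the topic name `orders`, and the String keys and values are all assumptions.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class BasicMessaging {
    public static void main(String[] args) {
        // Producer: write one event to the (hypothetical) "orders" topic.
        Properties prodProps = new Properties();
        prodProps.put("bootstrap.servers", "localhost:9092");
        prodProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        prodProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(prodProps)) {
            producer.send(new ProducerRecord<>("orders", "order-1", "{\"total\": 42}"));
        }

        // Consumer: read events back from the same topic.
        Properties consProps = new Properties();
        consProps.put("bootstrap.servers", "localhost:9092");
        consProps.put("group.id", "review-app");
        consProps.put("auto.offset.reset", "earliest");
        consProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consProps)) {
            consumer.subscribe(List.of("orders"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("%s -> %s%n", record.key(), record.value());
            }
        }
    }
}
```

The example requires the `kafka-clients` dependency and a running broker; in this course the broker is provided by an AMQ Streams cluster.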
- Describe Kafka and AMQ Streams' history and use cases.
- Describe the architecture of Kafka and AMQ Streams.
- Create a topic with Kafka.
- Send data to topics with producers.
- Consume data from topics.
- Define data contracts and integrate schema registries.
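The topic, producer, and consumer tasks above can also be exercised with the command-line tools shipped with Kafka. A hedged sketch, assuming a broker at `localhost:9092` and a topic named `orders`; the partition and replication counts are illustrative:

```shell
# Create a topic
bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic orders --partitions 3 --replication-factor 1

# Produce test messages interactively from stdin
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic orders

# Consume them back from the beginning of the topic
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic orders --from-beginning
```
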
Leverage the Streams API to create data streaming applications.

- Process a basic data stream with Kafka Streams.
- Describe Kafka Streams architecture and concepts.
- Use KTables, KStreams, and other DSL objects to manage data streams.
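A small Kafka Streams topology shows how KStreams and KTables relate. This is a sketch under assumptions: the topic names `orders` and `order-counts`, the application ID, and the broker address are all illustrative.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class OrderCountTopology {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // KStream: an unbounded, record-by-record view of the input topic.
        KStream<String, String> orders = builder.stream("orders");

        // KTable: a changelog view; here, a running count of orders per key.
        KTable<String, Long> counts = orders
                .groupByKey()
                .count();

        // Emit each count update to an output topic.
        counts.toStream().to("order-counts", Produced.with(Serdes.String(), Serdes.Long()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-counter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

The `kafka-streams` dependency and a running broker are required; the shutdown hook closes the topology cleanly when the JVM exits.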
Create asynchronous services using the event collaboration pattern.

- Apply stateless transformations to event streams.
- Apply stateful transformations to event streams.
- Repartition an event stream to scale Streams applications.
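The stateless/stateful distinction can be sketched with plain Java collections (the `Order` record and field names are hypothetical). A stateless transformation handles each record on its own; a stateful one depends on previously seen records. In Kafka Streams, grouping by a field other than the record key is what forces a repartition topic, so that equal keys land on the same stream task.

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

public class TransformationSketch {
    record Order(String orderId, String region, double total) {}

    // Stateless: per-record logic, like KStream#filter and KStream#mapValues.
    static List<String> statelessFilterMap(List<Order> events) {
        return events.stream()
                .filter(o -> o.total() > 20.0)
                .map(Order::orderId)
                .collect(Collectors.toList());
    }

    // Stateful: an aggregation across records, like groupBy followed by count.
    // Grouping by "region" (not the record key) is the step that would trigger
    // a repartition in a real Streams topology.
    static Map<String, Long> statefulCount(List<Order> events) {
        return events.stream()
                .collect(Collectors.groupingBy(Order::region, TreeMap::new,
                        Collectors.counting()));
    }

    public static void main(String[] args) {
        List<Order> events = List.of(
                new Order("o1", "emea", 40.0),
                new Order("o2", "apac", 15.0),
                new Order("o3", "emea", 90.0));
        System.out.println(statelessFilterMap(events)); // [o1, o3]
        System.out.println(statefulCount(events));      // {apac=1, emea=2}
    }
}
```
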
Connect data systems and react to data changes using Kafka Connect and Debezium.

- Create a Kafka Connect cluster.
- Create a Kafka Connect connector.
- Apply Single Message Transformations (SMTs) with Kafka Connect.
- Capture change event data with Debezium.
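An SMT rewrites each record as it passes through a Connect pipeline. The idea can be sketched with a plain function over a map-shaped record; the field names are hypothetical, and a real SMT implements Connect's `Transformation` interface rather than a bare method. The two behaviors mimicked here correspond to the built-in `InsertField` and `MaskField` transformations.

```java
import java.util.HashMap;
import java.util.Map;

public class SmtSketch {
    // Chains two common SMT behaviors: inserting a static field and
    // masking a sensitive one, applied to every record that flows through.
    static Map<String, Object> apply(Map<String, Object> record) {
        Map<String, Object> out = new HashMap<>(record);
        out.put("source_system", "legacy-db");  // InsertField-style
        if (out.containsKey("card_number")) {
            out.put("card_number", "****");     // MaskField-style
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> in = new HashMap<>();
        in.put("order_id", "o1");
        in.put("card_number", "4111111111111111");
        System.out.println(apply(in));
    }
}
```

Debezium fits in upstream of such transformations: it emits change events wrapped in an envelope with `before`, `after`, and `op` fields, which SMTs can then reshape before the records reach the sink.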
Handle common problems in Kafka and AMQ Streams applications.

- Handle out-of-order or late events.
- Configure producer retries and idempotence.
- Prevent duplication and data loss.
- Implement test cases in event-driven applications.
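The retry, idempotence, and loss-prevention objectives above come together in producer configuration. The keys below are real Kafka producer settings; the specific values are typical choices for a reliability-focused producer, not universal defaults.

```java
import java.util.Properties;

public class ReliableProducerConfig {
    static Properties reliableProducerProps(String bootstrap) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        // Idempotence assigns the producer an ID and per-partition sequence
        // numbers, so broker-side retries cannot introduce duplicates.
        props.put("enable.idempotence", "true");
        // Wait for all in-sync replicas to acknowledge, preventing data loss
        // if the partition leader fails right after the write.
        props.put("acks", "all");
        // Retry transient failures; idempotence keeps retries duplicate-free.
        props.put("retries", Integer.toString(Integer.MAX_VALUE));
        // Upper bound on the total time spent sending (including retries).
        props.put("delivery.timeout.ms", "120000");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(reliableProducerProps("localhost:9092"));
    }
}
```
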