Lab: Comprehensive Review of AD482

In this lab, you will use Kafka, Kafka Streams, and Kafka Connect to build a streaming application.

You are hired by a company, SmartGardeners Inc. They want you to create an application that uses their sensor data and performs the relevant calculations.

The company already uses a legacy CRUD application with some data stored in a relational database. This relational database includes metadata for garden sensors registered in the system. You must use this metadata, as well as real-time sensor data, to build a new streaming application.

The following are the main components of the system.

  • PostgreSQL database. A database that stores the static sensor metadata. You must extract the sensor metadata from this database, and write the data into the garden-sensors topic.

  • garden-sensors. A service that reads real-time measurements from garden sensors and produces records to the garden-sensor-measurements and garden-sensor-measurements-repl topics. You must implement a producer that produces measurements to the garden-sensor-measurements topic.

  • garden-streams. A service that reads data from the garden-sensors and garden-sensor-measurements-repl topics to process and aggregate the data. You must write a Kafka Streams topology to process the data and write the results into multiple Kafka topics.

  • garden-back. A back-end service that reads events produced by the garden-streams and garden-sensors services, and makes the data available to the front end. You must implement the consumers that read records from the garden-sensor-measurements and garden-sensor-measurements-enriched topics.

  • garden-front. A front-end web application that displays the data exposed by the garden-back service.

Outcomes

You should be able to:

  • Apply a Kafka Connect connector to gather data from an external database and write the data into a Kafka topic.

  • Create Kafka producers and consumers.

  • Serialize and deserialize records by using Service Registry and the Avro format.

  • Use Kafka Streams to split a stream of records into multiple streams.

  • Enrich a stream with static data by using Kafka Streams.

  • Aggregate data from a stream within a specific time window by using Kafka Streams.

To perform this exercise, ensure you have the following:

  • Access to a configured and running OpenShift cluster.

  • Access to an installed and running Kafka instance in the OpenShift cluster.

  • A configured Python virtual environment, including the grading scripts for this course.

  • The OpenShift CLI (oc) installed.

  • The JDK installed.

From your workspace directory, activate the Python virtual environment.

[user@host AD482]$ source .venv/bin/activate

Important

On Windows, use the Activate.ps1 script to activate your Python virtual environment.

PS C:\Users\user\AD482> ./.venv/Scripts/Activate.ps1

Use the lab command to start the scenario for this exercise.

(.venv) [user@host AD482]$ lab start comprehensive-review

Copy the Service Registry URL value. You will use this value to connect to the service registry.

The lab command copies the exercise files from the AD482-apps/comprehensive-review/apps directory in your local Git repository into the comprehensive-review directory at the root of your workspace.

Procedure 7.1. Instructions

  1. Create a PostgreSQL Debezium source connector by applying the AD482/comprehensive-review/resources/sensors-connector.yaml file. This file defines a KafkaConnector custom resource that reads sensor metadata from the PostgreSQL database, and writes the records into the garden-sensors topic.

    1. Change to the comprehensive-review directory.

      (.venv) [user@host AD482]$ cd comprehensive-review
    2. Use the oc create command to create the source connector.

      (.venv) [user@host comprehensive-review]$ oc create \
       -f resources/sensors-connector.yaml
      kafkaconnector.kafka.strimzi.io/sensors-connector created
  2. Create a producer in the garden-sensors application that gathers sensor measurements and produces them into the garden-sensor-measurements topic. The producer must send measurement events every five seconds, and use the Avro format.

    Before running the application, you must generate the com.redhat.training.gardens.event.SensorMeasurementTaken and com.redhat.training.gardens.event.SensorMeasurementType classes from the provided Avro schema. From the comprehensive-review/garden-sensors directory, run the ./mvnw generate-resources command to generate the relevant classes.

    After implementing the producer, start the garden-sensors application in a separate terminal window. From the comprehensive-review/garden-sensors directory, run the ./mvnw package quarkus:dev command. The command makes the application available at http://localhost:8080.

    Note that the garden-sensors service also includes an additional producer to replicate measurements to the garden-sensor-measurements-repl topic, without using Avro. This is necessary for the garden-streams service, which does not use Avro to deserialize sensor measurements.
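
    You do not need to implement this replication. As a reference only, the following is a minimal, hypothetical sketch of such a pass-through processor. The channel names and the class name are illustrative assumptions, and the provided implementation is the SensorMeasurementReplicatorService class; the channels would be mapped to the Kafka topics and to the appropriate deserializer and serializer in application.properties.

      import javax.enterprise.context.ApplicationScoped;

      import org.eclipse.microprofile.reactive.messaging.Incoming;
      import org.eclipse.microprofile.reactive.messaging.Outgoing;

      import io.smallrye.reactive.messaging.kafka.Record;

      import com.redhat.training.gardens.event.SensorMeasurementTaken;

      // Hypothetical sketch: consume each measurement event and re-emit it with
      // the sensor ID as the record key. The serialization format of the copy is
      // decided by the outgoing channel configuration, not by this method.
      @ApplicationScoped
      public class MeasurementReplicatorSketch {

          @Incoming("garden-sensor-measurements-in")        // hypothetical channel name
          @Outgoing("garden-sensor-measurements-repl-out")  // hypothetical channel name
          public Record<Integer, SensorMeasurementTaken> replicate(SensorMeasurementTaken event) {
              return Record.of(event.getSensorId(), event);
          }
      }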

    1. Navigate to the garden-sensors application directory.

      (.venv) [user@host comprehensive-review]$ cd garden-sensors
    2. Open the src/main/resources/application.properties file and replace YOUR_SERVICE_REGISTRY_URL with the Service Registry URL provided in the lab start command output.

    3. In the same file, add the required configuration for an outgoing channel named garden-sensor-measurements-out. Use the smallrye-kafka connector, and set garden-sensor-measurements as the channel topic. The configuration should look like the following.

      ...configuration omitted...
      # TODO: configure an outgoing channel named "garden-sensor-measurements-out" for "garden-sensor-measurements" Kafka topic
      mp.messaging.outgoing.garden-sensor-measurements-out.apicurio.registry.auto-register = true
      mp.messaging.outgoing.garden-sensor-measurements-out.connector = smallrye-kafka
      mp.messaging.outgoing.garden-sensor-measurements-out.topic = garden-sensor-measurements
      ...configuration omitted...
    4. Generate the com.redhat.training.gardens.event.SensorMeasurementTaken and com.redhat.training.gardens.event.SensorMeasurementType classes from the provided Avro schema. Run the following command to generate the classes.

      (.venv) [user@host garden-sensors]$ ./mvnw generate-resources
      ...output omitted...
      [INFO] ------------------------------------------------------------------------
      [INFO] BUILD SUCCESS
      ...output omitted...
    5. Open the com.redhat.training.gardens.service.SensorMeasurementService class. Add a method that produces SensorMeasurementTaken event data to the garden-sensor-measurements-out channel. Use the generateEvent method to create a SensorMeasurementTaken event from sensor data.

      // TODO: Implement the Kafka producer
      @Outgoing("garden-sensor-measurements-out")
      @Broadcast
      public Multi<Record<Integer, SensorMeasurementTaken>> measure() {
          return Multi.createFrom().ticks().every(Duration.ofMillis(5000))
              .onOverflow().drop()
              .map(tick -> {
                  SensorMeasurementTaken event = generateEvent(
                          sensorService.getSensor()
                  );
                  LOGGER.info("Sensor measurement taken: " + event);
                  return Record.of(event.getSensorId(), event);
              });
      }
    6. Run the application. You should see the garden-sensors application logging the output of the sensor measurement event data and its replication to another topic.

      [user@host garden-sensors]$ ./mvnw package quarkus:dev
      ...output omitted...
      2021-10-19 13:56:20,022 INFO  [com.red.tra.gar.ser.SensorMeasurementService] (executor-thread-0) Sensor measurement taken: {"sensorId": 1, "value": 0.061971520044841855, "timestamp": 1634640980022, "type": "HUMIDITY"}
      2021-10-19 13:56:20,563 INFO  [com.red.tra.gar.ser.SensorMeasurementReplicatorService] (vert.x-eventloop-thread-0) Sensor measurement event replicated: {"sensorId": 1, "value": 0.061971520044841855, "timestamp": 1634640980022, "type": "HUMIDITY"}
      2021-10-19 13:56:25,014 INFO  [com.red.tra.gar.ser.SensorMeasurementService] (executor-thread-0) Sensor measurement taken: {"sensorId": 1, "value": 0.4865613400995872, "timestamp": 1634640985014, "type": "HUMIDITY"}
      2021-10-19 13:56:25,177 INFO  [com.red.tra.gar.ser.SensorMeasurementReplicatorService] (vert.x-eventloop-thread-0) Sensor measurement event replicated: {"sensorId": 1, "value": 0.4865613400995872, "timestamp": 1634640985014, "type": "HUMIDITY"}
      ...output omitted...

      Leave this terminal window open.

  3. Create two consumers in the garden-back application. The garden-back application must consume the raw sensor measurements produced by the garden-sensors service, and the enriched sensor measurements produced by the garden-streams service.

    The consumer implementations must have the following configurations.

    • Consumed Kafka topic: garden-sensor-measurements
      Incoming channel name: garden-sensor-measurements-raw
      Outgoing in-memory channel name: in-memory-garden-sensor-measurements-raw
      Deserialization: Automatic (Avro)

    • Consumed Kafka topic: garden-sensor-measurements-enriched
      Incoming channel name: garden-sensor-measurements-enriched
      Outgoing in-memory channel name: in-memory-garden-sensor-measurements-enriched
      Deserialization: Custom

    For the custom deserialization, you must use the com.redhat.training.gardens.serde.SensorMeasurementEnrichedDeserializer deserializer class.
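
    The deserializer class is already provided in the project. As a reference for how such a class is typically written in Quarkus, the following is a minimal sketch that extends the ObjectMapperDeserializer helper; the exact contents of the provided class, and the package of the SensorMeasurementEnriched model, might differ.

      import io.quarkus.kafka.client.serialization.ObjectMapperDeserializer;

      import com.redhat.training.gardens.model.SensorMeasurementEnriched;

      // Minimal sketch: a Jackson-based deserializer only needs to declare the
      // class that JSON record values are mapped to.
      public class SensorMeasurementEnrichedDeserializer
              extends ObjectMapperDeserializer<SensorMeasurementEnriched> {

          public SensorMeasurementEnrichedDeserializer() {
              super(SensorMeasurementEnriched.class);
          }
      }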

    The garden-back service uses enriched sensor measurements for exposing aggregated data in the front-end service. The garden-streams service produces the enriched sensor measurements.

    The garden-back service also exposes the raw sensor measurements for debugging purposes. The garden-sensors service produces the raw sensor measurements.

    Before running the application, you must generate the com.redhat.training.gardens.event.SensorMeasurementTaken and com.redhat.training.gardens.event.SensorMeasurementType classes from the provided Avro schema. From the comprehensive-review/garden-back directory, run the ./mvnw generate-resources command to generate the relevant classes.

    After implementing the consumers, start the garden-back service in a separate terminal window. From the comprehensive-review/garden-back directory, run the ./mvnw package quarkus:dev command. The command makes the application available at http://localhost:8081.

    1. In a new terminal window, navigate to the garden-back application directory.

      [user@host AD482]$ cd comprehensive-review/garden-back
    2. In the src/main/resources/application.properties file, add the required configuration for an incoming channel named garden-sensor-measurements-enriched. Use the smallrye-kafka connector, and set garden-sensor-measurements-enriched as the channel topic. The configuration should look like the following.

      # TODO: configure an incoming channel named "garden-sensor-measurements-enriched"
      mp.messaging.incoming.garden-sensor-measurements-enriched.connector = smallrye-kafka
      mp.messaging.incoming.garden-sensor-measurements-enriched.topic = garden-sensor-measurements-enriched
      mp.messaging.incoming.garden-sensor-measurements-enriched.value.deserializer = com.redhat.training.gardens.serde.SensorMeasurementEnrichedDeserializer
    3. In the same file, replace YOUR_SERVICE_REGISTRY_URL with your registry service URL.

    4. In the same file, add the required configuration for an incoming channel named garden-sensor-measurements-raw. Use the smallrye-kafka connector, and set garden-sensor-measurements as the channel topic. The configuration should look like the following.

      # TODO: configure an incoming channel named "garden-sensor-measurements-raw"
      mp.messaging.incoming.garden-sensor-measurements-raw.connector = smallrye-kafka
      mp.messaging.incoming.garden-sensor-measurements-raw.topic = garden-sensor-measurements
      mp.messaging.incoming.garden-sensor-measurements-raw.enable.auto.commit = false
      mp.messaging.incoming.garden-sensor-measurements-raw.auto.offset.reset = earliest
      mp.messaging.incoming.garden-sensor-measurements-raw.apicurio.registry.use-specific-avro-reader = true
    5. Generate the com.redhat.training.gardens.event.SensorMeasurementTaken and com.redhat.training.gardens.event.SensorMeasurementType classes from the provided Avro schema. Run the following command to generate the classes.

      (.venv) [user@host garden-back]$ ./mvnw generate-resources
      ...output omitted...
      [INFO] --- quarkus-maven-plugin:2.1.4.Final:generate-code (default) @ garden-back ---
      [INFO] ------------------------------------------------------------------------
      [INFO] BUILD SUCCESS
      ...output omitted...
    6. Open the com.redhat.training.gardens.rest.SensorResource class, and add a method that consumes SensorMeasurementEnriched data from the garden-sensor-measurements-enriched channel. This method must use an in-memory outgoing channel called in-memory-garden-sensor-measurements-enriched to expose the data via a REST endpoint.

      // TODO: Implement a Kafka consumer that returns "SensorMeasurementEnriched" data.
      //  Stream messages to an outgoing channel called "in-memory-garden-sensor-measurements-enriched"
      @Incoming("garden-sensor-measurements-enriched")
      @Outgoing("in-memory-garden-sensor-measurements-enriched")
      @Broadcast
      public SensorMeasurementEnriched
              consumeEnrichedSensorMeasurements(SensorMeasurementEnriched event) {
          return event;
      }
    7. In the same class, add a method that consumes SensorMeasurementTaken event data from the garden-sensor-measurements-raw channel. This method must use an in-memory outgoing channel called in-memory-garden-sensor-measurements-raw to expose the data via a REST endpoint. Use the createSensorMeasurementFromEvent method to create a SensorMeasurement object with the consumed SensorMeasurementTaken event data.

      // TODO: Implement a Kafka consumer that returns "SensorMeasurement" data.
      //  Stream messages to an outgoing channel called "in-memory-garden-sensor-measurements-raw"
      @Incoming("garden-sensor-measurements-raw")
      @Outgoing("in-memory-garden-sensor-measurements-raw")
      @Broadcast
      public SensorMeasurement consumeRawSensorMeasurements(SensorMeasurementTaken event)
              throws JsonProcessingException {
          SensorMeasurement sensorMeasurement = createSensorMeasurementFromEvent(event);
          return sensorMeasurement;
      }
    8. Run the garden-back service.

      [user@host garden-back]$ ./mvnw package quarkus:dev
      ...output omitted...
      2021-10-20 14:01:27,126 INFO  [io.quarkus] (Quarkus Main Thread) garden-back ... started in 4.112s. Listening on: http://localhost:8081
      ...output omitted...
    9. Open your web browser and navigate to http://localhost:8081/sensor/measurements/raw. Verify that the web page shows the streaming raw sensor measurement data continuously.

      ...output omitted...
      data: {"sensorId":4,"type":"TEMPERATURE",
      "value":38.58951966283255,"timestamp":1634683887382}
      
      data: {"sensorId":3,"type":"WIND",
      "value":8.610418216859195,"timestamp":1634683892380}
      
      data: {"sensorId":3,"type":"WIND",
      "value":14.241427507143873,"timestamp":1634683897380}
      
      data: {"sensorId":3,"type":"WIND",
      "value":12.273055290240634,"timestamp":1634683902384}
      ...output omitted...

      Important

      Certain web browsers might try to download the stream as a file. In that case, verify that the file contains raw sensor measurements.

      Alternatively, you can use a different browser or a tool such as curl.
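
      For example, a command similar to the following prints the event stream in the terminal. The -N option disables output buffering, so events are displayed as they arrive.

      [user@host AD482]$ curl -N http://localhost:8081/sensor/measurements/raw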

      Close the browser tab. Leave this terminal window open.

  4. Start the front-end application in a separate terminal window.

    From the comprehensive-review directory, run the python scripts/serve-frontend.py command.

    This script makes the front-end application available at http://localhost:8083. Ensure the Python virtual environment is activated before running the Python script.

    1. Open a new terminal window, and ensure you are in your workspace directory. Activate the Python virtual environment, navigate to the comprehensive-review directory, and run the front-end application.

      [user@host AD482]$ source .venv/bin/activate
      (.venv) [user@host AD482]$ cd comprehensive-review
      (.venv) [user@host comprehensive-review]$ python scripts/serve-frontend.py
      ...output omitted...

      Leave this terminal window open.

    2. Open your web browser and navigate to http://localhost:8083. Verify that the dashboard lists no garden data.

      Leave the browser tab open.

  5. In the garden-streams service, read the sensor metadata and sensor measurements streams, and join them to create a stream of enriched sensor measurements.

    The application already implements the following features:

    • The Debezium source connector writes sensor metadata in the garden-sensors topic.

    • The garden-sensors service writes sensor measurements in the garden-sensor-measurements-repl topic.

    In the com.redhat.training.gardens.GardenStreamsTopology class, you must implement the code to write a stream of SensorMeasurementEnriched records to the garden-sensor-measurements-enriched topic. You must load the sensor metadata into a table, read the sensor measurements stream, and join them according to the following table:

    Stream               Topic                                Key                  Value
    Sensors Stream       garden-sensors                       Sensor ID (Integer)  Sensor metadata (Sensor)
    Measurements Stream  garden-sensor-measurements-repl      Sensor ID (Integer)  Sensor measurement (SensorMeasurement)
    Enriched Stream      garden-sensor-measurements-enriched  Sensor ID (Integer)  Sensor measurement + metadata (SensorMeasurementEnriched)

    Finally, run the garden-streams application. You must skip tests, because the application is still not complete. Use the ./mvnw package quarkus:dev -DskipTests command.

    1. In a new terminal window, navigate to the garden-streams service directory.

      [user@host AD482]$ cd comprehensive-review/garden-streams
    2. Edit the com.redhat.training.gardens.GardenStreamsTopology class to load the Sensor records into a global table.

      // TODO: Read sensors
      GlobalKTable<Integer, Sensor> sensors = builder.globalTable(
          SENSORS_TOPIC,
          Consumed.with(Serdes.Integer(), sensorSerde));
    3. Read the SensorMeasurement records as a stream.

      // TODO: Read sensor measurements
      KStream<Integer, SensorMeasurement> sensorMeasurements = builder.stream(
          SENSOR_MEASUREMENTS_TOPIC,
          Consumed.with(Serdes.Integer(), sensorMeasurementSerde));
    4. Join the stream of sensor measurements with the sensor table. Use the SensorMeasurementEnriched class to create joint records.

      // TODO: Join measurements with sensor table
      KStream<Integer, SensorMeasurementEnriched> enrichedSensorMeasurements = sensorMeasurements
          .join(
              sensors,
              (sensorId, measurement) -> sensorId,
              (measurement, sensor) -> new SensorMeasurementEnriched(
                  measurement, sensor));
    5. Send the enriched sensor measurements to the garden-sensor-measurements-enriched topic.

      // TODO: Send enriched measurements to topic
      enrichedSensorMeasurements.to(
          ENRICHED_SENSOR_MEASUREMENTS_TOPIC,
          Produced.with(Serdes.Integer(), sensorMeasurementEnrichedSerde));
    6. Print the enrichedSensorMeasurements. This helps you to debug whether the topology is receiving and joining data.

      // TODO: Send enriched measurements to topic
      enrichedSensorMeasurements.to(
          ENRICHED_SENSOR_MEASUREMENTS_TOPIC,
          Produced.with(Serdes.Integer(), sensorMeasurementEnrichedSerde));
      
      enrichedSensorMeasurements.print(Printed.toSysOut());
    7. Run the garden-streams application. Note that you must use the -DskipTests parameter to skip the execution of unit tests, because the application is not complete yet.

      [user@host garden-streams]$ ./mvnw package quarkus:dev -DskipTests
      ...output omitted...
      [KSTREAM-LEFTJOIN-0000000005]: 4, com.redhat...SensorMeasurementEnriched@7627d833
      [KSTREAM-LEFTJOIN-0000000005]: 1, com.redhat...SensorMeasurementEnriched@62bd28b4

      Verify that the output displays SensorMeasurementEnriched records.

      Leave this terminal window open. The Quarkus dev mode hot-reloads the application if you make changes to the source code.

      Note

      The Quarkus hot-reload feature might not work well if your editor automatically saves changes. In this case, you can stop the application and run the ./mvnw quarkus:dev command again.

    8. Return to your browser tab and refresh the page. Verify that the Sensor Measurements table displays enriched sensor measurement data.

  6. In the garden-streams service, compute garden events based on sensor measurements to display them in the front end.

    In the GardenStreamsTopology class, implement the necessary transformations to split the enriched sensor measurements stream by type. Next, create three new streams of events by applying conditions to measurement values, as specified in the following table:

    SensorMeasurementType  Condition    (Key, Event)                        Topic
    TEMPERATURE            value < 5.0  (sensorId, LowTemperatureDetected)  garden-low-temperature-events
    HUMIDITY               value < 0.2  (sensorId, LowHumidityDetected)     garden-low-humidity-events
    WIND                   value > 10   (sensorId, StrongWindDetected)      garden-strong-wind-events

    Important

    If you experience problems after restarting the garden-streams service, then you might want to delete the Kafka Streams state directory. On Linux and macOS, this directory is located at /tmp/kafka-streams.
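
    For example, with the garden-streams application stopped, a command similar to the following removes the state directory on Linux and macOS:

      [user@host garden-streams]$ rm -rf /tmp/kafka-streams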

    1. Create the GardenStreamsTopology#processTemperature private method to produce LowTemperatureDetected events.

      // TODO: implement temperature processor
      private void processTemperature(KStream<Integer, SensorMeasurementEnriched> temperatureMeasurements) {
          temperatureMeasurements
              .filter((sensorId, measurement) -> measurement.value < LOW_TEMPERATURE_THRESHOLD_CELSIUS)
              .mapValues((measurement) -> new LowTemperatureDetected(measurement.gardenName, measurement.sensorId,
                      measurement.value, measurement.timestamp))
              .to(LOW_TEMPERATURE_EVENTS_TOPIC, Produced.with(Serdes.Integer(), lowTemperatureEventSerde));
      }
    2. Create the GardenStreamsTopology#processHumidity private method to produce LowHumidityDetected events.

      // TODO: implement humidity processor
      private void processHumidity(KStream<Integer, SensorMeasurementEnriched> humidityMeasurements) {
          humidityMeasurements
              .filter((sensorId, measurement) -> measurement.value < LOW_HUMIDITY_THRESHOLD_PERCENT)
              .mapValues((measurement) -> new LowHumidityDetected(measurement.gardenName, measurement.sensorId,
                      measurement.value, measurement.timestamp))
              .to(LOW_HUMIDITY_EVENTS_TOPIC, Produced.with(Serdes.Integer(), lowHumidityEventSerde));
      }
    3. Create the GardenStreamsTopology#processWind private method to produce StrongWindDetected events.

      // TODO: implement wind processor
      private void processWind(KStream<Integer, SensorMeasurementEnriched> windMeasurements) {
          windMeasurements
              .filter((sensorId, measurement) -> measurement.value > STRONG_WIND_THRESHOLD_MS)
              .mapValues((measurement) -> new StrongWindDetected(measurement.gardenName, measurement.sensorId,
                      measurement.value, measurement.timestamp))
              .to(STRONG_WIND_EVENTS_TOPIC, Produced.with(Serdes.Integer(), strongWindEventSerde));
      }
    4. Split the enriched sensor measurements stream by type. The type is one of the values of the SensorMeasurementType enum.

      // TODO: split stream
      enrichedSensorMeasurements
          .split()
              .branch((sensorId, measurement) -> measurement.type.equals(SensorMeasurementType.TEMPERATURE),
                      Branched.withConsumer(this::processTemperature))
              .branch((sensorId, measurement) -> measurement.type.equals(SensorMeasurementType.HUMIDITY),
                      Branched.withConsumer(this::processHumidity))
              .branch((sensorId, measurement) -> measurement.type.equals(SensorMeasurementType.WIND),
                      Branched.withConsumer(this::processWind));
    5. Wait until Quarkus restarts the application and the Kafka Streams engine switches to the RUNNING state. Alternatively, you can terminate the command and rerun the application.

      2021-09-14 14:39:51,336 INFO  [...] State transition from REBALANCING to RUNNING

      Return to your browser tab and refresh the page. Verify that the dashboard displays events in the Garden Events table.

  7. In the garden-streams service, compute the last-minute status of each garden to display the values in the front end.

    The application already implements the following features:

    • The garden-back application exposes the /garden/statuses endpoint, which reads the garden-status-events topic and exposes a stream of server-sent events. Record values in the garden-status-events topic must be instances of GardenStatus. Each of these objects contains the garden name, the last reported measurements, and their trend. A sketch of such an endpoint appears after this list.

    • The garden-front application opens a connection to the /garden/statuses endpoint and displays the received values.
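
    For reference only, the following is a minimal, hypothetical sketch of how such an endpoint can expose a channel of GardenStatus values as server-sent events. The channel name, the GardenStatus package, and the class name are assumptions; the actual endpoint is already implemented in the garden-back service. Depending on the REST stack, an element type annotation, such as @SseElementType, might also be required to serialize each element as JSON.

      import javax.inject.Inject;
      import javax.ws.rs.GET;
      import javax.ws.rs.Path;
      import javax.ws.rs.Produces;
      import javax.ws.rs.core.MediaType;

      import org.eclipse.microprofile.reactive.messaging.Channel;

      import io.smallrye.mutiny.Multi;

      import com.redhat.training.gardens.model.GardenStatus;

      // Hypothetical sketch: stream GardenStatus values received on an incoming
      // channel to HTTP clients as server-sent events.
      @Path("/garden/statuses")
      public class GardenStatusResourceSketch {

          @Inject
          @Channel("garden-status-events")  // hypothetical channel name
          Multi<GardenStatus> statuses;

          @GET
          @Produces(MediaType.SERVER_SENT_EVENTS)
          public Multi<GardenStatus> stream() {
              return statuses;
          }
      }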

    In the GardenStreamsTopology class, implement the necessary transformations to write a stream of GardenStatus objects to the garden-status-events Kafka topic. To achieve this, perform the following steps:

    • Group the enriched sensor measurements stream by garden name.

    • Add a time window of one minute.

    • Use the GardenStatus class to aggregate the data. Add records with the GardenStatus#updateWith method.

    • Transform the generated table of GardenStatus records to a stream.

    • Map the stream of (windowedGardenName, gardenStatus) records into another stream of (null, gardenStatus) records.

    • Send the stream to the garden-status-events topic.

    1. In the GardenStreamsTopology class of the garden-streams service, group the enrichedSensorMeasurements by garden name, apply a window of one minute, and aggregate by using a GardenStatus object as the accumulator.

      // Aggregate enriched measurements
      enrichedSensorMeasurements
          .groupBy(
              (sensorId, measurement) -> measurement.gardenName,
              Grouped.with(Serdes.String(), sensorMeasurementEnrichedSerde)
          )
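          // Tumbling one-minute windows: the window size and the advance interval are equal.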
          .windowedBy(
              TimeWindows.of(Duration.ofMinutes(1)).advanceBy(Duration.ofMinutes(1))
          )
          .aggregate(
              GardenStatus::new,
              (gardenName, measurement, gardenStatus) ->
                  gardenStatus.updateWith(measurement),
              Materialized
                  .<String, GardenStatus, WindowStore<Bytes, byte[]>>as(
                          "garden-status-store"
                      )
                      .withKeySerde(Serdes.String())
                      .withValueSerde(gardenStatusSerde))
    2. Transform the resulting table into a stream. Map the stream of windowed records into another stream of (null, GardenStatus) records. Finally, write the resulting stream to the garden-status-events topic.

          .aggregate(
              GardenStatus::new,
              (gardenName, measurement, gardenStatus) ->
                  gardenStatus.updateWith(measurement),
              Materialized
                  .<String, GardenStatus, WindowStore<Bytes, byte[]>>as(
                          "garden-status-store"
                      )
                      .withKeySerde(Serdes.String())
                      .withValueSerde(gardenStatusSerde))
          .toStream()
          .map((windowedGardenName, gardenStatus) -> new KeyValue<Void, GardenStatus>(
              null, gardenStatus))
          .to(
              GARDEN_STATUS_EVENTS_TOPIC,
              Produced.with(Serdes.Void(), gardenStatusSerde));
    3. Wait for Quarkus to restart the service or restart the application manually. You might have to wait a few seconds until the Kafka Streams engine reaches the RUNNING state after the restart.

    4. Refresh the front-end page. Verify that the page displays garden status.

      The front end should display the status of four gardens. Each garden shows only one measurement type because just one sensor per garden is registered in the application.

Evaluation

Do not stop the running applications.

Open a new terminal window, and make sure you are in your workspace directory. Activate the Python virtual environment, and use the lab command to grade your work. Correct any reported failures and rerun the command until successful.

(.venv) [user@host AD482]$ lab grade comprehensive-review

Terminate all the running applications.

Finish

Use the lab command to complete this exercise. This is important to ensure that resources from previous exercises do not impact upcoming exercises.

(.venv) [user@host AD482]$ lab finish comprehensive-review

This concludes the lab.

Revision: ad482-1.8-cc2ae1c