A baseline Nstream application that processes data hosted in Confluent Cloud.
We highly recommend following our Confluent starter walkthrough as you explore this codebase.
There are two backend components to this repository:
- An Nstream toolkit-empowered Swim server that consumes from Confluent Cloud topics and processes responses in Web Agents with minimal boilerplate (package `nstream.starter` in the Java code); a hypothetical sketch of such an agent appears below
- A means to populate the former with reasonably frequent messages (package `nstream.starter.sim` in the Java code)
There is also a minimal, general-purpose frontend component under `index.html` that is available in a browser window at `localhost:9001` while (at minimum) the first backend component runs.
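To give a feel for the Web Agent side, here is a rough, hypothetical sketch of an agent that simply remembers the last value routed to it. The class and lane names are illustrative and do not come from this repo; the real agents (and the Kafka wiring handled by the Nstream toolkit) live in the `nstream.starter` package.

```java
// Hypothetical sketch of a Web Agent; not taken from this repo.
import swim.api.SwimLane;
import swim.api.agent.AbstractAgent;
import swim.api.lane.ValueLane;
import swim.structure.Value;

public class VehicleAgent extends AbstractAgent {

  // Stores the most recent message routed to this agent instance
  @SwimLane("latest")
  ValueLane<Value> latest = this.<Value>valueLane()
      .didSet((newValue, oldValue) ->
          System.out.println(nodeUri() + " received: " + newValue));

}
```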
- Java Development Kit (JDK) 11+
  - See `build.gradle` for application-specific Java dependencies
- A Confluent Cloud account that contains all of the following:
  - A schemaless Kafka topic hosted within a network-reachable cluster
  - An API key corresponding to the topic
  - An API secret corresponding to the API key
- Fix the configuration files
  - Correctly populate `secret.properties` (a hypothetical sketch of its contents follows these steps)
  - If necessary, make other changes directly to the other `.properties` files in `src/main/resources`
- Run the Nstream server
  - *nix Environment: `./gradlew run`
  - Windows Environment: `.\gradlew.bat run`
- Run the broker populator
  - *nix Environment: `./gradlew runSim`
  - Windows Environment: `.\gradlew.bat runSim`
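The exact keys read from `secret.properties` are defined by the starter's configuration files themselves, so treat the following as a rough, hypothetical sketch only: it assumes the file carries standard Kafka client connection settings for Confluent Cloud, with placeholders in angle brackets.

```properties
# Hypothetical sketch only -- defer to the keys already present in secret.properties.
# Standard Kafka client settings for a SASL_SSL connection to Confluent Cloud:
bootstrap.servers=<BOOTSTRAP_SERVER>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<API_KEY>" password="<API_SECRET>";
```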
In addition to achieving full parity with the aforementioned Kafka starter application, this codebase also exercises Confluent Cloud's Schema Registry. Working with this requires a few additional changes.
- A Confluent Cloud account that contains all of the following:
  - A Kafka topic hosted within a network-reachable cluster, configured with the following Avro schema for its values and no schema on its keys (a sketch of a matching record follows this list):

    ```json
    {
      "fields": [
        {"name": "id", "type": "int"},
        {"name": "routeId", "type": "int"},
        {
          "name": "dir",
          "type": {"name": "Dir", "symbols": ["INBOUND", "OUTBOUND"], "type": "enum"}
        },
        {"name": "latitude", "type": "float"},
        {"name": "longitude", "type": "float"},
        {"name": "speed", "type": "int"},
        {
          "name": "bearing",
          "type": {"name": "Bearing", "symbols": ["N", "NE", "E", "SE", "S", "SW", "W", "NW"], "type": "enum"}
        },
        {"name": "routeName", "type": "string"},
        {"name": "timestamp", "type": "long"}
      ],
      "name": "vehicle",
      "type": "record"
    }
    ```

  - An API key corresponding to the Schema Registry
  - An API secret corresponding to the API key
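For reference, a value conforming to this schema can be built with Avro's generic API. The sketch below is illustrative only (the repo's own simulator in `nstream.starter.sim` is responsible for producing messages); it assumes the schema above has been saved locally as `vehicle.avsc`, and all field values are made up.

```java
// Illustrative only: builds one record matching the schema above,
// assuming it has been saved locally as vehicle.avsc.
import java.io.File;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class VehicleRecordExample {

  public static void main(String[] args) throws Exception {
    final Schema schema = new Schema.Parser().parse(new File("vehicle.avsc"));
    final GenericRecord vehicle = new GenericData.Record(schema);
    vehicle.put("id", 1);
    vehicle.put("routeId", 7);
    vehicle.put("dir", new GenericData.EnumSymbol(schema.getField("dir").schema(), "INBOUND"));
    vehicle.put("latitude", 33.7490f);
    vehicle.put("longitude", -84.3880f);
    vehicle.put("speed", 25);
    vehicle.put("bearing", new GenericData.EnumSymbol(schema.getField("bearing").schema(), "NE"));
    vehicle.put("routeName", "Route 7");
    vehicle.put("timestamp", System.currentTimeMillis());
    System.out.println(vehicle); // prints the record as JSON
  }

}
```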
- Fix a different set of configuration files
  - Correctly populate `schema/secret.properties` (a hypothetical sketch of its contents follows these steps)
  - If necessary, make other changes directly to the other `.properties` files in `src/main/resources/schema`
- Run the Nstream server with different arguments
  - *nix Environment: `./gradlew runSchema`
  - Windows Environment: `.\gradlew.bat runSchema`
- Run the broker populator
  - *nix Environment: `./gradlew runSchemaSim`
  - Windows Environment: `.\gradlew.bat runSchemaSim`
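As with `secret.properties`, the keys expected in `schema/secret.properties` are defined by the starter itself. As a rough, hypothetical sketch, Confluent Schema Registry clients are typically given their credentials through the following standard settings, alongside the Kafka connection settings shown earlier.

```properties
# Hypothetical sketch only -- defer to the keys already present in schema/secret.properties.
# Standard Confluent Schema Registry client settings:
schema.registry.url=https://<SCHEMA_REGISTRY_ENDPOINT>
basic.auth.credentials.source=USER_INFO
basic.auth.user.info=<SR_API_KEY>:<SR_API_SECRET>
```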