This project streams CSPR events emitted from active nodes, fires them into a Kafka cluster, and then processes/transforms them downstream into various data stores.
- Java 17
- Kafka
- Kubernetes
- Postgres
- Mongo
The project is split into numerous subprojects, each controlled via its own Gradle build file.
Some subprojects are shared libraries; others are deployable JARs.
The deployed projects are:
- Audit API - Provides REST endpoints to replay events that have already been streamed from the nodes
- Audit Consumer - Processes events from Kafka topics into a Mongo datastore, supporting event replay and disaster recovery (DR)
- Producer - Connects to CSPR node(s) via the Java SDK, wraps the events with metadata, and sends them to a Kafka topic (see the sketch after this list)
- Store API - Provides REST endpoints to monitor processed events
- Store Consumer - Processes Kafka topics into a Postgres relational db ready for any downstream use
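As a rough illustration of the Producer's role, the sketch below wraps an incoming event in a metadata envelope and publishes it to Kafka using the standard `kafka-clients` producer API. The envelope fields (`source`, `version`, `timestamp`), the topic name, and the sample event payload are all hypothetical; the project's actual metadata schema lives in the shared subprojects.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Instant;
import java.util.Properties;

public class EventProducerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical raw event JSON, as it might arrive from a node's event stream.
            String rawEvent = "{\"DeployAccepted\":{\"deployHash\":\"abc123\"}}";

            // Wrap the event with metadata before publishing; field names are illustrative only.
            String envelope = String.format(
                    "{\"source\":\"%s\",\"version\":\"%s\",\"timestamp\":\"%s\",\"data\":%s}",
                    "node-1", "1.0", Instant.now(), rawEvent);

            // "main-events" is a placeholder topic name, not the project's actual topic.
            producer.send(new ProducerRecord<>("main-events", "node-1", envelope));
        }
    }
}
```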
To build the whole project:

```
./gradlew clean build
```
To build an individual subproject, run the following from that subproject's directory:

```
../gradlew clean build
```
To run the build through SonarQube, install it locally and then run:

```
./gradlew clean build sonarqube -Dsonar.login=YOUR_SONARQUBE_TOKEN --continue -Djacoco.haltOnFailure=false
```
Current issues are tracked in this GitHub project.
The project uses GitHub Actions to deploy changes from main to the Kubernetes cluster.
Each deployable subproject has its own workflow file in `.github/workflows`, and each also has its own `/deploy` folder containing the Kubernetes YAML files specific to that service.
The project is extensively documented in a series of posts on the blog site Medium:
- Part 1 - Highlights the problem space and proposes a technical solution
- Part 2 - Discusses how we set up the Kubernetes cluster with the Kafka ensemble
Further parts will be added here as the project matures.
- Java SDK - Used to connect to CSPR nodes, deserialise events, and provide a stream (see the sketch after this list)
- Kubernetes Cluster - Collection of YAML files with detailed documentation to build the cluster
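For context on what the SDK provides, a Casper node exposes its events as a Server-Sent Events (SSE) stream. Below is a minimal sketch of reading that stream with plain `java.net.http`, assuming the node's default event-stream port 9999 and the `/events/main` endpoint; `NODE_HOST` is a placeholder, and the SDK adds deserialisation on top of this raw feed.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.stream.Stream;

public class EventStreamSketch {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // NODE_HOST is a placeholder; 9999 is assumed to be the node's default SSE port.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://NODE_HOST:9999/events/main"))
                .GET()
                .build();

        // SSE frames arrive line by line; payload lines are prefixed with "data:".
        HttpResponse<Stream<String>> response =
                client.send(request, HttpResponse.BodyHandlers.ofLines());

        response.body()
                .filter(line -> line.startsWith("data:"))
                .map(line -> line.substring("data:".length()).trim())
                .forEach(System.out::println); // raw JSON events; the SDK deserialises these
    }
}
```

In the Producer, this raw stream is what gets wrapped with metadata and forwarded to Kafka.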