A service that uses Apache Kafka to listen for & collect data on Realtime Global Seismic events from various Producers and stream them to various Consumer clients that can alert and keep users updated & safe in Realtime. Additionally, all micro-services are fully Dockerized, ready to run & be deployed just about anywhere!
Seismic Alerts Streamer, at its core, uses Apache Kafka to listen for & collect data on Realtime Global Seismic events from Producers and streams them to Consumers.
All Producers are managed via a Python interface. Producers consist of:
- A WebSocket endpoint by the European-Mediterranean Seismological Centre (EMSC).
- A Flask Rest API that allows a user to Report any Seismic Activity around them (POST) or Fetch log archives from the Database (GET), with intuitive Error Handling.
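For reference, listening to a live seismic WebSocket feed from Python can look roughly like the sketch below. The endpoint URL and message shape are illustrative assumptions (Tornado is used here simply because it is part of the stack); this is not code taken from the repository.

```python
# Minimal sketch of a WebSocket listener for live seismic events (illustrative only).
import json
from tornado.ioloop import IOLoop
from tornado.websocket import websocket_connect

# Assumed EMSC-style feed URL; the actual endpoint used by the project may differ.
WS_URL = "wss://www.seismicportal.eu/standing_order/websocket"

async def listen():
    conn = await websocket_connect(WS_URL)
    while True:
        msg = await conn.read_message()
        if msg is None:            # connection closed by the server
            break
        event = json.loads(msg)    # raw seismic event payload
        print(event)               # hand the event off to the Kafka producer here

if __name__ == "__main__":
    IOLoop.current().run_sync(listen)
```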
These events are then published to two Kafka topics, namely `minor_seismic_events` & `severe_seismic_events`, based on their magnitude.
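As a rough illustration, the producer-side routing can be expressed like this; the 4.0 magnitude threshold and the kafka-python client are assumptions of this sketch, not necessarily what the project uses.

```python
# Sketch: publish an event to one of the two topics based on its magnitude.
import json
from kafka import KafkaProducer  # assumed client library for this illustration

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def publish(event: dict) -> None:
    # A 4.0 cut-off between "minor" and "severe" is assumed here.
    topic = "severe_seismic_events" if event["magnitude"] >= 4.0 else "minor_seismic_events"
    producer.send(topic, event)

publish({"magnitude": 5.1, "region": "Kermadec Islands", "co_ordinates": [-29.9, -177.8]})
producer.flush()  # make sure the record actually leaves the client
```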
Consumers connected to the Kafka broker subscribe to these topics and start receiving Seismic Logs. All the consumers are managed as clients via a Multi-threaded Java interface. Consumers consist of:
- A Live Log Feed that reads from both topics, allowing the user to conveniently view a Realtime feed of all Seismic Activity around the world.
- A Java SMTP client that reads from `severe_seismic_events` and Alerts the user of potentially Dangerous Seismic Activity via Email.
- A Postgres Database connected directly via a Kafka-JDBC Sink Connector (initialized by the Java interface) that conveniently maintains an archive of all Seismic Activity recorded through Kafka.
- An Interactive Web UI (inspired by EMSC) featuring a Map View of all Seismic Events, reading from the Postgres Database via our Rest API.
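The project's consumers are implemented in Java, but the subscription pattern they follow can be sketched compactly in Python (kafka-python used purely for illustration); the topic names match the ones above, everything else is an assumption.

```python
# Illustrative sketch of subscribing to both topics and dispatching by severity.
import json
from kafka import KafkaConsumer  # illustration only; the real consumers are Java clients

consumer = KafkaConsumer(
    "minor_seismic_events",
    "severe_seismic_events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for record in consumer:
    print(record.topic, record.value)        # live log feed of every event
    if record.topic == "severe_seismic_events":
        pass                                 # e.g. trigger the email-alert path here
```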
Apache Kafka's high throughput enables the Service to scale easily to huge traffic using many brokers & clusters. The Service is also fully Containerized, making it ready for deployment.
This Service can also be extended and Scaled by adding more Seismic Data Providers, Consumer Clients like Mobile Apps & other Safety Protocols, and Data Analytics Tools for the Web UI.
The Project is developed using the following Technologies, Libraries & Frameworks:
- Apache Kafka & Kafka-connect (Confluent)
- Docker
- Python
- Java & Maven
- React.js (Javascript)
- Tornado
- Flask
- Leaflet.js
- PostgreSQL
- JavaMail API & Google SMTP Server
- Shell
To set up the project locally, follow the steps below. You will need:
- Docker: Told you... It's fully Containerized!
- Fork and clone the project to your local system
- Set necessary environment variables. Create a `.env` file in the root directory and write your Gmail credentials in it for the Gmail SMTP Server. This email will be used to send the Alerts to the users. Name them as:
```
SERVICE_EMAIL=youremail@gmail.com
SERVICE_EMAIL_PASSWORD=password
```
If you have 2-step Authentication set up for your Google Account, create a new App Password and use that password in your `.env` file.
- Now build and run the docker-compose file & `exec -it` into the `consumers` container. To do this, from the project directory, run:
```
docker compose up -d --build                     # Build & run the containers
docker exec -it <containerId_of_consumers> sh    # Attach shell to consumers container
```
This should run all the service containers. To find the `<container_id>`, run `docker ps` or use Docker Desktop, and copy the container id of the container named `consumers` or `seismic-alerts-streamer-consumers`.
- Once inside the `consumers` container's shell, run:
```
mvn -q exec:java    # Run the maven project
```
You have successfully entered the program as a user!
The Rest API accessible at `localhost:5000` has the following endpoints:
`POST /seismic_events`: Self-Report Seismic Activity in your region.
Request Body should adhere to the given format for a successful submission:
```
{
    "magnitude": "float",
    "region": "string",
    "time": "ISO-8601 string. Min: YYYY, Max: YYYY-MM-DD(T)hh:mm:ss.ssssss(Zone)",
    "co_ordinates": "array of floats"
}
```
`GET /seismic_events`: Returns the entire recorded archive from the Database.
`GET /seismic_events/minor` or `GET /seismic_events/severe`: Returns records based on severity of the Seismic activities.
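As a quick, hedged example, the endpoints above can be exercised from Python with the requests library; the field values below are made up and the exact response format is an assumption.

```python
# Illustrative calls against the Rest API described above.
import requests

BASE = "http://localhost:5000"

# Self-report a seismic event (values are purely illustrative).
event = {
    "magnitude": 3.2,
    "region": "Reykjanes Peninsula, Iceland",
    "time": "2024-07-10T04:13:55.120000+00:00",
    "co_ordinates": [63.89, -22.27],
}
resp = requests.post(f"{BASE}/seismic_events", json=event)
print(resp.status_code)

# Fetch the full archive, or only the severe records.
all_events = requests.get(f"{BASE}/seismic_events")
severe = requests.get(f"{BASE}/seismic_events/severe")
print(all_events.status_code, severe.status_code)
```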
All micro-services are set up with meaningful logs that can help with debugging and further development.
To view the logs of other containers, run `docker logs <container_id>`.
If you wish to develop it locally without Docker, manually install the technologies used. Run the following commands (each from the project root) to install the necessary dependencies:
```
cd Producers
pip install -r requirements.txt
```
```
cd Consumers
mvn clean install
```
```
cd Producers/Web
npm install
```