A powerful, real-time, AI-enhanced Security Information and Event Management (SIEM) platform designed to detect, correlate, and respond to modern threats — built with ❤️ by Adir.
- ✅ Log Ingestion via Kafka
- 🔍 Elasticsearch Indexing & Search
- 🧠 ML-based Anomaly Detection
- 📜 Sigma Rule Integration for Correlation
- 🗃️ MITRE ATT&CK Technique Tagging
- 📈 Real-Time Alerting & Visualization
- 🤖 LLM-Powered Triage Assistant
- 🧪 SOAR Automation & Playbooks (Coming soon)
- 📊 Beautiful Frontend in React (WIP)
```
SIEM/
├── ingestion/           → Log agent (sends logs to Kafka)
├── consumer/            → Indexer (reads Kafka, stores to Elasticsearch)
├── correlation/         → Sigma rule engine
├── ml/                  → Anomaly detection
├── llm/                 → GPT/LLM-based alert summarizer
├── rules/               → Sigma rule YAMLs
├── ui/                  → React frontend + backend API
├── test_logs/           → Sample logs (for testing)
├── docker-compose.yml   → All services (Kafka, Zookeeper, Elasticsearch, Kibana)
├── .gitignore           → Exclude build files, logs, etc.
└── README.md            → You're here
```
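To make the flow concrete, here is a minimal sketch of what the indexing step does: read JSON logs from the `logs_raw` Kafka topic and store each one in the Elasticsearch `logs` index. This is an illustration assuming `kafka-python` and the official `elasticsearch` client; the real logic lives in `consumer/indexer.py` and may differ.

```python
import json

from kafka import KafkaConsumer          # pip install kafka-python
from elasticsearch import Elasticsearch  # pip install elasticsearch

# Read raw JSON log events from the logs_raw topic...
consumer = KafkaConsumer(
    "logs_raw",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)
# ...and write each one into the "logs" index.
es = Elasticsearch("http://localhost:9200")

for message in consumer:
    es.index(index="logs", document=message.value)
```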
- **Clone the Repo:**

  ```bash
  git clone https://github.com/YourUsername/SIEM.git
  cd SIEM
  ```
- **(Recommended) Run with Docker:**

  ```bash
  docker-compose up -d --build
  ```

  This starts Zookeeper, Kafka, Elasticsearch, Kibana, and the Python services. Once all containers are up, you can generate logs (optional) and visit Kibana at http://localhost:5601.
- **Run Locally (Alternative):**

  - Set up a virtual environment and install dependencies:

    ```bash
    python -m venv .venv
    . .venv/bin/activate        # on Linux/macOS
    .\.venv\Scripts\activate    # on Windows
    pip install -r requirements.txt
    ```

  - Start services (Zookeeper, Kafka, Elasticsearch, Kibana) via Docker:

    ```bash
    docker-compose up -d zookeeper kafka elasticsearch kibana
    ```

  - Run the log generator (optional):

    ```bash
    python ingestion/dummy_generator.py
    ```

  - Index logs into Elasticsearch:

    ```bash
    python consumer/indexer.py
    ```

  - Optionally launch the correlation engine, LLM assistant, and backend API:

    ```bash
    python correlation/correlation_engine.py
    python llm/llm_assistant.py
    python ui/backend_api/app.py
    ```
- **Frontend Setup:**

  ```bash
  cd ui/frontend
  npm install
  npm start
  ```

  Open http://localhost:3000 in your browser.
After running `docker-compose up -d --build`, you can perform these steps to verify the core components:
- **Check Container Status:**

  ```bash
  docker ps
  ```

  You should see containers for `siem_zookeeper`, `siem_kafka`, `siem_elasticsearch`, `siem_kibana`, `siem_indexer`, `siem_correlation_engine`, and `siem_backend_api` running.
- **Generate Dummy Logs:** Run the dummy log generator script. This will send sample logs to the `logs_raw` Kafka topic.

  ```bash
  # Run this in a separate terminal in the project root
  python ingestion/dummy_generator.py
  ```

  You should see output like `[DummyGen] Sent: {...}`. Let it run for a minute or two, then stop it with `Ctrl+C`.
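  For reference, a generator loop might look roughly like the sketch below. The field names and the one-second interval are illustrative assumptions, not a copy of `ingestion/dummy_generator.py`:

  ```python
  import json
  import time
  from datetime import datetime, timezone

  from kafka import KafkaProducer  # pip install kafka-python

  producer = KafkaProducer(
      bootstrap_servers="localhost:9092",
      value_serializer=lambda event: json.dumps(event).encode("utf-8"),
  )

  while True:
      # Hypothetical event shape; CommandLine matches the sample Sigma rule.
      event = {
          "@timestamp": datetime.now(timezone.utc).isoformat(),
          "host": "test-host",
          "CommandLine": "cmd.exe /c whoami",
      }
      producer.send("logs_raw", event)
      print(f"[DummyGen] Sent: {event}")
      time.sleep(1)
  ```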
- **Check Logs in Kibana:**

  - Open Kibana in your browser: http://localhost:5601
  - Navigate to the menu (☰) > Management > Stack Management > Kibana > Index Patterns.
  - Click "Create index pattern".
  - Enter `logs` as the index pattern name. It should find the `logs` index created by the indexer.
  - Select `@timestamp` as the time field.
  - Click "Create index pattern".
  - Now go to the menu (☰) > Analytics > Discover.
  - You should see the logs generated by `dummy_generator.py` appearing here.
- **Check Alerts in Kibana:**

  - Some dummy logs might trigger the "Suspicious Command Prompt" rule.
  - Create another index pattern named `alerts`.
  - Select `@timestamp` as the time field.
  - Go back to Discover and select the `alerts` index pattern.
  - You might see alerts generated by the correlation engine.
- **Check Backend API:**

  - Logs Endpoint: Open http://localhost:5000/api/logs in your browser or use `curl`. You might need to provide the dummy token (check `ui/backend_api/rbac.py` or `ui/frontend/src/api/client.js` for the current placeholder token).

    ```bash
    # Example using curl with the placeholder token
    curl -H "Authorization: Bearer fake-jwt-token-admin" http://localhost:5000/api/logs
    ```

  - Alerts Endpoint: Open http://localhost:5000/api/alerts or use `curl`.

    ```bash
    curl -H "Authorization: Bearer fake-jwt-token-admin" http://localhost:5000/api/alerts
    ```
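  The same checks can be scripted; this is a small sketch using `requests`, assuming the endpoints return JSON lists and the placeholder token is still in effect:

  ```python
  import requests  # pip install requests

  headers = {"Authorization": "Bearer fake-jwt-token-admin"}

  for endpoint in ("logs", "alerts"):
      resp = requests.get(f"http://localhost:5000/api/{endpoint}", headers=headers)
      # Print the status and a small sample of the payload.
      print(endpoint, resp.status_code, resp.json()[:2] if resp.ok else resp.text)
  ```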
- **Check Frontend (If Running):**

  - If you started the React frontend (`npm start` in `ui/frontend`), open http://localhost:3000.
  - Navigate the UI (e.g., Dashboard, Timeline) to see if data is being displayed. Note that the frontend might require login/authentication setup to be fully functional.
If logs and alerts appear in Kibana and the API endpoints return data, the basic pipeline is working.
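For a one-shot check of both indices without opening Kibana, a sketch like this (assuming the default index names) can help:

```python
import requests  # pip install requests

# _count is a standard Elasticsearch API; a document count of zero
# means that part of the pipeline has not produced data yet.
for index in ("logs", "alerts"):
    resp = requests.get(f"http://localhost:9200/{index}/_count")
    count = resp.json().get("count", 0) if resp.ok else 0
    print(f"{index}: {count} docs" if resp.ok else f"{index}: missing")
```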
This error typically occurs with older versions of `kafka-python` on newer Python versions such as 3.13.

**Solution:** The primary fix is to ensure you are using a recent version of `kafka-python`.
- **Update `requirements.txt`:** Make sure your `requirements.txt` specifies a recent version, for example:

  ```text
  # requirements.txt
  kafka-python>=2.1.0  # Use a version >= 2.1.0
  # ... other dependencies
  ```

  (Note: the explicit `six==1.16.0` dependency is likely no longer needed with newer `kafka-python` versions.)
- **Reinstall Dependencies:**

  - 🐳 Docker Users: Rebuild the relevant Docker images to include the updated library:

    ```bash
    docker-compose build indexer correlation-engine backend-api  # Rebuild the Python services
    docker-compose up -d --build                                 # Or simply rebuild everything
    ```

  - 🏃 Local Environment Users: Ensure your virtual environment is activated and reinstall dependencies:

    ```bash
    # Activate your virtual environment first (e.g., .\.venv\Scripts\activate)
    pip install --upgrade pip
    pip install -r requirements.txt
    ```
- **Verify:** Try running the Python script (e.g., `python consumer/indexer.py`) again.

If you still encounter issues after updating `kafka-python`, double-check that you are operating within the correct, activated virtual environment where the updated packages were installed.
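To confirm which version the active environment actually resolved, a quick standalone check (not part of the project code) is:

```python
from importlib.metadata import PackageNotFoundError, version

try:
    print("kafka-python", version("kafka-python"))  # expect >= 2.1.0
except PackageNotFoundError:
    print("kafka-python is not installed in this environment")
```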
If components can't connect to Elasticsearch, ensure:
- The Elasticsearch container is running: `docker ps | grep elasticsearch`
- You can access it: `curl http://localhost:9200`
- Environment variables are correct if you've customized the setup
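If `curl` isn't handy, an equivalent reachability check from Python (a sketch, using plain HTTP rather than the Elasticsearch client) is:

```python
import requests  # pip install requests

try:
    info = requests.get("http://localhost:9200", timeout=5).json()
    health = requests.get("http://localhost:9200/_cluster/health", timeout=5).json()
    print("version:", info["version"]["number"], "| cluster status:", health["status"])
except requests.exceptions.ConnectionError:
    print("Elasticsearch is not reachable; is the container running?")
```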
If you see alerts in Elasticsearch but not logs, or the React frontend isn't updating with new data:
- **Verify Kafka Topic Health:**

  ```bash
  python consumer/check_topics.py
  ```

  This will show whether messages are flowing into Kafka. You should see messages in the `logs_raw` topic.
- **Check Elasticsearch Indices:**

  ```bash
  python consumer/debugging.py
  ```

  Ensure both `logs` and `alerts` indices exist and contain documents.
- **React Frontend Refresh Issues:**

  - The default React implementation doesn't automatically refresh data.
  - Refresh manually by clicking the refresh button in the UI.
  - Check the browser console (F12) for API errors when fetching data.
  - Verify the API endpoints return data: http://localhost:5000/api/logs and http://localhost:5000/api/alerts.
- **Timestamp Format Issues:** If logs aren't appearing in Kibana despite being in Elasticsearch, check that:

  - The `@timestamp` field exists and is properly formatted.
  - Your time filter in Kibana is set to an appropriate range.
  - The index pattern is correctly configured with `@timestamp` as the time field.
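  A format that Elasticsearch's date detection handles without extra mapping is ISO 8601 in UTC, e.g.:

  ```python
  from datetime import datetime, timezone

  # Produces something like "2025-01-01T12:00:00.000000+00:00",
  # which Kibana can use directly as the time field.
  log = {"@timestamp": datetime.now(timezone.utc).isoformat(), "message": "test"}
  ```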
```yaml
title: Suspicious Command Prompt
detection:
  selection:
    CommandLine|contains: "cmd.exe"
  condition: selection
level: high
```

Sigma rules are stored in `rules/` and dynamically loaded into the correlation engine.
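Conceptually, a `contains` selection reduces to a substring test against the named field. The sketch below illustrates the idea only; the actual matching logic lives in `correlation/correlation_engine.py` and may differ:

```python
def matches_selection(log: dict, field: str, needle: str) -> bool:
    """True when the field exists, is a string, and contains the needle."""
    value = log.get(field, "")
    return isinstance(value, str) and needle in value

# A log like this would trigger the "Suspicious Command Prompt" rule.
log = {"CommandLine": "cmd.exe /c net user"}
if matches_selection(log, "CommandLine", "cmd.exe"):
    print("ALERT: Suspicious Command Prompt (level: high)")
```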
- Real-time stream processing with Apache Flink
- Timeline visualizations & investigation graphs
- Full SOAR engine with playbooks
- PDF report generation
- Kubernetes deployment (Helm)
- LLM-powered Chat UI for threat hunting
MIT © 2025 Adir Shitrit
This is a passion project — feel free to fork, contribute, suggest ideas, or just star the repo ⭐
You can open issues for ideas, bugs, or feature requests.