pandaadir05/SIEM

🛡️ Next-Gen SIEM Platform

A powerful, real-time, AI-enhanced Security Information and Event Management (SIEM) platform designed to detect, correlate, and respond to modern threats — built with ❤️ by Adir.


🚀 Key Features

  • 📥 Log Ingestion via Kafka
  • 🔍 Elasticsearch Indexing & Search
  • 🧠 ML-based Anomaly Detection
  • 📜 Sigma Rule Integration for Correlation
  • 🗃️ MITRE ATT&CK Technique Tagging
  • 📈 Real-Time Alerting & Visualization
  • 🤖 LLM-Powered Triage Assistant
  • 🧪 SOAR Automation & Playbooks (Coming soon)
  • 📊 Beautiful Frontend in React (WIP)

🧱 Project Structure

SIEM/
├── ingestion/           → Log agent (sends logs to Kafka)
├── consumer/            → Indexer (reads Kafka, stores to Elasticsearch)
├── correlation/         → Sigma rule engine
├── ml/                  → Anomaly detection
├── llm/                 → GPT/LLM-based alert summarizer
├── rules/               → Sigma rule YAMLs
├── ui/                  → React frontend + backend API
├── test_logs/           → Sample logs (for testing)
├── docker-compose.yml   → All services (Kafka, Zookeeper, Elasticsearch, Kibana)
├── .gitignore           → Exclude build files, logs, etc.
└── README.md            → You’re here

⚡ Quick Start

  1. Clone the Repo:

    git clone https://github.com/pandaadir05/SIEM.git
    cd SIEM
  2. (Recommended) Run with Docker:

    docker-compose up -d --build

    This starts Zookeeper, Kafka, Elasticsearch, Kibana, and the Python services. Once all containers are up, you can generate logs (optional) and visit Kibana at http://localhost:5601.

  3. Run Locally (Alternative):

    • Set up a virtual environment and install dependencies:
      python -m venv .venv
      . .venv/bin/activate       # (on Linux/macOS)
      .\.venv\Scripts\activate   # (on Windows)
      pip install -r requirements.txt
    • Start services (Zookeeper, Kafka, Elasticsearch, Kibana) via Docker:
      docker-compose up -d zookeeper kafka elasticsearch kibana
    • Run the log generator (optional):
      python ingestion/dummy_generator.py
    • Index logs into Elasticsearch:
      python consumer/indexer.py
    • Optionally launch the correlation engine, LLM assistant, and back-end API:
      python correlation/correlation_engine.py
      python llm/llm_assistant.py
      python ui/backend_api/app.py
  4. Frontend Setup:

    cd ui/frontend
    npm install
    npm start

    Open http://localhost:3000 in your browser.
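
The optional log generator in step 3 above is, at its core, a loop that builds a random record and publishes it to the logs_raw topic. The sketch below is an approximation of that idea, not the actual ingestion/dummy_generator.py; the hosts, command strings, and field names are illustrative assumptions:

```python
import json
import random
import time
from datetime import datetime, timezone

def make_dummy_log():
    """Build one sample log record; field names here are illustrative."""
    hosts = ["ws-01", "ws-02", "srv-db"]
    commands = ["cmd.exe /c whoami", "powershell.exe -NoProfile", "notepad.exe"]
    return {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "host": random.choice(hosts),
        "CommandLine": random.choice(commands),
    }

def run_generator(broker="localhost:9092", topic="logs_raw", delay=1.0):
    """Send dummy logs to Kafka until interrupted.

    Requires kafka-python and a broker listening on `broker`.
    """
    from kafka import KafkaProducer
    producer = KafkaProducer(
        bootstrap_servers=broker,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    while True:
        log = make_dummy_log()
        producer.send(topic, log)
        print(f"[DummyGen] Sent: {log}")
        time.sleep(delay)
```

make_dummy_log() runs anywhere; run_generator() additionally needs kafka-python installed and the Kafka container up.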


🧪 Testing the Setup

After running docker-compose up -d --build, you can perform these steps to verify the core components:

  1. Check Container Status:

    docker ps

    You should see containers for siem_zookeeper, siem_kafka, siem_elasticsearch, siem_kibana, siem_indexer, siem_correlation_engine, and siem_backend_api running.

  2. Generate Dummy Logs: Run the dummy log generator script. This will send sample logs to the logs_raw Kafka topic.

    # Run this in a separate terminal in the project root
    python ingestion/dummy_generator.py

    You should see output like [DummyGen] Sent: {...}. Let it run for a minute or two, then stop it with Ctrl+C.

  3. Check Logs in Kibana:

    • Open Kibana in your browser: http://localhost:5601
    • Navigate to the menu (☰) > Management > Stack Management > Kibana > Index Patterns.
    • Click "Create index pattern".
    • Enter logs as the index pattern name. It should find the logs index created by the indexer.
    • Select @timestamp as the time field.
    • Click "Create index pattern".
    • Now go to the menu (☰) > Analytics > Discover.
    • You should see the logs generated by dummy_generator.py appearing here.
  4. Check Alerts in Kibana:

    • Some dummy logs might trigger the "Suspicious Command Prompt" rule.
    • Create another index pattern named alerts.
    • Select @timestamp as the time field.
    • Go back to Discover and select the alerts index pattern.
    • You might see alerts generated by the correlation engine.
  5. Check Backend API:

    • Logs Endpoint: Open http://localhost:5000/api/logs in your browser or use curl. You might need to provide the dummy token (check ui/backend_api/rbac.py or ui/frontend/src/api/client.js for the current placeholder token).
      # Example using curl with the placeholder token
      curl -H "Authorization: Bearer fake-jwt-token-admin" http://localhost:5000/api/logs
    • Alerts Endpoint: Open http://localhost:5000/api/alerts or use curl.
      curl -H "Authorization: Bearer fake-jwt-token-admin" http://localhost:5000/api/alerts
  6. Check Frontend (If Running):

    • If you started the React frontend (npm start in ui/frontend), open http://localhost:3000.
    • Navigate the UI (e.g., Dashboard, Timeline) to see if data is being displayed. Note that the frontend might require login/authentication setup to be fully functional.

If logs and alerts appear in Kibana and the API endpoints return data, the basic pipeline is working.
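
The API checks in step 5 can also be scripted. This stdlib-only sketch builds the same authorized request that the curl examples send; the base URL and token mirror the placeholders above and may differ in your setup:

```python
import json
import urllib.request

def build_request(endpoint, token="fake-jwt-token-admin",
                  base_url="http://localhost:5000"):
    """Build an authenticated request for a backend API endpoint."""
    return urllib.request.Request(
        f"{base_url}{endpoint}",
        headers={"Authorization": f"Bearer {token}"},
    )

def fetch(endpoint):
    """Fetch and decode JSON from the API (requires the backend running)."""
    with urllib.request.urlopen(build_request(endpoint)) as resp:
        return json.loads(resp.read())

# With the stack up:
#   logs = fetch("/api/logs")
#   alerts = fetch("/api/alerts")
```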


🔧 Troubleshooting

Kafka Dependency Errors (ModuleNotFoundError: No module named 'kafka.vendor.six.moves')

This error typically occurs with older versions of kafka-python on newer Python versions like 3.13.

Solution:

The primary fix is to ensure you are using a recent version of kafka-python.

  1. Update requirements.txt: Make sure your requirements.txt specifies a recent version, for example:

    # requirements.txt
    kafka-python>=2.1.0 # Use a version >= 2.1.0
    # ... other dependencies

    (Note: The explicit six==1.16.0 dependency is likely no longer needed with newer kafka-python versions).

  2. Reinstall Dependencies:

    • 🐳 Docker Users: Rebuild the relevant Docker images to include the updated library:
      docker-compose build indexer correlation-engine backend-api # Rebuild services using Python
      docker-compose up -d --build # Or simply this to rebuild everything if needed
    • 🏃 Local Environment Users: Ensure your virtual environment is activated and reinstall dependencies:
      # Activate your virtual environment (e.g., .\.venv\Scripts\activate)
      pip install --upgrade pip
      pip install -r requirements.txt
  3. Verify: Try running the Python script (e.g., python consumer/indexer.py) again.

If you still encounter issues after updating kafka-python, double-check that you are operating within the correct, activated virtual environment where the updated packages were installed.
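
When in doubt about which kafka-python version your interpreter actually sees, a quick comparison helper avoids squinting at pip output; the minimum shown matches the requirements.txt line above:

```python
def kafka_python_ok(version, minimum=(2, 1, 0)):
    """Return True if a version string like '2.1.5' meets the minimum."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= minimum

# Usage with kafka-python installed:
#   import kafka
#   print(kafka.__version__, kafka_python_ok(kafka.__version__))
```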

Elasticsearch Connection Issues

If components can't connect to Elasticsearch, ensure:

  1. Elasticsearch container is running: docker ps | grep elasticsearch
  2. You can access it: curl http://localhost:9200
  3. Environment variables are correct if you've customized the setup
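
Check 2 can also be done from Python. This stdlib probe returns the cluster-info JSON when Elasticsearch answers and None otherwise (the default URL assumes the docker-compose port mapping):

```python
import json
import urllib.error
import urllib.request

def elasticsearch_reachable(url="http://localhost:9200", timeout=5):
    """Return cluster info as a dict if Elasticsearch responds, else None."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return json.loads(resp.read())
    except (urllib.error.URLError, OSError, ValueError):
        return None

# info = elasticsearch_reachable()
# print("Elasticsearch up:", info is not None)
```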

Data Flow Issues

If you see alerts in Elasticsearch but not logs, or the React frontend isn't updating with new data:

  1. Verify Kafka Topic Health:

    python consumer/check_topics.py

    This will show if messages are flowing into Kafka. You should see messages in the logs_raw topic.

  2. Check Elasticsearch Indices:

    python consumer/debugging.py

    Ensure both logs and alerts indices exist and contain documents.

  3. React Frontend Refresh Issues:

    • The default React implementation doesn't automatically refresh data
    • Refresh manually using the refresh button in the UI
    • Check browser console (F12) for API errors when fetching data
    • Verify API endpoints return data: http://localhost:5000/api/logs and http://localhost:5000/api/alerts
  4. Timestamp Format Issues: If logs aren't appearing in Kibana despite being in Elasticsearch, check that:

    • The @timestamp field exists and is properly formatted
    • Your time filter in Kibana is set to an appropriate range
    • The index pattern is correctly configured with @timestamp as the time field
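
For point 4, you can validate @timestamp values before they ever reach Kibana. A minimal ISO-8601 check (the 'Z' normalization is for Python versions where fromisoformat does not accept a trailing 'Z'):

```python
from datetime import datetime

def valid_timestamp(value):
    """Check that an @timestamp value parses as ISO-8601."""
    try:
        datetime.fromisoformat(str(value).replace("Z", "+00:00"))
        return True
    except ValueError:
        return False

doc = {"@timestamp": "2025-01-01T12:00:00Z", "CommandLine": "cmd.exe /c whoami"}
print(valid_timestamp(doc.get("@timestamp")))  # prints: True
```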

💡 Example Rule (Sigma)

title: Suspicious Command Prompt
logsource:
  category: process_creation
  product: windows
detection:
  selection:
    CommandLine|contains: "cmd.exe"
  condition: selection
level: high

Sigma rules are stored in rules/ and dynamically loaded into the correlation engine.
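
As a sketch of how a |contains selection like the one above can be evaluated against a log record: the snippet below supports only the subset of Sigma used by the example rule and is not the project's actual engine (see correlation/correlation_engine.py, or the pySigma library, for real modifier and condition handling):

```python
def matches_contains_rule(rule, log):
    """Evaluate a minimal Sigma-style selection with |contains modifiers."""
    selection = rule["detection"]["selection"]
    for key, needle in selection.items():
        field, _, modifier = key.partition("|")
        value = str(log.get(field, ""))
        if modifier == "contains":
            if needle not in value:
                return False
        elif value != needle:  # no modifier: exact match
            return False
    return True

rule = {
    "title": "Suspicious Command Prompt",
    "detection": {"selection": {"CommandLine|contains": "cmd.exe"}},
    "level": "high",
}
print(matches_contains_rule(rule, {"CommandLine": "cmd.exe /c whoami"}))  # True
print(matches_contains_rule(rule, {"CommandLine": "notepad.exe"}))        # False
```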


🧠 Coming Soon

  • Real-time stream processing with Apache Flink
  • Timeline visualizations & investigation graphs
  • Full SOAR engine with playbooks
  • PDF report generation
  • Kubernetes deployment (Helm)
  • LLM-powered Chat UI for threat hunting

📘 License

MIT © 2025 Adir Shitrit


🙌 Contributing

This is a passion project — feel free to fork, contribute, suggest ideas, or just star the repo ⭐
You can open issues for ideas, bugs, or feature requests.

