GithubChat

A RAG assistant that lets you chat with any GitHub repo and learn it fast. The default repo is the AdalFlow GitHub repo.

Watch the demo video (linked through the image in the original README).

Project Structure

.
├── frontend/           # React frontend application
├── src/                # Python backend code
├── api.py              # FastAPI server
├── app.py              # Streamlit application
└── pyproject.toml      # Python dependencies

Backend Setup

  1. Install dependencies:

     poetry install

  2. Set up OpenAI API key:

Create a .streamlit/secrets.toml file in your project root:

mkdir -p .streamlit
touch .streamlit/secrets.toml

Add your OpenAI API key to .streamlit/secrets.toml:

OPENAI_API_KEY = "your-openai-api-key-here"
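
For reference, here is a minimal sketch of how a Streamlit app can read this secret. Whether app.py does exactly this is an assumption, but st.secrets is Streamlit's standard mechanism for reading .streamlit/secrets.toml:

import os
import streamlit as st

# Streamlit exposes entries from .streamlit/secrets.toml via st.secrets.
openai_api_key = st.secrets["OPENAI_API_KEY"]

# Common pattern: export the key so OpenAI clients pick it up from the environment.
os.environ["OPENAI_API_KEY"] = openai_api_key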

Running the Applications

Streamlit UI

Run the streamlit app:

poetry run streamlit run app.py

FastAPI Backend

Run the API server:

poetry run uvicorn api:app --reload

The API will be available at http://localhost:8000
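
Assuming api.py uses FastAPI's defaults, the auto-generated OpenAPI schema is also served, which gives a quick way to confirm the server is up:

import requests

# FastAPI serves its OpenAPI schema at /openapi.json by default;
# a 200 response means the server is running.
resp = requests.get("http://localhost:8000/openapi.json", timeout=5)
print("API is up:", resp.status_code == 200)

The interactive Swagger UI is likewise available at http://localhost:8000/docs.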

React Frontend

  1. Navigate to the frontend directory:

     cd frontend

  2. Install Node.js dependencies:

     pnpm install

  3. Start the development server:

     pnpm run dev

The frontend will be available at http://localhost:3000

API Endpoints

POST /query

Analyzes a GitHub repository based on a query.

// Request
{
  "repo_url": "https://github.com/username/repo",
  "query": "What does this repository do?"
}

// Response
{
  "rationale": "Analysis rationale...",
  "answer": "Detailed answer...",
  "contexts": [...]
}
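
As an illustration, a small Python client that exercises the endpoint; it assumes the server is running locally on port 8000 and that the field names match the documented request and response shapes above:

import requests

payload = {
    "repo_url": "https://github.com/SylphAI-Inc/AdalFlow",
    "query": "What does this repository do?",
}

# POST the documented request body to /query and unpack the response fields.
resp = requests.post("http://localhost:8000/query", json=payload, timeout=300)
resp.raise_for_status()

data = resp.json()
print(data["rationale"])
print(data["answer"])
print(f"Retrieved {len(data['contexts'])} context chunks")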

ROADMAP

  • Clearly structured RAG that can prepare a repo, persist the prepared data so it does not have to be reloaded, and answer questions.
    • DatabaseManager in src/data_pipeline.py to manage the database.
    • RAG class in src/rag.py to manage the whole RAG lifecycle (a hypothetical usage sketch follows this list).
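
A hypothetical sketch of how these pieces might fit together; the actual class and method names in src/rag.py and src/data_pipeline.py may differ:

# Hypothetical sketch only -- method names are assumptions, not the repo's actual API.
from src.rag import RAG

rag = RAG()  # assumed constructor with default settings
rag.prepare_retriever("https://github.com/SylphAI-Inc/AdalFlow")  # assumed: clone, chunk, embed, and index the repo
answer = rag("What does this repository do?")  # assumed call interface returning an answer
print(answer)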

On the RAG backend

  • Conditional retrieval: sometimes users just want to clarify a past conversation, so no extra retrieved context is needed.
  • Create an evaluation dataset
  • Evaluate the RAG performance on the dataset
  • Auto-optimize the RAG model

On the React frontend

  • Support the display of the whole conversation history instead of just the last message.
  • Support the management of multiple conversations.
