A concurrent RSS feed aggregator built with Go (Golang), PostgreSQL, and SQLC. The server lets users add RSS feeds and follow them, and aggregates posts from those feeds in the background using concurrent workers.
Before running this project, ensure you have the following installed on your machine:
- Go (v1.20+)
- PostgreSQL
- Goose (database migration tool):

  ```shell
  go install github.com/pressly/goose/v3/cmd/goose@latest
  ```

- SQLC (SQL compiler):

  ```shell
  go install github.com/sqlc-dev/sqlc/cmd/sqlc@latest
  ```

- Clone the repository:

  ```shell
  git clone https://github.com/AkshatRai07/GoRSSAgg.git
  cd GoRSSAgg
  ```

- Dependencies: the `vendor` folder is included in this repository, so you do not need to run `go mod tidy` or download modules. Go will use the local vendor directory automatically.

- Environment variables: create a `.env` file in the root directory based on `.env.example`:

  ```shell
  cp .env.example .env
  ```

  Open `.env` and fill in your details:

  ```shell
  PORT=8080
  DB_URL=postgres://your_user:your_password@localhost:5432/gorssagg?sslmode=disable
  ```

  Replace `your_user`, `your_password`, and `gorssagg` (the database name) with your actual PostgreSQL credentials.
This project uses Goose for migrations.
- Ensure your PostgreSQL database (e.g., `gorssagg`) is created.
- Run the migrations to create the tables (`users`, `feeds`, `feed_follows`, `posts`):

  ```shell
  cd sql/schema
  goose postgres postgres://your_user:your_password@localhost:5432/gorssagg up
  ```

  Note: replace the connection string with your actual `DB_URL`.
To start the server and the scraping workers:
```shell
go build && ./GoRSSAgg
```

Or simply:

```shell
go run .
```

The server will start on the port defined in your `.env` file.
The base URL for all endpoints is http://localhost:PORT/v1.
Most endpoints are protected and require an API Key.
- Header: `Authorization`
- Value: `ApiKey {insert_your_api_key_here}`
| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| GET | /healthz | Check server health | No |
| GET | /err | Test error response | No |
| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| POST | /users | Create a new user | No |
| GET | /users | Get current user details | Yes |
Create User Body JSON:
```json
{
  "name": "Your Name"
}
```

| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| POST | /feeds | Create a new RSS feed | Yes |
| GET | /feeds | Get all available feeds | No |
Create Feed Body JSON:
```json
{
  "name": "Boot.dev Blog",
  "url": "https://blog.boot.dev/index.xml"
}
```

| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| POST | /feed_follows | Follow a specific feed | Yes |
| GET | /feed_follows | Get all feeds you follow | Yes |
| DELETE | /feed_follows/{id} | Unfollow a feed | Yes |
Create Feed Follow Body JSON:
```json
{
  "feed_id": "uuid-of-the-feed"
}
```

| Method | Endpoint | Description | Auth Required |
|---|---|---|---|
| GET | /posts | Get latest posts from followed feeds | Yes |
If you want to modify the database schema or add new queries, follow this workflow:
If you need to add or change a table:
- Navigate to the schema folder: `cd sql/schema`.
- Create a new migration file using Goose:

  ```shell
  goose create name_of_change sql
  ```

- Edit the generated up/down SQL files.
- Apply the migration:

  ```shell
  goose postgres "your_connection_string" up
  ```

This project uses SQLC to generate type-safe Go code from SQL.
- Edit or add queries in `sql/queries/*.sql`.
- Run SQLC to regenerate the Go code in `internal/database`:

  ```shell
  sqlc generate
  ```

This creates the necessary Go functions to interact with your new queries.
The scraping worker runs in the background (main.go). You can adjust the concurrency and scraping interval in the main() function:
```go
// 10 routines, scraping every 1 minute
go startScraping(db, 10, time.Minute)
```

- `internal/database`: Go code generated by SQLC (do not edit manually).
- `sql/schema`: Goose migration files.
- `sql/queries`: Raw SQL queries used by SQLC.
- `vendor`: Application dependencies.
- `main.go`: Entry point, server configuration, and router.
- `scrapper.go`: Logic for the background worker that fetches RSS feeds.