AkshatRai07/GoRSSAgg


Go RSS Feed Aggregator

A concurrent RSS Feed Aggregator built with Go (Golang), PostgreSQL, and SQLC. This server allows users to add RSS feeds, follow them, and aggregates posts from those feeds in the background using concurrent workers.

Prerequisites

Before running this project, ensure you have the following installed on your machine:

  • Go (v1.20+)
  • PostgreSQL
  • Goose (Database migration tool)
go install github.com/pressly/goose/v3/cmd/goose@latest
  • SQLC (SQL compiler)
go install github.com/sqlc-dev/sqlc/cmd/sqlc@latest

Setup & Installation

  1. Clone the repository:
git clone https://github.com/AkshatRai07/GoRSSAgg.git
cd GoRSSAgg
  2. Dependencies: The vendor folder is included in this repository, so you do not need to run go mod tidy or download modules. Go will use the local vendor directory automatically.
  3. Environment Variables: Create a .env file in the root directory based on .env.example:
cp .env.example .env

Open .env and fill in your details:

PORT=8080
DB_URL=postgres://your_user:your_password@localhost:5432/gorssagg?sslmode=disable

Replace your_user, your_password, and gorssagg (the database name) with your actual PostgreSQL credentials.

Database Setup

This project uses Goose for migrations.

  1. Ensure your PostgreSQL database (e.g., gorssagg) is created.
  2. Run the migrations to create the tables (users, feeds, feed_follows, posts):
cd sql/schema
goose postgres postgres://your_user:your_password@localhost:5432/gorssagg up

Note: Replace the connection string with your actual DB_URL.

Running the Server

To start the server and the scraping workers:

go build && ./GoRSSAgg

Or simply:

go run .

The server will start on the port defined in your .env file.


API Endpoints

The base URL for all endpoints is http://localhost:PORT/v1.

Authentication

Most endpoints are protected and require an API key (the key is returned in the response when you create a user).

  • Header: Authorization
  • Value: ApiKey {insert_your_api_key_here}
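From Go, attaching that header to a request looks like this (the base URL and key value are placeholders):

```go
package main

import (
	"fmt"
	"net/http"
)

// authHeader builds the Authorization header value the API expects:
// the literal prefix "ApiKey " followed by your key.
func authHeader(apiKey string) string {
	return "ApiKey " + apiKey
}

func main() {
	req, err := http.NewRequest(http.MethodGet, "http://localhost:8080/v1/users", nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", authHeader("your-api-key-here"))
	fmt.Println(req.Header.Get("Authorization"))
}
```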

General

| Method | Endpoint | Description         | Auth Required |
| ------ | -------- | ------------------- | ------------- |
| GET    | /healthz | Check server health | No            |
| GET    | /err     | Test error response | No            |

Users

| Method | Endpoint | Description              | Auth Required |
| ------ | -------- | ------------------------ | ------------- |
| POST   | /users   | Create a new user        | No            |
| GET    | /users   | Get current user details | Yes           |

Create User Body JSON:

{
  "name": "Your Name"
}

Feeds

| Method | Endpoint | Description             | Auth Required |
| ------ | -------- | ----------------------- | ------------- |
| POST   | /feeds   | Create a new RSS feed   | Yes           |
| GET    | /feeds   | Get all available feeds | No            |

Create Feed Body JSON:

{
  "name": "Boot.dev Blog",
  "url": "https://blog.boot.dev/index.xml"
}

Feed Follows

| Method | Endpoint           | Description              | Auth Required |
| ------ | ------------------ | ------------------------ | ------------- |
| POST   | /feed_follows      | Follow a specific feed   | Yes           |
| GET    | /feed_follows      | Get all feeds you follow | Yes           |
| DELETE | /feed_follows/{id} | Unfollow a feed          | Yes           |

Create Feed Follow Body JSON:

{
  "feed_id": "uuid-of-the-feed"
}
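Putting the pieces together, a follow request combines the endpoint path, the JSON body, and the Authorization header from the Authentication section. A sketch of building (not sending) that request, with placeholder base URL and key:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// newFollowRequest builds an authenticated POST /v1/feed_follows request.
// baseURL and apiKey are placeholders supplied by the caller.
func newFollowRequest(baseURL, apiKey, feedID string) (*http.Request, error) {
	body, err := json.Marshal(map[string]string{"feed_id": feedID})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost, baseURL+"/v1/feed_follows", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("Authorization", "ApiKey "+apiKey)
	return req, nil
}

func main() {
	req, err := newFollowRequest("http://localhost:8080", "your-api-key", "uuid-of-the-feed")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.Path) // POST /v1/feed_follows
}
```

Pass the returned request to http.DefaultClient.Do to actually send it against a running server.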

Posts

| Method | Endpoint | Description                          | Auth Required |
| ------ | -------- | ------------------------------------ | ------------- |
| GET    | /posts   | Get latest posts from followed feeds | Yes           |

Development & Contributing

If you want to modify the database schema or add new queries, follow this workflow:

1. Modifying the Database Schema

If you need to add or change a table:

  1. Navigate to the schema folder: cd sql/schema.
  2. Create a new migration file using Goose:
goose create name_of_change sql
  3. Edit the up/down SQL sections in the generated file.
  4. Apply the migration:
goose postgres "your_connection_string" up
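A Goose migration file holds both directions in one file, separated by annotation comments. A hypothetical example (the table and column are illustrative, not part of this project's schema):

```sql
-- +goose Up
ALTER TABLE feeds ADD COLUMN description TEXT;

-- +goose Down
ALTER TABLE feeds DROP COLUMN description;
```

The `-- +goose Down` section should exactly undo the `Up` section, so `goose ... down` can roll the change back cleanly.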

2. Modifying SQL Queries

This project uses SQLC to generate type-safe Go code from SQL.

  1. Edit or add queries in sql/queries/*.sql.
  2. Run SQLC to regenerate the Go code in internal/database:
sqlc generate

This creates the necessary Go functions to interact with your new queries.
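Each query in sql/queries carries a `-- name:` annotation that tells SQLC what Go function to generate and how many rows to expect. A hypothetical example (the query name and column are illustrative):

```sql
-- name: GetFeedsToFetch :many
SELECT * FROM feeds
ORDER BY last_fetched_at ASC NULLS FIRST
LIMIT $1;
```

After `sqlc generate`, this would yield a type-safe `GetFeedsToFetch(ctx, limit)` method on the generated Queries struct in internal/database.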

3. Worker Configuration

The scraping worker runs in the background (main.go). You can adjust the concurrency and scraping interval in the main() function:

// 10 routines, scraping every 1 minute
go startScraping(db, 10, time.Minute)

Project Structure

  • internal/database: Go code generated by SQLC (do not edit manually).
  • sql/schema: Goose migration files.
  • sql/queries: Raw SQL queries used by SQLC.
  • vendor: Application dependencies.
  • main.go: Entry point, server configuration, and router.
  • scrapper.go: Logic for the background worker that fetches RSS feeds.
