Backend providing asynchronous request processing for frontends handling intensive workloads.
This repository implements open design proposal 001:
Ensure you have Docker installed.
## Clone the Repository

```shell
git clone https://github.com/digital-land/async-request-backend.git
cd async-request-backend
```

## Create and Activate a Virtual Environment

```shell
python -m venv venv
source venv/bin/activate
```

## Install Dependencies

```shell
pip install -r requirements.txt
```

## Initialize the Project

```shell
make init
```

A Docker Compose setup has been configured to run the async request backend. It runs a Python/FastAPI service for receiving requests, a Postgres database to store requests, an SQS queue (using the excellent LocalStack container) to trigger processing, and a basic Python app to process the requests.
You can run the docker compose stack by executing the following command:
```shell
docker-compose up -d --no-deps --build
```

To view service logs, use:
```shell
docker compose logs -f <service_name>

# Example
docker compose logs -f request-processor
```

To inspect the database tables and records via the CLI, execute:

```shell
docker-compose exec request-db psql -U postgres -d request_database
```

To create a new request, you can POST via curl:
```shell
curl --location 'http://localhost:8000/requests' \
--header 'Content-Type: application/json' \
--data-raw '{
"user_email": "[email protected]"
}'
```

Alternatively, a good way to test the local service (with Postman) is to go to https://provide.planning.data.gov.uk/, select a dataset, and use its Endpoint URL. Fill in the check_url_request.json template in the request-api folder with this endpoint and send a POST to http://localhost:8000/requests with that JSON in the body.
You can then call GET http://localhost:8000/requests/{request.id} to see the results. http://localhost:8000/requests/{request.id}/response-details provides the breakdown that the provide frontend builds from.
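Since processing is asynchronous, checking the results usually means polling the GET endpoint until the request finishes. A small sketch of that loop, with the fetch step injected as a callable so it can wrap any HTTP client; note the status values (`NEW`, `PROCESSING`) are assumptions for illustration, not the API's documented states:

```python
import time


def poll_request(fetch, request_id, attempts=10, delay=1.0):
    """Poll a request until it leaves the in-progress states.

    `fetch` is any callable returning the decoded JSON for a request id,
    e.g. a thin wrapper around GET http://localhost:8000/requests/{id}.
    The status names below are illustrative assumptions.
    """
    for _ in range(attempts):
        result = fetch(request_id)
        if result.get("status") not in ("NEW", "PROCESSING"):
            return result
        time.sleep(delay)
    raise TimeoutError(f"request {request_id} still in progress after {attempts} polls")
```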
If changes are made, you can use the production async backend at http://production-pub-async-api-lb-636110663.eu-west-2.elb.amazonaws.com with a POST to /requests, then compare production's results with what your localhost machine produces.
To create an SQS queue, you can use the AWS CLI:
```shell
aws --endpoint-url=http://localhost:4566 sqs create-queue --queue-name async-request-queue --region eu-west-2 --output table | cat
```

You can place a test message on the queue like so:
```shell
aws --endpoint-url=http://localhost:4566 sqs send-message --queue-url http://sqs.eu-west-2.localhost.localstack.cloud:4566/000000000000/async-request-queue --message-body "Hello World"
```

You can read the test message from the queue like so:
```shell
aws --endpoint-url=http://localhost:4566 sqs receive-message --queue-url http://sqs.eu-west-2.localhost.localstack.cloud:4566/000000000000/async-request-queue
```

To delete the message, run the following command, making use of the receipt-handle value associated with the message, e.g.
```shell
aws --endpoint-url=http://localhost:4566 sqs delete-message --queue-url http://sqs.eu-west-2.localhost.localstack.cloud:4566/000000000000/async-request-queue \
--receipt-handle "MzczYmIzODAtNmM2YS00ZDAyLThkOWYtMTgyYjcyYzZlOTA0IGFybjphd3M6c3FzOmV1LXdlc3QtMjowMDAwMDAwMDAwMDA6YXN5bmMtcmVxdWVzdC1xdWV1ZSBhMjk1ZGVhNi1jNGI2LTQ5ZDQtODEyNC0yNjMwMjFhOWZlOTMgMTcwNzgzNzc1My43NzMzOTk4"
```

To include new dependencies, update the requirements.in file with the desired packages. Afterward, run the following command to generate an updated requirements.txt file, which includes both direct and transitive dependencies:

```shell
pip-compile requirements/requirements.in
```
This ensures that your project accurately reflects its dependencies, including any transitive dependencies required by the newly added packages.
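As an aside to the SQS walkthrough above, the same send/receive/delete round trip can be scripted from Python with boto3. This is a sketch under assumptions: it expects a LocalStack container listening on localhost:4566, and the helper names `queue_url` and `round_trip` are invented here, not part of the repository.

```python
def queue_url(name, account_id="000000000000", region="eu-west-2"):
    """LocalStack's default queue URL shape (000000000000 is its dummy account id)."""
    return f"http://sqs.{region}.localhost.localstack.cloud:4566/{account_id}/{name}"


def round_trip(queue_name="async-request-queue", endpoint="http://localhost:4566"):
    """Send, read, and delete one message. Requires boto3 and a running LocalStack."""
    import boto3  # third-party: pip install boto3

    sqs = boto3.client(
        "sqs",
        endpoint_url=endpoint,
        region_name="eu-west-2",
        aws_access_key_id="test",  # LocalStack accepts any credentials
        aws_secret_access_key="test",
    )
    url = queue_url(queue_name)
    sqs.send_message(QueueUrl=url, MessageBody="Hello World")
    for message in sqs.receive_message(QueueUrl=url).get("Messages", []):
        print(message["Body"])
        sqs.delete_message(QueueUrl=url, ReceiptHandle=message["ReceiptHandle"])
```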