This story adds a celery task to asynchronously execute Rulesets.
Expected behaviour
Assume celery does not execute with the same backend as the calling process; that is, the task must accept only serialized input and/or keys (e.g. ruleset slugs).
Deployment of rulesets to the celery instance shall be done using pip-installable eggs (remember that a ruleset is a collection of rules that either point to a module or have YAML or JSON input).
There is a management command to deploy rulesets, using a graceful restart of the celery instance. The command must take a ruleset name, build the egg, then deploy it (see the sketch below).
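A rough sketch of what such a management command could look like, assuming hypothetical helpers `build_ruleset_egg()` and `deploy_egg()` that do the packaging and the graceful worker restart (neither exists in the project yet):

```python
# deploy_ruleset.py -- sketch only; build_ruleset_egg() and deploy_egg()
# are hypothetical helpers, not existing project code.
from django.core.management.base import BaseCommand, CommandError

from rulesets.packaging import build_ruleset_egg, deploy_egg  # hypothetical


class Command(BaseCommand):
    help = "Build a ruleset egg and deploy it to the celery worker."

    def add_arguments(self, parser):
        parser.add_argument("ruleset_name", help="Name of the ruleset to deploy")

    def handle(self, *args, **options):
        name = options["ruleset_name"]
        try:
            egg_path = build_ruleset_egg(name)  # package the rules + YAML/JSON input
        except ValueError as exc:
            raise CommandError(f"Could not build egg for {name!r}: {exc}")
        deploy_egg(egg_path)  # install on the worker + graceful restart
        self.stdout.write(self.style.SUCCESS(f"Deployed {name} from {egg_path}"))
```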
Implementation notes
Ruleset execution
Ruleset execution is done by calling the respective celery task. Can we have one named celery task for each ruleset? That would let us track each ruleset execution in celery's log and manage it easily with flower etc. If not, fall back to a single generic ruleset celery task that takes the ruleset's slug and the rule context, executes the ruleset, and returns and stores its results.
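A minimal sketch of the fallback generic task, assuming the deployed ruleset eggs expose a lookup helper (`get_ruleset` is a hypothetical name) and that a celery result backend is configured to store return values:

```python
# Sketch only: rulesets.registry / get_ruleset are hypothetical names.
from celery import shared_task

from rulesets.registry import get_ruleset  # hypothetical: resolves a slug to a
                                            # ruleset object from the installed egg(s)


@shared_task(name="rulesets.execute")
def execute_ruleset(ruleset_slug, context):
    """Run the ruleset identified by `ruleset_slug` against `context`.

    Only the slug (a key) and a JSON-serializable context dict cross the
    broker, so the worker does not need to share a backend with the caller.
    """
    ruleset = get_ruleset(ruleset_slug)
    result = ruleset.execute(context)  # hypothetical execution API
    return result                      # stored by the configured result backend
```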
Rule export
see #13
Rule deployment
Two options for deployment:
1. Using a plain celery instance: gracefully shut down the celery worker, pip install the new ruleset/version, then restart the worker (sketched after this list).
2. Using docker: have a docker image with python, django and celery. Each time we want to redeploy the tasks, gracefully shut down the worker, rebuild the docker image, stop the container and start it again.
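A sketch of what option 1's redeploy cycle could look like, assuming the worker writes a pidfile and the path of the built ruleset distribution is known (the `proj` app name and paths are assumptions). Sending SIGTERM triggers celery's warm shutdown, so running tasks finish before the process exits:

```python
# Sketch of option 1 (plain celery instance); pidfile, package path and
# the "proj" celery app name are assumptions about the deployment layout.
import os
import signal
import subprocess
import time


def _alive(pid):
    try:
        os.kill(pid, 0)  # signal 0 only checks that the process exists
        return True
    except OSError:
        return False


def redeploy(package_path, pidfile="/var/run/celery/worker.pid"):
    # Warm shutdown: the worker stops accepting new tasks and exits
    # once the currently running ones are done.
    with open(pidfile) as fh:
        pid = int(fh.read().strip())
    os.kill(pid, signal.SIGTERM)
    while _alive(pid):
        time.sleep(1)

    # Install the newly built ruleset package into the worker's environment.
    subprocess.run(["pip", "install", "--upgrade", package_path], check=True)

    # Start the worker again in the background.
    subprocess.run(
        ["celery", "-A", "proj", "worker", "--detach", f"--pidfile={pidfile}"],
        check=True,
    )
```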
Priority
Since 2 is the same as 1 plus docker, do 1 first. We can do 2 in a later story.
Restart worker instance
As for stopping and restarting celery instances, use flower
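A hedged sketch of driving this through flower's HTTP API; the flower URL, worker node name and authentication are assumptions, and the endpoint paths should be checked against the flower version actually deployed (pool restarts also require the worker to allow them via the `worker_pool_restarts` setting):

```python
# Sketch only: FLOWER_URL and WORKER are assumptions; verify endpoints
# against the deployed flower version and add auth if flower requires it.
import requests

FLOWER_URL = "http://localhost:5555"  # assumption: flower on its default port
WORKER = "celery@worker1"             # assumption: worker node name


def restart_pool():
    # Ask flower to restart the worker's process pool.
    resp = requests.post(f"{FLOWER_URL}/api/worker/pool/restart/{WORKER}")
    resp.raise_for_status()


def shutdown_worker():
    # Ask flower to shut the worker down gracefully (warm shutdown).
    resp = requests.post(f"{FLOWER_URL}/api/worker/shutdown/{WORKER}")
    resp.raise_for_status()
```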