This repository includes a `docker-compose.yml`. Start the services in the background with:

```shell
docker-compose up -d
```

The `-d` flag runs the containers in the background. Stop them with:

```shell
docker-compose down
```
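A minimal `docker-compose.yml` for this kind of setup might look like the following sketch. The service names, image tag, and command are assumptions for illustration, not taken from the repository's actual file:

```yaml
services:
  redis:
    image: redis:7           # message broker (and result backend)
    ports:
      - "6379:6379"
  worker:
    build: .
    command: celery -A tasks worker --loglevel=INFO
    depends_on:
      - redis
```

With a file like this, `docker-compose up -d` brings up both Redis and the Celery worker together.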
Celery Documentation
Flower Documentation
Install Celery with the Redis extras:

```shell
pip install -U "celery[redis]"
```

Then start a worker:

```shell
celery -A proj worker
```
```
Worker Options:
  -n, --hostname HOSTNAME         Set custom hostname (e.g., 'w1@%%h').
                                  Expands: %%h (hostname), %%n (name) and
                                  %%d (domain).
  -D, --detach                    Start worker as a background process.
  -S, --statedb PATH              Path to the state database. The extension
                                  '.db' may be appended to the filename.
  -l, --loglevel [DEBUG|INFO|WARNING|ERROR|CRITICAL|FATAL]
                                  Logging level.
  -O, --optimization [default|fair]
                                  Apply optimization profile.
  --prefetch-multiplier <prefetch multiplier>
                                  Set custom prefetch multiplier value for
                                  this worker instance.

Pool Options:
  -c, --concurrency <concurrency>
                                  Number of child processes processing the
                                  queue. The default is the number of CPUs
                                  available on your system.
```
Run `celery worker --help` for more information.

If needed (for example, on Windows, where the default prefork pool is not supported), add the `--pool=solo` option.
```python
>>> from tasks import hello
>>> hello.delay()
```
If your Celery app has a result backend configured:

```python
>>> from tasks import hello
>>> result = hello.delay()
```

The `ready()` method returns whether the task has finished processing or not:

```python
>>> result.ready()
False
```

You can wait for the result to complete, but this is rarely used since it turns the asynchronous call into a synchronous one:

```python
>>> result.get(timeout=1)
'hello world'
```