Celery is a Python library for handling asynchronous tasks; it lets you distribute work across multiple worker processes. Below is a walkthrough of how to use it.

First, install Celery with pip:

pip install celery
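If you already know which broker you will use, Celery publishes optional extras that pull in the matching client library; for example, the redis extra installs the Redis client alongside Celery:

# Install Celery together with the Redis client library
pip install "celery[redis]"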
Next, create a Celery application, pointing it at a message broker (here RabbitMQ via the pyamqp transport) and a result backend. Then define an ordinary function and mark it as a Celery task with the @app.task decorator:

from celery import Celery

app = Celery('tasks', broker='pyamqp://guest@localhost//', backend='rpc://')

@app.task
def add(x, y):
    return x + y
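Tasks can also declare retry behavior through standard decorator options. A minimal sketch (flaky_add is a hypothetical task name; the body stands in for an operation that can fail transiently): binding the task gives it access to self, whose retry method re-queues the task after a delay:

@app.task(bind=True, max_retries=3)
def flaky_add(self, x, y):
    try:
        return x + y  # stand-in for a call that can fail transiently
    except Exception as exc:
        # Re-queue this task, waiting 5 seconds before the next attempt
        raise self.retry(exc=exc, countdown=5)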
Call the task with the apply_async or delay method:

# Call the task with apply_async, targeting a named queue
add.apply_async(args=(1, 2), queue='default')

# delay is a shortcut for apply_async with positional arguments
result = add.delay(1, 2)
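apply_async also accepts scheduling options; countdown and expires are standard Celery options:

# Run the task no earlier than 10 seconds from now,
# and discard it if it has not started within 5 minutes
add.apply_async(args=(1, 2), countdown=10, expires=300)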
Start a worker so queued tasks actually get executed (replace your_module with the module where the app is defined):

celery -A your_module worker --loglevel=info
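A worker consumes the default queue unless told otherwise; the -Q option of the celery worker command limits it to specific queues, which pairs with the routing configuration shown later:

# Consume only the listed queues
celery -A your_module worker -Q default,high-priority --loglevel=info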
Calling a task returns an AsyncResult object, which you can use to inspect the task's status and fetch its result:

result = add.delay(1, 2)
print(result.ready())  # Has the task finished?
print(result.get())    # Block until the result is available, then return it
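A few other AsyncResult members are worth knowing; id, state, and the timeout argument to get are all part of the standard API:

print(result.id)               # Unique task id; can be stored and looked up later
print(result.state)            # e.g. 'PENDING', 'STARTED', 'SUCCESS', 'FAILURE'
print(result.get(timeout=10))  # Raises celery.exceptions.TimeoutError after 10s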
To route tasks to different queues, declare the queues in the app configuration; Queue and Exchange come from kombu, the messaging library Celery is built on:

from kombu import Queue, Exchange

app = Celery('tasks', broker='pyamqp://guest@localhost//', backend='rpc://')
app.conf.update(
    task_queues=[
        Queue('default', Exchange('default'), routing_key='default'),
        Queue('high-priority', Exchange('high-priority'), routing_key='high-priority'),
    ],
    task_default_queue='default',
    task_default_exchange='default',
    task_default_routing_key='default',
)
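Instead of choosing a queue at every call site, you can map task names to queues with the task_routes setting (a standard Celery setting); the name tasks.add assumes the module layout from the earlier example:

app.conf.task_routes = {
    # Send every add task to the high-priority queue by default
    'tasks.add': {'queue': 'high-priority'},
}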
When sending a task you can target a queue explicitly and attach a priority (whether the priority is honored depends on the broker):

add.apply_async(args=(1, 2), priority=2, queue='high-priority', routing_key='high-priority')
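With RabbitMQ, message priority only takes effect if the queue declares a maximum priority level; a sketch, assuming RabbitMQ as the broker, passes that through kombu's queue_arguments:

Queue('high-priority', Exchange('high-priority'), routing_key='high-priority',
      queue_arguments={'x-max-priority': 10})  # RabbitMQ priority levels 0-10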
Finally, you can control serialization and compression. This example switches the result backend to Redis, restricts accepted content to JSON, and compresses task messages with gzip:

app = Celery('tasks', broker='pyamqp://guest@localhost//', backend='redis://localhost:6379/0')
app.conf.update(
    accept_content=['json'],
    result_serializer='json',
    task_serializer='json',
    task_compression='gzip',
)
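Serialization and compression can also be overridden for a single call; serializer and compression are standard apply_async options:

# Override the global settings for this one invocation
add.apply_async(args=(1, 2), serializer='json', compression='gzip')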