A Practical Guide to Deep Django-Celery Integration -- The Complete Workflow from Configuration to Production Deployment

I. Environment Setup and Dependencies

# Install the core dependencies (the gevent extra covers the -P gevent workers used in Part VII)
pip install "celery[redis,gevent]" django-celery-results django-celery-beat flower

# Version requirements
Django 3.2+  
Celery 5.2+  
Redis 4.0+  

II. Project Layout

myproject/  
├── myproject/  
│   ├── __init__.py  
│   ├── settings.py         # main settings
│   ├── celery.py           # Celery instance
│   └── urls.py  
├── apps/  
│   └── orders/  
│       ├── __init__.py  
│       ├── tasks.py        # business tasks
│       └── models.py  
└── manage.py  

III. Configuration Walkthrough

1. Edit settings.py

# settings.py  

# Celery configuration
CELERY_BROKER_URL = 'redis://:password@redis-host:6379/0'  
CELERY_RESULT_BACKEND = 'django-db'  # store task results in the Django database
CELERY_CACHE_BACKEND = 'default'  
CELERY_ACCEPT_CONTENT = ['json']  
CELERY_TASK_SERIALIZER = 'json'  
CELERY_RESULT_SERIALIZER = 'json'  
CELERY_TIMEZONE = 'Asia/Shanghai'  

# Register the Celery-related apps
INSTALLED_APPS = [  
    ...,  
    'django_celery_results',  
    'django_celery_beat',  
]  

# Django cache configuration
# Note: django.core.cache.backends.redis.RedisCache requires Django 4.0+;
# on Django 3.2 use the third-party django-redis package instead
CACHES = {  
    'default': {  
        'BACKEND': 'django.core.cache.backends.redis.RedisCache',  
        'LOCATION': 'redis://:password@redis-host:6379/1',  
    }  
}  
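
Part VII below starts workers bound to dedicated queues (high_priority, payment, notifications), so tasks must be routed onto those queues somewhere in the configuration. A minimal routing sketch, assuming the task names used later in this guide (the mapping itself is illustrative, not part of the original project):

# settings.py (continued) -- illustrative queue routing
CELERY_TASK_DEFAULT_QUEUE = 'default'
CELERY_TASK_ROUTES = {
    'process_new_order': {'queue': 'high_priority'},                # named task from Part IV
    'inventory.tasks.check_low_stock': {'queue': 'notifications'},  # beat task from Part V
}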

2. Create celery.py

# myproject/celery.py  
import os  
from celery import Celery  

# Set the default Django settings module
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')  

app = Celery('myproject')  

# Read configuration from Django settings; all relevant keys use the CELERY_ prefix
app.config_from_object('django.conf:settings', namespace='CELERY')  

# Auto-discover tasks.py modules in all registered Django apps
app.autodiscover_tasks()  

@app.task(bind=True, ignore_result=True)  
def debug_task(self):  
    print(f'Request: {self.request!r}')  

3. Configure the project's __init__.py

# myproject/__init__.py  
from .celery import app as celery_app  

__all__ = ('celery_app',)  # make sure the Celery app is loaded when Django starts
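
With these three pieces in place you can already smoke-test the setup: start a worker from the project root and confirm that it connects to the broker and lists the registered tasks on startup.

# Start a development worker (requires the Redis broker from settings.py)
celery -A myproject worker -l INFO

# In a second shell, fire the debug task defined in celery.py
python manage.py shell -c "from myproject.celery import debug_task; debug_task.delay()"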

IV. Hands-On Task Development

1. Create the business tasks

# apps/orders/tasks.py  
from celery import shared_task  
from django.core.mail import send_mail  
from .models import Order  

@shared_task(name="process_new_order")  
def process_order(order_id):  
    """处理新订单"""  
    order = Order.objects.get(id=order_id)  
    order.process()  # 业务处理  
    return f"Order {order.id} processed"  

@shared_task(name="send_order_confirmation", autoretry_for=(Exception,), retry_backoff=30, max_retries=3)  
def send_confirmation_email(user_email, order_id):  
    """发送订单确认邮件(带自动重试)"""  
    send_mail(  
        subject=f"Order #{order_id} Confirmation",  
        message="Your order has been processed",  
        from_email="[email protected]",  
        recipient_list=[user_email],  
        fail_silently=False  
    )  
    return f"Email sent to {user_email}"  

2. Call tasks from views

# apps/orders/views.py
from django.http import JsonResponse
from django.views.decorators.http import require_POST

from .models import Order
from .tasks import process_order, send_confirmation_email, log_order_creation

@require_POST
def create_order(request):
    order = Order.objects.create(
        user=request.user,
        items=request.POST.get('items')
    )

    # Process the order asynchronously
    process_order.delay(order.id)

    # Chain tasks: each task's return value is passed as the first
    # positional argument of the next task in the chain
    (send_confirmation_email.s(request.user.email, order.id) |
     log_order_creation.s(order.id)).delay()

    return JsonResponse({"status": "created", "order_id": order.id})
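
The chain above references a log_order_creation task that never appears elsewhere in this guide. Since a chain passes each task's return value as the first positional argument of the next signature, the task must accept that value in addition to its own order_id; a hedged sketch:

# apps/orders/tasks.py (continued) -- hypothetical logging task
import logging

logger = logging.getLogger(__name__)

@shared_task(name="log_order_creation")
def log_order_creation(previous_result, order_id):
    """previous_result is send_confirmation_email's return value."""
    logger.info("Order %s created; upstream task said: %s", order_id, previous_result)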

V. Periodic Task Management

1. Manage periodic tasks dynamically via the database

# Run the migrations that create the periodic-task tables
python manage.py migrate django_celery_beat
# (the django-db result backend likewise needs: python manage.py migrate django_celery_results)
# Then add periodic tasks in the Django admin:
# visit /admin/django_celery_beat/periodictask/add/

# Example: clean up order logs every night
Task: orders.tasks.cleanup_order_logs
Crontab: 0 2 * * *   # runs daily at 02:00
Arguments: []
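
The cleanup_order_logs task referenced in the admin example is not defined in this guide either; a minimal sketch, assuming a hypothetical OrderLog model with a created_at timestamp:

# apps/orders/tasks.py (continued) -- hypothetical cleanup task
from datetime import timedelta
from django.utils import timezone

from .models import OrderLog  # hypothetical model

@shared_task(name="orders.tasks.cleanup_order_logs")
def cleanup_order_logs(days=30):
    """Delete order logs older than `days` days."""
    cutoff = timezone.now() - timedelta(days=days)
    deleted, _ = OrderLog.objects.filter(created_at__lt=cutoff).delete()
    return f"Deleted {deleted} old log entries"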

2. Configure periodic tasks statically in code

# settings.py  
from celery.schedules import crontab  

CELERY_BEAT_SCHEDULE = {  
    'weekly-report': {  
        'task': 'orders.tasks.generate_weekly_report',  
        'schedule': crontab(minute=0, hour=7, day_of_week=1),  # Mondays at 07:00 (without minute=0 it would fire every minute of that hour)
        'args': (),  
    },  
    'every-10-minutes-check': {  
        'task': 'inventory.tasks.check_low_stock',  
        'schedule': 600.0,  # every 10 minutes
    }  
}  
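
Each 'task' value must match the task's registered name exactly, or the worker will reject the message as an unregistered task. Neither task above is defined earlier in this guide; a hedged stub for the weekly report, pinning the name explicitly so it matches the schedule:

# apps/orders/tasks.py (continued) -- hypothetical report task
@shared_task(name="orders.tasks.generate_weekly_report")
def generate_weekly_report():
    # Placeholder: aggregate last week's orders and email the summary
    return "weekly report generated"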

VI. Task Result Handling

1. Track result status

# Look up a task's status from the result table
from django.http import JsonResponse
from django_celery_results.models import TaskResult

def check_task_status(request, task_id):  
    try:  
        task = TaskResult.objects.get(task_id=task_id)  
        return JsonResponse({  
            'status': task.status,  
            'result': task.result,  
            'date_done': task.date_done  
        })  
    except TaskResult.DoesNotExist:  
        return JsonResponse({'error': 'Task not found'}, status=404)  
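
If you only need the state of a single task rather than the stored database row, Celery's AsyncResult reads the same information through the configured result backend; a minimal alternative sketch:

# Alternative lookup via the result backend
from celery.result import AsyncResult
from django.http import JsonResponse

def check_task_status_v2(request, task_id):
    result = AsyncResult(task_id)
    return JsonResponse({'status': result.status, 'result': str(result.result)})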

2. Error handling and alerting

# apps/core/signals.py  
from django.db.models.signals import post_save  
from django.dispatch import receiver  
from django_celery_results.models import TaskResult  
import requests  

@receiver(post_save, sender=TaskResult)  
def handle_failed_task(sender, instance, **kwargs):  
    if instance.status == 'FAILURE':  
        # Send a Slack alert
        requests.post('https://hooks.slack.com/...', json={  
            "text": f"Task failed: {instance.task_id}\nError: {instance.result[:200]}"  
        })  
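
Django only runs signal receivers that have actually been imported, so a standalone signals.py must be loaded explicitly, conventionally from the app's AppConfig.ready(). A sketch, assuming the app lives at apps.core:

# apps/core/apps.py -- register the receiver at startup
from django.apps import AppConfig

class CoreConfig(AppConfig):
    name = 'apps.core'

    def ready(self):
        from . import signals  # noqa: F401 -- imported for its side effects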

VII. Production Deployment

1. Supervisor configuration

; /etc/supervisor/conf.d/celery.conf  

[program:celery_worker]  
command=/opt/project/venv/bin/celery -A myproject worker \  
        -P gevent -c 100 -Q high_priority,default -n worker.%%h  
directory=/opt/project  
user=www-data  
autostart=true  
autorestart=true  
stopsignal=QUIT  
stdout_logfile=/var/log/celery/worker.log  
redirect_stderr=true  

[program:celery_beat]  
command=/opt/project/venv/bin/celery -A myproject beat \  
        -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler  
directory=/opt/project  
user=www-data  
autostart=true  
autorestart=true  
stdout_logfile=/var/log/celery/beat.log  
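
After dropping the file into /etc/supervisor/conf.d/, reload Supervisor and confirm both programs are running:

# Apply the new configuration and verify
sudo supervisorctl reread
sudo supervisorctl update
sudo supervisorctl status celery_worker celery_beat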

2. Deploy cluster monitoring

# Start the Flower monitoring dashboard
celery -A myproject flower \  
    --port=5555 \  
    --basic_auth=admin:ComplexP@ss! \  
    --persistent=True \  
    --db=/var/flower/flower.db  

Monitoring dashboard: https://your-domain.com:5555


VIII. Local Development Tips

1. Local debugging configuration

# settings/dev.py  

# Run tasks eagerly (synchronously, in-process) -- no broker required
CELERY_TASK_ALWAYS_EAGER = True      # execute tasks synchronously
CELERY_TASK_EAGER_PROPAGATES = True  # re-raise task exceptions immediately

# Or point at a local Redis instance
CELERY_BROKER_URL = 'redis://localhost:6379/0'  

2. Quick tests in the shell

python manage.py shell  

>>> from orders.tasks import send_confirmation_email  
>>> task = send_confirmation_email.delay("[email protected]", 123)  
>>> task.get(timeout=10)  # block until the result is ready

IX. Advanced Scenarios

1. Orchestrate task workflows

from celery import chain, chord

def process_order_workflow(order_id):
    # validate_order's return value feeds process_payment; that result is
    # then distributed to both chord header tasks. The chord body
    # (send_confirmation_email) receives the list of header results as its
    # first argument, so its signature must be designed to accept it.
    workflow = chain(
        validate_order.s(order_id),
        process_payment.s(),
        chord(
            [update_inventory.s(), notify_fulfillment.s()],
            send_confirmation_email.s()
        )
    )
    return workflow.delay()
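
The tasks composed above (validate_order, process_payment, update_inventory, notify_fulfillment) are assumed rather than defined in this guide. Hedged stubs showing chain-compatible signatures, where each step returns the order_id so the next step receives it:

# apps/orders/tasks.py (continued) -- hypothetical workflow steps
@shared_task
def validate_order(order_id):
    return order_id  # hand the id to the next step in the chain

@shared_task
def process_payment(order_id):
    return order_id

@shared_task
def update_inventory(order_id):
    return order_id

@shared_task
def notify_fulfillment(order_id):
    return order_id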

2. Task rate limiting

# Throttle calls to an external API
import requests
from celery import shared_task

@shared_task(rate_limit='10/m')  # at most 10 calls per minute
def call_external_api(data):
    # NOTE: Celery enforces rate limits per worker instance, not cluster-wide
    response = requests.post('https://external.api/v1', json=data)
    return response.json()

X. Performance Tuning Suggestions

  1. Task decomposition principles
    • Keep a single task under ~300 ms
    • Break large jobs into chains of small tasks
  2. Resource isolation (dedicated workers per queue)

# Dedicated worker for the high-throughput payment queue
celery -A myproject worker -Q payment -c 50 -P gevent -n payments@%h

# Worker for the low-priority notifications queue
celery -A myproject worker -Q notifications -c 200 -n notify@%h

  3. Expire task results

# settings.py
CELERY_RESULT_EXPIRES = 3600  # purge stored results after one hour
Complete example project: Django-Celery-Integration-Example
Best practice: always load-test before finalizing the configuration
Monitoring requirement: production environments must run Flower + Prometheus

Following this tutorial, you can build an enterprise-grade Django-Celery integration that covers asynchronous task processing, periodic tasks, and distributed queue management, ready for high-concurrency production workloads.
