1. First, create a folder named worker in the project directory for the asynchronous work unit.
2. Inside worker, create an __init__.py file and a config.py file (the app must live in __init__.py so that -A worker can find it later). Their contents are:
__init__.py:
#!/usr/bin/env python
import os
from celery import Celery
from worker import config

# Load Django's settings before the Celery app is created
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "social_demo.settings")

# Define the Celery app
celery_app = Celery('worker')

# config_from_object loads the configuration from the config module
celery_app.config_from_object(config)

# Auto-discover tasks defined in the Django apps
celery_app.autodiscover_tasks()
config.py:
broker_url = 'redis://127.0.0.1:6379/1'
broker_pool_limit = 100  # broker connection pool size; the default is 10
timezone = 'Asia/Shanghai'
# Content types the worker will accept
accept_content = ['pickle', 'json']
# Redis stores bytes/strings, so serialize task arguments with pickle.
# Only use pickle with a trusted broker: unpickling can execute arbitrary code.
task_serializer = 'pickle'
result_expires = 3600  # how long task results are kept, in seconds
# Where task results are stored
result_backend = 'redis://127.0.0.1:6379/1'
result_serializer = 'pickle'
result_cache_max = 10000  # maximum number of cached task results
# Redirect task stdout/stderr into the log at this level
worker_redirect_stdouts_level = 'INFO'
3. Then call the task from a view function:
# Upload an avatar
def upload_avatar(request):
    avatar_file = request.FILES['avatar']
    # delay() hands the task to Celery to run asynchronously
    logics.upload_avatar.delay(request.user, avatar_file)
    return render_json()
4. On the server, start the Django dev server and, in a separate process, a Celery worker:
python manage.py runserver
celery worker -A worker --loglevel=info
(With Celery 5.x the option order changed: celery -A worker worker --loglevel=info)