Background
A recent project required me to learn the task queue Celery. While following the official tutorial's demo, I hit the error in the title. I eventually found the fix in a GitHub issue, and I'm recording it here.
Reproducing the problem
Local environment:
- Windows 7
- Python 3.6.7
- Celery 4.1.0
Code, tasks.py:
from celery import Celery

app = Celery('tasks', broker='redis://:xxxx@xxx.xxx.xxx.xx:6379/0')

@app.task
def add(x, y):
    return x + y
Run the worker:
celery -A tasks worker --loglevel=info
Output:
-------------- celery@YG_lin v4.2.1 (windowlicker)
---- **** -----
--- * *** * -- Windows-7-6.1.7601-SP1 2018-12-05 20:03:58
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: tasks:0x38527b8
- ** ---------- .> transport: redis://:**@192.168.0.2:6379//
- ** ---------- .> results: redis://:**@192.168.0.2:6379/1
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> celery exchange=celery(direct) key=celery
[tasks]
. tasks.add
[2018-12-05 20:03:58,721: INFO/MainProcess] Connected to redis://:**@192.168.0.2:6379//
[2018-12-05 20:03:58,735: INFO/MainProcess] mingle: searching for neighbors
[2018-12-05 20:03:58,976: INFO/SpawnPoolWorker-1] child process 16292 calling self.run()
[2018-12-05 20:03:59,006: INFO/SpawnPoolWorker-2] child process 14764 calling self.run()
[2018-12-05 20:03:59,026: INFO/SpawnPoolWorker-3] child process 13864 calling self.run()
[2018-12-05 20:03:59,078: INFO/SpawnPoolWorker-4] child process 15980 calling self.run()
[2018-12-05 20:03:59,893: INFO/MainProcess] mingle: all alone
[2018-12-05 20:03:59,915: INFO/MainProcess] celery@YG_lin ready.
Open another Python shell:
>>> from tasks import add
>>> add.delay(4, 4)
The worker then reports:
[2018-12-05 20:03:59,933: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
File "c:\users\administrator\envs\dj11.7\lib\site-packages\billiard\pool.py", line 358, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\users\administrator\envs\dj11.7\lib\site-packages\celery\app\trace.py", line 537, in _fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
[2018-12-05 20:04:15,392: INFO/MainProcess] Received task: tasks.add[b76c9d02-ca3c-4272-b593-89c280f633da]
[2018-12-05 20:04:15,399: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
File "c:\users\administrator\envs\dj11.7\lib\site-packages\billiard\pool.py", line 358, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\users\administrator\envs\dj11.7\lib\site-packages\celery\app\trace.py", line 537, in _fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
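The traceback itself is just tuple unpacking failing: `_fast_trace_task` expects a module-level cache `_loc` filled in by the pool's initializer, but on Windows the spawned child processes reportedly never run that initializer, so `_loc` is still empty when the first task arrives. The exact error is easy to reproduce in plain Python:

```python
# _loc stands in for celery's per-worker cache; it stays empty when the
# prefork pool's initializer never runs in the spawned child process.
_loc = ()

try:
    tasks, accept, hostname = _loc
except ValueError as e:
    print(e)  # not enough values to unpack (expected 3, got 0)
```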
Solution:
From other people's reports, this happens when running Celery 4.x on Windows (Celery 4 dropped official Windows support, and its default prefork pool breaks there). I haven't dug into the root cause, but the following workaround fixed it for me.
First, install eventlet:
pip install eventlet
Then start the worker with an extra `-P eventlet` flag:
celery -A <mymodule> worker -l info -P eventlet
For this demo, that is:
celery -A tasks worker -l info -P eventlet
After that, tasks are delivered and executed normally.
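eventlet isn't the only pool that sidesteps the Windows prefork issue. Two alternatives I have seen reported (not verified in this demo) are the built-in single-threaded `solo` pool, which is enough for local development, and the `gevent` pool:

```shell
# single-threaded pool, no extra dependency; fine for local testing
celery -A tasks worker -l info -P solo

# green-thread pool, similar to eventlet (pip install gevent first)
celery -A tasks worker -l info -P gevent
```

Note that `solo` processes one task at a time, so it is unsuitable for anything beyond development.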