
I'm trying to get Celery's official tutorial to work, but I keep getting this error:

D:\test>celery -A tasks worker --loglevel=info
-------------- celery@BLR122S v3.0.17 (Chiastic Slide)
---- **** -----
--- * * * -- [Configuration]
-- * - **** --- . broker: amqp://guest@localhost:5672//
- ** ---------- . app: tasks:0x2a76850
- ** ---------- . concurrency: 2 (processes)
- ** ---------- . events: OFF (enable -E to monitor this worker)
- ** ----------
- *
--- * --- [Queues]
-- ******* ---- . celery: exchange:celery(direct) binding:celery
--- ***** -----
[Tasks]
. tasks.add
[2013-03-29 17:50:52,533: WARNING/MainProcess] celery@BLR122S ready.
[2013-03-29 17:50:52,568: INFO/MainProcess] consumer: Connected to amqp://guest@127.0.0.1:5672//.
[2013-03-29 17:51:32,496: INFO/MainProcess] Got task from broker: tasks.add[83459233-ce54-40ed-a2a8-ee0d60768006]
[2013-03-29 17:51:32,562: ERROR/MainProcess] Task tasks.add[83459233-ce54-40ed-a2a8-ee0d60768006] raised exception: Task of kind 'tasks.add' is not registered, please make sure it's imported.
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\billiard\pool.py", line 293, in worker
    result = (True, func(*args, **kwds))
  File "C:\Python27\lib\site-packages\celery\task\trace.py", line 320, in _fast_trace_task
    return _tasks[task].__trace__(uuid, args, kwargs, request)[0]
  File "C:\Python27\lib\site-packages\celery\app\registry.py", line 20, in __missing__
    raise self.NotRegistered(key)
NotRegistered: 'tasks.add'

I installed celery==3.0.17 and RabbitMQ, then started the worker with "D:\test>celery -A tasks worker --loglevel=info".
tasks.add appears under [Tasks], but calling it with:

>>> from tasks import add
>>> add.delay(1,1)
# Out: AsyncResult: 83459233-ce54-40ed-a2a8-ee0d60768006

fails with the error above. Has anyone run into the same problem?

Edit: Here is my tasks.py, copied from the tutorial.

from celery import Celery

celery = Celery('tasks', broker='amqp://guest@localhost//')

@celery.task
def add(x, y):
    return x + y
shaun shia (edited by Dhia)

4 Answers


Try importing tasks first. I recommend doing your work in an interactive Python environment, like a Python IDE, and then running:

  • import tasks

before you call tasks.add.
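The reason this matters is the task's registered name. A minimal plain-Python sketch (an assumption about Celery's default behavior: tasks are named `<module path>.<function name>`) of why importing the same file under a different module path produces a name the worker never registered:

```python
def task_name(module_name, func_name):
    # Celery's default task name is "<module path>.<function name>".
    return f"{module_name}.{func_name}"

# A worker started with "celery -A tasks worker" registers:
worker_side = task_name("tasks", "add")        # "tasks.add"

# A client importing the same file as a package submodule would send:
client_side = task_name("myapp.tasks", "add")  # "myapp.tasks.add"

print(worker_side, client_side, worker_side == client_side)
```

So `import tasks` (the same module path the worker used) makes both sides agree on the name `tasks.add`; `from myapp.tasks import add` would send `myapp.tasks.add`, which the worker does not know.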

Roger Liu (edited by Dhia)
  • I edited my question. Where should I put import tasks? (PS: this is my first question too) – shaun shia Mar 29 '13 at 11:26
  • I solved the problem but don't know why. If I run tasks.add in a dedicated file like [link](http://stackoverflow.com/questions/9769496/celery-received-unregistered-task-of-type-run-example) instead of in the shell, there is no error. – shaun shia Mar 29 '13 at 12:34
  • I think it's about the path; in your shell, add your project's path to the environment variable. – Roger Liu Mar 30 '13 at 03:38
  • This is a long time later, but thank you. I kept doing `from myapp.tasks import add` and got the same error. I did `from tasks import add` and it worked. – Amon Feb 14 '18 at 01:25

The following will solve your problem:

from tasks import add
res = add.delay(1, 2)  # call add asynchronously
res.get()              # wait for and fetch the result

Restart your worker after any changes using:

celery -A tasks worker --loglevel=info
Roshan Bagdiya

I had the same problem. I found the solution by comparing the PYTHONPATH seen by my main.py with the one in the console where I ran the celery worker: tasks.py was not on the worker's PYTHONPATH, so the worker couldn't import the task.
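One way to spot the mismatch is to print the interpreter's import search path in both places (once in the shell that runs the worker, once where main.py runs) and compare. A minimal sketch:

```python
import os
import sys

# sys.path is seeded from PYTHONPATH plus interpreter defaults; if the
# project directory is missing from this list in the worker's
# environment, the worker cannot import tasks.py.
print("PYTHONPATH =", os.environ.get("PYTHONPATH", "<not set>"))
for entry in sys.path:
    print(entry)
```

Run it in both environments and look for the project directory in the worker's output.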

my files:

try_celery/tasks.py

from celery import Celery

app = Celery('tasks', broker='pyamqp://guest@localhost//', backend='redis://localhost')


@app.task
def add(x, y):
    return x + y

try_celery/main.py

from try_celery.tasks import add

if __name__ == '__main__':
    result = add.delay(4, 4)
    while not result.ready():  # busy-wait until the worker finishes
        pass
    print(result.get())

run_worker.sh

#!/bin/bash
export PYTHONPATH=$PYTHONPATH:/<path_to_project>  # this line solved the problem (export it so the worker process sees it)
celery -A try_celery.tasks worker --loglevel=info

To execute, first run run_worker.sh, then run python3 main.py. When I ran main.py from my IDE, PYTHONPATH had already been updated with the project path, which is why it worked there.

Rugnar
  • Curiously this also applies if you use this format: `signature('proj.tasks.add', args=(10, 10))()`. If you type `import proj.tasks` first, then it works. – Chris Huang-Leaver Mar 31 '23 at 02:24

I solved the same problem by restarting the celery service:

sudo systemctl restart celery

If you change your tasks, you need to restart the service too: every time celery starts, it registers all the tasks it can find.

oruchkin
  • This sounds like it is extremely specific to your particular install. Most folks won't be running `celery` via `systemd`. – DeusXMachina Aug 15 '22 at 21:31
  • @DeusXMachina, if I found this question, somebody else might find it too. Also, how would most folks (and you) run celery? As far as I know, most folks use systemd; I don't even know of other options – oruchkin Aug 16 '22 at 05:52
  • "how would most folks run celery?" - tons of ways: `celery -A worker` directly, Docker, postman, supervisord, maybe even calling it from the library. So this boils down to "try restarting celery", in which case your answer should say that. – DeusXMachina Sep 13 '22 at 16:30
  • Running `celery -A worker` directly only starts celery; it won't autostart on boot. supervisord, as far as I know, is outdated, and most folks use systemd for autostart. And my answer is actually about restarting celery, just in more detail – oruchkin Sep 13 '22 at 16:54
  • I think you are still missing the point. Your answer only addresses a very specific use-case of celery, not the general problem of "the application cannot find the registered tasks". And supervisord isn't outdated, I use it all the time. Nor do "most folks use systemd for autostart"; there are myriad other ways of deploying celery. – DeusXMachina Sep 14 '22 at 00:00
  • @DeusXMachina, my answer is about "the application cannot find the registered tasks"; this is not just "restart your computer". As you may not know, every time you make changes to your code, you have to restart celery, because celery registers tasks at startup, so your application will not find tasks you just added until celery is reloaded. So this is not a specific use-case; this is a very common case – oruchkin Sep 14 '22 at 05:47
  • You just answered the question correctly: reloading celery is the solution. The celery process is unable to find the task in memory, and due to the mechanics of the Python import system, the only way to rectify this is to restart the celery process itself. It has nothing to do with systemd, or any particular runtime. However celery was started, it needs to be restarted in that fashion. – DeusXMachina Sep 15 '22 at 22:43