7

I have a Django 1.11 and Celery 4.1 project, configured according to the setup docs. My celery_init.py looks like this:

from __future__ import absolute_import

import os

from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings.settings'

app = Celery('myproject')

app.config_from_object('django.conf:settings', namespace='CELERY')

#app.autodiscover_tasks(lambda: settings.INSTALLED_APPS) # does nothing
app.autodiscover_tasks() # also does nothing

print('Registering debug task...')
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))

However, when I launch a worker with:

.env/bin/celery worker -A myproject -l info

it shows no tasks being found except for the sample "debug_task", even though I have several installed apps with Celery tasks, which should have been found via the call to app.autodiscover_tasks(). This is the initial output my worker generates:

 -------------- celery@localhost v4.1.0 (latentcall)
---- **** ----- 
--- * ***  * -- Linux-4.13.0-16-generic-x86_64-with-Ubuntu-16.04-xenial 2017-10-31 15:56:42
-- * - **** --- 
- ** ---------- [config]
- ** ---------- .> app:         myproject:0x7f952856d650
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     amqp://
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** ----- 
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . myproject.celery_init.debug_task

[2017-10-31 15:56:42,180: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2017-10-31 15:56:42,188: INFO/MainProcess] mingle: searching for neighbors
[2017-10-31 15:56:43,211: INFO/MainProcess] mingle: all alone
[2017-10-31 15:56:43,237: INFO/MainProcess] celery@localhost ready.

All my legacy tasks in my app tasks.py files were defined like:

from celery.task import task

@task(name='mytask')
def mytask():
    blah

The docs suggest using the shared_task decorator, so instead I tried:

from celery import shared_task

@shared_task
def mytask():
    blah

But my Celery worker still doesn't see it. What am I doing wrong?

Edit: I've been able to get tasks to show up by explicitly listing them in my setting's CELERY_IMPORTS list, but even then I have to heavily edit the tasks.py to remove all imports of my Django project (models.py, etc) or it raises the exception Apps aren't loaded yet. This is better than nothing, but requires a huge amount of refactoring. Is there a better way?
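
For reference, a minimal sketch of the CELERY_IMPORTS workaround described above; the module paths (app1.tasks, app2.tasks) are placeholder names, not from the project:

```python
# settings.py -- workaround sketch; the module paths are placeholders.
# Each listed module is imported by the worker at startup, so its
# tasks get registered even when autodiscovery fails.
CELERY_IMPORTS = [
    'app1.tasks',
    'app2.tasks',
]
```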

Cerin
  • What happens when you run `celery worker -A app.tasks -l DEBUG`? I found I needed to specify an app's tasks file, not the project overall. – Jason Nov 02 '17 at 18:23

6 Answers

8

I had a similar issue, and the solution was to add the include kwarg to your Celery() call.

The include argument is a list of modules to import when the worker starts. Add your tasks modules here so that the worker is able to find your tasks.

from django.conf import settings

app = Celery('myproject',
             backend=settings.CELERY.get('backend'),
             broker=settings.CELERY.get('broker'),
             include=['ingest.tasks.web', ... ])

Check out http://docs.celeryproject.org/en/latest/getting-started/next-steps.html#proj-celery-py for more information

Jason
4

Just posting this here (I don't know why it works)

from django.conf import settings

app.config_from_object(settings, namespace='CELERY')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS, force=True)

force=True seems to be the solution

Another thing that works is calling django.setup() before instantiating celery.

from __future__ import absolute_import, unicode_literals
import os

import django

from celery import Celery


# set the default Django settings module for the 'celery' program
# (must be set before django.setup() is called).
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

django.setup()  # This is key

app = Celery('notifs')
app.config_from_object('django.conf:settings', namespace='CELERY')

app.autodiscover_tasks()

This method avoids force=True and importing django.conf.settings, and seems cleaner to me, though I still have no idea why you need to call django.setup(); it isn't stated in the docs.

danidee
0
from django.conf import settings    
celery_app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)

You can add the above lines to make Celery autodiscover all the tasks written across the whole project. This works for me.

daemon24
0

I found the bigger problem was that Celery wasn't setting my custom Celery() instance as the current app. To fix it, I had to modify my celery_init.py to include:

from celery import Celery
from celery._state import _set_current_app

app = Celery('myproject')  # your existing app setup goes here

_set_current_app(app)

Cerin
0

I'm not sure if this will work for you, but it worked for me.

There is a section in the same documentation you mentioned:

[screenshot of the relevant documentation section]

After this, I removed CELERY_IMPORTS and it registered the tasks from all the apps inside my project.

bluefoggy
0

Note that if you have class-based tasks (CBT), like in our projects, those hacks above will still not work. For example:

from celery.app.task import Task

class CustomTask(Task):

    def run(self):
        print('running.')

I found out that there are 2 solutions/workarounds:

  1. Register the task below the task class definition, and assign the task instance as the new global var:

    from celery import current_app
    
    CustomTask = current_app.register_task(CustomTask())
    

    This is somewhat disruptive, though, if you have lots of CBTs: the existing CustomTask().apply_async() call sites must be converted to CustomTask.apply_async(), since CustomTask is now an object/instance (previously a class name). A decorator could probably simplify the registration part.

  2. Use the old/back-compatible base task class as the parent class instead of the celery.app.task.Task:

    from celery.task import Task
    
    class CustomTask(Task):
        ...
    

    This seems to be deprecated in the Celery 5.x codebase, though. Still, it's the simplest solution for now (on Celery 4.x) before we upgrade.
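
For the first workaround, a small helper along these lines (hypothetical, not part of Celery's API) could keep the registration boilerplate in one place:

```python
# Hypothetical helper, not part of Celery's public API: a decorator
# factory that registers a class-based task with a given app and
# rebinds the class name to the registered instance.
def register_task_class(app):
    def decorator(cls):
        # app.register_task() returns the registered task instance,
        # so call sites can keep using CustomTask.apply_async().
        return app.register_task(cls())
    return decorator

# Usage with a real Celery app:
#
#     from celery import current_app
#     from celery.app.task import Task
#
#     @register_task_class(current_app)
#     class CustomTask(Task):
#         def run(self):
#             print('running.')
```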

Ranel Padon