2012-09-07

I'm trying to follow the instructions at http://docs.celeryproject.org/en/latest/django/first-steps-with-django.html to get Celery working with Django, but using the Django database as the Celery broker gives me a connection refused error.

For the broker, I'm trying to use the Django database, following the instructions at http://docs.celeryproject.org/en/latest/getting-started/brokers/django.html#broker-django.

I can successfully start a worker, as you can see below:

*****-MacBook-Air:celeryproject *****$ python manage.py celery worker -E --loglevel=info 
/Library/Python/2.7/site-packages/django_celery-3.0.9-py2.7.egg/djcelery/loaders.py:124: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments! 
  warnings.warn("Using settings.DEBUG leads to a memory leak, never " 
 -------------- celery@*****-MacBook-Air.local v3.0.9 (Chiastic Slide) 
---- **** ----- 
--- * *** * -- [Configuration] 
-- * - **** --- . broker:  django://localhost// 
- ** ---------- . app:   default:0x10b785fd0 (djcelery.loaders.DjangoLoader) 
- ** ---------- . concurrency: 4 (processes) 
- ** ---------- . events:  ON 
- ** ---------- 
- *** --- * --- [Queues] 
-- ******* ---- . celery:  exchange:celery(direct) binding:celery 
--- ***** ----- 

[Tasks]  . celeryapp.tasks.add 

[2012-09-06 20:03:23,711: WARNING/MainProcess] celery@*****-MacBook-Air.local has started. 
[2012-09-06 20:03:24,406: WARNING/PoolWorker-3] /Library/Python/2.7/site-packages/django_celery-3.0.9-py2.7.egg/djcelery/loaders.py:124: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments! 
  warnings.warn("Using settings.DEBUG leads to a memory leak, never " 
[2012-09-06 20:03:24,406: WARNING/PoolWorker-1] /Library/Python/2.7/site-packages/django_celery-3.0.9-py2.7.egg/djcelery/loaders.py:124: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments! 
  warnings.warn("Using settings.DEBUG leads to a memory leak, never " 
[2012-09-06 20:03:24,406: WARNING/PoolWorker-2] /Library/Python/2.7/site-packages/django_celery-3.0.9-py2.7.egg/djcelery/loaders.py:124: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments! 
  warnings.warn("Using settings.DEBUG leads to a memory leak, never " 

[2012-09-06 20:03:24,406: WARNING/PoolWorker-4] /Library/Python/2.7/site-packages/django_celery-3.0.9-py2.7.egg/djcelery/loaders.py:124: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments! 
  warnings.warn("Using settings.DEBUG leads to a memory leak, never " 

But when I continue following the instructions and call the add task created in the getting-started steps, I run into the following error:

Python 2.7.1 (r271:86832, Jul 31 2011, 19:30:53) 
[GCC 4.2.1 (Based on Apple Inc. build 5658) (LLVM build 2335.15.00)] on darwin 
Type "help", "copyright", "credits" or "license" for more information. 
>>> from celeryapp.tasks import add 
>>> add.delay(2,2) 
Traceback (most recent call last): 
    File "<stdin>", line 1, in <module> 
    File "/Library/Python/2.7/site-packages/celery-3.0.9-py2.7.egg/celery/app/task.py", line 343, in delay 
    return self.apply_async(args, kwargs) 
    File "/Library/Python/2.7/site-packages/celery-3.0.9-py2.7.egg/celery/app/task.py", line 458, in apply_async 
    with app.producer_or_acquire(producer) as P: 
    File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/contextlib.py", line 17, in __enter__ 
    return self.gen.next() 
    File "/Library/Python/2.7/site-packages/celery-3.0.9-py2.7.egg/celery/app/base.py", line 256, in producer_or_acquire 
    with self.amqp.producer_pool.acquire(block=True) as producer: 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/connection.py", line 712, in acquire 
    R = self.prepare(R) 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/pools.py", line 54, in prepare 
    p = p() 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/pools.py", line 45, in <lambda> 
    return lambda: self.create_producer() 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/pools.py", line 42, in create_producer 
    return self.Producer(self._acquire_connection()) 
    File "/Library/Python/2.7/site-packages/celery-3.0.9-py2.7.egg/celery/app/amqp.py", line 160, in __init__ 
    super(TaskProducer, self).__init__(channel, exchange, *args, **kwargs) 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/messaging.py", line 83, in __init__ 
    self.revive(self.channel) 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/messaging.py", line 174, in revive 
    channel = self.channel = maybe_channel(channel) 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/connection.py", line 886, in maybe_channel 
    return channel.default_channel 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/connection.py", line 624, in default_channel 
    self.connection 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/connection.py", line 617, in connection 
    self._connection = self._establish_connection() 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/connection.py", line 576, in _establish_connection 
    conn = self.transport.establish_connection() 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/transport/amqplib.py", line 344, in establish_connection 
    connect_timeout=conninfo.connect_timeout) 
    File "/Library/Python/2.7/site-packages/kombu-2.4.5-py2.7.egg/kombu/transport/amqplib.py", line 154, in __init__ 
    super(Connection, self).__init__(*args, **kwargs) 
    File "build/bdist.macosx-10.7-intel/egg/amqplib/client_0_8/connection.py", line 129, in __init__ 
    File "build/bdist.macosx-10.7-intel/egg/amqplib/client_0_8/transport.py", line 281, in create_transport 
    File "build/bdist.macosx-10.7-intel/egg/amqplib/client_0_8/transport.py", line 85, in __init__ 
socket.error: [Errno 61] Connection refused 

Basically, it seems that the application is unable to set up a transport to my broker.

Below is a snippet of my settings.py:

# Django settings for celeryproject project. 
import djcelery 
djcelery.setup_loader() 

## Broker settings. 
BROKER_URL= 'django://' 
BROKER_BACKEND = "djkombu.transport.DatabaseTransport" 

# List of modules to import when celery starts. 
CELERY_IMPORTS = ("celeryapp.tasks",) 

## Using the database to store task state and results. 
#CELERY_RESULT_BACKEND = "database" 
#CELERY_RESULT_DBURI = "sqlite:///mydatabase.db" 

CELERY_ANNOTATIONS = {"tasks.add": {"rate_limit": "10/s"}} 

INSTALLED_APPS = (
    'django.contrib.auth', 
    'django.contrib.contenttypes', 
    'django.contrib.sessions', 
    'django.contrib.sites', 
    'django.contrib.messages', 
    'django.contrib.staticfiles', 
    'celeryapp', 
    'djcelery', 
    #'djcelery.transport', 
    'kombu.transport.django', 
    # Uncomment the next line to enable the admin: 
    # 'django.contrib.admin', 
    # Uncomment the next line to enable admin documentation: 
    # 'django.contrib.admindocs', 
) 

Any ideas?


I have the same problem, using celery/django-celery 3.0.11 and kombu 2.5.4. – jcdyer

Answer


The problem seems to be that the instructions leave you with an unconfigured loader when you queue a task from the interactive shell.

If you had a transport problem, you would have gotten an error when running the Celery worker, not when queuing the task. Note that the traceback goes through kombu/transport/amqplib.py: without the djcelery loader configured, the shell's default app falls back to the AMQP transport on localhost instead of your django:// broker, which is why the connection is refused.

Here's what worked for me:

>>> import djcelery 
>>> djcelery.setup_loader() 
>>> from myapp.tasks import add 
>>> add.delay(2,2) 
<AsyncResult: d43a673a-c2a5-4116-b60e-c59afa7dff51> 
>>> result = add.delay(4,5) 
>>> result.ready() 
True 
>>> result.result 
9 
>>> result.get() 
9 
>>> result.successful() 
True 
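
An alternative (a sketch, assuming the project's settings.py already calls djcelery.setup_loader() as shown in the question) is to open the shell through manage.py, which imports your settings, and with them the loader setup, before you import the task:

```
$ python manage.py shell
>>> from celeryapp.tasks import add
>>> add.delay(2, 2)
```

This avoids having to call djcelery.setup_loader() by hand in every interactive session.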

The answers above should work fine. However, I got the same connection refused error in a Django project; it happened because I had forgotten to put the Celery configuration in settings.py. To fix it, just add the following settings to settings.py:

# CELERY SETTINGS 
BROKER_URL = 'redis://localhost:6379/0' 
CELERY_ACCEPT_CONTENT = ['json'] 
CELERY_TASK_SERIALIZER = 'json' 
CELERY_RESULT_SERIALIZER = 'json' 
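
These Redis settings assume a Redis server is actually running on localhost:6379; if it isn't, you get the same [Errno 61] Connection refused. For the database broker used in the question, the analogous minimal configuration (a sketch, based on the djcelery/kombu docs of that era) would be:

```python
# settings.py -- minimal configuration for the Django database broker
# (no external broker daemon needed)
import djcelery
djcelery.setup_loader()

BROKER_URL = 'django://'   # route messages through the Django database

# 'djcelery' and 'kombu.transport.django' must also be in INSTALLED_APPS,
# as in the question's settings snippet.
```

With BROKER_URL = 'django://' set, the separate BROKER_BACKEND line from the question should not be necessary.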