Cooking pies with Celery
Celery overview

1. Cooking pies with Celery. Alexander Mokrov
2. What the talk is about: a little about queues, a brief overview of what Celery can do, and a usage example built around a virtual pie factory.
3. Queues: a Producer publishes messages to queue_name, and a Consumer reads them from it.
4. Queues: one Producer, one queue_name, two Consumers; several consumers can read from the same queue and share the work.
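The producer/queue/consumer picture on these two slides can be reproduced with Kombu, the messaging library Celery is built on (it appears later in the deck). A minimal sketch; the queue name, broker URL, and message body are chosen for illustration:

    from kombu import Connection

    # Producer: put a message on the queue
    with Connection('amqp://guest:guest@localhost:5672//') as conn:
        queue = conn.SimpleQueue('queue_name')
        queue.put({'pie': 'meat'})
        queue.close()

    # Consumer: take the next message off the same queue
    with Connection('amqp://guest:guest@localhost:5672//') as conn:
        queue = conn.SimpleQueue('queue_name')
        message = queue.get(block=True, timeout=5)
        print(message.payload)
        message.ack()   # acknowledge so the broker removes it from the queue
        queue.close()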
5. Routing: the Producer publishes to exchange X, which routes each message into first_queue or second_queue according to its routing key ("white" or "black"); each queue is read by its own Consumer.
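The same routing idea in Kombu, with a direct exchange and per-queue routing keys. Which queue is bound to which key is not fully recoverable from the slide, so the bindings below are illustrative:

    from kombu import Connection, Exchange, Queue

    exchange = Exchange('X', type='direct')
    first_queue = Queue('first_queue', exchange, routing_key='white')
    second_queue = Queue('second_queue', exchange, routing_key='black')

    with Connection('amqp://guest:guest@localhost:5672//') as conn:
        producer = conn.Producer()
        # routed by key "white" into first_queue
        producer.publish({'filling': 'cherry'}, exchange=exchange,
                         routing_key='white', declare=[first_queue, second_queue])
        # routed by key "black" into second_queue
        producer.publish({'filling': 'meat'}, exchange=exchange,
                         routing_key='black', declare=[first_queue, second_queue])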
6. What is a task queue?
7. Celery: Distributed Task Queue. "Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well. The execution units, called tasks, are executed concurrently on a single or more worker servers using multiprocessing, Eventlet, or gevent. Tasks can execute asynchronously (in the background) or synchronously (wait until ready). Celery is used in production systems to process millions of tasks a day."
8. Ask Solem Hoel (https://github.com/ask). Principal Software Engineer at Robinhood, San Francisco, CA. Previously Staff Engineer, RabbitMQ, at VMware, May 2011 – May 2013 (2 years 1 month). Open source and consulting work on Celery and RabbitMQ since May 2009.
9. Celery and the libraries it is built from, with the line counts from the slide: Kombu (messaging library for Python) 41991 (22541) lines; Billiard (Python multiprocessing fork with improvements and bugfixes) 19191 (13115); Vine (promise, async, future) 2921; Celery itself 104296 (37495); total 168399 (76072).
10. It supports:
    Brokers: RabbitMQ, Redis, MongoDB (exp), ZeroMQ (exp), CouchDB (exp), SQLAlchemy (exp), Django ORM (exp), Amazon SQS (exp)
    Concurrency: prefork (multiprocessing), Eventlet, gevent, threads/single threaded
    Result stores: AMQP, Redis, memcached, MongoDB, SQLAlchemy, Django ORM, Apache Cassandra
    Serialization: pickle, json, yaml, msgpack; zlib, bzip2 compression; cryptographic message signing
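As a rough illustration of where the serialization and compression choices plug in, here is a hedged sketch in the same old-style uppercase settings format the next slides use; the values are examples, not the talk's configuration:

    class Settings:
        BROKER_URL = 'amqp://guest:guest@localhost:5672//'   # RabbitMQ as broker
        CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'   # Redis as result store
        CELERY_TASK_SERIALIZER = 'json'                      # pickle / json / yaml / msgpack
        CELERY_RESULT_SERIALIZER = 'json'
        CELERY_ACCEPT_CONTENT = ['json']                     # reject other content types
        CELERY_MESSAGE_COMPRESSION = 'zlib'                  # zlib or bzip2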
11. Application:
    from celery import Celery
    from conf import Settings
    from tasks.bake_pie import BakePie

    APP_NAME = 'pie_fabric'
    app = Celery(APP_NAME)
    app.config_from_object(Settings)
    app.tasks.register(BakePie())
12. Settings:
    class Settings:
        BROKER_URL = 'amqp://guest:guest@localhost:5672//'
        CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
13. $ celery -A pie_fabric worker -l INFO
     -------------- celery@amokrov v4.0.0rc2 (0today8)
    ---- **** -----
    --- * ***  * -- Linux-3.19.0-61-generic-x86_64-with-Ubuntu-14.04-trusty 2016-06-24 15:52:22
    -- * - **** ---
    - ** ---------- [config]
    - ** ---------- .> app:         pie_fabric:0x7efc8cd4c588
    - ** ---------- .> transport:   amqp://guest:**@localhost:5672//
    - ** ---------- .> results:     redis://127.0.0.1:6379/0
    - *** --- * --- .> concurrency: 4 (prefork)
    -- ******* ----
    --- ***** ----- [queues]
     -------------- .> celery exchange=celery(direct) key=celery

    [tasks]
      . bake_pie
14. Architecture: Client 1 and Client 2 send tasks to the Broker, which holds Task Queue 1, Task Queue 2, and so on; Worker 1 and Worker 2 consume tasks from those queues and write each task result to the Result Backend, from which the clients get the task results.
15. The pie factory: an interface takes the order, and order fulfilment covers getting the ingredients for the dough, creating the dough, getting the filling, forming the pie, and baking the pie; the artifacts along the way are the ingredients, the unbaked blank, and the finished pie.
16. Workflow: order pie starts get flour, get milk, and get eggs, whose results feed create dough; get meat runs in parallel; then seal pie and finally bake pie.
17. The canvas. Workflow primitives:
    ● group
    ● chain
    ● chord
    ● map, starmap, chunks
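As a rough, hedged sketch of how these primitives compose, here is the pie workflow from the previous slide expressed with them; the task names match the deck, and the exact OrderPie code appears on slide 19:

    from celery import chain, group, chord, signature

    get_eggs = signature('get_ingredients', args=('eggs',))
    get_milk = signature('get_ingredients', args=('milk',))
    get_flour = signature('get_ingredients', args=('flour',))
    get_meat = signature('get_ingredients', args=('meat',))

    # chord: run a group of tasks in parallel and feed the list of results to a callback
    dough = chord([get_eggs, get_milk, get_flour], signature('create_dough'))

    # group: run independent branches in parallel
    components = group(dough, get_meat)

    # chain: run steps one after another, piping each result into the next
    pie = chain(components, signature('seal_pie'), signature('bake_pie'))

    result = pie.delay()   # same as (components | seal_pie | bake_pie).delay()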
18. Tasks:
    from celery.app.task import Task
    from celery.utils.log import get_task_logger

    logger = get_task_logger(__name__)

    class GetIngredient(Task):
        name = 'get_ingredients'

        def run(self, ingredient_name):
            logger.info('Getting {}'.format(ingredient_name))
            # ... code that obtains the ingredient ...
            return ingredient
19. Workflow:
    from celery.app.task import Task
    from celery import canvas, signature

    class OrderPie(Task):
        name = 'order_pie'

        def run(self):
            get_meat = signature('get_ingredients', args=('meat',))
            ...
            dough_chord = canvas.chord([get_eggs, get_milk, get_flour], create_dough)
            components_group = canvas.group(dough_chord, get_meat)
            return (components_group | seal_pie | bake_pie).delay()
20. Client side:
    >>> from tasks.order_pie import OrderPie
    >>> r = OrderPie().delay()
    >>> r.id
    '58c5bb47-8fb3-4d4a-b2f5-8899520b5179'
    >>> r.ready()
    True
    >>> r.get()
    [['091f9ae7-561a-4781-a4d9-47bbcb121360', [['0865f66b-b89d-4ff3-a272-1f6b01d0a11f', None], None]], None]
    >>> r.backend.get_task_meta(r.get()[0][0])
    {'task_id': '091f9ae7-561a-4781-a4d9-47bbcb121360', 'children': [], 'traceback': None,
     'result': 'baked pie with meat', 'status': 'SUCCESS'}
21. Routing:
    from kombu import Exchange, Queue

    class Router:
        def route_for_task(self, task, *args, **kwargs):
            route = {'exchange': Settings.EXCHANGE.name,
                     'routing_key': 'common'}
            if task in {'bake_pie', 'get_ingredients', 'seal_pie', 'order_pie'}:
                route['routing_key'] = 'fabric'
            elif task == 'create_dough':
                route = {'exchange': 'dough', 'routing_key': 'dough'}
            return route
22. Routing:
    class Settings:
        EXCHANGE = Exchange('pie_fabric', type='direct')
        CELERY_ROUTES = (Router(),)
        CELERY_QUEUES = (
            Queue('pie_fabric.common', EXCHANGE, routing_key='common'),
            Queue('pie_fabric.fabric', EXCHANGE, routing_key='fabric'),
        )

    --- ***** ----- [queues]
     -------------- .> pie_fabric.common exchange=pie_fabric(direct) key=common
                    .> pie_fabric.fabric exchange=pie_fabric(direct) key=fabric
23. Second application:
    class Settings:
        BROKER_URL = 'amqp://guest:guest@localhost:5672//'
        CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/0'
        EXCHANGE = Exchange('dough', type='direct')
        CELERY_QUEUES = (
            Queue('dough', EXCHANGE, routing_key='dough'),
        )

    app = Celery('dough')
    app.config_from_object(Settings)
    app.tasks.register(CreateDough())

    $ celery -A dough worker -P gevent --concurrency=1000 -l INFO
    - ** ---------- [config]
    - ** ---------- .> app:         dough:0x7f850ee22748
    - ** ---------- .> transport:   amqp://guest:**@localhost:5672//
    - ** ---------- .> results:     redis://127.0.0.1:6379/0
    - *** --- * --- .> concurrency: 1000 (gevent)
    -- ******* ----
    --- ***** ----- [queues]
     -------------- .> dough exchange=dough(direct) key=dough

    [tasks]
      . create_dough
24. Gevent. gevent is a coroutine-based Python networking library that uses greenlet to provide a high-level synchronous API on top of the libev event loop. gevent.monkey makes the standard library cooperative:
    def _patch_gevent():
        from gevent import monkey, signal as gsignal, version_info
        monkey.patch_all()
25. Polling:
    def run(self, *ingredients):
        logger.info('Ingredients: {}'.format(ingredients))
        id = create_dough(ingredients)   # send the request to prepare the dough
        while True:
            time.sleep(polling_timeout)
            if ready(id):                # check whether the dough is ready
                dough = get_dough(id)    # if it is, fetch it
                return dough
26. Calling tasks:
    apply: execute this task locally, blocking until the task returns.
    apply_async: apply the task asynchronously by sending a message.
    delay: shortcut to send a task message, but does not support execution options.
    retry: retry the task.
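To make the difference concrete, the bake_pie task from the earlier slides might be invoked in each of these ways; the 'meat' argument and the queue name are illustrative, not taken from the talk:

    from tasks.bake_pie import BakePie

    task = BakePie()

    task.apply(args=('meat',))       # runs locally in the current process and blocks until done
    r = task.delay('meat')           # sends a task message; arguments only, no execution options
    r = task.apply_async(args=('meat',), queue='pie_fabric.fabric')  # message plus execution options
    print(r.get())                   # wait for the result via the result backend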
27. Options:
    Linking (callbacks/errbacks): link, link_error
    ETA and countdown: countdown, eta
    Expiration: expires
    Retry: retry=True, retry_policy={'max_retries': 3, 'interval_start': 0, 'interval_step': 0.2, 'interval_max': 0.2}
    Serializers, compression, routing options
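A hedged sketch that puts several of these options together in one apply_async call, reusing the pie_fabric app and the seal_pie/order_error signatures from the Callbacks slide; the 'meat' argument is illustrative:

    bake = app.signature('bake_pie', args=('meat',))

    result = bake.apply_async(
        countdown=10,                              # start no earlier than 10 seconds from now
        expires=600,                               # discard the task if not started within 10 minutes
        link=app.signature('seal_pie'),            # callback on success
        link_error=app.signature('order_error'),   # errback on failure
        retry=True,                                # retry publishing if the broker connection drops
        retry_policy={'max_retries': 3, 'interval_start': 0,
                      'interval_step': 0.2, 'interval_max': 0.2},
    )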
28. Callbacks:
    options = {
        'link': app.signature('seal_pie'),
        'link_error': app.signature('order_error')
    }
    for sub_task in options.values():
        sub_task.set(
            **app.amqp.router.route(sub_task.options, sub_task.task,
                                    sub_task.args, sub_task.kwargs)
        )
29. Periodic tasks:
    from celery.schedules import crontab

    CELERYBEAT_SCHEDULE = {
        'check-every-minute': {
            'task': 'check_ingredients',
            'schedule': crontab(),
        },
    }

    $ celery -A pie_fabric beat
30. Concurrency: prefork (multiprocessing), eventlet/gevent, threads/single threaded.
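The pool is selected per worker on the command line; a sketch of the usual forms (the -P gevent variant already appears on the "Second application" slide):

    $ celery -A pie_fabric worker -P prefork -c 4      # one OS process per CPU core (the default pool)
    $ celery -A pie_fabric worker -P eventlet -c 500   # eventlet greenlets, suits I/O-bound tasks
    $ celery -A pie_fabric worker -P gevent -c 1000    # gevent greenlets, as used for the dough worker
    $ celery -A pie_fabric worker -P solo              # single-threaded, tasks run inside the worker process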
31. Signals: task signals, app signals, worker signals, beat signals, eventlet signals, logging signals, command signals, deprecated signals.
    Worker signals: celeryd_after_setup, celeryd_init, worker_init, worker_ready, worker_process_init, worker_process_shutdown, worker_shutdown
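A minimal, hedged sketch of attaching handlers to two of the signals listed above; the handler bodies are illustrative:

    from celery.signals import worker_ready, task_failure

    @worker_ready.connect
    def announce(sender=None, **kwargs):
        # fires once the worker has fully started and is ready to accept tasks
        print('worker is ready to bake pies')

    @task_failure.connect
    def log_failure(sender=None, task_id=None, exception=None, **kwargs):
        # fires when any task raises an unhandled exception
        print('task {} failed: {!r}'.format(task_id, exception))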
32. Links: https://www.rabbitmq.com, http://docs.celeryproject.org
33. Thank you for your attention!
34. Questions?
