Crons

Learn how to set up Sentry Crons for Celery

Sentry Crons allows you to monitor the uptime and performance of any scheduled, recurring job. Once implemented, you'll get alerts and metrics to help you solve errors, detect timeouts, and prevent disruptions to your service.

Use the Celery integration to monitor your Celery periodic tasks and get notified when a task is missed (or doesn't start when expected), if it fails due to a problem in the runtime (such as an error), or if it fails by exceeding its maximum runtime.

Please note that a cron monitor will only be created the first time your task runs.

Get started by setting up your Celery beat schedule:

tasks.py
# tasks.py
from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', broker='...')
app.conf.beat_schedule = {
    'set-in-beat-schedule': {
        'task': 'tasks.tell_the_world',
        'schedule': crontab(hour='10', minute='15'),
        'args': ("Some important message!", ),
    },
}

Please note that only schedules that can be parsed by crontab will be successfully updated or inserted.

Next, initialize Sentry. Where to do this depends on how you run beat:

  • If beat is running in your worker process (that is, you're running your worker with the -B/--beat option), initialize Sentry in either the celeryd_init or beat_init signal.
  • If beat is running in a separate process, you need to initialize Sentry in both the celeryd_init and beat_init signals.

Make sure to also set monitor_beat_tasks=True in CeleryIntegration.

In addition to capturing errors, you can monitor interactions between multiple services or applications by enabling tracing. You can also collect and analyze performance profiles from real users with profiling.

tasks.py
# tasks.py
from celery import signals

import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

@signals.beat_init.connect   # omit this line if you're running beat directly within your worker process
@signals.celeryd_init.connect
def init_sentry(**kwargs):
    sentry_sdk.init(
        dsn='https://examplePublicKey@o0.ingest.sentry.io/0',
        # Set traces_sample_rate to 1.0 to capture 100%
        # of transactions for tracing.
        traces_sample_rate=1.0,
        # Set profiles_sample_rate to 1.0 to profile 100%
        # of sampled transactions.
        # We recommend adjusting this value in production.
        profiles_sample_rate=1.0,
        integrations=[
            CeleryIntegration(
                monitor_beat_tasks=True
            )
        ],
        environment="local.dev.grace",
        release="v1.0",
    )

Once Sentry Crons is set up, tasks in your Celery beat schedule will be auto-discoverable, and telemetry data will be captured when a task is started, when it finishes, and when it fails.

Start your Celery beat and worker services and see your tasks being monitored at https://sentry.io/crons/.
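For example, assuming your Celery app is defined in tasks.py as in the snippets above, you could start both services with the standard Celery CLI (adjust the module path and options to your project):

celery -A tasks worker --loglevel=INFO
celery -A tasks beat --loglevel=INFO

If you run beat inside the worker process instead, pass -B/--beat to the worker command and skip the separate beat process.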

In addition to the default Celery Beat scheduler, we also support RedBeat.
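No Sentry-specific changes are needed for RedBeat beyond the setup above. The following is a minimal sketch, assuming the celery-redbeat package is installed and a Redis instance is available at the URLs shown, of how the schedule from earlier might be pointed at RedBeat's scheduler:

tasks.py
# tasks.py (sketch; assumes `pip install celery-redbeat` and a running Redis server)
from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', broker='redis://localhost:6379/0')

# Use RedBeat's scheduler instead of the default Celery Beat scheduler
# and tell it where to persist its schedule state.
app.conf.beat_scheduler = 'redbeat.RedBeatScheduler'
app.conf.redbeat_redis_url = 'redis://localhost:6379/1'

app.conf.beat_schedule = {
    'set-in-beat-schedule': {
        'task': 'tasks.tell_the_world',
        'schedule': crontab(hour='10', minute='15'),
        'args': ("Some important message!", ),
    },
}

With monitor_beat_tasks=True set as shown earlier, tasks scheduled through RedBeat should be picked up the same way.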

You don't need to create Cron Monitors for your tasks on Sentry.io; we'll do it for you.

You can exclude Celery Beat tasks from being auto-instrumented. To do this, add a list of tasks you want to exclude as the exclude_beat_tasks option when creating CeleryIntegration. The list can contain simple strings with the full task name, as specified in the Celery Beat schedule, or regular expressions to match multiple tasks.

    sentry_sdk.init(
        # ...
        integrations=[
            CeleryIntegration(
                monitor_beat_tasks=True,
                exclude_beat_tasks=[
                    "some-task-a",
                    "payment-check-.*",
                ]
            ),
        ],
    )

In this example, the task some-task-a and all tasks whose names start with payment-check- will be ignored.

For more information, see the documentation for options on CeleryIntegration.

We provide a lightweight decorator to make monitoring individual tasks easier. To use it, add @sentry_sdk.monitor to your Celery task (or any function) and supply the monitor_slug of a monitor you created previously on Sentry.io. Once this is done, telemetry data will be captured every time the task (or function) starts, finishes, or fails.

Make sure the Sentry @sentry_sdk.monitor decorator is below Celery's @app.task decorator.

tasks.py
# tasks.py
from celery import Celery, signals

import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

app = Celery('tasks', broker='...')

@signals.celeryd_init.connect
def init_sentry(**kwargs):
    sentry_sdk.init(
        # same as above
    )

@app.task
@sentry_sdk.monitor(monitor_slug='<monitor-slug>')
def tell_the_world(msg):
    print(msg)