A broker-agnostic implementation of the outbox and other message resilience patterns for Django apps.
To use jaiminho with your project, you just need to follow 6 steps:

1. Install the package:

```shell
python -m pip install jaiminho
```

2. Add `jaiminho` to your `INSTALLED_APPS`:

```python
INSTALLED_APPS = [
    ...
    "jaiminho",
]
```

3. Run the migrations:

```shell
python manage.py migrate
```

4. Configure `JAIMINHO_CONFIG` in your settings:

```python
from django.core.serializers.json import DjangoJSONEncoder

JAIMINHO_CONFIG = {
    "PERSIST_ALL_EVENTS": False,
    "DELETE_AFTER_SEND": True,
    "DEFAULT_ENCODER": DjangoJSONEncoder,
    "PUBLISH_STRATEGY": "publish-on-commit",
}
```

5. Decorate the function you want to make resilient:

```python
from jaiminho.send import save_to_outbox

@save_to_outbox
def any_external_call(**kwargs):
    # do something
    return
```

6. Run the relay command:

```shell
python manage.py events_relay --run-in-loop --loop-interval 1
```
If you don't use the `--run-in-loop` option, the relay command will run only once. This is useful in case you want to configure it as a cronjob.
Jaiminho's `@save_to_outbox` decorator intercepts the decorated function and persists the call in a database table, within the same transaction that is active in the decorated function's context. The event relay command is a separate process that fetches the rows from this table and executes the functions. When an outage happens, the event relay command keeps retrying until it succeeds. This way, eventual consistency is ensured by design.
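The mechanism above can be sketched in plain Python, with no Django or broker involved. The names `outbox`, `save_to_outbox_sketch`, and `relay` below are illustrative stand-ins for the database table, the decorator, and the relay command, not Jaiminho internals:

```python
import functools

outbox = []  # stands in for the outbox database table

def save_to_outbox_sketch(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Instead of executing now, persist the call; in Jaiminho this
        # row is written inside the caller's active transaction.
        outbox.append((func, args, kwargs))
    return wrapper

@save_to_outbox_sketch
def publish(message):
    return f"sent: {message}"

def relay():
    # In real life this is a separate process: it fetches the stored
    # calls and executes them, retrying on failure.
    results = []
    while outbox:
        func, args, kwargs = outbox.pop(0)
        results.append(func(*args, **kwargs))
    return results

publish("hello")
publish("world")
print(relay())  # ['sent: hello', 'sent: world']
```

The key property is that the caller never talks to the broker directly; it only writes a row, so a broker outage cannot fail the business transaction.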
- `PUBLISH_STRATEGY` - Strategy used to publish events (`publish-on-commit`, `keep-order`)
- `PERSIST_ALL_EVENTS` - Saves all events, not only the ones that fail; default is `False`. Only applicable for `{ "PUBLISH_STRATEGY": "publish-on-commit" }`, since all events need to be stored with the `keep-order` strategy.
- `DELETE_AFTER_SEND` - Deletes the event from the outbox table immediately after a successful send
- `DEFAULT_ENCODER` - Default encoder for the payload (overridable in the function call)
This strategy is similar to the transactional outbox described by Chris Richardson. The decorator intercepts the function call and saves it in the local DB to be executed later. A separate relayer command keeps polling the local DB and executing those functions in the same order they were stored. Be careful with this approach: if any execution fails, the relayer will get stuck, since otherwise it would not be possible to guarantee delivery order.
This strategy always executes the decorated function after the current transaction commits. With this approach, we don't depend on a relayer (separate process / cronjob) to execute the decorated function and deliver the message; only failed items are retried through the relayer. Although this solution has better performance, since only failed items are delivered by the relay command, we cannot guarantee delivery order.
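The commit-then-publish behaviour can be illustrated with a minimal sketch. `FakeTransaction` below is a hypothetical stand-in for a database transaction with an `on_commit` hook (mirroring Django's `transaction.on_commit`, which this strategy relies on); it is not Jaiminho code:

```python
class FakeTransaction:
    """Toy transaction: callbacks registered during the transaction
    only run once it commits."""

    def __init__(self):
        self.callbacks = []
        self.log = []

    def on_commit(self, callback):
        # Defer the side effect until the transaction succeeds.
        self.callbacks.append(callback)

    def commit(self):
        self.log.append("committed")
        for cb in self.callbacks:
            try:
                cb()
            except Exception:
                # In Jaiminho, a failed send stays in the outbox table
                # and is retried later by the relay command.
                self.log.append("failed -> left in outbox for relay")

tx = FakeTransaction()
tx.on_commit(lambda: tx.log.append("event published"))
tx.commit()
print(tx.log)  # ['committed', 'event published']
```

If the transaction never commits, the callback never runs, so no message is published for rolled-back work; if the publish itself fails, the event remains stored for the relayer.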
We already provide a command to relay items from the DB, `EventRelayCommand`. How you should configure it depends on the strategy you choose.

For example, with the Publish on Commit strategy you can configure a cronjob to run every couple of minutes, since only failed items are published by the relay command. If you are using the Keep Order strategy, you should run the relay command in loop mode, as all items will be published by the command, e.g. `call_command(events_relay.Command(), run_in_loop=True, loop_interval=0.1)`.
You can use Jaiminho's `EventCleanerCommand` in order to do that. It will query for all events that were sent before a given time interval (e.g. the last 7 days) and delete them from the outbox table.

The default time interval is 7 days. You can use the `TIME_TO_DELETE` setting to change it; it should be added to `JAIMINHO_CONFIG` and must be a valid `timedelta`.
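For example, to keep sent events for 30 days instead of the default 7 (the other keys of your `JAIMINHO_CONFIG` stay as before):

```python
from datetime import timedelta

JAIMINHO_CONFIG = {
    # ...your existing Jaiminho settings...
    "TIME_TO_DELETE": timedelta(days=30),  # delete sent events older than 30 days
}
```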
You can run those commands in a cron job. Here are some config examples:
```yaml
- name: relay-failed-outbox-events
  schedule: "*/15 * * * *"
  suspend: false
  args:
    - ddtrace-run
    - python
    - manage.py
    - events_relay
  resources:
    requests:
      cpu: 1
    limits:
      memory: 384Mi

- name: delete-old-outbox-events
  schedule: "0 5 * * *"
  suspend: false
  args:
    - ddtrace-run
    - python
    - manage.py
    - event_cleaner
  resources:
    requests:
      cpu: 1
    limits:
      memory: 384Mi
```
Different streams can have different requirements. You can save separate events per stream by using the `@save_to_outbox_stream` decorator:

```python
@save_to_outbox_stream("my-stream")
def any_external_call(payload, **kwargs):
    # do something
    pass
```
You can also override the publish strategy configured in settings:

```python
@save_to_outbox_stream("my-stream", PublishStrategyType.KEEP_ORDER)
def any_external_call(payload, **kwargs):
    # do something
    pass
```
And then, run the relay command with the stream filter option:

```shell
python manage.py relay_event True 0.1 my-stream
```

In the example above, `True` is the value for `run_in_loop`, `0.1` for `loop_interval`, and `my-stream` is the name of the stream.
Jaiminho triggers the following Django signals:
| Signal | Description |
|---|---|
| `event_published` | Triggered when an event is sent successfully |
| `event_failed_to_publish` | Triggered when an event is not sent and is added to the outbox table queue |
You can use the Django signals triggered by Jaiminho to collect metrics. Consider the following code as an example (`metrics` here stands for whatever metrics client your project uses):

```python
from django.dispatch import receiver

@receiver(event_published)
def on_event_sent(sender, event_payload, **kwargs):
    metrics.count(f"event_sent_successfully {event_payload.get('type')}")

@receiver(event_failed_to_publish)
def on_event_send_error(sender, event_payload, **kwargs):
    metrics.count(f"event_failed {event_payload.get('type')}")
```
Jaiminho can be very useful for adding reliability to Celery workflows. Writing to the database and enqueuing Celery tasks in the same workflow is very common in many applications, and this pattern can benefit greatly from the outbox pattern to ensure message delivery reliability.
Instead of configuring the `@save_to_outbox` decorator for every individual Celery task, you can integrate it at the Celery class level by overriding the `send_task` method, which Celery uses to enqueue new tasks. This way, all tasks automatically benefit from the outbox pattern without requiring individual configuration.
Here's how to implement this:
```python
from celery import Celery

from jaiminho import save_to_outbox


class CeleryWithJaiminho(Celery):
    """
    Custom Celery class that inherits from the Celery base class
    and adds Jaiminho functionality
    """

    @save_to_outbox
    def send_task(self, *args, **kwargs):
        """Send task with outbox pattern for reliability"""
        return super().send_task(*args, **kwargs)


app = CeleryWithJaiminho("tasks")
```
With this approach, all tasks sent through your Celery app will automatically use the outbox pattern, ensuring that task enqueuing is resilient to transient failures and network issues.
Create a virtualenv, install the development dependencies, and run the tests:

```shell
virtualenv venv
pip install -r requirements-dev.txt
tox -e py39
```
If you want to contribute or suggest improvements, check our CONTRIBUTING.md file.
This project is licensed under MIT License.
If you have any security concern or report, feel free to reach out to [email protected].