pyncette 0.10.1


A reliable distributed scheduler with pluggable storage backends for Async Python.

Free software: MIT license


Installation
Minimal installation (just SQLite persistence):
pip install pyncette
Full installation (all the backends and Prometheus metrics exporter):
pip install pyncette[all]
You can also install the in-development version with:
pip install https://github.com/tibordp/pyncette/archive/master.zip


Documentation
https://pyncette.readthedocs.io


Usage example
Simple in-memory scheduler (does not persist state)
from pyncette import Pyncette, Context

app = Pyncette()

@app.task(schedule='* * * * *')
async def foo(context: Context):
    print('This will run every minute')

if __name__ == '__main__':
    app.main()
Persistent distributed cron using Redis (coordinates execution with parallel instances and survives restarts)
from pyncette import Pyncette, Context
from pyncette.redis import redis_repository

app = Pyncette(repository_factory=redis_repository, redis_url='redis://localhost')

@app.task(schedule='* * * * * */10')
async def foo(context: Context):
    print('This will run every 10 seconds')

if __name__ == '__main__':
    app.main()
See the examples directory for more examples of usage.


Use cases
Pyncette is designed for reliable (at-least-once or at-most-once) execution of recurring tasks (think cron jobs) whose
lifecycles are managed dynamically, but it can work effectively for non-recurring tasks too.
Example use cases:

You want to perform a database backup every day at noon
You want a report to be generated daily for your 10M users at the time of their choosing
You want currency conversion rates to be refreshed every 10 seconds
You want to allow your users to schedule non-recurring emails to be sent at an arbitrary time in the future

Pyncette might not be a good fit if:

You want tasks to run once, as soon as possible. It is doable, but you will be better served by a general-purpose reliable queue like RabbitMQ or Amazon SQS.
You need tasks to execute at sub-second intervals with low jitter. Pyncette coordinates execution on a per-task-instance basis, and this coordination can add overhead and jitter.
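The difference between the two delivery guarantees above can be illustrated with a toy sketch (illustrative only; the names and the in-memory "store" are not part of the Pyncette API): at-most-once commits the tick before running the task body, while at-least-once commits it only after the body succeeds, so a crash mid-task means a skip in the former and a retry in the latter.

```python
# Toy illustration of at-most-once vs. at-least-once execution.
# (Illustrative only -- this is not the Pyncette API.)

def run_at_most_once(task, store):
    # Commit the tick *before* executing: if the task crashes,
    # it will not be retried on the next poll.
    store["next_run_committed"] = True
    task()

def run_at_least_once(task, store):
    # Execute first, commit only on success: a crash mid-task
    # leaves the tick uncommitted, so it will be retried.
    task()
    store["next_run_committed"] = True

def flaky_task():
    raise RuntimeError("crashed mid-task")

# At-most-once: the tick is consumed even though the task failed.
at_most_once_store = {"next_run_committed": False}
try:
    run_at_most_once(flaky_task, at_most_once_store)
except RuntimeError:
    pass
print(at_most_once_store["next_run_committed"])  # True -> the failed run is not retried

# At-least-once: the tick survives the failure and will be retried.
at_least_once_store = {"next_run_committed": False}
try:
    run_at_least_once(flaky_task, at_least_once_store)
except RuntimeError:
    pass
print(at_least_once_store["next_run_committed"])  # False -> the run will be retried
```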



Supported backends
Pyncette comes with an implementation for the following backends (used for persistence and coordination) out-of-the-box:

SQLite (included)
Redis (pip install pyncette[redis])
PostgreSQL (pip install pyncette[postgres])
MySQL 8.0+ (pip install pyncette[mysql])
Amazon DynamoDB (pip install pyncette[dynamodb])

Pyncette imposes few requirements on the underlying datastores, so it can be extended to support other databases or
custom storage formats / integrations with existing systems. For best results, the backend needs to provide:

Some sort of serialization mechanism, e.g. traditional transactions, atomic stored procedures or compare-and-swap
Efficient range queries over a secondary index, which can be eventually consistent
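As a sketch of the first requirement, a compare-and-swap primitive is enough for multiple pollers to agree on who executes a given task instance: each poller observes the task's version and only the one whose swap succeeds takes the lease. The class and function names below are hypothetical, and the plain dict stands in for a real datastore; this is not one of the bundled backends.

```python
import threading

class CasStore:
    """Minimal compare-and-swap 'datastore' (illustrative, not a Pyncette backend)."""

    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}  # task_id -> (version, lease_owner)

    def read(self, task_id):
        with self._lock:
            return self._data.get(task_id, (0, None))

    def compare_and_swap(self, task_id, expected_version, lease_owner):
        # Atomically take the lease iff the version is unchanged.
        with self._lock:
            version, _ = self._data.get(task_id, (0, None))
            if version != expected_version:
                return False  # someone else won the race
            self._data[task_id] = (version + 1, lease_owner)
            return True

store = CasStore()
# All three workers observe the same version before racing for the lease,
# so at most one compare-and-swap can succeed.
version, _ = store.read("task:backup")
winners = [w for w in ("worker-1", "worker-2", "worker-3")
           if store.compare_and_swap("task:backup", version, w)]
print(winners)  # only one poller acquires the lease
```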



Development
To run integration tests you will need Redis, PostgreSQL, MySQL and Localstack (for DynamoDB) running locally.
To run all the tests, run:
tox
Alternatively, there is a Docker Compose environment that will set up all the backends so that integration tests can run seamlessly:
docker-compose up -d
docker-compose run --rm shell
tox
To run just the unit tests (excluding integration tests):
tox -e py310 # or your Python version of choice
Note: to combine the coverage data from all the tox environments, run:

Windows
set PYTEST_ADDOPTS=--cov-append
tox

Other
PYTEST_ADDOPTS=--cov-append tox






Changelog

0.10.1 (2023-05-09)

Include missing lua files in the built wheel



0.10.0 (2023-05-08)

Drop support for Python 3.7
Add support for Python 3.11
Modernize Python package structure and linters
Fix a few bugs and type annotations



0.8.1 (2021-04-08)

Improve performance for calculation of the next execution time
Add ability for repositories to pass a pagination token
Add add_to_context() to inject static data to context
Clean up documentation and add additional examples



0.8.0 (2021-04-05)

Added Amazon DynamoDB backend
Added MySQL backend
Added support for partitioned dynamic tasks



0.7.0 (2021-03-31)

Added support for automatic and cooperative lease heartbeating
PostgreSQL backend can now skip automatic table creation
Improved signal handling
CI: Add Codecov integration
Devenv: Run integration tests in Docker Compose



0.6.1 (2020-04-02)

Optimize the task querying on Postgres backend
Fix: ensure that there are no name collisions between concrete instances of different dynamic tasks
Improve fairness of polling tasks under high contention.



0.6.0 (2020-03-31)

Added PostgreSQL backend
Added Sqlite backend and made it the default (replacing InMemoryRepository)
Refactored test suite to cover all conformance/integration tests on all backends
Refactored Redis backend, simplifying the Lua scripts and improving exceptional case handling (e.g. tasks disappearing between query and poll)
Main loop only sleeps for the rest of remaining poll_interval before next tick instead of the full amount
General bug fixes, documentation changes, clean up



0.5.0 (2020-03-27)

Fixed a bug where a locked dynamic task could be executed again on the next tick
poll_task is now reentrant with regard to locking. If the lease passed in matches the lease on the task, it behaves as though it were unlocked.



0.4.0 (2020-02-16)

Middleware support and optional metrics via Prometheus
Improved the graceful shutdown behavior
Task instance and application context are now available in the task context
Breaking change: dynamic task parameters are now accessed via context.args['name'] instead of context.name
Improved examples, documentation and packaging



0.2.0 (2020-01-08)

Timezone support
More efficient polling when Redis backend is used



0.1.1 (2020-01-08)

First release that actually works.



0.0.0 (2019-12-31)

First release on PyPI.
