
Litestar SAQ

Installation

pip install litestar-saq

Usage

Here is a basic application that demonstrates how to use the plugin.

from __future__ import annotations

from litestar import Litestar

from litestar_saq import QueueConfig, SAQConfig, SAQPlugin

saq = SAQPlugin(config=SAQConfig(dsn="redis://localhost:6397/0", queue_configs=[QueueConfig(name="samples")]))
app = Litestar(plugins=[saq])
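
The worker needs at least one task to execute. In SAQ, a task is an async callable whose first argument is a context dict. Below is a minimal sketch of registering one, assuming QueueConfig accepts a tasks parameter for this purpose; the task name and body are illustrative only.

from __future__ import annotations

from typing import Any

from litestar import Litestar

from litestar_saq import QueueConfig, SAQConfig, SAQPlugin


async def system_upkeep(ctx: dict[str, Any]) -> None:
    # Illustrative task body; SAQ passes the worker context dict as the first argument
    ...


saq = SAQPlugin(
    config=SAQConfig(
        dsn="redis://localhost:6397/0",
        queue_configs=[QueueConfig(name="samples", tasks=[system_upkeep])],
    )
)
app = Litestar(plugins=[saq])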

You can now start a background worker with the following command:

litestar --app-dir=examples/ --app basic:app workers run
Using Litestar app from env: 'basic:app'
Starting SAQ Workers ──────────────────────────────────────────────────────────────────
INFO - 2023-10-04 17:39:03,255 - saq - worker - Worker starting: Queue<redis=Redis<ConnectionPool<Connection<host=localhost,port=6397,db=0>>>, name='samples'>
INFO - 2023-10-04 17:39:06,545 - saq - worker - Worker shutting down

You can also start the worker process for specific queues only. This is helpful if you want separate processes handling different queues instead of a single combined worker.

litestar --app-dir=examples/ --app basic:app workers run --queues samples
Using Litestar app from env: 'basic:app'
Starting SAQ Workers ──────────────────────────────────────────────────────────────────
INFO - 2023-10-04 17:39:03,255 - saq - worker - Worker starting: Queue<redis=Redis<ConnectionPool<Connection<host=localhost,port=6397,db=0>>>, name='samples'>
INFO - 2023-10-04 17:39:06,545 - saq - worker - Worker shutting down

If you are running workers for specific queues only, but still need to read from or enqueue a task onto a queue that was not initialized in your worker (or lives in another process), you can access it directly:

import os

from saq import Queue


def get_queue_directly(queue_name: str, redis_url: str) -> Queue:
    return Queue.from_url(redis_url, name=queue_name)


redis_url = os.environ["REDIS_URL"]
queue = get_queue_directly("queue-in-other-process", redis_url)
# Get queue info (info is a coroutine, so this must run in an async context)
info = await queue.info(jobs=True)
# Enqueue a new task (enqueue is also a coroutine and must be awaited)
await queue.enqueue(...)
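
Because info and enqueue are coroutines, they have to run inside an event loop. A minimal sketch of doing this from a standalone script follows; the queue name and the "some_task" task name are illustrative, and the task must be registered with the worker that owns the queue.

import asyncio
import os

from saq import Queue


async def main() -> None:
    queue = Queue.from_url(os.environ["REDIS_URL"], name="queue-in-other-process")
    # Inspect the queue, including its pending jobs
    info = await queue.info(jobs=True)
    print(info)
    # Enqueue by task name; the owning worker will pick it up
    await queue.enqueue("some_task")
    await queue.disconnect()


asyncio.run(main())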