Copilot AI commented Jan 6, 2026

Removes Redis and RQ dependencies to enable single-process deployment without external queue infrastructure. Jobs and statistics now persist in SQLite with thread-based async execution.

Architecture

Before: FastAPI server → Redis (queue + timeseries) ← RQ workers
After: FastAPI server with embedded SQLite + ThreadPoolExecutor

Implementation

  • New modules:

    • asu/database.py - SQLAlchemy models (Job, BuildStats) with thread-safe connection pooling
    • asu/job_queue.py - RQ-compatible Queue/Job interface using ThreadPoolExecutor
  • Queue operations: Job lifecycle (queued → started → finished/failed) maintained via SQLite transactions, metadata stored as JSON columns

  • Configuration:

    • Removed: redis_url
    • Added: database_path (default: public/asu.db), worker_threads (default: 4)
  • Statistics: Simplified implementation using SQLAlchemy queries instead of Redis TimeSeries. Advanced aggregations deferred (marked TODO).

  • Dependencies: removed redis>=6.4.0 and rq>=2.6.0; added sqlalchemy>=2.0.0

Migration Notes

  • Jobs from existing Redis queue will not transfer (queue starts empty)
  • Stats history resets on migration
  • API responses unchanged, backward compatible
  • Podman still required for ImageBuilder container execution

Example

```python
# Job creation (identical interface)
from asu.job_queue import get_queue
from asu.build import build

job = get_queue().enqueue(
    build,
    build_request,
    job_id=request_hash,
    result_ttl="3h",
    failure_ttl="10m",
)

# Job queries work the same
job.is_finished  # Boolean
job.return_value()  # Result dict
job.meta  # Metadata dict
```

Original prompt

Objective

Replace the Redis + RQ (Python RQ) task queue system with SQLite database and Python threading to enable running ASU without requiring podman-compose setup. This will simplify deployment while maintaining asynchronous build processing capabilities.

Current Architecture

  • Task Queue: Redis + RQ for job management
  • Stats: Redis TimeSeries (optional, when SERVER_STATS is enabled)
  • Deployment: Requires podman-compose with separate server, worker, and Redis containers

Migration Requirements

1. Replace Redis/RQ with SQLite + Threading

Create new file: asu/database.py

  • Use SQLAlchemy ORM (not raw SQL) for all database operations
  • Define SQLAlchemy models for:
    • Job table: Store job metadata (id, status, meta as JSON, result as JSON, enqueued_at, started_at, finished_at, failure_ttl, result_ttl)
    • BuildStats table: Simple statistics tracking (event type, timestamp, metadata as JSON)
  • Include database initialization functions
  • Use SQLite with proper connection pooling and thread safety settings
  • Database file location: {settings.public_path}/asu.db

Create new file: asu/job_queue.py

  • Implement a thread-based job queue using concurrent.futures.ThreadPoolExecutor
  • Create a Job class that mimics RQ's Job interface with methods:
    • get_meta(), save_meta(), get_position(), is_queued, is_started, is_finished, is_failed
    • latest_result(), return_value()
    • Store job state in SQLite using SQLAlchemy
  • Create a Queue class that mimics RQ's Queue interface with methods:
    • enqueue(), fetch_job(), __len__()
  • Implement proper cleanup of expired jobs based on result_ttl and failure_ttl
  • Handle graceful shutdown of thread pool
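The thread-based queue could be sketched as below. This is a simplified illustration: a plain dict stands in for the SQLite persistence described above, and only part of the RQ-compatible surface is shown.

```python
from concurrent.futures import ThreadPoolExecutor


class Job:
    """RQ-style job handle; reads state from a shared store (dict here,
    the SQLite Job table in the real implementation)."""

    def __init__(self, job_id, store):
        self.id, self._store = job_id, store

    @property
    def _row(self):
        return self._store[self.id]

    @property
    def is_queued(self):
        return self._row["status"] == "queued"

    @property
    def is_started(self):
        return self._row["status"] == "started"

    @property
    def is_finished(self):
        return self._row["status"] == "finished"

    @property
    def is_failed(self):
        return self._row["status"] == "failed"

    def return_value(self):
        return self._row.get("result")


class Queue:
    """RQ-style queue backed by a ThreadPoolExecutor instead of workers."""

    def __init__(self, max_workers=4):
        self._pool = ThreadPoolExecutor(max_workers=max_workers)
        self._store = {}

    def enqueue(self, func, *args, job_id=None, **job_kwargs):
        # job_kwargs absorbs result_ttl / failure_ttl, handled by cleanup
        self._store[job_id] = {"status": "queued", "result": None, "meta": {}}

        def run():
            # queued -> started -> finished/failed lifecycle
            self._store[job_id]["status"] = "started"
            try:
                self._store[job_id]["result"] = func(*args)
                self._store[job_id]["status"] = "finished"
            except Exception:
                self._store[job_id]["status"] = "failed"

        self._pool.submit(run)
        return Job(job_id, self._store)

    def fetch_job(self, job_id):
        return Job(job_id, self._store) if job_id in self._store else None

    def __len__(self):
        return sum(1 for r in self._store.values() if r["status"] == "queued")
```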

2. Update Configuration

Modify asu/config.py

  • Remove redis_url setting
  • Add database_path: Path setting (default: public_path / "asu.db")
  • Add worker_threads: int setting (default: 4) for thread pool size
  • Keep async_queue for backward compatibility but make it always True
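A possible shape for the changed settings, assuming asu/config.py keeps its pydantic-settings base class; deriving `database_path` from `public_path` via a property is one way to express the dependent default:

```python
from pathlib import Path

from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    public_path: Path = Path.cwd() / "public"
    worker_threads: int = 4   # thread pool size, replaces separate RQ workers
    async_queue: bool = True  # kept for compatibility, effectively always True

    @property
    def database_path(self) -> Path:
        # default derived from public_path, i.e. public/asu.db
        return self.public_path / "asu.db"
```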

3. Update Utility Functions

Modify asu/util.py

  • Remove: get_redis_client(), get_redis_ts() functions
  • Replace get_queue() to return the new SQLAlchemy-based Queue instead of RQ Queue
  • Update add_timestamp() and add_build_event() to:
    • Store stats in SQLite BuildStats table using SQLAlchemy
    • Keep same function signatures for compatibility
    • Simplify implementation (no need for TimeSeries features initially)
  • Remove all import redis and from rq import Queue statements
  • Add SQLAlchemy imports

4. Update Build System

Modify asu/build.py

  • Remove from rq import get_current_job and from rq.utils import parse_timeout
  • Update _build() function to accept custom Job object instead of RQ job
  • Replace get_current_job() with passed job parameter
  • Implement timeout parsing locally (simple function to convert "10m" to seconds)
  • Keep all job.meta and job.save_meta() calls (new Job class will handle these)
  • Update logger name from "rq.worker" to "asu.worker"
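The local timeout parsing mentioned above could be a small helper like this hypothetical `parse_timeout`, mirroring the RQ behavior of accepting either seconds or suffixed strings:

```python
def parse_timeout(timeout):
    """Convert "10m"-style strings (or bare ints) to seconds."""
    if timeout is None or isinstance(timeout, int):
        return timeout
    units = {"s": 1, "m": 60, "h": 3600, "d": 86400}
    timeout = str(timeout).strip().lower()
    if timeout[-1] in units:
        return int(timeout[:-1]) * units[timeout[-1]]
    return int(timeout)  # plain number in a string
```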

Modify asu/routers/api.py

  • Remove from rq.job import Job import
  • Update to use new Job class from asu.job_queue
  • Keep all existing API behavior and response formats
  • Ensure job status checking (is_queued, is_started, is_finished, is_failed) works with new Job class

5. Update Dependencies

Modify pyproject.toml

  • Remove dependencies: redis>=6.4.0, rq>=2.6.0
  • Add dependency: sqlalchemy>=2.0.0
  • Remove from dev dependencies: fakeredis>=2.32.0

6. Update Main Application

Modify asu/main.py

  • Add startup event to initialize database and thread pool
  • Add shutdown event to gracefully stop thread pool and close database connections
  • Import and initialize the job queue system
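One way to wire this up is a lifespan context manager (the pattern FastAPI supports for startup/shutdown); `init_database` and `get_queue` here are stand-in stubs for the real helpers, so the flow can be shown self-contained:

```python
import asyncio
from contextlib import asynccontextmanager


def init_database():
    # stand-in: create tables, configure the SQLite engine
    print("db ready")


class _StubQueue:
    def shutdown(self, wait=True):
        print("queue stopped")


def get_queue():
    # stand-in: the ThreadPoolExecutor-backed queue from asu/job_queue.py
    return _StubQueue()


@asynccontextmanager
async def lifespan(app):
    init_database()
    queue = get_queue()
    yield  # application serves requests here
    queue.shutdown(wait=True)  # let running builds finish
    # the real code would also dispose the engine here


async def main():
    async with lifespan(None):
        print("serving")
```

In the real application this would be passed as `FastAPI(lifespan=lifespan)`.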

7. Stats Implementation Notes

  • Defer advanced stats: The initial implementation should focus on basic counters
  • Store simple event counts and timestamps in SQLite
  • Add TODO comments indicating that advanced TimeSeries-style stats can be implemented later
  • Keep the server_stats setting to enable/disable stats collection

8. Thread Safety Considerations

  • Ensure SQLite is configured with proper thread safety (check_same_thread=False for SQLAlchemy)
  • Use SQLAlchemy sessions properly with thread-local storage or scoped sessions
  • Implement proper locking for job queue operations
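These settings could be configured roughly as follows (assuming SQLAlchemy 2.x): `check_same_thread=False` lets the SQLite connection be used from worker threads, and `scoped_session` hands each thread its own session.

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker

engine = create_engine(
    "sqlite:///asu.db",  # settings.database_path in the real code
    connect_args={"check_same_thread": False},
)
SessionLocal = scoped_session(sessionmaker(bind=engine))

session_a = SessionLocal()
session_b = SessionLocal()
same = session_a is session_b  # same thread -> same scoped session
SessionLocal.remove()
```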

Important Notes

  • Use SQLAlchemy ORM throughout - do not write raw SQL queries
  • Maintain backward compatibility with existing API responses
  • Keep the same job status workflow (queued → started → finished/failed)
  • Preserve all existing build functionality
  • The migration should allow running ASU as a single process without podman-compose
  • Jobs should persist across application restarts (stored in SQLite)
  • Implement proper error handling and logging

Files to Create

  1. asu/database.py - SQLAlchemy models and database utilities
  2. asu/job_queue.py ...

This pull request was created from Copilot chat.



Copilot AI and others added 4 commits January 6, 2026 05:23

Co-authored-by: aparcar <16000931+aparcar@users.noreply.github.com>

Copilot AI changed the title [WIP] Replace Redis and RQ with SQLite and Python threading Replace Redis + RQ with SQLite + threading for simplified deployment Jan 6, 2026