Replace Redis + RQ with SQLite + threading for simplified deployment #5
Removes Redis and RQ dependencies to enable single-process deployment without external queue infrastructure. Jobs and statistics now persist in SQLite with thread-based async execution.
Architecture
Before: FastAPI server → Redis (queue + timeseries) ← RQ workers
After: FastAPI server with embedded SQLite + ThreadPoolExecutor
Implementation
New modules:
- `asu/database.py` - SQLAlchemy models (Job, BuildStats) with thread-safe connection pooling
- `asu/job_queue.py` - RQ-compatible Queue/Job interface using ThreadPoolExecutor
- Queue operations: job lifecycle (queued → started → finished/failed) maintained via SQLite transactions, metadata stored as JSON columns
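The lifecycle bookkeeping can be sketched with plain SQLite transactions, one per state transition. This uses the stdlib `sqlite3` module for brevity (the modules above use SQLAlchemy), and the table and helper names are hypothetical:

```python
import json
import sqlite3
import uuid
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE jobs (
           id          TEXT PRIMARY KEY,
           status      TEXT NOT NULL,
           meta        TEXT NOT NULL DEFAULT '{}',  -- JSON column
           enqueued_at TEXT,
           started_at  TEXT,
           finished_at TEXT
       )"""
)

def now() -> str:
    return datetime.now(timezone.utc).isoformat()

def enqueue_job(meta: dict) -> str:
    job_id = uuid.uuid4().hex
    with conn:  # each lifecycle transition is its own transaction
        conn.execute(
            "INSERT INTO jobs (id, status, meta, enqueued_at)"
            " VALUES (?, 'queued', ?, ?)",
            (job_id, json.dumps(meta), now()),
        )
    return job_id

def mark_started(job_id: str) -> None:
    with conn:
        conn.execute(
            "UPDATE jobs SET status = 'started', started_at = ? WHERE id = ?",
            (now(), job_id),
        )

def mark_finished(job_id: str, failed: bool = False) -> None:
    with conn:
        conn.execute(
            "UPDATE jobs SET status = ?, finished_at = ? WHERE id = ?",
            ("failed" if failed else "finished", now(), job_id),
        )

job_id = enqueue_job({"target": "ath79/generic"})
mark_started(job_id)
mark_finished(job_id)
status = conn.execute(
    "SELECT status FROM jobs WHERE id = ?", (job_id,)
).fetchone()[0]
print(status)  # finished
```

Because every transition commits atomically, a crashed worker leaves the job in its last committed state rather than in limbo.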
Configuration:
- `redis_url` replaced by `database_path` (default: `public/asu.db`) and `worker_threads` (default: 4)
- Statistics: simplified implementation using SQLAlchemy queries instead of Redis TimeSeries; advanced aggregations deferred (marked TODO)
Dependencies:
`redis>=6.4.0` and `rq>=2.6.0` → `sqlalchemy>=2.0.0`
Migration Notes
Original prompt
Objective
Replace the Redis + RQ (Python RQ) task queue system with a SQLite database and Python threading, so ASU can run without requiring a podman-compose setup. This simplifies deployment while maintaining asynchronous build processing.
Current Architecture
(when `SERVER_STATS` is enabled)
Migration Requirements
1. Replace Redis/RQ with SQLite + Threading
Create new file `asu/database.py`:
- `Job` table: store job metadata (id, status, meta as JSON, result as JSON, enqueued_at, started_at, finished_at, failure_ttl, result_ttl)
- `BuildStats` table: simple statistics tracking (event type, timestamp, metadata as JSON)
- Database file at `{settings.public_path}/asu.db`

Create new file `asu/job_queue.py`, backed by `concurrent.futures.ThreadPoolExecutor`:
- `Job` class that mimics RQ's Job interface with methods/properties: `get_meta()`, `save_meta()`, `get_position()`, `is_queued`, `is_started`, `is_finished`, `is_failed`, `latest_result()`, `return_value()`
- `Queue` class that mimics RQ's Queue interface with methods: `enqueue()`, `fetch_job()`, `__len__()`
- Support `result_ttl` and `failure_ttl`

2. Update Configuration
Modify `asu/config.py`:
- Remove the `redis_url` setting
- Add a `database_path: Path` setting (default: `public_path / "asu.db"`)
- Add a `worker_threads: int` setting (default: 4) for thread pool size
- Keep `async_queue` for backward compatibility, but make it always True

3. Update Utility Functions
Modify `asu/util.py`:
- Remove the `get_redis_client()` and `get_redis_ts()` functions
- Update `get_queue()` to return the new SQLAlchemy-based Queue instead of an RQ Queue
- Update `add_timestamp()` and `add_build_event()` to write to the `BuildStats` table using SQLAlchemy
- Remove the `import redis` and `from rq import Queue` statements

4. Update Build System
Modify `asu/build.py`:
- Remove `from rq import get_current_job` and `from rq.utils import parse_timeout`
- Update the `build()` function to accept a custom Job object instead of an RQ job
- Replace `get_current_job()` with the passed job parameter
- Keep the `job.meta` and `job.save_meta()` calls (the new Job class will handle these)
- Rename the logger from `"rq.worker"` to `"asu.worker"`

Modify `asu/routers/api.py`:
- Replace the `from rq.job import Job` import with `asu.job_queue`
- Ensure status checking (`is_queued`, `is_started`, `is_finished`, `is_failed`) works with the new Job class

5. Update Dependencies
Modify `pyproject.toml`:
- Remove `redis>=6.4.0` and `rq>=2.6.0`
- Add `sqlalchemy>=2.0.0`
- Remove `fakeredis>=2.32.0`

6. Update Main Application
Modify `asu/main.py`

7. Stats Implementation Notes
- `server_stats` setting to enable/disable stats collection
(`check_same_thread=False` for SQLAlchemy)

Important Notes
Files to Create
- `asu/database.py` - SQLAlchemy models and database utilities
- `asu/job_queue.py` ...

This pull request was created from Copilot chat.