@bencap bencap commented Jan 29, 2026

This PR introduces a robust, auditable, and maintainable background job system for MaveDB, supporting both standalone jobs and complex pipelines. It provides a strong foundation for future workflow automation, error recovery, and developer onboarding.

Features include:

1. Worker Job System Refactor & Enhancements
Refactored the monolithic worker job system into a modular architecture:

  • Split jobs into domain-specific modules: data_management, external_services, variant_processing, etc. (a94c2fb; new files in jobs).
  • Centralized job registration in registry.py for ARQ worker configuration (a94c2fb, 0416b2d).
  • Added standalone job definitions and improved lifecycle context for job submission (0416b2d, 60ef67d).
  • Integrated PipelineFactory for variant creation and update processes (987b38a).

2. Job & Pipeline Management Infrastructure
Implemented JobManager and PipelineManager classes for robust job and pipeline lifecycle management (05fc52b, ae18eeb, 3799d84, 1e447a7, 7b44346):

  • Atomic state transitions, progress tracking, retry logic, and error handling.
  • Pipeline coordination, dependency management, pausing/unpausing, and cancellation.
  • Custom exception hierarchies for clear error recovery.
  • Added context manager for database session management and streamlined context handling in decorators (3ca697a, c61bd41, 010f15c).
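The atomic state transitions mentioned above can be sketched as a small validated state machine. This is a minimal illustration, not the PR's actual JobManager implementation; the status names, transition table, and exception name are assumptions:

```python
from enum import Enum


class JobStatus(Enum):
    PENDING = "pending"
    RUNNING = "running"
    COMPLETED = "completed"
    FAILED = "failed"
    CANCELLED = "cancelled"


# Hypothetical transition table: which target states are legal from each state.
ALLOWED_TRANSITIONS = {
    JobStatus.PENDING: {JobStatus.RUNNING, JobStatus.CANCELLED},
    JobStatus.RUNNING: {JobStatus.COMPLETED, JobStatus.FAILED, JobStatus.CANCELLED},
    JobStatus.COMPLETED: set(),
    JobStatus.FAILED: {JobStatus.RUNNING},  # a retry re-enters RUNNING
    JobStatus.CANCELLED: set(),
}


class InvalidJobStateTransition(Exception):
    """Raised when a job is moved to a state its current state does not allow."""


def transition(current: JobStatus, target: JobStatus) -> JobStatus:
    """Validate and perform a single state transition."""
    if target not in ALLOWED_TRANSITIONS[current]:
        raise InvalidJobStateTransition(f"{current.value} -> {target.value}")
    return target
```

Centralizing the legality check like this is what makes convenience methods such as succeed_job() or cancel_job() safe to call from any point in a job's lifecycle.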

3. Decorator System for Jobs and Pipelines
Introduced decorators for job and pipeline management:
(c2100a2, 155e549, 4a4055d, 3c4e6b9, 010f15c)

  • with_job_management: Automatic job lifecycle management, error handling, and progress tracking.
  • with_pipeline_management: Pipeline coordination after job completion.
  • with_guaranteed_job_run_record: Ensures audit trail by persisting a JobRun record before execution.
  • Improved test mode support and simplified stacking/usage patterns.
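A minimal sketch of what a lifecycle decorator in the style of with_job_management does: mark the job running, run it, then record success or failure. The in-memory JOB_RUNS dict stands in for the JobRun table, and the context shape is an assumption, not the PR's actual API:

```python
import asyncio
import functools

# In-memory stand-in for the JobRun table; the real decorators persist
# state through the JobManager (names here are illustrative only).
JOB_RUNS: dict[str, str] = {}


def with_job_management(func):
    """Sketch of a lifecycle decorator for async ARQ job functions."""

    @functools.wraps(func)
    async def wrapper(ctx, *args, **kwargs):
        job_id = ctx["job_id"]
        JOB_RUNS[job_id] = "running"
        try:
            result = await func(ctx, *args, **kwargs)
        except Exception:
            JOB_RUNS[job_id] = "failed"
            raise
        JOB_RUNS[job_id] = "completed"
        return result

    return wrapper


@with_job_management
async def create_variants(ctx, n):
    # Placeholder job body; the real job does variant creation work.
    return n * 2


result = asyncio.run(create_variants({"job_id": "job-1"}, 21))
```

Because the wrapper re-raises after recording the failure, ARQ's own retry machinery still sees the exception while the audit trail stays accurate.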

4. Comprehensive Test Suite
Added and refactored unit and integration tests for all job modules, managers, and decorators.
(05fc52b, ae18eeb, a701d53, 806f8ed, 011522c, 010f15c, 8a22306, a716cc9, b0397b4, 8c5e225, 3c4e6b9, 4a4055d, 1fe076a, 1abe4c6)
Enhanced test coverage for error handling, state transitions, and job orchestration.
Introduced fixtures and utilities for easier test setup and mocking.
Categorized tests with markers for unit, integration, and network tests.
(16a5a50, f34939c)
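The marker categories described above would typically be declared in the pytest configuration so that unknown markers raise errors. The marker names below mirror the categories in this PR but the exact names in the repository may differ:

```ini
# pytest.ini (illustrative) — declare markers so --strict-markers can catch typos
[pytest]
markers =
    unit: fast, isolated tests
    integration: tests that touch the database or Redis
    network: tests that call external services
```

A subset can then be selected at the command line, e.g. `pytest -m "unit and not network"`.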

5. Developer Documentation
Added detailed markdown documentation in the worker/jobs/ directory (1abe4c6):

  • System overview, decorator usage, manager responsibilities, pipeline management, job registry/configuration, and best practices.
  • Entry-point README and table of contents for easy navigation.
  • Guidance on error handling, job design, and testing strategies.

6. Database & Model Changes

  • Added new tables and enums for job traceability (JobRun, Pipeline, JobDependency, etc.) (1db6b68).
  • Alembic migration for the pipeline and job tracking schema.
  • Updated models and enums to support the new job/pipeline features.
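A trimmed-down sketch of what a job_runs model might look like, assuming SQLAlchemy declarative mapping as used elsewhere in MaveDB. The column set here is illustrative; the actual migration defines many more columns (timestamps, retry counters, JSONB metadata, foreign keys):

```python
import enum

from sqlalchemy import Column, Enum, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class JobRunStatus(enum.Enum):
    PENDING = "pending"
    RUNNING = "running"
    COMPLETED = "completed"
    FAILED = "failed"


class JobRun(Base):
    """Illustrative subset of a job_runs row."""

    __tablename__ = "job_runs"

    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    status = Column(Enum(JobRunStatus), nullable=False, default=JobRunStatus.PENDING)
    # Correlation IDs allow tracing one request across jobs and pipelines.
    correlation_id = Column(String, nullable=True)


# Exercise the model against an in-memory SQLite database.
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add(JobRun(name="create_variants", correlation_id="req-123"))
    session.commit()
    run = session.query(JobRun).one()
```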

7. Miscellaneous Improvements
  • Dependency updates (e.g., added asyncclick) (a3f36d1).

  • Logging and error reporting enhancements.
  • Cleaned up legacy code, removed obsolete files, and improved code organization.

bencap added 30 commits January 16, 2026 16:14
…cture

Break down 1767-line jobs.py into domain-driven modules, improving
maintainability and developer experience.

- variant_processing/: Variant creation and VRS mapping
- external_services/: ClinGen, UniProt, gnomAD integrations
- data_management/: Database and view operations
- utils/: Shared utilities (state, retry, constants)
- registry.py: Centralized ARQ job configuration

- constants.py: Environment configuration
- redis.py: Redis connection settings
- lifecycle.py: Worker lifecycle hooks
- worker.py: Main ArqWorkerSettings class

- All job functions maintain identical behavior
- Registry provides BACKGROUND_FUNCTIONS/BACKGROUND_CRONJOBS lists for ARQ initialization
- Test structure mirrors source organization
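The registry aggregation described above can be sketched as follows. The job function names are placeholders, not the repository's actual jobs; only the BACKGROUND_FUNCTIONS/BACKGROUND_CRONJOBS list names come from the commit message:

```python
# Hypothetical domain modules each export their job coroutines...
async def create_variants(ctx):  # e.g. from variant_processing/
    ...

async def link_gnomad(ctx):  # e.g. from external_services/
    ...

async def refresh_materialized_views(ctx):  # e.g. from data_management/
    ...

# ...and registry.py collects them in one place so the ARQ worker settings
# can reference a single pair of lists.
BACKGROUND_FUNCTIONS = [create_variants, link_gnomad, refresh_materialized_views]
BACKGROUND_CRONJOBS = []  # cron job definitions would be registered here
```

ArqWorkerSettings would then point at these lists (e.g. functions=BACKGROUND_FUNCTIONS, cron_jobs=BACKGROUND_CRONJOBS), which is what keeps worker initialization backwards compatible after the split.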

This refactor ensures ARQ worker initialization is backwards compatible.

The modular architecture establishes a more maintainable foundation for MaveDB's automated processing workflows while preserving all existing functionality.
Implement complete database foundation for pipeline-based job tracking and monitoring:

Database Tables:
• pipelines - High-level workflow grouping with correlation IDs for end-to-end tracing
• job_runs - Individual job execution tracking with full lifecycle management
• job_dependencies - Workflow orchestration with success/completion dependency types
• job_metrics - Detailed performance metrics (CPU, memory, execution time, business metrics)
• variant_annotation_status - Granular variant-level annotation tracking with success data

Key Features:
• Pipeline workflow management with dependency resolution
• Comprehensive job lifecycle tracking (pending → running → completed/failed)
• Retry logic with configurable limits and backoff strategies
• Resource usage and performance metrics collection
• Variant-level annotation status for debugging failures
• Correlation ID support for request tracing across the system
• JSONB metadata fields for flexible job-specific data
• Optimized indexes for common query patterns
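The configurable retry limits and backoff strategies mentioned above usually boil down to a small delay function. This is a generic exponential-backoff sketch; the actual parameters live on the job_runs schema, and these defaults are assumptions:

```python
def retry_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Delay before the given retry attempt (0-indexed).

    Doubles the base delay per attempt (1s, 2s, 4s, ...) and caps it so a
    long-failing job does not back off indefinitely.
    """
    return min(cap, base * (2 ** attempt))
```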

Schema Design:
• Foreign key relationships maintain data integrity
• Check constraints ensure valid enum values and positive numbers
• Strategic indexes optimize dependency resolution and metrics queries
• Cascade deletes prevent orphaned records
• Version tracking for audit and debugging

Models & Enums:
• SQLAlchemy models with proper relationships and hybrid properties
• Comprehensive enum definitions for job/pipeline status and failure categories
Add comprehensive job lifecycle management with status-based completion:

* Implement convenience methods for common job outcomes:
  - succeed_job() for successful completion
  - fail_job() for error handling with exception details
  - cancel_job() for user/system cancellation
  - skip_job() for conditional job skipping
* Enhance progress tracking with increment_progress() and set_progress_total()
* Add comprehensive error handling with specific exception types
* Improve job state validation and atomic transaction handling
* Implement extensive test coverage for all job operations
- Created PipelineManager capable of coordinating jobs within a pipeline context
- Introduced `construct_bulk_cancellation_result` to standardize cancellation result structures.
- Added `job_dependency_is_met` to check job dependencies based on their types and statuses.
- Created comprehensive tests for PipelineManager covering initialization, job coordination, status transitions, and error handling.
- Implemented mocks for database and Redis dependencies to isolate tests.
- Added tests for job enqueuing, cancellation, pausing, unpausing, and retrying functionalities.
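The `job_dependency_is_met` check described above can be sketched as a pure function over the dependency type and the upstream job's status. The "success"/"completion" dependency types come from the schema commit; the status strings are assumptions:

```python
def job_dependency_is_met(dependency_type: str, upstream_status: str) -> bool:
    """Sketch of the dependency check.

    A "success" dependency requires the upstream job to have completed
    successfully; a "completion" dependency is satisfied by any terminal
    state, successful or not.
    """
    terminal = {"completed", "failed", "cancelled"}
    if dependency_type == "success":
        return upstream_status == "completed"
    if dependency_type == "completion":
        return upstream_status in terminal
    raise ValueError(f"Unknown dependency type: {dependency_type}")
```

Keeping this as a pure function makes the pipeline coordinator easy to unit test without a database.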
Adds decorators for managed jobs and pipelines. These can be applied to async ARQ functions to automatically persist their state as they execute.
In certain instances (cron jobs in particular), worker processes are invoked
from contexts where we have not yet added a job run record to the database.
In such cases, it becomes useful to first guarantee a minimal record is added
to the database such that the job run can be tracked via existing managed job
decorators.

This feature adds such a decorator and associated tests.
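A minimal sketch of the guarantee described above: before the wrapped job body runs, ensure a JobRun record exists so the managed-job decorators have something to track. Everything except the decorator's name is illustrative (the in-memory dict stands in for the database):

```python
import asyncio
import functools
import uuid

JOB_RUNS: dict[str, dict] = {}  # stand-in for the job_runs table


def with_guaranteed_job_run_record(func):
    """Ensure a minimal JobRun record exists before the job executes."""

    @functools.wraps(func)
    async def wrapper(ctx, *args, **kwargs):
        if "job_id" not in ctx:
            # Cron invocations arrive without a pre-created record.
            ctx["job_id"] = str(uuid.uuid4())
        JOB_RUNS.setdefault(
            ctx["job_id"], {"name": func.__name__, "status": "pending"}
        )
        return await func(ctx, *args, **kwargs)

    return wrapper


@with_guaranteed_job_run_record
async def nightly_refresh(ctx):
    # Placeholder cron job body.
    return ctx["job_id"]


job_id = asyncio.run(nightly_refresh({}))
```

Stacked beneath a lifecycle decorator, this means cron-invoked jobs get the same audit trail as jobs enqueued through the normal submission path.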
Since decorators are applied at import time, this test mode path is a pragmatic solution to run decorators without side effects during unit tests. It's more straightforward and maintainable than other solutions, and still lets us import job definitions up front to register with ARQ.
Additionally contains some small updates to how decorator unit tests handle the new test mode flag.
…ed_job_data`

- Updated test files to use `with_populated_job_data` fixture for populating the database with sample job and pipeline data.
- Removed the `setup_worker_db` fixture from various test cases in job and pipeline management tests.
- Added new sample job and pipeline fixtures in `conftest.py` to streamline test data creation.
- Improved clarity and maintainability of tests by consolidating data setup logic.
feat(wip): upload files to S3 prior to job invocation, localstack emulation in dev environment
From certain decorator contexts, we wish to not coordinate the pipeline after starting it. This prevents jobs from being double enqueued mistakenly.
bencap added 19 commits January 29, 2026 11:34
…jecting in lifecycle hooks

This contextmanager method ensures sessions are closed in a more consistent and guaranteed manner.
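The guarantee being described is the standard contextmanager pattern: the session is closed in a finally block, so it cannot leak even when the body raises. A minimal sketch with a fake session standing in for a SQLAlchemy session:

```python
from contextlib import contextmanager


class FakeSession:
    """Stand-in for a database session; only close() matters here."""

    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True


@contextmanager
def managed_session(session_factory):
    """Yield a session and guarantee it is closed, even on error."""
    session = session_factory()
    try:
        yield session
    finally:
        session.close()


session_seen = None
try:
    with managed_session(FakeSession) as s:
        session_seen = s
        raise RuntimeError("job failed")  # simulate a failing job body
except RuntimeError:
    pass
```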
- Updated test assertions to check for "exception" status instead of "failed" in variant creation and mapping tests.
- Enhanced exception handling in job management decorators to return structured results with "status", "data", and "exception" fields.
- Modified job manager methods to align with new result structure, ensuring consistent handling of job outcomes across success, failure, and cancellation scenarios.
- Adjusted integration tests to validate the new result format and ensure proper job state transitions.
- Improved clarity in test cases by asserting the presence of exception details where applicable.
Alters the `complete_job` method to remove default updates to the progress message. This allows the job to set its final progress message, which results in generally more useful messages than the generic options we have at our disposal in the complete job method.
This update will support using job definitions directly in scripts.
@bencap bencap linked an issue Jan 29, 2026 that may be closed by this pull request
@bencap bencap linked an issue Jan 29, 2026 that may be closed by this pull request
- Integrated `send_slack_error` calls in multiple test cases across different modules to ensure error notifications are sent when exceptions occur.
- Updated tests for materialized views, published variants, Clingen submissions, GnomAD linking, UniProt mappings, pipeline management, and variant processing to assert that Slack notifications are triggered on failures.
- Enhanced error handling in job management decorators to include Slack notifications for missing context and job failures.
@bencap bencap force-pushed the feature/bencap/627/job-traceability branch from 17b2975 to e85312a on January 29, 2026 21:25
@bencap bencap force-pushed the feature/bencap/627/job-traceability branch from 2e07984 to 85a4268 on January 29, 2026 23:33

Successfully merging this pull request may close these issues.

  • Traceability and Auditing for Variant-level Job Results
  • Consider Retaining Score and Count Raw Files
