A GitHub Action that turns dependency drift and vulnerabilities into a weekly, prioritized advisory issue, automatically.
Teams receive noisy dependency alerts with no project context. A list like "47 outdated packages" does not tell developers what is urgent, what is safe to defer, or what will break critical paths.
Every run:
- Detects package ecosystems in your repo (npm, PyPI, RubyGems, Go, Rust, Maven, PHP, NuGet)
- Parses manifests and lockfiles to collect pinned dependency versions
- Queries OSV.dev for known vulnerabilities (`/v1/querybatch`)
- Fetches latest versions from each ecosystem's registry
- Scans your source files with `ripgrep` to find where each dependency is actually used
- Classifies usage depth per dependency: `CRITICAL` -> `CORE` -> `FEATURE` -> `PERIPHERAL`
- Computes a risk score (0-10) from four weighted factors
- Categorizes each dependency into an action bucket
- Creates a GitHub issue with a structured advisory
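The OSV step above uses the public batch endpoint. A minimal sketch of the payload shape, following OSV.dev's documented `/v1/querybatch` request schema; the `build_querybatch` helper and the example dependencies are illustrative, not the action's internal API:

```python
import json

OSV_ENDPOINT = "https://api.osv.dev/v1/querybatch"  # batch vulnerability lookup

def build_querybatch(deps):
    """Build an OSV querybatch payload from (ecosystem, name, version) tuples."""
    return {
        "queries": [
            {"package": {"ecosystem": eco, "name": name}, "version": version}
            for eco, name, version in deps
        ]
    }

payload = build_querybatch([("npm", "lodash", "4.17.15"), ("PyPI", "requests", "2.19.0")])
# POSTing json.dumps(payload) to OSV_ENDPOINT returns a parallel "results"
# list; each entry holds any matching "vulns" for the corresponding query.
print(json.dumps(payload, indent=2))
```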
Action buckets:
| Bucket | Score | Meaning |
|---|---|---|
| 🔴 Act This Week | >= 8.0 | High-severity vuln in heavily-used code |
| 🟠 Act This Month | >= 5.0 | Moderate risk, plan for upcoming sprint |
| 🟡 Track Monthly | >= 3.0 | Low risk, keep on radar |
| ⚪ Ignore | < 3.0 | No material risk signal |
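The thresholds above reduce to a simple lookup; a sketch (the `bucket` function name is illustrative):

```python
def bucket(score):
    """Map a 0-10 risk score to its action bucket, per the thresholds above."""
    if score >= 8.0:
        return "Act This Week"
    if score >= 5.0:
        return "Act This Month"
    if score >= 3.0:
        return "Track Monthly"
    return "Ignore"
```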
Risk score formula:
```
score = 0.55 x OSV_severity + 0.25 x usage_depth + 0.15 x version_lag + 0.05 x maintenance_health
```
All weights are configurable without code changes.
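As a sketch, the formula with the default weights; each factor is on a 0-10 scale, and the function name and signature are illustrative:

```python
def risk_score(osv_severity, usage_depth, version_lag, maintenance_health,
               weights=(0.55, 0.25, 0.15, 0.05)):
    """Weighted sum of the four 0-10 factors; weights should sum to 1.0."""
    w_osv, w_usage, w_lag, w_health = weights
    return round(w_osv * osv_severity + w_usage * usage_depth
                 + w_lag * version_lag + w_health * maintenance_health, 1)
```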
If OSV.dev is unavailable, the advisory still gets created with an outage notice and retry guidance.
Add one file to your repository: `.github/workflows/dependency-advisor.yml`

```yaml
name: Dependency Aging Advisor

on:
  schedule:
    - cron: '0 9 * * 1'   # Every Monday at 09:00 UTC
  workflow_dispatch:       # Also runnable manually from the Actions tab

permissions:
  contents: read
  issues: write

jobs:
  advisory:
    runs-on: ubuntu-latest
    env:
      GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
    steps:
      - uses: actions/checkout@v4
      - uses: shiwani42/dependency-aging-advisor@v1
```

No secrets to configure; `GITHUB_TOKEN` is provided automatically by GitHub Actions.
Push the file. The advisory will run every Monday morning and open an issue like:
Dependency Aging Advisory - Week of 2026-04-06
| Ecosystem | Files parsed |
|---|---|
| npm | package-lock.json (v1/v2/v3), package.json |
| PyPI | requirements*.txt, poetry.lock, pyproject.toml |
| RubyGems | Gemfile.lock, Gemfile |
| Go | go.mod |
| Rust | Cargo.lock |
| Maven | pom.xml (with ${property} resolution) |
| PHP | composer.lock |
| NuGet | packages.lock.json |
| Input | Default | Description |
|---|---|---|
| `create-issue` | `true` | Create a GitHub issue with the advisory |
| `labels` | `dependencies,security,advisory` | Comma-separated labels for the issue |
| `max-items-per-section` | `10` | Max dependencies listed per bucket |
| `output-dir` | `.github/dependency-aging` | Directory for JSON/markdown artifacts |
| `risk-weight-osv` | `0.55` | Weight for OSV severity in risk score |
| `risk-weight-usage` | `0.25` | Weight for usage depth |
| `risk-weight-lag` | `0.15` | Weight for version lag |
| `risk-weight-health` | `0.05` | Weight for maintenance health |
| `import-graph` | `false` | Generate dependency import graph (see Visualisations) |
```yaml
- uses: shiwani42/dependency-aging-advisor@v1
  with:
    risk-weight-osv: '0.70'
    risk-weight-usage: '0.15'
    risk-weight-lag: '0.10'
    risk-weight-health: '0.05'
    max-items-per-section: '15'
    labels: 'security,deps'
```

The workflow supports `workflow_dispatch` with all weight overrides available as inputs. Go to Actions -> Dependency Aging Advisor -> Run workflow to trigger on demand.
You can also pass weights through environment variables (useful for matrix runs):
```bash
RISK_WEIGHT_OSV=0.7
RISK_WEIGHT_USAGE=0.15
RISK_WEIGHT_LAG=0.1
RISK_WEIGHT_HEALTH=0.05
```
Each run uploads artifacts retained for 14 days:
| File | Contents |
|---|---|
| `collection.json` | Detected dependencies, OSV results, latest versions |
| `analysis.json` | Risk scores, usage evidence, categories per dependency |
| `advisory.md` | Rendered markdown advisory (same content as the issue) |
| `advisory.html` | Interactive risk map (open in any browser) |
| `advisory.graph.html` | Force-directed import graph (only when `import-graph: true`) |
| `advisory.dot` | Graphviz DOT source (only when `import-graph: true`) |
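A downstream job could post-process `analysis.json`. A minimal sketch, assuming each entry exposes `name`, `score`, and `category` keys; the sample data and field names here are illustrative, so inspect the real artifact's schema first:

```python
# Illustrative stand-in for json.load(open(".github/dependency-aging/analysis.json"))
sample = {"dependencies": [
    {"name": "axios", "score": 6.1, "category": "Act This Month"},
    {"name": "bcrypt", "score": 1.7, "category": "Ignore"},
]}

def top_risks(analysis, threshold=5.0):
    """Names of dependencies at or above the score threshold, highest first."""
    hits = [d for d in analysis["dependencies"] if d["score"] >= threshold]
    return [d["name"] for d in sorted(hits, key=lambda d: d["score"], reverse=True)]

print(top_risks(sample))
```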
Every run produces two visual outputs automatically; no extra configuration is needed.
The CLI prints a colour-coded summary table with a segmented score bar for each dependency directly to your terminal:
```text
------------------------------------------------------------------------
 Dependency Aging Advisory - Week of 2026-03-30
------------------------------------------------------------------------
 Ecosystems : npm
 Analyzed   : 4 dependencies

 * Act This Week    0
 * Act This Month   2
 ~ Track Monthly    0
 o Ignore           2
------------------------------------------------------------------------
 DEPENDENCY            SCORE   BAR (OSV / USAGE / LAG / HEALTH)
------------------------------------------------------------------------
 * axios                6.1    ##################..........
     0.21.1 -> 1.14.0     PERIPHERAL   4 vuln(s)
 * lodash               5.7    ##################.......
     4.17.15 -> 4.18.1    PERIPHERAL   6 vuln(s)
 o bcrypt               1.7    ......
     5.0.1 -> 6.0.0       PERIPHERAL   0 vuln(s)
```
Each bar is split into four segments showing exactly what is driving the score:
- `█` red: OSV severity contribution
- `▓` cyan: usage depth contribution
- `░` yellow: version lag contribution
- `·` dim: maintenance health contribution
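The segmented bar can be approximated in plain ASCII; a sketch (the real CLI uses ANSI colours, and the `score_bar` helper is illustrative):

```python
def score_bar(osv, usage, lag, health, width=28):
    """Render one bar segment per weighted contribution (each on a 0-10 scale)."""
    segments = [(osv, "#"), (usage, "="), (lag, "~"), (health, ".")]
    return "".join(ch * int(round(value / 10 * width)) for value, ch in segments)
```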
A self-contained dark-theme HTML file (zero external dependencies) is written to the output directory alongside the markdown advisory. Open it in any browser:
```bash
open .github/dependency-aging/advisory.html
```

The chart plots every dependency as a bubble:
| Axis / dimension | What it encodes |
|---|---|
| X-axis | Version lag (current to major behind) |
| Y-axis | OSV severity score (0-10) |
| Bubble size | Usage depth (CRITICAL = large, PERIPHERAL = small) |
| Bubble colour | Risk category (red / orange / yellow / gray) |
Hovering a bubble shows a tooltip with the risk score, version delta, vuln count, and the exact upgrade command to run.
For larger codebases where you want to see which files pull in which dependencies, pass --import-graph:
```bash
python3 scripts/dependency_aging_advisor.py --project-root . --output-dir /tmp/out run --import-graph
```

Or in the workflow:
```yaml
- uses: shiwani42/dependency-aging-advisor@v1
  with:
    import-graph: 'true'
```

This generates two files:
| File | Use |
|---|---|
| `advisory.graph.html` | Self-contained force-directed graph; drag nodes, zoom, hover for details |
| `advisory.dot` | Graphviz DOT source; render with `dot -Tpng advisory.dot -o graph.png` |
The graph is bipartite: dependency nodes (rectangles, colored by risk category) connect to the source files that import them (circles, colored by top-level directory). Only dependencies with detected import hits are shown, so the graph stays readable even on large repos.
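The same bipartite shape can be emitted for other tooling; a sketch that writes dependency->file edges as Graphviz DOT (the `to_dot` helper and its edge list are illustrative, not the action's internal format):

```python
def to_dot(edges):
    """Emit a bipartite DOT graph: dependency boxes -> importing-file ellipses."""
    lines = ["digraph deps {"]
    for dep, path in edges:
        lines.append(f'  "{dep}" [shape=box];')
        lines.append(f'  "{path}" [shape=ellipse];')
        lines.append(f'  "{dep}" -> "{path}";')
    lines.append("}")
    return "\n".join(lines)

# Render the result with: dot -Tpng graph.dot -o graph.png
```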
Note: The import graph requires ripgrep for usage scanning. On GitHub Actions this is installed automatically by the action. Running locally requires `rg` to be on your `PATH`.
Run against any local project (no GitHub token needed for the analysis step):
```bash
python3 scripts/dependency_aging_advisor.py \
  --project-root /path/to/your/project \
  --output-dir /tmp/advisory-output \
  run
```

Run step-by-step:
```bash
# Step 1: collect dependencies + OSV data
python3 scripts/dependency_aging_advisor.py --project-root . --output-dir /tmp/out collect

# Step 2: analyze usage depth + compute risk scores
python3 scripts/dependency_aging_advisor.py --project-root . --output-dir /tmp/out analyze

# Step 3: render advisory markdown
python3 scripts/dependency_aging_advisor.py --project-root . --output-dir /tmp/out render

# Step 4: post issue (requires GITHUB_TOKEN + GITHUB_REPOSITORY)
GITHUB_TOKEN=your_token GITHUB_REPOSITORY=owner/repo \
  python3 scripts/dependency_aging_advisor.py --project-root . --output-dir /tmp/out post-issue
```

Requirements: Python 3.11+ (stdlib only, no `pip install`) and ripgrep for usage scanning.
On Python 3.10 the tool works but uses a regex fallback for pyproject.toml parsing instead of the stdlib TOML parser.
The fixture test runs fully offline using synthetic dependency and OSV data. It validates risk bucketing, usage depth classification, and advisory rendering:
```bash
./scripts/run_fixture_test.sh
```

The tool scans your source files for import statements using ecosystem-specific regex patterns and classifies each dependency by where it is used:
| Depth | Path patterns matched | Score |
|---|---|---|
| `CRITICAL` | auth, login, password, jwt, crypto, middleware, security, ... | 10 |
| `CORE` | /core/, /lib/, /service/, /domain/, /model/, /app/ | 7 |
| `FEATURE` | /feature/, /module/, /component/, /controller/, /src/, ... | 4 |
| `PERIPHERAL` | Everything else (test files, scripts, utils) | 2 |
Dev-scope dependencies (from `devDependencies`, `dev-dependencies`, `[dev]` groups) are always capped at `PERIPHERAL` since they don't run in production.
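A sketch of that classification, using only the patterns shown in the table (both pattern lists are truncated there, and the real script's sets may differ):

```python
# Checked in priority order; first match wins.
DEPTH_RULES = [
    ("CRITICAL", 10, ("auth", "login", "password", "jwt", "crypto",
                      "middleware", "security")),
    ("CORE", 7, ("/core/", "/lib/", "/service/", "/domain/", "/model/", "/app/")),
    ("FEATURE", 4, ("/feature/", "/module/", "/component/", "/controller/", "/src/")),
]

def classify_path(path, dev_scope=False):
    """Return (depth, score) for a file path that imports the dependency."""
    if dev_scope:
        return ("PERIPHERAL", 2)  # dev deps are capped: they never run in production
    lowered = path.lower()
    for depth, score, needles in DEPTH_RULES:
        if any(n in lowered for n in needles):
            return (depth, score)
    return ("PERIPHERAL", 2)      # everything else: tests, scripts, utils
```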
| Lag type | Score |
|---|---|
| Major version behind | 7.0 |
| Minor version behind | 4.0 |
| Patch version behind | 2.0 |
| Up to date / downgrade | 0.0 |
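The lag scoring above can be sketched with a naive dotted-version comparison; the real script may use proper per-ecosystem version parsing, and `lag_score` is illustrative:

```python
import re

def _parts(version):
    """First three numeric components of a version string, zero-padded."""
    nums = [int(p) for p in re.findall(r"\d+", version)[:3]]
    return (nums + [0, 0, 0])[:3]

def lag_score(current, latest):
    cur, new = _parts(current), _parts(latest)
    if new <= cur:
        return 0.0  # up to date / downgrade
    if new[0] > cur[0]:
        return 7.0  # major version behind
    if new[1] > cur[1]:
        return 4.0  # minor version behind
    return 2.0      # patch version behind
```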
| Signal | Score |
|---|---|
| Deprecated, archived, or unmaintained in registry notes | 8.0 |
| Dev-scope dependency | 2.0 |
| Active package | 3.0 |
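These signals reduce to a small lookup; a sketch (flags and function name illustrative; the real script derives them from registry metadata):

```python
def health_score(deprecated=False, dev_scope=False):
    """Maintenance-health factor, per the table above."""
    if deprecated:   # deprecated / archived / unmaintained in registry notes
        return 8.0
    if dev_scope:    # dev-scope dependency
        return 2.0
    return 3.0       # active package
```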
Confirm manifests or lockfiles are committed to the repository. The tool scans the checked-out repo; files listed in `.gitignore` will not be present.
The dependency may use dynamic imports, aliased imports, or re-exports. Extend `usage_patterns()` in the script for custom patterns.
Ensure the workflow has `permissions: issues: write`. This is required explicitly; the default `GITHUB_TOKEN` permissions may not include it.
The advisory still renders and the issue is created with an outage section. Re-run the workflow once OSV.dev is restored.
ripgrep may not be installed. Check the "Install ripgrep" step in the workflow logs. On non-Ubuntu runners you may need to install it manually in a prior step.
- The script has zero Python dependencies: pure stdlib, no supply-chain risk from pip packages.
- `GITHUB_TOKEN` is used only to create issues and is never logged or exposed in artifacts.
- All outbound HTTP calls go to: `api.osv.dev`, `registry.npmjs.org`, `pypi.org`, `rubygems.org`, `proxy.golang.org`, `crates.io`, `search.maven.org`, `repo.packagist.org`, `api.nuget.org`, `api.github.com`.
- Subprocess calls to `ripgrep` use list arguments (no shell interpolation), with all user-derived patterns run through `re.escape()`.
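That invocation pattern can be sketched as follows; `build_rg_command` and `rg_usage` are illustrative names, but the safety properties (argv list, no shell, `re.escape()` on user-derived input) match the description above:

```python
import re
import subprocess

def build_rg_command(dep_name, root="."):
    """Build a ripgrep argv list; the dependency name is regex-escaped."""
    return ["rg", "--files-with-matches", "--no-messages", re.escape(dep_name), root]

def rg_usage(dep_name, root="."):
    """Run ripgrep without a shell, so the pattern is never interpolated."""
    result = subprocess.run(build_rg_command(dep_name, root),
                            capture_output=True, text=True)
    return result.stdout.splitlines()
```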
MIT