# Drydock Tools

Utilities for analyzing, extracting, scaffolding, building, and shipping projects.

**Primary purpose:** Accelerate AI agents working with codebases. All analyzers output JSON by default (machine-consumable), with an optional `--markdown` flag for human review.

## Directory Structure

```
Tools/
├── mcp_server.py       # MCP server (use analyzers from any AI assistant)
├── intake/             # Project intake tools
├── analyzers/          # Code analysis + AI context tools
├── extractors/         # Component extraction tools (planned)
├── scaffolders/        # Project scaffolding tools
├── validators/         # Validation tools (planned)
├── builders/           # Build and ship tools (Oryx, containerd)
└── shippers/           # Artifact shipping tools (planned)
```

## MCP Server (AI Assistant Integration)

Use Drydock analyzers directly from Claude Code, Cursor, VS Code, or any MCP-compatible assistant.

```bash
# One-time setup
pip install mcp
```

Claude Code (`~/.claude/settings.json`):

```json
{
  "mcpServers": {
    "drydock": {
      "command": "python",
      "args": ["/path/to/Drydock/Tools/mcp_server.py"]
    }
  }
}
```

VS Code (`.vscode/mcp.json`):

```json
{
  "servers": {
    "drydock": {
      "command": "python",
      "args": ["/path/to/Drydock/Tools/mcp_server.py"]
    }
  }
}
```

Available MCP tools:

| Tool | What it does |
| --- | --- |
| `drydock_context` | One-shot full analysis (runs all analyzers) |
| `drydock_codemap` | Map files, classes, functions, entry points |
| `drydock_boundaries` | Find component clusters and extraction risks |
| `drydock_dependencies` | Map external and internal dependencies |
| `drydock_interfaces` | Extract public API surfaces |
| `drydock_structure` | File counts, types, sizes, depth |
| `drydock_platforms` | Detect runtimes (Node, Python, Rust, Go, .NET) |

## Quick Reference

### The One-Shot Command

```bash
# Generate a complete AI-ready context for any project:
python analyzers/context_compiler.py <project_path> --output context.json

# Or human-readable:
python analyzers/context_compiler.py <project_path> --markdown --output context.md
```

This runs **all** analyzers and compiles a single document an AI agent can consume to fully understand a project. Use this first, then drill into specifics.
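Because the compiled document is plain JSON, an agent-side script can load it with the stdlib alone. A minimal sketch, assuming hypothetical top-level section names (`structure`, `platforms`, `dependencies`); check the actual `context_compiler.py` output for the real schema:

```python
import json
from pathlib import Path

def load_context(path):
    """Load a context.json produced by context_compiler.py."""
    return json.loads(Path(path).read_text())

def summarize(context):
    """List which (assumed) top-level sections are present.

    The section names here are illustrative, not a documented schema.
    """
    present = [s for s in ("structure", "platforms", "dependencies") if s in context]
    return "sections: " + ", ".join(present) if present else "empty context"
```

An agent can use a check like this to decide whether the one-shot context is complete enough or whether to re-run individual analyzers.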

## Intake

| Tool | Purpose | Usage |
| --- | --- | --- |
| `git_intake.py` | Bring projects into Drydock | `python intake/git_intake.py <url_or_path> [--shallow] [--analyze]` |

## Analyzers (AI Agent Tools)

| Tool | What it answers | Output |
| --- | --- | --- |
| `context_compiler.py` | Everything (runs all analyzers) | JSON/md |
| `codemap.py` | "What's in this codebase?" - files, classes, functions, entry points | JSON/md |
| `interface_extractor.py` | "What does each module expose?" - public APIs, signatures | JSON/md |
| `boundary_detector.py` | "Where should I cut?" - component clusters, bridge files, orphans | JSON/md |
| `dependency_analyzer.py` | "What depends on what?" - external & internal deps (multi-lang) | JSON/md |
| `structure_analyzer.py` | "How big/deep is this?" - file counts, types, sizes | JSON/md |
| `platform_detector.py` | "What platforms/runtimes?" - Node, Python, Rust, Go, .NET, etc. | JSON/md |
| `git_analyzer.py` | "What's stable/volatile?" - hot files, coupling, contributors | md/JSON |

## Usage Patterns

```bash
# Full context (one-shot, best for AI agents)
python analyzers/context_compiler.py /path/to/project -o context.json

# Quick code map
python analyzers/codemap.py /path/to/project -o codemap.json

# Just the public interfaces
python analyzers/interface_extractor.py /path/to/project -o interfaces.json

# Where to extract components
python analyzers/boundary_detector.py /path/to/project -o boundaries.json

# External dependencies
python analyzers/dependency_analyzer.py /path/to/project --json -o deps.json

# File structure overview
python analyzers/structure_analyzer.py /path/to/project -o structure.md

# Platform detection
python analyzers/platform_detector.py /path/to/project --json -o platforms.json

# Git history analysis (needs full clone, not --depth 1)
python analyzers/git_analyzer.py /path/to/project --days 180 -o git.md
```
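Analyzer outputs are meant to compose: an orchestration script might read the platform report to decide which builder to invoke. A sketch, assuming a hypothetical `"platforms"` key and runtime mapping (the real `platform_detector.py` schema may differ):

```python
# The "platforms" key and ORYX_PLATFORMS mapping are assumptions for
# illustration; consult platform_detector.py's actual output schema.
ORYX_PLATFORMS = {"node": "nodejs", "python": "python", "dotnet": "dotnet"}

def build_command(platform_report, project_path, docker_available=True):
    """Derive a builder invocation from a platform report dict."""
    detected = platform_report.get("platforms", [])
    if not any(p in ORYX_PLATFORMS for p in detected):
        return None  # no runtime a builder here knows how to build
    builder = "oryx_builder.py" if docker_available else "containerd_builder.py"
    return ["python", f"builders/{builder}", "build", project_path]
```

The Docker-present/Docker-free split mirrors the oryx_builder/containerd_builder pair in the Builders section below.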

## Scaffolders

| Tool | Purpose | Usage |
| --- | --- | --- |
| `project_scaffolder.py` | Create new project structure | `python scaffolders/project_scaffolder.py --name <name>` |

## Builders

| Tool | Purpose | Usage |
| --- | --- | --- |
| `oryx_builder.py` | Build & ship with Oryx (Docker) | `python builders/oryx_builder.py build <path>` |
| `containerd_builder.py` | Build & ship with containerd (Docker-free) | `python builders/containerd_builder.py build <path>` |

See `builders/README.md` for detailed documentation.

## Language Support

| Language | codemap | interfaces | boundaries | dependencies |
| --- | --- | --- | --- | --- |
| Python | Full AST | Full AST | imports | AST + stdlib |
| JS/TS | Regex | Regex | imports | package.json |
| Rust | Regex | Regex | use/mod | Cargo.toml |
| Go | Regex | Regex | imports | go.mod |
| C# | Regex | Partial | - | - |
| Java | Regex | Partial | - | - |

## Workflow Integration

```
┌────────────────────────────────────────────────────────────────────────────┐
│                         DRYDOCK WORKFLOW                                   │
│                                                                            │
│  INTAKE          ANALYSIS                EXTRACTION    ASSEMBLY    SHIP    │
│                                                                            │
│  git_intake ──▶ context_compiler ──▶ (AI agent decides) ──▶ Projects/ ──▶ │
│                  ├── codemap              boundary_detector    scaffolder  │
│                  ├── interface_extractor   guides the cuts      │          │
│                  ├── boundary_detector                          ▼          │
│                  ├── dependency_analyzer              oryx_builder         │
│                  ├── platform_detector                containerd_builder   │
│                  └── git_analyzer                              ▼          │
│                                                          Ocean/           │
└────────────────────────────────────────────────────────────────────────────┘
```

## Design Principles

1. **JSON-first** - All tools default to JSON output for AI consumption
2. **Composable** - Each tool does one thing; `context_compiler` orchestrates all
3. **Fast** - Regex-based parsing for non-Python languages (no external deps)
4. **No external deps** - Pure Python stdlib only (`ast`, `re`, `json`, `pathlib`)
5. **Non-destructive** - Analyzers never modify the source project
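Principle 3's regex-based approach can be sketched. The pattern below is a simplified illustration, not Drydock's actual implementation, that pulls declared function names out of JS/TS source without any parser dependency:

```python
import re

# Simplified illustration of regex-based (non-AST) extraction; the real
# analyzers' patterns are more thorough. Matches "function name(" and
# "const name = (" declarations, with optional export/async prefixes.
JS_FUNCTION = re.compile(
    r"^\s*(?:export\s+)?(?:async\s+)?function\s+(\w+)\s*\("
    r"|^\s*(?:export\s+)?const\s+(\w+)\s*=\s*(?:async\s*)?\(",
    re.MULTILINE,
)

def js_functions(source):
    """Return function names declared in a JS/TS source string."""
    return [a or b for a, b in JS_FUNCTION.findall(source)]
```

The trade-off is exactly as stated above: a regex pass is fast and dependency-free, but coarser than the full-AST treatment Python sources get.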

## Adding New Tools

1. Place in the appropriate category folder
2. Follow the naming convention: `{action}_{target}.py`
3. Output JSON by default; support a `--markdown` flag
4. Include a docstring with usage
5. Update this README
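A hypothetical skeleton that follows these conventions (the tool name and its file-type counting are illustrative, not an existing Drydock analyzer):

```python
"""example_analyzer.py - hypothetical skeleton following the conventions above.

Usage: python analyzers/example_analyzer.py <project_path> [--markdown] [-o OUT]
"""
import argparse
import json
import sys
from pathlib import Path


def analyze(project_path):
    """Minimal illustrative analysis: count files per extension (non-destructive)."""
    counts = {}
    for p in Path(project_path).rglob("*"):
        if p.is_file():
            key = p.suffix or "(no extension)"
            counts[key] = counts.get(key, 0) + 1
    return {"project": str(project_path), "file_types": counts}


def to_markdown(result):
    """Render the result for human review (--markdown)."""
    lines = ["# Analysis of " + result["project"], ""]
    for ext, n in sorted(result["file_types"].items()):
        lines.append(f"- `{ext}`: {n}")
    return "\n".join(lines)


def main(argv):
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("project_path")
    parser.add_argument("--markdown", action="store_true")
    parser.add_argument("-o", "--output")
    args = parser.parse_args(argv)
    result = analyze(args.project_path)
    text = to_markdown(result) if args.markdown else json.dumps(result, indent=2)
    if args.output:
        Path(args.output).write_text(text)
    else:
        print(text)


if __name__ == "__main__" and len(sys.argv) > 1:  # skip when imported
    main(sys.argv[1:])
```

JSON stays the default and `--markdown` is opt-in, matching the JSON-first design principle.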