# HalxDocs/contextpack

```
╔═══════════════════════════════════════════╗
║           contextpack  (ctx)              ║
║  Bundle your codebase for LLMs instantly  ║
╚═══════════════════════════════════════════╝
```

One command. Full codebase. Ready to paste into Claude or ChatGPT.

Stop hunting through folders and copy-pasting files one by one.
ctx . bundles your entire project — file tree, stats, and every source file — into a single LLM-ready document.


```bash
npx @halxdocs/contextpack .
```

## Why this exists

When you're debugging with Claude or asking ChatGPT to review your code, you need context — multiple files, the project structure, the config. Copying them one by one is painful. ContextPack does it in one shot.

```
📦 Packing /your/project...

✅ Done!
   Files:   12
   Tokens:  23.5k tokens
   Output:  /your/project/context.md
```

Drop context.md into any LLM. Done.


## Install

```bash
# Use instantly without installing
npx @halxdocs/contextpack .

# Or install globally
npm install -g @halxdocs/contextpack
ctx .
```

## Usage

```bash
ctx [directory] [options]
contextpack [directory] [options]
```

### Examples

```bash
# Pack current directory → context.md
ctx .

# Pack only the src/ folder
ctx ./src --out context.md

# Output as JSON
ctx . --format json --out context.json

# Just check token count before pasting
ctx . --stats

# Copy straight to clipboard
ctx . --copy

# Filter to relevant parts
ctx . --include src --exclude tests

# Auto-rebuild whenever you save a file
ctx . --watch --out context.md
```

## Options

| Flag | Alias | Description | Default |
|------|-------|-------------|---------|
| `--out` | `-o` | Output file path | `context.md` |
| `--format` | `-f` | `markdown` or `json` | `markdown` |
| `--include` | `-i` | Only include paths matching pattern (repeatable) | |
| `--exclude` | `-e` | Exclude paths matching pattern (repeatable) | |
| `--copy` | `-c` | Copy output to clipboard | `false` |
| `--watch` | `-w` | Watch for changes and repack automatically | `false` |
| `--stats` | `-s` | Print stats only, no file output | `false` |
| `--max-size` | | Max file size in KB to include | `500` |
| `--help` | `-h` | Show help | |

## What the output looks like

````markdown
# ContextPack

> Generated by contextpack — 2026-03-13T11:09:02.619Z

## 📊 Stats

| Property         | Value           |
|------------------|-----------------|
| Source           | `/your/project` |
| Files            | 12              |
| Total Size       | 45.2 KB         |
| Estimated Tokens | 11.3k tokens    |

## 🗂 File Tree

```
📁 src
  📄 index.ts
  📄 cli.ts
  📄 types.ts
📄 package.json
📄 tsconfig.json
```

## 📄 Files

### `src/index.ts`

> 1.3 KB · 339 tokens

```typescript
// ... your file contents here
```
````

## Smart defaults

ContextPack is opinionated so you don't have to think about it:

- ✅ Respects `.gitignore` automatically
- ✅ Skips `node_modules`, `dist`, `.git`, and build artifacts
- ✅ Skips binary files — images, fonts, executables
- ✅ Skips files over 500 KB (configurable with `--max-size`)
- ✅ Token estimates use the standard ~4 chars/token heuristic
- ✅ Warns you when output exceeds common LLM context windows
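The ~4 chars/token heuristic above can be sketched in a few lines (`estimateTokens` is a hypothetical helper name for illustration, not contextpack's actual export):

```typescript
// Minimal sketch of the ~4 chars/token heuristic the README describes.
// This is an approximation only; real tokenizers vary by model.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// A 1 KB source file works out to roughly 256 tokens under this heuristic.
console.log(estimateTokens("a".repeat(1024))); // 256
```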

## Token warnings

ContextPack tells you when your output might be too large:

| Tokens | Warning |
|--------|---------|
| > 128k | ⚠️ Fits Claude but may exceed GPT-4 |
| > 200k | ⚠️ Works with Claude 3, GPT-4 Turbo |
| > 1M | ⚠️ Very large — Gemini 1.5 / Claude only |
| > 2M | ⛔ Exceeds most LLM context windows |
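The tiers above amount to a simple threshold check. A sketch, with the function name and message strings assumed for illustration (not contextpack's internal code):

```typescript
// Illustrative mapping of the warning tiers in the table above.
// `tokenWarning` is a hypothetical name, not part of contextpack's API.
function tokenWarning(tokens: number): string | null {
  if (tokens > 2_000_000) return "⛔ Exceeds most LLM context windows";
  if (tokens > 1_000_000) return "⚠️ Very large: Gemini 1.5 / Claude only";
  if (tokens > 200_000) return "⚠️ Works with Claude 3, GPT-4 Turbo";
  if (tokens > 128_000) return "⚠️ Fits Claude but may exceed GPT-4";
  return null; // comfortably within common context windows
}
```

Checking the highest threshold first matters: a 3M-token bundle exceeds every tier, so it must hit the ⛔ case before any ⚠️ case.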

## Programmatic API

```typescript
import { pack } from '@halxdocs/contextpack';

const { output, result } = pack({
  dir: './src',
  format: 'markdown',
  copy: false,
  watch: false,
  stats: false,
  maxFileSize: 500 * 1024,
});

console.log(`${result.totalFiles} files, ${result.totalTokens} tokens`);
```

## License

MIT © HalxDocs
