# JSONL Conversation Compressor for Claude Code

> "Because that 'Context low' warning is a pain in the ass."
> (Thanks, Claude Code Thinking mode, sarcastically.)

You know that moment. You're deep in the zone, Claude Code is crushing it, and then BAM:

> ⚠️ Context low · Run /compact to compact & continue

Annoying, right? Like interrupting flow just when things get thicc with context.
**Warning**

- ⏱️ **Takes forever**: `/compact` can drag on for 2-5 minutes (especially in long or context-heavy conversations)
  - Thicc finishes in under 5 seconds (`--safe`) or at most 30 seconds (`--medium`/`--aggressive`)
- 💸 **Token vampire**: can consume up to 40,000 tokens per compaction
  - Thicc avoids this pain completely
- 🔄 **Starts a new conversation**: `/compact` doesn't truly compact; it creates a brand-new conversation with a summary of the old one
  - Thicc preserves the current conversation fully intact
- 🧠 **Loses context**: `/compact` often causes Claude Code to forget user instructions, `CLAUDE.md` instructions, or even current task context (catastrophic)
  - Thicc summarizes older context with a selective approach, keeping the conversation body continuous with almost zero loss of current context

Thicc is the only tool that can get rid of that warning without breaking your flow.
Thicc compresses Claude Code .jsonl conversation files by intelligently clearing old context while keeping the conversation alive and intact.
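To make the format concrete: a Claude Code conversation file is plain JSONL, one JSON object per line. Here is a minimal sketch of parsing and re-serializing such a file; `parseConversation` and `serializeConversation` are illustrative helper names, not part of Thicc's API.

```javascript
// Parse a .jsonl conversation: one complete JSON object per line.
// Blank lines are skipped; a malformed line throws with its line number.
function parseConversation(text) {
  return text
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line, i) => {
      try {
        return JSON.parse(line);
      } catch (err) {
        throw new Error(`Invalid JSONL at line ${i + 1}: ${err.message}`);
      }
    });
}

// Serialize back to JSONL: one compact JSON object per line.
function serializeConversation(entries) {
  return entries.map((e) => JSON.stringify(e)).join("\n") + "\n";
}
```

Loading a real file would then be `parseConversation(fs.readFileSync(path, "utf8"))`.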
- ✅ Lightning-fast compression
- ✅ Zero current context loss (preserves recent context)
- ✅ AI-powered summarization (optional)
- ✅ Tool pair integrity validation
- ✅ Batch processing (multiple files at once)
- ✅ Graceful degradation (falls back to Safe mode)
- ✅ Auto model selection (mistral → phi3 → llama2)
- ✅ Context-aware deletion (preserves snapshots)
- ✅ Claude Code compatible (pure JSONL output)
## Prerequisites

- Node.js 16+ (LTS recommended)
- Ollama (optional, for the AI-powered Medium/Aggressive modes)
## Installation

```bash
# Install dependencies
npm install

# Optional: Install Ollama for AI summarization
# Download from https://ollama.ai
ollama pull mistral
```

## Usage

```bash
# Interactive mode (select a mode from the menu)
node Thicc.js

# Safe mode (rules-based, no AI)
node Thicc.js --safe
node Thicc.js -s

# Medium mode (AI-assisted, balanced)
node Thicc.js --medium
node Thicc.js -m

# Aggressive mode (AI-assisted, maximum squeeze)
node Thicc.js --aggressive
node Thicc.js --hard
node Thicc.js -a

# Smart mode (real-time monitoring, 3-7% per pass)
node Thicc.js --smart <Session ID>

# Specify mode by number
node Thicc.js --mode 1  # Safe
node Thicc.js --mode 2  # Medium
node Thicc.js --mode 3  # Aggressive
node Thicc.js --mode 4  # Smart (requires session ID)

# Show help
node Thicc.js --help
```

## 🛡️ Safe Mode (Rules-Based)
Strategy:
- Identifies a percentage of tool pairs (adjusted based on conversation size)
- Deletes backwards from the last redundant chunk to the first `file-history-snapshot` or the last `summary`
- No AI intervention: pure algorithmic deletion
- Speed: under 5 seconds
- Reduction: 5-10%
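A rough sketch of what a rules-based pass like this could look like; the entry shapes, the `safePass` name, and the fixed 10% window are illustrative simplifications, not Thicc's actual algorithm.

```javascript
// Simplified sketch of a rules-based deletion pass.
// Everything up to the last summary / file-history-snapshot is protected,
// and only the oldest fraction of the remaining entries is dropped,
// never the recent tail.
function safePass(entries, fraction = 0.1) {
  // Protect everything up to the last summary or file-history-snapshot.
  let boundary = 0;
  entries.forEach((e, i) => {
    if (e.type === "summary" || e.type === "file-history-snapshot") {
      boundary = i + 1;
    }
  });

  // Delete the oldest `fraction` of what remains.
  const deletable = entries.length - boundary;
  const toDelete = Math.floor(deletable * fraction);
  return [...entries.slice(0, boundary), ...entries.slice(boundary + toDelete)];
}
```

The real tool additionally walks tool pairs so that a `tool_use` is never deleted without its matching `tool_result`; this sketch skips that bookkeeping.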
## ⚖️ Medium Mode (AI-Assisted)

Strategy:
- Performs Safe mode deletion first
- Summarizes deleted `parentUuid` chains using AI
- Embeds the summary into a single `parentUuid` message at the top
- Verifies tool pair integrity (no incomplete pairs in the summarized section)
- Speed: ~20-30 seconds
- Reduction: 10-20%
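For the summarization step, here is a minimal sketch against Ollama's public REST API (`POST /api/generate`, using the global `fetch` available in Node 18+). The summary-entry shape echoes the `{"type":"summary",...}` lines shown in the output format section, but both helper names are mine, not Thicc's.

```javascript
// Sketch: summarize deleted entries with a local Ollama model.
// /api/generate with { model, prompt, stream: false } is Ollama's REST API.
async function summarizeDeleted(deletedEntries, model = "mistral") {
  const prompt =
    "Summarize the following conversation fragments concisely:\n" +
    deletedEntries.map((e) => JSON.stringify(e)).join("\n");

  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  const data = await res.json();
  return data.response;
}

// Fold the summary into one synthetic entry placed at the top of the file.
function buildSummaryEntry(summaryText, firstKeptUuid) {
  return {
    type: "summary",
    summary: summaryText,
    leafUuid: firstKeptUuid,
  };
}
```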
## 🔥 Aggressive Mode (AI-Assisted)

Strategy:
- Performs Safe mode deletion first
- Summarizes deleted `parentUuid` chains using AI (with a more aggressive deletion window)
- Embeds the summary into a single `parentUuid` message at the top
- Verifies tool pair integrity
- Speed: ~20-30 seconds
- Reduction: 15-25%
## 🧠 Smart Mode (Real-Time Monitoring)

Strategy:
- Automatically locates the session file in `%USERPROFILE%/.claude/projects/*/*`
- Monitors file size in real time (polls every 3 seconds)
- Only intervenes once the file exceeds 1500 KB
- Performs surgical deletion (3-7% per pass) using the Safe mode algorithm
- Updates the file dynamically, removing only the identified deletion range
- Never touches the middle or recent sections of the conversation
- Intelligent threshold management: compresses once per +500 KB of growth (cumulative)
- Continues monitoring until stopped (Ctrl+C) or the session ends
- Speed: real-time, zero disruption to the active conversation
- Reduction: 3-7% per pass (conservative, repeatable)
Use Case: Perfect for active, long-running Claude Code sessions. Start monitoring at the beginning of your session and forget about it; Smart Mode keeps your conversation optimized without interrupting your flow.
Threshold Logic:
- First compression: when the file reaches 1500 KB
- Subsequent compressions: every +500 KB from the last compression
- Example flow:
  - Start: 800 KB → monitoring...
  - Grows to 1500 KB → compress → 1381 KB
  - Grows to 1881 KB (1381 + 500) → compress → 1732 KB
  - Grows to 2232 KB (1732 + 500) → compress → 2065 KB
  - And so on...
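The threshold bookkeeping is small enough to sketch directly; the constants come from the flow above, the function name is illustrative.

```javascript
// Sketch of Smart Mode's cumulative threshold bookkeeping.
const FIRST_THRESHOLD_KB = 1500;
const GROWTH_STEP_KB = 500;

// Given the file size after the last compression (null before the first),
// return the size at which the next compression should fire.
function nextThresholdKb(lastCompressedKb) {
  return lastCompressedKb === null
    ? FIRST_THRESHOLD_KB
    : lastCompressedKb + GROWTH_STEP_KB;
}
```

A polling loop would simply compare the current file size against `nextThresholdKb(...)` every 3 seconds and run a Safe-mode pass when it is exceeded.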
Example:

```bash
# Get your session ID from the /status command
node Thicc.js --smart 8d87c742-381b-48d5-9173-27b86de5061c

# Or use interactive mode and select option 4
node Thicc.js
```

**Note**
This is my real workflow. As soon as the "Context low · Run /compact to compact & continue" warning appears:

1. Get the Session ID
   - Run `/status`
   - Copy the Session ID (e.g., `8d87c742-381b-48d5-9173-27b86de5061c`)
2. Navigate to the `.claude` folder
   - Press Win + R (Windows)
   - Go to `.claude` or `C:\Users\<user>\.claude`
   - Enter the `projects` folder
3. Copy the conversation file
   - Locate the `.jsonl` file (filename = Session ID)
   - Example: `8d87c742-381b-48d5-9173-27b86de5061c.jsonl`
   - Copy it into Thicc's `Src/` directory
4. Run Thicc
   - `node Thicc.js --medium`
5. Replace the original file
   - Remove the `_compressed` suffix from the filename
   - Paste it back into `C:\Users\<user>\.claude\projects`
6. Restart Claude Code
   - Press Ctrl + C in the terminal
   - Run `claude --continue`
7. Continue the conversation
   - No more "Context low" warning
   - Repeat whenever the warning shows up again
| Metric | /compact | Thicc (Safe) | Thicc (Medium) | Thicc (Aggressive) | Thicc (Smart) |
|---|---|---|---|---|---|
| Time | 2-5 minutes | < 5 seconds | ~20-30 seconds | ~20-30 seconds | Real-time (automatic) |
| Token Cost | Up to 40,000 | Zero | Zero | Zero | Zero |
| Context Loss | High (new conversation) | Zero | Zero | Zero | Zero |
| Instruction Retention | Unreliable | Perfect | Perfect | Perfect | Perfect |
| Reduction | ~30-50% | 5-10% | 10-20% | 15-25% | 3-7% per pass |
| Use Case | Manual intervention | Quick one-time | Balanced compression | Maximum squeeze | Active sessions |
**Caution**

Why `/compact` Loses Context

`/compact` summarizes the entire conversation (including recent context), then starts a new conversation with that summary. This causes:

- Loss of `CLAUDE.md` instructions
- Loss of user-defined preferences
- Loss of current task context
Thicc summarizes older context only, preserving the most recent messages and keeping the same conversation alive.
## Project Structure

```
Thicc/
├── Thicc.js          # Main entry point
├── package.json      # Dependencies
├── Src/              # Place .jsonl files here
├── Dist/             # Compressed files output here
├── Assets/           # Images and resources
└── Lib/              # Internal modules
    ├── Ai/           # AI integration (Ollama)
    ├── Cli/          # Command-line UI
    ├── Compression/  # Compression modes
    ├── Core/         # Processing logic
    ├── Helpers/      # Utility functions
    ├── Io/           # File operations
    └── Validation/   # JSONL validation
```
Claude Code pure JSONL, where each line is a complete JSON object:

```
{"type":"summary","summary":"...","leafUuid":"..."}
{"parentUuid":"...","message":{...},...}
{"parentUuid":"...","message":{...},...}
```

Compatibility: output is directly compatible with Claude Code.
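Validating this format only requires checking that every non-empty line parses as a standalone JSON object. A sketch (`validateJsonl` is an illustrative name, not Thicc's actual validator):

```javascript
// Check that every non-empty line is a standalone JSON object.
// Returns a list of { line, message } problems; an empty list means valid JSONL.
function validateJsonl(text) {
  const problems = [];
  text.split("\n").forEach((line, i) => {
    if (line.trim().length === 0) return; // blank lines are ignored
    try {
      const value = JSON.parse(line);
      if (typeof value !== "object" || value === null || Array.isArray(value)) {
        problems.push({ line: i + 1, message: "not a JSON object" });
      }
    } catch (err) {
      problems.push({ line: i + 1, message: err.message });
    }
  });
  return problems;
}
```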
Thicc handles errors gracefully:
| Error | Behavior |
|---|---|
| Ollama unavailable | Automatically falls back to Safe mode |
| Model not found | Prompts user, falls back to Safe mode |
| Invalid JSONL | Reports errors, skips file, continues processing |
| Incomplete tool pairs | Finds and deletes orphaned tool pairs |
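For the incomplete tool pair case, here is a sketch of how orphan detection could work, assuming Anthropic-style content blocks (`tool_use` carrying an `id`, `tool_result` carrying a matching `tool_use_id`); Thicc's internal representation may differ.

```javascript
// Sketch: find tool_use ids that have no matching tool_result (orphans).
// Assumes Anthropic-style content blocks inside each entry's message.
function findOrphanedToolUses(entries) {
  const uses = new Set();
  const results = new Set();
  for (const entry of entries) {
    const content = entry.message && entry.message.content;
    if (!Array.isArray(content)) continue;
    for (const block of content) {
      if (block.type === "tool_use") uses.add(block.id);
      if (block.type === "tool_result") results.add(block.tool_use_id);
    }
  }
  return [...uses].filter((id) => !results.has(id));
}
```

Entries containing orphaned ids can then be dropped before the file is written back, so the output never carries a half-open tool pair.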
1. Place `.jsonl` conversation files in the `Src/` folder
2. Run `node Thicc.js` (or pass arguments)
3. Select a compression mode (if interactive)
4. The tool processes all files in batch
5. Outputs are saved to the `Dist/` folder as `[filename]_compressed.jsonl`
**Note**

The idea for Thicc, and the discovery of the proper manual summarization approach, came from me, born out of pure frustration with that damn "Context low" warning interrupting my flow.

However...

The tool itself was effectively developed entirely by our beloved Claude Sonnet model. Every line of code, every algorithm, every playful status message: all crafted by Sonnet's silicon hands.

For turning a frustrated idea into a tool that's too hot to handle.
For understanding the assignment (and the vibe).
For making context compression thicc and efficient.

You're a real one, Sonnet. 🙏
## License

ISC

