# Context Manager

Universal LLM context manager supporting 14+ languages with method-level filtering, token optimization, and GitIngest format support. Perfect for AI-assisted development workflows.

```bash
npm install @hakkisagdic/context-manager
```

If you find this tool helpful, consider buying me a coffee! Your support helps maintain and improve this project.

---
## Project Files

- `context-manager.js` - Main LLM context analysis script with exact token counting
- `.calculatorignore` - Files to exclude from token calculation (EXCLUDE mode)
- `.calculatorinclude` - Files to include in token calculation (INCLUDE mode)
- `README.md` - This documentation file
- `README-tr.md` - Turkish documentation

## Features
- ✅ Exact token counting using tiktoken (GPT-4 compatible)
- 🌍 Multi-language support - 14+ languages: JavaScript, TypeScript, Python, PHP, Ruby, Java, Kotlin, C#, Go, Rust, Swift, C/C++, Scala
- 🚫 Dual ignore system - respects both .gitignore and calculator ignore rules
- 📋 Include/Exclude modes - .calculatorinclude takes priority over .calculatorignore
- 📊 Detailed reporting - by file type, largest files, statistics
- 💾 Optional JSON export - detailed analysis reports
- 🔍 Verbose mode (default) - shows all included files for transparency
- 🎯 Core application focus - configured to analyze only essential code files
- 📈 Context optimization - perfect for LLM context window management
- 🤖 LLM context export - generate optimized file lists for LLM consumption
- 📋 Clipboard integration - copy context directly to clipboard
- ⚡ JSON format - Structured clipboard output identical to llm-context.json file
- 🎯 LLM-optimized - Clean directory structure without token counts
- 🔗 Consistent exports - Clipboard and file exports use identical JSON format
- 📤 Interactive export - Prompts for export choice when no options specified
- 🔀 Dual context modes - compact (default) or detailed format
- 📄 GitIngest format - Generate single-file digest for LLM consumption (inspired by GitIngest)
## Usage

```bash
# Interactive analysis with export selection
context-manager

# Analyze with minimal LLM context format
context-manager --context-clipboard

# Save detailed report
context-manager --save-report

# Generate LLM context file
context-manager --context-export

# Generate GitIngest-style digest (single file for LLMs)
context-manager --gitingest

# Method-level analysis
context-manager --method-level

# Combine multiple exports
context-manager -g -s  # GitIngest digest + detailed report

# Use detailed context format (legacy)
context-manager --method-level --detailed-context --context-clipboard
```

### NPM Package Usage

```bash
# Using the NPM package globally
context-manager
context-manager --save-report
context-manager --context-clipboard
```

## Current Configuration
The tool is configured to focus on core application logic only:
### Included

- Core MCP server implementation (utility-mcp/src/)
- Authentication and security layers
- Request handlers and routing
- Transport protocols and communication
- Utilities and validation logic
- Configuration management
- Error handling and monitoring

### Excluded

- Documentation files (.md, .txt)
- Configuration files (.json, .yml)
- Infrastructure and deployment files
- Testing and script directories
- Build artifacts and dependencies
- Workflow orchestration files (utility-mcp/src/workflows/**)
- Testing utilities (utility-mcp/src/testing/**)
- All non-essential supporting files

## Usage
### Basic Usage

```bash
# Interactive analysis with export selection
context-manager

# Quiet mode (no file listing)
context-manager --no-verbose

# With detailed JSON report
context-manager --save-report

# Generate LLM context file list
context-manager --context-export

# Copy context directly to clipboard
context-manager --context-clipboard
```

### Interactive Export Selection

When you run the tool without specifying export options (--save-report, --context-export, or --context-clipboard), it will automatically prompt you to choose an export option after the analysis:

```bash
# Run analysis and get prompted for export options
context-manager
```

The tool will show:

```
📤 Export Options:
1) Save detailed JSON report (token-analysis-report.json)
2) Generate LLM context file (llm-context.json)
3) Copy LLM context to clipboard
4) No export (skip)

🤔 Which export option would you like? (1-4):
```

This interactive mode ensures you never miss the opportunity to export your analysis results in the format you need.
## Include vs Exclude Modes

The token calculator supports two complementary filtering modes:

### EXCLUDE Mode (.calculatorignore)

- Default mode when only .calculatorignore exists
- Includes all files except those matching ignore patterns
- Traditional gitignore-style exclusion logic

### INCLUDE Mode (.calculatorinclude)

- Priority mode - when .calculatorinclude exists, .calculatorignore is ignored
- Includes only files matching include patterns
- More precise control for specific file selection
- Perfect for creating focused analysis sets

### Mode Selection Logic
1. If .calculatorinclude exists → INCLUDE mode (ignore .calculatorignore)
2. If only .calculatorignore exists → EXCLUDE mode
3. If neither exists → Include all files (respect .gitignore only)
### Mode Examples

```bash
# EXCLUDE mode: Include everything except patterns in .calculatorignore
rm .calculatorinclude  # Remove include file
context-manager

# INCLUDE mode: Include only patterns in .calculatorinclude
# (automatically ignores .calculatorignore)
context-manager
```
### Command Line Options

```bash
context-manager --help
```

- --save-report, -s - Save detailed JSON report
- --no-verbose - Disable file listing (verbose is default)
- --context-export - Generate LLM context file list (saves as llm-context.json)
- --context-clipboard - Copy LLM context directly to clipboard
- --detailed-context - Use detailed context format (8.6k chars; default is the compact ~2.3k format)
- --help, -h - Show help message

## LLM Context Export
The token calculator can generate optimized file lists for LLM consumption, with two format options:
### Compact Format (Default)

- Size: ~2.3k characters (structured JSON)
- Content: Project metadata and organized file paths without token counts
- Format: Identical to the llm-context.json file - complete JSON structure
- Perfect for: LLM consumption, programmatic processing, structured data needs
- Usage: --context-clipboard or --context-export

### Detailed Format

- Size: ~8.6k characters (comprehensive)
- Content: Full paths, categories, importance scores, directory stats
- Perfect for: Initial project analysis, comprehensive documentation
- Usage: --detailed-context --context-clipboard

### Context Optimization Features

- Smart file selection - Top files by token count and importance
- Directory grouping - Common prefix compression saves space
- Token abbreviation - "12k" instead of "12,388 tokens"
- Extension removal - ".js" removed to save characters
- Cross-platform clipboard - Works on macOS, Linux, and Windows
- Multiple output formats - JSON file or clipboard-ready text
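The directory-grouping optimization can be sketched like this — shared path prefixes become object keys emitted once instead of being repeated per file. This is an illustrative sketch with a hypothetical helper name, not the tool's actual code:

```javascript
// Group file paths under their directory prefix so the compact
// context emits each shared prefix once instead of per file.
function groupByDirectory(paths) {
  const groups = {};
  for (const p of paths) {
    const idx = p.lastIndexOf('/');
    const dir = idx === -1 ? './' : p.slice(0, idx + 1);
    const file = p.slice(idx + 1);
    (groups[dir] = groups[dir] || []).push(file);
  }
  return groups;
}
```

For a project with many files per directory, this prefix compression is where most of the character savings in the compact format would come from.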
### Export Examples

```bash
# Generate minimal LLM context and save to llm-context.json (2.3k chars JSON)
context-manager --context-export

# Copy minimal context directly to clipboard (identical to the file output)
context-manager --context-clipboard

# Copy detailed context to clipboard (8.6k chars)
context-manager --detailed-context --context-clipboard

# Combine with regular analysis
context-manager --save-report --context-clipboard
```

### Format Samples
Compact Format (JSON - 2.3k chars):

```json
{
  "project": {
    "root": "cloudstack-go-mcp-proxy",
    "totalFiles": 64,
    "totalTokens": 181480
  },
  "paths": {
    "utility-mcp/src/server/": [
      "CloudStackUtilityMCP.js"
    ],
    "utility-mcp/src/handlers/": [
      "workflow-handlers.js",
      "tool-handlers.js",
      "analytics-handler.js"
    ],
    "utility-mcp/src/utils/": [
      "security.js",
      "usage-tracker.js",
      "cache-warming.js"
    ]
  }
}
```

Detailed Format (8.6k chars):

```
cloudstack-go-mcp-proxy Codebase Context
Project: 64 files, 181,480 tokens

Core Files (Top 20):
1. utility-mcp/src/server/CloudStackUtilityMCP.js (12,388 tokens, server)
2. utility-mcp/src/handlers/workflow-handlers.js (11,007 tokens, handler)
...

All Files:
[{"path": "file.js", "t": 1234, "c": "core", "i": 85}]
```

## Use Cases
Compact Format (2.3k chars JSON):
1. LLM Integration - Structured data for AI assistants with complete project context
2. Programmatic Processing - JSON format for automated tools and scripts
3. Context Sharing - Identical format in clipboard and file exports
4. Development Workflows - Consistent structure for CI/CD and automation
Detailed Format (8.6k chars):
1. Architecture Planning - Comprehensive project overview for major decisions
2. New Team Member Onboarding - Complete codebase understanding
3. Documentation Generation - Full project structure analysis
4. Code Review Preparation - Detailed file relationships and importance
General Use Cases:
- Development workflow integration
- CI/CD pipeline context generation
- Automated documentation updates
- Project health monitoring
## GitIngest Format Export
Context-manager now supports generating GitIngest-style digest files - a single, prompt-friendly text file perfect for LLM consumption.
### What Is GitIngest Format?
GitIngest format consolidates your entire codebase into a single text file with:
- Project summary and statistics
- Visual directory tree structure
- Complete file contents with clear separators
- Token count estimates
This format is inspired by GitIngest, implemented purely in JavaScript with zero additional dependencies.
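The directory-tree portion of the digest can be produced with a short recursive renderer along these lines. This is a sketch under assumed data shapes — directories as nested objects, files as null leaves — and `renderTree` is a hypothetical name, not the tool's actual builder:

```javascript
// Render a GitIngest-style ASCII tree from a nested object where
// directories map to child objects and files map to null.
function renderTree(node, prefix = '') {
  const entries = Object.entries(node);
  const lines = [];
  entries.forEach(([name, child], i) => {
    const last = i === entries.length - 1;
    // '└──' marks the final entry at this level, '├──' the rest.
    lines.push(prefix + (last ? '└── ' : '├── ') + name + (child ? '/' : ''));
    if (child) {
      // Continue the vertical guide only for non-final branches.
      lines.push(...renderTree(child, prefix + (last ? '    ' : '│   ')));
    }
  });
  return lines;
}
```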
### GitIngest Usage

```bash
# Standard workflow - analyze and generate digest in one step
context-manager --gitingest
context-manager -g

# Combine with other exports
context-manager -g -s  # digest.txt + token-analysis-report.json

# Two-step workflow - generate digest from existing JSON (fast, no re-scan)
context-manager -s                        # Step 1: Create report
context-manager --gitingest-from-report   # Step 2: Generate digest

# Or from LLM context
context-manager --context-export           # Step 1: Create context
context-manager --gitingest-from-context   # Step 2: Generate digest

# With custom filenames
context-manager --gitingest-from-report my-report.json
context-manager --gitingest-from-context my-context.json
```

Why use JSON-based digest?
- ⚡ Performance: Instant digest generation without re-scanning
- 🔄 Reusability: Generate multiple digests from one analysis
- 📦 Workflow: Separate analysis from export steps
- 🎯 Flexibility: Use different JSON sources for different purposes
### Digest Format

The generated digest.txt file looks like:

```
Directory: my-project
Files analyzed: 42
Estimated tokens: 15.2k

Directory structure:
└── my-project/
    ├── src/
    │   ├── index.js
    │   └── utils.js
    └── README.md

================================================
FILE: src/index.js
================================================
[complete file contents here]

================================================
FILE: src/utils.js
================================================
[complete file contents here]
```

### Key Features
- Single File: Everything in one file for easy LLM ingestion
- Tree Visualization: Clear directory structure
- Token Estimates: Formatted as "1.2k" or "1.5M"
- Sorted Output: Files sorted by token count (largest first)
- Filter Compatible: Respects all .gitignore and calculator ignore rules

### GitIngest Use Cases

1. LLM Context Windows: Paste entire codebase as single context
2. Code Reviews: Share complete project snapshot
3. Documentation: Single-file project reference
4. AI Analysis: Perfect for ChatGPT, Claude, or other LLMs
5. Archival: Simple project snapshot format
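The "1.2k" / "1.5M" token formatting used in the digest header can be sketched as follows (the helper name `formatTokenEstimate` is hypothetical; the tool's exact rounding may differ):

```javascript
// Format an estimated token count the way the digest header shows it:
// millions as "M", thousands as "k", small counts verbatim.
function formatTokenEstimate(n) {
  if (n >= 1000000) return (n / 1000000).toFixed(1) + 'M';
  if (n >= 1000) return (n / 1000).toFixed(1) + 'k';
  return String(n);
}
```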
### Version

Context-manager implements GitIngest format v0.3.1. See docs/GITINGEST_VERSION.md for implementation details and version history.

## Configuration
### .calculatorignore Configuration

The .calculatorignore file is pre-configured for core application analysis:

```bash
# Current focus: Only core JS files in utility-mcp/src/
# Excludes:
**/*.md                        # All documentation
**/*.json                      # All configuration files
**/*.yml                       # All YAML files
infrastructure/**              # Infrastructure code
workflows/**                   # Workflow definitions
docs/**                        # Documentation directory
token-analysis/**              # Analysis tools themselves
utility-mcp/scripts/**         # Utility scripts
utility-mcp/src/workflows/**   # Workflow JS files
utility-mcp/src/testing/**     # Testing utilities
```
### .calculatorinclude Configuration

The .calculatorinclude file provides precise file selection:

```bash
# Include only core JavaScript files
# This should produce exactly 64 files

# Include main entry point
utility-mcp/index.js

# Include all src JavaScript files EXCEPT workflows and testing
utility-mcp/src/**/*.js

# Exclude specific subdirectories (using negation)
!utility-mcp/src/workflows/**
!utility-mcp/src/testing/**
```

### Customization
For EXCLUDE mode (edit .calculatorignore):

```bash
# Remove lines to include more file types
# Add patterns to exclude specific files

# Example: Include documentation
# **/*.md   <- comment out or remove this line

# Example: Exclude specific large files
your-large-file.js
specific-directory/**
```

For INCLUDE mode (create .calculatorinclude):

```bash
# Include specific files or patterns
src/**/*.js          # All JS files in src
config/*.json        # Config files only
docs/api/**/*.md     # API documentation only

# Use negation to exclude from broad patterns
src/**/*.js
!src/legacy/**       # Exclude legacy code
!src/**/*.test.js    # Exclude test files
```

### Configuration File Priority
1. .gitignore (project root) - Standard git exclusions (always respected)
2. .calculatorinclude (token-analysis/) - INCLUDE mode (highest priority)
3. .calculatorignore (token-analysis/) - EXCLUDE mode (used when no include file)
4. .calculatorignore (project root) - Fallback EXCLUDE mode location

## Installation
For exact token counting, install tiktoken:

```bash
npm install tiktoken
```

Without tiktoken, the tool uses smart estimation (~95% accuracy).
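The fallback estimator is not specified here, but a common chars-per-token heuristic for GPT-style tokenizers looks roughly like this. The ~4-characters-per-token ratio is a widely used rule of thumb, not necessarily the tool's exact formula:

```javascript
// Rough token estimate for when tiktoken is not installed.
// English prose and source code average roughly 4 characters per
// token under GPT-style BPE tokenizers.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}
```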
## Output Example

```
🎯 PROJECT TOKEN ANALYSIS REPORT
================================================================================
📊 Total files analyzed: 64
🔢 Total tokens: 181,480
💾 Total size: 0.78 MB
📄 Total lines: 28,721
📈 Average tokens per file: 2,836
🚫 Files ignored by .gitignore: 11,912
📋 Files ignored by calculator rules: 198

📋 BY FILE TYPE:
--------------------------------------------------------------------------------
Extension       Files      Tokens      Size (KB)    Lines
--------------------------------------------------------------------------------
.js             64         181,480     799.8        28,721

🏆 TOP 5 LARGEST FILES BY TOKEN COUNT:
--------------------------------------------------------------------------------
1. 12,388 tokens (6.8%) - utility-mcp/src/server/CloudStackUtilityMCP.js
2. 11,007 tokens (6.1%) - utility-mcp/src/handlers/workflow-handlers.js
3. 7,814 tokens (4.3%) - utility-mcp/src/utils/security.js
4. 6,669 tokens (3.7%) - utility-mcp/src/handlers/tool-handlers.js
5. 5,640 tokens (3.1%) - utility-mcp/src/ci-cd/pipeline-integration.js
```

## Context Management
Perfect for LLM context window optimization:
- 181k tokens = Core application logic only
- Clean analysis = No noise from docs, configs, or build files
- Focused development = Essential code for AI-assisted development
- Context efficiency = Maximum useful code per token
- Dual mode flexibility = Precise include/exclude control
- Ultra-minimal export = 1k chars (89% reduction) for frequent AI interactions
- Detailed export = 8.6k chars for comprehensive analysis when needed
## Integration
You can integrate this tool into:
- CI/CD pipelines for code size monitoring
- Pre-commit hooks for token budget checks
- Documentation generation workflows
- Code quality gates
- LLM context preparation workflows
- Development environment setup
## Troubleshooting

### Mode Issues

- INCLUDE mode active: Remove .calculatorinclude to use EXCLUDE mode
- Wrong files included: Check if .calculatorinclude exists (takes priority)
- Mode confusion: Use verbose mode to see which mode is active

### Pattern Issues

- Ensure no inline comments in ignore/include pattern files
- Use file patterns (docs/**) instead of directory patterns (docs/)
- Test specific patterns with verbose mode
- Check pattern syntax: ** for recursive matching, * for a single level

### Token Count Issues

- Too high: Review included files with verbose mode, add exclusion patterns
- Too low: Check if important files are excluded, review patterns
- Inconsistent: Verify which mode is active (include vs exclude)

### Missing Files

- Check if files are excluded by .gitignore (always respected)
- Verify calculator ignore/include patterns
- Ensure files are recognized as text files
- Use verbose mode to see exclusion reasons

---

# Context Manager
LLM context manager with method-level filtering and token optimization. The ultimate tool for AI-assisted development.

Created by Hakkı Sağdıç

## 🚀 Features

- ✅ File-level token analysis - Analyze entire files and directories
- 🔧 Method-level analysis - Extract and analyze specific methods from JavaScript/TypeScript/Rust/C#/Go/Java
- 📋 Dual filtering system - Include/exclude files and methods with pattern matching
- 📊 LLM context optimization - Generate ultra-compact context for AI assistants
- 🎯 Exact token counting - Uses tiktoken for GPT-4 compatible counts
- 📤 Multiple export formats - JSON reports, clipboard, file exports
- 📦 NPM package - Use programmatically or as global CLI tool
- 🔍 Pattern matching - Wildcards and regex support for flexible filtering
- ⚡ Performance optimized - 36% smaller codebase with enhanced functionality

## 📦 Installation
### NPM Package

```bash
# Local installation
npm install @hakkisagdic/context-manager

# Global installation
npm install -g @hakkisagdic/context-manager

# Run globally
context-manager --help
```

### From Source

```bash
# Clone and use directly
git clone
cd token-analysis
node token-calculator.js --help
```

## 🎯 Quick Start
### Basic Analysis

```bash
# Interactive analysis with export selection
context-manager

# File-level analysis with clipboard export
context-manager --context-clipboard

# Method-level analysis
context-manager --method-level --context-export

# Analysis with reports
context-manager --method-level --save-report --verbose
```

### Method-Level Filtering

```bash
# Focus on specific methods only
printf "calculateTokens\nhandleRequest\n*Validator\n" > .methodinclude
context-manager --method-level

# Exclude test methods
printf "test\ndebug\nconsole\n" > .methodignore
context-manager --method-level --context-clipboard
```

## Usage
### Command Line Interface

```bash
# Basic analysis
context-manager

# Method-level analysis
context-manager --method-level

# Save detailed report
context-manager --save-report

# Copy context to clipboard
context-manager --context-clipboard

# Combine options
context-manager --method-level --save-report --verbose
```

### Programmatic Usage

```javascript
const { TokenAnalyzer } = require('@hakkisagdic/context-manager');

// Basic file-level analysis
const analyzer = new TokenAnalyzer('./src', {
  methodLevel: false,
  verbose: true
});

// Method-level analysis
const methodAnalyzer = new TokenAnalyzer('./src', {
  methodLevel: true,
  saveReport: true
});

analyzer.run();
```

## 🔧 Configuration
### File Filtering

Priority Order:

1. .gitignore (project root) - Standard git exclusions (always respected)
2. .calculatorinclude - INCLUDE mode (highest priority for files)
3. .calculatorignore - EXCLUDE mode (fallback for files)

.calculatorinclude - Include only these files:

```bash
# Include only core JavaScript files
utility-mcp/src/**/*.js
!utility-mcp/src/testing/**
!utility-mcp/src/workflows/**
```

.calculatorignore - Exclude these files:

```bash
# Exclude documentation and config
**/*.md
**/*.json
node_modules/**
test/
**/*.test.js
**/*.spec.js
```

### Method Filtering
.methodinclude - Include only these methods:

```bash
# Core business logic methods
calculateTokens
generateLLMContext
analyzeFile
handleRequest
validateInput
processData

# Pattern matching
*Handler            # All methods ending with 'Handler'
*Validator          # All methods ending with 'Validator'
*Manager            # All methods ending with 'Manager'
TokenCalculator.*   # All methods in TokenCalculator class
```

.methodignore - Exclude these methods:

```bash
# Utility and debug methods
console
test
debug
helper
print*
main

# File-specific exclusions
server.printStatus
utils.debugLog
```

### Pattern Matching Syntax
| Pattern | Description | Example |
|---------|-------------|---------|
| methodName | Exact match | calculateTokens |
| *pattern | Contains pattern | *Handler matches requestHandler |
| Class.* | All methods in class | TokenCalculator.* |
| file.method | Specific file method | server.handleRequest |
| !pattern | Negation (exclude) | !test |
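A matcher for the patterns in the table could look roughly like this. `matchesPattern` and `escapeRegex` are hypothetical names for illustration; the tool's real matcher may handle more cases:

```javascript
// Escape regex metacharacters in the literal parts of a pattern.
function escapeRegex(s) {
  return s.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
}

// Match a method (or Class.method / file.method) name against one
// pattern from the table above: '!' negates, '*' is a wildcard,
// anything else is an exact match.
function matchesPattern(pattern, name) {
  if (pattern.startsWith('!')) return !matchesPattern(pattern.slice(1), name);
  if (pattern.includes('*')) {
    // Translate wildcards into an anchored regex: *Handler -> /^.*Handler$/
    const re = new RegExp('^' + pattern.split('*').map(escapeRegex).join('.*') + '$');
    return re.test(name);
  }
  return pattern === name;
}
```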
## 📤 Output Formats

### Compact Context Format (Default)

Use case: General codebase analysis, file organization

```json
{
  "project": {
    "root": "my-project",
    "totalFiles": 64,
    "totalTokens": 181480
  },
  "paths": {
    "src/core/": ["server.js", "handler.js"],
    "src/utils/": ["helper.js", "validator.js"]
  }
}
```

### Method-Level Context Format
Use case: Focused analysis, debugging specific methods, LLM context optimization

```json
{
  "project": {
    "root": "my-project",
    "totalFiles": 64,
    "totalTokens": 181480
  },
  "methods": {
    "src/server.js": [
      {"name": "handleRequest", "line": 15, "tokens": 234},
      {"name": "validateInput", "line": 45, "tokens": 156}
    ],
    "src/utils.js": [
      {"name": "processData", "line": 12, "tokens": 89}
    ]
  },
  "methodStats": {
    "totalMethods": 150,
    "includedMethods": 23,
    "totalMethodTokens": 5670
  }
}
```

### Detailed JSON Report
Use case: Comprehensive analysis, CI/CD integration, historical tracking

```json
{
  "metadata": {
    "generatedAt": "2024-01-15T10:30:00.000Z",
    "projectRoot": "/path/to/project",
    "gitignoreRules": ["node_modules/**", "*.log"],
    "calculatorRules": ["src/**/*.js", "!src/test/**"]
  },
  "summary": {
    "totalFiles": 64,
    "totalTokens": 181480,
    "byExtension": {".js": {"count": 64, "tokens": 181480}},
    "largestFiles": [...]
  },
  "files": [...]
}
```

### CLI Options

| Option | Short | Description |
|--------|-------|-------------|
| --save-report | -s | Save detailed JSON report |
| --verbose | -v | Show included files and directories |
| --context-export | | Generate LLM context file |
| --context-clipboard | | Copy context to clipboard |
| --method-level | -m | Enable method-level analysis |
| --help | -h | Show help message |

## 📊 Use Cases & Examples
### AI-Assisted Development

Goal: Generate minimal context for AI assistants

```bash
# Ultra-compact method-level context
context-manager --method-level --context-clipboard

# Focus on core business logic only
printf "handleRequest\nprocessData\nvalidateInput\n" > .methodinclude
context-manager --method-level --context-export
```

Result: 89% smaller context compared to full codebase
### Code Quality Analysis

Goal: Understand project complexity and structure

```bash
# Analysis with detailed reports
context-manager --save-report --verbose

# Track largest files and methods
context-manager --method-level --save-report
```

### Targeted Debugging
Goal: Focus on specific problematic methods

```bash
# Debug authentication methods only
printf "auth\nlogin\nvalidate\n" > .methodinclude
context-manager --method-level --context-clipboard

# Exclude test and debug methods
printf "test\ndebug\nconsole\nlogger\n" > .methodignore
context-manager --method-level
```

### Project Monitoring
Goal: Monitor codebase growth and complexity

```bash
# Daily token analysis for monitoring
context-manager --save-report > reports/analysis-$(date +%Y%m%d).json

# Check method complexity trends
context-manager --method-level --save-report
```

### Token Budget Enforcement
Goal: Ensure code stays within token budgets

```bash
# Check if codebase exceeds LLM context limits
TOKENS=$(context-manager --context-export | jq '.project.totalTokens')
if [ "$TOKENS" -gt 100000 ]; then
  echo "Codebase too large for LLM context!"
  exit 1
fi
```

## 🛠️ CLI Reference

### All Options
| Option | Short | Description | Example |
|--------|-------|-------------|---------|
| --save-report | -s | Save detailed JSON report | context-manager -s |
| --verbose | -v | Show included files/methods | context-manager -v |
| --context-export | | Generate LLM context file | context-manager --context-export |
| --context-clipboard | | Copy context to clipboard | context-manager --context-clipboard |
| --method-level | -m | Enable method-level analysis | context-manager -m |
| --help | -h | Show help message | context-manager -h |

### Common Workflows
```bash
# Quick analysis with interactive export
context-manager

# Method-level analysis with all outputs
context-manager --method-level --save-report --context-export --verbose

# LLM-optimized context generation
context-manager --method-level --context-clipboard

# CI/CD monitoring
context-manager --save-report --context-export

# Development debugging
context-manager --method-level --verbose
```

## 💻 Programmatic API
### Basic Usage

```javascript
const { TokenAnalyzer } = require('@hakkisagdic/context-manager');

// File-level analysis
const analyzer = new TokenAnalyzer('./src', {
  verbose: true,
  saveReport: true
});

analyzer.run();
```

### Method-Level Analysis
```javascript
const { TokenAnalyzer, MethodAnalyzer } = require('@hakkisagdic/context-manager');

// Method-level analysis with custom filtering
const analyzer = new TokenAnalyzer('./src', {
  methodLevel: true,
  contextExport: true,
  verbose: false
});
analyzer.run();

// Extract methods from specific file
const methodAnalyzer = new MethodAnalyzer();
const methods = methodAnalyzer.extractMethods(fileContent, 'server.js');
```

### Configuration Options
```javascript
const analyzer = new TokenAnalyzer('./src', {
  // Enable method-level analysis
  methodLevel: true,

  // Output options
  saveReport: true,
  contextExport: true,
  contextToClipboard: true,

  // Verbosity
  verbose: true,

  // Compact context (for LLM optimization)
  compactContext: true
});

// Access results
analyzer.run();
console.log('Analysis complete!');
```

### Custom Method Filtering
```javascript
const { MethodAnalyzer, MethodFilterParser } = require('@hakkisagdic/context-manager');

// Create custom method filter
const filter = new MethodFilterParser(
  './custom-methods.include',
  './custom-methods.ignore'
);

// Analyze specific file
const methodAnalyzer = new MethodAnalyzer();
const methods = methodAnalyzer.extractMethods(content, filePath);

// Filter methods
const filteredMethods = methods.filter(method =>
  filter.shouldIncludeMethod(method.name, fileName)
);
```

## Requirements

- Node.js: >= 14.0.0
- tiktoken: ^1.0.0 (optional, for exact token counts)

## License

MIT License - see LICENSE file for details
## Contributing

1. Fork the repository
2. Create your feature branch
3. Add tests for new functionality
4. Submit a pull request

## Support

- 🐛 Report Issues
- 📖 Documentation
- 💬 Discussions
---
Created with ❤️ by Hakkı Sağdıç