# ai-error-solution

Lightweight Node.js error handler that uses OpenAI to provide explanations, causes, fixes, and documentation links for runtime errors.

```bash
npm install ai-error-solution
```

Lightweight, AI-powered error analysis for Node.js. Automatically capture runtime errors and get instant AI-generated explanations, causes, fixes, and documentation links, all in your console.



---
## Features

- 🎯 Zero Dependencies - Uses native curl via child_process (no heavy HTTP libraries)
- 📦 Lightweight - Minimal package size, maximum efficiency
- 🧠 AI-Powered Analysis - Leverages OpenAI to explain errors in plain English
- 🔒 Privacy-First - No telemetry, no data storage, direct API calls only
- ⚡ ESM Native - Modern ES Module support
- 🎨 Beautiful Output - Clean, colorized console logging
- 🛠️ Production-Ready - Timeout handling, retries, graceful failures
---
## Installation

```bash
npm install ai-error-solution
```

Requirements:

- Node.js 18 or higher
- curl installed on your system (usually pre-installed on macOS/Linux, available on Windows; see the quick check below)
- An OpenAI API key
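
If you want to confirm the first two requirements programmatically, here is a small optional check. This is only a sketch and not part of the package; the file name is hypothetical:

```javascript
// check-requirements.mjs (hypothetical helper, not shipped with ai-error-solution)
import { execFileSync } from 'node:child_process';

// Node.js 18+ is required
const [major] = process.versions.node.split('.').map(Number);
if (major < 18) {
  console.error(`Node.js 18+ required, found ${process.versions.node}`);
  process.exit(1);
}

// curl must be on the PATH; execFileSync throws if it is not found
try {
  execFileSync('curl', ['--version'], { stdio: 'ignore' });
  console.log('curl is available');
} catch {
  console.error('curl was not found on your PATH');
  process.exit(1);
}
```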
---
## Quick Start

Set up the package with your OpenAI API key in your main application file:

```javascript
import { initAutoErrorSolution, fixError } from 'ai-error-solution';

// Initialize with your API key (do this once at app startup)
initAutoErrorSolution({
  apiKey: process.env.OPENAI_API_KEY,
  model: 'gpt-4o-mini' // Optional: defaults to gpt-4o-mini
});
```
Wrap your error handling with fixError():

```javascript
try {
  // Your code that might throw errors
  const result = riskyFunction();
} catch (err) {
  // Get AI-powered analysis
  await fixError(err);
}
```

You'll see beautiful, formatted output like this:
```
================================================================================
❌ Error Detected: TypeError
Cannot read property 'map' of undefined

🧠 AI Explanation:
This error occurs when you try to call the .map() method on a variable
that is undefined. The JavaScript engine expected an array but received
undefined instead.

⚠️ Likely Causes:
- The variable was never initialized
- An async function hasn't resolved yet
- The API response didn't include expected data

🔧 Suggested Fixes:
- Add optional chaining: data?.map(...)
- Provide a default value: (data || []).map(...)
- Check existence first: if (data) { data.map(...) }

📚 References:
- https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Errors
- https://javascript.info/optional-chaining

💡 Note: AI suggestions may not be 100% accurate. Always verify fixes before applying.
================================================================================
```
---
## API Reference

### initAutoErrorSolution(options)

Initialize the package with your OpenAI credentials. Must be called before using fixError().

Parameters:

- options.apiKey (string, required) - Your OpenAI API key
- options.model (string, optional) - OpenAI model to use (default: 'gpt-4o-mini')
- options.timeout (number, optional) - API request timeout in milliseconds (default: 30000)
- options.maxRetries (number, optional) - Maximum retry attempts (default: 1)

Example:

```javascript
initAutoErrorSolution({
  apiKey: process.env.OPENAI_API_KEY,
  model: 'gpt-4o-mini',
  timeout: 30000,
  maxRetries: 2
});
```
---
### fixError(error, options)

Analyze an error using OpenAI and display formatted results.

Parameters:

- error (Error | string, required) - Error object or error message
- options.silent (boolean, optional) - Return the analysis without logging (default: false)

Returns:

- `Promise<void>` - When logging to the console (default)
- `Promise<object>` - When called with silent: true, resolves with the analysis data
Example:

```javascript
// Standard usage (logs to console)
try {
  dangerousCode();
} catch (err) {
  await fixError(err);
}

// Silent mode (returns data)
const analysis = await fixError(err, { silent: true });
console.log(analysis.analysis.explanation);
```
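
Silent mode is useful when you want to route the analysis into your own logging instead of the console. A minimal sketch, assuming the returned object exposes analysis.explanation as in the example above (the exact shape may differ between versions):

```javascript
import { appendFile } from 'node:fs/promises';
import { fixError } from 'ai-error-solution';

async function logAnalyzedError(err) {
  // Ask for the analysis without printing it
  const result = await fixError(err, { silent: true });

  // Persist a compact record; adjust fields to whatever your version returns
  const entry = {
    time: new Date().toISOString(),
    message: err instanceof Error ? err.message : String(err),
    explanation: result?.analysis?.explanation
  };
  await appendFile('error-analysis.log', JSON.stringify(entry) + '\n');
}
```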
---
### wrapWithErrorHandler(fn)

Wrap a function with automatic error handling.

Parameters:

- fn (Function) - Function to wrap

Returns:

- Function - Wrapped function that automatically calls fixError() on errors

Example:

```javascript
import { wrapWithErrorHandler } from 'ai-error-solution';

const safeFunction = wrapWithErrorHandler(async () => {
  // Code that might throw
  return await riskyOperation();
});

await safeFunction(); // Errors automatically analyzed
```
---
### setupGlobalHandler(options)

Register global handlers for uncaught exceptions and unhandled promise rejections.

Parameters:

- options.exitOnError (boolean, optional) - Exit the process after handling an error (default: false)

Example:

```javascript
import { setupGlobalHandler } from 'ai-error-solution';

setupGlobalHandler({ exitOnError: true });

// Now all uncaught errors will be automatically analyzed
throw new Error('This will be caught and analyzed');
```
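
Conceptually, a global handler like this hooks into Node's process-level events. The snippet below is only an illustration of that idea, not the package's implementation:

```javascript
import { fixError } from 'ai-error-solution';

// Illustration only: what a global handler conceptually attaches to
process.on('uncaughtException', async (err) => {
  await fixError(err);
  process.exit(1); // comparable to exitOnError: true
});

process.on('unhandledRejection', async (reason) => {
  await fixError(reason instanceof Error ? reason : new Error(String(reason)));
});
```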
---
## Environment Variables

### Using dotenv

1. Install dotenv:

   ```bash
   npm install dotenv
   ```

2. Create a `.env` file:

   ```env
   OPENAI_API_KEY=sk-your-api-key-here
   ```

3. Load it in your app:

   ```javascript
   import 'dotenv/config';
   import { initAutoErrorSolution } from 'ai-error-solution';

   initAutoErrorSolution({
     apiKey: process.env.OPENAI_API_KEY
   });
   ```

### Using shell environment variables

```bash
# Linux/macOS
export OPENAI_API_KEY=sk-your-api-key-here
node app.js
```
---
## 🎯 Use Cases

### Express.js Error Middleware

```javascript
import express from 'express';
import { initAutoErrorSolution, fixError } from 'ai-error-solution';

const app = express();

initAutoErrorSolution({
  apiKey: process.env.OPENAI_API_KEY
});

// Error handling middleware
app.use(async (err, req, res, next) => {
  await fixError(err);
  res.status(500).json({ error: 'Internal Server Error' });
});
```

### Wrapping API Calls
```javascript
import { wrapWithErrorHandler } from 'ai-error-solution';

const fetchUserData = wrapWithErrorHandler(async (userId) => {
  const response = await fetch(`/api/users/${userId}`);
  return response.json();
});

// Automatically analyzes errors
await fetchUserData(123);
```

### Global Error Handling
```javascript
import { initAutoErrorSolution, setupGlobalHandler } from 'ai-error-solution';

initAutoErrorSolution({
  apiKey: process.env.OPENAI_API_KEY
});

setupGlobalHandler({ exitOnError: false });

// All uncaught errors are now automatically analyzed
```

---
## ⚠️ Important Notes

### Disclaimers

- AI Accuracy: AI-generated suggestions may not always be correct. Always verify fixes before applying them to production code.
- API Costs: Each error analysis makes an API call to OpenAI, which incurs costs based on your OpenAI plan.
- Privacy: Error messages and stack traces are sent to OpenAI for analysis. Do not use in production if your errors may contain sensitive data.
- curl Dependency: This package requires curl to be installed and accessible in your system PATH.

### Best Practices

- ✅ Use in development and debugging environments (see the sketch after this list)
- ✅ Store API keys in environment variables (never commit them)
- ✅ Set reasonable timeout values for production environments
- ✅ Review AI suggestions before implementing fixes
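
One way to follow the first and third points is to only trigger analysis outside production and keep timeouts tight elsewhere. A minimal sketch (the environment check and reportError wrapper are illustrative, not part of the package):

```javascript
import { initAutoErrorSolution, fixError } from 'ai-error-solution';

const isDev = process.env.NODE_ENV !== 'production';

initAutoErrorSolution({
  apiKey: process.env.OPENAI_API_KEY,
  timeout: isDev ? 30000 : 10000 // keep production requests short
});

// Only pay for (and send error data to) AI analysis while developing
export async function reportError(err) {
  console.error(err);
  if (isDev) {
    await fixError(err);
  }
}
```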
---
## 🏗️ Architecture
This package is built with zero dependencies and uses:
- ESM - Modern ES Module system
- Native curl - No heavy HTTP libraries (axios, node-fetch, etc.)
- child_process - Native Node.js process execution
- Middleware pattern - One-time API key initialization
Why curl?
- Minimal package size
- No dependency vulnerabilities
- Universal availability across platforms
- Direct OpenAI API communication
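
To make the design concrete, a request routed through curl might look roughly like the sketch below. This is not the package's actual source, just an illustration of the pattern; the helper name is hypothetical, while the endpoint and payload follow the standard OpenAI chat completions API:

```javascript
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileAsync = promisify(execFile);

// Hypothetical helper: POST a prompt to OpenAI via curl instead of an HTTP library
async function askOpenAI(apiKey, prompt, { model = 'gpt-4o-mini', timeout = 30000 } = {}) {
  const body = JSON.stringify({
    model,
    messages: [{ role: 'user', content: prompt }]
  });

  const { stdout } = await execFileAsync('curl', [
    '--silent',
    '--max-time', String(Math.ceil(timeout / 1000)), // curl takes seconds
    '-X', 'POST',
    'https://api.openai.com/v1/chat/completions',
    '-H', `Authorization: Bearer ${apiKey}`,
    '-H', 'Content-Type: application/json',
    '-d', body
  ]);

  return JSON.parse(stdout).choices[0].message.content;
}
```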
---
## 🛠️ Troubleshooting

### curl not found

Solution: Install curl on your system.

```bash
# macOS (via Homebrew)
brew install curl

# Ubuntu/Debian
sudo apt-get install curl

# Windows (via Chocolatey)
choco install curl

# Windows 10+ has curl built in, so it should already be available
```

### Package not initialized

Solution: Make sure you call initAutoErrorSolution() before using fixError().

### Request timeouts

Solution: Increase the timeout or check your internet connection.

```javascript
initAutoErrorSolution({
  apiKey: process.env.OPENAI_API_KEY,
  timeout: 60000 // 60 seconds
});
```

### Invalid API key or quota errors

Solution: Verify your API key is correct and has sufficient credits.
---
## License

MIT © [Your Name]
---
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
---
## Links

- npm Package
- GitHub Repository
- OpenAI API Documentation
- Report Issues
---
## Why ai-error-solution?

Most error analysis tools either:

- Require heavy dependencies (bloated package size)
- Send data to third-party services (privacy concerns)
- Auto-modify code (risky in production)

ai-error-solution is different:

- ✅ Lightweight - No dependencies, tiny package size
- ✅ Private - Direct API calls, no intermediaries
- ✅ Safe - Never modifies your code
- ✅ Transparent - Open source, audit the code yourself
---
Made with ❤️ for developers who value simplicity and privacy

Star ⭐ this project if you find it helpful!