# Memory Watch

Advanced Node.js memory monitoring with stack trace analysis, user code detection, and memory leak identification.
A powerful Node.js library that not only monitors memory usage but also identifies exactly which functions and files are causing memory issues. Unlike basic memory monitors, Memory Watch provides detailed stack traces, user code detection, and actionable insights for fixing memory leaks.

## Features

- Real-time memory monitoring with customizable thresholds
- User code detection: identifies YOUR functions causing memory issues (not just Node.js internals)
- Advanced stack trace analysis: shows exact file paths and line numbers
- Automatic memory leak detection with smart recommendations
- Comprehensive diagnostics: heap breakdown, active handles, CPU usage
- Manual context capture for better tracking with `captureContext()`
- Multiple action callbacks for alerts, logging, and notifications
- Detailed diagnostic reports with actionable insights

Basic memory monitors tell you:

- "Memory usage is high: 85%"

Memory Watch tells you:

- "Memory spike in processUserData() function"
- "File: src/services/userService.js:123"
- "Cause: Creating too many objects in loop"
- "Recommendation: Implement data streaming"
## Installation

```bash
npm install memory-watch
```
## Quick Start

```javascript
const { MemoryWatch } = require("memory-watch");

const watch = new MemoryWatch({
  threshold: 0.8, // 80% of max heap
  interval: 30000, // Check every 30 seconds
  actions: [
    (data) => {
      console.log("Memory threshold reached!");
      console.log(`Usage: ${(data.percentage * 100).toFixed(2)}%`);

      // See which function caused the issue
      if (data.context?.stackTrace?.[0]) {
        const trace = data.context.stackTrace[0];
        console.log(`Problem in: ${trace.functionName}`);
        console.log(`File: ${trace.fileName}:${trace.lineNumber}`);
      }
    },
  ],
});

watch.start();
```
For richer alerts, you can generate a diagnostic report and manually capture context in the functions you care about:

```javascript
const { MemoryWatch, generateDiagnosticReport } = require("memory-watch");

const watch = new MemoryWatch({
  threshold: 0.7,
  interval: 10000,
  actions: [
    (data) => {
      // Get a detailed diagnostic report
      const report = generateDiagnosticReport(data);
      console.log(report);

      // Send an alert with specific function details
      // (sendSlackAlert is your own notification helper)
      sendSlackAlert({
        message: `Memory leak detected in ${data.context?.stackTrace?.[0]?.functionName}`,
        file: data.context?.stackTrace?.[0]?.fileName,
        line: data.context?.stackTrace?.[0]?.lineNumber,
      });
    },
  ],
});

// Manually track important functions
function processLargeDataset() {
  watch.captureContext("processLargeDataset", __filename, 45);
  // Your memory-intensive code here
}

watch.start();
```
## API

### MemoryWatchOptions

```typescript
interface MemoryWatchOptions {
  threshold: number; // Memory threshold (0-1)
  interval: number; // Check interval in milliseconds
  actions: Array<(data: MemoryData) => void | Promise<void>>;
  continuous?: boolean; // Continue monitoring after threshold (default: true)
  customMemoryCheck?: () => { used: number; total: number };
}
```
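The `customMemoryCheck` hook lets you define what "memory" means for your deployment. As a sketch (assuming the returned `{ used, total }` pair is in bytes and is what the configured threshold is compared against), the example below measures RSS against a hypothetical container limit instead of the V8 heap:

```javascript
const { MemoryWatch } = require("memory-watch");

// Hypothetical cgroup/container cap of 512 MB; adjust for your environment.
const CONTAINER_LIMIT_BYTES = 512 * 1024 * 1024;

const watch = new MemoryWatch({
  threshold: 0.75,
  interval: 20000,
  // Compare RSS against the container limit rather than the default heap check.
  customMemoryCheck: () => ({
    used: process.memoryUsage().rss,
    total: CONTAINER_LIMIT_BYTES,
  }),
  actions: [
    (data) =>
      console.warn(`RSS at ${(data.percentage * 100).toFixed(1)}% of the container limit`),
  ],
});

watch.start();
```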
### MemoryData

```typescript
interface MemoryData {
  used: number; // Current memory usage
  total: number; // Total available memory
  percentage: number; // Usage percentage (0-1)
  usedBytes: number; // Memory usage in bytes
  totalBytes: number; // Total memory in bytes
  timestamp: Date; // Measurement timestamp
  breakdown: {
    rss: number; // Resident Set Size (physical memory)
    heapUsed: number; // Heap memory used
    heapTotal: number; // Total heap allocated
    external: number; // External memory (C++ objects)
    arrayBuffers: number; // ArrayBuffer memory
  };
  context?: {
    triggerSource?: string; // What triggered the measurement
    pid: number; // Process ID
    nodeVersion: string; // Node.js version
    platform: string; // Platform information
    stackTrace?: Array<{
      functionName: string;
      fileName?: string;
      lineNumber?: number;
      columnNumber?: number;
    }>;
    activeHandles?: number; // Active timers, servers, etc.
    activeRequests?: number; // Active HTTP requests
    uptime: number; // Process uptime in seconds
    cpuUsage?: {
      user: number; // CPU user time
      system: number; // CPU system time
    };
  };
}
```
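For illustration, an action callback might consume these fields like the sketch below (not part of the library; it assumes `cpuUsage` carries the microsecond counters that Node's `process.cpuUsage()` reports):

```javascript
// Sketch of an action callback that pretty-prints selected MemoryData fields.
const toMB = (bytes) => `${(bytes / 1024 / 1024).toFixed(1)} MB`;

const logMemory = (data) => {
  console.log(
    `[${data.timestamp.toISOString()}] heap ${toMB(data.breakdown.heapUsed)} / ` +
      `${toMB(data.breakdown.heapTotal)}, rss ${toMB(data.breakdown.rss)}`
  );

  if (data.context?.cpuUsage) {
    // Assuming microseconds, as returned by process.cpuUsage()
    const { user, system } = data.context.cpuUsage;
    console.log(`CPU time: user ${(user / 1e6).toFixed(1)}s, system ${(system / 1e6).toFixed(1)}s`);
  }
};

// Use it like any other action:
// new MemoryWatch({ threshold: 0.8, interval: 30000, actions: [logMemory] }).start();
```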
### Diagnostic utilities

```typescript
// Generate a detailed diagnostic report
import {
  generateDiagnosticReport,
  getMemoryLeakIndicators,
} from "memory-watch";

const data = watch.getCurrentMemory();
const report = generateDiagnosticReport(data);
console.log(report);

// Check for memory leak indicators
const leakIndicators = getMemoryLeakIndicators(data);
if (leakIndicators.length > 0) {
  console.log("Potential memory leaks detected:", leakIndicators);
}
```
### Methods

- `start()` - Start monitoring
- `stop()` - Stop monitoring
- `getCurrentMemory()` - Get the current memory status
- `isRunning()` - Check if monitoring is active
- `captureContext(functionName, fileName?, lineNumber?)` - NEW! Manually capture user context for better tracking
- `MemoryWatch.checkOnce(threshold)` - One-time memory check (static method)
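To make the lifecycle concrete, here is a minimal sketch exercising these methods together; it assumes `getCurrentMemory()` returns a `MemoryData` object synchronously and that `checkOnce()` resolves with the measurement when the threshold is exceeded, as the earlier snippets suggest:

```javascript
const { MemoryWatch } = require("memory-watch");

const watch = new MemoryWatch({
  threshold: 0.8,
  interval: 30000,
  actions: [(data) => console.log(`Memory at ${(data.percentage * 100).toFixed(1)}%`)],
});

watch.start();
console.log("Monitoring active:", watch.isRunning()); // true

// Inspect the current reading on demand
const snapshot = watch.getCurrentMemory();
console.log("Heap used (bytes):", snapshot.breakdown.heapUsed);

// One-off check without a running watcher (static method)
MemoryWatch.checkOnce(0.5).then((result) => {
  if (result) console.log("Memory is above 50%:", result);
});

// Stop monitoring on graceful shutdown
process.on("SIGTERM", () => watch.stop());
```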
## Examples

Monitoring with user-code detection and alerting:

```javascript
const { MemoryWatch, generateDiagnosticReport } = require("memory-watch");

const watch = new MemoryWatch({
  threshold: 0.8,
  interval: 30000,
  actions: [
    (data) => {
      // Show which user function caused the issue
      if (data.context?.stackTrace?.[0]) {
        const trace = data.context.stackTrace[0];
        console.log(`Problem in function: ${trace.functionName}`);
        console.log(`File: ${trace.fileName}:${trace.lineNumber}`);
      }

      // Send a detailed alert (sendSlackNotification is your own helper)
      sendSlackNotification({
        message: `Memory threshold reached: ${(data.percentage * 100).toFixed(1)}%`,
        function: data.context?.stackTrace?.[0]?.functionName,
        file: data.context?.stackTrace?.[0]?.fileName,
        line: data.context?.stackTrace?.[0]?.lineNumber,
      });
    },
  ],
});

watch.start();
```
Leak detection with manual context tracking:

```javascript
const {
  MemoryWatch,
  generateDiagnosticReport,
  getMemoryLeakIndicators,
} = require("memory-watch");

const watch = new MemoryWatch({
  threshold: 0.7,
  interval: 15000,
  actions: [
    (data) => {
      const report = generateDiagnosticReport(data);
      console.log(report);

      // Get memory leak indicators
      const leakIndicators = getMemoryLeakIndicators(data);
      if (leakIndicators.length > 0) {
        console.log("Memory leak detected:", leakIndicators);
      }
    },
  ],
});

// In your application functions, add manual tracking:
function processLargeDataset(data) {
  // Track this function for better memory analysis
  watch.captureContext("processLargeDataset", __filename, 25);

  // Your processing logic here...
  const result = data.map((item) => item /* heavy processing */);
  return result;
}

function handleAPIRequest(req, res) {
  watch.captureContext("handleAPIRequest", __filename, 35);
  // Your API logic here...
}

watch.start();
```
Comprehensive diagnostics with leak indicators and source-file hints:

```javascript
const {
  MemoryWatch,
  generateDiagnosticReport,
  getMemoryLeakIndicators,
} = require("memory-watch");

const watch = new MemoryWatch({
  threshold: 0.7,
  interval: 10000,
  actions: [
    (data) => {
      // Generate a comprehensive diagnostic report
      const report = generateDiagnosticReport(data);
      console.log(report);

      // Check for memory leak indicators
      const leakIndicators = getMemoryLeakIndicators(data);
      if (leakIndicators.length > 0) {
        console.log("Memory leak indicators:", leakIndicators);

        // Send an alert with specific details (sendAlert is your own helper)
        sendAlert({
          type: "memory_leak",
          indicators: leakIndicators,
          stackTrace: data.context?.stackTrace,
          file: data.context?.stackTrace?.[0]?.fileName,
          function: data.context?.stackTrace?.[0]?.functionName,
        });
      }

      // Log problematic source files
      if (data.context?.stackTrace) {
        const sourceFiles = data.context.stackTrace
          .filter((trace) => trace.fileName)
          .map((trace) => `${trace.fileName}:${trace.lineNumber}`)
          .slice(0, 3);
        console.log("Check these files for memory issues:", sourceFiles);
      }
    },
  ],
});
```
Production monitoring that forwards detailed context to an external service:

```javascript
const watch = new MemoryWatch({
  threshold: 0.85,
  interval: 60000, // Check every minute
  actions: [
    async (data) => {
      // Send to your monitoring service with detailed context
      // (sendToDatadog is your own integration helper)
      await sendToDatadog({
        metric: "memory.usage.high",
        value: data.percentage,
        tags: [
          `pid:${data.context?.pid}`,
          `platform:${data.context?.platform}`,
          `trigger:${data.context?.triggerSource}`,
          `active_handles:${data.context?.activeHandles}`,
          `active_requests:${data.context?.activeRequests}`,
        ],
        stackTrace: data.context?.stackTrace,
      });

      // Log a detailed breakdown for the ops team
      console.log("Memory breakdown:", {
        heap: `${(data.breakdown.heapUsed / 1024 / 1024).toFixed(2)}MB`,
        rss: `${(data.breakdown.rss / 1024 / 1024).toFixed(2)}MB`,
        external: `${(data.breakdown.external / 1024 / 1024).toFixed(2)}MB`,
        activeHandles: data.context?.activeHandles,
        topFunction: data.context?.stackTrace?.[0]?.functionName,
      });
    },
  ],
});
```
One-off check with the static `checkOnce()` method:

```javascript
// Check if memory usage is above 50%
const result = await MemoryWatch.checkOnce(0.5);
if (result) {
  console.log("Memory usage is high:", result);
}
```
## Use Cases

Identify which API endpoints are causing memory leaks:

```javascript
const watch = new MemoryWatch({
  threshold: 0.8,
  interval: 30000,
  actions: [
    (data) => {
      const apiEndpoint = data.context?.stackTrace?.find(
        (trace) =>
          trace.fileName?.includes("routes") ||
          trace.fileName?.includes("controllers")
      );

      if (apiEndpoint) {
        console.log(`Memory issue in API: ${apiEndpoint.fileName}:${apiEndpoint.lineNumber}`);
        console.log(`Function: ${apiEndpoint.functionName}`);
        console.log(`Memory: ${(data.percentage * 100).toFixed(1)}%`);
      }
    },
  ],
});
```
Monitor for unclosed database connections:
```javascript
const watch = new MemoryWatch({
  threshold: 0.7,
  interval: 15000,
  actions: [
    (data) => {
      if (data.context?.activeHandles > 50) {
        console.log(`High active handles: ${data.context.activeHandles}`);
        console.log("Possible unclosed database connections or timers");

        // Check the stack trace for database-related functions
        const dbTrace = data.context?.stackTrace?.find(
          (trace) =>
            trace.functionName?.includes("query") ||
            trace.functionName?.includes("connection") ||
            trace.fileName?.includes("database")
        );

        if (dbTrace) {
          console.log(`Check database code: ${dbTrace.fileName}:${dbTrace.lineNumber}`);
        }
      }
    },
  ],
});
```
Use during development to catch memory issues early:
```javascript
const watch = new MemoryWatch({
  threshold: 0.6,
  interval: 5000,
  continuous: false, // Stop after the first alert
  actions: [
    (data) => {
      const report = generateDiagnosticReport(data);
      console.log(report);

      // Save the detailed report to a file for analysis
      require("fs").writeFileSync(`memory-report-${Date.now()}.txt`, report);

      console.log("Tip: check the stack trace above for the problematic code");
    },
  ],
});
```
Unlike basic memory monitoring tools, Memory Watch provides:
- Exact source identification: tells you which file and function is causing memory issues
- Detailed breakdown: shows heap, RSS, and external memory separately
- Root cause analysis: identifies patterns like unclosed handles or large buffers
- Smart leak detection: automatically detects common memory leak patterns
- Process insights: tracks active handles, requests, and CPU usage
- Actionable recommendations: provides specific suggestions for fixing issues
When memory threshold is reached, you'll see detailed reports like:
```
MEMORY DIAGNOSTIC REPORT
================================
Overall Usage: 78.5% (245MB / 312MB)
Timestamp: 2025-09-10T11:30:15.123Z

MEMORY BREAKDOWN:
  • Heap Used: 178MB
  • Heap Total: 245MB
  • RSS (Physical): 312MB
  • External: 45MB
  • Array Buffers: 12MB

PROCESS INFO:
  • PID: 12345
  • Node.js: v18.20.4
  • Platform: linux
  • Uptime: 2h 15m 30s
  • Active Handles: 15
  • Active Requests: 3

POTENTIAL SOURCES (Stack Trace):
  1. processLargeDataset (/app/src/data-processor.js:45:12)
  2. handleApiRequest (/app/src/routes/api.js:123:8)
  3. middleware (/app/src/middleware/auth.js:67:15)

RECOMMENDATIONS:
  Warning: Heap usage is very high - possible memory leak
  Warning: Check the stack trace above for problematic functions
```
Test the library with the included examples:

```bash
# Simple memory monitoring
npm run example
```
## Development & Testing

To test the library locally:

1. Clone the repository
2. Install dependencies: `npm install`
3. Build the project: `npm run build`
4. Run examples: `npm run example-tracking`

## NPM Package Information
- Package Name: memory-watch
- Version: 1.0.0
- Node.js Support: >=14.0.0
- TypeScript: Full TypeScript support with type definitions
- License: MIT
- Bundle Size: Lightweight (~50KB)
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
To set up a local development environment:
```bash
git clone https://github.com/muhcen/memory-watch.git
cd memory-watch
npm install
npm run build
npm run example-tracking
```

## License

MIT License
Copyright (c) 2025 Mohsen Moradi
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.