# @bernierllc/logger

Production-ready structured logging with multiple transports, log levels, correlation tracking, and performance monitoring for the BernierLLC ecosystem.

## Features
- Structured Logging: JSON-based structured log output with metadata
- Multiple Transports: Console, file, HTTP, database, and custom transports
- Log Levels: Configurable log levels with filtering (ERROR, WARN, INFO, HTTP, VERBOSE, DEBUG, SILLY)
- Correlation Tracking: Request/trace ID correlation across async operations
- Performance Tracking: Duration tracking, timers, and profiling
- Data Sanitization: Automatic PII protection and sensitive data masking
- Context Management: Rich context and metadata support
- Child Loggers: Contextual logger inheritance
- Graceful Error Handling: Transport failure recovery and error callbacks
## Installation

```bash
npm install @bernierllc/logger
```
## Quick Start

```typescript
import { Logger, LogLevel, ConsoleTransport, FileTransport } from '@bernierllc/logger';

const logger = new Logger({
  level: LogLevel.INFO,
  context: {
    service: 'user-service',
    version: '1.0.0',
    environment: process.env.NODE_ENV || 'development'
  },
  transports: [
    new ConsoleTransport({ level: LogLevel.DEBUG }),
    new FileTransport({
      filename: 'app.log',
      level: LogLevel.INFO
    })
  ],
  sanitize: true,
  enableCorrelation: true
});

logger.info('Application started');
logger.error('Database connection failed', new Error('Connection timeout'));
```
### Basic Logging

```typescript
import { createLogger } from '@bernierllc/logger';

const logger = createLogger({
  context: {
    service: 'my-service',
    version: '1.0.0',
    environment: 'production'
  }
});

logger.info('User logged in', { userId: '123', email: 'user@example.com' });
logger.warn('High memory usage', { memoryUsage: '85%' });
logger.error('Payment failed', new Error('Invalid card'));
```
### Performance Tracking

```typescript
// Using timers
const timer = logger.startTimer('database-query');
await performDatabaseQuery();
timer.end('Query completed successfully');

// Using the time wrapper
const result = await logger.time('cache-operation', async () => {
  return await cacheData(key, value);
});

// Using profiling
logger.profile('data-processing');
await processLargeDataset();
logger.profile('data-processing'); // Logs duration
```
### Correlation Tracking

```typescript
import { Correlation, CorrelationManager } from '@bernierllc/logger';

// Express middleware
app.use((req, res, next) => {
  const manager = CorrelationManager.getInstance();
  const context = manager.fromHeaders(req.headers);
  Correlation.run(context, () => {
    req.logger = logger.child({
      requestId: context.correlationId,
      method: req.method,
      url: req.url
    });
    next();
  });
});

// In route handlers
app.get('/users/:id', async (req, res) => {
  req.logger.info('Fetching user', { userId: req.params.id });
  // All logs will include the correlation ID automatically
});
```
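Propagating a correlation context across async boundaries like this is typically built on Node's `AsyncLocalStorage`. The package's actual mechanism isn't shown here; the sketch below only illustrates the idea, and the `runWithContext`/`currentContext` helpers are hypothetical names, not the package API:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

// Hypothetical per-request correlation context.
interface CorrelationContext {
  correlationId: string;
}

const storage = new AsyncLocalStorage<CorrelationContext>();

// Run a callback with a context bound to its async call chain.
function runWithContext(ctx: CorrelationContext, fn: () => void): void {
  storage.run(ctx, fn);
}

// Any code on the same async chain can read the context back.
function currentContext(): CorrelationContext | undefined {
  return storage.getStore();
}

runWithContext({ correlationId: 'req-123' }, () => {
  // Even across timers and awaits, the context follows the async chain.
  setTimeout(() => {
    console.log(currentContext()?.correlationId); // 'req-123'
  }, 0);
});
```

This is why a child logger created inside the middleware callback can tag every later log line with the same request ID without threading it through function arguments.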
### Child Loggers

```typescript
const parentLogger = logger.child({
  component: 'auth-service'
});

const childLogger = parentLogger.child({
  subcomponent: 'token-validator'
});

childLogger.info('Token validated');
// Includes both component and subcomponent in context
```
### Data Sanitization

```typescript
const logger = new Logger({
  sanitize: true,
  sanitizeFields: ['password', 'token', 'creditCard', 'ssn'],
  transports: [new ConsoleTransport()]
});

logger.info('User data', {
  username: 'john@example.com',
  password: 'secret123', // Will be [REDACTED]
  age: 30                // Will be preserved
});
```
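A sanitizer of this kind can be pictured as a recursive walk that replaces matching keys with a placeholder. This is an illustrative sketch of the behavior, not the package's implementation:

```typescript
// Illustrative field redaction (not the package source): keys in `fields`
// are replaced with a placeholder; all other values pass through unchanged.
function redact(value: unknown, fields: Set<string>): unknown {
  if (Array.isArray(value)) {
    return value.map(v => redact(v, fields));
  }
  if (value !== null && typeof value === 'object') {
    const out: Record<string, unknown> = {};
    for (const [key, v] of Object.entries(value as Record<string, unknown>)) {
      out[key] = fields.has(key) ? '[REDACTED]' : redact(v, fields);
    }
    return out;
  }
  return value;
}

const clean = redact(
  { username: 'john@example.com', password: 'secret123', profile: { ssn: '000-00-0000', age: 30 } },
  new Set(['password', 'ssn'])
);
console.log(clean);
// { username: 'john@example.com', password: '[REDACTED]',
//   profile: { ssn: '[REDACTED]', age: 30 } }
```

Note that the walk descends into nested objects and arrays, so sensitive fields are masked at any depth, not just at the top level.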
### Custom Transports

```typescript
import { Logger, LogLevel, ConsoleTransport, LogTransport, LogEntry } from '@bernierllc/logger';

class SlackTransport implements LogTransport {
  name = 'slack';
  level = LogLevel.ERROR;

  async write(entry: LogEntry): Promise<void> {
    if (entry.level <= LogLevel.ERROR) {
      await this.sendToSlack(entry.message, entry.metadata);
    }
  }

  private async sendToSlack(message: string, metadata: any) {
    // Send to Slack webhook
  }
}

const logger = new Logger({
  transports: [
    new ConsoleTransport(),
    new SlackTransport()
  ]
});
```
### Transport Configuration

```typescript
const logger = new Logger({
  level: LogLevel.INFO,
  transports: [
    new ConsoleTransport({
      level: LogLevel.DEBUG,
      formatter: new TextFormatter()
    }),
    new FileTransport({
      filename: 'app.log',
      level: LogLevel.INFO,
      maxSize: 10 * 1024 * 1024, // 10MB
      maxFiles: 5
    }),
    new HTTPTransport({
      url: 'https://logs.example.com/api',
      level: LogLevel.WARN,
      headers: { 'Authorization': 'Bearer token' },
      batchSize: 100
    })
  ]
});
```
## API Reference

#### Constructor

```typescript
new Logger(options: LoggerOptions)
```
#### Methods

- `info(message: string, metadata?: object)` - Log info message
- `warn(message: string, metadata?: object)` - Log warning message
- `error(message: string, error?: Error, metadata?: object)` - Log error message
- `debug(message: string, metadata?: object)` - Log debug message
- `verbose(message: string, metadata?: object)` - Log verbose message
- `silly(message: string, metadata?: object)` - Log silly message
- `startTimer(label: string): LogTimer` - Start performance timer
- `time<T>(label: string, fn: () => Promise<T>): Promise<T>` - Time async operation
- `profile(label: string)` - Profile code section
- `child(context: Partial<LogContext>): Logger` - Create child logger
- `setCorrelationId(id: string)` - Set correlation ID
- `setUserId(id: string)` - Set user ID
- `flush(): Promise<void>` - Flush all transports
- `close(): Promise<void>` - Close logger and transports
#### Log Levels

```typescript
enum LogLevel {
  ERROR = 0,   // Highest priority
  WARN = 1,
  INFO = 2,
  HTTP = 3,
  VERBOSE = 4,
  DEBUG = 5,
  SILLY = 6    // Lowest priority
}
```
### Transports

#### ConsoleTransport

Logs to the console with the appropriate console methods (error, warn, log, debug).
#### FileTransport
Logs to files with rotation support.
#### HTTPTransport
Sends logs to HTTP endpoints with batching and retry logic.
#### DatabaseTransport
Stores logs in databases with connection pooling.
### Formatters

#### JSONFormatter
Outputs structured JSON logs.
#### TextFormatter
Outputs human-readable text logs.
#### StructuredFormatter
Outputs flattened structured logs with prefixed fields.
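The "prefixed fields" behavior can be pictured as a flattening pass over nested metadata, where nested keys become dot-prefixed top-level keys. The sketch below only illustrates that idea; it is not the StructuredFormatter's actual code:

```typescript
// Illustrative flattening (not the package source): nested metadata objects
// become dot-prefixed top-level fields, e.g. { user: { id } } -> { 'user.id' }.
function flattenFields(obj: Record<string, unknown>, prefix = ''): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(obj)) {
    const full = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      Object.assign(out, flattenFields(value as Record<string, unknown>, full));
    } else {
      out[full] = value;
    }
  }
  return out;
}

console.log(flattenFields({ user: { id: '123', role: 'admin' }, msg: 'ok' }));
// { 'user.id': '123', 'user.role': 'admin', msg: 'ok' }
```

Flattened output like this is convenient for log backends that index on flat field names rather than nested JSON.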
## Runtime Configuration

The logger supports runtime configuration through config files, allowing you to customize logging behavior without modifying code.

1. Initialize configuration:

   ```bash
   npm run config:init
   ```

2. Edit the generated `logger.config.js`:

   ```javascript
   module.exports = {
     level: 'info',
     context: {
       service: 'my-service',
       version: '1.0.0'
     },
     console: {
       enabled: true,
       level: 'debug'
     }
   };
   ```

3. Use in your code:

   ```typescript
   import { createLoggerFromConfig } from '@bernierllc/logger';

   const logger = await createLoggerFromConfig();
   logger.info('Application started');
   ```
The logger automatically discovers configuration files in this order:

1. `logger.config.js` (JavaScript - recommended)
2. `logger.config.cjs` (CommonJS)
3. `logger.config.mjs` (ES modules)
4. `logger.config.json` (JSON)
5. `package.json` → `"logger"` key
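For the last option, the `"logger"` key in `package.json` would carry the same shape as the config file. A hypothetical example (the key names mirror the `logger.config.js` sample above):

```json
{
  "name": "my-service",
  "version": "1.0.0",
  "logger": {
    "level": "info",
    "context": {
      "service": "my-service",
      "version": "1.0.0"
    },
    "console": {
      "enabled": true,
      "level": "debug"
    }
  }
}
```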
```bash
# Initialize configuration
npm run config:init
```

### Environment Variables

All configuration options can be overridden with `LOGGER_*` environment variables:

```bash
LOGGER_LEVEL=debug
LOGGER_SERVICE=my-service
LOGGER_CONSOLE_LEVEL=info
LOGGER_FILE_ENABLED=true
LOGGER_FILE_FILENAME=logs/app.log
```
For detailed configuration options, examples, and a migration guide, see the Logger Runtime Configuration documentation.

## Configuration

### LoggerOptions
```typescript
interface LoggerOptions {
  level: LogLevel;               // Minimum log level
  transports: LogTransport[];    // Array of transports
  context?: Partial<LogContext>; // Default context
  enableCorrelation?: boolean;   // Enable correlation tracking
  sanitize?: boolean;            // Enable data sanitization
  sanitizeFields?: string[];     // Fields to sanitize
  formatter?: LogFormatter;      // Default formatter
  onError?: (error: Error, transport: string) => void; // Error handler
}
```

### LogContext
```typescript
interface LogContext {
  service: string;      // Service name
  version: string;      // Service version
  environment: string;  // Environment (dev, staging, prod)
  hostname?: string;    // Server hostname
  pid?: number;         // Process ID
  thread?: string;      // Thread identifier
  [key: string]: any;   // Additional context fields
}
```

## Error Handling

The logger includes comprehensive error handling:
```typescript
const logger = new Logger({
  transports: [transport1, transport2],
  onError: (error, transportName) => {
    console.error(`Transport ${transportName} failed:`, error);
    // Handle transport failures (e.g., fallback logging)
  }
});
```

Transport failures are isolated and don't affect other transports or application execution.
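That isolation can be pictured as each transport write being individually guarded, with failures routed to the error callback instead of propagating. The sketch below illustrates the assumed behavior with a minimal hypothetical `Transport` interface; it is not the package's source:

```typescript
// Illustrative: each write is wrapped so one transport's failure is reported
// via onError but never blocks the other transports or the caller.
interface Transport {
  name: string;
  write(entry: string): Promise<void>;
}

async function writeToAll(
  transports: Transport[],
  entry: string,
  onError?: (error: Error, transport: string) => void
): Promise<void> {
  await Promise.all(
    transports.map(async t => {
      try {
        await t.write(entry);
      } catch (err) {
        onError?.(err as Error, t.name); // failure stays isolated per transport
      }
    })
  );
}

const received: string[] = [];
const good: Transport = { name: 'good', write: async e => { received.push(e); } };
const bad: Transport = { name: 'bad', write: async () => { throw new Error('down'); } };

void writeToAll([good, bad], 'hello', (e, name) => console.log(`${name} failed: ${e.message}`));
// The 'good' transport still receives the entry even though 'bad' threw.
```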
## Performance
The logger is optimized for production use:
- Minimal logging overhead (<1ms per log entry)
- Asynchronous transport writing
- Efficient message and metadata size limiting
- Smart batching for network transports
- Memory-conscious correlation tracking
## Best Practices
1. Use appropriate log levels - Reserve ERROR for actual errors
2. Include context - Add relevant metadata to log entries
3. Use correlation IDs - Track requests across services
4. Sanitize sensitive data - Prevent PII leakage
5. Configure transports appropriately - Different levels for different outputs
6. Use child loggers - Create context-specific loggers
7. Handle transport failures - Implement error callbacks
8. Monitor performance - Use built-in timing functions
## Integration Status

### Ecosystem

- Status: N/A - This is the core logging package
- Usage: Other BernierLLC packages integrate with this logger for consistent logging across the ecosystem

### Documentation Export
- Status: Ready - Full TypeDoc documentation export support
- Format: TypeDoc with comprehensive API documentation
- Features: Automatic documentation generation, code examples, and type definitions

### NeverHub
- Status: Integrated - Full service discovery and event bus support
- Implementation:
- Auto-detects NeverHub availability at runtime
- Graceful degradation when NeverHub is not available
- Enhanced transport capabilities when NeverHub is present
- Service registration with logging capabilities
- Event-driven log aggregation and forwarding
- Capabilities:
- Dynamic configuration from NeverHub config service
- Enhanced observability with distributed tracing
- Automatic service health reporting via logs
- Cross-service log correlation

## Development
```bash
# Install dependencies
npm install

# Build the package
npm run build

# Run tests
npm test

# Run tests with coverage
npm run test:coverage

# Lint code
npm run lint
```

## License

Copyright (c) 2025 Bernier LLC. All rights reserved.

## Related Packages
- @bernierllc/retry-policy - Used for transport retry logic
- @bernierllc/csv-parser - May use this logger for debugging
- @bernierllc/email-sender - May use this logger for operation tracking