Complete database toolkit: connections, migration, and operations via AWS Session Manager
```bash
npm install @fiftyten/db-toolkit
```




Complete database toolkit providing secure database connectivity, AWS DMS migrations, DynamoDB operations, and infrastructure management through integrated AWS services.
Standalone Design: Complete functionality with embedded CloudFormation templates and AWS service integrations.
Quick start:

- fiftyten-db psql dev -d indicator - complete tunnel + credentials + psql launch
- fiftyten-db databases dev - see available databases

## Installation

```bash
# With pnpm (team standard)
pnpm add -g @fiftyten/db-toolkit
```
### Run Without Installing
```bash
# With pnpm
pnpm dlx @fiftyten/db-toolkit psql dev -d indicator

# With npm
npx @fiftyten/db-toolkit psql dev -d indicator
```

## Prerequisites
1. AWS CLI configured with appropriate credentials
2. Session Manager Plugin for AWS CLI:
```bash
# macOS
brew install --cask session-manager-plugin

# Linux (RPM-based distributions)
curl "https://s3.amazonaws.com/session-manager-downloads/plugin/latest/linux_64bit/session-manager-plugin.rpm" -o "session-manager-plugin.rpm"
sudo yum install -y session-manager-plugin.rpm
```
3. PostgreSQL Client (for database connections):
```bash
# macOS
brew install postgresql
# Ubuntu/Debian
sudo apt-get install postgresql-client
```
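Before connecting, you can confirm the tools are installed and on your PATH (exact versions will vary):

```bash
# Verify prerequisites
aws --version              # AWS CLI
session-manager-plugin     # Prints a confirmation message when installed
psql --version             # PostgreSQL client
```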
### IAM Permissions

#### Required Policies
1. Database Connectivity: BastionHostSessionManagerAccess (for Session Manager connections)
2. Migration Features: DMSMigrationDeploymentAccess (for DMS operations and CloudFormation deployment)
3. DynamoDB Operations: included in the migration policy, or a separate DynamoDB read-access policy

#### Key Permissions Included
- CloudFormation: Create/update/delete migration stacks
- DMS: Manage replication instances, endpoints, and tasks
- EC2: VPC and subnet discovery, security group management
- IAM: Create DMS service roles
- CloudWatch/SNS: Monitoring and notifications
- DynamoDB: Table operations and data access
- Secrets Manager: Database credential access
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "CloudFormationMigrationAccess",
      "Effect": "Allow",
      "Action": [
        "cloudformation:CreateStack",
        "cloudformation:UpdateStack",
        "cloudformation:DeleteStack",
        "cloudformation:DescribeStacks"
      ],
      "Resource": [
        "arn:aws:cloudformation:*:*:stack/indicator-migration-stack-*/*"
      ]
    }
    // ... additional statements (see full policy in documentation)
  ]
}
```

Policy Name: DMSMigrationDeploymentAccess

#### Minimal Permissions
For database connections only (without migration or DynamoDB features), the BastionHostSessionManagerAccess policy is sufficient.
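If you manage these as customer-managed policies, the migration policy can be created and attached with the AWS CLI. This is a sketch, not the only way to grant access; the policy file name, IAM user name, and account ID are placeholders:

```bash
# Create the customer-managed policy from a local JSON file (hypothetical file name)
aws iam create-policy \
  --policy-name DMSMigrationDeploymentAccess \
  --policy-document file://dms-migration-policy.json

# Attach it to your IAM user
aws iam attach-user-policy \
  --user-name <your-iam-user> \
  --policy-arn arn:aws:iam::<account-id>:policy/DMSMigrationDeploymentAccess
```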
## Usage

### Quick Reference
```bash
# One command for complete database access (recommended)
fiftyten-db psql dev -d indicator

# DynamoDB operations (sensitive fields auto-filtered)
fiftyten-db dynamo list-tables
fiftyten-db dynamo scan trading_orders --limit 10

# PostgreSQL migration (recommended)
fiftyten-db migrate pg-test dev --source-db legacy              # Test connections
fiftyten-db migrate pg-dump dev --source-db legacy --data-only  # Migrate data
fiftyten-db migrate pg-stats dev --source-db legacy             # Verify migration

# AWS DMS migration (for complex scenarios)
fiftyten-db migrate deploy dev
fiftyten-db migrate start dev

# Alternative: Manual tunnel approach
fiftyten-db tunnel dev -d indicator
# In another terminal:
psql -h localhost -p 5433 -d indicator_db -U fiftyten
```

### Database Connection Commands
#### psql - One-Command Database Connection (Recommended)
```bash
fiftyten-db psql <environment> [options]

# Examples
fiftyten-db psql dev -d indicator            # Connect to indicator database
fiftyten-db psql dev -d copytrading          # Connect to copytrading database
fiftyten-db psql main -d platform -p 5434    # Use a different local port
```

#### tunnel - Create Database Tunnel
```bash
fiftyten-db tunnel <environment> [options]

# Examples
fiftyten-db tunnel dev -d indicator              # Tunnel to indicator database on port 5433
fiftyten-db tunnel main -d copytrading -p 5434   # Tunnel to copytrading database
```

#### databases - Discover Available Databases
```bash
fiftyten-db databases <environment>

# Examples
fiftyten-db databases dev    # See what databases are available in dev
```

Common Options:

- -p, --port <port> - Local port for tunnel (default: 5433)
- -d, --database <database> - Database name (platform, copytrading, etc.)
- --region <region> - AWS region (default: us-west-1)
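For example, the options above can be combined to tunnel to a database on a non-default port and region (the region value here is illustrative; use whichever region your bastion is deployed in):

```bash
fiftyten-db tunnel dev -d copytrading -p 5434 --region us-west-2
```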
#### connect - Direct Database Connection
```bash
fiftyten-db connect <environment> [options]

# Examples
fiftyten-db connect dev -d platform          # Connect to platform database
fiftyten-db connect main -d copytrading      # Connect to copytrading database
```

#### ssh - SSH into Bastion Host
```bash
fiftyten-db ssh <environment>

# Examples
fiftyten-db ssh dev     # SSH into dev bastion host
fiftyten-db ssh main    # SSH into production bastion host
```

#### info - Show Connection Information
```bash
fiftyten-db info <environment>

# Examples
fiftyten-db info dev     # Show dev environment info
fiftyten-db info main    # Show production environment info
```

#### list - List Available Environments
```bash
fiftyten-db list    # Show all available environments
```

### DynamoDB Commands
#### dynamo list-tables - List DynamoDB Tables

```bash
fiftyten-db dynamo list-tables

# Examples
fiftyten-db dynamo list-tables    # List all tables in the region
```

#### dynamo describe - Describe Table Structure
```bash
fiftyten-db dynamo describe <table-name>

# Examples
fiftyten-db dynamo describe fiftyten-exchange-credentials-dev
fiftyten-db dynamo describe trading_orders
```

#### dynamo scan - Scan Table Data (Security Filtered)
```bash
fiftyten-db dynamo scan <table-name> [options]

# Options:
#   --limit <number>       Limit number of items returned
#   --start-key <key>      Start scan from specific key

# Examples
fiftyten-db dynamo scan trading_orders --limit 10
fiftyten-db dynamo scan user_profiles --limit 5
```

#### dynamo query - Query Table Data
```bash
fiftyten-db dynamo query <table-name> "<key-condition>"

# Examples
fiftyten-db dynamo query fiftyten-exchange-credentials-dev "tenant_id = 5010"
fiftyten-db dynamo query trading_orders "user_id = 12345"
```

#### dynamo get-item - Get Specific Item
```bash
fiftyten-db dynamo get-item <table-name> "<key>"

# For simple keys:
fiftyten-db dynamo get-item trading_orders "id:trd_5f8a2b3c4d5e6f7g8h9i"

# For composite keys (JSON format):
fiftyten-db dynamo get-item fiftyten-exchange-credentials-dev \
  '{"tenant_id":"5010","credential_sk":"USER#john_doe_123#PRODUCT#COPY_TRADING#EXCHANGE#gateio"}'
```

DynamoDB Security Features:
- Automatic Field Filtering: Sensitive fields (API keys, secrets, credentials) are automatically hidden
- Safe Operations: Built-in protection against accidental credential exposure
- Audit Trail: All operations are logged for security compliance
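For comparison with the composite-key example above, the equivalent raw AWS CLI call is sketched below. Note that it requires DynamoDB-typed JSON for the key and applies no field filtering, which is what the toolkit adds on top:

```bash
# Raw AWS CLI equivalent (no sensitive-field filtering)
aws dynamodb get-item \
  --table-name fiftyten-exchange-credentials-dev \
  --key '{"tenant_id": {"S": "5010"}, "credential_sk": {"S": "USER#john_doe_123#PRODUCT#COPY_TRADING#EXCHANGE#gateio"}}'
```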
### Migration Commands
Complete AWS DMS migration system with embedded infrastructure:
#### migrate deploy - Deploy Migration Infrastructure
```bash
fiftyten-db migrate deploy <environment> [options]

# Options:
#   --type <type>    Migration type: full-load or full-load-and-cdc (default: full-load)

# Features:
#   • Embedded CloudFormation templates (no external dependencies)
#   • Auto-discovers VPC and subnet configuration
#   • Interactive prompts for legacy database credentials
#   • Target database auto-discovery from existing infrastructure

# Examples
fiftyten-db migrate deploy dev                           # Deploy with full-load migration
fiftyten-db migrate deploy dev --type full-load-and-cdc  # Deploy with CDC
```

#### migrate targets - List Available Target Databases
```bash
fiftyten-db migrate targets <environment>

# Shows available target databases discovered from storage infrastructure

# Examples
fiftyten-db migrate targets dev    # List target databases in dev environment
```

#### migrate start - Start Migration Task
```bash
fiftyten-db migrate start <environment>

# Examples
fiftyten-db migrate start dev    # Start migration (type determined by deployment)
```

#### migrate status - Monitor Migration Progress
```bash
fiftyten-db migrate status <environment>

# Examples
fiftyten-db migrate status dev    # Show detailed migration progress
```

#### migrate validate - Validate Migration Data
```bash
fiftyten-db migrate validate <environment>

# Examples
fiftyten-db migrate validate dev    # Comprehensive data validation
```

#### migrate stop - Stop Migration Task
```bash
fiftyten-db migrate stop <environment>

# Examples
fiftyten-db migrate stop dev    # Stop migration (use before cutover)
```

#### migrate cleanup - Cleanup Migration Resources
```bash
fiftyten-db migrate cleanup <environment>

# Examples
fiftyten-db migrate cleanup dev    # Destroy migration infrastructure
```

#### PostgreSQL Migration Commands (Recommended)
Native PostgreSQL migration using pg_dump/psql with automatic tunneling:
#### migrate pg-test - Test Database Connections
```bash
fiftyten-db migrate pg-test <environment> [options]

# Test with built-in legacy database configuration
fiftyten-db migrate pg-test dev
fiftyten-db migrate pg-test main

# Test with external source database
fiftyten-db migrate pg-test dev \
  --source-endpoint external-db.example.com \
  --source-username postgres \
  --source-password "password123"
```

#### migrate pg-dump - PostgreSQL Migration
```bash
fiftyten-db migrate pg-dump <environment> [options]

# Basic data-only migration (recommended)
fiftyten-db migrate pg-dump dev --source-db legacy --data-only
fiftyten-db migrate pg-dump main --source-db legacy --data-only

# Full migration with schema
fiftyten-db migrate pg-dump dev --source-db legacy

# External database migration
fiftyten-db migrate pg-dump dev \
  --source-endpoint external-db.example.com \
  --source-username postgres \
  --source-password "password123" \
  --data-only

# Advanced: Table filtering
fiftyten-db migrate pg-dump dev --source-db legacy \
  --data-only \
  --skip-tables "migrations,typeorm_metadata"

fiftyten-db migrate pg-dump dev --source-db legacy \
  --data-only \
  --include-tables "users,products,orders"
```

Options:
- --source-db <name> - Use built-in legacy database configuration
- --target-db <name> - Target database name (default: indicator)
- --source-endpoint <host> - External source database endpoint
- --source-username <username> - External source database username
- --source-password <password> - External source database password
- --data-only - Dump data only, preserve existing schema
- --skip-tables <tables> - Comma-separated list of tables to skip
- --include-tables <tables> - Include only these tables (comma-separated)

#### migrate pg-stats - Migration Verification
```bash
fiftyten-db migrate pg-stats <environment> [options]

# Compare with built-in legacy database
fiftyten-db migrate pg-stats dev --source-db legacy
fiftyten-db migrate pg-stats main --source-db legacy

# Compare with external database
fiftyten-db migrate pg-stats dev \
  --source-endpoint external-db.example.com \
  --source-username postgres \
  --source-password "password123"
```

PostgreSQL Migration Features:
- Native Tools: Uses pg_dump and psql for maximum PostgreSQL compatibility
- Sequential Tunneling: Creates source tunnel → dump → close → target tunnel → restore → close (sketched after this list)
- Automatic Verification: Table-by-table row count comparison
- CDK-First Discovery: Modern bastion discovery with fallback patterns
- Security Integration: Automatic password retrieval from AWS Secrets Manager
- Error Handling: Clear PostgreSQL error messages with context
- Table Filtering: Advanced include/exclude table options
- Schema Flexibility: Data-only mode preserves existing target schema
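As a rough illustration of the sequential tunneling flow that pg-dump automates, the manual equivalent looks roughly like this, assuming tunnels to the source and target databases on local ports 5433 and 5434 (ports, user, and database names are illustrative):

```bash
# With a tunnel to the source database open on localhost:5433, dump data only
pg_dump -h localhost -p 5433 -U fiftyten -d legacy_db --data-only -f legacy_data.sql

# Close the source tunnel, open a tunnel to the target database on localhost:5434,
# then restore the dump into the target database
psql -h localhost -p 5434 -U fiftyten -d indicator_db -f legacy_data.sql
```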
## Workflows

### PostgreSQL Migration Workflow (Recommended)
Simple and reliable PostgreSQL-to-PostgreSQL migration workflow:
```bash
# 1. Test connections to both source and target databases
fiftyten-db migrate pg-test dev --source-db legacy

# 2. Perform data-only migration (preserves existing schema)
fiftyten-db migrate pg-dump dev --source-db legacy --data-only

# 3. Verify migration success with table-by-table comparison
fiftyten-db migrate pg-stats dev --source-db legacy

# 4. (Optional) Advanced migration with table filtering
fiftyten-db migrate pg-dump dev --source-db legacy \
  --data-only \
  --skip-tables "migrations,typeorm_metadata"
```

#### Key Advantages
- No Infrastructure Setup: Works immediately without CloudFormation deployment
- PostgreSQL Native: Perfect compatibility using pg_dump/psql
- Automatic Tunneling: Handles Session Manager tunnels automatically
- Built-in Verification: Table-by-table row count validation
- Error Recovery: Clear error messages and automatic cleanup
#### Migration from External Database
```bash
# Test external database connection
fiftyten-db migrate pg-test dev \
  --source-endpoint external-db.example.com \
  --source-username postgres \
  --source-password "password123"

# Migrate data from external database
fiftyten-db migrate pg-dump dev \
  --source-endpoint external-db.example.com \
  --source-username postgres \
  --source-password "password123" \
  --data-only

# Verify migration
fiftyten-db migrate pg-stats dev \
  --source-endpoint external-db.example.com \
  --source-username postgres \
  --source-password "password123"
```

### AWS DMS Migration Workflow
Complete AWS DMS migration workflow with embedded infrastructure:
#### Migration Advantages
- Standalone Operation: All infrastructure templates embedded in CLI
- Auto-discovery: VPC, subnets, and target databases discovered automatically
- Migration Type Selection: Choose between full-load or full-load-and-cdc
- Portable: Works on any developer machine with AWS credentials
```bash
# 1. Ensure you have the required IAM permissions
#    Apply DMSMigrationDeploymentAccess policy (one-time setup)

# 2. Optional: List available target databases
fiftyten-db migrate targets dev

# 3. Deploy migration infrastructure (standalone - no local repos required)
fiftyten-db migrate deploy dev --type full-load
#    Auto-discovers target databases, prompts for legacy DB details

# 4. Start migration
fiftyten-db migrate start dev

# 5. Monitor progress (run periodically)
fiftyten-db migrate status dev

# 6. Validate data integrity
fiftyten-db migrate validate dev

# 7. When migration is complete and validated:
#    Stop the migration task (prepare for cutover)
fiftyten-db migrate stop dev

# 8. Update application to use new database
#    (Point your app to the new database endpoint)

# 9. Cleanup migration resources
fiftyten-db migrate cleanup dev
```

#### Migration Features
- Embedded CloudFormation Templates: No external repository dependencies
- Auto-Discovery: VPC, subnets, and target databases discovered automatically
- Migration Type Selection: Full-load or full-load-and-cdc based on requirements
- Security Integration: Legacy credentials never stored, uses AWS Secrets Manager
- Progress Monitoring: Real-time table-by-table statistics and error tracking (see the CLI sketch after this list)
- Data Validation: Comprehensive row count and integrity validation
- CloudWatch Integration: Automated monitoring and alerting
- Infrastructure as Code: Complete DMS setup via CloudFormation
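For ad-hoc checks outside the toolkit, the underlying DMS task can also be inspected directly with the AWS CLI; the task ARN below is a placeholder (find it in the DMS console or via describe-replication-tasks):

```bash
# List DMS replication tasks and their status
aws dms describe-replication-tasks --region us-west-1

# Table-by-table statistics for a specific task
aws dms describe-table-statistics \
  --replication-task-arn arn:aws:dms:us-west-1:<account-id>:task:<task-id> \
  --region us-west-1
```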
### Connecting with GUI Tools
```bash
# Recommended: One command approach
fiftyten-db psql dev -d indicator

# Alternative: Manual tunnel for GUI tools
fiftyten-db tunnel dev -d indicator

# Then connect with your favorite tool:
psql -h localhost -p 5433 -d indicator_db -U fiftyten
# OR
# pgAdmin (connect to localhost:5433)
# OR
# DBeaver (connect to localhost:5433)
```
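GUI clients that accept a connection URI can use the same tunnel; the same URI also works with psql. A minimal example assuming the defaults shown above (local port 5433, database indicator_db, user fiftyten); supply the database password when prompted:

```bash
psql "postgresql://fiftyten@localhost:5433/indicator_db"
```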
### Running Quick Queries

```bash
# One command for quick queries (recommended)
fiftyten-db psql dev -d indicator

# Alternative: Direct connection approach
fiftyten-db connect dev -d platform
# Then run: psql -h DATABASE_HOST -p 5432 -d platform -U fiftyten
```

### Manual Operations on the Bastion Host
```bash
# SSH into bastion for manual operations
fiftyten-db ssh dev
# Then you have full shell access with pre-installed tools
```

## Troubleshooting
### Bastion host not found
- Check that the bastion host is deployed in the specified environment
- Verify your AWS credentials have access to EC2 and SSM

### Connection information not found
- The bastion host may not be fully deployed
- Check SSM Parameter Store for /indicator/bastion/{env}/connection-info (see the check below)
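A quick way to perform that check with the AWS CLI (environment and region are illustrative):

```bash
# Verify the connection-info parameter exists for the dev environment
aws ssm get-parameter --name "/indicator/bastion/dev/connection-info" --region us-west-1
```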
### AWS CLI not found or not configured

- Install AWS CLI: https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html
- Configure credentials: aws configure

### Session Manager plugin not found
- Install Session Manager plugin (see Prerequisites above)
- Restart your terminal after installation

### Port already in use
- The CLI will automatically suggest available ports
- Use a different port: fiftyten-db psql dev -d indicator -p 5434
- Find what's using the port: lsof -i :5433
- Stop local PostgreSQL if running: brew services stop postgresql

### AWS credentials error
- Configure AWS credentials: aws configure
- Or use IAM roles if running on EC2
- Ensure MFA device is properly configured (see the identity check below)
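To confirm which identity and account your CLI is actually using:

```bash
# Shows the account, ARN, and user ID for the active credentials
aws sts get-caller-identity
```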
### Database connection refused

- Check that the database is running
- Verify security group rules allow bastion host access
- Confirm database endpoint is correct

## Development
```bash
# Clone the repository
git clone <repository-url>
cd cli-tool

# Install dependencies
npm install

# Build
npm run build

# Test locally
node dist/index.js tunnel dev
```

## Security

- Uses AWS Session Manager (no SSH keys required)
- Database credentials stored in AWS Secrets Manager
- All connections are encrypted and logged
- Access controlled via AWS IAM permissions
For issues and questions, please check:
1. Infrastructure repository CLAUDE.md
2. AWS Session Manager documentation
3. Create an issue in the repository