Neural Miner: Extract Metadata, Audio, Video & Emotions from YouTube using AI pipelines.
```bash
npm install yt-neural-miner
ollama pull llama3
ollama pull qwen2vl
```
---
Installation
Install the package globally via npm:
```bash
npm install -g yt-neural-miner
```
---
Usage
Neural Miner provides a robust CLI with two main modes: Run and Sync.
Run Mode
Downloads the video and runs the selected analysis engines.
```bash
miner run "https://www.youtube.com/watch?v=VIDEO_ID"
```
Options:
| Option | Description | Default |
| :-------------- | :------------------------------------------------------------------------ | :---------- |
| -p, --process | Select specific engines (metadata, audio, video, emotions, all) | all |
| --mode | Storage mode (local or db) | Interactive |
| --keep | Keep local files after DB upload | false |
| --cookies | Path to cookies.txt for restricted videos | null |
Example:
```bash
# Run only Audio & Metadata, save locally
miner run "https://youtu.be/xyz" -p audio metadata --mode local
```
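The remaining options compose the same way. As a hedged sketch, the command below keeps local files after a database upload and passes a cookies file for a restricted video (the cookies path is illustrative):

```bash
# Process all engines, upload to the DB, keep the local copies,
# and authenticate with an exported cookies.txt (path is illustrative)
miner run "https://www.youtube.com/watch?v=VIDEO_ID" --mode db --keep --cookies ./cookies.txt
```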
Sync Mode
Pushes data that was previously processed and cached locally to your database.
```bash
miner sync "https://www.youtube.com/watch?v=VIDEO_ID" --db "postgresql://user:pass@localhost:5432/mydb"
```
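A typical local-first workflow combines the two commands above: process now, push later once the database is reachable. A sketch, assuming the same video ID in both steps:

```bash
# 1. Process locally and keep the artifacts on disk
miner run "https://www.youtube.com/watch?v=VIDEO_ID" --mode local

# 2. Later, push the cached results to the database
miner sync "https://www.youtube.com/watch?v=VIDEO_ID" --db "postgresql://user:pass@localhost:5432/mydb"
```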
---
Output Structure
When running in local mode, artifacts are organized by Video ID:
```text
output/
└── <VIDEO_ID>/
    ├── video.mp4            # Source Video File
    ├── audio.mp3            # Extracted Audio Track
    ├── metadata.json        # Structured Metadata (JSON)
    ├── transcript.txt       # Cleaned & Romanized Transcript
    ├── video_narrative.txt  # Frame-by-frame Visual Analysis
    └── emotions.json        # List of Derived Emotional Tags
```
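Because the artifacts are plain files and JSON, they can be inspected directly from the shell. A minimal sketch, assuming `jq` is installed and using an illustrative video ID:

```bash
VIDEO_ID="VIDEO_ID"                        # replace with the actual YouTube video ID
jq . "output/${VIDEO_ID}/metadata.json"    # pretty-print the structured metadata
jq . "output/${VIDEO_ID}/emotions.json"    # list the derived emotional tags
less "output/${VIDEO_ID}/transcript.txt"   # read the cleaned, romanized transcript
```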
---
Configuration
You can provide your database URL in three ways, listed in order of priority:
1. CLI Flag:
```bash
miner run URL --db "postgresql://..."
```
2. Interactive Prompt:
If no database URL is supplied, the CLI will prompt you for one.
3. Environment Variable:
Set `MINER_DB_URL` in your system environment or in a `.env` file in the execution directory.
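For example, the environment variable can be exported in your shell or written to a `.env` file (the connection string below is a placeholder):

```bash
# Shell environment
export MINER_DB_URL="postgresql://user:pass@localhost:5432/mydb"

# Or persist it in a .env file in the directory you run the CLI from
echo 'MINER_DB_URL=postgresql://user:pass@localhost:5432/mydb' > .env
```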