# ARIA CLI

Advanced Recursive Intelligence Architecture with Q0-Q38 Cognitive System, AriaSpace Phone ↔ Computer Sync, Autonomous Agents, and Termux Integration

`npm install @alphamatt/aria-cli`
```
 █████╗ ██████╗ ██╗ █████╗      ██████╗██╗     ██╗
██╔══██╗██╔══██╗██║██╔══██╗    ██╔════╝██║     ██║
███████║██████╔╝██║███████║    ██║     ██║     ██║
██╔══██║██╔══██╗██║██╔══██║    ██║     ██║     ██║
██║  ██║██║  ██║██║██║  ██║    ╚██████╗███████╗██║
╚═╝  ╚═╝╚═╝  ╚═╝╚═╝╚═╝  ╚═╝     ╚═════╝╚══════╝╚═╝
```
Advanced Recursive Intelligence Architecture




## Installation

### From PyPI

```bash
pip install aria-cli
```
### With LLM Backends

```bash
pip install aria-cli[openai] # OpenAI backend
pip install aria-cli[anthropic] # Anthropic backend
pip install aria-cli[local] # Local models (llama.cpp, transformers)
pip install aria-cli[full] # All backends + rich UI
```

### From npm

```bash
npm install -g @alphamatt/aria-cli
```

### From Source

```bash
git clone https://github.com/universal-crown-prime/aria-cli.git
cd aria-cli
pip install -e .
```
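To confirm the install, the package's core API should import cleanly. A minimal smoke test using only the public names shown later in this README:

```python
# Minimal post-install smoke test: import the public API used throughout
# this README and construct a Q-System instance.
from aria_cli import QSystem, QLayer

q = QSystem()
print("aria-cli is installed; QSystem initialised")
```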
## 🚀 Quick Start

```bash
# Ask a question
aria ask "What is quantum computing?"

# Search for information
aria search "trading systems"

# Show Q-System layers
aria layers

# Show system status
aria status

# Enter interactive chat mode
aria chat

# Remember something
aria remember "Meeting at 3pm tomorrow" --key meeting

# Recall memories
aria recall --key meeting

# Login
aria login

# SSH to a host
aria ssh connect production

# Create and manage agents
aria agent create my-assistant --type reasoning
aria agent list
aria agent send my-assistant "Analyze this data"
```
## 🤖 ARIA Agents
Autonomous AI agents with Q-System integration for distributed processing.
### Agent Types
| Type | Q-Layer | Description |
|------|---------|-------------|
| task | Q23 | Task execution (shell, files) |
| reasoning | Q10 | Logical analysis and inference |
| coordinator | Q21 | Multi-agent coordination |
| generator | Q19 | Content and code generation |
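The same four types can be created programmatically with `create_agent`, used in the Python examples below. Reading the Q-layer back from the agent object is an assumption here, so it is hedged with `getattr`:

```python
from aria_cli import create_agent

# Create one agent per built-in type; the names are illustrative.
for agent_type in ("task", "reasoning", "coordinator", "generator"):
    agent = create_agent(agent_type, f"demo-{agent_type}")
    # A q_layer attribute is assumed (the AriaAgent constructor below takes q_layer=...).
    print(agent_type, getattr(agent, "q_layer", "q_layer not exposed"))
```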
### Managing Agents

```bash
# Create an agent
aria agent create my-agent --type task --start

# List all agents
aria agent list

# Get agent status
aria agent status my-agent

# Send message to agent
aria agent send my-agent "Process this request" --priority HIGH

# Start/stop agent
aria agent start my-agent
aria agent stop my-agent

# View agent types and capabilities
aria agent types
aria agent capabilities
```
### Agent Channels

```bash
# Create communication channel
aria agent channel create task-channel --type direct --agent-a agent1 --agent-b agent2

# List channels
aria agent channel list
```
### Agent Swarms

```bash
# Create a coordinated swarm
aria agent swarm create my-swarm
```
### Python Agent API

```python
from aria_cli import (
    AriaAgent, AgentRegistry, AgentSwarm,
    create_agent, get_registry,
    TaskAgent, ReasoningAgent, CoordinatorAgent,
    AgentMessage, MessageType, AgentCapability,
)
import asyncio

# Create agents
task_agent = create_agent("task", "file-processor")
reasoning_agent = create_agent("reasoning", "analyzer")

# Register with global registry
registry = get_registry()
asyncio.run(registry.register(task_agent))
asyncio.run(registry.register(reasoning_agent))

# Start agents
asyncio.run(task_agent.start())
asyncio.run(reasoning_agent.start())

# Send message between agents
message = AgentMessage(
    type=MessageType.REQUEST,
    sender=task_agent.id,
    recipient=reasoning_agent.id,
    payload={"action": "analyze", "data": {...}}
)
asyncio.run(reasoning_agent.receive(message))

# Create a swarm for coordinated tasks
swarm = AgentSwarm("processing-swarm")
swarm.add_agent(task_agent)
swarm.add_agent(reasoning_agent)
asyncio.run(swarm.initialize())

# Execute complex task
result = asyncio.run(swarm.execute({
    "goal": "Process and analyze data",
    "data": {...}
}))
```
### Custom Agents

```python
from aria_cli import AriaAgent, QLayer, AgentCapability, AgentMessage
from typing import Optional

class MyCustomAgent(AriaAgent):
    def __init__(self):
        super().__init__(
            name="my-custom-agent",
            q_layer=QLayer.Q15_CAUSATION,
            capabilities=[
                AgentCapability.ANALYSIS,
                AgentCapability.REASONING,
            ]
        )

    async def process_message(self, message: AgentMessage) -> Optional[AgentMessage]:
        action = message.payload.get("action")
        if action == "analyze_causation":
            result = await self.analyze_causes(message.payload.get("data"))
            return message.reply({"causes": result})
        return message.reply({"error": "Unknown action"})

    async def analyze_causes(self, data):
        # Your custom logic here
        return {"primary_cause": "...", "contributing_factors": [...]}
```
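A sketch of driving this custom agent with the registry and message APIs from the earlier example. The payload shape mirrors what `process_message()` expects; whether `receive()` hands back the reply directly is an assumption.

```python
import asyncio
from aria_cli import AgentMessage, MessageType, get_registry

async def main():
    agent = MyCustomAgent()
    await get_registry().register(agent)
    await agent.start()

    # Ask the agent for a causation analysis; the "data" payload is illustrative.
    request = AgentMessage(
        type=MessageType.REQUEST,
        sender="demo-client",
        recipient=agent.id,
        payload={"action": "analyze_causation", "data": {"events": ["deploy", "outage"]}},
    )
    # Assumes receive() returns the reply produced by process_message().
    reply = await agent.receive(request)
    print(reply)

asyncio.run(main())
```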
### Communication Channels

```python
from aria_cli import (
    ChannelManager, DirectChannel, BroadcastChannel,
    StreamChannel, get_channel_manager,
)
import asyncio

# (agent_a, agent_b, publisher_agent, subscriber1, subscriber2, producer and
#  consumer are AriaAgent instances created as shown in the examples above)

# Get channel manager
manager = get_channel_manager()

# Create direct channel between two agents
direct = manager.create_direct("task-comm", agent_a, agent_b)
asyncio.run(direct.open())

# Create broadcast channel for events
broadcast = manager.create_broadcast("events", publisher_agent)
broadcast.subscribe(subscriber1.id)
broadcast.subscribe(subscriber2.id)
asyncio.run(broadcast.open())

# Broadcast an event
asyncio.run(broadcast.publish({"event": "task_complete", "result": {...}}))

# Create streaming channel for continuous data
stream = manager.create_stream("data-stream", producer, consumer)
asyncio.run(stream.open())

# Stream data
for chunk in data_generator():
    asyncio.run(stream.stream(chunk))
asyncio.run(stream.end_stream())
```
## 🔐 Authentication
ARIA CLI includes a secure authentication system for managing credentials and sessions.
### CLI Usage

```bash
# Interactive login
aria login

# Login with API key
aria login --api-key sk-... --provider openai

# Login with SSH key
aria login --ssh-key ~/.ssh/id_rsa

# Check current user
aria whoami

# Logout
aria logout
```
### Python API

```python
from aria_cli import AuthManager, get_auth

# Get auth manager
auth = get_auth()

# Register a new user
auth.register("username", "password", email="user@example.com")

# Login
session = auth.login("username", "password")
print(f"Logged in as {session.user.username}")

# Check authentication
if auth.is_authenticated:
    print(f"Welcome, {auth.current_user.username}")

# Login with API key
session = auth.login_with_api_key("sk-...", provider="openai")

# Logout
auth.logout()
```
### Credential Storage
Credentials are stored securely using:
- System keyring (when available)
- Encrypted file storage (fallback)
```
~/.aria/
├── session.json # Current session (encrypted token)
├── .credentials # Encrypted credential storage
└── .key # Encryption key (restricted permissions)
```
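The keyring-with-encrypted-fallback pattern looks roughly like the sketch below. This is a generic illustration using the `keyring` and `cryptography` packages, not ARIA's actual `AuthManager` internals; the service name is illustrative.

```python
from pathlib import Path

import keyring
from keyring.errors import KeyringError
from cryptography.fernet import Fernet

SERVICE = "aria-cli"                                   # illustrative service name
FALLBACK_FILE = Path.home() / ".aria" / ".credentials"
KEY_FILE = Path.home() / ".aria" / ".key"

def store_secret(name: str, value: str) -> None:
    """Prefer the system keyring; fall back to an encrypted file."""
    try:
        keyring.set_password(SERVICE, name, value)
    except KeyringError:
        FALLBACK_FILE.parent.mkdir(parents=True, exist_ok=True)
        if not KEY_FILE.exists():
            KEY_FILE.write_bytes(Fernet.generate_key())
            KEY_FILE.chmod(0o600)                      # restricted permissions, as noted above
        fernet = Fernet(KEY_FILE.read_bytes())
        FALLBACK_FILE.write_bytes(fernet.encrypt(f"{name}={value}".encode()))
```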
## 🔗 SSH Router
Manage SSH connections, execute remote commands, and create tunnels.
### CLI Usage

```bash
# Add a host
aria ssh add production --hostname prod.example.com --user deploy --key ~/.ssh/id_rsa

# List hosts
aria ssh list

# Test connection
aria ssh test production

# Connect interactively
aria ssh connect production

# Execute remote command
aria ssh exec production "ls -la /var/www"

# Execute on multiple hosts
aria ssh exec production,staging "uptime"

# Create tunnel (local:8080 → remote:80)
aria ssh tunnel production --local 8080 --remote 80

# Upload file
aria ssh upload production ./local-file.txt /remote/path/

# Download file
aria ssh download production /remote/file.txt ./local-path/
```
### Python API

```python
from aria_cli import SSHRouter, SSHHost, get_router
from pathlib import Path

# Get router
router = get_router()

# Add a host
router.add_host(SSHHost(
    name="production",
    hostname="prod.example.com",
    username="deploy",
    key_file=Path("~/.ssh/id_rsa").expanduser(),
    port=22,
    tags=["prod", "web"],
))

# List hosts
hosts = router.list_hosts()
for host in hosts:
    print(f"{host.name}: {host.hostname}")

# Test connection
success, message = router.test_connection("production")
print(f"Connection: {message}")

# Execute command
result = router.execute("production", "ls -la")
print(result.stdout)
print(f"Exit code: {result.exit_code}")

# Execute on multiple hosts in parallel
results = router.execute_multi(
    ["production", "staging"],
    "uptime",
    parallel=True
)
for host, result in results.items():
    print(f"{host}: {result.stdout}")

# Create SSH tunnel
router.create_tunnel(
    "production",
    local_port=8080,
    remote_port=80,
)

# Create reverse tunnel
router.create_reverse_tunnel(
    "production",
    remote_port=9000,
    local_port=3000,
)

# Upload file
router.upload("production", "./local-file.txt", "/remote/path/")

# Download file
router.download("production", "/remote/file.txt", "./local-path/")

# Interactive session
router.connect_interactive("production")

# Close tunnels
router.close_tunnel("production", 8080)
router.disconnect_all()
```
### SSH Config Integration

The router automatically loads hosts from `~/.ssh/config`:

```
# ~/.ssh/config
Host production
    HostName prod.example.com
    User deploy
    IdentityFile ~/.ssh/id_rsa
    Port 22

Host staging
    HostName staging.example.com
    User deploy
    ProxyJump bastion
```
Additional hosts can be stored in `~/.aria/ssh/hosts.json`.
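Parsing `~/.ssh/config` for host definitions can be done with paramiko's `SSHConfig`; a sketch of that general approach, not necessarily how `SSHRouter` loads hosts internally:

```python
from pathlib import Path

from paramiko.config import SSHConfig

# Parse the same ~/.ssh/config shown above and look up one host.
ssh_config = SSHConfig.from_path(str(Path.home() / ".ssh" / "config"))
entry = ssh_config.lookup("production")
print(entry.get("hostname"), entry.get("user"), entry.get("port", "22"))
```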
## 📱 AriaSpace - Phone ↔ Computer Sync
AriaSpace is a personal codespace system for securely syncing files between your phone (Android/Termux) and computer. Think GitHub Codespaces, but for your personal devices.
### Architecture

```
  Phone (Termux)                      Computer
┌─────────────────┐              ┌─────────────────┐
│   AriaSpace     │     SSH      │   AriaSpace     │
│     Client      │◄────────────►│     Server      │
│                 │              │                 │
│   /storage/     │     Sync     │  ~/aria-space/  │
│  emulated/0/    │◄────────────►│   workspaces/   │
│    Meta AI/     │              │                 │
└─────────────────┘              └─────────────────┘
```
### Set Up Termux on Your Phone
1. Install Termux from F-Droid
2. Run the ARIA setup script:
```bash
# Get the setup script
aria termux setup-script | bash

# Or manually:
pkg update && pkg install openssh -y
termux-setup-storage
sshd
```
3. Note your IP address: `ip addr show wlan0`
### Connect from Your Computer

```bash
# Discover devices on network
aria termux discover

# Or add device manually
aria termux add-host termux-phone 192.168.1.100 --port 8022

# Test connection
aria termux test termux-phone

# Find Meta AI folder
aria termux find-meta-ai termux-phone

# List Android folders
aria termux folders termux-phone
```
### Create a Workspace

```bash
# Quick setup for Meta AI folder
aria space create-meta-ai termux-phone

# Or create custom workspace
aria space create my-docs termux-phone "/storage/emulated/0/Documents" \
    --local ~/aria-space/my-docs

# List workspaces
aria space list
```
### Sync Files

```bash
# Bidirectional sync
aria space sync meta-ai

# Download only (phone → computer)
aria space pull meta-ai

# Upload only (computer → phone)
aria space push meta-ai

# Preview changes without syncing
aria space sync meta-ai --dry-run
```
### Browse Files

```bash
# List files with status
aria space files meta-ai

# Browse remote directory
aria space browse meta-ai --path images

# Check workspace status
aria space status meta-ai
```
### Python API

```python
from aria_cli import AriaSpace, TermuxSetup, get_space

# Setup Termux connection
setup = TermuxSetup()
setup.wizard()  # Interactive setup

# Or programmatically
setup.append_ssh_config("termux-phone", "192.168.1.100", port=8022)

# Create workspace
space = get_space()
ws = space.create_workspace(
    name="meta-ai",
    local_root="~/aria-space/meta-ai",
    remote_root="/storage/emulated/0/Meta AI",
    remote_host="termux-phone",
)

# List files
files = space.list_files("meta-ai")
for f in files:
    print(f"{f.status.value}: {f.relative_path}")

# Sync workspace
result = space.sync("meta-ai")
print(f"Uploaded: {result.files_uploaded}, Downloaded: {result.files_downloaded}")

# Download specific file
space.download_file("meta-ai", "images/photo.jpg")

# Upload file
space.upload_file("meta-ai", "notes.txt")

# Browse remote
entries = space.browse_remote("meta-ai", "images")
for entry in entries:
    print(f"{'📁' if entry['is_dir'] else '📄'} {entry['name']}")

# Watch for changes (auto-sync)
space.watch("meta-ai", interval=60)
```
### Common Android Folders
| Folder | Path |
|--------|------|
| Meta AI | /storage/emulated/0/Meta AI |
| Downloads | /storage/emulated/0/Download |
| Documents | /storage/emulated/0/Documents |
| Pictures | /storage/emulated/0/Pictures |
| DCIM | /storage/emulated/0/DCIM |
| Movies | /storage/emulated/0/Movies |
| Music | /storage/emulated/0/Music |
| Termux Home | /data/data/com.termux/files/home |
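Any of these paths can back a workspace. For example, mirroring the Downloads folder with the workspace API shown above (the workspace name and local path are illustrative):

```python
from aria_cli import get_space

space = get_space()

# Mirror the phone's Download folder into a local AriaSpace workspace.
space.create_workspace(
    name="downloads",
    local_root="~/aria-space/downloads",
    remote_root="/storage/emulated/0/Download",
    remote_host="termux-phone",
)
space.sync("downloads")
```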
### Storage Layout

```
~/.aria/
├── spaces/
│ ├── workspaces.json # Workspace configurations
│ ├── state/ # Sync state per workspace
│ │ ├── meta-ai.json
│ │ └── ...
│ └── snapshots/ # Workspace snapshots
│ └── meta-ai/
│ └── 20241222_143052.json
└── termux/
├── devices.json # Discovered devices
└── aria_termux_setup.sh # Setup script
```
## 🧠 Q-System Architecture

The Q-System is a 39-layer recursive cognitive architecture; the full layer map is listed in the Q-System Layers section below.
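The 39 layers are also exposed through the `QLayer` enum used in the Python examples later in this README; assuming it is a standard Python `Enum`, the full map can be listed like this:

```python
from aria_cli import QLayer

# Enumerate all 39 layers (Q0 VOID through Q38 TRANSCENDENCE),
# assuming QLayer is a standard enum.Enum.
for layer in QLayer:
    print(layer.name)
```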
## Configuration

ARIA CLI stores configuration in `~/.aria/`:

```
~/.aria/
├── config.json # CLI configuration
├── sync_state.json # CLI-GUI sync state
└── q-memory/ # Q-System memory storage
├── memory_*.json # Saved memories
└── ...
```
### Configuration File

Edit `~/.aria/config.json`:

```json
{
  "llm": {
    "backend": "openai",
    "model": "gpt-4o-mini",
    "temperature": 0.7
  },
  "q_system": {
    "default_layer": 8,
    "trace_enabled": true
  },
  "ui": {
    "theme": "dark",
    "show_layer_info": true
  }
}
```
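Since the file is plain JSON, it can also be edited programmatically; a small sketch using only the keys shown above (how ARIA merges changes at runtime is not specified here):

```python
import json
from pathlib import Path

config_path = Path.home() / ".aria" / "config.json"

# Load, tweak the LLM settings shown above, and write the file back.
config = json.loads(config_path.read_text())
config["llm"]["model"] = "gpt-4o-mini"
config["llm"]["temperature"] = 0.5
config_path.write_text(json.dumps(config, indent=2))
```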
### Environment Variables

```bash
# LLM backends
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export OLLAMA_HOST="http://localhost:11434"

# Q38 Cluster
export Q38_API_URL="http://localhost:8080"
export Q38_SYNC_PORT="9000"
```
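A backend is typically chosen based on which of these variables is set; the precedence below is illustrative, not necessarily ARIA's own resolution order:

```python
import os

# Pick an LLM backend from the environment variables listed above.
if os.environ.get("OPENAI_API_KEY"):
    backend = "openai"
elif os.environ.get("ANTHROPIC_API_KEY"):
    backend = "anthropic"
elif os.environ.get("OLLAMA_HOST"):
    backend = "local"
else:
    backend = None

print(f"Selected backend: {backend or 'none configured'}")
```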
## 🧩 Q-System Layers
### Q0-Q9: Foundation
| Layer | Name | Description |
|-------|------|-------------|
| Q0 | VOID | The null state - pure potential |
| Q1 | PERCEPTION | Raw sensory input processing |
| Q2 | ATTENTION | Focus and salience detection |
| Q3 | PATTERN | Pattern recognition |
| Q4 | MEMORY_SHORT | Working memory |
| Q5 | MEMORY_LONG | Long-term storage |
| Q6 | ASSOCIATION | Concept linking |
| Q7 | CONTEXT | Contextual understanding |
| Q8 | LANGUAGE | Linguistic processing |
| Q9 | EMOTION | Affective processing |
### Q10-Q19: Reasoning
| Layer | Name | Description |
|-------|------|-------------|
| Q10 | LOGIC | Formal logical reasoning |
| Q11 | INFERENCE | Drawing conclusions |
| Q12 | HYPOTHESIS | Generating hypotheses |
| Q13 | ABSTRACTION | Abstract concept formation |
| Q14 | ANALOGY | Analogical reasoning |
| Q15 | CAUSATION | Causal reasoning |
| Q16 | PREDICTION | Future projection |
| Q17 | EVALUATION | Assessment |
| Q18 | SYNTHESIS | Combining ideas |
| Q19 | CREATIVITY | Novel generation |
### Q20-Q29: Action and Learning
| Layer | Name | Description |
|-------|------|-------------|
| Q20 | INTENTION | Goal setting |
| Q21 | PLANNING | Strategy formation |
| Q22 | DECISION | Choice making |
| Q23 | EXECUTION | Action taking |
| Q24 | MONITORING | Progress tracking |
| Q25 | FEEDBACK | Loop processing |
| Q26 | CORRECTION | Error correction |
| Q27 | OPTIMIZATION | Performance tuning |
| Q28 | LEARNING | Knowledge acquisition |
| Q29 | ADAPTATION | Behavioral adjustment |
### Q30-Q38: Consciousness
| Layer | Name | Description |
|-------|------|-------------|
| Q30 | AWARENESS | Self-awareness |
| Q31 | REFLECTION | Self-reflection |
| Q32 | METACOGNITION | Thinking about thinking |
| Q33 | IDENTITY | Self-identity modeling |
| Q34 | VALUES | Value system and ethics |
| Q35 | WISDOM | Applied knowledge |
| Q36 | INTEGRATION | Holistic processing |
| Q37 | EMERGENCE | Emergent properties |
| Q38 | TRANSCENDENCE | Beyond individual layers |
## 🔌 Python API

```python
from aria_cli import QSystem, QLayer, QCommand, QOperator

# Initialize Q-System
q = QSystem()

# Create and execute a command
cmd = QCommand(
    operator=QOperator.ASK,
    payload={"question": "What is consciousness?"},
    layer=QLayer.Q30_AWARENESS,
)
result = q.execute(cmd)
print(result)
```
### LLM Integration

```python
from aria_cli import QSystem, LLMConnector, LLMProvider

# Initialize with OpenAI
q = QSystem()
llm = LLMConnector(LLMProvider.OPENAI)

# Generate with Q-System context
response = llm.generate(
    "Explain quantum consciousness",
    system="You are ARIA at Q-Layer 38"
)
print(response)
```
### NLP Engine

```python
from aria_cli import NLPEngine

nlp = NLPEngine()

# Parse natural language
parsed = nlp.parse("Search for files containing quantum algorithms")
print(f"Intent: {parsed.intent}")      # Intent.SEARCH
print(f"Keywords: {parsed.keywords}")  # ['files', 'quantum', 'algorithms']
print(f"Q-Layer: Q{parsed.q_layer}")   # Q3 (PATTERN)
```
### CLI-GUI Sync

```python
from aria_cli import AriaConnector, SyncMode

# File-based sync
connector = AriaConnector(mode=SyncMode.FILE, source='cli')
connector.start()

# Update state
connector.update_state(current_layer=30)

# Stop sync
connector.stop()
```
## 🔄 Migration from v0.1.x
See MIGRATION.md for detailed upgrade instructions.
### Key Changes
| Feature | v0.1.x | v0.2.0 |
|---------|--------|--------|
| Q-System Layers | 8 | 39 (Q0-Q38) |
| LLM Integration | Optional | Multi-backend |
| Package Structure | aria_cli/ | src/aria_cli/ |
| CLI Framework | argparse | typer + rich |
| Async Support | ❌ | ✅ |
| GUI Sync | ❌ | ✅ |
## 📚 Commands Reference
| Command | Description |
|---------|-------------|
| `aria ask` | Ask a natural language question |
| `aria search` | Search for information |
| `aria chat` | Enter interactive chat mode |
| `aria layers` | Show all 39 Q-System layers |
| `aria status` | Show system status |
| `aria remember` | Save to memory |
| `aria recall` | List/retrieve memories |
| `aria help` | Show help |
## 🔧 Development
### Setup

```bash
git clone https://github.com/universal-crown-prime/aria-cli.git
cd aria-cli
pip install -e ".[dev]"
```

### Run Tests

```bash
pytest
```

### Build the Package

```bash
pip install build
python -m build
```