Welcome to scmd
AI-powered slash commands for any terminal. Works offline by default.
scmd brings AI superpowers to your command line: offline, private, and fast. Ask questions in plain English, review code with security templates, and chat with an AI that remembers context. All without API keys or cloud dependencies.
```shell
# Just ask what you want to do
scmd /cmd "find files modified today"
# → find . -type f -mtime -1

# Get instant explanations with beautiful formatting
scmd /explain main.go

# Review with professional security templates
scmd /review auth.js --template security-review

# Chat with full context retention
scmd chat
```

```text
You: How do I set up OAuth2 in Go?
🤖 Assistant: [Detailed explanation...]
You: Show me an example with JWT
🤖 Assistant: [Builds on previous context...]
```
Key Features

- **100% Offline & Private**: Local LLMs via llama.cpp + Qwen models. No API keys, no telemetry; your code stays on your machine. Optional cloud backends available.
- **Smart Conversations**: Multi-turn chat with context retention, searchable history, auto-save, and markdown export. Pick up conversations anytime.
- **Beautiful Output** (new in v0.4.3): Markdown rendering with syntax highlighting for 40+ languages. Multiple themes, auto-detection, NO_COLOR support.
- **Fast & Lightweight**: 14 MB binary, 0.5B-7B models, GPU acceleration (Metal/CUDA), streaming output. Choose speed vs. quality.
- **Man Page Integration**: `/cmd` reads system man pages to generate exact, copy-paste-ready commands. Works with 60+ common CLI tools.
- **Security Templates** (new in v0.4.0): Professional code reviews with an OWASP Top 10 focus. Six built-in templates for security, performance, API design, testing, docs, and education.
- **Repository System**: Commands install like npm packages. Discover 100+ community commands. Create and share your own.
- **Zero Maintenance**: Auto-starts, auto-restarts, self-healing. Intelligent error handling with actionable solutions. You'll never need `pkill`.
Quick Start

1. **Install**
2. **First Run**: the setup wizard appears:
   - Choose a model preset (Fast/Balanced/Best/Premium)
   - Auto-download (~2 min)
   - Done! Works 100% offline
3. **Try Commands**
4. **Explore**
Highlights

Beautiful Markdown Output (v0.5.1)

AI responses now render with gorgeous markdown formatting, syntax highlighting, and themes:

```shell
scmd /explain quicksort.py
# Output includes:
# - Syntax-highlighted code blocks
# - Formatted headers and lists
# - Tables and links
# - Theme detection (dark/light/auto)
# - Plain text when piped
```

Performance: lazy-loaded Glamour renderer with < 1µs overhead when disabled.
Interactive Chat Mode (v0.4.0)

Multi-turn conversations with full context retention:

```text
scmd chat
You: How do I implement rate limiting in Express?
🤖 [Detailed explanation with code]
You: What about Redis-based rate limiting?
🤖 [Builds on previous context...]
/export   # Save to markdown
```

Features: auto-save, searchable history, resume anytime, markdown export.
Professional Templates (v0.4.0)

Standardized code reviews with six built-in templates:

| Template | Focus | Example |
|---|---|---|
| security-review | OWASP Top 10 | `scmd /review auth.js --template security-review` |
| performance | Bottlenecks, Big O | `scmd /review sort.py --template performance` |
| api-design | REST practices | `scmd /review api.go --template api-design` |
| testing | Coverage, edge cases | `scmd /review test.ts --template testing` |
| documentation | Doc generation | `scmd /explain utils.rs --template documentation` |
| beginner-explain | ELI5 mode | `scmd /explain recursion.go --template beginner-explain` |
Create custom templates for your team. Share as YAML files.
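As a rough illustration of what such a YAML file might contain (the field names below are hypothetical, not scmd's documented schema; check the template docs for the real format):

```yaml
# Hypothetical template file -- field names are illustrative only,
# not scmd's actual schema.
name: team-style-review
description: Review code against our internal style guide
prompt: |
  Review the following code for violations of our style guide:
  - exported functions must have doc comments
  - errors must be wrapped with context
  Code to review:
  {{code}}
```

A file like this could be checked into a shared repository so the whole team reviews against the same checklist.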
Man Page Integration

The /cmd command reads system man pages for exact commands:

```shell
scmd /cmd "find files modified in last 24 hours"
# → find . -type f -mtime -1

scmd /cmd "list processes sorted by memory"
# → ps aux --sort=-%mem | head -n 20
```

Detects 60+ common tools. Falls back to general CLI knowledge.
Command Discovery (v0.5.0)

100+ community commands are now properly discoverable:

```shell
# Search commands
scmd repo search git
scmd repo search docker

# Install instantly
scmd repo install official/commit
scmd repo install official/dockerfile

# Use immediately
git diff --staged | scmd /gc
```

The repository system was fixed in v0.5.0 with legacy manifest support and parallel fetching.
Performance
Real-World Benchmarks (M1 Mac, 8GB RAM):
| Task | qwen2.5-0.5b | qwen2.5-1.5b | qwen2.5-3b | qwen2.5-7b |
|---|---|---|---|---|
| Explain 50-line file | 3.2s | 5.8s | 9.1s | 16.3s |
| Generate commit | 2.8s | 4.9s | 7.5s | 14.1s |
| Review 200-line file | 6.5s | 11.2s | 18.7s | 32.4s |
| CLI command | 2.1s | 3.4s | 5.8s | 10.2s |
Available Models:
| Model | Size | Speed | Quality | Best For |
|---|---|---|---|---|
| qwen2.5-0.5b | 379 MB | ⚡⚡⚡⚡ | ★★★ | Quick tasks |
| qwen2.5-1.5b ★ | 1.0 GB | ⚡⚡⚡ | ★★★★ | Daily work (default) |
| qwen2.5-3b | 1.9 GB | ⚡⚡ | ★★★★★ | Complex analysis |
| qwen2.5-7b | 3.8 GB | ⚡ | ★★★★★ | Production code |
All models: 8192 token context, GPU acceleration, 4-bit quantization, function calling.
Stability & Reliability

Zero Maintenance Design:

- ✅ **Auto-starts**: server starts when needed
- ✅ **Auto-restarts**: crashes handled gracefully
- ✅ **Self-healing**: detects OOM and context mismatches, recovers automatically
- ✅ **Clear feedback**: every error includes actionable solutions
- ✅ **No manual intervention**: no server management needed, ever
Example Error Handling:

```text
❌ Input exceeds available context size

What happened:
  Your input (5502 tokens) exceeds GPU capacity (4096 tokens)

Solutions:
  1. Use CPU mode: export SCMD_CPU_ONLY=1
  2. Split input into smaller files
  3. Use a cloud backend: export OPENAI_API_KEY=...
```

See Stability Architecture for details.
Use Cases

- Code Explanation
- Code Review
- Command Generation
- Git Workflows
- Learning & Exploration
- Security Analysis
LLM Backends

| Backend | Local | Free | Setup |
|---|---|---|---|
| llama.cpp (default) | ✅ | ✅ | None needed |
| Ollama | ✅ | ✅ | `ollama serve` |
| OpenAI | ❌ | ❌ | `export OPENAI_API_KEY=...` |
| Together.ai | ❌ | Free tier | `export TOGETHER_API_KEY=...` |
| Groq | ❌ | Free tier | `export GROQ_API_KEY=...` |

Backend priority: llama.cpp (default) → Ollama → OpenAI → Together.ai → Groq
```shell
# Use a specific backend
scmd -b ollama /explain main.go
scmd -b openai -m gpt-4 /review code.py

# List available backends
scmd backends
```
Documentation

- **User Guide**: learn slash commands, model management, chat mode, templates, and repositories.
- **Command Authoring**: create custom AI commands with tools, hooks, composition, and context gathering.
- **Architecture**: understand scmd's design, including stability, backends, and the repository system.
- **Troubleshooting**: common issues and solutions for installation, models, and backends.
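To give a flavor of command authoring, here is a sketch of what a command spec with context gathering and a hook might look like. Every field name below is hypothetical, not scmd's actual schema; see the Command Authoring guide for the real format:

```yaml
# Hypothetical command spec -- field names are illustrative only,
# not scmd's actual schema.
name: gc
description: Generate a conventional commit message from staged changes
context:
  - command: git diff --staged    # gathered and prepended before the prompt runs
prompt: |
  Write a conventional commit message for the following diff:
  {{input}}
hooks:
  post: print-summary             # e.g. run something after the response arrives
```

The idea, per the guide above, is that a command bundles its prompt, the context it gathers, and any hooks into one shareable spec.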
Recent Releases
v0.5.1 (2026-01-12)
- Fixed: Streaming AI responses now use Glamour markdown renderer
- Beautiful code blocks with syntax highlighting
- Markdown headers, lists, and formatting properly rendered
v0.5.0 (2026-01-12)
- Fixed: Repository manifest parsing bug (100+ commands now discoverable)
- Added legacy manifest format support
- Enhanced command discovery UI
- Parallel manifest normalization (10 concurrent requests)
v0.4.3 (2026-01-12)
- Added: Beautiful markdown output with Glamour
- Lazy rendering (< 1ns overhead when disabled)
- Syntax highlighting for 40+ languages
- Theme detection and NO_COLOR support
v0.4.2 (2026-01-11)
- Added: Template-command integration
- Official commands repository (100+ commands)
- Unified command specification
v0.4.1 (2026-01-11)
- Fixed: Replaced CGO-dependent SQLite with pure Go
- Improved cross-platform compatibility
v0.4.0 (2026-01-10)
- Added: Interactive conversation mode with SQLite persistence
- Beautiful markdown output with syntax highlighting
- Template/pattern system with 6 built-in templates
- Conversation history management
See Changelog for full history.
Community
- GitHub: sunboylabs/scmd
- Issues: Report bugs or request features
- Discussions: Ask questions and share commands
- Commands Repository: sunboylabs/commands (100+ commands)
What's Next?

- Get scmd installed via Homebrew, npm, or packages
- Take the 5-minute tutorial on slash commands
- Create your first custom AI command
- Learn from real-world command examples
Why scmd?

| Traditional AI Tools | scmd |
|---|---|
| API keys required | ✅ Works offline by default |
| Web interfaces | ✅ Native terminal integration |
| Fixed prompts | ✅ Customizable command specs |
| Isolated tools | ✅ Repository system for sharing |
| Text generation only | ✅ Tool calling for actions |
| No automation | ✅ Hook system |
| Single-turn | ✅ Multi-turn chat with history |
| Plain text | ✅ Beautiful markdown output |
License

scmd is open source software licensed under the MIT License.

Built with ❤️ using Go • Inspired by the Unix philosophy and modern AI tooling