Meet Gonzo: A Friendly Terminal Dashboard for Log Analysis
1. What Problem Does Gonzo Solve?
You are staring at the terminal. Log lines are scrolling faster than you can read them.
You need to know:
- Which services are throwing errors right now
- Whether the spike started five minutes or fifty seconds ago
- If a single pattern explains 80% of the noise
Gonzo turns this chore into a conversation.
It is a single-binary, open-source tool written in Go that streams logs, draws live charts, and—if you want—asks an AI to point out anomalies.
All inside your terminal. No browser, no Docker-compose, no ELK stack.
2. Core Features at a Glance
| Capability | What it means in plain English | Everyday scenario |
|---|---|---|
| Real-time ingestion | Reads stdin, files, or network streams as they arrive | kubectl logs -f piped straight into Gonzo |
| Auto-detection | Recognises JSON, logfmt, or plain text without configuration | Mixed Java, Go, and Nginx logs in the same pane |
| Interactive filtering | Type a regex and see matching lines highlighted | Isolate every timeout or 5xx in one keystroke |
| Severity heatmap | Colour bars show how many ERROR vs INFO events per minute | Spot a red band at 14:03 and jump to that moment |
| AI insights (optional) | Sends clusters of logs to GPT-4, LM Studio, or Ollama | “Why did error rate triple at 14:03?” |
| OTLP receiver | Acts as an OpenTelemetry logs endpoint | Collector ships logs to Gonzo over gRPC or HTTP |
| k9s-style keys | Vim-like navigation plus mouse scroll | No new shortcuts to memorise |
3. Quick Installation
Pick the method that matches your machine.
3.1 Go tool-chain (any platform)
go install github.com/control-theory/gonzo/cmd/gonzo@latest
3.2 macOS or Linux with Homebrew
brew tap control-theory/gonzo
brew install gonzo
3.3 Download a pre-built binary
- Visit the releases page
- Download the file for your OS and architecture
- Move it into your $PATH
3.4 Nix (experimental)
nix run github:control-theory/gonzo
Verify the install:
gonzo version
You should see a version string such as v0.9.1.
4. Ten Practical One-Liners
4.1 Single local file
gonzo -f /var/log/app.log
4.2 Multiple files and globs
gonzo -f "/var/log/nginx/*.log" -f "/var/log/app/*.log"
4.3 Follow mode (like tail -f)
gonzo -f /var/log/app.log --follow
4.4 Read from stdin
cat application.log | gonzo
4.5 Kubernetes logs in real time
kubectl logs -f deployment/my-app | gonzo
4.6 Docker container logs
docker logs -f my-container 2>&1 | gonzo
4.7 System logs
tail -f /var/log/syslog | gonzo
4.8 Receive logs via OTLP (gRPC & HTTP)
Start Gonzo:
gonzo --otlp-enabled
Now any OpenTelemetry exporter can send logs to:
- gRPC: localhost:4317
- HTTP: http://localhost:4318/v1/logs
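To sanity-check the HTTP receiver by hand, you can POST a minimal OTLP/JSON log record with curl. The payload shape below follows the OTLP logs spec; the service name `demo` and the log body are made-up placeholder values:

```shell
# A minimal OTLP/JSON log record. Structure follows the OTLP logs spec;
# service name and body text are placeholders.
payload='{"resourceLogs":[{"resource":{"attributes":[{"key":"service.name","value":{"stringValue":"demo"}}]},"scopeLogs":[{"logRecords":[{"severityText":"ERROR","body":{"stringValue":"smoke test"}}]}]}]}'

# Validate the JSON locally before sending anything:
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload OK"

# POST it to a running `gonzo --otlp-enabled` instance:
# curl -s -X POST http://localhost:4318/v1/logs \
#      -H 'Content-Type: application/json' -d "$payload"
```

If the record arrives, it should show up in the live log pane like any other line.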
OpenTelemetry Collector example (gRPC):
exporters:
  otlp/gonzo_grpc:
    endpoint: localhost:4317
    tls:
      insecure: true
Same idea for HTTP; just change the exporter to otlphttp/gonzo_http.
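For completeness, a sketch of the HTTP variant, assuming the collector's standard otlphttp exporter and Gonzo's default port (the otlphttp exporter appends the per-signal path such as /v1/logs itself):

```yaml
exporters:
  otlphttp/gonzo_http:
    # Plain-HTTP endpoint, so no TLS block is needed
    endpoint: http://localhost:4318
```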
4.9 AI analysis with OpenAI
export OPENAI_API_KEY="sk-your-key"
gonzo -f logs.json --ai-model="gpt-4"
4.10 AI analysis with a local model (Ollama)
# 1. ollama serve
# 2. ollama pull gpt-oss:20b
export OPENAI_API_KEY="ollama"
export OPENAI_API_BASE="http://localhost:11434"
gonzo -f logs.json --follow
5. First 30 Seconds in the UI
| Key / Mouse action | What happens |
|---|---|
| Tab / Shift-Tab | Cycle between the four panels |
| Mouse click | Jump to any panel instantly |
| ↑ / ↓ or k / j | Move the cursor up/down |
| Space | Pause the entire dashboard (buffer keeps filling) |
| / | Start typing a regex filter |
| Enter | Open the Counts modal: heatmap, top patterns, top services |
| f | Full-screen log viewer |
| q or Ctrl-C | Quit |
Tip: Press ? at any time for an on-screen cheat-sheet.
6. Screens You Will See
- Main Dashboard: a 2×2 grid of
  - Live log lines
  - Word frequency bar chart
  - Severity doughnut
  - Time-series sparkline
- Counts Modal (Enter from the Counts panel):
  - 60-minute heatmap of receive time (immune to clock skew)
  - Top-3 patterns per severity (drain3 algorithm)
  - Top-3 services per severity
- Full-Screen Log Viewer (press f):
  - Vim navigation keys
  - Mouse wheel scroll
  - Auto-scroll pauses when you scroll up, resumes when you hit End
7. Global Pause and Buffering
Press Space once:
- Charts freeze
- New logs are still collected in RAM

Press Space again:
- Charts jump to the present

No log line is lost even while you stare at a spike.
8. Filtering and Searching
| Goal | How to do it |
|---|---|
| See only ERRORs | /ERROR then Enter |
| Hide health-check spam | /^(?!.*health) |
| Highlight customer ID | s then type the ID |
| Reset all filters | r |
All filters accept Perl-compatible regular expressions.
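The same Perl-compatible dialect works with ordinary shell tools before the stream ever reaches Gonzo. As a quick illustration (the log lines here are made up), GNU grep's -P flag speaks PCRE too:

```shell
# Two fake log lines; we want to drop the health-check noise.
printf 'GET /health 200\nGET /orders 500\n' | grep -vP 'health'
# → GET /orders 500

# Pipe the surviving lines into Gonzo as usual:
# tail -f /var/log/app.log | grep -vP 'health' | gonzo
```

Pre-filtering like this shrinks the stream; filtering inside Gonzo keeps the hidden lines in the buffer so you can bring them back with r.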
9. AI Integration Deep Dive
9.1 Supported providers
- OpenAI (GPT-3.5, GPT-4, custom)
- LM Studio (local)
- Ollama (local)
- Any OpenAI-compatible endpoint
9.2 Runtime model switching
Inside Gonzo, press m.
A modal lists every model your endpoint advertises.
Arrow keys to navigate, Enter to switch, Esc to cancel.
9.3 Auto-selection logic
If you omit --ai-model, Gonzo chooses the first available model in this order:
- OpenAI: gpt-4 → gpt-3.5-turbo → first listed
- Ollama: gpt-oss:20b → llama3 → mistral → codellama → first listed
- LM Studio: first model returned by /v1/models
10. Configuration File
Create ~/.config/gonzo/config.yml once:
files:
- "/var/log/app/*.log"
- "/var/log/nginx/*.log"
follow: true
update-interval: 2s
log-buffer: 2000
memory-size: 15000
ai-model: "gpt-4"
All CLI flags can be placed in this file.
11. Environment Variables Reference
| Variable | Purpose | Example |
|---|---|---|
| OPENAI_API_KEY | Required for AI features | sk-... |
| OPENAI_API_BASE | Custom endpoint | http://localhost:1234/v1 |
| GONZO_FILES | Comma-separated list | /var/log/app.log,/var/log/error.log |
| GONZO_FOLLOW | Boolean | true |
| GONZO_UPDATE_INTERVAL | Duration | 500ms |
| GONZO_LOG_BUFFER | Integer | 5000 |
| GONZO_MEMORY_SIZE | Integer | 20000 |
| GONZO_AI_MODEL | Model name | gpt-3.5-turbo |
| GONZO_TEST_MODE | Disable TTY | true |
| NO_COLOR | Disable colours | 1 |
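Every variable in the table maps onto a normal shell session. A sketch, exporting a few of them and then launching Gonzo with no flags at all:

```shell
# Equivalent to: gonzo -f /var/log/app.log -f /var/log/error.log --follow
export GONZO_FILES="/var/log/app.log,/var/log/error.log"
export GONZO_FOLLOW=true
export GONZO_UPDATE_INTERVAL=500ms

# gonzo   # no flags needed; settings come from the environment
```

This is handy in containers and CI, where environment variables are easier to inject than command lines.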
12. Shell Completion
Add one line to your shell startup file:
# Bash
source <(gonzo completion bash)
# Zsh
source <(gonzo completion zsh)
# Fish
gonzo completion fish | source
# PowerShell
gonzo completion powershell | Out-String | Invoke-Expression
13. K9s Plug-in (Kubernetes Users)
Add the snippet below to $XDG_CONFIG_HOME/k9s/plugins.yaml:
plugins:
  gonzo:
    shortCut: Ctrl-L
    description: "Gonzo log analysis"
    scopes:
      - po
    command: sh
    background: false
    args:
      - -c
      - "kubectl logs -f $NAME -n $NAMESPACE --context $CONTEXT | gonzo"
After saving, open k9s, select any pod, press Ctrl-L, and Gonzo opens with live logs.
14. Troubleshooting Common Hiccups
| Symptom | Quick check |
|---|---|
| LM Studio “no models” | curl http://localhost:1234/v1/models must return JSON |
| Ollama “connection refused” | ollama serve running? ollama list shows your model? |
| Binary not found | Ensure $GOPATH/bin or /usr/local/bin is in $PATH |
| Colours broken | export NO_COLOR=1 or use a modern terminal |
15. Architecture Sketch (High-Level)
cmd/gonzo/ → main()
internal/
├── tui/ → Bubble Tea widgets
├── analyzer/ → log parsing & severity
├── memory/ → frequency tables
├── otlplog/ → OpenTelemetry decoding
└── ai/ → prompt & chat layer
Libraries used:
- Bubble Tea – terminal UI engine
- Lipgloss – styling
- Cobra – CLI flags
- Viper – config files
16. Developer Notes
16.1 Prerequisites
- Go 1.21 or newer
- Make (optional)
16.2 Build commands
make build # single binary in ./dist
make test # unit tests
make dev # fmt, vet, test, build
make cross-build # Windows, macOS, Linux binaries
16.3 Sample data
make demo
This populates the UI with synthetic logs so you can explore keys and colours without touching production data.
17. Contributing
- Fork the repository
- Create a feature branch: git checkout -b feature/my-idea
- Commit with a clear message
- Push and open a pull request
The project follows the standard GitHub flow. See CONTRIBUTING.md
in the repo for coding style and test guidelines.
18. Licence and Credits
- Licence: MIT – see LICENSE file
- Inspiration: k9s for keyboard-driven UX
- UI Libraries: Charm (Bubble Tea, Lipgloss, Bubbles)
- Protocols: OpenTelemetry community for the OTLP specs
19. Final Thoughts
Gonzo does not try to replace your entire observability stack.
It does one thing—make live logs readable—and does it well.
Install it, pipe some logs in, and within minutes you will wonder how you ever debugged without it.
Happy troubleshooting!