Turn Your Terminal into an AI Teammate: The No-Hype Guide to Volcengine veCLI
A complete, plain-English walkthrough of installing, logging in, switching models, writing code, deploying a blog and theming—without ever leaving the command line.
3,000+ words, fully based on Volcengine’s official docs; updated September 2025.
1. Six Quick Answers Before We Start
Question | One-sentence reply |
---|---|
What is veCLI? | An open-source CLI front-end that talks to Volcengine’s Ark models and cloud tools; you type plain English, it writes code, runs commands, or queries cloud data. |
Does it cost money? | The package is free; you only pay for the Volcengine tokens you consume. |
Do I need a GPU? | No. Inference runs in Volcengine’s cloud; local laptop only needs Node.js. |
How is it different from GitHub Copilot? | Works outside any IDE; runs on macOS, Linux, Windows; supports multiple models (Doubao, DeepSeek, Kimi, custom); can chain cloud APIs through MCP servers. |
Will my code leak? | File content is not logged by default; enterprise builds can disable telemetry completely (see §9). |
Fastest install path? | npm i -g @volcengine/vecli → set two env-vars → type vecli and chat. |
2. Installation: Two Commands, Three Minutes
2.1 Prerequisites
- Node.js 18 or newer (check with `node -v`)
- npm reachable (official registry or your company mirror)
2.2 Global install (recommended)
```bash
npm install -g @volcengine/vecli@latest
```
One-off run without install:
```bash
npx @volcengine/vecli@latest
```
2.3 Smoke-test
```bash
vecli --version
# → veCLI 0.9.x (build xxxxx)
```
3. First Login: AK/SK Illustrated Flow
Works for personal laptops, CI containers, or shared jump boxes.
Step | Where & What |
---|---|
① Get keys | Volcengine console → Access Control → Access Keys → Create. Copy both strings immediately; the secret will disappear after the dialog closes. |
② Export | Append to `~/.bashrc`, `~/.zshrc`, or a CI secret: `export VOLCENGINE_ACCESS_KEY="AKxxxxxxxx"` and `export VOLCENGINE_SECRET_KEY="SKxxxxxxxx"` |
③ Verify | `vecli auth status` → `Logged in via AK/SK ✔` |
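If you prefer copy-paste, steps ② and ③ look like this in a bash or zsh shell (the key values are placeholders):

```bash
# append both keys to your shell profile (values are placeholders)
echo 'export VOLCENGINE_ACCESS_KEY="AKxxxxxxxx"' >> ~/.bashrc
echo 'export VOLCENGINE_SECRET_KEY="SKxxxxxxxx"' >> ~/.bashrc
source ~/.bashrc

# confirm veCLI picked them up
vecli auth status   # → Logged in via AK/SK ✔
```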
4. Run Your First Task in 30 Seconds
Non-interactive (one-liner):
```bash
echo "write a Python fibonacci function" | vecli
```
Interactive:
```bash
vecli
> write fibonacci and save it as fib.py
```
veCLI prints the code, then shows a `write_file` confirmation. Type `y` → the file is created; job done.
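In scripts you can capture the reply like any other command output; a minimal sketch, assuming the generated code is printed to stdout:

```bash
# save the one-liner's reply to a file for review (assumes the answer goes to stdout)
echo "write a Python fibonacci function" | vecli > fib_draft.py
```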
5. Everyday Tweaks: Model, Theme, Extra Folders
Goal | Command or setting |
---|---|
Temporary model | vecli --model deepseek-v3-1 |
Permanent model | `~/.ve/settings.json`: `{"model": {"name": "deepseek-v3-1"}}` |
Change colours | Inside veCLI type /theme → pick Dracula Dark (or build your own, §8). |
Add more workspace folders | vecli --include-directories ./api,./web |
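Permanent tweaks all live in the same file. A minimal `~/.ve/settings.json` sketch, assuming the `includeDirectories` array mentioned in §9 is a top-level key mirroring the `--include-directories` flag:

```json
{
  "model": { "name": "deepseek-v3-1" },
  "includeDirectories": ["./api", "./web"]
}
```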
6. Let the AI Do the Work: Files, Shell, Git
Built-in tools trigger on plain English:
You say | veCLI calls |
---|---|
“rename fib.py to fibonacci.py” | move_file |
“git status” | run_shell_command(git status) |
“append ## License to README.md” | replace_file |
Safety: every write or shell action asks for confirmation.
CI mode: `--yolo` auto-approves everything (use it deliberately).
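In a CI pipeline that typically looks like the sketch below; the secret variable names are placeholders, and the flags (`--yolo`, `-p`) are the ones shown in this guide:

```bash
# CI step: no TTY, so auto-approve tool calls; keys come from pipeline secrets
export VOLCENGINE_ACCESS_KEY="$CI_VOLC_AK"
export VOLCENGINE_SECRET_KEY="$CI_VOLC_SK"
vecli --yolo -p "run npm test and summarise any failures"
```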
7. Mini-Project: Generate & Run a Blog Engine
One prompt:
```bash
vecli -p "create an Express+EJS blog, home page lists posts, click title for detail, save all files under ~/my-blog"
```
veCLI outputs:
- `app.js` plus routes
- `views/` templates
- `package.json`
Then run:
```bash
cd ~/my-blog
npm install
node app.js
```
Visit http://localhost:3000 and your blog is live.
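For orientation, the generated `package.json` usually looks roughly like this (names and versions are illustrative; your output will differ):

```json
{
  "name": "my-blog",
  "version": "1.0.0",
  "main": "app.js",
  "scripts": { "start": "node app.js" },
  "dependencies": {
    "ejs": "^3.1.9",
    "express": "^4.19.2"
  }
}
```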
8. Power-User Corner: Custom Themes & Commands
8.1 Roll Your Own Colour Theme
Add to `~/.ve/settings.json`:
```json
{
  "ui": {
    "customThemes": {
      "CorpGreen": {
        "name": "CorpGreen",
        "type": "custom",
        "Background": "#002b36",
        "Foreground": "#eee8d5",
        "AccentGreen": "#859900",
        "Comment": "#586e75",
        "Gray": "#839496"
      }
    }
  }
}
```
Reload, then type `/theme` → select CorpGreen.
8.2 Turn Frequent Prompts into Slash-Commands
Scenario: generate unit tests every day.
- Create the folder & file:

```bash
mkdir -p ~/.ve/commands/test
nano ~/.ve/commands/test/unit.toml
```

- TOML content:

```toml
description = "Generate Jest unit tests for the current file"
prompt = """
Please read @{file} and write comprehensive Jest tests for every exported function.
Requirements:
- Use TypeScript
- At least 2 assertions per function
- Save test as @{file}.test.ts
"""
```

- Use it inside veCLI:

```
> @src/utils.ts
> /test:unit
```
AI reads the source, writes the test file, and drops it next to the original.
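The same pattern covers any recurring prompt. For example, a hypothetical `~/.ve/commands/doc/readme.toml` (name and wording are illustrative) would give you a `/doc:readme` command:

```toml
description = "Draft a README section for the current file"
prompt = """
Read @{file} and write a concise README section describing its public API,
with one usage example per exported function.
"""
```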
9. Enterprise Lock-Down: Five Admin Controls
Drop these keys into `/etc/vecli/settings.json` (root-owned, 644).
Control | JSON snippet | Effect |
---|---|---|
Force sandbox | `"tools": {"sandbox": "docker"}` | All shell & file ops run inside a container; the host is untouched. |
Tool white-list | `"tools": {"core": ["ReadFileTool","ShellTool(ls)"]}` | Anything not listed is unreachable for every user. |
Disable telemetry | `"privacy": {"usageStatisticsEnabled": false}` | No usage data leaves the machine. |
Enforce AK/SK | `"security": {"auth": {"enforcedType": "aksk"}}` | Blocks OAuth flows. |
MCP catalog lock | `"mcp": {"allowed": ["corp-tools"]}, "mcpServers": {"corp-tools": {...}}` | Users cannot add their own MCP servers. |
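Put together, a locked-down `/etc/vecli/settings.json` might look like the sketch below; the `corp-tools` server definition is shown empty because, as in the table, its contents depend on your own MCP server:

```json
{
  "security": { "auth": { "enforcedType": "aksk" } },
  "privacy": { "usageStatisticsEnabled": false },
  "tools": {
    "sandbox": "docker",
    "core": ["ReadFileTool", "ShellTool(ls)"]
  },
  "mcp": { "allowed": ["corp-tools"] },
  "mcpServers": { "corp-tools": {} }
}
```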
Merge order (highest wins): system override > project `.ve/settings.json` > user `~/.ve/settings.json` > system default. Arrays (e.g. `includeDirectories`) are concatenated; colliding single values use the highest-priority file.
10. FAQ (Collected from Forums & Tickets)
Q1. My company blocks npm.
A. Mirror `@volcengine/vecli` to your private registry, then `npm i -g @volcengine/vecli --registry=https://your-mirror`.
Q2. I need two Volcengine accounts (prod / dev).
A. Keep two project folders, each with its own `.ve/.env` containing different AK/SK. Launch veCLI with `--project-dir`.
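A sketch of that layout, assuming `.ve/.env` is read from whichever folder `--project-dir` points at (key values are placeholders):

```bash
# two isolated project folders, each with its own key pair
mkdir -p ~/work/blog-prod/.ve ~/work/blog-dev/.ve
cat > ~/work/blog-prod/.ve/.env <<'EOF'
VOLCENGINE_ACCESS_KEY="AKprodxxxx"
VOLCENGINE_SECRET_KEY="SKprodxxxx"
EOF
cat > ~/work/blog-dev/.ve/.env <<'EOF'
VOLCENGINE_ACCESS_KEY="AKdevxxxxx"
VOLCENGINE_SECRET_KEY="SKdevxxxxx"
EOF

# pick the account by pointing at the matching folder
vecli --project-dir ~/work/blog-prod
```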
Q3. Garbled text or a hang on Windows.
A. Set UTF-8 with `chcp 65001`; if behind a proxy, add the `https_proxy` env-var. Use `--debug` to trace.
Q4. How to auto-approve in CI?
A. Start veCLI with `--approval-mode yolo`, or list allowed tools: `"tools": {"allowed": ["write_file","run_shell_command(npm test)"]}`.
Q5. Can I work completely offline?
A. Inference requires Volcengine cloud; on-prem model serving is not exposed yet.
11. Post-Install Checklist (Print & Stick)
- `vecli auth status` → ✔
- `vecli test-connection` → latency < 500 ms
- Pick your cheapest model (`--model`)
- Add a project-level `VE.md` with coding rules
- Convert frequent prompts into `.toml` commands
- Alias `vecli='vecli --sandbox'` on shared PCs
- Append `.ve/` to `.gitignore` to avoid leaking keys
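The last two items as copy-paste lines (a sketch; adjust the profile file to your shell):

```bash
# sandbox by default on shared machines
echo "alias vecli='vecli --sandbox'" >> ~/.bashrc

# keep the key-bearing .ve/ folder out of version control
echo ".ve/" >> .gitignore
```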
12. What to Explore Next
- Wire in the GitHub MCP server to let AI create issues and comment on PRs.
- Build a custom `sandbox.Dockerfile` pre-loaded with your tool-chain.
- Use `/memory refresh` to hot-reload hierarchical instructions (global + project + sub-folder).
- Pipe telemetry to your own OTLP collector and monitor token spend in real time.
The terminal is no longer a black box; it’s a colleague that never sleeps.
Install veCLI once, offload the boring stuff, and free your brain for real problems.