Preface
As software delivery accelerates, developers constantly juggle the CLI, scripts, tests, and documentation. Trae Agent lets you execute complex workflows (code edits, testing, deployments) with simple natural-language commands, freeing both your hands and your focus.
Trae Agent: Your AI‑Powered Automation Companion for Software Engineering
Introduction to Trae Agent
Trae Agent is an LLM‑driven agent designed to streamline everyday software engineering tasks. Whether you need to generate a script, fix a bug, write tests, or update documentation, just issue a natural‑language instruction:
```bash
trae-cli run "Generate a project README"
```
Key benefits include:
- **Natural-Language Interface**: Execute end-to-end workflows without memorizing shell commands.
- **Multi-Model Support**: Compatible with both OpenAI and Anthropic APIs; switch between models for precision or cost-efficiency.
- **Rich Toolset**: Built-in file editing, Bash execution, sequential reasoning, and more to cover a breadth of development scenarios.
- **Interactive Mode**: Enter a live REPL-style session (`trae-cli interactive`) to iterate on tasks, view history, and invoke help.
- **Configurable & Traceable**: Customize via JSON or environment variables, and record detailed execution logs (trajectories) for debugging and audit.
Installation & Quickstart
Follow these four steps to get up and running:

1. **Clone the Repository**

   ```bash
   git clone <repository-url>
   cd trae-agent
   ```

2. **Sync Dependencies**

   ```bash
   uv sync
   ```

3. **Set API Keys**

   ```bash
   export OPENAI_API_KEY="your_openai_key"
   export ANTHROPIC_API_KEY="your_anthropic_key"
   ```

4. **Run Your First Task**

   ```bash
   trae-cli run "Create a Hello World script"
   ```

You can also:

- Override any setting on the fly with flags such as `--model`, `--provider`, or `--max-steps`.
- Launch interactive mode:

  ```bash
  trae-cli interactive --provider openai --model gpt-4o
  ```
Configuration Details
Trae Agent reads settings in this priority order:

1. Command-Line Arguments
2. Configuration File (`trae_config.json`)
3. Environment Variables
4. Built-In Defaults

Example `trae_config.json`:
```json
{
  "default_provider": "anthropic",
  "max_steps": 20,
  "model_providers": {
    "openai": {
      "api_key": "your_openai_key",
      "model": "gpt-4o",
      "max_tokens": 128000,
      "temperature": 0.5,
      "top_p": 1
    },
    "anthropic": {
      "api_key": "your_anthropic_key",
      "model": "claude-sonnet-4-20250514",
      "max_tokens": 4096,
      "temperature": 0.5,
      "top_p": 1,
      "top_k": 0
    }
  }
}
```
- `default_provider` sets your primary LLM.
- `max_steps` caps the agent's reasoning iterations.
- `model_providers` details each provider's parameters.
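The precedence order above can be sketched in a few lines of Python. This is an illustrative model only: the `resolve_setting` helper, the `env_map`, and the `TRAE_DEFAULT_PROVIDER` variable name are assumptions for the sketch, not Trae Agent's actual internals.

```python
import os

# Illustrative sketch of the precedence order described above
# (CLI args > config file > environment variables > built-in defaults).
# Function and variable names here are hypothetical, not Trae Agent's API.

DEFAULTS = {"default_provider": "anthropic", "max_steps": 20}

def resolve_setting(key, cli_args, config, env_map):
    """Return the first value found, walking from highest to lowest priority."""
    if cli_args.get(key) is not None:           # 1. command-line arguments
        return cli_args[key]
    if key in config:                           # 2. trae_config.json
        return config[key]
    env_var = env_map.get(key)
    if env_var and os.environ.get(env_var):     # 3. environment variables
        return os.environ[env_var]
    return DEFAULTS.get(key)                    # 4. built-in defaults

config = {"max_steps": 30}                      # as if loaded from trae_config.json
cli = {"max_steps": 10, "default_provider": None}
env = {"default_provider": "TRAE_DEFAULT_PROVIDER"}  # hypothetical env var name

print(resolve_setting("max_steps", cli, config, env))         # CLI flag wins: 10
print(resolve_setting("default_provider", cli, config, env))  # falls through to the default when the env var is unset
```

The key point is that a `--max-steps` flag beats the config file's value, while a setting absent everywhere falls back to the built-in default.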
Deep Dive: Trajectory Recording
Why Trajectories Matter
- **Debugging**: Rewind every decision, tool call, and LLM response.
- **Audit & Compliance**: Retain immutable logs for regulatory needs.
- **Optimization**: Analyze token usage, model efficiency, and tool frequency.
Trajectory File Structure
Trajectories are saved as JSON, capturing the timestamp, provider, input messages, and each LLM response along with its tool calls. Sample snippet:
```json
{
  "timestamp": "2025-06-12T22:05:47.000000",
  "provider": "anthropic",
  "input_messages": [...],
  "response": {
    "content": "Here's your script...",
    "tool_calls": [...]
  }
}
```
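As a quick illustration of the optimization use case, a short script can summarize a trajectory file. The field names (`provider`, `response`, `tool_calls`) follow the snippet above; the assumption that the file holds a flat list of such records is the sketch's own, and real trajectory files may nest these fields differently.

```python
import json

# Summarize a trajectory file: count LLM interactions and tool calls.
# Assumes the file holds a list of records shaped like the snippet above.

def summarize(path):
    with open(path) as f:
        records = json.load(f)
    interactions = len(records)
    tool_calls = sum(len(r.get("response", {}).get("tool_calls", []))
                     for r in records)
    providers = {r.get("provider") for r in records}
    return {"interactions": interactions,
            "tool_calls": tool_calls,
            "providers": sorted(p for p in providers if p)}

# Demonstrate with a tiny synthetic trajectory written to disk:
sample = [{"timestamp": "2025-06-12T22:05:47.000000",
           "provider": "anthropic",
           "input_messages": [],
           "response": {"content": "Here's your script...",
                        "tool_calls": [{"name": "bash"}]}}]
with open("sample_trajectory.json", "w") as f:
    json.dump(sample, f)

print(summarize("sample_trajectory.json"))
```

A summary like this makes it easy to spot tasks that burn many steps or lean heavily on one tool.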
Enabling & Customizing Trajectories
- **Default Filename**:

  ```bash
  trae-cli run "Refactor module"
  # Generates trajectory_20250707_160500.json
  ```

- **Custom Filename**:

  ```bash
  trae-cli run "Fix main.py bug" --trajectory-file debug_log.json
  ```
Tool Ecosystem & Extensions
- `str_replace_based_edit_tool`: View, create, and patch files.
- `bash`: Run shell commands and scripts.
- `sequential_thinking`: Break tasks into stages and reflect iteratively.
- `task_done`: Flag completion and produce a concise summary.
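Conceptually, the agent cycles through these tools until `task_done` signals completion, with `max_steps` capping the loop. The following is a deliberately simplified, hypothetical illustration of that pattern, with a stubbed-out LLM, not Trae Agent's real implementation:

```python
# A simplified agent loop showing how the tools above interact:
# plan, act, and stop when task_done is signaled.
# Conceptual sketch only; fake_llm stands in for a real LLM call.

def fake_llm(step):
    """Stand-in for the LLM: returns a (tool, argument) pair per step."""
    plan = [("sequential_thinking", "outline the test cases"),
            ("str_replace_based_edit_tool", "write test_utils.py"),
            ("bash", "run pytest"),
            ("task_done", "tests added and passing")]
    return plan[step]

def run_agent(max_steps=20):
    log = []
    for step in range(max_steps):       # max_steps caps reasoning iterations
        tool, arg = fake_llm(step)
        log.append(f"{tool}: {arg}")    # each step would appear in the trajectory
        if tool == "task_done":         # completion signal ends the loop
            break
    return log

for line in run_agent():
    print(line)
```

The trajectory files described earlier are essentially a detailed record of exactly this loop: each tool invocation plus the LLM response that chose it.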
Example:
```bash
trae-cli run "Add unit tests for utils module" --working-dir ./project
```
The agent will think through each step, write tests, execute them, and report results.
Frequently Asked Questions (FAQ)
1. Which platforms does Trae Agent support?
Trae Agent runs on any OS with Python 3.12+, including macOS, Linux, and Windows via WSL.
2. How do I switch LLM providers?
Use the `--provider openai` or `--provider anthropic` flag, or update the `default_provider` in your config file.
3. Will trajectory files overwrite existing ones?
Custom filenames will overwrite if they exist; timestamped defaults are unique.
4. How can I see all available commands?
```bash
trae-cli --help
trae-cli run --help
```
5. How do I tweak model parameters?
Add flags like `--temperature` or `--top_p` on the command line, or edit your config file.
How‑To: Start Recording Your Workflow
```bash
# 1. Clone & install
git clone <repository-url>
cd trae-agent
uv sync

# 2. Set up env vars
export OPENAI_API_KEY="..."
export ANTHROPIC_API_KEY="..."

# 3. Run & record
trae-cli run "Refactor database module" --trajectory-file db_refactor.json
```
Conclusion
Trae Agent bridges the gap between natural‑language instructions and robust development workflows. With multi‑model support, a versatile toolset, and transparent trajectory logs, you can automate routine coding tasks, speed up CI/CD integration, and maintain full traceability. Try Trae Agent today and focus on crafting innovative solutions while your AI companion handles the repetitive work.