Building a Visible AI Team with Common Ground: A Complete Guide from Install to First Run
Table of Contents

- What exactly is Common Ground?
- Why should you spend time on it?
- How the “Partner–Principal–Associate” model works
- Get everything running in 15 minutes (Docker mode)
- Developer mode: three commands to run from source
- Change agent behavior without touching code (YAML crash course)
- Frequently asked questions (FAQ)
- What to do next?
1. What Exactly Is Common Ground?
In one sentence:
Common Ground is an open-source platform that turns a group of AI agents into a transparent consulting team.
Think of it like this:

- You are the client who drops a big question—e.g., “Map supply-chain risks for solid-state batteries in 2025.”
- A Partner agent chats with you and breaks the request into deliverables.
- A Principal agent schedules those deliverables and assigns them to Associate agents.
- Associates search, analyse, and write in parallel.
- The Principal merges everything into a final report.
- All the while, you watch every move—like cards moving across a Trello board.
2. Why Should You Spend Time on It?
3. How the “Partner–Principal–Associate” Architecture Works
The README shows a Mermaid diagram of this flow; in plain English: you brief the Partner, the Partner turns your brief into a plan of deliverables, the Principal dispatches those deliverables to Associates, the Associates execute in parallel, and the Principal merges their output into the final report.
A WebSocket pushes every thought, tool call, and artefact to your browser in real time.
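The same flow can be sketched as a minimal Mermaid diagram (node labels are my paraphrase of the description above, not copied from the README's diagram):

```mermaid
graph TD
    U[You, the client] --> P[Partner: scopes the request]
    P --> Pr[Principal: plans and dispatches]
    Pr --> A1[Associate: research]
    Pr --> A2[Associate: analysis]
    A1 --> Pr
    A2 --> Pr
    Pr --> R[Final report]
```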
4. Get Everything Running in 15 Minutes (Docker Mode)
This is the officially recommended path for newcomers. All commands come straight from the README.
4.1 Prerequisites
- Docker & Docker Compose installed
- A Google account (the free Gemini tier is enough)
4.2 Five-Step Launch
1. Clone the repo:

   ```bash
   git clone https://github.com/Intelligent-Internet/CommonGround
   cd CommonGround
   ```

2. Pull the submodules (needed for the Gemini bridge):

   ```bash
   git submodule update --init --recursive
   ```

3. Move to the deployment folder:

   ```bash
   cd deployment
   ```

4. Log in for the first time (do this once):

   ```bash
   docker compose run --rm --service-ports login
   ```

   Follow the on-screen OAuth flow. If you hit a `GOOGLE_CLOUD_PROJECT` error, open `docker-compose.yaml`, uncomment the `GOOGLE_CLOUD_PROJECT` line, insert your project ID, and rerun the command.

5. Start all services:

   ```bash
   docker compose up --build -d
   ```

   When the containers show `healthy`, open http://localhost:8000.
4.3 Quick Check
- The home screen shows a demo project.
- Click Run and watch the Kanban board come alive.
- A greeting from the Partner agent confirms the loop is complete.
5. Developer Mode: Three Commands to Run from Source
If you want to hack the core or swap LLMs, switch to local mode.
Backend
```bash
cd core
uv venv              # create the virtual environment
uv sync              # install dependencies
cp env.sample .env   # optional: edit the LLM config
uv run run_server.py # serves http://localhost:8000
```
Frontend
Open a second terminal:

```bash
cd frontend
cp .env.example .env.local
npm install
npm run dev          # serves http://localhost:3000
```
The front-end will auto-proxy to the back-end on port 8000.
6. Change Agent Behavior Without Touching Code (YAML Crash Course)
All “personality” lives in three YAML locations. Edit, save, and the system hot-reloads.
No container restart is required.
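As a sketch of what such a profile can look like (the key names here are illustrative guesses rather than the project's actual schema; check the YAML files shipped in the repo):

```yaml
# Hypothetical Associate profile -- illustrative key names only.
name: Associate_WebResearcher
model: gemini/gemini-2.5-pro      # any LiteLLM-routable model string
system_prompt: |
  You research the assigned sub-task and cite your sources.
tool_access_policy:               # grant only the tools this agent needs
  - web_search
```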
7. Frequently Asked Questions (FAQ)
All answers come directly from the README or obvious user questions.
Q1: Must I use Google Gemini?
A: No. LiteLLM routes to any provider. Put your API key in the YAML and you’re done.
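For example, pointing an agent at OpenAI might look like this (the key names are illustrative, not the repo's exact schema; the `provider/model` string is LiteLLM's routing format):

```yaml
# Illustrative only -- match the key names used by the repo's YAML.
llm:
  model: openai/gpt-4o        # LiteLLM routes on the provider/ prefix
  api_key: ${OPENAI_API_KEY}  # keep secrets in the environment
```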
Q2: How large is the Docker image? Will my old laptop cope?
A: Roughly 2 GB compressed. Eight GB RAM is comfortable.
Q3: My `.gemini` folder lives elsewhere. What do I do?
A: Edit the `volumes` line in `docker-compose.yaml` to point to your real path.
Q4: How do I add a custom search tool?
A:

1. Create `core/agent_core/nodes/custom_nodes/my_search.py`.
2. Inherit `BaseToolNode` and decorate with `@tool_registry`.
3. List the new tool name in the YAML `tool_access_policy`.

The framework discovers it automatically.
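To make the registration pattern concrete, here is a self-contained sketch. `BaseToolNode` and the registry below are minimal stand-ins I defined for illustration; they are not the framework's real classes (in the real project you would import them from `agent_core`, whose signatures may differ):

```python
# Minimal stand-ins for the framework pieces -- illustrative only.
tool_registry = {}

def register_tool(cls):
    """Toy decorator standing in for the framework's @tool_registry."""
    tool_registry[cls.tool_name] = cls
    return cls

class BaseToolNode:
    """Toy base class standing in for the framework's BaseToolNode."""
    tool_name = "base"

    def run(self, query: str) -> str:
        raise NotImplementedError

@register_tool
class MySearchNode(BaseToolNode):
    tool_name = "my_search"   # the name the YAML tool_access_policy refers to

    def run(self, query: str) -> str:
        # Real code would call a search API here.
        return f"results for: {query}"

# The registry now knows the tool by name, so a YAML entry can enable it.
print(sorted(tool_registry))  # ['my_search']
```

The point of the pattern is that adding a tool never touches the dispatch code: the decorator records the class at import time, and the YAML policy decides which agents may use it.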
Q5: Will a huge project folder slow RAG indexing?
A: Incremental indexing is built in; only new or modified files are re-embedded, so even 100 k files stay fast.
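The idea behind incremental indexing can be shown with a toy modification-time check (my sketch of the general technique, not CommonGround's actual implementation):

```python
import os

def files_to_reindex(root: str, seen: dict[str, float]) -> list[str]:
    """Return paths that are new or modified since the last pass.

    `seen` maps path -> mtime recorded at the previous indexing run;
    it is updated in place. Toy sketch, not CommonGround's real code.
    """
    changed = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            if seen.get(path) != mtime:   # unseen path or newer mtime
                changed.append(path)
                seen[path] = mtime
    return changed
```

On the first pass everything is "changed" and gets embedded; on every later pass only files whose recorded mtime differs are returned, which is why large but mostly static folders stay fast.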
8. What to Do Next?
- Experiment with team configs: swap the Partner prompt to a legal expert; split Associates into a “Regulation Researcher” and a “Financial Analyst” and compare results.
- Wire in internal APIs: use MCP to expose an internal database as a tool and ask questions over private data.
- Write a short blog post: share the YAML you tweaked and any pitfalls you hit; the Discord channel (#show-and-tell) will pin good write-ups.
- Contribute back: the CONTRIBUTING.md file in the repo details how to open a pull request. Even a docs typo helps.
Wrap-up
Common Ground turns “black-box AI” into a visible team.
With one Docker command you can spin up the entire stack, use YAML files like spreadsheets to tweak agent behaviour, and watch a Kanban board track every AI move in real time.
The rest is experimentation, documentation, and sharing—turning AI into a colleague you can see, edit, and trust.