Building a Visible AI Team with Common Ground: A Complete Guide from Install to First Run


Table of Contents

  1. What exactly is Common Ground?
  2. Why should you spend time on it?
  3. How the “Partner–Principal–Associate” architecture works
  4. Get everything running in 15 minutes (Docker mode)
  5. Developer mode: three commands to run from source
  6. Change agent behavior without touching code (YAML crash course)
  7. Frequently asked questions (FAQ)
  8. What to do next?

1. What Exactly Is Common Ground?

In one sentence:
Common Ground is an open-source platform that turns a group of AI agents into a transparent consulting team.

Think of it like this:

  • You are the client who drops a big question—e.g., “Map supply-chain risks for solid-state batteries in 2025.”
  • A Partner agent chats with you and breaks the request into deliverables.
  • A Principal agent schedules those deliverables and assigns them to Associate agents.
  • Associates search, analyse, and write in parallel.
  • The Principal merges everything into a final report.
  • All the while, you watch every move—like cards moving across a Trello board.

2. Why Should You Spend Time on It?

| Pain you probably have | How Common Ground solves it |
| --- | --- |
| One-shot LLM answers with no traceability | Every step becomes a task card, fully logged |
| High prompt-tuning cost | Behavior lives in YAML; change one line and rerun |
| One-off integrations for each tool | Uses the Model Context Protocol (MCP); new tools need one Python decorator |
| Context scattered across teammates | Built-in RAG auto-indexes the workspace so every agent sees the same files |
| No idea what the model is “thinking” | Real-time Flow, Kanban, and Timeline views |

3. How the “Partner–Principal–Associate” Architecture Works

The README shows a Mermaid diagram; below is the same flow in plain English.

| Step | Who | What | What you see |
| --- | --- | --- | --- |
| 1 | You | Ask the research question | Chat box in the browser |
| 2 | Partner | Clarify, plan, create work modules | Live chat stream |
| 3 | Principal | Break modules into tasks | New cards appear on a Kanban board |
| 4 | Associates | Execute in parallel (search, code, write) | Cards move from “To Do” to “Doing” |
| 5 | Principal | Review, integrate, finalise | Timeline view shows merge history |
| 6 | Partner | Deliver the final report | Markdown report appears in chat |

A WebSocket connection pushes every thought, tool call, and artefact to your browser in real time.
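To make the hand-offs concrete, here is a toy sketch of the six-step flow above. This is not the Common Ground API; every class and method name below is invented purely to show how the three roles pass work along.

```python
# Toy model of the Partner -> Principal -> Associate flow.
# All names here are illustrative, not Common Ground's real classes.
from dataclasses import dataclass


@dataclass
class TaskCard:
    title: str
    status: str = "To Do"   # To Do -> Doing -> Done, like the Kanban board
    result: str = ""


class Partner:
    def plan(self, question: str) -> list[str]:
        # In the real system, an LLM breaks the question into work modules.
        return [f"{question}: background research", f"{question}: analysis"]


class Principal:
    def schedule(self, modules: list[str]) -> list[TaskCard]:
        return [TaskCard(title=m) for m in modules]

    def merge(self, cards: list[TaskCard]) -> str:
        return "\n".join(card.result for card in cards)


class Associate:
    def execute(self, card: TaskCard) -> None:
        card.status = "Doing"
        card.result = f"findings for {card.title!r}"
        card.status = "Done"


def run(question: str) -> str:
    partner, principal, associate = Partner(), Principal(), Associate()
    cards = principal.schedule(partner.plan(question))
    for card in cards:          # real Associates run in parallel
        associate.execute(card)
    return principal.merge(cards)   # the Partner delivers this as the report


print(run("solid-state batteries"))
```

In the actual platform each `execute` step is an LLM call with tool access, and the card status changes are what you watch moving across the Kanban board.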


4. Get Everything Running in 15 Minutes (Docker Mode)

This is the officially recommended path for newcomers. All commands come straight from the README.

4.1 Prerequisites

  • Docker & Docker Compose installed
  • A Google account (free Gemini tier is enough)

4.2 Five-Step Launch

  1. Clone the repo

    git clone https://github.com/Intelligent-Internet/CommonGround
    cd CommonGround
    
  2. Pull submodules (needed for the Gemini bridge)

    git submodule update --init --recursive
    
  3. Move to the deployment folder

    cd deployment
    
  4. First-time login (do once)

    docker compose run --rm --service-ports login
    
    • Follow the on-screen OAuth flow.
    • If you hit a GOOGLE_CLOUD_PROJECT error, open docker-compose.yaml, uncomment the GOOGLE_CLOUD_PROJECT line, insert your project ID, and rerun the command.
  5. Start all services

    docker compose up --build -d
    

    When containers show healthy, open
    http://localhost:8000

4.3 Quick Check

  • The home screen shows a demo project.
  • Click Run and watch the Kanban board come alive.
  • A greeting from the Partner agent confirms the loop is complete.

5. Developer Mode: Three Commands to Run from Source

If you want to hack the core or swap LLMs, switch to local mode.

Backend

cd core
uv venv              # create venv
uv sync              # install deps
cp env.sample .env   # optional: edit LLM config
uv run run_server.py # http://localhost:8000

Frontend

Open a second terminal

cd frontend
cp .env.example .env.local
npm install
npm run dev          # http://localhost:3000

The frontend auto-proxies API requests to the backend on port 8000.


6. Change Agent Behavior Without Touching Code (YAML Crash Course)

All “personality” lives in three YAML locations. Edit, save, and the system hot-reloads.

| File location | Purpose | Example tweak |
| --- | --- | --- |
| core/agent_profiles/profiles/*.yaml | Role, system prompt, tool access | Switch an Associate from Gemini to GPT-4 with one line: model: gpt-4 |
| core/agent_profiles/llm_configs/*.yaml | Global API keys, base URLs, temperature | Add a local Ollama endpoint by copying a block |
| core/agent_profiles/handover_protocols/*.yaml | How tasks are passed between agents | Route coding tasks to a “Coder” Associate and writing tasks to a “Writer” Associate |
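As an illustration of the first row, a profile tweak might look like the fragment below. Apart from `model` and `tool_access_policy` (both mentioned in this guide), the field names are assumptions, not the project's actual schema:

```yaml
# core/agent_profiles/profiles/associate.yaml -- illustrative sketch only;
# check the real profile files for the exact schema
name: Associate_Researcher
model: gpt-4            # was: a Gemini model -- one line swaps the LLM
system_prompt: |
  You are a meticulous research associate. Cite every source you use.
tool_access_policy:
  - web_search
```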

No container restart is required.


7. Frequently Asked Questions (FAQ)

The answers below are drawn from the README; the questions are the ones newcomers ask most often.

Q1: Must I use Google Gemini?

A: No. LiteLLM routes to any provider. Put your API key in the YAML and you’re done.
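Concretely, a provider entry might look like the sketch below. The surrounding key names are illustrative assumptions, not the project's actual schema; what is real is LiteLLM's convention of provider-prefixed model strings such as `ollama/llama3`:

```yaml
# core/agent_profiles/llm_configs/ -- illustrative fragment; key names
# other than the LiteLLM model strings are assumptions
- config_name: local_ollama
  model: ollama/llama3          # LiteLLM routes by the "ollama/" prefix
  api_base: http://localhost:11434
- config_name: openai_gpt4
  model: gpt-4
  api_key: ${OPENAI_API_KEY}    # read from the environment
```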

Q2: How large is the Docker image? Will my old laptop cope?

A: Roughly 2 GB compressed. Eight GB RAM is comfortable.

Q3: My .gemini folder lives elsewhere. What do I do?

A: Edit the volumes line in docker-compose.yaml to point to your real path.
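For example, assuming the credentials are mounted by the `login` service (the service name and container path here are guesses; check your own docker-compose.yaml), the edit looks like:

```yaml
# deployment/docker-compose.yaml -- illustrative only
services:
  login:
    volumes:
      - /custom/path/.gemini:/root/.gemini   # left side: your actual folder
```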

Q4: How do I add a custom search tool?

A:

  1. Create core/agent_core/nodes/custom_nodes/my_search.py.
  2. Inherit BaseToolNode and decorate the class with @tool_registry.
  3. List the new tool name in the YAML tool_access_policy.

The framework discovers the node automatically.
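The registry pattern behind those three steps can be shown stand-alone. `BaseToolNode` and `tool_registry` below are simplified stand-ins written for this sketch, not Common Ground's real classes:

```python
# Simplified stand-ins for BaseToolNode / tool_registry -- illustrative only.

class BaseToolNode:
    """Base class every tool node inherits from."""
    def execute(self, query: str) -> str:
        raise NotImplementedError


class ToolRegistry:
    """Maps YAML tool names to node classes."""
    def __init__(self) -> None:
        self._tools: dict[str, type] = {}

    def __call__(self, cls):            # used as a decorator: @tool_registry
        self._tools[cls.tool_name] = cls
        return cls

    def get(self, name: str) -> BaseToolNode:
        return self._tools[name]()


tool_registry = ToolRegistry()


@tool_registry
class MySearchNode(BaseToolNode):
    tool_name = "my_search"             # the name listed in tool_access_policy

    def execute(self, query: str) -> str:
        return f"results for {query!r}"


# "Discovery" is just a registry lookup by the YAML-listed name:
print(tool_registry.get("my_search").execute("battery supply chain"))
```

Because registration happens as a side effect of the decorator at import time, simply placing the file in the custom_nodes folder is enough for the framework to find it.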

Q5: Will a huge project folder slow RAG indexing?

A: Incremental indexing is built in; only new or modified files are re-embedded, so even 100 k files stay fast.
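The general idea of incremental indexing can be sketched in a few lines: track each file's modification time and re-embed only files that are new or changed. This illustrates the concept, not Common Ground's actual RAG internals:

```python
# Minimal incremental-indexing sketch: only new or modified files are
# returned for (re-)embedding. Illustrative, not the real implementation.
import os


def incremental_index(root: str, seen: dict[str, float]) -> list[str]:
    """Return paths needing (re-)embedding; update `seen` in place."""
    changed = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            if seen.get(path) != mtime:   # new file, or mtime changed
                changed.append(path)
                seen[path] = mtime
    return changed
```

A second pass over an unchanged folder returns an empty list, which is why even very large workspaces stay cheap after the first indexing run.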


8. What to Do Next?

  • Experiment with team configs: Swap the Partner prompt to a legal expert; split Associates into “Regulation Researcher” vs “Financial Analyst” and compare results.
  • Wire in internal APIs: Use MCP to expose an internal database as a tool and ask questions over private data.
  • Write a short blog post: Share the YAML you tweaked and any pitfalls you hit; the Discord channel (#show-and-tell) will pin good write-ups.
  • Contribute back: The CONTRIBUTING.md file in the repo details how to open a pull request. Even a docs typo helps.

Wrap-up

Common Ground turns “black-box AI” into a visible team.
With one Docker command you can spin up the entire stack, use YAML files like spreadsheets to tweak agent behaviour, and watch a Kanban board track every AI move in real time.
The rest is experimentation, documentation, and sharing—turning AI into a colleague you can see, edit, and trust.