MCP Compatibility

Works with every MCP client

Conductor speaks the Model Context Protocol — the open standard for connecting AI agents to tools. Any client that supports MCP via stdio transport connects to Conductor in under a minute.

Auto-configure any client

The quickest way to connect any supported client is the interactive setup command. It detects installed clients and writes the config automatically.

bash
# Detects Claude Desktop, Cursor, Cline, and more
conductor mcp setup

# Or target a specific client:
conductor mcp setup --client claude
conductor mcp setup --client cursor
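Under the hood, setup merges a server entry into each client's JSON config file. A minimal Python sketch of that merge, assuming the entry shape from the Claude Desktop example below — an illustration of the idea, not Conductor's actual implementation:

```python
import json
import tempfile
from pathlib import Path

def add_conductor_server(config_path: Path) -> dict:
    """Merge a 'conductor' entry into an MCP client config, keeping existing servers."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["conductor"] = {
        "command": "conductor",
        "args": ["mcp", "start"],
    }
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Demo against a scratch file so no real client config is touched
demo = Path(tempfile.mkdtemp()) / "claude_desktop_config.json"
demo.write_text(json.dumps({"mcpServers": {"other": {"command": "foo"}}}))
merged = add_conductor_server(demo)
print(sorted(merged["mcpServers"]))  # ['conductor', 'other']
```

Note the setdefault: an existing mcpServers map is preserved, so other servers the client already knows about are not clobbered.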

Manual configuration

If you prefer manual setup, find your client below and add the config block to the appropriate file.

Claude Desktop

Anthropic's desktop application. The reference MCP client — full tool support, approval prompts, and streaming.

claude.ai
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
json
{
  "mcpServers": {
    "conductor": {
      "command": "conductor",
      "args": ["mcp", "start"]
    }
  }
}

Restart Claude Desktop after editing the config. The server appears under Settings > MCP Servers.

Cursor

AI-native code editor built on VS Code. Supports MCP tools in Composer and Chat contexts.

cursor.com
Project (recommended): .cursor/mcp.json
Global: ~/.cursor/mcp.json
json
{
  "mcpServers": {
    "conductor": {
      "command": "conductor",
      "args": ["mcp", "start"],
      "env": {}
    }
  }
}

Project-level config takes precedence. After saving, reload Cursor (Cmd/Ctrl+Shift+P > Reload Window).

Cline (VS Code)

Open-source AI coding agent extension for VS Code with full MCP support.

marketplace.visualstudio.com
VS Code Settings: settings.json → cline.mcpServers
json
// In VS Code settings.json
{
  "cline.mcpServers": {
    "conductor": {
      "command": "conductor",
      "args": ["mcp", "start"],
      "disabled": false,
      "alwaysAllow": []
    }
  }
}

Open VS Code settings (Cmd+, or Ctrl+,), click 'Edit in settings.json', and add the mcpServers entry. Or configure via the Cline panel > MCP Servers tab.

Continue.dev

Open-source AI code assistant for VS Code and JetBrains with MCP tool support.

continue.dev
Config file: ~/.continue/config.json
json
{
  "models": [...],
  "mcpServers": [
    {
      "name": "conductor",
      "command": "conductor",
      "args": ["mcp", "start"]
    }
  ]
}

After editing config.json, reload Continue with the refresh button in the Continue panel.

Windsurf (Codeium)

AI-first code editor from Codeium with Cascade agent that supports MCP.

codeium.com
Config file: ~/.codeium/windsurf/mcp_settings.json
json
{
  "mcpServers": {
    "conductor": {
      "command": "conductor",
      "args": ["mcp", "start"],
      "disabled": false
    }
  }
}

Restart Windsurf after editing. MCP tools appear in the Cascade context.

Zed

High-performance multiplayer code editor with native MCP support in AI assistant.

zed.dev
Config file: ~/.config/zed/settings.json
json
{
  "assistant": {
    "version": "2",
    "default_model": { "provider": "anthropic", "model": "claude-opus-4-5" }
  },
  "context_servers": {
    "conductor": {
      "command": {
        "path": "conductor",
        "args": ["mcp", "start"]
      }
    }
  }
}

Zed uses 'context_servers' instead of 'mcpServers'. Restart Zed after editing.

Neovim (mcphub.nvim)

Neovim plugin that integrates MCP servers with any Neovim AI assistant.

github.com
Lazy.nvim config: ~/.config/nvim/lua/plugins/mcphub.lua
lua
-- lazy.nvim plugin spec
{
  "ravitemer/mcphub.nvim",
  config = function()
    require("mcphub").setup({
      servers = {
        conductor = {
          command = "conductor",
          args = { "mcp", "start" },
        }
      }
    })
  end
}

Works with Avante.nvim, codecompanion.nvim, and other Neovim AI plugins.

Aider

Terminal-based AI coding assistant with MCP tool support.

aider.chat
CLI flag: --mcp-server-command
Config file: ~/.aider.conf.yml
yaml
# .aider.conf.yml
mcp_server_command:
  - conductor
  - mcp
  - start

# Or via CLI flag:
# aider --mcp-server-command "conductor mcp start"

MCP tools appear in Aider's tool-use mode. Use --no-auto-commits if using write tools.

OpenAI Desktop

OpenAI's desktop app with MCP support for GPT-4 and o-series models.

openai.com
macOS: ~/Library/Application Support/OpenAI/Desktop/mcp_servers.json
json
{
  "mcpServers": {
    "conductor": {
      "command": "conductor",
      "args": ["mcp", "start"]
    }
  }
}

OpenAI Desktop MCP support is in beta. Check the OpenAI Desktop release notes for the current config format.

Gemini CLI

Google's Gemini CLI tool with MCP server support.

github.com
Config file: ~/.gemini/settings.json
json
{
  "mcpServers": {
    "conductor": {
      "command": "conductor",
      "args": ["mcp", "start"]
    }
  }
}

Install with: npm install -g @google/gemini-cli

VS Code (GitHub Copilot)

GitHub Copilot in VS Code now supports MCP servers for tool use in agent mode.

code.visualstudio.com
VS Code settings.json: mcp.servers
json
// .vscode/mcp.json (project-level, recommended)
{
  "servers": {
    "conductor": {
      "type": "stdio",
      "command": "conductor",
      "args": ["mcp", "start"]
    }
  }
}

Requires VS Code 1.99+ and GitHub Copilot Chat extension. Enable 'github.copilot.chat.mcp.enabled' in settings.

No global install? Use npx.

If you haven't installed Conductor globally, replace conductor with the npx equivalent in any config above:

json
{
  "mcpServers": {
    "conductor": {
      "command": "npx",
      "args": ["-y", "@conductor/cli", "mcp", "start"]
    }
  }
}

Transport: stdio vs HTTP/SSE

All client configs above use stdio transport: the client spawns the Conductor process and communicates over stdin/stdout. This is the recommended transport for local use. Conductor also supports HTTP/SSE transport for remote deployments:

bash
conductor mcp start --transport http --port 3000
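For a quick smoke test of the stdio transport, you can speak the protocol by hand: MCP messages are JSON-RPC 2.0, newline-delimited over stdin/stdout, and a session opens with an initialize request. A Python sketch of that first message (the protocolVersion date string is an assumption; use the revision your client pins):

```python
import json

def initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 'initialize' message that opens an MCP session."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumption: match your client's MCP revision
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.0"},
        },
    })

# One way to exercise a stdio server with it (hypothetical usage):
#   python build_init.py | conductor mcp start
print(initialize_request())
```

If the server is healthy it should answer on stdout with a matching JSON-RPC result carrying its own capabilities.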