Works with every MCP client
Conductor speaks the Model Context Protocol — the open standard for connecting AI agents to tools. Any client that supports MCP via stdio transport connects to Conductor in under a minute.
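Under the hood, every MCP connection begins the same way regardless of client: a JSON-RPC 2.0 `initialize` handshake over the chosen transport. A minimal sketch of that first message, assuming the 2024-11-05 protocol revision (the `example-client` name is illustrative, not part of Conductor):

```python
import json

def initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 'initialize' message an MCP client sends first."""
    msg = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Protocol revision the client proposes; the server replies
            # with the revision it actually supports.
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1.0"},
        },
    }
    return json.dumps(msg)

# Over stdio transport, the client writes this line to the server's stdin
# and reads the server's JSON-RPC response from its stdout.
print(initialize_request())
```

After a successful exchange, the client sends an `initialized` notification and can then discover and invoke tools via `tools/list` and `tools/call`.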
Auto-configure any client
The quickest way to connect any supported client is the interactive setup command. It detects installed clients and writes the config automatically.
# Detects Claude Desktop, Cursor, Cline, and more
conductor mcp setup

# Or target a specific client:
conductor mcp setup --client claude
conductor mcp setup --client cursor
Manual configuration
If you prefer manual setup, find your client below and add the config block to the appropriate file.
Claude Desktop
Anthropic's desktop application. The reference MCP client — full tool support, approval prompts, and streaming.
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
{
"mcpServers": {
"conductor": {
"command": "conductor",
"args": ["mcp", "start"]
}
}
}
ℹ Restart Claude Desktop after editing the config. The server appears under Settings > MCP Servers.
Cursor
AI-native code editor built on VS Code. Supports MCP tools in Composer and Chat contexts.
Project: .cursor/mcp.json
Global: ~/.cursor/mcp.json
{
"mcpServers": {
"conductor": {
"command": "conductor",
"args": ["mcp", "start"],
"env": {}
}
}
}
ℹ Project-level config takes precedence. After saving, reload Cursor (Cmd/Ctrl+Shift+P > Reload Window).
Cline (VS Code)
Open-source AI coding agent extension for VS Code with full MCP support.
settings.json → cline.mcpServers
// In VS Code settings.json
{
"cline.mcpServers": {
"conductor": {
"command": "conductor",
"args": ["mcp", "start"],
"disabled": false,
"alwaysAllow": []
}
}
}
ℹ Open VS Code settings (Cmd+, or Ctrl+,), click 'Edit in settings.json', and add the cline.mcpServers entry. Or configure via the Cline panel > MCP Servers tab.
Continue.dev
Open-source AI code assistant for VS Code and JetBrains with MCP tool support.
~/.continue/config.json
{
"models": [...],
"mcpServers": [
{
"name": "conductor",
"command": "conductor",
"args": ["mcp", "start"]
}
]
}
ℹ After editing config.json, reload Continue with the refresh button in the Continue panel.
Windsurf (Codeium)
AI-first code editor from Codeium whose Cascade agent supports MCP.
~/.codeium/windsurf/mcp_settings.json
{
"mcpServers": {
"conductor": {
"command": "conductor",
"args": ["mcp", "start"],
"disabled": false
}
}
}
ℹ Restart Windsurf after editing. MCP tools appear in the Cascade context.
Zed
High-performance multiplayer code editor with native MCP support in its AI assistant.
~/.config/zed/settings.json
{
"assistant": {
"version": "2",
"default_model": { "provider": "anthropic", "model": "claude-opus-4-5" }
},
"context_servers": {
"conductor": {
"command": {
"path": "conductor",
"args": ["mcp", "start"]
}
}
}
}
ℹ Zed uses 'context_servers' instead of 'mcpServers'. Restart Zed after editing.
Neovim (mcphub.nvim)
Neovim plugin that integrates MCP servers with any Neovim AI assistant.
~/.config/nvim/lua/plugins/mcphub.lua
-- lazy.nvim plugin spec
{
"ravitemer/mcphub.nvim",
config = function()
require("mcphub").setup({
servers = {
conductor = {
command = "conductor",
args = { "mcp", "start" },
}
}
})
end
}
ℹ Works with Avante.nvim, codecompanion.nvim, and other Neovim AI plugins.
Aider
Terminal-based AI coding assistant with MCP tool support.
~/.aider.conf.yml
# .aider.conf.yml
mcp_server_command:
  - conductor
  - mcp
  - start

# Or via CLI flag:
# aider --mcp-server-command "conductor mcp start"
ℹ MCP tools appear in Aider's tool-use mode. Use --no-auto-commits if using write tools.
OpenAI Desktop
OpenAI's desktop app with MCP support for GPT-4 and o-series models.
~/Library/Application Support/OpenAI/Desktop/mcp_servers.json
{
"mcpServers": {
"conductor": {
"command": "conductor",
"args": ["mcp", "start"]
}
}
}
ℹ OpenAI Desktop MCP support is in beta. Check the OpenAI Desktop release notes for the current config format.
Gemini CLI
Google's Gemini CLI tool with MCP server support.
~/.gemini/settings.json
{
"mcpServers": {
"conductor": {
"command": "conductor",
"args": ["mcp", "start"]
}
}
}
ℹ Install with: npm install -g @google/gemini-cli
VS Code (GitHub Copilot)
GitHub Copilot in VS Code now supports MCP servers for tool use in agent mode.
mcp.servers
// .vscode/mcp.json (project-level, recommended)
{
"servers": {
"conductor": {
"type": "stdio",
"command": "conductor",
"args": ["mcp", "start"]
}
}
}
ℹ Requires VS Code 1.99+ and the GitHub Copilot Chat extension. Enable 'github.copilot.chat.mcp.enabled' in settings.
No global install? Use npx.
If you haven't installed Conductor globally, replace conductor with the npx equivalent in any config above:
{
"mcpServers": {
"conductor": {
"command": "npx",
"args": ["-y", "@conductor/cli", "mcp", "start"]
}
}
}
Transport: stdio vs HTTP/SSE
All client configs above use stdio transport: the client spawns the Conductor process and communicates over stdin/stdout. This is the recommended transport for local use. Conductor also supports HTTP/SSE transport for remote deployments:

conductor mcp start --transport http --port 3000
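The difference shows up on the wire: over HTTP/SSE, messages from the server arrive as a Server-Sent Events stream rather than lines on stdout. A generic sketch of the SSE framing (following the text/event-stream format, not anything Conductor-specific) that extracts JSON-RPC payloads from such a stream:

```python
def parse_sse(stream: str) -> list[str]:
    """Extract 'data:' payloads from a Server-Sent Events stream.

    Events are separated by a blank line; an event's payload is the
    join of its 'data:' lines. (The SSE spec strips a single leading
    space after the colon; lstrip is a close-enough sketch.)
    """
    events: list[str] = []
    data_lines: list[str] = []
    for line in stream.splitlines():
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:
            # Blank line terminates the current event.
            events.append("\n".join(data_lines))
            data_lines = []
    if data_lines:
        events.append("\n".join(data_lines))
    return events

sample = 'data: {"jsonrpc":"2.0","id":1,"result":{}}\n\n'
print(parse_sse(sample))
```

Each extracted payload is then handled exactly like a line read from stdout under stdio transport, so the rest of the client logic is transport-agnostic.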