Windsurf Integration

Integrate Orchex with Windsurf via the Model Context Protocol (MCP) to orchestrate complex workflows directly from your Windsurf editor.

Overview

Windsurf is Codeium's agentic IDE that combines AI assistance with powerful development tools. By integrating Orchex through MCP, you can:

  • Execute multi-step workflows from within Windsurf
  • Access Orchex's orchestration capabilities through natural language
  • View execution artifacts without leaving your editor
  • Chain complex operations with dependency management

Prerequisites

  • Windsurf IDE installed
  • Orchex installed globally or locally
  • Node.js 18+ (required for MCP server)

Installation

1. Install Orchex

If you haven't already:

npm install -g @wundam/orchex

Or install locally in your project:

npm install @wundam/orchex --save-dev

2. Configure MCP in Windsurf

Windsurf uses MCP configuration files to connect to context servers. You'll need to create or update your MCP settings file.

Location: The MCP configuration file is typically located at:

  • macOS/Linux: ~/.codeium/windsurf/mcp_settings.json
  • Windows: %APPDATA%\Codeium\Windsurf\mcp_settings.json

If the file doesn't exist, create it.

3. Add Orchex MCP Server

Add the following configuration to your mcp_settings.json. Configure for your preferred LLM provider:

Anthropic (Claude)

{
  "mcpServers": {
    "orchex": {
      "command": "orchex-server",
      "args": [],
      "env": {
        "ANTHROPIC_API_KEY": "your-api-key-here"
      }
    }
  }
}

OpenAI (GPT-4, GPT-4.5)

{
  "mcpServers": {
    "orchex": {
      "command": "orchex-server",
      "args": [],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}

Google Gemini

{
  "mcpServers": {
    "orchex": {
      "command": "orchex-server",
      "args": [],
      "env": {
        "GEMINI_API_KEY": "your-api-key-here"
      }
    }
  }
}

Ollama (Local Models)

{
  "mcpServers": {
    "orchex": {
      "command": "orchex-server",
      "args": [],
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}
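
Local Install (npx)

If you installed Orchex locally (as a devDependency) rather than globally, orchex-server may not be on your PATH. In that case you can launch it through npx; this sketch assumes npx can resolve the package's orchex-server binary from your project directory:

```json
{
  "mcpServers": {
    "orchex": {
      "command": "npx",
      "args": ["--yes", "orchex-server"],
      "env": {
        "OPENAI_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

If Windsurf does not start the server from your project root, set "command" to the full path of node_modules/.bin/orchex-server instead.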

Using Environment Variables

For better security, use environment variables instead of hardcoding your API key:

{
  "mcpServers": {
    "orchex": {
      "command": "orchex-server",
      "args": [],
      "env": {
        "OPENAI_API_KEY": "${OPENAI_API_KEY}"
      }
    }
  }
}

Make sure your provider's API key is set in your shell environment (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY).

4. Restart Windsurf

After updating the configuration:

  1. Close Windsurf completely
  2. Reopen Windsurf
  3. The Orchex MCP server should now be available

Verification

To verify the integration is working:

  1. Open the Cascade chat in Windsurf
  2. Ask: "Can you list the available Orchex tools?"
  3. You should see tools like:
    • orchex_execute - Execute an Orchex manifest
    • orchex_plan - Plan execution without running
    • orchex_validate - Validate a manifest

Usage

Basic Workflow Execution

Once configured, you can ask Windsurf to execute Orchex manifests:

Execute the manifest at ./manifests/add-feature.yaml

Windsurf will use the orchex_execute tool to run your workflow and show you the results.

Common Commands

Execute a Manifest

Run the Orchex manifest in examples/hello-world/manifest.yaml

Plan Before Executing

Show me the execution plan for ./my-manifest.yaml without running it

Validate a Manifest

Validate the manifest at ./manifests/complex-workflow.yaml

View Artifacts

Execute ./manifest.yaml and show me the artifacts that were created

Advanced Usage

Creating Manifests Interactively

You can ask Windsurf to help you create manifests:

Help me create an Orchex manifest to:
1. Add a new API endpoint for user authentication
2. Update the database schema
3. Add tests for the new endpoint

Windsurf will generate a manifest and can execute it for you.

Debugging Executions

If an execution fails:

The last Orchex execution failed. Can you analyze the error logs and suggest fixes?

Iterative Development

Execute ./feature-manifest.yaml, then if it succeeds, execute ./test-manifest.yaml

Workflow Examples

Example 1: Feature Development

You: "I need to add a user profile feature with these components: API route, database model, tests, and documentation."

Windsurf: Creates a manifest with multiple streams:

  • Stream 1: Database model
  • Stream 2: API route (depends on model)
  • Stream 3: Tests (depends on API)
  • Stream 4: Documentation (depends on all)

Then executes the manifest and shows you the results.

Example 2: Bug Fix Workflow

You: "Fix the authentication bug in src/auth.ts and update all related tests."

Windsurf: Creates and executes a manifest:

feature: fix-auth-bug
streams:
  fix-auth:
    plan: Fix the authentication issue in src/auth.ts
    owns:
      - src/auth.ts
  update-tests:
    plan: Update tests to verify the auth fix
    owns:
      - tests/auth.test.ts
    needs:
      - fix-auth

Example 3: Refactoring

You: "Refactor the user service to use dependency injection."

Windsurf: Creates a multi-stream manifest for:

  1. Extract interfaces
  2. Update service implementation
  3. Update all consumers
  4. Update tests

MCP Tools Reference

`orchex_execute`

Execute an Orchex manifest.

Parameters:

  • manifestPath (string, required): Path to the manifest file
  • options (object, optional):
    • dryRun (boolean): Plan without executing
    • verbose (boolean): Show detailed output
    • maxConcurrency (number): Maximum parallel streams

Example:

{
  "manifestPath": "./manifest.yaml",
  "options": {
    "verbose": true
  }
}

`orchex_plan`

Generate an execution plan without running.

Parameters:

  • manifestPath (string, required): Path to the manifest file

Returns: Execution plan with wave structure and dependencies.
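
The wave structure groups streams so that every stream in a wave depends only on streams from earlier waves. As an illustration only (not Orchex's actual implementation), the layering of a manifest's needs graph can be sketched with Kahn-style topological levels:

```python
def plan_waves(streams: dict[str, list[str]]) -> list[list[str]]:
    """Group streams into waves: each stream runs one wave after
    its deepest dependency (Kahn-style topological layering)."""
    indegree = {s: len(needs) for s, needs in streams.items()}
    dependents: dict[str, list[str]] = {s: [] for s in streams}
    for s, needs in streams.items():
        for n in needs:
            dependents[n].append(s)
    # Wave 1: streams with no dependencies.
    wave = [s for s, d in indegree.items() if d == 0]
    waves = []
    while wave:
        waves.append(sorted(wave))
        next_wave = []
        for s in wave:
            for d in dependents[s]:
                indegree[d] -= 1
                if indegree[d] == 0:
                    next_wave.append(d)
        wave = next_wave
    if sum(len(w) for w in waves) != len(streams):
        raise ValueError("dependency cycle detected")
    return waves

# The bug-fix manifest above: update-tests needs fix-auth.
print(plan_waves({"fix-auth": [], "update-tests": ["fix-auth"]}))
# → [['fix-auth'], ['update-tests']]
```

Streams within the same wave have no ordering constraints between them, which is what allows Orchex to run them in parallel (bounded by maxConcurrency).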

`orchex_validate`

Validate a manifest for syntax and semantic errors.

Parameters:

  • manifestPath (string, required): Path to the manifest file

Returns: Validation results with any errors or warnings.

Troubleshooting

Orchex Tools Not Available

Problem: Windsurf doesn't show Orchex tools in Cascade.

Solutions:

  1. Verify mcp_settings.json is in the correct location
  2. Check that the JSON syntax is valid
  3. Ensure orchex-server is in your PATH (for global install)
  4. Restart Windsurf completely
  5. Check Windsurf's developer console for MCP errors
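
A quick way to rule out steps 1 and 2 is to check the settings file programmatically. A minimal sketch (the path and key names follow the configuration shown earlier in this guide):

```python
import json
from pathlib import Path

def check_mcp_config(text: str) -> list[str]:
    """Return a list of problems found in mcp_settings.json content."""
    try:
        config = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    servers = config.get("mcpServers", {})
    if "orchex" not in servers:
        problems.append("no 'orchex' entry under 'mcpServers'")
    elif not servers["orchex"].get("command"):
        problems.append("'orchex' entry is missing its 'command'")
    return problems

# macOS/Linux location; adjust for Windows (%APPDATA%\Codeium\Windsurf).
path = Path.home() / ".codeium" / "windsurf" / "mcp_settings.json"
if path.exists():
    print(check_mcp_config(path.read_text()) or "configuration looks OK")
else:
    print(f"{path} does not exist")
```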

Authentication Errors

Problem: "API key not configured" errors.

Solutions:

  1. Verify your provider's API key is set in mcp_settings.json:
    • Anthropic: ANTHROPIC_API_KEY
    • OpenAI: OPENAI_API_KEY
    • Gemini: GEMINI_API_KEY or GOOGLE_API_KEY
    • Ollama: OLLAMA_BASE_URL (no key needed)
  2. If using environment variables, ensure they're exported:
    export OPENAI_API_KEY=your-key-here
    # or for Gemini:
    export GEMINI_API_KEY=your-key-here
  3. Restart Windsurf after setting environment variables
  4. Optionally set ORCHEX_PROVIDER to explicitly select your provider

Command Not Found

Problem: "orchex-server: command not found"

Solutions:

  1. For global install, verify:
    which orchex-server
  2. For local install, launch the server through npx: set "command" to "npx" and pass "orchex-server" in "args"
  3. Try specifying the full path:
    {
      "command": "/full/path/to/orchex-server"
    }

Execution Failures

Problem: Manifests fail to execute.

Solutions:

  1. Validate the manifest first:
    Validate my manifest at ./manifest.yaml
  2. Check file paths are relative to project root
  3. Verify all dependencies are correctly specified
  4. Review the execution logs for specific errors

Configuration Options

Advanced MCP Settings

You can customize the Orchex MCP server with additional environment variables:

{
  "mcpServers": {
    "orchex": {
      "command": "orchex-server",
      "args": [],
      "env": {
        "OPENAI_API_KEY": "your-api-key",
        "ORCHEX_PROVIDER": "openai",
        "ORCHEX_MAX_TOKENS": "200000",
        "ORCHEX_LOG_LEVEL": "info",
        "ORCHEX_CACHE_DIR": "~/.orchex/cache"
      }
    }
  }
}

Multiple Orchex Instances

You can configure multiple Orchex servers for different providers:

{
  "mcpServers": {
    "orchex-claude": {
      "command": "orchex-server",
      "env": {
        "ANTHROPIC_API_KEY": "claude-key"
      }
    },
    "orchex-gpt": {
      "command": "orchex-server",
      "env": {
        "OPENAI_API_KEY": "openai-key"
      }
    },
    "orchex-local": {
      "command": "orchex-server",
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}

This allows you to switch between providers depending on your use case.

Best Practices

1. Use Project-Specific Manifests

Store manifests in your project repository:

.
├── .orchex/
│   ├── manifests/
│   │   ├── add-feature.yaml
│   │   ├── refactor.yaml
│   │   └── test-update.yaml
│   └── config.json
├── src/
└── package.json

2. Leverage Windsurf's Context

Windsurf automatically provides file context. Use it in your prompts:

Create an Orchex manifest to refactor the currently open file

3. Iterate with Plan First

Always plan before executing complex workflows:

1. Show me the plan for this manifest
2. (Review the plan)
3. Execute the manifest

4. Combine with Windsurf Features

Use Orchex for multi-file orchestration and Windsurf for:

  • Quick single-file edits
  • Code explanations
  • Inline suggestions

5. Version Control Integration

Commit manifests to version control:

git add .orchex/manifests/
git commit -m "Add feature development manifest"

Comparison with Other Editors

| Feature | Windsurf | Claude Desktop | Cursor |
|---|---|---|---|
| MCP Support | ✅ Native | ✅ Native | ✅ Via Extension |
| Inline Execution | ✅ Yes | ❌ No | ✅ Yes |
| File Context | ✅ Automatic | ⚠️ Manual | ✅ Automatic |
| Chat Interface | ✅ Cascade | ✅ Native | ✅ Native |
| Multi-file Edits | ✅ Excellent | ⚠️ Limited | ✅ Good |

Next Steps

Getting Help