Choose your plan

Local is always free. Scale to cloud when you need to.

Billed monthly or annually (annual billing saves 20%).

Free

$0
forever
  • 5 parallel agents
  • 2 waves
  • 1 provider
  • Core MCP tools
  • BYOK
$ npx @wundam/orchex
Pro (Popular)

$19/month, or $15/month billed annually
  • 100 runs/mo
  • 15 agents
  • Full self-heal
  • orchex learn
  • Dashboard
  • Priority support
Get Started

Team

$49/user/month, or $39/user/month billed annually
  • 500 runs/mo
  • 25 agents
  • Shared orchestrations
  • Team management
  • Analytics
  • Shared quotas
Get Started

Enterprise

Custom
contact us
  • Unlimited
  • 50+ agents
  • SLA
  • Dedicated infrastructure
  • Custom integrations
Contact Sales

All tiers include BYOK — bring your own LLM API keys. You pay the LLM provider directly.

Feature                 Free       Pro        Team        Enterprise
Cloud Runs              –          100/mo     500/mo      Unlimited
Parallel Agents         5          15         25          50+
Waves                   2          10         25          Unlimited
LLM Providers           1          2          3           All
Self-Healing            –          Full       Full        Full + Custom
orchex learn            –          ✓          ✓           ✓
Dashboard               –          ✓          ✓ + Admin   ✓ + Admin
Cost Tracking           –          ✓          ✓           ✓
API Access              –          ✓          ✓           ✓
Team Features           –          –          ✓           ✓
Shared Orchestrations   –          –          ✓           ✓
SLA                     –          –          –           ✓
Support                 Community  Priority   Dedicated   Custom
What does BYOK mean?

You bring your own LLM API keys. Orchex charges for orchestration only — you pay OpenAI, Anthropic, or Google directly for token usage. Your keys are encrypted with AES-256-GCM and never stored in plaintext.

Can I start free and upgrade later?

Yes. Local MCP tools are always free with no account required. Cloud features require signup. Upgrade to any paid tier anytime — your data and history are preserved.

What happens when I hit my run limit?

Cloud orchestrations pause until the next billing cycle. Local execution is never limited. You can upgrade mid-cycle to get more runs immediately.

Do you offer a free trial for Pro?

Yes. Every new account gets $5 in cloud credits — enough for approximately 10 orchestrations. Credits are valid for 30 days, and no credit card is required.

How does team billing work?

Team tier is per-user. Each member gets their own BYOK keys and execution history. The team shares a quota pool. Org admins can set per-user budgets and view team analytics.
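The pooled-quota-with-per-user-budgets model above can be sketched as a small data structure. This is our illustration of the billing semantics described in the answer, not Orchex's API:

```javascript
// Illustrative model: a team-wide run pool with optional per-user caps.
class TeamQuota {
  constructor(monthlyRuns) {
    this.remaining = monthlyRuns;   // shared pool for the whole team
    this.budgets = new Map();       // optional per-user caps set by org admins
    this.used = new Map();          // runs consumed per user this cycle
  }
  setBudget(user, maxRuns) { this.budgets.set(user, maxRuns); }
  tryConsume(user) {
    const used = this.used.get(user) ?? 0;
    const cap = this.budgets.get(user) ?? Infinity;
    if (this.remaining <= 0 || used >= cap) return false; // pool or budget exhausted
    this.remaining -= 1;
    this.used.set(user, used + 1);
    return true;
  }
}
```

A run succeeds only while both the shared pool and the user's own budget have headroom, which matches the "shared quota pool plus per-user budgets" behavior described above.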

What LLM providers are supported?

OpenAI (GPT-4, GPT-4.5), Google Gemini, Anthropic Claude, DeepSeek, and Ollama (local/self-hosted models). Route different orchestrations to different providers — use the best model for each job.
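Per-orchestration routing could look something like the table below. The task names and model identifiers here are placeholders we made up to show the idea; the source only states that different orchestrations can target different providers:

```javascript
// Hypothetical routing table: map each orchestration type to a provider/model.
const routes = {
  "code-review": { provider: "anthropic", model: "claude" },
  "summarize":   { provider: "openai",    model: "gpt-4" },
  "local-draft": { provider: "ollama",    model: "llama3" }, // self-hosted, no API cost
};

// Fall back to a default provider when a task has no explicit route.
function pickRoute(task, fallback = { provider: "openai", model: "gpt-4" }) {
  return routes[task] ?? fallback;
}

console.log(pickRoute("local-draft").provider); // "ollama"
```

The useful property is that routing is data, not code: swapping the best model for a job means editing one table entry.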

Ready to orchestrate?

$ npx @wundam/orchex

Sign up for free