The MCP Server Is Live: Configure Workforce Wave from Inside Claude, Cursor, and ChatGPT
The WFW MCP server is live at POST /v2/mcp on the API domain.
If you use Claude Code, Cursor, or any MCP-compatible AI assistant, you can now control your WFW agents directly from your development environment — no dashboard tab switching, no manual API calls, no context switching.
What MCP Is (One Sentence)
MCP (Model Context Protocol) is the open standard that lets AI assistants like Claude connect to external tools and take real actions on your behalf — one standard connector between an assistant and any number of services, the way USB-C is one standard connector between a laptop and any number of peripherals.
What You Can Do from Inside Claude
Once you've added the WFW MCP server to your Claude or Cursor configuration, these actions become available as natural language requests:
Provision agents — "Create a dental receptionist agent for Ridgeline Dental at ridgelinedental.com." Claude calls the WFW MCP server, kicks off a Workforce Wave provisioning job, and tells you the agent will be ready in 90 seconds.
Search knowledge bases — "What does the Ridgeline Dental agent know about implants?" Claude queries the agent's KB and returns the relevant documents.
Read call transcripts — "Show me the last five calls where the agent escalated." Claude pulls the transcripts from GET /v2/calls with the right filters applied.
Manage tools — "Enable appointment booking for the Ridgeline agent." Claude updates the tool configuration via the API.
Check agent status — "Is the Ridgeline Dental agent live? What's its current phone number?" Claude hits the status endpoint and tells you.
All of this happens inside your AI assistant. You describe what you want. Claude figures out which API calls to make, makes them, and reports back.
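Under the hood, each of those natural-language requests becomes a JSON-RPC 2.0 `tools/call` message — that is the wire format MCP defines. Here is a minimal sketch of what Claude sends to the server; the tool name `provision_agent` and its argument names are illustrative assumptions, not the server's actual tool list (ask the server via `tools/list` for the real names):

```typescript
// Build a JSON-RPC 2.0 "tools/call" request, the message shape MCP uses
// when an assistant invokes a server-side tool.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
): ToolCallRequest {
  return { jsonrpc: "2.0", id, method: "tools/call", params: { name, arguments: args } };
}

// Hypothetical tool name and arguments, for illustration only.
const req = buildToolCall(1, "provision_agent", {
  businessName: "Ridgeline Dental",
  websiteUrl: "https://ridgelinedental.com",
});
console.log(JSON.stringify(req, null, 2));
```

The assistant constructs this for you — the point of MCP is that you never have to — but seeing the shape makes it clear there is no magic: it is ordinary JSON over HTTP.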
The Developer Story
Here's what using it actually looks like.
We were demoing this internally last week. The scenario: provision a new dental agent for a practice we'd never configured before. The prompt to Claude:
"Provision a dental receptionist agent for Clearwater Family Dentistry. Their website is clearwaterfamilydentistry.com. Use the dental general practice template. When it's done, tell me the phone number."
Claude called the WFW MCP server with those parameters. Workforce Wave started a provisioning job. Claude polled the operation status. Ninety-one seconds later, Claude came back with the phone number and a summary of what Workforce Wave found on the site — pages crawled, KB documents created, persona extracted.
No dashboard. No curl. No JSON to manually construct.
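The kick-off-then-poll pattern Claude follows is the same one you would write against the REST API directly. A sketch of the polling half, with the status fetcher injected so it is easy to test — the operations endpoint path, the status values, and the interval are assumptions for illustration; check the API reference for the real contract:

```typescript
// Poll a long-running provisioning operation until it reaches a terminal
// state. The fetcher is injected so a real caller can pass fetch() and a
// test can pass a fake.
interface OperationStatus {
  status: string; // assumed values: "pending" | "completed" | "failed"
  phoneNumber?: string;
}
type Fetcher = (url: string) => Promise<OperationStatus>;

async function pollOperation(
  fetchStatus: Fetcher,
  operationId: string,
  intervalMs = 2000,
  maxAttempts = 60,
): Promise<OperationStatus> {
  // Hypothetical operations endpoint; the real path may differ.
  const url = `https://api.workforcewave.com/v2/operations/${operationId}`;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const op = await fetchStatus(url);
    if (op.status === "completed" || op.status === "failed") return op;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`operation ${operationId} did not finish in time`);
}
```

When you go through MCP, Claude runs this loop for you and just reports the result — the ninety-one seconds in the demo above were Claude polling, not you.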
That's what the MCP integration is for. Not replacing the API — the API is still there for production automation. This is for the moment when you're working inside Claude and you want to talk to your voice infrastructure the same way you talk to your code.
Configuration
Add the WFW MCP server to your Claude Code settings.json:
{
  "mcpServers": {
    "workforce-wave": {
      "url": "https://api.workforcewave.com/v2/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_TOKEN"
      }
    }
  }
}
For Cursor and other MCP clients, the configuration format varies — check your client's MCP setup docs and point it at the same URL.
Your API token is the same service account token you use for REST API calls. No new credentials needed.
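If you want to smoke-test the endpoint before wiring it into an assistant, the first message any MCP client sends is a JSON-RPC `initialize` handshake. A hedged sketch — the protocol version string is one published MCP revision and your client may negotiate a different one, and the response-handling here is deliberately minimal:

```typescript
// Minimal MCP "initialize" request -- the opening handshake of the protocol.
function buildInitialize(id: number) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "initialize" as const,
    params: {
      protocolVersion: "2025-03-26", // one published MCP revision
      capabilities: {},
      clientInfo: { name: "wfw-smoke-test", version: "0.0.1" },
    },
  };
}

// Send the handshake to the server from the config above (Node 18+ fetch).
async function smokeTest(token: string): Promise<void> {
  const res = await fetch("https://api.workforcewave.com/v2/mcp", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Accept: "application/json, text/event-stream",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify(buildInitialize(1)),
  });
  console.log(res.status, await res.text());
}
```

A 200 with a JSON-RPC result means your token and URL are good; anything else, check the token before blaming the client config.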
llms.txt
We've also published llms.txt at the API domain: https://api.workforcewave.com/llms.txt.
This is a machine-readable catalog of every WFW capability — what the platform does, what endpoints exist, what each one is for, and how they relate to each other. It follows the emerging llms.txt standard for AI-readable documentation. If an AI assistant ingests it, it gets a structured map of the full WFW platform — what's possible, not just what's in the currently open spec.
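The llms.txt convention is plain markdown: an H1 title, an optional blockquote summary, then H2 sections of link lists. A sketch of how tooling on the consuming side might parse one — the sample in the test is illustrative, not the contents of the real file:

```typescript
// Parse the llms.txt markdown convention: "# Title" plus
// "- [name](url): description" link lines under H2 sections.
interface LlmsLink {
  name: string;
  url: string;
  description: string;
}

function parseLlmsTxt(text: string): { title: string; links: LlmsLink[] } {
  let title = "";
  const links: LlmsLink[] = [];
  for (const line of text.split("\n")) {
    const h1 = line.match(/^# (.+)$/);
    if (h1) {
      title = h1[1];
      continue;
    }
    const link = line.match(/^- \[([^\]]+)\]\(([^)]+)\)(?::\s*(.*))?$/);
    if (link) {
      links.push({ name: link[1], url: link[2], description: link[3] ?? "" });
    }
  }
  return { title, links };
}
```

An assistant that ingests the file gets exactly this kind of structured map — a title, and a catalog of named, described capability links.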
We're treating llms.txt as a first-class artifact going forward. Every new endpoint gets a corresponding entry. Every significant capability change updates it.
The MCP server is live now. Add it to your Claude or Cursor configuration, and your next agent provision can happen in plain English.
API reference for the MCP endpoint is at workforcewave.com/docs/mcp.
Ready to put AI voice agents to work in your business?
Get a Live Demo — It's Free
Related Articles
The Bot Creation Matrix: Four Ways to Deploy AI, Now All Live on WFW
Dual-mode agent support just shipped, completing the Bot Creation Matrix. WFW is now the only platform where a bot can be the creator and the consumer — entirely human-free.
Rate Limiting and Idempotency: What Your Bot Needs to Know
The two most important API patterns for AI consumers of the WFW API — with concrete examples and a production-ready TypeScript client.
Workforce Wave AI: The Engine Behind Auto-Provisioning
What happens inside the 5-step Workforce Wave pipeline when a partner enters a business URL, why partners get an operationId instead of a 30-second wait, and how ww_operations powers the fleet dashboard progress bar.