Interaction Mode 3 — Machine-Callable
Your AI agents can now make phone calls.
When GPT-5 calls your booking line at 2am to confirm a patient appointment, it gets a structured JSON response in 340ms — no hold music, no disambiguation, no brittle screen-scraping.
REST API, MCP server, and A2A Agent Card. Machine-readable from day one.
<500ms
Structured JSON response to machine callers.
Dual-mode detection, task execution, and typed response — all within a single call session your orchestrator can await.
Machine-Callable Mode — Properties
The Inversion
In 2024, AI was the product — deployed by humans to serve humans.
In 2025, AI is the customer. LangChain agents and Claude Code tasks need to make phone calls, check appointments, query inventory, and trigger workflows. WFW gives them a phone-capable API surface that returns structured data — not hold music. The bot isn't just the product anymore. It's the buyer.
What Breaks Without It
Phone systems weren't designed for machine callers.
Every phone interaction pattern built for humans — IVR, hold music, disambiguation — is friction for a machine caller that needs structured data.
Problem
Phone systems built for humans break for bots
IVR trees, hold music, voice disambiguation prompts — these are human UX patterns. An AI orchestrator calling a standard phone line gets stuck in press-1-for-English hell with no structured output.
Problem
No JSON means no integration
Your LangChain agent or n8n workflow needs a typed response it can parse, route on, and act upon. If the call system returns unstructured text or audio, you can't integrate it into your pipeline.
Problem
No machine-readable discovery means manual wiring
If a voice system doesn't publish an OpenAPI spec, MCP tools, or Agent Card, your AI can't discover what it can do. Every integration becomes a brittle, custom adapter built by a human.
How It Works
From orchestrator call to structured response.
Machine-callable mode is triggered by the caller type, not configuration. When WFW detects a machine caller, the entire session shifts to machine mode automatically.
AI orchestrator calls POST /v2/calls
Your LangChain agent, n8n node, or Claude Code task triggers a call via the REST API or MCP tool. The request includes the target number, agent ID, and any structured task context to pass in.
Dual-mode detection routes the session
WFW detects the machine caller via SIP header inspection and first-utterance pattern analysis. The agent switches to machine mode: structured JSON output, no conversational filler, typed error codes.
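The detection step above can be sketched as a simple classifier over SIP headers and the first utterance. This is an illustrative sketch only: the header names (`X-Caller-Type`), user-agent patterns, and utterance heuristics here are assumptions for the example, not WFW's actual detection logic.

```javascript
// Illustrative sketch of dual-mode caller detection.
// Header names and patterns are assumptions, not WFW's real implementation.
function classifyCaller(sipHeaders, firstUtterance) {
  // Machine orchestrators often announce themselves via a custom SIP
  // header or a recognizable User-Agent (assumed names).
  const ua = (sipHeaders["User-Agent"] || "").toLowerCase();
  if (sipHeaders["X-Caller-Type"] === "machine") return "machine";
  if (/langchain|mcp|bot|agent/.test(ua)) return "machine";

  // First-utterance pattern analysis: machine callers tend to open with
  // structured task statements rather than conversational greetings.
  if (/^\s*\{.*\}\s*$/.test(firstUtterance)) return "machine"; // raw JSON task
  if (/^task:/i.test(firstUtterance)) return "machine";

  return "human";
}

console.log(classifyCaller({ "X-Caller-Type": "machine" }, ""));     // "machine"
console.log(classifyCaller({}, "Hi, I'd like to book a cleaning.")); // "human"
```

Once a session is classified as machine, everything downstream (output format, error codes, filler suppression) keys off that single decision.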
Agent executes the task with full Tool Gateway access
The agent has access to the same Tool Gateway as human-facing calls: calendar lookup, booking creation, patient records, availability checks. The machine caller gets the same capability as a human caller.
Structured JSON response delivered in <500ms
Task result is returned as typed JSON — no hold, no disambiguation, no retry logic needed. Your orchestrator receives a parseable response it can immediately act upon, route on, or pass to the next node.
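To show what "route on" means in practice, here is a minimal orchestrator-side routing function. The `extractions` shape mirrors the example output later on this page; the `status` / `error_code` error envelope is an illustrative assumption about the typed error codes mentioned above.

```javascript
// Routing on a typed call result. The extractions shape follows the
// example on this page; the error envelope fields are assumptions.
function routeCallResult(result) {
  if (result.status === "failed") {
    // Typed error codes let the orchestrator branch without parsing prose.
    return { next: "retry_queue", reason: result.error_code };
  }
  const { intent, confirmed } = result.extractions;
  if (intent === "book_appointment" && confirmed) {
    return { next: "send_confirmation" };
  }
  return { next: "human_review" };
}

const ok = routeCallResult({
  status: "completed",
  extractions: { intent: "book_appointment", slot: "2026-04-18T10:00", confirmed: true },
});
console.log(ok.next); // "send_confirmation"
```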
Integration Examples
Three ways to call from your AI system.
REST API, JavaScript, or MCP — whichever fits your orchestration stack.
curl — Initiate a Call
curl -X POST \
  https://api.workforcewave.ai/v2/calls \
  -H "Authorization: Bearer $WFW_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "agentId": "agt_7rx9...",
    "toNumber": "+18435551234",
    "metadata": {
      "purpose": "appointment_confirmation"
    }
  }'
// Response
{
  "call_id": "call_4mn1...",
  "status": "initiated"
}

JavaScript — Await Result
const call = await wfw.calls.initiate({
  agent_id: "agt_7rx9...",
  to: "+18435551234",
  task: "check_availability",
  context: { date: "2026-04-18" }
});

// Poll or await webhook
const result = await wfw.calls
  .awaitCompletion(call.call_id);

// Typed result
console.log(result.extractions);
// { intent: "book_appointment",
//   slot: "2026-04-18T10:00",
//   confirmed: true }

MCP — Claude Code / Cursor
// In your MCP config:
{
  "mcpServers": {
    "wfw": {
      "url": "https://mcp.workforcewave.ai",
      "apiKey": "$WFW_API_KEY"
    }
  }
}

// Claude Code now has:
//   wfw_initiate_call
//   wfw_get_call_result
//   wfw_list_agents
//   wfw_check_availability
//   + 10 more tools

Real Scenarios
AI systems calling WFW in the wild.
These are real orchestration patterns using real AI frameworks.
A LangChain appointment-scheduling agent receives a patient chat request. It calls POST /v2/calls to check availability via the dental practice's WFW agent, gets a structured slot list in JSON, books the appointment, and returns confirmation — without any human in the loop.
An n8n workflow triggers on a new lead form submission. The n8n HTTP node calls /v2/calls to have the WFW agent call the lead immediately. When the call completes, the call.extractions_ready webhook fires and n8n picks up the lead score and intent for CRM routing.
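The webhook leg of that n8n pattern can be sketched as a small handler. Only the `call.extractions_ready` event name comes from this page; the payload field names (`lead_score`, `intent`) and the score threshold are illustrative assumptions.

```javascript
// Minimal handler for the call.extractions_ready webhook described above.
// Payload fields other than the event name are illustrative assumptions.
function handleWebhook(payload) {
  if (payload.event !== "call.extractions_ready") return null;
  const { lead_score, intent } = payload.extractions;
  // Route hot leads straight to sales follow-up; everything else to nurture.
  return {
    crmList: lead_score >= 80 ? "sales_followup" : "nurture",
    intent,
  };
}

const routed = handleWebhook({
  event: "call.extractions_ready",
  extractions: { lead_score: 91, intent: "request_quote" },
});
console.log(routed.crmList); // "sales_followup"
```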
Claude Code running a client task uses the WFW MCP server to invoke wfw_initiate_call. The call completes and returns structured JSON — appointment booked, confirmation sent. Claude Code continues the task without ever leaving its tool-use flow.
Machine-Readable Discovery
Your AI can discover WFW without a human writing the integration.
WFW publishes three machine-readable discovery documents. Any AI with web access can find these, understand capabilities, and start calling the API.
/.well-known/openapi.json — Full OpenAPI 3.1 spec. 40+ endpoints, typed request/response schemas.
/llms.txt — Plain-language API summary optimized for LLM consumption.
/.well-known/agent.json — A2A Agent Card: capability declaration for agent-to-agent protocols.
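Agent-to-agent discovery then reduces to fetching the card and reading its declared capabilities. A minimal sketch, with the caveat that the card fields shown (`name`, `skills`) are assumptions about an A2A-style document, not WFW's actual schema:

```javascript
// Fetch an Agent Card and list its declared skill IDs.
// Card fields are illustrative assumptions, not WFW's actual schema.
async function discoverCapabilities(baseUrl, fetchImpl = fetch) {
  const res = await fetchImpl(`${baseUrl}/.well-known/agent.json`);
  const card = await res.json();
  return (card.skills || []).map((skill) => skill.id);
}

// Usage with a stub fetch so the sketch runs offline:
const stubFetch = async () => ({
  json: async () => ({
    name: "WFW Voice Agent",
    skills: [{ id: "initiate_call" }, { id: "check_availability" }],
  }),
});

discoverCapabilities("https://api.workforcewave.ai", stubFetch)
  .then((ids) => console.log(ids)); // ["initiate_call", "check_availability"]
```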
Performance Benchmark
<500ms
Structured JSON response delivered to the machine caller — including task execution.
Dual-mode detection adds under 20ms to session setup. Tool calls (calendar lookup, booking creation) execute in parallel where possible. The entire machine-callable session is optimized for latency, not conversation flow.
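The parallel-execution point above is just structured concurrency: independent tool calls are awaited together, so session time is bounded by the slowest call rather than the sum. The tool functions below are stand-ins with simulated latencies, not the Tool Gateway API.

```javascript
// Illustrative latency sketch: independent tool calls run in parallel,
// so elapsed time tracks the slowest call, not the total.
// Tool functions are stand-ins, not WFW's Tool Gateway API.
const delay = (ms, value) => new Promise((resolve) => setTimeout(() => resolve(value), ms));

const calendarLookup = () => delay(120, { free: ["2026-04-18T10:00"] });
const patientLookup = () => delay(90, { id: "pat_001", verified: true });

async function executeTask() {
  const start = Date.now();
  // Both lookups are independent, so await them together.
  const [slots, patient] = await Promise.all([calendarLookup(), patientLookup()]);
  return { slots, patient, elapsedMs: Date.now() - start };
}

executeTask().then((r) => console.log(r.elapsedMs)); // ~120ms, not 120 + 90
```

Sequential awaits would cost the sum of the latencies; `Promise.all` keeps the tool phase inside the sub-500ms budget.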
Start Building
Give your AI agents a phone.
REST API, MCP server, and A2A Agent Card — three ways for your AI system to discover and call WFW voice agents. $50 free API credit to start.