Give your ElevenAgents voice and chat agents the ability to scrape, search, and crawl the web in real time using Firecrawl. This guide covers two integration paths:
- MCP server — connect the hosted Firecrawl MCP server for zero-code setup.
- Server webhook tool — point a custom tool at Firecrawl’s REST API for full control over requests.
Prerequisites
- A Firecrawl API key (available from the Firecrawl dashboard).
- An ElevenLabs account with access to the ElevenAgents dashboard.
Option 1: Firecrawl MCP Server
The fastest way to give an agent web access. ElevenAgents supports remote MCP servers, and Firecrawl provides a hosted MCP endpoint.
Add the MCP server
- Open the Integrations page in ElevenLabs and click + Add integration.
- Select Custom MCP Server from the integration library.
- Fill in the following fields:
| Field | Value |
|---|---|
| Name | Firecrawl |
| Description | Search, scrape, crawl, and extract content from any website. |
| Server type | Streamable HTTP |
| Server URL | https://mcp.firecrawl.dev/YOUR_FIRECRAWL_API_KEY/v2/mcp |
Replace YOUR_FIRECRAWL_API_KEY with your actual key. Leave the Type dropdown set to Value. Treat this URL as a secret — it contains your API key.
You must select Streamable HTTP as the server type. The default SSE option does not work with the Firecrawl MCP endpoint.
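If you assemble the server URL programmatically (for example, to keep the key in an environment variable rather than pasting it into configs), a minimal Python sketch — the helper name is illustrative, not part of any SDK:

```python
import os

def firecrawl_mcp_url(api_key: str) -> str:
    """Build the hosted Firecrawl MCP endpoint URL.

    The key is embedded in the URL path, so treat the result as a secret.
    """
    if not api_key:
        raise ValueError("FIRECRAWL_API_KEY is not set")
    return f"https://mcp.firecrawl.dev/{api_key}/v2/mcp"

# Read the key from the environment (placeholder fallback for illustration only)
url = firecrawl_mcp_url(os.environ.get("FIRECRAWL_API_KEY", "fc-example-key"))
```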
- Under Tool Approval Mode, choose an approval level:
  - No Approval — the agent uses tools freely. Fine for read-only scraping.
  - Fine-Grained Tool Approval — lets you pre-select which tools can run automatically and which require approval. Good for controlling expensive crawl operations.
  - Always Ask (default) — the agent requests permission before every tool call.
- Check I trust this server, then click Add Server.
ElevenLabs will connect to the server and list the available tools (scrape, search, crawl, map, and more).
Attach it to an agent
- Create or open an agent in the ElevenAgents dashboard.
- Go to the Tools tab, then select the MCP sub-tab.
- Click Add server and select the Firecrawl integration from the dropdown.
Update the system prompt
In the Agent tab, add instructions to the System prompt so the agent knows when to use Firecrawl. For example:
```
You are a helpful research assistant. When the user asks about a website,
a company, or any topic that requires up-to-date information, use the
Firecrawl tools to search the web or scrape the relevant page, then
summarize the results.
```
Test it
Click Preview in the top navigation bar. You can test using the text chat input or by starting a voice call. Try a prompt like:
“What does firecrawl.dev do? Go to the site and summarize it for me.”
The agent will call the Firecrawl MCP scrape tool, receive the page markdown, and respond with a summary.
Option 2: Server webhook tool
Use this approach when you need precise control over request parameters (formats, headers, timeouts, etc.) or want to call a specific Firecrawl endpoint without exposing the full MCP tool set.
Create a tool that scrapes a single URL and returns its content as markdown.
- Open your agent and go to the Tools tab.
- Click Add tool and select Webhook.
- Configure the tool:
| Field | Value |
|---|---|
| Name | scrape_website |
| Description | Scrape content from a URL and return it as clean markdown. |
| Method | POST |
| URL | https://api.firecrawl.dev/v2/scrape |
The Method field defaults to GET — make sure to change it to POST.
- Scroll to the Headers section and click Add header for authentication:
| Header | Value |
|---|---|
| Authorization | Bearer YOUR_FIRECRAWL_API_KEY |
Alternatively, if you have workspace auth connections configured, you can use the Authentication dropdown instead.
- Add a body parameter:
| Parameter | Type | Description | Required |
|---|---|---|---|
| url | string | The URL to scrape | Yes |
- Click Add tool.
The Firecrawl API returns the page content as markdown by default. The agent receives the JSON response and can use the markdown field to answer questions.
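For reference, the request this webhook tool sends (and the response field the agent reads) can be sketched in Python. The `data.markdown` shape reflects the default markdown output of the v2 scrape endpoint; the helper names are illustrative, not part of the Firecrawl SDK:

```python
API_URL = "https://api.firecrawl.dev/v2/scrape"

def build_scrape_request(url: str, api_key: str) -> tuple[dict, dict]:
    """Headers and JSON body matching the webhook tool configuration above."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {"url": url}
    return headers, body

def extract_markdown(response_json: dict) -> str:
    """Pull the markdown content out of a Firecrawl scrape response."""
    return response_json.get("data", {}).get("markdown", "")

# Abbreviated example of the response shape the agent receives:
sample = {"success": True, "data": {"markdown": "# Firecrawl\nTurn websites into LLM-ready data."}}
print(extract_markdown(sample))
```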
Create a tool that searches the web and returns results with scraped content.
- Click Add tool → Webhook again and configure:
| Field | Value |
|---|---|
| Name | search_web |
| Description | Search the web for a query and return relevant results with page content. |
| Method | POST |
| URL | https://api.firecrawl.dev/v2/search |
- Add the same Authorization header as above.
- Add body parameters:
| Parameter | Type | Description | Required |
|---|---|---|---|
| query | string | The search query | Yes |
| limit | number | Maximum number of results to return (default 5) | No |
- Click Add tool.
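As with the scrape tool, the JSON body this tool sends can be sketched in Python (helper name is illustrative; `limit` mirrors the optional parameter above):

```python
def build_search_body(query: str, limit: int = 5) -> dict:
    """JSON body for the search_web webhook tool.

    limit defaults to 5, matching the Firecrawl default noted in the
    parameter table above.
    """
    return {"query": query, "limit": limit}

# e.g. a query the agent might extract from a user turn:
body = build_search_body("latest Next.js features", limit=3)
```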
Update the system prompt
In the Agent tab, update the System prompt:
```
You are a knowledgeable assistant with access to web tools.
- Use `scrape_website` when the user gives you a specific URL to read.
- Use `search_web` when the user asks a general question that requires
  finding information online.
Always summarize the information concisely and cite the source URL.
```
Test it
Click Preview and try asking:
“Search for the latest Next.js features and give me a summary.”
The agent will call search_web, receive results from Firecrawl, and respond with a summary of the findings.
Tips
- Model selection — For reliable tool calling, use a high-intelligence model such as GPT-4o, Claude Sonnet 4.5 or later, or Gemini 2.5 Flash. Smaller models may struggle to extract the correct parameters.
- Keep prompts specific — Tell the agent exactly when to use each tool. Vague instructions lead to missed or incorrect tool calls.
- Limit response size — For voice agents, long scraped pages can overwhelm the LLM context. Use `onlyMainContent: true` in scrape options (or instruct the agent to summarize aggressively) to keep responses concise.
- Tool call sounds — In the webhook or MCP tool settings, you can configure a Tool call sound to play ambient audio while a tool runs. This signals to the user that the agent is working.
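The response-size tip can be sketched in Python: set `onlyMainContent` in the scrape body, and optionally clip the returned markdown before it reaches the agent (clipping would require proxying the response through your own server; both helper names are illustrative):

```python
def build_scrape_body(url: str, only_main_content: bool = True) -> dict:
    """Scrape request body trimmed for voice agents.

    onlyMainContent asks Firecrawl to strip nav/footer boilerplate so
    less text reaches the LLM context.
    """
    return {"url": url, "onlyMainContent": only_main_content}

def clip_markdown(markdown: str, max_chars: int = 4000) -> str:
    """Clip scraped markdown at a line boundary to bound context size."""
    if len(markdown) <= max_chars:
        return markdown
    # Drop the last partial line, then mark the truncation explicitly
    return markdown[:max_chars].rsplit("\n", 1)[0] + "\n\n[content truncated]"
```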
Resources