Firecrawl provides different proxy types to help you scrape websites with varying levels of complexity. The proxy type can be specified using the proxy parameter.

Proxy Types

Firecrawl supports three types of proxies:
  • basic: Proxies for scraping most sites. Fast and usually works.
  • stealth: Stealth proxies for scraping complex sites while maintaining privacy. Slower, but more reliable on certain sites.
  • auto: Firecrawl will automatically retry scraping with stealth proxies if the basic proxy fails. If the retry with stealth is successful, 5 credits will be billed for the scrape. If the first attempt with basic is successful, only the regular cost will be billed.
If you do not specify a proxy, Firecrawl will default to auto.
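Assuming a basic scrape bills the regular cost (1 credit here for illustration) and a stealth fallback bills 5 credits as described above, a rough back-of-the-envelope estimate of auto mode's credit usage looks like this (a sketch; `estimate_auto_cost` is a hypothetical helper, not part of the SDK, and actual per-scrape costs may vary by plan):

```python
def estimate_auto_cost(num_pages: int, stealth_rate: float,
                       basic_cost: float = 1, stealth_cost: float = 5) -> float:
    """Estimate credits for auto mode: pages that succeed with the basic
    proxy bill basic_cost; pages that fall back to stealth bill
    stealth_cost. Illustrative numbers, not official pricing."""
    stealth_pages = num_pages * stealth_rate
    basic_pages = num_pages - stealth_pages
    return basic_pages * basic_cost + stealth_pages * stealth_cost

# e.g. 100 pages where roughly 10% need the stealth fallback
print(estimate_auto_cost(100, 0.10))  # 90*1 + 10*5 = 140.0
```

This also shows why auto is usually the cheapest default: you only pay the stealth premium on the fraction of pages that actually get blocked.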

Using Stealth Mode

When scraping complex websites, you can use stealth mode to improve your success rate while maintaining privacy.
from firecrawl import Firecrawl

firecrawl = Firecrawl(api_key='fc-YOUR-API-KEY')

# Choose proxy strategy: 'basic' | 'stealth' | 'auto'
doc = firecrawl.scrape('https://example.com', formats=['markdown'], proxy='auto')

print(doc.warning or 'ok')
Note: Stealth proxy requests cost 5 credits each.

Using Stealth as a Retry Mechanism

A common pattern is to first try scraping with the default proxy settings, and then retry with stealth mode if you encounter specific error status codes (401, 403, or 500) in the metadata.statusCode field of the response. These status codes often indicate that the website is blocking your request.
# pip install firecrawl-py

from firecrawl import Firecrawl

firecrawl = Firecrawl(api_key="fc-YOUR-API-KEY")

# First try with the default proxy
try:
    doc = firecrawl.scrape("https://example.com")

    # Check whether the site returned a blocking status code
    status_code = doc.metadata.status_code if doc.metadata else None
    if status_code in (401, 403, 500):
        print(f"Got status code {status_code}, retrying with stealth proxy")
        # Retry with the stealth proxy
        doc = firecrawl.scrape("https://example.com", proxy="stealth")

    print(doc.markdown)
except Exception as e:
    print(f"Error: {e}")
    # Retry with the stealth proxy if the first attempt raised
    try:
        doc = firecrawl.scrape("https://example.com", proxy="stealth")
        print(doc.markdown)
    except Exception as retry_error:
        print(f"Stealth proxy also failed: {retry_error}")
This approach allows you to optimize your credit usage by only using stealth mode when necessary.
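If you apply this pattern to many URLs, it can be factored into a small reusable helper. The sketch below is illustrative, not part of the SDK: `scrape_with_stealth_fallback` and the injected `scrape_fn` (which stands in for `firecrawl.scrape` so the logic can be exercised without network access) are hypothetical names.

```python
# Status codes that typically indicate the site blocked the request
BLOCKED_STATUS_CODES = {401, 403, 500}

def scrape_with_stealth_fallback(scrape_fn, url: str):
    """Try a basic scrape first; retry with the stealth proxy if the
    response carries a blocking status code or the first call raises.

    scrape_fn is any callable with the shape scrape_fn(url, proxy=...),
    e.g. firecrawl.scrape (hypothetical wiring, shown for illustration).
    """
    try:
        doc = scrape_fn(url)
        # Tolerate responses without metadata
        status = getattr(getattr(doc, "metadata", None), "status_code", None)
        if status not in BLOCKED_STATUS_CODES:
            return doc
    except Exception:
        pass  # fall through to the stealth retry
    # Second attempt; any exception here propagates to the caller
    return scrape_fn(url, proxy="stealth")
```

Injecting the scrape function keeps the retry policy separate from the SDK call, which makes the fallback logic easy to unit-test and to tune (for example, adjusting which status codes trigger a retry).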