# Scrapfly Documentation

## Table of Contents

### Dashboard

- [Intro](https://scrapfly.io/docs)
- [Project](https://scrapfly.io/docs/project)
- [Account](https://scrapfly.io/docs/account)
- [Workspace & Team](https://scrapfly.io/docs/workspace-and-team)
- [Billing](https://scrapfly.io/docs/billing)

### Products

#### MCP Server

- [Getting Started](https://scrapfly.io/docs/mcp/getting-started)
- [Tools & API Spec](https://scrapfly.io/docs/mcp/tools)
- [Authentication](https://scrapfly.io/docs/mcp/authentication)
- [Examples & Use Cases](https://scrapfly.io/docs/mcp/examples)
- [FAQ](https://scrapfly.io/docs/mcp/faq)
##### Integrations

- [Overview](https://scrapfly.io/docs/mcp/integrations)
- [Claude Desktop](https://scrapfly.io/docs/mcp/integrations/claude-desktop)
- [Claude Code](https://scrapfly.io/docs/mcp/integrations/claude-code)
- [ChatGPT](https://scrapfly.io/docs/mcp/integrations/chatgpt)
- [Cursor](https://scrapfly.io/docs/mcp/integrations/cursor)
- [Cline](https://scrapfly.io/docs/mcp/integrations/cline)
- [Windsurf](https://scrapfly.io/docs/mcp/integrations/windsurf)
- [Zed](https://scrapfly.io/docs/mcp/integrations/zed)
- [Roo Code](https://scrapfly.io/docs/mcp/integrations/roo-code)
- [VS Code](https://scrapfly.io/docs/mcp/integrations/vscode)
- [LangChain](https://scrapfly.io/docs/mcp/integrations/langchain)
- [LlamaIndex](https://scrapfly.io/docs/mcp/integrations/llamaindex)
- [CrewAI](https://scrapfly.io/docs/mcp/integrations/crewai)
- [OpenAI](https://scrapfly.io/docs/mcp/integrations/openai)
- [n8n](https://scrapfly.io/docs/mcp/integrations/n8n)
- [Make](https://scrapfly.io/docs/mcp/integrations/make)
- [Zapier](https://scrapfly.io/docs/mcp/integrations/zapier)
- [Vapi AI](https://scrapfly.io/docs/mcp/integrations/vapi)
- [Agent Builder](https://scrapfly.io/docs/mcp/integrations/agent-builder)
- [Custom Client](https://scrapfly.io/docs/mcp/integrations/custom-client)


#### Web Scraping API

- [Getting Started](https://scrapfly.io/docs/scrape-api/getting-started)
- [API Specification]()
- [Monitoring](https://scrapfly.io/docs/monitoring)
- [Customize Request](https://scrapfly.io/docs/scrape-api/custom)
- [Debug](https://scrapfly.io/docs/scrape-api/debug)
- [Anti Scraping Protection](https://scrapfly.io/docs/scrape-api/anti-scraping-protection)
- [Proxy](https://scrapfly.io/docs/scrape-api/proxy)
- [Proxy Mode](https://scrapfly.io/docs/scrape-api/proxy-mode)
- [Proxy Mode - Screaming Frog](https://scrapfly.io/docs/scrape-api/proxy-mode/screaming-frog)
- [Proxy Mode - Apify](https://scrapfly.io/docs/scrape-api/proxy-mode/apify)
- [(Auto) Data Extraction](https://scrapfly.io/docs/scrape-api/extraction)
- [Javascript Rendering](https://scrapfly.io/docs/scrape-api/javascript-rendering)
- [Javascript Scenario](https://scrapfly.io/docs/scrape-api/javascript-scenario)
- [SSL](https://scrapfly.io/docs/scrape-api/ssl)
- [DNS](https://scrapfly.io/docs/scrape-api/dns)
- [Cache](https://scrapfly.io/docs/scrape-api/cache)
- [Session](https://scrapfly.io/docs/scrape-api/session)
- [Webhook](https://scrapfly.io/docs/scrape-api/webhook)
- [Screenshot](https://scrapfly.io/docs/scrape-api/screenshot)
- [Errors](https://scrapfly.io/docs/scrape-api/errors)
- [Timeout](https://scrapfly.io/docs/scrape-api/understand-timeout)
- [Throttling](https://scrapfly.io/docs/throttling)
- [Troubleshoot](https://scrapfly.io/docs/scrape-api/troubleshoot)
- [Billing](https://scrapfly.io/docs/scrape-api/billing)
- [FAQ](https://scrapfly.io/docs/scrape-api/faq)

#### Crawler API

- [Getting Started](https://scrapfly.io/docs/crawler-api/getting-started)
- [API Specification]()
- [Retrieving Results](https://scrapfly.io/docs/crawler-api/results)
- [WARC Format](https://scrapfly.io/docs/crawler-api/warc-format)
- [Data Extraction](https://scrapfly.io/docs/crawler-api/extraction-rules)
- [Webhook](https://scrapfly.io/docs/crawler-api/webhook)
- [Billing](https://scrapfly.io/docs/crawler-api/billing)
- [Errors](https://scrapfly.io/docs/crawler-api/errors)
- [Troubleshoot](https://scrapfly.io/docs/crawler-api/troubleshoot)
- [FAQ](https://scrapfly.io/docs/crawler-api/faq)

#### Screenshot API

- [Getting Started](https://scrapfly.io/docs/screenshot-api/getting-started)
- [API Specification]()
- [Accessibility Testing](https://scrapfly.io/docs/screenshot-api/accessibility)
- [Webhook](https://scrapfly.io/docs/screenshot-api/webhook)
- [Billing](https://scrapfly.io/docs/screenshot-api/billing)
- [Errors](https://scrapfly.io/docs/screenshot-api/errors)

#### Extraction API

- [Getting Started](https://scrapfly.io/docs/extraction-api/getting-started)
- [API Specification]()
- [Rules Template](https://scrapfly.io/docs/extraction-api/rules-and-template)
- [LLM Extraction](https://scrapfly.io/docs/extraction-api/llm-prompt)
- [AI Auto Extraction](https://scrapfly.io/docs/extraction-api/automatic-ai)
- [Webhook](https://scrapfly.io/docs/extraction-api/webhook)
- [Billing](https://scrapfly.io/docs/extraction-api/billing)
- [Errors](https://scrapfly.io/docs/extraction-api/errors)
- [FAQ](https://scrapfly.io/docs/extraction-api/faq)

#### Proxy Saver

- [Getting Started](https://scrapfly.io/docs/proxy-saver/getting-started)
- [Fingerprints](https://scrapfly.io/docs/proxy-saver/fingerprints)
- [Optimizations](https://scrapfly.io/docs/proxy-saver/optimizations)
- [SSL Certificates](https://scrapfly.io/docs/proxy-saver/certificates)
- [Protocols](https://scrapfly.io/docs/proxy-saver/protocols)
- [Pacfile](https://scrapfly.io/docs/proxy-saver/pacfile)
- [Secure Credentials](https://scrapfly.io/docs/proxy-saver/security)
- [Billing](https://scrapfly.io/docs/proxy-saver/billing)

#### Cloud Browser API

- [Getting Started](https://scrapfly.io/docs/cloud-browser-api/getting-started)
- [Proxy & Geo-Targeting](https://scrapfly.io/docs/cloud-browser-api/proxy)
- [Unblock API](https://scrapfly.io/docs/cloud-browser-api/unblock)
- [File Downloads](https://scrapfly.io/docs/cloud-browser-api/file-downloads)
- [Session Resume](https://scrapfly.io/docs/cloud-browser-api/session-resume)
- [Human-in-the-Loop](https://scrapfly.io/docs/cloud-browser-api/human-in-the-loop)
- [Debug Mode](https://scrapfly.io/docs/cloud-browser-api/debug-mode)
- [Bring Your Own Proxy](https://scrapfly.io/docs/cloud-browser-api/bring-your-own-proxy)
- [Browser Extensions](https://scrapfly.io/docs/cloud-browser-api/extensions)
##### Integrations

- [Puppeteer](https://scrapfly.io/docs/cloud-browser-api/puppeteer)
- [Playwright](https://scrapfly.io/docs/cloud-browser-api/playwright)
- [Selenium](https://scrapfly.io/docs/cloud-browser-api/selenium)
- [Vercel Agent Browser](https://scrapfly.io/docs/cloud-browser-api/agent-browser)
- [Browser Use](https://scrapfly.io/docs/cloud-browser-api/browser-use)
- [Stagehand](https://scrapfly.io/docs/cloud-browser-api/stagehand)

- [Billing](https://scrapfly.io/docs/cloud-browser-api/billing)
- [Errors](https://scrapfly.io/docs/cloud-browser-api/errors)


### Tools

- [Antibot Detector](https://scrapfly.io/docs/tools/antibot-detector)

### SDK

- [Golang](https://scrapfly.io/docs/sdk/golang)
- [Python](https://scrapfly.io/docs/sdk/python)
- [TypeScript](https://scrapfly.io/docs/sdk/typescript)
- [Scrapy](https://scrapfly.io/docs/sdk/scrapy)

### Integrations

- [Getting Started](https://scrapfly.io/docs/integration/getting-started)
- [LangChain](https://scrapfly.io/docs/integration/langchain)
- [LlamaIndex](https://scrapfly.io/docs/integration/llamaindex)
- [CrewAI](https://scrapfly.io/docs/integration/crewai)
- [Zapier](https://scrapfly.io/docs/integration/zapier)
- [Make](https://scrapfly.io/docs/integration/make)
- [n8n](https://scrapfly.io/docs/integration/n8n)

### Academy

- [Overview](https://scrapfly.io/academy)
- [Web Scraping Overview](https://scrapfly.io/academy/scraping-overview)
- [Tools](https://scrapfly.io/academy/tools-overview)
- [Reverse Engineering](https://scrapfly.io/academy/reverse-engineering)
- [Static Scraping](https://scrapfly.io/academy/static-scraping)
- [HTML Parsing](https://scrapfly.io/academy/html-parsing)
- [Dynamic Scraping](https://scrapfly.io/academy/dynamic-scraping)
- [Hidden API Scraping](https://scrapfly.io/academy/hidden-api-scraping)
- [Headless Browsers](https://scrapfly.io/academy/headless-browsers)
- [Hidden Web Data](https://scrapfly.io/academy/hidden-web-data)
- [JSON Parsing](https://scrapfly.io/academy/json-parsing)
- [Data Processing](https://scrapfly.io/academy/data-processing)
- [Scaling](https://scrapfly.io/academy/scaling)
- [Walkthrough Summary](https://scrapfly.io/academy/walkthrough-summary)
- [Scraper Blocking](https://scrapfly.io/academy/scraper-blocking)
- [Proxies](https://scrapfly.io/academy/proxies)

---

# Webhook

Scrapfly's [webhook](https://scrapfly.io/docs/screenshot-api/getting-started?language=node_js#api_param_webhook_name) feature is ideal for managing screenshot tasks asynchronously. When a webhook is specified through the `webhook_name` parameter, Scrapfly calls your HTTP endpoint with the screenshot response.

To start using webhooks, first create one through the [webhook web interface](https://scrapfly.io/dashboard/webhook).

*Webhook management page*

The body sent to your endpoint is the same as a regular Screenshot API response. For reconciliation, the `job_uuid` and `webhook_uuid` are included in the [response headers](#headers).

*Webhook status report on the monitoring log page*

> **Webhook Queue Size**
>
> The webhook queue size is the maximum number of webhooks that can be queued at once. After a screenshot completes and your application is notified, the queue count decreases. This allows you to schedule additional screenshots beyond the concurrency limit of your subscription; the scheduler handles this and ensures that your concurrency limit is respected.
>
> | **FREE** $0.00/mo | **DISCOVERY** $30.00/mo | **PRO** $100.00/mo | **STARTUP** $250.00/mo | **ENTERPRISE** $500.00/mo |
> |---|---|---|---|---|
> | 0 | 500 | 2,000 | 5,000 | 10,000 |

 [See in Your Dashboard](https://scrapfly.io/dashboard/webhook)

## Scope

Webhooks are scoped per Scrapfly [project](https://scrapfly.io/docs/project?language=node_js) and environment. Make sure to create a webhook for each of your projects and environments (test/live).

## Usage

> Webhooks can serve multiple purposes. In the context of the Screenshot API, to confirm that you received a screenshot, check that the `X-Scrapfly-Webhook-Resource-Type` header value is `screenshot`.

 To enable webhook callbacks, all you need to do is specify the `webhook_name` parameter in your screenshot requests. Then, Scrapfly will immediately return a promise response and call your webhook endpoint as soon as the screenshot is done.

Note that your webhook endpoint must respond with a `2xx` status code for the delivery to be considered a success. `3xx` redirect responses are followed, while `4xx` and `5xx` responses are considered failures and are retried according to the retry policy.

> The examples below assume you have a webhook named **my-webhook** registered. You can create one via the [web dashboard](https://scrapfly.io/dashboard/webhook).

```
curl -G \
--request "GET" \
--url "https://api.scrapfly.io/screenshot" \
--data-urlencode "key=$SCRAPFLY_API_KEY" \
--data-urlencode "webhook_name=my-webhook" \
--data-urlencode "url=https://web-scraping.dev/product/1"
```

#### Example Of Response

 ```
{
  "job_uuid": "a0e6f3e8-be35-438a-942a-be77aa545d30",
  "log_uuid": "a0e6f3e8-be35-438a-942a-be77aa545d30",
  "log_url": "https://api.scrapfly.io/scrape/log/a0e6f3e8-be35-438a-942a-be77aa545d30",
  "success": true,
  "webhook_enqueued": true,
  "webhook_name": "my-webhook",
  "webhook_queue_limit": 10000,
  "webhook_queued_element": 1,
  "webhook_uuid": "cdf37252-fea7-4267-a568-aa0e5964ee21"
}

```
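
The enqueue response can be validated in code before waiting for the callback. A minimal Python sketch (the field names come from the example response above; the error handling is an illustrative choice, not part of the API):

```python
import json

def parse_enqueue_response(body: str) -> dict:
    """Extract reconciliation fields from a Screenshot API enqueue response."""
    data = json.loads(body)
    if not data.get("webhook_enqueued"):
        # The webhook was not scheduled, e.g. because the queue is full
        raise RuntimeError(f"webhook not enqueued: {body}")
    return {
        # Match this against X-Scrapfly-Webhook-Job-Id when the callback arrives
        "job_uuid": data["job_uuid"],
        "webhook_uuid": data["webhook_uuid"],
        # Remaining capacity in the webhook queue
        "queue_slots_left": data["webhook_queue_limit"] - data["webhook_queued_element"],
    }
```

Persisting `job_uuid` at enqueue time lets your callback handler reconcile each delivery against the request that produced it.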


#### Tracking

 When you enqueue a screenshot request, you receive:

- `job_uuid` - A unique job identifier for reconciliation
- `log_uuid` - The log identifier for debugging and monitoring
- `log_url` - Direct link to the request log in Scrapfly dashboard
 
When your webhook is notified, the processed job id is available in the `X-Scrapfly-Webhook-Job-Id` response header so you can reconcile it in your system. The log headers `X-Scrapfly-Log-Uuid` and `X-Scrapfly-Log-Url` are also included in the enqueue response for immediate access to the request log.
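
A minimal receiver can be sketched with Python's standard library alone. The header names are those documented on this page; the lookup and storage logic is left as a placeholder:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_callback(headers: dict, body: bytes) -> tuple[int, str]:
    """Process one webhook delivery and return (status_code, job_id)."""
    # Webhooks can carry other resource types; only screenshots matter here
    if headers.get("X-Scrapfly-Webhook-Resource-Type") != "screenshot":
        return 200, ""  # acknowledge and ignore
    job_id = headers.get("X-Scrapfly-Webhook-Job-Id", "")
    # ... look up job_id among your pending jobs and persist `body` here ...
    return 200, job_id  # any 2xx marks the delivery as successful

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        status, _ = handle_callback(dict(self.headers), self.rfile.read(length))
        self.send_response(status)
        self.end_headers()

def serve(port: int = 8080) -> None:
    # Expose this port publicly (e.g. via ngrok) and point your webhook at it
    HTTPServer(("0.0.0.0", port), WebhookHandler).serve_forever()
```

Keeping `handle_callback` as a pure function makes the reconciliation logic easy to test without running a server.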

## Retry Policy

If Scrapfly can't notify the endpoint specified in your webhook settings, the callback is retried according to this schedule:

- 30 seconds
- 1 minute
- 5 minutes
- 30 minutes
- 1 hour
- 1 day
 
> If we fail to reach your application more than 100 times in a row, the webhook is automatically disabled and you are notified. You can re-enable it from the UI at any time.

## Development

Useful tools for local development:

- <https://webhook.site> - Collect and inspect webhook deliveries
- <https://ngrok.com> - Expose your local application to the internet through a secure tunnel
 
## Security

 Webhooks are signed using HMAC (Hash-based Message Authentication Code) with the SHA-256 algorithm to ensure the integrity of the webhook content and verify its authenticity. This mechanism helps prevent tampering and ensures that webhook payloads are from trusted sources.

#### HMAC Overview

 HMAC is a cryptographic technique that combines a secret key with a hash function (in this case, SHA-256) to produce a fixed-size hash value known as the HMAC digest. This digest is unique to both the original message and the secret key, providing a secure way to verify the integrity and authenticity of the message.

#### Signature in HTTP Header

 When sending a webhook request, a signature is generated using HMAC-SHA256 and included in the HTTP header `X-Scrapfly-Webhook-Signature`. This signature is computed based on the webhook payload and a secret key known only to the sender and receiver.

#### Verification Process

Upon receiving a webhook request, the receiver extracts the payload and computes its own HMAC-SHA256 signature using the same secret key. It then compares this computed signature with the one provided in the `X-Scrapfly-Webhook-Signature` header. If the two signatures match, the payload has not been tampered with and originates from the expected sender.

```
import hmac
import hashlib

# Example secret key (replace with your webhook's actual secret key)
secret_key = b'my_secret_key'

# Raw webhook payload as received (bytes, before any JSON parsing)
webhook_payload = b'{"data": "example"}'

# Compute the HMAC-SHA256 signature of the payload
computed_signature = hmac.new(secret_key, webhook_payload, hashlib.sha256).hexdigest()

# Signature extracted from the X-Scrapfly-Webhook-Signature header
received_signature = '...'

# Use a constant-time comparison to prevent timing attacks
if hmac.compare_digest(computed_signature, received_signature):
    print("Signature verification successful. Payload is authentic.")
else:
    print("Signature verification failed. Payload may have been tampered with.")
```

#### Security Considerations

- **Keep Secret Key Secure:** The secret key used for HMAC computation should be kept confidential and not exposed publicly.
- **Use HTTPS:** Webhook communication should be conducted over HTTPS to ensure data privacy and integrity during transit.
- **Regular Key Rotation:** Periodically rotate the secret key used for HMAC computation to enhance security.
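
Key rotation is easier if verification accepts a short overlap window during which both the outgoing and incoming secrets are valid. A sketch of that pattern (the dual-key window is an implementation choice on the receiver side, not a Scrapfly requirement):

```python
import hashlib
import hmac

def verify_signature(payload: bytes, received_signature: str, keys: list[bytes]) -> bool:
    """Return True if any currently-active secret key signs the payload."""
    for key in keys:
        expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
        # Constant-time comparison to avoid leaking timing information
        if hmac.compare_digest(expected, received_signature):
            return True
    return False
```

Keep the old key in the active list until deliveries signed with it (including pending retries) have drained, then drop it.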
 
## Headers

#### Enqueue Response Headers

When you enqueue a screenshot request with a webhook, the following headers are returned:

- `X-Scrapfly-Log-Uuid` : Unique log identifier for the request
- `X-Scrapfly-Log-Url` : Direct URL to view the request log in the dashboard
 
#### Webhook Callback Headers

When Scrapfly calls your webhook endpoint, the following headers are included:

- `X-Scrapfly-Webhook-Env` : Related environment where webhook is triggered
- `X-Scrapfly-Webhook-Project` : Related project name
- `X-Scrapfly-Webhook-Signature` : HMAC SHA-256 Integrity Signature
- `X-Scrapfly-Webhook-Name` : Name of the webhook
- `X-Scrapfly-Webhook-Resource-Type` : Resource type (e.g., `screenshot`)
- `X-Scrapfly-Webhook-Job-Id` : Unique Job Identifier given in the enqueue call
- `X-Scrapfly-Log-Uuid` : Unique log identifier for the request
- `X-Scrapfly-Log-Url` : Direct URL to view the request log in the dashboard
 
## Related Errors

All related errors are listed below. You can see the full description and an example of each error response in the [Errors section](https://scrapfly.io/docs/screenshot-api/errors#webhook) of the documentation.

- [ERR::WEBHOOK::DISABLED](https://scrapfly.io/docs/screenshot-api/error/ERR::WEBHOOK::DISABLED "The given webhook is disabled; check your webhook configuration for the current project / env") - The given webhook is disabled; check your webhook configuration for the current project / env
- [ERR::WEBHOOK::ENDPOINT\_UNREACHABLE](https://scrapfly.io/docs/screenshot-api/error/ERR::WEBHOOK::ENDPOINT_UNREACHABLE "We were not able to contact your endpoint") - We were not able to contact your endpoint
- [ERR::WEBHOOK::QUEUE\_FULL](https://scrapfly.io/docs/screenshot-api/error/ERR::WEBHOOK::QUEUE_FULL "You reached the limit of scheduled webhooks; you must wait until pending webhooks are processed") - You reached the limit of scheduled webhooks; you must wait until pending webhooks are processed
- [ERR::WEBHOOK::MAX\_RETRY](https://scrapfly.io/docs/screenshot-api/error/ERR::WEBHOOK::MAX_RETRY "Maximum retries exceeded on your webhook") - Maximum retries exceeded on your webhook
- [ERR::WEBHOOK::NOT\_FOUND](https://scrapfly.io/docs/screenshot-api/error/ERR::WEBHOOK::NOT_FOUND "Unable to find the given webhook for the current project / env") - Unable to find the given webhook for the current project / env
 
## Pricing

No additional fees are applied on usage.

## Integration

- [Python SDK Built-in server and example](https://scrapfly.io/docs/sdk/python#webhook_server)