# Scrapfly Documentation

## Table of Contents

### Dashboard

- [Intro](https://scrapfly.io/docs)
- [Project](https://scrapfly.io/docs/project)
- [Account](https://scrapfly.io/docs/account)
- [Workspace & Team](https://scrapfly.io/docs/workspace-and-team)
- [Billing](https://scrapfly.io/docs/billing)

### Products

#### MCP Server

- [Getting Started](https://scrapfly.io/docs/mcp/getting-started)
- [Tools & API Spec](https://scrapfly.io/docs/mcp/tools)
- [Authentication](https://scrapfly.io/docs/mcp/authentication)
- [Examples & Use Cases](https://scrapfly.io/docs/mcp/examples)
- [FAQ](https://scrapfly.io/docs/mcp/faq)

##### Integrations

- [Overview](https://scrapfly.io/docs/mcp/integrations)
- [Claude Desktop](https://scrapfly.io/docs/mcp/integrations/claude-desktop)
- [Claude Code](https://scrapfly.io/docs/mcp/integrations/claude-code)
- [ChatGPT](https://scrapfly.io/docs/mcp/integrations/chatgpt)
- [Cursor](https://scrapfly.io/docs/mcp/integrations/cursor)
- [Cline](https://scrapfly.io/docs/mcp/integrations/cline)
- [Windsurf](https://scrapfly.io/docs/mcp/integrations/windsurf)
- [Zed](https://scrapfly.io/docs/mcp/integrations/zed)
- [Roo Code](https://scrapfly.io/docs/mcp/integrations/roo-code)
- [VS Code](https://scrapfly.io/docs/mcp/integrations/vscode)
- [LangChain](https://scrapfly.io/docs/mcp/integrations/langchain)
- [LlamaIndex](https://scrapfly.io/docs/mcp/integrations/llamaindex)
- [CrewAI](https://scrapfly.io/docs/mcp/integrations/crewai)
- [OpenAI](https://scrapfly.io/docs/mcp/integrations/openai)
- [n8n](https://scrapfly.io/docs/mcp/integrations/n8n)
- [Make](https://scrapfly.io/docs/mcp/integrations/make)
- [Zapier](https://scrapfly.io/docs/mcp/integrations/zapier)
- [Vapi AI](https://scrapfly.io/docs/mcp/integrations/vapi)
- [Agent Builder](https://scrapfly.io/docs/mcp/integrations/agent-builder)
- [Custom Client](https://scrapfly.io/docs/mcp/integrations/custom-client)


#### Web Scraping API

- [Getting Started](https://scrapfly.io/docs/scrape-api/getting-started)
- [API Specification]()
- [Monitoring](https://scrapfly.io/docs/monitoring)
- [Customize Request](https://scrapfly.io/docs/scrape-api/custom)
- [Debug](https://scrapfly.io/docs/scrape-api/debug)
- [Anti Scraping Protection](https://scrapfly.io/docs/scrape-api/anti-scraping-protection)
- [Proxy](https://scrapfly.io/docs/scrape-api/proxy)
- [Proxy Mode](https://scrapfly.io/docs/scrape-api/proxy-mode)
- [Proxy Mode - Screaming Frog](https://scrapfly.io/docs/scrape-api/proxy-mode/screaming-frog)
- [Proxy Mode - Apify](https://scrapfly.io/docs/scrape-api/proxy-mode/apify)
- [(Auto) Data Extraction](https://scrapfly.io/docs/scrape-api/extraction)
- [Javascript Rendering](https://scrapfly.io/docs/scrape-api/javascript-rendering)
- [Javascript Scenario](https://scrapfly.io/docs/scrape-api/javascript-scenario)
- [SSL](https://scrapfly.io/docs/scrape-api/ssl)
- [DNS](https://scrapfly.io/docs/scrape-api/dns)
- [Cache](https://scrapfly.io/docs/scrape-api/cache)
- [Session](https://scrapfly.io/docs/scrape-api/session)
- [Webhook](https://scrapfly.io/docs/scrape-api/webhook)
- [Screenshot](https://scrapfly.io/docs/scrape-api/screenshot)
- [Errors](https://scrapfly.io/docs/scrape-api/errors)
- [Timeout](https://scrapfly.io/docs/scrape-api/understand-timeout)
- [Throttling](https://scrapfly.io/docs/throttling)
- [Troubleshoot](https://scrapfly.io/docs/scrape-api/troubleshoot)
- [Billing](https://scrapfly.io/docs/scrape-api/billing)
- [FAQ](https://scrapfly.io/docs/scrape-api/faq)

#### Crawler API

- [Getting Started](https://scrapfly.io/docs/crawler-api/getting-started)
- [API Specification]()
- [Retrieving Results](https://scrapfly.io/docs/crawler-api/results)
- [WARC Format](https://scrapfly.io/docs/crawler-api/warc-format)
- [Data Extraction](https://scrapfly.io/docs/crawler-api/extraction-rules)
- [Webhook](https://scrapfly.io/docs/crawler-api/webhook)
- [Billing](https://scrapfly.io/docs/crawler-api/billing)
- [Errors](https://scrapfly.io/docs/crawler-api/errors)
- [Troubleshoot](https://scrapfly.io/docs/crawler-api/troubleshoot)
- [FAQ](https://scrapfly.io/docs/crawler-api/faq)

#### Screenshot API

- [Getting Started](https://scrapfly.io/docs/screenshot-api/getting-started)
- [API Specification]()
- [Accessibility Testing](https://scrapfly.io/docs/screenshot-api/accessibility)
- [Webhook](https://scrapfly.io/docs/screenshot-api/webhook)
- [Billing](https://scrapfly.io/docs/screenshot-api/billing)
- [Errors](https://scrapfly.io/docs/screenshot-api/errors)

#### Extraction API

- [Getting Started](https://scrapfly.io/docs/extraction-api/getting-started)
- [API Specification]()
- [Rules Template](https://scrapfly.io/docs/extraction-api/rules-and-template)
- [LLM Extraction](https://scrapfly.io/docs/extraction-api/llm-prompt)
- [AI Auto Extraction](https://scrapfly.io/docs/extraction-api/automatic-ai)
- [Webhook](https://scrapfly.io/docs/extraction-api/webhook)
- [Billing](https://scrapfly.io/docs/extraction-api/billing)
- [Errors](https://scrapfly.io/docs/extraction-api/errors)
- [FAQ](https://scrapfly.io/docs/extraction-api/faq)

#### Proxy Saver

- [Getting Started](https://scrapfly.io/docs/proxy-saver/getting-started)
- [Fingerprints](https://scrapfly.io/docs/proxy-saver/fingerprints)
- [Optimizations](https://scrapfly.io/docs/proxy-saver/optimizations)
- [SSL Certificates](https://scrapfly.io/docs/proxy-saver/certificates)
- [Protocols](https://scrapfly.io/docs/proxy-saver/protocols)
- [Pacfile](https://scrapfly.io/docs/proxy-saver/pacfile)
- [Secure Credentials](https://scrapfly.io/docs/proxy-saver/security)
- [Billing](https://scrapfly.io/docs/proxy-saver/billing)

#### Cloud Browser API

- [Getting Started](https://scrapfly.io/docs/cloud-browser-api/getting-started)
- [Proxy & Geo-Targeting](https://scrapfly.io/docs/cloud-browser-api/proxy)
- [Unblock API](https://scrapfly.io/docs/cloud-browser-api/unblock)
- [File Downloads](https://scrapfly.io/docs/cloud-browser-api/file-downloads)
- [Session Resume](https://scrapfly.io/docs/cloud-browser-api/session-resume)
- [Human-in-the-Loop](https://scrapfly.io/docs/cloud-browser-api/human-in-the-loop)
- [Debug Mode](https://scrapfly.io/docs/cloud-browser-api/debug-mode)
- [Bring Your Own Proxy](https://scrapfly.io/docs/cloud-browser-api/bring-your-own-proxy)
- [Browser Extensions](https://scrapfly.io/docs/cloud-browser-api/extensions)

##### Integrations

- [Puppeteer](https://scrapfly.io/docs/cloud-browser-api/puppeteer)
- [Playwright](https://scrapfly.io/docs/cloud-browser-api/playwright)
- [Selenium](https://scrapfly.io/docs/cloud-browser-api/selenium)
- [Vercel Agent Browser](https://scrapfly.io/docs/cloud-browser-api/agent-browser)
- [Browser Use](https://scrapfly.io/docs/cloud-browser-api/browser-use)
- [Stagehand](https://scrapfly.io/docs/cloud-browser-api/stagehand)

- [Billing](https://scrapfly.io/docs/cloud-browser-api/billing)
- [Errors](https://scrapfly.io/docs/cloud-browser-api/errors)


### Tools

- [Antibot Detector](https://scrapfly.io/docs/tools/antibot-detector)

### SDK

- [Golang](https://scrapfly.io/docs/sdk/golang)
- [Python](https://scrapfly.io/docs/sdk/python)
- [TypeScript](https://scrapfly.io/docs/sdk/typescript)
- [Scrapy](https://scrapfly.io/docs/sdk/scrapy)

### Integrations

- [Getting Started](https://scrapfly.io/docs/integration/getting-started)
- [LangChain](https://scrapfly.io/docs/integration/langchain)
- [LlamaIndex](https://scrapfly.io/docs/integration/llamaindex)
- [CrewAI](https://scrapfly.io/docs/integration/crewai)
- [Zapier](https://scrapfly.io/docs/integration/zapier)
- [Make](https://scrapfly.io/docs/integration/make)
- [n8n](https://scrapfly.io/docs/integration/n8n)

### Academy

- [Overview](https://scrapfly.io/academy)
- [Web Scraping Overview](https://scrapfly.io/academy/scraping-overview)
- [Tools](https://scrapfly.io/academy/tools-overview)
- [Reverse Engineering](https://scrapfly.io/academy/reverse-engineering)
- [Static Scraping](https://scrapfly.io/academy/static-scraping)
- [HTML Parsing](https://scrapfly.io/academy/html-parsing)
- [Dynamic Scraping](https://scrapfly.io/academy/dynamic-scraping)
- [Hidden API Scraping](https://scrapfly.io/academy/hidden-api-scraping)
- [Headless Browsers](https://scrapfly.io/academy/headless-browsers)
- [Hidden Web Data](https://scrapfly.io/academy/hidden-web-data)
- [JSON Parsing](https://scrapfly.io/academy/json-parsing)
- [Data Processing](https://scrapfly.io/academy/data-processing)
- [Scaling](https://scrapfly.io/academy/scaling)
- [Walkthrough Summary](https://scrapfly.io/academy/walkthrough-summary)
- [Scraper Blocking](https://scrapfly.io/academy/scraper-blocking)
- [Proxies](https://scrapfly.io/academy/proxies)

---

# Monitoring


Scrapfly offers a detailed real-time monitoring dashboard that logs every scrape request and its result. The dashboard tracks all scrape requests for the selected Scrapfly project and environment, and can be filtered and inspected to review overall scraping performance.

[See Your Monitoring Dashboard](https://scrapfly.io/dashboard/monitoring)

> **Logs Retention**
>
> | FREE | DISCOVERY | PRO | STARTUP | ENTERPRISE |
> |---|---|---|---|---|
> | 1 week | 1 week | 2 weeks | 3 weeks | 4 weeks |
>
> [Screenshots](https://scrapfly.io/docs/scrape-api/screenshot), [Debug](https://scrapfly.io/docs/scrape-api/debug), and [Cache](https://scrapfly.io/docs/scrape-api/cache) data belong to the log and inherit the same retention: as soon as the log is deleted, they are deleted as well.

## Filters

Filters let you sample or search logs in the monitoring section to investigate issues or verify that everything works as expected.

### Time Frame

Eight pre-configured time frames are available: past month, past week, past day, past three hours, past hour, past 30 minutes, past 15 minutes, and past 5 minutes. You can also define an arbitrary time frame.

### Dimensions

You can filter the following values:

| Name | Value | Supports Multiple | Description |
|---|---|---|---|
| url | **string** | Yes | Filter by URL; supports the glob operator, **e.g.** `https://*.wikipedia.org/wiki/*` |
| success | **bool** `true` or `false` | No | Filter for successful or failed requests; failures include status codes >= 500 and network errors |
| domain | **string** | Yes | Filter by domain (TLD+n), including subdomains; supports the glob operator, **e.g.** `*.google.com` |
| root\_domain | **string** | Yes | Filter by root domain (TLD+1), excluding subdomains; supports the glob operator, **e.g.** `goo*.com` |
| method | **string** | Yes | Filter by HTTP method; supported values: GET, PUT, POST, PATCH |
| status\_code | **int** | Yes | Filter by status code, **e.g.** `401` or `502,504`; wildcards are also supported, **e.g.** `2**` |
| cost | **int** | No | Filter by the amount of API Credits spent |
| origin | **string** | Yes | Filter by origin; supported values: API, SCHEDULER |
| retries | **int** | No | Filter by retry count |
| duration | **int** | No | Filter by request duration |
| error\_code | **string** | Yes | Filter by [error codes](https://scrapfly.io/docs/scrape-api/errors), **e.g.** `ERR::SCRAPE::SCENARIO_EXECUTION` |
| errored | **bool** `true` or `false` | No | Filter for scrapes that raised an error |

For filters that support multiple values, simply separate them with `,`, like `status_code=200,204`.

### Chaining Filters & Multiple Values

You can chain multiple filters by separating them with a space:

```
status_code=200 domain=*.wikipedia.org success=true duration>=5
```

You can filter on multiple values at once; in that case, the `OR` operator is applied:

```
status_code=401,500,502,503
```

## Operators

The following operators are supported:

- **`=`** Equal
- **`!`** Not equal
- **`>`** Greater than
- **`>=`** Greater or equal to
- **`<`** Lower than
- **`<=`** Lower or equal to
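
To make the filter grammar concrete, here is a minimal illustrative parser (not part of Scrapfly's tooling) that splits a filter expression into field, operator, and values, using the operators above and the comma notation for multiple values:

```python
# Operators listed above, longest first so ">=" matches before ">".
OPERATORS = [">=", "<=", "=", "!", ">", "<"]

def parse_filters(expression):
    """Parse a space-separated filter expression into
    (field, operator, [values]) triples."""
    parsed = []
    for token in expression.split():
        for op in OPERATORS:
            if op in token:
                field, _, raw = token.partition(op)
                # Comma-separated values are OR-ed together.
                parsed.append((field, op, raw.split(",")))
                break
    return parsed

print(parse_filters("status_code=200,204 success=true duration>=5"))
```

Each chained filter becomes one triple, so the expression above yields three of them, with `status_code` carrying both OR-ed values.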
 
## API

> This feature is only available on the ENTERPRISE plan.

With the monitoring API, you can query aggregate or domain-specific metrics.

- All dates are in UTC.
- JSON and MSGPACK response formats are available; use the `accept` header to choose between `application/json` and `application/msgpack`.
- Pass your API key via the `key=xxx` query parameter or the `Authorization: Bearer xxxx` header.
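
As an illustration, the authentication and content-negotiation options above can be wired up as follows; this is a standard-library sketch with a placeholder key, and the request is built but not sent:

```python
from urllib.request import Request

API_KEY = "YOUR_API_KEY"  # placeholder, not a real key

# Build an authenticated request: API key via the Authorization
# header, MSGPACK selected via the accept header.
request = Request(
    "https://api.scrapfly.io/scrape/monitoring/metrics",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "accept": "application/msgpack",
    },
)

# urllib.request.urlopen(request) would perform the actual call.
print(request.full_url)
```

Alternatively, append `?key=YOUR_API_KEY` to the URL and drop the `Authorization` header.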
 
#### GET `/scrape/monitoring/metrics`

Retrieve metrics for the current subscription period, with three aggregation levels: account, project, and targets (top 100).

```
https://api.scrapfly.io/scrape/monitoring/metrics?key=
```

**Parameters**

| Name | Description |
|---|---|
| format | `structured` (default) or `prometheus` |
| aggregation | Enable aggregations; possible values are `account` (default), `project`, and `target`. You can combine multiple aggregations at once, e.g. `account,project,target` |
| period | Mutually exclusive with `start` and `end`. Possible values: `last5m`, `last1h`, `last24h` (default), `last7d`, `subscription` (*uses the current subscription period*) |
| start | Format `Y-m-d H:i:s`, e.g. `2024-01-01 00:00:00`. Mutually exclusive with the `period` param; the `end` param must also be set |
| end | Format `Y-m-d H:i:s`, e.g. `2024-01-01 00:00:00`. Mutually exclusive with the `period` param; the `start` param must also be set |
| group\_subdomain | Enable or disable subdomain grouping when the `target` aggregation is requested; `true` (default) or `false`. *When using subdomain grouping, provide the root domain as input, e.g. `web-scraping.dev` and not `www.web-scraping.dev`* |
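
The parameter rules above (either `period`, or both `start` and `end` together) can be sketched as a small query-string builder. This is an illustrative helper, not an official client:

```python
from urllib.parse import urlencode

BASE = "https://api.scrapfly.io/scrape/monitoring/metrics"

def build_metrics_url(key, period=None, start=None, end=None,
                      aggregation="account", fmt="structured"):
    """Build the metrics URL, enforcing that `period` is mutually
    exclusive with the `start`/`end` pair."""
    if period and (start or end):
        raise ValueError("period is mutually exclusive with start/end")
    if (start is None) != (end is None):
        raise ValueError("start and end must be set together")
    params = {"key": key, "aggregation": aggregation, "format": fmt}
    if period:
        params["period"] = period
    elif start:
        params["start"] = start  # expected format: Y-m-d H:i:s
        params["end"] = end
    return f"{BASE}?{urlencode(params)}"

print(build_metrics_url("xxx", period="last7d"))
```

Passing both `period` and `start`/`end` raises, mirroring the mutual-exclusivity constraint documented in the table.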

#### GET `/scrape/monitoring/metrics/target`

Retrieve metrics and a time series for a given target.

```
https://api.scrapfly.io/scrape/monitoring/metrics/target?domain=httpbin.dev&key=&period=subscription
```

**Parameters**

| Name | Description |
|---|---|
| domain (required) | The target domain, e.g. `httpbin.dev` or `web-scraping.dev` |
| period | Mutually exclusive with `start` and `end`. Possible values: `last5m` (default), `last1h`, `last24h`, `last7d`, `subscription` (*uses the current subscription period*) |
| group\_subdomain | Enable or disable subdomain grouping; `true` (default) or `false`. *When using subdomain grouping, provide the root domain as input, e.g. `web-scraping.dev` and not `www.web-scraping.dev`* |
| start | Format `Y-m-d H:i:s`, e.g. `2024-01-01 00:00:00`. Mutually exclusive with the `period` param; the `end` param must also be set |
| end | Format `Y-m-d H:i:s`, e.g. `2024-01-01 00:00:00`. Mutually exclusive with the `period` param; the `start` param must also be set |
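
Putting the target endpoint together, a hypothetical helper might enforce the required `domain` parameter and expose subdomain grouping like this (again a standard-library sketch; the request is constructed but not sent):

```python
from urllib.parse import urlencode
from urllib.request import Request

def build_target_request(key, domain, period="last5m",
                         group_subdomain=True):
    """Build a request for per-target metrics. `domain` is required;
    with subdomain grouping enabled, pass the root domain
    (e.g. web-scraping.dev, not www.web-scraping.dev)."""
    if not domain:
        raise ValueError("domain is required")
    params = {
        "key": key,
        "domain": domain,
        "period": period,
        "group_subdomain": "true" if group_subdomain else "false",
    }
    url = ("https://api.scrapfly.io/scrape/monitoring/metrics/target?"
           + urlencode(params))
    # JSON response selected via the accept header.
    return Request(url, headers={"accept": "application/json"})

req = build_target_request("xxx", "web-scraping.dev", period="subscription")
print(req.full_url)
```

`urllib.request.urlopen` on the returned request would then fetch the metrics and time series for the domain.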