# Scrapfly Documentation

## Table of Contents

### Dashboard

- [Intro](https://scrapfly.io/docs)
- [Project](https://scrapfly.io/docs/project)
- [Account](https://scrapfly.io/docs/account)
- [Workspace & Team](https://scrapfly.io/docs/workspace-and-team)
- [Billing](https://scrapfly.io/docs/billing)

### Products

#### MCP Server

- [Getting Started](https://scrapfly.io/docs/mcp/getting-started)
- [Tools & API Spec](https://scrapfly.io/docs/mcp/tools)
- [Authentication](https://scrapfly.io/docs/mcp/authentication)
- [Examples & Use Cases](https://scrapfly.io/docs/mcp/examples)
- [FAQ](https://scrapfly.io/docs/mcp/faq)
##### Integrations

- [Overview](https://scrapfly.io/docs/mcp/integrations)
- [Claude Desktop](https://scrapfly.io/docs/mcp/integrations/claude-desktop)
- [Claude Code](https://scrapfly.io/docs/mcp/integrations/claude-code)
- [ChatGPT](https://scrapfly.io/docs/mcp/integrations/chatgpt)
- [Cursor](https://scrapfly.io/docs/mcp/integrations/cursor)
- [Cline](https://scrapfly.io/docs/mcp/integrations/cline)
- [Windsurf](https://scrapfly.io/docs/mcp/integrations/windsurf)
- [Zed](https://scrapfly.io/docs/mcp/integrations/zed)
- [Roo Code](https://scrapfly.io/docs/mcp/integrations/roo-code)
- [VS Code](https://scrapfly.io/docs/mcp/integrations/vscode)
- [LangChain](https://scrapfly.io/docs/mcp/integrations/langchain)
- [LlamaIndex](https://scrapfly.io/docs/mcp/integrations/llamaindex)
- [CrewAI](https://scrapfly.io/docs/mcp/integrations/crewai)
- [OpenAI](https://scrapfly.io/docs/mcp/integrations/openai)
- [n8n](https://scrapfly.io/docs/mcp/integrations/n8n)
- [Make](https://scrapfly.io/docs/mcp/integrations/make)
- [Zapier](https://scrapfly.io/docs/mcp/integrations/zapier)
- [Vapi AI](https://scrapfly.io/docs/mcp/integrations/vapi)
- [Agent Builder](https://scrapfly.io/docs/mcp/integrations/agent-builder)
- [Custom Client](https://scrapfly.io/docs/mcp/integrations/custom-client)


#### Web Scraping API

- [Getting Started](https://scrapfly.io/docs/scrape-api/getting-started)
- [API Specification]()
- [Monitoring](https://scrapfly.io/docs/monitoring)
- [Customize Request](https://scrapfly.io/docs/scrape-api/custom)
- [Debug](https://scrapfly.io/docs/scrape-api/debug)
- [Anti Scraping Protection](https://scrapfly.io/docs/scrape-api/anti-scraping-protection)
- [Proxy](https://scrapfly.io/docs/scrape-api/proxy)
- [Proxy Mode](https://scrapfly.io/docs/scrape-api/proxy-mode)
- [Proxy Mode - Screaming Frog](https://scrapfly.io/docs/scrape-api/proxy-mode/screaming-frog)
- [Proxy Mode - Apify](https://scrapfly.io/docs/scrape-api/proxy-mode/apify)
- [(Auto) Data Extraction](https://scrapfly.io/docs/scrape-api/extraction)
- [Javascript Rendering](https://scrapfly.io/docs/scrape-api/javascript-rendering)
- [Javascript Scenario](https://scrapfly.io/docs/scrape-api/javascript-scenario)
- [SSL](https://scrapfly.io/docs/scrape-api/ssl)
- [DNS](https://scrapfly.io/docs/scrape-api/dns)
- [Cache](https://scrapfly.io/docs/scrape-api/cache)
- [Batch (Multi-URL Scraping)](https://scrapfly.io/docs/scrape-api/batch)
- [Session](https://scrapfly.io/docs/scrape-api/session)
- [Webhook](https://scrapfly.io/docs/scrape-api/webhook)
- [Schedule](https://scrapfly.io/docs/scrape-api/schedule)
- [Screenshot](https://scrapfly.io/docs/scrape-api/screenshot)
- [Errors](https://scrapfly.io/docs/scrape-api/errors)
- [Timeout](https://scrapfly.io/docs/scrape-api/understand-timeout)
- [Throttling](https://scrapfly.io/docs/throttling)
- [Troubleshoot](https://scrapfly.io/docs/scrape-api/troubleshoot)
- [Billing](https://scrapfly.io/docs/scrape-api/billing)
- [FAQ](https://scrapfly.io/docs/scrape-api/faq)

#### Crawler API

- [Getting Started](https://scrapfly.io/docs/crawler-api/getting-started)
- [API Specification]()
- [Retrieving Results](https://scrapfly.io/docs/crawler-api/results)
- [WARC Format](https://scrapfly.io/docs/crawler-api/warc-format)
- [Data Extraction](https://scrapfly.io/docs/crawler-api/extraction-rules)
- [Webhook](https://scrapfly.io/docs/crawler-api/webhook)
- [Schedule](https://scrapfly.io/docs/crawler-api/schedule)
- [Billing](https://scrapfly.io/docs/crawler-api/billing)
- [Errors](https://scrapfly.io/docs/crawler-api/errors)
- [Troubleshoot](https://scrapfly.io/docs/crawler-api/troubleshoot)
- [FAQ](https://scrapfly.io/docs/crawler-api/faq)

#### Screenshot API

- [Getting Started](https://scrapfly.io/docs/screenshot-api/getting-started)
- [API Specification]()
- [Accessibility Testing](https://scrapfly.io/docs/screenshot-api/accessibility)
- [Webhook](https://scrapfly.io/docs/screenshot-api/webhook)
- [Schedule](https://scrapfly.io/docs/screenshot-api/schedule)
- [Billing](https://scrapfly.io/docs/screenshot-api/billing)
- [Errors](https://scrapfly.io/docs/screenshot-api/errors)

#### Extraction API

- [Getting Started](https://scrapfly.io/docs/extraction-api/getting-started)
- [API Specification]()
- [Rules Template](https://scrapfly.io/docs/extraction-api/rules-and-template)
- [LLM Extraction](https://scrapfly.io/docs/extraction-api/llm-prompt)
- [AI Auto Extraction](https://scrapfly.io/docs/extraction-api/automatic-ai)
- [Webhook](https://scrapfly.io/docs/extraction-api/webhook)
- [Billing](https://scrapfly.io/docs/extraction-api/billing)
- [Errors](https://scrapfly.io/docs/extraction-api/errors)
- [FAQ](https://scrapfly.io/docs/extraction-api/faq)

#### Proxy Saver

- [Getting Started](https://scrapfly.io/docs/proxy-saver/getting-started)
- [Fingerprints](https://scrapfly.io/docs/proxy-saver/fingerprints)
- [Optimizations](https://scrapfly.io/docs/proxy-saver/optimizations)
- [SSL Certificates](https://scrapfly.io/docs/proxy-saver/certificates)
- [Protocols](https://scrapfly.io/docs/proxy-saver/protocols)
- [Pacfile](https://scrapfly.io/docs/proxy-saver/pacfile)
- [Secure Credentials](https://scrapfly.io/docs/proxy-saver/security)
- [Billing](https://scrapfly.io/docs/proxy-saver/billing)

#### Cloud Browser API

- [Getting Started](https://scrapfly.io/docs/cloud-browser-api/getting-started)
- [Proxy & Geo-Targeting](https://scrapfly.io/docs/cloud-browser-api/proxy)
- [Unblock API](https://scrapfly.io/docs/cloud-browser-api/unblock)
- [Captcha Solver](https://scrapfly.io/docs/cloud-browser-api/captcha-solver)
- [File Downloads](https://scrapfly.io/docs/cloud-browser-api/file-downloads)
- [Session Resume](https://scrapfly.io/docs/cloud-browser-api/session-resume)
- [Human-in-the-Loop](https://scrapfly.io/docs/cloud-browser-api/human-in-the-loop)
- [Debug Mode](https://scrapfly.io/docs/cloud-browser-api/debug-mode)
- [Bring Your Own Proxy](https://scrapfly.io/docs/cloud-browser-api/bring-your-own-proxy)
- [Browser Extensions](https://scrapfly.io/docs/cloud-browser-api/extensions)
- [Native Browser MCP](https://scrapfly.io/docs/cloud-browser-api/mcp)
- [DevTools Protocol](https://scrapfly.io/docs/cloud-browser-api/cdp-reference)
##### Integrations

- [Puppeteer](https://scrapfly.io/docs/cloud-browser-api/puppeteer)
- [Playwright](https://scrapfly.io/docs/cloud-browser-api/playwright)
- [Selenium](https://scrapfly.io/docs/cloud-browser-api/selenium)
- [Vercel Agent Browser](https://scrapfly.io/docs/cloud-browser-api/agent-browser)
- [Browser Use](https://scrapfly.io/docs/cloud-browser-api/browser-use)
- [Stagehand](https://scrapfly.io/docs/cloud-browser-api/stagehand)

- [Billing](https://scrapfly.io/docs/cloud-browser-api/billing)
- [Errors](https://scrapfly.io/docs/cloud-browser-api/errors)


### Tools

- [Antibot Detector](https://scrapfly.io/docs/tools/antibot-detector)

### SDK

- [Golang](https://scrapfly.io/docs/sdk/golang)
- [Python](https://scrapfly.io/docs/sdk/python)
- [Rust](https://scrapfly.io/docs/sdk/rust)
- [TypeScript](https://scrapfly.io/docs/sdk/typescript)
- [Scrapy](https://scrapfly.io/docs/sdk/scrapy)

### Integrations

- [Getting Started](https://scrapfly.io/docs/integration/getting-started)
- [LangChain](https://scrapfly.io/docs/integration/langchain)
- [LlamaIndex](https://scrapfly.io/docs/integration/llamaindex)
- [CrewAI](https://scrapfly.io/docs/integration/crewai)
- [Zapier](https://scrapfly.io/docs/integration/zapier)
- [Make](https://scrapfly.io/docs/integration/make)
- [n8n](https://scrapfly.io/docs/integration/n8n)

### Academy

- [Overview](https://scrapfly.io/academy)
- [Web Scraping Overview](https://scrapfly.io/academy/scraping-overview)
- [Tools](https://scrapfly.io/academy/tools-overview)
- [Reverse Engineering](https://scrapfly.io/academy/reverse-engineering)
- [Static Scraping](https://scrapfly.io/academy/static-scraping)
- [HTML Parsing](https://scrapfly.io/academy/html-parsing)
- [Dynamic Scraping](https://scrapfly.io/academy/dynamic-scraping)
- [Hidden API Scraping](https://scrapfly.io/academy/hidden-api-scraping)
- [Headless Browsers](https://scrapfly.io/academy/headless-browsers)
- [Hidden Web Data](https://scrapfly.io/academy/hidden-web-data)
- [JSON Parsing](https://scrapfly.io/academy/json-parsing)
- [Data Processing](https://scrapfly.io/academy/data-processing)
- [Scaling](https://scrapfly.io/academy/scaling)
- [Walkthrough Summary](https://scrapfly.io/academy/walkthrough-summary)
- [Scraper Blocking](https://scrapfly.io/academy/scraper-blocking)
- [Proxies](https://scrapfly.io/academy/proxies)

---

# Schedule

 Schedule recurring screenshot captures. Each schedule pairs a [screenshot configuration](https://scrapfly.io/docs/screenshot-api/getting-started) with a recurrence rule and a [webhook](https://scrapfly.io/docs/screenshot-api/webhook); every fire produces an image and the result is delivered to your endpoint asynchronously.

 Use schedules to keep a daily snapshot of a landing page, monitor visual regressions against a list of URLs, or build a time-lapse archive of a website without standing up your own scheduler.

## Concepts

- **kind**: `api.screenshot`; the screenshot configuration is stored under `metadata.screenshot_config`. Every parameter accepted by `/screenshot` works inside a schedule.
- **recurrence**: when the schedule fires next. Either a 5-field cron expression or an interval+unit pair.
- **webhook**: the named webhook that will receive each capture's signed download URL and metadata. Required.
- **status**: `ACTIVE` (firing), `PAUSED` (skipped until resumed), or `CANCELLED` (terminal).
 
> **Webhook required; all times are UTC.** Schedules run in the background, so the captured image and metadata are published to a configured webhook rather than returned in the API response. Create a webhook from the [webhook dashboard](https://scrapfly.io/dashboard/webhook) before creating a schedule.
> 
>  Every date and cron expression on this page is evaluated in **UTC**. The scheduler does not support a per-schedule timezone. If you need a local-clock cadence, convert it to UTC when building the cron expression or the `scheduled_date`.
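
For example, a minimal sketch of converting a local-clock cadence to a UTC cron expression using Python's standard `zoneinfo` (the 08:00 Europe/Paris target is hypothetical; note that a fixed cron expression cannot track DST shifts, so pick the offset that matters to you or update the schedule when the clocks change):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical goal: fire at 08:00 Europe/Paris. The scheduler only
# understands UTC, so compute the equivalent UTC time for a sample date.
local = datetime(2026, 1, 15, 8, 0, tzinfo=ZoneInfo("Europe/Paris"))
utc = local.astimezone(ZoneInfo("UTC"))

# 5-field cron: minute hour day-of-month month day-of-week
cron = f"{utc.minute} {utc.hour} * * *"
print(cron)  # "0 7 * * *" (January: CET = UTC+1)
```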

## Create a schedule

 `POST /screenshot/schedules` creates a new schedule for the authenticated account. The body is your full `screenshot_config` plus a recurrence and a webhook name.

**Python**

```python
from scrapfly import ScrapflyClient, CreateScheduleRequest, ScheduleRecurrence

client = ScrapflyClient(key='')

sched = client.create_screenshot_schedule(
    {
        'url': 'https://web-scraping.dev',
        'format': 'png',
        'resolution': '1920x1080',
        'capture': 'fullpage',
    },
    CreateScheduleRequest(
        webhook_name='my-webhook',
        recurrence=ScheduleRecurrence(cron='0 8 * * *'),
        notes='Daily homepage screenshot',
    ),
)
print(sched['id'], sched['status'])
```

**TypeScript**

```typescript
import { ScrapflyClient } from 'scrapfly-sdk';

const client = new ScrapflyClient({ key: '' });

const sched = await client.createScreenshotSchedule(
  {
    url: 'https://web-scraping.dev',
    format: 'png',
    resolution: '1920x1080',
    capture: 'fullpage',
  },
  {
    webhook_name: 'my-webhook',
    recurrence: { cron: '0 8 * * *' },
    notes: 'Daily homepage screenshot',
  },
);
console.log(sched.id, sched.status);
```

**Go**

```go
client, _ := scrapfly.New("")

sched, err := client.CreateScreenshotSchedule(
    map[string]interface{}{
        "url":        "https://web-scraping.dev",
        "format":     "png",
        "resolution": "1920x1080",
        "capture":    "fullpage",
    },
    &scrapfly.CreateScheduleRequest{
        WebhookName: "my-webhook",
        Recurrence:  &scrapfly.ScheduleRecurrence{Cron: "0 8 * * *"},
        Notes:       "Daily homepage screenshot",
    },
)
if err != nil {
    log.Fatal(err)
}
fmt.Println(sched.ID, sched.Status)
```

**Rust**

```rust
let client = Client::builder().api_key("").build()?;

let mut cfg: HashMap<String, Value> = HashMap::new();
cfg.insert("url".into(), json!("https://web-scraping.dev"));
cfg.insert("format".into(), json!("png"));
cfg.insert("resolution".into(), json!("1920x1080"));
cfg.insert("capture".into(), json!("fullpage"));

let sched = client.create_screenshot_schedule(
    cfg,
    &CreateScheduleRequest {
        webhook_name: "my-webhook".into(),
        recurrence: Some(ScheduleRecurrence { cron: Some("0 8 * * *".into()), ..Default::default() }),
        notes: Some("Daily homepage screenshot".into()),
        ..Default::default()
    },
).await?;
```

**CLI**

```shell
scrapfly --api-key '' screenshot schedule create \
    --config-inline '{"url":"https://web-scraping.dev","format":"png","resolution":"1920x1080","capture":"fullpage"}' \
    --webhook my-webhook \
    --cron '0 8 * * *' \
    --notes 'Daily homepage screenshot'
```

**cURL**

```shell
curl -X POST 'https://api.scrapfly.io/screenshot/schedules?key=YOUR_API_KEY' \
    -H 'Content-Type: application/json' \
    -d '{
        "screenshot_config": {
            "url": "https://web-scraping.dev",
            "format": "png",
            "resolution": "1920x1080",
            "capture": "fullpage"
        },
        "webhook_name": "my-webhook",
        "recurrence": {
            "cron": "0 8 * * *"
        },
        "retry_on_failure": true,
        "max_retries": 3,
        "notes": "Daily homepage screenshot"
    }'
```

## Recurrence

The `recurrence` object accepts either of two shapes:

- **Cron mode** (`{ "cron": "0 8 * * *" }`): a 5-field cron expression evaluated in UTC. If both shapes are supplied, cron mode takes precedence.
- **Interval mode** (`{ "interval": 1, "unit": "day" }`): fixed-interval mode with units `minute`, `hour`, `day`, `week`, `month`.
 
 Both modes accept an optional `ends` object to bound the schedule: `{ "type": "date", "date": "2027-01-01T00:00:00Z" }` stops at a specific date, and `{ "type": "count", "count": 30 }` stops after a fixed number of fires.
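
For illustration, here are the two shapes as they would appear in a create request body (a sketch; field names are taken from this page, the values are arbitrary):

```python
import json

# Cron mode: fire at 08:00 UTC daily, stop after 30 fires.
cron_mode = {"cron": "0 8 * * *",
             "ends": {"type": "count", "count": 30}}

# Interval mode: fire once a day, stop at a fixed UTC date.
interval_mode = {"interval": 1, "unit": "day",
                 "ends": {"type": "date", "date": "2027-01-01T00:00:00Z"}}

body = {
    "screenshot_config": {"url": "https://web-scraping.dev", "format": "png"},
    "webhook_name": "my-webhook",
    "recurrence": cron_mode,  # swap in interval_mode for fixed-interval firing
}
print(json.dumps(body, indent=2))
```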

### scheduled\_date

 `scheduled_date` is the next time the schedule fires, in **UTC**. If you omit it, the schedule fires immediately and then follows the recurrence. To delay the first capture, set `scheduled_date` explicitly as an RFC3339 timestamp such as `2026-04-27T09:00:00Z` (the trailing `Z` declares UTC).
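
A minimal way to build such a timestamp with Python's standard library (the 24-hour delay is an arbitrary example):

```python
from datetime import datetime, timedelta, timezone

# First fire 24 hours from now, formatted as RFC3339 with an explicit UTC "Z".
first_fire = datetime.now(timezone.utc) + timedelta(hours=24)
scheduled_date = first_fire.strftime("%Y-%m-%dT%H:%M:%SZ")
print(scheduled_date)
```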

## List, get, update, delete

The collection and resource endpoints follow the standard REST shape:

- `GET /screenshot/schedules` lists every screenshot schedule on the account.
- `GET /screenshot/schedules/{id}` returns one schedule.
- `PATCH /screenshot/schedules/{id}` updates an active schedule (only supplied fields change; paused or cancelled schedules cannot be patched).
- `DELETE /screenshot/schedules/{id}` cancels a schedule. Returns `204 No Content`.
 
```shell
# List
curl 'https://api.scrapfly.io/screenshot/schedules?key=YOUR_API_KEY'

# Get one
curl 'https://api.scrapfly.io/screenshot/schedules/SCHEDULE_UUID?key=YOUR_API_KEY'

# Update
curl -X PATCH 'https://api.scrapfly.io/screenshot/schedules/SCHEDULE_UUID?key=YOUR_API_KEY' \
    -H 'Content-Type: application/json' \
    -d '{
        "recurrence": { "cron": "0 0 * * *" }
    }'

# Cancel
curl -X DELETE 'https://api.scrapfly.io/screenshot/schedules/SCHEDULE_UUID?key=YOUR_API_KEY'
```

 For a cross-product view (Web Scraping + Screenshot + Crawler schedules in one list), use `GET /schedules` instead.

## Pause, resume and execute now

 Pause stops future fires while preserving the schedule definition. Resume recomputes the next fire from the current time so missed ticks are not replayed. Execute now triggers an immediate capture on top of the regular schedule; it respects `allow_concurrency` and is rejected if the same schedule fired in the last five minutes (set `allow_concurrency=true` to bypass).

```shell
curl -X POST 'https://api.scrapfly.io/screenshot/schedules/SCHEDULE_UUID/pause?key=YOUR_API_KEY'
curl -X POST 'https://api.scrapfly.io/screenshot/schedules/SCHEDULE_UUID/resume?key=YOUR_API_KEY'
curl -X POST 'https://api.scrapfly.io/screenshot/schedules/SCHEDULE_UUID/execute?key=YOUR_API_KEY'
```
 

## Reliability

- **retry\_on\_failure**: when a capture fails the scheduler retries up to `max_retries` times before recording a failure.
- **allow\_concurrency**: when `false` (default), a fire is skipped if the previous capture is still running.
- **consecutive\_failures**: the response includes a counter of consecutive failed fires. After repeated failures the webhook is surfaced in the dashboard for review.
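
As an illustration of how `consecutive_failures` might drive housekeeping, here is a small stdlib-only sketch. It assumes the list endpoint returns schedule objects carrying the `status` and `consecutive_failures` fields described above; the threshold and helper names are hypothetical:

```python
import urllib.request

API = "https://api.scrapfly.io"
KEY = "YOUR_API_KEY"

def should_pause(schedule: dict, max_failures: int = 5) -> bool:
    # Only ACTIVE schedules fire; pause once the failure streak crosses the threshold.
    return (schedule.get("status") == "ACTIVE"
            and schedule.get("consecutive_failures", 0) >= max_failures)

def pause(schedule_id: str) -> None:
    # POST /screenshot/schedules/{id}/pause, as documented above.
    req = urllib.request.Request(
        f"{API}/screenshot/schedules/{schedule_id}/pause?key={KEY}", method="POST")
    urllib.request.urlopen(req)
```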
 
## Errors

 Schedule endpoints share a common error envelope. The full description and an example response for each code are in the [Errors section](https://scrapfly.io/docs/screenshot-api/errors#scheduler) of the documentation.

- [ERR::SCHEDULER::ALREADY\_CANCELLED](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::ALREADY_CANCELLED "Schedule is already cancelled and cannot be cancelled again.") - Schedule is already cancelled and cannot be cancelled again.
- [ERR::SCHEDULER::BACKEND\_ERROR](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::BACKEND_ERROR "The schedule operation could not be completed. Retry the request; if it persists, contact support.") - The schedule operation could not be completed. Retry the request; if it persists, contact support.
- [ERR::SCHEDULER::CANNOT\_MODIFY](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::CANNOT_MODIFY "Only ACTIVE schedules can be modified. Resume the schedule first (POST /schedules/:id/resume) or recreate it.") - Only ACTIVE schedules can be modified. Resume the schedule first (POST /schedules/:id/resume) or recreate it.
- [ERR::SCHEDULER::CONCURRENCY\_BLOCKED](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::CONCURRENCY_BLOCKED "An execute-now request was blocked because the same schedule fired within the last 5 minutes. Set allow_concurrency=true on the schedule to permit overlapping fires.") - An execute-now request was blocked because the same schedule fired within the last 5 minutes. Set allow\_concurrency=true on the schedule to permit overlapping fires.
- [ERR::SCHEDULER::CONFIG\_ERROR](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::CONFIG_ERROR "Schedule request rejected due to invalid configuration. Common causes: missing or empty webhook_name, webhook_name does not match a webhook configured on the project, malformed scrape_config / screenshot_config / crawler_config, or missing recurrence and scheduled_date.") - Schedule request rejected due to invalid configuration. Common causes: missing or empty webhook\_name, webhook\_name does not match a webhook configured on the project, malformed scrape\_config / screenshot\_config / crawler\_config, or missing recurrence and scheduled\_date.
- [ERR::SCHEDULER::CRAWLER\_NOT\_IMPLEMENTED](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::CRAWLER_NOT_IMPLEMENTED "Crawler schedule dispatch is not yet implemented in the worker. The fire was recorded but no crawl was started.") - Crawler schedule dispatch is not yet implemented in the worker. The fire was recorded but no crawl was started.
- [ERR::SCHEDULER::DISABLED](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::DISABLED "The targeted schedule has been disabled.") - The targeted schedule has been disabled.
- [ERR::SCHEDULER::NOT\_FOUND](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::NOT_FOUND "Schedule not found, or not owned by the authenticated account.") - Schedule not found, or not owned by the authenticated account.
- [ERR::SCHEDULER::QUOTA\_REACHED](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::QUOTA_REACHED "Your subscription's schedule quota is exhausted. Cancel an existing schedule or upgrade your plan to create more.") - Your subscription's schedule quota is exhausted. Cancel an existing schedule or upgrade your plan to create more.
- [ERR::SCHEDULER::WEBHOOK\_DISABLED](https://scrapfly.io/docs/screenshot-api/error/ERR::SCHEDULER::WEBHOOK_DISABLED "Schedule fire skipped because its linked webhook is disabled. The schedule has been auto-paused; re-enable the webhook to resume it.") - Schedule fire skipped because its linked webhook is disabled. The schedule has been auto-paused; re-enable the webhook to resume it.