Go SDK

The Go SDK is the easiest way to access the Scrapfly API in Go (Golang).

It provides a client that streamlines the scraping process by:

  • Handling common errors
  • Automatically encoding and decoding sensitive API parameters
  • Handling and simplifying concurrency
  • Providing an HTML selector engine via goquery

For more on using the Go SDK with Scrapfly, select the "Go SDK" option in the top bar of the Scrapfly docs.

Step by Step Introduction

For a hands-on introduction and example projects, see our Scrapfly SDK introduction page!

Installation

Install the Go SDK using go get:
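
The module path below is an assumption; check the SDK repository for the exact import path.

    go get github.com/scrapfly/go-scrapfly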

Quick Use

Here's a quick preview of what the Go SDK can do:
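
The following is a minimal sketch rather than the definitive API: the module path, the NewClient constructor, the Content field, and the error-returning signatures are assumptions based on the description below.

    package main

    import (
        "fmt"
        "log"

        scrapfly "github.com/scrapfly/go-scrapfly" // assumed module path
    )

    func main() {
        // Create a client with your Scrapfly API key (constructor name assumed).
        client, err := scrapfly.NewClient("YOUR_SCRAPFLY_KEY")
        if err != nil {
            log.Fatal(err)
        }

        // Issue a scrape request described by a ScrapeConfig.
        result, err := client.Scrape(scrapfly.ScrapeConfig{
            URL: "https://httpbin.dev/html",
        })
        if err != nil {
            log.Fatal(err)
        }

        // Result data such as the scraped page HTML (field name assumed).
        fmt.Println(result.Content)

        // Convenience goquery selector for further parsing (return signature assumed).
        doc, err := result.Selector()
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(doc.Find("h1").Text())
    }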

In short, we first create a scrapfly.Client with our Scrapfly key. Then, we use client.Scrape() with a ScrapeConfig to issue our scraping commands.

The returned ScrapeResult contains result data (like page HTML), request metadata and a convenience HTML selector via .Selector() for further parsing.

Configuring Scrape

The SDK supports all features of the Scrapfly API, which can be configured through the ScrapeConfig struct:
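
An illustrative sketch continuing from the client created above; it sticks to the options described on this page, and the Headers type is an assumption:

    config := scrapfly.ScrapeConfig{
        URL:      "https://web-scraping.dev/product/1",
        ASP:      true, // Anti Scraping Protection bypass
        RenderJS: true, // render the page in a headless browser
        Headers: map[string]string{ // header map type assumed
            "X-Custom-Header": "value",
        },
    }
    result, err := client.Scrape(config)
    if err != nil {
        log.Fatal(err)
    }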

When scraping websites protected against web scraping, make sure to enable the Anti Scraping Protection bypass with ASP: true.

For more on the available options, see the API specification, which the SDK mirrors where applicable.

Handling Result

The ScrapeResult object contains all data returned by the Scrapfly API, such as response data, API usage information, scrape metadata, and more:
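
Continuing from the Quick Use example; the Content field name is an assumption, while .Selector() is the helper mentioned above:

    // Scraped page body (field name assumed).
    fmt.Println(result.Content)

    // Dump the full result to inspect API usage information and scrape metadata.
    fmt.Printf("%+v\n", result)

    // Parse the HTML with the goquery-backed selector.
    doc, err := result.Selector()
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(doc.Find("title").Text())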

Concurrent Scraping

Use client.ConcurrentScrape() to scrape multiple targets concurrently, at your plan's concurrency limit or at a limit you provide:
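
Continuing from the client created above; the exact signature (a slice of configs in, a channel of results out) is an assumption here:

    configs := []scrapfly.ScrapeConfig{
        {URL: "https://httpbin.dev/html"},
        {URL: "https://httpbin.dev/json"},
        {URL: "https://httpbin.dev/xml"},
    }

    // Results arrive as each scrape completes, up to the concurrency limit.
    results, err := client.ConcurrentScrape(configs)
    if err != nil {
        log.Fatal(err)
    }
    for result := range results {
        fmt.Println(result.Content) // field name assumed
    }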

Getting Account Details

To access Scrapfly account information, use client.Account():
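
Continuing from the client created above; the account struct is printed generically since its exact fields are not shown on this page:

    account, err := client.Account()
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%+v\n", account)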

Examples

Custom Headers

Provide additional headers using the Headers field of ScrapeConfig. Note that when ASP: true is enabled, Scrapfly may automatically add extra headers to prevent scraper blocking.
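
For example (continuing from the client created above, with the Headers type assumed to be a map of header name to value):

    result, err := client.Scrape(scrapfly.ScrapeConfig{
        URL: "https://httpbin.dev/headers",
        Headers: map[string]string{
            "X-Custom-Header": "value",
            "Referer":         "https://example.com",
        },
    })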

Post Form

To post form data, set Method: "POST" and provide Data. By default, the request is sent as application/x-www-form-urlencoded.
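
For example (continuing from the client created above, with the Data type assumed to be a map of form fields):

    result, err := client.Scrape(scrapfly.ScrapeConfig{
        URL:    "https://httpbin.dev/post",
        Method: "POST",
        // Sent as application/x-www-form-urlencoded by default.
        Data: map[string]string{
            "username": "demo",
            "password": "secret",
        },
    })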

Post JSON

To post JSON data, set Headers["content-type"] = "application/json" and provide Data.
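
For example (continuing from the client created above; the Data type and the JSON serialization behavior are assumptions based on the description above):

    result, err := client.Scrape(scrapfly.ScrapeConfig{
        URL:    "https://httpbin.dev/post",
        Method: "POST",
        Headers: map[string]string{
            "content-type": "application/json",
        },
        // With a JSON content-type, the SDK is expected to serialize Data as JSON.
        Data: map[string]string{
            "query": "laptops",
        },
    })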

Javascript Rendering

To render pages with headless browsers using the Javascript Rendering feature, set RenderJS: true in ScrapeConfig:
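
For example (continuing from the client created above):

    result, err := client.Scrape(scrapfly.ScrapeConfig{
        URL:      "https://web-scraping.dev/product/1",
        RenderJS: true, // render the page in a headless browser before returning it
    })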

Javascript Scenario

To execute a Javascript Scenario, use JSScenario in ScrapeConfig and enable RenderJS:
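
A sketch continuing from the client created above; the action objects mirror the HTTP API's scenario format, and the Go type used for JSScenario here is an assumption:

    result, err := client.Scrape(scrapfly.ScrapeConfig{
        URL:      "https://web-scraping.dev/login",
        RenderJS: true,
        JSScenario: []map[string]any{
            {"fill": map[string]any{"selector": "input[name=username]", "value": "user"}},
            {"fill": map[string]any{"selector": "input[name=password]", "value": "password"}},
            {"click": map[string]any{"selector": "button[type=submit]"}},
        },
    })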

Capturing Screenshots

To capture screenshots, set RenderJS: true and provide Screenshots in ScrapeConfig:
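
A sketch continuing from the client created above; Screenshots is assumed to map a screenshot name to what to capture ("fullpage" or a CSS selector), matching the HTTP API parameter:

    result, err := client.Scrape(scrapfly.ScrapeConfig{
        URL:      "https://web-scraping.dev/product/1",
        RenderJS: true,
        Screenshots: map[string]string{
            "everything": "fullpage",
        },
    })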

Scraping Binary Data

Binary data is returned base64 encoded. Decode it with encoding/base64:
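
A sketch using the standard library encoding/base64 and os packages; the Content field name on ScrapeResult is an assumption:

    // Continuing from a scrape of a binary resource (e.g. an image).
    raw, err := base64.StdEncoding.DecodeString(result.Content)
    if err != nil {
        log.Fatal(err)
    }
    if err := os.WriteFile("product.png", raw, 0o644); err != nil {
        log.Fatal(err)
    }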

Full Documentation

For full documentation of the Go SDK, see the Go SDK documentation.

Summary