How to Copy as cURL With Safari?

In this guide, we'll explain how to copy requests as cURL with Safari. We'll copy requests for the review data on web-scraping.dev. However, the same approach can be applied to other websites as well:
1. Go to the page URL where you want to copy the requests.
2. Open Safari's Web Inspector by pressing Option + Command + I. If the shortcut does nothing, first enable the Develop menu from Safari's Settings under the Advanced tab.
3. Select the Network tab from the top bar.
4. Empty the request log using the trash icon so only the desired request gets recorded.
5. Trigger the target request so it gets recorded. The required action differs based on the target website, such as:

  • Scrolling down.
  • Clicking on a specific link.
  • Clicking on the next pagination button.
  • Filtering the data using filter buttons.
  • Searching for specific data.

6. Filter the requests by the target request type: Document (HTML) or Fetch/XHR (JSON). You will find the recorded requests listed:

requests on safari developer tools

7. Identify the target request by clicking each candidate and reviewing its response.
8. Right-click on the request and select Copy as cURL:

copy request as cURL
Copy request as cURL
  • The request is now copied to the clipboard as a cURL command.
  • Optional: convert the cURL request into Python using the cURL to Python tool.
  • Optional: convert the request into ScrapFly API requests from the ScrapFly API player.
ScrapFly API player screenshot
Import cURL request into ScrapFly's API player
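For reference, a copied command typically looks like the following. This is a hypothetical example for the web-scraping.dev review data; the actual command Safari produces includes the exact URL, headers, and cookies of your browsing session:

```shell
# Hypothetical "Copy as cURL" output; the real endpoint path, headers,
# and cookies depend on the page and your session.
curl 'https://web-scraping.dev/api/reviews?page=1' \
  -H 'Accept: application/json' \
  -H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Safari/605.1.15' \
  -H 'Referer: https://web-scraping.dev/reviews'
```

Replaying this command in a terminal should return the same response the browser received, since it carries the same headers.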

We have explained how to convert cURL requests into Python. The same approach can be used to convert cURL into Node.js and other programming languages using their HTTP clients. For further details, refer to our dedicated guide on Postman.
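As a sketch of what such a conversion produces, here is a copied cURL request for the web-scraping.dev review data translated into Python with the requests library. The endpoint path, parameters, and headers are illustrative assumptions, not the exact output of any converter tool:

```python
import requests

# Hypothetical translation of a "Copy as cURL" command into Python.
# The URL, query parameters, and headers below are illustrative examples.
url = "https://web-scraping.dev/api/reviews"
headers = {
    "Accept": "application/json",
    "User-Agent": (
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) "
        "AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Safari/605.1.15"
    ),
    "Referer": "https://web-scraping.dev/reviews",
}
params = {"page": "1"}

# Build the request without sending it, to inspect what goes on the wire:
prepared = requests.Request("GET", url, headers=headers, params=params).prepare()
print(prepared.method, prepared.url)

# Sending it is then a single call:
# response = requests.get(url, headers=headers, params=params)
# print(response.status_code, response.json())
```

Each `-H` flag of the cURL command maps to one entry in the `headers` dictionary, and the URL query string maps to `params`.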

Using API Clients For Web Scraping: Postman


postman article banner
Question tagged: cURL, HTTP

Related Posts

Sending HTTP Requests With Curlie: A better cURL

In this guide, we'll explore Curlie, a better cURL version. We'll start by defining what Curlie is and how it compares to cURL. We'll also go over a step-by-step guide on using and configuring Curlie to send HTTP requests.

How to Use cURL For Web Scraping

In this article, we'll go over a step-by-step guide on sending and configuring HTTP requests with cURL. We'll also explore advanced usages of cURL for web scraping, such as scraping dynamic pages and avoiding getting blocked.

Use Curl Impersonate to scrape as Chrome or Firefox

Learn how to prevent TLS fingerprinting by impersonating normal web browser configurations. We'll start by explaining what Curl Impersonate is, how it works, and how to install and use it. Finally, we'll explore using it with Python to avoid web scraping blocking.