How To Send Multiple cURL Requests in Parallel?

The cURL -Z (or --parallel) option instructs cURL to fetch the given URLs concurrently instead of one after another. By default, cURL performs up to 50 transfers at the same time.

cURL also provides two additional parameters for this option:

  • --parallel-max
    It sets the maximum number of transfers cURL will run simultaneously (the default is 50).
  • --parallel-immediate
    If this parameter is used, cURL prefers opening new connections immediately rather than waiting to multiplex new transfers over existing connections.

Here is how we can send cURL multiple requests with the --parallel option:

curl --parallel --parallel-immediate --parallel-max 5 https://httpbin.dev/headers https://httpbin.dev/headers https://httpbin.dev/headers

The above cURL command sends three parallel requests to https://httpbin.dev/headers, allowing at most 5 simultaneous transfers and opening new connections immediately.
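To see the effect of parallel execution, we can time the same three requests with and without the --parallel flag. This is a minimal sketch using the shell's time keyword; the exact timings depend on the network, but the parallel run should take roughly as long as the slowest single request rather than the sum of all three:

```shell
# Sequential: the three requests run one after another.
time curl -s -o /dev/null -o /dev/null -o /dev/null \
  https://httpbin.dev/headers https://httpbin.dev/headers https://httpbin.dev/headers

# Parallel: the same three requests run concurrently.
time curl -s --parallel --parallel-max 5 -o /dev/null -o /dev/null -o /dev/null \
  https://httpbin.dev/headers https://httpbin.dev/headers https://httpbin.dev/headers
```

Each -o /dev/null pairs with one URL, which discards the response bodies so only the timing output remains.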

Alternatively, we can list the URLs in a file and pass it to cURL with the --config option. First, create a file named urls.txt containing the URLs:

url = "https://httpbin.dev/headers"
url = "https://httpbin.dev/headers"
url = "https://httpbin.dev/headers"

Next, we'll use the same cURL parallel requests command and pass the config file:

curl --parallel --parallel-immediate --parallel-max 5 --config urls.txt
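The config file can also be generated programmatically, and adding an output line per URL keeps parallel responses from interleaving on stdout. A minimal sketch (the response-N.json file names are illustrative):

```shell
# Write three url/output pairs into urls.txt using cURL's config syntax.
for i in 1 2 3; do
  printf 'url = "https://httpbin.dev/headers"\noutput = "response-%s.json"\n' "$i"
done > urls.txt

# Fetch all entries in parallel; each response lands in its own file.
curl --parallel --parallel-immediate --parallel-max 5 --config urls.txt
```

Here output = "..." in the config file plays the same role as the -o flag on the command line.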

For more details on cURL, refer to our dedicated guide.

How to Use cURL For Web Scraping

Explore sending and configuring HTTP requests with cURL through a step-by-step guide. You will also explore advanced usages of cURL for web scraping, such as scraping dynamic pages and avoiding getting blocked.


Related Posts

Sending HTTP Requests With Curlie: A better cURL

In this guide, we'll explore Curlie, a better cURL version. We'll start by defining what Curlie is and how it compares to cURL. We'll also go over a step-by-step guide on using and configuring Curlie to send HTTP requests.

Use Curl Impersonate to scrape as Chrome or Firefox

Learn how to prevent TLS fingerprinting by impersonating normal web browser configurations. We'll start by explaining what the Curl Impersonate is, how it works, how to install and use it. Finally, we'll explore using it with Python to avoid web scraping blocking.