How to Use a Proxy With cURL?

Proxies are essential for avoiding IP address blocking and for accessing web pages restricted to a specific location. In a nutshell, a proxy is a server that hides your IP address by requesting the web resource on your behalf, acting as an intermediary.

To add a proxy to a cURL request, we can use the -x or --proxy option. Let's explore using these options with different proxy types.

HTTP and HTTPS Proxies

The general command for setting an HTTP proxy with cURL is the following:

curl -x <protocol>://<proxy_host>:<proxy_port> <url>

The protocol indicates the proxy's protocol. For HTTP proxies, it can be either http or https depending on the proxy type; using the https scheme with a proxy that doesn't support it will result in errors. The proxy_host and proxy_port values come from the proxy itself, and proxy_host can be either a numeric IP address or a domain name.

Here is an example of setting HTTP and HTTPS proxies with cURL. The actual proxy hostname and port are mocked:

# HTTP proxy
curl --proxy http://some_proxy_domain:1234 https://example.com

# HTTPS proxy
curl --proxy https://some_proxy_domain:1234 https://example.com
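Instead of passing -x on every request, cURL also honors the conventional proxy environment variables. Here is a minimal sketch; the proxy address is mocked, and https://httpbin.org/ip is used only as an example IP-echo endpoint to confirm which exit IP the target server sees:

```shell
# Set the proxy once for the whole shell session.
# cURL reads these conventional environment variables automatically.
export http_proxy="http://some_proxy_domain:1234"
export https_proxy="http://some_proxy_domain:1234"

# No -x flag needed now; this request goes through the proxy.
# httpbin.org/ip echoes back the IP the target server sees:
curl https://httpbin.org/ip

# Remove the variables to go back to direct connections:
unset http_proxy https_proxy
```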

Authenticated Proxies

To authenticate a proxy with cURL, we only have to add the username and password as a prefix to the proxy address. The same HTTP and HTTPS rules apply:

# HTTP proxy with authentication
curl --proxy http://username:password@some_proxy_domain:1234 https://example.com

# HTTPS proxy with authentication
curl --proxy https://username:password@some_proxy_domain:1234 https://example.com
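Credentials can also be kept out of the command line entirely. As a sketch, cURL reads a ~/.curlrc config file that accepts the same long option names without the leading dashes; the proxy address and credentials below are mocked:

```
# ~/.curlrc — read by cURL on every invocation
proxy = "http://some_proxy_domain:1234"
proxy-user = "username:password"
```

Alternatively, the -U (--proxy-user) flag passes username:password on the command line without embedding them in the proxy URL, which avoids having to percent-encode special characters in the password.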

SOCKS Proxies

SOCKS proxies operate over either the SOCKS4 or the SOCKS5 protocol. Here is how to use SOCKS proxies with cURL; the proxy hostname and port are mocked:

# SOCKS4 proxy
curl --proxy socks4://some_proxy_domain:1234 https://example.com

# SOCKS5 proxy
curl --proxy socks5://some_proxy_domain:1234 https://example.com
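One detail worth noting: with the socks4:// and socks5:// schemes, cURL resolves the target hostname locally and only then contacts the proxy. The socks5h:// (and socks4a://) schemes hand DNS resolution to the proxy instead, which avoids leaking DNS lookups. A minimal sketch with a mocked proxy address:

```shell
# A mocked SOCKS5 proxy address; the socks5h:// scheme tells cURL
# to let the proxy resolve DNS instead of resolving locally.
PROXY="socks5h://some_proxy_domain:1234"

# DNS resolution happens on the proxy side (no local DNS leak):
curl --proxy "$PROXY" https://example.com

# With plain socks5:// (or socks4://), DNS is resolved locally first:
curl --proxy socks5://some_proxy_domain:1234 https://example.com
```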

Provided by Scrapfly

This knowledge base is provided by Scrapfly — a web scraping API that allows you to scrape any website without getting blocked and implements dozens of other web scraping conveniences. Check us out 👇