How to fix Python requests SSLError?

by scrapecrow Dec 19, 2022

SSLError is raised when using Python's requests module to scrape pages served with untrusted or invalid SSL certificates.

import requests
response = requests.get("https://example.com/")
# raises: 
# SSLError: HTTPSConnectionPool(host='example.com', port=443)...

# we can disable certificate verification (note: dangerous as it leaves the connection open to man-in-the-middle attacks)
response = requests.get("https://example.com/", verify=False)
# or point requests to a certificate file explicitly (.pem)
cert_location = "certificates/example-com-certificate.pem"
response = requests.get("https://example.com/", verify=cert_location)

The SSLError exception is rarely encountered in web scraping, and the easiest fix is to disable certificate verification (the verify=False parameter) when no sensitive data is being exchanged.
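When verification is disabled this way, urllib3 (which requests uses under the hood) emits an InsecureRequestWarning on every call. A minimal sketch for silencing the warning and applying verify=False to a whole session rather than to each individual request:

import urllib3
import requests

# silence the InsecureRequestWarning emitted when verify=False is used
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

session = requests.Session()
session.verify = False  # disables certificate verification for every request made through this session
response = session.get("https://example.com/")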

Note that if a manual fix is required, the CA certificate bundle requests is using can be located with the requests.certs.where() function:

import requests
print(requests.certs.where())
'/etc/ssl/certs/ca-certificates.crt'  # example on Linux
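If the target site uses a certificate that is not in this bundle (e.g. a self-signed or internal CA certificate), one common approach is to append the site's PEM certificate to a copy of the default bundle and pass the result through verify. A rough sketch, assuming the certificate has already been saved to certificates/example-com-certificate.pem (a hypothetical path):

import certifi
import requests

# build a custom bundle: default certifi CAs plus the site's own certificate
with open(certifi.where()) as f:
    bundle = f.read()
with open("certificates/example-com-certificate.pem") as f:  # hypothetical path
    bundle += "\n" + f.read()

with open("certificates/combined-bundle.pem", "w") as f:
    f.write(bundle)

response = requests.get("https://example.com/", verify="certificates/combined-bundle.pem")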

This value can also be overridden using the REQUESTS_CA_BUNDLE environment variable:

$ export REQUESTS_CA_BUNDLE="/etc/ssl/certs/my-certificates.pem" 
$ python -c "import requests;print(requests.certs.where())"
/etc/ssl/certs/my-certificates.pem
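The same override can be done from Python itself, as long as the variable is set before the request is made (requests reads it at request time when trust_env is enabled, which is the default). A small sketch, assuming /etc/ssl/certs/my-certificates.pem exists:

import os
os.environ["REQUESTS_CA_BUNDLE"] = "/etc/ssl/certs/my-certificates.pem"

import requests
# certificate verification will now use the bundle set above
response = requests.get("https://example.com/")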

Finally, requests uses the certifi package to handle all SSL certificate-related operations, so if the error persists, try updating it: pip install certifi --upgrade
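To confirm which bundle certifi itself ships and which release is installed, something like the following can be used (certifi releases are date-based, so the version number doubles as the bundle's age):

import certifi

print(certifi.where())      # path to certifi's bundled CA file
print(certifi.__version__)  # e.g. '2022.12.07' - a date-based release number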
