How to save and load cookies in Selenium?

When web scraping, we often need to save the browser state, such as cookies, and restore it later to resume a session. In Selenium, we can save and load cookies using the driver.get_cookies() and driver.add_cookie() methods:

import json
from pathlib import Path
from selenium import webdriver

driver = webdriver.Chrome()
driver.get("http://www.google.com")

# Save the current session's cookies to a JSON file:
Path('cookies.json').write_text(
    json.dumps(driver.get_cookies(), indent=2)
)

# Load cookies back from the JSON file:
for cookie in json.loads(Path('cookies.json').read_text()):
    driver.add_cookie(cookie)

driver.quit()
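
Note that driver.add_cookie() only accepts cookies for the domain the browser is currently on. So, to resume a session in a new browser instance, open the target site first, add the saved cookies, and then reload the page. Here's a minimal sketch of that flow, reusing the same cookies.json file and google.com example URL from above:

import json
from pathlib import Path
from selenium import webdriver

driver = webdriver.Chrome()

# visit the target domain first - add_cookie() only works for the current domain
driver.get("http://www.google.com")

# attach the previously saved cookies to the new browser session
for cookie in json.loads(Path('cookies.json').read_text()):
    driver.add_cookie(cookie)

# reload the page so the restored cookies take effect
driver.get("http://www.google.com")

driver.quit()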
Question tagged: Selenium, Headless Browsers, Python

Related Posts

How to Use Chrome Extensions with Playwright, Puppeteer and Selenium

In this article, we'll explore different useful Chrome extensions for web scraping. We'll also explain how to install Chrome extensions with various headless browser libraries, such as Selenium, Playwright and Puppeteer.

Intro to Web Scraping using Selenium Grid

In this guide, you will learn about installing and configuring Selenium Grid with Docker and how to use it for web scraping at scale.

How to Scrape Google Maps

We'll take a look at how to find businesses through the Google Maps search system and how to scrape their details using either Selenium, Playwright or ScrapFly's javascript rendering feature - all of that in Python.