How to scroll to the bottom of the page with Selenium?

When web scraping with Selenium, we might encounter pages that require scrolling to the bottom to load more content. This is a common pattern on infinite scrolling pages.

To scroll the browser in Selenium, we can execute the built-in JavaScript function window.scrollTo(x, y) through driver.execute_script(). This function scrolls the page to the specified coordinates.
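
For example, a single jump to the current bottom of the page looks like this (a minimal sketch, assuming a driver instance is already running):

# scroll to the page's current bottom in one jump;
# document.body.scrollHeight is the full height of the rendered page body
driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")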

So, if we need to scroll to the very bottom of the page, we can use a while loop that keeps scrolling until the page height stops growing.
Let's take a look at an example by scraping web-scraping.dev/testimonials:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
import time

driver = webdriver.Chrome()
driver.get("https://web-scraping.dev/testimonials/")

prev_height = -1
max_scrolls = 100  # safety cap to avoid an endless loop
scroll_count = 0

while scroll_count < max_scrolls:
    # scroll to the current bottom of the page
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
    time.sleep(1)  # give some time for new results to load
    new_height = driver.execute_script("return document.body.scrollHeight")
    if new_height == prev_height:
        # the height didn't change - we've reached the real bottom
        break
    prev_height = new_height
    scroll_count += 1

# Collect all loaded data
elements = WebDriverWait(driver, 10).until(
    EC.presence_of_all_elements_located((By.CLASS_NAME, "testimonial"))
)

results = []
for element in elements:
    text = element.find_element(By.CLASS_NAME, "text").get_attribute('innerHTML')
    results.append(text)

print(f"scraped: {len(results)} results!")

driver.quit()

Above, we're scraping an endless paging example from the web-scraping.dev website.
We start a while loop and keep scrolling to the bottom until the page's scroll height stops changing.
Then, once the bottom is reached, we can start parsing the loaded content.
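
The fixed time.sleep(1) delay works but wastes time on fast pages and can time out too early on slow ones. As a sketch, the scroll loop above could swap the sleep for an explicit wait that polls the page height (the 10 second timeout is an assumption - tune it to the target site):

from selenium.common.exceptions import TimeoutException

scroll_count = 0
while scroll_count < max_scrolls:
    prev_height = driver.execute_script("return document.body.scrollHeight")
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
    try:
        # wait up to 10 seconds for new content to extend the page
        WebDriverWait(driver, 10).until(
            lambda d: d.execute_script("return document.body.scrollHeight") > prev_height
        )
    except TimeoutException:
        break  # the height never changed, so we assume the real bottom was reached
    scroll_count += 1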

Question tagged: Selenium

Related Posts

How to Use Chrome Extensions with Playwright, Puppeteer and Selenium

In this article, we'll explore different useful Chrome extensions for web scraping. We'll also explain how to install Chrome extensions with various headless browser libraries, such as Selenium, Playwright and Puppeteer.

Intro to Web Scraping using Selenium Grid

In this guide, you will learn about installing and configuring Selenium Grid with Docker and how to use it for web scraping at scale.

How to Scrape Google Maps

We'll take a look at how to find businesses through the Google Maps search system and how to scrape their details using either Selenium, Playwright or ScrapFly's javascript rendering feature - all of that in Python.