Get all links on a webpage

Hello friends! At times during automation we need to fetch all the links present on a webpage. This is also one of the most common requirements in web scraping. In this tutorial, we will learn to fetch all the links present on a webpage using the tagName locator.
If you have a basic understanding of HTML, you know that every hyperlink is an anchor element, tag 'a'.

<a href="selenium-introduction.html">Selenium Introduction</a>
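To see what the tagName locator will end up matching, here is a quick plain-Java sketch, independent of Selenium, that pulls the link text and href out of simple anchor markup like the snippet above (the class name, helper method, and regex are illustrative assumptions, not part of any library):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class AnchorDemo {
    // Extracts "text - href" pairs from simple anchor markup.
    // Note: a regex is fine for a flat snippet like this; real pages
    // should be handled by a proper parser (or Selenium itself).
    static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = Pattern.compile("<a href=\"([^\"]+)\">([^<]+)</a>").matcher(html);
        while (m.find()) {
            // group(2) is the link text, group(1) is the href value
            links.add(m.group(2) + " - " + m.group(1));
        }
        return links;
    }

    public static void main(String[] args) {
        String html = "<a href=\"selenium-introduction.html\">Selenium Introduction</a>";
        extractLinks(html).forEach(System.out::println);
    }
}
```

This prints each link in the same "text - address" format that the Selenium sample below produces with getText() and getAttribute("href").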

How to fetch all the links on a webpage?

  • Navigate to the desired webpage
  • Get a list of WebElements with the tag name 'a' using driver.findElements() -
    List<WebElement> allLinks = driver.findElements(By.tagName("a"));
  • Traverse the list using a for-each loop
  • Print each link's text using getText() along with its address using getAttribute("href") -
    System.out.println(link.getText() + " - " + link.getAttribute("href"));

Sample code to get all the links on a webpage

package seleniumTutorials;

import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;

public class GetAllLinks {
	public static void main(String[] args){
		WebDriver driver = new FirefoxDriver();
		//Launching sample website (placeholder URL - replace with the page under test)
		driver.get("https://example.com");
		//Get list of web elements with tagName - a
		List<WebElement> allLinks = driver.findElements(By.tagName("a"));
		//Traversing through the list and printing its text along with link address
		for(WebElement link : allLinks){
			System.out.println(link.getText() + " - " + link.getAttribute("href"));
		}
		//Commenting out driver.quit() so the user can verify the links printed
		//driver.quit();
	}
}