Python Forum
Project: Opening all links within a web page - Printable Version




Project: Opening all links within a web page - Truman - Aug-07-2018

I'm trying to write code that opens every link on a web page, each in its own browser tab.
import webbrowser
import bs4
import requests

def my_funct(search_url):

    while True:
        print(f'Downloading page {search_url}...')
        res = requests.get(search_url)
        res.status_code
        soup = bs4.BeautifulSoup(res.text, "html.parser")
        urls = soup.select('#href')
        if urls == []:
            print('Page doesn\'t have links!')
            break
        else:
            res = requests.get(urls)
            numLin = len(res)
            for i in range(numLin):
                webbrowser.open(res, new=2)

def test_it(args):
    my_funct(args)

if __name__ == '__main__':
    url = input("Please add full url address of desired web site: ")
    test_it(url)
Whatever website I pass in as the url, it prints the message that the page doesn't have links. I suspect there is something wrong with the selector I chose.
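
The CSS selector '#href' matches an element whose id attribute is "href", so on almost any page soup.select('#href') returns an empty list; an attribute selector such as 'a[href]' (or soup.find_all('a', href=True)) is what matches actual anchor tags. Below is a minimal sketch of how that step might look. The function name open_all_links is a hypothetical rewrite of my_funct, and urljoin is used on the assumption that some hrefs on the page are relative:

import webbrowser
from urllib.parse import urljoin

import bs4
import requests

def open_all_links(search_url):
    res = requests.get(search_url)
    res.raise_for_status()                  # raise an exception if the download failed
    soup = bs4.BeautifulSoup(res.text, "html.parser")
    links = soup.select('a[href]')          # every <a> tag that has an href attribute
    if not links:
        print("Page doesn't have links!")
        return
    for link in links:
        absolute = urljoin(search_url, link['href'])  # turn relative hrefs into absolute URLs
        webbrowser.open(absolute, new=2)    # new=2 asks the browser for a new tab

if __name__ == '__main__':
    open_all_links(input("Please add full url address of desired web site: "))

Note that webbrowser.open() expects a URL string, so each href is opened individually rather than passing a requests response object, and new=2 only requests a new tab; whether one actually opens depends on the browser.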