Python Forum

Full Version: Project: Opening all links within a web page
You're currently viewing a stripped down version of our content. View the full version with proper formatting.
I'm trying to write a code that opens all links that are on a web page in one tab for each.
import webbrowser
import bs4
import requests

def my_funct(search_url):

    while True:
	    print(f'Downloading page {url}...')
	    res = requests.get(search_url)
	    soup = bs4.BeautifulSoup(res.text, "html.parser")
	    urls ='#href')
	    if urls == []:
		    print('Page doesn\'t have links!')
		    res = requests.get(urls)
		    numLin = len(res)
		    for i in range(numLin):, new=2)
def test_it(args):
if __name__ == '__main__':
	url = input("Please add full url address of desired web site: ")
Whatever website that I add within url it prints a message that page doesn't have links. I'm thinking that there is something wrong with the selector that I chose.