Parse a URL list stored in a CSV - Printable Version

+- Python Forum (https://python-forum.io)
+-- Forum: Python Coding (https://python-forum.io/forum-7.html)
+--- Forum: Web Scraping & Web Development (https://python-forum.io/forum-13.html)
+--- Thread: Parse a URL list stored in a CSV (/thread-26618.html)
Parse a URL list stored in a CSV - paulfearn100 - May-07-2020

Hello, any help greatly appreciated. The URLs are stored in the CSV in column A, lines 1-100. I would like to request each URL in turn and parse the links on each page, writing them out to another CSV so I can then scrape those links for certain data.

Working code below for one URL:

link_set = set()
for link in soup.find_all('a', {'class': 'horse'}):
    web_links = link.get("href")
    print(web_links)
    link_set.add(web_links)

[python]
import csv
import time

import requests
from bs4 import BeautifulSoup

# Example lines in the CSV file:
# https://gg.co.uk/racing/07-feb-2020/dundalk-1700
# https://gg.co.uk/racing/07-feb-2020/chelmsford-city-1745
# etc.

# Fetch each URL listed in the CSV file and print it
with open("course_links.csv") as infile:
    reader = csv.DictReader(infile)
    for link in reader:
        res = requests.get(link['course_links'])
        #time.sleep(5)
        print(res.url)

# Print each row of the CSV file as a dict
with open("course_links.csv", 'r') as file:
    csv_file = csv.DictReader(file)
    for row in csv_file:
        print(dict(row))
[/python]
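One way to combine the two pieces above is a sketch along these lines: it assumes the input CSV has a header column named `course_links` (as in the loop above) and that the pages mark the wanted links with `class="horse"` (as in the one-URL snippet). The output filename `horse_links.csv` is an arbitrary choice for illustration.

```python
import csv
import time

import requests
from bs4 import BeautifulSoup


def extract_horse_links(html):
    """Return the href of every <a class="horse"> anchor in the page."""
    soup = BeautifulSoup(html, "html.parser")
    return [a.get("href") for a in soup.find_all("a", {"class": "horse"})]


def scrape(in_csv="course_links.csv", out_csv="horse_links.csv"):
    link_set = set()
    with open(in_csv, newline="") as infile:
        for row in csv.DictReader(infile):
            res = requests.get(row["course_links"])
            link_set.update(extract_horse_links(res.text))
            time.sleep(5)  # pause between requests to be polite to the server
    # Write the collected links, one per row, to the output CSV
    with open(out_csv, "w", newline="") as outfile:
        writer = csv.writer(outfile)
        for link in sorted(link_set):
            writer.writerow([link])
```

Separating the parsing into `extract_horse_links` keeps it easy to test on a saved HTML file without hitting the network, and the set deduplicates links that appear on more than one page.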