Hello,

I'm new to Python, and I'm trying to get the URLs from a website and write them to a CSV file. This is my code:
from urllib.request import urlopen
from bs4 import BeautifulSoup

html = urlopen("https://XXX")  # insert your URL to extract
bsObj = BeautifulSoup(html.read(), "lxml")
print(bsObj.title.text)  # check that I'm on the right website
for link in bsObj.find_all('a'):
    print(link.get('href'))
from pyquery import PyQuery
import requests

payload = {'inUserName': 'XX', 'inUserPass': 'XX'}
url = 'XXX'
from bs4 import BeautifulSoup
import requests

url = 'XXX'

def links(url):
    html = requests.get(url).content
    bsObj = BeautifulSoup(html, 'lxml')
    finalLinks = set()
    for link in bsObj.find_all('a'):
        finalLinks.add(link.attrs.get('href'))
    return finalLinks

with open("e:\\dvir.csv", "w") as f:
    for link in links(url):
        print(link, file=f)

My questions are:
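To show what I'm aiming for, here is the whole flow sketched end to end with the standard csv module. The URL and output path are placeholders from above, and I split the parsing into its own function so it can be checked without a network call:

```python
import csv

import requests
from bs4 import BeautifulSoup

def extract_links(html):
    # Return the set of unique href values from all <a> tags in the HTML.
    # 'html.parser' is the built-in parser, so lxml isn't required here.
    soup = BeautifulSoup(html, 'html.parser')
    return {a.get('href') for a in soup.find_all('a') if a.get('href')}

def write_links_to_csv(links, path):
    # Write one link per row; newline='' avoids blank rows on Windows.
    with open(path, 'w', newline='') as f:
        writer = csv.writer(f)
        for link in sorted(links):
            writer.writerow([link])

if __name__ == '__main__':
    url = 'https://XXX'  # placeholder URL, as in the question
    html = requests.get(url).content
    write_links_to_csv(extract_links(html), 'e:\\dvir.csv')
```

Is this roughly the right shape, or is there a more idiomatic way?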
1. Am I doing this right? What is the error in my code?
2. How do I export all of my links to a CSV file?

Thanks!