Python Forum
Web Crawler help
#11
Thanks for the great help. I have everything I want now!
I currently write my output to a CSV file which I can then work with. The final piece of my puzzle is to get a header on the first row of the CSV file. In the code I have now, I start by completely emptying the CSV file.

I tried both emptying only row 2 downwards and printing the headers together with my output, but neither gave a satisfactory result.

See my code below. Does anybody know how I can print headers only in the first row of the CSV file? Many thanks!

import re

import requests
from bs4 import BeautifulSoup

open('output.csv', 'w').close()  # empty the output file before crawling
 
def fundaSpider(max_pages):
    page = 1
    while page <= max_pages:
        url = 'http://www.funda.nl/koop/rotterdam/p{}'.format(page)
        source_code = requests.get(url)
        plain_text = source_code.text
        soup = BeautifulSoup(plain_text, 'html.parser')
        ads = soup.find_all('li', {'class': 'search-result'})
        for ad in ads:
            title = ad.find('h3')
            title = ' '.join(title.get_text(separator='\n', strip=True).split()[
                             :-3])  # sep by newline, strip whitespace, then split to get the last 3 elements to cut out, then rejoin
            address = ad.find('small').text.strip()
            price = ad.find('div', {'class': 'search-result-info search-result-info-price'}).text.strip()
            price = re.findall(r'\d', price)
            price = ''.join(price)
            size_results = ad.find('ul', {'class': 'search-result-kenmerken'})
            li = size_results.find_all('li')
            size = li[0]
            size = size.get_text(strip=True)
            size = size.split(" ")[0]
            room = li[1].text.strip()
            room = room.split(" ")[0]
            href = ('http://www.funda.nl' + ad.find_all('a')[2]['href'])
            area = get_single_item_data(href)
            print(title + "," + address + "," + price + "," + size + "," + room + "," + area + "," + href)
            with open('output.csv', 'a') as saveFile:
                saveFile.write(title + "," + address + "," + price + "," + size + "," + room + "," + area + "," + href + '\n')
 
        page += 1
 
def get_single_item_data(item_url):
    source_code = requests.get(item_url)
    plain_text = source_code.text
    soup = BeautifulSoup(plain_text, 'html.parser')
    li = soup.find_all('li', {'class': 'breadcrumb-listitem'})
    return (li[2].a.text)
 
 
fundaSpider(1)
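Since the script already truncates the file before crawling, one way to get the header only on the first row is to write it at that point, before the crawl loop appends data rows. A minimal sketch of the pattern (the header names and the example row are my own placeholders, not from the crawl):

```python
import csv

OUTPUT = 'output.csv'
HEADER = ['title', 'address', 'price', 'size', 'room', 'area', 'href']

# Opening in 'w' mode truncates the file, so each crawl starts fresh
# with the header as row 1. newline='' is recommended for the csv module.
with open(OUTPUT, 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(HEADER)

# Inside the crawl loop, append each ad as one row. Using csv.writer
# instead of manual "," joins also quotes any field that itself
# contains a comma (e.g. an address like "Coolsingel 1, Rotterdam").
with open(OUTPUT, 'a', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['Example house', 'Rotterdam', '250000', '120', '4',
                     'Centrum', 'http://www.funda.nl/koop/rotterdam/'])
```

The same effect can be had with a single file handle opened once in `'w'` mode and passed into the spider, which avoids reopening the file for every ad.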
Messages In This Thread
Web Crawler help - by takaa - Feb-06-2017, 06:57 PM
