Python Forum
please help
#1
I'm working on a Python script to parse a large CSV file and extract specific data, but I'm encountering performance issues. Here's a simplified version of my code:
import csv

def extract_data(csv_file):
    with open(csv_file, 'r', newline='') as file:  # newline='' is recommended for the csv module
        reader = csv.reader(file)
        next(reader)  # Skip header row
        for row in reader:
            # Extracting data from specific columns
            data = row[1], row[3], row[5]
            process_data(data)

def process_data(data):
    # Some processing on the extracted data
    print(data)

csv_file = 'large_file.csv'
extract_data(csv_file)
The problem is that large_file.csv contains millions of rows, and my script is taking too long to process. I've tried optimizing the code, but it's still not efficient enough. Can someone suggest more efficient ways to parse and extract data from such a large CSV file in Python? Any help would be appreciated!
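One common approach, sketched below under the assumption that the bottleneck is per-row `print` calls and eager row-by-row handling (the file layout and column indices are taken from the question; `process_rows` is a hypothetical stand-in for the real processing): turn the extraction into a generator so rows stream lazily without building a list of millions of tuples, and aggregate instead of printing each row.

```python
import csv

def extract_data(csv_file, columns=(1, 3, 5)):
    """Lazily yield the selected columns from each data row."""
    with open(csv_file, 'r', newline='') as file:  # newline='' per the csv docs
        reader = csv.reader(file)
        next(reader)  # skip the header row
        for row in reader:
            # Yielding keeps memory flat no matter how many rows the file has
            yield tuple(row[i] for i in columns)

def process_rows(rows):
    # Hypothetical placeholder: count rows instead of printing each one,
    # since printing millions of lines is itself a major cost
    return sum(1 for _ in rows)

if __name__ == '__main__':
    count = process_rows(extract_data('large_file.csv'))
    print(f'processed {count} rows')
```

If the per-row Python loop is still too slow after removing the printing, a chunked reader such as pandas' `read_csv(..., usecols=..., chunksize=...)` moves the parsing into C, at the cost of a heavier dependency.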


Messages In This Thread
please help - by natalie321 - Apr-10-2024, 03:52 AM
RE: please help - by Larz60+ - Apr-10-2024, 07:52 AM
RE: please help - by Gribouillis - Apr-10-2024, 08:58 AM
RE: please help - by paul18fr - Apr-10-2024, 09:22 AM
RE: please help - by deanhystad - Apr-10-2024, 09:34 AM
RE: please help - by snippsat - Apr-10-2024, 10:54 AM
RE: please help - by Pedroski55 - Apr-11-2024, 06:05 AM
