Sep-29-2022, 05:22 AM
Hi Team,
Below is the code I am using to extract a large SQL table (about 60 GB). How can I export the table to CSV, including the header row, using SQLAlchemy, without running out of memory?
import csv
import time
import pyodbc

initial = time.time()
connection = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=DESKTOP-GQK64O6;'
    'DATABASE=Customer;'
    'Trusted_Connection=yes;'
)
cursor = connection.cursor()
qry = "select * from employee"
cursor.execute(qry)

x = 0
with open("E:\\backup\\output.csv", "w", newline="") as outfile:
    writer = csv.writer(outfile, quoting=csv.QUOTE_NONNUMERIC)
    # write the header row from the cursor metadata
    writer.writerow([col[0] for col in cursor.description])
    while True:
        rows = cursor.fetchmany(10000)
        if len(rows) == 0:
            print("no records found")
            break
        else:
            x = x + len(rows)
            print(x)
            writer.writerows(rows)

connection.close()
print("success")
print("time taken", time.time() - initial, "Seconds")

Below is my attempted SQLAlchemy code. How can I fetch the SQL table data with a header (I am running this in PyCharm) without filling up memory?
from sqlalchemy import create_engine
import pandas as pd

DRIVER = "ODBC Driver 17 for SQL Server"
server = "DESKTOP-GQK64O6"
DATABASE = "Customer"
str_conn = f'mssql://{server}/{DATABASE}?Driver={DRIVER}'
connection = create_engine(str_conn).connect()

mylist = []
# read in chunks
for chunk in pd.read_sql_table('employee', connection, chunksize=1000):
    mylist.append(chunk)
df = pd.concat(mylist)
df.to_csv("C:\\C_Programming\\output.csv", index=False)

Thanks
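One thing to note about the attempt above: appending every chunk to a list and then calling `pd.concat` still pulls the whole table into memory before writing. A sketch of an alternative is to write each chunk to the CSV as it arrives, emitting the header only with the first chunk. The example below uses an in-memory SQLite table and a hypothetical `output.csv` path as stand-ins for the real SQL Server connection and backup path, but the same chunked-write pattern applies with an `mssql` engine:

```python
import sqlite3
import pandas as pd

# Stand-in for the real SQL Server connection: a small in-memory SQLite table.
conn = sqlite3.connect(":memory:")
pd.DataFrame({"id": range(10), "name": [f"emp{i}" for i in range(10)]}).to_sql(
    "employee", conn, index=False
)

out_path = "output.csv"  # hypothetical output path
first = True
# Stream the table in chunks and append each chunk to the CSV,
# writing the header only for the first chunk; nothing accumulates in memory.
for chunk in pd.read_sql_query("select * from employee", conn, chunksize=3):
    chunk.to_csv(out_path, mode="w" if first else "a", header=first, index=False)
    first = False

print(pd.read_csv(out_path).shape)
```

Because each chunk is written and then discarded, peak memory use is bounded by the chunk size rather than the table size, which is the property needed for a 60 GB table.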
mg