Python Forum
Loading HUGE data from Python into SQL SERVER - Printable Version




Loading HUGE data from Python into SQL SERVER - Sandeep - Jan-11-2018

Hi All,

I have used the Python code below to insert a DataFrame from Python into a SQL Server database. But when I insert 100,000 (one lakh) rows, the operation takes more than an hour. Could I get optimized Python code for my task?


import time
import urllib.parse  # "import urllib" alone does not reliably expose urllib.parse

import pyodbc  # DBAPI driver used by the mssql+pyodbc dialect
from sqlalchemy import create_engine

start_time = time.time()

# Build the ODBC connection string and hand it to SQLAlchemy
params = urllib.parse.quote_plus(r'DRIVER={SQL Server};SERVER=ROSQC50;DATABASE=PADB;Trusted_Connection=yes')
conn_str = 'mssql+pyodbc:///?odbc_connect={}'.format(params)
engine = create_engine(conn_str)

# df is an existing DataFrame; append its rows to the target table
df.to_sql(name='DummyTodaynow', con=engine, if_exists='append', index=False)

print("--- %s seconds ---" % (time.time() - start_time))
I'd appreciate your help with this!

Thanks,
Sandeep


RE: Loading HUGE data from Python into SQL SERVER - buran - Jan-11-2018

Obviously df.to_sql (via SQLAlchemy) does not do a bulk import; it executes an individual INSERT for each row.
see https://github.com/pandas-dev/pandas/issues/8953
and also https://stackoverflow.com/questions/33816918/write-large-pandas-dataframes-to-sql-server-database
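As a concrete illustration of what those links suggest: below is a minimal sketch, assuming pyodbc >= 4.0.19 and a recent Microsoft ODBC driver, that enables pyodbc's fast_executemany so the row-by-row INSERTs are sent to the server in batches instead of one round trip per row. The driver name and the sample DataFrame are assumptions for illustration; the server, database, and table names are carried over from the original post.

import urllib.parse

import pandas as pd
from sqlalchemy import create_engine, event

# Sample data standing in for the real DataFrame (assumption for illustration)
df = pd.DataFrame({'col1': range(100000), 'col2': ['x'] * 100000})

# Same connection setup as the original post, but with a newer driver name,
# since fast_executemany needs a reasonably recent Microsoft ODBC driver
params = urllib.parse.quote_plus(
    r'DRIVER={ODBC Driver 17 for SQL Server};SERVER=ROSQC50;DATABASE=PADB;Trusted_Connection=yes'
)
engine = create_engine('mssql+pyodbc:///?odbc_connect={}'.format(params))

# Turn on pyodbc's fast_executemany (pyodbc >= 4.0.19): executemany() then
# ships whole parameter batches to the server instead of one INSERT at a time
@event.listens_for(engine, 'before_cursor_execute')
def set_fast_executemany(conn, cursor, statement, parameters, context, executemany):
    if executemany:
        cursor.fast_executemany = True

# chunksize bounds how many rows are buffered per batch
df.to_sql(name='DummyTodaynow', con=engine, if_exists='append',
          index=False, chunksize=10000)

On newer library versions this gets simpler: SQLAlchemy 1.3+ accepts create_engine(conn_str, fast_executemany=True) directly, and pandas 0.24+ adds a method='multi' option to to_sql as another way to avoid one INSERT per row.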


RE: Loading HUGE data from Python into SQL SERVER - Sandeep - Jan-13-2018

Thanks for the reply, Buran. I am not able to understand the code in the links you mentioned.
Could you suggest any other code that would help?

Thanks,
Sandeep