Python Forum

Full Version: how to load large data into dataframe.
Hi All,

I am pulling about 1 GB of data from a database. I want to load this data into a DataFrame and do some analysis, but because of the size Python is not able to handle it in one go.
I read some notes and found the chunksize argument for loading the data in chunks.

for chunk in pd.read_sql(query_string, conn_bmg, chunksize=1000000):

This loop executes, but how do I get the chunks into a single DataFrame?

If I run df = pd.read_sql(query_string, conn_bmg, chunksize=1000000)
and then inspect df, I get <generator object SQLiteDatabase._query_iterator at 0x000001BA83B44480> instead of a DataFrame.
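For what it's worth, when chunksize is given, pd.read_sql returns an iterator of DataFrames rather than one DataFrame; the chunks can be processed one at a time inside the loop, or stitched together with pd.concat. A minimal sketch, using a small in-memory SQLite table to stand in for the real database (conn_bmg and query_string here are placeholders for your own connection and query):

```python
import sqlite3
import pandas as pd

# Small in-memory database standing in for the real connection (conn_bmg).
conn_bmg = sqlite3.connect(":memory:")
conn_bmg.execute("CREATE TABLE t (id INTEGER, val REAL)")
conn_bmg.executemany("INSERT INTO t VALUES (?, ?)",
                     [(i, i * 0.5) for i in range(10)])
conn_bmg.commit()

query_string = "SELECT id, val FROM t"

# With chunksize set, read_sql yields DataFrames chunk by chunk
# instead of returning a single DataFrame.
chunks = pd.read_sql(query_string, conn_bmg, chunksize=3)

# Concatenate the chunks into one DataFrame. For truly huge data you
# would instead aggregate or filter each chunk inside a loop so the
# full table never sits in memory at once.
df = pd.concat(chunks, ignore_index=True)
print(len(df))
```

Note that pd.concat still materialises everything in memory, so this only helps if the final DataFrame itself fits in RAM; otherwise reduce each chunk as you go.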

Please can anyone help me with this and correct my syntax?


thanks in advance