For (mysql) executemany one needs to pre-collect the data one wants to commit.
In that case you would loop to pre-collect multiple records, then commit the pre-collected data (via executemany) after the loop.
Completely loading/pre-processing a large amount of data up front is generally a bad idea. Processing the source data/file in chunks works better for large data sets (i.e.: read N records, pre-process as you go, commit, do the next N records, ...).
Not sure here (I haven't played around with DBs in a relatively long time), but I think one could delay the actual cursor.execute(): only commit after processing N records. (Make sure to do a final commit after the loop ends normally, to flush any leftover partial batch.)
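A minimal sketch of that chunked pattern. It uses Python's built-in sqlite3 so it is self-contained, but the same DB-API calls (executemany, commit) apply to a MySQL connector; the chunked helper and BATCH_SIZE are illustrative names, not anything from a library:

```python
import sqlite3
from itertools import islice

BATCH_SIZE = 3  # tiny for the demo; something like 1000 is more realistic


def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return  # source exhausted; any partial tail batch was already yielded
        yield batch


conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE records (id INTEGER, name TEXT)")

# Stand-in for a large source file/stream: a generator, so nothing
# is fully loaded into memory up front.
source = ((i, f"name-{i}") for i in range(10))

for batch in chunked(source, BATCH_SIZE):
    # One executemany + one commit per chunk, instead of per record.
    cur.executemany("INSERT INTO records VALUES (?, ?)", batch)
    conn.commit()

cur.execute("SELECT COUNT(*) FROM records")
print(cur.fetchone()[0])  # all 10 rows inserted, including the partial last batch
```

Because the generator yields the leftover partial batch before stopping, the "final commit after the loop" concern is handled inside the loop itself.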