Python Forum

Full Version: Moving large amount of data between MySql and Sql Server using Python
Hi Python developers,

I'm a beginner with Python. I started taking online Python courses and would like to use Python for data analysis.
I'd like to ask whether anyone here has done a massive (1:1) data transfer from MySQL to SQL Server using Python.
If this can be done with Python, I could drop the ETL tool I currently use, which has limitations when it comes to large data transfers between two different DBMS systems.
Therefore, I would like to know if I can do the following steps in Python:

Step 1. Connect to the MySQL database.
Step 2. Access the table I want to read the data from,
               something like "select * from table"
Step 3. Copy the data into a new table on the SQL Server.
                
I want to make a simple copy (1:1) without transformations, data aggregation, etc.
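The three steps above can be sketched with plain DB-API cursors: read the source in batches with fetchmany and write each batch with executemany. This is a minimal sketch, not a production tool; it uses SQLite on both ends so it runs anywhere. For the real transfer you would swap in something like mysql.connector.connect(...) for the source and pyodbc.connect(...) for the target (those connection details, and the table names, are assumptions).

```python
import sqlite3

def copy_table(src_cur, dst_cur, src_table, dst_table, batch_size=10_000):
    """Stream rows from src_table into dst_table in fixed-size batches."""
    src_cur.execute(f"SELECT * FROM {src_table}")
    n_cols = len(src_cur.description)
    # sqlite3 and pyodbc use '?' placeholders; mysql.connector uses '%s'.
    placeholders = ", ".join("?" * n_cols)
    insert_sql = f"INSERT INTO {dst_table} VALUES ({placeholders})"
    copied = 0
    while True:
        batch = src_cur.fetchmany(batch_size)
        if not batch:          # fetchmany returns [] when the cursor is exhausted
            break
        dst_cur.executemany(insert_sql, batch)
        copied += len(batch)
    return copied

# Demo: two in-memory SQLite databases standing in for MySQL / SQL Server.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t (id INTEGER, name TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)", [(i, f"row{i}") for i in range(25)])
dst.execute("CREATE TABLE t (id INTEGER, name TEXT)")

with src, dst:   # commit both connections on exit
    n = copy_table(src.cursor(), dst.cursor(), "t", "t", batch_size=10)
print(n)  # 25
```

Because only batch_size rows are in memory at a time, the same loop works for very large tables; tuning the batch size (and, with pyodbc, enabling cursor.fast_executemany = True before the inserts) is where most of the throughput comes from.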

I found some information about Python data frames (Pandas or Dask), but the examples used files as the data source before inserting into a database.

In my case, should I use the Python SQL driver pyodbc?
Are there other Python drivers that are better suited to extracting data from databases?
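pyodbc is a common choice for the SQL Server side; on the MySQL side, mysql-connector-python and PyMySQL are widely used. If you would rather stay with pandas, note that read_sql accepts a chunksize argument so the whole table never has to fit in memory. A hedged sketch of that pattern, again using SQLite in place of the real servers (pandas accepts a raw sqlite3 connection; for MySQL or SQL Server you would pass an SQLAlchemy engine instead):

```python
import sqlite3
import pandas as pd

src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
src.executemany("INSERT INTO sales VALUES (?, ?)", [(i, i * 1.5) for i in range(1000)])
src.commit()

# chunksize=250 makes read_sql yield DataFrames of at most 250 rows each,
# so the full table is never held in memory at once.
for chunk in pd.read_sql("SELECT * FROM sales", src, chunksize=250):
    chunk.to_sql("sales", dst, if_exists="append", index=False)

total = dst.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(total)  # 1000
```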

Thanks,
Adrian
I once used this tool to export data from an MS Access database to SQLite.
Then you can dump the data with a different program.
The data could also be exported directly to a MySQL server. I haven't tried it, but that target option is available in dbsoft's tool.
Is this something you have to do only once? SQL Server Management Studio (which is free Microsoft software) has a nice import option for data from different kinds of sources.
@DeaD_EyE, thanks for your info!


@Jeff900, I use the SSIS ETL tool, and for small amounts of data I am able to import data from MySQL (over the network) into a SQL Server database. But when I need to fetch large amounts of data (tables with more than 1.5 billion rows), the ETL tool shows its limits :). From what I know, it is a problem with the ADO.NET Source connector. Therefore, I'm trying to replace the ETL tool only for moving large amounts of data from one DBMS to another. Thanks!
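At the billion-row scale a single SELECT * can also overwhelm the client driver, regardless of the ETL tool. One common workaround is keyset pagination: fetch the next batch with WHERE key > last_key on an indexed column. This sketch assumes your table has such a numeric key as its first column (an assumption about your schema) and uses SQLite so it runs anywhere:

```python
import sqlite3

def keyset_batches(conn, table, key, batch_size):
    """Yield batches ordered by an indexed key, never holding the whole table."""
    last = None
    while True:
        if last is None:
            rows = conn.execute(
                f"SELECT * FROM {table} ORDER BY {key} LIMIT ?", (batch_size,)
            ).fetchall()
        else:
            rows = conn.execute(
                f"SELECT * FROM {table} WHERE {key} > ? ORDER BY {key} LIMIT ?",
                (last, batch_size),
            ).fetchall()
        if not rows:
            break
        yield rows
        last = rows[-1][0]  # assumes the key is the first selected column

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE big (id INTEGER PRIMARY KEY, v TEXT)")
conn.executemany("INSERT INTO big VALUES (?, ?)", [(i, "x") for i in range(1, 101)])

sizes = [len(b) for b in keyset_batches(conn, "big", "id", 30)]
print(sizes)  # [30, 30, 30, 10]
```

Unlike OFFSET-based paging, each query here only scans forward from the last key, so batch cost stays flat as you move through the table.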
Do you use the Developer edition of SQL Server? That edition has some limitations, but I'm not sure exactly what they are. One of them is a limit on the number of records / the size of the database, and that may be causing the problem.