Python Forum
Trying to find the fastest way to bulk extract from oracle
#1
Hi all,
We are on RHEL 8, Oracle DB 12+, and Python 3.9.
I am trying to find the fastest way to extract large amounts of data from an Oracle DB to the local Linux file system as flat files (CSV, TXT, etc.).
Some of the tables we extract from can hold 2-3 billion records, amounting to many GB of data.

I am planning to use cx_Oracle, or its replacement the oracledb module, with 4 threads.

I would greatly appreciate any recommendations from more experienced members who might have done this. Any options using Python would work and we can try those out.

Thank you in advance.
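Here is a rough sketch of what I have in mind so far, using the oracledb driver with fetchmany() batching. The connection details, DSN, and table name below are just placeholders, and I haven't benchmarked this yet:

```python
# Batched extract sketch. The helper works with any DB-API 2.0 cursor,
# so the same code can be tried against other databases for testing.
import csv


def dump_cursor_to_csv(cursor, path, batch_size=10_000):
    """Fetch rows in batches and stream them to a CSV file.

    Returns the number of data rows written (header excluded).
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        # cursor.description gives one tuple per column; item [0] is the name
        writer.writerow(col[0] for col in cursor.description)
        total = 0
        while True:
            rows = cursor.fetchmany(batch_size)
            if not rows:
                break
            writer.writerows(rows)
            total += len(rows)
    return total


def extract_big_table(csv_path):
    """Hypothetical Oracle usage -- credentials, DSN, and table are placeholders."""
    import oracledb  # pip install oracledb (the cx_Oracle successor)
    with oracledb.connect(user="scott", password="tiger",
                          dsn="dbhost/orclpdb") as conn:
        cur = conn.cursor()
        cur.arraysize = 10_000  # rows fetched per network round trip
        cur.execute("SELECT * FROM big_table")
        return dump_cursor_to_csv(cur, csv_path)
```

My understanding is that cur.arraysize controls how many rows come back per network round trip, so matching it to the fetchmany() batch size should avoid extra fetches. Corrections welcome.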
Reply
#2
I have used Pandas (https://pandas.pydata.org/) with impressive results.

I have also found SQLAlchemy to be very efficient: https://www.sqlalchemy.org/

I was surprised by the speed I've seen with PostgreSQL and SQLite (with both packages). I haven't tried Oracle, but would expect similar results.

Search Google and this forum for tutorials.
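As a rough illustration of the chunked pandas approach (the connection string, query, and table name here are placeholders, and I haven't run this against Oracle specifically):

```python
# Stream a large query result to CSV in chunks via pandas + SQLAlchemy,
# so the full result set never has to fit in memory at once.
import pandas as pd
from sqlalchemy import create_engine


def dump_query_to_csv(engine, query, path, chunksize=100_000):
    """Read a query in chunks and append each chunk to one CSV file.

    Returns the total number of rows written.
    """
    first = True
    total = 0
    for chunk in pd.read_sql_query(query, engine, chunksize=chunksize):
        # Write the header only with the first chunk, then append
        chunk.to_csv(path, mode="w" if first else "a",
                     header=first, index=False)
        first = False
        total += len(chunk)
    return total


# Hypothetical Oracle usage (requires the oracledb driver installed):
# engine = create_engine("oracle+oracledb://scott:tiger@dbhost/?service_name=orclpdb")
# dump_query_to_csv(engine, "SELECT * FROM big_table", "big_table.csv")
```

The chunksize argument is what keeps memory flat; without it, read_sql_query loads the entire result into one DataFrame.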
Reply
#3
(Aug-30-2022, 11:49 AM)Larz60+ Wrote: I have used Pandas (https://pandas.pydata.org/) with impressive results [...]

Thank you for your response. I'll definitely check it out.
Reply
