to_sql function takes forever to insert in oracle database · Issue #14315 · pandas-dev/pandas
I am using pandas to do some analysis on an Excel file, and once that analysis is complete, I want to insert the resultant dataframe into a database. The dataframe is around 300,000 rows and 27 columns.

I am using the `DataFrame.to_sql` method to insert the dataframe into the database. When I use a MySQL database, the insertion takes around 60-90 seconds. However, when I insert the same dataframe using the same function into an Oracle database, the process takes around 2-3 hours to complete.
Relevant code can be found below:
```python
# get_engine() returns a SQLAlchemy engine for the target database
# (defined elsewhere); config.CHUNK_SIZE is a project setting.
data_frame.to_sql(name='RSA_DATA', con=get_engine(), if_exists='append',
                  index=False, chunksize=config.CHUNK_SIZE)
```
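Since Oracle is so much slower than MySQL with the exact same call, I wonder whether the default type mapping is part of the problem: as far as I understand, object columns can end up being created as CLOB on Oracle, and binding every string as a LOB would be slow. Below is a sketch of a variant that forces plain VARCHAR columns via the `dtype` argument (the column names and lengths are only placeholders; the real map would cover all 27 columns):

```python
from sqlalchemy.types import VARCHAR

# Placeholder type map: names and lengths here are illustrative,
# not the real schema.
varchar_types = {
    'COL_A': VARCHAR(50),
    'COL_B': VARCHAR(100),
}

# Passing dtype= makes to_sql create these columns as VARCHAR when
# it creates the table, instead of letting the default type mapping
# decide (which can pick CLOB for object columns on Oracle).
data_frame.to_sql(name='RSA_DATA', con=get_engine(), if_exists='append',
                  index=False, chunksize=config.CHUNK_SIZE,
                  dtype=varchar_types)
```

If the table already exists with CLOB columns, I assume it would have to be dropped and recreated for the new types to take effect, since `dtype` only applies at table creation.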
I tried using different `chunksize` values (from 50 to 3000), but the difference in time was only on the order of 10 minutes.
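For reference, this is roughly how I timed the different chunk sizes (a minimal sketch; `get_engine()` and `data_frame` are the same as above, and `RSA_DATA_TEST` is a scratch table, since each run appends all 300k rows again):

```python
import time

# Time one full append per chunksize value against a scratch table.
for size in (50, 500, 3000):
    start = time.monotonic()
    data_frame.to_sql(name='RSA_DATA_TEST', con=get_engine(),
                      if_exists='append', index=False, chunksize=size)
    print('chunksize=%d: %.1f s' % (size, time.monotonic() - start))
```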
Any solution to the above problem?