Migrate / Copy Huge data from Oracle to SQL Server
Problem
I have huge tables in an Oracle database, approx 1 crore+ (10 million+) rows each, and want to migrate / copy those tables and their data to SQL Server.
Currently I am using the Import functionality of SQL Server for this, but it takes a whole day, which is far too long.
Is there a better way? Any better tool or approach (SSIS, or anything else) to follow for this process?
Solution
Separate the two tasks.
One of the good ETL rules to follow is to get in and out of the source system as fast as possible. Dumping the Oracle data to flat files should be orders of magnitude faster than inserting directly, as should the subsequent import of those files to SQL Server.
- Export from Oracle to a flat file. There are several approaches detailed at How Can I Unload Data to a Flat File.
- Import to SQL Server. bcp or BULK INSERT should yield the fastest raw loading speed. There is a useful list of bulk loading references in Optimising BCP performance for BLOB data.
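As a minimal sketch of the import step, the following builds a BULK INSERT statement tuned for raw loading speed (the table name, file path, and delimiter here are hypothetical placeholders, not values from the original question):

```python
# Sketch: generate a T-SQL BULK INSERT statement for loading a flat-file dump.
# Table name, file path, and delimiter are hypothetical placeholders.
def build_bulk_insert(table, data_file, field_sep="~\u00ac",
                      row_sep="\\n", batch_size=100_000):
    """Build a BULK INSERT statement for fast raw loading:
    TABLOCK takes a table-level lock so the load can be minimally logged,
    and BATCHSIZE commits in chunks to keep the transaction log in check."""
    return (
        f"BULK INSERT {table}\n"
        f"FROM '{data_file}'\n"
        "WITH (\n"
        f"    FIELDTERMINATOR = '{field_sep}',\n"
        f"    ROWTERMINATOR = '{row_sep}',\n"
        f"    BATCHSIZE = {batch_size},\n"
        "    TABLOCK\n"
        ");"
    )

print(build_bulk_insert("dbo.BigTable", r"C:\dump\bigtable.dat"))
```

Run the generated statement from SSMS or sqlcmd; for a 10-million-row table, a batched, table-locked bulk load is exactly the pattern the references above recommend.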
I've found BULK INSERT and bcp to be less troublesome (i.e. I create and encounter fewer errors) if you forget CSV, avoid format files, and use a non-standard delimiter instead. I've got "~¬" in use on a bcp-based ETL job at the moment and, so far so good, the data I'm fetching has yet to contain that string.
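To make the non-standard-delimiter idea concrete, here is a small sketch (the sample rows are invented) of dumping and re-reading rows with a "~¬" separator. Because that string is vanishingly unlikely to appear in real data, no CSV quoting, escaping, or format file is needed:

```python
import io

# Sketch: dump rows with a non-standard delimiter ("~\u00ac") instead of CSV.
# No quoting/escaping rules apply, so embedded commas and quotes are harmless.
SEP = "~\u00ac"  # i.e. "~¬"

rows = [
    ("1", "O'Brien, Pat", 'said "hi"'),   # commas and quotes pass through untouched
    ("2", "plain value", "another value"),
]

# Write the flat file (StringIO stands in for a real file on disk).
buf = io.StringIO()
for row in rows:
    buf.write(SEP.join(row) + "\n")

# Reading back is a plain split -- no CSV quoting rules to get wrong.
parsed = [line.split(SEP) for line in buf.getvalue().splitlines()]
assert parsed == [list(r) for r in rows]
```

The same delimiter would then be passed to bcp via -t (field terminator) or to BULK INSERT via FIELDTERMINATOR, as in the sketch above the answer's context line.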
Context
StackExchange Database Administrators Q#47269, answer score: 7