Using the Auto Correct Load option on a target table degrades the performance of BODS jobs: it prevents a full push-down operation from the
source to the target when the source and target are in different datastores.
That said, Auto Correct Load is sometimes unavoidable: it guarantees that no duplicate rows end up in the target, which makes it very useful for data recovery operations.
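To see why the option is slow, here is a minimal sketch of what Auto Correct Load effectively does (the table and column names are illustrative, not from BODS): each source row is checked against the target by key, then updated or inserted, one row at a time, which is exactly the pattern that blocks a full push-down.

```python
import sqlite3

# Illustrative target table; "id" plays the role of the primary key
# BODS would use to detect an existing row.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("INSERT INTO target VALUES (1, 'old')")

source_rows = [(1, "new"), (2, "fresh")]  # one existing key, one new key

# Auto Correct Load semantics: probe the target per row, then
# UPDATE if the key exists, INSERT otherwise. No duplicates can
# ever be created, but every row costs a round trip.
for key, name in source_rows:
    cur.execute("SELECT 1 FROM target WHERE id = ?", (key,))
    if cur.fetchone():
        cur.execute("UPDATE target SET name = ? WHERE id = ?", (name, key))
    else:
        cur.execute("INSERT INTO target VALUES (?, ?)", (key, name))
conn.commit()

print(sorted(cur.execute("SELECT * FROM target").fetchall()))
# -> [(1, 'new'), (2, 'fresh')]
```

The correctness guarantee (no duplicate keys in the target) comes entirely from the per-row probe, which is why it is safe for recovery runs but expensive at volume.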
So how do we improve performance when dealing with large data volumes?
Using a Data_Transfer transform can improve the performance of the job. Let's see how it works 🙂
Merits:
The idea is to push the work down to the database level.
Add a Data_Transfer transform before the target to enable a full push-down from the source to the target. For the resulting merge operation, the source data must contain no duplicates. The Data_Transfer transform lands the data in the target database, where each record is then either updated (if its key already exists in the target) or inserted (if it does not).
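The step above can be sketched as follows (again with illustrative table names): the Data_Transfer transform bulk-loads the source rows into a transfer table inside the target database, after which the update-or-insert runs as one set-based statement. SQLite's `INSERT ... ON CONFLICT` stands in here for the `MERGE` statement other databases would generate.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("INSERT INTO target VALUES (1, 'old')")

# Data_Transfer step: land the source rows in a transfer table that
# lives in the same database as the target.
cur.execute("CREATE TABLE transfer_tmp (id INTEGER PRIMARY KEY, name TEXT)")
cur.executemany("INSERT INTO transfer_tmp VALUES (?, ?)",
                [(1, "new"), (2, "fresh")])

# Single pushed-down upsert instead of row-by-row probes. The source
# must be duplicate-free on the key for the merge to be deterministic.
# (The "WHERE true" clause is SQLite's recommended way to disambiguate
# an upsert that reads from a SELECT.)
cur.execute("""
    INSERT INTO target (id, name)
    SELECT id, name FROM transfer_tmp WHERE true
    ON CONFLICT(id) DO UPDATE SET name = excluded.name
""")
conn.commit()

print(sorted(cur.execute("SELECT * FROM target").fetchall()))
# -> [(1, 'new'), (2, 'fresh')]
```

The end state is identical to the Auto Correct Load result, but the database does the matching in one pass over the transfer table rather than the job server issuing a lookup per row.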