on 09-30-2015 2:46 PM
Hi All,
1) We are trying to load data from Oracle into BPC 10.0 MS version through FIM, where we have 5,600,000 records in the source (Oracle).
We need to load the data for the month of Aug-15, which comes to nearly 8,000 records.
When I run the job, it takes one hour to load those 8,000 records. Could you please suggest any way to improve the performance?
2) It also takes a long time just to open the mapping tables. Please suggest how we can improve the system speed.
Thanks in advance.
Srikanth
Hi Srikanth,
Execution of the job is purely a Data Services activity - I suggest you create the post on
Having said that, it will be very hard to answer this in an SCN post: performance has so many impacting factors (the complexity of the mapping, but also the hardware configuration) that a 'general' statement cannot really be made.
Marc
Poor performance is most probably due to the selection criteria not being pushed down to the underlying database. In that case, the DS job pulls all records into memory before applying the filter. You will have to analyze the generated code in order to determine the exact reason.
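To illustrate why a missing pushdown hurts so much, here is a minimal sketch. It uses Python's built-in sqlite3 as a stand-in for the Oracle source; the table and column names (FACT_DATA, PERIOD) and the row counts are hypothetical, not taken from the actual job.

```python
import sqlite3

# sqlite3 stands in for the Oracle source; FACT_DATA / PERIOD are made-up names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE FACT_DATA (PERIOD TEXT, AMOUNT REAL)")
rows = [("2015.AUG", 1.0)] * 8000 + [("2015.JUL", 1.0)] * 92000
conn.executemany("INSERT INTO FACT_DATA VALUES (?, ?)", rows)

# No pushdown: every row is transferred, and the filter runs on the client.
pulled = conn.execute("SELECT PERIOD, AMOUNT FROM FACT_DATA").fetchall()
client_filtered = [r for r in pulled if r[0] == "2015.AUG"]

# Pushdown: the WHERE clause runs inside the database; only 8,000 rows move.
pushed = conn.execute(
    "SELECT PERIOD, AMOUNT FROM FACT_DATA WHERE PERIOD = ?", ("2015.AUG",)
).fetchall()

print(len(pulled), len(client_filtered), len(pushed))  # 100000 8000 8000
```

Both variants return the same 8,000 records, but the first one moves all 100,000 rows across the network first, which is exactly the behaviour described above when DS cannot push the filter down.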
It's basically a two-step process.
You will find the DS logs in the DS log folder on your DS server. The exact location depends on platform and version. Currently, on Windows, they are in C:\ProgramData\SAP BusinessObjects\Data Services\log\JobServer\<repository name>\. You can find timing information in files called trace*.txt.
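Since each run produces its own trace file, a quick way to locate the one for the slow job is to pick the most recently modified trace*.txt in that folder. A small helper sketch, assuming only the file-naming convention mentioned above (the example path and repository name are placeholders):

```python
import glob
import os

def latest_trace(log_dir):
    """Return the most recently modified trace*.txt in a DS job server
    log folder, or None if no trace files are present."""
    traces = glob.glob(os.path.join(log_dir, "trace*.txt"))
    return max(traces, key=os.path.getmtime) if traces else None

# Hypothetical repository name "MyRepo"; substitute your own path:
# print(latest_trace(r"C:\ProgramData\SAP BusinessObjects\Data Services"
#                    r"\log\JobServer\MyRepo"))
```

Open the file it returns and look at the timestamps between steps to see which data flow consumes the hour.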
To check the actual code that is causing the performance issues you'll then have to open the DS job in the DS Designer tool and navigate to the data flow that is taking too long.