on 06-24-2014 3:39 PM
Hi Experts,
I really need some help here!
I have to do a full data load from a DSO to an InfoCube with almost 400 million records.
First I tried to split the data load by 0CALDAY, but even six months at a time is too much data. Still, BW has to be able to handle this volume!
I delete the InfoCube indexes before loading. Because of deadlocks, I'm using 20 parallel processes with 500 records per package.
But I noticed something in SM50: the jobs used by the DTP don't refresh; it's as if they are reserved for the DTP. So I have jobs that have been running for over 17,000 seconds.
The data load gets slower the longer it runs.
Does anyone have any ideas?
Is there a way to make the system refresh the background jobs instead of using the same ones for the whole data load?
Oh, and there is no ABAP code in the transformations.
Thanks!
Regards.
Hi,
It's better to run your full load from the DSO to the cube when there is not much other load on the BW server.
Increase the DTP packet size and use more parallel processes if possible.
Create multiple DTPs with different filters to load the data to the cube.
If possible, reschedule the other loads and run only yours.
Get help from the Basis team to add application servers if needed.
Delete the indexes before loading into the InfoCube.
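The "multiple DTPs with different filters" suggestion comes down to carving the 0CALDAY interval into smaller from/to windows and giving each DTP one window as its filter. A minimal sketch of that splitting logic (plain Python, not SAP; the function name and the 30-day chunk size are just illustrative assumptions):

```python
from datetime import date, timedelta

def calday_ranges(start, end, days_per_chunk):
    """Split a 0CALDAY interval into from/to filter windows,
    one per DTP, using the BW-style YYYYMMDD date format."""
    ranges = []
    cur = start
    while cur <= end:
        chunk_end = min(cur + timedelta(days=days_per_chunk - 1), end)
        ranges.append((cur.strftime("%Y%m%d"), chunk_end.strftime("%Y%m%d")))
        cur = chunk_end + timedelta(days=1)
    return ranges

# Example: one year split into roughly monthly DTP filter windows
for lo, hi in calday_ranges(date(2014, 1, 1), date(2014, 12, 31), 30):
    print(lo, hi)
```

Each (from, to) pair would then become the 0CALDAY filter of one DTP, so the windows cover the full range with no gaps or overlaps.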
Thanks
Hi Ramanjaneyulu Korrapati, thanks for your answer.
But that's the problem: I have the server to myself; the only other thing that runs is the backup, once per day.
If I try to increase the DTP packet size, I'll get deadlocks.
I'll split the load across lots of DTPs, but I'd like to know whether there is a way to refresh the background jobs during the load, let's say after 1,000 seconds.
And the indexes are always deleted before the data load.
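As far as I know, BW itself decides how the DTP reuses its batch processes, so there is no switch for this. But the concept being asked about (recycling a long-lived worker after a fixed amount of work) exists elsewhere; for example, Python's `multiprocessing.Pool` has a `maxtasksperchild` option that restarts each worker process after N tasks. This is only an analogy, not SAP functionality, and the `load_package` stub is hypothetical:

```python
from multiprocessing import Pool

def load_package(pkg_id):
    # Placeholder for the work of loading one data package
    return pkg_id * 2

if __name__ == "__main__":
    # maxtasksperchild=25 recycles each worker process after 25
    # packages, the kind of periodic "refresh" asked about above
    with Pool(processes=4, maxtasksperchild=25) as pool:
        results = pool.map(load_package, range(100))
    print(sum(results))  # prints 9900
```

Here the pool still processes all 100 packages, but no single OS process stays alive for the whole run.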
Thanks again.