on 02-11-2015 5:04 PM
Dear all,
I am currently checking a DTP run.
In the DTP details of the DTP log I see "Generate Request" at 17:29:19, and the next step, "Set Status to 'Executable'" (still before data processing), at 17:35:11. That is an extreme time: nearly 6 minutes.
Do you know what exactly the system executes during this time / how to optimize it?
In the job log I can see the following steps:
Date       | Time     | Message text                                                    | Message class | Message no. | Message type
-----------|----------|-----------------------------------------------------------------|---------------|-------------|-------------
11.02.2015 | 17:29:19 | Job started                                                     | 00            | 516         | S
11.02.2015 | 17:29:19 | Step 001 started (program RSPROCESS, variant &..., user ID ...) | 00            | 550         | S
11.02.2015 | 17:29:19 | Start process DTP_LOAD DTP_50... in run 50.. of chain ...       | RSPC          | 156         | S
11.02.2015 | 17:34:54 | SQL: 11.02.2015 17:34:54 ...                                    | DBMAN         | 099         | I
11.02.2015 | 17:34:54 | TRUNCATE TABLE "TESTDATRNRPART0"                                | DBMAN         | 099         | I
11.02.2015 | 17:34:54 | SQL-END: 11.02.2015 17:34:54 00:00:00                           | DBMAN         | 099         | I
11.02.2015 | 17:34:57 | SQL: 11.02.2015 17:34:57 ...                                    | DBMAN         | 099         | I
11.02.2015 | 17:34:57 | TRUNCATE TABLE "TESTDATRNRPART1"                                | DBMAN         | 099         | I
...
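To pin down where the time goes, one can simply diff consecutive timestamps in the job log; the largest gap marks the slow step. A minimal sketch in Python (the entries are copied from the log above with abbreviated messages; in practice the log would be read from the system, not hard-coded):

```python
from datetime import datetime

# Entries copied from the job log above (messages abbreviated).
log = [
    ("11.02.2015 17:29:19", "Job started"),
    ("11.02.2015 17:29:19", "Start process DTP_LOAD"),
    ("11.02.2015 17:34:54", "TRUNCATE TABLE TESTDATRNRPART0"),
    ("11.02.2015 17:34:57", "TRUNCATE TABLE TESTDATRNRPART1"),
]

fmt = "%d.%m.%Y %H:%M:%S"
times = [datetime.strptime(ts, fmt) for ts, _ in log]

# Gap between each entry and its predecessor, in whole seconds.
gaps = [int((b - a).total_seconds()) for a, b in zip(times, times[1:])]
for (ts, msg), gap in zip(log[1:], gaps):
    print(f"+{gap:5d}s  {msg}")
```

Here the 335-second gap before the first TRUNCATE confirms that almost all of the time is spent between "Start process" and the first database statement.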
Hello -
Check for any routines in your DTP filters. The DTP has to generate the filter criteria before performing the extraction; if a routine contains complex logic, you will see the additional time in this step.
It could also be that you need to clean up your basis tables; see SAP Note 1829728 for the BW housekeeping task list. Also consider archiving old requests with transaction RSREQARCH.
Optimization of Data Retention - Data Warehouse Management - SAP Library
Regards,
Tony
Hello,
If the issue still could not be resolved, please have a look at the recently released guided answer for DTP performance:
Guided Answers: Data Transfer Process (DTP) performance/memory issues
Hi Alexander,
are you experiencing any sequential reads on the table RSSTATMANREQMAP when loading? You can check this in SM50. If so, please advise; I have further recommendations.
thanks,
Colm
Have you optimized your parallel loading by implementing the correction notes listed in the following SAP Composite Note...
1827854 - Enqueue/Lock during parallel data loads
These corrections are hugely beneficial for BW parallel optimization.
Thanks,
Colm
Hi Alexander,
Please let me know if your issue is resolved.
Hello Alex,
You can check yourself in the detailed DTP log where the DTP is taking time: expand the DTP request to see the exact time spent in each step.
Also, in the DTP you can activate debugging and step through the code line by line to see where the system is taking time.
There are several ways to optimize DTP performance. Go to transaction RSBATCH for standard DTP settings, such as increasing the number of background processes, and also check whether there are any routines that can be optimized.
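To illustrate the effect of more background processes: data packages are independent units of work, so total runtime drops roughly with the number of parallel workers (up to the number of packages). A minimal sketch using Python threads as a stand-in for BW background processes (function names, package contents, and timings are purely illustrative, not SAP code):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def process_package(pkg):
    time.sleep(0.1)  # stand-in for transforming and loading one data package
    return len(pkg)

packages = [[0] * 100] * 8  # eight equally sized dummy packages
elapsed = {}

for workers in (1, 4):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as ex:
        sizes = list(ex.map(process_package, packages))
    elapsed[workers] = time.perf_counter() - start
    print(f"{workers} worker(s): {elapsed[workers]:.2f}s")
```

With one worker the eight packages run serially; with four they overlap, which is the same lever the background-process setting pulls in a real DTP.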
Regards,
AR
Hi,
a) Packet size of a DTP:
The number specified here has a direct impact on memory consumption during extraction.
If the data is read in semantic groups, all data records for a particular key have to be contained in the same package. For this reason, a package can contain more data records than specified here.
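The packaging rule described in a) can be sketched as follows: a package is closed only after a complete key group has been added, so packages can exceed the nominal packet size. A minimal illustration in Python (the function and sample data are hypothetical, not SAP code):

```python
from itertools import groupby

def build_packages(records, key, package_size):
    """Group records into packages, never splitting records
    that share the same semantic key across packages."""
    packages, current = [], []
    # groupby requires the input to be sorted by the grouping key
    for _, grp in groupby(sorted(records, key=key), key=key):
        current.extend(grp)           # a whole key group is added at once
        if len(current) >= package_size:
            packages.append(current)  # may exceed package_size
            current = []
    if current:
        packages.append(current)
    return packages

records = [("A", 1), ("A", 2), ("A", 3), ("B", 4), ("C", 5)]
pkgs = build_packages(records, key=lambda r: r[0], package_size=2)
# The first package holds all three "A" records, exceeding size 2.
```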
b) If the data volume is huge, you can create several DTPs and perform selective loading at the DTP level as well.
Also check for routines/formula at transformation level.
Hope it helps.
-- KRPK