
SAP Business Warehouse


A recap ...


Since NetWeaver Release 7.0, the SAP Java (J2EE) stack has been a component of the SAP BW reporting architecture and also formed the foundation for BW Integrated Planning (BW-IP). I also spent a lot of time creating documents and giving the main input for the BI-JAVA CTC template, which is described for 7.0x here - SAP NetWeaver 7.0 - Setting up BEx Web - Short ... | SCN - and for 7.3x/7.40 here - New Installation of SAP BI JAVA 7.30 - Options, Connectivity and Security


Then, not really recognized by the audience (not even by me ... ;-), an unspectacular SAP Note was released - Note 1562004 - Option: Issuing assertion tickets without logon tickets - which introduced an extension to the parameter login/create_sso2_ticket. See also the SAP Help background on that topic.

After understanding the impact, I updated the document - SAP NetWeaver BW Installation/Configuration (also on HANA) - without giving it much attention at first. The major difference in my active implementations for SAP BW 7.3x and 7.40 and SAP Solution Manager 7.1 was: no SSO errors at all, and no more connection problems between the ABAP and Java stacks.

Unfortunately, countless SAP Notes, the SAP online help, and SAP tools still refer to the old value login/create_sso2_ticket = 2, but since 7.40 the "correct" value is the system default: login/create_sso2_ticket = 3
BTW: did you know that the parameter can be changed dynamically in tx. RZ11? This allows you to switch the parameter while the BI-JAVA CTC template is running and continue the configuration successfully.



Finding out the BI-JAVA system status

The easiest way is to call the SAP NetWeaver Administrator at http://server.domain.ext:<5<nr>00>/nwa and proceed to the "System Information" page.



Details of the line "Version": 1000. ...

                                                            (format: Main Version.SPS.PL.date)


Running the BI-JAVA CTC template

Call the wizard directly with the following URL - http://server.domain.ext:<5<nr>00>/nwa/cfg-wizard



Checking the result

Now that this hurdle is taken, we have to check the BI-JAVA configuration with the BI diagnostic tool or directly in the System Landscape of the EP.


And the Result in the BI Diagnostic tool (version 0.427)

Note 937697 - Usage of SAP NetWeaver BI Diagnostics & Support Desk Tool


To get to this final state, you additionally have to check/correct the following settings in the NetWeaver Administrator for evaluate_assertion_ticket and ticket according to SAP Note 945055. (The Note has not been updated since 2007.)


[trustedsys1=HBW, 000]

[trusteddn1=CN=HBW, OU=SSL Server, O=SAP-AG, C=DE]

[trustediss1=EMAIL=xxx, CN=SAPNetCA, OU=SAPNet, O=SAP-AG, C=DE]

[trustedsys2=HBW, 001]

[trusteddn2=CN=HBW, OU=SSL Server, O=SAP-AG, C=DE]

[trustediss2=EMAIL=  , CN=SAPNetCA, OU=SAPNet, O=SAP-AG, C=DE]

As we are now using assertion tickets instead of logon tickets, the RFC connection from ABAP to JAVA looks a bit different:


Cross-check as well with the entry for the default Portal in the ABAP backend (tx. SM30 => RSPOR_T_PORTAL).


Activating the BEx Web templates

OK, this is also solved now. Now that the BI-JAVA connection technically works, we can check whether the standard BEx Web template 0ANALYSIS_PATTERN works correctly. Please remember that you have to activate the necessary web templates from the SAP BW-BC at least once; otherwise follow the SAP Note

Note 1706282 - Error while loading Web template "0ANALYSIS_PATTERN" (return value "4")


Now you can call the report RS_TEMPLATE_MAINTAIN_70 with tx. SE38 and choose 0ANALYSIS_PATTERN as the Template ID.


Running BEx Web from RSRT

OK. This only proves that the BEx Web template can be called directly. But what happens when you call the web template or any query from tx. RSRT/RSRT2? You will encounter (as I did) that this is a completely different story. Tx. RSRT has three different options to show the result of a query in a web-based format, and we are interested in the "Java Web" based output.


The recently added "WD Grid" output is nice to use together with the new BW-MT and BW-aDSO capabilities with SAP BW 7.40 on HANA.

But what we see is this:


Hmm? We checked the BI-JAVA connection and the standard BEx Web Template and still there is an error? Is there a problem with RSRT? Is the parameter wrong?

No. Recently (again not really recognized by the audience) another SAP Note was released which also has an impact on SAP EP 7.3x/7.40:

Note 2021994 - Malfunctioning of Portal due to omission of post parameters

After applying the necessary corrections to the SAP BI-JAVA EP instance, tx. RSRT finally shows the correct output as well:



If you encounter the following error:


"DATAPROVIDER" of type "QUERY_VIEW_DATA_PROVIDER" could not be generated

Cannot load query "MEDAL_FLAG_QUERY" (data provider "DP_1": {2})



This is solved by the following SAP Note (to be applied in the ABAP backend) - Note 2153270 - Setup Selection Object with different handl id

If the following error ("classic RSBOLAP018") occurs, it can have different causes, e.g. outdated BI-JAVA SCAs, or a user or connectivity problem.

"RSBOLAP018 java system error An unknown error occurred during the portal communication"

Some of them can be solved directly with the following SAP Notes:


Note 1573365 - Sporadic communication error in BEx Web 7.X

Note 1899396 - Patch Level 0 for BI Java Installation - Detailed Information

Note 2002823 - User-Specific Broadcaster setting cancels with SSO error.

Note 2065418 - Errors in SupportDeskTool

More Solutions can be found here - Changes after Upgrade to SAP NetWeaver BW 7.3x


Finally ...

On this journey of finding the real connection, I also found some helpful KBAs, which I added to the existing SCN document - Changes after Upgrade to SAP NetWeaver BW 7.3x - in the 7.3x JAVA section. The good news: in the end the documents are very nice for background knowledge, but they are hardly needed at all if you stick to the automated configuration.

So, of course the BI-JAVA - EP configuration is a hell of a beast, but you can tame it ... ;-)

I hope this brings a bit of light into the successful BI-JAVA configuration.

Best Regards

Roland Kramer, PM BW/In-Memory

"the happy one's are always curious"

Business data is often viewed as the critical resource of the 21st century. The more current the business data is, the more valuable it is considered. However, historic data is not worthless either. To offer the best possible - meaning the most performant, consistent, and correct - access to data given a fixed budget, we need to know: who consumes which slice of our business data at what point in time? This blog is about how to find valid answers to this question from the perspective of a BW administrator.

Access to the data is granted via SAP BW's analytic engine. SAP BW users access the data via a BEx Query. The analytic engine in turn requests the data from the persistency services. BW (on HANA) offers a multi-temperature data lifecycle concept: data stored in-memory in columnar format, usage of the non-active data concept, the HANA Extended Storage (aka Dynamic Tiering), usage of the Nearline Storage options, or archiving - and, of course, you can delete the data.

Now given our fixed budget, how should we find out how to distribute the data across the different storage layers?

SAP BW on HANA SP 8 comes equipped with the "Selection Statistics", a tool designed to track data access and then assist in finding a proper data distribution. With the selection statistics you can record all data access requests of the analytic engine on your business data. The selection statistics can be enabled per InfoProvider. If enabled, then for each data access request the minimal and maximal selection date on the time dimension, the name of the InfoProvider, the name of the accessing user, and the access time are stored.

One of the major use cases for the "Selection Statistics" is the "Data Aging" functionality in the Administrator Workbench (Administrative Tools->Housekeeping->Data Aging), which proposes time slices for shifting data to the Nearline Store. Technically, the "Data Aging" tool assists in creating:

  • Data Archiving Processes
  • Parametrizations (variants) of Data Archiving Processes, containing the proposed time slice
  • Process Chains that schedule the Data Archiving Processes

The recording of selection statistics is currently limited to time slices only. This limitation was introduced to

a)      keep the amount of recorded data under control

b)      minimize the impact on the query runtime due to the calculation of the data slices

c)      emphasize time filters, which are usually provided in all queries and are the most important criteria when it comes to data retention and lifecycle considerations.

If you agree with this, fine; otherwise feel free to post a comment and share your view.


Here are some screenshots that demonstrate the use of the tools:

1.)    Customizing the selection statistics (transaction SPRO)



2.)    Analyzing the selection statistics




3.)    Using selection statistics for Data Aging


Scenario :


In our project we are using a statistical method to calculate the number of products left at a customer location, considering their past (cumulative) sales and natural retirement over time. To calculate the retirement we use a statistical density function to predict retirement over time. To get the current product base we subtract the predicted retirement from the total sales over time.


Now, as this prediction will not give 100% correct values (in fact it never will), the business wants to update the "Current Product Base" in case that information is available via field intelligence, i.e. from the sales representative.




For example, in row 1, our model predicts the "Current Product Base" for customer C1 as of April 2015 for product P1 as 50. However, my sales representative knows it is exactly 60. So he/she updates this value to 60 manually. We used the Integrated Planning functionality in BW to achieve that. Now we want to capture who changed the values and when the changes were made.


Step-by-Step Procedure:

1.  Create a Direct Update DSO to log the changes:

We logged the changes in a Direct Update DSO. So first we need to create some characteristics relevant for logging and then create the Direct Update DSO.

We have used 0DATE, 0TIME, ZUSERNM (to hold the user information) and ZSAVEID to log the changes, and created a DSO with 0DATE, 0TIME, ZUSERNM and ZSAVEID as key fields, together with the other characteristics relevant for the business.


        InfoObject Settings:

(screenshot)

Now we will create a DSO and change the type of the DataStore Object to "Direct Update" in the settings. We shall use all our business keys and the above-mentioned 4 characteristics as the key of the DSO.


(screenshot)

In the data fields of the DSO you can include all the key figures which are supposed to be manually updated. In our scenario it is the actual value of the product base.
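To make the target structure concrete, the resulting log record can be sketched as an ABAP type (only a sketch: the type of ZSAVEID and the business-key fields ZCUSTOMER/ZPRODUCT are assumptions for this example and must be replaced by your own objects):

  TYPES: BEGIN OF ty_log_record,
           zdate     TYPE d,            " 0DATE - change date
           ztime     TYPE t,            " 0TIME - change time
           zusernm   TYPE syuname,      " ZUSERNM - user who made the change
           zsaveid   TYPE char32,       " ZSAVEID - generated save ID (type assumed)
           zcustomer TYPE char10,       " business key (example field)
           zproduct  TYPE char10,       " business key (example field)
           zact_base TYPE p LENGTH 15 DECIMALS 2, " manually corrected key figure
         END OF ty_log_record.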



2. Create an Enhancement Spot Implementation to log the changes in the DSO:

Now we shall implement an Enhancement Spot which will do the job of logging the manual updates. Every time a user updates a value in the real-time cube, the system will generate a Save ID and push it to our DSO along with the user name, date and time.


Go to transaction SE18 and choose Enhancement Spot RSPLS_LOGGING_ON_SAVE. Choose the tab Enhancement Implementation and click on Implement Enhancement Spot (highlighted).

(screenshot)

Enter the name of your implementing class and a description and choose OK. Select a suitable package, then fill the screen below with the BAdI name and class name and choose the BAdI definition.

(screenshot)


(screenshot)


    Now we have to work on two things: 1) the implementation class and 2) the filter.


    Let us work with the implementation class first. The class has methods which do the actual work for us. We have to put our code in those methods.


    Double-click on the implementation class of the BAdI definition.

(screenshot)

  This brings up the screen below, where you can see the methods of the implementation class. We have to put our code inside these methods. Please check the attachment for the code with comments. You need only minimal adjustments to the code to adapt it to your scenario.

(screenshot)


Here we need to define for which real-time cube logging is activated. The cube name is checked against the i_infocube_name parameter. Additionally, I put in my user name, so that for now only changes by my user ID are logged. Later on we shall comment out the second statement.
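As an illustration, the check in this first method can look like the following sketch (the method and parameter names should be verified against the BAdI definition in your system; ZRT_CUBE and MYUSER are placeholders):

  METHOD log_defined.                     " method name per the BAdI definition
*   Activate logging only for our real-time InfoCube
    CLEAR e_log_defined.
    IF i_infocube_name = 'ZRT_CUBE'.      " placeholder: your real-time cube
      e_log_defined = 'X'.
    ENDIF.
*   During testing, log changes of a single user only
*   (comment this statement out later, as described above):
    IF sy-uname <> 'MYUSER'.              " placeholder: your user ID
      CLEAR e_log_defined.
    ENDIF.
  ENDMETHOD.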


(screenshot)



This method gives us the structure of the data which will be logged. In our case it provides the structure of the DSO where I am storing the log. Please check the appendix for the code adjustments, with all relevant comments for understanding.




This method actually writes the data to the Direct Update DSO in the structure defined in method 2.

Here we need to mention for which real-time cube we want to log the changes and where (in our case the Direct Update DSO). It could also be a DB table.



You can use this method to write the log to a database table if you are using HANA as the DB.



You can also use this method to write the log to a database table if you are using HANA as the DB.

In our case we are tracking the changes in the DSO, so we did not use methods 4 or 5. Still, we activated these two methods (d and e) as well; otherwise the BAdI activation threw an error.


**** Please check attached document for complete code

Once we have put all our code into the respective methods, we need to fill the filter for this BAdI implementation. Double-click on the filter area and enter your real-time cube name.


(screenshot)


3. Log in to the Planning Workbook and Update Values:

Now we need to log in to our planning workbook, manually adjust the number of the product base, and then save it in the real-time cube.



Note that we have changed the Actual Product Base for the first 4 rows and saved them in the planning cube.


We will check our Direct Update DSO to see if our BAdI has logged all those changes and the user ID of the person who changed them.




As we can see, it logged my user ID together with the date, time and Save ID for the change I made. If you want to pass only the last change time and the changing user on to some other target, you can read just the latest record by sorting by time.
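Reading only the latest record can be sketched like this (the internal table lt_log over the DSO's active table and the field names are assumptions following the logging characteristics above):

  SORT lt_log BY zdate DESCENDING ztime DESCENDING.
* after sorting, the first row holds the most recent change
  READ TABLE lt_log INTO ls_latest INDEX 1.
  IF sy-subrc = 0.
    WRITE: / 'Last changed by', ls_latest-zusernm,
             'on', ls_latest-zdate, 'at', ls_latest-ztime.
  ENDIF.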


Please find the complete code at the link below (Dropbox); you just need to adjust the highlighted portions.

Dropbox - Class Methods.pdf



Debug Tips: If you face any problems, set external breakpoints inside the methods one by one and debug.



For some more detail, please check How to... Log Changes in Plan Data when using the SAP BW Planning Applications Kit






Hi all,


Despite having authorizations for this InfoProvider, an error occurs on a DTP (XXX -> YYY) while executing a process chain. The message displayed is:






But the user does have authorization for InfoProvider XXX (and for YYY):




If we generate a trace in tcode ST01, we can see that the error (RC=4) is related to the authorization object S_BTCH_ADM:







To avoid this error the following authorization (authorization object S_BTCH_ADM) should be granted to the user:




This is because the DTP is running a serial extraction. In this case the user needs the authorization to manage background processing. If you do not want to grant this authorization to the user, you can switch to "Parallel Extraction" mode and the authorization problem is solved as well:
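For reference, the failed check visible in the ST01 trace (RC=4) corresponds to an authority check along these lines; field BTCADMIN = 'Y' marks a background processing administrator:

  AUTHORITY-CHECK OBJECT 'S_BTCH_ADM'
    ID 'BTCADMIN' FIELD 'Y'.
  IF sy-subrc <> 0.
*   RC=4 in the trace: the user lacks background administration rights
  ENDIF.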




Best Regards,




Hello everyone, I recently completed a BEx upgrade project, and I found that the information below should be helpful for folks who work on similar projects.

Business Scenario


Consider a master data InfoObject that has a large number of attributes, and the business wants to display only a selected number of attributes, or to set the sequence of the attributes for display, in the F4 input help in the report output.




In the example below I have considered the Employee master data, having Home Sub-LoS 5 to 4 appear in the F4 input help window in sequence instead of all the other attributes that exist. When I execute a report, I see that the sequence maintained in the F4 input help/filter is the same as in the Attribute tab.


TCODE -> RSD1 -> Select the Appropriate InfoObject -> Attribute Tab


(screenshot)


(screenshot)


SAP Notes for Reference


1080863 - FAQ: Input helps in NetWeaver BI




Abhishek Shanbhogue



You have the requirement of updating a DataSource in the source system, for example changing the extract structure or changing the extractor from a view to a function module. You do not have authorization for the RSA2 transaction and cannot wait for the SP release or upgrade the SP level.


In standard BI, you can only change the Extractor fields of the DataSource in RSA6 transaction.


In this case, you can write a Z report to achieve the desired results.


I am providing a sample report which changes the Extract Structure for the DataSource without accessing the RSA2 transaction.









REPORT z_0gt_hkpstp_text.

*---------------------------------------------------------------------*
* Sample report: change the extract structure of DataSource           *
* 0GT_HKPSTP_TEXT in tables ROOSOURCE and ROOSFIELD without RSA2      *
*---------------------------------------------------------------------*

* Test flag on the initial screen (test run is the default)
PARAMETERS: test AS CHECKBOX DEFAULT 'X'.

DATA: lv_oltpsource TYPE roosource-oltpsource VALUE '0GT_HKPSTP_TEXT',
      lv_objvers_d  TYPE roobjvers  VALUE 'D',
      lv_objvers_a  TYPE roobjvers  VALUE 'A'.
DATA: ls_roosource_old TYPE roosource,
      ls_roosource_new TYPE roosource.
DATA: ls_roosfield_old TYPE roosfield,
      ls_roosfield_new TYPE roosfield.
DATA: txt(24) TYPE c.

START-OF-SELECTION.

*---------------------------------------------------------------------*
* Step 0: Warning and run mode                                        *
*---------------------------------------------------------------------*
  WRITE: / 'Dear Customer.',
         / 'You are just running a report, which will change the',
         / 'structure of DataSource 0GT_HKPSTP_TEXT on your database.',
         / 'In case of doubt please contact SAP.'.

  IF test IS INITIAL.
    WRITE: / 'Mode...: Update-Run'.
    txt = '  <-successfully updated'.
  ELSE.
    WRITE: / 'Mode...: Test-Run'.
    txt = '                      '.
  ENDIF.

*---------------------------------------------------------------------*
* Step 1: Change version "D" of the DataSource in ROOSOURCE           *
*---------------------------------------------------------------------*
* ..1.1 get current values for protocol
  SELECT SINGLE * FROM roosource INTO ls_roosource_old
    WHERE oltpsource = lv_oltpsource
    AND   objvers    = lv_objvers_d.
  IF sy-subrc IS INITIAL.
* ..1.2 build workarea for update
    ls_roosource_new = ls_roosource_old.
    ls_roosource_new-exstruct = 'WB2_TEXTSTR1'.
    IF test IS INITIAL.
      UPDATE roosource FROM ls_roosource_new.
      IF sy-subrc IS NOT INITIAL.
        WRITE: / 'Error on update table ROOSOURCE (version "D").'.
      ENDIF.
    ENDIF.
  ELSE.
    WRITE: / 'DataSource "0GT_HKPSTP_TEXT" not found in version "D".',
           / 'Nothing to do ... bye.'.
  ENDIF.

*---------------------------------------------------------------------*
* Step 2: Change version "A" of the DataSource in ROOSOURCE           *
*---------------------------------------------------------------------*
* ..2.1 get current values for protocol
  SELECT SINGLE * FROM roosource INTO ls_roosource_old
    WHERE oltpsource = lv_oltpsource
    AND   objvers    = lv_objvers_a.
  IF sy-subrc IS INITIAL.
* ..2.2 build workarea for update
    ls_roosource_new = ls_roosource_old.
    ls_roosource_new-exstruct = 'WB2_TEXTSTR1'.
    IF test IS INITIAL.
      UPDATE roosource FROM ls_roosource_new.
      IF sy-subrc IS NOT INITIAL.
        WRITE: / 'Error on update table ROOSOURCE (version "A").'.
      ENDIF.
    ENDIF.
  ELSE.
    WRITE: / 'DataSource "0GT_HKPSTP_TEXT" not found in version "A".',
           / 'Nothing to do ... bye.'.
  ENDIF.

*---------------------------------------------------------------------*
* Step 3: Mark field SPRAS as selection field in ROOSFIELD            *
*         (versions "A" and "D")                                      *
*---------------------------------------------------------------------*
  SELECT SINGLE * FROM roosfield INTO ls_roosfield_old
    WHERE oltpsource = lv_oltpsource
    AND   objvers    = lv_objvers_a
    AND   field      = 'SPRAS'.
  IF sy-subrc IS INITIAL.
    ls_roosfield_new = ls_roosfield_old.
    ls_roosfield_new-selection = 'X'.
    IF test IS INITIAL.
      UPDATE roosfield FROM ls_roosfield_new.
    ENDIF.
  ELSE.
    WRITE: / 'Field "SPRAS" not found in version "A".',
           / 'Nothing to do ... bye.'.
  ENDIF.

  SELECT SINGLE * FROM roosfield INTO ls_roosfield_old
    WHERE oltpsource = lv_oltpsource
    AND   objvers    = lv_objvers_d
    AND   field      = 'SPRAS'.
  IF sy-subrc IS INITIAL.
    ls_roosfield_new = ls_roosfield_old.
    ls_roosfield_new-selection = 'X'.
    IF test IS INITIAL.
      UPDATE roosfield FROM ls_roosfield_new.
    ENDIF.
  ELSE.
    WRITE: / 'Field "SPRAS" not found in version "D".',
           / 'Nothing to do ... bye.'.
  ENDIF.

*---------------------------------------------------------------------*
* Step 4: Protocol                                                    *
*---------------------------------------------------------------------*
* 4.1 Header for ROOSOURCE protocol
  ULINE.
  WRITE: / 'Table ROOSOURCE:'.
  WRITE: / '1 Field to update: EXTRACTOR STRUCTURE'.

* 4.2 Protocol for ROOSOURCE (old and new extract structure)
  WRITE: / ls_roosource_new-oltpsource,
    AT 50 ls_roosource_old-exstruct, AT 80 ls_roosource_new-exstruct,
    txt.

* End of report Z_0GT_HKPSTP_TEXT.

In SAP NetWeaver BW release 7.3 a new Analysis Authorizations BAdI was introduced: BAdI RSEC_VIRTUAL_AUTH_BADI as part of Enhancement Spot RSEC_VIRTUAL_AUTH. The authorized values or hierarchy nodes can be determined dynamically during query runtime. It does not require any Analysis Authorization objects or PFCG roles. Virtual Authorizations can be used to enhance any existing "classic" authorization model, i.e. you do not have to make an exclusive choice for one or the other; both classic and virtual can be used simultaneously and complementarily.

I would like to share my implementation experience with virtual Profit Center and Cost Center authorizations. For an introduction please read my blog Virtual Analysis Authorizations - Part 1: Introduction. In this blog we will discuss the use case and chosen approach, the solution overview, the control tables and default hierarchies. All implementation details you can find in my document Implementing Virtual Analysis Authorizations.


As already mentioned in my previous blog, our use case was Profit Center and Cost Center authorizations. We had to deal with hierarchy authorizations as well as value authorizations. There existed multiple hierarchies which had to be authorized on many hierarchy nodes. We urgently needed a more dynamic and flexible approach.

We implemented Virtual Authorizations for Profit Center and Cost Center authorizations next to the classic model for all other Analysis Authorizations. We tried to mitigate the “compliance issue” by introducing a Profit Center Basic and Cost Center Basic authorization object with only : (aggregation) and # (unassigned) authorization. These objects are checked by the BAdI and the Profit Center and Cost Center authorization is only processed if the respective “basic” object is assigned to the user. In our case that was a role-based assignment. This way we enhanced the Virtual Model:


  • An additional access key is required to get authorized;
  • It improves traceability and auditability;
  • It increases compliance with security standards.

Solution Overview

Virtual authorizations can be realized by implementing BAdI RSEC_VIRTUAL_AUTH_BADI as part of Enhancement Spot RSEC_VIRTUAL_AUTH. The Analysis Authorizations are determined dynamically, i.e. during query runtime. Both value and hierarchy authorizations are supported.

Authorizations per user have to be maintained using two central control tables:


  • Value authorizations;
  • Hierarchy authorizations.


Both control tables can be maintained using their own table maintenance dialog. It is recommended to maintain the control tables in every system separately (i.e. no transports) to remain as flexible as possible. An initial mass upload could be facilitated by LSMW (Legacy System Migration Workbench).

Those control tables only have to be maintained once for the respective basis Characteristic, i.e. Profit Center and Cost Center. The authorization for Display Attributes and Navigational Attributes is automatically derived and processed by the BAdI.

Control Tables

The hierarchy authorizations are maintained in control table ZBW_VIRTAUTH_HIE, which is almost identical to table RSECHIE. Here we can enter a Profit Center or Cost Center hierarchy authorization for a particular user.



Figure 1: Control Table - Hierarchy Authorization


The value authorizations are maintained in control table ZBW_VIRTAUTH_VAL, which is almost identical to table RSECVAL. Here we can enter a Profit Center or Cost Center value authorization for a particular user.



Figure 2: Control Table - Value Authorization

Default Hierarchies

Another requirement was the ability to generate hierarchy authorizations based on value authorizations. The rationale behind this is that the majority of reports are based on "default hierarchies". Particular roles, like the Cost Center responsible, do not get any hierarchy authorization and as a consequence were not able to run those reports. At the same time, we wanted to prevent double maintenance.

The solution was to define a third control table for Default Hierarchies: ZBW_VIRTAUTH_DEF. Here you can enter one or more default hierarchies for a Characteristic. The BAdI will then generate the hierarchy authorization for the default hierarchy restricted to the authorized values as leaves in the hierarchy.



Figure 3: Control Table - Default Hierarchy


In the example above we have defined the (standard) hierarchy 1000KP1000 as default hierarchy for Cost Center.


In this blog we discussed the use case and chosen approach, the solution overview, the control tables and default hierarchies. All implementation details you can find in my document Implementing Virtual Analysis Authorizations.

In SAP NetWeaver BW release 7.3 a new Analysis Authorizations BAdI was introduced: BAdI RSEC_VIRTUAL_AUTH_BADI as part of Enhancement Spot RSEC_VIRTUAL_AUTH. The authorized values or hierarchy nodes can be determined dynamically during query runtime. It does not require any Analysis Authorization objects or PFCG roles. Virtual Authorizations can be used to enhance any existing "classic" authorization model, i.e. you do not have to make an exclusive choice for one or the other; both classic and virtual can be used simultaneously and complementarily.

I would like to share my implementation experience with virtual Profit Center and Cost Center authorizations. This introductory blog will discuss the rationale, a comparison between classic and virtual authorizations, and the different call scenarios for which the BAdI is processed. For the solution details please read my blog Virtual Analysis Authorizations - Part 2: Solution Details. All implementation details you can find in my document Implementing Virtual Analysis Authorizations.


The main problem with a classic authorization concept is that it is less flexible in situations with a big user population, many authorization objects/roles and frequent changes. E.g. organizational changes impacting large parts of the organization and ongoing roll-outs with big increments in the user population.

Classic use cases for a more flexible and dynamic approach are Profit Center and Cost Center authorizations. Often we have to deal with hierarchy authorizations as well as value authorizations. There might exist multiple hierarchies which have to be authorized on many hierarchy nodes. The number of required authorization objects and roles is likely to become high.

As a consequence, you can expect the TCD (Total Cost of Development) as well as the TCO (Total Cost of Ownership) to become too high.

Classic versus Virtual Authorizations

Before diving into the Virtual Authorizations, let's try to compare the classic model with the virtual model.



Figure 1: Evaluation Matrix


The biggest drawback of the classic model shows up in efficiency with a big user population in combination with many authorization objects and roles. Here the virtual model shows its added value.

On the other hand, the virtual model is less transparent and clear compared to the classic model. Also in the area of compliance, we do not have out-of-the-box functionality comparable to the classic model.

Different Call Scenarios

During query run-time the BAdI is called multiple times. This might be a bit confusing in the beginning when you start working with the BAdI. There are 3 call scenarios:


  • Call scenario 1: InfoProvider-independent or cross-InfoProvider authorizations;
  • Call scenario 2: InfoProvider specific authorizations ;
  • Call scenario 3: Documents protected with authorizations.


Call scenario 1: InfoProvider-independent or cross-InfoProvider authorizations

Scenario 1 can be called multiple times. Importing Parameter I_IOBJNM is not initial and Importing Parameter I_INFOPROV is initial. Importing Parameter I_T_ATR might be filled with authorization-relevant Attributes of the respective Characteristic, if any.

In this call scenario the following authorization is processed:


  • Authorization-relevant InfoObjects; e.g. I_IOBJNM = '0PROFIT_CTR';
  • Authorization-relevant Attributes; e.g. I_IOBJNM = '0WBS_ELEMT' and I_T_ATR with ATTRINM = '0PROFIT_CTR' *);
  • Authorization-relevant Navigational Attributes; e.g. I_IOBJNM = '0WBS_ELEMT__0PROFIT_CTR'.


*) Display Attributes need full authorization; see also SAP Note 1951019 - Navigation Attribute and Display Attribute for BW Analysis Authorization.


Call scenario 2: InfoProvider-specific authorizations

Scenario 2 will be called once only. Importing Parameter I_IOBJNM is initial and Importing Parameter I_INFOPROV is not initial. You can determine the authorization-relevant InfoObjects using Function Module RSEC_GET_AUTHREL_INFOOBJECTS.

In this call scenario the following authorization is processed:


  • Authorization-relevant InfoObjects; e.g. I_IOBJNM = '0PROFIT_CTR';
  • Authorization-relevant Navigational Attributes; e.g. I_IOBJNM = '0WBS_ELEMT__0PROFIT_CTR'.


Call scenario 3: Documents protected with authorizations

I have not experimented with scenario 3 yet. It can be called in the context of documents which are protected with authorizations. In this case, both Importing Parameter I_IOBJNM and Importing Parameter I_INFOPROV are initial.
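Inside the BAdI method, the three call scenarios can be distinguished by the importing parameters described above; as a sketch (the actual build-up of the authorization data is omitted):

  IF i_iobjnm IS NOT INITIAL AND i_infoprov IS INITIAL.
*   Call scenario 1: InfoProvider-independent or cross-InfoProvider;
*   i_t_atr may contain authorization-relevant attributes
  ELSEIF i_iobjnm IS INITIAL AND i_infoprov IS NOT INITIAL.
*   Call scenario 2: InfoProvider-specific; the authorization-relevant
*   InfoObjects can be determined with function module
*   RSEC_GET_AUTHREL_INFOOBJECTS and processed per object
  ELSE.
*   Call scenario 3: documents protected with authorizations
  ENDIF.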


In this introductory blog we discussed the rationale of virtual authorizations, a comparison between classic and virtual authorizations, and the different call scenarios for which the BAdI is processed. In my blog Virtual Analysis Authorizations - Part 2: Solution Details we will discuss the solution details. All implementation details you can find in my document Implementing Virtual Analysis Authorizations.



Hello everyone, I have been working on multiple BW landscapes and operations support for quite some time, and from my experience batch processing has high visibility among leadership (business), and it is always challenging to refine the existing batch processes and bring down the overall runtimes as part of continuous process improvement.


I have been fortunate to successfully optimize batch processing in multiple instances, and in this blog I intend to share a handful of easy tips to optimize batch processing.

1. DTP - Data Transfer Processes


I have often seen that when people create DTPs they never consider the optimization aspects. There are simple techniques you can use to reduce the runtimes for data loading: with a combination of parallel processing and data packet optimization there can be a dramatic reduction in runtimes.


Increase Parallel Processing: There is a provision in the DTP to increase the number of parallel processes; if you have available work processes, feel free to increase this number and change the job priority. By default this is set to 3 and the job class is set to "C".

(screenshot)

(screenshot)

Another way of parallel processing is to split the data from the source into smaller chunks (in case of a full load from source to target) and run them in parallel with filters applied.


Example: If you have to load Business Partner master data from a CRM/SRM system, you can always split it into chunks depending on the value range for Source/Territory/Type and run the DTPs in parallel.


Data Packet Size: In DTPs you can always vary the data packet size, which is directly proportional to the loading runtime: the smaller the data packet, the shorter the loading time, and vice versa. The default value is 50k records, but it can be changed in edit mode.


Note: At times, even after changing the data packet size, the number of records in a packet won't change; in such cases you will have to change the size of the source package.
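The packet-size arithmetic is simple; as a sketch (the record counts are invented), the number of packets a load is split into is:

```python
import math

# Sketch: number of data packets for a load, given the DTP packet size.
# With the default size of 50,000 records, a 1.2M-record load yields 24 packets.
def packet_count(total_records, packet_size=50_000):
    return math.ceil(total_records / packet_size)

packet_count(1_200_000)  # 24 packets at the default size
```

Fewer, larger packets mean fewer scheduling round-trips but more memory per packet; the right balance depends on your work processes and transformation complexity.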



2. Info Package

For full InfoPackages, too, we can use parallel processing: split the data from the source into smaller chunks and run them in parallel with filters applied. There is also a provision to change the data packet size.



In the Scheduler there are other options ("Timeout Time" and "Treat Warnings") as well; these are not for runtime optimization but are helpful if you encounter timeout errors or if warnings are to be ignored.

3. DSO


DSO activation can be slow if the batch tables are large, as these are read during object activation. You can always ask the BASIS team to clean such tables with report RSBTCDEL2 (see also Tcode SM65).


The BASIS/SQL team should also consider updating the statistics for the DSO and reorganizing/defragmenting the tables if required. This can be a routine activity based on your requirements and needs.


There is a provision to create a secondary index for DSO tables to optimize runtimes; this can be done either by the SQL DBA team or in the BW system via Tcode SE11.


If you are not reporting on the DSO, the activation of SIDs is not required (it can take up considerable time during activation). Often the logs show that the activation job spends almost all its time scheduling RSBATCH_EXECUTE_PROZESS as job BIBCTL_*. RSBATCH_EXECUTE_PROZESS schedules and executes the SID-generation process. If you don't need the DSO for reporting and you don't have queries on it, you can remove the reporting flag in the DSO maintenance; this is a good way to speed the process up significantly. Check under 'Settings' in the DSO maintenance whether you have flagged the option "SID Generation upon Activation".

Helpful SAP Notes & Documents

SAP Note 1392715: DSO req. activation: collective perf. Problem note

SAP Note 1118205: RSODSO_SETTINGS Maintain runtime parameter of DSO

SDN Document: http://scn.sap.com/docs/DOC-45290



Abhishek Shanbhogue


In this blog I just want to share a few tips related to BW transports that might help with efficient collection & validation.




Though it's a personal choice, some settings are recommended while others are chosen based on personal comfort level.

The following are the settings I prefer for a clearer one-shot view:




It's the most obvious setting, but recommending it implies we should collect different objects by type only. Preferably, instead of dragging in InfoProviders with "Data Flow Before" to collect a transformation, we should go & collect the specific transformation from the corresponding object type. For example-





The following setting can be chosen once the required objects are dragged in for collection:


Using this setting in conjunction with the "Necessary Objects" setting makes the picture very clear on what is to be selected & what not, even for BEx Queries or transformations. For example:-


Here we can right-click on the required object type & click "Transport All Below". Similarly, the following is a sample for BEx Query collection:




Grouping of BW Objects in Transports

I think there is no fixed rule for this, but the objective is a complete transport without errors & an import in a reasonable amount of time.

Following can be two strategies for grouping:

1) If we are sending our development for the first time & we have a large number of data models & reports, then this strategy is recommended:

Separate Transport Requests based on following groups-

a) Infoobjects, Infoobjects Catalogs & Infoareas

b) Infoproviders

c) Datasources & Infopackages

d) Master Data Transformations & DTP

e) Transaction Data [First Level] Transformations & DTP (transformations which are between datasource & infoprovider)

f) Transaction Data [Second Level & Upwards] Transformations & DTP (transformations which are between infoprovider & infoprovider)

g) Process Chains

h) BEx Queries

i) Customer Exit Codes

If the number of objects in any group is very high, that group can be divided into parts; when the number of objects is too high, importing that transport can sometimes become a nightmare.

This is a very generic sequence, but the important thing is to take care of dependencies, i.e. dependent objects should go in a second step once the main objects are moved.

While releasing transports, the system itself checks dependent objects and gives warnings or errors accordingly.


A possible question in this section can be: "Why did we collect 2 different transports for different levels of transaction data transformations?"

This is required only if we have multiple clients of ECC QA for testing but a single client of BW. In this case BW will have two source systems connected, hence we will need to transport all TRs with a) first-level transformations (between datasource & infoprovider) and b) datasources two times, each time with the correct destination client in "Conversion of Logical System Names":




2) This strategy can be used when we are making ad-hoc transports. For example, if we want to transport only 1 simple data model & 1 query, then all objects can be transported together in the same transport request. This approach is not recommended when transporting a complex data model where the total number of objects to be moved is very high.



Some Tips for Quicker Collection

This tip is mainly for collecting a large number of transformations. Suppose we have a list of 36 [random number] master data models (36 InfoObjects, some with ATTR, some with TEXT & some with all three: HIER, ATTR, TEXT) to be collected, and we decided to collect these in 3 separate transport requests of 12 InfoObject data flows each, to avoid large import times. [We need to decide whether to move all 36 together in one TR or break them into multiple TRs based on the complexity of the objects & the total number of objects in one transport request.] For the sake of simplicity, suppose all master data transformations are between a datasource & an InfoObject.

In Excel we can use the CONCATENATE formula to generate a list of the following pattern-



This list can be used as shown in screenshot below-


This trick may seem overkill for collecting 2 or 3 transformations, but when collecting a large number (15-20 or more) of transformations from an even larger group (100 or more), it will come in handy.

This trick will reduce the number of objects shown in the "Select Objects" pop-up and show only the most relevant ones. Now the required transformations can be quickly selected & the selection transferred-




Note: if we change the CONCATENATE formula a little, we can achieve the following results as well:

a) Restricting result set only for specific source system - "RSDS*<SOURCE SYSTEM NAME>*<INFO...*"

b) Restricting only those transformations which are between BW objects- "TRCS*<INFOBJECT>*"

This technique of applying filters based on wildcard characters can be used for collecting almost all object types (except BEx Queries).
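The Excel CONCATENATE trick can equally be sketched in a few lines of Python. The pattern shapes follow the ones described above (RSDS* for transformations from a datasource, TRCS* for transformations between BW objects); the InfoObject names are invented for illustration:

```python
# Sketch: generate wildcard search patterns for the transport collector's
# "Select Objects" filter, one pattern per InfoObject data flow.
infoobjects = ["ZCUSTOMER", "ZMATERIAL", "ZVENDOR"]  # hypothetical names

# Transformations from a datasource to the InfoObject
rsds_patterns = [f"RSDS*{io}*" for io in infoobjects]

# Transformations between BW objects (InfoSource to InfoObject)
trcs_patterns = [f"TRCS*{io}*" for io in infoobjects]
```

Pasting the generated list into the select options of the object type narrows the pop-up to just the relevant transformations, exactly as with the Excel-generated list.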

If we maintain a very robust development tracker, we can directly take the UIDs of transformations (or DTPs) for filtering [the list in the select options of the object type, as shown in the 2 screenshots above] using tables RSTRAN (or RSBKDTP), by making a selection on source & target object (or other selection fields based on readily available information like source type or target type).

Another point worth noting: if we are collecting DTPs & transformations in the same transport request:

We can use the same wildcard technique for DTPs as well. We just need to drag in all required DTPs, and in one shot ["Necessary Objects"] we can collect both the DTPs & the corresponding transformations (plus dependent routines & formulas).


This trick is easier to apply if we maintain a development tracker listing developed/changed objects by object type - InfoObject, InfoProvider, Transformations, BEx, Chains etc.



Validation of Transports

When we are working with multiple people in a team, it is a good idea to validate the transports before releasing them.

The following two tables are a starting point-

1) E070 - this table will give you the list of subtasks in a transport request

2) E071 - this table will give all objects captured in the subtasks, by object type


We can use the following link to check different system tables by different object types:



This link might not have all the system tables, but by making use of the wildcard character "*" we can find many more.

Some tables for reference-

1) RSZCOMPDIR for verifying BEx Query Tech Names

2) RSZELTDIR for checking different Query elements

3) RSTRAN for verifying Transformations

4) RSTRANROUTMAP & RSTRANSTEPROUT can help in identifying the routines of a transformation, based on table RSTRAN

5) RSBKDTP for verifying DTP's


Making use of the tables listed above, we can perform quick & basic validations (using Excel & VLOOKUP), for example:

a) Whether all relevant routines are captured for the transformations collected in a transport request

b) Whether the different objects (by object type) captured in the transport request match the development tracker
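The Excel/VLOOKUP cross-check can be sketched as a simple set comparison. This is an illustrative Python sketch, not an SAP tool: it assumes you have exported the object list from table E071 for the transport request and have the expected object list from your development tracker (the object names below are invented):

```python
# Sketch: compare the objects captured in a transport request (from E071)
# against the expected objects from the development tracker.
def validate_transport(e071_objects, tracker_objects):
    captured = set(e071_objects)
    expected = set(tracker_objects)
    return {
        # Expected by the tracker but missing from the transport
        "missing_from_transport": sorted(expected - captured),
        # Captured in the transport but not listed in the tracker
        "not_in_tracker": sorted(captured - expected),
    }

result = validate_transport(
    e071_objects=["TRAN_A", "ROUT_1", "DTPA_X"],
    tracker_objects=["TRAN_A", "TRAN_B", "ROUT_1"],
)
```

An empty `missing_from_transport` list is the quick signal that the collection is complete; anything in `not_in_tracker` is worth a second look before release.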



References for transport related blogs

How to manage Error Free Transports



Note to SCN Members: Please feel free to add more references related to the topic.

Business Scenario


An SAP BW system may need optimization / performance improvements to keep the whole BW system intact and efficient in fulfilling the business needs.


The following are high-level summary points which may serve as a starting point - a general approach to assess the system and move forward with optimization techniques. There are a number of performance tuning techniques available in the BW area; we can incorporate all, any or some of them after getting familiarized with these key points.

This may help a newcomer who is onboarding to a project, or an existing consultant. These key points may help to define an optimization project for an existing system, and they are independent of the BW version.

SAP - BW Optimization Approach at a High Level

  1. Understand the System Landscape
  2. Get familiar with the BW Architecture and the reporting Environment
  3. Understand thoroughly on the Business Areas where the Analytics are performed.
  4. Identify the KPI's and Key reports (Management &  Analytical)
  5. Collect, gather information and understand the road blocks and Business Pain points.
  6. Analyze the optimization / performance tuning steps taken so far and understand the changes done in the system landscape.
  7. Gather information on the highly impacted areas and where the business needs the system to be optimized.
  8. Prioritize the areas where optimization process should start.
  9. Segregate the improvement areas into Data Extraction (includes the source side as well), Transformation/Staging, and Analytical/Reporting.
  10. Take an inventory of the objects pertinent to each of the areas defined above.
  11. Identify the Key areas where the system can be improved/optimized by taking out the current changes/enhancing the current changes done so far.
  12. Analyze the impact of changing the existing optimization / performance tuning steps taken so far.
  13. Identify the best performance tuning techniques which can be implemented on the areas (Point 9)
  14. Come up with a solution suggestion where the existing changes can be removed / enhanced to the areas defined in Point 9.
  15. Analyze and identify how to keep the system live, up and running while performing the optimization process, and make sure the business will not be impacted at any point in time.
  16. Draw up an optimization/project plan based on the prioritized business areas, and get the object list from the inventory taken, segregated by improvement area.
  17. Come up with a plan identifying the deliverables and timelines (project management work will be involved at this point).
  18. Collaborate and explain the plan internally then go for the approval from Business stake holders.
  19. Create a Unit test plan for Dev and Regression test plan for QA to accommodate the finalized plan.
  20. Start implementing the optimization as planned  with extra care and make sure to do multiple testing from the BW perspective and the Business perspective and make sure nothing breaks.
  21. Continue testing in Dev , meet with internal team and prove the performance/optimization ; done so far against the optimization done earlier.
  22. Finalize and confirm the optimization plan works as expected successfully and move on to the next level in the system landscape.


Hope this gives beginners and entry-level consultants an idea of how to approach optimization/performance tuning projects.



This document is meant to reintroduce the importance of SAP BW’s Technical Content and specifically the BW Administration Cockpit. In the recent years, SAP HANA has stolen much of the spotlight from everything else that is equally important to our existing customers who are not ready to move onto the SAP HANA platform.


While everyone has been busy acquiring knowledge and getting acquainted with the latest SAP HANA capabilities and functions, little did we notice that SAP has slipped in significant changes to the technical content that we have been so familiar with. For example, an Xcelsius dashboard has been included to provide management-style reporting, while installation has become much more straightforward. This article is not meant to discuss the importance of using Technical Content in a BW environment but to raise awareness of how easy it is to implement, what the new functionality can address, and the learnings that we have gathered while enabling this feature.


This document should be used as a guide to set up the BW Administration Cockpit in an environment where it has not been enabled. The effort is relatively minimal with no significant impact on existing objects, but please address the warning messages in the installation log. The estimated end-to-end effort to complete this installation should not be greater than 10 hours for a single resource.




You may skip this section if you have prior experience with SAP BW's Technical Content. This section aims to provide a high-level understanding of the importance of using, and having visibility of, the system's health through the statistical logs generated within the BW application.


Aside from the obvious benefits of being able to contextualise errors and perform analysis, enabling the BW Administration Cockpit is surprisingly simple. There are no additional licensing costs associated with it, and the feature comes as part of the NetWeaver platform. So in essence, you have a free, powerful and insightful tool that, if not leveraged, would be a waste.


The advantage of empowering your clients to monitor the health of the system is that it gives them greater knowledge to take proactive measures in ensuring the system stays at its optimal level. Having tangible numbers to indicate who the active users are can be a useful communication tool to drive the adoption of BW in the wider community within an organisation. For example, an organisation that has invested heavily in an enterprise warehouse solution would like to see it being productively used, and what better way than to feed this information back to the management team: the number of active reporting users, the types of reports that are frequently used, and how they are being used. It can also be used as an impact assessment mechanism: in the event that an underlying BW object needs to be modified, understanding what and, importantly, who will be affected can save the team a lot of Monday morning hate mail.


In my opinion, the most beneficial aspect of enabling the BW Administration Cockpit is that the information is provided in an Xcelsius dashboard, it is easy to understand, and the information is not overly sensitive. For these reasons, I do not see a valid justification for not sharing this information with the larger community. If an organisation uses SAP Portal, the dashboard can be included as part of the corporate view, where it can help to create a culture in which information drives decision making; an open and honest view of how the reporting system is performing is a feature everyone can learn to appreciate. Some of the newly provided content, such as data consumption by InfoArea, is not included in the Xcelsius dashboard but is part of the delivered content. Using this report allows the business to make informed decisions on cost, e.g. allocating the usage cost across different departments; the sample data below indicates that the Finance department is the largest memory consumer, so cross-departmental charges can be arranged with the appropriate groups. Another sample report, the BW DB Usage report, can give you an insight into the trend of data growth; this can help with hardware sizing, avoiding preventable upgrades and channelling funds to other areas of improvement.




Solution Overview


If you have been using the technical content from the days of the 3.x version right through version 7.x, you will be pleased to know that tons of improvements have been added, with plenty of reports and tools to provide you with the insight to assist with performance tuning and improvement. This section of the document covers the content updates from BW 7.3 onwards.


Overview of the Xcelsius based BW Administration Cockpit

A noteworthy improvement is the BW Administration Cockpit, which assists with monitoring jobs and the health of the BW system. The Administration Cockpit comes in two flavours: the traditional integration with SAP Portal using BEx queries, or the new Xcelsius integration. The Xcelsius dashboard is a representation of the old queries in a new format, but it does bring a breath of fresh air to the way we view information, and it is accessible through transaction RSTC_XCLS from the BW system. With the Xcelsius-based BW Administration Cockpit, you are provided with a high-level view of the important information needed to keep your business users happy, and with its intuitive layout you can locate the information and drill down to the detailed error message to understand the next course of action. The monitoring options available through the Xcelsius dashboard are:


  • Alerts: BW InfoProviders with erroneous requests
  • Performance Monitoring: Daily Query Performance and Weekly Process Chain Performance
  • BW Usage Statistics: DB Usage, Query Usage, Top Users, BWA / DB / OLAP usage


       OLAP cache and SAP NetWeaver BW Accelerator usage

        If you are in an environment that uses BW Accelerator to improve query performance, the two new MultiProviders (0TCT_MC31 and 0TCT_MC32) below can offer you an insight into memory usage that allows you to better utilise your hardware and identify improvements you can make to spread the load across the server. Some useful queries available from the technical content are reports such as CPU usage trends across periods, hourly CPU trends, available memory, % of memory used, and a breakdown of memory usage by InfoProvider.


        If BWA is not present, you can still benefit from the Technical Content in the area of report performance if you have enabled OLAP cache consumption. By default and in many instances, the OLAP cache will be enabled, and tweaks can be made to provide better efficiency or a different memory handling method when the cache memory has been exhausted, e.g. reading the cache locally or from the database.


With the new BW 7.3 release, characteristic 0TCTOLAP_DM has been added to InfoCubes 0TCT_C01 and 0TCT_CA1; it acts as a flag to distinguish between pure OLAP cache access and Data Manager access. It is a useful way to identify objects with high memory usage so that appropriate actions can be taken to switch these objects to use BWA. In another example, when the trend on frequently used reporting objects starts to increase, you can create an exception query on an acceptable OLAP time to flag it for appropriate action. Be cautious not to overload the BWA server; this is where such a report makes sense and allows you to draw the fine line.



     Report availability and InfoProvider data status


While the previous version of the Technical Content was only able to provide information on the status of data loads from a Process Chain, InfoPackage and DTP, the BW 7.3x content has added queries that allow you to know whether a report is available for reporting, along with relevant information on inconsistent and incomplete data load errors surrounding the underlying InfoProvider.




A use case for this type of information is furnishing the end user with the status of their reports. To achieve this, a specific customer enhancement query can be built per department, exposing only the applicable InfoProviders to them. The objective of this feature is to eliminate reporting inefficiency, rework and the possibility of reporting on outdated numbers. The other benefit of this new feature is that it can also list possible impacted BW objects as a result of data load failures. This type of report can assist with any troubleshooting exercise and also gives you the confidence to identify dependent objects in the event a huge data reloading exercise is required to resolve your issue.




        Database Volume Statistics


As briefly discussed under the Benefits section, the new improvements in the Database Volume Statistics content allow greater visibility of data distribution by hierarchies, BW objects (e.g. InfoProviders) and table type. The goal of this data model is to provide the BW administrator another useful tool to analyse database growth with respect to the BW objects. The new Technical Content that provides this information sits under the 0TCT_C25 InfoProvider. You now have insight into data consumption by functional area, and the sample screenshot below can easily help identify the entity within a business which consumes the most database memory.




In the past, it was almost impossible to break down database consumption by BW object type without a lot of effort and man-days, but with this new content, database questions such as "What is the database size of an ODS change log?" or "How much space have the E and F tables used?" can now be easily answered without going into ST03N and compiling the required information.


Supporting documentations


The BI Administration Cockpit is a recommended reporting feature provided by SAP through the use of Technical Content, and this document covers the topic of installation and useful features within the BI Administration Cockpit and the technical BI Content layer.


Browse through the details found in the standard documentation, because it is an excellent way to familiarise yourself with the installation procedure; the instructions provided by SAP are clear and concise.


Standard SAP documentation from SAP Help Portal that details the prerequisite, installation procedure and usage instructions.


How to efficiently use the SAP NetWeaver BW integration in SAP BusinessObjects Xcelsius


SAP NetWeaver BW Administration Cockpit Technical Content BI Statistics (SAP Feb 2011)


The Architecture of BI Statistics



Installation procedure


We discovered that an active SAP Portal is a crucial component in having a working cockpit that allows reporting through Xcelsius. Others might argue that having established a BICS connection is sufficient to execute any dashboard reporting from BW; however, this was not the case in this exercise.


This installation procedure is meant to act as a guide for BW version 7.4 SP09 (SAPKW74009). Some installation procedures might have changed over the course of time due to product improvement, and thus the necessary precautions are required to successfully implement the Administration Cockpit in a landscape on a different release.


While the installation of the BW Administration Cockpit is simple and straightforward, the documented installation procedure can help to clarify any doubts or questions that might arise in your effort to provide this solution to your client.






At a minimum, ensure that SAP Portal is present in the landscape and is configured together with BW.


To check, in the BW server, display table RSPOR_T_PORTAL via SM30; you should see some basic settings maintained.


Alternatively, contact the system administrator to have this setup.



Ensure that the BI Administrator role (SAP_BW_BI_ADMINISTRATOR) is added to your login.



To activate the Technical Content, you have the choice of doing it via SPRO, executing the RSTCT_ACTIVATEADMINCOCKPIT_NEW program (SE38), or going directly to transaction RSTCT_INST_BIAC.



SAP has made it really easy to perform the Technical Content installation, and it is no longer done under the Data Warehousing Workbench → BI Content section.


After you have entered your selection criteria and are ready to proceed with the installation, click the Execute button and wait for it to complete.


The options provided are self-explanatory and you would probably want to create a transport request to move these newly created objects across the landscape.


There will be 5 Process Chains added in RSPC under the Unassigned Nodes (NODESNOTCONNECTED), and you can set the scheduled execution time prior to starting the installation. The default parameter is 04:00:00.


  1. 0TCT_C0_INIT_DELTA_P01
  2. 0TCT_C2_INIT_DELTA_P01
  3. 0TCT_C3_INIT_DELTA_P01
  4. 0TCT_C0_FULL_P01
  5. 0TCT_C25_FULL_P01


Upon completion of the installation, it is advisable to check the installation log for any errors or warnings. Please address these messages according to the nature of the system environment.


We did not encounter any errors or warnings at this stage of the installation process in our internal environment.


If you encounter an error during the Technical Content installation, please refer to the Supporting Information section of this document.



The extensive list of activated objects can be found under the 0BWTCT InfoArea; importantly, ensure that these Process Chains have been added in RSPC.


  1. 0TCT_C0_INIT_DELTA_P01
  2. 0TCT_C2_INIT_DELTA_P01
  3. 0TCT_C3_INIT_DELTA_P01
  4. 0TCT_C0_FULL_P01
  5. 0TCT_C25_FULL_P01


Note that there will be additional Process Chains added to the list, for example Process Chains to monitor BIA statistics; if you are in an environment where BIA does not exist, these can be ignored.



When the installation is complete and you have verified that all the necessary Process Chains are in place, you can begin by loading the master data using Process Chain 0TCT_MD_C_FULL_P01, followed by the 0TCT_C* Process Chains.


Xcelsius Dashboard


To use the Xcelsius dashboard, enter RSTC_XCLS in the BW system; this will launch a web browser session pointing to a preconfigured portal address, and you should see a dashboard similar to the one below, provided that you have set up SAP Portal and successfully activated the BW Administration Cockpit.

This dashboard gives you an overview of three basic monitoring areas: Alerts, Performance and Usage of the system.


Monitoring Type




Alerts will alert the BW administrator to data load failures for a given problematic InfoProvider or DataSource.


It will highlight error messages and list the impacted objects, and the Detail button displays the corresponding backend log.



The Performance tab will highlight high-runtime objects for both Process Chains and queries, using a BEx query condition to select the TOP 20 objects.


The Analyse Details button provides the option of a graphical analysis at a granular level.



The Usage tab gives you information on the trend of data growth over a 12-month period, the most frequently used queries and the most active users.


If BWA is present in the landscape, it will provide the percentage of data used by a query that fetches information from BWA.


Supporting Information


This section is a collection of additional information that has been useful in strengthening the understanding, concepts and troubleshooting around the usage of the Technical Content. Please make full use of the attached links and the SAP Service Marketplace to find updated information on technical areas which might have changed over the course of multiple system improvements.


     1. Discovered errors after the Technical Content installation.


      To avoid having to reinstall the entire Technical Content, use transaction RSTCO_ADMIN to restart the failed installation. A yellow status can also be an indicator that a newer version has been released and attention is required to handle this warning message. RSTCO_ADMIN can also be used to fix an installation that was executed by a user without the proper authorisation for Business Content installation. For supporting information, please refer to OSS Note 1069134 - Improved monitoring RSTCO_ADMIN.


     2. The background (SM37) job name is BI_TCO_ACTIVATION.


       Use this to understand the installation procedure, the potential warning or error messages that might occur as a result of your installation.


     3. Assigning an importance criterion to SAP’s Technical Content.


     This feature allows you to sort or filter BW technical objects; it is maintained by assigning an importance value against the customer query that you wish to create or maintain. E.g., by assigning an importance value to a Technical Content Process Chain or InfoCube, you are able to sort that information to prioritise it amongst the other objects being monitored. The default importance value for all BW technical objects is 50; to change this, use transaction RSTCIMP to assign any value between 0 and 100. The underlying table that stores this information is RSTCIMPDIR.


     With the customising complete, transfer the value to InfoObject 0TCTBWOBJCT via DataSource 0TCTBWOBJCT_ATTR and verify attribute 0TCTIMPRTNC.


          4. Collection of Statistical Information.


          All newly created BW Queries, InfoProviders, Web Templates and Workbooks are set by default to collect statistical information. This setting can be changed to disable collection, turn it back on, or determine the level of aggregation to report on. It is maintained using transaction RSDDSTAT and, as a rule of thumb, it is advisable to leave all objects turned on while keeping an eye on the amount of aggregation data that is required. Once you have evidence of where performance monitoring is not required, e.g. on InfoProviders with low data volume, the setting can be turned off.

          If an InfoProvider has this setting disabled, e.g. InfoProvider ZKUST01, all newly created queries will inherit this property and no statistical information will be collected for them. However, you can override this setting on the Query tab to explicitly collect information for a desired Query.


The amount of data or level of detail to be collected can also be adjusted via the detail levels 0, 1, 2 and 9. Below is an extract from the SAP documentation; further detail can be found here: http://help.sap.com/saphelp_nw70/helpdata/en/43/e37f8a6df402d3e10000000a1553f7/content.htm

Statistics Detail Level for the Query Object Type

For queries, you also have the option of selecting a detail level for the statistics data. You can choose from the following:


  • 0 – Aggregated Data: The system writes only one OLAP event (event 99999) for the query. This contains the cumulative times within the OLAP processing of the query. The system does not record data from the aggregation layer of the analytic engine or aggregation information.
  • 1 – Only Front End/Calculation Layer Data: The system records all OLAP events, but not separate data from the aggregation layer of the analytic engine.  The system writes only the general data manager event 9000 in the OLAP context as well as the aggregation information.
  • 2 – All: The system records all data from the area for the front end and calculation layer as well as data from the area for the aggregation layer and aggregation information.
  • 9 – No Data: The system does not record any data from the front end and calculation layer or from the aggregated event 99999. However, it does record data for the BEx Web templates and workbooks, depending on the setting.
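For quick reference, the four detail levels above can be captured in a small lookup table. This is a sketch; the wording is condensed from the SAP text quoted above, and the function name is mine, not an SAP API:

```python
# Statistics detail levels for the Query object type (transaction RSDDSTAT),
# condensed from the SAP documentation quoted above.
DETAIL_LEVELS = {
    0: "Aggregated Data: only OLAP event 99999 with cumulative OLAP times",
    1: "Only Front End/Calculation Layer Data: all OLAP events, general "
       "data manager event 9000, plus aggregation information",
    2: "All: front end/calculation layer plus aggregation layer data",
    9: "No Data: nothing from the front end/calculation layer or event 99999",
}

def describe_detail_level(level: int) -> str:
    """Return the description for a statistics detail level."""
    return DETAIL_LEVELS[level]

print(describe_detail_level(0))
```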


          5. Deleting Statistical Data.


          Statistical data can grow at an exponential rate depending on factors such as the number of users in the system, the frequency of query activities and the type of aggregation setting that has been enabled in transaction RSDDSTAT. SAP's data retention period for the RSDDSTAT_* tables is 14 days, but you can override this standard setting by maintaining a numeric value in the RSADMIN table for the entry TCT_KEEP_OLAP_DM_DATA_N_DAYS.

          To do this, use program SAP_RSADMIN_MAINTAIN to add or modify this entry. The example below uses a value of 7 days.
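The retention logic is simple arithmetic: the oldest day kept is today minus the configured number of days. A minimal sketch (the function name is mine, not an SAP API):

```python
from datetime import date, timedelta

def statistics_cutoff(today: date, keep_days: int = 14) -> date:
    """Oldest day of statistics kept in the RSDDSTAT_* tables.

    The SAP default retention is 14 days; the RSADMIN entry
    TCT_KEEP_OLAP_DM_DATA_N_DAYS overrides it (e.g. 7 as in the example).
    """
    return today - timedelta(days=keep_days)

# With the example value of 7 days:
print(statistics_cutoff(date(2015, 3, 6), keep_days=7))  # 2015-02-27
```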


           Alternatively, to delete the statistical data manually, use the standard Delete Statistical Data function in transaction RSDDSTAT or execute program RSDDSTAT_DATA_DELETE (SE38).




It will come as no surprise that the BW Administration Cockpit needs to be owned and managed by the IT department to ensure that continuous improvement is performed productively. Turning this statistical data into readable information makes it easier to keep track of what is going on within the IT landscape, regardless of the size of your enterprise.


There is no need to reiterate that the function of IT is to support the core business, but put on your green hat to find a business use case: for example, in an environment where the SLA for BW is an important KPI, such as when the BW server is hosted by an application provider, this information can prove useful.


It is not enough to simply activate the Technical Content and run the Process Chains to collect the information generated by the system; a good understanding of the data and the standard reports is crucial to perform actionable tasks that safeguard the health of the BW server. Use the standard reports as building blocks to further enhance and drive specific monitoring and runtime-statistics requirements once your team has a better understanding of the other areas to improve.


In terms of the new features provided by SAP, it is worth recognising that new content might be available, and be mindful that continuous improvement comes with every release and upgrade.

BW Performance at the Source



1) Maintain the control parameters for data transfer in SBIW -> General settings -> Maintain control parameters for data transfer.


Source system table ROIDOCPRMS: It contains the control parameters for data transfer from the source system to BW.

STATFRQU - Number of packets that are transferred before statistical info is sent

MAXPROCS - Maximum number of dialog work processes per upload request used to send the data to the BW system

MAXLINES - Maximum number of records sent in one IDoc packet.

MAXSIZE - Maximum size of an IDoc packet in KB.



Important Points to be considered.


A) Package size = MAXSIZE * 1000 / size of the transfer structure 


The package size cannot exceed MAXLINES.

The transfer structure size is determined via SE11 (ABAP Dictionary) -> Extras -> Table Width -> Length of data division under ABAP.

B) If table ROIDOCPRMS is empty, the system uses default values at run-time. You should not allow these default values to be used.
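Points A) and B) can be combined into a small sketch of how the effective package size comes out. This is illustrative only; the real calculation happens in the source system's extractor, and the function name is mine:

```python
def effective_package_size(maxsize_kb: int, maxlines: int,
                           record_bytes: int) -> int:
    """Records per IDoc packet: MAXSIZE (in KB) * 1000 divided by the
    width of the transfer structure, capped at MAXLINES."""
    return min(maxsize_kb * 1000 // record_bytes, maxlines)

# A 1000-byte transfer structure with MAXSIZE = 20000 KB would allow
# 20000 records per packet, but MAXLINES = 15000 caps it:
print(effective_package_size(20000, 15000, 1000))  # 15000
```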


SAP Note 1597364 - FAQ: BW-BCT: Extraction performance in source system

SAP Note 417307 - Extractor package size: Collective note for applications



2) Values for Max.Conn and Max. Runtime in SMQS (configure the number of IDocs to be processed in parallel, depending on the number of dialog processes available in BW)




tRFC processing is very slow with "Transaction recorded" status in SM58, or IDoc processing or workflow processing is delayed:



  1. Call transaction SMQS
  2. Choose the destination
  3. Click Registration
  4. Increase Max.Conn (enter the number of connections); this is directly proportional to the available dialog processes in the BW system. For example, if BW has 30 dialog processes, you can try Max.Conn = 20.
  5. Increase Max. Runtime (for example, 1800).
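The heuristic in step 4 can be sketched as follows. Note that the 2/3 ratio is my generalisation of the single 30-to-20 example above, not an SAP rule, and the function name is hypothetical:

```python
def suggested_max_conn(dialog_processes: int, ratio: float = 2 / 3) -> int:
    """Suggest an SMQS Max.Conn value proportional to the number of
    dialog work processes available in the BW system.

    The 2/3 ratio is an assumption generalised from the example in the
    text (30 dialog processes -> try Max.Conn = 20)."""
    return max(1, int(dialog_processes * ratio))

print(suggested_max_conn(30))  # 20, matching the example in the text
```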




SAP Notes


1887368 - tRFC process slow in SM58 and Max.Conn is not fully used in SMQS



3) IDoc processing and Performance

The "Trigger Immediately" / "Transfer IDoc Immed." options should always be used.



How to change the processing mode for the IDocs in question:

For inbound:
-> Go to transaction WE20 -> select the partner -> select the inbound message type and change the processing method to "Trigger Immediately".

For Outbound:
-> Go to transaction WE20 -> select the partner -> select the outbound message type and change the processing method to "Transfer IDoc Immed.".


What will happen: the IDocs are scheduled, and reports RBDAPP01 / RSEOUT00 process them in batch mode via scheduled runs. This leaves the majority of dialog work processes free for users and mission-critical processes, so you should no longer encounter the resource problems you are currently experiencing.


4) Performance problem in collective job RMBWV3nn

Try to maintain the control parameters for the extraction of the MCEX(xx) queue as follows:
Transaction LBWR > Queue name (MCEX queue) > Customizing > No. of Documents = 1000 to 3000. Check whether this is reflected in table TMCEXUPD, field UPD_HEAD_COUNT.

The adjustment of TMCEXUPD-UPD_HEAD_COUNT will need to be tested for each application, as setting too large a value can result in a memory dump.




Part 2 - Performance Improvements in the BW System



You have an InfoCube based on a standard extractor, but when the DTP is executed you get an error message: status 'Processed with Errors', error code RSKB257, without any further details.



A possible scenario:




1. Go to transaction RSDTP and enter the DTP's technical ID.


Note the DTP's Request ID (for example, 557.165).



2. Go to SM37 and enter BIDTPR_557165* in the Job Name field.


> The job name starts with BIDTPR_, followed by the Request ID (without the dot separator) and an * (asterisk) wildcard character.

Then enter * in the User Name field.


> The User Name is also just an * (asterisk) wildcard character.
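The job-name pattern described above can be expressed as a one-liner (the function name is mine, for illustration only):

```python
def dtp_job_pattern(request_id: str) -> str:
    """Build the SM37 job-name search pattern for a DTP request:
    prefix BIDTPR_, drop the thousands separator from the Request ID,
    and append an * wildcard."""
    return "BIDTPR_" + request_id.replace(".", "") + "*"

print(dtp_job_pattern("557.165"))  # BIDTPR_557165*
```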


Select only the Canceled job status below.



Delete the From and To dates from the Job Start Condition area.


Click Execute.



3. Review the job log.


You may find one or more lines like the following; double-click a line to see the details:




22.01.2015 07:51:18 Enter rate TRY / EUR5 rate type M for 08.04.2011 in the system settings                             SG           105          E





Long description of the error message:


Enter rate TRY / EUR5 rate type M for 08.04.2011 in the system settings
Message no. SG105



For the conversion of an amount into another currency, an entry is missing in the currency conversion table.


Add the missing entry in the currency conversion table.


Execute function

You can then continue to process the commercial transaction.





Solution for this kind of issue:



By executing report RCURTEST in transaction SE38, you can check for the missing exchange rate entries.


This issue may have occurred after you transferred the exchange rates to BW. Check whether you have also transferred the global settings.


Please also check whether a missing currency in the underlying company code caused this error.

Hello Guys,



I would just like to share with you the BW-WHM* notes released in February:




Component | Number | Description | Released on
BW-WHM-AWB | 1600929 | SAP BW powered by SAP HANA DB: Information | 25.02.2015
BW-WHM-AWB | 2121217 | Remodeling tool for SPO is not working | 17.02.2015
BW-WHM-AWB | 2083701 | Entering an Integer value in a key figure Integer type field in transaction RSI | 13.02.2015
BW-WHM-AWB | 2117741 | Inclusion of ADSOs and HCPRs in data flow model | 12.02.2015
BW-WHM-DBA | 2111541 | Update 1 to Security Note 1965819 | 26.02.2015
BW-WHM-DBA-COPR | 2122411 | HCPR: CompositeProvider corrections for Release 7.40, part 12 | 27.02.2015
BW-WHM-DBA-COPR | 2050557 | No aggregation with 0REQUID in PartProvider before join in CompositeProvider | 25.02.2015
BW-WHM-DBA-COPR | 2111944 | RSOHCPR/RSOADSO include adjustment | 19.02.2015
BW-WHM-DBA-ICUB | 2009574 | Duplicate number range object in different InfoProvider | 19.02.2015
BW-WHM-DBA-ICUB | 1999013 | RSRV - Initial Key Figure Units in Fact Tables test results in exception | 12.02.2015
BW-WHM-DBA-ICUB | 2125857 | Cube writer: Termination when loading extraction from cube | 12.02.2015
BW-WHM-DBA-ICUB | 2062714 | The figure for the year is more than 2500 | 10.02.2015
BW-WHM-DBA-ICUB | 1946893 | ORA-00060 Deadlock in Cube load | 05.02.2015
BW-WHM-DBA-ICUB | 2124904 | An "Invalid BW namespace definition" message is displayed in application log du | 03.02.2015
BW-WHM-DBA-IOBJ | 2103263 | Enhancement of structure BAPI InfoObject BAPI6108 fields with fields UOMCONV an | 20.02.2015
BW-WHM-DBA-IOBJ | 1984625 | ABAP Dictionary errors during InfoObject activation with change to compounding | 18.02.2015
BW-WHM-DBA-IOBJ | 2069619 | Function module RSD_IOBJ_MULTI_GET returns incorrect results if parameter I_REA | 12.02.2015
BW-WHM-DBA-IOBJ | 2111658 | A system dump occurs when viewing the database table status of a unit type char | 12.02.2015
BW-WHM-DBA-IOBJ | 1827295 | InfoObject with an attribute that has statistics created | 06.02.2015
BW-WHM-DBA-IOBJ | 2111737 | Enhancement of InfoObject impact analysis with regard to local characteristics | 06.02.2015
BW-WHM-DBA-IOBJ | 2099865 | Using RSA1 on InfoProvider is taking a long time | 04.02.2015
BW-WHM-DBA-ISET | 2050817 | Using RSA3 may cause a system dump while trying to reference a missing navigati | 12.02.2015
BW-WHM-DBA-MD | 1918525 | Performance optimization during text loading | 13.02.2015
BW-WHM-DBA-MPRO | 2082301 | Executing RSUPGRCHECK may display inconsistent MultiProvider | 24.02.2015
BW-WHM-DBA-ODS | 2070577 | (advanced) DataStore Object - availability in BW7.4 SP08 and SP09 | 19.02.2015
BW-WHM-DBA-ODS | 2118329 | LISTCUBE: ADSOs not displayed in input help | 19.02.2015
BW-WHM-DBA-ODSV | 2123371 | Open ODS view: Error when opening using BW Modeling Tools if source object does | 17.02.2015
BW-WHM-DBA-ODSV | 2122710 | Open ODS view: Impact deletes active version | 06.02.2015
BW-WHM-DBA-ODSV | 2123034 | Open ODS view: Suboptimal query runtime after initial activation | 06.02.2015
BW-WHM-DBA-ODSV | 2123726 | Open ODS view: SQL error with field of type "Client" | 06.02.2015
BW-WHM-DBA-ODSV | 2121708 | Open ODS view: SQL errors for difficult data types | 03.02.2015
BW-WHM-DBA-OHS | 2046551 | Open Hub with query as source | 27.02.2015
BW-WHM-DBA-OHS | 2065393 | The old data is not deleted from DB with Deleting Data from Table when DB Conne | 18.02.2015
BW-WHM-DBA-OHS | 1980716 | Description for Open Hub Destination is truncated from 60 to 30 characters. | 09.02.2015
BW-WHM-DBA-OHS | 2122189 | SAP BW 7.40(SP11) Open Hub Destination can't be activated | 06.02.2015
BW-WHM-DBA-RMT | 1944429 | Remodeling - InfoObject remodeling is not supported | 18.02.2015
BW-WHM-DBA-RMT | 1966654 | RSMRT: Corrections for SAP BW 7.40 SP5 | 02.02.2015
BW-WHM-DBA-SPO | 2063607 | DTP filter generation from SPO/reading data from RemoteCube doesn't work | 25.02.2015
BW-WHM-DBA-SPO | 2079081 | 730SP13: Error Handler in the DTP template with source as SPO or MPRO is switch | 24.02.2015
BW-WHM-DOC | 2017437 | Preliminary Version SAPBWNews BW 7.02 ABAP SP 17 | 06.02.2015
BW-WHM-DOC | 1332017 | Preliminary Version SAPBWNews BW 7.02 ABAP SP 01 | 04.02.2015
BW-WHM-DOC | 1367863 | SAPBWNews BW 7.02 ABAP SP 02 | 04.02.2015
BW-WHM-DOC | 1800952 | SAPBWNews BW 7.02 ABAP SP 14 | 04.02.2015
BW-WHM-DOC | 1940530 | SAPBWNews BW 7.02 ABAP SP 16 | 04.02.2015
BW-WHM-DST | 769414 | Support Package 23: Lock Manager log written in batch process | 24.02.2015
BW-WHM-DST | 2080574 | Mass activation programs unintentionally load generated programs into main memo | 18.02.2015
BW-WHM-DST | 2116836 | P34; WO DSO: Dump in FM RSM1_DELETE_WO_DSO_REQUESTS | 06.02.2015
BW-WHM-DST | 2123548 | P35; RSSM_GET_TIME; APO: OPEN CURSOR is destroyed | 06.02.2015
BW-WHM-DST-ARC | 1858550 | Downport NLS IQ to BW 7.0X | 11.02.2015
BW-WHM-DST-DBC | 1888353 | DBC SAPLRSDS_ACCESS_FRONTEND termination due to neg. length | 16.02.2015
BW-WHM-DST-DBC | 2119576 | DB connect - error in RSDL after EHP6 update | 06.02.2015
BW-WHM-DST-DFG | 2118610 | P13; DFG: Data flow dumps or hangs | 06.02.2015
BW-WHM-DST-DS | 2124619 | 730SP13: Activation of multi-segmented datasource inactivates transformation/DT | 13.02.2015
BW-WHM-DST-DS | 1809892 | Enable extraction from old versions of a DataSource | 12.02.2015
BW-WHM-DST-DTP | 2119480 | Error RSTRAN 840, consulting note 1851875 obsolete | 26.02.2015
BW-WHM-DST-DTP | 2080701 | P13: DTP: Access to active DTAs during import of DTPs | 25.02.2015
BW-WHM-DST-DTP | 2116840 | P35: WO-DSO: Reset ACTIVE field in RSBODSLOGSTATE | 25.02.2015
BW-WHM-DST-DTP | 2125520 | P14: MPRO: DTP: REDUCE: Performance with many MPro requests | 24.02.2015
BW-WHM-DST-DTP | 1915498 | P32; DTP: BDLS does not convert DTPH LOGSYS entries | 23.02.2015
BW-WHM-DST-DTP | 2123709 | P13: REDUCE: MPRO: LPOA: Requests that are too old are extracted | 23.02.2015
BW-WHM-DST-DTP | 1943907 | P33: DTP: Dump when executing DTP with expert mode simulation; data package | 17.02.2015
BW-WHM-DST-DTP | 2118187 | P13: MPRO: DTP: Deleting DTA and overflow of MPro requests | 13.02.2015
BW-WHM-DST-DTP | 2121282 | P34; DTP; serial extraction with semantic group never ends | 10.02.2015
BW-WHM-DST-DTP | 2124611 | P35: DTP: WO-DSO: REQUDEL: Performance when determining the delta request | 02.02.2015
BW-WHM-DST-HAP | 2067912 | SAP HANA transformations and analysis processes: SAP Notes for SAP NetWeaver 74 | 25.02.2015
BW-WHM-DST-HAP | 2033679 | SAP HANA transformations and analysis processes: SAP Notes for SAP NetWeaver 74 | 24.02.2015
BW-WHM-DST-PC | 2021473 | Addition of process log to mail if the process is successful. | 13.02.2015
BW-WHM-DST-RDA | 2125240 | RDA_RESET: Request is closed without execution of data transfer process (DTP) | 19.02.2015
BW-WHM-DST-SDL | 2117128 | P13: IPAK: 3rd party selection fields do not disappear | 11.02.2015
BW-WHM-DST-TRF | 2118057 | BW 7.40 SP8/SP9/SP10: HANA Analysis Processes and HANA Transformations (Part 9) | 27.02.2015
BW-WHM-DST-TRF | 1816350 | 731SP8: Syntax errors in routines or Assertion failed during activation of trans | 26.02.2015
BW-WHM-DST-TRF | 2109129 | SAP HANA Execution: Inserted value too large for column | 25.02.2015
BW-WHM-DST-TRF | 2117312 | Collection note HAPs & HANA-Transformations | 24.02.2015
BW-WHM-DST-TRF | 2104509 | SAP HANA Execution: SAP HANA analysis process does not exist | 12.02.2015
BW-WHM-DST-TRF | 2125734 | Program RSDG_TRFN_ACTIVATE sets incorrect transformations to inactive without r | 06.02.2015
BW-WHM-DST-TRF | 2118892 | SAP BW 7.40(SP11) Script changes not transported | 05.02.2015
BW-WHM-DST-TRF | 2123327 | SAP BW 7.40(SP11) Changes of InfoObject not taken into InfoSource of SPO | 04.02.2015
BW-WHM-DST-TRF | 1995901 | 730SP12: Performance problem in opening Transformation in change mode | 03.02.2015
BW-WHM-DST-TRF | 2100247 | 730SP13: Performance problems in aggregation check in Transformations | 03.02.2015
BW-WHM-MTD-CTS | 2042927 | Incorrect AIM execution: No repeated activation of successfully processed objec | 17.02.2015
BW-WHM-MTD-CTS | 2120719 | RC = 12 during script-based Inhouse transport, error RSO 781 | 17.02.2015
BW-WHM-MTD-CTS | 1978136 | Error when deleting BPC objects | 05.02.2015
BW-WHM-MTD-HMOD | 2032830 | External SAP HANA view: Inventory key figures (non-cumulative key figures) | 24.02.2015
BW-WHM-MTD-HMOD | 2121712 | Improved troubleshooting during termination of authorization replication with e | 18.02.2015
BW-WHM-TC-SCA | 2111723 | Password lost in task CL_RSO_UPDATE_BWMANDTRFC_HOST | 23.02.2015
BW-WHM-TC-SCA | 2124850 | Class CL_RSO_DTP_ERROR_LOG_DELETE is missing from Housekeeping task list | 09.02.2015



Best Regards,


