
SAP Business Warehouse


Author(s): Saorabh Trilokinath Shivhare, Deepti Shetty.

Target readers: BW Consultant, BW Engineer, Basis Consultant.


Purpose of the document:

The scope of this document is to cover pre- and post-upgrade activities for SAP BW: what to do, how to do it, and why. Wherever an SAP Note is applicable it is mentioned, along with any other references for that point. All points are based on our understanding and experience.


What is a BW upgrade?

Upgrading the BW server from its existing version to a higher one; in this scenario, from 7.0 to 7.4.

Why do we upgrade?

Most advanced features are available only in higher versions, e.g. semantically partitioned objects, or collecting BEx query statistics in database tables.

Why should one read this document?

Many materials are already available for BW upgrade checklists that explain how to perform each activity. This document also explains the reason behind each activity: along with the points below, read the "Why we do it" part of each item.


1. Some pre- and post-upgrade steps may differ depending on the target version, the particular features to test, and the environment.

2. These steps assume a BW upgrade from 7.x to 7.4.


Pre upgrade activities:

Each activity below lists what to do, the applicable SAP Note (if any), how to do it, and why.

1. Check DDIC (data dictionary) consistency.
How: Run transaction RSUPGRCHECK. If it shows any inconsistency in InfoObjects, DSOs, InfoCubes or transfer rules, activate the inconsistent objects.
Why: Non-activated DDIC tables can otherwise cause errors during the upgrade. DDIC objects are tables, views, data types, type groups, domains and search helps; the ABAP Dictionary describes and manages all data definitions used in the system. The report checks whether the DDIC tables needed for each BW metadata object exist in active form and points out the incorrect objects in its log.

2. Get a list of all inactive objects.
How: In SE11, query the relevant metadata table with OBJVERS = 'A' and OBJSTAT = 'INA'. The table name for each BW metadata object is listed in the reference table after step 17.
Why: The status of metadata objects should be the same pre and post upgrade.
3. Clean up.
How: No single transaction code or program; this spans several activities: PSA cleanup, deleting log files, deleting InfoCubes and DSOs that are no longer required, and deleting empty aggregates.
Why: Housekeeping activity.
4. Stop the RDA (Real-Time Data Acquisition) daemon.
How: Use transaction RSRDA.
Why: Required when daemon services are used in BW.
5. Remove temporary BI tables and check for invalid temporary tables and objects.
How:
a) Check for invalid temporary tables in SE14 via the menu path Extras -> Invalid Temp. Table.
b) In SE38, execute program SAP_DROP_TMPTABLES. The following objects can be deleted:
- Temporary tables (type 01/0P)
- Temporary views (type 03/07)
- Temporary hierarchy tables (type 02/08)
- Temporary SID tables (type 06)
- Generated temporary reports
Why: Part of housekeeping. These are DB objects such as tables, views and triggers, with the /BI0/0 name prefix.

6. Check/repair the status of InfoObjects.
How:
1. Log on to the SAP system.
2. Call transaction RSD1.
3. Choose Extras -> Repair InfoObjects (F8).
4. Choose Execute Repair.
5. Choose Expert Mode Select Objects.
6. On the following screen, in addition to the default selection, activate the checkboxes:
- Check Generated Objects
- Activate Inconsistent InfoObjects
- Deletion of DDIC/DB Objects
- Display Log
7. Execute the program.
Why: Ensures there are no erroneous objects. This performs consistency checks on the data and metadata stored in the BW system and tests the foreign key relationships between the tables of the extended star schema (ref.: SAP Help).

7. Clean up / delete error log messages.
How: Run reports RSB_ANALYZE_ERRORLOG and RSBM_ERRORLOG_DELETE.
Why: Housekeeping activity.
8. Check master data consistency.
How: Run the report.
Why: Detects inconsistent or missing records in the master data tables, i.e. the P/X tables, and in the dimension tables of cubes where these master data objects are used.
9. Clean up background jobs.
SAP Note: 784969.
How: Use program RSBTCDEL2.
Why: Background jobs are scheduled for various activities; these can be V3 jobs and others.
10. Check table SMSCMAID.
SAP Note: 1413569.
How: Before starting the Software Update Manager, check whether you use table SMSCMAID. If so, see SAP Note 1413569 to avoid a termination of the upgrade due to duplicate records in the table.
Why: Table SMSCMAID is used for scheduling. If an index has been added to it, follow this note.
11. Check for missing TBLIG entries.
SAP Note: 783308.
How: Run report RSTLIBG.
Why: This report lists inconsistencies in existing InfoProviders.

12. Repair InfoCube indexes.
How: Execute the report in SE38.
Why: To repair the indexes of InfoCubes.
13. Clear all logistics data extractions and empty the delta queues.
How: Check in the source system, for all source systems.
Why: Clear the delta queues and logistics extraction queues before the upgrade, because post upgrade the underlying tables and fields of the extract structures may change.
14. Check inactive transfer rules, InfoCubes, DSOs, aggregates etc. and list them.
How: For each DSO, check whether every request is in green status; before the upgrade all requests in the DSO should be green. Also check the status of aggregates: per SAP recommendation they should be active before the upgrade. Rolling up aggregates is not necessary; that depends on business requirements.
Why: Checked here only as a reference for the post-upgrade comparison.
15. Repair inconsistent InfoPackages using RSBATCH.
Why: Checks whether any InfoPackage in active state is inconsistent; if so, repair it before the upgrade.
16. Take pre-upgrade snapshots of critical BEx queries.
How: Take screenshots of both the prompt selection and the output.
Why: Only as a reference for the post-upgrade comparison.
17. Convert the data classes of InfoCubes.
How:
1. Call transaction SE16 and check table RSDCUBE.
2. Select OBJVERS = 'M', 'A' and check the entries of the fields DIMEDATCLS, CUBEDATCLS, ADIMDATCLS and AGGRDATCLS. All InfoCubes are listed with their assigned data classes.
3. Compare the data classes with the naming conventions described in SAP Note 46272.
If you find incorrect data classes, correct them as follows:
1. Set up a new data class as described in SAP Note 46272.
2. Execute report RSDG_DATCLS_ASSIGN. It transfers a group of InfoCubes from the old data class to the new one and assigns each InfoCube to the right data class.
Why: Cubes must be assigned to the correct data classes. Any DDART (customer) data classes that do not follow the naming conventions of SAP Note 46272 are lost during the upgrade, after which the tables of the InfoCube cannot be activated and an error message is displayed in the technical settings.


Reference table for step 2 (pre upgrade) and step 5 (post upgrade):

Table      BW metadata object
RSUPDINFO  Update rule
RSTS       Transfer rule
RSISN      InfoSource
RSDS       DataSource
RSDCUBE    InfoCube
RSDIOBJ    InfoObject
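The comparison behind pre-upgrade step 2 and post-upgrade step 5 is just a set difference over the exported inactive-object lists (OBJVERS = 'A', OBJSTAT = 'INA'). A minimal Python sketch of that comparison, assuming the two lists have been exported from SE11; the object names are made up:

```python
# Compare pre- and post-upgrade lists of inactive BW objects
# (OBJVERS = 'A', OBJSTAT = 'INA', exported from SE11).
# Object names here are hypothetical examples.

def newly_inactive(pre, post):
    """Return objects inactive after the upgrade but not before.

    These are the objects the upgrade deactivated and that must be
    reactivated; objects already inactive pre upgrade keep their
    status, since status should match pre and post."""
    return sorted(set(post) - set(pre))

pre_inactive = {"ZCUBE_OLD", "ZDSO_TEMP"}                # inactive before upgrade
post_inactive = {"ZCUBE_OLD", "ZDSO_TEMP", "ZTR_SALES"}  # list after upgrade

print(newly_inactive(pre_inactive, post_inactive))  # ['ZTR_SALES']
```

Anything the function returns would then be activated in the system.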



Post upgrade activities:


Each activity below lists what to do, the applicable SAP Note (if any), how to do it, and why.

1. Run the extractors in delta mode to verify there are no technical errors.
How: Do a delta run for any DataSource. The delta queue should be empty pre upgrade; once the load starts, monitor this DataSource in the delta queue.
Why: To check that delta loads work correctly.
2. Basic application test.
How: Test different BW flows: a master data object flow, a process chain, and one transaction data flow from the source system through to the BEx query.
Why: End-to-end verification of the BW flows.
3. Sample data loads.
How: Run critical master data and transaction data process chains.
Why: To test transaction and master data loads.
4. Check RFCs and source system connections.
How: Execute transaction RSA1 and replicate all source systems.
Why: To test the RFC connections to all source systems.
5. Check transfer rules, InfoCubes and aggregates, and activate inactive objects.
How: In SE11, query the relevant metadata table with OBJVERS = 'A' and OBJSTAT = 'INA'; the table name for each BW metadata object is listed in the reference table in the pre-upgrade section.
Why: Compare the list with the pre-upgrade list and activate objects that got deactivated.
6. Activate transfer structures.
How: Run program RS_TRANSTRU_ACTIVATE_ALL.
Why: Reactivates transfer structures.

7. Reactivate all active update rules.
How: Run program RSAU_UPDR_REACTIVATE_ALL.
Why: Reactivates update rules.

8. Reactivate all 7.x DataSources.
How: Run program RSDS_DATASOURCE_ACTIVATE_ALL.
Why: Reactivates DataSources.

9. Activate MultiProviders.
How: Execute program RSDG_MPRO_ACTIVATE.
10. Validate BEx query output.
How: Compare the pre-upgrade snapshots with post-upgrade snapshots of the BEx query output.
Why: To confirm query output matches pre and post upgrade.
11. Correct operator syntax errors in custom programs.
12. New functionality test.
How: Create an SPO based on a cube/DSO and load data into it; in our case it worked fine.
Why: Semantic partitioning is a feature newly available after the upgrade to 7.4. It lets you partition a provider semantically based on a partition condition; the data is stored in the data targets according to that condition.
13. Check TADIR entries.
How: Execute the program.
Why: TADIR is the table that maintains the Directory of Repository Objects, covering all dictionary objects, ABAP programs etc. If any fact tables/views were dropped or missed during the upgrade, this program recreates them. It is part of the upgrade for 3.x objects; since most landscapes still use 3.x flows, missing-TADIR-entry errors may occur.
14. Ensure BI object consistency.
How:
1. Go to transaction RSRV, which performs consistency checks on the data stored in BW.
2. Elementary and combined tests can be performed.
Why: Lists any objects that went missing during the upgrade. This must be done manually and can be run for any BW metadata object; it is best done for the important cubes, e.g. those related to COPA.

15. Ensure aggregates and indexes of cubes are active.
How: Check that the aggregates and indexes of an InfoCube are maintained and active, as applicable. This can be checked for one cube.

16. Repair InfoObjects.
How:
1. Log on to the SAP system.
2. Call transaction RSD1.
3. Choose Extras -> Repair InfoObjects (F8).
4. Choose Execute Repair.
5. Choose Expert Mode Select Objects.
6. On the following screen, in addition to the default selection, activate the checkboxes:
- Check Generated Objects
- Activate Inconsistent InfoObjects
7. Execute the program.
Why: Ensures there are no erroneous objects. This performs consistency checks on the data and metadata stored in the BW system and tests the foreign key relationships between the tables of the extended star schema (ref.: SAP Help).

17. Set the Record Statistics flag in the OLAP parameters.
How: In transaction RSRCACHE, click Cache Parameters -> Record Statistics.
Why: An architectural change in BW 7.4; this is needed so that BEx query statistics are collected in the database tables.
18. Identify and repair inconsistencies in the PSA metadata.
How:
1. Execute report RSAR_PSA_NEWDS_MAPPING_CHECK in transaction SE38.
2. First run it with the repair flag unchecked; this lists all obsolete PSAs no longer used in a segmented DataSource.
3. Then run it with the repair flag checked; this inactivates all the obsolete PSAs no longer used in a segmented DataSource.
Why: Deletion of obsolete PSAs.
19. ABAP program correction for loading hierarchies.
How: As of BW 7.4, you may encounter the error CALL_FUNCTION_NOT_FOUND while loading a hierarchy via an InfoPackage; we implemented SAP Note 1912874 for it. The same error can occur while executing a BEx query.
Why: The originally consistent value returned by the DataSource was changed by a conversion routine that is not consistent with the conversion exit. Solution: check that the correct conversion routine is entered; correct the conversion routine or the data. You can also activate automatic conversion in the transfer rule.
20. Run the Code Inspector check for custom programs.
SAP Note: 1823174.
How: Implement and execute program ZSAP_SCI_DELTA in SE38.
Why: In BW releases before 7.4, the maximum length of a characteristic value was limited to 60 characters; as of release 7.4 SPS2, up to 250 characters are possible. To achieve this, the domain RSCHAVL was changed from CHAR 60 to SSTRING 1333. As a result, data elements that use the domain RSCHAVL may no longer compile (syntax errors) or may cause runtime errors in customer-specific programs. The SAP Note describes solution options for the various problem classes.
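Post-upgrade step 10 amounts to a cell-by-cell comparison of two query result snapshots. A minimal Python sketch of that check, with hypothetical result rows keyed by characteristic values:

```python
# Compare a BEx query's pre- and post-upgrade output snapshots:
# the results should be identical. Rows are keyed by characteristic
# values; key-figure values are compared per cell. Data is made up.

def diff_snapshots(pre, post):
    """Return keys whose key-figure values differ, or that exist
    in only one of the two snapshots."""
    mismatches = {}
    for key in set(pre) | set(post):
        if pre.get(key) != post.get(key):
            mismatches[key] = (pre.get(key), post.get(key))
    return mismatches

pre_run  = {("2014", "DE"): 100.0, ("2014", "US"): 250.0}
post_run = {("2014", "DE"): 100.0, ("2014", "US"): 240.0}

print(diff_snapshots(pre_run, post_run))
# {('2014', 'US'): (250.0, 240.0)}
```

An empty result means the query output matches pre and post upgrade.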


1. These steps assume a BW upgrade from 7.x to 7.4.
2. The contents are general and do not mention any business function or technical BW metadata objects of the client source system.


andres diaz

BI in the organizations

Posted by andres diaz Sep 24, 2014

Today companies have experienced growth in data flow, stored mainly in corporate information systems, and the need to obtain information from that data source for making strategic decisions is becoming ever more evident.



Information is the main source of knowledge. Business growth is defined by the way processes are carried out; companies are systems composed of different modules: finance, logistics, commercial and production.



Each of the modules or departments of a company needs real-time information to make strategic decisions based on the data held in its corporate systems.



Business intelligence is not just a computing process; it is a fundamental component of the sustainable growth of a company, which requires complete knowledge of its business processes.

Problem Statement/Business Scenario

As a best practice SAP suggests deleting the change log data in DSOs, but what if there is business logic in a transformation/update rule that consumes data from the change log? There is no standard process to delete selected change log requests; you have to delete them manually.
Consider a scenario with a staging DSO that feeds data to three different cubes: two are delta-enabled and the third receives a full request every day with snapshot data. If the change log data for the full requests is not deleted, it will grow rapidly.

As per the data model below, the staging DSO feeds multiple cubes, among them a snapshot cube loaded with full loads, so there is no reason to retain its change log requests. Even in the worst case, reloading the data from the staging DSO to the cube would not require the change log data.




There are two ways to reduce the change log: delete the requests manually, or use an ABAP program to delete the change log with selections (full requests). SAP provides change log deletion via a process chain step, but it cannot delete requests selectively when the data is loaded to multiple data targets. To avoid manual intervention, a custom ABAP report can be created to automate the process and run as a weekly/monthly/yearly housekeeping activity.

ABAP Report Design
An ABAP program can be created to look up all requests in table RSSELDONE, which holds the request ID, load date, InfoPackage ID, system ID and update mode. Based on the InfoPackage ID, load date and update mode, all full requests can be identified and deleted from the change log table. The change log table name can be found via DSO > Manage > Contents > Change Log.



Example: if we have to retain full requests for 30 days only, the program should delete all change log data for requests older than 30 days.
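The selection logic of that report can be sketched as follows. This is a hedged illustration only: the record layout is a simplified stand-in for the fields read from RSSELDONE, not the real table structure, and the update-mode value 'F' for full requests is an assumption for the example:

```python
from datetime import date, timedelta

RETENTION_DAYS = 30  # retain full requests for the last 30 days

def requests_to_delete(requests, today):
    """Flag full requests older than the retention window for
    deletion from the change log. 'requests' is a simplified
    in-memory stand-in for records read from RSSELDONE."""
    cutoff = today - timedelta(days=RETENTION_DAYS)
    return [r["request_id"] for r in requests
            if r["update_mode"] == "F"       # full requests only
            and r["load_date"] < cutoff]     # older than retention

requests = [
    {"request_id": "REQU_1", "update_mode": "F", "load_date": date(2014, 6, 1)},
    {"request_id": "REQU_2", "update_mode": "D", "load_date": date(2014, 6, 1)},   # delta: keep
    {"request_id": "REQU_3", "update_mode": "F", "load_date": date(2014, 8, 10)},  # recent: keep
]

print(requests_to_delete(requests, today=date(2014, 8, 21)))  # ['REQU_1']
```

In the actual ABAP report, the returned request IDs would drive the deletion from the change log table.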

Below is a snapshot of table RSSELDONE, which stores all the requests.






Abhishek Shanbhogue

Hello friends,


Today I would like to talk about a common mistake that can cause major delays in incident processing times: incorrect component assignment, and how to identify the correct component for your SAP incident.


I often notice that new customer incidents are created under BW-BCT*, as BW-BCT has over 100 sub-components listing almost all applications and generic sub-areas of SAP products. This happens because the description of the sub-components is not accurate in some cases and does not specify the area.


Please keep in mind that components under BW-BCT are reserved for issues related to the installation and operation of the SAP application extractors used to load data into SAP BW. Choosing the correct component is an important step during incident creation because it ensures the fastest processing of your request.


We can see an example in the following image: under the path BW -> BW-BCT there are over 100 sub-components with simple descriptions like "Customer Relationship Management" or "Documentation". In reality those components relate to the data extraction logic of the ECC applications dealing with "Customer Relationship Management" and "Documentation": BW-BCT-CRM and BW-BCT-DOC, respectively.



Another reason this happens is that the customer selects BW as the application for the incident, which automatically expands the BW area for component selection, misleading the customer into picking a component by its description, which will most likely be in the BW-BCT list since it covers the extraction logic of almost all SAP applications.


How to find the correct component for my Incident?

In order for SAP to assign an engineer to a customer incident, the incident must be assigned to an active component that is monitored by SAP engineers.

Here are a few tips to identify which component is the one for your incident:

Note search

Perform a note search using the SAP SMP (Service Marketplace) search tools, SAP xSearch or Note Search.
Use keywords related to your issue, such as the transaction code used to reproduce it, a table name, and the product name. For example, in xSearch: my BPC user has been locked due to incorrect login attempts and I'm trying to find a solution for it.




Notice I'm searching for the text "user locked SU01"; the most suitable areas would be BC, GRC and SV.
Now look at the narrowing example:



Notice I narrow the search by adding 'BPC' to the search text. This narrows the results from over 280 notes to only 5, which are for EMP-BCT-NW.


Knowledge Base Article(KBA) or Note

Customers often find a note related to the incident they are facing, or create new incidents based on notes already provided by SAP.
As a general approach, the component the note or KBA was created for will be the most suitable component for the new incident:


KBA example:



Note example:




You can find the Component of a KBA or Note under Header Data section, close to the bottom of the page.


Short Dump

Usually when a customer faces a short dump, you can see an application component assigned to the dump in its header, for example:



This is usually the correct component for the incident, but not in all cases.
To identify the correct component, analyze the dump's text description and check the programs and the code section where it stopped.

Check the function module/report or Transaction code component

This is usually the best method to identify the component responsible for the issue, as it shows the exact component responsible for that part of the code.
There are several ways of doing this; I'm going to explain the one I believe most people will have authorization for and that is easy to do:
1. Open the transaction where you face the issue and navigate to one step before reproducing it.
2. Go to the menu System -> Status.
3. Double-click the Program (Screen) value. The code for it will open.
4. Go to the menu Goto -> Attributes. A small popup will open.
5. Double-click the Package value. A new screen will open.
6. The component is shown in the Application Component field.



Checking the steps mentioned above should help you identify the correct component. However, there is no single formula for all issues; each issue has to be interpreted carefully to find the appropriate component.
Sometimes even we at SAP have a difficult time identifying the correct component for an issue. That is why it is imperative that customers provide a clear description of the issue, with a concrete example and the steps under the Reproduction Steps section.



Do you have another tip on how to identify the correct component? If so, please let me know in the comments section.

Hello everyone,


a few days ago my first printed book was published by Espresso Tutorials, and now it is also available as an eBook:


[Cover image: Schnelleinstieg BW]

This book is written for beginners in SAP Business Warehouse 7.3. It starts with a short introduction to Business Intelligence and Data Warehouses in general. It then gives a short overview of Bill Inmon's CIF as the basis for SAP's Business Warehouse and explains LSA and LSA++ in a short section.


The main part is a real-world example: loading an IMS Health sample file into BW to answer a business question.

As it is written for SAP Business Warehouse 7.3, I explain how you can use dataflow diagrams to build a very basic data flow to load the sample data.

It then leads step by step, with lots of screenshots, through the data modelling. You begin by creating your first characteristics and key figures. The next step is to create a DataStore Object, InfoCube and MultiProvider in a simplified multi-layer architecture. The next chapter is about ETL; the term itself is briefly explained. Then I show how to create transformations and DTPs. Finally I show how to put all DTPs together in process chains for master data and transactional data.

The last chapter is about the Business Explorer. I explain how you can easily create a BEx query on top of your MultiProvider, and with a few screenshots I show how you can slice and dice through your data. Exceptions and conditions are explained briefly.


The biggest benefit is that you get help avoiding the most common pitfalls in data modelling. It also gives some help in case of data load errors. Two other goodies are ABAP programs:

1.) how to convert the key figure model into an account model

2.) how to easily create multiple key figures using BAPIs


The only disadvantage is that the book is available in German only. An English version is not planned at the moment.


Have fun reading it!




Ever wondered where your DTP/transformation spends its time?


This is how you can find out:


First you have to create a small program like the following:




                " loading the dtp



         " here you have to place the technical name of your dtp

         I_DTP  = 'DTP_4YIEJ....'


         R_R_DTP = R_DTP.




" we have to run it in sync-mode (not in Batch)




Now run this program through transaction SE30 or SAT.

After the execution of the DTP you will receive the result. In our case it was a SELECT on the PSP element; we created an index, and the DTP then needed only 2 minutes instead of 30.



Of course you can also simply call transaction RSA1 through SAT (using the processing mode "serially in the dialog process (for debugging)"). But that way you have to filter out the overhead that RSA1 creates in the performance log, and the data is not written to the target (the load is only simulated), so you might sometimes miss a bottleneck in your DTP/transformation.


thanks for reading ...

Questions about fast-growing tables in SAP systems are asked very often in SCN forums. Such tables consume disk space and, even more importantly, processing large data volumes in them slows down the system. This is also true in the area of BW systems. It is therefore common for SAP Basis teams to do regular housekeeping on these tables. SAP Notes are available that deal with them and advise how to reorganize them; a perfect example is Note 706478 - Preventing Basis tables from increasing considerably, which discusses many tables across different areas, including BW.

One of them is a table related to the process chain log: RSPCLOGCHAIN.

This table holds the logs of process chains; in large BW systems running many chains daily, it can grow very easily. The regular way to get rid of process chain run logs is via transactions RSPC or RSPC1: in the log view there is a Delete function in the menu:


To use this functionality in an automated way, the ABAP report RSPC_LOG_DELETE can be utilized. You can schedule a job running this report on a regular basis for particular chains, with a selection on logs by date/time or log ID.

I found this report quite useful in BW housekeeping tasks.


PS: This blog is cross published on my personal blog site.


APD Query Tip

Posted by V J Aug 21, 2014

While working on an APD requirement we came across a strange issue.


As per the requirement, we wanted to add a few new key figures to the APD query and hide a few old ones.


We went ahead, added the new key figures and hid the old ones. We did the mapping and activated the APD.


When we started testing, we realized the data was not coming in the correct order. It was strange and we couldn't figure out the reason.


We used trial and error and finally found the solution.


We moved all the hidden key figures to the bottom of the column box and the APD output got corrected. It was a totally new and unexpected discovery which we were delighted to find, hence sharing it here.


Conclusion: whenever you have hidden columns in an APD query, make sure you move them to the bottom of the column box to get correct data.
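The fix above amounts to a stable reordering: visible columns keep their relative order and hidden ones move to the end. As a sketch of that reordering (column names are hypothetical):

```python
def hidden_to_bottom(columns, hidden):
    """Reorder columns so hidden ones come last, preserving the
    relative order within each group (a stable partition)."""
    visible = [c for c in columns if c not in hidden]
    return visible + [c for c in columns if c in hidden]

cols = ["REVENUE", "OLD_KF1", "MARGIN", "OLD_KF2", "QTY"]
print(hidden_to_bottom(cols, hidden={"OLD_KF1", "OLD_KF2"}))
# ['REVENUE', 'MARGIN', 'QTY', 'OLD_KF1', 'OLD_KF2']
```

In the APD itself this is done manually by dragging the hidden key figures to the bottom of the column box.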

Part 2 of this blog series covers some more important aspects of NLS technology with SAP BW.

1) A prerequisite of NLS data archival is to compress the InfoCubes that you want to archive up to the latest data load request before archival. In many landscapes this is normally done as part of maintenance activities, but if not, it can be a time-consuming process to implement on all cubes and might impact project timelines.


2) A question often asked about NLS is whether archived data can be restored back to the live SAP BW system. The answer is yes, this is possible, but it might lead to inconsistencies in the system and defeats the purpose of archival. The data reload option should be used only if there is a critical business need.


3) From a BW security perspective there are some prerequisites as well.

For BW developers to perform archival-related activities under the developer role, authorization object S_ARCHIVE needs to be assigned, with activity code, area and object set to '*'.

For end users to read archived data from NLS, authorization object S_RS_ADMWB with activity codes 03 and 16 should be assigned, restricted on object RSADMWBOBJ.

These prerequisites might be specific to NLS vendors as well; you may need to provide additional authorizations if mentioned in the documentation provided by your NLS vendor.

4) Another important step in NLS archival is the creation of virtual InfoCubes. To read the data from NLS you need to create new virtual cubes which actually connect to NLS; if a query needs to read data from NLS, these virtual cubes do the trick. You need to modify the existing MultiProviders and queries to include these virtual cubes so that data archived on NLS is available for reporting.

I will cover some more aspects in the next part of this series.

Cheers!

Part 1 of this blog series covers some important aspects of NLS archival technology with SAP BW.

Some important points to keep in mind before starting a data archival project:

1) Choosing an NLS vendor. There are many SAP-certified NLS vendors; a vendor should be chosen based on your requirements and the product offered.

2) Correct sizing of the NLS box is important so that current and future data archival needs can be met.

3) Data to be archived: archived data is always online for reporting while using NLS, but the reporting speed is not comparable to reports run on BWA or HANA appliances. Only data which is rarely accessed should be put on NLS.


The next step is to determine which InfoCubes, and what data volume, should be archived. Based on BW statistics this decision can be made along with input from the business users. InfoCubes which are big in size and infrequently used are the best candidates for data archival.

Below is a quick chart of the logic that can be used to determine how a particular InfoCube should be archived:

[Figure: Archival Strategy chart]


NLS archival is done using time slices, so it is important that InfoCubes have time characteristics which can be used for archival.

InfoCubes which are refreshed on a daily basis (full load) need not be archived. If the SAP ECC or source system is also undergoing archival, then only the live, post-archival data in the source system will arrive via the daily full loads to such InfoCubes, and data which has been archived will not be available in the SAP BW system for reporting.

InfoCubes which are delta-enabled, and whose delta loads do not bring data for time slices which have already been archived, are the best candidates for NLS archival. The reason is that if you archive data from a cube for a particular time slice and the daily delta brings in data for that same time slice, the data loads will fail.

If your daily delta loads do bring in data for time slices which have already been archived, it is better to create copy cubes, move the data for those time slices into them, and archive the copy cubes instead of the original cubes. Post validation, the data can be deleted from the original InfoCubes. Existing MultiProviders and reports will have to be enhanced to include the new copy InfoCubes.
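The decision logic described above can be sketched as a small function. The load-type labels below are mine, not SAP terminology, and serve only to illustrate the three branches:

```python
# Sketch of the archival decision logic described in the text:
#  - daily full-load cubes: no archiving needed
#  - delta cubes whose deltas never touch archived time slices:
#    archive directly (best candidates)
#  - delta cubes whose deltas can bring data for archived slices:
#    use the copy-cube approach

def archival_strategy(load_type, delta_touches_archived_slices=False):
    if load_type == "full_daily":
        return "do not archive"
    if load_type == "delta":
        if delta_touches_archived_slices:
            return "archive via copy cube"
        return "archive directly"
    raise ValueError(f"unknown load type: {load_type}")

print(archival_strategy("delta"))                                      # archive directly
print(archival_strategy("delta", delta_touches_archived_slices=True))  # archive via copy cube
print(archival_strategy("full_daily"))                                 # do not archive
```

In practice this decision is made per InfoCube, using BW statistics and input from the business users.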


Every business requirement and landscape is different, and there are multiple ways data archival can be done using NLS; the above is one approach that can be used.

The second part of this series will cover some more aspects of NLS archival technology.

Cheers!

This blog has been translated from German with Google Translate; the original blog can be found here: ekessler.de



When a semantically partitioned object (SPO) is created based on a template object, an error can occur during activation. Figure 1.1 shows the construction of a semantically partitioned DSO, with a DSO used as the template.



Figure 1.1: Construction of an SPO with template


When activating the SPO, the error "Not possible to create external SAP HANA view for object <SPO>00 / Message no. RS2HANA_VIEW001" occurs.

The error message indicates that the system is attempting to generate an external SAP HANA view for an SPO (or a HybridProvider).
The error message is noted that attempts to generate an external SAP HANA View for an SPO (or a hybrid provider).




Figure 1.2: Error in activating


The cause of the error in this case is the template DSO: in the template DSO the indicator External SAP HANA View for Reporting is set, see Figure 1.3.




Figure 1.3: Indicator External SAP HANA View for Reporting in the template object


When the SPO is created, first its reference structure is generated. The reference structure is used as a template for each partition of the SPO.

When the reference structure <SPO>00 is built, the metadata of the original object is copied. The metadata of the template object also includes the indicator External SAP HANA View for Reporting.


BW currently does not support external SAP HANA views for semantically partitioned objects (or HybridProviders). Therefore the maintenance option for the indicator External SAP HANA View for Reporting is not available for a semantically partitioned object (see Figure 1.4).




Figure 1.4: Properties of the reference structure of the SPO


We find the indicator HANAMODELFL (External SAP HANA view for BW object) in the table RSDODSO (Directory of all datastores), see Figure 1.5.




Figure 1.5: Indicator External SAP HANA view for BW object


To activate the SPO, the indicator HANAMODELFL must be removed from the reference structure of the SPO. The reference structure follows the naming convention <SPO name>00. Alternatively, you can search for the reference structure by the name of the SPO in the table RSDODSO (for InfoCubes, see table RSDCUBE). Figure 1.6 shows both variants.
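The lookup described above can be sketched as a small helper. This is a hypothetical illustration of the naming convention, not an SAP API: the reference structure is named <SPO name>00, and its flags live in RSDODSO (DataStore objects) or RSDCUBE (InfoCubes).

```python
# Hypothetical helper illustrating where to look for the HANAMODELFL
# indicator of an SPO's reference structure. The function name and the
# object_type parameter are our own; only the table names and the "00"
# naming convention come from the text above.
def spo_reference_structure(spo_name: str, object_type: str = "DSO"):
    """Return (table, object name) to check for the HANAMODELFL indicator."""
    table = "RSDODSO" if object_type == "DSO" else "RSDCUBE"
    return table, spo_name + "00"

print(spo_reference_structure("ZSALES"))          # ('RSDODSO', 'ZSALES00')
print(spo_reference_structure("ZSALES", "CUBE"))  # ('RSDCUBE', 'ZSALES00')
```

In practice you would perform this lookup in transaction SE16 on the respective table, as shown in Figure 1.6.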




Figure 1.6: Search reference structure of the SPO in the RSDODSO


Remove the indicator HANAMODELFL (SAP HANA View), see Figure 1.7, and save.




Figure 1.7: Remove indicator HANAMODELFL (SAP HANA View)


Subsequently, the SPO can be activated without error.

Hi All,


Many of you might already have a list of items that need to be tested for your specific landscape during an upgrade, but I thought of posting this blog with a complete list of all BW test items from my project experience.


You can make use of this list for projects like BW Upgrade, Service Pack Upgrade, Source System Upgrade, HANA Migration & BEx Upgrade


The test items below have been divided into the following test components: Workbench, Query Designer, BEx Analyzer, BEx Workbooks, Web Templates & Others.


Component Testing Items

Workbench Tcode - RSA1, RSPC, RSPC1, RSPCM, RSRT, RSANWB, RSCRM_BAPI, RSA2, RSA3, RSA7, RSA6, SM50, SM51, SM66, SE38, SM49, SE37, ST12, SE09, SE03, SM12, RSA1OLD.

Workbench Authorizations & Roles

Workbench Master Data Load - Attribute (Source System)

Workbench Master Data Load - Attribute (Flat File)

Workbench Master Data Load - Text (Source System)

Workbench Master Data Load - Text (Flat File)

Workbench Maintain Master Data

Workbench Maintain Master Data Text

Workbench Hierarchy Data Load - Source System

Workbench Hierarchy Data Load - Flat File

Workbench Transaction Data Load - Source System

Workbench Transaction Data Load - Flat File

Workbench Query Execution in RSRT

Workbench APD Execution - All types (Flat File, Info Cube, Multi Provider, DSO..)

Workbench Open Hub - All Types

Workbench Program Execution - Standard & Custom

Workbench Data Reconciliation

Workbench Batch Run - All variants

Workbench check/create indexes on cube

Workbench check/refresh cube statistics

Workbench Compress Cube

Workbench Delete PSA in process chain

Workbench Maintain cube/Contents to examine data in cube

Workbench Create Info Package

Workbench Create DTP

Workbench Create DSO

Workbench Create Cube

Workbench Create Multiprovider

Workbench Create Transformation

Workbench Create Data Source

Workbench Create Web Template

Workbench Create Query

Workbench Create Process Chain

Workbench Create cube as copy

Workbench Create InfoObject as Reference

Workbench Capture objects in Transports

Workbench Add characteristic to Cube

Workbench Add characteristic to ODS

Workbench Change Identification in M/P

Workbench Data modelling - Multiprovider integrity

Workbench Add characteristic to M/Provider

Workbench Change existing InfoObject

Workbench Changing Transfer/Update Rules

Workbench Change existing InfoSet

Workbench Create Aggregate (Not Applicable in case of HANA Migration)

Workbench Modify Aggregate (Not Applicable in case of HANA Migration)

Workbench BIA Index (Not Applicable in case of HANA Migration)

Workbench DSO Activation

Workbench MDX Statements

Workbench Interrupts Execution

Workbench Create a Job

Workbench Schedule a job/Release a Job

Workbench Change the Job Status manually

Others  Performance of the MDX Execution pings/runtimes

Others  Interface with PI/XI/DB Connect/UD Connect/Flat File/Informatica/EDW/GDW/SSIS for data transfer

Query Designer Execute Query in Query Designer

Query Designer Variable Entry Screen

Query Designer F4 for help Window

Query Designer Variants in Variable Entry

Query Designer Report Output

Query Designer Context Menu

Query Designer Export to Excel

Query Designer Download to PDF

Query Designer Filter options

Query Designer Conditions/Exception

Query Designer Go To Option

Query Designer Drill Down/Across Option

Query Designer Create Bookmark

BEx Analyzer Execute Query in Analyzer

BEx Analyzer Variable Entry Screen

BEx Analyzer F4 for help Window

BEx Analyzer Variants in Variable Entry

BEx Analyzer Report Output

BEx Analyzer Context Menu

BEx Analyzer Export to Excel

BEx Analyzer Download to PDF

BEx Analyzer Filter options

BEx Analyzer Conditions/Exception

BEx Analyzer Go To Option

BEx Analyzer Drill Down/Across Option

BEx Workbook Execute Query in Analyzer

BEx Workbook Variable Entry Screen

BEx Workbook F4 for help Window

BEx Workbook Variants in Variable Entry

BEx Workbook Report Output

BEx Workbook Context Menu

BEx Workbook Export to Excel

BEx Workbook Download to PDF

BEx Workbook Filter options

BEx Workbook Conditions/Exception

BEx Workbook Go To Option

BEx Workbook Drill Down/Across Option

BEx Workbook Macro Addon

Web Templates Execute Query in Query Designer

Web Templates Variable Entry Screen

Web Templates F4 for help Window

Web Templates Variants in Variable Entry

Web Templates Report Output

Web Templates Context Menu

Web Templates Export to Excel

Web Templates Download to PDF

Web Templates Filter options

Web Templates Conditions/Exception

Web Templates Go To Option

Web Templates Drill Down/Across Option

Web Templates Create Bookmark



Abhishek Shanbhogue

What is an interrupt?


Interrupt is a process type used in process chains to continue the chain after completion of specific steps, either in BW or in any other source system. Interrupts are very helpful in automating the batch process in BW in case you don't have a third-party scheduler.


Business scenario


I am considering a classic example where we have SAP BW extracting data from SAP R/3 (ECC), and assuming the batch process chains have to wait until the data pre-calculations and dependent business processes are complete in the source system.






How do we achieve this?


Step 1: We can find the Interrupt process type under General Services.


Step 2: Create a new Process Chain and include the Interrupt Process soon after the Start Process



Different options in the interrupt scheduling:


  • Immediate
  • Date/Time
  • After job
  • After event
  • At operation mode
  • Factory calendar day



Step 4: Create an event in BW. Tcode SM64 will help you create an event in SAP BW and ECC.




Step 5: In my scenario I am using an event-based trigger, so that when an ECC job completes, the event is triggered, which in turn
triggers the BW process chain.


So I will make use of this event in the process chain and the ECC program.



Step 6: To trigger an event in BW from ECC you will need an event-raise program (we have a custom program available, so
I am making use of the same program here).


Tcode SE38 > event-raise program > maintain the variant Z_TEST, which was created earlier.




Step 7: Create a job in ECC using Tcode SM36, include the program which needs to be scheduled followed by the event, and
schedule this job as per the business needs.


Define the background Job


Assume Z1 is the ECC program on which the BW process chain depends, so include Z1.





Step 8: If your ECC job is scheduled daily at 8 PM, then the BW process chain should also be scheduled at the same time. Once the ECC job completes, it will trigger the event, which will in turn trigger the BW process chain.


















SAP CI/DB split - Issues and Analysis:-


In an SAP production landscape, we may have the CI (Central Instance), including all application servers, and the DB on a single host. With the HANA database now on the market, SAP clients are splitting CI and DB, keeping the CI on one host (a Linux host) and the DB on the old host. Later they can move the database from IBM DB2 to HANA.



During this, the OS of the CI may also have to change, e.g., from AIX to Linux: the CI is moved to Linux while DB2 is retained on AIX. The CI should be moved to Linux because SAP HANA is not supported in an AIX environment, so moving the CI to Linux is the first step towards installing the SAP HANA database in the future. After moving the DB to HANA, they can again keep both CI and DB on a new Linux host; both CI and DB then run on Linux.


Generally, after a CI/DB split in an SAP landscape, we may face many issues regarding communication, online/offline backups of SAP, performance of the BW server, etc.


Some of the common issues are listed below, along with how the BW, Basis and DB teams work together. After a CI/DB split we may face performance issues on BW; sometimes it may take a long time to open a transaction code in BW.


In RSA1, go to Modeling and then click on InfoProvider; after that the system may hang. This can happen for most of the work areas in SAP BW.


The first step is to ask SAP Basis and the DB team to check from their end. Both teams should work together, as CI and DB are on different hosts.


As per Basis analysis and advice, the DB team can run "update statistics" on BW. They will run REORG and RUNSTATS on BW; the runtime depends on the number of tables present in the database and how much data they are carrying.


REORG helps DB2 ensure that indexes become aware of all new data, no longer include deleted data, and that empty page space created by the deletion of data and indexes is collapsed.


RUNSTATS gathers updated statistics on the volume and distribution of data within tables and indexes. This information is stored in the system catalog tables and is used when optimizing queries against the data.


As a result, all tables are reorganized and their statistics updated.


The DB team can add significant memory to the DB2 buffer pools, as much as possible. Then the BW team can validate whether the BW server performance degradation persists. If performance is still slow, they can run an SAP trace to see where the time is being spent.


SAP BASIS can restart SAP.


Stopping the SAP System: Log in at the OS level as a user with SAP administrator authorization (<SID>adm).


Enter the command: stopsap [DB|R3|ALL].


DB stops the database system: stopsap DB


R3 stops the instances & associated processes of the SAP System: stopsap R3


ALL stops both the database system & the SAP System: stopsap ALL


Starting the SAP System: Log in at the OS level as a user with SAP administrator authorization (<SID>adm).


Enter the command: startsap [DB|R3|ALL].


The following applies to this command:


DB starts the database system:     startsap DB


R3 starts the instances & associated processes of the SAP System:     startsap R3


ALL starts both the database system & the SAP System:    startsap ALL


The DB team can adjust the settings so that more memory is given to both SAP and DB2.


If the performance is still poor, Basis can analyze the performance issues. They can clear the user buffer from their end. The screen below shows the user buffer area; to reset this area, do the following:




Authorization Values –> Reset User Buffer


Now BW Team can validate the performance.


In some cases we can see issues where the sequential read time of transaction codes differs between users: some user IDs take less time for Tcode RSA1, while for other users the Tcode hangs for a long time.


We tried accessing the RSA1 Tcode from other users (DDIC/USER1) and it opens quickly, whereas it is not so for USER2.

The response time logs of both users can be compared.





On further analysis, the issue now appears to happen for all user IDs. It is hanging while accessing table RSDVCHA.



Tracing the execution of transaction RSA1 shows hangs on multiple tables.



We can also go for a quick restart of the application and server. If Basis is unable to stop the database on BW, then the DB team can stop it from their end: we can tell them that the SAP application is currently down and ask them to stop the database. Once the DB is stopped, we can restart both the DB instance and the CI instance.


The application has been stopped on the CI instance.

In some scenarios, communication does not happen between CI and DB. We can check R3trans from the DB server to see whether processes are hanging.

R3trans Return Codes:-


R3trans sets a return code that shows whether or not the transport has succeeded. You can view details on transports in the log file. The return codes are as follows:

0: No errors or problems have occurred.

4: Warnings have occurred but they can be ignored.

8: Transport could not be finished completely. Problems occurred with certain objects.

12: Fatal errors have occurred, such as errors while reading or writing a file or unexpected errors within the database interface, in particular database problems.

16: Situations have occurred that should not have.
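The return codes above can be captured as a simple lookup. This is an illustrative sketch, not part of any SAP tool; treating 0 and 4 (warnings only) as success is the common convention, which the hypothetical `transport_ok` helper encodes.

```python
# Illustrative mapping of the R3trans return codes listed above.
R3TRANS_RETURN_CODES = {
    0: "No errors or problems have occurred.",
    4: "Warnings have occurred but they can be ignored.",
    8: "Transport could not be finished completely; problems with certain objects.",
    12: "Fatal errors, e.g. file read/write or database interface errors.",
    16: "Situations have occurred that should not have.",
}

def transport_ok(rc: int) -> bool:
    """Hypothetical helper: treat 0 and 4 (warnings only) as success."""
    return rc in (0, 4)

print(transport_ok(0), transport_ok(12))  # True False
```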


If this happens, the DB team can dig into the DB; it occurs mainly due to multiple locks in the database, which should be cleared from the DB end. They also have to check the DB2 CLI driver used for communication; if possible, they can reinstall the DB2 CLI driver to verify that it is working correctly.


Accessing the DB2 database using DB2 CLI: DB2 CLI uses a standard set of functions to execute SQL statements and related services at runtime. The DB2 Call Level Interface is IBM's callable SQL interface to the DB2 family of database servers. It uses function calls to pass dynamic SQL statements as function arguments. DB2 CLI does not require host variables or a precompiler.


Install the DB2 CLI driver on the database server. To do this, we require free space of approximately 50 MB for each operating system of the application servers.


We should also ensure that the main version of the DB2 CLI driver matches the main version of the DB2 software on the database server. The fix pack level of the DB2 CLI driver must be equal to or lower than the fix pack level of the DB2 software on the database server.


After installing, check again whether the SAP system and the SAP programs can locate the DB2 CLI driver and use it.


In any directory, execute the following command:


R3trans –x


This call should not return any errors and should report the following:

"R3trans finished (0000)."


BWadm 10> R3trans –d

This is R3trans version 6.24 (release 741 - 12.05.14 - 20:14:05).

Unicode enabled version

R3trans finished (0000).


By this we can ensure communication is happening between CI and DB.


In another scenario, R3trans responds fine and the application also connects fine, but the performance is still slow.


This may be due to online or offline backups. Tivoli Storage Manager (TSM) provides powerful, centralized backup, archive, and storage management. Tivoli Storage Manager also plays a role in backup and recovery of DB2 Content Manager databases in the event of hardware failure. The DBAs and the storage team can work together to get backup issues solved. Sometimes the backup job hangs on a TSM resource (a storage device resource) for long periods of time, creating numerous locks that can impact all SAP jobs, which ends up in hanging BW transaction codes. In this case we have to cancel the backup jobs and check the BW performance.


Daily online backups are taken on the DB; in case of an application upgrade, and before a CI/DB split, an offline backup should be taken. After the CI/DB split we should also monitor whether the backup jobs are creating any locks.


SAP BW 7.4 - Analysis & Issues:-

In SAP BW Release 7.4 SPS2 the domain RSCHAVL was converted from CHAR 60 to SSTRING 1333. In 7.4 versions the characteristic 0TCTIOBJVAL references the data element RSCHAVL_MAXLEN and has the type CHAR 250. Since the maximum permitted length of characteristic values is 250 characters, the values of all characteristics can be stored in 0TCTIOBJVAL.

If we compare with the previous version, the characteristic 0TCTIOBJVAL directly referenced the data element RSCHAVL, which uses the domain RSCHAVL. So the characteristic 0TCTIOBJVAL, and other objects that reference it, had to be changed in 7.4.

Data Element in SAP BW 7.4 Version:-




Data Element in SAP BW previous Version:-



The characteristics 0TCTLOW and 0TCTHIGH referenced the characteristic 0TCTIOBJVAL. Since both characteristics are frequently used together in the key of a DSO, they can no longer reference 0TCTIOBJVAL: as a result, both together would be 500 characters long, whereas the total key length of an ABAP Dictionary table in an SAP system must be shorter. Therefore, a new characteristic 0TCTIVAL60 of type CHAR 60 was introduced, and the characteristics 0TCTLOW and 0TCTHIGH now both reference 0TCTIVAL60.
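The key-length arithmetic behind introducing 0TCTIVAL60 can be sketched as follows. The exact ABAP Dictionary key limit is release-dependent and not stated in the text; the sketch relies only on it being below 500 characters.

```python
# Why 0TCTLOW/0TCTHIGH cannot both reference the CHAR 250 characteristic
# 0TCTIOBJVAL in a DSO key: together they would need 500 characters, which
# exceeds what an ABAP Dictionary table key allows. (The precise limit is
# release-dependent; we only rely on it being below 500 here.)
LEN_0TCTIOBJVAL = 250   # CHAR 250 in BW 7.4
LEN_0TCTIVAL60 = 60     # CHAR 60, the new characteristic

key_with_maxlen = 2 * LEN_0TCTIOBJVAL   # 0TCTLOW + 0TCTHIGH via 0TCTIOBJVAL
key_with_char60 = 2 * LEN_0TCTIVAL60    # 0TCTLOW + 0TCTHIGH via 0TCTIVAL60

print(key_with_maxlen, key_with_char60)  # 500 120
```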


They previously had the type CHAR 60 and still have the same type post upgrade. As a result, all objects that use these two characteristics together are still executable. However, they work only for applications whose characteristic values are no longer than 60 characters.

During the SUM tool run, the XPRA program RSD_XPRA_REPAIR_0TCTIOBJVL_740 copies the contents of the SID table of 0TCTIOBJVAL (/BI0/STCTIOBJVAL) to the SID table of 0TCTIVAL60 (/BI0/STCTIVAL60).


The characteristics 0TCTLOW_ML and 0TCTHIGH_ML are created and they reference 0TCTIOBJVAL.




In the previous version we used the DSO 0PERS_VAR, which contains the characteristics 0TCTLOW and 0TCTHIGH in its key part. Since this DSO could not store characteristic values longer than 60 characters, BW 7.4 came up with a new DSO 0PERS_VR1. The data part of this DSO contains the two characteristics 0TCTLOW and 0TCTHIGH.

The program RSD_XPRA_REPAIR_0TCTIOBJVL_740 activates the new DSO and copies the contents of the previous DSO 0PERS_VAR to the new DSO 0PERS_VR1. Then, the personalization should work as usual.

The programs that run during the upgrade activate new objects and, if necessary, copy data from old objects to new objects. However, they do not delete obsolete objects. The DSO 0PERS_VAR for storing the personalized variable values is no longer used.


The database tables RSECVAL and RSECHIE that store analytical authorization objects are no longer required.

Database table RSECVAL


Database table RSECHIE


The program RSD_XPRA_REPAIR_RSCHAVL_740 copies their contents to the tables RSECVAL_STRING and RSECHIE_STRING, respectively. If it has been executed successfully, we can delete the contents of the tables RSECVAL and RSECHIE.


The tables RSRNEWSIDS and RSRHINTAB_OLAP are also no longer required. The program RSD_XPRA_REPAIR_RSCHAVL_740 also copies the contents of these tables to the new tables RSRNEWSIDS_740 and RSRHINTAB_OLAP_S, respectively. If that has been successfully executed, we can also delete the contents of these tables.









ABAP Program:-

In SAP BW 7.4, the domain RSCHAVL was changed from CHAR 60 to SSTRING 1333. As a result, data elements that use the domain RSCHAVL are "deep" types in an ABAP context. Therefore, some ABAP language constructs are no longer possible and cause syntax errors or runtime errors in customer-specific programs.


Texts with a length of up to 1,333 characters are possible for characteristic values. For this, the structure RSTXTSMXL was created, which is a "deep" type in an ABAP context. In the internal method interfaces and function module interfaces that handle the texts of characteristic values, the type RSTXTSML was replaced with RSTXTSMXL. However, the structure RSTXTSML itself remains unchanged and is required for the description of metadata.





The change should have little effect on our programs. We must expect problems where we operate on characteristic values that are assigned with a generic type in case of variable exits or where we call SAP-internal functions or methods whose interfaces were changed by SAP.


Most of the problems are syntax errors that result in a program termination. We can use the Code Inspector tool to systematically detect and resolve these problems. We can run a Code Inspector check both before and after the upgrade; it will show us the things that need to change for 7.4.




The include ZXRSRU01 and the function module EXIT_SAPLRRS0_001 are not analyzed by the Code Inspector. This include must be fixed by an ABAPer via SE38. The point to remember is that we should use the keyword TYPE instead of LIKE: after the enhancement, RRRANGEEXIT is a complex (deep) structure, so TYPE must be used instead of LIKE.




The structure before the upgrade is shown below.



Post upgrade:-


Similar with Function Module too:-


These are the ABAP code changes that should be done by an ABAPer after the BW 7.4 upgrade.


