
SAP Business Warehouse



Problem Statement/Business Scenario


As a best practice, SAP suggests deleting the change log data in DSOs. But what if there is business logic in a transformation or update rule that consumes data from the change log? There is no standard process for deleting selected change log requests; you have to delete them manually.
Consider a scenario where a staging DSO feeds data to three different cubes: two are delta-enabled and the third receives a full request every day with snapshot data. If the change log data for the full requests isn't deleted, it will grow rapidly in no time.

As per the data model below, the staging DSO feeds data to multiple cubes, among them a snapshot cube loaded with full data, and there is no reason to retain the change log requests for it. Even in the worst-case scenario, a reload of the data from the staging DSO to the cube would not require the data in the change log.


 

 


Purpose
There are two ways to reduce the change log: delete the requests manually, or use an ABAP program to delete the change log with selections (full requests). SAP provides a way to delete change log data via a process chain, but you cannot delete requests selectively when the data is loaded to multiple data targets. To avoid manual intervention, a custom ABAP report can be created that automates the process; it can be run as a weekly/monthly/yearly housekeeping activity.

ABAP Report Design
An ABAP program can be created to look up all requests in table RSSELDONE, which holds the request ID, load date, InfoPackage ID, system ID, and update mode. Based on the InfoPackage ID, load date, and update mode, all full requests can be identified and deleted from the change log table. The change log table name can be found via DSO > Manage > Contents > Change Log.

 


Example: if full requests have to be retained for only 30 days, the program should delete all change log data for requests whose load date is older than 30 days.
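A minimal sketch of such a report follows. Treat it as a sketch only: the change log table name and InfoPackage ID are hypothetical placeholders, and the RSSELDONE field names used in the WHERE clause (the date field in particular) are assumptions that should be verified in SE11 before use.

REPORT zbw_chlog_cleanup.

" Hypothetical names: replace with the change log table of your DSO
" (DSO > Manage > Contents > Change Log) and your full-load InfoPackage.
CONSTANTS: c_chlogtab TYPE tabname     VALUE '/BIC/B0000123000',
           c_ipak     TYPE c LENGTH 30 VALUE 'ZPAK_FULL_SNAPSHOT',
           c_days     TYPE i           VALUE 30.

DATA: lt_rnr  TYPE STANDARD TABLE OF rsseldone-rnr,
      lv_rnr  TYPE rsseldone-rnr,
      lv_date TYPE sy-datum.

lv_date = sy-datum - c_days.

" Full requests ('F') of the InfoPackage older than the retention period.
" Field names RNR/LOGDPID/UPDMODE/RDATE are assumptions - check SE11.
SELECT rnr FROM rsseldone INTO TABLE lt_rnr
  WHERE logdpid = c_ipak
    AND updmode = 'F'
    AND rdate   < lv_date.

LOOP AT lt_rnr INTO lv_rnr.
  " Change log tables are PSA-style tables keyed by REQUEST/DATAPAKID/RECORD.
  DELETE FROM (c_chlogtab) WHERE request = lv_rnr.
  WRITE: / lv_rnr, 'deleted:', sy-dbcnt, 'records'.
ENDLOOP.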


Below is a snapshot of table RSSELDONE, which stores all the requests.


 


 

Thanks

Abhishek Shanbhogue

Hello friends,

 

Today I would like to talk about a common mistake that can cause major delays in incident processing times: incorrect component assignment. I will also show how to identify the correct component for your SAP incident.

 

I often notice that new incidents from customers are created under BW-BCT*, as BW-BCT has over 100 subcomponents, listing almost all applications and generic sub-areas of SAP products. This happens because the description of the subcomponents is not accurate in some cases and doesn't specify the area.

 

Please keep in mind that components under BW-BCT are reserved for issues related to the installation and operation of the SAP application extractors used to load data into SAP BW. Choosing the correct component is an important step during incident creation because it ensures the fastest processing of your request.

 

The following image shows an example: under the path BW -> BW-BCT there are over 100 subcomponents with simple descriptions like "Customer Relationship Management" or "Documentation". In reality, those components relate to the data extraction logic of the ECC applications that deal with "Customer Relationship Management" and "Documentation": BW-BCT-CRM and BW-BCT-DOC, respectively.


 

Another reason this happens is that the customer selects BW as the application for the incident, which automatically expands the BW area for component selection. This misleads the customer into picking a component based on its description, which will most likely be in the BW-BCT list, as it contains almost all SAP application extraction logic.

 

How to find the correct component for my Incident?

In order for SAP to assign an engineer to work on a customer incident, the incident must be assigned to an active component that is monitored by SAP engineers.

Here are a few tips to identify which component is the one for your incident:


Note search

Perform a note search in the SAP SMP (Service Marketplace) using the xSearch or Note search tool.
Use keywords related to your issue, such as the transaction code used to reproduce it, a table name, or the product name. Let's try an example in xSearch: my BPC user has been locked due to incorrect login attempts and I'm trying to find a solution for it:

 


 


Notice I'm searching for the text "user locked SU01"; the most suited areas would be BC, GRC and SV.
Now look at the narrowing example:

 


Notice I narrowed down the search by adding 'BPC' to the search text. This narrows the results from over 280 notes to only 5, which are for component EPM-BPC-NW.

 

Knowledge Base Article(KBA) or Note

Customers often find a note related to the incident being faced, or even create new incidents based on notes already provided by SAP.
As a general approach, the component where the Note or KBA was created for will be the most suited component to create a new incident:

 

KBA example:


 

Note example:



 

You can find the component of a KBA or Note in the Header Data section, close to the bottom of the page.

 

Short Dump

Usually, when a customer faces a short dump, you can see an application component assigned to the dump in the header, for example:


 

This is usually the correct component for the incident, but not in all cases.
To identify the correct component, you should analyze the dump text description and check the programs and the code section where execution stopped.


Check the function module/report or Transaction code component

This is usually the best method to identify the component responsible for working on the issue, as it shows the exact component responsible for that part of the code.
There are several ways of doing this; I'm going to explain the one that I believe most people will have authorization for and that is easy to do:
1. Open the transaction where you are facing the issue and navigate to one step before reproducing it.
2. Go to menu System -> Status.

3. Double-click the Program (Screen) value. The program code will open.

4. Go to menu Goto -> Attributes. A small popup will open.

5. Double-click the Package value. A new screen will open.
6. You will see the component in the Application Component field.


 


Checking some of the steps mentioned above should help you identify the correct component. However, there isn't a single formula for all issues; each issue has to be carefully interpreted to find the appropriate component.
Sometimes even we at SAP have a difficult time identifying the correct component for an issue. That is why it is imperative that customers provide a clear description of the issue, with a concrete example and the steps to reproduce it under the Reproduction Steps section.

 

 

Do you have another tip on how to identify the correct component? If so, please let me know in the comment section.

Hello everyone,

 

A few days ago my first printed book was published by Espresso Tutorials, and now it's also available as an eBook:

 


This book is written for beginners in SAP Business Warehouse 7.3. It starts with a short introduction to Business Intelligence and Data Warehouses in general. Then it gives a short overview of Bill Inmon's CIF as the basis for SAP's Business Warehouse, and explains LSA and LSA++ in a short section.

 

The main part is a real-world example: loading an IMS Health sample file into BW to answer a business question.

As it's written using SAP Business Warehouse 7.3, I explain how you can use dataflow diagrams to build a very basic data flow to load the sample data.

Then it leads step by step, with lots of screenshots, through the data modelling. You begin by creating your first characteristics and key figures. The next step is to create a DataStore Object, InfoCube, and MultiProvider in a simplified multi-layer architecture. The next chapter is about ETL. The term itself is briefly explained; then I show how to create transformations and DTPs. Finally, I show how to put all DTPs together in process chains for master data and transactional data.

The last chapter is about Business Explorer. I explain how you can easily create a BEx query on top of your MultiProvider. With a few screenshots I show how you can slice and dice through your data. Exceptions and conditions are briefly explained.

 

The biggest benefit is that you get some help on how to avoid the most common pitfalls in data modelling. The book also gives some help in case of data load errors. Two other goodies are ABAP programs:

1.) how to convert the key figure model into an account model

2.) how to easily create multiple key figures using BAPIs

 

The only disadvantage is that the book is available in German only. An English version is not planned at the moment.

 

Have fun reading it!

 

Cheers,

Jürgen

Ever wondered where your DTP/transformation spends its time?

 

This is how you can find out:

 

First, you have to create a small program like the following:

 

REPORT ZTEST.

DATA: R_DTP TYPE REF TO CL_RSBK_DTP.

" load the DTP
CALL METHOD CL_RSBK_DTP=>FACTORY
  EXPORTING
    " place the technical name of your DTP here
    I_DTP   = 'DTP_4YIEJ....'
  RECEIVING
    R_R_DTP = R_DTP.

DATA: L_R_REQUEST TYPE REF TO CL_RSBK_REQUEST.
L_R_REQUEST = R_DTP->CREATE_REQUEST( ).

" run the request in sync mode (not in batch) so it can be traced
L_R_REQUEST->SET_CTYPE( RSBC_C_CTYPE-SYNC ).
L_R_REQUEST->DOIT( ).

 

Now run this program through transaction SE30 or SAT.


After the execution of the DTP you will receive the result. In our case it was a SELECT on the WBS (PSP) element. We created an index, and the DTP then needed only 2 minutes instead of 30.

 


Of course, you can also simply call transaction RSA1 through SAT (using the processing mode "serially in the dialog process (for debugging)"). But that way you have to filter out the overhead that RSA1 creates in the performance log, and the data will not be written to the target (it's only simulated), so you might sometimes miss a bottleneck in your DTP/transformation.

 

thanks for reading ...

Questions about fast-growing tables in SAP systems are asked very often in the SCN forums. Such tables consume disk space and, even more importantly, processing large volumes of data in these tables slows down the system. This is also true for BW systems. Therefore, it is common for SAP Basis administrators to do housekeeping of these tables on a regular basis. There are SAP Notes available that deal with those tables and advise how to reorganize them. A perfect example of such a Note is 706478 - Preventing Basis tables from increasing considerably. The Note discusses many tables across different areas, including BW.


One of them is a table related to the process chain log: RSPCLOGCHAIN.


The table holds the logs of process chains. In large BW systems running many chains on a daily basis, the table can grow very easily. The regular way to get rid of process chain run logs is to do it from transactions like RSPC or RSPC1. In the log view there is a Delete function available in the menu:




To use this functionality in an automated way, the ABAP report RSPC_LOG_DELETE can be utilized. You can set up a job running this report on a regular basis for particular chains, with a selection on logs by date/time or log ID.

I found this report quite useful in BW housekeeping tasks.

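If you want to wrap the deletion in your own housekeeping report (for example as one step of a larger cleanup job), a minimal sketch is below; the variant name is hypothetical and must first be created for RSPC_LOG_DELETE with your chain and date selections:

REPORT zbw_pc_log_cleanup.

" Run the standard log-deletion report with a pre-created variant
" ('ZWEEKLY' is a hypothetical variant name).
SUBMIT rspc_log_delete USING SELECTION-SET 'ZWEEKLY' AND RETURN.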


PS: This blog is cross published on my personal blog site.


APD Query Tip

Posted by V J Aug 21, 2014

While working on an APD requirement, we came across a strange issue.

 

As per the requirement, we wanted to add a few new key figures to the APD query and hide a few old ones.

 

We went ahead and added the new key figures and hid the other ones. We did the mapping and activated the APD.

 

When we started testing, we realized the data was not coming out in the correct order. It was strange, and we couldn't figure out the reason.

 

We used trial and error and finally found the solution.

 

We moved all the hidden key figures to the bottom of the column box, and the APD output was corrected. It was a totally new and unexpected discovery, which we were delighted to find, hence sharing it here.

 

Conclusion: whenever you have hidden columns in an APD query, make sure you move them to the bottom of the column list to get correct data.

Part 2 of this blog series covers some more important aspects of NLS technology with SAP BW.

 

1) A prerequisite of NLS data archival is to compress the InfoCubes that you want to archive, up to the latest data load request, before archiving. In many landscapes this is normally done as part of maintenance activities, but if not, it can be a time-consuming process to run on all cubes and might impact project timelines.

 

2) A question often asked about NLS is whether data archived on NLS can be restored back to the live SAP BW system. The answer is "Yes", this is possible, but it might lead to inconsistencies in the system and will defeat the purpose of archival. The data reload option should be used only if there is a critical business need.

 

3) From a BW security perspective there are some prerequisites as well. For BW developers to perform archival-related activities under the developer role, the authorization object S_ARCHIVE needs to be assigned, with activity code, area, and object set to '*'.

 

For end users: to read archived data from NLS, users should have the authorization object S_RS_ADMWB with activity codes 03 and 16 assigned, restricted on object RSADMWBOBJ.


These prerequisites might be specific to your NLS vendor as well, and you may need to provide additional authorizations if mentioned in the documentation provided by the vendor.


4) Another important step in NLS archival is the creation of virtual InfoCubes. To read data from NLS, you will need to create new virtual cubes which actually connect to NLS. If a query needs to read data from NLS, these virtual cubes do the trick. You will need to modify the existing MultiProviders and queries to include these virtual cubes so that data archived on NLS is available for reporting.


I will cover some more aspects in the next part of this series.


Cheers !

Part 1 of this blog series covers some important aspects of NLS archival technology with SAP BW.

 

Some important points to keep in mind before starting a data archival project are:

 

1) Choosing an NLS vendor. There are many SAP-certified NLS vendors, and a vendor should be chosen based on your requirements and the product offered.

 

2) Correct sizing of the NLS box is important so that current and future data archival needs can be met.

 

3) Data to be archived: archived data is always online for reporting while using NLS, but the reporting speed is not comparable to reports run on BWA or HANA appliances. Only data which is rarely accessed should be put on NLS.

 

The next step is to determine which InfoCubes, and what data volume, should be archived. This decision can be made based on BW statistics, along with input from the business users. InfoCubes which are big in size and infrequently used are the best candidates for data archival.

 

Below is a quick chart of the logic that can be used to determine how a particular InfoCube should be archived.

 

 


 

NLS archival is done using time slices, so it is important that InfoCubes have time characteristics which can be used for data archival.

 

InfoCubes which are refreshed on a daily basis (full load) need not be archived. If the SAP ECC or source system is also undergoing archival, then only the live data remaining after archival in the source system will be loaded by the daily full loads to such InfoCubes, and data which has been archived will not be available in the SAP BW system for reporting.

 

InfoCubes which are delta-enabled, and whose delta loads don't bring data for time slices which have already been archived, are the best candidates for NLS archival. The reason is that if you archive data from a cube for a particular time slice and the daily delta brings in data for that same time slice, the data loads will fail.


If your daily delta loads to InfoCubes bring in data for time slices which have already been archived, it's better to create copy cubes, move the data for those particular time slices to the copy cubes, and archive the copy cubes instead of the original cubes. Post-validation, the data can be deleted from the original InfoCubes. Existing MultiProviders and reports will have to be enhanced to include the new copy InfoCubes.

 

Every business requirement and landscape is different, and there are multiple ways data archival can be done using NLS. The above is one approach that can be used.


The second part of this series will cover some more aspects of NLS archival technology.

 

Cheers !

This blog has been translated with Google Translate. The original blog can be found here: ekessler.de

 

 

If a semantically partitioned object (SPO) is created based on a template object, an error can occur during activation. Figure 1.1 shows the construction of a semantically partitioned DSO. A DSO is used as the template.

 


Figure 1.1: Construction of an SPO with template

 

When activating the SPO, the error "Not possible to create external SAP HANA view for object <SPO>00" (message no. RS2HANA_VIEW001) occurs.

 

The error message indicates that the system attempted to generate an external SAP HANA view for an SPO (or a HybridProvider).

 


 

Figure 1.2: Error in activating

 

The cause of the error in this case is the template DSO. In the template DSO, the indicator External SAP HANA View for Reporting is set; see Figure 1.3.

 


 

Figure 1.3: Indicator External SAP HANA View for Reporting in the template object

 

When the SPO is created, first the reference structure for the SPO is generated. The reference structure of the SPO is used as a template for each partition of the SPO.

 

When the reference structure <SPO>00 of the SPO is created, the metadata of the original object is copied. The metadata of the template object also includes the indicator External SAP HANA View for Reporting.

 

External SAP HANA views are currently not supported for semantically partitioned objects (and HybridProviders). Therefore, the maintenance option for the indicator External SAP HANA View for Reporting is not available for a semantically partitioned object (see Figure 1.4).

 


 

Figure 1.4: Properties of the reference structure of the SPO

 

We find the indicator HANAMODELFL (External SAP HANA view for BW object) in table RSDODSO (directory of all DataStores); see Figure 1.5.

 


 

Figure 1.5: Indicator External SAP HANA view for BW object

 

To activate the SPO, the indicator HANAMODELFL must be removed for the reference structure of the SPO. The reference structure follows the naming convention <SPO name>00. Alternatively, you can search for the reference structure by the name of the SPO in table RSDODSO (for cubes, see table RSDCUBE). Figure 1.6 shows both variants.

 


 

Figure 1.6: Search reference structure of the SPO in the RSDODSO

 

Remove the indicator HANAMODELFL (SAP HANA View), see Figure 1.7, and save.

 


 

Figure 1.7: Remove indicator HANAMODELFL (SAP HANA View)
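For completeness, the same change expressed as an ABAP sketch (direct updates to SAP control tables should be made with care, and the reference structure name below is hypothetical):

" Clear the external-HANA-view flag on the SPO reference structure <SPO>00.
" 'ZSPODEMO00' is a hypothetical name; for InfoCubes use table RSDCUBE instead.
UPDATE rsdodso
   SET hanamodelfl = space
 WHERE odsobject = 'ZSPODEMO00'
   AND objvers   = 'A'.
IF sy-subrc = 0.
  COMMIT WORK.
ENDIF.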

 

Afterwards, the SPO can be activated without errors.

Hi All,

 

Many of you might already have a list of items that need to be tested for your specific landscape when you do an upgrade, but I thought of posting this blog with a complete list of all BW test items from my project experience.

 

You can make use of this list for projects like BW Upgrade, Service Pack Upgrade, Source System Upgrade, HANA Migration & BEx Upgrade.

 

The test items below are divided into test components: Workbench, Query Designer, BEx Analyzer, BEx Workbooks, Web Templates & Others.

 

Component Testing Items

Workbench Tcode - RSA1, RSPC, RSPC1, RSPCM, RSRT, RSANWB, RSCRM_BAPI, RSA2, RSA3, RSA7, RSA6, SM50, SM51, SM66, SE38, SM49, SE37, ST12, SE09, SE03, SM12, RSA1OLD.

Workbench Authorizations & Roles

Workbench Master Data Load - Attribute (Source System)

Workbench Master Data Load - Attribute (Flat File)

Workbench Master Data Load - Text (Source System)

Workbench Master Data Load - Text (Flat File)

Workbench Maintain Master Data

Workbench Maintain Master Data Text

Workbench Hierarchy Data Load - Source System

Workbench Hierarchy Data Load - Flat File

Workbench Transaction Data Load - Source System

Workbench Transaction Data Load - Flat File

Workbench Query Execution in RSRT

Workbench APD Execution - All types (Flat File, Info Cube, Multi Provider, DSO..)

Workbench Open Hub - All Types

Workbench Program Execution - Standard & Custom

Workbench Data Reconciliation

Workbench Batch Run - All variants

Workbench check/create indexes on cube

Workbench check/refresh cube statistics

Workbench Compress Cube

Workbench Delete PSA in process chain

Workbench Maintain cube/Contents to examine data in cube

Workbench Create Info Package

Workbench Create DTP

Workbench Create DSO

Workbench Create Cube

Workbench Create Multiprovider

Workbench Create Transformation

Workbench Create Data Source

Workbench Create Web Template

Workbench Create Query

Workbench Create Process Chain

Workbench Create cube as copy

Workbench Create InfoObject as Reference

Workbench Capture objects in Transports

Workbench Add characteristic to Cube

Workbench Add characteristic to ODS

Workbench Change Identification in M/P

Workbench Data modelling - Multiprovider integrity

Workbench Add characteristic to M/Provider

Workbench Change existing InfoObject

Workbench Changing Transfer/Update Rules

Workbench Change existing InfoSet

Workbench Create Aggregate (Not Applicable in case of HANA Migration)

Workbench Modify Aggregate (Not Applicable in case of HANA Migration)

Workbench BIA Index (Not Applicable in case of HANA Migration)

Workbench DSO Activation

Workbench MDX Statements

Workbench Interrupts Execution

Workbench Create a Job

Workbench Schedule a job/Release a Job

Workbench Change the Job Status manually

Others  Performance of the MDX Execution pings/runtimes

Others  Interface with PI/XI/DB Connect/UD Connect/Flat File/Informatica/EDW/GDW/SSIS for data transfer

Query Designer Execute Query in Query Designer

Query Designer Variable Entry Screen

Query Designer F4 for help Window

Query Designer Variants in Variable Entry

Query Designer Report Output

Query Designer Context Menu

Query Designer Export to Excel

Query Designer Download to PDF

Query Designer Filter options

Query Designer Conditions/Exception

Query Designer Go To Option

Query Designer Drill Down/Across Option

Query Designer Create Bookmark

BEx Analyzer Execute Query in Analyzer

BEx Analyzer Variable Entry Screen

BEx Analyzer F4 for help Window

BEx Analyzer Variants in Variable Entry

BEx Analyzer Report Output

BEx Analyzer Context Menu

BEx Analyzer Export to Excel

BEx Analyzer Download to PDF

BEx Analyzer Filter options

BEx Analyzer Conditions/Exception

BEx Analyzer Go To Option

BEx Analyzer Drill Down/Across Option

BEx Workbook Execute Query in Analyzer

BEx Workbook Variable Entry Screen

BEx Workbook F4 for help Window

BEx Workbook Variants in Variable Entry

BEx Workbook Report Output

BEx Workbook Context Menu

BEx Workbook Export to Excel

BEx Workbook Download to PDF

BEx Workbook Filter options

BEx Workbook Conditions/Exception

BEx Workbook Go To Option

BEx Workbook Drill Down/Across Option

BEx Workbook Macro Addon

Web Templates Execute Query in Query Designer

Web Templates Variable Entry Screen

Web Templates F4 for help Window

Web Templates Variants in Variable Entry

Web Templates Report Output

Web Templates Context Menu

Web Templates Export to Excel

Web Templates Download to PDF

Web Templates Filter options

Web Templates Conditions/Exception

Web Templates Go To Option

Web Templates Drill Down/Across Option

Web Templates Create Bookmark

 

Thanks

Abhishek Shanbhogue

What are interrupts?

 

Interrupt is a process type used in process chains to trigger the chain after the completion of specific steps, either in BW or in another source system. Interrupts are very helpful in automating the batch process in BW if you don't have a third-party scheduler.

 

Business scenario

 

I am considering a classic example where SAP BW extracts data from SAP R/3 (ECC), and the batch process chains have to wait until the data pre-calculations and dependent business processes are complete in the source system.

 

 


 

 

How do we achieve this?

 

Step 1: The interrupt process type can be found under General Services.

Step 2: Create a new process chain and include the interrupt process right after the Start process.

Step 3: Choose among the different options for interrupt scheduling:

  • Immediate
  • Date/Time
  • After job
  • After event
  • At operation mode
  • Factory calendar day

 

Step 4: Create an event in BW. Tcode SM64 lets you create an event in SAP BW and in ECC.

 

Step 5: In my scenario I am using an event-based trigger, so that when an ECC job completes, an event is triggered which in turn triggers the BW process chain.

So I will make use of this event in the process chain and in the ECC program.

Step 6: To trigger an event in BW from ECC you will need an event-raise program. (We have a custom program available, so I am making use of the same program here.)

Tcode SE38 > event-raise program > maintain the variant Z_TEST, which was created earlier.

 

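If no such custom program exists in your system, a minimal sketch is shown below. It wraps the standard function module BP_EVENT_RAISE; the report name and the default event ID are hypothetical.

REPORT zraise_bw_event.

" Raises a background event (the one the BW interrupt waits for).
PARAMETERS p_event TYPE btceventid DEFAULT 'Z_TEST'.

CALL FUNCTION 'BP_EVENT_RAISE'
  EXPORTING
    eventid                = p_event
  EXCEPTIONS
    bad_eventid            = 1
    eventid_does_not_exist = 2
    eventid_missing        = 3
    raise_failed           = 4
    OTHERS                 = 5.

IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE 'E' NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.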

 

Step 7: Create a job in ECC using Tcode SM36, include the program which needs to be scheduled followed by the event, and schedule this job as per the business needs.

 

Define the background Job


Assume Z1 is the ECC program that the BW process chain depends on, so include Z1 in the job.

 



 

Step 8: If your ECC job is scheduled daily at 8 PM, then the BW process chain should also be scheduled at the same time. Once the ECC job completes, it triggers the event, which in turn releases the interrupt in the BW process chain.

SAP CI/DB split - Issues and Analysis:-

 

In an SAP production landscape, we may have the CI (Central Instance), including all application servers, and the DB on a single host. Now that the HANA database is on the market, SAP clients are splitting CI and DB, keeping the CI on one host (a Linux host) and the DB on the old host. Later they can move the database from IBM DB2 to HANA.

 

 

During this split, the OS of the CI may also have to change, e.g. from AIX to Linux: the CI is moved to Linux and DB2 is retained on AIX itself. The CI should be moved to Linux because SAP HANA is not supported in an AIX environment, so moving the CI to Linux is the first step towards installing the SAP HANA database in the future. After moving the DB to HANA, they can again keep both CI and DB on a new Linux host; both CI and DB then run on Linux.

 

Generally, after a CI/DB split in an SAP landscape, we may face many issues regarding communication, online/offline backups, performance of the BW server, etc.

 

Some of the common issues are listed below, along with how the BW, Basis, and DB teams can work together. After a CI/DB split we may face performance issues on BW; sometimes it takes a long time to open a transaction code in BW.

 

For example, in RSA1 go to Modeling and then click on InfoProvider; after that the system hangs. This happens for most of the work areas in SAP BW.


 

The first step is to ask SAP Basis and the DB team to check from their end. Both teams should work together, as CI and DB are on different hosts.

 

As per Basis analysis and advice, the DB team can run "Update Statistics" on BW: they will run REORG and RUNSTATS on the BW database. How long this takes depends on the number of tables present in the database and how much data they are carrying.

 

REORG ensures that DB2 indexes become aware of all new data, no longer include deleted data, and that empty page space created by deletions of data and indexes is collapsed.

 

RUNSTATS gathers updated statistics on the volume and distribution of data within tables and indexes. This information is stored in the system tables and is used by the optimizer when querying the data.

 

As a result, all tables have been reorganized and their statistics refreshed.

 

The DB team can add significant memory to the DB2 buffer pools, as much as possible. Then the BW team can check whether the performance degradation is resolved. If performance is still slow, they can run an SAP trace to see where the time is being spent.

 

SAP BASIS can restart SAP.

 

Stopping the SAP system: log in at OS level as a user with SAP administrator authorization (<SID>adm).

 

Enter the command: stopsap [DB|R3|ALL].

 

DB stops the database system: stopsap DB

 

R3 stops the instances & associated processes of the SAP System: stopsap R3

 

ALL stops both the database system & the SAP System: stopsap ALL

 

Starting the SAP system: log on at OS level as a user with SAP administrator authorization (<SID>adm).

 

Enter the command: startsap [DB|R3|ALL].

 

The following applies to this command:

 

DB starts the database system:     startsap DB

 

R3 starts the instances & associated processes of the SAP System:     startsap R3

 

ALL starts both the database system & the SAP System:    startsap ALL

 

The DB team can adjust the settings so that more memory is given to both SAP and DB2.

 

If performance is still poor, Basis can analyze the performance issues further. They can clear the user buffer from their end. To reset the user buffer area, follow these steps:

 

TCODE -> SU56

 

Authorization Values -> Reset User Buffer


 

Now the BW team can validate the performance.

 

In some cases we see issues where the sequential read time for transaction codes differs between users: some user IDs take less time for Tcode RSA1, while for other users the Tcode hangs for a long time.

 

We tried accessing Tcode RSA1 with other users (DDIC/USER1) and it opens quickly, whereas it does not for USER2. The response time logs of both users can be compared.

 


On further analysis, the issue turned out to happen for all user IDs. It hangs while accessing table RSDVCHA.


 

 

Tracing the execution of transaction RSA1 shows it hanging on multiple tables.

 

 

We can also go for a quick restart of the application and the server. If Basis is unable to stop the database from the BW side, the DB team can stop it from their end; we can tell them that the SAP application is currently down and ask them to stop the database. Once the DB is stopped, we can restart the DB instance and the CI instance.

 

The application has been stopped on the CI instance.

In some scenarios, communication does not happen between CI and DB. We can check R3trans from the DB server to see whether processes are hanging.


R3trans Return Codes:-

 

R3trans sets a return code that shows whether or not the transport has succeeded. You can view details on transports in the log file. The return codes are as follows:

0: No errors or problems have occurred.

4: Warnings have occurred but they can be ignored.

8: Transport could not be finished completely. Problems occurred with certain objects.

12: Fatal errors have occurred, such as errors while reading or writing a file or unexpected errors within the database interface, in particular database problems.

16: Situations have occurred that should not have.

 

If this happens, the DB team can dig into the DB; it occurs mainly due to multiple locks in the database, which should be cleared from the DB end. They also have to check the DB2 CLI driver used for communication. If possible, they can reinstall the DB2 CLI driver to verify that it is working correctly.

 

Accessing the DB2 database using DB2 CLI: DB2 CLI uses a standard set of functions to execute SQL statements and related services at run time. DB2 Call Level Interface is IBM's callable SQL interface to the DB2 family of database servers. It uses function calls to pass dynamic SQL statements as function arguments. DB2 CLI does not require host variables or a precompiler.

 

Install the DB2 CLI driver on the database server. To do this, we require free space of approximately 50 MB for each operating system of the application servers.

 

We should also ensure that the major version of the DB2 CLI driver matches the major version of the DB2 software on the database server. The fix pack level of the DB2 CLI driver must be equal to or lower than the fix pack level of the DB2 software on the database server.

 

After reinstalling, check whether the SAP system and the SAP programs can find the DB2 CLI driver and use it.

 

In any directory, execute the following command:

 

R3trans -x

 

This call should not return any errors and should report the following:

"R3trans finished (0000)."

 

BWadm 10> R3trans -d

This is R3trans version 6.24 (release 741 - 12.05.14 - 20:14:05).

Unicode enabled version

R3trans finished (0000).

 

This ensures that communication is happening between CI and DB.

 

In another scenario, R3trans may respond fine and the application may also connect fine, but performance is again slow.

 

This may be due to online or offline backups. Tivoli Storage Manager (TSM) provides powerful, centralized backup, archive, and storage management. TSM also plays a role in backup and recovery of DB2 Content Manager databases in the event of hardware failure. The DBAs and the storage team can work together to get backup issues solved. Sometimes a backup job hangs on a TSM resource (a storage device) for long periods of time and creates numerous locks that can impact all SAP jobs, which ends up in hanging BW transaction codes. In this case we have to cancel the backup jobs and check BW performance.

 

Daily online backups of the DB are taken; in case of an application upgrade, and before the CI/DB split, an offline backup should be taken. After the CI/DB split we should also monitor the backup jobs to see whether they are creating any locks.

----------------------------------------------------------------------------------------------------------


SAP BW 7.4 - Analysis & Issues:-


In SAP BW Release 7.4 SPS2 the domain RSCHAVL was converted from CHAR 60 to SSTRING 1333. In 7.4 versions the characteristic 0TCTIOBJVAL references the data element RSCHAVL_MAXLEN and has the type CHAR 250. Since the maximum permitted length of characteristic values is 250 characters, the values of all characteristics can be stored in 0TCTIOBJVAL.


If we compare with the previous version, the characteristic 0TCTIOBJVAL directly referenced the data element RSCHAVL, which uses the domain RSCHAVL. So the characteristic 0TCTIOBJVAL, and other objects that reference it, had to be changed in 7.4.


Data Element in SAP BW 7.4 Version:-


 

 

 

Data Element in SAP BW previous Version:-


 

 

The characteristics 0TCTLOW and 0TCTHIGH referenced the characteristic 0TCTIOBJVAL. Since both characteristics are frequently used together in the key of a DSO, they can no longer reference 0TCTIOBJVAL: together they would be 500 characters long, but the total key length of an ABAP Dictionary table in an SAP system must be shorter. Therefore, a new characteristic 0TCTIVAL60 of type CHAR 60 was introduced, and the characteristics 0TCTLOW and 0TCTHIGH now both reference 0TCTIVAL60.


 

They previously had the type CHAR 60 and still have the same type post-upgrade. As a result, all objects that use these two characteristics together are still executable. However, they work only for applications whose characteristic values are no longer than 60 characters.

During the SUM tool run, the XPRA program RSD_XPRA_REPAIR_0TCTIOBJVL_740 is executed, which copies the contents of the SID table of 0TCTIOBJVAL (/BI0/STCTIOBJVAL) to the SID table of 0TCTIVAL60 (/BI0/STCTIVAL60).


 

The characteristics 0TCTLOW_ML and 0TCTHIGH_ML are created and they reference 0TCTIOBJVAL.

 

 

 

In the previous version, we used the DSO 0PERS_VAR, which contains the characteristics 0TCTLOW and 0TCTHIGH in its key part. Since this DSO could not store characteristic values longer than 60 characters, BW 7.4 came up with a new DSO, 0PERS_VR1. The data part of this DSO contains the two characteristics 0TCTLOW and 0TCTHIGH.

The program RSD_XPRA_REPAIR_0TCTIOBJVL_740 activates the new DSO and copies the contents of the previous DSO 0PERS_VAR to the new DSO 0PERS_VR1. Afterwards, the personalization should work as usual.

The programs that run during the upgrade activate new objects and, if necessary, copy data from old objects to new objects. However, they do not delete obsolete objects. The DSO 0PERS_VAR for storing the personalized variable values is no longer used.

 

The database tables RSECVAL and RSECHIE that store analytical authorization objects are no longer required.


Database table RSECVAL


 

Database table RSECHIE


 

The program RSD_XPRA_REPAIR_RSCHAVL_740 copies their contents to the tables RSECVAL_STRING and RSECHIE_STRING. Once it has executed successfully, the contents of tables RSECVAL and RSECHIE can be deleted.


 

The tables RSRNEWSIDS and RSRHINTAB_OLAP are also no longer required. The program RSD_XPRA_REPAIR_RSCHAVL_740 also copies the contents of these tables to the new tables RSRNEWSIDS_740 and RSRHINTAB_OLAP_S. Once that has executed successfully, the contents of these tables can also be deleted.


RSRHINTAB_OLAP


 

RSRNEWSIDS


 

RSRNEWSIDS_740


 

RSRHINTAB_OLAP_S


 

ABAP Program:-


In SAP BW 7.4, the domain RSCHAVL was changed from CHAR 60 to SSTRING 1333. As a result, data elements that use the domain RSCHAVL are "deep" types in an ABAP context. Therefore, some ABAP language constructs are no longer possible; they produce syntax errors or cause runtime errors in customer-specific programs.
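As an illustration (a sketch, not an exhaustive list): a structure such as RRRANGEEXIT, whose LOW and HIGH components are based on RSCHAVL, is now a deep structure, so flat-structure constructs like these no longer compile:

DATA: ls_range      TYPE rrrangeexit,
      l_buffer(255) TYPE c.

" l_buffer = ls_range.      " syntax error: a deep structure can no longer
"                             be assigned to a flat character field
" ls_range+10(20) = 'X'.    " syntax error: no offset/length access on a
"                             deep structure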

 

Texts with a length of up to 1,333 characters are possible for characteristic values. For this, the structure RSTXTSMXL was created, which is a "deep" type in an ABAP context. In the internal method interfaces and function module interfaces that handle the texts of characteristic values, the type RSTXTSML was replaced with RSTXTSMXL. However, the RSTXTSML structure itself remains unchanged and is still required for the description of metadata.


RSTXTSML:-

 

RSTXTSMXL:

 

The change should have little effect on your own programs. You must expect problems where you operate on characteristic values with generically typed variables (for example in variable exits), or where you call SAP-internal functions or methods whose interfaces were changed by SAP.

 

Most of the problems are syntax errors that result in a program termination. You can use the Code Inspector tool to systematically detect and resolve these problems. You can run a Code Inspector check both before and after the upgrade; it will show the things that need to change for 7.4.


 

 

 

The include ZXRSRU01 and the function module EXIT_SAPLRRS0_001 will not be analyzed by the Code Inspector. This include must be fixed by an ABAPer in SE38. The point to remember is to use the keyword TYPE instead of LIKE: after the change, RRRANGEEXIT is a complex (deep) structure, so declarations must use TYPE.

 

DATA:   VAR_NAME TYPE RRRANGEEXIT.
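In context, a customer exit coded against this declaration might look like the sketch below; the E_T_RANGE parameter and the sample values are assumptions based on the usual EXIT_SAPLRRS0_001 interface:

" old declaration - breaks once RRRANGEEXIT contains deep components:
" DATA: l_s_range LIKE rrrangeexit.
DATA: l_s_range TYPE rrrangeexit.

l_s_range-sign = 'I'.
l_s_range-opt  = 'EQ'.
l_s_range-low  = '2014'.
APPEND l_s_range TO e_t_range.  " E_T_RANGE from the exit interface (assumed)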



 

The structure before the upgrade is shown below.



 

 

Post upgrade:-


 

The same applies to the function module:-


 

These are the ABAP code changes that should be made by an ABAPer after the BW 7.4 upgrade.

-----------------------------------------------------------------------------------------------------




Recently I have seen a lot of problems related to the transport of transformations, dumping in class CL_RSO_TLOGO_PERSISTENCY, and I thought it would be a good idea to give a brief overview of this issue and how this dump can be avoided. Furthermore, I would like to give you some hints on what should be checked to analyze the problem, and finally, in the last part of this blog entry, I would like to summarize the release-specific properties.

 

 


 

 

 

Background

 

Versioning of the BW metadata was implemented in the TLOGO framework with BW 7.30. If transformations contain orphaned metadata, this dump occurs as a result. In many cases the issue occurs, more specifically, because there are inconsistent entries for the affected transformations in tables RSTRANSTEPROUT and RSTRANROUTMAP.

Running report RSTRAN_ROUT_RSFO_CHECK (which checks formulas and routines with regard to completeness and consistency of metadata) will return the list of affected TRANIDs and their related errors.

 

 

Effects of note 1369395

 

 

You only need to check the post-implementation manual steps from this note, where the instructions for report RSTRAN_ROUT_RSFO_CHECK are described.

This report will check the formulas and routines with regard to completeness or consistency of metadata.

 

You should run this report first in "Check" mode to identify the inconsistencies, and then in "Repair" mode to correct them.

 

Caution: You have to perform this manual activity separately in each system into which you transport the Note for implementation.

 

After that, run the report in check mode again and see the results.

 

If the error messages still persist, delete the T version via class CL_RSTRAN_STAT=>DELETE_VERSION_FROM_DB. You can find more information about this step in note 1720202.

 

 

 

This action cannot be transported and must be executed in all target systems of the transportation landscape.

Another very important point is that you must transport the cleaned-up transformation using a NEW transport. The old transport contains incorrect objects and can no longer be used; you should delete the old transport from all import buffers.

 

Several times I have also seen that the code correction from the note is missing on the system.
When you create a NEW transport request, make sure that all the necessary routines are included; for this, check the option "All necessary objects".

 

 

List of referenced notes and documents:

 

#1801309 - Dump in CL_RSO_TLOGO_PERSISTENCY during the transport of transformations in BW 7.30

#1760444 - Transport failure: The object name is not allowed to be empty (R7105)

#1880563 - Manual task for DUMP in CL_RSO_TLOGO_PERSISTENCY

#1720202 - Dump 'ASSERTION_FAILED' in class CL_RSTRAN_GEN / error message RSTRAN 710 during the activation of a transformation

Hello,

 

I'd like to share some knowledge I recently stumbled upon while creating a transformation and having to debug it, since it did not work as expected.

Since I did not see any hints about this on SCN or in the help, I thought I'd share it:

 

When I use custom routines in a transformation, I like to include proper exception handling and write monitor entries, since you can never be sure what kind of data is being input by the users, and you don't want to waste too much time searching for the document that actually caused the error.

The help specifies the exceptions that can be used as follows:

 

Exception handling by means of exception classes is used to control what is written to the target:

  •   CX_RSROUT_SKIP_RECORD: If a RAISE EXCEPTION TYPE cx_rsrout_skip_record is triggered in the routine, the system stops processing the current row and continues with the next data record.
  •   CX_RSROUT_SKIP_VAL: If an exception of type cx_rsrout_skip_val is triggered in the routine, the target field is deleted.
  •   CX_RSROUT_ABORT: If a RAISE EXCEPTION TYPE cx_rsrout_abort is triggered in the routine, the system terminates the entire load process. The request is highlighted in the extraction monitor as Terminated, and the system stops processing the current data package. This can be useful with serious errors.

 

(See Routine Parameters for Key Figures or Characteristics - Modeling - SAP Library)

What the help does not say is that there is one obstacle if you try to use it not with a key figure, but with a characteristic:

 

The thinking behind this seems to be that if it is not a valid record (i.e. you raise the exception in a characteristic routine), the whole record is skipped.

 

So instead of a "clear", that happens when you raise the exception with a key figure (example):

Exception2.png


You get a raise exception type cx_rsrout_skip_record:

Exception1.png

If you search for it, you can find it easily in the generated program though.
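To make the behaviour concrete, here is a sketch of the body of a characteristic field routine that writes a monitor entry before skipping; the generated parameters (SOURCE_FIELDS, RESULT, MONITOR), the source field name, and the message class are assumptions for illustration:

DATA: monitor_rec TYPE rstmonitor.

IF source_fields-docnr IS INITIAL.      " DOCNR is a hypothetical source field
  monitor_rec-msgid = 'ZBW'.            " hypothetical message class/number
  monitor_rec-msgty = 'W'.
  monitor_rec-msgno = '001'.
  APPEND monitor_rec TO monitor.
  " for a characteristic this skips the entire record, not just the field
  RAISE EXCEPTION TYPE cx_rsrout_skip_record.
ENDIF.

result = source_fields-docnr.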


Hope it helps with the proper usage of the routines and the exceptions.
