
SAP Business Warehouse


There are scenarios where a Transformation End Routine is a good fit. In this blog I will demonstrate how to simplify transformation rules by:

 

Reducing Coding

 

In my case I load PO GR data and look up multiple characteristic values at PO item level. Instead of coding a similar lookup / mapping repeatedly for each characteristic in an individual transformation rule, I did it once in the end routine. This saved not only coding effort, but also improved performance by reducing the number of lookups.

 

End Routine 1.jpg

 

End Routine 2.jpg
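The screenshots above show the actual routine. Conceptually (the lookup DSO table and all field names below are hypothetical, for illustration only), a single lookup in the end routine can fill several characteristics at once:

```abap
" Hypothetical end-routine sketch: one lookup per data package
" fills several characteristics, instead of one lookup per rule.
FIELD-SYMBOLS: <result_fields> TYPE _ty_s_tg_1.

DATA: lt_poitem TYPE STANDARD TABLE OF /bic/azpoitm00,
      ls_poitem TYPE /bic/azpoitm00.

IF RESULT_PACKAGE IS NOT INITIAL.
  " read the PO item DSO once for the whole package
  SELECT * FROM /bic/azpoitm00 INTO TABLE lt_poitem
    FOR ALL ENTRIES IN RESULT_PACKAGE
    WHERE oi_ebeln = RESULT_PACKAGE-oi_ebeln
      AND oi_ebelp = RESULT_PACKAGE-oi_ebelp.
  SORT lt_poitem BY oi_ebeln oi_ebelp.
ENDIF.

LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  READ TABLE lt_poitem INTO ls_poitem
    WITH KEY oi_ebeln = <result_fields>-oi_ebeln
             oi_ebelp = <result_fields>-oi_ebelp
    BINARY SEARCH.
  IF sy-subrc = 0.
    " one lookup result maps to several characteristics
    <result_fields>-material = ls_poitem-material.
    <result_fields>-plant    = ls_poitem-plant.
    <result_fields>-vendor   = ls_poitem-vendor.
  ENDIF.
ENDLOOP.
```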

 

 

Increasing Reusability

 

During the PO GR data load I calculate the delivery duration based on the due date, the delivery duration based on the GR date, and the over / under variance of the two durations. I did not like the idea of repeating the duration calculation logic in the variance transformation rules. Instead, I used the results of the duration calculations in the end routine to calculate the variances.

 

End Routine 3.jpg

End Routine 4.jpg
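The calculation can be sketched as follows (the date and duration field names here are illustrative, not the actual InfoObjects of the flow):

```abap
" Sketch: calculate both durations once in the end routine,
" then derive the variance from the stored results.
LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  " delivery duration in days: planned (due date) vs. actual (GR date)
  <result_fields>-dur_due = <result_fields>-due_date - <result_fields>-po_date.
  <result_fields>-dur_gr  = <result_fields>-gr_date  - <result_fields>-po_date.
  " the over / under variance reuses the two results
  <result_fields>-dur_var = <result_fields>-dur_gr - <result_fields>-dur_due.
ENDLOOP.
```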

Scenario: If I execute the DTP in the current month, it always picks only "current month – 1" data.

 

Example:

 

If today's date is 04/22/2016, the routine calculates the first and last day of the previous month based on the system date, i.e. it fetches 03/01/2016 to 03/31/2016.

If today's date is 01/22/2016, it fetches 12/01/2015 to 12/31/2015.
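For reference, the previous-month interval can also be derived directly from SY-DATUM with simple date arithmetic (a sketch equivalent to what the function module used below returns):

```abap
" Sketch: first and last day of the previous month from SY-DATUM.
DATA: lv_first TYPE d,
      lv_last  TYPE d.

lv_last = sy-datum.
lv_last+6(2) = '01'.      " first day of the current month
lv_last = lv_last - 1.    " minus 1 day = last day of the previous month
lv_first = lv_last.
lv_first+6(2) = '01'.     " first day of the previous month
" e.g. 04/22/2016 yields lv_first = 20160301, lv_last = 20160331
```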

 

 

Occasionally we need to filter a date characteristic InfoObject to extract only "previous month" data. Here the filter selection is not on an SAP Content InfoObject, but on a custom InfoObject.

 

If it were an SAP Content InfoObject, we could use one of the SAP-delivered customer exit variables directly in the DTP, but in this example I am using a custom InfoObject created with data type DATS.

 

In the DTP, select the InfoObject, choose Create Routine, and add the code below in the DTP routine.

 

* Global code used by conversion rules
*$*$ begin of global - insert your declaration only below this line  *-*
* TABLES: ...

DATA: dt_range  TYPE STANDARD TABLE OF rsdatrange,
      btw       TYPE STANDARD TABLE OF rsintrange,
      wdt_range TYPE rsdatrange.

*$*$ end of global - insert your declaration only before this line   *-*

*$*$ begin of routine - insert your code only below this line        *-*

  DATA: l_idx LIKE sy-tabix.

  READ TABLE l_t_range WITH KEY fieldname = ' '.
  l_idx = sy-tabix.
*....

* Determine first and last day of the previous month
  CALL FUNCTION 'RS_VARI_V_LAST_MONTH'
*   EXPORTING
*     systime    = ' '
    TABLES
      p_datetab  = dt_range
      p_intrange = btw.

  READ TABLE dt_range INTO wdt_range INDEX 1.

  l_t_range-fieldname = '/BIC/<Your_InfoObject_Name>'.
  l_t_range-option    = 'BT'.
  l_t_range-sign      = 'I'.
  l_t_range-low       = wdt_range-low.
  l_t_range-high      = wdt_range-high.

  APPEND l_t_range.

*  IF l_idx <> 0.
*    MODIFY l_t_range INDEX l_idx.
*  ELSE.
*    APPEND l_t_range.
*  ENDIF.

*$*$ end of routine - insert your code only before this line         *-*

     Sometimes data in the source system is not checked for quality. For example, input data is not checked for non-printable characters such as tabulation, carriage return, or line feed. If a user copies and pastes data into input fields from an email or a web page, non-printable characters can be entered into the system, causing BW data loading issues (not permitted characters). A master data quality issue must be fixed immediately, otherwise the problem becomes worse with every transaction in which the incorrect master data is used. If the issue only concerns informational fields stored in a DSO at document level, the data can be fixed in the transformation rules.

     What it takes is to correct the data in the start routine using a regular expression.
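The screenshots below show the actual routine; as a sketch (the field HEWORD is taken from the example below, while the loop and the regex pattern are assumptions), the cleanup can look like this:

```abap
" Sketch: remove control characters (tab, CR, LF, ...) from a
" source field in the start routine using a regular expression.
FIELD-SYMBOLS <ls_source> LIKE LINE OF SOURCE_PACKAGE.

LOOP AT SOURCE_PACKAGE ASSIGNING <ls_source>.
  " [[:cntrl:]] matches non-printable characters such as hex 09
  REPLACE ALL OCCURRENCES OF REGEX '[[:cntrl:]]'
    IN <ls_source>-heword WITH ''.
ENDLOOP.
```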

REGEX1.jpg

Prior to executing the REPLACE statement, the HEWORD field of SOURCE_PACKAGE contains a hex 09 (tabulation) character.

REGEX2.jpg

Once the REPLACE statement is executed, the non-printable character is gone.

REGEX3.jpg

REGEX4.jpg

Below is a list of issues faced during flat file generation from an APD.

 

1> Header names for key figures are displayed with technical names in the flat file.

 

2> The negative sign of key figures such as amount and quantity is displayed after the value in the flat file, which results in a wrong total amount, i.e.:

      Amount

      $1000 –

 

3> Leading zeros are added to the key figures in the APD output.

 

4> Values are rounded off; no decimal places are displayed in the flat file.

 

Solution:

 

First create Z InfoObjects as per your header field name lengths, e.g. ZCHAR20 for a field name of length 20.

 

Assign these Z InfoObjects to the target fields of your APD routine as below:

             

Capture.PNG

 

Write the following logic in the routine tab:

 

DATA: ls_source TYPE y_source_fields,
      ls_target TYPE y_target_fields.

DATA: lv_value TYPE p LENGTH 16 DECIMALS 2. "add decimal places as per your need

* First append a header line with the names to be displayed
ls_target-Char1 = 'ABC'.
ls_target-Char2 = 'XYZ'.
APPEND ls_target TO et_target.

LOOP AT it_source INTO ls_source.
*  MOVE-CORRESPONDING ls_source TO ls_target.
  ls_target-Char1 = ls_source-Char1.
  ls_target-Char2 = ls_source-Char2.

*  ls_target-KYF_0001 = ls_source-KYF_0001.
  CLEAR lv_value.
  IF ls_source-KYF_0001 IS NOT INITIAL.
    lv_value = ls_source-KYF_0001.
    ls_target-KYF_0001 = lv_value.
    IF lv_value LT 0.
*     move the trailing '-' sign in front of the value
      SHIFT ls_target-KYF_0001 RIGHT DELETING TRAILING '-'.
      SHIFT ls_target-KYF_0001 LEFT DELETING LEADING ' '.
      CONCATENATE '-' ls_target-KYF_0001 INTO ls_target-KYF_0001.
    ENDIF.
  ELSE.
    ls_target-KYF_0001 = '0.00'.
  ENDIF.
  APPEND ls_target TO et_target.
ENDLOOP.

 

 

Note: Here Char1 and Char2 are your InfoObject technical names; ABC and XYZ are the field names you want to display in the header of the flat file.

     A Virtual Cube Function Module can be implemented very easily using the services of the CL_RSDRV_REMOTE_IPROV_SRV class (there is an example in the class documentation). I like its simplicity, but unfortunately it cannot handle complex selections. In this blog, I will explain how to keep the Virtual Cube Function Module implementation simple and at the same time handle complex selections by enhancing the service class.
      Below is the Function Module that implements a Virtual Cube reading from the SFLIGHT table:
*---------------------------------------------------------------------*
*       CLASS lcl_application DEFINITION
*---------------------------------------------------------------------*
CLASS lcl_application DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS:
      get_t_iobj_2_fld
        RETURNING VALUE(rt_iobj_2_fld)
          TYPE cl_rsdrv_remote_iprov_srv=>tn_th_iobj_fld_mapping.
ENDCLASS.

*---------------------------------------------------------------------*
*       CLASS lcl_application IMPLEMENTATION
*---------------------------------------------------------------------*
CLASS lcl_application IMPLEMENTATION.
*---------------------------------------------------------------------*
* get_t_iobj_2_fld
*---------------------------------------------------------------------*
  METHOD get_t_iobj_2_fld.
    rt_iobj_2_fld = VALUE #( ( iobjnm = 'CARRID'    fldnm = 'CARRID' )
                             ( iobjnm = 'CONNID'    fldnm = 'CONNID' )
                             ( iobjnm = 'FLDATE'    fldnm = 'FLDATE' )
                             ( iobjnm = 'PLANETYPE' fldnm = 'PLANETYPE' )
                             ( iobjnm = 'SEATSOCC'  fldnm = 'SEATSOCC' )
                             ( iobjnm = 'SEATSOCCB' fldnm = 'SEATSOCC_B' )
                             ( iobjnm = 'SEATSOCCF' fldnm = 'SEATSOCC_F' ) ).
  ENDMETHOD.
ENDCLASS.

FUNCTION z_sflight_read_remote_data.
*"----------------------------------------------------------------------
*"*"Local Interface:
*"  IMPORTING
*"     VALUE(INFOCUBE) LIKE  BAPI6200-INFOCUBE
*"     VALUE(KEYDATE) LIKE  BAPI6200-KEYDATE OPTIONAL
*"  EXPORTING
*"     VALUE(RETURN) LIKE  BAPIRET2 STRUCTURE  BAPIRET2
*"  TABLES
*"      SELECTION STRUCTURE  BAPI6200SL
*"      CHARACTERISTICS STRUCTURE  BAPI6200FD
*"      KEYFIGURES STRUCTURE  BAPI6200FD
*"      DATA STRUCTURE  BAPI6100DA
*"----------------------------------------------------------------------

  zcl_aab=>break_point( 'Z_SFLIGHT_READ_REMOTE_DATA' ).

  DATA(iprov_srv) = NEW cl_rsdrv_remote_iprov_srv(
    i_th_iobj_fld_mapping = lcl_application=>get_t_iobj_2_fld( )
    i_tablnm              = 'SFLIGHT' ).

  iprov_srv->open_cursor(
    i_t_characteristics = characteristics[]
    i_t_keyfigures      = keyfigures[]
    i_t_selection       = selection[] ).

  iprov_srv->fetch_pack_data( IMPORTING e_t_data = data[] ).

  return-type = 'S'.

ENDFUNCTION.
This is how the BW Query is defined that sends a complex selection to the Virtual Cube Function Module.
Service Class 2.jpg
Service Class 3.jpg
As you can see, the query reads the number of seats occupied on Airbus airplane types (global restriction) for All Carriers, Lufthansa, and American Airlines, in each of the years 2015 and 2016. The following selection is sent to the Virtual Cube Function Module:
Service Class 4.jpg
Expression 0 corresponds to the global restriction, and expressions 1 through 6 correspond to the restricted key figures (All Carriers 2015, All Carriers 2016, Lufthansa 2015, Lufthansa 2016, American Airlines 2015 and American Airlines 2016).
The service class is used in our Virtual Cube Function Module in such a way that it generates a wrong SQL WHERE clause expression. It is not a problem with the service class as such, but with the way it is used.
Service Class 6.jpg
The BW Query results are wrong (All Carriers data is the sum of Lufthansa and American Airlines only, i.e. data for other carriers is missing).
Service Class 7.jpg
The problem is that the generated SQL WHERE clause expression does not follow the rule below:
E0  AND (  E1 OR E2 OR E3 ... OR EN ),
where E0 corresponds to the global restriction and E1, E2, E3 ... EN to the other restrictions.
The problem can easily be fixed by enhancing the CL_RSDRV_REMOTE_IPROV_SRV service class. What it takes is:

 

Service Class 8.jpg

Creation of BUILD_WHERE_CONDITIONS_COMPLEX method
Service Class 9.jpg
METHOD build_where_conditions_complex.
  DATA: wt_bw_selection TYPE tn_t_selection.
  DATA: wt_where        TYPE rsdr0_t_abapsource.

* E0 AND ( E1 OR E2 OR E3 ... OR EN )
  LOOP AT i_t_selection INTO DATA(wa_bw_selection)
       GROUP BY ( expression = wa_bw_selection-expression )
       ASCENDING ASSIGNING FIELD-SYMBOL(<bw_selection>).

    CLEAR: wt_bw_selection,
           wt_where.

    LOOP AT GROUP <bw_selection> ASSIGNING FIELD-SYMBOL(<selection>).
      wt_bw_selection = VALUE #( BASE wt_bw_selection ( <selection> ) ).
    ENDLOOP.

    build_where_conditions( EXPORTING i_t_selection = wt_bw_selection
                            IMPORTING e_t_where     = wt_where ).

    CASE <bw_selection>-expression.
      WHEN '0000'.
        IF line_exists( i_t_selection[ expression = '0001' ] ).
          APPEND VALUE #( line = ' ( ' ) TO e_t_where.
        ENDIF.
        APPEND LINES OF wt_where TO e_t_where.
        IF line_exists( i_t_selection[ expression = '0001' ] ).
          APPEND VALUE #( line = ' ) AND ( ' ) TO e_t_where.
        ENDIF.
      WHEN OTHERS.
        IF <bw_selection>-expression > '0001'.
          APPEND VALUE #( line = ' OR ' ) TO e_t_where.
        ENDIF.
        APPEND VALUE #( line = ' ( ' ) TO e_t_where.
        APPEND LINES OF wt_where TO e_t_where.
        APPEND VALUE #( line = ' ) ' ) TO e_t_where.
        IF ( line_exists( i_t_selection[ expression = '0000' ] ) ) AND
           ( NOT line_exists( i_t_selection[ expression = <bw_selection>-expression + 1 ] ) ).
          APPEND VALUE #( line = ' ) ' ) TO e_t_where.
        ENDIF.
    ENDCASE.

  ENDLOOP.
ENDMETHOD.
The BUILD_WHERE_CONDITIONS_COMPLEX method contains the logic to build the selection according to the rule. It calls the original BUILD_WHERE_CONDITIONS method, using it as a building block. The new LOOP AT ... GROUP BY ABAP syntax is used to split the selection table into individual selections, convert them into SQL WHERE clause expressions, and combine them into the final expression as per the rule.

 

 

 

Implementation of the Overwrite-exit for the OPEN_CURSOR method
CLASS lcl_z_iprov_srv DEFINITION DEFERRED.
CLASS cl_rsdrv_remote_iprov_srv DEFINITION LOCAL FRIENDS lcl_z_iprov_srv.

CLASS lcl_z_iprov_srv DEFINITION.
  PUBLIC SECTION.
    CLASS-DATA obj TYPE REF TO lcl_z_iprov_srv.               "#EC NEEDED
    DATA core_object TYPE REF TO cl_rsdrv_remote_iprov_srv.   "#EC NEEDED
    INTERFACES iow_z_iprov_srv.

    METHODS:
      constructor
        IMPORTING core_object TYPE REF TO cl_rsdrv_remote_iprov_srv OPTIONAL.
ENDCLASS.

CLASS lcl_z_iprov_srv IMPLEMENTATION.
  METHOD constructor.
    me->core_object = core_object.
  ENDMETHOD.

  METHOD iow_z_iprov_srv~open_cursor.
*"------------------------------------------------------------------------*
*" Declaration of Overwrite-method, do not insert any comments here please!
*"
*"methods OPEN_CURSOR
*"  importing
*"    !I_T_CHARACTERISTICS type CL_RSDRV_REMOTE_IPROV_SRV=>TN_T_IOBJ
*"    !I_T_KEYFIGURES type CL_RSDRV_REMOTE_IPROV_SRV=>TN_T_IOBJ
*"    !I_T_SELECTION type CL_RSDRV_REMOTE_IPROV_SRV=>TN_T_SELECTION .
*"------------------------------------------------------------------------*
    DATA:
      l_t_groupby  TYPE rsdr0_t_abapsource,
      l_t_sel_list TYPE rsdr0_t_abapsource,
      l_t_where    TYPE rsdr0_t_abapsource.

    core_object->build_select_list(
      EXPORTING
        i_t_characteristics = i_t_characteristics
        i_t_keyfigures      = i_t_keyfigures
      IMPORTING
        e_t_sel_list        = l_t_sel_list
        e_t_groupby         = l_t_groupby ).

    core_object->build_where_conditions_complex(
      EXPORTING
        i_t_selection = i_t_selection
      IMPORTING
        e_t_where     = l_t_where ).

* #CP-SUPPRESS: FP secure statement, no user input possible
    OPEN CURSOR WITH HOLD core_object->p_cursor FOR
      SELECT (l_t_sel_list)
        FROM (core_object->p_tablnm)
        WHERE (l_t_where)
        GROUP BY (l_t_groupby).

  ENDMETHOD.
ENDCLASS.
The OPEN_CURSOR Overwrite-exit method has the same logic as the original method, except that the BUILD_WHERE_CONDITIONS_COMPLEX method is called instead of BUILD_WHERE_CONDITIONS.
Now that the changes are in place, let's run the report again and see what SQL WHERE clause expression is generated.
Service Class 10.jpg
Finally, let's run the report again and see if it shows correct data.
Service Class 11.jpg
Now the data is correct: All Carriers includes all data, not only Lufthansa and American Airlines.

Introduction to Roles and Authorizations in BW 7.4

The roles and authorizations maintained in BW 7.4 restrict access to reports at the InfoCube level, characteristic level, characteristic value level, key figure level, and hierarchy node level. These restrictions are maintained using the approach below:

 

Authorizations are maintained in authorization objects.

Roles contain the Authorizations.

Users are assigned to roles.

 

Capture 21.PNG

 

Transactions Used

InfoObject maintenance – RSD1

Role maintenance – PFCG

Roles and authorization maintenance – RSECADMIN

User creation – SU01

 

Note: A characteristic InfoObject must be flagged as Authorization Relevant to make it available for restrictions. To do this, go to the "Business Explorer" tab in the InfoObject details. Without the Authorization Relevant checkbox selected, we cannot use or include the object in an authorization object.

 

Enter T code RSD1

Capture.PNG

Enter the InfoObject and click on Maintain.

Capture 1.PNG

Click on the Business Explorer tab, then select the Authorization Relevant checkbox. Now we can use this object in roles and authorizations.

 

SCENARIO:

In my scenario we want to create an authorization on the InfoObject 0FUNCT_LOC with a hierarchy. Suppose the hierarchy has three levels and there are three users: User1, User2, and User3. User1 needs access to hierarchy level 1 data, User2 to hierarchy level 2, and User3 to hierarchy level 3. To achieve this we need to follow the steps below.

 

Creating Roles and Authorization objects

Creating Authorization objects

Enter T code RSECADMIN

Capture 2.PNG

Then click on Ind. Maint.

 

 

cap2.png

Enter the authorization name and click on Create.

cap1.png

 

 

 

Maintain the short, medium, and long descriptions, then click on Insert Row and enter the objects.

0TCAACTVT (Activity in Analysis Authorizations) – grants authorization for different activities such as Change and Display; the default value is 03 (Display).

0TCAIPROV (Authorizations for InfoProvider) – grants authorization for particular InfoProviders; the default value is *.

0TCAVALID (Validity of an Authorization) – defines when authorizations are valid; the default value is *.

Then click on Insert Special Characteristics.

 

cap3.png

cap4.png

 

 

cap5.png

 

 

 

Now enter the InfoObject 0FUNCT_LOC, double-click on it, and go to the Hierarchy Authorizations tab.

Click on the Create option.

cap6.pngcap7.png

 

  

 

Select Hierarchy and click on Browse.

cap8.png

Select Node Details and click on Browse.

 

 

Select the particular node on the left side and move it to the right side, as required for the particular user.

Select the particular Type of Authorization:

Capture 12.PNG

Then click on Continue.

Now click on the User tab.

Capture 13.PNG

 

 

 

Click on Indvl Assignment; the screen below will appear.

cap10.png

 

Enter the user and click on Role Maintenance.

cap11.png

 

Click on Create Single Role.

cap12.png

 

Enter the description and click on the Change Authorization Data icon.

 

 

cap13.png

 

Add the objects marked above and click on the Generate icon.

Now go to the User tab and enter the required users.

 

cap14.png

 

Click on User Comparison; we get the screen below.

cap15.png

If we want to give access to a particular transaction code, go to the Menu tab, click on Add, choose the transaction code, and the screen will appear like this.

 

Capture 20.PNG

Enter the transaction code, click on Assign Transactions, and save.

Now log on to the Analyzer or SAP BW with User1 to verify the restriction.

For User2 and User3 we need to follow the same steps.

Overview of Remodeling

 

If we want to modify a DSO into which data has already been loaded, we can use remodeling to change the structure of the object without losing data.

If we want to change a DSO into which no data has been loaded yet, we can change it in DSO maintenance.

 

We may want to change an InfoProvider that has already been filled with data for the following reasons:

 

We want to replace an InfoObject in an InfoProvider with another, similar InfoObject, for example when we have created an InfoObject ourselves but want to replace it with a BI Content InfoObject.

 

Prerequisites

 

As a precaution, make a backup of your data before you start remodeling. In addition, ensure that:

We have stopped any process chains that run periodically and affect the corresponding InfoProvider. Do not restart these process chains until remodeling is finished.

There is enough tablespace available in the database.

After remodeling, we have to check which BI objects connected to the InfoProvider (for example, transformation rules, MultiProviders) have been deactivated; we have to reactivate these objects manually. Remodeling also makes existing queries based on the InfoProvider invalid, so we have to manually adjust these queries to the remodeled InfoProvider. If, for example, we have deleted an InfoObject, we also have to delete it from the query.

 

Features

A remodeling rule is a collection of changes to your DSO that are executed simultaneously.

For DSO, you have the following remodeling options:

For characteristics:

Insert or replace characteristics with:

Constants

An attribute of an InfoObject within the same dimension

A value of another InfoObject within the same dimension

A customer exit (for user-specific code)

Delete

For key figures:

Insert:

Constant

A customer exit (for user-specific code)

Replace with: a customer exit (for user-specific code)

Delete

Note: You cannot replace or delete units. This avoids having key figures in the DSO without the corresponding unit.


Implementation of the Remodeling Procedure: To carry out the remodeling procedure, right-click on your DSO and, in the context menu, navigate through Additional Functions -----> Remodeling.


Capture.PNG

We will get the following window after clicking on Remodeling. Enter a remodeling rule name and press Create to create a new rule.

Capture1.PNG

After clicking on Create we will get the following pop-up window, where we have to enter a description for the rule we wish to create (as shown below).

Capture2.PNG

After entering the description, press the Create button. We will see the following screen.


Capture4.PNG

As we can see, the left pane shows the structure of the DSO in consideration.

To add a new remodeling rule, click on the green plus sign in the top-left corner of your screen (also circled in red below). It is called the Add Operation to List button.

 

Capture4.PNG

You will get the following pop-up where you can add the remodeling rules.

Capture5.PNG

Business Requirement: The requirement is as follows:

To delete the Time Characteristic 0CALDAY from the data fields.

To add 0COMP_CODE to the key fields with constant value 1100.

To delete the key figure Revenue(ZREVU8) as it is no longer relevant for reporting in this DSO.

We will implement these requirements one by one.

 

In the pop-up that opened in the last step, select the Delete Characteristic radio button and enter the technical name of the characteristic you wish to delete (0CALDAY in this case).

Capture 6.PNG

Confirm by pressing the Create button.

capture 9.PNG


Adding the characteristic 0COMP_CODE with value 1100 to the key fields of the DSO.

Capture 7.PNG           

 

 

We need to check the As Key Field checkbox, and set the position if we want the field in a particular position.

Click on the Create button.


capture 10.PNG

To delete key figure we need to follow these steps.

Capture 8.PNG

Then click on the Create button.

                                                                                                 

capture 11.PNG



After that, click on Activate and Simulate, then go for the Schedule option.

Capture12.PNG

Once the simulation is done, click on Continue; the schedule screen will appear.

Capture 13.PNG

Select the Immediate option; the screen below will appear, where we need to select the Save option.

capture 14.PNG

Now we will get a message like this.

Capture 15.PNG

If we want to see the job, click on Jobs and check it. After that the DSO will be inactive and we need to activate it.

Capture 16.PNG

Remodeling is now successfully done on the DSO.

 



Hi,

 

Anyone who has ever tried to create a pivot table on top of the BEx Analyzer output will have experienced this issue.

When displaying key and text for an InfoObject, the column header for the key is filled, but the text column remains empty without a header.

This makes it impossible to create a pivot table on top of it.

 

Using the CallBack macro in BEx 7.x, it is possible to scan the column headers in the result area and put in a custom text.

In this blog I describe how to do this.

 

First of all, run the query in BEx Analyzer.

 

After running the query, go to View --> Macros --> View Macros,

select CallBack and press Edit.

macro screen.jpg

 

Scroll down to the following piece of code:

callback macro before.JPG

After the End With and before the End If, insert the following lines:

 

    'set column headers for key + text

    Dim nrCol As Long

    Dim resultArea As Range

    Set resultArea = varname(1)

    nrCol = resultArea.Columns.Count

    For i = 1 To nrCol - 1

        If resultArea.Cells(1, i + 1) = "" Then

            resultArea.Cells(1, i + 1) = resultArea.Cells(1, i) & "_text"

        End If

    Next i

 

This code will put the suffix _text in each empty column header, based on the preceding column header.

 

The end result in the macro then looks like this:

callback macro after.JPG

After refreshing the query, you will now see the missing column headers filled in, based on the previous column header with _text appended.

 

Hope this will help a lot of people.

 

Best regards,

Arno


Hi All,

 

 

 

Requirement – There are a lot of BEx queries that use hardcoded hierarchy nodes. Currently the hierarchy is maintained in BI; in the future it will be automated by maintaining a set hierarchy on the ECC side. Your query is filtered on a node, for example "REG IND", but on the ECC side you cannot name a node with a space, so if you maintain it as "REG_IND" the query will not show proper results.

 

If you do not want to change the BEx queries, because modifying them is a huge effort, you can go for the following workaround.

 

Note – The correct approach is always to make the change on the ECC side, or to modify the BEx queries with the correct node coming from ECC.

 

This blog gives you an idea of how to change a node using an ABAP program.

 

If you search for "how to change hierarchy node", you will find a lot of threads saying that you cannot change a node name in BI, only its description.

 

h1.PNG

 

 

That is correct: we cannot change it manually, but using an ABAP program you can change it.

 

Step 1 – Go to transaction SE38.

Provide a program name and copy the following code.

Here our hierarchy InfoObject is ZGKB.

 

 

REPORT zhirachy_nodechange.

DATA: wa_node_change TYPE /bic/hzgkb.

UPDATE /bic/hzgkb SET nodename = 'REG IND'
  WHERE iobjnm   = '0HIER_NODE'
    AND objvers  = 'A'
    AND nodeid   = 1
    AND nodename = 'REG_IND'.

 

  

 

Execute this program as a step after the ECC hierarchy load completes in the process chain. In this way you can change a hierarchy node using an ABAP program.

 

Thanks for reading. Hope this information is useful.

   

 

Regards,

Ganesh Bothe

 

 

 

 

     When reporting with BOBJ clients on BW data, users might request too detailed information, pushing the BW system over its limit and causing performance / system stability issues. There is a safety belt functionality that allows setting a maximum number of cells retrieved from BW. In this blog I will explain how to set the safety belt for different BOBJ clients. If you do not have the authorization or a system to play with, you can create a trial BW / BOBJ landscape in the cloud as explained here.

 

     Setting the BW Safety Belt for Analysis for OLAP

     It is set in the Central Management Console by updating the properties of the Adaptive Processing Server.

   BW Safety Belt 1.jpg

Here are the settings and their default values:

Setting                                              Default Value
Maximum Client Sessions                              15
Maximum number of cells returned by a query          100,000
Maximum number of members returned when filtering    100,000


To demonstrate how the safety belt works, let's change "Maximum number of cells returned by a query" to something small, for example 5,

 

BW Safety Belt 2.jpg

and restart the server.

BW Safety Belt 3.jpg

Now if we run Analysis for OLAP without a drill-down, no error occurs.

BW Safety Belt 4.jpg

But if we drill down by Product or Sold-to, the number of cells exceeds the limit.

BW Safety Belt 5.jpg

 

     Setting the BW Safety Belt for Web Intelligence and Crystal Reports

     It is set by maintaining the BICS_DA_RESULT_SET_LIMIT_DEF and BICS_DA_RESULT_SET_LIMIT_MAX parameters in the RSADMIN table. To demonstrate how the safety belt works, let's set the limits to some small value, for example 5, by running the SAP_RSADMIN_MAINTAIN program.

BW Safety Belt 6.jpg

BW Safety Belt 7.jpg

Now if we run a Web Intelligence report without a drill-down, no error occurs.

BW Safety Belt 8.jpg

But if we drill down by Product or Sold-to, the number of cells exceeds the limit.

BW Safety Belt 9.jpg

The safety belt for Crystal Reports works the same way as for Web Intelligence.


Extraction in SAP BI

Posted by Suraj Yadav Mar 15, 2016

What is Data Extraction?


Data extraction in BW means extracting data from various tables in the R/3 or BW systems. Standard delta extraction methods are available for master data and transaction data, and you can also build your own with the help of transaction codes provided by SAP. Standard delta extraction for master data uses change pointer tables in R/3; for transaction data, delta extraction can use LIS structures, the LO Cockpit, etc.


Types of Extraction:


  1. Application Specific:
    • BW Content Extractors
    • Customer Generated Extractors
  2. Cross Application Extractors
    • Generic Extractors.

 

extractors.gif



BW Content Extractors


SAP provides predefined extractors, such as FI, CO, and LO Cockpit, in the OLTP system (R/3). All you have to do is install the Business Content.

 

Let's take the example of an FI extractor. Below are the steps you need to follow:

  • Go to RSA6 >> select the desired DataSource >> at the top there is a button Enhance Extract Structure >> click on it


Untitled.jpg

  • It will take you to DataSource: Customer Version Display. Double-click on the ExtractStruct.

Untitled.png

 

  • Click on the Append Structure button as shown:

Untitled.png

  • Add the field Document Header Text (e.g. ZZBKTXT) to the append structure with component type BKTXT. Before you exit, make sure that you activate the structure by clicking on the Activate button.

Untitled.png

  • The required field has been successfully added to the extract structure of the DataSource.

Untitled.png

Populate the Extract Structure with Data

       SAP provides the enhancement RSAP0001, which you use to populate the extract structure. This enhancement has four components, one specific to each of the four types of R/3 DataSources:


  • Transaction data EXIT_SAPLRSAP_001
  • Master data attributes EXIT_SAPLRSAP_002
  • Master data texts EXIT_SAPLRSAP_003
  • Master data hierarchies EXIT_SAPLRSAP_004

 

With these four components (they're actually four different function modules), any R/3 DataSource can be enhanced. In this case, you are enhancing a transaction data DataSource, so you only need one of the four function modules. Since this step requires ABAP development, it is best handled by someone on your technical team. You might need to provide your ABAP colleague with this information:

  • The name of the DataSource (0FI_GL_4)
  • The name of the extract structure (DTFIGL_4)
  • The name of the field that was added to the structure (ZZBKTXT)
  • The name of the BW InfoSource (0FI_GL_4)
  • The name of the R/3 table and field that contains the data you need (BKPF-BKTXT)

With this information, an experienced ABAP developer should be able to properly code the enhancement so that the extract structure is populated correctly. The ABAP code itself would look similar to the one shown below:

 

Untitled.png
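Since the screenshot can be hard to read, here is a minimal sketch of what such a user exit could look like, assuming it lives in include ZXRSAU01 (called from EXIT_SAPLRSAP_001). The lookup keys BUKRS/BELNR/GJAHR and the working-variable names are assumptions based on the field list above, not the exact code from the screenshot:

```abap
* Sketch only (include ZXRSAU01 of EXIT_SAPLRSAP_001); names are assumed
DATA: l_s_figl TYPE dtfigl_4,
      l_tabix  TYPE sy-tabix.

CASE i_datasource.
  WHEN '0FI_GL_4'.
    LOOP AT c_t_data INTO l_s_figl.
      l_tabix = sy-tabix.
      " Look up the document header text in BKPF for each extracted record
      SELECT SINGLE bktxt FROM bkpf
        INTO l_s_figl-zzbktxt
        WHERE bukrs = l_s_figl-bukrs
          AND belnr = l_s_figl-belnr
          AND gjahr = l_s_figl-gjahr.
      IF sy-subrc = 0.
        MODIFY c_t_data FROM l_s_figl INDEX l_tabix.
      ENDIF.
    ENDLOOP.
ENDCASE.
```

For better performance on large loads, the single SELECTs could be replaced by one SELECT ... FOR ALL ENTRIES lookup into an internal table before the loop.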


  • Now check the data via tcode RSA3.

 

(You can open the four function modules listed above in transaction SE37; you will find an INCLUDE statement in each of them. Double-click on the include program to see the ABAP code, as above, for all standard DataSources; this code can be modified.)

 

 

Note: You can enhance all other SAP-delivered extractors in a similar way. (For the LO Cockpit, use transaction LBWE.)

 

 

Customer Generated Extractors

 

For some applications that vary from company to company, such as LIS, CO-PA and FI-SL, SAP was not able to provide a standard DataSource because of their dependency on the organization structure. Customers therefore have to generate their own DataSources, which is why these are called customer generated extractors.

 

Let's take the example of CO-PA extraction:

  • Go to transaction KEB0, which you find in the SAP BW Customizing for CO-PA in the OLTP system.

Untitled.jpg

 

 

  • Define the DataSource for the current client of your SAP R/3 System on the basis of one of the operating concerns available there.
  • In the case of costing-based profitability analysis, you can include the following in the DataSource: Characteristics from the segment level, characteristics from the segment table, fields for units of measure, characteristics from the line item, value fields, and calculated key figures from the key figure scheme.
  • In the case of account-based profitability analysis, on the other hand, you can only include the following in the DataSource: characteristics from the segment level, characteristics from the segment table, one unit of measure, the record currency from the line item, and the key figures.
  • You can then specify which fields are to be applied as the selection for the CO-PA extraction.

Untitled.jpg

 

 

Generic Extractors


When the requirements of your company cannot be met by an SAP-delivered Business Content DataSource, you have to create your own DataSource based purely on your company's requirements. This is called a generic extractor.

 

Based on the complexity, you can create the DataSource in three ways:

 

1. Based on Tables/Views ( Simple Applications )

2. Based on Infoset

3. Based on Function Module ( Used in complex extraction)


Steps to create generic extractor:


1. Based on Tables/Views ( Simple Applications )


  • Go to transaction RSO2 and choose the type of data you want to extract (transaction data, master data attributes or master data texts).

Untitled.png

  • Give a name to the DataSource to be created and click Create.

Untitled.png














  • On the Create data source screen, enter the parameters as required:

Untitled.jpg

Application Component: Component name where you wish to place the data source in the App. Component hierarchy.

Text: Descriptions (Short, Medium and Long) for the data source.

View/Table: Name of the Table/View on which you wish to create the Generic data source. In our case it is ZMMPUR_INFOREC.

 

  • The generic DataSource is now displayed, allowing you to select as well as hide fields. ‘Hidden’ fields will not be available for extraction. Fields marked in the ‘Selection’ column will be available for selection in the InfoPackage during data extraction from the source system to the PSA.

Untitled.jpg


  • Select the relevant fields and Save the data source.

Untitled.png

  • Now save the DataSource.

Untitled.jpg

 

 

2. Based on Infoset


https://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&ved=0ahUKEwiCruWWusPLAhWBPZoKHa7KArgQFg…


3. Based on Function Module


https://www.google.co.in/url?sa=t&rct=j&q=&esrc=s&source=web&cd=2&cad=rja&uact=8&ved=0ahUKEwjOk46_usPLAhVqIJoKHej3A8wQFg…
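As a rough orientation for option 3: a function-module-based extractor is typically built as a copy of SAP's sample function module RSAX_BIW_GET_DATA_SIMPLE, which delivers data in packages via a database cursor. A heavily abbreviated sketch follows; the function name and table ZMY_TABLE are placeholders, and selection handling is deliberately omitted:

```abap
FUNCTION z_biw_get_my_data.
* Interface as in sample RSAX_BIW_GET_DATA_SIMPLE (sketch; Z-names are placeholders)
  STATICS: s_counter TYPE sy-tabix,
           s_cursor  TYPE cursor.

  IF i_initflag = sbiwa_c_flag_on.
    " Initialization call: validate the DataSource name and store the
    " selections passed in i_t_select / i_t_fields for later use.
    RETURN.
  ENDIF.

  IF s_counter = 0.
    " First data call: open a database cursor (selections omitted here)
    OPEN CURSOR WITH HOLD s_cursor FOR
      SELECT * FROM zmy_table.
  ENDIF.

  " Each call returns one package of at most i_maxsize records
  FETCH NEXT CURSOR s_cursor
    APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
    PACKAGE SIZE i_maxsize.
  IF sy-subrc <> 0.
    CLOSE CURSOR s_cursor.
    RAISE no_more_data. " signals that all packages have been delivered
  ENDIF.
  s_counter = s_counter + 1.
ENDFUNCTION.
```

The package-wise FETCH is what lets the Service API call the function module repeatedly until the NO_MORE_DATA exception is raised.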


 

Note: Data for all types of extractors can be viewed via transaction RSA3, where you have to enter the DataSource name, the data records per call, the number of extraction calls and the selections:

 

Untitled.png

 


The detailed information on LO-Cockpit, Update modules, generic extractor using FM and Infoset, delta pointer, safety interval will be shared in upcoming blogs.

 

Thanks,

Suraj Yadav

    If you need a Business Objects / Business Warehouse trial, then this blog is for you. Right now SAP offers free BW 7.4 SP08 and BOBJ 4.1 Cloud Appliance Library trials (you pay only for Amazon Web Services). The fact that these are two separate trials has its pros and cons. The pro is that you have more flexibility to control AWS costs by starting / stopping BW and BOBJ separately; the con is that you have to connect BW to BOBJ yourself. In my blog I will explain how to connect BW to BOBJ and demonstrate a BW / BOBJ end-to-end scenario.

    These are CAL free trials:

BW BOBJ Sandbox 1.jpg

BW BOBJ Sandbox 2.jpg

These are the costs of running BW and BOBJ instances:

BW BOBJ Sandbox 3.jpg

BW BOBJ Sandbox 4.jpg

Note: it is important to check the Public Static IP Address check-box for the BW instance, to save you the trouble of updating the BOBJ OLAP Connection every time BW is started.

    Once the BW and BOBJ instances are created, check and make a note of the BW IP address in the AWS EC2 Management Console (you will need it to connect BW to BOBJ). As you can also see, BW comes with a front end, i.e. a remote desktop with SAP GUI and the BW Modeling Tools in Eclipse.

BW BOBJ Sandbox 5.jpg

 

Create BW OLAP Connection in BOBJ CMC

BW BOBJ Sandbox 6.jpg

It is important to set the Authentication mode to Pre-defined; otherwise, in Prompt mode, Webi will not see our OLAP Connection.

Note that the server name is the Public Static IP Address of the BW server from the AWS EC2 Management Console.

 

Make TCP Ports of BW Server Accessible from Anywhere

 

Without this, BOBJ will not be able to connect to BW. Open the BW server's security group in the AWS EC2 Management Console.

BW BOBJ Sandbox 7.jpg

Edit Inbound Rules

BW BOBJ Sandbox 8.jpg

Modify first entry

BW BOBJ Sandbox 9.jpg

And delete second entry

BW BOBJ Sandbox 10.jpg

Save Inbound Rules

BW BOBJ Sandbox 11.jpg

 

 

Install SAP GUI Business Explorer

 

What I noticed is that the Eclipse BW Modeling Tools are not working, because the BW project cannot be expanded (it dumps on the BW side, see transaction ST22). I suggest installing the SAP GUI Business Explorer, creating the BW Queries there, and doing all the modeling in SAP GUI transaction RSA1. Alternatively, you can use the other BW trials (BW 7.4 SP10 on HANA or BW 7.4 SP01 on HANA), but the latter have higher AWS costs.

 

 

Create End to End Scenario

 

Create BW Query and allow external access for the Query

BW BOBJ Sandbox 12.jpg

In Web Intelligence, create a new report selecting BW Query D_NW_C01 as a source from the BW OLAP Connection.

BW BOBJ Sandbox 13.jpg

BW BOBJ Sandbox 15.jpg

BW BOBJ Sandbox 16.jpg

In the context of Transformations you can choose Formula as the rule type. SAP delivers a Formula Builder with many ready-to-use Formulas. It is even possible to extend the Formula Builder with your own custom Formulas by implementing the Business Add-In (BAdI) RSAR_CONNECTOR.

My requirement was not only to deliver such a custom Formula; I also had to add a message to the Monitor Log in case a certain condition was met. In this blog I would like to share how to handle such a requirement and discuss some implementation aspects.

BAdI Implementation

Custom Formulas can be made available by implementing the classic BAdI RSAR_CONNECTOR. Next to the online BAdI documentation, you can refer to the following documents for detailed implementation instructions:

 

 

The result of my implementation is a new custom group with custom Formulas.

 

Figure_1_Example_Custom_Group.jpg

Figure 1: Example custom group with custom Formulas

 

The last custom Formula is used in the example Transformation rule.

 

Figure_2_Example_Transformation_Rule.jpg

Figure 2: Example Transformation rule with a custom Formula

Monitor Log Messages

Implementing the BAdI was actually the easy part. It was not so easy to implement Monitor Log messages in the custom Formula.

For reference purposes I used the standard Formula LOG_MESSAGE. I copied the relevant source code from the standard Formula and appended it to my own implementation. Please refer to lines 12 to 17 in the next screenshot.

 

Figure_3_Source_Code_BAdI_Implementation.jpg

Figure 3: Source code in BAdI implementation

 

Unfortunately, my custom Formula did not send any messages to the Monitor Log. After some investigation I found that the standard Formula LOG_MESSAGE also did not send its messages to the Monitor Log.

I raised an SAP Support ticket and it turned out to be a program error. After implementing SAP Note 2285154 (“730SP15: Standard formula function LOG_MESSAGE doesn't send”), the issue was solved for the standard Formula. However, the custom Formula still did not send any messages.

 

Diving a bit deeper, I found an interesting explanation. The generated Transformation program is based on template programs such as RSTRAN_RULE_TMPL_ROUTINE. In the source code you can see how the Formula routine is composed. Pay attention to line 173 in the next screenshot.

 

Figure_4_Template_Program_Transformation_Rule.jpg

Figure 4: Template program for Transformation rule

 

The source code of line 174 to 180 is only applied if the Formula contains string LOG_MESSAGE.

 

So the “hidden feature” is that you have to include the string LOG_MESSAGE in the technical name of the custom Formula. Only then is the crucial source code appended to the Transformation rule routine in the generated program.
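To illustrate the naming convention (both Formula names below are hypothetical examples, not names from the actual implementation):

```abap
" ZCHECK_AMOUNT_LOG_MESSAGE -> name contains LOG_MESSAGE, log code is generated
" ZCHECK_AMOUNT             -> name does not contain it, messages are lost
```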

 

Figure_5_Generated_Transformation_Program.jpg

Figure 5: Generated Transformation program

Conclusion

In this blog I demonstrated how to send a message to the Monitor Log in the context of a custom Formula. You have to append source code to the BAdI implementation similar to that found in the standard Formula LOG_MESSAGE. A small “hidden feature” is that you also have to include the string LOG_MESSAGE in the technical name of the custom Formula. Only then is the crucial source code appended to the Transformation rule routine in the generated program, and the messages are correctly sent to the Monitor Log.

The latest BW releases offer better options for coding Customer Exit Variables by means of:

  • the RSROA_VARIABLES_EXIT_BADI BAdI instead of the RSR00001 exit, to structure the code better (since BW 7.31);
  • new ABAP features to write short and robust code (since ABAP 7.40, with some features since ABAP 7.02).

 

I created a simple BW Query based on the SFLIGHT data model to demonstrate the renewed ABAP in action. The Query displays the number of passengers per airline who traveled the distance ranges specified on the selection screen.

Customer Exit Variable Renewed ABAP 1.jpg

Customer Exit Variable Renewed ABAP 2.jpg

 

If the Distance Ranges are not in sequence, the Customer Exit issues an error message:

Customer Exit Variable Renewed ABAP 3.jpg

So what the Customer Exit does:

  • sets default values for the Distance Range Characteristic Variables;
  • sets the Distance Range Text Variables based on user input;
  • validates the Distance Range Characteristic Variables.

I coded all of the above in an RSROA_VARIABLES_EXIT_BADI BAdI implementation. It is possible to have multiple BAdI implementations and to isolate the coding for a set of variables that logically belong together.

Customer Exit Variable Renewed ABAP 4.jpg

BAdI implementations are isolated by means of filter values.


Customer Exit Variable Renewed ABAP 5.jpg

The BAdI implementation will be executed for the DISTANCE InfoObject Variables (Combination 1) as well as for the Text Variables (Combination 2), and at the time of Variable validation (Combination 3).

The actual coding is done in the IF_RSROA_VARIABLES_EXIT_BADI~PROCESS method of class ZCL_RSROA_VAR_EXIT_DISTANCE.

Customer Exit Variable Renewed ABAP 6.jpg

 

Below is IF_RSROA_VARIABLES_EXIT_BADI~PROCESS method code in ABAP 7.40 syntax:

 

METHOD if_rsroa_variables_exit_badi~process.

  CASE i_step.

    WHEN 1.       "Before Selection

      CASE i_vnam.
        WHEN 'DIST_1'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '00600' ) ).
        WHEN 'DIST_2'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '01000' ) ).
        WHEN 'DIST_3'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '05000' ) ).
        WHEN 'DIST_4'.
          c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '09000' ) ).
      ENDCASE.

    WHEN 2.       "After selection screen

      CASE i_vnam.
        WHEN 'DIST_1H_TXT' OR 'DIST_2H_TXT' OR 'DIST_3H_TXT' OR 'DIST_4H_TXT'.
          c_t_range = VALUE #( ( low = replace( val = |{ i_t_var_range[ vnam = substring( val = |{ i_vnam }| off = 0 len = 6 ) ]-low }| regex = '^0+' with = '' occ = 1 ) ) ).
        WHEN 'DIST_2L_TXT' OR 'DIST_3L_TXT' OR 'DIST_4L_TXT' OR 'DIST_5L_TXT'.
          c_t_range = VALUE #( ( low = replace( val = |{ i_t_var_range[ vnam = i_vnam+0(5) && |{ i_vnam+5(1) - 1 }| ]-low + 1 }| regex = '^0+' with = '' occ = 1 ) ) ).
      ENDCASE.

    WHEN 3.       "Validation

      TRY.
          DO 3 TIMES.
            IF i_t_var_range[ vnam = 'DIST_' && |{ sy-index }| ]-low >
               i_t_var_range[ vnam = 'DIST_' && |{ sy-index + 1 }| ]-low.
              DATA(w_message) = |Range| && |{ sy-index }| && | is greater than Range | && |{ sy-index + 1 }|.
              CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
                EXPORTING
                  i_class  = 'OO'
                  i_type   = 'E'
                  i_number = '000'
                  i_msgv1  = w_message.
              RAISE EXCEPTION TYPE cx_rs_error.
            ENDIF.
          ENDDO.
        CATCH cx_sy_itab_line_not_found INTO DATA(itab_line_not_found).
      ENDTRY.

  ENDCASE.

ENDMETHOD.

 

In step 1 (before selection screen) BADI sets initial values for Distance Characteristic Variables.

In step 2 (after selection screen) BADI reads Distance Characteristic Variables and populates Distance Text Variables.

In step 3 (validation) BADI validates Distance Characteristic Variables entered on selection screen.

The code is short, well structured and easy to understand. Imagine dumping the code for all other BW variables in the system into the same BAdI: it would become unreadable and unmanageable.

 

The same logic implemented in RSR00001 customer exit using ABAP 7.0 syntax would look like this:


DATA: wa_range TYPE rsr_s_rangesid.
DATA: wa_var_range TYPE rrrangeexit.
DATA: w_vnam TYPE rszglobv-vnam.
DATA: w_dist_1 TYPE rschavl.
DATA: w_dist_2 TYPE rschavl.
DATA: w_message TYPE string.
DATA: wa_var_range_1 LIKE rrrangeexit.
DATA: wa_var_range_2 LIKE rrrangeexit.

CASE i_step.

  WHEN 1.       "Before Selection

    CASE i_vnam.
      WHEN 'DIST_1'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '00600'.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_2'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '01000'.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_3'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '05000'.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_4'.
        wa_range-sign = 'I'.
        wa_range-opt  = 'EQ'.
        wa_range-low  = '09000'.
        APPEND wa_range TO e_t_range.
    ENDCASE.

  WHEN 2.    "After selection screen

    CASE i_vnam.
      WHEN 'DIST_1H_TXT' OR 'DIST_2H_TXT' OR 'DIST_3H_TXT' OR 'DIST_4H_TXT'.
        READ TABLE i_t_var_range INTO wa_var_range WITH KEY vnam = i_vnam+0(6).
        SHIFT wa_var_range-low LEFT DELETING LEADING '0'.
        wa_range-low = wa_var_range-low.
        APPEND wa_range TO e_t_range.
      WHEN 'DIST_2L_TXT' OR 'DIST_3L_TXT' OR 'DIST_4L_TXT' OR 'DIST_5L_TXT'.
        w_vnam = i_vnam.
        w_vnam+5(1) = w_vnam+5(1) - 1.
        READ TABLE i_t_var_range INTO wa_var_range WITH KEY vnam = w_vnam+0(6).
        wa_var_range-low+0(5) = wa_var_range-low+0(5) + 1.
        SHIFT wa_var_range-low LEFT DELETING LEADING SPACE.
        wa_range-low = wa_var_range-low.
        APPEND wa_range TO e_t_range.
    ENDCASE.

  WHEN 3.    "Validation

    DO 3 TIMES.
      w_vnam = sy-index.
      SHIFT w_vnam LEFT DELETING LEADING SPACE.
      CONCATENATE 'Range' w_vnam INTO w_message SEPARATED BY SPACE.
      CONCATENATE 'DIST_' w_vnam INTO w_vnam.
      READ TABLE i_t_var_range INTO wa_var_range_1 WITH KEY vnam = w_vnam.
      CHECK sy-subrc = 0.
      w_vnam = sy-index + 1.
      SHIFT w_vnam LEFT DELETING LEADING SPACE.
      CONCATENATE w_message 'is greater than' w_vnam INTO w_message SEPARATED BY SPACE.
      CONCATENATE 'DIST_' w_vnam INTO w_vnam.
      READ TABLE i_t_var_range INTO wa_var_range_2 WITH KEY vnam = w_vnam.
      CHECK sy-subrc = 0.
      IF wa_var_range_1-low > wa_var_range_2-low.
        CALL FUNCTION 'RRMS_MESSAGE_HANDLING'
          EXPORTING
            i_class  = 'OO'
            i_type   = 'E'
            i_number = '000'
            i_msgv1  = w_message.
        RAISE no_processing.
      ENDIF.
    ENDDO.

ENDCASE.

 

The ABAP 7.0 version is almost twice the size of the ABAP 7.40 version. The comparison speaks for itself.

 

Now I want to quickly go over the ABAP 7.40 syntax features that are the most useful for coding Customer Exit Variables.

 

 

String Templates

String Templates are a powerful option for defining Variable values. They save you the trouble of writing multiple CONCATENATE, WRITE, SHIFT, REPLACE, etc. statements. All of these can be combined into one String Template thanks to:

  • the chaining operator && (to concatenate strings)
  • embedded expressions { expression }
  • built-in functions

In my example, I build the w_message string by concatenating literals and expressions (the sy-index system variable and an offset of it). Note that an inline declaration of the w_message ABAP variable was also used, to save the trouble of defining it beforehand.

 

DATA(w_message) = |Range| && |{ sy-index }| && | is greater than Range | && |{ sy-index + 1 }|.

 

 

VALUE Operator

The VALUE operator can be used to initialize the Variable table parameter in just one statement.

In my example, I set the default Distance Ranges as:

 

c_t_range = VALUE #( ( sign = 'I' opt = 'EQ' low = '00600' ) ).

 

 

Expression to Access Internal Table

In i_step = 2 (after the selection screen), the Variable value can be conveniently read from the internal table using a table expression rather than the READ TABLE statement.

In my example, I populate a Distance Text Variable by reading the corresponding Distance Characteristic Variable. Note that the replace function strips the leading zeros. This is also a good example of nested expressions.

 

c_t_range = VALUE #( ( low  = replace( val = |{ i_t_var_range[ vnam = substring( val = |{ i_vnam }| off = 0 len = 6 ) ]-low }| regex = '^0+' with = '' occ = 1 ) ) ).

 

 

Expressions in IF Statements

In i_step = 3 (validation), expressions can be used in an IF statement to validate Variable values.

In my example, the IF statement checks that a preceding Distance Range is not greater than the subsequent Distance Range.

 

IF i_t_var_range[ vnam = 'DIST_' && |{ sy-index }| ]-low > i_t_var_range[ vnam = 'DIST_' && |{ sy-index + 1 }| ]-low.
     ...

ENDIF.

Introduction

After running a BW system for several years, there will be many InfoProviders where you are not sure whether they are really being used by BEx Queries. But every time you have to modify, or would like to replace, such an InfoProvider, you have to edit the MultiProvider in which the InfoProvider is included.

A good option would be to remove the (unused) InfoProvider from the MultiProvider, so that it won't be called at query runtime to read data.

It would take a lot of time to check every query (based on the MultiProviders in which the InfoProvider is included). Therefore, it is better to check the BW metadata tables via an ABAP program.


Checking Multiprovider and their queries

In order to determine the usage of a part provider in a query, we will have to

  • get the mapped key figures of the InfoProvider within the MultiProvider
  • check whether these key figures are used in BEx queries/key figures (and no other PartProvider is filtered there)
  • find out whether the filter on 0INFOPROV (in a restricted key figure) is set to the PartProvider (“Fitting IP-Sel.?” = Y)


Instructions for use:

  • You can use single values, intervals or ranges (e.g. T*) on the selection screen; cubes and DSOs can also be used simultaneously.

HowTo_UnusedPartProv_201602_SelScreen.jpg

Selection-Screen: Example with three InfoProvider (to check for usage)

 

 

The result is split into the

  • Usage of every Infoprovider within a  Multiprovider
    • “Keyf.incl.?” (=Key figure included?): set to X if one key figure of the PartProvider is used in the shown query/structure/key figure (“Mapname”)
    • “IP-Sel.?”(=Infoprovider selection existing?): X will be shown in this column when the global BEx element (“Mapname”) includes a selection of “0INFOPROV”
    • “Fitting IP-Sel.?” (=Fitting 0INFOPROV Selection?): In addition to “IP-Sel.?”, this field will be marked if the 0INFOPROV selection includes the shown Part cube/Provider (e.g. 0INFOPROV=TCHC001)
  • When there are no queries or other global BEx elements based on the resp. Multiprovider, there will be no entries at the details (e.g. Multiprovider: TCGMZ001)
  • At the chapter “==> LIST OF RESULTS:”, the aggregation of the detailed results from before will be displayed. The PartProvider which are not used in the respective Multiprovider (Used?=N) can be  removed from the Multiprovider (from a technical perspective).


  • List of Providers without any Multiprovider inclusion
    • Are these Providers still relevant or can they be deleted? 

 

HowTo_UnusedPartProv_201602_Results.jpg

Results: Example with three InfoProvider (to check for usage)


Summary

With the two lists (“LIST OF RESULTS:” / “CUBES WITHOUT MULTIPROVIDER INCLUSION”) you know which Infoprovider is not used by BEX elements within the resp. Multiprovider or that this Infoprovider is not used by any Multiprovider at all.

 

Please note that the program is only suitable for InfoCubes/DSOs within a MultiProvider; it cannot be used for aggregation levels, planning functions or CompositeProviders.




CODING

*&---------------------------------------------------------------------*
*& Report  ZBI_BEX_PARTCUBES_UNUSED
*&---------------------------------------------------------------------*
*& Created by/on: C. Heinrich - (11.02.2016)
*& Targets:
*& A) check usage of Cubes/DSO, included in Multiprovider, in queries
*&    and global BEx-elements (e.g. restricted key-figures)
*& B) also show Provider, which are not included in any Multiprovider
*& Additions:
*&  -> beside (all types of) Cubes, also DSOs are supported
*&---------------------------------------------------------------------*
REPORT ZBI_BEX_PARTCUBES_UNUSED LINE-SIZE 160 NO STANDARD PAGE HEADING.

TABLES: RSDCUBET.

**** Variables, internal tables
DATA: BEGIN OF ls_mpro,  " List of MultiProvider
        INFOCUBE TYPE RSINFOPROV,
        PARTCUBE TYPE RSINFOPROV,
        USED(1) TYPE C,
       END OF ls_mpro,
       lt_mpro LIKE STANDARD TABLE OF ls_mpro.
DATA: BEGIN OF ls_keyf, " List of found key-figures
        INFOCUBE TYPE RSINFOPROV,
        PARTCUBE TYPE RSINFOPROV,
        IOBJNM  TYPE RSIOBJNM,
       END OF ls_keyf,
       lt_keyf LIKE HASHED TABLE OF ls_keyf
         WITH UNIQUE KEY INFOCUBE IOBJNM PARTCUBE.
DATA: BEGIN OF ls_compic, " List of queries/elements to MultiProvider
         COMPUID TYPE SYSUUID_25,
         MAPNAME TYPE RSZCOMPID,
         INFOCUBE TYPE RSINFOPROV,
         PARTCUBE TYPE RSINFOPROV,
         FLAG_KEYF(1) TYPE C,     " key-f. of part-cube included(=X)?
         FLAG_SEL_IC(1) TYPE C,   " selection at 0INFOPROV exists? (Y/N)
          FLAG_PARTCUBE(1) TYPE C, " selection at correct part-cube (=Y)?
       END OF ls_compic,
       lt_compic LIKE STANDARD TABLE OF ls_compic.
DATA: lv_used(1) TYPE c.
*** Field-symbols:
FIELD-SYMBOLS: <fs_mpro> LIKE ls_mpro,
                <fs_compic> LIKE ls_compic,
                <fs_keyf> LIKE ls_keyf.

**** A) Selection-screen
SELECTION-SCREEN: BEGIN OF BLOCK b1 WITH FRAME TITLE text-001.
   SELECT-OPTIONS: S_CUBE FOR RSDCUBET-INFOCUBE OBLIGATORY.
SELECTION-SCREEN: END OF BLOCK b1.


*** B) Checking usage of Partprovider(s):
START-OF-SELECTION.

    " --> getting list of Multiprovider, including the Cube(s):
    SELECT RSDCUBEMULTI~INFOCUBE RSDCUBEMULTI~PARTCUBE
      FROM RSDCUBEMULTI
      INTO CORRESPONDING FIELDS OF TABLE lt_mpro
      WHERE RSDCUBEMULTI~OBJVERS = 'A'
        AND RSDCUBEMULTI~PARTCUBE IN S_CUBE.

    SKIP 1. FORMAT INTENSIFIED ON.
    WRITE: / '==> CHECK PER MULTI-PROVIDER AND GLOBAL BEX-ELEMENT:'.
    FORMAT INTENSIFIED OFF.
    LOOP AT lt_mpro ASSIGNING <fs_mpro>.
      " Header: Multiprovider / Part-Cube
      WRITE: / 'Multiprovider: ', <fs_mpro>-INFOCUBE,
               50 'Partprov.: ', <fs_mpro>-PARTCUBE.
      " --> get key-figures of part-cube which are included at the Multiprovider(s):
      SELECT RSDICMULTIIOBJ~INFOCUBE RSDICMULTIIOBJ~IOBJNM
             RSDICMULTIIOBJ~PARTCUBE
        FROM RSDICMULTIIOBJ INNER JOIN RSDIOBJ
          ON RSDICMULTIIOBJ~PARTIOBJ = RSDIOBJ~IOBJNM
         AND RSDICMULTIIOBJ~OBJVERS = RSDIOBJ~OBJVERS
        INTO CORRESPONDING FIELDS OF TABLE lt_keyf
        WHERE RSDICMULTIIOBJ~INFOCUBE = <fs_mpro>-INFOCUBE
          AND RSDICMULTIIOBJ~PARTCUBE = <fs_mpro>-PARTCUBE
          AND RSDICMULTIIOBJ~OBJVERS = 'A'
          AND RSDIOBJ~IOBJTP = 'KYF'.

***   --> now get queries and other BEx-elements similar to restricted key-figures
***       and structures, containing the key-figure(s) of the Partprovider(s):
       SELECT COMPUID MAPNAME INFOCUBE
         FROM RSZCOMPIC INNER JOIN RSZELTDIR
           ON RSZCOMPIC~COMPUID = RSZELTDIR~ELTUID
          AND RSZCOMPIC~OBJVERS = RSZELTDIR~OBJVERS
         INTO CORRESPONDING FIELDS OF ls_compic
         WHERE RSZCOMPIC~INFOCUBE = <fs_mpro>-INFOCUBE
           AND RSZCOMPIC~OBJVERS = 'A'.
         ls_compic-PARTCUBE = <fs_mpro>-PARTCUBE.
         APPEND ls_compic TO lt_compic.
       ENDSELECT.

       " every BEx-component should be checked downwards (recursively),
       " if one of the relevant key-figures is used and the Infoprovider (0INFOPROV)-
       " selection could cover also the part-cube(s):
       WRITE: / 'MultiProvider', 25 'Part-Cube', 40 'Mapname',
                85 'Keyf.incl?', 100 'IP-Sel.?', 110 'Fitting IP-Sel.?'.
       WRITE: / '------------------------------------------------------------' &
                '-------------------------------------------------------------' &
                '-----'.

       lv_used = 'N'. " Start with "is used"=no
       " check every query/key-figure for usage
       LOOP AT lt_compic ASSIGNING <fs_compic>.

           PERFORM CHECK_BEX_ELEMENT_RECURSIVE USING <fs_compic>-COMPUID
                                               CHANGING <fs_compic>.
           IF <fs_compic>-FLAG_KEYF = 'X' AND ( <fs_compic>-FLAG_PARTCUBE = 'Y'
                                              OR <fs_compic>-FLAG_SEL_IC <> 'Y' ).
              lv_used = 'Y'. " is used=yes
           ENDIF.
           WRITE: / <fs_compic>-INFOCUBE,     25 <fs_compic>-PARTCUBE,
                 40 <fs_compic>-MAPNAME,      85 <fs_compic>-FLAG_KEYF,
                 100 <fs_compic>-FLAG_SEL_IC, 110 <fs_compic>-FLAG_PARTCUBE.
       ENDLOOP.
       <fs_mpro>-USED = lv_used.
       SKIP.
    ENDLOOP.


*** C) Log at end of run:
     SKIP 3.
     ULINE.
     " C1. List of partprovider(s) usage
     FORMAT INTENSIFIED ON. WRITE: / '==> LIST OF RESULTS: '. FORMAT INTENSIFIED OFF.
     WRITE: / 'Multiprovider', 30 'Infocube', 50 'Used?'.
     WRITE: / '---------------------------------------------------------'.
     LOOP AT lt_mpro ASSIGNING <fs_mpro>.
       WRITE: / <fs_mpro>-INFOCUBE, 30 <fs_mpro>-PARTCUBE,
               50 <fs_mpro>-USED.
     ENDLOOP.
     SKIP 3.
     " C2. List of not-anywhere used part-cubes
     WRITE: / '--------------------------------------------------------------------------------'.
     FORMAT INTENSIFIED ON. WRITE: / '==> CUBES WITHOUT MULTIPROVIDER INCLUSION: '.
     FORMAT INTENSIFIED OFF.
     PERFORM MISSING_CUBES.






*************************************************************************************
* Form CHECK_BEX_ELEMENT_RECURSIVE
* Key-figure and selection to 0INFOPROV will be checked by every BEx-element down-
* wards (form in recursive mode)
*************************************************************************************
FORM CHECK_BEX_ELEMENT_RECURSIVE USING I_ELTUID TYPE SYSUUID_25
                                  CHANGING C_COMPIC LIKE LS_COMPIC.
      DATA: BEGIN OF ls_range,  " entry at RSZRANGE
              IOBJNM TYPE RSIOBJNM,
               SIGN TYPE RALDB_SIGN,
               OPT  TYPE RSZ_OPERATOR,
               LOW  TYPE RSCHAVL,
               HIGH TYPE RSCHAVL,
            END OF ls_range.
      DATA: lv_eltuid TYPE SYSUUID_25.

      " Selection of Partprovider (at 0INFOPROV) or relevant key-figure present?
      " e.g. RSZRANGE-IOBJNM = '1KYFNM' plus RSZRANGE-LOW = 'MyKeyfigure'
      " or   RSZRANGE-IOBJNM = '0INFOPROV' plus RSZANGE-LOW = 'MyCube'
      SELECT IOBJNM SIGN OPT LOW HIGH FROM RSZRANGE
        INTO CORRESPONDING FIELDS OF ls_range
        WHERE ELTUID = I_ELTUID
          AND OBJVERS = 'A'
          AND IOBJNM IN ('1KYFNM', '0INFOPROV').
        IF ls_range-IOBJNM = '1KYFNM'.
           READ TABLE lt_keyf ASSIGNING <fs_keyf>
             WITH KEY INFOCUBE = C_COMPIC-INFOCUBE
                      IOBJNM = ls_range-LOW
                      PARTCUBE = C_COMPIC-PARTCUBE.
           IF sy-subrc = 0. " Key-figure of Part-Provider used at query!
              C_COMPIC-FLAG_KEYF = 'X'.
           ENDIF.
        ELSEIF ls_range-IOBJNM = '0INFOPROV'. " Selection at Infoprovider exists
           C_COMPIC-FLAG_SEL_IC = 'Y'.
           IF ls_range-SIGN = 'I' AND ls_range-OPT = 'EQ'. " Single value
             IF ls_range-LOW = C_COMPIC-PARTCUBE. " Selection to fitting part-cube exists
                 C_COMPIC-FLAG_PARTCUBE = 'Y'.
             ENDIF.
           ELSEIF ls_range-SIGN = 'I' AND ls_range-OPT = 'BT'. " Interval
             IF ls_range-LOW <> ''
               AND C_COMPIC-PARTCUBE BETWEEN ls_range-LOW AND ls_range-HIGH.
                 C_COMPIC-FLAG_PARTCUBE = 'Y'.
             ENDIF.
           ENDIF.
        ENDIF.
      ENDSELECT.

     " further check sub-elements of current BEx-element:
     SELECT TELTUID FROM RSZELTXREF INTO lv_eltuid
       WHERE SELTUID = I_ELTUID
         AND OBJVERS = 'A'.

       PERFORM CHECK_BEX_ELEMENT_RECURSIVE USING lv_eltuid
                                           CHANGING C_COMPIC.
     ENDSELECT.

ENDFORM.

*************************************************************************************
* Form MISSING_CUBES
* Some part-cube(s) not found in any Multiprovider? To this case, there should also
* a list be generated and shown
*************************************************************************************
FORM MISSING_CUBES.
   DATA: BEGIN OF ls_cube,
           INFOCUBE TYPE RSINFOCUBE, " Cube-Name
           CUBETYPE TYPE RSCUBETYPE, " Cube-/DSO-type
           TXTLG TYPE RSTXTLG, " Infoprovider-Text
         END OF ls_cube,
         lt_cube LIKE STANDARD TABLE OF ls_cube.
   FIELD-SYMBOLS: <fs_cube> LIKE ls_cube.
   " first get all cube(s) to selection:
   SELECT RSDCUBE~INFOCUBE RSDCUBE~CUBETYPE RSDCUBET~TXTLG
     FROM RSDCUBE INNER JOIN RSDCUBET
       ON RSDCUBE~INFOCUBE = RSDCUBET~INFOCUBE
      AND RSDCUBE~OBJVERS = RSDCUBET~OBJVERS
     INTO CORRESPONDING FIELDS OF TABLE lt_cube
     WHERE RSDCUBE~INFOCUBE IN S_CUBE
       AND RSDCUBE~OBJVERS = 'A'
       AND RSDCUBE~CUBETYPE <> 'M' " MPro not relevant
       AND RSDCUBET~LANGU = sy-langu.

   " also, add DSO to the selection:
   SELECT RSDODSO~ODSOBJECT AS INFOCUBE RSDODSO~ODSOTYPE AS CUBETYPE
          RSDODSOT~TXTLG AS TXTLG
     FROM RSDODSO INNER JOIN RSDODSOT
       ON RSDODSO~ODSOBJECT = RSDODSOT~ODSOBJECT
      AND RSDODSO~OBJVERS = RSDODSOT~OBJVERS
     APPENDING CORRESPONDING FIELDS OF TABLE lt_cube
     WHERE RSDODSO~ODSOBJECT IN S_CUBE
       AND RSDODSO~OBJVERS = 'A'
       AND RSDODSOT~LANGU = sy-langu.

   " now check cube(s) at multiprovider assignments:
   LOOP AT LT_CUBE ASSIGNING <fs_cube>.
     READ TABLE lt_mpro TRANSPORTING NO FIELDS
       WITH KEY PARTCUBE = <fs_cube>-INFOCUBE.
     IF sy-subrc = 0. " Cube is included in one multi-provider, no entry necessary
        <fs_cube>-INFOCUBE = 'DELETE'.
     ELSE.
        CASE <fs_cube>-CUBETYPE.
          WHEN '' OR ' ' OR 'T' OR 'W'. "=DSO
            WRITE: / 'DSO  ', <fs_cube>-INFOCUBE, 18 '/', <fs_cube>-TXTLG, 80 ' not included in any Multiprovider!'.
          WHEN OTHERS.
            WRITE: / 'Cube ', <fs_cube>-INFOCUBE, 18 '/', <fs_cube>-TXTLG, 80 ' not included in any Multiprovider!'.
        ENDCASE.
     ENDIF.
   ENDLOOP.
   DELETE LT_CUBE WHERE INFOCUBE = 'DELETE'.


ENDFORM.
