
SAP Business Warehouse


Objects in BW systems often become inactive as a result of processes such as transports, system upgrades, and system refreshes. This can cause various issues, such as data load failures and query execution errors. SAP provides no single standard program or transaction code that lists the inactive objects in a BW system. The following program has been developed to list inactive objects category-wise in ALV format.

These lists can be saved to Excel sheets.

 

The program identifies and reports inactive objects in the following categories.

 

  1. Multiproviders
  2. Aggregation Levels
  3. Infocubes
  4. Datasources
  5. Data Store Objects
  6. Infosets
  7. InfoObjects
  8. Transfer structures
  9. Transformations
  10. Open Hub destination
  11. DTP’s
  12. Update Rules
  13. Process Chains

The file with the full source code is attached to this post; a simplified sketch of the detection idea is shown below.
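As a rough illustration only (this is a simplified sketch, not the attached program), inactive objects can be found by reading the BW metadata tables and picking entries whose active version carries the inactive object status. The sketch assumes the standard tables RSDCUBE (InfoCubes) and RSDODSO (DataStore Objects) with their OBJVERS / OBJSTAT fields and covers just those two categories; the attached report applies the same pattern to all the categories listed above and shows the result in ALV.

REPORT zbw_inactive_objects_sketch.

TYPES: BEGIN OF ty_obj,
         objtype TYPE c LENGTH 10,
         objnm   TYPE c LENGTH 30,
       END OF ty_obj.

DATA: lt_obj TYPE STANDARD TABLE OF ty_obj,
      ls_obj TYPE ty_obj.

* InfoCubes whose active version is flagged as inactive
SELECT infocube FROM rsdcube
  INTO ls_obj-objnm
  WHERE objvers = 'A'
    AND objstat = 'INA'.
  ls_obj-objtype = 'CUBE'.
  APPEND ls_obj TO lt_obj.
ENDSELECT.

* DataStore Objects, same check on RSDODSO
SELECT odsobject FROM rsdodso
  INTO ls_obj-objnm
  WHERE objvers = 'A'
    AND objstat = 'INA'.
  ls_obj-objtype = 'DSO'.
  APPEND ls_obj TO lt_obj.
ENDSELECT.

* Simple list output; the attached program builds an ALV list instead
LOOP AT lt_obj INTO ls_obj.
  WRITE: / ls_obj-objtype, ls_obj-objnm.
ENDLOOP.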

When you execute the program you get the following result:

1.jpg

To maintain proper selection texts, go back to the SE38 editor, open the program, and follow the steps shown below.

1. Choose Goto > Text Elements > Selection Texts from the menu bar, add the selection texts, and activate them.

 

2.png

 

2. Similarly, double-click on “text-002 FOR FIELD x_call” in the code and maintain the text there.

You can see the selection screen as below.

3.png

Now when you execute the program, you get the desired naming convention.

4.png

If you select all the objects in selection screen and execute, the expected output is as follows:

 

 

5.png

 

Advantages:

 

 

  • Single point of access to all inactive objects in the BW system
  • Eliminates the manual effort of finding inactive objects
  • Can be used in case of frequent transports with large dependencies, and before and after system upgrades and refreshes, to compare the lists of objects

 

References:

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/00dbbfa9-dba2-2d10-bc80-9d6489e4b1c7?overridelayout=t…

 

 

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a04dd32f-2638-2e10-5596-c5f97c400ac7?overridelayout=t…

SAP BW Landscape of Systems for the Upgrade

Landscape of the System:-

We first need to analyze the current landscape of the system. It generally looks similar to the one below, so let's assume the landscape shown here.

Let's try to understand a few of the upgrade terms, such as SAP landscape, development freeze, transport freeze, transport route during the upgrade, and sequence of upgrade of systems.

14.png

Mandatory Systems:-

  1. BW-Dev :- Development System and UT.
  2. BW-QA:- Testing System (UAT)
  3. BW-PRD :- Production System

Optional Systems:- Not every client will have the systems below

  1. BW-SB (SandBox):- Full-fledged test system with valid and sufficient data to test.
  2. BW-DR (Disaster Recovery Box):- Backup system of Production in case of emergency.

 

After studying the landscape, we derive the sequence of upgrade and its impact as below:

  1. SandBox
  2. Dev Box
  3. QA Box
  4. PRD Box (in Turn DR)

Pre and Post Upgrade Landscape Sequence Depiction  :- 

15.png

Detailed Sequence of Upgrade and TR Movement :-

 

16.png

Upgrade Steps :-

  • Step 1: Prepare the SandBox system (SB) for upgrade; it is a copy of PRD. After the system copy, upgrade SB to SAP BW 7.4
  • Step 2: After the SB upgrade, prepare BW-Dev for upgrade and upgrade the Dev system
  • Step 3: Prepare BW-QA for upgrade and upgrade BW-QA from BW 7.3 to BW 7.4
  • Step 4: Prepare DR for upgrade from BW 7.3 to BW 7.4 just before BW-PRD
  • Step 5: Prepare and upgrade the BW-PRD system to BW 7.4

Development Freeze (Step 2) :-

Case 1:- During this phase, no developments happen in the system, because the development system itself has to be upgraded to a higher version and is taken over by the Basis team for the upgrade. So here we enter a development freeze.

Case 2:- Sometimes, to minimize the development freeze period, another development system, D2, is created as a copy of the old development system, D1. D2 is upgraded first while D1 functions normally. However, all developments made during this period have to be redone manually in D2 after the upgrade so that the systems are in sync. Once D2 is upgraded, it is connected to the QA system and the other source systems, after which D1 is removed from the landscape. After this step, QA is taken up for upgrade, thereby reducing the development and transport freeze period.

  • D1- Old Dev System
  • D2- New Dev System

18.png

Transport Freeze (Step 3) :- During this phase, no transports happen in the landscape because the systems are on different versions: the upgraded development system is on a higher version, while the QA system is still on the lower (older) version, and it is not recommended to move objects from a higher version to a lower version.

Transport Route Before Upgrade (7.3)

Dev->QA->PRD

Transport Route During Upgrade

  • Dev Upgrade: QA->PRD

There is a development freeze while the development system is being upgraded. The QA system is opened up, and all emergency developments/changes happen in QA along with the testing. Objects move from QA to PRD. However, all objects developed in the QA system have to be recreated manually in the development system after the upgrade so that the systems are in sync. This rework requires extra time, and there is a risk that objects are not replicated exactly.

  • QA Upgrade: No TR Movement

When the QA upgrade starts, there are no TR movements until the end of the production upgrade. However, development can continue because the development system is open and TRs can be collected. The development freeze is lifted, while the transport freeze remains in place.

  • PRD Upgrade (Dev->QA)

When the production system is being upgraded, the development and QA systems have already been upgraded and are on the same version, so development and testing can continue. TRs can move as far as QA, but can move to PRD only after the upgrade. The development freeze is lifted and the transport freeze is partial.

Transport Route Post Upgrade (7.4)

Post upgrade, the normal TR movement resumes as below:

Dev->QA->PRD

 

DR System Upgrade :- The Disaster Recovery (DR) system is upgraded from the logs received from the production system.

 

Note :- I have shared this upgrade landscape sequence based on my experience; it may vary based on client/Basis/BW/SAP recommendations.

Recently, I shared some important points on the subject of master data redesign in BW. It is a more conceptual and informative blog which I hope gives you some ideas on how the redesign of master data should be done. In this blog, let's get down to the ground and get our hands dirty with some useful ABAP programs that I have created and personally benefited from tremendously in getting insights into existing data flows in SAP Business Warehouse (BW).

 

Back when I first started my career as a junior consultant, I was involved in an SAP BW performance enhancement project. Initially, I thought it would be a rather challenging task where I would need to understand system parameters or do the necessary system tuning to enhance the performance of the system. To my surprise, none of those challenges were part of my task; I was told to document all the existing data flows and BEx queries instead. It was a really painful experience, as I had to go through the existing data flows (InfoProviders, Transformations and DTPs) and BEx queries manually, one after another, to get the right information documented. Since then, I have realized that documentation plays a vital role when it comes to any redesign, enhancement or support work for SAP BW.

 

Fast forward to this year: I'm involved in an ongoing SAP BW project and thought it would be really beneficial to have a simple ABAP program in my pocket that can help me get insight into any existing BW data flow quickly and give me the right information at a glance (without clicking back and forth through the InfoProviders / Transformations). I started working on two simple ABAP programs (which I'm going to share with you below) and have since been using them to help me make the right informed decision on how to proceed with a redesign / enhancement, or to determine whether a new requirement can be catered for by an existing data flow.

 

The following ABAP programs were created with the intention of helping you get insights into any existing data flow(s), and you may use them for your documentation as well:

  1. ZBW_LIST_IO_BY_INFOPROV
    1. This Program will list the InfoObjects by InfoProviders
  2. ZBW_LIST_TRANSF_MAPPING
    1. This Program will list the transformation mapping between Source and Target

 

*Note:

(1) Splitting the logic into two programs instead of combining them gives you greater flexibility when it comes to documenting multiple InfoProviders / Transformations.

(2) Tested on BW 7.4.

 

 

 

A. Program(1) - ZBW_LIST_IO_BY_INFOPROV

Overview: This Program will list the InfoObjects by MultiProviders / InfoCube / DSO.


How does it work?

1. Execute the Program in SE38 (source code will be provided in the appendix)


2. Insert the InfoProvider’s technical name and select the type of InfoProvider, then execute (F8) the program.

Blog(1).png

**Note: If you are listing an SPO, remember to put “00” at the end of the technical name. For example, for spo ZSPO01, the technical name should be “ZSP0100”.


3. For MultiProviders / Infocube, result will be generated to list the InfoObjects by Dimension:

Blog(2).png


4. For DSO, result will be generated to list the InfoObjects by key fields and data fields:

Blog(3).png


5. You can then export this list and compile all the InfoProviders in an Excel file, enabling you to search InfoObjects by InfoProvider (without over-relying on the where-used list in BW)!


Tips: To make a more informed technical decision, most of the time you will also need to understand how each InfoObject is populated in the InfoProvider. To obtain that information, you may proceed to the 2nd ABAP program, ZBW_LIST_TRANSF_MAPPING, to get the mapping from the source to the target.

 

 

 

B. Program(2) - ZBW_LIST_TRANSF_MAPPING

Overview: This Program will list the Transformation mapping by the Transformation Target or the Transformation ID.


How does it work?

1. Execute the Program in SE38 (source code will be provided in the appendix)


2. Insert either the Transformation Target's (DSO / InfoProvider / SPO / InfoSource) technical name or the Transformation ID, and execute (F8) the program.

Blog(4).png

*Note: For SPO InfoProviders, put the incoming InfoSource as the Transformation Target. For example, for SPO ZSPO01, put the incoming InfoSource ZSP01_I as the Transformation Target instead.

 

3. A result is generated listing the Source and Target with the mapping rule, by Transformation Target and Transformation ID:

Blog(5).png

 

4. You can then export this and combine it with the listing of InfoObjects by InfoProviders using a simple VLOOKUP in Excel, which gives you the entire mapping of InfoObjects by InfoProvider to their sources without needing to go back and forth between different Transformations or InfoProviders.

 

With this, you would be able to know how a particular InfoObject is populated in an InfoProvider, which subsequently helps you make the right decision in:

     a. Redesigning the relevant InfoProviders or data flow

     b. Determining how to enhance an existing data flow or whether a new data flow will need to be developed.

 

Hope it helps :) If you are interested in Lumira as well, you may have a look at a blog that I published previously:

Visualizing Air Pollutant Index (API) with Lumira and Import.io

 

 

Appendix (1) - ZBW_LIST_IO_BY_INFOPROV

REPORT ZBW_LIST_IO_BY_INFOPROV.

DATA: INP TYPE C LENGTH 10.
SELECT-OPTIONS: o_INP FOR INP.

PARAMETERS: CUBE RADIOBUTTON GROUP GR1,
            DSO  RADIOBUTTON GROUP GR1.

TYPES: BEGIN OF ty_tab_dso,
         DSO_N     TYPE RSDODSOBJECT,
         DSO_T     TYPE RSTXTLG,
         IO_N      TYPE RSIOBJNM,
         IO_T      TYPE RSTXTSH,
         POSIT     TYPE RSPOSIT,
         KEYFLAG   TYPE KEYFLAG,
         OBJVERS   TYPE RSOBJVERS,
         OBJVERS_2 TYPE RSOBJVERS,
         OBJVERS_3 TYPE RSOBJVERS,
         LANGU_D   TYPE LANGU,
         LANGU_I   TYPE LANGU,
       END OF ty_tab_dso.

TYPES: BEGIN OF ty_tab_dso_disp,
         DSO_N  TYPE RSDODSOBJECT,
         DSO_T  TYPE RSTXTLG,
         IO_N_K TYPE RSIOBJNM,
         IO_N   TYPE RSIOBJNM,
         IO_T   TYPE RSTXTSH,
       END OF ty_tab_dso_disp.

TYPES: BEGIN OF ty_tab_cube,
         IC_N  TYPE RSINFOCUBE,
         IC_T  TYPE RSTXTSH,
         ICD_N TYPE RSDIMENSION,
         ICD_T TYPE RSTXTSH,
         IO_N  TYPE RSIOBJNM,
         IO_T  TYPE RSTXTSH,
         POSIT TYPE RSPOSIT,
       END OF ty_tab_cube.

TYPES: BEGIN OF ty_tab_cube_disp,
         IC_N  TYPE RSINFOCUBE,
         IC_T  TYPE RSTXTSH,
         ICD_T TYPE RSTXTSH,
         IO_N  TYPE RSIOBJNM,
         IO_T  TYPE RSTXTSH,
       END OF ty_tab_cube_disp.

DATA: lt_tab_dso          TYPE STANDARD TABLE OF ty_tab_dso,
      wa_lt_tab_dso       LIKE LINE OF lt_tab_dso,

      lt_tab_dso_disp     TYPE STANDARD TABLE OF ty_tab_dso_disp,
      wa_lt_tab_dso_disp  LIKE LINE OF lt_tab_dso_disp,

      lt_tab_cube         TYPE STANDARD TABLE OF ty_tab_cube,
      lt_tab_cube_tmp     TYPE STANDARD TABLE OF ty_tab_cube,
      wa_lt_tab_cube      LIKE LINE OF lt_tab_cube,

      lt_tab_cube_disp    TYPE STANDARD TABLE OF ty_tab_cube_disp,
      wa_lt_tab_cube_disp LIKE LINE OF lt_tab_cube_disp.

FIELD-SYMBOLS: <wa_lt_tab_cube> LIKE LINE OF lt_tab_cube.

* For ALV Table
DATA: it_fieldcat TYPE slis_t_fieldcat_alv,
      wa_fieldcat TYPE slis_fieldcat_alv,
      seltext_m   TYPE SCRTEXT_M.


*&---------------------------------------------------------------------*
*&      MAIN
*&---------------------------------------------------------------------*
*       Retrieve List of IO by DSO
*----------------------------------------------------------------------*
IF DSO = 'X'.
  PERFORM retrieve_dso.
  seltext_m = 'DSO'.
  PERFORM build_fieldcatalog.
  PERFORM display_alv_report TABLES lt_tab_dso_disp.
ELSE.
  PERFORM retrieve_cube.
  seltext_m = 'CUBE'.
  PERFORM build_fieldcatalog.
  PERFORM display_alv_report TABLES lt_tab_cube_disp.
ENDIF.


*&---------------------------------------------------------------------*
*&      Form  RETRIEVE_DSO
*&---------------------------------------------------------------------*
*       Retrieve List of IO by DSO
*----------------------------------------------------------------------*
FORM retrieve_dso.

  SELECT A~ODSOBJECT ATEXT~TXTLG A~IOBJNM B~TXTSH POSIT KEYFLAG
         A~OBJVERS ATEXT~OBJVERS B~OBJVERS ATEXT~LANGU B~LANGU
    FROM RSDODSOIOBJ AS A
    JOIN RSDODSOT AS ATEXT
      ON A~ODSOBJECT = ATEXT~ODSOBJECT
     AND A~OBJVERS   = ATEXT~OBJVERS
     AND ATEXT~LANGU = 'E'
    JOIN RSDIOBJT AS B
      ON A~IOBJNM  = B~IOBJNM
     AND A~OBJVERS = B~OBJVERS
     AND B~LANGU   = 'E'
    INTO TABLE lt_tab_dso
    WHERE A~ODSOBJECT IN o_INP
      AND A~OBJVERS = 'A'.

  SORT lt_tab_dso BY DSO_N POSIT.

  LOOP AT lt_tab_dso INTO wa_lt_tab_dso.
    wa_lt_tab_dso_disp-DSO_N = wa_lt_tab_dso-DSO_N.
    wa_lt_tab_dso_disp-DSO_T = wa_lt_tab_dso-DSO_T.

    IF wa_lt_tab_dso-keyflag = 'X'.
      wa_lt_tab_dso_disp-IO_N_K = wa_lt_tab_dso-IO_N.
    ELSE.
      wa_lt_tab_dso_disp-IO_N = wa_lt_tab_dso-IO_N.
    ENDIF.
    wa_lt_tab_dso_disp-IO_T = wa_lt_tab_dso-IO_T.

    APPEND wa_lt_tab_dso_disp TO lt_tab_dso_disp.
    CLEAR: wa_lt_tab_dso_disp.
  ENDLOOP.
ENDFORM.

*&---------------------------------------------------------------------*
*&      Form  RETRIEVE_CUBE
*&---------------------------------------------------------------------*
*       Retrieve List of IO by CUBE
*----------------------------------------------------------------------*
FORM retrieve_cube.

  SELECT A~INFOCUBE    AS IC_N
         ATEXT~TXTLG   AS IC_T
         A2~DIMENSION  AS ICD_N
         A~IOBJNM      AS IO_N
         B~TXTSH       AS IO_T
         A2~POSIT
    FROM RSDCUBEIOBJ AS A
    JOIN RSDCUBET AS ATEXT
      ON A~INFOCUBE  = ATEXT~INFOCUBE
     AND A~OBJVERS   = ATEXT~OBJVERS
     AND ATEXT~LANGU = 'E'
    LEFT JOIN RSDDIMEIOBJ AS A2
      ON A~INFOCUBE = A2~INFOCUBE
     AND A~OBJVERS  = A2~OBJVERS
     AND A~IOBJNM   = A2~IOBJNM
    LEFT JOIN RSDIOBJT AS B
      ON A~IOBJNM   = B~IOBJNM
     AND B~LANGU    = 'E'
     AND A~OBJVERS  = B~OBJVERS
    INTO CORRESPONDING FIELDS OF TABLE lt_tab_cube
    WHERE A~INFOCUBE IN o_INP
      AND A~OBJVERS = 'A'.

* Select Dimension Text
  SELECT DTEXT~INFOCUBE  AS IC_N
         DTEXT~DIMENSION AS ICD_N
         DTEXT~TXTSH     AS ICD_T
    FROM RSDDIMET AS DTEXT
    INTO CORRESPONDING FIELDS OF wa_lt_tab_cube
    FOR ALL ENTRIES IN lt_tab_cube
    WHERE LANGU     = 'E'
      AND INFOCUBE  = lt_tab_cube-IC_N
      AND OBJVERS   = 'A'
      AND DIMENSION = lt_tab_cube-ICD_N.

    MODIFY lt_tab_cube FROM wa_lt_tab_cube TRANSPORTING ICD_T
      WHERE IC_N  = wa_lt_tab_cube-IC_N
        AND ICD_N = wa_lt_tab_cube-ICD_N.
  ENDSELECT.

  DATA: LINE_CNT TYPE I.
  LINE_CNT = 1.
  CLEAR: wa_lt_tab_cube.
  LOOP AT lt_tab_cube ASSIGNING <wa_lt_tab_cube>.

    wa_lt_tab_cube_disp-IC_N = <wa_lt_tab_cube>-IC_N.
    wa_lt_tab_cube_disp-IC_T = <wa_lt_tab_cube>-IC_T.

    wa_lt_tab_cube_disp-ICD_T = <wa_lt_tab_cube>-ICD_T.
    IF wa_lt_tab_cube-ICD_T <> <wa_lt_tab_cube>-ICD_T.
      IF <wa_lt_tab_cube>-ICD_T IS INITIAL.
        wa_lt_tab_cube_disp-ICD_T = 'Key Figures'.
      ENDIF.
      APPEND wa_lt_tab_cube_disp TO lt_tab_cube_disp.
    ENDIF.
    CLEAR wa_lt_tab_cube_disp-ICD_T.

    wa_lt_tab_cube_disp-IO_N = <wa_lt_tab_cube>-IO_N.
    wa_lt_tab_cube_disp-IO_T = <wa_lt_tab_cube>-IO_T.

    APPEND wa_lt_tab_cube_disp TO lt_tab_cube_disp.
    wa_lt_tab_cube = <wa_lt_tab_cube>.

    CLEAR wa_lt_tab_cube_disp.
  ENDLOOP.

ENDFORM.


*&---------------------------------------------------------------------*
*&      Form  BUILD_FIELDCATALOG
*&---------------------------------------------------------------------*
*       Build Fieldcatalog for ALV Report
*----------------------------------------------------------------------*
FORM build_fieldcatalog.

  IF seltext_m = 'DSO'.

    wa_fieldcat-fieldname = 'DSO_N'.       " Fieldname in the data table
    wa_fieldcat-seltext_m = seltext_m.     " Column description in the output
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'DSO_T'.
    wa_fieldcat-seltext_m = 'Desc'.
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'IO_N_K'.
    wa_fieldcat-seltext_m = 'Key-Fields'.
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'IO_N'.
    wa_fieldcat-seltext_m = 'Data Fields'.
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'IO_T'.
    wa_fieldcat-seltext_m = 'IO Desc'.
    APPEND wa_fieldcat TO it_fieldcat.

  ELSE.

    wa_fieldcat-fieldname = 'IC_N'.        " Fieldname in the data table
    wa_fieldcat-seltext_m = seltext_m.     " Column description in the output
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'IC_T'.
    wa_fieldcat-seltext_m = 'Desc'.
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'ICD_T'.
    wa_fieldcat-seltext_m = 'Dimension'.
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'IO_N'.
    wa_fieldcat-seltext_m = 'IO Tech'.
    APPEND wa_fieldcat TO it_fieldcat.

    wa_fieldcat-fieldname = 'IO_T'.
    wa_fieldcat-seltext_m = 'IO Desc'.
    APPEND wa_fieldcat TO it_fieldcat.

  ENDIF.
ENDFORM.                    " BUILD_FIELDCATALOG


*&---------------------------------------------------------------------*
*&      Form  DISPLAY_ALV_REPORT
*&---------------------------------------------------------------------*
*       Display report using ALV grid
*----------------------------------------------------------------------*
FORM display_alv_report TABLES lt_tab.

*Pass data and field catalog to ALV function module to display ALV list
  CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
    EXPORTING
      IT_FIELDCAT   = it_fieldcat[]
    TABLES
      t_outtab      = lt_tab
    EXCEPTIONS
      program_error = 1
      OTHERS        = 2.

ENDFORM.                    " DISPLAY_ALV_REPORT

 

 

 

Appendix (2) – Source Code for ZBW_LIST_TRANSF_MAPPING

REPORT ZBW_LIST_TRANSF_MAPPING.

DATA: TARGET TYPE SOBJ_NAME.
SELECT-OPTIONS: O_TARGET FOR TARGET.

DATA: TRANS TYPE RSTRANID.
SELECT-OPTIONS: O_TRANS FOR TRANS.

* For Data Display
DATA: it_fieldcat TYPE slis_t_fieldcat_alv,
      wa_fieldcat TYPE slis_fieldcat_alv.

TYPES: BEGIN OF ty_sel,
         TRANID     TYPE RSTRANID,
         SOURCENAME TYPE SOBJ_NAME,
       END OF ty_sel.

TYPES: BEGIN OF ty_tab,
         TARGETNAME     TYPE C LENGTH 15,
         TRANID         TYPE RSTRANID,
         RULEID         TYPE RSTRAN_RULEID,
         STEPID         TYPE RSTRAN_STEPID,
         PARAMTYPE      TYPE RSTRAN_PARAMTYPE,
         SOURCENAME     TYPE SOBJ_NAME,
         PARAMNM_SOURCE TYPE C LENGTH 15,
         PARAMNM        TYPE C LENGTH 15,
         RULETYPE       TYPE RSTRAN_RULETYPE,
         RULETEXT       TYPE RSTXTLG,
       END OF ty_tab.

TYPES: BEGIN OF ty_tab_disp,
         TARGETNAME     TYPE C LENGTH 15,
         TRANID         TYPE RSTRANID,
         RULEID         TYPE RSTRAN_RULEID,
         SOURCENAME     TYPE SOBJ_NAME,
         PARAMNM_SOURCE TYPE C LENGTH 15,
         PARAMNM        TYPE C LENGTH 15,
         RULETEXT       TYPE RSTXTLG,
       END OF ty_tab_disp.

DATA: lt_sel         TYPE STANDARD TABLE OF ty_sel,

      lt_tab         TYPE STANDARD TABLE OF ty_tab,
      wa_lt_tab      LIKE LINE OF lt_tab,

      lt_tab_2       TYPE STANDARD TABLE OF ty_tab,
      wa_lt_tab_2    LIKE LINE OF lt_tab_2,

      lt_tab_final   TYPE STANDARD TABLE OF ty_tab,

      lt_tab_disp    TYPE STANDARD TABLE OF ty_tab_disp,
      wa_lt_tab_disp LIKE LINE OF lt_tab_disp.

*&---------------------------------------------------------------------*
*&      MAIN
*&---------------------------------------------------------------------*
*
*----------------------------------------------------------------------*
PERFORM data_retrieval.
PERFORM build_fieldcatalog.
PERFORM display_alv_report TABLES lt_tab_disp.


*&---------------------------------------------------------------------*
*&      Form  DATA_RETRIEVAL
*&---------------------------------------------------------------------*
*
*----------------------------------------------------------------------*
FORM data_retrieval.

  IF O_TARGET IS NOT INITIAL.

    SELECT A~TARGETNAME A~TRANID B~RULEID STEPID PARAMTYPE
           SOURCENAME PARAMNM C~RULETYPE CTEXT~TXTLG AS RULETEXT
      FROM RSTRAN AS A JOIN RSTRANFIELD AS B
        ON A~TRANID  = B~TRANID
       AND A~OBJVERS = B~OBJVERS
      LEFT JOIN RSTRANRULE AS C
        ON B~TRANID  = C~TRANID
       AND B~OBJVERS = C~OBJVERS
       AND B~RULEID  = C~RULEID
      LEFT JOIN RSTRANRULET AS CTEXT
        ON B~TRANID    = CTEXT~TRANID
       AND B~OBJVERS   = CTEXT~OBJVERS
       AND B~RULEID    = CTEXT~RULEID
       AND CTEXT~LANGU = 'E'
      INTO CORRESPONDING FIELDS OF TABLE lt_tab
      WHERE TARGETNAME IN O_TARGET
        AND A~OBJVERS = 'A'.

  ELSE.

    SELECT A~TARGETNAME A~TRANID B~RULEID STEPID PARAMTYPE
           SOURCENAME PARAMNM C~RULETYPE CTEXT~TXTLG AS RULETEXT
      FROM RSTRAN AS A JOIN RSTRANFIELD AS B
        ON A~TRANID  = B~TRANID
       AND A~OBJVERS = B~OBJVERS
      LEFT JOIN RSTRANRULE AS C
        ON B~TRANID  = C~TRANID
       AND B~OBJVERS = C~OBJVERS
       AND B~RULEID  = C~RULEID
      LEFT JOIN RSTRANRULET AS CTEXT
        ON B~TRANID    = CTEXT~TRANID
       AND B~OBJVERS   = CTEXT~OBJVERS
       AND B~RULEID    = CTEXT~RULEID
       AND CTEXT~LANGU = 'E'
      INTO CORRESPONDING FIELDS OF TABLE lt_tab
      WHERE A~TRANID IN O_TRANS
        AND A~OBJVERS = 'A'.

  ENDIF.

  lt_tab_2[] = lt_tab[].

  DELETE lt_tab WHERE PARAMTYPE = 1.
  DELETE lt_tab_2 WHERE PARAMTYPE = 0.
  DELETE lt_tab_2 WHERE RULETYPE = 'END'.

  SORT: lt_tab BY tranid ruleid, lt_tab_2 BY tranid ruleid.

  LOOP AT lt_tab_2 INTO wa_lt_tab_2.
    LOOP AT lt_tab INTO wa_lt_tab
         WHERE TRANID = wa_lt_tab_2-TRANID
           AND RULEID = wa_lt_tab_2-RULEID.

      IF sy-subrc = 0.
        wa_lt_tab_2-PARAMNM_SOURCE = wa_lt_tab-PARAMNM.
      ENDIF.

      APPEND wa_lt_tab_2 TO lt_tab_final.
    ENDLOOP.

    IF sy-subrc <> 0.
      APPEND wa_lt_tab_2 TO lt_tab_final.
    ENDIF.
    CLEAR sy-subrc.

  ENDLOOP.

  CLEAR: lt_tab, wa_lt_tab, lt_tab_2, wa_lt_tab_2.

  LOOP AT lt_tab_final INTO wa_lt_tab.
    wa_lt_tab_disp-TARGETNAME = wa_lt_tab-TARGETNAME.
    wa_lt_tab_disp-TRANID = wa_lt_tab-TRANID.
    wa_lt_tab_disp-RULEID = wa_lt_tab-RULEID.
    wa_lt_tab_disp-SOURCENAME = wa_lt_tab-SOURCENAME.
    wa_lt_tab_disp-PARAMNM_SOURCE = wa_lt_tab-PARAMNM_SOURCE.
    wa_lt_tab_disp-PARAMNM = wa_lt_tab-PARAMNM.

    IF wa_lt_tab-RULETYPE = 'CONSTANT'.
      wa_lt_tab_disp-RULETEXT = wa_lt_tab-RULETEXT.
    ELSE.
      wa_lt_tab_disp-RULETEXT = wa_lt_tab-RULETYPE.
    ENDIF.
    APPEND wa_lt_tab_disp TO lt_tab_disp.
    CLEAR: wa_lt_tab_disp, wa_lt_tab.

  ENDLOOP.

ENDFORM.


*&---------------------------------------------------------------------*
*&      Form  BUILD_FIELDCATALOG
*&---------------------------------------------------------------------*
*       Build Fieldcatalog for ALV Report
*----------------------------------------------------------------------*
FORM build_fieldcatalog.

  wa_fieldcat-fieldname = 'TARGETNAME'.    " Fieldname in the data table
  wa_fieldcat-seltext_m = 'Target'.        " Column description in the output
  APPEND wa_fieldcat TO it_fieldcat.

  wa_fieldcat-fieldname = 'TRANID'.
  wa_fieldcat-seltext_m = 'Transformation'.
  APPEND wa_fieldcat TO it_fieldcat.

  wa_fieldcat-fieldname = 'RULEID'.
  wa_fieldcat-seltext_m = 'Rule ID'.
  APPEND wa_fieldcat TO it_fieldcat.

  wa_fieldcat-fieldname = 'SOURCENAME'.
  wa_fieldcat-seltext_m = 'Source'.
  APPEND wa_fieldcat TO it_fieldcat.

  wa_fieldcat-fieldname = 'PARAMNM_SOURCE'.
  wa_fieldcat-seltext_m = 'Source Field'.
  APPEND wa_fieldcat TO it_fieldcat.

  wa_fieldcat-fieldname = 'PARAMNM'.
  wa_fieldcat-seltext_m = 'Target Field'.
  APPEND wa_fieldcat TO it_fieldcat.

  wa_fieldcat-fieldname = 'RULETEXT'.
  wa_fieldcat-seltext_m = 'Rule'.
  APPEND wa_fieldcat TO it_fieldcat.

ENDFORM.                    " BUILD_FIELDCATALOG


*&---------------------------------------------------------------------*
*&      Form  DISPLAY_ALV_REPORT
*&---------------------------------------------------------------------*
*       Display report using ALV grid
*----------------------------------------------------------------------*
FORM display_alv_report TABLES lt_tab.

*Pass data and field catalog to ALV function module to display ALV list
  CALL FUNCTION 'REUSE_ALV_GRID_DISPLAY'
    EXPORTING
      IT_FIELDCAT   = it_fieldcat[]
    TABLES
      t_outtab      = lt_tab
    EXCEPTIONS
      program_error = 1
      OTHERS        = 2.

ENDFORM.

This blog explains how to write a routine in a transformation. There are many documents available on where to write the code, but this document explains how to write the code.

 

Note: The text in green is the explanation. The final code is at the bottom.

 

The prerequisite information required is:

  1. Source (here YDSO_9 is the source DSO)
  2. Key fields and the data field that is to be populated

Here YMAT_2 is the key field in the source DSO, and the field that we need to populate in the target is YMAT_D5 (data field).


com_1.JPG

 

Types: begin of ty_dso1,

/bic/YMAT_2 type /BIC/OIYMAT_2,

/bic/YMAT_D5 type /BIC/OIYMAT_D5,

end of ty_dso1.

 

 

 

In the above, we declared /bic/YMAT_2 with type /BIC/OIYMAT_2. To find the data element of an InfoObject, open the InfoObject; the data element is shown as below.

com_2.JPG


Now we declare the internal table and work area. Here I have used IT_TAB1 for the internal table.

Note: Many people use different naming conventions, so please don't get confused when reading the internal table or work area names.

 

data: it_tab1 type standard table of ty_dso1,

       wa_tab1 type ty_dso1.

 

 

if result_package[] is not initial.


Below, we need to select the required fields from the active table of the source DSO (the source DSO here is YDSO_9).

If you are not sure about the active table of your DSO, please refer to the steps below.

Go to Manage on the DSO, then to the Contents tab. Once you click on Active Data, you will see the table as shown in the screenshots below.

com_3.JPG


select /bic/YMAT_2 /bic/YMAT_D5 from /bic/AYDSO_900 into corresponding fields of table it_tab1 for all entries in result_package where

/bic/YMAT_2 = result_package-/bic/YMAT_2.

 

Here we populate the internal table IT_TAB1 from the active table of the source DSO YDSO_9 (/BIC/AYDSO_900) based on the WHERE condition.

If we are not sure about the field names to select from the source DSO, go to SE11, enter the table name /BIC/AYDSO_900, and we will get information about the fields.

The active table name of a DSO will be like /BIC/ADSOName00, e.g. /BIC/AYDSO_900; replace YDSO_9 with your DSO name.


com_4.JPG

Here the condition is /bic/YMAT_2 = result_package-/bic/YMAT_2, which matches the values of YMAT_2 from the source DSO with YMAT_2 of the target DSO (usually we specify the key fields here).

In some cases we may not have the same InfoObjects in the lookup DSO and the target InfoProvider.

In that case we need to use the equivalent objects related to the key fields.

Example:

We have 0DOC_NUM in the DSO and ZDOCNUM in the target InfoProvider. Then we can use these two objects in the WHERE condition as shown below:

select 0doc_num /bic/YMAT_D5 from /bic/AYDSO_900 into corresponding fields of table it_tab1 for all entries in result_package where

doc_num = result_package-/bic/ZDOCNUM.

 

if sy-subrc = 0.

sort it_tab1 by /bic/YMAT_2.

endif.

endif.

 

loop at result_package assigning <result_fields>.

Now we need to populate the work area from the internal table. The READ statement below reads data from the internal table and populates the work area based on the key fields.

  read table it_tab1 into wa_tab1 with key

  /bic/YMAT_2 = <result_fields>-/bic/YMAT_2 binary search.

Now the work area holds the data that has to be populated into the result field YMAT_D5. The statement below populates the data into the result fields:

  if sy-subrc = 0.

    <result_fields>-/bic/YMAT_D5 = wa_tab1-/bic/YMAT_D5.

 

    endif.

    Endloop.

 

 

The final code will be


Types : begin of ty_dso1,

/bic/YMAT_2 type /BIC/OIYMAT_2,

/bic/YMAT_D5 type /BIC/OIYMAT_D5,

end of ty_dso1.

data: it_tab1 type standard table of ty_dso1,

wa_tab1 type ty_dso1.

if result_package[] is not initial.

select /bic/YMAT_2 /bic/YMAT_D5 from /bic/AYDSO_900 into corresponding fields of table it_tab1 for all entries in result_package where

/bic/YMAT_2 = result_package-/bic/YMAT_2.

if sy-subrc = 0.

sort it_tab1 by /bic/YMAT_2.

endif.

endif.

loop at result_package assigning <result_fields>.

  read table it_tab1 into wa_tab1 with key

  /bic/YMAT_2 = <result_fields>-/bic/YMAT_2 binary search.

  if sy-subrc = 0.

<result_fields>-/bic/YMAT_D5 = wa_tab1-/bic/YMAT_D5.

endif.

endloop.


BW Transport for Infopackage

Posted by Aby Jacob Jun 10, 2016

Hello Friends,

 

Our team had an error while transporting InfoPackages to the production server.

 

This error occurred only in the production server and affected only InfoPackages.

 

After almost 3 days, we found an OSS Note to resolve this error: 1965709.

 

https://websmp230.sap-ag.de/sap/support/notes/1965709

 

 

 

ISIP Object Type entry.gif

 

TABLE RSTLOGOPROP.gif



And the solution which worked for us is given below:

 

1 - Update the table RSTLOGOPROP.

2 - Delete the affected transport requests from the import buffers as advised in the OSS Note (help from the Basis team is needed for this).

3 - Create a new transport for table RSTLOGOPROP (move it from development to production).

4 - Create a new transport for the InfoPackages and transport them again to the production server.

 

 

Thanks

Aby Jacob

After reading this great blog from J. Jonkergouw on his website about deleting overlapping requests from a DataStore Object using ABAP, we had a similar issue; however, we needed to keep one historic load in the DSO. That's why we altered the code a bit to keep the requests coming from one DSO.

 

Kudos to Joury for the code; it was very nice to implement.

 

     DATA:

       l_t_rsiccont   TYPE STANDARD TABLE OF rsiccont,

       lv_ftimestampc TYPE c LENGTH 14,

       lv_ttimestampc TYPE c LENGTH 14,

       lv_frtimestamp TYPE rstimestmp,

       lv_totimestamp TYPE rstimestmp,

       lv_calweek     TYPE /bi0/oicalweek,

       lv_first_date  TYPE scal-date,

       lv_last_date   TYPE scal-date.

 

     CONSTANTS:

       lc_begin_time TYPE c LENGTH 6 VALUE '000000',

       lc_end_time   TYPE c LENGTH 6 VALUE '235959',

       lc_dso        TYPE rsinfocube VALUE 'ZJJ_DSO_NAME',

       lc_dso_out    TYPE rsinfocube VALUE 'ZRB_DSO_NAME'.

 

     FIELD-SYMBOLS:

            <lfs_rsiccont> TYPE rsiccont.

 

*-  Convert system date to calendar week.

     CALL FUNCTION 'ZBW_DATE_TO_ANYTHING'

       EXPORTING

         i_calday  = sy-datum

       IMPORTING

         e_calweek = lv_calweek.

 

*-  Get week first and last day.

     CALL FUNCTION 'WEEK_GET_FIRST_DAY'

       EXPORTING

         week         = lv_calweek

       IMPORTING

         date         = lv_first_date

       EXCEPTIONS

         week_invalid = 1

         OTHERS       = 2.

 

*-  Define last day of the week

     lv_last_date = lv_first_date + 6.

 

*-  Concatenate to a string with format YYYYMMDDHHIISS

     CONCATENATE lv_first_date lc_begin_time INTO lv_ftimestampc.

     CONCATENATE lv_last_date lc_end_time INTO lv_ttimestampc.

 

*-  Convert the from and to string to a timestamp format

*-  Needed to select data from the RSICCONT

     lv_frtimestamp = lv_ftimestampc.

     lv_totimestamp = lv_ttimestampc.

 

*-  Select all requests which are currently in the data monitor

*- The adjustment made is an inner join from the request table

*- which stores the source of the DTP and is filtered out in the where class

     SELECT rnr timestamp FROM rsiccont AS p

     INNER JOIN rsbkrequest AS r ON r~request = p~rnr

     INTO CORRESPONDING FIELDS OF TABLE l_t_rsiccont

     WHERE icube EQ lc_dso

     AND NOT src EQ lc_dso_out

     AND timestamp BETWEEN lv_frtimestamp AND lv_totimestamp.

 

*-  If we start ASCENDING then the oldest requests will be

*-  deleted including the ones till the current date.

     SORT l_t_rsiccont BY timestamp DESCENDING.

 

*-  Start looping over the requests.

     LOOP AT l_t_rsiccont ASSIGNING <lfs_rsiccont>.

 

*-    Delete requests from the DSO

       CALL FUNCTION 'RSSM_DELETE_REQUEST'

         EXPORTING

           request                    = <lfs_rsiccont>-rnr

           infocube                   = lc_dso

           dialog                     = abap_false

         EXCEPTIONS

           request_not_in_cube        = 1

           infocube_not_found         = 2

           request_already_aggregated = 3

           request_already_comdensed  = 4

           no_enqueue_possible        = 5

           cube_in_planning_mode      = 6

           OTHERS                     = 7.

 

*      Uncomment if you want to enable error handling

*      IF sy-subrc <> 0.

*        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno

*              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.

*      ENDIF.

 

     ENDLOOP.

One of the most common issues with BW data loads is incorrect data from the source system. For occasional failures we edit the PSA records instead of using a routine, since that doesn't need development work and transports. If we need to correct multiple records, it is a pain to correct them one by one. In this blog, I will show how to correct multiple records at once.

Example:

  1. You have loaded the data and it failed with incorrect data. You have checked the PSA records and noticed there are multiple records with the same issue.

1.png

 

   2. Filter the records that have incorrect data, select all of them, and click on the 'Edit' button.

2.png

 

    3. A blank record opens up on a pop-up screen.  Enter the correct data and save.

3.png

   4. Now you can check that the data is corrected for all the records you have selected.

4.png

 

Don't forget to notify the owners/analysts to correct the data in the source system.

There are scenarios where a transformation end routine is a good fit. In this blog I will demonstrate how to simplify Transfer Rules by means of:

 

Reducing Coding

 

In my case I load PO GR data and look up multiple characteristic values from the PO item level. Instead of repetitively coding a similar lookup / mapping for each characteristic in an individual Transfer Rule, I did it once in the End Routine. It saved me not only coding effort, but also increased performance by reducing the number of lookups. A rough sketch of the idea follows the screenshots below.

 

End Routine 1.jpg

 

End Routine 2.jpg
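The screenshots only show the shape of the routine, so here is a minimal, hypothetical sketch of the idea: one selection and one read per result row fill several characteristics at once. The lookup DSO active table /BIC/AZPO_IT00 and all field names are placeholders (assumed to exist in both the lookup table and the target), not the actual objects from my data flow.

* Hypothetical end routine sketch: one lookup fills several characteristics.
* Table /BIC/AZPO_IT00 and all field names are illustrative assumptions.
TYPES: BEGIN OF ty_po_item,
         doc_number TYPE c LENGTH 10,   " PO number (assumed key)
         doc_item   TYPE n LENGTH 6,    " PO item   (assumed key)
         plant      TYPE c LENGTH 4,
         matl_group TYPE c LENGTH 9,
         vendor     TYPE c LENGTH 10,
       END OF ty_po_item.

DATA: lt_po_item TYPE STANDARD TABLE OF ty_po_item,
      wa_po_item TYPE ty_po_item.

IF result_package[] IS NOT INITIAL.
* One selection from the assumed PO item lookup DSO instead of one per rule
  SELECT doc_number doc_item plant matl_group vendor
    FROM /bic/azpo_it00
    INTO CORRESPONDING FIELDS OF TABLE lt_po_item
    FOR ALL ENTRIES IN result_package
    WHERE doc_number = result_package-doc_number
      AND doc_item   = result_package-doc_item.
  SORT lt_po_item BY doc_number doc_item.
ENDIF.

LOOP AT result_package ASSIGNING <result_fields>.
  READ TABLE lt_po_item INTO wa_po_item
       WITH KEY doc_number = <result_fields>-doc_number
                doc_item   = <result_fields>-doc_item
       BINARY SEARCH.
  IF sy-subrc = 0.
*   All characteristics are filled from the single lookup result
    <result_fields>-plant      = wa_po_item-plant.
    <result_fields>-matl_group = wa_po_item-matl_group.
    <result_fields>-vendor     = wa_po_item-vendor.
  ENDIF.
ENDLOOP.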

 

 

Increasing Reusability

 

During the PO GR data load I calculate the Delivery Duration based on the Due Date, the Delivery Duration based on the GR date, and the over / under variances of the two durations. I did not like the idea of repeating the duration calculation logic in the variance transformation rules. Instead, I used the results of the duration calculations in the end routine to calculate the variances; a rough sketch follows the screenshots below.

 

End Routine 3.jpg

End Routine 4.jpg
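As a rough sketch of the reuse idea (not my actual code), the two durations can be derived once per row in the end routine and the variance computed directly from those results. ZPO_DATE, ZDUE_DATE, ZGR_DATE, ZDUR_DUE, ZDUR_GR and ZDUR_VAR are hypothetical InfoObjects assumed to exist in the target.

* Hypothetical sketch: durations calculated once, the variance reuses them.
LOOP AT result_package ASSIGNING <result_fields>.
* Delivery duration based on due date and based on GR date (in days)
  <result_fields>-/bic/zdur_due = <result_fields>-/bic/zdue_date - <result_fields>-/bic/zpo_date.
  <result_fields>-/bic/zdur_gr  = <result_fields>-/bic/zgr_date  - <result_fields>-/bic/zpo_date.
* Over / under variance reuses the two results instead of recalculating them
  <result_fields>-/bic/zdur_var = <result_fields>-/bic/zdur_gr - <result_fields>-/bic/zdur_due.
ENDLOOP.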

Scenario: When I execute the DTP in the current month, it always picks only “current month – 1” data.

 

Example:

 

If today's date is 04/22/2016, based on the system date, it will calculate previous month First day and Last day. i.e. it will fetch 03/01/2016 to  03/31/2016.

 

If today's date is 01/22/2016, based on the system date, it will calculate previous month First day and Last day. i.e. it will fetch 12/01/2015 to 12/31/2015.

 

 

Occasionally we need to filter a date characteristic InfoObject to extract only “previous month” data. Here the filter selection is not on an SAP Content InfoObject; the filter selection is on a custom InfoObject.

 

If it were an SAP Content InfoObject, we might have a few SAP customer exit variables to use directly in the DTP, but in this example I'm using a custom InfoObject created with data type DATS.

 

In the DTP, select the InfoObject, choose Create Routine, and write/add the code below in the DTP routine.

 

* Global code used by conversion rules
*$*$ begin of global - insert your declaration only below this line  *-*
* TABLES: ...

DATA: dt_range  TYPE STANDARD TABLE OF rsdatrange,
      btw       LIKE STANDARD TABLE OF rsintrange,
      wdt_range TYPE rsdatrange.

*$*$ end of global - insert your declaration only before this line   *-*


*$*$ begin of routine - insert your code only below this line        *-*
  DATA: l_idx LIKE sy-tabix.
  READ TABLE l_t_range WITH KEY
       fieldname = ' '.
  l_idx = sy-tabix.
*....

  CALL FUNCTION 'RS_VARI_V_LAST_MONTH'
*   EXPORTING
*     SYSTIME    = ' '
    TABLES
      p_datetab  = dt_range
      p_intrange = btw.

  READ TABLE dt_range INTO wdt_range INDEX 1.

  l_t_range-fieldname = '/BIC/<Your_InfoObject_Name>'.
  l_t_range-option    = 'BT'.
  l_t_range-sign      = 'I'.
  l_t_range-low       = wdt_range-low.
  l_t_range-high      = wdt_range-high.

  APPEND l_t_range.

*  IF l_idx <> 0.
*    MODIFY l_t_range INDEX l_idx.
*  ELSE.
*    APPEND l_t_range.
*  ENDIF.

*$*$ end of routine - insert your code only before this line         *-*

Sometimes data in the source system is not checked for quality. For example, input data is not checked for non-printable characters, e.g. tabulation, carriage return, line feed, etc. If a user copies and pastes data into input fields from an email or a web page, non-printable characters can enter the system and cause BW data loading issues (characters that are not permitted). In the case of master data, a quality issue must be fixed immediately, otherwise the problem becomes worse with every transaction in which the incorrect master data is used. In the case of incorrect values in mere information fields that are stored in a DSO at document level, the data can be fixed in the transfer rules.

What it takes is to correct the data in the transfer rule start routine using a regular expression; a sketch follows at the end of this section.

REGEX1.jpg

Prior to executing the REPLACE statement, the HEWORD field of SOURCE_PACKAGE contains a hex 09 (tabulation) character.

REGEX2.jpg

Once the REPLACE statement is executed, the non-printable character is gone.

REGEX3.jpg

REGEX4.jpg
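As a hedged sketch of such a start routine (the field name HEWORD is taken from the screenshots above and is an assumption here; adjust it to your own source field), all non-printable characters can be stripped with a single REPLACE ... REGEX per row:

* Sketch of a start routine cleanup: remove non-printable characters
* (tab, carriage return, line feed, ...) from a text field of SOURCE_PACKAGE.
FIELD-SYMBOLS: <source_fields> LIKE LINE OF source_package.

LOOP AT source_package ASSIGNING <source_fields>.
* [^[:print:]] matches every character that is not printable
  REPLACE ALL OCCURRENCES OF REGEX '[^[:print:]]'
          IN <source_fields>-heword WITH ''.
ENDLOOP.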

List of issues faced during flat file generation from an APD:

 

1> Header names for key figures are displayed with technical names in the flat file.

 

2> The negative sign of key figures like amount and quantity is displayed after the value in the flat file, which results in a wrong total amount in the flat file, e.g.:

      Amount

      $1000 –

 

3> Leading zeros are added to the key figures of the APD.

 

4> Values get rounded off; no decimal places are displayed in the flat file.

 

Solution:

 

First create Z InfoObjects according to the lengths of your header field names.

  1. e.g. ZCHAR20 for a field name of length 20

 

Assign these Z InfoObjects to the target fields of the APD routine as below.

             

Capture.PNG

 

Write the following logic in the routine tab:

 

DATA: ls_source TYPE y_source_fields,
      ls_target TYPE y_target_fields.

* Header row: field names to be displayed in the flat file
ls_target-Char1 = 'ABC'.
ls_target-Char2 = 'XYZ'.

APPEND ls_target TO et_target.

DATA: lv_value TYPE p LENGTH 16 DECIMALS 2. "Add decimal places as per your need

LOOP AT it_source INTO ls_source.

* MOVE-CORRESPONDING ls_source TO ls_target.
  ls_target-Char1 = ls_source-Char1.
  ls_target-Char2 = ls_source-Char2.

* ls_target-KYF_0001 = ls_source-KYF_0001.
  CLEAR: lv_value.
  IF ls_source-KYF_0001 IS NOT INITIAL.
    lv_value = ls_source-KYF_0001.
    IF lv_value IS NOT INITIAL.
      ls_target-KYF_0001 = lv_value.
      IF lv_value LT 0.
*       Move the trailing minus sign in front of the value
        SHIFT ls_target-KYF_0001 RIGHT DELETING TRAILING '-'.
        SHIFT ls_target-KYF_0001 LEFT DELETING LEADING ' '.
        CONCATENATE '-' ls_target-KYF_0001 INTO ls_target-KYF_0001.
      ENDIF.
    ENDIF.
  ELSE.
    ls_target-KYF_0001 = '0.00'.
  ENDIF.

* Append each formatted row to the APD output (closing of the loop)
  APPEND ls_target TO et_target.
  CLEAR ls_target.

ENDLOOP.

 

 

Note: Here Char1 and Char2 are your InfoObject technical names, and ABC and XYZ are the field names that you want to display in the header fields of the flat file.

A Virtual Cube function module can be implemented very easily using the CL_RSDRV_REMOTE_IPROV_SRV class services (there is an example in the class documentation). I like its simplicity, but unfortunately it cannot handle complex selections. In this blog, I will explain how to keep the Virtual Cube function module implementation simple and at the same time handle complex selections by enhancing the service class.

Below is the function module that implements a Virtual Cube reading from the SFLIGHT table.
*---------------------------------------------------------------------*
*      CLASS lcl_application  DEFINITION
*---------------------------------------------------------------------*
CLASS lcl_application DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS:
      get_t_iobj_2_fld RETURNING VALUE(rt_iobj_2_fld)
                       TYPE cl_rsdrv_remote_iprov_srv=>tn_th_iobj_fld_mapping.
ENDCLASS.

*---------------------------------------------------------------------*
*      CLASS lcl_application  IMPLEMENTATION
*---------------------------------------------------------------------*
CLASS lcl_application IMPLEMENTATION.
*---------------------------------------------------------------------*
* get_t_iobj_2_fld
*---------------------------------------------------------------------*
  METHOD get_t_iobj_2_fld.

    rt_iobj_2_fld = VALUE #( ( iobjnm = 'CARRID'    fldnm = 'CARRID' )
                             ( iobjnm = 'CONNID'    fldnm = 'CONNID' )
                             ( iobjnm = 'FLDATE'    fldnm = 'FLDATE' )
                             ( iobjnm = 'PLANETYPE' fldnm = 'PLANETYPE' )
                             ( iobjnm = 'SEATSOCC'  fldnm = 'SEATSOCC' )
                             ( iobjnm = 'SEATSOCCB' fldnm = 'SEATSOCC_B' )
                             ( iobjnm = 'SEATSOCCF' fldnm = 'SEATSOCC_F' ) ).

  ENDMETHOD.
ENDCLASS.

FUNCTION z_sflight_read_remote_data.
*"----------------------------------------------------------------------
*"*"Local Interface:
*"  IMPORTING
*"    VALUE(INFOCUBE) LIKE  BAPI6200-INFOCUBE
*"    VALUE(KEYDATE) LIKE  BAPI6200-KEYDATE OPTIONAL
*"  EXPORTING
*"    VALUE(RETURN) LIKE  BAPIRET2 STRUCTURE  BAPIRET2
*"  TABLES
*"      SELECTION STRUCTURE  BAPI6200SL
*"      CHARACTERISTICS STRUCTURE  BAPI6200FD
*"      KEYFIGURES STRUCTURE  BAPI6200FD
*"      DATA STRUCTURE  BAPI6100DA
*"----------------------------------------------------------------------

  zcl_aab=>break_point( 'Z_SFLIGHT_READ_REMOTE_DATA' ).

  DATA(iprov_srv) = NEW cl_rsdrv_remote_iprov_srv(
    i_th_iobj_fld_mapping = lcl_application=>get_t_iobj_2_fld( )
    i_tablnm              = 'SFLIGHT' ).

  iprov_srv->open_cursor(
    i_t_characteristics = characteristics[]
    i_t_keyfigures      = keyfigures[]
    i_t_selection       = selection[] ).

  iprov_srv->fetch_pack_data( IMPORTING e_t_data = data[] ).

  return-type = 'S'.

ENDFUNCTION.
This is how the BW Query is defined; it sends a complex selection to the Virtual Cube function module.
Service Class 2.jpg
Service Class 3.jpg
As you can see, the query reads the number of seats occupied on Airbus airplane types (global restriction) for All Carriers, Lufthansa and American Airlines, for each of the years 2015 and 2016. The following selection is sent to the Virtual Cube function module.
Service Class 4.jpg
Expression 0 corresponds to the global restriction, and expressions 1 through 6 correspond to the restricted key figures (All Carriers 2015, All Carriers 2016, Lufthansa 2015, Lufthansa 2016, American Airlines 2015 and American Airlines 2016).
The service class in our Virtual Cube function module is used in a way that generates a wrong SQL WHERE clause expression. It is not a problem with the service class as such, but with the way it is used.
Service Class 6.jpg
The BW Query results are wrong (the All Carriers data is just the sum of Lufthansa and American Airlines, i.e. the other carriers' data is missing).
Service Class 7.jpg
The problem is that the generated SQL WHERE clause expression does not follow the rule below:
E0  AND (  E1 OR E2 OR E3 ... OR EN ),
where E0 corresponds to the global restrictions and E1, E2, E3 ... EN to other restrictions.
The problem can easily be fixed by enhancing the CL_RSDRV_REMOTE_IPROV_SRV service class. What it takes is:

 

Service Class 8.jpg

Creation of BUILD_WHERE_CONDITIONS_COMPLEX method
Service Class 9.jpg
METHOD build_where_conditions_complex.
  DATA: wt_bw_selection TYPE tn_t_selection.
  DATA: wt_where TYPE rsdr0_t_abapsource.

* E0 AND ( E1 OR E2 OR E3 ... OR EN )
  LOOP AT i_t_selection INTO DATA(wa_bw_selection)
       GROUP BY ( expression = wa_bw_selection-expression )
       ASCENDING ASSIGNING FIELD-SYMBOL(<bw_selection>).
    CLEAR: wt_bw_selection,
           wt_where.
    LOOP AT GROUP <bw_selection> ASSIGNING FIELD-SYMBOL(<selection>).
      wt_bw_selection = VALUE #( BASE wt_bw_selection ( <selection> ) ).
    ENDLOOP.
    build_where_conditions( EXPORTING i_t_selection = wt_bw_selection
                            IMPORTING e_t_where     = wt_where ).
    CASE <bw_selection>-expression.
      WHEN '0000'.
        IF line_exists( i_t_selection[ expression = '0001' ] ).
          APPEND VALUE #( line = ' ( ' ) TO e_t_where.
        ENDIF.
        APPEND LINES OF wt_where TO e_t_where.
        IF line_exists( i_t_selection[ expression = '0001' ] ).
          APPEND VALUE #( line = ' ) AND ( ' ) TO e_t_where.
        ENDIF.
      WHEN OTHERS.
        IF <bw_selection>-expression > '0001'.
          APPEND VALUE #( line = ' OR ' ) TO e_t_where.
        ENDIF.
        APPEND VALUE #( line = ' ( ' ) TO e_t_where.
        APPEND LINES OF wt_where TO e_t_where.
        APPEND VALUE #( line = ' ) ' ) TO e_t_where.
        IF ( line_exists( i_t_selection[ expression = '0000' ] ) ) AND
           ( NOT line_exists( i_t_selection[ expression = <bw_selection>-expression + 1 ] ) ).
          APPEND VALUE #( line = ' ) ' ) TO e_t_where.
        ENDIF.
    ENDCASE.
  ENDLOOP.

ENDMETHOD.
The BUILD_WHERE_CONDITIONS_COMPLEX method contains the logic to build the selection according to the rule. It calls the original BUILD_WHERE_CONDITIONS method, using it as a building block. The new LOOP AT ... GROUP BY ABAP syntax is used to split the selection table into individual selections, convert them into SQL WHERE clause expressions, and combine them into the final expression as per the rule.

 

 

 

Implementation of the Overwrite-exit for the OPEN_CURSOR method
CLASS lcl_z_iprov_srv DEFINITION DEFERRED.
CLASS cl_rsdrv_remote_iprov_srv DEFINITION LOCAL FRIENDS lcl_z_iprov_srv.
CLASS lcl_z_iprov_srv DEFINITION.
  PUBLIC SECTION.
    CLASS-DATA obj TYPE REF TO lcl_z_iprov_srv.              "#EC NEEDED
    DATA core_object TYPE REF TO cl_rsdrv_remote_iprov_srv.  "#EC NEEDED
    INTERFACES iow_z_iprov_srv.

    METHODS:
      constructor IMPORTING core_object
                  TYPE REF TO cl_rsdrv_remote_iprov_srv OPTIONAL.
ENDCLASS.
CLASS lcl_z_iprov_srv IMPLEMENTATION.
  METHOD constructor.
    me->core_object = core_object.
  ENDMETHOD.

  METHOD iow_z_iprov_srv~open_cursor.
*"------------------------------------------------------------------------*
*" Declaration of Overwrite-method, do not insert any comments here please!
*"
*"methods OPEN_CURSOR
*"  importing
*"    !I_T_CHARACTERISTICS type CL_RSDRV_REMOTE_IPROV_SRV=>TN_T_IOBJ
*"    !I_T_KEYFIGURES type CL_RSDRV_REMOTE_IPROV_SRV=>TN_T_IOBJ
*"    !I_T_SELECTION type CL_RSDRV_REMOTE_IPROV_SRV=>TN_T_SELECTION .
*"------------------------------------------------------------------------*
    DATA:
      l_t_groupby  TYPE rsdr0_t_abapsource,
      l_t_sel_list TYPE rsdr0_t_abapsource,
      l_t_where    TYPE rsdr0_t_abapsource.

    core_object->build_select_list(
      EXPORTING
        i_t_characteristics = i_t_characteristics
        i_t_keyfigures      = i_t_keyfigures
      IMPORTING
        e_t_sel_list        = l_t_sel_list
        e_t_groupby         = l_t_groupby ).

    core_object->build_where_conditions_complex(
      EXPORTING
        i_t_selection = i_t_selection
      IMPORTING
        e_t_where     = l_t_where ).

* #CP-SUPPRESS: FP secure statement, no user input possible
    OPEN CURSOR WITH HOLD core_object->p_cursor FOR
      SELECT (l_t_sel_list) FROM (core_object->p_tablnm)
        WHERE (l_t_where)
        GROUP BY (l_t_groupby).

  ENDMETHOD.
ENDCLASS.
The OPEN_CURSOR overwrite-exit method has the same logic as the original method, except that the BUILD_WHERE_CONDITIONS_COMPLEX method is called instead of BUILD_WHERE_CONDITIONS.
Now that the changes are in place, let's run the report again and see what SQL WHERE clause expression is generated.
Service Class 10.jpg
Finally, let's run the report again and see if it shows correct data.
Service Class 11.jpg
Now the data is correct: All Carriers includes all data, not only Lufthansa and American Airlines.

Introduction to Roles and Authorizations in BW 7.4

The roles and authorizations maintained in BW 7.4 restrict access to reports at the InfoCube level, characteristic level, characteristic value level, key figure level, and hierarchy node level. These restrictions are maintained using the approach described below:

 

Authorizations are maintained in authorization objects.

Roles contain the Authorizations.

Users are assigned to roles

 

Capture 21.PNG

 

Transactions Used

InfoObject maintenance - RSD1

Role maintenance - PFCG

Roles and authorization maintenance - RSECADMIN

User creation - SU01

 

Note: A characteristic InfoObject must be authorization-relevant to make it available for restrictions. To make a characteristic authorization-relevant, go to the “Business Explorer” tab in the InfoObject details. Without the Authorization Relevant flag checked, we cannot use the object or include it in an authorization object.

 

Enter T code RSD1

Capture.PNG

Enter the InfoObject and click on Maintain.

Capture 1.PNG

Click on the Business Explorer tab, then select the Authorization Relevant checkbox. Now we can use this object in roles and authorizations.

 

SCENARIO:

In my scenario we want to create an authorization on InfoObject 0FUNCT_LOC with a hierarchy. Suppose the hierarchy has three levels and I have 3 users: User1, User2 and User3. User1 needs access to hierarchy level 1 data, User2 to hierarchy level 2, and User3 to hierarchy level 3. For that we need to follow these steps.

 

Creating Roles and Authorization objects

Creating Authorization objects

Enter T code RSECADMIN

Capture 2.PNG

Then click on Ind. Maint.

 

 

cap2.png

Enter the Authorization name and click on create.

cap1.png

 

 

 

Maintain the short, medium and long descriptions, click on Insert Row, and enter the objects.

0TCAACTVT (Activity in Analysis Authorizations): grants authorization for different activities such as Change and Display; the default value is 03 (Display).

0TCAIPROV (Authorizations for InfoProvider): grants authorization for particular InfoProviders; the default value is *.

0TCAVALID (Validity of an Authorization): defines when authorizations are valid or not valid; the default value is *.

Then click on Insert Special Characteristics.

 

cap3.png

cap4.png

 

 

cap5.png

 

 

 

Now enter the InfoObject 0FUNCT_LOC, double-click on it, and go to the Hierarchy Authorizations tab.

Click on the Create option.

cap6.pngcap7.png

 

  

 

Select Hierarchy and click on Browse.

cap8.png

Select Node Details and click on Browse.

 

 

Select the required node on the left side and move it to the right side, according to what the particular user needs.

Then select the particular Type of Authorization:

Capture 12.PNG

Then click on Continue.

Now click on the User tab.

Capture 13.PNG

 

 

 

Click on Indvl Assignment; the screen below will then appear.

cap10.png

 

Enter the User and click on Role Maintenance.

cap11.png

 

Click on Create Single Role.

cap12.png

 

Enter the description and click on the Change Authorization Data icon.

 

 

cap13.png

 

Add the objects marked above and click on the Generate icon.

Now go to the User tab and enter the required users.

 

cap14.png

 

Click on User Comparison; we then get the screen below.

cap15.png

If we want to give access to a particular T-code, go to the Menu tab, click on Add, and add that T-code; the screen will then appear like this.

 

Capture 20.PNG

Enter the T-code, click on Assign Transactions, and save it.

Now log in to the Analyzer or SAP BW with this user.

For User2 and User3 we also need to follow the same steps.

Overview of Remodeling

 

If we want to modify a DSO into which data has already been loaded, we can use remodeling to change the structure of the object without losing data.

If we want to change a DSO that no data has been loaded into yet, we can change it in DSO maintenance.

 

We may want to change an InfoProvider that has already been filled with data for the following reasons:

 

We want to replace an InfoObject in an InfoProvider with another, similar InfoObject; for example, we have created an InfoObject ourselves but want to replace it with a BI Content InfoObject.

 

Prerequisites

 

As a precaution, make a backup of your data before you start remodeling. In addition, ensure that:

we have stopped any process chains that run periodically and affect the corresponding InfoProvider. Do not restart these process chains until remodeling is finished.

There is enough tablespace available in the database.

After remodeling, we have to check which BI objects connected to the InfoProvider (for example, transformation rules, MultiProviders) have been deactivated; we have to reactivate these objects manually. Remodeling makes existing queries that are based on the InfoProvider invalid, so we have to adjust these queries manually according to the remodeled InfoProvider. If, for example, we have deleted an InfoObject, we also have to delete it from the query.

 

Features

A remodeling rule is a collection of changes to your DSO that are executed simultaneously.

For DSO, you have the following remodeling options:

For characteristics:

Insert or replace characteristics with:

Constants

An attribute of an InfoObject within the same dimension

A value of another InfoObject within the same dimension

A customer exit (for user-specific code)

Delete

For key figures:

Insert:

Constant

A customer exit (for user-specific code)

Replace with: a customer exit (for user-specific code)

Delete. You cannot replace or delete units; this avoids having key figures in the DSO without the corresponding unit.


Implementation of the Remodeling Procedure: To carry out the remodeling procedure, right-click on your DSO and, in the context menu, navigate to Additional Functions -> Remodeling.


Capture.PNG

We will get the following window after clicking on Remodeling. Enter a remodeling rule name and press Create to create a new rule.

Capture1.PNG

After clicking on Create, we will get the following pop-up window, where we have to enter a description for the rule we wish to create (as shown below).

Capture2.PNG

After entering the description, press the Create button. We will see the following screen.


           Capture4.PNG                                                                              

As we can see, the left pane shows the structure of the DSO in consideration.

To add a new remodeling rule, click on the green plus sign in the top-left corner of your screen (also circled in red below). It is called the Add Operation to List button.

 

Capture4.PNG

You will get the following pop-up where you can add the remodeling rules.

Capture5.PNG

Business Requirement

The requirement is as follows:

To delete the Time Characteristic 0CALDAY from the data fields.

To add 0COMP_CODE to the key fields with constant value 1100.

To delete the key figure Revenue(ZREVU8) as it is no longer relevant for reporting in this DSO.

We will implement these requirements one by one.

 

In the pop-up that opened in the last step, select the Delete Characteristic radio button and enter the technical name of the characteristic you wish to delete (0CALDAY in this case).

Capture 6.PNG

Confirm by pressing the Create button.

capture 9.PNG


Adding characteristic 0COMP_CODE with value 1100 to the key fields of the DSO.

Capture 7.PNG           

 

 

We need to check the As Key Field checkbox; we can also specify a particular position if we want.

Click on the Create button.


capture 10.PNG

To delete the key figure, we need to follow these steps.

Capture 8.PNG

Then click on the Create button.

                                                                                                 

capture 11.PNG



After that, click on Activate and Simulate, then go to the Schedule option.

Capture12.PNG

Once the simulation is done, click on Continue; the schedule screen now appears.

Capture 13.PNG

Select the Immediate option; the screen below then appears, where we need to select the Save option.

capture 14.PNG

Now we get a message like this.

Capture 15.PNG

If we want to see the job, click on Jobs and check it there. After that, the DSO will be inactive and we need to activate it.

Capture 16.PNG

The remodeling has now been done successfully on the DSO.

 



Hi,

 

Anyone who has ever tried to create a pivot table on top of the BEx Analyzer output will have experienced this issue.

When displaying key and text for an info object, the column header for the key is filled, but the text column remains empty without a header.

This makes it impossible to create a pivot table on top of it.

 

Using the Callback macro in BEx 7.x, it is possible to scan the column headers in the result area and put in a custom text.

In this blog I describe how to do this.

 

First of all, run the query in BEx Analyzer.

 

After running the query, go to View --> Macros --> View Macros,

select CallBack and press Edit.

macro screen.jpg

 

Scroll down to the following piece of code:

callback macro before.JPG

After the End With and before End If, insert the following lines:

 

    'set column headers for key + text

    Dim nrCol As Long

    Dim resultArea As Range

    Set resultArea = varname(1)

    nrCol = resultArea.Columns.Count

    For i = 1 To nrCol - 1

        If resultArea.Cells(1, i + 1) = "" Then

            resultArea.Cells(1, i + 1) = resultArea.Cells(1, i) & "_text"

        End If

    Next i

 

This code will put suffix _text in the column header, based on the preceding column header.

 

The end result in the macro then looks like this:

callback macro after.JPG

After refreshing the query, you will now see the column headers being added based on the previous column header, with _text behind it.

 

Hope this will help a lot of people.

 

Best regards,

Arno
