
ABAP Development


9/30/14 UPDATE: I've begun work on a tool like I describe here.  It's detailed in this post: Show SCN: The better logger I promised

 

Do you ever have this problem?  You're in the middle of writing a fantastic ABAP program.  It hums, it purrs, it looks reeeeally good.  The code is so clean Uncle Bob would be proud.  You use the best design patterns and your application is modularized well.  Then you realize you need to log a few lines of text to view later in SLG1.  Your heart sinks.  Your robust, clean application is about to become polluted by countless 10-line calls to FM 'BAL_LOG_CREATE' and its fellow hoodlums, all because you wanted to save the output of a BAPI you called.  First world problems, AMIRITE?  Anyone?


The obvious solution is to put an object-oriented wrapper around these functions.  This solution is so obvious, in fact, that every department in SAP has written their own version of it, and my company has added two more.  This is consistent with SAP's approach to developer tools: Quantity Guaranteed.


So why do we need another OO wrapper?  Well, because I believe in the power of collaboration, for one.  None of the other logs have been designed by the developer community.  Another reason is that logs have come a long way since 1999, when SAP released their function modules for the application log, and developers are used to a more concise syntax.  For instance, if I want to write to the console in JavaScript, it's:


console.log('The System is Down');


but in SAP, I have to declare 2 variables, then spend a page of code calling function modules BAL_LOG_CREATE, BAL_LOG_MSG_ADD, and BAL_DB_SAVE.  Part of the reason is that SAP has multiple logs, while a web browser only logs messages to one console.  So when you log anything in SAP, it must go to a specific application log object and sub-object.  Android (Java) also writes to multiple logs.  Its logs are called with two arguments, tag and message, like:


Log.e("SystemMsgs", "The system is Down");
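
For comparison, here is roughly what that "page of code" looks like today with the classic function modules (a hedged sketch only; it reuses the ZFICO/INTERFACES object from the example further down and keeps error handling minimal):

DATA: ls_log        TYPE bal_s_log,
      ls_msg        TYPE bal_s_msg,
      lv_log_handle TYPE balloghndl.

ls_log-object    = 'ZFICO'.
ls_log-subobject = 'INTERFACES'.

CALL FUNCTION 'BAL_LOG_CREATE'
  EXPORTING
    i_s_log      = ls_log
  IMPORTING
    e_log_handle = lv_log_handle
  EXCEPTIONS
    OTHERS       = 1.

" one success message via the generic message class 00, message 001 (placeholder text)
ls_msg-msgty = 'S'.
ls_msg-msgid = '00'.
ls_msg-msgno = '001'.
ls_msg-msgv1 = 'The interface has finished'.

CALL FUNCTION 'BAL_LOG_MSG_ADD'
  EXPORTING
    i_log_handle = lv_log_handle
    i_s_msg      = ls_msg
  EXCEPTIONS
    OTHERS       = 1.

CALL FUNCTION 'BAL_DB_SAVE'
  EXPORTING
    i_save_all = 'X'
  EXCEPTIONS
    OTHERS     = 1.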


If logging in ABAP were going to be just as awesome (and I know it can be), what would it look like?  Please post your ideas and discuss!  Once enough good ideas are gathered, I'd like to start building something on GitHub.  Anyone is welcome to contribute code or just ideas and insights.  Here's an example of what I think would make a good application log API:


log = zcl_app_log=>get( object = 'ZFICO' subobject = 'INTERFACES' ).

TRY.

    zcl_accounting=>do_some_important_upload( ).

    log->add( 'The interface has finished' ).

  CATCH zcx_upload_failed INTO err.

    log->add( err ).

ENDTRY.

Recently I had a situation where I needed to update MIGO's defaults programmatically. I tried to search for a solution but couldn't find it mentioned anywhere, and hence decided to write it down. I hope that it'll be useful for anyone in a similar situation.

 

Background:


MIGO is a multi-purpose transaction which can be used for various activities in materials management - performing goods movements, receipt of purchase orders, etc. It replaces many older transactions. There are others as well, but to list a few:

 

  • Post a goods receipt to a known purchase order (transaction MB01)
  • Change a material document from goods receipts (transaction MB02)
  • Display a material document from goods receipts (transaction MB03)
  • Cancel a material document from goods receipts (transaction MBST)

 

To understand why we may ever need to modify MIGO defaults, consider the situation below.

 

- A file arrives for processing in SAP (say via a batch job).

 

- The initial goods movement is attempted using a BAPI. If the processing fails, an SM35 BDC session is created which can be reprocessed by users (the users are familiar with MIGO). A custom transaction for reprocessing is created, as the users couldn't be given access to SM35: SM35 allows you to display and potentially reprocess all sessions, which can be tricky if the authorisations are a bit leaky.

 

 

MIGO13.png

The failed sessions can then be processed by a custom transaction - the SM35 sessions are persisted to a temporary table which is used by the reprocessing transaction.

 

Problem:

 

Everything looks good in theory: all automatic postings are done in the background and any errors can be handled by the custom transaction. However, while the user is reprocessing the MIGO sessions, something interesting happens - if the user opens a parallel MIGO session and performs some other processing in parallel, the subsequent sessions start to fail in the custom transaction. Users could be processing multiple sessions sequentially and might go away and do some other movements in parallel in MIGO.

2.png

 

Why does this happen?

 

MIGO stores the user's defaults to optimise usage, so that the user doesn't have to set the selections again - this causes the defaults to be set whenever you use the transaction. The parallel session which the user opened has overridden the defaults and, as a result, subsequent failed sessions have different default values set on the screen, even though the BDC recording used in the SM35 session was set correctly. The user defaults override the values set by the BDC.

 

 

Looking at the image below, the BDC session has set the values A07 and R10 for the action and the sub-selection within the action.

MIGO3.png

 

However, if the user chooses something else in a parallel session (say A05 and a sub-selection), it overrides the action default, and subsequent SM35 sessions start failing, as MIGO will then start with A05 and that sub-selection.

 

Solution:

 

MIGO stores the user's defaults in table ESDUS, and these defaults correspond to the MIGO_FIRSTLINE action. Looking at the table entries below, the settings for the user are:

 

Default is action = A07

and sub-selection for A07 is R10.

 

Hence, A07 / R10 will appear for the action and sub-selection (as shown in the image above).

 

MIGO4.png
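
If you want to check the entries for your own user, a quick query on ESDUS shows them (a small sketch; UNAME, ACTION, ELEMENT and ACTIVE are the fields of the standard table):

DATA: lt_esdus TYPE STANDARD TABLE OF esdus.

SELECT * FROM esdus INTO TABLE lt_esdus
  WHERE uname  = sy-uname
    AND action = 'MIGO_FIRSTLINE'.
" each row holds one ELEMENT (e.g. ACTION) and its currently ACTIVE value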

Show Me The Code:

 

Now that we know where they're stored, how do we update them?

 

Class CL_MMIM_USERDEFAULTS can be used to read and update the parameters. It's a singleton, so there should be only one instance at any given time. Consequently, if we're using it, we have to ensure the instance is destroyed afterwards. This is achieved by the FLUSH_ALL method of the class. The methods above are self-explanatory, and the constructor requires the ACTION value.

 

MIGO5.png

 

So I did the following:


- Instantiate the class using ACTION as "MIGO_FIRSTLINE" and set the instance values.
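
A minimal sketch of that instantiation (the constructor parameter name I_ACTION is my assumption; check the signature of CL_MMIM_USERDEFAULTS on your release):

DATA: o_migo_defaults TYPE REF TO cl_mmim_userdefaults,
      lv_action       TYPE esdus-active VALUE 'A07'.

CREATE OBJECT o_migo_defaults
  EXPORTING
    i_action = 'MIGO_FIRSTLINE'.  " assumed parameter name for the ESDUS action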


- Set the values:


o_migo_defaults->set( i_element = 'ACTION'
                      i_active  = lv_action ).

- Flush the value to dB and destroy the instance


o_migo_defaults->flush( ).     "Save values to the DB
o_migo_defaults->flush_all( ). "Destroy this instance, as other instances would otherwise start producing errors


The table has other values used in MIGO defaults (e.g. the default movement type for an action), and these can be updated in the same way.

There are several use cases where you need to convert a SOLIX table to an XSTRING. For example, when you are using SAP Gateway for downloading files from SAP Office, you will need the conversion from SOLIX to XSTRING.

 

You can use method cl_bcs_convert=>solix_to_xstring or FM  SCMS_BINARY_TO_XSTRING for this conversion.

 

Now, the issue: SOLIX is a table of RAW 255, which means every line of this table holds 255 raw bytes. When you don't specify the output length of the XSTRING variable, the empty raw bytes from the last line are converted as well, which means that you will supply a corrupted file. It is readable in most cases, but Microsoft Word, for example, will show a warning message that the file is corrupted, though it can still be opened afterwards.

 

The solution is to set the number of bytes that need to be converted (the actual file size, not the number of table lines multiplied by the line size), as can be seen in the following example:

 

    DATA:
      lt_file     TYPE solix_tab,
      lv_length   TYPE i,
      ls_doc_data TYPE sofolenti1.

    CALL FUNCTION 'SO_DOCUMENT_READ_API1'
      EXPORTING
        document_id                = iv_id
      IMPORTING
        document_data             = ls_doc_data
      TABLES
        contents_hex               = lt_file
      EXCEPTIONS
        document_id_not_exist      = 1
        operation_no_authorization = 2
        x_error                    = 3
        OTHERS                     = 4.
    IF sy-subrc = 0.
      lv_length = ls_doc_data-doc_size.

      IF lt_file IS NOT INITIAL.
        CALL FUNCTION 'SCMS_BINARY_TO_XSTRING'
          EXPORTING
            input_length = lv_length
          IMPORTING
            buffer       = ev_xstring_content
          TABLES
            binary_tab   = lt_file
          EXCEPTIONS
            failed       = 1
            OTHERS       = 2.
      ENDIF.
    ENDIF.
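
Alternatively, the same length-aware conversion can be done with the class mentioned above (a sketch; I am assuming the usual IT_SOLIX / IV_SIZE parameter names of cl_bcs_convert=>solix_to_xstring):

    IF lt_file IS NOT INITIAL.
      ev_xstring_content = cl_bcs_convert=>solix_to_xstring(
                             it_solix = lt_file
                             iv_size  = lv_length ).
    ENDIF.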

Here is an example of creating Word documents from templates stored in the web repository (SMW0) using WordprocessingML.

A very simple form can be done in around an hour of time.

 

USE CASE

    Prepare the template that needs to be filled with data.

Exmaple 1.png

 

Write the program that will fill the document.

 

  data: LO_FORM type ref to Z_MWL_FORM.

 

   data: L_TEMPLATE type STRING value 'block'.

   data: L_INDEX type I value 1.

   data: L_NUM type I.

 

   create object LO_FORM

     exporting

       I_TEMPLATE = 'Z_TEST'.

 

 

   LO_FORM->REPLICATE(

    I_TEMPLATE_ID = L_TEMPLATE

    I_COPY_NUM = 2

   ).

 

 

   LO_FORM->PREP_SEQ_ACCESS( ).

 

   if LO_FORM->FIND_VARIABLE( 'name' ) eq ABAP_TRUE.

     LO_FORM->SET_VALUE( 'Zhambyl').

   endif.

 

   if LO_FORM->FIND_VARIABLE( 'age' ) eq ABAP_TRUE.

     LO_FORM->SET_VALUE( '25' ).

   endif.

 

   if LO_FORM->FIND_VARIABLE( 'gender' ) eq ABAP_TRUE.

     LO_FORM->SET_VALUE( 'male' ).

   endif.

 

   do 2 times.

 

     if LO_FORM->FIND_BLOCK( L_INDEX ) eq ABAP_TRUE.

       if LO_FORM->FIND_VARIABLE( 'var1' ) eq ABAP_TRUE.

         LO_FORM->SET_VALUE( '1' ).

       endif.

 

       if LO_FORM->FIND_VARIABLE( 'var2' ) eq ABAP_TRUE.

         LO_FORM->SET_VALUE( '2' ).

       endif.

 

       if LO_FORM->FIND_VARIABLE( 'var3' ) eq ABAP_TRUE.

         LO_FORM->SET_VALUE( '3' ).

       endif.

 

     endif.

     add 1 to L_INDEX.

   enddo.

 

   LO_FORM->FINISH_SEQ_ACCESS( ).

 

   LO_FORM->CLEAN(  ).

   LO_FORM->DISPLAY( ).

 

Open the filled document on client machine.

Example 2.png

 

 

Implementation details

 

class Z_MWL_FILE definition

  public

   create public .

 

  public section.

 

     data EXTENSION type STRING .

     data TEMPDIR type STRING .

     data BSTRING type XSTRING.

 

     methods: DOWNLOAD " download file from web repository

       importing

         VALUE(I_TEMPLATE) type STRING

          returning

         VALUE(R_SUBRC) like SY-SUBRC.

 

     methods GET_BSTRING " returns xstring representation of file

       returning

         VALUE(R_STRING) type XSTRING.

 

     methods GET_TEMP_DIR " choose file storage location

       returning

         VALUE(R_PATH) type string.

 

     methods SAVE_ON_FRONTEND " save the file on the client machine

       importing

         VALUE(I_STRING) type XSTRING

       returning

         VALUE(R_SUBRC) like SY-SUBRC.

 

   protected section.

   private section.

 

ENDCLASS.

 

 

 

CLASS Z_MWL_FILE IMPLEMENTATION.

 

   method DOWNLOAD.

 

     data: LS_KEY type WWWDATATAB.

     data: LS_MIME type W3MIME.

     data: LT_MIME type standard table of W3MIME.

     field-symbols <LFS_DATA> type ANY.

 

     LS_KEY-RELID = 'MI'.

     LS_KEY-OBJID = I_TEMPLATE.

 

     call function 'WWWDATA_IMPORT'

       exporting

         KEY               = LS_KEY

       tables

         MIME              = LT_MIME

       exceptions

         WRONG_OBJECT_TYPE = 1

         IMPORT_ERROR      = 2

         others            = 3.

 

     if SY-SUBRC eq 0.

       loop at LT_MIME into LS_MIME.

         assign LS_MIME to <LFS_DATA> casting type ('X').

         if <LFS_DATA> is assigned.

           concatenate BSTRING <LFS_DATA> into BSTRING in byte mode.

           unassign <LFS_DATA>.

         endif.

       endloop.

     else.

       R_SUBRC =  SY-SUBRC.

     endif.

 

   endmethod.                    "DOWNLOAD

 

   method GET_BSTRING.

     R_STRING = BSTRING.

   endmethod.                    "GET_BSTRING

 

  method GET_TEMP_DIR.


     data: L_WTITLE type STRING.

     data: L_NAME type STRING.

     data: L_FPATH type STRING.

 

     L_WTITLE = 'CHOOSE FILE STORAGE LOCATION'.

 

     CL_GUI_FRONTEND_SERVICES=>FILE_SAVE_DIALOG(

           exporting

             WINDOW_TITLE = L_WTITLE

             DEFAULT_EXTENSION = 'docx'

             FILE_FILTER = 'docx'

           changing

             FILENAME = L_NAME

             PATH = TEMPDIR

             FULLPATH = L_FPATH ).

 

     CL_GUI_CFW=>FLUSH( ).

     R_PATH = L_FPATH.

 

   endmethod.                    "GET_TEMP_DIR

 

  method SAVE_ON_FRONTEND.

     data: LV_FILE_TAB     type standard table of SOLISTI1,

           LV_BYTECOUNT    type I.

     data: L_FPATH type STRING.

     call function 'SCMS_XSTRING_TO_BINARY'

       exporting

         BUFFER        = I_STRING

       importing

         OUTPUT_LENGTH = LV_BYTECOUNT

       tables

         BINARY_TAB    = LV_FILE_TAB.

     "Save the file

 

     L_FPATH = GET_TEMP_DIR( ).

     if L_FPATH is not initial.

       CL_GUI_FRONTEND_SERVICES=>GUI_DOWNLOAD(

         exporting

           BIN_FILESIZE = LV_BYTECOUNT

           FILENAME     = L_FPATH

           FILETYPE     = 'BIN'

         changing

           DATA_TAB     = LV_FILE_TAB

       ).

       if SY-SUBRC ne 0.

         R_SUBRC = SY-SUBRC.

       endif.

     else.

       R_SUBRC = 2.

     endif.

  endmethod.                    "SAVE_ON_FRONTEND

ENDCLASS.


Next, we create the main class that is used to manipulate the Word document.

This class relies on classes found in packages: S_OOXML_CORE, SXML and XSLT transformations.

 

class Z_MWL_FORM definition

   public

   create public .

 

   public section.

     data: MAIN_PART type XSTRING .

     data: DOCUMENT type XSTRING .

     data: INTRM_PART type XSTRING .

     data: FINAL_DOC type XSTRING.

 

     methods CONSTRUCTOR importing I_TEMPLATE type STRING. " Finds the main part of the Word document in the zip package container and
                                                           " stores it. I_TEMPLATE is the logical name of the file in SMW0

    methods DISPLAY.                   " This method packages the updated main part and uploads it to the front end. Don't mind the name.

 

     methods REPLICATE                 " Replicates a marked block of text using transformations and replaces the standard markup with custom markup

      importing I_TEMPLATE_ID type STRING

                I_COPY_NUM type I.

 

     methods: FIND_VARIABLE  " Finds tag named variable using sxml. I_var is a value for name attribute of this tag.

               importing I_VAR type STRING

              returning VALUE(RV_FOUND) type ABAP_BOOL.

 

     methods: FIND_BLOCK importing I_BLOCK type I         " Finds the tag named block using sXML. I_BLOCK is the value of the num attribute of this tag.
              returning VALUE(RV_FOUND) type ABAP_BOOL.   " A block contains several variables that can be copied with different block numbers

 

    methods: SET_VALUE importing I_VAL type STRING.      " Replaces value of placeholder variable

 

     methods: PREP_SEQ_ACCESS.    " Converts the xstring to sXML objects and prepares them for sequential access
     methods: FINISH_SEQ_ACCESS.  " Converts from sXML back to the xstring representation
     methods CLEAN.               " Clears all the custom markup from the main part of the Word document

   protected section.

     data: O_FILE               type ref to Z_MWL_FILE.

     data: O_DOC               type ref to CL_DOCX_DOCUMENT.

     data: O_DOCUMENTPART       type ref to CL_DOCX_MAINDOCUMENTPART.

 

     data: O_SREADER type ref to IF_SXML_READER.

     data: O_SWRITER type ref to IF_SXML_WRITER.

     data: O_SNODE  type ref to IF_SXML_NODE.

     data: O_SVALUE_NODE  type ref to IF_SXML_VALUE_NODE.

   private section.

 

ENDCLASS.

 

 

 

CLASS Z_MWL_FORM IMPLEMENTATION.

 

   method CLEAN.

     if INTRM_PART is not initial.

       call transformation Z_CLEAN

       source xml INTRM_PART

       result xml FINAL_DOC.

     endif.

   endmethod.                    "CLEAN

 

   method CONSTRUCTOR.

     create object O_FILE.

     O_FILE->DOWNLOAD( I_TEMPLATE ).

     DOCUMENT = O_FILE->GET_BSTRING( ).

     try.

         O_DOC = CL_DOCX_DOCUMENT=>LOAD_DOCUMENT( IV_DATA = DOCUMENT ).

* get the maindocument part

         O_DOCUMENTPART = O_DOC->GET_MAINDOCUMENTPART( ).

         MAIN_PART = O_DOCUMENTPART->GET_DATA( ).

 

       catch CX_OPENXML_FORMAT.

       catch CX_OPENXML_NOT_ALLOWED.

       catch CX_OPENXML_NOT_FOUND.

       catch CX_TRANSFORMATION_ERROR.

     endtry.

   endmethod.                    "constructor

 

   method DISPLAY.

     if FINAL_DOC is not initial.

       O_DOCUMENTPART->FEED_DATA( FINAL_DOC ).

     elseif MAIN_PART is not initial.

       O_DOCUMENTPART->FEED_DATA( MAIN_PART ).

     endif.

     FINAL_DOC = O_DOC->GET_PACKAGE_DATA( ).

     if O_FILE->SAVE_ON_FRONTEND( FINAL_DOC ) ne 0.

       message 'Download cancelled' type 'S'.

     endif.

   endmethod.                    "Display

 

   method FIND_BLOCK.

     data: LX_ROOT type ref to CX_SXML_ERROR.

     data: LO_OPELEM type ref to IF_SXML_OPEN_ELEMENT.

     data: L_AT_VAL type ref to IF_SXML_VALUE.

     data: L_VAL type STRING.

 

     if O_SREADER is bound and O_SWRITER is bound.

 

       while RV_FOUND ne ABAP_TRUE.

         try.

             O_SNODE = O_SREADER->READ_NEXT_NODE( ).

             if O_SNODE is initial.

               exit.

             endif.

             if O_SNODE->TYPE eq IF_SXML_NODE=>CO_NT_ELEMENT_OPEN.

               LO_OPELEM ?= O_SNODE.

               if LO_OPELEM->IF_SXML_NAMED~QNAME-NAME eq 'block'.

                 L_AT_VAL = LO_OPELEM->GET_ATTRIBUTE_VALUE( 'num' ).

                 L_VAL = L_AT_VAL->GET_VALUE( ).

                 if L_VAL eq I_BLOCK.

                   RV_FOUND = ABAP_TRUE.

                 endif.

               endif.

             endif.

             O_SWRITER->WRITE_NODE( O_SNODE ).

           catch CX_SXML_ERROR into LX_ROOT.

             exit.

         endtry.

       endwhile.

     endif.

   endmethod.                    "FIND_BLOCK

 

   method FIND_VARIABLE.

     data: LX_ROOT type ref to CX_SXML_ERROR.

     data: LO_OPELEM type ref to IF_SXML_OPEN_ELEMENT.

     data: L_AT_VAL type ref to IF_SXML_VALUE.

     data: L_VAL type STRING.

 

     if O_SREADER is bound and O_SWRITER is bound.

 

       while RV_FOUND ne ABAP_TRUE.

         try.

             O_SNODE = O_SREADER->READ_NEXT_NODE( ).

             if O_SNODE is initial.

               exit.

             endif.

             if O_SNODE->TYPE eq IF_SXML_NODE=>CO_NT_ELEMENT_OPEN.

               LO_OPELEM ?= O_SNODE.

               if LO_OPELEM->IF_SXML_NAMED~QNAME-NAME eq 'variable'.

                 L_AT_VAL = LO_OPELEM->GET_ATTRIBUTE_VALUE( 'mark' ).

                 L_VAL = L_AT_VAL->GET_VALUE( ).

                 if L_VAL eq I_VAR.

                   RV_FOUND = ABAP_TRUE.

                 endif.

               endif.

             endif.

             O_SWRITER->WRITE_NODE( O_SNODE ).

           catch CX_SXML_ERROR into LX_ROOT.

             exit.

         endtry.

       endwhile.

 

       clear RV_FOUND.

 

       while RV_FOUND ne ABAP_TRUE.

         try.

             O_SNODE = O_SREADER->READ_NEXT_NODE( ).

             if O_SNODE is initial.

               exit.

             endif.

             if O_SNODE->TYPE eq IF_SXML_NODE=>CO_NT_ELEMENT_OPEN.

               LO_OPELEM ?= O_SNODE.

               if LO_OPELEM->IF_SXML_NAMED~QNAME-NAME eq 't'.

                 RV_FOUND = ABAP_TRUE.

               endif.

             endif.

             O_SWRITER->WRITE_NODE( O_SNODE ).

           catch CX_SXML_ERROR into LX_ROOT.

             exit.

         endtry.

       endwhile.

 

       clear RV_FOUND.

 

       while RV_FOUND ne ABAP_TRUE.

         try.

             O_SNODE = O_SREADER->READ_NEXT_NODE( ).

             if O_SNODE is initial.

               exit.

             endif.

             if O_SNODE->TYPE eq IF_SXML_NODE=>CO_NT_VALUE.

               O_SVALUE_NODE ?= O_SNODE.

               RV_FOUND = ABAP_TRUE.

               exit.

             endif.

             O_SWRITER->WRITE_NODE( O_SNODE ).

           catch CX_SXML_ERROR into LX_ROOT.

             exit.

         endtry.

       endwhile.

     endif.

   endmethod.                    "FIND_VARIABLE

 

   method FINISH_SEQ_ACCESS.

     data: LX_ROOT type ref to CX_SXML_ERROR.

     data: LO_WRITER type ref to CL_SXML_STRING_WRITER.

     if O_SREADER is not initial and O_SWRITER is bound.

       do.

         try.

             O_SNODE = O_SREADER->READ_NEXT_NODE( ).

             if O_SNODE is initial.

               exit.

             endif.

             O_SWRITER->WRITE_NODE( O_SNODE ).

           catch CX_SXML_ERROR into LX_ROOT.

             exit.

         endtry.

       enddo.

 

       try.

 

           LO_WRITER ?= O_SWRITER.

           INTRM_PART = LO_WRITER->GET_OUTPUT( ).

         catch CX_SXML_ERROR into LX_ROOT.

           exit.

       endtry.

     endif.

   endmethod.                    "FINISH_SEQ_ACCESS

 

   method PREP_SEQ_ACCESS.

     if INTRM_PART is not initial.

       O_SREADER ?= CL_SXML_STRING_READER=>CREATE( INTRM_PART ).

       O_SWRITER ?= CL_SXML_STRING_WRITER=>CREATE( ).

     endif.

   endmethod.                    "prep_seq_access

 

   method REPLICATE.


     if INTRM_PART is initial.

       call transformation Z_REPLICATE

       source xml MAIN_PART

       result xml INTRM_PART

       parameters TEMPLATE_ID = I_TEMPLATE_ID

                  COPY_NUM = I_COPY_NUM.

     else.

 

       call transformation Z_REPLICATE

       source xml INTRM_PART

       result xml INTRM_PART

       parameters TEMPLATE_ID = I_TEMPLATE_ID

                  COPY_NUM = I_COPY_NUM.

     endif.

   endmethod.                    "replicate

 

 

  method  SET_VALUE.

     data: LX_ROOT type ref to CX_SXML_ERROR.

     data: L_XSTRING type XSTRING.

 

*  L_XSTRING = CL_ABAP_CODEPAGE=>CONVERT_TO( I_VAL ).

 

     if O_SVALUE_NODE is bound and O_SWRITER is bound.

       try.

           O_SVALUE_NODE->IF_SXML_VALUE~SET_VALUE( I_VAL ).

           O_SWRITER->WRITE_NODE( O_SNODE ).

         catch CX_SXML_ERROR into LX_ROOT.

           exit.

       endtry.

     endif.

  endmethod.                    "set_value

ENDCLASS.


Following are the transformations used in the class described above.


Transformation z_replicate


<xsl:transform xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:sap="http://www.sap.com/sapxsl" xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main" version="1.0">
   <xsl:output encoding="UTF-8" indent="yes" method="xml" omit-xml-declaration="no" standalone="yes"/>
   <xsl:param name="TEMPLATE_ID"/>
   <xsl:param name="COPY_NUM" sap:type="number"/>
   <xsl:template match="node()|@*">
     <xsl:copy>
       <xsl:apply-templates select="node()|@*"/>
     </xsl:copy>
   </xsl:template>
   <xsl:template match="w:sdt">
     <xsl:choose>
       <xsl:when test="descendant::w:tag[@w:val=$TEMPLATE_ID]">
         <xsl:call-template name="multiply">
           <xsl:with-param name="maxCount" select="$COPY_NUM"/>
           <xsl:with-param name="nodeToCopy" select="."/>
         </xsl:call-template>
       </xsl:when>
       <xsl:otherwise>
         <xsl:element name="variable">
           <xsl:attribute name="mark">
             <xsl:value-of select="descendant::w:tag/@w:val"/>
           </xsl:attribute>
           <xsl:apply-templates select="w:sdtContent/node()|@*" mode="variable"/>
         </xsl:element>
       </xsl:otherwise>
     </xsl:choose>
   </xsl:template>
   <xsl:template name="multiply">
     <xsl:param name="maxCount"/>
     <xsl:param name="i" select="1"/>
     <xsl:param name="nodeToCopy"/>
     <xsl:choose>
       <xsl:when test="$i &lt;= $maxCount">
         <xsl:element name="block">
           <xsl:attribute name="num">
             <xsl:value-of select="$i"/>
           </xsl:attribute>
           <!--          <xsl:copy-of select="$nodeToCopy/w:sdtContent/node()|@*"/>-->
           <xsl:apply-templates select="$nodeToCopy/w:sdtContent/node()|@*"/>
         </xsl:element>
         <xsl:call-template name="multiply">
           <xsl:with-param name="maxCount" select="$maxCount"/>
           <xsl:with-param name="nodeToCopy" select="$nodeToCopy"/>
           <xsl:with-param name="i" select="$i+1"/>
         </xsl:call-template>
       </xsl:when>
       <xsl:otherwise/>
     </xsl:choose>
   </xsl:template>
   <xsl:template match="node()|@*" mode="variable">
     <xsl:copy>
       <xsl:apply-templates select="node()|@*" mode="variable"/>
     </xsl:copy>
   </xsl:template>
   <xsl:template match="w:sdtContent/w:r[1]" mode="variable">
     <xsl:copy>
        <xsl:apply-templates select="node()|@*" mode="variable"/>
     </xsl:copy>
   </xsl:template>
    <xsl:template match="w:sdtContent/w:r[position() != 1]" mode="variable">
   </xsl:template>
</xsl:transform>



Transformation z_clean

<xsl:transform xmlns:xsl="http://www.w3.org/1999/XSL/Transform" xmlns:sap="http://www.sap.com/sapxsl" xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main" version="1.0">
   <xsl:output encoding="UTF-8" indent="yes" method="xml" omit-xml-declaration="no" standalone="yes"/>
   <xsl:template match="node()|@*">
     <xsl:copy>
       <xsl:apply-templates select="node()|@*"/>
     </xsl:copy>
   </xsl:template>
   <xsl:template match="variable">
       <xsl:apply-templates select="node()|@*"/>
   </xsl:template>
   <xsl:template match="block">
       <xsl:apply-templates select="node()|@*"/>
   </xsl:template>
</xsl:transform>




I know this may not be universally agreed upon, but I think it's something worthy of being discussed and so I decided to explain my reasoning behind it.

Although I started my programming career in ABAP, over the years I've programmed in Java, C#, Objective-C, etc. In these languages an object has a name and a fully qualified name:

 

  • Name: Delivery
  • Fully Qualified Name: com.sd.litigations.delivery

 

The benefit of this structure is that the "litigations" team doesn't have ownership of the word "delivery", like a trademark. There can be another object called delivery in another package like "com.sd.whatever".

 

In ABAP you have something similar, the custom namespaces, which you may have seen in SAP developments as /<application>/, but they are rarely used outside SAP (I've used this once in a project). Most of the time we all share the Z or Y namespace.

 

So in my arrogance I create a package called ZLITIGATIONS, go into SE24, and claim the name ZCL_DELIVERY for myself (like I've seen people do so many times...). Why should I be encouraged to do this? Is my litigations application more important than some other package? No.

 

That's why in my projects we use some sort of package symbol in the class name, like ZLT_DELIVERY. This way someone else can create a DELIVERY object without it feeling like "second best", besides making it easier to search for the class with IntelliSense (in ABAP in Eclipse).

 

PS: I know some will question why there would be two delivery classes in the first place, but let's keep that for another discussion. Think of another less widely used object, the same reasoning applies.

Hello guys

 

I am trying to get the hang of design patterns in ABAP and how to apply them.

 

I hope this is going to be one of many posts I will write on patterns.

 

As we write more code, things start to look very similar, and here is what I learnt from reading about Java design patterns:

 

 

Keep your code open for extension and closed for modification as much as possible!!

 

 

MVC Design Pattern

 

In Model view controller the main goal is:

to loosely couple the presentation layer from the backend layer!!

 

  • View is how I get input from the user and display output
  • Model is my business data
  • Controller is the manager between the two

There are always several distinct parts in an ABAP report:

 

Let's say an ALV report:

 

  • Selection screen (presentation layer)
  • Data layer (fetching of the data - SQL queries etc. - model layer)
  • Displaying the data (view layer)

 

So we could divide the code into 3 pieces: Model, View and Controller.

 

 

Controller might not be very necessary for small reports.

 

Still, having a controller is a good approach, as it acts as a mediator between the model and the view.

 

 

So what are the don'ts:

  • The view should never know much about the model
  • The model should never reference the view

Here is an example:

 

Let's say you write a report, and the users of this report should be able to display the data as an ALV, as a Smart Form, or even in PDF format!!

 

This is a great example for MVC, as the view layer can be changed like a skin, just as most apps (even web browsers) do these days.

 

Here is the code sample

 

Let's start with an example ALV that displays sales order data from tables VBAK and VBAP.

 

VIEW

 

Define an abstract view class that you can extend into many different types of views!!

 

***Display

class lcl_view DEFINITION ABSTRACT .

  PUBLIC SECTION.

    METHODS: display  ABSTRACT CHANGING it_data TYPE STANDARD TABLE.

ENDCLASS.

 

class lcl_view_alv  DEFINITION INHERITING FROM lcl_view .

  PUBLIC SECTION.

    METHODS: display REDEFINITION.

endclass.

 

****Smartforms view

CLASS lcl_view_smartforms DEFINITION INHERITING FROM lcl_view.

  PUBLIC SECTION.

    METHODS: display REDEFINITION.

ENDCLASS.

 

****PDF view

CLASS lcl_view_pdf DEFINITION INHERITING FROM lcl_view.

  PUBLIC SECTION.

    METHODS: display REDEFINITION.

ENDCLASS.

 

 

 

So now we have 3 different types of views!!
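
One way the ALV flavour could be implemented (just a sketch, using the standard CL_SALV_TABLE factory; error handling kept to a minimum):

CLASS lcl_view_alv IMPLEMENTATION.

  METHOD display.

    DATA: lo_salv TYPE REF TO cl_salv_table.

    TRY.
        " build a SALV grid over whatever table the controller hands us
        cl_salv_table=>factory(
          IMPORTING r_salv_table = lo_salv
          CHANGING  t_table      = it_data ).
        lo_salv->display( ).
      CATCH cx_salv_msg.
        " react to the error as needed
    ENDTRY.

  ENDMETHOD.

ENDCLASS.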

 

 

MODEL

 

 

Model is the business data that we will be getting

 

We could also make the model an abstract class and then extend it, but for simplicity not now!!

here is an example:

 

 

 

*****DATA Layer

class lcl_model DEFINITION.

  PUBLIC SECTION.

    TYPES: BEGIN OF ty_out,

      vbeln type vbap-vbeln,

      posnr type vbap-posnr,

      matnr type vbap-matnr,

      vkorg type vbak-vkorg,

      END OF ty_out.

    TYPES: tt_out TYPE STANDARD TABLE OF ty_out.

    TYPES: gr_vbeln TYPE RANGE OF vbap-vbeln.

    DATA: gt_output TYPE tt_out.

 

    methods: constructor,
             select_data IMPORTING VALUE(rs_vbeln) TYPE gr_vbeln.

ENDCLASS.
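
A possible implementation of the model (again just a sketch: a simple join on VBAK/VBAP that fills GT_OUTPUT):

CLASS lcl_model IMPLEMENTATION.

  METHOD constructor.
    " nothing to initialise in this simple example
  ENDMETHOD.

  METHOD select_data.
    " fetch the sales order items restricted by the document number range
    SELECT k~vbeln p~posnr p~matnr k~vkorg
      FROM vbak AS k
      INNER JOIN vbap AS p ON p~vbeln = k~vbeln
      INTO CORRESPONDING FIELDS OF TABLE gt_output
      WHERE k~vbeln IN rs_vbeln.
  ENDMETHOD.

ENDCLASS.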

 

 

CONTROLLER

The controller needs to know about both the view and the model so that it can notify them.

The responsibility to manage the tasks lies with the controller!!

 

 

class lcl_controller DEFINITION.

  PUBLIC SECTION.

    METHODS: constructor IMPORTING io_view_type TYPE CLIKE OPTIONAL.

    methods: get_data IMPORTING ir_vbeln TYPE lcl_model=>GR_VBELN.

    methods: set_view IMPORTING io_view_type TYPE REF TO lcl_view.

    methods: main IMPORTING  ir_vbeln TYPE lcl_model=>GR_VBELN

                              VALUE(iv_view_type) TYPE string.

    methods: display.

  PROTECTED SECTION.

    DATA: lo_view type REF TO lcl_view.

    DATA: lo_model type REF TO lcl_model.

ENDCLASS.
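
And a possible implementation of the controller (again a sketch; the view-type values 'ALV', 'SMARTFORMS' and 'PDF' are just assumed keys for this example):

CLASS lcl_controller IMPLEMENTATION.

  METHOD constructor.
    CREATE OBJECT lo_model.
  ENDMETHOD.

  METHOD set_view.
    lo_view = io_view_type.
  ENDMETHOD.

  METHOD get_data.
    lo_model->select_data( ir_vbeln ).
  ENDMETHOD.

  METHOD display.
    lo_view->display( CHANGING it_data = lo_model->gt_output ).
  ENDMETHOD.

  METHOD main.
    " pick the view "skin" requested by the caller
    CASE iv_view_type.
      WHEN 'ALV'.
        CREATE OBJECT lo_view TYPE lcl_view_alv.
      WHEN 'SMARTFORMS'.
        CREATE OBJECT lo_view TYPE lcl_view_smartforms.
      WHEN 'PDF'.
        CREATE OBJECT lo_view TYPE lcl_view_pdf.
    ENDCASE.
    get_data( ir_vbeln ).
    display( ).
  ENDMETHOD.

ENDCLASS.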

 

 

SUMMARY

You need to design a loosely coupled application, and in the future when you need a new view, all you need to do is write a new one extending the abstract view class!!

 

MVC is a great pattern and design, so get out there and try it as much as you can!!

 

 

All the code I provide is attached, see below.

 

Sample code: it is in text format; a SAPlink nugget can be provided as well if you request it at <email address removed by moderator>!!

I couldn't attach the nugget to this post, it failed somehow!!

I'm writing this blog after reading Understanding Widening Cast in ABAP Objects, since it became clear to me that the difference between reference types and object types is not clear to many SCN users who are not used to object-oriented programming. Knowing the difference between the two is critical to understanding what you can do with casting and how powerful the concept of inheritance is.

 

I'll start by giving an example of why narrowing cast is so important. Imagine the following scenario, where the specific fruits are children of the super class Fruit:

 

inheritance.png

 

You want to create a table of fruit, then loop at it, and write the name of the fruit to the screen. The required data can be declared as:

 

DATA: lt_fruits TYPE TABLE OF REF TO ZCL_FRUIT,
            lo_fruit  TYPE  REF TO ZCL_FRUIT,
            lo_mango TYPE REF TO ZCL_MANGO,
            lo_apple  TYPE REF TO ZCL_APPLE,
            lo_orange TYPE REF TO ZCL_ORANGE.



And then you do something like:

 

lo_mango = new ZCL_MANGO( ).
lo_fruit ?= lo_mango.
append lo_fruit to lt_fruits.



 

This is where the difference between reference type and object type becomes critical.

  • The object type is intrinsic to the class of the constructor (new ZCL_MANGO) used to bring it to "life". Think of the object as memory space that contains information, whose type never changes after it is instantiated.
  • The reference type is the type of the pointer (in this case lo_mango and lo_fruit) to the memory space. It's your gateway, your "API": only through it can you access the memory (the intrinsic object, whose type was determined by the constructor).

 

When I make a cast from lo_mango to lo_fruit, the object itself, and therefore its type, remains the same. Same variables, same type, nothing changes except the type of the pointer, the reference. As long as we use a reference of the superclass type we only have access in the code to the attributes and methods of the superclass, but that doesn't mean the attributes of the original object were lost. They are still there waiting!

 

This dichotomy is very important because it allows us to keep similar objects together, in the same table for example, while keeping their intrinsic properties intact. For example, let's assume that for the very specific case of the apple we want to output the apple's type besides the name of the fruit; the code would be something like:

 

Loop at lt_fruits into lo_fruit.
     write lo_fruit->get_name( ).
     if cl_abap_classdescr=>get_class_name( lo_fruit ) = 'ZCL_APPLE'.
          lo_apple ?= lo_fruit.
          write lo_apple->get_type_of_apple( ).
     endif.
Endloop.





 

It should become even clearer from the usage of cl_abap_classdescr=>get_class_name( lo_fruit ), and the fact that it returns ZCL_APPLE (instead of ZCL_FRUIT), that the object indeed retains all the attributes that were given to it by the constructor, even if the reference is of the superclass type.

 

Now imagine a scenario where casting didn't exist and the code you would need: 3 tables, 3 loops. Now expand this to a real program; inheritance allows much more elegant coding.
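
A small side note (my own addition, not part of the original example): if you prefer not to compare class names at all, the narrowing cast itself can serve as the check, because a failed cast raises CX_SY_MOVE_CAST_ERROR:

LOOP AT lt_fruits INTO lo_fruit.
  WRITE lo_fruit->get_name( ).
  TRY.
      lo_apple ?= lo_fruit.   " narrowing cast; fails for anything that is not an apple
      WRITE lo_apple->get_type_of_apple( ).
    CATCH cx_sy_move_cast_error.
      " not an apple, nothing extra to write
  ENDTRY.
ENDLOOP.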

Hi ,

I am writing this blog because when I searched on Google, I didn't get a proper answer.

 

If you encounter a dump in ALV when clicking on some icon, this may be the cause. To resolve it, you need to create the final internal table structure with reference fields and not with the domain fields.

 

I mean, instead of the TYPES declaration      BMEIN TYPE BASME.   " Base Unit
you need to use                               BMEIN TYPE MEINS.   " Base Unit (MEINS is the reference field)

 

With Regards,

Bala M

Hi all,

today's blog is a deeper insight into “The 4th point: think about your developing twice” published in Some recommended Points everybody should remember when developing (ABAP) - featured title "Be a better developer"

 

As a quality manager, part of my daily work is to share knowledge and to onboard new employees. Because of that I get a lot of views from people who aren't much influenced yet. That means I hear a lot of different opinions on code quality and on how different people think about it.

A funny thing is that when I tell them I'm part of the quality team, a lot of them say that this isn't effective, or that they don't have good stories to tell. A lot say it is very abstract.

In other words, they are saying that this is too far away from their daily work and that it is not helping them to improve code quality.

That's why I thought I would share ten facts about developing things.

(Be careful: it won't do your work for you, and maybe you will need more time to get it done.)

 

1st point Functionality

Does your code do what you want it to?

Of course, every one of us does that in a way, but did you ever think about it before starting to develop?

I mean, all the things you discover before you start developing save you time. So take your time and find the right spots to implement your additional coding. If you have a greenfield development you might save even more time by drawing a class diagram with all the relations in between, but that is another story.

 

 

 

2nd point software reliability

Does your code affect other processes?

Make sure that your code is not affecting other processes. Of course, if you answered point one with yes, you might have already checked this too, but just because most of us cover it within the first point doesn't mean it isn't a point of its own, so I need to mention it.

 

3rd point usability

Would a user understand it easily?

That is a really big point, and you might think it is not your business. It is your business, and I'll tell you why: we develop the things, so we also need to take care of the usability. A simple example:

You see, take your time and think about the screens you develop, no matter which technique you are using. Just because all necessary results show up on the screen doesn't mean that it is a good program.

 

4th point efficiency

What about the run duration of my program?

This is also a big point, and it needs a bit more explanation. It is not just the runtime that is affected; it is more a design question of how well-written your code will be. Before you start coding, it is good to know the answers to the following questions:

 

 

Do I have a customizing, which is needed in the beginning?

Does a user use the program more often in a row?

How many users will use this program?

 

With these three questions it is possible to make a decision. If a program is used many times, you might read your customizing just once and save it in global variables so that you don't need to fetch it again. I think you know what I mean, so I don't want to waste your time here... (you know, it's all about efficiency in this point)

 

5th point changeability

Is it possible to add additional logic to your coding without devastating consequences?

Easy point, isn't it? Just make sure that you implemented your code well and that it is possible to change it in an easy way. For example, if you added source in different places and one of them cannot exist without another spot, make sure you have a reference.

A very effective way to handle this is to implement unit tests. With a unit test you are just one click away.

 

6th point transferability

Is it possible to transfer my code to another spot if needed?

Try to make your source as unattached to its spot as you can. Use all the advantages ABAP (or whatever your programming language is) gives us to develop things. Use interfaces, pass the values, and extract your code into its own classes / functions as much as you can.

 

7th point readability

Is my source readable if I would see it for the first time?

  1. If you answer this question with yes, just ask the developer next to you and prove it
  2. I think it is an easy point, no need to explain it with a long story.

 

8th point understandability

Will I understand the source in 6 months' time?

If you answered the readability question with yes, you might say: isn't it the same? In my opinion no, because just being able to read a source does not mean I understand it. Every one of us has seen a lot of code, and I'm pretty sure most will agree that it isn't easy to work through a snippet and get the idea behind it. You need to see your source in the big picture, and for that you need to understand it. Perhaps you aren't that sure now, so you might add some comment lines to your source and also describe the methods in a few sentences.

 

9th point learnable

Would I teach someone to code like that?

What? That might be what's in your mind right now, but this is pretty important. If you see your coding in front of you and scroll through it, you should think about all the small details and ask yourself the question mentioned above. If you think you wouldn't teach someone to develop this way, you should think about it more than twice and change it to something you would teach. That is the point here.

 

10th point needed

Is my code needed?

Ok, that is not a question you should ask yourself after developing your stuff, but before; it might save a lot of work.

I just want to make sure that you are really sure that your development is needed. If you have any doubts that the implementation might not be needed for whatever reason, ask your questions. I know a lot of developments out there which weren't needed in the end and were just wasted time for everybody involved...

 

and ten facts sounds a lot better than nine facts

 

 

 

 

That’s it.

These are my personal ten recommended points to think about twice. Keep them in mind and I'm pretty sure you will save time, perhaps not during development, but afterwards when analyzing, changing or even enhancing your source.

 

The bridge to the quality management:

Do you think these are points we should consider?

 

Yes? Here is the fact: the first six are from ISO 250xx (the old one was ISO 9126). Now you can't say anymore that quality management is not helping you in your daily work. It is always present, but most of us don't recognize it as quality management in the classic way, which in my opinion is a good thing.

 

A summary might be:

Just combine beauty and functionality.

 

Feel free to leave a comment and happy coding

 

Cheers

Florian

Hello SCN members,

 

Good Evening.

 

Today I want to explain a simple but very important point, particularly about reports sent to people like the President or Vice-President of a company.

 

I have been asked to change a report based on the user settings. For your understanding, I am including the screenshot below:

 

SAP Menu ---> System --->User Profile ----> Own Data, click on the Defaults Tab.

 

defaults screen.png

The above Decimal Notation has 3 possible number formats, i.e. space, 'X' and 'Y':

' ' (space): 1.234.567,89
'X':         1,234,567.89
'Y':         1 234 567,89

Different countries use different decimal notations based on their local conventions or preference.


Regarding that, I searched a lot and found that people have used the function module below for currency and even quantity fields:

HRCM_STRING_TO_AMOUNT_CONVERT

But the above function module did not work in all situations.


So, I have made the change below:


REPORT  ZTEST_QTY_CONV.

DATA : SS_USR01 TYPE USR01.

DATA: LP_DATA     TYPE REF TO DATA,

         L_THOUSANDS TYPE C LENGTH 1,

         L_DECIMAL   TYPE C LENGTH 1,

         L_TRANSLATE TYPE C LENGTH 2.

DATA: LT_RESULTS TYPE MATCH_RESULT_TAB,

       LS_RESULT  TYPE MATCH_RESULT,

       L_MATCH    TYPE STRING VALUE `^\s*-?\s*(?:\d{1,3}(?:(T?)\d{3})?(?:\1\d{3})*(D\d*)?|D\d+)\s*$`.

DATA: L_INT TYPE STRING,

       L_DEC TYPE STRING,

       INPUT TYPE STRING,

       OUTPUT TYPE STRING.

PARAMETERS : P_QTY TYPE CHAR17.

BREAK-POINT.

FIELD-SYMBOLS: <L_INPUT> TYPE ANY.

CREATE DATA LP_DATA LIKE INPUT.

ASSIGN LP_DATA->* TO <L_INPUT>.

<L_INPUT> = P_QTY.

* Get separator from user record

IF SS_USR01 IS INITIAL.

   SELECT SINGLE * FROM USR01 INTO SS_USR01 WHERE BNAME EQ SY-UNAME.

ENDIF.

CASE SS_USR01-DCPFM.

   WHEN SPACE" 1.234.567,89

     L_THOUSANDS = '.'.

     L_DECIMAL   = ','.

   WHEN 'X'.    " 1,234,567.89

     L_THOUSANDS = ','.

     L_DECIMAL   = '.'.

   WHEN 'Y'.    " 1 234 567,89

     L_THOUSANDS = SPACE.

     L_DECIMAL   = ','.

ENDCASE.

IF SS_USR01-DCPFM <> 'Y'.

* Modify the regex to handle the user's selected notation

   REPLACE ALL OCCURRENCES OF 'T' IN L_MATCH WITH L_THOUSANDS.

ELSE.

   REPLACE ALL OCCURRENCES OF 'T' IN L_MATCH WITH ' '.    " (this replacement did not take effect)

* so, I did it as below:

   CLEAR : L_MATCH.

*  L_MATCH   = `^\s*-?\s*(?:\d{1,3}(?:(T?)\d{3})?(?:\1\d{3})*(D\d*)?|D\d+)\s*$`.  " removed the T, replaced with a space

   L_MATCH   = `^\s*-?\s*(?:\d{1,3}(?:( ?)\d{3})?(?:\1\d{3})*(D\d*)?|D\d+)\s*$`.

ENDIF.

IF L_DECIMAL EQ '.'.

   REPLACE ALL OCCURRENCES OF 'D' IN L_MATCH WITH '\.'.

ELSE.

   REPLACE ALL OCCURRENCES OF 'D' IN L_MATCH WITH L_DECIMAL.

ENDIF.

*  if SS_USR01-DCPFM <> 'Y'.

CONDENSE <L_INPUT> NO-GAPS.

* Check the number is valid

FIND REGEX L_MATCH IN <L_INPUT>.

*  endif.

IF SY-SUBRC IS NOT INITIAL.

   MESSAGE 'Invalid' TYPE 'E'.

*    RAISE EXCEPTION TYPE CX_SY_CONVERSION_NO_NUMBER.

ENDIF.

* Translate thousand separator into "space"

CONCATENATE L_THOUSANDS SPACE INTO L_TRANSLATE.

TRANSLATE <L_INPUT> USING L_TRANSLATE.

* Translate decimal into .

CONCATENATE L_DECIMAL '.' INTO L_TRANSLATE.

TRANSLATE <L_INPUT> USING L_TRANSLATE.

* Remove spaces

CONDENSE <L_INPUT> NO-GAPS.

OUTPUT = <L_INPUT>.

*To get the User profile format after all calculations.

DATA: quantity TYPE stpo-menge.

quantity = output.

WRITE: output, / 'Do calculations and print the values in User Profile Settings:', quantity.


Note: Whenever you have changed the user profile and want to see the effect on values like QUAN fields (usually 13 digits and 3 decimals), you need to log out and log in once. Only then will the user settings be applied.


Regards,

Siva kumar. D


Introduction

You have probably stumbled upon some cool projects on GitHub like the Linux kernel or Node.js. Until now it has been quite cumbersome to push your open source ABAP to Git repositories; abapGit will help to make this easier.

 

 

abapGit is a Git client written in ABAP for ABAP; it lets you clone projects or commit objects to the Git repository.

 

 

Installing

  • Download the source code from the github repository
  • Paste the code into a new report using SE38
  • Configure SSL in transaction STRUST
  • Run

 

 

Uninstalling

  • Delete report
  • Delete standard texts ZABAPGIT* via SO10

 

 

How abapGit works

The first step is to clone a repository; this will create the objects from the repository in the SAP system. After this, one of the following commands will appear:

 

pull

If the files in the repository have been changed, the ABAP objects can be updated with the pull command

 

commit

If the latest changes are implemented in the SAP system, and objects in SAP are changed, the changes can be pushed to the Git repository using the commit command.

 

add

After having pulled or cloned the repository, the objects will be in sync, at this stage it is possible to add new ABAP objects to the Git repository.

 

 

Design/Internals

The "distributed" part of Git is not implemented in abapGit, it will pull data from the repository quite often, and advanced git commands like blame etc. is also not supported. It is currently only possible to serialize reports, classes, data elements, and domains, other objects will be implemented over time. All code will be serialized to .abap files in the repository making it easy to read online, meta data will be serialized to .xml files.

Beware that this is alpha software and provided "as is", take care and only run in test systems.

 

 

 

Hopefully abapGit will help to ease cooperation in ABAP open source projects and inspire more to do open source ABAP.

Initial requirement

 

My requirement was to create two reports with ALV Grid output. These reports should be called via transaction codes, and their data gathering should be done through function modules, because the identical data will be needed by a web tool accessing it via RFC.

 

 

Issue

 

Since there is no data selection and no selection screen in these reports (and there shouldn't be one, as required), after starting these transactions the users see nothing for several seconds until the ALV Grid is displayed. So some users became unsure whether they had started the transactions correctly.

 

 

Looking for a solution

 

I was looking for some ideas, and I found some threads here on SCN pointing to class CL_GUI_TIMER. So I decided to create a function module that calls a screen and closes it after the timer interval has finished.

 

 

Creating the function module

 

First I thought about the data to be passed to the function module:

 

  • a heading text for the popup screen
  • an info text, that some action has been started
  • a label text and a value text to tell,  from where the function module has been called
  • an info text that the screen will be closed automatically.

 

Code of the function module

 

FUNCTION zmy_popup_show.

*"----------------------------------------------------------------------

*"*"Local Interface:

*"  IMPORTING

*"     VALUE(I_HEADER_TEXT) TYPE  TEXT80

*"     VALUE(I_INFO_TEXT1) TYPE  TEXT80

*"     VALUE(I_LABEL_TEXT) TYPE  TEXT30 OPTIONAL

*"     VALUE(I_VALUE_TEXT) TYPE  TEXT50 OPTIONAL

*"     VALUE(I_INFO_TEXT2) TYPE  TEXT80 OPTIONAL

*"     VALUE(I_INTERVAL) TYPE  I DEFAULT 5

*"----------------------------------------------------------------------

 

* Filling dynpro fields and interval

  xv_header_text                     =  i_header_text.

  xv_info_text1                      =  i_info_text1.

  xv_info_text2                      =  i_info_text2.

  xv_label_text                      =  i_label_text.

  xv_value_text                      =  i_value_text.

 

  xv_interval                        =  i_interval.

 

* Call info screen 9000

  CALL SCREEN                           '9000'

    STARTING                        AT  5 5.

 

ENDFUNCTION.

 

Here I pass all input parameters to globally defined variables. All text fields will be shown on the screen.

 

 

 

Code of the function group's TOP include

 

FUNCTION-POOL zmy_popup.                    "MESSAGE-ID ..

 

* Definitions

CLASS xcl_event_receiver      DEFINITION DEFERRED.

DATA: xo_event          TYPE  REF TO xcl_event_receiver."#EC NEEDED

DATA: xv_header_text    TYPE  text80.

DATA: xv_info_text1     TYPE  text80.

DATA: xv_info_text2     TYPE  text80.

DATA: xv_interval       TYPE  i.

DATA: xv_label_text     TYPE  text30.

DATA: xo_timer          TYPE  REF TO cl_gui_timer.

DATA: xv_value_text     TYPE  text50.

 

* Definition of class XCL_EVENT_RECEIVER

  INCLUDE lzmy_popupcls.

 

For catching the FINISHED event of class CL_GUI_TIMER a local event receiver class is needed:

 

 

Event Receiver Class Definition ...

 

*&---------------------------------------------------------------------*

*&  Include           LZMY_POPUPCLS

*&---------------------------------------------------------------------*

 

*----------------------------------------------------------------------*

*   CLASS xcl_event_receiver DEFINITION

*----------------------------------------------------------------------*

CLASS xcl_event_receiver DEFINITION.

  PUBLIC SECTION.

 

    CLASS-METHODS:

      timer_finished       FOR EVENT  finished

                                  OF  cl_gui_timer.

 

ENDCLASS.                    "xcl_event_receiver DEFINITION

 

 

... and Implementation

 

*&---------------------------------------------------------------------*

*&  Include           LZMY_POPUPCLI

*&---------------------------------------------------------------------*

 

*----------------------------------------------------------------------*

*   CLASS xcl_event_receiver IMPLEMENTATION                            *

*----------------------------------------------------------------------*

*   Handle events                                                      *

*----------------------------------------------------------------------*

 

CLASS xcl_event_receiver IMPLEMENTATION.

 

*----------------------------------------------------------------------*

*       METHOD timer_finished                                          *

*----------------------------------------------------------------------*

*       Action after timer has finished                                *

*----------------------------------------------------------------------*

  METHOD timer_finished.

 

        PERFORM                     exit_dynpro.

 

  ENDMETHOD.                    "timer_finished

 

ENDCLASS.                    "xcl_event_receiver IMPLEMENTATION

 

How to leave will be shown in FORM EXIT_DYNPRO later.

 

 

 

Definition of the Info Screen 9000

 

The flow logic looks pretty simple:

 

PROCESS BEFORE OUTPUT.

  MODULE call_timer.

*

PROCESS AFTER INPUT.

  MODULE exit_dynpro.

 

 

The screen contains the fields

 

  • XV_HEADER_TEXT
  • XV_INFO_TEXT1
  • XV_LABEL_TEXT
  • XV_VALUE_TEXT
  • XV_INFO_TEXT2

 

and some frames:

 

screen_9000.JPG

Remark: Überschrift means header/heading.

 

 

 

Definition of PBO module CALL_TIMER:

 

*&---------------------------------------------------------------------*

*&  Include           LZMY_POPUPO01

*&---------------------------------------------------------------------*

 

*&---------------------------------------------------------------------*

*&      Module  CALL_TIMER  OUTPUT

*&---------------------------------------------------------------------*

*       Create and start timer

*----------------------------------------------------------------------*

MODULE call_timer OUTPUT.

 

* Set up the timer

  CREATE  OBJECT                     xo_timer

    EXCEPTIONS

      OTHERS                      =  4.

 

  CHECK sy-subrc                 EQ  0.

 

  SET HANDLER xo_event->timer_finished FOR xo_timer.

 

  xo_timer->interval              =  xv_interval.

  xo_timer->run( ).

 

ENDMODULE.                 " CALL_TIMER  OUTPUT

 

Here the timer object is created, the event handler is set, the timer interval is set and the timer is started.

 

 

 

Definition of optional PAI module EXIT_DYNPRO:

 

*&---------------------------------------------------------------------*

*&  Include           LZMY_POPUPI01

*&---------------------------------------------------------------------*

 

*&---------------------------------------------------------------------*

*&      Module  EXIT_DYNPRO  INPUT

*&---------------------------------------------------------------------*

*       Leave Screen

*----------------------------------------------------------------------*

MODULE exit_dynpro INPUT.

 

  CASE  sy-ucomm.

  WHEN  'ECAN'.

        PERFORM                     exit_dynpro.

  ENDCASE.

 

ENDMODULE.                 " EXIT_DYNPRO  INPUT

 

This module is optional, in case you want to be able to close the info screen manually, too.

 

 

 

Definition of FORM routine EXIT_DYNPRO:

 

*&---------------------------------------------------------------------*

*&  Include           LZMY_POPUPF01

*&---------------------------------------------------------------------*

 

*&---------------------------------------------------------------------*

*&      Form  EXIT_DYNPRO

*&---------------------------------------------------------------------*

*       Leave Screen

*----------------------------------------------------------------------*

FORM exit_dynpro .

 

    FREE                                xo_timer.

    CLEAR                               xo_timer.

 

    LEAVE                           TO  SCREEN 0.

 

ENDFORM.                    " EXIT_DYNPRO

 

 

And the Function Group ZMY_POPUP looks like:

 

*******************************************************************
*   System-defined Include-files.                                 *
*******************************************************************
  INCLUDE lzmy_popuptop.                     " Global Data
  INCLUDE lzmy_popupuxx.                     " Function Modules

*******************************************************************
*   User-defined Include-files (if necessary).                    *
*******************************************************************
  INCLUDE lzmy_popupcli.                     " Class Implementation
  INCLUDE lzmy_popupf01.                     " FORM Routines
  INCLUDE lzmy_popupi01.                     " PAI-Modules
  INCLUDE lzmy_popupo01.                     " PBO-Modules

 

 

Now the Function Group ZMY_POPUP and Function Module ZMY_POPUP_SHOW are complete.

 

 

Calling Function Module ZMY_POPUP_SHOW from report

 

After testing this function module in SE37, I added an asynchronous RFC call to my ALV grid reports, which looks like this:

 

   CALL FUNCTION 'ZMY_POPUP_SHOW'
     STARTING NEW TASK 'POPUP'
     EXPORTING
       i_header_text = text-hdr
       i_info_text1  = text-in1
       i_label_text  = text-lbl
       i_value_text  = xv_text_val
       i_info_text2  = xv_text_in2
       i_interval    = xk_interval.

 

And the result looks like

 

Info.JPG

 

The screen closes automatically, and everything would be fine if there were no ...

 

 

 

Further issues

 

The first issue is that calling this function module in aRFC mode requires a free user mode. If you have no free user mode, the call won't work. So I coded a workaround so that the function module is only called if a free user mode is available:

 

  CALL FUNCTION 'TH_USER_INFO'
    IMPORTING
      act_sessions = xv_s_act
      max_sessions = xv_s_max.

  CHECK xv_s_max GT xv_s_act.

 

This is not a clean solution, only a workaround.

 

The other issue is that processing a function module in RFC mode requires the corresponding authorization, for example:

 

Authority object:      S_RFC

Authority field:       RFC_TYPE      Value:    FUGR

Authority field:       RFC_NAME      Value:    ZMY_POPUP

Authority field:       ACTVT         Value:    16
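If you want the calling report to fail gracefully when this authorization is missing, you can check it explicitly before triggering the asynchronous call. This is only a minimal sketch using the standard AUTHORITY-CHECK statement with the values listed above; note that the decisive check is still performed by the RFC runtime on the server side, so this pre-check is just a convenience to avoid a failing call:

  AUTHORITY-CHECK OBJECT 'S_RFC'
    ID 'RFC_TYPE' FIELD 'FUGR'
    ID 'RFC_NAME' FIELD 'ZMY_POPUP'
    ID 'ACTVT'    FIELD '16'.

  IF sy-subrc NE 0.
*   User lacks the S_RFC authorization - skip the popup silently
    RETURN.
  ENDIF.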

 

 

Looking for improvement

 

So I'm looking for an improvement to this solution that needs neither a free user mode nor additional authorizations.

 

I'm looking forward to seeing your suggestions.

 

 

Regards,

 

Klaus


Creating sales orders via ‘BAPI_SALESORDER_CREATEFROMDAT2’ using variant configured materials


SAP Product Id: LO-VC

 

SAP provides a BAPI, 'BAPI_SALESORDER_CREATEFROMDAT2', to create sales orders.

The usage of this BAPI is quite simple when it is used to create sales orders that do not use configured materials. But when it comes to creating sales orders that use variant material configurations, the logic for filling the prerequisite structures of this BAPI is a little complicated, hence this blog.

First things first: there is an SAP note which explains how the prerequisite structures should be filled, note 0000549563.

The four mandatory structures to be filled are:

 

1. ORDER_CFGS_REF

2. ORDER_CFGS_INST

3. ORDER_CFGS_PART_OF

4. ORDER_CFGS_VALUE

 

The first and the second are pretty much self-explanatory; it's the third and the fourth that require some explanation.

 

1. ORDER_CFGS_PART_OF:


Note 549563 says:

PART_OF_NO is the item number from the BOM

 

But please keep in mind that you will only get the right PART_OF_NO if you use the sales BOM and not the production BOM.

 

CONFIG_ID   PARENT_ID   INST_ID    PART_OF_NO   OBJ_KEY
000001      00000001    00000002   0010         PROD112
000002      00000003    00000004   0010         PROD122
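For illustration, here is roughly how the two rows above could be appended to the ORDER_CFGS_PART_OF table in ABAP. The DDIC structure of this table parameter should be BAPICUPRT on current releases; please verify the structure and required fields in SE37 and note 549563 before relying on this sketch:

  DATA: ls_part_of TYPE bapicuprt,
        lt_part_of TYPE STANDARD TABLE OF bapicuprt.

* First configurable item
  ls_part_of-config_id  = '000001'.
  ls_part_of-parent_id  = '00000001'.
  ls_part_of-inst_id    = '00000002'.
  ls_part_of-part_of_no = '0010'.      " item number from the sales BOM
  ls_part_of-obj_key    = 'PROD112'.
  APPEND ls_part_of TO lt_part_of.

* Second configurable item
  ls_part_of-config_id  = '000002'.
  ls_part_of-parent_id  = '00000003'.
  ls_part_of-inst_id    = '00000004'.
  ls_part_of-part_of_no = '0010'.
  ls_part_of-obj_key    = 'PROD122'.
  APPEND ls_part_of TO lt_part_of.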

 

 

Here is how you find the item number for a material in a bill of material (BOM):

 

box1.JPG

2. ORDER_CFGS_VALUE:

This is the tricky part since there are a lot of dynamics involved here, for instance,

 

    a)  There could be default characteristics for a particular class, and that might or might not appear in your order

 

    b)  The value supplied could be outside the range maintained in characteristics maintenance (transaction CT04)

 

So how do you take care of that? We will get to those questions in a moment but first, here is the logic for populating this parameter.

 

The tables to be used are INOB, KSSK, KSML and CAWN

 

     a)  First, pass your material into table INOB (Link between Internal Number and Object), where INOB-OBJEK = your material number, and fetch the object number CUOBJ. (Also supply your class type KLART and the name of the database table for the object, OBTAB; for variant configuration the class type is 300.)

 

     b)  Then pass the object number (CUOBJ) into table KSSK, where KSSK-OBJEK = INOB-CUOBJ, to fetch the internal class number (CLINT).

 

     c)  Pass the internal class number (CLINT) into table KSML, such that KSSK-CLINT = KSML-CLINT, to finally fetch the internal characteristics (IMERK). Put the results of steps a), b) and c) into an internal table, say table A.

 

          Since a picture speaks a thousand words, here is what we have done so far in a nutshell:

 

box2.JPG

 

     d)  Now select the characteristics and their values from the master table CAWN for the materials you are using to create the order. Put the results of step d) into another internal table, say table B.

 

          The idea here is that, since we arrived at table A by querying with the material and class, and at table B by passing just the materials, the entries of table B should also be contained in table A. So we loop at table B and read table A. Here is the code:

 

box2.JPG
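For readers who prefer text over screenshots, here is a rough sketch of the lookups described in steps a) to c), plus the read of CAWN. It is only an illustration: the table and field names are as described above, but the exact key handling (for example the conversion of CUOBJ into the KSSK key, or how you restrict CAWN) may differ in your system, so treat it as a starting point rather than finished code:

  TYPES: BEGIN OF ty_imerk,
           imerk TYPE ksml-imerk,
         END OF ty_imerk.

  DATA: lv_matnr TYPE matnr,
        lv_objek TYPE inob-objek,
        lv_cuobj TYPE inob-cuobj,
        lv_kssk  TYPE kssk-objek,
        lv_clint TYPE kssk-clint,
        lt_imerk TYPE STANDARD TABLE OF ty_imerk,
        lt_cawn  TYPE STANDARD TABLE OF cawn.

  lv_objek = lv_matnr.                           " your material number

* a) material -> internal object number (class type 300 = variants)
  SELECT SINGLE cuobj FROM inob INTO lv_cuobj
    WHERE klart = '300'
      AND obtab = 'MARA'
      AND objek = lv_objek.

* b) object number -> internal class number
  lv_kssk = lv_cuobj.
  SELECT SINGLE clint FROM kssk INTO lv_clint
    WHERE objek = lv_kssk
      AND klart = '300'.

* c) internal class number -> internal characteristics of the class (table A)
  SELECT imerk FROM ksml INTO TABLE lt_imerk
    WHERE clint = lv_clint.

* d) characteristic values from CAWN (table B) - read here via the
*    characteristics found in c); adapt the restriction to your needs
  IF lt_imerk IS NOT INITIAL.
    SELECT * FROM cawn INTO TABLE lt_cawn
      FOR ALL ENTRIES IN lt_imerk
      WHERE atinn = lt_imerk-imerk.
  ENDIF.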

          After this you need to separate the characteristics from the quantity, which you can do as shown below (of course you can code it any way you like):

 

Capture3.JPG

  Now back to the questions that we asked at the beginning

 

     a) There could be default characteristics for a particular class, and that might or might not appear in your order

         This can be checked via the field ATSTD of table CAWN.

 

         Here is a screenshot below;

         ss1.JPG

     b) The value supplied could be outside the range maintained in characteristics maintenance (transaction CT04)

 

ss2.JPG

     Use the code in box3, reproduced below:

 

box4.JPG

Finally call your BAPI

 

box5.JPG
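To round this off, here is a heavily simplified sketch of the final BAPI call with the configuration tables discussed above. This is not the code from the screenshots; the parameter names are those of BAPI_SALESORDER_CREATEFROMDAT2, but the header, item and partner data are only hinted at, so check SE37 and note 549563 for the complete picture:

  DATA: ls_header    TYPE bapisdhd1,
        lt_items     TYPE STANDARD TABLE OF bapisditm,
        lt_partners  TYPE STANDARD TABLE OF bapiparnr,
        lt_cfgs_ref  TYPE STANDARD TABLE OF bapicucfg,
        lt_cfgs_inst TYPE STANDARD TABLE OF bapicuins,
        lt_cfgs_part TYPE STANDARD TABLE OF bapicuprt,
        lt_cfgs_val  TYPE STANDARD TABLE OF bapicuval,
        lt_return    TYPE STANDARD TABLE OF bapiret2,
        lv_vbeln     TYPE bapivbeln-vbeln.

* ... fill ls_header, lt_items, lt_partners and the four ORDER_CFGS_*
*     tables (REF, INST, PART_OF, VALUE) as described in this blog ...

  CALL FUNCTION 'BAPI_SALESORDER_CREATEFROMDAT2'
    EXPORTING
      order_header_in    = ls_header
    IMPORTING
      salesdocument      = lv_vbeln
    TABLES
      return             = lt_return
      order_items_in     = lt_items
      order_partners     = lt_partners
      order_cfgs_ref     = lt_cfgs_ref
      order_cfgs_inst    = lt_cfgs_inst
      order_cfgs_part_of = lt_cfgs_part
      order_cfgs_value   = lt_cfgs_val.

* Commit only if no error message was returned
  READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
  IF sy-subrc <> 0.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = 'X'.
  ENDIF.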

Don’t worry about the object dependencies; the BAPI takes care of it.

 

That's all folks !

 

Thanks

Hello SCN-Community,

 

As mentioned in my first blog "BRFplus in Big Data scenarios", I want to share some other topics related to processing mass amounts of data. Faster than I expected, I have solved a problem regarding a dialog for operational analysis. It is about topic 3 of my last blog:

 

“Exploratory search techniques to determine anomalies in invoices – how can we support the end users during their daily work? Can we use ALV grid to analyze data models with master-detail relationships?”

 

In this blog I want to describe a kind of analysis that can be compared to "looking for a needle in a haystack" or "exploring for something". If you are looking for something unspecific that you cannot yet name, is it worth building a data model in Business Warehouse, for example? How would you even describe that data model? To build such a model, you have to do some exploratory searching first. This work is mostly done by exporting the related data and playing around with Excel or some kind of statistics software.

Once you have gained knowledge about a specific fact in your data and want to keep track of this constellation, you may want to build a check function to reproduce it at any given time. And we do not want the users to leave the system to do their work.

That's what this blog is all about: how can the user be supported in an exploratory search for anomalies in an operational analytics scenario without leaving the operational system, and how can we turn the findings of a completed exploratory search into a check that reproduces the search?

Introduction

We start with the previously discussed business case: Processing a huge number of invoices sent to a statutory health insurance company.

A set of checks is processed every time new invoices arrive. Every check focuses on a specific aspect or constellation of an invoice and produces clarification cases that have to be analyzed by the end user. But these checks were developed for specific, known aspects, so they target the most common known anomalies in these invoices. What about other constellations that have not been considered before?

One solution to this problem may be a dialog in which the user can do some kind of pre-analysis to look for new patterns to identify anomalies, errors or even detect possible fraud cases among those invoices.

Requirements

This task sounds easy. You take the invoices to look at and put them into an ALV grid. As a result, you can use all filters and sort criteria of the ALV grid. You are also able to save your "query" for reuse as a user-specific layout.

At this point we need to raise some important aspects:

 

  • We have to deal with approximately 28,000,000 invoices a year for a single health insurance company. The number of invoices does not fit into an ALV grid.
  • Each invoice is supported by some medical documentation. We have to be able to search the invoices for the existence of a specific medical treatment, surgery, and so on.
  • These filter criteria have to be saved for reuse, like normal ALV variants. Unfortunately we can't use the normal ALV variants, since we are dealing with master/detail views. So we have to develop different techniques, which are the main topic of this blog entry.
  • The user must be able to see all of the data of a specific invoice, even the detail data.
  • Our solution cannot rely on HANA-specific techniques only. We have to support a kind of HANA readiness without ending up with two different code lines.

 

In this case we are talking about a generic solution. With this tool, the user is able to do an exploratory search for billing errors in our scenario without moving into a Business Warehouse.

Introducing our UI prototype

The fact of the matter is that we cannot build upon HANA-specific features; we can only choose a solution with a standard ALV grid. We split the screen of the dialog into two sections: a master view with one ALV grid displaying the invoices, and a detail view containing three ALV grids corresponding to three kinds of supporting documentation.

BALV.jpg

If you double click an invoice, all of the detail data is loaded into the corresponding ALV grids.
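The master-detail navigation itself can be built with standard ALV means. The following sketch shows the general idea using the DOUBLE_CLICK event of CL_GUI_ALV_GRID; all table, field and object names here are made up for illustration, and our real implementation differs in detail:

CLASS lcl_master_events DEFINITION.
  PUBLIC SECTION.
    METHODS on_double_click FOR EVENT double_click OF cl_gui_alv_grid
      IMPORTING e_row.
ENDCLASS.

CLASS lcl_master_events IMPLEMENTATION.
  METHOD on_double_click.
    DATA ls_invoice LIKE LINE OF gt_invoices.

*   Determine the invoice the user double-clicked in the master grid
    READ TABLE gt_invoices INDEX e_row-index INTO ls_invoice.
    CHECK sy-subrc = 0.

*   Load the supporting documentation for this invoice (fictitious tables)
    SELECT * FROM zmed_treatment INTO TABLE gt_treatments
      WHERE invoice_id = ls_invoice-invoice_id.
    SELECT * FROM zicd_codes INTO TABLE gt_icd
      WHERE invoice_id = ls_invoice-invoice_id.
    SELECT * FROM zsurgeries INTO TABLE gt_surgeries
      WHERE invoice_id = ls_invoice-invoice_id.

*   Refresh the three detail grids
    go_grid_treatments->refresh_table_display( ).
    go_grid_icd->refresh_table_display( ).
    go_grid_surgeries->refresh_table_display( ).
  ENDMETHOD.
ENDCLASS.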

Handling of mass data

We decided not to load all invoices into the ALV grid, to avoid overloading the master view. During an exploratory search you are not interested in each single invoice; rather, you have to find another filter to reduce the search result, to give your interest boundaries. For now, we talk about an extract of the data (called "Ausschnitt" in the screenshots) and the entire data of the underlying table (called "Gesamtdaten"). The size of an extract is defined by the user: it determines how many invoices are displayed and thus transferred to the front end. It is a constraint defining the maximum amount of data. To implement this kind of truncation we added a new button to the ALV grid.

Count_Button.JPG

"Sätze" stands for rows, or here invoices, and defines the truncation. The given filter and sort criteria of an ALV grid operate on the internal data table of the grid, so they only address the extract, but they operate very quickly.

If we want to reduce our search result effectively, the filter and sort criteria have to operate on the whole underlying data table. For that we added further functions to the grid to control the ALV filter and sort actions (a sketch of how such custom grid functions can be added follows the list below).

Sort_Buttons.JPGFilter_Button.JPG

  • „Sortierung Ausschnitt“ : Sort the extract in the front-end (ALV grid standard)
  • „Sortierung Gesamtdaten“ : Sort the data in the backend (database) and rebuild the internal table (of the ALV)
  • „Filterung Ausschnitt“ : Filter the extract in the front-end (ALV grid standard)
  • „Filterung Gesamtdaten“ : Filter the data in the backend and rebuild a new internal table on that result (of the ALV)
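As announced above, here is a minimal sketch of how such additional functions can be attached to CL_GUI_ALV_GRID via its TOOLBAR and USER_COMMAND events. The function code and text are only examples, and the actual re-selection logic is omitted:

CLASS lcl_grid_toolbar DEFINITION.
  PUBLIC SECTION.
    METHODS on_toolbar FOR EVENT toolbar OF cl_gui_alv_grid
      IMPORTING e_object.
    METHODS on_user_command FOR EVENT user_command OF cl_gui_alv_grid
      IMPORTING e_ucomm.
ENDCLASS.

CLASS lcl_grid_toolbar IMPLEMENTATION.
  METHOD on_toolbar.
    DATA ls_button TYPE stb_button.
*   Add a custom button for sorting the whole underlying data set
    ls_button-function = 'ZSORT_ALL'.
    ls_button-text     = 'Sortierung Gesamtdaten'.
    APPEND ls_button TO e_object->mt_toolbar.
  ENDMETHOD.

  METHOD on_user_command.
    CASE e_ucomm.
      WHEN 'ZSORT_ALL'.
*       Re-select the data from the database with the current sort
*       criteria and rebuild the grid's internal table (not shown)
    ENDCASE.
  ENDMETHOD.
ENDCLASS.

* Registration, e.g. during PBO after the grid has been created:
* SET HANDLER lo_toolbar->on_toolbar
*             lo_toolbar->on_user_command FOR go_grid.
* go_grid->set_toolbar_interactive( ).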

 

With the help of these new functions we are able to search for various facts in a huge amount of data. If the amount of data to be displayed is too big, we only see the tip of the iceberg due to the truncation.

Detail filter – How to deal with them?

The search by detail data was a bigger problem to deal with. An ALV filter only operates on its own data context, but the detail data is kept in separate ALV grids. So we built a popup with additional filter criteria addressing the keys of the medical documentation we have to deal with.

Detail_Button.JPG

Detail_Filter.JPG

The constraints of the detail filter are handled by a manager class of our master view. Each constraint is defined as a single select-option. From these constraints the manager class builds an SQL query. We model every master-to-detail relationship as an EXISTS clause, one per detail relationship. There are 8 valid combinations (2^3), so the SQL queries can be hard-coded in the manager class. The possible combinations are listed in the table below, followed by a sketch of one such hard-coded query.

 

 

 

X means the EXISTS clause for this detail is part of the query, O means it is not:

Detail A = EXISTS (MEDICAL TREATMENTS)
Detail B = EXISTS (ICD)
Detail C = EXISTS (SURGERIES)

  A  B  C
  X  O  O
  X  X  O
  X  X  X
  X  O  X
  O  X  X
  O  X  O
  O  O  X
  O  O  O
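To make the idea concrete, here is a minimal sketch of one of the hard-coded variants, the combination X O X (detail filters on the medical treatments and the surgeries, but not on the ICD codes). All table, field and variable names are invented for illustration; the real tables and the master criteria look different:

  DATA: lt_invoices   TYPE STANDARD TABLE OF zinvoice,
        lv_max_rows   TYPE i VALUE 1000,
        lv_year       TYPE gjahr VALUE '2014',
        lr_treatments TYPE RANGE OF zmed_treatment-treatment,
        lr_surgeries  TYPE RANGE OF zsurgeries-surgery.

  SELECT * FROM zinvoice
    INTO TABLE lt_invoices
    UP TO lv_max_rows ROWS                               " truncation
    WHERE billing_year = lv_year                         " master criteria
      AND EXISTS ( SELECT * FROM zmed_treatment          " Detail A
                     WHERE invoice_id = zinvoice~invoice_id
                       AND treatment IN lr_treatments )
      AND EXISTS ( SELECT * FROM zsurgeries              " Detail C
                     WHERE invoice_id = zinvoice~invoice_id
                       AND surgery IN lr_surgeries ).

In the generic case, the manager class picks one of the eight statements depending on which detail filters the user has actually filled.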


Another problem was taking care of a detail filter while saving an ALV layout in the master view. If the user has defined a detail filter, it has to be saved together with the normal layout. We solved this by creating an add-on table holding the detail filter under the same key as the ALV layout. To do this, we added some functionality to the event handling of the ALV layout button. When the user loads a layout, the detail filter is loaded too, so the original search result is reproduced.

 

With these extensions to the ALV grid we realized a master-detail relationship with standard ABAP coding, allowing the user to do exploratory searches across the whole data volume.

Advanced techniques

This solution is not the end of the story. Because we use a select-option to define a detail filter, we cannot express all kinds of filters. Suppose we want to keep track of all invoices containing the medical treatments A and B but not C or D. With a select-option, we can only combine values via OR, not via AND.

 

Additionally, we want to transform such a saved layout (with detail filter) into a customer-defined check function which is processed when a new invoice arrives. At this point we were faced with a major ABAP limitation:

 

The 8 hard-coded EXISTS clauses no longer match the new requirements, so we would have to use a dynamic WHERE clause, which currently does not support subqueries. We hit a dead end.

 

Without resorting to HANA-specific functionality, the only options we considered were native SQL or the generation of ABAP code. We chose code generation. Our manager class is able to transform a saved ALV layout with an applied detail filter into an SQL query. This query is embedded into a generated ABAP class that can be used in our operational analytics framework.
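Our real implementation generates a class that plugs into the analytics framework. To illustrate the principle only, the following sketch generates a throw-away subroutine pool instead; the assembled SELECT with its EXISTS subquery is embedded as source code and then executed. All names are invented, and there is no error handling beyond the generation check:

  TYPES ty_srcline TYPE c LENGTH 255.

  DATA: lt_source   TYPE STANDARD TABLE OF ty_srcline,
        lv_prog     TYPE sy-repid,
        lv_message  TYPE ty_srcline,
        lt_invoices TYPE STANDARD TABLE OF zinvoice.

* Assemble the generated program from the saved layout / detail filter
  APPEND 'PROGRAM zgen_detail_filter.'                                 TO lt_source.
  APPEND 'FORM select_invoices CHANGING ct_result TYPE STANDARD TABLE.' TO lt_source.
  APPEND '  SELECT * FROM zinvoice INTO TABLE ct_result'               TO lt_source.
  APPEND '    WHERE EXISTS ( SELECT * FROM zmed_treatment'             TO lt_source.
  APPEND '                     WHERE invoice_id = zinvoice~invoice_id' TO lt_source.
  APPEND '                       AND treatment  = ''A'' ).'            TO lt_source.
  APPEND 'ENDFORM.'                                                    TO lt_source.

  GENERATE SUBROUTINE POOL lt_source NAME lv_prog MESSAGE lv_message.

  IF sy-subrc = 0.
    PERFORM ('SELECT_INVOICES') IN PROGRAM (lv_prog)
      CHANGING lt_invoices.
  ENDIF.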

What should be improved? – A wish list / Subject to Change

During the development of this dialog (which is based on the ALV grid class CL_GUI_ALV_GRID on NetWeaver 7.40 SP4), we also examined the use of the ALV grid with IDA (HANA grid class CL_SALV_GUI_TABLE_IDA, on both NetWeaver 7.40 SP4 and SP6). We also noticed that this grid is available on non-HANA systems. That is a very important fact: it may allow us to rely on a single codebase, without having to separate HANA-only features into another codebase that has to be switched out on non-HANA systems. Unfortunately the current version of the HANA grid does not give us the ability to influence the handling of the grid the way a normal ALV grid does. We need the ability to take care of the grid's internal data table in non-HANA scenarios. Another solution could be a refresh function of the grid: by calling it, a new SQL query would be sent to the database with the defined filter and sort criteria. We also need the ability to define the truncation.

 

Even after solving this, we are still not ready to use the HANA grid, because we cannot manipulate the WHERE clause. In the meantime I think the solution will be a master-detail relationship based on HANA features, but that does not help us in our case.

 

Due to our business scenario, our solution has to be usable by both HANA and non-HANA customers.

 

As I described earlier, we use code generation to avoid native SQL. But code generation does not help us in the case of the HANA grid; remember that with it we do not have the ability to query the database on our own.

Summary

With our solution, we are able to support the user with an exploratory search technique. It is a kind of generic solution that does not require further development once it is transported to the customer. Because the user never wants to jump from one system to another (for instance into a BW system, do an analysis there and jump back to the operative application), he is able to build a check out of the combined filter criteria and integrate it into the previously defined set of checks that are processed when a new invoice arrives.

 

This prototype is not finished. Some technical problems must be solved to build further features on top of this approach. But these problems are tough, because they depend on needed extensions to Open SQL (subqueries in dynamic WHERE clauses) or to the HANA grid.

 

The general availability of the HANA platform features brings not only a huge gain in speed but also additional functionality to operate on Big Data. But first we have to lift the customers onto that platform. I suggest this is only done by building hybrid solutions that handle the base business cases, run faster, and show some visible benefits when run on HANA. To minimize the effort, we have to rely on only one codeline that we have to support.

As ABAPers we have SAT and ST05 (or sometimes ST12) for tracing in our toolbox, and recently I found a report which can also do the trace job: RSTRC000.

 

The trace information it generates is quite technical, though, and perhaps more useful for those who are interested in the ABAP kernel.

 

How to use this report

 

1. In SE38, execute report RSTRC000 and mark the checkbox "Keep Work process", so that a free work process is owned exclusively by you until you release it via this report again. Then change the trace level to 2 (Full trace) and select the component you would like to trace, for example Database.

 

clipboard1.png

Click the save button and you can see that work process 23 is locked.

clipboard2.png

You can observe that work process 23 has the status "halt" in tcode SM50.

clipboard3.png

2. Now you are ready to run the program you would like to trace (the same procedure as with SAT or ST05). Use /nse38 to go to the ABAP editor from the current screen of report RSTRC000 and run your program. In my case I ran a report which queries material data from database table COMM_PRODUCT. Once the program finishes, run report RSTRC000 again.

 

Click the button "Default val." so that the trace level is changed back to 1 automatically,

clipboard4.png


then click the save button and you can observe that the previously locked work process 23 is released.

clipboard5.png

Now you can click the "Display" button to review the trace log:

clipboard6.png

You can also export the trace to a local file. I prefer to review the text file in my favourite text editor, Sublime Text.

 

Below I list the trace results for several trace components which I have tried myself.

 

Database log

 

From the log I can find which database tables are involved in the report execution and which ABAP program triggers the access. Some C function calls can be observed, but, probably for security or authorization reasons, we cannot review source files such as ablink.c in the folder /bas/*.


clipboard7.png

We can also find the detailed Open SQL statements in the log; however, the values of the query parameters are not visible, as shown below - they are displayed as ? in the trace.


clipboard8.png

ABAP proc.

 

It simply lists all the ABAP classes involved in the report execution, but without the method names. In my case I can only tell from the trace that in total 40 different ABAP classes with prefix CL_CRM_PROD* (which I am responsible for) are involved in the execution.

clipboard9.png

Database (DBSL)

 

Since we currently use HANA as our database, this gives me a rough understanding of how an Open SQL statement like SELECT XXX FROM table is executed in HANA.

clipboard10.png

Lock Management

 

This time I would like to trace the lock behavior in tcode COMMPR01. I switch to edit mode, which triggers a lock request to the enqueue server to lock the product, and then I change its description field.

clipboard11.png

In the trace this enqueue request is recorded completely:

 

  • the enqueue object
  • the database table on which the enqueue object is working
  • the guid of the product instance being locked
  • the tcode name COMMPR01
  • the user which triggers the enqueue request

clipboard12.png

clipboard13.png

From my point of view this option is a good substitute for the enqueue trace in ST05.

 

enqueue.png

 

Background

 

I run my report ZHANA_OBJECT_SEARCH in the background,

clipboard14.png


and I can see from the job log that it was executed successfully.

clipboard15.png

This information is also available in the RSTRC000 trace:

clipboard16.png

I didn't try all the other trace options; maybe they are useful in some special use cases. If you are interested, you can now try them yourself.
