
ABAP Development


Suppose there are three master pages.

Three pages are created under Master Pages in the Adobe Form:

Page 1 -- Cover page, fixed, will not repeat.

Page 2 -- Repeats based on the item-level data, with all header data on every page.

Page 3 -- Repeats based on the item-level data, with all header data on every page.

I have dragged the table from the Data View into Subform_Page2 and Subform_Page3.

Problem: The column header appeared only on the first page of Page 2 and Page 3; on the subsequent pages of Page 2 and Page 3 the column header was not repeated.




Hierarchy.JPG


Page 2 ----> Subform_Page2


Repeat the column header (Name, City, Address, State, City 1, Address1 and City1), which is under "OverflowLeader", on every page.

Column Header.JPG

Properties of OverflowLeader: Row:

Column Header_Row.JPG

OverflowLeader: Pagination:

Do as mentioned below :

1. In Place: click on the icon, then select Top of Content Area and choose the content area of Subform_Page2.

2. Check the checkbox "Include Header Row in Initial Page".

3. Check the checkbox "Include Header Row in Subsequent Pages".

Column Header_Pagination.JPG

OverflowLeader: Binding:

Do as mentioned below:

1. Check the checkbox "Repeat Row for each Data Item".

2. Check Min Count and set it to 1.

Column Header_Binding.JPG

I have followed the same steps for Subform_Page3.

After these steps, the column header is repeated on every page, and no longer only on the first page of Page 2 and Page 3.




I'm writing this blog after reading Understanding Widening Cast in ABAP Objects, since it became clear to me that the difference between the reference type and the object type is not clear to many SCN users who are not used to object-oriented programming. Knowing the difference between the two is critical to understanding what you can do with casting and how powerful the concept of inheritance is.


I'll start by giving an example of why narrowing cast is so important. Imagine the following scenario, where the specific fruits are children of the super class Fruit:




You want to create a table of fruit, then loop over it and write the name of each fruit to the screen. The required data can be declared as:


DATA: lt_fruit  TYPE TABLE OF REF TO ZCL_FRUIT,
      lo_fruit  TYPE REF TO ZCL_FRUIT,
      lo_mango  TYPE REF TO ZCL_MANGO,
      lo_apple  TYPE REF TO ZCL_APPLE,
      lo_orange TYPE REF TO ZCL_ORANGE.

And then you do something like:


lo_mango = new ZCL_MANGO( ).
lo_fruit ?= lo_mango.
append lo_fruit to lt_fruit.


This is where the difference between reference type and object type becomes critical.

  • The object type is intrinsic to the type of the constructor (new ZCL_MANGO) used to bring it to "life". Think of the object as memory space that contains information, whose type never changes after it is instantiated.
  • The reference type is the type of the pointer (in this case lo_mango and lo_fruit) to the memory space. It's your gateway, your "API",  only through them can you access the memory (the intrinsic object, whose type was determined by the constructor).


When I make a cast from lo_mango to lo_fruit, the object itself and therefore its type remains the same. Same variables, same type,  nothing changes except the type of the pointer, the reference. As long as we use a reference of the super class type we only have access in the code to the attributes and methods of the superclass, but that doesn't mean the attributes of the original object were lost. They are still there waiting!


This dichotomy is very important, because it allows us to keep similar objects together, in the same table for example, while keeping their intrinsic properties intact. For example, let's assume that for the very specific case of the apple we want to output the apple's type besides the name of the fruit; the code would be something like:


LOOP AT lt_fruit INTO lo_fruit.
  WRITE lo_fruit->get_name( ).
  " Note: get_class_name returns the fully qualified name '\CLASS=ZCL_APPLE'
  IF cl_abap_classdescr=>get_class_name( lo_fruit ) = '\CLASS=ZCL_APPLE'.
    lo_apple ?= lo_fruit.
    WRITE lo_apple->get_type_of_apple( ).
  ENDIF.
ENDLOOP.


It should become even clearer from the usage of cl_abap_classdescr=>get_class_name( lo_fruit ), and the fact that it returns the class name of ZCL_APPLE (instead of ZCL_FRUIT), that the object indeed retains all the attributes given to it by the constructor, even if the reference is of the superclass type.


Now imagine a scenario where casting didn't exist, and the code you would need: three tables, three loops. Now expand this to a real program; inheritance allows much more elegant coding.

Hi,

I am writing this blog because when I searched Google, I didn't find a proper answer.


If you encounter a dump in an ALV when clicking on some icon, the cause can be the TYPES declaration of the final internal table. To resolve it, create the final internal table structure with reference fields, not with the domain-typed fields.


I mean, instead of the TYPES declaration

      BMEIN TYPE BASME,    " Base Unit

you need to use

      BMEIN TYPE MEINS.    " Base Unit (MEINS is the reference field)
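As an illustration, such a final structure might be declared as follows; everything except BMEIN is made up for this sketch:

```abap
* Illustrative ALV output structure; only BMEIN is taken from the post.
TYPES: BEGIN OF ty_output,
         matnr TYPE matnr,    " material number
         menge TYPE menge_d,  " quantity belonging to the unit below
         bmein TYPE meins,    " base unit: reference field, not domain BASME
       END OF ty_output.
```

With the reference field MEINS, the ALV can associate quantity columns with their unit field, which is presumably what the icon handling stumbled over with the purely domain-typed BASME.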


With Regards,

Bala M

Hi all,

today's blog is a deeper insight into "The 4th point: think about your developing twice", published in Some recommended Points everybody should remember when developing (ABAP) - featured title "Be a better developer".


As a quality manager, it is also part of my daily work to share knowledge and to onboard new employees. Through that I hear from a lot of people whose views aren't much influenced yet, which means I get many different perspectives on code quality and how people think about it.

A funny thing is that when I tell them I'm part of the quality team, a lot of them say that this isn't effective, or they have not-so-good stories to tell. Many say it is very abstract.

In other words, they are saying that this is too far away from the daily work and does not help them improve code quality.

That's why I thought I would share ten facts about developing things.

(Be careful: this is not doing your work for you, and maybe you will need more time to get things done.)


1st point Functionality

Does your code do what you want it to?

Of course, every one of us does that in a way, but did you ever think about it before you started developing?

I mean, all the things you discover before starting to develop save you time. So take your time and find the spots where you will implement your additional coding. If you have a greenfield development, you might save even more time by drawing a class diagram with all the relations in between, but that is another story.




2nd point software reliability

Does your code affect other processes?

Make sure that your code does not affect other processes. Of course, if you answered point one with yes, you might have checked this too already, but even though most of us cover it within the first point, it is an extra point and I need to mention it.


3rd point usability

Would a user understand it easily?

That is a really big point, and you might say this is not your business. It is your business, and I will tell you why: we develop the things, so we also need to take care of the usability. A simple example:

You see, take your time and think about the screens you develop, no matter which technique you are using. Just because all necessary results show up on the screen doesn't mean that it is a good program.


4th point efficiency

What about the runtime of my program?

This is also a big point and needs a bit more explanation. It is not just the runtime that is affected; it is more a design question of how well-written your code will be. Good things to know before starting to code are the answers to the following questions:



Do I have a customizing, which is needed in the beginning?

Does a user use the program more often in a row?

How many users will use this program?


With these three questions it is possible to make a decision. If a program is used many times in a row, you might read your customizing just once and save it in global variables so you do not need to fetch it again. I think you know what I mean, so I don't want to waste your time here... (you know, it is all about efficiency in this point)
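The idea of reading the customizing only once can be sketched like this; the table ZCUSTOMIZING and everything around it are hypothetical:

```abap
* Sketch: lazy-load customizing into a static buffer (names are made up).
CLASS lcl_customizing DEFINITION.
  PUBLIC SECTION.
    TYPES tt_settings TYPE STANDARD TABLE OF zcustomizing WITH EMPTY KEY.
    CLASS-METHODS get RETURNING VALUE(rt_settings) TYPE tt_settings.
  PRIVATE SECTION.
    CLASS-DATA gt_settings TYPE tt_settings.
    CLASS-DATA gv_loaded   TYPE abap_bool.
ENDCLASS.

CLASS lcl_customizing IMPLEMENTATION.
  METHOD get.
    IF gv_loaded = abap_false.
      " The database is hit only on the first call of the session
      SELECT * FROM zcustomizing INTO TABLE gt_settings.
      gv_loaded = abap_true.
    ENDIF.
    rt_settings = gt_settings.
  ENDMETHOD.
ENDCLASS.
```

Every caller goes through `lcl_customizing=>get( )`; only the first call reads the database.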


5th point changeability

Is it possible to add additional logic to your coding without having devastating consequences?

Easy point, isn't it? Just make sure that you implement your code well and that it is possible to change it in an easy way. For example, if you added source code in different places and one of these spots cannot exist without another, make sure you have a reference.

A very effective way to handle this is to implement unit tests. With a unit test you are just one click away from knowing whether a change broke anything.


6th point transferability

Is it possible to transfer my code to another spot if needed?

Try to make your source as unattached to its spot as you can. Use all the advantages ABAP (or your programming language) gives us to develop things. Use interfaces, pass the values, and extract your code into separate classes / functions as much as you can.


7th point readability

Is my source readable if I would see it for the first time?

  1. If you answer this question with yes, just ask the developer next to you and prove it.
  2. I think it is an easy point, and there is no need to explain it with a long story.


8th point understandability

Will I understand the source in 6 months' time?

If you answered the readability question with yes, you might say: isn't that the same? In my opinion, no, because just being able to read a source does not mean I understand it. Every one of us has seen a lot of code, and I'm pretty sure most of us agree that it isn't easy to work through a snippet and get the idea behind it. You need to see your source in the big picture, and that is where understanding is needed. Perhaps you aren't that sure now, so you might add some comment lines to your source and also describe the methods in a few sentences.


9th point learnability

Would I teach someone to code like that?

What? That might be what's in your mind right now, but this is pretty important. If you see your code in front of you and scroll through it, think about all the small details and ask yourself the question mentioned above. If you think you wouldn't teach someone to develop this way, you should think about it more than twice and change it into something you would teach. That is the point here.


10th point needed

Is my code needed?

OK, that is not a question you should ask yourself only after developing your stuff; asked up front, it might save a lot of work.

I just want to make sure that you are really certain your development is needed. If you have any doubts that the implementation might not be needed for some reason, ask your questions. I know a lot of developments out there which weren't needed in the end and were just wasted time for everybody involved...


And ten facts sounds a lot better than nine facts.





That’s it.

These are my personal ten recommended points to think about twice. Keep them in mind; I'm pretty sure you will save time, perhaps not during the development, but afterwards when analyzing, changing, or even enhancing your source.


The bridge to the quality management:

Do you think these are points we should consider?


Yes? Here is the fact: the first six are from the ISO/IEC 250xx series (formerly ISO/IEC 9126). Now you might not say again that quality management does not help you in your daily work. It is always present, but most of us don't recognize it as quality management in the classic way; in my opinion, a good thing.


A summarization might be:

Just combine beauty and functionality.


Feel free to leave a comment, and happy coding!




Hello SCN members,


Good Evening.


Today I want to explain a simple and very important point, particularly about reports sent to persons like the President or Vice-President of a company.


I was asked to change a report based on the user settings. For your understanding, this is where they are found:


SAP Menu ---> System ---> User Profile ---> Own Data, then click on the Defaults tab.


defaults screen.png

The Decimal Notation above has three possible values, i.e. space, 'X' and 'Y':

space: 1.234.567,89

'X': 1,234,567.89

'Y': 1 234 567,89

Different countries use different decimal notations based on their habits or convenience.

Regarding that, I searched a lot and found that people have used the function module below for currency and even quantity fields.


But the above function module did not work in all situations.

So I made a change, as below:









DATA:  L_MATCH    TYPE STRING VALUE `^\s*-?\s*(?:\d{1,3}(?:(T?)\d{3})?(?:\1\d{3})*(D\d*)?|D\d+)\s*$`.











* Get separator from user record





   WHEN SPACE.  " 1.234.567,89

     L_THOUSANDS = '.'.

     L_DECIMAL   = ','.

   WHEN 'X'.    " 1,234,567.89

     L_THOUSANDS = ','.

     L_DECIMAL   = '.'.

   WHEN 'Y'.    " 1 234 567,89

     L_THOUSANDS = ` `.    " a space, as a string literal

     L_DECIMAL   = ','.



* Modify regex to handle the user's selected notation



     REPLACE ALL OCCURRENCES OF 'T' IN L_MATCH WITH ' '.    " (this did not work as intended: the trailing blank of the character literal ' ' is trimmed, so 'T' is deleted instead of being replaced by a space; a string literal ` ` would work)

*so, I did it as below instead,


*L_MATCH   = `^\s*-?\s*(?:\d{1,3}(?:(T?)\d{3})?(?:\1\d{3})*(D\d*)?|D\d+)\s*$`.  " replaced the T with a space:

L_MATCH   = `^\s*-?\s*(?:\d{1,3}(?:( ?)\d{3})?(?:\1\d{3})*(D\d*)?|D\d+)\s*$`.







*  if SS_USR01-DCPFM <> 'Y'.


* Check the number is valid


*  endif.


   MESSAGE 'Invalid' TYPE 'E'.



* Translate thousand separator into "space"



* Translate decimal into .



* Remove spaces



*To get the User profile format after all calculations.


           quantity = output.

write: output, / 'Do calculations and print the values in User Profile Settings:', quantity.
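Putting the fragments above together, the whole check might look like the following sketch; variable names follow the post where they exist, while L_INPUT and the USR01 read are illustrative:

```abap
* Sketch: validate an input string against the user's decimal notation.
DATA: l_match     TYPE string
        VALUE `^\s*-?\s*(?:\d{1,3}(?:(T?)\d{3})?(?:\1\d{3})*(D\d*)?|D\d+)\s*$`,
      l_thousands TYPE string,
      l_decimal   TYPE string,
      l_dcpfm     TYPE xudcpfm,
      l_input     TYPE string VALUE `1,234,567.89`.

* Decimal notation from the user master record
SELECT SINGLE dcpfm FROM usr01 INTO l_dcpfm WHERE bname = sy-uname.

CASE l_dcpfm.
  WHEN space. l_thousands = `.`. l_decimal = `,`.  " 1.234.567,89
  WHEN 'X'.   l_thousands = `,`. l_decimal = `.`.  " 1,234,567.89
  WHEN 'Y'.   l_thousands = ` `. l_decimal = `,`.  " 1 234 567,89
ENDCASE.

* String literals (backquotes) keep a space from being trimmed away
REPLACE ALL OCCURRENCES OF `T` IN l_match WITH l_thousands.
REPLACE ALL OCCURRENCES OF `D` IN l_match WITH l_decimal.

FIND REGEX l_match IN l_input.
IF sy-subrc <> 0.
  MESSAGE 'Invalid' TYPE 'E'.
ENDIF.
```

Strictly speaking, a decimal point inserted for D should be escaped as `\.` in the regex; the unescaped form still accepts all valid inputs, it is just slightly more permissive.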

Note: Whenever you have changed the user profile and want to see the values formatted accordingly (for QUAN fields, usually 13 digits and 3 decimals), you need to log out and log in once; only then are the user settings applied.


Siva kumar. D


You have probably stumbled upon some cool projects on GitHub, like the Linux kernel or Node.js. Until now it has been quite cumbersome to push your open-source ABAP to Git repositories; abapGit will help to make this easier.



abapGit is a Git client written in ABAP for ABAP; it lets you clone projects or commit objects to a Git repository.




Installation:

  • Download the source code from the github repository
  • Paste the code into a new report using SE38
  • Configure SSL in transaction STRUST
  • Run




Uninstallation:

  • Delete the report
  • Delete the standard texts ZABAPGIT* via SO10



How abapGit works

The first step is to clone a repository; this will create the objects from the repository in the SAP system. After this, one of the following commands applies:



If the files in the repository have been changed, the ABAP objects can be updated with the pull command



If the latest changes are implemented in the SAP system, and objects in SAP are changed, the changes can be pushed to the Git repository using the commit command.



After having pulled or cloned the repository, the objects will be in sync, at this stage it is possible to add new ABAP objects to the Git repository.




The "distributed" part of Git is not implemented in abapGit; it will pull data from the repository quite often, and advanced Git commands like blame etc. are also not supported. It is currently only possible to serialize reports, classes, data elements, and domains; other objects will be implemented over time. All code is serialized to .abap files in the repository, making it easy to read online; metadata is serialized to .xml files.

Beware that this is alpha software and provided "as is", take care and only run in test systems.




Hopefully abapGit will help to ease cooperation in ABAP open-source projects and inspire more people to do open-source ABAP.

Initial requirement


My requirement was to create two reports with ALV Grid output. These reports should be called via transaction codes, and their data gathering should be done through function modules, because the identical data will be needed by a web tool accessing it via RFC.





Since there is no data selection and no selection screen in these reports (and, as required, there shouldn't be one), after starting these transactions the users see nothing for several seconds, until the ALV Grid is displayed. So some users became unsure whether they had correctly started the transactions.



Looking for a solution


I was looking for ideas, and I found some threads here on SCN pointing to class CL_GUI_TIMER. So I decided to create a function module that calls a screen and closes it after the timer interval has finished.



Creating the function module


First I thought about the data to be passed to the function module:


  • a heading text for the popup screen
  • an info text, that some action has been started
  • a label text and a value text to tell,  from where the function module has been called
  • an info text, that the screen will be closed automatically.
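The local interface (a screenshot in the original) can presumably be reconstructed from the assignments below and the call at the end of the post, with the types taken from the TOP include:

```abap
*"*"Local Interface:
*"  IMPORTING
*"     VALUE(I_HEADER_TEXT)  TYPE  TEXT80
*"     VALUE(I_INFO_TEXT1)   TYPE  TEXT80
*"     VALUE(I_INFO_TEXT2)   TYPE  TEXT80
*"     VALUE(I_LABEL_TEXT)   TYPE  TEXT30
*"     VALUE(I_VALUE_TEXT)   TYPE  TEXT50
*"     VALUE(I_INTERVAL)     TYPE  I
```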


Code of the function module


FUNCTION zmy_popup_show.


*"*"Local Interface:










* Filling dynpro fields and interval

  xv_header_text                     =  i_header_text.

  xv_info_text1                      =  i_info_text1.

  xv_info_text2                      =  i_info_text2.

  xv_label_text                      =  i_label_text.

  xv_value_text                      =  i_value_text.


  xv_interval                        =  i_interval.


* Call info screen 9000

  CALL SCREEN                           '9000'

    STARTING                        AT  5 5.




Here I pass all input parameters to globally defined variables. All text fields will be shown on the screen.




Code of the function group's TOP include


FUNCTION-POOL zmy_popup.                    "MESSAGE-ID ..


* Definitions

CLASS xcl_event_receiver      DEFINITION DEFERRED.

DATA: xo_event          TYPE  REF TO xcl_event_receiver."#EC NEEDED

DATA: xv_header_text    TYPE  text80.

DATA: xv_info_text1     TYPE  text80.

DATA: xv_info_text2     TYPE  text80.

DATA: xv_interval       TYPE  i.

DATA: xv_label_text     TYPE  text30.

DATA: xo_timer          TYPE  REF TO cl_gui_timer.

DATA: xv_value_text     TYPE  text50.


* Definition of class XCL_EVENT_RECEIVER

  INCLUDE lzmy_popupcls.


For catching the FINISHED event of class CL_GUI_TIMER a local event receiver class is needed:



Event Receiver Class Definition ...



*&  Include           LZMY_POPUPCLS




*   CLASS xcl_event_receiver DEFINITION


CLASS xcl_event_receiver DEFINITION.

  PUBLIC SECTION.

    METHODS:
      timer_finished       FOR EVENT  finished
                                  OF  cl_gui_timer.

ENDCLASS.                    "xcl_event_receiver DEFINITION



... and Implementation



*&  Include           LZMY_POPUPCLI




*   CLASS xcl_event_receiver IMPLEMENTATION                            *


*   Handle events                                                      *



CLASS xcl_event_receiver IMPLEMENTATION.



*       METHOD timer_finished                                          *


*       Action after timer has finished                                *


  METHOD timer_finished.


        PERFORM                     exit_dynpro.


  ENDMETHOD.                    "timer_finished


ENDCLASS.                    "xcl_event_receiver IMPLEMENTATION


How to leave will be shown in FORM EXIT_DYNPRO later.




Definition of the Info Screen 9000


The flow logic looks pretty simple:



PROCESS BEFORE OUTPUT.

  MODULE call_timer.


PROCESS AFTER INPUT.

  MODULE exit_dynpro.



The screen contains the fields




and some frames:



Remark: „Überschrift“ means header/heading.




Definition of PBO module CALL_TIMER:



*&  Include           LZMY_POPUPO01




*&      Module  CALL_TIMER  OUTPUT


*       Create and start timer


MODULE call_timer OUTPUT.

* Set timer
  CREATE OBJECT xo_event.

  CREATE  OBJECT                     xo_timer
    EXCEPTIONS
      OTHERS                      =  4.

  CHECK sy-subrc                 EQ  0.

  SET HANDLER xo_event->timer_finished FOR xo_timer.

  xo_timer->interval              =  xv_interval.

  xo_timer->run( ).

ENDMODULE.




Here the timer object is created, the event handler is set, the timer interval is set and the timer is started.




Definition of optional PAI module EXIT_DYNPRO:



*&  Include           LZMY_POPUPI01




*&      Module  EXIT_DYNPRO  INPUT


*       Leave Screen


MODULE exit_dynpro INPUT.


  CASE  sy-ucomm.

    WHEN OTHERS.                    " e.g. a Close button

        PERFORM                     exit_dynpro.

  ENDCASE.

ENDMODULE.





This module is optional, in case you also want to be able to close the info screen manually.




Definition of FORM routine EXIT_DYNPRO:



*&  Include           LZMY_POPUPF01




*&      Form  EXIT_DYNPRO


*       Leave Screen


FORM exit_dynpro .


    FREE                                xo_timer.

    CLEAR                               xo_timer.


    LEAVE                           TO  SCREEN 0.


ENDFORM.                    " EXIT_DYNPRO



And the Function Group ZMY_POPUP looks like:



*   System-defined Include-files.                                 *


  INCLUDE lzmy_popuptop.                     " Global Data

  INCLUDE lzmy_popupuxx.                     " Function Modules



*   User-defined Include-files (if necessary).                    *


  INCLUDE lzmy_popupcli.                     " Class Implementation

  INCLUDE lzmy_popupf01.                     " FORM Routines

  INCLUDE lzmy_popupi01.                     " PAI-Modules

  INCLUDE lzmy_popupo01.                     " PBO-Modules



Now the Function Group ZMY_POPUP and Function Module ZMY_POPUP_SHOW are complete.



Calling Function Module ZMY_POPUP_SHOW from report


After testing this function module in SE37, I added an asynchronous RFC call to my ALV Grid reports looking like



  CALL FUNCTION 'ZMY_POPUP_SHOW'
     STARTING          NEW TASK  'POPUP'
     EXPORTING
       i_header_text          =  text-hdr
       i_info_text1           =  text-in1
       i_label_text           =  text-lbl
       i_value_text           =  xv_text_val
       i_info_text2           =  xv_text_in2
       i_interval             =  xk_interval.


And the result looks like




The screen closes automatically and all could be fine, if there were no ...




Further issues


The first issue is that for calling this function module in aRFC mode you need a free user session. If you have no free session, the call won't work. So I've coded a workaround: the function module is only called if a free user session is available:




      act_sessions            =  xv_s_act

      max_sessions            =  xv_s_max.


   CHECK  xv_s_max           GT  xv_s_act.


This is not a fine solution, but only a workaround.


The other issue is that, to process a function module in RFC mode, you need the authorization to do so, like:


Authority object:      S_RFC

Authority field:       RFC_TYPE      Value:    FUGR

Authority field:       RFC_NAME      Value:    ZMY_POPUP

Authority field:       ACTVT         Value:    16



Looking for improvement


So I'm searching for an improvement of this solution where I need neither a free user session nor additional authorizations.


I'm looking forward to seeing your suggestions.






Creating sales orders via ‘BAPI_SALESORDER_CREATEFROMDAT2’ using variant configured materials

SAP Product Id: LO-VC


SAP Provides a BAPI viz ‘BAPI_SALESORDER_CREATEFROMDAT2’ in order to create Sales Orders.

The usage of this BAPI is quite simple when it is used to create sales orders that do not use configured materials. But when it comes to creating sales orders that use variant material configurations, the logic for filling the prerequisite structures of this BAPI is a little complicated, hence this blog.

First things first: there is an SAP note which explains how the prerequisite structures should be filled, note 0000549563.

The four mandatory structures to be filled are;







The first & the second are pretty much self explanatory; it’s the third and the fourth that require some explanation.



Note 549563 says;

PART_OF_NO is the item number from the BOM


But please keep in mind that you will only get the right PART_OF_NO if you use the sales BOM and not the production BOM.





Here is how you find the item number for a material in a Bill of Material (BOM):




This is the tricky part, since there are a lot of dynamics involved here. For instance:


    a)  There could be default characteristics for a particular class, which might or might not appear in your order.


    b)  The value range supplied could go beyond what is maintained in characteristics maintenance (transaction CT04).


So how do you take care of that? We will get to those questions in a moment but first, here is the logic for populating this parameter.


The tables to be used are INOB, KSSK, KSML and CAWN


     a)  First, pass your material to the table INOB (Link between Internal Number and Object), where INOB-OBJEK = your material number, and fetch the object number CUOBJ (also enter your class type KLART and the name of the database table for the object, OBTAB; for variants this class type is 300).


     b)  Then pass the object number (CUOBJ) to the KSSK table, where KSSK-OBJEK = INOB-CUOBJ, to fetch the internal class number (CLINT).


     c)  Pass the internal class number (CLINT) into the KSML table such that KSSK-CLINT = KSML-CLINT to finally fetch the internal characteristic (IMERK). Put the results of steps a), b) and c) into an internal table, say table A.


          Now, since a picture speaks a thousand words, here is what we have done so far in a nutshell:




     d)  Now select the characteristics and their values from the master table CAWN by passing the material numbers that you are using to create the order. Put the results of step d) into another internal table, B.


          The idea here is this: since we arrived at table A by querying with the material and class, and at table B by passing just the materials, the results of table B should be contained in table A. So we loop at table B and read table A. Here is the code below:
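The lookup chain of steps a) to c) might be sketched as follows; the code in the original screenshot is not reproduced here, and LV_MATNR as well as the shortened field lists are illustrative:

```abap
* Sketch: material -> object number -> class -> internal characteristics.
DATA: lv_matnr TYPE matnr,   " the configurable material
      lv_cuobj TYPE cuobj,
      lv_clint TYPE clint,
      lt_imerk TYPE STANDARD TABLE OF imerk.

* a) Object number of the configurable material (class type 300 = variants)
SELECT SINGLE cuobj FROM inob INTO lv_cuobj
  WHERE klart = '300'
    AND obtab = 'MARA'
    AND objek = lv_matnr.

* b) Internal class number for that object
SELECT SINGLE clint FROM kssk INTO lv_clint
  WHERE objek = lv_cuobj.

* c) Internal characteristics of the class; these results form "table A"
SELECT imerk FROM ksml INTO TABLE lt_imerk
  WHERE clint = lv_clint.
```

Note that OBJEK is a generic character key field, so the material number and object number are converted implicitly when used this way.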



          After this you need to separate the characteristics from the quantity, and you do that as shown below (of course you can code it in whatever way you want):








  Now back to the questions that we asked at the beginning


     a) There could be default characteristics for a particular class, which might or might not appear in your order.

         This can be checked by inquiring the value ATSTD of the table CAWN.


         Here is a screenshot below:

         ss1.JPG

     b) The value range supplied could go beyond what is maintained in characteristics maintenance (transaction CT04).


      Use the code in box 3, reproduced below:



Finally call your BAPI



Don’t worry about the object dependencies; the BAPI takes care of them.


That's all folks !



Hello SCN-Community,


As mentioned in my first blog "BRFplus in Big Data scenarios", I want to share some other topics related to processing mass amounts of data. Faster than I expected, I solved a problem regarding a dialog for operational analysis. It's about topic 3 of my last blog:


“Exploratory search techniques to determine anomalies in invoices – how can we support the end users during their daily work? Can we use ALV grid to analyze data models with master-detail relationships?”


In this blog I want to describe a kind of analysis that can be compared to "looking for a needle in a haystack" or "exploring for something". If you are after something unspecific that you can't yet name, is it worth trying to build a data model in Business Warehouse, for example? How would you even describe that data model? To build such a model, you have to do some exploratory searching first. This work is mostly done by exporting the related data and playing around with Excel or some kind of statistics software.
If you have gained knowledge about a specific fact in your data and want to keep track of this constellation, you may want to build a check function to reproduce it at any given time. And we do not want the users to leave the system to do their work.
That's what this blog is all about: how can the user be supported in an exploratory search for anomalies in an operational analytics scenario without leaving the operational system? And how can we turn the facts of a completed exploratory search into a check that can be used to reproduce the search?


We start with the previously discussed business case: Processing a huge number of invoices sent to a statutory health insurance company.

A set of checks is processed every time new invoices arrive. Every check focuses on a specific aspect or constellation of an invoice and produces clarification cases that have to be analyzed by the end user. But these checks were developed for specific concepts, so they target the most commonly known anomalies in these invoices. What about other constellations that have not been considered before?

One solution to this problem may be a dialog in which the user can do some kind of pre-analysis to look for new patterns to identify anomalies, errors or even detect possible fraud cases among those invoices.


This task sounds easy: you take the invoices and put them into an ALV grid. As a result you can use all filters and sort criteria of the ALV grid, and you are also able to save your "query" for reuse as a user-specific layout.

At this point we need to raise some important aspects:


  • We have to deal with approximately 28,000,000 invoices a year for a single health insurance company. The number of invoices does not fit into an ALV grid.
  • Each invoice is supported by some medical documentation. We have to be able to search the invoices for the existence of a specific medical treatment, surgery, and so on.
  • These filter criteria have to be saved like usual ALV variants for reuse purposes. Unfortunately we can't use the normal ALV variants, since we are dealing with master/detail views. So we had to develop different techniques, which are the main topic of this blog entry.
  • The user must be able to see all of the data of a specific invoice, even the detail data.
  • Our solution cannot rely on HANA-specific techniques only. We have to support a kind of HANA-readiness without ending up with two different code lines.


In this case we are talking about a generic solution. With this tool, the user is able to do an exploratory search for billing errors in our scenario without moving into a Business Warehouse.

Introducing our UI prototype

The fact of the matter is that we cannot build upon HANA-specific features; we can only choose a solution with a standard ALV grid. We split the screen of the dialog into two sections: a master view with one ALV grid displaying the invoices, and a detail view containing three ALV grids corresponding to three kinds of supporting documentation.


If you double click an invoice, all of the detail data is loaded into the corresponding ALV grids.

Handling of mass data

We decided not to load all invoices into the ALV grid, to avoid overloading the master view. During an exploratory search you are not interested in each single invoice; rather, you have to find further filters to reduce the search result, to give your interest boundaries. From now on, we talk about an extract of the data (called "Ausschnitt" in the screenshots) and the entire data of the underlying table (called "Gesamtdaten"). The size of an extract is defined by the user. With the size of the extract, the user decides how many invoices are displayed and thus transferred to the front end. It is a constraint defining the maximum amount of data. To obtain this kind of truncation, we added a new button to the ALV grid.


“Sätze” stands for rows or even invoices which definess a kind of truncation. The given filter and sorting criteria of an ALV grid operates on the internal data table of the grid – so it only addresses the extract but operates very quickly.

If we want to reduce our search result effectively, the filter and sort criteria have to operate on the whole underlying data table. For this reason we added further functions to the grid to control the ALV filter and sort actions.


  • „Sortierung Ausschnitt“: sort the extract in the front-end (ALV grid standard)
  • „Sortierung Gesamtdaten“: sort the data in the backend (database) and rebuild the internal table of the ALV
  • „Filterung Ausschnitt“: filter the extract in the front-end (ALV grid standard)
  • „Filterung Gesamtdaten“: filter the data in the backend and rebuild the internal table of the ALV from that result


With the help of these new functions we are able to search for various facts in a huge amount of data. If the amount of data to be displayed is too big, we only see the tip of the iceberg due to the truncation.
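
The „Gesamtdaten“ variants can be sketched as a database round trip that applies the grid's criteria to the full table and truncates the result to the extract size. The table name ZINVOICE, the criteria strings, and the extract size below are illustrative, not taken from the actual report:

```abap
* Rebuild the grid's internal table from the database: apply the
* filter/sort criteria to the whole table, truncate to the extract.
DATA: lt_extract  TYPE STANDARD TABLE OF zinvoice,
      lv_max_rows TYPE i VALUE 500,
      lv_where    TYPE string,
      lv_orderby  TYPE string.

lv_where   = `AMOUNT > 1000 AND STATUS = 'E'`.  " built from the grid's filter criteria (LVC_T_FILT)
lv_orderby = `AMOUNT DESCENDING`.               " built from the grid's sort criteria (LVC_T_SORT)

SELECT * FROM zinvoice
  INTO TABLE lt_extract
  UP TO lv_max_rows ROWS
  WHERE (lv_where)
  ORDER BY (lv_orderby).
```

The new extract then replaces the grid's internal table, so the standard front-end filter and sort keep working on it.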

Detail filter – How to deal with them?

Searching by detail data was a bigger problem. An ALV filter only operates on its own data context, but the detail data lives in separate ALV grids. So we built a popup with additional filter criteria addressing the keys of the medical documentation we have to deal with.



The constraints of the detail filter are handled by a manager class of our master view. Each constraint is defined as a single Select-Option, and the manager class builds an SQL query from these constraints. Each master-to-detail relationship with a constraint is expressed as an EXISTS clause. With three detail relationships there are 8 valid combinations (2^3), so the SQL queries can be hard-coded in the manager class.
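
One of the eight hard-coded variants could look like the sketch below; the table and field names (ZINVOICE, ZDETAIL_A, ZDETAIL_B, BELNR, and the select-options) are stand-ins for the real billing and documentation tables:

```abap
* Variant "constraints on detail A and detail B, none on detail C":
* each constrained detail relationship becomes an EXISTS clause
* correlated with the master invoice table.
DATA lt_result TYPE STANDARD TABLE OF zinvoice.

SELECT * FROM zinvoice
  INTO TABLE lt_result
  WHERE belnr IN s_belnr
    AND EXISTS ( SELECT * FROM zdetail_a
                   WHERE belnr = zinvoice~belnr
                     AND akey  IN s_akey )
    AND EXISTS ( SELECT * FROM zdetail_b
                   WHERE belnr = zinvoice~belnr
                     AND bkey  IN s_bkey ).
```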




(Screenshots: the detail filter criteria for Detail A, Detail B and Detail C.)
Another problem was preserving a detail filter while saving an ALV layout in the master view. If the user defined a detail filter, it has to be saved together with the normal layout. We solved this by creating an add-on table holding the detail filter under the same key as the ALV layout. To do this, we added some functionality to the event handling of the ALV layout button. When the user loads a layout, the detail filter is loaded too, so the original search result is reproduced.


With these extensions to the ALV grid we realized a master-detail relationship with standard ABAP coding that allows the user to do exploratory searches over the whole data volume.

Advanced techniques

This solution is not the end of the road. Because a detail filter is defined via Select-Options, we cannot express all kinds of filtering. Suppose we want to track all invoices containing the medical treatments A and B but not C or D. With a Select-Option we can only combine values with OR, not with AND.
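
The limitation becomes visible when you write out the query the filter would need. With hypothetical tables ZINVOICE and ZDETAIL, "A and B but not C" requires one subquery per treatment value, which a single ranges table in one WHERE clause cannot express:

```abap
* A ranges table with the rows EQ 'A' and EQ 'B' means "A OR B".
* "A AND B AND NOT C" needs one (NOT) EXISTS per treatment value:
DATA lt_result TYPE STANDARD TABLE OF zinvoice.

SELECT * FROM zinvoice
  INTO TABLE lt_result
  WHERE EXISTS ( SELECT * FROM zdetail
                   WHERE belnr = zinvoice~belnr
                     AND treatment = 'A' )
    AND EXISTS ( SELECT * FROM zdetail
                   WHERE belnr = zinvoice~belnr
                     AND treatment = 'B' )
    AND NOT EXISTS ( SELECT * FROM zdetail
                   WHERE belnr = zinvoice~belnr
                     AND treatment = 'C' ).
```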


Additionally, we want to transform such a saved layout (with detail filter) into a customer-defined check function that is processed when a new invoice arrives. Here we hit a major ABAP limitation:


The 8 hard-coded EXISTS clauses no longer match the new requirements, so we would have to work with a dynamic WHERE clause, which currently does not support subqueries. We hit a dead end.


Short of using HANA-specific functionality, the only options we saw were native SQL or the generation of ABAP code. We chose code generation. Our manager class is able to transform a saved ALV layout with an applied detail filter into an SQL query. This query is embedded into a generated ABAP class that can be used in our operational analytics framework.
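
The blog's framework generates a full ABAP class; as a minimal illustration of the same idea, a subroutine pool can be generated at runtime from the assembled query. All names here (ZINVOICE, ZDETAIL, the FORM name) are invented for the sketch:

```abap
* Generate a throwaway program containing the built SELECT and call it.
DATA: lt_source TYPE TABLE OF string,
      lt_result TYPE TABLE OF zinvoice,
      lv_prog   TYPE progname,
      lv_msg    TYPE string.

APPEND `PROGRAM.`                                             TO lt_source.
APPEND `FORM check TABLES result STRUCTURE zinvoice.`         TO lt_source.
APPEND `  SELECT * FROM zinvoice INTO TABLE result`           TO lt_source.
APPEND `    WHERE EXISTS ( SELECT * FROM zdetail`             TO lt_source.
APPEND `                     WHERE belnr = zinvoice~belnr ).` TO lt_source.
APPEND `ENDFORM.`                                             TO lt_source.

GENERATE SUBROUTINE POOL lt_source NAME lv_prog MESSAGE lv_msg.
IF sy-subrc = 0.
  PERFORM check IN PROGRAM (lv_prog) TABLES lt_result.
ENDIF.
```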

What should be improved? – A wish list / Subject to Change

During the development of this dialog (which is based on the ALV grid class CL_GUI_ALV_GRID on NetWeaver 7.40 SP4), we also examined the use of the ALV grid with IDA (HANA grid class CL_SALV_GUI_TABLE_IDA, on both NetWeaver 7.40 SP4 and SP6). We noticed that this grid is also available on non-HANA systems. That is a very important fact: it may allow us to rely on a single codebase, so we would not need to separate HANA-only features into another codebase that has to be switched off on non-HANA systems. Unfortunately the current version of the HANA grid does not give us the ability to influence the handling of the grid the way a normal ALV grid does. We need the ability to take care of the grid's internal data table in non-HANA scenarios. Another solution could be a refresh function provided by the grid: calling it would send a new SQL query to the database with the defined filter and sorting criteria. We also need the ability to define the truncation.


Even after solving this, we would still not be ready to use the HANA grid, because we cannot manipulate the WHERE clause. In the meantime I think the solution will be a master-detail relationship based on HANA features, but that doesn't help us in our case.


Due to our business scenario, our solution has to be usable by both HANA and non-HANA customers.


As I described earlier, we use code generation to avoid native SQL. But code generation doesn't help us in the case of the HANA grid; remember, we do not have the ability to query the database on our own.


With our solution, we are able to support the user with an exploratory search technique. It's a kind of generic solution that does not require further development once it is transported to the customer. Since the user never wants to jump from one system to another (for instance into a BW system, do an analysis there and jump back to the operative application), he is able to build a check out of the combined filter criteria and integrate it into the previously defined set of checks that are processed when a new invoice arrives.


This prototype is not finished. Some technical problems must be solved to build further features on top of this approach. But these problems are tough, because they depend on needed extensions to Open SQL (subqueries in dynamic WHERE clauses) or to the HANA grid.


The general availability of the HANA platform features brings not only a huge gain in speed but also additional functionality to operate on big data. But first we have to lift the customers to that platform. I suggest this can only be done by building hybrid solutions that handle the base business cases everywhere, and that run faster and offer additional benefits on HANA. To minimize the effort, we have to rely on a single code line to support.

As ABAPers we have SAT and ST05 (or sometimes ST12) for tracing in our toolbox, and recently I found a report which can also do the trace job.


The trace information it generates is quite technical, though, and perhaps more useful for those who are interested in the ABAP kernel.


How to use this report


1. In SE38, execute report RSTRC000 and mark the checkbox "Keep Work Process", so that a free work process is owned exclusively by you until you release it via this report again. Then change the trace level to "2: Full trace" and select the component you would like to trace, for example Database.



Click the save button and you can see that work process 23 is locked.


You can observe that work process 23 has the status "halt" in tcode SM50.

2. Now you are ready to run the program you would like to trace (a process similar to SAT or ST05). Use /nse38 to go to the ABAP editor from the current screen of report RSTRC000 and run your program. In my case, I ran a report which queries material data from database table COMM_PRODUCT. Once the program finishes, run report RSTRC000 again.


Click the "Default val." button so that the trace level is changed back to 1 automatically,


then click the save button and you can observe that the previously locked work process 23 is released.


Now you can click the "Display" button to review the trace log:


You can also export the trace locally to review it. I prefer to use my favourite text editor, Sublime Text, to review the text file.


Below I list the trace output of several trace components which I have tried myself.


Database log


From the log, I can find which database tables are involved in the report execution and which ABAP program triggers each access. Some C function calls can also be observed, but, perhaps for security or authorization reasons, we cannot review source files like ablink.c in folder /bas/*.


We can also find the detailed Open SQL statements in the log; however, I could not find the values of the query parameters, which are displayed as ? in the trace.


ABAP proc.


It lists all the ABAP classes involved in the report execution, but without the method names of those classes. In my case, I can only learn from the trace that in total 40 different ABAP classes with prefix CL_CRM_PROD* (which I am responsible for) are involved in the execution.


Database (DBSL)


Since we currently use HANA as our database, I can get a rough understanding of how an Open SQL statement like SELECT ... FROM table is executed in HANA.


Lock Management


This time I would like to trace the lock behaviour in tcode COMMPR01. I switch to edit mode, which triggers a lock request to the enqueue server to lock the product, and then I change its description field.


In the trace this enqueue request is perfectly recorded:


  • the enqueue object
  • the database table on which the enqueue object is working
  • the guid of the product instance being locked
  • the tcode name COMMPR01
  • the user who triggered the enqueue request



From my point of view this option is a good substitute for the enqueue trace in ST05.






I run my report ZHANA_OBJECT_SEARCH in the background,


and I can see from the job log that it is successfully executed.


This information is also available in the RSTRC000 trace:


I didn't try all the other trace options; maybe they are useful in some extreme use cases. If you are interested, you can try them yourself now.

Typically, SAP ABAP based programs and applications are not known for user friendliness. However, we can apply a set of design techniques which, combined, make programs much more user friendly and easier to maintain and support. Easy-to-use programs are better adopted by users and lead to savings in time and effort throughout the life cycle of the program.


In this post I have summarized a list of techniques which can serve as a checklist when designing or reviewing a program. Some of these techniques are well known; however, some get overlooked during the design and build process.

These techniques can reduce the cost of the design phase, since the programs become easier to develop and test, resulting in fewer defects, less rework and quicker user testing. This means faster time to market. Moreover, these techniques lead to more predictable program designs, lower support cost and good SLA adherence.

Program Lifecycle and benefits of user friendly design.PNG

Figure 1: Program Life-cycle and benefits of user friendly design


When designing a program, the first step is to identify the audience, i.e. the users of your program. This will help you determine the environment in which the program will be used and how users will interact with it, and lets you design the program and its user interface with the end user in mind. Understanding the user’s perspective helps you make appropriate design decisions, which leads to a good quality program.

Typically, there are 3 types of users for a program –

  1. Business users/end users who will use the program to perform various business functions
  2. Technical consultants or developers who will maintain the program in the future, and
  3. Support team members who will support or use the programs (or the product/application), depending on the purpose of the program


Types of user.PNG

Figure 2: Types of users


The next step is to apply the various applicable techniques to your programs.


A. Easy to use

The program should be easy to use for end user. It should have below listed features/attributes.


  • Simple interface
    • The program must have a simple user interface with clearly organized screen elements.
    • Any field name or label on the screen should be easy to understand and self-explanatory.
    • The program design should adhere to design standards.
    • The interface should have a user interface design consistent with other programs in the organization.
    • The user interface should be easy to navigate.
    • When possible provide keyboard shortcuts to various commands.
    • If the program performs multiple operations then provide a Menu with clearly organized functions and sub-functions.
    • Use contextual menus to provide relevant shortcuts together.


  • Good error handling
    • The program should have robust data validation so that errors are minimized.
    • Always show meaningful error messages which are easy to understand and help in determining the next steps to be taken.
    • Create a log file in case there is lot of information to be provided along with the error.
    • Programs should be written with effective error handling so that there are no short dumps.
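
For example, wrapping risky statements in TRY ... CATCH turns a would-be short dump into a message the user can act on (a generic sketch):

```abap
DATA: lv_result  TYPE i,
      lv_divisor TYPE i VALUE 0,
      lx_error   TYPE REF TO cx_sy_zerodivide.

TRY.
    lv_result = 100 / lv_divisor.   " would otherwise cause a short dump
  CATCH cx_sy_zerodivide INTO lx_error.
    " Show the exception text as an informational message instead
    MESSAGE lx_error->get_text( ) TYPE 'I' DISPLAY LIKE 'E'.
ENDTRY.
```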


  • Easy to execute
    • Make sure that documentation related to the program is accessible from program main screen either using the help documentation feature or by giving appropriate links to where documentation can be found.
    • Use meaningful field names on the user interface screen.
    • Try to provide default values on input fields where possible.
    • Use selection screen variants to help users choose various scenarios easily.
    • Use layout variants to help users choose desired output layout quickly.
    • Show a progress indicator while the program is executing to give a visual idea of how much processing is pending.
    • If the program involves uploading a file for processing, then the documentation should contain the file layout and a sample file so that any user of the program can figure out the file format to be used. Also, in such cases, mention the file extensions allowed by the program, such as .CSV, .TXT, .XLS, etc.
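
A progress indicator takes only a few lines, for example with the standard function module SAPGUI_PROGRESS_INDICATOR (lt_items and ls_item are placeholders for the table actually being processed):

```abap
DATA: lv_pct  TYPE i,
      lv_text TYPE string.

LOOP AT lt_items INTO ls_item.
  " Percentage and text shown in the GUI status bar
  lv_pct  = sy-tabix * 100 / lines( lt_items ).
  lv_text = |Processing item { sy-tabix } of { lines( lt_items ) }|.
  CALL FUNCTION 'SAPGUI_PROGRESS_INDICATOR'
    EXPORTING
      percentage = lv_pct
      text       = lv_text.
  " ... process ls_item ...
ENDLOOP.
```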



  • Easy to use result
    • Provide ability to choose the data export format such as CSV, spreadsheets etc. from selection screen. This makes gathering results in a usable format much faster.
    • Provide ability to print data.
    • Provide ability to receive output data as an email attachment.


  • File uploading
    • If the program deals with uploading a file for processing, then give options to read the file from the desktop (presentation server) or from a network directory (application server).
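
A common pattern is a radio button on the selection screen that switches between GUI_UPLOAD for the presentation server and OPEN DATASET for the application server (p_local, lv_filename and lt_file are illustrative names):

```abap
DATA: lt_file TYPE TABLE OF string,
      lv_line TYPE string.

IF p_local = 'X'.
  " File on the user's desktop (presentation server)
  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename = lv_filename
      filetype = 'ASC'
    TABLES
      data_tab = lt_file
    EXCEPTIONS
      OTHERS   = 1.
ELSE.
  " File in a network directory (application server)
  OPEN DATASET lv_filename FOR INPUT IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc = 0.
    DO.
      READ DATASET lv_filename INTO lv_line.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      APPEND lv_line TO lt_file.
    ENDDO.
    CLOSE DATASET lv_filename.
  ENDIF.
ENDIF.
```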


  • Ability to run in background
    • Provide ability to run the program in background in case there is large amount of data to be processed.
    • Provide enough informational messages which will get captured in the job log.
    • Provide ability to receive an email notification when the background job is finished.


B. Easy to maintain

The program should be easy to update or maintain for another developer. It should have below listed features/attributes including the basic hygiene factors such as adherence to development standards and best practices.


  • Adhere to development standards so that code is easy to maintain in future.
  • Make sure that there are no code inspector errors. In case there are certain errors which cannot be resolved, add comments in the code explaining why they cannot be resolved.
  • Use meaningful variable names.
  • Make sure that the code and the technical specification is well documented.
  • Format the code using the Pretty Printer.
  • Arrange the subroutines in a meaningful sequence, and consider adding a sequence number to the subroutine name.
  • Modularize the code as much as possible to promote reuse and maintainability.


C. Easy to support

The program should be easy to support for the technical/functional support team. It should have below listed features/attributes.


  • Program should be easy to troubleshoot for the support team.
  • Program should generate sufficient logs which support team can use for troubleshooting.
  • Provide detailed technical information about an error or undesirable situation.
  • Make sure that it is easy to pinpoint the type of issue: data, user input, dependency, or system.


This is my first post about making programs more usable. In the next posts I will try to cover some of the above points in more detail, with examples or sample code. Please do share comments about any other usability points you think should be considered when designing programs.


Below are some good reads related to this topic:



Thank you for reading.

Scratch no longer!


Ok, so this is going to be a really quick post, just because I think there are a lot of people out there who don't know about this functionality, and it's very helpful.


It has happened to me many times: I'm working on a Smartform, I try to test it, and nothing happens; the function module aborts with some formatting error or sometimes an "others" exception. I then had to try to understand what happened, commenting out code or deleting windows (after making a backup) until I found the problem, usually a missing logo or a window exceeding the page limits. A missing text module will also cause this.


But there's an easier way to find out what the problem is, SFTRACE!


Let's work through an example.


So, I tried to print/preview a printout and it didn't work. What now?


I execute transaction SFTRACE and turn the trace on:





Try to print again, then refresh the trace. An entry will appear!




Now look at the info... in my case apparently there's a missing graphic!





There you have it. I hope this is helpful.


Best regards,

Bruno Esperança

LSMW : Issue with Symbols in the Load File ( Display Convert Data Step)


Issue :

I found this issue while loading data for a general task list through LSMW (Direct/Batch Input).

The Convert Data step shows the correct record count, but the Display Converted Data step shows only a few records.

(For task list creation using LSMW, please refer to LSMW – Task List Creation with 5 level input file using Batch Input.)


Using the example from that document: the input load file contains 187 records, i.e., a total of 16 transactions.

Step: Read Data

All 187 records were read and the correct count is displayed on the screen.


Step: Convert Data

Even after the Convert Data step, the system shows the correct count.



Step: Display Converted Data

Here only a few records are displayed. Although 187 records were converted, only 61 records are shown at the display level and the remaining ones are skipped.



After a lot of R&D, I found that the root cause is the presence of a special symbol in the load file.


In the image above, a special symbol is being passed to SAP, which it is unable to identify.

Because the system cannot interpret the symbol, it stops reading at that point and does not read any further. As a result, it reads only up to the symbol, i.e., 'TX42 Ensure the operator'.



If anyone faces this kind of issue, please make sure to cleanse the data in the load file.
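
One way to cleanse such lines in ABAP, before or instead of fixing the file manually, is to strip everything that is not printable:

```abap
* Remove non-printable characters from a load file line (sketch;
* the sample content, including the embedded tab, is made up).
DATA lv_line TYPE string.

lv_line = 'TX42 Ensure the operator' &&
          cl_abap_char_utilities=>horizontal_tab && 'rest of text'.

" POSIX class [:print:] matches printable characters only
REPLACE ALL OCCURRENCES OF REGEX '[^[:print:]]' IN lv_line WITH ''.
```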

Though the symbol looks trivial, it created a lot of problems.   :-)


Hope this might be helpful...!




V Prasanth Kumar Reddy

As developers, we frequently come across BDC/BAPI developments, and with BDC and BAPI, error handling plays a crucial role.


Recently, I was engaged in a BDC development. With BDC, BDCMSGCOLL plays a crucial role when opting for the 'CALL TRANSACTION' method.


In my case, I simply need to display meaningful messages from the BDCMSGCOLL table returned after 'CALL TRANSACTION'.


Instead of going from pillar to post, a simple combination of function modules does the entire job. The output and sample code are shown below.


Function Modules Used :


* Messages of CALL TRANSACTION
DATA: messtab  TYPE STANDARD TABLE OF bdcmsgcoll,
      bapiret2 TYPE STANDARD TABLE OF bapiret2.

**************************** Some BDC here ************

PERFORM bdc_field USING 'BDC_CURSOR'  'DF05B-PSBET(01)'.
PERFORM bdc_field USING 'RF05A-ABPOS' '1'.

ct_params-defsize = 'X'.
ct_params-updmode = 'L'.
ct_params-dismode = p_mode.

* tcode and bdcdata are filled elsewhere in the BDC program
CALL TRANSACTION tcode USING bdcdata
     OPTIONS  FROM ct_params
     MESSAGES INTO messtab.

* Convert the raw BDCMSGCOLL entries into BAPIRET2 format
CALL FUNCTION 'CONVERT_BDCMSGCOLL_TO_BAPIRET2'
  EXPORTING
    imt_bdcmsgcoll = messtab[]
  IMPORTING
    ext_return     = bapiret2.

* Display messages from BAPIRET2: pass the converted table to the
* display function module (its name is not preserved in this extract)
*   it_return = bapiret2.


With the combination of the above two function modules, the desired output is obtained without any custom code.



BAPIRET2 msg.png


A simple search through the utility function modules in the standard SAP repository is often rewarding: almost every time you come out clean, with no bug-fixing or message formatting to write yourself.


*with SAP_ABA 731

This is my first post in this blog. While playing with ABAP I thought it would be cool to add methods to a class dynamically. I am sharing sample code that does just that.


After searching the internet, I found some references to the program SBMIG_CREATE_CLASS_AND_TTYPES, which creates a class/type dynamically. Tracing its code I found the following function modules:



  • SEO_METHOD_CREATE_F_DATA: creates a method definition in a class.
  • SEO_PARAMETER_CREATE_F_DATA: creates a method parameter.
  • SEO_METHOD_GENERATE_INCLUDE: creates the code implementation.




There isn’t much documentation (or there is and I couldn't find it), so I played with the code and wrote a simple program that adds a method to a class. The program (see the code below) does not save the method; after the code has run, the method no longer exists. It creates a public static method called DESCR on an existing class called ZCL_TEST. The new method returns a string 'Test'.


DATA: l_wa_method           TYPE vseomethod,
      l_wa_method_parameter TYPE vseoparam,
      l_wa_method_source    TYPE seo_method_source,
      l_tbl_code            LIKE l_wa_method_source-source,
      l_wa_code             TYPE string,
      mtdkey                TYPE seocpdkey,
      cfkey                 TYPE seoclskey,
      l_v_r(10)             TYPE c,
      l_v_method_name       TYPE string VALUE 'DESCR',
      l_v_classname(60)     TYPE c VALUE 'ZCL_TEST'.

* Method definition
l_wa_method-clsname    = l_v_classname.
l_wa_method-cmpname    = l_v_method_name.
l_wa_method-descript   = 'Testing Method'.
l_wa_method-version    = '1'.
l_wa_method-langu      = sy-langu.
l_wa_method-mtdtype    = '0'.
l_wa_method-mtddecltyp = '1'. "static method = 1, instance = 0
l_wa_method-exposure   = '2'. "public = 2, private = 0, protected = 1
l_wa_method-state      = '1'.
l_wa_method-refclsname = l_v_classname.
l_wa_method-refintname = l_v_classname.
l_wa_method-refcmpname = l_v_method_name.

* Returning parameter of the new method
l_wa_method_parameter-clsname = l_v_classname.
l_wa_method_parameter-cmpname = l_v_method_name.
l_wa_method_parameter-sconame = 'r_v_desc'.
l_wa_method_parameter-version = '1'.
l_wa_method_parameter-typtype = '1'. "like = 0, type = 1, type ref = 2
l_wa_method_parameter-type    = 'char10'.

* Source code of the method implementation
l_wa_method_source-cpdname  = l_v_method_name.
l_wa_method_source-redefine = '0'.
l_wa_code = ' r_v_desc = ''Test 12121''.'.
APPEND l_wa_code TO l_tbl_code.
l_wa_method_source-source = l_tbl_code.

mtdkey-clsname = l_v_classname.
mtdkey-cpdname = l_v_method_name.

MOVE-CORRESPONDING l_wa_method TO cfkey.

* Refresh the class buffer for ZCL_TEST
CALL FUNCTION 'SEO_BUFFER_REFRESH'
  EXPORTING
    cifkey  = cfkey
    version = '0'.

* Create the method definition (save = seox_false: in memory only)
CALL FUNCTION 'SEO_METHOD_CREATE_F_DATA'
  EXPORTING
    save   = seox_false
  CHANGING
    method = l_wa_method.

* Create the method parameter
CALL FUNCTION 'SEO_PARAMETER_CREATE_F_DATA'
  EXPORTING
    save      = seox_false
  CHANGING
    parameter = l_wa_method_parameter
  EXCEPTIONS
    OTHERS    = 1.

* Generate the implementation include. Some parameter names below are
* reconstructed from SBMIG_CREATE_CLASS_AND_TTYPES; double-check them
* against the function module signature in SE37.
CALL FUNCTION 'SEO_METHOD_GENERATE_INCLUDE'
  EXPORTING
    mtdkey                = mtdkey
    force                 = 'X'
    redefine              = l_wa_method_source-redefine
    implementation        = l_wa_method_source-source
    suppress_corr         = 'X'
    suppress_index_update = 'X'
  EXCEPTIONS
    OTHERS                = 1.

* Call the newly created method dynamically
CALL METHOD zcl_test=>(l_v_method_name)
  RECEIVING
    r_v_desc = l_v_r.

WRITE l_v_r.

