
Have you ever heard of PANKS?


PANKS is an acronym that stands for Performance Assistant Note and KBA Search and, as its name implies, it's a new tool that searches for SAP Notes and KBAs from the Performance Assistant based on the ABAP error code and context.


Whenever you get an error message, the Performance Assistant is there to help you:


pic1.jpg

 


This is the Performance Assistant that comes along with ABAP error messages.


pic2.jpg

 

 

Looking more closely at the Performance Assistant, you will notice there is a new button:


pic3.jpg


Click on it and you will get SAP Notes and KBAs related to the error code that are relevant to your SAP NetWeaver system. In this example, message 3G 103:


pic4.jpg


How does PANKS work?


This is the algorithm used by PANKS to search for SAP Notes and KBAs (Source: Note 2020356):


TRANS = Transaction

ID = Message class

NNN = Message number


  1. TRANS and sy-cprog and (IDNNN or ID + NNN)
  2. TRANS and help_infos-report and (IDNNN or ID + NNN)
  3. TRANS and help_infos-dynpprog and (IDNNN or ID + NNN)
  4. TRANS and help_infos-PROGRAM and (IDNNN or ID + NNN)
  5. TRANS and help_infos-MSGV1 and (IDNNN or ID + NNN) [In case help_infos-sy_dyn = 'U']
  6. TRANS and IDNNN
  7. sy-cprog and (IDNNN or ID + NNN)
  8. help_infos-report and (IDNNN or ID + NNN)
  9. help_infos-dynpprog and (IDNNN or ID + NNN)
  10. help_infos-PROGRAM and (IDNNN or ID + NNN)
  11. help_infos-MSGV1 and (IDNNN or ID + NNN) [In case help_infos-sy_dyn = 'U']


  • If SAP Notes and/or KBAs are found in any of the steps [1-11], the remaining steps are skipped.
  • The different fields of the help_infos structure are filled by the screen processor according to the current state of the program at runtime. They do not need to be, and may not be, filled at the same time. They have different meanings: the program to which the screen belongs, the report from which the program to which the screen belongs is started, the program for a subscreen, and the code from which the program to which the screen belongs is started. You can find this out via System -> Status -> subscreen for 'Repository data', or via the 'Technical Information' button when the F1 help is displayed on the relevant screen.
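To make the composition of these search terms a bit more tangible, here is a rough, purely illustrative ABAP sketch of how pattern 1 could be assembled; the actual PANKS implementation is SAP-internal and may differ:


" Illustrative sketch only - composes search pattern 1 (TRANS and sy-cprog
" and IDNNN) from the current message context of the Performance Assistant.
DATA: lv_idnnn TYPE string,
      lv_query TYPE string.

CONCATENATE sy-msgid sy-msgno INTO lv_idnnn.                      " e.g. '3G103'
CONCATENATE sy-tcode sy-cprog lv_idnnn INTO lv_query SEPARATED BY space.
" lv_query would then be used as the search text for the SAP Notes / KBA
" search; if nothing is found, the next pattern in the list is tried.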


What's the advantage compared to the information traditionally provided by the Performance Assistant?


  • Complete integration with the SAP Notes and KBA repository.
  • Real-time, immediate documentation updates. Updating the information provided in the Performance Assistant is necessary, but it may take some time... Changes to KBAs and SAP Notes take immediate effect!


PANKS is part of a collection of tools that make it possible to search for Notes and KBAs directly from the system where the issue happens:



Unlike ANST, which searches for SAP correction Notes based on ABAP objects, PANKS relies on the quality of the text of the SAP Notes and KBAs, which means some adjustments to the SAP Notes and KBAs might be needed to get the best results.


Watch the video demo:



Release info:


PANKS will be released with the following SAP_BASIS patches:


Release            Package name

      700                        SAPKB70032
      701                        SAPKB70117
      702                        SAPKB70217
      710                        SAPKB71020
      711                        SAPKB71115
      730                        SAPKB73013
      731                        SAPKB73115
      740                        SAPKB74010


Can’t wait for the support packages?


Apply the following note (at present withdrawn for maintenance, thanks for your patience!):

 

2020356 - Search for SAP Notes in document viewers not possible


If you apply the note manually, take into account that it contains manual steps and that Note 2080528 has to be applied as a prerequisite. Note 2094901 is also a prerequisite for release 710.


Special thanks to Natalya Timchouk, who developed the Performance Assistant Note and KBA Search feature, and to Jesús López from our Product Support for his determination to make this possible.


More info on the KBA:


2096401 - FAQs: Performance Assistant Notes and KBAS search PANKS

 

SAP Developer and Architect Summit, 21st and 22nd November 2014

 

 

WP_20141120_001.jpg

A Dustbin

 

Introduction

Last week I had the pleasure of going to the inaugural “SAP Developer and Architect Conference” which is in essence a boiled down version of TECHED – you have all the same speakers as at the real TECHED in Berlin or Bangalore or Las Vegas (e.g. Thomas Jung) presenting the exact same content and making you do the exact same exercises. All this, at a fraction of the normal TECHED price; here in Australia the SAP developer community thought all our Christmases had come at once.

Graham Robinson challenged me at the conference to write a blog saying all the new things I had learned, and I could hardly refuse him. I did bear in mind the so called “Black and Decker” rules I had to abide by when writing about an SAP conference. These were introduced to make sure such blogs were all about the technical content of the presentations, and not what the blogger had for breakfast for example. The rules are simple, and as follows:-

  • Do not mention what you had for breakfast / morning tea / lunch / afternoon tea

  • Do not mention what you said to the taxi driver going to/from the conference

  • Do not mention or show photographs of any sort of waste disposal device – what is called a "dustbin" in the UK or a "trash can" in the USA

  • Do not mention any pop stars you may have met, in particular Stevie Winwood

 

All those rules seemed easy enough to follow, so I thought I would oblige Graham in his request.

 

Keynote

You are probably going to have to lie down before I say that the product HANA was mentioned. Is that a shock? I hope you are OK now. It was fairly strange that the speaker thought the audience would not know the names of SAP tables e.g. BSEG and so tried to keep things agnostic until the last minute. The message however comes through crystal clear – once you understand what HANA is about from a technical point of view you will understand what a sea change is coming. Oracle just does not see this yet, maybe due to confused messages coming out of SAP; once they do they will have to bite back big time, or face going under – this is the proverbial "Oracle Killer".

 

However I already knew that. What was much more interesting was the philosophy, which boiled down to "don't bother planning any of your IT projects, just go ahead and do them; if they work, then pretend you planned it". Some people call this "agile". This message resonates with the Australian market; we are far more concerned with innovating at a very fast pace than with making sure it works first.

SAP are clearly doing this themselves, eating their own dog food as it were – the release of SAP GUI 7.40 speaks volumes – no testing, just rush it out and see what happens.

 

First Morning

 

At some conferences you get to see people (like me) standing on stage and spouting nonsense for an hour. You have no idea if they are telling the truth or not. However if they give you a demonstration system and make you do exercises then it is clear what the product is, warts and all. I decided on a 100% exercise based stream. The first day was clearly going to be a HANA day but that is not a Bad Thing as I reckon all SAP customers will be on this eventually, once they recognise the million foot high ten mile wide elephant in the room.

 

First up, Rich Heilman from SAP talked about how to push down your code into the database using SQLScript. The vital point here is as follows – previously SAP recommended using CE FUNCTIONS to wrap your database queries; now they say that CE FUNCTIONS are rubbish, they are obsolete. Naturally I cannot guarantee they will not become flavour of the month again next year, but that is the current state of play.

 

Morning Tea

Whilst the delegates were arriving, we were served "Balut", which is a developed duck embryo boiled alive and eaten in its egg. This dish is really common in Southeast Asia and often served with beer. The ideal balut is 17 days old in the Philippines, but Vietnamese people prefer the 19-21 day old ones. To wash this down we had "airag", which comes from Mongolia. The Mongolian fondness for horses extends past riding and eating them to milking them. The mare's milk is stored in a cow's stomach near the entrance of each family's yurt, or nomadic tent, and left to ferment until it reaches close to 5% alcohol content, just the thing for breakfast.

 

Moving onto Stevie Winwood - Stephen Lawrence "Steve" Winwood (born 12 May 1948) is an English musician whose genres include rock, blue-eyed soul, rhythm and blues, blues rock, pop rock, and jazz. A multi-instrumentalist, he can play keyboards, bass guitar, drums, guitar, mandolin, violin, and other strings.

 

Second Session

 

As I may have mentioned, the HANA session lasted all day so we continued – we had some technical drawbacks like the Wi-Fi not working and the power points not working, but all credit to the organisers: they extended access to the development system for a few days after the conference was over.

 

As I do not work for SAP I can say that in some areas HANA does not yet look as polished as the mature product will no doubt look, e.g. you have to do a direct database update to authorise your user for debugging. However this is just nit-picking; the product is evolving at lightning speed and even after a few minutes you will see the benefits of HANA sticking out like a DB, as we say in Australia (stands for "Data Base").

 

Lunch

Everyone was waiting for lunch as they knew it would be "Casu Marzu" (meaning rotten cheese), a traditional sheep milk cheese from Sardinia, Italy. The cheese makers set the cheese outside and allow cheese flies to lay eggs inside of it. The larvae which hatch from the eggs have the job of breaking down the cheese's fats and fermenting it, which kills them. It's unnecessary to clear those dead larvae (white worms) from the cheese before consuming it. The "Casu Marzu" was banned for years and only sold on the black market, but a few years ago it was declared a traditional food and now it's legal to produce and sell, which is how this delicacy got to our conference. To wash this down we had "snake wine". This is made in two ways: first by placing a whole venomous snake into grain alcohol for a few months until the ethanol breaks down the poisons, and then by mixing the result with the blood and bile of a freshly killed snake with a shot of local brandy at the SAP booth in the exhibition hall.

 

Moving onto Stevie Winwood - Winwood was a key member of The Spencer Davis Group, Traffic, Blind Faith and Go. He also had a successful solo career with hits including "Valerie", "Back in the High Life Again" and two US Billboard Hot 100 number ones; "Higher Love" and "Roll With It". He was inducted into the Rock and Roll Hall of Fame as a member of Traffic in 2004.

 

Native HANA Development

 

In comes Thomas Jung to take us through developing a HANA application from scratch, no SAP system involved. These exercises would build on each other, starting with what I would call DDIC elements, then the tables, then the procedures, then the user interface.

 

HANA has no user interface, as indeed it should not according to the “separation of concerns” so UI5 was used as the output mechanism.

Afternoon Tea

In Australia we often have emu eggs for afternoon tea, but just to be different on the egg front, this time we had "Balut", which is a fairly common and unassuming street food available in both the Philippines and Vietnam. It has earned a widespread reputation as one of the all-time most interesting ethnic delicacies. Most of the eggs with which the western world is familiar are unfertilized eggs. The "Balut", though, are fertilized duck eggs, incubated or allowed to grow in vitro for a certain length of time, usually a few weeks. Peel back the shell and along with a typical soft-boiled eggy interior is also the small inert body of a fetal duck – small bones, feathers, beak and all, some more developed than others. Most accounts suggest the correct way to eat this is slurping it right from the shell with a pinch of salt. There is in addition a right way to enjoy "Balut", which is to wash it down with "Brennevin", a type of schnapps popular throughout the Nordic countries and in Iceland; this drink is made with fermented potatoes and caraway seeds. It is lethally strong stuff and is often drunk during the Thorrablot (the winter feast), where it is enjoyed with such delights as ram's testicles and boiled sheep's head.

 

After that treat we need a cocktail to calm us down and luckily we have several on offer : There’s Stool Sample: a mixture of cocoa, coffee liqueur, vodka, and cream liqueurs, strawberry syrup and fudge pieces (it’s all about the consistency, apparently); Sanitizer, a palate-cleansing mix of lychee liqueur, cherry brandy, grenadine, vodka and white rum; Urine, made with whiskey, brandy and limoncello, which comes in its own sample tube and Stomach Contents, a tipple, made with vodka and filled with chocolate pieces and Skittles.

 

Moving onto Stevie Winwood - In 2005 Winwood was honoured as a BMI Icon at the annual BMI London Awards for his "enduring influence on generations of music makers." In 2008, Rolling Stone magazine ranked Winwood #33 in its 100 Greatest Singers of All Time. Winwood has won two Grammy Awards.

 

Closing Speeches

 

First up was D.J. Adams, whom I have had the pleasure of drinking with at SAP events in Bonn in Germany and Eindhoven in the Netherlands. It was almost like meeting Stevie Winwood.

 

Then we had “Dave” from “Dave’s Brewing Tours” to talk about “innovation in beer”. In Australia 97% of the market is sewn up by two brewing companies, and 190 craft brewers compete for the last 3%.  I can see the connection to IT – we are seen as a commodity, how can we make ourselves different enough to be seen as making a difference?

Usually the last talk is a bad spot to be in as everyone is waiting for the free beer, and then it suddenly occurred to me that the three Master Brewers that Dave had brought on stage would be giving out free beer all evening to us delegates, which indeed they did.

 

Second Day

 

WP_20141120_003.jpg

 

Another dustbin

 

Morning Tea

On the second day we needed something to wake us up – this was going to be "Ikizukuri", a sashimi made from fish, octopus, shrimp and lobster. The chef performs the filleting without killing the animals involved and serves them up on a plate with the sliced flesh and still-beating hearts right next to the goldfish bowl in which you put your business card to win an iPad. To wash this down we had Gilpin Family Whisky, which is made from the urine of elderly diabetics. This wonderful concoction is the artistic statement of James Gilpin and unfortunately isn't sold in stores. Gilpin takes the urine of two diabetic patients daily, extracts the high sugar content, then uses that sugar in the fermentation of whisky production. You might think things could not get better than this, but afterwards we had Snake Bile Wine, made from bile extracted from a live cobra and rice wine. To make this concoction the SAP VP of marketing had to get a live cobra, cut him open, remove his gallbladder and extract the sweet, sweet bile, all live on stage whilst demonstrating SAP Lumira. He then mixed that up with rice wine and handed the result out to anyone who enjoys harnessing the power of cobra bile, which was everybody.

Gateway

This session was wonderful. This is just what I want from a conference like this, because I am an anal technical person. Gateway is the mechanism for exposing SAP data to UI5 using the OData protocol.

This session had a clear theme – if the performance of your UI5 application is bad, here are several things you can do to help it, and here are the exercises to learn how to do it yourself rather than having it spoon-fed to you. This is teaching someone to fish rather than giving them a fish.

I can only presume that in some cases the generated code in transaction SEGW (ABAP Gateway Modeler) is not as good as it could be – this has been the case with any statement which says "you do not have to write a line of code" since 1960.

UI5

The next gentleman did a live coding example of what UI5 looks like – for all of you ABAP programmers out there scared of horrible things like XML and JavaScript and HTML I would say "don't be scared – have a look – it is nowhere near as difficult as you might think".

Lunch

Lunch is the high spot of the day, and this fine day we had "bat paste": this dish is made by the pre-sales team at SAP catching live bats and forcing them into a pot of boiling milk. When you're done boiling the bats alive you chop them up and mix the pieces in with your chosen variety of herbs and spices. The paste is then applied to a dish consisting of Ox ***** and "Sannakji", which is a raw Korean dish that consists of live octopus with sesame and sesame oil. It looks safe but the little tentacles may choke you to death if they get stuck in your throat. This was garnished with Japanese tuna eyeballs. To wash this down there were two lovely cocktails – firstly "Baby Mice Wine", made from rice wine and baby mice. Traditionally a "health tonic" in Chinese and Korean cultures, baby mice are taken shortly after birth, eyes still closed, and dropped alive into a jug of rice wine. The wine is left to ferment for a few months until ready and at serving time, after the wine is imbibed, the embalmed mice are eaten. The second cocktail is "The Kim Jong Un Nuclear Bomb", which is made from the following ingredients all mashed up in a blender:-
1 Big Mac
1 McDonald's large fries
1 McDonald's tangy BBQ sauce
1 McDonald's milk shake (chocolate, strawberry and vanilla mixed)
1 McDonald's apple pie
Vodka

Evolution of ABAP Programming

This was the icing on the cake – I had been aware of the new ABAP features for a while, but the examples in the SAP online help and on SCN were all boring, using variables called A and B and C. This is very abstract, but SAP customers work in the real world with real variables that refer to actual things in the real world.

Thomas Jung squared this circle by not bothering with the official examples – he had written his own in ABAP and showed them to us live on stage. When someone asked "what if you do XYZ?", Thomas changed the code there and then and ran the program….

I would advocate Googling the ABAP advances in 7.40 – the blogs by Horst Keller are a wonderful start. Once you get your head around what is possible in ABAP 7.40 you will be on your knees asking the manager to upgrade the SAP system.

Afternoon Tea

For afternoon tea the first dish is the traditional Icelandic food “kaestur hakarl” which means "fermented shark". It's basically bits of stinking, rotten shark carcass. Hakarl is prepared by gutting and beheading a Greenland shark, burying it in sand for up to 12 weeks, then cutting it into strips and hanging them out to dry for a few months. After eating that, we all felt like some caviar -  luckily for us, as normal caviar is not good enough, we turn to the Mexicans who have come up with their own version “Escamoles”. These are the larvae of giant, highly venomous cockroaches. To wash this down we have “Three Lizard Liquor” - a fancy wine predominantly found in Asia which is typically produced by infusing lizards (three to be precise) into rice wine. The traditional eastern medicine theory states that this allows the lizard’s energy to be absorbed by the alcohol and thus transferred to whoever drinks it.

 

 

WP_20141120_004.jpg

 

Two more dustbins

 

Summary

 

That was a wonderful conference, giving you hands on experience of all the new technology.

 

What more could you ask for?

 

I will be 100% going to this next year, if you can reach Australia I suggest you do the same.

 

In the interim I hear there is a new book coming out next April, all about the new features in ABAP. Thomas Jung had written two books about next generation ABAP development, and they were wonderful, but he did not want to write a third, so SAP Press tried to pick the sexiest man in the universe to write a new version, and luckily they caught him between winning the 2013 Mr. Universe contest and running an "Iron Man" marathon in four minutes one second; they managed to beat all his top model girlfriends off him, and he agreed to write a book called "ABAP To The Future". So that is good news for everybody.

 

Cheersy Cheers

 

Paul

The last section (part 1) covered the steps required to enhance the vendor master (table/screens). In this section I will demonstrate the procedure to update the custom fields of the vendor master and generate change pointers for these fields (to eventually generate IDocs for changes to the field values).

 

Unfortunately, since there is no standard BAPI to update the vendor master record, the only options available to update custom fields of the vendor master are to either 1) perform a BDC on transaction XK02 to update the custom fields on the subscreen containing these fields, or 2) enhance the vendor API class VMD_EI_API to process and update the custom fields.

 

The former (BDC) may not be feasible in many scenarios, especially if the custom screen fields are read-only (like in my case) or the fields are not added to the subscreen in the first place.

 

Given this shortcoming of BDC, I will only cover (and recommend) the second option for the vendor master update (the vendor API class).

 

Step 1: Enhance the structure of the importing parameter to pass values for the custom fields.

  • Enhance structure VMDS_EI_EXTERN by appending the fields to include structure CI_CUSTOM. The screenshot below shows the vendor safety structure (which was also appended to LFA1) added as a component of CI_CUSTOM. This ensures that any additional fields added to this structure for the vendor master (general data) enhancement will automatically update the importing parameter structure.

     VMDS_EI_EXTERN_enhancement.png

  • Enhancing VMDS_EI_EXTERN enables passing values for the custom vendor master fields through the interface of method VMD_EI_API=>MAINTAIN( ). This method will be called to update the vendor master data.

     VMD_EI_API_MAINTAIN.png

Step 2: Implement an enhancement to read the values passed for the custom fields and assign them to the LFA1 structure, which is already populated by the standard code. The screenshot below shows the implementation of explicit enhancement section EHP603_EXTERN_TO_ECC_UPDATE_02 of enhancement spot ES_VMD_EI_API_MAP_STRUCTURE.

     ZMM_VSCS_ENTERN_TO_ECC_UPDATE.png
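Since the implementation is only shown as a screenshot, here is a heavily simplified sketch of what such an enhancement section implementation could look like. The implementation name and the variable names (is_vendors for the extended API structure, cs_lfa1 for the LFA1 work area) are placeholders; use the actual names available inside EHP603_EXTERN_TO_ECC_UPDATE_02.


ENHANCEMENT zmm_vscs_extern_to_ecc_update.    "hypothetical implementation name
* Copy the custom safety-score fields from the extended API structure into
* the LFA1 work area that the standard mapping code passes on for update.
* is_vendors / cs_lfa1 are placeholders for the variables actually available
* inside the enhancement section.
  IF is_vendors-zzvend_safety_compl IS NOT INITIAL.
    MOVE-CORRESPONDING is_vendors-zzvend_safety_compl TO cs_lfa1.
  ENDIF.
ENDENHANCEMENT.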

    • Although the enhancement shown above only updates the custom fields of the central data (general data) of the vendor master, the enhancement spot offers other enhancement sections which can be used in a similar fashion to update other sections of the vendor master (purchasing org, company code, banking, etc.).
    • As per SAP Note 1989074, always replace a standard SAP implementation instead of changing it.

     This completes the enhancements required in the vendor master API class for updating vendor master.

Step 3: The next step involves calling the MAINTAIN method of class VMD_EI_API to update the vendor master. In my scenario I am reading the vendor master details from an XML file, transforming them and updating the vendor master; however, the code snippet only shows the call to update the vendor master, the rest is beyond the scope of this article.

 

 

 

METHOD update_vendor_master.



  DATA: ls_vmds_ei_main TYPE vmds_ei_main,

        ls_message      TYPE cvis_message,

        lt_return       TYPE bapiret2_t,

        ls_return       LIKE LINE OF lt_return.

  DATA: lt_vmds_ei_extern TYPE vmds_ei_extern_t,

        ls_vmds_ei_extern LIKE LINE OF lt_vmds_ei_extern.

  DATA: lv_success TYPE sap_bool,

        ls_lfa1_new TYPE lfa1.

  DATA: ls_update_err TYPE LINE OF zmmtt_lifnr_update_failed_list.

** Populate the method input parameters.

  ls_vmds_ei_extern-header-object_instance = iv_lifnr.

  ls_vmds_ei_extern-header-object_task     = 'U'. "<- indicator to update

  ls_vmds_ei_extern-zzvend_safety_compl    = is_safety_score. "<- input parameter to this method, originally read from XML



  APPEND ls_vmds_ei_extern TO lt_vmds_ei_extern.

  ls_vmds_ei_main-vendors = lt_vmds_ei_extern.



  vmd_ei_api=>maintain(

    EXPORTING

      is_master_data = ls_vmds_ei_main    " Vendor Total Data

    IMPORTING

      es_error       = ls_message ). " Error Indicator and System Messages



  IF ls_message-is_error IS INITIAL.

    COMMIT WORK AND WAIT.     "<-- Explicit commit required to finalize changes in database

    IF sy-subrc EQ 0.

      lv_success = abap_true.

    ENDIF.

  ELSE.

    lv_success = abap_false.

  ENDIF.



  IF lv_success EQ abap_true.

    ls_lfa1_new = is_lfa1_old.

    ls_lfa1_new-zzvscs_company_name     = is_safety_score-zzvscs_company_name.

    ls_lfa1_new-zzvscs_isn_code         = is_safety_score-zzvscs_isn_code.

    ls_lfa1_new-zzvscs_naics_code       = is_safety_score-zzvscs_naics_code.

    ls_lfa1_new-zzvscs_trir_3_years     = is_safety_score-zzvscs_trir_3_years.

    ls_lfa1_new-zzvscs_fatality_3_years = is_safety_score-zzvscs_fatality_3_years.

    ls_lfa1_new-zzvscs_scq_v_score      = is_safety_score-zzvscs_scq_v_score.

    ls_lfa1_new-zzvscs_emr_3_years      = is_safety_score-zzvscs_emr_3_years.

    ls_lfa1_new-zzvscs_ravs_score       = is_safety_score-zzvscs_ravs_score.

    ls_lfa1_new-zzvscs_citation_info    = is_safety_score-zzvscs_citation_info.

    ls_lfa1_new-zzvscs_dashboard_score  = is_safety_score-zzvscs_dashboard_score.

    ls_lfa1_new-zzvscs_hes_showstpr     = is_safety_score-zzvscs_hes_showstpr.

    ls_lfa1_new-zzvscs_eval_rept        = is_safety_score-zzvscs_eval_rept.

    ls_lfa1_new-zzvscs_score_accept_hdr = is_safety_score-zzvscs_score_accept_hdr.

    ls_lfa1_new-zzvscs_changed_by       = is_safety_score-zzvscs_changed_by.

    ls_lfa1_new-zzvscs_changed_at       = is_safety_score-zzvscs_changed_at.

    ls_lfa1_new-zzvscs_changed_on       = is_safety_score-zzvscs_changed_on.



    rv_success = me->perform_update_change_pointers( iv_lifnr = iv_lifnr is_lfa1_old = is_lfa1_old is_lfa1_new = ls_lfa1_new ).
     " perform_update_change_pointers -> covered in next step


  ELSE.
     " custom error handling logic based on requirements.
    ls_update_err-lifnr = iv_lifnr.

    ls_update_err-name  = is_lfa1_old-name1.

    ls_update_err-changed_on = sy-datum.

    ls_update_err-text       = text-e06.



    lt_return = ls_message-messages.

    READ TABLE lt_return INTO ls_return WITH KEY type = 'E'.

    IF sy-subrc EQ 0.

      REPLACE FIRST OCCURRENCE OF '&' IN ls_update_err-text WITH ls_return-message.

    ENDIF.

    APPEND ls_update_err TO me->mt_lfa1_failed.

    CLEAR ls_update_err.

  ENDIF.

ENDMETHOD.

 

Step 4: If the vendor master update is successful in the previous method, the next step is the creation of change documents for the custom fields, which will eventually create the change pointers (for IDoc creation).

  • To ensure change documents are created for the custom fields, your code needs to keep a copy of the vendor master before and after the change (is_lfa1_old and ls_lfa1_new in the above code).

 

 

METHOD perform_update_change_pointers.

  DATA:

        l_lfb1      TYPE lfb1,

        l_lfm1      TYPE lfm1,

        l_ylfa1     TYPE lfa1,

        l_ylfb1     TYPE lfb1,

        l_ylfm1     TYPE lfm1,

        lt_xlfas    TYPE STANDARD TABLE OF flfas,

        lt_xlfb5    TYPE STANDARD TABLE OF flfb5,

        lt_xlfbk    TYPE STANDARD TABLE OF flfbk,

        lt_xlfza    TYPE STANDARD TABLE OF flfza,

        lt_ylfas    TYPE STANDARD TABLE OF flfas,

        lt_ylfb5    TYPE STANDARD TABLE OF flfb5,

        lt_ylfbk    TYPE STANDARD TABLE OF flfbk,

        lt_ylfza    TYPE STANDARD TABLE OF flfza,

        lt_xknvk    TYPE STANDARD TABLE OF fknvk,

        lt_xlfat    TYPE STANDARD TABLE OF flfat,

        lt_xlfbw    TYPE STANDARD TABLE OF flfbw,

        lt_xlfei    TYPE STANDARD TABLE OF flfei,

        lt_xlflr    TYPE STANDARD TABLE OF flflr,

        lt_xlfm2    TYPE STANDARD TABLE OF flfm2,

        lt_xwyt1    TYPE STANDARD TABLE OF fwyt1,

        lt_xwyt1t   TYPE STANDARD TABLE OF fwyt1t,

        lt_xwyt3    TYPE STANDARD TABLE OF fwyt3.

  DATA: lv_objectid TYPE cdobjectv,

        ls_update_err TYPE LINE OF ZMMTT_LIFNR_UPDATE_FAILED_LIST.



  lv_objectid = iv_lifnr.

  GET TIME.         "<- Gets the latest date and time.

  CALL FUNCTION 'KRED_WRITE_DOCUMENT' IN UPDATE TASK

    EXPORTING

      objectid                = lv_objectid

      tcode                   = me->gc_tcode_change

      utime                   = sy-uzeit

      udate                   = sy-datum

      username                = sy-uname

      planned_change_number   = ' '

      object_change_indicator = me->gc_update

      planned_or_real_changes = ' '

      no_change_pointers      = ' '

      upd_knvk                = space

      n_lfa1                  = is_lfa1_new

      o_ylfa1                 = is_lfa1_old

      upd_lfa1                = abap_true

      upd_lfas                = ' '

      upd_lfat                = ' '

      n_lfb1                  = l_lfb1

      o_ylfb1                 = l_lfb1

      upd_lfb1                = ' '

      upd_lfb5                = ' '

      upd_lfbk                = ' '

      upd_lfbw                = ' '

      upd_lfei                = ' '

      upd_lflr                = ' '

      n_lfm1                  = l_lfm1

      o_ylfm1                 = l_lfm1

      upd_lfm1                = ' '

      upd_lfm2                = ' '

      upd_lfza                = ' '

      upd_wyt1                = ' '

      upd_wyt1t               = ' '

      upd_wyt3                = ' '

    TABLES

      xknvk                   = lt_xknvk

      yknvk                   = lt_xknvk

      xlfas                   = lt_xlfas

      ylfas                   = lt_xlfas

      xlfat                   = lt_xlfat

      ylfat                   = lt_xlfat

      xlfb5                   = lt_xlfb5

      ylfb5                   = lt_xlfb5

      xlfbk                   = lt_xlfbk

      ylfbk                   = lt_ylfbk

      xlfbw                   = lt_xlfbw

      ylfbw                   = lt_xlfbw

      xlfei                   = lt_xlfei

      ylfei                   = lt_xlfei

      xlflr                   = lt_xlflr

      ylflr                   = lt_xlflr

      xlfm2                   = lt_xlfm2

      ylfm2                   = lt_xlfm2

      xlfza                   = lt_xlfza

      ylfza                   = lt_xlfza

      xwyt1                   = lt_xwyt1

      ywyt1                   = lt_xwyt1

      xwyt1t                  = lt_xwyt1t

      ywyt1t                  = lt_xwyt1t

      xwyt3                   = lt_xwyt3

      ywyt3                   = lt_xwyt3.



  if sy-subrc eq 0.

    COMMIT WORK AND WAIT. "<-- Explicit commit required to finalize changes in database

    if sy-subrc eq 0.

      rv_success = abap_true.

    endif.

  ENDIF.
     "Custom error handing logic
  if rv_success ne abap_true.

    ls_update_err-lifnr = iv_lifnr.

    ls_update_err-name  = is_lfa1_new-name1.

    ls_update_err-changed_on = SY-DATUM.

    ls_update_err-text       = text-e05.

    APPEND ls_update_err to me->mt_lfa1_failed.

    CLEAR ls_update_err.

  ENDIF.

ENDMETHOD.

 

 

 

This completes the vendor master update. In the next part (part 3) I will cover the creation of CREMAS IDocs for vendor master changes.

Dynamic Node in Web Dynpro ALV

 

Dynamic programming is very helpful for providing solutions to some critical business requirements in SAP GUI, and this legacy is carried forward into the Web Dynpro era as well. Here I worked on a requirement in which the output table columns are not fixed. Based on the material number in the output, all other data changes according to the classification maintained for that material in the material master, so we cannot fix the field catalog for this requirement, nor can we create a fixed node and attributes in Web Dynpro to map the ALV data in the interface controller.

 

So to achieve this, the following steps need to be taken care of:

 

 

Step 1 :

Create an internal table type holding the field name and field type (data element) information for that material, like:

 

DN1.png

 

Here, in this type:

 

FNAME - holds the name of the field

FTYPE - holds the data type (data element) of the field

 

Now fill this internal table based on your dynamic condition; a rough sketch follows below.
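A minimal sketch of such a field list is shown below. The type and variable names (ty_field_info, gt_field_list) and the sample entries are assumptions for illustration only; the original names are visible only in the screenshot above.


TYPES: BEGIN OF ty_field_info,
         fname TYPE fieldname,   " name of the field
         ftype TYPE typename,    " data element used as the field's type
       END OF ty_field_info.
DATA: gt_field_list TYPE STANDARD TABLE OF ty_field_info,
      gs_field      TYPE ty_field_info.

" Fill the field list dynamically, e.g. from the classification of the material
gs_field-fname = 'COLOR'.     gs_field-ftype = 'ATWRT'. APPEND gs_field TO gt_field_list.
gs_field-fname = 'NET_PRICE'. gs_field-ftype = 'NETPR'. APPEND gs_field TO gt_field_list.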

 

Step 2 :

 

Create a dynamic structure and a dynamic table based on the fields filled in the above step:

 

Here I have created a method which uses the field information from table "gt_field_list". This method has two exporting parameters which return references to the dynamically created structure and internal table; later you can assign these variables to field symbols for further processing.

 

 

E_STRUCTURE - will return a data reference for the structure.

E_TABLE - will return a reference to a standard internal table whose line type is E_STRUCTURE.

 

DN2.png
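Since the method itself is only visible as a screenshot, here is a hedged sketch of how such a method can be written with the RTTS classes. The method and parameter names are assumptions: E_STRUCTURE and E_TABLE are typed as TYPE REF TO DATA, and the field information comes from gt_field_list filled in step 1.


METHOD create_dynamic_data.
  " Exporting parameters (assumed): e_structure TYPE REF TO data,
  "                                 e_table     TYPE REF TO data.
  DATA: lt_components TYPE cl_abap_structdescr=>component_table,
        ls_component  LIKE LINE OF lt_components,
        lo_struct     TYPE REF TO cl_abap_structdescr,
        lo_table      TYPE REF TO cl_abap_tabledescr.
  FIELD-SYMBOLS <ls_field> LIKE LINE OF gt_field_list.

  " Build the component list from the field information collected in step 1
  LOOP AT gt_field_list ASSIGNING <ls_field>.
    ls_component-name  = <ls_field>-fname.
    ls_component-type ?= cl_abap_typedescr=>describe_by_name( <ls_field>-ftype ).
    APPEND ls_component TO lt_components.
  ENDLOOP.

  " Create structure and table descriptors at runtime (RTTS) ...
  lo_struct = cl_abap_structdescr=>create( lt_components ).
  lo_table  = cl_abap_tabledescr=>create( p_line_type = lo_struct ).

  " ... and create the corresponding data objects, returned as references
  CREATE DATA e_structure TYPE HANDLE lo_struct.
  CREATE DATA e_table     TYPE HANDLE lo_table.
ENDMETHOD.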

 

 

Step 3

 

After getting the references for the structure and the table, call the above method in the data retrieval method and do the following coding. After the coding below, the <f_line1> and <f_tab> field symbols are ready to use for filling the final data.

 

DN3.JPG
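As the coding is only shown as a screenshot, here is a rough equivalent, assuming the step 2 method is called create_dynamic_data (a hypothetical name):


DATA: lr_structure TYPE REF TO data,
      lr_table     TYPE REF TO data.
FIELD-SYMBOLS: <f_line1> TYPE any,
               <f_tab>   TYPE STANDARD TABLE.

me->create_dynamic_data( IMPORTING e_structure = lr_structure
                                   e_table     = lr_table ).

" Dereference the data objects so they can be used like an ordinary
" work area and internal table
ASSIGN lr_structure->* TO <f_line1>.
ASSIGN lr_table->*     TO <f_tab>.

" Fill <f_tab> dynamically, e.g. with ASSIGN COMPONENT ... OF STRUCTURE <f_line1>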

Step 4

 

Create a dynamic node in the Web Dynpro application:

 

The following code will create a node named "DYNAMIC_NODE" in the root node of the view.

 

Here the node attributes will be added from the dynamic structure description "gs_structdescr", which is of the same type as described in step 2 ("e_structure").

 

 

DN5.png
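A sketch of such coding, assuming gs_structdescr holds the CL_ABAP_STRUCTDESCR reference created in step 2 (only the essential parameters of ADD_NEW_CHILD_NODE are shown):


DATA: lo_root_info TYPE REF TO if_wd_context_node_info,
      lo_node_info TYPE REF TO if_wd_context_node_info.

lo_root_info = wd_context->get_node_info( ).

" Create node DYNAMIC_NODE below the view's root node; its attributes are
" derived from the runtime structure description gs_structdescr
lo_node_info = lo_root_info->add_new_child_node(
                 name                = 'DYNAMIC_NODE'
                 is_multiple         = abap_true
                 is_mandatory        = abap_false
                 is_static           = abap_false
                 static_element_rtti = gs_structdescr ).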

  

Step 5 :

 

After creating the dynamic node, it is now time to get a reference to the newly created node and bind it to the internal table which we filled in step 3.

 

DN1.png
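In essence, the coding gets hold of the new node and binds the dynamically filled table from step 3 to it, roughly like this:


DATA lo_dyn_node TYPE REF TO if_wd_context_node.

lo_dyn_node = wd_context->get_child_node( name = 'DYNAMIC_NODE' ).
" Bind the dynamically filled internal table (<f_tab> from step 3) to the node
lo_dyn_node->bind_table( new_items = <f_tab> ).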

 

 

Step 6:

 

After binding the table, it is now time to instantiate the ALV component, which can be done with the following steps:

DN1.png
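A sketch of the usual ALV instantiation, assuming the SALV_WD_TABLE component usage in the Web Dynpro component is named ALV (the generated accessor methods WD_CPUSE_ALV / WD_CPIFC_ALV depend on that name):


DATA: lo_cmp_usage TYPE REF TO if_wd_component_usage,
      lo_alv       TYPE REF TO iwci_salv_wd_table.

" Instantiate the ALV component usage if it is not yet active
lo_cmp_usage = wd_this->wd_cpuse_alv( ).
IF lo_cmp_usage->has_active_component( ) IS INITIAL.
  lo_cmp_usage->create_component( ).
ENDIF.

" Hand the dynamic node over to the ALV interface controller
lo_alv = wd_this->wd_cpifc_alv( ).
lo_alv->set_data( r_node_data = lo_dyn_node ).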

Step 7: Execute the application to see the dynamic node.

 

Thanks and Regards,

Gagan Choudha

I'd call myself an ABAP guy. Though I've done some Java a looooong time back and occasional web development as a hobby, the only serious programming language which I'd claim to have mastered is ABAP. Every year, this brings me into an uncomfortable situation when going to TechEd. Hardly anyone at SAP seems to actually care about me being able to earn money with the thing(s) I know. It's been like this for the past years in Madrid and Amsterdam and, looking at the agenda, I was sure it would be no different this year. Luckily, at second sight, I was wrong.

For those who could not attend, I’d like to sketch where I have encountered news from the ABAP side. Of course, I was not able to join all the sessions I’d like to have joined, so I’d appreciate reading about your personal highlights as well!


Developer’s garage – get your own ABAP System now!

On the evening before day 1, the developer's garage was opened for the first time at TechEd. The idea was that developers could inform themselves about how to provide cool software on the technology stacks SAP is offering: HANA Cloud Platform, WebIDE fka. River RDE and, last but not least, AS ABAP. I was brave enough to approach the SAP crew waiting there despite the non-subtle negative marketing for the AS ABAP. And I felt rewarded: Christopher Kaestner gave me an insight into what it takes to actually provide a stack upon which cloud systems can be deployed. Something I never cared about, but which was really interesting. And, more importantly from a practical side, he instructed me how to use the SAP Cloud Appliance Library and how to set up my ABAP on HANA system using it.


While this is easy, it's still a comparatively risky thing if you have your large HANA instance running for a long time, particularly if you don't have a real business case you're working on but are doing this merely for personal education (about 1€/h). However, there are cheaper alternatives: there's also a virtualized version of the AS ABAP on MaxDB which has much lower hardware requirements (currently <25€/month). Why not develop the core of your application on another DB and start the HANA-based instance only when you're doing the actual HANA-specific part? Anyway, it's good style to use interfaces and alternative implementations on other databases, and I believe that the actual HANA power will only be required for a few operations (just speaking of the ABAP on HANA use case, not of the HANA XS developer, of course). This is how I'll most probably try it, once the "on MaxDB" system is on the same SP level (currently it's quite old, SP02).

The cheapest variant would of course be if it were possible to get an AS ABAP trial running on a local machine, and what was said in the garage leaves me in an optimistic mood that we won't have to wait another two years for this (I promised Christopher not to post how quickly they plan to provide this, as everything said at TechEd is anyway subject to change).


Keynote – an ABAP container in HCP?

Many posts have already spoken about the keynote. I also found it very refreshing, but at the same time had the impression that SAP has not forgotten its roots. Even BSEG had a short but important appearance. With respect to ABAP, nothing was said explicitly, but promisingly enough, there was a slide picturing the future layering of the SAP development structure. With HANA at its core, the HCP is the PaaS environment into which applications are being deployed: JavaEE, HANA XS, HTML5 and, well, ABAP. At least according to the slide shown.

Later on, after TechEd, Bjoern Goerke unfortunately disclaimed this. We might have to wait some time longer until we can actually deploy our artifacts to the HCP. I guess that enabling the TMS for multi-tenancy (multiple isolated spaces in which the workbench objects live on the same server) might be a bit tricky.


Lectures on ABAP on HANA and Core Data Services (CDS)

Also during day 1, I attended two sessions about enhancements to ABAP. Carine Tchoutouo Djomo presented very nicely various aspects of the code-to-data paradigm (aka code pushdown). Largely, this has nothing to do with HANA, but with the general avoidance of transporting data between the database and the application server. Mainly, this was about

  • The vast enhancements to OpenSQL
  • Tools which help analyze the performance of database access dynamically (SQLM) and combine this information into a check cockpit (SWLT)
  • The new ALV for Dynpro and Webdynpro with database-based paging

Only ABAP Managed Database Procedures, which allow you to code a stored procedure inside ABAP, are specific to HANA. Currently. From a metamodel perspective, there is no limitation to HANA as the DB, but the database vendor has to implement a set of libraries in order to enable coding in its stored procedure language. And I guess SAP has to be kind enough to accept this contribution and enable a keyword for it. Whether we will see the statement LANGUAGE PL_SQL in the next years might be subject to the result of the next America's Cup race. I don't have to go into more detail here as Carine was so kind to record all the demos and make them available here on SCN.

I wish this were a common best practice for other topics as well. Briefly addressed in this session were also the mighty Core Data Services. This new artefact, which can be modeled in Eclipse, replaces DDIC views. Of course, you'll still be able to use SE11-based DB views after 7.40, but CDS views are so much more powerful. Basically you can model the FROM clause of your Open SQL statically, without limitations to inner joins, join-on-join etc. But actually, a CDS view is much more powerful as it also allows you to define structured data relations: a view always has a "flattened" result structure. Within a CDS view, associations can also be defined, so that to me a CDS view seems to be the ideal artifact for providing an OData service. To me, this CDS core is the most powerful change in ABAP during the past years and a major reason to upgrade to 7.40. However, with respect to its extended capabilities, I'm a bit sceptical:

 

  • CDS supports annotations which can be interpreted by a CDS consumer, e.g. whether an attribute can be used as a key figure in some analytical artifact. Honestly, I don't like this with respect to layering: the consumer should be attributing characteristics to CDS entities, not the other way round.
  • CDS is being positioned as the access layer for any kind of consumer who wants to get data out of the system. However, there are other technologies at "higher" architectural layers which serve a similar purpose. Most important to me: SADL, the service adaptation definition language. It can also build views on top of various artifacts, including DB tables but also richer entities such as BOPF business objects. This is also necessary from my point of view, as particularly for write accesses further data transformation or validation might be necessary, which CDS by its nature cannot provide. In addition, not all data is persisted. For this case, CDS supports various expression types (aggregations, case statements). But not every transient piece of information can be determined using them – or the resulting code would not be maintainable.

My conclusion on CDS is that I highly appreciate it for its extended viewbuilding capabilities, but I’ll not use the annotations unless required by a runtime engine.


UI5 and WebIDE

Now, last but not least, this is not an ABAP topic, is it? No, it definitely isn't (at least not until Björn Görke announces the ABAP container in HCP). But it impacts the way backend applications in ABAP need to be implemented (shameless cross-link). Also, I feel that although this beautiful new UI technology has matured, there is still a need for business applications to be developed in ABAP. I simply cannot imagine a productive way of building backends for business applications with integration to SAP products (e.g. the Business Suite) with the same efficiency as in ABAP. And this is not only about the actual language and ABAP runtime: the toolsets around business needs (roles and authorization, software logistics, integrated testing, output management, customizing, business rules, …) are very well established and, even if not technologically brilliant in some places, they are known to SAP operations teams. What I'm particularly missing in this new technological layer is some application-level framework like BOPF which helps to modularize semantically and which reduces the tasks of the developer to developing "business logic" – since this still is what most SAP projects are being instantiated for.

So in my opinion, UI5 and the surrounding technology can help to play to the strengths of each technology: Implement, deploy and manage "business logic" with ABAP, really separate the UI, decouple via Gateway and have the zillions of JavaScript developers who don't need to know about ABAP implement beautiful UI5 user interfaces.


Conclusion

ABAP is not dead at all. In fact, it's currently being renovated not only with respect to syntax (If you have not yet read Horst Keller's blogs about SP08-Language enhancements, do that right now), but it's also getting enhanced with artifacts which allow a good integration into the other state-of-the-art-technologies by SAP (HANA, OData). For me, this means that I need to do the following:

  1. Understand SAP's current technologies and their impact on application architecture
  2. Learn to master the interface-technologies from the ABAP-side (hoping that BOPF and SADL help me out on implementing lots of the OData-contract on my own)
  3. Understand the common architectural principles and further improve on my custom application's architecture. The patterns apply for other platforms as well - and knowing them well simplifies switching to other platforms if necessary.

*** If you just want to skip to the good stuff here's the github project with a saplink nugget: lucastetreault/zPhabricator · GitHub



Phabricator is a collection of open source web applications that help software companies build better software. It is developed and maintained by Facebook and largely based on their own internal tools.

hero.png

 

The major components of Phabricator are:

  • Differential - a code review tool
  • Diffusion - a repository browser
  • Maniphest - a bug tracker
  • Phriction - a wiki

 

My company has a very mature Java Development team and they have been using Phabricator's Differential for their code review process for quite some time. Our ABAP team has experimented with various processes for enabling and enforcing code reviews and I recently hacked together some integration so that we can use Phabricator for our ABAP code as well. Once I've cleaned up the code and made it easy to configure without having to change a bunch of code I'll set up a project on github with everything you need to get up and running!

 

Basically the goal is to get a nice side by side comparison so that we can quickly and easily see what was changed and to be able to control the release of transport tasks by checking the status of the review. You will not be able to release tasks until the review has been accepted.

diff.PNG

 

***Since a couple of days ago when I first put this blog up, I have spent a couple of late nights working on this and have managed to move the whole process inside SAP. I'll leave the "old new" process at the bottom of this blog so you can see where I started...

 

Here is what our NEW process looks like:

 

You are done coding in a transport task and you are ready to have it reviewed and moved up to the test tier.

transport.PNG

 

Go to transaction ZPHAB and enter the details required to create a code review:

zphab.PNG

 

Run the program and a revision will be created:

zphab result.PNG

 

Browse to the link it gives you and you can view the revision:

 

review.PNG

 

If you make more changes to the transport task (ie: if your reviewer requested changes) you can use ZPHAB again and it will update the revision with the latest changes.

 

If you forgot to submit a task for review you will get the following error when you try to release it:

7.png


If you submitted the task for review but it hasn’t been approved you will get the following message including the URL of the revision in Phabricator so you can send it off to someone to follow up if needed.

8.png

 

 

 

 

 

 

 

**** This is the "old new" process before I did a bunch of work to have the whole process in SAP.

Here is what our (old)new process looks like:


You are done coding in a transport task and you are ready to have it reviewed and moved up to the test tier.

1.png

 

Open a command prompt and navigate to C:\Phabricator. When you get there, run the command sapdiff <task number>

2.png


You’ll get a popup of notepad++ and you’ll need to enter a few details:

3.png


<<Replace this line with your Revision Title>> -- This needs to be set to the task number that you are submitting for review

Reviewers – Enter the phabricator user name of the person (or people) you want to review this

4.png


Save the file, then close notepad++. The differences have now been pushed to Phabricator. You will receive an email confirming that a ‘revision’ has been opened and the reviewers will receive an email asking them to review your code.



If you forgot to submit a task for review you will get the following error when you try to release it:

7.png


If you submitted the task for review but it hasn’t been approved you will get the following message including the URL of the revision in Phabricator so you can send it off to someone to follow up if needed.

8.png


I'm pretty excited about this! It's a HUGE improvement over our old process. Hope you like it!


- Lucas


What is real-time in ABAP and why ABAP Channels?


New technologies like Cloud, Mobility, and In-Memory open new opportunities and lead industries towards real-time business. Becoming a real-time business means reacting immediately to market changes, providing higher responsiveness, accelerating business processes, adjusting responses in real-time based on changed business conditions, delivering fast and personalized service for customers, and exploiting new business opportunities not possible before.

 

But what does real-time mean for the running business? From the business user's point of view, one of the most important characteristics is low latency, meaning how fast the application responds to a click or other user interaction (independent of whether it runs on a computer or a mobile device). Ideally, users expect that the input data is processed within milliseconds and that they receive immediate responses.

 

Real-time means also always having the latest up-to-date information about application data and state (e.g. the current stock market prices, airline seat availability, latest deadlines for delivery, accurate information on inventory – value, status, receipt, location and disposition), and react in real-time to incoming events. The events can be of various nature and span from the simple notification (e.g. the vacation request was approved in the system), broadcasting (e.g. new repurchase prices are available and must be distributed to the users), process control information (e.g. delivery of sales orders took place, the billing process for customers can start) to events, which require immediate reaction (e.g. illegal trade pattern is detected). Another aspect of real-time is best support for collaborative scenarios (e.g. business users work on the same data records, collaboration platforms, interaction centers).

 

All in all developing real-time applications in ABAP is about the fast, event-driven, highly interactive and collaborative scenarios, which must be supported in ABAP applications.

 

How is it implemented in ABAP? ABAP development for SAP HANA enables real-time processing of huge amounts of business data by exploiting the strengths of In-Memory technology. Using SAPUI5/SAP Fiori on top of AS ABAP offers a real-time UI experience and simplified business interactions by means of intuitive, highly responsive, consistent UIs across multiple devices. ABAP Channels complete this real-time offering by supporting interactive and collaborative scenarios based on an event-driven paradigm, independent of the database (yes, not only on HANA!) and the UI (yes, it also works 'as prototype' for Dynpros!). The ABAP Channels infrastructure is generally available with SAP NetWeaver AS ABAP 7.40 Support Package 5 (SP05).

 

 


Real-time UI access to ABAP data with ABAP Push Channel (APC)

 

As described above, one of the important real-time characteristics is immediate access of users to the frequently changing business information and data in ABAP backend, which will support timely business insights, decision making and productivity.

 

The examples for this can be for instance bringing the irregularly and often changing product information and sale conditions to browser UI (SAPUI5, Web Dynpro ABAP, BSP) as soon as data change takes place in the ABAP backend, pushing short-lived financial data relevant for trades to the browser UI, statistics, process information (sales figures in different areas, financial statistics, manufacturing process status). In all these cases the data presentation on UI (charts, diagrams, text and numbers) must be updated immediately accordingly to the modified data in the ABAP backend.

 

In such situations, in order to present the latest up-to-date information from the ABAP backend in real-time on a browser-based UI, most ABAP applications used polling techniques with multi-second intervals, or a refresh button, to periodically get the newest updates to the UI. These approaches are quite resource consuming and result in insufficient performance and poor user experience. The availability of real-time data in the ABAP backend is unpredictable, and it is unnecessary effort to make requests and open and close HTTP connections.

 

The ABAP Push Channel (APC) technology replaces such an inefficient polling approach with WebSockets. An ABAP Push Channel is a bi-directional, message-based communication channel representing the WebSocket integration in ABAP, allowing you e.g. to push a notification to the UI as soon as a data change happens in the ABAP backend. This is a signal for the user's session to request the changed data from the ABAP backend and update the UI.

 

Picture2.png
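To give a first impression of the programming model (the full step-by-step guidance is in the blog linked below), here is a minimal sketch of an APC handler class that simply echoes each WebSocket message back to the client; the class name is hypothetical:


CLASS zcl_apc_demo_echo DEFINITION
  INHERITING FROM cl_apc_wsp_ext_stateless_base CREATE PUBLIC.
  PUBLIC SECTION.
    METHODS if_apc_wsp_extension~on_message REDEFINITION.
ENDCLASS.

CLASS zcl_apc_demo_echo IMPLEMENTATION.
  METHOD if_apc_wsp_extension~on_message.
    DATA lo_message TYPE REF TO if_apc_wsp_message.
    TRY.
        " Echo the incoming WebSocket frame back to the browser client
        lo_message = i_message_manager->create_message( ).
        lo_message->set_text( i_message->get_text( ) ).
        i_message_manager->send( lo_message ).
      CATCH cx_apc_error INTO DATA(lx_apc).
        MESSAGE lx_apc->get_text( ) TYPE 'E'.
    ENDTRY.
  ENDMETHOD.
ENDCLASS.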

 

More details about ABAP Push Channel and step-by-step guidance how to implement it in ABAP in Masoud’s Blog:

ABAPChannels Part 1: WebSocket Communication Using ABAP Push Channels.

 

 

Real-time communication between ABAP sessions with ABAP Messaging Channel (AMC)

 

Another aspect of real-time behavior is reacting immediately to events. Event-driven communication between ABAP sessions across the boundaries of ABAP application servers streamlines the business process workflow, speeds up performance and reduces the consumption of database resources. An ABAP application might need to identify and react to certain events in another ABAP session as soon as they occur. Moreover, an ABAP application might involve data processing in order to provide some output to be delivered to another application, running in another ABAP session, and therefore has to send a notification. Traditionally, to realize such real-time event-driven behavior, ABAP applications often poll periodically for state information on the database. Bottleneck problems on the database may occur with numerous active polls from many ABAP applications simultaneously.

 

One example of such a real-life situation is long-running calculations (sales statistics, loan calculations). A user may initiate a loan calculation (via asynchronous RFC) and it runs so long that he is no longer interested in the status of the calculation and the results, and may want to proceed with other actions. In such a case the running session (initiated e.g. by RFC) only consumes resources and the calculation must be stopped. In current implementations the ABAP application polls for the status of the calculation on the database. Another example is long-running batch jobs (mass data changes, create deliveries from open sales orders, create invoices) which process data gradually. To simulate real-time behavior, the calling ABAP session polls the database in short time intervals for the status in order to request already processed data or get updated information.

 

Another example is the business workflow e.g. the manufacturing process, which consists of several productions steps (e.g. manufacturing of components), which are initiated in parallel by the central process (e.g. assembling) on different application servers and must report back their status as soon as they get accomplished. The central process would need to poll on the database periodically in order to get the status of the productions steps. All these situations result either in heavily loaded database by batch jobs, bottlenecks, or by resource- and time-intensive applications reducing database performance.

 

Instead of using a periodic polling technique, a publish/subscribe mechanism can be used for prompt notification, which can eliminate unnecessary database load. The ABAP Messaging Channel (AMC) infrastructure replaces the polling with a publish/subscribe mechanism and acts as a broker for the message exchange between different ABAP sessions, which can reside on different ABAP application servers. As ABAP sessions act as publishers and subscribers respectively, as soon as an ABAP session publishes a notification to AMC, all subscribed ABAP sessions get notified.

 

Picture1.png

Picture3.png
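As an illustration (the application and channel IDs are hypothetical and would have to be created in transaction SAMC first), publishing a message to all subscribers takes only a few lines:


" Publisher side: notify all subscribers of the channel that work is done
DATA lo_producer TYPE REF TO if_amc_message_producer_text.

TRY.
    lo_producer ?= cl_amc_channel_manager=>create_message_producer(
                     i_application_id = 'ZAMC_DEMO'     " hypothetical AMC application
                     i_channel_id     = '/status' ).
    lo_producer->send( i_message = 'Calculation finished' ).
  CATCH cx_amc_error INTO DATA(lx_amc).
    MESSAGE lx_amc->get_text( ) TYPE 'E'.
ENDTRY.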

 

More details about AMC and step-by-step guidance how to implement it in ABAP in Masoud’s Blog:

ABAP Channels Part 2: Publish/Subscribe Messaging Using ABAP Messaging Channels


Real-time collaboration with ABAP Push Channel (APC) and ABAP Messaging Channel (AMC)

 

Another essential element of real-time business is real-time collaboration scenarios, which allow multiple users in different locations on the internet to communicate instantly or work on the same business objects simultaneously, sharing the same backend resources and even allowing collaborative real-time editing. Applied to ABAP, this would mean for example that SAPUI5 users in different ABAP sessions, which reside on different ABAP application servers, can communicate (send messages) and work together on the same ABAP business data records.

 

One of the business scenarios would be broadcasting and distributing of the irregularly and frequently changing business data in ABAP backend to the browser UIs of multiple users across the boundaries of the ABAP application servers (see APC use examples above)

 

Another real-time scenario is collecting data from any kind of hardware device (e.g. traffic control tool, medical equipment, RFID scanner, vehicle tracking control, multimedia system, etc.) in the ABAP system and displaying it on the UI, or reacting to an event (a robot reached its destination in the warehouse, a telephone call/chat arrived in the interaction center) in the ABAP system with a subsequent UI update with the latest data, without having to poll for the data changes.

 

In situations where multiple users work with the same data (e.g. customer or supplier information) in the browser UI, and one user changes several data records, these changes must be reflected immediately on all other users' UIs.

 

Going a step further, there are also scenarios of collaborative real-time editing of business object data, in which multiple browser UI users jointly edit the data of the same ABAP object and modified data should be displayed on the UIs of the other users in real time, without having to lock the whole object.

 

For all these scenarios the combination of ABAP Push and Messaging Channels is the best choice. The publish/subscribe infrastructure of the ABAP Messaging Channels can be used together with the ABAP Push Channel to let WebSocket UI clients act as subscribers. This allows WebSocket UI clients either to receive notifications about data changes or to publish notifications requesting other WebSocket UI clients to reload the data from the source.

 

new_Picture1.png

 

More details about using AMC/APC for collaboration and step-by-step guidance on how to implement it in ABAP can be found in Masoud's blog:

ABAP Channels Part 3: Collaboration Scenario Using ABAP Messaging and ABAP Push Channels.

Do you also have the requirement to provide data from your ABAP system in an Excel workbook? Then I hope you have heard about the fantastic abap2xlsx library by Ivan Femia. If not, you should immediately head over to the project website. For our project, it saved a tremendous amount of effort and time. Let me briefly explain why, and how it gets even easier to provide beautiful Excel downloads from your ABAP system.


Starting point

During the specification of our application some years back, users and consultants together defined complex Excel files they wanted the system to generate. As we are using an FPM-based Web Dynpro user interface, we could not rely on the GUI frontend services, but had to use the OLE-based APIs. They are not only slow, but also quite clumsy to handle and need a lot of code. In our case, we had used more than 5,000 lines of code to produce a file which almost matched the requested format.


ABAP2XLSX

Searching for an alternative way to manipulate Excel from ABAP, I came across abap2xlsx, which has a beautiful API for creating and modifying Excel files. In particular, binding tabular data into a table entity of the workbook is incredibly efficient. After having clarified the licensing questions, we communicated how much cheaper the development would be if only we could rely on our customer using Excel 2007 (or at least Excel 2003 and the compatibility pack). With those savings on the table, convincing them was not too difficult.


Templating

With all the facilities provided by abap2xlsx, it was still quite some effort to define those rich files, which may comprise more than 100 fields and several tables: for each field, you as a developer have to define a label, format the cells, rows and columns, and bind the data. Also for minor graphical changes (e.g. formatting) or changes to the texts, a developer is needed. How beautiful would it be if the business expert could just provide a new version of a template? The template layout would not even have to stay the same if named cells were used for filling the template. We therefore decided to split the provisioning of the initial Excel file as a view from the controller logic that fills it (which is in general a good idea).

template.png

A sample template designed by a "business user"; observe the beautiful formatting...


The MIME Repository as template store

Having received a prepared template from a business user, one important question remained: how to store, access and manage the lifecycle of a template? Of course, you could simply put the file into the file system of your application server, but there is a much better option: the MIME Repository is a tool integrated into the ABAP Workbench for managing the storage of binary data. You can simply create your own folder for your application and upload your template files to it. This gives you

  • A transportable object which integrates into the deployment (transport) of your ABAP application
  • An authorization mechanism to limit who is allowed to access and update which template
  • A nice separation of presentation and logic (though of course you might have to bridge some shortcomings with respect to i18n, depending on your customer)

MIME_rep_se80.png

The MIME-repository UI in SE80 - and the uploaded template


There’s an ABAP-API in order to load the binary content from which you create the ZEXCEL-object. You could for example use a factory:

METHOD create_from_mime_repository.

  DATA lv_mime_file    TYPE xstring.
  DATA lo_excel_reader TYPE REF TO zif_excel_reader.
  DATA lx_excel        TYPE REF TO zcx_excel.

  cl_mime_repository_api=>get_api( )->get(
    EXPORTING
      i_url     = iv_mime_path
    IMPORTING
      e_content = lv_mime_file
    EXCEPTIONS
      OTHERS    = 8 ).

  IF sy-subrc <> 0.
    RAISE EXCEPTION TYPE zcx_excel
      EXPORTING
        error = |File could not be found in MIME repository at { iv_mime_path }|.
  ENDIF.

  CREATE OBJECT lo_excel_reader TYPE zcl_excel_reader_2007.

  TRY.
      "Instantiate the Excel object from the binary data read from the MIME Repository
      ro_excel = lo_excel_reader->load( i_excel2007 = lv_mime_file ).
    CATCH zcx_excel INTO lx_excel.
      "Excel loading error
      RAISE EXCEPTION TYPE zcx_excel
        EXPORTING
          error = |File at { iv_mime_path } could not be interpreted as an Excel file|.
  ENDTRY.

ENDMETHOD.

Having written this small piece of code, you will see your efficiency tremendously improved: you basically need one line of code per cell into which you would like to populate data. For tables it is even better: about four lines of code cover the complete table.

"Load file from MIME-path

go_excel = zcl_excel_factory=>get_instance( )->create_from_mime_repository( '/SAP/PUBLIC/excel_templates/Template_Sample.xlsx' ).

 

"Fill some elementary data into a predefined format

go_excel->get_active_worksheet( )->set_cell(    
     ip_column    = 2       
     ip_row       = 1       
     ip_value     = 'Fruits'      ).

 

"Add tabular data

go_excel->get_active_worksheet( )->bind_table(     
     ip_table = gt_item
     is_table_settings = VALUE #( top_left_column = 'A' top_left_row = 4 )  
).

filled.png


Can you do this any easier?


Feedback appreciated!

Oliver


Excel Creation Methods:

Today there are a number of methods to create a formatted Excel file: standard function modules, the iXML interface, OLE, XML syntax, HTML syntax and a few more interfaces (available on the code share pages).


Flexibility and Ease of Use:

Well, talking about ease of use, I would always prefer the standard function modules (FMs).


But the problem with Mr. FM is their stiffness. I mean, they are not really flexible enough to fulfill my business needs. My business team always thinks creating an Excel file through SAP is equivalent to creating one through MS Office!

Basically, the FMs create Excel files in a more or less fixed format.


Next come the iXML interfaces. They are obviously flexible, but I find them a bit bulky in terms of memory consumption, probably because you need to instantiate a class and set its attributes before using even a simple style, font or a new color. My business asks for a highly decorated Excel file, and that seems tough this way.


OLE is really tough stuff to deal with. Once OLE starts creating the Excel file, I usually move away from my desk; I am sure it will take a handsome amount of time.

Yes, Mr. OLE is really lazy, just like me, and takes too much time for Excel creation.


So, out of all the available candidates, I will opt for the XML or HTML syntax.


Hmm, I didn't talk about the other interfaces; actually I did not dare to peep inside. My SAP has empowered me with some beautiful tools, so there is no need to look at anything else.


The HTML Way:

Create all you want using HTML tags and syntax, download it in XLS file format, and it is ready to use.

As simple as two-minute noodles... well, yes it is! lol

 

You can create a table and paint it easily with the colors your near and dear ones want.

<TABLE>
  <TR>
    <TD colspan="4">
      <DIV><FONT size="5">Test Excel</FONT></DIV>
    </TD>
  </TR>
  <TR>
    <TD colspan="9">
      <FONT size="5">Excel Details</FONT>
    </TD>
  </TR>
</TABLE>

 

 

Any HTML tutorial will explain the HTML syntax in detail.

 

 

Create the Excel content using your tags, and keep it in an internal table or, better, in a string.

 

Use cl_bcs_convert=>string_to_solix with code page 4103.

Create the document object using cl_document_bcs=>create_document with type 'RAW'.

Add the Excel attachment using document->add_attachment with type 'xls'.

Now call set_document, add the recipient, and then send the document.

For the mail sending part, the example report BCS_EXAMPLE_7 is also helpful.
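Putting those calls together, a minimal sketch of the mail sending part could look like the following (variable names and the recipient address are placeholders; report BCS_EXAMPLE_7 shows the complete, official pattern):

DATA: lv_html   TYPE string,      " the HTML you built with your tags
      lt_binary TYPE solix_tab,
      lv_size   TYPE so_obj_len.

TRY.
    " Convert the HTML string to binary; code page 4103 (UTF-16LE) is understood by Excel
    cl_bcs_convert=>string_to_solix(
      EXPORTING iv_string   = lv_html
                iv_codepage = '4103'
                iv_add_bom  = 'X'
      IMPORTING et_solix    = lt_binary
                ev_size     = lv_size ).

    DATA(lo_send_request) = cl_bcs=>create_persistent( ).

    " Document body of type RAW, with the generated file attached as type 'xls'
    DATA(lo_document) = cl_document_bcs=>create_document(
        i_type    = 'RAW'
        i_text    = VALUE #( ( line = 'Please find the report attached.' ) )
        i_subject = 'Report as Excel' ).
    lo_document->add_attachment(
        i_attachment_type    = 'xls'
        i_attachment_subject = 'Report'
        i_attachment_size    = lv_size
        i_att_content_hex    = lt_binary ).

    lo_send_request->set_document( lo_document ).
    lo_send_request->add_recipient(
        cl_cam_address_bcs=>create_internet_address( 'someone@example.com' ) ).
    lo_send_request->send( ).
    COMMIT WORK.

  CATCH cx_bcs INTO DATA(lx_bcs).
    MESSAGE lx_bcs->get_text( ) TYPE 'I'.
ENDTRY.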

 

Comments and Rectification:

Kindly post your comments, which can help me add further value to these blogs.

My mentors and guides on SCN, kindly correct me if I am wrong somewhere.

Thank you all!



NOTE: Before beginning, the XLSX Workbench functionality must be available in your system.

 

Let's use the standard demo report BCALV_GRID_DEMO, which is available in every system (the report demonstrates how to use the ALV Grid control). We will export its output table to an Excel-based form.

 

1 PREPARE A PRINTING PROGRAM.

 

1.1 Copy the standard demo report BCALV_GRID_DEMO to the customer namespace, for example Z_BCALV_GRID_DEMO:

IMAGE_4.PNG

 

1.2 In the copied report, add a new button to GUI status 'MAIN100':

IMAGE_5.PNG

 

1.3 In the report, at line 40, insert the following code to process the new OK code:

IMAGE_7.PNG
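For orientation, the coding behind the new OK code essentially hands the output table over to the XLSX Workbench runtime. The function module name ZXLWB_CALLFORM, its parameter names and the variable gt_sflight below are assumptions about a typical installation and may differ in yours; treat this purely as a sketch:

* Sketch of the new OK-code handling in the copied report.
* NOTE: ZXLWB_CALLFORM and its parameters are assumed names; check the
* objects delivered with your version of the XLSX Workbench.
WHEN 'EXCEL'.
  DATA lr_context TYPE REF TO data.
  GET REFERENCE OF gt_sflight INTO lr_context.   " gt_sflight: the report's output table
  CALL FUNCTION 'ZXLWB_CALLFORM'
    EXPORTING
      iv_formname    = 'TEST_GRID'
      iv_context_ref = lr_context
    EXCEPTIONS
      OTHERS         = 1.
  IF sy-subrc <> 0.
    MESSAGE 'Form processing failed' TYPE 'I'.
  ENDIF.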

 

1.4 Activate objects:

IMAGE_8.PNG

 

2 PREPARE A FORM.

 

2.1 Launch the XLSX Workbench and, in the popup window, specify the form name TEST_GRID, then press the «Process» button:

IMAGE_9.PNG

Empty form will be displayed:

444_19_1.png

2.2 Push the button 444_19_2.PNG to save the form.

 

 

2.3 Assign context FLIGHTTAB to the form:

IMAGE_12.PNG

 

You will be prompted to create the form structure automatically (based on the context). We will create the form structure manually, so decline by pressing the corresponding button.


2.4 Add a Pattern (HEADER LINE) to the Sheet:

IMAGE_26.PNG

 

 

2.5 Add a Loop to the Sheet:

IMAGE_29.PNG

 

2.6 Set context binding for created Loop:

IMAGE_23_.PNG

 

 

2.7 Add a Pattern (POSITION) to the Loop:

IMAGE_32.PNG

 

2.8 Add Values to the Pattern (POSITION):

IMAGE_37.PNG

 

2.9 Make markup in the Excel template:

IMAGE_38.PNG

 

2.10 Set the template binding for the Pattern (HEADER LINE). To do this, perform the following steps successively:
  • Select the node PATTERN_HEADERLINE in the tree of the form structure;
  • Select the cell range A1:I1 in the Excel template;
  • Press the button IMAGE_39.PNG located in the «Area in the template» item of the Properties tab.

IMAGE_42.PNG

 

 

2.11 Set the template binding for the Pattern (POSITION). To do this, perform the following steps successively:
  • Select the node PATTERN_POSITION in the tree of the form structure;
  • Select the cell range A2:I2 in the Excel template;
  • Press the button IMAGE_39.PNG located in the «Area in the template» item of the Properties tab.
IMAGE_43.PNG

 

2.12 Set the template binding for the Values. To do this, perform the steps described in the previous item. The mapping is shown below:
  • CARRID            (cell: A2)
  • CONNID            (cell: B2)
  • FLDATE            (cell: C2)
  • PRICE             (cell: D2)
  • CURRENCY          (cell: E2)
  • PLANETYPE         (cell: F2)
  • SEATSMAX          (cell: G2)
  • SEATSOCC          (cell: H2)
  • PAYMENTSUM        (cell: I2)


2.13 Finally, set the template binding as shown below:

IMAGE_47.PNG

 

2.14 Activate the form by pressing the button 444_30.PNG.

 

 

 

3 EXECUTION.


Run your report Z_BCALV_GRID_DEMO; the ALV grid will be displayed:

IMAGE_48.PNG


Press the button 444_43.PNG to export the grid to the Excel form:

IMAGE_49.PNG


Hi community!

 

I have come across the need to have a vendor contact created (you know, those contacts you find in XK01/2/3), and, to my disbelief, really the only way to do this programmatically is with a batch input!

 

Well, not anymore. I have created a class that allows you to create and edit vendor contacts (deleting might be easy as well, but I haven't looked into it yet). It should also be fairly easy to extend this to create customer contacts; if that's what you need, go ahead and try to do it. Ideally, there should be a generic "contact" class, with subclasses for vendor and customer contacts.

 

The most basic way to create a vendor contact is to use the static method CREATE_CONTACT and populate the vendor number (LIFNR) and last name (NAME1) in structure IM_HEADER_CONTACT_DATA. After that, it's up to you to figure out how to use it.
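As a rough illustration only (the class name ZCL_VENDOR_CONTACT and the structure type below are invented placeholders; check the nugget for the real names), the call boils down to:

DATA ls_contact TYPE zvendor_contact_header.   " hypothetical type of IM_HEADER_CONTACT_DATA

ls_contact-lifnr = '0000100001'.   " vendor number
ls_contact-name1 = 'Doe'.          " contact person's last name

" hypothetical class name; the nugget may use a different one
zcl_vendor_contact=>create_contact( im_header_contact_data = ls_contact ).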

 

You will find the nugget for the class under folder PURCHASE_TO_PAY/VENDOR_CONTACT of project object, here.

 

If you have any questions, feel free to ask, but don't expect 24/7 technical support; that doesn't come for free.

 

Best regards,

Bruno

 

DISCLAIMER

 

This is not, by any means, a finished "commercial product". It definitely still needs a lot of work to become a final, stable solution. However, it should be a great starting point for whatever you need. If you do some work on it and feel like sharing an improved version, please let me know.

 

Thanks.

I recently worked on a project which entailed enhancing the vendor master, updating the CREMAS04 IDoc with a new segment for custom records and, finally, generating IDocs with the new segment whenever these custom fields (added to the vendor master) are updated. In this three-part blog I will cover the development effort required to achieve this functionality in detail.

 

The entire development effort can be classified into three sections as follows:

1) Extending the vendor master

2) Enhancing the VMD_EI_API->MAINTAIN( ) method to programmatically update the vendor master tables (LFA1, LFB1, LFM1, LFM2, ...) - part 2

3) Enhancing the CREMAS04 basic type to include the new segment - part 3

 

 

  1. Extending the vendor master

        Extending the vendor master involves enhancing the vendor master database table and the vendor master screens. For the purpose of this blog I am going to enhance the LFA1 table (vendor master, general data at header level) with custom fields.

    • Step 1: Carry out the steps listed under 'Adoption of Customer's Own Master Data Fields'. Follow this configuration path to reach this step:

      SAP IMG -> Financial Accounting -> Accounts Receivable and Accounts Payable -> Vendor Accounts -> Master Data -> Preparations for Creating Vendor Master Data -> Adoption of Customer's Own Master Data Fields.

SPRO_patch_to_modification_free_enhancement.png

    • Step 2: Execute the step 'Prepare Modification-Free Enhancement in Vendor Master Record'.

In this step you specify the screen group and screen number associated with the new vendor master fields.

Screengroup for venodr master.pngScreenNumber_for_vendor_master.png

    • Step 3: Execute the step 'Business Add-In: Processing of Master Data Enhancements'.

              This step allows you to create an implementation for the BAdI VENDOR_ADD_DATA. In my example I am just activating the custom screen group for the add-on.

              However, this BAdI provides further methods to query data for the add-on fields, save data to tables other than the vendor master, initialize the add-on append structure (discussed later) and much more.

              BADI_VENDOR_ADD_DATA.png

                Implement badi venodor add data.png

    • Step 4: Execute the step 'Business Add-In: Customer Subscreen' next.

                This step allows you to create an implementation of the BAdI VENDOR_ADD_DATA_CS. This BAdI provides methods to manipulate the screen elements of the subscreens associated with the screen group declared in the previous step. It also provides methods to set values to and read values from the screen fields on the subscreens.

                A filter with the screen group has to be specified before the methods of this BAdI can be implemented.

               BADI_VENDOR_ADD_CS_FILTER.png

               BADI_VENDOR_ADD_CS_IMPLEMENTATION.png

    • Step 5: Next, LFA1 has to be enhanced with an append structure consisting of the custom fields which will show up on the new subscreen (9030 in this case).

               To do this, go to SE11 and open the definition of table LFA1. Click on 'Append Structure...' and then on 'Create' to create a new append structure. Since for this demo IDocs have to be created for any change in the values stored in these custom fields, make sure that the 'Change document' option is turned on at the data element level.

               Save all changes and activate.

 

                Create a new append structure.png

                   Append structure with change documents.png

    • Step 6: The next step involves adding the fields from the append structure to the custom subscreen (accessible via the XK** transactions).

               Execute transaction SE51 and create a new subscreen for the program and screen number specified above (steps 4 and 2 respectively: program SAPMZMSW_VEND_EPA, screen 9030).

               Create subscreen for new vendor fields.png

                  Click on the 'Dictionary/Program Fields Window' button to select the append structure fields from the LFA1 table structure.

                  Select custom fields from append structure.png

                    Arrange the fields on the layout screen and save all changes.

                  Arrange screen layout for subscreen.png

    • Step 7: In this example the screen fields added above are all display-only, so no PBO module is necessary. However, to populate these fields for display, LFA1 needs to be declared as a global variable (TABLES) in the dynpro program. The standard processing routines will automatically populate the LFA1 structure with the values from the database. Since the screen elements created above all refer to the LFA1 structure, the values will automatically be passed to the screen fields and displayed.

               LFA1 declaration in dynpro program.png
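For reference, what the screenshot above shows boils down to something like this (a sketch only; module pool name and screen number are the ones used in the steps above):

* Top include of the subscreen's module pool (SAPMZMSW_VEND_EPA in this example)
TABLES: lfa1.   " filled by the standard vendor master processing;
                " the ZZ* fields of the append structure are displayed from it

* Flow logic of subscreen 9030: no custom modules are needed for display-only fields
PROCESS BEFORE OUTPUT.
PROCESS AFTER INPUT.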

    • Step 8: Finally, if all the steps have been followed correctly, you should be able to view a vendor in XK03 and see the subscreen that was added above. However, since all the fields on this subscreen are newly added, they will all be empty. In the next part of this series, I will go over the steps required to update the LFA1 structure from an external source.

                XK03_screen1.png

                XK03_screen2.png

You may have been in a situation where you have to process multiple objects. When you do this sequentially, it'll take some time before that processing finishes. One solution could be to schedule your program in background, and just let it run there.

But what if you don't have the luxury of scheduling your report in the background, or what if the sheer number of objects is so large that it would take a day to process all of them, with the risk of overrunning your nightly timeframe and impacting the daily work?

 

Multi Threading

It would be better if you could actually just launch multiple processing blocks at the same time. Each block could then process a single object and when it finishes off, release the slot so the next block can be launched.

That could mean that you could have multiple objects updated at the same time. Imagine 10 objects being processed at once rather than just one object sequentially. You could reduce your runtime to only 10% of the original report.

 

It's actually not that hard. If you create a remote-enabled function module containing the processing logic for one object, with the necessary parameters, you can simply launch it in a new task. That creates a new process (you can monitor it in transaction SM50) which ends as soon as your object is processed.

 

Newtask.png
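What such a remote-enabled function module can look like, reduced to a skeleton (ZUPDATE and its parameter are placeholders; remember to set the processing type to 'Remote-Enabled Module' in SE37 and to pass all parameters by value):

FUNCTION zupdate.
*"----------------------------------------------------------------------
*"  IMPORTING
*"     VALUE(IS_OBJECT) TYPE  ZSOBJECT   " one object to be processed (placeholder type)
*"----------------------------------------------------------------------
* Processes exactly one object. Each STARTING NEW TASK call runs this
* in its own work process, so keep the logic self-contained and commit
* your own changes here.
  " ... update logic for the single object, e.g. a BAPI call plus COMMIT WORK ...
ENDFUNCTION.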

 

Here's a piece of pseudo-code to realise this principle.

data: lt_object type ztt_object. "placeholder table type: big table full of objects to update
data: lv_task type char8.        "unique task name per call
field-symbols: <line> like line of lt_object.

while lt_object[] is not initial.
  loop at lt_object assigning <line>.

    lv_task = |TASK{ sy-tabix }|.

    call function 'ZUPDATE' starting new task lv_task
      exporting
        is_object        = <line>
      exceptions
        resource_failure = 1
        others           = 9.

    if sy-subrc = 0.
      delete lt_object.
    endif.

  endloop.
endwhile.


Something like that.

Notice how there's a loop in a loop, to make sure that we keep trying until every object has been launched to a process. Once an object has been successfully launched, remove it from the list.

 

Queue clog

But there's a catch with that approach. As long as the processing of an individual object doesn't take up too much time, and you have enough DIAlog processes available, things will work fine. As soon as a process ends, it's freed up to take on a new task.

 

But what if your processes are called upon faster than they finish? That means that within the blink of an eye, all your processes will be taken up, and new tasks will be queued. It also means that no one can work on your system any more, because all dialog processes are being hogged by your program.

queue clog.png

* notice how the queue is still launching processes, even after your main report has already ended.

 

You do not want that to happen.

 

The first time that happened to me was on my very first assignment, where I had to migrate 200K maintenance notifications. I brought the development system to its knees on multiple occasions.

The solution back then was to double the number of dialog processes. One notification process finished fast enough before the main report could schedule 19 new tasks, so the system never got overloaded.

 

Controlled Threading

So what you want, is to control the number of threads that can be launched at any given time. You want to be able to say that only 5 processes may be used, leaving 5 more for any other operations. (That means you could even launch these mass programs during the day!)

But how do you do that?

 

Well, you'll have to receive the result of each task, so you can keep a counter of active threads and prevent more from being spawned as long as you don't want them to.

 

caller:

data: lt_object type ztt_object. "big table full of objects to update
data: lv_task type char8.
field-symbols: <line> like line of lt_object.

while lt_object[] is not initial.
  loop at lt_object assigning <line>.

    lv_task = |TASK{ sy-tabix }|.

    call function 'ZUPDATE' starting new task lv_task
      calling receive on end of task
      exporting
        is_object        = <line>
      exceptions
        resource_failure = 1
        others           = 9.

    if sy-subrc = 0.
      delete lt_object.
      add 1 to me->processes.
    endif.

  endloop.
endwhile.

receiver (the method registered with CALLING ... ON END OF TASK):

RECEIVE RESULTS FROM FUNCTION 'ZUPDATE'.
subtract 1 from me->processes.

 

This still just launches all processes as fast as possible, with no throttling. It just keeps the counter; we still have to do something with that counter.

And here's the trick: there's a WAIT statement you can use to check whether the number of used processes is less than whatever you specify.

But this number is not updated after a RECEIVE unless your logical unit of work is updated, and that is only done after a COMMIT or a WAIT statement.

But wait, we already have a WAIT statement, won't that update it?

Why yes, it will, but then it's updated after you waited, which is pretty daft, because then you're still not quite sure whether it worked.

 

so here's a trick to get around that.

caller:

data: lt_object type ztt_object. "big table full of objects to update
data: lv_task type char8.
field-symbols: <line> like line of lt_object.

while lt_object[] is not initial.
  loop at lt_object assigning <line>.

    "throttle: block while the maximum number of parallel tasks is active
    while me->processes >= 5.
      wait until me->processes < 5.
    endwhile.

    lv_task = |TASK{ sy-tabix }|.

    call function 'ZUPDATE' starting new task lv_task
      calling receive on end of task
      exporting
        is_object        = <line>
      exceptions
        resource_failure = 1
        others           = 9.

    if sy-subrc = 0.
      delete lt_object.
      add 1 to me->processes.
    endif.

  endloop.
endwhile.

 

That'll keep the number of threads under control and still allow you to achieve massive performance improvements on mass processes!

 

Alternatives

Thanks to Robin Vleeschhouwer for pointing out destination groups. By starting your RFC in a specific destination group, your system administrators can control the number of processes in that group. The downside is that it is not as flexible as a parameter on your mass-processing report, and you have to run everything past your sysadmins. See the sketch below.
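For illustration, the destination-group variant only changes the CALL FUNCTION statement itself; the group name 'ZPARALLEL' below is an assumption and would be maintained by your administrators (transaction RZ12):

call function 'ZUPDATE' starting new task lv_task
  destination in group 'ZPARALLEL'
  calling receive on end of task
  exporting
    is_object        = <line>
  exceptions
    resource_failure = 1
    others           = 9.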

 

Another sweet addition came from Shai Sinai in the form of bgRFC. I have to admit that I had actually never even heard of it, so there is not much I can say at this point in time, except that, skimming through the documentation, it looks like something pretty nifty.

Welcome back to another ABAP Trapdoors article. It's been a while; I posted my last article in 2011. In the meantime, I've collected a number of interesting topics. If you're interested in the older articles, you can find the links at the end of this article.

 

A few weeks ago, I had to implement a seemingly simple task: from an SAP Business Workflow running in system ABC, a sub-workflow with several steps had to be started in another system, or even in a number of other systems. Since a BPM engine was not available, I decided to use simple RFC-enabled function modules to raise workflow events in the target system. The sub-workflows can then be started via simple object type linkage entries. While this approach works quite well for my simple scenario, I ran into an altogether unexpected issue that took me quite a while to figure out.

 

There are two function modules to raise the workflow events: SAP_WAPI_CREATE_EVENT and SAP_WAPI_CREATE_EVENT_EXTENDED. In my case, I used the extended function module because I was working with class-based events. So the call basically was

 

  CALL FUNCTION 'SAP_WAPI_CREATE_EVENT_EXTENDED' DESTINATION l_rfcdest
    EXPORTING
      catid             = 'CL'
      typeid            = 'ZCL_MY_CLASS'
      instid            = l_instid
      event             = 'MY_EVENT'
    ... 

To my surprise, it did not work: the system kept telling me that the event 'M' does not exist. After spending considerable time debugging and scratching my head, I finally identified the issue. Since it can be tricky to reproduce this particular issue, here is a very simple function module to demonstrate the actual problem:

 

FUNCTION ztest_rfc_echo.
*"----------------------------------------------------------------------
*"*"Lokale Schnittstelle:
*"  IMPORTING
*"     VALUE(I_INPUT_VALUE) TYPE  STRING
*"  EXPORTING
*"     VALUE(E_OUTPUT_VALUE) TYPE  STRING
*"----------------------------------------------------------------------

  e_output_value = i_input_value.

ENDFUNCTION.

(If you want to try this for yourself, make sure that the function module is RFC-enabled.)

This is no more than a simple value assignment – Text goes in, text comes out, right? Let’s see. Here is a demo program to check it out:

 

REPORT ztest_rfc_conversion.

DATA: g_value TYPE string.

START-OF-SELECTION.

  CALL FUNCTION 'ZTEST_RFC_ECHO'
    EXPORTING
      i_input_value  = 'This is a C literal'
    IMPORTING
      e_output_value = g_value.
  WRITE: / '1:', g_value.

  CALL FUNCTION 'ZTEST_RFC_ECHO'
    EXPORTING
      i_input_value  = `This is a STRING literal`
    IMPORTING
      e_output_value = g_value.
  WRITE: / '2:', g_value.

  CALL FUNCTION 'ZTEST_RFC_ECHO' DESTINATION 'NONE'
    EXPORTING
      i_input_value  = 'This is a C literal'
    IMPORTING
      e_output_value = g_value.
  WRITE: / '3:', g_value.

  CALL FUNCTION 'ZTEST_RFC_ECHO' DESTINATION 'NONE'
    EXPORTING
      i_input_value  = `This is a STRING literal`
    IMPORTING
      e_output_value = g_value.
  WRITE: / '4:', g_value.

 

In this program, the function module is called twice within the same session and twice starting a new session, using both a character literal and a string literal (note the difference between 'X' and `X`!). And the output is:

 

output.png

 

As you can easily see, the character literal is chopped off after the first character, but only if the function module is called via RFC. The same thing happened in my original program, since the parameter EVENT of the function module SAP_WAPI_CREATE_EVENT_EXTENDED is of type STRING.

 

I considered this a bug, especially since neither SLIN nor the Code Inspector warned about this issue. As a good SAP citizen, I created a SAPnet ticket. After a lengthy discussion, I was told:

There is no "implicit conversion" in RFC, therefore application need to adjust(or choose) a proper data types for calling/receiving RFMs.

In the end, this problem is very similar to the one explained by Jerry Wang in his blog a few weeks ago – another trapdoor in the development environment you constantly have to keep in mind when doing RFC programming if you want to avoid being reduced to a single character with a lot of trouble…
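A simple way to stay clear of this trapdoor is to make sure the actual parameter really is a string whenever the formal RFC parameter is typed as STRING, for example:

  DATA l_event TYPE string.
  l_event = 'MY_EVENT'.   " the assignment converts to string before the RFC call

  CALL FUNCTION 'ZTEST_RFC_ECHO' DESTINATION 'NONE'
    EXPORTING
      i_input_value  = l_event            " or simply `MY_EVENT` as a string literal
    IMPORTING
      e_output_value = g_value.

On 7.40 systems you can also write i_input_value = CONV string( 'MY_EVENT' ) directly in the call.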

 

Older ABAP Trapdoors articles

Who should read this?

First and foremost, if you don't know what a design pattern is, then this blog is most certainly targeted at you.

 

Secondly those who don't really understand the Model View Controller (MVC) design pattern should also keep on reading.

 

What is the MVC?

My intention was never to write a blog on what it is or how to implement it. There are heaps of content on the internet and also on SCN; a simple search will find tons of stuff out there. It's been around for ages; in fact it's older than me, so it's pretty old.

 

Here are some of my favourites:

http://blog.codinghorror.com/understanding-model-view-controller/

http://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93controller

 

Where are you going wrong?

I've worked at quite a few clients over the years and it never ceases to amaze me how many developers are unaware of MVC and its importance to SAP frameworks. Even more horrifying is the number of developments/extensions I have seen that break this pretty straightforward design pattern.


The most classic example is coding business logic inside the UI (a view). This results in two very frustrating outcomes:

  • Data input via other views (i.e. via middleware) will not go through the same business validation logic, and as a result the application data may be corrupted
  • Validation logic is duplicated in multiple views; maintenance requires double the effort and, if not managed, could result in inconsistencies

 

Why should I learn it?

It's the same pattern that keeps cropping up again and again and it does not appear to be going away any time soon...

 

Framework           MVC Links
BSP                 Model View Controller (MVC) - Business Server Pages - SAP Library
ABAP WebDynpro      Web Dynpro and the Model View Controller Design - SAP NetWeaver Composition Environment Library - SAP Library
OpenUI5             OpenUI5 SDK - Demo Kit
                    Introduction to UI5 - MVC 1 - YouTube
                    Introduction to UI5 - MVC 2 - YouTube

 

So if you don't know MVC, you really can't say you know these frameworks and, even worse, you are potentially using them incorrectly.

 

Many non-SAP programming languages incorporate frameworks built on this concept, and the sooner you spot it, the easier it will be for you to adopt a particular framework and the less likely you are to fall into one of the pitfalls described in the previous section.

 

Cleaner Code

The ability to have a clean separation of code that promotes reuse is fantastic. So much so that I find myself using this principle even when writing plain ABAP reports. The ability to plug in different views makes it quick to extend the logic to expose the report data as an e-mail, ALV, file download, etc., and it is abundantly clear to anyone maintaining the code where to make changes. A minimal sketch of this idea follows below.
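Here is a hedged sketch of how a plain report can be split along MVC lines (all names are invented for illustration; the data source is the familiar SFLIGHT demo table):

REPORT zdemo_mvc_report.

TYPES ty_flights TYPE STANDARD TABLE OF sflight WITH DEFAULT KEY.

* Model: owns data retrieval and business/validation logic
CLASS lcl_model DEFINITION.
  PUBLIC SECTION.
    METHODS get_flights RETURNING VALUE(rt_flights) TYPE ty_flights.
ENDCLASS.

CLASS lcl_model IMPLEMENTATION.
  METHOD get_flights.
    SELECT * FROM sflight INTO TABLE rt_flights UP TO 100 ROWS.
  ENDMETHOD.
ENDCLASS.

* View: only knows how to present data it is handed
INTERFACE lif_view.
  METHODS render IMPORTING it_flights TYPE ty_flights.
ENDINTERFACE.

CLASS lcl_alv_view DEFINITION.
  PUBLIC SECTION.
    INTERFACES lif_view.
ENDCLASS.

CLASS lcl_alv_view IMPLEMENTATION.
  METHOD lif_view~render.
    DATA(lt_flights) = it_flights.        " SALV needs a changeable table
    TRY.
        cl_salv_table=>factory( IMPORTING r_salv_table = DATA(lo_alv)
                                CHANGING  t_table      = lt_flights ).
        lo_alv->display( ).
      CATCH cx_salv_msg.
        MESSAGE 'Could not display ALV' TYPE 'I'.
    ENDTRY.
  ENDMETHOD.
ENDCLASS.

* Controller: wires the model to whatever view is plugged in
CLASS lcl_controller DEFINITION.
  PUBLIC SECTION.
    METHODS run IMPORTING io_view TYPE REF TO lif_view.
ENDCLASS.

CLASS lcl_controller IMPLEMENTATION.
  METHOD run.
    io_view->render( NEW lcl_model( )->get_flights( ) ).
  ENDMETHOD.
ENDCLASS.

START-OF-SELECTION.
  NEW lcl_controller( )->run( NEW lcl_alv_view( ) ).

An e-mail or file-download view is then just another implementation of lif_view; the model stays untouched.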

 

One final note

Please, please, please: if you don't understand it, take some time out to learn this pattern before you attempt to create or extend any more content. Your colleagues, your employers and everyone who works with your code will forever appreciate it.
