This blog post complements other blogs we have written about SAP ChaRM with Workflow and Notifications, like


As an incentive, you will be using the latest available SAP classes to deliver e-mails: CL_BCS and CL_DOCUMENT_BCS.  They have replaced the well-known function modules SO_*DOCUMENT* and SO_OBJECT_SEND.  That will make your manager happy and will add a line to your resume.


Unavoidable Reality: 

Once you implement SAP Workflow with ChaRM, it can turn into an avalanche, with people asking you for all sorts of new features they want.


One of the first requests we got was: "Would it be possible that, if a user gets an e-mail, other people in the HR Org Structure get copied on that same e-mail, in case the person to whom the e-mail is directed is not in the office and one of the other individuals is acting in her/his absence?"  Not even a "thank you, you guys did a good job"; just a quick breath after our go-live, and back to the drawing board to do some thinking ... again.


The answer is simple.  Yes, it is possible, but the SAP Workflow e-mail step delivered by SAP does not have the capability of sending carbon copies (CC) or blind carbon copies (BCC).  For that you have to use a standard activity whose method calls delivered classes provided by SAP, which do the work for you.


Before we start, note that there is related material in the SAP Workflow space on how to solve this; we are simply adding our experience here for the ChaRM user community, to show that as long as you can develop some basic ABAP code, and you are willing to, you can do it yourself.

Let's begin.


Our initial scenario: we have a WF that has e-mail steps.  Let's take a look at one of them.


As you can see below, we are passing the list of e-mail accounts (one or several) in the container variable MANAGEREMAIL.


The variable looks like the one below, and in the Properties tab you can define it as multiline or not, depending on whether you are passing one or several values.



Note:  The e-mail addresses for user accounts and business partners are stored in table ADR6.  The link below shows the diagram with the path to that table from the ChaRM standpoint.
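As a reference, a minimal classic-ABAP sketch of reading the SMTP address of a user account from ADR6 (via the link table USR21) could look like the following. The selection logic and variable names here are our own illustration, not part of the blog's workflow method, which receives the addresses from the WF container instead:

```abap
* Sketch: read the e-mail address of a SAP user from ADR6.
* USR21 links the user name to the central address data
* (PERSNUMBER/ADDRNUMBER); ADR6 holds the SMTP address.
DATA: lv_persnumber TYPE usr21-persnumber,
      lv_addrnumber TYPE usr21-addrnumber,
      lv_smtp_addr  TYPE adr6-smtp_addr.

SELECT SINGLE persnumber addrnumber
  FROM usr21
  INTO (lv_persnumber, lv_addrnumber)
  WHERE bname = sy-uname.

IF sy-subrc = 0.
  SELECT SINGLE smtp_addr
    FROM adr6
    INTO lv_smtp_addr
    WHERE addrnumber = lv_addrnumber
      AND persnumber = lv_persnumber.
ENDIF.
```

For business partners the path runs through the BP's address number instead of USR21, but the target field is the same ADR6-SMTP_ADDR.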


With the current scenario you may provide a list of e-mail addresses, but each individual will appear in the To: field of the e-mail; with this type of WF step, called E-Mail, there is no option for somebody to receive it as CC or BCC.


So, what is the solution?

Instead of using the e-mail step, create a standard task that calls a method, to be created by you, which receives some parameters from the WF, including the list of e-mail addresses.  Then, step by step, you define the e-mail subject, the body of the e-mail (content), who it is going to, who gets CC or BCC, and finally you deliver the e-mail.


Our new scenario:



Let's take a look at the first Activity that handles the new approach for the e-mail.



As you see, we call a method.



Below is some of the code in the method that is relevant for this blog.


begin_method zch_email_when_urgent_rfc changing container.

* Place all your variables here.  We show some variables that are relevant for this exercise.

* E-mail related variables for SAP classes CL_BCS and CL_DOCUMENT_BCS:

DATA:
    lv_send_request     TYPE REF TO cl_bcs,           " Send request
    lv_subject          TYPE so_obj_des,              " E-mail subject
    lv_body             TYPE bcsy_text,               " Mail body
    lt_attachment       TYPE bcsy_text,               " Attachment for success
    wa_text             TYPE soli,                    " Work area for body/attachment lines
    lv_document         TYPE REF TO cl_document_bcs,  " Mail body document
    lv_sender           TYPE REF TO if_sender_bcs,    " Sender address
    lv_recipient        TYPE REF TO if_recipient_bcs, " Recipient
    lv_size             TYPE sood-objlen,             " Size of attachment
    lv_lines            TYPE i,                       " Lines count
    lv_email            TYPE ad_smtpadr,              " E-mail ID
    lt_result           TYPE bcsy_resv,               " Send result
    wa_result           TYPE bcss_resv,               " Work area for send result
    lv_doc_type         TYPE soodk-objtp VALUE 'RAW'. " TXT format


* Other variables.

DATA:
    lv_rfcnumber            TYPE crmd_orderadm_h-object_id, " RfC number
    lv_rfcreqname           TYPE crmt_partner_no,           " Requester BP
    lv_rfcemergencycriteria TYPE char40,                    " Urgent criteria (type assumed)
    lt_manageremail         TYPE adr6-smtp_addr OCCURS 0,   " List of e-mails from managers
    lv_manageremail_wa      TYPE adr6-smtp_addr,            " Work area for internal table
    lt_teamleademail        TYPE adr6-smtp_addr OCCURS 0,   " Team Lead only e-mail address " NEW
    lv_teamleademail_wa     TYPE adr6-smtp_addr.            " Work area for internal table " NEW



*  The retrieval of the values from the container into the variables should go here ...


* Now the e-mail subject.


CONCATENATE 'Urgent RfC' lv_rfcnumber 'requires Manager approval.' INTO lv_subject SEPARATED BY space.


* Preparation of the e-mail body.  Here is where you craft your e-mail.  Nothing really fancy at this point, only passing of some variables.


wa_text-line+0 = 'The following Urgent RfC requires your approval'.

* You need to append line by line to lv_body.
APPEND wa_text TO lv_body.

* A blank line ... yeah!!! even that.
APPEND space TO lv_body.

* And an example of the rest of the lines in the e-mail.
CLEAR wa_text.
wa_text-line+0  = 'Urgent Criteria'.
wa_text-line+38 = ':'.
wa_text-line+40 = lv_rfcemergencycriteria.
APPEND wa_text TO lv_body.
CLEAR wa_text.
wa_text-line+0  = 'RfC Number'.
wa_text-line+38 = ':'.
wa_text-line+40 = lv_rfcnumber.
APPEND wa_text TO lv_body.
CLEAR wa_text.
wa_text-line+0  = 'Requester'.
wa_text-line+41 = ':'.
wa_text-line+43 = lv_rfcreqname.
APPEND wa_text TO lv_body.

CLEAR wa_text.
wa_text-line+0 = 'Regards,'.
APPEND wa_text TO lv_body.
APPEND space TO lv_body.
APPEND space TO lv_body.
CLEAR wa_text.
wa_text-line+0 = 'Change Management Team.'.
APPEND wa_text TO lv_body.
CLEAR wa_text.

* Once you complete the e-mail body, it is time to tell the system this is an e-mail.  FYI, we did not invent anything below; we just plugged our variables into the standard set of steps for creating e-mails with classes CL_BCS and CL_DOCUMENT_BCS.

* Create a persistent send request.  First instruction to define that, from now on, we are in e-mail mode.
lv_send_request = cl_bcs=>create_persistent( ).

* Create the document for the mail body.  As you see, we pass the subject and the e-mail body.
lv_document = cl_document_bcs=>create_document(
                i_type    = 'RAW'
                i_subject = lv_subject   " Mail subject
                i_text    = lv_body ).   " Mail body

* Add the document to the send request.  This is also standard.
CALL METHOD lv_send_request->set_document( lv_document ).

* Sender address ... you (sy-uname) or, usually, user WF-BATCH.
lv_sender = cl_sapuser_bcs=>create( sy-uname ).
CALL METHOD lv_send_request->set_sender
  EXPORTING
    i_sender = lv_sender.

* Now, the list of recipients.  As we have them in a table, we have to pass them one by one in a loop.
LOOP AT lt_manageremail INTO lv_manageremail_wa.
  lv_email     = lv_manageremail_wa.
  lv_recipient = cl_cam_address_bcs=>create_internet_address( lv_email ).

  CALL METHOD lv_send_request->add_recipient
    EXPORTING
      i_recipient  = lv_recipient
      i_express    = 'X'
      i_copy       = ' '
      i_blind_copy = ' '
      i_no_forward = ' '.
ENDLOOP.


* Now, the list of CCs.  As we have them in another table, we have to pass them one by one in a loop.

* I know, I know.  We could have passed them in the same table with an additional field to tell
* whether the record is a recipient, CC, or BCC.  Sorry for being lazy!!

* We count the lines first, as it is possible our HR evaluation path did not find any records.
DESCRIBE TABLE lt_teamleademail. " NEW
IF sy-tfill > 0.

  LOOP AT lt_teamleademail INTO lv_teamleademail_wa. " NEW
    lv_email     = lv_teamleademail_wa. " NEW
    lv_recipient = cl_cam_address_bcs=>create_internet_address( lv_email ).

    CALL METHOD lv_send_request->add_recipient
      EXPORTING
        i_recipient  = lv_recipient
        i_express    = 'X'
        i_copy       = 'X'   " CC flag; use i_blind_copy = 'X' instead for BCC
        i_blind_copy = ' '
        i_no_forward = ' '.
  ENDLOOP.

ENDIF.

* Once all is complete, we deliver the e-mail immediately, which even bypasses the SOST/SCOT delays.
lv_send_request->set_send_immediately( 'X' ).

* Send mail.  The COMMIT WORK is required so the send request is actually processed.
lv_send_request->send( ).
COMMIT WORK.

end_method.





That is all, colleagues.  For specific details about everything you can do in the e-mail (fancy fonts, attaching documents, etc.), please follow the link below.  Thanks to Sri Hari Anand Kumar for that.






This blog is a step-by-step configuration guide for activating the Enhanced Approval Procedure contained within Solution Manager 7.1 SP10.


If the customer has more than one approval process and wants to use a rule-based approval procedure for their change documents, they can do so using this new feature:










Choose New Entries, and enter the following data:



    • User Name
    • Field Name
    • Sequence Number
    • Field Value










Create Approval Procedure Rules






You must create two rule policies in the WebClient UI before you can use them for your approval procedure:




    • Approval procedure rule: AI_CM_CR_RFC_PRO
    • Approval step determination rule: AI_CM_CR_RFC_STE




      • In the WebClient UI, choose Service Operations -> Rule Policies. (The system displays the search screen for rule policies.)






  • Choose New, and choose Approval Management as the context. You have to create two rule policies. Save the first one before you create the other:




For the approval procedure rule, enter AI_CM_CR_RFC_PRO in the Rule Policy field.




For the approval step determination rule, enter AI_CM_CR_RFC_STE in the Rule Policy field.

(The system creates a default policy variant.)





  • Select the rule policy top node to set authorizations.
  • Select the Draft Rules line item, then *Subnode to create a Rule folder by adding a subnode.



  • Select the newly created rule folder, and choose *Subnode to create a new Rule.
  • Under Rule Details, enter a name and a description for your new rule.
  • Add actions and conditions to your rule folder, as follows:


Conditions for the approval procedure rule:







    • Transaction Type: Request for Change

Actions for the approval procedure rule:

    • Approval Procedure: Change Request Approval Procedure















Conditions for the approval step determination rule:

    • Approval Procedure: SMCR0001 / ZMCR0001
    • Approval Step: SMCR000001 /
    • Approval Step Approver:






Actions for the approval step determination rule:

    • Set partner for step approval by PFC
    • Partner function













  • When you have created your rules and checked them to ensure correct logic and syntax, you can release and save them.







Result: You have created two rule policies which you can use for approval management.







For more information on rule policies, see the SAP Help Portal under -> <Release and language> -> Application Help -> Basic Functions -> Rule Modeler.

I just released KBA 2045839.


It contains the product instance definitions of SAP NetWeaver 7.3, EHP1 for SAP NetWeaver 7.3, and SAP NetWeaver 7.4.


This is by no means everything that there is for these product versions in PPMS, but it is a start.


Let me know if you like the document.

So, after some time at home (parental leave, etc.), today I want to share with you how and where to easily change the behaviour of the approval procedure, if you are able to do some coding (or, in this special case, insert the code).

I will not go into approval procedure details; more information on this can be found here, e.g.:


Standard behaviour:
When using the approval procedure in Change Request Management with more than one approver, the Request for Change is only rejected after all approvers have voted, meaning executed their approval step.

This means that if you have a Request for Change with five approvers and the first one rejects it, you have to wait until all approvers have voted before the Request for Change is rejected. It will be rejected, though, as one rejection is enough to completely reject the whole Request for Change.

This is the standard behaviour taken over from CRM standard.

Advantages and disadvantages:

The advantages are that all approvers know about the Request for Change, and that if the rejections have to be clarified with the rejecters, it might be more convenient to know all of them in one step.


The disadvantage is that you have to wait until all approvers have executed their decision before you get feedback, even though the Request for Change will be rejected anyway.


I was recently contacted by a customer asking if this could be changed.

The customer wanted the first rejection to lead to a general rejection of the Request for Change, because it doesn't make sense for them to wait until others have rejected, too. I assume they have a process which allows going back into approval from 'Rejected', which might not be a 'FINI' status at all. But that's a guess.


Here is now how the behaviour can be implemented:

  1. Copy a standard function module and change the code
  2. Register the new function module instead of the old one in the CRM Event Framework

The standard behavior of evaluating the approval steps and setting the approval procedure status is implemented via the CRM Event Framework, which can be reached via transaction 'CRMV_EVENT'.

event framework.jpg

There, function modules are registered to be called by specified events, like the save of a CRM document, etc.

Enter the function name 'AIC_SRQM_RFC_APPROVAL_STAT_EC' and press 'Callback for Cat./Obj./Event'.

You will see that the function is registered to run on the event 'AFTER_SAVE' if, for the object 'APPROVAL', the attribute 'STEP' has changed.

function regiesterd.jpg


This function 'AIC_SRQM_RFC_APPROVAL_STAT_EC' has to be replaced by our own copied function (e.g. 'Z_SRQM_RFC_APPROVAL_STAT_EC') with changed code.

To do so, call transaction 'SE37', copy the function under the name stated above into an existing function group of your choice, and save everything in your transport. Be aware that the top include of your function group needs the includes


INCLUDE crm_approval_con.

INCLUDE crm_object_kinds_con.

INCLUDE crm_object_names_con.

INCLUDE crm_events_con.

INCLUDE crm_mode_con.

INCLUDE crm_log_states_con.

INCLUDE crm_status_con.


so that the copied function module is free of syntax errors.


Then replace the code listed in the screenshot with the code further down and activate everything.


* Replacement code ****

  READ TABLE ls_approval_wrk-approval_steps_wrk
       WITH KEY approval_result = gc_status-request_for_change_rejected
       TRANSPORTING NO FIELDS.

  IF sy-subrc NE 0.

* --4. check whether all steps are processed
    CHECK cl_crm_approval_utility=>all_steps_done(
          it_approval_s_wrk = ls_approval_wrk-approval_steps_wrk ) EQ true.

    lv_approved = true.

    LOOP AT ls_approval_wrk-approval_steps_wrk ASSIGNING <fs_approval_s_wrk>
            WHERE approval_result EQ gc_status-request_for_change_rejected.
      lv_approved = false.
    ENDLOOP.

  ELSE.

*   At least one step is already rejected: reject the whole Request for Change immediately.
    lv_approved = false.

  ENDIF.



Afterwards, register this function in the CRM Event Framework by going into 'Edit' mode and entering the new function name.

replace 2.jpg


That's all. Now the new function with the changed code is called, and the Request for Change is set to 'Rejected' as soon as the first approver rejects his approval step.

By the way, if you are familiar with the Enhancement Framework, it is also possible to replace the code directly in function 'AIC_SRQM_RFC_APPROVAL_STAT_EC'; then you do not have to change the registration of the function in the CRM Event Framework.


So, I hope this helped a bit. If you find errors, get in touch with me.


We successfully completed a technical monitoring setup project with one of our clients last week; it went live as planned. This was also my last project in Singapore, so we celebrated the success with the client as well as my farewell with my peers and friends.  We were very pleased to have one of my managers from a previous company at the party; he is one of my well-wishers, too.  We had a wonderful conversation about various things. One catchy question from him was about technical monitoring: what are the challenges, from your end, in small and easy projects like technical monitoring?  Any project, whether small or big, always has challenges; only the level of criticality may differ.


Like my manager, many of you might have had a similar doubt; hence I thought of sharing with you all my experience of setting up technical monitoring.



I Clear Draft of Requirements



The major work in technical monitoring is defining the template strategy. Though SAP provides standard templates for setting up monitoring, we need to customize them anyway. It is always good practice to create custom templates based on the standard ones and work only on the custom templates. Because of this, we have to define a clear plan for how many custom templates will be created and how they will be assigned to systems. Moreover, customization does not only mean custom alerts or metrics; even changing the notifications, priority, severity, or thresholds counts as customization. We need to be very clear about which metrics need to be monitored, which metrics need to trigger an alert, what the mode of sending notifications is (e-mail or a third-party tool), and who the recipients are. All of this needs to be settled before starting the setup.


You can get the available metrics and alerts, with detailed descriptions, from the SAP standard template overview tab under SOLMAN_SETUP -> Technical Monitoring -> Step 4 Template Maintenance.




You can use this Excel file for finalising your entire requirement beforehand.



II Custom Alerts



There are lots of special features available in technical monitoring, like alert grouping, metric groups, and variant settings. Analyse the entire standard setup and change individual settings according to your needs. If you are going to create any custom metrics based on CCMS MTEs, make sure you have also created the data collector in the Z namespace. This can be very helpful when tracking issues later. The document we followed is here: How to Create Custom CCMS Metrics.





Please see the appendix of the document, on page 27, for the creation of custom data collectors.


Also note that every time you change the metrics, you need to reactivate the template; only then do all the changes become active. If you have created many templates and assigned them to lots of systems, this deactivation and activation takes a lot of time, so make sure all custom alerts are finalized and created beforehand. If you have test systems, it is very useful to test there before activating in production.



III Fulfilling Prerequisites



Almost all Solution Manager scenario setups can be started only after the prerequisites are met. The major issue in technical monitoring is the data collectors. As you all know, technical monitoring is completely different from the earlier, CCMS-based system monitoring. Technical monitoring has lots of collectors, like RFCs, Diagnostics Agents, SAP Host Agents, etc.




Most of the prerequisites check these collectors' connectivity and status. The major prerequisite is completing the entire SOLMAN_SETUP -> System Preparation, Basic Configuration, and Managed System Configuration.  Hence we need to make sure that all steps are marked green.




The other things I would consider prerequisites are EWA reporting, the Diagnostics Agent connection, the Technical Monitoring content update, ST-PI, and upgrading the SAP Host Agent to the latest level.


Some metrics need additional parameters to be set, like NFS share monitoring. In such cases, make sure you set the parameter and restart the system.



IV Troubleshooting



We did have major issues in this area; in most of them, metric data would not be collected, or would be rated grey, wrongly rated, or wrongly defined.


Several places helped to overcome most of the major issues. The first is the content check and compare tool.




The Metric Monitor helps to identify the range of a metric's value variation over a period of time.




The data collection check is the first place to find the cause of grey monitoring.




And also the MAI Tools: this is one very powerful transaction, which helped to fix very difficult collection issues, like authorisation problems, etc. Check out more here: Monitoring and Alerting Infrastructure Analysis Tools - SAP Solution Manager - SAP Library.



V Housekeeping



The standard housekeeping from SAP might not be sufficient if you have many systems or many administrators. Please make sure that you define housekeeping for the alerts as well as for the BI store.

Housekeeping for alerts can be defined directly in the technical monitoring setup.




For BI housekeeping, please look at my prior blog for more clarity: How Is The Health Of Your SAP Solution Manager BI Content?


The other, minor challenges, like authorization, maintaining global notifications, integrating a third-party ticketing tool, and work mode management, are all significant too.


Hope this blog helps out all those who are planning to set up technical monitoring.


Continuing our meet-the-ASUG-speaker series at SAP TechEd && d-code, I am pleased to introduce Heiko Zuerker, who is speaking at ASUG SAP TechEd d-code session ITM 120, How to be Successful with Run SAP Like a Factory.


About Heiko:

(pictured to the right - photo supplied by Heiko)


Heiko Zuerker is an IT Manager at Rockwell Automation. Born and raised in Germany, he moved to the United States in 2001. Heiko has over 20 years of IT experience, including various roles in desktop and server support, security, and SAP Basis.

More recently, he has been focusing on SAP continuous improvement and has been a pioneer in implementing “Run SAP Like a Factory.”


If he’s not working late at the office, you will find him either presenting at SAP TechEd, diving with sharks, or crawling through Lake Michigan shipwrecks.



This year's presentation is very special to him, since it's his 5-year anniversary of presenting at SAP TechEd/d-code through ASUG.


About his session:


Here is the abstract from the session listing:


Come and learn from Rockwell Automation's 2 1/2 years of "Run SAP Like a Factory" experience. Learn how they plan and implement its phases, how they have set up and run their Operations Control Center (OCC), the challenges they have faced and are still facing, and how they continuously improve. Hear also how Run SAP Like a Factory has transformed their SAP support.





Join ASUG at SAP TechEd && d-code

Venetian/Palazzo Congress Center

ASUG SAP d-code Las Vegas sessions are now published - for a complete listing please see here

Save the date Monday, October 20th for ASUG SAP TechEd d-code Pre-conference Day


Meet ASUG SAP d-code Speaker Charles Reeves - Implementing Enterprise Master Data Management

ASUG SAP d-code SAP BW 7.4 powered by SAP HANA Speaker - Introducing Pawel Mierski

ASUG SAP d-code Sessions Are Published - Featuring SAP Mentors

Journey to Mobile BI - Meet ASUG SAP d-code Speaker Peter Chen

Did you know?

Meet ASUG SAP TechEd d-code Speaker Kumar Chidambaram - Holistic BI BW on HANA Approach

General Information

The information in this blog entry is generally based on SAP Solution Manager 7.1 SP10.  Different versions of SAP Solution Manager may vary in specific steps in the SOLMAN_SETUP transaction.

SOLMAN_SETUP was introduced to SAP Solution Manager with version 7.0 Enhancement Pack 1 (commonly referred to as SAP Solution Manager 7.01 or SAP Solution Manager 7.0 SP18) to automate the configuration of the SAP Solution Manager system to a unified baseline.  This baseline generally covers the basic Root Cause Analysis (RCA) scenario.

Depending on the support stack level of the SAP Solution Manager system, the available views in SOLMAN_SETUP may differ, but the main components that make up the baseline configuration (System Preparation, Basic Configuration and Managed System Configuration) are present in all versions.

The following screen is based on a 7.1 SP10 system:



The overview screen is very useful, as it highlights any activity that is part of the System Preparation or Basic Configuration scenarios but which does not have a completed (=green) rating.

It is generally recommended to check the overview regularly, as the step concerning the implementation of the SAP Solution Manager central note (System Preparation -> 4 – Implement SAP Note) may need to be updated with the latest version of the Central Note.  The system regularly checks with the SAP backbone for the latest version of the Central Note and compares this to the version registered in the Solution Manager system (transaction SNOTE).  If the versions do not match, the activity check is re-rated yellow, and this will be then visible in the SOLMAN_SETUP Overview.



What is required for the baseline system configuration?

Due to the interaction of the SAP Solution Manager system with the SAP backbone, the RFC connection SAPOSS must be established and working properly.  This can be tested in transaction SM59:


and adjusted if necessary in transaction OSS1:




What SAP assistance is available for SOLMAN_SETUP?

SAP provides two main forms of interactive assistance for SOLMAN_SETUP.

The first is the interactive Expert Guided Implementation (EGI) training.  It is delivered as a 5-day empowering session, with an SAP expert providing guidance and demonstrations in the mornings, followed by the attendees applying the lessons on their own systems afterwards.

The second is the Solution Manager Starter Pack service (SMSP).  This is a 5-day SAP service, which can be delivered either onsite or remotely, and includes configuration of the Solution Manager system as well as up to 3 managed systems.


What is the sequence of steps required for the baseline configuration of the SAP Solution Manager system?

System preparation:


Basic Configuration:



Managed System Configuration (Dual stack scenario example):




Technical Specifics

Where is the most time lost during the SOLMAN_SETUP configuration of the SAP Solution Manager system?

This section deals with some of the more time-intensive situations that I have come across in my experience setting up SAP Solution Manager systems.

Once again, these instances are specifically based on SAP Solution Manager 7.1 SP10, but while some details may vary, they have been observed in other versions.



The basic configuration of the SAP Solution Manager system is dependent on various prerequisites, which are listed in the following SAP Notes:

SAP Note 1483508   Solution Manager 7.1: Root Cause Analysis pre-requisites

SAP Note 1478974   Diagnostics in SAP Solution Manager 7.1

SAP Note 1843689   Solution Manager 7.1 SP Stack 10: recommended corrections                           

     (see SAP Note 1595736 for older Support Package Stacks)

SAP Note 1582842   Profile parameters for Solution Manager 7.1

SAP Note 1701770   LMDB CIM Model and CR Content prerequisite

The Parameters are checked as part of SOLMAN_SETUP System preparation step 2 – Check Installation.


If the parameters do not match the list, they will need to be adjusted before proceeding.

In the case of SAP Note 1843689 – Solution Manager SP Stack 10:  recommended corrections, patches for certain Java stack components are required, which are designed to fix known issues with the configuration procedures.

Similarly, the managed system will need to meet certain software requirements, as listed in SAP Note 1483508 - Solution Manager 7.1: Root Cause Analysis pre-requisites.  Without these prerequisites in place, some of the automated managed system configuration items may not succeed.


SLD infrastructure

Prior to the introduction of the LMDB in SAP Solution Manager 7.1, it was essential to have a single, central SLD connected to Solution Manager.  Therefore, if the customer's system landscape did not contain at least one functional SLD, a fallback was needed.  To that end, a local SLD was included in the SAP Solution Manager stack, to act as this fallback SLD.

Likewise, if there were multiple functional SLDs present, one SLD would need to act as a central, or consolidation, SLD.  If this was not possible with the existing SLDs, the Solution Manager local SLD could fill that role.

With the advent of the LMDB in SAP Solution Manager 7.1, and the new ability to connect multiple SLDs to the LMDB at the same time, this requirement is no longer in effect.  The local SLD is still available as a fallback option, but in case of multiple functional SLDs, the general recommendation is to connect them to the Solution Manager system directly.

Sample SLD infrastructure:




LMDB initial Synchronization

Introduced with SAP Solution Manager 7.1, the LMDB represents the new central system data repository, gradually replacing SMSY as the basis for Solution Manager functionality.

The SLD – LMDB synchronization is set up in the System Preparation step 6.2 – Set Up LMDB:


First, the LMDB Object Server will need to be configured (1), then the synchronization connection can be established (2).

Once the LMDB – SLD connection has been established and initial synchronization started, you will see the status in process:


Note: The process will take several hours. Please refer to SAP Notes 1555955, 1594654 and 1669645 for information regarding the performance during SLD content synchronization.

You can follow the progress of this process by monitoring the corresponding background job in SM37.  (job SAP_LMDB_LDB*)


Note that the synchronization job does not necessarily stop on errors.  Instead, if the job encounters an error situation, it does not stop; it just resets the worklist of synchronization items.  While the program is ‘intelligent’ enough to realize that the first items (up to the point of failure) have already been synchronized and need not be synchronized again, checking these items again still takes time.  Once the initial point of failure is reached again (the batch of synchronization items for which the program timed out initially), the likelihood is high that another timeout will occur at the same point, creating an endless loop.

Therefore, periodic checks of the job log, especially for long-running initial synchronization jobs, are highly recommended.  Likewise, periodic checks with transaction ST22 for related system dumps can lead to early detection of errors.

Individual steps of this job are governed by the already mentioned parameters (icm/conn_timeout and icm/keep_alive_timeout), which can cause the longer steps to time out (ICM_HTTP_TIMEOUT errors in ST22), resetting the job.  Thus, it is imperative that these parameters are increased from their defaults to accommodate the processing time.  I have found that the default value for icm/keep_alive_timeout of 60 seconds in particular is not enough, and it should be increased to at least 300 (seconds).
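As a sketch, the corresponding instance profile entries might look like this. The values are examples only (note that the two parameters use different units); tune them to your system:

```
# Instance profile fragment (example values)
icm/conn_timeout       = 300000   # connection timeout, in milliseconds
icm/keep_alive_timeout = 300      # keep-alive timeout, in seconds (default of 60 is usually too low here)
```

After changing these parameters, the ICM (or the instance) must be restarted for them to take effect.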


BW System Setup

Since the introduction of Solution Manager Diagnostics (SMD, sometimes also called Root Cause Analysis or RCA) to SAP Solution Manager in 3.2, SAP Solution Manager has contained a BW system to handle the bulk of the RCA data.  This BW system has in the past required a lot of manual configuration.  With the advent of SOLMAN_SETUP, however, much of the setup has been automated.  Still, there are certain caveats that one must consider when configuring the SAP Solution Manager system. Knowing the various pitfalls of the BW system configuration can save significant amounts of time of troubleshooting the system.

First, there are the various BW system options available:

    • BW in the productive SAP Solution Manager client

In this scenario BW is used in the same productive client as SAP Solution Manager. This option allows simpler configuration and isolates the BW activities conducted for solution life cycle management from the data on a production BW instance. This is SAP’s recommendation.

    • BW in a separate client in SAP Solution Manager system
      In this scenario BW activities are conducted in a separate client on the SAP Solution Manager system. This scenario provides increased security, as user access is more restricted. However, you must maintain users separately and this increases your administration effort. There is no technical benefit.
    • BW in a separate, non-productive BW system
      In this scenario, the BW activities are conducted in a separate, non-productive BW system in the landscape. Data is sent to this system from the SAP Solution Manager system via RFC. This is only needed in rare cases for sizing purposes.


Note:  Use of a separate, productive BW system is not recommended.

Selection of the BW scenario to be used is made in Basic Configuration step 2.1 – Specify SAP BW System:



With setup credentials provided as part of step 2.2 – Set up Credentials:


The BW Administrator section will be active if the second BW system option (BW in different client/system) is selected.

Actual configuration of the BW system is completed in the Basic Configuration section 5 – Configure Automatically:




Note that the following activities must be executed manually, before continuing with the automatic configuration steps (see SAP Note 979581 for details):

    • Activate BW source system
    • Adjust BW authorization concept (if listed)

The BW system is activated via the background job BI_TCO_ACTIVATION, which must be completed before proceeding.

Note that the Activity log in SOLMAN_SETUP for this activity does not track job completion, only the successful launch of the job.

After the background job BI_TCO_ACTIVATION finishes, check the BW activation status in transaction RSTCO_ADMIN. If you get a red rating, press the Start Installation button again:


Once the BW Source system has been fully activated, the remaining activities can be performed automatically by selecting the Execute All button:


Note:  Full execution of the automatic activities may take multiple hours.

Note:  Multiple instances of the CCMS_BI_SETUP job (such as with the activity Activate BW Content for RCA) may cause warnings and require manual re-execution of the failed activities.  This is because, as with the BI_TCO_ACTIVATION job activity, SOLMAN_SETUP only verifies that the background job was successfully launched.  It does not wait for completion of that job.

Thus, when subsequent activities attempt to start another instance of the CCMS_BI_SETUP job with different scenario variants (i.e. setting up different portions of the BW system), any still running copies of CCMS_BI_SETUP will prevent new iterations and a warning will occur.


Hopefully, this blog post has provided some information on how to prevent, detect and/or remedy some of the more egregious time-wasting situations in the baseline configuration of the SAP Solution Manager system.

More than 5 years ago we wrote a blog on how to best monitor a discrete manufacturing process. Back then, key figures like "Production/Process Orders overdue for Release" and "Production/Process Orders overdue for Technical Closure" were described, while some other key figures that describe the process steps in between were left out, such as:

  • Production/Process Orders Released without first Confirmation
  • Production/Process Orders overdue for Final Confirmation
  • Production/Process Orders overdue for Delivery Completed


    All those key figures were optimized from a technical runtime perspective, i.e. they read only from tables AUFK, AFKO, and AFPO and do not select data from the status table JEST, which is typically huge.

    Customers are usually so used to looking at the system status (e.g. CRTD, REL, PCNF, CNF, PDLV, DLV) of a production/process order that they have difficulty understanding what the above-mentioned key figures are measuring. This blog shall help to bridge this gap and explain how the key figures in Business Process Analytics map to the system statuses in a production/process order.

    Mapping Business Process Analytics key figures to system status in order



    I will explain the key figures in the logical order of a discrete manufacturing process

      1. Production/Process Orders overdue for Release
        • measures those orders which have been created (status CRTD) and where the scheduled release date already lies x days in the past, but where no actual release took place (i.e. status REL not yet reached)
      2. Production/Process Orders Released without first Confirmation
        • measures those orders which have been released (status REL) and where the actual release date lies x days in the past, but where no initial confirmation took place (i.e. status PCNF not yet reached)
      3. Production/Process Orders overdue for Final Confirmation
        • measures those orders which have been released (status REL) and where the scheduled end date already lies x days in the past, but where no final confirmation took place (i.e. status CNF not yet reached)
      4. Production/Process Orders overdue for Delivery Completed
        • measures those orders which have been released and at least initially confirmed (status REL & PCNF) and where the scheduled end date already lies x days in the past, but where no delivery completed flag was set (i.e. status DLV not yet reached)
      5. Production/Process Orders overdue for Technical Closure
        • measures those orders which have been released, initially confirmed and are complete regarding delivery (status REL & PCNF & DLV) and where the scheduled end date already lies x days in the past but where no technical completion took place (i.e. status TECO not yet reached)



    Let's summarize it in a short table:


    Key figure | Active status reached | Possible further active status | Waiting for status
    Production/Process Orders overdue for Release | CRTD | - | REL
    Production/Process Orders Released without first Confirmation | REL | - | PCNF
    Production/Process Orders overdue for Final Confirmation | REL | PCNF | CNF
    Production/Process Orders overdue for Delivery Completed | REL & PCNF | CNF or PDLV | DLV
    Production/Process Orders overdue for Technical Closure | REL & PCNF & DLV | CNF | TECO
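The five key figures can be modelled as status checks combined with a date threshold. The following Python snippet is a conceptual sketch of that logic only; the function and parameter names are assumptions for illustration, not the actual Business Process Analytics implementation:

```python
from datetime import date

# Illustrative model of the key-figure logic described above; names are
# assumptions, not SAP's implementation. statuses is the set of active
# system statuses of the order, x the threshold in days.
def overdue_key_figures(statuses, sched_release, actual_release, sched_end,
                        today, x):
    def overdue(d):
        return d is not None and (today - d).days > x

    figures = []
    if "CRTD" in statuses and "REL" not in statuses and overdue(sched_release):
        figures.append("overdue for Release")
    if "REL" in statuses and "PCNF" not in statuses and overdue(actual_release):
        figures.append("Released without first Confirmation")
    if "REL" in statuses and "CNF" not in statuses and overdue(sched_end):
        figures.append("overdue for Final Confirmation")
    if {"REL", "PCNF"} <= statuses and "DLV" not in statuses and overdue(sched_end):
        figures.append("overdue for Delivery Completed")
    if {"REL", "PCNF", "DLV"} <= statuses and "TECO" not in statuses and overdue(sched_end):
        figures.append("overdue for Technical Closure")
    return figures
```

For example, an order with statuses {"REL", "PCNF"} whose scheduled end date lies far enough in the past falls into both "overdue for Final Confirmation" and "overdue for Delivery Completed", exactly as in the table above.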




    Further reading


    Frequently Asked Questions about Business Process Monitoring and Business Process Analytics are answered under and respectively.


    The following blogs (in chronological order) provide further details about Business Process Analytics and Business Process Monitoring functionalities within the SAP Solution Manager.

    Is there a smart and fast impact analysis tool for SAP applications? The answer is BPCA, the Business Process Change Analyzer.


    How many of us are aware of this smart tool? We have implemented it in our project, and it works wonderfully.


    In SAP, change is constant: it may come from support package upgrades, custom releases, etc. So it is important to assess the impact of each change.

    During production movement of a new change, we often have common questions that BPCA answers automatically:

    •          What is changing?
    •          What is the impact of the change?
    •          How do we identify the impacted business processes that must be tested as part of no-negative-impact testing?
    •          Can we get recommendations for regression tests?


    SAP-centric solutions are changed on a regular basis by SAP or by customers, which requires customers to test their business processes thoroughly. It is sometimes difficult to identify the affected critical business processes, and this is where BPCA comes into the picture: it finds the list of all impacted business processes for a Solution Manager project.


    In simple terms, the process is: 1. Capture the impacted steps. 2. Validate the steps. 3. Mitigate the risk and impact by performing regression testing. 4. Confirm the results.





    This blog will help in understanding the concept of Business Process Change Analyzer (BPCA).


    Pre-requisite: Basic knowledge of SAP Solution Manager Concepts.


    To answer the main questions about BPCA (what does it do, how does it perform change impact analysis, and what are the prerequisites to implement it?), let us walk through the detailed steps.


    What we will cover in this blog:


    • Introduction: What is BPCA?
    • Technical Prerequisites for BPCA
    • BPCA Preparation: Identify and mark critical business processes
    • What are TBOMs and TBOM Generation Ways?
    • Results Interpretation with Change Impact Analysis.
    • Test Scope Optimization


    Let’s begin!!!


    1. Introduction: What is BPCA?


    In day-to-day scenarios, SAP solutions are changed either by SAP or by customers when there is a need for enhancement. In such scenarios, customers need to test their business processes thoroughly to ensure the particular change does not have any negative impact on other or existing business processes. Test scope identification is an important activity that helps determine the time and effort required to perform testing.

    Before we start the actual testing it is important to differentiate between the types of SAP solution change.


    Types of SAP solution change could be:

    • Maintenance or Enhancement to SAP support packages
    • Configuration changes
    • Custom developments, etc.


    For these types of changes standard test management approach is depicted below in picture:


    Standard test management approach:

    •       Perform initial risk assessment on the effects of change on critical business processes.
    •       Based on impact analysis results, plan for testing.
    •       Once Test Planning is done, execute the test cases either manually or by automated test scripts.


    Throughout this process there is a major pain point: which business processes are affected by this planned change? Let me explain the situation with a project scenario:


    In any project, whenever a planned change needs to be moved to the production system, specific approvals need to be obtained for the production movement. If the change affects existing functionality, complete testing needs to be performed and business approval obtained. To get the change approved, it is presented to an approver board, which decides to approve or reject it. In such meetings, questions are asked about integration testing, business acceptance tests, and no-negative-impact testing. Based on the answers from the change owner, the change will be approved, but there is no exact proof of the impact assessment and the testing performed for it.


    So major challenges faced in this process are:

    •         How to identify which business processes are impacted by this change?
    •         How to get the test recommendations?


    In order to address these pain points SAP introduced a new type of analysis application called “Business Process Change Analyzer” which is capable of performing change impact analysis.

    We can perform this analysis based on a transport request number; the result is the list of business process steps impacted by that particular change.


    2. Technical Prerequisites for BPCA:


    Let us see prerequisites for BPCA implementation:


    • Adequate Business Process Repository in Solution Manager: the repository must document all business scenarios, business processes, and business process steps.
    • Business Process Steps with Transactions or Programs: each business process step must be assigned its relevant ABAP object, e.g. a transaction code or report program.
    • Test Cases Defined per Business Process: test cases must be defined for each business process.
    • Test Cases Uploaded to Solution Manager.
    • TBOM Creation: a TBOM must be created for each business process step.


      3. BPCA Preparation: Identify and mark critical business processes:


    BPCA uses an SAP Solution Manager project as the basis for analysis and for structuring the results.


           1.  First, a project needs to be created in Solution Manager using transaction code SOLAR_PROJECT_ADMIN. Once project creation is complete, the Business Blueprint needs to be configured via t-code SOLAR01. The business blueprint function enables you to design, document, and hierarchically catalogue all the business processes into business scenarios, business processes, and business process steps.


    As shown in the below picture business blueprint comprises the following elements in a hierarchy:




    2. Business processes must be defined in SAP Solution Manager within the project. Below is a sample project where Material Creation and Material Change are business processes.


                                    Fig: Sample blueprint structure


    3. Managed systems, i.e. the system landscape where the particular transaction/program will be executed, must be assigned via a logical component, which connects SAP Solution Manager to that system landscape. A logical component is an administrative entity that assigns logical systems across the entire system landscape and across projects. For example, if we want to test the 'MM01' t-code in a specific system landscape, the logical component is used to connect to that system via SAP Solution Manager; in short, it is the intermediary between SAP Solution Manager and the managed system.


          4. After the transactions are assigned make sure test cases are assigned to each business process or business process step.




    • Identify and mark critical business processes: Let us see how to mark critical business processes. The criticality of a business process can be set in the business blueprint of the project. The criticality setting helps to prioritize business processes during Test Scope Optimization and during BPCA analysis.

            We can set the Business Process Priority in the Attributes tab of the business process; for this, we need to configure a customer attribute.


            Let us see how to define a customer attribute. Below is the path where we can define the new attribute; here we define the new customer attribute 'Business Process Priority'.




       To mark a business process as critical, we set its 'Business Process Priority' to '1'.


            Let us see how to mark a critical business process. In the example below, Material Creation is a critical business process, so that process is selected:



       Then go to the attributes of the business process and, in the customer attributes, set Business Process Priority to '1'. 'Material Creation' is now marked as a critical business process.


    4. What are TBOMs and TBOM Generation Ways?


    Up to this point we have covered the first BPCA preparation step: setting up the business processes and marking the critical ones.


    Let us see next step i.e. TBOM generation.


    Users should run the business process transaction so that BPCA can collect the technical objects used during the execution of the business process. This collection of technical objects is called a technical bill of materials, or TBOM. For example, suppose we want to create a TBOM for the 'Material Creation' business process step, which is executed using transaction "MM01" in the ERP system. Execute the transaction from SAP Solution Manager, which connects to the ECC system, enter the required input parameters, and complete the creation of the material. At the same time, BPCA enables a trace in the ERP system and collects all the objects used, such as data dictionary objects, includes, function modules, etc. This list of objects becomes the TBOM for transaction "MM01".
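Conceptually, the later impact analysis boils down to a set intersection between the objects changed by a transport request and the objects recorded in each step's TBOM. The following Python sketch illustrates only that idea; the step names, object names, and data structures are assumptions, not BPCA's real data model:

```python
# A business process step is reported as impacted when the transport
# request changes at least one object recorded in the step's TBOM.
def impacted_steps(transport_objects, tboms):
    """transport_objects: set of ABAP objects in the transport request;
    tboms: dict mapping a process step name to its TBOM (a set of objects)."""
    return {step: sorted(transport_objects & tbom)
            for step, tbom in tboms.items()
            if transport_objects & tbom}

# Illustrative TBOMs for three process steps.
tboms = {
    "Material Creation (MM01)": {"SAPLMGMM", "MARA", "MARC"},
    "Material Change (MM02)": {"SAPLMGMM", "MARA"},
    "Goods Receipt (MIGO)": {"SAPLMIGO", "MSEG"},
}

# A transport touching table MARA impacts both material process steps.
print(impacted_steps({"MARA", "ZZ_CUSTOM_REPORT"}, tboms))
```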


    Let us see what TBOM generation ways are:


    BPCA TBOM Generation Ways:


    •         Static TBOM
    •         Semi-Dynamic TBOM
    •         Dynamic TBOM


    TBOMs can be created using three methods; let us look at each in detail.


    1. Static TBOMs: Static TBOMs are created by scanning the source code of a transaction or report program, statically noting all the technical objects used in that particular report or transaction. They are restricted in scan depth, i.e. in how deeply they can follow objects. Static TBOMs are therefore likely to be inaccurate, as they may miss objects in deeper levels of the code, and they are not recommended for productive impact assessment.


    To create a static TBOM, select the business process step and go to its attributes. In the attributes, go to the TBOM tab and click 'Create TBOM'; in the popup that appears, select Static and create the TBOM.


    Step-by-step TBOM creation: 1.    Go to the business blueprint and select the process step for which the TBOM needs to be created. Once the step is selected, go to the Transactions tab and select the Attribute button.


    2.    Then go to the TBOM tab and click 'Create TBOM'. It offers three options: static, dynamic, and semi-dynamic. Select static, with branching levels up to 5.



    3.   Click 'OK'; static TBOM generation starts, and a popup confirms "TBOM Created".


    4.   After creation we can see the contents of TBOM:


    5.  Below are the screenshots of the TBOM contents and the list of technical objects. The chart shows the percentage of SAP components, e.g. the share of ABAP objects. Below is the TBOM example for transaction MM01, which contains 373 objects in total. Note that different types of objects have been collected: program/code objects, table contents, data dictionary objects, business transactions, etc.




    2. Dynamic TBOM: A dynamic TBOM is similar to taking an ST05 trace; it builds the list of objects by enabling a trace. That means that, to create a dynamic TBOM, the user has to execute the particular business process step or transaction, either manually or automatically, and the dynamic TBOM records the technical objects used during that execution. Dynamic TBOMs are more accurate than static TBOMs.


    Briefly, to create a dynamic TBOM: once you select the 'Dynamic' option, you are offered a 'Start Recording' option. Click 'Start Recording' and complete the execution of the transaction on the managed system. Once done, stop the recording, and the dynamic TBOM is created.

    After creation of TBOM contents can be viewed.



    3. Semi-Dynamic TBOM: Semi-dynamic TBOMs are created using UPL data (Usage and Procedure Logging) from the production system. UPL is kernel-level logging. Solution Manager must be on 7.1 SP11, and the managed system on SP9 or above, to create semi-dynamic TBOMs.

    Semi-dynamic TBOMs are created in mass fashion using a background job in BPCA. They are the most accurate, as they are based on usage data from the production system.


    UPL data is collected at the OS/kernel level in the managed system. Two main jobs are executed in the production system to collect UPL data:


    •          Collector Job – This job runs every 45 minutes to collect the UPL logs.
    •          Daily Job – This job runs daily to extract usage statistics. You can execute the report /SDF/SHOW_UPL to see a sample of UPL data.

    Let us see how to create semi-dynamic TBOMs:


    1. To create a semi-dynamic TBOM, execute transaction SOLMAN_WORKCENTER and go to "Administration". In Administration, click 'Go to TBOM Utilities'.


    2. Click on ‘Generation of Static and Semi-Dynamic TBOMs’ option.


    3. Select option ‘create semi-dynamic TBOM’ and enter the period for which UPL data needs to be fetched.



    4. Once this is done schedule a job and change the state to ‘Immediate’.

    5. Once the job is finished, check the job log for semi-dynamic TBOM. Job log shows TBOM creation is finished.


    These are the three ways to create TBOMs: static, dynamic, and semi-dynamic.



    5. Results Interpretation with Change Impact Analysis:

    Let us see the process to find the results of change impact analysis and also find out how to interpret these details.

    1.   Go to the Test Management work center by executing SOLMAN_WORKCENTER.

    2.   Go to the BP Change Analyzer view. This view shows the list of previously executed analyses and also allows us to create a new analysis. Scroll down the screen to see the analyses already done.


    3.   Now let us see how to create a new analysis:

          1. In the initial screen, enter all the details required for the analysis, as shown in the screenshot below:


          2. After all the details are entered, click the Run button; this runs the impact analysis for that transport request.

          3. After execution, check the analysis; it lists all the affected business process steps:


          4. Above is the output of the impact assessment: it lists all the business process steps impacted by the particular transport request.


    6. Test Scope Optimization:

    When performing a BPCA analysis for large changes such as support package upgrades, the analysis returns a great many business process steps. The analysis is technically correct, as many objects are modified during these upgrades.

    In such scenarios, we can look at the impact analysis in a different way and reduce the test scope using different parameters.


    BPCA helps users to optimize and reduce test scope using below criteria:

    • Test Object Coverage: BPCA uses the number of objects impacting each business process. For example, the 'Material Creation' process step has a large impact, accounting for almost 40% of the impacted objects. This technique gives users a clear idea of which business process steps are the most impacted and, accordingly, how to optimize the test scope to test only those processes.

    • Test Efforts: Another way to optimize the test scope is to use test effort. In some cases, automated test cases are assigned to business process steps, which makes testing more efficient and easier. The test scope can therefore be optimized based on the manual effort required for testing.

    • Business Process Attributes: We have already seen how to mark critical business processes by setting the 'Business Process Priority' in the attributes of the business process. Marking critical processes in this way helps BPCA optimize the test scope.
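The coverage-based idea can be sketched as a greedy selection. This is a hypothetical illustration only; BPCA's actual optimizer also weighs test effort and business process priority, as described above, and all names here are invented:

```python
# Greedily pick the process steps whose TBOMs cover the most changed
# objects until a target coverage of the changed objects is reached.
def optimize_scope(changed_objects, tboms, target_coverage=0.9):
    remaining = set(changed_objects)
    total = len(remaining)
    scope = []
    while remaining:
        if (total - len(remaining)) / total >= target_coverage:
            break  # target coverage reached, stop adding steps
        step, tbom = max(tboms.items(), key=lambda kv: len(kv[1] & remaining))
        hits = tbom & remaining
        if not hits:
            break  # no step covers any of the remaining objects
        scope.append(step)
        remaining -= hits
    return scope

changed = {"OBJ_A", "OBJ_B", "OBJ_C", "OBJ_D", "OBJ_E"}
tboms = {
    "Step 1": {"OBJ_A", "OBJ_B", "OBJ_C"},
    "Step 2": {"OBJ_C", "OBJ_D"},
    "Step 3": {"OBJ_E"},
}
# With an 80% target, two steps already cover 4 of the 5 changed objects.
print(optimize_scope(changed, tboms, target_coverage=0.8))
```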

    Below are samples of Test Scope Optimization:



    BPCA can also be integrated with HP Quality Center; there are a few prerequisites for this integration. I can share more details on the integration once we implement it in our project.

    I hope this blog gave helpful insight on BPCA implementation and its benefits.

    I have often seen people talking about interactive reporting, one of the interesting features in Solution Manager 7.1. Today I got some time to explore it in detail and found answers to questions such as what it is all about and how it differs from other reporting approaches. Through this blog I would like to share my learnings, and I look forward to hearing about your experiences.



    What is interactive reporting in solution manager context?



    As you all know, Solution Manager has various reporting options; one of the prominent and recently promoted ones is interactive reporting, which is very catchy. The common use case for interactive reporting is to perform ad hoc analyses in real time. These reports are useful for scenarios with limited data volume, where a quick result matters more than the amount of information analyzed. Since Solution Manager has its own BI content, interactive reporting suits it well.



    How does IR differ from other reporting?



    Technically, I did not see any difference: both use BI cubes, queries, and templates. But the scenarios are different. Moreover, delivered reports always use queries generated in BW, and the data is retrieved from BW into CRM for reporting purposes. The difference we identified lies in the scenarios where each is used.


    When we set up a dedicated client, or the current client, in the Solution Manager system as the BI client, the best suggestion is interactive reporting, because not much processing is required and most reports are very basic.


    Whereas if you have a dedicated BI system, BI reporting is more efficient due to the huge data volume; it is more complex, with a heavier level of processing. Hence BI reporting is the better option there.


    Another big difference I found is that you can create, edit, and display interactive reports in the ITSM web UI (CRM_UI). These reports retrieve data in real time, on demand. A self-guided wizard assists us during the creation of these reports, and you can then release the reports for certain users.

    For more information, see ITSM analytics. This is not the case with BI reporting: most BI reports are pre-existing. On the other hand, BI reporting offers more navigation and conditions for drilling down to the base level.


    The good thing is that Solution Manager has use cases for both scenarios. For detailed analysis such as RCA we can use BI reporting, whereas for technical monitoring scenarios we can use interactive reporting.


    Below is the Technical Monitoring interactive reporting, which shows basic details with restricted navigation; this is helpful for everyone, including end users.




    Here is the RCA BI reporting, which offers detailed analysis and is much more helpful for administrators.




    Below is the guided procedure for interactive report creation in ITSM.





    Is it new to solution manager?



    No. From our cross-check, we identified that the IT performance reporting used since SM 7.0 EhP1 is the same concept. IT performance reporting has now been enhanced into the new interactive reporting templates. In other words, IT performance reporting in SM 7.1 is the same as Technical Monitoring -> Interactive Reporting; both use the same web templates.


    In Solution Manager 7.0 IT performance reporting, the web template 0SMD_RS_NAVIGATION offered only a limited set of reports.





    The same web template was enhanced in SM 7.1 SP10 with more than 50 reports, as shown below; it is now called "interactive reporting".





    Unique features of IR reporting



    If I compare IR reporting with BW reporting, BW reporting might end up with the higher rating. But the unique features of IR reporting are that it is efficient and accurate, and it is targeted at all kinds of users. You can display reports as tables and charts; the following chart types are available: column chart, line chart, pie chart, bar chart, and stacked column chart.






    You can use these reports to analyze data in many different ways, including a breakdown of individual documents. The report data is retrieved in real-time. You can export report data to Microsoft Excel and print reports.



    Technical settings


    This again varies from scenario to scenario; we need to activate the corresponding BI content. For interactive reporting in Technical Monitoring, you can do this via the Technical Monitoring guided procedure setup.






    For interactive reporting in ITSM, you need to manually add some of the configuration services, refer 


    And also SAP ITSM Analytics on SAP Solution Manager 7.1 - SAP IT Service Management on SAP Solution Manager - SCN Wiki

    Other technical demos for IR reporting

    There are lots of demos available on the Solution Manager RKT website; a few are listed below. You can also try them and share your use cases in the comments.



    The everyday duty of IS/IT organizations, especially SAP Competency Centers, is to regularly deliver innovation and new features to the business, according to requirements, budget, and priorities. But it is also to keep existing solutions in operational condition.

    SAP recommends delivering two major releases per year and minor releases on a monthly or weekly basis, depending on the maturity and stability of your solution. This is directly inspired by the vendor's own release strategy for Enhancement Packages and Support Package Stacks.




    As with any technical or functional project, an EhP or SPS implementation project has to be budgeted, prepared, and planned accordingly to mitigate risks and delays, but above all to avoid potentially negative effects on live processes and solutions.




    Scope & Effort Analyzer (SEA) is an innovative tool designed for people who have to manage maintenance events on their SAP systems and need a clear understanding of the change impact as well as the test scope and related effort. It was recently shipped with SAP Solution Manager 7.1 SP11 (March 2014) and helps to predict the major cost and effort drivers of maintenance projects without the need to physically deploy any software packages. We highly recommend using it in an early planning phase of each and every software update project.


    The analysis results cover two parts:

    • Adaptation and development management: identification of the affected custom code and modifications and of the required adjustments in the SAP system, since software updates come with updates or deletions of SAP standard objects; detailed effort estimation for custom code and modification adjustments.
    • Test management: identification of the required test scope, test planning, recommendations for the creation of missing test cases and the execution of manual tests; detailed effort estimation for regression tests, with recommendations based on test scope optimization.


    It relies on Usage and Procedure Logging (UPL), a new functionality available in any ABAP-based system as of SAP NetWeaver 7.01 SP10 or equivalent. UPL is used to log all called and executed ABAP units (procedures), from programs and function modules down to classes, methods, subroutines, and smart forms, without any performance impact. UPL gives you 100% coverage of your solution usage, including the detection of dynamically called ABAP elements. UPL is the technology that closes existing gaps in the SAP workload statistics, which capture only static calls as opposed to both static and dynamic calls.

    In other terms, with the help of the UPL technique it is now possible to calculate the impact on custom code, modifications, and business processes, taking the real system usage into consideration.

    The Maintenance Optimizer scenario automatically calculates the update vector (that’s a detailed list of all technical support and/or enhancement packages to reach the target product version or stack). Scope and Effort Analyzer calculates all SAP ABAP objects which are either deleted, changed (updated version) or newly delivered with this software update. This ABAP object list or Bill of Material (BOM) is the central element to calculate the impact on your SAP system even without a physical installation of those packages.
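The principle can be sketched in a few lines of Python. This is an illustration of the idea only; the names and data structures are assumptions, not SEA's real model. The update's BOM is intersected with the objects each custom object references, and UPL usage data then ranks the hits by real usage:

```python
# Illustrative sketch: combine the update's Bill of Material with UPL
# usage statistics, so that custom objects that are both affected by the
# update and actually used in production are prioritized for adjustment.
def prioritized_adjustments(bom_changed, custom_refs, upl_executions):
    """bom_changed: SAP objects changed/deleted by the software update;
    custom_refs: custom object -> set of SAP objects it references;
    upl_executions: custom object -> execution count logged by UPL."""
    affected = [obj for obj, refs in custom_refs.items() if refs & bom_changed]
    # Most-used objects first; unused hits can often be deferred or skipped.
    return sorted(affected, key=lambda obj: upl_executions.get(obj, 0),
                  reverse=True)

custom_refs = {
    "ZREPORT_SALES": {"VBAK", "SD_SALES_FM"},
    "ZREPORT_OLD": {"SD_SALES_FM"},
    "ZREPORT_HR": {"PA0001"},
}
upl = {"ZREPORT_SALES": 1200, "ZREPORT_OLD": 0}
print(prioritized_adjustments({"SD_SALES_FM"}, custom_refs, upl))
```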

    In addition, semi-dynamic TBOM generation and the automated generation of SAP-module-oriented blueprints are two further sources of value. These features help identify the impact on your business processes and transaction codes, with the objective of outlining the test scope and recommending how to reduce the test effort with the help of the Test Scope Optimization (TSO) functionality. This is achieved through a program-based optimization of the number of changed objects by business process and the test effort of the associated test cases.




    Testimonial and success story: the French customer Coliposte (Groupe La Poste) is one of the very first references worldwide for the usage of SEA. With the help of our consultants, they implemented this new tool in a remarkably short time in the context of their EhP7 for SAP ERP adoption project.

    David Bizien, CIO of the Financial Department of Coliposte, explains in the following video how SEA helped him:

    • Forecasting the overall effort and budget for the EhP7 upgrade project
    • Focusing on the most critical impacts
    • Identifying the required skills and competencies for adaptations and developments
    • Booking and mobilizing the appropriate resources for testing
    • Taking into account the team planning constraints




    This interview was recorded at SAPPHIRE NOW 2014 in Orlando.

    If you wish to learn more about this secure (no code or sensitive information is exposed outside your company), very comprehensive (it covers both SAP standard and customer-specific objects), and nevertheless free-of-charge tool (usage is included as part of your SAP Enterprise Support contract), feel free to post questions here or to visit SAP Marketplace.


    In short, Scope & Effort Analyzer is the ultimate tool to secure your SAP maintenance and evolution projects!

    Concept: This blog shows you how to configure SLA assignment based on the information in the incident. With standard customizing it is not possible to use different conditions; this configuration lets you combine several inputs to assign the SLA that you need.



    In Solution Manager, execute transaction BRFPLUS.


    In the menu, select “Create Application”.

    Define the Application name:


    Remember to SAVE and Activate the application.

    Create the New Elements:

    Right-click on the application and select the option to create a new element.


    Create the element using the “Bind to DDIC Element (Data Dictionary)” option.


    Use the DDIC element CRMT_SRV_SERWI to bind the Service Profile.


    Remember to SAVE and Activate


    Execute the same steps for CRMT_SRV_ESCAL


    Now you will see the Data objects created and activated


    Create the Decision Table:

    Now you need to create the decision table: right-click on the application and select Expression -> Decision Table.



    Create and navigate to the table


    Now you need to define the columns.


    Insert a new column from the Context Data Objects.


    Select the application SOLMAN and add the fields that you want to use for the assignment; for this example we use PRIORITY and CATEGORY.


    Now you need to add the Result Columns


    In this case we use the objects created previously (CRMT_SRV_SERWI and CRMT_SRV_ESCAL).




    Now add new rows to the table by selecting the Insert icon.


    In this example we add two entries with different combinations of Priority and Service Profile.
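For illustration, two decision-table entries might look like the sketch below. The profile names are hypothetical placeholders; use the service and escalation profiles actually defined in your system.

```
PRIORITY        CATEGORY   | CRMT_SRV_SERWI (Service Profile) | CRMT_SRV_ESCAL (Escalation)
1 (Very High)   Incident   | ZSLA_GOLD                        | ZSLA_GOLD_ESC
3 (Medium)      Incident   | ZSLA_STANDARD                    | ZSLA_STANDARD_ESC
```

When the function is processed, the first row whose conditions match the incident's priority and category determines the returned service profile and escalation procedure.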

    Create the Function:

    Now you need to add a function: right-click on the application and then select Function.


    Create the function and navigate to the object


    Now we add the data objects that will be used by the function: in the Signature tab, select Add Existing Data Objects.



    Add all the objects found in the SOLMAN application (this allows you to use the different columns in your decision table).



    Now we need to define the result data object: select Create from the action button.


    Create a new structure object that allows you to pass the two values.


    In the new structure, click Add Existing Data Objects.



    Select the application ZSLA and add the two new objects created in the previous steps.


    Remember to SAVE and ACTIVATE

    Create the Ruleset:

    Now it is time to create a new ruleset: select your function (blue pyramid icon), go to the Assigned Rulesets tab, and click “Create Ruleset”.



    Define the new name


    After creating the ruleset, you need to create a new rule: click “Insert Rule” and then Create.


    Now write a description, and then under “Then -> Add” select the option “Process expression -> Select”.



    Now select the decision table created before


    Now you need to SAVE and Activate.

    Development Part


    To implement this part you may need support from your development team: you need to create a new PPF action (you can replicate the configuration of the action ZMIN_STD_FIND_PARTNER_FDT).

    Create a new implementation by copying the implementation ZAI_SDK_TEAM_DETERM, using the function modules CRM_SERVICE_I_READ_OW and CRM_SERVICE_I_CHANGE_OW to read and update the incident.
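The core of that action method can be sketched in ABAP as follows. This is a minimal, hypothetical sketch: the function GUID, the context element names (PRIORITY, CATEGORY), the simplified field types, and the result field names are assumptions that must match your own BRFplus function, and error handling is omitted.

```abap
* Hypothetical sketch of the PPF action logic (adapt all names to your system).
DATA: lv_function_id TYPE if_fdt_types=>id,  " GUID of your BRFplus function
      lv_timestamp   TYPE timestamp,
      lt_context     TYPE abap_parmbind_tab, " name/value context for BRFplus
      ls_context     LIKE LINE OF lt_context,
      lv_priority    TYPE char1,             " simplified types for the sketch
      lv_category    TYPE char25.

* Assumed result structure matching the two result columns
DATA: BEGIN OF ls_result,
        serwi TYPE crmt_srv_serwi,           " service profile
        escal TYPE crmt_srv_escal,           " escalation procedure
      END OF ls_result.

* lv_function_id must be filled with the function GUID shown in BRFPLUS.
GET TIME STAMP FIELD lv_timestamp.

* Pass the incident attributes used in the decision table as context
ls_context-name = 'PRIORITY'.
GET REFERENCE OF lv_priority INTO ls_context-value.
INSERT ls_context INTO TABLE lt_context.

ls_context-name = 'CATEGORY'.
GET REFERENCE OF lv_category INTO ls_context-value.
INSERT ls_context INTO TABLE lt_context.

* Evaluate the BRFplus function
cl_fdt_function_process=>process(
  EXPORTING iv_function_id = lv_function_id
            iv_timestamp   = lv_timestamp
  IMPORTING ea_result      = ls_result
  CHANGING  ct_name_value  = lt_context ).

* ls_result-serwi / ls_result-escal now hold the determined profiles;
* read the incident with CRM_SERVICE_I_READ_OW and write the values
* back with CRM_SERVICE_I_CHANGE_OW, then save the document.
```

CL_FDT_FUNCTION_PROCESS=>PROCESS is the standard way to evaluate a BRFplus function from ABAP; the name/value table carries the context, and the typed result structure receives the matching decision-table row.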


    Best Regards


    How to Add or Remove a System from a ChaRM Project


    In a scenario such as needing to add a fourth system to a logical component (Pre-Prod), or to remove such a system, please follow the steps below:


    In our example we have a Project: ERET_M

    Logical component for a system S71: SAP SOLUTION MANAGER [Solution Manager ABAP Stack]


    101 – Dev

    102 – QA

    103 – Pre Prod

    104 – Training

    105 – Prod

    Retrofit: RIG 501


    Basic steps:


    Close current cycle of maintenance project






    Save through the next phases until reaching



    Being Completed:




    Next, Agree and Proceed Through.







    Adjust Logical components as necessary

    From SMSY:





    Create a new Logical Component in LMDB




    Click Display -> Change


    Green Check to Launch LMDB



    Add Technical System and Adjust Type



    Use transaction Solar_Project_admin

    Add / Remove appropriate systems (in this example, adding Test System 104)


    Create New Task List



    View Task List




    Adjust TMS






    Dev -> QA


    QA -> Pre-Prod


    Pre-Prod -> Prod


    Final Product



    Double-click S71 and remove the Transport Layer, then add the Development System in the table below.








    Check Project

    Create new cycle



    Activate CTS Functionality




    Check the Landscape

    Create Task List






    Confirm Changes to Project Landscape in Task List:



    Manually catch up transports where required.






    Do you need to know by the time you enter the office if the jobs that were scheduled during the night ended without errors?

    Do you want to receive customized job reports on your mobile device?

    Would you like to be able to receive the result of a Job Management POWL query, for example the list of jobs in a 24h time window as shown below, by email?

    [Screenshot: POWL query in the Job Management Work Center]




    1. You have SAP Solution Manager Support Package 10 or higher installed.


    2. You have created the required POWL queries in the Job Management Work Center that either

    a) collect the job data from the SAP system directly, or

    b) collect the job data from an external job scheduler like SAP CPS by Redwood.

    The queries have to use dynamic time windows in order to retrieve, for example, the job runs of the past 24 hours.




    Once the queries are available, navigate in the Job Management Work Center to the Administration view and start the configuration of the POWL Query Result notification. As the name implies, this works for all kinds of POWL queries, not only for the queries defined in the Job Management Work Center.

    [Screenshot: Administration view, POWL Query Result configuration]

    In the configuration application you just select Add, then select the POWL query you want to receive (make sure that you have selected the correct type of query), and enter your business partner number.

    [Screenshot: new POWL query result configuration]

    After pressing OK you should see your configuration in the list of available configurations. Select your configuration and the assigned business partners will be displayed:

    [Screenshot: configuration with assigned business partner]

    Of course, it is possible to assign more than one business partner to a configuration, as you can see from this example:

    [Screenshot: configuration with multiple recipients]



    Now that the configuration is done, return to the Administration view in the Job Management Work Center and select Schedule Job in the section POWL Query Result Configuration.

    [Screenshot: Schedule Job in the Administration view]

    The job scheduling of Job Scheduling Management will be started, already prefilled with the report to be scheduled. Just enter your preferred start time and recurrence and you're done.

    [Screenshot: job scheduling dialog]




    What's next? Just wait for the job to run and you will receive an email containing the results of your POWL query, see the example below:

    [Screenshot: email containing the POWL query results]

    This is a follow-up post, and part 1 is available here.


    a) Open the Job Management Work Center: transaction SM_WORKCENTER, then select Job Management.


    b) Select Job Documentation and click on ‘All’, for example.

       Select the relevant job document and click on Job Documentation -> Edit.

       You can create a new job document as well.

    [Screenshot: job documentation list]

    c) In the new window that opens, click on the Systems tab.

    [Screenshot: Systems tab of the job document]


    d) The new Job Monitoring configuration UI should be launched; based on the selection it may launch the old BPMon monitoring configuration instead.

    However, the current document only deals with MAI-based monitoring.



    e) The Add Jobs button is disabled if at least one monitored object corresponding to the job document is already found in MAI.

    When you press “Add Jobs From Managed System”, the filter values in the popup come prefilled with the job information provided in the job document.

    [Screenshot: Add Jobs from Managed System popup]



    When you press “Get jobs from ABAP System”, the row in the lower table is selected and added.

    In the lower table you can either create a new monitored object or use an existing one.

    The JobDocu ID assigned to the monitored object is the Job Documentation ID of the calling job document.

    When you press “Add Jobs from External Scheduler”, the filter values are also prefilled.

    Select the filter values and press Find; multiple rows are displayed.



    g) After the job is added, you can see that the Add Jobs button is disabled.


    [Screenshot: job visible as monitored object]


    Thank you for going through my blog.

