
 

Continuing our Meet the ASUG Speaker series at SAP TechEd && d-code, I am pleased to introduce Heiko Zuerker, who is speaking at ASUG SAP TechEd && d-code session ITM 120, How to be Successful with Run SAP Like a Factory.

Zuerker_Heiko_SAPTeched12_Head_Small.png


About Heiko:

(pictured to the right - photo supplied by Heiko)

 

Heiko Zuerker is an IT Manager at Rockwell Automation. Born and raised in Germany, he moved to the United States in 2001. Heiko has over 20 years of IT experience, including various roles in desktop and server support, security, and SAP Basis.

More recently, he has been focusing on SAP continuous improvement and has been a pioneer in implementing “Run SAP Like a Factory.”

 

If he’s not working late at the office, you will find him either presenting at SAP TechEd, diving with sharks, or crawling through Lake Michigan shipwrecks.

 

 

This year's presentation is very special to him, since it marks his fifth anniversary of presenting at SAP TechEd/d-code through ASUG.

 

About his session:

 

Here is the abstract from the session listing:

 

Come and learn from Rockwell Automation's 2 1/2 years of "Run SAP Like a Factory" experience. Learn how they plan and implement its phases, how they have set up and run their Operations Control Center (OCC), the challenges they have faced and are still facing, and how they continuously improve. Hear also how Run SAP Like a Factory has transformed their SAP support.

 

 

_______________________________________________________________________________________________________________

 

Join ASUG at SAP TechEd && d-code

OCTOBER 20-24
Venetian/Palazzo Congress Center


ASUG SAP d-code Las Vegas sessions are now published - for a complete listing please see here


Save the date Monday, October 20th for ASUG SAP TechEd d-code Pre-conference Day


Related:

Meet ASUG SAP d-code Speaker Charles Reeves - Implementing Enterprise Master Data Management

ASUG SAP d-code SAP BW 7.4 powered by SAP HANA Speaker - Introducing Pawel Mierski

ASUG SAP d-code Sessions Are Published - Featuring SAP Mentors

Journey to Mobile BI - Meet ASUG SAP d-code Speaker Peter Chen

Did you know?

Meet ASUG SAP TechEd d-code Speaker Kumar Chidambaram - Holistic BI BW on HANA Approach


General Information

The information in this blog entry is generally based on SAP Solution Manager 7.1 SP10.  Different versions of SAP Solution Manager may vary in specific steps in the SOLMAN_SETUP transaction.

SOLMAN_SETUP was introduced to SAP Solution Manager with version 7.0 Enhancement Package 1 (commonly referred to as SAP Solution Manager 7.01 or SAP Solution Manager 7.0 SP18) to automate the configuration of the SAP Solution Manager system to a unified baseline.  This baseline generally covers the basic Root Cause Analysis (RCA) scenario.

Depending on the support stack level of the SAP Solution Manager system, the available views in SOLMAN_SETUP may differ, but the main components that make up the baseline configuration (System Preparation, Basic Configuration and Managed System Configuration) are present in all versions.

The following screen is based on a 7.1 SP10 system:

1.png

 

The overview screen is very useful, as it highlights any activity that is part of the System Preparation or Basic Configuration scenarios but which does not have a completed (=green) rating.

It is generally recommended to check the overview regularly, as the step concerning the implementation of the SAP Solution Manager central note (System Preparation -> 4 – Implement SAP Note) may need to be updated with the latest version of the Central Note.  The system regularly checks with the SAP backbone for the latest version of the Central Note and compares it to the version registered in the Solution Manager system (transaction SNOTE).  If the versions do not match, the activity check is re-rated yellow, and this will then be visible in the SOLMAN_SETUP overview.

2.png

 

What is required for the baseline system configuration?

Due to the interaction of the SAP Solution Manager system with the SAP backbone, the RFC connection SAPOSS must be established and working properly.  This can be tested in transaction SM59:

3.png

and adjusted if necessary in transaction OSS1:

4.png

5.png
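
If you prefer to verify the connection programmatically (for example in a small check report), a hedged sketch along the following lines pings the SAPOSS destination using the standard RFC_PING function module; error handling is kept to a minimum and should be extended for real use:

REPORT z_check_saposs.

DATA lv_msg TYPE c LENGTH 255.

" Ping the SAPOSS destination; communication and system failures are caught below.
CALL FUNCTION 'RFC_PING' DESTINATION 'SAPOSS'
  EXCEPTIONS
    system_failure        = 1 MESSAGE lv_msg
    communication_failure = 2 MESSAGE lv_msg
    OTHERS                = 3.

IF sy-subrc = 0.
  WRITE: / 'SAPOSS connection OK.'.
ELSE.
  WRITE: / 'SAPOSS connection failed:', lv_msg.
ENDIF.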

 

What SAP assistance is available for SOLMAN_SETUP?

SAP provides two main forms of interactive assistance for SOLMAN_SETUP.

The first is the interactive Expert Guided Implementation (EGI) Training.  It is delivered as a 5-day empowering session, with an SAP expert providing guidance and demonstration in the mornings, followed by application of the lesson by the attendee on their own systems afterwards.

https://service.sap.com/~sapidb/011000358700001779602008E/EGI-IntroPage-2011-3-7.htm

The second is the Solution Manager Starter Pack (SMSP) service.  This is a 5-day SAP service, which can be delivered either onsite or remotely, and includes configuration of the Solution Manager system as well as up to 3 managed systems.

https://websmp105.sap-ag.de/~sapdownload/011000358700000806252007E

 

What is the sequence of steps required for the baseline configuration of the SAP Solution Manager system?

System preparation:

6.png

Basic Configuration:

7.png

8.png

Managed System Configuration (Dual stack scenario example):

9.png

10.png

 

Technical Specifics

Where is the most time lost during the SOLMAN_SETUP configuration of the SAP Solution Manager system?

This section deals with some of the more time-intensive situations that I have come across in my experience setting up SAP Solution Manager systems.

Once again, these instances are specifically based on SAP Solution Manager 7.1 SP10; they have also been observed in other versions, although some details may vary.

 

Parameters/Prerequisites

The basic configuration of the SAP Solution Manager system is dependent on various prerequisites, which are listed in the following SAP Notes:

SAP Note 1483508   Solution Manager 7.1: Root Cause Analysis pre-requisites

SAP Note 1478974   Diagnostics in SAP Solution Manager 7.1

SAP Note 1843689   Solution Manager 7.1 SP Stack 10: recommended corrections                           

     (see SAP Note 1595736 for older Support Package Stacks)

SAP Note 1582842   Profile parameters for Solution Manager 7.1

SAP Note 1701770   LMDB CIM Model and CR Content prerequisite

The Parameters are checked as part of SOLMAN_SETUP System preparation step 2 – Check Installation.

11.png

If the parameters do not match the list, they will need to be adjusted before proceeding.

In the case of SAP Note 1843689 – Solution Manager SP Stack 10:  recommended corrections, patches for certain Java stack components are required, which are designed to fix known issues with the configuration procedures.

Similarly, the managed system will need to meet certain software requirements, as listed in SAP Note 1483508 - Solution Manager 7.1: Root Cause Analysis pre-requisites.  Without these prerequisites in place, some of the automated managed system configuration items may not succeed.

 

SLD infrastructure

Prior to the introduction of the LMDB in SAP Solution Manager 7.1, it was essential to have a single, central SLD connected to Solution Manager.  Therefore, if the customer's system landscape did not contain at least one functional SLD, a fallback was needed.  To that end, a local SLD was included in the SAP Solution Manager stack, to act as this fallback SLD.

Likewise, if there were multiple functional SLDs present, one SLD would need to act as a central, or consolidation, SLD.  If this was not possible with the existing SLDs, the Solution Manager local SLD could fill that role.

With the advent of the LMDB in SAP Solution Manager 7.1, and the new ability to connect multiple SLDs to the LMDB at the same time, this requirement is no longer in effect.  The local SLD is still available as a fallback option, but in case of multiple functional SLDs, the general recommendation is to connect them to the Solution Manager system directly.

Sample SLD infrastructure:

12.png

 

 

LMDB initial Synchronization

Introduced with SAP Solution Manager 7.1, the LMDB represents the new central system data repository, gradually replacing SMSY as the basis for Solution Manager functionality.

The SLD – LMDB synchronization is set up in the System Preparation step 6.2 – Set Up LMDB:

13.png

First, the LMDB Object Server will need to be configured (1), then the synchronization connection can be established (2).

Once the LMDB – SLD connection has been established and initial synchronization started, you will see the status in process:

14.png

Note: The process will take several hours. Please refer to SAP Notes 1555955, 1594654 and 1669645 for information regarding the performance during SLD content synchronization.

You can follow the progress of this process by monitoring the corresponding background job in SM37.  (job SAP_LMDB_LDB*)

15.png

Note that the synchronization job does not necessarily stop on errors.  Instead, if the job encounters an error situation, it does not stop; it simply resets the worklist of synchronization items.  While the program is ‘intelligent’ enough to realize that the first items (up to the point of failure) have already been synchronized and do not need to be synchronized again, checking these items again still takes time.  Once the initial point of failure is reached again (the batch of synchronization items for which the program timed out initially), another timeout is likely to occur at the same point, creating an endless loop.

Therefore, periodic checks of the job log, especially for long-running initial synchronization jobs, are highly recommended.  Likewise, periodic checks with transaction ST22 for related system dumps can lead to early detection of errors.

Individual steps of this job are governed by the profile parameters icm/conn_timeout and icm/keep_alive_timeout, which can cause the longer steps to time out (errors ICM_HTTP_TIMEOUT in ST22), resetting the job.  Thus, it is imperative that these parameters are increased from their defaults to accommodate the processing time.  In particular, I have found that the default value of 60 seconds for icm/keep_alive_timeout is not enough and should be increased to at least 300 seconds.
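
For orientation, the corresponding instance profile entries (maintained, for example, via transaction RZ10) could look like the lines below. The value for icm/conn_timeout is only an assumption and should be sized to your own synchronization runtimes; note that it is specified in milliseconds, while icm/keep_alive_timeout is specified in seconds:

# Instance profile - illustrative values only
icm/keep_alive_timeout = 300       # seconds; the default of 60 is usually too low for the LMDB sync
icm/conn_timeout       = 300000    # assumption: 300 seconds, specified in milliseconds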

 

BW System Setup

Since the introduction of Solution Manager Diagnostics (SMD, sometimes also called Root Cause Analysis or RCA) to SAP Solution Manager in release 3.2, SAP Solution Manager has contained a BW system to handle the bulk of the RCA data.  In the past, this BW system required a lot of manual configuration.  With the advent of SOLMAN_SETUP, however, much of the setup has been automated.  Still, there are certain caveats to consider when configuring the SAP Solution Manager system, and knowing the various pitfalls of the BW system configuration can save significant troubleshooting time.

First, there are the various BW system options available:

    • BW in the productive SAP Solution Manager client
      In this scenario BW is used in the same productive client as SAP Solution Manager. This option allows simpler configuration and isolates the BW activities conducted for solution life cycle management from the data on a production BW instance. This is SAP’s recommendation.
    • BW in a separate client of the SAP Solution Manager system
      In this scenario BW activities are conducted in a separate client on the SAP Solution Manager system. This scenario provides increased security, as user access is more restricted. However, you must maintain users separately, which increases your administration effort. There is no technical benefit.
    • BW in a separate, non-productive BW system
      In this scenario, the BW activities are conducted in a separate, non-productive BW system in the landscape. Data is sent to this system from the SAP Solution Manager system via RFC. This is only needed in rare cases, for sizing purposes.

16.png

Note:  Use of a separate, productive BW system is not recommended.

Selection of the BW scenario to be used is made in Basic Configuration step 2.1 – Specify SAP BW System:

17.png

18.png

With setup credentials provided as part of step 2.2 – Set up Credentials:

19.png

The BW Administrator section will be active if the second BW system option (BW in different client/system) is selected.

Actual configuration of the BW system is completed in the Basic Configuration section 5 – Configure Automatically:

20.png

21.png

22.png

Note that the following activities must be executed manually, before continuing with the automatic configuration steps (see SAP Note 979581 for details):

    • Activate BW source system
    • Adjust BW authorization concept (if listed)

The BW system is activated via the background job BI_TCO_ACTIVATION, which must be completed before proceeding.

Note that the activity log in SOLMAN_SETUP for this activity does not track job completion, only the successful launch of the job.

After the background job BI_TCO_ACTIVATION finishes, check the BW activation status in transaction RSTCO_ADMIN. If you get a red rating, press the Start Installation button again:

23.png

Once the BW Source system has been fully activated, the remaining activities can be performed automatically by selecting the Execute All button:

24.png

Note:  Full execution of the automatic activities may take multiple hours.

Note:  Multiple instances of the CCMS_BI_SETUP job (such as with the activity Activate BW Content for RCA) may cause warnings and require manual subsequent executions of these failed activities.  This is due to the fact that, as with the BI_TCO_ACTIVATION job activity, SOLMAN_SETUP only verifies that the background job was successfully launched.  It does not wait for completion of that job.

Thus, when subsequent activities attempt to start another instance of the CCMS_BI_SETUP job with different scenario variants (i.e. setting up different portions of the BW system), any still running copies of CCMS_BI_SETUP will prevent new iterations and a warning will occur.
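
A quick way to check whether an instance of CCMS_BI_SETUP is still active before re-executing a failed activity is transaction SM37. As a hedged alternative, a small ABAP snippet such as the following reads the standard batch job status table directly (status 'R' = running):

REPORT z_check_ccms_bi_setup.

DATA lv_running TYPE i.

" Count active instances of the BW content setup job.
SELECT COUNT(*) FROM tbtco
  INTO lv_running
  WHERE jobname = 'CCMS_BI_SETUP'
    AND status  = 'R'.

IF lv_running > 0.
  WRITE: / 'CCMS_BI_SETUP is still running - wait before re-executing the activity.'.
ELSE.
  WRITE: / 'No active CCMS_BI_SETUP job found - the activity can be re-executed.'.
ENDIF.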

 

Hopefully, this blog post has provided some information on how to prevent, detect and/or remedy some of the more egregious time-wasting situations in the baseline configuration of the SAP Solution Manager system.

More than 5 years ago we wrote a blog on how to best monitor a discrete manufacturing process. Back then, key figures like "Production/Process Orders overdue for Release" and "Production/Process Orders overdue for Technical Closure" were described. Some other key figures that describe the process steps in between were left out at that time, such as:
    

  • Production/Process Orders Released without first Confirmation
  • Production/Process Orders overdue for Final Confirmation
  • Production/Process Orders overdue for Delivery Completed

     

    All of these key figures were optimized from a technical runtime perspective, i.e. they read only from tables AUFK, AFKO and AFPO and do not select data from the status table JEST, which is typically huge (a hedged ABAP sketch of one such selection follows the key figure definitions below).

    Customers are usually so used to looking at the system status (e.g. CRTD, REL, PCNF, CNF, PDLV, DLV) of a production/process order that they have difficulties understanding what the above-mentioned key figures are measuring. This blog helps to bridge this gap and explains how the key figures in Business Process Analytics map to the system status of a production/process order.

    Mapping Business Process Analytics key figures to system status in order

     

     

    I will explain the key figures in the logical order of a discrete manufacturing process

      1. Production/Process Orders overdue for Release
        • measures those orders which have been created (status CRTD) and where the scheduled release date already lies x days in the past, but where no actual release took place (i.e. status REL not yet reached)
      2. Production/Process Orders Released without first Confirmation
        • measures those orders which have been released (status REL) and where the actual release date lies x days in the past, but where no initial confirmation took place (i.e. status PCNF not yet reached)
      3. Production/Process Orders overdue for Final Confirmation
        • measures those orders which have been released (status REL) and where the scheduled end date already lies x days in the past, but where no final confirmation took place (i.e. status CNF not yet reached)
      4. Production/Process Orders overdue for Delivery Completed
        • measures those orders which have been released and at least initially confirmed (status REL & PCNF) and where the scheduled end date already lies x days in the past, but where no delivery completed flag was set (i.e. status DLV not yet reached)
      5. Production/Process Orders overdue for Technical Closure
        • measures those orders which have been released, initially confirmed and are complete regarding delivery (status REL & PCNF & DLV) and where the scheduled end date already lies x days in the past but where no technical completion took place (i.e. status TECO not yet reached)
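
    To make this concrete, here is a minimal, hedged ABAP sketch of the first key figure ('overdue for Release'). The field names AFKO-FTRMS (scheduled release date) and AFKO-FTRMI (actual release date) are assumptions based on the standard order header tables, and the real Business Process Analytics data collectors are considerably more elaborate than this selection:

    REPORT z_orders_overdue_for_release.

    " Orders whose scheduled release date lies at least p_days in the past
    " and which have no actual release date yet - no read of status table JEST.
    PARAMETERS p_days TYPE i DEFAULT 3.

    TYPES: BEGIN OF ty_order,
             aufnr TYPE afko-aufnr,
             ftrms TYPE afko-ftrms,
           END OF ty_order.

    DATA: lt_overdue TYPE STANDARD TABLE OF ty_order,
          lv_cutoff  TYPE sy-datum,
          lv_count   TYPE i.

    lv_cutoff = sy-datum - p_days.

    SELECT k~aufnr k~ftrms
      FROM afko AS k
      INNER JOIN aufk AS a ON a~aufnr = k~aufnr
      INTO TABLE lt_overdue
      WHERE k~ftrms <= lv_cutoff      " scheduled release already x days in the past
        AND k~ftrmi = '00000000'.     " no actual release -> status REL not yet reached

    DESCRIBE TABLE lt_overdue LINES lv_count.
    WRITE: / 'Orders overdue for release:', lv_count.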

     

     

    Let's summarize it in a short table:

     

    Key figure                                                       Active status reached   Possible further active status   Waiting for status
    Production/Process Orders overdue for Release                    CRTD                    -                                REL
    Production/Process Orders Released without first Confirmation    REL                     -                                PCNF
    Production/Process Orders overdue for Final Confirmation         REL                     PCNF                             CNF
    Production/Process Orders overdue for Delivery Completed         REL & PCNF              CNF or PDLV                      DLV
    Production/Process Orders overdue for Technical Closure          REL & PCNF & DLV        CNF                              TECO

     

     

    Further reading

     

    Frequently Asked Questions about Business Process Monitoring and Business Process Analytics are answered under http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Monitoring and

    http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Analytics respectively.

     

    The following blogs (in chronological order) provide further details about Business Process Analytics and Business Process Monitoring functionalities within the SAP Solution Manager.

    Is there a smart and fast impact analysis tool for SAP applications? The answer is BPCA, i.e. the Business Process Change Analyzer.

     

    How many of us are aware of this smart tool? We have implemented it in our project, and it works wonderfully.

     

    In SAP, change is constant: it may come from support package upgrades, custom releases, etc. So it is important to assess the impact of each change.

    BPCA automatically answers the common questions that arise when a new change is moved to production:

    •          What is changing?
    •          What is the impact of the change?
    •          Which business processes are impacted and need to be tested as part of no-negative-impact testing?
    •          Can we get recommendations for regression tests?

     

    SAP-centric solutions are changed on a regular basis by SAP or by customers, which requires customers to test their business processes thoroughly. It can be difficult to identify the affected critical business processes, and this is where BPCA comes into the picture: it finds the list of all impacted business processes for a Solution Manager project.

     

    In simple terms, the process is: 1. Capture the impacted steps. 2. Validate the steps. 3. Mitigate the risk and impact by performing regression testing. 4. Confirm the results.

     

     

    pic1.jpg

     

    This blog will help in understanding the concept of Business Process Change Analyzer (BPCA).

     

    Pre-requisite: Basic knowledge of SAP Solution Manager Concepts.

     

    To answer the key questions about BPCA (what does the change impact analysis cover, how does it work, and what are the prerequisites to implement it?), let us go through the detailed steps.

     

    What will we cover in this blog?

     

    • Introduction: What is BPCA?
    • Technical Prerequisites for BPCA
    • BPCA Preparation: Identify and mark critical business processes
    • What are TBOMs and TBOM Generation Ways?
    • Results Interpretation with Change Impact Analysis.
    • Test Scope Optimization

     

    Let’s begin!!!

     

    1. Introduction: What is BPCA?

     

    In day-to-day operations, SAP solutions are changed either by SAP or by customers when there is a need for enhancement. In such scenarios, customers need to test their business processes thoroughly to ensure that a particular change does not have a negative impact on other, existing business processes. Test scope identification is the key activity that determines the time and effort required for testing.

    Before we start the actual testing it is important to differentiate between the types of SAP solution change.

     

    Types of SAP solution change could be:

    • Maintenance or Enhancement to SAP support packages
    • Configuration changes
    • Custom developments, etc.

     

    For these types of changes, the standard test management approach is depicted in the picture below:

             pic2.jpg

    Standard test management approach:

    •       Perform initial risk assessment on the effects of change on critical business processes.
    •       Based on impact analysis results, plan for testing.
    •       Once Test Planning is done, execute the test cases either manually or by automated test scripts.

     

    Throughout this process there is one major pain point: which business processes are affected by the planned change? Let me explain the situation with a project scenario:

     

    In any project, whenever a planned change needs to be moved to the production system, specific approvals must be obtained. If the change affects existing functionality, complete testing must be performed and business approval obtained. To get the change approved, it has to be presented to an approver board, which decides whether to approve or reject it. In such meetings, questions are asked about integration testing, business acceptance tests, and no-negative-impact testing. The change is approved based on the change owner's answers, but there is no hard evidence of the impact assessment and the testing that was performed.

     

    So major challenges faced in this process are:

    •         How to identify which business processes are impacted by this change?
    •         How to get the test recommendations?

     

    In order to address these pain points, SAP introduced a new type of analysis application called the "Business Process Change Analyzer", which is capable of performing change impact analysis.

    We can perform this analysis based on a transport request number; the result is the list of business process steps impacted by that particular change.

     

    2. Technical Prerequisites for BPCA:

     

    Let us see prerequisites for BPCA implementation:

     

    • Adequate business process repository in Solution Manager: The business process repository in Solution Manager should contain all business scenarios, business processes, and business process steps.
    • Business process steps with transactions or programs: Each business process step should be assigned the relevant ABAP object, e.g. a transaction code or report program.
    • Test cases defined per business process: Test cases should be defined for each business process.
    • Test cases uploaded to Solution Manager.
    • TBOM creation: TBOMs should be created for each business process step.

     

      3.  BPCA Preparation: Identify and mark critical business processes:

     

    BPCA uses an SAP Solution Manager project as the basis for the analysis and for structuring the results.

           1. First, a project needs to be created in Solution Manager using transaction SOLAR_PROJECT_ADMIN. Once project creation is complete, the business blueprint is configured via transaction SOLAR01. The business blueprint function enables you to design, document, and hierarchically catalogue all business processes into business scenarios, business processes, and business process steps.

     

    As shown in the picture below, the business blueprint comprises the following elements in a hierarchy:

     

    pic3.jpg

     

    2. Business processes must be defined in SAP Solution Manager within the project. Below is a sample project where material creation and material change are business processes.

      pic4.jpg

                                    Fig: Sample blueprint structure

     

    3. Managed systems, i.e. the system landscape where the particular transaction/program will be executed, must be assigned via a logical component, which connects SAP Solution Manager to that system landscape. A logical component is an administrative entity that groups logical systems across the entire system landscape and across projects. For example, if we want to test transaction MM01 in a specific system landscape, a logical component is used to connect SAP Solution Manager to that system; in short, it is the intermediate link between SAP Solution Manager and the managed system.

          4. After the transactions are assigned, make sure test cases are assigned to each business process or business process step.

          

              pic5.jpg

     

    • Identify and mark critical business processes: The criticality of a business process can be set in the business blueprint of the project. This setting helps to prioritize business processes during Test Scope Optimization and during the BPCA analysis itself.

            We can set the Business Process Priority in the Attributes tab of the business process; for this we need to configure a customer attribute.

          

            Let us see how to define the customer attribute. Below is the path where we can define the new customer attribute ‘Business Process Priority’.

     

            pic6.jpg

       

       To mark a business process as critical, we set its ‘Business Process Priority’ to 1.

       

            Let us see how to mark a critical business process. In the example below, Material Creation is a critical business process, so that process is selected in the blueprint.

       pic7.jpg

       Then go to the attributes of the business process and, under customer attributes, set Business Process Priority to ‘1’. ‘Material Creation’ is now marked as a critical business process.

       pic8.jpg

    4. What are TBOMs and TBOM Generation Ways?

     

    So far we have covered the first BPCA preparation step: setting up the business processes and marking the critical ones.

     

    Let us now look at the next step, i.e. TBOM generation.

     

    Users run the business process transaction so that BPCA can collect the technical objects used during its execution. This collection of technical objects is called a technical bill of materials, or TBOM. For example, suppose we want to create a TBOM for the ‘Material Creation’ business process step, which is executed with transaction MM01 in the ERP system. We execute the transaction from SAP Solution Manager, which connects to the ECC system, enter the required input parameters, and complete the creation of the material. At the same time, BPCA enables a trace in the ERP system and collects all the objects used, such as data dictionary objects, includes, function modules, etc. This list of objects becomes the TBOM for transaction MM01.

     

    There are three ways to generate TBOMs:

     

    •         Static TBOM
    •         Semi-Dynamic TBOM
    •         Dynamic TBOM

     

    Let us look at each method in detail.

     

    1. Static TBOMs: Static TBOMs are created by scanning the source code of a transaction or report program and statically recording the technical objects used in that report or transaction. They are restricted by the scan depth, i.e. how many call levels can be scanned. Static TBOMs are therefore likely to be inaccurate, since they may miss objects at deeper levels of the code, and they are not recommended for productive impact assessment.

     

    To create a static TBOM, select the business process step and go to its attributes. In the attributes, open the TBOM tab and click Create TBOM; in the popup, select Static and create the TBOM.

     

    Step-by-step TBOM creation: 1. Go to the business blueprint and select the process step for which the TBOM needs to be created. Once the step is selected, go to the Transactions tab and choose the Attributes button.

                            pic9.jpg

    2. Then go to the TBOM tab and click ‘Create TBOM’. Three options are offered: static, dynamic, and semi-dynamic. Select static with branching levels up to 5.


                 pic10.jpg

                 pic11.jpg

    3. Click ‘OK’; static TBOM generation starts and a popup confirms “TBOM Created”.

                             pic12.jpg  

    4.   After creation we can see the contents of TBOM:

                           pic13.jpg

    5. Below are screenshots of the TBOM contents and the list of technical objects. The chart shows the percentage breakdown by SAP component type, e.g. the share of ABAP objects. The TBOM example below for transaction MM01 contains 373 objects in total; note that different types of objects have been collected: program/code objects, table contents, data dictionary objects, business transactions, etc.

     

                           pic14.jpg

                           pic15.jpg

    2. Dynamic TBOMs: Creating a dynamic TBOM is similar to taking an ST05 trace; the object list is built by enabling a trace. To create a dynamic TBOM, the user executes the particular business process step or transaction, either manually or via an automated test, and the technical objects used during that execution are recorded. Dynamic TBOMs are more accurate than static TBOMs.

     

    Briefly, to create a dynamic TBOM: once you select the ‘Dynamic’ option, you are prompted to start recording. Click ‘Start Recording’ and complete the execution of the transaction on the managed system. Once done, stop the recording and the dynamic TBOM is created.

    After creation, the TBOM contents can be viewed.

                            pic16.jpg

                            pic17.jpg                       

    3. Semi-dynamic TBOMs: Semi-dynamic TBOMs are created using UPL (Usage and Procedure Logging) data from the production system. UPL is kernel-level logging. To create semi-dynamic TBOMs, Solution Manager must be on 7.1 SP11 or higher and the managed system must be on the required support package level (SP9 or above).

    Semi-dynamic TBOMs are created en masse using a background job in BPCA. They are the most accurate, as they are based on actual usage data from the production system.

     

    UPL data is collected at the OS/kernel level in the managed system. Two main jobs are executed in the production system to collect UPL data:

     

    •          Collector Job – This job runs every 45 minutes to collect the UPL logs.
    •          Daily Job – This job runs on daily basis to extract usage statistics. We can execute the report /SDF/SHOW_UPL to see UPL data sample.

    Let us see how to create semi-dynamic TBOMs:

     

    1. To create a semi-dynamic TBOM, execute transaction SOLMAN_WORKCENTER and go to “Administration”. In Administration, click on “TBOM Utilities”.

                           pic19.jpg

    2. Click on ‘Generation of Static and Semi-Dynamic TBOMs’ option.

                          pic20.jpg

    3. Select option ‘create semi-dynamic TBOM’ and enter the period for which UPL data needs to be fetched.

                          pic21.jpg

                            pic22.jpg                  

    4. Once this is done, schedule a job and set its start condition to ‘Immediate’.

    5. Once the job is finished, check its job log; it shows that the semi-dynamic TBOM creation is complete.

                           semi-dyn.png

    These are the three ways to create TBOMs: static, dynamic, and semi-dynamic.

     

     

    5. Results Interpretation with Change Impact Analysis:

    Let us see how to run a change impact analysis and how to interpret its results.

    1. Go to the Test Management work center by executing SOLMAN_WORKCENTER.

    2. Go to the BP Change Analyzer view. This view shows the list of previously executed analyses and also allows us to create a new analysis. Scroll down the screen to see the analyses that have already been run.

                               pic25.jpg

    3. Now let us see how to create a new analysis:

          1. In the initial screen, enter all the details required for the analysis, as shown in the screenshot below:

                                   pic26.jpg

          2. After all the details are entered, click the RUN button; this runs the impact analysis for that transport request.

          3. After execution, check the analysis; it lists all the affected business process steps:

                            pic27.jpg

    4. Above is the output of the impact assessment: it lists all business process steps impacted by that particular transport request.

     

    6. Test Scope Optimization:

    When BPCA analysis is performed for large changes such as support package upgrades, it returns a very large number of business process steps. This result is technically correct, as many objects are modified during such upgrades.

    In such scenarios, we can look at the impact analysis differently and optimize or reduce the test scope using different parameters.

     

    BPCA helps users optimize and reduce the test scope using the criteria below:


    • Test Object Coverage: Here BPCA uses the number of changed objects that impact each business process. For example, the ‘Material Creation’ process step might account for almost 40% of the impacted objects. This gives users a clear idea of which business process steps are most affected and how to optimize the test scope to test only those processes.


    • Test Effort: Another way to optimize the test scope is by test effort. In some cases, automated test cases are assigned to business process steps, which makes testing more efficient and easier. The test scope can therefore be optimized based on the manual effort required for testing.


    • Business Process Attributes: We have already seen how to mark critical business processes by setting the ‘Business Process Priority’ attribute. Marking critical processes in this way helps BPCA optimize the test scope.


    Below are samples of Test Scope Optimization:


                             pic28.jpg

                               pic29.jpg

    BPCA can also be integrated with HP Quality Center; there are a few prerequisites for this integration. I can share more details on the integration once we have implemented it in our project.


    I hope this blog gave helpful insight on BPCA implementation and its benefits.

    I have often seen people talking about interactive reporting, one of the interesting features in Solution Manager 7.1. Today I got some time to explore it in detail and found answers to questions such as what it is all about and how it differs from other reporting approaches. In this blog I would like to share what I learned, and I look forward to hearing about your experiences.

     

     

    What is interactive reporting in the Solution Manager context?

     

     

    As you know, Solution Manager has various reporting options; one of the most prominent and recently promoted is interactive reporting. The common use case for interactive reporting is performing ad hoc analyses in real time. These reports are best suited to scenarios with limited data volume, where a quick result matters more than the amount of information analyzed. Since Solution Manager ships with its own BI content, interactive reporting is a good fit.

     

     

    How does IR differ from other reporting?

     

     

    Technically, I did not see any difference: both use BI cubes, queries, and templates. But the scenarios are different. Moreover, the delivered reports always use queries generated in BW, and the data is retrieved from BW into CRM for reporting purposes. The difference we identified lies in the scenarios where each is used.

     

    When the dedicated or current client in the Solution Manager system is set up as the BI client, the best reporting option is interactive reporting, because not much processing is required and most of the reports are very basic.

     

    If you have a dedicated BI system, however, BI reporting is more efficient for large data volumes; it is more complex, with heavier processing, so in that case BI reporting is the better option.

     

    Another big difference I found is that you can create, edit, and display interactive reports in the ITSM web UI (CRM_UI). The reports retrieve data in real time, on demand, and a self-guided wizard assists us during report creation. You can then release these reports to certain users.

    For more information, see ITSM Analytics. This is not possible with BI reporting, where most reports are pre-existing; on the other hand, BI reporting offers more navigation options and conditions for drilling down to the base level.

     

    The good thing is that Solution Manager has use cases for both scenarios. For detailed analysis such as RCA we can use BI reporting, whereas for technical monitoring scenarios we can use interactive reporting.

     

    Below is the Technical Monitoring interactive reporting, which shows basic details with restricted navigation; this is helpful for everyone, including end users.

     

    t1.JPG

     

    Here is the RCA BI reporting, which offers detailed analysis that is very helpful for administrators.

     

    t1.JPG

     

    Below is the guided procedure for creating an interactive report in ITSM.

     

    t1.JPG

     

     

    Is it new to Solution Manager?

     

     

    No. From our cross-check, we identified that IT performance reporting, which has been in use since Solution Manager 7.0 EhP1, is based on the same concept. IT performance reporting has now been enhanced with new interactive reporting templates. In other words, IT performance reporting in Solution Manager 7.1 is the same as Technical Monitoring -> Interactive Reporting; both use the same web templates.

     

    In Solution Manager 7.0 IT performance reporting, the web template 0SMD_RS_NAVIGATION contains only a limited set of reports.

     

    t1.JPG

     

     

    The same web template has been enhanced in Solution Manager 7.1 SP10 with more than 50 reports, as shown below; it is now called "interactive reporting".

     

    t1.JPG

     

     

    Unique features of IR reporting

     

     

    If I compare IR reporting with BW reporting, BW reporting might come out with the higher rating. But the unique strengths of IR reporting are that it is efficient, accurate, and targeted at all kinds of users. You can display reports as tables and charts; the following chart types are available: column chart, line chart, pie chart, bar chart, and stacked column chart.

     

     

    t1.JPG

     

     

    You can use these reports to analyze data in many different ways, including a breakdown of individual documents. The report data is retrieved in real-time. You can export report data to Microsoft Excel and print reports.

     

     

    Technical settings

     

    This again varies from scenario to scenario; we need to activate the corresponding BI content. For interactive reporting in Technical Monitoring, you can do this via the Technical Monitoring guided procedure setup.

     

     

    t1.JPG

     

     

    For interactive reporting in ITSM, you need to manually add some of the configuration services; refer to https://websmp206.sap-ag.de/~sapidb/011000358700001194612012E

     

    And also SAP ITSM Analytics on SAP Solution Manager 7.1 - SAP IT Service Management on SAP Solution Manager - SCN Wiki


    Other technical demos for IR reporting


    There are lots of demos available on the Solution Manager RKT website; a few are listed below. Feel free to try them and share your use cases in the comments.


    https://websmp106.sap-ag.de/~sapidb/011000358700000665452012E/index.htm

     

    https://websmp106.sap-ag.de/~sapidb/011000358700000479372011E/index.htm


    The everyday duty of IS/IT organizations, especially SAP Competency Centers, is to deliver innovation and new features to the business on a regular basis, according to requirements, budget, and priorities. But it is also to keep the existing solutions in operational condition.

    SAP recommends delivering 2 major releases per year and minor releases on a monthly or weekly basis, depending on the maturity and stability of your solution. This is directly inspired by the vendor's own release strategy for Enhancement Packages and Support Package Stacks.

     

    SEA - Intro.jpg

     

    Like any technical or functional project, an EhP or SPS implementation project has to be budgeted, prepared, and planned accordingly to mitigate risks and delays, and above all to avoid potentially negative effects on live processes and solutions.

     

    SEA_2.jpg

     

    Scope & Effort Analyzer (SEA) is an innovative tool designed for people who have to manage maintenance events on their SAP systems and need a clear understanding of the change impact as well as the test scope and related effort. It has recently been shipped with SAP Solution Manager 7.1 SP11 (March 2014) and helps to predict the major cost and effort drivers of maintenance projects without the need to physically deploy any software packages. We highly recommend using it in the early planning phase of each and every software update project.

     

    The analysis results cover two parts:

    • Adaptation and development management: identification of affected custom code and modifications and of the required adjustments in the SAP system, since software updates come with changes to or deletions of SAP standard objects, plus a detailed effort estimation for custom code and modification adjustments.
    • Test management: identification of the required test scope, test planning, recommendations for the creation of missing test cases and the execution of manual tests, plus a detailed effort estimation for regression tests and recommendations based on test scope optimization.

     

    It relies on Usage and Procedure Logging (UPL), a new functionality available in any ABAP based system as of SAP NetWeaver 7.01 SP10 or equivalents. UPL is used to log all called and executed ABAP units (procedures) like programs, function modules down to classes, methods and subroutines or smart forms without any performance impact. UPL will give you 100% coverage of your solution usage. This also includes the detection of dynamically called ABAP elements. UPL is the technology to close existing gaps in the SAP workload statistics, which only captures static calls as opposed to static and dynamic calls.

    In other words, with the help of UPL it is now possible to calculate the impact on custom code, modifications, and business processes while taking real system usage into consideration.


    The Maintenance Optimizer scenario automatically calculates the update vector (that’s a detailed list of all technical support and/or enhancement packages to reach the target product version or stack). Scope and Effort Analyzer calculates all SAP ABAP objects which are either deleted, changed (updated version) or newly delivered with this software update. This ABAP object list or Bill of Material (BOM) is the central element to calculate the impact on your SAP system even without a physical installation of those packages.


    In addition, semi-dynamic TBOM generation and the automated generation of SAP-module-oriented blueprints are two further sources of value. These features identify the impact on your business processes and transaction codes, with the objective of outlining the test scope and providing recommendations on how to reduce the test effort with the help of the Test Scope Optimization (TSO) functionality. This is achieved through a program-based optimization across the number of changed objects per business process and the test effort of the associated test cases.

     

    SEA_3.jpg

     

    Testimonial and success story: the French customer Coliposte (Groupe La Poste) is one of the very first worldwide references for the use of SEA. With the help of our consultants, they implemented this new tool in a remarkably short time in the context of their EhP7 for SAP ERP adoption project.

    David Bizien, CIO for the Finance Department of Coliposte, explains in the following video how SEA helped him:

    • Forecasting the overall effort and budget for the EhP7 upgrade project
    • Focusing on the most critical impacts
    • Identifying the required skills and competencies for adaptations and developments
    • Booking and mobilizing the appropriate resources for testing
    • Taking into account the team planning constraints

     

                                     

     

    This interview was recorded at SAPPHIRE NOW 2014 in Orlando.

    If you wish to learn more about this secured (no code or sensitive information exposed outside your company) and very comprehensive (covers both SAP standard and customer-specific objects) but nevertheless free of charge tool (usage included as part of your SAP Enterprise Support contract), feel free to post questions here or to visit SAP Marketplace.

     

    To sum up, Scope & Effort Analyzer is the ultimate tool to secure your SAP maintenance and evolution projects!

    Concept: This blog shows you how to configure SLA assignment based on information in the incident. With standard customizing it is not possible to use different conditions, but this configuration allows you to combine several inputs to assign the SLA that you need.

     

     

    In Solution Manager you need to execute the transaction BRFPLUS

     

    In the menu, select “Create Application”.
    image001.png

    Define the Application name:

    image003.png

    Remember to SAVE and Activate the application.
    image006.png


    Create the New Elements:

    Right click on the application and select to create a new element..

    image007.png

    Create the element using “Bind to DDIC Element (Data Dictionary)”.

    image009.png

    Use the DDIC element CRMT_SRV_SERWI to bind the service profile.

    image011.png

    Remember to SAVE and Activate

     

    Execute the same steps for CRMT_SRV_ESCAL

    image015.png

    Now you will see the Data objects created and activated

    image017.png


    Create the Decision Table:

    Now you need to create the decision table: right-click on the application and select Expression -> Decision Table.

    image019.png

     

    Create and navigate to the table

    image021.png

    Now you need to define the columns.

    image023.png

    Insert a new column from the Context Data Objects.

    image026.png

    Select the application SOLMAN and add the fields that you want to use for the assignment; for this example we use PRIORITY and CATEGORY.

    image027.png

    Now you need to add the Result Columns

    image029.png

    In this case we use the data objects created previously (CRMT_SRV_SERWI and CRMT_SRV_ESCAL).

    image032.png

     

     

    Now you need to add entries to the table: select the Insert icon.

    image033.png

    In this example we add 2 entries with different combinations of priority and service profile.

    image036.png


    Create the Function:

    Now you need to add a function: right-click on the application and then select Function.

    image038.png

    Create the function and navigate to the object

    image039.png

    Now we add the data objects that will be used by the function: in the Signature tab, select Add Existing Data Objects.

    image042.png

     

    Add all the objects found in the SOLMAN application (this allows you to use different columns in your decision table).

    image044.png

     

    Now we need to define the result data object: select Create from the action button.

    image046.png

    Create a new Structure object to allow you to pass the 2 values

    image048.png

    In the new structure, click in Add Existing Data Objects

    image050.png

     

    Select the Application ZSLA and add the 2 new objects created in the previous steps.

    image052.png

    Remember to SAVE and ACTIVATE


    Create the Ruleset:

    Now it is time to create a new ruleset: select your function (blue pyramid icon), go to the Assigned Rulesets tab, and click “Create RuleSet”.

    image054.png

     

    Define the new name

    image056.png

    After creating the ruleset you need to create a new rule; to do that, click “Insert Rule” and then Create.

    image057.png

    Now write a description, and then under “Then -> Add” select the option “Process expression -> Select”.

    image060.png

     

    Now select the decision table created before

    image062.png

    Now you need to SAVE and Activate.


    Development Part

     

    To implement this part you need to ask your development team for support: you need to create a new PPF action (you can replicate the configuration of the action ZMIN_STD_FIND_PARTNER_FDT).

    Create a new implementation by copying the implementation ZAI_SDK_TEAM_DETERM, and use the function modules CRM_SERVICE_I_READ_OW and CRM_SERVICE_I_CHANGE_OW to update the incident.
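
    For orientation only, below is a hedged sketch of how the action implementation could evaluate the BRFplus function. The function GUID, the context names PRIORITY and CATEGORY, and the result components are assumptions derived from the objects created above, and the BRFplus calls (CL_FDT_FACTORY / IF_FDT_FUNCTION) are quoted from memory; verify the exact signatures in your release before using anything like this productively.

    " Hedged sketch - evaluate the SLA decision table from ABAP.
    DATA: lo_function    TYPE REF TO if_fdt_function,
          lo_context     TYPE REF TO if_fdt_context,
          lo_result      TYPE REF TO if_fdt_result,
          lv_function_id TYPE if_fdt_types=>id,  " GUID of the BRFplus function created above
          lv_priority    TYPE string,            " fill from the incident, e.g. via CRM_SERVICE_I_READ_OW
          lv_category    TYPE string.

    TYPES: BEGIN OF ty_sla,
             serwi TYPE crmt_srv_serwi,          " service profile
             escal TYPE crmt_srv_escal,          " response (escalation) profile
           END OF ty_sla.
    DATA ls_sla TYPE ty_sla.

    lo_function = cl_fdt_factory=>if_fdt_factory~get_instance( )->get_function( iv_id = lv_function_id ).
    lo_context  = lo_function->get_process_context( ).

    lo_context->set_value( iv_name = 'PRIORITY' ia_value = lv_priority ).
    lo_context->set_value( iv_name = 'CATEGORY' ia_value = lv_category ).

    lo_function->process( EXPORTING io_context = lo_context
                          IMPORTING eo_result  = lo_result ).
    lo_result->get_value( IMPORTING ea_value = ls_sla ).

    " ls_sla-serwi / ls_sla-escal can now be written back to the incident,
    " e.g. with CRM_SERVICE_I_CHANGE_OW as mentioned above.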

     

    Best Regards

    @WenSolman


    How to Add or Remove a System from a ChaRM Project

     

    In a scenario where you need to add a fourth system (e.g. Pre-Prod) to a logical component, or to remove such a system, please follow the steps below:

     

    In our example we have a Project: ERET_M

    Logical component for a system S71: SAP SOLUTION MANAGER [Solution Manager ABAP Stack]

    Clients:

    101 – Dev

    102 – QA

    103 – Pre Prod

    104 – Training

    105 – Prod

    Retrofit : RIG 501

     

    Basic steps:

     

    Close current cycle of maintenance project

    1.png

    2.png

     

     

     

    Save through the next phases until the cycle reaches the phase "Being Completed":

    8.png

    9.png

     

    Next, agree and proceed through the remaining steps:

     


    10.png

     

    12.png

    14.png


     

    Adjust Logical components as necessary

    From SMSY:

     

    16.png

    17.png

     

    Create a new Logical Component in LMDB

     
    18.png

     

     

    Click Display -> Change

     

    Green Check to Launch LMDB

     

    19.png


    Add Technical System and Adjust Type

    20.png

     

    Use transaction Solar_Project_admin


    Add / Remove appropriate systems (in this example, adding Test System 104)

    21.png


    Create New Task List

    22.png

     

    View Task List

    23.png

     

     

    Adjust TMS

    24.png

    25.png

    26.png

     

     

    Dev -> qa

    27.png

    Qa -> pre prod

    28.png

    Pre prod -> prod

    29.png

    Final Product

    30.png

     

    Double-click S71 and remove the transport layer, then add the development system in the table below.

    31.png

    Save

    32.png

     

     

     

     

    Check Project


    Create new cycle

    33.png

     

    Activate CTS Functionality

     

     

    34.png

    Check the Landscape

    Create Task List

    35.png

     

     

    36.png


    37.png

    Confirm Changes to Project Landscape in Task List:


     

     

    Manually catch up transports where required.

     

     

     

     

     

    Do you need to know by the time you enter the office if the jobs that were scheduled during the night ended without errors?

    Do you want to receive customized job reports on your mobile device?


    Would you like to be able to receive the result of a Job Management POWL query, for example the list of jobs in a 24h time window as shown below, by email?

    00 jsm_wc_smse_query.png

     

    Prerequisites:

     

    1. You have SAP Solution Manager Support Package 10 or higher installed

     

    2. You have to create the required POWL queries in the Job Management Work Center that either

    a) Collects the job data from the SAP system directly or

    b) Collects the job data from an external job scheduler like SAP CPS by Redwood

    The queries have to use dynamic time windows in order to get for example the job runs during the past 24 hours.

     


    Configuration:

     

    Once the queries are available, navigate in the Job Management Work Center to the Administration view and start the configuration of the POWL Query Result notification. As the name implies, this works for all kinds of POWL queries, not only for the queries defined in the Job Management Work Center.

    10 jsm_wc_admin - config.png

    In the configuration application you just select Add, then select the POWL query you want to receive (make sure that you have selected the correct type of query) and enter your business partner number.

    11 jsm_wc_admin_new_query_result.png


    After pressing OK you should see your configuration in the list of available configuration. Select your configuration and the assigned business partners will be displayed:

    12  jsm_wc_admin_new_query_result_2.png


    Of course it is possible to assign more than one business partner to a configuration as you can see from this example:

    13 jsm_wc_admin_new_query_many_recipients.png


    Automation

     

    Now that the configuration has been done return to the  Administration view in the Job Management Work Center and select Schedule Job in section POWL Query Result Configuration.

    20 jsm_wc_admin -schedule.png


    The job scheduling dialog of Job Scheduling Management starts, already prefilled with the report to be scheduled. Just enter your preferred start time and recurrence, and you're done.

    21 jsm_wc_schedule_email_job.png

     

    Result:

     

    What's next? Just wait for the job to run and you will receive an email containing the results of your POWL query, see the example below:

    30 POWL_email.png

    This is a follow-up post, and part 1 is available here.

     

    a) Open the Job Management work center: transaction SM_WORKCENTER, then select Job Management.

    JOB MANAGEMENT .jpg

    b) Select Job Documentation and click on ‘All’, for example.

       Select the relevant job document and click Job Documentation -> Edit.

       You can also create a new job document.

    job doc.jpg

    c) In the New Window that opens, click on Systems tab

    jobdoc systems.jpg

     

    d) The new job monitoring configuration UI should be launched; depending on the selection, it may launch the old BPMon monitoring configuration instead.

    The current document only deals with MAI-based monitoring.

    abap.jpg

     

    e) The Add Jobs button is disabled if at least one monitored object corresponding to the job document is already found in MAI.

    When you press “Add Jobs From Managed System”, the filter values in the popup come prefilled with the job information provided in the job document.

    add jobs managed.jpg

     

    f) When you press “Get jobs from ABAP System”, the row in the lower table is selected and added.

    In the lower table you can either create a new monitored object or use an existing one.

    The job documentation ID assigned to the monitored object is the ID of the calling job document.

    When you press “Add Jobs from External Scheduler”, the filter values are also prefilled. The user selects the filter values and presses Find, and multiple rows are displayed.

     

    warning.jpg

    g) After the job is added you can see that the add jobs button is disabled.

     

    job as mon object.jpg

     

    Thank you for going through my blog.

In continuation of the earlier blog (Part 1), we are going to explore the steps to configure and implement multiple SLAs for the same customer in Solution Manager 7.1 IT Service Management, for cases where a single SLA is not sufficient.


A practical example would be a company/customer that wants 24/7 support for SAP incidents and 12/5 support for non-SAP incidents such as hardware or IT asset incidents.


Please note: Before proceeding further, it is assumed that the Solution Manager system is already configured for Incident Management and that a custom transaction type such as ZMIN or YMIN is present in the system as per requirement. Please check the links below for setting this up.

    http://www.service.sap.com/~sapidb/011000358700000608542012E

    http://www.service.sap.com/~sapidb/011000358700000608872012E


In short, the above means the minimum configuration is a properly copied transaction type (ZMIN/YMIN) with the relevant status profile and other settings, and with the master data in place.


Further, in our example, let us assume the following two different service and response criteria for our two categories of incidents:

    • SAP Tickets – assuming 24/7 support
    • Non SAP Tickets – assuming 12/5 support
    • Customer name -  assuming XYZ Corp India


(PS: The above are just dummy/imaginary values; replace them with real values as per your requirement.)


For SAP tickets, the availability of support should be 24/7, i.e. 24 hours a day, 7 days a week. This means users can post a ticket at any time on any day, and the response should be provided as per the agreed criteria in the table below:


Customer  | Priority | IRT/MPT           | Deadline
XYZ India | 1        | First Response By | 1 Hour
          |          | To Do By          | 2 Hours
          | 2        | First Response By | 2 Hours
          |          | To Do By          | 4 Hours
          | 3        | First Response By | 8 Hours
          |          | To Do By          | 12 Hours
          | 4        | First Response By | 12 Hours
          |          | To Do By          | 16 Hours

Table 1: SLA for SAP Tickets with 24/7 support


Please note: The timestamps First Response By and To Do By measure the IRT (Initial Response Time) and MPT (Maximum Processing Time) respectively, which are defined in the system for setting up the deadlines.

For non-SAP tickets, the availability of support should be 12/5, i.e. 12 hours a day, 5 days a week. This means users can post a ticket at any time, but support will only be provided for 12 hours a day on 5 days of the week, excluding Saturday and Sunday.


The response or resolution for incidents should be provided as per the agreed criteria in the table below:


Customer  | Priority | IRT/MPT           | Deadline
XYZ India | 1        | First Response By | 3 Hours
          |          | To Do By          | 5 Hours
          | 2        | First Response By | 5 Hours
          |          | To Do By          | 9 Hours
          | 3        | First Response By | 7 Hours
          |          | To Do By          | 12 Hours
          | 4        | First Response By | 11 Hours
          |          | To Do By          | 16 Hours

Table 2: SLA for non SAP Tickets with 12/5 Support
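The practical difference between the two tables is not only the durations but also the service window in which the IRT/MPT clock runs. A minimal sketch of that arithmetic for the 12/5 case is shown below; it assumes a Monday to Friday, 08:00-20:00 window (the actual hours are whatever you maintain in the service profile) and ignores factory calendars and time zones, so it is an illustration, not Solution Manager's internal date calculation.

```python
from datetime import datetime, timedelta

# Table 2 durations in hours: priority -> (IRT, MPT),
# where IRT = First Response By and MPT = To Do By.
NON_SAP_SLA = {1: (3, 5), 2: (5, 9), 3: (7, 12), 4: (11, 16)}

def in_service_window(ts: datetime) -> bool:
    """Assumed 12/5 window: Monday to Friday, 08:00-20:00."""
    return ts.weekday() < 5 and 8 <= ts.hour < 20

def add_service_hours(start: datetime, hours: int) -> datetime:
    """Advance 'hours' whole service hours from 'start', skipping time
    outside the service window (start is assumed to be on a full hour)."""
    ts, remaining = start, hours
    while remaining > 0:
        if in_service_window(ts):
            remaining -= 1
        ts += timedelta(hours=1)
    return ts

# A priority-1 non-SAP ticket posted on Friday at 19:00 only gets one service
# hour before the weekend, so both deadlines fall on Monday.
irt_hours, mpt_hours = NON_SAP_SLA[1]
posted = datetime(2024, 5, 3, 19, 0)                          # a Friday evening
print("IRT deadline:", add_service_hours(posted, irt_hours))  # Monday 10:00
print("MPT deadline:", add_service_hours(posted, mpt_hours))  # Monday 12:00
```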

     

Let us start configuring the SLAs as per the requirements in Tables 1 and 2, using the following five main steps in this sequence:

    1. Creation or Configuration for Service and Response Profile
    2. Creation of Product and assignment of Service & Response Profile
3. Assignment of Service Products to Multi Level Categorization (MLC) & Sold-to Party
    4. Configuration of SLA Determination procedure in SPRO
    5. Assignment of SLA Determination Procedure to transaction type

     

    Step 1: Creation or Configuration for Service and Response Profile


First we need to create two different Service and Response Profiles in the system for the customer XYZ Corp India. Log on to the Solution Manager 7.1 system and enter transaction code "crmd_serv_sla", or follow the SPRO path below.


    SAP Solution Manager Implementation Guide->SAP Solution Manager->Capabilities (Optional)-> IT Service Management-> SLA Escalation->Edit Availability and Response Times.


    This will help us to configure Service and Response Profiles.

Click Service Profile as highlighted in the figure below, then click the New Entries button to create a new service profile.


     

The next step is to enter a code (say ZSP), provide a description, and then click the highlighted icon to maintain the service availability times as required for SAP tickets/incidents. We create this one for 24/7 support.


     

Clicking the highlighted icon lets you maintain the timelines or period. To maintain the service availability times as 24/7, proceed as shown below.

     

Note: I am assuming all days are working days here, but it is possible to define your own calendar (via transaction SCAL), which is a basic and easy task for any Solution Manager functional consultant.

     

Then click the copy button as highlighted above and save.


On returning to the screen in figure 2, click New Entries again and repeat the steps to create our second service profile with the 12/5 timeline for non-SAP incidents/tickets.

Once the service profiles are created, we can proceed with the response profile setup.

The next step is to configure the Response Profile in the same transaction (crmd_serv_sla), so click Response Profile and then the New Entries button.

     


Create a response profile ZRP and maintain a description. Select the Priority checkbox as shown.

     


Now click Indicators for Response Times on the left side of the above screen; this changes the right part of the screen, where we can maintain the priority values available in the system (press F4 for the value help, as shown). Map all the priority values as required.

     


Once added as in figure 8, all the above priorities will be attached to the response profile as shown below.

     


Next, for each priority we need to assign the relevant First Response By and To Do By parameters as per our Table 1.

    Therefore choose Priority “1” as shown below and then click on Response Times on the left side of the screen.

     


Double-clicking the Response Times node on the left side of the figure opens a subscreen on the right, where you maintain the values for the two durations listed below; the resulting mapping is summarized after the list.

    • SRV_RF_DURA – for IRT
    • SRV_RR_DURA – for MPT
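For orientation, once all four priorities have been maintained, the ZRP response profile effectively encodes the Table 1 values as follows (illustrative notation only, not a system table; ZRP2 is maintained analogously with the Table 2 values):

```python
# ZRP (24/7 profile, Table 1): per priority, SRV_RF_DURA is the First Response
# By duration (IRT) and SRV_RR_DURA is the To Do By duration (MPT), in hours.
ZRP_RESPONSE_TIMES = {
    1: {"SRV_RF_DURA": 1,  "SRV_RR_DURA": 2},
    2: {"SRV_RF_DURA": 2,  "SRV_RR_DURA": 4},
    3: {"SRV_RF_DURA": 8,  "SRV_RR_DURA": 12},
    4: {"SRV_RF_DURA": 12, "SRV_RR_DURA": 16},
}
```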

     

     

Similarly, proceed for priorities 2, 3, and 4 by repeating the above steps and maintaining the durations.

     

Now we create another response profile, ZRP2, and maintain the First Response By and To Do By parameters as per Table 2 by repeating all the above steps.


    Step 2: Creation of Product and assignment of Service & Response Profile


Based on the example requirement for the customer/company XYZ Corp India, we will now create two service products, for SAP and non-SAP incidents respectively. To create a service product, access transaction commpr01, which opens the screen below.

First, hit the search button in the Find area.


A default SAP service product "Investigation" will be shown, which is created via the SOLMAN_SETUP ITSM wizard. Double-click to choose it and enter change mode by clicking the pencil (edit) icon. Then assign the Service and Response Profile, which will initially be empty.


    Save it.


Similarly, we copy the above product using the copy button to a new product named Hotline and assign the 12/5 Service and Response Profile as shown below.


     


Thus, we are now ready to assign them in the multilevel categorization, which is step 3 as mentioned earlier in the blog.


We will continue with the remaining steps in another blog, which is available now (Part 3).


With Solution Manager 7.1 SP12 Job Monitoring, an integration of Job Monitoring with Job Documentation has been achieved. Currently the integration only covers creation: a monitored object created via Job Monitoring creates a job documentation with the proper information, and vice versa; there is no further synchronization between them afterwards.

     

    The following snapshots are used to demonstrate the  integration:

     

a) Launch the Job Monitoring work center under SOLMAN_SETUP.

     

    workcenter.jpg

     

     

    b) Goto "Define Scope" Step, Step 4 and select the scenario.

    define scope.jpg

     

     

c) Select the scenario and navigate to step 5.1.

     Select "Add Jobs" and choose From Managed System from the menu.

A popup opens; search for a job and select it.

    add jjobs.jpg

     

d) Press Add Selected Job as Monitored Object. You can see in the image below that the Job Documentation ID field is empty.

    no jobdoc.jpg

     

e) Press Save. This creates a new job documentation. Internally, the system looks for an existing job documentation with the same parameters as the job; if one exists and is not linked to any other monitored object, it is assigned, otherwise a new one is created.

    jobdoc created.jpg
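The lookup-or-reuse rule described in step e) can be summarized as follows; this is a sketch of the rule as stated above, with assumed field names, not the actual Solution Manager implementation:

```python
from typing import Dict, List

def find_or_create_job_documentation(job_params: Dict, existing_docs: List[Dict]) -> Dict:
    """Reuse an existing job documentation with the same parameters if it is
    not yet linked to another monitored object; otherwise create a new one."""
    for doc in existing_docs:
        if doc["parameters"] == job_params and doc.get("monitored_object_id") is None:
            return doc  # matching, unlinked job documentation -> assign it
    new_doc = {"parameters": job_params, "monitored_object_id": None}
    existing_docs.append(new_doc)
    return new_doc
```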

     

    f) Click on the "Show Job Documentation" link to launch the newly created job document

     

    Lauch Jobdoc.jpg

     

Hope you liked my blog. Thank you. Part 2, available on this link, will cover how it is integrated from the Job Scheduling Management work center side.

    Hello forum,

     

Today I saw a mail from an "SAP Consultant" asking us why the SAP Solution Manager key generation doesn't work in SMSY...

     

Well, as indicated in the following SAP Note, the Solution Manager key is no longer provided through transaction SMSY; you have to generate it in the LMDB, as described in the note:

     

    1888840 - How to generate installation keys in Solution Manager 7.1


You can see in the following pictures how to access that LMDB functionality:


     

LMDB start screen

    cap1.png

     

    cap2.png


To do that you don't need to finish the managed system setup; you can select any system in the LMDB technical systems selection and then click Generate Installation Key. On the next screen, just fill in the fields with the correct information to generate the key.

     

I checked SAP Notes to see if there is any new update about that, and found the following one interesting:

     

    811923 - SAP Solution Manager Key


    "...The SAP Solution Manager Key is no longer required..."


I don't install SAP software myself, but it's curious to see people still asking for the Solution Manager key when there is a note where SAP indicates that the key is no longer required :-O

     

Anyone have more information about that last SAP note?

     

    Best Regards,

    Lluis



As you all know, removing a managed system from Solution Manager can be a tedious, time-consuming process. With SP11, SAP has provided a guided procedure for removing (decommissioning) a system from Solution Manager 7.1. The steps below show you how to access the guided procedure and give a high-level preview of the steps provided.

     

SAP has also added steps for decommissioning to the latest Solution Manager 7.1 SP11 Operations Guide. You will need access to the SAP Support Portal to access the document. Here is the link to the guide and a screenshot of the location:  https://websmp109.sap-ag.de/~sapidb/011000358700000631992012E.PDF

     

    1.jpg 

     

    1. You need the following composite role to run guided procedures

         a. IT Operator  composite role = SAP_TASK_INBOX_ALL_COMP

    2. Enter Transaction SOLMAN_WORKCENTER in SAPGUI

3. Select the Technical Administration work center

    • Select Guided Procedure Management
    • Select the Managed System
    • Select the Guided Procedure browser drop down
    • Select either Start new window or Start Embedded

     

      2.jpg

    4. Select the Decommissioning Procedure and select Execute

      3.jpg

    5. Select Edit

6. Before continuing to the next step, you must understand that by decommissioning the system you will lose the managed system's data in Solution Manager. Keep in mind that not all of the steps may be required.

    7. Set the execution status to performed and select next to move on to step 2.

      4.jpg

    8. Step 2 – Application Clean-up has 2 automatic steps and a number of manual steps.

         a. Automatic activities are completed by selecting execute all

        • Remove Technical Monitoring Templates via SOLMAN_SETUP
        • Delete Session data, Reports, Early Watch Reports, DVM, Service Level Reporting with report “RDSMOPREDUCEDATA”

     b. Complete the manual activities by reading the documentation for each step and then selecting Start Web Dynpro to navigate to the location where the steps need to be completed.

     c. The manual activities are all about removing the scheduled jobs and the monitoring configured in Solution Manager.

         d. When all manual activities are complete set the Execution Status to Performed and select next to move on to the next step.

      5.jpg

    9. Step 3 – Cross Application Clean-Up

     a. This step has you delete the extractors, RFCs, and transport routes, and uninstall the Diagnostics Agent from the managed system.

     b. This step and the following steps contain only manual activities. Read the documentation and complete them as needed.

    6.jpg

    10.  Step 4 - Planning Projects and Solutions Clean-up

     a. These manual activities remove solutions and delete logical components from the LMDB and SMSY.

      7.jpg

    11.  Step 5 – Software Life Cycle Management Clean-up

     a. This step has you remove further product systems and system data from Solution Manager.

      8.jpg

    12.  Step 6 – Landscape Management Clean-up

         a. This step has you remove the remaining system data from LMDB and the SLD.

      9.jpg

    13.  The final step is just an overview displaying the status of the other steps.

      10.jpg

14.  That is it; once complete, all data about the managed system is removed from Solution Manager.

    Hello all,

     

It's been some time since we introduced the KBA methodology at SAP. I like the idea of being able to release documents for you as quickly as you need them, but we don't get a lot of feedback about them.

     

    I would like to know if you have any issues following a Maintenance Optimizer KBA, because I want to make them as good as I can.

     

    So, don't be shy, and let me know if you have comments about any SV-SMG-MAI KBAs.

     

    I am looking forward to hearing from you!
