
SAP Solution Manager


As part of my blog series on SAP Solution Manager 7.1 features and functions, I decided to write this blog post to give a high-level overview of the steps needed to set up Business Process Monitoring in SAP Solution Manager 7.1 (SP12 was used here). My intention is not to detail everything in this blog post, but to show which high-level configuration steps are needed.


For a detailed setup overview, check out the extensive presentation (the 81-page BPM setup roadmap) that you can find in the media library on https://service.sap.com/bpm, or the presentation material on business process operations - business process monitoring on https://service.sap.com/rkt-solman.


In a previous blog post on getting started with SAP Solution Manager I briefly explained the initial steps. As a prerequisite, you should have performed the system preparation, the basic configuration of SAP Solution Manager, and the managed system setup for the SAP systems involved in the monitoring scenario.


Quick configuration steps cheat chart


First step: preparation for the Business Process Monitoring scenario through SAP Solution Manager: Configuration workcenter


You start with the guided procedure for Business Process Monitoring, which you can access via the menu in the Solution Manager Configuration workcenter (or transaction SOLMAN_SETUP, which leads to the same place).



This guided procedure will help you to get the prerequisites in place to set up a Business Process Monitoring scenario.


During this guided procedure, prerequisite notes are checked (via RTCCTOOL), and the detail log (the Show link) will show you which SAP notes are missing, to ensure you have the latest corrections implemented.
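Conceptually, such a prerequisite check boils down to comparing the notes a scenario requires with the notes already implemented in the system. A minimal sketch in Python (the note numbers below are placeholders for illustration, not actual prerequisite notes):

```python
# Sketch of a prerequisite-note check, similar in spirit to what the
# guided procedure reports via RTCCTOOL. Note numbers are placeholders.

def missing_notes(required, implemented):
    """Return the required SAP notes that are not yet implemented."""
    return sorted(set(required) - set(implemented))

required_notes = ["1234567", "2345678", "3456789"]   # from the readiness check
implemented_notes = ["1234567", "3456789"]           # already in the system

print(missing_notes(required_notes, implemented_notes))  # ['2345678']
```

The detail log essentially presents this difference, so you know exactly which corrections still have to be applied.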


Prerequisite before setting up a business process monitoring scenario

You need business process documentation in a solution within SAP Solution Manager (a structure that represents the business processes & business process steps) in order to do business process monitoring.


There are multiple ways you can achieve this, but I won’t cover them all in this blog because it would take us too far off topic. If you already utilize business process documentation, you can leverage what you have already built. Otherwise, if you want to set up a simple Business Process Monitoring scenario (let’s call it a test), it doesn’t have to take much time to get started.

You can create a Solution Manager implementation project using transaction SOLAR_PROJECT_ADMIN. The minimal configuration is to give the project a title, choose the language, and insert the logical components (which represent the SAP systems involved in the business process steps) in the system landscape tab.



Once that is done, you can create a business process structure using transaction SOLAR01. Once your structure is ready (just keep it simple to start), you need to insert the structure into a solution, which is done using transaction SOLMAN_DIRECTORY. If you don’t yet have a solution, you need to create one first; this can be done via the SAP Solution Manager Administration workcenter.
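To make the idea of the structure concrete, here is a minimal sketch of what such a business process structure carries before it goes into a solution: processes, their steps, and the logical component (i.e. the managed system) each step runs on. All names here are invented for illustration:

```python
# Minimal model of a business process structure: each process has steps,
# and each step points to a logical component from the project's system
# landscape. Process, step and component names are made up.

business_process_structure = {
    "Order to Cash": [
        {"step": "Create Sales Order", "logical_component": "Z_ERP"},
        {"step": "Create Delivery",    "logical_component": "Z_ERP"},
        {"step": "Create Invoice",     "logical_component": "Z_ERP"},
    ],
}

# Each step must reference a logical component defined in the system
# landscape tab, otherwise monitoring cannot resolve the target system.
for process, steps in business_process_structure.items():
    for s in steps:
        assert s["logical_component"], f"{process}/{s['step']} has no logical component"
print("structure is consistent")
```

Keeping this mapping small and explicit is what makes the "simple test scenario" quick to set up.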


Setting up the business process monitoring scenario


In the business process monitoring setup, you continue through the guided steps (hit Create and work through the configuration) to configure business process monitoring against individual business process steps.


After you run through the configuration steps, you can see the monitoring overview in the Business Process Operations (new) workcenter under Business Process Monitoring. Recently introduced is the integration into the Monitoring and Alerting Infrastructure (MAI), which gives the scenario the same technical architecture and look & feel as Technical Monitoring.




As you might know, Business Process Analytics was first introduced in 2010 with SAP Solution Manager 7.0 support package 23. A lot has happened since then; for example, Business Process Analytics went mobile on the iPad. An overview of all features and the evolution of key figure content can be found in our central Business Process Analytics document. But something is not yet widely known: Business Process Analytics can also be used in an ad-hoc and real-time manner. This additional feature has been available since SAP Solution Manager 7.1 support package 10, and it works regardless of whether your managed system is running on SAP HANA or "anyDB". The name of our Web Dynpro application implies that it only works for SAP HANA, but this is not correct.


Technical prerequisites

You require SAP Solution Manager 7.1 with support package 10 or higher.


Your managed system should have ST-A/PI 01Q support package 2 or later implemented. Additionally, you should implement SAP Note 2085063 (Business Process Analytics powered by SAP HANA related collector returns result list in wrong format).

If you are looking for a complete description of all necessary technical prerequisites, it is best to refer to SAP Note 2105720 (Business Process Analytics powered by SAP HANA - prerequisites).


Additionally required configuration


In order to get the ad-hoc/real time Business Process Analytics working you have to create a new connector instance for the BPM_HANA_SUITE connector in the Data Source Manager.


As the new Web Dynpro application is not integrated in the Business Process Operations work center, you should create your own favorite in order to directly access the application.




Using Business Process Analytics powered by SAP HANA


After creating the favorite, you can simply click the hyperlink to access the selection screen of Business Process Analytics powered by SAP HANA. It looks very similar to the "traditional" Business Process Analytics; the main difference is that you do not have to select a solution. You just choose a system and client combination and then filter for the key figure of interest (out of the 900+ available ones).




When you select and execute a key figure, data collection is immediately triggered in the connected managed system; as soon as it finishes, you get the Business Process Analytics analysis screen. So compared to the "traditional" Business Process Analytics, no setup and scheduling of key figures is required.



The analysis screen is also very similar to the "traditional" Business Process Analytics, but "only" Advanced Benchmarking and Age Analysis are available as analysis types. Benchmarking and Trend Analysis are not provided, as these features require InfoCube access, which by definition is not available for an ad-hoc, real-time execution.


The Detail Analysis is of course available, so you can jump into the corresponding result list and access every single document if required.


Further reading

You can find all necessary information about Business Process Analytics in this document.


Frequently Asked Questions about Business Process Monitoring and Business Process Analytics are answered under http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Monitoring and http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Analytics respectively. The following blogs (in chronological order) provide further details about Business Process Analytics and Business Process Monitoring functionalities within the SAP Solution Manager.


Using business process documentation in SAP Solution Manager? Then you definitely want to know about the Solution Documentation Assistant (codenamed SODOCA). SODOCA allows you to verify your business process documentation structure. Are the SAP processes we documented really in use? And, perhaps an even more interesting question: did we miss something? Did we forget to document anything?

Are the processes we claim are in use, in use?


The first thing you see when you open a SODOCA analysis result is the business process structure and the checks that have been performed, represented by traffic lights.

A green light means:

The process is in use in the SAP system (during the period that was analyzed)

A yellow light means:

One of the underlying processes of this business process is not in use in the SAP system (during the period that was analyzed)

A red light means:

The process was not used in the SAP system (during the period that was analyzed).
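The traffic-light semantics above can be sketched as a simple roll-up rule: a node is green when it and everything below it was used, yellow when at least one underlying process was not used, and red when the node itself was not used. A minimal sketch (my own reading of the rules, not SODOCA's actual implementation):

```python
# Hedged sketch of the traffic-light roll-up described in the text.
# used_self: was this process/step itself used in the analyzed period?
# children_status: the lights already computed for its sub-nodes.

def rollup(used_self, children_status):
    if not used_self:
        return "red"                       # the process itself was not used
    if any(s in ("yellow", "red") for s in children_status):
        return "yellow"                    # something below it was not used
    return "green"                         # everything below it was used

print(rollup(True, ["green", "green"]))    # green
print(rollup(True, ["green", "red"]))      # yellow
print(rollup(False, []))                   # red
```

This makes it clear why a yellow parent is worth expanding: the problem sits somewhere in its children, not in the parent itself.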

Based on this result, the business process documentation can be adjusted. A red process (or step) does not automatically have to be removed; it is just an indication that it was not in use during the period you analyzed. You should then ask yourself why that is the case.

It can be related to the period analyzed (some processes are only performed once a year, for example). To overcome that, you could, for example, analyze a full year.

Another reason might be that you think users are running a particular transaction, while in reality they are using another transaction (or report, or custom Z object) that lets them do the same thing.

Did we forget something in our business process documentation?


In the result, under the Object Usage tab, you can find a tab Not in Analysis (provided you selected these options when you set up the analysis).

Here you can find an overview of transactions (and other object types, when selected) which are reported to be in use in the SAP system but are not present in the business process documentation structure. This is useful for checking whether you missed anything. It can of course also be the case that a new process was put in place but the business process documentation was not updated.
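Both directions of this comparison boil down to set differences between what is documented and what is observed in use. A small sketch (the transaction codes are examples, not taken from any real analysis):

```python
# Sketch of the two SODOCA comparisons: "Not in Analysis" (used but not
# documented) and unused documented objects. Transaction codes are examples.

documented = {"VA01", "VL01N", "VF01"}
in_use     = {"VA01", "VL01N", "VF01", "ZSD_REPORT", "MIGO"}

not_documented = sorted(in_use - documented)   # in use, missing from documentation
not_used       = sorted(documented - in_use)   # documented, never observed in use

print(not_documented)  # ['MIGO', 'ZSD_REPORT']
print(not_used)        # []
```

The first set is what the Not in Analysis tab surfaces; the second drives the red traffic lights in the structure view.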

A repetitive cycle


During SAP TechEd && d-code Berlin, attendees asked me (yes, more than once): how do I get started with SAP Solution Manager? What is the best practice?


Since SAP Solution Manager 7.1 has a huge set of scenarios and applications that can be leveraged by the customer, it can be a tricky question once you get past the initial configuration.

SAP Solution Manager is a mixture of multiple products: it contains a full CRM, an embedded BI (unless you opt out and use a separate BI system), and a Solution Manager-specific part, and it runs on SAP NetWeaver technology.

Since I like to share knowledge and have received this question more than once, I decided to write this blog post to help people get a better understanding. It is such a broad topic that I won't be able to cover it in a single blog post, so if this blog is valued and found interesting, I can continue with a series of posts to help community members figure out how all of this fits together.


1) Installation


It starts with installing SAP Solution Manager 7.1. As an SAP customer, you can simply download the appropriate files and get started. Of course, it's not that simple in real life, so you'll need an SAP Basis administrator to take care of it. An installation typically includes a database, an application server, a host agent, a diagnostics agent, Wily Introscope and, depending on the size of the landscape, more. The SAP Basis administrator also takes care of Basis-specific post-processing for the NetWeaver technology stack, such as the SAP Transport Management System configuration, to give a small example.

An SAP Solution Manager landscape typically consists of two or more SAP Solution Manager systems. Some (small to midsize) customers have only one SAP Solution Manager system; in that case, you would want to test support package stack updates and upgrades on a clone of that system to avoid unplanned downtime if something goes wrong. If you plan to actively use SAP Solution Manager across multiple scenarios over time, I would advise going for at least two SAP Solution Manager systems to make it easier to test changes to your landscape. If you want to seriously invest in the IT Service Management scenarios (all the management modules), it can even make sense to go for a three-tier landscape for better change management control if you're working with custom flows.


2) System preparation and basic configuration


picture 1.0: part of SOLMAN_SETUP left menu pane


Once that is done, you have SAP Solution Manager up and running. The next thing is the system preparation and basic configuration of Solution Manager, which is performed through either transaction SOLMAN_SETUP or the Solution Manager Configuration workcenter. For this you'll need either an SAP Basis administrator with SAP Solution Manager knowledge or (preferably) an SAP Solution Manager consultant with a technical background, since many steps involve technical actions such as connectivity, data replication, BI content activation and so on.


3) Managed system setup


picture 1.1: part of SOLMAN_SETUP left menu pane + managed system setup right pane


Next is the managed system setup (see picture 1.1; note that I blanked out actual system and host names), which lets you configure the SAP systems connected to SAP Solution Manager. For SAP systems to be known in SAP Solution Manager, they have to be known in the System Landscape Directory (SLD) connected to SAP Solution Manager and synchronized into the Landscape Management Database (LMDB). Another technical story, in which you want maximum automation to avoid issues.


Already in the early phases, the customer can choose to leave out certain elements depending on the use case of their SAP Solution Manager implementation. Some customers have two or more SAP Solution Manager landscapes which are used for different scenarios; for example, one landscape for Technical Monitoring and one for IT Service Management (Change Request Management etc.). For Technical Monitoring you need diagnostics agents; for IT Service Management, you don't. This can result in an SAP Solution Manager system where the managed system setup has red traffic lights, which, depending on the use case, can be ignored. The reason for those separate landscapes (mostly seen at large customers) is that they want to update one landscape more frequently than the other and therefore split the use cases over multiple SAP Solution Manager landscapes.


4) The gates are open


picture 1.2: SOLMAN_SETUP left menu pane

As of this point, the gates to a large number of scenarios open up. Depending on the scenario you want to set up, prerequisites exist. I won't list all possible scenarios and prerequisites here because that would just be too much. Steps 1, 2 and 3 are what we called the "Technical Base" during an implementation project a few years back.

Confusion arises - some scenarios/features are "just" available right away

Here is where it can start to get complicated to understand where to go next or what to do next. A number of scenarios will just work without further action (except perhaps for bug-fix notes) after steps 1, 2 and 3. Some scenarios will be up and running but might, for example, have missing data. This is because, from a technical point of view, the configuration can get rather complex. Due to the number of prerequisites (think of Root Cause Analysis, for example), chances are pretty high that not every prerequisite was met during installation and post-processing, and you'll need additional work to make every single feature work properly. Root Cause Analysis is one example you don't see in the SOLMAN_SETUP menu tree, because it should already be functioning after performing steps 1, 2 and 3. Experience tells me it won't be fully correct and you'll need to take action to make it work properly.

As of step 4, you need to think about functional prerequisites as well; for plenty of scenarios you need business process documentation, for example. Business process documentation is the next logical step in advancing the use of SAP Solution Manager if you want to leverage scenarios such as Business Process Change Analyzer, Test Scope Optimization and Scope and Effort Analyzer. Again, business process documentation is not part of this menu tree because it should also work after steps 1, 2 and 3; you can just go and use the scenario, but you do have to know how to get started within it, of course. I'll try to cover some of those aspects in upcoming blog posts.

Implementing scenarios that require more work


picture 1.3: guided procedure to configure scenario Change Request Management

Many scenarios (visible in picture 1.2) offer guided procedures to help you configure that specific scenario. If you want to configure Change Request Management, for example, this would be the next step.

Expert Guided Implementation

SAP also offers options to help customers implement SAP Solution Manager scenarios. You can check out SAP's Expert Guided Implementation initiative over at support.sap.com/escademy

I haven't used this service myself because I have spent a lot of time in SAP Solution Manager, but I have heard from customers who didn't have the time to build up that kind of experience that this service can be very useful to help them get started. The idea is that an expert from SAP guides you through the configuration of the scenario and, over multiple calls, follows up on progress and helps you further use the scenario.

SAP Enterprise Support Value Maps

Another recently introduced SAP initiative is Value Maps (SAP Support Portal). The idea behind these is to provide guidance on which resources (Expert Guided Implementation - EGI, or something else) customers can leverage to help set up a specific scenario.

General advice

For any scenario you want to use, it's advisable to look at the relevant SAP notes to ensure you have the necessary prerequisites in place. Sometimes the guided procedure in SOLMAN_SETUP (if available for the scenario) will automatically check and report on the SAP notes required on your SAP Solution Manager system, your managed SAP systems, or both. As mentioned before, not all scenarios are covered there, so for some you'll need to check the relevant SAP notes yourself and verify that your system meets the prerequisites so the scenario will work properly.

SAP Solution Manager is not that "simple" in most cases once you step outside the "we only use SAP Solution Manager to download files" realm. So it can definitely make sense to go for EGI or hire someone with SAP Solution Manager knowledge to aid you in your efforts; figuring out complex scenarios yourself can take a lot of time.

As of SAP Solution Manager 7.1 the CRM Web UI is the standard access to ITSM and ChaRM.

The Web Client User Interface is the first step into a new era of user interfaces regarding usability and flexibility for the business user.




Predefined business roles are the entry point into the application from an end user point of view.

SAP delivers the following business roles to be used for ITSM and ChaRM:

  • SOLMANREQU for Reporter / End User / Key User
  • SOLMANDSPTCH  for Dispatcher
  • SOLMANPRO for Processor and Administrator


Favorites and Tags are very good functionalities that can help the end user in the daily work.

By using the following buttons, the user can save the current ITSM/ChaRM document as a favorite or create a tag to identify the documents related to the same topic.



In this way, it is easy to access the documents later because the favorites and tags are visible in the home screen.



However, in the above business roles the favorites and tags are not enabled by default.

If end users want to use them, they need to do the following:

  1. Go to the Personalize menu
  2. Go to the Settings assignment block
  3. Enable Favorites and Tags


From my point of view, it would be better to have this enabled by default, so end users can use it without doing any personalization.

To enable favorites and tags by default in the business role, follow the steps below:

  1. Go to the following IMG path:
    SAP Solution Manager Implementation Guide -> Customer Relationship Management -> UI Framework -> Technical Role Definition -> Define Parameters
  2. Copy the standard parameter profiles into a custom namespace
  3. Create new entries for the following parameters and Save
  4. Go to the business role definition:
    SAP Solution Manager Implementation Guide -> Customer Relationship Management -> UI Framework -> Define Business Role
  5. Select the business role and go to Assign Function Profiles
  6. Replace the Function Profile ID PARAMETERS with the new parameter profile that you have created


After that, all users assigned to this business role will be able to see the Favorites and Tag buttons while processing ITSM/ChaRM documents.



After working on SAP Solution Manager for more than two years, I would like to share our experience with you. Your feedback as a “Solmaner” or SAP Solution Manager customer is more than welcome to help us improve our service!

Our approach is based on the “no big bang” principle. Thus we built our SAP Solution Manager platform gradually, step by step.

We first rebuilt the technical foundation when migrating from the version 7.0 to 7.1. After the technical foundation was laid out, we focused on getting the functional foundation (business process documentation) ready to be able to offer more extensive services.




After the initial project, we proposed the following services, based on SAP Solution Manager functionalities, to our customers over time:


  • Technical Monitoring
    • In general (interface enabled)
    • Specific scenario for Java performance (interfaced)
  • Incident Management
    • Incident creation from within SAP (interfaced)
    • Automated incident creation based on monitoring (interfaced)
  • Custom Development Management Cockpit (CDMC)
  • Business Process Documentation
    • Solution Documentation Assistant
    • Business Process Change Analyzer/Test Scope Optimization
  • Change and Release Management (CHARM) (interfaced)


Technical Monitoring

The technical monitoring service gives you a lot of information about your SAP system health on a regular basis via the EWA and SLR reports, and more. Other functionalities like Root Cause Analysis or DVM (Data Volume Management) also give you good insight into your SAP systems' status. I qualify it as the easiest service to promote, as we didn't have to do much to convince our customers of the added value it brings. It indeed increases the quality of the systems by alerting on potential technical bottlenecks (security, performance) and inefficiencies (performance, outdated software levels) at an early stage.



Incident Management

Regarding the SAP Solution Manager incident management functionality, we only use part of it, in the sense that support messages are created from SAP satellite systems but registered automatically in a separate service management tool thanks to SAP Solution Manager. The customers who requested this service deal with applications based only on SAP technology. Another customer uses our technical monitoring service but doesn't want to use this one, as he has applications based on different technologies: SAP, Java, .NET, etc.



Custom Development Management Cockpit (CDMC)

CDMC helps to manage and optimize the usage of custom development objects on the SAP ABAP stack. We used CDMC only twice in two years, during upgrades. Each time, the analysis listed thousands of objects to go through to decide whether to remove or keep them, which is really time-consuming. Besides, our customers don't find it user-friendly. Thus we have to find a way to improve this service so that more customers use it.



Business Process Documentation

We spent most of our time developing our Business Process Documentation service. What we call Business Process Documentation in our service portfolio is a set of SAP Solution Manager functionalities and concepts:

  • SODOCA
  • BPCA
  • TBOM
  • CBTA


I like to use these acronyms; it makes me look like a technical specialist, while as Service Delivery Manager I rely on the technical consultants and the SAP Mentor on my team to set up the scenarios. Nevertheless, I can give you the high-level principles behind these acronyms:


  • SODOCA: the SOlution DOCumentation Assistant checks whether the business process structure you defined is correct by identifying omitted or unused transactions.


  • BPCA: the Business Process Change Analyzer allows you to check the impact of your changes on the documented business processes. Test Scope Optimization then lets you identify the most impacted business processes on which to focus your tests. Later on, we will use this functionality in combination with CHARM (ChAnge and Release Management).



  • TBOM: Technical Bills of Material are used by other SAP Solution Manager functionality such as BPCA. The generation of TBOMs is part of business process documentation.



  • CBTA: Component-Based Test Automation, a record-and-playback test script approach without additional license fees.


We succeeded in establishing a recurrent scenario for the monthly release:

  • Identify the business processes most impacted by the changes (transport requests) with BPCA
  • Identify the business processes to test for 100% test coverage with Test Scope Optimization
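The two steps of this monthly cycle can be sketched as set operations: an impact analysis (which processes have TBOM objects touched by the transports) followed by a greedy selection of a small test scope covering every impacted process. This is only my illustration of the principle, with invented object, process and test names, not BPCA's actual algorithm:

```python
# Hedged sketch of BPCA-style impact analysis plus greedy Test Scope
# Optimization. All names are invented for illustration.

transport_objects = {"ZCL_PRICING", "VA01_SCREEN"}   # objects in this release

tboms = {  # process -> objects recorded in its TBOM
    "Create Sales Order": {"VA01_SCREEN", "ZCL_PRICING"},
    "Create Delivery":    {"VL01N_SCREEN"},
    "Create Invoice":     {"ZCL_PRICING", "VF01_SCREEN"},
}

# Impact analysis: a process is impacted if its TBOM intersects the transports.
impacted = {p for p, objs in tboms.items() if objs & transport_objects}

tests = {  # test case -> processes it covers
    "TC_ORDER_INVOICE": {"Create Sales Order", "Create Invoice"},
    "TC_ORDER_ONLY":    {"Create Sales Order"},
    "TC_DELIVERY":      {"Create Delivery"},
}

# Greedy set cover: repeatedly pick the test covering most uncovered processes.
scope, uncovered = [], set(impacted)
while uncovered:
    best = max(tests, key=lambda t: len(tests[t] & uncovered))
    if not tests[best] & uncovered:
        break  # remaining impacted processes have no covering test
    scope.append(best)
    uncovered -= tests[best]

print(sorted(impacted))  # ['Create Invoice', 'Create Sales Order']
print(scope)             # ['TC_ORDER_INVOICE']
```

In this toy example, one test case is enough to cover both impacted processes, which is exactly the kind of reduction Test Scope Optimization aims for.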


In the coming months, we would like to use CBTA to execute the test scenarios related to the business processes identified via the BPCA/Test Scope Optimization scenario. In this way we will be able to propose an end-to-end change process. We would also like to use CBTA to automatically generate the TBOMs in order to review our business process structure.


ChAnge and Release Management (CHARM)

We recently set up the CHARM service, but it is not yet used in production. An SAP Solution Manager consultant was surprised that we didn't start with the CHARM implementation. In an IT organization where standards are already defined, you have to give solid arguments to introduce another tool that has the same functionality as the existing tools. The real added value of CHARM that we highlight is the transport management for SAP changes. Indeed, SAP changes are managed in one service management tool, but the release managers use additional Excel files to manage the SAP transport requests. With CHARM, the end-to-end change management process is supported by one tool, with the possibility of using transports of copies. As with the incident management part, the only customers willing to use CHARM are the ones dealing with applications based only on SAP technology. I wonder whether any company uses CHARM to follow their non-SAP changes: if a reader is in this situation, I am eager to hear or read his/her story.



To conclude, I believe that the real added value SAP Solution Manager can bring to SAP operations is not sufficiently known. The fact that you have to implement a lot of OSS notes (at least in our case) to get one scenario working properly doesn't encourage wider usage of the SAP Solution Manager functionalities.


From the exchanges I have had so far with people using SAP Solution Manager, they mostly use CHARM and Project Management functionalities. I haven't met anybody using BPCA/Test Scope Optimization or SODOCA, and I have met only one team that started using CBTA.



I am also wondering what priority SAP is really giving to SAP Solution Manager improvement, now and for the coming years. This improvement can be achieved through better communication about SAP Solution Manager capabilities, which are really underestimated.



Unfortunately, there are still lots of people out there with preconceived ideas about SAP Solution Manager based on earlier releases (pre-7.1). People have to be aware that there is a huge gap between versions 7.0 and 7.1, and I hope that with version 7.2, planned for the beginning of 2016, more customers will use SAP Solution Manager for the continuous improvement of their IT operations.


I am attending SAP TechEd in Berlin, November 11th-13th; I hope to meet a lot of Solmaners there:




EXP17550 - Sharing an SAP Solution Manager 7.1 Implementation Experience, Networking Session (30 min), Thu 2014/11/13, 14:30 - 15:00, Expert Lounge EL 2.1, Hall 2.2


I can't finish this article without paying tribute to SAP enthusiast Tom Cenens, who is also an SAP Mentor. In a team, all players are important, as each of them brings added value. We are all convinced that Tom is one of those people who motivate and help others grow by sharing their passion and knowledge with respect and humility.

In this blog we will show how to maintain the Business Process Structure of your Implementation project with the ARIS tool (supported via a Template project in Solution Manager). This blog only covers the ARIS part related to the Business Process project preparation; the part related to Solution Manager Template Projects will be covered in Boris Milosevic's blog.


Background: A template project is maintained in the SAP system (client) where you can make changes. For more information on this please refer to Boris Milosevic’s blog How To Set Up Project Environment in Order to Maintain your Implementation Project with ARIS tool.


The environments in the two tools, SAP Solution Manager and ARIS, are synchronized: the ARIS content represents the business view of the enterprise's business processes, while Solution Manager represents the system view.


The schema below shows the different steps to follow when creating a new project to improve a given Business Process area:


When synchronizing between ARIS and Solution Manager, object attributes in the ARIS objects define how the synchronization should be done. By following the steps above and using the Transform functionality in ARIS, the new [Variant] copy of the Business Process area to be updated is redirected to synchronize with the Implementation Project in Solution Manager instead of the Solution Template Project.
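Conceptually, the Transform step rewrites the synchronization attributes on every object of the variant copy so they point at the implementation project instead of the template project. A hedged sketch of that idea (the attribute key and project names here are invented, not actual ARIS attribute names):

```python
# Sketch of what the Transform step conceptually does: repoint the SAP
# synchronization attribute of each variant-copy object from the template
# project to the implementation project. Names are illustrative only.

variant_objects = [
    {"name": "Order to Cash",      "sap_project_id": "TEMPLATE_PRJ"},
    {"name": "Create Sales Order", "sap_project_id": "TEMPLATE_PRJ"},
]

def transform(objects, old_project, new_project):
    """Rewrite the project attribute on every object still pointing at old_project."""
    for obj in objects:
        if obj["sap_project_id"] == old_project:
            obj["sap_project_id"] = new_project
    return objects

transform(variant_objects, "TEMPLATE_PRJ", "IMPL_PRJ")
print({o["sap_project_id"] for o in variant_objects})  # {'IMPL_PRJ'}
```

This also makes the bug described at the end of the blog easier to understand: if the root object's attribute is not rewritten, the variant still synchronizes with the wrong project.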

At the end of the step-by-step process I explain a short workaround needed because of a Transform bug in many versions of ARIS. We're using a version that needs the workaround.

1. To start the update, synchronize the two “Master” environments.


  • Select the Solution Template in SAP Solution Manager.
  • Accept the default settings and selections.


The ARIS Master and the Solution Manager Master Template are now synchronized and contain matching business context.


2. Copy the ARIS Master Group (Project) and Paste it as a Variant in the [Parent] Group Node.


The new structure is visible in ARIS – this will be our Implementation project.


3. Now take a look at Boris Milosevic's blog How To Set Up Project Environment in Order to Maintain your Implementation Project with ARIS tool to find out what needs to be done in Solution Manager.


4. In ARIS create a New, Temporary ARIS Database (TMP)


This database receives a Transfer (ARIS functionality) of the Implementation Project from Solution Manager and is only used to Transform the SAP attributes of the objects in the ARIS Variant Project copy, so that the copy can later be synchronized with the Implementation Project in Solution Manager.


5. Transfer the Implementation Project from Solution Manager to the ARIS TMP database.


Verify the Implementation Project Structure in the TMP db.

Check Warnings at end of this blog, and make adjustments if necessary.

6. In the Variant MASTER database, select the copied Variant structure and Transform from the [Template] Project in the TMP database.


This step is necessary to properly set the SAP attributes in the ARIS Variant Master Project [so that they “point” to the Implementation Project in Solution Manager].


Select the TMP database as the corresponding database containing the Implementation Project and confirm with OK.


7. Work in the Variant Master and make all changes required by the business and the organization. Assign, publish and validate the models with the process owner.

8. Synchronize the ARIS Master (Variant Copy) Project back to the Implementation Project in Solution Manager. Select the Variant’s root-object in ARIS.

[Read the Warnings slide at the end of this blog before proceeding.]


The project is now approved and ready to be published in ARIS. Make sure model-level assignments are assigned to the new processes in the Variant copy Master. Delete the original Master processes.




There is a known bug with certain versions of ARIS. When creating a variant copy of TMPL, the root object is also copied. However, the Transform function does not change the SAP ID or Project ID when Transforming the project from the TMP database. A manual action is required.


After the Variant copy has been created, and before Transforming the SAP attributes from the TMP ARIS database, you need to export (XML export) the root object from the TMP database and import it into the MASTER Variant. Then consolidate the Variant copy and the imported object, using the imported object as the master.

After the consolidation, check the SAP attributes of the Variant root object. Make sure the [SAP] Synchronization Project attribute points to the Implementation Project in Solution Manager.

After this you can perform the synchronization with Solution Manager.

As of Solution Manager 7.1 SP12, you have the option to use Business Process Monitoring on MAI. This new Business Process Monitoring function uses the Monitoring and Alerting Infrastructure (MAI), which is quite different from the infrastructure for classic BPMon.


The most important infrastructure changes are:


  1. Storage of monitoring configuration

    In classic BPMon, the monitoring configuration was stored in the DMD. The generated and activated customizing was stored in cluster tables. For BPMon on MAI, all monitoring configuration (saved, generated and activated) is stored in the MAI infrastructure. Only the link between the business process context and the monitoring object is stored in the DMD.
  2. Triggering of Data Collection

    In classic BPMon, the trigger for the data collection came via the CCMS on SAP Solution Manager, using the BPM_LOCAL RFC destination. For BPMon on MAI, this trigger now comes via the DPC PULL Extractor in the Extractor Framework. The BPM_LOCAL RFC destination is no longer used in BPMon on MAI.
    Since the DPC PULL extractor can only be triggered every 5 minutes, data for BPMon on MAI cannot be collected more frequently than every 5 minutes either (in theory, PUSH metrics could be collected more frequently, but almost all BPMon metrics currently available are PULL metrics).
    In classic BPMon, the scheduling maintained in CCMS or the information stored in table DSWP_BPM_TIMESCH was used for determining which monitoring objects were due for data collection. For BPMon on MAI, the scheduling information is stored in the MAI infrastructure. Table DSWP_BPM_TIMESCH does not contain entries for monitoring objects collected via BPMon on MAI.
  3. RFC destination for data collection

    In classic BPMon, the TMW RFC destination is used for data collection. In case no TMW destination exists, the READ destination is used as a fallback (in earlier releases only the READ RFC destination was used). In BPMon on MAI, the data collection is executed via the TMW RFC destination. If no TMW destination exists for the managed system, no data collection will be possible.
  4. Add-Ons installed on managed systems

    In classic BPMon, data collectors on the managed system were called directly (i.e. coding shipped in ST-A/PI was called directly from SAP Solution Manager). In BPMon on MAI, the BPMon data collectors are called via the DPC Extension (which is part of ST-PI) on the managed system. This means that for BPMon on MAI, it is mandatory to have ST-PI SP10 or higher implemented on the managed system, and the managed system has to have a basis release of 7.0 or higher.
  5. Alert Storage

    In classic BPMon, alerts were stored in table DSWP_BPM_ALERTS. In BPMon on MAI, alerts are stored in the MAI infrastructure. You will not find entries for BPMon objects collected via MAI in table DSWP_BPM_ALERTS.
    In classic BPMon, alerts could be transferred to cube 0SM_BPM for trend analysis. In BPMon on MAI, this cube is no longer used. The BPMon alerts are stored in the MAI twin cubes.


The infrastructure changes mean that the prerequisites for Business Process Monitoring on MAI differ from those for classic BPMon. For the prerequisites, please see SAP Note 1949245.


For an introduction to Business Process Monitoring on MAI and links to further blogs for BPMon on MAI see Business Process Monitoring on MAI available with Solution Manager 7.1 SP12.

How To Set Up Project Environment in Order to Maintain your Implementation Project with ARIS tool



In this blog we will show how to maintain the Business Process Structure of your Implementation project with the ARIS tool (supported via the Template project functionality). This blog covers only the Solution Manager part, i.e. the preparation of the (Template) projects; the ARIS part is covered in Joakim Peleus' blog How To Set Up Project Environment in Order to Maintain your SolMan Implementation Project with ARIS.


Before we start with this subject, it is worth reading up on the maintenance of Template projects in an SAP environment:


  • A Template project has to be maintained in an SAP system (client) where you can make changes, i.e. a client that is open for changes (which, according to SAP rules and best practice, is not a productive client). For more info on this subject, please refer to the blog Template Projects in Productive Systems - Solution Manager - SCN Wiki.


To see the complete process (ARIS <-> SolMan), please follow the steps in the screenshot below.








Template Project Creations and Preparation


In our example we will create the Template project in the DEV environment, where the client is open for changes.


First you will create a template project:






Activation of the Global Rollout activity



Now it’s time to create a template name for your template project:



The next step, in transaction SOLAR01, is to select which BP belongs to which template by following the steps below:



After you have finished assigning your Z_TEMPLATE_1 to the corresponding BP scenarios and have set the Global Attribute to Local, you can go back to transaction SOLAR_PROJECT_ADMIN and activate your template Z_TEMPLATE_1 so that it becomes visible.


You have to make your template visible because only then can it be used in other projects, such as an Implementation project.



After you switch the visibility, your template is ready for use and visible to other projects.





If you are maintaining your Template project in your DEV environment, now is the proper point in time to transport it from your development environment to your production environment for further work.






After creating the Implementation project, go to the project's Scope tab and select your recently created templates.



Now you can go to transaction SOLAR01 to check whether your templates are available in your Implementation project and whether the source is your Template project (in our case BM_TEMP_XX).




Now your Implementation and Template projects are ready for further use in the ARIS synchronization and maintenance process.


In case you have questions, please do not hesitate to contact either me or Joakim Peleus!

A common requirement in a test script recording / playback tool is inserting dates throughout business processes. While I was working together with end users at the customer site, we noticed the date was captured in a raw format, let’s say 22.10.2014 for today, which is fine today but can be problematic tomorrow, since the input field for this date was a “required end date”.


Thus we need to be able to pass a dynamic parameter to the script that takes today’s date and adds days or months or years to that date in order for the process to run through fine when we play back the test script.


Otherwise you see this nice error in your task bar of SAPgui:


ECATT parameter &DATE?


So what are the options then? CBTA scripts can leverage eCATT parameters, so &DATE would translate into today’s date. Well, if your configuration is correct and you’ve implemented a couple of recent (at the moment of writing) notes to make this work:


2025280 - CBTA - Support of &DATE eCATT variable

2013565 - eCATT - Integration of external test tools - &DATE not resolved

So I tried to use &DATE but it still failed, giving me a date in the format DD/MM/YYYY while the system expected DD.MM.YYYY. Darn, that’s too bad, right? So I contacted SAP and the CBTA team told me I needed to look at my end-user environment date settings, because this parameter takes the settings from the local system (Windows XP in my case, i.e. my laptop at the customer site which runs the script for this automated test).


Where to find this configuration depends on the Windows version, but in essence you want to look at regional options / date-time / calendar options in your respective Windows version, or at your registry (check out the Windows support website to locate the registry key for it).



Just changing the short date format for my own language / environment (Belgium – Dutch) didn’t cut it. I had to change English (United States) into this format to make the script execution take the right format, but that worked in the end.


To use it in CBTA, you can use &date (this comes from eCATT capabilities).


So putting today’s date in the parameters tab of your CBTA script can work.


But that doesn’t meet the requirements of getting a date in the future …


VBscript as input?


So I continued to explore the next option which is to use VBscript as input for the CBTA input parameter.


The date() function returns today’s date but, darn again, in the wrong format: it returns the date as MM.DD.YYYY and I need DD.MM.YYYY instead. So using just date() won’t cut it either.


Looking at more date / time functions of VBScript, I noticed day(), month(), year() and now(). Furthermore, using day() + 1 is possible in VBScript.




To use VBScript in the input parameter value field, you need to wrap it in %=%, so for example %=date()% will translate into 10.22.2014.


Next month?


So if I want to have next month as input I can use the following line of VBScript:
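The original screenshot with the exact expression is not reproduced here; a sketch of what it looks like (my reconstruction, inferred from the day()/month()/year() functions mentioned above and the dotted-date requirement) is:

```vbscript
' Hypothetical reconstruction: simply add 1 to the current month and
' concatenate the parts with "." to match the expected D.M.YYYY input
%=Day(Date()) & "." & (Month(Date()) + 1) & "." & Year(Date())%
```

Note that in December, Month(Date()) + 1 yields 13, which is exactly the issue the update below addresses.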


Update 01.12.2014: The above function works fine, except in December!

To overcome this, you can use the DateAdd() VBScript function. Thanks to Mark Goovaerts for figuring out the code update for me.



Note that you can use a shorter expression if you don't have specific format requirements. In this case, I have the prerequisite of having a "." between the dd, mm and yyyy parts.

Instead of using +1, we use the DateAdd function to get the current date, one month from now without "next year" issues.
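The exact expression from the screenshot is not shown here; a sketch of what it could look like (my assumption, built only from the DateAdd(), Day(), Month() and Year() functions discussed in this blog) is:

```vbscript
' Sketch: the date one month from today in D.M.YYYY form,
' using DateAdd so the month never overflows to 13
%=Day(DateAdd("m", 1, Date())) & "." & Month(DateAdd("m", 1, Date())) & "." & Year(DateAdd("m", 1, Date()))%
```

DateAdd("m", 1, Date()) rolls 22.12.2014 over to 22.01.2015 automatically, so there is no "next year" issue.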


Et voila, there it is. Now run the script and let’s see if that works.



Works like a charm!

Use Case Description

Select a value from a message text to be used as output parameter


During test automation you might face the challenge that only part of a string is required as an output parameter, to be used as an input parameter in subsequent steps.

Example: Let’s assume that when creating a sales order with VA01 you just get the message 'Standard Order 12695 has been saved' and you would like to use only the number for subsequent steps, e.g. to create a delivery.

For most SAP standard transactions like VA01 you can capture the value directly from the standard message parameters in the status bar (e.g. MESSAGEPARAMETER1) via the default components CBTA*GETMESSAGEPARAMS. But for some other transactions / applications, and especially with custom code, a number might not be available as a dedicated screen element but only as part of the complete message string. In this case the following procedure will help you to select the number and put it into an output parameter.

Although the better option for VA01 would be to use the message parameter directly via CBTA_GUI_SB_GetMessageParams, I would like to use it as an example to outline the procedure.


A further requirement to this use case was that the number of digits of the output value should be flexible.


There is one option to change the string in 2 steps:

  1. Deleting the left part next to the number with VBScript Replace Function
  2. Deleting the right part next to the number with VBScript Replace Function


Another and simpler option is to use the VBScript Split function, which can do the same in one step (thanks to Fabien Graille for the valuable input).

If the target value has a fixed number of characters, the VBScript Mid function could be used to do the same, also in just one step.
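For example, in the message 'Standard Order 12695 has been saved' the number always starts at position 16 and is 5 characters long, so a hypothetical expression (using the same $COMPLETE_TEXT$ token notation as the Split example in this blog) would be:

```vbscript
' Mid(string, start, length) is 1-based: position 16, length 5 -> "12695"
%=Mid($COMPLETE_TEXT$, 16, 5)%
```

This only works if the length of the target value never varies, which is why the Split approach is used for the flexible-length requirement here.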


For web applications there is an easier way: use the default component CBTA_WEB_A_GetMessageParams to extract parts of a message via message patterns.

For further VBScript functions and operators please check: VBScript Language Reference

To be able to follow this description I recommend checking the standard How-To Guide for CBTA as well as the CBTA Default Component Reference in advance.

Reading the text message from screen


Whenever you intend to check or read values from the screen, I recommend using the Test Creation Wizard to capture the string from the screen via ‘Add Checkpoint’ during the initial recording,


and select the option ‘Get Data’:


After finishing the recording and saving the script to SAP Solution Manager you need to edit the script in the Test Composition Environment (TCE).


Adjustment of Script after recording


In the TCE select the relevant Default Component that captures the text from the screen e.g. CBTA_GUI_GETPROPERTY. In the Parameters section select ‘Fixed’ in the Usage column and enter a meaningful name in the Value column for the Parameter ‘TARGETFIELD’:


With this action you provide the text string to the execution context as a token with the name COMPLETE_TEXT.

Now you need to add the default component CBTA_A_SETINEXECUTIONCTXT to perform the string operation.


To do so, please set both parameters to ‘Fixed’ and enter a new token name, e.g. NUMBER_ONLY, and the following VBScript command as the value of the parameter ‘THEVALUE’:

%=Split($COMPLETE_TEXT$," ")(2)%



This command splits the text into several parts (sub strings) using a space (" ") as delimiter and returns the 3rd sub string. As sub string counting starts at (0), index (2) is required for this example.


As a final step you need to add the default component CBTA_A_GETFROMEXECUTIONCTXT to read the number from the token in the execution context and write it to an output parameter, which can be used as an import parameter in subsequent scripts.


Set the usage of parameter NAME to Fixed and of OUTPUT to Exposed.


For better transparency you should rename the generated parameter ‘OUTPUT’ to a more descriptive name like ORDER_NUMBER in the Parameters tab strip:


If you now execute the script, you can see in the log how the initial message text has been changed and the value populated as an output parameter:


Now you can use the TCE to create a composite script and map this output parameter to the subsequent script as input parameter.

For many years you've had the option to monitor the successful execution of your core business processes via classic Business Process Monitoring. As of SAP Solution Manager 7.1 SP12, you also have the choice to monitor these business processes via BPMon on MAI.

What are the advantages of using BPMon on MAI?

  • unified infrastructure for monitoring (same infrastructure used by technical monitoring), meaning reduced administration effort for you
  • unified use cases for job monitoring and interface channel monitoring - managed objects you have set up in Technical monitoring can be re-used in Business Process Monitoring on MAI, reducing the load caused on the managed system by the data collection
  • unified alert handling experience for end users, alert inbox used for BPMon on MAI uses same layout as alert inbox for technical monitoring
  • full use of features available in MAI for BPMon alerts (e.g. for alert notification and metric monitoring)
  • use of 'future' function (as of Solution Manager 7.2, classic BPMon will no longer be available)
  • Data providers for classic BPMon are reused by BPMon on MAI. This means that your customer monitors will also continue working with BPMon on MAI





You can decide per solution whether to use classic BPMon or BPMon on MAI. On one Solution Manager system you can use solutions with BPMon on MAI in parallel to solutions using classic BPMon. Note that BPMon on MAI is only available for managed systems with basis release 7.0 or higher. If you want to monitor a managed system with a lower basis release, you have to use classic BPMon.

You can automatically migrate a solution with classic BPMon to BPMon on MAI via report R_AGS_BPM_MIGRATE_SOLU_TO_MAI. This migration cannot be reversed, so you should normally start the migration by copying the solution. For details on the migration steps see SAP Note 2010999.

For details on how to set up Business Process Monitoring on MAI, see the available Setup Guide. For a general overview, see the Overview Presentation.

Other Blogs for BPMon on MAI

Further Information

Frequently Asked Questions about Business Process Monitoring are answered at http://wiki.sdn.sap.com/wiki/display/SM/FAQ+Business+Process+Monitoring

Have you ever wondered why there isn't a better way to do a remote availability check within MAI for HTTP availability of a system, rather than letting the diagnostics agent ping the very same host that it is installed on?


Well, you could create an RFC destination of type G (HTTP) in your SolMan and then create a metric that monitors this specific RFC, but that's rather just a workaround and has major disadvantages.

Or you could create a GRMG Lite scenario and then integrate the MTE into MAI, but that will only work in the context of your SolMan system and therefore has pretty much the same disadvantages as creating RFCs.


Well, in MAI this kind of functionality is available as of SP12; it's called "URL Availability Monitoring".

If you are on a pre-SP12 level then there is still a way with EEM.


In this example we are going to ping the Google website and monitor its availability.


What you need:

    • As always, a solid "foundation", meaning system preparation and basic configuration.
    • At least one Diagnostics Agent that will serve as an EEM robot.
    • I would suggest a rather new SP stack on your SolMan 7.1 (as of SP12 this blog might be of less use).
    • All the EEM notes containing corrections that fit your release.
    • EEM authorization and workcenter roles according to the SolMan security guide of your release.


How it is done:

First you need to get your infrastructure ready for EEM; this is done in step 2 of the EEM wizard.


  1. Call transaction SOLMAN_SETUP and navigate to the "Technical Monitoring" section, followed by the "End-User Experience" radio button.
  2. Execute the infrastructure prerequisite checks in step 2.1.
  3. In step 2.2 "Global Settings" choose the user "SM_INTERN_WS" as the means of internal communication (make sure you created this user in system preparation and remember its password).
  4. Choose the options for "Webservice Notification" and "Server Side Collection". You only need the third option if you have to troubleshoot EEM data collection.
  5. In step 2.3 we are going to leave the standard settings for data lifetime and housekeeping, but you might want to pay attention to the help section to understand what is what and which settings fit your needs.
  6. In step 2.4 "Configure Automatically" simply hit the "Execute All" button and wait until the job has completed.
  7. In step 2.5 "BW Basic Settings" you can see the RFC destination used for the connection to your BW client or system; this will in most cases be "NONE". You can also set the time zone, and there is a checkbox for content activation enforcement. If you execute this wizard for the first time, the checkbox is set and cannot be de-selected; you only have to select this checkbox again after an upgrade with new BW content. Attention: If the checkbox is selected, you have to make sure the client is changeable for LOCAL objects (refer to the help section for further info).
  8. In step 2.6 "Configure EEM Robots" it gets a little tricky, but not too much. You might ask yourself where you can get an EEM robot from, how it is installed and so on. Relax: an EEM robot is nothing more than a Diagnostics Agent. It gets even better: you can just use the Diagnostics Agent already installed on your SolMan. On the right-hand side of the split screen click on "Check Agents", then choose the agent installed on your SolMan from the list and copy it to the left side of the screen with the little "<" arrow.
  9. After that you can enter location data on the left-hand side and save.
  10. In step 2.7 "Workmode Settings" you define in which work modes data should be collected. There are also work mode settings for the EEM robots, but those are not much use for our purposes as they don't have the "Maintenance" and "Planned Downtime" work modes.

So much for the basic setup; now comes the really tricky part. You need to create a script that will actually be executed, perform an HTTP GET request to the server you want to monitor, and deploy that script on your robot.


  1. In step 3.1 "Create Scripts" you download the EEM script editor first (if you have a newer SP then you should choose the one from your local SolMan, otherwise the version from the Wiki might be newer).
  2. Once you've downloaded the zip file, unpack its contents and start the executable named "EemEditor.exe".
  3. First you need to configure the connection to your Solution Manager in the "Editor Configuration".
  4. Then you create a new Eclipse project that holds your scripts.
  5. Then you can import the existing EEM script that comes with every Solution Manager by right-clicking on the project and choosing "Download Scripts...".
  6. When you first start, there's only the EEM Selfcheck script in the repository, so we are going to use it as a template.
  7. You can now copy and paste the script, or simply rename it by right-clicking on it.
  8. You can then rename the script's internal steps and change the URL to the server you want to monitor.
  9. You can also set the schedule interval directly from within the script. This defines how often (in seconds) the script runs. You can also set thresholds for response times and other things in this configuration dialog.
  10. Make sure you test and save the script.
  11. Once you are done editing, you have to upload the script to the repository again by right-clicking on it and choosing "Upload...".
  12. Warnings can usually be ignored; not so in case of any errors...


Once you have edited and uploaded the script, you are ready to implement the actual monitoring by continuing within the EEM wizard in SolMan (again transaction SOLMAN_SETUP as above). This is described in Part II of this blog: http://scn.sap.com/community/it-management/alm/solution-manager/blog/2014/10/06/how-to-replace-grmg-with-enduser-experience-monitoring--part-ii

Welcome back to the second part of this blog. What we have done so far is getting the infrastructure ready and creating a script that will do the monitoring job for us. The first part can be found here: http://scn.sap.com/community/it-management/alm/solution-manager/blog/2014/10/06/how-to-replace-grmg-with-enduser-experience-monitoring--part-I


  1. In step 3.2 "Maintain Scenario" you first create your own scenario and give it a name that speaks for itself. Although you can have several scripts in one scenario, keep in mind that reporting is done on scenario level. Meaning: if you just use the scenario for monitoring and alerting HTTP availability, then you are probably well served with one scenario that contains all your scripts for all the systems. BUT if you want to report your systems' HTTP availability later on, then you are probably better off with one scenario per system.
  2. After you have created your EEM scenario you can assign your systems (ABAP or Java) to it. You need to have at least one system in your scenario, despite the fact that it is really more important for the Workmode Management than for the monitoring and alerting of the scripts. In this demo case I just put my Solution Manager into the EEM scenario.
  3. In step 3.3 "Assign Script" you will see the script that you've uploaded before and can assign it to the scenario you just created.
  4. In step 3.4 "Distribute Script" you deploy your script to the robot(s). Personally I prefer the matrix view here.
  5. In step 4 "Monitoring" you should first enter your global parameters; in particular, make sure you have the right password for the SM_EXTERN_WS user.
  6. Still in the same step you can also reconfigure the scheduling and thresholds of your scripts.
  7. In step 5 "Alerting" it gets really interesting, as here you can define for which scripts you want to have alerts and also enter recipients for notifications. Make sure you do this on script level instead of scenario level, because you ideally have one script per monitored system. This also allows you to differentiate between performance and availability alerting. You can also enter the recipients for the notifications for every alert. After you have entered all information, do not forget to click on the "Configure Alerting" button.
  8. In step 6.1 you define how the SLA fulfilment is evaluated. Make sure you have a look at the documentation to choose what best fits your needs. The choices here aren't really that many, and values are calculated over the whole scenario. So if you want to have your SLA fulfilment monitored per system, then you probably have to go with one scenario per system, or otherwise you are stuck with averaged or best-child calculations over several systems.
  9. In step 6.2 you enter the thresholds for your SLA fulfilment, as in how many milliseconds are still OK for a response time.
  10. In step 7 you can create users that serve as templates for real-life user authorizations. They do not have any technical functionality other than for testing purposes or serving as templates. Make sure to also consult your Solution Manager security guide on this.
  11. In the final step you can export the whole wizard as documentation of what you have done. This is pretty handy.

Finally you can enjoy the EEM functions with all the integrations and reporting from transaction SM_WORKCENTER.





All in all it's a pretty big effort just to monitor an URL the way it used to be in GRMG but it you have crucial systems like your SLD, ADS or CPS then it might be worth it. I think the guys at SAP already figured that out and that's why they came up with the new URL availability monitoring as of SolMan 7.1 SP12.


Well that's all for now, I hope you find this helpful.

Introduction and Use Case description

With SAP Solution Manager and the optional adapter it is possible to synchronize content like processes and process steps, as well as related requirements, with SAP Quality Center by HP (also known as HP ALM or HP Quality Center). This activity can be automated so the transfer is performed in the background, e.g. on a daily basis. Details of this solution are described in the SAP Help Portal: Test Organization with the SAP Quality Center by HP, and in the How-to Guide of the Adapter available in the Service Marketplace.

The standard behavior of the adapter was designed in a way that only processes and steps that were selected manually in the last transfer are transferred in the background. Some customers have asked for a solution that transfers all processes and process steps, as well as the related requirements, that have assignments on the transfer tab. To allow this, a small modification is required, which is described in this blog post.

Please note that this is not an SAP standard solution but a modification of SAP standard code; it is not part of any SAP maintenance contract and is to be implemented at your own risk. Customers with an SAP One Service, MaxAttention or Active Embedded contract can request 'Expert on Demand' support for further assistance.


Adjustments of SAP Standard code to get required functionality

Although we tested the code in our environment, we strongly recommend performing the following changes in a non-productive SAP Solution Manager first and recording the changes in a transport, to be imported into the productive SAP Solution Manager only after your own tests.


Copy report ‘RS_SM_QC_REQUIREMENT_SYNC’ to customer namespace:

To keep the extent of the modification as low as possible and to keep the existing behavior for manual transfer in SOLARnn, we recommend copying the report ‘RS_SM_QC_REQUIREMENT_SYNC’ to the customer namespace, e.g. ZRS_SM_QC_REQUIREMENT_SYNC.

  1. In transaction SE38, enter the report name and click on the copy button:
  2. Enter new name for target program:
  3. Select sub objects to be copied:
  4. Activate program:

Modification of method ‘CL_AGS_HP_READ_BP / READ_PRJ_STRUC’:

  1. In transaction SE80 select ‘Repository Browser’, choose Class/Interface, enter ‘CL_AGS_HP_READ_BP’ and press ‘Enter’:
  2. Expand the Methods tree and double-click on method ‘READ_PRJ_STRUC’.
  3. Switch to change mode via the change button and register the object via the Service Marketplace http://service.sap.com/sscr to get the access key.
  4. Insert the following code as shown in the screenshots below:
    Section 1 in line 19:
      IF sy-cprog = 'ZRS_SM_QC_REQUIREMENT_SYNC'.
        CLEAR: a_select_nodes.

    Section 2 line 51:
  5. Save and activate the code.


Impact of coding changes:

The changes check whether the currently executed report is ‘ZRS_SM_QC_REQUIREMENT_SYNC’. If so, the report skips the check regarding batch mode and the existing filter (the node selection from the last manual transfer) and ensures that all nodes with assignments on the transfer tab are selected for transfer.

There is no impact on the manual transfer via SOLAR01 or SOLAR02, or on the use of the standard report 'RS_SM_QC_REQUIREMENT_SYNC’.


Scheduling of automated Blueprint Transfer:

If the job has already been scheduled, use transaction SM37 to change the ABAP program of the job step to ‘ZRS_SM_QC_REQUIREMENT_SYNC’.

To schedule a new job you can use transaction SM36 and enter ABAP program ‘ZRS_SM_QC_REQUIREMENT_SYNC’ as the job step.

