
SAP Sourcing



With the new word service, now known as the SAP Sourcing DOCX Generation Service, a crucial step in the transition is converting all legacy contract-specific documents to the .docx and .dotx formats.


You can find details via this link.  The following is a summary with best practices, including a section on common errors and general instructions on how to run an explicitly called script.





Best practices for running the .doc to .docx converter


How to run the .doc to .docx converter:

1)     Ensure that you are on a version of CLM that supports the new Contract Generator (CG): 7.0 SP5 and up for 7.x versions, and 9.0 SP11 and up for 9.x versions.

2)     Next, point the application to the old .NET-based Contract Generator (old CG); the converter must be run while the old CG is in use.  For example:


Log in as the system user and set:

contractgen    contractgen.serviceurl = http://xx.xyz.yyy/Wordservice/Wordservice

In the system context:

contractgen    contractgen.wordservice.enabled = TRUE

3)     Run the converter script via DBimport

4)     If step 3 is successful, set the system property back to FALSE:

     contractgen    contractgen.wordservice.enabled = FALSE



Common Errors and how to address them

1)     Running the conversion script via DBimport fails

a.     If you see the following errors in the logs, you may need to apply a known workaround: run the script through an explicitly called script.  Details are found below in the section on how to run the script.



Errors in logs

Problems were encountered processing script convert_doc_to_docx.xml at com.sap.odp.install.engine.DbsetEngine.execute(DBsetup.engine.java)


2)     The explicitly called script is not found

a.     You may need to import the following workbook.  Under Setup -> System Setup -> Scheduled Task Type, check whether the Explicitly Called Script task type is shown when using the "All Inactive Scheduled Task Types" query.  If so, edit it and set it to active.

b.     If it's not there either, it should be in the "scheduled_task_types" sheet of the appropriate upgrade workbook in the reference guide (e.g. "Enterprise Upgrade Workbook - 7.0.08").  You can take that data and import it manually into the system (just that one sheet/row).

c.     Customers with the multitenant option turned on may not see this task type.  Turn multitenancy OFF for the duration of the script run, then turn it back on afterwards.


The significance of a context in an Explicitly called script:

Unlike other scripts, which execute automatically based on their configured context, explicitly called scripts do not have a context of their own.  The context is taken from the session from which you log in.  For example, if you log in as an enterprise user, that user's context is taken from the session.  Explicitly called scripts cannot be run at the system login level, but they can be run at the enterprise and buyer levels.


Here is some more information from our HELP



In the example below, if you intend to run the .doc to .docx converter at the system context level, you will instead have to log into each of the three enterprise contexts and run the conversion there, because the converter cannot run at the system context level; the context is always taken from the session.





General Instructions on how to run this script



Once you have completed the prerequisites described in the previous section (Roadmap to transitioning from BDC on ME32K to using RFC for service outline agreements) and created a mapping program to map the input parameters of RFC BBP_ES_OA_UPDATE, you can proceed with the following steps to update contract data in the purchase order header (EKKO) and condition header (KONH) tables.


All three updates listed below are one-time updates in a system. Once an update has been executed, it never has to be repeated; the RFC will keep the data consistent going forward.


  • Updating EKKO.

          One of the first checks that the RFC performs when it is invoked, for change/update operations, is whether the status of the outline agreement is set to M - Purchasing Document from E-Sourcing (i.e. EKKO-STATU = 'M'). If you have been using BDC to create/update outline agreements, this field is most likely blank, as it is not visible on the ME31K/ME32K screens.


          The idea behind this check is to ensure that the RFC is not used to update outline agreements that were not created by the RFC. I haven't been able to locate an OSS Note that sets the status of the relevant outline agreement records to 'M', so I am updating this value using a direct table update.

        NOTE: To reiterate, this is only needed if you want existing outline agreements created via BDC to work with the RFC; any outline agreements created from scratch using the RFC will not require this update.

  • Updating ESLL.

     When creating a new OA through the RFC, ESLL-EXTROW is copied over to ESLL-EXTERNALID and the two are always kept in sync. However, if you have been relying on the BDC to update service outline agreements, the EXTERNALID field is most likely blank. To update an outline agreement using the RFC, ESLL-EXTERNALID must equal ESLL-EXTROW; if this condition fails, the update will not work.

     In my scenario I created a utility to perform a one-time direct table update that copies ESLL-EXTROW to the ESLL-EXTERNALID field to ensure they stay in sync. Also consider implementing SAP Note 2222611 (BAPI for service PO entry sheet and contract: ESLL-EXTROW - EHP 5).
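The EXTERNALID/EXTROW invariant described above can be sketched as follows (a Java illustration only; `ServiceLine` and its fields are stand-ins for ESLL rows, and the real repair in this blog was a direct table update in ABAP):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical in-memory model of an ESLL row, for illustration only.
class ServiceLine {
    String extRow;      // ESLL-EXTROW, always filled
    String externalId;  // ESLL-EXTERNALID, often blank for BDC-maintained lines

    ServiceLine(String extRow, String externalId) {
        this.extRow = extRow;
        this.externalId = externalId;
    }
}

public class EsllSync {
    // Returns the lines that would block an RFC update, i.e. where
    // EXTERNALID is not equal to EXTROW.
    static List<ServiceLine> findOutOfSync(List<ServiceLine> lines) {
        List<ServiceLine> bad = new ArrayList<>();
        for (ServiceLine l : lines) {
            if (!l.extRow.equals(l.externalId)) bad.add(l);
        }
        return bad;
    }

    // One-time repair: copy EXTROW into EXTERNALID, mirroring the
    // direct table update described in the text.
    static void repair(List<ServiceLine> lines) {
        for (ServiceLine l : lines) l.externalId = l.extRow;
    }
}
```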


  • Updating KONH.

         Unfortunately, every time you add a new date range to conditions for your services, the new condition is created in the KONH table with an incorrect validity period (the old record is not demarcated correctly), as shown in the image below. I am not sure why this happens, but according to my functional counterpart it has always behaved this way, and since the validity period shown in ME32K is pulled from the A081 table (which has the right from/to dates), it has never been a problem.  I haven't explored this issue in detail, but OSS Note 1970016 (BAPI_CONTRACT_CHANGE: Unexpected item condition record validity periods are created) might be useful. I am not sure whether this is a bug or expected behavior, since a new contract created via this RFC creates KONH records with proper validity period demarcation that match the records in the A081 table.


     In any case, if the date ranges in the KONH table are incorrect for the service conditions of any outline agreement, an update using the RFC will not work for those objects. As with the EKKO update, I went with a direct table update of the date range in KONH, copying the correct validity period from the A081 table. The screenshot below shows the KONH entry after the table update (it matches A081 from the image above).


   The hierarchy diagram below should help you create a simple update program that corrects the KONH validity periods by updating them with the periods from the A081 table.


  • Once the prerequisites from the previous blog are implemented and the existing contract records in the system are updated as explained above, you should be ready to use RFC BBP_ES_OA_UPDATE and replace the BDC.
  • Please keep in mind a major limitation of using the RFC: for a change operation, you need to pass all the data present in the outline agreement. Take the following outline agreement scenario.





Item   Line   Description
0010   0010   Service item 10 for Plant1
0010   0020   Service item 20 for Plant1
0020   0010   Service item 10 for Plant2



Item   Line   Valid To     Valid From   Description
0010   0010   12/31/9999   05/25/2016   Condition 1 for Service item 10, line item 10
0010   0010   05/24/2016   01/01/2015   Condition 1 for Service item 10, line item 10
0010   0020   12/31/9999   01/01/2015   Condition 2 for Service item 20, line item 10
0020   0010   12/31/9999   05/25/2016   Condition 1 for Service item 10, line item 20
0020   0010   05/24/2016   01/01/2015   Condition 1 for Service item 10, line item 20


In the above scenario, if your requirement is to add only a new validity period (05/25/2016 to 12/31/9999) for the highlighted condition above, you still need to pass all other conditions, for all crafts and all line items, even if they are not changed. Any record that is not passed to the RFC through the interface will be DELETED.
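The delete-on-omission behavior can be sketched like this (a Java illustration with hypothetical record keys and values; the real interface is the RFC's TABLES parameters, not these methods):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RfcPayload {
    // Simulates the contract described in the text: after the update, the
    // agreement contains exactly the records in the payload. Any existing
    // record you omit from the payload is deleted.
    static Map<String, String> applyUpdate(Map<String, String> existing,
                                           Map<String, String> payload) {
        return new LinkedHashMap<>(payload);
    }

    // The safe way to add one new validity period: start from ALL existing
    // records, then lay the change on top.
    static Map<String, String> buildFullPayload(Map<String, String> existing,
                                                Map<String, String> changes) {
        Map<String, String> payload = new LinkedHashMap<>(existing);
        payload.putAll(changes);
        return payload;
    }
}
```

Passing only the changed records (without the customization mentioned later in this blog) would wipe out everything else, which is exactly the trap described above.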

This is a major shortcoming of the RFC approach compared to the BDC. The following are the expected timings based on line items, as per the E-Sourcing community blog (SAP Sourcing).


In our test system, we noticed the update takes up to 28-30 minutes for an outline agreement with 12-14 plants, each with 20 services and 2 or 3 validity periods, each with 26 conditions. This kind of performance is unacceptable in our scenario, so we have customized RFC BBP_ES_OA_UPDATE further to pass only changed records. If you face the same performance bottleneck, continue to the next part of this blog for further details.


Before proceeding with the steps outlined in this blog, I would highly recommend getting familiar with the attached SAP Sourcing integration guide (or get the latest one from the SAP Support Portal). The majority of the prerequisites are listed in this document. Moreover, RFC BBP_ES_OA_UPDATE is extremely slow for updates, as it needs the entire outline agreement for processing even if you are not changing most of it (more details in part 2: Roadmap to transitioning from BDC on ME32K to using RFC for service outline agreements - Part 2).



I must say the process to transition away from BDC is anything but straightforward or intuitive and for us it involved tons of regression testing to finally get the exact same behavior as the BDC.

  1. Activate switch MM_SFSW_P2PSE using transaction SFW5. This switch is part of business function LOG_MM_P2PSE_1, which forms the backbone of the integration with SRM/SAP CLM. I recommend consulting your software licensing department to ensure turning on this switch won't have any unwanted licensing implications.
  2. Implement the following OSS Notes (depending on your current ERP version). Many of these notes have side effects, so you also need to apply the note that resolves the side effect; hence they go in pairs. These were valid for our ERP system (NetWeaver 7.31 EhP 5); you might need to apply additional or fewer notes based on your current system version.
    OSS Note   Description
    1769023    MEOUT 005: Item category 9 not supported
    1811870    Heterogeneous behavior when publishing a Master Agreement
    1834156    CCM: SE 377 during the creation of purchase order
    1866215    ATC errors in MM-PUR application
    1953816    S&041: External currency amount too big - unable to convert
    2002974    CCM: Service item contains wrong condition information
    2015642    VK 022: Inbound SOA failed with service items(s)
    2020259    CCM: Incorrect condition item index for the supplement condi
    2023829    Historic condition information are lost when CLM contract is
    2032941    Header price conditions are not created for SAP Sourcing Master
    2127266    CCM: Central contract change fails with an error message in
    2142004    CCM: Scales not created for supplement conditions
    2142245    Condition update for Info record result in program termination
    2155192    CCM: Condition not updated for the newly added service items
  3. Perform the following configuration Steps as outlined in the configuration guide.
    • Create Cross-System Company codes and Business Area (Step 3.2.2)
      • In transaction SALE. Choose: Modeling and implementing Business Processes>Global Organizational Units>Cross-System Company codes
      • In transaction SALE: Choose: Modeling and implementing Business Processes>Global Organizational Units>Cross-System Business Areas
    • Create a Pseudo Vendor
      • In transaction MK01 create a pseudo vendor. You must pick an account group that allows an external number range.
      • In customizing (SPRO)>Integration with Other mySAP.com components>E-Sourcing>Settings for E-sourcing Integration enter the vendor from step1.
    • Create Text IDs for SAP ERP (5.1.5 in config guide)
      • Header Texts (In SAP ECC ): SPRO>Purchasing>RFQ/Quotation>Texts for RFQs/Quotations>Define Text Types for Header Texts
      • Item Texts (In SAP ECC): SPRO>Purchasing>RFQ/Quotation>Texts for RFQs/Quotation>Define Text Types for Item Texts
    • Block Integrated fields from user changes in SAP ERP ( 5.1.6 in config guide)
      • Import field selection keys ESSA and ESCO with their related settings into your system using a Business Configuration (BC) set: in transaction SCPR20, enter BC set BBP_ES_CONTRACT_FIELD_CONTROL and activate it.
    • If implementing RFx scenarios check sections ( 5.1.7 and 5.1.8)
  4. The RFC BBP_ES_OA_UPDATE encapsulates the logic for creating outline agreements with service line items and service conditions. The logic in the RFC has checks related to the steps outlined above, so trying to use it directly without performing those steps will not work. If you have experience with using BAPI_CONTRACT_CREATE, you should be familiar with the RFC update presented below.

          Note the TABLES parameters highlighted below; these are used to pass service information (services, texts, conditions, validity periods) during outline agreement creation.


  SRV_LINE corresponds to data in the ESLL table.

  SRV_COND_VALIDITY corresponds to data in the A081 table.

  SRV_COND corresponds to data in the KONH table.


Logically, when preparing data for calling the RFC, SRV_LINE should be filled first, followed by SRV_COND_VALIDITY, then SRV_COND. The attached document shows the mapping between the structure fields (of the RFC parameters noted above) and the SAP tables. I did not use SRV_TEXT and SRV_COND_SCALE_QUAN, but it should not be very difficult to figure out how they are mapped.
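The fill order can be sketched as a small builder that enforces it (a Java illustration with made-up method names; the real call fills the RFC's TABLES parameters in ABAP or via a connector):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the fill order described above: SRV_LINE first, then
// SRV_COND_VALIDITY, then SRV_COND. Method and field names are illustrative.
public class OaCallBuilder {
    private final List<String> srvLine = new ArrayList<>();          // ESLL data
    private final List<String> srvCondValidity = new ArrayList<>();  // A081 data
    private final List<String> srvCond = new ArrayList<>();          // KONH data

    OaCallBuilder addServiceLine(String line) {
        srvLine.add(line);
        return this;
    }

    OaCallBuilder addValidityPeriod(String period) {
        if (srvLine.isEmpty())
            throw new IllegalStateException("Fill SRV_LINE before SRV_COND_VALIDITY");
        srvCondValidity.add(period);
        return this;
    }

    OaCallBuilder addCondition(String cond) {
        if (srvCondValidity.isEmpty())
            throw new IllegalStateException("Fill SRV_COND_VALIDITY before SRV_COND");
        srvCond.add(cond);
        return this;
    }

    int[] counts() {
        return new int[] { srvLine.size(), srvCondValidity.size(), srvCond.size() };
    }
}
```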


The next part  will cover the following topics:

  • How to prepare existing outline agreements for update using BBP_ES_OA_UPDATE
  • Enhancement to resolve issues arising from trying to update existing outline agreements
  • Enhancements to improve performance when using the new RFC for outline agreement update.



    Integration Guide ( https://websmp103.sap-ag.de/~sapidb/011000358700000270922012E ).

The Global Search feature in SAP Sourcing allows you to find a document by searching on its attributes.  For example, you can find a contract document using supplier, contact name, and item number.



Review: Security, Configuration settings added in 9 SP 22 / 10 SP 6


You can use security settings to control which users are able to use Global Search. Starting in these SPs, the System group has a Global Search permission setting.  As delivered, all users have access to Global Search.


If you only need Global Search capability for certain documents, you can limit Global Search by class ID using the system property userinterface$globalSearch.classids.
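As a sketch, the property would be set to the class IDs you want searchable (the placeholders below are not real values, and I am assuming a comma-separated list; check the Configuration Guide for the exact format and your system for the actual class IDs):

```
userinterface$globalSearch.classids=<classid1>,<classid2>
```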


For further information about these features, see the Configuration Guide for your release level at http://help.sap.com/sourcing.


New: Reduce Global Search footprint with SAP Sourcing 9 SP 23 / 10 SP 8


The Global Search capability requires an index of document data.  To maintain this, each time you create or update a global-search-enabled document, an entry is created in the XML Extract Queue table.  The index is updated by the scheduled task "Search XML Extract," which processes the extract queue and updates the index with the new/changed data.


As originally implemented, the XML file includes almost all of the persisted data for a document.  In SAP Sourcing versions earlier than 9 SP 23 or 10 SP 8, the XML for Global Search was causing database administrators to add more and more disk space to support the sourcing application.  In these recent releases, significant changes were made to the preparation of the XML file to reduce the amount of data included, without negatively affecting the usefulness of the feature.


  • Several field types were removed from the XML output file, including BOOLEAN, INTEGER, DECIMAL, and FLOAT.
  • Optimizations were made to remove data when the data was for internal use or had little value for search.  Examples of these are the line item formula results, which store a calculated value for a mathematical expression, and Supplier Entered Attributes on line items, when the supplier did not enter a value.
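The kind of filtering described above can be sketched as follows (a Java illustration of the idea only, not SAP's actual implementation; the method and its signature are hypothetical):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class ExtractFilter {
    // Field types dropped from the Global Search XML, per the optimization
    // described in the text.
    static final Set<String> EXCLUDED_TYPES =
        new HashSet<>(Arrays.asList("BOOLEAN", "INTEGER", "DECIMAL", "FLOAT"));

    // Decide whether a field makes it into the search XML: excluded types are
    // dropped, as are empty values (e.g. supplier attributes never filled in).
    static boolean includeInXml(String fieldType, String value) {
        if (EXCLUDED_TYPES.contains(fieldType)) return false;
        if (value == null || value.isEmpty()) return false;
        return true;
    }
}
```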


These optimizations are expected to reduce the XML size by 50% or more; to reduce the SQL activity during extract processing by 50% or more; and to shorten the Index Refresh processing time by 70% or more.


Some of this improvement happens automatically as part of the upgrade.  The XML for three document types, however, is potentially very large:


  • Master Agreements
  • Agreements
  • RFx Responses


Therefore, to minimize the risk of disruption to daily operations, the re-creation of the index data specifically for these three document types is reserved as a process requiring some manual steps.


To gain the full benefit of the optimization, see SAP Note 2251373 for further description of the global search improvements and for details on planning and executing the related manual post-upgrade steps.

In this blog, I will explain how to add predefined web links for all users. There may be a requirement to add web links that are common to all users, and it would be difficult and time-consuming for each user to add them manually. Instead, we can add them by using the Enterprise ID to modify the Workbench Page Template.


Note that this template applies only to new users created after the change is made, not to existing users in the system. If an existing user requires the change, the web links must be added manually, either by the user or by the Enterprise ID modifying that particular user's workbench page template.


Let me show you the steps for addition of web links to workbench page template:

  • Log in to the system with the Enterprise ID


  • Click on Workbench Page under the category Workbench
  • Select the query All Workbench Page Templates in the query dropdown


  • Select the desired Home option. In our case, it is the Home default template for Buy-side


  • Select Layout and add Links tab to the channel list available


  • Add the set of links you require in the Links tab. As an example, we have added Google.com


  • Click on Done and Save the Workbench Template.


Users added to the system after the template is changed will see the Links tab with the added links. Users already existing in the system have to navigate to their own workbench page template and modify it in the same way to get the options.

Today we'll try to provide a (better) alternative to one cumbersome SAP standard process - deploying custom jars to SAP Sourcing.


I've always wondered who would go through such pain just to add one .jar to the classpath, when even standard Java has better options. And why?

For example, my code breaks about 100 times before it's ready to be fully tested. That means I would need about a year of BASIS time just to do 100 repacks and redeploys to test one piece of code.


So, let's keep it simple:



STEP 1 - You'll obviously need a jar. Nothing better than a Hello World.
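A minimal class for the jar could look like this (the class name and the getWorldStatus method match the snippet in STEP 3; the return string is my own choice). In the real jar the class lives in package com.tnd.eso.addons.playground, matching the import in STEP 3; the package line is omitted here so the sketch compiles standalone:

```java
public class HelloWorld {
    // Called from the toolbar script in STEP 3. The message is arbitrary;
    // it just proves the jar made it onto the classpath.
    public String getWorldStatus() {
        return "Hello World - the jar is on the classpath!";
    }
}
```

Compile it and pack it with the standard jar tool, e.g. `jar cf HelloWorld.jar com/tnd/eso/addons/playground/HelloWorld.class`.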



STEP 2 - Copy the jar file anywhere on the Sourcing server.




STEP 3 - That's ALL there is. We just need to use it now, like this:


// load jar before imports - this is where BeanShell beauty comes in
this.interpreter.getClassManager().addClassPath(new File("/usr/sap/xfer/CLM/btoma/Import/Path/HelloWorld.jar").toURI().toURL());
// @bogdan.toma
import com.tnd.eso.addons.playground.HelloWorld;
HelloWorld helloWorld = new HelloWorld();
throw new ApplicationException(session, helloWorld.getWorldStatus());



>>>>> result from testing STEP 3 on a toolbar script:



Indeed the world is doomed if we have to follow that 'custom jar deployment' process.




Bogdan Toma


Check out some other cool topics below.


SAP Sourcing scripts - editing and source control maintenance

[ANN] ScriptsRepo - Deploy Tool for SAP Sourcing Scripts

ScriptsRepo is a free and open source application for deploying script definitions to SAP Sourcing. It is designed to aid the development of SAP Sourcing scripts in a controlled and collaborative process. ScriptsRepo is closely integrated with git and uses the commit hash for versioning scripts.

In the 21st century, one should never have to say things like:

- what version is this script?

- what did I actually change here 6 months ago?

- where did my code snippet go?

- damn, I forgot to take a backup !!!

- and many more ...



Once you've gone through this document, all you'll ever need to do to get your script definitions into a SAP Sourcing system is:

  • java -jar scriptsrepo-0.4.1.jar --deploy










Note: this document is intended as a placeholder for the release management and high-level details of the application. If you are not comfortable with the details below, please go through the tutorial provided.




  1. Scripts maintained locally, as recommended in: SAP Sourcing scripts - editing and source control maintenance .
    • File name should be script's EXTERNAL_ID or DISPLAY_NAME, plus your desired file extension.
  2. SFTP access to your SAP Sourcing (DEV) application server
  3. 4 (four) directories created on SAP Sourcing app server (to be used for Data Import Scheduled Task)
    • /.../Upload
    • /.../Queue
    • /.../Archive
    • /.../Data (note the extra folder; this one is not directly used by the scheduled task)
  4. Data Import Scheduled Task configured on the Upload/Queue/Archive directories above (set it to a high frequency, under 5 minutes)




Installation & Usage

  • Download the app using the link above for latest release.


Execute this step at first usage and every time your script metadata changes (e.g. adding new scripts or inactivating old ones). In plain words: if you manually change anything other than the script code in SAP Sourcing, you'll need to re-import.

  • Extract an .oma file with all script definitions (save it as allscripts.oma in the same directory as the jar)
    • easiest way - use export option Object List and select FCI-ScriptDefinition-OML
  • Open Command Prompt (Windows) or Terminal (Mac/Linux), navigate to the download directory, and execute
    • java -jar scriptsrepo-0.4.1.jar --import



The initialization process above generates a config.properties file in the same directory at first usage (if the file already exists, it will not be overwritten).

  • config.properties options
    • Version source:
      • GIT    >> If git is used as SCM. The app will set the script version to the git commit hash.
      • LOCAL  >> If no SCM is used, only local files. The tool will set the script version to the last-modified date of the script's local file.
    • REPOSITORY_DIR       >> Location of script files (on Windows, use forward slashes for directory separation)
    • REPOSITORY_FILE_ID   >> Metadata of the script definition, used to link the script with the local file (e.g. EXTERNAL_ID, where the local file is named EXTERNAL_ID plus the file extension)
    • DATA_FILE_EXTENSION  >> Extension used for script files (e.g. .java)
    • ESO_DATA_DIR         >> 'Data' directory on the SAP Sourcing app server, from the prerequisites (where script data files are published)
    • ESO_UPLOAD_DIR       >> 'Upload' directory on the SAP Sourcing app server, from the prerequisites (where the script import metadata XML is published)
    • TRANSPORT_PROTOCOL:
      • DUMMY  >> Test protocol; does no transport, just displays the resulting metadata XML on the console.
      • SFTP   >> SSH File Transfer Protocol
    • HOST  >> SSH host (irrelevant when TRANSPORT_PROTOCOL is DUMMY)
    • PORT  >> SSH port
    • USER  >> SSH username
    • PASS  >> SSH password



Execute the deploy action whenever you want to publish your local scripts to SAP Sourcing. The ScriptsRepo app will handle the rest.

  • java -jar scriptsrepo-0.4.1.jar --deploy




I am interested in hearing your comments, ideas, hints, questions, code contributions, anything that you could think of.

I'm planning to keep this project active, if only to make my life a lot easier when implementing SAP Sourcing enhancements.


Some things are missing, and could be enhanced (if SAP agrees to fix some issues)

  • Explicitly Called Scripts - disabled
    • can't work because the SAP standard importer is broken
  • Workflow Scripts
    • no option yet to import; I'm investigating the use of V10 webservices to incorporate this





ScriptsRepo is created and maintained by Bogdan Toma and is released under GPLv3.

Scripts are, in short, one of the winning points of SAP Sourcing. As much as I appreciate the technology behind them (being a core Java dev myself), they leave huge gaps when it comes to a methodology for development and maintenance.


The methodology in this case translates to either editing scripts directly in SAP Sourcing (yes, some people are masochists) or low-level copy-paste. I cannot stress enough the issues that can arise from this.


This blog post will cover my recommendations for scripts on:

  • Editing - editor, formatter
  • Maintaining - source control maintenance, git, SourceTree app




Editing SAP Sourcing Scripts


When I’m not coding in Eclipse, my editor of choice for SAP Sourcing scripts (for Windows) is SynWrite (download here).


After installing SynWrite, we’ll add two plugins for it:

  • Java lexer
  • Code formatter


Adding java lexer

    • Open SynWrite and follow menu Options -> Add-ons manager -> Install
    • Search for Lexer: java and double click, confirm installation.
    • ***Optional*** - Configure java lexer to parse .bsh files (if you prefer to keep your scripts with .bsh extension instead of .java)
      • Options -> Customize lexers library -> Java -> File extensions: java jav bsh


Installing code formatter plugin

    • The formatter I'm using is a customised jsbeauty plugin.
    • To install, simply open the formatter zip file plugin.Alexey.extended.JsFormat.zip (download here) in SynWrite via File -> Open and confirm installation.
    • Format your code using Ctrl+Alt+F or Plugins -> JS Format -> Format




Maintaining SAP Sourcing Scripts

Now that we have an editor that keeps the formatting consistent, and so will not create noise in our SCM, my recommendation is to:


  • Download all your scripts from SAP Sourcing, and maintain them locally on your system
    • Use either script's EXTERNAL_ID or DISPLAY_NAME as file name (with extension .java or .bsh) for easy identification.


  • Install a SCM tool of your preference and keep your scripts version-controlled
    • My recommendation is to use only the best available: git
    • For an easy-to-use program you can simply download SourceTree which will handle everything for you with an excellent GUI.



The result:




I like SynWrite a lot compared to the alternatives (Notepad++ etc.) because of its good formatter and the neat tree structure that shows all the functions in your script (cool!).







Just compare whatever you are currently using as a methodology (manual backups, .oma extracts, attachment blobs, etc.) with the screenshot below.





If you've followed along to this point, you might have two questions:


  • Can we have code autocomplete and validation like in a full-fledged IDE?
    • NOT in SynWrite or the other editors you might be using.
    • YES, in Eclipse or NetBeans; I am using such options within Eclipse. BUT coding BeanShell in Eclipse is (for the moment) way too complicated to qualify for a public release or recommendation.


  • What's the point of all this if it doesn't solve the initial problem: having to copy-paste scripts from local files back into SAP Sourcing?
    • This post aims to provide phase one (the basis) of script management: getting the files, in a coherent format, into a version control system.
    • In the next couple of days (stay tuned) I will release phase two: a tool that automatically deploys your scripts from your SCM into Sourcing. So make sure you get your scripts into local files under version control as soon as possible.



Bogdan Toma

In this blog, I am going to show how you can restrict access to queries and reports. The same applies to query groups if you want to restrict access to a group of queries.


Access can be given to users, groups, or companies. There is one precondition: at least one person in the access list must have read/write access, because someone needs to be able to edit the object; otherwise this method will not work.


Let me show you the steps for the same.


  • Select any query/report to restrict. In this example, I am using a test query that I have created.



  • Go to the Access List tab and enter the list of collaborators, i.e. users/groups/companies. Here we are giving access only to myself, as shown below.



  • Click on Save and then Done to complete the changes.


This way, users/groups/companies other than those in the access list will no longer have access to the object. They will get an error message saying they do not have access to that particular object.



Release 10.0 SP08 of the SAP CLM ERP Integration - now available on SAP Service Marketplace - provides new standard functionality to support multiple Price Validity Date Periods on CLM Agreements published to ERP as Outline Agreements.


For additional details on this new feature, please refer to SAP Knowledge Base Article 2231435 - Price Validity Feature:



Prior to SAP CLM Release 10.0 SP08, creating multiple Price Validity Date Periods on an ERP Outline Agreement published from CLM required applying custom or standard SAP Notes or performing custom development.  Although this resulted in ERP Outline Agreements having multiple Price Condition Validity Date Periods on a line item, the corresponding CLM Agreement line item had only one Price Condition Validity Date Period: an out-of-sync condition.


This article provides an overview of the upgrade support in SAP CLM Release 10.0 SP08 to migrate (i.e., synchronize) the Outline Agreement's multiple Price Validity Date Periods to its corresponding CLM Agreement, so that after the migration, the two documents' Price Validity Date Periods data matches exactly.


Migration Process




The new migration tool executes as a two-step process:

  1. New ERP Extract of Outline Agreements Price Validity Date Periods data to an XML file.
  2. CLM import of the XML file.



The ERP Extract program provides flexible options for selecting single Outline Agreements, ranges of Outline Agreements, or all Outline Agreements.  It will only extract Outline Agreements that have been published from CLM Agreements.


Analysis Phase - Warning Mode Extract/Import

To understand which Price Validity Date Periods mismatches exist, the extract/import can first be run in "warning mode".  This allows the mismatches to be reviewed first, in order to verify that the Outline Agreement's multiple Price Condition Validity data is correct, as opposed to the CLM Agreement having unpublished price validity updates that should be published to ERP before the synchronization update occurs from the Outline Agreement.


The output of this CLM import is a report identifying the out of sync Outline Agreements/CLM Agreements.


Synchronization Phase - Update Mode Extract/Import


For any out-of-sync Outline Agreements/CLM Agreements, once it has been determined that the Outline Agreement contains the current and correct Price Validity Date Periods data that should be on the CLM Agreement's new Price Validity Date Periods collection, the extract/import can then be run in "update mode" to synchronize the two documents.


The output of this CLM import is a set of updated CLM Agreements whose Price Validity Date Periods data matches exactly what is on the ERP Outline Agreement, as well as a report listing all of the CLM Agreements synchronized.


Note that after the upgrade, manual synchronization is always available as an alternative to this automatic synchronization: the multiple Price Condition Validity Date Periods can be updated directly in the CLM Agreement UI.


Additional Information


For additional details on this new upgrade migration capability, please refer to SAP Knowledge Base Article 2231437 - Price Validity Upgrade Guide.




After completing a successful round of customer validation testing, in which customer feedback was very positive, Release 10.0 SP08 of the CLM ERP Integration provides new standard functionality to support multiple Price Validity Date Periods on CLM Agreements published to ERP as Outline Agreements.  SP08 is now available for download on SAP Service Marketplace.


The design goal of this new feature was to replicate in CLM, as much as possible, the standard behavior and use cases supported by SAP ECC's Outline Agreement Price Condition Validity Date Periods functionality, in order to enhance the integration between these two systems.


Line Item Support for Price Validity Date Periods


Enhanced Line Items User Interface



Header Support for Price Validity Date Periods


Enhanced Header User Interface



Additional Information


For additional details on this new feature, please refer to SAP Knowledge Base Article 2231425 - Price Validity Feature.



Hello all,


    Is it possible to disable the Import Line Item button on the Line Items tab of a Master Agreement? Any suggestion is welcome.


Thanks a lot.


Workflows tend to be part of the core business in SAP Sourcing, and business requirements will most often transform a 'simple' workflow process into a very complex one.


Since the introduction of workflows in Sourcing (and the TWE 'era'), a specific pattern for scripting workflows has emerged, and it has been carried forward into all projects. Today we are going to change that.



  • Implementing the recommendations in this blog post might make your life easier
  • Throughout this blog post it is assumed that consultants have the following knowledge:
    • Workflow definition / update / scripting
    • Java EE/BeanShell




The typical setup used across projects is fairly similar:

  • Workflow definition with X number of Approval Steps (total maximum possible steps)
  • Approval Matrix Definition maintained *somewhere*
  • Specific Approval Sequence picked up in PRE-PHASE-CHANGE script and added to the document into an Extension Collection
  • Workflow scripting handles the 'next approver' from the document Extension Collection




For each approval step the script is required to check whether the approval status is APPROVED or REJECTED:

  • If APPROVED and the Approval Sequence is complete -> move the document to the 'Approved' phase
  • If APPROVED but the sequence is not complete -> add the next approver
  • If REJECTED -> move the document to 'Draft' (or the previous phase)


Pattern in case of APPROVED for above is:

  • for WF STEP 1:
if (approvalMatrix.size() == 1) {
  // move to Approved
} else if (approvalMatrix.size() > 1) {
  // add next approver
}

  • for WF STEP 2:
if (approvalMatrix.size() == 2) {
  // move to Approved
} else if (approvalMatrix.size() > 2) {
  // add next approver
}

  • etc ... for all steps



WHY IS THAT A PROBLEM? And why is it complicating our lives?

The pattern above implies that:

  • there is one distinct version of PRESCRIPT for EACH workflow step
  • similarly, there will be one distinct version of POSTSCRIPT for EACH workflow step

To maintain such a workflow, an IT consultant will have to:

  • save/maintain each prescript/postscript version individually
  • do a lot of repetitive work for any minor update
  • be VERY careful when doing updates not to disturb the process




The solution is to make scripting GENERIC.

  • Single version of PRESCRIPT for ALL steps
    • the script should be able to 'know' from which step it was executed
  • Single version of POSTSCRIPT for ALL steps
    • the script should be able to 'know' from which step it was executed





  • Step 1: Make sure you have a gate identifier. The solution is to 'decode' the current step from the Activity ID (last digit/digits)

[Screenshot: workflow step definition showing the Activity ID used as the gate identifier]



  • Step 2: Use the following code to identify the gate -> the 'current approval step'
    • The script will extract the trailing digit(s) from the identifier:
      • 'approval_gate_1' = 1
      • 'approval_gate_12' = 12
      • 'approval_gate_1485' = 1485
import java.util.regex.Matcher;
import java.util.regex.Pattern;
private Integer gate = null;
// btoma - Get the step No. from the gate NativeID - last digit(s)
// btoma - to facilitate a single PRESCRIPT on all steps
void getGate() {
  Pattern p = Pattern.compile("\\d+");
  Matcher m = p.matcher(nativeName);
  while (m.find()) {
    gate = Integer.valueOf(m.group()); // keep the last numeric match
  }
}
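As a quick sanity check, the regex extraction above can be exercised outside Sourcing with plain Java (the class and method names below are hypothetical; in the real workflow script, nativeName comes from the gate's Activity ID):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GateDemo {
    // Return the last run of digits in the identifier, or null if there are none
    static Integer extractGate(String nativeName) {
        Pattern p = Pattern.compile("\\d+");
        Matcher m = p.matcher(nativeName);
        Integer gate = null;
        while (m.find()) {
            gate = Integer.valueOf(m.group()); // each iteration overwrites, so the last match wins
        }
        return gate;
    }

    public static void main(String[] args) {
        System.out.println(extractGate("approval_gate_1"));    // 1
        System.out.println(extractGate("approval_gate_12"));   // 12
        System.out.println(extractGate("approval_gate_1485")); // 1485
    }
}
```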

  • Step 3: Update the WF script code to use gate instead of hardcoded numbers
if (approvalMatrix.size() == gate) {
  // move to Approved
} else if (approvalMatrix.size() > gate) {
  // add next approver
}
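To convince yourself that the generic check reproduces the hardcoded per-step pattern for every gate, the decision can be simulated standalone (plain Java; the names are hypothetical, and the real script would move phases via the Sourcing API instead of returning a string):

```java
public class GenericGateDemo {
    // Decide the next workflow action for a given approval matrix size and gate number
    static String nextAction(int matrixSize, int gate) {
        if (matrixSize == gate) {
            return "MOVE_TO_APPROVED";  // this gate is the last approver in the matrix
        } else if (matrixSize > gate) {
            return "ADD_NEXT_APPROVER"; // more approvers remain after this gate
        }
        return "NO_OP";                 // gate beyond matrix size: step not used for this document
    }

    public static void main(String[] args) {
        System.out.println(nextAction(3, 3)); // MOVE_TO_APPROVED
        System.out.println(nextAction(3, 1)); // ADD_NEXT_APPROVER
        System.out.println(nextAction(2, 3)); // NO_OP
    }
}
```

The same two-line condition therefore works unchanged at every step, which is exactly what makes the copy-paste in Step 4 safe.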

  • Step 4: Copy-Paste your PRESCRIPT on rest of WF steps


  • Step 5: Repeat 1-4 for POSTSCRIPT




  • Maintain scripts locally, in a source control system (SVN, Git).
  • Perform changes locally, and copy-paste your script to all WF steps in one go.
  • Don't forget about logging and error reporting.


The end.

Bogdan Toma

As promised, this is the sequel to Working with FTP folders in SAP Sourcing 9.x


SAP Sourcing 10 introduces the FileTransferConfiguration BO which allows for a "more configuration, less coding" approach in working with FTP folders.


**WARNING** - this blog post will make use of Internal APIs. Even though I have tested and confirmed that they work in 10.x versions, great care should be taken when implementing and supporting these APIs.

  • Prerequisites
    • Define your File Transfer Configuration (FTP location for file exports)

[Screenshot: File Transfer Configuration definition (FTP location for file exports)]


  • Required imports
import java.io.File;
import com.sap.odp.common.platform.HomeLocator;
import com.sap.odp.doccommon.filetransfer.FileTransferConfigBo;
import com.sap.odp.util.filetransfer.FileTransferClientIfc;


  • Prepare file
    • For example, let's extract the first attachment from the document
File exportFile = doc.getAttachments().get(0).getAttachment().getFileData(session);


  • Write file to FTP
ftcHome = HomeLocator.lookup(session, "odp.doccommon.FileTransferConfig");
FileTransferConfigBo ftCfg = ftcHome.findByExternalId("EXPORT_EXT_SYSTEM");
FileTransferClientIfc ftc = ftCfg.getClient();
try {
     // if you plan to export multiple files in a loop, do it here, i.e. build your FTC client only once
     ftc.putFile(exportFile, ftCfg.getUrl() + exportFile.getName());
} finally {
     ftc.close(); // always make sure you close the client
}


Bogdan Toma

One of the recurring inquiries in SAP Sourcing has been the use of FTP within scripts. While this might be a simple task in plain Java, the biggest problem is that Sourcing 9.x doesn't include the all-too-familiar Apache Commons Net FTPClient.


In this blog post I will explain how to make use of the internal functions available in SAP Sourcing 9.x to facilitate working with FTP folders.


**WARNING** - this blog post will make use of Internal APIs. Even though I have tested and confirmed that they work in 9.x versions, great care should be taken when implementing and supporting these APIs.


  • Prerequisites:
    • Check and confirm that your FTP integration System Properties are defined and access is granted to respective directory.

[Screenshot: FTP integration System Properties]



The fun part: scripting


  • Import some classes
import java.io.*;
import java.util.*;
import com.sap.eso.sapintegration.util.SapIntUtil;
import com.sap.odp.doc.integration.IntegrationConfig;


  • Define your FTP Server
String FTP_SERVER = "";
String FTP_USER = "MyFtpUsername";
String FTP_PASSWORD = "MyFtpPassword";
String FTP_OUTBOUND_DIRECTORY = "my/outbound/directory";


  • Prepare your file
    • Option 1: you already have an AttachmentIfc you want to push to the FTP location
File exportFile = attachmentIfc.getFileData(session);
    • Option 2: build the file yourself, e.g. from an XML string
String xmlString = "sample XML text";
tempDirectory = IntegrationConfig.getEnterpriseProperty(session, "ftp.temp_dir");
File exportFile = new File(tempDirectory + File.separator + "my_file.xml");
Writer writer = new OutputStreamWriter(new FileOutputStream(exportFile), "UTF-8");
writer.write(xmlString);
writer.close();
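The file-building path is plain Java I/O; stripped of the Sourcing-specific property lookup, it can be tried standalone (the file name and payload are just placeholders, and the system temp directory stands in for the "ftp.temp_dir" property):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.nio.file.Files;

public class ExportFileDemo {
    // Write a UTF-8 payload to dir/name and return the resulting File
    static File writeExport(String dir, String name, String payload) throws Exception {
        File exportFile = new File(dir + File.separator + name);
        Writer writer = new OutputStreamWriter(new FileOutputStream(exportFile), "UTF-8");
        try {
            writer.write(payload);
        } finally {
            writer.close(); // always release the stream, even on failure
        }
        return exportFile;
    }

    public static void main(String[] args) throws Exception {
        File f = writeExport(System.getProperty("java.io.tmpdir"), "my_file.xml", "sample XML text");
        // Read it back to verify the contents round-trip
        System.out.println(new String(Files.readAllBytes(f.toPath()), "UTF-8"));
    }
}
```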


  • Finally, write your File to FTP
SapIntUtil.instance().ftpFile(session, exportFile.getParent(), FTP_OUTBOUND_DIRECTORY, FTP_SERVER, FTP_USER, FTP_PASSWORD, exportFile.getName());


**PS** For SAP Sourcing 10.x, a different post will follow, because the changes introduced there allow for a better implementation using the File Transfer Configuration.


Bogdan Toma

