In this blog I am going to share what I have learned about invoice tolerance limits. Understanding them helps in understanding invoice blocking techniques and provides a base to apply in real situations, depending on the client's need.


This document will help consultants who are going to implement OpenText Vendor Invoice Management (VIM).


In the Procure-to-Pay (P2P) life cycle, the procurement part ends when the Accounts Payable (AP) processor/invoice clerk posts the vendor invoice in SAP using the MIRO transaction, which is also called Logistics Invoice Verification (LIV). It is a tedious job for the invoice clerk to manually verify that each and every invoice line item conforms to the agreed price or quantity in the PO. SAP provides a systematic way of detecting such discrepancies and blocking the invoice for payment, using a two-character key called a "Tolerance key".

Lower and upper tolerance limits for all possible discrepancies can be maintained per tolerance key. In this blog I will try to explain all the invoice tolerance keys with examples.

There are two kinds of invoice matching in SAP, both controlled by tolerance keys.

  • 3 way match:

The invoice line item is checked against the corresponding purchase order and goods receipt document items for price and quantity matching.


  • 2 way match:

The invoice is checked only against the PO price/quantity if no goods receipt is planned.


Let us understand how the automatic block works via tolerance keys. SAP provides many tolerance keys; in this part I will discuss only the keys AN, AP, BD, BR, BW, DQ and DW.

Tolerance Limits:

    • SAP tolerance limits work only for the MIRO transaction
    • Invoices posted in FB60 are not subject to the tolerance key limit check
    • The tax amount is not included in the tolerance check
    • Tolerance limits are stored in table T169G


Invoice blocking ways:

In SAP a vendor invoice can be blocked for payment in any one of the following ways.

  1. Automatic block:

Applied only if there is a discrepancy due to a price, quantity or date variance in an invoice,
or if the item amount exceeds the limit maintained in the tolerance key.

  2. Manual block:

The invoice processor can manually block an invoice either at item level or header level.

  3. Blocking through payment term:

An invoice can always be blocked with a particular payment term, even if there is no variance.

  4. Stochastic blocking:

Random blocking of invoices without any variance

  5. Blocking at vendor master level:

An invoice can always be blocked for a particular vendor if a specific blocking key is maintained at vendor master level.

Blocking indicators:

Blocking indicators are available both at header and item level of the invoice document. The system sets this indicator in the document wherever and whenever appropriate.

Header level:

Table: RBKP_BLOCKED                      Field: MRM_ZLSPR

Possible values:

  • A – Automatically blocked due to the existence of blocking reasons
  • S – Stochastically blocked
  • M – Manual payment block set in the header (no blocking reasons)
  • W – Automatically blocked due to entry via Web invoice

Item level:

Table: RSEG                                                  Value = X

Possible blocking reasons (one field per reason; the field is set to X when the reason applies):

  • Blocking reason: price
  • Blocking reason: quantity
  • Blocking reason: date
  • Blocking reason: OPQ (order price quantity)
  • Blocking reason: project
  • Manual blocking reason
  • Blocking reason: amount
  • Blocking reason: quality


1.AN – Amount for an item without order reference


“System checks every line item in an invoice with no order reference against the absolute upper limit defined.”

"Without order reference" means a direct posting to a G/L account or material.


System behavior:

If an invoice line item amount exceeds the absolute upper limit, the invoice is blocked.

Let us understand with below example.


It updates the RBKP table without a header block ('R'), but updates table RBKP_BLOCKED with payment block 'A'.

There are no entries in the RSEG table, since the posting is made directly to a G/L account or material.


Table updates:

  • Tolerance key: AN
  • RBKP (header, ZLSPR): no blocking indicator update
  • RBKP_BLOCKED: A – auto block
  • RSEG (item amount block): no update
  • Auto release: not possible


2.AP – Amount for an item with order reference


“System checks every line item in an invoice with order reference against the absolute upper limit defined.”


  • The item amount check must be activated at company code level – OMRH
  • The item amount check must be allowed for the item category and GR indicator – OMRI

System behavior:

Let us understand with below example


There is no blocking indicator in header table RBKP, but table RBKP_BLOCKED has blocking indicator 'A'.

Blocking indicator RSEG-SPGRS (blocking reason: item amount) is set at item level.

An item amount block must be released manually. Automatic release is not possible, even if we perform either of the following activities:

  • Post subsequent credit
  • Adjust AP tolerance limit


Table updates:

  • Tolerance key: AP
  • RBKP (header, ZLSPR): no blocking indicator update
  • RBKP_BLOCKED: A – auto block
  • RSEG – SPGRS (item amount block): X
  • Auto release: not possible


3.BD – Form small differences automatically


“The system checks the balance of the invoice against the absolute upper limit defined. If the upper limit is not exceeded, the system automatically creates a posting line called Expense/Income from Small Differences, making the balance zero and allowing the system to post the document”


System behavior:

Let us understand with below example

Small difference within tolerance limit:

As per the PO reference the invoice amount is 1000 USD.

But vendor actual invoice copy has amount of 1002 USD.

The AP invoice processor enters the invoice amount as per the vendor's invoice copy, which is 2 USD higher than the PO price.


Since the small difference is within the tolerance limit, the invoice is posted without a block. The difference amount is debited to the small-differences G/L account maintained for transaction event key DIF in OBYC.

               The same rule applies on the lower side as well.

Small difference above the tolerance limit:

PO price: 1000 USD

Vendor invoice amount: 1003 USD


If the small difference is above the tolerance limit, the system does not allow the invoice to be posted and raises the hard error "Balance is not zero". The AP invoice processor can still post the invoice via the menu option Edit -> Accept and Post, provided he/she has authorization object M_RECH_AKZ.


If we post with the above option, the system behaves as follows.

  • The invoice is posted without a block
  • The small-differences G/L account (DIF) is posted
  • The RBKP-MAKZN field (net amount accepted manually) is updated with the difference amount: 3.00
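As a quick illustration (not actual SAP/VIM code), the BD logic can be sketched as below; the 2 USD absolute upper limit is an assumed configuration value consistent with the example:

```python
def bd_small_difference_check(invoice_amount, expected_amount, upper_limit):
    """Mimic the BD tolerance check: if the invoice balance is within the
    absolute limit, the document posts and the difference goes to the
    small-differences (DIF) account; otherwise the hard error
    'Balance is not zero' stops the posting."""
    balance = invoice_amount - expected_amount
    if abs(balance) <= upper_limit:
        return ("POSTED", balance)
    return ("ERROR_BALANCE_NOT_ZERO", balance)

# 1002 USD against a 1000 USD PO with a 2 USD limit posts cleanly;
# 1003 USD raises the hard error unless Accept and Post is used.
print(bd_small_difference_check(1002, 1000, 2))  # ('POSTED', 2)
print(bd_small_difference_check(1003, 1000, 2))  # ('ERROR_BALANCE_NOT_ZERO', 3)
```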



Table updates:

Within tolerance range:

  • Tolerance key: BD
  • RBKP (header, ZLSPR): no blocking indicator update
  • RBKP_BLOCKED: no update
  • RSEG: no blocking reason update

Above tolerance range (posted via Accept and Post):

  • RBKP (header, ZLSPR): no blocking indicator update; MAKZN field updated with the difference amount
  • RBKP_BLOCKED: no update
  • RSEG: no blocking reason update



4.BR: Percentage OPUn variance (IR before GR)


The system calculates the percentage variance using the formula below and compares the variance with the upper and lower percentage tolerance limits:

Variance % = (invoice OPUn qty / invoice OUn qty) / (PO OPUn qty / PO OUn qty) * 100


Prerequisites to simulate this scenario:

  1. No GR-based IV
  2. Variable order unit is activated at material level
  3. Tolerance key DW is maintained with "Do not check" active




Material master:



‘CRT’ is maintained as order unit

‘EA’ is maintained as order price unit in info record 1 EA = 100 USD

PO Details:

PO has been created with CRT as Order unit and EA as Order price unit

PO quantity: 2 CRT = 24 EA

Invoice details:

Invoice is simulated before GR as below

Scenario 1:

Order unit quantity – 2 CRT

Order price unit quantity – 22 EA and amount 2200 USD

Observation: There is no warning message on the variance.

Let us use the above formula to find the variance percentage: (22/2) / (24/2) * 100 = 91.7%.

The difference is 100 - 91.7 = 8.3%, which is within the 10% BR tolerance limit maintained.

Scenario 2:

Order unit quantity – 2 CRT

Order price unit quantity – 21 EA and amount 2100 USD


Observation: There is a warning message as below.


Let us use the above formula to find the variance percentage: (21/2) / (24/2) * 100 = 87.5%.

The difference is 100 - 87.5 = 12.5%, which is more than the 10% BR lower tolerance limit. Hence we see the above warning message, and the invoice is blocked for payment.
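The two scenarios can be reproduced with a minimal sketch of the BR calculation (illustrative only; the 10% limit is the configured tolerance from the example):

```python
def opun_variance_pct(inv_opun_qty, inv_oun_qty, po_opun_qty, po_oun_qty):
    """OPUn quantity per order unit in the invoice, as a percentage of
    the same ratio in the purchase order (the BR reference)."""
    return (inv_opun_qty / inv_oun_qty) / (po_opun_qty / po_oun_qty) * 100.0

def br_blocked(variance_pct, limit_pct=10.0):
    # blocked when the deviation from 100% exceeds the tolerance limit
    return abs(100.0 - variance_pct) > limit_pct

v1 = opun_variance_pct(22, 2, 24, 2)   # scenario 1: ~91.7%, deviation ~8.3%
v2 = opun_variance_pct(21, 2, 24, 2)   # scenario 2: 87.5%, deviation 12.5%
print(br_blocked(v1), br_blocked(v2))  # False True
```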


Table updates:

  • Tolerance key: BR
  • RBKP (header, ZLSPR): no blocking indicator update
  • RBKP_BLOCKED: A – auto block
  • RSEG (item amount block): X
  • Auto release: not possible

5.BW: Percentage OPUn variance (GR before IR)


The system calculates the percentage variance using the formula below, with the goods receipt quantities as the reference, and compares the variance with the upper and lower percentage tolerance limits:

Variance % = (invoice OPUn qty / invoice OUn qty) / (GR OPUn qty / GR OUn qty) * 100


Let us understand with same master data

PO Details:

PO quantity in order unit – 2 CRT

PO quantity in order price unit – 24 EA


GR Details:

GR quantity in order unit – 2 CRT

GR quantity in order price unit – 22 EA


IR Details:

Scenario 1:

IR quantity in order unit – 2 CRT

IR quantity in order price unit – 20 EA

Observation: There is no warning message on the variance.


Variance % = (20/2) / (22/2) * 100

                     = 90.9%

The difference is 100 - 90.9 = 9.1%, which is within the 10% BW tolerance limit maintained.

Scenario 2:

IR quantity in order unit – 2 CRT

IR quantity in order price unit – 19 EA

Observation: There is a warning message as below


Variance % = (19/2) / (22/2) * 100

                     = 86.4%

The difference is 100 - 86.4 = 13.6%, which is more than the 10% BW tolerance limit, and the invoice is posted with a payment block.
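The BW calculation differs from BR only in its reference. A minimal sketch (the 10% limit is the example's configured tolerance):

```python
def bw_variance_pct(inv_opun_qty, inv_oun_qty, gr_opun_qty, gr_oun_qty):
    """Like BR, but the goods receipt (not the PO) is the reference ratio."""
    return (inv_opun_qty / inv_oun_qty) / (gr_opun_qty / gr_oun_qty) * 100.0

v1 = bw_variance_pct(20, 2, 22, 2)   # scenario 1: ~90.9%, deviation ~9.1%
v2 = bw_variance_pct(19, 2, 22, 2)   # scenario 2: ~86.4%, deviation ~13.6%
# blocked only when the deviation from 100% exceeds the 10% limit
print(abs(100 - v1) > 10, abs(100 - v2) > 10)  # False True
```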


Table updates:

  • Tolerance key: BW
  • RBKP (header, ZLSPR): no blocking indicator update
  • RBKP_BLOCKED: A – auto block
  • RSEG (item amount block): X
  • Auto release: not possible


6.DQ: Exceed amount: quantity variance

This tolerance key has both absolute and percentage limits.


If a goods receipt has been defined for an order item and a goods receipt has already been posted, the system multiplies the net order price by (quantity invoiced - (total quantity delivered - total quantity invoiced)).


If no goods receipt has been defined, the system multiplies the net order price by (quantity invoiced - (quantity ordered - total quantity invoiced)).


System behavior:

Absolute limits:

Let us see the system behavior if only absolute values are maintained and percentage limits are marked as “Do not check”.


Upper limit: 100.00                                                       Lower limit: 100.00


Test data :- (GR has been defined)

PO quantity = 100 EA

PO price      = 100 USD

GR quantity = 50 EA

      a) System behavior when invoice quantity is 51

Variance   = PO price x   (quantity invoiced - (total quantity delivered - total quantity invoiced))

                 = 100 * (51 – (50-0))

                 = 100 * (1)

                 = 100

Variance 100 is equal to upper limit 100 and there will not be any warning message.


      b) System behavior when invoice quantity is 52 

Variance = PO price x   (quantity invoiced - (total quantity delivered - total quantity invoiced))

                 = 100 * (52 – (50-0))

                 = 100 * (2)

                 = 200

Variance 200 is more than upper limit 100 and there will be a warning message as below.



When we post the invoice with the above variance, it is blocked for payment.


It updates tables as below

Tolerance key table updates:

  • Tolerance key: DQ
  • RBKP (header, ZLSPR): no blocking indicator update
  • RBKP_BLOCKED: A – auto block
  • RSEG – SPGRM (quantity block): X
  • Auto release: possible – the blocking reason RSEG-SPGRM is deleted when we post a GR for the balance quantity or a credit memo for the excess invoiced quantity

The same rules apply on the lower side as well.

Test data :- (GR has not been defined)

PO quantity = 50 EA

PO price      = 100 USD

GR not possible

     a)System behavior when invoice quantity is 51

     Variance = PO price x   (quantity invoiced - (total quantity ordered - total quantity invoiced))

                 = 100 * (51 – (50-0))

                 = 100 * (1)

                  = 100

     Variance 100 is equal to upper limit 100 and there will not be any warning message.


     b)System behavior when invoice quantity is 52

     Variance = PO price x   (quantity invoiced - (total quantity ordered - total quantity invoiced))

                 = 100 * (52 – (50-0))

                 = 100 * (2)

                 = 200

     Variance 200 is more than upper limit 100 and there will be a warning message as below. Invoice will be blocked for payment and same tables will be updated as above.
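Both DQ cases reduce to one formula; only the reference quantity differs (total delivered quantity when a GR is defined, ordered quantity when it is not). A sketch using the example's assumed 100 USD limit:

```python
def dq_variance_amount(net_price, qty_invoiced, ref_qty, invoiced_so_far=0):
    """Quantity variance expressed as an amount: the net order price times
    the quantity invoiced beyond what is still open. ref_qty is the total
    delivered quantity if a GR is defined, else the ordered quantity."""
    return net_price * (qty_invoiced - (ref_qty - invoiced_so_far))

# PO price 100 USD, reference quantity 50 EA, nothing invoiced so far
print(dq_variance_amount(100, 51, 50))  # 100 -> equal to the limit, no warning
print(dq_variance_amount(100, 52, 50))  # 200 -> exceeds 100 USD, invoice blocked
```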


Percentage limits:

“We can also configure percentage limits for the quantity variance check. In this case, the system calculates the percentage variance from the expected quantity, irrespective of the order price, and compares the outcome with the percentage limits configured.”


Let us see the system behavior with same example if only percentage limits are maintained and absolute values are marked as “Do not check”.


Upper % limit: 10.00                                                     Lower % limit: 10.00


Test data :- (GR has been defined)

PO quantity = 100 EA

PO price = 100 USD

GR quantity = 50 EA

     a)System behavior when invoice quantity is 55

     Variance = (quantity invoiced - total quantity delivered)/ (quantity expected)*100 %

                 = (55-50)/50 * 100 %

                 = (5/50)*100 %

                  = 10 %

     Variance 10% is equal to upper limit 10 % and there will not be any warning message.


     b)System behavior when invoice quantity is 56

     Variance = (quantity invoiced - total quantity delivered)/ (quantity expected)*100 %

                    = (56-50)/50 * 100 %

                    = (6/50)*100 %

                    = 12 %

     Variance 12% is more than upper limit 10% and there will be a warning message as below. Invoice will be blocked for payment and same tables will be updated as above.
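The percentage check is independent of the order price; a minimal sketch of the calculation above:

```python
def dq_variance_pct(qty_invoiced, qty_delivered, qty_expected):
    """Percentage quantity variance from the expected quantity,
    irrespective of the order price."""
    return (qty_invoiced - qty_delivered) / qty_expected * 100.0

print(dq_variance_pct(55, 50, 50))  # 10.0 -> equal to the 10% limit, no warning
print(dq_variance_pct(56, 50, 50))  # 12.0 -> exceeds 10%, invoice blocked
```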



Both Variances active:

If both absolute and percentage variances are active, the system blocks on whichever tolerance is breached first.

Quantity check for Delivery cost:

The system also carries out a quantity variance check for planned delivery costs when we post only planned delivery cost.

Tolerance key not maintained:

If tolerance key DQ is not maintained for a company code, the system treats this as zero tolerance when we perform the same transactions discussed earlier, and blocks the invoice for payment on any deviation.


7.DW: Quantity variance when GR quantity = zero


If a goods receipt is defined for an order item but none has as yet been posted, the system multiplies the net order price by (quantity invoiced + total quantity invoiced so far).

The system then compares the outcome with the absolute upper tolerance limit defined.

System behavior:

This tolerance key works only for PO-based invoice verification, because GR-based invoice verification does not allow an IR without a GR.

DW absolute upper limit as 100:

PO quantity = 100 EA

PO price       = 10 USD

No GRN posted

     a)If the IR quantity is 10 then system calculates variance as below

     Variance = Net order price * (quantity invoiced + total quantity invoiced so far)

                    = 10 *(10+0) = 100

     This value is equal to DW limit hence there is no warning message and system will not block the invoice.

     b) If the IR quantity is 11, the system calculates the variance as below and blocks the invoice.

     Variance = 10 *(11+0) = 110


This value is more than the DW absolute upper tolerance limit, hence the invoice is blocked.
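The DW check above can be sketched in the same style (the 100 USD absolute upper limit is the example's assumed configuration):

```python
def dw_variance_amount(net_price, qty_invoiced, invoiced_so_far=0):
    """DW check: applied when a GR is planned for the item but none has
    been posted yet; net order price times the total quantity invoiced."""
    return net_price * (qty_invoiced + invoiced_so_far)

print(dw_variance_amount(10, 10))  # 100 -> equal to the DW limit, not blocked
print(dw_variance_amount(10, 11))  # 110 -> exceeds 100, invoice blocked
```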

It updates tables as below

Tolerance key table updates:

  • Tolerance key: DW
  • RBKP (header, ZLSPR): no blocking indicator update
  • RBKP_BLOCKED: A – auto block
  • RSEG – SPGRM (quantity block): X
  • Auto release: possible – the blocking reason RSEG-SPGRM is deleted when we post a GR for the balance quantity or a credit memo for the excess invoiced quantity

DW absolute upper limit as “Do not check”:

PO quantity = 100 EA

PO price       = 10 USD

No GRN posted

If the IR quantity is 112, the system still does not block the invoice, even though this is beyond the DQ tolerance, since DW has been maintained as "Do not check" and no GR has been posted.


"One should be very careful using this option, as it bypasses the DQ quantity variance block when no GR has been posted for an item where a GR is planned."


DW tolerance key is not maintained:

If this key is not maintained for a company code, the system always blocks the invoice in PO-based invoice verification when no GR has been posted.



* SAP IMG documentation

Hello SAP Practitioners,

Greetings for the day,


In my last project I encountered an issue, and I noticed that many people still find it difficult to get an exact answer to this question on SCN and Google. As I identified the cause of the issue, I thought of creating this document, hoping it would help them understand and resolve the same issue.


Summary of the Issue:

When I clicked the print preview button for a particular purchase order, the currency format was displayed incorrectly, while other purchase orders were displayed accurately, exactly as shown in the change/display mode of the purchase order screen.


Example: For purchase order 4500002417 I entered the amount as USD 1,000,000.00, and when I clicked print preview it displayed the same as in the purchase order change/display screen. But for another purchase order, 4500002416, I entered the amount as QAR 1,000,000.00, and when I clicked print preview it displayed 1.000.000,00. I wondered how it could display differently from the change/display screen of the same purchase order! The decimals were separated by a comma (,) instead of a dot (.).


Please find the below screen shots for your better understanding.


From the functional point of view I analyzed a lot to identify the cause of the issue, but couldn't succeed. Then I went to Google for help, but I didn't get a clear answer, as many responses talked only about the decimal format settings in the user profile (SU3). Finally I took an ABAPer's help to fix the issue, as I did not find the right answer on Google.


Settings for Decimal Format in the User Profile (SU3): Before moving further, I would like to make clear that the decimal format we maintain in the user profile does not have any impact on the purchase order print preview; it affects only the purchase order creation/change/display screen. So for the above issue, there is no need to check the user profile decimal format.



Cause for the Issue:

When I found the cause of the issue, I came to know that the actual system behavior is correct. You might ask how the system can be behaving correctly when it is showing the decimals wrongly. The answer is that the currency format in the PO print preview is derived from the combination of vendor and vendor country. Some countries separate decimals with a comma (,) and some with a period/dot (.).

Example: If we raise a purchase order to a British vendor, he usually reads his PO value in his local currency, so the system checks the vendor and his country and displays the currency format in the print preview accordingly. As shown in the table below, the currency format for a British vendor would be displayed as 1,234,567,890.00.


As per the above purchase order 4500002416, we are sending the PO to a German vendor, so his currency format is different from that of the other PO 4500002417; hence it shows 1.000.000,00 instead of 1,000,000.00.


Note: Not all countries in the world use the same decimal format; different countries officially designate different symbols for the decimal mark. Please see the table below as an example of how decimal notations differ between countries.
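The country-dependent notation amounts to nothing more than swapping the grouping and decimal separators. A minimal sketch (the separator choices per country here are illustrative assumptions; in SAP they come from the country settings, table T005X / OY01):

```python
def format_amount(value, thousands_sep, decimal_sep):
    """Format a number with the given grouping and decimal separators."""
    us_style = f"{value:,.2f}"  # e.g. '1,000,000.00'
    # swap the separators via a placeholder so they don't clobber each other
    return (us_style.replace(",", "\0")
                    .replace(".", decimal_sep)
                    .replace("\0", thousands_sep))

print(format_amount(1000000, ",", "."))  # 1,000,000.00  (e.g. US/UK style)
print(format_amount(1000000, ".", ","))  # 1.000.000,00  (e.g. German style)
```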



Please go through the below link to know more about the 'Decimal Notation'.

Decimal mark - Wikipedia, the free encyclopedia


Where can I see/maintain the currency format for the respective countries?

The standard SAP practice is that, while implementing SAP, the finance team and the business team discuss and decide the currency format for each country. As it has a major impact on all financial transactions, they take the call at implementation time itself. Changing the currency format at a later stage is not advisable.


You can see the settings in OY01 as shown in the below screen shot.


Why does this functionality work only for the PO print preview and not for the PO create/change/display screen?

As per standard SAP, the purchase order forms contain a function called 'Set Country'. This function checks the purchase order vendor and the country of the vendor, then goes to table T005X (Countries – Decimal point and date format (SET COUNTRY)), passes the country key, gets the currency format for that country, and applies it in the purchase order print preview.


This functionality works at the smart form program level, not at the purchase order create/change/display program level, so we do not face any decimal notation issue on the purchase order screen. The PO screen works as per the decimal notation set in the user profile.



How to check whether my PO form has this function?

Go to the NACE settings, find the smart form name, then go to the smart form's Initialization tab and search for the term 'set country'. If you find an entry as shown below, it means your PO form is calling this function and your decimal notation works based on this logic.



We clarified the cause of the issue to the business team and advised them not to make any changes to the function, as it is working fine as per standard practice. But based on the business requirement we gave them one more output type, 'ZNEU': whenever they want a printout with the country-specific currency format they can use the 'NEU' output type, and whenever they want the normal currency format they can use the 'ZNEU' output type.


In order to fulfill the customer requirement, we created a new Z smart form, created the 'ZNEU' output type, assigned the standard PO print program and the Z smart form in the NACE settings, and then maintained the output condition record for the purchase order in MN04. Now whenever the user clicks print preview, he gets two output types and can select whichever he wants.


I hope I have explained the scenario clearly, to the best of my knowledge. Do let me know if anything is not clear, so that I can explain in detail.


Thank you so much for reading the blog.



Narayana N

As we optimize SAP supply chains sustainably all over the globe and also distribute the SAP Add-On Tools (by SAP Germany), bigbyte supports a wide portfolio of SAP-using manufacturing companies in the US, Europe and Asia.


Some of those companies use the Add-On Tools and some do not - at least not yet. Man!... what a difference. Not that I want an easy play, but with the tool-using companies we achieve better results in half the time. These tools simply near-perfect the SAP ERP system (there are tools for APO too) and automate the work.


As an example... when optimizing your replenishment strategies, you have to analyze and classify your material portfolio by consumption value, consumption consistency or predictability, lead time and life cycle (is the material new, obsolete, a slow mover or a regular?) and then set up the master data (the four MRP screens) accordingly: PD for expensive, unpredictable and spotty items, consumption-based strategies for predictable ones, reorder procedures for items you want to keep in stock, lot-size procedures that go with the strategy, and safety stock strategies.


If you work without the SAP Add-On Tools, the process goes like this: build a list of materials with high optimization potential using transactions MC.9, MC42, MC49, MC48, MC50, MD07, MCBA and many more... then analyze each individual material using historic, outdated graphics, LIS transactions, dual classification, slow-mover reports, dead stock reports, the MD04 timeline, table MVER and an XYZ analysis in Excel (all of this will take you about 4 hours per material). Then maintain the material's MRP 1 through 4 screens to the best of your knowledge... repeat for each material... repeat for each material EVERY MONTH!! (things change, right?)


Using the MRP Monitor, you start the analysis for the entire plant and the monitor immediately classifies and performs a segmentation into six dimensions: ABC for consumption value, XYZ for consumption history with a coefficient of variation, EFG for lead time, UVW for price, LMN for size or volume, and life cycle. That is your analysis right there. In the monitor result your portfolio is segmented into these six dimensions or classes, and you can now select any given class and assign a pre-defined policy (the program reads the policy for the items classified as, say, A, X, E and V and updates all material masters with the respective MRP type, lot-size procedure, min/max values, safety stock settings, strategy group, consumption strategy and availability checking rule). And it does so every month.
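Two of those dimensions are easy to illustrate. Below is a rough sketch of ABC (cumulative consumption-value share) and XYZ (coefficient of variation of consumption history) classification; the 80%/95% and 0.2/0.5 cut-offs are typical textbook values, not the tool's actual settings:

```python
import statistics

def abc_classes(value_by_material, a_cut=0.80, b_cut=0.95):
    """A/B/C by cumulative share of total consumption value."""
    total = sum(value_by_material.values())
    classes, cum = {}, 0.0
    for mat, val in sorted(value_by_material.items(), key=lambda kv: -kv[1]):
        cum += val / total
        classes[mat] = "A" if cum <= a_cut else ("B" if cum <= b_cut else "C")
    return classes

def xyz_class(history, x_cut=0.2, y_cut=0.5):
    """X/Y/Z by coefficient of variation (stddev / mean) of consumption."""
    mean = statistics.mean(history)
    if mean == 0:
        return "Z"                      # no consumption -> unpredictable
    cov = statistics.pstdev(history) / mean
    return "X" if cov <= x_cut else ("Y" if cov <= y_cut else "Z")

print(abc_classes({"M1": 70, "M2": 20, "M3": 10}))  # {'M1': 'A', 'M2': 'B', 'M3': 'C'}
print(xyz_class([100, 100, 100]), xyz_class([10, 200, 5, 90]))  # X Z
```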


Have a look at the MRP Monitor and the other SAP Add-On Tools... it will certainly be worth your while before you engage in a one-time, six-month, loosely defined inventory optimization project.

In upgrade and transformation projects we are required to bring the legacy data into the new system. This legacy data can broadly be classified as follows:

  • Master data, e.g. material master, vendor master, source lists, purchase info records, contracts, etc.
  • Transaction data, e.g. reservations, purchase requisitions, purchase orders, invoices


Master data serves as a base for executing transactions and hence must be brought into the system before transaction data. Master data is mostly retained as-is (unchanged) in the new system; one of the main reasons for this is the users' familiarity with the master data.


Transaction data might need to be brought into the new system because of its open quantity in the legacy system, which is to be settled in the new system. Open quantity refers to the quantity for which a follow-on document has not been posted. E.g. if a purchase order is created for a quantity of 10 and goods receipt is done for a quantity of 5, the purchase order is open for posting a goods receipt of 5.


Purchase requisitions are internal documents for a client and are mainly used for putting forward a particular person's/department's requirements to the purchasing department. A purchase requisition can further be used as a reference for an RFQ, PO, etc. In order to do a PO conversion, it might be required that the purchase requisitions referenced in the POs be created first. Hence, while considering the conversion of transaction data in purchasing, we often consider the purchase requisition as one of the transaction data objects to be brought into the new system.


Here we will mainly talk about how we can bring purchase requisitions from the legacy system to the new system: inclusion/exclusion criteria, dependencies, important considerations, related assumptions and the challenges involved.

Inclusion and Exclusion criteria:

  • Inclusion criteria can be defined as parameters by which purchase requisitions will be considered for conversion. E.g. only purchase requisitions of document type NB are to be considered for the load. Another example of an inclusion criterion: only completely released purchase requisitions created in the last year are to be considered for conversion. There can be as many inclusion criteria as the client requires.
  • Exclusion criteria can be defined as parameters by which purchase requisitions will not be considered for conversion. E.g. deleted line items of purchase requisitions will not be considered for the load. Another example of an exclusion criterion: purchase requisitions created for a particular plant will not be considered for conversion. There can be as many exclusion criteria as the client requires.



Dependencies:

  • All required configurations related to purchase requisitions are in place before the data is loaded
  • Material Master, Service Master, Vendor master must be created and uploaded
  • Purchase info records, contracts, units of measure and purchasing groups must be maintained
  • Material Group assignment to G/L accounts
  • Account assignment data such as G/L accounts, cost centers and WBS elements needs to be created and uploaded


Important Considerations:

  • Method of extracting, cleansing and loading the data needs to be defined beforehand.
  • Preceding documents of purchase requisitions must be created in the system. A preceding document can be a plant maintenance work order, a planned order or an internal order.
  • Contracts referred in the legacy Purchase Requisitions must be released.
  • Release Strategy and Release status of the considered Purchase Requisitions.
  • Workflow related to purchase requisitions should be set up but not activated. If the workflow is activated, releasing the purchase requisitions collectively can trigger a large number of workflows, which can hamper system performance and affect other update tasks.
  • Error log generation format and analysis.



Assumptions:

  • Numbering for converted purchase requisitions will not change.
  • External numbering will be used during the conversion period only. Document numbering will be switched to internal numbering once the conversion is complete.
  • Data cleansing in the legacy SAP system is required before the data is ready to be migrated to the new SAP system
  • Purchase Requisitions can have multiple line items

  • Month-end processing must be run before conversion.

  • Any input file fields that are not used as input into the creation API/TCODEs will be ignored.


Challenges involved:

  • Accurate validation of Extracted, Cleansed and Loaded data.

Dear All,


Recently we implemented an External Services Management process for contract employees in integration with the HCM module. This involved some development as well, to automate the process. After successfully implementing the process, I thought of sharing the experience with all my SCN colleagues, hoping it may throw some light on similar requirements you may have.



My customer hires various services (like SAP consulting) from different service providers (like IT companies). The contract employees (like consultants) are supposed to enter their time, and the service provider is paid based on the hours worked by the employees. The requirement was to automate this process and map it in SAP.



The HCM module had already been implemented. A custom info type (9001) exists exclusively for contract employees, containing details like the service provider, contract terms, etc.



The process starts with the creation of a purchase requisition by the user department.


To identify the employee, we have added the field PERNR (Personnel Number) in customizing,

Materials Management --> External Services Management --> Define Screen Layout



Now the field Personnel Number appears in the PR and PO.



The purchasing department creates a purchase order for the service provider.


In the custom info type for contract employees, new fields PO Number and PO Item were added. A program was developed to fetch all service POs created on a given day, and it is scheduled in the background to run every night. The program picks the PERNR from the ESLH and ESLL tables, and for that PERNR it updates the custom info type table (PA9001) with the PO number and item.


Now the contract employee fills in his working hours in the CATS time sheet. Once the time sheet is approved, a scheduled job fetches the personnel number from the time sheet. From the info type table, the program fetches the PO number and item and creates a service entry sheet using the BAPI BAPI_ENTRYSHEET_CREATE.


A subsequent job for the standard transaction ML85 (program RMSRVF00) releases the service entry sheet.


Once the service entry sheet and material document are posted, the finance department can do invoice verification at month end for the accepted quantity and pay the vendor. We are planning to automate the invoice verification as well, once the process stabilizes.


This is just my experience in addressing this requirement. I am expecting your valuable feedback and suggestions. I also request you to share your experiences in addressing similar requirements, if any.




Once you set your supply chain policy, you will have to fine-tune your master data setup so that it drives good service levels with low inventories. The SAP Add-On Tool 'Safety Stock and Reorder Point Simulator' lets you perform an optimization and update material master records collectively.


Video: Safety Stock and Reorder Level Simulation

Watch a video on how to use the SAP Add-On Tool 'Inventory Controlling Cockpit' for analysis and optimization.


This blog is an attempt to share the individual 'understandings' and 'expectations' related to the emerging trends in SAP Forecasting/Supply Optimization vis-a-vis Demand Sensing and Inventory & Service Level Optimization currently projected with SAP’s takeover of Smartops.


'Understandings' as above relate to outcomes based on work experience in the core ERP system (MM/PP modules), and

'Expectations' translate to whether the ways of working will change because of the new developments and lead to additional learning.


    For quite some time we have been hearing about the takeover of SmartOps, and there will have been people interested to know whether it has anything in store for them. The merger formalities seem to be complete, and post-merger it has become SAP-EIS (Enterprise Inventory and Service Level Optimization). A team is set to deliver what are called Enterprise Demand Sensing (EDS) and Multi-Stage Inventory and Service Level Optimization.


a. Enterprise Demand Sensing: (Positioned as a Cloud-based Analytics solution).

   Traditional forecasting predominantly uses time-series methods, which always carry the risk of poor prediction and, with it, lost opportunity.

Years of sales/consumption history data provide forecast results, usually on a long-term horizon of at least a year. When sudden changes start creeping in towards the short term, order execution at the short-term level gets tougher and tougher.

CPG/retail companies such as P&G, Unilever, and Kraft/Mondelez, to name a few, have adopted what are called demand sensing techniques, using predictive analytics to forecast much better and adapt quickly to market changes. Companies in other industries are also catching up on this aspect.

    The USP attributed to demand sensing is that it considers the to-the-minute realities of the supply chain, with an ability to respond in real time to sudden demand spikes caused, for instance, by an instant promotion on a social networking site or by a natural disaster. The intelligence is to be provided via SAP Demand Signal Management (DSiM), which processes the data in a HANA-backed environment. Guesstimates put the improvement in forecasting accuracy at 30% or more.

Take a look at the clipping made available in

The video clearly mentions that this software has the option of integrating with an ERP as well as APO-Demand Planning application.

As a layman with limited understanding of the new developments, and with the belief that forecasting in ERP means the V* MRP types using a host of forecast models, my question is whether the forecasting algorithms in the ERP system will undergo any changes in the future. If the understanding is right, and if that happens, would it come as a core-ERP offering, or would it be something for IS-Retail scenarios only, given that CPG/retail customers are the prime targets?


b. Enterprise Inventory and Service Level Optimization: (Positioned in the SCM suite as an Integrated S&OP offering again within the Supply Planning space)

    At first, the name 'Enterprise/Multi-Stage/Multi-Echelon Inventory and Service Level Optimization' itself was very ambiguous, tasting like a bitter gourd, never mind making any association with it. Someone prunes it to 'service level optimization' in discussions, and there is a sincere feeling that the term has definitely been heard before. A few more attempts at association, and the Service Level field in the material master MRP view flashes across. A bit of reading here and there then brings us closer to the association.

   What we actually do with these service levels in ERP forecasting is fix a percentage based on defined KPIs/criteria. We come across terms like 'normal distribution', used to take probabilities into account and meet optimized service levels. Finally things start falling into place, and we understand that the service-level optimization we are trying to achieve is at a single location or single stage, namely the manufacturing plant. This is called 'single-stage optimization', and it is the most we work with on the core-ERP side.
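To make the 'normal distribution' remark concrete: the textbook single-stage safety stock calculation (not necessarily SAP's exact implementation) boils down to

```latex
SS = z_{\alpha}\,\sigma_{D}\,\sqrt{LT}
```

where $z_{\alpha}$ is the service-level factor read off the standard normal distribution (roughly 1.65 for a 95% service level), $\sigma_{D}$ is the standard deviation of demand per period, and $LT$ is the replenishment lead time in periods. The steep growth of $z_{\alpha}$ as the service level approaches 100% is why very high service levels get expensive.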

   In any Supply chain, viewed from the point of view of a plant we see two things:

  1. Upstream view --> represents all partners above the plant, supplying raw materials/components and pushing raw material inventory out. This covers the procurement network; the layers are called 'tiers', and multiple layers become 'multi-tiers'.
  2. Downstream view --> all partners downstream of the plant, involved with fulfilment aspects and FG inventory. This covers the distribution network; the levels are termed 'echelons', with multiple levels becoming 'multi-echelons'.

(As a side note, what is at the plant, the centre point, is the manufacturing network covering WIP/in-transit inventory.)

Fulfilment (the customer is king) forces the echelons to take priority over the other two, which tells us why it is named 'Multi-Echelon/Multi-Stage Inventory and Service Level Optimization'.

   Any supply chain experiences pains/issues in one form or another, and at a very high level they get consolidated into 'complexity' issues and 'variability' issues. 'Variability' again breaks up into 'demand variability' and 'supply variability'.

   Looking at the earlier SCM SNP space and the offerings provided then, supply variability was definitely a pain point, and with the new setup post-merger SAP is set to address the issue. Handling supply variability and optimizing for it is expected to be the USP here, and you can see that demand sensing is fitted in to handle the complexity part.


In parallel, during these 'inferring' days, we started to see a host of materials being uploaded and made available on the SCN site in this same forum.

They include really good material explaining Safety Stock Fundamentals and Stochastics, Cycle and Pipeline Stock: The Deterministic View, etc., and what SmartOps does differently there. Read and benefit.


In the materials you can see a mention of the use of a 'gamma distribution' compared to the traditional 'normal distribution' methods we know of in our ERP system today, along with the advantages etc., making us think again about whether any new changes will come over. With the SCM SNP space opening up to inventory modelling and optimization, there seem to be more opportunities in this space for specialists with the appropriate new skills.


I will use this blog to look for relevant additions/corrections from experts working in the forecasting space, with forecast models, time-series data, and inventory modelling techniques in the supply chain, and to invite them to share their thoughts. Pointers to other targets/forums that might be able to help further would also be useful.

These may still be premature days to expect much, but any information provided will be useful to understand things better and share back in the forums for the general benefit of all.

Thanks for your precious time now for reading and also for your future time to write back.




Since I came across the SAP Add-On Tools last spring, we have rolled out many of these tools at five very happy customers whose supply chains run on SAP. At the center of it all is the MRP Monitor, with its invaluable capabilities for segmentation and policy stratification. But there is so much more you can do with it, and I wonder every day:


"How can anyone perform effective materials planning without it?"


So let me first address and explain my definition of effective materials planning:


"Effective Materials Planning is the process of maintaining the basic data of a materials portfolio so that it supports automation to keep optimal inventories that provide great service levels at any time under changing conditions"


To perform effective materials planning with SAP, you should perform tasks in four areas:


- Portfolio Management where the planner needs to set up 'swimming lanes' and keep transparency and manageability

- Policy Setting which I consider the heart and soul of any planning system. Maintaining the right policy drives automation and the right inventory levels with optimized replenishment.

- Exception Monitoring to counter variability and the Unforeseeable.

- Inventory Optimization to constantly capitalize on opportunities to lower cost and increase availability and service levels.


... and without the MRP Monitor

You can’t do selective portfolio management

You can’t set policy for more than one material at a time

Exceptions are not meaningful without good policy and selective portfolio management

Inventory optimization becomes an impossible undertaking


So here I am... In a very cold state, building an effective materials planning system with a client who installed MRP Monitor, Lot Size Simulator, Inventory Controlling Cockpit and the Service Level Monitor... All SAP native Add-On Tools.


The client has run their supply chain on SAP since the mid-90s and has undergone various optimization efforts with varying degrees of success. The last one cost them the price of a Learjet and delivered the value of a holiday trip to Elizabeth, NJ in January. The consulting company promoted and taught exception management with MD06 and a subsequent 'rule' setup in the material master... one item at a time! The fact that each materials controller had to manage about 5000 items didn't deter the 'thought leaders' from the opportunity to run a six-month, revenue-generating 'value' project (value for whom?).


To make a long story short... There were endless workshops on MRP types, lot sizes, safety stocks and various other fields in the MMR - all very important stuff - but nothing that helped managing a large portfolio and setting policy for optimized inventories or great service levels.


Then the client acquired the MRP Monitor! Now we're in a position to get the materials planners to take control over their parts. The first thing we did was to use the MRP Monitor and its valuable KPIs to sort out the more important parts from the lesser and least important parts to watch on a daily basis. First we utilized the Monitor's Life Cycle classification to list all items marked for deletion and moved those items into the MRP Controller bucket 'obsolete'. The MRP Monitor has a function with which you can update policy for many materials at the same time. You can also use that function to mass-change a large number of materials, for example to move all items marked for deletion into the MRP Controller OBS.


The portfolio had already shrunk from over 5000 parts to fewer than 2000.


Then we took one planner's MRP Controller key, listed all items, and selected those which did not have a single movement over the past 12 months (number of movements is a KPI in the MRP Monitor). Out of this list we excluded the 'new' parts (again, this is a class provided in the Life Cycle analysis of the MRP Monitor). These non-movers we moved into an MRP Controller bucket "least important items to look at".


Then we moved all items with movements less than 80 per year into another MRP Controller bucket "less important items to look at" and everything else was left in the "important" MRP controller bucket, which by now included less than 500 items.


Now the fun began: policy setting! First we called up the MRP Monitor for the "least important items", about 3000 parts that had no consumption over the last 12 months. No need to overthink the policy: switch everything with a consumption-driven policy to PD and scratch all safety stocks. As mentioned before, the MRP Monitor allows for mass updates, so we simply selected all items in that bucket, set PD, initialized the fields safety stock, safety time, and range of coverage (meaning we set them to zero), and so updated 3000 materials with a policy that instantly eliminated all exception messages and plans these items only when there is demand. The materials planner will look at that bucket only once in a while, and more is really not necessary.


Now we went into the bucket "less important items", which contained about 1500 items. Still a lot, but manageable. What's of note here is that all the items that cannot be classified (and had no movements) are out of the way, so what's left here is not super important but needs to be looked at. The good news is that all these materials have movements and are classified, which means we can set policy by class and segmentation.


And that we did. Now you take your high-consumption-value, consistent-consumption, short-lead-time items and update them with a reorder policy and a fixed lot size, easily done with the policy update functionality in the MRP Monitor. Within hours we updated over 1500 materials with a fitting policy before we directed our attention to the 500 most important parts for that specific materials planner.


Again we called up the MRP Monitor result, this time for the 500 most important parts: parts that have a lot of movement and are classified according to ABC for consumption value, XYZ for consumption consistency, EFG for length of replenishment lead time, UVW for price, LMN for volume or size, and a life cycle classification (new, regular, obsolete, slow mover, marked for deletion, or dying). Now we will have to pay more attention; these are our movers and shakers, and we can look at one segment at a time... and update any policy we'd like. Keeping these parts optimized will take longer than before, but that's OK, since these are the important parts. We'll also check on them more often, and that's OK too, because they are important.


And that is what effective materials planning is all about: separating the important movers and shakers from the bulk, which can be set up with a policy that automatically takes care of it. These important 500 materials we need to tweak and pay attention to; they will make our planning or break it.


Isn't this so much better than having to look at 5000 items every day in MD06 or MD07? By the way... by the time those 'thought leaders' had finally optimized 300 materials after six months, the policy didn't fit anymore, the parts were discontinued, and there were still 4700 materials left to be optimized.


Effective materials planning without the MRP Monitor is not possible! No matter how much money you spend on 'thought leaders'...

Hello All,


A few months back I came across an issue in our project: a user had split the validity of an info record, and because of this the same price was getting copied to the other split validity period. The user had inserted a new validity period in between the validity period 06.12.2013-31.12.9999. Below we can see that the user entered the validity period 01.07.2014-31.12.2014, due to which the info record got split into three validity periods.


Due to this, when the user changes the price for the period 01.01.2015 / 31.12.2099, the price for the period 06.12.2013 / 30.06.2014 also changes to the same value. In other words, the system changes the price for both periods to either 1.013,72 or 963,72. The system does not change the price for the period 01.07.2014 / 31.12.2014.


This happened because, during maintenance of the conditions of an info record, a new period was inserted into an existing period. The existing period is split into two records, and the same condition record is assigned to both 'old' partial records. As a result, changes to the conditions in one of these periods immediately affect the other one as well. SAP Note 358998 describes this behaviour; it is due to the system design, and there is no technical solution for it. However, with the steps below, we can correct the info record without deleting it and creating a new one.


Step 1:

Go to ME12 and select conditions. Now select the first validity period and click new validity period.



Step 2:

Change the validity period from 06.12.2013 to 31.12.2099 and save.


You will get a pop up as below. Click on enter.



Step 3:

Now again go to ME12, select Conditions, and select the new validity periods as shown below.

Step 4:

Now, instead of splitting the period, add the validity periods one by one, starting with the first validity period 06.12.2013 / 30.06.2014: 1,013.72 EUR, and click Save.


Step 5:

Now again go to ME12 and select conditions and select new validity periods as shown below.



Step 6:

Add second validity period 01.07.2014 / 31.12.2014: 988.72 EUR and click on save.


Step 7:

Now again go to ME12 and select conditions and select new validity periods as shown below.



Step 8:

Add the third validity period 01.01.2015 / 31.12.2099: 963.72 EUR and click Save. You will get a pop-up; click Enter and proceed.


In this way you can correct the info record.


Hopefully this document will help you solve info record issues with split validity periods.


Thanks and Regards,

Chetan Pardhi

This blog is the continuation to Stumbling blocks in purchasing info record migration by LSMW IDOC method


The project is outlined in the other blog, so I can directly start with the difficulties I had with the migration of the price conditions.

When you migrate info records by the IDoc method, the price conditions are not included; they have to be loaded in an extra step.

So again the question about the method: batch input vs. IDoc.

In my last blog I explained that the IDoc method had the advantage that I can work with preassigned numbers and can create a mapping directly from the conversion step in LSMW.  This advantage does not count for the conditions of an info record, because there is no extra mapping which has to be given to the business. The mapping was already there with the info record migration.

Still I decided to go with the IDoc method, as it allows me to reload the data again and again if I do something wrong. With the batch input method I would need two LSMW objects in the worst case: one for the initial creation and one for a change if something goes wrong. Trust me, even I make mistakes. Possibly already with the decision to use the IDoc method here; you can leave a comment.

Now I knew how I was going to load the data, but I still had to solve the question of how to get the data from the legacy system. If you search SCN with keywords like 'how send COND_A', you get 70 hits but rarely an answer. I actually found just one correct answer in relation to info record conditions and will spread it here again: it is possible from transaction MEK3 via the condition information. However, at the time of the migration I did not know about that. And I still think it would not have been suitable for me, as MEK3 expects the condition type before you can continue to the condition information, and I had 28 different condition types in the source system. Further, I cannot even get into the condition information if there is no access defined for the condition type.

In the meantime I got the idea to send the conditions together with the info record using serialization; I will try this in my next info record migration.

But in this migration I decided to download the conditions using SE16 and a simple SQVI QuickView.

The goal was to migrate only valid conditions, no historic conditions.

I started with SE16 and downloaded all valid conditions from tables A017 and A018, excluding the deleted conditions and those with a validity end date earlier than today. (We did not load conditions at material group level, hence no download from table A025.)


As you can see, the condition record number is listed at the end.

Next step was the development of the QuickView.

Just a join of the tables KONH and KONP, since we decided to enter the few scales (table KONM) manually.

The chosen fields for our migration were:

List fields / Selection fields






























As the number of records exceeded the maximum that can be added in the multiple selection, I used the table number as the selection criterion and excluded the records with a deletion indicator; nothing else.

116000 condition records caused a file of 44 MB. Too big to output the result from the QuickView directly in Excel. So it had to be downloaded as text and then imported to Excel.

Here you have to be careful: if you have about 60000 lines or more, you can find two empty lines at positions 59994 and 59995.

If you continue with sorting and filters, then you do not get all records, hence you need to remove those empty lines.


As a next step, the fields have to be formatted. All value fields have to be formatted as numbers with 2 decimals and without a thousands separator.

After this I entered a VLOOKUP formula to identify the condition records whose numbers match the condition record numbers from the A017 table download. Only those were kept for the next processing step.

If you wonder why this is needed, have a closer look at the validity end date from the KONH table: it is almost always 31.12.9999. If you enter a new price in the info record, you usually care only about the valid-from date, and the condition is valid until a new condition is entered. But when a new condition is entered, it does not change the valid-to date of the old condition. SAP selects the valid condition based only on the valid-from date.

And here I was lucky that I only had to load the currently valid conditions. I tried it a few times for all conditions, but this caused a bigger chaos, because I was not able to process the IDocs in the needed sequence: the oldest condition first, the newest condition at the end. Even though I had the correct sequence in the CONV file in LSMW, they became a mess when the IDocs got generated and processed.

As said, I kept only the latest conditions, identified with VLOOKUP, in the file. Be aware that this VLOOKUP for that many records can take hours, depending on your PC and the number of processors. With 24 processors this was a job of a little more than 5 minutes.

Still I had to do a bit more cosmetics to get the file into the shape I needed for loading.

In this picture from LSMW you can see the IDoc structure: a header, the validity period, and the conditions.

The source file has to be prepared to deliver data for each of these segments in the structure.

The starting basis is my download file, which now contains only the valid records; this will be used for the detail, the KONP segment of the structure.


For the header segment I just copy columns A to E into a new Excel tab.

For the validity period segment I copy columns A to G into a new Excel tab.

From both new tabs I remove the duplicate records.

Now I save each tab as a CSV file and assign the CSV files in LSMW to my source structures.

When you execute the READ DATA step, SAP joins all three files based on the common fields at the beginning of each structure.

The rest is just the standard process of executing LSMW.

Important in the field mapping is the VAKEY field. SAP uses a variable key (VAKEY) to access the conditions. This is a concatenated field with the vendor number at the beginning, followed by the material number, purchasing organisation, plant (only in the case of A017 conditions), and info record type. As usual in a migration, you have to replace the old values with new values so that SAP can find those conditions again.

This means you have to create a data declaration to split this VAKEY field into its parts, like this:

data: begin of zzvarkeya,
        zlifnr like lfm1-lifnr,    " vendor number
        zmatnr like marc-matnr,    " material number
        zekorg like lfm1-ekorg,    " purchasing organisation
        zwerks like marc-werks,    " plant (A017 conditions only)
      end of zzvarkeya.

You may want to do this for both the source field and the target field.

And then you need a little ABAP coding to replace the individual values. I have this old-to-new mapping in a Z-table with three fields: the identifier (e.g. EKORG for purchasing organisation), the old value, and the new value. I use a SELECT statement to retrieve the value from this Z-table.
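For illustration, the lookup could be sketched as below. The table name ZMIG_MAP and its fields IDENT, OLD_VALUE, and NEW_VALUE are hypothetical names invented for this sketch; only the pattern matters.

```abap
* Sketch: look up the new value for an old purchasing organisation in a
* custom mapping table. ZMIG_MAP / IDENT / OLD_VALUE / NEW_VALUE are
* hypothetical names used for this illustration only.
data: lv_new type c length 20.

select single new_value from zmig_map
  into lv_new
  where ident     = 'EKORG'
    and old_value = zzvarkeya-zekorg.
if sy-subrc = 0.
  zzvarkeya-zekorg = lv_new.   " replace the legacy value with the new one
endif.
```

The same pattern is repeated for vendor, material, and plant; a record with no match in the mapping table is left unchanged.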

The value field KBETR can cause some headaches as well, especially if you have conditions in currencies that have no decimals, like JPY, or where the value is a percentage. In both cases SAP stores the values in the table in a peculiar way.

The value field is defined in the data dictionary with 2 decimals. A currency like JPY does not get decimals when it is stored; the amount is just moved as is into this value field. A price of 123 JPY is then stored as 1.23 JPY.

A percentage is allowed to have 3 decimals, but being stored in a field with 2 decimals makes it look odd too.

And last but not least you may have negative condition values in the source file because of discount conditions.

All these circumstances have to be taken care of in the conversion routine, which in my case looks like this:

g_amount = ZKONP-KBETR / 100.

if ZKONP-KONWA = '%'.
  g_amount = ZKONP-KBETR / 1000.
elseif ZKONP-KONWA = 'JPY'.
* currency without decimals: take the value as is
* (the exact original check is not shown; JPY serves as the example)
  g_amount = ZKONP-KBETR.
endif.

if ZKONP-KSCHL1 = 'RA00'
or ZKONP-KSCHL1 = 'RA01'
or ZKONP-KSCHL1 = 'RB01'
or ZKONP-KSCHL1 = 'RC00'.
  g_amount = g_amount * -1.
endif.

write g_amount
   to E1KONP-KBETR decimals 2.



Because of the CSV file format I have to divide the value by 100 to get the normal condition value.

In the case of a percentage, the source value has to be divided by 1000.

In the case of the zero-decimal currencies, I can take the value as is from the source.

If the condition value is negative, then I need to multiply it by -1 to get a positive value, as I can only carry positive values in this IDoc.

And the last lines take care of dots and commas as decimal separators. In Germany we use the comma as the decimal separator, while the Americans use the dot. And as I often get Excel files from all over the world for migrations, I always have to take care of the decimal separators.

And finally I want to share an issue that came up during our tests, when I had not yet completed the condition migration.

The info records themselves got loaded as explained in the previous blog, including the information about the price and effective price, even when they were zero.



You can see that both fields are protected, as they are when conditions exist. However, the conditions were not loaded at that time.

When the user clicked the Conditions button... nothing happened. An inconsistent situation: the info record itself holds the information that conditions exist while they are not there, hence the Conditions button does not work.

The only workaround is to change a value in another field and save the info record. Then go in again; now you can click the Conditions button as usual. (Don't forget to correct the previously changed field afterwards.)

Are you using automatic safety stock calculation with SAP's forecasting screen in the material master record? As you might know, there is a way to have the system calculate the static safety stock value (in the MRP2 screen) based on three influencing factors:


1. replenishment lead time - the longer it takes to replenish, the higher the safety stock should be

2. service level - the higher the expected service level, the higher the safety stock should be (it actually grows exponentially with higher service levels)

3. consumption regularity from period to period, expressed by the mean absolute deviation or MAD


The lead time to replenish and the service level setting come from the MRP2 screen in the material master, whereas the MAD is calculated by the material forecast. The material forecast then calculates the safety stock (and a reorder level, in case you are using a VM or V2 MRP type) and updates the field "safety stock" in MRP2 every time you run the forecast.
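Put as a formula, the relationship between the three factors is roughly the following (a simplified form; the exact formula SAP applies depends on release and settings, so treat this as an approximation):

```latex
SS \approx R \cdot \sqrt{\frac{LT}{T}} \cdot MAD
```

where $R$ is the service-level factor derived from the service level percentage, $LT$ the replenishment lead time, $T$ the length of the forecast period, and $MAD$ the mean absolute deviation of actual consumption from the forecast.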


Now... if you are using this type of functionality, you can run the forecast periodically and have the system come up with safety stocks based on one specific service level setting. If you want more automation and better visibility and results, I suggest you have a look at SAP's Safety Stock and Reorder Point Simulator.


The tool is integrated with the MRP Monitor and therefore lets you select a group of materials for simulation based on criteria like consumption value, consumption consistency, length of replenishment lead time, slow/fast mover, and much more. You then load these materials into the simulator in order to see the simulation results. Besides the ERP method described above, the tool calculates safety stock levels based on an advanced method that, in addition to the three factors discussed before, takes into consideration variability in lead time and variability in demand. You can then compare the two. The tool also lets you simulate these two results for various service levels. If you do so, you can display a regression graph that shows how the safety stock levels increase gradually with every step up in service level. As the exponential growth kicks in, the curve becomes steeper and the jumps in safety stock become more severe. That way you are able to pick the last good service level that still produces an optimal safety stock level.


Once that is derived, you can select the optimal service level for every material and update all the material masters with safety stock, service level, and reorder points. You can also pick whether the ERP method or the advanced method should be used.





A nice added value is that you also get KPIs like "safety stock value" and "safety stock coverage in working days", which are not available in the standard.

Based on the subject, it is easy to guess that I am not a friend of LSMW recordings... for the material master.

I wonder why this is the first choice for so many users who start using LSMW for data migration. It may be understandable for people of my age, who had to use batch input recordings in the ancient times before LSMW was released, but looking at the avatars here on SCN I see mainly young people... having trouble with their LSMW recordings.

Why do they use recordings if SAP has already provided import methods? Did they fail to read the documentation?



from: SAP Library - Legacy System Migration Workbench
You can use the recording function to create a new object (or a new "import method") if neither a standard batch input program nor a standard direct input program nor the BAPI/IDoc method is available for a data object.

No, I do not want to demonize LSMW recordings. I use them too, for data objects where SAP has not provided any import method, and also for some quick and dirty mass changes on one or three fields. I have even used them for the material master, in cases where the change was in the initial screen or in Z-fields which are not available in the standard import methods.

But let me try to explain why the recording method is not suitable for a material master.

The LSMW recording is a static method. The program which is generated from a recording follows exactly the same screen sequence as recorded earlier. It puts the cursor into the same fields as in your recording. It expects the same error cases. Only the data itself can be variable.

The material master is rather dynamic: depending on the material type customizing you may have different views to maintain, you may even have different or additional fields in the same view, possibly with different field attributes (mandatory/optional).

In short, using a static method for a dynamic data object is like break-dancing in a full-body plaster cast.

Here the proof with some screen shots.

A recording is made for a spare part material. Only the Basic Data view, Purchasing view, MRP1 view, and General Plant Data / Storage 2 view are selected.


In the recording this selection is technically stored as shown below: you just have 4 selection indicators, with the position from the pop-up in brackets. You can count above: 01 = Basic Data 1, 04 = Purchasing, 07 = MRP1 and 12 = Storage 2 view.


You do all the LSMW steps. I don't want to explain them again; there are many documents in SCN showing in pictures what you have to do step by step (just the explanation of why you have to do it this or that way is a little lacking - no wonder we can find 140 hits for: LSMW recording MM01 problem). Finally you create the batch input session and execute it.

You did not have any issue with your ERSA materials, as you did the recording for them, but the moment the session wants to create a FERT material the trouble starts. You may find error messages in the log like "Formatting error in the field xxxxx" (87 hits in SCN for this error).


But why did you get this error? The material master for FERT has additional fields active in the Basic Data screen, e.g. the division, and this field is even mandatory, while it was not visible at all for the ERSA material in your recording.

What do we learn? You have to create an extra recording for each material type. Is this really time-saving compared to working with the standard SAP-given import methods?

Let's move on to the next issue and other errors. If you have to load about 80,000 materials, as I had to in my last migration, then you could see an awfully big error log. For this example I just continued with this single material, processing it in foreground. The same errors can be seen in the status bar which would have been listed in the error log in case of background processing.

Look at the first picture above, which shows the view selection from your ERSA material, and compare it with the next screenshot, which is our FERT material to be created:


The views 01, 04, 07 and 12 are selected as in our recording. But look at the text: view 04 was the Purchasing view, now it is the Sales Org Data 1 view. Position 07 was the MRP1 view, now it is Foreign Trade; position 12 was the Storage 2 view, now it is MRP1.

The static recording just remembered the position; it did not record which view is assigned to that position, because this is dynamic information retrieved from customizing at runtime.
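The difference can be illustrated with a small Python sketch. This is not SAP code; the view lists are simplified, hypothetical reconstructions from the screenshots in this blog, but they show why replaying positions instead of view names breaks for a different material type:

```python
# Hypothetical illustration (not SAP internals): the recording stores only
# the POSITIONS ticked in the view-selection pop-up, while the view behind
# each position depends on the material type and is resolved at runtime.

# Simplified, made-up view lists per material type:
VIEWS_BY_MATERIAL_TYPE = {
    "ERSA": ["Basic Data 1", "Basic Data 2", "Classification", "Purchasing",
             "Foreign Trade Import", "Purchase Order Text", "MRP1", "MRP2",
             "MRP3", "MRP4", "General Plant Data / Storage 1",
             "General Plant Data / Storage 2"],
    "FERT": ["Basic Data 1", "Basic Data 2", "Classification",
             "Sales Org Data 1", "Sales Org Data 2", "Sales: General/Plant",
             "Foreign Trade Export", "Sales Text", "Purchasing",
             "Purchase Order Text", "Foreign Trade Import", "MRP1"],
}

RECORDED_POSITIONS = [1, 4, 7, 12]   # positions ticked in the ERSA recording

def replayed_views(material_type):
    """Views that a position-based replay would actually select."""
    views = VIEWS_BY_MATERIAL_TYPE[material_type]
    return [views[pos - 1] for pos in RECORDED_POSITIONS]

print(replayed_views("ERSA"))  # the views we meant to maintain
print(replayed_views("FERT"))  # same positions, different views!
```

For ERSA the replay hits Basic Data 1, Purchasing, MRP1 and Storage 2 as intended; for FERT the same four positions land on Sales Org Data 1 and Foreign Trade instead, which is exactly the effect described above.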

What happens now?

First SAP reaches the Basic Data 1 view, but as you can see there are more mandatory fields for which we do not have values in our template, fields which were not even recorded before. I enter the values manually to get further.


The static program continues from the Basic Data 1 view to the 4th view (Sales Org Data 1) and wants to enter the purchasing value key. But there is no field for the purchasing value key in the sales view, hence the next error is logged:



127 hits in SCN for: Field "does not exist in the screen"  batch input

Continuing to the next screen, to catch the next error:


We are now on the Foreign Trade view; this screen SAPLMGMM 4004 was not in our recording at all. SAP only came to this view because of view selection 07. We haven't recorded this screen and we have no data for it; right now we have data for the MRP1 view to insert, but we cannot enter it in the Foreign Trade view. Continue.

Now we are at the MRP1 view (actually position 12 from the recording), and again we see an error:


But why does SAP complain about missing batch input data? We had data for the MRP1 view in the session. Yes, but our MRP1 field values were already consumed in the Foreign Trade view. Now we are in the right view but no longer have the data for it. Bad luck.

As you see, the static recording method is not suitable if you cannot ensure the same static screen sequence and fields as in the recording.

You would need an extra recording for every variant (including unexpected error messages).

Alternative recording:

There is a way to record the views themselves instead of just the position of the view. At least this can be used if you only want to maintain data for those views. It is still not flexible enough to process sales views for a FERT material and the purchasing view for ERSA.

In the recording, select only the Basic Data view; this is usually the first view and is available in most material types (there are some material types without a Basic Data view, so even this situation is possible).

Then select the other views from within the material master via the pull-down menu at the top right:


This way SAP remembers the view names instead of the positions, and you can at least process all 4 views from this example.

However, the extra mandatory fields for FERT materials are still not included if you use an ERSA material for the recording. And you do not have those screen fields in ERSA materials if you record FERT materials instead, which then causes another error.

Further alternatives are the transactions MMZ1 (Create Material) and MMZ2 (Change Material) from SAP's stone age. They have a static view selection with boxes to be flagged. But they are really historic, and you have to check carefully whether you can get all the fields you need.

The better option is to get used to the SAP-given import methods. The BAPI method is explained in my blog LSMW Material master by BAPI method - Part 1


My migration season is over for this year. Time to blog about certain difficulties and things I learned and want to share with the community, my colleagues and myself (in case I forget before my next migration).


Among the last activities on my task list was the migration of purchasing info records.


But let me first talk a bit about the project. Due to the company's history we have many different SAP ERP systems, and in the long term almost all of them shall be consolidated into one big ERP system. We have been doing this for some years already and still have some more years ahead of us. Aside from these big merger projects we also have carve-outs and smaller roll-outs to companies that are not yet on any SAP system. While the big mergers have a project time of 1 or 1.5 years, the smaller projects are usually done in a few weeks or months.

In this last project we had to migrate data from 2 ERP systems into our target system.

In numbers: 14 companies, 36 plants, 23 purchasing organisations

Along with this migration an organisational change was executed: several companies had a legal merger into one company, others kept their independence.

Purchasing organisations got reorganized in both directions, some from n:1 and some from 1:3.

Not to forget that the data had to be harmonized: thousands of vendors, customers and materials existed in all 3 systems, and even within one system some vendors and customers existed multiple times and had to become just one in the target system.


This alone was enough to scare you, and I think even the most naive person can imagine that such a data migration cannot be executed with an LSMW Mickey Mouse recording with just 10 fields in a flat file.


Yes, here we are already at the first decision: which import method to use. The BAPI method is not available, so we have the choice of IDoc, the SAP-given batch input, and our own recording.


My preferred migration method is LSMW with the IDoc import method. I extract the data from the source system using the standard ALE scenarios, as this usually saves me the work of an extra extraction program. You can read in detail about the setup of an ALE for such a migration in my blog LSMW migration with IDOC method  and using IDOC as source  Part1: Extract by ALE

Another advantage is that I can already create a mapping from old info record number to new info record number during the conversion. The IDoc method allows creating new info records with pre-assigned numbers. With the batch input methods the info record number is only known after posting, and the creation of a mapping list is very tedious.

Recording is always only the last option, if no SAP-given method exists, as recording is a static method which here does not give the flexibility needed to migrate info records with an unknown number of purchasing organisation views.

So the decision had to be made between standard batch input and IDOC method.


Here I want to emphasize the issues specific to the info record migration using IDocs as data carrier and import method.


  • The extraction was planned to be done with transaction ME18, but we did not have a role with this transaction, as distribution of info records is not part of our process model. So I had to use SE38 to execute report RBDSEINF to write the info records into a file, which I later used as my source file.

  • How the field mapping for a huge file full of IDocs is done was already explained in my blog LSMW migration with IDOC method and using IDOC as source Part2: Import. The field mapping is actually not an issue; however, there is one stumbling block even here. We usually know which fields we use in our process, so we focus on those fields and may not map the other fields, or may just assign an initial value to such "unused" fields. Due to historic reasons there is one field which is too short: Export/import procedure for foreign trade. SAP added another field with a bigger length at the end of the IDoc. Going by the name, you naturally focus on the short field, and if you forget to map this second field too, then you only migrate a truncated value to the target system.

  • The real trouble starts with the extraction, with sending the IDocs by ALE. This IDoc is not really made for migrations (even SAP says in OSS note 327090 that it is not made for data transfers); it is just made to distribute info records from a central system to satellite systems. For such a process you need an ISO code assigned to every unit of measure used in an info record. If you do not have such a process, then you may not have ISO codes for all units. If you send the info records in foreground, SAP stops with a hard error message when it reaches an info record that has a unit without an ISO code.


You can just click Exit at the error message, and you are taken back to the menu. You do not know at which IDoc it stopped; you know neither vendor nor material.

To avoid such situations I create a small QuickView joining tables EINA and EINE, listing the info record number, vendor, material and all unit of measure fields. Sort this by the units and then check in table T006 whether each used unit has an ISO code. Based on the number of occurrences we decide whether to do additional customizing for units and ISO codes, or to exclude those info records from the automatic migration and create them manually, if needed at all.
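The logic of that pre-check can be sketched in a few lines of Python. The structures below are invented stand-ins for the joined EINA/EINE data and for T006; the field names are hypothetical:

```python
# Hypothetical sketch (not the QuickView itself): given the joined
# EINA/EINE data and a T006-like unit table, list the info records whose
# units have no ISO code, so they can be fixed or excluded up front.

T006 = {"PC": "PCE", "KG": "KGM", "DR": None}   # unit -> ISO code (None = missing)

info_records = [
    {"inforec": "5300000001", "vendor": "100001", "material": "MAT-1", "units": ["PC"]},
    {"inforec": "5300000002", "vendor": "100002", "material": "MAT-2", "units": ["DR", "KG"]},
]

def records_without_iso(records, unit_table):
    """Return (inforec, vendor, material, bad_units) for every record
    that uses at least one unit without an ISO code."""
    problems = []
    for rec in records:
        bad = [u for u in rec["units"] if not unit_table.get(u)]
        if bad:
            problems.append((rec["inforec"], rec["vendor"], rec["material"], bad))
    return problems

for hit in records_without_iso(info_records, T006):
    print(hit)
```

Running such a check before sending the IDocs tells you exactly which info records and which units would stop the foreground extraction.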


  • And for the sake of an unreleased function module, this INFREC IDoc does not work as you would expect from a BAPI. The defaults are not taken from the material master if you do not transfer the data for reminder days and under-/overdelivery tolerances in the IDoc. OSS note 327090 contains a modification to make this possible if you want. And even if you have the values for reminder days and tolerances in your IDoc, you may still experience a surprise: if the reminder days are set to zero, defaults are taken from the purchasing value key entered for the material group in the customizing of Entry Aids for Items Without a Material Master.











This is a real case; however, it has quite inconsistent data. The material master has purchasing value key 1210, the entry aids have PVK 6001, and the info record even has a third variant. We have to ask ourselves whether such an inconsistency should be carried over, or whether it is better to have the data consistent again, according to the design of the target system.


  • The next difficulty I faced was a typical beginner mistake; I almost want to call it an error born of plagiarism. I admit I am lazy: I copy the LSMW object from my last project and just concentrate on the new mapping. Since many materials got new units, we had to re-calculate the standard quantity and the minimum quantity. For this I make use of the standard SAP function module MB_UNIT_CONVERSION. I never had any problem with it in past migrations, but there is always a first time: my conversion step dumped. And because it had worked in previous years I did not check the coding in my program; I suspected the error was in the data. Finally I found that I had simply missed catching the exceptions when calling the function module. There were 7 (manually created) materials that did not have the expected alternative units to allow a unit conversion.
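The lesson generalizes beyond ABAP. Here is a minimal Python sketch of it (all names and data invented, not the real MB_UNIT_CONVERSION call): a conversion routine must catch the "no conversion found" case itself, otherwise the whole load dies on the first material without the expected alternative unit:

```python
# Sketch of the lesson learned: catch the conversion exception and flag
# the material instead of letting the whole conversion step dump.

class ConversionNotFound(Exception):
    pass

# alternative units per material: (material, from_unit, to_unit) -> factor
ALT_UNITS = {("MAT-1", "PC", "BOX"): 12.0}

def unit_conversion(material, qty, from_unit, to_unit):
    """Stand-in for the function module: raises if no alternative unit."""
    try:
        factor = ALT_UNITS[(material, from_unit, to_unit)]
    except KeyError:
        raise ConversionNotFound(material)
    return qty * factor

def convert_or_flag(material, qty, from_unit, to_unit):
    """What the fixed conversion step does: catch the exception and
    report the material, so the load continues for everything else."""
    try:
        return unit_conversion(material, qty, from_unit, to_unit), None
    except ConversionNotFound:
        return None, f"{material}: no alternative unit {from_unit} -> {to_unit}"

print(convert_or_flag("MAT-1", 2, "PC", "BOX"))   # converts fine
print(convert_or_flag("MAT-9", 2, "PC", "BOX"))   # flagged, load continues
```

In ABAP the equivalent fix is simply to fill the EXCEPTIONS block of the function module call instead of leaving it empty.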


  • Still this was not the last stumbling block. We met situations where the material master in the target system had a material status that does not allow procurement activities, and even vendor masters with a block. Since we mapped to existing master data where possible, this caused posting errors. You cannot create info records for a material whose status blocks procurement; hence you cannot create them with the IDoc either.



No real issue for me, but in any case good to know: you have to take care of the info record number. If you leave it blank and let SAP automatically assign a number when posting, then the info record texts are not created. If you just map the number fields and MOVE the old number, then you either create info records outside your number range or, even worse, overwrite existing info records. But I had chosen the IDoc method precisely to create the info record number already in the conversion routine, for the mapping which I need in the subsequent step of the pricing condition load with the COND_A IDoc.

More about this in my next blog: Difficulties in price condition load for purchasing info records with COND_A IDoc method


If you work for the first time with a new import method, you should search for OSS notes and read especially the consulting notes; even error correction notes contain a lot of good info and may explain the SAP design.

Below you find some notes as reference for the "trouble" mentioned above, but there are many more about INFREC.


832669 - INFREC: Export/import procedure code has only five digits

326028 - INFREC: Info records are created twice (EINA)

481034 - FAQ: Data transfer (batch input) in purchasing

327090 - INFREC: Default data is not transferred

327083 - INFREC: Data transfer of purchasing info records

Hi All,

Hope you are doing well.

I am an SAP MM consultant.

Recently I configured a new pricing procedure for a new purchasing organization, and I want to share it with you all.

First we have to know what a pricing procedure is.


What is Pricing procedure?

The main idea of a pricing procedure is the combination of different types of charges, like gross price, freight, discounts, surcharges etc.

We use a pricing procedure to bring all these conditions together into one procedure, where we can find the subtotals and the net amount.


To understand the pricing procedure we have to be comfortable with these things:

1. Condition Table

2. Access Sequence

3. Condition Type

4. Condition Record

5. Schema Group

6. Calculation Schema

7. Schema Determination


Let's discuss all these points in detail.


1. Condition Table

It's a table in which we save the combination of fields for an individual condition record. Suppose I use Plant as the condition table key; then condition records will be created per plant only.

We can use many fields in one condition table.


2. Access Sequence

The main job of an access sequence is to search for the condition records of a condition type in the condition tables.

One access sequence can contain many condition tables.

Suppose we maintain 4 condition tables in one access sequence. Then when a condition type searches for a condition record via this access sequence, the search will look in only these 4 condition tables.
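As a rough illustration (in Python, with invented keys and values, not SAP internals), an access sequence is an ordered search through condition tables from the most specific to the most general, and with the Exclusive indicator the first valid record ends the search:

```python
# Hypothetical sketch of an access sequence: an ordered list of condition
# tables; with the exclusive flag the first valid condition record wins.

# condition records per condition table, keyed by the table's fields
CONDITION_TABLES = [
    ("vendor/material", {("V-100", "MAT-1"): 9.50}),   # access 10, more specific
    ("vendor",          {("V-100",): 10.00}),          # access 20, more general
]

def find_condition_record(vendor, material, exclusive=True):
    hits = []
    for name, table in CONDITION_TABLES:
        key = (vendor, material) if name == "vendor/material" else (vendor,)
        if key in table:
            hits.append((name, table[key]))
            if exclusive:            # stop after the first valid record
                break
    return hits

print(find_condition_record("V-100", "MAT-1"))  # the most specific record wins
print(find_condition_record("V-100", "MAT-9"))  # falls back to the vendor table
```

This is also why the order of the accesses matters: if the more general table came first, the specific vendor/material price would never be found with the exclusive flag set.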


3. Condition Type

In simple terms, a condition type represents a particular kind of charge, like gross price, discount, freight, rebate etc.

Suppose we purchase a material for a price of 10 and get a discount of 2. Then the 10 goes to one condition type and the 2 goes to another condition type.


4. Condition Record

A condition record contains the values maintained against a condition table with regard to a condition type.

It is fetched via the access sequence of the condition type.

Suppose we maintain a condition record against a condition table (vendor) with regard to a condition type. Then whenever the vendor uses this condition type, this condition record will be fetched.


5. Schema Group

Schema groups are assigned to our vendor and purchasing organization; they determine which pricing procedure is chosen for the vendor and purchasing organization.

One schema group is assigned to the vendor and one schema group is assigned to the purchasing organization.


6. Calculation Schema

Here we maintain the pricing calculation: gross price, discount, rebate, surcharges etc. We maintain here the calculation steps for all condition types and group the condition types together.


7. Schema Determination

Here we maintain which pricing procedure is used for a purchasing document. We maintain a calculation schema for each combination of vendor schema group and purchasing organization schema group.
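Conceptually, schema determination is just a lookup table. A minimal Python sketch (schema group and procedure names are invented examples, not a real configuration):

```python
# Hypothetical sketch: schema determination maps the purchasing
# organization's schema group plus the vendor's schema group to a
# calculation schema.

SCHEMA_DETERMINATION = {
    ("ZMM1", "Z1"): "ZRM001",   # example custom procedure (invented name)
    ("",     ""):   "RM0000",   # entry with blank schema groups
}

def determine_schema(porg_group, vendor_group, default="RM0000"):
    """Pick the calculation schema for this combination of schema groups."""
    return SCHEMA_DETERMINATION.get((porg_group, vendor_group), default)

print(determine_schema("ZMM1", "Z1"))  # the custom procedure
print(determine_schema("", ""))        # the blank/default combination
```

At PO creation time SAP performs exactly this kind of lookup: the purchasing organization of the PO and the vendor of the PO each bring their schema group, and the combination selects the calculation schema.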







Step 1 : Maintain Condition Table


T-code M/03, or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Maintain Condition Table-Create Condition Table.

The initial screen for creating a condition table will appear.


Give the new condition table number. You can use an old condition table number as a reference.

(Note: standard SAP delivers condition tables for general business requirements. If no existing condition table fulfills your requirement, you can create a new one. Please use a number greater than 900 for the condition table (recommended).)


Now press Enter.


On the initial screen the left side is blank and the right side is always filled. Just double-click on the right side the fields which you want to add to this condition table.

When you double-click a field, it turns blue and appears on the left side.

As you can see, I have selected 3 fields for this condition table.

Save your data.

You can also change the options here.


Step 2 : Maintain Access Sequence


T-code M/07 or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Define Access Sequences.

The initial screen will appear.


You can create a new access sequence or maintain the condition table in an existing access sequence (depending on your business process).

If you want to create a new one, click New Entries. (Or you can copy an existing one as a reference: just select the access sequence and press Copy As...)


Enter your access sequence, give it a description, and optionally choose the access category.

Then select the access sequence and double-click on Accesses.


Click on New Entries

Enter the access number, the condition table and, if required, the requirement routine number and the Exclusive indicator.

If you tick the Exclusive indicator, the system stops searching as soon as a valid condition record is found.

Choose as per your requirement.


You can add more condition tables to this access sequence.

Save your data.


Step 3 : Maintain Condition Type

T-code M/06 or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Define Condition Types-Define Condition Type

The initial screen will appear.


For the gross price, SAP delivers by default PBXX as the time-independent condition and PB00 as the time-dependent condition.

Time-dependent conditions are used with a validity period (we use them in info records, RFQs, contracts and scheduling agreements, depending on the configuration of the document type).

So we don't need to create a new condition type for the gross price, although you can create your own starting with Z.

Assign your newly created access sequence to the gross price condition type (I have maintained it for PBXX).


Here you can find all kinds of condition types, such as freight, discount, rebate, cash discount etc. If one of them fulfills your requirement, there is no need to create a new condition type. If not, go for New Entries.


Here I have maintained ZCAH as a discount; I gave it condition class A, calculation type A, and Plus/Minus X (negative).

(Note: you are not required to maintain it exactly like that; maintain it as per your requirement.)

In the same way, please create the other condition types as per your requirement (which will be used in the pricing procedure).

Save your data.


Step 4 : Maintain Calculation Schema

T-code M/08 or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Define Calculation Schema

This is a very important part of the pricing procedure.

Here we define the calculation for all condition types.


Here you can also find the SAP default calculation schemas; you can use one of them as your pricing procedure, or you can create a new calculation schema as per your requirement.

Just click on New Entries. (Or copy an old schema and modify it: select it and press Copy As...)


Give the new procedure a name and a description.

Then select this procedure and press Control (on the left side).


You can see a lot of options there. You can read the help for every option, provided by SAP (just select the field and press F1).

Maintain PBXX as the gross price in step 1, counter 1; leave From and To blank; set Subtotal to 9 (copy values to KOMP-BRTWR, gross value).

Maintain the other condition types as per your requirement.


I have maintained ZCAH and ZCAS with From = 1, which means they are calculated on the gross price (PBXX).

I have used a Total Discount line with From 7 To 9; it shows the total discount value (the total of ZCAH and ZCAS).

Finally I have used a TOTAL AMOUNT line; it shows the net value for this pricing procedure.
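The From/To mechanics above can be shown with a small worked example in Python (all amounts and percentages invented; this only mimics the arithmetic of the control lines, not the real pricing engine):

```python
# Hypothetical worked example: PBXX in step 1, two percentage discounts in
# steps 7-8 calculated From step 1, a total-discount line summing steps
# 7 To 9, and a final net total.

steps = [
    # (step, condition type, percentage applied to the 'From' step value)
    (7, "ZCAH", -2.0),   # discount, From = 1 (gross price)
    (8, "ZCAS", -3.0),   # another discount, From = 1
]

gross = 100.00                      # PBXX value from step 1 (invented)
values = {1: gross}
for step, ctype, pct in steps:
    values[step] = round(gross * pct / 100, 2)

# the Total Discount line: sum of the step values From 7 To 9
total_discount = sum(v for s, v in values.items() if 7 <= s <= 9)
net = gross + total_discount        # the TOTAL AMOUNT line

print(total_discount)  # -5.0
print(net)             # 95.0
```

So a From/To range on a schema line is simply a sum over the values of the referenced steps, and a From reference on a percentage condition defines the base it is calculated on.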


Step 5 : Maintain Schema Group for Vendor

T-code OMFN or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Define Schema Group-Schema Groups: Vendor

Just click on New Entries, enter the vendor schema group (Schema Grp Vndr) and give it a description.


Here I have maintained Z1.


Step 6 : Maintain Schema Groups for Purchasing Organizations

T-code OMFM or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Define Schema Group-Schema Groups for Purchasing Organizations

Just click on New Entries and enter the purchasing organization schema group (Schema GrpPOrg) and its description.


Here I have maintained ZMM1.


Step 6 (continued) : Assign Schema Group to Purchasing Organization

T-code OMFP or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Define Schema Group-Assignment of Schema Group to Purchasing Organization

Here find your purchasing organization and assign the Schema GrpPOrg to it.


Step 7 : Maintain Schema Determination

T-code OMFO or path SPRO-IMG-MM-Purchasing-Conditions-Define Price Determination Process-Define Schema Determination-Determine Calculation Schema for Standard Purchase Orders



Step 8 : Assign Schema group to Vendor

When you create a vendor with XK01, in the purchasing data you will find the field "Schema Group, Vendor".

Here assign the schema group which you created in step 5.

You can also maintain it for an existing vendor via XK02.



Step 9 : Maintain Condition Record

T-code MEK2

Maintain the condition record against the key combination.


Save your data.



Finally, all the configuration is done.


Now let's see the result.

Create a PO with this purchasing organization and with this vendor.


We can see that my new pricing procedure is working perfectly.


We can also use this in a scheduling agreement (SA).

We just have to configure it for the scheduling agreement document type.

Go to path SPRO-IMG-MM-Purchasing-Scheduling Agreement-Define Document Types


Just un-tick "Time-Dep. Conditions" for the SA document type.

Then create an SA for this vendor and purchasing organization.



We can also maintain a fixed discount for a particular vendor.

Suppose one of my vendors gives me a 5% discount on all materials.

We can maintain it in a condition record.

Go to MEK2


Click on Supplement Condition.

Maintain there your condition type ZCAH as 5.

You can see the default 5% discount automatically appears in the PO or SA.



Hope this will be helpful for all.

