Hi Gurus,

I recently came across a requirement to print a QR code (Quick Response code, a 2D bar code) on Smartforms, or to download it as an image file.

Here I am going to explain how it can be done.


Step 1: Upgrade the Kernel if Required


Log in to SAP, and from the SAP Easy Access screen choose System menu >> Status.

Now click the component button to check that the SAP_BASIS release is 731.

Close the pop-up dialog and click the arrow button. The kernel patch level should be 321 or higher. If it is not, ask your Basis team to upgrade it.




Step 2: Implement SAP Note 2029824

Ask the developer to implement SAP Note 2029824 (Support for QR code and data matrix bar code).

Step 3: SAPscript Font Maintenance

After the SAP Note has been implemented successfully, go to transaction SE73.

Click the System Bar Codes option and click Change.

Click Create (F5).

Click New (new bar code technology).

Define the name and description of the bar code.

Select the bar code symbology "QR Code 2005" and click OK. Select the alignment "Normal".



Choose the mode, module size, and error correction level (refer to SAP Note 2030263 for further information).

Mode:

1 - FNC1_POS1 (FNC1 in first position)

2 - FNC1_POS2 (FNC1 in second position)

Module size: numeric user entry


Error correction level:

L - 7% error correction capability

M - 15% error correction capability

Q - 25% error correction capability

H - 30% error correction capability

Save the bar code definition, then select it and test it.
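The trade-off behind these levels is capacity: the stronger the error correction, the fewer data bytes fit in the same symbol. The small Python sketch below is not part of the SAP solution; it only illustrates the idea, using the published byte-mode capacities of the smallest (Version 1) QR symbol. The helper name is mine.

```python
# Byte-mode data capacity of a Version 1 (21x21 module) QR symbol per
# error-correction level, from the QR Code specification.
CAPACITY_V1_BYTES = {"L": 17, "M": 14, "Q": 11, "H": 7}

def strongest_fitting_level(payload: bytes) -> str:
    """Return the strongest EC level whose Version 1 capacity still holds
    the payload. Real encoders instead grow the symbol version, but the
    trade-off is the same: more error correction means fewer data bytes."""
    for level in ("H", "Q", "M", "L"):   # H = 30% ... L = 7% recovery
        if len(payload) <= CAPACITY_V1_BYTES[level]:
            return level
    raise ValueError("payload needs a larger symbol version")

print(strongest_fitting_level(b"1234567890"))  # 10 bytes exceed H's 7, so Q
```

This is why a label that must survive smudging (level H) holds noticeably less data per symbol than one generated at level L.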



Step 4: Create a Program to Download the QR Code as a BMP Image File

Write the code below in an SE38 program. The program downloads the QR code as an image (.bmp format) to your presentation server.


*&---------------------------------------------------------------------*
*& Report  YOTQR2
*& Downloads any system bar code (here the QR code ZQRCODE) as a BMP
*& image file on the presentation server.
*& Note: the original post lost several statement fragments (the CALL
*& FUNCTION keywords and some parameter lists); they are reconstructed
*& below and should be verified against the form routines in program
*& SAPMSSCO on your release.
*&---------------------------------------------------------------------*
REPORT yotqr2.

PARAMETERS: barcode      LIKE tfo05-tdbarcode DEFAULT 'ZQRCODE',
            barcdata(50) TYPE c LOWER CASE DEFAULT '1234567890',
            filename     TYPE string LOWER CASE DEFAULT 'd:\1.bmp'.

DATA: errmsg(80)   TYPE c,
      bc_cmd       LIKE itcoo,
      bp_cmd       LIKE itcoo,
      bitmapsize   TYPE i,
      bitmap2_size TYPE i,
      w            TYPE i,
      h            TYPE i,
      bitmap       LIKE rspolpbi OCCURS 10 WITH HEADER LINE,
      bitmap2      LIKE rspolpbi OCCURS 10 WITH HEADER LINE,
      otf          LIKE itcoo OCCURS 10 WITH HEADER LINE.

* Build the OTF bar code (BC) and bar code print (BP) commands.
PERFORM get_otf_bc_cmd IN PROGRAM sapmssco
                       USING barcode
                       CHANGING bc_cmd.
CHECK sy-subrc = 0.

bp_cmd-tdprintcom = 'BP'.
PERFORM get_otf_bp_cmd IN PROGRAM sapmssco
                       USING barcode
                       CHANGING bp_cmd.
CHECK sy-subrc = 0.

* Render the bar code data into a bitmap.
PERFORM renderbarcode IN PROGRAM sapmssco
                      TABLES bitmap
                      USING bc_cmd
                            bp_cmd
                            barcdata
                      CHANGING bitmapsize w h errmsg.
CHECK sy-subrc = 0.

* Convert the bitmap into OTF raster lines.
PERFORM bitmap2otf IN PROGRAM sapmssco
                   TABLES bitmap
                          otf
                   USING bitmapsize w h.

DATA length  TYPE i.
DATA hex     TYPE xstring.
DATA bitmap3 TYPE xstring.
FIELD-SYMBOLS <fs> TYPE x.

* Collect the raw bitmap bytes from the OTF lines: each TDPRINTPAR
* carries a two-character length followed by that many bytes of data.
CLEAR: hex, bitmap3.
LOOP AT otf.
  length = otf-tdprintpar+2(2).
  ASSIGN otf-tdprintpar+4(length) TO <fs> CASTING.
  hex = <fs>(length).
  CONCATENATE bitmap3 hex INTO bitmap3 IN BYTE MODE.
ENDLOOP.

* Convert from the old bitmap format to the new one by patching the header.
hex = 'FFFFFFFF01010000'.
CONCATENATE bitmap3(8) hex bitmap3+8 INTO bitmap3 IN BYTE MODE.
CLEAR hex.
SHIFT hex RIGHT BY 90 PLACES IN BYTE MODE.
CONCATENATE bitmap3(42) hex bitmap3+42 INTO bitmap3 IN BYTE MODE.

* Turn the xstring into a binary table for the conversion function.
DATA bitmap4 TYPE sbdst_content.
CALL FUNCTION 'SCMS_XSTRING_TO_BINARY'
  EXPORTING
    buffer     = bitmap3
  TABLES
    binary_tab = bitmap4.

DATA bitmap4_size TYPE i.
bitmap4_size = xstrlen( bitmap3 ).

* Convert the BDS bitmap into a BMP file.
CALL FUNCTION 'SAPSCRIPT_CONVERT_BITMAP'
  EXPORTING
    old_format               = 'BDS'
    new_format               = 'BMP'
    bitmap_file_bytecount_in = bitmap4_size
  IMPORTING
    bitmap_file_bytecount    = bitmap2_size
  TABLES
    bitmap_file              = bitmap2
    bds_bitmap_file          = bitmap4
  EXCEPTIONS
    no_bitmap_file           = 1
    format_not_supported     = 2
    bitmap_file_not_type_x   = 3
    no_bmp_file              = 4
    bmperr_invalid_format    = 5
    bmperr_no_colortable     = 6
    bmperr_unsup_compression = 7
    bmperr_corrupt_rle_data  = 8
    bmperr_eof               = 9
    bdserr_invalid_format    = 10
    bdserr_eof               = 11.

* Download the BMP to the presentation server.
CALL METHOD cl_gui_frontend_services=>gui_download
  EXPORTING
    bin_filesize = bitmap2_size
    filename     = filename
    filetype     = 'BIN'
  CHANGING
    data_tab     = bitmap2[]
  EXCEPTIONS
    file_write_error = 1
    no_batch         = 2
    OTHERS           = 3.
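For readers curious about the target format that SAPSCRIPT_CONVERT_BITMAP has to produce: an uncompressed BMP is just two fixed-layout headers followed by bottom-up, 4-byte-aligned pixel rows. This Python sketch (purely illustrative, independent of the ABAP program above) builds a minimal 24-bit BMP from scratch:

```python
import struct

def make_bmp(pixels):
    """Build a minimal 24-bit uncompressed BMP from a 2D list of (r, g, b)
    rows. Rows are given top-down; BMP stores them bottom-up, with each
    pixel row padded to a multiple of 4 bytes."""
    height = len(pixels)
    width = len(pixels[0])
    row_size = (width * 3 + 3) // 4 * 4          # rows padded to 4 bytes
    image_size = row_size * height
    data_offset = 14 + 40                        # file header + BITMAPINFOHEADER
    file_size = data_offset + image_size

    # BITMAPFILEHEADER: magic, file size, two reserved words, pixel offset
    header = struct.pack("<2sIHHI", b"BM", file_size, 0, 0, data_offset)
    # BITMAPINFOHEADER: size, width, height, planes, bpp, compression,
    # image size, x/y resolution, palette sizes
    header += struct.pack("<IiiHHIIiiII", 40, width, height, 1, 24, 0,
                          image_size, 2835, 2835, 0, 0)

    body = b""
    for row in reversed(pixels):                 # bottom-up storage
        line = b"".join(bytes((b, g, r)) for r, g, b in row)  # BGR order
        body += line + b"\x00" * (row_size - len(line))
    return header + body

# A 2x2 black/white checkerboard, like two QR modules.
img = make_bmp([[(0, 0, 0), (255, 255, 255)],
                [(255, 255, 255), (0, 0, 0)]])
```

The header patching done in the ABAP above is the same kind of byte-level surgery, just on SAP's internal BDS bitmap format rather than on the final BMP.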


Step 5: Create a Smartstyle for the QR Code

Define your Smartstyle as in the screenshots below.

Define the QR code in a character format as in the screenshot below.

Step 6: Use the Smartstyle in the Smartform

Assign the Smartstyle in the form attributes as per the screenshot below.

Define the text as in the screenshots below. Caution: use <c1> (lower case), not <C1> (upper case), otherwise the QR code will not be printed.

In this example, I have encoded the material number in the QR code.

Test the Smartform.


You can use the same procedure to include a QR code on your own Smartforms.

This bar code can encode up to 255 characters (the limit of the bar code data field), which should be enough for most uses.

Using it you can shrink the size of your labels: for example, encode the address information as a QR code and save space on the form for additional details.

Recently you may have noticed a problem where a goods receipt (GR) material document is cancelled, but the corresponding inspection lot is not cancelled together with the material document. You can still see the inspection lot in transaction MD04.


Here is basically what happens. The preconditions:

  • The indicator "Manual sample calc." is set for inspection type 01/04 in MM03, or no inspection plan or material specification has been created for the material although the inspection type requires one.
  • Early lot creation is not active for origin 04, i.e. the field "Control insLot" is not set to "Y Early lot creation for the order item" for the affected inspection type.

    It looks something like this in MM03:



Steps to reproduce:

  1. Post a goods receipt to an order in transaction MB31 or MIGO.
  2. An inspection lot is created with status CRTD.
  3. Check MMBE: the QI stock has increased.
  4. Cancel the GR material document that created the inspection lot in MBST.

After the cancellation:

  1. QA03: the inspection lot is not cancelled. The status is still CRTD.
  2. MMBE: the QI stock has disappeared.
  3. MD04: the inspection lot is still displayed.


Permanent fix:

Implement SAP Note 2186161, which fixes a bug in this case. After applying the note, when you post new GR material document cancellations in MBST, the inspection lot is cancelled together with the material document: the lot status becomes LTCA automatically and the lot is no longer displayed in MD04.


Inconsistency correction:

For existing inspection lots, to correct the inconsistency, follow the steps below:

  1. Execute the report ZQEVAC20 from SAP Note 48815 for the material and plant.
    After the report has been executed, the inspection lot status becomes SPCO,
    and the inspection lot is no longer displayed in MD04.
  2. If you are still concerned about the lot status CRTD and want to change it to LTCA, first assign an inspection plan and calculate the sample size of the inspection in QA02; the lot status then becomes REL. Then go to QA11 and record a dummy usage decision so that the lot status becomes UD.
    Finally, cancel the inspection lot in QA12.
  3. If you cannot carry out step 2 because no inspection plan can be assigned to the inspection lot, report an incident to SAP Product Support under component QM-IM for further help.

I have different materials produced in different batches. Say there are 21 batches, and the QM user records the results for one batch; the system should automatically copy all the characteristic results to the other batches.

Please provide the standard configuration for this. How can I map it? The inspection type to be used is 04.

1. Overview

Friends, this is indeed a huge topic with several angles to discuss. You will find many threads on data migration that focus on technical aspects, such as problems with specific migration tools: whether LSMW is good, whether BDC gives faster and more accurate results, whether there is some other tool, and so on. Here I will not discuss the technical complexities of the tools. The idea is how neatly this entire activity can be planned, what factors should be considered in planning, and how it should be executed for a hassle-free experience during the data migration phase.

2. Frequently faced problems during Data Migration

Do you really think the only problems faced during the data migration phase are related to the tools used? I don't think so! Other factors also contribute. Looking back at the projects I have worked on, and asking experienced consultants who have worked on data migration, the answers look like this:

  • Delay in receiving the data from the business users
  • No time left for technical validation as the go-live deadline approaches
  • No authorization in the production server for uploading the data
  • The tool not working, or giving incorrect results, while the upload is in progress
  • The relevant master data transaction being locked by another user at the same time, resulting in errors
  • Records that cannot be loaded because dependent masters do not yet exist in the system
  • Incorrect data, such as a field containing more characters than permitted; or a new field added by the customer at the last moment with no provision in the tool to incorporate it
  • Someone from the business team complaining immediately after go-live that the uploaded master data is incorrect or that inventories were loaded with incorrect valuation
  • A few records left out or never provided by the user, and so not uploaded at all

3. Importance of Planning

The above list could go on if we went into very minute detail, and you can observe the variety of issues here: from the consulting side or from the business side, covering both technical and non-technical matters. The important question is: can these problems be avoided? In my opinion, yes. With strong planning and execution strategies in place, all of them can be suppressed. In planning, you define a check sheet (based on your past experience) for the activities you are supposed to do, with timelines for every single step. The planning check sheet gives transparency and clarity to the consultants and the business users about their individual roles in this phase.

Master data is a key data element in any SAP project, and that is why it should be given the utmost importance. Master data is the main basis for the success or failure of the system, so awareness needs to be spread among the users who deal with it on the business side; at the same time, the consultant should also understand the consequences. It is not only the data migration phase that is affected: inaccurate master data affects the entire business. Accurate master data should therefore be a top priority for any organization running on SAP. In my opinion, successful data migration is the joint responsibility of the business and the consulting team.

4. Factors to be considered while planning

Many project managers and delivery managers plan this as a special event, with a very detailed matrix of sequential activities and fixed responsibilities, which really helps in streamlining the project resources. In some cases a separate coordinator is assigned to monitor this activity closely. Although detailed planning is done by most managers, it is equally important that every team member understands and is aligned to it. I cannot provide every single step here, as it varies from project to project and industry to industry, and with project size. However, below are a few common factors that can help you plan this activity well.

4.1. Identify the appropriate masters & fields required during data migration

The ideal time to start is as soon as the Business Blueprints (BBPs) are signed off. Sign-off indicates that the requirement-gathering phase is over and there is rare or no chance that new requirements will be entertained. So at this stage, or when you design the solution approach, you are fairly clear about what types of master data are required from the business side. Start listing such masters (e.g. vendor master, customer master, material master, routings, inspection plans, BOMs, equipment master) for each module.

It is not the case that every field on the SAP screen is used for any given master; this depends on the business requirements. And as the BBP phase is signed off by now, you can identify the fields that need to be used. Also consider any custom fields you plan to develop as part of your solution. List all such fields master-wise and module-wise. All of this serves as input for the data migration tool development, which itself takes time. Hence it is better to start on it as soon as the BBPs are signed off.

4.2. Tool Development

Once you have identified the required masters and fields, the next step is to develop a tool for the data migration. Choose the tool wisely.

Although there is no specific rule about which tool to use, I personally first check whether a standard tool is available, such as transactions MASS, MM50, QA08, or QI06. By opting for such standard transactions you minimise the risk of tool failure in the production environment. The pity is that a standard transaction is not available for every master you need.

The next options are LSMWs or BDCs. Some consultants prefer LSMWs while others choose BDCs, and in a few cases people rely on eCATTs. For some objects, standard LSMWs are available via the batch input method. LSMWs save a lot of ABAP development time compared with BDCs, but there are a few objects where BDCs prove more useful. Decide wisely, and as early as possible, which you will adopt, taking into account the fields you have already listed.

4.3. Tool Testing

Testing these tools is the next important factor. Test each tool by uploading some dummy records to your test server first, and take corrective action on the tool based on the results. You can verify in the SAP tables whether the records were generated. It is also better to take a table dump before uploading the masters, so that you can later compare for correctness.

Once you are satisfied with the tool results, circulate the master data templates to the core users who will be responsible for filling in the master data. The templates should contain the information below:

  • Field name
  • Field description
  • Maximum number of characters allowed
  • Explanation of each field
  • Whether the field is mandatory
  • Any other extra information if needed

It is better if you can share sample filled-in templates with the core users. That will help them understand how the master data needs to be filled in.
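Such a template can also be kept in machine-readable form, which makes the later validation step scriptable. A small Python sketch follows; the field names are standard material-master fields, but the selection and limits here are just an example, not a fixed rule set:

```python
# Illustrative master-data template spec: each entry mirrors the columns
# recommended above (name, description, max length, mandatory flag).
TEMPLATE = {
    "MATNR": {"description": "Material number", "max_len": 18, "mandatory": True},
    "MAKTX": {"description": "Material description", "max_len": 40, "mandatory": True},
    "MEINS": {"description": "Base unit of measure", "max_len": 3, "mandatory": True},
    "MTART": {"description": "Material type", "max_len": 4, "mandatory": True},
}

def describe(template):
    """Render the spec as the kind of sheet you would circulate to core users."""
    lines = []
    for field, spec in template.items():
        flag = "mandatory" if spec["mandatory"] else "optional"
        lines.append(f"{field:6} {spec['description']:22} <= {spec['max_len']} chars, {flag}")
    return "\n".join(lines)

print(describe(TEMPLATE))
```

Keeping the spec in one place means the circulated sheet and any later validation script cannot drift apart.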

4.4. Use of reusable assets

Time saving results in profitability! If you are working on a rollout project, there is a chance that your earlier team, or you yourself, have already developed the data upload tools, and these are proven tools as they were already used for an upload. So why not use them again to upload the data for the new plant on the same server? I always recommend reusing such tools in the new plant to the maximum extent you can, assuming the master data fields are the same. You thus minimise the threat of tool failure.

4.5. Master data awareness sessions to business users

Successful data migration does not only mean uploading the data into the system without errors; it means more than that. The best measure of success is very few, or no, data-related tickets immediately after go-live. It is therefore the joint responsibility of business users and consultants: business users are responsible for data collection and the accuracy of the data provided, while it is the consultant's skill to upload the received data successfully into the system. If either side fails, it has an adverse impact on the business.

Having said this, it is important that business users are well aware of the consequences of providing inaccurate, incomplete, or tentative ("to be") master data, and of how badly it will hit their future business. Consultants should hold healthy discussions with business users to make them aware of these facts and to explain the roles and responsibilities.

In my opinion, this phase should cover the points below:

  1. Publish and explain the data migration plan to business users, with clarity on the roles of the users and the consultants.
  2. Arrange meetings with them to explain the overall importance of accurate master data and how it affects future business.
  3. Explain the importance of dependent master data. This ensures that all parties refer to the same masters. For example, the material master is prepared by the MM team, but the bill of materials for the same material is prepared by the PP team, so both should use the same material code and description during data collection. This keeps all departments in sync and minimises the risk of referring to the same object under different codes.
  4. In one-on-one sessions with the respective users, explain every field of the master data template. Also make them aware of field lengths and other parameters that are mandatory.
  5. Duplication of records is another threat to data migration. It is good to make the users aware of this as well.


4.6. Sample data validation and feedback to Core User

Ask your core users for sample data for your verification. This helps ensure the users are filling in the template as required, avoids rework, and clears up the understanding between you and the user. Once you give your consent, the user can start collecting the bulk data.

4.7. Status Meetings at Regular Interval

Perhaps this is the thing most folks find scary, but it is always advisable to hold meetings at regular intervals to review the status of data collection. It is the forum where both sides (core users and consultants) can raise their concerns and explore solutions. The more transparency you maintain in these meetings, the fewer problems you face while loading the masters. Such meetings help you track the progress of data collection, especially for the dependent masters.

4.8. Cut off for receiving master Data

Set a cut-off date in the planning phase for receiving the data from business users. Business users are expected to submit the entire applicable master data by this date, and the date should be mutually agreed well in advance. This is a really good practice, I would say, as it can suppress many problems that might arise later. Imagine you have invested a week validating the data received (using your Excel expertise) and the file is ready for upload; at this point a user sends many additional records, saying he or she missed them in the earlier file. Gosh! If the proper idea had been given to them well in advance, they could have prepared the entire data set by the cut-off date without hampering their regular office work.

I accept that this theory cannot hold every time. But if it is mutually agreed with the customer, both parties can save a lot of time and concentrate on the next steps.

4.9. Validation and sign off

Once you receive the data from the users, you validate it. If abnormalities are observed during validation, you send the file back to the users for correction, and the cycle repeats until the final version of the file is in ready-to-upload format. Several tools are available these days for validation; you can even develop a validation tool in SAP that compares the values in an Excel or text file with the dependent masters and lists the abnormalities in a few seconds. Using Excel techniques you can suppress duplicate records, then ask your user to rectify just those entries and resend a corrected version. Backing up the data is also important: always keep a copy of every file you work on, and use version management to identify the latest file. Losing a worked file can cost you hundreds of hours, remember!
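The checks described above (duplicates, over-length values, missing mandatory fields) are easy to automate before any upload tool is run. A minimal sketch in Python follows; the field names and limits are illustrative, not a fixed SAP rule set:

```python
# Template spec: max length and mandatory flag per field; the first field
# is treated as the record key for duplicate detection.
TEMPLATE = {
    "MATNR": {"max_len": 18, "mandatory": True},   # material number
    "MAKTX": {"max_len": 40, "mandatory": True},   # description
    "MEINS": {"max_len": 3,  "mandatory": False},  # base unit of measure
}

def validate(records, template):
    """Return a list of human-readable problems found in the records."""
    problems, seen = [], set()
    key_field = next(iter(template))
    for i, rec in enumerate(records, start=1):
        key = rec.get(key_field, "")
        if key and key in seen:
            problems.append(f"row {i}: duplicate {key_field} '{key}'")
        seen.add(key)
        for field, spec in template.items():
            value = rec.get(field, "")
            if spec["mandatory"] and not value:
                problems.append(f"row {i}: {field} is mandatory but empty")
            if len(value) > spec["max_len"]:
                problems.append(f"row {i}: {field} exceeds {spec['max_len']} chars")
    return problems

rows = [
    {"MATNR": "FG-100", "MAKTX": "Panel assembly", "MEINS": "EA"},
    {"MATNR": "FG-100", "MAKTX": "", "MEINS": "EACH"},  # dup, empty, too long
]
for p in validate(rows, TEMPLATE):
    print(p)
```

Running such a script on every received file, before the upload tool ever touches the system, turns the validation cycle described above from days of Excel work into seconds.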

Once the validation is done, there should be a formal sign-off from the business side. This authenticates that the submitted contents are correct from a business perspective, and it can avoid many disputes.


4.10. Uploading and Verifying uploaded data

Some teams prefer to upload the data first to a mock or QA server to double-check that no errors occur during the upload; this depends on the availability of server infrastructure for the purpose. If the results are successful, you upload the data to the production environment. There can be a very detailed check sheet for the actual migration run; giving a complete one here is impossible, and this part is more about technical sequencing than administration. For example:

  • Have the programs or LSMWs been transported to the production server?
  • Do you have the necessary authorizations to upload to the production server?
  • Has the sequence in which the master data will be uploaded been finalized?
  • Are all consultants in sync with it?


It is also advisable that the business team be available while the data migration is in progress, so that any last-moment decisions on data corrections can be made. To cross-check the data uploaded to the production server, keep two days or so as a placeholder before the actual go-live; you can use this time to rectify anything that went wrong. This ascertains that all is well!


It is a bit challenging to provide a detailed check sheet of dos and don'ts here, as it can vary dynamically, but I am hopeful that if the above factors are considered in data migration planning, you will have fewer surprises in the data migration phase.

Nothing can replace your own check sheet and the experience you have gained, or the things you have learnt from mistakes. Include every minute point you think could be important (no matter how silly it seems), so that it does not escape your mind at a later stage, and mark each step green once you complete it. Trust me, it helps a lot, especially when dependencies are involved. We all learn from the past, and every project adds new value to our repository, so keep refining the check sheet based on your project learnings and the experience of your seniors.

As I mentioned at the beginning, this is indeed a huge topic, and I may well have missed a few important points. I sincerely invite your ideas, suggestions, and corrections to make this document even simpler and more neatly organised. Plan wisely! I reiterate: include every minute point you think could be important, and you will have fewer surprises in the data migration phase!

Thank you and good luck to you all!


Anand Rao


SAP Quality Management (QM) has been an integral component of the SAP ERP solution since its inception. The QM module has more than 10,000 customers globally and over 500,000 users. It is a mature capability that continues to be enhanced with each new ECC enhancement package and service pack. QM spans, and is utilized in, almost every step of the end-to-end procure-to-pay and order-to-cash processes that ERP enables. The traditional challenge was that there was no way to tie each step of the end-to-end QM process together. In order to “close the loop”, SAP created the Quality Issue Management (QIM) solution: a layer of abstraction that sits on top of QM, and of any other system holding quality-related information, and ties it all together. A single instance of QIM is meant to sit on top of, and integrate with, multiple instances of ECC across the enterprise, as well as any non-SAP, third-party quality systems. It is a single place where users from across the enterprise, as well as external users (customers and suppliers), can raise, collaborate on, and resolve not only quality-related issues but any issue, activity, or topic for collaboration. QIM was built for quality issue management, but it can be used to manage any type of issue or any type of collaborative activity.

Some of the types of “issues” QIM has been specifically built and used to manage include:

• Resolution of customer complaints

• Resolution of complaints against a supplier

• A CAPA process triggered by a nonconformance issue detected during production

• A Product Technical Complaint process specifically for the Life Science industry

• An 8D process for Discrete Industries

• A Claims process for Professional Services industries

• Handling of any problems that occur during a project

• Handling Change Requests

• Resolution of compliance issues

The types of issues that can be managed by QIM span the end-to-end procure-to-pay and order-to-cash processes of the enterprise:

• Procurement – complaints against suppliers, supplier audits

• R&D – product claims, change requests

• Manufacturing – product defects, non-compliance, issues with outsourced production or subcontractors

• Sustainability – environmental audits, risk management

• Distribution – inventory inspections, transportation issues

• Service – customer complaints, consumer requests

For example, QIM helps tie a customer complaint to specific product defects due to manufacturing or raw materials from a supplier or to suggested improvements for product design.  QIM also helps manage different industry specific issue types such as CAPA processes for compliance in life science industries and 8D processes for managing complaints against a supplier in the automotive industry.  Needless to say, QIM is a flexible, versatile solution. 

QIM is relatively new to the SAP solution portfolio, released in 2012, but it is gaining mindshare and traction in the industry. On September 16, come hear how Johns Manville, a US-based roofing products manufacturer, has successfully implemented QIM to manage its 8D and CAPA processes. They were challenged with gaining visibility into customer complaint information collected in multiple ECC and other systems; there was no single place where all the people required to resolve an issue could collaborate. QIM helped them bring the process, information, and users across their plants into one place for managing quality-related issues. Please register now at the link below and join us for this informative session. I hope to see you there.


Hi All,


Recently I took part in an SCN discussion thread where we discussed some critical factors that need to be considered professionally when knowledge transfer (KT) is planned in an IT outsourcing model. We had some really thought-provoking inputs from Martin, Anand and of course Craig, who moderated the entire discussion in the thread below:



I would like to summarize this discussion in the form of a blog, which can add value to the wider audience who time and again come across such situations in their business activities. Before documenting the points discussed, I would like to sincerely thank all the participants for sharing their valuable inputs. I request other consultants and readers of this blog to share their thoughts and experiences as well, so that this document can be improved for the benefit of the community.



There are certain essential factors that need to be considered during the different stages of the KT process, mainly applicable to an IT outsourcing model:


A) Before KT - Basic preparation by the team receiving KT can really help this process in a big way.


  • The team receiving KT needs to understand what kind of industry and major business processes the client is engaged with; for example, is it chemical, automotive, mining, manufacturing, agro-business, hi-tech, etc.?
  • Go through the client's home page and try to gather some generic inputs about the industry they work in and their geographical presence, which adds a plus factor for success in a competitive business.
  • If possible, make a SWOT analysis based on your own high-level understanding of the major processes implemented, using data available in the public domain. This is not a mandatory step, but during KT it lets you ask specific questions that may add value.
  • If possible, collaborate with counterparts who have different industry domain experience and discuss your observations at a high level. If someone has worked on the shop floor in a similar domain but is not part of the present KT team, their inputs can be really useful for better planning and preparation.


B) During KT - Important technological and behavioral factors that need to be observed religiously.


  • As Craig rightly suggested, it is not always an easy or rewarding task for the person responsible for the KT process. There may be 'uncertainties' linked to it, and hence it needs thorough consideration by the team.
  • Utmost courtesy needs to be maintained throughout the entire KT process, with attention to cultural sensitivities.
  • Try to be a good and active listener while the KT process is on. There is no harm in asking a question more than once, but if the same question has already been answered for someone else, it may not look good to ask it again.
  • During KT, try to make the sessions interactive by sharing your researched inputs and asking for better clarification. Note that the person responsible for KT may not have all the detailed inputs ready to share at that moment, so keep the channel open for further collaboration, for example email feedback, reverse KT, etc.
  • Plan some technical sessions where the major aspects of the processes can be explained to the maximum possible extent. If, due to time constraints, this step is not planned or gets delayed, follow up with the team to share the documents by email. In turn, go through the details provided and email the KT stakeholder if there are any open points.



C) After KT -


  • Collect the available documentation from the KT provider, such as the business process list (commonly called the BPML), the detailed process list (Visio or PowerPoint presentations of the detailed processes implemented), the RICEFW list (mainly used by the ABAP team), the GAP list (for any unaddressed topics), the customization document (mainly used by the functional team), the Z-report list, the interface list (interfaces not included in the RICEFW list), the batch job list, etc.
  • If any documentation is unavailable during KT, ensure you receive it before the KT process is over, because once KT is done, the entire responsibility for maintaining business continuity lies with the next team.
  • Plan a reverse KT if it is agreed in your contract. You can also do it informally through emails, whichever is comfortable for you and the person responsible for KT.
  • Once the KT process is done, share your vote of thanks with the KT provider. This is very important to show professional respect to the team who have done a tremendous job and helped your team complete the KT.




Hi, I would like to share one of the quality inspection scenarios I have mapped.


Business Scenario: The business manufactures panels in a make-to-order (MTO) scenario, where panel components are supplied to a subcontractor. After completing the assembly, the subcontractor informs the buyer that it is ready for quality inspection, and the buyer informs the quality engineer. After completing the quality inspection at the vendor's location, the quality engineer gives the subcontractor clearance to deliver the goods to the factory. Quality reports are prepared manually and shared with the buyer and the store manager. On receipt of the assembly at the factory, a final inspection of the assembly is completed.


Challenges in the current process:

1. Defects found during vendor-end inspection cannot be captured.

2. Reasons for defects found during inspection are not available.

3. The number of inspections held at the vendor end is not available.

4. Reasons for delay are not captured by the subcontractor, the quality engineer, or the buyer.

5. The cost of quality inspection cannot be identified.


SAP Solution :

  1. The assembly material master should be updated with inspection type 89 (other inspection).
  2. On receiving the subcontractor's message that the assembly is ready, the buyer creates an internal-problem QM notification.
  3. An email is sent to the quality engineer through the quality notification change screen.
  4. The quality engineer visits the vendor location, performs the quality inspection of the assembly as per the checklist, and communicates the result to the vendor and the buyer.
  5. The quality engineer creates an inspection lot against the QM notification and records results against it as per the vendor-end quality report.
  6. The quality report is scanned and attached as a document through DMS.
  7. If a defect is found during inspection, rework is done by the vendor, and the re-inspection of the same assembly is captured via a different QM notification.
  8. The final inspection is done on goods receipt against the purchase order.
  9. Accepted assemblies move to unrestricted-use stock, and rejected ones move to blocked stock.


Process flow chart for vendor-end inspection


Advantages of mapping vendor-end inspection in SAP:

1. Defects appearing in vendor-end inspection are available for quality analysis.

2. Reasons for delay are identified and corrective action is taken, to achieve on-time delivery to the customer.

3. The number of re-inspections is captured, which helps in corrective action.

4. The cost of inspection can be identified.



I hope the scenario discussed here will be helpful to you.



Hi, I want to share my experience of implementing the calibration process in SAP. We used the SAP MM, SAP QM, and SAP PM modules to map the required scenario.


Business Scenario: The business has to calibrate test equipment on a periodic basis. Calibration of some equipment is done in-house, and for some it is done by an external vendor. After calibration, the test equipment's location details and calibration date are updated in a document called the history card. The history card carries a unique serial number, which is also labeled on the test equipment. The schedule of test equipment calibration is maintained manually, and the quality control team has to review the documents regularly to know which test equipment is pending calibration.


Challenges during SAP Implementation:

1. Uploading the existing test equipment into the SAP system.

2. Capturing the history of each and every piece of test equipment, with calibration status, last calibration date, procurement details, and location details.

3. Automatic reminders for test equipment calibration.

4. Capturing calibration certificates.

5. Identifying test equipment at vendor locations.


SAP Implementation

Steps we followed for mapping calibration process in SAP are as below :


Master data creation:


1. Test equipment with common attributes was identified, and a unique material code with a serial number profile was created for it.

2. The serial number profile was created in such a way that the equipment number and the serial number are the same.

3. The physical stock of test equipment was noted in an Excel sheet.

4. The test equipment stock upload was done in SAP using initial stock movement 561.

5. Test equipment at vendor locations was uploaded into the system using initial stock movement 561-O.

6. The attributes required for the calibration check were created in SAP as Master Inspection Characteristics (MICs).

7. Test equipment with a common checklist was identified, and a group task list was prepared for such equipment with Master Inspection Characteristics.

8. For special equipment, an equipment task list was prepared with MICs.

9. A maintenance plan/calibration plan was created with the calibration frequency, i.e. monthly, half-yearly, or yearly.

10. Scheduling of the maintenance plan (IP24) was set up as a background job.


Calibration order execution :

1. Calibration orders are generated automatically in the system based on the calibration plan schedule (IW39).

2. Calibration orders are reviewed and released for the calibration operation.

3. On release of the order, in the case of in-house calibration, an inspection lot is generated. The quality inspector can perform the calibration and record the results in the system against the inspection lot (QA32). After completion of the operation, the operation confirmation is done, and the calibration order status is updated as completed.

4. In the case of external calibration, the generated purchase requisition is converted into a purchase order by purchasing. The test equipment is issued (541) to the vendor against the PO, and during goods receipt an inspection lot is generated. The inspection can be done before taking the calibrated instrument into unrestricted stock, after which the external calibration operation is confirmed and the calibration order is closed.

5. Calibration certificates are scanned and attached to the equipment master using the Document Management System.



1. The history of test equipment is captured from the equipment master via the serial number history option (IE03).

2. Calibration order lists with status are available (IW39).


Advantages of SAP Implementation:

1. No need to maintain history cards; equipment history is captured through the equipment master.

2. Test equipment pending calibration is easily identified (IW39).

3. Repetition of work is reduced.

4. Traceability of equipment becomes easy.

5. Real-time data is available for the quality control team.

6. The cost of calibration can be captured.



I hope the above-mentioned scenario will help you.



I'm listing the QM documents I find useful in the SCN space. I will try to update the list regularly.


Troubleshooting notes and posts:

How to find the suitable notes/KBAs

Frequently Asked Questions - FAQ QM

10 Most viewed Notes from Quality Management

Often used QM notes

5 Most viewed KBAs from Quality Management



SAP Standard QM Report

Plant Maintenance Tables (PM/CS/QM)

Useful Customization Transactions for SAP Quality Management (QM) Module

User Exits - SAP QM

How to make field as input,required,display only and hidden.


QM process:

Quality Control of Geochemical Samples in Mining Industry

QM processes in Oil and Gas Industry- Feedback

QM processes in Oil and Gas Industry Part 2

Cutover Stock Migration with QM approach form As-Is to To-Be system

Customer issue management Via QM

Line Rejection

Key Quality Management Configuration for Integration with Other Logistic Modules

Source Inspection Through Internet Results Recording

Vendor-end Quality Inspection Process

Quality Calculation in Vendor Evaluation (MM) - SAP

Calibration process mapping in SAP


Inspections using Multiple Specification

Inspection with batch classification

Learn about how Johns Manville has implemented and is using Quality Issue Management


QM-PT(Quality Planning)

QM/Characteristics/Specifications/Classification – Blog 1

QM/Characteristics/Specifications/Classification – Blog 2

QM/Characteristics/Specifications/Classification – Blog 3

Manufacturer Part Profile in QM

Linking MIC to Class Characteristic

SAP QM Master Inspection Characteristic Architectures

Control indicators in QPMK and storage in field QPMK-STEUERKZ

Physical Samples Management in SAP QM


QM-IM-IL (Inspection lot creation)

Recurring Inspection process in SAP QM

Recurring Inspection process with Batch

[PRODTST]Inspection lot creation: Recurring inspection

Inspection Lot Origin (Types) for Various Material Movement Types

Deactivating Inspection lot creation form MvT 107 without impacting the functionality for MvT 109

Inspection after Goods Receipt for Production Order

In-process inspection in Discrete Manufacturing

Audit Inspection (Inspection Type 07)

Inspection lot creation for return sales order

Inspection lot creation : sample size calculation

QM-IM-RR (Result recording)

Calculation of Standard deviation, Cp & Cpk in SPC

Inspection Points: Key settings and Usage

Control Charts and Process Control in SAP, a White Paper By Shashank Shekhar, Deloitte Consulting

Control charts in SAP QM


QM-IM-UD (Usage Decision)

Reversal of Usage Decision and Stock Posting of Inspection Lot

Stock Posting during UD, of Non-QI stock


QM-QC(Quality Control)

Dynamic Modification of Inspection Scope, a White Paper By Shashank Shekhar, Deloitte Consulting LLP

Basic Steps to Create Dymanic Modification Rule

Common issues during dynamic modification


QM-QN (Quality Notification)

FAQ: Assigned Objects in Quality Notification

Use of assigned object functionality in QN

User-fields in Catalog Tabs of Notifications - QQMA0008,10,11 and 12

User-Fields in PM/CS/QM Notifications : Screen-Exit QQMA0001


QM-CA(Quality Certificates)

COA Programs - Why are they being changed?

Guidelines for Creating a QM Certificate for Delivery in Product Lifecycle Management



DMS- How to Store documents in a Remotely located system Via SAP

Basic steps of report generation by query with example and scenario


If you want to learn more about QM, you may want to follow the following experts:

Craig S

Natalia Machado

Nitin Jinagal


You can also find more useful documents in the QM WIKI page:

Product Lifecycle Management (PLM)


If you are also interested in PP(Production Planning), have a look at the following post:

Useful documents on SCN



This FAQ provides answers to frequently asked questions regarding a new functionality: Assigned Objects in Quality Notification.

Business function: OPS_QM_NOTIFICATION (available as of EHP 4)




What is Assigned Objects in Quality Notification?


You can assign multiple objects (batches, serial numbers, ...) to a notification at ITEM LEVEL. This enables you to better document problems or errors.

How can I activate Assigned Objects in Quality Notification?

Please make sure that you have activated EHP 4 with business function OPS_QM_NOTIFICATION and all its dependencies, e.g. EA-PLM and OPS_QM_EXTENSIONS.

You should then see a separate tab at item level for the assigned objects.

To add the new tab, go to the customizing path below; the screen area number is 840.

QM > Quality Notification > 'Overview of Notification Type'

During results recording does it generate a single notification for all batches/serial numbers?

No. The Assigned Objects functionality is not integrated with defect recording; this functionality is missing.



How can I select and import multiple batch numbers to the Quality Notification?

You have two options:

- You can enter ranges:

When you enter a range (such as 1 - 5), the system adopts all batches and/or serial numbers where the key is between 1 and 5 as assigned objects in the notification item.


- You can enter generic values:

When you enter a generic value (such as 5*), the system adopts all batches and/or serial numbers whose key starts with 5 as assigned objects in the notification item.
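To make the two entry styles concrete, here is an illustrative sketch of the selection behavior in plain Python. This is not SAP code; the function name and sample keys are made up for the example.

```python
# Illustrative sketch of how range and generic (wildcard) entries
# select batches/serial numbers as assigned objects. Not SAP code;
# the function and sample keys are hypothetical.

def select_assigned_objects(keys, low=None, high=None, pattern=None):
    """Return the keys adopted as assigned objects for a notification item.

    low/high: a range entry such as 1 - 5 (keys between low and high)
    pattern:  a generic entry such as 5* (keys starting with '5')
    """
    selected = []
    for key in keys:
        # Range entry: keys are compared as strings, so "50" falls
        # outside the range "1" - "5" here.
        if low is not None and high is not None and low <= key <= high:
            selected.append(key)
        # Generic entry: adopt every key starting with the prefix.
        elif pattern is not None and pattern.endswith("*") \
                and key.startswith(pattern[:-1]):
            selected.append(key)
    return selected

batches = ["1", "3", "50", "51", "7"]
print(select_assigned_objects(batches, low="1", high="5"))  # range 1 - 5
print(select_assigned_objects(batches, pattern="5*"))       # generic 5*
```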



Where can I find more information about Assigned Objects?

In transaction SFW5, you can find the prerequisites for activating business function OPS_QM_NOTIFICATION using the 'Dependencies' button, as well as detailed documentation and release information using the corresponding buttons next to the business function.



If you have more questions about this topic, please comment and maybe I can help you.

Following the idea of the compilations of notes (Often used QM notes) and KBAs (5 Most viewed KBAs from Quality Management), this one ranks the 10 most viewed notes from Quality Management (by year and by month).

I hope this can help you solve your issues quickly.



10 Most viewed Notes from 2016

392 Views - 175842 - Inspection lot: Reversal of goods movements from usage decision

259 Views - 48815 - Checking possible inconsistencies between MM and QM

141 Views - 33924 - Reversing usage decisions

87 Views - 208271 - Sending e-mail from notification print control

83 Views - 2161987 - Automatic usage decision for results recording

71 Views - 174877 - UD: status inconsistency suspected for inspection lot

63 Views - 332626 - Inspection plan: Restoration after deletion

51 Views - 370191 - Notification worklist: User-defined fields in message list

46 Views - 1729128 - QM: Downport of QM Customizing switches of EHP7

40 Views - 2191172 - Automatic reversal and automatic closing of inspection lots

*Until 11.02.2016




10 Most viewed Notes from 2015

3.353 Views - 175842 - Inspection lot: Reversal of goods movements from usage decision

2.037 Views - 48815 - Checking possible inconsistencies between MM and QM

1.112 Views - 33924 - Reversing usage decisions

625 Views - 208271 - Sending e-mail from notification print control

590 Views - 174877 - UD: status inconsistency suspected for inspection lot

547 Views - 332626 - Inspection plan: Restoration after deletion

448 Views - 1972228 - Adapting the EWM interface in QM

425 Views - 736920 - Maintainance of selected sets requires transport request

409 Views - 1970287 - FAQ about the Usage Decision

396 Views - 370191 - Notification worklist: User-defined fields in message list


*Until 31.12.2015


This blog is about Frequently Asked Questions in Quality Management. It aims to collect the main information from our threads in QM and publish it in one single document, so the information can be easily found and people can search for a specific subject.

This document will be updated when important new threads appear.

If you have a suggestion for a good thread that is not on the list, please give some feedback and I can add it here.



Usage Decision

reversal of Usage decision

Inspection lot have a status of Usage Decision, yet there is no UD code visible in the inspection lot

how to restrict the usage decision

Usage decision with retroactive date or manual

Automatic Usage Descision after Result Recording.

Quality Notification

Work flow through in Quality notification

Customer field in QM11 transaction

FM or BAPI For task creation in quality notification

QM Notification: defect quantity not updated correctly

Customer screen area of quality notification

Dynamic Modification Rule

(Inspection Lot) Dynamic Modification Rule for Production Order

Dynamic Rule at Charateristic Level

DMR & Annual Inspection

How to Implement Dynamic Modification Rule for Material & multiple vendor Combination

Inspection Lot

Cancelled Inspection Lots can be reverte the system status

Create Inspection lot with QI stock

QA32 No inspection plan could be found

Inspection lot QM CRTD

Inspection Lot - Change Batch

Remove MIC from particular Inspection lot

Inspection lot creation time

Inspection Plan

CWBQM to remove MIC from Inspection plan

CWBQM - to change mass IP

Assign Inspection lot to lab technicians

LSMW - Inspection Plan Upload - Direct Input Method

QA32 No inspection plan could be found ...again?

Set up multiple inspection plans for one product and assign to different customer.

Result Recording

Colletive Result recording and Usage Decision

Change Result Recording after digital signature done

SMS alert to vendor about result recording

Long Text Issue

Results cannot be entered

Calculated Characteristics

Failing to do results recording

Smaller than, greater than limits

Stock Posting

Blocking of Stock Posting after doing Usage Decision

QI Stock in MD04

Stock inconsistency between MM and QM

Stock posting from "unrestricted usage" to "Quality inspection" stock

Quality Certificate

Vendor's Quality certificate recording for incoming material

SAP Quality Certificate for Procurement

Certificate Profile FM


Batch status at recurring insp lot creation

Batch field disappearing

Quality Info Record

using QI06

Sampling Procedure

Sampling Scheme

Sampling scheme

Physical sample

Physical sample

sample drawing procedure - Auto sample calculation.

Calibration Inspection

Inspection Lot For Calibration Order in CRTD status

Test Equipment Tracking

Stability Study

EHP 5 Vs EHP4 for stability study

Stability Study

Master Data

QM Different Inspection types with the same Inspection Origin

Multiple Specification

Multiple specification in recipe

Multiple Specifications in QM

Digital Signature

Digital Signature in In-Process Inspection

Unable to create an FMEA object using a template, with copy actions indicator selected

FMEA and Control Plan creation SAP

Hello everyone


I believe that there are some QM statistics that may be of interest to you, so I decided to write this blog post to show you the KBA (Knowledge Base Article) view counts.

My idea is to keep two rankings: most viewed of the year and most viewed of the month.

But what are these rankings for?

The idea is that if a ranked KBA did not work for you, you can share the solution that resolved your problem.


If you guys like this idea, I can update the ranking monthly (if there are changes). Please let me know



5 Most viewed KBAs from 2015

281 Views - 1705428-  Error message IM254 'Invalid object number (status management)' or IM258 'Object does not exist (status management)' in transaction QM02

239 Views - 1911886 - Transaction code QS41 and QS51 ask transport request

208 Views - 1705378 - Dump SAPSQL_ARRAY_INSERT_DUPREC in result recording for an inspection lot when using the BAPI BAPI_INSPOPER_RECORDRESULTS

174 Views - 2011512 - Dump SAPSQL_ARRAY_INSERT_DUPREC in program SAPLCOVB

156 Views - 2068196 - Error "Change the inspection stock of material XXXX in QM only" during Stock Transfer


5 Most viewed KBAs from 2016

96 Views - 1705428-  Error message IM254 'Invalid object number (status management)' or IM258 'Object does not exist (status management)' in transaction QM02

77 Views - 1649612 - Error message M7021 "Deficit of SL Stock in quality inspection" while canceling the goods receipt for a qualitiy managed material

66 Views - 1680949 - Customizing or SAP application menu for the "Copy Inspection Results" function is missing

59 Views - 2068196 - Error "Change the inspection stock of material XXXX in QM only" during Stock Transfer

58 Views - 2117983 - How to return stock to delivery if stock is already posted during usage decision


*Until 24.04.2016



5 Most viewed KBAs from May 2016*

29 Views - 1705428-  Error message IM254 'Invalid object number (status management)' or IM258 'Object does not exist (status management)' in transaction QM02

14 Views - 2011512 - Dump SAPSQL_ARRAY_INSERT_DUPREC in program SAPLCOVB

14 Views - 2117983 - How to return stock to delivery if stock is already posted during usage decision

12 Views - 2068196 - Error "Change the inspection stock of material XXXX in QM only" during Stock

10 Views - 1688077 - Information messages QP866 or QP879 while replacing master inspection characteristics using transaction QS27


This kind of surprised me! And you, do you use these KBAs a lot in your projects, implementations or daily work?

I'm waiting for your opinion


Natália Machado

This blog details the usage of a sampling schema in in-process inspection (inspection type 03).

Requirement: There was a specific requirement during in-process inspection to derive the inspection sample quantity from the inspection lot size. The samples are then inspected accordingly, and the acceptance or rejection criteria are applied.

Scenario 1: If the lot size is 100 or less, the inspection sample quantity should be 2; for any lot size beyond 100, the sample quantity should be 2% of the inspection lot size.

Scenario 2: If the lot size is 500 or less, the inspection sample quantity should be 1% of the inspection lot size; for any lot size beyond 500, the sample quantity should be a constant 5.

Approach: This requirement is addressed by defining a sampling schema and assigning it to a sampling procedure. The sampling procedure, in turn, is assigned to the inspection plan.
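The sample-size logic of the two scenarios can be sketched as simple functions. This is an illustrative Python sketch, not SAP code; the rounding up is an assumption chosen to reproduce the sample sizes reported in the cases later in this blog.

```python
import math

# Illustrative sketch of the sample-size logic behind the two sampling
# schemas. Not SAP code; rounding up is an assumption that matches the
# sample sizes shown in the cases below.

def sample_size_scenario_1(lot_size: int) -> int:
    # Lot size of 100 or less: fixed sample of 2.
    # Above 100: 2% of the lot size, rounded up.
    if lot_size <= 100:
        return 2
    return math.ceil(lot_size * 2 / 100)

def sample_size_scenario_2(lot_size: int) -> int:
    # Lot size of 500 or less: 1% of the lot size, rounded up.
    # Above 500: fixed sample of 5.
    if lot_size <= 500:
        return math.ceil(lot_size / 100)
    return 5

print(sample_size_scenario_1(50))   # lot 50  -> sample 2
print(sample_size_scenario_1(120))  # lot 120 -> sample 3
print(sample_size_scenario_2(200))  # lot 200 -> sample 2
print(sample_size_scenario_2(600))  # lot 600 -> sample 5
```

In the real system this mapping is maintained as a table of lot-size ranges and sample sizes in the sampling schema (QDP1); the functions above just express the rule behind that table.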

Scenario 1: The steps followed are given below:

  1. Creation of Sampling Schema
  2. Creation of Sampling Procedure
  3. Assigning Master Inspection Characteristics /Sampling procedure in Routing
  4. Sample Size in inspection lot

Step 1: Creation of Sampling Schema: ( Tcode : QDP1)

The sampling schema plays an important role in deriving the number of samples to be inspected based on the inspection lot size.

Enter the sampling schema name and its text. Select 'Attributive insp.' under Valuation Parameters and 'Severity/AQL' under Sampling Tables for, as displayed below.


On pressing Enter, the system pops up another screen to enter the inspection severity and AQL value. Select inspection severity 004 (Normal Inspection) and an AQL value of 2. The AQL value is the maximum number of defects; as per scenario 1, we require 2 samples if the lot size is less than or equal to 100.


Click on Choose; the system will then display the screen below to maintain the lot size and sample quantity.

If the lot size is 100 or less, the sample size is maintained as 2.

If the lot size is 150, the sample size = 150 * 2% = 3. Likewise, lot size and sample size are defined below up to the maximum lot size used by the business.

On Save, sampling schema S2% is created in the system.


Step 2: Creation of Sampling Procedure: ( Tcode : QDV1)


Enter the sampling procedure name and text, then select the sampling type 'Use sampling schema' and the valuation mode 'Valuation according to char. attrib. code', as displayed below. Since inspection points are not used, the radio button 'Without inspection points' is selected.


Click on continue and maintain the sampling schema S2% as displayed below.


Click on Additional Data and select the inspection severity and AQL value as displayed below.


The system will display the message below; click Yes, and the sampling procedure will be created.



Step 3: Assign MIC and Sampling procedure in Routing ( Tcode : CA02)

MIC assigned for operation 0030 along with sampling procedure SP-2%.



Step 4: Sample Size generated (QA02)

Case 1: The production order and inspection lot size is 50. The sample quantity generated is 2, since per the sampling schema a lot size of 100 or below yields a sample size of 2. The screenshot below displays the sample size.


Case 2: The production order and inspection lot size is 100. The sample quantity generated is 2, since per the sampling schema a lot size of 100 or below yields a sample size of 2. The screenshot below displays the sample size.



Case 3: The production order and inspection lot size is 120. The sample quantity generated is 3: per the sampling schema, above a lot size of 100 the sample size should be 2% of the lot size, and for lot sizes above 100 up to 150 the sample quantity is 3. Hence the system generated a sample size of 3.


Similarly for lot size 150 given below.


Similarly for lot size 200.



Scenario 2: If the lot size is 500 or less, the inspection sample quantity should be 1% of the inspection lot size; for any lot size beyond 500, the sample quantity should be a constant 5.

Here the sampling schema is defined such that up to a quantity of 500 the sample size is derived as a percentage, and beyond 500 it is a constant number.

The sampling schema is defined as shown below and assigned to a sampling procedure.



Case 1: An order is created for a quantity of 200, and an inspection lot is generated for 200. Based on the schema, a sample size of 2 is generated, as displayed below.



Case 2: An order is created for a quantity of 600, and an inspection lot is generated for 600. Based on the schema, a sample size of 5 is generated, as displayed below.


Conclusion: With a sampling schema, the required sample size can be derived from the order lot size. This is suitable for processes where acceptance or rejection of the entire lot is based on the number of samples tested and their results. In such cases, deriving the correct sample size from the order lot size plays an important role, and this can be achieved through the use of a sampling schema.

QM/Characteristics/Specifications/Classification – Blog 3


Previous Blogs:


QM/Characteristics/Specifications/Classification – Blog 1


QM/Characteristics/Specifications/Classification – Blog 2


GC’s and MIC’s  - Linking


So now you have gone through with the client and you have a list of General Characteristics (GC’s) that you need along with the descriptions, UOM’s and decimal places. So what’s next for this master data? I hate to say “that depends” but yes.. it does..


Before we link the GC’s up with a master inspection characteristic (MIC) we need to determine spec ranges or values for the test.  My suggestion is that regardless of whatever design you ultimately decide on, you should at this point determine the maximum ranges you intend to allow for each test.  This is NOT a spec range even though that term is utilized by many.  These values should be the minimum and maximum values that the test can theoretically have. 


Tests measured in percentages are the easiest ones here. You normally won't ever report anything below zero or anything above 100. Yes, there are exceptions. Going above 100% for a measured test is usually pretty rare, but it happens. Some other tests may exceed 100%, but these are often calculated tests that involve measuring some type of change. In any case, you want to establish the highest upper limit and lowest lower limit. These may or may not be similar to what SAP calls an upper or lower plausibility limit, which is maintained in the MIC. Often, however, the plausibility limits are set to even narrower ranges.


For our examples here I will use a percent test and a PPM test.  We will stick pretty much to standard tests for now.  We’ll use 0 – 100% and 0 – 10,000 for ppm. Why 10,000 you say?  Just because in this case.  In reality the people responsible for testing will know what is reasonable.  Just because a test can have a large value doesn’t mean we need to allow it.  Our goal is to reduce errors at every step possible and minimize potential errors that could impact the customer. 


If a customer has a ppm test where they have never, ever, recorded a result over 500 ppm, you might set the upper range to 500.  Even though theoretically a result of 6000 ppm is physically possible. 


A quantitative characteristic will have either a one-sided or a two-sided spec. A one-sided spec is written as <= 100.0 or >= 100.0, a two-sided spec as 93.0 – 95.0. Make sure you include spaces between the operator and the values. A key point to keep in mind is that even though SAP allows you to enter < and > signs, they are not honored when linked to a MIC: SAP will interpret a < as <= and a > as >= in the MIC. So if you really want < 100.0, you must enter <= 99.9 so that in the MIC, 100.0 will be valuated as out of spec.

This can cause various problems when using decimals or rounding and in how values are reported.  For more info on this you can check out this SCN discussion: http://scn.sap.com/thread/3170725.
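A small sketch of why this matters, in plain Python. This is not SAP's implementation; the function is hypothetical and only illustrates the inclusive-limit behavior described above.

```python
# Illustrative sketch (not SAP code): the MIC treats an upper limit as
# inclusive (<=) even if "<" was entered in the general characteristic.

def mic_valuation(result: float, upper_limit: float) -> str:
    # SAP valuates against the limit as "<=", regardless of whether
    # "<" or "<=" was entered.
    return "accepted" if result <= upper_limit else "rejected"

# Spec intended as "< 100.0" but the limit stays 100.0: a result of
# exactly 100.0 is accepted, which is not what was intended.
print(mic_valuation(100.0, 100.0))

# Spec entered as "<= 99.9" instead: 100.0 is valuated out of spec.
print(mic_valuation(100.0, 99.9))
```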


When creating the general characteristic, the unit of measure (UOM), total number of characters, and number of decimal places must be provided. The decimal point is itself a character and must be included in the character count: 100.0 would be counted as having 5 characters and one decimal place.


For our two characteristics we set up as follows.







Characteristic      Spec Range

Iron, ppm           <= 10000

Assay, %            0 – 100.0


We are now ready to create our general characteristics. The above info is all you really need to create them. If you are building a spreadsheet for data loading you will need a Data Type field and other fields. This isn't a blog about data loading but primarily about specifications, so I'm not going to cover all the fields. But if you are consulting, some fields that you should understand, and that can be valuable to a client, are:


Chars Group – Groups characteristics. Often used when multiple businesses share a client, in which case it often relates to the authorization group.

Auth. Group – Limits who can edit the GC.

Interval vals allowed – Allows a range to be used. Primarily for one-sided specs; can be used for two-sided specs to enter a lowest detectable level value.

Negative Vals Allowed – Allows values below zero.

Single Value – Default for GC's. MUST be used for all characteristics you want to link to MIC's.

Entry required – A value must be recorded. Not normally used when linking to MIC's.

Exp. Display – Allows entry and display in scientific notation. Not often used in most places.





On the values screen, DO NOT click on "additional values".

Enter the range you wish to allow for the test. For the tests above, this would be <= 10000 and 0 – 100.0.


After making sure the characteristic is released, the new characteristic can be saved.


Qualitative characteristics might be discussed in a different blog since the goal of these blogs is to discuss options for handling spec ranges.  Since qualitative characteristics have no ranges and work by using catalogs, there can be no specs by material or customer.  So they really aren’t relevant to these discussions.


We should now have two general characteristics created.  The next blog will discuss setting up the MIC.

