John reported an issue to SAP, and it was recognized as a program error. SAP corrected it with a note; however, John found that this note is pilot released. Why doesn't SAP release it to all customers if it corrects a system error? Is there any risk in applying a pilot-released note? John has several concerns…

        Actually, when the development department provides a standard preliminary correction for an error in an existing standard function of an SAP application, the correction is initially only available for the release in which the error was reported. If necessary, the correction is also made available for other releases and in Support Packages, and at that point its status is set to 'released'.

        But John still has the following concerns:

1. Why couldn't I find this note by myself?
-> Pilot-released notes cannot be found via search, but they can be displayed using the note number. You can go to Service Marketplace (SMP) -> Note and choose "Launch the SAP Note and KBA search" or "Search using the SAP ONE Support Launchpad".



2. Can I apply a pilot note using SNOTE?
-> Yes, you can. But please note that you may have to apply the note again using SNOTE after you import a Support Package. You may want to refer to the following note in this case:
681069 - Note Assistant: Notes in the status 'pilot release'

3. Is it riskier to apply a pilot note than a released note?
-> Generally, there is no difference in risk between a standard released note and a pilot-released note. A note is "pilot released" only because it is made available to just the subset of customers with a particular issue; it has still been tested and verified, and has followed our standard note release process. However, if the pilot-released note contains a modification, see the question below for details.


4. If the pilot note contains a modification, what should be noted?
-> The status 'Pilot release' is not used only for standard preliminary corrections; it can also be used for modification notes. In this case, please note that these changes will not be included in any SAP Support Package, nor will they be scheduled for any future development. In particular, such a modification is not covered by SAP Standard Support. You may want to refer to the following note regarding modifications: 170183 - What to consider when applying modifications.

Do you have any further questions? Let me know in the comments.

SAP e-mail can be transformed to support your communication strategy, reaching recipients inside and outside your organization with rich graphical content and corporate branding.


In this blog I am going to share a few examples of how SAP-generated e-mails can provide a better user experience. 

Payment Advice

Payment (or 'Remittance') Advice notes are typically sent to suppliers after a payment run to inform them of payments made and invoices cleared.  The typical output process involves a SAPscript or PDF form, which generates spool output for printing or is added as an attachment to a plain e-mail.



Order Confirmation

Sales Order Confirmation (or ‘acknowledgement’) output is typically sent to customers when a sales order is created or changed.  The typical output process involves a mixture of EDI messaging and standard SAP PDF form output. 



Workflow Notification

Workflow Notifications are typically sent to employees to inform them to action tasks, or that requests have been actioned.  In this case, the body of the e-mail contains the instruction or message.  There are two scenarios:

  • Multiple notification: A single e-mail contains a list of workflow items for an employee to action.






  • Single notification: The e-mail relates to a specific workflow task.  In this example, more information from the workflow can be added into the e-mail, as it can be designed for a particular workflow / workflow task.



E-mails can contain company logos, product images, barcodes and other graphical images.  They can contain links to on-line content and social media.  They work on any mobile device, and don't get stuck in firewalls.

Companies using 3rd party vendors for output document processing can make huge cost savings by generating rich e-mail content from SAP.

HTML e-mails are typically less than 10% of the size of equivalent e-mails with PDF attachments, so for organizations generating huge quantities of e-mail, there's a document management and performance saving.
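As a rough illustration of the kind of output described above, here is a minimal sketch of building a branded HTML e-mail with Python's standard `email` package. The addresses, amounts, document numbers and logo bytes are placeholders invented for the example, not taken from any SAP configuration; a real implementation would render these values from the payment run.

```python
from email.message import EmailMessage
from email.utils import make_msgid

# Build a payment-advice e-mail with a plain-text fallback and an HTML
# alternative containing an inline (CID-referenced) logo image.
msg = EmailMessage()
msg["Subject"] = "Payment Advice 4711"
msg["From"] = "accounts@example.com"      # placeholder sender
msg["To"] = "supplier@example.com"        # placeholder recipient

# Plain-text part for clients that cannot render HTML
msg.set_content("Payment advice 4711: EUR 1,250.00 paid, clearing invoices 901 and 902.")

# HTML alternative; the logo is referenced by Content-ID
logo_cid = make_msgid()
msg.add_alternative(f"""\
<html>
  <body>
    <img src="cid:{logo_cid[1:-1]}" alt="Company logo">
    <h2>Payment Advice 4711</h2>
    <p>We have paid <b>EUR 1,250.00</b>, clearing invoices 901 and 902.</p>
    <p><a href="https://portal.example.com">View this advice online</a></p>
  </body>
</html>
""", subtype="html")

# Attach the logo bytes to the HTML part so it renders inline
logo_bytes = b"\x89PNG\r\n\x1a\n"  # placeholder; real PNG bytes in practice
msg.get_payload()[1].add_related(logo_bytes, maintype="image",
                                 subtype="png", cid=logo_cid)

# Sending would be: smtplib.SMTP("mailhost.example.com").send_message(msg)
```

Because the plain-text part travels alongside the HTML, even recipients on text-only clients still receive the advice, while everyone else sees the branded version.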


While the general perception is that any IT project (and especially an SAP project) always overruns schedule as well as budget, there are certain learnings I would like to share with you, customers and prospects, which should make you aware of the potential red flags so that you can take corrective action (or, better still, pre-empt and plan for them) and thereby stick as closely as possible to the project plan and budget.


Even though problems and red flags vary from project to project, like unique thumbprints (and there is no way I am listing all of them), I would like to address the mother of all SAP projects, and that is:



SAP Implementation Projects /Rollouts -

End-to-End Implementation is every SAP consultant's dream and… every customer's nightmare!

Risks involved are just huge. Here are some pointers:


1. Choice of implementation partner – This is a decider. It doesn't mean that you necessarily have to go for a tier-one vendor as a risk-mitigation measure. Besides the usual capability check (which all customers do without fail), the other key thing is to find out the fit between the vendor & your business users. (Simple question – can the vendor resources speak your business users' language, besides talking the usual jargon?) And here I don't just mean knowledge of a particular language, but overall fit. Since a SAP implementation (& especially Requirement Gathering, aka Business Blueprint) involves intensive interaction with business users, ignore this point at your own peril.


2. Implementation team – You must assess each & every vendor resource (or at least the lead and critical roles, if the project size is huge) to find out his/her credentials, the breadth & depth of the respective module knowledge & relevant domain experience. The consultant needs to understand not only your business processes but should also know how to map them in SAP. And remember, there are consultants who are glib talkers but struggle to configure processes in SAP, and then there are consultants who can configure almost any process in SAP but can't open their mouths! The ideal SAP resource has a blend of both!! (To be realistic – look for the closest possible balance.) Pay special attention while choosing the Project Manager. See that the project team (in terms of size & skills) is commensurate with the scope. If any ramp-up/spike is needed for any rollout wave or any particular phase (like Realization), plan for it in advance. If there is any offsite/offshore component (and it has to be minimal & something which needs little or no user interaction), get clarity beforehand.

3. Team work – You should take due care in choosing your own team: the business leads who will be involved extensively along with the vendor resources – what role they are currently playing, what role/additional responsibility they will be shouldering post-Go-Live, & whether they can explain the business processes properly. See that your team is sufficiently empowered to take business decisions and provide inputs. (In plain words – they should not go to the CFO or business heads for every small input/decision, causing delays to project timelines.) Also, both teams need to gel and work as one team. Otherwise, issues of non-cooperation, delayed or absent responses & distrust could snowball into major escalations impacting project timelines. Team-building exercises, weekend picnics etc. go a long way in building camaraderie.

Remember, a SAP Implementation Project is like a juggernaut! Let it roll smoothly and in the intended direction.


4. Scope creep – Here the golden rule is to follow the frozen/signed-off Business Blueprint like a Bible & avoid any other changes (deviations from the agreed scope amount to Change Requests). Besides the known risks of impacted project timelines and budget overruns, with every user exit you are increasing the maintenance overhead and the complexity of future technical upgrades. Notwithstanding that SAP is a very comprehensive ERP software & allows changes/customization, you need to draw a line somewhere (& stick to it); else, as they say in jest, your SAP becomes 'Z-SAP'!


5. Change Management – Accept it: none of us likes change. (Just think of changing your iPhone for an Android, or vice versa. :) ) Users who have been working on a particular software day in, day out for years are bound to feel anxious while moving to SAP (add to that the unspoken fear of retrenchment due to process automation/optimization/redundancy). So, be it fear of learning (or anything else), it needs to be addressed duly, well before Go-Live. No matter how well SAP is implemented, if the users are not using it properly, or not using it at all, it is a failure you can't afford!

6. Master template – This is applicable only to rollouts. You need to design and agree on the master template (also called the global template) which will be rolled out globally. Here the key is to follow the template for 70% to 80% and allow country-specific localization (besides the mandatory statutory requirements of tax, financial statements etc.) for a maximum of 30%. The rollout waves (timelines/sequence/dependencies) need to be planned, & the core team (from each country) needs to be identified to make it a success.

7. Legacy Data Migration – In plain words, getting your current data (master data & transactions) into SAP. But it is never as easy as it appears. No wonder I've seen many projects getting delayed in this last and most crucial lap of the implementation cycle. The issues of duplicate master records (e.g. the same customer existing under 5 different codes!), inactive masters, the monster material master data (& how each field will be mapped in SAP), and how many years of transactions we really need are some of the nagging questions which organizations have to address. If the project size & complexity are huge, the formation of a separate task force for data migration, right in the early stage of the project, would accord due focus to this topic. Organizations need to address the issues of data duplication / data clean-up / data validation (in other words, 'data massaging'); else, if such data is migrated to SAP, it becomes a case of 'garbage in, garbage out'. The vendor team needs to provide a detailed migration plan showing how they will upload the various masters/transactions, using which method/technology, in which sequence, the cutover date etc.  To iron out all the issues, a mock run of the data migration is a must.


8. Documentation – Unless properly planned, knowledge resides mostly (& only) in the heads of individuals (vendor resources and/or business teams) or on their notebook PCs. So, when these individuals leave, you're in trouble. (Keeping aside the data confidentiality issues, I'm talking of basic documentation related to the project.) You will also understand the severity of this issue when a new vendor is on-boarded (say, post-Go-Live) & knowledge transition is planned.


Here are some best practices to tackle this:

  1. Designate someone as Documentation Lead. His brief will be simple: ensure due documentation of every process, in every phase of the project. Plan for his backup resource too.
  2. Store it on a shared drive / network / cloud with a due access mechanism.
  3. If the project is huge (thumb rule – if the number of SAP users is more than 300, it is a BIG project), choose a good documentation management tool and ensure it is used.
  4. Ensure the vendor team is documenting & storing it in the designated place.
  5. Plan one Documentation Audit to find out the gaps.
  6. As a bare minimum, a SAP implementation project's repository should include the following documents (names may differ but the essence remains):
    1. Project Charter
    2. Project Plan
    3. Business Blueprint
    4. Business Process Procedure
    5. Test cases (unit testing / integration testing / user acceptance testing / regression testing, if relevant)
    6. Data Migration plan
    7. List of custom objects with one-liner description
    8. Functional specifications
    9. Technical specifications
    10. User roles (authorization matrix)
    11. SAP Patches / OSS Notes implemented (when & why)


9. Project Governance – You need someone (from your side) to work closely with the vendor team, to monitor progress almost on a daily basis, to look for the chinks/red flags at every stage – in short, to follow up on what has been covered in the earlier points. Considering the criticality, business impact & budget, SAP projects are very often seen to be driven by CFOs (& not CIOs). In either case, ensure the right senior stakeholders are involved in the Project Steering Committee. Ego management is one more delicate task you need to handle adroitly. Ensure the Steering Committee meets at preplanned intervals & that issues are tabled and addressed.




Good luck, and welcome to the ever-growing fraternity of 'Happy SAP Customers'!!!

As you may already know, we can create COND_A IDocs from condition record reports such as VK13, TK13, AKE3 etc. (as posted here: Quick and Easy - EDI/ALE - Pricing Condition Records). However, when we are dealing with a large number of condition records, the foreground send-condition process takes time and we have to keep our user session open until it finishes. Thus, we need some way to create COND_A IDocs in the background.

As a quick solution, I chose to utilize the transaction recorder, with the following steps:


Step 1 – Create Recording


Go to transaction SHDB > New Recording > enter your recording name and the transaction code (VK13 in my case) > Start Recording

scn conda 1.jpg


In the recording session, the flow should be: input Condition Type > choose Condition Information > fill in the selection parameters > Execute > Select All (F7) > Send Condition > fill in Message Type COND_A and the target system in Logical System. Then go back until we exit the recording session.

There may be some challenges when recording the selection parameters, especially when we want to use multiple single values. I use 'select ranges' to overcome this situation.

scn conda 2.jpg



Also, I use 'Select All' to simplify my recording, so I have to set the selection parameters in such a way that only the condition records I want will appear. Attached is a sample of my recording.

Step 2 – Create Session

After saving my recording, the next step is to create a session based on it.

scn conda 3.jpg


I will send multiple condition types, so for every condition type I will adjust the recording by changing just the condition type code and the selection program (RV13ANA*) if it has a different access sequence than the previous one.

scn conda 4.jpg

Step 3 – Run the Session in Background

Go to transaction SM35 > Choose your session > Process> Choose Background > Process


scn conda 5.jpg


Now I can send condition records from multiple condition types at the same time, without developing a custom ABAP program.
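If the list of condition types grows long, the per-type adjustment of the recording in Step 2 could itself be scripted outside SAP before the sessions are created. A minimal Python sketch follows; the recorded steps, the report names and the field name D0100-REPORT are illustrative stand-ins for a real SHDB export (only RV13A-KSCHL, the condition-type field, is a real screen field):

```python
# Illustrative recording: (screen, field, value) triples standing in for a
# real SHDB export of the VK13 flow described above.
base_recording = [
    ("SAPMV13A:0100", "RV13A-KSCHL", "PR00"),
    ("SAPMV13A:0100", "D0100-REPORT", "RV13A001"),  # hypothetical field name
]

# Condition types to send, each with the RV13A* selection report it needs
# (differing only when the access sequence differs). Values are examples.
condition_types = [
    ("PR00", "RV13A001"),
    ("K004", "RV13A002"),
    ("K007", "RV13A002"),
]

def adjust_recording(recording, kschl, report):
    """Clone the recorded steps, swapping in condition type and report."""
    steps = []
    for screen, field, value in recording:
        if field == "RV13A-KSCHL":
            value = kschl
        elif field == "D0100-REPORT":
            value = report
        steps.append((screen, field, value))
    return steps

# One adjusted copy of the recording per condition type:
sessions = {k: adjust_recording(base_recording, k, r)
            for k, r in condition_types}
```

The adjusted step lists would then be uploaded as batch-input sessions and processed in SM35 exactly as in Step 3.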

This is a short paper that I have written for the unit BCO6615 - Strategic use of ERP Systems at Victoria University, Melbourne.




In the current digital world, companies have access to various advanced technologies, and they adopt these technologies based on their preferences and strategies. Hsu (2013) states that enterprise systems (ES) like enterprise resource planning (ERP) systems and supply chain management (SCM) systems integrate the information flow across various business modules, automate operational processes and provide access to real-time information for quick decisions. However, just having an ES is not sufficient for companies to gain competitive advantage (CA) in the market. They require business strategies to use their ES effectively and hold their position in the market. Porter (2008) says industry structure drives profitability and competition, not whether an industry is low tech or high tech, unregulated or regulated, mature or emerging. Organizations use strategic approaches like integration, niche, differentiation, growth and innovation, coupled with their ES, to increase their revenues and market share and achieve CA over competitors. This paper focuses on integration and niche strategies, and provides two case studies for each strategic approach.

Integration of Enterprise Systems

Enterprise integration ensures the communication between enterprise units to achieve the enterprise goals. Enterprise integration can occur at various levels, namely physical integration – networks of devices, computers and machines; application integration – linking various software and databases; and business integration – integration of the functions that control, monitor and manage business processes (Chen, Doumeingts & Vernadat 2008). Hsu (2013) says that information flow between the enterprise systems of a company increases operational efficiency and provides CA over other companies. In addition, to create business integration capability, organizational resources are more prominent than IT resources, and they play an intervening role in achieving CA in an organization.

Integrating SCM with suppliers' and customers' systems provides the advantages of higher consumer satisfaction, lower operational cost and a prominent position in the market. Moreover, this integration leads to profitability for all supply chain members and gives the organization an advantage over its competitors (Mzoughi, Bahri & Ghachem 2008).

Mzoughi, Bahri and Ghachem (2008) state that by developing strategies in collaboration with suppliers and maintaining communication with them, an organization achieves a superior position in the market. For instance, Procter and Gamble (P&G), the largest producer of consumer and household goods in the USA, with a presence in over 80 countries worldwide, uses information technology to integrate its business with suppliers and consumers.


P&G has implemented an SAP ERP system which is used to manage its core business processes like finance and SCM. The firm's global IT strategy is to integrate the existing IT systems with SAP and implement future SAP solutions globally. This results in lower operational costs and increased business value across global locations. P&G focuses on SCM and is ranked in the global top 25 supply chains. P&G uses SAP SCM for demand forecasting and resource planning. In addition, it uses another IT system, Axway solutions, to transfer files to its supply chain partners. This managed file transfer (MFT) solution integrates with the SCM system and securely transfers files to its suppliers. It provides an effective, easily maintained communication framework to transfer invoices, orders, shipments and other messages to its supply chain members. It helps in maintaining relationships with customers and improves the success rate of product innovation and sustainability goals. P&G estimates that the integration of IT systems with ERP would increase its cash flow by 100 million dollars globally (pbworks 2015).

Another company, Kiva Systems, which was acquired by Amazon, integrates cloud robotics with warehouse management systems (WMS) for effective warehouse handling and operational efficiency. D'Andrea (2012) says that cloud robotics effectively integrates new buzz technologies like cloud, big data and the Internet of Things. Robots gather data and share information on the cloud, which acts as a central repository for them, and they learn and create patterns from this repository data. This capability of robots is utilized to automate repetitive manual and operational tasks in warehouses. The system uses local networks to track data and coordinate pallets, and integrates with mobile networks to move pallets. Kehoe et al. (2015) discuss how hundreds of mobile robots communicate and coordinate with each other to move the stock racks and bring inventory to the workers. This saves workers' walking time and pallet operation time. It has revolutionized distribution facilities in warehouses and created opportunities for new markets. This model is successful because the technology focuses on a need in warehouse operations rather than solving a problem. The technology integrates with the warehouse management system, improves operational efficiency and decreases inventory delivery time to customers.


Niche Strategy

According to Noy (2010), niche strategy is a generic business strategy adopted by companies to gain better-than-average profits in their sector, especially in a dense market segment. Marketing theories consider niche strategy as the ability to manage and drive organization resources effectively towards profitability and growth, whereas population ecology considers dense market competition as the factor that determines either the birth or death of a niche firm. The niche concept is a subset of market segmentation theories. An effective bottom-to-top aggregation process, starting from the customer and then moving to micro-segmentation, maximizes profits. A niche firm focusing on a homogeneous product in a heterogeneous market can increase its profits. Niche marketing takes segmentation to the next level by creating a discrete segment of consumers. On the other hand, niche companies face the danger that either demand or the growth potential of the company declines, or that growth in the niche market attracts new entrants and competitors. For example, Nike, a niche company known for its athletic shoes, has expanded into other segments like clothing and technology-integrated personalised shoes to sustain the competition from other shoe companies.


Nike designs new products by integrating shoe dynamics with technology and has a good marketing strategy. It launched mobile apps and wearable tracking devices that integrate with its shoes and give customers a personalized experience of how many steps they have walked and how many calories they have burnt in a given time. Nike has its own digital platform called Nike+ and collaborates with mobile companies to integrate its technology in mobile devices. Nike uses enterprise systems like SAP and Siebel systems to integrate and manage its customers, suppliers, manufacturing and human resources globally. Nike is effective in managing its resources and using digital technology to integrate with its core competency of manufacturing athletic shoes. This capability to integrate and create new market segments helps Nike retain its own discrete segment of customers and compete with other companies in the high-density athletic shoe market (fastcompany 2013).



HSE24 is a prominent home-shopping network that reaches about 40 million households in Germany, Switzerland and Austria, and has expanded into Russia and Italy. To be a niche company in the online retail sector, HSE24 uses real-time data to analyse customer buying patterns and quickly changes its marketing strategy according to the buying behaviour of its customers and the goods in stock. Mullich (2014) says that enormous data is available in the online retail segment, and the effective use of this data to make quick decisions distinguishes an online retail company from its competitors, particularly for companies like HSE24 that have to sell goods live online and over the telephone. The "now moment" is crucial for HSE24, which uses web, mobile and social network technologies coupled with in-memory computing to gather data and take action in real time, engaging customers personally and creating targeted marketing campaigns. In addition, it uses this data to ensure that there are fewer product returns, saving return handling and delivery costs. This requires effective real-time integration of its enterprise systems like ERP, SCM and CRM. The challenge for HSE24 is to maintain relationships with customers across different geographic areas, predicting their needs despite not having direct interaction. HSE24 uses predictive analysis and in-memory computing to track customer behaviour and micro-segment customers in real time, which enables the marketing department to launch campaigns according to customer preferences. This increases sales and leads to more profits for HSE24.




Organisations adopt various business strategies based on their preferences, market segment and customer cohort. In the current high-technology world, a huge amount of data is available from various sources, both internal and external to the organization, and the ability to use this data in real time and make quick decisions in accordance with customer behaviour also gives CA in the market. Companies are integrating digital technology, cloud robotics and in-memory computing with their ES to improve their operational efficiency, market share and revenues. Hsu (2013) says that integrating a firm's ES with its suppliers' systems provides a win-win situation for the firm and its suppliers. Companies that have a discrete segment of loyal customers use a niche business strategy to hold their position in the market and improve their products and services using innovation and technology. Boehe and Cruz (2010) state that companies use different strategies to create their own unique brand image and make customers loyal to them. By identifying a business strategy that suits their business and customer needs and coupling it with their ES, companies can achieve a competitive advantage in the global market.




Boehe, DM & Cruz, LB 2010, 'Corporate social responsibility, product differentiation strategy and export performance', Journal of Business ethics, vol. 91, no. 2, pp. 325-46.

Chen, D, Doumeingts, G & Vernadat, F 2008, 'Architectures for enterprise integration and interoperability: Past, present and future', Computers in industry, vol. 59, no. 7, pp. 647-59.

D'Andrea, R 2012, 'Guest editorial: A revolution in the warehouse: A retrospective on Kiva Systems and the grand challenges ahead', IEEE Transactions on Automation Science and Engineering, vol. 9, no. 4, pp. 638-9.

Fastcompany 2013, Death To Core Competency: Lessons From Nike, Apple, Netflix, fastcompany, viewed 9 August 2015, <>

Hsu, P-F 2013, 'Commodity or competitive advantage? Analysis of the ERP value paradox', Electronic Commerce Research and Applications, vol. 12, no. 6, pp. 412-24.

Kehoe, B, Patil, S, Abbeel, P & Goldberg, K 2015, 'A survey of research on cloud robotics and automation', IEEE Transactions on Automation Science and Engineering, vol. 12, no. 2, pp. 398-409.

Mzoughi, N, Bahri, N & Ghachem, MS 2008, 'Impact of supply chain management and ERP on organizational performance and competitive advantage: Case of Tunisian companies', Journal of Global Information Technology Management, vol. 11, no. 3, pp. 24-46.


Mullich, J 2014, Using Real-Time Insights, HSE24 Gets Closer to Customers, Real-Time Enterprise Stories, viewed 11 August 2015,<>

Noy, E 2010, 'Niche strategy: merging economic and marketing theories with population ecology arguments', Journal of Strategic Marketing, vol. 18, no. 1, pp. 77-86.

Pbworks 2015, pgcasestudy, pbworks, viewed 15 August 2015,


Porter, ME 2008, 'The five competitive forces that shape strategy', Harvard Business School Publishing.

If you want to consume SAP data through a non-SAP application, this is a very useful document for you. It gives you an easy, step-by-step guide to build it yourself. Using SAP Web Services, we can read and write SAP data from non-SAP applications. This option has been available for a long time now. Please do remember that, with the advent of UI5 and Fiori, there are now more ways to utilise SAP data externally.


The example here shows how to read data, but it can be enhanced to write back to SAP as well. Please check that you have the relevant SAP license.


This example was done on:

1) Microsoft Visual Studio 12

2) SAP ECC 6.0


I am writing down the steps which, if followed properly, will help you do it yourself.


The example here demonstrates how we can read vendors in SAP and pass them on to a .NET application.


The prerequisites are:

1) Basic knowledge of SAP ABAP, especially RFC

2) Basic knowledge of Visual Basic/C#


Here is a high-level list of steps on how to do it:

1) Create an RFC Enabled Function Module in SAP

a) In our example, we will create an RFC which accepts a country code and returns all the vendors for that country from an SAP server

2) Create a Web Service for the function module so that it can be consumed by other service based applications

3) Setup the Web Service

4) Create a small Visual Basic project to show how the country selection can be passed to the Web Service, and how to utilise the list of vendors returned from SAP



We will start with a list of activities to be carried out in SAP first.

  1. Creating an RFC Enabled Function in SAP
    1. 1a.png
    2. 1b.png
    3. 1c.png
    4. 1d.png
  2. Next step is to convert this into a Web Service so that it can be called from outside SAP
    1. 2a.png
    2. 2b.png
    3. 2c.png
  3. Setup Web Service, Step I – Run transaction SOAMANAGER and select Web Service Configuration. Search for the newly created web service, click on the search result, then click on Create Service. (The lines below will be created once you have finished creating the service.) You may want to look at the next screenshots for details on what to select; authentication is the main one, and we can leave all the rest at their defaults.
    1. 3a.png
    2. 3b.png
    3. 3c.png
  4. Setup Web Service, Step II – Property settings to be used while creating the service. Click on Open Binding WSDL generation to get the Web Service link. Copy the WSDL URL for the binding – this will be used in the non-SAP application to link to SAP.
    1. 4a.png
    2. 4b.png
    3. 4c.png
  5. Consume it in .Net (Visual Basic Application) - Step I - Start a new project of type WebSite. Add a Web Form by right-clicking on the project. Drag a text box, a button and labels as shown below. We will use the text box to enter the country code. The button will trigger the call to the SAP function and return the vendors for that country.
    1. 5a.png
    2. 5b.png
    3. 5c.png
  6. Consume it in .Net (Visual Basic Application) - Step II - Add a Service Reference – the most critical part. Click on Advanced, then click on Add Web Reference. Put in the URL from the Web Service screen above and click Go – you might be asked to log in to SAP here. In the end, the web service will show on the screen. You may change how it should be called in the VB project. The final output, once the FM is added in our example, is shown below. Click on Add Reference.
    1. 6a.png
    2. 6b.png
    3. 6c.png
    4. 6d.png
    5. 6e.png
  7. Consume it in .Net (Visual Basic Application ) - Step III - Code Snippet
    1. 7a.png
  8. Consume it in .Net (Visual Basic Application) - Step IV - Output - You can then run the VB application, press the Submit button, and the result will be fetched from SAP.
    1. 8a.png
  9. The output is purposely not shown here.
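For readers not using Visual Basic: the generated web service can be consumed from any language able to POST a SOAP envelope over HTTP. Below is a hedged Python sketch of the same call. The endpoint URL, operation name ZVendorsByCountry, namespace and field names are illustrative placeholders, not taken from a real system; the real values come from the WSDL you copied out of SOAMANAGER.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder endpoint; in practice, paste the binding URL from SOAMANAGER.
ENDPOINT = "http://sapserver:8000/sap/bc/srt/rfc/sap/zvendor_ws/001/zvendor_ws/binding"

def build_envelope(country: str) -> str:
    """Build the SOAP request for the vendor-by-country function module."""
    return f"""<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ZVendorsByCountry xmlns="urn:sap-com:document:sap:rfc:functions">
      <Country>{country}</Country>
    </ZVendorsByCountry>
  </soap:Body>
</soap:Envelope>"""

def parse_vendors(response_xml: str) -> list:
    """Extract vendor names from the SOAP response body."""
    root = ET.fromstring(response_xml)
    return [e.text for e in root.iter("Name")]

def call_service(country: str) -> list:
    """POST the envelope to the SAP endpoint (network call)."""
    req = urllib.request.Request(
        ENDPOINT, data=build_envelope(country).encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"})
    # The SAP service user's credentials would be added here, e.g. via
    # urllib.request.HTTPBasicAuthHandler, before calling urlopen(req).
    with urllib.request.urlopen(req) as resp:
        return parse_vendors(resp.read().decode("utf-8"))

# Offline demonstration with a mocked response:
sample = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body><ZVendorsByCountryResponse>
    <Vendors><item><Name>ACME GmbH</Name></item>
             <item><Name>Widget AG</Name></item></Vendors>
  </ZVendorsByCountryResponse></soap:Body>
</soap:Envelope>"""
print(parse_vendors(sample))
```

Separating envelope construction and response parsing from the network call keeps the sketch testable without a live SAP system; only `call_service` needs a real endpoint and credentials.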

My work involves innovative Order to Cash implementations for SAP customers. Having done SAP implementations for renowned companies like Daimler, P&G, Boots and Rio Tinto, I am now into value-added integration of SAP with commercial open-source CRM and e-commerce platforms.


I realized that most SAP customers use a subset of the functionality SAP provides. This is primarily because of the license cost and, secondly, the high implementation cost associated with it. Thus a company usually gets the core head-office users onto the system, along with the key power users. The bulk of business users gets missed out, and with them, ground-level activities. Most often, the ones left out are the salespersons and the dealer network.







Usual SAP implementation drop-outs - dealers and the sales team


The best businesses around the world run SAP. However, what portion of your organization's user base gets covered by the power this application can provide? In my over twelve-year journey, typically 10% of a company's total employees use SAP in some way. The most obvious exclusions are the salesperson and dealer communities, at least as far as the OTC cycle is concerned. The reason? To my understanding, it is threefold -


a. Sheer maths behind

The sheer number of users in this segment is enormous. Let's say you have a dealer base to headquarters staff ratio of 20:1; this means that in terms of licensing cost you would have to shell out money in the range of 10x what you currently do. That's huge, isn't it? Even enterprises have their budgets worked out. Thus in the most common scenario, it is decided to leave the job of order entry to internal staff and customer representative groups, who then interact with dealers and salespersons to enter and track orders on their behalf.



b. Mobility


For years, SAP had an interface which was not at all suited to the segment of users who were mobile and needed to use the system on the go. This was where Salesforce took off and surprised the market. Many SAP customers integrated Salesforce before SAP realized that it needed to do something.

Licensing cost and infrastructural requirements don't allow even big enterprises to get them on board.


c. Training


Due to the geographic spread, training becomes a key issue on many occasions. This is also true because people on the ground keep changing, which necessitates elaborate training every quarter for new recruits.

However, they are eventually the profit centers of the company. Revenues are there because they generate the inquiries, quotations and finally the orders, don't they? So what are the options available to an organization?

The Alternatives

Fortunately, SAP provides great options for most such questions, and the above aspect is no different. The possible solutions bank on the latest revolution in SAP integration capabilities and the platform's recent focus on improving the UX/UI.

a. SAP Fiori

Since I saw the demo of the product, I must say that I am a huge fan of SAP Fiori. For someone who has seen the light blue interface of SAP all this while, it's difficult to believe the kind of screens one can develop with this solution. To a great extent, SAP Fiori takes the concern about mobility away. Sales and service staff can easily interact with SAP in the most convenient manner from any device of their choice.

b. Mature commercial open source solutions over SAP

Though the above solution is clean and rich, there are cases where budget is the main driver. Mature commercial open source solutions like SugarCRM and the Magento e-commerce platform provide a great option for companies to get the salesperson and dealer network onto a single platform. There are multiple system integrators (SIs) who can help you achieve this.


Smart companies make an effort to improve the return on the investment they made while implementing SAP. This is possible by having a single platform across the organization and getting users onto that robust platform, either by implementing the latest technology or by integrating smart web applications which require little to no training.

Some customers are experiencing German texts in application t-codes, no matter whether they log on in English or Japanese.

E.g. Confirmation counter in Confirmation overview in t-code IW43.


E.g. the Costing tab in t-code CR03.


If you come across the same situation, you can try this solution.


  1. Run report ZCHECK_D021T from note 1055585.
    This report is a Z-correction report; you need to create it manually in t-code SE38.
  2. Run report RSLANG20 to reset the language load.
    This report is a standard program. Note 110910 explains the details of this report.
  3. Activate the screen in t-code SE51, if needed.
    Do this if the steps above do not help to fix the German texts.


Here is the list of affected dynpros known so far.


MENUCC00 1000
MP002600 2001
MP002600 3000
MP100000 2100
RCNST000 200
RMSERI05 100
SAPL0C14 120
SAPL0C14 160
SAPL0C15 155
SAPL0C24 125
SAPL0C27 201
SAPL0C27 301
SAPL0CL0 201
SAPL0KI1 335
SAPL0KI2 335
SAPL0MP0 141
SAPL0MP0 142
SAPL0MP0 941
SAPL0Q01 222
SAPL0Q05 20
SAPL0Q05 60
SAPL0Q11 1081
SAPL0Q79 20
SAPLCBCM 300-348
SAPLCFDF 140-447
SAPLTB16 100
SAPLTM00 1201-1204
SAPMF02H 510
SAPMM61P 511
SAPMM61P 513
SAPMP53L 301
SAPMP56T 1001
SAPMV45W 104
SAPMV45W 201
SAPMV45Y 200
STMMAIN100 403



Best regards,

What is a Namespace?


A namespace is the range of names that can be used for custom objects. You can define your own namespace via transaction SE03.

A namespace is there to differentiate between SAP standard objects and the ones developed by us.

'Z' and 'Y' are the name ranges allocated for objects created by us, whereas A to X is reserved for the SAP standard.
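As a purely hypothetical illustration (the namespace /MYNS/ is invented for this example), the difference shows up in the object names themselves:

```abap
* Customer name range: object names simply start with Z or Y
REPORT zvendor_list.

* Registered namespace (hypothetical /MYNS/): the prefix is part of
* every object name and is protected by the namespace license keys
REPORT /myns/vendor_list.
```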


Here I am going to register my namespace for an ERP system.


Namespace Registration and Activation in SAP Service Marketplace:


1. Log in to SMP, go to Keys and Requests, select Development Namespace and click Continue.




2. Click on Request Namespace.




3. Click on Continue.




4. Enter your namespace name and description, give your SID, and select the correct system for your namespace as below.




5. The namespace has been successfully saved; click OK.



6. Here we can see the development namespace in “In process” status:



7. After this, SAP Support will check and accept your namespace; check it like below.




8. Once it has been accepted in SMP, we need to register it in our SAP system in transaction SE03.



9. Click on New Entries.




10. Enter the namespace details.



Give the namespace name and enter the Develop and Repair licenses, which you will get from SAP Service Marketplace.





We can use one namespace for more systems like ERP, CRM, etc.; for this we need to create the Develop and Repair license keys.


Thank You ....!!!



This is another blog to share my experiences on using LSMW for data migration. In a recent project there was a requirement to migrate SEPA mandates from an existing legacy system. This was achieved using two LSMW programs, in preference to creating and processing ABAP programs.


For some background on SEPA mandates I recommend that you read this document which includes useful information on the transactions and BAPIs that were used in LSMW.


I am assuming that you are already familiar with LSMW. If you are a beginner then there are plenty of other SCN posts that will help you.

Basic functional requirements


Since the mandates already existed in a legacy system an external mandate reference was used.


It is assumed that more than one mandate is allowed per customer (for example if the customer bank account changes).


When a mandate is added the collection authorization indicator should be flagged on the relevant bank account.


A customer may have more than one bank account.


It must be possible to distinguish between mandate first use and recurring.

Approach overview


Before migrating mandates, the customers including their bank account details are created.


The creation of the mandates is based on recordings of FSEPA_M1. If a customer has more than one bank account, then a pop-up appears. Therefore two recordings of FSEPA_M1 are required.


If the mandate is successfully created then the collection authorization indicator is updated using a recording of XD02.


The above 3 recordings are processed in a single LSMW program.


In a second step a dummy usage record is created for mandates that have already been used. This is achieved by using a dummy LSMW recording and a direct update using a BAPI function. This is explained below under “Details of add usage record”.


In both LSMWs various function modules are used.

Details of initial mandate creation


Most of the fields in a mandate are filled by default from the customer master and from the company code which is the vendor, so very little real input is needed.

In our project the following was sufficient:


Please note that you need to specify the BIC (SWIFT) code for the FSEPA_M1 recordings to work correctly.


As stated earlier, three recordings were used:


In the BEGIN_OF_TRANSACTION block we do the following:

1. Check that the customer already exists. In our example the legacy customer number was encoded in the mandate id.

2. Check if the mandate already exists:

* Customer exists, now check if the mandate already exists

* It is assumed that more than 1 mandate is allowed

h_sel_criteria-snd_type = 'BUS3007'.

    h_sel_criteria-snd_id = h_kunnr.

    h_sel_criteria-anwnd = 'F'.



*   Function module name assumed; the call statement was lost from the post
    call function 'SEPA_MANDATES_GET'
      exporting
        I_SEL_CRITERIA     = h_sel_criteria
      importing
        ET_MANDATES        = h_mandates

        E_MESSAGE          = h_emessage

        ET_MANDATES_FAILED = h_mandatesfail.

    if not h_mandates is initial.

      loop at h_mandates into wa_mandates.

        if wa_mandates-mndid = infile-mndid.

          write: /001 'Legacy Number:',h_altkn,'SAP Customer:',

                h_kunnr, 'mandate already in SAP:', wa_mandates-mndid.

          g_skip_transaction = yes.
        endif.
      endloop.
    endif.

3. Check how many bank accounts the customer has. If there is more than 1 bank account then a pop-up appears in FSEPA_M1 so a separate recording will be used. You need to set a flag and then test this at the beginning of each FSEPA_M1 recording.

4. Determine which bank account in the customer master needs to be updated. If there is more than one, then the bank accounts in the table on the “payment transactions” tab are sorted by country, bank key, and bank account. Of course your input will be the IBAN!


* check which customer bank a/c matches the IBAN, if any

* load the bank accounts into an internal table and then search it

  refresh it_cbanks.
* Field list assumed; the select clause was lost from the original post
  select banks bankl bankn bkont
  into table it_cbanks
  from knbk where kunnr eq h_kunnr.

  if g_skip_transaction ne yes.

    if it_cbanks[] is initial.

      write: /001 'Customer:',h_altkn,'SAP Customer:',h_kunnr,

         'Mandate:',infile-mndid, 'Customer has no SAP bank accounts.'.

      g_skip_transaction = yes.

    else. "Locate the bank account to update

      w_found = ''.

      w_index = 0.

      LOOP AT it_cbanks INTO wa_cbanks.

        select single iban from tiban into h_iban

                where BANKS = wa_cbanks-BANKS

                 and  BANKL = wa_cbanks-BANKL

                 and  BANKN = wa_cbanks-BANKN

                 and  BKONT = wa_cbanks-BKONT.

        if sy-subrc eq 0 and h_iban = infile-SND_IBAN.

          w_index = sy-tabix.

          w_found = 'X'.
          exit.
        endif.
      ENDLOOP.

      if w_found ne 'X'.
        write: /001 'Customer:',h_altkn,'SAP Customer:',h_kunnr,
           'Mandate:',infile-mndid,
           'IBAN not available for this Customer in SAP.'.
        g_skip_transaction = yes.
      endif.
    endif.
  endif.
The value of w_index determined in the code above is then used in the conversion rules of the XD02 recording for each collection authorization field in the recording. The screen shot just shows the first two:



Details of add usage record


After all of the mandates have been created a second LSMW is used to add a dummy usage record when the LASTUSEDATE in the input file is not empty. The same input file is used.


There isn’t a SAP transaction to add a dummy usage record, so we need to add it using a function module. We can process this through LSMW using a little trick.


We create a dummy recording. I made a recording of XD03 (display customer) and saved it as DUMMY. Since I won’t actually be using the recording, it doesn’t matter which transaction I use. In the “field mapping and conversion rules” the ABAP code is inserted to perform a direct update of the SAP table.


This means that when the LSMW program is used, you should only process up to the convert step.


It is also important that you carefully test your LSMW! In this case standard function modules are used so the danger of corrupting the SAP database is very low.


Below I have included the complete code used in our project. Please note the value of USE_DOCID where 0668 is a company code value.



data: h_mndid type sepa_mndid.

selection-screen: begin of block 1 with frame title title1.

selection-screen: begin of line, comment 1(31) para1, position 33.

PARAMETERS: p_year(4) type c OBLIGATORY default '2014'.

selection-screen: end of line.

selection-screen: begin of line, comment 1(31) para2, position 33.

PARAMETERS: p_test as checkbox.

selection-screen: end of line.

selection-screen: end of block 1.

at selection-screen output.

  title1 = 'User processing parameters'.

  para1 = 'Ref year for dummy usage document'.

  para2 = 'Data validation only'.




if infile-status = 'R'. "status revoked


   write: /001 'Mandate:',infile-sepa_mndid,'Revoked mandate skipped'.

else.


data: p_usage type SEPA_MANDATE_USE,

h_sel_criteria like SEPA_GET_CRITERIA_MANDATE,
* table type assumed; the original declaration was lost
h_mandates type SEPA_TAB_MANDATE,
wa_mandates LIKE LINE OF h_mandates,

h_emessage like BAPIRET1,

h_mandatesfail type SEPA_TAB_MANDATE_KEY_EXTERNAL,

h_usedate(8) type c.


if not infile-usedate is initial. "input is YYYY-MM-DD

  replace all occurrences of regex '[^0-9]' in infile-usedate with ''.

* operands reconstructed below; parts of the original statement were lost
  concatenate infile-usedate+0(4)
              infile-usedate+4(2)
              infile-usedate+6(2)
              into infile-usedate.

*write: /001 'last used date:',infile-usedate.

  h_usedate = infile-usedate.

* Get the GUID of the mandate

  h_sel_criteria-snd_type = 'BUS3007'.

  h_sel_criteria-mndid = h_mndid.

  h_sel_criteria-anwnd = 'F'.



* Function module name assumed; the call statement was lost from the post
  call function 'SEPA_MANDATES_GET'
    exporting
      I_SEL_CRITERIA     = h_sel_criteria
    importing
      ET_MANDATES        = h_mandates

      E_MESSAGE          = h_emessage

      ET_MANDATES_FAILED = h_mandatesfail.

  if h_mandates is initial.

    write: /001 'Mandate:',h_mndid,

          'not migrated to SAP'.


  else.
    read table h_mandates into wa_mandates index 1.

    p_usage-MANDT = '360'.

    p_usage-MGUID = wa_mandates-mguid.

    p_usage-USE_DATE = h_usedate.

    p_usage-USE_DOCTYPE = 'BKPF'.

    concatenate '0668 9999999999 ' p_year into p_usage-USE_DOCID.


      if p_test <> 'X'.

        call function 'SEPA_MANDATE_ADD_USAGE'
          exporting
            i_usage = p_usage.

        commit work.


        write: /001 h_mndid,p_usage-MGUID,h_usedate,
              'usage record added'. "text assumed; end of the line was lost
      endif.
    endif.
  endif.
endif.
With a few recordings and some standard function modules, it is possible to migrate SEPA mandates (including usage information) with LSMW.

This blog also gives an example of how to use a dummy recording and a direct update in LSMW.



Although SAP currently promotes “best practice data migration” using Business Objects Data Services, LSMW is still the tool of choice for many projects.

LSMW is free and simple to use and handles many things more or less automatically.

In this and other blogs I want to share some of my experiences from recent projects. I am not a programmer so any ABAP code that I show will not necessarily be the best that could be used. However, data migration is a one-off so it usually isn’t important if the code is beautiful and efficient (unless of course you have very large data volumes). I am assuming that you are already familiar with LSMW. If you are a beginner then there are plenty of other SCN posts that will help you.


How many LSMW programs to create a SAP project?

There is no standard SAP program, IDOC or BAPI available for the creation of a SAP project and its WBS elements, so recordings have to be used. The project builder transaction CJ20N is an “Enjoy Transaction” and is therefore not recordable.


The common approach is therefore to use CJ01 to create a project header and then to use CJ02 to add WBS elements. The question that then arises is how many recordings and how many LSMW programs to use.


I adopted the classic approach of first creating the project header in LSMW and then separately adding the WBS elements.

The problem with adding WBS elements is the positioning of the cursor. The elements are added in a table and with an LSMW recording you can’t reposition the cursor with program code. Typically when adding records to a table, you need as a minimum to have a recording to add the first element and a recording to add subsequent elements. For WBS elements you only need these two recordings if you set them up correctly. These two recordings can be combined into a single LSMW program.


The reason that I am writing this blog is that I have seen examples with 3 or 4 separate recordings and LSMW programs being used and this seems unnecessarily complicated to me.


So, in order to add WBS elements to the projects created by a previous LSMW program, you need two recordings of CJ02:

  • Recording to add the first WBS element to a project. This has to be separate because there is no parent WBS
  • Recording to add subsequent WBS elements


Both recordings are included in a single LSMW program. In LSMW on the Maintain Object Attributes screen use the More Recordings button on the line where you add the recording. In the pop-up box you can then add extra recordings:



The “top” recording to add the first WBS element is straightforward. Here are some example screen shots:




The second CJ02 recording is what gives the most trouble. The way to set it up is as follows:

  • On the first screen enter the project and the parent WBS of the WBS element that you want to insert and then press enter, for example:
    • Project = ZB.999999
    • Parent WBS = ZB.999999.001
  • On the next screen which shows some WBS elements press the last page button followed by the next page button. This ensures that the second row of the table is always the place where you will enter the data for the new WBS. (When you do the recording with only your top WBS element present this seems a bit strange, but it’s the way to do it)
  • On the (empty) second line of the table enter the level, the WBS element and the description. Then go to the WBS detail screen either via the menu Details – WBS Detail Screen or CTRL F9
  • Now you can enter data on each of the tabs that you are using. Make sure that you analysed all of the tabs needed before you make the recording to avoid having to redo the recording in the future!
  • When you have entered all the data press the save button


You now need to be a little bit careful with the assignment of field names to the recording. You can use the “default all” button but you must then change some of the field names afterwards. The reason is that POSID on the initial screen is the parent WBS but on subsequent screens it is the WBS that you are adding. You should also be aware that in the separate recording to add the top WBS element, POSID is consistently the WBS that you are adding.


The actual field names that you use don’t really matter as long as you are consistent. In my LSMW I also had some help fields as follows:

  • h_pspid is the PROJECT identification
  • h_posid is the parent WBS
  • h_ident is the WBS to be added

and in the recordings I used the same convention.


Here are some example screen shots of the recording:





After defining the recordings you need to define the input file. The input file for the program should be sorted into the sequence of the WBS elements:

STUFE   C(003)   Level

PSPID    C(024)   Project Definition

POSID   C(024)   WBS Element


The remaining input fields will of course depend on the attributes being used.


Since multiple recordings are being used, we need some ABAP code in the BEGIN_OF_TRANSACTION block:

  • Determine which recording to use. This is easy: If STUFE = 1 then it’s a top level WBS
  • Determine the value for the parent WBS element. Obviously this depends on the template for the project type. For example in my case the level 3 element ZB.999999.001.01 would have a parent element of  ZB.999999.001
  • It is easier to work with these codes without the “.” Symbols. There is a function module CONVERSION_EXIT_PROJN_INPUT which can be used


In my case all of the project types used had the same mask for the WBS structure, so it was fairly easy to code.
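The parent-WBS derivation can be sketched as follows, assuming the fixed mask above; the variable names, levels and offsets are assumptions, not the original code:

```abap
* Derive the parent from the external code by stripping the last
* segment, e.g. ZB.999999.001.01 -> ZB.999999.001 (fixed mask assumed)
case infile-stufe.
  when '2'.
    h_posid = infile-pspid.          " parent of level 2 is the project
  when '3'.
    h_posid = infile-posid+0(13).    " ZB.999999.001
endcase.

* Convert the external WBS code (with dots) to the internal format
call function 'CONVERSION_EXIT_PROJN_INPUT'
  exporting
    input  = infile-posid   " e.g. ZB.999999.001.01
  importing
    output = h_ident.
```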


We also decided to determine in the LSMW program the values of the operative indicators using logical rules based on the project type.



The indicators were only set at the lowest-level WBS element, and in order to know whether the current element was at the lowest level, I implemented “read ahead” logic. I explained how to do this in a previous blog. With read-ahead logic it is easy to decide if we are at the lowest level: if the current record is not at a deeper level than the previous one, then the previous record was at the lowest level. When read-ahead logic is used we test this condition in the BEGIN_OF_TRANSACTION block and write the previous record there.
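This read-ahead test can be sketched as follows (field and variable names are assumptions, not the original code):

```abap
* BEGIN_OF_TRANSACTION: decide whether the previous element was a leaf.
* The file is sorted in hierarchy order, so the previous element is at
* the lowest level exactly when the current one is not its child.
if prev_stufe is not initial and infile-stufe le prev_stufe.
* previous element had no children -> set its operative indicators
* and write it here
endif.
prev_stufe = infile-stufe.
```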


The LSMW was also designed to handle multiple project types, where the attributes that need to be filled (and sometimes their values) differ per project type. In the BEGIN_OF_TRANSACTION block the project profile was retrieved from table PROJ. This can then be used in the coding per field of the recordings. The input file included the full set of attributes that might be used. In the spreadsheet that users used to provide the input, it was indicated which columns were needed per project type.



When set up correctly you only need three recordings and two LSMW programs to create projects and WBS elements. This is easier to manage during data migration.

R. Bailey

LSMW read ahead technique

Posted by R. Bailey May 5, 2015


Although SAP currently promotes “best practice data migration” using Business Objects Data Services, LSMW is still the tool of choice for many projects.

LSMW is free and simple to use and handles many things more or less automatically.


In this and subsequent blogs I want to share some of my experiences from recent projects. I am not a programmer so any ABAP code that I show will not necessarily be the best that could be used. However, data migration is a one-off so it usually isn’t important if the code is beautiful and efficient (unless of course you have very large data volumes). I am assuming that you are already familiar with LSMW. If you are a beginner then there are plenty of other SCN posts that will help you.


One of the advantages of LSMW can sometimes be a disadvantage. LSMW controls the flow of data in the program generated in the convert step. It reads the input records, applies the conversion logic, writes (for each segment if there are multiple segments) a record to the output buffer and, after processing all records relating to a transaction, it writes the transaction. However, sometimes you would like to know the content of the next input record and use this information while processing the current record. Unfortunately when LSMW reads a record, the previous record has already been written and is no longer available. This can be solved by using a “read ahead” technique.


Use cases

Here are some examples of where a read ahead technique might be used:

  • Processing GL bookings with RFBIBL00 using flat file input. Normally you have a header record followed by items and LSMW automatically detects each new header; with a flat file you have to detect the change of document yourself.
  • Processing WBS elements where operational indicators should be set on the lowest level. If the depth of the WBS structure is not fixed then you only know you reached the lowest level when you read the next record.
  • Processing vendor records from a legacy system where there are multiple records per vendor and you need to process all of the records before writing an output record.

All of the above occurred in a recent project of mine. I’ll now explain the technique using the RFBIBL00 example.

Worked Example

If you want to process GL bookings, AR open items or AP open items then SAP provides the standard batch input program RFBIBL00 which you can select in




For the transfer of opening balances in our project the input file provided from the legacy system was a flat file containing a limited number of fields. The Oracle Balancing Segment in the input file is used to determine a Profit Centre. The input account is actually an Oracle GL account which is converted using a lookup table in LSMW.



The input file is sorted by Company Code, Currency Key and Oracle Balancing Segment. A separate GL document is written for each combination of these values. The document is balanced by a booking to an offset account. If the balances have been loaded correctly then the balance of the offset account will of course be zero. During testing the GL conversion table was incomplete so some code was added to allow processing even if some input records were invalid – in this case the offset account will have a balance but we can see what is processed.


The structure relations are as you would expect:



With a flat input file we need to determine for ourselves when the key has changed and we will only know this when we read the next record. Therefore we change the flow of control in the LSMW program so that we can "read ahead" to the next record.


LSMW normally writes an output record in the END_OF_RECORD block and a transaction in the END_OF_TRANSACTION block. With the read ahead technique we do this in the BEGIN_OF_TRANSACTION block. At this point we still have the previous converted record and the next input record is also available so we can check whether there is a change of key. There are two things that have to be handled:

  • When processing the first input record we should not write any output
  • When we get to the end of the input file the last record hasn’t been written and we won’t come back to the begin block so the last record won't get written


Let’s now look at the code for each block. Since an offset booking has to be written at various places, the code for this has been put into a FORM routine.




* On change of company or currency code a new document is needed

* We write the balancing entry of the prior document here and

* a new header at end of record for BBKPF

if not prev_bukrs is initial and ( infile-bukrs ne prev_bukrs or

   infile-waers ne prev_waers or infile-balseg ne prev_balseg ).


  perform write_offset_entry. "form name assumed; call lost from the original
  h_writehdr = 'X'.    "Check this at end of BBKPF record
endif.



We have defined some variables to contain the previous key field values: prev_bukrs, prev_waers and prev_balseg. When we read the first record these have an initial value. Otherwise, if the value changes then we write the booking to the offset account and set a flag to write the header record for the new document.




* at_first_transfer_record.
if g_cnt_transactions_read = 1.
  transfer_this_record 'BGR00'. "statement assumed; lost from the original
endif.

if g_cnt_transactions_group = 5000.
  g_cnt_transactions_group = 0.
  transfer_this_record 'BGR00'. "statement assumed; lost from the original
endif.




BGR00 is the batch input session record. This is the standard coding except that we replaced the “at first” test with a test on count of transactions read.




* On change of company, currency code or balancing segment

* start a new document

if h_writehdr = 'X' or prev_bukrs is initial.

* check prev_bukrs to get first header


  transfer_this_record 'BBKPF'. "statement assumed; lost from the original
  h_writehdr = ''.
endif.


* Set previous values here

  prev_bukrs = infile-bukrs.

  prev_waers = infile-waers.

  prev_balseg = infile-balseg.


BBKPF is the document header. We write a header for the first record and whenever the key changes. We also update the previous key values here.




if g_skip_record ne yes.


* Update running totals for the balancing item

  if INFILE-NEWBS = '40'.

    g_wrbtr_sum = g_wrbtr_sum + h_wrbtr.

    g_dmbtr_sum = g_dmbtr_sum + h_dmbtr.

  else. "Posting key 50

    g_wrbtr_sum = g_wrbtr_sum - h_wrbtr.

    g_dmbtr_sum = g_dmbtr_sum - h_dmbtr.


  g_item_count = g_item_count + 1.

  if g_item_count = 949.   "Split the document after 949 items


    g_item_count = 0. "Reset the item count after writing record

    transfer_this_record 'BBKPF'.  "Write header for next block




If the record is valid (our program contains various validity checks) then an output record is written and the cumulative value in local and foreign currency is updated. This coding block also contains a document split. If there are more than 949 items then a balancing entry is written followed by a new document header.




if g_flg_end_of_file = 'X'.
  perform write_offset_entry. "form name assumed; call lost from original
endif.

This is where we handle the problem of the last record. LSMW contains a number of global variables, and a useful one that is not included in the LSMW documentation is g_flg_end_of_file. When this has the value X we have reached the last record and a final offset booking should be written.




  if g_wrbtr_sum ne 0 or g_dmbtr_sum ne 0.


* Offset entry not required if the document balances!

    bbseg-newko = p_offset.  "Use suspense account

    bbseg-zuonr = 'DATA MIGRATION'.

    bbseg-sgtxt = 'Balancing entry'.

    bbseg-prctr = h_prctr.

    bbseg-xref2 = '/'.  "Ensure this is empty here

    bbseg-valut = '/'.  "Empty on the offset booking

    bbseg-mwskz = '/'.  "Empty on the offset booking

*  bbseg-xref1 = '/'.  "Empty on the offset booking

    if g_wrbtr_sum ge 0.

      bbseg-newbs = '50'.    "Credit entry

      bbseg-wrbtr = g_wrbtr_sum.
    else.
      bbseg-newbs = '40'.    "Debit entry
      bbseg-wrbtr = - g_wrbtr_sum.
    endif.

    if g_dmbtr_sum ne 0.

      if g_dmbtr_sum ge 0.

        bbseg-dmbtr = g_dmbtr_sum.
      else.
        bbseg-dmbtr = - g_dmbtr_sum.
      endif.
    endif.


    translate bbseg-wrbtr using '.,'.

    translate bbseg-dmbtr using '.,'.

    g_wrbtr_sum = 0.

    g_dmbtr_sum = 0.

    g_skip_record = no.  "LSMW carries over status of previous rec!!

    transfer_this_record 'BBSEG'.
  endif.


There is no need for an offset booking if by chance the document is already in balance. Otherwise we create the offset booking. The offset account is an input parameter in our program and some other fields have fixed values. Our system is configured to use a decimal comma, so we need to change the value fields to what is expected on an input screen. At the end we write the balancing record and the transaction.



This is a simple technique that can be useful in a variety of situations.

As SAPPHIRE starts in Orlando tomorrow, we, customers and partners, will be bombarded with information about S4/HANA. It is clear that it is dead center in the SAP strategy for the next years.


The value of a renewed Business Suite will become clearer and clearer the more we hear about Simple Finance and now Simple Logistics. The roadmap looks very exciting, with Fiori apps covering great scope and with the upcoming Business Suite merge (different components of the Business Suite, like CRM and SCM, will now merge back with ERP to form a single system).


This transformation is still in its early stages. Financials are far ahead, while logistics is coming soon. The roadmap for the “repatriation” of external Business Suite functions back to the ERP core is just starting. I foresee a roadmap of 3-5 years until we see a complete fusion.


But what is clear is that SAP has addressed 2 of its most important issues: simplification of the SAP footprint AND the user interface. We will soon see customers running ERP on HANA (S4/HANA) with complete Business Suite functions like Global ATP and eWM back in the core ERP. No more parallel SCM and/or CRM landscapes. No data replication. Shorter implementations, lower TCO.


So, all this is great, but there is a catch: current SAP customers looking at jumping to S4/HANA need to revise their current SAP solution and consider returning (as much as possible) to standard functionality.


Customers have been running SAP for many years (some for decades), so they built on the Business Suite for 2 reasons: either to implement something that was not available at the time (early customers) or to implement customer-specific requirements.

What we see more clearly now than before is that the price to implement and maintain these customer “customizations” is much higher than just the implementation development hours. It may hinder the adoption of future functionality. This is exactly what is happening now. Here are 2 major examples:


  1. Custom code – Customers often support thousands of custom built programs. Migrating to ERP on HANA requires a revision of this code so it can run properly (not talking about performance here, some DB practices differ and bad ABAP code of today will NOT run correctly on HANA. They NEED fixing)
  2. The more custom code and advanced configuration a customer has, the farther that customer is from adopting two of the most important values of the S4HANA proposition: guided configuration and standard Fiori apps.
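A classic instance of the "bad ABAP code" that needs fixing is a SELECT that silently relied on the implicit primary-key ordering many classic databases happened to deliver, which HANA does not guarantee. A minimal sketch (VBAP is the standard sales-document-item table; the variable names are illustrative):

```abap
* Risky on HANA: the result set may arrive in any order, so downstream
* logic such as READ TABLE ... BINARY SEARCH can silently break.
SELECT vbeln posnr matnr
  FROM vbap
  INTO TABLE lt_items
  WHERE vbeln = lv_vbeln.

* Fix: request the ordering explicitly.
SELECT vbeln posnr matnr
  FROM vbap
  INTO TABLE lt_items
  WHERE vbeln = lv_vbeln
  ORDER BY vbeln posnr.
```

Finding spots like this is exactly why a code revision pass is needed before the migration.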

So, now that it is more evident than ever that it pays to adopt standard practices and reduce customization as much as possible, isn't it time to adopt SAP's pace of innovation and stop trying to build IT solutions ourselves? Wasn't THAT the original proposition of buying ERP software in the first place?

A surprise with kernel version 7.21, patch level 500!


Today, I started applying EWA report actions to one of my ABAP-stack SAP systems. Since the kernel was out of date, I checked the download center for a new patch. Luckily, there was a new one: version 7.21, patch level 500. It was very fresh on the download page, released that very day.

I downloaded the SAPEXE_500* and SAPEXEDB_500* files immediately.


As always, I extracted both SAR files into a folder and copied the extracted files into the NTAMD64 folder with the overwrite option.

Then I started my system. It did not turn all green: the dispatcher was in 'Dialog Queue Standstill' status, and no work processes appeared in the management console. Using the Task Manager's Processes tab, I could see that all the work processes had started, but they did not show up in the management console's AS ABAP WP table. Despite this inconsistency, I could still log in to the system without any warning.




Something in the system was clearly inconsistent and strange.

When I checked the SAPCAR command (run with the -xf parameter), I realised that the extraction of the SAPEXE file had not finished successfully. So I re-downloaded the SAPEXE file two or three times, thinking there had been a download error (the SAPEXE file size increased with later downloads), but SAPCAR reported the same error each time. Something was wrong with the file itself, not with the download.
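In hindsight, the lesson here is to verify an archive before overwriting a live kernel. Below is a minimal sketch of such a check: SAPCAR and its -tvf (list contents) option are the real SAP tool, while the helper function, file names, and the non-empty-file fallback (included only so the sketch runs on a machine without SAP tools) are illustrative assumptions.

```shell
#!/bin/sh
# Sketch: check SAR archives for corruption before deploying them.
check_archive() {
    if command -v SAPCAR >/dev/null 2>&1; then
        # "SAPCAR -tvf <archive>" lists the archive contents; a
        # non-zero exit status indicates a truncated or corrupt file.
        SAPCAR -tvf "$1" >/dev/null 2>&1
    else
        # Illustrative fallback when SAPCAR is not installed:
        # at least reject empty (zero-byte) files.
        [ -s "$1" ]
    fi
}

for archive in "$@"; do
    if check_archive "$archive"; then
        echo "$archive: ok"
    else
        echo "$archive: damaged, do not copy into the kernel directory" >&2
    fi
done
```

Run it against the downloaded files (e.g. `sh check_sar.sh SAPEXE_500*.SAR SAPEXEDB_500*.SAR`) and only overwrite NTAMD64 after every archive verifies cleanly, with the old kernel backed up first.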


I tried the Download Manager instead, but there was something odd there too: I got a "You are not authorized to…" message. How could that be, when I had downloaded the same file just a few minutes earlier?


After this message, I decided to check the download page to find out what the problem with the SAPEXE file was.


And there was a surprise waiting for me: the SAPEXE*_500* files had been withdrawn by SAP.


I was surprised. This was the first time I had ever encountered a damaged kernel file.

A damaged file was released for all customers!

How could this be? I could not imagine it.

Wasn't there any quality control?

This should not be possible, but it happened to me.


I have been checking the download site for the files, but they still have not reappeared.

Looking forward to patch level 500.


I hope this kind of thing will not happen again.



Hi fellows,


I want to share with you an experience about the milestones after go-live at the business where I work.


We went live 9 months ago, and since then we have had many problems with users, internal communication, process speed, and productivity. Additionally, the business had changed its natural processes for new ones. As the days passed, the problems kept increasing.


The CEO of the factory is a person very committed to the process and to the SAP system, and he constantly held meetings to communicate to all the managers the importance of using SAP correctly; however, after each meeting the clutter returned.


That was the moment when I became aware of the importance of change management: we were spending all day, every day, supporting different areas such as FI, CO, PS, MM, SD, PP, PM, ETM, HCM, and QM.


After that, over the following days, we started a program with change management at its core, and with this tool the improvement was evident in:


- Processes

- Staff communication

- Performance and workability

- Knowledge of the processes and the business


I simply wanted to share this experience about the importance of tools such as change management, which are very interesting and helpful in our work.


Currently we are designing the roll-out to another centre of the company and are going to start implementing WPB, certainly including change management in our book of lessons learned.



