Said Shepl

Install DB Instance

Posted by Said Shepl Dec 9, 2014

We present the installation of an SAP system with high availability for the ABAP instance.

This installation uses Windows Server 2012 R2 as the platform and SQL Server 2012 SP1 as the database level.

We will use Software Provisioning Manager 1.0 SP06 and Kernel 7.2.

We have divided this installation into the following parts:

Part 1- Install SQL Server Cluster.

Installing SAP HA on SQL Part1: Install SQL Server Cluster.

Installing SAP HA on SQL Part1: Install SQL Server Cluster step2

Part 2- Install First Cluster Node.

Install First Cluster Node.

Part 3- Install DB instance.

Part 4- Install Additional Cluster Node.

Part 5- Install Central Instance.

Part 6- Install dialog Instance.

We will start now with Part 3 - Install DB Instance.

First we must prepare our DB source: either a DB backup or a DB export created with the SWPM export tool.

You must download the latest Software Provisioning Manager (and unpack it with SAPCAR) and the latest kernel for the same version, then start by choosing:

Choose system copy --> MS SQL Server --> Target System Installation --> High-Availability System --> Based on AS ABAP --> Install DB instance.

DB Instance 01.PNG


DB Instance 02.PNG

We will choose Standard System Copy/Migration (load-based) because we used the DB export tool.

If you restored your DB using SQL Server, you can use the other option, Homogeneous System Copy.


DB Instance 03.PNG

Choose the MS SQL Server instance name that you created during the SQL Server cluster installation.

DB Instance 04.PNG


DB Instance 05.PNG

In this step, we provide the SAP installation with the location of the SAP Kernel CD.

DB Instance 06.PNG

Choose the LABELIDX.ASC file for the required kernel.

DB Instance 07.PNG


DB Instance 08.PNG

In this step, we provide the SAP installation with the location of the DB export that we took from the source system.

DB Instance 10.PNG

In this step, you enter the password of the SAP DB schema.

DB Instance 11.PNG

You can choose the required number of data files according to the number of CPU cores in your server.

As illustrated: large system (16-32 CPU cores), medium system (8-16 CPU cores), and small system (4-8 CPU cores).
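As a rough illustration of this sizing guidance, a small helper can map a core count onto the categories above. The function name and the boundary choices are my own assumptions mirroring the quoted ranges, not part of SWPM:

```python
# Illustrative only: classify a server by CPU core count using the ranges
# quoted above. The function name and boundary handling are assumptions,
# not an SAP API.
def classify_system(cpu_cores):
    """Return the system size category for a given number of CPU cores."""
    if cpu_cores >= 16:
        return "large"    # large system: 16-32 CPU cores
    if cpu_cores >= 8:
        return "medium"   # medium system: 8-16 CPU cores
    return "small"        # small system: 4-8 CPU cores
```

Note that a boundary value such as 16 cores falls inside both quoted ranges; here it is arbitrarily treated as the start of the larger category.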

DB Instance 12.PNG


DB Instance 13.PNG


DB Instance 14.PNG

In this step, you enter the number of parallel jobs to run at the same time.

DB Instance 15.PNG

In this step, you select the kernel database .SAR file to unpack into the kernel directory.

DB Instance 16.PNG


DB Instance 17.PNG


DB Instance 18.PNG

We received the following error during the installation:

DB Instance 20.PNG

We used the following link and SAP Note 455195 (R3load: Use of TSK files) to solve this issue:

SWPM: Program 'Migration Monitor' exits with error code 103

DB Instance 25.PNG

The problem was solved and the installation continued.

DB Instance 27.PNG


DB Instance 29.PNG

This step shows that the installation has completed successfully.

DB Instance 30.PNG



Said Shepl

We present the installation of an SAP system with high availability for the ABAP instance.

This installation uses Windows Server 2012 R2 as the platform and SQL Server 2012 SP1 as the database level.

We will use Software Provisioning Manager 1.0 SP06 and Kernel 7.2.

We have divided this installation into the following parts:

Part 1- Install SQL Server Cluster.

Installing SAP HA on SQL Part1: Install SQL Server Cluster.

Installing SAP HA on SQL Part1: Install SQL Server Cluster step2

Part 2- Install First Cluster Node.

Part 3- Install DB instance.

Part 4- Install Additional Cluster Node.

Part 5- Install Central Instance.

Part 6- Install dialog Instance.



We will start now with Part 2 - Install First Cluster Node.

You must download the latest Software Provisioning Manager (and unpack it with SAPCAR) and the latest kernel for the same version, then start by choosing:


Choose System Copy --> MS SQL Server --> Target System Installation --> High-Availability System --> Based on AS ABAP --> First Cluster Node

First Cluster Node 01.PNG


First Cluster Node 02.PNG


We received the following error:

We solved this issue by following SAP Note 1676665:

First Cluster Node 02 Sol.PNG

Download vcredist_x64 and install it.

In this case, install the Microsoft Visual C++ 2005 Service Pack 1 Redistributable Package ATL Security Update, which is available at:

Retry the installation with SWPM.


First Cluster Node 03.PNG


First Cluster Node 04.PNG


First Cluster Node 05.PNG


Note: In this step, you must create an A record in your DNS for the SAP virtual instance hostname:
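Once the A record exists, it is worth verifying that the virtual hostname actually resolves before continuing. A minimal, generic lookup sketch (not part of SWPM; the example hostname is hypothetical):

```python
# Generic verification sketch: check that a hostname resolves in DNS after
# the A record has been created.
import socket

def resolves(hostname):
    """Return the resolved IPv4 address, or None if the lookup fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# Example with a hypothetical SAP virtual instance hostname:
# print(resolves("sapci-virt"))
```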

First Cluster Node 06.PNG


First Cluster Node 07.PNG


First Cluster Node 08.PNG

First Cluster Node 09.PNG


First Cluster Node 10.PNG

First Cluster Node 11.PNG


We received this error because we chose Cluster Disk 1, which is assigned to SQL Server. We chose Cluster Disk 3 instead, which is in available storage, as follows:

First Cluster Node 12.PNG


First Cluster Node 13.PNG

First Cluster Node 14.PNG


First Cluster Node 15.PNG

Select the kernel drive's LABELIDX.ASC file.

First Cluster Node 16.PNG


First Cluster Node 17.PNG


First Cluster Node 18.PNG

Reconfigure the swap file at the OS level.


First Cluster Node 20.PNG


First Cluster Node 09.PNG

First Cluster Node 22.PNG


First Cluster Node 23.PNG


First Cluster Node 24.PNG


First Cluster Node 25.PNG


First Cluster Node 26.PNG


First Cluster Node 27.PNG


First Cluster Node 28.PNG

LSMW.  I wonder where that 4 letter acronym ranks in terms of frequency of use here on SCN.  I'm sure it's in the top 10 even with stiff competition from SAP, ERP, BAPI, ABAP, and some others.


Why is that?  Well, it's a very useful tool and comes up frequently in the functional forums.  I remember when I got an email from a fellow SAP colleague introducing me to it.  That was back sometime in the fall of 1999 but I know version 1.0 came out a year earlier and was supported as far back as R/3 3.0F.  I dove into it and the guide that SAP had published and it was really great.  I could see immediately that for basic data conversions, I could handle the entire conversion process without the help of a developer.  Back in 1998, that was a fairly big deal and one that I'm sure the ABAPers had no problem ceding territory in.


Just a year earlier I was using CATT to do legacy conversion.  It had a similar transaction code based recording mechanism, a way to define import parameters, and a loading mechanism to map a .txt file to those parameters.  But CATT was not designed specifically for data conversion so it could be a pain to deal with.  In particular, tracking load errors was very tedious which required you to do a large number of mock loads on your data to ensure that it was perfect.


My History with LSMW

Back in 1999, it was obvious to me that LSMW was a big improvement over CATT for a few reasons:

  • I could incorporate standard load programs and BAPIs. Using screen recordings was no longer the only way to load data.  I hate screen recordings.  They eventually break and you have to handle them with kid gloves at times... you have to trick them into handling certain OK codes or work around validations/substitutions.
  • LSMW allowed you to use screen recordings as a way to define your target structures.  I love screen recordings!  Why?  Because, as a last resort, they let me change any record in the system using an SAP supported dialog process.  If you can get to it manually at a transaction code for a single record, then you can create/change/delete that same data in batch using a custom screen recording.
  • I could do the transformation within SAP rather than in Excel.  That saved a lot of time especially if I had certain transformations (i.e., a cost center lookup) that were used in different loads.  Define once, use multiple times.
  • I could load multiple structures of data.  Again, this saved time because I didn't have to rearrange the data in Excel to force it into a particular structure format which might contain numerous fields that I had no interest in populating.  That left my source Excel file relatively clean which was far easier to manage.
  • Organization.  LSMW had a way to categorize each load by Project, Sub-Project, and Object.
  • No more developers!  While the tool allows you to insert custom logic, it's not required to do so.  If you know your data well enough and you have a typical legacy source file, there's no reason why a functional person such as myself can't load everything on his own.



Once word spread about LSMW inside SAP, it seemed that every functional consultant I worked with was using it.  Eventually we started using it for purposes other than legacy data conversion.  Mass changes, mass creation of new data that wasn't legacy related, etc.  Other non-functional areas used it too; I've seen security teams upload mass changes to userID records.



This is how I Really Feel

But... I didn't write this to praise LSMW.  Now, in the year 2014, I can't stand working with it.  Its limitations have been bugging me for years and SAP hasn't done anything to improve it.  My gripes:


  1. Poor organization.  The simple Project / Sub-Project / Object classification is too limiting.  It seems to be a quasi hierarchy of the individual LSMW objects... but why not release a fully functional hierarchy?  If we had a real hierarchy we could use multiple levels, parent-child relationships, drag-n-drop, etc.  There are some customers that don't use it that much and may only need a single level deep hierarchy.  Others might need 5 or more.  Either party is currently forced into using the existing 2 deep classification of Project / Sub-Project.  What I most often see is a horrible organization of the underlying LSMW objects.  That fault lies with the customers for not enforcing and administering this hierarchy.  But if the tool made it easier to classify and organize the various scripts, maybe it wouldn't be as messy as I've come to expect.
  2. The prompts are inconsistent. This is a minor gripe but the function keys are different per screen.  To read/convert your data file you navigate to a selection screen (a very limited one) and press F8 to execute.  To read the contents of those data files within SAP, you get a pop-up window and have to hit Enter to execute it.  No one limits the reading to a selection of records (or, very rarely do they) so I could do away with that prompt entirely.
  3. Another personal gripe but I'm so tired of the constant Read Data, Convert Data, Load Data...  Whoops!  Error!  Change in Excel, save to .txt, Read Data, Convert Data, etc.  The process has too many steps and I have to flip between SAP, Excel, my text editor, and my file manager (Directory Opus).  Or, why can't I link directly to my Excel file and view it within SAP?
  4. There isn't a good way to quickly test or validate some basics of the data.  I get that each area and load mechanism is different (i.e., BAPI versus screen recording) but there should be a quick way within the tool to better validate the data in a test format so that we know right away if the first 10 records are OK.
  5. Speed.  I had some tweets with Tammy Powlas this past weekend.  She used RDS for an upload (Initial Data Migration Over, The Fun Has Just Begun).  The upload of 600k records took an hour but I highly doubt that LSMW could beat that.
  6. The solution was great back in 1998 for the reasons I noted above.  Back then I would happily double click between my source and target fields, assign rules, create lookup tables, etc.  But it's 2014.  I'd rather use a Visio type of tool to maintain my data relationships.
  7. Lack of Development.  Here's the version we are running at my customer.  2004...  seriously?  No changes in 10 years?  I recall the early versions of LSMW... v1, v1.6, v1.7... but I don't remember there being a v2 or v3.  So how did we jump from v1.7 to v4 and what are the delta changes?  Seems like some upper management mandated creative version management to me.  My guess is that LSMW has been upgraded based on changes to WAS and to keep it along with ERP 5.0 to 6.0... but the product itself hasn't changed in terms of functionality.  LSMW still feels like a v2 product to me.


screenshot - 2014.11.12 - 08.31.12.png




My Biggest Gripe

But my biggest gripe isn't with the tool.  It's how it's used by the SAP community.


It seems that every consultant I know uses LSMW as their go-to source for all data changes.  I've walked into customers that have been using an LSMW to maintain some object for 10+ years!!!!  How the heck can something like that happen?  This is an area where LSMW's flexibility works against it... or rather, works against the customer's long term success with SAP.  The problem here is that it allows us functional folks to quickly develop a 'tool' to maintain data.  It's the quickest way to develop a solution on the Excel-to-SAP highway that accountants et al. need throughout the year.  For a truly ad-hoc requirement to do just about any process in SAP based on data in Excel, it works fine.  I don't have an issue with that and would recommend LSMW in those appropriate cases.  But it's not a long term solution.  Period, end of story.



Other Solutions

Mass Maintenance Tool

If you have a recurring need to mass change master data, you should be using the mass maintenance tool.  Just about every module has developed a solution using this tool to change the most important master data records in the system.


screenshot - 2014.11.12 - 08.56.29.png



Be Friendly to your ABAPer

Anyone heard of a BAPI?  If you have a recurring need to upload transaction data or make changes to certain POs, sales orders, etc, or have a master record not in the list above, there is a BAPI that will do that for you.  Get with your ABAPer, develop a suitable selection screen, get a test-run parameter on there, get a nice ALV based output report, and then get the tcode created.  Done...  that's a good solution using an SAP supported protocol that is far better, safer, consistent, and easier to work with than a screen based recording that dumps your data into BDC.  In my opinion, if part of your solution has the letters 'SM35' in it, you've done something wrong.


Why would anyone recommend to a customer that they should use this crummy process (read data, convert data, display converted data...) as the long term solution for making changes like this?  That's not a solution, it's a lame recommendation.

Final Word

LSMW and other similar screen based recording tools (Winrunner et al.) are flexible and it's tempting for people... and I'm talking primarily to the consultants out there that over-use and over-recommend LSMW... to keep going back to it.  It's a useful tool but there are problems when you don't have enough tools in your toolbox to use... you're limited in options and you keep going back to what you know.


Have you heard the phrase "When you have a hammer, everything looks like a nail"?  It came from noted psychologist Abraham H. Maslow in his 1966 book The Psychology of Science.


Maslow quote.png



His quote is also part of something called the Law of the Instrument.  A related concept is the notion of the Golden Hammer which was written about in AntiPatterns: Refactoring Software, Architectures, and Projects in Crisis: William J. Brown, Raphael C. Malveau, Hays W.…  The book covers the factors that come up repeatedly in bad software projects.  Among them is what they call the Golden Hammer which is "a single technology that is used for every conceivable programming problem".


LSMW's time as my hammer of choice passed a long time ago.  It's a useful tool and should be in everyone's toolbox but we shouldn't use it unless there is an actual nail sticking out.



Data conversion in SAP project - continuation

Introduction :

I have recently written a blog about the data conversion process, in which I specified the major basic steps of this important process (Data conversion in SAP project).

It's advisable to read that blog before this one. As my PM project advances successfully, I have discovered a new, very important step in this critical process. I wish to share this step: data cleansing.


Data cleansing step:

After you have finished analyzing the errors in the EXCEL file, you will discover some conversion failures caused by "garbage" data in the customer's legacy system.

It's important to note that during this step the legacy system is probably still active (users are still using it).

As I explained in the previous blog, the EXCEL file stores the output result from Process Runner next to each record that was uploaded to SAP. Meaning, you can see next to each record the reason for its failure.

Step 1: Outline and filter all records whose failure reason indicates "garbage" data.

Example from my project: I tried to upload 2000 records of customer machines and tools, which are PM equipment. The two major reasons for failure were: 1. Material X doesn't exist in SAP (the material number was a data field in the legacy equipment record and it has to be correct in SAP).

2. The short text description is too long (I mapped the legacy equipment short text to the SAP equipment short text; the problem is that the SAP equipment short text is limited to 40 characters only).

Step 2: Send your customer the EXCEL records filtered in the previous step, so they can understand why those records will not be transferred to SAP.

Remark: This conversion process, including the cleansing step, begins in the SAP Dev environment and afterwards moves through the other SAP environments: Trn, QA, Pre-prod and Prod. Each time, the process runs in a full cycle from beginning to end. The data extracted from the legacy system is taken from the production environment only, each and every time.

Step 3: Your customer should decide what to do with those records. They might direct you to erase them from the EXCEL file, or they might decide to cleanse the relevant data in the legacy system. Only the customer is responsible for cleansing the data in the legacy system; you can't do it for them.

Step 4: After they decide what to do with those records, repeat steps 6, 4 and 5 from the previous blog: delete and archive all data that was uploaded in this cycle (to "make room" for the same but "cleaner" data), activate the extraction program again to extract the cleaner data from the legacy system, upload it to SAP, analyze the new failures, and so on.

On average, you should execute the cleansing step along with steps 4-6 two or three times until the data is uploaded to SAP without failures.
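The filtering in Step 1 can be sketched as follows, assuming the load results were exported as (record, failure reason) pairs. The column layout and the reason texts are hypothetical; adapt them to your Process Runner output:

```python
# Minimal sketch of Step 1: split load results into records that failed due
# to "garbage" data versus everything else. The marker strings are assumed
# examples matching the two failure reasons from the project above.
GARBAGE_MARKERS = ("does not exist", "too long")

def filter_garbage(rows):
    """Split (record_id, failure_reason) rows into (garbage, other)."""
    garbage, other = [], []
    for row in rows:
        reason = row[1].lower()
        if any(marker in reason for marker in GARBAGE_MARKERS):
            garbage.append(row)
        else:
            other.append(row)
    return garbage, other

rows = [
    ("EQ-1001", "Material X does not exist in SAP"),
    ("EQ-1002", "Short text is too long (max 40 chars)"),
    ("EQ-1003", "Locked by another user"),
]
bad, rest = filter_garbage(rows)
```

The `bad` list is what goes to the customer in Step 2; the `rest` list needs ordinary error analysis instead of cleansing.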

Data conversion in SAP project - conversion from legacy system to SAP ECC.

Introduction :

I would like to share my experience of the data conversion process with the SAP community. Data conversion is one of the most critical processes in a successful SAP implementation project. This process is part of the realization step in the "ASAP" methodology (step 1: project preparation; step 2: blueprint; step 3: realization; step 4: final preparation; step 5: go-live). SAP advisors and local implementers are usually responsible for carrying out the data conversion from the legacy system to SAP ECC. I have also heard of SAP projects in which the Basis team carried out this process.

The converted data is used only to set up master data in ECC; it is not used to set up historical transactional data from the legacy system.

There are different tools for converting data: 1. the SAP ECC built-in tool via the LSMW transaction code; 2. an external tool named Process Runner, which communicates easily with ECC. I used Process Runner, which was purchased by my company.


Two of the most important qualities required to succeed in this process are: 1. thoroughness; 2. communicating with and understanding your customer's needs.


As mentioned above, the data conversion process is part of the realization step. The realization step begins after the advisors (or local implementers) have finished writing and submitting the blueprint documents for the customer's approval. After the approval, the implementers start customizing and writing specification documents for new developments in the Development environment in ECC. Only then is it possible to start the data conversion process.

There are several sub-steps in data conversion:

1. Mapping the necessary fields in the ECC object that will be filled with data (e.g., the Equipment object in the PM module)

Here you need to be well aware of what is written in blueprint documents regarding your SAP objects.

It's recommended to differentiate between the object's obligatory fields and its non-obligatory fields. Sometimes object classification is needed. This happens when the object's regular fields are not enough to store all the data from the legacy system. I used classification in an equipment object that represented electric generators.

2. Creating one instance of master data manually

The purpose of this step is to verify that the implementer is able to create master data manually before conducting the recording.

3. Recording master data set up via Process Runner (or LSMW).

If the recording is not accurate, or changes to the master data setup need to be made after recording, the recording has to start all over again. Thus it is important to be certain how to set up the object's master data correctly. Once the recording is accurate and saved, Process Runner creates an EXCEL file with the proper columns to be filled (according to the fields you entered in the recording) in order to set up several instances automatically.

For example: you have recorded setting up the master data of one piece of equipment with certain data. After you save the recording, Process Runner creates the proper structure in EXCEL. You can then fill the EXCEL file's columns with as many pieces of equipment as needed and execute the recording again; in this way, multiple pieces of equipment are created via Process Runner.
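Filling that structure programmatically can be sketched like this. The column names and values are hypothetical; match them to the columns Process Runner actually generated from your recording (here written as CSV for simplicity):

```python
# Hypothetical sketch: write an upload file whose columns follow the
# structure the recording produced. Column names and sample values are
# illustrative assumptions, not Process Runner's actual field names.
import csv

columns = ["EQUIPMENT_DESC", "EQUIPMENT_TYPE", "START_DATE", "MATERIAL"]

equipment = [
    {"EQUIPMENT_DESC": "Electric generator 01", "EQUIPMENT_TYPE": "AA",
     "START_DATE": "20140115", "MATERIAL": "MAT-100"},
    {"EQUIPMENT_DESC": "Electric generator 02", "EQUIPMENT_TYPE": "BB",
     "START_DATE": "20140220", "MATERIAL": "MAT-101"},
]

with open("equipment_upload.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=columns)
    writer.writeheader()
    writer.writerows(equipment)
```

Each row then becomes one execution of the recording, creating one piece of equipment.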

4. Creating Extraction program to extract data from legacy system

In this step you need to specify to the legacy system administrator (usually a mainframe programmer), in an accurate manner, which fields and which tables you need the data from. Second, you need to consider what data population should be extracted (e.g., only active pieces of equipment, or only data created after a certain date; your customer will know the answers). The administrator should then prepare the program in the legacy system for future use. In my project, the legacy system was a mainframe system written in ADABAS NATURAL. I sent specification documents to the administrator specifying which fields and which data population to extract.

If some kind of data manipulation is necessary (e.g., 1. the equipment type in the legacy system contains the values A, B, C while the ECC equipment type was customized to contain AA, BB, CC respectively; 2. changing the format of date values, etc.), the administrator has to code it in the program.
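The kind of manipulation described here can be sketched as follows. The field names and date formats are illustrative assumptions; only the A/B/C to AA/BB/CC mapping comes from the example above:

```python
# Sketch of the extraction-side manipulation: map legacy equipment types to
# their ECC-customized values and reformat legacy dates. Field names and the
# DD.MM.YYYY source format are assumptions for illustration.
from datetime import datetime

TYPE_MAP = {"A": "AA", "B": "BB", "C": "CC"}

def transform(record):
    """Return a copy of a legacy record with ECC-ready field values."""
    out = dict(record)
    out["equipment_type"] = TYPE_MAP[record["equipment_type"]]
    # Reformat the legacy date into SAP's internal YYYYMMDD format.
    out["start_date"] = datetime.strptime(
        record["start_date"], "%d.%m.%Y").strftime("%Y%m%d")
    return out

print(transform({"equipment_type": "B", "start_date": "09.12.2014"}))
# {'equipment_type': 'BB', 'start_date': '20141209'}
```

In the real project this logic lives in the mainframe extraction program, not in Python; the sketch only shows the shape of the mapping.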

It's very advisable that this program sorts the output columns identically to the column order in the EXCEL file from the previous step; the administrator should arrange the columns accordingly. Eventually, the extraction program creates an EXCEL file full of extracted data that fits the structure and format of the EXCEL file from the previous step.

5. Analyzing errors log file and fixing extraction program

In this step the EXCEL file is full of data to be loaded into SAP ECC. Try loading 50 percent of all rows in the file. Process Runner will create output results; if there are any errors while the program tries to create the master data, it will indicate the reasons. You should analyze the errors and fix the program accordingly.

6. Preparing deletion and archiving program in SAP ECC

Eventually there is a chance you will need to delete some of the loaded data for one reason or another. So first, you will need to distinguish the data that was converted and loaded into SAP ECC from data that was created manually by users. The best way to do this is to use a standard SAP report and specify, in its selection screen, the SAP user that created the data. For example, in my project a certain programmer used Process Runner to load the data; everything he loaded was created under his user code, so it was easy to distinguish the data. After the report extracts that data, mark whatever is necessary for deletion and use the SARA tcode to archive the data (I will post a separate guide on how to archive data in SAP using the SARA tcode).
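The selection idea behind this step can be sketched as a simple filter on the creating user. The field names and the load user ID are hypothetical; in practice this selection happens in the standard SAP report's selection screen:

```python
# Minimal sketch of step 6's selection idea: isolate the records created by
# the conversion load user so only converted data is marked for deletion and
# archiving. Field names and the user ID are assumptions.
LOAD_USER = "CONV_LOAD"

def loaded_by_conversion(records):
    """Keep only records whose creator is the conversion load user."""
    return [r for r in records if r["created_by"] == LOAD_USER]

records = [
    {"equipment": "EQ-1", "created_by": "CONV_LOAD"},
    {"equipment": "EQ-2", "created_by": "JSMITH"},
]
converted_only = loaded_by_conversion(records)
```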


I hope this information helps you in your work with SAP. For any question regarding this process, feel free to ask me.


I've been an SCN member since 2006 and have watched the involvement from others increase over the passing years.  This is both good and bad at the same time.  It's good to see more people get involved but I'm not sure the collective quality of SAP knowledge on the site increases at the same rate.  I suppose this isn't unexpected given SCN's growth rate.  However, with the increased size, scope, and viewership of SCN, I think there is a risk to SAP customers that rely on the information being presented here.


I'm blogging today because lately I've seen a growing number of recommendations from community members that the OP should solve their problem by either 1) running an SAP correction program or 2) debugging their way into a table update. Hacking table updates has been covered a few times already.  Just search on the appropriate key terms (I'd rather not list them) and you'll see plenty of discussions on it.


The point of this blog is to talk about the other technique (correction programs) and their consequences.



What are correction programs?

Correction programs are used to fix table inconsistencies that cannot otherwise be fixed through a normal dialog transaction.  The programs are developed by SAP and the intent is to solve specific errors.  This is a critical point because these programs cannot be used in all circumstances.  It's also important to note the audience of these programs.  They were developed to be used by SAP developers... i.e., the folks in Germany, or now, in AGS worldwide.  These aren't originally intended to be customer-facing tools.



What's the big deal?

Most, if not all, of these programs are direct table updates with little validation of the data in the existing table or the parameters entered at the time of execution.  There is little, if any, program documentation.  Most of them are crude utilities... and I'm not saying that to be critical.  Instead, I want to make the point that these are not sophisticated programs that can be used in a variety of scenarios and will safely stop an update from occurring if it doesn't make sense to do so (from a functional perspective).


Because of this, there is an element of risk in executing them.  The original data usually cannot be recovered.  If the programs are executed incorrectly, it's possible for inaccurate results to occur.  SAP doesn't advertise or document these programs because their stance is that they should only be executed by SAP resources (or under their guidance).  That means if you run a program and cause a bigger problem, SAP isn't obligated to help you out of that situation.



When is it appropriate to run a correction program?

A correction program should only be executed after you've gone through the following 4-point checklist.


  1. Only when you get specific instructions from SAP via the help portal.
  2. Only after thoroughly testing in a quality/test system.  This can be difficult because the unusual nature of these problems makes them hard to replicate.  However, if at all possible, test the program as best you can and substantiate it with appropriate screenshots, table downloads, reports, etc.
  3. Only after you have tried to solve the problem using normal transactions.  If there is a way to solve a GL issue using re-postings and such, always go that route before utilizing a crude utility such as a correction program.
  4. Only as a last resort.



When is it not appropriate to run a correction program?

Most importantly... and I can't stress this enough...  these programs should not be executed without a thorough understanding of the problem at hand, the tables impacted, and the updates being performed by the program.  If you can't read code or weren't guided by SAP about the updates being performed, I wouldn't run it.  If you can't talk in complete detail about the error and have proof that the error is triggered by a table inconsistency, and have the knowledge or tools to fix a potentially bigger problem if the correction program causes one, I wouldn't run it.




I'll show a few examples but I'll stay away from the more dangerous ones.


The first one has a clear warning message.  Most of the newer programs that I've seen have similar warnings even on the selection screen.


screenshot - 2014.08.11 - 14.56.53.png


Here's an old one.  No program documentation, no selection criteria, and very little information in the title.  If you can't read ABAP, how will you know what this program does?  What exactly does 'debug' mean in this context?


screenshot - 2014.08.13 - 17.10.31.png




The problem with topics such as this one is that a lot of people want to blast out the information to show off what they know.  My gripe is that we all need to realize that the responsibility (and blame) for running a correction program without proper consent or guidance from SAP is quite high.  Do so at your own risk.



How many times have you seen SAP projects generate disappointing feedback in the user community in the months after those "successful" go-lives?





Once the first invoice is issued, the first picture is shared, and thousands of emails full of congrats for everyone are sent, the real business problems appear.




Don't we have to change how we consider success in our implementations?

Where are key concepts such as sustainability, business benefits, savings, etc.?

Should we only consider it a success if an invoice can be printed, the first production order can be confirmed, or the first purchase order is received?




Implementing SAP (an ERP system) does not mean that processes are being redefined, let alone improved. The redefinition of the business processes should be one more item in our implementation projects. This is the key step that ensures the company uses SAP to its full potential, following best practices. By redefining processes we ensure real benefits, sustainability and savings that will let companies see in SAP a good way to improve processes and achieve operational excellence.




Does this mean there is an opportunity to enhance the ASAP methodology by adding this business re-engineering as a pre-preparation phase?


I was associated with one of the world's largest SAP implementation projects, in the Oil and Gas domain, for more than 7 years. I had the unique opportunity and experience of working across different functions of the project organization: Functional Analyst, Integration/Test Management, Global Deployment, and Offshore Management. I have tried here to outline my experiences and share some best practices in the context of a large SAP rollout programme.


Objective of the Programme - reduce complexity and bring change in three ways: through simple business models, standard global processes and common IT systems.

Journey:  10 Years - 35 countries & 39000+ users


Pathfinder - Low risk & diversified implementations done to test methodology

Ramping up - Building global design and processes

Complete Global Design - Complete global design and focus on deployment to multiple countries

Embedding - Retrofit the earlier go live countries (pathfinder & ramping up) to current level of design (global template); deploying change requests to live countries

Extending Business Value - Bring in countries with high business value & deploy improvement change requests


Global Rollout Approach:

  1. Global Template Development


  2.  Local Rollout Deployment



Programme Management Framework -


Critical Success Factors:

  • Robust Governing Model - To deliver a programme of this size and scope, a robust governing model is required, with regular reviews and connects across teams and across the management chain
  • Effective Planning (Milestone-driven) - Advanced planning (a couple of years in advance) with details of milestones and activities on a single spreadsheet, to give all staff greater visibility of the programme's various activities
  • Diverse Team - The central team was based in one location but had a good mix of resources across the globe to cover different time zones and local coordination.
  • Dynamic Delivery Organisation - Changed as per programme need. It started with one central team responsible for both design and deployment. Later, a division into design and deploy teams was made to drive global deployments, and a new team was added to cater to live-country requirements. This led to siloed working, demarcation and dilution of knowledge across teams. Eventually teams were merged and organised in hubs across the globe to remove inefficiencies.
  • Resourcing - Building the right skillsets and domain knowledge
  • Process Adherence
  • Innovation and value additions


Definition of a Successful Go Live: On time, on budget and with minimal disruption to the business


  • Business Continuity at Go Live - businesses are fully compliant and continue to serve customers smoothly and efficiently
  • Stable system that enables the business to carry out day-to-day activities
  • Customers and vendors are aligned with new processes and policies and experience the full benefits of the new processes and policies
  • Accurate and meaningful and transparent management information
  • Staff find it easier to work within the new processes and systems and ultimately have more time with their customers
  • There is a plan in place to drive through the benefits – everyone knows the part they play in realising the benefits and continuing to increase efficiency
  • Legally and fiscally compliant

India is fast emerging as a global manufacturing hub on account of its skilled manpower, access to raw materials and, most importantly, a large domestic demand which is growing by the day. Manufacturing contributes roughly 15% of India's GDP and provides the largest direct and indirect employment in the country. Attaining a competitive edge in manufacturing depends on several factors such as technology, focus on productivity and quality, and positive governmental regulations. IT is today being widely embraced in this sector and has the potential to provide distinct competitiveness to this crucial sector of our economy.


IT is helping to provide step changes in productivity through an entire range of manufacturing processes and is enabling companies to integrate with both their global and local suppliers and customers.


To sustain the growth in the manufacturing sector and also face the increasing competition globally & locally, companies will now have to look towards value addition rather than cost reduction alone. The path towards achieving this would involve inculcating global manufacturing best practices and IT will be a keen enabler to achieve this. Today Enterprise Resource Planning systems talk about not just IT adoption but also adoption of IT in line with global best practices.


However, in spite of these advantages, IT adoption is still low in manufacturing firms compared to Western economies such as the US, Germany or the UK.


Broadly, for IT purposes, firms can be classified by turnover into large (> 100 crores), medium (10-100 crores) and small (< 10 crores). The aims and strategies for IT enablement in these firms depend to a large extent on their size.
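As a rough illustration, the size bands above can be captured in a few lines of code (the band boundaries come from the text; treating a firm at exactly 10 or 100 crores as "medium" is an assumption):

```python
def classify_firm(turnover_crores: float) -> str:
    """Classify a firm by annual turnover (in crores of rupees),
    using the large / medium / small bands described above."""
    if turnover_crores > 100:
        return "large"
    elif turnover_crores >= 10:
        return "medium"
    return "small"
```

A firm with a 150-crore turnover would classify as "large", one with 50 crores as "medium", and one with 5 crores as "small".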


The key challenges faced by firms, irrespective of size, remain fluctuations in raw material cost and making on-time, quality supplies to customers: in short, supply chain optimization. IT is expected to assist in tracking production costs, product quality, customer orders and deliveries, and most importantly to provide strategic information to management to improve processes continually.


In the majority of organizations, the success of the IT outcome depends on synchronizing IT goals with business goals. Organizations which have got this right have experienced significant benefits; those which haven't face issues such as confrontation between IT and business interests, investments that have not delivered the expected benefits and returns, and most importantly losing their competitive edge in the longer run.


How does one measure IT usefulness or effectiveness? It is quite challenging and difficult. In the real world everybody uses a computer for multiple tasks: surfing the internet, acquiring information, communicating, paying bills, etc. In short, it improves our quality of life. However, if one were to put an actual measure on this, it would be a very difficult task and highly subjective.


However, when companies invest large sums of money, time and effort in IT adoption, they need to be clear on the outcomes, and this is where the real challenge lies, both from the company's perspective and for those helping the company achieve its objectives.


Studies have shown that a majority of companies which have invested in IT are not completely convinced about the effectiveness of their investment.


The challenges for large firms have mainly been on-time deliveries, obtaining real-time information and making decisions based on it, whereas for smaller firms it is more to do with tracking costs and competitive pricing.


Typically, the business expectations from IT are tracking costs, meeting timely deliveries, quality, and access to business information.


In order to increase the benefit of IT investment, it typically needs to be combined with organizational change, business process re-engineering, greater business knowledge among IT staff and increased IT knowledge across the organization.


Some of the critical areas where IT is expected to help are order and demand management, material scheduling, accounting, costing, vendor handling and payments, invoice generation, material accounting and, of course, HR and payroll systems.


Enterprise Resource Planning is the most widely adopted IT application among manufacturing firms. Supply chain management is also being undertaken, to integrate plant shop-floor systems with the constituents of a supply chain, though as far as India is concerned this is still at a nascent stage. Organizations today are also increasingly talking about Customer Relationship Management.


Today very few organizations, probably less than 3%, combine CRM and SCM with ERP. This definitely limits the effect of IT. On a positive note, however, organizations are increasingly realizing this and are today talking about implementing SCM and CRM on top of the ERP, and specifically SAP, implementations that have already been done.


Today organizations are also talking about outsourcing IT, which can result in cost savings, as outsourcing organizations bring in better talent and the ability to manage the IT systems; this is particularly so for IT hardware.


Since the main reasons for IT adoption are to improve processes such as order taking, delivery management, invoicing and inventory control, wouldn't it make more sense to focus on process improvement as a measure of IT effectiveness?



Rajesh Santhanam

The author has worked for some of the largest firms worldwide, such as HP, Shell, Deutsche Bank, Hitachi, LogicaCMG & iGATE.



As part of various SAP support/development engagements, consultants frequently find themselves taking up the task of effort estimation. These estimates usually cover activities viz. requirement gathering, solution design, configuration, ABAP development, functional validation, regression testing and all other actions leading to a successful User Acceptance Test (UAT). The objective of this blog is to help readers understand this business-critical function by shedding some light on the perspectives of three key stakeholders: the client, the developer and the consultant.




Client:

Similar to any business proposition, the client stands as the principal stakeholder and end-beneficiary of the requirement under development. Also, through continuous ERP add-ons/developments, it is the client who brings revenue to the associated consulting organization. So consultants, while doing effort estimates, should always keep the below things in mind:

1. The proposal has become a reality only because the client is seeking a better way of handling his business processes and aims to achieve the same through an efficient SAP-driven solution.

2. Businesses don't look at solutions with a skewed sense of short-term savings. Effectively, clients are always willing to pay an extra dollar for a well-designed, holistic ERP offering.

3. Most requirements deal with revamping existing business processes, implementing a new legal regulation, or extending the existing business template to expanding global operations with unique country-specific challenges. Since time is a crucial component in such situations, with every additional day of delay costing the business dearly, consultants have to be meticulous in their effort planning. The objective here should be 'Get-it-Right-First-Time'.

4. As we often experience, rework is an exorbitant cost we end up paying when we deliver an incomplete solution resulting from not-so-thoroughly-assessed estimates. Rework is not only a pricey affair but also delays the entire delivery cycle, with enormous adverse business impact.

5. Further, repeated failures to deliver complete and defect-free solutions within agreed timelines will erode the client's trust in the consulting organization's capabilities. This is a grave situation for any firm, since it negatively impacts revenues and tarnishes the brand image in the long run.


Developer: (Usually an ABAP expert in SAP parlance)


As it is the developer who transforms the on-paper solution into an ERP reality by doing the necessary ABAP development, before arriving at an accurate estimate he/she should ensure the below:


1. A clear and complete understanding of the expected solution, with a fairly good picture of the best and alternative scenarios to build it.

2. A candid approach in making the (functional) consultant aware of hidden limitations that may be tied to the solution in future. Here it is of utmost importance for the consultant and developer to be on the same page in their understanding and approach. This can be achieved only through continuous interaction and information sharing at every stage of development, with the consultant keeping the developer aware of the changes happening on a daily basis (by sharing minutes of discussions held with the business). Based on these, the efforts should be revisited if required.

3. Before starting any part of development, the technical consultant has to ask for a functional specification document, which could be as basic as the solution-in-brief to be developed or may incorporate acute details like database tables to be used, smart forms to be modified, programs to be amended, etc., depending on the scale of the requirement. This clarity will surely be reflected in accurate and realistic estimates.

          Finally, any development approach should be simple and performance-effective in its design. Ultimately, if the convenience and ease-of-use factors are missing, the solution itself will lose its relevance.




Consultant:

      A Functional Consultant acts as a bridge between the client and the developer. Basically, it is the job of a consultant to understand the business requirements and map them in good, workable detail so that the development team can accomplish the required ABAP changes for successful implementation of the desired SAP solution.


      The following are vital factors to be considered by a consultant while doing the critical activity of estimating efforts:


1. Ask relevant questions first: By not jumping to assumptions and by evaluating his understanding of the requirement at every stage of discussion, a consultant can ensure accurate mapping of the client's business need. At this stage, the consultant should ask as many questions as possible, while appropriately sharing his valuable insights/experience in handling similar requirements.

This way, the business can also develop a fair overview of what to expect, and any valid, reasonable drawbacks can be appreciated before implementation itself.


2. One-Team Approach: The consultant has to work in tandem with the developer, constantly interacting with the latter and sharing all vital inputs on a regular basis. Before arriving at any rough estimate, both should agree on the solution approach to be adopted.

Here, the consultant and technical expert should work as one team with crucial individual roles being mutually acknowledged and respected.


      In essence, businesses look for solutions which are efficient (satisfying the business need and easy to use), high on quality (defect-free) and easily scaled up to meet future requirements. Effectively, effort estimation should not be viewed through a myopic lens, as hastily delivering the thing at hand at any cost, but in a broader perspective: implementing solutions with little regressive impact, minimal or no rework and, ultimately, offering a lot more than the business desires through optimum usage of efficient SAP functionalities. This can only be achieved when we, as consultants, try to comprehend the engagement from a developer's and, more importantly, a client's business perspective.

Daily inventory operations include the receipt and issue of goods. For goods issue, companies usually use the reservation process. A department in need of a material creates a reservation with a particular movement type (typically 201, though some use customized types as well). Referencing the reservation, the stores department issues the required material to the concerned department.


Businesses, however, may not find the process completely adequate. Consider the following case: a user department requires a certain quantity of a material. The user needs to put the requirement up for approval with his immediate manager or the head of the department. Once it is approved, he may proceed with the reservation. This is similar to the way a Purchase Requisition (PR) is processed. A PR is created when a material is required whose stock is not available in inventory, or when a completely new material is required. The PR then goes through an approval process, which is mapped in SAP as a release procedure. Rarely, perhaps for materials belonging to the C category, a release procedure is not required for PR approval.


Standard SAP does not provide a release procedure for reservations. A customized dashboard can be created to implement a release process for reservations. The reservation BAdI can also be used to set a flag indicator that differentiates a released reservation from an unreleased one. The approval may be single-level or multi-level, depending on the client requirement.


A similar setup can be made for the issue of goods, in case the stores department also follows the above work process; this requires the use of the MIGO BAdI. It may be argued that such complexity is not suitable for all businesses, and a fair amount of ABAP development is required to achieve it, but a lot of companies do use this process. Some may not require the approval steps to be included in SAP, instead tracking approval via email or other communication and creating the reservation in SAP only once it gets a go-ahead.
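The release logic described above can be sketched as a small state model. This is an illustrative Python sketch only, not ABAP and not a real SAP API: in practice the release flag would live in a custom field or Z-table read by the reservation and MIGO BAdI implementations, and all names below are invented.

```python
from dataclasses import dataclass

@dataclass
class Reservation:
    """Hypothetical model of a reservation with a custom release flag."""
    number: str
    movement_type: str = "201"
    approvals_required: int = 1   # single-level or multi-level release
    approvals_given: int = 0
    released: bool = False        # the custom "release flag" indicator

    def approve(self) -> None:
        """Record one approval; fully release once all levels are done."""
        self.approvals_given += 1
        if self.approvals_given >= self.approvals_required:
            self.released = True

def can_post_goods_issue(res: Reservation) -> bool:
    """The kind of check a MIGO BAdI could enforce: block goods issue
    against a reservation until it has been fully released."""
    return res.released
```

With two-level approval, a goods issue would be blocked after the first approval and allowed only after the second.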

So you’re finally in production with your package software. Users are giving feedback, the system is stabilizing, and your program team is starting to wind down. Now what? Work on your solution never stops, and these key topics have to be addressed: Maintenance / Support and Upgrades / Patches. Now, when compared with custom-coded applications, these topics require some different thinking. First, by leveraging a package solution, your software vendor takes on the burden of providing you with patches and upgrades for your base software. They often can play a role in helping tune your application as well. Finally, your software vendor will almost always have escalation paths for complex support issues as they arise.

You can break down your RUN state in the following way: Business Enhancements, Performance Tuning, Support Services, Software Updates, and ongoing Program Support.

Business Enhancements

By the second day (if not the first), you will start to receive requests for business enhancements for the solution that just went into production. These could range from minor tweaks to missed requirements. The work efforts can also range from a few hours to more than 30 days of development. As these requests come in, you will need to set up a RUN state team that can move ideas through the software development life cycle and get to production on a daily, weekly, monthly or quarterly basis. Most importantly, you need to plan for a period of continuous development to address key enhancement requests immediately after going live. Ideally, your organization is already operating in a model of continuous development.

“Packaged software does a great job keeping up with regulatory and security issues and usually pushes these code updates via Notes, Patches or Hot Fixes.”

Performance Tuning

No matter how much performance testing was performed during the build of the package solution, you will have unexpected results when you hit production. Being prepared to have engineers at the ready to tune the infrastructure, database, middleware, and UX layers of the application will be critical to a good experience for your end users. Don’t plan on just turning this responsibility over to your support teams, but rather retain this in your engineering organization: the engineers that built the solution are in the best position to advise on how to tune it. This function is also something you will need to turn to after every major release or upgrade.

Support Services

Support, in this context, is defined as Level 1 and Level 2 support for both end users and the application itself. This typically means a focus on usability issues for users who may not be trained up on the system yet, as well as data, integration or performance errors. What support organizations should not be doing immediately after the system goes to production is trying to troubleshoot problems (i.e. recurring incidents) or fix "missed requirements." Support should be focused simply on break/fix and consistency of service. Level 3 teams (i.e. the engineers that implemented the system) should be on point for problems or missed requirements.

“Don’t plan on just turning this responsibility over to your support teams, but rather retain this in your engineering organization: the engineers that built the solution are in the best position to advise on how to tune it.”

Ideally, Level 3 teams are embedded with the Support teams during the first 90 days post-production. This will help with knowledge transfer and incident resolution as well as quality. If the Level 3 teams are on the hook for support in the first 90 days, they will have motivation to ensure quality in development leading to a stable production implementation. Finally, before full transition from the development teams to Support, there should be pre-defined criteria that must be met for the system to be turned over. For example, zero high incidents for 4 weeks running or zero open high problems.
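The pre-defined transition criteria mentioned above can be expressed as a simple gate check. The sketch below is illustrative; the "four weeks of zero high incidents" and "zero open high problems" thresholds come from the example in the text, and any real handover gate should use criteria agreed up front:

```python
def ready_for_handover(weekly_high_incidents: list, open_high_problems: int) -> bool:
    """Example handover gate: zero high-severity incidents for the last
    four consecutive weeks AND no open high-severity problems.
    weekly_high_incidents is a per-week incident count, oldest first."""
    last_four = weekly_high_incidents[-4:]
    return (len(last_four) == 4
            and all(count == 0 for count in last_four)
            and open_high_problems == 0)
```

For instance, a system with weekly high-incident counts of 2, 0, 0, 0, 0 and no open high problems would pass the gate, while any high incident in the last four weeks would fail it.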

One point to note is that many software vendors have programs that will evaluate the quality of your implementation and flag areas of concern. You can leverage these programs to hold your organization and your implementation partners accountable for the quality of their implementation.

Software Updates

Patches, Hot fixes, Upgrades, and Notes are all things your software vendor will push out. Sometimes these are on a quarterly basis, but with the pace of technology change and security, these can be as frequent as weekly. Packaged software does a great job keeping up with regulatory and security issues and usually pushes these code updates via Notes, Patches or Hot Fixes. These tend to be relatively straightforward to implement and require minimal regression testing if any at all.

Upgrades can be more complex and typically deliver major functionality changes. In most cases, you will need to plan for full regression testing, which usually means spinning up a full project to manage the upgrade. In particular, you will have to pay attention to your integrations, reports, security, and any customizations you may have made during your implementation. Taking an end-to-end approach to an upgrade initiative is imperative to success. This means starting with what changes the users will experience and ensuring data / master data is not impacted by the upgrade.

In summary, Package Software can provide a lower cost, faster to market approach to enable many business capabilities. Package Software brings built-in innovation and continued development, security and regulatory controls, tested code beds leading to more consistent system performance, and increased data quality.

Final Installment

For most companies that choose package software, they have multiple vendors that assist them in implementation. The vendor ecosystem often ends up being the software vendor, a system integrator responsible for the bulk of the implementation, a RUN state vendor, and a smattering of niche vendors or staff augmentation providers. Our supplemental article on Vendor Management will dig into how to best leverage your vendors to get the most out of them.

The Case for Package Software
The Case for Package Software, Part II
The Case for Package Software, Part III

I was inspired by the "Tip: how to find out the correct component when raising OSS messages" blog post. As mentioned in that post, not choosing the correct component when raising an incident will probably cause the processing to be delayed.


In the same way, a lot of people mention a runtime error in the issue description but provide no information about it. Which error is it? When did it happen? This also delays processing, because the incident has to go back to the customer at least once just to clarify this simple piece of information.


Sometimes the runtime error title (also known as a short dump) is mentioned, but the title alone does not help us much. Sometimes the date when it happened is missing, which makes it difficult to find in transaction ST22 in the system. Sometimes the user provided for support has no authorization to access ST22. It can also happen that the runtime error is attached to the message, but as a screenshot, usually showing only the top of the runtime error and not the complete details. This too can delay message processing.


There is an easy way to export a runtime error with all of the relevant information, making it easy to attach to your OSS incident or to e-mails if you want to share it with someone else on your team. Follow these steps:


1) Go into transaction ST22 to find the runtime error:



2) To see the runtime error details, double click a runtime error:



3) To export it, go to menu path System > List > Save > Local File:



4) A pop-up window will come up for you to choose the format in which you'd like to save it. I'd recommend either the unconverted option, which saves a .txt file, or the HTML format:



5) Another pop-up window will come up for you to select a Directory where to save the exported file and also to give it a File Name:



6) Click on Generate and the file will be saved in the Directory you just defined:




That's it!


Just check the directory you selected to save the file in and it will be there.


The full runtime error (short dump) can now be easily shared! You can attach it to an OSS incident or any e-mails you might want to send your colleagues, or save it to a USB drive.

Before you even start considering package software, you need a good understanding of the business process you are trying to enable. More importantly, you need to know whether this business process is the key differentiator that makes your company successful or a supporting function like accounting. Package software for the most part excels at enabling these supporting functions. There are also many business processes that fall in between: they are more than just support functions, but not so highly specialized that they are truly differentiators. These often appear in supply chain, sales force automation, call center management, and product lifecycle management. Finally, there are industry-specific processes. Many large package software companies like Oracle and SAP have industry vertical solutions that attempt to provide standard software for these processes.

In this first part we will dig into the following topics: Standard Processes, Speed-to-Market, Legacy Systems, On-premise Software, and Software as a Service.

Standard Processes
Package software is well suited to enabling standard processes such as finance or human resources. As organizations start to consider package software, the first exercise they need to go through is a classification of their business processes. These business processes need to be broken into categories such as differentiating, competitive, and non-competitive. Enabling competitive and non-competitive processes with package software is a great place to start. Most differentiating processes will be too unique to easily enable in a package software solution. While there are many reasons to leverage package software for differentiating processes, that is a topic for another post.

Speed-to-Market
Package software promises speed-to-market if you are able to accept its standard out-of-the-box approach for your processes. This puts the onus of change on the business vs. the IT organization. For those package solutions that allow you to tweak their standard processes through configuration to better match yours, you can still enable your business process. However, there are several key areas that you must think through in order to move fast.

The first is data. Will you convert data or start your transactional history from scratch? Do you know where your master data lives, and will you enable it in the new system or integrate with another system?

The second is integrations. How many upstream and downstream systems are required to integrate into the new solution? How flexible are those systems in changing their data structures to match the new package system? Will you need to create a translation layer to manage the data between the systems?
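A translation layer of the kind described is often just a mapping of field names and code values between the two systems' data structures. A minimal sketch follows; every field name and status code in it is invented for illustration and not taken from any real system:

```python
# Hypothetical mapping from legacy field names to the package system's names.
LEGACY_TO_PACKAGE = {
    "cust_no": "customer_id",
    "mat_cd": "material_number",
    "qty": "quantity",
}

# Hypothetical mapping between incompatible status code lists.
STATUS_CODES = {"O": "OPEN", "C": "CLOSED"}

def translate(legacy_record: dict) -> dict:
    """Rename a legacy record's fields and translate its code values
    into the shape the new package system expects."""
    out = {LEGACY_TO_PACKAGE.get(key, key): value
           for key, value in legacy_record.items()}
    if "status" in out:
        out["status"] = STATUS_CODES.get(out["status"], out["status"])
    return out
```

Keeping the mappings in data (rather than scattered through interface code) makes it easier to catalogue integrations and to evolve them when either system's data structures change.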

The third is performance. Do you have the internal infrastructure to manage the peak loads of the new system? Have infrastructure timelines been aligned to the overall implementation timelines? Do you know how the new system will scale?

These three components of data, integrations and performance are critical to understand in order to move fast. The good news is that many software vendors deal with these issues in every implementation and in some cases have built out methods to address these components. Unlike custom developed software, you have a starting point of where you need to begin in managing these risks to the ability to move fast.

Legacy Systems
We touched upon legacy systems a bit in the speed-to-market section above. So let’s dig a bit deeper on Data, Integration, Performance and Reporting.

Starting with Data, you will need to understand the quality of the data that exists in the legacy environment. If you are converting from a custom developed application or even an old package solution, you most likely have more flexible data rules than what the new system has. This will inevitably lead to challenges during data conversion to the new system. The payoff is structured data that will ensure data quality and consistency, easing future data consumption. Start data activities in parallel with the rest of the integration activities, otherwise you will find this track lagging and it will ultimately hold up your go-live.

Reporting should also be started in parallel with data and the whole program. Understanding what the new software package enables is important, and educating users on what is available out of the box is critical. What usually is not available from a software package is ad-hoc reporting. If this is a requirement, you will need to move data between the transaction system and your data warehouse solution. In addition, if analytical reports are required vs. standard historical reporting, these too may need to be built off your data warehouse. However, many software vendors are migrating to in-memory systems that allow very fast data analysis without having to move data between systems. These issues are similar to those faced in custom solutions, and stem from the performance cost of doing data analysis against the transaction system.

Integration is another challenging part of making software packages work in a heterogeneous architecture environment. Embedding your integration team into your overall implementation team is a great way to ensure that integrations are completed on time. Cataloguing your integrations and determining whether you can leverage a services architecture vs. point-to-point integration may ease your overall development and simplify your architecture in general. The biggest challenges in developing integrations are incompatibilities in the data structures between the systems, performance of the interface, and access to knowledgeable subject matter experts on both systems. These integration challenges hold true regardless of custom or package software solutions.

Finally, the performance of legacy systems, and more importantly of the interfaces, is often overlooked. For example, if there is an expectation of real-time updates but the legacy source system operates on a nightly batch, there is no way to provide real-time data. Interface performance tends to be a more common challenge with legacy systems: when data flowing through the interface is disrupted, the new software package lacks the relevant data for the business to do its job. Fault tolerance in your interfaces is required to ensure a seamless experience for your users. Finally, the package itself needs to be performance tested. The software vendor will have great statistical data proving performance, but you must plan to validate it in your own environment.

On-Premise Software vs. Software as a Service (SaaS)
Software as a Service has grown tremendously over the past decade. This includes traditional software vendors adding SaaS offerings to their product lists. The hard part is deciding whether SaaS, hosted, or on-premise software is the way to go. In evaluating these options, the following themes emerge as points of consideration: Security, Performance, Customizations, and Support.

Let’s start with on-premise software. If your business is concerned about keeping its data secure, hosting your solution within your own data center, or on dedicated hardware within your data center provider’s offering, is a must. By hosting the solution yourself, you control who has access to the data, you control network and physical access, and you don’t risk security breaches while data is in transit. Many SaaS providers don’t segregate your data from their other customers’ data, increasing the risk that your data could be exposed inappropriately.

On-premise software also puts you in control of performance. You are able to scale as you need to. If you have integrations from your software package to legacy systems, keeping data movement within your network is often faster than moving it over the Internet to your SaaS provider.

If you require customizations, you can control and manage them because the source code is on premise, where you can access it as you please. SaaS providers gain their scale by not allowing customizations and typically prohibit customers from making any change to the core code, limiting the extensibility of the solution.

Finally, there is support. On-premise software supported by your company falls within your own support SLAs. SaaS providers may not align with your internal SLAs for upgrades and outages; given their business models, they often don’t have to avoid your business-sensitive times. For example, a SaaS provider may have an eight-hour SLA for a severity-one issue, while your internal SLA for severity-one issues is four hours.

SaaS solutions have many benefits around speed to market, scalability, cost, and performance. Because the ability to customize or configure is limited, SaaS is often very turnkey, which translates into speed to market. Pricing tends to be lower cost at the start thanks to the subscription model, versus the license model of traditional software, and you don’t have to invest in costly infrastructure or the time to order, install, and configure that hardware.

From a performance perspective, many SaaS providers invest heavily in their data centers beyond what most companies may invest in their own data centers, leading to higher availability. The issue is that when they do have an outage, you don’t have control over recovery.

When considering SaaS solutions, focus on the following business functions as a starting point: travel and expense management, training, knowledge management, sales force automation, and talent management.

Please add your own perspectives on your experiences with Package Software. In the next installment we will discuss the Secrets of Making Packages Work in more depth. Topics will include a deeper dive on integration, data conversion, user interfaces, customizations, skills, and system integrators.

Links to Part II and Part III

When creating OSS messages, a lot of people have no clue which component to use, so they just choose a common component such as BC-ABA or FI-GL, or pick one at random. This causes message processing to be delayed. Here I would like to share how to find the correct component.



In general, SAP distinguishes components according to its products. For R/3, components are distinguished according to packages/programs. Below are some examples; the rule is always to find the corresponding package.


1. If you have the program name.


Go to transaction SE38 -> find the package of the program -> double-click on the package name



2. If you have the transaction name


Go to transaction SE93 -> display the transaction -> double-click on the package name



3. If you have the function module name.


Go to transaction SE37 -> display the function module -> go to the Attributes tab -> double-click on the package name



4. As shown in the three examples above, whether it is a table (SE11), a business object (SWO1), or a BAdI (SE18), find the attributes of the object and you will find the package, and from there the application component.


5. Service market place  


1). How to open system connections (R/3, WTS, HTTP…) and issues regarding those connections.


Component: XX-SER-NET-HTL



2). Regarding S users and authorizations on the Service Marketplace.


Component: XX-SER-SAPSMP-ACC



6. Others


If the product is not SAP R/3 and you have no idea which component to use, just call one of the hotlines listed in SAP Note 560499 (Global Support Customer Interaction: Telephone/fax/e-mail) and they will help you find the proper component.
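The lookup chain in steps 1 through 4 is always the same: object -> package -> application component. As a minimal sketch, here it is in Python; the table contents are hypothetical sample data invented for illustration (in a real system the object-to-package assignment lives in the object's attributes, as described above), and `find_component` is not an SAP API:

```python
# Illustrative lookup chain: object -> package -> application component.
# All entries below are hypothetical sample data, not real SAP assignments.

# Object directory: (object type, object name) -> package
OBJECT_TO_PACKAGE = {
    ("PROG", "ZMY_REPORT"): "ZFI_TOOLS",
}

# Package attributes: package -> application component
PACKAGE_TO_COMPONENT = {
    "ZFI_TOOLS": "FI-GL",
}

def find_component(obj_type: str, obj_name: str) -> str:
    """Resolve an object to its application component via its package."""
    package = OBJECT_TO_PACKAGE.get((obj_type, obj_name))
    if package is None:
        raise KeyError(f"object {obj_type} {obj_name} not found")
    return PACKAGE_TO_COMPONENT[package]
```

With this sample data, `find_component("PROG", "ZMY_REPORT")` resolves to "FI-GL"; in the system itself you perform exactly the same two hops by hand in SE38, SE93, or SE37.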

