Introduction: Today, organizations need effective SAP implementations to create value for their customers while reducing their own cost of services. To resolve key business issues or make strategic business decisions, decision makers look at data. Hence effective data conversion is gaining importance.
Data migration basically means moving data from one system to another. It can be driven by several initiatives taken up by the customer, such as an application change (moving from Oracle to SAP) or an upgrade (moving to a newer SAP release).
Conversion does not simply mean moving data from one system to another; it means moving meaningful data.
To emphasize this point, consider a simple example: I have a data conversion requirement for the customer master, and during the data load, for unknown reasons, a digit goes missing from a customer's contact number. Imagine the impact this small miss will have on customer service.
Data Conversion Challenges: In general, data migration is considered to be a simple task, which understates the real risks involved.
How can a data pre-validation tool help?
This tool is a step toward smooth data migration: it gives the migration team the ability to run checks on the data before the actual load starts. We see pre-validation as a step between the Transform and Load phases of the standard ETL sequence, ensuring data quality and saving time on the migration activity.
This preload validation tool is generic and scalable; it can be used across SAP systems for diverse conversion requirements pertaining to data load activities across various functional modules.
Value Proposition
This tool gives you the flexibility to identify and resolve data issues even before the data is loaded into SAP.
Technical Design
The idea is to create a generic tool, which requires determining the input file structure at run time.
We will need to create a couple of custom tables along with their maintenance views:
The next step is to create a report program whose selection screen takes a conversion identification number and an input file path as mandatory inputs. After execution, the report displays an ALV output listing the errors found in each record of the input Excel file. The output is easy to understand, with only a few fields: the Excel row containing the error, the field name, the field value, and an error description.
The report can also generate a graphical output displaying the errors associated with each field based on the predefined categories.
The core validation logic of the report reads the DDIC attributes associated with each field and table name stored in the item table, and performs type checks, value-table or check-table checks, format checks for date or currency fields, and length checks.
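As a rough illustration, the DDIC attributes of a field can be read with the standard function module DDIF_FIELDINFO_GET and then used to drive the checks. This is only a sketch; the variable names (LS_ITEM from the custom item table, LV_VALUE holding the cell value read from the input file) are assumptions, not the author's actual code:

```abap
" Look up DDIC attributes for one field of the target table.
" ls_item-tabname / ls_item-fieldname come from the custom item table;
" lv_value is the cell value read from the input file (assumed names).
DATA ls_dfies TYPE dfies.

CALL FUNCTION 'DDIF_FIELDINFO_GET'
  EXPORTING
    tabname        = ls_item-tabname     " e.g. 'KNA1'
    fieldname      = ls_item-fieldname   " e.g. 'TELF1'
  IMPORTING
    dfies_wa       = ls_dfies
  EXCEPTIONS
    not_found      = 1
    internal_error = 2
    OTHERS         = 3.

IF sy-subrc = 0.
  " Length check against the DDIC output length
  IF strlen( lv_value ) > ls_dfies-outputlen.
    " append a length-error record to the ALV output table
  ENDIF.
  " Type/format check: e.g. DATS fields must be a valid YYYYMMDD date
  " Check-table check: if ls_dfies-checktable is filled, verify the
  " value exists there with a SELECT SINGLE
ENDIF.
```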
Tool Development
Step 1: Create Header Table
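A minimal sketch of such a header table in DDIC DDL notation follows; the table and field names (e.g. ZPREVAL_HEAD, CONV_ID) are illustrative assumptions, not prescribed by the article:

```abap
@EndUserText.label : 'Pre-validation: conversion header'
define table zpreval_head {
  key client  : abap.clnt not null;
  key conv_id : abap.char(10) not null; " conversion identification number
  tabname     : abap.char(30);          " target DDIC table, e.g. KNA1
  descr       : abap.char(60);          " description of the conversion object
}
```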
Step 2: Create Item Table
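The item table would hold one row per input file column, pointing to the DDIC field whose attributes drive the checks. Again a sketch with assumed names:

```abap
@EndUserText.label : 'Pre-validation: field-level items'
define table zpreval_item {
  key client  : abap.clnt not null;
  key conv_id : abap.char(10) not null; " links to the header table
  key col_pos : abap.numc(3) not null;  " column position in the input file
  tabname     : abap.char(30);          " DDIC table holding the field
  fieldname   : abap.char(30);          " DDIC field name
  mandatory   : abap.char(1);           " 'X' = value required
}
```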
Step 3: Define generic output structure for ALV display.
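The generic output structure could be declared locally in the report along these lines (field names are assumptions matching the ALV output described above):

```abap
" Generic ALV output structure (illustrative names)
TYPES: BEGIN OF ty_output,
         excel_row  TYPE i,          " row of the input Excel file in error
         fieldname  TYPE fieldname,  " DDIC field name
         fieldvalue TYPE string,     " offending value
         message    TYPE string,     " error description
         category   TYPE c LENGTH 2, " optional predefined error category
       END OF ty_output.
```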
Step 4: Define a customizing table for error categories defined (Non-mandatory Step)
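Such a customizing table might look like the following sketch (names and category keys are assumptions):

```abap
@EndUserText.label : 'Pre-validation: error categories'
define table zpreval_cat {
  key client : abap.clnt not null;
  key categ  : abap.char(2) not null; " category key, e.g. 'LN'
  cat_text   : abap.char(40);         " e.g. 'Length exceeded'
}
```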
Step 5: Create a report program in transaction SE38 with a name such as "Z_VALIDATE_DATA_READ_VALIDATE", with the selection screen described in the section above.
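The selection screen with its two mandatory inputs could be sketched as follows; the parameter names are assumptions, and F4_FILENAME is the standard function module for a local-file value help:

```abap
REPORT z_validate_data_read_validate.

" Mandatory inputs: conversion identification number and input file path
PARAMETERS: p_convid TYPE c LENGTH 10 OBLIGATORY,
            p_file   TYPE rlgrasp-filename OBLIGATORY.

" Standard F4 value help for picking a local file
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
  CALL FUNCTION 'F4_FILENAME'
    IMPORTING
      file_name = p_file.
```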
Step 6: Build the code along the lines of the code snapshot in the appendix at the end.
Step 7: The ALV Output/Graphical Output
You can also choose the chart type to group the error counts by the predefined categories.
Appendix: Code Snapshot is attached