This article (Practical Planning) is part of the upcoming series "Considerations for Successful Data Migration to SAP", which covers high-level planning considerations for migrating data into SAP; it does not cover basic definitions or methodologies. The views expressed here arise from experience on several successful projects and are offered as "considerations". Feel free to comment and critique. Enjoy SAP.
1. Engage the right Resources at the right time
ROI on EIM/data management investments comes from user collaboration and not just from technological solutions.
Business and IT teams are collectively responsible for migrating data in ways that, upon cutover:
- Are least disruptive to enterprise Business as Usual (BAU)
- Ensure BAU resumes on the highest-quality data, rather than forcing users to work around bad data in the new target system
- Avoid post-conversion data correction and remediation in the target system
For a successful migration, form the Data team right: engage Source Business Leads, IT, SAP IT/Conversion resources and SAP Validators appropriately.
Ideally, for every conversion object / stream, consider a conversion team comprising:
- Business (Source) Data Owner / SME:
- Define Conversion Objects, Conversion Criteria and Conversion Specifications
- Define Validation procedures
- Perform pre-load and post-load validations per conversion cycle
- Lead data cleansing, purging and de-duplication activities
- Source IT Resource
- Implement source data extraction criteria
- Support source data cleansing activities
- Support data extraction through conversion cycles
- Assist with defining conversion criteria and specifications
- SAP Configurator / IT Functional Lead
- Configure and organize SAP screen / field layouts for the conversion object
- Set up supporting organizational structures
- Establish SAP validation rules and requirements and assist with definition of conversion specifications per SAP configuration
- Define load and validation methods
- ETL Programmer
- Develop, unit-test, deploy and execute ETL jobs
- Develop and deliver pre- and post-load validation reports
- Load data into target system
- SAP Loader (can be same as ETL Programmer)
- Execute load programs to load data into target system
- Deliver load reports with record counts and error reasons (if any)
- SAP Validator (can be same as Business (Source) Data Owner / SME)
- Execute data validation scripts, procedures and test variants, as established per data object
- Reconcile data
- Sign-off on data
Once team structures are scoped, consider engaging the teams in the activities listed above right from the blueprinting phase.
2. Establish the right procedures to deliver Good Data
Every project goes through integration test (IT<X>) cycles, user acceptance (UA), mock cutover, final cutover and go-live (mid-year or at year-end). Executing these cycles with converted data is seldom the norm, for reasons such as:
- Core application functionality and RICEFW testing can be done with sample data
- Data is not deemed a predecessor to IT cycles due to time and resource constraints
- Converted data is unavailable for the IT cycle because data conversion wasn't given the importance it deserves
Implement Comprehensive Validation Processes
Implement the Extract, Transform, Validate (pre-load), Load and Validate (post-load) routine, ETVLV for short, for data conversion. Engage the Business Data Owners in comprehensive validation of the data, both before and after conversion/loading.
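As a rough illustration, the ETVLV routine can be thought of as a pipeline of stages. The sketch below is a minimal, hypothetical rendering of that flow; the function names, the record shape and the transformation rule are illustrative assumptions, not part of any SAP toolset:

```python
# Minimal sketch of the ETVLV routine: Extract, Transform,
# Validate (pre-load), Load, Validate (post-load).
# Functions, field names and records are illustrative assumptions.

def extract(source):
    """Pull raw records from the legacy source."""
    return list(source)

def transform(records):
    """Map source fields to target-system conventions (trivial example)."""
    return [{"id": r["id"], "name": r["name"].strip().upper()} for r in records]

def validate_pre_load(records):
    """Split records into valids and invalids before loading."""
    valids = [r for r in records if r["id"] and r["name"]]
    invalids = [r for r in records if not (r["id"] and r["name"])]
    return valids, invalids  # invalids go back to the Business Data Owner

def load(records, target):
    """Load valid records into the target; return a simple load report."""
    target.extend(records)
    return {"attempted": len(records), "loaded": len(records), "errors": 0}

def validate_post_load(records, target):
    """Reconcile record counts between what was sent and what landed."""
    return len(target) == len(records)

source = [{"id": "V001", "name": " acme corp "}, {"id": "", "name": "orphan"}]
target = []
valids, invalids = validate_pre_load(transform(extract(source)))
report = load(valids, target)
ok = validate_post_load(valids, target)
```

The point of the structure is that every stage produces an auditable artifact (invalids list, load report, reconciliation result) that the Business Data Owner can review before sign-off.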
Set DATA as predecessor for test cycles
- Plan for Data Conversion in all cycles by defining logical "source data freeze" dates per cycle, by data type (one for master and one for transactional). Convert data as of the month-end / quarter-end / year-end prior to the IT cycle.
- Extract, Transform, Validate (pre-load), Load and Validate (post-load) (ETVLV) and reconcile data in all cycles
- Execute data test scripts in all cycles—here are a few examples:
- Trial balance checks upon balance and open transactional conversion
- Payment simulations using Vendor Master data and AP open transactional data
- Depreciation, and AUC settlements on converted FA/PS data
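For instance, a trial balance check after balance and open-item conversion reduces to comparing per-account control totals between source and target. The sketch below is a hedged illustration; the G/L account numbers and amounts are invented:

```python
# Illustrative trial balance reconciliation by G/L account.
# Account numbers and amounts are made-up sample data.
from collections import defaultdict
from decimal import Decimal

def trial_balance(items):
    """Sum converted open-item amounts by G/L account."""
    totals = defaultdict(Decimal)
    for account, amount in items:
        totals[account] += Decimal(amount)
    return dict(totals)

def reconcile(source_items, target_items):
    """Return per-account differences; an empty dict means balanced."""
    src, tgt = trial_balance(source_items), trial_balance(target_items)
    return {a: tgt.get(a, Decimal(0)) - src.get(a, Decimal(0))
            for a in set(src) | set(tgt)
            if tgt.get(a, Decimal(0)) != src.get(a, Decimal(0))}

# Open AP items as (G/L account, amount) pairs
source = [("160000", "-1200.00"), ("160000", "-300.00"), ("113100", "1500.00")]
target = [("160000", "-1200.00"), ("160000", "-300.00"), ("113100", "1500.00")]
diffs = reconcile(source, target)  # {} when the conversion balances
```

Using Decimal rather than float matters here: monetary control totals must match to the cent, and binary floating point would introduce spurious differences.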
The above outline is what's typically done in a cutover window; practice it through the testing cycles and perfect it. Consider scheduling and planning the go-live cutover based on factual, actual cutover rehearsals from each project testing cycle. Details on repetitive iterations are in the next section, "SPRINT through ASAP"; for this one, the suggestion is to rehearse cutover during every test cycle.
3. SPRINT through ASAP, iterate, repeat, practice and perfect
Consider practicing startup and cutover along the project's testing cycles, as mentioned in the previous section. Plan for iterative conversion cycles that complete in time (as predecessors) for the project testing cycles. Consider the structure below for each conversion cycle.
1. Profile Data
2. Review Conversion Criteria and Rules
   - Example: ageing criteria for converting master data: has the record transacted in the past X years and X months? Is the record still a real-world object/entity? If the record qualifies, continue to Step 3
3. Cleanse, purge and de-duplicate at the source / ETL staging area
4. Gather Data Extracts
   - Proceed to Step 5 with qualifying data
   - Data that does not meet the conversion criteria is returned to Step 2
5. Validate Pre-Load Data: records that pass the validation routines are valid for conversion; records that fail are flagged as "Invalids" and handed off to Business for source data cleansing and review of the conversion validation rules. For valid records:
   - Perform domain and formatting checks, mandatory-field checks, lookup/translation validity checks and integrity checks
   - Validate control totals and active record counts
   - Review key attributes such as prices, UoM, currencies, etc.
6. Load Data
   - Valid data from Step 5 is loaded via the established load mechanism; records that load successfully are validated post-load, reconciled and signed off (Steps 7 and 8)
   - Valid data from Step 5 that fails to load is submitted for manual/automated correction, loaded as a delta and taken through Steps 7 and 8
7. Validate Post-Load
   - Record counts, integrity checks, validation scripts, control totals, balance / difference checks, etc.
8. Reconcile and Sign-off
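The pre-load checks in Step 5 can be sketched as a small rule set applied per record. This is an illustrative sketch only; the field names, the material-number format rule and the UoM lookup table are assumptions, not actual SAP validation rules:

```python
# Sketch of Step 5 pre-load checks: mandatory fields, domain/format
# rules and lookup (translation-table) validity.
# Field names, the format rule and the UoM table are assumptions.
import re

UOM_LOOKUP = {"EA", "KG", "L"}          # assumed translation table
MANDATORY = ("material", "uom", "price")

def check_record(rec):
    """Return a list of error strings; an empty list means the record is valid."""
    errors = [f"missing {f}" for f in MANDATORY if rec.get(f) in (None, "")]
    if rec.get("material") and not re.fullmatch(r"[A-Z0-9]{1,18}", rec["material"]):
        errors.append("material fails format check")
    if rec.get("uom") and rec["uom"] not in UOM_LOOKUP:
        errors.append("uom fails lookup check")
    if rec.get("price") is not None and not rec["price"] > 0:
        errors.append("price fails domain check")
    return errors

records = [
    {"material": "FG1001", "uom": "EA", "price": 12.5},
    {"material": "fg-1002", "uom": "BX", "price": 0},
]
valids = [r for r in records if not check_record(r)]
invalids = [(r, check_record(r)) for r in records if check_record(r)]
```

Keeping each failure reason as an explicit string gives the Business Data Owner an actionable "Invalids" report for source cleansing rather than a bare pass/fail flag.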