
OVERVIEW

As the heading implies, there is an inherent relationship between business process management and data management. One of the items you will not see on a company’s balance sheet is the value assigned to its data.
This paper is not meant as a “silver bullet”, but it should generate enough questions to elicit a response from Business and IM as to what needs to be done and when work can start, rather than the age-old refrain of “I see no value in fixing my data”.

DEFINITIONS

Data management is a sub-set of information management, while process management is a sub-set of business management. The need is therefore to bring these two areas closer together, with the one enabling the other.
   
Data can be classed as either structured or unstructured. Unstructured data is data such as e-mails, tweets and the like. Structured data is, hopefully, what is in your ERP system. Data as a whole can be split into the following sub-sections (a short code sketch of these classes follows the list):


• Metadata – Describes or defines other data: the characteristics that need to be known about data, including information about physical data representations, technical and business processes, rules, constraints and data structures. The enterprise structure in an ERP system is an example of metadata.
• Master data – A set of consistent and uniform attributes and identifiers that defines the core entities of the enterprise [Source: Gartner]. It consists of elements across multiple functional areas, from two or more disparate systems or business processes. Correct master data is key to analytics.
• Reference data – Defines the set of permissible values to be used by other data fields, normally seen as a drop-down list from which a value can be selected.
• Transactional data – Always has a time dimension, a numerical value, and refers to one or more objects. Examples of transactional data include sales orders, purchase orders, invoices, payments, etc.
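
To make these classes concrete, here is a minimal sketch in Python; the entities, fields and values are illustrative assumptions, not taken from any particular ERP system:

from dataclasses import dataclass
from datetime import date

# Reference data: the set of permissible values for other fields,
# typically presented as a drop-down list.
PAYMENT_TERMS = {"NET30", "NET60", "COD"}          # illustrative values

# Master data: consistent, uniform attributes and identifiers that
# define a core entity of the enterprise.
@dataclass
class Vendor:
    vendor_id: str        # unique identifier
    name: str
    payment_terms: str    # constrained by the reference data above

# Transactional data: always has a time dimension, a numerical value,
# and a reference to one or more objects (here, a vendor).
@dataclass
class PurchaseOrder:
    po_number: str
    vendor_id: str        # points to a master data object
    amount: float         # numerical value
    order_date: date      # time dimension

# Metadata: describes or defines the other data, e.g. field rules.
VENDOR_METADATA = {
    "vendor_id": {"type": "str", "mandatory": True},
    "payment_terms": {"type": "str", "reference": "PAYMENT_TERMS"},
}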

STRATEGY AND APPROACH
 
The fundamentals of a data strategy are:-

• Standardize
• Consolidate
• Improve

Try not to take the “boil the ocean” enterprise approach to design and implementation. Focus on the critical data-oriented improvements, such as:-
 
• Data Governance
• Data Standards
• Data Quality Management
• Metadata Management

Who owns the data?
 
• The answer is business; more precisely, the business process owner (BPO) should own the data for their process, a.k.a. the Data Guardian.

Who maintains the data?
 
• Data stewards maintain the data. There could be one or more data stewards per business process.

What does the data mean?

• These are the characteristics, dimensions, attributes, variables and measures associated with the data.

Where does the data come from?

• The source of the data could be multiple systems; most commonly, the ERP system is your largest source of structured data.


How is data collected?

• Most commonly through the ERP system; it is not only the source of the data, it also collects transactional data as it is recorded.
 
Where is data stored?

• Data can be stored in your ERP, archives, data lakes, data warehouses, etc. The IM department is responsible for storing the data, securing the data, ensuring integrity of the data and providing access to the data, a.k.a. the Data Custodian.
 
When is data collected?
 
• This could be via on-line transaction processing (OLTP) or a daily, weekly or monthly feed.
 
How will data be used?

• As management information
• On-line analytical processing (OLAP), etc.


DATA GOVERNANCE

Data governance sets the policies and standards that address how data management decisions are to be made and monitors compliance.
 
Data stewardship is the maintenance of the data and includes the operational activities to be performed according to policies and standards established under governance.
 
Management of change ensures successful adoption and implementation of policies and standards across the organization.
 
Data quality management provides the framework and processes for ensuring data quality.

Architecture provides the models, styles, access mechanisms, metadata and integrity to the blueprint that guides an implementation.

Technology enables the architecture, through tools for quality, reporting, profiling, sustainability, etc.

Data governance issues can be divided into three categories:-

• People – Who owns the data
• Process – Fragmented or functional
• Technology – Disparate sources, integration, growth and complexity
 
This leads to issues around:-

• Quality
• Granularity
• Accuracy
• Accessibility
• Integration
• Usability
• Security
• Value
• Trust

The dimensions of data governance
 
• People
       Education on the need for governance
       Data ownership
       Governance model
       Align people to new roles as part of governance structures
• Process
       Methodology to implement, enable and sustain data governance
       Policies, procedures and standards for governing data
       KPIs for measuring and monitoring efficiency, effectiveness and quality
• Technology
       Invest in enablers for governing data and information

The solution components

• People
       Establish structures
       Define roles and responsibilities
       Create functional models
• Process
       Establish a methodology
       Policies, procedures, standards and change management
       Quality framework
        Business rules
        KPIs
        Issue resolution
• Technology
       Quality assessment scorecards
       Governance dashboards
       Data management tools & workflows
       Process adherence
       Sustainability


A data domain is a specific area where identification and control of the master data is focused. Based on the various end-to-end processes, the data domains would look something like this:
 
• MLE – Finance (General Ledger, Cost Accumulators, Assets, Enterprise Structure, etc.)
• MOR – Engineering (Plant Maintenance, Functional Locations, Objects, etc.)
• PTC – Marketing & Sales (Customers)
• PTP – Supply Chain (Vendors, Facilities, Services, Contracts, etc.)
• RTP – Manufacturing (Materials, Products, etc.)

DATA STANDARDS

Typical data errors (a short detection sketch in code follows this list):

• Duplicate entries
• Missing data
• Multiple names for same customer/vendor
• Incomplete data
• Nonstandard formats
• Incorrect address information
• Typos
• Data recorded on incorrect data elements
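
Many of these errors can be caught with simple automated checks. A minimal sketch in Python, assuming a hypothetical vendor extract in CSV with vendor_id, name and city columns:

import csv
from collections import Counter

def check_vendor_file(path: str) -> None:
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    # Duplicate entries / multiple names for the same vendor: the same
    # normalized name appearing more than once.
    names = Counter(row["name"].strip().upper() for row in rows)
    for name, count in names.items():
        if count > 1:
            print(f"Possible duplicate vendor: {name} ({count} entries)")

    # Missing or incomplete data: mandatory fields left empty.
    for row in rows:
        for field in ("vendor_id", "name", "city"):
            if not row.get(field, "").strip():
                print(f"Vendor {row.get('vendor_id') or '?'}: missing {field}")

check_vendor_file("vendors.csv")   # hypothetical extract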

The recommendation is to create a data dictionary. Each data element used in a particular master data structure should be defined: what it is used for, where it is used, the format it should be captured in, and its association to the data model. Is it a mandatory field, an optional field, or should the field be suppressed, i.e. not in use? Should the mandatory and optional fields have reference data attached to them, i.e. a drop-down table from which to choose a value? Data standards are part of the enterprise architecture.
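
As a minimal sketch of what such a dictionary might look like, assuming purely illustrative field names, formats and reference values:

# Each entry defines one data element: its purpose, usage, format,
# whether it is mandatory, optional or suppressed, and any attached
# reference data (a drop-down of permissible values).
DATA_DICTIONARY = {
    "payment_terms": {
        "description": "Agreed terms of payment with the vendor",
        "used_in": ["vendor master", "purchase order"],
        "format": "code",
        "status": "mandatory",                        # mandatory | optional | suppressed
        "reference_data": ["NET30", "NET60", "COD"],  # drop-down values
    },
    "fax_number": {
        "description": "Legacy contact field, no longer captured",
        "used_in": [],
        "format": "text",
        "status": "suppressed",                       # field not in use
        "reference_data": None,
    },
}

def validate(field: str, value: str) -> bool:
    """Check a captured value against its dictionary definition."""
    entry = DATA_DICTIONARY[field]
    if entry["status"] == "suppressed":
        return False                  # field should not be captured at all
    if entry["status"] == "mandatory" and not value:
        return False                  # mandatory field left empty
    if entry["reference_data"] is not None:
        return value in entry["reference_data"]
    return True

print(validate("payment_terms", "NET30"))   # True
print(validate("fax_number", "555-0100"))   # False: suppressed field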


DATA QUALITY

The synergy between high quality data and good business processes means you can exploit information to the fullest.

Data quality refers to the condition of a set of data; data is generally considered high quality if it is fit for purpose. It should be consistent across the various systems within the enterprise: disparate systems should be kept updated across the enterprise, and mechanisms should be put in place to ensure that an update in one is an update for all.

Enterprises should conduct periodic reviews of their data through data quality assessments. These evaluations would highlight any deviations from standards, missing data, spelling errors, wrong assignments, etc. The assessments would also highlight the effectiveness of the measures that are in place to ensure adherence to process and accuracy at the time of capture. The evaluations would determine whether an enterprise needs to conduct a data cleansing exercise or a data enrichment exercise.

The 1:10:100 principle can serve as a guide to data quality. Very simply, it states that the cost of entering data correctly at the time of the event is $1.00, correcting it at a later stage costs $10.00, and doing nothing at all will cost $100.00 sometime in the future; for example, a decision made using incorrect or outdated data could result in financial loss to the enterprise.
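
To put the 1:10:100 principle into figures, a small illustrative calculation (the record volume and error rate are assumptions):

records = 50_000                    # master data records captured in a year (assumption)
error_rate = 0.05                   # 5% would be captured incorrectly (assumption)
bad = int(records * error_rate)     # 2,500 problem records

print(f"Capture correctly at the event: ${bad * 1:,}")     # $2,500
print(f"Correct at a later stage:       ${bad * 10:,}")    # $25,000
print(f"Do nothing (failure cost):      ${bad * 100:,}")   # $250,000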

The benefits of data quality management (DQM) are immense.

DQM will:- 

• Improve business analytics
• Provide a single view of master data
• Improve profitability analysis
• Ensure data consistency
• Embed data governance policies and standards
• Enhance decision making
• Deliver business value to management accounts.
 
Document data quality requirements and define rules for measuring quality. Ensure business rules are in place and that KPIs measure exceptions rather than the rule. Deviations should be reported and remediated immediately. Engage with business users to gain insight into the business objectives and to solicit their expectations on data usability. Data quality issues can be translated into rules for measuring key dimensions of quality, such as the following (a sketch of such rules in code follows the list):
 
• Data consistency
• Value formats in disparate systems
• Data completeness
• Sources of record
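
A minimal sketch of such rules in Python, expressed as simple predicate functions with a pass-rate KPI; the field names and formats are illustrative assumptions:

import re

RULES = {
    "completeness: vendor has a tax number":
        lambda rec: bool(rec.get("tax_number")),
    "consistency: US vendors are billed in USD":
        lambda rec: rec.get("country") != "US" or rec.get("currency") == "USD",
    "format: US postal code is 5 digits":
        lambda rec: rec.get("country") != "US"
        or bool(re.fullmatch(r"\d{5}", rec.get("postal_code", ""))),
}

def measure(records: list) -> dict:
    """Return the pass rate per rule as a simple KPI."""
    return {
        name: sum(1 for r in records if rule(r)) / len(records)
        for name, rule in RULES.items()
    }

sample = [
    {"tax_number": "123", "country": "US", "currency": "USD", "postal_code": "02139"},
    {"tax_number": "",    "country": "US", "currency": "EUR", "postal_code": "2139"},
]
print(measure(sample))   # each rule passes for 1 of 2 records: 0.5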

Assess new data to create a quality baseline. Data profiling is used to scan data across data sets to check for formats and completeness. Profiling tools can feed information back to data stewards, indicating possible errors in data types, structures, relationships and primary keys in databases.
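
A minimal profiling sketch, assuming tabular data already loaded as a list of dictionaries; real profiling tools do far more, but the idea is the same:

from collections import Counter

def profile(rows: list) -> None:
    """Report completeness and the most common value formats per column."""
    for col in rows[0]:                      # assumes at least one row
        values = [str(row.get(col, "")) for row in rows]
        filled = [v for v in values if v.strip()]
        # Mask each value: digits -> 9, letters -> A, punctuation kept,
        # so "02139" becomes "99999" and "NET30" becomes "AAA99".
        masks = Counter(
            "".join("9" if c.isdigit() else "A" if c.isalpha() else c for c in v)
            for v in filled
        )
        print(f"{col}: {len(filled)}/{len(values)} populated, "
              f"top formats: {masks.most_common(3)}")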

Implement semantic metadata management processes. As the use of disparate systems in an enterprise grows, so the need to centrally manage and govern business-relevant metadata increases. Management should collaborate on establishing standards to reduce the risk of users having inconsistent interpretations of metadata, which in turn leads to inconsistencies in reporting.

Check data validity on an ongoing basis. Tools have been developed for validating data against the quality rules that have been defined. Strategic implementation enables the rules and validation mechanisms to be shared across applications and deployed at various locations in an enterprise’s data flow. This allows for continuous data inspection and quality measurement. Alerts can be sent to data stewards to highlight high-priority data flaws, while dashboards and scorecards with aggregated metrics can serve a wider audience.
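
A sketch of the alerting step, assuming rule pass rates such as those produced by the rule runner above; the threshold and the notification channel are stand-ins for whatever the enterprise has in place:

THRESHOLD = 0.95                              # minimum acceptable pass rate (assumption)

def notify(steward: str, message: str) -> None:
    print(f"ALERT to {steward}: {message}")   # placeholder for e-mail or workflow

def inspect(pass_rates: dict, steward: str = "data.steward@example.com") -> None:
    """Flag any quality rule whose pass rate has dropped below the threshold."""
    for rule, rate in pass_rates.items():
        if rate < THRESHOLD:
            notify(steward, f"'{rule}' pass rate {rate:.0%} is below {THRESHOLD:.0%}")

inspect({"completeness: vendor has a tax number": 0.88})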

Keep on top of data quality problems. Sustainability is the key to success in a data quality assurance program. Develop a platform to enable tracking and resolution of data quality incidents. There must be processes for evaluating and eliminating root causes of data errors.

The above activities form the backbone of a proactive data quality assurance management framework with controls, rules and processes that can enable an enterprise to identify and address data flaws before they negatively impact the business.


POLICIES, STANDARDS, GUIDELINES AND PROCEDURES

In order to protect information, businesses implement rules and controls around the protection of their data and the systems that store and process this data. This is achieved through the implementation of information security policies, standards, guidelines and procedures.

An information security policy consists of “high-level” statements relating to the protection of information across the enterprise. The policy should:

• Define security roles and responsibilities
• Define the scope of information to be protected
• Provide a high-level description of the controls
• Reference the standards and guidelines that support it
 
Standards consist of specific low-level mandatory controls that help enforce and support the information security policy, or policies. The use of passwords and the rules around their complexity is an example of a low-level mandatory control.

Guidelines consist of recommended, non-mandatory controls that help support standards or serve as a reference when no applicable standard is in place. Guidelines are viewed as best practices that are not requirements but are highly recommended. A standard may require a password to be 10 or more characters; a guideline would state that it is best practice that the password be renewed every 3 months.
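
The distinction can be illustrated in code: the standard is enforced as a hard check, while the guideline only produces a recommendation. A minimal sketch, using the 10-character and 3-month figures from the examples above:

from datetime import date, timedelta

def check_password(password: str, last_changed: date) -> None:
    # Standard: a mandatory control; a violation is an error.
    if len(password) < 10:
        raise ValueError("Standard violated: password must be 10 or more characters")
    # Guideline: a recommended control; a violation is only flagged.
    if date.today() - last_changed > timedelta(days=90):
        print("Recommendation: password is older than 3 months; consider renewing it")

check_password("correct-horse-battery", date(2024, 1, 1))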

Procedures consist of step by step instructions to assist employees in implementing the various policies, standards and guidelines. A procedure is very specific as to how a task is performed or a control is implemented.


SUMMARY

Creating and maintaining accurate and complete master data records for an enterprise has become an imperative. Business needs to develop data maintenance and governance processes and procedures, and governance structures need to be put in place. If you conduct a data cleansing or enrichment exercise without the necessary processes, procedures and structures in place, it will end in failure, as there is no sustainability associated with the project. Data migrations become the stuff nightmares are made of: go-live dates are compromised and projects fail.

Beware the pitfalls of bad governance practices. Check for both buy-in and commitment. Look out for “Ready, Fire, Aim”. Do not try to fix everything at once, and don’t go to extremes: if your project is pitched at too high a level you will not achieve the desired outcomes; conversely, if the project is too granular it will take too long to implement. Ensure effective change management. Technology alone is not a silver bullet. Design with sustainability in mind. Do not ignore disparate systems.

“One of the first signs of insanity is doing the same thing over and over again, expecting a different result”.
