All of us have heard about the 4 Ps of Marketing. How about extending this concept to the world of data synchronization between trading partners?


The supply chain is one of the most important functions for any large organization that manufactures and sells a variety of products. Innumerable trading partners are involved throughout this supply chain, and ensuring that every stakeholder has accurate, on-time data is crucial.


Having a robust system in place that ensures continuous coordination of information between manufacturers and retailers is what the GDSN (Global Data Synchronization Network) facilitates.


GDSN is a universally accepted set of data formatting standards that ensures senders and receivers of product data are operating within the same structure and maintaining the highest levels of data integrity.


This is how information typically flows through the GDSN:

The manufacturer publishes information about its items or products to a global data pool or registry. This information is sent in an agreed format that is accepted by the receiver. The recipients of this data are retailers, who can request it by subscribing to specific items. The intermediary between the manufacturer and the retailers is the GS1 Global Registry. GS1 provides validation services to ensure uniqueness and applies some basic rules.


The GS1 Global Registry® is the GDSN's "information directory": it details who has subscribed to trade item or party data, guarantees the uniqueness of the registered items and parties, and ensures that all data pools in the network comply with a standards-based set of validation rules.
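The publish/subscribe flow described above can be sketched in a few lines. This is an illustrative model only; all names here are hypothetical, and real GDSN data pools exchange standardized messages rather than Python dictionaries:

```python
# Illustrative model of the GDSN publish/subscribe flow (all names hypothetical).
registry = {}        # global registry: GTIN -> published item data
subscriptions = {}   # retailer -> set of subscribed GTINs

def publish(gtin, item_data):
    """Manufacturer publishes an item; the registry enforces GTIN uniqueness."""
    if gtin in registry:
        raise ValueError(f"GTIN {gtin} is already registered")
    registry[gtin] = item_data

def subscribe(retailer, gtin):
    """Retailer subscribes to a specific item."""
    subscriptions.setdefault(retailer, set()).add(gtin)

def synchronized_items(retailer):
    """Data a retailer receives: published items it has subscribed to."""
    return {g: registry[g] for g in subscriptions.get(retailer, set()) if g in registry}

publish("00012345678905", {"description": "Ballpoint pen, single"})
subscribe("RetailerA", "00012345678905")
```

The key point the sketch captures is that the registry sits in the middle: publication is validated for uniqueness, and retailers only receive data for items they have subscribed to.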


This is where GTINs and GLNs come into the picture. The two main keys on which GS1 functions are based are the Global Trade Item Number (GTIN) and the Global Location Number (GLN).


The GTIN is a unique number assigned to a product that can be ordered, priced, and invoiced by a trading partner at any point in the supply chain. Simply put, it is the barcode number that you see on a product.

There has to be a very close relationship between the GTIN data and the product data.

Think of it this way: a company manufactures pens. End customers like us buy a single pen; let's call this an item. Large depots or supermarkets, however, might purchase cases of 20 pens. And think of transporting these cases of pens from the manufacturing plant to the warehouses on wooden trays or pallets.

Each of these items, cases, and pallets has its own characteristics and is therefore assigned a unique barcode for easier tracking. The length, width, height, weight, and various other dimensions and attributes of these GTINs need to be maintained accurately. This is what is maintained in the common pool of data that every trading partner can access.
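Every GTIN, whatever its length, ends in a check digit computed with the standard GS1 mod-10 scheme: starting from the rightmost digit of the number (excluding the check digit itself), digits are weighted 3, 1, 3, 1, and so on, summed, and the check digit is whatever brings the total up to a multiple of ten. A short sketch:

```python
def gs1_check_digit(digits: str) -> int:
    """GS1 mod-10 check digit: weight 3,1,3,... starting from the rightmost digit."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(digits)))
    return (10 - total % 10) % 10

# The GTIN-13 4006381333931 ends in check digit 1:
gs1_check_digit("400638133393")  # -> 1
```

The same function covers GTIN-8, GTIN-12 (UPC), GTIN-13 (EAN), and GTIN-14, since the weighting is always anchored at the rightmost digit.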


The GLN is used to identify physical locations in a supply chain, as well as legal entities and trading parties.


Implementing GDSN is beneficial for any company. However, some key points need to be thought through before taking that big step:

  • Readiness of the organization and all stakeholders to accept this big change, especially end users whose work processes will change hereafter.
  • Choosing the right GDSN partner is of key importance: the product they offer should suit your needs, and whatever customization is required should be achievable quickly and without too much hassle.
  • Strategizing the rollout of GDSN: do you want to do it all in one go with a big-bang approach, or test it in one market first, learn from any mistakes made, and proceed gradually?


I hope this helps you gain some perspective on the basics of GDSN and GTINs.

As MDM consultants, we archive and unarchive repositories during different phases and in different contexts.


I was trying to unarchive a copy of the production repository in the Dev landscape and consistently faced an 'Unable to modify Database' error. I was not sure what was happening. I tried stopping and starting the MDS, unmounting and mounting the repository in the Console, and so on, wasting productive hours in the process.




Finally, I learned from a colleague that the problem was a space issue: if the server does not have sufficient disk space for the unarchive, this error occurs. I wrote this blog in the hope that it will serve others as a quick reference, instead of having them waste time reinventing the wheel.

We had a requirement to develop an automated script, scheduled to run daily during off-hours, to verify and repair an MDM repository. I am writing this blog to explain the steps I followed to write the script. I could have posted the code snippet on the Wiki, but I am including it in the MDM section because the script is useful not only for developers; it may also be of immense help for Basis consultants and, in some cases, functional consultants.


The following points need to be considered:


1. CLIX commands require credentials to be supplied along with the command, so we need a better way to secure the credentials than hard-coding them in the script file.

2. Since a repair requires the repository to be stopped, we first trigger the CLIX verify command on the repository. Only if its outcome is an error do we trigger the repair.

3. Before verification, we check whether the repository is available using the CLIX repIsAvailable command. If the repository is not busy, the command returns "available".

4. Trigger the verify using the CLIX repCheck command. If it ends in a fatal error, the repository needs repair.

5. Stop the repository.

6. If the stop is successful, trigger the repair.

7. Report the outcome of the repair (fatal error or success) to the stakeholders.


For secure storage of the database and repository credentials, we can make use of the Windows Credential Manager. There are a number of open-source PowerShell scripts that provide access to stored credentials.


The following commands can be used in sequence to check repository availability, verify the repository, and, if a repair is required, stop and repair it.


    # Fetch stored credentials from Windows Credential Manager
    $MDMRepCredentials = & .\CredentialsFetch.ps1 'MDM_REP'
    $MDMDBCredentials  = & .\CredentialsFetch.ps1 'MDM_DB'

    if ($MDMRepCredentials -eq 'NOTFOUND') {
        echo "MDM repository credentials are not maintained. Exiting the script."
        exit 1
    }
    if ($MDMDBCredentials -eq 'NOTFOUND') {
        echo "MDM DB credentials are not maintained. Exiting the script."
        exit 1
    }

    echo "Trigger repository verify as the first step"
    CLIX repCheck server VerifyReparRep:server:S:$MDMDBCredentials $MDMRepCredentials
    $RC = $?   # $false when the command ended in a fatal error

    if ($RC -eq $false) {
        echo "Verify reported a fatal error, which means a repair is required"

        do {
            # Optional CLIX wait/timeout flags can be added here; see the CLIX documentation
            $RepAvailability = CLIX repIsAvailable server VerifyReparRep:server:S:$MDMDBCredentials
            $RC = $?   # capture whether the availability check itself failed
            if ($RC -eq $false) {
                echo "Repository availability check resulted in an exception. Exiting the script."
                exit 1
            }
            if ($RepAvailability -ne 'available') {
                # Wait for 5 minutes and try again
                Start-Sleep -Seconds 300
            }
        } while ($RepAvailability -ne 'available')

        echo "Stop the repository for repair, as it is available for maintenance activities"
        CLIX repStop server VerifyReparRep:server:S:$MDMDBCredentials $MDMRepCredentials
        $RC = $?

        if ($RC -eq $true) {
            echo "Trigger the repair, as the stop command executed successfully"
            CLIX repRepair server VerifyReparRep:server:S:$MDMDBCredentials $MDMRepCredentials
            $RC = $?
            if ($RC -eq $true) {
                echo "Repair completed successfully"
            }
            else {
                echo "Repository repair ended in error. Needs immediate attention."
            }
        }
        else {
            echo "Repository stop ended in a fatal error. Exiting the script."
        }
    }










In this competitive world, every IT service provider strives to deliver improved software and enhance its features on a regular basis. To name a few:

  • Rich User Interfaces
  • Applications with faster responses
  • Advanced Analytics
  • Many more …..


All this effort is to attract more customers and increase sales margins. It is very evident that customers and vendors play a key role in every business. Now the question is: “Are we doing business with the right customers and vendors?”

In an era of tight security, determining whether individuals, companies or organizations are restricted from conducting trade is essential. Exporters are responsible for ensuring that they are compliant with government regulations and that their goods are not being sold to undesirable entities. These entities are referred to as Denied, Debarred, and/or Restricted Parties, and checking your transactions against these restricted party lists is called a Restricted Party Screening (RPS) process.

Recently, we developed a master data solution for one of the world’s largest confectionery, food, and beverage companies. We ensured the basic functionality mentioned below:

  • mandatory fields are filled
  • business validations are triggered
  • routing happens through intelligent workflows
  • data syndicates to downstream systems and so on


But what we improved further relates to Legal & Compliance.

  • Properly conducting Party Screening is a critical and long-standing compliance expectation. Each responsible unit (Business unit, country or function) is currently required to perform RPS before conducting business with a new partner. 
  • Master Data Creation is not integrated with screening and this lack of linkage creates a risk of a restricted party being inadvertently approved for business.



What we achieved:

  • We proposed a solution to strengthen the controls surrounding RPS by integrating the process with SAP.
  • SAP is integrated with third-party software that performs the party screening and generates a unique partner ID.
  • A partner validation is built in to ensure that the partner ID was in fact screened and cleared as an “Approved” record prior to initiating the master data creation process within the Vendor and Customer domains.
  • Also, there are specific regional requirements and exceptions for party screening, which are controlled through the workflow.
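The partner validation in this process can be thought of as a simple gate: master data creation proceeds only for a partner ID that the screening tool has returned as “Approved”. A minimal sketch, with all names, IDs, and status values hypothetical (not the actual third-party API):

```python
# Hypothetical sketch of the RPS gate before master data creation.
rps_results = {                 # partner ID -> screening outcome from the RPS tool
    "P-1001": "Approved",
    "P-1002": "Blocked",
}

def may_create_master_data(partner_id: str) -> bool:
    """Allow vendor/customer master data creation only for screened, approved partners."""
    return rps_results.get(partner_id) == "Approved"

may_create_master_data("P-1001")  # True: screened and approved
may_create_master_data("P-9999")  # False: never screened
```

Note that an unscreened partner fails the check exactly like a blocked one, which is what closes the gap described above: no screening, no master data record.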


Benefits to Client:

  • Integration of RPS with master data has strengthened the Legal & Compliance controls by making them more efficient, effective, and secure.
  • Instilled the discipline for the business to screen all potential business partners and confirm they are legitimate entities.
  • Established a consistent process across all regions against compliance requirements.
  • Time saved during master data creation, as the basic data is pulled directly from RPS, thus minimizing data entry.



Key Message:

It is important to expand the scope beyond technical and functional requirements and ensure we are also aligned with Legal & Compliance.


About Author

• Sreeram Kasturi is working as a Senior Technology Architect with Infosys and has been executing complex SAP NetWeaver projects for major customers in America, Europe, and Asia Pacific.

• He is a Techno Functional Consultant and has expertise in the areas of SAP Portal, SAP MDM, SAP GDS, SAP BPM & SAP KMC.



Is someone able to tell me what the difference is between using the partner function in the vendor master record and using the Permitted Payees functionality in the vendor master record? Don't both functionalities bring up a list of alternate payees to pay when you process the invoice?


Any help will be greatly appreciated.





Don't miss our three-part Webcast series on strategies and best practices for governing your data!

Are you struggling to govern new types of data? Planning an information governance initiative – and figuring out how to measure your progress and success? Wondering how other companies are improving data quality?


Don’t miss our three-part Webcast series this October:

  • Master Data Governance for Next-Gen Data – Big Data, Social, Cloud, and Mobile
    Thursday, October 3, 2013


Aaron Zornes, industry expert and founder of The MDM Institute, joins us to share the benefits of SAP Master Data Governance as well as best practices for getting started with applying governance to the next generation of data. Get updates on the current crop of technologies that must be accounted for in your organization’s plans for the next 18-36 months.



  • Establishing Data Governance at CSL Behring
    Tuesday, October 15, 2013


Find out how CSL Behring, a global leader in the life sciences industry, improved data quality, processes, and governance across all types of master data. You’ll hear how CSL Behring established the foundation for governing and managing data as an asset in order to improve business reporting and transactional business processes.



  • Information Governance Metrics Made Easy
    Tuesday, October 29, 2013


If you’re planning or if you've already started an information governance initiative, measuring progress is essential to establish credibility and to keep that funding coming in. Learn how to determine the right metrics and formulas for calculating benefits, how to determine challenges and progress, and SAP’s own internal approach to information governance.



We look forward to seeing you there!

SAP NetWeaver MDM 7.1 is the proven infrastructure technology for cross-domain master data consolidation and harmonization across the enterprise.


See also below: Overview - benefits of NetWeaver MDM


The latest support package, released on March 28, 2013, provides you with the following benefits:


Integration with SAP Data Services and SAP Information Steward (DB-Views)

Now you can integrate MDM with other downstream systems via MDM DB Views.

You can enhance MDM capabilities using SAP Data Services (DS) for matching, cleansing, data enrichment, and ETL. You can also enhance MDM capabilities using SAP Information Steward (IS) for Data Profiling.

A new MDM CLIX command generates DB Views which can be consumed directly using the DBMS capabilities.


NW-MDM web-user capabilities


  • Export and Import via Web-Dynpro

In previous versions, via Web Dynpro you could export only the display values of lookup fields, and you could not import the exported file.

As of MDM 7.1 SP10, the "Advanced Export" functionality enables you to export all the values in a lookup field, and you can also import the exported file via Web Dynpro.

The “Advanced Export”/Import actually uses the MDM Syndication Server and the MDM Import Server. This means that all the file formats supported by these servers are also supported by the Web Dynpro “Advanced Export”/Import, and that all field types are supported: text, tuples, main table lookups, etc.


  • Via the basic export in Web Dynpro, you can now also export qualified lookup fields.


  • Finding data easily (lookup auto-suggest items).

As of MDM 7.1 SP10, lookup fields can be configured for auto-suggest, so at runtime, as you start typing text, the lookup field displays the lookup values that start with the entered text.


  • Sorting
    • Matching Enhancements: Now you can also sort the Class and Score matching fields (columns) in a Match WebDynpro component.
    • Item Details WebDynpro Component: Sorting of tuple member fields is supported.



Enhance data model of the master data (tuple)

Enhanced tuple (data type) functionalities - You can now define a currency field type within tuple member fields.

Until now, you could only store a maximum of 8 digits in a tuple field. Now you can use the currency data type in a tuple to store a number with a precision of about 16 digits. The currency data type can be used with or without the currency symbol.


Enhanced security

You can now enforce the use of strong and secure passwords in MDM by configuring password requirements including minimum and maximum password length, password complexity requirements (use of uppercase, lowercase, digits, special characters), and preventing re-use of previous passwords (by keeping a password history for users). The configured password requirements apply to all MDM client applications.


UI enhancements

In the GUI of MDM Data Manager, the fields for which the user has read-only access are now disabled by default and appear grayed out.


Enhanced MDM workflow administrator capabilities

Email destination based on a data field: You can now define, for each record, the people to be notified when that record is changed in an MDM table. For example, each SKU/product in the main table has a different owner who should get a notification email upon a change to that product. You can add a text field to any table in MDM (configurable in MDM) and fill it with the email address. The Notify step reads the data from this field and sends the email to that address.

The workflow Notify step contains two new properties to support this feature. In addition, you can now embed any main table field value into the email body or title by using the new %Code=Name% workflow notification variable.


Enhanced MDM Client capabilities

You can now use MDM Data Manager to export data from lookup main fields to Excel, Access, or text files, in the same way as for any other field type.


Enhanced error reporting

MDM Import Manager logs to a file: Now, when there is any failure in the import process via the Import Manager, you get a report file that you can send to other people to help resolve the issue. The import log file contains the source record ID, the reason for the failure to import, matching field names, and matching field values.


MDM Configuration Assistant

Now you can easily manage the configuration of all MDM servers, based on the keys in the ini files of the MDM servers (MDS, MDIS, and MDSS).

This tool also helps you validate the keys and prevents errors in the MDM configuration. For more details, see SAP Note 1543650.


Improved supportability

  • Now you can better monitor the repository status by using the SAP Solution Manager Diagnostics (Wily) reporting metrics. This includes:
    • The method name that was executed, in the performance metrics and repository status.
    • For import and syndication: performance metrics based on repository and remote systems; files in ready and exception folders.
  • Enhanced automatic SLD (System Landscape Directory) registration.



And What’s Next?

SAP NetWeaver MDM is a mature product on a stable core. Continuous improvements are planned as usual in annual Support Packages. SP11 is planned to be released in Q4 2013.



Overview- NetWeaver MDM benefits


NetWeaver MDM provides the following added value:


  • Low TCO:
    • Quick and easy to install.
    • Supports the most common database vendors, including Oracle, MS SQL Server, DB2, and MaxDB.
    • Does not require an installed SAP ECC system in order to run.
    • Easier backup with archives, user maintenance, and a single Console tool for all Basis functions.


  • Governance can be achieved by using BPM (SAP NetWeaver Business Process Management) and reuse of validation rules.


  • PIM (Product Information Management) Support includes: Taxonomy, attributes, attached media (images/PDFs)


  • Flexible data-model - such as Tuples


  • Distributed scenario: NW-MDM enables each system to create its own new record.


  • Harmonization: You can syndicate data from MDM and import the same data into all systems within the customer's company in order to harmonize the data.


  • Consolidation: A single (consolidated) record in MDM supports all occurrences of the record in all consuming systems, with each consuming system having its own unique key. NW MDM matches, merges, and persists the best records based on a user review process. NW MDM is not embedded in the ECC system and consolidates master data out of different systems from different landscapes.


  • Heterogeneous landscape: SAP NW MDM provides a solution for heterogeneous landscape related issues. This solution is also for landscapes that are not SAP system driven. MDM will integrate into a mixed SAP landscape of multiple (but differently configured) ECC clients, SRM and CRM clients as well as third-party ERP applications or legacy mainframe systems.  MDM can also speed the integration of mergers or acquisitions.


  • Data Quality (DQ) for cleansing: SAP NW MDM provides integration with the SAP BODS DQ tool.

Hi All,


I am using SAP MDM 5.5, with MS SQL 2000 as the back-end database, on Microsoft Windows Server 2003 R2.


I would like to upgrade the Database from MS SQL 2000 to MS SQL 2005.


Can anyone confirm whether SAP MDM 5.5 supports MS SQL 2005? From the PAM, I understand that it does.


Can any of the experts here reply based on your real-time experience?




Ganesh Mani. K

In order to achieve SSO between SAP NetWeaver MDM and SAP Portal, a trusted connection needs to be set up. A trusted connection is the only authentication type supported in this version. It is useful because a user who has already logged on to the application does not have to provide yet another password just for MDM.

The NetWeaver Web AS user must also exist as an MDM User (defined in the MDM repository). You can authenticate a session by using a trusted connection between the server where the MDM Java API is deployed, and the server where the MDM server is running.

You establish a trusted connection between the environment running the API and the MDM server by modifying a configuration file located in the MDM server installation directory. In this way the server is aware of which clients to trust.

A trusted connection to the MDM server can be defined as follows.

1.    allow.ip file: Create a flat text file named allow.ip; this file contains the IP addresses of trusted partners, with a separate entry for each connection. An IP address can be fully specified, or a wildcard can be used to signify an entire subnet.

For instance, 10.17.79.* indicates that any address within the 10.17.79 network is considered trusted. Other variants are possible, such as 10.17.* or 10.*, but not just *.

Comments can be inserted by placing a number character (#) in the first column.


2.    deny.ip file: To trust an entire network while specifying some exceptions, create a deny.ip file according to the same rules as the allow.ip file.


For example, if the 10.17.79 network is to be trusted with the exception of one address, insert 10.17.79.* in the allow.ip file and the excluded address in the deny.ip file.

If an entry is found in the deny.ip file, the server treats it as not trusted.


The same applies to all IP addresses that are not found in allow.ip.

For example, when a consumer requests data from the server and its IP is not found in allow.ip, the request is refused.

Only if the IP address is found in the allow.ip file and not restricted in the deny.ip file is the caller categorized as a trusted partner.

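The decision rule above can be sketched as follows. This is an illustrative model only (the actual matching is done inside the MDM server), and the excluded address in deny.ip is a hypothetical example:

```python
from fnmatch import fnmatch  # shell-style wildcard matching, e.g. "10.17.79.*"

def is_trusted(ip, allow_patterns, deny_patterns):
    """Trusted only if the IP matches allow.ip and is not matched by deny.ip."""
    allowed = any(fnmatch(ip, p) for p in allow_patterns)
    denied = any(fnmatch(ip, p) for p in deny_patterns)
    return allowed and not denied

allow = ["10.17.79.*"]       # contents of allow.ip
deny = ["10.17.79.200"]      # contents of deny.ip (hypothetical excluded address)
is_trusted("10.17.79.8", allow, deny)    # True
is_trusted("10.17.79.200", allow, deny)  # False: explicitly denied
is_trusted("192.168.0.1", allow, deny)   # False: not in allow.ip
```

The deny list wins over the allow list, and an address matching neither file is refused by default.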


3.    Location of the ip files:

By default, the allow.ip and deny.ip files should be located in {MDM Server root}\exe.


In other words, these files must be stored in the folder in which the executable is located.

For example, if the MDM system ID is M01, the location of the trusted configuration files is usr\sap\M01\MDS01\exe.

An alternative location for these files can be specified by adding a line to the mds.ini file.

For example, TrustFiles Dir={folder path} indicates that both the allow.ip and deny.ip files are in the folder {folder path}.


When the MDM server runs on an AIX platform, the mds.ini file must contain the parameter, TrustFiles Dir = {folder path}.

After the initial configuration, the trusted partners are configured, enabling trusted cooperation between the Web AS servers and the MDM server.


After making the above changes, it is mandatory to restart the MDM server instance.

SAP NetWeaver MDM 7.1 is the proven infrastructure technology for cross-domain master data consolidation and harmonization across the enterprise. The latest support package, released on August 24, provides you with the following benefits:


Improvements related to PIM and Web-channel Experience:

  • Import images by Java API. You can now run a batch/automatic import of images and documents. This can be used to import thumbnails that were generated outside of MDM.
  • Taxonomy attributes with remote system keys. You can now define key mapping of remote keys for attribute definitions.
  • Sort by tuple value. You can now sort main table records by tuple field value.


Core MDM data model, usability, processing, data integration and administration enhancements:


  • Data Model: Additional coverage for tuples; setting access rights at field level in a tuple; adding new time stamp, user stamp, create stamp, and Auto-ID fields in a tuple; assignments into tuple member fields.

These enhancements allow you to:

1. Better control security at the tuple level

2. Better manage the data in a tuple

3. Better manipulate the data in the tuple


  • Usability, Processing, Administration:

End-User / Web Component (Web Dynpro). Within Web Dynpro components the following can be controlled:

    • Language inheritance in lookup fields: You can now display language inheritance values in lookup fields in Web Dynpro.
    • The tuple comparison feature in Web Dynpro now indicates whether the tuples are equal or not. This lets you find out very easily whether any changes were made to the tuples in a checked-out record compared to the original record’s tuples.
    • You can now configure "word wrap" for the label in a Search component.
    • You can add new events for:
      • Load of the Merge Component. This allows you, for example, to perform additional validations on the resulting record of a merge operation.
      • Getting records to merge & remove record. This allows you, for example, to store the data from records to be merged before the merge operation.
      • Tuple selected record as part of an Item Details view. This allows you to display the selected tuple record in the Item Details view when it represents a tuple as a flat structure.


  • Data Integration: Import, Export / Syndication:  You can now define permissions for exporting read-only tables and fields to Excel.


  • Schema Transport: The "Import Change File" in the MDM Console now displays the schema change files that were transported by the CTS+.


  • Publishing/Print: Support of InDesign CS5.5.


  • UOM (Units Of Measure): The UOM Manager is now integrated within the MDM Console; This allows you to control the measurements on a Win64 bit system (e.g.: Windows 7) and also provides support for all the MDM-supported databases including those that were not previously supported by the legacy UOM Manager.


  • New CLIX command to output the current list of MDS connections.


And What’s Next?

SAP NetWeaver MDM is a mature product on a stable core. Continuous improvements are planned as usual in annual Support Packages. SP10 is planned to be released in Q2 2013.


A book on SAP Enterprise Information Management is planned for publication in May. Content includes, for example:

  • Step-by-step instructions for working with SAP Data Services
  • The most important tasks in SAP Information Steward, SAP Master Data Governance, SAP NetWeaver Information Lifecycle Management, and more

All royalties donated to Doctors Without Borders.

For more information, see the SAP Press bookstore.

What are your requirements for, and experience with, “analytical master data management (MDM)”?

Analytical MDM is the provision of authoritative master data for enterprise-wide reporting and analysis. This research, sponsored by SAP, seeks to understand how organizations such as yours are approaching the need for analytical MDM, and how well the products in the market are meeting those challenges.

What is the difference between analytical MDM and operational MDM?

Analytical MDM is focused on the management of master data and associated hierarchies that are required for aggregation and analysis within the data warehouse and in business intelligence reporting tools.  Operational MDM, by contrast, provides authoritative master data to operational applications such as ERP and CRM systems.  SAP has commissioned a survey from market research firm The Information Difference to help us gain this insight.


The survey will take around 10 minutes of your time. As thanks, SAP will send you a summary of the analysis when it is complete. In addition, you will automatically be entered into a prize draw that may win you an Information Difference “Vendor Profile” report of your choice, a US$495 value. We will also donate US$2 per completed survey to the International Red Cross.

Please take the survey!

The Customer: Surgutneftegas

Surgutneftegas, one of the largest oil and gas producers in Russia, employs innovative technologies and automated processes. To create an IT infrastructure and facilitate its transition to service-oriented architecture, the company deployed sophisticated SAP® software and technology. By helping to raise data quality, the solutions allowed Surgutneftegas to optimize inventory, reduce purchasing costs, and achieve a dramatic return on investment.

“SAP Consulting professionals helped us build a solution that supports our business processes with current, consistent master data. This project is the first step in building our new, service-oriented IT architecture.”

Rinat Gimranov, CIO, Surgutneftegas

Key software components involved:

  • SAP NetWeaver Master Data Management
  • SAP NetWeaver Portal
  • SAP NetWeaver Process Integration
  • SAP NetWeaver Business Warehouse

For complete scenario and implementation details, click the link above.

Stay tuned for the next customer example.





Markus Ganser wrote a great blog, See what Enterprise Information Management has in the bag at SAP TechEd 2011. I’m going to highlight a few sessions from TechEd Las Vegas, because that’s where I get to go. I’m also including Expert Networking Sessions and Pod discussion topics from the show floor. Because Information Governance spans so many technology areas, I thought I’d pull them together for you.

Pre-conference seminar: a newly added SAP Master Data Governance session on Monday.

Key Hands-On sessions

  • EIM266: Next Generation Archiving: Extend Compliance in Your Corporate Environment
  • EIM161: Using Business Content Extractors in SAP BusinessObjects Data Services
  • EIM164: SAP Data Migration Solution Overview: Best Practices Hands-On
  • EIM163: Profile and Create Data Quality Scorecards to Understand the Health of Your Data
  • EIM267: The Importance of Cleansing and Standardizing Product Data
  • PMC265: Accelerating Business Rules with SAP NetWeaver BRM
  • PMC263: Process Analytics with SAP NetWeaver Business Process Management
  • EIM260: Getting Started with SAP Master Data Governance
  • EIM262: SAP NetWeaver Master Data Management and SAP BusinessObjects Information Steward

Key Lecture sessions

  • SAP BusinessObjects Data Services 4.0 and Beyond
  • EIM102: SAP Data Migration Solution Overview: Best Practices
  • EIM204: Connecting ECM to Business Processes: Evolving Needs and Technologies
  • EIM100: Enterprise Information Management Overview
  • PMC228: IDEXX Runs Marketing Points Program on BRF Rules Engine
  • PMC228: Applied NetWeaver BPM and BRM in Agile and High Volume Scenarios
  • PMC102: Business Rules Management with SAP: BRFplus and SAP NetWeaver BRM
  • EIM203: SAP BusinessObjects Information Steward 4.0 Product Overview
  • EIM112: Strategies and Tools to Ensure the Quality of Your SAP Data
  • EIM114: Information Governance: Reducing Costs and Increasing Customer Satisfaction
  • EIM211: Showcasing MDM Workflow Integration with BPM and Data Services
  • EIM106: SAP NetWeaver MDM for Customer Data Integration: Rapid Delivery in Eight Weeks
  • EIM201: Applying Information Governance in End-to-End MDM Scenarios

Pod topics on the SAP Show Floor

  • EIM-P11: Access and transform data from any source (Tuesday at 10am)
  • EIM-P01: SAP BusinessObjects Data Services Roadmap (Tuesday at 11am)
  • EIM-P19: SAP NetWeaver Information Lifecycle Management (Tuesday at 12pm)
  • PMC-P06: Business Rules Management with SAP (Tuesday at 2pm)
  • EIM-P17: Master Data Management at SAP (Wednesday at 10am)
  • PMC-P02: SAP NetWeaver Business Process and Rule Management (Wednesday at 1pm)
  • EIM-P14: Support your Information Governance program with Information Steward (Thursday at 2pm)

Expert Networking Sessions

  • Tuesday at 2pm: Guidelines on Starting a SAP Data Archiving Project (Karin Tillotson)
  • Wednesday at 11:30am: Build your own network with SAP StreamWork (Sharon Haver)
  • Wednesday at 1:30pm: Getting started with Information Governance (me!)
  • Thursday at 6pm: How to organize and deliver a business rules project (Carsten Ziegler)

PLEASE join me in the Influence Council session for EIM350: Enterprise Information Management on Tuesday afternoon.

To get a complete picture of the lectures and hands-on sessions offered in the Enterprise Information Management track, click the EIM session overview link and select the topics that you are interested in and would like to attend. To display the overall education and session catalogue, click the URL at the top of the blog.

Mark your calendars! Join us at a lecture, hands-on session, Expert Networking Lounge, or Expert Pod. Find us when we're there by sending a tweet to @InaSAP, @SAPMDMGroup, @SAPILM, @SAPECM, or @SAPBOEIM. 

The Customer's Voice: Votorantim

Present in 20 Brazilian states and 15 countries, the Votorantim Group operates in industry, finance, and new business development. Votorantim implemented SAP NetWeaver® Master Data Management to improve the management and standardization of its material registration and request system, in line with UNSPSC standards.

“In an innovative project in Brazil, we implemented SAP NetWeaver MDM and received a total commitment from the team throughout the stabilization actions. Today we consider SAP NetWeaver MDM an extremely useful tool in Votorantim’s business processes.”

Nayla Alves Santos, SAP Coordinator, Votorantim Group

For a summary of the MDM scenario set-up and implementation details, click the link above.

Stay tuned for the next customer example.



