DataPool Response:


When a trade item is successfully exported from GDS and sent to 1SYNC via PI, 1SYNC sends a response back to GDS as an acknowledgement of receipt.


Whether the item is added, modified, or deleted, 1SYNC sends the corresponding acknowledgment with a status of Accepted or Rejected. PCL (Parent and Child Linkage) relationships for the item can also be maintained through the GDS export process.


Flow

CAtalogue Response.png

Process

  1. The 1SYNC server sends an AS2 message containing the status of the item registration/publication, or the response from the customer (accept/reject), to GDS via PI.
  2. PI receives the 1SYNC AS2 messages for further processing.
  3. PI transforms the data from a CatalogueResponse message to a DataPoolResponse.
  4. The DataPoolResponse message is sent to the GDS system via a Java proxy call.
  5. The JPR layer in GDS performs some pre-validation and then passes the data on to the GDS core component.
  6. The GDS core component updates the status of the trade item by invoking the MDM Java API to make calls to the MDM server.

 

 

Please feel free to share your thoughts and suggestions in the comments!!



GDS Outbound Process

This blog explains the GDS outbound process and the steps involved in GDS item registration and publishing in 1WorldSync.


1WorldSync

1WorldSync is a product information network and data pool solutions provider, certified within the GS1 Global Data Synchronization Network (GDSN). It supports GS1 Standards, the global standards for identifying, capturing, and sharing product information.

 

Flow

Item_register_Publish.png

Process

  1. When a trade item is created, changed, or published by a user, GDS CORE performs a validation and retrieves all the data for the item if it meets the criteria for export.
  2. GDS CORE then passes the data on to the JPR layer for outbound processing.
  3. The JPR layer then sends the message to PI using a Java proxy call.
  4. PI transforms the data from an MT_TradeItemExport message to a CatalogueRequest.
  5. The CatalogueRequest message is sent to the 1SYNC data pool server via the AS2 adapter.

 

Points to Remember

  • In SAP PI, the standard SAP GDS content is used; the PI GDS outbound mapping is further customized based on business requirements.
  • Refer to the 1WorldSync XML guide when mapping, to link the correct attributes (attr, attrmany, attrgroupmany, etc.).
  • In case of incorrect mapping, the item is rejected in 1WorldSync, which returns an acknowledgement message with "Invalid XML Message".
  • Messages can be grouped by Target Market and IPGLN (Information Provider GLN).
  • 1WorldSync provides a unique user ID for each Target Market.
  • The user ID has to be maintained in the SAP GDS Parties table.


In the next blog we will explain the data pool response from 1SYNC - GDS Process - DataPool Response from 1SYNC

 

 

Please feel free to share your thoughts and suggestions in the comments!!



GDS Inbound Process:
In this blog we will see the process used to send data from SAP MDM to SAP GDS. SAP GDS receives product/material information from SAP MDM or SAP ECC. All the GDS-relevant data required to create items in GDS is entered in the SAP MDM product repository and sent from MDM 7.1 to GDS 2.1 via PI.

 

The Product Global Master Data repository in MDM is a custom built repository.


  • The MDM adapter is used in PI to receive the data from SAP MDM.
  • The GDS standard content (XML structure) has to be used in PI to send the data to GDS.
  • For registering the inbound interfaces of SAP GDS, refer to the blog GDS Overview and Technical Guide for PI.


Flow

Untitled.png

 


  1. Product information details are sent from MDM Product Global.
  2. PI picks up the files using the MDM/File adapter and transforms the message into a GDS-understandable format - the MT_TradeItems XML message.
  3. The MT_TradeItems message is sent to GDS via a Java proxy runtime call (the XI proxy adapter is used).
  4. The JPR layer in SAP GDS performs pre-validation and then passes the data on to the GDS core component.
  5. The GDS core component drives the creation/modification of the item by invoking the MDM Java API to make calls to the MDM server and get the status of the transaction.
  6. The status and details of the transaction execution are then added to the process logs.


Points to remember


  1. GTIN, Information Provider GLN, and Target Market are the mandatory attributes required in GDS to determine whether the item is new or modified.
  2. GDS inserts or updates the item based on the uniqueness of GTIN, IP (Information Provider) GLN, and TM (Target Market).
  3. GDS custom attribute creation should follow the standard SAP naming conventions; names should start with Z<Attribute NAME>.
  4. Any creation or modification of a GDS table requires a GDS Console and server restart for the changes to take effect.
  5. Custom attribute mappings should be handled under custom elements in the PI mappings.
  6. CustomGenericElement, CustomLookupElement, CustomLanguageElement, and CustomQualifierRecordElement are the types of GDS custom attributes.
  7. The table code, field code, and reference path are to be mapped with reference to the SAP GDS attributes.
  8. IP GLN key-value mappings should be maintained in the SAP GDS Parties table.
  9. Key and value attributes in the PI mappings are to be mapped accordingly.
  10. GDS key-value mappings are case sensitive; values hard-coded in PI must match the case used in GDS exactly.
  11. If a key-value mapping is not maintained in GDS, the item throws warnings, which can be seen in the GDS process logs.

 

The next blog will discuss the GDS export process - GDS Outbound Process - Item Register and Publish in 1Sync

 

Please feel free to share your thoughts and suggestions in the comments!!



Environment: Web Dynpro Java Portal and SAP MDM

Scenario 1: Create


  1. Create the BinaryBlobRecord instance (blob_instance) from RecordFactory.createEmptyBinaryObjectRecord(BINARY_OBJECTS_TABLE_ID);
  2. Set all the values that are to be updated in the same instance (blob_instance).**
  • blob_instance.setDataGroupId(<value>);
  • blob_instance.setBinary(<Binary Object byte[]>);
  • blob_instance.setHasOriginal(<boolean>);
  3. Use CreateRecordCommand create = new CreateRecordCommand(<user session context>); to create the Binary Object record (B1).

     CreateRecordCommand (MDM Java API Library)

  4. Set the value and update the main table record (R1) with the link (B1). Result: we have a record (B1) in the Binary Objects table linked to the main table* record (R1).

*Can be a table of any type
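
Putting the steps together, here is a minimal sketch of Scenario 1, following the blog's convention of passing the session context placeholder to the command constructors; the setRecord/getRecord accessors should be verified against the CreateRecordCommand Javadoc for your API version.

BinaryBlobRecord blobInstance = RecordFactory.createEmptyBinaryObjectRecord(BINARY_OBJECTS_TABLE_ID);
blobInstance.setDataGroupId(dataGroupId);    // mandatory for all BLOB types (see "setDataGroupId" below)
blobInstance.setBinary(blobBytes);           // the BLOB content as a byte[]
blobInstance.setHasOriginal(true);

CreateRecordCommand create = new CreateRecordCommand(<user session context>);
create.setRecord(blobInstance);              // assumed setter name; check the Javadoc
create.execute();
RecordId b1 = create.getRecord().getId();    // the new Binary Object record (B1)
// Step 4: write b1 into the lookup field of the main table record (R1)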


Scenario 2: Update
To update the BLOB object of a particular Binary Object record (B1), follow these steps:

  1. Search the Binary Objects table for the record to be updated:
    1. Get the link from the main table record (R1).
    2. The return value is the record ID of the BLOB (B1).
  2. Create the BinaryBlobRecord instance (blob_instance) from RecordFactory.createEmptyBinaryObjectRecord(BINARY_OBJECTS_TABLE_ID, B1);
  3. Set all the values that are to be updated in the same instance (blob_instance).**
    • blob_instance.setDataGroupId(<value>);
    • blob_instance.setBinary(<Binary Object byte[]>);
    • blob_instance.setHasOriginal(<boolean>);
  4. Use the ModifyRecordCommand to update the record.

    ModifyRecordCommand (MDM Java API Library)

Result: The Binary Object is updated in place, so there is no need to re-link it.
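
A matching sketch for Scenario 2, under the same assumptions (b1 is the record ID of the BLOB found in step 1):

BinaryBlobRecord blobInstance = RecordFactory.createEmptyBinaryObjectRecord(BINARY_OBJECTS_TABLE_ID, b1);
blobInstance.setDataGroupId(dataGroupId);
blobInstance.setBinary(newBlobBytes);        // the replacement BLOB content
blobInstance.setHasOriginal(true);

ModifyRecordCommand modify = new ModifyRecordCommand(<user session context>);
modify.setRecord(blobInstance);              // assumed setter name; check the Javadoc
modify.execute();                            // B1 is updated in place; links from R1 stay valid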


**For details about the different parameters of Binary Objects, see Working with Binary Large Objects (BLOBs) - SAP NetWeaver Master Data Management (MDM) 7.1 - SAP Library

 

To set the data group ID with setDataGroupId (mandatory for all types of BLOBs):

RetrieveGroupTreeCommand tree = new RetrieveGroupTreeCommand(<usersession context>);

tree.setGroupType(GroupTypes.DATA_GROUP_TYPE);

tree.execute();

HierGroupNode node = tree.getGroupTree();

//As per our requirement we just placed all the BLOBs in the first leaf of the node

//you can use the various options in HierGroupNode (MDM Java API Library) to traverse and select the relevant leaf.

GroupNodeId gID = node.getAllLeafs()[0].getId();                 

blob_instance.setDataGroupId(gID);                

 

To set the data location ID with setDataLocationId (mandatory only for Binary Objects):

RetrieveGroupTreeCommand location = new RetrieveGroupTreeCommand(<usersession context>);

location.setGroupType(GroupTypes.DATA_LOCATION_TYPE);

location.execute();

HierGroupNode lTree = location.getGroupTree();

//As per our requirement we just placed all the BLOBs in the first leaf of the node

//you can use the various options in HierGroupNode (MDM Java API Library) to traverse and select the relevant leaf.

GroupNodeId lID = lTree.getAllLeafs()[0].getId();                

blob_instance.setDataLocationId(lID);

 


Scenario 3: Deletion
Note: we can delete a Binary Object only if it is not linked anywhere.

Delete the orphaned or unused records from the Binary Objects table.


  1. Get the record IDs from the Binary Objects table using the command:

     RetrieveLimitedRecordsExCommand objExCommand_allData = new RetrieveLimitedRecordsExCommand(<UserSessionContext>);

2. Use the code below to get the usage of the Binary Object record (B1):

     RetrieveBinaryObjectUsageCommand objUsageCommand = new RetrieveBinaryObjectUsageCommand(<UserSessionContext>);
     objUsageCommand.setTableId(BinaryObjects_tableID);
     objUsageCommand.setRecordId(B1);
     objUsageCommand.execute();

     RepositoryItemUsageResult objUsageResult = objUsageCommand.getUsageResult();
     RepositoryItemUsage[] repoUsage = objUsageResult.getUsages();

 

Refer to the link for the return values: Constant Field Values (MDM Java API Library)

 

If the length of repoUsage is 0, then the Binary Object record (B1) has no usage and can therefore be erased.

 

     3. Use the following command to delete the record:

     DeleteRecordsCommand objDeleteBLOBRecordsCommand = new DeleteRecordsCommand(<UserSessionContext>);

     DeleteRecordsCommand (MDM Java API Library)

Note: trying to delete Binary Object records which are still linked will result in an exception.
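
Tying the three steps together, here is a hedged sketch of the guard-then-delete logic; the setTableId/setRecordIds setters on DeleteRecordsCommand are assumptions to be verified in the MDM Java API Library.

if (repoUsage.length == 0) // B1 is an orphan and safe to erase
{
    DeleteRecordsCommand objDeleteBLOBRecordsCommand = new DeleteRecordsCommand(<UserSessionContext>);
    objDeleteBLOBRecordsCommand.setTableId(BinaryObjects_tableID);   // assumed setter name
    objDeleteBLOBRecordsCommand.setRecordIds(new RecordId[] { B1 }); // assumed setter name
    objDeleteBLOBRecordsCommand.execute();                           // throws if B1 is linked
}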

 

FYI: you can use the pagination technique for searching - Pagination of MDM search result in Java

Problem:

SAP MDM is a repository that can hold huge volumes of complex data, so searching the repository and retrieving data from a table is a heavy task. The SAP-provided MDM APIs carry out the search and return the result, but when the result set is huge, it is risky to increase the page size to match it.

For example, if the search is expected to return 10K values and we increase the page size to 10K accordingly, there is a chance that the system goes down or encounters a delay in response time. To overcome this, we implement paging during retrieval, using the code snippet below.

 

Solution:

Before retrieving the data, set the page size.

Note: Choose wisely according to the requirement (Maximum Page Size = 1000).



//Retrieve data
RetrieveLimitedRecordsExCommand getDataCommand = new RetrieveLimitedRecordsExCommand(<UserSessionContext_instance>);
getDataCommand.setResultDefinition(<ResultDefinition_instance>);
getDataCommand.setPageSize(<PageSize_int>);
getDataCommand.execute();

//Number of records the search returned
int resultCount = getDataCommand.getSearchTableMatchCount();

//Size of the page
int pageSize = getDataCommand.getPageSize();

//Total pages
int totalPages = resultCount / pageSize;

// If there are 1030 records and the page size is 1000, integer division gives
// totalPages = 1 even though there are 2 pages.
// Hence a modulo tells us whether there is an extra, partially filled page.
int hadExtraValues = resultCount % pageSize;

if (hadExtraValues != 0)
{
    totalPages++; // if there is an extra page then totalPages is increased by 1
}

Record[] tempRecord = null;
int arraySize = 0;

// The loop starts from page 1 because getDataCommand.execute() above already
// returned the records of page 0; process those records before the loop.
for (int page = 1; page < totalPages; page++)
{
    getDataCommand.setPageIndex(page); //Set the next page index
    getDataCommand.execute();

    //TODO - code to manipulate the records of page = page
}
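
For completeness, here is a sketch of reading the records of the fetched page inside the loop, assuming getRecords() on RetrieveLimitedRecordsExCommand returns a RecordResultSet as it does on RetrieveLimitedRecordsCommand:

RecordResultSet pageResult = getDataCommand.getRecords();
for (int i = 0; i < pageResult.getCount(); i++)
{
    Record record = pageResult.getRecord(i);
    // process the record of the current page here
}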

Deleting and Renaming the import map:

 

Steps for deleting Import map:

1.      Log in to the Import Manager.

1.JPG


 

2.      Go to File -> Open; a pop-up window will appear with the list of all the import maps.

2.JPG

 

3.      Select the map and press the Delete key on your keyboard.

4.      The import map is now deleted.

 

Steps for renaming import map:

 

1.      Log in to the Import Manager.

1.JPG

 

2.      Go to File -> Open; a pop-up window will appear with the list of all the import maps.

3.JPG

 

3.      Select the map, press the F2 key on your keyboard, and enter the new name.

4.      The import map is now renamed.

 

GDS?

  • Global Data Synchronization (GDS) is a term used to describe an industry initiative focused on keeping item information in sync between retailers and their suppliers.
  • GDS is a standardized process for the exchange of product data in the Global Data Synchronization Network (GDSN), built around the GS1 Global Registry and certified data pools.
  • The GS1 Standards Board developed the Global Data Synchronization (GDS) process.
  • The solution integrates with downstream systems to help suppliers aggregate and validate GDS-specific data and send this data, in the appropriate format and process, to their data pool.
  • For retailers, the solution enables a company to publish and subscribe to information and integrates it into downstream current-product, new-product-introduction, or maintenance processes.
  • The following industries are actively participating in GDS:
    •   Grocery
    •   Food and Beverage
    •   Health and Beauty
    •   Electronics
    •   Over the Counter (OTC)
    •   Music, Movie, Games
    •   Apparel (minimal)
    •   Hardlines

Overview

  • The SAP solution for Global Data Synchronization is a turn-key application which leverages the NetWeaver platform and MDM to provide a comprehensive end-to-end solution.
  • Source data is sent from SAP MDM or SAP ECC to SAP GDS using a PI interface.
  • There are five simple steps that allow trading partners to synchronize item data:
    • Load Data: The seller registers product and company information in its data pool.
    • Register Data: A small subset of this data is sent to the GS1 Global Registry.
    • Request Subscription: The buyer, through its own data pool, subscribes to receive a seller's information.
    • Publish Data: The seller’s data pool publishes the requested information to the buyer’s data pool.
    • Confirm & Inform: The buyer sends a confirmation to the seller via each company's data pool, which informs the supplier of the action taken by the retailer using the information.

GDS Architecture

GDS_Archi.png

Registering Inbound Interfaces for GDS

  • To receive inbound messages from SAP PI, the Java Proxy Runtime (JPR) must be registered on SAP GDS.
  • This task is performed on the PI Adapter Engine on the Web AS Java for the SAP GDS Console.
  • Open the JPR proxy server configuration pages of the Web AS Java on which the GDS Console is installed: http://<GDS server name>:<port number>/ProxyServer/.
  • Use the /register command to register the following interfaces one after the other, using the nomenclature shown on the JPR proxy server page:


Part of Register Command -> Entry

  • Namespace (ns): http://sap.com/xi/GDS/21
  • Interface: MI_TradeItems_In
  • Bean: TradeItemsInProxy
  • Method: mITradeItemsIn

  • The URL for registering the inbound proxy will be as follows:

         http://<GDS Server>/ProxyServer/register?ns=http://sap.com/xi/GDS/21&interface=MI_TradeItems_In&bean=TradeItemsInProxy&method=mITradeItemsIn

          http://<GDS server>/ProxyServer/register?ns=http://sap.com/xi/GDS/21&interface=MI_DataPoolResponse_In&bean=DataPoolResponseInProxy&method=mIDataPoolResponseIn


Process Integration - SAP PI

  • SAP NetWeaver Process Integration is the central component for exchanging messages between SAP GDS and 1SYNC.
  • Based on the GDS version (2.0/2.1), the respective standard content has to be imported into SAP PI.
  • The SAP GDS inbound mapping is developed using the standard GDS data types, e.g. TradeItems.
  • Data can be sourced from any system with PI-certified connectivity.
  • Default values and additional mappings can be set.
  • Customize the GDS outbound mapping (GDS to 1SYNC), available under the SWC 1SYNC/68 (current 1SYNC version), based on business needs.
  • The XI adapter (Java proxy) is used to communicate with the SAP GDS application.
  • The technical and business systems should be properly configured in the PI SLD.
  • PIAFUSER is used to connect to the SAP GDS application from PI; this user should be created in both the PI and GDS systems.
  • The AS2 adapter is used to communicate between SAP GDS and 1SYNC.


In the next blogs we will discuss the GDS Inbound Process and Tips and the GDS Outbound Process - Item Register and Publish in 1Sync


Please feel free to share your thoughts and suggestions in the comments!!



Emerging of GDSN

  • Retailers, manufacturers, and distributors have entered a promising yet challenging period in their relationships. As they recognize the need to work closely together to improve operational efficiency, more efficient information sharing and management has become a critical issue.
  • Traditionally, they exchanged product information using manual, paper-based processes, which led to inefficiencies, errors, and duplicated labor among trading partners.
  • This gave rise to the need for a standards-based data synchronization model - the Global Data Synchronization Network (GDSN).

What is GDSN?

  • The Global Data Synchronization Network (GDSN) is an internet-based, interconnected network of interoperable data pools and a global registry, the GS1 Global Registry, that enables companies around the globe to exchange standardized and synchronized supply chain data with their trading partners using the standardized Global Product Classification.
  • GDSN assures that data exchanged between trading partners is accurate and compliant with universally supported standards.
  • Organizations are leveraging the GDSN to streamline how product, service, location, organization, price and promotion data is created and exchanged within their own enterprise.

GDSN consists of the following:

  • Supplier/retailer trading partner
  • Data pools that hold and process trading partner data, such as Agentrics (formerly Worldwide Retail Exchange) and 1SYNC (formerly Transora and UCCnet).
  • GS1 Global Registry, a directory that helps locate data sources and keep relationships between trading partners in sync. It matches subscriptions to registrations to facilitate the synchronization process.
  • A set of standards established by EAN International and the Uniform Code Council, Inc. (EAN•UCC) through  the Global Standards Management Process (GSMP).
  • Enterprise software for product information management.

 

The Global Location Number (GLN) and the Global Trade Item Number (GTIN) are the global identification numbers in the GDSN; the GLN is the identifier for legal entities, trading partners, and locations, while the GTIN is the identifier for trade items.

 

GDSN_Block.png

  • The GDSN is a network of data pools that facilitates the synchronization of item information between retailers and suppliers through a single global registry.
  • The value of the GDSN is that a retailer and supplier can have a single point of entry into the network through their selected data pool.
  • Once connected, they can subscribe to or publish all of their item information in a single and consistent process for all trading partners that are connected to the GDSN.

 

In the next blog we will explain the GDS Overview and Technical Guide for PI

 

Please share your thoughts and comments.




StructuralX (exception) in MDM

Posted by Reddy M Nov 21, 2014

Hi Friends

 

 

              I am importing data from an Oracle DB to SAP MDM through SAP PI. During the import I got an error like StructuralX (exception) in MDM, but when I check the StructuralX folder there are no exception files in it. Where did I go wrong? Please let me know ASAP.

 

 

 

 

Thanks,

Reddy

Transporting MDMGX objects from SAP ECC to Material Repository in MDM


MDMGX (MDM Generic Extraction Framework) is a standard SAP transaction code available within core SAP ERP systems depending on your version and release level.

At NIMBL, one of our major clients - a global professional services company - had a business requirement to add OverheadGroups functionality to their material repository in MDM. In order to accomplish this, we first had to perform local table (LT) configuration for OverheadGroups in our SAP ECC Development client, as all local table configurations are maintained in SAP ECC in our client landscape.

To share what I learned, in this blog, I will provide the steps I performed to accomplish this requirement.

First off, our client's ECC landscape is architected according to SAP best practices.

MDMGX1.png

Note: Transporting individual configuration objects under MDMGX is not possible, so all of the configuration objects under MDMGX need to be transported in order to maintain consistency across the landscape. Tip: use an asterisk (*) to include all objects in a specific transport.

Configuring LT_Overheadgroups under MDMGX for Material repository

Step 1: Log in to the Development client

Step 2: Execute MDMGX transaction and click on Maintain Ports and Check-Tables

MDMGX2.png

Step 3: Provide the appropriate System and Repository information to perform LT configuration

MDMGX3.png

Step 4: Click new file to perform LT_OverheadGroups configuration

MDMGX4.png

Step 5: Enter the following details for the LT_OverheadGroups configuration as shown below. This configuration provides data in English for the LT_OverheadGroups table in MDM Data Manager. This information will be available to users through the portal.

MDMGX5.png

Step 6: Copy the configuration to the Testing client using the SCC1 transaction code. Make sure to include all of the objects under MDMGX as mentioned above.

After successful testing and importing the transports to your ECC production environment, we next perform the following steps to extract data from the ECC production system to the LT_OverheadGroups table in MDM Data Manager, enabling the data flow.


Step 7: Execute transaction MDMGX and select “Start Extraction”


MDMGX6.png

 

Step 8: Provide the client details and the port name of the table to which the data needs to be extracted in MDM Data Manager. Click Execute to start the extraction process.

 

MDMGX7.png

Step 9: Log in to MDM Data Manager and make sure the data is available.

MDMGX8.png

So there you go. Easy as pie, right? I think this is the best approach for making data flow from the SAP ERP system to MDM when the local tables are maintained in the SAP ECC system.

 

Please stay tuned for my next blog covering how to maintain ports within the MDM console. I would love to hear your thoughts, comments, and own ideas. Please feel free to email me at hari.sonnenahalli@benimbl.com anytime.

 

Hope this helps!!!

We have implemented our master data management solution using customized MDM Web Dynpro Components (WDCs), coupled with CE as a user interface on top of MDM (7.1 SP6) along with CE BPM (7.2 SP5). The solution was fit for purpose and neatly did the intended job of data integration, governance, cleansing, and maintaining a single source of truth, with the WDCs providing an easy-to-build UI. The reusable and generic nature of the components ensures independence from the underlying MDM schema; as a result, they are configurable and extendable.

 

Given the obvious advantages the MDM solution provided, the business came back with ever more diverse needs and looked to SAP MDM as a tool for maintaining a variety of organizational data (flat, hierarchies, etc.) - not all of it master/reference data.

 

Most of the requirements pertained to the UI of data stored in MDM (e.g. enabling a hierarchical or tree-structure view of hierarchies, having drag and drop), with a few others related to the MDM solution itself, e.g. check-in/check-out on hierarchies. Though MDM and the MDM WDCs could accommodate some of these requirements, other slightly tricky ones couldn't be addressed easily. As the requirements kept piling up, we felt SAP MDM was a bit limited in supporting them.

 

E.g. many organizations have huge amounts of organizational data in hierarchies, and an organization looking for an MDM solution would naturally like support for all its data, i.e. flat, hierarchies, etc. We faced a few limitations in implementing hierarchies using SAP MDM. The situation demanded a tree-like user interface; though the WDCs attempt to provide that display, we felt it fell a bit short on user satisfaction. Hierarchies in MDM also come with limitations: they cannot be checked in/checked out and can be referenced only at leaf nodes (perhaps we can make up for this using internal leaf nodes, but at the expense of duplication). An add-on component like the WDCs, or more mature support for hierarchies in the WDCs, would augur well here. Additionally, main-table-like features for hierarchies, i.e. check-in/check-out etc., would not only bring consistency but also widen the scope of master data support.

 

Another related area could be what-if analysis. An organization undergoing restructuring or other org-level changes such as merging or splitting business units often needs to do a lot of what-if analysis. E.g., in a scenario of merging business unit A with either B or C, management interested in the likely cash flows for units B and C if merged with A would want some analysis before committing the data. We came across one such requirement, wherein the business was undergoing a restructuring activity and wanted to do some what-if analysis on org data before committing it. As we couldn't find anything in the SAP MDM space to address it, we had to settle for the Oracle DRM tool with feeds from SAP MDM. It would have been much easier for customers if there were such support using WDC-like components; it would save them a lot of the hassle of evaluating other options and also address adjoining needs under the same roof.

 

We also faced issues in distributing data from MDM. Once a master data solution is implemented, the data should be routinely distributed across functional groups to satisfy their need for master data and to maintain the integrity of the solution. The business may need regular/scheduled distribution of master data to other functional groups. Though the Syndicator gives us the option of automatic (monthly/weekly/daily...) syndication of repository data, the schedule is too elementary to meet even slightly advanced distribution needs. E.g., the automatic daily syndication keeps syndicating repository data without considering weekdays/weekends, so downstream systems looking to receive data only on weekdays may need to think twice. Of course, we may ask downstream systems to align their needs, but such a small, natural extension should be available out of the box.

 

One may argue that none of these requirements is related to master data, but today enterprises want to get the maximum out of their data, and SAP MDM - perhaps using WDC-like components - should extend its support basket to provide for such needs. After all, customers would love to have their master data requirements addressed on one platform rather than keep adding tools for each new requirement.

I came across a requirement in one of my assignments in which I was expected to write a web service to fetch data from MDM tables. In this blog I share the steps I followed to achieve this.


Requirement: Implement a web service to fetch data from MDM tables.

 

Solution:

Step 1: Create an EJB project

EJBProject.png

Step 2: Create a session bean in the EJB project created in the previous step. To do so, right-click on the EJB module and select Session Bean from the menu.

Give the package name and bean class name, and check the "Remote" check box. Once done, click the Finish button.

We have successfully created a session bean.

 

Step 3: Add the dependencies below to the EJB project. These are required to use the MDM APIs.

Dependencies.png

Step 4: Create an "Enterprise Application" project and add the EJB project created in Step 1 on the Java EE dependency screen, then click Finish.

At this point, all required projects and dependencies are created.

Step 5: Create a SYSTEM_USER in the MDM repository, which we require to connect to the MDM system.

Step 6: Open the session bean created previously and declare a method in it. Add the code below to the session bean class to connect to MDM.

Code.png
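
Since the screenshot above is not reproduced here, the following is a minimal sketch of the connection step, modeled on the session code from the EventDispatcher example later in this document; the server, repository, and password values are placeholders.

UserSessionContext ctx = new UserSessionContext("<MDM server>", "<repository>", "", "SYSTEM_USER");
ctx.setTrustedConnection(true); // or authenticate with the SYSTEM_USER password in createSession
SessionManager sessionManager = SessionManager.getInstance();
sessionManager.createSession(ctx, SessionTypes.USER_SESSION_TYPE, "<password>");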

Step 7: We successfully connected to the MDM repository in the step above. Now we have to query the MDM table to fetch the desired records.

My requirement was to query the "Sites" table based on "Country" and "Organization", which are provided as input to the web service.

Below is the code I used to fetch the records; it can be modified to suit individual requirements.

  Code1.png
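
And here is a hedged sketch of the query step: it searches the Sites table by Country and Organization. The table and field IDs (SITES_TABLE_ID, COUNTRY_FIELD_ID, ORG_FIELD_ID) are placeholders to be looked up from the repository schema, and the command constructor follows the blog's convention of passing the session context.

ResultDefinition rd = new ResultDefinition(SITES_TABLE_ID);
rd.setSelectFields(new FieldId[] { COUNTRY_FIELD_ID, ORG_FIELD_ID });

Search search = new Search(SITES_TABLE_ID);
search.addSearchItem(new FieldSearchDimension(COUNTRY_FIELD_ID),
        new TextSearchConstraint(country, TextSearchConstraint.EQUALS));
search.addSearchItem(new FieldSearchDimension(ORG_FIELD_ID),
        new TextSearchConstraint(organization, TextSearchConstraint.EQUALS));

RetrieveLimitedRecordsCommand cmd = new RetrieveLimitedRecordsCommand(ctx);
cmd.setResultDefinition(rd);
cmd.setSearch(search);
cmd.execute();
Record[] records = cmd.getRecords().getRecords(); // loop over these in Step 8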

Step 8: Loop through the result set and set the required output fields on the return variable of our method, which will be the output of the web service.

Step 9: Copy the method signature that we created in the session bean and paste it into the "Remote" interface class. This is required to create our web service.

Step 10: Let's expose our session bean as a web service. To do so, right-click on the session bean --> Web Service --> Create Web Service.

Check the "Publish the Web service" check box and click Next. In the following screen, select the "Create New Interface" option and give the interface a name.

Select the method to be exposed in the web service and click the Finish button. Now we have successfully created a web service out of our bean.

 

Step 11: Build the Enterprise Application project and deploy it on the server.

 

Step 12: We have successfully deployed our changes on the server. Now let's log in to WS Navigator to test the web service.

WDNavigator.png

Step 13: Provide the username and password and log in to WS Navigator. Select the Provider System radio button, enter the service name, and click the Search button. We will get the list of available services. Select our service and click the Next button.

WS1.png

Step 14: Provide the necessary inputs and click the Next button. If a matching record is found, we get the desired output.

 

I hope this will be useful.

Regards,

Pavan

Often a requirement arises wherein we have to monitor the activities of MDM: activities like a record being modified, record rollback, record syndication, repository status changes, and a few other MDM events. Manually monitoring these activities is not possible; in such a scenario, monitoring MDM activities through some other mechanism is more feasible.

 

Here the concept of listening to MDM events comes into the picture; we can call it an MDM listener activity. This can be achieved using an MDM Java API class - 'EventDispatcher'.

 

'EventDispatcher' is a Java class which helps to listen (i.e. raise an alert on the registered event) so that related processing can be done.

Simple MDM events like record checked out, record modified, record rolled back, and record syndicated can be listened to, as can events like repository loaded and repository unloaded.

 

This class is responsible for dispatching all events coming from one specific MDM server to all event listeners subscribed to this dispatcher. We just need to register the specific MDM server so that it can handle the events listened to by this dispatcher.

 

In order to define the set of notifications available on the dispatcher, it should be registered to one or several groups of notifications: notifications which indicate repository-level, server-level, or data-level activities.

 

The steps for subscribing the listener and starting it are as follows:

 

Create a main class with a method which will hold the listener, and create another class which extends the AbstractServerListener interface if we want to listen to server activities, the AbstractRepositoryListener interface for repository activities, or the AbstractDataListener interface for record-level activities.

 

If all three kinds of activities are to be listened to, then three different classes, each extending one of the above interfaces, should be declared.

 

These classes, which extend the required interface, can override methods of the respective interface for further processing, as the sketch below illustrates.
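
For instance, a record-level listener could be sketched as below; recordAdded is an AbstractDataListener callback mentioned in the limitations section at the end of this blog, but the event parameter type here is an assumption to be checked against the API Javadoc.

import com.sap.mdm.notification.AbstractDataListener;
import com.sap.mdm.notification.event.RecordEvent; // assumed event type; check the Javadoc

class DataEventListener extends AbstractDataListener
{
     public void recordAdded(RecordEvent event)
      {
          // react to the newly added record here
          // (note: not triggered when the record is created through an import)
      }
}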

 

In the example below we will create a main class 'EventListenerExample' with a method 'EventMethod'. The second class, extending the interface, is RepositoryEventListener, with a method 'repositoryStatusChanged' which overrides the 'repositoryStatusChanged' method of the AbstractRepositoryListener interface.

To run this, a few parameters are required, i.e. the server name, the repository name, and the user name authorized to access them.

 

1. To listen to MDM repository events, the listener class would be as below:

 

import com.sap.mdm.notification.event.RepositoryStatusEvent;

import com.sap.mdm.notification.AbstractRepositoryListener;

 

class RepositoryEventListener extends AbstractRepositoryListener

{

     public void repositoryStatusChanged(RepositoryStatusEvent event)

      {

          //Print repository status

      }

}

 

2. In the main class, in the method 'EventMethod', create an instance of EventDispatcher as below and set the server name:

      EventDispatcher eventDispatcher = EventDispatcherManager.getInstance().getEventDispatcher(serverName);


3. Add listeners to the EventDispatcher class.

     For this, we first need to initialize the listeners; to initialize a listener, the class which extends the required listener interface is instantiated as shown above.

 

    RepositoryEventListener repositoryListener = new RepositoryEventListener(); // Creating instance of Listener class

 

4. Once the listener instance is created, add the listener object initialized above to the EventDispatcher so that these events can be captured.

 

     eventDispatcher.addListener(repositoryListener);

 

5. Register (subscribe) for all types of notifications for the event:

 

RepositoryEventRegistrationSet reposRegSet = new RepositoryEventRegistrationSet(); 

reposRegSet.addEventType(RepositoryEventRegistrationSet.IMPORT_EVENTS); 

reposRegSet.addEventType(RepositoryEventRegistrationSet.SYNDICATION_EVENTS); 

reposRegSet.addEventType(RepositoryEventRegistrationSet.ALL_REPOSITORY_EVENTS);

 

RepositorySessionContext repSessionContxt = new RepositorySessionContext(serverName,repositoryName,userName);

 

eventDispatcher.registerRepositoryNotifications(reposRegSet, repSessionContxt,"");


With this, the listener is triggered, but the listener activity wouldn't continue for long. To keep the MDM listener running for some time, add a loop as below which runs for 5 minutes (the time condition can be changed as per the requirement). Now the listener continues for 5 minutes.

 

int count = 0;

while (count < 5 * 60)

{

   try

   {

      Thread.sleep(1000);

   }

   catch (InterruptedException e)

  {

      e.printStackTrace();

   }                                    

  count++;

}

 

 

So now, with the code in place, a call to the method EventMethod will set up the listener, and the thread will continue for 5 minutes (since we have kept the while loop running that long). During this time, any change in repository status is captured by the listener, and if any loggers are written in this method, their output can be seen.

 

Once the entire execution is done, an important step is to terminate the EventDispatcher instance - but only in standalone applications. Don't terminate the EventDispatcher if it runs in a Web AS, because the instance may be shared by other threads.

 

EventDispatcherManager.getInstance().terminateEventDispatcher(serverName);

 

 

Now, the complete code looks like this:

 

import com.sap.mdm.notification.EventDispatcher;          // import paths per the MDM Java API;
import com.sap.mdm.notification.EventDispatcherManager;   // verify against your API version
import com.sap.mdm.notification.RepositoryEventRegistrationSet;
import com.sap.mdm.session.RepositorySessionContext;
import com.sap.mdm.session.SessionManager;
import com.sap.mdm.session.SessionTypes;
import com.sap.mdm.session.UserSessionContext;

public class EventListenerExample

{

public void EventMethod(String para_serverName, String para_repositoryName, String para_userName) {

 

     UserSessionContext userSessionContext;

 

     SessionManager sessionManager;

 

     userSessionContext = new UserSessionContext(para_serverName, para_repositoryName, "", para_userName);

 

     userSessionContext.setTrustedConnection(true);

 

     sessionManager = SessionManager.getInstance(); 

     sessionManager.createSession(userSessionContext,SessionTypes.USER_SESSION_TYPE,"");

     EventDispatcher eventDispatcher = EventDispatcherManager.getInstance().getEventDispatcher(para_serverName);

 

     RepositoryEventListener repositoryListener = new RepositoryEventListener();

     eventDispatcher.addListener(repositoryListener);

 

      RepositoryEventRegistrationSet reposRegSet = new RepositoryEventRegistrationSet(); 

     reposRegSet.addEventType(RepositoryEventRegistrationSet.IMPORT_EVENTS); 

     reposRegSet.addEventType(RepositoryEventRegistrationSet.SYNDICATION_EVENTS); 

     reposRegSet.addEventType(RepositoryEventRegistrationSet.ALL_REPOSITORY_EVENTS);

 

     RepositorySessionContext repSessionContxt = new RepositorySessionContext(para_serverName,para_repositoryName,para_userName); 

 

     eventDispatcher.registerRepositoryNotifications(reposRegSet, repSessionContxt,"");

 

     int count = 0;

     while (count < 5 * 60)

     {

        try

        {

           Thread.sleep(1000);

        }

        catch (InterruptedException e)

       {

           e.printStackTrace();

        }                                    

       count++;

     }

 

     EventDispatcherManager.getInstance().terminateEventDispatcher(para_serverName);

     sessionManager.destroySessions(userSessionContext);

 

     }

}

 

 

class RepositoryEventListener extends AbstractRepositoryListener

{

     public void repositoryStatusChanged(RepositoryStatusEvent event)

      {

          //Print repository status

      }

}

 

 

This can now be written in a web module and exposed as a web service, or the web module can be called from a Web Dynpro Java application.

 

Limitations of the 'EventDispatcher' MDM Java API:

 

MDM event notification is not a guaranteed-delivery system. In addition, there are known limitations and deficiencies; for example, AbstractDataListener.recordAdded is not triggered if a record is created through an import.

 

Another important limitation concerns termination of the EventDispatcher instance: it shouldn't be terminated if it runs in a Web AS, because the instance may be shared by other threads. It should be terminated only in standalone applications.

Hi All,

 

I am sharing my knowledge to explain the process of creating a repository in SAP MDM. This blog offers a solution to James for the following discussion: How to add repository in MDM data manger.

 

 

  1. Install the MDM server (MDS, MDIS, MDSS, MDLS) and the DBMS server (database).
  2. Start the MDM server and the auxiliary servers (MDIS, MDSS, and MDLS) as described in the steps below.
  3. Control Panel\System and Security\Administrative Tools > Services > MDM Server 7.1.
  4. Follow the above step to start the auxiliary servers as well.
  5. Now install the MDM clients (Console, Data Manager, Import Manager, and Syndicator).
  6. After installing the clients, open the MDM Console.
  7. Mount the MDM server in the Console.
  8. Note that the server has to be in running status.
  9. The server now being in running status doesn't mean you can create a repository on the MDM server; for that you need to configure the DBMS settings.
  10. Right-click on the server, choose DBMS Settings, and give the appropriate DBMS server name and the DBMS server password.
  11. The above step establishes a connection between the MDM server and the DBMS server.
  12. Now right-click on the MDM server and choose the option Create Repository.
  13. Here you need to give the DBMS server name and password again; until then, the Repository Name and Port fields are disabled.
  14. Once the DBMS name and password are verified, the Repository Name and Port fields are enabled.
  15. Now you can type your preferred repository name and port number, and with this you get a new repository in SAP MDM.
  16. Once the repository is displayed under the MDM server hierarchy node, it is in stopped status by default.
  17. Now you can design your repository as per your requirements; once you feel the repository is ready for use, right-click on it and choose Start Repository.
  18. You will get two options: Immediate and Update Indices.
  19. It is good to choose Update Indices.
  20. The repository, as designed, is now reflected in the other clients.
  21. If you need to change or update the structure of the repository again, go to the Console, stop the repository, and perform the action.
  22. Stopping a repository is not necessary for any type of addition in the Admin node (Users, Roles, Ports, Remote Systems, XML Schemas, Links, etc.).
Most of the above steps cover some Basis activities; I just added them for knowledge sharing.

In the following article I would like to present a very simple integration between two SAP products, SAP NetWeaver Master Data Management (MDM) and SAP Lumira. The goal of the integration is to easily generate analytic visual reports of the master data content.

Preface

SAP NetWeaver MDM customers enjoy many features and abilities of the product including content consolidation, master data harmonization, and many others.
However, a customer may need to analyze the master data for any of the following reasons:

 

  • To transform the raw master data into visual information for business purposes
  • To gather statistical and quantitative analysis of the master data
  • To perform data aggregation according to certain criteria (e.g. Supplier, Region) and present the results in a visual manner

 

The integration between the two products can be done by connecting SAP Lumira to MDM using the MDM DB views.
image002.png

MDM DB Views

The ability to generate MDM DB views was officially added to the MDM product in the MDM 7.1 SP10 release. DB views provide the customer with a real time data representation of the MDM content via SQL views. An SQL view is generated for each MDM table, by using a simple CLIX command.

 

The views are read-only.

 

SAP Lumira

SAP Lumira is a data manipulation and visualization tool. Users can connect to various data sources, select and clean data, and manipulate and visualize data using a diverse set of graphical charts and tables.

 

In this article I will describe how to visualize the MDM data using SAP Lumira.

 

A real life scenario

To demonstrate the capabilities of visualizing MDM data on SAP Lumira I’ll use a simple product repository.

 

I’ve already got data in the repository but I’ve been requested to provide analysis reports on the status of the repository’s products to upper management.

 

The request is:

 

  • Provide a visual report on the number of products per manufacturer.
  • Provide a visual report on the number of products per customer country.

 

In my repository I have the fields ‘Part Number’, ‘Manufacturer’ and ‘Customer_Country’ in the main table ‘Products’ of the repository.

 

‘Part Number’ uniquely identifies each product in the repository. ‘Manufacturer’ and ‘Customer_Country’ are the fields that I use to generate the reports.

 

Without the integration between SAP Lumira and NW MDM I'd have to query the data myself using MDM Data Manager or the MDM APIs, and then summarize and formalize the results - processes that could take hours and could lead to mistakes from data being changed or incorrectly copied during one of the steps.

 

The integration of SAP Lumira with SAP MDM will generate the two reports in a matter of minutes.

 

SAP Lumira will query the data directly from the database using the MDM DB views in 3 easy steps.

 

Step 1: Generating MDM DB Views

MDM DB views are generated using the CLIX command, RepViewsGenerate.

 

This command generates SQL views for the MDM data tables; each MDM table is represented by a single view, with all the display fields as the SQL view's fields.

 

The syntax of the command is:

 

RepViewsGenerate <ServerName> <RepositoryName>;<DB_Name>;<DB_Type_Letter>;<DB_UserName>;<DB_Password> <Repository_UserName>:<Repository_Password> "*"

 

In my example my repository name is MyRepository, it is mounted on an MS SQL server called MYMSSQL and my MDM server name is MYSERVER so I’ve used the following syntax:

 

CLIX RepViewsGenerate MYSERVER MyRepository;MYMSSQL;S;sa;pass Admin:adminpass "*"

Once I've executed the command, the MDM DB views are generated in my database, and each table is now represented by a single view.

 

My assignment is to generate analytic reports on the main table Products, so I’ll use the corresponding view M_Products_1_0.

 

Step 2: Connecting to the MDM DB Views with SAP Lumira

Configuring a JDBC driver for SAP Lumira

 

Once I’ve generated the views my next step is to connect to the DBMS using SAP Lumira using the following 2 components:

 

  • SAP Lumira Desktop Standard  Edition
  • A JDBC driver for my DBMS

 

I’m using MS SQL server 2008 so I’ve used Microsoft SQL Server JDBC Driver 3.0.

 

Once I’ve configured the JDBC driver in SAP Lumira (File -> Preferences -> FreeHand SQL) and restarted the application, I’m ready to connect to the MDM main table view.

 

Connecting to the MDM Main Table View

 

Connecting to the MDM main table view is done in the following way:
  • I start a new SAP Lumira document.
  • In the Data Source selection screen, I select FreeHand SQL.
image003.png
  • I select my DBMS type. The connection icon should be green if the JDBC driver is configured correctly.
  • I enter the DBMS login parameters and click ‘Validate connection’. After the validation has passed successfully, I click ‘Acquire’.

D.png

  • If the validation step failed, the issue might be wrong login details (Server Name, Server Port, Username and Password) or a problem with the JDBC driver or network.
  • The next dialog box displays a ‘query’ box, but there is no need to write a SQL query to get the data. I expand the catalog in the upper left corner and look for my repository name with an ‘_MXXX’ suffix (usually _M000).
image011.png
  • Once I find my repository database, I expand it and look for the view I need (in my example the data resides in the view M_Products_1_0) and double-click the name of the view.
  • I click ‘Preview Data’ to fetch a preview of the main table data.
image013.png
  • The menu below shows the column selection for columns that will be used by SAP Lumira. The default selection will fetch the data of all columns of the MDM main table into SAP Lumira; however, it is unnecessary for me to fetch all of them since I only require the 3 fields I mentioned earlier, ‘Part Number’, ‘Manufacturer’, and ‘Customer_Country’.
  • I select the 3 columns that I need and click ‘Acquire’ and continue to design my visual report.
image015.png

Step 3: Preparing the Reports in SAP Lumira

In the SAP Lumira’s preparation screen I select the ‘Visualize’ view.
image017.png
In the view I can see my 3 fields in the ‘Attribute’ column.
image019.png
I select the field 'Part Number' so that I can uniquely identify each product in my repository. The next step is to define this field as a measure, which means that this field will provide the data units for the charts and graphs that I'll create.
I right-click the attribute and choose ‘Create a measure’.
image021.png
I can now complete the first task I was requested to do: Provide a visual report on the number of products per manufacturer.

A visual report on the number of products per manufacturer

This is a rather simple visual report and SAP Lumira can present it in several different charts and graphs.
I start with a simple pie chart:
  • I click on the ‘Pies’ icon’s drop-down menu and select ‘Pie Chart’.
image023.png
  • I drag the new ‘Part Number’ measure I defined into the ‘Pie Sectors’ box under ‘Measures’ and drag the ‘Manufacturer’ attribute into the ‘Legend Color’ under ‘Dimensions’.

A.png

That's it! A pie chart with the number of products per manufacturer is generated!
image029.png
I can see from the chart that most of my products are manufactured by SAP AG, Apple Inc and Microsoft Corporation.
But that's not all. I can now switch between different visual representation types by clicking the relevant icon and decide which way I’d like to present my data,
For example, the same data is shown below in 3D Column bars, (available under the ‘Bars’ icon) and in cloud tag (available under the ‘Others’ icon).
image031.png
image033.png

A visual report on the number of products per customer country

My next task is to provide a visual report on the number of products per customer country.
Now, I can use the exact same method I used in the previous example and simply replace the ‘Manufacturer’ attribute with the ‘Customer_Country’ attribute and it will work, but I’d like to use an additional way to present the data for this report.
Although I acquired 3 fields from my main table view as attributes, there is a different icon next to the ‘Customer_Country’ attribute, which indicates that SAP Lumira detected that this attribute contains geographical data and can be displayed in geographic format.
image035.png

In order for me to display the attribute in geographic format I need to create a geographic hierarchy in the following way:

 

  • I right-click the ‘Customer_Country’ attribute and select ‘Create a geographic hierarchy’.

image037.png

  • Two options are presented, ‘By Name’ and ‘By Latitude/Longitude’. Since the data is country names, I select the first option.
  • The next window presents the types of geographical locations SAP Lumira can use to present the data. My data is countries, so I’ll select the ‘Country’ option. I can also choose between presenting all the values or only those values to which SAP Lumira could  match a country.

image040.png

  • SAP Lumira will now match the data in my repository with the known countries. All the values in my repository were matched to existing countries.

image042.png

  • Once the matching is completed, a new geographic hierarchy called ‘Country’ is created and added in the Hierarchies menu.

image044.png

  • Now I select ‘Geo choropleth chart’ and in it I place the following values:
    • I drag the ‘Part Number’ measure into the ‘Value’ box under ‘Measures’ and  drag the new geographic hierarchy ‘Country’ into the ‘Geography’ box under ‘Dimensions’.

B.png

The result is a map that shows the number of products per customer country. I can clearly see that my products are sold mainly in the US and EMEA. I might need to improve my product availability in the other countries.

 

image050.png

Combining the two reports


I completed my two main tasks and sent my reports to my manager. I was then asked if there is a way to combine the two reports and create a visual report on the number of products per customer country and for each country to present the number of products per manufacturer.

Using the measures, attributes, and geographic hierarchies I created, I can accomplish the task easily. I’ll show two ways to present this report.

 

Geographic

This scenario is almost identical to the Geo choropleth chart that I created earlier; the difference is that instead of creating a Geo choropleth chart I select ‘Geo Pie chart’. The second change is that I drag the ‘Manufacturer’ attribute into the ‘Overlay Data’ box under ‘Dimensions’.

image052.png

That's it! I now have a Geo Pie chart in which each country also contains a separate pie chart for the manufacturer in the products sold in this country.

I can now recognize trends in my repository; for example, most of the products bought in Asia are from the manufacturer SAP AG and most of the products bought in Europe are from the manufacturer Apple.

image054.png

Tree Map

The second option to present this report is in the ‘Maps’ visualization. I use the ‘Tree map’ option and I drag the ‘Part Number’ into the ‘Area Weight’ box and the  ‘Country’ geographic hierarchy and ‘Manufacturer’ attributes into the ‘Area Name’ box under ‘Dimensions’.

image056.png

 

This displays the same results as the previous chart. In my opinion this method is less attractive but can provide information easily if drilling down into the data is needed.

image058.png

Row Data Statistics

SAP Lumira can present statistics on the master data as row data in tables. This is useful when the exact data count is needed. In the following example I can extract the exact number of manufacturers and customer countries per product.

 

I switch to the ‘Data’ view and select the ‘Facets’ view.

C.png

Next I need to choose the measure according to which the facet tables will be created; I select the ‘Part Number’ measure.

image064.png

Facet tables are now created for the other two attributes. I can now see the exact number of manufacturers and customer countries per product:

image066.png

I can now filter the results further by double clicking the values in the facet tables; for example, if I click ‘SAP AG’ I can see the division of customer countries for this specific vendor only:

image067.png

 

 

Reusing the reports

An important subject I want to elaborate on is the reusability of the reports. Once I have saved my reports, I can reuse them as long as the schema of my repository is not changed. All I need to do is to run the ‘Refresh document’ command from the ‘Data’ menu, and SAP Lumira will fetch the new data from my database and update the report automatically.

 

Publishing the results

Once I complete all the reports and save them, I can share them in several ways. The first step is to switch from the prepared view to the shared view.

It will display all the options to share the data sets and visualizations I created:

image070.png

Visualizations can be published via email or on SAP StreamWork.

 

Datasets can also be published via SAP HANA, SAP BusinessObjects Explorer, Lumira Cloud or a csv file.

 

Summary

In this article I've shown a small portion of what can be done by combining SAP NW MDM and SAP Lumira. My examples used a relatively simple data set, but this doesn't mean that complex data sets can't be extracted; for example, several join queries on MDM DB views can provide complex data sets which include MDM hierarchies, tuples, etc.

 

More information on SAP Lumira can be found in the SAP Lumira user guide at:

 

http://help.sap.com/lumira

 

and at:

 

http://www.saphana.com/community/learn/solutions/sap-lumira

 

SAP Lumira

 

More information on DB views can be found in the MDM 7.1 SP10 Console Reference Guide in the section "Generating and Deleting MDM DB Views".

 

http://help.sap.com/nwmdm

 

SAP NetWeaver Master Data Management

 

Enjoy,

 

Tal Shnaiderman

MDM Development

 
