The customer connection program for SAP NW MDM 2016 has started.

Customers who want to submit a new development request can do so through the link below.

Hi All,


This document explains how to create a repository on an SAP MDM server.



1. Install the MDM server components, such as MDS (Master Data Server) and MDIS (Master Data Import Server).

2. Install the database server.

3. Install MDM Console, Data Manager, and Syndicator as per your requirements.

4. The MDM server should be up and running.



1. Open the SAP MDM Console.


2. Right-click on the SAP MDM Server node.


3. Select the server from the list.


4. The SAP MDM server is now connected.

5. Right-click on the MDM server and select Create Repository.


6. Select the DBMS server and provide the database user credentials, for example:

          User name : <DB User Name>

          Password  : *******


7. Click Next and provide the repository name.


8. Click Finish.

We have now created a new repository.


Connect to the Repository:

1. Right-click on the repository and select Connect to Repository.


2. Provide the credentials:

       User Name : Admin

       Password   : sapmdm


We have successfully connected to the repository.


3. Start the repository:

    Right-click on the repository node (Test_Repository) and select Start Repository --> Immediate.


The repository started successfully.







DataPool Response:

When a trade item is successfully exported from GDS and sent to 1SYNC via PI, 1SYNC sends a response back to GDS as an acknowledgement of receipt.

If the item is added, modified, or deleted, 1SYNC sends the corresponding acknowledgement with the status Accepted or Rejected. PCL (Parent and Child Linkage) relationships for the item can also be maintained through the GDS export process.


(Image: Catalogue Response)


  1. The 1SYNC server sends an AS2 message containing the status of the item registration/publication, or the response from the customer (accept/reject), to GDS via PI.
  2. PI receives the 1SYNC AS2 messages for further processing.
  3. PI transforms the data from a CatalogueResponse message to a DataPoolResponse.
  4. The DataPoolResponse message is sent to the GDS system via a Java proxy call.
  5. The JPR layer in GDS performs some pre-validation and then passes the data on to the GDS core component.
  6. The GDS core component updates the status of the trade item by invoking the MDM Java API to make calls to the MDM server.



Please feel free to share your thoughts and suggestions in the comments!!

Follow me on Google+

GDS Outbound Process

This blog explains the GDS outbound process and the steps involved in GDS item registration and publishing in 1WorldSync.


1WorldSync is a product information network and data pool solutions provider, certified within the GS1 Global Data Synchronization Network (GDSN). It supports GS1 Standards, the global standards for identifying, capturing, and sharing product information.





  1. When a trade item is created, changed, or published by a user, GDS CORE performs a validation and retrieves all the data for the item if it meets the criteria for export.
  2. GDS CORE then passes the data on to the JPR layer for outbound processing.
  3. The JPR layer then sends the message to PI using a Java proxy call.
  4. PI transforms the data from an MT_TradeItemExport message to a CatalogueRequest.
  5. The CatalogueRequest message is sent to the 1SYNC data pool server via the AS2 adapter.


Points to Remember

  • SAP PI uses the standard SAP GDS content; the PI GDS outbound mapping is further customized based on business requirements.
  • Refer to the 1WorldSync XML guide for mapping purposes, to link the correct attributes - attr, attrmany, attrgroupmany, etc.
  • In case of incorrect mapping, the item is rejected in 1WorldSync, which returns an acknowledgement message with "Invalid XML Message".
  • Messages can be grouped together per Target Market and IP GLN (Information Provider GLN).
  • 1WorldSync provides a unique user ID for each Target Market.
  • The user ID has to be maintained in the SAP GDS Parties table.
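The grouping described above can be sketched in plain Java. This is a minimal illustration under assumed names, not SAP GDS code: the ItemMessage type and its fields are hypothetical stand-ins for an outbound trade item message.

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch: grouping outbound item messages by Target Market and
// Information Provider GLN before sending them to 1SYNC as batches.
class MessageGrouper {

    // Minimal stand-in for an outbound trade item message (illustrative only).
    static final class ItemMessage {
        final String gtin;
        final String targetMarket;
        final String ipGln;

        ItemMessage(String gtin, String targetMarket, String ipGln) {
            this.gtin = gtin;
            this.targetMarket = targetMarket;
            this.ipGln = ipGln;
        }
    }

    // Group messages by the (Target Market, IP GLN) pair so that each group
    // can be sent to the data pool as one batch.
    static Map<String, List<ItemMessage>> groupByMarketAndGln(List<ItemMessage> messages) {
        return messages.stream()
                .collect(Collectors.groupingBy(m -> m.targetMarket + "/" + m.ipGln));
    }
}
```

In a real scenario the grouping key would be whatever batching criterion the data pool agreement defines; the pair used here simply mirrors the bullet point above.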

In the next blog we will explain the data pool response from 1SYNC - GDS Process - DataPool Response from 1SYNC.



Please feel free to share your thoughts and suggestions in the comments!!


GDS Inbound Process:
In this blog we will see the process used to send data from SAP MDM to SAP GDS. SAP GDS will receive Product/Materials  information from SAP MDM or SAP ECC. All the GDS relevant data that is required to create items in GDS is entered in SAP MDM product and sent from MDM 7.1 to GDS 2.1 via PI.


The Product Global Master Data repository in MDM is a custom built repository.

  • The MDM adapter is used in PI to receive the data from SAP MDM.
  • The GDS standard content (XML structure) has to be used in PI to send the data to GDS.
  • For registering the inbound interfaces of SAP GDS, refer to the blog GDS Overview and Technical Guide for PI.




  1. Product information details are sent from the MDM Product Global repository.
  2. PI picks up the files using the MDM/File adapter and transforms the message into a GDS-understandable format - the MT_TradeItems XML message.
  3. The MT_TradeItem message is sent to GDS via a Java proxy runtime call (the XI proxy adapter is used).
  4. The JPR layer in SAP GDS performs pre-validation and then passes the data on to the GDS core component.
  5. The GDS core component drives the creation/modification of the item by invoking the MDM Java API to make calls to the MDM server and get the status for the transaction.
  6. Status and details of the transaction execution are then added to the process logs.

Points to remember

  1. GTIN, Information Provider GLN, and Target Market are the mandatory attributes required in GDS to determine whether an item is new or modified.
  2. GDS inserts or updates the item based on the uniqueness of GTIN, IP (Information Provider) GLN, and TM (Target Market).
  3. GDS custom attribute creation should follow the standard SAP naming convention: the name should start with Z, as in Z<Attribute NAME>.
  4. Any creation or modification of a GDS table requires a GDS console and server restart for the changes to take effect.
  5. Custom attribute mappings should be handled under custom elements in the PI mappings.
  6. CustomGenericElement, CustomLookupElement, CustomLanguageElement, and CustomQualifierRecordElement are the types of GDS custom attributes.
  7. Table code, field code, and reference path are to be mapped with reference to the SAP GDS attributes.
  8. IP GLN key-value mappings should be maintained in the SAP GDS Parties table.
  9. Key and value attributes in the PI mappings are to be mapped accordingly.
  10. GDS key-value mappings are case sensitive; values hard-coded in PI must match the case used in GDS exactly.
  11. If a key-value mapping is not maintained in GDS, the item throws warnings, which can be seen in the GDS process logs.
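Points 1 and 2 above can be illustrated with a small plain-Java sketch of the insert/update decision: the combination GTIN + IP GLN + TM acts as the unique key of a trade item. The class and method names are illustrative assumptions, not SAP GDS APIs.

```java
import java.util.*;

// Hypothetical sketch of the insert/update decision: GDS treats
// GTIN + Information Provider GLN + Target Market as the unique key.
class TradeItemStore {

    private final Map<String, String> itemsByKey = new HashMap<>();

    // The composite key that determines whether an item is new or modified.
    static String key(String gtin, String ipGln, String targetMarket) {
        return gtin + "|" + ipGln + "|" + targetMarket;
    }

    // Returns "INSERT" for a previously unseen key and "UPDATE" otherwise.
    String upsert(String gtin, String ipGln, String targetMarket, String payload) {
        String k = key(gtin, ipGln, targetMarket);
        String previous = itemsByKey.put(k, payload);
        return previous == null ? "INSERT" : "UPDATE";
    }
}
```

Note how the same GTIN under a different Target Market is a distinct item, which is exactly why all three attributes are mandatory on inbound messages.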


The next blog will discuss the GDS export process - GDS Outbound Process - Item Register and Publish in 1Sync


Please feel free to share your thoughts and suggestions in the comments!!


Environment: Webdynpro Java Portal and SAP MDM   

Scenario 1: Create

  1. Create the BinaryBlobRecord instance (blob_instance) using RecordFactory.createEmptyBinaryObjectRecord(BINARY_OBJECTS_TABLE_ID);
  2. Set all the values that are to be updated on the same instance (blob_instance):**
  • blob_instance.setDataGroupId(<value>);
  • blob_instance.setBinary(<Binary Object Byte[]>);
  • blob_instance.setHasOriginal(<boolean>);

  3. Use CreateRecordCommand create = new CreateRecordCommand(<user session context>); to create the binary object record (B1).

          CreateRecordCommand (MDM Java API Library)

  4. Set the value and update the main table record (R1) with the link to B1.

Result: We have a record (B1) in the Binary Objects table linked to the main table* record (R1).

*Can be a table of any type

Scenario 2: Update
To update the BLOB object of a particular binary object record (B1):

  1. Search the Binary Objects table for the record to be updated.
    1. Get the link from the main table record (R1).
    2. The return value is the record ID of the BLOB (B1).
  2. Create the BinaryBlobRecord instance (blob_instance) using RecordFactory.createEmptyBinaryObjectRecord(BINARY_OBJECTS_TABLE_ID, B1);
  3. Set all the values that are to be updated on the same instance (blob_instance):**
    • blob_instance.setDataGroupId(<value>);
    • blob_instance.setBinary(<Binary Object Byte[]>);
    • blob_instance.setHasOriginal(<boolean>);
  4. Use the ModifyRecordCommand to update the record.

    ModifyRecordCommand (MDM Java API Library)

Result: The binary object is updated in place, so there is no need to re-link it.

**For the details about the different Parameters of Binary Objects - Working with Binary Large Objects (BLOBs) - SAP NetWeaver Master Data Management (MDM) 7.1 - SAP Library


To set the data group ID via setDataGroupId (mandatory for all types of BLOBs):

RetrieveGroupTreeCommand tree = new RetrieveGroupTreeCommand(<usersession context>);

tree.execute();

HierGroupNode node = tree.getGroupTree();

//As per our requirement we just placed all the BLOBs in the first leaf of the node.

//You can use the various options in HierGroupNode (MDM Java API Library) to traverse and select the relevant leaf.

GroupNodeId gID = node.getAllLeafs()[0].getId();



To set the data location ID via setDataLocationId (mandatory only for binary objects):

RetrieveGroupTreeCommand location = new RetrieveGroupTreeCommand(<usersession context>);

location.execute();

HierGroupNode lTree = location.getGroupTree();

//As per our requirement we just placed all the BLOBs in the first leaf of the node.

//You can use the various options in HierGroupNode (MDM Java API Library) to traverse and select the relevant leaf.

GroupNodeId lID = lTree.getAllLeafs()[0].getId();



Scenario 3: Deletion
Note: A binary object can be deleted only if it is not linked anywhere.

Delete the orphaned or unused records from the Binary Objects table.

  1. Get the record IDs from the Binary Objects table using the command:

     RetrieveLimitedRecordsExCommand objExCommand_allData = new RetrieveLimitedRecordsExCommand(<UserSessionContext>);

2. Use the code below to get the usage of the binary object record (B1):

RetrieveBinaryObjectUsageCommand objUsageCommand = new RetrieveBinaryObjectUsageCommand(<UserSessionContext>);

    • objUsageCommand.setTableId(BinaryObjects_tableID);
    • objUsageCommand.setRecordId(B1);
    • objUsageCommand.execute();

     RepositoryItemUsageResult objUsageResult = objUsageCommand.getUsageResult();

     RepositoryItemUsage[] repoUsage = objUsageResult.getUsages();


Refer to this link for the return values: Constant Field Values (MDM Java API Library)


If the length of repoUsage is 0, the binary object record (B1) has no usage and can therefore be erased.


3. Use the DeleteRecordsCommand to delete the record:

     DeleteRecordsCommand objDeleteBLOBRecordsCommand = new DeleteRecordsCommand(<UserSessionContext>);

     DeleteRecordsCommand (MDM Java API Library)

Note: Trying to delete binary object records that are still linked will result in an exception.


FYI: You can use the pagination technique when searching - see Pagination of MDM search result in Java.


SAP MDM is a repository that can hold huge and complex data sets, so searching the repository and retrieving data from a table is a heavy task. The SAP-provided MDM APIs are in place to carry out the search and return the result, but when the result is huge, it is risky to increase the page size to match the size of the result.

For example, if the search is expected to return 10K records and we increase the page size to 10K accordingly, there is a chance that the system goes down or encounters a delay in response time. To overcome this, we implement paging while retrieving, as shown in the code snippet below.



Before retrieving the data, set the page size.

Note: Choose it wisely according to the requirement (maximum page size = 1000).

//Retrieve data

RetrieveLimitedRecordsExCommand getDataCommand = new RetrieveLimitedRecordsExCommand(<UserSessionContext_instance>);

getDataCommand.execute();


//Number of records the search returned

int resultCount = getDataCommand.getSearchTableMatchCount();

//Size of the page

int pageSize = getDataCommand.getPageSize();

//Total pages

int totalPages = resultCount / pageSize;

// With a page size of 1000, 1030 records give totalPages = 1,
// although there are actually 2 pages.
// A modulo therefore tells us whether there is an extra, partial page.

int hasExtraValues = resultCount % pageSize;

if (hasExtraValues != 0) {
    totalPages++; // if there is an extra page, totalPages is increased by 1
}


Record[] tempRecord = null;

int arraySize = 0;

// page starts from 1 because getDataCommand.execute() above already
// returned the records of page 0.

for (int page = 1; page < totalPages; page++) {
    getDataCommand.setPageIndex(page); // set the next page index
    getDataCommand.execute();
    //TODO - code to manipulate the records of the current page
}
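The page arithmetic above can be isolated into a small helper. This is a minimal sketch with illustrative names; only the formula mirrors the logic of the snippet.

```java
// Helper capturing the page-count arithmetic used when paging MDM results.
class Paging {

    // Total number of pages needed to hold resultCount records
    // at pageSize records per page (integer division plus a partial page).
    static int totalPages(int resultCount, int pageSize) {
        int pages = resultCount / pageSize;
        if (resultCount % pageSize != 0) {
            pages++; // account for the partial last page
        }
        return pages;
    }
}
```

For example, 1030 records at a page size of 1000 occupy 2 pages (indices 0 and 1), which is why the retrieval loop runs only while the page index is below the total.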



Deleting and Renaming an Import Map:


Steps for deleting an import map:

1.      Log in to the Import Manager.



2.      Go to File -> Open; a pop-up window will appear with the list of all the import maps.



3.      Select the map and press the Delete key on your keyboard.

4.      The import map is now deleted.


Steps for renaming an import map:


1.      Log in to the Import Manager.



2.      Go to File -> Open; a pop-up window will appear with the list of all the import maps.



3.      Select the map, press the F2 key on your keyboard, and type the new name.

4.      The import map is now renamed.



  • Global Data Synchronization (GDS) is a term used to describe an industry initiative focused on keeping item information in sync between retailers and their suppliers.
  • GDS is a standardized process for the exchange of product data in the Global Data Synchronization Network (GDSN), built around the GS1 Global Registry and certified data pools.
  • GS1 Standard Board developed the Global Data Synchronization (GDS) process.
  • The solution integrates with downstream systems to help suppliers aggregate and validate GDS specific data and send this data through the appropriate format and process to their data pool.
  • For retailers, the solution enables a company to publish/subscribe to information and integrates with downstream processes for current items, new product introduction, and maintenance.
  • The following industries are actively participating in GDS:
    •   Grocery
    •   Food and Beverage
    •   Health and Beauty
    •   Electronics
    •   Over the Counter (OTC)
    •   Music, Movie, Games
    •   Apparel (minimal)
    •   Hardlines


  • The SAP solution for Global Data Synchronization is a turn-key application which leverages the NetWeaver platform and MDM to provide a comprehensive end-to-end solution.
  • Source data is sent from SAP MDM or SAP ECC to SAP GDS using a PI interface.
  • There are five simple steps that allow trading partners to synchronize item data:
    • Load Data: The seller registers product and company information in its data pool.
    • Register Data: A small subset of this data is sent to the GS1 Global Registry.
    • Request Subscription: The buyer, through its own data pool, subscribes to receive a seller's information.
    • Publish Data: The seller’s data pool publishes the requested information to the buyer’s data pool.
    • Confirm & Inform: The buyer sends a confirmation to the seller via each company's data pool, which informs the supplier of the action taken by the retailer using the information.

GDS Architecture

Registering Inbound Interfaces for GDS

  • To receive inbound messages from SAP PI, the Java Proxy Runtime (JPR) has to be registered on SAP GDS.
  • This task is performed on the PI Adapter Engine on the Web AS Java of the SAP GDS Console.
  • Open the JPR proxy server configuration pages of the Web AS Java on which the GDS Console is installed: http://<GDS server name>:<port number>/ProxyServer/.
  • Use the /register command to register the following interfaces one after the other, using the nomenclature shown on the JPR proxy server page:

  • The URL for registering the inbound proxy is as follows:

          http://<GDS server>/ProxyServer/register?ns=

Process Integration - SAP PI

  • SAP NetWeaver Process Integration is the central component for exchanging messages between SAP GDS and 1SYNC.
  • Based on the GDS version (2.0/2.1), the respective standard content has to be imported into SAP PI.
  • The SAP GDS inbound mapping is developed using the standard GDS data types, e.g. TradeItems.
  • Data can be sourced from any system with PI-certified connectivity.
  • Default values and additional mappings can be set.
  • Customize the GDS outbound mapping (GDS to 1SYNC), available under the SWC 1SYNC current version, based on business needs.
  • The XI adapter - Java proxy is used to communicate with the SAP GDS application.
  • The technical and business systems should be properly configured in the PI SLD.
  • PIAFUSER is used to connect to the SAP GDS application from PI; this user should be created in both the PI and GDS systems.
  • The AS2 adapter is used to communicate between SAP GDS and 1SYNC.

In the next blog we will discuss the GDS inbound process and tips - GDS Outbound Process - Item Register and Publish in 1Sync

Please feel free to share your thoughts and suggestions in the comments!!


The Emergence of GDSN

  • The industrial sector is currently in an era where retailers, manufacturers, and distributors have entered a promising, yet challenging, period in their relationships. As they recognize the need to work closely together to improve operational efficiency, more efficient information sharing and management has become a critical issue.
  • Traditionally, they exchanged product information using manual, paper-based processes, which led to inefficiencies, errors, and duplicated labor among trading partners.
  • This gave rise to the need for a standards-based data synchronization model - the Global Data Synchronization Network (GDSN).

What is GDSN?

  • The Global Data Synchronization Network (GDSN) is an internet-based, interconnected network of interoperable data pools and a global registry, known as the GS1 Global Registry, that enables companies around the globe to exchange standardized and synchronized supply chain data with their trading partners using a standardized Global Product Classification.
  • GDSN assures that data exchanged between trading partners is accurate and compliant with universally supported standards.
  • Organizations are leveraging the GDSN to streamline how product, service, location, organization, price and promotion data is created and exchanged within their own enterprise.

GDSN consists of the following:

  • Supplier/retailer trading partner
  • Data pools that hold and process trading partner data, such as Agentrics (formerly Worldwide Retail Exchange) and 1SYNC (formerly Transora and UCCnet).
  • GS1 Global Registry, a directory that helps locate data sources and keep relationships between trading partners in sync. It matches subscriptions to registrations to facilitate the synchronization process.
  • A set of standards established by EAN International and the Uniform Code Council, Inc. (EAN•UCC) through  the Global Standards Management Process (GSMP).
  • Enterprise software for product information management.


The Global Location Number (GLN) and the Global Trade Item Number (GTIN) are the global identification numbers in the GDSN; the GLN is the identifier for legal entities, trading partners, and locations, while the GTIN is the identifier for trade items.
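As a side note beyond the scope of the original post: the last digit of a GTIN is a GS1 mod-10 check digit, so a GTIN's basic integrity can be verified locally. The sketch below is plain Java with illustrative names, not a GDSN API.

```java
// Validates the GS1 mod-10 check digit of a numeric GTIN
// (GTIN-8, GTIN-12, GTIN-13, or GTIN-14).
class GtinCheck {

    static boolean isValid(String gtin) {
        // Accept only the standard GTIN lengths, digits only.
        if (gtin == null || !gtin.matches("\\d{8}|\\d{12,14}")) {
            return false;
        }
        int sum = 0;
        int weight = 3; // the digit just left of the check digit gets weight 3
        // Walk from right to left (excluding the check digit),
        // weighting digits alternately 3, 1, 3, 1, ...
        for (int i = gtin.length() - 2; i >= 0; i--) {
            sum += (gtin.charAt(i) - '0') * weight;
            weight = 4 - weight; // toggle between 3 and 1
        }
        int expected = (10 - sum % 10) % 10;
        return expected == gtin.charAt(gtin.length() - 1) - '0';
    }
}
```

Such a check only guards against typos and truncation; uniqueness within the GDSN is still determined by the registry, not by the check digit.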



  • The GDSN is a network of data pools that facilitates the synchronization of item information between retailers and suppliers through a single global registry.
  • The value of the GDSN is that a retailer and supplier can have a single point of entry into the network through their selected data pool.
  • Once connected, they can subscribe to or publish all of their item information in a single and consistent process for all trading partners that are connected to the GDSN.


The next blog will explain GDS Overview and Technical Guide for PI


Please share your thoughts and comments.


Reddy M

StructuralX (exception) in MDM

Posted by Reddy M Nov 21, 2014

Hi Friends



              I am importing data from an Oracle DB to SAP MDM through SAP PI. During the import I got an error, "StructuralX (exception)", in MDM, but when I check the StructuralX folder there are no exception files in it. Where could I have gone wrong? Please let me know as soon as possible.







Transporting MDMGX objects from SAP ECC to Material Repository in MDM


MDMGX (MDM Generic Extraction Framework) is a standard SAP transaction code available within core SAP ERP systems depending on your version and release level.

At NIMBL, one of our major clients - a global professional services company - had a business requirement to add OverheadGroups functionality to their material repository in MDM. To accomplish this, we first had to perform local table (LT) configuration for OverheadGroups in our SAP ECC development client, as all local table configurations are maintained in SAP ECC in our client landscape.

To share what I learned, in this blog, I will provide the steps I performed to accomplish this requirement.

First off, our client’s ECC landscape is architectured according to SAP best practices.


Note: Transporting individual configuration objects under MDMGX is not possible, so it is recommended that all of the configuration objects under MDMGX be transported together in order to maintain consistency across the landscape. Tip: Use an asterisk (*) to include all objects in a single transport.

Configuring LT_Overheadgroups under MDMGX for Material repository

Step 1: Log in to the Development client

Step 2: Execute MDMGX transaction and click on Maintain Ports and Check-Tables


Step 3: Provide the appropriate System and Repository information to perform LT configuration


Step 4: Click new file to perform LT_OverheadGroups configuration


Step 5: Enter the following details for the LT_OverheadGroups configuration as shown below. This configuration provides data in English for the LT_OverheadGroups table in MDM Data Manager. This information will be available to users through the portal.


Step 6: Copy the configuration to the testing client using the SCC1 transaction code. Make sure to include all of the objects under MDMGX, as mentioned above.

After successful testing and after importing the transports into your ECC production environment, perform the following steps to extract the LT_OverheadGroups data from ECC production to the MDM Data Manager.

Step 7: Execute transaction MDMGX and select “Start Extraction”



Step 8: Provide the client details and the port name of the table to which the data needs to be extracted in the MDM Data Manager. Click Execute to start the extraction process.



Step 9: Login to the MDM Data Manager and make sure the data is available


So there you go. Easy as pie, right? I think this is the best approach for making data flow from an SAP ERP system to MDM when the local tables are maintained in the SAP ECC system.


Please stay tuned for my next blog covering how to maintain ports within the MDM console. I would love to hear your thoughts, comments, and ideas. Please feel free to email me at any time.


Hope this helps!!!

We implemented our master data management solution using customized MDM WDCs, i.e. coupled with CE as a user interface on top of MDM (7.1 SP6) along with CE BPM (7.2 SP5). The solution was fit for purpose and neatly did the intended job of data integration, governance, cleansing, and maintaining a single source of truth, with the WDCs providing an easy-to-build UI. The reusable and generic nature of the components ensures independence from the underlying MDM schema; as a result, they are configurable and extendable.


Given the obvious advantages the MDM solution provided, the business came back with ever more diverse needs and was looking at SAP MDM as a tool for maintaining a variety of data (flat, hierarchies, etc.) across the organization - not all of it master/reference data.


Most of the requirements were pertinent to the UI of the data stored in MDM (e.g. enabling a hierarchical or tree-structure view on hierarchies, having drag and drop), with a few others related to the MDM solution itself, e.g. check-in/check-out on hierarchies. Though MDM and the MDM WDCs could accommodate some of these requirements, the slightly trickier ones could not be addressed easily. As the requirements kept piling up, we felt SAP MDM was a bit limited in supporting them.


For example, many organizations have huge amounts of organizational data in hierarchies. An organization looking for an MDM solution would naturally like support for all of its data, i.e. flat, hierarchies, etc. We faced a few limitations in implementing hierarchies using SAP MDM. The situation demanded a tree-like user interface; though the WDCs attempt to provide that display, we felt it fell a bit short on user satisfaction. Hierarchies in MDM also come with limitations: they cannot be checked in/checked out and can be referenced only at leaf nodes (perhaps we can make up for this using internal leaf nodes, but at the expense of duplication). An add-on component like the WDCs, or more mature support for hierarchies within the WDCs, would help here. Additionally, main-table-like features for hierarchies, i.e. check-in/check-out, would not only bring consistency but also widen the scope of master data support.


Another such area could be what-if analysis. An organization undergoing restructuring or other org-level changes, such as merging or splitting business units, needs to do a lot of what-if analysis. For example, in a scenario of merging business unit A with either B or C, management interested in knowing the likely cash flows for units B and C if merged with A would want some analysis before committing the data. We came across one such requirement, wherein the business was undergoing a restructuring activity and wanted to do some what-if analysis on org data before committing it. As we couldn't find anything in the SAP MDM space to address it, we had to settle for the Oracle DRM tool with feeds from SAP MDM. It would be much easier for customers if such support existed through WDC-like components; it would save them a lot of the hassle of evaluating other options and also provide a solution to adjoining needs under the same roof.


We also faced issues in distributing data from MDM. Once a master data solution is implemented, the data should be routinely distributed across functional groups to satisfy their master data needs and to preserve the integrity of the solution. The business may need regular, scheduled distribution of master data to other functional groups. Though the Syndicator gives us the option of automatic (monthly/weekly/daily) syndication of repository data, the schedule is too elementary to meet even slightly advanced distribution needs. For example, automatic daily syndication will keep syndicating repository data without considering weekdays/weekends; downstream systems looking to receive data only on weekdays may need to think twice. Of course, we may ask downstream systems to align their needs, but such a small, natural extension should be available out of the box.


One may argue that none of these requirements is related to master data, but today enterprises want to get the maximum out of their data, and SAP MDM - perhaps through WDC-like components - should extend its support basket to provide for such needs. After all, customers would love to have their master data requirements addressed on one platform rather than keep adding tools for each new requirement.

I came across a requirement in one of my assignments in which I was expected to write a web service to fetch data from MDM tables. In this blog I will share the steps I followed to achieve this requirement.

Requirement: Implement a webservice to fetch data from MDM tables.



Step 1: Create an EJB project.


Step 2: Create a session bean in the EJB project created in the previous step. To do so, right-click on the EJBModule and select Session Bean from the menu.

Give the package name and bean class name, and check the "Remote" check box. Once done, click the Finish button.

We have successfully created a session bean.


Step 3: Add the dependencies below to the EJB project. These are required to use the MDM APIs.


Step 4: Create an "Enterprise Application" project, add the EJB project created in Step 1 on the Java EE dependency screen, and click Finish.

At this point, all required projects and dependencies are created.

Step 5: Create a SYSTEM_USER in the MDM repository, which we require to connect to the MDM system.

Step 6: Open the session bean created previously and declare a method in it. Add the code below to the session bean class to connect to MDM.


Step 7: We successfully connected to the MDM repository in the step above. Now we have to query the MDM table to fetch the desired records.

My requirement was to query the "Sites" table based on "Country" and "Organization", which are provided as input to the web service.

Below is the code I used to fetch the records. This piece of code can be modified based on individual requirements.


Step 8: Loop through the result set and set the required output fields on the return variable of our method, which will be the output of the web service.

Step 9: Copy the method signature that we created in the session bean and paste it into the "Remote" interface class. This is required to create our web service.

Step 10: Let's expose our session bean as a web service. To do so, right-click on the session bean --> Web Service --> Create Web Service.

Check the "Publish the Web service" check box and click Next. In the following screen, select the "Create New Interface" option and give the interface a name.

Select the method to be exposed in the web service and click the Finish button. We have now successfully created a web service out of our bean.


Step 11: Build the enterprise application project and deploy it on the server.


Step 12: We have successfully deployed our changes on the server. Now let's log in to WS Navigator to test the web service.


Step 13: Provide the username and password and log in to WS Navigator. Select the "Provider System" radio button, enter the service name, and click the Search button. We will get the list of available services. Select our service and click the Next button.

Step 14: Provide the necessary inputs and click the Next button. If a matching record is found, we get the desired output.


I hope this will be useful.



Many a time a requirement occurs wherein we have to monitor the activities of MDM - activities like a record being modified, record rollback, record syndication, repository status changes, and a few other MDM events. Manually monitoring these activities is not possible; in such a scenario, monitoring MDM activities through some other mechanism is more feasible.


Here the concept of listening from MDM events comes into picture. We can call it a MDM Listener activity.This can be achieved using a MDM Java API  - 'EventDispatcher' .


'EventDispatcher' is a class in Java which would help to listen (ie raise alert on the registered event) so that related processing can be done.

Simple MDM events like record checkout , record modified , record rolled back , record syndicated can be listened . Also , events like repository loaded , repository unloaded can be listened.


This class is responsible for dispatching all events coming from one specific MDM server to all event listeners subscribed in this dispatcher.We just need to register the specific MDM server so that it can handle the events listened by this dispatcher.


In order to define the set of notifications available on the dispatcher, it should be registered to one or several groups of notifications: notifications which indicate repository-level activities, server-level activities or data-level activities.


The steps for subscribing the listener and starting it are as follows:


Create a main class with a method which will hold the listener, and create another class which extends AbstractServerListener if we want to listen to server activities, AbstractRepositoryListener if we want to listen to repository activities, or AbstractDataListener if we want to listen to record-level activities.


If all these three kinds of activities are to be listened to, then three different classes, each extending one of the above listener classes, should be declared.


These classes, which extend the required listener class, can override its methods for further processing.


In the example below we will create a main class 'EventListenerExample' with a method 'EventMethod'. The second class, 'RepositoryEventListener', extends AbstractRepositoryListener and overrides its 'repositoryStatusChanged' method.

To run this, a few parameters are required, i.e., the server name, the repository name, and a user name authorised to access them.


1.  If we were to listen to MDM repository events, the listener class would be as below:





class RepositoryEventListener extends AbstractRepositoryListener
{
     public void repositoryStatusChanged(RepositoryStatusEvent event)
     {
          //Print repository status
          System.out.println("Repository status changed: " + event);
     }
}

2. In the main class, in the method 'EventMethod', create an instance of EventDispatcher as below and set the server name:

      EventDispatcher eventDispatcher = EventDispatcherManager.getInstance().getEventDispatcher(serverName);

3. Add listeners to the EventDispatcher instance.

     For this, we first need to initialize the listeners. A listener is initialized by creating an instance of the class which extends the required listener class, as shown above.


    RepositoryEventListener repositoryListener = new RepositoryEventListener(); // Creating an instance of the listener class


4. Once the listener instance is created, add the listener object initialized above to the EventDispatcher so that these events can be captured:

    eventDispatcher.addListener(repositoryListener);

5. Register (subscribe) for all types of notifications for the event:


RepositoryEventRegistrationSet reposRegSet = new RepositoryEventRegistrationSet(); 





RepositorySessionContext repSessionContxt = new RepositorySessionContext(serverName,repositoryName,userName);


eventDispatcher.registerRepositoryNotifications(reposRegSet, repSessionContxt,"");

With this, the listener is triggered. But this listener activity wouldn't continue for long. To keep the MDM listener running for some time, add a loop as below which runs for 5 minutes. The time condition can be changed as per the requirement. Now the listener will continue for 5 minutes.


int count = 0;
while (count < 5 * 60)
{
     try
     {
          Thread.sleep(1000); // wait one second per iteration
          count++;
     }
     catch (InterruptedException e)
     {
          e.printStackTrace();
     }
}

So, now with the code in place, a call to the method EventMethod will set up the listener, and this thread will continue for 5 minutes (since we have kept the while loop running for 5 minutes). During this time, any change in the repository status will be captured by the listener, and if any loggers are written in this method then their output can be seen.
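The counter-based loop above assumes each iteration takes exactly one second. An alternative design is to wait against a wall-clock deadline, which stays accurate even if the sleeps are irregular. A small standalone sketch follows; BoundedWait and waitFor are illustrative names invented here, not part of the MDM API.

```java
public class BoundedWait {

    // Sleeps in steps of at most one second until the deadline passes.
    // Returns the number of completed sleep steps.
    static int waitFor(long millis) {
        long deadline = System.currentTimeMillis() + millis;
        int iterations = 0;
        while (true) {
            long remaining = deadline - System.currentTimeMillis();
            if (remaining <= 0) {
                break; // deadline reached
            }
            try {
                Thread.sleep(Math.min(1000, remaining));
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // restore interrupt flag and stop waiting
                break;
            }
            iterations++;
        }
        return iterations;
    }

    public static void main(String[] args) {
        // Waits about two seconds, in roughly one-second steps
        System.out.println("completed steps: " + waitFor(2000));
    }
}
```

For the 5-minute listener window above, `waitFor(5 * 60 * 1000)` would be the equivalent call.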


Once the entire execution is done, one of the important steps is to terminate the EventDispatcher instance, but only in standalone applications. Don't terminate the EventDispatcher if it runs in a Web AS, because the instance may be shared by other threads.





Now, the complete code would look like below:


public class EventListenerExample
{
     public void EventMethod(String para_serverName, String para_repositoryName, String para_userName)
     {
          UserSessionContext userSessionContext;
          SessionManager sessionManager;

          userSessionContext = new UserSessionContext(para_serverName, para_repositoryName, "", para_userName);
          sessionManager = SessionManager.getInstance();

          EventDispatcher eventDispatcher = EventDispatcherManager.getInstance().getEventDispatcher(para_serverName);

          RepositoryEventListener repositoryListener = new RepositoryEventListener();
          eventDispatcher.addListener(repositoryListener);

          RepositoryEventRegistrationSet reposRegSet = new RepositoryEventRegistrationSet();
          RepositorySessionContext repSessionContxt = new RepositorySessionContext(para_serverName, para_repositoryName, para_userName);

          eventDispatcher.registerRepositoryNotifications(reposRegSet, repSessionContxt, "");

          int count = 0;
          while (count < 5 * 60)
          {
               try
               {
                    Thread.sleep(1000); // wait one second per iteration
                    count++;
               }
               catch (InterruptedException e)
               {
                    e.printStackTrace();
               }
          }
     }
}

class RepositoryEventListener extends AbstractRepositoryListener
{
     public void repositoryStatusChanged(RepositoryStatusEvent event)
     {
          //Print repository status
          System.out.println("Repository status changed: " + event);
     }
}





This can now be written in a web module and further triggered as a web service, or a call to this web module can be made from a Web Dynpro Java application.


Limitations of the 'EventDispatcher' MDM Java API:


MDM event notification is not a guaranteed-delivery system. In addition, there are known limitations and deficiencies. For example, AbstractDataListener.recordAdded is not triggered if a record is created through import.


Another important limitation concerns the termination of the EventDispatcher instance. The EventDispatcher instance shouldn't be terminated if it runs in a Web AS, because the instance may be shared by other threads. It should be terminated only in standalone applications.

