When a trade item is successfully exported from GDS and sent to 1SYNC via PI, 1SYNC sends a response back to GDS as an acknowledgement of receipt.
If an item is added, modified, or deleted, 1SYNC sends the corresponding acknowledgment with a status of Accepted or Rejected. PCL (Parent and Child Linkage) relationships for the item can also be maintained through the GDS export process.
The 1SYNC server sends an AS2 message containing the status of the item registration/publication, or the response from the customer (accept/reject), to GDS via PI.
PI receives the 1SYNC AS2 messages for further processing.
PI transforms the data from a CatalogueResponse message to a DataPoolResponse.
The DataPoolResponse message is sent to the GDS system via a Java proxy call.
The JPR layer in GDS performs some pre-validation and then passes the data on to the GDS core component.
The GDS core component updates the status of the trade item by invoking the MDM Java API to make calls to the MDM server.
Please feel free to share your thoughts and suggestions in the comments!
This blog explains the GDS outbound process and the steps involved in GDS item registration and publishing in 1WorldSync.
1WorldSync is a product information network and data pool solutions provider, certified within the GS1 Global Data Synchronization Network (GDSN). It supports GS1 Standards, the global standards for identifying, capturing, and sharing product information.
When a trade item is created, changed, or published by a user, GDS CORE performs a validation and retrieves all the data for the item if it meets the criteria for export.
GDS CORE then passes the data on to the JPR layer for outbound processing.
The JPR layer then sends the message to PI using a Java proxy call.
PI transforms the data from an MT_TradeItemExport message to a CatalogueRequest.
The CatalogueRequest message is sent to the 1SYNC data pool server via the AS2 adapter.
Points to Remember
In SAP PI, the standard SAP GDS content is used; the PI GDS outbound mapping is further customized based on business requirements.
The 1WorldSync XML guide has to be referred to for mapping purposes, to link the correct attributes (attr, attrmany, attrgroupmany, etc.).
In case of incorrect mapping, the item is rejected in 1WorldSync, which returns an acknowledgement message with "Invalid XML Message".
Messages can be grouped by target market and IP GLN (Information Provider GLN).
1WorldSync provides a unique user ID for each target market.
The user ID has to be maintained in the SAP GDS Parties table.
GDS Inbound Process: In this blog we will see the process used to send data from SAP MDM to SAP GDS. SAP GDS receives product/material information from SAP MDM or SAP ECC. All the GDS-relevant data required to create items in GDS is entered in the SAP MDM product repository and sent from MDM 7.1 to GDS 2.1 via PI.
The Product Global Master Data repository in MDM is a custom-built repository.
The MDM adapter is used in PI to receive the data from SAP MDM.
The GDS standard content (XML structure) has to be used in PI to send the data to GDS.
SAP MDM is a repository that can hold huge and complex data sets, so searching the repository and retrieving data from a table is a heavy task. SAP provides MDM APIs to carry out the search and return the result, but when the result set is large, it is risky to simply increase the page size to match.
For example, if the search is expected to return 10,000 records and the page size is raised to 10,000 accordingly, there is a chance the system goes down or encounters a delay in response time. To avoid this, we implement paging: set the page size before retrieving the data, then fetch the results page by page.
Note: choose the page size wisely according to the requirement (maximum page size = 1,000).
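The paging loop described above can be sketched in Java. This is a self-contained illustration that pages over a plain list; the real MDM Java API does this through its retrieval command, whose page-size and page-index setters vary by release, so the MDM-specific calls are mentioned only in comments as assumptions.

```java
import java.util.ArrayList;
import java.util.List;

public class PagedRetrieval {

    // Fetch one page of results. In the real MDM API this would be a
    // retrieval command configured with a page size and a page index
    // (exact method names depend on the API version - an assumption here).
    static List<String> fetchPage(List<String> all, int pageIndex, int pageSize) {
        int from = pageIndex * pageSize;
        if (from >= all.size()) {
            return new ArrayList<>();               // past the last page
        }
        int to = Math.min(from + pageSize, all.size());
        return new ArrayList<>(all.subList(from, to));
    }

    // Loop over pages until an empty page signals the end of the result set,
    // advancing the page index instead of enlarging the page size.
    static List<String> fetchAll(List<String> all, int pageSize) {
        List<String> collected = new ArrayList<>();
        int pageIndex = 0;
        List<String> page;
        while (!(page = fetchPage(all, pageIndex, pageSize)).isEmpty()) {
            collected.addAll(page);
            pageIndex++;
        }
        return collected;
    }
}
```

Keeping the page size at or below the 1,000-record maximum and iterating this way bounds memory per call, instead of forcing one oversized retrieval.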
The industrial sector is currently in an era in which retailers, manufacturers, and distributors have entered a promising, yet challenging, period in their relationships. As they recognize the need to work closely together to improve operational efficiency, more efficient information sharing and management has become a critical issue.
Traditionally, they exchanged product information using manual, paper-based processes which led to inefficiencies, errors and duplicate labor efforts among trading partners.
This gave rise to the need for a standards-based, synchronized data model: the Global Data Synchronization Network (GDSN).
What is GDSN?
The Global Data Synchronization Network (GDSN) is an internet-based, interconnected network of interoperable data pools and a global registry, the GS1 Global Registry, that enables companies around the globe to exchange standardized and synchronized supply chain data with their trading partners using a standardized Global Product Classification.
GDSN assures that data exchanged between trading partners is accurate and compliant with universally supported standards.
Organizations are leveraging the GDSN to streamline how product, service, location, organization, price and promotion data is created and exchanged within their own enterprise.
GDSN consists of the following:
Supplier/retailer trading partners
Data pools that hold and process trading partner data, such as Agentrics (formerly Worldwide Retail Exchange) and 1SYNC (formerly Transora and UCCnet).
GS1 Global Registry, a directory that helps locate data sources and keep relationships between trading partners in sync. It matches subscriptions to registrations to facilitate the synchronization process.
A set of standards established by EAN International and the Uniform Code Council, Inc. (EAN•UCC) through the Global Standards Management Process (GSMP).
Enterprise software for product information management.
The Global Location Number (GLN) and the Global Trade Item Number (GTIN) are the global identification numbers in the GDSN; the GLN is the identifier for legal entities, trading partners, and locations, while the GTIN is the identifier for trade items.
The GDSN is a network of data pools that facilitates the synchronization of item information between retailers and suppliers through a single global registry.
The value of the GDSN is that a retailer and supplier can have a single point of entry into the network through their selected data pool.
Once connected, they can subscribe to or publish all of their item information in a single and consistent process for all trading partners that are connected to the GDSN.
I am importing data from an Oracle DB to SAP MDM through SAP PI. During the import I got an error like StructuralX (exception) in MDM, but when I check the StructuralX folder there are no exception files in it. Where did I go wrong? Please let me know.
Transporting MDMGX objects from SAP ECC to Material Repository in MDM
MDMGX (the MDM Generic Extraction framework) is a standard SAP transaction code available within core SAP ERP systems, depending on your version and release level.
At NIMBL, one of our major clients, a global professional services company, had a business requirement to add OverheadGroups functionality to their material repository in MDM. In order to accomplish this, we first had to perform local table (LT) configuration for OverheadGroups in our SAP ECC development client, as all local table configurations are maintained in SAP ECC in our client's landscape.
To share what I learned, in this blog I will walk through the steps I performed to accomplish this requirement.
First off, our client's ECC landscape is architected according to SAP best practices.
Note: Transporting individual configuration objects under MDMGX is not possible, so it is recommended that all of the configuration objects under MDMGX be transported together in order to maintain consistency across the landscape. Tip: use an asterisk (*) to include all objects under a specific transport.
Configuring LT_OverheadGroups under MDMGX for the Material repository
Step 1: Log in to the Development client.
Step 2: Execute the MDMGX transaction and click on Maintain Ports and Check-Tables.
Step 3: Provide the appropriate system and repository information to perform the LT configuration.
Step 4: Click New File to perform the LT_OverheadGroups configuration.
Step 5: Enter the following details for the LT_OverheadGroups configuration as shown below. This configuration provides data in English for the LT_OverheadGroups table in MDM Data Manager; this information will be available to users through the portal.
Step 6: Copy the configuration to the testing client using the SCC1 transaction code. Make sure to include all of the objects under MDMGX, as mentioned above.
After successful testing and importing of the transports to your ECC production environment, we next need to perform the following steps to extract the data from the ECC production LT_OverheadGroups table into the MDM Data Manager.
Step 7: Execute transaction MDMGX and select "Start Extraction".
Step 8: Provide the client details and the port name of the table into which the data needs to be extracted in the MDM Data Manager. Click Execute to start the extraction process.
Step 9: Log in to the MDM Data Manager and make sure the data is available.
So there you go. Easy as pie, right? I think this is the best approach for making data flow from an SAP ERP system to MDM when the local tables are maintained in the SAP ECC system.
Please stay tuned for my next blog, covering how to maintain ports within the MDM Console. I would love to hear your thoughts, comments, and ideas. Please feel free to email me at firstname.lastname@example.org anytime.
We implemented our master data management solution using customized MDM Web Dynpro components (WDCs), coupled with CE as a user interface on top of MDM (7.1 SP6) along with CE BPM (7.2 SP5). The solution was fit for purpose and neatly did the intended job of data integration, governance, cleansing, and maintaining a single source of truth, with the WDCs providing an easy-to-build UI. The reusable and generic nature of the components ensures independence from the underlying MDM schema; as a result, they are configurable and extendable.
Given the obvious advantages the MDM solution provided, the business came up with ever more diverse needs and looked to SAP MDM as a tool for maintaining a variety of organizational data (flat, hierarchical, etc.), not all of which was master/reference data.
Most of the requirements pertained to the UI of data stored in MDM (e.g. enabling a hierarchical or tree-structure view on hierarchies, or drag and drop), with a few others related to the MDM solution itself, e.g. check-in/check-out on hierarchies. Though MDM and the MDM WDCs could accommodate some of these requirements, other slightly trickier ones could not be addressed easily, and as the requirements kept piling up, we felt SAP MDM was a bit limited in supporting them.
For example, many organizations have huge amounts of organizational data in hierarchies. An organization looking for an MDM solution would naturally like support for all of its data, flat and hierarchical alike. We faced a few limitations in implementing hierarchies using SAP MDM. The situation demanded a tree-like user interface; though the WDCs attempt to provide that display, we felt it fell a bit short on user satisfaction. Hierarchies in MDM also come with limitations: they cannot be checked in/checked out, and they can be referenced only at the leaf node (perhaps we could make up for this using an internal leaf node, but at the expense of duplication). An add-on component like the WDCs, or more mature support for hierarchies in the WDCs, would help here. Additionally, main-table-like features for hierarchies (check-in/check-out, etc.) would not only bring consistency but also widen the scope of master data support.
Another related area is what-if analysis. An organization undergoing restructuring or other org-level changes, such as merging or splitting business units, often needs to do a lot of what-if analysis. For example, in a scenario of merging business unit A with either B or C, management interested in the likely cash flows for units B and C if merged with A would want some analysis before committing the data. We came across one such requirement, where the business was undergoing a restructuring activity and wanted to do what-if analysis on org data before committing it. As we couldn't find anything in the SAP MDM space to address it, we had to settle for the Oracle DRM tool with feeds from SAP MDM. It would be much easier for customers if there were such support via WDC-like components; it would spare them the hassle of evaluating other options and also address adjoining needs under the same roof.
We also faced issues in distributing data from MDM. Once a master data solution is implemented, the data should be routinely distributed across functional groups to satisfy their need for master data and to preserve the integrity of the solution. The business may need regular, scheduled distribution of master data to other functional groups. Though the Syndicator gives us the option of automatic (monthly/weekly/daily) syndication of repository data, the schedule is too elementary to meet even slightly advanced distribution needs. For example, automatic daily syndication will keep syndicating repository data without considering weekdays versus weekends, so downstream systems looking to receive data only on weekdays may need to think twice. Of course, we could ask downstream systems to align their needs, but such a small, natural extension should be available out of the box.
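One workaround for this gap (an assumption on our part, not a Syndicator feature) is to drive syndication from an external scheduler that checks the day of the week before triggering the run. A minimal sketch of that weekday gate in Java:

```java
import java.time.DayOfWeek;
import java.time.LocalDate;

public class SyndicationGate {

    // True on Monday through Friday; the scheduled job would invoke the
    // syndication run only when this returns true (the actual syndication
    // call is omitted, as it depends on the landscape).
    static boolean isWeekday(DayOfWeek day) {
        return day != DayOfWeek.SATURDAY && day != DayOfWeek.SUNDAY;
    }

    public static void main(String[] args) {
        if (isWeekday(LocalDate.now().getDayOfWeek())) {
            System.out.println("Weekday: trigger the scheduled syndication");
        } else {
            System.out.println("Weekend: skip syndication");
        }
    }
}
```

The daily Syndicator schedule stays in place conceptually; the external gate simply decides whether that day's run should fire.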
One may argue that none of these requirements is strictly about master data, but today enterprises want to get the maximum out of their data, and SAP MDM, perhaps via WDC-like components, should extend its support basket to provide for such needs. After all, customers would love to have their master data requirements addressed on one platform rather than keep adding tools for each new requirement.
I came across a requirement in one of my assignments in which I was expected to write a web service to fetch data from MDM tables. Through this blog, I thought I would share the steps I followed to achieve this.
Requirement: implement a web service to fetch data from MDM tables.
Step 1: Create an EJB project.
Step 2: Create a Session bean in the EJB project created in the previous step. To do so, right-click on the EJB module and select Session Bean from the menu.
Give the package name and bean class name, and check the "Remote" check box. Once done, click the Finish button.
We have successfully created a Session bean.
Step 3: Add the below dependencies to the EJB project. These are required to use the MDM APIs.
Step 4: Create an "Enterprise Application" project, add the EJB project created in Step 1 on the Java EE dependencies screen, and click Finish.
At this point, all required projects and dependencies are created.
Step 5: Create a SYSTEM_USER in the MDM repository, which we require in order to connect to the MDM system.
Step 6: Open the Session bean created previously and declare a method in it. Add the below code to the Session bean class to connect to MDM.
Step 7: We successfully connected to the MDM repository in the above step. Now we have to query the MDM table to fetch the desired records.
My requirement was to query the "Sites" table based on "Country" and "Organization", which are provided as input to the web service.
Below is the code I used to fetch the records; this piece of code can be modified based on individual requirements.
Step 8: Loop through the result set and set the required output fields on the return variable of our method, which will be the output of the web service.
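Since the original code listing is not reproduced here, the shape of steps 7 and 8 can be sketched with plain collections standing in for the MDM result set. The `Site` class and its field names are illustrative only; a real implementation would build an MDM search with constraints on the Country and Organization fields on the server side instead of filtering in memory.

```java
import java.util.ArrayList;
import java.util.List;

public class SiteQuery {

    // Minimal stand-in for one record of the "Sites" table; the field
    // names are assumptions for the sake of the example.
    static class Site {
        final String name, country, organization;
        Site(String name, String country, String organization) {
            this.name = name;
            this.country = country;
            this.organization = organization;
        }
    }

    // Filter the "result set" on the two web-service inputs and collect
    // the output field, mirroring the loop of step 8.
    static List<String> findSites(List<Site> sites, String country, String organization) {
        List<String> out = new ArrayList<>();
        for (Site s : sites) {
            if (s.country.equals(country) && s.organization.equals(organization)) {
                out.add(s.name);
            }
        }
        return out;
    }
}
```

The returned list corresponds to the return variable of the bean method, i.e. the payload the web service hands back to the caller.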
Step 9: Copy the method signature that we created in the Session bean and paste it into the "Remote" interface class created earlier. This is required to create our web service.
Step 10: Let's expose our Session bean as a web service. To do so, right-click on the Session bean --> Web Service --> Create Web Service.
Check the "Publish the Web service" check box and click Next. In the following screen, select the "Create New Interface" option and give the interface a name.
Select the method to be exposed in the web service and click the Finish button. We have now successfully created a web service out of our bean.
Step 11: Build the Enterprise Application project and deploy it on the server.
Step 12: We have successfully deployed our changes on the server. Now let's log in to WS Navigator to test the web service.
Step 13: Provide the username and password and log in to WS Navigator. Select the "Provider System" radio button, enter the service name, and click the Search button; we will get a list of the available services. Select our service and click the Next button.
Step 14: Provide the necessary inputs and click the Next button. If a matching record is found, we will get the desired output.
Quite often a requirement arises to monitor the activities of MDM: activities like a record being modified, a record rollback, record syndication, a repository status change, and a few other MDM events. Manually monitoring these activities is not feasible, so in such a scenario monitoring MDM activities through some other mechanism is needed.
Here the concept of listening to MDM events comes into the picture; we can call it an MDM listener. This can be achieved using an MDM Java API class, 'EventDispatcher'.
'EventDispatcher' is a Java class that helps to listen (i.e. raise an alert on a registered event) so that related processing can be done.
Simple MDM events like record checkout, record modified, record rolled back, and record syndicated can be listened for, as can events like repository loaded and repository unloaded.
This class is responsible for dispatching all events coming from one specific MDM server to all event listeners subscribed to this dispatcher. We just need to register the specific MDM server so that the dispatcher can handle the events it listens for.
To define the set of notifications available on the dispatcher, it should be registered to one or several groups of notifications: notifications indicating repository-level, server-level, or data-level activities.
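The dispatcher/listener relationship described above is the classic observer pattern. The following self-contained sketch mirrors its shape with illustrative class names; it is not the com.sap.mdm API itself, where subscription happens on the EventDispatcher and listeners extend the Abstract*Listener classes.

```java
import java.util.ArrayList;
import java.util.List;

public class MiniDispatcher {

    // Listener contract - the MDM API equivalents are the
    // AbstractServerListener / AbstractRepositoryListener /
    // AbstractDataListener classes whose methods you override.
    interface RepositoryListener {
        void repositoryStatusChanged(String status);
    }

    private final List<RepositoryListener> listeners = new ArrayList<>();

    // Subscribe a listener, analogous to registering it on the dispatcher.
    void addListener(RepositoryListener listener) {
        listeners.add(listener);
    }

    // Dispatch one event to every subscribed listener.
    void fireStatusChanged(String status) {
        for (RepositoryListener l : listeners) {
            l.repositoryStatusChanged(status);
        }
    }
}
```

The real EventDispatcher plays the role of `MiniDispatcher` here: you register listeners for the notification groups you care about, and it fans each incoming MDM event out to all of them.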
The steps for subscribing the listener and starting it are as follows:
Create a main class with a method that will hold the listener, and create another class that extends AbstractServerListener if we want to listen to server activities, AbstractRepositoryListener if we want to listen to repository activities, or AbstractDataListener if we want to listen to record-level activities.
If all three kinds of activities are to be listened to, then three different classes, each extending one of the above, should be declared.
These classes, which extend the required listener class, can override its methods for further processing.
In the example below we will create a main class 'EventListenerExample' with a method 'EventMethod'. The second class will be 'RepositoryEventListener', whose 'repositoryStatusChanged' method overrides the 'repositoryStatusChanged' method of AbstractRepositoryListener.
To run this, a few parameters are required: the server name, the repository name, and a user name authorized to access them.
1. If we want to listen to MDM repository events, the listener class would be as below.
With this, the listener is triggered, but the listener activity wouldn't continue for long. To keep the MDM listener running for some time, add a loop like the one below, which runs for 5 minutes; the time condition can be changed as per the requirement.
int count = 0;
while (count < 5 * 60) {
    try { Thread.sleep(1000); count++; }      // tick once per second
    catch (InterruptedException e) { break; } // stop waiting if interrupted
}
Now, with this code in place, a call to the method EventMethod will set up the listener, and the thread will continue for 5 minutes (since we keep the while loop running for 5 minutes). During this time, any change in repository status will be captured by the listener, and if any loggers are written in this method, their output can be seen.
Once the entire execution is done, an important step is to terminate the EventDispatcher instance, but only in standalone applications. Don't terminate the EventDispatcher if it runs in a Web AS, because the instance may be shared by other threads.
class RepositoryEventListener extends AbstractRepositoryListener {
    public void repositoryStatusChanged(RepositoryStatusEvent event) {
        // Print the repository status carried by the event
        System.out.println(event);
    }
}
This can now be written in a web module and triggered as a web service, or a call to this web module can be made from a Web Dynpro Java application.
Limitations of the 'EventDispatcher' MDM Java API:
MDM event notification is not a guaranteed-delivery system. In addition, there are known limitations and deficiencies; for example, AbstractDataListener.recordAdded is not triggered if a record is created through an import.
Another important limitation concerns the termination of the EventDispatcher instance: it shouldn't be terminated if it runs in a Web AS, because the instance may be shared by other threads; it should be terminated only in standalone applications.