We implemented our master data management solution using customized MDM Web Dynpro Components (WDCs), with SAP CE providing the user interface on top of MDM (7.1 SP6) and CE BPM (7.2 SP5) handling process orchestration. The solution was fit for purpose: it handled data integration, governance, cleansing and the maintenance of a single source of truth neatly, with the WDCs providing an easy-to-build UI. The reusable and generic nature of the components ensures independence from the underlying MDM schema; as a result, they are configurable and extendable.

 

Given the obvious advantages the MDM solution provided, the business came back with ever more diverse needs and started looking at SAP MDM as a tool for maintaining a variety of organizational data (flat, hierarchies etc.), not all of which was master/reference data.

 

Most of the requirements concerned the UI for data stored in MDM (e.g. enabling a hierarchical or tree-structure view on hierarchies, supporting drag and drop), with a few others relating to the MDM solution itself, e.g. check-in/check-out on hierarchies. Though MDM and the MDM WDCs could accommodate some of these requirements, the slightly trickier ones could not be addressed easily. As the requirements kept piling up, we felt SAP MDM was a bit limited in supporting them.

 

For example, many organizations keep huge amounts of organizational data in hierarchies. An organization looking for an MDM solution naturally expects support for all of its data, i.e. flat, hierarchies etc. We faced a few limitations in implementing hierarchies using SAP MDM. The situation demanded a tree-like user interface; though the WDCs attempt to provide that display, we felt it fell a bit short on user satisfaction. Hierarchies in MDM also come with limitations: they cannot be checked in/checked out, and they can be referenced only at leaf nodes (we could perhaps make up for this using internal leaf nodes, but at the expense of duplication). An add-on component like the WDCs, or more mature hierarchy support within the WDCs, would augur well here. Additionally, main-table-like features for hierarchies, i.e. check-in/check-out etc., would not only bring consistency but also widen the scope of master data support.

 

Another related area is what-if analysis. An organization undergoing restructuring, or other org-level changes such as merging or splitting business units, often needs to do a lot of what-if analysis. For example, in a scenario of merging business unit A with either B or C, management would want to know the likely cash flows for units B and C if merged with A, and would want that analysis before committing the data. We came across one such requirement, wherein the business was undergoing a restructuring activity and wanted to do what-if analysis on org data before committing it. As we couldn't find anything in the SAP MDM space to address it, we had to settle for the Oracle DRM tool with feeds from SAP MDM. It would have been much easier for customers if such support existed via WDC-like components; it would save them the hassle of evaluating other tools and provide a solution to adjoining needs under the same roof.

 

We also faced issues in distributing data from MDM. Once a master data solution is implemented, the data should be routinely distributed across functional groups to satisfy their need for master data and to preserve the integrity of the solution. The business may need regular, scheduled distribution of master data to other functional groups. Though the Syndicator gives us the option of automatic (monthly/weekly/daily) syndication of repository data, the schedule is too elementary to meet even slightly advanced distribution needs. For example, automatic daily syndication will keep syndicating repository data without considering weekdays/weekends; downstream systems that want to receive data only on weekdays may need to think twice. Of course, we could ask downstream systems to align with this behaviour, but such a small, natural extension should be available out of the box.

 

One may argue that none of these requirements is strictly master data related, but today enterprises want to get the most out of their data, and SAP MDM, perhaps through WDC-like components, should extend its support basket to cover such needs. After all, customers would love to have their master data requirements addressed on one platform rather than keep adding tools for each new requirement.

I came across a requirement in one of my assignments in which I was expected to write a web service to fetch data from MDM tables. Through this blog I would like to share the steps I followed to achieve this.


Requirement: Implement a web service to fetch data from MDM tables.

 

Solution:

Step 1: Create an EJB project.

EJBProject.png

Step 2: Create a Session Bean in the EJB project created in the previous step. To do so, right-click on the EJB module and select Session Bean from the menu.

Give the package name and bean class name, and check the "Remote" check box. Once done, click the Finish button.

We have successfully created a Session bean.

 

Step 3: Add the below dependencies to the EJB project. These are required to use the MDM Java APIs.

Dependencies.png

Step 4: Create an "Enterprise Application" project and add the EJB project created in Step1 in the Java EE dependency screen, click on finish.

At this point, all required projects and dependencies are created.

Step 5: Create a SYSTEM_USER in the MDM repository; we will use this user to connect to the MDM system.

Step 6: Open the Session bean created previously and declare a method in it. Add the code below to connect to MDM in the Session bean class.

Code.png
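Since the code from the screenshot above is not reproduced here, below is a minimal sketch of what such a connection method could look like using the command-based MDM Java API. Treat it as an illustration only: the class and package names follow the MDM 7.1 Java API and should be verified against the API version installed on your server, and the host, DBMS and credential values are placeholders you would normally read from configuration.

import com.sap.mdm.commands.AuthenticateUserSessionCommand;
import com.sap.mdm.commands.CreateUserSessionCommand;
import com.sap.mdm.commands.GetRepositoryRegionListCommand;
import com.sap.mdm.data.RegionProperties;
import com.sap.mdm.net.ConnectionPool;
import com.sap.mdm.net.ConnectionPoolFactory;
import com.sap.mdm.server.DBMSType;
import com.sap.mdm.server.RepositoryIdentifier;

// fields of the session bean, reused by the query in step 7
private ConnectionPool connections;
private String sessionId;

private void connectToMdm() throws Exception {
    // placeholder values; replace with your own server, DBMS instance and repository
    String mdmServerHost = "MDM_SERVER_HOST";
    String dbmsServer = "DBMS_SERVER_NAME";
    String repositoryName = "MY_REPOSITORY";

    connections = ConnectionPoolFactory.getInstance(mdmServerHost);
    RepositoryIdentifier repository =
            new RepositoryIdentifier(repositoryName, dbmsServer, DBMSType.MS_SQL);

    // a repository region (language) is required to create a user session
    GetRepositoryRegionListCommand regionCmd = new GetRepositoryRegionListCommand(connections);
    regionCmd.setRepositoryIdentifier(repository);
    regionCmd.execute();
    RegionProperties region = regionCmd.getRegions()[0];

    CreateUserSessionCommand createCmd = new CreateUserSessionCommand(connections);
    createCmd.setRepositoryIdentifier(repository);
    createCmd.setDataRegion(region);
    createCmd.execute();
    sessionId = createCmd.getUserSession();

    // authenticate with the SYSTEM_USER created in step 5 (password is a placeholder)
    AuthenticateUserSessionCommand authCmd = new AuthenticateUserSessionCommand(connections);
    authCmd.setSession(sessionId);
    authCmd.setUserName("SYSTEM_USER");
    authCmd.setUserPassword("password");
    authCmd.execute();
}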

Step 7: We successfully connected to the MDM repository in the above step. Now we have to query the MDM table to fetch the desired records.

My requirement was to query the "Sites" table based on "Country" and "Organization", which are provided as input to the web service.

Below is the code I used to fetch the records. This piece of code can be modified based on individual requirements.

  Code1.png
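As the original query code is also only available as a screenshot, the sketch below shows roughly how such a query could be built with the search commands of the MDM Java API, reusing the connection and session from step 6. It assumes "Sites", "Country" and "Organization" are the table and field codes of the repository (codes can differ from display names, so verify them in the MDM Console), and the class and package names should again be checked against your installed API version.

import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.data.ResultDefinition;
import com.sap.mdm.data.commands.RetrieveLimitedRecordsCommand;
import com.sap.mdm.ids.FieldId;
import com.sap.mdm.ids.TableId;
import com.sap.mdm.schema.RepositorySchema;
import com.sap.mdm.schema.commands.GetRepositorySchemaCommand;
import com.sap.mdm.search.FieldSearchDimension;
import com.sap.mdm.search.Search;
import com.sap.mdm.search.TextSearchConstraint;

private RecordResultSet findSites(String country, String organization) throws Exception {
    // read the repository schema to resolve table and field codes into ids
    GetRepositorySchemaCommand schemaCmd = new GetRepositorySchemaCommand(connections);
    schemaCmd.setSession(sessionId);
    schemaCmd.execute();
    RepositorySchema schema = schemaCmd.getRepositorySchema();

    TableId sitesTable = schema.getTableId("Sites");
    FieldId countryField = schema.getFieldId("Sites", "Country");
    FieldId orgField = schema.getFieldId("Sites", "Organization");

    // select the fields the web service should return
    ResultDefinition result = new ResultDefinition(sitesTable);
    result.setSelectFields(new FieldId[] { countryField, orgField });

    // restrict the search to the given country and organization
    Search search = new Search(sitesTable);
    search.addSearchItem(new FieldSearchDimension(countryField),
            new TextSearchConstraint(country, TextSearchConstraint.EQUALS));
    search.addSearchItem(new FieldSearchDimension(orgField),
            new TextSearchConstraint(organization, TextSearchConstraint.EQUALS));

    RetrieveLimitedRecordsCommand recordsCmd = new RetrieveLimitedRecordsCommand(connections);
    recordsCmd.setSession(sessionId);
    recordsCmd.setResultDefinition(result);
    recordsCmd.setSearch(search);
    recordsCmd.execute();
    return recordsCmd.getRecords();
}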

Step 8: Loop through the result set and set the required output fields on the return variable of our method, which will be the output of the web service (a rough sketch follows).
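A rough sketch of such a loop, building on the result set returned by the query above (the field ids are the ones resolved in step 7; the values are rendered with toString() for simplicity and can be cast to the concrete MdmValue subtypes if needed):

import java.util.ArrayList;
import java.util.List;
import com.sap.mdm.data.Record;
import com.sap.mdm.data.RecordResultSet;
import com.sap.mdm.ids.FieldId;

private List<String> buildResponse(RecordResultSet records, FieldId countryField, FieldId orgField) {
    List<String> response = new ArrayList<String>();
    for (int i = 0; i < records.getCount(); i++) {
        Record record = records.getRecord(i);
        // collect the display values of the selected fields into the web service output
        response.add(record.getFieldValue(countryField).toString() + " / "
                + record.getFieldValue(orgField).toString());
    }
    return response;
}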

Step 9: Copy the method signature that we created in the Session bean and paste it into the "Remote" interface class that was created. This is required to create our web service (a minimal sketch follows).
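For reference, the remote interface could then look like the minimal EJB 3.0 sketch below; the interface and method names are placeholders that simply mirror my example bean method.

import java.util.List;
import javax.ejb.Remote;

@Remote
public interface SiteQueryRemote {

    // same signature as the session bean method that queries the Sites table
    List<String> getSites(String country, String organization);
}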

Step 10: Let's expose our Session bean as a web service. To do so, right-click on the Session bean --> Web Service --> Create Web Service.

Check the "Publish the Web serice" check box and click next. In the following screen, select "create New Interface" option  and give a name to Interface.

Select the method to be exposed in the web service and click the Finish button. We have now successfully created a web service out of our bean.

 

Step 11: Build the Enterprise Application project and deploy it on the server.

 

Step 12: We have successfully deployed our changes on the server. Now let's log into the WS Navigator to test the web service.

WDNavigator.png

Step 13: Provide the username and password and log into WS Navigator. Select the Provider System radio button, enter the service name and click the Search button. We will get the list of available services. Select our service and click the Next button.
WS1.png

Step 14: Provide the necessary inputs and click the Next button. If a matching record is found, we will get the desired output.

 

I hope this will be useful.

Regards,

Pavan

Many a time a requirement arises wherein we have to monitor the activities of MDM: activities like a record being modified, record rollback, record syndication, repository status changes and a few other MDM events. Manually monitoring these activities is not possible, so in such a scenario monitoring MDM activities through some other mechanism is more feasible.

 

This is where the concept of listening to MDM events comes into the picture. We can call it an MDM listener activity. This can be achieved using the MDM Java API class 'EventDispatcher'.

 

'EventDispatcher' is a Java class that helps to listen (i.e. raise an alert on a registered event) so that related processing can be done.

Simple MDM events like record checkout, record modified, record rolled back and record syndicated can be listened to. Events like repository loaded and repository unloaded can also be listened to.

 

This class is responsible for dispatching all events coming from one specific MDM server to all event listeners subscribed to this dispatcher. We just need to register the specific MDM server so that the dispatcher can handle the events it listens to.

 

To define the set of notifications available on the dispatcher, it should be registered to one or several groups of notifications: notifications which indicate repository-level, server-level or data-level activities.

 

The steps for subscribing the listener and starting it are as follows:

 

Create a main class with a method which will hold the listener, and create another class which extends the AbstractServerListener class if we want to listen to server activities, the AbstractRepositoryListener class if we want to listen to repository activities, or the AbstractDataListener class if we want to listen to record-level activities.

 

If all three kinds of activities are to be listened to, then three different classes, each extending one of the above listener classes, should be declared.

 

These classes, which extend the required listener class, can override its methods for further processing.

 

In the example below we will create a main class 'EventListenerExample' with a method 'EventMethod'. The second class, extending the listener class, will be RepositoryEventListener with a method 'repositoryStatusChanged', which overrides the 'repositoryStatusChanged' method of the AbstractRepositoryListener class.

To run this, a few parameters are required, i.e. the server name, the repository name, and a user name authorised to access them.

 

1. If we want to listen to MDM repository events, the listener class would be as below:

 

import com.sap.mdm.notification.event.RepositoryStatusEvent;

import com.sap.mdm.notification.AbstractRepositoryListener;

 

class RepositoryEventListener extends AbstractRepositoryListener

{

     public void repositoryStatusChanged(RepositoryStatusEvent event)

      {

          //Print repository status

      }

}

 

2. In the main class, in the method 'EventMethod', create an instance of EventDispatcher as below and set the server name:

      EventDispatcher eventDispatcher = EventDispatcherManager.getInstance().getEventDispatcher(serverName);


3. Add listeners to the EventDispatcher.

     For this, we first need to initialize the listeners. To initialize a listener, create an instance of the class which extends the required listener class, as below.

 

    RepositoryEventListener repositoryListener = new RepositoryEventListener(); // Creating instance of Listener class

 

4. Once the listener instance is created, add the listener object initialized above to the EventDispatcher so that these events can be captured.

 

     eventDispatcher.addListener(repositoryListener);

 

5. Register (subscribe) for all types of notifications for the event:

 

RepositoryEventRegistrationSet reposRegSet = new RepositoryEventRegistrationSet(); 

reposRegSet.addEventType(RepositoryEventRegistrationSet.IMPORT_EVENTS); 

reposRegSet.addEventType(RepositoryEventRegistrationSet.SYNDICATION_EVENTS); 

reposRegSet.addEventType(RepositoryEventRegistrationSet.ALL_REPOSITORY_EVENTS);

 

RepositorySessionContext repSessionContxt = new RepositorySessionContext(serverName,repositoryName,userName);

 

eventDispatcher.registerRepositoryNotifications(reposRegSet, repSessionContxt,"");


With this the listener is triggered, but the listener activity wouldn't continue for long on its own. To keep the MDM listener running for some time, add a loop as below, which runs for 5 minutes; the time condition can be changed as per requirement. The listener will then continue for 5 minutes (note the counter variable declared before the loop):

 

int count = 0;

while (count < 5 * 60)

{

   try

   {

      Thread.sleep(1000);

   }

   catch (InterruptedException e)

  {

      e.printStackTrace();

   }                                    

  count++;

}

 

 

So, now with the code in place, a call to the method EventMethod will set up the listener, and the thread will continue for 5 minutes (since we have kept the while loop running for 5 minutes). In the meanwhile, any change in the repository status will be captured by the listener, and if any loggers are written in this method their output can be seen.

 

Once the entire execution is done, one important step is to terminate the EventDispatcher instance, but only in standalone applications. Don't terminate the EventDispatcher if it runs in a Web AS, because the instance may be shared by other threads.

 

EventDispatcherManager.getInstance().terminateEventDispatcher(serverName);

 

 

Now, the complete code would look like below

 

// note: requires imports from the MDM Java API packages (e.g. com.sap.mdm.session and com.sap.mdm.notification)
// in addition to the listener imports shown above
public class EventListenerExample

{

public void EventMethod (String para_serverName , String para_repositoryName , String para_userName ) {

 

     UserSessionContext userSessionContext;

 

     SessionManager sessionManager;

 

     userSessionContext = new UserSessionContext(para_serverName,para_repositoryName, "" ,para_userName);

 

     userSessionContext.setTrustedConnection(true);

 

     sessionManager = SessionManager.getInstance(); 

     sessionManager.createSession(userSessionContext,SessionTypes.USER_SESSION_TYPE,"");

     EventDispatcher eventDispatcher = EventDispatcherManager.getInstance().getEventDispatcher(para_serverName);

 

     RepositoryEventListener repositoryListener = new RepositoryEventListener();

     eventDispatcher.addListener(repositoryListener);

 

      RepositoryEventRegistrationSet reposRegSet = new RepositoryEventRegistrationSet(); 

     reposRegSet.addEventType(RepositoryEventRegistrationSet.IMPORT_EVENTS); 

     reposRegSet.addEventType(RepositoryEventRegistrationSet.SYNDICATION_EVENTS); 

     reposRegSet.addEventType(RepositoryEventRegistrationSet.ALL_REPOSITORY_EVENTS);

 

     RepositorySessionContext repSessionContxt = new RepositorySessionContext(para_serverName,para_repositoryName,para_userName); 

 

     eventDispatcher.registerRepositoryNotifications(reposRegSet, repSessionContxt,"");

 

     int count = 0;
     while (count < 5 * 60)

     {

        try

        {

           Thread.sleep(1000);

        }

        catch (InterruptedException e)

       {

           e.printStackTrace();

        }                                    

       count++;

     }

 

     EventDispatcherManager.getInstance().terminateEventDispatcher(para_serverName);

     sessionManager.destroySessions(userSessionContext);

 

     }

}

 

 

class RepositoryEventListener extends AbstractRepositoryListener

{

     public void repositoryStatusChanged(RepositoryStatusEvent event)

      {

          //Print repository status

      }

}

 

 

This can now be written in a web module and further triggered as a web service, or a call to this web module can be made from a Web Dynpro Java application; a simple servlet sketch follows.
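As one simple illustration of the web module option, a plain servlet (mapped in web.xml) could trigger the listener. The class name, URL mapping and parameter values below are placeholders, and note that the call blocks for the duration of the while loop described above.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class StartListenerServlet extends HttpServlet {

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // server, repository and user would normally come from configuration
        new EventListenerExample().EventMethod("MDM_SERVER", "MY_REPOSITORY", "LISTENER_USER");
        response.getWriter().write("Listener run finished");
    }
}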

 

Limitations of the 'EventDispatcher' MDM Java API:

 

MDM event notification is not a guaranteed delivery system. In addition, there are known limitations and deficiencies. For example, AbstractDataListener.recordAdded is not triggered if a record is created through import.

 

Another important limitation concerns the termination of the EventDispatcher instance. The EventDispatcher instance shouldn't be terminated if it runs in a Web AS, because the instance may be shared by other threads; it should be terminated only in standalone applications.

Hi All,

 

I am sharing my knowledge to explain the process of creating a repository in SAP MDM. This blog also gives a solution to James for the following discussion: How to add repository in MDM data manger.

 

 

  1. Install the MDM server components (MDS, MDIS, MDSS, MDLS) and the DBMS server (database).
  2. Now start the MDM server and the auxiliary servers (MDIS, MDSS and MDLS) as mentioned in the steps below.
  3. Control Panel\System and Security\Administrative Tools > Services > MDM Server 7.1
  4. Follow the above step to start the auxiliary servers as well.
  5. Now install the MDM clients (Console, Data Manager, Import Manager and Syndicator).
  6. After installing the clients, open MDM Console.
  7. Now mount the MDM server in the Console.
  8. Note that the server has to be in running status.
  9. Even though the server is now running, this does not mean you can create a repository on the MDM server yet; for this you need to configure the DBMS settings.
  10. Right-click on the server, choose DBMS Settings and give the appropriate DBMS server name and the password of the DBMS server.
  11. The above step establishes a connection between the MDM server and the DBMS server.
  12. Now right-click on the MDM server and choose the option Create Repository.
  13. Here you need to give the DBMS server name and password again; until then the repository name and port fields remain disabled.
  14. Once the DBMS name and password are verified, the repository name and port number fields will be enabled.
  15. Now you can type the preferred repository name and port number, and with this you will get a new repository in SAP MDM.
  16. Once the repository is displayed under the MDM server hierarchy node, it will by default be in stopped status.
  17. Now you can design your repository as per your requirements, and once you feel the repository is ready for use, right-click on the repository and choose Start Repository.
  18. You will get two options: Immediate and Update Indices.
  19. It is good practice to choose Update Indices.
  20. The repository, as designed, will now be reflected in the other clients.
  21. If you need to change or update the structure of the repository again, go to the Console, stop (unload) the repository and perform the action.
  22. Stopping the repository is not necessary for any type of addition in the Admin node (Users, Roles, Ports, Remote Systems, XML Schemas, Links, etc.).

Most of the above steps cover some Basis activities; I just added them for knowledge sharing.

In the following article I would like to present a very simple integration between two SAP products, SAP NetWeaver Master Data Management (MDM) and SAP Lumira. The goal of the integration is to easily generate analytic visual reports of the master data content.

Preface

SAP NetWeaver MDM customers enjoy many features and abilities of the product including content consolidation, master data harmonization, and many others.
However, a customer may need to analyze the master data for any of the following reasons:

 

  • To transform the raw master data into visual information for business purposes
  • To gather statistical and quantitative analysis of the master data
  • To perform data aggregation according to certain criteria (e.g. Supplier, Region) and present the results in a visual manner

 

The integration between the two products can be done by connecting SAP Lumira to MDM using the MDM DB views.
image002.png

MDM DB Views

The ability to generate MDM DB views was officially added to the MDM product in the MDM 7.1 SP10 release. DB views provide the customer with a real-time data representation of the MDM content via SQL views. An SQL view is generated for each MDM table by using a simple CLIX command.

 

The views are read-only.

 

SAP Lumira

SAP Lumira is a data manipulation and visualization tool. Users can connect to various data sources, select and clean data, and manipulate and visualize data using a diverse set of graphical charts and tables.

 

In this article I will describe how to visualize the MDM data using SAP Lumira.

 

A real life scenario

To demonstrate the capabilities of visualizing MDM data on SAP Lumira I’ll use a simple product repository.

 

I’ve already got data in the repository but I’ve been requested to provide analysis reports on the status of the repository’s products to upper management.

 

The request is:

 

  • Provide a visual report on the number of products per manufacturer.
  • Provide a visual report on the number of products per customer country.

 

In my repository I have the fields ‘Part Number’, ‘Manufacturer’ and ‘Customer_Country’ in the main table ‘Products’ of the repository.

 

‘Part Number’ uniquely identifies each product in the repository. ‘Manufacturer’ and ‘Customer_Country’ are the fields that I use to generate the reports.

 

Without the integration between SAP Lumira and NW MDM I’d have to query the data myself using MDM Data Manager or the MDM APIs, and then summarize and formalize the results, processes that could take hours and could lead to mistakes of data changed or incorrectly copied during one of the steps.

 

The integration of SAP Lumira with SAP MDM will generate the two reports in a matter of minutes.

 

SAP Lumira will query the data directly from the database using the MDM DB views in 3 easy steps.

 

Step 1: Generating MDM DB Views

MDM DB views are generated using the CLIX command, RepViewsGenerate.

 

This command generates SQL views for the MDM data tables; each MDM table will be represented by a single view, with all its display fields as the SQL view fields.

 

The syntax of the command is:

 

RepViewsGenerate <ServerName> <RepositoryName>;<DB_Name>;<DB_Type_Letter>;<DB_UserName>;<DB_Password> <Repository_UserName>:<Repository_Password> "*"

 

In my example my repository name is MyRepository, it is mounted on an MS SQL server called MYMSSQL and my MDM server name is MYSERVER so I’ve used the following syntax:

 

CLIX RepViewsGenerate MYSERVER MyRepository;MYMSSQL;S;sa;pass Admin:adminpass "*"

Once I’ve executed the command the MDM DB views will be generated in my database, and each table is now represented by a single view.

 

My assignment is to generate analytic reports on the main table Products, so I’ll use the corresponding view M_Products_1_0.
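Just to show that these generated views are ordinary, read-only SQL views, the small JDBC sketch below counts products per manufacturer directly against the view. The JDBC URL, credentials and database name (MyRepository_M000 in my example) are placeholders, and the view's column names follow the display field names of the MDM table, so verify them in your own database before trying anything like this.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class MdmViewCheck {
    public static void main(String[] args) throws Exception {
        // placeholder connection details for the MS SQL database hosting the repository
        String url = "jdbc:sqlserver://MYMSSQL:1433;databaseName=MyRepository_M000";
        Connection connection = DriverManager.getConnection(url, "sa", "pass");
        Statement statement = connection.createStatement();
        // group the main table view by manufacturer, mirroring the first Lumira report
        ResultSet resultSet = statement.executeQuery(
                "SELECT [Manufacturer], COUNT(*) AS ProductCount "
                + "FROM M_Products_1_0 GROUP BY [Manufacturer]");
        while (resultSet.next()) {
            System.out.println(resultSet.getString(1) + ": " + resultSet.getInt(2));
        }
        resultSet.close();
        statement.close();
        connection.close();
    }
}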

 

Step 2: Connecting to the MDM DB Views with SAP Lumira

Configuring a JDBC driver for SAP Lumira

 

Once I’ve generated the views my next step is to connect to the DBMS using SAP Lumira using the following 2 components:

 

  • SAP Lumira Desktop Standard  Edition
  • A JDBC driver for my DBMS

 

I’m using MS SQL server 2008 so I’ve used Microsoft SQL Server JDBC Driver 3.0.

 

Once I’ve configured the JDBC driver in SAP Lumira (File -> Preferences -> FreeHand SQL) and restarted the application, I’m ready to connect to the MDM main table view.

 

Connecting to the MDM Main Table View

 

Connecting to the MDM main table view is done in the following way:
  • I start a new SAP Lumira document.
  • In the Data Source selection screen, I select FreeHand SQL.
image003.png
  • I select my DBMS type. The connection icon should be green if the JDBC driver is configured correctly.
  • I enter the DBMS login parameters and click ‘Validate connection’. After the validation has passed successfully, I click ‘Acquire’.

D.png

  • If the validation step failed, the issue might be wrong login details (Server Name, Server Port, Username and Password) or a problem with the JDBC driver or network.
  • The next dialog box displays a ‘query’ box, but there is no need to write a SQL query to get the data. I expand the catalog in the upper left corner and look for my repository name with an ‘_MXXX’ suffix (usually _M000).
image011.png
  • Once I find my repository database, I expand it and look for the view I need (in my example the data resides in the view M_Products_1_0) and double-click the name of the view.
  • I click ‘Preview Data’ to fetch a preview of the main table data.
image013.png
  • The menu below shows the column selection for columns that will be used by SAP Lumira. The default selection will fetch the data of all columns of the MDM main table into SAP Lumira; however, it is unnecessary for me to fetch all of them since I only require the 3 fields I mentioned earlier, ‘Part Number’, ‘Manufacturer’, and ‘Customer_Country’.
  • I select the 3 columns that I need and click ‘Acquire’ and continue to design my visual report.
image015.png

Step 3: Preparing the Reports in SAP Lumira

In the SAP Lumira’s preparation screen I select the ‘Visualize’ view.
image017.png
In the view I can see my 3 fields in the ‘Attribute’ column.
image019.png
I select the field ‘Part Number’ so that I can uniquely identify each product in my repository. The next step is to define this field as a measure, which means that this field will provide the data units for the charts and graphs that I’ll create.
I right-click the attribute and choose ‘Create a measure’.
image021.png
I can now complete the first task I was requested to do: Provide a visual report on the number of products per manufacturer.

A visual report on the number of products per manufacturer

This is a rather simple visual report and SAP Lumira can present it in several different charts and graphs.
I start with a simple pie chart:
  • I click on the ‘Pies’ icon’s drop-down menu and select ‘Pie Chart’.
image023.png
  • I drag the new ‘Part Number’ measure I defined into the ‘Pie Sectors’ box under ‘Measures’ and drag the ‘Manufacturer’ attribute into the ‘Legend Color’ under ‘Dimensions’.

A.png

That's it! A pie chart with the number of products per manufacturer is generated!
image029.png
I can see from the chart that most of my products are manufactured by SAP AG, Apple Inc and Microsoft Corporation.
But that's not all. I can now switch between different visual representation types by clicking the relevant icon and decide which way I’d like to present my data.
For example, the same data is shown below in 3D column bars (available under the ‘Bars’ icon) and in a tag cloud (available under the ‘Others’ icon).
image031.png
image033.png

A visual report on the number of products per customer country

My next task is to provide a visual report on the number of products per customer country.
Now, I can use the exact same method I used in the previous example and simply replace the ‘Manufacturer’ attribute with the ‘Customer_Country’ attribute and it will work, but I’d like to use an additional way to present the data for this report.
Although I acquired 3 fields from my main table view as attributes, there is a different icon next to the ‘Customer_Country’ attribute, which indicates that SAP Lumira detected that this attribute contains geographical data and can be displayed in geographic format.
image035.png

In order for me to display the attribute in geographic format I need to create a geographic hierarchy in the following way:

 

  • I right-click the ‘Customer_Country’ attribute and select ‘Create a geographic hierarchy’.

image037.png

  • Two options are presented, ‘By Name’ and ‘By Latitude/Longitude’. Since the data is country names, I select the first option.
  • The next window presents the types of geographical locations SAP Lumira can use to present the data. My data is countries, so I’ll select the ‘Country’ option. I can also choose between presenting all the values or only those values to which SAP Lumira could  match a country.

image040.png

  • SAP Lumira will now match the data in my repository with the known countries. All the values in my repository were matched to existing countries.

image042.png

  • Once the matching is completed, a new geographic hierarchy called ‘Country’ is created and added in the Hierarchies menu.

image044.png

  • Now I select ‘Geo choropleth chart’ and in it I place the following values:
    • I drag the ‘Part Number’ measure into the ‘Value’ box under ‘Measures’ and  drag the new geographic hierarchy ‘Country’ into the ‘Geography’ box under ‘Dimensions’.

B.png

The result is a map that shows the number of products per customer country. I can clearly see that my products are sold mainly in the US and EMEA. I might need to improve my product availability in the other countries.

 

image050.png

Combining the two reports


I completed my two main tasks and sent my reports to my manager. I was then asked if there is a way to combine the two reports and create a visual report on the number of products per customer country and for each country to present the number of products per manufacturer.

Using the measures, attributes, and geographic hierarchies I created, I can accomplish the task easily. I’ll show two ways to present this report.

 

Geographic

This scenario is almost identical to the Geo choropleth chart that I created earlier; the difference is that instead of creating a Geo choropleth chart I select ‘Geo Pie chart’. The second change is that I drag the ‘Manufacturer’ attribute into the ‘Overlay Data’ box under ‘Dimensions’.

image052.png

That's it! I now have a Geo Pie chart in which each country also contains a separate pie chart for the manufacturers of the products sold in that country.

I can now recognize trends in my repository; for example, most of the products bought in Asia are from the manufacturer SAP AG and most of the products bought in Europe are from the manufacturer Apple.

image054.png

Tree Map

The second option to present this report is in the ‘Maps’ visualization. I use the ‘Tree map’ option and I drag the ‘Part Number’ into the ‘Area Weight’ box and the  ‘Country’ geographic hierarchy and ‘Manufacturer’ attributes into the ‘Area Name’ box under ‘Dimensions’.

image056.png

 

This displays the same results as the previous chart. In my opinion this method is less attractive, but it can provide information easily if a drill-down into the data is needed.

image058.png

Row Data Statistics

SAP Lumira can present statistics on the master data as row data in tables. This is useful when the exact data count is needed. In the following example I can extract the exact number of manufacturers and customer countries per product.

 

I switch to the ‘Data’ view and select the ‘Facets’ view.

C.png

Next I need to choose the measure according to which the facet tables will be created; I select the ‘Part Number’ measure.

image064.png

Facet tables are now created for the other two attributes. I can now see the exact number of manufacturers and customer countries per product:

image066.png

I can now filter the results further by double clicking the values in the facet tables; for example, if I click ‘SAP AG’ I can see the division of customer countries for this specific vendor only:

image067.png

 

 

Reusing the reports

An important subject I want to elaborate on is the reusability of the reports. Once I have saved my reports, I can reuse them as long as the schema of my repository is not changed. All I need to do is to run the ‘Refresh document’ command from the ‘Data’ menu, and SAP Lumira will fetch the new data from my database and update the report automatically.

 

Publishing the results

Once I complete all the reports and save them, I can share them in several ways. The first step is to switch from the prepared view to the shared view.

It will display all the options to share the data sets and visualizations I created:

image070.png

Visualizations can be published via emails or on SAP StreamWorks.

 

Datasets can also be published via SAP HANA, SAP BusinessObjects Explorer, Lumira Cloud or a csv file.

 

Summary

In this article I’ve shown a small portion of what can be done by combining SAP NW MDM and SAP Lumira. My examples were on a relatively simple data set, but this doesn’t mean that complex data sets can’t be extracted; for example, several join queries on the MDM DB views can provide complex data sets which include MDM hierarchies, tuples, etc.

 

More information on SAP Lumira can be found in the SAP Lumira user guide at:

 

http://help.sap.com/lumira

 

and at:

 

http://www.saphana.com/community/learn/solutions/sap-lumira

 

SAP Lumira

 

More information on DB views can be found in the MDM 7.1 SP10 Console Reference Guide in the section "Generating and Deleting MDM DB Views".

 

http://help.sap.com/nwmdm

 

SAP NetWeaver Master Data Management

 

Enjoy,

 

Tal Shnaiderman

MDM Development

 

SAP NW MDM provides us with a beautiful feature in the form of remote keys, which serve as cross references between systems.

Many a time duplicate or wrong entries get created, causing a lot of confusion. This effect is more pronounced when we have many remote systems and keys. In the following blog I have illustrated a small workaround to get rid of such wrongly created remote key values so that we can populate them correctly. The following snapshots illustrate the steps involved:

 

1. In the current scenario we have a lookup table with wrong entries in the remote key section, as shown below. Our requirement here is to remove these entries without causing multiple unwanted entries in the remote key section of the record.

correct3.jpg

Export this data along with remote keys.

 

2. Create a dummy entry in this lookup table:

CORRECTION.jpg

 

3. Edit the exported data and set the description to "DUMMY" so that all the entries have the same value for the matching field as the dummy record created in step 2.

 

In our case we have modified the description data and set it as "DUMMY" for all records.

C4.jpg

 

4. Connect to the Import Manager and use Description as the matching field, as it is set to "DUMMY" for all source records and the one destination record.

Map the matching field and the Remote Key. Set the import action to "Update NULL".

5.jpg

6.jpg

 

5. Check the remote keys of the records after step 4:

 

8.jpg

9.jpg

6. Delete the DUMMY record.

 

7. Re-import the data, mapping the correct remote keys.

 

This way we can get rid of multiple erroneous or wrongly created entries in the remote keys.

 

Thanks,

Ravi

In the following blog I have tried to detail my understanding of the subject. For clearer understanding I have kept it in a question-and-answer format. Please feel free to add to it. Here we start.

 

Big data and MDM seem to have a significant connection, but for now the connection is still very unclear. At first thought they seem like an odd pair with a great contrast between them.

 

MDM can be defined as a set of tools, procedures, and policies to govern, create and maintain trusted data. This data serves as the very base of business transactions and hence is often referred to as the "DNA of a business".

 

Big Data, on the other hand, comprises environments composed of huge volumes of data coming from a variety of sources. It is the overall umbrella of social media data, unstructured documents, streaming data from instrumented devices, and more. Unlike MDM, its main characteristic is not trust.

In Gartner’s words, “Big data is high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making.”

 

So how are the two related?

 

Big data provides high-volume data, and this high-volume, high-velocity data needs to be processed and refined to arrive at valuable information. And here lies the link: the MDM hub can keep a traditional 'golden record' of trusted information along with a less-trusted view of the same person or product based on findings from big data. When you combine the traditional 'golden record' with new information found in your big data, the superset of information can power even better business insights and business decisions that were not possible before. The combined view can provide a more insightful, complete view, but the two can still be presented separately in cases where the business can't afford to base decisions on the less-trusted view. So big data is supported by MDM: it tells a business what a predictive analysis or classification means for its customers, vendors and more.

 

How to arrive at valuable information?

 

To achieve this, a business would have to mine the big data. In other words, it would involve exploring and analyzing large amounts of data to find patterns. The techniques came out of the fields of statistics and artificial intelligence (AI), with a bit of database management thrown into the mix. Generally, the goal of data mining is either classification or prediction. In classification, the idea is to sort data into groups; for example, a marketer might be interested in the characteristics of those who responded to a promotion versus those who didn't, and based on such information he can change his strategy to target the group of responders. In addition, there is software enabling a process called opinion mining, a type of natural language processing for tracking the mood of the public about a particular product. Also called sentiment analysis, it involves building a system to collect and examine opinions about the product made in blog posts, comments, reviews or tweets. Automated opinion mining often uses machine learning, a component of artificial intelligence (AI).

 

Together, big data and MDM can help extract insight from the increasing volume, velocity and variety of data and map it to the trusted data, in context, beyond what was previously possible. This could lead to creating master data entities, loading new profile information into the MDM system, sharing master data records or entities with the big data platform as the basis for big data analysis, and much more.

 

Example: let us take the example of a health care system. Here the master data could be patients, providers, payers, households, employees, reference data etc. To have a complete view of a patient, we can combine data from hospitals, doctors, the web, health insurance, pharmacies etc. One can leverage predictive analysis to find patients who are at risk of readmission, leverage classification to measure and identify causes of adverse events, assist in drug discovery, help insurance companies devise personalized programs, and so on.

 

How do innovations in the existing MDM portfolio enable this connection?

 

SAP is employing a multipronged approach. Firstly, the current master data solutions will continue to evolve in line with the business, addressing its needs. Secondly, SAP is leveraging SAP HANA as the common data platform for SAP enterprise MDM, which will allow businesses to handle huge volumes of data in a large-scale master data hub. Additionally, the Master Data Services package from SAP and BusinessObjects enables customers to improve business process efficiency, effectiveness, and responsiveness, and to make better informed decisions based on high-quality, accurate master data.

 

I would like to conclude by saying that to have MDM in place is to already have a strong foundation for big data; together they can provide a vantage point for the business.

In this part of the blog, I am going to explain the steps to download a specific SAP NW MDM component from SMP (Service Market Place).

 

To download a specific SAP NW MDM component, follow the steps mentioned below:


  1. Go to: https://websmp203.sap-ag.de/swdc
    Note: To download any content from service market place, SMP login is required.
  2. Navigate to SAP Software Download Center --> Support Packages and Patches --> A-Z Index sub menu and click on ‘M’ alphabet under Support Packages and Patches tab:
    11.JPG

  3. Select SAP MDM --> SAP MDM 5.5 OR SAP NETWEAVER MDM 7.1.
    12.JPG

  4. Click on Entry by Component link and then click on any component e.g. Import Clients:
    23.JPG

  5. Navigate to installation of the component as per your operating system e.g. click on MDM Import Manager 7.1 --> Win32 link and click on the desired version e.g. MDMIM7008_40_200004524.zip under Downloads tab:
    24.JPG

6. A window will be displayed, which might ask you to re-enter your SMP login credentials. Once the SMP login credentials are re-entered, a pop-up will offer the selected file for download. Select the Save File radio button and click the OK button to download the file to your local machine:
    25.JPG

 

Note:

  1. The user must have download rights to download any content from the Service Market Place.
  2. If the user is downloading a specific SAP NW MDM component as explained above, the Download Basket is not required, but it might sometimes be required while downloading MDS, the Installation Master and other servers.

 

Thank you for reading, hope this will help you..!!

In this part of the blog, I am going to explain the steps to download a specific SAP NW MDM Support Package (SP) Stack from SMP (Service Market Place).

To download a specific SAP NW MDM SP Stack, follow the steps mentioned below:


  1. Go to: https://websmp203.sap-ag.de/swdc
    Note: To download any content from service market place, SMP login is required.

  2. Navigate to SAP Software Download Center --> Support Packages and Patches --> A-Z Index sub menu and click on ‘M’ alphabet under Support Packages and Patches tab:
    11.JPG

  3. Select SAP MDM --> SAP MDM 5.5 OR SAP NETWEAVER MDM 7.1.
    12.JPG

  4. Click on Support Package Stack Download link:
    13.JPG
  5. A new Support Package Stack Download window will open, which will ask you to complete a few steps to download the SP stack. Select your current and target SP stacks in the Start SP Stack and Target SP Stack drop-down list fields and press the Next Step button. E.g. I selected the current stack as SAP NW MDM 7.1 SP06 and the target stack as SAP NW MDM 7.1 SP08:
    14.JPG

  6. Select the installation, as per your operating system, for all components and click the Next Step button:
    15.JPG
    Note: A warning message will be displayed at the top of the window if the installation has not been selected for any component.
    16.JPG
    Click the Previous Step button, select the appropriate installation for the component and then click the Next Step button.

  7. All components of the selected SP stack will be available to download. The user can select all or only the required components of the selected SP stack and click the Add to Download Basket button.
    17.JPG

 

 

As soon as the user clicks the Add to Download Basket button, the selected SAP NW MDM content will be added to the Download Basket for download.

 

I have explained how a user can download a specific SAP NW MDM component in the Part 3 - Download Specific SAP NW MDM Component blog…!!

I have seen a number of threads on the topic "Where/How to download SAP NW MDM content". So, I decided to write a blog on how to download SAP NW MDM content from SMP (Service Market Place) in detail, to make it easy for everyone. Usually, this activity is taken care of by the Basis team in most organizations.


There are three ways to download SAP NW MDM content. A user can download the complete SAP NW MDM content, an SP stack, or just the required component from SMP:

  1. Download Complete SAP NW MDM Content
  2. Download Specific SAP NW MDM SP (Support Package) Stack
  3. Download Specific SAP NW MDM Component

 

To download the complete SAP NW MDM content, follow the steps mentioned below:

  1. Go to: https://websmp203.sap-ag.de/swdc
    Note: To download any content from service market place, SMP login is required.

  2. Navigate to SAP Software Download Center --> Installation and Upgrades --> A-Z Index sub hierarchy menu and click on ‘M’ alphabet under Installation and Upgrades tab:
    1.JPG
  3. Select SAP MDM --> SAP MDM 5.5 OR SAP NETWEAVER MDM 7.1 as per your need:
    2.JPG

  4. Click on the Installation link. Select all the content check boxes and click on the Add to Download Basket button:
    3.JPG

As soon as the user clicks the Add to Download Basket button, the selected SAP NW MDM content will be added to the Download Basket.

 

I have explained how a user can download a specific SAP NW MDM Support Package Stack in the Part 2 - Download Specific SAP NW MDM SP (Support Package) Stack blog…!!

Hi

 

I am experiencing an issue in Data Manager when trying to add a new workflow record to the workflow table. Every time I open Visio through Data Manager, it freezes up and eventually gives an error message with only one option, which is to cancel Visio from opening. Is there a way I can resolve this issue?

 

Thanks in advance

Kwanele Nkuna

Below are two frequent errors while accessing Change Item History application in SAP Netweaver GDS.

 

1. Connection error: Specified jdbc Alias parameter is invalid or Data source is not running.

 

Solution: Create a custom data source for connecting to the database of the MDM repository in Netweaver Administrator --> Configuration Management --> Infrastructure --> Application Resources.

 

The creation of the JDBC data source depends on the database being used. The link below will be helpful for creating the required JDBC data source.

 

http://help.sap.com/saphelp_nwce72/helpdata/en/4a/5da077f60414d2e10000000a42189b/frameset.htm

  

After creating the custom data source successfully, create an alias for this data source with the name ‘gdstr’ in lowercase. The data source itself can have any name, but it is recommended not to use the same name as the alias.

 

2. You do not have authorization to use the MDM Change Tracker application.

 

Solution: If we use SAP Netweaver GDS in portal mode, an authorization error appears on clicking the Change Item History button. To fix this issue, a role with the MDM_CHANGE_TRACKER action has to be assigned to the portal user.

 

The GDS Console uses the Netweaver UME Guest user for the change tracker authentication, so in the GDS Console it is enough to assign the change tracker role to the UME Guest user.

But when a user logs in to the portal, the Netweaver UME user is used instead, and the change tracker uses that UME user for authentication. So the portal user needs to have the change tracker role as well.

SAP recommends assigning the change tracker role to portal users in Netweaver Administrator UME. This can easily be achieved by creating a UME role, assigning the MDM_CHANGE_TRACKER action to this role, and afterwards assigning the role to a group containing all portal users.

 

In both GDS Console and portal mode, the user should also be assigned a role with change tracking enabled in the MDM Console.

It was a great experience working on SAP NetWeaver MDM and SAP MDG (Master Data Governance), and now I am looking forward to SAP BPM (Business Process Management). After analysing, I found that MDM combined with MDG and BPM can enhance and govern the quality of data for any organization in a way that was not realized before. MDM’s strong data quality features, MDG’s governance and BPM’s process orchestration, when combined, could remove a lot of the problems (data quality, transparency, latency) we face nowadays.

 

 

Suppose some data is entered through a website, say a supplier name and phone number. This data is sent to NetWeaver MDM. Here the data will be qualified and managed with features like matching and merging, validations, assignments and taxonomy to create one single version of the truth, or golden records. When the data is approved by the Master Data Specialist, BPM triggers the MDG workflow, which ensures that all ERP-specific attributes are maintained and also provides transparency, so that the requestor can easily check the status in the workflow log. After the master data governance process, the new supplier is created in ECC. BPM then sends a notification to the original requester about the creation.

As you probably know, it is the basis of a service-oriented architecture to use web services for the communication amongst different components. So if you create an environment with SAP BPM and SAP MDM, web services on the MDM side are generated and consumed in SAP BPM.

During the implementation of this scenario we encountered the problem that the web service call from SAP BPM was successfully executed; on the MDM side the calculation was also successful, and the result was visible in the SOA logger. But the result never made it back into SAP BPM.

The solution to the problem was pretty simple but really hard to find: SAP MDM web services do NOT support WSIL. WSIL is the generic web service inspection file in which all services of the server are listed, not only the specific ones for a single web service. So we created a new provider system pointing to the WSDL of the service, then added this provider system in the application communication to the service group, and it worked!

In our last project we had several problems calling a BODS web service from Web Dynpro Java. The import of the model into NWDS was not successful, and creating a CAF service around the BODS web service also didn't lead to the expected result. So we decided to implement the BODS call via the BODS API.

The first step was to create configurations in the NWA. This enabled us to switch the target BODS system via configuration rather than in the code. Next we created the RTService call and added, for example, the input string.

After that the RTService was executed. Then we parsed the result, did some error handling and passed back our result.

Below you can find some code snippets:

import com.businessobjects.rtsclient.RTServiceClientX;
import com.businessobjects.rtsclient.RTServiceException;

// additional imports needed by the XML parsing code below
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

// The variables bankNumber, bankAccountNumber, bankControlKey, swiftCode and country, as well as
// wdComponentAPI, logger and IMessageIBANComp, come from the surrounding Web Dynpro component context.

RTServiceClientX rts = new RTServiceClientX();
try {
    // generate input for the RTServiceClient
    String input = "<?xml version=\"1.0\" encoding=\"UTF-8\"?> " + "<IBANInformation> "
            + "<MDM_BANK_NUMBER>" + bankNumber + "</MDM_BANK_NUMBER> "
            + "<MDM_BANK_ACCOUNT_NUMBER>" + bankAccountNumber + "</MDM_BANK_ACCOUNT_NUMBER> "
            + "<MDM_BANK_CONTROL_KEY>" + bankControlKey + "</MDM_BANK_CONTROL_KEY> "
            + "<MDM_SWIFT_CODE>" + swiftCode + "</MDM_SWIFT_CODE> "
            + "<MDM_COUNTRY>" + country + "</MDM_COUNTRY>"
            + "</IBANInformation>";

    // get configuration data (target BODS system and port maintained in the NWA configuration)
    IWDWebModule module = wdComponentAPI.getDeployableObjectPart().getWebModule();
    IWDConfiguration config = WDConfiguration.getConfiguration(module);

    // execute the call to the RTServiceClient
    String bodsResult = "";
    rts.connect(config.getStringEntry("BODS_SYSTEM"), config.getIntEntry("BODS_PORT"));
    try {
        bodsResult = rts.invoke("IBANGeneration", input);
    } catch (RTServiceException e) {
        GUId logid = logger.traceThrowableT(Severity.ERROR, "", e).getId();
        wdComponentAPI.getMessageManager().reportMessage(IMessageIBANComp.ERROR, new String[] { logid.toString() });
    }
    rts.disconnect();

    // parse the result of the call to the RTServiceClient
    DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
    DocumentBuilder db = dbf.newDocumentBuilder();
    InputStream is = new ByteArrayInputStream(bodsResult.getBytes("UTF-8"));
    Document doc = db.parse(is);
    doc.getDocumentElement().normalize();
    NodeList nodeList = doc.getElementsByTagName("IBAN");
    if (nodeList.getLength() > 0) {
        String iban = nodeList.item(0).getTextContent();
        if (iban.equals("0")) { // the service signals an error with IBAN = "0"
            NodeList errorCodeList = doc.getElementsByTagName("ERRORCODE");
            NodeList errorMessageList = doc.getElementsByTagName("ERRORMESSAGE");
            // if the error code is 201 just go back; no real error
            if (errorCodeList.item(0).getTextContent().equalsIgnoreCase("201"))
                return null;
            wdComponentAPI.getMessageManager().reportException("Error during call to BODS RTServiceClientX: Errorcode:"
                    + errorCodeList.item(0).getTextContent() + " Error Message:" + errorMessageList.item(0).getTextContent());
            throw new Exception("Error during call to BODS RTServiceClientX: Errorcode:" + errorCodeList.item(0).getTextContent()
                    + " ErrorMessage:" + errorMessageList.item(0).getTextContent());
        } else {
            return iban;
        }
    }
} catch (Exception e) {
    // log unexpected errors; the caller treats a null return as "no IBAN determined"
    logger.traceThrowableT(Severity.ERROR, "IBAN determination failed", e);
}
