Recently I came across the need to debug the standard execution flow of the data enrichment framework in MDG to troubleshoot an issue. The initial suspicion was that a custom enrichment built in the system wasn't working. However, I was fairly convinced that the enrichment wasn't at fault and decided to find out what the real trouble was.

The problem statement: overall MDG performance had suddenly become very slow, and processing seemed to get stuck at the custom data enrichment that had been applied.

So I started debugging from the point where the enrichment handed control back to the standard flow. I could see that the enrichment did its job well, and the entities were updated with the correct data. Immediately after that came the standard validations, which perform data checks on the entire change request.

The screenshot below shows the standard code that applies all the checks for the request step.
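
Since a screenshot can only show so much, here is a simplified ABAP sketch of what that orchestration boils down to: the framework collects all checks configured for the change request type and step, and runs each one against the entire request. Every class, method, and check name below is a hypothetical stand-in for illustration, not the real SAP object.

REPORT zmdg_cr_check_sketch.
"---------------------------------------------------------------
" Illustrative sketch only: all names here are hypothetical
" stand-ins for the standard MDG check orchestration.
"---------------------------------------------------------------
CLASS lcl_check_runner DEFINITION.
  PUBLIC SECTION.
    METHODS run_check
      IMPORTING iv_check           TYPE string
      RETURNING VALUE(rt_messages) TYPE bapiret2_t.
ENDCLASS.

CLASS lcl_check_runner IMPLEMENTATION.
  METHOD run_check.
    " A real check validates the entire change request here; the
    " duplicate check additionally calls out to enterprise search.
  ENDMETHOD.
ENDCLASS.

START-OF-SELECTION.
  " Checks configured in customizing for this CR type and step
  DATA(lt_checks) = VALUE stringtab( ( `VALIDATIONS` )
                                     ( `AUTHORIZATIONS` )
                                     ( `DUPLICATE_CHECK` ) ).
  DATA(lo_runner) = NEW lcl_check_runner( ).
  DATA lt_messages TYPE bapiret2_t.

  LOOP AT lt_checks INTO DATA(lv_check).
    " Each configured check runs for the request step, so one slow
    " check (here: the duplicate check) stalls the whole step.
    DATA(lt_step_msgs) = lo_runner->run_check( lv_check ).
    APPEND LINES OF lt_step_msgs TO lt_messages.
  ENDLOOP.

At least in this simplified picture, the checks run one after the other, so a single slow check holds up everything behind it.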

The selection of checks was driven by the configuration, and in this case a duplicate check was in place. As I gradually narrowed down the cause through debugging, I could see that the performance hit was coming from that duplicate check.

Next round of debugging: go inside the standard duplicate check to understand what is happening within it.

The standard performs the duplicate search based on the configured set of attributes.
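
Conceptually, that works like the sketch below: the attribute set configured for the duplicate check is filled with the values from the change request and becomes the search criteria. The attribute names and the small report around them are made-up examples, not the actual configuration or API.

REPORT zmdg_dup_criteria_sketch.
" Illustrative sketch only: the attribute names and this little
" report are made-up examples, not the actual configuration or API.
TYPES: BEGIN OF ty_criterion,
         attribute TYPE string,
         value     TYPE string,
       END OF ty_criterion,
       ty_criteria TYPE STANDARD TABLE OF ty_criterion WITH EMPTY KEY.

START-OF-SELECTION.
  " Attribute set configured for the duplicate check of the entity
  " type, filled at runtime with the values from the change request
  DATA(lt_criteria) = VALUE ty_criteria(
    ( attribute = `NAME_ORG1` value = `ACME Corp` )
    ( attribute = `CITY`      value = `Walldorf` )
    ( attribute = `POST_CODE` value = `69190` ) ).

  " These criteria become the query that enterprise search fires
  " against the TREX index of the data model.
  LOOP AT lt_criteria INTO DATA(ls_criterion).
    cl_demo_output=>write( |{ ls_criterion-attribute } = { ls_criterion-value }| ).
  ENDLOOP.
  cl_demo_output=>display( ).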

So it prepares the data and then accesses the enterprise search template defined for this data model.

Once the template is obtained, it determines the correct connector ID for establishing the link to TREX via the connectors. (As you may already know, connectors are set up using the ESH Cockpit, transaction ESH_COCKPIT, which is a pretty cool feature in itself.)
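
Roughly, that lookup amounts to the sketch below; lcl_esh_access and get_connector_id are names I invented purely for illustration, and only the transaction ESH_COCKPIT is real.

REPORT zmdg_connector_sketch.
" Illustrative sketch only: lcl_esh_access and get_connector_id are
" hypothetical names; only transaction ESH_COCKPIT is the real thing.
CLASS lcl_esh_access DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS get_connector_id
      IMPORTING iv_template         TYPE string
      RETURNING VALUE(rv_connector) TYPE string.
ENDCLASS.

CLASS lcl_esh_access IMPLEMENTATION.
  METHOD get_connector_id.
    " The real framework resolves the active connector behind the
    " template; a stale or broken connector surfaces exactly here.
    rv_connector = |{ iv_template }~CONNECTOR|.
  ENDMETHOD.
ENDCLASS.

START-OF-SELECTION.
  " Template name is an example; the connector it resolves to is
  " what carries every TREX call from this point on.
  DATA(lv_connector) = lcl_esh_access=>get_connector_id( `BP_TEMPLATE` ).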

So after doing all this 'magic', it accesses TREX and tries to return the results. Now, if TREX is not accessible, or the indices have data corruption issues, the call fails and no results come back. This has the potential to 'hang' the system.
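
Here is a sketch of that failure mode, again with a hypothetical stand-in for the actual search call; the point is that 'no duplicates found' and 'TREX never answered' can look identical from the MDG side.

REPORT zmdg_trex_guard_sketch.
" Illustrative sketch only: lcl_esh_access=>search( ) is a
" hypothetical stand-in for the enterprise search call into TREX.
CLASS lcx_esh_error DEFINITION INHERITING FROM cx_static_check.
ENDCLASS.

CLASS lcl_esh_access DEFINITION.
  PUBLIC SECTION.
    CLASS-METHODS search
      IMPORTING iv_connector      TYPE string
      RETURNING VALUE(rt_results) TYPE stringtab
      RAISING   lcx_esh_error.
ENDCLASS.

CLASS lcl_esh_access IMPLEMENTATION.
  METHOD search.
    " The real call goes through the connector to the TREX indices.
  ENDMETHOD.
ENDCLASS.

START-OF-SELECTION.
  TRY.
      DATA(lt_results) = lcl_esh_access=>search( `BP_CONNECTOR` ).
      IF lt_results IS INITIAL.
        " 'No duplicates found' and 'TREX never answered properly'
        " can look identical from the MDG side -- check the connector
        " and indices before suspecting custom enhancements.
      ENDIF.
    CATCH lcx_esh_error.
      " An unreachable TREX or a corrupt index fails here -- or the
      " call simply blocks, which is what stalled the CR step.
  ENDTRY.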

So it was a good learning experience to go through all these layers and finally pinpoint that the issue wasn't with any custom enhancement at all, but with corrupt indices in TREX. Re-indexing them and recreating the connector resolved the issue.

So a lesson learnt: run the enterprise test search tool (transaction ESH_TEST_SEARCH) from the back end to check for this issue much faster than going through all the layers of debugging.
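
In other words, a quick sanity check directly against Enterprise Search narrows things down in minutes before any debugger session. The report below is just a convenience jump; the two transactions themselves are standard.

REPORT zmdg_quick_esh_check.
" ESH_TEST_SEARCH and ESH_COCKPIT are standard transactions; this
" tiny report just jumps straight to the test tool.
CALL TRANSACTION 'ESH_TEST_SEARCH'.
" If the test search itself fails or hangs, fix ESH/TREX first:
" re-index, or recreate the connector, in transaction ESH_COCKPIT.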

Hope this proves useful for all the techies out there who are interested in cracking the duplicate check in the standard logic flow.
