carsten_ziegler

I am sitting in my hotel room in Moline, Illinois, recapping the day I spent with a customer (you can probably guess which one I mean). I explained and demoed SAP NetWeaver Decision Service Management (NW DSM). Then we discussed some of their use cases and implemented a small scenario in a team effort involving functional as well as technical experts. The scenario was very simple, not a challenge at all.

My déjà vu

The alternative to a decision service implemented with NW DSM would have been about 100 lines of custom code and a handful of Z-tables to store some configuration settings. I could have written the same about ten other cities and ten other clients. It feels like déjà vu: the story is often the same. At least half of the customers I visit do not think that they need a rules engine. They simply want to find new ways to implement custom logic in the SAP standard. In other words: they want to reduce custom code and custom tables!

From functional experts I often hear statements like:

  • “We do not know how it works and why it was built this way.”
  • “Is this still used? Has it ever been used? What does it do?”
  • “Theoretically, there is the capability to adjust the logic. Practically, I have no idea about the what and the how. I don't even know of a system in which changes are possible.”
  • “For every change, we need to get management buy-in, write a spec and start a project. Guess what? Most changes just never happen.”

From IT experts:

  • “I found a design document but I cannot map the code to it. Who is the functional guy to explain it to me?”
  • “Why do we have those 3 Z-tables? What happens if we delete them?”
  • “Does anyone understand the spec document? Why do I have to read accounting books to get it?”

From IT managers:

  • “I am told to reduce custom code, but nobody tells me how.”
  • “Sorry, cannot do this. We do not have capacity.”
  • “We are still porting our custom additions to the latest EHP.”

Can a decision management tool like NW DSM solve this problem? That sounds like the title of this blog, but it is not as simple as that. Let’s check the details.

Drawbacks of the Traditional Development Model

The development process a majority of customers follow is segregated into phases, most of which are executed sequentially (waterfall), with one phase’s output being the input to the next, as shown in the diagram. The responsibility for the outcome of the phases is shared between at least two teams. Functional teams define requirements (specification), test code changes, and use the improvements after go-live. Technical teams convert the specification into a design document and implement the required changes. Modified and newly created repository objects (such as ABAP code and database tables) are transported from one system to the next by administrators, yet another team, using the SAP Change and Transport System (CTS).

Some of the problems of this approach are expressed in the questions stated above. A summary of the issues:

  • Lost in translation - IT experts need to understand functional context
  • High number of teams and people involved means high alignment efforts and costs
  • What is learnt in realization cannot be fed back into your blueprint (specification)
  • No structure for ongoing optimization after go-live
  • Small windows of opportunity – CTS requires system downtimes
  • Release upgrades can spoil custom code

For these reasons, functional managers and decision makers often perceive adding or changing custom code as complex and expensive. Ideas get stuck in discussions and negotiations and are finally turned down when the business case (the magic word for killing anything) does not work out.

Introducing the Decision Service Model

Using decision services as part of a development process instead (let’s call this the Decision Service Model) can provide an alternative and overcome many of the problems of the traditional model outlined above. A decision service is a self-contained, callable service that makes an operational business decision (a definition after James Taylor, who popularized the term). You can compare a decision service with a function module or method. In most cases it is stateless, which simplifies its integration into processes.

NW DSM is a software system that helps customers define, deploy, execute, monitor, and maintain decision services to implement the variety and complexity of decision logic used by operational systems. For this purpose, NW DSM provides:

  • a user interface to model, test, and analyze decision services
  • a code generator to transform the model into an executable service
  • a distribution cockpit to deploy the decision service to the target server
  • a repository with service catalogs and versioning/auditing capabilities

Once a decision service has been created, it can be called by a program. The initial connection may still be programmed following the traditional model. However, once this is done, future updates of the service implementation no longer require the calling program to change. All of the features provided by NW DSM can then be used to build or update decision services, analyze their execution, and so on.
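
To make this concrete, here is a minimal sketch of such connector code, under the assumption that the decision service is exposed as a BRFplus function (NW DSM builds on the BRFplus rules engine). The service GUID, the context element ORDER_AMOUNT, and the discount result are invented for this illustration; the code template generated by NW DSM in your system may look different:

    " Connector sketch (hypothetical names): call a decision service from ABAP.
    " Assumption: the service is exposed as a BRFplus function.
    CONSTANTS gc_function_id TYPE if_fdt_types=>id
              VALUE '00000000000000000000000000000000'. " replace with the GUID of the deployed service

    DATA: lv_timestamp  TYPE timestamp,
          lt_name_value TYPE abap_parmbind_tab,
          ls_name_value TYPE abap_parmbind,
          lv_amount     TYPE p LENGTH 8 DECIMALS 2,     " hypothetical context element
          lv_discount   TYPE p LENGTH 8 DECIMALS 2,     " hypothetical result
          lx_fdt        TYPE REF TO cx_fdt.

    GET TIME STAMP FIELD lv_timestamp.

    " Fill the context (input) of the decision service.
    lv_amount = '2500.00'.
    ls_name_value-name = 'ORDER_AMOUNT'.                " hypothetical context name
    GET REFERENCE OF lv_amount INTO ls_name_value-value.
    INSERT ls_name_value INTO TABLE lt_name_value.

    TRY.
        " Process the service; this call stays stable even when the rules
        " behind the service are changed and redeployed via NW DSM.
        cl_fdt_function_process=>process(
          EXPORTING iv_function_id = gc_function_id
                    iv_timestamp   = lv_timestamp
          IMPORTING ea_result      = lv_discount
          CHANGING  ct_name_value  = lt_name_value ).
      CATCH cx_fdt INTO lx_fdt.
        " Handle processing errors (e.g. service not found, type mismatch).
    ENDTRY.

In practice, such connector code is based on the generated code template mentioned above rather than written by hand; the point is that the call signature does not change when the rules do.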

SAP NetWeaver Decision Service Management

NW DSM provides a cockpit-like user interface for full overview and control of all decision services that are created, deployed, and used across customer systems. With a few clicks, it is possible to connect to a remote system and make its data (metadata, master data, or customizing) accessible for use in service implementations. Once a service is implemented, it takes just another few clicks to deploy (that is, transfer) the service to one or multiple remote systems, where it can then be tested or used productively. More information about NW DSM can be found here.

Comparison

The differences, and therefore the advantages and disadvantages, of this new approach are best described along four categories: time, quality, transparency, and finally, costs.

Time from request to implementation

Traditional coding approaches often need weeks or months from idea to executable business logic. This is not only due to the phased waterfall approach but also because of time-consuming alignment between functional and technical teams. Although functional experts own the business logic, technical teams often need to gain in-depth domain expertise to be able to understand the requirements and compile a design document. I have seen plenty of examples where weeks were spent writing a comprehensive specification, and still a significant number of meetings and changes were required before even thinking about a design. My experience is that in a scenario of medium to high complexity it is not possible to write a specification that is complete, free of contradictions, up to date, and easy to understand for a person outside of the functional domain.

What makes this even more difficult is the insight all parties gain during those discussions. Requirements tend to evolve over time and priorities change. If this is not accepted (e.g. because deadlines have to be met), the solution that is finally implemented lacks good ideas. It is not as good as it could be.

Finally, another important aspect that can delay the go-live of a project significantly is release management. Customers often have strict policies about when changes can be transported into the productive system. Some customers allow monthly updates; others have quarterly schedules or even longer periods between updates. Hence, updates need to be planned well ahead. The fear of losing an opportunity to update, and thereby delaying the implementation, often results in compromises on functionality or quality.

The Decision Service Model helps overcome those problems. Whatever business logic is put into decision services can be changed collaboratively. Functional and technical experts can jointly implement, change, and test decision services thanks to the business-friendly authoring concepts (see the paragraph on transparency below). Instead of writing huge documents, it is often a good idea to define a small team of functional and technical experts and have them build the service and demo its execution with the help of the simulation protocol. Lively discussions amongst domain experts do not need a developer’s attendance. Contradictions and gaps that are impossible to locate in thousands of lines of text can be identified more quickly in semi-structured decision tables or trees. Changing requirements are not discussed as abstract assumptions and ideas but on the basis of existing and executable rules.

There are no longer any cut-off dates. Even productive services can be continuously analyzed using the execution traces, and incremental optimizations can be applied. A decision service can be managed independently of the rest of the application, and therefore new and better versions can be applied outside of traditional release cycles. Release windows only apply to the connector code that includes a decision service in a process. However, this code is based on a generated code template (an NW DSM feature) and easy to integrate. Decision services even provide code-free enrichment of input data should there be a need to include more information in the decision-making rules.

Quality of implemented decision-making logic

In a traditional development project, developers have to understand the requirements, implement the requested changes, and provide a test environment for business experts to validate the code after weeks or months of implementation work. People who cannot read ABAP find it impossible to understand where something was changed and how exactly the changes were implemented. Consequently, testing can often be compared to a labyrinth: finding dead ends (bugs) or exits (correct processing). Only a few known and reproducible combinations out of a much higher number of possibilities are tested. Software bugs detected after go-live are fixed in a technical correction process using CTS, typically with weekly updates or so. New ideas or improved business logic require a new round of alignment, budget, and so on.

The decision service model takes a totally different approach. Business logic contained in decision services is organized in multiple rule catalogs that are created for the needs of specific user groups such as domain experts and developers. NW DSM provides one central place for the rules, no matter in which system they are used. Testing efforts are significantly lower, as design and execution transparency (see below) allow domain experts to detect many errors at a very early stage, no matter whether they are caused by an incomplete specification, a design error, or something else. In the testing labyrinth, functional experts now take a bird’s-eye perspective. Semi-structured rules such as those in decision tables allow NW DSM to automatically check for full coverage of combinations or for contradictions between rules. NW DSM provides a test tool with which tests can be automated and rerun whenever rules change.

Corrections and service improvements can also be deployed into test or productive systems after go-live with a few clicks or via a customizable workflow. As another feature, NW DSM provides the capability to deploy decision services into a sandbox space (also on productive systems) that allows running and testing a service without side effects on the target system before the service is used productively. Corrections and changes can be tested and applied within seconds and without risk: no alignment is needed and no technical processes are required. Changes can be revoked with the same ease.

Transparency of decision-making logic applied across SAP applications

In the traditional approach, there are two ways of implementing decision-making logic:

  • Code
  • Z-Tables (custom database tables)

Good developers love code because they can shape it into whatever form they want, and once you are familiar with a language like ABAP, coding is also reasonably fast. The problem is not the speed of development itself. As discussed above, the problem is more about the lifecycle and, most importantly, the transparency. For a non-technical person, code does not explain itself. Z-tables are used as an alternative when configuration needs are very limited. However, the combination of custom code and Z-tables also comes with problems:

  • Unclear access sequence: several tables are read in sequence, but the exact sequence is unknown or not all tables are known (select table 1; if nothing is found, select table 2; if …) – see the sketch below
  • Unclear table design: the meaning and impact of fields are not clear
  • Overly complex table design: possible future requirements result in many complex Z-tables; the table design includes many options that will never be used and are not understood

Burying business logic in code means more effort to understand current system behavior, higher alignment effort for new requirements, reduced quality, and lower testing productivity.
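
As an illustration of the access-sequence problem from the list above, here is a hypothetical sketch of such buried lookup logic. The Z-tables ZDISC_CUST and ZDISC_REGION, their fields, and the input parameters are invented for this example only:

    " Hypothetical example of decision logic buried in code and Z-tables:
    " determine a discount by probing one custom table after the other.
    DATA lv_discount TYPE p LENGTH 8 DECIMALS 2.

    " Access sequence hard-wired in code: first the customer-specific table ...
    SELECT SINGLE discount FROM zdisc_cust
      INTO lv_discount
      WHERE kunnr = iv_kunnr.
    IF sy-subrc <> 0.
      " ... then the fallback table per region ...
      SELECT SINGLE discount FROM zdisc_region
        INTO lv_discount
        WHERE region = iv_region.
      IF sy-subrc <> 0.
        " ... and finally a hard-coded default nobody remembers the reason for.
        lv_discount = 0.
      ENDIF.
    ENDIF.

Neither the sequence of the lookups nor the meaning of the tables is visible to a functional expert. In the Decision Service Model, the same logic would typically be expressed in a decision table that can be read and changed without opening the ABAP editor.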

In contrast, the decision service model aims to involve business experts deeply in all phases. Concepts such as spreadsheet-like decision tables or formulas, decision trees, and natural-language-like if-then rules are understood with little or no training. It is of secondary importance whether a developer or a business user creates those rules. Decision logic expressed using those capabilities is the foundation of fruitful discussions about functional correctness. Quirky details, such as required database selects or other technical expressions, can be hidden behind descriptions in the language of the business experts. However, NW DSM does not only provide design transparency; it also provides execution transparency in the form of traces that show which result was returned and how it was found, in step-by-step explanations. The trace is also the basis of decision service analytics to answer questions like:

  • How often is the decision service called and what results does it return?
  • What rows get the most hits in a specific decision table, or what nodes in the case of a tree?
  • Which rules and expressions are never used and can therefore be deleted for simplification?
  • What are the most common decision paths?
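
To make the decision table concept concrete, the discount determination from the earlier Z-table sketch could be modeled as a single table along these lines (a purely hypothetical example with illustrative values):

    Customer group | Order value     | Discount
    ---------------+-----------------+---------
    Key account    | >= 10,000 EUR   | 5 %
    Key account    | <  10,000 EUR   | 3 %
    Any            | >= 10,000 EUR   | 2 %
    Any            | any             | 0 %

Because the rows are structured, the system can check automatically whether every combination of inputs is covered and whether rows contradict each other, which is the coverage and contradiction analysis mentioned above.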

When business users do not see the rules, the rules are often forgotten after a while. Only experts who know exactly how the processes are implemented can actively search for optimizations and leverage new ideas for better products and services or lower costs.

Costs of implementation and change

On several occasions, I have seen people choose the traditional approach simply because they are very familiar with it. They know how to calculate time and costs and add some buffer for the usual unforeseen things. Every report, every table, every transaction, every view cluster, and so on has its price in the form of required person days. A cost and time calculation is set up quickly. Once the specification is final, no new requirements are accepted. Any change to the spec requires renegotiation, often a tedious exercise, so that only the most important changes are included.

How many changes will never be made because they are too expensive? How many projects never get off the ground? How much extra work is needed because of known limitations? Many of these opportunities do not need to be lost. They could be realized with the help of decision services that do not need complex alignment processes with various groups and managers in the organization (such as development, transport management, or management of third-party resources).

Besides these qualitative aspects, internal SAP research revealed the top five cost factors in projects. Each one can be reduced by applying a decision service approach, for the reasons given throughout this blog.

  1. 19% – End-user documentation and training
  2. 17% – Business process definition
  3. 11% – Project management
  4. 10% – Project team training
  5. 9% – Integration test

In addition, NW DSM provides some more capabilities to cut costs. With NW DSM, it is easy to access data in other systems, removing the need to regularly replicate data into development systems. NW DSM has an optimized deployment process, allowing implementation teams to iterate more quickly and therefore fix problems or try out alternatives faster. All of this is possible without dependencies on other teams (such as release or transport management). NW DSM even allows running multiple versions of a decision service in parallel to compare their outcomes, along with several more features which I will describe in more detail in one of my next blogs.

A hurdle of the decision service approach is that it is new. The teams, and especially the developers, need to be trained, and it is advisable to include external help from an experienced NW DSM consultant in the beginning.

James Taylor created an ROI study on NW DSM which you can find on his website.

Summary

Can NW DSM solve the problems related to custom code and Z-tables? Yes and no! You will still need custom code for user interfaces, the overall process flow, and so on. However, whenever business logic deals with calculations, data validation, classification, determinations, look-ups, and the like, you probably have a candidate for a decision service that is better built with NW DSM, as explained in this blog. So NW DSM will not kill all custom code, but it is the perfect tool to significantly reduce the lines of custom code and the number of Z-tables. At the same time, you can test your changes better, and you equip your applications with capabilities that are impossible to build yourself.

Last but not least: you may be thinking about upgrading your infrastructure with SAP HANA or other SAP products. NW DSM reduces the pain of changing your custom code and may help you benefit from future improvements without the need to adapt your code.
