In this blog series I would like to share how to realize central definition and maintenance of partitioning patterns. The standard Semantically Partitioned Object (SPO) functionality of SAP NetWeaver BW 7.3 is enhanced by implementing Business Add-In (BAdI) RSLPO_BADI_PARTITIONING and a few control tables. This facilitates automated (re)partitioning, either individually or collectively, and enables partitions, criteria, texts and Data Transfer Processes (DTPs) to be generated automatically and consistently.
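To make this concrete, below is a minimal sketch of what such control tables could look like, expressed as ABAP structure types. All names (ty_s_pattern, ty_s_crit and their fields) are illustrative assumptions on my part; the actual ABAP data dictionary objects are described in detail in the documents mentioned below.

TYPES:
* Header entry: assigns a partitioning pattern to an SPO
  BEGIN OF ty_s_pattern,
    spo     TYPE c LENGTH 30,   " technical name of the SPO
    pattern TYPE c LENGTH 10,   " partitioning pattern ID
  END OF ty_s_pattern,
* Item entry: one row per partition and criterion value range
  BEGIN OF ty_s_crit,
    pattern TYPE c LENGTH 10,   " partitioning pattern ID
    partno  TYPE n LENGTH 2,    " partition number
    iobjnm  TYPE c LENGTH 30,   " partitioning characteristic
    sign    TYPE c LENGTH 1,    " I(nclude) / E(xclude)
    option  TYPE c LENGTH 2,    " EQ, BT, ...
    low     TYPE c LENGTH 60,   " lower criterion value
    high    TYPE c LENGTH 60,   " upper criterion value
    text    TYPE c LENGTH 60,   " partition text
  END OF ty_s_crit.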

The blog series consists of the following blogs in addition to this blog:

I developed a working version of the BAdI implementation which I would like to share via two documents. Implementing Pattern-based Partitioning using the SPO BAdI - Part 1: Control Tables contains all technical details of creating the control tables and the related ABAP data dictionary objects. Implementing Pattern-based Partitioning using the SPO BAdI - Part 2: BAdI Implementation explains all technical details of implementing BAdI RSLPO_BADI_PARTITIONING and the necessary ABAP Object Oriented programming in the implementing class.

LSA Data Flow Templates

Although pattern-based partitioning can be applied in any BW 7.3 data warehouse architecture, let’s explore some use cases in an SAP BW Layered, Scalable Architecture (LSA) Enterprise Data Warehouse. There is an interesting blog series on LSA Data Flow Templates by Jürgen Haupt; please refer to the section Further Reading for a complete overview of these blogs.

The data flow is a new modeling object in BW 7.3 that makes it possible to define data flow templates. Such templates can be reused by incorporating them into any new data flow, which makes them a powerful means of defining and applying standardized modeling patterns.

SAP delivers a series of 10 LSA data flow templates as Business Content.

Figure 1: LSA Data Flow Templates

In the context of pattern-based partitioning using the SPO BAdI, let’s highlight the following data flow templates:

  • /LSA400/ LSA Scalability & Data Domains - Strategic Semantic Partitioning of Entire BW (Split);
  • /LSA330/ LSA Scalability - Data Flow Split Using a Pass Thru DataStore object;
  • /LSA300/ LSA Tactical Scalability - Flow Split using Semantic Partitioning.

Early PSA-based Strategic Partitioning

The first use case is based on LSA data flow template /LSA400/ LSA Scalability & Data Domains - Strategic Semantic Partitioning of Entire BW (Split).

Strategic partitioning is a modeling pattern where data domains are used to partition transactional data flows. Data domains are considered a landmark building block since they are used throughout the entire LSA data warehouse.

The screenshot below gives a conceptual overview of this data flow template.

Figure 2: Strategic Partitioning - Early PSA Based (Source: SAP AG)

The scenario here is a single Enterprise Resource Planning (ERP) source system from which the transactional data has to be split into the various data domains. The technique used is early Persistent Staging Area (PSA)-based partitioning: the data is assigned to the appropriate data domain by means of a so-called domain driving characteristic.

Let’s have a look at a frequently occurring situation. The domain driving characteristic is Company Code. The Company Code values have to be assigned to the appropriate domain. The table below shows a simplified example of the maintenance required in this situation.

Table 1: Example Maintenance Control Table
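The original table is not reproduced here, so as an illustration only (all values are hypothetical): each Company Code interval is mapped to one partition, i.e. one data domain. The maintenance could then look as follows:

Pattern  Partition  Characteristic  Sign  Option  Low   High  Text
SALES    01         0COMP_CODE      I     BT      1000  1999  Domain Europe
SALES    02         0COMP_CODE      I     BT      2000  2999  Domain Americas
SALES    03         0COMP_CODE      I     BT      3000  3999  Domain Asia-Pacific

In the control tables sketched earlier, these would be rows of ty_s_crit; the BAdI implementation reads them to generate the partitions, criteria and texts of the SPO.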

Pass Thru DSO-based Strategic Partitioning

The second use case is based on LSA data flow template /LSA330/ LSA Scalability - Data Flow Split Using a Pass Thru DataStore object (DSO).

This data flow template also implements strategic partitioning using data domains. The main difference from the previous data flow is the addition of an intermediate DSO directly after the PSA.

The screenshot below gives a conceptual overview of this data flow template.

Figure 3: Strategic Partitioning Using a Pass Thru DataStore object (Source: SAP AG)

The scenario here is again a single ERP source system from which the transactional data has to be split into the various data domains. The technique used is pass thru DSO-based partitioning: as a preliminary step, the data is assigned to the appropriate data domain by means of the domain driving characteristic and persistently stored in a single write-optimized pass thru DSO. From there, the further partitioning is carried out in the same way as in the previous data flow template.

A pass thru DSO makes the domain partitioning easier for inbound data flows. As a preliminary step, the data is unified and all administrative characteristics are supplemented; one of them is the data domain.

The determination of the domain driving characteristic and the reading of the control tables are out of scope for this blog series.

Let’s revisit the example of the previous paragraph. The domain driving characteristic is Company Code. The Company Code values still have to be assigned to the appropriate domain, but this now takes place during the upload from the PSA to the pass thru DSO, where the domain is persistently stored as an administrative characteristic. The table below shows the maintenance required in this example.

Table 2: Example Maintenance Control Table
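Although the derivation logic itself is out of scope for this series, a rough sketch may help to picture it. The snippet below mimics the body of a field routine in the PSA-to-pass-thru-DSO transformation; the control table ZBW_DOMAIN_MAP and its fields are hypothetical, and the generated routine signature in your system may differ.

* Sketch of a field routine deriving the data domain from Company Code
* during the upload from the PSA to the pass thru DSO.
* ZBW_DOMAIN_MAP is a hypothetical control table with the fields
* COMP_CODE_LOW, COMP_CODE_HIGH and DOMAIN.
  DATA lv_domain TYPE c LENGTH 3.

  SELECT SINGLE domain FROM zbw_domain_map
    INTO lv_domain
    WHERE comp_code_low  <= source_fields-comp_code
      AND comp_code_high >= source_fields-comp_code.

  IF sy-subrc = 0.
    result = lv_domain.   " domain stored as administrative characteristic
  ELSE.
    CLEAR result.         " failed assignment; see error handling sketch below
  ENDIF.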

The main advantage is that the domain determination is already executed in advance: the domain is persistently stored in the pass thru DSO prior to the logical partitioning using the SPO.

Having the domain available in the pass thru DSO offers several advantages:

  • The maintenance of SPO partitioning becomes extremely simple: only filtering on domain is required;
  • Domain can be classified as a stable partitioning criterion, which is interesting from a data management perspective;
  • It offers an inbound check on failed domain assignments: these errors can be made visible using the Error Stack, and further processing can be put on hold (see the sketch below).
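To illustrate the last point, the sketch below extends the field routine from the previous paragraph: a record without a domain assignment is reported to the monitor and skipped, so that with DTP error handling active it can be made visible in the Error Stack. The message class ZBW is a hypothetical assumption; MONITOR and the exception CX_RSROUT_SKIP_RECORD are taken from the generated routine framework.

* Sketch: treat a failed domain assignment as an erroneous record so
* that DTP error handling can route it to the Error Stack.
  DATA ls_monitor LIKE LINE OF monitor.

  IF lv_domain IS INITIAL.
    ls_monitor-msgty = 'E'.
    ls_monitor-msgid = 'ZBW'.    " hypothetical message class
    ls_monitor-msgno = '001'.
    ls_monitor-msgv1 = source_fields-comp_code.
    APPEND ls_monitor TO monitor.
    RAISE EXCEPTION TYPE cx_rsrout_skip_record.
  ENDIF.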

Tactical Partitioning

The third use case is based on LSA data flow template /LSA300/ LSA Tactical Scalability - Flow Split using Semantic Partitioning.

Tactical partitioning, or sub-partitioning, is a modeling pattern in which partitioning criteria other than the data domain are used to partition transactional data flows.

The screenshot below gives a conceptual overview of this data flow template.

Figure 4: Tactical Partitioning (Source: SAP AG)

The scenario here is a non-domain partitioned data flow that is tactically partitioned in the Reporting layer.

Frequently used partitioning criteria are Fiscal Year and Calendar Year. However, other partitioning criteria are also possible. Tactical partitioning can be applied in conjunction with strategic partitioning using domains.

Let’s have a look at a frequently occurring situation. The partitioning characteristic is Calendar Year and, for example, the last three years are required from a reporting point of view. The table below shows the maintenance required in this example.

Table 3: Example Maintenance Control Table
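A rolling window like this is where pattern-based generation pays off: instead of maintaining the years by hand, the BAdI implementation can derive the criteria from the system date. A minimal sketch, reusing the hypothetical ty_s_crit structure from the beginning of this blog; all names are illustrative.

* Sketch: generate partition criteria for the last three calendar
* years from the system date, so that no manual maintenance is
* needed at year-end.
  DATA: lv_year TYPE n LENGTH 4,
        ls_crit TYPE ty_s_crit,
        lt_crit TYPE STANDARD TABLE OF ty_s_crit.

  lv_year = sy-datum(4).

  DO 3 TIMES.
    CLEAR ls_crit.
    ls_crit-pattern = 'CALYEAR'.
    ls_crit-partno  = sy-index.
    ls_crit-iobjnm  = '0CALYEAR'.
    ls_crit-sign    = 'I'.
    ls_crit-option  = 'EQ'.
    ls_crit-low     = lv_year.
    ls_crit-text    = lv_year.
    APPEND ls_crit TO lt_crit.
    lv_year = lv_year - 1.      " previous calendar year
  ENDDO.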

Further Reading

Please refer to the following blogs for more information on LSA data flow templates:

Conclusion

In this last blog of the series I highlighted some use cases of pattern-based partitioning based on the LSA data flow templates. For every use case a simplified example of the table maintenance was shown. The use cases consist of early PSA-based strategic partitioning, strategic partitioning using an intermediate pass thru DSO, and tactical partitioning.
