
SAP NetWeaver Business Warehouse


As we all know, BODS jobs help load data from non-SAP sources into a BW DataSource (PSA). Sometimes there is a requirement that, after the PSA load completes, the same data should move on to the DSO and cube automatically. How do we do that?


Normally a process chain is used to load data to the DSO and cube (via DTPs). DSO activation, index deletion, index creation, and the entire process are controlled and managed by process chains.


BODS can trigger the process chain after the PSA load completes, using a New Function Call.


Below are the step-by-step instructions to trigger a process chain from BODS.


1.     Import the BAPI “RSPC_API_CHAIN_START” into the BW datastore.


2.     Create a batch job and dataflow in BODS.

3.     Add a Row Generator as input and add a Query transform.


4.     Click on the Query transform and, in the schema out, add a New Function Call.


5.     Select the function “RSPC_API_CHAIN_START” from the list.


6.     Enter the process chain name (within single quotes) in the input parameter window and click Finish.



7.     Click Next, select the output parameter, and click Finish.


8.     Add one more Query transform and a template table. Map the function output parameter to the template table as below.



9.     Make sure that schema in and schema out have the same fields by propagating the fields.

10.   Make sure the process chain's start variant is set to “Immediate start”, so that the process chain starts immediately.


11.   Validate and execute the BODS job.


12.  The job executes successfully; the execution log will display as below.


13.  The BODS job triggered the BW process chain, and the return log value is stored in the template table.


     Return value captured in Template Table


In the same way you can call other BAPIs/function modules from BODS. Hope this post helps.

Hi All,


Today I'll share some thoughts on data archiving. If you are a newbie to data archiving, I hope the material below will be useful.



The big challenge for administrators is determining what data to keep and what to get rid of, before encountering problems such as data overflow, longer transaction processing times, and performance degradation.


Let me first explain the general concept of an archive: an archive is nothing but old/historical data kept in some other place, similar in spirit to a backup. On your computer, you archive data by copying old files, or files you may want to retrieve later, to some disk storage.

If you are a professional, you have probably done this already: you use Office Outlook, and sometimes Outlook pops up a message saying your mailbox is full. Some people delete unwanted mails; others want to preserve their mails. In that case you archive your Outlook mailbox, moving the old mail into a PST file kept in some location.

Goal of Data archiving


Most businesses have the challenge of determining what data to keep and what to get rid of, before encountering problems such as data overflow, longer transaction processing times, and performance degradation.

Data archiving is becoming an important business process, used to keep enterprise systems functioning well by moving outdated data from the system to offline or near-online data stores. The primary goal of data archiving is to preserve valuable data for later retrieval.


Prerequisites and action items:


Below are some of the considerations,


  1. What is the scope of the archive, and how long must the data be retained before deletion?
  2. What determines whether the data is current and must be kept live?
  3. How is storage architected in the enterprise, and what growth plans are in place?
  4. What business continuity and recovery solutions are in place?
  5. How do we safeguard the data against unauthorized access?
  6. How will the archived data be retained after a system upgrade?
  7. What is the timeline for archiving: does the business want to do it daily, weekly, monthly, or quarterly?
  8. Where will the archive be stored: on-site at the data center, off-site at a backup location, or in cold storage elsewhere?


Successful implementation of data archiving


Below are the certain criteria for that


  1. Improve performance of the active transactional database system.
  2. Protect data from corruption, loss or unauthorized access.
  3. Reduce overall storage costs.
  4. Automate most of the process, reducing the administrative burden.
  5. Provide reporting capabilities for validation needs.
  6. Ease the retrieval of archived data.
  7. Handle application upgrades or changes that affect the archived dataset.
  8. Automatically retire data that is no longer needed.



I hope the above info is useful; in the next blog I'll share more technical ideas related to data archiving.







1) What data(years, division, applications, countries etc.) will we migrate from what systems to BW?

2) What data will reside "only in BW"? There may be cases that you don't want to move old data into ECC but may want to keep in BW only for regulatory etc reasons

3) What data will NOT be moved to BW, so that we can exclude from our extractions - e.g. CRM, certain division, before certain year etc?

4) What is the data archiving strategy for BW, or the data retention period? For example, keeping only the current plus two fiscal years of data.

5) Will there be a parallel landscape (e.g. a BW on HANA and a classic BW) during the transition for some period? The reason for the question is that there might be problems getting deltas from the same tables into multiple BW systems.

6) Will the data be presented in English only?

7) Will there be any data moving from BW to other systems ( such as forecasting/planning/SAS? )

8) Will there be any MDM/MDG rules? Since BW can accept data from any resources we may need to re-route them to MDM/MDG application before extracting into BW

9) Is there any requirement to load non-structured data into BW?

10) Will there be any sensitive data (customer SSN, credit card numbers, employee salaries etc)?

11) Do we allow changing data in BW?

Secondary questions

S1) Are we required to keep any time-dependent master data?

S2) Will there be any inventory-type data (non-cumulative data) for which we need opening balances?

S3) Will there be BPC on top of BW?

S4) Do we delete data (instead of reversing it) in the source systems or do we manipulate data in the source systems w/o going thru the application layer?



Hi all,


I often copy process chains, for example copying an existing chain for European data to a new chain with changed InfoPackages or different source-system InfoPackages for Asia.


In BW 7.3 or later systems, one option to copy a chain is to use the wizard. With the wizard you can choose which DTPs and InfoPackages should be changed in the copied chain. Here you can also define a new start process; the wizard will create a new start process for the copied chain. If you check and activate the chain, everything is fine.


If you choose "simple copy", the chain is copied 1:1. All steps are copied to the new chain, including the start process. But each process chain needs a unique start process, so checking or activating the new chain will fail in this case. You then have to manually delete the copied start process in the new chain, insert a new start process, and connect it to the other process types as before. If you save and check the process chain, it will be green. This blog describes how you can enhance the copy method for process chains. The enhancement creates a new start process of type 'TRIGGER' with the technical name "<technical name of process chain>_START".


I investigated how the copy of a process chain is implemented. Fortunately, it is implemented in class 'CL_RSPC_CHAIN', in the private instance method '_copy'.

Now you can use ABAP enhancements. How to create an ABAP enhancement is described in the excellent blog series Enhancement framework in ABAP by Karthikeyan B.

First create an enhancement of class 'CL_RSPC_CHAIN'..

Then put the cursor on the method '_copy' and insert a post-exit method.

Insert following code in the post-exit method:

DATA: l_trigger      TYPE rspc_variant,
      l_trigger_text TYPE rstxtlg,
      ls_chain       TYPE rspc_s_chain.


LOOP AT core_object->p_t_chain INTO ls_chain WHERE type = 'TRIGGER'.
  CONCATENATE ls_chain-chain_id '_START' INTO l_trigger.

* Generate a new trigger variant for the copied chain. The function
* module name was garbled in the original post; 'RSPC_TRIGGER_GENERATE'
* is assumed here from the parameter interface shown.
  CALL FUNCTION 'RSPC_TRIGGER_GENERATE'
    EXPORTING
      i_variant      = l_trigger
*     i_no_transport =
    IMPORTING
      e_variant      = l_trigger
      e_variant_text = l_trigger_text
    EXCEPTIONS
      failed         = 1
      OTHERS         = 2.
  IF sy-subrc <> 0.
*   Implement suitable error handling here
  ENDIF.

  ls_chain-variante = l_trigger.
  MODIFY core_object->p_t_chain FROM ls_chain.
ENDLOOP.


Check, save and activate the method. This post-exit method will create a new start process named <technical name of process chain>_START.

Now you can copy your process chain and it will create a new unique start process variant for each new process chain.


But please make sure that no one is creating or changing process chains, when you activate this enhancement. With activation a recompilation of the whole Transaction RSPC is necessary and user may wonder about some sudden dumps!

If you like this idea, you can also promote my idea on SAP Idea place: Copy of process chain creates new start process also


Maybe the simplest solution in BW 7.3 or later systems is to hide or disable the "simple copy" button. Then you're forced to use the wizard, where you can define a new start process. Hopefully this small feature can be implemented in one of the next service packs.


I hope you like this blog. Feel free to comment.

Martin Maruskin

InfoCube Utilities

Posted by Martin Maruskin Mar 28, 2014

Recently I found some InfoCube-related functionalities which are not so obvious within the standard transactions like RSA1. In this blog post I'd like to share them with others; maybe someone will benefit from it. One of them is a mass InfoCube copy function, something almost every customer tries to build with a self-created Z* report. And moreover there are other functions…

So how did I find it? I needed to change the namespace for one of my cubes and was browsing service.sap.com/notes to find out how to do that. I ran into SAP Note 1708553 - InfoCube creation enforces BW application selection, where transaction RSDCUBEM is mentioned. By the way, BW systems also contain transactions RSDCUBED and RSDCUBE; as they all point to the same ABAP report SAPMRSD0 and screen 1000, I cannot really say what the difference between them is.

As I was exploring TA RSDCUBEM I found interesting items in menu Edit-> InfoCube Utilities->


From the pop-up, the following functions are available:

1. Make Copies – Allows copying one InfoCube to new ones with different names. You can specify a two-character start and end suffix for the new InfoCube names. The function is implemented by calling ABAP report RSDG_CUBE_COPY.


2. Activate Infocubes – Enables mass activation of cubes per chosen cube type or cubes in particular InfoArea or selection. ABAP report RSDG_CUBE_ACTIVATE is behind this function.


3. Delete Infocubes – This is mass deletion of infocubes per cube type or specified InfoArea or selection. Implemented via call of ABAP  report RSDG_CUBE_DELETE.


4. Analyze Infocube – This is just a simple call of TA RSRV - Analysis and Repair of BW Objects.


5. Reorganize Texts – Within the call of report RSDG_CUBE_REORG_TEXTS another report is called RSDG_MPRO_REORG_TEXTS. Report is dealing with texts (table RSDCUBET) of cubes/MultiProviders and is trying to insert the text of e.g. business content objects if text is not present in its active version. At least that is my impression based on what I see in the report’s source code. I’m not really sure what this function would be useful for. If someone has experience with this report can you share it in comments?


6. Change Validity Slice – This function is useful when you have non-cumulative key figures (like SAP ECC Inventory Management) in a cube. By standard behavior, the validity period of stocks is determined by the oldest and the most recent material movement. If you need to display data in your report that lies further in the past, you can run this function to change the validity period. The function is implemented by calling ABAP report RSDG_CUBE_VALT_MODIFY.


Hi Friends,

First of all, I would like to thank each and every one of you for the overwhelming response that you have given me for my document on

SAP BW 7.3 Promising Features

I was really excited to see the number of comments, likes and suggestions for that document.

Recently, I got a chance to work on a project which involved SAP BW 7.4 SP5.

The latest SP5 of SAP BW 7.4 powered by HANA comes up with a lot of new and enhanced features.

I have tried to explore these new features and have taken the help of HELP.SAP.COM wherever required.

I have made an attempt to summarize and mention most of the Features and the same have been mentioned in the following links:

Part 1 of the series can be found in the link :  http://scn.sap.com/docs/DOC-53205

Part 2 of the series can be found in the link :  http://scn.sap.com/docs/DOC-53210

Part 3 of the series can be found in the link :  http://scn.sap.com/docs/DOC-53527

Part 4 of the series can be found in the link :  http://scn.sap.com/docs/DOC-53505

Hope you will find all the above documents useful.


Also, if you feel I have made some mistakes or want to add more features, please feel free to do so, as I have made all four documents collaborative.


Looking forward to your comments and suggestions.




Qlikview is a flexible solution for visually analyzing the data. To be able to connect and show business intelligence data visually, we need a connector to integrate with Qlikview tool. We will try using central Xtract QV server component that constitutes the data hub between SAP and QlikView applications. Primarily we will see how to extract data from BW Query and show in Qlikview, other options such as Cube,DSO,Master data etc will be covered in subsequent parts.


1) Download and install the trial Xtract QV connector

2) Configure the BW system and source from which data needs to be extracted.

3) Generate the Qlikview script from Xtract QV and paste it into the Qlikview designer.

4) Now, you can design charts based on the dimensions that you choose from above steps.

5) An example of how to extract from BW query is shown in below video.


Hi Friends,


This document deals about how to collect and transport Application Component hierarchy (RSA9) at ECC side.

Application Component hierarchy – provides a tree structure, delivered by SAP, where each node holds its own SAP-delivered standard datasources.

RSA5 – Standard SAP business content Datasources

RSA6 – Active Datasources in Source system.

Here, to get the active datasources in the proper hierarchy, we need to activate the Application component hierarchy using RSA9.


Readers going through this document might already know the above details; in addition, I am focusing on how to transport the Application component hierarchy to the downstream Quality and Production systems.

If you don’t activate it, or if you don’t transport the Application component to the downstream QA system, the application and its related datasources will be assigned under NODESNOTCONNECTED.



In general, for first-time datasource activation in an ECC system, it is mandatory to install the Application component hierarchy using transaction RSA9. The standard applications and their Business content datasources will then be assigned under the proper nodes.



Let’s see how to collect Application component.

Go to SE03

Select Change Object Directory entries, under Object directory tab.



Assign Object type DSAA and object APCO_MERGED in given areas as shown below.



Here you will find object is collected under Local object ($TMP)



Change and collect into proper Package.




Then transport it to further systems.



Hope this document helps you.

Best Wishes,



Thanks to all of you for your comments and feedback on my first post, Accounts Receivables - A Walk Through Part 1.


Now, having explained the functional aspects of AR in my last blog, I would like to detail the FI-AR data source and some of its important fields.

  1. The 0FI_AR_3/4 data source works based on the AIM delta process, which means it only brings the after image.
    Therefore the data has to first flow to a staging DSO and then to a cube for the delta mechanism to work.
  2. It is always better to map all the characteristics available in the AR data source to the data targets.

        The key characteristics (other than the org structure characteristics) that should be available in your data target for AR reporting are

    1. Accounting Document (BELNR)
    2. Item (BUZEI)
    3. Document type
    4. Document status (BSTAT)
    5. Clearing Date (AUGDT)
    6. Posting Key (BSCHL)
    7. Posting date (BUDAT)
    8. Customer (KUNNR)
    9. Net due date (NETDT)
    10. Base Line date (ZFBDT)
  3. The 0FI_AR_3/4 data source doesn’t bring the statistical AR documents (noted items); you have to implement SAP Note 411758 - "Extracting noted items" to bring them in.
  4. It is also better to link the AR document to the sales document (VBELN) so that you can analyze the ARs based on various sales attributes.
  5. The AR field REBZG (Number of the Invoice) is also important: it keeps track of the flow of the ARs for a particular invoice.

AR Reporting

AR is always reported as of a date. Let us go through some cases which will make this clear.


Case 1. For a customer C1 an AR document was created on 01.01.2013 for 1000 Currency.


So  if you run the report.


As on 31.01.2013 the AR value for C1 should be 1000


Case2 . On 25.02.2013 another AR document was created for an amount of 500 Currency for C1.


So  if you run


As on 31.01.2013 the AR value for C1 should be 1000


As on 28.02.2013 the AR value for C1 should be 1500 (1000+500)


Case 3 . On 15.03.2013 an amount 1000 was cleared against the receivables of Customer C1


So if you run


As on 31.01.2013 the AR value for C1 should be 1000


As on 28.02.2013 the AR value for C1 should be 1500

As on 30.03.2013 the AR value for  C1 should be 500 (1500 -1000)


Below is a sample of the data that will be available in the cube after the above three transactions.



(Table: sample cube records after the above three transactions, with columns Accounting Document, Posting Date, Clearing Date, and Status; the records initially carry status O (open), and after the clearing posting one record carries status C (cleared).)

Now the question is how to do this in BEx to get the report as stated in Case 3.


In BEx you will have to define two selections and add them to get the balance as of any given date.


1. Selection 1 (to get the open receivables) = amount key figure + Posting Date <= key date (an input-ready variable) + open status (status 'O' only)



2. Selection 2 (to get the cleared receivables) = amount key figure + Posting Date <= key date (the same variable used in Selection 1) + Clearing Date > key date (a customer exit variable that reads from the posting date variable)


So in the Case 1 .

As on 31.01.2013 the AR value for  C1 should be 1000


Selection 1 = 0


Selection 2 = 1000


Total Balance = 1000

Case 2




As on 28.02.2013 the AR value for C1 should be 1500 (1000+500)


Selection 1 = 500


Selection 2 = 1000

Total Balance = 1500


Case 3 .


As on 30.03.2013 the AR value for C1 should be 500 (1500 -1000)


Selection 1 = 500


Selection 2 = 0


Total Balance = 500
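The two-selection logic above can be sketched in Python. This is only an illustration of the report logic, not BW code; the record layout is hypothetical, with values taken from the three cases:

```python
from datetime import date

# Final state of the cube after the three transactions (after-image logic):
# document 1 (1000, posted 01.01.2013) was cleared on 15.03.2013,
# document 2 (500, posted 25.02.2013) is still open.
docs = [
    {"amount": 1000, "posting": date(2013, 1, 1),  "status": "C", "clearing": date(2013, 3, 15)},
    {"amount": 500,  "posting": date(2013, 2, 25), "status": "O", "clearing": None},
]

def ar_balance(key_date: date) -> int:
    # Selection 1: items still open, posted on or before the key date.
    selection1 = sum(d["amount"] for d in docs
                     if d["status"] == "O" and d["posting"] <= key_date)
    # Selection 2: items posted on or before the key date but cleared
    # only afterwards, i.e. still open as of the key date.
    selection2 = sum(d["amount"] for d in docs
                     if d["status"] == "C" and d["posting"] <= key_date
                     and d["clearing"] > key_date)
    return selection1 + selection2

print(ar_balance(date(2013, 1, 31)))  # 1000 (Case 1)
print(ar_balance(date(2013, 2, 28)))  # 1500 (Case 2)
print(ar_balance(date(2013, 3, 30)))  # 500  (Case 3)
```

Selection 2 is what makes the "as of date" report work on after-image data: a document cleared later must still count as open for any key date before its clearing date.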


In the next blog I will try to highlight AR ageing bucket analysis and some of the dashboard analysis that can be done on AR.



Hello everyone!


I wrote a couple of posts for ABAPers, and this one is the first for the BW guys. First of all, why is my post different from the others? There are many, in fact countless, articles, blogs, tips, and tricks regarding BW extractors; you can search for them on SCN too.


So, here is the idea! I got a requirement from the BW guys, and I had never worked on extractors before; it was my first time.


Here we go, BW guys!




Here is the requirement:

We need a BW extractor based on a Z ABAP ALV report, and it should give us the exact output it gives in ECC (isn't that simple?). Wow! I was shocked at first, then I thought it should be easy! But it turned out to be a big challenge, and a queue of problems was waiting for me.


The requirement was simple, but the challenge was to call the ALV report without making a copy of it, so SUBMIT came into play. But whenever we SUBMIT, the report first shows the ALV and only then comes back with the result. The ALV display was never required, because RSA3 has its own ALV to show the data.


What I did were the basic steps for creating an extractor (you already know):


1. Copy RSAX_BIW_GET_DATA_SIMPLE from Function Group RSAX to my desired name.

2. Make changes according in the extractor to fulfill my requirement.

3. Put in the RANGES to filter the data and pass it to the ALV report.

4. Handle the packages accordingly.

5. Create a Datasource.

6. Create an Extract Structure.


Simple? Yeah! It is.


Now here is the trick! I used the SET method of class CL_SALV_BS_RUNTIME_INFO and switched on only one parameter, DATA, because I need neither the metadata nor the ALV display.

              cl_salv_bs_runtime_info=>set(
                EXPORTING display  = abap_false
                          metadata = abap_false
                          data     = abap_true ).

This sets the memory ID for the ALV report. Now call the Z ABAP report (or a standard one) via SUBMIT, and clear the memory afterwards.

          SUBMIT zsdrp003
            WITH p_email  = s_email-low
            WITH p_marerr = '0'
            WITH p_mode   = '0'
            WITH s_cust   IN s_cust
            WITH s_date   IN s_date
            WITH s_sord   IN s_sord
            AND RETURN.




Now you need to get the data the ALV returned; for that purpose the following code can be used:



              cl_salv_bs_runtime_info=>get_data_ref(
                IMPORTING r_data = lr_data ).

              ASSIGN lr_data->* TO <lt_data>.

*             Clear the runtime-info memory after reading the data
              cl_salv_bs_runtime_info=>clear_all( ).


Here LR_DATA is declared as TYPE REF TO Data and <LT_DATA> is FIELD_SYMBOL of TYPE ANY.


With the major steps done, you have fetched the data and put it into your desired table. The final step is to hand it over to BW. For that purpose I created a transparent table with the same structure as the extract structure. Why? Because OPEN CURSOR cannot fetch data without a transparent table.







Same will be done for every package.


To summarize, there are two major steps: first, call the standard class and SUBMIT the report to get the data back; second, create a transparent table.


I haven't seen this approach anywhere on the internet. Believe it or not, it works like a charm. The whole source code is attached.


Let me know in your comments what you feel about this one!



New Note Analyzer for BW PCA and Housekeeping


Link to the Documentation - Configuration Guide ABAP Post-Copy Automation

Starting from week 12/2014, a new and improved SAP Note Analyzer application is available to check the prerequisites for the usage of BW PCA and Housekeeping. The tool now combines the two different tools from LM Basis and SAP BW into one interface.


It will be delivered via the existing SAP Notes:

Note 1614266 System Copy: Post Copy Automation (PCA) / LVM

Note 1707321 - BW System Copy: Note analyzer Post Copy Automation (BW-PCA)

which contain the program STC_NOTE_ANALYZER.


No additional prerequisites are necessary; the higher the SP level of the source system, the fewer Notes will be automatically downloaded.

There are two XML files to apply depending on the BW-PCA scenario: Basis and BW



After you have uploaded the two XML files, additional options become visible in the new Note Analyzer.


Loading “in dialog” allows you to see the detailed list of SAP Notes to implement prior to the usage of the BW-PCA task list.




Implement the latest version of the following collective SAP Notes for transaction SNOTE:

Note 875986 - Note Assistant: Important notes for SAP_BASIS up to 702

For SAP NetWeaver 7.00, 7.01, and 7.02

Note 1668882 - Note Assistant: Important notes for SAP_BASIS 730,731,740

For SAP NetWeaver 7.30, 7.31, and 7.40




Best Regards

Roland Kramer, PM BW/In-Memory

Basis consultants might have questions about:


  • What tasks need to be performed in a typical BW system?
  • What are the different BW objects and their uses?

From a Basis perspective, BW is just an SAP system following the 3-tier architecture.


What is done in BW system?

Data from different systems like SAP ECC, MDM (or any R/3), Oracle, and MS SQL Server (SAP and non-SAP systems) is extracted, transformed, and loaded into the SAP BW system for reporting purposes.

What are the components of BW system?

Typical BW system will have components like shown below. These are the backend components that will be installed for a BW system.



BW system also has a front-end component called BEx . BEx is SAP BW component that provides flexible reporting capabilities.


What are the frequently used terms in BW system?

Data Source

A DataSource is a set of fields used to extract data from a source system and transfer it to the entry layer of the BW system. In other words, a DataSource contains the source table and field information from which data is to be extracted into BW. When the DataSource is activated, the system creates a PSA table in the Persistent Staging Area (PSA), the entry layer of BI. The extracted records are stored in the PSA table.

Info Package

Before data can be processed in BW, it has to be loaded into the PSA using an InfoPackage. In the InfoPackage you specify the selection parameters for transferring data into the PSA. A DataSource can have multiple InfoPackages.

Info Object

InfoObjects are the smallest units of BW. They map the information in a structured form that is required for constructing InfoProviders. The data from every field in a source table can be stored in a corresponding InfoObject in BW.

Data Transfer Process (DTP)

DTP can be used to load data from one target to another using transformation and other filters.  

Data Target - A data target is an object into which data is loaded. A distinction is made between:

  • Pure data targets, for which no queries can be created or executed.
  • Data targets for which queries can be defined; these are called InfoProviders. Queries can always be defined on basic InfoCubes, so a basic cube is never a pure data target, but both a data target and an InfoProvider.


Info Provider - An InfoProvider is an object for which queries can be created or executed in BEx. InfoProviders are the objects or views that are relevant for reporting.

Data Store Object (DSO)

DSO – a data target and InfoProvider, used in the BW system to store data and for reporting purposes. In BW, data is stored in multiple layers (e.g. DSO, InfoCube). The data from the PSA table (the entry point into BW) can be loaded into a DSO, an InfoCube, or a master data InfoObject. The DSO structure is a flat physical table that contains fields, indexes, etc. It can be compared to a table in Oracle or any other database, which consists of fields, primary keys, indexes, etc.

Associated tables (custom-created):

New data: /BIC/A<dsoname>40

Active data: /BIC/A<dsoname>00

Change log: /BIC/B########## (dynamically generated)
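The naming pattern above can be expressed as a small helper. This is just an illustration; the DSO name ZSALES in the comment is hypothetical, and the change-log name is generated by the system, so only its prefix is predictable:

```python
def dso_table_names(dso_name: str) -> dict:
    """Derive the generated table names for a classic BW DSO."""
    name = dso_name.upper()
    return {
        "new_data":    f"/BIC/A{name}40",   # activation queue (new data)
        "active_data": f"/BIC/A{name}00",   # active data table
        "change_log":  "/BIC/B##########",  # system-generated, not derivable
    }

# e.g. dso_table_names("ZSALES")["active_data"] -> "/BIC/AZSALES00"
```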

Info Cube

Cube – data target, Info Provider, used in BW system to store data and for reporting purpose. An InfoCube is a multidimensional data structure and a set of relational tables that contain InfoObjects. Hence info cube is not a flat structure like DSO, but has many tables related together. When an info cube is activated, all relevant backend tables are created automatically by the system. To check the backend tables created for an Info Cube, use Tcode LISTSCHEMA.

To summarize, data in a BW system is loaded into a master data InfoObject, a DSO, or an InfoCube. However, there are other objects which can display data without storing it physically.


Process Chains

Process chains are used to automate and schedule the data extraction, transformation and loading process in BW system.

Below is the process flow within BW system.




What are the typical tasks Basis perform in BW system?

  • Apply support package
  • Apply SAP Note
  • Installation/Upgrade
  • Moving transports
  • Analyse EWA recommendation
  • Monitor Jobs
  • Maintain Source system connection
  • Maintain RFCs
  • Monitor table space and archive logs
  • Monitor system log
  • Operating system monitor
  • ABAP runtime errors


There might be much more activities that I haven’t listed here but this will give you basic overview of the activities.

Next part, we will see the common errors occurring in BW system that needs basis involvement.








Cleaning up the hard drive of my laptop this week I've come across a number of documents that reminded me that in spring 2004 we - i.e. the BW and TREX development teams - had the first version of the BWA up and running internally. Then, it still was project Euclid. Later that year, Euclid was presented publicly at Techeds in San Diego and Munich - see also Fig. 1. Euclid paved the way to SAP's first commercial in-memory product, the BW Accelerator (BWA). The BWA was first shipped with NetWeaver 2004s (i.e. BW 7.0). It removed the need to define and maintain aggregates, i.e. materialized aggregations (group-by's and filters). At that time, BW's change run - the process that adjusted aggregates to master data changes, like changes in attributes and hierarchies - was one of the top-3 critical processes in almost every customer installation.


Fig. 1: "High Performance BI" (aka BWA) as explained by Shai Agassi in his keynote at Teched San Diego in 2004.


One of the first customers adopting BWA simply upgraded from BW 3.5 to BW 7.0, leaving everything in place but adding BWA to their BW system. Thus, there was no change for the end user except the power of BWA. They reported the following effects:

  • more than 90% of queries on indexed cubes ran faster than before
  • one third of queries saw response time cuts by more than half
  • the average query runtime was cut by two thirds

Those impressive improvements in query performance translated into better usage of their BW system. As adding BWA was the only change, the improved usage could be attributed to the improved query runtimes:

  • the number of query runs per week increased by over 60%
  • over two thirds of the queries saw an increased usage

Another episode came from a European customer who ordered a BWA instance for a proof of concept and then did not allow the hardware vendor to remove the BWA, as they wanted to keep it and use it immediately. Since then, 1000+ BWAs have been installed, and nowadays BW-on-HANA basically runs a "BWA within", providing even better query performance without aggregates and removing the need to synchronize a standard RDBMS and a BWA, as all of it (RDBMS and BWA) is one, namely HANA.

Aggregates usage has declined significantly over the years, despite growth rates of around 15-20% per year for BW installations. Fig. 2 shows how the number of aggregates-related support tickets raised by BW customers decreased by 60% over the course of 8 years, while the number of BW installations roughly tripled. Oddly, some people still talk about aggregates in the context of BW, even 10 years after their phase-out started; those comments trail 10 years behind the facts.


Fig. 2: Number of support tickets related to BW aggregates. Number of tickets in 2006 = 100%.


This blog has been cross-published here. You can follow me on Twitter under @tfxz.

Step 1:

T-code: SE38



Press Execute (F8)




Step 2:

Enter the APD technical name and version details, then enter '/h' in the T-code bar, press Enter, and execute (F8).




Step 3:

Once the debugging screen appears, follow the steps below to set a breakpoint.


Breakpoints -> Breakpoint at -> Breakpoint for Method


Then select the local Class in class radio button and give the below details.















After getting the Breakpoint set message, press F8 to reach the Break point position.





Step 4:

After reaching the breakpoint, press F5 to step into the FORM that executes the APD routine code.




Press F5 to execute the APD routine code




Here we can debug the custom APD routine code.

SAP POS DM (SAP Point-of-Sale Data Management) effectively processes retail sales transaction data, which can easily be made available to SAP NetWeaver BW-based reporting. SAP provides the standard BI Content extractor POS Analytics: Retail Control (0RT_PA_TRAN_CONTROL) for this purpose.


In simple business terms, a sales transaction is a goods exchange between store and customer which includes sales and/or returns, plus a payment or refund.


Below are some of the key components of a normal sales transaction at a Retail store –

  1. Transaction Header: It consists of Transaction date/time, Store code/description & Transaction number
  2. Transaction Item: It consists of Sales item information such as Qty sold/returned, Net Sales/return value, Markdowns/Discounts
  3. Transaction Tender: It consists of Tender type & Tender Value


A sample representation based on a store receipt is shown below for easy understanding:
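The three components above can also be sketched as a minimal data structure. All field names, the example values, and the consistency check are illustrative only, not the actual POS DM data model:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TransactionItem:
    article: str
    qty: int            # positive for sales, negative for returns
    net_value: float    # value after markdowns/discounts

@dataclass
class TransactionTender:
    tender_type: str    # e.g. "CASH", "CARD"
    value: float

@dataclass
class SalesTransaction:
    store: str
    trans_number: str
    trans_datetime: str
    items: List[TransactionItem] = field(default_factory=list)
    tenders: List[TransactionTender] = field(default_factory=list)

    def is_balanced(self) -> bool:
        """A consistent receipt: tenders cover the net item total."""
        item_total = sum(i.net_value for i in self.items)
        tender_total = sum(t.value for t in self.tenders)
        return abs(item_total - tender_total) < 0.005

# Example receipt: one sale, one return, paid in cash.
t = SalesTransaction(
    store="R100", trans_number="000042", trans_datetime="2014-03-28T10:15",
    items=[TransactionItem("A1", 2, 19.98), TransactionItem("B7", -1, -4.99)],
    tenders=[TransactionTender("CASH", 14.99)],
)
print(t.is_balanced())  # True
```

A balance check like this mirrors what POS DM's inbound processing validates: the tender total of a transaction must match its net item total.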



