
Short version:

The following is a slight rant covering recent challenges in tackling concerns and complaints on the (somewhat emotive) subject of BI report performance. I'm currently trying to source the opinions and experiences of my peers and would be very grateful if people could comment on their own experiences in this field.

My aim is to (eventually) provide evidence-based targets and thresholds for our BI user community. We would like to compare our targets (detailed below) with what peer organisations are doing, to see if these targets are realistic or indeed typical.

If you have time, please take this survey: Industry Views on BI Report Performance, Targets and Thresholds


Long version - Background:

During a recent global transformation programme, a Global BI instance was created (BW 7.3 + BWA with BOE 4.0, later migrated to 4.1). Because BI was based on a new single-instance ECC, a portion of the reporting requirements was completely new, often replacing manual processes of data extraction and collation.


To meet the requirements of a global audience, while keeping reporting information and layout flexible, the following was typical:

  • Multiple reporting requirements were aggregated up into a single physical report or dashboard (consolidation/simplification)
  • Reports were built with a large number of mandatory/optional prompts (maximum flexibility)


A mixture of technologies was deployed to fulfill requirements:

  • Crystal Reports for Enterprise: fixed format reports (typically Finance, P&L-style reports)
  • Analysis for OLAP: flexible query and analysis (slice & dice) reports (with self-service options)
  • Analysis for MS Office: initially deployed for reports that required manual data entry via Integrated Planning (IP)
  • BI Dashboards: Operational dashboards and Finance bridge reports (i.e. waterfall charts)
  • BEx Web Templates: Formatted analysis (slice & dice) reports that couldn't be delivered in Analysis OLAP

The Crystal and Analysis reports consume BW queries as their semantic layer and are published to (and called from) the BusinessObjects Enterprise platform (SAP BI4). The BI Dashboards and Web Templates are published into SAP BW and called via NW Java.

Because the reports replaced a lot of manual processes, it was initially decided that performance targets would not need to be applied: manual processes that took days, or even weeks in some cases, would be replaced by reports with a run-time of minutes. Of course, once the reports were actually in the hands of users that message was lost and the question of performance arose. To provide a starting point for conversations and subsequent plans for remedial action, performance thresholds were introduced. See diagram below.
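
To make the banding idea concrete, here is a minimal sketch in Python. The target and threshold values are placeholders (our actual figures are in the diagram), and the function name is mine, not part of any tool:

```python
# Hypothetical illustration: band report run-times against a target and a
# threshold. The values below are placeholders, not our actual figures.

TARGET_SECONDS = 60       # run-times at or below this are "on target"
THRESHOLD_SECONDS = 180   # run-times above this trigger remedial action

def classify_runtime(seconds: float) -> str:
    """Classify a single report run-time against the target/threshold bands."""
    if seconds <= TARGET_SECONDS:
        return "on target"
    if seconds <= THRESHOLD_SECONDS:
        return "acceptable"
    return "outside threshold - candidate for remedial action"

if __name__ == "__main__":
    for runtime in (45, 120, 300):
        print(f"{runtime}s -> {classify_runtime(runtime)}")
```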

These thresholds were not in place when the reports were designed and built, so many reports, especially those that aggregated multiple reporting requirements into a single report, fell outside them. At least we now had a baseline to work from.

Dealing with the "Our BI reports are slooow!" complaint

The topic of BI performance, especially the end-to-end report run-time experience for the user, is at best subjective and often emotive. In recent months, differentiating between subjective and objective, determining criteria and methods for monitoring, testing and troubleshooting, and making performance improvement recommendations has been a large part of my work day. And evening. And night.

We live in a time where people expect a web page to respond in seconds to their demand for information, so a report that takes a few minutes to run can be considered slow. For those of us who have had a career around BI, some might share my opinion that the only reports slow enough to be a problem are those that run for more than 24 hours and risk failing before the source is dropped and re-loaded! So, what is "slow" and how do we make it something quantifiable and measurable?

Our answer here was to deploy Solution Manager's End User Experience Monitoring. This finally let us have a fact-based conversation about performance. If a user says a report is "slow" we can evaluate whether it runs at the speed IT expects. If it does, we can infer that the user simply judges the report to run more slowly than they would like (the "Google effect", as we call it). If the report is running more slowly than its usual benchmark, we can see whether there is a persistent problem or an isolated spike. Either way, we can manage users' expectations based on facts. This alone was a major milestone for us in terms of managing performance expectations.
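
As a rough illustration of the "isolated spike vs persistent problem" logic, here is a hand-rolled sketch with made-up numbers; it is not the Solution Manager API, just the shape of the decision we make:

```python
# Illustrative sketch: given a history of run-times for one report, decide
# whether a reported "slow" run looks like an isolated spike or part of a
# persistent degradation. All figures are invented for the example.
from statistics import mean, stdev

def assess_runs(history: list[float], recent: list[float],
                sigma: float = 2.0) -> str:
    """Compare recent run-times against the report's own historical baseline."""
    baseline = mean(history)
    spread = stdev(history)
    slow = [r for r in recent if r > baseline + sigma * spread]
    if not slow:
        return "within benchmark - manage expectations (the 'Google effect')"
    if len(slow) == 1:
        return "isolated spike - keep monitoring"
    return "persistent problem - investigate"

if __name__ == "__main__":
    history = [62, 58, 65, 60, 59, 63, 61, 64]   # seconds, past runs
    print(assess_runs(history, [60, 118, 121]))  # two slow runs -> persistent
```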

One thing that quickly became apparent was that not all locations are created equal. A report that ran "as expected" in Europe might run almost twice as slowly in some high-latency geographical regions. Solution Manager's End User Experience Monitoring was again useful here: we could look at the history of a report's run-time and compare the report to itself, or compare its run-time in one location against the same report in a different location. This doesn't solve any problems, but it allows for a fact-based conversation around performance rather than the usual emotive one. Trust me, that's a large part of the problem.
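
A toy example of that location-to-location comparison, with entirely hypothetical locations and run-times:

```python
# Illustrative sketch: compare the same report's run-times across locations.
# The data is made up; the point is the fact-based comparison, not a fix.
from statistics import median

runs = {  # hypothetical run-times in seconds, per location
    "Frankfurt": [61, 64, 59, 63],
    "Singapore": [110, 118, 105, 122],
    "São Paulo": [98, 104, 95, 101],
}

baseline = median(runs["Frankfurt"])  # treat one location as the reference
for location, times in runs.items():
    ratio = median(times) / baseline
    print(f"{location}: median {median(times):.0f}s ({ratio:.1f}x baseline)")
```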

Infrastructure as an Aggravating Factor

Obviously, as you would expect of a large enterprise organisation hosting a global instance, the BI solution is wrapped in multiple IT layers of integration (firewalls, identity and access management, load balancers, etc.). It became clear that no matter how well we fine-tuned reports and queries, other areas heavily impacted performance. We identified all of the "pillars", as well as the "blobs" within the pillars, and created a working group focused on identifying and solving any bottlenecks. Some were easier than others.

Actually making a difference...no, really!

Some quick wins (which I may expand on in the comments if people are interested) were changes made on the firewalls to align timeouts (see the sketch below), increasing the number of dialog work processes in BW, applying dozens and dozens of SAP Notes, changes to the Portal, and some tweaking of BI platform settings for many of the reporting services. Some much trickier subjects, such as making significant changes to the virtualization layer or upgrading to BI 4.1 Service Pack 6, are currently being considered.
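
To illustrate why timeout alignment matters: every layer between the user's browser and BW enforces its own timeout, and the effective limit is the smallest of them, so a report that legitimately runs for several minutes gets killed mid-flight by whichever layer gives up first. Here is a minimal sanity-check sketch; the layer names and values are placeholders (rdisp/max_wprun_time is the standard BW dialog work process runtime parameter, but the figure shown is not ours):

```python
# Hypothetical sanity check: every layer in the chain must allow at least as
# long as the slowest acceptable report run, or users see broken sessions
# instead of slow-but-finished reports. All values are placeholders.
MAX_REPORT_SECONDS = 600  # the longest run we are prepared to allow

timeouts = {  # placeholder idle/request timeouts per layer, in seconds
    "load balancer": 300,
    "firewall": 900,
    "BI platform session": 1200,
    "BW dialog work process (rdisp/max_wprun_time)": 600,
}

for layer, timeout in timeouts.items():
    ok = timeout >= MAX_REPORT_SECONDS
    status = "OK" if ok else "TOO SHORT - will cut off long-running reports"
    print(f"{layer}: {timeout}s -> {status}")
```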

The end result is that, even with all of this effort, a large portion of our corporate "core" reports still runs outside of the thresholds. While we continue to make all the positive changes we can, it's likely these will yield ever-diminishing returns. This is where I am keen to get feedback from my peers. As I mentioned at the very beginning of this post, we are trying to determine whether our approach (or rather our targets and thresholds) is aligned with other organisations.

If you have time, please take the survey (so I can easily share results here) or simply add your thoughts in the comments. Either way, thanks for taking the time to read this; I just needed to get it all off my chest!

Survey: Industry Views on BI Report Performance, Targets and Thresholds
