
Last week, during a customer meeting, someone from the customer’s IT team insisted that the query itself provides the answer and that reporting is often superfluous. “The query already contains the answer they need,” he stated, chest puffed out. It’s an interesting perspective – though I shudder to think how many narrow queries that customer must build and maintain – but it just doesn’t work in practice.

Reports consist of crosstabs, tables, charts, and calculations that provide answers to numerous questions. Even a single 40-row, five-column crosstab can contain enough detail to answer dozens or even hundreds of questions. But in the ten minutes a user might have to glean a takeaway from the table, it’s often difficult for them to reach the same conclusions the author intended to surface when building the report. Typically, consumers need help understanding what a report’s data is telling them. That’s why authors – even Web Intelligence users building ad-hoc reports – spend time formatting tables and charts, laying them out in a certain way, adding single cells for titles and guides to interaction, changing cell highlighting colors, using conditional formatting (“alerters” in WebI), adding input controls to enable simple customization, and so on.

We’ve even seen reports with images captured from WebI’s user interface, such as a button, placed to guide the user to a feature that aids their interactivity.

Sometimes these efforts, while valuable, miss the most basic challenge consumers face when trying to understand a report’s content. A BI project owner once told me, “At meetings, when comparing what are seemingly identical reports, my users would nearly break out into fist fights over whose numbers were right. They spent too much time arguing about calculations and not enough about what to do about the results.” Ironically, both sides were almost always “right.” The differences came down to the prompt/parameter or report-side filters used, or to when the refresh occurred. Without a way to see this on the report, they assumed any deltas resulted from the calculations they had chosen.

I thought about that comment the other day after I received a WebI doc from a customer. It contained a great example of a simple step authors can take to help consumers more efficiently understand, interact with, and make decisions on the content they receive. It probably took no more than two minutes to set up. They used these templates (from WebI's left-hand panel) and simply dragged them into a report tab:

"

Once they’re placed in the report, they automatically generate the relevant description. Here’s a snapshot of the output from one of these cells – the prompt summary (known as parameters to BEx users).

[Image: the generated prompt summary output]

(Note: to preserve the customer’s confidentiality, references to the customer’s name have been redacted and replaced with ACME.)

This particular report has a lot of optional prompts that were “ignored” by the person consuming the content. But, what if the prompt “Profit Center Hierarchies Name” had “Division X” and “Division Y” as a response? Knowing that Division Z was not included in the report’s output helps explain the totals.

The Report Filter Summary works the same way – except it summarizes the filters the author has placed on tables and charts after the query results have been retrieved. You can imagine how useful it is for an end-user to know, at a glance, how the data in the report has been transformed – especially when the report is, as is frequently the case, printed and shared with users who never see it online.

As a bonus, the author of the sample report also added a link to documentation – hosted on a centralized server – describing the report in more detail: its goals, ways to use it, who owns it, etc. Such efforts are appreciated by end-users, but they also help reduce the number of questions authors must field. Everyone benefits.

[Image: the report's link to its documentation]

I’ve come across many authors who insist that the consumers of their content know that content very deeply, and that such metadata is needless overhead. I don’t buy that for a second. Not everyone is as conversant with the data as the analyst who provided the original requirements. A new hire, for example, would be a LOT happier with the background info provided on this tab than making assumptions about why this or that calculation seems lower than expected. Or a consumer looking at an archived report whose content has long since evolved. Or the colleague who printed the report out and reads it on the train. All would benefit from being able to glance back at the metadata – dragged into a cover page, a header, or anywhere in the document – that describes the content.

Two minutes of drag and drop and a little formatting seems a small price to pay for ensuring just a bit more consumer understanding. And maybe less office violence.
