
BI Platform


Hey!

 

Great news from Sathish Rajagopal - he's announced on his blog http://scn.sap.com/blogs/sathishrajagopal that we've just released 'Phase 3' of our SAP BI Pattern Books project!

 

These now include a further two books, comprehensive guides tailored and delivered specifically for customers moving to BI 4.1.

 

 

  • How to successfully Upgrade from SAP BusinessObjects Enterprise XI 3.1 release to SAP BusinessObjects BI 4.1 release
  • How to successfully Update from SAP BusinessObjects BI 4.0 release to SAP BusinessObjects BI 4.1 release

 

 

Think of these as "friendly" instruction manuals, with step-by-step instructions, following the particular 'pattern' of real-world deployment scenarios. 

The perfect complement to our technical Help guides, these Pattern Books also list the challenges, tips & tricks, best practices, etc., including links to relevant SCN articles and SAP Notes.

 

 

Here are the links to the SAP BI Pattern Books:

 

 

If you have any questions please ask. 

Enjoy, and good luck!

 

Cheers,

H

Dear BI users around the globe...

 

For all those who feared BI 4.1 would be the end of the line for the well-known and beloved SAP BusinessObjects BI Suite, some interesting news was shared by Steve Lucas during the ASUG User Conference (SABOC).

 

“To be clear: Yes, there will be a BusinessObjects BI 4.2,” Lucas announced, “and it will be in 2015.” He promised some major enhancements, such as in administration, ease of use and live streaming of data. “Above all else,” he said, “the No. 1 feature will be quality.”

 

Read more about what is coming in the SAP Analytics portfolio in the vision article shared by ASUG via http://www.asugnews.com/article/sap-shares-analytics-vision-lumira-bi-4.2-details-at-asugs-saboc-event

 

Once more details can be shared, I will update you all on our plans for BI 4.2.

Regards

Merlijn

Hi everyone,

 

Just a quick hello to let you know that a new BI-related course has just opened for registration.

 

This free openSAP course is called "BI Clients and Applications on SAP HANA" and is designed to help you get hands-on experience in deploying BI on SAP HANA, leveraging HANA both as a data source and as a platform.

 

 

Learn about the new features in SAP BusinessObjects BI 4.1, and how it is optimized for SAP HANA. Benefit from best practices around design, development, and troubleshooting of BI content, and simplify your SAP BusinessObjects BI toolkit. You can also experience SAP Lumira in its desktop, cloud, and server releases to create amazing visualizations that can be consumed anywhere.

 


BI Clients and Applications on SAP HANA begins on October 29 and enrolment is now open. The course is targeted at BI Developers and Business/Data Analysts, but is open to anyone interested in learning about BI.

 

 

 

Please note that whilst the (optional) hands-on exercises will incur some minimal infrastructure costs, registration, learning content, the final exam and the Record of Achievement are all free of charge.

 

 

Enroll today!

 

Regards,

H

I would like to share my recent experience of configuring Windows AD and SSO on a new SAP BI 4.1 BOBJ server. I used to follow the traditional approach of using a Kerberos keytab file; this time I went for the plain password technique. Thanks to SAP Support for clarifying the different approaches.

 

Below are the steps followed in my configuration…

 

Service Account Setup

 

1.1    Creation of New Account

 

 

·         To set up user authentication for a service, you must register the service as a user in AD on the Domain Controller.

·         To register the service, on the Domain Controller, open the Active Directory Users and Computers snap in.

·         Click the Users folder to display a list of users and on the Action menu, click New and then click User.

·         Enter a name and logon name for the new service, and then click Next.

·         On the next screen, enter a password for the service. Ensure that the User must change password at next logon option is not selected.

·         Click Next and then click Finish.

·         Right-click the user you have entered in the User folder list, and then click Properties.

·         Click the Account tab and then select Account is trusted for delegation and Password never expires. This prevents the service account from expiring, which would cause Kerberos errors.

·         If your Domain Controller is running at a lower Domain Functional Level (lower than Windows 2003 Domain), view the Account properties for the user you created in step 2, and select Use DES encryption types for this account.

·         Note: In Windows 2003 and 2008 Domain Functional Levels, RC4 encryption is used by default.

·         Click OK.

 

1.2      Settings to be done on the User Account

 

·         Ensure the Password Never Expires option is enabled in the User Account Properties → Account tab

 

1.3     To grant the service account rights

 

·         Log on to the BOBJ server and perform the below steps

·         Click Start > Control Panel > Administrative Tools > Local Security Policy.

·         Expand Local Policies, and then click User Rights Assignment.

·         Double-click Act as part of the operating system.

·         Click Add.

·         Enter the name of the service account BOserviceaccount, and then click OK.

·         Ensure that the Local Policy Setting check box is selected, and click OK

·         Ensure the service account BOserviceaccount has the following System Rights enabled on the BOBJ Server

   *  Act as part of Operating System

   *  Log on as a Batch Job

   *  Log on as a service

   *  Replace a process level Token

 

 

1.4     To add an account to the Administrators group

 

 

·         Log on to the BOBJ server and perform the below steps

·         Right-click My Computer and click Manage.

·         Go to System Tools > Local Users and Groups > Groups.

·         Right-click Administrators, then click Add to Group.

·         Click Add and type the logon name BOserviceaccount of the service account.

·         Click Check Names to ensure that the account resolves.

·         Click OK, and then click OK again.

 

    2.0  SPN

 

Steps to set the SPN:

 

In order to create the appropriate Service Principal Names (SPNs), execute the following commands on the Active Directory server:

 

·         Login to the Domain Controller Server

·         Use the setspn -a command to add the HTTP service principal names to the service account which was created earlier

a)     setspn -a HTTP/BOBJServerName BOserviceaccount

b)     setspn -a HTTP/BOBJServerName.Domain.COM BOserviceaccount

c)     setspn -a HTTP/<IP address of BOBJ Server> BOserviceaccount

d)     setspn -a ServicePrincipalName BOserviceaccount

 

Verify the Service Account Properties; they should be similar to the below screenshot.

serviceaccountprop1.png

·         Go to the Delegation tab in the Properties of the service account created and enable the option “Trust this user for delegation to any service (Kerberos only)”, as shown in the below screenshot

 

serviceaccountprop2.png

·         Run setspn -l BOserviceaccount to verify that the HTTP service principal names were added to the service account.

 

      3.0     Expected Output

      At the end of the activities performed in the preceding sections, you should have the following details on hand:

·         Service Principal Name

·         Domain Controller IP address and its FDQN

·         Service Account

·         Service Account Password

      4.0     Adding Entries for AD Configuration

 

·         Create the files bscLogin.conf and Krb5.ini in the BOBJ installation directory

·         Add the below entry to bscLogin.conf (for the Tomcat web application server)

com.businessobjects.security.jgss.initiate {
    com.sun.security.auth.module.Krb5LoginModule required;
};

·         Add the below entry in the Krb5.ini file. Ensure the entry kdc=<Domain Controller Server Name>.DOMAIN.COM

[libdefaults]
default_realm = DOMAIN.COM
dns_lookup_kdc = true
dns_lookup_realm = true
default_tgs_enctypes = rc4-hmac
default_tkt_enctypes = rc4-hmac
[realms]
DOMAIN.COM = {
kdc = <Domain Controller Server Name>.DOMAIN.COM
default_domain = DOMAIN.COM
}

To verify that this file is saved correctly, navigate to the C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64\jdk\bin\ folder on the BOBJ server and execute ‘kinit BOserviceaccount’ in a command prompt. If a new ticket is stored, the file is correct.
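
Concretely, the verification from a command prompt on the BOBJ server looks like the below; kinit prompts for the service account's password and reports that a new ticket has been stored:

cd "C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64\jdk\bin"
kinit BOserviceaccount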

 

·         Open the Tomcat Configuration and add the below command lines to the Java Options

-Djava.security.auth.login.config=C:\BO\WinAD\bscLogin.conf
-Djava.security.krb5.conf=C:\BO\WinAD\Krb5.ini

·         Update the server.xml located in the directory C:\BO\Tomcat\conf: search for the Connector string and ensure the Connector entry is similar to the entry mentioned below

<Connector port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" compression="on" URIEncoding="UTF-8" acceptCount="100" debug="0" disableUploadTimeout="true" enableLookups="false" maxSpareThreads="75" maxThreads="150" maxHttpHeaderSize="65536" minSpareThreads="25" compressionMinSize="2048" noCompressionUserAgents="gozilla, traviata" compressableMimeType="text/html,text/xml,text/plain,text/css,text/javascript,text/json,application/json" />

 

4.1     Active Directory SSO Configuration

 

 

  ·         The Active Directory SSO configuration involves the creation of .properties files in the Tomcat webapps folder where the BOE .war file is deployed

·      Go to the location C:\BO\Tomcat\webapps\BOE\WEB-INF\config\custom and create the BIlaunchpad.properties file and add the following entries in that file.

                     authentication.default=secWinAD
               authentication.visible=true

·      Go to the location C:\BO\Tomcat\webapps\BOE\WEB-INF\config\custom and create the global.properties file and add the following entries in that file.

                     sso.enabled=true
               siteminder.enabled=false
               vintela.enabled=true
               idm.realm=DOMAIN.COM
               idm.princ=ServicePrincipalName
               idm.allowUnsecured=true
               idm.allowNTLM=false
               idm.logger.name=simple
               idm.logger.props=error-log.properties
               idm.allowS4U=true

Note that we are not keeping a keytab file path in the global.properties file, since this configuration uses the plain password approach.

·         Open the Tomcat configuration and add the following lines to the Tomcat Java Options:

                       -Dcom.wedgetail.idm.sso.password=<Password of BOServiceaccount>

                       -Djcsi.kerberos.debug=true

 

Start Tomcat, go to C:\BO\SAP BusinessObjects\Tomcat\logs\ and check that stdout.log shows ‘credentials obtained’.

Test that single sign-on is now working from a browser on a client system (not on the BOBJ server itself).

 

To prevent SSO from breaking on patch upgrades, we can copy the BIlaunchpad.properties and global.properties files from C:\BO\Tomcat\webapps\BOE\WEB-INF\config\custom to C:\BO\SAP BusinessObjects Enterprise XI 4.0\warfiles\webapps\BOE\WEB-INF\config\custom
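
For example, from a command prompt on the server (using the paths above; adjust them to your installation directory):

copy "C:\BO\Tomcat\webapps\BOE\WEB-INF\config\custom\*.properties" "C:\BO\SAP BusinessObjects Enterprise XI 4.0\warfiles\webapps\BOE\WEB-INF\config\custom\"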

 

 

Hope this helps..

 

Thanks and Regards

Sandeep Chandran


Continuing with my earlier blog, BusinessObjects Administration - Setting up security model – An easy way to configure and manage: in this blog we are going to look at the different security model designs I have come across in my experience. Here are some of them.

1. Basic setup with Functional user categorization


This is the basic setup, without any application/department-wise user categorization. Users are categorized only based on their functionality (Viewer/Analyst/Author). This means users can view data across different applications, and there is virtually no restriction on viewing BI content. The functionalities enabled for each user category are described in my previous blog BusinessObjects Administration - Setting up security model – An easy way to configure and manage

Graphical representation of the model


Model 1.jpg

2. Hybrid setup with inherited user category

 

In this model, users are segregated by application/department first. The subsequent user categorization is additive in nature, and the hierarchy falls from the least privileged group to the most privileged group by making use of the group-level inheritance capability. Similar to the type 1 model, users can view all the reports, but only within their own department.


Graphical representation of the model


Model 2.jpg


3. Hybrid setup  with cross Application/departmental & user category


This is the most effective and commonly used security model across various BusinessObjects deployments. Compared to the previous two models, this model ensures BI content is accessed in a highly secure way, i.e. reports are viewed only by the groups of users who are intended to see them. Here, user groups within a department have distinct functional access, rather than the additive functional access of the type 2 model.


Graphical representation of the model

Model 3.jpg

We will look at each model in detail in upcoming blogs. Keep reading.



The Issue

 

I recently deployed SAP BusinessObjects 4.1 across several environments at a client site and ran into the following error message when saving a Web Intelligence report to the platform from the Webi Rich Client, Java Panel and HTML Panel.


The document's serialization version is too recent (Error: WIS 30915)


Scheduling a Web Intelligence report to Webi format also resulted in the above error.

 

The Web Intelligence processing server logs left the following trace:

 

**ERROR:SRM:An internal error occured while dgSerializeManagerImpl is calling ibo_idgSrmStore->PublishToCorporate [kdgSerializeManager.cpp;2533]

 

  • No issues were logged in the setupengine.log
  • Running a repair on the deployment didn't resolve the issue

 

Root Cause

 

Query Builder reveals what would appear to be an incorrect version number for the Web Intelligence application

 

SELECT SI_ID, SI_NAME, SI_FILENAME, SI_WEBI_SERIALIZATION_VERSION FROM CI_APPOBJECTS WHERE SI_KIND = 'WebIntelligence'


14.0.3.0.jpg


Solution

 

  • Note the dfo filename above - BusinessObjects_WebIntelligence_dfo.xml.
  • Stop all SIA nodes
  • Backup the CMS database
  • Copy the file from the <InstallDir>\SAP BusinessObjects Enterprise XI 4.0\dfo directory to <InstallDir>\SAP BusinessObjects Enterprise XI 4.0\packages directory
  • Start the SIA Node containing the CMS
  • Confirm the file has been removed from the <InstallDir>\SAP BusinessObjects Enterprise XI 4.0\packages directory
  • Confirm the update via Query Builder

14.1.4.0.jpg

 

  • Re-run the earlier tests... all working

 

Side Notes

 

I ran into this issue on an upgrade from BI 4.0 SP5 FP4. I wasn't able to replicate the issue (as expected) on an environment that had a clean build, i.e. no upgrade.

 

I'll report the issue through the correct channels in due course but I felt it was best to get this information out there as I burnt several hours troubleshooting this and no doubt someone else might run into the same problem.


Designing a security model is one of the important phases in BusinessObjects implementation/migration projects. A well-organized security model not only makes administration easier but also ensures security is seamlessly implemented across different functional/application user groups with less maintenance effort.

 

From this blog onwards, we are going to see how to design and implement the security model. Before starting the design, we should consider the below things.

 

Various rights categories (as of BI 4.0 SP3)

 

There are 4 different rights categories, as we are already aware, and they are listed below

 

  • General
  • Application
  • Content
  • System

 

I have consolidated all of them and the same is depicted below

rights.png

 

Various user categories


Always categorize users based on application as well as functionality (what they can do). I have categorized users based on their BusinessObjects application/content and their functionality on BI content.


Application wise user category


User category                       Description
Crystal users                       Crystal report users
WebI users                          Web Intelligence report users
Dashboard users                     Xcelsius dashboard users
Design Studio users                 Design Studio dashboard users
Universe/Information designers      Universe designers
Analysis users                      Analysis application users
Explorer users                      Explorer application users
Mobile users                        Mobile report users


Functional user category


User category                Description
BOE Users                    All users of the BusinessObjects system
Viewers                      Users who can only view/refresh reports
Interactive Analysts         Users who can refresh/create/modify the reports they create, but cannot create/modify corporate reports
Interactive Authors          Users who can refresh/create/modify corporate reports
Super Users/Managers         Users who can manage and maintain documents as well as users for a particular application/department
Content Schedulers           Users who can schedule reports for themselves and on behalf of others
Content Promoters            Users who can migrate/promote BI content across different environments
Delegated Administrators     Users who can administer the BusinessObjects deployment as a whole or in part

 

Based on the rights and user categorizations we are going to see more about the security model design in my upcoming blogs! Thanks for reading!



We can add HANA Live views in UDT by adding a HANA connection. Normally, the views are stored in the _SYS_BIC schema.

Step 1: Create a HANA connection with the JDBC driver

p1

 

Step 2: Choose the JDBC driver and add the credentials and host name for the HANA DB

p2

p3
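
As an illustration (host name and instance number are assumptions): for a single-container HANA system with instance number 00, the SQL port is 3<instance>15, so the host entry would look like hanadev01:30015 and the underlying JDBC URL takes the form jdbc:sap://hanadev01:30015.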

 

Step 3: Test the Connection

 

p4

 

 

Step 4: Add a new class under project.

 

p5

Step 5: Select the view from the _SYS_BIC schema

 

p6

 

p7

 

 

Step 6: Select your view and add the objects for the class.

 

p9

You have now successfully added the HANA Live view from the HANA database into UDT. A universe can be created using the HANA Live view.

Thanks

Let me know if you have any issues.


As we are all aware, in BI 4.x the capabilities provided by the as-is sample Audit reporting suite are limited, and requirements for additional Auditing capabilities have been flowing into SCN for quite some time.

 

Can I extend the Auditing capabilities, and how?

 

We can enhance the existing audit capability beyond the as-is sample. Besides the default sample reports provided, I have a few more requirements, such as the following.

  • Frequently used reports
  • List of most active users
  • Who are all my Mobile BI users?

 

To achieve the above requirements, I have adopted the following approaches.

 

Approaches

 

Below are some of the approaches I have considered for Audit reporting enhancements.

 

1. Creating customized Audit reports from the existing Audit schema

 

We can create enhanced audit reports from the existing Audit schema based on our requirements. We can create extended reports by referring to an existing report and modifying its prompts/filters, etc.

 

For example, to get Mobile report access, use "Application_Type_Name" from the table "ADS_APPLICATION_TYPE_STR", which provides the application type the access came from, i.e. a mobile device. (It is available as "Client Application Type" in the class "Events" in the Universe.)

 

2. Creating Custom tables in Audit schema for the reporting

 

          We can create custom tables in the Audit schema based on our requirements. One such option is to create derived tables in the Audit universe based on custom SQL statements that can be run directly against the Audit database.

 

The most active users can be obtained by running the below SQL on the Audit database

-----------------------------------------------------------------------------------------------------------------

SELECT

            ADS_EVENT.USER_NAME AS USER_NAME,

            COUNT(ADS_EVENT.EVENT_ID) AS EVENT_COUNT,

            RANK() OVER (ORDER BY COUNT(ADS_EVENT.EVENT_ID) DESC) AS USER_RANK

FROM ADS_EVENT

WHERE EVENT_TYPE_ID = 1014

GROUP BY ADS_EVENT.USER_NAME

-----------------------------------------------------------------------------------------------------------------

 

Create a derived table in the Audit universe with the above SQL; you can then run reports directly on top of the derived table columns/objects.
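
If you would like to sanity-check the SQL against the Audit database before embedding it in a derived table, below is a minimal Python sketch (the DSN and credentials are illustrative assumptions; any SQL client works just as well):

import pyodbc

# "Most active users" query from above, run directly against the Audit DB.
SQL = """
SELECT ADS_EVENT.USER_NAME AS USER_NAME,
       COUNT(ADS_EVENT.EVENT_ID) AS EVENT_COUNT,
       RANK() OVER (ORDER BY COUNT(ADS_EVENT.EVENT_ID) DESC) AS USER_RANK
FROM ADS_EVENT
WHERE EVENT_TYPE_ID = 1014
GROUP BY ADS_EVENT.USER_NAME
"""

conn = pyodbc.connect("DSN=AUDIT_DB;UID=audit_reader;PWD=secret")  # illustrative DSN
for user_name, event_count, user_rank in conn.cursor().execute(SQL):
    print(user_rank, user_name, event_count)
conn.close()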

 

Alternatively, if the custom SQL extracts a large dataset, we can skip the derived-table approach (which is meant for smaller row counts), create a materialized view on the database side, refresh it periodically, and do the reporting from there.

 

3. Creating a metadata repository and reporting from a multi-source universe which points to both the Auditing schema and the metadata schema

 

          This approach is very useful whenever we need to create reports that capture information from both the Audit and BO repositories. Some information, such as the number of Named users/Concurrent users, cannot be extracted from the Audit schema, and here metadata reporting alongside Audit reporting comes in handy.

 


BI 4.x Audit reporting references:

 

BusinessObjects Auditing - What is changed in BO 4.0?

Sample Auditing Universe and Reports for SAP BusinessObjects_4_x

SAP BusinessObjects 4.0 Auditor Configuration & Deployment End to End

BusinessObjects Auditing - Considerations & Enabling

 


Thanks for reading. Appreciate all your thoughts, comments, ideas & feedback.

Hi All,

 

I wanted to share an experience that I had during a migration of universes and reports from BO 6.5 to BOE 3.1.

 

My BO 6.5 Admin gave me the bomain.key file which I used in the Import Wizard.

 

A little bit of background for people like me who have never worked on a BO 6.5 system: there is no concept of a CMS or filestore in BO 6.5.

 

Instead, there are domains which house the BI content. There are mainly three types of domains in BO 6.5 that I am aware of.

 

1. Security Domain :- It has the users, groups and the security information.

2. Document Domain :- It has the reports.

3. Universe Domain :- It has the universes.

 

A particular environment can have multiple Document and Universe Domains. These domains translate to folders when migrating to the BOE architecture (BOE XI R2 or BOE 3.1).

 

bomain.key is an encrypted file which holds the connection and linking information for these domains of the BO 6.5 environment. You need this file when logging into the BO 6.5 system using the Import Wizard.

 

So my BO 6.5 Admin provided me with the bomain.key file for migrating the content required for the project. My BO 6.5 repository was on Oracle, and so was the CMS of BOE 3.1.

 

I faced a weird issue while logging into BO 6.5 through the Import Wizard: it gave me an error about a missing TNS entry. So I contacted my BO 6.5 Admin and he provided the TNS entry for my security domain. This helped me get past the login screen.

 

However, when I selected the folders and universes to migrate, I saw only empty folders. None of the reports or universes were visible in the Import Wizard.

 

To troubleshoot this issue further, I enabled tracing on Import Wizard by adding "-trace" to the IW shortcut in the startup menu. The logs were very lucid and correctly pointed to the problem.

 

I got the below trace in the logs.

 

2014/08/21 08:33:10.966|>=|W| | 3864|7560| |||||||||||||||_BOImportHelper::getUniverses: Universe 15 cannot be imported because the corresponding domain is down

2014/08/21 08:33:10.966|>=|W| | 3864|7560| |||||||||||||||_BOImportHelper::getUniverses: Universe 16 cannot be imported because the corresponding domain is down

2014/08/21 08:33:20.028|>>|E| | 3864|7560| |||||||||||||||PingDomain: Unable to connect to domain 11 because: ORA-12154: TNS:could not resolve the connect identifier specified


Apparently my universe and document domains required additional TNS entries. To overcome this, I merged the tnsnames.ora file from the BO 6.5 server with the one on the BOE 3.1 server.


This resolved the issue and the migration went very smoothly.


I will just put the crux of my blog in points for easier understanding.


  • There is no CMS or filestore concept in BO 6.5; it has domains.
  • There are three types of domains: mainly security, document and universe domains.
  • These domains translate to folders when migrating to BOE XI R2\3.1.
  • The bomain.key file is the encrypted file holding information about all the domains. It is required for logging into the Import Wizard.
  • The machine\box from which you launch the Import Wizard should be able to connect to all the domains in BO 6.5 for a successful migration.

 

Thanks for reading my blog and I hope you found it useful.

 

Regards

Chinmaya

In our daily Business Objects report scheduling life, we often use events as a key factor for triggering BO reports. Data quality can be ensured by fixing events as a dependency on a BO report. If I'm not wrong, almost everyone who uses Business Objects follows this event dependency concept to maintain the quality of data in reports. What else can we do with those events?

 

I thought of sharing how well we can utilize BO events for load balancing, data quality and maintenance activities.

 

Basically, events are classified into 3 types.

 

  1. Custom Event - Occurs when you explicitly click its "Trigger this event" button.
  2. File Event - Waits for a particular file to appear in the desired location before triggering the event.
  3. Schedule Event - Triggers the event when a particular object has been processed.

 

To know more about events, read page 229 in the Admin guide.

 

Load Balancing:

 

Rudiments:

 

Dummy Reports - 3 or 4 Reports*

Schedule Events - 3 or 4 Events*

 

Priority events can be created to maintain load balancing on the BO servers. By creating these events, the reports will be kicked off based on their criticality. The four dummy reports which we have created will be executed every 1 min* and trigger the corresponding schedule events.

 

Here, we can name those 4 dummy reports as Priority1_Report, Priority2_Report, Priority3_Report and Priority4_Report.

Name those 4 schedule events Event_P1, Event_P2, Event_P3 and Event_P4. (Fix the events as OUT conditions for the corresponding Priority reports.)

 

The Priority events will be fixed as conditions on all the BO reports, based on the start times of the reports. For example,

 

Event Name      Fixed to the reports
Event_P1        Scheduled to start between 8 AM - 10 AM
Event_P2        Scheduled to start between 10:01 AM - 11:30 AM
Event_P3        Scheduled to start between 11:31 AM - 1:00 PM
Event_P4        Scheduled to start after 1:01 PM

 

How can these events be used?

 

  • Pause these Priority reports when there is a delay in the data load, or any other issue, to avoid reports kicking off at the same time.
  • Once the data load process has completed, or the other issues are fixed, release "Priority1_Report" first to generate the critical reports that depend on Event_P1.
  • Release "Priority2_Report" 30-45 mins* later, after confirming that there are enough resources available for processing the next set of reports. By this time, most of the P1 reports should have completed.
  • Release "Priority3_Report" and "Priority4_Report" after a further 30-45 mins* each, checking the number of reports in the RUNNING state.

 

Maintenance Activities:

 

I divide this module into 2 segments.

 

  1. Maintenance on BO Server
  2. Maintenance on Data load server (DB)

 

For Maintenance on BO Server, Rudiments:

 

Dummy Reports - 1 Report

Schedule Events - 1 Event

 

To avoid report failures, or to stop all the BO reports from kicking off, we can have this one dummy report manage all these actions. The dummy report which we have created will be executed every 1 min* and trigger the corresponding schedule event.

 

Here, we can name that 1 dummy report Allow_All_Reports.

Name the schedule event Event_AllowAll. (Fix this event as the OUT condition for the "Allow_All_Reports" report.)

 

How can this event be used?

 

  • Make this event a mandatory event while scheduling a report.
  • This event is going to be the key for BO reports to stop or kick off.
  • In case of any issues, if you don't want your reports to trigger, and to avoid report failures, you can go ahead and pause the single report "Allow_All_Reports" to control all other scheduled reports in your environment.
  • Once the issues are resolved, you can resume "Allow_All_Reports" to allow all the reports to kick off.

Note - If you pause this report by mistake, then none of your reports will trigger.

 

 

For Maintenance on Data Load Server (DB), Rudiments:

 

Dummy Reports - 3 or 4 Reports*

Schedule Events - 3 or 4 Events*

 

To avoid report failures, or to stop all the BO reports which hit a specific database during planned or unplanned maintenance on the data load servers (DB), we can have these dummy reports manage all these actions. The dummy reports which we have created will be executed every 1 min* and trigger the corresponding schedule events.

 

Here, we can name those 3 dummy reports based on the DB names, like DB1_Report, DB2_Report and DB3_Report.

Name the schedule events Event_DB1, Event_DB2 and Event_DB3. (Fix these events as OUT conditions for the respective DB event reports.)

 

For example, consider that DB1_Report is scheduled for Oracle DB reports. Then fix Event_DB1 as an in-condition for all the reports which hit the Oracle database. In case of any unplanned maintenance or any issue with the database, we don't have to search the metadata to identify the list of reports hitting the Oracle DB and pause them manually. All you need to do is go ahead and pause DB1_Report, which will in turn hold all the reports hitting the Oracle DB, since we have scheduled those reports with DB event conditions.

 

Event Name      Reports hitting ...
Event_DB1       Oracle Database
Event_DB2       SQL Database
Event_DB3       Teradata

 

Data Quality:

 

The quality of data in a report can be maintained by fixing any of the 3 event types (Custom, Schedule or File events) as a condition. Based on the ETL jobs which load the data into the tables used by the reports, we can fix the best event type as a condition on the BO reports.

 

  • If the ETL jobs generate trigger files after processing the data, a file event can be fixed as a condition on all the reports which use the tables loaded by that particular ETL job.
  • A custom event can be used to kick off reports once a specific table has been loaded.
  • A schedule event acts as a WATCHER report that monitors the status of an ETL job. On completion of the ETL job, the instance succeeds and triggers the corresponding event.

 

To Summarize:

 

While scheduling any report, plan to fix the below events as dependencies to maintain data quality and load balancing, avoid report failures, and allow reports to be paused \ resumed during planned \ unplanned maintenance periods.

 

Event to wait for:  Event_AllowAll, <<Priority Event>>, <<Database Event>>, <<Data Load Event>>

 

  1. Event_AllowAll - The default event, which should be fixed as a dependency for all scheduled reports. Pausing its report will hold all the reports in your environment.
  2. Priority Event - Should be fixed based on the scheduled start time of the report. Helps with load balancing during load delays or other issues.
  3. Database Event - Fix this dependency based on the database the report runs against. Helps to pause \ resume the reports during maintenance periods or DB issues.
  4. Data Load Event - Based on the ETL process, fix the most suitable event type to ensure data quality.

 

Points to be noted:

 

  • No DB connections are required for the dummy reports created for load balancing and maintenance.
  • As these reports do not hit any DBs, their instances complete in a few seconds.
  • Place these event reports in a separate folder and set the security such that only Administrators can pause \ resume them. (Customize your security levels based on your requirements.)
  • Set the report LIMITS to 10.
  • These event reports act as a one-stop location from which you can control almost all the reports scheduled in your environment.

 

*Based on business requirements.

 

I welcome your feedback, comments and compliments.

 

Thanks

Vijay Madhav

You have to complete 4 steps.

 

1. Create an event

2. Create a script file for database table record count

3. Create the schedule

4. Create a windows task schedule

 

1- Create an event:

 

Log on to CMC and create an event.

2014-08-11_12h07_50.png

 

Go to "System Events" folder and  click on "Create an event"

2014-08-11_12h09_30.png

 

Fill in the text boxes. Choose "File" as the type and enter the path of your "ok" file. Note that the "ok" file (which is "f:\a.txt" for this example) is not created yet. The text file will be created by a script mentioned below. The "f" drive is a drive on your BO server.

 

2014-08-11_12h11_11.png

 

2- Create a script file for database table record count

 

OK. Now we need a script. This script will:

  • connect to the database
  • check the number of rows in our table
  • if there is a record corresponding to our query, create the "ok" file

You can find an example file with this kind of script as an attachment.
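
For reference, below is a minimal Python sketch of such a script. The connection string, table name and query are illustrative assumptions (the attached example remains the reference); the trigger file path matches the event created in step 1:

import pyodbc

TRIGGER_FILE = r"f:\a.txt"  # the path configured in the CMC file event
CONN_STR = "DSN=ETL_DB;UID=bo_report;PWD=secret"  # illustrative connection details

conn = pyodbc.connect(CONN_STR)
cursor = conn.cursor()
# Illustrative check - replace with the table/condition your ETL load populates.
cursor.execute("SELECT COUNT(*) FROM etl_load_status WHERE status = 'OK'")
row_count = cursor.fetchone()[0]
conn.close()

if row_count > 0:
    # Creating the file fires the CMC file event and releases the schedule.
    with open(TRIGGER_FILE, "w") as handle:
        handle.write("ok")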

 

3- Create the schedule

 

Go to your BI portal and create your schedule. Don't forget to select your event.

2014-08-11_12h47_47.png

 

So, your schedule will never run until there is a file named "a.txt" on your "F" drive.

 

PS: Don't forget that the schedule looks only for an "ok" file created after the schedule's creation time.

 

4- Create a Windows task schedule

The last step is to create a Windows scheduled task which automatically runs your script file every hour, quarter-hour, etc. If your ETL process fails, there will be no record in your table and the script won't create the "ok" file "a.txt". When the ETL process succeeds, your script will create the "ok" file and your schedule will run.
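
For example, to run the script hourly with the Windows task scheduler (task name and script path are illustrative):

schtasks /create /tn "ETL_OK_File_Check" /tr "python C:\scripts\check_etl.py" /sc hourly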

Overview

 

The objective behind this post is to try and bring some attention to what I consider a serious product shortfall, namely the sequential processing of multiple SQL statements within a single universe query.

 

The below idea touches on this briefly; it also references a secondary issue, the sequential processing of multiple data providers in a single document - another topic which needs addressing.

https://ideas.sap.com/ct/ct_a_view_idea.bix?c=1DA84A30-1E5A-43FA-95C5-857A8B99D197&idea_id=3DE05DC2-0AC1-4BFF-9207-D71172E904E6

 

The Issue

 

As mentioned, this post focuses on the execution behaviour of a single unx query that contains multiple statements (flows) and the sequential processing thereof. These queries are generally invoked when the following options have been set on the universe:

  • Multiple statements for each context
  • Multiple statements for each Measure

 

Steps to reproduce:

 

  • Create a universe containing multiple contexts
  • Enable the option 'Multiple statements for each context'
  • Select a measure from each context; this will produce 2 statements (observed when selecting View Script)
  • Profile/monitor database activity
  • Execute the query
  • Note the sequential execution of each statement

 

This behaviour is documented in SAP Note 1258337. The note is slightly dated but from what I've seen the behaviour hasn't changed.

 

Findings

 

The below components all use a java library to execute and retrieve the results of a .unx query from the database.

  • Crystal Reports Processing Server (Crystal Reports for Enterprise)
  • Dashboard Processing Server
  • Information Design Tool
  • Dashboard Designer
  • Crystal Reports for Enterprise Client

 

Generally speaking the guts of the code that performs the query execution is held in a class called com.businessobjects.dsl.services.dataprovider.impl.QuerySpecDataProvider which can be found in the jar files:

  • dsl_engine.jar
  • com.businessobjects.dsl.services.impl.jar

 

Changing the below two methods, in the class com.businessobjects.dsl.services.dataprovider.impl.QuerySpecDataProvider, to include a simple asynchronous call when running the recursive calls to getResultNode(<?>) would bring about significant performance improvements when executing queries that contain multiple statements:

  • getResultNode(<?>)
  • getMergedResults(<?>)

 

By making the changes above you should, in theory, see vast improvements in overall query performance when:

  • You have a large number of statements in a batch
  • And/Or statements within the batch take a long time to run

It is important to note that the final output will always take as long as the longest running statement in the batch as all results need to be retrieved before they can be passed up the stack.
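
To make this concrete, below is a minimal, illustrative Python sketch (not SAP's actual Java code; the statement runner is a stand-in for a database call) showing why concurrent execution bounds total time by the slowest statement rather than the sum of all statements:

import time
from concurrent.futures import ThreadPoolExecutor

def run_statement(stmt_id, seconds):
    """Stand-in for executing one SQL flow and fetching its result set."""
    time.sleep(seconds)  # simulates database execution time
    return (stmt_id, "result of statement %d" % stmt_id)

batch = [(1, 2.0), (2, 3.0), (3, 1.5)]  # (statement id, simulated runtime)

# Sequential execution (current behaviour): total is the sum of all runtimes (~6.5s).
start = time.time()
results_sequential = [run_statement(i, s) for i, s in batch]
print("sequential: %.1fs" % (time.time() - start))

# Concurrent execution (proposed behaviour): total is the longest statement (~3.0s).
start = time.time()
with ThreadPoolExecutor(max_workers=len(batch)) as pool:
    results_concurrent = list(pool.map(lambda args: run_statement(*args), batch))
print("concurrent: %.1fs" % (time.time() - start))

# Either way, the result sets still need to be merged afterwards,
# as getMergedResults(<?>) does today.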

 

Please note that the workflow is slightly different for components that access the Webi Processing Server, namely:

  • Webi Reports
  • PowerBI
  • BIWS

 

The Webi Processing Server utilises a different code base. I’m struggling to get some readable traces together for this application component, but my guess is that changing the underlying code, compiled in QT.dll, in line with the above recommended change would have a similar positive effect.

 

Conclusion

 

The above findings are the result of testing against a single-source relational unx universe; behaviour may differ when using multi-source universes/OLAP connectivity, both of which I haven't got round to testing yet.

 

With that in mind, I am by no means proposing that the above small change to the referenced class is a complete solution. But one would hope that, given the available manpower at SAP, coupled with the performance improvements this change can bring to the product suite, a full solution could be implemented by SAP in a relatively short space of time - given enough push by us, the community.

 

I've listed several ideas below that detail where sequential processing takes place. Please up-vote them if you want to see the current implementation improved and ultimately reap the performance gains these changes will bring.

 

https://ideas.sap.com/ct/ct_a_view_idea.bix?c=1DA84A30-1E5A-43FA-95C5-857A8B99D197&idea_id=3DE05DC2-0AC1-4BFF-9207-D71172E904E6

https://ideas.sap.com/ct/ct_a_view_idea.bix?c=BB5523E4-062F-4420-B35F-0B1F0D4769A9&idea_id=19CD336C-C882-4630-9DFD-000A160DD384

https://ideas.sap.com/ct/ct_a_view_idea.bix?c=A5E8DEA8-D886-4250-BA2B-039F7D32FFC0&idea_id=BE83AB34-2763-4F53-8C23-AC221055EB66

https://ideas.sap.com/ct/ct_a_view_idea.bix?c=1DA84A30-1E5A-43FA-95C5-857A8B99D197&idea_id=24E4A4A2-9834-4D02-BE72-EF505729BE18

https://ideas.sap.com/ct/ct_a_view_idea.bix?c=1DA84A30-1E5A-43FA-95C5-857A8B99D197&idea_id=54372DF0-A3E2-45F4-A852-9A983084FCC9

 

For those JAVA devs out there, it is of course possible to decompile the com.businessobjects.dsl.services.dataprovider.impl.QuerySpecDataProvider class and implement this change in order to reap the performance benefits, however there is a high likelihood that merely the act of decompiling the code would be a copyright infringement so this is by no means endorsed or recommended by me.

 

Let the crusade begin......hopefully.

 

Cheers

 

James


Dear All,

 

After a long time here on SCN, I am writing this blog about one of the quite interesting features in BI 4.x, i.e. Visual Difference.

 

Visual Difference provides an opportunity to compare two versions of BI content in order to identify the changes incorporated in each version. This allows you to track the changes made in every version of the BI content, and the content can be a report, a universe or even an LCMBIAR file.

 

Visual difference background

 

Visual Difference is one of the services hosted in the APS and is consumed by both Promotion Management and Version Management. It is not necessary to have a dedicated APS hosting the Visual Difference service unless the majority of your users use it heavily outside of Promotion/Version Management.

 

1.jpg


What & How to compare

 

You can compare BI content present in any one of the below locations

 

  • CMS – from a BusinessObjects repository
  • VMS – from a Version Management system repository
  • Local File System  – from a local file system

 

2.jpg

 

In the CMS, you can select both reports and universes.

 

3.jpg

After selection, you can compare them across repositories (CMS/VMS/LFS)

 

4.jpg

You can even schedule the difference generation, similar to reports

 

5.jpg

 

Visual difference use case for each user category

 

Developer

  • Difference log - What changes were made in my current report/universe with reference to my previous version, so that I can proceed with my development from the required version?
  • Change history - What are the consolidated changes made since my initial universe version (version 0)?


Test Analyst

  • Changed section - Which objects in the universe were modified, so that I can run my test cases only against those objects for a faster testing approach?


Administrator

  • Difference log - What changes were made in the user’s current report/universe compared to an earlier version? Based on this, as per the user’s requirement, I can restore the universe/report.
  • Consolidation for easier maintenance - What are the differences between the Sales universe and the Marketing universe? Can I merge the two and create a single universe to fulfill the reporting requirements of both departments?


Hope you find the blog interesting. Let’s start using Visual Difference and make our work even simpler & smarter. Thanks for reading!



As promised in SAP BusinessObjects XI 3.1/BI 4.x installation prerequisites and best practices [Windows], here is a continuation of the prerequisites & best practices document for SAP BOE XI 3.1 and BI 4.x on Linux/Unix based environments.

 

The same concept applies as for Windows servers: allow the installer to run as uninterrupted as possible with respect to the OS parameters/settings. There are certainly deeper checks required on a *nix environment; however, these should be taken care of during the build-up of a server and are mostly common to any application. Here are a few of them which have been observed to cause issues if not set correctly.

 

I've not included sizing in this as I wanted to keep this document to OS-related parameters and configurations only; however, if you wish to learn more about how to size an environment you can visit: **http://service.sap.com/sizing.

 

A. COMPATIBILITY

 

To start with, the hardware and software configuration of the server or client machine that we're installing SAP BusinessObjects on must be supported. SAP provides a supported platforms guide or "Product Availability Matrix" (PAM) for several products.

These can be found at the following URL: http://service.sap.com/pam

You can also refer to the following KBA: 1338845 - How to find Product Availability Matrix (PAM) / Supported Platforms Document for SAP BusinessObjects Products

 

Here is an example of a page from the PAM document. Along with compatibility with SAP and 3rd-party software, OS patch requirements and additional patches/libraries required for Unix are also mentioned here. The product version is also included in the screenshot.

A-PAM1.PNG

Source: SAP

A-PAM2.PNG

Source: SAP

 

Ensure you're viewing the details of the correct OS / patch when viewing a PAM or Supported Platforms Guide.

**Visit the Sizing URL to know more about SAPS.

 

 

B. USER / GROUP

 

The most crucial piece in a Unix BO installation is the user. BO cannot be installed on Unix using the root user, nor is root access required post-installation to run BusinessObjects. Hence, it is best practice to have a separate user for an installation, for example: bo41adm.

Add this user to an existing group or create a group separately for the BO user. bo41adm will own the BO installation and be used to run all scripts, etc.

 

The user must have sufficient rights on the following:

a. Installation directory

b. Installation media

The following rights must be given:

a. Read

b. Write

c. Execute

A minimum rights value of 755 is sufficient.
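
For example, assuming an installation directory of /opt/bo41 and a group named bobj (both illustrative values):

chown -R bo41adm:bobj /opt/bo41
chmod -R 755 /opt/bo41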

 

B-Rights.PNG

 

 

C. HOSTNAME & hosts FILE ENTRY

 

Just like Windows, BusinessObjects is identified on a server by its hostname. Hence, we must ensure that the system has a valid and resolvable hostname.

To set a coherent hostname (e.g. bidev01, etc.), you can use the command:

hostname <desired name>


Next, make sure the hostname is associated with the machine's IP address in the hosts file, wherever relevant.

The hosts file on a Linux/Unix system is found in /etc.
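
An entry might look like the below (address and names are illustrative):

192.168.10.25   bidev01.example.com   bidev01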

hosts.JPG

 

This is a necessary step because if the hostname is not resolved over the network, many services will not be accessible. Client tools have been observed to fail to connect to a BO cluster / host if it cannot be resolved.

 

If you're choosing to export SLD data to Solution Manager, the hostname is what is included in the ".xml" output of the BOBJ SLD REG batch. If the hostname/IP address mapping is not correct, the SLD will contain incorrect data and cause errors down the line.

 

 

D. FIREWALL


Most access to a Unix-based server takes place through a terminal emulator / console such as PuTTY. This can be used to access a BO server from any machine that has network access to that server. This means that we're installing remotely.

 

It is advised to use the terminal emulator from a machine which is in the same network as the server (or within the DMZ, if applicable). However, in scenarios where this is not possible, ensure that there is NO restriction on the network path that may hamper the installation.

 

 

E. SELinux:

 

...or Security-Enhanced Linux.

SELinux is an access control implementation for the Linux kernel. It provides security restrictions for certain kinds of executions.

For more details: Security-Enhanced Linux - Wikipedia, the free encyclopedia

 

Note that, among the platforms covered here, SELinux is relevant for Red Hat Enterprise Linux.

 

Disable SELinux prior to performing a BusinessObjects installation. To disable it on RHEL5, follow the below steps:

  1. Click on System on the main taskbar.
  2. Select Administration.
  3. Click on SELinux Management.
  4. Choose to keep this disabled.

See the below screenshot of how to disable SELinux on a RedHat Linux 5 OS.

E-SELinux.PNG

 

You could also perform this using the command prompt:

To check the status of SELinux: sestatus

To change the SELinux status, you can make the changes in /etc/selinux/config
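
The relevant line in /etc/selinux/config looks like the below; set it to disabled and reboot for the change to take full effect:

SELINUX=disabled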


For more details you can check the following link:

https://access.redhat.com/documentation/en-US/Red_Hat_Enterprise_Linux/6/html/Security-Enhanced_Linux/sect-Security-Enhanced_Linux-Working_with_SELinux-Enabling_and_Disabling_SELinux.html

 

 

F. RESOURCE LIMITS:

 

A Linux/Unix operating system has a methodology for sharing the available resources among a set of users, groups, domains, etc. These resources can be split up to allow optimum usage of an OS that has various applications installed on it, managed by different users.

 

A user can be allocated a certain amount of resources. The user can work within the range it has been provided by setting the required value (soft limit), up to the admin-restricted maximum value (hard limit).

 

For example, in RHEL5, we can see the limits using the ulimit -a command.

F-Lim1.PNG

 

The limits configuration file is here: /etc/security/limits.conf

F-Lim2.PNG

The root user has access to make changes to the configuration in this file.


It is recommended to set the limits to unlimited for BusinessObjects installations, as mentioned in the installation guides.

There have been issues observed with BI 4.0 installations on Linux which led to random services crashing. These issues were overcome by increasing the limits. See the below KBAs:

1845973 - In BI 4.0 linux environments, random servers including CMS crash when starting SIA

1756728 - Servers fail randomly and the system becomes unstable in BI 4.0 installed on Linux
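
As an illustration, entries in /etc/security/limits.conf for the BO user could look like the below (the user name and values are examples; use the values recommended in the installation guide and the KBAs above):

bo41adm    soft    nofile    65536
bo41adm    hard    nofile    65536
bo41adm    soft    nproc     65535
bo41adm    hard    nproc     65535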

 

 

G. LOCALE

 

In the PAM document, you will also find a list of LOCALES that BusinessObjects is supported with. A compatible locale needs to be set prior to installation.

To check the locale, you can type the command: locale

To set a specific locale, set the LANG environment variable; for example: export LANG=en_US.utf8

The PAM document mentioned in point A presents a list of the supported locales.

 


H. USER PROFILE / PATH:

 

Needless to say, the user profile must have the correct access-related requisites in terms of the Unix executables, etc. Improper access to the user's bin usually causes issues when the installer internally sources files, runs scripts, etc.

I have observed issues where, if the path to the "ping" command is not added to $PATH, installations do not proceed, failing with an INS00293 error.

---------

 

 

I hope the above helps towards a successful installation. The above recommendations are based on several instances observed while troubleshooting support issues where the installation had failed and one or more of these options helped reach completion!


These being Linux/Unix environments, there is a lot to be ensured from the OS perspective, as well as regarding rights. The basic idea here is to allow the product to install without any hindrances, access issues, etc. at the OS level.


You can find documentation for installation, deployment, etc. here: BI 4.0, BI 4.1 and BOE XI 3.1.

 

 

Regards,

-Sid
