
BI Platform


It's a new year, and that means licensing updates!


We have simplified our license model to make it easier to understand, while still keeping the core components.


We have even provided a little bonus to customers who own the BI Suite license: you will now benefit automatically from runtime licenses of Sybase IQ, Data Integrator, and Sybase PowerDesigner!


No license conversions are needed. Simply go to Service Marketplace and download your new software.


You get:

  • 8 cores of Sybase IQ
  • 8 cores of Data Integrator
  • 1 named user of Sybase PowerDesigner


These are not full use licenses, so be sure to read the Software Use Rights to understand the restrictions. In short:

  • Data Integrator cannot be clustered
  • Sybase IQ runtime is limited to access by and through the SAP BusinessObjects BI Suite and SAP Predictive Analytics


Note that there is no new version of the BI Platform required to leverage this. No software upgrades are needed.


Those of you who own a similar-sounding license called the 'BA&T SAP BusinessObjects BI Suite' will need to execute a license conversion to benefit from these new components.


We are also introducing a simpler version of the BI Suite called the 'SAP BusinessObjects BI Reporting' license. This includes Web Intelligence, Crystal Reports and the BI Platform only, and is intended for reporting-focused deployments.


We are removing some little-used licenses from our price list:

  • The 'Analytics Editions' that were a special bundle of Sybase IQ + Data Integrator + the BusinessObjects BI Suite
  • The restricted use 'SAP BusinessObjects BI Suite for SAP Applications'
  • The component model that allowed customers to purchase any of the individual BusinessObjects client tools


Customers who own any of these retired products are not impacted. You can continue to use these licenses and purchase more licenses under that license model.


Finally, it's important to remind everyone that our license metrics are unchanged. We continue to offer our BI software with the following metrics:

  • The named user (NU) metric for users who need guaranteed access to the software, or who need to use most of the desktop software products.
  • The concurrent session based license (CSBL) for casual users. This applies to the server components and Analysis for Office.
  • The new public document metric for publicly sharing BI documents on the internet.


For more information, please read the licensing FAQ here.


Blair Wheadon


GM of Data Discovery

Hi everyone,

I wanted to make our Crystal Reports customers on the SCN community aware of some great promotions we’ve got running right now with a few listing/review sites.


Basically, if you leave a detailed review of Crystal Reports on one or all of these sites, they will reward you with a gift card for your time. This is open to customers only, no partners or employees. All reviews are vetted by the sites for authenticity before you receive your gift card.


The G2crowd offer is open globally. The Software Insider offer is US only.  The TrustRadius offer is global too.


So if you’re from the US, feel free to leave a review on all 3 sites.

Here are the details on how to submit a review and get your gift card(s)!

Review Crystal Reports with G2crowd:

Leave an authentic, detailed review and G2crowd will reward you with a $25 VISA gift card upon their review and posting of the submission. This is open to the first 40 people only, so it's a first come, first served offer. This is open to people from all countries.
Click here to create your review



Review Crystal Reports with Software Insider:

Leave an authentic, detailed review and Software Insider will reward you with a $15 Amazon gift card for your first review, and a $10 Amazon gift card for each review thereafter.  Once your review is screened and is posted to the site, you will receive your gift card.
This program is valid until March 31, 2016 and valid in the US only.
Click here to create your review


Review Crystal Reports with TrustRadius:

Leave an authentic, detailed review and TrustRadius will reward you with a $25 Amazon gift card upon their review and posting of the submission. This is open to the first 40 people only, so it's a first come, first served offer. This is open to people from all countries.
Click here to create your review





Ryan Oliver
SAP Digital team

I had often wondered whether there was an easy way to export the results from a CMC query to Excel.


There is, of course, a tool available online that runs a macro and exports to Excel, but I wanted to see if something could be done directly from the CMC itself.


With this background, I stumbled on a little workaround that uses a little-known plug-in. It involves a few steps, but in the end it gets the job done.



Uploading the document for the benefit of SCN users.





This is my first upload here, so I do not know how this works. Hopefully it will be useful to someone.


Disclaimer: this will work only with the Firefox browser.


  1. Open the CMC in a browser (Firefox).
  2. Activate the Firebug plug-in (found on the top right-hand side of the toolbar).
  3. This opens a new window, or a small panel at the bottom of the current browser. If it opens in the same window, you can pop it out into a new window by clicking the icon.
  4. Click the little arrow icon on the top left-hand side of the menu bar of the Firebug window.
  5. Highlight the first row in the CMC that you want saved to Excel.
  6. The moment you make the selection, the code in the Firebug window is highlighted at the row corresponding to the data selected in the CMC window.
  7. Carefully identify the <table id> element, which can be found a few lines above the selected row.
  8. Locate the <div id> element right above the <table id> element, place the mouse on that line, right-click, and select "Copy Inner HTML" from the pop-up menu.
  9. Paste this code into Notepad and save it as an .HTML or a .XLS file.
  10. Either of the two files will take a long time to open and will also be saved with the icons from the CMC.
  11. Copy/select the needed columns individually to the next tab for a cleaner and neater output.
  12. Finally, since the CMC displays only 500 objects at a time, you might have to do this multiple times: if your list of objects runs past 500, navigate to the next page and repeat the process from step 1.

Happy New Year!


The end of 2015 meant the end of mainstream support for SAP BusinessObjects XI 3.1 and SAP BusinessObjects BI 4.0. However, end of support does not mean end of the world, according to SAP Mentor Greg Myers (see related article, End of Support Does Not Equal End of World). Meanwhile, some organizations may be affected by Microsoft's decision to stop supporting Internet Explorer 8, 9, and 10 as of today, January 12, 2016 (see Microsoft's official statement or SAP KB 2264238). Keep in mind that later editions of XI 3.1, BI 4.0, and BI 4.1 all provide IE 11 support (see related SCN article, SAP Support for Internet Explorer 11 Can Head Off Security Woes).


SAP BusinessObjects Business Intelligence BI 4.1 SP7


The most recent release of the BI 4.1 platform is Support Pack 7, which has all of the goodies of Support Pack 6 (see Christian Ah-Soon's excellent write-up) and adds support for Windows 10 on the desktop and Suse Linux 12 and Red Hat 7 on the server. However, there are some curious omissions in the supported platforms document, most notably Microsoft Office 2016 and the latest releases of the Adobe Flash player (version 17 is the highest version supported but version 20 is the highest version shipping).


SAP BusinessObjects Business Intelligence BI 4.2


While still not generally available (see related SCN Article, Waiting for SAP BusinessObjects BI 4.2), SAP released BI 4.2 Support Pack 1 at the end of 2015. Some customers have opted to enroll in the ramp-up program, although most will likely wait for Support Pack 2, which will be the GA release. BI 4.2 promises a lot of new features (see related article, What's New in SAP BI 4.2), plus improved upgrade workflows and linked universe support for those customers still on earlier releases.


What now?


Most organizations that use SAP BI should have two platform objectives for 2016 - adopting SAP Lumira in their BI landscape and upgrading to SAP BusinessObjects Business Intelligence 4.2. Unless you are actively patching up to address a bug or gain platform support (such as IE11 support), I suggest waiting for the general availability of BI 4.2. If you've never made the leap to BI4, it's time to get off of now-unsupported software. Consider entering the early adopter program for BI 4.2 and take advantage of its upgrade process improvements, knowing that the GA patch will be waiting for you before it's time to go live.


What are your organization's BI priorities in 2016?

Hello Everyone,


Here I would like to highlight some of the new features available in BI 4.2 from the platform perspective. Some of you may already be aware of these features, but I want to list them here for newcomers.


1. User Notifications:

This feature helps the administrator broadcast information to users. Consider the case where the administrator of a BO system wants to send a message to all users or to particular ones. This can be achieved through User Notifications in the CMC. The administrator can specify the time window during which the notification is visible. The notification can also be sent as an email to the address mapped to the user.


How to create Notifications:

Go to BO CMC->Events->User Notifications


Specify the timeline for the notification to be visible and enter the Title and Description:


Manage Subscribers:

Here the subscribers can be users or groups


How to view the Notifications:

Once the user logs into BI Launchpad, the notification is highlighted and also available under the My Alerts section. Clicking "See More" shows the details.


2. Commentary:

This service is provided by the platform for all BI clients. Comments are stored in the Audit DB by default, but this is customizable.

Where to Customize:

Go to BO CMC->Applications->BI Commentary Application->Properties



Comments are managed by rights, and the authorizations for commenting can be defined at the individual folder or object level.

E.g.: to enable rights for the documents under a folder, say "Murali_Folder", go to its User Security, select the user, and assign the rights.



Adding and managing comments is already discussed in http://scn.sap.com/docs/DOC-67437, which you can refer to.

3. Recycle Bin:

This feature enables the user to restore accidentally deleted content. It supports only Public Folders and Content.

Where to Enable:

Go to BO CMC->Applications->Recycle Bin Application


A threshold (number of days) can be set; once the threshold is reached, the files are removed from the Recycle Bin.


Some of the deleted files available in the Recycle Bin:


4. Administrator Cockpit:

If the administrator wants a quick insight into the BO system, this feature provides a simplified, drillable visual page with key information such as the list of servers and their status, jobs, etc., and specific actions are possible.


5. Promotion Manager:

With BI 4.2, the administrator can do a selective retrieval of objects from an LCMBIAR file. A new job is initiated based on the selection made. This is helpful when the LCMBIAR file contains many objects and the administrator wants to promote only a few of them.


The admin can also utilize connection overrides while promoting content.

6. Rest API:

These APIs help with:

  • user management (add/create/modify)
  • uploading/downloading documents to/from the platform
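For illustration, a logon exchange against the platform's RESTful web services (which by default listen on port 6405 under the /biprws root) looks roughly like the sketch below. The host, credentials, and exact attribute list are placeholders; verify the details against the RESTful SDK guide for your version.

```http
POST http://<server>:6405/biprws/logon/long
Content-Type: application/xml
Accept: application/xml

<attrs xmlns="http://www.sap.com/rws/bip">
  <attr name="userName" type="string">Administrator</attr>
  <attr name="password" type="string">secret</attr>
  <attr name="auth" type="string">secEnterprise</attr>
</attrs>
```

A successful response carries an X-SAP-LogonToken header, which is then passed with subsequent requests (for example, when uploading or downloading documents).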

7. Upgrade Manager:

During the upgrade, the administrator can:

  • specify the log level
  • control which content (objects) is upgraded


8. Audit:

In BI 4.2, the scalability of Auditing has been improved: even when more events are logged, the repository can still be accessed by more users, without Auditing becoming a performance bottleneck.

Please note that this is based on the Beta release and may change in the GA release without prior notice.

Thanks for reading and hope this helps you.



Issue: LDAP roles not assigned to users after migration from 3.1 to 4.1




Assumption: groups might not have been added before migration.


Detailed background of issue:


Assume 3.1 objects have been migrated to a new 4.1 system.


LDAP groups are not syncing for all our users unless I manually re-create their LDAP alias, even though aliases are assigned to all users successfully. For example, <User> has a valid LDAP alias, shown here in the Properties section of his user profile:



However, when we look at Member Of, his Information Technology LDAP group does not appear:


The actual assigned LDAP group should show as below. As of now we are working around this by deleting the alias --> recreating an Enterprise alias --> deleting the Enterprise alias --> and finally re-creating the LDAP group alias.



The "Update groups and users with alias" option is not working with the on-demand option.




  • Try the on-demand option, as this worked for me in the development environment.
  • Remove the LDAP groups and add them again in the CMC LDAP configuration.

Docker is one of the coolest technologies that brought containers to the masses. From development to deployment, it has made the lives of many developers and admins easy. If you have not heard about Docker yet, you can check out the earlier post http://scn.sap.com/community/cloud/blog/2014/06/17/docker--containers-in-the-cloud. SAP HANA Cloud Platform already mentions Docker: http://scn.sap.com/community/cloud/blog/2014/10/15/sap-hana-cloud-platform--setting-the-stage-part-2. In addition, we will see how we can put Docker to use with the SAP BI Platform.

Docker Scenarios

Docker can be used in different ways for the SAP BI platform. You can dockerize the entire stack or just specific tiers as per your requirements. The following are some of the ways you can deploy Docker in your SAP BI landscape, and the advantages of each.


Basic Deployment

The entire BI Platform (web tier, processing tier, and database) can be installed in a single Docker image. This is similar to taking a Windows/Linux image and installing the SAP BI Platform on top of it.



This is a very simple approach of using Docker for SAP BI Platform and is useful for testing out Docker or having a simple environment.
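As a rough sketch, the basic single-image deployment can be described in a Dockerfile. Everything below is illustrative: the base image, paths, response file, and startup script are placeholder assumptions, and you should consult the BI Platform installation guide for the actual silent-install syntax for your version.

```dockerfile
# Hypothetical single-image BI Platform build (all names are placeholders)
FROM opensuse/leap

# Copy the extracted BI Platform installer plus a prepared response file
COPY bip_installer/ /opt/install/
COPY response.ini /opt/install/

# Silent install -- verify the exact parameters against the install guide
RUN /opt/install/setup.sh -r /opt/install/response.ini

# Tomcat web tier (8080) and CMS (6400) default ports
EXPOSE 8080 6400

# Start the servers in the foreground (placeholder script name)
CMD ["/opt/sap/sap_bobj/startservers"]
```

A split or distributed deployment would instead build one image per tier and wire the containers together over a Docker network.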


Split Deployment

The split deployment is slightly more advanced and makes use of Docker's real strengths. You can do a split deployment on the same system or across different systems, splitting the tiers into separate Docker containers, each with different specs. All of these can still co-exist on the same host, working together.



The advantage of this type of setup is that each tier can have customized OS settings that do not conflict with the others. The operating system for the database can be tuned for it, and your web tier gets its own container space and memory.

Distributed Deployment

You can extend the above split deployment into a distributed multi-node deployment. Since Docker images are built once and run as containers, you can set up an image once and then run it easily on any host system. An example of a distributed deployment is multiple web tiers and platform services.



The BI Platform and the Web tier are both clustered at the application level and are installed on separate Docker containers. The applications communicate as they are on real hosts and individual services can be split across nodes.

This can be further extended into different hosts since Docker can be configured to communicate inter-host without any problems.



The above scenarios may look complicated in diagrams, but they are relatively easy to deploy using Docker. Docker also adds other advantages: separate memory, per-container operating system configuration, and the ability to replicate/deploy the same image on multiple systems with few configuration changes.

The SAP BI Platform already supports virtualization; Docker is the next generation of that, offering more powerful deployment methods both in the cloud and on premise, and the SAP BI Platform can make use of its power.

Apache Webserver SSL Setup for BOE


For this landscape, I used three separate machines running Windows 2008 SP2 64-bit, each with 16 GB of RAM and a quad-core processor.



Machine 1: CA (Certificate Authority) issuing machine

Machine 2: Apache web server installation

Machine 3: Client machine where the certificate needs to be imported

How it Works:

We first get HTTPS working by configuring the Apache web server to be HTTPS-enabled. We have a Certificate Authority machine which generates the different certificates required for client-server authentication. Client certificates are imported by the client into their machine or browser, and the server-side certificate is configured in the Apache web server.

The client and server certificates share a common certificate from the CA machine, which we call the signing certificate. This needs to be configured in the Apache web server; when the certificate is generated for the client, the details of the signing certificate are used.

Machine 1 : V01717  - (Certificate Authority Machine)


Download openssl.zip from http://gnuwin32.sourceforge.net/packages/openssl.htm  and extract openssl.exe

Configuring the openSSL

1.       Go to the system environment variables and set the path as given in the screenshot below:


2.       Double-click openssl.exe. This opens a command prompt with the openssl program running. Enter the commands below to generate the various keys and certificates required.


Signing Certificate Generation:

genrsa -out v01513_ca.key 1024

req -new -key v01513_ca.key -out v01513_ca.csr

x509 -req -days 365 -in v01513_ca.csr -signkey v01513_ca.key -out v01513_ca.crt



The commands above generate three files: v01513_ca.key, v01513_ca.csr, and v01513_ca.crt. The .crt file is the certificate that needs to be imported on the Apache server machine.


Generate Certificate Files for Apache Server


genrsa -out v01513_server.key 1024

req -new -key v01513_server.key -out v01513_server.csr

x509 -req -days 365 -in v01513_server.csr -signkey v01513_server.key -out v01513_server.crt




Generate Certificate Files for Client


genrsa -out v01513_client.key 1024

req -new -key v01513_client.key -out v01513_client.csr -config openssl.cnf

x509 -req -days 365 -CA v01513_ca.crt -CAkey v01513_ca.key -CAcreateserial -in v01513_client.csr -out v01513_client.crt


openssl pkcs12 -export -clcerts -in v01513_client.crt -inkey v01513_client.key -out v01513_client.p12



This generates the client certificate files. The final command produces a .p12 file, which needs to be distributed to the client for import.

Importing the certificate is quite simple: double-click the v01513_client.p12 file and follow the wizard to finish importing it.
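Before distributing the .p12 bundle, it can be worth sanity-checking that the client certificate really chains to the signing CA. A minimal sketch (the verify_cert helper is our own addition, not part of the original walkthrough):

```shell
# verify_cert CA_CERT CERT: check that CERT was signed by CA_CERT.
# Prints "<cert>: OK" and exits 0 on success.
verify_cert() {
  openssl verify -CAfile "$1" "$2"
}
```

For example, `verify_cert v01513_ca.crt v01513_client.crt` should report OK before the client file is handed out.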

Configure the Apache  Web server (V01513):

1.       Create a certificates folder in the Apache web server home folder; for us, this is c:\Apache24.

2.       Copy the following certificates generated by the CA machine into this folder:


  • v01513_ca.crt
  • v01513_server.key
  • v01513_server.crt


3.       Include the httpd-ssl.conf file in httpd.conf:

Include conf/extra/httpd-ssl.conf


4.       Open httpd-ssl.conf and edit the following lines.

5.       Uncomment the Listen line, specifying 9080 as the port where HTTPS can be accessed:

Listen 9080

6.       Enable client certificate verification by setting:


SSLVerifyClient require

7.       Add the port and location where HTTPS will be accessible:

<VirtualHost _default_:9080>

<Location /BOE>

8.       Ensure the DocumentRoot line is uncommented and points to the correct htdocs folder:

DocumentRoot "c:/Apache24/htdocs"

9.       Just before the end of the file, before the closing </VirtualHost> tag, add the following lines:

JkMount /examples tomcatlb

JkMount /examples/* tomcatlb

JkMount /docs/* tomcatlb

JkMount / balancer

JkMount /status   stat

Include conf/bobj.BOE.conf

Include conf/bobj.AdminTools.conf


Include conf/bobj.BusinessProcessBI.conf

Include conf/bobj.clientapi.conf

Include conf/bobj.dswsbobje.conf


10.   Point the SSL certificate directives at the appropriate files created above:


SSLCertificateFile "c:/Apache24/certificates/v01513_server.crt"

SSLCertificateKeyFile "c:/Apache24/certificates/v01513_server.key"

SSLCACertificateFile "C:/Apache24/certificates/v01513_ca.crt"


11.   Once the above settings are all configured, restart Apache from the command line:

httpd.exe -k start -DSSL



Configure the Client  machine (V08000):

1.       Transfer v01513_client.p12 to the client machine and double-click the file.

2.       This imports the certificate after a few wizard steps.

3.       Access the URL via https with the host name and port number.


The final modified file would look like this:


Listen 9080


SSLPassPhraseDialog  builtin

SSLSessionCache        "shmcb:c:/Apache24/logs/ssl_scache(512000)"

SSLSessionCacheTimeout  300


<VirtualHost _default_:9080>

DocumentRoot "c:/Apache24/htdocs"

#ServerName www.example.com:9080

#ServerAdmin admin@example.com

ErrorLog "c:/Apache24/logs/error.log"

TransferLog "c:/Apache24/logs/access.log"

SSLEngine on

SSLCertificateFile "c:/Apache24/certificates/v01513_server.crt"

SSLCertificateKeyFile "c:/Apache24/certificates/v01513_server.key"

SSLCACertificateFile "C:/Apache24/certificates/v01513_ca.crt"

<Location /BOE>


SSLVerifyClient require

SSLVerifyDepth 10

</Location>


<FilesMatch "\.(cgi|shtml|phtml|php)$">

    SSLOptions +StdEnvVars

</FilesMatch>


<Directory "c:/Apache24/cgi-bin">

    SSLOptions +StdEnvVars

</Directory>


BrowserMatch "MSIE [2-5]" \

         nokeepalive ssl-unclean-shutdown \

         downgrade-1.0 force-response-1.0


CustomLog "c:/Apache24/logs/ssl_request.log" \

          "%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \"%r\" %b"


JkMount /examples tomcatlb

JkMount /examples/* tomcatlb

JkMount /docs/* tomcatlb

JkMount / balancer

JkMount /status   stat

Include conf/bobj.BOE.conf

Include conf/bobj.AdminTools.conf


Include conf/bobj.BusinessProcessBI.conf

Include conf/bobj.clientapi.conf

Include conf/bobj.dswsbobje.conf

</VirtualHost>


Hi All,


Recently I shared a document about refreshing Crystal Reports using the Multitenancy Management Tool in BI 4.1.


This blog is about refreshing Web Intelligence reports, universes, and connections using the Multitenancy Management Tool (MTM) in BI 4.1 SP06.


Before we proceed further you would need to follow the steps mentioned in the document: How To Use Multi Tenancy Management Tool(MTM) For Refreshing Crystal Reports in BI 4.1- Part I


Note: If you do not have Crystal Reports to be updated in the tenant template then you would not need to follow the steps mentioned for the CR reports in the above blog.


Here, we will start with the mapping of the Universes with the Webi Reports in the tenant's folder from tenant's template.


So, I will assume that you have followed the above document for creating tenant's template and copied all of the required Connections, Webi Reports and their Universes in the template's folders.


Note: We need to ensure that all of the Webi reports in the template folders use universes from the template's universe folder only. If any Webi report is not mapped to a universe, MTM will encounter issues.


I have used BI4.1 SP06 on Windows platform and Oracle 11g database for this testing. I have copied all of the Webi reports, Connections and their Universes in the template folders as mentioned in the document : How To Use Multi Tenancy Management Tool(MTM) For Refreshing Crystal Reports in BI 4.1- Part I


MTM supports mapping of ".UNX" universes from BI 4.1 SP06 onwards; it also supports mapping of ".UNV" universes.


After copying all the required Connections, Webi Reports and their Universes, we would need to configure TestMTM_def.properties file. I have already mentioned all other required information to configure this file in the above document.


The following steps are required to configure this file in order to refresh Webi reports, universes, and connections:

    1. Change the value of the below parameters from false to true:







Note: If you do not have any shared universes/connections, you do not need to specify the values "optionUseSharedUniverses" and "optionUseSharedConnections". Shared universes and connections are simply universes and connections that are shared between multiple tenants.


    2.  Specify the Reports/Universes/Connections folder templates:



Note: In the above document I created a tenant template named "$tenant_template$" in the CMC, and it has the folder (template) structures for universes, connections, and Webi reports as below.


                    Report Template:


                    Universe Template:



                    Connection Template:



Note: "Report Folder 1", Report Folder 2" folders contain the WEBI Reports. "Universe Folder 1", "Universe Folder 2" folders contain the                              universes and $tenant_template$ folder contains the following three connations.



    3. The following parameters of this file have to be configured to refresh the connections:


    4. Now, we will configure the above parameters for the connections:


As you can see, I copied three connections into the connection template folder, which is why I have specified the following three connection details. If you have more connections in the template folder, you need to specify the DB information for all of them as below.






    5. This is how I have specified these parameters in the file:



Note: You would need to specify this information as below.

E.g. ccis.dataconnection.dbcredentials1=ARiyILFu7565Dwf8VolZzqQ;tenantDBN;tenantDB;userABC;Password123

Here, ARiyILFu7565Dwf8VolZzqQ is the CUID of a connection present in the template connection folder, tenantDBN is the database server name, and userABC and Password123 are the user ID and password used in that connection. As I said, if you have multiple connections you need to repeat this information for each of them as ccis.dataconnection.dbcredentials1, ccis.dataconnection.dbcredentials2, ccis.dataconnection.dbcredentials3, and so on.
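Putting that pattern together, the relevant block of TestMTM_def.properties would look roughly like this (the second and third CUIDs, and all credentials, are placeholders following the example above, not real values):

```properties
# One entry per connection in the template folder, following the pattern
# <connection CUID>;<DB server>;<DB>;<user>;<password> described above
ccis.dataconnection.dbcredentials1=ARiyILFu7565Dwf8VolZzqQ;tenantDBN;tenantDB;userABC;Password123
ccis.dataconnection.dbcredentials2=<CUID_of_second_connection>;tenantDBN;tenantDB;userABC;Password123
ccis.dataconnection.dbcredentials3=<CUID_of_third_connection>;tenantDBN;tenantDB;userABC;Password123
```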

    6. After configuring this file, run the MTM command as shown below.


Command: BOInstallDIR\SAP BusinessObjects Enterprise XI 4.0\java\apps\multitenancyManager\jars>java -jar multitenancymanager.jar -configfile TestMTM_def.properties


    7. This command will create the tenant and will copy the Webi Reports, Universes and Connections to the tenant's folders.


Points To Be Noted:

  1. MTM creates the same folder structure for reports, universes, and connections as created in the template.
  2. MTM copies the reports, universes, and connections from the template to the tenant's folders.
  3. MTM does not rename any report, universe, or connection; it uses the same names as specified in the templates.
  4. MTM first updates the connections and then maps the universes to the Webi reports.



I have tried this on BI 4.1 SP06: it successfully refreshed the connections, as I can see in the MTM logs, but it failed while mapping/updating universes for the Webi reports.


MTM Logs:

2015-11-27T13:27:46.129+0530 | Object info | CI_APPOBJECTS | TestConn2 | CCIS.DataConnection | Successfully updated object TestConn2 (id=8,609)
2015-11-27T13:27:46.163+0530 | Object info | CI_APPOBJECTS | TestConn1 | CCIS.DataConnection | Successfully updated object TestConn1 (id=8,610)
2015-11-27T13:27:46.194+0530 | Object info | CI_APPOBJECTS | TestConn3 | CCIS.DataConnection | Successfully updated object TestConn3 (id=8,611)


I had three connections in the template, and the logs show the connections were updated successfully. It failed while mapping the universe to the Webi report.


MTM Logs:

An error occurred while updating Web Intelligence document HGBSXXXX_Test MTM Report. The detailed error message is ChangeSourceHelper cannot load target DataSource: Test UNV 1 Universe.

I have found SAP note 2140459, which says that this is currently under investigation by the SAP development team. According to the note, it happens with linked universes only, whereas I have not used any linked universe for this blog, so I believe the issue is not specific to linked universes.

I have also been in touch with the SAP support team to check whether this is the same issue. I will post an update once I hear back from the SAP development team.


I believe once SAP resolves this "ChangeSourceHelper" issue, the above steps will be helpful for mapping/updating Webi reports and universes using MTM in BI 4.1.

To Know More About MTM:

Please refer to this guide and the following blogs.


Overview of SAP BI 4.x Multitenancy Management Tool by Christina Obry.

Multitenancy Management tool setup, new features in BI 4.1 by Sohel Ahmed Syed

How To Use Multi Tenancy Management Tool(MTM) For Refreshing Crystal Reports in BI 4.1 by Swapnil Yavalkar

Pattern Book on SAP BusinessObjects BI 4.1 Performance & Load Testing using JMeter

Dear BusinessObjects BI Community,

I am very happy to announce the release of Phase 4 of the SAP BI Pattern Books project, with a book on a much-requested subject: how to properly configure and run performance and load testing with Apache JMeter on the SAP BusinessObjects platform.

Key Highlights of the book:

  • The book was developed on a clustered, load-balancer-configured SAP BusinessObjects BI Platform 4.1 SP05 architecture deployed on a total of 26 CPUs with 116 GB of memory
  • BusinessObjects Web Intelligence was the main reporting tool, used through the Semantic Layer against an SAP HANA data source with around 3 million records, with tests executed against tens of thousands of records at a time
  • More than 12 test cycles were executed with a maximum of 300 active concurrent user sessions, and the results have been captured and documented with screenshots


Links to the SAP BI Pattern Books:

Link to the Main Page

Pattern Book on BI 4.x Performance & Load Testing


Please feel free to share this with our customers, partners and rest of the SAP BusinessObjects BI world.


Your feedback and comments are highly appreciated.


For comments, follow-up questions and related resources, get in touch with Jim Rapp (Check his SCN Profile here and he can also be reached via Twitter @jmsrpp), Jonathan Brown (JB's SCN Profile and he can also be reached via twitter @its_me_jonny) and myself (my SCN Profile and via twitter @srajagpl).




I had real problems with performance in BI Launch Pad: reports took a very long time to open, and navigation in BI Launch Pad was awful.

1. Let's modify the Tomcat settings, because by default they have low values:
  o JavaHeapSize (-Xmx) from 2G to 4G
  o MaxThreads from the default (200) to 900

1.1 To modify JavaHeapSize:
cd <bo_inst_folder>/sap_bobj/tomcat/bin
Edit setenv.sh:

# set the JAVA_OPTS for tomcat

JAVA_OPTS="-d64 -Dbobj.enterprise.home=${BOBJEDIR}enterprise_xi40 -Djava.awt.headless=true -Djava.net.preferIPv4Stack=false -Xmx4g -XX:MaxPermSize=384m -XX:+HeapDumpOnOutOfMemoryError -Xloggc:<bo_inst_folder>/sap_bobj/tomcat/logs/tomcat.gc.log -XX:+PrintGCDetails -XX:+UseParallelOldGC"

1.2 To modify MaxThreads:
cd <bo_inst_folder>/sap_bobj/tomcat/conf
Edit server.xml and adjust the non-SSL HTTP/1.1 Connector on port 8080:

<Connector port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" maxThreads="900" URIEncoding="UTF-8"/>
  <!-- A "Connector" using the shared thread pool-->
    <Connector executor="tomcatThreadPool"
               port="8080" protocol="HTTP/1.1"
               redirectPort="8443" />




2. Cleaning of old installed patches and service packs from BO. What does this mean?
Go to <bo inst folder> and run ./modifyOrRemoveProducts.sh (on Linux).
As you can see, my server had accumulated many different installations over time, and it is important to delete the old ones!


3. Cleaning of the logging directory, <bo inst folder>/sap_bobj/logging:
  • Delete trace files
  • Delete logs older than 30 days
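As a sketch of how this cleanup might be scripted on Linux (the install path below is a hypothetical default; point LOGDIR at your own <bo_inst_folder>/sap_bobj/logging):

```shell
# Example cleanup of the BI Platform logging directory.
# LOGDIR is an assumed default path; override it for your installation.
LOGDIR="${LOGDIR:-/opt/sap_bobj/logging}"

if [ -d "$LOGDIR" ]; then
  # Delete trace files (*.glf is the usual BI 4.x trace extension)
  find "$LOGDIR" -type f -name '*.glf' -delete
  # Delete log files older than 30 days
  find "$LOGDIR" -type f -name '*.log' -mtime +30 -delete
fi
```

Run this from a scheduled job (e.g. cron) so the directory never grows unbounded.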
4. Tuning the APS; to start, you can use the CMC wizard.
5. Set the APS tracing property to Unspecified (which means BO_trace.ini is used)


Level - Description
Unspecified - Forces the use of BO_trace.ini
None - Only critical events such as failures are logged
Low - Ignores warning and status messages
Medium - Only the least important status messages are ignored
High - Includes all logging messages


6. Disable unused services in the CMC, for instance the Crystal Reports or Analysis services.


7. Reorganisation of the database statistics of the BO repository and BO auditing databases.


This is a highly important point. As you know, every database manages its statistics according to the number of rows, and if you do not collect statistics, both performance and refresh rates degrade.
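As an illustration only, on an Oracle-based CMS repository the statistics refresh could look like the following; the schema name is an assumption, and your DBA's standard procedure for your actual database should take precedence:

```
-- Gather fresh optimizer statistics for a (hypothetical) repository schema BOREPO
EXEC DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'BOREPO');
```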

Join Ty Miller, VP of Product Management, SAP Analytics, on Thursday, November 19th at 7:00 AM PST / 16:00 CET to learn about the revolutionary release of SAP Cloud for Analytics.


On this call, Ty will discuss the details of the release, the future of cloud analytics at SAP, and how you can get exclusive access to the software.


Registration for the webinar is required and is available via the SAP Customer Experience Group.




Hello everyone,


I would like to take some time to explain the behavior observed when we schedule a report that contains prompts and mark "BI Inbox" as the destination.


I am pretty sure most of you are quite aware of this functionality, as it has been in the product for a while; however, I am trying to explain the reasoning from a product and design perspective.


The reason I am reaching out through SCN is that there are some discrepancies in understanding the significance of the "BI Inbox", aka the managed destination.


End users see a difference when they compare the property bags of an instance delivered to the "BI Inbox" with those of the same instance copied to another folder of the user's choice.


This difference needs to be understood; however, a few customers have reported it as a bug.


Steps involved in scheduling a report with "BI Inbox" as the destination:


1) Select the report you wish to schedule.


2) Right-click and select Schedule, or select the Schedule option from the menu.


3) The user is redirected to the schedule page, where the destination, format, and other desired schedule options are chosen.


4) Once the customizations are complete, the user clicks Schedule.


5) The History page opens and displays the status of the schedule (Pending/Running/Success/Failed).


Internally, the servers responsible for this activity are listed below (assuming we are scheduling a Webi document; the processing server will vary for a Crystal Reports document):



Webi Processing Server

Job Server

Job Server Child

Destination Job Server (as we have chosen "BI Inbox" as the destination)


On top of this, the "BI Inbox", which is a managed destination, does not offer an option to schedule the report; however, we can view, refresh, or Save As the report from the user's inbox.


This is where a noticeable platform design decision comes in: while viewing a report instance, the platform relies on the respective plug-in.


In this case, viewing is done through the Webi plug-in, which is why the user is able to see all the prompts, which are an integral part of the .wid file.


The user can even refresh the instance, or use the Save As option to save it; in that case the Webi plug-in supplies all the required property values.


This is not true when scheduling the report, as the Destination Job Server does not need to know about the report kind; the report has already been processed by the respective processing server.


The Delivery is based on the schedule options provided by the user and the delivery rules.


These delivery rules are different for each of the four destinations (managed "BI Inbox", unmanaged "File System", SMTP "Email", and FTP "File Transfer").


This scenario has been validated with the affected domain teams and the Platform team, and we also performed tests on 3.x, 4.0, and 4.1 and found the behavior to be consistent.


Hence I would conclude that this is not a bug, as each destination is designed for a specific purpose.


Best Regards,


What's this about?


This is an adjunct to Steve Fredell's document attached to 1631734 - Configuring Active Directory Manual Authentication and SSO for BI4, which covers manual authentication and SSO for BI4 and is Tomcat-centric.


However, at a recent client a similar use case arose, but with BO deployed on NW 7.31 rather than Tomcat as the web application server.


There is also a worthwhile troubleshooting guide for Tomcat in 1476374 - ***Best Practices*** including Basic and Advanced AD Troubleshooting Steps for Manual Logon, NTLM, Kerberos and Vintela Single Sign On.





I had been asked to configure AD authentication. Following Steve Fredell's "Configure Active Directory Manual Authentication and SSO for BI4", I could successfully get AD authentication working with Tomcat, but got stuck with NW as the web application server.

After spending close to a day of my customer's time attempting this, I failed to get it to work and posted this forum message: AD authentication for BI4.0 on NW7.3x portal


-----------------  Forum Post ----------------------------



However, when I use the same BOE/CMC imported earlier into the portal, I get the error:



Account Information Not Recognized: Active Directory Authentication failed to log you on. Please contact your system administrator to make sure you are a member of a valid mapped group and try again. If you are not a member of the default domain, enter your user name as UserName@DNS_DomainName, and then try again. (FWM 00006)




So Tomcat obviously understands the Kerberos authentication. I have made sure the same service principal name and the same AD administrator credentials are in use by both Tomcat and the portal; both use SAPService<SID>.


Any tips on what I need to do to get Windows AD authentication to BOE/CMC working from an NW7.3 portal? Do I need to re-import the BOE deployment?



Swapnil Yavalkar responded saying he had been successful, although without capturing any details; this was good news, as it at least proved it was possible.



Unfortunately, the available notes say almost nothing about getting NW7 working as the web application server, so I placed a service call to SAP about the problem. Days later I got the response that an unpublished internal note, 1852377, describes the solution to my problem; I had got 95% of the way there but had not performed the sub-node configuration in the NW config tool.

While I cannot republish that note, I will post my solution (albeit sanitized of customer details), which covers much the same trajectory as 1852377.


Step 0. Ensure service principal names for Kerberos


Find and ensure you have set service principal names (SPNs) for the service account running your NW (portal) web application server.


You may need to use setspn -A to configure your principal names; the full scope of that is beyond this blog, so for a start try here. Below is a sanitized list of principal names for the service owner (which is more than needed).
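As an illustration only (the host and account names below are placeholders, not the customer's real values), SPN registration and verification with setspn on a Windows domain member looks like this:

```
setspn -A HTTP/portalserver.mydomain.com MYDOMAIN\SAPServiceSID
setspn -A HTTP/portalserver MYDOMAIN\SAPServiceSID

REM List the SPNs currently registered for the account:
setspn -L MYDOMAIN\SAPServiceSID
```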




Step 1. Create your Kerberos configuration file


You will need a krb5.ini file, as per the notes above, in C:\windows. I copied mine from an existing Tomcat configuration I had working.
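For reference, here is a minimal krb5.ini sketch of the kind typically used for BI4 AD authentication; the realm name, KDC host, and encryption types are placeholders that must match your own domain:

```
[libdefaults]
    default_realm = MYDOMAIN.COM
    dns_lookup_kdc = true
    dns_lookup_realm = true
    udp_preference_limit = 1

[realms]
    MYDOMAIN.COM = {
        kdc = addomaincontroller.mydomain.com
        default_domain = mydomain.com
    }
```

Note that the realm name must be written in uppercase.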







Step 2. Add the Kerberos login module in NetWeaver Administrator


You will need to enable the krb5 module in NWA: http://theportalserver.com:50200/nwa




Configuration -> Authentication and Single Sign-On -> "Login Modules" tab



Create a module with the display name Krb5LoginModule and the class name com.sun.security.auth.module.Krb5LoginModule





Then, in the "Components" tab, create a custom configuration called com.businessobjects.security.jgss.initiate


Choose the lower authentication stack tab, then add the login module "Krb5LoginModule" with the flag "REQUIRED"


Don't forget to save.



Step 3. Using the SAP Java configuration tool, add the Java options.


I found it is best to do this during downtime of the NW portal.



Run configtool.bat from usr\sap\<SID>\J<id>\j2ee\configtool




I normally choose expert mode.


Choose the instance then choose "VM Parameters" tab


Select sap from the vendor list and global from the platform list.


Choose the "system" tab and click New.
Add the name java.security.krb5.conf with the value C:\windows\krb5.ini


Create another parameter, javax.security.auth.useSubjectCredsOnly, with the value "false"
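Taken together, the two entries above are equivalent to passing these JVM system properties to the server (the krb5.ini path is the one created in Step 1):

```
-Djava.security.krb5.conf=C:\windows\krb5.ini
-Djavax.security.auth.useSubjectCredsOnly=false
```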


Choose File -> Apply changes.



Step 4. Adding sub-nodes for com.businessobjects.security.jgss.initiate policy.

Continuing with config tool

Choose Tools -> Configuration editor







Choose Edit mode.



Navigate to Configurations -> Security -> Configurations -> com.businessobjects.security.jgss.initiate -> security -> authentication.




Right click and choose "Create sub-node"


Choose "Value-Entry", name it create_security_session, and set the value to "false"


Then apply changes again.


Step 5. Restart NW Portal






Testing: after about 10 minutes the system will restart, and you should be able to authenticate with the NW7.3 web application server.



For me it just worked the first time, so I don't have any troubleshooting validation, other than: check your syntax at each step.




a) Get it working with your Tomcat server first, as per the guide attached to note 1631734, using the troubleshooting guide 1476374

b) Then follow this blog or note 1852377

Hi All,


We did a great PoC last month, and I would like to share with you the results and what we did.


The PoC history

The customer has had SAP HANA installed, with one productive BW system (BW on HANA), since 2013. Since the initial implementation they have seen performance improvements in many BW queries, DSOs and cubes. However, one of their most important reports, accessed by more than 1,000 users, still performed poorly, taking around 6 minutes to refresh its data.

This generated a lot of negative noise across the whole company, with comments like:

“SAP HANA doesn’t work”

“After the migration of BW to SAP HANA, the performance of my report didn’t change significantly”

"SAP HANA is not what I expected”

With that in mind, we held a meeting with the customer and scheduled a small assessment. During the assessment we found several issues related to modeling best practices that were hurting performance:

  • The union of 4 cubes, resulting in a data set of almost 1 million rows;
  • ABAP code in the SAP BW query to convert currencies and to restrict data based on one table (they were not using the standard Analysis Authorizations);
  • Two hierarchy structures combined in the same query/report;
  • Usage of Web Intelligence for a requirement better suited to Design Studio;
  • Execution of many queries in Web Intelligence;
  • Many calculations done at report level.

Technically speaking, this is what we found in their BW system when we started our assessment:


The PoC challenge:

With that in mind, and after showing these issues to the customer, they challenged us: "OK, SAP. We want this report running in 5 seconds or less."

What we did, and how:

Before thinking about how to technically improve the performance of this report, we started talking with the customer's IT area to break some paradigms, asking questions like "Why don't you do this directly on HANA?" or "Did you try a different tool?". The business area was also involved throughout the process, testing whether the report worked in terms of usability and whether the values were correct.

These teams were a critical success factor: they helped a lot in validating features such as security, transformations and filters, and in validating the data as we developed the HANA views and the dashboard.

We then decided to develop the same logic directly on SAP HANA, accessing the calculation views automatically created by SAP BW for each cube and modeling all the transformations there.

As the front-end tool, we decided the best fit would be Design Studio, as it does not generate a microcube and it is optimized for reading HANA views and for using parent-child hierarchies.

These were the customer's requirements for the dashboard:

Key Functionalities:

  • Good Performance (5 seconds)
  • Hierarchies (parent-child)
  • Custom Row-Based Security (based on one table)
  • Currency and Unit of Measure Conversion
  • Multi-Language
  • Drill down to more granularity (not possible with Web Intelligence)

After some weeks of hard work, we showed the customer all of these functionalities built on HANA views, and the result was impressive:

A reduction in time from 6 minutes to less than 5 seconds (4.8 seconds on the best execution).

An improvement of 72x!

To give a better sense of how much this improves the customer's operation and how representative this number is, imagine cutting the roughly 10-hour trip from São Paulo to New York by a factor of 72: you would arrive in New York in just over 8 minutes.



After this work, the customer was satisfied, convinced that SAP HANA really is fast, and saw that with SAP HANA they can improve their processes by increasing BI adoption across the whole company.

The customer also saw that the problem was never HANA itself, but the way they had built their BW query and the front-end tool they were using.

