
BI Platform


Today SAP Insider hosted this online chat with SAP's Sathish Rajagopal, Harjeet Judge, Maheshwar Singh, and Gowda Timma Ramu.


For the full Q&A, check the replay here. Also check out the BI 2015 event in March.


Below is a small subset of the Q&A, edited and reprinted with SAP Insider's permission:


Question and Answer:



Q: Will BW and BO merge in the future? As HANA is positioning BODS as a primary component for data services and Lumira is on the horizon, what will the BO roadmap look like?

A: There will be more / tighter integration between the two - SAP BW and SAP BusinessObjects - but there is no plan in the roadmap to merge these two technologies. SAP BusinessObjects will continue to be our Enterprise BI platform, which will be the foundation for all future innovations around analytics, whereas BW will continue to leverage the power of SAP HANA to store and process enterprise data.



Q: Is there a straightforward approach to check which reports are created on a universe?

A: There is no straightforward approach to get this information. You will have to write queries in Query Builder to find the reports that are associated with a universe. You may potentially have to write more than one query to get the information you are looking for. The other option would be to write an SDK script with some logic to run the queries. You can also explore the use of the Information Steward tool that you mentioned in your other question. It will add value in extracting metadata from the BI system database.
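As a sketch of the two-step Query Builder approach (run via the AdminTools console), something like the following can work; the universe name 'eFashion' is purely a placeholder, and the exact PARENTS syntax should be verified against your version's documentation:

```sql
-- Step 1: list universes to find the one you care about
-- (universes live in CI_APPOBJECTS)
SELECT SI_ID, SI_NAME FROM CI_APPOBJECTS WHERE SI_KIND = 'Universe'

-- Step 2: list documents built on a given universe
-- ('eFashion' is a hypothetical universe name)
SELECT SI_ID, SI_NAME, SI_KIND FROM CI_INFOOBJECTS
WHERE PARENTS("SI_NAME", "'eFashion'")
```

As noted above, you may need additional queries (or an SDK script) to cover every document type in your system.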



Q: Can you tell us more about the Free Hand SQL capabilities added to BI 4.1 SP5?

A: Currently we support migration of Desktop Intelligence Free Hand SQL documents to Webi and refresh in Webi, supported via the extension point framework.

The plan is to fully integrate this into the UI in future releases, with additional connection and query management capabilities.

Part of the Free Hand SQL support was introduced in the BI 4.1 SP05 release, but we are working on supporting the remaining parts in the SP06 and 4.2 releases.


Q: I would like to know the best way to connect a dashboard to the cubes.

A: If you are using BW, my suggestion would be to explore using SAP Design Studio to build your dashboard. Design Studio is designed from the ground up to support BW scenarios. You can use SAP BusinessObjects Dashboards as well, using various connectivity options:

1) Direct BICS connection to BW, if you plan on hosting the dashboard on the NetWeaver Portal

2) Build a Webi report on a BW query, expose the block as a web service, and use a BIWS connection in Dashboards.




Q: What are the main benefits of moving from BI 4.0 SP5 to BI 4.1 SP0?

A: It depends on your usage of the BI Platform. The following SCN blog lists the enhancements to the platform and client tools in BI 4.1 SP05:

SAP BusinessObjects BI4.1 SP05 What's New



Q: We are planning an upgrade to BI 4.1 sometime next year (fingers crossed); is there anything in particular we should look out for?

A: The answer would depend upon which version you are upgrading from. A few things to pay attention to:

1) Know that BI4.x is a 64-bit architecture, so the hardware requirements may be different

2) Understand that BI4.x offers 32-bit and 64-bit DB connectivity, depending upon which client you are using for reporting. You will have to configure both 32-bit and 64-bit DB connectivity

3) Pay attention to sizing your system. If you are on 3.x, don't expect to run your BI4.1 system on the same hardware

4) Split your Adaptive Processing Server, as this will impact system stability. You can find a document on SCN on how to do this


Q: Can a Webi report connect straight to a HANA view, without the need for a universe? Any plans to deliver this functionality?

A: Direct access to HANA views from Webi is planned with BI 4.2.


Q: We are on Bex3.5 and trying to decide whether to move to Bex7, Analysis for Office, or another product. We will likely not install the entire Bobj suite, but have a ton of workbooks

A: I would suggest you check the differences and, importantly, the gaps between these options and then decide, because you may be using unique or specific functionality in your environment. It wouldn't be wise to suggest one way or the other. But ultimately you need to upgrade from 3.5 for sure.



Q: Most of our clients are on, and will remain on, XI 3.1 and BI 4.1 in parallel. 3.1 InfoView supports Java 1.6 build 32, but the 4.1 BI Launchpad does not. This would mean developers and users can't log in to both environments unless we do some manual overrides (which is not supported by our network/security teams). Is there any alternative to this?

A: I assume you are referring to the Java version on the client. There is no easy way to deal with this. A couple of options:

1) Use Citrix and have some clients go through Citrix, which has a different version of the JVM

2) Explore the use of the HTML query panel for Web Intelligence



Q: We use a portal to present our reports to customers using OpenDoc. We have one server with one Webi processing server. We are always running into issues where user sessions are stuck and BusObj is not timing out the sessions. We also have an issue where, at a specific time at night, the Webi processing server always throws warnings that it is high on memory or that the maximum user connections are logged, when there are zero users logged in and using the Webi processing server. Any advice or insight on these issues?

A: OpenDoc sessions time out by default at about 20 minutes. This time is configurable. You could also use the kill session option in the CMC to release idle sessions; however, you need to be on 4.1 SP3 or greater.


Q: Which BusinessObjects BI 4.1 tools can be used to access a Transient Provider?

A: Tools like CR4E, Webi, and the Analysis clients that use BICS for data access can access a Transient Provider.



Q: I am missing functionality for users to add comments to reports. Is there a standard solution available?

A: BI4.1 has a collaboration feature that supports integration with SAP JAM.



Q: When will the UNV go out the door and when will UNX take over? Should we panic now and convert all our UNVs to UNXs?

A: Our goal is to support innovation without disruption. We are not planning to end .unv support any time soon, which is why you still see the Universe Designer in BI4.1. Having said that, most of the new functionality is only added to .unx universes, to entice you to eventually make the conversion. My advice is to continue to use .unv universes for your existing content and do your new development on .unx. You should also have a mid- to long-term plan to convert your universes to .unx to take advantage of the new features.



Q: I would like to put my results on a world map using Design Studio. Which tools do you offer? Will there be full map integration for Dashboards available with Google Maps or SAP's own world map?

A: You can use SDK components delivered by partners

List of Design Studio SDK Components

A: The full geo map support in Design Studio is planned for future release.


Q: Has the audit functionality improved in 4.1 compared to 3.1? If yes, what has improved?

A: We introduced additional functionality in the BI 4.1 release, such as more events to capture, and the schema itself has been improved with a totally new structure for better performance.



Q: Do you know the release date for Design Studio with offline data support?

A: We don't have a timeline for this, but it is a roadmap item for the future. I would encourage you to put this idea on Idea Place if it's not already there. You can also vote on the idea... the more customers that vote for the idea, the more likely you will see the feature in the product.



Q: Global Input Controls (one set of input controls controlling all tabs): is it happening in SP6, or is it available in any earlier fix packs? This should have been a logical add-on feature in 4.1, as it was a pending idea on ideas.sap.com for a very long time.

A: Yes, Global Input Controls is planned for BI 4.1 SP06.



Q: So are you saying we can link universes in 4.1? I thought this feature was no longer there in 4.1?

A: You are correct. It is planned for a future release, most likely BI4.2.


Look for more in March at BI 2015

A fantastic opportunity for you to learn more about BusinessObjects BI 4 is currently being offered by SAP via OpenSAP.


Enroll now:




Here is the course summary:


“We live in a world where big data, people, machines and processes are interlinked in an internet of everything. Immense value can be unleashed by connecting this information to the work we do every day, enabling us to quickly discover what is happening and then act with the power of collective insight. Learn how to unleash this power by implementing SAP BI 4 with our new SAP BusinessObjects BI 4 Platform Innovation & Implementation Training course offered through openSAP.

Successful deployments require proper sizing, hardware, configuration, security and administration. This course, designed for experienced BI system administrators, is brought to you by the Strategic Customer Engagements Team, who are SAP’s most senior SAP BusinessObjects BI specialists.”


Enjoy the learning experience!

Dear SCN Community Members,


We are pleased to announce the availability of the SAP BusinessObjects BI4 Custom Implementation Report. With this report, we will help you understand the best option for implementing your SAP BusinessObjects BI4 deployment based on your organisational requirements. Based on a set of questions and your input, an Implementation Report will be generated containing a long list of recommendations and links to relevant content to further enable you in deploying SAP BusinessObjects BI4 successfully.




You can run your own Custom Implementation Report via: https://www.sapbusinessobjectsbi.com/implement/


Please share your feedback with us!


Merlijn Ekkel


Director BI Solutions | SAP GMT BI | Solution Management

SAP BI 4.1 IDT SP5 Business Security & Data Security Profile Filter Implementation




SAP BO 4.1 SP5

IDT - Single Source UNX

Windows 2008 Server


Implementation Scenario:


My project has a requirement to implement security on top of the universe classes/objects, in such a way that certain departmental users have access only to a subset of objects for their department, e.g. Purchase or Receipts. We also want to use the same universe for other users who are not in any department but need access to all the classes/objects. The catch is that those department users will have an extra filter condition in the definition of their security profiles.


I will explain this scenario with the actual problem I faced and how I worked around it. Initially, my assumption was that this column/object-level security could be achieved through Business Security Profiles ONLY, even if you have filters required on top of classes/objects. This turned out to be FALSE! Some playing around with the security profiles helped me understand the reason(s) behind it. I cover it in the later part of this blog.

To start with, I created two views:


1. Created View with Purchase Only objects in IDT on my Single Source UNX.

2. Created another View with Receipts Only Objects in the same universe.



Then I created two Business Security Profiles:


1. BS_Purch_Only_Profile with filter PO_Type = 'BULK'

     - GRANT Purch_only object


2. BS_Receipt_Only_Profile with filter R_PO_Type='BULK'

     - Grant Receipt Only object



My requirement is to define Business Security Profiles in IDT in such a way that USER1, who is in both of the above profiles, gets the above filters applied separately in the WebI query. Currently, the net Business Security Profile query with the above approach results in PO_Type = 'BULK' AND R_PO_Type='BULK' in the WHERE clause of a WebI report on either Purchases or Receipts.



This generation of 'AND' for the filters is by SAP's design (for Business Security Profiles) and can be handled through a Data Security Profile.






Observations and Work around:



i) Filters in Business Security Profiles always come combined with 'AND' in the WHERE clause (net profile security) if the user is in multiple Business Security Profiles. So there is no other way to overcome this issue, as this is by SAP design.


The reason for this design is that classes and objects can spread across subject areas, so these filters primarily help at the class and object level, irrespective of the tables being hit in the database. If you have a further level of security requirement, you need to control the row-level fetch, which is nothing but the Data Security Profile.

ii) I used Data Security Profiles to apply filters, and Business Security Profiles to display a subset of objects (e.g. Purchase vs. Receipts class objects) in IDT, as per the requirements.


Steps in Detail:


- Create Data Security Profile on top of the Purchase Only View.

- Add filter to the Data Security Profile: DS_PO -> Rows -> Add PO_Type = 'BULK' condition.

- Similarly, create a Data Security Profile for the Receipts Only view: DS_Receipts -> Add R_PO_Type='BULK' condition.


Now USER1 is assigned both the Data(DS_PO & DS_Receipts) and Business Security Profiles(BS_Purch_Only_Profile & BS_Receipt_Only_Profile).

The Net Security Profile gives my expected results!


NOTE: You cannot test security profiles in IDT -> Business Layer -> Queries, but you can check the Net Security Profile in the Security Editor by selecting the universe on which the profiles are defined and the user together!



In WebI, when USER1:

Query1: creates a query with the Purchases Only view, it generates ONLY the PO_Type = 'BULK' filter instead of both filters, as in the initial problem

Query2: creates a query with the Receipts Only view, it results in ONLY the R_PO_Type='BULK' filter


Both queries execute in one report and give me two separate tables.


The above 2 steps helped to implement my requirement.




- Users who are only in one group will have only the one required filter instead of all the filters in the WHERE clause!

- Users not in any security profile will have NO filter conditions. This is an awesome feature, in my opinion!

- Replace my USER1 with a Purchase or Receipts department group created in the CMC, and the entire department's security is controlled!




Hope this helps those who want similar security implementations! I encourage you to add questions/comments on similar issues.




Communication to Identity providers like Active Directory, LDAP and SAP was covered in part 1, and securing the web tier was covered in part 2.

Now let's look at the actual BI servers, like the Central Management Server (CMS), File Repository Server (FRS), and others.


We'll look at port restrictions, potential firewall setups, SSL/TLS and other configuration switches.


FIPS 140-2

By now you may have read about the -fips parameter on the SIA. FIPS stands for Federal Information Processing Standard. I cover this mode more in my data security blog. The quick summary is that BI4 uses FIPS-certified encryption libraries to perform its encryption.

Turning this switch on (add "-fips" on the SIA command line) prevents usage of older clients and disables some older functionality. If you do not have any xir3 clients or custom applications running against your BI4 system, there is no reason NOT to have this switch on. Do expect this to become the default in upcoming maintenance releases, where you will instead need a special switch to turn ON old functionality; by default, any xir3 or older client will NOT be able to connect.


It is not just about enforcing stronger BI4 security. By disabling older functionality, you again reduce the attack surface: a server not accepting calls based on older functionality will be harder to exploit. If you're familiar with the POODLE attack, you'll know, for example, that the latest recommendation is to outright disable SSLv3 protocols and use strictly TLS. A similar concept applies here.

Minimum Privileges

Creating a special locked-down user to run BOE can be worthwhile. The built-in Windows system account is actually quite powerful.


The rights required on the local computer where the SIA is running are as follows:


-Logon As a Service.

-Read/Write to HKEY_LOCAL_MACHINE\SOFTWARE\SAP BusinessObjects\Suite XI 4.0

-Read/Write to Install directory (specifically Write access to the Log Locations).

The important part here is the account should NOT be an Administrator on the local machine.



Server to Server channel Encryption (CORBA SSL)

The how to steps for server to server communication encryption are detailed well in the BI4 admin guide, as well as in this online wiki for unix:

The client configuration is detailed in sap note 1722634

How much of a performance hit can you expect? It really depends on many factors, and there is often a tradeoff between performance and security, but rough guidance is a 10%-20% impact based on what I have seen so far.

File Repository Server

This is an important server to protect, because it contains your report content on the file system.  If the reports are saved as PDF or saved with data, that makes them very valuable to attackers.  There are a few additional things you can do to protect the content.

-Secure the FRS OS folders so that only the account running the SIA that hosts the FRS can access them

-Use file level encryption.  This can protect the content from unauthorized access through the local machine. 

-Virus scanning. For large deployments and heavy usage, this can be a big bottleneck on I/O, to the point that performance visibly suffers. For performance reasons, you may consider running scheduled scans in "off hours" rather than real-time virus scanning. Real-time virus scanning is by far more secure, but you can further mitigate risk by locking down what users can upload.

-Limit content types from being uploaded:

Rather than granting the generic "Add Objects" right, you can actually lock it down to content types and only permit CR, Webi, etc. types of documents. This will prevent a user from uploading a bad executable or batch file that another user then downloads and executes on their own machine. Of course one would hope that end users would know better, but prevention is your best defense.


Default Accounts

All BI installations start with a default "Administrator" account.  For a potential attacker, that is one known piece of information for trying a brute force attack.  Enabling auto lockout for failed attempts will certainly help mitigate this, however another thing you can do is to rename the default account.  Instead of "Administrator" use your own naming such as <Company>_BI_Admin.  For example SAP_BI_Admin.


Stale Accounts

Have people left the company? Maybe never even logged in? The fewer accounts you have, the less chance of an old stale password falling into the wrong hands, or of accounts being misused. It is again about reducing attack surface.


The following query, which you can run using the AdminTools console, will return a list of users by last logon time.
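The query itself did not survive the copy to this page. As a sketch of what such an AdminTools (Query Builder) statement could look like, assuming the standard SI_LASTLOGONTIME user property (verify the property name against your version's SDK documentation):

```sql
-- Hypothetical reconstruction: users with their last logon time
-- (user objects live in CI_SYSTEMOBJECTS)
SELECT SI_ID, SI_NAME, SI_LASTLOGONTIME
FROM CI_SYSTEMOBJECTS
WHERE SI_KIND = 'User'
```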


While these users may have content in personal folders you don't want to lose, consider disabling stale accounts.


Ports, Firewalls

Firewalls help you reduce the attack surface. In the simplest, happiest (from a security standpoint) workflow, all your users are web users and will only be connecting to BI Launchpad. In this case, the BI servers can be firewalled away from the end users. However, chances are you also have thick clients connecting. In that case, make sure the thick clients are limited to connecting from a trusted network zone, if networks are partitioned.

You can bind servers to a specific port in the CMC.


The CMS has both the name server port and the request port that you can configure:


By setting a specific range of ports to use or binding to specific ports, you can then use a firewall to further lock down and reduce the attack surface of your servers.


Keep in mind that thick clients must be able to communicate with the CMS, as well as the Input and Output file repository server.  There is a fairly complete overview of the server port communication described in the administration guide, section 8.14.2


Your IT department may have also put your database layer into a separate network zone, inaccessible to regular workstations. Yes, IT is making your life difficult, but for a good reason in the classical 3-tier architecture. Clients can, and for security purposes should, connect through the BI platform, which in turn connects to the database layer. This extra hop makes it more difficult for a connection to abuse or attack the database layer directly, where all your valuable data resides.

Database Encryption

The communication between the BI processing servers and the actual database can, and from a security standpoint should, be encrypted. To help you decide, a threat model should be done: how sensitive the data is and how isolated the data sources are, are just two considerations. Generally, one should assume that their network HAS been compromised and build out a security-in-depth approach. It is quite easy for someone in your company to fall for a phishing attack. You can set database encryption at the driver level, below being an example of a SQL Server driver:


CMS DB Encryption

The CMS repository does not store any data from your reports; however, it can store sensitive metadata such as connection information. This is automatically encrypted using a two-key mechanism as part of the BI4 built-in encryption. Again, this is described in my encryption & data security blog.

Using your database vendor's built-in encryption to encrypt the whole database may actually be overkill here, and is not something I would strongly recommend as necessary, though it is certainly a valuable 'security in depth' option. The advantage of selectively encrypting content, the way the BI4 process does, is that you do not suffer performance hits encrypting non-essential data, such as the metadata associated with a report's layout.


Temporary Files

During document creation and processing, temporary files will be created, and they may contain some data. Have a look at your temporary folders and lock these down to the account that the SIA service hosting these servers is running under. See below for the Crystal Reports processing server as an example.

Placeholders like "%DefaultDataDir% and others are defined under the placeholders tab of your Server Properties.

%DefaultDataDir% defaults to "/SAP BusinessObjects Enterprise XI 4.0/Data/"



Auditing is an important out-of-the-box solution to keep track of the usage pattern of the SAP BOE platform. Audit data is relevant both from an administration perspective and from a compliance perspective, for maintaining an audit trail over a specified interval of time. While the sample audit universe acts as a starter kit for reporting on audit data (http://scn.sap.com/docs/DOC-53904), knowledge of the underlying data model helps us build our own queries and reports and optimize them better for performance. As a starting point for understanding how auditing works and what information is audited, refer to the relevant chapters in the BI Platform admin guide, downloadable at http://help.sap.com/boall_en/ (e.g. in sbo41sp3_bip_admin_en.pdf, chapters 21 and 33 talk about auditing). There are also several insightful blog posts on auditing and audit reporting by Manikandan Elumalai on SCN.


Any SQL examples shown in this blog post are based on an audit database hosted on Oracle. However, the same can easily be adapted to any other query language syntax, as the table structures remain the same.


Audit Data Model:


The audit database is designed for both transactions and querying. Audit data is continuously written to this database by BOE, and at the same time audit reports/queries can be fired against it to report near real-time audit information.


There are two main transaction tables in the audit database: ADS_EVENT and ADS_EVENT_DETAIL. The remaining tables are either lookup or bridge tables. Any auditable action in BOE is captured as a unique Event_Id stored in ADS_EVENT, and each Event_Id will have one or more detail records (Event_Detail_Id) in ADS_EVENT_DETAIL. Both the event and its corresponding detail can be of specific types and can have other supporting attributes.


This core concept of auditing has remained unchanged since BO XI 3.1, though the number of tables has increased significantly in the BI 4.x audit database. The increase is primarily due to more attributes being captured and more normalization of the data structures.



BO XI 3.1 Audit Data Model



BI 4.x Audit Data Model






Audit Data Dictionary:


The best way to analyze the audit database is to use a GUI-based database client like Oracle SQL Developer. The following queries are helpful in listing the data dictionary:


select owner, object_name, subobject_name, object_type
from all_objects
where owner = '<Schema Name where audit tables are created>'
order by object_type, object_name;


select owner, index_name, index_type, uniqueness, table_name, table_type
from all_indexes
where owner = '<Schema Name where audit tables are created>';


desc <each table name>;


A clear trend emerges from the output of the above queries:

  • Only tables and indexes are present in the audit database. No views, procedures, materialized views, etc. exist
  • There is no enforced referential integrity between the tables, i.e. no primary and foreign keys
  • Index types are normal, and either unique or non-unique
  • Due to multilingual support being available by default in BI 4.x, all lookup tables (names ending with _STR) have 'Language' as an additional field
  • The field EVENT_DETAIL_VALUE in ADS_EVENT_DETAIL is of datatype CLOB. The remaining columns in all the tables are of varchar2, numeric, or date datatypes.


Building Audit Queries:

Common audit reporting scenarios may involve metrics like count of events, last <event type> timestamp, or count of users. All these metrics can be derived from the table ADS_EVENT. Supporting details for an event can be obtained from ADS_EVENT_DETAIL. Descriptions of attributes can be obtained from the lookup tables after joining with either ADS_EVENT or ADS_EVENT_DETAIL. It is important to apply suitable filters to the queries to optimize performance. Common filter criteria may be based on date, event type, detail type, language, etc.
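As an illustration of the simplest case, here is a minimal sketch of a "count of events per type over the past 7 days" query against ADS_EVENT (Oracle syntax, like the rest of this post; the 7-day window is an arbitrary choice):

```sql
-- Count audit events per event type for the last 7 days
select ae.event_type_id, count(*) as event_count
from ads_event ae
where ae.start_time >= sysdate - 7
group by ae.event_type_id
order by event_count desc;
```

To show event type descriptions instead of raw IDs, join the corresponding _STR lookup table, filtered on your language.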


Example scenario: reporting user group membership details for users who have logged into BOE in the past 30 days:


select * from (
select ae.user_name, --reconstructed opening lines; user_name column assumed
dbms_lob.substr(ad.EVENT_DETAIL_VALUE,2000,1) USER_GROUP
from ads_event ae, ads_event_detail ad
where ae.event_id = ad.event_id
AND ad.EVENT_DETAIL_TYPE_ID = 15 --Denotes detail type: User Group Name
AND ad.event_detail_value not like 'Everyone%' --To eliminate the 'Everyone' group records
AND exists
(select 1 from ads_event X where X.event_type_id = 1014 --Denotes event type: Logon
and X.event_id = ae.event_id and X.start_time >= sysdate-30))
WHERE rownum < 50001


The above query converts the CLOB datatype to varchar. Once converted, regular operations such as ORDER BY and DISTINCT, as well as string functions, can be applied to the results.


Concluding Remarks:


The above write-up is not an exhaustive reference on the audit database. Readers are encouraged to validate the above contents against the standard BI Platform admin guide. Comments are welcome to further enhance this blog post. Thanks for your time!

Have you ever seen a log file in your Application Server directory called TraceLog_<pid>_<date_timestamp>.glf?  Ever wondered what that was?


This log file is generated by a number of the BI Platform web applications and by default will contain only Error-level messages. Here is an example of one I found on my test machine:


Found in Directory: C:\Program Files (x86)\SAP BusinessObjects\tomcat\

Filename:  TraceLog_1140_2014_11_20_05_23_52_898_trace.glf



|64DF6F8D078E466397CBCD8D875B98240|2014 11 20 05:23:52.904|-0800|Error| |==|E| |TraceLog| 1140|  18|Start Level Event Dispatcher| ||||||||||||||||||||com.bo.aa.layout.DashboardManager||underlying implementation doesn't recognize the attribute

java.lang.IllegalArgumentException: http://javax.xml.XMLConstants/feature/secure-processing

  at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.setAttribute(Unknown Source)

  at com.bo.aa.layout.DashboardManager.setDocBuilderFeaturesForXXE(DashboardManager.java:134)

  at com.bo.aa.layout.DashboardManager.<clinit>(DashboardManager.java:161)

  at com.bo.aa.impl.DBServerImpl.<clinit>(DBServerImpl.java:397)

  at com.bo.aa.servlet.AFBootServlet.InitServers(AFBootServlet.java:80)

  at com.bo.aa.servlet.AFBootServlet.init(AFBootServlet.java:47)

  at com.businessobjects.http.servlet.internal.ServletRegistration.init(ServletRegistration.java:81)

  at com.businessobjects.http.servlet.internal.digester.WebXmlRegistrationManager.loadServlets(WebXmlRegistrationManager.java:127)

  at com.businessobjects.http.servlet.internal.digester.WebXmlRegistrationManager.registerRest(WebXmlRegistrationManager.java:209)

  at com.businessobjects.http.servlet.internal.ProxyServlet.readXml(ProxyServlet.java:368)

  at com.businessobjects.http.servlet.internal.ProxyServlet.registerInternal(ProxyServlet.java:395)

  at com.businessobjects.http.servlet.internal.ProxyServlet.register(ProxyServlet.java:317)

  at com.businessobjects.http.servlet.config.WebXmlConfigurator.register(WebXmlConfigurator.java:60)

  at com.businessobjects.bip.core.web.bundle.CoreWebXmlActivator.start(CoreWebXmlActivator.java:66)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:782)

  at java.security.AccessController.doPrivileged(Native Method)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:773)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:754)

  at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:352)

  at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:280)

  at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:272)

  at com.businessobjects.http.servlet.Activator.startBundle(Activator.java:129)

  at com.businessobjects.http.servlet.Activator.start(Activator.java:116)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:782)

  at java.security.AccessController.doPrivileged(Native Method)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:773)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:754)

  at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:352)

  at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:370)

  at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1068)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:557)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:464)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:248)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:445)

  at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:220)

  at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:330)


For those of you that have spent time looking at these types of messages, you will likely recognize a few things. The first is that the bulk of this error message is a Java backtrace. Backtraces are often read from the bottom up, and they give you an idea of the sequence of calls that occurred leading up to the error. In this case, we can see the error:


com.bo.aa.layout.DashboardManager||underlying implementation doesn't recognize the attribute

java.lang.IllegalArgumentException: http://javax.xml.XMLConstants/feature/secure-processing

This tells us what caused the error trace log entry, but we might be more interested in what happened leading up to this error. For that, we can traverse the backtrace to get an idea of what was going on before the error occurred. 

In this case, I have no idea what actually caused this error.  I just found it on my test machine from around 3 weeks ago.  But from the backtrace, I can make an educated guess that the cause was related to a Dashboard layout of some sort.  Regardless, this is not the purpose of this blog, so I will move on.

The error messages found in these TraceLog*.glf files are not usually enough to properly troubleshoot an issue.  To get proper details around what causes an issue, we need more verbose logging.


One way we can enable verbose logging for the BI Platform Web Apps is by enabling it in the CMC.  Section 25.4.1 of the BIP Administrator's Guide covers how to do this.  In the CMC, you can enable traces for the BI Launchpad, CMC, Open Document, Promotion Management, Version Management, Visual Difference and Web Services applications.


Another way to enable tracing for the BI Platform Web Apps is to follow the steps below.  I have found additional details in these log files that weren't available through the CMC-enabled logs:


Steps to setup Verbose logging for the TraceLog Application server traces (example for Tomcat)

  1. Go to this folder and copy the BO_Trace.ini:  C:\Program Files (x86)\SAP BusinessObjects\tomcat\webapps\BOE\WEB-INF\TraceLog
  2. Paste this file in the C:\Program Files (x86)\SAP BusinessObjects\tomcat directory and rename it to TraceLog_trace.ini
  3. Edit this file and change the line:
    sap_trace_level = trace_error;
    to:
    sap_trace_level = trace_debug;
  4. Find the line below it and change it as well:
    sap_log_level = log_error;
    to:
    sap_log_level = log_info;
  5. I also like to change append = true; to append = false;, so that the Process ID and date/time stamp are used in the naming convention of the log files.
  6. Save the TraceLog_trace.ini file and within a minute, you should start seeing some log files growing in the Tomcat directory.
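
Putting steps 3-5 together, the edited TraceLog_trace.ini would contain entries like these (only the changed entries are shown; everything else from the copied BO_Trace.ini stays as it was):

```ini
; TraceLog_trace.ini -- relevant entries after editing
; (remaining settings from the copied BO_Trace.ini are left unchanged)
sap_trace_level = trace_debug;
sap_log_level = log_info;
append = false;
```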


Here is an example of what my log files contain after enabling the above log levels:


|039A2887DCF24130ADA77A3BA3DBF3A6155|2014 12 17 14:57:25.015|-0800|Information| |==| | |TraceLog| 1144|  47|http-bio-8080-exec-7| ||||||||||||||||||||com.businessobjects.bip.core.web.bridge.ServletBridgeLoggingDelegate||servletbridge.relative.url.prefix.out.of.bundle: ../..


|039A2887DCF24130ADA77A3BA3DBF3A6156|2014 12 17 14:57:25.015|-0800|Information| |==| | |TraceLog| 1144|  47|http-bio-8080-exec-7| ||||||||||||||||||||com.businessobjects.bip.core.web.bridge.ServletBridgeLoggingDelegate||servletbridge.relative.url.prefix.to.root.of.bundle: ../../IVExplorer

We can see that the type of entry is now "Information", which tells us our settings are being used.


Now, this trace is quite verbose, so really the only time I would recommend using it is when you can reproduce an issue in a short period of time.  To deactivate the trace, just edit the TraceLog_trace.ini file and set the trace/log levels back to *_error. 


Do not delete the file, as deleting it will not deactivate the current trace levels.  Just edit and save the file to deactivate it.  If you do delete the file, you will need to restart Tomcat to disable the traces again.


Anyway, this trace can sometimes give you additional details that are not available through other tracing methods.  Be sure to deactivate it as soon as you are done using it though, as it does have a slight impact on performance.




In my previous blog, I covered securing of the communication of your authentication providers.

In this post, we will cover the configuration of the web tier.   This is your WAR file deployment, and probably the most exposed part of your deployment, especially if you're facing the public web.


Reduce the attack surface.

The less you have deployed, the less that can be attacked.   Although the default BI install will deploy a number of components, you likely don't need them all.

You may see a list like this of war files deployed:

AdminTools - designed for running advanced direct queries against the BI repository.   If you don't use this, remove it.   You could also consider running it on a separate, local access only deployment.


BOE - This is the core of the BI deployment; it includes the CMC, BI Launchpad and OpenDocument functionality.  Note that using wdeploy, you can split the CMC and BI Launchpad deployment, and put the CMC functionality on another, more locked-down application server.


dswsbobje - web service used by Crystal Reports for Enterprise, Dashboard designer, and your custom applications.  Again something you can remove if none of the above apply to you.


BusinessProcessBI - this is an SDK which is not needed for core functionality.  If you're not deploying custom applications that make use of this, this is something you can remove from your deployment.


clientAPI - contains Crystal Reports ActiveX controls for custom application deployment.  You can almost certainly remove this.


MobiServer & MobileBIService - if you are not deploying mobile, you should have no need for these.


docs - This is the default Tomcat documentation.  It is also available online, so there should not be any need for it to be deployed.  It contains information about the version of Tomcat, which should not be exposed.
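
Removing an app comes down to deleting its exploded folder (and any matching .war archive) from the webapps directory while the application server is stopped.  As a minimal sketch (the app names and paths below are examples for illustration; verify the list against your own deployment and keep a backup before deleting anything):

```shell
#!/bin/sh
# Sketch: remove BI web apps you have decided you do not need.
# Stop the application server first and keep a backup of webapps.
remove_unused_webapps() {
    webapps="$1"    # e.g. /opt/tomcat/webapps (example path)
    shift
    for app in "$@"; do
        # delete both the exploded folder and any matching .war archive
        rm -rf "${webapps:?}/${app}" "${webapps:?}/${app}.war"
    done
}

# example invocation (paths and apps are illustrative):
# remove_unused_webapps /opt/tomcat/webapps docs clientAPI BusinessProcessBI
```

On Windows deployments the same applies to the folders under the Tomcat webapps directory shown earlier in this series.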


Tomcat Security

Refer to your tomcat guide.  The following is an excerpt from the tomcat guide on default web applications:

Tomcat ships with a number of web applications that are enabled by default. Vulnerabilities have been discovered in these applications in the past. Applications that are not required should be removed so the system will not be at risk if another vulnerability is discovered.



Apache regularly publishes its list of fixed vulnerabilities here:


BI SPs regularly bundle updates of Tomcat.  SAP continually monitors the bundled components, tracks Tomcat's security listings, and delivers updates as part of the regular maintenance cycle.

If you are unable to stay on the latest support packages, you may want to consider reviewing the list of vulnerabilities and using your own update of Tomcat at least until such time when you can deploy the latest BI4.x support pack.


Tomcat User Account

The user account only needs to read files under tomcat.  Create a user for the tomcat service account, give the service account "Logon as a User" rights, and read only rights on the tomcat folder.


Hide CMS Information

The single biggest benefit is usability: users will not accidentally lose the information, or mistype it and try to connect to the wrong place.

There is no reason why someone should try to communicate anywhere other than the CMS, so set the cms.visible=false setting in the BIlaunchpad.properties and CmcApp.properties files.

Change the following:


# You can specify the default CMS machine name here



# Choose whether to let the user change the CMS name
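
The actual property lines were shown as screenshots in the original post; after editing, the relevant entries might look like the following (cms.visible is named in the text above; the cms.default host and port value is a placeholder for your own CMS):

```properties
# You can specify the default CMS machine name here
cms.default=mycmshost:6400

# Choose whether to let the user change the CMS name
cms.visible=false
```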



Now there is less chance of shared secrets, credentials or other information being redirected to a server of someone's choosing.


Secure the communication channel - Use TLS

This should be a fairly well accepted policy already.

While terms like HTTPS and SSL are thrown around, this should really mean "TLS" behind the scenes.  TLS is the newer protocol for secure communication.  SSLv3 has now been rendered insecure, and you should be configuring your application servers to use the TLSv1 or higher protocol.

If you are not using SSO exclusively to log on to the BI web apps (likely to be the case with the CMC, which does not support SSO), you should be encrypting the traffic and logging on with HTTPS.   Otherwise, the logon credentials will be passed from the browser to Tomcat or the application server of your choice in clear text over the wire.


You've heard of POODLE?  Disable SSLv3 in Tomcat while you're at it.



Do you use flash?  Dashboarding, aka XCelsius

The BI install deploys a file called crossdomain.xml.  It's an XML document that grants a web client (such as Adobe Flash Player or Adobe Reader) permission to handle data across multiple domains.

The default is very inclusive,


    <site-control permitted-cross-domain-policies="all"/>

    <allow-http-request-headers-from domain="*" headers="*" secure="false" />

    <allow-access-from domain="*" secure="false" />


and you should take steps to lock it down if you will allow hosting of Flash-based content.

As this configuration file is completely outside of SAP BI's control, please refer to Adobe's documentation for crossdomain.xml
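
A locked-down version might look like the following (the domain is a placeholder, and the exact policy you need depends on where your Flash content is hosted; check Adobe's crossdomain.xml specification before applying it):

```xml
<?xml version="1.0"?>
<cross-domain-policy>
    <!-- only this master policy file is honored -->
    <site-control permitted-cross-domain-policies="master-only"/>
    <!-- restrict access to your own (sub)domains, over secure channels only -->
    <allow-access-from domain="*.example.com" secure="true"/>
    <allow-http-request-headers-from domain="*.example.com" headers="*" secure="true"/>
</cross-domain-policy>
```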



Protect Credentials

If you're setting up Active Directory SSO, make sure you're not storing the credentials as a Java option; protect the password with a keytab instead.

Don't do this (notice the wedgetail.idm.sso password in clear text):


Do this instead:


1. Create a keytab with the ktpass command

The details for this are contained in the whitepaper attached to sap note http://service.sap.com/sap/support/notes/1631734

The whitepaper is a must for anyone setting up AD for the first time.


2. Copy the .keytab file to the c:\windows\ directory of the application server

3. Add the following line to C:\Program Files (x86)\SAP BusinessObjects\Tomcat\webapps\BOE\WEB-INF\config\custom\global.properties:

idm.keytab=C:/WINDOWS/<your keytab file name>

If you're using Trusted Authentication, make sure you secure the shared secret file, so that only the process that your web application server is running as can access it.   Consider using OS file level encryption to further lock this file down.

Web Application Container Server

If you are using the WACS to host your RESTful web services, or possibly the CMC, the configuration for secure communication is done through server properties in the CMC.


What about Cross Site Scripting, SQL Injection, OWASP TOP 10?!   IS IT SAFE!!??


SAP has very strict release criteria and a secure development cycle.  Testing includes, but is not limited to, static code scanning, dynamic analysis tools, manual penetration testing and security architecture reviews.   You can find out more about our security processes here:



The secure approach is to treat the internal network that all your end users access as compromised.   Just think of the latest Sony attack as an example to see the value of encrypting the communication channels.


Additionally, leveraging firewalls to block off parts of the network from would-be attackers is also valuable.  Firewalls and server communication are covered in Part 3.




Feel free to add your comments/questions on other areas; the blog will get updated with any additional bits that may have been missed here.


Back in Q3, we released a new, simpler pricing and licensing model.


In Q4, we just released an FAQ document that publicly describes this simplified model.


This can be found on the BusinessObjects BI Upgrade page. Here's a direct link.


Please record your additional questions in the comments below, and I will incorporate those into v2.


Thanks, Blair


Blair Wheadon

GM of Enterprise BI


In this new blog series, I will outline some of the best practices of securing your BI platform.

We will take the approach of outlining what assets we need to protect, and based on a threat model analysis, outline the steps you can and should take to secure all aspects of the BI deployment.


Are you absolutely secure?

If you answered yes, you either blew up and burned down your entire IT infrastructure, or you are fooling yourself.  Security is all about risk management.   Let us therefore do a flyover around some of the ways to lock things down and manage our risk.


In Part 1, we will look at securing the Identity Provider communication, and review how the data is stored.

The main external identity providers are outlined above.   From a security standpoint, we are concerned about both the data moving across the network, as well as data about users stored in the CMS repository.


Active Directory

When using the BI Active Directory connector, the calls between the CMS and Active Directory are encrypted natively by the Microsoft infrastructure.  The good thing about this is that you do not need to take any additional steps to protect the network communication for this purpose. 


To access Active Directory to query for and map users, the BI system requires AD credentials.

The data at rest, meaning the data stored in the CMS database, is strongly encrypted.  Refer to my articles on data security in BI4 for more information on the specifics: Encryption & Data Security in BI 4.0


The important consideration here, then, is how much access the AD account (v8\bossosvcacct) has in Active Directory.  You should always consider security in depth: what if the account is somehow compromised?  How much damage could it do in your enterprise?

This account only needs to list your Active Directory contents, which is controlled with the "List Contents" right.  While the best practices for locking down Active Directory are a little beyond the scope of this blog, you can, for example, reduce the account's ability to query for additional user properties like email address.  Some examples are contained in this external blog on hiding AD objects.


The account that the SIA runs under should also run with minimum privileges.   Suppose the process gets exploited somehow or the credentials fall into the wrong hands.  You most certainly don't want the account to have the capability to create a user or grant permissions.


In many cases, users will use the same account for querying active directory as they do for running the SIA.


When creating the account in AD, constrained delegation, while sometimes trickier to set up, can allow you to limit the services for which a resulting Kerberos ticket can be used.   While this is not supported for OLAP on MSAAS when working with SSO to the database, it should work for all other use cases and is a way to restrict the usage.


The rights required on the local computer where the SIA is running are as follows:

-Act as Part of the Operating System

-Logon As a Service.

-Read/Write to HKEY_LOCAL_MACHINE\SOFTWARE\SAP BusinessObjects\Suite XI 4.0

-Read/Write to Install directory (specifically Write access to the Log Locations).

The important part here is the account should NOT be an Administrator on the local machine.




LDAP

Unlike Active Directory, the logon via LDAP to the underlying identity provider will be sent in clear text over the wire unless you configure SSL.

You can do this while going through the wizard or directly in your LDAP authentication screen after.

At minimum, you should be using Server Authentication.   This will allow you to ensure that BI only connects to a trusted LDAP source, and will not send the LDAP credentials to an untrusted source, and of course encrypt the actual traffic, as it should be.


Again, the details you store here are stored encrypted in the CMS repository, and the deep details are here:Encryption & Data Security in BI 4.0



SAP Authentication

Your SAP authentication also requires an extra step to be encrypted.  This is done in the SNC configuration.  Notice you can set different levels of encryption here, and this applies not only to the queries sent from the CMS to the SAP system for user authentication, but also to data access as you build out your reports.

But WAIT you say, I'm using OLAP and UNX and I use STS (the security token service).  Isn't SNC for the legacy xir3 content like UNV?

SNC is ALSO a security layer.   The SNC settings for encryption will be used for the STS communication when setting up your SSO to BW.   The summary here is that you should always be configuring SNC, at least for the Authentication level of quality of protection, and avoid sending credentials around unencrypted.

You will notice that BI4.1 now ships with a default SNC library to help with the configuration and potentially save you the extra step of downloading the libraries by using the "Use Default" setting for SNC library settings.


In the next part of the security blog  series, I will look at protecting the web tier. 





Are you an experienced BI System Administrator? Would you like to learn how to prepare for a successful and smooth BI deployment by implementing SAP BusinessObjects BI 4? Did you miss the first openSAP BI course, BI 4.0 Platform Innovation and Implementation? Then you’ll be glad to hear we’re repeating the course starting January 21, 2015!


BI 4 Platform Innovation and Implementation is aimed at experienced BI System Administrators responsible for the implementation, deployment and administration of SAP BusinessObjects BI 4. During this course, participants will have the opportunity to practice with hands-on exercises in their own Amazon cloud-based system to prepare for a successful and smooth BI deployment.


In this course, we’ll cover the following BI topics:


  • Week 1: Introduction, Architecture & Sizing
  • Week 2: Installation, Upgrade & Promotion
  • Week 3: Troubleshooting & Authentication
  • Week 4: Performance Optimization
  • Week 5: Final Exam


Almost 23,000 participants signed up for the first round of this course, which received great feedback. Here’s just a small selection from the I like, I wish forum in the first course. (You must be logged in and enrolled to view the I like, I wish forum.)


“Course contents are magnific. It has surprised me to discover lessons about topics that I have never seen it before in any official SAP BI courses”


“Thanks a lot for the course. It helped me a lot to improve my skill set and answered many questions. I have already performed the Installation and setup for one of our customer. Now I have learnt new tips in this course. I am confident that in future set ups at client location use these tips.”


“As a part-time admin for my company's internal BOBJ installation this is just what I needed to fill in the gaps since I am primarily a BI developer (not an admin). Really excited to have this available and would love to see more - especially on reporting tools. Very well done!”



Sign up for this popular openSAP course, BI 4.0 Platform Innovation and Implementation today!

  Most people have noticed that the Platform Search application works differently in Business Intelligence Platform (BI) 4.x compared to previous releases. The architecture of Platform Search has changed significantly since BI 4.0. It provides a scalable and flexible BusinessObjects content indexing and search infrastructure that supports different proprietary BOE content types. It can be set to real-time indexing, so that the user is not required to restart indexing every time they want the latest indexed content: when documents are published, modified, or deleted in the repository, the application identifies those documents and indexes them. Alternatively, it can be set to schedule-based indexing, which triggers indexing at the scheduled time. Either way, the user can search in BI Launchpad while indexing is happening. Platform Search also supports load balancing and failover for both indexing and searching in a clustered environment.


  The Platform Search service is a service in the Adaptive Processing Server which has the logic to index BOE content and search it. It uses Apache Lucene, a free, open-source information retrieval software library from the Apache Software Foundation. The version of Apache Lucene currently used by BI 4.0 and BI 4.1 is 2.4.1.


  The functionality of the Platform Search service can be divided into Indexing and Searching. Before content becomes searchable, it needs to be indexed. In a large system with a large number of infoobjects, getting all the infoobjects fully indexed the first time can be time consuming, because indexing involves several sequential tasks. I will talk about the indexing process in this blog.



Indexing Process


Indexing is a continuous process that involves the following sequential tasks:


1.     Use the Crawling mechanism to poll the CMS repository and identify objects that are published, modified, or deleted. This can be done in two ways: continuous and scheduled crawling.


2.     Use the Extracting mechanism to call the extractors based on the document type. There is a dedicated extractor for every document type available in the repository. The following extractors exist:


        • Metadata Extractor
        • Crystal Reports Extractor
        • Web Intelligence Extractor
        • Universe Extractor
        • BI Workspace
        • Agnostic Extractor (Microsoft Word/Excel/PPT, Text, RTF, PDF)


3.      Use Indexing mechanism to index all the extracted content through the third-party library, Apache Lucene Engine. The time required for indexing varies, depending on the number of objects in the system, and the size and type of documents. It involves the following steps:

    1. The extracted content is stored in the local file system (<BI 4 Install folder>\Data\PlatformSearchData\workplace\Temporary Surrogate Files) in an XML format, as so-called surrogate files.
    2. These surrogate files are uploaded to the Input File Repository Server (FRS) and removed from the local file system.
    3. The content of the surrogate files is read and indexed by the specific index engine into a temporary location called the Delta Indexing Area (<BI 4 Install folder>\Data\PlatformSearchData\workplace\DeltaIndexes).
    4. The delta index is uploaded to the Input FRS and deleted from the local file system.
    5. The delta index is read and merged into the Master Indexed Area (<BI 4 Install folder>\Data\PlatformSearchData\Lucene Index Engine\index), which is the final indexed area in the local file system.


         For indexing to run successfully, the following servers must be running and enabled:


        • InputFileRepositoryServer
        • OutputFileRepositoryServer
        • CentralManagementServer
        • AdaptiveProcessingServer with Platform search service on
        • AdaptiveJobServer (scheduled crawling)
        • WebIntelligenceProcessingServer (content type is selected as Web Intelligence)
        • CrystalReportApplicationServer (content type is selected as Crystal Reports)



4.      Generating Content Store and Speller/Suggestions

         After completing the Indexing task the following things will be generated:


        • Content Store: The content store contains information such as id, cuid, name, kind, and instance extracted from the master index in a format that can be read easily. This helps to quicken the search.

Each AdaptiveProcessingServer creates its own content store (<BI 4 Install folder>\Data\PlatformSearchData\workplace\<NodeName>.AdaptiveProcessingServer\




        • Speller/Suggestions: The similar words will be created from the master indexed data and will be indexed. The speller folder will be created under “Lucene Index Engine” folder (<BI 4 Install folder>\Data\PlatformSearchData\Lucene Index Engine\speller)






Platform Search Queues


  Internally, the sequential indexing tasks above are handled by Platform Search queues. When indexing is started, an infoobject will eventually go through the following queues in this order:

To Be Extracted > Under Extraction > To Be Indexed > Indexing > Delta Index To Be Merged > Content Store Merge

If multiple Platform Search Services exist, there is only one To Be Extracted, To Be Indexed, Delta Index To Be Merged and Content Store Merge queue for all nodes. But each Platform Search Service has its own Under Extraction Queue and Indexing Queue. Only one Platform Search Service will be designated as the master service to do delta index merge into master index.


  Each Platform Search queue is itself an infoobject, so the status of each queue can be retrieved by running the following query in the Query Builder:
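
The query itself appeared as a screenshot in the original post; a query of roughly this shape, matching on the queue names, should return the same set of objects (the SI_NAME pattern is an illustration, not necessarily the exact original query):

```sql
-- Query Builder: list the Platform Search queue infoobjects
-- (illustrative filter; the original screenshot may use a different one)
SELECT SI_ID, SI_NAME, SI_PLATFORM_SEARCH_OBJECTS
FROM CI_INFOOBJECTS, CI_SYSTEMOBJECTS, CI_APPOBJECTS
WHERE SI_NAME LIKE 'Platform Search%'
```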



It will return the results with the following SI_NAMEs:

  • Platform Search (Delta Index To Be Merged) Queue
  • Platform Search (To Be Indexed) Queue
  • Platform Search (To Be Extracted) Queue
  • Platform Search (Exclude Documents) Queue
  • Platform Search (Include Documents) Queue
  • Platform Search Content Store Merge Queue
  • Platform Search (Under Extraction - Enity - AcpzqPRw1thIk_GYPiEETF8)
  • Platform Search (Indexing - Enity - AcpzqPRw1thIk_GYPiEETF8)


You will find a property called SI_PLATFORM_SEARCH_OBJECTS in each queue. That property displays the number of objects being processed in that queue. If SI_TOTAL of that property displays 0, it means that queue is empty.


  Exclude Documents and Include Documents are two special queues to handle excluded documents. When you add excluded documents in CMC > Applications > Platform Search Application > Properties > Documents Excluded from Indexing, the documents are added to the Platform Search (Exclude Documents) Queue.  When infoobjects are extracted, these will be excluded.

  When you remove excluded documents in CMC > Applications > Platform Search Application > Properties > Documents Excluded from Indexing, the documents are removed from the exclude documents queue and added to the Platform Search (Include Documents) Queue. Crawling only adds documents to the To Be Extracted queue if the infoobject or its content has been modified, or if it is a new infoobject. Infoobjects removed from the excluded documents are neither new nor modified, so they won't be picked up by crawling. They are added to this special queue so that they will be added to the To Be Extracted queue.

  From the Platform Search queues result, you can see that the Under Extraction and Indexing queues are associated with a Platform Search service session SI_CUID, because each Platform Search service has its own Under Extraction and Indexing queue. The information on Platform Search service sessions can be retrieved by running the following query in the Query Builder:



Each Platform Search service should have one session. If the heartbeat (SI_PLATFORM_SEARCH_HEARTBEAT_TIMESTAMP) isn’t updated regularly on one session, another search service will try to return the hung service’s objects to the previous queue and take over the unfinished work.

Here are some other useful queries you can run to get information regarding Platform Search Application.



Retrieving the general information about Platform Search Application
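
Again, the original query was shown as a screenshot; something along these lines, filtering on the application's name, should retrieve the Platform Search Application object and its properties (the filter shown is an illustration):

```sql
-- Query Builder: retrieve the Platform Search Application object
-- (illustrative filter; the original screenshot may differ)
SELECT * FROM CI_APPOBJECTS
WHERE SI_NAME = 'Platform Search Application'
```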



The property SI_PLATFORM_SEARCH_SERVICE_CONTEXT_ACTION shows if the indexing is running. 0 means Indexing is not running, 1 means Indexing is running.



Retrieving the information of Platform Search Application Status



For example, you can check the following properties:



SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID represents the SI_ID of the last infoobject which was added to the To Be Extracted queue. The infoobjects are added to the To Be Extracted queue in the batches. So if we have a batch of 100 infoobjects which are added in the To Be Extracted queue, this field will have the max SI_ID among the SI_IDs of those infoobjects.



SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_ID represents the SI_ID of the last infoobject which was added to the To Be Indexed queue. When indexing starts, this field will have the same value as SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID. But during the indexing if some infoobjects didn't get added to the To Be Indexed queue, then this field is updated with the max SI_ID of the infoobjects which actually got added to the To Be Indexed queue. And SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID field is retained with the original value. For both these fields, the SI_IDs of folders are not included.



  For the definitions of the above Platform Search properties, please use the latest release of the SAP BI Platform Support Tool. A new report option has been added to the BI Platform Support Tool that provides detailed information on Platform Search and how it is performing.







  I hope this blog helps you to understand how Platform Search Indexing works.





Friday, December 5 2014

New Maintenance Dates for SAP BusinessObjects BI4.1

To enable our customer base to adopt SAP BusinessObjects BI4.1, SAP is pleased to announce the extension of maintenance for SAP BusinessObjects BI4.1 with two years of additional support. The End of Maintenance and Priority One dates have been extended as of today!



In this issue:


1. New SAP BusinessObjects BI4.1 End of Maintenance Dates


By SAP support standards, End of Maintenance dates are defined by a 7+2-year support plan for a product line; this resulted in an End of Maintenance for SAP BusinessObjects BI4.1 of December 31, 2016. However, while listening to our customers, SAP learned that the existing End of Maintenance dates were too short to enable full adoption of SAP BusinessObjects BI4.1.

To enable all our customers in the adoption of SAP BusinessObjects BI4.1, SAP has decided to extend the default Mainstream Maintenance by an additional two years.


  • End of Mainstream Maintenance for SAP BusinessObjects BI4.1 is now December 31, 2018
  • End of Priority One Support for SAP BusinessObjects BI4.1 is now December 31, 2020




SAP BusinessObjects BI4.1 Product Availability Matrix ›



SAP Maintenance Strategy Rules



SAP Mainstream Maintenance Rules ›



SAP BusinessObjects Priority-One Rules ›



2. Existing SAP BusinessObjects Products Support Overview



Although the Mainstream Maintenance dates for SAP BusinessObjects BI4.1 have been extended by two years, this is not the case for other SAP BusinessObjects products. To provide a complete overview of SAP BusinessObjects products and their End of Mainstream dates, the list below is provided.

If you are currently not running an SAP BusinessObjects BI4.1 release, please validate your current Mainstream Maintenance dates. If those dates have passed or are passing in the near future, it is strongly recommended to upgrade your existing environment to SAP BusinessObjects BI4.1. For details on the upgrade process, please read:



SAP BusinessObjects Platform BI4.1

End of Mainstream Maintenance -> December 31, 2018

End of Priority One Support -> December 31, 2020



SAP BusinessObjects Platform BI4.0

End of Mainstream Maintenance -> December 31, 2015

End of Priority One Support -> December 31, 2017



SAP BusinessObjects Enterprise XI3.1

End of Mainstream Maintenance -> December 31, 2015

End of Priority One Support -> December 31, 2017



Prior to SAP BusinessObjects Enterprise XI3.1


End of Mainstream Maintenance -> Expired

End of Priority One Support -> Expired



3. Where to find more information about (upgrading to) SAP BusinessObjects BI4.1


SAP has built up a dedicated webpage with a collection of the most valuable information regarding the SAP BusinessObjects BI4.1 suite. Wherever you are in your Business Intelligence journey, you'll find resources there that will help your organization be successful with SAP BusinessObjects BI solutions.



The SAPBUSINESSOBJECTSBI.COM website will enable you to:


  1. Be Your Organization's BI Champion
    Leverage the resources below to show what's possible with SAP BusinessObjects BI solutions and increase BI excitement within your organization.

  2. Empower Employees with a BI Strategy
    A solid BI strategy helps you get the most from your data assets, technology investments and BI initiatives.

  3. Design and Manage a Successful Implementation
    Address organizational and governance needs as well as the technology portion of your implementation.

  4. Get Personalized Upgrade Advice
    Start planning your upgrade with a personalized guide from an upgrade expert. The report will help you get the most out of your BI deployment.

  5. Advance your skills with BI Academy
    Learn more about BI and SAP BusinessObjects BI solutions.

  6. Events & Webinars
    Find new perspectives, alternative approaches, and spark your BI creativity.






On a recent customer project, my teammate and I faced an issue when importing Webi documents and universes into Translation Manager (all on BO4.1 SP4), getting the following error message:


org.apache.axis2.AxisFault: Unable to find servers in CMS, servers could be down or disabled by the administrator (FWN 01014)


There is a KB article describing this error and a solution: http://service.sap.com/sap/support/notes/1879675


First, we followed the instructions in the KB article and specified the hostname for each server. Our server is multihomed (that is, it has more than one network interface), but we had not thought about this beforehand, because the second network interface connects to a backup LAN and is never used for communication with, for example, the client tools. Besides, everything had worked fine so far, until we hit this issue with Translation Manager.

Still, just setting the hostname was not enough. The Windows firewall is enabled on the server side, so we had to assign static request ports to several services. Before the Translation Manager issue we had already done this for the CMS, the Input and Output FRS, and all Webi Processing Servers. We used TCPView to analyze on which ports Translation Manager was opening connections. As long as there were requests on ports we had not specified in the CMC (some random port number, e.g. 56487), we kept narrowing down the services to which Translation Manager establishes a connection. In the end, we had to specify a port for all of the following servers:

  • The Adaptive Processing Server (APS) hosting the DSL-Bridge Service
  • The APS hosting the Client Auditing Proxy Service
  • The APS hosting the Search Index Service
  • The APS hosting the Translation Service
  • The WACS (Web Application Container Server)


Besides the issue with Translation Manager, we had another problem when creating new universe connections in the Universe Design Tool. When creating a new connection, we had to wait up to 30 minutes ( ! ) to reach the wizard page where you select from the list of available database drivers. Still, after those 30 minutes everything worked fine and we could create the connection successfully. Based on our experience with Translation Manager, we ran TCPView again and found that we needed to assign a port number to both Connection Servers (32-bit and 64-bit) in the CMC. Having done this, creating a new connection now works without any waiting time.


In summary: if you have firewalls between your BO server and the client tools, just assign a port to every available server and open that port on the firewall. (The only exception might be the Event Server, as I am really not aware of any communication between this server and a component outside the BO server.)
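Once static request ports are assigned in the CMC, it helps to verify from a client machine that each one is actually reachable through the firewall. The sketch below does a plain TCP connect test; the host name and all port numbers are placeholders that you must replace with the values configured in your own CMC:

```python
import socket

# Placeholder host and ports -- substitute the static request ports
# you assigned in the CMC for each server.
BO_HOST = "boserver.example.com"
STATIC_PORTS = {
    "CMS": 6400,
    "Input FRS": 6415,
    "Output FRS": 6416,
    "APS (DSL-Bridge)": 6420,
    "WACS": 6405,
}

def check_port(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan(host, ports):
    """Print an open/blocked report for each named port."""
    for name, port in ports.items():
        state = "open" if check_port(host, port) else "blocked/closed"
        print("%-20s %5d  %s" % (name, port, state))

# Run from a client machine, e.g.: scan(BO_HOST, STATIC_PORTS)
```

This only confirms TCP reachability, not that the right BO service is listening, but it quickly separates firewall problems from server-side ones.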

The Crystal Reports for Enterprise 4.1 SP05 release supports manual entry in prompts for BW variables. Below I describe which variables are supported and which are not, along with the steps to get a multi-value field for a selection option variable.

Manual entry in prompts is supported for the following variables:

  1. Single value variable
  2. Multi-value variable
  3. Interval (Range) variable
  4. Selection option variable
  5. Formula variable
  6. Single Keydate variable

The manual entry feature is not supported for Hierarchy variables and Hierarchy node variables.


  1. The feature is available by default in CRE and the BOE Viewer, both for old reports (4.1 SP04 or earlier) and newly created ones.

When you open any report or create a new one, you should see a text box for manual entry with an add symbol, as shown below. When you refresh a report with prompts, the prompt dialog displays the manual entry text box.

Multiple values can be entered, separated by semicolons.

You can either enter the values manually or select them from the list.

[Screenshot: manual entry text field]
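Just to illustrate the input format (this is not CRE's actual parsing code, only a sketch with a function name of my own), semicolon-separated manual entries break down into individual prompt values like this:

```python
def parse_manual_entry(raw):
    """Split a semicolon-separated manual entry into individual values,
    trimming whitespace and dropping empty segments (e.g. a trailing ';')."""
    return [v.strip() for v in raw.split(";") if v.strip()]

print(parse_manual_entry("1000; 2000;3000;"))  # ['1000', '2000', '3000']
```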



  2. To get a multi-value selection field for a Selection Option variable, you have to make an entry in a configuration file in the Crystal Reports for Enterprise installation folder.

Make the following entry in the configuration file (C:\Program Files (x86)\SAP BusinessObjects\Crystal Reports for Enterprise XI 4.0\configuration\config.ini):

Entry to be made: sap.sl.bics.variableComplexSelectionMapping=multivalue
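If you need to roll this setting out to several workstations, the edit can be scripted. The sketch below (the path is taken from above; the function name is my own) appends the entry only if it is not already present, so running it twice is harmless:

```python
from pathlib import Path

# Default CRE install path from this post -- adjust for your installation.
CONFIG = Path(r"C:\Program Files (x86)\SAP BusinessObjects"
              r"\Crystal Reports for Enterprise XI 4.0\configuration\config.ini")
ENTRY = "sap.sl.bics.variableComplexSelectionMapping=multivalue"

def ensure_entry(path, entry):
    """Append the entry to the config file unless it is already there."""
    text = path.read_text() if path.exists() else ""
    if entry in text:
        return False  # already configured, nothing to do
    with path.open("a") as f:
        if text and not text.endswith("\n"):
            f.write("\n")
        f.write(entry + "\n")
    return True

# Usage: ensure_entry(CONFIG, ENTRY)
```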





  3. In order to get the multi-value selection field in the Viewer, we have to make an entry in the CMC as well.

Entry location: log in to the CMC -> Servers -> Crystal Reports Services -> CrystalReportsProcessingServer -> Java Child VM arguments

Entry to be made: -Dsap.sl.bics.variableComplexSelectionMapping=multivalue



If the entry is not made in CRE or the Viewer, the field appears as an Interval (Range) field.



  Hope it helps…

