
Have you ever seen a log file in your Application Server directory called TraceLog_<pid>_<date_timestamp>.glf?  Ever wondered what that was?

 

This log file is generated by a number of the BI Platform web applications and, by default, will contain only Error-level messages.  Here is an example of one I found on my test machine:

 

Found in Directory: C:\Program Files (x86)\SAP BusinessObjects\tomcat\

Filename:  TraceLog_1140_2014_11_20_05_23_52_898_trace.glf

Contents:

 

|64DF6F8D078E466397CBCD8D875B98240|2014 11 20 05:23:52.904|-0800|Error| |==|E| |TraceLog| 1140|  18|Start Level Event Dispatcher| ||||||||||||||||||||com.bo.aa.layout.DashboardManager||underlying implementation doesn't recognize the attribute

java.lang.IllegalArgumentException: http://javax.xml.XMLConstants/feature/secure-processing

  at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.setAttribute(Unknown Source)

  at com.bo.aa.layout.DashboardManager.setDocBuilderFeaturesForXXE(DashboardManager.java:134)

  at com.bo.aa.layout.DashboardManager.<clinit>(DashboardManager.java:161)

  at com.bo.aa.impl.DBServerImpl.<clinit>(DBServerImpl.java:397)

  at com.bo.aa.servlet.AFBootServlet.InitServers(AFBootServlet.java:80)

  at com.bo.aa.servlet.AFBootServlet.init(AFBootServlet.java:47)

  at com.businessobjects.http.servlet.internal.ServletRegistration.init(ServletRegistration.java:81)

  at com.businessobjects.http.servlet.internal.digester.WebXmlRegistrationManager.loadServlets(WebXmlRegistrationManager.java:127)

  at com.businessobjects.http.servlet.internal.digester.WebXmlRegistrationManager.registerRest(WebXmlRegistrationManager.java:209)

  at com.businessobjects.http.servlet.internal.ProxyServlet.readXml(ProxyServlet.java:368)

  at com.businessobjects.http.servlet.internal.ProxyServlet.registerInternal(ProxyServlet.java:395)

  at com.businessobjects.http.servlet.internal.ProxyServlet.register(ProxyServlet.java:317)

  at com.businessobjects.http.servlet.config.WebXmlConfigurator.register(WebXmlConfigurator.java:60)

  at com.businessobjects.bip.core.web.bundle.CoreWebXmlActivator.start(CoreWebXmlActivator.java:66)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:782)

  at java.security.AccessController.doPrivileged(Native Method)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:773)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:754)

  at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:352)

  at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:280)

  at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:272)

  at com.businessobjects.http.servlet.Activator.startBundle(Activator.java:129)

  at com.businessobjects.http.servlet.Activator.start(Activator.java:116)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:782)

  at java.security.AccessController.doPrivileged(Native Method)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:773)

  at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:754)

  at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:352)

  at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:370)

  at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1068)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:557)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:464)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:248)

  at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:445)

  at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:220)

  at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:330)

 

For those of you who have spent time looking at these types of messages, you will likely recognize a few things.  The first is that the bulk of this error message is a Java backtrace. Backtraces are often read from the bottom up, which gives you an idea of the sequence of calls that occurred leading up to the error.  In this case, we can see the error:

 

com.bo.aa.layout.DashboardManager||underlying implementation doesn't recognize the attribute

java.lang.IllegalArgumentException: http://javax.xml.XMLConstants/feature/secure-processing


This tells us what caused the trace log entry, but we might be more interested in what happened leading up to it. For that, we can traverse the backtrace to get an idea of what was going on beforehand.


In this case, I have no idea what actually caused this error.  I just found it on my test machine from around three weeks ago.  But from the backtrace, I can make an educated guess that the cause was related to a Dashboard layout of some sort.  Regardless, this is not the purpose of this blog, so I will move on.


The error messages found in these TraceLog*.glf files are not usually enough to properly troubleshoot an issue.  To get proper details around what causes an issue, we need more verbose logging.

 

One way we can enable verbose logging for the BI Platform Web Apps is by enabling it in the CMC.  Section 25.4.1 in the BIP Administrator's guide covers how to do this.  In the CMC, you can enable traces for the BI Launchpad, CMC, Open Document, Promotion Management, Version Management, Visual Difference and Web Services applications.

 

Another way to enable tracing for the BI Platform Web Apps is to follow the steps below.  I have found additional details in these log files that were not available through the CMC-enabled logs:

 

Steps to setup Verbose logging for the TraceLog Application server traces (example for Tomcat)


  1. Go to this folder and copy the BO_Trace.ini:  C:\Program Files (x86)\SAP BusinessObjects\tomcat\webapps\BOE\WEB-INF\TraceLog
  2. Paste this file in the C:\Program Files (x86)\SAP BusinessObjects\tomcat directory and rename it to TraceLog_trace.ini
  3. Edit this file and change the line:
    sap_trace_level = trace_error;

        to

    sap_trace_level = trace_debug;
  4. Find the line below it and change it as well:
    sap_log_level = log_error;

        to

    sap_log_level = log_info;
  5. I also like to change append = true; to append = false;, which will use the Process ID and date/time stamp in the naming convention of the log files.
  6. Save the TraceLog_trace.ini file and within a minute, you should start seeing some log files growing in the Tomcat directory.
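Putting steps 3 to 5 together, the edited TraceLog_trace.ini should end up containing lines like these (a sketch of just the changed settings; leave the rest of the copied file as-is):

```ini
sap_trace_level = trace_debug;
sap_log_level = log_info;
append = false;
```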

 

Here is an example of what my log files contain after enabling the above log levels:

 

|039A2887DCF24130ADA77A3BA3DBF3A6155|2014 12 17 14:57:25.015|-0800|Information| |==| | |TraceLog| 1144|  47|http-bio-8080-exec-7| ||||||||||||||||||||com.businessobjects.bip.core.web.bridge.ServletBridgeLoggingDelegate||servletbridge.relative.url.prefix.out.of.bundle: ../..

 

|039A2887DCF24130ADA77A3BA3DBF3A6156|2014 12 17 14:57:25.015|-0800|Information| |==| | |TraceLog| 1144|  47|http-bio-8080-exec-7| ||||||||||||||||||||com.businessobjects.bip.core.web.bridge.ServletBridgeLoggingDelegate||servletbridge.relative.url.prefix.to.root.of.bundle: ../../IVExplorer

We can see that the type of entry is "Information" now which tells us our settings are being used.

 

Now, this trace is quite verbose, so really the only time I would recommend using it is when you can reproduce an issue in a short period of time.  To deactivate the trace, just edit the TraceLog_trace.ini file and set the trace/log levels back to *_error.

 

Do not delete the file as this will not deactivate the current trace levels.  Just edit and save the file to deactivate it.  If you do delete the file, you will need to restart Tomcat to disable the traces again.

 

Anyway, this trace can sometimes give you additional details that are not available through other tracing methods.  Be sure to deactivate it as soon as you are done using it though, as it does have a slight impact on performance.

 

Thanks,

Jb

In my previous blog, I covered securing of the communication of your authentication providers.

In this posting, we will cover the configuration of the web tier.   This is your WAR file deployment, and probably the most exposed part of your deployment, especially if you're facing the public web.

 

Reduce the attack surface.

The less you have deployed, the less that can be attacked.   Although the default BI install will deploy a number of components, you likely don't need them all.

You may see a list like this of war files deployed:

AdminTools - designed for running advanced direct queries against the BI repository.   If you don't use this, remove it.   You could also consider running it on a separate, local-access-only deployment.

 

BOE - This is the core of the BI deployment; it includes the CMC, BI Launchpad and OpenDocument functionality.  Note that using wdeploy, you can split the CMC and BI Launchpad deployment, and put the CMC functionality on another, more locked-down application server.

 

dswsbobje - web service used by Crystal Reports for Enterprise, Dashboard designer, and your custom applications.  Again something you can remove if none of the above apply to you.

 

BusinessProcessBI - this is an SDK which is not needed for core functionality.  If you're not deploying custom applications that make use of this, this is something you can remove from your deployment.

 

clientAPI - contains Crystal Reports ActiveX controls for custom application deployment.  You can almost certainly remove this.

 

MobiServer & MobileBIService - if you are not deploying mobile, you should have no need for these.

 

docs - This is the default Tomcat documentation.  It is also available online, so there should be no need for it to be deployed; it also reveals the version of Tomcat, which is information you should not expose.

 

Tomcat Security

Refer to your tomcat guide.  The following is an excerpt from the tomcat guide on default web applications:

Tomcat ships with a number of web applications that are enabled by default. Vulnerabilities have been discovered in these applications in the past. Applications that are not required should be removed so the system will not be at risk if another vulnerability is discovered.

http://tomcat.apache.org/tomcat-7.0-doc/security-howto.html#Default_web_applications

 

Apache regularly publishes its list of fixed vulnerabilities here:

http://tomcat.apache.org/security-7.html

BI SPs regularly bundle updates of Tomcat.  SAP continually monitors the security listings for the bundled applications and works to deliver any updates as part of the regular maintenance cycle.

If you are unable to stay on the latest support packages, you may want to consider reviewing the list of vulnerabilities and using your own update of Tomcat at least until such time when you can deploy the latest BI4.x support pack.

 

Tomcat User Account

The user account only needs to read files under the Tomcat directory.  Create a dedicated user for the Tomcat service account, give it the "Log on as a service" right, and grant it read-only rights on the Tomcat folder.

 

Secure the communication channel - Use TLS

This should be a fairly well accepted policy already.

While terms like HTTPS and SSL are thrown around, this should really mean "TLS" behind the scenes.  TLS is the newer protocol for secure communication.  SSLv3 has now been rendered insecure, and you should be configuring your application servers to use TLSv1 or higher.

If you are not using SSO exclusively to log on to the BI web apps (likely to be the case with the CMC, which does not support SSO), you should be encrypting the traffic and logging on over HTTPS.   Otherwise, the logon credentials will be passed from the browser to Tomcat, or the application server of your choice, in clear text over the wire.

 

You've heard of POODLE?  Disable SSLv3 in Tomcat while you're at it.
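As an example, an HTTPS connector on Tomcat 7 that allows only TLS protocols might look like the following sketch; the keystore path and password here are placeholders, so adjust them to your environment:

```xml
<!-- Sketch: HTTPS connector restricted to TLS; keystore details are placeholders -->
<Connector port="8443" protocol="org.apache.coyote.http11.Http11Protocol"
           SSLEnabled="true" scheme="https" secure="true"
           sslEnabledProtocols="TLSv1,TLSv1.1,TLSv1.2"
           keystoreFile="conf/keystore.jks" keystorePassword="changeit" />
```

The sslEnabledProtocols attribute is what keeps SSLv3 out of the handshake.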

 

 

Do you use Flash?  Dashboarding, aka Xcelsius

The BI install installs a file called crossdomain.xml.  It's an XML document that grants a web client—such as Adobe Flash Player, Adobe Reader, etc.—permission to handle data across multiple domains.

The default is very inclusive,

<cross-domain-policy>

    <site-control permitted-cross-domain-policies="all"/>

    <allow-http-request-headers-from domain="*" headers="*" secure="false" />

    <allow-access-from domain="*" secure="false" />

</cross-domain-policy>

and you should take steps to lock it down if you will allow hosting of flash based content.
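As a sketch of a locked-down policy, you could restrict it to your own domains over secure channels; the mydomain.com value below is hypothetical, so substitute the domains you actually serve:

```xml
<cross-domain-policy>
    <site-control permitted-cross-domain-policies="master-only"/>
    <allow-access-from domain="*.mydomain.com" secure="true"/>
    <allow-http-request-headers-from domain="*.mydomain.com" headers="*" secure="true"/>
</cross-domain-policy>
```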

As this configuration file is completely outside of the SAP BI control, please refer to Adobe's documentation for crossdomain.xml

 

 

Protect Credentials

If you're setting up Active Directory SSO, make sure you're not storing the credentials as a java option, but protect the password with a keytab instead.

Don't do this (notice the wedgetail.idm.sso password in clear text):

 

Do this instead:

 

1. Create a keytab with the ktpass command

The details for this are contained in the whitepaper attached to sap note http://service.sap.com/sap/support/notes/1631734

The whitepaper is a must for anyone setting up AD for the first time.

 

2. Copy the .keytab file to the c:\windows\ directory of the application server

3. Add the following line to C:\Program Files (x86)\SAP BusinessObjects\Tomcat\webapps\BOE\WEB-INF\config\custom\global.properties:

idm.keytab=C:/WINDOWS/<your keytab file name>
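For illustration only, a ktpass invocation generally takes this shape; the service principal, account name, and output file below are hypothetical, so follow the whitepaper in the SAP note for the exact syntax for your environment:

```bat
ktpass -out bosso.keytab -princ BICMS/bossosvcacct.domain.com@DOMAIN.COM ^
       -mapuser bossosvcacct -pass <password> ^
       -ptype KRB5_NT_PRINCIPAL -crypto RC4-HMAC-NT
```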


If you're using Trusted Authentication, make sure you secure the shared secret file, so that only the process that your web application server is running as can access it.   Consider using OS file level encryption to further lock this file down.



Web Application Container Server

If you are using the WACS to host your RESTful web services, or possibly the CMC, the configuration for secure communication is done through server properties in the CMC.

 

What about Cross Site Scripting, SQL Injection, OWASP TOP 10?!   IS IT SAFE!!??

 

SAP has very strict release criteria and a secure development lifecycle.  Testing includes, but is not limited to, static code scanning, dynamic analysis tools, manual penetration testing and security architecture reviews.   You can find out more about our security processes here:

 

Conclusion

The secure approach is to treat the internal network that all your end users access as compromised.   Just think of the latest Sony attack as an example to see the value of encrypting the communication channels.

 

Additionally, leveraging firewalls to block off parts of the network to would-be attackers is also valuable.  Firewalls and server communication will be covered in a later blog post.

 

 

Feel free to add your comments/questions on other areas; the blog will get updated with any additional bits that may have been missed here.


 

Back in Q3, we released a new, simpler pricing and licensing model.

 

In Q4, we just released an FAQ document that publicly describes this simplified model.

 

This can be found on the BusinessObjects BI Upgrade page. Here's a direct link.

 

Please record your additional questions in the comments below, and I will incorporate those into v2.

 

Thanks, Blair

 

Blair Wheadon

GM of Enterprise BI

@blairtwheadon

In this new blog series, I will outline some of the best practices of securing your BI platform.

We will take the approach of outlining what assets we need to protect, and based on a threat model analysis, outline the steps you can and should take to secure all aspects of the BI deployment.

 

Are you absolutely secure?

If you answered yes, you either blew up and burned down your entire IT infrastructure, or you are fooling yourself.  Security is all about risk management.   Let us therefore do a flyover of some of the ways to lock things down and manage our risk.

 

In Part 1, we will look at securing the Identity Provider communication, and review how the data is stored.

The main external identity providers are outlined above.   From a security standpoint, we are concerned about both the data moving across the network, as well as data about users stored in the CMS repository.

 

Active Directory

When using the BI Active Directory connector, the calls between the CMS and Active Directory are encrypted natively by the Microsoft infrastructure.  The good thing about this is that you do not need to take any additional steps to protect the network communication for this purpose.

 

To access Active Directory to query for and map users, the BI system requires AD credentials.

The data at rest, meaning the data stored in the CMS database, is strongly encrypted.  Refer to my articles on data security in BI4 for more information on the specifics: Encryption & Data Security in BI 4.0

 

The important consideration here, then, is how much access the AD account (v8\bossosvcacct) above has in Active Directory.  You should always consider security in depth: what if the account is somehow compromised?  How much damage could it do in your enterprise?

This account only needs to list your Active Directory contents, which is controlled with the "List Contents" right.  While best practices for locking down Active Directory are a little beyond our scope, you can, for example, reduce the account's ability to query for additional user properties like email address.  Some examples are contained in this external blog on hiding AD objects.

 

The account that the SIA runs under should also run with minimum privileges.   Suppose the process gets exploited somehow or the credentials fall into the wrong hands.  You most certainly don't want the account to have the capability to create a user or grant permissions.

 

In many cases, users will use the same account for querying active directory as they do for running the SIA.

 

When creating the account in AD, constrained delegation, while sometimes trickier to set up, can allow you to limit the services for which a resulting Kerberos ticket can be used.   While this is not supported for OLAP on MSAAS when working with SSO to the database, it should work for all other use cases and is a way to restrict the usage.

 

The rights required on the local computer where the SIA is running are as follows:

-Act as Part of the Operating System

-Logon As a Service.

-Read/Write to HKEY_LOCAL_MACHINE\SOFTWARE\SAP BusinessObjects\Suite XI 4.0

-Read/Write to Install directory (specifically Write access to the Log Locations).

The important part here is the account should NOT be an Administrator on the local machine.

 

LDAP

 

Unlike Active Directory, the logon via LDAP to the underlying identity provider will be sent in clear text over the wire unless you configure SSL.

You can do this while going through the wizard, or directly in your LDAP authentication screen afterward.

At minimum, you should be using Server Authentication.   This allows you to ensure that BI only connects to a trusted LDAP source and will not send the LDAP credentials to an untrusted source, and of course it encrypts the actual traffic, as it should be.

 

Again, the details you store here are stored encrypted in the CMS repository, and the deep details are here: Encryption & Data Security in BI 4.0

 

SAP

Your SAP authentication also requires an extra step to be encrypted.  This is done in the SNC configuration.  Notice that you can set different levels of encryption here, and this applies not only to the queries sent from the CMS to the SAP system for user authentication, but also to data access as you build out your reports.

But WAIT, you say, I'm using OLAP and UNX and I use the STS (security token service).  Isn't SNC for legacy XI R3 content like UNV?

SNC is ALSO a security layer.   The SNC settings for encryption will be used for the STS communication when setting up your SSO to BW.   The summary here is that you should always be configuring SNC, at least for the Authentication level of quality of protection, and avoid sending credentials around unencrypted.

You will notice that BI4.1 now ships with a default SNC library to help with the configuration and potentially save you the extra step of downloading the libraries by using the "Use Default" setting for SNC library settings.

 

In the next part of the security blog  series, I will look at protecting the web tier. 

 

 

 

 

Are you an experienced BI System Administrator? Would you like to learn how to prepare for a successful and smooth BI deployment by implementing SAP BusinessObjects BI 4? Did you miss the first openSAP BI course, BI 4.0 Platform Innovation and Implementation? Then you’ll be glad to hear we’re repeating the course starting January 21, 2015!

 

BI 4 Platform Innovation and Implementation is aimed at experienced BI System Administrators responsible for the implementation, deployment and administration of SAP BusinessObjects BI 4. During this course, participants will have the opportunity to practice with hands-on exercises in their own Amazon cloud-based system to prepare for a successful and smooth BI deployment.

 

In this course, we’ll cover the following BI topics:

 

  • Week 1: Introduction, Architecture & Sizing
  • Week 2: Installation, Upgrade & Promotion
  • Week 3: Troubleshooting & Authentication
  • Week 4: Performance Optimization
  • Week 5: Final Exam

 

There are almost 23,000 participants signed up for the first round of this course, which received great feedback. Here’s just a small selection from the I like, I wish forum in the first course. (You must be logged in and enrolled to view the I like, I wish forum.)

 

“Course contents are magnific. It has surprised me to discover lessons about topics that I have never seen it before in any official SAP BI courses”

 

“Thanks a lot for the course. It helped me a lot to improve my skill set and answered many questions. I have already performed the Installation and setup for one of our customer. Now I have learnt new tips in this course. I am confident that in future set ups at client location use these tips.”

 

“As a part-time admin for my company's internal BOBJ installation this is just what I needed to fill in the gaps since I am primarily a BI developer (not an admin). Really excited to have this available and would love to see more - especially on reporting tools. Very well done!”

 

 

Sign up for this popular openSAP course, BI 4.0 Platform Innovation and Implementation today!

  Most people have noticed that the Platform Search application works differently in Business Intelligence Platform (BI) 4.x compared to previous releases. The architecture of Platform Search has changed significantly since BI 4.0. It provides scalable and flexible indexing of Business Objects content and search infrastructure support for the various proprietary BOE content types. It can be set to real-time indexing, so that users are not required to restart indexing every time they want the latest indexed content. When documents are published, modified, or deleted in the repository, the application identifies those documents and indexes them. Alternatively, it can be set to schedule-based indexing, which triggers indexing at the scheduled time. Either way, users can search in BI Launchpad while indexing is happening. Platform Search also supports load balancing and failover for both indexing and searching in a clustered environment.

 

  The Platform Search service is a service in the Adaptive Processing Server that contains the logic to index and search BOE content. It uses Apache Lucene, a free, open-source information retrieval software library from the Apache Software Foundation. The version of Apache Lucene currently used by BI 4.0 and BI 4.1 is 2.4.1.

 

  The functionality of the Platform Search service can be divided into indexing and searching. Before content becomes searchable, it needs to be indexed. In a large system with a large number of infoobjects, getting all the infoobjects fully indexed the first time can be time-consuming, because indexing involves several sequential tasks. I will talk about the indexing process in this blog.

 

 

Indexing Process

 

Indexing is a continuous process that involves the following sequential tasks:

 

1.     Use the crawling mechanism to poll the CMS repository and identify objects that are published, modified, or deleted. This can be done in two ways: continuous and scheduled crawling.

 

2.     Use the extracting mechanism to call the extractors based on the document type. There is a dedicated extractor for every document type available in the repository. The extractors are:

 

        • Metadata Extractor
        • Crystal Reports Extractor
        • Web Intelligence Extractor
        • Universe Extractor
        • BI Workspace
        • Agnostic Extractor (Microsoft Word/Excel/PPT, Text, RTF, PDF)

 

3.      Use the indexing mechanism to index all the extracted content through the third-party library, the Apache Lucene engine. The time required for indexing varies, depending on the number of objects in the system, and the size and type of documents. It involves the following steps:

    1. The extracted content is stored in the local file system (<BI 4 Install folder>\Data\PlatformSearchData\workplace\Temporary Surrogate Files) in an XML format, as so-called surrogate files.
    2. These surrogate files are uploaded to the Input File Repository Server (FRS) and removed from the local file system.
    3. The content of the surrogate files is read and indexed by the specific index engine into a temporary location called the Delta Indexing Area (<BI 4 Install folder>\Data\PlatformSearchData\workplace\DeltaIndexes).
    4. The delta index is uploaded to the Input FRS and deleted from the local file system.
    5. The delta index is read and merged into the Master Indexed Area (<BI 4 Install folder>\Data\PlatformSearchData\Lucene Index Engine\index), which is the final indexed area in the local file system.

 

         For indexing to run successfully, the following servers must be running and enabled:

 

        • InputFileRepositoryServer
        • OutputFileRepositoryServer
        • CentralManagementServer
        • AdaptiveProcessingServer with Platform search service on
        • AdaptiveJobServer (scheduled crawling)
        • WebIntelligenceProcessingServer (content type is selected as Web Intelligence)
        • CrystalReportApplicationServer (content type is selected as Crystal Reports)

 

 

4.      Generating the content store and speller/suggestions

         After the indexing task completes, the following are generated:

 

        • Content Store: The content store contains information such as id, cuid, name, kind, and instance, extracted from the master index in a format that can be read easily. This helps speed up searching.

Each AdaptiveProcessingServer creates its own content store (<BI 4 Install folder>\Data\PlatformSearchData\workplace\<NodeName>.AdaptiveProcessingServer\ContentStores)

 

 

        • Speller/Suggestions: Similar words are derived from the master indexed data and indexed. The speller folder is created under the “Lucene Index Engine” folder (<BI 4 Install folder>\Data\PlatformSearchData\Lucene Index Engine\speller)

 

 

 

 

 

Platform Search Queues

 

  Internally, the sequential indexing tasks above are handled by Platform Search queues. When indexing is started, an infoobject will eventually go through the following queues in this order:

To Be Extracted > Under Extraction > To Be Indexed > Indexing > Delta Index To Be Merged > Content Store Merge


If multiple Platform Search services exist, there is only one To Be Extracted, To Be Indexed, Delta Index To Be Merged and Content Store Merge queue for all nodes, but each Platform Search service has its own Under Extraction queue and Indexing queue. Only one Platform Search service is designated as the master service to merge the delta index into the master index.
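As a simple mental model, the queue progression above can be sketched like this (an illustration only, not product code; the queue names are taken from the order described above):

```python
# Order of queues an infoobject moves through during indexing.
QUEUES = [
    "To Be Extracted",
    "Under Extraction",
    "To Be Indexed",
    "Indexing",
    "Delta Index To Be Merged",
    "Content Store Merge",
]

def next_queue(current):
    """Return the queue that follows `current`, or None after the last one."""
    i = QUEUES.index(current)
    return QUEUES[i + 1] if i + 1 < len(QUEUES) else None
```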

 

  Each Platform Search queue is itself an infoobject; the status of each queue can be retrieved by running the following query in the Query Builder:

SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchQueue'

 

It will return the results with the following SI_NAMEs:

  • Platform Search (Delta Index To Be Merged) Queue
  • Platform Search (To Be Indexed) Queue
  • Platform Search (To Be Extracted) Queue
  • Platform Search (Exclude Documents) Queue
  • Platform Search (Include Documents) Queue
  • Platform Search Content Store Merge Queue
  • Platform Search (Under Extraction - Enity - AcpzqPRw1thIk_GYPiEETF8)
  • Platform Search (Indexing - Enity - AcpzqPRw1thIk_GYPiEETF8)

 

You will find a property called SI_PLATFORM_SEARCH_OBJECTS in each queue. That property displays the number of objects being processed in that queue. If SI_TOTAL of that property displays 0, the queue is empty.

 

  Exclude Documents and Include Documents are two special queues that handle excluded documents. When you add excluded documents in CMC > Applications > Platform Search Application > Properties > Documents Excluded from Indexing, the documents are added to the Platform Search (Exclude Documents) Queue. When infoobjects are extracted, these documents are skipped.


  When you remove excluded documents in CMC > Applications > Platform Search Application > Properties > Documents Excluded from Indexing, the documents are removed from the exclude documents queue and added to the Platform Search (Include Documents) Queue. Crawling adds documents to the To Be Extracted queue only if an infoobject or its content has been modified, or if it is a new infoobject. Infoobjects removed from the excluded documents list are neither new nor modified, so they won't be picked up by crawling; they are added to this special queue so that they will be added to the To Be Extracted queue.


  From the Platform Search queues result, you can see that the Under Extraction and Indexing queues are associated with a Platform Search service session SI_CUID, because each Platform Search service has its own Under Extraction and Indexing queue. Information about Platform Search service sessions can be retrieved by running the following query in the Query Builder:

SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchServiceSession'

 

Each Platform Search service should have one session. If the heartbeat (SI_PLATFORM_SEARCH_HEARTBEAT_TIMESTAMP) isn’t updated regularly on a session, another search service will try to return the hung service’s objects to the previous queue and take over the unfinished work.
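The failover check described above can be sketched as follows; the 300-second threshold is an assumption chosen for illustration, not a documented product value:

```python
import time

HEARTBEAT_TIMEOUT = 300  # seconds; assumed threshold for illustration only

def is_hung(last_heartbeat_ts, now=None):
    """True if a service session's heartbeat is older than the timeout."""
    if now is None:
        now = time.time()
    return (now - last_heartbeat_ts) > HEARTBEAT_TIMEOUT
```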


Here are some other useful queries you can run to get information regarding Platform Search Application.

 

 

Retrieving the general information about Platform Search Application

SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchApplication'

 

The property SI_PLATFORM_SEARCH_SERVICE_CONTEXT_ACTION shows if the indexing is running. 0 means Indexing is not running, 1 means Indexing is running.

 

 


Retrieving the information of Platform Search Application Status

SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchApplicationStatus'

 

For example, you can check the following properties:

  • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_DAILY_MAX_OBJECT_ID
  • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_ID
  • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID
  • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_FOLDER_ID
  • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_UNIVERSE_ID
  • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_TIMESTAMP

 

SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID represents the SI_ID of the last infoobject that was added to the To Be Extracted queue. Infoobjects are added to the To Be Extracted queue in batches, so if a batch of 100 infoobjects is added to the To Be Extracted queue, this field holds the maximum SI_ID among those 100 infoobjects.

 

 

SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_ID represents the SI_ID of the last infoobject that was added to the To Be Indexed queue. When indexing starts, this field has the same value as SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID. But if some infoobjects don't get added to the To Be Indexed queue during indexing, this field is updated with the maximum SI_ID of the infoobjects that actually were added to the To Be Indexed queue, while SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID retains its original value. Neither field includes the SI_IDs of folders.

 

 

  For the definitions of the above Platform Search properties, please use the latest release of the SAP BI Platform Support Tool. A new report option has been added to the BI Platform Support Tool that provides detailed information on Platform Search and how it is performing.

 

 

BISupportTool.png

 

 

 

  I hope this blog helps you to understand how Platform Search Indexing works.


Friday, December 5 2014

New Maintenance Dates for SAP BusinessObjects BI4.1


To help our customer base adopt SAP BusinessObjects BI4.1, SAP is pleased to announce a two-year extension of maintenance for SAP BusinessObjects BI4.1. The End of Maintenance and Priority One dates have been extended as of today!

 

 

In this issue:

 

1. New SAP BusinessObjects BI4.1 End of Maintenance Dates

 

By SAP support standards, the End of Maintenance dates are defined by a 7+2 year support plan for a product line, which resulted in an End of Maintenance for SAP BusinessObjects BI4.1 of December 31, 2016. However, while listening to our customers, SAP learned that the existing End of Maintenance dates were too short to enable a full adoption of SAP BusinessObjects BI4.1.

To enable all our customers to adopt SAP BusinessObjects BI4.1, SAP has decided to extend the default Mainstream Maintenance by an additional two years.

 

  • End of Mainstream Maintenance for SAP BusinessObjects BI4.1 is now December 31, 2018
  • End of Priority One Support for SAP BusinessObjects BI4.1 is now December 31, 2020


 


 

SAP BusinessObjects BI4.1 Product Availability Matrix ›


 

SAP Maintenance Strategy Rules


 

SAP Mainstream Maintenance Rules ›


 

SAP BusinessObjects Priority-One Rules ›

 

 

2. Existing SAP BusinessObjects Products Support Overview

 


Although the Mainstream Maintenance dates for SAP BusinessObjects BI4.1 have been extended by two years, this is not the case for other SAP BusinessObjects products. The list below provides a complete overview of SAP BusinessObjects products and their End of Mainstream Maintenance dates.

If you are currently not running an SAP BusinessObjects BI4.1 release, please validate your current Mainstream Maintenance dates. If those dates have passed or will pass in the near future, it is strongly recommended to upgrade your existing environment to SAP BusinessObjects BI4.1. For details on the upgrade process, please read:

 


SAP BusinessObjects Platform BI4.1


End of Mainstream Maintenance -> December 31, 2018

End of Priority One Support -> December 31, 2020

 


SAP BusinessObjects Platform BI4.0


End of Mainstream Maintenance -> December 31, 2015

End of Priority One Support -> December 31, 2017

 


SAP BusinessObjects Enterprise XI3.1


End of Mainstream Maintenance -> December 31, 2015

End of Priority One Support -> December 31, 2017

 


Prior to SAP BusinessObjects Enterprise XI3.1


 

End of Mainstream Maintenance -> Expired

End of Priority One Support -> Expired

 

 

3. Where to find more information about (upgrading to) SAP BusinessObjects BI4.1

 

SAP has been building a dedicated webpage with a collection of the most valuable information regarding the SAP BusinessObjects BI4.1 suite. Wherever you are in your Business Intelligence journey, you will find resources here that will help your organization be successful with SAP BusinessObjects BI solutions.

 

 

The SAPBUSINESSOBJECTSBI.COM website will enable you to:

 

  1. Be Your Organization's BI Champion
    Leverage the resources below to show what's possible with SAP BusinessObjects BI solutions and increase BI excitement within your organization.

  2. Empower Employees with a BI Strategy
    A solid BI strategy helps you get the most from your data assets, technology investments and BI initiatives.

  3. Design and Manage a Successful Implementation
    Address organizational and governance needs as well as the technology portion of your implementation.

  4. Get Personalized Upgrade Advice
    Start planning your upgrade with a personalized guide from an upgrade expert. The report will help you get the most out of your BI deployment.

  5. Advance your skills with BI Academy
    Learn more about BI and SAP BusinessObjects BI solutions.

  6. Events & Webinars
    Find new perspectives, alternative approaches, and spark your BI creativity.

 

 


 

www.sapbusinessobjectsbi.com

At a recent customer project, my teammate and I faced an issue importing Webi documents and universes into Translation Manager (all on BO4.1 SP4), getting the following error message:

 

org.apache.axis2.AxisFault: Unable to find servers in CMS, servers could be down or disabled by the administrator (FWN 01014)

 

There is a KB article describing this error and a solution: http://service.sap.com/sap/support/notes/1879675

 

On the one hand, we followed the instructions in the KB article and specified the hostname for each server. Although our server is multihomed (that is, it has more than one network interface), we hadn't thought about this beforehand, because the second network interface goes into a backup LAN and is never used for communication with, for example, the client tools. In addition, everything had worked fine so far, until we hit the issue with Translation Manager.

Still, just setting the hostname was not enough. The Windows firewall is enabled on the server side, so we had to assign static request ports to several services. Before our issue with Translation Manager, we had already done so for the CMS, the Input and Output FRS, and all Webi Processing Servers. We used TCPView to analyze which ports Translation Manager was opening connections on. As long as there were requests on ports we hadn't specified in the CMC (some random port number, e.g. 56487), we kept narrowing down the list of services to which Translation Manager establishes a connection. We had to specify a port for all of the following servers:

  • The Adaptive Processing Server (APS) hosting the DSL-Bridge Service
  • The APS hosting the Client Auditing Proxy Service
  • The APS hosting the Search Index Service
  • The APS hosting the Translation Service
  • The WACS (Web Application Container Server)
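Instead of watching TCPView session by session, a small scripted check can verify from a client machine that each CMC-assigned request port is actually reachable through the firewall. The sketch below is a minimal, hypothetical helper (not part of any SAP SDK; host and port are placeholders you would replace with your own server and assigned ports):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {

    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    public static boolean isPortOpen(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            // Connection refused, timed out, or blocked by a firewall.
            return false;
        }
    }

    public static void main(String[] args) {
        // Example: check the CMS request port (6400 is a common default; adjust as needed).
        System.out.println("CMS port open: "
                + isPortOpen("bi-server.example.com", 6400, 2000));
    }
}
```

Running this once per assigned port from a machine running the client tools quickly shows which firewall rules are still missing.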

 

Besides the issue with Translation Manager, we had another issue creating new universe connections in the Universe Design Tool. When creating a new connection, we had to wait up to 30 minutes (!) to get to the wizard page where you select from the list of available database drivers. Still, after those 30 minutes everything worked fine and we could create the connection successfully. Based on our experience with Translation Manager, we ran TCPView again and found that we needed to assign a port number to both Connection Servers (32-bit and 64-bit) in the CMC. Having done this, creating a new connection now works without any waiting time.

 

After all: if you have firewalls between your BO server and the client tools, just assign a port to every available server and open that port on the firewall. The only exception might be the Event Server, as I'm really not aware of any communication between this server and a component outside the BO server.

The Crystal Reports for Enterprise 4.1 SP05 release supports manual entry in prompts for BW variables. Here I will cover which variables are supported and not supported, and the steps to get a multi-value field for selection option variables.

Manual entry in prompts is supported for the following variables:

  1. Single value variable
  2. Multi-value variable
  3. Interval (Range) variable
  4. Selection option variable
  5. Formula variable
  6. Single Keydate variable

The manual entry feature is not supported for hierarchy variables and hierarchy node variables.

 

  1. The feature is available by default in CRE and the BOE Viewer for old (4.1 SP04 or earlier) and newly created reports.

When you open any report or create a new report, you should see a text box for manual entry with an add symbol, as shown below.  When you refresh a report with prompts, the report displays the dialog below with the manual entry text box option.

Multiple values can be entered, separated by semicolons.

You can either enter the values manually or select them from the list.

ManulEntry TExt field.png

 

 

  2. To get a multi-value selection field for the selection option variable, we have to make an entry in the configuration file of the Crystal Reports for Enterprise installation folder.

Make the below entry in the configuration file (C:\Program Files (x86)\SAP BusinessObjects\Crystal Reports for Enterprise XI 4.0\configuration\config.ini)

Entry to be made:  sap.sl.bics.variableComplexSelectionMapping=multivalue

 

MultiSelction.png

 

 

  3. To get the multi-value selection field in the Viewer, we have to make an entry in the CMC as well.

Entry location: Log in to the CMC -> Servers -> Crystal Reports Services -> CrystalReportsProcessingServer -> Java Child VM Arguments

Entry to be made: -Dsap.sl.bics.variableComplexSelectionMapping=multivalue

 

 

If the entry is not made in CRE or the Viewer, the field appears as an Interval or Range field.

Range.png

 

  Hope it helps…

Hello,

 

 

If you try to install BO 4.1 SP5 as an update on an existing installation of BO 4.1 SP4 on Windows Server 2012 R2, you get an error after the restart of the system.

 

The BO system cannot connect to the CMS.

 

In the CCM, in the properties of the Server Intelligence Agent (SIA), on the Configuration tab, you get the error: "Failed to retrieve the cluster name from the database. Reason: Parser failed to initialize." If you try to reconfigure the CMS, the error "FWB 00090" occurs.

 

 

To fix the problem, check whether the files sqlrule.llr and sqlrule.dfa are present in <BI_install_folder>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64.

 

If not, copy the newest versions of the files from a sub-directory of <BI_Install_folder>\Install Data\Install Cache to <BI_install_folder>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64.

 

Restart the SIA and the problem is solved.

 

Regards

Andreas

Over the past few years, I've been asked to create or assist in creating various scripts to automate tasks in SAP Business Intelligence 4.x (BI4).  Most of these scripts were loose files in the form of .jsp or .vbs that had to be run manually.  Lately, I've taken a personal liking to Program Files, which are uploaded and stored within the BI environment.  The latest request was for a script that would disable any user that hadn't logged into the CMS within the past 90 days.  Rather than hard-coding the 90 days into the program file, I created it to take the number of days as an argument (or input parameter).  If the argument is set to 0, the script will only display the user info without committing any changes.

 

If you're interested in creating your own Program File and want more information on how to do this yourself, I wrote another blog titled How to Create a Program File in BI4.  The blog describes the steps and requirements for creating a java program in Eclipse as well as provides a project template to start with.

 

Description

 

The following script can be used to disable, in bulk, users that have not logged into the BI system for the past X days.  This is equivalent to logging into the CMC manually, navigating to a user's property page, and clicking the 'Account is Disabled' checkbox under the 'Attribute Binding' section.

 

Where to Download


The source code and .lcmbiar file can be downloaded from the following SAP Note#:  http://service.sap.com/sap/support/notes/2097401

 


How to import the script into BI 4.x

 

The script can be imported into your BI environment using Promotion Management.  The zip downloaded from the kbase contains the .lcmbiar file.  Follow the steps below to import it using Promotion Management.

 

  1. Log into the Central Management Console (CMC) as Administrator
  2. From the home page, click on Promotion Management
  3. Click on the Import dropdown menu and choose the option, Import File
  4. The Import from File dialog box appears.  Ensure the 'File System' radio button is selected and click on 'Browse'
  5. Unzip the file you previously downloaded.  In this box, navigate to the extracted .lcmbiar file and click OK.
  6. There will be a 'new job' tab opened with some of the information from the .lcmbiar file automatically filled out (such as Name, Description, etc).  Select the Destination menu and choose the CMS you'd like to import the Program File into.  You will be prompted to provide permissions for this CMS.
  7. Click Create.
  8. Now that the import job is created, you will need to run the promotion.  Click on the Promote button in the toolbar.
  9. The summary page opens showing exactly which objects will be promoted.  You should see 2 items.  A folder object and a Program object.  Click the Promote button at the bottom of this page.
  10. When the job is finished running, the Status column will show 'Success' as a result.  The job should take less than a minute to run, so if you don't see it succeed within a few minutes, make sure you have an Adaptive Job Server running that contains a Promotion Management Scheduling Service.

 

Running the Program File

 

Once the .lcmbiar file is imported (see the steps above), you will see a folder called "Admin Scripts" under the top-level root folder.  Be sure to set permissions on this folder accordingly so it's not accessible to non-Admin users.  To schedule, follow the steps below:

 

  1. Navigate to the 'Admin Scripts' folder and right click on the Program File underneath called 'DisableInactiveUsers' and choose Properties.
  2. Under the 'Schedule' option on the left side, choose the Program Parameters option.
  3. Set the Arguments field to the number of days since last logon.  This value must be a positive integer.  ( In other words, 10.5 days will not work. )
  4. Click Schedule.

 

 

RECOMMENDATION:  The first time you run the program file, run it with an argument of 0.  This gives a full output of all users and how recently each user has logged in.  The output can be read as-is in text or easily copied into Excel and sorted as needed.  Once you know the number of days you want to run the job with, set the 'Arguments' value accordingly.

 

Other Info

 

As with any script, there can be unexpected results.  Test the script first in a lower level environment such as a Test or Development system.  Be sure to have a backup of your CMS system database in case you need to quickly revert the changes.  And lastly, if you'd like to see the source code, the Eclipse project files are included in the zip file attached to the same SAP Note.

 

The Arguments field is set to 0 by default.  This value represents the minimum number of days since the last successful login.  The script queries the SI_LASTLOGONTIME value stored in the CMS repository for each user and determines how many days ago that was (based on the current time where the Adaptive Job Server is running).  Any user that has not logged in within this number of days will be disabled when the script runs, with one exception: a user that has never logged in at all has no SI_LASTLOGONTIME property and will be left unaltered.  For these users, the output file shows the equivalent of the Java Date(0) value, which displays as either 1969 or 1970 depending on your time zone.
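The date arithmetic behind the threshold check is simple. Here is a minimal sketch of the "days since last logon" logic (hypothetical helper methods for illustration, not the script's actual source):

```java
import java.time.Duration;
import java.time.Instant;

public class LogonAge {

    // Whole days elapsed between the last logon and now.
    public static long daysSince(Instant lastLogon, Instant now) {
        return Duration.between(lastLogon, now).toDays();
    }

    // A user qualifies for disabling when the threshold is positive and the
    // account is older than it. maxDays == 0 means "report only, commit nothing",
    // matching the script's documented behavior.
    public static boolean shouldDisable(Instant lastLogon, Instant now, long maxDays) {
        return maxDays > 0 && daysSince(lastLogon, now) > maxDays;
    }

    public static void main(String[] args) {
        Instant lastLogon = Instant.parse("2014-08-01T00:00:00Z");
        Instant now = Instant.parse("2014-12-05T00:00:00Z");
        System.out.println(daysSince(lastLogon, now) + " days since last logon");
        System.out.println("Disable at 90-day threshold? "
                + shouldDisable(lastLogon, now, 90));
    }
}
```

In the real script, lastLogon would come from each user's SI_LASTLOGONTIME property, and users without that property are skipped entirely.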

 

If you leave the days value as 0, then the script will not commit any changes.  In this case it will only output the full user list with their last login date/time and if that user is currently disabled or not.  If any value greater than 0 is used, then the script will attempt to disable any users that match the criteria except for those that have never logged into the CMS at least once.

 

Once the program file is scheduled, clicking on the successful instance within the history window will show the output results similar to viewing a report instance.

 

historywindow.jpg

 

Except for the header info, the data in the output file is in a comma separated value format.  Below is an example of the output file:

 

 

screenshot1.jpg

 

If you have a larger user list, opening the csv within excel can make the data easier to work with.

screenshot2.jpg

I've written a number of JSP, VBScript, and Java-based applications in the past, and lately I've really enjoyed writing simple Java apps that run as a Program File (.jar) within the BI 4.x environment.  A couple of these Program Files were shared earlier this year: the 'biUserSessionKillScript' and 'DisableInactiveUsers'.

 

How to Delete Stale BI4 User Sessions with biUserSessionKillScript

How to Auto-Disable Inactive Users in BI4

 

In BI 4.x, a Program File can be a batch file, VBScript, JavaScript, shell script, or jar file.  The examples above were compiled as .jar files, and below I'll explain what it takes to create your own Java Program File.  I've created a template that can be imported into Eclipse to help get you started.  The source code is attached to a kbase, which can be downloaded separately here:

 

SAP Note 2099941

 

In this example, it's assumed you know how to import a Java project into Eclipse and how to export it to a jar file.

 

The What and Why:


Developing a BI4 Program File has many benefits.

  • For one, your application will be executed by the Adaptive Job Server, which means it will use the BI4 Java classpath of the AJS.  There's no need to package all the extra dependent jars as you would for a standalone application.  Your import statements will automatically find these libraries in the "C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\java\lib" directory, assuming you are only using the BI SDK libraries.
  • Second, a Program File offers all the benefits of historical instances.  But instead of having a Webi or CR report as your scheduled instance, a .txt file is generated in the Output FRS location with all of the System.out.println() information within your program.  This can be beneficial for both debugging your application as well as giving a historical look at whatever your application did when it was scheduled.
  • Lastly, since the program is stored, executed and managed within the BI system, the security for running the application can be set just as any other infoobject.

 

Attached at the bottom of this article is an Eclipse project template for creating a new BI4 program file.  Feel free to use this as a starting point for creating your own Program File.  Below is an explanation and breakdown of how this application template works.

 

How it works:

 

In the top portion of the example you'll find the class definition and main method.  The difficulty in writing a new program file is that the application needs to be able to run standalone so you can develop, alter, and debug it within Eclipse.  When the application runs as a program file within BI, the standalone portion of the application is never executed; in other words, the main() method is never actually called.  Instead, the Adaptive Job Server calls the run() method directly.

 

The example starts off with the class definition and the main() method.  The class implements the IProgramBase interface.  This is a requirement for all BI4 program file applications, because the program job server service calls the run() method directly.

image1.jpg

The majority of the main() method does nothing more than create an enterprise session and pass it to the run() method.  Be careful about what you include in main(), as it is executed only when your application runs outside of BI (within Eclipse, via the command line, or by any other means).  All code you want executed when the application runs within BI must be inside the run() method or in methods called from there.  Again, think of run() as your starting point.

 

To run the application from the command line in standalone mode, you need to create an enterprise session.  This requires four command line parameters passed to your application.  The code below first ensures that four values were entered; it also tolerates a fifth value in case something else needs to be passed, but that isn't necessary for the code to work.

 

image2.jpg
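Since the screenshot may be hard to read, here is a minimal, hypothetical reconstruction of that standalone entry point. The class name follows the template, but the method and messages here are illustrative; the actual SDK logon call is shown only as a comment:

```java
public class ProgramObjectTemplate /* implements IProgramBase in the real template */ {

    // Standalone mode needs at least: <username> <password> <cms name> <auth type>
    static boolean hasRequiredArgs(String[] args) {
        return args != null && args.length >= 4;
    }

    public static void main(String[] args) {
        if (!hasRequiredArgs(args)) {
            System.out.println("Usage: ProgramObjectTemplate <username> <password> <cms> <authType>");
            return;
        }
        // In the real template an enterprise session is created here, e.g.:
        // IEnterpriseSession session =
        //     CrystalEnterprise.getSessionMgr().logon(args[0], args[1], args[2], args[3]);
        // and then passed to run().
        System.out.println("Arguments OK, logging on to CMS " + args[2]);
    }
}
```

Remember that none of this main() path executes when the jar is scheduled inside BI; only run() does.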

There's not much going on in the run() method.  In this example, it assigns the username from the enterprise session to a variable and then writes it to the console.  However, every time you output text to the console via System.out.println(), the Adaptive Job Server writes it to the text file created in the Output FRS when the program file is scheduled.  There's no need for additional code to output your information to a text file; this is built into BI and done automatically at schedule time.

 

image3.jpg

 

Although this example really doesn't do much, keep in mind it's just a template.  The program will run as-is and output to the console the user name held in the enterprise session.  To test it within the Eclipse IDE, you will need to create a Run Configuration.

 

  1. To do this, right click on the project name within the Package Explorer, choose Run-As > Run Configurations.
  2. Highlight Java Application on the left and click the 'new launch configuration' button.
  3. Under the Main tab:
    • The project name should be auto-filled if you right-clicked on the correct package within the Package Explorer.  If it's not, browse for it and select the project.
    • You'll need to search and select the main class in the 2nd box.  Click the Search button to do this and select the ProgramObjectTemplate classname.
  4. Under the Arguments tab:
    • Add the 4 command line arguments separated by a space.
    • The values specified in the main() method are:  <username> <password> <cms name> <auth type>
  5. Once the Run Configuration is created and saved, you can run the application within the Eclipse IDE and test it.  Once tested, then you'll be ready to export it to jar and import the jar into the Central Management Console.

 

image5.jpg

image7.jpg

Import the Program File into the CMC


To import the Program File into the CMC, you'll need to follow these steps:


  1. Log into the CMC
  2. Navigate to a folder where you want to import the file. Right click on the folder, choose Add > Program File
  3. Choose Browse and select the .jar file you exported from Eclipse
  4. Select the Java radio button and click OK
  5. You will now see the new Program File within the folder.
  6. Before you can schedule it, you'll need to set the default Program Parameters.  To do this, right click on the Program File, and select Default Settings > Program Parameters
  7. In the Class to Run box, type the name of the class - ProgramObjectTemplate.
  8. If there are any additional command line parameters beyond the first 4 we created (user, password, cmsname, and authtype), set them in the Arguments box.  The BI4 Java SDK automatically creates a user session at schedule time based on the user who schedules the program file, so passing these four here isn't required.  The only arguments you need in the Arguments box are any additional command line parameters beyond those 4.  See either of the examples mentioned at the beginning of this blog; both use an additional command line argument.


image8.jpg

image9.jpg

One of the key concepts to understand in the BI Platform Monitoring application is the Health State metric.  A number of different aspects of the monitoring application rely on it, and without a clear understanding of how it is supposed to work, effective monitoring and troubleshooting become a frustrating task.  In this article, I will describe the concept of Health State in detail and explain how to correct the Health State in your BI Platform Monitoring application.

 

 

Health State Metric

 

In the Central Management Console, when creating a new watch, there are two types of Health State metrics that can be used as threshold criteria:


  • Server Health State - The Server Health State indicates the health of a particular server.  This metric can be used to understand whether the server is up and running, whether the server is overloaded, and whether the server is still able to take additional requests.  The Health State of the server can indicate to the BI administrator if they need to take action to troubleshoot a problem on that particular server

 

_bb.png



  • Topology Health State - The Topology Health State indicates the cumulative health of all servers of a particular type (Categories health) and also all servers in a particular server group.  The Service Categories include CrystalReports, Analysis Services, Dashboard Services, Promotion Management Services, Core Services, Explorer Services, Connectivity Services, and SIA nodes


_aa.png

 

 

 

How the value for Health State is determined

 

In the case of the Server Health State metric, the value is determined by the result of that particular server's watch.  Anytime you create a new server manually or use the System Configuration Wizard to create your Adaptive Processing Server configuration, the system automatically creates a new watch for each server using the nomenclature NODENAME.SERVERNAME Watch.  This is a "system" created watch and cannot be manually deleted.  You may have noticed in the Central Management Console that the system-created server watches are also displayed for ease of access under CMC -> Home -> Servers -> Servers List.

 

_serverslist.png

 

Health State Evaluation


Depending on value returned by the server's watch formula, the server health will display one of the following five states.


 

  • GREEN: Server health is good and no action is necessary
  • AMBER: Server is slightly overloaded, nearing peak values as defined by the caution rule
  • RED: Server resources are overused, the server is unable to take new requests, or the server is stopped or disabled
  • DISABLED: The watch is marked as disabled in the BI Monitoring application.  Select the watch and click the Enable button to re-enable the evaluation of this watch
  • FAILED: There is an error in the watch formula, or the BI Monitoring service is disabled or not running


Topology and Categories Health States

 

In order to give the BI administrator a quick path for troubleshooting issues in the BI Platform landscape, the server health states are aggregated into service category health states.  This makes it much simpler to tell whether any particular product type is available to the end users of the system.  For example, if your BI system mainly processes Crystal Reports view-on-demand requests, then to achieve maximum up-time it is vital that all the Crystal Reports Processing Servers in the BI landscape are available to process these jobs.  The Crystal Reports category health state depends on the aggregated health state of all the Crystal Reports server watches.  This can be seen by editing the Crystal Reports category watch formula, where you will find the health states of all Crystal Reports servers.

 

_watt.png

 

In the case of the Crystal Reports category, all of the servers required to process Crystal Reports are grouped together in the topology map so that you can tell at a glance which server watch may be causing the overall category state to change.

 

_crr.png

 

Fixing the Overall Health Watch and the Health State Hierarchy

 

On the BI Platform Monitoring Dashboard, there is an Overall Health state indicator (also known as the Consolidated Health Watch).  You may have noticed that this is quite often not showing a valid state (Green, Amber, or Red) and instead is giving a state of Failed.  In order to fix this, it is important to understand how this particular Health State is determined, then make the necessary underlying watch formula corrections that this watch is dependent on.  In the monitoring application, there is a large hierarchy of Health State watches and if any of these dependent watches is broken or invalid, the Overall Health will show a state of Failed.  In order to help the BI Administrator to correct their BI Platform Monitoring application and Overall Health, I have created a diagram showing each level in the Overall Health hierarchy which you can use to track down the broken watches and correct the formula. 


In this example, you can see that the Overall Health state is Failed. 


__her.png


 

If any of the dependent Health Watches below the Consolidated Health Watch are failed, then the watch in the next level up will also be failed.  Therefore, you must start at the bottom of the hierarchy and correct this watch.  In this example, the server APS 2 has a failed watch, therefore the SIA Node 2 watch is failed, the Enterprise Nodes watch is failed, and so on.

 

__her2.png

 

After correcting the APS 2 Health State watch formula, all of the parent watches now also show a correct value and the Overall Health is Green (OK).  Note that after you correct the child watch formula, you should wait a few minutes: there is a metric refresh interval of 60 seconds (by default) in which the Monitoring Service updates the status of all watches in the system.  In other words, the change in Overall Health will not happen immediately after correcting the dependent watches, so be patient.

 

__her33.png

 

Repairing the Server Watch formulas

 

When creating a new server or using the System Configuration Wizard, you will find that the automatic routine that handles this is not perfect.  Depending on which service you are creating, the automatically generated system watch may contain the wrong server name reference or, in some cases (such as the Connection Server), the wrong metric altogether.  When you edit the watch's danger rule or caution rule, the erroneous content in the formula that needs to be corrected is shown in red.

 

A server Health State watch should contain, at the very least, a check that the server is running.  Depending on the granularity you desire, you can create a two-state watch or a three-state watch.

 

____.png

 

 

 

If you want to see a yellow Caution state while a server is stopping or starting, use a three-state watch; if you are only interested in a Green state for running and Red for any other state, a two-state watch is enough.  Using the server metric Server Running State, you can easily create a new server watch based on whether the server is available.

 

 

Server Running State Values

 

State                     Value
Stopped                     0
Starting                    1
Initializing                2
Running                     3
Stopping                    4
Failed                      5
Running With Errors         6
Running With Warnings       7

 

 

Below are examples of both two-state and three-state watches that check for server availability.  In this example, my SIA node name is NODE and the server name is SERVERNAME.

 

 

Two state watch formula:

 

Danger Rule: NODE.SERVERNAME$'Server Running State'!=3

 

Three state watch formula:

 

Caution Rule: NODE.SERVERNAME$'Server Running State'==1 || NODE.SERVERNAME$'Server Running State'==2 || NODE.SERVERNAME$'Server Running State'==4 || NODE.SERVERNAME$'Server Running State'==6 || NODE.SERVERNAME$'Server Running State'==7
Danger Rule: NODE.SERVERNAME$'Server Running State'==0 || NODE.SERVERNAME$'Server Running State'==5
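The two-state and three-state rules above can be expressed as a small Python sketch.  This is an analogue for illustration only (the real rules live in the CMC watch editor, not in Python); the state constants come from the Server Running State table:

```python
# Plain-Python analogue of the two-state and three-state watch rules
# over the Server Running State metric (values 0-7 from the table).

STOPPED, STARTING, INITIALIZING, RUNNING, STOPPING = 0, 1, 2, 3, 4
FAILED, RUNNING_WITH_ERRORS, RUNNING_WITH_WARNINGS = 5, 6, 7

def two_state(state):
    # Danger rule: anything other than Running (3) is Red.
    return "Red" if state != RUNNING else "Green"

def three_state(state):
    # Danger rule: Stopped (0) or Failed (5).
    if state in (STOPPED, FAILED):
        return "Red"
    # Caution rule: transitional or degraded states (1, 2, 4, 6, 7).
    if state in (STARTING, INITIALIZING, STOPPING,
                 RUNNING_WITH_ERRORS, RUNNING_WITH_WARNINGS):
        return "Amber"
    return "Green"

print(two_state(RUNNING), three_state(STARTING))  # Green Amber
```

Note how the two-state watch collapses Starting, Stopping, and the degraded states straight into Red, while the three-state watch surfaces them as Amber.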

 

 

Factoring in performance to the server health state

 

In some cases, such as the Central Management Server, the load on the server is used to determine its health state.  Depending on which type of server you are editing the watch for, a variety of metrics can be used to determine load.  You may also want to include thresholds for these metrics in your server watch formula, so that the health state also reflects how well the service is performing and whether it is able to take on more jobs.


Refer to the BI Platform Administrator Guide for more information on server metrics to determine which metrics are suitable for your BI landscape.
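As a sketch of folding a load metric into the health state: the function below combines availability with a hypothetical load percentage and thresholds.  The metric name and threshold values here are illustrative assumptions, not actual CMS metric names; consult the Administrator Guide for the real metrics:

```python
# Hedged sketch: a health state that depends on both availability and load.
# "load_pct" and the 70/90 thresholds are hypothetical examples -- substitute
# a real server metric and thresholds suited to your landscape.

def server_health(running_state, load_pct, caution_at=70, danger_at=90):
    """Red if not Running (3) or overloaded; Amber when load nears the limit."""
    if running_state != 3 or load_pct >= danger_at:
        return "Red"
    if load_pct >= caution_at:
        return "Amber"
    return "Green"

print(server_health(3, 45))   # Green
print(server_health(3, 85))   # Amber
print(server_health(3, 95))   # Red
```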

 

Update 27/11/2014: Please read if you have SAP Data Services and/or SAP Information Steward - see below

Update 26/11/2014: Released today, the updated SAP BusinessObjects BI 4.1 Supported Platforms - see below

Update 24/11/2014: Free-Hand SQL Extension for Web Intelligence for SP05 article released.

 

 

Announcement

 

SAP has released on Monday November 17th 2014, as planned in the Maintenance Schedule, Support Package 05 for the following products:


  • SBOP BI Platform 4.1 SP05 Server
  • SBOP BI Platform 4.1 SP05 Client Tools
  • SBOP BI Platform 4.1 SP05 Live Office
  • SBOP BI Platform 4.1 SP05 Crystal Reports for Enterprise
  • SBOP BI Platform 4.1 SP05 Integration for SharePoint
  • SBOP BI Platform 4.1 SP05 .NET SDK Runtime
  • SAP BusinessObjects Dashboards 4.1 SP05
  • SAP BusinessObjects Explorer 4.1 SP05
  • SAP Crystal Server 2013 SP05
  • SAP Crystal Reports 2013 SP05

 

This comes five months after the release of SAP BI 4.1 SP04 back in June 2014.

 

You can download these updates from the SAP Marketplace as a Full Install Package or Support Package (Update).

 

E.g.: Full Install

new.png

 

E.g.: Support Package (Update)

update.png

 

Download Location: Software Downloads | SAP Support Portal

 

 

What's New?

 

The updated What's New document was released early this time, on 06/11/2014.  It is a good read; there are a few good new features in this update, but the ones most significant to me are:

 

  • SAP Lumira integration with the BI Platform
  • Free-hand SQL (FHSQL) for Web Intelligence via the SDK and UI Extension Points*

 

* I was told at SAP TechEd that a sample will be made available to help us do this.  This feature is expected to work out of the box, with no SDK required, in SP06.  Articles:

 

 

 

There are tons of fixes (356 to be exact).

 

 

Supported Platform (Product Availability Matrix)

 

The SAP BusinessObjects BI 4.1 Supported Platforms (PAM) document has been released on November 26th 2014.

 

URLs:

 

 

Documentation

 

The usual documents have been made available:

 

 

 

 

 

 

 

 

 

Forward Fit Plan

 

SAP is no longer updating the SBOP Forward Fit Plan, so I'm unable to confirm for the moment which updates are included here.  One would hope it is as it used to be and will include SAP BI 4.1 SP04 Patch 3.


To be confirmed...



Maintenance Schedule


SAP BI 4.1 SP05 Patch 5.1 (Week 51 - December 2014)

SAP BI 4.1 SP05 Patch 5.2 (Week 4 - January 2015)

 

SAP BI 4.1 SP06 is scheduled to be released late July 2015 (Week 30 2015).

 

Source: Schedules for Support Packages and Stacks | SAP Support Portal

 

 

Installing Updates


This training server has a clean installation of SP04 with only the English language installed.  This is how long it took to install everything.


Note: For those who have read my previous post about the release of SAP BI 4.1 SP04, the timings below will seem much quicker.  The reason is that I'm using a training server with the same specs but different pre-installed SAP software...  or SAP has made things a lot quicker now!


 

Updates

 

    • SBOP BI Platform 4.1 SP05 Server
    • SBOP BI Platform 4.1 SP05 Client Tools
    • SAP BusinessObjects Explorer 4.1 SP05

 

Environment

 

    • Windows Server 2012
    • 4x Virtual Processors (Hyper-V)
    • 20 GB RAM

 

Duration

 

1. As always, the Preparing to install screen takes a while...  about 6-7 minutes for me.

 

Please wait.png


2. This chart shows the time it took waiting for the Preparing screen to disappear then the installation time.


install time.png


3. As always, when you click Finish, do NOT reboot straight away.  Wait for setupengine.exe to go away in your Task Manager.  This can take a minute or so.

 

Task Manager.png

 

 

Past Articles

 

For information, I wrote the following articles about previous SAP BI Support Packages:

 

 

 

SAP Data Services and/or SAP Information Steward

 

 

Those of you with SAP Data Services (DS) 4.x and/or SAP Information Steward (IS) 4.x installed on your SAP BI 4.1 server will not be able to install this update.

 

At the "Check Prerequisites" screen you will get the following message:

untitled.png

 

SAP Note 1740516 is currently unavailable.  It is likely going to say that SAP BI 4.1 SP05 is compatible with SAP DS / IS 4.2 SP04 (to be released).

 

  • SAP DS 4.2 SP04: no release date yet (probably at the same time of IS)
  • SAP IS 4.2 SP04: release date Week 51, 2014

 

 

Conclusion

 

It's still early days and there are a couple of documents that need to be updated, but I'm looking forward to having a look at the Free-hand SQL in Web Intelligence!

 

I'm also pleased that Apache Tomcat has been updated from the very old 7.0.32 to 7.0.55.

 

As always, please share how it went for you in the comments below.  I'm sure this helps many people.

 

 

Please like and rate this article if you find it useful!

 

 

Thanks and good luck!

 

Patrick

Have you ever wanted to use a CSV delimiter other than the standard comma, semi-colon or Tab?  Maybe a pipe or tilde character?  Me too.  I didn't think it was possible, but a Google search of "web intelligence custom delimiter" yielded this...

 

Cleartelligence blog | Adding custom column delimiter characters to CSV export options to WEBI

 

To be sure, this would not be supported or even recommended by SAP, so use it at your own risk.  But I have to say, that is a pretty sweet customization.

 

Enjoy,

 

Noel
