
BI Platform


Hi!

 

As we know, there are some limitations when doing a migration using Promotion Management. The alternative is lcm_cli.bat, the command-line tool.

 

Below are the steps to perform a Live-to-Live migration using the lcm_cli.bat tool.

 

1) First, prepare the ".properties" file. To create it, simply use Notepad, Notepad++ or any text editor, then save it as <name>.properties, say my.properties.

Note: make sure you select "All Files" while saving, remove the .txt extension and use .properties instead.


2) Here is a sample properties file, "my.properties", for one object, e.g. Test_Test.rpt:

======================================================================

action=promote

LCM_CMS=localhost:6400

LCM_userName=administrator

LCM_password=Password1

LCM_authentication=secEnterprise

Source_CMS=Source:Port (Example: 172.16.12.133:6400)

Source_userName=administrator

Source_password=password

Source_authentication=secEnterprise

exportQuery1=select * from ci_Infoobjects where si_cuid='AaU8jopWOstGt51J2rBdXZQ'

Destination_CMS=Destination:Port   (Example: 172.16.12.134:6400)

Destination_username=administrator

Destination_password=password

Destination_authentication=secEnterprise

======================================================================

 

 

lcm1.JPG

 

 

Above, the export query simply selects the object I wanted to migrate, i.e. Test_Test.rpt (a Crystal Report), from Source to Destination.

In your case, use the CUID of your report (CMC > Public Folder > Your Folder > Your_Report.rpt > Properties).

Also, you can use many export queries in a single file by simply adding exportQuery1, exportQuery2, exportQuery3 and so on.
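To avoid hand-editing when promoting several objects, the file can be generated; a minimal Python sketch (all hosts, credentials and CUIDs are placeholders taken from the sample above):

```python
import os, tempfile

# Sketch: generate a my.properties file for lcm_cli.bat.
# All hosts, credentials and CUIDs here are placeholders - substitute your own.
def build_lcm_properties(lcm_cms, source_cms, dest_cms, cuids,
                         user="administrator", password="password",
                         auth="secEnterprise"):
    lines = [
        "action=promote",
        f"LCM_CMS={lcm_cms}",
        f"LCM_userName={user}",
        f"LCM_password={password}",
        f"LCM_authentication={auth}",
        f"Source_CMS={source_cms}",
        f"Source_userName={user}",
        f"Source_password={password}",
        f"Source_authentication={auth}",
    ]
    # One exportQueryN entry per object (exportQuery1, exportQuery2, ...).
    for n, cuid in enumerate(cuids, start=1):
        lines.append(f"exportQuery{n}=select * from ci_Infoobjects "
                     f"where si_cuid='{cuid}'")
    lines += [
        f"Destination_CMS={dest_cms}",
        f"Destination_username={user}",
        f"Destination_password={password}",
        f"Destination_authentication={auth}",
    ]
    return "\n".join(lines) + "\n"

text = build_lcm_properties("localhost:6400", "172.16.12.133:6400",
                            "172.16.12.134:6400",
                            ["AaU8jopWOstGt51J2rBdXZQ"])
path = os.path.join(tempfile.gettempdir(), "my.properties")
with open(path, "w") as f:
    f.write(text)
```

You would then point lcm_cli.bat at the generated file with -lcmproperty, as in step 3 below.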

 

 

3) Then simply go to <installdir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64\scripts in cmd, i.e. the Command Prompt, and run

 

lcm_cli.bat -lcmproperty "D:\my.properties"

 

lcm2.JPG

 

Note: for migrating the whole content, use all 3 virtual tables (CI_INFOOBJECTS, CI_APPOBJECTS and CI_SYSTEMOBJECTS) in your export queries. Or simply create a new job in Promotion Management with all the objects, save it (don't promote it), then copy the CUID of the job and use the same procedure above. (Thanks @Jawahar for this simple method.)

 

 

Thanks in Advance.

 

Regards,

Mahak Rathore

Hi!

 

This is my first blog post for learners, as I am also new to this, but I have collected some major points explaining the main differences between the Enterprise and Edge products of BusinessObjects. This blog is for information purposes only, to clarify why BusinessObjects used two different names for the same product version.

 

Below is the explanation.

 


Differences between BusinessObjects Enterprise XI 3.0/3.1 and BusinessObjects Edge 3.0/3.1

 

 

  1. Tomcat is the only web application server supported in BusinessObjects Edge 3.0. BusinessObjects Edge 3.1 has an internally hosted application server which services all its web-based applications, so there is no need to purchase a separate application server. Should you wish to deploy on a separate application server, Tomcat is supported.
  2. BusinessObjects Edge 3.0/3.1 does not have native .NET IIS support.
  3. Profiles and Publications (also referred to as Publishing) are disabled in BusinessObjects Edge 3.0. This is enabled in BusinessObjects Edge 3.1, with a limitation of sending to 100 recipients at a time. The option to send to additional recipients can be purchased as add-on functionality.
  4. Federation is disabled in BusinessObjects Edge 3.0/3.1. Note that this feature remains visible in the CMC for the BusinessObjects Edge 3.0 release; however, it is not operational.
  5. BI Widget is supported in BusinessObjects Edge 3.0/3.1. Polestar is supported in BusinessObjects Edge 3.1.
  6. BusinessObjects Edge 3.0/3.1 is a single server deployment while BusinessObjects Enterprise can be deployed on multiple servers.

 

 

Reviews/Opinions/Questions are welcome.

 

I hope the above explanation helps you.

 

Thanks in Advance.

 

Regards,

Mahak Rathore

The first release of the Lumira integration for BI 4.1 is out!

 

 

Here I cover the functionality available with the first release.  The functionality is delivered with an add-on installer for BI 4.1.  The minimum requirements are 4.1 SP3 and Lumira 1.17.  This also means you need HANA to run Lumira 1.17.

BI 4.1 SP4 is recommended if you want full LCM support.

The most basic workflow:

First, you will see that with Lumira 1.17, there is a new option to publish to BI:

Publish.png

 

When you have created your data set and visualization, you will be able to publish those to the BI repository, just like you do with a Crystal Report or Web Intelligence report.

ChooseFolder.png

 

Once you have published your story and dataset, you can see the story listed directly in BI launchpad, inheriting the same folder security as your other reports.

StoryList.png

From here, you can view this report just like you do with any other report type.  The "story" that you are viewing is just a visualization based on a data set.  This also includes support for viewing InfoGraphics, right in BI Launchpad.

StoryView.pngInfographic.png

This is also supported in OpenDocument, as you would expect with the other document types!

OpenDoc.png

Universe Support.

Both UNV and UNX data sources are supported by Lumira.  This means you can leverage your existing universe infrastructure, AND underlying security.

Once you've acquired data from a universe, what you publish to BI is the dataset based on that universe.  That dataset can then be scheduled, and refreshed by the user.  Below you can see the schedule option on a dataset, which gives you the standard scheduling options that you would see for a CR or Webi report.

schedule.png

Of course static datasets, which are based on uploaded data, will not have the schedule options.

Note that "force refresh on open" is not yet available.  However, a user can refresh the data on demand if they have the rights to do so.

When scheduling or refreshing, the same prompts with which the data was first acquired will be reused.  At this time, changing prompts, or popping them up for the user on a manual refresh, is not yet supported.

 

Datasets:

Datasets are the building blocks of stories.  A story can be built from one or more datasets.  In BI, these datasets are stored in the CMC, in a new location:

datasetsCMC.png

These datasets can be secured and deleted, much like you can do with Explorer information spaces.

 

More on Datasets:

In Lumira, you 'prepare' a dataset, including merging data from a source like a universe with data from a local spreadsheet.

Screen Shot 2014-07-02 at 3.00.50 PM.png

You can also perform other operations on the data before publishing it.  When you schedule a dataset to fetch the latest data, all the transforms that were applied will be replayed automatically.  This means any column joins/splits, custom calculations or combining of the data with a local Excel source will be reapplied.
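Conceptually, this replay works like a recorded pipeline of data-prep steps; a hypothetical Python sketch of the idea (an illustration only, not Lumira's actual implementation):

```python
# Conceptual sketch of transform replay: each data-prep step is recorded,
# and on a scheduled refresh every step is reapplied to the fresh rows.
# This illustrates the idea only - it is not Lumira's actual implementation.
recorded_transforms = []

def record(transform):
    recorded_transforms.append(transform)   # remember the step for replay
    return transform

@record
def split_full_name(row):
    first, last = row["name"].split(" ", 1)   # a column split
    return {**row, "first": first, "last": last}

@record
def add_margin(row):
    return {**row, "margin": row["revenue"] - row["cost"]}  # custom calculation

def refresh(fresh_rows):
    """Replay every recorded transform, in order, against newly fetched data."""
    for transform in recorded_transforms:
        fresh_rows = [transform(row) for row in fresh_rows]
    return fresh_rows

rows = refresh([{"name": "Ada Lovelace", "revenue": 120, "cost": 80}])
# rows[0] now carries "first", "last" and "margin" alongside the source columns
```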

 

 

 

ESRI map support

Support for ESRI ArcGIS is also available with this initial release, which gives you great interactive map support.  Support for an on-premise ESRI server is not yet available with this first release.

ESRI.png

 

Mobile Support

The SAP BI Mobile application will now make the stories available through the Mobi application, where you can consume stories directly alongside your existing webi and other BI mobile content.

 

LCM Support

Full LCM support, including the dependency calculations, is included.  This means that promoting a story will allow you to include the dependent dataset, universe, connection object and all related security.  Do note that to have the full LCM UI for the Lumira integration, you must be on BI 4.1 SP4.  On 4.1 SP3, you can still use the LCM command line interface.

 

New BI Proxy Server

A new category of servers shows up, called "Lumira Services".  This service is responsible for proxying requests down to Lumira Server, which performs all the heavy lifting of the analytics.

Servers.png

 

Auditing, Monitoring, BI Enterprise Readiness

Standard monitoring metrics will show up for this new service.  Additionally, the standard 'who viewed what' that you expect to see in your audit log also becomes available.

 

Data Storage & setting Limits

When a dataset is published to BI4, the underlying data is actually stored in Lumira Server.  This is all done transparently behind the scenes and does not require any administration in Lumira Server.  In fact, it does not actually show up in your Lumira Server repository.

When a user refreshes a story based on a universe, they will get their own copy of the data stored temporarily.   An administrator can set size and time restrictions for this temporary storage.

datasetManager.png

 

Limitations

Stories must be authored and edited in Lumira Desktop.  Authoring directly from BI Launchpad, as you would do with the Webi applet, is not yet supported.

Accessing HANA Live is not yet supported.  At this time, a static dataset from HANA must be published from desktop.

 

 

Summary

This is only the first integration release of Lumira which already packs in a lot of functionality and allows you to leverage your existing BI4 infrastructure.  The Lumira BI4 integration will continue to add more enhancements and functionality with more releases over the coming months.

This post shows some possible ways to report in Web Intelligence which users have never logged in to the platform, using the SAP BO BI universe in SBOPRepositoryExplorer and, in the second method, combining it with the SAP BO BI Audit universe. It can also be a good way to check whether auditing is working as expected.

 

 

Method I (Without Audit data)

 

Prerequisites:


Some required components:

  1. SAP BO BI 4.x Platform;
  2. SBOPRepositoryExplorer connection and the universe;
  3. WebIntelligence to create the document.

 

 

Creating a report with user information:

 

From WRC or from BI LaunchPad using WebIntelligence, we can create the following query to show the number of users that we have in our SAP BO BI environment:

 

 

In our Test environment we have 2.921 users.

Now we can discover the number of users who never logged in to the Test environment:

 

 

This means that we have 2.921-2.676=245 users who have logged in to the Test environment at some point.

 

With the following query we can show the list of users who never logged in to this environment:

 

 

 

 

Method II (with Audit data)

 

Prerequisites:

 

Some required components:

  1. SAP BO BI 4.x Platform;
  2. SAP BO BI Audit DB;
  3. SAP BO BI Audit Universe configured and pointing to Audit DB;
  4. Excel to save users from Audit Report;
  5. IDT (Information Design Tool) to configure SBOPRepositoryExplorer connection and the universe;
  6. WebIntelligence to create the document.

 

 

Creating a Report with Users' Login Activity from the Audit DB:

 

Using WebIntelligence and Audit universe:

 

- For result objects:

 

 

- Filter Event Status with: User Logon Succeeded, Named User Logon Succeeded and Concurrent Logon Succeeded.

 

 

In the end you have the following query:

 

 

Execute with Run Query:

 

 

Save the document for future use.



Export Report to Excel File:


Export report to Excel (XLS or XLSX):

 

 

In Excel, remove all blank rows above the header and all blank columns before "User Name"; also remove any character other than letters [a-z, A-Z] and digits [0-9].
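If you prefer to script the cleanup, the same rules can be expressed as a small filter; a Python sketch assuming the exported sheet has already been read into rows of strings (e.g. via csv or an xlsx reader):

```python
import re

# Sketch of the cleanup rules above: drop fully blank rows (e.g. above the
# header) and strip any non-alphanumeric character from each cell.
# Assumes the Excel export has already been read into a list of rows.
def clean_rows(rows):
    cleaned = []
    for row in rows:
        if not any(cell.strip() for cell in row):
            continue  # skip fully blank rows
        cleaned.append([re.sub(r"[^A-Za-z0-9]", "", cell) for cell in row])
    return cleaned

sample = [["", ""], ["User Name", "Logins"], ["j.doe", "12"]]
result = clean_rows(sample)
```

Note that the filter also strips spaces and dots from user names ("User Name" becomes "UserName"), which is what the rule above asks for; adjust the regex if your join keys need to keep some punctuation.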

You can also use SAP BO Live Office to retrieve the data using the Audit universe and schedule it periodically.

Rename the report to the final table name in the universe:

 

 

Save it to a path visible to the SAP BO BI client tools (IDT and WRC) and to the SAP BO BI WebIProcessingServer.

 

 

 

Retrieve the SBOPRepositoryExplorer universe into IDT:

 

In IDT (Information Design Tool), create a Local Project and retrieve the universe from Repository Resources:

 

 

 

 

 

Configure Universe Connection attaching the Excel File:

 

To attach our Excel file definition to our universe, we must create a universe connection in IDT, inside a project, for example:

 

 

 

 


Test Data from Excel in Connection:

 

Before continuing with the next steps, it is important to check that the Excel data can be read, that the path is correctly defined, and that the structure is correct:


 

 



Import new Table (Excel) into Universe:

 

Now we can import the table into the Data Foundation:

 

 

Insert Join between EXCEL's table and USERS table:

 

 

Configure Join:

 

 

 

Save the Data Foundation.

 

 

Define new objects in the Business Layer:

 

Here, in the Business Layer, inside the "Users" folder, we can define the new measure coming from the new table:

 

 

for example, with the following content:

 

sum( ifnull("XLA_AUDIT_LOGON_USERS"."NUMBER_OF_LOGINS_TOTAL",0) )



and before saving and publishing the universe, create a query to test the results with all users and with users without logins:

 

- With login (in our example 2.921 users):

 

 

- Without login (in our example 2.761 users):

 

 

This means that we have only 2.921-2.761=160 users who have logged in to the Test environment at some point.

Now we can publish our new universe to the CMS for the next topic.

 

 

Compare data from Method I and Method II

 

As you can observe, in "Method I" we have 245 logged-in users and in "Method II" only 160. We want to discover which users differ between "Method I" and "Method II" and try to understand why those users were not recorded in the Audit DB.

 

- First, create a query with all logged-in users from both methods:

 

 

(245 users)

 

- Second, create a combined query (with MINUS) to get the list of users that were not recorded in the Audit DB:

 

 

So we have to investigate why those 85 users were not recorded in the Audit DB.
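Outside WebI, the same MINUS comparison is just a set difference; a minimal Python sketch with made-up user lists (in practice the two sets come from the Method I and Method II queries):

```python
# Sketch of the Method I vs Method II comparison as a set difference.
# These user lists are made up for illustration; in practice they would be
# the logged-in users returned by the two queries.
method1_logged = {"alice", "bob", "carol", "dave"}   # from repository data
method2_logged = {"alice", "bob"}                    # from the Audit DB

# Users seen as logged in by Method I but never recorded in the Audit DB.
missing_from_audit = method1_logged - method2_logged
print(sorted(missing_from_audit))
```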

 

That's all for the moment.

Jorge Sousa

This post shows one possible way to create the list of SAP BO LCM jobs using the SAP BO BI Universe in SBOPRepositoryExplorer.

 

Prerequisites:

 

Some required components:

  1. SAP BO BI 4.x Platform;
  2. XML file with predefined CMS query;
  3. IDT to configure SBOPRepositoryExplorer connection and the universe;
  4. WebIntelligence to create the document.

 

For this example I'm using SAP BO BI 4.1 SP2.

 

Create CMS query for LCM Jobs:

 

We can create an XML file with the following content:

<?xml version='1.0' encoding='ISO-8859-15'?>
<Tables xmlns="http://www.w3.org/2011/SBOPTables">
<Table>
    <TableName>LCMJOBS</TableName>
    <TableDescription>LCM Jobs</TableDescription>
    <BOQuery>SELECT SI_ID,
                    SI_CUID,
                    SI_NAME,
                    SI_DESCRIPTION,
                    SI_OWNER,
                    SI_OWNERID,
                    SI_PARENTID,
                    SI_PARENT_CUID,
                    SI_PARENT_FOLDER,
                    SI_PARENT_FOLDER_CUID,
                    SI_RECURRING,
                    SI_CREATION_TIME,
                    SI_DESTINATION_CMS_NAME,
                    SI_DOC_SENDER,
                    SI_ENDTIME,
                    SI_FLAGS,
                    SI_INSTANCE,
                    SI_JOB_EXPORT_FILE_NAME,
                    SI_JOB_STATUS,
                    SI_NEW_JOB_ID,
                    SI_SCHEDULE_STATUS,
                    SI_SOURCE_CMS_NAME,
                    SI_STARTTIME,
                    SI_STATUSINFO,
                    SI_UPDATE_TS
                FROM
                    CI_INFOOBJECTS
                WHERE
                    SI_KIND='LCMJob'</BOQuery>
    <BOFields>SI_ID,
              SI_CUID,
              SI_NAME,
              SI_DESCRIPTION,
              SI_OWNER,
              SI_OWNERID,
              SI_PARENTID,
              SI_PARENT_CUID,
              SI_PARENT_FOLDER,
              SI_PARENT_FOLDER_CUID,
              SI_RECURRING,
              SI_CREATION_TIME,
              SI_DESTINATION_CMS_NAME,
              SI_DOC_SENDER,
              SI_ENDTIME,
              SI_FLAGS,
              SI_INSTANCE,
              SI_JOB_EXPORT_FILE_NAME,
              SI_JOB_STATUS,
              SI_NEW_JOB_ID,
              SI_SCHEDULE_STATUS,
              SI_SOURCE_CMS_NAME,
              SI_STARTTIME,
              SI_STATUSINFO,
              SI_UPDATE_TS</BOFields>
    <DatabaseFieldTypes>INTEGER,
                        VARCHAR(100),
                        VARCHAR(256),
                        VARCHAR(256),
                        VARCHAR(50),
                        INTEGER,
                        INTEGER,
                        VARCHAR(100),
                        INTEGER,
                        VARCHAR(100),
                        BIT,
                        TIMESTAMP,
                        VARCHAR(256),
                        VARCHAR(50),
                        TIMESTAMP,
                        INTEGER,
                        BIT,
                        VARCHAR(256),
                        VARCHAR(256),
                        INTEGER,
                        INTEGER,
                        VARCHAR(256),
                        TIMESTAMP,
                        VARCHAR(256),
                        TIMESTAMP</DatabaseFieldTypes>
    <BOFunction>NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC,
                NOFUNC</BOFunction>
</Table>
</Tables>
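Since the entries in <BOFields>, <DatabaseFieldTypes> and <BOFunction> must line up one per column (25 each here), a quick consistency check can catch mistakes before the connection fails; a Python sketch using only the standard library (the inline XML is a trimmed two-column example of the structure above):

```python
import xml.etree.ElementTree as ET

# Sanity-check a SBOPTables definition: the comma-separated entries in
# BOFields, DatabaseFieldTypes and BOFunction must all have the same length.
# Caveat: the naive split(",") assumes no commas inside type names
# (e.g. DECIMAL(10,2) would need smarter parsing).
NS = {"t": "http://www.w3.org/2011/SBOPTables"}

def check_table(xml_text):
    root = ET.fromstring(xml_text)
    for table in root.findall("t:Table", NS):
        counts = {tag: len(table.find(f"t:{tag}", NS).text.split(","))
                  for tag in ("BOFields", "DatabaseFieldTypes", "BOFunction")}
        if len(set(counts.values())) != 1:
            raise ValueError(f"column lists out of sync: {counts}")
    return True

# Trimmed two-column example of the table definition used above.
sample = """<Tables xmlns="http://www.w3.org/2011/SBOPTables">
<Table>
    <TableName>LCMJOBS</TableName>
    <TableDescription>LCM Jobs</TableDescription>
    <BOQuery>SELECT SI_ID, SI_NAME FROM CI_INFOOBJECTS
             WHERE SI_KIND='LCMJob'</BOQuery>
    <BOFields>SI_ID, SI_NAME</BOFields>
    <DatabaseFieldTypes>INTEGER, VARCHAR(256)</DatabaseFieldTypes>
    <BOFunction>NOFUNC, NOFUNC</BOFunction>
</Table>
</Tables>"""
print(check_table(sample))
```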




 

Configure Universe Connection attaching the XML file:

 

To attach our XML file definition to our universe, we must create a universe connection in IDT, inside a project, for example:

 

 

 

 

Create the Universe:

 

Once the connection is configured, we can create the Data Foundation and Business Layer:

 

 

 

Test the Universe:

 

Once the universe is created, we can test it before publishing to the CMS:

 

 

Publish Universe to CMS:

 

After testing, we can publish it to the CMS:

 

 

Create report in BI LaunchPad with WebI:

 

After publishing the universe, we can use it in WRC and in BI LaunchPad:

 

and the report can be like:

 

 

Thanks and enjoy.

Jorge Sousa

 

Update 17/06/2014: What's New document has been updated.  Comments below.

Update 22/06/2014: Added Section: Maintenance Schedule.  See below.

Update 23/06/2014: PAM has been updated.  See below.

Update 04/07/2014: Forward Fit Plan has been updated.  See below.

 

 

Announcement

 

On Friday June 13th 2014, as planned in the Maintenance Schedule, SAP released Support Package 04 for the following products:


  • SBOP BI Platform 4.1 SP04 Server
  • SBOP BI Platform 4.1 SP04 Client Tools
  • SBOP BI Platform 4.1 SP04 Live Office
  • SBOP BI Platform 4.1 SP04 Crystal Reports for Enterprise
  • SBOP BI Platform 4.1 SP04 Integration for SharePoint
  • SBOP BI Platform 4.1 SP04 NET SDK Runtime
  • SAP BusinessObjects Dashboards 4.1 SP04
  • SAP BusinessObjects Explorer 4.1 SP04
  • SAP Crystal Server 2013 SP04
  • SAP Crystal Reports 2013 SP04

 

You can download these updates from the SAP Marketplace as a Full Install Package or Support Package (Update).

 

E.g.: Full Install

Full Install.png

 

E.g.: Support Package (Update)

SP (Update).png

 

 

What's New?

 

The updated What's New document has been released a few days late, on 17/06/2014.  There are 9 new features and, unless I'm missing the point, none of them are very impressive.  However, there are tons of fixes (293 to be exact).

 

Tip: If the link above still shows an old version, refresh the page in your browser or press F5.

 

 

Supported Platform (Product Availability Matrix)

 

The updated SAP BusinessObjects BI 4.1 Supported Platforms (PAM) document has been released a week late on 23/06/2014.

 

Alternative URL: http://service.sap.com/pam

 

As far as I can tell, the following extra support has been added since SAP BI 4.1 SP03:

 

  • CMS + Audit Repository Support by Operating System
    • Microsoft SQL Server 2014
    • Oracle 12c
    • SAP HANA SPS08
    • Sybase ASE 16

 

  • Adobe Flash Player 12

 

  • SAP HANA Support per SAP BI Products

 

  • Java Runtime (JRE) 1.8 (For browser use with Web Intelligence - Not Server side aka JDK which is still 1.7)

 

 

Documentation

 

The usual documents have been made available:

 

 

 

 

 

 

 

 

 

Forward Fit Plan

 

The SBOP Forward Fit Plan has finally been updated.  A few weeks late...  SAP BI 4.1 SP04 includes the following updates and fixes:

 

  • BI 4.1 Patch 3.1
  • BI 4.1 Patch 2.2 - 2.4
  • BI 4.1 Patch 1.6 - 1.8

 

  • BI 4.0 Patch 6.11 - 6.12
  • BI 4.0 Patch 7.7 - 7.9

 

  • XI 3.1 FP 6.4

 

Source: SBOP Forward Fit Plan



Maintenance Schedule


SAP BI 4.1 SP04 Patch 4.1 (Week 31 - August 2014)

SAP BI 4.1 SP04 Patch 4.2 (Week 35 - August/September 2014)

SAP BI 4.1 SP04 Patch 4.3 (Week 40 - October 2014)

SAP BI 4.1 SP04 Patch 4.4 (Week 44 - November 2014)

SAP BI 4.1 SP04 Patch 4.5 (Week 48 - November / December 2014)

 

It is interesting and strange that there is no SAP BI 4.1 SP05 scheduled in Q3 or Q4 of this year.  There is normally a new SP0X when an SP0Y patch Y.4 is released.

 

E.g.: SP04 when SP03 Patch 4 (3.4) is released.

 

Is this a sign of an up and coming SAP BI 4.2 or SAP BI 5.0?

 

Source: Maintenance Schedule

 

 

Installing Updates

 

I have installed the following updates on my training server.

 

Updates Installed

 

    • SBOP BI Platform 4.1 SP04 Server
    • SBOP BI Platform 4.1 SP04 Client Tools
    • SAP BusinessObjects Explorer 4.1 SP04

 

Environment

 

    • Windows Server 2012
    • 4x Virtual Processors (Hyper-V)
    • 20 GB RAM

 

Duration

 

Bearing in mind my training server originally started with a clean installation of SP02 then patched to SP03 with 3x languages (English, French, Finnish), this is how long it took to install everything.

 

1. As always, the Preparing to install screen takes longer and longer...

 

Please wait.png


2. This chart shows the time it took waiting for the Preparing screen to disappear then the installation time.  That's right, about 2.5 hours (151 minutes) just to patch SAP BI Platform 4.1 SP04 Server.


Chart.png


3. As always, when you click Finish, do NOT reboot straight away.  Wait for setupengine.exe to go away in your Task Manager.  This can take a minute or so.

 

Task Manager.png

 

4. How it looks for me now.

 

Programs.png

 

 

Past Articles

 

For information, I wrote the following articles about previous SAP BI Support Packages:

 

 

 

Conclusion

 

It's still early days and there are a couple of documents that need to be updated, but so far so good.  No errors in the Event Viewer, all services are starting as they should and some preliminary tests are successful.

 

As always, please share how it went for you in the comments below.  I'm sure this helps many people.

 

 

Please like and rate this article if you find it useful!

 

 

Thanks and good luck!

 

Patrick

Here's some exciting news for you enterprise data connectivity junkies out there: SAP's BI 4.1 suite will support Hive2 and Impala connectivity via ODBC and JDBC drivers from Simba Technologies. And later in the year, so too will SAP's Lumira data visualization software.

 

For Simba Technologies, it's a mutually-rewarding partnership: Simba shares SAP's broad commitments to enterprise Big Data innovation, integration, productivity, and efficiency. But beyond that, why should Simba's connectivity "plumbing" matter to SAP's customers?

 

SAP BI 4.1 + Simba Connectivity = The Future of Big Data Interoperability

SAP's adoption of Simba connectivity drivers represents SAP taking a progressive stand for the innovation enterprise: From the data warehouse to the BI application to the Hadoop framework of choice, when it comes to Big Data in the enterprise, interoperability matters. A lot.

 

The SAP BI 4.1 Suite now includes Simba ODBC and JDBC drivers as embedded components. SAP BI 4.1 customers can easily connect their Big Data directly to Hive or Impala. Queries are faster, performance is better, and reliability is so good enterprise IT can take it for granted. (Not that enterprise IT would or should ever take anything for granted, of course!)

 

Interoperability and Extensibility: How Best-in-class Connectivity Impacts SAP Enterprises

What's really meaningful about this partnership is that it signifies SAP's commitment to interoperability. The Simba JDBC drivers for Big Data, for example, adhere to the JDBC standards. For SAP BI 4.1 customers, that means accessibility to more apps, more platforms, more data sources. The SAP BI 4.1 Suite is a first-class diplomatic citizen in the Big Data world because it can connect directly to any Hadoop distribution – no need to get drivers from the Hadoop distros – SAP has it all built in. SAP BI 4.1 customers get this direct Big Data and Hadoop connectivity using the same tools and products they have always used.

 

Today BI 4.1, Tomorrow Lumira

BI 4.1’s support for Hive and Impala connectivity via SIMBA drivers is a first step (or the first two steps) in optimizing Hadoop connectivity for SAP enterprises. SAP has cranked the throttle when it comes to optimizing Hadoop SQL engine performance. And the next step is on the BI visualization side: Lumira, SAP’s innovative solution for visualizing all that big data, will adopt Simba JDBC drivers later this year. The right tools, optimized connectivity, and screaming fast query speed: It’s a great time to be an SAP Big Data enterprise.

At SAP we understand customers would prefer to resolve product issues on their own, rather than logging a support incident with SAP Product Support.

 

To help customers resolve their own issues without our involvement, we have started to externalize our methodology for resolving product related issues. We call this ‘Troubleshooting by Scenario’. Troubleshooting by Scenario means you can follow exactly the same steps and methodology that a SAP support engineer or developer would follow to isolate the issue.

 

‘Troubleshooting by Scenario’ provides customers with a list of scenarios. Example scenarios are ‘Promotion Management’, ‘Process crashes’, ‘Install problems’, ‘Scheduling problems’, and ‘Report refresh problems’ to name but a few.

 

(To start with we are providing one scenario “Promotion Management”, within one product area “Business Intelligence Products”, in order to get your feedback before we expand this innovative idea further.)

 

Within each scenario we provide a list of hypotheses (something to test or high level symptoms). For example “Problem with the meta data of the source or target repository” or “Known workflow causes a problem” or “Individual object causing a problem somewhere” are all hypotheses.

 

For each hypothesis we explain:

  • the purpose (of the troubleshooting task)
  • the 'tool' name (and details of the tool)
  • a rating (to help you pick in case you're not sure which one to use first)
  • why the tool is suitable
  • how to use the tool (for that particular hypothesis)
  • next steps

 

The idea all along is for customers to self-serve and resolve issues without the need to contact SAP. Of course there may be times you will need to contact SAP to help isolate or resolve an issue, and certainly when a defect requires a new code-level fix.

 

Please watch this video

 

and visit the Promotion Management Troubleshooting Scenario at http://wiki.scn.sap.com/wiki/display/BOBJ/Promotion+Management+Problems

 

Your feedback is most valuable, please either comment here, use the survey or contact me via Twitter.

I'd be delighted to hear your opinions.

Thank you.

 

Matthew Shaw                          Twitter:@MattShaw_on_BI                Feedback Survey

This post shows how we can create a useful security matrix in SAP BO BI using the SBOPRepositoryExplorer connector.

 

In short, I want to show you the following content for this example of a security matrix:

 

  • Access Levels (ACLs) definition;
  • Groups and Users Relation;
  • Application Rights;
  • Folder Rights;
  • Universe Rights;
  • Connection Rights.

 

1. How to extract ACLs


Using the universe provided in SBOPRepositoryExplorer and WebIntelligence:

universe_selection.png

We can use the following dimensions to extract the ACLs from our CMS repository:

 

 

 

The next step is to create some required variables for our example:

 

 

=If ([Specific Right]=0 And [Right Group Name]<>"General")
     Then "Overwrite General"
          Else
               If ([Specific Right]=1 And [Right Group Name]<>"General")
                    Then "Specific Right"



 

=If  (Count([CRole Right ID]) Where([CRole Right Granted]=1))>0
     Then 1
          Else
               If (Count([CRole Right ID]) Where([CRole Right Granted]=0))>0
                    Then 0
                         Else Count([CRole Right ID])



Now we can create a cross-table like:

 

 

And for the values we can use conditional formatting rules:

 

 

And the final result is (example):

 

2. Groups and Users Relation

 

For this kind of content we can use different perspectives/views depending on our requirements, but in any case we can use the following basic dimensions:


 

Here we have an example with:

 

"Group Name"

"User Name"


and for the value we can define the following formula:

 

=If Count([Group Name])>=1
     Then "X"
          Else ""




Other example using full path:

 

 

3. Application Rights

 

For applications we can use the following dimensions:

 

 

A possible example:

 

4. Folder Rights

 

For folder rights we can use the same logic as for application rights:

 

 

5. Universe Rights


In the next example we are showing rights for universe folders:

 

 

 

6. Connection Rights


As in the previous one, we can use the following dimensions:


 

This is a simple way to create our security matrix online for SAP BO BI.

 

Thanks and enjoy!

Jorge Sousa


Here is the slow CMC behavior that was making it hard to work:


It takes 30 seconds to more than 1 minute to log in to the CMC / BI Launchpad.

Once logged in, navigation between the different sections is slow.

The Tomcat manager came up fast; however, the login page also took time to come up.


Things to check:


Multiple boe_cmsd processes were observed on the server box. One remains constant while the others come and go.

We checked that Platform Search was set to continuous crawling.

Our APS service is split; however, there was no separate APS for the Platform Search service.

As our BO server runs in a 2-node cluster, we stopped the 2nd node to see if performance improved.

It was observed in the past that restarting the SIA resolves the issue temporarily; however, the issue comes up again after a few days.

We restarted the SIA, but no visible change was observed in the system.

We disabled 'Enable auto-save for users who have sufficient privileges' for Webi reports, referring to KBA 1342368.

(See KBA 1956237 for information on server hangs when auto-save in Web Intelligence is triggered.)


How we resolved this issue:


We removed the Platform Search service from the shared APS and created a separate APS with the Platform Search service.
In order to do this, log in to the CMC, go to Applications > Platform Search Application > Properties, and check 'Scheduled Crawling'. Save and Close.

Navigate to the program object for Platform Search under Folders in the CMC and schedule it to run after working hours.

One fine day, you as a BO admin may find that Tomcat is going down frequently, a couple of times a week, and you may not have any clue what's going on.

In our case we had this issue, and hence I thought I would share it with you all.


This is likely due to the compression bug in Tomcat.

Or check the logs and see if there is any indication of a problem with the max PermGen being reached (OOM - out of memory).



As a resolution, the first thing is to disable compression in server.xml (/path/to/tomcat/server.xml): look for the connectors that have compression=on and change it to compression=off.

The second thing is to increase the maximum PermGen space for Tomcat. This can typically be done in the environment shell script (env.sh/setenv.sh).


Knowing all this, we modified Tomcat/conf/server.xml to turn compression off, changing compression=on to compression=off.

We also cleared out the Tomcat cache under Tomcat/work/Catalina/localhost.
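The two changes above can be scripted. Here is a minimal sketch, assuming a standard Tomcat layout under $CATALINA_HOME; the path and the quoted attribute form compression="on" are assumptions about your server.xml, so verify them against your own file first:

```shell
# Demo the compression edit on a temporary copy of a connector line,
# since touching a live server.xml here would be unsafe.
tmp=$(mktemp)
echo '<Connector port="8080" compression="on" />' > "$tmp"
# The same sed would be run against $CATALINA_HOME/conf/server.xml:
sed -i 's/compression="on"/compression="off"/' "$tmp"
result=$(cat "$tmp")
echo "$result"
rm -f "$tmp"
# Clearing the work cache afterwards (hypothetical path):
# rm -rf "$CATALINA_HOME/work/Catalina/localhost"/*
```

After both changes, restart Tomcat so the new connector settings take effect.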


Then, we were able to login to CMC.


Then we changed the value of -Xmx to 4096 and added the -Xms parameter with a value of 256, both in the JAVA_OPTS variable in the tomcat/bin/setenv.sh script.

For more details, refer to SAP Note 1750952 - BI4 Setting JAVA_OPTS for Tomcat.
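A minimal sketch of the corresponding setenv.sh fragment; the unit suffixes (m) and the MaxPermSize value are assumptions, since the text gives only the numbers 256 and 4096:

```shell
# Fragment of tomcat/bin/setenv.sh (sketch): raise heap limits for Tomcat.
# -Xms256m sets the initial heap, -Xmx4096m the maximum heap.
# -XX:MaxPermSize applies to Java 7 and earlier only, and 512m is an assumption.
JAVA_OPTS="$JAVA_OPTS -Xms256m -Xmx4096m -XX:MaxPermSize=512m"
export JAVA_OPTS
echo "$JAVA_OPTS"
```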

Log in to the CMC and go to the Monitoring page. Go to the Watchlist tab and click New.

2014-05-27_14h25_56.png

 

Give your watch a name and select "Two (Ok, Danger)". Click Next.

2014-05-27_14h27_22.png

Type "disk" as the filter and choose "Free Disk Space in Root Directory". Change the value of Danger. In this example I set it to less than or equal to 100 GB.

2014-05-27_14h29_56.png

Click Next. At the bottom, set the notification settings and save your watch. You can also add e-mail alerts. Now copy a large file to your disk to test the watch. If your free disk space drops below 100 GB, you will get an alert depending on your notification settings.

2014-05-27_14h35_28.png
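The danger rule configured above amounts to a simple threshold comparison. Here is an illustrative shell sketch; the free-space value is hard-coded for the demo, and on a live box it would come from df:

```shell
# Mirror of the watch logic: danger when free space <= 100 GB.
free_gb=250        # example value; on a live box read the Avail column of: df -BG /
threshold_gb=100
if [ "$free_gb" -le "$threshold_gb" ]; then
  status="DANGER"
else
  status="OK"
fi
echo "$status"
```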

There is a simple way to recover a password from a SAP BO BI relational connection with a few lines of code, useful when you have forgotten it, or when you need to check whether the security rules in your baseline work as expected and nobody can discover the database connection passwords used by SAP BO BI reports with a simple logon and a simple application. You can also test with a non-administrator username to see whether the password is visible to you.

The code is based on the SAP BO SDK (Java or .NET); in the test performed, it was passed these parameters: CMS system, username, password, authentication type, and the CUID of the connection.


1. Main Source Code

I adapted the original code from my connector "SBOPRepositoryExplorer", which explores the CMS repository through a simple universe in real time (How to explore SAP BusinessObjects BI CMS Repository), and ran a test to confirm the vulnerability. Due to SAP copyright policies I am not allowed to publish the code.

 

2. Create a simple user in the CMC associated only with the group Everyone

Create a user in the CMC that belongs to no group other than Everyone:

 

CMC1

 

CMC2

 

3. Create a simple connection to some database


For example, in IDT create a simple relational connection to test with.

 

IDT

 

In our example username is "userOracleTst" and password "simplePassword123.".


4. Test from command line

 

"C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win32_x86\sapjvm\bin\java.exe" -jar "D:\app\Tools\ConnectionProperties.jar"  BO4Server tstuser simplePassword123. secEnterprise Ace8qbWCgDlFvUmUYlHxuHs

Where ConnectionProperties.jar is a compiled application to check the content of the connection.


BO4Server is the CMS server.

tstuser is the username.

simplePassword123. is the password.

secEnterprise is the authentication method used.

Ace8qbWCgDlFvUmUYlHxuHs is the CUID of the connection created.

 

result

 

5. Workaround


It can be a little bit dangerous that anyone with a simple username in BO can discover our DB connection password. This right was introduced in BI 4.0 SP3 to secure the connection parameters (typically username, password, server name) that are downloaded for offline Web Intelligence. Indeed, offline Web Intelligence needs to keep a copy of the connection (username, password, server name, etc.) in order to access the DB without being connected to the CMS. To address the danger of this approach, it is possible to deny the right in the CMC via the option "Download connection locally".

 

If the right "Download connection locally" is granted, you can use WebI offline, but the connection parameters can be downloaded.

 

If the right "Download connection locally" is denied, all sensitive connection parameters remain on the CMS, and thus WebI can no longer be used offline. Since the connection parameters remain on the CMS, all DB access is performed server side.

 

For more information see p. 845 in the Business Intelligence Platform Administrator Guide

 

 

Jorge Sousa

Hi

 

SAP BI Platform ships with Subversion, a third-party version control tool, as part of BI 4.x. Subversion is used to maintain different versions of objects in SAP BI.

 

Below are the Subversion terms that come into play.

 

1. VMS repository: a global shared location where Subversion stores all item versions, history, lock/unlock status, etc.

                                  Default location on Windows: <Install_dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\LCM_repository\svn_repository

                                  Default location on Linux: <User_home_dir>/LCM_repository/svn_repository

 

2. Working copy: a snapshot of the repository. The VMS repository is global; the working copy is local. The Adaptive Processing Server writes to the working copy first and then, using svn commands, updates the global copy, i.e. the VMS repository.

                              Default location on Windows: <Install_Dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\CheckOut\BIPW08R2B6400\WORKSPACE

                              Default location on Linux: <Install_Dir>/sap_bobj/enterprisexi40/CheckOut/BIPW08R2B6400/WORKSPACE

 

Normally, after an installation or upgrade, you may face issues where the CMS version and the VMS version don't match in Version Management. This can happen because the working copy is not in sync with the VMS repository.

 

Image.jpg

 

Since the VMS repository and the working copy are out of sync, we cannot add the particular report, which in this case is the "Input Control and variable" Webi report. This is a kind of deadlock situation where the only option left would seem to be resetting the repository.

 

If we manually check the VMS repository using the command

 

svn ls svn://localhost:3690/svn_repository

 

you will get the output below:

 

image2.png

 

***Note: displayed above is the CUID of the Webi report. In the working copy and the VMS repository, BI objects are stored as folders named after their CUID.

 

But if you navigate to the working copy (<Install_Dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\CheckOut\BIPW08R2B6400\WORKSPACE), the folder is missing.
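One way to spot such missing folders is to compare the CUID listing from the VMS repository (svn ls svn://localhost:3690/svn_repository) with a directory listing of the WORKSPACE folder. This sketch uses mock CUID values, since a live deployment is assumed:

```shell
# Mock listings standing in for the two sources (CUID_A and CUID_B are invented):
#   VMS side:          svn ls svn://localhost:3690/svn_repository
#   working-copy side: directory listing of the WORKSPACE folder
printf 'CUID_A\nCUID_B\n' | sort > /tmp/vms_list.txt
printf 'CUID_A\n'         | sort > /tmp/wc_list.txt
# Lines only in the VMS listing = folders missing from the working copy
missing=$(comm -23 /tmp/vms_list.txt /tmp/wc_list.txt)
echo "$missing"
rm -f /tmp/vms_list.txt /tmp/wc_list.txt
```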

 

This is the reason for the mismatch between the CMS version and the VMS version: the CMS version is retrieved by the Subversion command from the working copy (<Install_Dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\CheckOut\BIPW08R2B6400\WORKSPACE), while the VMS version is retrieved from the VMS repository.

 

The possible solution is to sync the objects between the VMS repository and the working copy.

 

You can use the command below to achieve this:

 

svn update svn://localhost:3690/svn_repository "<Install_Dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\CheckOut\BIPW08R2B6400\WORKSPACE"

 

 

image3.png

 

Now, if you navigate to the working copy "<Install_Dir>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\CheckOut\BIPW08R2B6400\WORKSPACE", you will find a folder with the CUID of the same report. Also, if you go back to CMC -> Version Management, you will find that the CMS and VMS versions are in sync, and you can now check in or delete accordingly.

 

image4.png

 

 

I hope the steps performed above help you.

 

Please provide your inputs.

 

Thanks

In SAP Business Objects installations that use SAP SQL Anywhere as the default CMS and Audit repositories, database logging is not enabled.  If you ever run into the situation where the database cannot start, it's useful to enable logging to find out why that is happening.  Here's how you do that for both Windows and Unix deployments.

 

Windows Installations

 

Database logging is enabled by adding a startup parameter when the database starts. The database startup parameters are specified in the Windows service for the CMS and Audit databases.  You'll need to install the SAP SQL Anywhere Client on your BOBJ system to run the database administration tools.  Instructions on how to obtain the SQL Anywhere Client are here: Database Administration Tools for SQL Anywhere CMS and Audit DBs.

 

  1. After installing the SQL Anywhere Client, launch Sybase Central.
  2. In the right panel, double-click on "SQL Anywhere 12".

    SCmain.png
  3. Switch to the Services tab.  You'll see the database service "SQLAnywhereForBI".  Double-click on that entry to see its properties.

    DBServiceCMS.png
  4. Switch to the Configuration tab to see the database startup parameters.

    ServiceProperties.png
  5. That's where you'll add the additional parameter to enable logging.  Add the following at the beginning of the Parameters text box:

    -o "C:\BOBJ\dblogs\cmslog.txt"

    In this example, all database server messages will be logged in the file cmslog.txt located in the specified directory.  You can obviously choose a different location and/or file name for your deployment.

    ServiceProperties-Log.png
  6. Click the Apply button.  Close the Service Properties dialog and shut down Sybase Central.

 

Important Notes:

 

  • You'll need to restart the database service to create the log file.  You can do this from Sybase Central, from the Services applet in Control Panel, or by restarting the Business Objects server.
  • After troubleshooting and resolving any issues, disable database logging in your production system.  For performance reasons, we recommend that production database servers run without any logging.

 

Here's an example of what appears inside the database log file:

 

I. 05/22 11:08:33. Starting database "BI4_CMS" (C:\Program Files (x86)\SAP BusinessObjects\sqlanywhere\database\BI4_CMS.db) at Thu May 22 2014 11:08
I. 05/22 11:08:33. Starting database "BI4_Audit" (C:\Program Files (x86)\SAP BusinessObjects\sqlanywhere\database\BI4_Audit.db) at Thu May 22 2014 11:08
I. 05/22 11:08:33. Transaction log: BI4_CMS.log
I. 05/22 11:08:33. Transaction log: BI4_Audit.log
I. 05/22 11:08:33. Starting checkpoint of "BI4_Audit" (BI4_Audit.db) at Thu May 22 2014 11:08
I. 05/22 11:08:33. Starting checkpoint of "BI4_CMS" (BI4_CMS.db) at Thu May 22 2014 11:08
I. 05/22 11:08:33. Finished checkpoint of "BI4_Audit" (BI4_Audit.db) at Thu May 22 2014 11:08
I. 05/22 11:08:33. Finished checkpoint of "BI4_CMS" (BI4_CMS.db) at Thu May 22 2014 11:08
I. 05/22 11:08:33. Database "BI4_Audit" (BI4_Audit.db) started at Thu May 22 2014 11:08
I. 05/22 11:08:33. Database "BI4_CMS" (BI4_CMS.db) started at Thu May 22 2014 11:08
I. 05/22 11:08:33. Database server started at Thu May 22 2014 11:08
I. 05/22 11:08:33. Trying to start SharedMemory link ...
I. 05/22 11:08:33.     SharedMemory link started successfully
I. 05/22 11:08:33. Trying to start TCPIP link ...
I. 05/22 11:08:33. Starting on port 2638
I. 05/22 11:08:34.     TCPIP link started successfully
I. 05/22 11:08:35. Now accepting requests

 

Unix Installations

 

The database startup parameters are specified in the file <bobj_install>/sap_bobj/sqlanywhere_startup.sh, where <bobj_install> is the location where you installed SAP Business Objects.

 

  1. Edit that file (sqlanywhere_startup.sh).  The database startup parameters appear at the end:

    dbspawn -f dbsrv12 -gk all -n "$SQLANYWHERE_SERVER" -x "tcpip(port=${SQLANYWHERE_PORT};DoBroadcast=NO;BroadcastListener=NO)" "${SQLANYWHERE_CMS_DBFILE}" "${SQLANYWHERE_AUDIT_DBFILE}"

 

  2. Add the following parameter to enable logging:

    dbspawn -f dbsrv12 -o /home/bo4user/cmslog.txt -gk all -n "$SQLANYWHERE_SERVER" -x "tcpip(port=${SQLANYWHERE_PORT};DoBroadcast=NO;BroadcastListener=NO)" "${SQLANYWHERE_CMS_DBFILE}" "${SQLANYWHERE_AUDIT_DBFILE}"

    In this example, all database server messages will be logged in the file cmslog.txt located in the specified directory.  You can obviously choose a different location and/or file name for your deployment.

 

 

Important Notes:

 

  • You'll need to restart the database service to create the log file.  You can do this by executing the shutdown/startup scripts.
  • After troubleshooting and resolving any issues, disable database logging in your production system.  For performance reasons, we recommend that production database servers run without any logging.

 

SQL Anywhere Database Server Log Files

 

The -o filename parameter prints all database server messages to the specified log file.  There are also a few other parameters that you can use:

 

Parameter                Description
-oe filename             Log startup errors, fatal errors, and assertions.
-on { size[ k | m | g ] }  Specify a maximum size for the database server message log, after which the file is renamed with the extension .old and a new file is started.
-os { size[ k | m | g ] }  Specify a maximum size for the database server message log file, at which point the file is renamed.
-ot logfile              Truncate the database server message log file and append output messages to it.

 

For complete details on SQL Anywhere database server startup parameters, please refer to the documentation: http://dcx.sybase.com/1201/en/dbadmin/server-database-dbengine.html.
