
Recommended BPM Context size

Former Member

Hi Experts,

We are currently on PO 7.4.

Lately, I have been reading about BPM performance and came across a recommendation that advises keeping the BPM context size as small as possible in order to avoid performance issues. I have the following doubts on this:

1. On average, what should the limit of the context element count be to avoid a performance hit? 20-30, 60-70, 90-100, or more than that?

2. By mapping the context to a reporting data source, would it be possible to transfer the data burden to the reporting data source?

3. As it is not possible to insert directly into the reporting database through APIs, we would need to capture the business data in the process context first and then map it to the reporting activity. Doesn't that mean the context size will increase, resulting in a performance hit?

I would really appreciate your detailed views on this.

Thanks in advance.

Accepted Solutions (0)

Answers (3)


Former Member

Also keep in mind that you can only store FLAT data and simple data types in reporting data sources.

There is no way to store data tables (0..n cardinality) within BPM reporting data sources.
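Because a reporting data source only accepts flat, simple-typed values, a 0..n structure has to be flattened before it can be reported on. As a generic illustration only (this is not a NW BPM API; the class and field names are hypothetical), a parent with n children can be turned into n flat rows:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Generic sketch: flatten a parent/child (0..n) structure into
 * flat rows suitable for a flat reporting target. All names here
 * are hypothetical illustrations, not part of any BPM API.
 */
public class FlattenForReporting {

    // One flat row: only simple types, no nested lists.
    record FlatRow(String orderId, String item) {}

    /** Produce one flat row per child item. */
    static List<FlatRow> flatten(String orderId, List<String> items) {
        List<FlatRow> rows = new ArrayList<>();
        for (String item : items) {
            rows.add(new FlatRow(orderId, item)); // repeat the parent key per row
        }
        return rows;
    }

    public static void main(String[] args) {
        flatten("4711", List.of("bolt", "nut"))
            .forEach(r -> System.out.println(r.orderId() + "," + r.item()));
    }
}
```

Each reported row then repeats the parent key, which is the usual trade-off when a flat target has to represent a 0..n relation.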

Former Member

Hi

Check out DOs and DONTs with SAP NW BPM: http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/30189551-12db-2e10-f58a-bbf241466...

It says:

"Decide which data to store in the process context, and which data to retrieve when the User Interface (UI) is initialized; the recommendation is to keep the process context as small as possible."

There is no absolute recommendation possible, as it depends on the individual situation, but the smaller your context is, the better.

A good way to go is to store your data outside the BPM context and keep only the keys needed to access it within your context.
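The approach above is essentially a claim-check pattern: persist the heavy payload externally and carry only a small key through the process. A minimal, self-contained Java sketch, with the external store simulated by an in-memory map (in a real NW BPM scenario this would be a database table or service called from an automated activity; all names here are hypothetical):

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Claim-check sketch: the large payload lives outside the process,
 * and only a small key would be kept in the BPM process context.
 * The "external store" is simulated with a map for illustration.
 */
public class ClaimCheckSketch {

    // Stand-in for an external persistence layer (hypothetical).
    private static final Map<String, String> externalStore = new ConcurrentHashMap<>();

    /** Persist the heavy payload externally and return a small key. */
    static String storePayload(String largePayload) {
        String key = UUID.randomUUID().toString();
        externalStore.put(key, largePayload);
        return key; // only this key would go into the process context
    }

    /** Resolve the payload again, e.g. when a UI or mapping needs it. */
    static String loadPayload(String key) {
        return externalStore.get(key);
    }

    public static void main(String[] args) {
        String key = storePayload("...very large business document...");
        // The context carries just the key (a few bytes), not the document.
        System.out.println(loadPayload(key));
    }
}
```

The context then stays small regardless of how large the business document grows, at the cost of one extra lookup wherever the data is actually needed.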

former_member191643
Active Contributor

Hi Avinash,

Your concern is totally valid.

To answer your questions,

1. There is no pre-defined limit for process context elements (data objects), but it is advised to keep the process context light to avoid complex mappings and heavy executions. When a data object is created in the process and deployed on the server, the data passed to it (and its structure) is stored on the server in the form of BLOBs (Binary Large Objects). From that you can estimate how much space it will occupy on the server and how much process execution will suffer.

2. Mapping the process context data to reporting data source activities will not help at all. When you do that, you are only copying the data from the process context to the data source, not transferring it. Data in the process context stays as it is until the process is completed and archived.

3. Yes, it is not possible to insert data directly into the reporting database. That is why it is advised to keep the process context as light as possible. Use a 0..n or 1..n node in a data object wherever possible, because both a string and a list are treated as a single BLOB once deployed.

I have mentioned all this assuming typical server capabilities. If the server has a high-end configuration and ample RAM, this should not be a problem; normally it is very lightweight.
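To get a feel for the BLOB footprint mentioned in point 1, one can roughly gauge the serialized size of a context-like structure with plain Java serialization. This is only a hypothetical order-of-magnitude check (the server's actual BLOB storage format differs, and the class here is invented for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

/**
 * Rough estimate of how big a context-like structure becomes once
 * serialized. Treat the number only as an approximation; the BPM
 * server's internal BLOB representation is different.
 */
public class ContextSizeEstimate {

    /** Hypothetical stand-in for a process context data object. */
    static class OrderContext implements Serializable {
        String orderId;
        List<String> lineItems = new ArrayList<>();
    }

    /** Serialize the object to memory and report the byte count. */
    static int serializedSize(Serializable obj) {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
            out.flush();
            return bytes.size();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        OrderContext ctx = new OrderContext();
        ctx.orderId = "4711";
        for (int i = 0; i < 1000; i++) {
            ctx.lineItems.add("item-" + i); // simulate a growing 0..n node
        }
        System.out.println("approx serialized bytes: " + serializedSize(ctx));
    }
}
```

Running it with 10 items versus 1000 items makes the growth of the stored structure visible, which is the practical argument for keeping the context small.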

Hope this answers at least some of your doubts.

Regards,

Siddhant.