
IDM Sizing Guide - empirical data for statistics

Former Member

Hi,

In the current IDM sizing tool we use a formula to calculate vCPUs, memory, and disk.

While this approach gives a result, we would like to improve it further with empirical sizing data: real-life configuration data from customer installations, mapped to T-shirt sizes.

I'm asking IDM consultants and experts who have field data to contribute, so that we can gather enough statistical data.

You can post it here or send it to me:

- either a simple set of [Number of identities, vCPUs, Memory, Disk, System Type (DB/Runtime, UI, VDS)] (see the example after the table below)

- or an extended one like the table below:

Variable | Description | Sample Figure
ANT | Average Number of action Tasks towards target systems (including the identity store). Typically 5. | 5
APE | Audit per Entry: average size in KB of the audit log for one user. The audit log includes information about tasks executed on the user. Typically 1 KB. | 1
MKA | Months to Keep Audit | 100
NCM | Number of Changes per Month | 10 000 000
NIO | Peak Number of Operations per Hour on the Identity Store | 1 000
NOE | Number of Entries | 1 000 000
NOR | Number of Revisions of historical user data | 5
NOS | Number of Connected Systems | 75
NPO | Peak Number of updates per hour to the identity store leading to provisioning. Updates can come from the Identity Management User Interface, a job, or an action task. | 280
NPPE | Peak number of Entries to be Processed in Parallel per hour. If one user is provisioned into two different systems, this counts as two operations. Note that this number does not take into account the time spent on the system being provisioned to. | 180
SCE | Size of Content per Entry in MB. This number may vary depending on which attributes are stored on each entry; it will be higher when including, for instance, pictures or other binary attributes. | 0.500
TEPH | Peak Task Executions per Hour | 110 000
SAPS | 100 SAPS is defined as the computing power to handle 2,000 fully business-processed order line items per hour. In technical terms, this throughput is achieved by processing 6,000 dialog steps (screen changes) with 2,000 postings per hour in the SD benchmark, or 2,400 SAP transactions. | 46 200
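For the simple format, a purely hypothetical row (made-up values, just to show the shape of the data) could look like: [50 000 identities, 8 vCPUs, 32 GB memory, 500 GB disk, DB/Runtime].

And to give a feeling for how the extended variables combine (a back-of-the-envelope illustration only; the actual sizing tool formula may differ): with the sample figures above, the audit log alone would need roughly NCM x APE x MKA = 10 000 000 changes/month x 1 KB x 100 months, or about 1 TB of disk, and the entry data roughly NOE x SCE = 1 000 000 x 0.5 MB, or about 500 GB.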

Or anything else from your experience with sizing IDM that you can share in this discussion.

Best wishes,

Fedya

Accepted Solutions (1)

Former Member

Hi Former Member, @stef,

As the topic is back on the agenda, here are some queries we could use to get a sense of the size of the system:

Number of managed users

select count(*) from mxi_entry  where mcEntryType = 'MX_PERSON'

Number of managed Business Roles

select count(*) from mxi_entry  where mcEntryType = 'MX_ROLE'

Number of Managed Technical Roles

select count(*) from mxi_entry  where mcEntryType = 'MX_PRIVILEGE'

Number of Entries

select count(*) from mxi_entry

Audit per Entry (average number of times each entry was changed)

select avg(cast(countAudits as float)) from (select count(*) as countAudits from mxp_ext_audit group by Aud_OnEntry) counts -- cast avoids an integer-truncated average

Number of Connected Systems

select count(*) from MC_REPOSITORY

Number of assignments

select count(*) from mxi_link

Number of active links

select count(*) from mxi_link where mclinkstate = 0
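If useful, the distribution of links across all states (not just the active ones) can be seen with a simple grouping, assuming mclinkstate encodes the state as in the query above:

select mclinkstate, count(*) from mxi_link group by mclinkstate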

Audit size information

select count(*) from mxp_audit

select count(*) from mxp_ext_audit
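The row counts give a first indication; for the actual on-disk footprint of the audit tables, assuming the Identity Store runs on Microsoft SQL Server (adjust accordingly for Oracle or DB2), something like this could help:

exec sp_spaceused 'mxp_audit'

exec sp_spaceused 'mxp_ext_audit'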

And then, knowing this: what are the actual vCPUs, memory, and disk?
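If it is easier to run everything in one go, here is a sketch that collects the counts above into a single result set (plain SQL that should work as-is on SQL Server; adjust as needed for other databases):

select 'Managed users' as metric, count(*) as value from mxi_entry where mcEntryType = 'MX_PERSON'
union all select 'Business Roles', count(*) from mxi_entry where mcEntryType = 'MX_ROLE'
union all select 'Technical Roles', count(*) from mxi_entry where mcEntryType = 'MX_PRIVILEGE'
union all select 'Entries', count(*) from mxi_entry
union all select 'Connected Systems', count(*) from MC_REPOSITORY
union all select 'Assignments', count(*) from mxi_link
union all select 'Active links', count(*) from mxi_link where mclinkstate = 0
union all select 'Audit rows (mxp_audit)', count(*) from mxp_audit
union all select 'Audit rows (mxp_ext_audit)', count(*) from mxp_ext_audit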

Some of the resource-intensive scenarios one could imagine are provisioning users to several target systems. To get a rough idea of how many:

select count(*) from mxi_entry where mcEntryType = 'MX_PRIVILEGE' and mcDisplayName like '%ONLY' -- and not like ':READONLY'

Another interesting case is adding a new target system: within what time frame is it expected that all users are provisioned? But let's start simple.
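A very rough estimate, purely my assumption and not a validated formula: taking the sample figures above, TEPH = 110 000 task executions per hour divided by ANT = 5 tasks per user gives about 22 000 users per hour, so an initial load of NOE = 1 000 000 users into a new system would take on the order of 45 hours. Real numbers from the field would be far more valuable than this kind of guesswork, which is exactly the point of this thread.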

Best wishes,

Fedya

former_member2987
Active Contributor

Thanks, Fedya.  I'll be taking a look at this real soon.

Matt

Answers (2)

Former Member

I agree with Matt. SQL queries would make it a lot easier. I would love to contribute data to get the tool going.

former_member2987
Active Contributor

Fedya,

A first step toward gathering this information might be the creation of a tool, or a set of queries, that we can run to collect it.

Any ideas?

Thanks,

Matt