Scenario: Terminated users may have their user IDs locked with the security administrative lock and all roles removed from the account, or they may have been removed from the system entirely, while workflow items reserved by them are still pending completion.



Solution 1: GRC Process Control provides the Replacement/Removal functionality to reassign these workflow items to a different user.

To use it, open the Access Management tab and go to ‘Replacements’.



Click the ‘Replace or Remove’ button.


Under ‘In:’ choose ‘Deleted User Name’, and in ‘Find:’ enter the exact, complete user ID of the terminated user. Then click Next and perform the replacement as you would for any other user.



Related SAP Notes: 1904421, 1927964.


Solution 2: If you have administrative rights in the backend system, you can also forward these hanging work items directly from the backend using transaction SWIA.

Simply search for the work item ID and use the Forward icon. Just remember that SWIA does not allow you to forward a work item to your own user.

This is a weekly blog listing the five latest SAP Notes with Access Control corrections released to customers by SAP!



1 - 2290322 - UAM: Missing Reviewer Agent for Notification purpose in UAR and SOD workflow configuration





The Reviewer agent is missing for notification purposes in the UAR and SOD process types. To verify, open the MSMP configuration, select the process type for UAR or SOD, and go to the Maintain Agents tab.



2 - 2294014 - HANA role does not get saved in the target system




If a HANA role is created through Access Control (Role Maintenance), the role is not saved in the target system. How should the role methodology be configured for HANA roles?


3 - 2229853 - GRC and S/4HANA oP: compatibility information





You want to use GRC with S/4HANA on-Premise (oP).  Which points do you need to consider?


4 - 2266192 - Truncation of Firefighter ID Description





The description of the Firefighter object gets truncated.




5 - 2291174 - AC10.X Incorrect column name in generated permission rules table





Incorrect column name in the permission rule table while generating rules after creation of access risks.



Bookmark this blog to receive updates every week!


Rafael Guimbala

I have sat on both sides of the table: I have been the consultant working with clients to implement SAP GRC Access Control components, and I have been a customer member of the project teams. In today's tight budgetary climate, exploitation projects are sometimes the best way to get project funding: configuring and implementing an additional component of a solution already bought, installed, and licensed. Such exploitation projects can be huge wins: the customer gets more value from a solution already live, for better ROI. So what's not to like?


Here is the part that can be overlooked in those rosy, halcyon early days post go-live of the additional component.  Chances are good that your project scope went to go-live, with the consultancy providing some limited  time of "hypercare." Hurrah, it works! They are using it! The process works as designed and documented. The project sponsors are happy. Bye bye, good luck, it's been great, let us know when you are ready for the next step in the roadmap. And on down the road they go to their next engagement.


Now you have to support this thing, possibly with just a few tweaks to your previous support processes, but other times the new solution requires processes that are brand new, with new risks and opportunities.


That's no biggie, there are bound to be lots of blogs, wikis, and presentations online covering leading practices for production support for all of the Access Control components.  Mmm, noooo, not really. SCN has a treasure trove of resources for going from installation to the go-live, like those listed in this compendium,

SAP Access Control - Useful Documents, Blogs, Resources, etc.

and discussions with tips for dealing with all kinds of issues and  the "undocumented features" of some support packs, but production support?   Welcome to "ownership:" you are on your own.


To be honest, it is not so surprising; the majority of the people who post on SCN do not work in production support, maybe never have, or are only called in when something is broken. Yes, there are some SAP customers who post here, but we seem to be rather in the minority. And who among the customers is going to boldly proclaim that they can advise on leading practices? Perhaps some of us just need a bit of encouragement.


Presentations at TechEd? If only; something process oriented would be considered "not technical enough." The SAPInsider GRC Conference? No, not there either. The SAP user groups? There I found some great presentations on new features and roadmaps, use cases and implementation case studies, and one excellent presentation on administering your GRC system, but even that one was focused on best practices for dealing with problems. It seems that production support processes are not glamorous or exciting enough for presentations.


I plan to post a few specific questions, but this is my ask to followers of this space: anyone who has great ideas for production support processes for the GRC Access Control components, the field is wide open! You are cordially invited to step right up and share your experiences, especially those who have been doing it for years. Once we get this new process sorted out, I will publish a post, but don't wait for me. You don't have to claim that you have all the answers, or that your processes are one size fits all. Just sharing what works for you might help the next poor sod who implements that component and then says to herself, "OK, now what?"

T. de Jong

Risk Terminator

Posted by T. de Jong Jan 18, 2016

Risk Terminator is a ‘hidden’ feature of SAP GRC Access Control that can be used to analyze role and user assignments for access risks in the backend system. Depending on the GRC configuration, Risk Terminator can work in a detective mode (reporting on access violations) or a preventive mode (the system prevents violations from being introduced into roles).

figure 1.jpg

Whenever a change is made to an existing SAP role, or a new role is created, the content of the role is first checked against the access rules established in the GRC rule set (which resides in the GRC environment). By using Risk Terminator, the role administrator can immediately remediate access violations in development and play an important role in keeping the system clean (by preventing violations from being introduced in production).

Risk Terminator can also be used when assigning roles to users, which can be a powerful feature in production.

Now let’s take a closer look at Risk Terminator.


Example A:

The role ztestriskterminator is created in PFCG (profile generator) by the role administrator.

figure 2.jpg

The role administrator adds the purchase order maintenance transaction codes ME21N and ME22N, and the PO approval transaction code ME29N, to the role.

figure 3.jpg


The authorization objects ‘Document type in purchase order’ and the release code and group, which are required to create/change a purchase order and to release one, are not restricted in the role.

figure 4.jpg


When the role’s profile is generated, the GRC rule set is called and the role is analyzed for possible access violations.

figure 5.jpg

One access risk is detected, as shown above. The assignment of the conflicting activities, maintaining a purchase order and releasing a purchase order, is called an SoD conflict (segregation of duties conflict). With the detailed information, the role administrator can take the proper action and remediate the violation if he or she thinks it necessary.
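Conceptually, the rule-set check behind this detection can be sketched as follows. This is a minimal illustration only, not SAP's implementation; the risk ID, rule structure, and role contents below are hypothetical:

```python
# Minimal sketch of an SoD check: a rule set maps each access risk to the
# pair(s) of conflicting actions; a role violates a risk when it contains
# both sides of a conflicting pair. All names here are hypothetical.

SOD_RULES = {
    "P001 Maintain and release purchase orders": [
        ({"ME21N", "ME22N"}, {"ME29N"}),  # (maintain PO, release PO)
    ],
}

def find_violations(role_tcodes):
    """Return the access risks for which the role holds both conflicting sides."""
    violations = []
    for risk, pairs in SOD_RULES.items():
        for side_a, side_b in pairs:
            if role_tcodes & side_a and role_tcodes & side_b:
                violations.append(risk)
                break
    return violations

role = {"ME21N", "ME22N", "ME29N"}  # like the example role ztestriskterminator
print(find_violations(role))
```

In the real rule set, risks are defined over functions and authorization-object values, not just transaction codes, but the conflict-pair logic is the same idea.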


Other views such as management view are available as well just as in standard GRC.

figure 6.jpg

Depending on Risk Terminator’s configuration, the role administrator can choose to discard the changes, continue with the simulation, or generate the role’s profile despite the violations.

figure 7.jpg


Example B:

Risk terminator can also be used when assigning roles to users. In the example below the ztestriskterminator role will be assigned to user ZTEST.

figure 8.jpg

As a result, user ZTEST would be assigned the conflicting activities (maintain PO and release PO), which is an SoD conflict.

figure 10.jpg

The user administrator can choose to abort the role assignment or continue.

Example C:

Most organizations use the function/task role concept. This means that a (business) function is built out of one or more tasks. In SAP this is called the composite/single role approach. Risk Terminator adds value here as well.

The role administrator sets up a composite role named Master Data Officer and adds two task roles. The first task role grants access to vendor master data maintenance, and the other to confirming sensitive vendor changes (such as bank details or alternative payee).

pic 1.JPG

The ability to change the vendor master record (FK02) and to confirm sensitive vendor master record changes (FK08) should be separated. This SoD conflict is detected by the Risk Terminator tool.

picture 2.JPG
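The cross-role detection above can be pictured as checking the union of the task roles' transactions: the composite role is the union of its single roles, so the check must look across role boundaries. A hypothetical sketch (the role contents and conflict pair are illustrative, not a real rule set):

```python
# Hypothetical sketch: a composite role is the union of its single (task)
# roles, so an SoD check on the composite looks across role boundaries.

vendor_maint   = {"FK01", "FK02"}          # maintain vendor master data
vendor_confirm = {"FK08"}                  # confirm sensitive vendor changes
composite = vendor_maint | vendor_confirm  # "Master Data Officer"

CONFLICT = ({"FK02"}, {"FK08"})            # maintain vs. confirm must be separated

def has_conflict(tcodes, conflict=CONFLICT):
    side_a, side_b = conflict
    return bool(tcodes & side_a) and bool(tcodes & side_b)

print(has_conflict(vendor_maint))   # single role alone: no conflict
print(has_conflict(composite))      # composite: conflict across the task roles
```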

The role administrator chooses to assign the vendor confirmation role to another function in finance instead of assigning this role to the master data officer.

Especially in the development area, Risk Terminator proves to be a valuable asset to the role administrator, preventing SoD conflicts and sensitive access violations from being introduced into roles.

This post is a continuation of Security Weaver’s Process Auditor - Developer's Observations (Part 1)


g. How to Debug the Alert Generation

This section shows which areas of the generated program to focus on if you need to debug the logic that reads the rules and controls.

As usual, enter /h in the command field to start debug mode and press ENTER.







... Enter form FCODE_EXECUTE_CONTROL in include /PSYNG/SA_001F01...


Logic that reads the rules and controls
Include /PSYNG/SA_001F01’s form FCODE_EXECUTE_CONTROL contains the logic that reads the rules and controls...




Logic that calls the Fetch Data Code

Continue to single step to here...



Fetch Code program is called here...




Here is where the parameters for the control are read...



h. How to Import to a Target System

To assign the objects to a transport and import the development into a target system, follow these steps.

1. Create a transport request (manually or in SolMan ChaRM). Manually add the Control program and import it into the target system.



2. Now download the Control Matrix files:

a) Go to the source system.

b) Run the transaction /N/PSYNG/PA and go to the Misc. tab.

c) Click on Upload Download (Backup)



d) Select the Download radio button.

e) Check all the check boxes and specify the file path.



Select EXECUTE to download the selected files.



3. In the target system, upload the files:

a) Run the transaction /N/PSYNG/PA and go to the Misc. tab.

b) Click on Upload Download (Backup)

c) Check the Upload Radio button.

d) Check all the check boxes and specify the file paths you downloaded to from the source system.







In my Rule Details file, two empty records were created for unknown reasons. This caused an error on the upload. I edited the file in Notepad to delete the empty records after the download and uploaded the modified file.


To confirm the target system is as expected, check the following areas:
• The tabs
• The user exit(s) (For me, in the target system, I needed to access the screen on the tab Misc, then the entry appeared in the table.)


i. Observations

• When code changes are made, programs and the CODE file need to be downloaded/uploaded.

• By standard, files downloaded/uploaded contain data/values for ALL controls and programs, not just the ones you modified.


j. Additional Controls

For our requirements, the code to process tcodes FB05 and FB50 had the same ABAP, so I copied and pasted it to the new controls. The only differences were the tcodes noted in the control rules and in the variants.


Thank you for reading!  I hope this post will help your developer in your implementation!   If you have any questions or if I missed some details, please let me know and I will update this post.  

Cheers! 



This blog pertains to Security Weaver’s utility Process Auditor.  I recently had the opportunity to work with Security Weaver’s Process Auditor (or PA). In summary, PA is a utility that provides continuous compliance monitoring in SAP.


This previous post gives a general overview of the utility...


PA has several out-of-the-box controls to use; however, our requirement was to develop custom controls.  I was not able to find detailed documentation on how to configure and develop within the tool, or on how its areas are connected to each other and to the canned programs. The PA user guide (ProcessAuditor-UserGuideforv2 5PS3.pdf) contains user information and general information on how to create a new control and rule ID.  Start with that document first.  In this blog, I will discuss other technical details that are vague or not detailed at all in the user guide, as well as my observations, to help your developer/configurator with an installation.


Sections covered in this blog include:

1) Our Requirements
2) The Process Auditor utility

a. Create a new Control and Rule Parameters
b. The Development Workbench:

- Output format (the format of the alert record)
- Fetch Data Code (when to generate the alert)
- Hotspot code (for hyperlink in the Inbox)

c. Online Execution
d. Execution in Background
e. How to view the Alerts
f. User Exits to Know About
g. How to Debug the Alert Generation
h. How to Import to a Target Systems
i. Observations
j. Additional Controls


Our Requirements

Our requirements were to create three custom controls in ECC for transaction codes FB01, FB05, and FB50 based on a specific document type. I’ll only show FB01 in this blog.

Basically, I needed to...

  1. Create a rule ID for each control
  2. Create control IDs
  3. Create hotspots
  4. Create background jobs to trigger alerts in the Process Auditor tool.


Process Auditor utility

Tcode /n/PSYNG/PA

a) Create a new Control

As explained in the user guide, here is the control I created:




Rule Parameters…

Rule parameters are passed to the program when the control is executed directly in the utility.  They are NOT passed to the batch job.  The batch job variant and special logic in the Fetch Data Code are needed to pass these values to batch processing. I’ll explain more in the Fetch Data Code section.


b) The Development Workbench

Alert Output Format

In this section, enter the format of the output record that will appear in the Alert.  I used the same name for the Rule ID as I did for the Control ID for clarity.



For the accounting debit and credit amounts above, I used a custom structure…


Fetch Data Code

Any time you enter the Fetch Data Code tab, you need to enter the Rule ID. Then, while keeping the cursor on the Rule ID field, press ENTER. This allows the existing code to be displayed...



If you move the cursor’s focus to either the Selection Screen or Fetch Data Code section and then press ENTER, the utility gets "confused" and it will appear as if no code exists, like this...



Don’t worry! Any existing code is still there. To fix this, just select a different tab and, when prompted, confirm that the changes are NOT to be saved.
Then, return to the Fetch Data Code tab and keep the cursor on the Rule ID field when you press ENTER.


The Selection Screen and Fetch Data Codes sections are Includes. They are inserted into the canned program created by the PA utility.


Here is the code I used in my Selection Screen section....





* Refer to include /PSYNG/SA_001F01 form submit_programs for fields
* passed in SELTAB:

** Used to create case. Not used in Fetch Code directly. Declared in
** main program
*                  p_ruleid for /PSYNG/SARULEHDR-RULEID.

** Used to read data:
SELECT-OPTIONS  : BELNR  for bkpf-belnr,
                  BLART  for bkpf-blart,
                  wrbtr for BSEG-WRBTR,
                  tcode  for bkpf-tcode,

* read data fields for testing...
                  BUDAT FOR SY-DATUM.

* When executed in batch, the rule parameters are not passed, so the
* batch job does not receive the standard parms ZCNTRLID or ZP_RULEID.
* Pass them in the variant, so the case id will be created by the batch job.
parameters      : zp_rule  like /PSYNG/SARULEHDR-RULEID,
                  p_batch  as checkbox. "=X if processing batch job. Set in variant only




Here is the code in my Fetch Data Code.....



* Flow:
* 1) Read data for selection parameters
* 2) Populate output fields that appear in inbox
* 3) Filter cumulative debit amounts based on parm value
* 4) Fill rule and control ids, needed for batch processing
* 5) Output for batch job spool
* Batch Job:
* When scheduling the batch job, the data is extracted for yesterday.
* Schedule the batch job at 00:01 (just after midnight) to read all of
* yesterday's data.

* Header data read from BKPF (accounting document headers).
* Note: gt_output is declared by the generated main program from the
* Alert Output Format.
DATA: BEGIN OF lt_bkpf OCCURS 0,
        BUKRS TYPE BKPF-BUKRS, "company code
        BELNR TYPE BKPF-BELNR, "document no
        GJAHR TYPE BKPF-GJAHR, "year
        BLART TYPE BKPF-BLART, "document type
        BUDAT TYPE BKPF-BUDAT, "posting date
        TCODE TYPE BKPF-TCODE, "doc created in tcode
      END OF lt_bkpf.

* Line item data read from BSEG (fields match the SELECT below)
DATA: BEGIN OF lt_bseg OCCURS 0,
        BUKRS TYPE BKPF-BUKRS, "company code
        BELNR TYPE BKPF-BELNR, "document no
        GJAHR TYPE BKPF-GJAHR, "year
        BUZEI TYPE BSEG-BUZEI, "line item
        BSCHL TYPE BSEG-BSCHL, "posting key
        KOART TYPE BSEG-KOART, "account type
        SHKZG TYPE BSEG-SHKZG, "debit/credit indicator
        GSBER TYPE BSEG-GSBER, "business area
        MWSKZ TYPE BSEG-MWSKZ, "tax code
        WRBTR TYPE ZPA_WRBTR,  "amount in transaction
      END OF lt_bseg.

DATA: gs_output LIKE LINE OF gt_output,
      gs_tabix  TYPE sy-tabix,
      tabix     TYPE sy-tabix VALUE 1.

DATA: ls_yesterday LIKE sy-datum.
DATA: v_bukrs LIKE bkpf-bukrs,
      v_belnr LIKE bkpf-belnr,
      v_gjahr LIKE bkpf-gjahr.

DATA: bkpf_curs   TYPE cursor,
      l_pack_size TYPE i VALUE 999999. "limit # of alerts



* Set date range to read. If not set, process as in production and
* default to only yesterday's records. If set, we are in testing mode.
    IF budat[] IS INITIAL.
*     for production processing, use yesterday's date only...
      ls_yesterday = sy-datum - 1.
      budat-sign   = 'I'.
      budat-option = 'EQ'.
      budat-low    = ls_yesterday.
      APPEND budat.
    ENDIF. "else: use the date range requested in the parm

* 1) Read data for selection parameters
    OPEN CURSOR bkpf_curs FOR
      SELECT bukrs belnr gjahr blart budat tcode
        FROM bkpf
        WHERE belnr IN belnr
          AND tcode IN tcode
          AND blart IN blart
          AND budat IN budat.

    DO.
      FETCH NEXT CURSOR bkpf_curs INTO TABLE lt_bkpf
                                  PACKAGE SIZE l_pack_size.
      IF sy-subrc <> 0.
        EXIT. "no more packages to read
      ENDIF.

      SORT lt_bkpf BY bukrs belnr gjahr.
      SELECT bukrs belnr gjahr buzei bschl koart shkzg gsber mwskz wrbtr
        FROM bseg INTO TABLE lt_bseg
        FOR ALL ENTRIES IN lt_bkpf
        WHERE bukrs = lt_bkpf-bukrs
          AND belnr = lt_bkpf-belnr
          AND gjahr = lt_bkpf-gjahr.

      SORT lt_bseg BY bukrs belnr gjahr.

      LOOP AT lt_bkpf.
        v_bukrs = lt_bkpf-bukrs.
        v_belnr = lt_bkpf-belnr.
        v_gjahr = lt_bkpf-gjahr.

        LOOP AT lt_bseg FROM tabix.
*         stop at the first line item of the next document...
          IF lt_bseg-bukrs <> v_bukrs OR
             lt_bseg-gjahr <> v_gjahr OR
             lt_bseg-belnr <> v_belnr.
            tabix = sy-tabix.
            EXIT.
          ENDIF.

* 2) Populate output fields that appear in inbox
          gt_output-bukrs = lt_bseg-bukrs.
          gt_output-belnr = lt_bseg-belnr.
          gt_output-blart = lt_bkpf-blart.
          gt_output-gjahr = lt_bkpf-gjahr.
          gt_output-budat = lt_bkpf-budat.
          gt_output-tcode = lt_bkpf-tcode.

          CASE lt_bseg-shkzg.
            WHEN 'H'. "credit
              gt_output-wrbtr = lt_bseg-wrbtr.
            WHEN 'S'. "debit
              gt_output-dmbtr = lt_bseg-wrbtr.
          ENDCASE.

          gt_output-ruleid    = cntrlid.
          gt_output-controlid = p_ruleid.

          COLLECT gt_output.
          CLEAR gt_output.
        ENDLOOP.
        CLEAR lt_bseg.
      ENDLOOP.
      CLEAR lt_bkpf.
      REFRESH lt_bkpf.
    ENDDO.

    CLOSE CURSOR bkpf_curs.


* 3) Filter cumulative amounts for DEBIT based on parm value
LOOP AT gt_output INTO gs_output.
  gs_tabix = sy-tabix.
  IF gs_output-dmbtr IN wrbtr. "min amt set in parms
    "...keep in list
  ELSE. "not in alert range, so filter from list...
    DELETE gt_output INDEX gs_tabix.
  ENDIF.
ENDLOOP.

SORT gt_output BY bukrs blart belnr.

* 4) Fill rule and control ids, needed for batch processing
IF cntrlid IS INITIAL.
  MOVE-CORRESPONDING zcntrlid TO cntrlid.
  APPEND cntrlid.
ENDIF.

IF p_ruleid IS INITIAL.
  MOVE zp_rule TO p_ruleid.
ENDIF.

* 5) Output for batch job spool
IF p_batch IS NOT INITIAL. "processing batch job?
  WRITE: / 'For date:', budat-option, space, budat-sign,
           space, budat-low.
  WRITE: / 'Parameters:',
         / 'BELNR=', belnr,
         / 'BLART=', blart,
         / 'TCODE=', tcode,
         / 'cntrlid=', cntrlid,
         / 'p_ruleid=', p_ruleid,
         / 'Amount (Credit)',
         / 'Amount (Debit)'.

  SKIP 2.
  WRITE: / 'Data Read:'.
  LOOP AT gt_output.
    WRITE: / gt_output-bukrs,
             gt_output-belnr,
             gt_output-blart,
             gt_output-budat,
             gt_output-wrbtr,
             gt_output-dmbtr.
  ENDLOOP.
ENDIF.

FREE: lt_bkpf, lt_bseg.



Hotspot Code

This logic is called when the Alert is selected in the Inbox (note that this does NOT apply to the Alert Report). 

This logic will call FB03 (display mode) for the accounting document hyperlink selected in the Inbox.




CHECK i_column_id-fieldname = 'BELNR'.

CLEAR gt_output.
READ TABLE gt_output INDEX is_row_no-row_id.
CHECK sy-subrc = 0.

* Verify the document header exists before navigating
SELECT SINGLE * FROM bkpf
  WHERE bukrs = gt_output-bukrs
    AND belnr = gt_output-belnr.
CHECK sy-subrc = 0.

* Navigate to FB03 (display document) for the selected row
SET PARAMETER ID 'BUK' FIELD gt_output-bukrs.
SET PARAMETER ID 'BLN' FIELD gt_output-belnr.
SET PARAMETER ID 'GJR' FIELD gt_output-gjahr.
CALL TRANSACTION 'FB03' AND SKIP FIRST SCREEN.
IF sy-subrc <> 0.
  MESSAGE e077(s#) WITH 'FB03'. "not authorized for transaction
ENDIF.



When you are done with the code sections, select SAVE and GENERATE. You will be prompted to enter the program name to save it under.  This program name is the one you will want to use later if you schedule a batch job for execution.


c) Execution Online

Online execution is perfect for testing.  The parameters used in the execution online will be the ones stored on the Rule Parameter screen.

In this example, only data for transaction FB01 for document type ZG created on 12/15/2015 with debit amounts greater than $200,000 will be processed in the alert…



To trigger the execution, select tab Controls -> Header. Enter the Control ID to execute and ENTER (with the cursor’s focus on the Control ID field)…



Select Execute Control button....



A new Case ID will be created in the Inbox (of the user assigned) and in the Alert Monitor...





...The format of these records comes from the Development Workbench -> Format tab.

...The Hotspot Code (hyperlink) is executed by selecting the Accounting Document Number. It will flow to FB03 in this case.


d) Execution in Background

To create the batch job, you can create it directly in tcode SM36, or you can create a "template" batch job using the wizard in the utility.  I use the word "template" because the wizard does not create the batch job correctly; I needed to modify the resulting job for it to run correctly.


I created one batch job for each custom control for clarity.

Create a Variant

The Rule Parameters are not used by the batch job, so you’ll need to create the variant to initialize those fields.


  • I’m using BUDAT in the Fetch Data Code to read a specific date’s data for testing. BUDAT is not set in the variant, since our requirement was to create alerts for all documents that meet the criteria; the Fetch Data Code is written for this requirement.
  • P_BATCH is used to print the values to the spool file. (Refer to the Fetch Data Code section)
  • The other values are similar to the Rule Parameters.


To Create the Batch Job using the Wizard

Go to tab Monitoring -> Process Controls.





To fix the incorrectly created batch job, edit the batch job step in SM37 to correct the Z program name and the variant.  Also, confirm and update the frequency.  My Fetch Data Code reads the prior day’s data, so the job is set to run daily just after midnight to pick up all of the prior day’s records.


e. How to view the Alerts


Alerts will be sent to the Inbox of the user assigned to the case ID.


Alert Report

They will also be sent to the Alert Report...




f. User Exits to Know About

Program /PSYNG/SA_009 - User Exit 100
This exit allows setting the SCHEDULE flag to 'X' if controls are to be reprocessed so they will appear again in the Inbox.
If controls are not to be reprocessed, remove this flag.

Activate the Exit in tab Misc….



My Exit code looks like this when generated…



* Report  /PSYNG/SA_009                                                *
* AUTHOR: Security Weaver, LLC                                         *
* COPYRIGHT: Security Weaver, LLC                                      *

REPORT /psyng/sa_009 MESSAGE-ID /psyng/sa.
DATA: schedule.

*& Form execute_user_exit
FORM execute_user_exit USING schedule.
  schedule = 'X'. "set to 'X' so controls are reprocessed
ENDFORM.



The exit code for the Schedule Flag is read here...


Program /PSYNG/SA_001F01 – User Exit 001

I found this exit in my analysis. It is not currently in use in my implementation.




*    WHEN 'PROC03'.                    "SOD Control Report
*      submit /PSYNG/SA_SOD_BY_HISTORY via selection-screen and return.
*                via selection-screen and return.
      SELECT SINGLE * FROM /psyng/sa_usrext INTO /psyng/sa_usrext
             WHERE exitnumber = 1.
      IF sy-subrc = 0.
        l_prog_name = /psyng/sa_usrext-exitname.
        PERFORM execute_user_exit IN PROGRAM (l_prog_name).
      ENDIF.



Due to the length of this post, this discussion is continued under...

Security Weaver’s Process Auditor - Developer's Observations (Part 2)

Sometimes you have an idea that resonates with others but you do not realize the complete requirements until you have exactly what you asked for.  Such is the case when GRC Access Control 10 was being developed.  I made a request to have the SAP user statistics data summarized into GRC Access Control along with some analysis reports.  At the time SAP supported an older process of Reverse Business Engineering (RBE) functionality which we used to support multiple metrics.  These reports included inactive transactions, unused roles, transactions most used, and more.  Later our business provided requirements using the RBE data to automatically remove roles when a user had not executed a transaction within a role for a specific number of days.  We also used this for our quarterly user access review and SOD analysis to determine if roles were no longer relevant or if an SOD violation could be resolved through the removal of an inactive role assignment.


To perform our consolidated user access review we manually exported the RBE data into a Microsoft Access database from each individual SAP system.  Some companies also do this by sending the data directly into a BW InfoCube.  Once we had this consolidated data we could fulfill the user metrics and related reporting requirements.  Since the SAP statistics data is summarized into monthly periods, we only performed this process once a month.  Although the batch jobs were scheduled to create the RBE data files, there was still a day of work to summarize all of our production systems into the Access database.


With the enhancements coming in GRC 10 and the delivery of several of our influence items, we built a business case to become a GRC Access Control ramp-up customer.  Fortunately, SAP delivered the Action Usage functionality which imports the transactional activity into GRC Access Control.  This was exactly what we asked for but, unfortunately not the best solution.  We started out performing the same process of exporting data to MS Access but only from GRC.


SAP scheduled a follow up meeting to see how we liked the new functionality.  We shared our metrics reporting, user access review process, role metrics details and SOD reduction efforts.  Having consolidated data helped us but it was not the best long term solution if we still needed to perform processing in MS Access.  We documented detailed requirements and these were delivered in GRC 10.1.


I now sympathize with our business users who provide us functionality requests.  Sometimes you do not know phase 2 requirements until you have actually used a system.  In the past I have prototyped solutions before documenting the requirements.  Sharing story boards and high level process flows would have been helpful here.  Once you have an improvement idea, you need to take the next step and document the requirements.  When someone only has an idea it leaves room for interpretation.  When there are detailed requirements, there are tangible deliverables.  Now that we have been using the Action Usage process for multiple GRC processes, additional requirements are being noted.


Across the many GRC 10 and 10.1 implementations there must be other customers that also use the Action Usage data.  What are some ways that you are currently using it?  Do you use it for UAR?  Remediation of SOD risks?  Validation before applying a mitigating control?  Forensic activity?  If you are using this data, how would you improve the process?  What reports or functions do you think still need additional development?  I am looking forward to your feedback.

Some customers are experiencing the following issue after upgrading to GRC 10.1 SP10 and SP11:


Although the repository sync job completes without any dumps, the table GRACUSERCONN is not updated.


To solve this issue, implement the SAP Notes below:


2221261 - Code changes in repository sync for performance improvement


2168872 - Inconsistent entries in table GRACUSER and GRACUSERCONN



2256786 - All data for Portal Users not updated in Repository Sync



If the issue still persists, follow the steps in the SAP Note below:

2253834 - Repository sync collection of corrections


For a missing user type in GRACUSERCONN, implement the SAP Notes below:


2250690 - Some Portal Users missing in Repository Sync


2259378 - Repository Sync issue with expired and locked users - plugin




Rafael Guimbala

One of the key functionalities of SAP Dynamic Authorization Management is to provide finer-grained access control. By finer-grained access control, we mean the ability to make access control decisions one level deeper than what standard roles can reach:


  • Transactional data access controls
  • Tab/view-level access controls
  • Field-level data access controls


For the purpose of this discussion, let’s see why we need field level access controls.


When we speak to prospects, we hear over and over that the reason they deny users access to certain transactions is that a sensitive piece of information sits in one of the fields. Standard SAP roles do not control data at the field level, and customization is not the route prospects want to take.


SAP Dynamic Authorization Management field-level controls protect the data at the domain level: irrespective of where the field is pulled from, and independent of the UI, the data is scrambled or masked and can only be viewed by users with the right attributes (applying ABAC principles).



Sample SAP DAM ABAC policy for securing Netvalue in Sales Order:



With SAP Dynamic Authorization Management, we can take virtually limitless conditions into consideration when making access control decisions, even at the field level. So the next time there is a complex security requirement, you know that there is a GRC product, SAP DAM, that can easily handle it.
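As an illustration of the ABAC principle described above, the following sketch shows a field-level masking decision driven purely by user attributes. This is not the SAP DAM policy engine or its syntax; the policy, attributes, and field names are hypothetical:

```python
# Illustrative sketch of an ABAC field-level masking decision (hypothetical,
# not the actual SAP DAM policy engine). The decision depends only on the
# user's attributes, not on which UI requested the field.

POLICY = {
    "field": "NETVALUE",  # field to protect, e.g. net value in a sales order
    "required_attributes": {"department": "Finance", "clearance": "High"},
}

def read_field(value, user_attributes, policy=POLICY):
    """Return the real value only if the user holds every required attribute;
    otherwise return a masked value."""
    required = policy["required_attributes"]
    if all(user_attributes.get(k) == v for k, v in required.items()):
        return value
    return "****"  # scrambled/masked for everyone else

print(read_field(150000.00, {"department": "Finance", "clearance": "High"}))  # prints 150000.0
print(read_field(150000.00, {"department": "Sales"}))                         # prints ****
```

Because the decision is made at the domain level, the same policy applies whether the field is rendered in SAP GUI, Fiori, or any other client.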


For how it works, please refer to the SAP DAM solution brief linked below.


-Anand Kotti

Rafael Guimbala

My First Firefighter

Posted by Rafael Guimbala Nov 27, 2015

The main goal of this Screen Personas flavor, My First Firefighter, is to provide the GRC Access Control administrator with a diagnostic of the firefighter configuration. The flavor collects firefighter configuration data and compares it to the values expected for correct behavior. The comparison results in a detailed log to assist GRC administrators with root cause analysis.
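The diagnostic pattern the flavor follows (collect configuration values, compare each against the value expected for correct firefighter behavior, and write a detailed log) can be sketched as follows. The checks and expected values here are hypothetical, not the flavor's actual rules:

```python
# Hypothetical sketch of the flavor's diagnostic pattern: compare collected
# configuration against expected values and build a detailed log line per check.

def run_diagnostics(config, expected):
    """Return one log line per check: OK when the collected value matches
    the expected one, FAIL otherwise."""
    log = []
    for key, want in expected.items():
        got = config.get(key)
        status = "OK" if got == want else "FAIL"
        log.append(f"{status}: {key} = {got!r} (expected {want!r})")
    return log

# Illustrative checks only; the real flavor inspects SM59, user master data,
# SPRO parameters, and so on.
expected = {"user_type": "Service", "initial_password": False}
config   = {"user_type": "Dialog",  "initial_password": False}
for line in run_diagnostics(config, expected):
    print(line)
```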


Here is an image of the flavor:





First, you need to have Screen Personas in your system; for more information, please click on the link below.


Once Screen Personas is validated, follow the steps in the link (Importing Flavors - SAP Screen Personas - SAP Library) to import the flavor attached to the SAP Note mentioned at the end of this post.



The flavor has the following features:





  1. Press Ctrl + * to retrieve system information, as found under menu System -> Status.




  2. Type the target connector name to test the connection through SM59 (connection and authorization test). The configured integration scenario will also be tested:


  • In case of success: Move to the next test
  • In case of failure: Go to SM59 and fix
    the connection error




  3. The user type, user password, and user time zone are checked


  • In case of success: Move to the next test
  • In case of failure: The user type must be 'Dialog' or 'Service', the Firefighter user must not have an initial password, and its time zone must be the same as the system's




  4. Press the button to check the name of the Firefighter role, as well as whether the Firefighter workflow will trigger after the session ends:


  • In case of success: The Firefighter role and parameter 4007 of the SPRO configuration are shown.
  • In case of failure: Parameter 4010 must be adjusted according to the Firefighter role




  5. The GRC text object must exist in SE75. Press the button to check whether the GRC object is inserted in tables TTXOB, TTXOT, TTXID, and TTXIT.
  • In case of success: Move to the next field
  • In case of failure: The manual steps of SAP Note 2058516 must be followed to run a script which inserts the GRC objects into these tables




  6. Type the Firefighter ID to compare the system and Firefighter ID time zones


  • In case of success: Move to the next test
  • In case of failure: Change the Firefighter ID time zone to the same as the system time zone




  7. Upon pressing the button, the Firefighter job schedule is checked in table GRACTASKEXECSTMP, and today's jobs are checked in SM37.
  • In case of success: This returns the Firefighter schedule of program GRAC_ACTION_USAGE_SYNC. This program must run hourly to avoid performance and timeout issues.
  • In case of failure: The Firefighter background job is not scheduled and must be scheduled through SM36 for the target connector (instructions are provided). Check the status of the job; if it was cancelled, instructions are provided.
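The user checks in step 3 above can be sketched as a small validation routine. This is a Python sketch, not the flavor's actual implementation; the input structure is hypothetical and merely mirrors the rules described above (user type, initial password, time zone).

```python
# Sketch of the Firefighter user checks from step 3: the user type must
# be 'Dialog' or 'Service', the password must not be initial, and the
# user's time zone must match the system time zone.

def check_firefighter_user(user: dict, system_timezone: str) -> list:
    """Return a list of failure messages (an empty list means all checks passed)."""
    failures = []
    if user.get("type") not in ("Dialog", "Service"):
        failures.append("User type must be 'Dialog' or 'Service'")
    if user.get("initial_password"):
        failures.append("Firefighter user must not have an initial password")
    if user.get("timezone") != system_timezone:
        failures.append("User time zone must match the system time zone")
    return failures

ok_user = {"type": "Service", "initial_password": False, "timezone": "UTC"}
bad_user = {"type": "System", "initial_password": True, "timezone": "CET"}
print(check_firefighter_user(ok_user, "UTC"))   # []
print(check_firefighter_user(bad_user, "UTC"))  # three failure messages
```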



For more information, and for a video of the flavor, please check SAP Note 2157307 - Screen Personas - Firefighter Log Sync Update [VIDEO].



Feedback and ideas of new flavors are much appreciated!

The main goal of this report is to provide the GRC Access Control administrator with a diagnostic of the LDAP connection and configuration. The report collects LDAP configuration data and compares it to the values expected for correct behavior. The comparison results in a detailed log to assist GRC administrators with root cause analysis.

*This is only a diagnostic tool; LDAP on GRC can still present other issues even if all the items check out*


How to install LADT:

  1. In transaction SE38, create a new Z report named ZLADT_LOG of type Include.
  2. Copy the source code from the file log.txt into the report, save, and activate.
  3. In transaction SE38, create a new Z report named ZLADT of type Executable program.
  4. Copy the source code from the file main.txt into the report, save, and activate.


How to operate LADT:

  1. In transaction SE38, choose report ZLADT and execute it.
  2. In the field LDAP Connector, enter the LDAP connector that you want to test and run the report.


The result log shows 3 types of messages:


1)    A success message shows the status “OK”, meaning the step is correctly configured.


2)    A warning message shows the status “Attention”, meaning one or more optional steps are not configured correctly. The message shows a return code, which can be interpreted in the next section of this note to implement the optional steps.


3)    An error message shows the status “Error”, meaning one or more mandatory steps are not configured correctly. The message shows a return code, which must be interpreted in the next section of this note to correct the mandatory steps.


Please refer to the following procedures to correct the error.


CODE 00000 - Check your LDAP configuration according to the error message.


CODE 00001 - Set the program ID equal to the RFC ID in SM59, as below:


CODE 00002 - Maintain a server for the LDAP transaction:


CODE 00003 - Assign the LDAP Connector to a connector group:


CODE 00004 - Assign integration scenario AUTH in SPRO for LDAP connector:


CODE 00005 - Assign integration scenario PROV in SPRO for LDAP connector:


CODE 00006 - Assign integration scenario AUTH in SPRO for LDAP connection type:



CODE 00007 - Set application type 12 to LDAP connector:

CODE 00009 - Change the application type of LDAP connector to 12:




CODE 00010 - Set application type 12 to LDAP connector group:

CODE 00011 - Activate the LDAP connector group:


CODE 00012 - Change the application type of LDAP connector group to 12:




CODE 00014 - Check the LDAP field mapping for action 0003; make sure that all fields are set for the LDAP connector and SAP:

CODE 00016 - Check the LDAP field mapping for action 0004; make sure that all fields are set for the LDAP connector and SAP:




CODE 00015 - Maintain field mapping for LDAP connector action 0003




CODE 00017 - Maintain field mapping for LDAP connector action 0004


CODE 00018 - Maintain connector type as LDAP

CODE 00019 - Maintain attributes for LDAP connector


(*This image is only illustrative; please check your user path with your Basis team*)

CODE 00020 - Maintain LDAP connector as a user search data source (not mandatory).



CODE 00021 - Maintain LDAP connector as a user detail data source (not mandatory).



CODE 00022 - Maintain LDAP connector as user authentication (not mandatory).


CODE 00023 - Maintain LDAP connector as end-user authentication (not mandatory).
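Since each return code maps to one corrective action, the log interpretation above can be sketched as a simple lookup. This is an illustrative Python sketch, not the report itself (LADT is an ABAP Z program), and only a subset of the codes is shown; codes 00020-00023 are the ones marked "not mandatory" above, so they are reported as "Attention" rather than "Error".

```python
# Sketch of interpreting the LADT return codes. The dictionary restates
# a subset of the corrective actions documented above.
CORRECTIVE_ACTIONS = {
    "00000": "Check your LDAP configuration according to the error message",
    "00001": "Set program ID equal to RFC ID in SM59",
    "00002": "Maintain a server for the LDAP transaction",
    "00003": "Assign the LDAP connector to a connector group",
    "00018": "Maintain connector type as LDAP",
    "00020": "Maintain LDAP connector as a user search data source",
    "00021": "Maintain LDAP connector as a user detail data source",
    # ...remaining codes follow the same pattern
}
# Codes documented as "not mandatory" -> warnings, not errors.
OPTIONAL_CODES = {"00020", "00021", "00022", "00023"}

def interpret(code: str) -> str:
    """Return a log line combining severity and corrective action."""
    severity = "Attention" if code in OPTIONAL_CODES else "Error"
    action = CORRECTIVE_ACTIONS.get(code, "Unknown return code")
    return f"{severity}: {action}"

print(interpret("00001"))  # Error: Set program ID equal to RFC ID in SM59
print(interpret("00020"))  # Attention: Maintain LDAP connector as a user search data source
```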





Your feedback is welcome! Feel free to share your impressions of the program in the comments box.



The purpose of this blog post is to explain the different Access Request tables and how they can be used to prepare reports as per your requirements:


1. Request Reason


The request reason is stored in SAPscript, with the text object "GRC" and the ID "LTXT". You can use the standard SAPscript function module READ_TEXT to fetch the request reason of a GRC request by passing the "TEXT" value to the Name field. This TEXT value can be fetched from table STXH.


e.g.: ACCREQ/00155D08DA361ED2A1BD201C710165A5/LONG_TEXT (for access requests: ACCREQ/RequestID (from table GRACREQ)/LONG_TEXT)





2. Request Comments



Request comments are also stored in SAPscript, with the text object "GRC" and the ID "NOTE". You can use the standard SAPscript function module READ_TEXT to fetch the comments of a GRC request by passing the "TEXT" value to the Name field, in the same way as for the request reason above. This TEXT value can be fetched from table STXH.

e.g.: ACCREQ/00155D08C4051ED4BDFDEF53EC12C0D7/20150511151344 (ACCREQ/RequestID (from table GRACREQ)/XXXX)
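As a sketch, the SAPscript text name can be assembled programmatically and then passed to READ_TEXT. The Python below is illustrative: `build_text_name` is a hypothetical helper (not a GRC API), and the pyrfc connection shown in the comments uses placeholder parameters.

```python
# Build the SAPscript text name used to read a request's reason.
# For the request reason the name is ACCREQ/<request ID from GRACREQ>/LONG_TEXT.

def build_text_name(request_id: str, suffix: str = "LONG_TEXT") -> str:
    """Assemble the NAME value expected by READ_TEXT (illustrative helper)."""
    return f"ACCREQ/{request_id}/{suffix}"

name = build_text_name("00155D08DA361ED2A1BD201C710165A5")
print(name)  # ACCREQ/00155D08DA361ED2A1BD201C710165A5/LONG_TEXT

# With a connection to the GRC system (pyrfc; parameters are placeholders),
# READ_TEXT can then fetch the text lines:
#
# from pyrfc import Connection
# conn = Connection(ashost="...", sysnr="00", client="100",
#                   user="USER", passwd="***")
# result = conn.call("READ_TEXT", ID="LTXT", OBJECT="GRC",
#                    NAME=name, LANGUAGE="E")
# lines = [line["TDLINE"] for line in result["LINES"]]
```

For request comments the same pattern applies with ID "NOTE" and the timestamp-style suffix shown above instead of LONG_TEXT.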

3.  GRACREQ - Request details table

This table provides information about the Request ID, Request Type, Request Creation Date, and Request Priority.

4. GRACREQUSER - GRC Request User details table

This table provides information about the user for whom the GRC request has been raised: User ID, User First Name, User Last Name, and User Email ID.

5. GRACREQPROVITEM - GRC Request Line Item Details

This table provides information about the request and the line items below, with their corresponding VALID FROM and VALID TO dates:

  • Fire Fighter ID
  • PD Profile
  • Fire Fighter Role

6. GRACREQPROVLOG - GRC Request Provisioning Logs

This table provides information about the request and the line items in the request with their provisioning status (Success, Failure, or Warning).


7. GRFNMWRTINST - GRC Request Instance Details

This table provides information about the request and its corresponding instance status.

8. GRFNMWRTDATLG - GRC Request Approval Status

Get the Instance ID from table GRFNMWRTINST by passing the request number in the "EXTERNAL_KEY_DIS" field. Based on the Instance ID, you can get the approval status of each line item in the request, along with the Path ID and Stage Sequence Number; the Approver User column in this table gives the details of the approvers.

Based on the Path ID, you can get the stage details using the tables "GRFNMWCNPATH" and "GRFNMWCNSTG".
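The lookup chain just described (request number to instance ID, then instance ID to per-line-item approval records) can be sketched like this. The Python below is illustrative only: the table rows are mocked dictionaries, and the field names follow the descriptions above; real code would read the tables from the GRC system, e.g. via RFC.

```python
# Sketch of the lookup chain: GRFNMWRTINST (request -> instance) ->
# GRFNMWRTDATLG (instance -> approval records with path/stage/approver).
# Rows are mocked; field names follow the blog's descriptions.

GRFNMWRTINST = [
    {"INSTANCE_ID": "INST-1", "EXTERNAL_KEY_DIS": "9000001234"},
]
GRFNMWRTDATLG = [
    {"INSTANCE_ID": "INST-1", "PATH_ID": "P1", "STAGE_SEQ": "1",
     "APPROVER": "JDOE", "STATUS": "Approved"},
]

def approval_status(request_number: str) -> list:
    """Return the approval records for all instances of a request."""
    instance_ids = {row["INSTANCE_ID"] for row in GRFNMWRTINST
                    if row["EXTERNAL_KEY_DIS"] == request_number}
    return [row for row in GRFNMWRTDATLG
            if row["INSTANCE_ID"] in instance_ids]

for rec in approval_status("9000001234"):
    print(rec["APPROVER"], rec["STATUS"])  # JDOE Approved
```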

9. GRFNMWRTAPPR - Current Approver for Request Line Items

This table provides information about the request and the current approvers for the corresponding line items in the request.



These tables provide information about the roles and their corresponding role owners maintained in BRM.


11.  HRUS_D2 - Approver Delegation Table

This table provides information about the delegated approvers in GRC.


These tables provide information about the default roles maintained in GRC.



Looking forward to all your inputs on improving this blog by including additional table details (if any are missing).


Thanks for reading.


Best Regards,

Madhu Babu Sai

Dear all,


This document gives you an overview of the master data (e.g., controls) change workflow in GRC Process Control.


Central controls are created for subprocesses under Business Processes.



Once the controls are created, open one of them.



If the master data change workflow is activated, the SAVE button will be disabled and a Request Change button will appear.



SPRO configuration:

First, activate the master data object for which you require the workflow:

SPRO>GRC>Shared master data settings>Activate Workflow for Master Data Changes





If we make changes to central controls, a workflow will be triggered for change approval and notification.

Now maintain Custom Agent Determination Rules for the entity XCONTROL:

SPRO>GRC>General Settings>Workflow> maintain Custom Agent Determination Rules




NOTE: Correct role selection is very important for the business event; map it with the correct entity ID, and select the notification business event if a notification is required.

Now go and change a control in NWBC; once you click on the Request Change button, you get an error:



Reason: no user is maintained as the fallback receiver.

SPRO>GRC>General Settings>Workflow> Maintain Fallback Receiver




Now try the same from NWBC.

Once you click on Request Change for the control, it will ask for a change request.


Provide the details and click OK; you will get the message below.



Reference:Master Data Change Request Workflow - Governance, Risk and Compliance - SCN Wiki


pc: No Approver Found. Request Change is not possible.


Hope this helps others.




Dear all,


This document gives you an overview of the creation of regulations and how to assign them to subprocesses.


Regulations and Policies provide visibility into your compliance framework and access to end-to-end policy management.

Regulations are assigned to subprocesses, controls, IELC (Indirect Entity-Level Controls), policies, and ad-hoc issues, which in turn are assigned to organizations.

Regulations are part of the master data.



We can create a Regulation Group, a Regulation, and a Regulation Requirement.




Creation of Regulation Group




Provide the details and click on SAVE



Once the regulation group has been created, create a Regulation.

Select the regulation group and click on Regulation to create it.




Provide the regulation name and description, and select the regulation configuration to assign from the drop-down.

The regulation configuration to assign is maintained in SPRO:

SPRO>GRC>Process Controls>Multiple Compliance Framework>Configure Compliance Initiatives







Select the regulation configuration to assign from the drop-down and click on Save.


The regulation will now be created under the regulation group.




Select the regulation and create a regulation requirement.



Provide the details and Save



Now select the subprocess under Business Processes to assign the created regulation.





Go to the Regulations tab.


Click on Add to view and select the regulations, and save the subprocess.





SAP TechEd 2015 in Las Vegas is just two short weeks away. Have you created a personal agenda yet? Mine is still a work in progress, already jammed with double- and triple-booked times, but there are some things that I can recommend with certainty. Most importantly, do create a personal agenda. No matter how busy you are, it is worth spending some time browsing the sessions both by the tracks and by some keyword searches. Every year I find some security-related sessions in other tracks, so it is time well spent. It is also OK to double book your agenda, in case a session cancels or is not what you expected.


So what is in my personal agenda? First, let me back up to something not in my own agenda, but that everyone should at least consider: the ASUG pre-conference sessions. Depending on the projects ahead at your organization, there could be something to give you a great deep-dive start to the week. Be sure to check them out in this post by Tammy Powlas:

Jump Start SAP TechEd Las Vegas with ASUG Pre-Conference Hands-on Sessions


OK, back to my own agenda. Here are some recommendations for you to consider adding to your own agenda:


1. GRC Access Control Sessions. I am so pleased to see such a variety of sessions on GRC Access Control at SAP TechEd this year. This has been a quest of mine for several years now, to get more content in this area into the program.  If Access Control is something you are implementing or already support, be sure to consider these presentations:

SEC110 - Upgrading SAP Access Control and other GRC solutions from 10.0 to 10.1. My organization has not yet upgraded to 10.1, so I am very much looking forward to the lessons learned and other content of this ASUG education session.

SEC208 - SAP Access Control Customer Connection: Co-Innovation for the Win. This is my own presentation, so of course I am excited about it. Come to this session to hear about the improvements to SAP Access Control, some already delivered and some still in progress, that came out of Customer Connection projects, and learn about what is ahead.

SEC160 Hands-On Lab: An Introduction to Using Key Features of SAP Access Control. A hands-on session on GRC Access Control - woo hoo! I have been begging for this for years. Access Control 10 has so much functionality that you may not have implemented all of it yet. This is the perfect opportunity to get hands-on time in several areas. If you have not yet signed up for your Hands-On sessions, get going; they are filling up.

SEC807 Road Map Q&A SAP Access Control. This is our chance to hear about the road ahead for this solution and ask questions of the product owner.


2. Security sessions. The security track covers a lot of ground; depending on the solutions in use at your organization, some of these sessions are likely to be more applicable than others, so be sure to browse both sub tracks of the Security track. Some of the sessions in my agenda:

SEC107 "Access"ing Your SAP Security Data. This session includes an intro to SAP Security, so if you are just starting out in SAP security, this ASUG education session will be great for you. As for me, I am looking forward to hearing about using Microsoft Access to manage security data.

SEC206 Deploying SAP Fiori to meet the Needs of Your Current Security Model.  My organization is not yet using Fiori, but surely it is just a matter of time, so this is another ASUG education session on my list.


3. ASUG education sessions. The ASUG sessions already mentioned are just a sample of the TechEd content brought to you by ASUG's TechEd Design Team: Tammy Powlas, Kristen Dennis, Kevin Comegys, and me, along with our SAP Point of Contact Peter McNulty. We have been hard at work since before ASUG/SAPPHIRE to bring you the best possible content from ASUG members. You can find it in the session catalogue under the Source filter. Some of the other ASUG sessions in my agenda are:

TEC122 Building the Business Case for SAP Business Suite powered by SAP HANA

INT110 Secure Integration to the Cloud: Connecting On-Prem and Cloud Applications

BA122 It Isn't Only Brain Surgery: SAP HANA and SAP BusinessObject BI Solutions.


4. Expert Networking sessions. These may be the hidden gems of SAP TechEd, your place to meet with SAP Mentors, presenters, product managers, and your peers.  I am hosting two Expert Networking sessions in the SAP Mentors Lounge, EXP 27263 on Tuesday at 1:30 PM, and EXP27262 on Thursday at 10:30 AM, and I hope to see a lot of the regulars from the GRC and Security spaces on SCN as well as ASUG members there. To find the Expert Networking sessions and add them to your personal agenda, do a search with a filter on Session Type> Networking session.


5. Evening Events. After a long day of lectures, labs, road maps, and chatting with the experts, it is great to kick back and network in a more informal way. Be sure to attend the Networking event on Wednesday, starting at 6:00 PM on the show floor. The SAP Fiori Jam Band will once again lead the way and rock out Las Vegas. Come and sing along with us! Photo courtesy of SAP photographers.




Hope to see you all there!

