Gartner’s 2013 Global CIO Study points to issues I have previously aired: namely a failure to obtain full advantage from new and disruptive technology.

This should be of concern to boards, all executives, IT leaders, and risk and assurance professionals.

 

Here are some key excerpts:

  • Enterprises realize on average only 43 percent of technology's business potential. That number has to grow for IT to remain relevant in an increasingly digital world.
  • Over the last 18 months, digital technologies — including mobile, analytics, big data, social and cloud — have reached a tipping point with business executives. Analysts said there is no choice but to increase technology's potential in the enterprise, and this means evolving IT's strategies, priorities and plans.
  • Digital technologies provide a platform to achieve results, but only if CIOs adopt new roles and behaviors to find digital value. CIOs require a new agenda that incorporates hunting for new digital innovations and opportunities, and harvesting value from products, services and operations.
  • In a world of change, it is concerning that around half of CIOs surveyed do not see IT's enterprise role changing over the next three years.
  • Without change, CIOs and IT consign themselves to tending a garden of legacy assets and responsibilities.

 

However, the top priorities continue to be individual new technologies, not the more holistic perspective discussed in other studies where the CIO is asked – and expected – to step up and play a more strategic role in the organization: leading the path to growth through the smart deployment of technology. Elsewhere, Gartner has talked about “Nexus”, the growing need for IT to use multiple technologies together to create value. IDC refers to this as the third platform.

 

I have a few questions for board members, executives, and risk and assurance professionals to ask:

  1. Are we obtaining full advantage from the new technologies?
  2. Do we have the capabilities to understand, assess, and realize their potential value?
  3. Do we have the capabilities to adopt new technology safely? Are risk and assurance professionals actively involved, helping us understand and address any new or changed risks as a result of adopting new technology?
  4. Is IT leading the way with a vision of how technology can reshape our organization’s processes, products, services, communications with customers, and so on?
  5. Do we know what we are missing? Is that acceptable?

 

I welcome your views and comments.

 

Related posts:

http://www.theiia.org/blogs/marks/index.cfm?postid=411

When data builds up, it can affect SAP system performance. The best practice in this situation is data archiving, which moves data out of the production system to manage database growth while still allowing business users direct, transparent access to it. But what about data created by the GRC system? It is imperative that data needed for any audit or legal requirement be immediately available.

 

Dolphin has a solution, the Dolphin Data Management Cockpit (DMC), which offers traditional SAP archiving and serves GRC objectives. DMC permits data encryption at the field level. It works with /virsa/ tables (especially the firefighter log tables, which can build up fast) and allows this data to be archived with the standard SAP archiving protocol (using transaction SARA). The GRC application itself provides archive capabilities, but these amount to a download/upload process rather than true archiving; that process can take time to accomplish and can affect system performance.

 

The DMC solution uses the standard archiving process, which has minimal effect on your system. For data retrieval, we can update SAP Tcodes for GRC to read the archived data, making both online and archived data available to end users.
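The "transparent access" idea (end users issue one query, and the system decides whether the record lives in the production database or the archive) can be sketched roughly as follows. The class and data structures are invented for illustration and do not reflect Dolphin's or SAP's actual implementation:

```python
# Rough sketch of transparent read access across online and archived data.
# TransparentReader and the dict-based stores are invented for illustration;
# they do not reflect Dolphin's or SAP's actual implementation.

class TransparentReader:
    def __init__(self, online, archive):
        self.online = online    # stands in for the production database table
        self.archive = archive  # stands in for the archive files

    def get(self, key):
        # End users call one interface; the reader checks the live table first,
        # then falls back to the archive, so archived data stays accessible.
        if key in self.online:
            return self.online[key]
        return self.archive.get(key)

reader = TransparentReader(online={"doc1": {"amount": 100}},
                           archive={"doc0": {"amount": 50}})
print(reader.get("doc1"))  # served from the production database
print(reader.get("doc0"))  # served transparently from the archive
```

The point of the pattern is that the retrieval logic, not the end user, knows where the data physically resides; that is what lets archiving shrink the production database without changing how users work.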

 

The cost of storing data in an active database can be high regardless of your SAP system, and even more so for those using SAP HANA. Implementing an archiving strategy with transparent access to the data is a much more cost-efficient alternative when data is no longer used by end users on a regular basis.

 

An added advantage of archiving: once data is archived, it cannot be altered or deleted, which ensures compliance when it comes to audits.

Some of the common attributes on which you will base your BRFplus MSMP rule are already available in the context (such as priority and criticality), but a few other attributes, such as role sensitivity, are not. To create rules based on these attributes, you can create an expression of type 'DB Lookup' and read them in real time from a database table. The following example provides more details on creating an initiator rule based on the availability of role owners.

 

1. Create a new expression of type 'DB Lookup' in your existing initiator rule.

 

(screenshot)

 

 

 

2. Provide a name and description for your DB Lookup and fill in the following details.

 

(screenshot)

3. Once the DB Lookup is created and activated, open your decision table and click the 'Table Settings' button. In the table settings, choose 'Insert Column' as shown below.

 

(screenshot)

 

4. Select the newly created DB Lookup as the new column.

 

 

(screenshot)

 

(screenshot)

 

 

5. Now, in your decision table, the first row can handle roles without role owners, while the rest of the table remains the same as in your existing rule.

 

(screenshot)
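In plain terms, the five steps above give the decision table an extra condition column whose value is read live from a database table rather than taken from the workflow context. A rough simulation of the resulting rule logic, with all table and attribute names invented for illustration (the real implementation is BRFplus configuration, not code):

```python
# Simulation of an initiator rule whose first condition is a DB Lookup:
# "does this role have an owner?" All names here are invented for illustration.

role_owner_table = {"Z_FINANCE_ADMIN": "jsmith"}  # stands in for the role-owner table

def db_lookup_has_owner(role):
    # The 'DB Lookup' expression: read the attribute from the table in real time
    return role in role_owner_table

def initiator_rule(role, priority):
    # First row of the decision table: roles without owners take a special path
    if not db_lookup_has_owner(role):
        return "NO_OWNER_PATH"
    # Remaining rows route on attributes already available in the context
    if priority == "HIGH":
        return "HIGH_PRIORITY_PATH"
    return "DEFAULT_PATH"

print(initiator_rule("Z_UNKNOWN_ROLE", "LOW"))    # NO_OWNER_PATH
print(initiator_rule("Z_FINANCE_ADMIN", "HIGH"))  # HIGH_PRIORITY_PATH
```

The DB lookup runs at rule-evaluation time, so a role that gains an owner is routed correctly on the very next request, without any change to the rule itself.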

 

 


If you haven’t registered yet for GRC 2013, March 19-22 in Las Vegas, there’s still time!  With more than 250 sessions to choose from, including workshops, case studies, demos, panel discussions, and roundtables, this is the best event of the year to attend if you’re using or evaluating SAP solutions for governance, risk, and compliance (GRC).

 

In this video, Michael Lortz gives you a peek at some of the planned activities.

 

With four days and so many options, how do you get the most ‘bang for your buck’?  I’d like to recommend the following sessions and opportunities.

 

Education Sessions

 

You can view all the sessions in the conference guide, or if you’re already registered, you can find sessions via the online agenda builder.  If you’re limited on time, these are the sessions I highly recommend:

 

 

GRC Solutions Center

 

Do you have questions about product functionality, implementation best practices, or the product roadmap? The GRC Solution Center is your opportunity to have them answered in one-on-one meetings with SAP's GRC experts, including the solution management and development teams. Note: meetings are by appointment only and fill up quickly. Visit the reservation desk early to reserve your spot. The solution center is located on level 1 in room 103 and opens every morning at 8:00 a.m.

 

Ask-the-Experts

 

Ask-the-Experts is another great opportunity for one-on-one meetings with solution experts from SAP and our partners. Whether you're looking for tips and tricks on technical implementation or functional design, or trying to build a business case for your next project, these experts can share the good, the bad, and the ugly based on their years of experience. The GRC Ask-the-Experts sessions are scheduled for Tuesday at 6:00 p.m., so grab a beverage of your choice at the Welcome Reception and find the Ask-the-Experts tables in the same room. These sessions run until 6:45 p.m.

 

Discussion Forums

 

New at GRC 2012 and back by popular demand, these 30-minute interactive discussions, led by a subject-matter expert, offer customers with similar interests an opportunity to talk informally about specific topics, and they are a great networking opportunity.

 

For more details about GRC 2013, download the conference brochure, and make sure to follow the conversation on Twitter using #GRC2013.

 

I look forward to meeting you at GRC2013.

 

Originally posted on the Analytics from SAP Blog

In recent days, both noted GRC pundit and analyst Michael Rasmussen and consultant James Roeske sat down with Dave Hannon of SAPinsider to answer questions regarding GRC frameworks and SAP Access Control 10.0.

First, Michael provided insights on topics including:

  • Common mistakes in setting GRC strategy
  • The role of technology in a GRC strategy
  • The one true definition of GRC
  • How to drive collaboration in your GRC program
  • Importance of GRC maturity and integrity
  • Selecting the right GRC solution for your organization

You can hear the full podcast here: ow.ly/i2c1H

James, who is CEO of Customer Advisory Group, discussed getting the most "bang for your buck" with SAP Access Control 10.0, covering topics such as:

  • New features in the 10.0 release
  • Enhanced integration with other SAP solutions for GRC
  • Important technical- and business-level considerations before implementing or upgrading to 10.0.

You can listen to the full podcast here: bit.ly/YP8g1j

In addition, both Michael and James will present at the upcoming GRC 2013 conference in Las Vegas from March 18-22.

Michael will lead two pre-conference workshops on March 18:

 

James will present two sessions on March 19:

 

Matthew Moore

Conference Producer, GRC 2013

Follow me at @mattmoorewis

Many organizations do far too much work on these areas, primarily because they scope the work in isolation from their top-down approach to the identification of key controls. They base their scope on good business practice, and/or a list of ‘rules’ from a consultant or software vendor, rather than focusing on the access limitations necessary to prevent an action that might lead to a material misstatement of the financials.

The following discussion is taken from my book, Minimize Costs and Increase the Value of Your Sarbanes-Oxley 404 Program: Management's Guide to Effective Internal Controls, published by and available from the Institute of Internal Auditors (just $35 for members in hard copy, $25 as a PDF download).

Segregation of duties (SOD) and restricted access (RA) controls must be identified, assessed, and tested where they are key controls. (A key control is one that is relied upon to either prevent or detect a material misstatement of the financials.) Key SOD and RA controls include those that:

  • Are required for an authorization control to be effective. For example, if the business control requires that all purchase orders be approved in the system by the purchasing manager, it is critical to ensure that only the purchasing manager has that capability.
  • Reduce the risk of a material fraud that could be reported incorrectly in the financial statements.

With restricted access and segregation of duties, there is a risk of doing more work than is required for Sarbanes-Oxley. While there are excellent business reasons for restricting access to only those functions individuals need to perform their assigned tasks, it is important to remember that only fraud risk that is both material and could result in a misstatement of the financials is within scope for Sarbanes-Oxley.

This last point is important. Many companies test SOD using a standard set of “rules” (combinations of access privileges deemed inappropriate) that have been provided by a consultant or vendor. While they may represent a risk to the business (at least in theory), they may not represent a risk of material misstatement for your organization. The rules used to drive SOD testing should be based on the top-down, risk-based approach described above, to support a key control or reduce the risk of a material fraud.

As an example, at a company where I was responsible for the Sarbanes-Oxley program, both the external auditor and the internal auditor (at that point, the internal audit activity was outsourced) had tested user access consistently for several years. They each used a standard set of more than 150 rules to identify (a) access to important ERP transactions, and (b) SOD conflicts where one individual would have the ability, using a combination of ERP transactions, to commit a fraud. When the Sarbanes-Oxley team changed to a risk-based approach, concentrating on testing access rights that represented a risk of material misstatement, the number of rules was cut to about 20.
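Mechanically, SOD testing of this kind is simple: each rule names a pair of privileges that should not be combined in one user, and the test flags users holding both halves. A minimal sketch, with the rule set and user assignments invented for illustration:

```python
# Minimal rule-based SOD check: flag users who hold both halves of a
# conflicting pair. The rules and user assignments are invented examples.

sod_rules = [
    ("CREATE_VENDOR", "PAY_VENDOR"),       # could create and pay a fake vendor
    ("APPROVE_PO", "POST_GOODS_RECEIPT"),  # could approve and receive a purchase
]

user_access = {
    "alice": {"CREATE_VENDOR", "PAY_VENDOR", "RUN_REPORTS"},
    "bob": {"APPROVE_PO"},  # only one half of a pair: no conflict
}

def find_violations(rules, access):
    violations = []
    for user, privileges in access.items():
        for a, b in rules:
            if a in privileges and b in privileges:
                violations.append((user, a, b))
    return violations

print(find_violations(sod_rules, user_access))
# [('alice', 'CREATE_VENDOR', 'PAY_VENDOR')]
```

Seen this way, the point in the text becomes concrete: cutting from 150 rules to about 20 means shrinking `sod_rules` to only the pairs that could actually produce a material misstatement, which shrinks both the testing effort and the follow-up on false positives.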

Is your SOX scope based on a top-down, risk-based assessment when it comes to SOD and RA?

Please share how many rules you test (tests of SOD and/or RA).

Recently, two of the Big Four accounting firms released reports that address the increasing importance of the CIO. PwC published their 5th Annual Digital IQ Survey and Deloitte issued an Audit Committee Brief on the topic of “Understanding the CFO and CIO Dynamic”.

The short discussion by Deloitte provides advice for members of the audit committee. The authors say that “The ability to mine data and drive insight from a company’s numerous systems has highlighted the importance of an effective relationship between the CFO and CIO”. They refer to an earlier Deloitte publication, CFO Signals, which found that “only a little over half of the CFOs said they have the information they need to manage the business effectively, and about one-third expressed a neutral opinion”.

It is understandable that the Deloitte piece focuses on the clear and troubling problem that CFOs and other executives are making decisions without the benefit of the information they need, both on performance and risk. It is also understandable that their advice to the audit committee brings in the issue of financial reporting risks. But I think the audit committee and the board as a whole, not to mention the CFO, should be concerned with more than whether the CIO has aligned the activities of the IT organization with the strategies set by top management.

Rather, I think the CFO and the board should be asking whether the CIO is sufficiently involved in helping to set the strategies and vision of the organization.

I found the PwC contribution more useful. Although the paper talks about 'collaboration' between the CIO and the top executives, the value is clearly highest when the executive team, with the CEO as active champion, recognizes that "technology is a critical driver of business value".

PwC talks about “Strong Collaborators” and how these companies outperform their rivals. They say “Strong Collaborators are those that said that the CIO has a strong relationship (4.5 out of 5 or better across all relationship pairs) with members across the C-suite: CEO, CFO, CMO, CRO, CSO, CISO, and business unit leaders”.

Here are a couple of key excerpts, but I recommend reading the entire 9-page document:

  • “Naturally, companies with high Digital IQ understand which technologies will provide the greatest business benefits, leveraging the tools and platforms to optimize processes and improve overall performance. Our analysis shows that Strong Collaborators are more likely to aggressively invest in the four key digital technologies—mobility, cloud computing, business analytics and social media—than other companies.”
  • “Our survey found that companies with collaborative C-suites intertwine business strategy and information technology and are often rewarded with stronger company performance. They can also adapt quickly to market changes to maintain an advantage over competitors.”

PwC addresses the need for information to drive decisions. They say “Strong Collaborators are more likely to integrate internal and third-party data to better support decision-making, a critical step to provide senior leaders with the insight to make the right choices.”

But the key for me is that the CIO moves out of a role as the janitor and enters the organization’s executive team to drive the organization forward. PwC captures this with:

“CIOs of Strong Collaborator companies tend to not only ensure that technology initiatives are in step with the business plan but champion innovation across the enterprise.”

I welcome your views and comments. My recommendation is that CFOs and members of the audit committee review the two papers together and consider whether the role and relationship of the CIO within the organization is as effective as it should be.

Security Guide

SAP Access Control™ 10.0 / Process Control™ 10.0 / Risk Management™ 10.0

 

Please find the same at the link below:

 

https://websmp210.sap-ag.de/~sapdownload/011000358700001377352010E

Hi,

 

Since the announcement of SAP GRC 10.0, every organization wants to migrate from 5.3 to 10.0. Hence, I would like to start this blog with some questions about migrating from 5.3 to 10.0:

1. Why does every customer have to migrate to SAP GRC 10.0?

2. What new features does 10.0 offer over 5.3?

3. Will my current GRC technology infrastructure suffice?

4. Will the cost justify the return?

5. What will the migration impact be?

6. What is the average time for implementation?

7. Can we get audit reports at any time?

 

I welcome your views and commentary.

The Aberdeen Group has a new research report out on Fighting Fraud with Big Data Visibility and Intelligence.

 

The report includes a useful review of the risk and cost of fraud. (Note that it errs when it refers to ‘tips’ as being external: these are typically calls to the internal compliance hotline or whistleblower line.)

 

What is new in the report is the discussion of the ability to mine the mass of Big Data, perhaps with predictive analytics, to understand and assess fraud risk, and also to monitor for red flags that indicate an investigation is warranted. As the report says:

 

“Rapid changes in information technology infrastructure are increasing the difficulty of maintaining high levels of preparedness simultaneously against all threats. In response, organizations are adopting enhanced strategies for fighting fraud: from 100% success at prevention, to greater visibility, faster detection and incident response; from “figure out what already happened” using post-incident forensics, to proactively “figuring out what’s happening” using Big Data and predictive analytics.”
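As a toy contrast between "figure out what already happened" and the predictive approach the report describes, the sketch below estimates a fraud rate per transaction profile from labeled history and uses it to score new transactions. The data and features are entirely invented; real deployments use far richer models and far more data:

```python
# Toy predictive scoring: learn historical fraud rates per transaction profile,
# then score new transactions by profile. All data here is invented.

history = [
    # (amount_bucket, new_vendor, was_fraud)
    ("high", True, True), ("high", True, True), ("high", False, False),
    ("low", True, False), ("low", False, False), ("high", True, False),
]

# "Train": historical fraud rate for each (amount_bucket, new_vendor) profile
by_profile = {}
for amount, new_vendor, was_fraud in history:
    by_profile.setdefault((amount, new_vendor), []).append(was_fraud)
model = {p: sum(labels) / len(labels) for p, labels in by_profile.items()}

def score(amount_bucket, new_vendor):
    # Unseen profiles score 0.0 here; a real model would smooth or generalize
    return model.get((amount_bucket, new_vendor), 0.0)

print(round(score("high", True), 2))  # 0.67: two of three such cases were fraud
print(score("low", False))            # 0.0
```

The shift the report describes is from investigating scores like these after the fact to computing them while the transaction is still in flight.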

 

Unfortunately, Aberdeen’s research showed that only about 16% of organizations are using predictive analytics for the detection and prevention of fraud.

 

Why is this? I suggest it’s from one or more of these factors:

 

  1. Those responsible for fraud prevention/detection are not aware of the capabilities of the new technology
  2. Those responsible for fraud prevention/detection are (justifiably or not) content with the ‘older’ technology
  3. Priority and/or resources are not given to fraud prevention/detection

 

I welcome your views.

I admit to criticizing my “alma mater”, PwC, for much of their thought ‘leadership’ in recent years.

Today, I come to praise PwC, not to bury it.

They have published an excellent guide for boards that merits reading not only by board members but also by all those responsible for management of IT, risk management, and internal audit.

Directors and IT: What Works Best suggests a six-step process, what they refer to as an IT Oversight Framework, that I believe should be effective for the majority of organizations.

Why is this important? PwC answers:

  • “The pace of change in this area is rapid, the subject matter is complicated, and the highly technical jargon used to describe emerging and evolving risks makes this a challenging area. And companies are relying more and more on technology to get ahead, often prompting substantial changes in how they operate.”
  • “Many directors are confused by and uncomfortable with overseeing IT. They sometimes don’t have an adequate understanding of the subject to be effective and confident in overseeing this area. And they do not necessarily have a well-defined process to help them in fulfilling this very important responsibility. Together, these factors can create an “IT confidence gap.””
  • “Directors are hungry for more information about the company’s approach to managing IT strategy and risk and believe they do not get enough information from management: 67% indicate their company’s approach to managing IT risk and strategy provides them with only “moderate” information to be effective or the information “needs improvement.” Many directors want more comfort regarding IT activities so they can sleep better at night.”

The six-step process is described in detail in the guide. Here is my summary:

  1. Assessment: Understand the role of and reliance on technology – in the industry in general, and as it affects the organization in particular. As PwC says: “Conclude how important IT is to the company’s success”. But a word of caution – see #4, below
  2. Approach: Who will provide oversight of IT and technology, and how?
  3. Prioritization: Of all the technology-related activities, which merit priority attention?
  4. Strategy: In many ways, this is the most important area of focus. Most organizations are highly dependent on technology to advance, much more so than is evidenced by the responses to PwC’s study. Frankly, as intimated by PwC, when 87% of directors and executives fail to indicate that reliance on technology is critical, it indicates myopia or outright blindness to the future. PwC reports that “Nearly half of directors believe the board’s ability to oversee strategic use of IT is less than effective”. However, they also say that “Most CEOs of global companies say technology is the number-one factor that will impact their company’s future in the next three years; they believe it will be even bigger than changing economic and market conditions”.
  5. Risk: As PwC indicates, technology is a source of risk to the business, and technology-related issues need to be ‘baked’ into the risk management oversight process
  6. Monitoring speaks to the continued need for oversight, not something you take on once a year

This is, in my opinion, an excellent starting point for oversight (and management) of technology.

But:

  • My advice is to start looking at technology as the subject of discussion rather than IT. The IT function or department only manages or directs part of the investment in and use of technology across the organization. In fact, much of the budget and decision-making when it comes to technology is increasingly outside the IT function – especially when it comes to the use of technology for marketing
  • New technology and related issues change constantly, so don’t limit yourself to the subject areas introduced by PwC. For example, I think the announcement on January 10th by SAP that they now enable organizations to run their ERP systems (including manufacturing capacity planning and other complex and calculation-intensive applications) in memory, and as much as 300,000 times faster, is amazing and may transform traditional computing.
  • Boards need to understand that IT is no longer a utility that provides a platform for the business. In most cases, it is a vital and integrated element and capability for strategy and execution. Separate discussions on IT and strategy, or even organizational performance, may soon have to disappear

I welcome your views and commentary.

 

I truly believe that amazing developments are arriving that will make future decision-making far more effective. I want to talk about two in this post; admittedly one is more a hope and the other more a prediction.

 

The prediction can be expressed this way:

 

In the near future, which is getting nearer every day, decision-makers will have moved from an experience-based process to an information-based process. They will have reliable, useful information delivered to the palm of their hand in near real time that will let them make better decisions faster.

 

Until now, those making decisions have placed great reliance on their experience and ‘gut’ when making decisions. The information they have is typically historical, days if not months old. At times, the information is buried in reports or in a form that is not immediately useful. Studies have shown that even when the data exists within the organization and it is possible to ‘mine’ it to produce the information they need, managers don’t know how to get it – or it takes too long.

 

In the absence of information on today’s state of affairs and trends, decision-makers rely extensively on their experience and its results. Their decisions are not always the best.

 

I think we would all agree that you can make better decisions with better information: faster, more current, and more useful (e.g., highlighted for you, not buried).

 

IDC captures much of what is happening when it says, in its 2013 predictions: “The ICT industry is in the midst of a once every 20-25 years shift to a new technology platform for growth and innovation. We call it the 3rd Platform, built on mobile devices and apps, cloud services, mobile broadband networks, big data analytics and social technologies.”

 

It’s not only that we have cloud, big data, mobile, social, in-memory computing, predictive analytics, and more. It’s that organizations are deploying a combination of these technologies to deliver near-real-time insights, in a useful form (such as dashboards on mobile devices), that enable better decisions at speed.

 

Note that last phrase: ‘at speed’. George Patton would love this technology, because being able to make decisions faster (when based on reliable and current information) is a sure recipe for success.

 

Now, we have the ability to mine that incredible mountain of big data, completing analysis in seconds instead of hours, and deliver the results to the manager’s tablet, wherever she is. If the manager has a further question, another dive into the data (i.e., another round of analytics) can be completed in seconds.

 

There isn’t time or space to explain or discuss all of the new technologies. But I would be happy to answer questions (please post in Comments).

 

The hope is that as organizations improve their understanding and practice of risk management, it will move from a separate and distinct activity to an integral and necessary part of decision-making. No decision can be a quality decision unless the potential effects of that decision – and alternative decisions - are understood. No decision can be a good decision without reliable, current, useful information on uncertainty, both the good and the bad that may lie ahead.

 

Every manager will become a performance and risk manager. You can’t optimize long-term performance without optimizing potential outcomes – the essence of risk management. The risk officer becomes more of a mentor and coach, no longer seen as the one responsible for managing risk.

 

I welcome your views and commentary.

We've been running Virsa/Compliance Calibrator/SAP GRC for quite a while now. When we first started the project and ran the first analysis, it turned out that we were in much better shape than many people, certainly our external consultants, had expected. Apparently, many organisations end up with a seven-digit violation count the first time around, if viewed at permission level. We had a little over 50,000. That number has been reducing slowly over the course of a couple of years, until eventually, today, we got this:

(screenshot: the risk analysis result showing zero violations)

Celebrations all round

 

Making big reductions in that number is always easy at first, and gets progressively harder as time goes on. We've been below 1,000 violations for the last 12 months, below 500 for 6 months, and below 100 for 4 months.

(screenshot: the declining violation count over time)

We've used a few mitigations, and in a handful of places had to use Firefighter where there just aren't enough people, but mostly this is proper segregation of duties. If you are embarking on the same process and can't see the light at the end of the tunnel, take heart - it is hard work, but zero violations is possible!

 

Next step - an upgrade to GRC 10.0...

We all have high expectations for reducing risk in our SAP environments. The objective we chose was to get clean and stay clean. Management further decided to track our every move from the risk analysis dashboards. Oh, Big Brother! Are violations going up or down? With this kind of visibility, you want to address risks prior to provisioning access. But how do you do this when the GRC Access Control 10 application forces you to choose one default risk? What if you have rules for critical authorizations in addition to segregation of duties? How can you be sure that the default risk type is not removed?

 

These are all questions that we ask ourselves as we perform a detailed analysis of our violations. We identified additional unmitigated risks being introduced into our GRC control environment and spent weeks identifying the root causes of these newly introduced violations. Since our task is to get clean and stay clean, what could we do to prevent new violations? Prior to recent changes, the GRC 10 application only allowed a single default risk type through configuration. The application provided the flexibility to choose one of the five risk analysis options as a default, but with only a single default parameter you may allow unmitigated risks to be introduced into your environment. If you were proactive, you might be able to mitigate some risks before they appeared in the management dashboard. But is it our job to perform multiple manual processes to reduce risk? I believe there are more important tasks than constantly monitoring new violations manually. Isn’t that what the application is for? Yes!

 

With recent influence activities, you now have more flexibility. If you wanted to, you could select all five risk analysis types as defaults on an access request. However, choosing all of the risk types introduces a second issue: false positives. These occur when a user technically has access to a transaction (action) but does not have the authorization object values required to create the risk. I personally would not recommend selecting all risk types; rather, select those appropriate to your environment, which will force mitigation of risks prior to provisioning. The five risk types on a standard GRC access request are as follows:

 

(screenshot: the five risk types on a standard GRC access request)

 

This new option is available in SAP Note 1776542 (UAM: Multiple values for default report type not possible). If you are on GRC support package 10 or lower, you can implement this note manually. The solution is currently scheduled to be included in GRC support package 11. Without applying the note and performing the manual steps, you can only select one default risk type for GRC configuration parameter 1023. After the note is implemented, you have flexibility that was not previously available through configuration. You could have achieved this through careful custom coding, but custom code in a complex environment creates new support issues of its own. After applying the note, you are allowed multiple parameter 1023 entries in the GRC configuration.

 

(screenshot: multiple parameter 1023 entries in the GRC configuration)

 

With this note our mission is almost complete.  Our next agenda item is to prevent the user from unchecking the default risk types within the request.
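Conceptually, the note turns a single-valued configuration parameter into a multi-valued one. A toy illustration of the difference follows; the data structures are mine, and the five risk-type names are paraphrased rather than taken from the system, so check the actual values in your configuration:

```python
# Toy model of GRC configuration parameter 1023 before and after SAP Note
# 1776542: one default risk-analysis type vs. several. The structures are
# invented, and the risk-type names are paraphrased, not actual system values.

RISK_TYPES = {"Action Level", "Permission Level", "Critical Action",
              "Critical Permission", "Critical Role/Profile"}

def default_risk_types(param_1023_entries):
    # Validate the configured entries and return the set of defaults that
    # would be pre-selected on every access request
    defaults = set(param_1023_entries)
    unknown = defaults - RISK_TYPES
    if unknown:
        raise ValueError(f"Unknown risk type(s): {sorted(unknown)}")
    return defaults

# Pre-note behaviour: a single entry only
single = default_risk_types(["Permission Level"])
# Post-note behaviour: multiple entries, all forced onto new requests
multiple = default_risk_types(["Permission Level", "Critical Action"])
print(sorted(single), sorted(multiple))
```

The practical effect is that every access request starts with all configured risk types checked, which is what forces mitigation before provisioning instead of after.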

Last week, I had the honor of being the opening keynote speaker at the Compliance Week West conference in Palo Alto. As we gathered, I chatted with a couple of friends from a large technology company. They told me about some amazing things they are doing with the latest technology (including from SAP and its partners) to improve their risk and compliance activities. This company is not alone and I am hearing stories from companies in all different sectors and geographies almost every week.

 

For example:

  • One company is continuously monitoring hundreds of millions of transactions for indicators (red flags) of potential fraud. While organizations have been doing this on a monthly basis for a long time, the latest in-memory technology provides speed improvements, up to 300,000 times faster than just a year ago, that let them monitor transactions almost as they are processed. Now they can intervene quickly and shut down anything improper before it escalates.
  • A large bank is using some of the same in-memory technology to monitor signs of money-laundering. With the massive fines being levied by the government and regulators for anti-money laundering (AML) compliance failures, this has become a critical activity for financial services organizations. The power is now available to monitor the literally billions of transactions processed every day.
  • An IT organization has moved its information security threat risk assessment tool onto an in-memory platform. Previously, the tool was limited to assessing intrusion risks by analyzing a sample of intrusion attempts, so its accuracy and reliability were limited. Now, it bases its assessment on the full history of intrusion attempts.
  • SAP is one of many companies that use social media monitoring technology (sometimes referred to as sentiment analytics or text analytics) to monitor what people are saying about the company; this keeps their finger on the pulse of reputation risk.
  • SAP’s internal risk management function is in the process of deploying mobile risk analytics. Linked to our enterprise risk management system, this mobile app will enable every manager to see and dive into the risks they own. It is enabling risk management to be “embedded” into daily management of the business.
  • Other companies are using new technology to improve their monitoring and communication of risks across the organization. It is great to go to a conference, such as Compliance Week West, and see the growing maturity of risk and compliance solutions showcased by vendors, some with integrated risk monitoring capabilities.
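The continuous-monitoring pattern in the first bullet (test each transaction against red-flag rules as it arrives, rather than in a month-end batch) can be sketched as follows. The rules and thresholds are invented for illustration:

```python
# Continuous red-flag monitoring: screen each transaction as it is processed
# instead of in a monthly batch. The two rules here are invented examples.

def screen(transaction, seen_invoices, flags):
    # Rule 1: large, suspiciously round amounts
    if transaction["amount"] >= 10000 and transaction["amount"] % 1000 == 0:
        flags.append(("round_amount", transaction["id"]))
    # Rule 2: duplicate invoice numbers
    if transaction["invoice"] in seen_invoices:
        flags.append(("duplicate_invoice", transaction["id"]))
    seen_invoices.add(transaction["invoice"])

flags, seen = [], set()
stream = [  # in production this would be an event stream, not a list
    {"id": 1, "invoice": "INV-7", "amount": 20000},
    {"id": 2, "invoice": "INV-8", "amount": 137.50},
    {"id": 3, "invoice": "INV-7", "amount": 137.50},
]
for txn in stream:
    screen(txn, seen, flags)

print(flags)  # [('round_amount', 1), ('duplicate_invoice', 3)]
```

Because each transaction is screened on arrival, the alert fires while intervention is still possible, which is the difference the in-memory speed-up makes practical at hundreds of millions of transactions.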

 

The ability to monitor risk and compliance in a more dynamic fashion that is responsive to change delivers power and value to the organization – and to the contribution that can be made by risk and compliance professionals.

 

But, I hear you say, risk and compliance functions don’t have the money to spend on expensive new toys.

 

That is true, but the majority of companies are either acquiring or actively looking at the new technology to improve business operations – especially to leverage so-called Big Data, but also to improve the analytics used to make decisions and run the business.

 

Risk and compliance professionals should be looking for the opportunity to leverage the technology their organization is acquiring for other purposes. That is what my friends at the Silicon Valley technology giant did.

 

What to look for? Here’s a partial list of new technology to power risk and compliance:

  • In-memory computing (sometimes called in-memory analytics, sometimes simply a platform or a database; SAP’s solution is called HANA)
  • Predictive analytics
  • Mobile analytics (sometimes referred to as mobile business intelligence, or mobile BI)
  • Risk monitoring, including event monitoring (where a real-time agent tests individual transactions against rules as they are processed)
  • And more, such as these solutions from Wipro

 

I would love to hear what you are doing with the new technology to improve risk and compliance effectiveness and efficiency.
