The report includes a useful review of the risk and cost of fraud. (Note that it errs when it refers to ‘tips’ as being external: these are typically calls to the internal compliance hotline or whistleblower line.)
What is new in the report is the discussion of the ability to mine the mass of Big Data, perhaps with predictive analytics, to understand and assess fraud risk, and also to monitor for red flags that indicate an investigation is warranted. As the report says:
“Rapid changes in information technology infrastructure are increasing the difficulty of maintaining high levels of preparedness simultaneously against all threats. In response, organizations are adopting enhanced strategies for fighting fraud: from 100% success at prevention, to greater visibility, faster detection and incident response; from “figure out what already happened” using post-incident forensics, to proactively “figuring out what’s happening” using Big Data and predictive analytics.”
Unfortunately, Aberdeen’s research showed that only about 16% of organizations are using predictive analytics for the detection and prevention of fraud.
Why is this? I suggest it stems from one or more of these factors:
Those responsible for fraud prevention/detection are not aware of the capabilities of the new technology
Those responsible for fraud prevention/detection are (justifiably or not) content with the ‘older’ technology
Priority and/or resources are not given to fraud prevention/detection
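To make the “red flags” idea concrete, here is a minimal sketch – plain Python with illustrative data, not any vendor’s product – of the kind of scoring predictive analytics performs: payments far outside a vendor’s historical norm get flagged for investigation.

```python
from statistics import mean, stdev

# Historical payment amounts per vendor (illustrative data)
history = {
    "ACME": [1000, 1100, 950, 1050, 990, 1020],
    "GLOBEX": [200, 210, 195, 205, 198, 202],
}

def red_flags(transactions, threshold=3.0):
    """Flag transactions more than `threshold` standard
    deviations from the vendor's historical mean."""
    flags = []
    for vendor, amount in transactions:
        amounts = history[vendor]
        mu, sigma = mean(amounts), stdev(amounts)
        if sigma and abs(amount - mu) / sigma > threshold:
            flags.append((vendor, amount))
    return flags

new_txns = [("ACME", 1010), ("GLOBEX", 950), ("ACME", 1075)]
print(red_flags(new_txns))  # only the GLOBEX payment stands out
```

Real deployments use far richer models and run against millions of records in memory, but the principle – score each transaction against learned norms and surface only the outliers – is the same.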
I admit to criticizing my “alma mater”, PwC, for much of its thought ‘leadership’ over the last few years.
Today, I come to praise PwC, not to bury it.
They have published an excellent guide for boards that merits reading not only by board members but also by all those responsible for management of IT, risk management, and internal audit.
Directors and IT: What Works Best suggests a six-step process, what they refer to as an IT Oversight Framework, that I believe should be effective for the majority of organizations.
Why is this important? PwC answers:
“The pace of change in this area is rapid, the subject matter is complicated, and the highly technical jargon used to describe emerging and evolving risks makes this a challenging area. And companies are relying more and more on technology to get ahead, often prompting substantial changes in how they operate.”
“Many directors are confused by and uncomfortable with overseeing IT. They sometimes don’t have an adequate understanding of the subject to be effective and confident in overseeing this area. And they do not necessarily have a well-defined process to help them in fulfilling this very important responsibility. Together, these factors can create an “IT confidence gap.””
“Directors are hungry for more information about the company’s approach to managing IT strategy and risk and believe they do not get enough information from management: 67% indicate their company’s approach to managing IT risk and strategy provides them with only “moderate” information to be effective or the information “needs improvement.” Many directors want more comfort regarding IT activities so they can sleep better at night.”
The six-step process is described in detail in the guide. Here is my summary:
Assessment: Understand the role of and reliance on technology – in the industry in general, and as it affects the organization in particular. As PwC says: “Conclude how important IT is to the company’s success”. But a word of caution – see #4 (Strategy), below.
Approach: Who will provide oversight of IT and technology, and how?
Prioritization: Of all the technology-related activities, which merit priority attention?
Strategy: In many ways, this is the most important area of focus. Most organizations are highly dependent on technology to advance – much more so than is evidenced by the responses to PwC’s study. Frankly, as intimated by PwC, when 87% of directors and executives fail to indicate that reliance on technology is critical, it indicates myopia or outright blindness to the future. PwC reports that “Nearly half of directors believe the board’s ability to oversee strategic use of IT is less than effective”. However, they also say that “Most CEOs of global companies say technology is the number-one factor that will impact their company’s future in the next three years; they believe it will be even bigger than changing economic and market conditions”.
Risk: As PwC indicates, technology is a source of risk to the business, and technology-related issues need to be ‘baked’ into the risk management oversight process.
Monitoring: This speaks to the continued need for oversight – not something you take on once a year.
This is, in my opinion, an excellent starting point for oversight (and management) of technology.
My advice is to start looking at technology as the subject of discussion rather than IT. The IT function or department only manages or directs part of the investment in and use of technology across the organization. In fact, much of the budget and decision-making when it comes to technology is increasingly outside the IT function – especially when it comes to the use of technology for marketing.
New technology and related issues change constantly, so don’t limit yourself to the subject areas introduced by PwC. For example, I think the announcement on January 10th by SAP that they now enable organizations to run their ERP systems (including manufacturing capacity planning and other complex and calculation-intensive applications) in memory, and as much as 300,000 times faster, is amazing and may transform traditional computing.
Boards need to understand that IT is no longer a utility that provides a platform for the business. In most cases, it is a vital and integrated element and capability for strategy and execution. Separate discussions on IT and strategy, or even organizational performance, may soon have to disappear.
I truly believe that amazing developments are arriving that will make future decision-making far more effective. I want to talk about two in this post; admittedly one is more a hope and the other more a prediction.
The prediction can be expressed this way:
In the near future, which is getting nearer every day, decision-makers will have moved from an experience-based process to an information-based process. They will have reliable, useful information delivered to the palm of their hand in near real time that will let them make better decisions faster.
Until now, those making decisions have placed great reliance on their experience and ‘gut’ when making decisions. The information they have is typically historical, days if not months old. At times, the information is buried in reports or in a form that is not immediately useful. Studies have shown that even when the data exists within the organization and it is possible to ‘mine’ it to produce the information they need, managers don’t know how to get it – or it takes too long.
In the absence of information on today’s state of affairs and trends, decision-makers rely extensively on their experience and the results it has produced. Their decisions are not always the best.
I think we would all agree that you can make better decisions with better information – ‘better’ meaning faster, more current, and more useful (e.g., highlighted for you, not buried).
IDC captures much of what is happening when it says, in its 2013 predictions: “The ICT industry is in the midst of a once every 20-25 years shift to a new technology platform for growth and innovation. We call it the 3rd Platform, built on mobile devices and apps, cloud services, mobile broadband networks, big data analytics and social technologies.”
It’s not only that we have cloud, big data, mobile, social, in-memory computing, predictive analytics, and more. It’s that organizations are deploying a combination of these technologies to deliver near-real-time insights, in a useful form (such as dashboards on mobile devices), that enable better decisions at speed.
Note that last phrase: ‘at speed’. George Patton would love this technology, because being able to make decisions faster (when based on reliable and current information) is a sure recipe for success.
Now, we have the ability to mine that incredible mountain of big data, completing analysis in seconds instead of hours, and deliver the results to the manager’s tablet – wherever she is. If the manager has more questions, another dive into the data (i.e., another round of analytics) can be completed in seconds.
There isn’t time or space to explain or discuss all of the new technologies. But I would be happy to answer questions (please post in Comments).
The hope is that as organizations improve their understanding and practice of risk management, it will move from a separate and distinct activity to an integral and necessary part of decision-making. No decision can be a quality decision unless the potential effects of that decision – and alternative decisions - are understood. No decision can be a good decision without reliable, current, useful information on uncertainty, both the good and the bad that may lie ahead.
Every manager will become a performance and risk manager. You can’t optimize long-term performance without optimizing potential outcomes – the essence of risk management. The risk officer becomes more of a mentor and coach, and nobody sees them as responsible for managing risk.
We've been running Virsa/Compliance Calibrator/SAP GRC for quite a while now. When we first started the project and ran the first analysis, it turned out that we were in much better shape than many people – certainly our external consultants – expected. Apparently, many organisations end up with a 7-digit violation count first time around, if viewed at permission level. We had a little over 50,000. That's been reducing slowly over the course of a couple of years now, until eventually, today, we got this:
Celebrations all round!
Making big reductions in that number is always easy at first, and gets progressively harder as time goes on. We've been below 1,000 violations for the last 12 months, below 500 for 6 months, and below 100 for 4 months.
We've used a few mitigations, and in a handful of places had to use Firefighter where there just aren't enough people, but mostly this is proper segregation of duties. If you are embarking on the same process and can't see the light at the end of the tunnel, take heart - it is hard work, but zero violations is possible!
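For readers unfamiliar with what these tools count, the permission-level check can be pictured with a toy sketch – illustrative permissions and rules, not SAP’s actual rule set. A violation is simply one user holding both halves of a conflicting pair:

```python
# Segregation-of-duties check: a violation is one user holding
# both permissions of a conflicting pair (illustrative data).
conflicts = [
    ("CREATE_VENDOR", "POST_PAYMENT"),
    ("CREATE_PO", "APPROVE_PO"),
]

users = {
    "alice": {"CREATE_VENDOR", "DISPLAY_PO"},
    "bob":   {"CREATE_VENDOR", "POST_PAYMENT"},   # violation
    "carol": {"CREATE_PO", "APPROVE_PO"},         # violation
}

def violations(users, conflicts):
    found = []
    for user, perms in users.items():
        for a, b in conflicts:
            if a in perms and b in perms:
                found.append((user, a, b))
    return found

print(violations(users, conflicts))
```

Multiply a few hundred rules by thousands of users and composite roles, and it is easy to see how a first run can produce a seven-digit count.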
We all have high expectations to reduce risks in our SAP environments. The objective we chose was to get clean and stay clean. Management has further decided to track our every move from the risk analysis dashboards. Oh, Big Brother! Are violations going up or down? With this kind of visibility, you want to address risks prior to provisioning access. But how do you do this when the GRC Access Control 10 application forces you to choose one default risk? What if you have rules for critical authorizations in addition to segregation of duties? How can you be sure that the default risk type is not removed?
These are all questions we ask ourselves as we perform a detailed analysis of our violations. We identified additional unmitigated risks being introduced into our GRC control environment, and spent weeks identifying their root causes. Since our task is to get clean and stay clean, what could we do to prevent these new violations? Prior to recent changes, the GRC 10 application only allowed a single default risk type through configuration. The application provided the flexibility to choose one of the five risk analysis options as a default, but with only a single default parameter you may allow unmitigated risks to be introduced into your environment. If you were proactive, you might be able to mitigate some risks before they appeared on the management dashboard. But is it our job to perform multiple manual processes to reduce risk? I believe there are more important tasks than constantly monitoring new violations manually. Isn’t that what the application is for? Yes!
Thanks to recent customer influence activities, you now have more flexibility. If you wanted to select all five risk analysis types as defaults on an access request, you could. However, choosing all of the risk types introduces a second issue: false positives. These occur when a user technically has access to a transaction (Action) but does not have the required authorization object values to create the risk. I personally would not recommend selecting all risk types, but rather those appropriate to your environment, which will force mitigation of risks prior to provisioning. The five risk types on a standard GRC Access Request are as follows:
This new option is available in SAP Note 1776542 (UAM: Multiple values for default report type not possible). If you are on GRC support package 10 or lower, you can implement this note manually. The solution is currently scheduled to be included in GRC support package 11. Without applying the note and performing the manual steps, you can only select one default risk type for GRC configuration parameter 1023. After the note is implemented, you have flexibility that was not previously available through configuration. You could have provided this through careful custom coding, but custom code in a complex environment creates support issues of its own. After applying the note, you are allowed multiple entries for parameter 1023 in the GRC configuration.
With this note our mission is almost complete. Our next agenda item is to prevent the user from unchecking the default risk types within the request.
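The behavior the note enables can be pictured with a plain-Python illustration – this is not SAP’s internal implementation, and the risk type names here are stand-ins. The point is simply that multiple entries for one parameter become a set of pre-selected defaults on each new request:

```python
# Illustration only: multiple entries for configuration
# parameter 1023 become the default risk-analysis types
# pre-selected on a new access request. Parameter values
# shown are hypothetical stand-ins.
config = [
    ("1023", "Action Level"),
    ("1023", "Permission Level"),
    ("1023", "Critical Action"),
]

def default_risk_types(config):
    """Collect every configured value for parameter 1023."""
    return {value for param, value in config if param == "1023"}

def new_access_request():
    # Each default type starts checked on the request form.
    return {rtype: True for rtype in default_risk_types(config)}

request = new_access_request()
print(sorted(request))
```

Before the note, only the first such entry was honored; afterwards, all configured types are applied – which is exactly what we needed to force risk analysis across more than one dimension.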
Last week, I had the honor of being the opening keynote speaker at the Compliance Week West conference in Palo Alto. As we gathered, I chatted with a couple of friends from a large technology company. They told me about some amazing things they are doing with the latest technology (including from SAP and its partners) to improve their risk and compliance activities. This company is not alone and I am hearing stories from companies in all different sectors and geographies almost every week.
One company is continuously monitoring hundreds of millions of transactions for indicators (red flags) of potential fraud. While organizations have been doing this on a monthly basis for a long time, the latest in-memory technology provides speed improvements – up to 300,000 times faster than just a year ago – that let them monitor transactions almost as they are processed. Now they can intervene quickly and shut down anything improper.
A large bank is using some of the same in-memory technology to monitor for signs of money laundering. With the massive fines being levied by governments and regulators for anti-money laundering (AML) compliance failures, this has become a critical activity for financial services organizations. The power is now available to monitor the literally billions of transactions processed every day.
An IT organization has moved its information security threat risk assessment tool onto an in-memory platform. Previously, the tool was limited to assessing intrusion risks by analyzing a sample of intrusion attempts. As a result, its accuracy and reliability were limited. Now, it bases its assessment on the full history of intrusion attempts.
SAP is one of many companies that use social media monitoring technology (sometimes referred to as sentiment analytics or text analytics) to monitor what people are saying about the company; this keeps their fingers on reputation risk.
SAP’s internal risk management function is in the process of deploying mobile risk analytics. Linked to our enterprise risk management system, this mobile app will enable every manager to see and dive into the risks they own. It is enabling risk management to be “embedded” into daily management of the business.
Other companies are using new technology to improve their monitoring and communication of risks across the organization. It is great to go to a conference, such as Compliance Week West, and see the growing maturity of risk and compliance solutions showcased by vendors, some with integrated risk monitoring capabilities.
The ability to monitor risk and compliance in a more dynamic fashion that is responsive to change delivers power and value to the organization – and to the contribution that can be made by risk and compliance professionals.
But, I hear you say, risk and compliance functions don’t have the money to spend on expensive new toys.
That is true, but the majority of companies are either acquiring or actively looking at the new technology to improve business operations – especially to leverage so-called Big Data, but also to improve the analytics used to make decisions and run the business.
Risk and compliance professionals should be looking for the opportunity to leverage the technology their organization is acquiring for other purposes. That is what my friends at the Silicon Valley technology giant did.
What to look for? Here’s a partial list of new technology to power risk and compliance:
In-memory computing (sometimes called in-memory analytics, sometimes simply a platform or a database; SAP’s solution is called HANA)
Mobile analytics (sometimes referred to as mobile business intelligence, or mobile BI)
Risk monitoring, including event monitoring (where a real-time agent tests individual transactions against rules as they are processed)
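The event-monitoring idea in the last item can be sketched in a few lines – an illustrative agent and rules, not any particular product. Each transaction is tested against the rules at the moment it is processed, and only the hits raise alerts:

```python
# Sketch of an event-monitoring agent: each transaction is
# tested against rules at processing time (illustrative rules).
rules = [
    ("large payment",   lambda t: t["amount"] > 10_000),
    ("weekend posting", lambda t: t["weekday"] in ("Sat", "Sun")),
]

alerts = []

def process(txn):
    # In a real deployment this hook would sit in the
    # transaction pipeline; here we just collect alerts.
    for name, test in rules:
        if test(txn):
            alerts.append((name, txn["id"]))

for txn in [
    {"id": 1, "amount": 500,    "weekday": "Mon"},
    {"id": 2, "amount": 25_000, "weekday": "Tue"},
    {"id": 3, "amount": 900,    "weekday": "Sun"},
]:
    process(txn)

print(alerts)  # [('large payment', 2), ('weekend posting', 3)]
```

In-memory platforms make this practical at scale: the same rule evaluation that once ran as a monthly batch can run against every transaction as it happens.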
"This resource guide, prepared by DOJ and SEC staff, aims to provide businesses and individuals with information to help them abide by the law, detect and prevent FCPA violations, and implement effective compliance programs."
I am not an attorney, so I will not provide additional highlights in my normal fashion. Instead, I have included links to legal firms’ and experts’ analyses.
A recent whitepaper by Michael Rasmussen titled “Anti-Bribery & Corruption: The Good, The Bad, & The Ugly” discusses how over the past 18 months the sentiment at the DOJ has shifted from somewhat passive to a proactive approach of requiring multi-national companies to demonstrate they have process checks & balances in place to alert them to any Foreign Corrupt Practices Act (FCPA) event.
With this new direction from the DOJ and the expanding regulations, increased fines and sanctions around the world, today’s organizations need preventative and detective measures to monitor for corruption. A proactive compliance program that includes Transaction Monitoring demonstrates strong controls that can help shield a company from liability.
This paper discusses how transaction monitoring eases the anti-corruption compliance burden by delivering operational effectiveness, human and financial efficiency and agility to compliance processes by monitoring the transactions and the personnel that perform them, and detecting and preventing bribery, corruption and other types of fraud.
Deloitte has done a good job summarizing some of the fast-moving developments and applications of the latest business technology in their Tech Trends 2012: Elevate IT for digital business. The summary page includes a short video discussion that is worth reviewing before reading the report.
The publication lists and discusses five “disruptors” and five “enablers”, each of which is an interesting topic.
The disruptive technologies are:
Enterprise Mobility Unleashed
The enabling technologies are:
Here are some of the points that caught my eye:
It’s an uncommon, and perhaps even unique, time to have so many emerging forces – all rapidly evolving, technology-centric and each already impacting business so strongly. Whether or not you have previously thought of your business as inherently digital, the convergence of these forces offers a new set of tools, opening the door to a new set of rules for operations, performance and competition. This is an opportunity for IT to truly help elevate business performance.
Each of these 2012 trends is relevant today. Each has significant momentum and potential to make an impact. Each warrants timely consideration. Forward-thinking organizations should consider developing an explicit strategy in each area – even if that strategy is to wait and see. But whatever you do, step up. Use the digital forces to your advantage. Don’t get caught unaware or unprepared.
Leading enterprises today are applying social technologies like collaboration, communication and content management to social networks – the connected web of people and assets that impact on a given business goal or outcome – amplified by social media from blogs to social networking sites to content communities. Yet it’s more than tools and technology. Businesses are being fundamentally changed as leaders rethink their core processes and capabilities with a social mindset to find new ways to create more value, faster.
Millennials joining the workforce are wired to use social and mobile channels to bond, socialize and solve problems. Organizations that lack internal, governed social media and computing channels may find their younger employees using public tools as a well-intentioned, but risky, alternative.
Mobility has evolved from an issue within a few niche industries and functions (think oil & gas and logistics services) to a potential source of innovation across wide-ranging vertical industries, processes and business models. And while many of the underlying components have been evolving for decades, the break-out potential is only now being realized.
Early experiments in business-to-consumer and early business-to-business scenarios are leading to more compelling, complex applications across the enterprise value chain, making integration, security and manageability more critical.
Organizations need policies and tools to authenticate users; control devices, applications and data; provide end-to-end encryption while at rest, in flight and in use; run content filtering and malware protection; and allow security event monitoring, logging and response. Security policies and profiles should be tied to specific users and scenarios, focusing remedies on likely incidents, not the infinite range of risk possibilities.
Developing, deploying and supporting mobile solutions is quite a bit different than traditional IT. Doing it well requires a special blend of business insight, deep technical chops and strong design. Companies that recognize this required mix of business, art and science can set themselves apart from their competition and help to reshape entire industries.
End users have plenty of opportunities to bypass IT and procure off-the-shelf or low/no-code solutions that are just good enough to meet their needs. Through mobile and desktop application (app) stores, cloud-based marketplaces and rapid development and deployment platforms, business stakeholders are one swipe of the corporate credit card away from procuring rogue “almost-enterprise” applications to fulfill their unmet needs. As a result, CIOs should consider adopting a design-led, user-centric approach to new application development, while also accepting the inevitability of business users directly sourcing apps. BYOA (bring your own application) will likely become part of many organizations’ solution footprints.
Whether your organization will shift to cloud services is unlikely to remain an open question. The question now becomes how that shift is likely to happen. Will your approach add complexity to your technology environment – or will it bring elegance and simplicity? The choice is yours.
The potential of big data is immense. Remove constraints on the size, type, source and complexity of useful data, and businesses can ask bolder questions. Technology limitations that once required sampling or relied on assumptions to simplify high-density data sets have fallen to the march of technology. Long processing times and dependencies on batch feeds are being replaced by on-demand results and near real-time visibility.
Organizations that put big data to work may pursue a huge competitive edge in 2012 – and beyond.
These emerging technologies are driving business transformation and bold new applications – sources of both sustaining and disruptive innovations. Social business and enterprise mobility are changing the way business is conducted and, increasingly, allowing new operating and business models to emerge. IT organizations should be educating their business counterparts on the potential of these new technologies – and preparing for the groundswell of demand once the implications are understood.
Creating a vision for driving innovation is the often-forgotten third role of CIOs. Beyond running the business of IT and delivering IT to support the needs of the business, CIOs should be leading the charge toward innovation through emerging solutions and technologies.
What does this mean for boards, executives, and risk and assurance professionals? My view:
Does your organization have an organized strategy and plan for adopting and optimizing the use of technology? This applies both to the technology that is selected and the technology that is not selected.
Are the CIO and his team working cooperatively with the business? Are priorities shared? Is the business doing its own thing? Is IT a visionary leader, or a reluctant irrelevance?
Are the risk, compliance, control, and information security functions involved early and throughout the strategy-setting and deployment phases? Are risks and controls thought of before or not until after there is a problem?
Are you sure you will not be left behind by a competitor who leverages the new technology better than you?
Their message is clear: putting access to information in the hands of management and employees improves the timeliness and quality of decisions, and helps enable improved performance.
Mobile BI (described here) refers to the ability not only to see dashboards and other visualization of information on mobile devices (tablets and smart phones), but in most cases to analyze that data directly on the device. Generally speaking, the device is connected over the internet to enterprise systems so that the data is current and subject to corporate internal controls.
Key points include:
46% believe that mobile BI will give them a competitive advantage
The average time to make a decision is just 66 hours for those with this technology enabled compared to 190 days without
What does this mean to board members, risk, and assurance professionals?
Board members can get not only their board books on their iPad, but intelligent dashboards that enable them to drill down into and explore the numbers
Board members, executives, and practitioners can get real-time alerts of changing conditions delivered to the palm of their hands for rapid response
Risk information can be delivered to managers, enabling them to make business decisions based on the risks today (assuming risk monitoring is in place)
Practitioners can continue to monitor risk conditions even when not at their desks. In addition, they can monitor performance – and I continue to assert that key performance indicators provide a window into how well risks are being managed
Internal auditors can run CAATs (computer-assisted audit techniques) on the go! This new technology is simple to use and will improve the quality of the audit
Internal auditors should consider the quality of information when assessing internal controls. After all, Information and Communication is a COSO component!
Whether you are one of those who like the term ‘risk appetite’, prefer ‘risk tolerance’, or advocate (as I do) the ISO 31000:2009 term ‘risk criteria’, this is a tough area. While regulators (in Basel III and multiple nations’ corporate governance codes, for example) frequently require organizations to establish one, I have yet to see something that really works.
While it may be possible to establish acceptable risk levels or criteria for aggregated financial risks, how do you set such standards for reputation, strategic, compliance, political, or IT-related risks? How do you establish and then measure aggregate reputation or compliance risk across the organization? I have seen some companies set a single number, say 3% of capital, as their “risk appetite”, but how can that make sense when you are considering compliance risk? How does it help a procurement manager decide whether to use a sole source vendor of essential components, to use two and allocate each 50% of the supply, or take another approach? Surely, (a) no organization can rely on a single “risk appetite” number: you need several, each covering a different category of risk; and, (b) what counts is the ability to direct risk-takers (frontline managers) in their daily decisions as they run the business.
Attempts to solve the problem have come from:
COSO, in a paper by Dr. Larry Rittenberg and Frank Martens
Although each has value, none have so far met my test, which I have summarized below.
To be effective, an organization needs measures (whatever you want to call them) that allow:
The board and top management to ensure that the risks taken across the organization, individually and in aggregate, are the risks they want taken. This is extraordinarily difficult when you consider the risk decisions that are taken every day as part of running the business and how they interact, with a decision in one area affecting risks and opportunities in a distant part of the organization, plus how they need to be aggregated to provide risk vision across the entire enterprise.
Managers making decisions to understand not only the risks they are taking (and modifying), but whether they are the risks top management and the board want them to take. The issue here is applying top-level “risk appetite statements” to individual decisions. If this bridge cannot be crossed, then the entire exercise has limited value – other than cosmetically.
That’s the key for me. If there is no practical guidance for the frontline manager, this is all a chimera: a look-good, check-the-box practice that does not have any real effect on how risk is being managed across the organization.
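To show why a single number fails the frontline test, here is a toy sketch – hypothetical categories, thresholds, and scores – of per-category criteria applied to the procurement decision mentioned earlier. A single 3%-of-capital figure says nothing here; separate limits per category at least give the manager something to check a decision against:

```python
# Toy per-category risk criteria (hypothetical thresholds).
# Scores are on a 1-10 scale; a decision is within appetite
# only if every affected category stays under its limit.
criteria = {"supply": 6, "compliance": 3, "reputation": 4}

def within_appetite(decision_risks, criteria):
    """True if no category's score exceeds its criterion."""
    return all(score <= criteria[cat] for cat, score in decision_risks.items())

# Sole-source vendor: cheaper, but concentrated supply risk.
sole_source = {"supply": 8, "compliance": 2}
# Dual-source at 50/50: costlier, but spreads the risk.
dual_source = {"supply": 4, "compliance": 2}

print(within_appetite(sole_source, criteria))  # False
print(within_appetite(dual_source, criteria))  # True
```

Even this toy exposes the real difficulty: someone still has to translate board-level appetite into those category thresholds, and aggregate thousands of such decisions back up – which is exactly the bridge I have not yet seen crossed.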
In the latest McKinsey Quarterly is an interesting discussion entitled Managing the strategy journey. I was struck by the early reference to “pervasive, ongoing uncertainty” and the statement that “companies needed to get their senior-leadership teams working together in a fundamentally different way”. Their basic point is that rather than strategy being set in annual or semi-annual meetings, there should be a more continuous process, a journey, in which strategies and related actions are monitored and adjusted at least weekly.
The authors don’t hold back with their language, noting that they found “strong evidence that a great many companies are generating strategies that, by their own admission, are sub-standard”. Only 35% of strategies they evaluated were considered likely to beat the competition.
McKinsey’s first recommendation is that companies’ executive leadership teams should increase the time they spend together on strategy to 2-4 hours every week or two. Why? Because (although the authors don’t say it this way) risks to strategies and objectives are changing all the time, requiring response and, sometimes, adjustment to strategic directions. The authors do talk about companies’ inability to manage uncertainty – i.e., risk – and the need to develop “uncertainty-management skills”.
They also recommend moving to rolling forecasts and budgets – which recognize that trying to look 12 months or more out is gazing into the mists of uncertainty (my metaphor). McKinsey has some detailed ideas and suggestions for moving to that more frequent process.
The board is advised to recognize the need for flexibility and that the company’s strategy is not “set in stone” but rather there is a “continual evolution and refreshment of the enterprise’s strategic direction.” I would add that this emphasizes the need to integrate risk management into both strategy-setting and performance management.
Do you agree that executive leadership teams should spend as much time monitoring and adjusting strategy as on operational issues?
This week, I was honored to present to 220 board directors at Bursa Malaysia, the Malaysian stock exchange (an event coordinated by my friends at IIA Malaysia).
The topic was “Governance, Risk Management, Compliance: What Directors Should Know” and if you are interested you can download a copy of my slides. I defined GRC as the capability that enables an organization to set objectives that deliver value to stakeholders, optimize performance to achieve or surpass objectives through management of risk, and act with integrity (which includes not only compliance with laws and regulations, but with the expectations of the society within which we live and operate).
During the course of the presentation, and in answer to a number of questions, I made the following observations about critical actions necessary if governance, risk management, and internal audit are to be effective in delivering performance and value to stakeholders:
The board should have a majority of directors who are independent of management (the exception being in a family business)
The audit committee and the risk committee (if there is one) must include individuals with expertise in risk management – which extends beyond financial risks to all risks of significance to the organization, including reputation risk, operational risk, etc.
Risk management must be recognized as being more than covering the back of the organization (i.e., a compliance activity focused on minimizing the potential impact of major disasters). Instead, it needs to be acknowledged as enhancing the ability of the organization to move forward and achieve or surpass its objectives. Risk management has to be embedded in strategy, performance management, and daily decision-making processes – enabling the organization (a) to make better decisions because it has information and is taking action in response to the uncertainty in its path, and (b) to optimize potential outcomes
Board members cannot be effective without reliable, timely information or confidence in management’s processes and controls (including risk management). The best source for assurance on both counts is an internal audit department that is:
Independent of management, reporting directly to the board or a committee of the board
Led by an experienced and competent professional (CAE) who was selected by the board and not by management. (It is not acceptable, in my opinion, for management to manage the hiring process and present its selection to the board for approval)
The CAE’s performance must be assessed by the board, not management, and the compensation of the CAE, including bonus, must also be set by the board
The internal audit function has to be sufficiently resourced to meet its obligation to provide assurance and consulting services relative to the more significant risks to the organization
Only the board can terminate or discipline the CAE, and only the CAE can terminate or discipline any of the staff of the internal audit function
The CAE should report not only to the audit committee of the board, but to any committee that is responsible for oversight of areas addressed by internal audit. For example, the CAE should regularly report to a board’s risk committee, compliance committee, governance committee, etc.
The internal audit function must provide assurance to the board or committee of the board in the form of a formal opinion at least annually. That opinion will be based on the work performed (and the CAE should ensure that the audit plan considers all risks of significance to the organization) and include an assessment of whether the organization’s governance, risk management, and related internal control processes provide reasonable assurance that risks of significance to the organization are managed within acceptable levels/criteria
Individuals who possess information about potential violations of the organization’s code of conduct, including but not limited to violations of law or regulation, should be able to report their suspicions to an objective party independent of the management responsible for the area, without fear of retaliation or other harm. That may mean reporting directly and anonymously to the audit committee, but in practice the latter will delegate this important responsibility to a trusted advisor, such as the head of internal audit, who can be relied upon to keep the allegation and related information confidential
Some observations and clarifications on the above:
While many nations’ corporate governance codes or regulations require listed companies to have an internal audit function, there is no requirement to have an independent or competent one. That has to change. Too many organizations are complying with the letter and not the spirit of these codes by hiring an inexperienced individual who reports to lower-level management and not the board
While far too few internal audit departments are providing the opinion I say should be mandated (and one I provided for 20 years as a CAE), this should be the #1 priority for every CAE. It may take a couple of years to change not only the activity but the philosophy of the internal audit function, but this is critical if the board is to obtain the assurance it needs to be effective.
IT IS NOT ACCEPTABLE TO REQUIRE MANAGEMENT AND/OR THE BOARD TO ASSESS RISK MANAGEMENT AND INTERNAL CONTROL WHILE ALLOWING INTERNAL AUDIT TO WITHHOLD ITS PROFESSIONAL OPINION
That CAEs have not provided an opinion in the past, and that audit committees have not been asking for one, does not change the fact that this is critical. AUDIT COMMITTEES SHOULD DEMAND AN OPINION
Far too many risk professionals and their management have limited the formal risk management program to a few (10-20) so-called ‘high’ risks that deal only with potential adverse events. For risk management to deliver on its potential, it has to enable management to make risk-intelligent decisions that drive performance – optimizing potential upsides as well as minimizing the downside
Managing risk cannot be left to periodic meetings, workshops, and assessments. The business runs every day; risks change all the time; and the consideration and management of risk has to match the speed of decision-making
RISK MANAGEMENT CANNOT BE A ‘CHECK-THE-BOX’ ACTIVITY to demonstrate compliance. The true test of risk management is whether management at all levels is able to confirm that information about uncertainty, together with related actions to modify risk, is helping them make better decisions and be more successful