Launching a Value-Based Analytics and RPA Program

Author: Chris Sanders, CISA, COBIT 5 Foundation Certified
Date Published: 20 December 2018

For those who missed it, SpaceX recently launched a Tesla automobile into outer space. Amid the entertainment value of picturing a car orbiting the Earth, many people missed how the engineers at SpaceX were pushing the limits of what was possible with a smaller budget than some nation-sponsored space programs. So, how did they do it? SpaceX used machine learning (an advanced form of data analytics) to optimize the rocket’s orbit and descent procedures by running millions of simulations in a fraction of the time (and cost) that humans would have required.1 Many organizations are probably already using some form of data analytics in customer profiling, engineering or, hopefully, assessing IT risk.

While initiating an analytics framework for an audit organization may seem like old news, it is worth remembering that Grant Thornton surveyed approximately 170 chief audit executives in 2017 and found that more than 72 percent of organizations use Excel as their primary analytics tool and fewer than 40 percent consistently leverage analytics.2 An industry-agnostic framework can help an organization launch its own analytics program (without a rocket science degree).

Step 1: Focus on the Processes and Identify Quick Wins

For organizations that do not already have established analytics programs, it is not easy to open the business checkbook by talking about advanced analytics that may (or may not) improve risk detection capabilities. This can be even more difficult for an audit team confronted with a decision-maker who responds that the group has “audited risk before without needing any expensive tools.” At exactly this point, practitioners have watched multiple organizations break down the roadblock by identifying processes that, using analytics, could be fully or partially performed by a computer (otherwise known as robotic process automation [RPA]) and for which the investment would result in time-savings value for the organization.

Figure 1

As per figure 1, quick wins in this context are those that are highly repetitive (high value) and use highly structured data sources (low complexity or, put simply, low cost). Figure 2 illustrates how quick wins were determined for a sample IT audit organization. The organization in this example quantified the potential value and costs associated with analytics-based automation of its IT audit control tests on a 1 to 5 scoring scale. The scores were determined based on the team’s self-assessment of procedure frequency and knowledge of data sources (from prior testing) using the following guidelines:

  • Repetitiveness: 1 (ad hoc) to 5 (regular)—As examples, password testing was performed only once per year by this organization, so it was defined as ad hoc and scored a 1 (the lowest score); new user testing was performed three or more times per year, so it was defined, in relative terms, as regular and scored a 5 (the highest score).
  • Data structure: 1 (unstructured) to 5 (highly structured)—As examples, the configuration data sources for password testing were, in many cases, not even stored in databases, so it was scored a 1 (the lowest score); new user testing relied on commonly used data sources with well-defined enterprise fields (such as Active Directory-based user IDs and roles/groups), so it was scored a 5 (the highest score).

Figure 2

Based on the results of this exercise, the organization identified new and terminated user testing as quick wins for analytics-based RPA implementation. Once the sample organization decided which processes/control tests to target, it could calculate its expected return on investment (ROI) for the analytics implementation.
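To make the prioritization exercise concrete, the scoring can be expressed in a few lines of code. The following is a minimal sketch in Python, assuming hypothetical process names and an arbitrary cutoff of 4 on each dimension; the password and new user scores mirror the guidelines above, and the remaining entries are purely illustrative.

    # Minimal quick-win scoring sketch; names, scores and threshold are illustrative.
    # Each candidate control test is scored 1-5 for repetitiveness (value) and
    # data structure (the inverse of complexity/cost), per the guidelines above.
    candidates = {
        "Password configuration testing": {"repetitiveness": 1, "data_structure": 1},
        "New user testing":               {"repetitiveness": 5, "data_structure": 5},
        "Terminated user testing":        {"repetitiveness": 5, "data_structure": 5},
        "Change management testing":      {"repetitiveness": 3, "data_structure": 2},
    }

    THRESHOLD = 4  # assumed cutoff for "high value, low complexity"

    quick_wins = [
        name
        for name, scores in candidates.items()
        if scores["repetitiveness"] >= THRESHOLD and scores["data_structure"] >= THRESHOLD
    ]

    print("Quick wins:", ", ".join(quick_wins))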

Step 2: Calculate ROI for Quick Wins

Finance teams demand numbers-based decisions and will likely need to approve or deny any analytics pilot or full-scale implementation. Therefore, it is critical to build a numbers-based business case via an ROI calculation. The sample organization in step 1 mapped out the manual steps and time associated with its quick wins (new and terminated user testing) and, based on the difference between pre-RPA and post-RPA manual time for the audit team, calculated its estimated time savings (per test). Figures 3 and 4 show the example analyses of both processes.

Figure 3
Figure 4

Note that both of these examples assume that a sample-based testing approach would continue after implementation. However, it is common for organizations to use analytics-based RPA to test 100 percent of their populations, because it is extremely scalable at low incremental cost (as opposed to manual, pre-analytics testing, where organizations rely on sampling to control the high costs of expanding testing scope). The sample organization chose not to adopt full-population testing initially in order to limit the scope and risk of its first implementation, with the intent to evaluate 100-percent testing as a phase two consideration.

Based on the results of the time-saving calculations, the sample organization estimated a savings of four hours for each new user test and 4.5 hours for each terminated user test (per application or other asset). The organization then multiplied those savings by two tests per year across 50 applications and calculated a projected ROI of 850 hours per year for these two quick wins. Opening the discussion with a concrete number such as that, rather than with analytics possibilities of unknown value, is a stronger launching point for most IT audit teams. Many organizations have guidelines for tying cost to hour savings, but this calculation will be different for every organization.
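The arithmetic behind that 850-hour figure is simple enough to verify in a few lines. The following sketch reproduces the sample organization’s calculation using the numbers cited above; the variable names and structure are illustrative only.

    # Projected annual time savings for the two quick wins (figures from the text).
    hours_saved_per_test = {
        "new_user_testing": 4.0,         # hours saved per application, per test
        "terminated_user_testing": 4.5,  # hours saved per application, per test
    }
    tests_per_year = 2
    applications = 50

    annual_hours_saved = sum(hours_saved_per_test.values()) * tests_per_year * applications
    print(f"Projected savings: {annual_hours_saved:.0f} hours per year")  # 850 hours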

Step 3: Partner With IT to Optimize the Management of Technology

The technology used to implement these analytics-based RPA use cases has deliberately not been discussed yet. Choosing a technology and agreeing to a cost are the points where the going gets tough and where projects tend to lose focus, languish at the bottom of prioritization queues or receive a vote of “no” from the decision-maker. Unfortunately, the root cause of these negative outcomes can, in many cases, be traced to audit teams asking for a tool without first defining its use cases and their associated numbers-based ROI.

Many successful teams begin this step of the process by initiating a discussion with IT covering:

  • The analytics and/or RPA tools that already exist in the environment and that could accomplish the use cases without significant investment
  • If no tools exist (or the existing tools are inadequate), how to partner on piloting a third-party tool (or tools)
  • The feasibility of technology to accomplish the targeted use cases within the organization

While each of these discussion points is important, the last is especially critical for gaining consensus on the feasibility of the ROI calculation before committing to expenditures. For example, the sample organization shared its targeted quick wins with the organization’s IT department and decided, with the help of IT, that it was technically infeasible to automate the extraction of user lists from applications hosted by third-party vendors without significant investment. Consequently, the organization agreed to reduce the time-savings calculation for that activity and reached a new consensus-based ROI calculation. At this point, the project became a joint goal, with the quick wins becoming the key performance indicators (KPIs) used to score the fit of the technologies evaluated.

While some organizations have successfully built and scaled their own analytics-based RPA technology, most audit teams do not have the expertise or time to build and manage their own technology platform sustainably. The most successful audit teams have taken a joint approach to tool set deployment. An example of an optimized responsibilities breakdown is depicted in figure 5.

Figure 5

Within figure 5 are two components that will likely be the primary responsibility of an IT organization but also require input from the audit team:

  • Tool deploy and run—Pilot platforms against use cases to ensure that KPIs are met.
  • Data extract, transform, load (ETL)—Define the data needed and the frequency of imports required (an illustrative data-requirements specification follows this list).
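As an illustration of the audit team’s input to the ETL component, the data requirements can be captured in a simple, shareable specification. The sketch below is hypothetical; the source systems, field names and import frequencies are placeholders rather than a prescribed format.

    # Hypothetical data-requirements specification handed to IT for the ETL step.
    # Source systems, field names and import frequencies are illustrative placeholders.
    data_requirements = [
        {
            "use_case": "New user testing",
            "source": "Active Directory",
            "fields": ["user_id", "role_group", "created_date", "approver"],
            "import_frequency": "weekly",
        },
        {
            "use_case": "Terminated user testing",
            "source": "HR system termination report",
            "fields": ["user_id", "termination_date"],
            "import_frequency": "weekly",
        },
    ]

    for requirement in data_requirements:
        print(f"{requirement['use_case']}: {requirement['source']} ({requirement['import_frequency']})")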

In line with this optimized division of responsibilities, the audit team’s full control over analytic and workflow design and over results analysis can be critical for establishing a sustainable foundation for expanding the program to additional analytics. New analytics may help a team assess control performance or risk, but they may not have a clear ROI, which can stall future projects. If a technology does not support this division of responsibilities, it may be worth discussing other tools that do.

Step 4: Launch

Ideally, the team is able to work through steps 1 through 3 and deploy a technology platform to begin exercising its quick wins. If not, the team is not alone; implementation can be especially difficult for small organizations in which the calculated ROI may be lower than the high costs associated with leading analytics and RPA technologies. If the team or the organization falls into this category, there is still hope. Following steps 1 through 3 should have identified reasonable use cases that could likely be implemented in something as simple as Microsoft Excel, using recorded Visual Basic for Applications (VBA) scripts to achieve full or partial automation. If the organization outgrows Excel, Microsoft Access is another tool already available in most organizations that can enable analytics, and there are even several powerful open-source analytic tools (such as KNIME) available on the Internet. These tools may lack the integrated, feature-rich capabilities of the market leaders, but they will allow the organization to begin the move to analytics-based RPA testing. The discussion with IT can be leveraged to partner on the right approach for the organization. After all, even rocket scientists work in teams.
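To illustrate how modest the initial tooling can be, the following is a minimal sketch of partially automating the terminated user test with nothing more than a scripting language and two exported files (comparable logic could be recorded as an Excel VBA macro, as noted above). The file names and column headers are assumptions, not a prescribed layout.

    import csv

    # Minimal, hypothetical terminated-user access test: flag any user who appears
    # on the HR termination report but still holds an active account in the
    # application's user export. File names and column headers are assumptions.
    TERMINATIONS_FILE = "hr_terminations.csv"   # expected column: user_id
    ACTIVE_USERS_FILE = "app_active_users.csv"  # expected column: user_id

    def load_column(path, column):
        """Read a single column from a CSV file into a normalized set of values."""
        with open(path, newline="") as f:
            return {row[column].strip().lower() for row in csv.DictReader(f)}

    terminated = load_column(TERMINATIONS_FILE, "user_id")
    active = load_column(ACTIVE_USERS_FILE, "user_id")

    exceptions = sorted(terminated & active)  # terminated users who still have access

    print(f"{len(exceptions)} potential exceptions for manual follow-up:")
    for user_id in exceptions:
        print(f"  {user_id}")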

Conclusion

The potential for analytics and RPA technologies within IT audit organizations is widely agreed upon, but implementation and adoption efforts have languished in many organizations. Organizations that identify their own value opportunities and goals first, prior to exploring technology options, are more likely to have long-term success.

Author’s Note

Opinions expressed in this article are the author’s own and do not necessarily represent the views of Charles Schwab.

Endnotes

1 Fernholz, T.; “SpaceX’s Self-Landing Rocket Is a Flying Robot That’s Great at Math,” Quartz, 21 February 2017, http://qz.com/915702/the-spacex-falcon-9-rocket-you-see-landing-on-earth-is-really-a-sophisticated-flying-robot/
2 Grant Thornton, “Data Analytics’ New Frontiers: Internal Audit Looks Forward,” 22 September 2017

Chris Sanders, CISA, COBIT 5 Foundation
Is an internal controls and process improvement team manager for Charles Schwab in Denver, Colorado, USA. He has seven years of IT audit experience and has led the creation of data analytics programs for multiple IT audit and governance groups.