A small internal IT audit team needed to review, evaluate, test and provide assurance over a complex and diverse environment. The enterprise to be tested consisted of 7 core divisions and more than 300 entities that varied across approximately 90 IT domains. Each of the management teams responsible for the IT domains had its own requirements and, thus, its own systems, standards and policies. This translated into a complex environment and an audit cycle of more than 2 years, which, in a dynamic and evolving technology landscape, was too long to adequately address the associated risk.
To address this challenge, IT general control procedures were automated so that testing could be performed across all organizations in a continuous and/or periodic manner. Throughout this process, the audit team learned multiple important lessons about things it could have done differently during the audit automation process.
The audit team had 7 major takeaways following the audit automation process:
- Establish control tests that require structured data first—Very early in internal audit's automation journey, the audit team tackled complex controls that required machine learning (ML) to gather evidence. In hindsight, team members felt they should have focused first on automating key controls that already had structured data available for testing (see the sketch after this list for a simple example).
- Focus on quick wins—The audit team knew what it wanted to audit and which data it needed to do so, but the data were not always readily available. Instead of focusing on quick wins and automating the maximum number of controls with the data it could collect, the team spent a significant amount of time trying to create structured data from varying sources.
Evidence-gathering methods and control tests that use ML can make an impact in the long run (and can be fun to display proudly), but they take skills and time to develop. Instead, the audit team would have gained more value from addressing the more analytical procedures that were well-defined and potentially already running as scripts. To grow an audit program, one should aim to demonstrate quick wins with scalable control tests to secure budget and goodwill for any complex evidence-gathering methods and control tests that may be sought later.
- Create a data collection framework—The audit team wished it had researched data-collecting agents and technologies in more detail before tackling the IT landscape work. The team wrote its own tools to collect the evidence needed to perform the audit testing, which resulted in additional maintenance and support responsibilities shifting to the auditors. This, in turn, shifted the focus to supporting IT in running the tools and understanding their outputs rather than providing insight to management and building more control tests.
- Use third-party specialists—For more complex areas such as patching and network and website security, the audit team would have preferred to integrate with third parties sooner. These types of vendors have security specialists dedicated to researching, collecting and exposing vulnerabilities. Their tools stay up to date with low-level standards that the typical IT auditor would not know to look for. By working with such vendors, the audit team could have focused on identifying whether there was risk and whether it was being managed, rather than on the more granular details.
- Define a measurement framework—The audit team felt it should have defined the risk framework used by the audit automation in much more detail, or stayed away from defining risk per control entirely. The focus would then have shifted to a key performance indicator (KPI) framework instead. KPIs make more sense to the audience receiving the results and drive the correct governance behaviors. Because of the transparency of the portals, one might end up debating how risk is calculated rather than discussing the valid findings themselves. (The sketch after this list rolls test results up into a simple KPI.)
- Think about the impact on operations and audit—Often, auditors must execute the control tests as designed and operated by management to determine whether the task is being performed properly. If the review process is automated, the organization will be inclined to include the outcomes of automated audit testing within its operational processes. However, this brings the auditor’s independence into question. By leveraging third-party applications (apps) and views that are logically segregated between the organization and the audit teams, operational procedures and audit tests can be split on the same platform. Auditors must think about what is truly being audited. Is it an audit to illustrate that the risk exists, or that management is not taking action? Sometimes it is both.
- Do not forget change management—Last, but most important, is change management. The audit team found that the biggest hurdles to overcome were not the availability of data, defining the procedures to be automated or presenting the findings and reporting in a continuous manner. The most important lesson from implementing automated IT control procedures was the need to take people into account sooner. When one embarks on the journey of automating IT control procedures, it is very important to consider the affected stakeholders from the start. Their buy-in should be secured, and it should be confirmed that all relevant stakeholders are on board.
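To make the first two takeaways more concrete, the following is a minimal sketch of the kind of structured-data control test that lends itself to early automation, with the results rolled up into a simple KPI rather than a per-control risk score. The file name, column names and compliance rule are hypothetical illustrations and are not the audit team's actual procedure or tooling.

```python
# Minimal sketch of an automated ITGC test over structured data (illustrative only).
# Assumptions: a hypothetical extract "user_access.csv" with columns
# "user_id", "status" ("enabled"/"disabled") and "termination_date" (ISO format,
# blank for current employees). The rule tested: leavers must have disabled accounts.
import csv
from datetime import date


def run_leaver_access_test(path="user_access.csv", as_of=None):
    """Flag accounts of leavers that are still enabled and roll the result
    up into a simple KPI (percentage of accounts that are compliant)."""
    as_of = as_of or date.today()
    exceptions, total = [], 0

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            term = (row.get("termination_date") or "").strip()
            # A user who has already left but whose account is still enabled
            # is an exception to report.
            if term and date.fromisoformat(term) <= as_of and row["status"] == "enabled":
                exceptions.append(row["user_id"])

    kpi = 100.0 if total == 0 else 100.0 * (total - len(exceptions)) / total
    return exceptions, kpi


if __name__ == "__main__":
    leavers_with_access, pct_compliant = run_leaver_access_test()
    print(f"Exceptions: {leavers_with_access}")
    print(f"KPI - accounts compliant: {pct_compliant:.1f}%")
```

A test of this kind requires only an extract that most identity or HR systems already produce, which is what makes it a quick win compared with ML-based evidence gathering, and the KPI output gives stakeholders a figure to govern against rather than a debatable risk calculation.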
Conclusion
Considering these items when planning the automation of IT audit control tests makes the actual delivery easier, faster and much more sustainable in the long run. With the described 7 aspects in place, it is possible to scale a solution for maximum impact across multiple business areas.
Editor’s Note
Hear more about what the author has to say on this topic by listening to the “Seven Things to Know Before Automating IT General Control Audits” episode of the ISACA® Podcast.
Frans Geldenhuys, CISA, CA(SA)
Is a chartered accountant and a founding member of Bidvest Advisory Services (Pty) Ltd, a South African software development and consulting company focusing on the automation of professional services through its platform, ALICE. He has more than 10 years of experience in the audit industry performing financial, operational and IT audits.
Gustav Silvo, CISA
Is an experienced IT auditor and a co-founder of Bidvest’s ALICE. His primary objective is to be an enabler for positive change by providing IT assurance and advisory services where needed. Silvo leads research and development efforts for IT audit in ALICE. His research focuses on future trends and implementing control testing and usability requests from IT auditors and managers.