ISACA Journal Volume 6 Podcast: Cybersecurity for a “Simple” Auditor
A few issues back, I wrote about the US National Institute of Standards and Technology’s (NIST) Framework for Improving Critical Infrastructure Cybersecurity.1 In that article, I pointed out that the framework conflates information security and cybersecurity, which I believe should be differentiated. I received a very gratifying note regarding that article from Ian Sharland in South Africa, which said, in part, that he had “been struggling to articulate the differences—for our senior management—between our previous information security audit process (based on a combination of the COBIT, ISO 27001 and ITIL frameworks/standards) and this cybersecurity audit process.” Ian’s message raised a question in my mind: What exactly is the cybersecurity audit process?2 If, as I contend, cybersecurity is above and beyond information security, how then is the audit approach different?
Lack of Cybersecurity Standards
One difficulty in assessing cybersecurity preparedness is the lack of a standard to serve as the basis for an audit.3 The NIST framework has become a de facto standard despite the fact that it is more than a little sketchy as to details. Though it is not a standard, there really is nothing else against which to measure cybersecurity. Moreover, the technology that must be the subject of a cybersecurity audit is poorly understood and is mutating rapidly. Auditors (and everyone else, for that matter) are hard-pressed to keep up.
Now, some auditors are learned and savvy in the ways of technology. I will leave it to them to teach us all the ways of finding the deep truths about cybersecurity. Right now, I would rather address myself to a “simple” auditor, one who is not so skilled in system internals and knows not what to ask.
A “simple” auditor should consider the fundamental difference between information security and cybersecurity: the nature of the threat. There is simply a difference between protecting information against misuse of all sorts and defending it against an attack by a government, a terrorist group or a criminal enterprise that has immense resources of expertise, personnel and time—all directed at subverting one individual organization. To use a somewhat inapt analogy, I protect my car with a lock and insurance, but those are not the tools of choice if I see a gang armed with crowbars and chainsaws approaching my fender. This distinction, to my mind, is the very core of auditing an organization’s preparations for defending itself against cyberattacks.
Simple Questions
As is true in so many cases, the cybersecurity audit process begins with the objectives of an audit, which leads to the questions one chooses to ask. If a “simple” auditor only wants to know “Are we secure against cyberattacks?” then the answer should be written in stone: No organization should consider itself safe against cyberattackers. They are too powerful and pervasive for any complacency. If major television networks can be stricken,4 if the largest banks can be hit,5 if governments are not immune,6 then the auditor’s own organization is not secure either.
Still, “simple” auditors can ask subtle and meaningful questions, specifically focused on the data and software at risk of an attack. An audit process specific to cybersecurity might delve into the internals of database management systems and system software, requiring the considerable skills of a “tech-savvy” auditor. Or it might call for asking simple questions and applying basic arithmetic.
Arithmetic
If an auditor’s concern is the theft of valuable information, the simple corrective is to make the data valueless, which is usually achieved through encryption. The “simple” auditor’s question might be, “Of all our data, what percentage is encrypted?” If the answer is 100 percent, the follow-up question is whether the data are always encrypted—at rest, in transit and in use. If it cannot be shown that all data are secured all of the time, the next steps are to determine what is not protected and under what circumstances. The audit finding would consist of a flat statement of the amount of unencrypted data susceptible to theft and a recitation of the potential value to an attacker in stealing each category of unprotected data.
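To make the arithmetic concrete, here is a minimal sketch of the kind of tally a “simple” auditor might request, written in Python. The data stores, sizes and encryption flags are invented for illustration; the real figures would come from the organization’s own data inventory.

```python
# Hypothetical data inventory: each entry is a data store reported by the
# organization, with flags for encryption at rest, in transit and in use.
inventory = [
    {"name": "customer_db",  "size_gb": 500, "at_rest": True,  "in_transit": True,  "in_use": True},
    {"name": "hr_files",     "size_gb": 80,  "at_rest": True,  "in_transit": False, "in_use": False},
    {"name": "legacy_share", "size_gb": 220, "at_rest": False, "in_transit": False, "in_use": False},
]

total_gb = sum(d["size_gb"] for d in inventory)

# Count data as protected only if it is encrypted in every state.
protected_gb = sum(d["size_gb"] for d in inventory
                   if d["at_rest"] and d["in_transit"] and d["in_use"])

print(f"Fully encrypted: {100 * protected_gb / total_gb:.1f}% of {total_gb} GB")

# The raw material of the audit finding: what is exposed, and when.
for d in inventory:
    gaps = [state for state in ("at_rest", "in_transit", "in_use") if not d[state]]
    if gaps:
        print(f"{d['name']}: {d['size_gb']} GB unencrypted {', '.join(gaps)}")
```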
Careful readers may note that data must be decrypted in order to be used and conclude that eternal encryption in use is, ultimately, a futile dream. There are vendors who think otherwise, but let us accept the concept that data will, at some time, be exposed within a computer’s memory. Is that a fault attributable to the data or to the memory and the programs running in it? I say it is the latter. In-memory attacks are fairly devious, but the solutions are not. Rebooting gets rid of them and antimalware programs that scan memory can find them. So a “simple” auditor can ask, “How often is each system rebooted?” and “Does our antimalware software scan memory?”7
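For the first of those questions, a small sketch along the following lines could turn a reboot log into an audit working paper. The systems, dates and 30-day threshold are assumptions chosen only to illustrate the count, not a recommended standard.

```python
from datetime import date

# Hypothetical reboot log: system name and date of last restart, as reported
# by operations; a real audit would pull this from monitoring or CMDB records.
last_reboot = {
    "web-01": date(2015, 6, 1),
    "db-02":  date(2014, 11, 15),
    "app-03": date(2015, 6, 20),
}

REVIEW_DATE = date(2015, 6, 30)
MAX_DAYS = 30  # assumed threshold for this illustration

for system, rebooted in sorted(last_reboot.items()):
    days = (REVIEW_DATE - rebooted).days
    flag = "REVIEW" if days > MAX_DAYS else "ok"
    print(f"{system}: {days} days since last reboot [{flag}]")
```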
To the extent that software used for attacks is embedded in the programs themselves, the problem lies in a failure of malware protection or of change management. A “simple” auditor need not worry; many auditors (and security professionals) have wrestled with this problem and not solved it either. All a “simple” auditor needs to ask is whether anyone would be able to know whether a program had been subverted. An audit of the change management process would often provide a bounty of findings, but would not answer a “simple” auditor’s question. The solution lies in having a version of a program known to be free from flaws (such as newly released code) and an audit trail of known changes. It is probably beyond the talents of a “simple” auditor to generate a hash total using a program as data and then to apply the known changes in order to see if the version running in production matches a recalculated hash total. But it is not beyond the skills of the people responsible for keeping the programs safe. An auditor need only find out if anyone is performing such a check. If not, the auditor can only conclude and report that no one knows for sure if the programs have been penetrated or not.
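As a sketch of what such a check involves, assuming the responsible staff can produce both a reference copy of the program (released code plus the authorized changes on the audit trail) and the version running in production, the comparison itself is little more than two hash totals. The program name and file paths below are invented for illustration.

```python
import hashlib

def file_hash(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: a reference build reflecting all authorized changes,
# and the executable actually running in production.
reference = file_hash("/audit/reference_build/payroll_batch")
production = file_hash("/prod/bin/payroll_batch")

if reference == production:
    print("Production program matches the known-good build.")
else:
    print("Mismatch: the production program differs from the known-good build.")
```

If no one in the organization performs a comparison of this kind, that absence is itself the finding.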
Finally, a “simple” auditor might want to find out if the environment in which data are processed can be secured. Ancient software running on hardware or operating systems that have passed their end of life is probably not reliable in that regard. Here again, a “simple” auditor need only obtain lists and count. How many programs have not been maintained for, say, five years or more? Which operating systems that are no longer supported are still in use? How much equipment in the data center is more than 10 years old? It is only arithmetic.
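The counting, too, can be sketched in a few lines. The inventory extracts below (last-maintenance dates, support status, installation dates) are entirely hypothetical; in practice they would come from asset management and change records.

```python
from datetime import date

REVIEW_DATE = date(2015, 6, 30)
DAYS_PER_YEAR = 365.25

# Hypothetical inventory extracts.
programs = [("payroll", date(2008, 3, 1)),            # (name, last maintained)
            ("billing", date(2014, 9, 12))]
operating_systems = [("Windows Server 2003", False),  # (name, still supported?)
                     ("RHEL 6", True)]
hardware = [("mainframe-1", date(2002, 5, 1)),        # (name, installed)
            ("rack-7", date(2013, 1, 20))]

stale_programs = [n for n, d in programs
                  if (REVIEW_DATE - d).days / DAYS_PER_YEAR >= 5]
unsupported_os = [n for n, ok in operating_systems if not ok]
old_hardware = [n for n, d in hardware
                if (REVIEW_DATE - d).days / DAYS_PER_YEAR > 10]

print("Programs unmaintained for five or more years:", stale_programs)
print("Unsupported operating systems still in use:", unsupported_os)
print("Equipment more than 10 years old:", old_hardware)
```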
A “simple” auditor need not despair. In life, simple questions often lead to profound answers. If the questions are simple, but the answers are too complicated to understand, then who indeed is “technical”?
Endnotes
1 Ross, S.; “Frameworkers of the World, Unite 2,” ISACA Journal, vol. 3, 2015
2 I am not thinking exclusively of actions to be taken by members of an audit function. While surely auditors—internal or external—perform audits, independent assessments of the attainment of objectives, including control objectives, can be performed by any disinterested party.
3 There are several excellent sources of information for auditors who would like to approach this subject, many of them from ISACA and the Institute of Internal Auditors. See particularly Cybercrime Audit/Assurance Program, ISACA, 2012, and Cybersecurity: What the Board of Directors Needs to Ask, ISACA and the Institute of Internal Auditors Research Foundation, 2014.
4 Le Monde, Reuters, “TV5 Monde: les pirates n’ont pas diffusé de documents confidentiels de l’armée” (TV5 Monde: the hackers did not release confidential army documents), Le Monde, 10 April 2015, www.lemonde.fr/pixels/article/2015/04/10/tv5-monde-les-pirates-n-ont-pas-diffuse-de-documents-confidentiels-de-l-armee_4613876_4408996.html
5 Wilson, H.; “Millions Affected After Cyber Attack on HSBC,” The Telegraph, 19 October 2012, www.telegraph.co.uk/finance/newsbysector/banksandfinance/9621883/Millions-affected-after-cyber-attack-on-HSBC.html
6 Office of Personnel Management, “Information About the Recent Cybersecurity Incidents,” USA, 23 June 2015, www.opm.gov/news/latest-news/announcements/
7 Grimes, R. A.; “Should You Worry About Memory-only Malware?” InfoWorld, 4 February 2014, www.infoworld.com/article/2608848/security/should-you-worry-about-memory-only-malware-.html
Steven J. Ross, CISA, CISSP, MBCP, is executive principal of Risk Masters International LLC. Ross has been writing one of the Journal’s most popular columns since 1998. He can be reached at stross@riskmastersintl.com.