In a couple of recent editions of the ISACA® Journal1, 2 (referred to herein as columns one and two), my fellow columnist, Steven J. Ross, argued that, contrary to the EU General Data Protection Regulation (GDPR), privacy cannot, as a practical matter, be part of a system’s design. I beg to differ and will do so by working through the points made in his columns as I understood them.
Genuine Harm
At the end of his first column, Mr. Ross notes that designing “privacy” into systems where a breach will have no real consequences diminishes the attention required to protect against truly intrusive systems.3 He reached this conclusion by way of an example in which he considered buying a castle in Spain and reviewed the prices online. From a privacy perspective, however, the point is that not everyone has that simple luxury. A case in point is Facebook being accused by the US government of breaking the law by restricting who can view housing-related ads based on their “race, colour, national origin, religion” (sensitive personal data under the GDPR).4 For the individuals involved, this had very real consequences. This is a failure of design. Why was this data collected? How could it have been used for that purpose?
Privacy by Design and GDPR
At the beginning of the second column, Mr. Ross challenges anyone to remember the beginning of the first sentence of GDPR article 25 (Data Protection by Design and by Default)5 by the end of it. He implies that it is a fault that the article was written by a committee. On the contrary, I consider this a strength and, as a member of the Certified in the Governance of Enterprise IT® (CGEIT®) Item Development Group, I see this very strength in action. For Item Writing Groups, ISACA requires, to the extent possible, geographical representation. All members must accept the text of every question before it can move forward to the item bank. In other words, ISACA wants to ensure that all points of view are considered. Committees and, indeed, compromise are the very foundations of a liberal democracy.
At the time of this writing, the European Data Protection Board (another committee) has its Guidelines on Data Protection by Design and by Default6 out for public consultation. The document interprets GDPR article 25, discussing the rights and freedoms referenced therein. The rights are documented in Article 8 of the Charter of Fundamental Rights of the European Union7 and include the right to the protection of personal data and the right to have data processed fairly (for specified purposes) on the basis of the consent of the person concerned or some other legitimate basis laid down by law. The freedoms are discussed in GDPR recital 4,8 which contains an important addition to the help provided in column two:
This Regulation respects all fundamental rights and observes the freedoms and principles recognised in the Charter as enshrined in the Treaties, in particular the respect for private and family life, home and communications, the protection of personal data, freedom of thought, conscience and religion, freedom of expression and information, freedom to conduct a business, the right to an effective remedy and to a fair trial, and cultural, religious and linguistic diversity.9
In other words, GDPR article 25 is not just about security; it is also about privacy. It is important to remember that security does not (necessarily) mean privacy. Privacy is a possible outcome of security,10 but it is possible to have a privacy violation affecting these freedoms without a security breach.
Cyberthefts of Personal Information
That is not to say that the breaches identified in column two (Equifax in the United States, British Airways, Caisse Desjardins in Canada, Uniqlo in Japan and the theft of data on virtually the entire population of Bulgaria)11 are not privacy breaches; they most definitely are. However, I would like to examine further whether they represent a failure of design.
The aforementioned GDPR guidance12 notes that a technical or organizational measure can be anything from the use of advanced technical solutions to the basic training of personnel, for example, on how to handle customer data. Further, the term “measures” can be understood in a broad sense as any method or means that a controller may employ in the processing. These measures must be appropriate, meaning that they must be suited to achieve the intended purpose, i.e., they must be fit to implement the data protection principles effectively by reducing the risk of infringing the rights and freedoms of data subjects.13
Considering this, any reading of the identified cyberattack with which I am most familiar, the Equifax breach,14 could only conclude that it was, indeed, a failure of design. Further, I am not sure one could claim that everything was done to deter cyberattacks15 in that instance.
“Big Tech”
The essence of a privacy violation may well be in the use of personal information for purposes other than those for which it was collected.16 However, Google simply saying that it will share information with other organizations is not enough. Per the UK’s Information Commissioner’s Office (ICO) report on Real Time Bidding:
As bid requests are often not sent to single entities or defined groups of entities, the potential is for these requests to be processed by any organisation using the available protocols, whether or not they are on any vendor list and whether or not they are processing personal data in accordance with the requirements of data protection law.…Multiple parties receive information about a user, but only one will ‘win’ the auction to serve that user an advert. There are no guarantees or technical controls about the processing of personal data by other parties, e.g., retention, security, etc. In essence, once data is out of the hands of one party, essentially that party has no way to guarantee that the data will remain subject to appropriate protection and controls.17
Further:
…[R]eliance on contractual agreements to protect how bid request data is shared, secured and deleted…does not seem appropriate given the type of personal data sharing and the number of intermediaries involved. This contract-only approach does not satisfy the requirements of data protection legislation. Organisations cannot rely on standard terms and conditions by themselves, without undertaking appropriate monitoring and ensuring technical and organisational controls back up those terms.18
From reading the ICO’s report, it is arguable whether Google tells everyone exactly what it will do with their personal information if they use a Google service. Indeed, it is arguable whether Google itself knows what is done with users’ data. In addition, the only positive thing I can say about Google’s privacy policy is that it has morphed over time, mainly in an attempt to keep up with regulations such as GDPR, from the days when users’ data were collected in aggregate to the 4,000-word monster it is now.19
And, while we are on the subject of Google, I firmly believe that the controversy about YouTube being used by pedophiles referred to in column one was, indeed, a failure of privacy by design. YouTube was designed to be viral, and comments are part of that virality. Similar to the Facebook issue discussed earlier, no thought went into how this could be abused.
Privacy by Design and IT Audit
I hope I have made a strong case for privacy by design. If one accepts that there is a need, then what should we, as IT auditors, look for? Traditionally, designing secure and trustworthy systems has focused on analyzing risk and responding to threats that affect the security goals20 (i.e., confidentiality, integrity and availability). However, as we have seen, there are other risk factors that may affect the rights and freedoms of data subjects.
The loss of control in decision-making, excessive data collection, re-identification, discrimination and/or stigmatization of persons, biases in automated decisions, users’ lack of comprehension of the scope and the risk of unlawful processing or profiling that is invasive or incorrect, are examples of risk to privacy that cannot be managed by using only a traditional risk model that focuses exclusively on security goals.21
To cover these risk scenarios, it is necessary to include three new privacy-focused protection goals:22
- Unlinkability—Seeks to process data in such a manner that the personal data within a domain cannot be linked to the personal data in a different domain, or that establishing such a link involves a disproportionate amount of effort. This privacy goal minimizes the risk of an unauthorized use of personal data and the creation of profiles by interconnecting data from different sets, establishing guarantees regarding the principles of purpose limitation, data minimization and storage limitation.
- Transparency—Seeks to clarify data processing such that the collection, processing and use of information can be understood and reproduced by all the parties involved and at any time during the processing. This privacy goal strives to delineate the processing context and make the information on the goals and the legal, technical and organizational conditions applicable to them available before, during and after data processing to all involved parties, both for the controller and the subject whose data are processed, thus minimizing the risk to the principles of loyalty and transparency.
- Intervenability—Ensures that it is possible for the parties involved in personal data processing and, especially the subjects whose data are processed, to intervene in the processing whenever necessary to apply corrective measures to the information processing. This objective is closely linked to the definition and implementation of procedures for exercising data protection rights, presenting complaints or revoking consent given by the data subjects, as well as the mechanisms to guarantee the data controller’s evaluation of the fulfillment and effectiveness of the obligations that are assigned to them by law.
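To make the first of these goals concrete, consider pseudonymization with per-domain secret keys. The following is a minimal, hypothetical sketch (not from the cited guidance; the domain names and keys are invented for illustration) of how a keyed hash can give the same person different, unlinkable identifiers in different processing domains:

```python
import hmac
import hashlib

def pseudonymize(identifier: str, domain_key: bytes) -> str:
    """Derive a domain-specific pseudonym with a keyed hash (HMAC-SHA-256).

    Because each processing domain holds its own secret key, the same
    person receives different pseudonyms in different domains; linking
    them requires access to both keys, not merely both data sets.
    """
    return hmac.new(domain_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Two hypothetical domains (e.g., marketing and billing) hold separate keys.
marketing_key = b"secret-key-held-by-marketing"
billing_key = b"secret-key-held-by-billing"

email = "data.subject@example.com"
marketing_id = pseudonymize(email, marketing_key)
billing_id = pseudonymize(email, billing_key)

# Same person, yet the pseudonyms cannot be joined across domains.
assert marketing_id != billing_id
# Within one domain the pseudonym is stable, so processing still works.
assert pseudonymize(email, marketing_key) == marketing_id
```

This is only one possible technical measure; an auditor would still verify that the keys are segregated, that the raw identifier is not retained alongside the pseudonym, and that re-linking is restricted to documented, lawful purposes.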
Conclusion
To protect the rights and freedoms of all individuals, privacy must be incorporated into networked data systems and technologies by default. Privacy must become integral to organizational priorities, project objectives, design processes and planning operations. Privacy must be embedded into every standard, protocol and process that touches our lives.23 I believe that it is incumbent on all IT auditors to defend privacy by design.
Steven J. Ross Responds
Overall, I am delighted that something I have written has aroused enough passion that Mr. Cooke has dedicated one of his columns to a reply. Respectful back-and-forth among professionals only adds to readers’ appreciation of the issues involved. I will respond in turn in one of my future columns.
Endnotes
1 Ross, S. J.; “Why Do We Need Data Privacy
Laws?” ISACA Journal, vol. 5, 2019,
http://h04.v6pu.com/archives
2 Ross, S. J.; “Un-Privacy by Design,” ISACA
Journal, vol. 6, 2019, http://h04.v6pu.com/archives
3 Op cit Ross, ISACA Journal, vol. 5, 2019
4 Gabbatt, A.; “Facebook Charged With Housing
Discrimination in Targeted Ads,” The Guardian,
28 March 2019, http://www.theguardian.com/technology/2019/mar/28/facebook-ads-housing-discrimination-charges-us-government-hud
5 Intersoft Consulting, Art. 25 GDPR, Data
Protection by Design and by Default, Belgium,
2018, http://gdpr-info.eu/art-25-gdpr/
6 European Data Protection Board, Guidelines
4/2019 on Article 25 Data Protection by Design
and by Default, http://edpb.europa.eu/our-work-tools/public-consultations-art-704/2019/guidelines-42019-article-25-data-protection-design_en
7 European Convention, Charter of Fundamental
Rights of the European Union, Official Journal of
the European Communities, 18 December 2000, http://www.europarl.europa.eu/charter/pdf/text_en.pdf
8 Intersoft Consulting, General Data Protection
Regulation, Recital 4, Data Protection in Balance
With Other Fundamental Rights, Belgium, 2018,
http://gdpr-info.eu/recitals/no-4/
9 Ibid.
10 ISACA, ISACA Privacy Principles and Program
Management Guide, USA, 2016, http://store.v6pu.com/s/store#/store/browse/detail/a2S4w000004Ko9rEAC
11 Op cit Ross, ISACA Journal, vol. 6, 2019
12 Op cit European Data Protection Board
13 Ibid.
14 Cooke, I.; “Lessons From History,” ISACA Journal,
vol. 4, 2019, http://h04.v6pu.com/archives
15 Op cit Ross, ISACA Journal, vol. 6, 2019
16 Ibid.
17 Information Commissioner’s Office, Update
Report Into Adtech and Real Time Bidding, UK, 20 June 2019, http://ico.org.uk/media/about-the-ico/documents/2615156/adtech-real-time-bidding-report-201906.pdf
18 Ibid.
19 Warzel, C.; A. Ngu; “Google’s 4,000-Word
Privacy Policy Is a Secret History of the
Internet,” The New York Times, 10 July 2019,
http://www.nytimes.com/interactive/2019/07/10/opinion/google-privacy-policy.html
20 Agencia Española de Protección de Datos,
A Guide to Privacy by Design, Spain, 2019,
http://www.aepd.es/sites/default/files/2019-12/guia-privacidad-desde-diseno_en.pdf
21 Ibid.
22 Ibid.
23 Cavoukian, A.; “Privacy by Design: The Seven
Foundational Principles,” IAPP Resource Center,
http://iapp.org/resources/article/privacy-by-design-the-7-foundational-principles/
Ian Cooke, CISA, CRISC, CGEIT, COBIT 5 Assessor and
Implementer, CFE, CIPM, CIPP/E, CIPT, FIP, CPTE, DipFM, ITIL
Foundation, Six Sigma Green Belt
Is the group IT audit manager with An Post (the Irish Post Office based in
Dublin, Ireland) and has over 30 years of experience in all aspects of
information systems. Cooke has served on several ISACA committees,
was a topic leader for the Audit and Assurance discussions in the ISACA
Online Forums and is a member of ISACA’s CGEIT® Exam Item
Development Working Group. Cooke has supported the update of the CISA®
Review Manual and was a subject matter expert for the development of
ISACA’s CISA and CRISC™ Online Review Courses. He is the recipient of the
2017 John W. Lainhart IV Common Body of Knowledge Award for
contributions to the development and enhancement of ISACA publications
and certification training modules and the 2020 Michael Cangemi Best
Book/Author Award. He welcomes comments or suggestions for articles
via email (Ian_J_Cooke@hotmail.com), Twitter (@COOKEI), LinkedIn
(www.linkedin.com/in/ian-cooke-80700510/), or on the Audit and Assurance
Online Forum (engage.v6pu.com/home). Opinions expressed are his own
and do not necessarily represent the views of An Post.