Why CIOs need to keep privacy front of mind in any project

The cost of getting privacy wrong is becoming significant, and it can undo the hard work spent designing products and services that look amazing but fail to meet regulatory requirements and customer expectations of privacy, write Hayley Miller and Campbell Featherstone of Kensington Swan.


When Thomas Watson Jr, former president of IBM, coined the mantra “Good design is good business”, he was doing so at a time when well-designed products and brands that reflected the mood of the 1960s and 1970s were a necessity for capturing consumers’ attention.

Today, it goes without saying that product design and branding are intrinsic to the success of any business. Forbes estimates the value of the Apple brand and the ‘look and feel’ of an Apple iPhone at $170 billion – intangible assets representing over 20 per cent of Apple’s market value, all owed to good design.

What is not as easy to measure in the context of a business is the value of good design when it comes to the protection of privacy.

Privacy is not as alluring as the sleek lines of an iPhone, nor is it as front-of-mind as the logos and taglines with which consumers are so familiar.

But the same principle applies: good design is good business. More importantly, the cost of getting privacy wrong is becoming significant, and it can undo the hard work spent designing products and services that, while they look amazing, fail to meet regulatory requirements and customer expectations of privacy.

With that in mind, we set out below the key components of ‘privacy by design’, and why – particularly in light of local and global developments in privacy law – we think that privacy by design is good business.

Legal and policy developments

CIOs will no doubt be familiar with the introduction of the GDPR, which came into force on 25 May 2018. The GDPR specifically regulates privacy by design: Article 25 requires organisations to employ techniques of ‘data protection by design and by default’; that is, appropriate technical and organisational measures which are designed to implement data protection principles, such as data minimisation. 

A failure to meet the standard of privacy by design imposed by Article 25 can result in a maximum fine under the GDPR of €10 million, or 2 per cent of global turnover, whichever is higher (a lower tier than the maximum that could be levied for more egregious breaches of the GDPR, but significant nonetheless).

Across the ditch, Australia has recently increased the maximum fine for breaches of its Privacy Act – and is likely to raise that maximum further later this year.

In New Zealand, the 25-year-old Privacy Act 1993 is under review. The Privacy Bill was introduced into Parliament in March, and it seeks to bring our privacy laws into the 21st century. The Privacy Commissioner is pushing for higher fines to incentivise compliance (up to $1 million in the case of a body corporate).

Overall, the regulatory risk of getting privacy wrong is getting more significant.

While New Zealand does not regulate privacy by design, the Privacy Commissioner has recently launched the ‘Privacy Trust Mark’, a certification for products or services which have been designed with privacy in mind. The initial recipients of the Privacy Trust Mark include TradeMe’s annual transparency report, and the New Zealand Government’s online verification service, RealMe.

What is privacy by design?

Privacy by design was first developed by the former Information and Privacy Commissioner for Ontario, Dr Ann Cavoukian. It is based on the following seven foundational principles:

  1. Proactive not reactive: preventative not remedial

  2. Privacy as the default setting

  3. Privacy embedded into design

  4. Full functionality: positive-sum, not zero-sum

  5. End-to-end security: full lifecycle protection

  6. Visibility and transparency: keep it open

  7. Respect user privacy: keep it user-centric

In short, privacy by design requires privacy to be protected by embedding it into the design specifications of technologies, business practices, and physical infrastructures. A process that implements privacy by design will inherently comply with privacy law.

Why implement privacy by design?

Global privacy laws have moved on from punishing only those who suffer a data breach. A shift to a consumer focus has led to regulation which places a consumer’s personal information at the forefront of legal protection. This means that processes must be designed with privacy in mind in order to comply with privacy law. It’s no longer solely a question of IT security – the whole design process has to keep privacy front-of-mind.


A beautifully designed user interface and a well-structured back-end mean nothing if the end result is a product or service that fails to respect basic privacy principles.

So how do you design with privacy in mind?

First, it helps to understand the requirements of privacy law. Talk to your privacy lawyers about the law’s requirements (and better still, what the law is likely to require when reforms come into force). They should help you make decisions about whether, and how, to adjust your offering to manage risks and maximise the benefits of protecting privacy well.

They’ll probably recommend you undertake a privacy impact assessment (PIA). Your PIA should help you identify how a project is likely to impact privacy.

In the course of undertaking a PIA, you should identify:

  • The nature of the information you will collect, and whether that fulfils the requirement that you only collect what is necessary – in other words, that the information collected is not excessive or overly intrusive;
  • How long you need to retain data, keeping in mind that you cannot keep it for longer than is necessary;
  • Whether information needs to flow freely to other parties – cloud storage providers, third-party analytics tools, or others – and if so: whether there is a legitimate basis for sharing that information; what contractual rights you have with those third parties; and where those third parties are and how easily the information can be brought back in-house if necessary;
  • Whether you need to deliver to customers’ expectations of ‘data portability’ without the need to re-engineer processes or access to various data sets.

Once you’ve completed your PIA, it should feed into every aspect of the design process.

You may consider, having completed your PIA, that you can manage privacy risk through the anonymisation, pseudonymisation or aggregation of data sets, thereby removing the ‘personal’ element of personal information so it no longer attracts the attention of privacy law.
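As a rough illustration of what pseudonymisation can look like in practice, the sketch below replaces a direct identifier with a keyed hash before the record is passed on for analytics. This is an assumption-laden example, not a compliance recipe: the secret key, field names and records are hypothetical, and in a real system the key would live in a secrets manager, held separately from the pseudonymised data set.

```python
import hmac
import hashlib

# Hypothetical key for illustration only; in practice this would come from a
# secrets manager and be stored separately from the pseudonymised data.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for an identifier using HMAC-SHA256.

    The same input always yields the same pseudonym, so records can still be
    linked for analytics, but the raw identifier is no longer present.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# A record as it might be collected, and the copy handed to analytics.
record = {"email": "jane@example.com", "purchases": 3}
safe_record = {
    "user": pseudonymise(record["email"]),  # pseudonym instead of the email
    "purchases": record["purchases"],
}
```

Note that pseudonymised data may still count as personal information under some regimes (the GDPR treats pseudonymisation as a safeguard, not an exemption), so the legal analysis still matters.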

You may build features into your design that ensure consent is obtained at the correct time, to legally justify what you are doing.
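One way to make that a design feature rather than an afterthought is to gate processing on a recorded consent for a specific purpose. The sketch below is a minimal, hypothetical design (the class, purpose strings and function are our own illustration, not a prescribed pattern): consent is captured with a timestamp, and processing refuses to run without it.

```python
from datetime import datetime, timezone

class ConsentRegistry:
    """Minimal sketch of a purpose-specific consent store."""

    def __init__(self) -> None:
        # Maps (user_id, purpose) to the UTC time consent was recorded,
        # giving an audit trail of when consent was obtained.
        self._consents: dict[tuple[str, str], datetime] = {}

    def record(self, user_id: str, purpose: str) -> None:
        self._consents[(user_id, purpose)] = datetime.now(timezone.utc)

    def has_consent(self, user_id: str, purpose: str) -> bool:
        return (user_id, purpose) in self._consents

def send_marketing_email(registry: ConsentRegistry, user_id: str) -> str:
    # Processing is gated on consent for this specific purpose,
    # so the default (no consent recorded) is that nothing happens.
    if not registry.has_consent(user_id, "marketing"):
        raise PermissionError("no consent recorded for marketing")
    return f"email queued for {user_id}"
```

The design choice worth noting is that the privacy-protective outcome is the default: unless consent for the exact purpose has been recorded, the operation fails, which mirrors the ‘privacy as the default setting’ principle above.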

In any event, you will be designing with privacy front of mind – and therefore will be much more likely to produce a legally-compliant outcome.

It’s about reducing risks  

Privacy by design is a concept that requires buy-in from all participants in the design process. But as it becomes ingrained in organisational culture, the benefits will become clear: products can go to market more quickly, without the need to have them re-engineered to comply with the law.

Most importantly, you’ll reduce risk – the risk of launching a product or service that fails to meet legal and consumer expectations or which exposes personal information to vulnerabilities – risks that probably would have been identified if privacy by design had been followed.

In 2015, UK ISP TalkTalk suffered a massive data breach, due to fundamental flaws in the design of its IT security. The UK ICO [Information Commissioner’s Office] levied a fine of £400,000 – the highest recorded under pre-GDPR UK data protection law. But worse still for TalkTalk, it lost over 100,000 customers in the months that followed, and had to invest significant capital in upgrading its IT infrastructure. The total cost to TalkTalk from poor design? Estimated to be over £60 million.

That is certainly not good business.

Campbell Featherstone is a senior associate at national law firm Kensington Swan. Campbell works alongside Hayley Miller, a partner who leads the firm’s technology, media and telecommunications practice.
