FieldFisher Advises Unethical Approaches to Personal Data to Privacy Professionals

Change is harder if you become combative with data ownership rights.
 

“One important difference between European and US privacy legislation is that American companies, in many areas, do not have to obtain consent to resell customer data to third parties. In the US, data on people is traded more freely by so-called data brokers, among others. In the EU, however, all websites with cookies have to obtain informed consent from users before collecting data – a rule with the good intention of informing users and ensuring their consent, but which unfortunately has perhaps ended up blinding Europeans to their right to consent, as many just tick the cookie consent box to access a website. In many ways, the whole idea of consent, heavily emphasised in the new EU data regulation, has been watered down. When you register for a website, for example a social media site, you allow it to use your data for a wide range of purposes (which includes trading your data) without thinking twice.” -- Data Ethics: The New Competitive Advantage (2016), p. 134

 

I attended an event packed with lawyers, developers, technologists, data scientists and other professionals on the front lines of European consumer privacy.  Keep in mind, these are the people charged with privacy gate-keeping for private information, classification of sensitive information and privacy business processes for employees and consumers. The event was unusual because it was geared toward the upcoming General Data Protection Regulation, or GDPR, an EU regulation that places personal data in an administrative context with personal rights attached.

The following statement was repeated, with emphasis, by a visiting UK speaker employed by the FieldFisher consultancy and based in Silicon Valley.

“Avoid getting consent, if possible.  In fact, avoid it as much as possible.”


On its face, the statement sounds completely perverse. I was somewhat shocked, so I turned to my neighbour and asked, “Uh, does that sound right to you?”  She answered, “Uhhh... it’s a perspective.”  This was followed by a wide-eyed look, a shrug and an expression best rendered in punctuation: “??!!”

Forget what you would ordinarily expect. Consent is the basis of much of US law.  Consent is a requirement in our Terms of Service for the use of consumer-produced data.  Under GDPR, however, much of the force consent once had loses ground in its own legally isolated space. Consent, as a lawful basis, can be avoided under the EU regulation. So the ethical spirit of the law is somewhat displaced into another GDPR concept: legitimate interest.

Legitimate interest places hard limits on data collection: data must be collected for specific purposes with an audit path, and once personal data has served its purpose it is to be scheduled for deletion.  Deleting data is much more of a trial for American marketers, because a company’s ‘earned’ data was intended for marketing and re-marketing as a financial hedge. Permanent contractual consent is no longer a ‘thing’ of corporate data entitlement under GDPR.
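The purpose-bound lifecycle described above is straightforward to mechanize. Here is a minimal sketch of purpose-limited retention, where each record carries the purpose it was collected for and is flagged for deletion once its window expires; the field names and the 90-day window are my own illustrative assumptions, not GDPR-mandated values:

```python
from datetime import date, timedelta

# Hypothetical retention windows per processing purpose (assumed values,
# not prescribed by GDPR; an organization would set these per its audit policy).
RETENTION = {"fraud_prevention": timedelta(days=90)}

def deletion_due(record: dict, today: date) -> bool:
    """True once the record has outlived the purpose it was collected for
    and should be scheduled for deletion."""
    expiry = record["collected_on"] + RETENTION[record["purpose"]]
    return today >= expiry

record = {"purpose": "fraud_prevention", "collected_on": date(2018, 1, 1)}
```

A nightly job sweeping records through `deletion_due` is all the "audit path" machinery this paragraph implies.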

Consent is a weak area for the UK and the United States. Historically, the UK and its nation-state derivatives (US, NZ, AUS, CA) haven’t had the best relationship with observing conceptual consent.  It’s possible that the speaker and her firm were asked to present as a deliberate (and cynical) choice by the organizers.  The audience was primarily comprised of US lawyers in charge of global privacy compliance. Many of them draft US Terms of Service statements and research agreements, documents that must be proven legal, but not necessarily ethical.

Early in the seminar, the speaker answered a question concerning class action lawsuits over Equifax consumer privacy losses. She responded directly that legal harm from privacy losses in breach-related incidents is unproven, and that consumers claiming privacy injury therefore have no case. This is a common refrain from lawyers in a hardened defensive posture against consumer interests once a breach has occurred.

It’s also a flag of an unethical outlook for the people FieldFisher advises. They won’t necessarily be moved toward restraint, risk mitigation or a conciliatory posture to improve relations with data owners.  They will paint injured data owners as the enemy in a lawfare battle instead of a key stakeholder interest they are obliged to protect.  That inevitably leads to amplified errors, corrupt business processes denying legal requirements, whistleblower subversion and FTC actions over something preventable they could have managed upward on their own.

Business operation developers at US technology companies have been known to hire a certain type of legal innovator to find vulnerable points in legal privacy armour. These innovators are deliberately placed in leadership positions over consumer privacy. They then write Terms of Service language that maximizes their ability to scrape up data based on a one-way contract.  So it wasn’t really a surprise when the speaker later felt safe disclosing some of her personal contempt for privacy activists running campaigns to invoke new GDPR-granted rights for themselves and others.

Given that Deliberately Omitted Consent precedes its next relative, Deliberately Complicated Procedural Consent, this public advice would precipitate a legally depressed area for FieldFisher’s Silicon Valley and Seattle technology clients. I would say FieldFisher’s represented opinion archetypes correlate directly with a part of the difficulty they themselves created.  It seems like natural law, or simple physics, that her recommended process would necessitate, or even provoke, an ambush by privacy activists. I expect they will continue annoying her in her new compliance tasks.

Thankfully, DPAs in other nation states, like Germany or Switzerland, may be more reliable for ethical consumer safeguards for their citizens. In the US?  Meh. Not so much.  Are you right to demand more? Sure thing!  No one ever said privacy protection was easy, but it's getting easier all the time.  You just have to connect with providers of real solutions.

 

WHAT IS CONSENT TO AN API?

To balance the scales, the speaker also said, “Automate as much as possible.” Perhaps she could innovate her way toward simplifying consent.

Part of her speech went toward perceived complications with the different types of consent, as if consent were harder to automate than the other enumerated forms of compliance. The types of consent discussed were implied consent, expressed (overt) consent and explicit consent. Explicit consent must be chronicled for audit when sensitive forms of data are involved, a highly regulated area covered in detail by GDPR.  She claimed it was complicated because it “mandated several extra fields” to be populated for consent by a user: five or more, to be exact.
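To show how little machinery “several extra fields” actually implies, here is a hypothetical explicit-consent audit record with five fields. The field names are my own illustrative assumptions, not a schema mandated by GDPR:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ExplicitConsent:
    """Hypothetical audit record for explicit consent to process sensitive
    data. The five fields are illustrative, not a GDPR-mandated schema."""
    subject_id: str        # who gave consent
    purpose: str           # the specific processing purpose consented to
    granted_at: datetime   # timestamp, for the audit trail
    consent_text: str      # the exact wording shown to the user
    withdrawable: bool     # whether withdrawal instructions were presented

def record_consent(subject_id: str, purpose: str, consent_text: str) -> ExplicitConsent:
    """Populate the audit fields at the moment the user ticks the box."""
    return ExplicitConsent(
        subject_id=subject_id,
        purpose=purpose,
        granted_at=datetime.now(timezone.utc),
        consent_text=consent_text,
        withdrawable=True,
    )
```

Five fields, one constructor call: the “mandated extra fields” amount to an ordinary form submission plus a timestamp.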

The mental inflexibility on display here would mislead uninformed privacy professionals into thinking consent is harder to administer technically than, say, a shopping cart or any other user-facing input function.  From an engineering standpoint, that’s just not true.

It’s less complicated when you develop or automate the processes requiring consent through consent-by-design and UX development. Consent design controls aren’t a mystery or inaccessible in the corporate market.  Adding code for five input fields is not an impossible trial of engineering for a developer. If a consent area is incompletely authorised, or you lack the legal consent inputs to process the personal data, you have plenty of options.

UX prompts can be designed to draw attention to fields lacking consent information. If the user-consumer abandoned the process midway, a time-sensitive notification can be sent urging them to declare or decide what data can be used.  If you can’t get the needed consent, you schedule the data for disposal.
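Put together, the flow described above (complete consent is processed, an abandoned form triggers a time-sensitive notification, and anything else is scheduled for disposal) is a few branches, not an engineering trial. A minimal sketch, in which the required-field names and the 30-day response window are assumptions of mine:

```python
from datetime import datetime, timedelta, timezone

# Fields assumed required for a valid consent record (illustrative only).
REQUIRED_FIELDS = {"subject_id", "purpose", "granted_at"}

def handle_consent_submission(form: dict) -> str:
    """Route a consent form: complete -> process the data; started but
    incomplete -> notify the individual; no usable consent -> dispose."""
    missing = REQUIRED_FIELDS - {k for k, v in form.items() if v}
    if not missing:
        return "process"
    if form.get("subject_id"):
        # The individual began but did not finish: send a time-sensitive
        # reminder with a window (assumed 30 days) to decide what data
        # can be used.
        deadline = datetime.now(timezone.utc) + timedelta(days=30)
        return f"notify (respond by {deadline.date()})"
    # No usable consent at all: schedule the personal data for deletion.
    return "dispose"
```

Each branch maps directly onto a sentence of the paragraph above; the rest is ordinary form validation.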

See? The computer does what you tell it to do when it works. In this case, the automation would be a much better friend to individual data interests than a dangerously scared legal consultant. FieldFisher, like many terrified Silicon Valley firms, is hard pressed to defend the inferior, unethical data handling practices instituted over the past 15-20 years. If these firms focused on facilitating the wishes of data owners, they could retain their business for the long haul.

It turns out consent design and process aren’t so hard after all. You just have to build a thing that regards the intent. When you are avoidant toward consent operations, you won’t build toward them or anticipate reasons to accommodate them.  So first, put consent ethics back in play as the spirit of GDPR law.  You can self-regulate from there.

Below is a list of consent design firms who can help administrate consent engineering solutions.

  1. 3PHealth
  2. BayCloud
  3. ConsentCheq
  4. Consentric 
  5. Consentua
  6. Ensighten
  7. Evidon
  8. Integris
  9. OneTrust
  10. PrivacLab
  11. PrivacyCheq
  12. SecuPi
  13. Signatu
  14. Trunomi
  15. Trust Hub