As US privacy needs reform, comprehensive law sits behind Congressional chokepoints

Everyone believes US privacy protections can improve. Precisely how isn’t yet being debated by the people who matter most: the individuals identified in the data sets.



By Sheila Dean

 

Imagine this scenario at an airport as you walk in from the arrival gate. A loudspeaker announces, “You are now entering the US privacy baggage claim area. Please proceed to the roundabout. There is no one currently available to process federal claims of damaged or lost privacy for cloud services or data breaches of Chinese or Russian origin. Good luck. Thank you for flying, CYA Global Airlines.” The scene is close enough to the reality most US consumers face when seeking access to justice for their households, many of which include underage children.


At the end of 2018, there was a flurry of national consumer privacy bills. Many of these proposals were eyed with earned suspicion by members of Congress. Some of these bills were endorsed by Facebook, Google and the advertising industry’s finest lawyers, written to sustain the invasive speculation that sells you anything they desire, at your expense, of course. The same companies were recently ushered in before Congress for antitrust and data misuse hearings. Amazon and Microsoft were also pulled into the public information fray for other reasons, primarily for consumer adaptations of biometrics and weaponized machine learning. They were all notably on record as “in favor of comprehensive privacy legislation.”

These very same companies are at the nexus of global data exchange intrigues: US domestic election interference, censorship, bias and other pertinent human rights issues. Every partner, from MIT to the smallest third-party app exchange, benefits from a virtual supply chain that exploits the US data owner during information banking and supposedly secure exchanges of sensitive health and personally identifiable information.

Trustworthiness has clearly been cored out of the heart of these proposals, given the entities behind them and their weak enforcement delegations. The FTC’s strongest consent decrees over privacy violations carried no enforceable powers to aid individual consumers in their quest for access to justice. The CDT, among the FTC’s strongest proponents for enforcement, labeled the 2011 consent decrees “paper tigers.” The FBI is still in the throes of a corruption squall and a FISA abuse scandal. Bad actors are simply not believable, much less worthy of the public’s high honor, as a source of viable law and public policy.

No practical soul would want these foxes drafting the legislation that protects the chicken coop. Congress saw them coming. Even they have a sense of self-preservation, placing “comprehensive privacy reform” on a far back burner for 2019, right next to an 80-pound bag of salt.

The back burner is a good resting place to weather concurrent seasons of obstruction and partisan shutdowns. Privacy legislation is unlikely to pass the federal choke points in 2019. Some will gamble on well-oiled weasels, adding “privacy” to an unread, 3-foot-thick partisan omnibus bill and wasting advocacy dollars on PACs. When the President won’t sign another omnibus bill, meh, what can be done? Many won’t gamble at all, resorting instead to state-born solutions autonomous from D.C. dysfunction. Their data security and family privacy interests just won’t wait.

Data breaches persist as real-time nation-state aggression from the People’s Republic of China, Iran and Russia. China was blamed for the breach discovered in December at Marriott hotels worldwide, which exposed hundreds of millions of guest records, including roughly 5.25 million unencrypted passport numbers. China is also suspected of being the culprit behind the Equifax attacks. Russia meddled with our elections and ran all kinds of disinformation campaigns. The Brookings Institution recently released a report stating that AI can cheaply sustain disinformation and social engineering campaigns from abroad.


SO WHAT CAN BE DONE TODAY TO IMPROVE US PRIVACY WITHOUT A COMPREHENSIVE NATIONAL LAW?

There is a short list of persistent privacy problems that can be resolved with stronger enforcement, rules adjustments and legal proposals that update existing laws to serve the spirit of those laws. Technically, the rest can be aided with industry-led insurance and consumer access to their personal data capital.

 

·         Consumer control of data use in exchanges

·         Data breach crisis management

·         Defining harm and new realms of consumer law enforcement

·         Adjusting civil legal action to better recognize the global scope of data conflicts

 

One legal proposal recommended treating personal data as a more recognizable fiduciary asset so that data-as-money can be better protected. The main thesis can be refined further to recognize the identified person as the owner of this “currency” property. It is the difference between an American citizen being objectified and traded as a serf, or a commodity, versus being able to resolve misuse as a self-owned person. It is the difference between filing a police report when your data property is “kidnapped” by a Chinese communist hacker group or a third-party app exploiting a hole in Android’s Play Store security, versus simply being out of luck during a post-campaign fire sale of voter data.

If fiduciary rules applied, data brokerages would have to come out of the shadows and deal with consumer owners one-to-one. Brokers are regulated fiduciary interests. They would no longer be able to broker your data with opaque or hidden business partners.

Today the average computer user always pays with their data; no other choice is allowed. Under fiduciary rules, online companies would be forced to declare their role as personal data brokerages and conform to fiduciary law, or face fines, penalties and jail time. If you didn’t want to use their brokerage, you could find another firm or, as many would prefer, not broker your data at all. This would force new revenue models to emerge for online companies that do not rely on personal consumer surveillance.

If you ever stand before a judge with a claim that amounts to $0.03 per record, you realize that neither the market nor the exchange price for your data was established by you. Access to monetary business rates and price fixing becomes an antitrust issue for billions of consumers spanning the globe. Current antitrust laws can be goosed or informed by case law or mass claims on behalf of people who have not been fairly or proportionately compensated for the use or exchange of their data for “routine business purposes.” When a business folds, or even when a non-profit campaign wraps, many buttress their expenses by selling consumer data to marketers. A better legal treatment is needed before data qualifies for a fire sale: documented, explicit consumer consent, as the GDPR requires.

The data fiduciary may still have another pertinent and smart role to play. Generally, there are two types of data breaches, intentional and unintentional, regardless of the at-fault party. Let’s say over 50 consumers are impacted by intentional negligence, better known as unfair practices, by a company that didn’t install its Apache security updates. A version of FDIC federal data (bank) insurance should step in to cover a consumer’s data commerce interests. If unintentional data loss occurs through a breach by the hired hands of a rival nation state, there should be not just an FDIC-style treatment but an international criminal filing, with State Department case files opened for all consumers victimized by the breach as part of a national breach notification effort.

The good news is that fiduciary rules are in the wheelhouse of the US Secretary of the Treasury, Steve Mnuchin, and the Congressional finance committees. Fiduciary infrastructure and regulatory policy are already established. If personal data, its ownership and data capital were added to federally insured terms of national coverage, more remuneration could move forward for investors and data capital holders injured by concurrent breaches.

States, counties and municipalities can also file fair use conflict claims with their State Attorneys General. Case coalitions can form in states with grounds for civil litigation and criminal case law, and can even hand off legal cases across global regions (APAC, EMEA) against one bad actor. This would all happen so the individual can come to the business table to negotiate and establish what their data is worth, or remove it from the exchange when no agreement can be made. The one-way tech company contract would be a relic of a bygone era. New insurance markets and actuarial services would emerge to establish commercial exchange rates for data at market rate. It would also put reasoned risk back onto consumers who endorse a voluntary market for their data.

Alternative paid models for cloud products, rivals to the compulsory data exchange, should be actively encouraged and enforced through anti-monopoly laws. Forcing the consumer to “pay with data” even when they use the paid services is unfair to those who wish to keep their information out of the data marketplace. If all of this is too much to take on à la carte, simply walk a draft of the California Consumer Privacy Act in to your state lawmaker and urge them to make new law.

 SO WHEN DO YOU CALL THE POLICE? WHEN A CRIME HAS OCCURRED.

There is the much-scuttled matter of consumer privacy harm definitions bouncing around in the courts as a civil matter. One could blame poor lawyering for the failure to apply harms as defined by the Computer Fraud and Abuse Act (CFAA) to corporate persons who misuse or misappropriate data as an enforceable criminal justice complaint. The CFAA is the law that made invasive hacking a crime. Any elected official, district attorney, law enforcement officer, social caseworker or injured party should be able to file a criminal report where CFAA harm applies.

Fraud law may apply when consent for a fiscal data currency transaction is misappropriated, falsified or presented in a false light. For instance, if a company installs additional tracking firmware to override consent controls, the law should support any injured party in submitting a legal case file to the FBI and local law enforcement, which can investigate under reasonable suspicion that a crime has occurred.

COPPA is good law, enforceable in criminal justice circles as well. If a third-party app didn’t get parental consent to pick up the location and video chats of a child under the age of 13, criminal complaints may apply. Criminal due process should move ahead as a deterrent for companies whose bosses greenlit unlawful and invasive company processes that impact consumers.

Finally, what to do about those machine learning bots that invade on command? I have recently suggested to ethics leaders that AI accountability include a crisis management manual or handbook. Every AI entity has a programmer or a team of engineers behind its behaviors. Let’s say an algorithm is infected with aberrant code from a social engineering scheme to cause peculiar defamation of geriatric white women in Yonkers, sending underworld goons to their homes to redeem falsified or invented debts. Welcome to your non-AI-winter modern crisis.

Engineers need a manual with instant, step-generated procedural guidance appropriate to each AI bias or data misuse category of crisis, intentional or unintentional. Companies can generate these themselves as part of risk management; a minimal sketch of what that might look like follows below. If the Department of Defense can produce a crisis manual for any scenario, take notes on what it might be doing right.
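As an illustration only, here is a minimal sketch of what step-generated guidance could look like in practice: a lookup keyed by crisis category and intent that returns an ordered response checklist for the on-call engineer. The category names, the steps and the Playbook and respond helpers are hypothetical assumptions made for this sketch, not an established standard or any particular company’s procedure.

# Hypothetical sketch of an AI incident-response playbook lookup.
# Categories, steps and helper names are illustrative assumptions only.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Playbook:
    category: str        # e.g. "data_misuse" or "bias"
    intentional: bool    # was the incident deliberate or accidental?
    steps: List[str]     # ordered response steps for the on-call engineer

PLAYBOOKS: Dict[Tuple[str, bool], Playbook] = {
    ("data_misuse", True): Playbook(
        category="data_misuse",
        intentional=True,
        steps=[
            "Isolate the affected model or pipeline from production traffic",
            "Preserve logs and training-data snapshots for investigators",
            "Notify legal, the insurance carrier and law enforcement as required",
            "Begin consumer breach notification per applicable state law",
        ],
    ),
    ("bias", False): Playbook(
        category="bias",
        intentional=False,
        steps=[
            "Freeze automated decisions pending human review",
            "Re-run the evaluation suite against the affected demographic slices",
            "Document root cause and remediation for the audit trail",
        ],
    ),
}

def respond(category: str, intentional: bool) -> List[str]:
    """Return the ordered procedural steps for a given crisis category."""
    playbook = PLAYBOOKS.get((category, intentional))
    if playbook is None:
        raise KeyError(f"No playbook on file for {category!r} (intentional={intentional})")
    return playbook.steps

if __name__ == "__main__":
    for step in respond("data_misuse", intentional=True):
        print("-", step)

A real manual would obviously be far richer, but even a table this small forces a team to decide in advance who gets called, in what order, before the crisis arrives.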

Damage is damage. No one has a long heart-to-heart with a fire while it’s burning the house down. Have an emergency exit plan. Get everyone to safety. Call the appropriate authorities. Call the news. Put the fire out. Then call the insurance company, which coordinates the auditors and investigators to find out what happened and how you will rebuild. Get data breach insurance. If you cannot qualify, step up your information security and privacy compliance game to avoid liability risks in the online environment.

Reactive action is how you manage privacy injury today. It’s all triage until we get enough breathing room to plan and prepare for the crises we can already foresee.