An Ethical Horizon for Artificial Intelligence: Propriety and Accountability (Part 4/4)

"Never refuse or hesitate to take steps against impending dangers ...because you think they are too late. Since things often take much longer than expected, because of their very nature and because of the various obstacles they encounter, it very often happens that the steps you have omitted to take, thinking they would have been too late, would have been in time."

- Francesco Guicciardini

PART 4

Being watched by other people is, in summary, what surveillance means. Who is watching, and why they watch, matters. However, I would volley that it matters more whether you can enforce your will over whether they may watch you at all. If you are never notified that you are being watched, you can never meaningfully consent to the process. That makes it spying, and spying has long been defined as a form of hostility.

We know people feel disconnected from the humanity of someone they see only as a two-dimensional character on a monitor. They might argue this gives them the right to dehumanize you as a parcel of information. Your civil, constitutional, and human rights remain completely intact, despite that delusion.

People are increasingly identified by a body of information compiled over decades by people you will never see or know. I call this your ‘information body.' Now an algorithm, reachable through an API, can track down your information body at a predatory pace; the human element is gone. If this data is leveraged by someone without your notice and consent, harm can still occur. The defense lawyers are now busy arguing over the legal definition of that harm.

I like to use Pamela Anderson, Paris Hilton, and Hulk Hogan as examples of people with legal teams Silicon Valley would never trifle with when marketing the sale of their public images. Their legal interests and their leverage for public accountability are very well represented. Your biometric profile, user imprint imaging, and behavioral profiling, however, present a unique challenge for advocates of territorial and bodily privacy.

Users are endorsing biometrics at highly unfair exchange rates, handing over increasingly intimate and damaging information whose ownership may be signed away in Terms of Service agreements. Reputational harms and exposures of health attributes are provable legal harms with a long history in the court systems. Yet you may still sign your digital self right into a virtual brand of “slave trade” if you submit to the wrong agreement. You may call it legal, but you cannot call it ethical.

It is very difficult to wrestle individual identity and personal data property away from multinational enterprises like Google. Data brokerages do not consistently recognize individual data property rights in their data administration; they can parse you down to the finest detail and sell the results to third-party marketers. However, the power to escape liability through zero logging, encryption engineering, and clear consent conventions in the user interface could save a lot of souls. New standards and new technology can be made ethical.
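To make that engineering claim less abstract, here is a minimal sketch, in Python, of one way "zero logging" of personal detail can look in practice: the application records that an event happened while stripping identifying attributes before they ever reach storage. The field names and logger setup here are my own illustrative assumptions, not anything prescribed in this essay or by any particular company.

```python
# Illustrative sketch only: discarding identifying detail at the logging layer,
# using Python's standard logging module. Field names are hypothetical.
import logging

PII_FIELDS = {"user_id", "email", "ip", "device_fingerprint"}


class DropPIIFilter(logging.Filter):
    """Redact personally identifying attributes before a record is written."""

    def filter(self, record: logging.LogRecord) -> bool:
        for field in PII_FIELDS:
            if hasattr(record, field):
                setattr(record, field, "[redacted]")
        return True  # keep the event, minus the identifying detail


logger = logging.getLogger("checkout")
handler = logging.StreamHandler()
handler.addFilter(DropPIIFilter())
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s email=%(email)s"))
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# The event is recorded; the person is not.
# Prints: INFO purchase completed email=[redacted]
logger.info("purchase completed", extra={"email": "jane@example.com"})
```

The particular library does not matter; the point is that throwing away the identifying detail by default is an engineering decision, and making it is one concrete way a standard or a product becomes ethical.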

This paints an unseemly portrait of the grey market run by data brokerages, the traffickers of those information bodies. It is precisely the kind of environment in which governments and companies trade data to target real people. People suffer lost opportunities and exposure to peril without any understanding of what went wrong, in completely opaque circumstances.

In Europe, this particular power differential is completely intolerable to the European concept of basic dignity, so individuals may invoke the Right to Be Forgotten. That keeps European citizens out of the information trafficking business if they decide to opt out of the agreement. US businesses are then compelled to commit to disposal of all of that personal information or face civil injunctions from a European data authority.

In the United States, we have legal concepts of data ownership, data access, and Fourth Amendment rights recognized by the government. We all know, deep down, that you are subject to harm at the losing end of this data deal. Yet you remain in conflict with certain legal, academic, business, and government bodies in our society. Some, though certainly not all, will mutually agree to overlook your data property rights as an identified person for as long as the power differential favors them.

The prescription for this avarice is to make the instinct for self-preservation more relevant than the cooperative interests of those holding systemic corruption in place. For instance, Representative Jason Chaffetz made it his mission to hold Congressional proceedings on his own behalf against the US Secret Service for violating his employment privacy. He used the power of self-advocacy to produce a result that was just for him. Each of us has means to invoke our will to rebalance the scales, but it is not easy, even when you have means and power.

You can test the data integrity of a business or government agency by requesting access to any personal records they hold that are "responsive" to you. If you are able to get these records, you have passed the first leg of your private data inventory, or audit. The next step is to ask them to correct or remove any unwanted or incorrect information pertaining to you. If they can confirm correction or destruction of that information, your relationship with your data holder is balanced. If instead there are indefinite delays, followed by threatening legal letters or demands that invoke a false crisis, you know it is out of balance (for example, if they insist on a DMCA takedown request or claim identity theft when none occurred). Resolving your personal stakes in that conflict will require more from you directly.

It might require you to demand that companies recognize your human rights in their data administration practices by enacting a corporate charter commitment to protect human interests. It will also require you to demand ever better accounting and protection of your interests from government actors. The Department of State, specifically, can act on the will of Americans whose personal information, their identity, is now in the hands of foreign governments through hacks or FVEY partnerships, or who seek to field dissent over unethical kill operations using drones.

If this process seems arduous, that is because it is. If it seems that ethics should be self-evident to those taking your information body to market, it is simply not so. If it seems that the government should be doing more to protect, in a way that complies with the law, the information about you it has amassed, I agree.

If corporations did their part to recognize the notice and consent rights of the people who pass through their offices daily, global standards for human rights would rise. If military leadership mindfully adjusted robotic response systems to better respect loss of life and economy, the world would be safer.

However, if you were to add your grain of weight to counterbalance the scales, you might be the one who rectifies a mass injustice for everyone in the same circumstances you are in today.

Please hold them to account.