In most countries, the law does not force criminal suspects to self-incriminate. There is something perverse about making people complicit in their own downfall.
A federal judge in California ruled that police cannot force suspects to swipe open their phones, on the grounds that doing so is analogous to self-incrimination.1 And yet we tolerate innocent ‘netizens’ being forced to give up their personal data, which is then used in all sorts of ways contrary to their interests.
On the internet, people’s most sensitive information (such as whether they are alcoholics, or have been the victims of rape, or are HIV positive) gets collected and sold to the highest bidder. That information can be used to show them ads for exploitative products such as payday loans, and to make decisions about whether they get a job, a loan, an apartment and more.
In a society that upholds the rule of law, we should protect netizens at least as much as we protect criminal suspects. Our personal data should not be used as a weapon against our best interests. To accomplish such an objective, we would need to bind institutions that collect and manage personal data to strict fiduciary duties.2 Companies that hold personal data would then owe their customers the same duties of loyalty and care that fiduciaries such as financial advisers, doctors and lawyers owe theirs.
The word fiduciary comes from the Latin verb fidere, to trust. Trust is at the heart of fiduciary relationships. First, because the fiduciary is entrusted with something very valuable – your finances, your body, your legal affairs or your personal data. Second, because by entrusting this valuable good to others, you are made extremely vulnerable to them. By accepting what is entrusted to them and in acknowledgement of your vulnerability, fiduciaries owe you their trustworthiness.3
Fiduciary duties exist to protect individuals who are in a position of weakness against professionals who are supposed to serve them but who might have conflicting interests. Your financial adviser could make excessive trades from your account to earn more commissions, or they could use your money to buy securities for themselves. Your doctor could perform a surgery on you that is too risky or unnecessary, simply as an opportunity to practise their skills, or to add a data point to their research. Your lawyer could sell your secrets to another client whose interests oppose yours.
And, as we have seen, those who collect your data can give it to data vultures, criminals and so on. None of these professionals should abuse the power that has been given to them by virtue of their profession.
Fiduciary duties, then, are appropriate when there is an economic relation in which there is an asymmetry in power and knowledge, and in which a professional or a company can have interests that go against the interests of their customers. Financial advisers, doctors, lawyers and data experts know much more about finance, medicine, law and data, respectively, than we do.
They might also know more about you than you know yourself. Your financial adviser is likely to have a better grasp of your financial risks. Your doctor understands what is happening in your body better than you do. Your lawyer will have a deeper understanding of your legal case. And those who analyse your data may know (or may think they know) much more about your habits and psychology than you do. Such knowledge should never be used against you.
Fiduciaries must act in the best interests of their customers, and when conflicts arise, they must put their customers’ interests above their own. People who do not want to have fiduciary duties should not agree to being entrusted with valuable personal information or assets.
If you don’t want to have the duty to act in your patients’ best interests, then you shouldn’t become a doctor. As a society, we understand that having a desire to perform medical interventions on people’s bodies is not enough. The job comes with certain ethical expectations.
In the same way, if companies do not want to have to deal with fiduciary data duties, they should not be in the business of collecting personal data. Wanting to analyse personal data for research or commercial purposes is all very well, but such a privilege comes with responsibilities. As a society, we should oblige data fiduciaries to keep personal data safe, and to use it only in the best interest of data subjects.
Critics of the idea that fiduciary duties should apply to big tech have pointed out that such a policy would go against tech companies’ fiduciary duties towards their stockholders. According to the law in Delaware – where Facebook, Google and Twitter are incorporated – directors have to ‘treat stockholder welfare as the only end, considering other interests only to the extent that doing so is rationally related to stockholder welfare’.4
That companies should only work towards the benefit of their stockholders to the detriment of their customers seems like a morally dubious policy – particularly if the business in question has negative effects on the lives of millions of citizens. Morally, the economic interests of stockholders cannot possibly trump the privacy rights and democratic interests of big tech’s billions of users.
But there are possible ways to work around this problem: one option is to establish that whenever stockholders’ interests come into conflict with users’ interests, fiduciary duties to users take priority. Another option is to institute such large fines for breaches of fiduciary duties towards users that it’s in the stockholders’ best interests for companies to honour those duties, if they care about their bottom line.
Fiduciary duties would go a long way towards ensuring that the interests of big tech are aligned with those of its users. If tech companies want to risk our data, they should risk their business in the process. As long as tech companies can risk our data while knowing that we alone will pay the bill – through exposure, identity theft, extortion, unfair discrimination and more – they will continue to be reckless.
Carissa Veliz’s new book, Privacy is Power: How and why you should take back control of your data, looks at the issue of privacy and fiduciary duties in more depth.
1. Kramer, A. (2019) ‘Forced Phone Fingerprint Swipes Raise Fifth Amendment Questions’, Bloomberg Law, 7 October.
2. Balkin, J. M. (2016) ‘Information Fiduciaries and the First Amendment’, UC Davis Law Review 49; Zittrain, J. (2018) ‘How to Exercise the Power You Didn’t Ask For’, Harvard Business Review, 19 September.
3. MacLachlan, A. (2018) ‘Fiduciary Duties and the Ethics of Public Apology’, Journal of Applied Philosophy 35, 366.
4. Khan, L. M. and Pozen, D. E. (2019) ‘A Skeptical View of Information Fiduciaries’, Harvard Law Review 133, 530.