March 1, 2021

Privacy in Action: Sarah Clarke, Privacy and Cybersecurity GRC Specialist


Sarah Clarke is an enthusiastic privacy advocate who helps organizations of all sizes cut privacy and security governance challenges down to size, working out the best way to assess and keep track of their specific risks. When she's not doing that, she frequently writes and speaks about privacy and technology. We encourage you to follow her on Twitter. We asked her for her thoughts on privacy in the digital age.

Interview with Sarah Clarke

Startpage: What does privacy mean to you?

Sarah Clarke: Privacy will always be inseparable from other human rights for me. Many people forget that data protection as a discipline is about safeguarding people from harm that might result from infringement of any of their human rights, and that infringement frequently starts with privacy. Privacy is the right to decide whether someone is permitted to interfere with your personal and home life, and the right not to have that happen without your knowledge and consent unless it is justified by something that fundamentally outweighs the potential harms. It is not about keeping secrets; it's about transparency, fairness, and respect.

Startpage: We know confidentiality is one of the components of the CIA Triad of cybersecurity. Is there a difference between confidentiality and privacy?

Sarah Clarke: The meaningful difference, in my risk management world, is usually personal data. Privacy, in a human rights and data protection context, applies to personal data: data that directly or indirectly allows one to single out a living individual. Confidentiality applies to any data, whether personal or not, that falls somewhere between public and top secret and comes with specific rules for access, handling, storage, transfer, and disposal. Think financial reports, strategic plans, or sales figures.

Startpage: How is data protection related to human rights?

Sarah Clarke: To expand a bit on where I went before, privacy probably comes most vividly to life in the context of the Second World War, after which it was enshrined in 1948 in the Universal Declaration of Human Rights (Article 12) in an attempt to prevent history repeating itself. In a stark wartime example, data processor René Carmille was tasked with identifying undesirable populations within census data. That data had been shared by data subjects, but they retained their right to privacy.

They retained the right to have that data used only for the purpose for which it was collected: to help the state plan welfare provision and statistically inform other beneficial public policy. René, recognizing the intent of the German authorities, delayed the processing of thousands of records and hacked the punch card readers to ignore field 11, which denoted religion. His team's actions, then and in work for the French Resistance, up until his arrest, interrogation, and death in Dachau on January 25, 1945, almost certainly saved many hundreds of lives.

In a far more modern and less immediately fatal context, companies are scraping pictures from social media without user consent and making the datasets available to facial recognition providers, police, and immigration authorities. These examples hopefully show both why privacy is a fundamental human right and how closely it relates to most of the others (the Council of Europe's European Convention on Human Rights is well worth a browse).

Startpage: What are some positives and negatives about artificial intelligence when it comes to privacy?

Sarah Clarke: There is no doubt that AI holds massive promise in every area where lots of data needs to be organized and analyzed, or where efficiencies can be gained on menial tasks. There is enormous potential to find new insights about the climate, cancer, nature, and the world as a whole. On the flip side, a lot of datasets riddled with pre-existing bias, fudged configurations, and buggy code are already being trusted too much to manage parts of our lives. While AI currently needs to fit data, including data about us, into very standard pots to enable training and analysis, there will be large numbers of exceptions, and not enough skilled exception management bodies are being rolled out with the models: models for predictive policing, sentencing, bail, deportation, welfare payment, health insurance, employment, and credit decisions.

Users of those models typically take a long while to move on to new tech solutions, even when issues have been found in the datasets, configuration, or code. There are perverse incentives to get AI into the world and keep it running, because more and cleaner data is seen as the solution. But at what interim cost, and how many errors get rolled back into the next training dataset? I'm working with a body called ForHumanity on an AI audit regime and offering help to others to tackle some of that. At the very least we are hoping to inject a bit more user-friendly clarity and accountability for adequate due diligence and risk management.

Startpage: What are some misconceptions about digital privacy that laypeople often have?

Sarah Clarke: I think my top three would be:

1) “If you have nothing to hide, you have nothing to fear.” It is a false equivalence: you shouldn’t have to live in fear just because there is something you prefer to keep private, and keeping something private should not be equated with guilt.

2) The belief that making something public costs you your right to protection. That is categorically not the case, at least under laws like the GDPR. Your human rights do not disappear once your data is no longer secret. As I said above, you may share your data for specific purposes, but if someone takes that data and uses it in unexpected and harmful ways, you have a right to know about it and to seek redress.

3) That anyone can own our data. This, depending on where you are in the world, is still true, but I believe it should be false for everyone, everywhere. Under the EU and now UK GDPR, there is no concept of data ownership. We are our data and our data is us. In most current cases data handling works on a basis of custodianship: we lend data, but retain rights. In future, most of the thoughtful data protection and privacy professionals of my acquaintance believe we will move to a model of employment, complete with an ethical revamp of pre-existing concepts of related rights and acceptable working conditions, paid or otherwise. I don’t see how we can avoid that as ever richer versions of us are indelibly encoded into information and increasingly enhanced by AI, becoming ever more indistinguishable from a physical self.

Concepts of data ownership, where we cede rights to digital portions of ourselves in return for some kind of payment, assume that a commodity market for our data will set a fair price for that portion of us. They also assume that letting businesses use our data to build better products and services will be mutually beneficial and begin to decrease inequality. But there is little evidence that a market with partial or whole people as a tradeable commodity would produce a fair exchange and greater equality while respecting human rights, any more than historical examples of similar things did.

Startpage: What are some things ordinary people can do to better protect their privacy?

Sarah Clarke: The best recommendations are old chestnuts, but still worth their weight in gold to stop us from becoming easy targets for common scams. For example: using a different strong password for each online account and app; using a password safe to help you remember them all; and using privacy-respecting search tools like Startpage or Qwant, which allow far less invisible data collection. Ditto with email and messaging: things like ProtonMail and Signal offer encryption and don’t harvest data in the background if you miss a privacy setting here or there, unlike other common free services. Otherwise, my best recommendation is just to think twice about what you share, at least until we can work together to make privacy and security by design just a little bit more common everywhere.
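To make the password advice concrete, here is a minimal sketch of how a strong random password can be generated, using Python's standard secrets module. A good password safe does the equivalent for you; the length and character set below are just illustrative choices, not a recommendation from the interview.

    # Minimal sketch: generate a strong random password per account
    # using Python's standard "secrets" module (cryptographically secure).
    import secrets
    import string

    def generate_password(length: int = 20) -> str:
        """Return a random password of letters, digits, and punctuation."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    # One distinct password per account; store them in a password safe
    # rather than reusing any of them.
    for account in ("email", "banking", "social"):
        print(account, generate_password())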

Startpage: Do you have any other interesting ideas to share with us?

Sarah Clarke: Only that privacy, data protection, and adjacent professions are really exciting places to work right now, and with the ever-expanding list of laws and regulations, we are going to need all the help we can get. So please dive right in; the data lakes are balmy.


Privacy in Action is a series of interviews with privacy-minded Startpage users from diverse backgrounds. If you are interested in participating in the Privacy in Action series or would like to nominate someone to be interviewed by us, reach out to us at privacyplease@startpage.com.

The views expressed in this Q&A are those of the interviewee and do not necessarily reflect those of Startpage.

 
