The problem is not that people are giving up personal information because they do not understand privacy. Rather, there are complex trade-offs being made to make life more bearable

by Natalie Nzeyimana / December 20, 2018
Your friend has recently moved to Canada. Brexit was the last straw for them and they have upped and left. Real-time banter that once flowed freely is now stilted, staggered by time zones. Email doesn’t quite do the trick. You want to know how they are without reading long essays. You need brief updates. Enter Instagram.
As you shuffle into your morning commute carriage, somewhere between an elbow and a shoulder, for a brief moment, you are in Vancouver. For a brief moment, the moving sardine can you are tightly packed into is inconsequential. Your commute is a little less lonely, because despite that elbow jabbing your ribcage, you know your friend is happy and having a whale of a time. There is joy; residual, voyeuristic, immersive joy, and thanks to Instagram, it is “free.”
The permission we give to technology companies to use our data for a set of defined uses is the price we “pay” for that joy. Instagram is owned by Facebook. Data collected by Instagram is data owned and stored by Facebook. You may have noticed how frequently Facebook has been in the news over the last year or so. Courts across several jurisdictions (mainly the US and Europe) are scrutinising how Facebook has decided to use, sell, and share the user data it owns.
Your friend in Vancouver, the one whose life you catch up with on the daily commute, is not immune to data privacy concerns. There are two sets of risks for your friend’s Instagram account. The first is unknown users who may be tracking their location, interests, and other data gleaned from watching their profile. The second is that Facebook may be mining data from Instagram in ways your friend is not aware of.
Public dialogue around data privacy seems to fall into three categories: personal data ought to be protected (from third parties accessing it without our consent), there ought to be fewer privacy breaches (recent leaks of financial data and healthcare records have many worried), and there ought to be legal protections in place to enforce data privacy (perhaps you’ve heard of the recently passed GDPR legislation?).
Yet, still, we willingly download new apps without reading their privacy statements. Why is that? The problem is not that people are giving up personal information because they do not understand privacy. Rather, there are complex trade-offs being made to make life more bearable.
It may seem strange to insist that we are trading data for joy, rather than, say, convenience. However, online interactions trade in emotion as well as ease. Yes, it is more convenient to do certain things on the internet: it is more convenient to make bank transfers online than in person, and more convenient to arrange travel online than it is over the phone.
This is not the full story, though. We go online not just because we want to pay our bills faster or run our errands more efficiently. We go online because we want to connect; we want the joy that comes from connecting to people we like, even love, and seeing content that speaks to our values, interests, and meme preferences.
When we log on to a social networking app—like Instagram, Twitter or Facebook—we trade aspects of our personal data for the convenience of connecting with other people. As a scholar, the joy of finding someone researching a similar area goes beyond the improved efficiency this adds to my research process. There is a camaraderie, a shorthand. There are in-jokes, support, and advice. There is an emotional experience; a joyous one.
With this in mind, we can see how the challenge of educating people about data privacy is amplified by our economic moment. The convenience of giving up personal information now seems like a small price to pay for feeling something resembling joy. Austerity has decreased the accessibility and availability of joy offline; take, for instance, cuts to crucial public services like libraries and the privatisation of the arts sector. The price of theatre tickets has skyrocketed over the last decade and the closure of smaller live music venues has made it harder for emerging artists and audiences to access affordable gigs. Combined with the increased cost of childcare and the strain on time, energy, and personal budgets, it’s now a lot harder to engage in joyful activities in “the real world.”
A lot of folks who work in data privacy are fiscally protected from the brute realities of this negotiation. Researchers, policymakers, academics, software engineers, pundits and other data privacy practitioners rarely encounter the issues that arise from poverty on a personal level. They may vehemently oppose it, and perhaps empathise deeply, but they are not likely to need to rely on a food bank, take out a payday loan, or worry seriously about homelessness. Some may feel the pinch, but very few feel the burn. This poses a challenge: the people working on data privacy initiatives may not understand why certain people would trade data for a “free” product.
Often, it is the most marginalised groups that turn to the internet to find community. Trans and non-binary folks, refugees, sex workers, and people with disabilities and chronic illnesses all use corners of the internet to create digital communities where issues can be discussed freely, pertinent information can be distributed and the loneliness of being stigmatised and ostracised can be momentarily assuaged. Corners of the internet where joy can be accessed.
What happens, then, is that the groups most at risk end up sharing their data the most. This year, UK online bank Monzo began offering bank accounts specifically targeting refugees and asylum seekers. Yet what appears on the surface to be an act of altruism also raises questions about data privacy for an already vulnerable community.
The uncomfortable reality, as we have seen from Facebook’s involvement in the Cambridge Analytica scandal, is that private technology companies should be viewed with caution, especially by the marginalised. If a refugee’s data were leaked in the current pre-Brexit surveillance climate, it could plausibly jeopardise their asylum status.
The more you engage on an online platform, the more of a trail you leave. With risk a pressing concern for marginalised groups on the internet, how do we reduce the potential for harm when it comes to data privacy violations? How do we protect the joy?
Education plays a role. There are many organisations working on making privacy easy to understand. Pioneering this kind of work are organisations like Digital Justice Lab in Canada. They are working to build alternative ways of engaging online, working closely with marginalised communities to share knowledge. They also work with other stakeholders, like policymakers, to have input into how the law around digital issues like privacy can be reformed to protect joy, enable community and educate users as to their rights. We can learn a lot from their approach and apply it to UK contexts.
We can also borrow from our response to leaks in other contexts. Whether it is phone hacking or data breaches, news media often shares private information to sell papers. When that information relates to a vulnerable community, a thoughtful response is to withhold snap judgments and to refrain from ascribing shame to an already targeted group. That thoughtfulness applies to our digital lives, too.
We need a more holistic approach to public dialogue around data privacy. When you are on your commute, using that app, scrolling through those images, think about the trade you are making. What is true, now more than ever, is that as precarity spreads, so does the yearning for joy—and the willingness to trade more personal data for it.