Understanding the price of privacy

“If something is free, you’re the product.” — Richard Serra, 1973

We are living in a digital age accelerated by access to data and what we can do with it. So many things have become faster and simpler because of the presence of data, computing power, and algorithms. As a UX designer, it’s very clear to me how big a role access to data plays in our lives. But I also don’t think we fully understand the repercussions yet. Everything is so new, so shiny – it’s hard to see where our data could be manipulated long-term. Once we check the box, toggle on, or click accept, it’s not really ours anymore. But whose is it? Where does it go?

“People in the U.S. still struggle to understand the nature and scope of the data collected about them, according to a recent survey by the Pew Research Center, and only 9% believe they have ‘a lot of control’ over the data that is collected about them. Still, the vast majority, 74%, say it is very important to them to be in control of who can get that information.”

Steven Melendez and Alex Pasternack

While the vast majority of data-collection explanations are glossed over (or worse, purposefully confusing), I have seen a few examples where more care and time were spent explaining where my data may end up.

I recently signed up for Modern Fertility and was struck by the section about research participation. Normally I do a lot of skipping over long paragraphs of information (hypocritical much?), but the large text and all caps caught my eye.

[Screenshot of Modern Fertility’s research participation section]

The specific mention of “potential risks” gave me pause – this is a word seldom used when explaining data collection. It is an inherently “scary” word, planting an idea in readers’ minds that does give them… pause. Not something most companies opt to do while in the throes of making money.

Because of this, I read through the section more carefully than usual, and the last sentence really stood out to me: “There may be additional risks that are not foreseeable at this time”… and they are so right. The honesty was a little startling and, though it didn’t provide much detail, I felt very seen. This is really where the crux of this article lies – there may be unknown future risks that should still be given weight, so that we can hold that knowledge in the back of our minds, always.

Apple’s privacy initiative brought a huge change to iPhones in 2021: with App Tracking Transparency, iOS itself asks users whether or not they want their data to be tracked by specific apps.

[Notifications from Apple regarding app privacy | Photo credit: Apple / Newshub]

Instead of privacy settings hidden away in a preferences section, the prompt is the first thing you see when opening the app. It has created a huge stir in the world of targeted ads and marketing revenue, but more importantly it creates a… pause. The pause a user needs to connect what they decide to do with its potential effects. I’m certainly not saying Apple’s new initiative is perfect, but it’s a step in an important direction.
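For readers curious about the mechanics, here is a minimal sketch of what triggers that prompt on the developer’s side, using Apple’s AppTrackingTransparency framework. The helper function name is my own; the explanatory sentence shown inside the prompt comes from the app’s Info.plist (the NSUserTrackingUsageDescription key), not from code, so developers can’t restyle or bury it:

```swift
import AppTrackingTransparency

/// Hypothetical helper (the name is mine): asks iOS to show the
/// system tracking prompt. The wording inside the prompt comes from
/// the app's Info.plist (NSUserTrackingUsageDescription), not from here.
func requestTrackingPermission() {
    if #available(iOS 14, *) {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // User tapped "Allow": the app may read the ad identifier.
                print("Tracking authorized")
            case .denied, .restricted:
                // User said no, or a device-level restriction applies.
                print("Tracking not allowed")
            case .notDetermined:
                // The user hasn't answered the prompt yet.
                print("Not determined")
            @unknown default:
                print("Unknown status")
            }
        }
    }
}
```

Part of what makes this design work is that until the user answers, the status stays “not determined” and the app gets no usable advertising identifier – the pause is enforced by the system, not left to the app’s goodwill.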

In the end, we are all still only one “toggle on” away from privacy being taken from our hands. Taken from us for all future uses that might only get spelled out in a sporadic terms + conditions email from a company we forgot we signed up with.

People should always have the freedom to choose for themselves. Period. The questions I’m starting to ask are: How do we design for generations that don’t value privacy? Instead of taking advantage of ignorance, how do we slow down and make complicated systems tangible?

I hope these are questions that we as a society start asking and we as a community start answering. There is a lot we all don’t know.