Who Exactly Is Watching You? How to Avoid Leaving a Data Trail

Thursday, June 23, 2022

Who Exactly Is Watching You? How to Avoid Leaving a Data Trail with UWaterloo Surveillance Researcher Jennifer Whitson

by Mayuri Punithan

Can you go for a weekend without your phone and laptop?

For many of us, our devices are our lives. Not only do they store our school and work, but they also hold important memories and personal data. Years ago, Sociology Professor and CPI Researcher Jennifer Whitson based a midterm on this very challenge, asking students to go without their devices and discuss the effect it had on them. In another assignment, students imagined they “hear cop cars coming and they must live off the grid for a term. And how would they do it right? What is their escape plan?” Unfortunately, she had to discontinue these assignments, as it is “impossible because there is no way that we could be untethered from our devices for that long”.

Initially wanting to become a lawyer, Whitson completed a degree in criminology at the University of Alberta. After experimenting with several careers, she became interested in “techno crime and the rising trends of what was new at the time, which was identity theft”. What intrigued Whitson was that people would spend so much money on identity theft insurance when most identity theft cases were institutionally based: “Credit card companies sending out all of these free credit card offers. But they were also sort of making revenue on the back end, right by selling identity theft insurance”. She became interested in governing online spaces and online crimes, where even if you kick out a user for inappropriate behaviour, they can simply return under a new identity.

When researching solutions, she focused on the video game industry, such as how to create online spaces that prevent harassment and the exploitation of minors. She also decided to study labour relations within the industry, such as how designers cope with not being paid enough for their work, which is common in mobile games. She found that designers would study the playing habits of non-paying players whose patterns resembled those of the game’s biggest spenders. Designers would then find ways to convert them into paying players, identifying which virtual items they were most likely to buy or which revenue-generating ads they were most likely to click. Ultimately, she found that video game designers often use surveillance technology, which links their work to the cybersecurity and privacy field.

Whitson understands why we’re so tethered to our devices: they make our lives easier. For instance, many people use tracking apps such as Fitbit and period trackers. Some criminal cases have even used Fitbit data to establish a suspect’s location at a certain time, or to determine when a murder took place, since the device can show when the victim’s heart stopped beating.

We tend to assume that the only parties seeing our information are the app creators. Whitson highlighted that some insurance companies use health monitoring technologies, such as weight trackers or diagnostic tools, to determine whether someone is a good client for their insurance. She also notes that period trackers can be an ‘object of risk’: because they can detect that a user was pregnant and then suddenly was not, they can reveal that someone had a miscarriage or an abortion. This information can be shared with anti-abortion groups or law enforcement, which is especially alarming for people who live in countries where abortion is illegal.

This type of technology is called surveillance technology. According to Whitson, “the concept of surveillance is different from privacy because surveillance denotes watching over somebody. If we think of original conceptions of the term by, like, philosophers like Michel Foucault, it means watching over with the intent of changing somebody's behavior”. A classic example is the panoptic prison, where guards can always see the prisoners in their cells but the prisoners cannot see the guards, which pressures the prisoners to behave. The key difference today is that, as technology users, we often don’t realize we’re being monitored. Even worse, we do not know who is monitoring us.


Even if technology such as Fitbit is convenient for goal tracking, Whitson doubts its effectiveness: “Walking 10,000 steps doesn't make me a healthier person, right? The number of the steps that you take is just a poor placeholder for the things that you want to attain a healthier body”. Rather, people should consider more meaningful changes, such as to their diet or exercise. And although some diagnostic technologies can provide health recommendations, they are biased: because they are largely designed by and for Caucasian male bodies, their recommendations are of little use to other groups. Ultimately, these technologies are not worthwhile, especially if you’re trading away your privacy.

Facebook and other social media sites have been praised for introducing intervention messaging services for users exhibiting severe mental health issues. However, Whitson questions their quality: “What is it about the expertise of Facebook or their employees that would indicate they know the best way to intervene or respond to this [situation]? Where [are] their credentials?” Social media sites are not offering these services for altruistic reasons. Rather, they “designed and developed those tools because they were afraid of the legal liability and the PR nightmare that came” when users would broadcast self-harm or other dire situations on livestreams. Furthermore, this information is shared with external groups, who exploit it to profit off people's vulnerabilities. For instance, many advertisers target self-help books at users who suffer from depression, and some insurance companies have denied coverage after discovering a client is exhibiting severe mental health issues. Unfortunately, technologies like social media counselling and Fitbits are painted as helpful and convenient, yet this helpfulness is superficial, ineffective, and, worse, exploitative.

But can surveillance technology benefit people’s mental health? The answer is complicated. Whitson highlights that some tools can “regulate how much time you spend on your devices, that help you meditate, that help you access immediate virtual counseling care right and connect you to a counselor ASAP”. So what distinguishes these from harmful surveillance technology? Whitson states that the real issue is that people are assessing their “self-worth according to social media metrics... we're opening ourselves to the comment feeds and then get rated by strangers, who [are] often jerks, because they're anonymous. Tons of research shows that this [usage] has negative implications for the mental health, particularly of young women and girls”. Hence the real solution is to decrease social media use, which is impractical for the platforms as it entails “decreasing profitability, decreasing power”. Outside of social media, surveillance technologies could “potentially help us regulate our mental health if they're in the right hands and used in the right context”, such as those of a counsellor rather than a corporate-driven entity.

How does surveillance technology fuel our addiction to our devices? It’s a result of design tricks that employ psychological practices, known as ‘dark patterns’. One common practice is positive reinforcement: providing someone a motivating or reinforcing stimulus after they exhibit the desired behaviour. This happens when we receive a like on a post or a reward after finishing a game level. Whitson explained that our devices study our behaviour. For example, if a player tends to “tap out at the 45-minute mark, [then] at the 43-minute mark, you give them access to a shiny new tool or resource, or you give them a badge, or you send them a chat notification of something else that will compel them to stay”. For games like Candy Crush, “they know that you may respond more positively or click more often on green buttons than blue buttons, and so they'll just change the color buttons on your interface”. Whitson notes these practices could be beneficial if used for a positive purpose, such as encouraging exercise, but overall, designers find ways to compel us to spend more time or money on our devices.
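As a rough illustration of the kind of logic Whitson describes, here is a minimal sketch in Python of a “reward-before-quit” dark pattern: it predicts when a player usually stops playing and schedules a reward just before that point. The function names, thresholds, and data are invented for the example; this is not any real game’s code.

    # Hypothetical sketch of a "reward-before-quit" dark pattern.
    # All names, thresholds, and data are invented for illustration.
    from statistics import median

    def predict_quit_minute(past_session_lengths):
        """Estimate when this player usually stops playing (in minutes)."""
        return median(past_session_lengths)

    def maybe_trigger_reward(minutes_played, past_session_lengths, lead_time=2):
        """Fire a reward shortly before the predicted quit point to keep the player engaged."""
        predicted_quit = predict_quit_minute(past_session_lengths)
        if predicted_quit - lead_time <= minutes_played < predicted_quit:
            return "grant_shiny_new_tool"  # e.g. a badge, resource, or chat notification
        return None

    # Example: a player who usually taps out around the 45-minute mark
    history = [44, 46, 45, 45, 47]
    print(maybe_trigger_reward(43, history))  # -> "grant_shiny_new_tool"
    print(maybe_trigger_reward(30, history))  # -> None

The same logic underlies the button-colour example: log which interface variant a user clicks more often, then serve that variant back to them.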

Although Whitson does teach her students how to design more ethically, she cautions us not to blame designers. Oftentimes, their bosses order them to employ ‘dark patterns’ or ‘surveillance practices’ or else they’re fired. This scenario is especially common for students and first-time job seekers; it is an industry issue that our society needs to be aware of.

Whitson teaches surveillance courses at the University of Waterloo. She believes these courses are essential “because our economy runs on data about where we go, who we are likely to text, how we spend our time, what triggers are likely to get us to spend our money. We trail this data exhaust behind us and [it is] collected, collated, aggregated, and used to create profiles to predict and manage our future behavior”. Not only is this practice an invasion of our privacy, but it is also dangerous. Whitson states, “it goes to whether the tools to help employers scan resumes automatically discards your specific resume because of something benign like your last name or your university major”. She even compares this tracking to the tools police officers use to decide which communities to patrol. Whitson hopes the university will offer more surveillance courses so students become more aware of how they’re “manipulated, profited [from], and addicted to their devices”. Moreover, students can gain hope that there is a future where laws, technologies, and profitable enterprises do not operate on surveillance capitalism; some breakthroughs, such as privacy-enhancing technologies and laws, already exist.

Whitson also recommends other tools. Instead of using Google, try DuckDuckGo, a search engine that doesn’t prey on your privacy. Check out the Electronic Frontier Foundation, a portal filled with “lots of resources for preserving your privacy and information on it”. She even recommends the 99% Invisible podcast episode “No Armed Bandit”, in which MIT-based anthropologist Natasha Dow Schüll discusses how casino slot machines are designed to fuel addiction; Schüll also authored the book Addiction by Design. Finally, Whitson advises students not to share group work on Facebook, as the platform analyzes the assignments’ content and can share it with external parties such as businesses or governments. This can be dangerous if you are writing on a controversial or politically sensitive topic.

When she isn’t teaching, Whitson mentors graduate students conducting surveillance technology research projects under UWaterloo’s Cryptography, Security, and Privacy (CrySP) group. Projects range from the history of surveillance of queer men, including their tracking, categorization, and oppression, to tracking devices for people who have Alzheimer's that are partnered with police agencies, and whether this police oversight is necessary. The most rewarding aspect is seeing students “find things that motivate and drive them, the parts of the world that they want to change and help them create projects that address... those parts of the world that they want to intervene with”. Researchers like Whitson and her students continue to work towards an open, informed, and just society that is free from corporate or state control.
