Illustration by Bianca Casco
On a Date with Big Data
Text by Jane Ruffino | Bang 2/2018
How much does your dating app know about you, and where does all that information go? It's time to put on the tin foil hat as content designer Jane Ruffino writes about data collection, privacy and consent.
In April, Facebook CEO Mark Zuckerberg testified before the US Congress about the company’s practices with user data, including its misuse by the consulting firm Cambridge Analytica. Zuckerberg’s bewildered look, too-fresh haircut and besuited plaintiveness hinted at a man unaccustomed to even the mildest critique. We are an idealistic company, he said.
Everyone knows by now, if they didn’t before, that Facebook uses our personal data to rent our attention to advertisers. Most people know about ‘fake news.’ Maybe they even know how to take action, like downloading the data Facebook has collected about them. Much of the questioning from Congress focused on content reviews and ad targeting, things people are increasingly aware of. A few mentioned potentially shady third parties.
While we can alter some of our digital behavior to limit our exposure, telling us to stay offline is no different from insisting that rape can be prevented by not walking alone at night. Hell, Facebook even collects data on people who don’t use it; content is just the start for them.
A feature, not a bug
There’s a saying in software. When you want to convince people that your mistake was deliberate, you say it’s ‘a feature, not a bug.’ We’re told our privacy is just something we have to give up if we want to have nice things. That’s no bug.
While hate speech, revenge porn, harassment, and data breaches are central issues, it’s the data practices we don’t see that most undermine our attempts to build a consent culture.
Today we talk about ‘Big Data,’ ‘personalization,’ and ‘unlocking value’ in information, but the business model isn’t innovative.
“A lot of the way that we act is very much rooted in settler colonial mentality, as the basis of capitalism, non-consensually taking what you think you’re entitled to,” says Flavia Dzodan, a writer who also teaches critical theory at Amsterdam’s Sandberg Institute.
“You have people programming algorithms, entering categories of things, and classifying data, using the same classification we’ve been using for things since the 16th century.”
Organizing flora, fauna and people by physical traits, measuring and carving up indigenous landscapes, and treating human beings as actual, physical goods — all of these have been part of turning the observable world into data as a part of making everything ownable. It’s driven by the reckless optimism of relevance-hungry white men, fed with money from slightly older white men, and it’s many centuries old.
Today’s sprawling empire is bigger and deeper. It’s in our pockets, on our wrists, under our skin. Our apps and services tell other people where our bodies are in space, share our height, weight, land speed, reproductive and sexual health, diet, and heart rate. They build profiles, and we know very little about where those digital doppelgangers go. We can no longer say where the person stops and the personal data begins.
Like sexual interactions, digital privacy is about agency. In the era of #metoo and #timesup, it’s relatively uncontroversial to say that if you have to nag or conceal information, then you haven’t really gotten consent. And like sexual and other shared physical interactions, it’s not ‘freely given’ if someone has specifically set out to ‘obtain’ it, rather than create it mutually.
“We have had this other powerful moment,” says Dzodan, “And we have to contend with this idea that the same people teaching the machines consent are the same ones not respecting the consent of their colleagues.”
It’s no surprise that in order to fully consent to the terms of service for most consumer digital services, you need to understand backend architectures, data processing laws, intellectual property, and APIs, as well as have insight into practices around your data that must be concealed from you for the business to make money.
In other words, it’s impossible to fully consent to much of what happens to our datafied selves. We can know what information we’ve given our services, but we don’t know what they’ve taken.
You’re never alone
Back in March, SVT’s consumer issues series Plus Granskar ran a segment showing that Facebook isn’t the only questionable data citizen. Sanne Olsson looked at the dating app Tinder, along with the geolocation-based Happn and the LGBT app Grindr.
With the help of Norwegian research institution SINTEF, Olsson showed that despite telling users they weren’t sharing data with third parties, Happn shares gender, age, ethnicity, precise GPS location, and device ID with external companies, some in the United States, where data protection laws are looser. When we consent to an app’s terms of service, we shouldn’t be forced to accept an uninvited guest on our date, especially when they promised not to.
From a content and feature perspective, Grindr has gone some way toward a duty of care for its community of users. The app lets you change the icon on your home screen if you don’t want to be open about using it. You can include a whole range of information in your profile that helps Grindr users have safe, fun interactions, on the terms they set themselves.
One important piece of data you can list is whether you’re taking PrEP, a preventive drug that drastically reduces a user’s chance of contracting HIV.
“The advent of PrEP has changed the game in a big way,” says John Kamys, a musician and yoga instructor based in Chicago who has used Grindr all over the world. “A friend said that for years he had shame about a sexual encounter, and now he can take PrEP and he feels like all the shame is gone about the way he felt toward sex.”
As an out performing artist in Chicago, Kamys says he doesn’t fear what he puts out there, but he knows others do.
“The problem is when you’re in places where everyone is afraid, like Albania, or Egypt,” Kamys says. “The gay world works under everything. If you are living in fear, you have your own symbols and ways of navigating to keep you under the radar — but there’s a problem with this that it can throw someone under the bus so quickly.”
Grindr sent information such as age, height, precise GPS position (rather than making it ‘fuzzy’ to hinder unwanted identification), sexual preferences, and even HIV status, to third parties that include analytics and retargeting companies, and app monetization platforms, some of it unencrypted. Grindr, which finally stopped sharing HIV status in early April, claims they sent the information to ‘optimize’ experiences for users. But exactly how does your health information improve the performance of an app?
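The ‘fuzzing’ the article mentions is a simple idea: before sharing a location, an app can snap the coordinates to a coarse grid and add random jitter, so what leaves the phone points to a neighborhood rather than a doorstep. Here is a minimal, purely illustrative sketch of the technique (the function name and grid size are invented for this example, not taken from any app’s actual code):

```python
import random

def fuzz_location(lat, lon, grid=0.01):
    """Snap coordinates to a coarse grid, then add random jitter
    within half a grid cell in each direction. With grid=0.01
    degrees, the reported point can be off by up to roughly a
    kilometer, enough to hide an exact address."""
    fuzzy_lat = round(lat / grid) * grid + random.uniform(-grid / 2, grid / 2)
    fuzzy_lon = round(lon / grid) * grid + random.uniform(-grid / 2, grid / 2)
    return fuzzy_lat, fuzzy_lon

# A precise GPS fix in central Stockholm comes back shifted,
# and shifted differently on every call.
print(fuzz_location(59.32938, 18.06871))
```

The point of the sketch is what Grindr chose not to do: passing coordinates straight through costs nothing less than this, so precision was a choice, not a constraint.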
“I absolutely believe that app creators have a responsibility to design their platforms in a way that doesn’t place all the burden of safety on the user,” says Jillian C. York, Director for International Freedom of Expression at the Electronic Frontier Foundation. “Particularly some of the queer apps and platforms have had campaigns around safer sex, and I’m glad they’re doing it, but they’re only thinking about harm reduction in the sexual context.”
“They don’t have any means of refreshing users’ consent when new policy changes happen,” she says.
You don’t get to choose not to be data
“If you’re not paying for it, you’re the product” is a common way to explain the data economy. In a way, it’s true, but it suggests we’re somehow to blame for being duped by a free lunch. And in the case of dating apps, the service isn’t always free. Users pay for enhanced features, more matches, and ad-free experiences. You won’t see ads, but there’s no promise that your information won’t still be collected.
It’s not free, and you’re the product, and you’re not even sure who the customer is. In the case of Happn and Tinder, you need an active Facebook account, which means your apps don’t need to be breached, or even to be oversharing data, for information to find its way there.
“When a person on Grindr gives you a phone number and you text them once or twice, you might see them pop up as a friend suggestion,” says Kamys. “Even in small town Georgia, where I had no other connections.”
Angela Gross*, a designer who has used a range of dating and swinging apps in Ireland, said this was one of the biggest issues. “People sometimes buy a second phone for that — they call them ‘swing phones’ — and still you get all these people who are trying to keep their identities secret, popping up as suggested Facebook friends.”
“That stuff has consequences in Ireland,” she says. “If you were found out to be a swinger or into BDSM, real shit happens to you.”
Facebook is commodifying every interaction they can track, which they can often do despite our attempts to protect ourselves, and without concern for the dangers this can pose.
“There’s a fundamental inability to treat humans as humans,” says Gross. “That’s at the core of why tech is terrible in so many ways.”
What happens in the long-term?
What if we reframed digital consent as an issue of bodily autonomy? What if we stopped accepting the language of sacrifice as standard for our digital lives?
It’s not as if civil society doesn’t know, or doesn’t care, but these issues can’t be addressed by individuals, or even by individual countries. The GDPR, the General Data Protection Regulation, which came into effect in May of this year, is supposed to address the power imbalance between consumers and data-driven businesses, at least in the EU, but it can only be as effective as the motivations behind compliance strategies.
If a business model is based on not being clear exactly how a company uses the data people give it, how will they ever be compliant without revealing their secrets? Just like with sexual consent, many of the worst offenders know how to stay just on the right side of the law. Their idea of compliance might not align with our idea of consent, and consumers will still have to be proactive about removing or moving their personal data.
Data-driven marketing is usually the focus of critiques of data practices, but business intelligence is even more monetizable. Companies use it to create strategies, make predictions, and set prices, which means the greater risk is not what you give a company, but what they take from you and what they do next.
Information about your age, location history, income, heart rate, ethnicity, work history, household status, and all kinds of other data points, collected together from your device ID, your dating apps, your fitness tracker, and your credit history, can be used to assign you a measurable value or a risk score.
And that can be used to grant or deny you things you want: loans, health care, employment, even entry or residency visas. In European countries, increasing privatization of public services, including concepts like ‘value-based health care,’ means this kind of discrimination is a growing risk.
Consenting to enter your age into a form field so you can find a date with someone you might like should not be taken as consent to threaten your right to have an illness treated at an affordable price, or to be offered a job for which you’re qualified.
And that consent can’t easily be automated.
“How are we going to teach consent to machines when humans have a very poor track record, centuries of not understanding or exercising respect for consent?” says Dzodan. “My question at the moment is, when our culture doesn’t understand consent, how can you teach it to robots?”
This text was originally published in Swedish in the second issue of Bang 2018. Bang is Sweden’s biggest feminist and anti-racist publication. Subscribe to Bang to support intersectional feminist journalism!