
Your Days are Numbered: Cathy O'Neil's Five Algorithms That Trick Us Daily

Posted on 29th June 2017 by Martha Greengrass

'The activity of a single Facebook algorithm on Election Day, it’s clear, could not only change the balance of Congress but also decide the presidency.'

Our lives are increasingly controlled by numbers, or more specifically by algorithms: models that shape how we shop, what we read, how we work and even how we think. In her eye-opening new book, Weapons of Math Destruction, Cathy O’Neil sheds light on the role of big data in our lives and on how the fabric of our society, from political systems to civil liberties, is steadily being eroded by these all-pervasive systems. Here, she picks out five of the most surprising ways in which algorithms trick us in our daily lives.

Most of us assume that mathematical models must be neutral. But because they are created by human beings, such models can incorporate their creators’ biases. Ill-conceived algorithms can be found everywhere from advertising to hiring: opaque, unquestioned and unaccountable, they operate at a scale that sorts, targets, or “optimizes” millions of people. Here are five of the most surprising.

Online advertising

If it were true during the early dot-com days that, as a famous New Yorker cartoon had it, “nobody knows you’re a dog,” today we’re seeing the exact opposite. We are ranked, categorized, and scored in hundreds of models, on the basis of our revealed preferences and patterns.

This can offer a powerful basis for legitimate ad campaigns, but it can also easily fuel predatory practices, allowing ads to pinpoint people in great need and sell them false or overpriced promises. These ads find inequality and feast on it. For instance, for-profit universities in the US (such as the University of Phoenix) use targeted online advertising based on postcodes and other personal data to reach poorer prospective students and encourage them to take out huge loans. The result is that such campaigns perpetuate our existing social stratification, with all of its injustices.

Getting a job

The hiring business is automating, and many of the new programs include personality tests. Personality testing is now a $500 million annual business, growing by 10 to 15% a year, according to Hogan Assessment Systems Inc., a testing company. Such tests are now used on 60 to 70% of prospective workers in the United States, up from 30 to 40% about five years ago.

HR departments use automatic systems to winnow down piles of CVs. Some 72% of CVs are never seen by human eyes. Computer programs flip through them, pulling out the skills and experiences that the employer is looking for. Then they score each one as a match for the job opening. It’s up to the people in the human resources department to decide where the cutoff is, but the more candidates they can eliminate with this first screening, the fewer human-hours they’ll have to spend processing the top matches.
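O’Neil doesn’t reproduce any vendor’s code, but the screening step she describes is easy to sketch. Here is a minimal, hypothetical version in Python: the skills, weights and cutoff are invented for illustration, not taken from any real system.

```python
# Minimal, hypothetical sketch of an automated CV screener: score each
# CV by weighted keyword matches, then keep only candidates above a
# cutoff chosen by the HR department. Skills, weights and the cutoff
# are invented for illustration.

REQUIRED_SKILLS = {"python": 3, "sql": 2, "project management": 2}

def score_cv(cv_text: str) -> int:
    """Crude keyword match: sum the weights of skills found in the CV."""
    text = cv_text.lower()
    return sum(w for skill, w in REQUIRED_SKILLS.items() if skill in text)

def screen(cvs: dict[str, str], cutoff: int) -> list[str]:
    """Return candidates whose score meets the cutoff, best match first."""
    scores = {name: score_cv(text) for name, text in cvs.items()}
    passed = [name for name, s in scores.items() if s >= cutoff]
    return sorted(passed, key=scores.get, reverse=True)

candidates = {
    "A": "Ten years of Python and SQL development.",
    "B": "Experienced team lead, retrained in project management.",
}
print(screen(candidates, cutoff=3))  # ['A']; CV B is never seen by human eyes
```

The cutoff is the whole story here: anything scoring below it is discarded before a human ever reads it, which is how nearly three-quarters of CVs can go unseen.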

In the workplace

Wildly irregular schedules are becoming increasingly common, and they especially affect low-wage workers at companies like Starbucks and McDonald’s. A lack of notice compounds the problem. Many employees find out only a day or two in advance that they’ll have to work a night shift. The scheduling model is optimized for efficiency and profitability, not for justice or the good of the team.
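The book doesn’t publish any scheduler’s internals, but the logic it describes can be sketched as a greedy, demand-only assignment. Everything below (the forecast, the staffing ratio, the names) is invented for illustration.

```python
# Toy demand-driven scheduler in the spirit O'Neil describes: staffing
# follows the hour-by-hour sales forecast and nothing else. There is no
# notice period and no regularity constraint, so a "clopening" (closing
# one night, opening the next morning) is a perfectly valid output.
# All numbers and names are invented.

forecast = {"Mon 22:00": 5, "Tue 06:00": 5, "Tue 14:00": 2}
STAFF_PER_UNIT_DEMAND = 0.5  # hypothetical productivity ratio

def schedule(workers: list[str]) -> dict[str, list[str]]:
    """Fill each slot from the front of the roster, slot by slot."""
    rota = {}
    for slot, demand in forecast.items():
        needed = max(1, round(demand * STAFF_PER_UNIT_DEMAND))
        rota[slot] = workers[:needed]
    return rota

print(schedule(["Ana", "Ben", "Cal"]))
# Ana and Ben close Monday night and open Tuesday morning:
# efficient for the model, punishing for the workers.
```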

For a few decades, it may have seemed that industrial workers and service workers were the only ones who could be modeled and optimized, while those who trafficked in ideas, from lawyers to engineers, could steer clear of such models, at least at work. But in 2008, just as the Great Recession was approaching, a San Francisco company called Cataphora marketed a software system that rated tech workers on a number of metrics, including their generation of ideas. Nowadays many companies are busy trying to optimize their white-collar workers by looking at the patterns of their communications. Tech giants are hot on this trail.

Credit 

There are many pseudoscientific models that attempt to predict our creditworthiness, giving each of us a so-called e-score. These numbers, which we rarely see, open doors for some of us, while slamming them in the face of others. Unlike the standard, regulated credit scores they resemble, e-scores are arbitrary, unaccountable, unregulated, and often unfair.

A Virginia company called Neustar provides customer targeting services for companies, including one that helps manage call center traffic. In a flash, this technology races through available data on callers and places them in a hierarchy. Those at the top are deemed to be more profitable prospects and are quickly funneled to a human operator. Those at the bottom either wait much longer or are dispatched into an outsourced overflow center, where they are handled largely by machines. 
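Neustar’s actual system is proprietary, but the triage O’Neil describes amounts to a simple score-and-route rule. A minimal sketch, with invented signals, weights and threshold:

```python
# Minimal sketch of score-and-route call triage: estimate each caller's
# likely profitability from whatever data is on hand, then send high
# scorers to a human and everyone else to the overflow queue.
# The signals, weights and threshold are hypothetical.

from dataclasses import dataclass

@dataclass
class Caller:
    past_spend: float       # prior purchases on record
    premium_postcode: bool  # crude wealth proxy

def profitability_score(c: Caller) -> float:
    return 0.01 * c.past_spend + (20.0 if c.premium_postcode else 0.0)

def route(c: Caller, threshold: float = 25.0) -> str:
    if profitability_score(c) >= threshold:
        return "human operator"
    return "overflow queue"

print(route(Caller(past_spend=3000, premium_postcode=True)))   # human operator
print(route(Caller(past_spend=200, premium_postcode=False)))   # overflow queue
```

The caller never sees the score; they only experience the hold time.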

Some credit card companies carry out similar rapid-fire calculations as soon as someone shows up on their website. They can often access data on web browsing and purchasing patterns, which provide loads of insights about the potential customer. E-scores work as stand-ins for credit scores: since companies are legally prohibited from using credit scores for marketing purposes, they make do with this sloppy substitute.

Civic life 

Facebook, the immensely powerful network we share with 1.9 billion users, is also a publicly traded corporation. While it may feel like a modern town square, the company determines, according to its own interests, what we see and learn on its social network. This wouldn’t necessarily be an issue, but as more and more of us rely on Facebook to deliver our news, we have to ask: by tweaking its algorithm and moulding what we see, can Facebook game the political system?

The company’s own researchers have been looking into this. During the 2010 and 2012 US elections, Facebook conducted experiments to hone a tool it called the “voter megaphone.” The idea was to encourage people to spread the word that they had voted. The campaign started out with a constructive and seemingly innocent goal, to get out the vote, and it succeeded. After comparing voting records, researchers estimated that their campaign had increased turnout by 340,000 people. That’s a big enough crowd to swing entire states, and even national elections. The activity of a single Facebook algorithm on Election Day, it’s clear, could not only change the balance of Congress but also decide the presidency. Facebook’s potency comes not only from its reach but also from its ability to use its own customers to influence their friends.

And so models, despite their reputation for impartiality, can often reflect goals and ideology. As big data takes over our lives, now is the time for us to ask for more transparency, stricter laws and algorithmic audits. We have to explicitly embed better values into our algorithms, creating models that follow our ethical lead. Sometimes that will mean putting fairness ahead of profit.

