Yes, Even Algorithms Can Break Bad

Math, as it turns out, is doing more than making millions of teenagers unhappy over hard homework right now — it’s also making the world less democratic and egalitarian. In the wrong hands, mathematics can become a “WMD” — a weapon of math destruction.

Most people, especially those who still have flashbacks about the Calculus SAT II, will probably not find the former all that surprising — they will just add it to the list of things they don’t like about math. But even those avid math haters would likely be surprised by the source of the latter observation.

Cathy O’Neil does not hate math. In fact, she’s a mathematics PhD from Harvard, a professor, a data scientist and a former financial services quant. Cathy O’Neil loves math so much that when she was a child she used to factor license plates for fun. But she’s not exactly your typical mathematician either, since she’s probably the only name in the game willing to say that representative democracy in the digital age is not all it’s cracked up to be.

Well, that’s actually not exactly what she said…

“People keep suggesting that democracy is alive and well because we have two parties that don’t agree on everything. I think that’s total bull—-.”

She’s a lot more fun than the math professor you had in college.

She’s also a lot more skeptical about math, and about its power to do good.

Math alone isn’t a problem, she said — it’s not a weapon, massively destructive or otherwise, it’s just a method for finding answers. O’Neil’s trouble is with algorithms — rule-based processes for solving mathematical problems — and the ever-increasing ways they are deciding more and more of our lives, mostly from within a black box that the average math-hating citizen has no access to — and is actively steered away from gaining access to.

It was an observation she came by during her years as a Wall Street quant between 2007 and 2011, when she first started seeing algorithms become something like weapons of math destruction. These algorithms are defined by three main features: they are secretive, important and destructive to all but the precious few who actually understand them and what goes into them.

“I left, disgusted by finance, because I thought of it as a rigged system and it was rigged for the insiders,” says O’Neil. “I was ashamed by that — as a mathematician I love math and think of it as a tool for good.”

So O’Neil decided to do her part in making mathematics great again by writing a book, Weapons of Math Destruction, which explains in plain (non-mathy) language why everyone should be very aware that the algorithms we use to steer the world are just as capable of doing harm as they are of doing good.

Algorithms can feel like a sort of neutral mathematics, like 2 + 2 = 4, true no matter who you are or where you do the sum. But that isn’t quite the right way to understand them, O’Neil says.

Algorithms are complex sets of instructions that programmers input to give a computer guidance on how to solve a problem.

And it’s that programmer part where things get a little hairy. An algorithm is only as unbiased as the programmer who built it, O’Neil says.

Is the programmer harboring an unconscious bias against a group of people? If so, it might well show up in the way the computer crunches the numbers. And that, O’Neil notes, covers only the non-malicious cases: there are also plenty of ways that algorithms can be cooked, tweaked and twisted intentionally to spit out a very specific result.
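To make that concrete, here is a minimal, purely hypothetical sketch in Python of how a bias can hide inside an otherwise “neutral” scoring rule. The feature names, weights, ZIP prefixes and approval threshold are all invented for illustration; they are not from O’Neil’s book or any real lender.

```python
# Hypothetical loan-approval score. Every name and number here is invented
# purely to illustrate how a programmer's assumptions become "the math."

# The modeler decides that certain ZIP codes are "risky." That choice is a
# human judgment, but once encoded it looks like an objective penalty.
RISKY_ZIP_PREFIXES = {"104", "112"}  # invented example values


def loan_score(income: float, debt: float, zip_code: str) -> float:
    """Return a credit-style score; higher means more likely to be approved."""
    score = 600.0
    score += min(income / 1_000, 150)        # reward income, capped
    score -= min(debt / 1_000, 100)          # penalize debt, capped
    if zip_code[:3] in RISKY_ZIP_PREFIXES:   # the hidden, human-chosen rule
        score -= 75                          # neighborhood penalty, a proxy for class or race
    return score


if __name__ == "__main__":
    # Two applicants with identical finances get different answers,
    # and the difference is invisible unless you can read the code.
    for zip_code in ("10025", "10451"):
        s = loan_score(income=55_000, debt=12_000, zip_code=zip_code)
        print(zip_code, round(s), "approved" if s >= 640 else "denied")
```

The point isn’t the specific numbers. It’s that the penalty is a policy choice made by a person, buried in a place the applicant will never see, and reported back to them as math.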

It may be math — but it’s not always neutral.

And that’s where O’Neil says that the destructive part of algorithms comes into play. Results based on them tend to be taken as gospel because of the perception that they are data-driven and devoid of the background noise of opinion.

“You don’t see a lot of skepticism,” she says. “The algorithms are like shiny new toys that we can’t resist using. We trust them so much that we project meaning on to them.”

Because, as we noted at the outset, most people don’t love math. They tend to find it intimidating and don’t want to push back against it. Algorithms decide much of a citizen’s life: what ads a person sees, what political messages they hear, what kinds of loans they can get, how they fare in the criminal justice system. Yet most consumers don’t feel empowered to push back, because they don’t know the math.

But, O’Neil notes, there are a host of other questions one can ask about applications of algorithms that don’t really require knowing any math, such as, “Is the application of an algorithm legal at all?”

“Often we see systems using people’s fear and trust of mathematics to prevent them from asking questions,” she says. “I think it has a few hallmarks of worship — we turn off parts of our brain, we somehow feel like it’s not our duty, not our right to question this. People should feel more entitled to push back and ask for evidence, but they seem to fold a little too quickly when they’re told that it’s complicated.”

But complicated or not, it is a conversation worth having. If you’re interested in joining it, come to Innovation Project 2017 on March 15-16, where Cathy O’Neil will hold court on the impact of these “WMDs” on credit and financial decision making. Who knows, maybe we’ll see if we can get her to factor a license plate for fun.

Join us, won’t you?