
Algorithms – the scary truth about code

Algorithms influence ever larger parts of our lives, and they are seen as flawless mathematical formulas. The truth is scary.

Early in her career, Joy Buolamwini, a black engineer from Mississippi, encountered a robot with AI, artificial intelligence. It turned out the robot wasn’t very intelligent. It had a face detection function so that it could interact with humans - playing peekaboo, more specifically. But the robot couldn’t play with her; it couldn’t detect her face. Simply because she is a woman of color. Oh well, she thought, it will soon be corrected. But she soon experienced it again. And then, years later, again.

Nobody had fixed these racist robots. She decided to look into it.

We tend to believe that data and algorithms are science, that they are built on facts and are therefore free from bias and other human flaws. We are now beginning to see that the opposite is true.

The ability of a machine to detect a face is built on a large number of data points defining what a face is. That data is fed to the machine, and the machine “learns”. If a predominant share of those data points is collected from white faces - by people who ignore, or don’t realize, that all or most of the faces they use are white - then the definition of a face in the “mind” of the machine is a white face. So black faces won’t get detected.

A racist machine is born.
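To make the mechanism concrete, here is a deliberately toy sketch in Python - synthetic number vectors stand in for face images, and the “detector” is nothing like a real one. It learns a template as the average of a skewed training set, and detection then fails for the under-represented group. Every name and number is invented for illustration.

```python
# Toy illustration only: synthetic vectors stand in for face images.
# Each group's "faces" light up a different feature axis; the training
# set is heavily skewed toward group A.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

def faces(axis, n):
    # n synthetic "faces" for a group whose signal sits on one axis.
    x = rng.normal(0.0, 0.5, size=(n, DIM))
    x[:, axis] += 4.0
    return x

# Training data: 950 faces from group A (axis 0), only 50 from group B (axis 1).
train = np.vstack([faces(0, 950), faces(1, 50)])

# The "learned" definition of a face: the average training face.
template = train.mean(axis=0)
template /= np.linalg.norm(template)

def detected(x, threshold=0.7):
    # Detect via cosine similarity to the template.
    sims = x @ template / np.linalg.norm(x, axis=1)
    return sims > threshold

for name, axis in [("group A", 0), ("group B", 1)]:
    rate = detected(faces(axis, 1000)).mean()
    print(f"detection rate, {name}: {rate:.0%}")
# Group A is detected nearly every time; group B almost never, because
# the template was learned almost entirely from group A.
```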

Some say the reason for this is that the algorithms were created in an all-white office … that the data the machines are fed determines the outcome. “Algorithms are opinions embedded in code,” as the mathematician, data scientist and author Cathy O’Neil puts it.

Algorithms today affect many parts of our lives - often in ways we don’t understand. Algorithms decide who gets called to job interviews, who gets insurance and at what price, and what ads are shown to whom. They even decide where we end up in a customer service telephone queue.

It is true. Some companies have automated systems that rank customers into categories using data attached to their telephone numbers. High-value customers end up at the front of the line, and low-value customers get to spend their afternoon waiting at the end of it.
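What might such a ranking look like? Here is a hypothetical sketch - the phone numbers, customer values and scoring rule are all invented for illustration:

```python
# Hypothetical "value-based" call routing: callers go into a priority
# queue ordered by an estimated customer value looked up from the
# caller's phone number. All data below is invented.
import heapq

# Invented lookup table: phone number -> estimated customer value.
CUSTOMER_VALUE = {"+15551234": 9500.0, "+15559876": 120.0}

queue = []  # min-heap; value is negated so high-value callers pop first

def enqueue(phone_number, arrival_order):
    value = CUSTOMER_VALUE.get(phone_number, 0.0)
    heapq.heappush(queue, (-value, arrival_order, phone_number))

enqueue("+15559876", arrival_order=1)  # low-value caller arrives first
enqueue("+15551234", arrival_order=2)  # high-value caller arrives second

while queue:
    _, order, phone = heapq.heappop(queue)
    print(f"answering {phone} (arrived #{order})")
# The high-value caller is answered first despite arriving later.
```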

Worse things than that happen.

The news organization ProPublica has exposed that an algorithm used in courts for risk assessment was biased against black people. The algorithm predicts the risk that a defendant will reoffend, and the scores are used by judges to support their decisions in court. The idea behind it is that independent code would support human judges, because humans within the justice system have been proven, over and over again, to be racially biased. It turned out the algorithm wrongly labelled black defendants as high risk at a much higher rate than white defendants.
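The core check behind that kind of audit is simple enough to sketch: compare the false positive rate - labelled high risk, yet did not reoffend - across groups. The handful of records below is invented; ProPublica’s actual analysis covered thousands of real risk scores:

```python
# Compare false positive rates (labelled high risk but did not reoffend)
# across groups. The records are invented, for illustration only.
records = [
    # (group, predicted_high_risk, actually_reoffended)
    ("black", True,  False),
    ("black", True,  True),
    ("black", True,  False),
    ("black", False, False),
    ("white", True,  True),
    ("white", False, False),
    ("white", False, False),
    ("white", False, True),
]

for group in ("black", "white"):
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    false_positives = [r for r in non_reoffenders if r[1]]
    fpr = len(false_positives) / len(non_reoffenders)
    print(f"{group}: false positive rate = {fpr:.0%}")
# In these toy numbers, black non-reoffenders are wrongly labelled
# high risk far more often than white non-reoffenders.
```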

Because the algorithms were based on facts and data from our world, a world suffering from injustices, hatred and racism, they repeated our mistakes.

Cathy O’Neil often uses the example of Fox News. The channel, she says, has a problem with gender bias. If it were to use algorithms to hire people, it would feed them data about past and current employees and try to single out the data points that successful employees share, in order to find new, potentially successful hires based on that data. The gender bias that has marked Fox for a long time has kept women from succeeding, so the data would point to men as having more potential. It is a self-reinforcing mechanism with very harmful effects.
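A toy simulation of that loop - every probability and label below is invented, not Fox data - shows how biased historical “success” records turn into biased predictions:

```python
# Toy simulation with invented numbers: historical "success" labels
# encode the bias, and a model trained on them inherits it.
import random

random.seed(0)

def past_success(gender):
    # In the biased historical record, equally capable women were far
    # less likely to be recorded as "successful".
    return random.random() < (0.8 if gender == "M" else 0.2)

# Historical employee records the hiring model would be trained on.
history = [
    {"gender": g, "success": past_success(g)}
    for g in (random.choice("MF") for _ in range(200))
]

def train(history):
    # "Model": observed success rate per gender - the data point that
    # appears to bind successful employees together.
    rates = {}
    for g in ("M", "F"):
        group = [h["success"] for h in history if h["gender"] == g]
        rates[g] = sum(group) / len(group)
    return rates

print("predicted potential:", train(history))
# Men score far higher, so mostly men get hired, so the next round of
# training data again shows mostly successful men: the loop closes.
```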

The algorithms automate the status quo.

Joy Buolamwini, who calls herself a poet of code, has gone on to found the Algorithmic Justice League, an organization that fights algorithmic bias. The issue of algorithms as carriers of their creators’ flaws is still flying below the radar, even though people such as Joy Buolamwini and Cathy O’Neil have started to raise awareness. It is a hard fight to take on. Massive market forces are behind it and, as O’Neil, an outspoken part of the Occupy Wall Street movement, puts it – there is a lot of money to be made from division. Algorithms are marketed as science, truth, perfection, but in reality they are just pupils of our past – a past that is far from perfect.

We have to learn to question the algorithms, to check them, to investigate them. Algorithms can be interrogated, and they will tell us the truth every time. As Cathy O’Neil says:

“The era of blind faith in big data must end.”

Idka is a social platform that puts the user, not advertisers, first. The company was created from the idea that who we are and what we believe in is sacred – and not for sale. As a user of Idka, all your data remains yours, and there are no advertisements. If you delete something, it is actually deleted and gone forever – not just hidden from view while still stored as a possession of the service provider.

Idka is a place where you can connect, interact, collaborate, and share ideas and thoughts without turning who you are and what you engage in into a product. Try Idka now: www.idka.com

be social, stay private®
