Algorithms are like invisible judges that decide our fates

This worries me, but it certainly does not surprise me. We are all playing a game where the winners and losers are already selected before they have even decided to play. I am not comfortable with this at all: though at my time in life it's unlikely to affect me directly, it is part of a growing trend that ends in the exclusion of certain people from what is considered normal society. This is a bad thing.

This article titled “Algorithms are like invisible judges that decide our fates” was written by Dave Bry, for theguardian.com on Monday 27th April 2015 09.00 UTC

Imagine that you’re a contestant in an audition round of The Voice, where you belt out your best “I Will Always Love You”. A minute passes. No reaction from the celebrity judges. You keep singing. Another minute, still no encouraging smile or nod. You strain to hit your highest note, pleading with your performance: “Please, please accept me! I am doing my best!” The song ends. No one wants you. Your family bow their heads in shame. Your mom cries. You stand on the stage, alone in the spotlight, heartbroken. A trap door opens beneath your feet and you slide screaming into Adam Levine’s basement torture maze.

Think that’s bad? In the real world, science has come up with something worse. A company called Jobaline offers “voice profiling” to predict job success based on how candidates sound; its algorithm identifies and analyzes more than a thousand vocal characteristics and uses them to categorize job applicants by suitability.
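For the curious, here is a rough, purely illustrative sketch of what such a screen might look like under the hood. The feature names, weights, cutoff and `score_voice` helper are invented for illustration; none of this reflects Jobaline's actual system.

```python
# Hypothetical sketch of a voice-profiling screen, NOT Jobaline's real system.
# A recording is reduced to a vector of acoustic features, a toy model turns
# that vector into a single "suitability" score, and a threshold sorts the
# applicant into a category they never get to see or contest.

from dataclasses import dataclass

@dataclass
class VoiceProfile:
    applicant_id: str
    features: list[float]   # e.g. pitch, jitter, speaking rate... 1,000+ in the real thing

def score_voice(profile: VoiceProfile, weights: list[float]) -> float:
    """Toy linear model: a weighted sum of acoustic features."""
    return sum(f * w for f, w in zip(profile.features, weights))

def categorize(profile: VoiceProfile, weights: list[float], cutoff: float = 0.5) -> str:
    """Map the score to a hiring category."""
    return "advance" if score_voice(profile, weights) >= cutoff else "reject"

# Example with three made-up features instead of a thousand.
weights = [0.4, 0.35, 0.25]
applicant = VoiceProfile("A-1047", [0.8, 0.3, 0.6])
print(categorize(applicant, weights))   # -> "advance", decided entirely by the math
```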

It’s horrible and dehumanizing, like all our other profiling (the racial kind is always a big hit!), reliant on born-in, luck-of-the-genetic-draw factors that we can neither avoid nor control. Regardless of mood or intent, according to NPR’s Aarti Shahani, “your voice has a hidden, complicated architecture with an intrinsic signature – much like a fingerprint”.

This is not the only creepy algorithmic system HR departments have been employing to help the company bottom line. Companies like Wal-Mart and Credit Suisse have been crunching data to predict which employees are “flight risks” likely to quit (easily remedied with a simple anklet attaching the worker to his or her cash register or cubicle) versus those deemed “sticky,” meaning in-it-for-the-long-haul. The information lets bosses either improve morale or get a head start on the search for a replacement.
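Again for illustration only, here is a minimal sketch of the kind of attrition model such data-crunching might involve. The input columns, labels and the use of an off-the-shelf logistic regression are assumptions, not anything Wal-Mart or Credit Suisse has disclosed.

```python
# Illustrative "flight risk" model, not any company's actual code.
# Historical HR records (tenure, pay versus market, time since promotion) are
# used to fit a classifier that estimates each current employee's probability
# of quitting.

from sklearn.linear_model import LogisticRegression

# Made-up training data: [years_of_tenure, pay_vs_market, months_since_promotion]
X_train = [
    [0.5, 0.80, 18],
    [6.0, 1.10, 6],
    [1.0, 0.75, 24],
    [8.0, 1.20, 12],
    [2.0, 0.90, 20],
    [5.0, 1.05, 3],
]
y_train = [1, 0, 1, 0, 1, 0]   # 1 = quit within a year, 0 = stayed

model = LogisticRegression().fit(X_train, y_train)

# Score a current employee: the output is the number HR treats as "flight risk".
current_employee = [[1.5, 0.85, 15]]
risk = model.predict_proba(current_employee)[0][1]
label = "flight risk" if risk > 0.5 else "sticky"
print(f"quit probability: {risk:.2f} -> {label}")
```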

The inventors of such programs often enjoy the unimpeachable, amoral cloak of scientific legitimacy. When it comes to voice profiling, computers are not judging the speakers themselves, only the reactions the speaker’s voice provokes in other (presumably human) listeners. “The algorithm functions as a mechanical judge in a voice-based beauty contest”, wrote Chamorro-Premuzic and Adler in the Harvard Business Review. “Desirable voices are invited to the next round, where they are judged by humans, while undesirable voices are eliminated from the contest”.
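The quote describes a simple two-stage funnel, which might be sketched (with invented names, scores and cutoff) like this:

```python
# Sketch of the two-stage "beauty contest" funnel the HBR quote describes:
# an algorithm eliminates "undesirable" voices, and only the survivors are
# ever heard by a human. Everything here is invented for illustration.

def algorithmic_round(applicants, score, cutoff=0.5):
    """Stage 1: the machine judge. Anyone below the cutoff is silently dropped."""
    return [a for a in applicants if score(a) >= cutoff]

def human_round(shortlist):
    """Stage 2: humans only ever review what the algorithm lets through."""
    for applicant in shortlist:
        print(f"Interviewer listens to {applicant}")

applicants = ["Ana", "Ben", "Chi", "Dev"]
scores = {"Ana": 0.9, "Ben": 0.4, "Chi": 0.7, "Dev": 0.2}
human_round(algorithmic_round(applicants, scores.get))
# Ben and Dev are out before any person hears them sing.
```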

The makers of voice profiling programs tout this as a moral achievement. Human beings bring loads of biases into any evaluation; computers are blissfully unaware of differences in race, gender, sexual preference or age. “That’s the beauty of math!” Jobaline CEO Luis Salazar told NPR. “It’s blind.”

The problem is, when applied in a capitalist system already plagued by unfairness and inhumanity, this blindness sounds really, really dangerous. An impersonal computer program gets first say as to who gets to earn money to buy food and who doesn’t, based on an application of a binary code too subtle and complex for us to understand. Over a thousand factors, analyzed for every vocal sample. Over a thousand ones or zeros clicked in the corresponding click boxes. Who checks for the glitch? Who do you complain to if you think you’re getting a raw deal? Is it just me, or does technology like this simply pass our penchant for prejudice on to the machines who will soon wrest planetary control from our soft, carbon-based hands?

“Hello, I’d like to apply for a job,” the human being says, enunciating as clearly as possible into the phone receiver. “My name is—”

“Disqualified,” says the cold, computerized voice on the other end of the line. “Too squeaky. Perhaps you should seek work in the silent film business.”

guardian.co.uk © Guardian News & Media Limited 2010
