
Biased Artificial Intelligence

April 17, 2017

Computer scientists discover bias in artificial intelligence programs.

Transcript

BOB HIRSHON (host):

Computers that discriminate. I’m Bob Hirshon and this is Science Update.

Machines that can make decisions using artificial intelligence may sound like science fiction, but companies are already using them for such tasks as reviewing job applications. One reason is that machines are presumed to be fair and unbiased. But in the journal Science, Princeton University computer scientist Aylin Caliskan and her colleagues report that language itself contains ethnic, racial, and gender biases, and they found that those biases are acquired by machines.

AYLIN CALISKAN (Princeton University):

If artificial intelligence is making such biased decisions, it would result in perpetuating the biases in society, and this directly leads to unfairness.

HIRSHON:

She says the solution is similar to one used to combat bias in humans: the addition of explicit rules for appropriate conduct that override the prejudice. I’m Bob Hirshon, for AAAS, the science society.

Story by Bob Hirshon
