MIFF: Coded Bias



MIT researcher Joy Buolamwini discovers racial bias in facial recognition software.

"Coded Bias" makes the case that the past lives on within algorithms, because the data they learn from embeds it. At first the premise is surprising, but on reflection it isn't... systemic biases are becoming hardwired. Where we might have expected electronic integrity or machine neutrality, our entrenched social biases are being propagated. Beyond facial recognition, many AI applications are treated as black boxes and escape sufficient scrutiny. Be concerned if you are not male and white, as you may be subject to algorithmic harm. Now you see me, now you don't.

Anne Murphy

MIFF 2020
84 mins