When MIT Media Lab researcher Joy Buolamwini discovers that facial recognition does not see dark-skinned faces accurately, she embarks on a journey to push for the first-ever U.S. legislation against bias in algorithms that impact us all.
- Direction
- Screenplay
- Top cast
- Awards
- 3 wins and 6 nominations total
- Self - Author, Weapons of Math Destruction (as Cathy O'Neil Ph.D.)
- Self - Author, Twitter and Tear Gas (as Zeynep Tufekci Ph.D.)
- Self - Author, Automating Inequality (as Virginia Eubanks Ph.D.)
- Self - Technical Co-Lead, Ethical A.I. Team at Google (as Timnit Gebru Ph.D.)
- Self - Author, Algorithms of Oppression (as Safiya Umoja Noble Ph.D.)
Featured reviews
Fact: The darker one's complexion, the LESS likely it is that there is usable video or photos for investigators or prosecutors.
The makers of this film claim the opposite is the case: they claim there is a bias against persons with darker complexions, when in fact that is not at all what the peer-reviewed research shows.
The execution of this documentary, however, is very underwhelming, to say the least. There are the usual devices: catchy montages, TED-style interviews, news soundbites, and, most annoying of all, artificially created (pun intended) graphics of AI scanning data in a stereotypical digital font, paired with silly sound effects. Unless the primary audience of this documentary is fifth graders, I don't understand why it's necessary to rehash them incessantly. And then there's the unimaginative 'robotic voice.' It's just puerile.
Maybe the producers are wary that people still won't grasp the danger of unregulated AI without these gimmicks. But I'd argue that people would be more alarmed to learn how AI has been infiltrating and affecting our lives in the least expected ways. If the documentary can clearly point out the potential harms, people will naturally find the lack of regulation disturbing; no silly visuals and sound effects are needed. Sometimes I think they actually undermine the severity of the potential danger at hand. For example, the scene where a teenager is mistakenly stopped by plainclothes police would be so much more powerful if everything were just eerily silent, instead of being accompanied by yet another piece of cheesy soundtrack meant to suggest danger.
And the interviews and info could be a lot more insightful - yes, AI is like a black box even to the programmers, but can you explain it in layman's terms so that people get it? Even some short Vox-style YouTube clips have explored these issues in greater depth.
The themes explored are a bit all over the place too. I get that this domain is relatively new, so the vocabulary and focus aren't that streamlined yet, but still... Sometimes the documentary brings up issues of obvious bias, which is consistent with the title, but sometimes we don't even know what the problem is; it's simply an issue of things being completely non-transparent and/or unverified by a third party. The China parts are also a little disjointed from the rest of the documentary, and the country itself is painted in broad strokes - it's as if we can't do good until we can identify the bad guy to feel good about ourselves.
That said, this documentary feels incomplete. It seems to be one-sided, with lots of interviews with people who are against the use of AI.
But while the film-makers do an OK job of highlighting the dangers and inadequacies of AI systems such as facial recognition software, they fail to show what is really behind these glaring blunders - was it some kind of knowing omission meant to create more biases, was it a case of software engineers creating something that they themselves don't understand and thus making a mess of things, or was it simply incompetence on the part of many involved?
Who knows. And that's the problem with this film.
At the same time, purely as a documentary this is kind of weak. It's sometimes a little muddled, and it sometimes stretches a point a bit too far. Some of the things it tries to fold into the narrative are less examples of technological racism and more examples of actual criminal behavior. There's a difference between slippery tech and actions that resulted in people going to jail.
Still, it's a compelling film.
Did you know
- Quotes
Self - Author, Weapons of Math Destruction: On internet advertising as data scientists, we are competing for eyeballs on one hand, but really we're competing for eyeballs of rich people. And then, the poor people, who's competing for their eyeballs? Predatory industries. So payday lenders, or for-profit colleges, or Caesars Palace. Like, really predatory crap.
- Connections: Featured in Jeremy Vine: Episode #4.95 (2021)
Top picks
- How long is Coded Bias?
Details
- Release date
- Country of origin
- Official sites
- Language
- Also known as
- Kodlanmış Önyargı
- Filming locations
- Production companies
Box office
- Gross US & Canada
- $10,236
- Opening weekend US & Canada
- $10,236
- November 15, 2020
- Gross worldwide
- $10,236
- Runtime: 1 hour 26 minutes
- Color