When Joy Buolamwini discovers that facial recognition does not accurately detect darker-skinned faces, she embarks on a journey to push for the first-ever US legislation against bias in algorithms.
- Awards
- 3 wins & 6 nominations total
- Self - Author, Weapons of Math Destruction
- (as Cathy O'Neil Ph.D.)
- Self - Author, Twitter and Tear Gas
- (as Zeynep Tufekci Ph.D.)
- Self - Author, Automating Inequality
- (as Virginia Eubanks Ph.D.)
- Self - Technical Co-Lead, Ethical A.I. Team at Google
- (as Timnit Gebru Ph.D.)
- Self - Author, Algorithms of Oppression
- (as Safiya Umoja Noble Ph.D.)
Featured reviews
Fact: The darker one's complexion, the LESS likely it is that there is usable video or photos for investigators or prosecutors.
The makers of this film claim the opposite is the case; they claim there is a bias against persons with darker complexions -- when in fact that is not at all what the peer-reviewed research shows.
At the same time, purely as a documentary this is kind of weak. It's sometimes a little muddled, and it sometimes stretches a point a bit too far. Some of the things it tries to fold into the narrative are less examples of technological racism and more examples of actual criminal behavior. There's a difference between slippery tech and actions that resulted in people going to jail.
Still, it's a compelling film.
That said, this documentary feels incomplete. It seems to be one-sided, with lots of interviews with people who are against the use of AI.
But while the film-makers do an ok job of highlighting the dangers and inadequacies of AI systems such as facial recognition software, they failed to show what really is behind these glaring blunders - was it some kind of knowing omission meant to create more biases, was it a case of software engineers creating something that they themselves don't understand and thus making a mess of things, or was it simply incompetence on the part of many involved?
Who knows. And that's the problem with this film.
It's minor, but I wasn't sure why some interviewees were framed with so much headroom. Then again, Mr Robot did that and I never understood it there, and given that that was also concerned with themes of technology and surveillance, maybe there's some shared symbolism I'm not picking up on.
Some of the segments with the AI saying menacing things were a little cheesy, but overall brief at least.
There's also a sense that the documentary may cover a little too much in its 85-minute runtime. While I can admire its ambition in covering so many aspects of facial recognition software, its racial biases, algorithm discrimination, and so on, it does make for a documentary that jumps around a fair bit and not always smoothly... at least all the topics are interesting on their own.
But in the end, it covers important topics and presents compelling arguments about particular flaws and biases in technology. It does warn that this is something that if unchecked could become a serious problem, so I like the attempt to bring awareness to this issue before it completely spirals out of control.
It's well edited, features interesting interviewees and subjects, and ends on a little more hope than I was anticipating, which was a nice surprise. Overall, it's one of the better Netflix documentaries I've watched in a while.
The execution of this documentary, however, is very underwhelming, to say the least. There are the usuals: catchy montages, TED-style interviews, news soundbites, and the most annoying of all - artificially created (pun intended) graphics of AI scanning data in a stereotypical digital font, paired with silly sound effects that, unless the primary audience of this documentary is fifth graders, I don't see why it's necessary to rehash incessantly. And then there's the unimaginative 'robotic voice.' It's just puerile.
Maybe the producers are wary that people still won't get the danger of unregulated AI without these gimmicks. But I'd argue that people would be more alarmed to learn how AI has been infiltrating and affecting our lives in the least expected ways. If the documentary can clearly point out the potential harms as a consequence, I think people will naturally find the lack of regulation disturbing; no silly visuals and sound effects are needed. Sometimes I think they actually undermine the severity of the potential danger at hand. For example, in the scene where a teenager is mistakenly stopped by plainclothes police, instead of yet another piece of cheesy soundtrack meant to suggest danger, it would be so much more powerful if everything were just eerily silent.
And the interviews and info - yes, AI is like a black box even to the programmers, but can you explain it in layman's terms so that people get it? - could be a lot more insightful. Even some short Vox-style YouTube clips have explored these issues in greater depth.
The themes explored are a bit all over the place too. I get it, this domain is relatively new, so the vocabulary and focus aren't that streamlined yet, still... Sometimes the documentary brings up issues of obvious biases, which is consistent with the title, but sometimes we don't even know what the problem is; it's simply an issue of things being completely nontransparent and/or unverified by a third party. The China parts are also a little disjointed from the rest of the documentary, and the country itself is painted in broad strokes - it's as if we can't do good until we can identify the bad guy to feel good about ourselves.
Did you know
- Quotes
Self - Author, Weapons of Math Destruction: On internet advertising as data scientists, we are competing for eyeballs on one hand, but really we're competing for eyeballs of rich people. And then, the poor people, who's competing for their eyeballs? Predatory industries. So payday lenders, or for-profit colleges, or Caesars Palace. Like, really predatory crap.
- Connections: Featured in Jeremy Vine: Episode #4.95 (2021)
Top picks
- How long is Coded Bias?
Details
- Release date
- Countries of origin
- Official sites
- Language
- Also known as
- Kodlanmış Önyargı
- Filming locations
- Production companies
Box Office
- Gross US & Canada
- $10,236
- Opening weekend US & Canada
- $10,236
- Nov 15, 2020
- Gross worldwide
- $10,236
- Runtime: 1 hour 26 minutes
- Color