I wish they had put as much effort into their research as they put into their production. Most of all, this just felt like a propaganda piece.
There were too many unsubstantiated slippery-slope arguments and demands for oversight, but they never bothered diving into the details. I am sure there are issues with algorithms, AI, government surveillance, etc., but they rarely, if ever, explained them.
Coded Bias
2020
Action / Documentary
Plot summary
When MIT Media Lab researcher Joy Buolamwini discovers that facial recognition does not see dark-skinned faces accurately, she embarks on a journey to push for the first-ever U.S. legislation against bias in algorithms that impact us all.
Movie Reviews
Disappointing
Biased
I think the documentary itself is biased. It makes everyone working in this field look bad.
Wrong title, no focus, very little actual information, everyone trying to get their 15 minutes
The matter: Yes, it is relevant. Yes, there should be oversight.
Documentary: Garbage. Here is why: 1. The title is "Coded Bias": Everyone in this documentary seems to agree that even programmers do not know why they are getting the results they get. Then where is the *coded* part?
2. AI and ML are *not* the same, even though they seem to have been used interchangeably.
3. Where are the programmers they keep referring to? They could not get a *single* programmer to give even an opinion? We are a dime a dozen.
4. Everyone seems to be trying to get part of the credit. Good for them. But don't make it that obvious. It's ugly.
5. The documentary came out in November 2020. The bans started in June 2020. It seems they knew that the sooner they got it out, the more bang for the buck they would get.
6. The senator, whoever he was, put very distinct blame on programmers. Contrary to popular belief, programmers are *not* in absolute control of a project (even in non-AI/ML projects).
There are way too many people involved.
7. In every project, programmers seldom have all the samples; that would invalidate the testing. Instead, there is a separate department.
8. In cases like this, there are two scenarios: a. Someone who is *in charge* of the project, typically a product/project manager, *provides* the dataset.
b. If the data is sensitive, as it should be in this case, you only get a very small subset. Actual fixing happens based on *defects reported by customers*.
9. If programmers had access to all those sensitive records, then you have a bigger problem than facial recognition. Probably one or two people would have access to a subset of the inputs. But we will never know, because you never interviewed any.
10. At any point, *management* has absolute authority to override a programmer's decision. After all, it is still a job.
11. Programmers do not care about the data, not most of the time anyway. Data is only important as part of testing. It's the *users* of that data who are interested in the what/how/who/where.
12. Programmers, just like those in any other field, are conscientious in about the same percentage. They know that unethical use of data is commonplace *and* an open issue. But when you were at the crossroads, just as with likes/dislikes/upvotes/downvotes, you took two seconds to decide that it was more important for you to be lazy (remember when you could book a taxi without giving almost any details?).
13. This is a USA-centric documentary. Nothing wrong with that, but apart from the UK and China, other countries did not exist for the filmmakers, I suppose. And then they introduce a speaker with the controversial opinion "China is outspoken about it". Yes, bravo.
14. It's a feel-good documentary. You will get riled up, but that's about it. It won't give you any actual information. The documentary seemed more interested in getting a greater number of people to talk than in going deeper.
Watch it if you want to get all emotional, but please leave programmers out of it, unless you include their interviews (we are good at those).