- cross-posted to:
- [email protected]
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis::Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.
I mean “taking pictures of people who are smiling” is definitely a bias in our culture. How we collectively choose to record information is part of how we encode human biases.
I get what you’re saying in specific circumstances. Sure, a dataset built from a single source doesn’t make its biases universal. But these models were trained on a very wide range of sources — wide enough to cover much of the data our culture is built around.