
Since the creation of the camera, photography has been technologically optimized to capture white people best. Engineers at Google are trying to change that.

At its developer conference, Google I/O, on Tuesday, the company announced that it is reworking the algorithms and tweaking the training data that power the Pixel camera so it captures people of color more accurately and vibrantly.

Specifically, it is working to better light people with darker skin and more accurately represent skin tone. Also, silhouettes of people with wavy or curly hair will stand out more sharply from the background.


Google isn't the only company having a technological reckoning with racial bias. Just last month, Snap announced it was reworking its camera software to better represent people of color.

Google is calling its project "Image Equity." Like Snap, the company worked with outside experts in photography and representation to guide the undertaking.

Some of the changes will involve training the algorithms that render the photos on a more diverse dataset, so white people and white skin aren't the default definition of "person." Google will also be tweaking the Pixel's auto white-balance and auto-exposure capabilities to better optimize for people with darker skin.

Related Video: Everything you need to know from Google I/O 2021

Topics: Activism, Cameras, Racial Justice
