Pictures are a big part of how we see each other and the world around us, and historically, racial bias in camera technology has meant that people of color were overlooked and excluded. That same bias can carry through to our modern imaging tools if they aren’t tested with a diverse group of people and inputs, delivering unfair experiences for people of color, like over-brightening or unnaturally desaturating skin. We acknowledge that Google has struggled in this area in the past, and we are committed to continuing to improve our products accordingly. As part of Google’s Product Inclusion and Equity efforts, our teams are on a mission to build camera and imaging products that work equitably for all people, so that everyone feels seen, no matter their skin tone.
Pixel 6: A more equitable camera
Building better tools for a community works best when they’re built with the community. For the new Pixel 6 Camera, we partnered with a diverse range of renowned image makers who are celebrated for their beautiful and accurate depictions of communities of color—including Kira Kelly, Deun Ivory, Adrienne Raquel, Kristian Mercado, Zuly Garcia, Shayan Asgharnia, Natacha Ikoli and more—to help our teams understand where we needed to do better. With their help, we’ve significantly increased the number of portraits of people of color in the image datasets that train our camera models. Their feedback helped us make the key improvements across our face detection, camera and editing products that we call Real Tone.
Let’s take a deeper look at how we approached these improvements:
- In computational photography, making a great portrait depends on the camera’s ability to detect a face. We radically diversified the images that train our face detector to “see” more diverse faces in a wider array of lighting conditions.
- Auto-white balance models help determine color in a picture. Our partners helped us make better decisions about how to render the nuances of skin for people of color.
- Auto-exposure models help determine the brightness of an image. Feedback from our experts helped us ensure that our camera shows you as you are — not unnaturally darker or brighter.
- Our teams noticed that stray light had a tendency to disproportionately wash out darker skin tones, so we developed and implemented an algorithm to reduce its effect in our images.
- Blurriness in portraits is a consistent concern for people with darker skin tones, so our teams used the Tensor chip’s processing power to make our portraits sharper through motion metering, even in low light conditions.
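To give a rough sense of what an auto-white-balance model estimates, here is a minimal sketch of a classical baseline, the gray-world algorithm, which scales each color channel so its average matches the image’s overall average. This is only an illustrative textbook technique, not Pixel’s actual model; Real Tone’s improvements come from learned models trained on more diverse image data, which this simple heuristic cannot capture.

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Gray-world white balance on an H x W x 3 float image in [0, 1].

    Assumes the average scene color should be neutral gray, so each
    channel is scaled by (overall mean / channel mean).
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)  # mean of R, G, B
    gray = channel_means.mean()                        # target neutral level
    gains = gray / channel_means                       # per-channel correction
    return np.clip(image * gains, 0.0, 1.0)

# A synthetic flat image with a warm (red-heavy) color cast:
img = np.full((4, 4, 3), [0.6, 0.4, 0.3])
balanced = gray_world_white_balance(img)
```

After balancing, the three channel means are equal, removing the cast. The gray-world assumption breaks down exactly in the cases the post describes: it treats skin tone as a cast to be neutralized, which is one reason naive color models can mis-render darker skin and why diverse training data matters.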
It was important to us that our adjustments resonated with our collaborators as well, and we’re proud that in a device-agnostic survey comparing top smartphone cameras, they rated Pixel 6’s rendering of skin tone, brightness, depth and detail as best for people of color.
Google Photos: More nuanced auto enhancements
Our partners’ expertise also helped our teams improve Google Photos’ popular auto enhance feature, so you can achieve a beautiful, representative photo regardless of when you took the photo, or which device you used. The updated auto enhance is designed to improve your picture’s color and lighting with just a tap, and works well across skin tones. It will roll out in Google Photos across Android and iOS devices in the coming weeks.
A mission, not a moment
We’re committed to building a more equitable experience across all of our camera and image products. To improve the visibility of meeting participants, we recently launched automatic lighting adjustments in Google Meet, and tested them across a range of skin tones to ensure they work well for everyone. And our Research teams are identifying more inclusive ways to handle skin tone in AI systems, both in Google products and across the industry. We’ll continue to partner with experts, listen to feedback and invest in tools and experiences that work for everyone. Because everyone deserves to be seen as they are.
Learn more about our efforts on Real Tone at g.co/pixel/realtone.