Google’s multimodal generative Gemini AI is capable of creating a multitude of outputs, including images. After the tool was caught repeatedly creating inaccurate images of historical scenes, Google has now decided to pull the emergency brake, temporarily pausing Gemini’s ability to create images of people altogether.
As Google announced on X, the company is aware of issues with the AI’s image generation capabilities and is working on a fix for the problems. It states, “While we do this, we’re going to pause the image generation of people and will re-release an improved version soon.”

The statement comes shortly after controversy around Gemini creating inaccurate historical images, including sets of images depicting racially diverse Nazi-era German soldiers and US senators from the 1800s. Without referencing specific examples or problems, the company made clear in an earlier announcement that it has good reasons for trying to generate diverse sets of people in ordinary circumstances: “Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”
The controversy around Gemini’s apparent bias toward racially diverse image generation seems to be driven at least partly by right-wing figures. Before the images of the Nazi soldiers or historically incorrect US senators, the discussion revolved around Gemini supposedly creating too-diverse images for queries like “American woman” or “Swedish woman.” As The Verge notes, none of the depicted people are real in the first place, and neither America nor Sweden is exclusively inhabited by people of a specific appearance, making the generated images as probable and plausible as any others.
With Google’s statement in mind, it’s clear that the company is concerned with fixing historically inaccurate images rather than removing diversity from its image generation tools. After all, historically inaccurate images could end up achieving the exact opposite of what Google’s diverse approach is meant to do. They have the potential to erase historic injustice and underrepresentation, suggesting that marginalized groups weren’t actually marginalized.