From Nano Banana to vintage sarees, Gemini's AI image features must be used with caution

Thursday 18 September 2025 12:16 AM IST

Social media feeds are filled with men styled like MGR, women in 1990s Bollywood sarees, and tiny 3D figurines of celebrities. The latest image model in Gemini, Google's AI assistant, is changing the way young people present themselves online. However, it is also raising security concerns.

The Nano Banana trend uses the Gemini 2.5 Flash Image model to create miniature, figurine-style pictures of movie stars and politicians. At first glance they look like real figures made of plastic or wood, and such nano images exist of everyone from Mohanlal to the US President.

The vintage saree look is a bit more dangerous. There is no need to spend hours on makeup or to buy expensive clothes: the desired costume and look are ready in a single click. Just log in to a Gemini account, upload an ordinary photo and type the prompt (command) describing the new look. Even small moles on the face are preserved; one woman recently came forward alleging that Gemini reproduced moles that were not even visible in the photo she uploaded. Images of women and children can be misused through such features, and fake images can be created for honey traps.
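To give a sense of how little effort such edits take, here is a minimal sketch of how a photo and a prompt might be sent to an image-capable Gemini model through Google's Python SDK (google-genai). The model name, prompt and file names are illustrative assumptions, not a verified recipe, and the service, its terms and its API may change.

    from io import BytesIO
    from PIL import Image
    from google import genai

    # Assumes an API key is available; the model name below is an assumption
    # based on the publicly previewed "Gemini 2.5 Flash Image" model.
    client = genai.Client(api_key="YOUR_API_KEY")
    portrait = Image.open("ordinary_photo.jpg")
    prompt = "Restyle this portrait in a 1990s vintage saree look, keeping the face unchanged"

    response = client.models.generate_content(
        model="gemini-2.5-flash-image-preview",
        contents=[prompt, portrait],
    )

    # The response can mix text and image parts; save any returned image.
    for part in response.candidates[0].content.parts:
        if part.inline_data is not None:
            Image.open(BytesIO(part.inline_data.data)).save("styled_photo.png")

The point is not the code itself but the asymmetry it illustrates: a single photo and one line of text are enough to produce a convincing restyled image of a real person.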

Concerns about security

Many people are jumping on board without reading the terms of service. Google has not specified how long an uploaded image is kept or when it is deleted. Gemini can also collect the user's location and other information from the phone. That information may be used to train AI language models and to build a profile of the user. For example, if a woman asks Gemini to create a picture of her in a red sari, advertisements for red saris may appear on her social media feeds in the following days. There is also the possibility of impersonation when such images are combined with deepfakes.

Things to note

  • Read and understand the terms of service of AI models before uploading photos
  • Creators should watermark AI-generated images to distinguish them from real ones (a minimal sketch follows this list)
  • Avoid using images of children
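A minimal watermarking sketch in Python, assuming the widely used Pillow library; the label text, placement, opacity and file names are arbitrary illustrative choices, not a standard.

    from PIL import Image, ImageDraw, ImageFont

    def add_watermark(src_path, dst_path, text="AI-generated"):
        # Stamp a semi-transparent text label in the lower-right corner.
        base = Image.open(src_path).convert("RGBA")
        overlay = Image.new("RGBA", base.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(overlay)
        font = ImageFont.load_default()  # swap in a TTF font for larger images
        # Measure the label and place it with a small margin from the corner.
        left, top, right, bottom = draw.textbbox((0, 0), text, font=font)
        width, height = right - left, bottom - top
        margin = 10
        position = (base.width - width - margin, base.height - height - margin)
        draw.text(position, text, font=font, fill=(255, 255, 255, 160))
        Image.alpha_composite(base, overlay).convert("RGB").save(dst_path)

    # Example: label an AI-styled photo before sharing it.
    add_watermark("styled_photo.png", "styled_photo_watermarked.png")

A visible label like this is easy to crop out, so it complements rather than replaces provenance measures such as invisible watermarks or content credentials embedded by the generating service.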