
The “Banana AI Saree Trend”, which converts ordinary photos into dramatic 1990s Bollywood-style edits, has created quite a buzz on Instagram. It seems almost everyone is trying the AI-powered fashion trend, posting viral edits of chiffon sarees billowing in the wind under warm golden-hour lighting. However, an Instagram user named Jhalakbhawani found something “creepy” while trying the trend, which prompted her to document the experience on Instagram.
What did the user find “creepy”?
The woman shared that she had tried generating her own image and found something “scary”. “There’s a trend going viral on Instagram where you upload your picture to Gemini with a single prompt, and Gemini converts it into a saree portrait. Last night I tried this trend and found something very worrying,” she wrote.
She shared the picture she had uploaded to Gemini, showing her in a green full-sleeve suit, along with the prompt she used. The result Gemini produced, however, shocked her.
Also read | Google reveals 10 Nano Banana styles you can create with Gemini
She said, “I found this picture very attractive and even posted it on my Instagram. But then I noticed something odd: the mole on my left hand in the generated picture is one I actually have in real life.”
She also asked, “How did Gemini know I have a mole on this part of my body? You can see the mole, and it’s very scary. I’m still not sure how it happened, but I wanted to share it with everyone.”
Her post drew a number of responses, with some users raising security fears around the trend, while others called it “normal” and suggested she make videos to gather more opinions.
Also read | 15 vintage retro-style AI portraits to capture the viral Instagram trend
Read | Gemini AI saree photos: The best prompts to get vintage Bollywood vibes
How safe is Gemini’s Nano Banana tool?
While tech giants such as Google and OpenAI offer tools to protect user content, experts say that safety ultimately depends on personal practices and the intentions of those who access the images. For example, images generated with Google’s Nano Banana carry an invisible digital watermark called SynthID, along with metadata tags, designed “to clearly identify them as AI-generated”, according to aistudio.google.com. The watermark, although invisible to the naked eye, can be detected by specialized tools that verify an image’s AI origin, says spielcreative.com.
Can a watermark really prevent abuse?
However, the detection tool is not yet publicly available, which means most viewers cannot confirm an image’s authenticity, notes Tatler Asia. Critics also warn that watermarks can be easily forged or removed. “No one thinks watermarking alone will be sufficient,” said Hany Farid, a professor at UC Berkeley, while Ben Colman, CEO of Reality Defender, added that in the real world it often “fails from the start”. Experts suggest combining watermarking with other technologies to better combat convincing deepfakes.