Experts are raising alarms over significant privacy risks as viral AI-generated action figure images flood the internet.
The AI action figure image trend is flooding social media platforms
The viral AI action figure image trend has taken over Instagram, TikTok, and even LinkedIn, with users sharing AI-generated versions of themselves as figurines.
These AI-generated avatars often show people holding laptops, drinks, or in humorous poses, capturing attention and likes across the internet.
However, experts are now warning that this seemingly fun and harmless trend carries real **privacy concerns**.
Security specialists caution that the fad could expose users to serious risks through the sensitive information embedded in the images they upload.
Before uploading your selfie to generate an AI figure, it’s worth understanding what data you’re actually handing over.
What you really share when you upload an image to AI tools like ChatGPT
Tom Vazdar, cybersecurity expert at the Open Institute of Technology, explained the hidden dangers in a recent interview with Wired.
According to Vazdar, every image shared with platforms like ChatGPT includes more than just a face — it shares metadata.
That metadata may contain the time, date, and even the **GPS location** where the photo was taken.
He said: “That includes the EXIF data attached to the image file, such as the time the photo was taken and the GPS coordinates of where it was shot.
“Because platforms like ChatGPT operate conversationally, there’s also behavioural data, such as what you typed, what kind of images you asked for, how you interacted with the interface and the frequency of those actions.”
This embedded EXIF data is written automatically by most cameras and smartphones, and it can be read from a file without the user ever knowing.
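You can see this for yourself by inspecting a photo's metadata in a few lines of code. Below is a minimal sketch using the Python Pillow library; the filename is a placeholder, and what actually appears depends on the device and app that produced the file.

```python
# Minimal sketch: reading EXIF metadata with Pillow (pip install Pillow).
# "photo.jpg" is a placeholder; the output depends on the device that took the shot.
from PIL import Image
from PIL.ExifTags import TAGS, GPSTAGS

img = Image.open("photo.jpg")
exif = img.getexif()

# Top-level tags: capture time, camera make and model, software, and so on.
for tag_id, value in exif.items():
    print(TAGS.get(tag_id, tag_id), value)

# GPS coordinates live in a nested block (the GPSInfo IFD, tag 0x8825).
gps = exif.get_ifd(0x8825)
for tag_id, value in gps.items():
    print(GPSTAGS.get(tag_id, tag_id), value)
```

If the GPS block prints latitude and longitude values, anyone who receives that file can pin the photo to a place and time.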
In addition to image data, AI platforms track your behavior, inputs, and interactions while generating these AI figures.
Vazdar emphasized how these behavioral patterns build detailed profiles of users, including how often and in what ways they use the platform.
AI tools log what images you request, how long you engage, and what prompts you use during these interactions.
This gives companies access not just to a face, but to contextual surroundings, habits, and the emotional expressions captured in photos.
Images may even reveal other individuals or private environments shown in the background, raising additional privacy concerns.
AI trends could be used to build massive facial recognition datasets
Vazdar believes the AI action figure image trend may be fueling something far bigger behind the scenes.
He said the trend may serve as a **convenient opportunity to collect facial recognition data** on a massive scale.
With millions of people uploading selfies, companies acquire facial data spanning different countries, age groups, and ethnicities.
Vazdar concluded: “This trend, whether by design or a convenient opportunity, is providing the company with massive volumes of fresh, high-quality facial data from diverse age groups, ethnicities, and geographies.”
This rich, high-quality facial data could later be used for training surveillance systems, security tools, or commercial facial recognition AI.
While not all data collection is malicious, the volume and variety of data being gathered is concerning.
As AI becomes a part of everyday life, such trends normalize giving away sensitive information without much thought.
Think twice before turning yourself into an AI action figure
It’s tempting to join viral AI trends, especially when everyone on your feed is doing it.
But handing over your photo, habits, and data to an AI platform comes with long-term consequences. Once shared, your image and metadata may never be fully deleted or controlled again.
The AI might “remember” you even if the app no longer displays the figure you created.
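If you decide to join in anyway, one low-cost precaution is to strip the metadata before uploading. Here is a minimal sketch, again using Pillow with placeholder filenames: re-saving only the pixel data produces a copy that carries no EXIF block. Note that this does nothing about the face itself or the behavioural data a platform logs once you start interacting with it.

```python
# Minimal sketch: stripping EXIF metadata with Pillow before sharing a photo.
# Filenames are placeholders; only pixel data is copied, so no EXIF survives.
from PIL import Image

original = Image.open("photo.jpg")

# Create a blank image of the same size and mode, then copy the pixels across.
clean = Image.new(original.mode, original.size)
clean.putdata(list(original.getdata()))
clean.save("photo_no_exif.jpg")
```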
This concern echoes larger issues about AI transparency, data ownership, and digital consent in a connected world.
As funny as these figurines look, you may not want to star in your own real-life Black Mirror episode.