That Action Figure Trend Could Be Putting Your Cybersecurity At Risk – Here's What To Know
The author created an AI doll version of herself using ChatGPT.
If you’re on social media, it’s highly likely you’re seeing your friends, celebrities and favorite brands transforming themselves into action figures through ChatGPT prompts.
That’s because artificial intelligence chatbots like ChatGPT are no longer just for generating ideas about what to write; they can now create realistic doll images, too.
Once you upload a photo of yourself and ask ChatGPT to make an action figure with accessories based on it, the tool will generate a plastic-doll version of you that looks like a toy still in its box.
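For the technically curious, the same idea can be sketched against OpenAI’s developer API rather than the ChatGPT app. The short Python example below is an illustration under assumptions, not the mechanism ChatGPT itself uses; the model name (gpt-image-1) and the file names are placeholders.

import base64
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Ask the Images API to restyle an uploaded photo as a boxed action figure.
with open("selfie.png", "rb") as photo:  # placeholder file name
    result = client.images.edit(
        model="gpt-image-1",  # assumed image model; availability may vary
        image=photo,
        prompt=(
            "Turn the person in this photo into a plastic action figure "
            "sealed in retail blister packaging, with themed accessories."
        ),
    )

# The API returns the generated image as base64-encoded bytes.
with open("action_figure.png", "wb") as out:
    out.write(base64.b64decode(result.data[0].b64_json))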
While the AI action figure trend first got popular on LinkedIn, it has gone viral across social media platforms. Actor Brooke Shields, for example, recently posted an image of an action figure version of herself on Instagram that came with a needlepoint kit, shampoo and a ticket to Broadway.
People in favor of the trend say, “It’s fun, free, and super easy!” But before you share your own action figure for all to see, you should consider these data privacy risks, experts say.
One potential con? Sharing so much about your interests makes you an easier target for hackers.
The more you share with ChatGPT, the more realistic your action figure “starter pack” becomes — and that can be the biggest immediate privacy risk if you share it on social media.
In my own prompt, I uploaded a photo of myself and asked ChatGPT to …
