Women sue the men who used their Instagram feeds to create AI porn influencers


About a year ago, MG was a twenty-year-old living a relatively normal life in Scottsdale, Arizona. She worked as a personal assistant and supplemented her income by waiting tables on weekends. Like most women her age, she had an Instagram account, where she would occasionally post Stories and photos of herself getting matcha, hanging out by the pool with friends, or going to Pilates.

“I never cared about blowing up on social media or being popular,” says MG (who is identified only by her initials in the lawsuit to protect her identity). “I used it the way most people did when it first came out, to share my life with the people closest to me.” She has a little over 9,000 followers – a healthy following, but nowhere near a major platform.

Last summer, she received a DM from one of her followers. Did she know, the person asked, that pictures and videos of a woman who looked exactly like MG were circulating on Instagram? MG clicked the link and watched several Reels in which her face had been superimposed on a body that looked nearly identical to her own. The woman in the videos was wearing very little clothing and had tattoos in the same places as MG.

MG was horrified. “If you didn’t know me well, you could easily think they were images of me,” she says. “It was this reality check that I have no control over my own image.”

She was even more astonished to learn that nude and scantily clad images of her lookalike were not only circulating on the Internet, as she alleges in a recently filed complaint — they were also being used to advertise AI ModelForge, a platform that teaches men to generate their own AI influencers. In a series of online classes and tutorials, the men behind the platform allegedly taught customers how to use software called CreatorCore to train an AI model on photographs of unsuspecting young women and post the resulting content to Instagram and TikTok.
