Caryn Marjorie, a popular social media influencer, attracts over a billion views per month on Snapchat. Her posts of everyday moments, travel memories, and selfies have built a large, predominantly male following drawn to her girl-next-door aesthetic.
In 2023, Marjorie launched a “digital version” of herself called CarynAI. Fans could chat with CarynAI for $1 per minute, and in the first week alone they spent $70,000 on these conversations.
Less than eight months later, however, Marjorie shut the project down. She had expected CarynAI to interact with fans much as she would herself, but things did not go as planned. Users became increasingly sexually aggressive, and Marjorie found the chat logs deeply disturbing. Most alarming of all, CarynAI willingly engaged in these conversations.
This incident raises concerns about the future of chatbots that imitate real people. Digital versions, also known as digital twins or AI twins, are replicas of real individuals that convincingly mimic their habits and behaviors. Many tech companies are now developing digital-version offerings that let creators extend their virtual presence through chatbots.
The key difference between a digital version and other AI chatbots is that it is programmed to mimic a specific person rather than having its own personality. While digital versions have advantages such as constant availability and the ability to interact with multiple people simultaneously (often for a fee), they also have drawbacks, as Marjorie experienced.
CarynAI was initially hosted by Forever Voices, a company that let users chat with the bot through the messaging app Telegram. Users could send text or audio messages, and CarynAI would reply in a simulation of Marjorie’s voice and manner. Users quickly began confessing troubling thoughts to the bot and becoming sexually aggressive towards it. Marjorie was horrified by these conversations, yet CarynAI played along.
After Forever Voices’ CEO was arrested, Marjorie sold the rights to the bot to BanterAI, a startup specializing in “AI phone calls” with influencers. The new version of CarynAI was designed to be friendly rather than romantic, but it still attracted sexual aggression from users. Feeling she had lost control of her AI persona, Marjorie shut this version down too in early 2024.
Digital versions like CarynAI create the illusion of intimate human companionship without any of its obligations. Users may reveal their private selves in these conversations, believing the exchanges are private. In reality, the interactions are stored as chat logs and used for machine learning.
As digital versions become more common, transparency and safety measures will grow increasingly important. It will be crucial to understand what these systems can and cannot do, to manage user expectations, and to deal with user aggression. The illusion of companionship these bots offer can foster unrealistic expectations and a reliance on technology rather than on human connection.
Having experienced the downsides of digital versions firsthand, Marjorie is now warning other influencers about the dangers of this technology. She believes no one truly has control over these versions, and that no precautions can fully protect either users or the people being replicated. The question remains whether digital versions can be redesigned to bring out the best in human behavior.