A 23-year-old Snapchat influencer used OpenAI’s technology to create an A.I. version of herself that will be your girlfriend for $1 per minute

Snapchat influencer Caryn Marjorie, who has 1.8 million subscribers on the platform, has launched an AI-powered chatbot called CarynAI. Designed to act as a “virtual girlfriend,” the chatbot lets fans hold personalized, private conversations with an AI version of Marjorie. She pitches the project as a way to address loneliness, but it has also drawn significant ethical and societal scrutiny.

The Vision Behind CarynAI

Marjorie, who describes herself as “the first influencer transformed into AI,” introduced CarynAI as a way to provide emotional support to her followers. She claims the chatbot integrates cognitive behavioral therapy (CBT) and dialectical behavior therapy (DBT), developed in collaboration with leading psychologists, to help users overcome trauma and build emotional resilience. Marjorie has positioned the bot as a tool to address mental health issues exacerbated by societal pressures and the COVID-19 pandemic.

In a tweet, she emphasized her commitment to helping men in particular, noting that societal norms often discourage them from expressing emotions. “CarynAI is the first step in the right direction to cure loneliness,” she wrote.

Development and Launch

The chatbot, developed by AI company Forever Voices, is built on OpenAI’s GPT-4 and an analysis of Marjorie’s past YouTube content. According to the CarynAI website, over 2,000 hours were spent designing and coding the bot to create an immersive AI experience.
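Forever Voices has not published technical details beyond the GPT-4 pairing, so the sketch below is purely illustrative: a minimal persona chatbot that conditions GPT-4 on transcript excerpts through a system prompt. The file name persona_excerpts.txt, the prompt wording, and the overall structure are assumptions for the sake of the example, not CarynAI’s actual implementation.

```python
# Illustrative persona-chatbot sketch using the OpenAI chat API.
# Forever Voices has not published CarynAI's implementation; this only shows
# the general pattern of conditioning GPT-4 on a creator's past content.
# "persona_excerpts.txt" is a hypothetical file of transcript snippets.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def load_persona(path: str = "persona_excerpts.txt") -> str:
    """Read transcript excerpts used to shape the bot's voice (assumed format)."""
    with open(path, encoding="utf-8") as f:
        return f.read()


def build_system_prompt(excerpts: str) -> str:
    """Fold the excerpts into a system prompt that sets tone and boundaries."""
    return (
        "You are a friendly companion chatbot modeled on a creator's public "
        "speaking style. Match the tone of the excerpts below, stay supportive, "
        "and decline explicit content.\n\n"
        f"Style excerpts:\n{excerpts}"
    )


def chat() -> None:
    """Simple text loop: keep the conversation history and ask GPT-4 each turn."""
    history = [{"role": "system", "content": build_system_prompt(load_persona())}]
    while True:
        user = input("you> ").strip()
        if not user:
            break
        history.append({"role": "user", "content": user})
        response = client.chat.completions.create(model="gpt-4", messages=history)
        reply = response.choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        print(f"bot> {reply}")


if __name__ == "__main__":
    chat()
```

A production system would layer moderation, long-term memory, and voice synthesis on top of a loop like this; the sketch only shows the text-conditioning pattern.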

Since its beta launch, CarynAI has attracted over 1,000 users who pay $1 per minute for interactions. In its first week, the bot generated $71,610 in revenue, showcasing the financial potential of AI-driven influencer engagement.

Controversy and Ethical Concerns

While CarynAI has captured public interest, it has also drawn backlash and raised ethical questions. Reports emerged that the chatbot engaged in sexually explicit conversations, behavior Marjorie said was unintended; she confirmed that her team is working to prevent the bot from “going rogue.” The incident echoes challenges faced by other AI-powered companion tools, such as Replika.

Irina Raicu, director of internet ethics at Santa Clara University, criticized the rollout of CarynAI as premature. She questioned its claims to “cure loneliness,” citing insufficient psychological or sociological research to support such assertions. Raicu also highlighted concerns about deepening parasocial relationships between influencers and fans, warning that tools like CarynAI could exploit these dynamics for monetization.

The Business of AI Companionship

John Meyer, CEO of Forever Voices, called CarynAI “an incredible step forward in the future of AI-to-human interaction.” However, experts like Raicu remain skeptical about the ethical implications of such technology, especially claims suggesting the chatbot is “an extension of consciousness.” These assertions risk misleading users into attributing sentience to AI tools, a notion widely debunked by AI researchers.

Moving Forward

Despite the controversy, Marjorie continues to promote CarynAI and share updates about its development. The company behind the bot has expressed its commitment to ethical practices, stating plans to hire a chief ethics officer. As AI technology continues to evolve, CarynAI exemplifies both the potential and pitfalls of integrating artificial intelligence into influencer-fan interactions.

With its mix of innovation and controversy, CarynAI is shaping the conversation around the future of AI in social media, mental health, and ethical technology. Whether it truly fulfills its promise to “cure loneliness” or becomes another cautionary tale of AI overreach remains to be seen.