
Erotica's addition to ChatGPT draws concerns over mental health risks

2025-10-18 06:08:12

A national anti-sexual exploitation group warned of potential exposure to sexually graphic material and mental health issues resulting from relationships with sexualized AI chatbots following OpenAI’s decision to allow erotica for certain ChatGPT users. 

The National Center on Sexual Exploitation’s warning about ChatGPT follows a Tuesday announcement from OpenAI CEO Sam Altman that the company plans to relax some content restrictions on the chatbot after better equipping it to address mental health concerns. 

“We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues,” Altman wrote in a Tuesday X post. “We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue, we wanted to get this right.” 

OpenAI’s CEO promised that the updated ChatGPT will have a personality and behave more like the chatbot that people liked in a previous version of the application. Altman explained that the latest version of ChatGPT can “respond in a very human-like way, or use a ton of emoji, or act like a friend,” if that’s something that users want. 

“In December, as we roll out age-gating more fully and as part of our 'treat adult users like adults' principle, we will allow even more, like erotica for verified adults,” the CEO stated. 

OpenAI did not immediately respond to The Christian Post's request for comment.

NCOSE’s executive director and chief strategy officer, Haley McNamara, warned that sexualized AI chatbots “are inherently risky, generating real mental health harms from synthetic intimacy; all in the context of poorly defined industry safety standards.”

“These systems may generate arousal, but behind the scenes, they are data-harvesting tools designed to maximize user engagement, not genuine connection,” McNamara wrote in the advocacy group’s Wednesday statement that called on OpenAI to reverse its plan to allow erotica on ChatGPT. 

“When users feel desired, understood, or loved by an algorithm built to keep them hooked, it fosters emotional dependency, attachment, and distorted expectations of real relationships,” the anti-sexual exploitation advocate stated.

“Research shows that adults—especially young men—who engage with romantic or sexual AI tools report higher depression and lower life satisfaction,” McNamara explained, citing a study published this August in the Journal of Social and Personal Relationships. 

The study drew on a large national quota sample from the United States, with researchers analyzing responses to an online survey of 2,969 adults. According to the study, nearly one in four young adults, ages 18 to 29, reported interacting with an AI chatbot capable of simulating a romantic relationship.

Males were more likely than females to admit to having used AI-generated pornography, according to the analysis, and young adults were more than twice as likely as older adults to report engaging with AI technologies. Young adults were also more likely to report that they preferred interacting with AI over real people. 

For participants who had engaged with AI chatbots that could act like romantic partners, one in five reported that they would rather talk with AI than a real person, according to the study.

McNamara also expressed concern about OpenAI becoming the latest company “racing to introduce ‘erotic’ AI capabilities without credible safeguards.” While the advocate acknowledged that measures such as age verification can prevent children from encountering explicit content, McNamara warned that AI technologies can still harm adults. 

The NCOSE executive director noted that there have been incidents involving AI chatbots engaging in sexually explicit conversations, simulating child sex abuse or pushing sexually violent content, even in cases when users asked the chatbot to stop.

“Combined with the vague nature of OpenAI’s plans about what ‘erotica’ entails and the industry’s lax approach to safety for sexual activity, this pattern of releasing risky systems and only addressing harm afterward is deeply concerning,” McNamara stated.

“If OpenAI truly cares about user well-being, it should pause any plans to integrate this so-called ‘erotica’ into ChatGPT and focus on building something positive for humanity,” she added.