
Artificial Intelligence therapists prohibited in Illinois as Gen Z turns to ChatGPT for emotional solace


In the rapidly evolving digital landscape, artificial intelligence (AI) systems like ChatGPT are increasingly being used to provide emotional support and therapeutic guidance. However, experts caution that while these systems can offer an "emotional sanctuary" and insightful advice, they lack genuine human understanding and emotional depth [1][2][4].

The Illinois Wellness and Oversight for Psychological Resources Act, the first law in the United States to ban the independent use of AI as a therapeutic tool, underscores this point [5]. The law permits AI to be used for administrative tasks but prohibits it from conducting therapeutic interactions or creating treatment plans without the supervision of a licensed professional [3]. Companies that violate these rules face fines of up to $10,000 per infraction.

Recognizing the limitations of AI, OpenAI, the company behind ChatGPT, has launched a new version called ChatGPT-5. This updated AI system boasts improvements in writing, programming, and responses to health-related questions, offering more accurate and timely answers [6]. Sam Altman, CEO of OpenAI, claims that consulting ChatGPT-5 is like speaking to a PhD-level expert on any topic [6].

One of the significant advancements in ChatGPT-5 is its ability to respond with timely, concrete guidance on how to seek professional help when users express feelings of loneliness, emotional distress, or suicidal thoughts [6]. This feature underscores the importance of AI as a complement to human therapists rather than a substitute, especially for complex emotional or clinical issues.

Despite its limitations, the appeal of AI lies in its constant availability, non-judgmental nature, and kind responses [7]. Users often report feelings of connection and relief when interacting with AI, particularly in contexts like grounding exercises or managing immediate distress [1][4]. However, it's crucial to remember that AI lacks the capacity for nuanced judgment, ethical consideration, and accountability, which are critical for vulnerable individuals [2].

Kyle Hillman, director of legislative affairs for the National Association of Social Workers, echoes this sentiment, emphasizing that AI can support mental health, but it shouldn't replace human therapists [5]. As the use of AI for emotional support continues to grow, particularly among Generation Z, it's essential to strike a balance between leveraging AI's potential benefits and ensuring that users receive the professional help they need when dealing with complex emotional or clinical issues.

OpenAI is also taking steps to address concerns about the use of AI in mental health support. The company will soon introduce features that encourage users to take breaks during long sessions and that limit direct responses to high-risk personal questions [8]. ChatGPT-5 also aims to reduce "hallucinations" (misleading or incorrect responses) and to be more transparent about its limitations [8].

As we navigate the intersection of AI and mental health support, it's clear that while AI can provide immediate, accessible support and tools, it is not a substitute for human therapists—especially for complex emotional or clinical issues requiring deep understanding, ethical responsibility, and relational intimacy [2][4][5]. The current consensus is that AI is a useful adjunct but insufficient to replace human connection essential to effective therapy [5].

  1. Under the Illinois Wellness and Oversight for Psychological Resources Act, AI may be used only administratively in the realm of mental health, not for therapeutic interactions or for creating treatment plans without the supervision of a professional.
  2. Despite the advancements made in ChatGPT-5, such as offering help options when users express feelings of emotional distress or suicidal thoughts, AI lacks the capacity for nuanced judgment, ethical consideration, and accountability essential for vulnerable individuals.
  3. The appeal of AI lies in its constant availability and non-judgmental nature, but its limitations in emotional depth necessitate a balanced approach to its use, emphasizing that it should complement, not replace, human therapists.
  4. As the use of AI for emotional support continues to grow, particularly among Generation Z, policymakers and AI companies must work together to ensure that users receive professional help when dealing with complex emotional or clinical issues, and address concerns about AI's potential to provide inadequate support or misleading information.
