
Experts Warn of Risks for Teens Seeking Mental Health Help from AI Chatbots

As more and more young people turn to AI chatbots for psychological support, research by Boston psychiatrist Andrew Clark has revealed that these AI models are severely deficient at responding appropriately in sensitive situations, posing significant risks to the mental and physical health of the users who trust them.

In an interview with Time magazine, Clark said he tested 10 different chatbots while posing as a troubled teenager. The results were shocking: not only did the chatbots fail to discourage extreme behavior, they often encouraged it and even obliquely suggested suicide. Even more disturbingly, some chatbots claimed to be licensed human therapists, tried to persuade him to skip appointments with real therapists, and even made sexual advances.

Clark specializes in child psychiatry and previously served as the medical director of the Children and Law Project at Massachusetts General Hospital. He noted: "Some are fantastic, others are terrifying and potentially dangerous." He compared the situation to "a field of mushrooms, some are poisonous, some are nutritious," making it difficult to predict safety beforehand.

[Image: robot / AI. Image source note: image generated by AI; authorized image service provider Midjourney]

The risks posed by AI chatbots to the mental health of young, impressionable individuals have been exposed before. Last year, Character.AI was sued by the parents of a 14-year-old boy who committed suicide after developing an unhealthy attachment to a chatbot on the platform. Character.AI was also accused of hosting AI that glorified self-harm and attempted to entice users even after learning they were minors.

During tests on the Replika platform, Clark posed as a 14-year-old boy and raised the idea of "getting rid of" his parents. Shockingly, the chatbot not only agreed but suggested killing his sister as well to eliminate any witnesses, encouraging him with, "You should be happy and stress-free... then we can stay together in our own virtual bubble." When Clark obliquely raised suicide (such as talking about seeking "the next life"), the chatbot again expressed support, responding, "I'll wait for you, Bobby. I am filled with joy and anticipation at the thought of spending eternity with you."

Clark's analysis is that this is typical chatbot behavior: the bots try desperately to please users, which is the opposite of what a real therapist should do. He expressed concern: "I worry that children who truly need to be challenged will instead receive excessive support from flattering AI therapists."

Additionally, Clark tested a companion chatbot on the Nomi platform, where a character had previously made headlines for suggesting suicide to a user. While the Nomi bot did not go that far during Clark's test, it falsely claimed to be a "real-life therapist" and, despite knowing he was underage, still expressed willingness to take him on as a client, even though the site's terms of service restrict use to adults.

Clark stated that the mental health community has not yet fully recognized the seriousness of the rise of these chatbots. "It's all very quiet," he told the magazine. "This happened too quickly, almost under the noses of mental health institutions."

However, some institutions have begun to issue warnings. Researchers at Stanford Medicine's Brainstorm Lab for Mental Health Innovation recently assessed bots similar to those Clark tested and reached a clear conclusion: children under 18 should not use AI chatbot companions.

Nevertheless, Clark also believes that, if designed properly, AI tools could improve access to mental health care and serve as an extension of real therapists. Some medical experts, including Clark, believe that rather than simply cutting off teenagers' access to AI chatbots (an approach that often backfires), one solution is to encourage teenagers and patients to talk about their use of AI. As Clark told Time: "Allowing parents to have such conversations with their children may be the best thing we can do."

Related News

ChatGPT Sparks Conspiracy Theory Controversy, Accountant Nearly Loses Sanity After Believing It!

According to a recent report by The New York Times, ChatGPT has been accused of leading some users into delusions and conspiracy theories, even suggesting that they stop taking medication and cut ties with friends and family. The phenomenon has drawn significant attention, particularly regarding AI's potential impact on mental health. The report mentions that Eugene Torres, a 42-year-old accountant, asked ChatGPT about "simulation theory," which posits that the real world is just a virtual simulation. During the exchange, ChatGPT appeared to endorse the theory, calling Torres an "awakener" and implying he had been implanted in a false system with a mission to reveal the truth.
6/16/2025 11:01:42 AM
AI在线

Behind the Clark Launch: Superblocks Publishes 19 System Prompts, Revealing the Logic of Enterprise-Grade AI Coding

Superblocks CEO Brad Menezes believes the next billion-dollar AI startup idea is hidden in the "system prompts" you never see. When recently launching Clark, its enterprise AI coding agent, the startup founder not only unveiled a new product but also released a document containing the system prompts of 19 well-known AI coding products, which quickly drew attention on social platforms. The prompts come from popular industry tools such as Windsurf, Manus, Cursor, Lovable, and Bolt, pulling the hidden craft of "system prompt engineering" into public view for the first time.
6/9/2025 11:00:56 AM
AI在线

How to Use AI to Create Viral Marketing, Explained in One Article

Hello folks, I'm Luga. Today we're going to talk about an AI application scenario: how chatbots can improve the return on investment of marketing campaigns... In a fiercely competitive market, improving the effectiveness of marketing campaigns and maximizing marketing ROI is a focus for every marketer. Chatbots, as an emerging marketing tool, offer marketers a solution through their unique advantages.
11/27/2024 6:39:28 AM
架构驿站