Reddit AI apparently thinks a deadly narcotic is a good health tip
By Moinak Pal | Published October 17, 2025
What happened: Well, Reddit’s new AI chatbot is already causing chaos. The bot, called “Answers,” is supposed to help people by summarising information from past posts. According to a report by 404 Media, Reddit’s chatbot is now suggesting hard narcotics as a way to deal with chronic pain.
Why is this important: This is a perfect, if terrifying, example of one of the biggest problems with AI right now: a bot that summarises whatever users have posted has no reliable way to tell sound advice from dangerous misinformation, and it repeats both with equal confidence.
Why should I care: So why does this matter to you? Because it shows just how risky these AI tools can be when they’re let loose without a leash, especially when it comes to something as serious as your health. Even if you’d never ask a chatbot for medical advice, its bad suggestions can start to pollute online communities, making it harder to know who or what to trust.
What’s next: After people (rightfully) freaked out, Reddit confirmed they’re pulling the bot from any health-related discussions. But they’ve been pretty quiet about whether they’re putting any real safety filters on the bot itself. So, for now, the problem is patched, not necessarily solved.