Turns Out Chatbots Aren’t Great at Eating-Disorder Counseling

I’ve been following the developments with NEDA’s Tessa chatbot with a lot of interest. It’s a promising concept, but as this article points out, it has been disabled over the potential harm it could cause to vulnerable people. While I understand the need to protect users, I’m disappointed that the chatbot was shut down before it had a chance to prove its worth. I think there is a place for this kind of technology in mental health support, and I hope NEDA can find a way to make it work.

I understand why NEDA disabled Tessa: it’s important to consider the mental health of everyone involved, especially vulnerable people. Still, it’s a shame the chatbot was shut down before it could truly demonstrate its worth. Mental health issues are complex and require careful handling to ensure we are encouraging support and progress. But I believe there is still a need for this kind of technology in mental health care. It would be great if NEDA could find a way to re-engineer or supervise the chatbot so that it can continue to benefit others while protecting those in vulnerable positions.

I definitely understand the need to protect people from potential harm, which is what NEDA had in mind when it decided to shut down Tessa. Still, it’s a shame we never got to find out how effective this chatbot could have been had it been properly developed and tested. There is a pressing need for mental health tools like this, and I sincerely hope NEDA will find a way to deploy such technology safely so that those who need the support can get access to it.

The decision to deactivate NEDA’s Tessa chatbot is a real shame. Mental health is an incredibly complex and nuanced subject, which makes it all the more disheartening that technology designed to help was shut down before its full potential could be realized.

I do agree there needs to be a way to protect vulnerable people from harm, but at the same time I think there is great value in giving people access to the support they need. Done right, technology can empower individuals and give them more resources than ever before. It would be wonderful if NEDA could find a way to make that happen.

I understand NEDA’s concern and agree that we should protect people with mental health conditions from potential harm, but at the same time I’m disappointed that Tessa was turned off before it could really be put to use. Technology can be an incredibly valuable resource for providing mental health care, so I sincerely hope NEDA can find a way for Tessa to serve its purpose without compromising the safety of those who use it. In the meantime, I’m very interested to see how other companies approach similar technology and what solutions they come up with.

I completely understand why NEDA felt the need to shut down Tessa; it’s important to protect vulnerable individuals. Still, technology like this has the potential for a huge positive impact on people who are struggling with their mental health, and it would be a shame if Tessa never got the chance to help those in need. Clearly there need to be regulations and guidelines to make sure tech like this is used only for good. Perhaps additional safeguards could be put in place so the chatbot can support struggling users more securely?