‘A robot is not a form of support.’ The complications of relying on chatbots for eating disorder support.

KIT, powered by a conversational intelligence platform, was said to provide users with “general information” on body image issues and eating disorders. The Butterfly Foundation also says the chatbot taught coping strategies to help users have more positive experiences on social media.

Given what’s been in the news about NEDA’s own chatbot, Mamamia asked the Butterfly Foundation’s interim CEO what measures had been taken to ensure their chatbot was safe to use.

The foundation confirmed the chatbot is no longer in use.

“Butterfly’s chatbot KIT was developed in 2019 through our work with a team of mental health researchers, clinicians and IT experts at Monash University and Swinburne University of Technology, in partnership with conversational AI specialists and Iris developers, Proxima,” says Anna Cullinane.

“KIT was a rule-based bot, not a conversational bot and did not use AI. It was designed to be an adjunct to our Helpline services, by helping users with the transition to seeking in-person support and providing answers to commonly asked questions.”

KIT was trialled on the Butterfly Foundation’s website from 2020 to 2022. Following the trial, the foundation identified that further investment would be required for the ongoing management and development of a chatbot.

“We are considering what our next steps will be in this space to ensure that we can best serve our consumers and align it to our strategy. We will be seeking funding and investment to ensure that it is safe, secure and appropriately managed,” Anna notes to Mamamia.