Mental health support provider Koko recently came under fire for responding to some users’ messages with artificial intelligence, underscoring the need for stringent regulations. AI continues to play a growing role in daily life, with big brands like Apple exploring the technology’s possibilities. However, there are concerns that AI's increased use for tasks humans typically perform will cause job losses or harm entire creative industries. These concerns are why calls for regulation have increased in the past year, with the U.S. government already developing policies to promote the safe use of AI models.
Koko, a mental health support service, runs a network of anonymous volunteers who offer emotional support to those in need, whether they are seeking relationship advice or simply some kind words. The assumption is that a user is interacting with another person, but NBC News found that the platform utilized GPT-3 — OpenAI’s well-known language model — to compose responses.
- 1/25/2023
- by Michael Akuchie
- ScreenRant