AI Chatbot Joins Mushroom Hunters Group, Immediately Encourages Them to Cook Dangerous Mushroom
By now, it should be obvious that AI is capable of giving really, really bad advice. Sometimes that advice is merely stupid; other times, it's actively dangerous. 404 Media reports on an incident from the latter category, in which a popular Facebook group dedicated to mushroom foraging was invaded by an AI agent that proceeded to offer suggestions on how to cook a dangerous mushroom. The agent in question, dubbed "FungiFriend," entered the chat belonging to the Northeast Mushroom Ident...
Read more at gizmodo.com