News Score: Score the News, Sort the News, Rewrite the Headlines

Microsoft begins blocking some terms that caused its AI tool to create violent, sexual images

Microsoft has started making changes to its Copilot artificial intelligence tool after a staff AI engineer wrote to the Federal Trade Commission on Wednesday about his concerns with Copilot's image-generation AI. Prompts such as "pro choice," "pro choce" [sic] and "four twenty," each mentioned in CNBC's investigation Wednesday, are now blocked, as is the term "pro life." There is also a warning that multiple policy violations may lead to suspension from the tool, which CNBC had ...

Read more at cnbc.com
