Microsoft releases its internal generative AI red teaming tool to the public
Despite the advanced capabilities of generative AI (gen AI) models, we have seen many instances of them going rogue, hallucinating, or containing loopholes that malicious actors can exploit. To help mitigate those risks, Microsoft is releasing a tool that can identify vulnerabilities in generative AI systems. On Thursday, Microsoft published its Python Risk Identification Toolkit for generative AI (PyRIT), a tool Microsoft's AI Red Team has been using to check for risks in its gen AI system...
Read more at zdnet.com