U.K. agency releases tools to test AI model safety | TechCrunch
The U.K. AI Safety Institute, the country’s recently established AI safety body, has released a toolset designed to “strengthen AI safety” by making it easier for industry, research organizations and academia to develop AI evaluations.
Called Inspect, the toolset, which is available under an open source MIT license, aims to assess certain capabilities of AI models, including their core knowledge and ability to reason, and to generate a score based on the results.
In a press r...
Read more at techcrunch.com