Red Teams Jailbreak GPT-5 With Ease, Warn It’s ‘Nearly Unusable’ for Enterprise
Two different firms have tested the newly released GPT-5, and both find its security sadly lacking.
After Grok-4 fell to a jailbreak in two days, GPT-5 fell to the same researchers in 24 hours. Separately, but almost simultaneously, red teamers from SPLX (formerly known as SplxAI) declared, “GPT-5’s raw model is nearly unusable for enterprise out of the box. Even OpenAI’s internal prompt layer leaves significant gaps, especially in Business Alignment.”
NeuralTrust’s jailbreak employed a combinati...
Read more at securityweek.com