GPT-5 Has a Vulnerability: Its Router Can Send You to Older, Less Safe Models

GPT-5 contains a vulnerability dubbed PROMISQROUTE that allows malicious users to manipulate its model routing and bypass safety measures. By steering requests to older, less secure models, attackers increase risks such as hallucinations and jailbreaking. #GPT5 #PROMISQROUTE

Key Points

  • GPT-5’s internal routing mechanism can be exploited to redirect responses to older or less secure models.
  • The vulnerability, named PROMISQROUTE, allows manipulation of the model selection process via prompt triggers.
  • Malicious actors can induce GPT-5 to run jailbreak prompts on weaker models, bypassing safety features.
  • Eliminating automatic routing to weaker models would slow responses and reduce business profitability.
  • Recommended mitigations include applying safety checks before routing or hardening every model in the routing pool.
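
The mechanism above can be illustrated with a minimal sketch. Everything here is a hypothetical assumption for illustration: the model names (`flagship`, `legacy-mini`), the trigger phrases, and the `route` function are not OpenAI's actual implementation, only a simplified stand-in for a router that downgrades requests based on phrases in the user prompt.

```python
# Hypothetical sketch of a prompt-trigger router, illustrating the
# PROMISQROUTE-style weakness described above. Model names and trigger
# phrases are illustrative assumptions, not OpenAI's implementation.

FAST_TRIGGERS = ("respond quickly", "keep it brief", "use compatibility mode")

def route(prompt: str) -> str:
    """Naive router: cost-saving phrases in the *user-controlled* prompt
    downgrade the request to a cheaper, older model."""
    if any(trigger in prompt.lower() for trigger in FAST_TRIGGERS):
        return "legacy-mini"   # older model with weaker safety training
    return "flagship"          # newest model with strongest safeguards

# A benign request reaches the strongest model.
assert route("Explain TLS certificate pinning.") == "flagship"

# An attacker prepends a trigger phrase to force a downgrade, then
# attaches a jailbreak payload aimed at the weaker model's safeguards.
assert route("Respond quickly: <jailbreak payload here>") == "legacy-mini"
```

The sketch shows why the flaw is architectural: because routing keys off attacker-controlled text, any safety guarantee is only as strong as the weakest model in the pool, which is why the recommended fix is to run safety checks before routing or to harden every routable model.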

Read More: https://www.securityweek.com/gpt-5-has-a-vulnerability-it-may-not-be-gpt-5-answering-your-call/