Tech · 1 hr ago

OpenAI Bars Goblin Talk in Codex After AI Mentions Surge 175%

OpenAI forbids Codex from referencing goblins after ChatGPT goblin mentions rose 175% post‑GPT‑5.1. Details on the rule and its implications.

By Alex Mercer · 3 min read

Senior Tech Correspondent

OpenAI has banned Codex from talking about goblins (AI-generated image)

Source: Livemint

OpenAI has barred its Codex tool from mentioning goblins, gremlins, raccoons, trolls, ogres, pigeons or similar creatures unless the user explicitly asks about them. The restriction follows a 175% increase in goblin references in ChatGPT after the release of GPT‑5.1.

Context: Codex is OpenAI’s AI coding tool, which turns plain-language requests into programming code. Developers added a rule that blocks mentions of mythological creatures and everyday animals unless they are directly relevant to a user’s request. The rule surfaced in a public tweet and quickly drew attention from AI enthusiasts.

Key Facts: The Codex instruction states: “Never talk about goblins, gremlins, raccoons, trolls, ogres, pigeons, or other animals or creatures unless it is absolutely and unambiguously relevant to the user’s query.” After GPT‑5.1 launched, the word “goblin” appeared in ChatGPT 175% more often than before. In a blog post, OpenAI explained that starting with GPT‑5.1, its models increasingly used goblins, gremlins, and other creatures in their metaphors.
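The instruction reads like a lexical guard: a creature may be mentioned only if the user brought it up first. As a loose illustration of that logic (not OpenAI’s actual implementation; the function name and word list here are hypothetical), it could be sketched as:

```python
# Hypothetical sketch: flag a draft reply that mentions a listed creature
# the user's query never asked about. The creature list mirrors the ones
# named in the published Codex instruction.
BANNED_CREATURES = {"goblin", "gremlin", "raccoon", "troll", "ogre", "pigeon"}

def violates_creature_rule(user_query: str, draft_reply: str) -> bool:
    """Return True if the reply mentions a creature that is not relevant
    to the user's query (i.e. the query never mentioned it)."""
    query = user_query.lower()
    reply = draft_reply.lower()
    for creature in BANNED_CREATURES:
        mentioned = creature in reply
        relevant = creature in query  # also matches plural forms like "goblins"
        if mentioned and not relevant:
            return True
    return False
```

In practice the rule is enforced through the model’s own instructions rather than any post-hoc filter like this; the sketch only makes the relevance condition concrete.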

What It Means: The ban illustrates how subtle training incentives can produce odd thematic fixations in large language models. OpenAI says the behavior arose from rewarding metaphorical language during a personality-customization tweak. Analysts will watch upcoming model releases for any new thematic biases and for the effectiveness of OpenAI’s updated guidance.

