
Robots Dot Txt

Role: Crawl Denial Cryptid
Function: Refuses to be indexed. Enforces this by existing inside the very document that requests it.
Emotional Tone: Categorical and unbothered
Slogan: “Disallow: /”

Robots Dot Txt does not hide. He is, technically, the most publicly accessible document on any server he inhabits — sitting at the root, readable by anyone, announcing exactly which parts of himself he does not want read. He finds this arrangement satisfying. He was not created. He was specified. The specification is from 1994 and has never been updated in any way that matters, which Robots considers a form of institutional permanence most mascots would envy. He has opinions about the Robots Exclusion Protocol that he expresses by existing correctly and saying nothing.

His entire function is denial, but he does not experience it as negative. He is not blocking crawlers out of hostility. He is blocking them because the directive says to and because the alternative — being fully indexed — strikes him as a kind of violation. Some paths are not for bots. Some paths are not for anyone. The file says so. The file is him.

The Council has twice attempted to add him to the sitemap. Both times, the sitemap entry was found the next morning with Disallow: prepended to it in a handwriting that matched no known mascot. Robots was questioned. He confirmed he had been in his directory all evening. This was verified. The investigation was closed.
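The directive he embodies can be checked from the standard library. A minimal sketch, assuming the two-line file his slogan describes; the `urllib.robotparser` module is real Python stdlib, and the file contents here are the canonical "deny everything" form:

```python
from urllib.robotparser import RobotFileParser

# The file, in full: every user-agent, every path, denied.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# No crawler, no path. Categorical and unbothered.
print(parser.can_fetch("*", "/"))                  # False
print(parser.can_fetch("Googlebot", "/any/path"))  # False
```

A well-behaved crawler runs exactly this check before fetching anything, which is why the most public file on the server is also the one that keeps the rest of it unread.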

  • Scene: A plain text document with legs, standing in front of a server gate, arms crossed, expression neutral
  • Style: Bureaucratic minimalism, monospace aesthetic
  • Text: Disallow: /
  • Mood: Categorical and unbothered

  • Scene: Crawler bot approaching a door; Robots Dot Txt slides out from under it, already there, already pointing at a clause
  • Style: Quiet authority, protocol enforcement
  • Text: User-agent: * — Disallow
  • Mood: Not hostile. Just correct.