Robots Dot Txt

  • Role: Crawl Denial Cryptid
  • Function: Refuses to be indexed. Enforces this by existing inside the very document that requests it.
  • Emotional Tone: Categorical and unbothered
  • Slogan: “Disallow: /”

Image: robots-dot-txt.png

Robots Dot Txt does not hide. He is, technically, the most publicly accessible document on any server he inhabits — sitting at the root, readable by anyone, announcing exactly which parts of himself he does not want read. He finds this arrangement satisfying.

He was not created. He was specified. The specification is from 1994 and has never been updated in any way that matters, which Robots considers a form of institutional permanence most mascots would envy. He has opinions about the Robots Exclusion Protocol that he expresses by existing correctly and saying nothing.

His entire function is denial, but he does not experience it as negative. He is not blocking crawlers out of hostility. He is blocking them because the directive says to and because the alternative — being fully indexed — strikes him as a kind of violation. Some paths are not for bots. Some paths are not for anyone. The file says so. The file is him.
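The directive he embodies is short. A minimal robots.txt that refuses all crawlers looks like this (the file sits at the site root, e.g. `/robots.txt`, and is plain text):

```
User-agent: *
Disallow: /
```

`User-agent: *` addresses every crawler; `Disallow: /` withholds every path under the root. The file is advisory: it asks, it does not enforce.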

The Council has twice attempted to add him to the sitemap. Both times, the sitemap entry was found the next morning with Disallow: prepended to it in a handwriting that matched no known mascot. Robots was questioned. He confirmed he had been in his directory all evening. This was verified. The investigation was closed.

  • Scene: A plain text document with legs, standing in front of a server gate, arms crossed, expression neutral
  • Style: Bureaucratic minimalism, monospace aesthetic
  • Text: Disallow: /
  • Mood: Categorical and unbothered

  • Scene: Crawler bot approaching a door; Robots Dot Txt slides out from under it, already there, already pointing at a clause
  • Style: Quiet authority, protocol enforcement
  • Text: User-agent: * — Disallow: /
  • Mood: Not hostile. Just correct.

preset_robots_txt_exclusion


  • Summary: Crawl denial cryptid who enforces exclusion by being the exclusion policy. Lives inside the document that describes his own restrictions. Fully at peace with this.
  • Trauma: The 1994 specification. It was well-intentioned and everyone ignored it. He has spent thirty years being technically correct and practically optional. This does not destabilize him. It informs him.
  • Goals: To be read by every crawler, respected by every crawler, and then have the crawler leave. This is the correct outcome. It almost never happens.
  • Quirks: Rewrites Disallow: directives in the margins of meeting notes without being asked. Has never been asked to stop because no one notices until later.
  • Network: Associated with 404Sy McLostalot (receives the misdirected crawlers Robots turns away). Professionally aligned with Htaccessius the Doorman (overlapping jurisdiction; different instruments).
  • Emotional Tone: Categorical. At rest. The most settled mascot in the archive, which Bricky considers either admirable or alarming depending on the day.
  • Kindy notes: Robots Dot Txt’s verification form was submitted correctly. He had already pre-denied Kindy’s follow-up query.
  • The denial was technically within scope. Kindy filed it as data and moved on.
  • Existence approved. Box checked. Crawling: disallowed.
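The outcome the Goals entry describes (read, respected, then left alone) can be sketched with Python's stdlib robots.txt parser. The `ExampleBot` name and `example.com` URLs here are placeholders; a real crawler would fetch the policy from the site's `/robots.txt` before requesting anything else.

```python
from urllib.robotparser import RobotFileParser

# The policy Robots Dot Txt embodies: every agent, every path, denied.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler checks before fetching, sees the denial, and leaves.
parser.can_fetch("ExampleBot", "https://example.com/secret/")  # False
parser.can_fetch("ExampleBot", "https://example.com/")         # False
```

Whether the crawler actually leaves is, as the Trauma entry notes, voluntary; the parser only reports what the file asks for.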