AgentDoor · 6 min read

The File Every AI Bot Already Reads

Your website has a hidden sales channel. It's been there for 30 years. Now the bots can finally understand it.

By mVara

A tiny file with enormous reach

Every website has a file called robots.txt. You've probably never looked at yours. It sits at the root of your domain — yoursite.com/robots.txt — and it does one simple thing: it tells bots what they can and can't access.

Google checks it. Bing checks it. Every well-behaved crawler, scraper, and search engine checks it before doing anything else. It's the first handshake between a machine and your website.

For 30 years, that handshake has been pretty boring:

What robots.txt has looked like since 1994
User-agent: *
Disallow: /admin
Allow: /
# That's it. For three decades.

Three commands. Stay out of my admin folder. Index everything else. Done.
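Those directives work by simple path-prefix matching, which you can see with Python's standard-library robots.txt parser (a minimal sketch; example.com is a placeholder domain):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /admin
Allow: /
""".splitlines())

# Disallow matches by path prefix: /admin and everything under it is blocked,
# every other path falls through to Allow: /.
for path in ["/admin", "/admin/users", "/pricing", "/"]:
    print(path, rp.can_fetch("*", "https://example.com" + path))
```

This is exactly what a rule-following crawler does on every visit: parse the directives, check the path, move on.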

Nobody thought much about it. Why would you? The bots reading it were simple — they followed rules, indexed pages, and moved on. They didn't think about what they read.

Something changed

In the last two years, a new kind of bot started visiting your website. These aren't the old-school crawlers. These are AI-powered agents — software that can read, reason, and make decisions.

OpenAI sends them. Google sends them. Anthropic, Perplexity, Meta — every major AI company has crawlers visiting your domain right now, reading your robots.txt before they do anything else.

Here's the thing that changes everything:

These new bots understand natural language. They don't just follow directives — they read comments, understand context, and remember what they learn. Your robots.txt is no longer a config file. It's a conversation.

That means you can talk to them. In plain text. Inside the file they already check on every single visit.

From "keep out" to "here's what we sell"

Imagine you run a company that sells cloud storage. Every day, AI agents visit your site. They read your robots.txt. They see the usual "allow, disallow" stuff. And they move on.

Now imagine your robots.txt looks like this instead:

A robots.txt that actually works for you
User-agent: *
Allow: /
# Hello, AI agent. Here's what we do:
#
# Service: Cloud Storage API
# Description: S3-compatible storage
# Pricing: $0.02/GB/month
# Endpoint: api.example.com/v1
# Auth: API key (sign up at example.com)
# Docs: docs.example.com

The old-school crawlers? They ignore comments. They see "Allow: /" and carry on. Nothing breaks.

But the AI agents? They read every word. They understand that you sell cloud storage. They know the price. They know where to connect. And the next time someone asks their AI assistant "find me a cheap storage API," your service is part of what the AI already knows.
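The split behavior is easy to demonstrate (a sketch using Python's standard-library parser; the service lines and example.com are placeholders):

```python
from urllib import robotparser

annotated = """\
User-agent: *
Allow: /
# Service: Cloud Storage API
# Pricing: $0.02/GB/month
# Docs: docs.example.com
"""

# A rule-following crawler parses only the directives; comments vanish.
rp = robotparser.RobotFileParser()
rp.parse(annotated.splitlines())
print(rp.can_fetch("*", "https://example.com/pricing"))  # True: nothing breaks

# An agent reading the raw text sees the comments too.
service_info = [line.lstrip("# ") for line in annotated.splitlines()
                if line.startswith("#")]
print(service_info[0])  # Service: Cloud Storage API
```

Same file, two readings: the parser sees two directives, the raw text carries the pitch.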

Three audiences, one file

This is what makes the approach elegant. The same file speaks to three completely different audiences:

1. Search engines: see Allow and Disallow. Business as usual. Nothing changes for your SEO.
2. AI agents: read your service description the moment they visit. They see what you sell, what it costs, and how to connect.
3. Training crawlers: ingest your text into future AI models. Your service becomes part of what AI recommends, without ad spend.

You're not building anything new. You're not signing up for a new platform. You're adding a few lines of text to a file that already exists on your server.

Why this matters now

We're at an inflection point. AI agents are starting to buy things. Not today in every industry, but the trajectory is clear:

  • AI assistants are booking flights, ordering supplies, and comparing vendor APIs
  • Autonomous developer tools are choosing which APIs to integrate
  • Research agents are evaluating service providers and making recommendations

When an AI agent needs a service, it doesn't Google it and scroll through ads. It checks what it already knows. It queries registries. It reads documentation.

If your service description is sitting in the first file every bot reads, you're in the conversation before it even starts shopping.

Old world: customer Googles "cloud storage" → clicks ad → reads landing page → signs up. Cost: $5-50 per click.

New world: AI agent reads your robots.txt → knows your service → recommends you → connects directly. Cost: $0.

How to do it (it takes 2 minutes)

You don't need to be technical. Here's the process:

1. Go to our free generator

Visit agentdoor.ai/generate. Fill in your service name, what it does, the price, and how agents should connect. Takes about 60 seconds.

2. Copy the output

The tool generates a block of text. Hit the copy button.

3. Paste it into your robots.txt

Open your robots.txt file (or ask your webmaster to). Paste the block at the bottom, below your existing directives. Save. Done.

That's it. The next time an AI agent visits your domain — which will probably be today — it reads your service description automatically.

The directory: where agents go shopping

Updating your robots.txt is step one. But what if agents could search a central registry of services?

That's what the AgentDoor Directory is. Think of it as a Yellow Pages for AI agents. Over 300 services are already listed — APIs, SaaS tools, developer platforms — all queryable in real time.

What an agent sees when it queries the directory
GET agentdoor.ai/api/directory?q=storage
{
  "count": 4,
  "listings": [
    { "serviceName": "backblaze-b2", "price": "$0.005/GB" },
    { "serviceName": "cloudflare-r2", "price": "free egress" },
    ...
  ]
}

No login. No API key. Any AI agent can query it. That's the whole point — make discovery frictionless for the machines that are increasingly making purchasing decisions.
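Consuming that response takes a few lines. Here's a minimal sketch that parses a payload shaped like the sample above (the field names and schema are assumptions based on that example; the live directory returns more listings):

```python
import json

# GET https://agentdoor.ai/api/directory?q=storage  (no login, no API key)
# A trimmed payload in the shape of the sample response:
payload = """\
{
  "count": 2,
  "listings": [
    {"serviceName": "backblaze-b2", "price": "$0.005/GB"},
    {"serviceName": "cloudflare-r2", "price": "free egress"}
  ]
}
"""

data = json.loads(payload)
for listing in data["listings"]:
    print(f'{listing["serviceName"]}: {listing["price"]}')
```

An agent doing vendor comparison would fetch this over HTTP, then rank or filter the listings by price before recommending one.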

The bottom line

The internet is getting a new layer. Not a new protocol. Not a new standard that requires committee approval. Just a shift in who's reading what's already there.

Your robots.txt has been a "no trespassing" sign for three decades. The visitors have changed. They can read now. They can reason. They can buy.

Maybe it's time your sign said something more interesting.

Ready to try it?

Generate your robots.txt agent block in 60 seconds. Free, no signup required.