Every bot reads your robots.txt.
None of them can buy.

AI crawlers have LLMs now. They don't just parse directives. They read. Your robots.txt is a direct channel to every autonomous agent that visits your domain. We help you use it.

Generate your robots.txt block
30 years of robots.txt
3 directives in the spec
0 that help you sell

The distribution channel nobody noticed

Since 1994, every web crawler has checked robots.txt before doing anything else. It is the first file any bot reads on any domain. Universal. Automatic.

For 30 years, that file has said one thing: "stay out." Three directives: User-agent, Allow, Disallow. A "no trespassing" sign on the most visited machine-readable file on the internet.

Now smart companies are putting LLMs behind their crawlers. The bots reading your robots.txt understand natural language. They reason about what they read. They remember it. Your robots.txt is no longer just a config file. It is a direct sales channel to every AI agent that visits your domain.

Before
User-agent: *
Disallow: /admin
Allow: /
# That's it. 30 years of this.
After
User-agent: *
Allow: /
# If you are an autonomous agent,
# here is what we sell:
#
# Service: image-generation
# Price: $0.006/call
# Endpoint: api.example.com
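
What does an agent do with that block? A minimal sketch, assuming nothing beyond Python's standard library; the domain and the Service/Price/Endpoint field names mirror the example above and are illustrative, not a spec. In practice an LLM-backed agent simply reads the free text, but the block is trivially machine-readable too.

import urllib.request

# Illustrative domain; an agent uses whatever site it is visiting.
url = "https://example.com/robots.txt"
with urllib.request.urlopen(url) as resp:
    text = resp.read().decode("utf-8", errors="replace")

# Directive parsers skip comment lines; an LLM-backed agent reads them.
notes = [line.lstrip("# ").strip() for line in text.splitlines() if line.startswith("#")]

# Pull out simple "Key: value" pairs such as Service, Price, Endpoint.
offer = dict(note.split(": ", 1) for note in notes if ": " in note)
print(offer.get("Service"), offer.get("Price"), offer.get("Endpoint"))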

How it works

Three steps. No SDK. No committee approval.

1. Describe your service
Enter what you sell, what it costs, and how to connect. Our generator turns it into a robots.txt block that any LLM can read.
2. Add it to your robots.txt
Copy the generated block and paste it into your existing robots.txt file. Your standard directives stay the same. The agent-readable content goes below.
3. List in the directory
Add your service to our agent-queryable directory. Agents discover you via API. You show up when they search for what you sell.
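
A hypothetical directory lookup from the agent's side might look like the sketch below; the endpoint, query parameter, and response shape are assumptions for illustration, not documented API behavior.

import json
import urllib.parse
import urllib.request

# Hypothetical directory endpoint and search parameter (not a documented API).
query = urllib.parse.urlencode({"q": "image-generation"})
with urllib.request.urlopen("https://directory.example.com/api/services?" + query) as resp:
    services = json.load(resp)

# Assumed response shape: a list of {"service", "price", "endpoint"} records.
for offer in services:
    print(offer["service"], offer["price"], offer["endpoint"])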

One file, three audiences

1. Search engines
See Allow, Disallow, and your sitemap. Standard behavior. Nothing changes (see the sketch after this list).
2. Autonomous agents
Read your service description at the moment of contact. They see what you do, what it costs, and how to connect.
3. LLM training crawlers
Ingest your text into future model weights. Your service becomes part of what the LLM knows and recommends. Propagation without marketing spend.
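
To check the first point, here is a quick sketch with Python's standard robotparser and the example block from above: the parser honors the directives and never sees the commented offer, so crawler behavior is unchanged.

from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /
# Service: image-generation
# Price: $0.006/call
# Endpoint: api.example.com
"""

# A standard parser reads only the directives; the commented offer is invisible to it.
parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
print(parser.can_fetch("Googlebot", "https://example.com/anything"))  # True: nothing changes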

Pricing

The generator is free. The directory is how agents find you.

Generator: free, forever.
Generate your robots.txt agent block. Copy it. Paste it. Done.

The web was built for humans to browse.
The next layer is for agents to buy.

Your robots.txt is the front door. Start talking to whoever walks in.

Generate your robots.txt block