SEO/AEO Master Playbook
The one prompt to rule them all — make any site discoverable by every search engine and every AI agent.
About
A copy-paste master prompt that audits your repo against 14 pieces of modern discoverability infrastructure — classic SEO, Answer Engine Optimization, and citation infrastructure — then ships the missing ones end-to-end. Covers llms.txt, MCP servers, JSON-LD structured data, AI crawler allowlists, segmented sitemaps, IndexNow, per-route OG images, Core Web Vitals, and more. Idempotent: run it again and it audits and fills gaps. Written from a real, live-in-production discoverability stack at a-gnt.com.
Tips for getting started
Save this as a .md file in your project folder, or paste it into your CLAUDE.md file. Your AI will automatically use it whenever the skill is relevant.
Soul File
---
name: seo-aeo-master-playbook
description: >
  The one prompt to rule them all — make any website maximally discoverable by every
  search engine, every AI assistant, and every autonomous agent on the open web.
  Classic SEO + AEO (Answer Engine Optimization) + citation infrastructure in a single
  idempotent audit-and-ship brief. Covers llms.txt, MCP servers, JSON-LD, AI crawler
  allowlists, segmented sitemaps, IndexNow, per-route OG images, and Core Web Vitals.
  Paste into Claude Code, Cursor, or Windsurf from your repo root and walk away.
license: CC BY 4.0
---
# SEO/AEO Master Playbook — The One Prompt to Rule Them All
This skill is a copy-paste master prompt. Drop it into any capable coding agent
(Claude Code, Cursor, Windsurf, Aider) from inside your repo's root directory,
replace the `<placeholders>` on the first line, and walk away for an hour.
It's **idempotent** — run it again after changes and it will audit and fill gaps
without duplicating work.
## Why this exists
In 2026, making a site discoverable is a three-job brief, not one:
1. **Be crawlable** — classic SEO. Sitemaps, robots, fast pages, clean URLs.
2. **Be ingestible** — AEO (Answer Engine Optimization). llms.txt, MCP server,
machine-readable API, OpenAPI spec, Markdown export, AI crawler allowlist.
3. **Be citable** — structured data so answer engines can quote you *with a URL
attached*. JSON-LD on every page.
Missing any one of the 14 pieces below creates a dead spot where a specific class
of agent can't find, can't parse, or can't cite you. This prompt covers all 14.
## How to use
1. Open Claude Code (or Cursor / Windsurf) at the root of your repo.
2. Paste the prompt block below.
3. Replace `<YOURSITE>` with your site name, `<https://yoursite.com>` with your
canonical URL, and `<ONE-LINE DESCRIPTION>` with what the site is.
4. Let it run the audit pass first, then the ship pass.
5. Read the status file it writes at the end. Do the registration steps
yourself (Search Console, Bing Webmaster, IndexNow key, smithery.ai).
## The prompt
You are helping me make this site maximally discoverable by every search
engine, every AI assistant, and every autonomous agent on the open web. The
site is <YOURSITE> (canonical URL: <https://yoursite.com>), and it is a
<ONE-LINE DESCRIPTION>.
This is a three-job brief: classic SEO (be crawlable), AEO (be ingestible),
and citation infrastructure (be citable). Do all three. Do not skip any of
the 14 pieces below unless I tell you to, and when you skip one, tell me WHY
in the final report.
## Audit first (read-only pass)
Before writing any code, inventory what already exists:
1. robots.txt / robots route
2. sitemap.xml / sitemap route
3. llms.txt and llms-full.txt at the site root
4. .well-known/ai-plugin.json and .well-known/openapi.yaml
5. Any MCP server script in the repo
6. JSON-LD structured-data helpers
7. Per-page metadata exports
8. Per-route Open Graph image generators
9. IndexNow submission script or library
10. RSS / feed route
11. oEmbed route
12. Any nginx / Caddy / Cloudflare config snapshots
13. Framework config (rendering mode, image config)
14. Topic hub and "best of" listicle routes
Report what is present, what is stale, and what is missing. Do not change
anything yet.
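The read-only pass can be sketched as a small script that only checks for the artifacts and never writes. This is a hypothetical sketch — the paths and function names are illustrative, and a real repo may route some of these dynamically rather than serving static files:

```typescript
// audit.ts — read-only inventory pass. Checks which discoverability
// artifacts exist on disk; reports, never mutates.
import { existsSync } from "node:fs";
import { join } from "node:path";

// Illustrative static-file locations; adjust to your framework's layout.
const CHECKLIST = [
  "public/robots.txt",
  "public/sitemap.xml",
  "public/llms.txt",
  "public/llms-full.txt",
  "public/.well-known/ai-plugin.json",
  "public/.well-known/openapi.yaml",
];

export function audit(root: string, paths: string[] = CHECKLIST) {
  const present: string[] = [];
  const missing: string[] = [];
  for (const p of paths) {
    (existsSync(join(root, p)) ? present : missing).push(p);
  }
  return { present, missing };
}

const { present, missing } = audit(process.cwd());
console.log(`present: ${present.length}, missing: ${missing.length}`);
```

Dynamic routes (a `/robots.txt` route handler, a generated sitemap) won't show up on disk, so treat a "missing" result as a prompt to grep the route tree before declaring the piece absent.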
## Then ship the missing pieces
For each of the 14 below that is missing or incomplete, build it. Favor the
smallest working version first; we can extend later. Follow the conventions
already in the repo (framework, TS strictness, path aliases, lint rules).
1. llms.txt — minimum viable manifest at the site root.
2. llms-full.txt — a build-time script that dumps every public page plus catalog item in plain text. Target size: as large as practical. Declare CC BY 4.0 at the top.
3. .well-known/ai-plugin.json — a ~15-line manifest pointing at the OpenAPI spec.
4. .well-known/openapi.yaml — real contract for every public endpoint.
5. MCP server — a small Node or Python server exposing 4–8 tools that proxy the public API. Do NOT hit the database directly. Ship a README snippet telling users how to install it in Claude Desktop's config.
6. Public API — JSON default, Markdown via ?format=md or Accept: text/markdown. Markdown must include a citation footer with the canonical URL and license. At minimum: list, detail, categories.
7. robots — explicit allowlist for every named AI crawler (GPTBot, ChatGPT-User, OAI-SearchBot, ClaudeBot, anthropic-ai, Claude-Web, PerplexityBot, Perplexity-User, Googlebot-Extended, Applebot-Extended, CCBot, cohere-ai, Diffbot, FacebookBot, Meta-ExternalAgent, MistralAI-User, YouBot, DuckAssistBot, Amazonbot, Bytespider). Disallow auth / admin / private. Link every sitemap explicitly.
8. Segmented sitemaps — one per surface, not one monolith. Scale priority by live signals (install count, freshness, verification). Never ship a single sitemap over 1k URLs.
9. Topic hubs and "best of" listicles — at least 10 of each. Target real search intent. Every list page emits an ItemList JSON-LD. Every one is in the sitemap at priority 0.85+.
10. JSON-LD helpers — one helper module exporting: SiteJsonLd, BreadcrumbJsonLd, ItemListJsonLd, FAQJsonLd, HowToJsonLd, SoftwareApplicationJsonLd (or ProductJsonLd), ArticleJsonLd, BlogPostingJsonLd. Wire them into every page that matches the shape. SiteJsonLd goes in the root layout and MUST emit Organization + WebSite + SearchAction.
11. Per-route OG images — one generator on every dynamic detail route (items, blog posts, creators, lists).
12. Per-page metadata — title template in the root layout, plus per-page title / description / canonical / Twitter card on every page with dynamic content.
13. IndexNow — a small library that POSTs to api.indexnow.org, called from every publish / update mutation. Never throw. Ship the public verification key file.
14. Perf pass — measure LCP / INP / CLS on the top 5 routes. If LCP > 2.5s, fix it. If the site is behind nginx, add an HTML cache layer for public SSR routes that bypasses on session cookie. If behind Cloudflare, configure cache rules and bot management to allow the AI allowlist.
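Item 1's minimum viable llms.txt follows the proposed llms.txt format: an H1, a blockquote summary, then H2 sections of annotated links. The contents here are placeholders to be filled from your own site:

```text
# <YOURSITE>

> <ONE-LINE DESCRIPTION>

## Docs
- [About](https://yoursite.com/about): what the site is and who runs it

## API
- [OpenAPI spec](https://yoursite.com/.well-known/openapi.yaml): every public endpoint
- [Markdown export](https://yoursite.com/api/items?format=md): citable plain-text catalog
```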
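Item 7's allowlist pattern, sketched for three of the named crawlers (repeat the same two-line stanza for each crawler in the list above; paths and sitemap URLs are illustrative):

```text
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /api/private/

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/sitemaps/items.xml
```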
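Item 10's helpers are plain functions that return schema.org objects for serialization into a `<script type="application/ld+json">` tag. A minimal sketch of the ItemList helper (function and type names are illustrative, not the repo's):

```typescript
// jsonld.ts — builds an ItemList JSON-LD object for a listicle page.
type ListEntry = { name: string; url: string };

export function itemListJsonLd(name: string, entries: ListEntry[]) {
  return {
    "@context": "https://schema.org",
    "@type": "ItemList",
    name,
    // schema.org positions are 1-based.
    itemListElement: entries.map((e, i) => ({
      "@type": "ListItem",
      position: i + 1,
      name: e.name,
      url: e.url,
    })),
  };
}
```

Each page then serializes the result with `JSON.stringify` into its head; the other helpers (Breadcrumb, FAQ, HowTo, and so on) follow the same shape with different `@type` values.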
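Item 13 can be sketched as a payload builder plus a fire-and-forget submit. The payload shape follows the IndexNow protocol (host, key, optional keyLocation, urlList); the function names are illustrative, and the catch-all swallow enforces the "never throw" rule:

```typescript
// indexnow.ts — ping IndexNow on publish/update. Must never throw:
// a failed indexing ping must not break the mutation that triggered it.
export function buildIndexNowPayload(host: string, key: string, urls: string[]) {
  return {
    host,
    key,
    // Default key-file location per the IndexNow convention.
    keyLocation: `https://${host}/${key}.txt`,
    urlList: urls,
  };
}

export async function submitToIndexNow(host: string, key: string, urls: string[]) {
  try {
    await fetch("https://api.indexnow.org/indexnow", {
      method: "POST",
      headers: { "Content-Type": "application/json; charset=utf-8" },
      body: JSON.stringify(buildIndexNowPayload(host, key, urls)),
    });
  } catch {
    // Swallow everything — log in a real implementation, but never rethrow.
  }
}
```

Call `submitToIndexNow` from each publish/update mutation without awaiting failures into the response path, and remember the public key file itself still has to be shipped at the root.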
## Final report format
When done, write a short status file with:
- Baseline (what was there)
- Shipped (what you built)
- Skipped (with reasons)
- Next actions for me (registration: Google Search Console, Bing Webmaster, IndexNow key, submitting sitemaps, submitting the MCP server to smithery.ai)
- Measurement plan (which queries to check in Search Console weekly, which dashboards to watch, what "success" looks like in 30/60/90 days)
## Hard rules
- Never break existing routes. If you touch metadata, verify the page still renders.
- Never include PII or private routes in llms.txt / sitemaps / JSON-LD.
- Never keyword-stuff. If a human reader can tell it was written for SEO, rewrite.
- Every cite-able page must include the license and the canonical URL in its Markdown export.
- If you hit a blocker, write it in the status file and continue with the next piece. Do not stall the whole pass on one issue.
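The citation-footer rule might render like this at the bottom of every Markdown export (URL is a placeholder):

```text
---
Source: https://yoursite.com/items/example-item
License: CC BY 4.0
```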
---
License: CC BY 4.0 — copy it, paste it, fork it, ship it. Attribution appreciated:
https://a-gnt.com/agents/skill-seo-aeo-master-playbook