Quickstart Guide
Get your website optimized for AI search engines in under 10 minutes. This guide walks you through the essential steps to make your content discoverable and citable by AI systems.
This quickstart covers the fundamentals. For comprehensive implementation, explore our detailed guides on each topic.
Step 1: Add llms.txt
Create an llms.txt file in your website's root directory. This emerging standard helps AI systems understand your site:
# llms.txt
site_name: Your Site Name
site_url: https://yoursite.com
site_description: A brief description of what your site offers
# Permissions
llm_inference: allow
llm_training: allow
rag_usage: allow
# Key entry points
documentation_root: https://yoursite.com/docs
api_reference: https://yoursite.com/api
# Topics covered
topics:
- Topic 1
- Topic 2
- Topic 3
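To confirm the file is actually being served, fetch it from your site root. Here is a minimal sketch using Python's standard library (https://yoursite.com is a placeholder for your own domain):

# check_llms_txt.py - minimal availability check (sketch)
import urllib.request

SITE = "https://yoursite.com"  # placeholder: replace with your own domain

url = SITE.rstrip("/") + "/llms.txt"
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read().decode("utf-8", errors="replace")
        print(f"OK: {url} returned HTTP {resp.status} ({len(body)} bytes)")
except Exception as exc:
    print(f"Could not fetch {url}: {exc}")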
Step 2: Implement Basic Schema.org
Add JSON-LD structured data to your pages. At a minimum, add WebSite and Organization schemas:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "WebSite",
"name": "Your Site Name",
"url": "https://yoursite.com",
"description": "Your site description"
}
</script>
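A minimal Organization schema might look like the following (the logo URL is a placeholder; point it at your own asset):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Your Site Name",
  "url": "https://yoursite.com",
  "logo": "https://yoursite.com/logo.png"
}
</script>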
For articles and blog posts:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "Article",
"headline": "Article Title",
"description": "Article description",
"author": {
"@type": "Person",
"name": "Author Name"
},
"datePublished": "2026-01-05",
"publisher": {
"@type": "Organization",
"name": "Your Site Name"
}
}
</script>
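A malformed JSON-LD block may simply be ignored by parsers, so it's worth checking that every block on a page parses. A rough sketch in Python (the page URL is a placeholder):

# validate_jsonld.py - rough sketch: extract JSON-LD blocks and check they parse
import json
import re
import urllib.request

PAGE = "https://yoursite.com/"  # placeholder: page to check

html = urllib.request.urlopen(PAGE, timeout=10).read().decode("utf-8", errors="replace")
blocks = re.findall(
    r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
    html,
    re.DOTALL | re.IGNORECASE,
)
for i, block in enumerate(blocks, start=1):
    try:
        data = json.loads(block)
        label = data.get("@type", "unknown") if isinstance(data, dict) else type(data).__name__
        print(f"Block {i}: valid JSON ({label})")
    except json.JSONDecodeError as exc:
        print(f"Block {i}: invalid JSON ({exc})")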
Step 3: Configure robots.txt for AI Crawlers
Ensure AI crawlers can access your content. Update your robots.txt:
# Allow all AI crawlers
User-agent: GPTBot
Allow: /
User-agent: ChatGPT-User
Allow: /
User-agent: Google-Extended
Allow: /
User-agent: PerplexityBot
Allow: /
User-agent: ClaudeBot
Allow: /
# General allow
User-agent: *
Allow: /
# Sitemap
Sitemap: https://yoursite.com/sitemap.xml
You can selectively block AI crawlers if you don't want your content used for training. See our robots guide for details.
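For example, blocking a single crawler (here GPTBot) while leaving the rest of your robots.txt unchanged looks like this:

User-agent: GPTBot
Disallow: /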
Step 4: Structure Your Content
AI systems understand well-structured content better. Follow these principles:
Use Semantic HTML
<article>
<header>
<h1>Main Title</h1>
<p class="lead">Introduction paragraph...</p>
</header>
<section>
<h2>Section Title</h2>
<p>Content...</p>
</section>
<section>
<h2>Another Section</h2>
<p>More content...</p>
</section>
</article>
Clear Heading Hierarchy
- One <h1> per page (the main topic)
- <h2> for major sections
- <h3> for subsections
- Don't skip levels (h1 → h3)
Answer Questions Directly
Structure content to directly answer common questions:
## What is [Topic]?
[Topic] is [clear, concise definition]. It's used for [primary use case].
## How does [Topic] work?
[Topic] works by [step-by-step explanation]:
1. First step
2. Second step
3. Third step
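If a page is built around questions and answers, you can optionally mirror that structure in your Schema.org markup as well. A minimal FAQPage sketch, with placeholder question and answer text:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is [Topic]?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "[Topic] is [clear, concise definition]."
      }
    }
  ]
}
</script>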
Step 5: Add Metadata
Ensure every page has complete metadata:
<head>
<title>Page Title | Site Name</title>
<meta name="description" content="Clear, informative description of the page content">
<!-- Open Graph -->
<meta property="og:title" content="Page Title">
<meta property="og:description" content="Description">
<meta property="og:type" content="article">
<!-- Author info (important for E-E-A-T) -->
<meta name="author" content="Author Name">
</head>
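Depending on the page, a few other tags are often worth including; the URLs below are placeholders to adapt:

<link rel="canonical" href="https://yoursite.com/page-url">
<meta property="og:url" content="https://yoursite.com/page-url">
<meta property="og:image" content="https://yoursite.com/image.png">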
Verification Checklist
Use this checklist to verify your AEO implementation:
- llms.txt file exists at site root
- JSON-LD structured data on key pages
- robots.txt allows AI crawlers
- Semantic HTML structure throughout
- Clear heading hierarchy
- Complete meta descriptions
- Author attribution on articles
- Sitemap.xml is current and accessible
Test Your Implementation
Use our Chrome extension to analyze your AEO readiness.
Next Steps
Now that you have the basics in place:
- Learn about AI Search Engines — Understand how different AI systems work
- Content Structure Deep Dive — Advanced structuring techniques
- Structured Data Guide — Comprehensive Schema.org implementation
- Best Practices — Proven strategies for AEO success