AEO Validator
Check your website's Answer Engine Optimization implementation against best practices. Our validator analyzes structured data, llms.txt configuration, and content structure.
The online validator is coming soon. In the meantime, use the manual validation steps below.
What We Validate
Structured Data
- Schema.org markup presence and validity
- JSON-LD syntax correctness
- Required fields completeness
- Type appropriateness
llms.txt
- File accessibility
- Field completeness
- Syntax validity
- Best practice compliance
robots.txt
- AI crawler permissions
- Sitemap reference
- Syntax correctness
Content Structure
- Heading hierarchy
- Semantic HTML usage
- Meta tag completeness
- Authority signals
Manual Validation Guide
While our automated validator is in development, use these tools:
Step 1: Validate Structured Data
Google Rich Results Test
- Go to search.google.com/test/rich-results
- Enter your page URL
- Review detected schemas
- Fix any errors or warnings
Schema.org Validator
- Go to validator.schema.org
- Paste your JSON-LD or enter URL
- Review validation results
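Before pasting anything into either tool, you can list the JSON-LD blocks a page actually ships. This is a quick browser-console sketch (not a replacement for the validators above) that surfaces syntax errors and missing @type values:
// Run in the browser console on the page you are testing
document.querySelectorAll('script[type="application/ld+json"]').forEach((script, i) => {
  try {
    const data = JSON.parse(script.textContent);
    // A page may emit one object or an array of schema objects
    const items = Array.isArray(data) ? data : [data];
    items.forEach((item) => console.log(`Block ${i}:`, item['@type'] || '(missing @type)'));
  } catch (err) {
    console.error(`Block ${i}: JSON-LD syntax error -`, err.message);
  }
});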
Step 2: Check llms.txt
Manual Check
# Check if llms.txt exists
curl -I https://yoursite.com/llms.txt
# View contents
curl https://yoursite.com/llms.txt
Checklist:
- File returns 200 status
- Content-Type is text/plain
- site_name field present
- site_url field present
- site_description field present
- Permission fields specified
- Entry points listed
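To automate this checklist, a short Node.js sketch along these lines can help. It assumes Node 18+ (for the built-in fetch), uses yoursite.com as a placeholder, and checks for the field names listed above with a simple substring match rather than a real parser:
// check-llms.mjs — run with: node check-llms.mjs
const SITE = 'https://yoursite.com'; // placeholder: use your own domain

const res = await fetch(`${SITE}/llms.txt`);
console.log('Status:', res.status); // expect 200
console.log('Content-Type:', res.headers.get('content-type')); // expect text/plain

const body = await res.text();
// Substring check only — adjust if your llms.txt uses a different field format
for (const field of ['site_name', 'site_url', 'site_description']) {
  console.log(field, body.includes(field) ? 'present' : 'MISSING');
}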
Step 3: Verify robots.txt
curl https://yoursite.com/robots.txt
Checklist:
- GPTBot is allowed (or explicitly configured)
- Google-Extended is allowed
- PerplexityBot is allowed
- ClaudeBot is allowed
- Sitemap URL is listed
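The same checklist can be scripted. A rough Node.js sketch (Node 18+); it only checks whether each crawler is mentioned and whether a Sitemap line exists, and does not evaluate Allow/Disallow rules:
// check-robots.mjs — run with: node check-robots.mjs
const body = await (await fetch('https://yoursite.com/robots.txt')).text();

// Mention check only; unlisted crawlers inherit the User-agent: * rules
for (const bot of ['GPTBot', 'Google-Extended', 'PerplexityBot', 'ClaudeBot']) {
  console.log(bot, body.includes(bot) ? 'explicitly configured' : 'not mentioned (inherits User-agent: * rules)');
}
console.log('Sitemap listed:', /^sitemap:/im.test(body));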
Step 4: Analyze Content Structure
Using DevTools:
- Open page in Chrome
- Press F12 for DevTools
- Go to Elements tab
- Search for headings: h1, h2, h3
Checklist:
- Single H1 per page
- Logical heading hierarchy (no skipped levels)
- Descriptive heading text
- <article> wrapping main content
- <section> for major divisions
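To cover the heading items without clicking through the Elements tab, a console sketch like this lists every heading, counts H1s, and warns about skipped levels:
// Run in the browser console
const headings = [...document.querySelectorAll('h1, h2, h3, h4, h5, h6')];
console.log('H1 count:', document.querySelectorAll('h1').length); // expect exactly 1

let previous = 0;
for (const h of headings) {
  const level = Number(h.tagName[1]);
  if (previous && level > previous + 1) {
    console.warn('Skipped heading level before:', h.textContent.trim());
  }
  console.log(h.tagName + ':', h.textContent.trim());
  previous = level;
}
console.log('<article> present:', !!document.querySelector('article'));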
Step 5: Check Meta Tags
Using DevTools:
// Run in browser console
console.log('Title:', document.title);
console.log('Description:', document.querySelector('meta[name="description"]')?.content);
console.log('Author:', document.querySelector('meta[name="author"]')?.content);
console.log('OG Title:', document.querySelector('meta[property="og:title"]')?.content);
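To turn that log into a quick pass/fail check, a small extension of the same console snippet (the selector list only covers the tags used in this guide) warns about anything missing:
// Warn about any tag from the checks above that is missing entirely
const requiredSelectors = [
  'meta[name="description"]',
  'meta[name="author"]',
  'meta[property="og:title"]',
];
requiredSelectors.forEach((selector) => {
  if (!document.querySelector(selector)) console.warn('Missing:', selector);
});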
Validation Checklist
Use this comprehensive checklist for manual validation:
Essential (Required)
- Page has valid HTML structure
- Single H1 heading present
- Meta description exists
- robots.txt allows AI crawlers
- Sitemap.xml exists and is valid
Recommended
- llms.txt file at site root
- Schema.org Article/WebPage markup
- Author information present
- Publication date included
- Open Graph meta tags
Advanced
- FAQ schema for Q&A content (see the example after this list)
- HowTo schema for tutorials
- Organization schema on homepage
- BreadcrumbList for navigation
- Inter-page entity references
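For the FAQ item above, a minimal FAQPage example (the question and answer text are placeholders) looks like this:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is Answer Engine Optimization?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Answer Engine Optimization (AEO) is the practice of structuring content so AI answer engines can find and cite it."
      }
    }
  ]
}
</script>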
Common Issues
"No structured data detected"
Cause: JSON-LD script not present or malformed
Fix:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "WebPage",
"name": "Page Title",
"description": "Page description"
}
</script>
"llms.txt not found"
Cause: File missing or wrong location
Fix: Create llms.txt in your site's root directory (the public folder for a Next.js project) so it is served at https://yoursite.com/llms.txt
"AI crawlers blocked"
Cause: robots.txt blocking AI user agents
Fix:
User-agent: GPTBot
Allow: /
User-agent: ClaudeBot
Allow: /
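The checklist in Step 3 also mentions Google-Extended, PerplexityBot, and a Sitemap reference. The same pattern extends to those; the sitemap URL below is a placeholder, so substitute your actual sitemap location:
User-agent: Google-Extended
Allow: /

User-agent: PerplexityBot
Allow: /

Sitemap: https://yoursite.com/sitemap.xml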