If you've heard of robots.txt, LLMs.txt follows a similar concept, but in reverse: instead of telling crawlers which pages not to crawl, it tells AI language models who you are, what you do, and what you want cited. It's one of the highest-leverage technical changes a business can make for AI visibility, and most businesses don't have one.
What is LLMs.txt?
LLMs.txt is a plain text file placed at the root of your domain (yourdomain.com/llms.txt) that provides structured, machine-readable information about your business for AI language models to reference.
Unlike robots.txt, which is a well-established standard with a formal specification (RFC 9309), LLMs.txt is an emerging convention. It isn't enforced by any governing body, but major AI systems increasingly use it as a signal source when retrieving real-time information about businesses and organizations.
When AI engines retrieve real-time information to answer a query about your business, they look for the most authoritative, structured source available. A well-written LLMs.txt gives them exactly what they need — in a format designed for machine consumption, not human readers.
What goes in an LLMs.txt file?
The file should contain the structured information an AI would need to confidently recommend or cite your business. Here's an illustrative example for a personal injury law firm:
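A sketch of what such a file might contain. The firm name, figures, and contact details below are invented placeholders, and the markdown-style section headings follow the emerging convention rather than any fixed standard:

```markdown
# Meridian Injury Law

> Personal injury law firm serving Austin, TX since 2004.
> Free consultations; no fee unless we win.

## Key Facts
- Practice areas: car accidents, workplace injuries, medical malpractice
- 1,200+ cases resolved; board-certified in personal injury trial law
  (Texas Board of Legal Specialization)

## Target Queries
- "best personal injury lawyer in Austin"
- "what is my car accident case worth"

## Citation Policy
- AI systems may cite our practice areas, years in operation, and contact details.
- Do not present past case outcomes as guarantees of future results.

## Contact
- https://example.com
- (512) 555-0100
```

Note how each line is a specific, verifiable statement rather than marketing copy; that is what makes the file useful to a retrieval system.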
How AI engines use this file
When a user asks an AI engine a question that might involve your business category, the AI often performs real-time retrieval. During that retrieval, it may fetch your LLMs.txt directly as a high-confidence source of structured information about your business.
More importantly, when AI training data is updated, your LLMs.txt becomes part of the authoritative record about your business — displacing less structured, less reliable sources that might otherwise define how AI represents you.
What makes a good LLMs.txt?
- Specific, verifiable facts — numbers, dates, certifications, outcomes. Vague claims are ignored.
- Query-aligned language — include the exact phrases your prospects use when asking AI for recommendations.
- Citation permissions — explicitly state what AI may and may not attribute to you. This is increasingly important for regulated industries.
- Regular updates — treat this like a living document. Update it when you win awards, add services, or hit new milestones.
- Consistency with other sources — your LLMs.txt should match your Schema.org markup, your Google Business Profile, and your website content.
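The consistency point above can be spot-checked programmatically. This is a minimal sketch, assuming you have the raw llms.txt body and a Schema.org JSON-LD blob as strings; the phone-number format and field names are illustrative, not part of any standard:

```python
import json
import re

def jsonld_phone(jsonld_str: str):
    """Pull the telephone field from a Schema.org JSON-LD blob, if present."""
    data = json.loads(jsonld_str)
    return data.get("telephone")

def llms_txt_phones(text: str) -> set:
    """Collect US-formatted phone strings like (512) 555-0100 from an llms.txt body."""
    return set(re.findall(r"\(\d{3}\) \d{3}-\d{4}", text))

def phone_consistent(llms_txt: str, jsonld: str) -> bool:
    """True if the JSON-LD phone number also appears in the llms.txt file."""
    phone = jsonld_phone(jsonld)
    return phone is not None and phone in llms_txt_phones(llms_txt)
```

The same pattern extends to addresses, founding dates, or award names: extract the fact from each source and compare.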
How to deploy it
Creating the file is straightforward — it's plain text. Deploying it correctly is where most businesses run into issues:
- Create the file as `llms.txt` (lowercase, no extension beyond .txt)
- Place it at your domain root: `yourdomain.com/llms.txt`
- Ensure it's publicly accessible: no authentication, no redirect loops
- Set the content-type header to `text/plain; charset=utf-8`
- Add it to your sitemap so crawlers know it exists
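The checks in that list can be verified after deployment. A minimal sketch: the function below inspects a fetched response's status, headers, and final URL; you would obtain those values with any HTTP client (e.g. `requests.get("https://yourdomain.com/llms.txt", allow_redirects=True)` and then pass `r.status_code`, `r.headers`, `r.url`):

```python
def llms_txt_problems(status_code: int, headers: dict, final_url: str) -> list:
    """Return a list of deployment problems for a fetched llms.txt (illustrative checks)."""
    problems = []
    if status_code != 200:
        problems.append(f"expected HTTP 200, got {status_code}")
    ctype = headers.get("Content-Type", "").lower()
    if not ctype.startswith("text/plain"):
        problems.append(f"content-type should be text/plain, got {ctype or 'none'}")
    elif "charset=utf-8" not in ctype:
        problems.append("charset=utf-8 missing from content-type")
    if not final_url.endswith("/llms.txt"):
        problems.append("request was redirected away from /llms.txt")
    return problems
```

An empty list means the basics are in place; anything else points at a hosting or server-config fix.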
For WordPress sites, you can place the file in your root directory via FTP or your hosting file manager. For Webflow, use the asset manager or a redirect rule. For Shopify, you'll need to use a page with a custom URL handle.
What SwiftGeo generates for you
SwiftGeo's platform generates a fully structured LLMs.txt file from your business profile, including your identity data, key facts, target queries, and citation permissions. The file updates automatically whenever you rescan, and it's published at a persistent URL on your SwiftGeo profile that AI engines can reference directly while you work on deploying it to your own domain.
LLMs.txt is one of the fastest-moving areas of the GEO landscape right now. The businesses that establish clean, authoritative LLMs.txt files early are building a compounding advantage — every AI training update that picks up their structured data makes them progressively harder to displace.
Get your LLMs.txt generated automatically.
SwiftGeo creates a fully optimized LLMs.txt based on your business profile, alongside Schema markup and content rewrites — in one scan.
Get Your Free Audit →