7 min read · Velox Digital Team

What Is llms.txt and Do Local Businesses Need It?

llms.txt · AEO · AI search · small business · schema

The Short Version

llms.txt is a plain-text file at the root of your domain (yourdomain.com/llms.txt) that gives AI engines a curated, human-readable summary of your most important pages. Think of it as a "here is what this site is about, in 30 lines" hint for AI search engines.

It is not a replacement for sitemap.xml or robots.txt. It does not unlock any specific ranking outcome. It is a low-cost, low-risk addition to a site, and it is becoming part of the AEO foundation the same way sitemap.xml became standard for traditional SEO ten years ago.

For most local businesses in 2026: ship one. The cost is fifteen minutes; the downside is zero.

What llms.txt Actually Is

The format was proposed in 2024 by llmstxt.org as a structured way to give AI engines context they cannot reliably extract from raw HTML. The spec is open and intentionally simple:

  • An H1 line with the site or business name
  • A blockquote with a one-sentence summary
  • One or more H2 sections grouping related links
  • Each link as a markdown bullet with the URL and a short description

A minimal example looks like:

# Acme Plumbing

> Licensed master plumber serving Brooklyn, Queens, and Manhattan since 2008.

## Core pages
- Home: https://acmeplumbingnyc.com/
- Services: https://acmeplumbingnyc.com/services
- Service area: https://acmeplumbingnyc.com/service-area
- Pricing: https://acmeplumbingnyc.com/pricing
- Contact: https://acmeplumbingnyc.com/contact

## Frequently asked
- How much does emergency plumbing cost?: https://acmeplumbingnyc.com/faq#cost
- What areas do you serve?: https://acmeplumbingnyc.com/faq#service-area

## Contact
Email: hello@acmeplumbingnyc.com
Phone: (718) 555-0123

That is the whole spec. No special tooling, no required JSON, no complex schema. Plain markdown that humans and AI engines can both read.

For a working real example, see our own at veloxenterprises.com/llms.txt. Notice that we curate to the highest-value 15 to 20 URLs rather than mirroring the full sitemap. That curation is the point.
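Because the format is just structured markdown, it is easy to work with programmatically. The sketch below parses an llms.txt body into its name, summary, and link sections; `parse_llms_txt` is our own illustrative helper, not part of any official tooling, and it assumes the simple "Label: URL" bullet style used in the example above.

```python
import re


def parse_llms_txt(text: str) -> dict:
    """Parse an llms.txt body into name, summary, and link sections.

    A minimal sketch of the llmstxt.org structure: one H1 with the
    business name, one blockquote summary, and H2 sections containing
    "- Label: URL" bullets.
    """
    result = {"name": None, "summary": None, "sections": {}}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("# ") and result["name"] is None:
            result["name"] = line[2:].strip()
        elif line.startswith("> ") and result["summary"] is None:
            result["summary"] = line[2:].strip()
        elif line.startswith("## "):
            current = line[3:].strip()
            result["sections"][current] = []
        elif line.startswith("- ") and current is not None:
            # Split a "- Label: URL" bullet at the first colon before the URL.
            m = re.match(r"- (.*?):\s*(https?://\S+)", line)
            if m:
                result["sections"][current].append(
                    {"label": m.group(1), "url": m.group(2)}
                )
    return result
```

Running this over the Acme Plumbing example yields two sections with five and two links respectively, which is exactly the level of curation the spec is after.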

How AI Engines Use It

Adoption is still emerging. As of mid-2026, the engines that actively read llms.txt during their crawl include:

  • Perplexity Sonar (uses it to inform citation selection on local business queries)
  • Anthropic Claude (cites llms.txt-listed URLs preferentially when relevant)
  • OpenAI GPT-4o with browsing (consults llms.txt when present)

Engines that do not yet specifically read llms.txt include:

  • Google Gemini (relies on sitemap.xml + Google's broader knowledge graph)
  • xAI Grok (general web crawl, no llms.txt-specific behavior reported)

The list will shift. The pattern across emerging AI search standards has consistently been: a small number of engines adopt early; a critical mass adopts within 12 to 18 months; the standard becomes table-stakes within 24 months. Robots.txt followed this pattern in 1994; sitemap.xml followed it in 2005. llms.txt is somewhere between the early-adoption and critical-mass phase.

The trade-off of shipping one today is asymmetric: the cost is minimal, and the engines that do read the file favor the businesses that ship one.

What llms.txt Is Not

A few clarifications that come up regularly:

  • It does not replace robots.txt. Robots.txt controls crawl access (which bots can fetch which paths). llms.txt provides curated context for the bots that are allowed in. Both files coexist; they answer different questions.
  • It does not replace sitemap.xml. Sitemap.xml lists every indexable URL on your site (often hundreds). llms.txt curates the most important 15 to 30 (intentionally not exhaustive). Engines can use both.
  • It does not unlock a ranking guarantee. No file you ship guarantees AI citations or Google rankings. llms.txt makes citation more likely on engines that read it; it does not make citation certain.
  • It is not Schema.org or JSON-LD. Schema is structured data inside HTML pages. llms.txt is plain markdown at a fixed file path. They serve different purposes; serious AEO work uses both.

Do Local Businesses Need One?

It depends on your business type. Three rough buckets:

Local services businesses (plumbers, dentists, salons, restaurants, contractors). Marginal benefit, low cost. Most of your AI search citation work is on-site schema, GBP completeness, and entity-graph signals (Wikidata, vertical directories). llms.txt is a 15-minute add that complements those moves; not the highest-leverage thing on the list, but no reason to skip.

Content-heavy local businesses (educational sites, knowledge bases, multi-location franchises with deep service catalogs). Higher benefit, same cost. llms.txt curates which pages an AI engine reads first; if your site has 50+ pages, the curation function alone earns the file.

Pure e-commerce with deep catalogs. Variable benefit. If your category pages and top SKUs are clearly hierarchical, llms.txt helps engines surface the right entry points. If your catalog churns weekly, the maintenance cost of keeping llms.txt current may exceed the benefit.

For most local businesses in our experience: ship one, do not over-think it, do not refresh it more than quarterly.

How to Write One

The process for a typical local business takes 15 to 30 minutes:

  1. List your top 15 to 25 URLs. Home, top three to five service pages, FAQ page, pricing page, contact page, top blog posts (if any), about page. Skip the legal pages, archive pages, and thin category pages.
  2. Write a one-sentence summary for the blockquote at the top. This should answer "what is this business" in language a human would actually use.
  3. Write a one-line description for each URL. Short. The descriptions are for context, not SEO copy.
  4. Group the URLs under H2 sections that match how a buyer thinks (Core pages, Services, FAQ, Contact). The structure is for AI engines but it should also read coherently to a human skimming the file.
  5. Save it as llms.txt and upload to your domain root. WordPress sites can drop it in the public folder; Wix and Squarespace let you upload via the file manager; custom-coded sites add it to the static assets directory.
  6. Validate by visiting yourdomain.com/llms.txt in a browser. It should render as plain text. If the server returns 404 or HTML, the upload location is wrong.

That is the whole process. There is no validator-as-a-service for llms.txt yet; visual inspection plus the llmstxt.org spec is what you have.
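Since no hosted validator exists, you can script the same spot-checks yourself. The sketch below mirrors step 6: `check_llms_txt` is a hypothetical helper that takes the status code, content type, and body you got back from fetching yourdomain.com/llms.txt (with requests, curl, or whatever you prefer) and reports anything that looks off.

```python
def check_llms_txt(status: int, content_type: str, body: str) -> list:
    """Return a list of problems with a fetched llms.txt; empty means it looks fine.

    A rough self-check, not an official validator: it covers the basics
    only (served as text, has an H1, a blockquote summary, and at least
    one link bullet).
    """
    problems = []
    if status != 200:
        problems.append("expected HTTP 200, got %d" % status)
    if "html" in content_type.lower():
        problems.append("served as HTML; the upload location is probably wrong")
    lines = [line.strip() for line in body.splitlines() if line.strip()]
    if not any(line.startswith("# ") for line in lines):
        problems.append("missing H1 line with the business name")
    if not any(line.startswith("> ") for line in lines):
        problems.append("missing blockquote summary")
    if not any(line.startswith("- ") and "http" in line for line in lines):
        problems.append("no link bullets found")
    return problems
```

An empty list back from a 200 text/plain response is the same green light as the visual inspection described above.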

Common Mistakes

A few patterns that show up when local businesses ship llms.txt without thinking it through:

  • Mirroring the full sitemap. Sitemap.xml is supposed to be exhaustive. llms.txt is supposed to be curated. If your llms.txt has 100 URLs, you have copied the wrong file.
  • Listing redirected or 404'd URLs. AI engines verify llms.txt entries on first read; broken links erode trust in the entire file.
  • Stuffing keywords in the descriptions. The descriptions are context, not optimization targets. Plain language wins.
  • Forgetting to update after a rebrand or restructure. Stale llms.txt entries pointing at old URLs hurt more than no file at all. Quarterly review at minimum.
  • Skipping the file because "AI search is overhyped." The file takes 15 minutes. The opportunity cost of skipping it is low; the upside if engine adoption accelerates is meaningful.
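The broken-link mistake above is the easiest one to automate away. The sketch below pulls every URL out of an llms.txt body and flags any that do not answer with HTTP 200; `find_broken_links` is our own illustrative helper, and a production version would also follow redirect chains and treat 3xx entries as "update me" rather than hard failures.

```python
import re
import urllib.request

URL_RE = re.compile(r"https?://[^\s)]+")


def extract_urls(llms_txt: str) -> list:
    """Pull every URL out of an llms.txt body, in document order."""
    return URL_RE.findall(llms_txt)


def find_broken_links(llms_txt: str, timeout: float = 10.0) -> list:
    """Return the listed URLs that fail to answer with HTTP 200.

    Uses HEAD requests to avoid downloading page bodies; any network
    error or non-200 status marks the URL as broken.
    """
    broken = []
    for url in extract_urls(llms_txt):
        try:
            req = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                if resp.status != 200:
                    broken.append(url)
        except Exception:
            broken.append(url)
    return broken
```

Running this as part of the quarterly review mentioned above catches stale entries before an AI engine does.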

How llms.txt Fits with the Rest of AEO

If you have read the How to Get AI to Recommend Your Local Business post, llms.txt is move 4 of the five core moves. It is not the highest-leverage one (schema and answer-first content are higher), but it is the cheapest to ship and the easiest to maintain.

In the 12-item AI search visibility checklist, it is item 5. A clean audit reads green on llms.txt within the first hour of the audit; a red mark there is fixable in a single sitting.

What If You Want Someone Else to Ship This

The 15-minute version above is practical for most local businesses. If you would rather have an experienced operator audit your full AI search visibility setup and ship llms.txt as part of a broader 30-day engagement, the $199 AI Search Visibility Sprint covers it alongside schema, FAQ, content, and off-site fixes.


No ranking or citation guarantees. AI engines change citation behavior frequently and without notice. We sell implementation work and measurement, not specific outcomes.

Want to see where your website stands?

Run a free site audit