
LLMS.txt: Setup Guide for WordPress, Shopify, Magento, and Custom Websites

1814 Views | August 19, 2025 | 5 Min Read

Introduction

The online world is changing at breakneck speed. People don’t just rely on search engines any more; AI-driven Large Language Models (LLMs) like ChatGPT, Gemini, and Claude are quickly becoming the way users discover, reference, and interact with information.

For forward-thinking businesses, this isn’t just another digital trend. AI brings extraordinary opportunities for exposure, but it also introduces fresh questions about who owns your content, how your brand is perceived, and how you keep your digital assets safe.

Enter LLMS.txt: your new line of defence.

At Kiwi Commerce, we help ambitious brands take control of their online presence in a landscape that’s always shifting. This guide will walk you through exactly what LLMS.txt is, why it’s essential, and step-by-step instructions for setting it up on platforms like WordPress, Shopify, Magento, and custom-built sites.

What is LLMS.txt and Why Does It Matter?

Think of LLMS.txt as the modern cousin to robots.txt. Where robots.txt tells search engine spiders which pages to crawl or ignore, LLMS.txt speaks to AI bots: the LLM-powered tools that are busy gathering and learning from web content, sometimes without you even realising.

So, what can you do with LLMS.txt?

  • Decide if AI crawlers can see all, some, or none of your website.
  • Write custom rules for individual bots, such as letting ClaudeBot in but keeping GPTBot out.
  • Shield private or sensitive parts of your site, like user accounts or checkout pages, from being scraped.
  • Make sure your content appears in AI outputs the way you intend, or is left out altogether.
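
Put together, those controls might look something like this in practice (the bot names match the crawlers mentioned in this guide; the paths are illustrative):

```txt
# Let ClaudeBot browse everything public
User-agent: ClaudeBot
Allow: /

# Keep GPTBot out entirely
User-agent: GPTBot
Disallow: /

# All other AI bots: stay out of private areas
User-agent: *
Disallow: /checkout/
Disallow: /account/
```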

Why Should Every Business Have LLMS.txt?

Let’s be honest: your website content is one of your most valuable digital assets. With LLMS.txt in place, you can…

  • Safeguard your hard work – Stop competitors, aggregators, or rogue bots from misusing your product descriptions, blog posts, and creative assets.
  • Protect your brand’s reputation – Prevent out-of-context AI summaries from misrepresenting your tone, ethics, or expertise.
  • Boost data privacy – Keep customer and sensitive business info away from prying digital eyes.
  • Stay ahead of the curve – Show your customers and partners that you take AI compliance and innovation seriously.

For us at Kiwi Commerce, adding LLMS.txt isn’t just about following best practice; it’s about owning your brand’s story in an AI-powered world.

How To Add LLMS.txt to Your Website

The nuts and bolts depend on what your site is built with. Here’s how to do it for each major system:

WordPress

  1. Log into your web hosting panel (like cPanel or Plesk), or use an FTP client.
  2. Navigate to your site’s root folder, usually called /public_html/.
  3. Create a new text file, and call it llms.txt.
  4. Add your crawler rules. Try this as a template:
User-agent: GPTBot
Disallow: /wp-admin/
Allow: /blog/
  5. Save and upload the file. Test that it’s live by visiting www.yourdomain.com/llms.txt.

Kiwi Commerce tip: If you prefer, use a plugin such as File Manager or an SEO tool that lets you edit files directly in your dashboard, no coding necessary.

Shopify

Shopify doesn’t allow direct access to your website’s root, but there is a workaround:

  1. In your Shopify admin, go to Online Store → Themes.
  2. Click Edit Code on your active theme.
  3. In the Assets folder, create a new file called llms.txt.
  4. Add your rules. For example:
User-agent: *
Disallow: /checkout/
Allow: /collections/
  5. Save, then check that the file is accessible. Note that Shopify serves theme assets from its CDN rather than your store’s root, so www.yourdomain.com/llms.txt won’t resolve out of the box (see the tip below).

Kiwi Commerce tip: Want even tighter control? We offer developer solutions to proxy the file properly, so even advanced bots get the message.

Magento

  1. Connect to your server via FTP or SSH.
  2. Head to your Magento root directory.
  3. Create llms.txt and include rules such as:
User-agent: ClaudeBot
Disallow: /customer/
Disallow: /checkout/
Allow: /products/
  4. Upload and make sure it’s live at www.yourdomain.com/llms.txt.

Kiwi Commerce tip: Always block customer and checkout pages—keep that sensitive data safe.

Custom-Coded Sites (PHP, Node.js, React, etc.)

  1. Open your site’s root directory.
  2. Create a new text file—llms.txt.
  3. Enter your rules, like:
User-agent: *
Disallow: /internal/
Allow: /articles/
  4. Deploy the file to your live web server. Double-check it loads via your browser.

Kiwi Commerce tip: Ensure your server (Apache, Nginx, etc.) is set up to serve .txt files. If you’re not sure, ask your hosting provider most can help in minutes.
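
On a custom-coded site, you might prefer to generate the file at deploy time rather than hand-editing it on the server. Here’s a minimal sketch in Python; the helper name and the rule format are illustrative, not part of any standard:

```python
def build_llms_txt(rules):
    """Render rules into llms.txt text.

    rules: list of (user_agent, disallowed_paths, allowed_paths) tuples.
    """
    lines = []
    for agent, disallows, allows in rules:
        lines.append(f"User-agent: {agent}")
        lines += [f"Disallow: {path}" for path in disallows]
        lines += [f"Allow: {path}" for path in allows]
        lines.append("")  # blank line between bot sections
    return "\n".join(lines).rstrip() + "\n"

# Generate the custom-site example from the steps above.
content = build_llms_txt([("*", ["/internal/"], ["/articles/"])])
print(content)
```

You could call this from your build or deploy script and write the result to the web root, so the rules live in version control alongside the rest of your code.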

LLMS.txt Best Practices

  • Always put the file at your site’s root for maximum effect.
  • Write clear, simple rules; avoid technical jargon or vague language.
  • Update the file as your website grows, adding or removing rules as needed.
  • Block all private or sensitive areas by default. Allow areas specifically designed for public or marketing content.
  • Remember that LLMS.txt is advice to AI bots; it’s not bulletproof security, so always use it as part of a broader digital protection plan.

Example LLMS.txt Configurations

Block All AI Bots

User-agent: *
Disallow: /

Allow Only Your Blog

User-agent: *
Disallow: /
Allow: /blog/

Block Just One AI Bot (e.g., GPTBot)

User-agent: GPTBot
Disallow: /
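
Before deploying a configuration like the ones above, you can sanity-check it the way a robots.txt-style crawler would read it. Python’s standard-library urllib.robotparser understands the same User-agent/Disallow/Allow syntax, so it works as a rough offline validator. One caveat: this parser applies the first matching rule, so the Allow line is placed before the blanket Disallow here; real AI crawlers may use different (e.g. longest-match) semantics.

```python
from urllib.robotparser import RobotFileParser

# The "Allow Only Your Blog" example, with Allow listed first so
# Python's first-match rule evaluation lets blog pages through.
rules = """\
User-agent: *
Allow: /blog/
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Blog pages are fetchable; everything else is blocked.
print(parser.can_fetch("GPTBot", "https://www.yourdomain.com/blog/post"))
print(parser.can_fetch("GPTBot", "https://www.yourdomain.com/checkout/"))
```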

Final Thoughts

AI is reshaping how people discover brands and products. With LLMS.txt, you call the shots, deciding exactly which areas of your website should be open to AI crawlers and what should remain off-limits.

Here at Kiwi Commerce, we’re experts at integrating LLMS.txt across WordPress, Shopify, Magento, and custom sites. Whether you want your website invisible to LLMs or you’d like to showcase certain pages for digital visibility, we make sure both your content and your brand are protected.

Future-proof your website today. Contact KiwiCommerce to get started with LLMS.txt and secure your business for tomorrow’s digital challenges.

FAQs


How to get indexed by LLM through an LLMS.txt file?

Getting your website indexed by AI tools or Large Language Models (LLMs) starts with allowing them to access the right parts of your website. An LLMS.txt file helps you control which sections of your site AI bots can crawl and learn from.

To improve the chances of your content being referenced by LLMs:

  • Create an llms.txt file in the root directory of your website.

  • Allow AI bots to access important content such as blogs, guides, and knowledge pages.

  • Avoid blocking useful informational pages that could provide value in AI-generated answers.

  • Keep the file updated as your website grows and new content is added.

  • Ensure your website content is well-structured, accurate, and helpful, as LLMs prioritise high-quality information.

While LLMS.txt does not guarantee indexing, it helps guide AI crawlers on what they are permitted to access, which can improve your chances of being referenced.

How can I check whether a website has an LLMS.txt file?

You can check whether a website has an LLMS.txt file by visiting the file directly through a web browser.

Simply type the following format into your browser:

https://www.website.com/llms.txt

If the website has implemented the file, it will display a list of rules and permissions for AI crawlers. These rules may include instructions such as which sections of the site are allowed or blocked from AI bots.

If the page returns a 404 error or blank page, it likely means the website has not created an LLMS.txt file yet.
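
That check is easy to script. Here’s a minimal sketch using Python’s standard library; the helper names are illustrative, and the status-code interpretation follows the FAQ answer above:

```python
from urllib.request import urlopen
from urllib.error import HTTPError

def describe_llms_status(status):
    """Map an HTTP status code to what it suggests about llms.txt."""
    if status == 200:
        return "llms.txt found"
    if status == 404:
        return "no llms.txt file yet"
    return "inconclusive (check manually)"

def check_site(domain):
    """Fetch https://<domain>/llms.txt and report what we find."""
    try:
        with urlopen(f"https://{domain}/llms.txt", timeout=10) as resp:
            return describe_llms_status(resp.status)
    except HTTPError as err:
        return describe_llms_status(err.code)

# e.g. check_site("www.yourdomain.com") -- requires network access
print(describe_llms_status(200))
```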

Does an LLMS.txt file improve SEO?

An LLMS.txt file does not directly improve traditional SEO rankings, but it can support your wider digital visibility strategy.

Here is how it can still benefit your online presence:

  • Helps control how AI platforms access and interpret your website content

  • Protects sensitive pages such as checkout, admin areas, or customer accounts

  • Ensures AI tools reference the right pages instead of outdated or private content

  • Prevents content scraping from certain AI bots if required

  • Supports AI search visibility, which is becoming increasingly important as users rely on AI assistants

As AI-driven discovery continues to grow, managing AI crawlers responsibly can help protect your brand and maintain content integrity.

Can LLMS.txt be customised for specific AI bots?

Yes, the LLMS.txt file is fully customisable. You can create rules for specific AI bots or all bots at once.

This allows your blog content to be accessed while blocking sensitive areas such as checkout pages.

Businesses often use these rules to balance visibility and protection, ensuring useful content is accessible while keeping private sections secure.

Is an LLMS.txt file mandatory?

No, LLMS.txt is not mandatory, and many websites still operate without it. However, as AI crawlers become more common, it is quickly becoming a recommended best practice.

Adding the file allows businesses to:

  • Take control of how AI tools interact with their content

  • Protect sensitive data from being scraped

  • Manage brand representation in AI-generated responses

  • Prepare for the evolving AI-driven web ecosystem

For businesses that rely heavily on content, implementing LLMS.txt is a simple yet effective way to future-proof their digital presence.

Need Help?

If this guide helped you, imagine what our team can do for your business. Let’s build something powerful together.

Contact Us