So, non-human visitors are not exactly new – bots have been crawling the web since the 90s. But search is evolving, and website owners now need to start preparing for AI agents visiting their sites too.
Don’t believe me? Jason Mayes, Web AI Lead at Google, recently spoke at WordCamp Europe, saying: “We’re now entering the age where AI agents are growing in popularity”. Hostinger, one of the best web hosting providers, believes it too and has recently launched a new tool to automatically create an LLMs.txt file for WordPress sites.
AI agents are powered by Large Language Models (LLMs) such as Google’s Gemini, Anthropic’s Claude, and OpenAI’s ChatGPT. These models can seemingly read and understand text like a human, but there are things we can do to help them understand websites better. Just as robots.txt and sitemap.xml help traditional search engines navigate and understand websites, LLMs.txt serves the same purpose for AI agents.
To learn more about AI agents, LLMs, and how they’re shaping search, I spoke to Saulius Lazaravičius, VP of Product at Hostinger, about the new tool for creating LLMs.txt files.

Traditionally, people find information online through search engines like Google, which rely on robots.txt and sitemap.xml files to navigate and index content. But with the rise of AI tools like ChatGPT and Claude, more users now get answers directly from large language models (LLMs), bypassing traditional search.
That’s where the LLMs.txt file comes in. It serves as a map for AI systems, helping them identify and understand the most important parts of a website. An LLMs.txt file provides:
- A clear, prioritized list of the site’s key pages
- Concise summaries of page content
- Links to more detailed, authoritative resources
Placed alongside robots.txt and sitemap.xml, the LLMs.txt file improves how AI engines interpret complex site structures – potentially increasing a site’s visibility in AI-generated answers.
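To make the three ingredients above concrete, here is a minimal sketch of what an LLMs.txt file can look like, following the proposed llms.txt convention (an H1 title, a short blockquote summary, then prioritized links with brief descriptions). The site name and URLs are placeholders, not taken from any real site:

```markdown
# Example Site

> A short summary of what the site offers and who it is for.

## Key pages

- [Pricing](https://example.com/pricing): Current plans and costs
- [Docs](https://example.com/docs): Product documentation and setup guides
- [Blog](https://example.com/blog): In-depth articles and announcements
```

Because the file is plain Markdown, it stays readable for humans while remaining easy for an LLM to parse.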
Is there currently any data that shows the benefit of having LLMs.txt alongside robots.txt and a sitemap.xml?
Currently, adoption of LLMs.txt is still in its infancy, with fewer than 1% of the top one million websites using it as of early 2025. However, the share of traffic coming from AI platforms is constantly growing. For example, usage of AI-driven search among adults in the US is projected to more than double by 2028.
While hard data on LLMs.txt effectiveness is still emerging, the broader concept of “SEO for AI” – also known as generative engine optimization (GEO) – is gaining traction. Website owners are increasingly looking for ways to make their content more accessible and relevant to AI systems. LLMs.txt is an early, proactive step in that direction.
What makes a good LLMs.txt file, and how do you achieve this through one click?
A well-structured LLMs.txt file is clean, simple, and focused on surfacing a site’s most valuable content for AI systems. It typically begins with the website’s main address, followed by selected pages that AI models should prioritize. Optional descriptions can be added to clarify the content structure or hierarchy.
The file is hosted at the root of the website (e.g. domain.tld/llms.txt) and is easy to set up – especially with automated tools like our one-click LLMs.txt file creator.
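The structure described above – main address first, then prioritized pages with optional descriptions – is simple enough to generate programmatically. Below is a minimal sketch in Python of how such a generator might work; `build_llms_txt` is a hypothetical helper for illustration, not Hostinger’s actual tool:

```python
# Sketch of an llms.txt generator (hypothetical helper, not Hostinger's tool).
# Emits an H1 title, a short summary, then prioritized links with descriptions.

def build_llms_txt(site_name: str, summary: str, pages: list[tuple[str, str, str]]) -> str:
    """pages: list of (title, url, description) tuples, in priority order."""
    lines = [f"# {site_name}", "", f"> {summary}", "", "## Key pages", ""]
    for title, url, description in pages:
        lines.append(f"- [{title}]({url}): {description}")
    return "\n".join(lines) + "\n"

pages = [
    ("Pricing", "https://example.com/pricing", "Current plans and costs"),
    ("Docs", "https://example.com/docs", "Product documentation"),
]
print(build_llms_txt("Example Site", "A demo site.", pages))
```

The resulting text would then be saved as `llms.txt` at the site root (e.g. domain.tld/llms.txt) so AI crawlers can find it in a predictable location, just as they do with robots.txt.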
Importantly, implementing an LLMs.txt file has no negative impact on traditional SEO. It’s a forward-looking, proactive step that makes a site more accessible to AI tools – both now and in the future.
How soon do you see LLMs.txt becoming a web standard?
With AI playing a growing role in how people discover content, more businesses will need to optimize websites not just for search engines, but for AI systems as well. This adoption is expected to increase significantly in the next few months or years.
It’s still unclear whether LLMs.txt will become a long-term standard. It might evolve into something more sophisticated, like NLWeb or API-driven solutions. But the concept of making content easily digestible for AI is here to stay.
At Hostinger, we’re committed to giving our customers a competitive edge. That’s why we were among the first to offer automatic LLMs.txt file creation, and we’ll continue evolving our tools as the GEO landscape changes.
Are there any other things website owners can do to improve their site visibility to AI?
Like traditional search engines, AI systems look for valuable, high-quality content. That means creating genuinely helpful information for people, ensuring the site is fast, mobile-friendly, and easy to navigate, and making the content technically accessible for crawling and indexing.
Each website owner should understand that AI-backed browsing is here, and it’s growing. That means they must keep up with what’s new in the field of GEO and look for tools that expose their website content to LLMs. Today, LLMs.txt is a strong first step.
Looking ahead, we believe that websites may evolve toward Model Context Protocol (MCP) interfaces, where content isn’t just displayed for humans but served via MCP-compatible APIs, and AI agents will consume it on users’ behalf.