A newly proposed web standard called "llms.txt" could change how AI systems find and process information online.

AI veteran Jeremy Howard suggests websites should include a special file that helps language models access content more effectively.

Howard explains that modern websites need to work for both human visitors and AI systems. Language models, however, struggle with large amounts of text because they can only process a limited context window at a time.

This limitation often makes it impossible for them to take in an entire website. The new standard would address the problem by providing the key information in a more focused, AI-friendly format.

Howard's idea is simple: Website owners would create a file called "llms.txt" that acts like a guide for AI systems. This guide would help AI quickly find and understand important information without having to process entire websites.

Making websites LLM-readable

The new standard follows a simple format. Every llms.txt file starts with a project name at the top, followed by a quick summary. Site owners can then add more details and link to other Markdown documents as needed. This structure is intended to help AI systems read and understand websites more reliably.
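
To illustrate, here is a minimal sketch of what such a file might look like, based on the structure described at llmstxt.org; the project name, section titles, and URLs are invented placeholders:

```
# Example Project

> Example Project is a hypothetical Python library for building small web apps.

The full documentation is written in Markdown and kept deliberately concise.

## Docs

- [Quick start](https://example.com/docs/quickstart.md): installation and a first app
- [API reference](https://example.com/docs/api.md): all public functions and classes

## Optional

- [Changelog](https://example.com/changelog.md): release history
```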

Howard also suggests websites should offer Markdown versions of their HTML pages by adding ".md" to URLs. The FastHTML project is already implementing this approach by automatically generating Markdown versions of all documentation pages.
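
As a rough sketch of how a client could take advantage of this, the snippet below requests the Markdown version of a hypothetical documentation page by appending ".md" to its URL; the domain and path are placeholders, not a real endpoint:

```python
import urllib.request

# Hypothetical documentation page; example.com is a placeholder domain.
page_url = "https://example.com/docs/quickstart.html"

# Per Howard's suggestion, a Markdown version would be served at the
# same address with ".md" appended to the URL.
markdown_url = page_url + ".md"

with urllib.request.urlopen(markdown_url) as response:
    markdown_text = response.read().decode("utf-8")

print(markdown_text[:300])  # preview the Markdown version of the page
```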

Howard believes the standard could be particularly useful for developers and code libraries, since AI systems could give programmers better help by reading this structured information. The AI company Anthropic has also published an llms.txt file for its documentation.

Companies could use it to lay out their organization and key resources. Online stores could better organize their products and store policies. Schools and universities could present their courses more clearly, and people could format their professional backgrounds in ways that AI systems can better understand.

Working with existing web standards

The new standard would work together with familiar web tools like robots.txt and sitemap.xml. While these existing standards help search engines crawl websites, llms.txt specifically helps AI systems find and understand the most important content on a site, including relevant links to other resources.
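
A minimal sketch of how the three files would sit side by side, assuming a placeholder domain that serves llms.txt at its root:

```python
import urllib.error
import urllib.request

SITE = "https://example.com"  # placeholder domain

# robots.txt is always fetched from the site root, sitemaps are commonly
# placed there too, and llms.txt is proposed to live alongside them.
for filename in ("robots.txt", "sitemap.xml", "llms.txt"):
    url = f"{SITE}/{filename}"
    try:
        with urllib.request.urlopen(url) as response:
            print(f"{filename}: found, {len(response.read())} bytes")
    except urllib.error.HTTPError as error:
        print(f"{filename}: not available (HTTP {error.code})")
```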

Anyone can review and comment on the proposed standard at llmstxt.org, and technical documentation is available on GitHub. Website owners should test their llms.txt files with several AI systems to make sure they work as expected, Howard recommends.

The standard's success depends on whether web developers embrace it. If enough websites start using llms.txt, it could fundamentally change how AI systems read and understand web content.

But it also raises fundamental questions about the future of the web. Who is responsible when AI systems rewrite website content? How are website owners' copyrights protected, and do those rights even apply here? How can websites make money when their content is accessed through chatbots? And how can AI systems properly understand the full context of a website, including its design?

So far, none of the AI labs have sufficient answers to these questions.

Summary
  • A new web standard called "llms.txt" has been proposed to help language models access relevant information from websites more efficiently.
  • The llms.txt file would serve as a central guide for AI systems, summarizing a website's most important content in a structured format, since AI systems need information in a more precise and compact form than human readers do.
  • The proposed standard is designed to complement existing web standards such as sitemaps and robots.txt. It also raises fundamental questions about the future of the World Wide Web.