
Get Started in 5 Minutes

Protect your content from unauthorized AI usage with the LLMTAG protocol. This quick start guide will have you up and running in minutes.

Step 1: Create llmtag.txt

Create a new file called llmtag.txt in your website’s root directory (the same folder as your index.html or main page).
# LLMTAG Protocol v3.0
# Content Usage Policy for example.com
# For more information, visit: https://docs.llmtag.org

# REQUIRED: Protocol version declaration
spec_version: 3.0

# AI Training Policy: Controls whether content can be used for AI model training
# Values: allow (permit training) | disallow (block training)
ai_training_data: allow

# AI Use Policy: Defines how AI agents can use your content
# Values: search_indexing (for search engines) | generative_synthesis (for AI responses) | research (for academic research)
ai_use: search_indexing, generative_synthesis, research

# Attribution Requirements: Ensures proper credit when content is used
# Values: required (must credit) | optional (preferred) | none (no credit needed)
attribution: required
attribution_format: "Source: Example.com (https://example.com)"

# Contact Information: Where AI agents can reach you for questions
contact: [email protected]
documentation: https://docs.llmtag.org

# Protocol Information: Metadata about this policy file
protocol_name: LLMTAG
protocol_version: 3.0
last_updated: 2024-10-11
policy_effective_date: 2024-10-11
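
The file above is a set of key: value pairs with # comments. As a rough illustration of how a consumer might read it, here is a minimal Python sketch; the official parsing rules are defined by the LLMTAG specification, so treat this as an approximation only:
def parse_llmtag(text):
    """Parse simple "key: value" lines, skipping blanks and # comments.
    Illustrative sketch only, not the official LLMTAG parser."""
    policy = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition(":")  # split at the first colon only
        if sep:
            policy[key.strip()] = value.strip()
    return policy

if __name__ == "__main__":
    with open("llmtag.txt", encoding="utf-8") as f:
        policy = parse_llmtag(f.read())
    print(policy.get("ai_training_data"))
    print(policy.get("ai_use"))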

Step 2: Upload to Your Server

Upload the llmtag.txt file to your website’s root directory using your preferred method (a scripted example follows the list):
  • FTP/SFTP: Upload with an FTP or SFTP client
  • cPanel: Use File Manager
  • Git: Commit and push to your repository
  • CDN: Upload to your content delivery network
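
If you prefer a scripted upload, here is a minimal Python sketch using the standard library’s FTP client; the hostname, credentials, and remote directory are placeholders, and your host may require SFTP instead:
from ftplib import FTP

# Placeholders: replace host, credentials, and web root with your own values.
with FTP("ftp.example.com") as ftp:
    ftp.login("username", "password")
    ftp.cwd("/public_html")                  # your site's document root
    with open("llmtag.txt", "rb") as f:
        ftp.storbinary("STOR llmtag.txt", f)  # upload to the root directory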

Step 3: Verify Installation

Test that your llmtag.txt file is accessible by visiting:
https://yourdomain.com/llmtag.txt
You should see your content usage policy displayed in the browser.
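
You can also check programmatically. The following Python sketch (standard library only) fetches the file and prints the status code and content type; replace the placeholder domain with your own:
from urllib.request import urlopen

url = "https://yourdomain.com/llmtag.txt"  # replace with your domain

with urlopen(url, timeout=10) as resp:
    body = resp.read().decode("utf-8", errors="replace")
    print("Status:", resp.status)                              # expect 200
    print("Content-Type:", resp.headers.get("Content-Type"))   # ideally text/plain
    print(body[:200])                                          # first lines of your policy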

Step 4: Test AI Agent Discovery

AI agents will automatically discover your llmtag.txt file when they crawl your website. No additional configuration is needed.
You can test this by checking your server logs for requests to /llmtag.txt from known AI crawler user agents.
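
If you have shell access to your logs, a short script can summarize which user agents have requested the file. The log path and combined log format below are assumptions; adjust them for your server:
import collections
import re

LOG_PATH = "/var/log/nginx/access.log"   # assumption: adjust for your server
# Combined log format: the last quoted field on each line is the user agent.
pattern = re.compile(r'"GET /llmtag\.txt[^"]*".*"([^"]*)"$')

agents = collections.Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            agents[match.group(1)] += 1

for agent, count in agents.most_common():
    print(f"{count:6d}  {agent}")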

Common Use Cases

Basic Content Protection

For most websites, this simple configuration provides good protection:
# LLMTAG Protocol v3.0
# Basic content protection policy

spec_version: 3.0

# Block AI training, allow search indexing
ai_training_data: disallow
ai_use: search_indexing

# Require attribution when content is used
attribution: required
attribution_format: "Source: Example.com (https://example.com)"

# Contact information
contact: [email protected]

# Protocol metadata
protocol_name: LLMTAG
protocol_version: 3.0
last_updated: 2024-10-11

Open Content Policy

If you want to allow AI training but require attribution:
# LLMTAG Protocol v3.0
# Open content policy - allows AI training with attribution

spec_version: 3.0

# Allow AI training and all use cases
ai_training_data: allow
ai_use: search_indexing, generative_synthesis, research

# Require attribution and link-back
attribution: required
attribution_format: "Source: Example.com (https://example.com)"

# Allow commercial use with attribution
commercial_use: allow
commercial_attribution: required

# Contact information
contact: [email protected]

# Protocol metadata
protocol_name: LLMTAG
protocol_version: 3.0
last_updated: 2024-10-11

Strict Protection

For maximum content protection:
# LLMTAG Protocol v3.0
# Strict protection policy - blocks most AI usage

spec_version: 3.0

# Block all AI training and most usage
ai_training_data: disallow
ai_use: search_indexing

# Require explicit permission for any use
attribution: required
explicit_permission: required

# Contact for permission requests
contact: [email protected]

# Protocol metadata
protocol_name: LLMTAG
protocol_version: 3.0
last_updated: 2024-10-11

WordPress Users

If you’re using WordPress, you can implement LLMTAG either manually or with a plugin.
For manual implementation:
  1. Create llmtag.txt in your WordPress root directory
  2. If the file is not served directly, add a rewrite rule to your .htaccess file:
RewriteRule ^llmtag\.txt$ /path/to/llmtag.txt [L]
  3. Or use a plugin like “Custom Files” to serve the file

Advanced Configuration

Content-Specific Policies

You can specify different policies for different types of content:
# LLMTAG Protocol v3.0

# General policy
ALLOW: analysis, indexing
DISALLOW: training, commercial-use

# Blog posts - more restrictive
PATH: /blog/*
DISALLOW: training, commercial-use, redistribution
REQUIRE: attribution, link-back

# Documentation - more permissive
PATH: /docs/*
ALLOW: training, analysis, indexing
REQUIRE: attribution
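
How a crawler resolves these PATH blocks is up to the LLMTAG specification and the crawler’s implementation. The Python sketch below only illustrates the intended idea, matching shell-style wildcards and falling back to the general policy; the data structure and precedence order are assumptions:
from fnmatch import fnmatch

# Hypothetical representation of the policy above: (pattern, rules) pairs,
# with None as the site-wide default. Structure and precedence are assumptions.
policies = [
    ("/blog/*", {"DISALLOW": ["training", "commercial-use", "redistribution"],
                 "REQUIRE": ["attribution", "link-back"]}),
    ("/docs/*", {"ALLOW": ["training", "analysis", "indexing"],
                 "REQUIRE": ["attribution"]}),
    (None,      {"ALLOW": ["analysis", "indexing"],
                 "DISALLOW": ["training", "commercial-use"]}),
]

def policy_for(path):
    """Return the first PATH block whose pattern matches, else the default."""
    for pattern, rules in policies:
        if pattern is not None and fnmatch(path, pattern):
            return rules
    return policies[-1][1]

print(policy_for("/blog/my-post"))   # blog rules
print(policy_for("/about"))          # default rules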

User Agent Targeting

Target specific AI agents with different policies:
# LLMTAG Protocol v3.0

# Default policy
ALLOW: analysis, indexing
DISALLOW: training, commercial-use

# ChatGPT/OpenAI - more restrictive
USER-AGENT: GPTBot
DISALLOW: training, analysis, indexing

# Google AI - allow analysis only
USER-AGENT: Google-Extended
ALLOW: analysis, indexing
DISALLOW: training, commercial-use
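
As with path scoping, the exact matching behavior is implementation-defined. Here is a sketch of the intended lookup, assuming an exact USER-AGENT name match with a fallback to the default policy:
# Hypothetical mapping of USER-AGENT blocks to their directives; the
# fallback-to-default behavior is an assumption, not defined in this example.
agent_policies = {
    "GPTBot":          {"DISALLOW": ["training", "analysis", "indexing"]},
    "Google-Extended": {"ALLOW": ["analysis", "indexing"],
                        "DISALLOW": ["training", "commercial-use"]},
}
default_policy = {"ALLOW": ["analysis", "indexing"],
                  "DISALLOW": ["training", "commercial-use"]}

def policy_for_agent(user_agent):
    """Return the block matching this agent name, else the default policy."""
    return agent_policies.get(user_agent, default_policy)

print(policy_for_agent("GPTBot"))
print(policy_for_agent("SomeOtherBot"))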

Testing Your Implementation

Manual Testing

  1. Check File Accessibility: Visit https://yourdomain.com/llmtag.txt
  2. Validate Syntax: Use our online validator
  3. Test Parsing: Use our parser tool

Automated Testing

# Test with curl
curl -H "User-Agent: LLMTAG-Test/1.0" https://yourdomain.com/llmtag.txt

# Check HTTP headers
curl -I https://yourdomain.com/llmtag.txt

Troubleshooting

Problem: llmtag.txt returns a 404 error.
Solutions:
  • Ensure the file is in your website’s root directory
  • Check file permissions (should be readable by web server)
  • Verify the filename is exactly llmtag.txt (case-sensitive)
  • Clear any caching that might be interfering
Problem: The file is served with the wrong MIME type.
Solutions:
  • Add to your .htaccess file:
<Files "llmtag.txt">
    Header set Content-Type "text/plain"
</Files>
  • Or configure your web server to serve .txt files as text/plain
Problem: AI agents are not following your policies.
Solutions:
  • Verify your syntax is correct using our validator
  • Check that AI agents are actually reading the file
  • Remember: LLMTAG is a voluntary standard
  • Consider additional technical measures for enforcement

Next Steps

Now that you have LLMTAG implemented, explore the advanced features covered in the rest of the documentation.

Need Help?

Questions? Check our FAQ or join our community for support.
Important: LLMTAG is a voluntary standard. While we encourage AI agents to respect these policies, enforcement depends on the AI system’s implementation. Consider additional legal and technical measures for stronger protection.