How to Block MistralAI-User

Complete guide to blocking MistralAI-User (Mistral) from crawling your website using robots.txt, server configuration, and Switch workflows.

Operated by Mistral AI

Should You Block MistralAI-User?

Blocking MistralAI-User prevents your content from appearing in Mistral's AI-generated answers. Each visit from this agent represents a real user asking about your content.

Consider allowing MistralAI-User for visibility, or use Switch to serve agent-optimized markdown content instead of blocking entirely.

Blocking Methods

1. robots.txt

Effectiveness: High for cooperative crawlers

Add a Disallow rule for MistralAI-User's user-agent string in your robots.txt file. This is the standard, cooperative method that well-behaved crawlers respect.

2. Server-side UA filtering

Effectiveness: High

Configure your web server (nginx, Apache, Cloudflare) to reject requests matching MistralAI-User's user-agent patterns. This blocks at the network level before your application processes the request.
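As a minimal sketch, an nginx rule can reject the agent by matching its user-agent substring (assuming the UA header contains "MistralAI-User"; Apache's mod_rewrite or a Cloudflare firewall rule can express the same check):

```nginx
# Sketch: inside a server block, return 403 Forbidden when the
# User-Agent header contains "MistralAI-User" (case-insensitive).
if ($http_user_agent ~* "MistralAI-User") {
    return 403;
}
```

Because the match runs before the request reaches your application, blocked requests cost almost nothing to serve.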

3. Switch Journey Workflows

Effectiveness: Highest (granular, real-time control)

Create a custom journey in Switch that detects MistralAI-User and routes it to a block action, challenge, redirect, or modified content — without touching your server configuration.

robots.txt — Block MistralAI-User

Add the following to your robots.txt file (at the root of your domain) to block MistralAI-User:

User-agent: MistralAI-User
Disallow: /

robots.txt — Allow with Restrictions

Alternatively, allow MistralAI-User on most pages while blocking specific directories:

User-agent: MistralAI-User
Disallow: /private/
Allow: /
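To see how a compliant crawler interprets these rules, here is a short Python sketch using the standard library's urllib.robotparser (the example.com paths are placeholders for your own URLs):

```python
from urllib import robotparser

# The "allow with restrictions" rules above, as a crawler would parse them
rules = """\
User-agent: MistralAI-User
Disallow: /private/
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Public pages are fetchable; anything under /private/ is not
print(rp.can_fetch("MistralAI-User", "https://example.com/blog/post"))       # True
print(rp.can_fetch("MistralAI-User", "https://example.com/private/report"))  # False
```

Note that robots.txt is advisory: this check only constrains crawlers that choose to honor it.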

MistralAI-User User-Agent Strings

Use these patterns to identify MistralAI-User in your server logs or firewall rules:

MistralAI-User
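For log analysis, a small Python helper can flag matching entries; the function name and the full sample UA strings below are illustrative, only the "MistralAI-User" token comes from the pattern above:

```python
import re

# Case-insensitive pattern for the MistralAI-User token
UA_PATTERN = re.compile(r"mistralai-user", re.IGNORECASE)

def is_mistralai_user(user_agent: str) -> bool:
    """Return True if a User-Agent header matches MistralAI-User."""
    return bool(UA_PATTERN.search(user_agent or ""))

# Sample UA strings are hypothetical, not exact copies of real headers
print(is_mistralai_user("Mozilla/5.0 (compatible; MistralAI-User/1.0)"))  # True
print(is_mistralai_user("Mozilla/5.0 Googlebot/2.1"))                     # False
```

The same substring match can be dropped into a firewall rule or log-processing pipeline.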

Frequently Asked Questions

Does blocking MistralAI-User affect my Google search rankings?

No. Blocking MistralAI-User does not affect your Google search rankings. Only blocking Googlebot impacts Google Search visibility.

Does MistralAI-User respect robots.txt?

Yes, MistralAI-User respects robots.txt directives. Adding a Disallow rule for its user-agent will prevent it from crawling blocked paths.

Can I allow MistralAI-User on some pages but not others?

Yes. Use robots.txt to disallow specific directories, or use Switch journey workflows for granular page-level control with conditional logic.

Go beyond robots.txt

Switch detects MistralAI-User in real-time and lets you build custom journey workflows — block, challenge, redirect, or serve modified content. No server changes required.

Get Started Free