How to Block Copilot

Complete guide to blocking Copilot (Microsoft) from crawling your website using robots.txt, server configuration, and Switch workflows.

Operated by Microsoft (AI Assistants)

Should You Block Copilot?

Blocking Copilot prevents your content from appearing in Microsoft's AI-generated answers. Each visit from this agent represents a real user asking about your content.

Consider allowing Copilot for visibility, or use Switch to serve agent-optimized markdown content instead of blocking entirely.

Blocking Methods

1. robots.txt

Effectiveness: High (for cooperative crawlers)

Add a Disallow rule for Copilot's user-agent string in your robots.txt file. This is the standard, cooperative method that well-behaved crawlers respect.

2. Server-side UA filtering

Effectiveness: High

Configure your web server (nginx, Apache, Cloudflare) to reject requests matching Copilot's user-agent patterns. This blocks at the network level before your application processes the request.
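As a concrete illustration, here is a minimal nginx sketch of this approach. It is an assumption-laden example, not a drop-in config: `example.com` is a placeholder, and the regex covers the three user-agent tokens listed later on this page.

```nginx
# In the http {} context: flag requests whose User-Agent header
# contains a Copilot-related token (case-insensitive match).
map $http_user_agent $is_copilot {
    default 0;
    "~*(CopilotBot|BingPreview|MicrosoftPreview)" 1;
}

server {
    listen 80;
    server_name example.com;  # placeholder domain

    # Reject flagged requests before they reach your application.
    if ($is_copilot) {
        return 403;
    }
}
```

You can check that the filter is active by sending a request with a matching user agent, e.g. `curl -I -A "CopilotBot" http://example.com/`, which should return a 403 response.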

3. Switch Journey Workflows

Effectiveness: Highest (granular, real-time control)

Create a custom journey in Switch that detects Copilot and routes it to a block action, challenge, redirect, or modified content — without touching your server configuration.

robots.txt — Block Copilot

Add the following to your robots.txt file (at the root of your domain) to block Copilot:

User-agent: CopilotBot
Disallow: /

User-agent: BingPreview
Disallow: /

User-agent: MicrosoftPreview
Disallow: /

robots.txt — Allow with Restrictions

Alternatively, allow Copilot on most pages while blocking specific directories:

User-agent: CopilotBot
Disallow: /private/
Allow: /

User-agent: BingPreview
Disallow: /private/
Allow: /

User-agent: MicrosoftPreview
Disallow: /private/
Allow: /

Copilot User-Agent Strings

Use these patterns to identify Copilot in your server logs or firewall rules:

CopilotBot
BingPreview
MicrosoftPreview
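These patterns can also be matched programmatically, for example when scanning access logs or writing application middleware. A small Python sketch (the helper name `is_copilot` is illustrative, not part of any API):

```python
import re

# Case-insensitive pattern covering the Copilot user-agent tokens above.
COPILOT_UA = re.compile(r"CopilotBot|BingPreview|MicrosoftPreview", re.IGNORECASE)

def is_copilot(user_agent: str) -> bool:
    """Return True if a User-Agent header matches a Copilot crawler token."""
    return bool(COPILOT_UA.search(user_agent or ""))

# Example: print only the Copilot hits from an access log.
# for line in open("access.log"):
#     if is_copilot(line):
#         print(line, end="")
```

Substring matching is deliberate here: real user-agent headers embed these tokens inside longer strings (version numbers, compatibility clauses), so an exact-equality check would miss them.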

Frequently Asked Questions

Does blocking Copilot affect my Google search rankings?

No. Blocking Copilot does not affect your Google search rankings. Only blocking Googlebot impacts Google Search visibility.

Does Copilot respect robots.txt?

Yes, Copilot respects robots.txt directives. Adding a Disallow rule for its user-agent will prevent it from crawling blocked paths.

Can I allow Copilot on some pages but not others?

Yes. Use robots.txt to disallow specific directories, or use Switch journey workflows for granular page-level control with conditional logic.

Go beyond robots.txt

Switch detects Copilot in real-time and lets you build custom journey workflows — block, challenge, redirect, or serve modified content. No server changes required.

Get Started Free