How to Block DotBot

Complete guide to blocking DotBot (Moz) from crawling your website using robots.txt, server configuration, and Switch workflows.

Operated by Moz · Category: SEO scrapers

Should You Block DotBot?

DotBot builds Moz's commercial SEO database. Blocking it has no impact on your search rankings but may affect your site's data in Moz's tools.

Blocking is reasonable to reduce server load, especially if you don't use Moz's SEO tools.

Blocking Methods

1. robots.txt

Effectiveness: High for cooperative crawlers

Add a Disallow rule for DotBot's user-agent string in your robots.txt file. This is the standard, cooperative method that well-behaved crawlers respect.

2. Server-side UA filtering

Effectiveness: High

Configure your web server (nginx, Apache, Cloudflare) to reject requests matching DotBot's user-agent patterns. This rejects the request at the server or edge layer, before your application ever processes it.
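
A minimal nginx sketch, placed inside the relevant server block, that returns 403 Forbidden for any request whose User-Agent contains "dotbot" (the case-insensitive match is a deliberate choice so one rule covers both reported patterns):

# nginx: reject DotBot by User-Agent (~* is a case-insensitive regex match)
if ($http_user_agent ~* "dotbot") {
    return 403;
}

The Apache equivalent, assuming mod_rewrite is enabled, goes in your .htaccess or virtual host configuration:

# Apache: forbid requests whose User-Agent matches "dotbot" ([NC] = case-insensitive)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} dotbot [NC]
RewriteRule .* - [F,L]

On Cloudflare, a custom WAF rule with an expression such as lower(http.user_agent) contains "dotbot" and a Block action achieves the same result at the edge.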

3. Switch Journey Workflows

Effectiveness: Highest (granular, real-time control)

Create a custom journey in Switch that detects DotBot and routes it to a block action, challenge, redirect, or modified content — without touching your server configuration.

robots.txt — Block DotBot

Add the following to your robots.txt file (at the root of your domain) to block DotBot:

User-agent: DotBot
Disallow: /

User-agent: dotbot
Disallow: /
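
To confirm the file is served where crawlers look for it (the root of your domain), you can fetch it directly; example.com below is a placeholder for your own domain:

# Verify robots.txt is reachable at the site root
curl -s https://example.com/robots.txt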

robots.txt — Allow with Restrictions

Alternatively, allow DotBot on most pages while blocking specific directories:

User-agent: DotBot
Disallow: /private/
Allow: /

User-agent: dotbot
Disallow: /private/
Allow: /

DotBot User-Agent Strings

Use these patterns to identify DotBot in your server logs or firewall rules:

DotBot
dotbot
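
To check whether DotBot has been hitting your site, a single case-insensitive search of your access log covers both patterns (the log path is an assumption; adjust it for your server):

# List DotBot requests in an nginx access log
grep -i "dotbot" /var/log/nginx/access.log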

Frequently Asked Questions

Does blocking DotBot affect my Google search rankings?

No. DotBot gathers data for Moz's SEO tools, not for Google's index, so blocking it does not affect your Google search rankings. Only blocking Googlebot impacts your visibility in Google Search.

Does DotBot respect robots.txt?

Yes, DotBot respects robots.txt directives. Adding a Disallow rule for its user-agent will prevent it from crawling blocked paths.

Can I allow DotBot on some pages but not others?

Yes. Use robots.txt to disallow specific directories, or use Switch journey workflows for granular page-level control with conditional logic.

Go beyond robots.txt

Switch detects DotBot in real time and lets you build custom journey workflows: block, challenge, redirect, or serve modified content. No server changes required.

Get Started Free