How to Block MJ12bot

Complete guide to blocking MJ12bot (Majestic) from crawling your website using robots.txt, server configuration, and Switch workflows.

Operated by Majestic (formerly MajesticSEO) · Category: Scrapers

Should You Block MJ12bot?

MJ12bot builds Majestic's commercial SEO database. Blocking it has no impact on your search rankings but may affect your site's data in Majestic's tools.

Blocking is reasonable to reduce server load, especially if you don't use Majestic's SEO tools.

Blocking Methods

1. robots.txt

Effectiveness: High for cooperative crawlers

Add a Disallow rule for MJ12bot's user-agent string in your robots.txt file. This is the standard, cooperative method that well-behaved crawlers respect.

2. Server-side UA filtering

Effectiveness: High

Configure your web server (nginx, Apache, Cloudflare) to reject requests matching MJ12bot's user-agent patterns. This blocks at the network level before your application processes the request.
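As a sketch of the server-side approach, the nginx snippet below returns 403 to any request whose User-Agent header contains "MJ12bot". It assumes a standard nginx setup; place it inside the relevant server block.

```nginx
# Reject MJ12bot before the request reaches the application.
# ~* makes the regex match case-insensitive.
if ($http_user_agent ~* "MJ12bot") {
    return 403;
}
```

An equivalent Apache approach uses mod_rewrite with a RewriteCond on %{HTTP_USER_AGENT}; on Cloudflare, a WAF custom rule matching the user agent achieves the same result.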

3. Switch Journey Workflows

Effectiveness: Highest, with granular real-time control

Create a custom journey in Switch that detects MJ12bot and routes it to a block action, challenge, redirect, or modified content — without touching your server configuration.

robots.txt — Block MJ12bot

Add the following to your robots.txt file (at the root of your domain) to block MJ12bot:

User-agent: MJ12bot
Disallow: /

robots.txt — Allow with Restrictions

Alternatively, allow MJ12bot on most pages while blocking specific directories:

User-agent: MJ12bot
Disallow: /private/
Allow: /
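To sanity-check how these rules evaluate, Python's standard urllib.robotparser can simulate MJ12bot against the file above (example.com and the sample paths are placeholders):

```python
from urllib.robotparser import RobotFileParser

# The "allow with restrictions" rules from above.
rules = """User-agent: MJ12bot
Disallow: /private/
Allow: /""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Most pages are allowed; /private/ is blocked.
print(rp.can_fetch("MJ12bot", "https://example.com/page"))       # True
print(rp.can_fetch("MJ12bot", "https://example.com/private/x"))  # False
```

This is a quick way to verify a robots.txt change before deploying it.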

MJ12bot User-Agent Strings

Use these patterns to identify MJ12bot in your server logs or firewall rules:

MJ12bot
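As an illustration, a short Python sketch can scan access-log lines for this pattern. The sample log lines, log format, and version number below are hypothetical; real formats vary by server.

```python
import re

# Case-insensitive substring match on the bot's name.
UA_PATTERN = re.compile(r"MJ12bot", re.IGNORECASE)

# Hypothetical access-log lines for illustration only.
log_lines = [
    '1.2.3.4 - - [01/Jan/2025] "GET / HTTP/1.1" 200 '
    '"Mozilla/5.0 (compatible; MJ12bot/v1.4.8; http://mj12bot.com/)"',
    '5.6.7.8 - - [01/Jan/2025] "GET /about HTTP/1.1" 200 '
    '"Mozilla/5.0 (Windows NT 10.0)"',
]

hits = [line for line in log_lines if UA_PATTERN.search(line)]
print(len(hits))  # 1
```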

Frequently Asked Questions

Does blocking MJ12bot affect my Google search rankings?

No. Blocking MJ12bot does not affect your Google search rankings. Only blocking Googlebot impacts Google Search visibility.

Does MJ12bot respect robots.txt?

Yes, MJ12bot respects robots.txt directives. Adding a Disallow rule for its user-agent will prevent it from crawling blocked paths.

Can I allow MJ12bot on some pages but not others?

Yes. Use robots.txt to disallow specific directories, or use Switch journey workflows for granular page-level control with conditional logic.

Go beyond robots.txt

Switch detects MJ12bot in real-time and lets you build custom journey workflows — block, challenge, redirect, or serve modified content. No server changes required.

Get Started Free