How to Block UptimeRobot

Complete guide to blocking UptimeRobot from crawling your website using robots.txt, server configuration, and Switch workflows.

Operated by UptimeRobot (category: Monitoring)

Should You Block UptimeRobot?

Not recommended: UptimeRobot is an uptime monitoring service, not a content crawler. If you or your team have configured it to monitor your site, blocking it will stop your uptime checks and alerts.

Only block it if the monitoring traffic is not yours, i.e. another party has pointed a monitor at your site.

Blocking Methods

1. robots.txt

Effectiveness: High for cooperative crawlers

Add a Disallow rule for UptimeRobot's user-agent string in your robots.txt file. This is the standard, cooperative method that well-behaved crawlers respect.

2. Server-side UA filtering

Effectiveness: High

Configure your web server (nginx, Apache) or edge/CDN (Cloudflare) to reject requests matching UptimeRobot's user-agent patterns. This blocks the request at the server or edge layer, before it ever reaches your application.
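As a minimal sketch of server-side UA filtering, the nginx configuration below returns 403 for any request whose User-Agent contains "uptimerobot" (case-insensitive). The domain name is a placeholder; an equivalent rule can be written for Apache with mod_rewrite or as a Cloudflare firewall rule.

```nginx
# Map the User-Agent header to a flag; ~* makes the match
# case-insensitive, covering both "UptimeRobot" and "uptimerobot".
# Place this map in the http {} context.
map $http_user_agent $block_uptimerobot {
    default          0;
    ~*uptimerobot    1;
}

server {
    listen 80;
    server_name example.com;   # placeholder domain

    # Reject flagged requests before they reach the application.
    if ($block_uptimerobot) {
        return 403;
    }
}
```

After editing, validate and reload with `nginx -t && nginx -s reload`.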

3. Switch Journey Workflows

Effectiveness: Highest (granular, real-time control)

Create a custom journey in Switch that detects UptimeRobot and routes it to a block action, challenge, redirect, or modified content — without touching your server configuration.

robots.txt — Block UptimeRobot

Add the following to your robots.txt file (at the root of your domain) to block UptimeRobot:

User-agent: UptimeRobot
Disallow: /

User-agent: uptimerobot
Disallow: /

robots.txt — Allow with Restrictions

Alternatively, allow UptimeRobot on most pages while blocking specific directories:

User-agent: UptimeRobot
Disallow: /private/
Allow: /

User-agent: uptimerobot
Disallow: /private/
Allow: /

UptimeRobot User-Agent Strings

Use these patterns to identify UptimeRobot in your server logs or firewall rules:

UptimeRobot
uptimerobot
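To confirm whether UptimeRobot is hitting your site, you can grep your access logs for these patterns. The snippet below runs against a fabricated sample log line (the User-Agent shown is illustrative); on a real server you would point grep at your actual access log.

```shell
# Case-insensitive count of log lines matching "uptimerobot".
# The sample line is fabricated for demonstration.
sample='198.51.100.7 - - [01/Jan/2025:00:00:00 +0000] "GET / HTTP/1.1" 200 612 "-" "Mozilla/5.0 (compatible; UptimeRobot/2.0)"'
printf '%s\n' "$sample" | grep -icE 'uptimerobot'
# prints 1

# On a real server, something like:
#   grep -icE 'uptimerobot' /var/log/nginx/access.log
```

The `-i` flag makes the match case-insensitive, so a single pattern covers both user-agent variants listed above.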

Frequently Asked Questions

Does blocking UptimeRobot affect my Google search rankings?

No. UptimeRobot is a monitoring bot, not a search crawler, so blocking it does not affect your Google search rankings. Only blocking Google's own crawlers (such as Googlebot) impacts Google Search visibility.

Does UptimeRobot respect robots.txt?

Yes, UptimeRobot respects robots.txt directives. Adding a Disallow rule for its user-agent will prevent it from crawling blocked paths.

Can I allow UptimeRobot on some pages but not others?

Yes. Use robots.txt to disallow specific directories, or use Switch journey workflows for granular page-level control with conditional logic.

Go beyond robots.txt

Switch detects UptimeRobot in real-time and lets you build custom journey workflows — block, challenge, redirect, or serve modified content. No server changes required.

Get Started Free