antibot

Monitor

Uncategorized bot with unclear purpose.

Recommended action: Review activity before deciding on access.

Category

Other

Primary use case

Uncategorized

Trust level

Review recommended

robots.txt

Unknown

antibot Traffic (Last 90 Days)

Not enough network data yet.

Track this bot on your site

What is antibot?

Uncategorized bot

What antibot means for your site

antibot is an uncategorized bot with an undocumented operator. Review its activity on your site to determine whether it is beneficial, neutral, or unwanted. Robots.txt compliance is not confirmed for this bot.

What should you do?

  • Review this bot's activity in BotSights
  • Check which pages it visits most frequently
  • Consider server-side blocking if access is unwanted
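
If you do decide on server-side blocking, the general pattern is to reject requests whose User-Agent contains the bot's token. A minimal sketch as WSGI middleware follows; the wrapped application and response bodies are illustrative, not part of BotSights:

```python
# Sketch of server-side blocking: a WSGI middleware that returns
# 403 Forbidden when the User-Agent header contains "antibot".
# The inner app below is a stand-in for your real application.

def block_antibot(app):
    def middleware(environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        if "antibot" in user_agent.lower():
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return app(environ, start_response)
    return middleware

def site(environ, start_response):
    # Illustrative placeholder application.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello"]

application = block_antibot(site)
```

The same User-Agent match can be expressed in most web servers' native configuration; the middleware form is shown only because it is framework-agnostic.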

See antibot on your own site

BotSights tracks every antibot visit in real time, including which pages it crawls, how often, and from where.

Start free

How to identify antibot

antibot uses the user-agent "antibot"; its robots.txt compliance is unconfirmed.


How to block antibot

Three robots.txt options below. Pick the one that matches your goal. Each snippet lists every known antibot user-agent pattern so the rules apply regardless of which one the bot announces. Compliance with robots.txt is unconfirmed for antibot, so verify with crawl logs after deploying.

Edit robots.txt with care

A single misplaced line can de-index your entire site. A common mistake is pasting User-agent: * followed by Disallow: /, which blocks every bot, including Googlebot, not just antibot. Always paste the snippet alongside existing rules (not over them), keep the User-agent line scoped to antibot's patterns, and verify with Google's robots.txt tester before deploying. If you are not sure, ask a developer first.
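
For example, a file that blocks antibot completely while keeping a wildcard rule for everyone else keeps the two groups separate (the /admin/ path is illustrative):

```
User-agent: antibot
Disallow: /

User-agent: *
Disallow: /admin/
```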

Option 1: Block all access

Tells antibot not to crawl any URL on your site. Use this when you want the bot completely off your content.

User-agent: antibot
Disallow: /

Option 2: Block specific paths only

Keep public content crawlable but exclude sensitive or non-public sections. Add one Disallow: line per path. Replace the example paths with your own.

User-agent: antibot
Disallow: /admin/
Disallow: /private/
Disallow: /checkout/

Option 3: Slow down with a crawl delay

Crawl-delay is a voluntary directive that asks the bot to wait the given number of seconds between requests. Useful when antibot is hammering your origin and slowing the site down for real visitors, but you do not want to block it outright. The value is in seconds, so 10 means at most one request every ten seconds. Not all bots honour this directive (Googlebot ignores it; Bingbot, Yandex, and many AI crawlers do respect it).

User-agent: antibot
Crawl-delay: 10
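
Before deploying any of these snippets, you can sanity-check that the rules do what you intend with Python's standard-library robots.txt parser (a sketch; the example URL is illustrative):

```python
# Confirm a robots.txt snippet blocks antibot but leaves other agents
# unaffected, using Python's stdlib parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: antibot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("antibot", "https://example.com/page"))       # expect False
print(parser.can_fetch("SomeOtherBot", "https://example.com/page"))  # expect True
```

This only checks that the rules parse as intended; whether antibot actually obeys them is a separate question, answered by your crawl logs.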

Frequently Asked Questions

What is the User-Agent for antibot?

antibot identifies itself with the User-Agent string "antibot". Use this in robots.txt or server-side rules.

Is antibot safe to allow on my site?

The operator is not publicly documented for this bot. Review its activity in your logs and BotSights data before deciding.

Should I block antibot?

That depends on the value the bot provides versus its crawl load. Review its activity first, then decide.

How do I block antibot?

Try robots.txt first ("User-agent: antibot / Disallow: /") and verify with crawl logs whether the bot stops appearing.
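
One way to run that verification, assuming an access log in common/combined format, is to count antibot requests per day and watch the count drop after the robots.txt change (a sketch; the log format and sample lines are illustrative):

```python
# Count requests claiming the "antibot" User-Agent per day in an
# access log (combined log format assumed).
import re
from collections import Counter

DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')  # date portion of the timestamp

def antibot_hits_per_day(log_lines):
    counts = Counter()
    for line in log_lines:
        if "antibot" not in line.lower():
            continue
        m = DATE.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

sample = [
    '1.2.3.4 - - [01/Jan/2025:12:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "antibot"',
    '1.2.3.4 - - [02/Jan/2025:09:30:00 +0000] "GET /a HTTP/1.1" 200 512 "-" "antibot"',
    '5.6.7.8 - - [02/Jan/2025:09:31:00 +0000] "GET /b HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
hits = antibot_hits_per_day(sample)
print(hits)
```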

How can I verify a request is really antibot?

User-Agent strings can be spoofed by malicious crawlers. Without published verification details, the User-Agent alone is not trustworthy — monitor source IPs and behavior patterns. BotSights flags spoofed traffic when verification data is available.
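
In the absence of published IP ranges or verification hostnames for antibot, behavioral monitoring is the fallback: group requests claiming this User-Agent by source IP and flag anomalies (a sketch; the threshold and sample log lines are illustrative):

```python
# Group requests that claim the "antibot" User-Agent by source IP.
# Many unrelated IPs sharing the UA, or one IP with an abnormal
# request count, suggests spoofing or abuse.
from collections import Counter

def claimed_antibot_ips(log_lines, ua_token="antibot"):
    ips = Counter()
    for line in log_lines:
        if ua_token in line.lower():
            ips[line.split()[0]] += 1  # first field: client IP
    return ips

sample = [
    '1.2.3.4 - - [01/Jan/2025:12:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "antibot"',
    '1.2.3.4 - - [01/Jan/2025:12:00:01 +0000] "GET /a HTTP/1.1" 200 512 "-" "antibot"',
    '5.6.7.8 - - [01/Jan/2025:12:00:02 +0000] "GET /b HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
ips = claimed_antibot_ips(sample)
suspicious = [ip for ip, n in ips.items() if n > 1]  # illustrative threshold
print(ips, suspicious)
```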

Know what antibot is doing on your site

See which pages it visits, how often it appears, and whether it is helping your visibility or worth blocking.

  • Bot activity tracked per page
  • AI and search crawler insights
  • Better allow, monitor, or block decisions
Track this bot

Free plan available. No credit card required. Setup in 2 minutes.