The Magic SEO Ball

All your SEO questions answered by The Magic SEO Ball!

Should I update my robots.txt file to disallow all bots?

May 11, 2013 By Natan Gesher

Magic SEO Ball says: My reply is no.

Seriously though, why would you do this?

Maybe if you're building a brand-new site that you want to keep private, and you specifically don't want any search engine traffic, that could make sense. But if you already have a big site that gets a lot of its traffic from organic search, you should make very sure that your robots.txt doesn't block search engines.
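
For the record, blocking every compliant crawler takes only two lines of robots.txt, which is part of why it's such an easy thing to ship by accident:

  User-agent: *
  Disallow: /

A robots.txt that allows everything simply leaves the Disallow rule empty (or omits it entirely):

  User-agent: *
  Disallow:

If you ever copy a staging configuration over to production, it's worth a quick look to make sure the first version didn't come along for the ride.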

Based on a true story.

Filed Under: No
