In this article on H-online, they note that Microsoft's bots are … er … aggressive. In the past, we had to block their 65.55.x.* range from our sites, as the crawlers did not respect robots.txt. Over the past year or so they have generally behaved well, though we do notice the occasional blip.
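For what it's worth, blocking a misbehaving crawler by address range is straightforward. Here is a minimal sketch of that kind of per-request check, assuming the traffic all came from 65.55.0.0/16 (the exact range and the function name are illustrative, not what we actually deployed):

```python
import ipaddress

# Hypothetical range covering the 65.55.x.* crawler addresses.
BLOCKED_RANGE = ipaddress.ip_network("65.55.0.0/16")

def is_blocked(client_addr: str) -> bool:
    """Return True if the client address falls inside the blocked range."""
    return ipaddress.ip_address(client_addr) in BLOCKED_RANGE

print(is_blocked("65.55.12.34"))    # True: inside the crawler's range
print(is_blocked("198.51.100.7"))   # False: unrelated address
```

In practice you would do this at the web server or firewall rather than in application code, but the membership test is the same idea.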
I suspect that their crawlers aren’t terribly smart. Google’s, Yahoo’s, and many others are pretty sophisticated; no real problems with them anymore. Sad, though, as competition in the search space would be a good thing. Anything that can detect and disable SEO efforts is, IMO, a good thing. SEO is why searches for seemingly disconnected words, the kind of query you use to find an uncommon problem that someone, somewhere, has run into before, are rendered useless. I could go off on a long tangent on this, but all I will say is that SEO degrades the value and utility of search tools. If your business model is predicated upon your placement in a search engine, then you have some fundamental problems.
Maybe Microsoft can focus on better filtering. I would personally much rather use a search engine that didn’t include any SEO-ized content.