> This is one reason why most crawlers ignore robots.txt now.

I don't buy that for a second. Those not obeying robots.txt were doing so either because they were malicious (they wanted everything and wouldn't be told “please don't plough through these bits”) or stupid (not knowing any better), or both.
Anyone who was obeying robots.txt isn't going to start ignoring it because we've put honeypots there. Why would they think, “well, now that there are honeypots there, I'm going to go scan those… honeypots, yeah, that's a good idea”?
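To be concrete about what “obeying robots.txt” means in practice: a compliant crawler checks each URL against the site's rules before fetching it, so a disallowed honeypot path is simply never requested. A minimal sketch using Python's stdlib `urllib.robotparser` (the robots.txt rules and the bot name here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for some site, parsed directly from
# text rather than fetched over the network.
rules = """
User-agent: *
Disallow: /honeypot/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler checks before fetching and skips disallowed paths.
print(parser.can_fetch("MyBot", "https://example.com/honeypot/trap"))  # False
print(parser.can_fetch("MyBot", "https://example.com/index.html"))     # True
```

A crawler that runs this check never touches the honeypot at all; only one that skips the check (maliciously or out of ignorance) ends up there.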
> The other reason is that bandwidth/bots are cheap enough now that they don't need web admins to help them optimize their crawlers
Web admins are not trying to help optimize anyone's crawlers; they are trying to stop those crawlers from breaking their sites.