Again, with Hashcash, this isn't how it works: most outbound spam messages are worth essentially nothing to the sender. The point of the system is to exploit the negative exponent on the attacker's value function: each delivered message earns a tiny fraction of a cent, so even a trivially small per-message cost flips the expected profit negative.
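A toy calculation shows the shape of that argument. Every figure below is an assumption picked for illustration, not a measurement:

```python
# Toy arithmetic (all figures are assumptions, not measurements):
revenue_per_msg = 1e-5        # assumed expected revenue per spam message, in dollars
cpu_dollars_per_sec = 1.5e-5  # assumed cost of rented CPU (~$0.05/hour)
stamp_seconds = 2             # assumed per-message proof-of-work cost

profit = revenue_per_msg - cpu_dollars_per_sec * stamp_seconds
print(f"expected profit per message: ${profit:+.6f}")  # negative: spamming no longer pays
```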
The human-labor cost of working around Anubis is unlikely to be paid unless the data behind it is worth dedicating time to, and in those cases the data can typically be obtained "respectfully" instead -- rather than hitting the git blame route on every file of every commit of every repo, a scraper can clone the repos once and run git blame locally (a quick sketch follows), and so on.
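For the git blame case, the "respectful" path looks roughly like the sketch below. The repository URL is a placeholder and the whole thing is only one way to do it, not a prescribed workflow: clone once, then spend local CPU instead of issuing one HTTP request per file.

```python
# Sketch of the "respectful" alternative: clone once, then blame locally.
# REPO_URL is a placeholder; requires git on PATH.
import subprocess
from pathlib import Path

REPO_URL = "https://example.com/some/repo.git"  # hypothetical
workdir = Path("repo")

if not workdir.exists():
    subprocess.run(["git", "clone", "--quiet", REPO_URL, str(workdir)], check=True)

# List every tracked file, then blame each one with local CPU
# instead of one HTTP request per file against the forge's blame route.
files = subprocess.run(
    ["git", "-C", str(workdir), "ls-files"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

for path in files:
    blame = subprocess.run(
        ["git", "-C", str(workdir), "blame", "--line-porcelain", path],
        capture_output=True, text=True,
    ).stdout
    # ... feed `blame` into whatever analysis the scraper actually wants ...
```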
I claim that the costs for the two classes of user are meaningfully different: bots care exclusively about total CPU usage, while humans care about some subjective combination of the average and worst-case elapsed time of their page loads. Because bots make so many more requests, there's an opportunity to hurt them disproportionately under their own cost model by tweaking Anubis to increase the frequency of checks while keeping each check's elapsed time below the threshold of human annoyance; the rough model below shows how that trade plays out.
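Here's a back-of-the-envelope version of that claim. Everything in it is an assumption for illustration, not Anubis's actual defaults: difficulty is modeled as leading zero hex nibbles (so an expected 16^d hash attempts per check), the in-browser hash rate is a guess, the request volumes are made up, and the scraper is treated as a single client identity.

```python
# Rough cost model, assuming a difficulty-d check takes ~16**d hash attempts
# on average and an in-browser hash rate of ~50k/s. All numbers below
# (hash rate, difficulties, request volumes) are illustrative assumptions.

BROWSER_HASHES_PER_SEC = 50_000


def seconds_per_check(difficulty: int) -> float:
    """Average wall-clock time to solve one proof-of-work check."""
    return 16 ** difficulty / BROWSER_HASHES_PER_SEC


human_pages = 200           # page loads by one person over a week (assumption)
bot_requests = 50_000_000   # requests by a scraper over the same week (assumption)

scenarios = [
    # label, difficulty, checks the human solves, checks the bot solves
    ("status quo-ish: one d=4 check per weekly session", 4, 1, 1),
    ("tweaked: one d=2 check on every request", 2, human_pages, bot_requests),
]

for label, diff, human_checks, bot_checks in scenarios:
    per_check = seconds_per_check(diff)
    print(label)
    print(f"  worst-case wait on a single page load: ~{per_check * 1000:,.0f} ms")
    print(f"  human CPU spent over the week:         ~{human_checks * per_check:,.1f} s")
    print(f"  bot CPU spent over the week:           ~{bot_checks * per_check / 3600:,.1f} CPU-hours")
```

Under those made-up numbers, the human's worst-case wait drops from over a second to a few milliseconds and their total CPU barely changes, while the scraper goes from amortizing a single check across the whole week to burning tens of CPU-hours.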