Given the occasional articles that crop up showing the sheer volume of badly-behaved (presumably) AI scraper bots, this makes all kinds of sense.
I can't find it now, but sometime in the past week or so I saw something that (IIRC) related to the BBC (?) blocking a ton of badly-behaved, obvious scraper traffic that was using Meta (?) user-agents but wasn't coming from the ASNs Meta actually uses. The graphs made it look like this reduced their sustained traffic load by about 80%.
Items where I'm doubting my recall (since I didn't find anything relevant in some quick searches) are marked with (?).
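For what it's worth, the general technique described above (rejecting traffic whose User-Agent claims to be a known crawler but whose source IP isn't in that operator's announced ranges) is simple to sketch. This is just an illustration, not whatever the site in question actually ran: the prefixes below are placeholders, and a real deployment would pull the current prefix list for the crawler's ASN (Meta's is AS32934) from a routing registry rather than hard-coding it.

```python
import ipaddress

# Placeholder prefixes for illustration only. A real check would fetch the
# live set of prefixes originated by the crawler operator's ASN.
CLAIMED_CRAWLER_PREFIXES = [
    ipaddress.ip_network("57.141.0.0/24"),   # hypothetical IPv4 range
    ipaddress.ip_network("2a03:2880::/32"),  # hypothetical IPv6 range
]

# UA substrings that amount to a "this is a Meta crawler" claim
# (names here are examples, not an exhaustive list).
META_UA_MARKERS = ("facebookexternalhit", "meta-externalagent")

def is_plausible_meta_crawler(user_agent: str, remote_ip: str) -> bool:
    """Return False for requests whose UA claims to be a Meta bot but
    whose source IP falls outside the operator's announced prefixes."""
    if not any(marker in user_agent for marker in META_UA_MARKERS):
        return True  # UA makes no crawler claim; nothing to verify
    addr = ipaddress.ip_address(remote_ip)
    # ipaddress membership handles v4/v6 mismatches by returning False
    return any(addr in net for net in CLAIMED_CRAWLER_PREFIXES)
```

A browser UA from any IP passes, a claimed crawler UA from inside the listed prefixes passes, and a claimed crawler UA from an unrelated IP gets flagged, which matches the mismatch pattern described above.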