There's a neat hardware comparison here:
https://en.bitcoin.it/wiki/Mining_hardware_comparison
Seems like ASICs are measured in the thousands to hundreds of thousands of MH/sec, whereas powerful GPUs drawing ~1000 watts don't even break 1000 MH/sec. High-end laptop GPUs seem to be in the tens of MH/sec, a quad-core Atom manages 2 MH/sec, and the Galaxy SII comes in at 1.3 MH/sec.
The vast majority of devices connecting to public APs are not going to be high-powered systems, and the time they spend connected is unlikely to be 24/7. Even if it were, mining would probably drain batteries pretty quickly, and power-saving modes on mobile devices will cap peak performance anyway. And if the attack is just injecting JS, backgrounded tabs should get much less CPU time, and WebGL etc. are unlikely to run in background tabs at all.
If you assume a device stays connected and open for a quarter of each day, sticks around for 3 days on average, and gives you 1 MH/sec (which seems optimistic, all things considered), then 1 million devices compromised per month gives you ~$300 a month. If the assumption is that you can persistently own a machine, you'd need fewer machines, but that goes beyond simple JS injection into HTML pages.
I used this calculator: http://www.alloscomp.com/bitcoin/calculator
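If you want to sanity-check the arithmetic without the calculator, here's a rough sketch of the same estimate. The difficulty, block reward, and exchange rate below are my own assumptions (roughly late-2013 values), not figures from the calculator, so plug in current numbers before trusting the dollar figure:

```python
# Back-of-envelope mining revenue estimate for the scenario above.
# Assumed network parameters (roughly late-2013 values -- assumptions, not gospel):
DIFFICULTY = 7.0e8      # network difficulty (assumption)
BLOCK_REWARD = 25.0     # BTC per block (assumption)
USD_PER_BTC = 600.0     # exchange rate (assumption)

# Assumptions from the post:
devices_per_month = 1_000_000
days_connected = 3        # average days a device sticks around
online_fraction = 0.25    # connected and active 1/4 of each day
hashrate = 1e6            # 1 MH/sec per device, in hashes/sec

total_hashes = devices_per_month * days_connected * online_fraction * 86400 * hashrate
# On average, one block is found per (difficulty * 2^32) hashes.
expected_blocks = total_hashes / (DIFFICULTY * 2**32)
usd_per_month = expected_blocks * BLOCK_REWARD * USD_PER_BTC
print(f"~${usd_per_month:.0f}/month")  # around $320 with these assumed numbers
```

With those assumed network parameters this lands close to the ~$300/month figure; the conclusion (it's not worth it per device) doesn't change much even if you're off by a factor of a few.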