Do you even need to read their research to call bullshit?
How many projects have achieved this ever?
How many DARPA project mission statements have you read? They all sound like that.
Your average piece of software on modern hardware is barely able to take advantage of even 1% of the machine's full capability; most of its time is spent waiting on memory and cache. Even when the chip is being utilized well, it spends a large portion of its power budget just distributing the clock signal across its billions of transistors. A parallel, asynchronous chip with dynamic power regulation would be able to offer several orders of magnitude better performance than a stock CPU. For an existing real-world example, just look at how much a GPU outperforms a CPU on the parallel workloads it's built for.
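To make the "hardware sitting idle" point concrete in software terms, here's a rough CPU-side analogy (not GPU code, and the exact speedup is machine-dependent): the same arithmetic expressed one element at a time versus expressed so the hardware can run it wide.

```python
import time
import numpy as np

n = 1_000_000
rng = np.random.default_rng(0)
a = rng.random(n)
b = rng.random(n)

# One-at-a-time loop: the chip's wide SIMD units and memory
# prefetchers mostly sit idle while we step element by element.
t0 = time.perf_counter()
s = 0.0
for i in range(n):
    s += a[i] * b[i]
loop_time = time.perf_counter() - t0

# The identical dot product, expressed as one bulk operation the
# hardware can execute in parallel (vectorized, prefetch-friendly).
t0 = time.perf_counter()
s2 = float(a @ b)
vec_time = time.perf_counter() - t0

print(f"loop: {loop_time:.3f}s  vectorized: {vec_time:.5f}s  "
      f"speedup: ~{loop_time / vec_time:.0f}x")
```

Same answer, same arithmetic, wildly different utilization of the silicon; a GPU (or an asynchronous parallel chip) pushes that gap much further.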
It also sounds like they are using compressed sensing techniques, probabilistic (randomized) linear algebra, or other rank-reduction approaches. These can yield massive reductions in power and massive increases in throughput, because they fundamentally reduce the number of bits of state that must be stored and transformed.
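As a toy sketch of the rank-reduction idea (a randomized range finder in the Halko–Martinsson–Tropp style; the matrix size, true rank, and sketch size here are made-up illustrative values, not anything from their research):

```python
import numpy as np

rng = np.random.default_rng(0)

# A 1000x1000 matrix that is secretly rank 20: a million entries,
# but only ~40,000 numbers of "real" information.
U = rng.standard_normal((1000, 20))
V = rng.standard_normal((20, 1000))
A = U @ V

# Sketch: project A onto a small random subspace instead of
# factorizing the full matrix.
k = 25                                   # sketch size, a bit above the true rank
Omega = rng.standard_normal((A.shape[1], k))
Y = A @ Omega                            # tall-and-skinny, cheap to work with
Q, _ = np.linalg.qr(Y)                   # orthonormal basis for A's range
B = Q.T @ A                              # small k x n core

# All further computation can happen on Q and B: ~25/1000 of the
# state of the original matrix, hence far fewer bits to move and flip.
A_approx = Q @ B
err = np.linalg.norm(A - A_approx) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

When the data really is (approximately) low-rank, the reconstruction error is near machine precision, and that state reduction is exactly where the power and throughput wins come from.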
TL;DR I haven't read a lick of this guy's research, but I absolutely believe that current hardware and software have a TON of room for improvement, at least where parallel image processing and machine learning are concerned.
They could accomplish great things; they could open the door to new techniques that achieve this decades from now. But that's different from achieving it themselves.
Also, you said you could see several orders of magnitude of improvement; even that falls short of their claim.
You also mention GPU vs CPU, which as far as I'm aware doesn't meet that standard either.
But seriously, while I'd agree a breakthrough of any magnitude is possible, I also think it's unhealthy to encourage people to make extremely unlikely claims just to land a decent research grant.