I've always thought PSNR gets a bad rap. It's not that PSNR isn't a very blunt tool; it's that every other tool in its class is pretty blunt too.
I mean, it evaluates a video codec by treating the video as a series of independent still images. That right there is crazy, but the same basic methodology applies to most of the alternatives, even the crazy obscure ones that no one really uses because they're too new or too processor-intensive.
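To make the bluntness concrete, here's a minimal sketch of how per-frame PSNR works: each frame is scored against the reference independently, so temporal artifacts are invisible to it. The function and variable names are my own, not from any particular codec test harness.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(MAX^2 / MSE).
    Higher is better; identical images score infinity."""
    ref = np.asarray(ref, dtype=np.float64)
    test = np.asarray(test, dtype=np.float64)
    mse = np.mean((ref - test) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / mse)

# A toy "video": three identical frames, each degraded the same way.
# Every frame is scored on its own -- flicker between frames, motion
# artifacts, etc. would never show up in these numbers.
ref_frames = [np.full((4, 4), 100, dtype=np.uint8) for _ in range(3)]
deg_frames = [np.full((4, 4), 110, dtype=np.uint8) for _ in range(3)]
scores = [psnr(r, d) for r, d in zip(ref_frames, deg_frames)]
```

A uniform error of 10 levels gives an MSE of 100, so each frame lands around 28 dB regardless of anything happening across frames.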
The Mozilla/Daala team have written a lot about these topics in connection with their work on a) Daala, b) NETVC (the new IETF codec project just getting started), c) evaluating improvements to JPEG for MozJPEG, and d) evaluating WebP:
https://tools.ietf.org/html/draft-daede-netvc-testing-00#sec...
https://arewecompressedyet.com/
http://people.mozilla.org/~josh/lossy_compressed_image_study...
In the end, like unit tests, performance benchmarks, static analysis, and various other software development tools, they're useful if you use them wisely and know their limitations, and dangerous if you abuse them or treat them as if they were magical.
But crappy tools that can be easily automated fill an important part of the toolbox, and I feel PSNR has its place. I'm deeply suspicious of anyone who looks down their nose at PSNR because they've just discovered SSIM, for example, which seems to be a common sentiment. They're both just crappy tools that can be used for good or ill if you know what you're doing. Running them both (and others too) might help catch more bugs than either alone, and if you're automating anyway, why the heck not?
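Running both in the same automated pass is cheap. Here's a sketch of the idea; note that `ssim_global` is a deliberately simplified whole-image SSIM (one global window, no Gaussian weighting), coarser than the standard sliding-window version, and the names are mine rather than from any real test harness.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(max_val ** 2 / mse)

def ssim_global(x, y, max_val=255.0):
    # Simplified SSIM: statistics over the whole image instead of a
    # sliding window, using the usual stabilizing constants c1, c2.
    c1, c2 = (0.01 * max_val) ** 2, (0.03 * max_val) ** 2
    x, y = x.astype(np.float64), y.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))

# Score the same degraded image with both metrics; each one can catch
# regressions the other misses, and both are trivial to automate.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(32, 32)).astype(np.uint8)
noise = rng.integers(-5, 6, size=(32, 32))
noisy = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)
psnr_score = psnr(ref, noisy)
ssim_score = ssim_global(ref, noisy)
```

A real pipeline would use a proper implementation (e.g. scikit-image's metrics) rather than this sketch, but the point stands: once the harness exists, adding a second metric is one more line.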