Indeed, though I do recommend watching that old western on Blu-ray. Someone who knows how to nail it on film actually captured way more than anyone even expected back then.
There is a fine art there to be appreciated. This is why Spielberg still shoots film. He has a mastery that continues to have value. He may find all of that blunted in the digital realm. Can't blame him.
I think we will eventually find the art in high resolution digital comes down to things other than perfection. Sometimes too much of a good thing is too much.
On another thread, I mentioned Pink Floyd's "The Wall" and how it has been reissued on CD alongside a remastered gold version that is insanely good.
Actually, it is too good. The average person may well appreciate the production values from the earlier work more, despite its considerable distance from perfection.
The trouble with all these discussions is that they center on what people like as opposed to perfection. Those two are not always the same, and "better" always has that subjective component to it.
Many digital movie productions I see have subtle aspects one can find distracting, aspects that break the immersion a movie is supposed to deliver. Keeping that immersion intact is what cinematography is for. I remain unconvinced everyone really understands that.
That's where the art is.
This is not to say advances in tech are bad or to be discouraged. Neither is true.
However, value perception of said tech may vary considerably from expectations.
On your last point, we actually lose the pixel art, and often the discussion moves on to other things. Fine, but it's not always bad to see the pixels.
With gaming, extreme realism, or just extreme quality, can break some of the escapism, fantasy, and abstraction inherent in the entertainment form. There is definitely room for both, and an active retro culture and indie scene borrow liberally from retro to see that art continue.
There are also some economies with analog means. I'm on a chip project right now that can offer up a great analog display. (Truth is, it will do 4K over analog, no sweat, if one wants to do that.)
Some of that is lost on digital devices. The thing is, generating the analog signal is at least an order of magnitude leaner, while offering comparable quality. Barrier to entry is low. That is also high value.
Raising that bar is good in many cases, but not all. So I find myself dealing with many subtle timing matters and artifacts of A-to-D conversion, all of them non-issues on actual analog displays. I will end up with a lean device that can do HDTV signals nicely, while still maxing out an old TV, which does way more than most expect, and does so sans an awful lot of hassles.
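To give a feel for what "subtle timing matters" means in practice, here is a minimal sketch of the arithmetic behind classic NTSC line timing. The constants are the standard NTSC numbers; the helper functions and the 52.6 µs active-line figure are my own illustration, not anything from the chip project itself.

```python
# Classic NTSC timing arithmetic: the kind of bookkeeping any analog
# video generator has to get right. Standard NTSC constants; function
# names are illustrative only.

NTSC_SUBCARRIER_HZ = 315e6 / 88         # 3.579545... MHz color subcarrier
LINE_HZ = NTSC_SUBCARRIER_HZ * 2 / 455  # horizontal rate, ~15734.27 Hz
LINES_PER_FRAME = 525                   # two interlaced 262.5-line fields

def line_period_us():
    """Duration of one scanline in microseconds (~63.56 us)."""
    return 1e6 / LINE_HZ

def frame_rate_hz():
    """Frames per second (~29.97, not a round 30)."""
    return LINE_HZ / LINES_PER_FRAME

def pixel_clock_hz(active_pixels, active_line_us=52.6):
    """Pixel clock needed to fit a given horizontal resolution into
    the active (visible) portion of a line; 52.6 us is a common
    approximation for the active region."""
    return active_pixels * 1e6 / active_line_us
```

Get these relationships slightly wrong and a real CRT will usually still lock onto the signal, while an A-to-D capture path may show shimmer or dropped lines, which is the asymmetry described above.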
As a "do it myself" kind of person, I do not always see the benefit of complex, resource intensive signals, compression, etc... as a good, or the better thing. And there are IP concerns too, all near completely absent from analog means and methods.
So then, "better" takes on some new depth when on is making or building from first principles. In the end, the system will deliver a great display for a fraction of the effort required to employ a fully digital path.
That's tech I know down to the core, can trust and control in any way desired. Does exactly what I want. High value as far as I am concerned. Timeless too. Works on anything ever made.
I just got a media player that refuses to interface with my other goodies. HDCP in play. It's hooked up via analog component, and delivers the great experience it is supposed to. Funny how that mess can all work.