However, modern content on a decent modern LCD panel, and especially on an OLED panel, blows CRTs out of the water. The vibrancy of the colors, the overall picture quality, the absence of that CRT 'glow', the readability of text: all of it is improved, in my opinion. CRTs also came with a number of maintenance chores: the screens attracted dust like a magnet, and you had to adjust the vertical and horizontal alignment and the color settings (both of which often ended up out of whack for some reason).
I'm sure there are benefits; for older games I see it. But for modern games I sometimes wonder whether this is people waxing poetic or being nostalgic. Some people will claim they gain an edge in online shooters, but I'm curious how much of that is real, considering the other losses in the pipeline, like the digital-to-analog conversion and how low the refresh rate is compared to modern gaming panels.
Nowadays we call that sub-pixel rendering or antialiasing. But in the 80s so many people were convinced TVs could only do 640x480. Our titling systems were typically rendering at 2400x480 to get good-quality character antialiasing and image shading on CRTs. This is still somewhat true today on the analog side, but common sub-pixel rendering or antialiasing achieves the same effect on LCDs.
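Roughly, the trick looks like this (a minimal NumPy sketch of the supersample-then-box-filter idea; the glyph stroke is made up, and I use an integer 4x factor for simplicity where the real ratio was 2400/640 = 3.75x):

    import numpy as np

    # Oversample horizontally, then average down so glyph edges become
    # fractional gray values instead of hard pixel steps.
    H, LO_W, FACTOR = 480, 640, 4
    hires = np.zeros((H, LO_W * FACTOR), dtype=np.float32)
    hires[:, 1280:1286] = 1.0  # hypothetical glyph stroke, 6 subpixels wide

    # Box filter: each output pixel is the mean of FACTOR subpixels.
    lores = hires.reshape(H, LO_W, FACTOR).mean(axis=2)
    # Columns straddling the stroke edge come out at 0.5, which reads as
    # smooth antialiasing on the analog scan line.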
You just brought back a vivid memory of something I haven't thought about in years!
For those who never experienced it: the constant firing of electrons into the screen caused it to become statically charged over time.
It would, yes, attract dust - and if you ran your hand over a TV that had been on for a while, you could feel the ‘fuzz’ of the electric charge directly with your fingers. I can remember the feeling so vividly now!
However, this article is from 2019, and I know they have some pretty snazzy gaming monitors today that might well be better than old top-end CRTs.
Most modern systems (and computers) are sending digital signals where entire frames need to be constructed and reconstructed in a buffer before they can be displayed. Many HD CRT televisions have terrible latency because they’ll take the analog scan line signal, buffer it into a frame, scale it to the native CRT resolution, then scan it back out to the screen. A high end PVM might allow a straight path, but there is maybe one Toshiba HD CRT that doesn’t introduce several frames of latency (iirc).
That said, from 1999 to 2008 I ran 1600x1200 on 19-inch CRTs, and except for professional LCDs, nothing had resolution, pitch, and color that came close. 2008 was the inflection point where the cost and quality of LCDs exceeded CRTs.
The article is outdated (like you said) because LCD/OLED displays have long since surpassed CRTs in latency and refresh rate.
A modern gaming LCD can refresh the entire screen multiple times over before an old CRT scans the entire frame and returns to the top to redraw the next one.
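To put rough numbers on that (my arithmetic, assuming a 240Hz gaming LCD):

    # How many full refreshes a 240 Hz LCD completes while a 60 Hz CRT
    # scans a single frame top to bottom.
    crt_hz, lcd_hz = 60, 240
    crt_scan_ms = 1000 / crt_hz          # ~16.7 ms per CRT frame
    lcd_refresh_ms = 1000 / lcd_hz       # ~4.2 ms per LCD refresh
    print(crt_scan_ms / lcd_refresh_ms)  # 4.0 refreshes per CRT frame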
I'll take a wild guess, though, that my group (CRT media watchers) is slightly harder to take advantage of with actual hardware than video gamers, which I'd guess is the reason there are few articles or HOLY GRAIL CRTs like this FW900 widescreen in the media-watching community. Not that we aren't often suckers for things like ORIGINAL VINTAGE POSTERS and SEALED MEDIA lol.
Would it really be 0 though? Assuming 60Hz, the bottom of each frame is scanned out 16ms after the top. Assuming that Rock Band 3 renders an entire frame before displaying it (definitely true) and that it actually renders at 60FPS as opposed to rendering at a higher resolution and tearing (definitely true on console, might not be true on an emulator?), the video latency will range from 0ms for the top of the frame to 16ms for the bottom, for an average of 8ms.
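Spelling that averaging out (my back-of-envelope, assuming a 60Hz progressive scan and whole-frame rendering):

    refresh_hz = 60
    frame_ms = 1000 / refresh_hz  # ~16.7 ms to scan out one frame

    def scanout_latency_ms(y_fraction):
        """Latency for a pixel at y_fraction (0.0 = top row, 1.0 = bottom)."""
        return frame_ms * y_fraction

    top = scanout_latency_ms(0.0)     # 0 ms
    bottom = scanout_latency_ms(1.0)  # ~16.7 ms
    print((top + bottom) / 2)         # ~8.3 ms average across the screen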
Admittedly, I don't know what Rock Band 3 calibration numbers actually measure, e.g. whether they already take into account factors like this.
If you can manage to render at a high frame rate then you could reduce latency by tearing, but at that point, I feel like you're leaving a lot on the table by not using a 240Hz OLED or something, which can show the entirety of every frame.
Supposedly OLED has comparable response times to CRT anyway. The article says that OLED gaming monitors are unattainable, but it's 5 years old and things have changed.
And it's not the refresh rate; it's the time from input to the picture updating on the display. With a CRT that can happen during the field currently being displayed, but it will take at least one frame for any LCD.
You can find some modern screens that do better at certain things than others, but compared with what most people have, CRTs are still likely to have better color accuracy, better contrast, zero motion blur, no dead/stuck pixels, far better viewing angles, no fixed resolution, higher refresh rates, and zero backlight bleed. It's not all nostalgia (although I really do miss the degauss button), but CRTs were also from before DRM, data collection, and ads being pushed at you, and it's hard not to be nostalgic about that.
Don't get me wrong, that smell will always take me back to a time when I had hair and it will without a doubt put a happy smile on my face any day. But it's all nostalgia anchored by a couple of technical advantages that pale in the face of overall tech today.
You're comparing the best CRTs ever made with the "average" modern LCD.
The run of the mill CRT that "most people had" was dim, blurry, flickery, and bloom-y.
This isn't true, and hasn't been true for decades. CRTs have relatively poor display contrast due to all of the light bleed. CRT contrast in real scenes (that is, not just comparing an all-black screen to an all-white one) can be around 100:1 or 200:1. LCDs have been better than that for a very long time. Even the cheap ones.
> zero motion blur,
Also not true. CRT phosphors have some persistence. Even the best CRT monitors back in the day had several milliseconds of persistence, depending on what you're measuring. Definitely not zero!
Modern LCD monitors have motion blur in a similar range, perhaps less if you're getting a gaming panel.
> no fixed resolutions
At the cost of some very blurry text, of course.
> You can find some modern screens that do better at certain things than others but compared with what most people have
It's almost certainly easier to find a high performing LCD than a quality CRT. A working FW900 CRT can easily fetch $1000-2000, which will buy a modern OLED display that completely blows it out of the water.
The only reason to buy a good CRT monitor is for the nostalgia/retro effects.
You're absolutely deep in nostalgia land lol
Sure, if you like motion blur that makes your content look like a slideshow. I personally don't. It's embarrassing that I'm still faced with worse motion quality than I had 30 years ago.
The sample-and-hold motion blur of LCDs/OLEDs ruined gaming for me for a long time. 240Hz OLED panels have begun to make it just bearable again, when such frame rates can be achieved.
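For anyone curious, the usual back-of-envelope for sample-and-hold blur (my numbers; the 1920 px/s tracking speed is a made-up example, one screen width per second):

    # An eye tracking a moving object smears each frame across the retina
    # for the full frame duration, so blur width ~= speed / refresh rate.
    def persistence_blur_px(speed_px_per_s, refresh_hz):
        return speed_px_per_s / refresh_hz

    for hz in (60, 120, 240):
        print(hz, persistence_blur_px(1920, hz))  # 32, 16, 8 px of smear
    # A CRT lights each phosphor for only ~1-2 ms per frame, so its
    # effective smear stays at a few pixels even at 60 Hz.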
What crappy panels are you buying?
I know 10 years ago, IPS panels were known for having terrible response times, but there are other types of LCD that have pixel response times in the 1 ms range.
This seems to suppose a world where, had LCDs been common, people wouldn't have used low-resolution bitmaps. But... there were not a lot of alternative technologies available! There are a few vector-art CRT arcade games out there, but generally very black-and-white/virtual-boy-style, and I believe bitmap graphics even without super-high-resolution would've won out anyway.
This also suggests that you couldn't tell the difference as much between the NES/SNES/PlayStation resolutions of bitmap-art games, when you DEFINITELY could. The old games never looked "great"; they were just the best we had.
It's less about "They would not have used low res bitmaps for displaying graphics" and more about "The artists would likely have arranged the pixels in the bitmaps in different ways"
There are some really interesting articles out there about how the artists for old NES and SNES games designed their sprites to look better with scanlines. Without scanlines (like when playing on modern emulators), everything kind of looks worse than it did when it was rendered on CRTs.
No. They looked their best because CRTs were able to reproduce more colors than an LCD.
Only now, after more than 10 years, do LCDs with HDR come close to CRTs.
I vividly remember my first LCDs, which were marketed as 24-bit, but on which you could see banding in colour gradients like 16-bit mode on a CRT.
> LCDs with HDR come close to CRTs.
This is pretty meaningless and conflates gamut with dynamic range. The vast majority of CRTs back in the day were driven by 8-bit-per-channel RGB DACs, so not HDR, and most CRTs had an sRGB or similar gamut (so not a wide gamut). It is true that both the dynamic range and gamut of cheaper LCD panels are pretty poor (~6 bits per channel) and don't even cover sRGB, and this set the tone for many low-cost TN displays of the early 2000s (and still adorns the low end of laptops and even some ThinkPads to this day).
However, affordable LCD monitors with wide gamuts (e.g. Adobe RGB or DCI-P3) have been around for YEARS, superior to all but the most expensive reference CRT monitors that virtually no one owned, and long before HDR became commonplace. I bought a 98% Adobe RGB monitor about 14 years ago for less than $800, and in color reproduction and contrast it completely blew any CRT I ever owned out of the water. But even the cheap <$300 IPS displays on sale for the past 15 years (and the panels in all MacBooks) will exceed most CRTs as well. In practice CRTs have middling contrast ratios too, unless you work in a pitch-dark room, which almost no one does.
> I remember vividly my first LCDs who were marketed as 24 bit
IPS and true 8-bit TN panels have been mainstream for a long time now. Nothing to do with recent uptake of HDR.
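A quick way to see that banding (my own sketch, simulating per-channel quantization; it ignores the FRC dithering real 6-bit panels use to fake 8 bits):

    import numpy as np

    # Quantize a smooth gradient to different per-channel bit depths.
    # Fewer levels -> visible bands, the "16-bit look" described above.
    gradient = np.linspace(0.0, 1.0, 1920)

    def quantize(values, bits):
        levels = (1 << bits) - 1
        return np.round(values * levels) / levels

    print(len(np.unique(quantize(gradient, 8))))  # 256 levels: smooth
    print(len(np.unique(quantize(gradient, 6))))  # 64 levels: banding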
In 1998, I liked the active matrix LCD on my Gateway laptop a lot more than the GDM-17E11 Trinitron on my SGI because the Trinitron had these 2 fucking wires across the picture, which annoyed me, and also the RGB convergence was off on the edges, and the geometry was poor and heavily bowed by about 0.5" on the bottom. Gross.
In 2020, I bought a cheap new-old-stock CRT monitor for retro nonsense, and threw it on an underpowered Linux system in my office for funzies, and I was like HOLY FUCK the response time on this is INSANE, and the blacks are UNREAL. I felt a Responsive Computer-Feeling I hadn't felt since using my CRT'd dual PPro Debian system in college. Blew away every aspect of every LCD in the house, even my overpriced top-of-the-range Sony LCD TV-- apart from the abysmal, headache-inducing max 60Hz refresh rate at a low 1024x768 resolution, distracting flyback transformer whine, and high-ish power consumption, that is.
Conclusions: all monitors suck. Always have, always will. CRTs flicker, LCDs muddle, OLEDs over-contrast and burn in. With apologies to Dorothy Parker: you might as well live.
But the glow is real! Gorgeous! I want it on my 70s terminals, but don't need it on my workstation.
Even in regard to latency, I'm kind of convinced those claims are a little overblown. LCDs do increase latency, but some of the more modern LCD TVs have a "low latency mode" that claims to get latency below 15 milliseconds; assuming most games run at 60FPS, that's below a single frame, and I don't think the vast majority of humans can even detect that. And for the few who can, OLEDs have you covered, with latency on the order of 2ms.
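The arithmetic behind "below a single frame" (mine, and admittedly trivial):

    frame_ms = 1000 / 60  # ~16.7 ms per frame at 60 FPS
    print(15 < frame_ms)  # True: a 15 ms low-latency mode is under one frame
    print(2 < frame_ms)   # True: ~2 ms OLED latency is a small fraction of it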
Still, I don't miss it; I never really liked it. People love to crap on LCD TVs, but honestly I'm an unapologetic fan of them. Even pretty cheap LCD TVs nowadays are really decent [1] and give a really sharp, nice picture with very few downsides. I have a MiSTer plugged into the $400 Vizio in my bedroom via HDMI, and SNES games just look so much better on it than they ever did on my CRT as a kid.
[1] Except for the speakers. Somehow built in speakers have gotten way worse than they were in the 90s, and TVs are borderline unusable without a soundbar or something.
I remember that with my first multi-monitor setup, the desk bowed heavily in the middle from all the weight. Now I have a 3x2 monitor setup, all larger than the largest of those, supported much more easily.
So your 91 pound TV has a significantly larger screen, and still weighs less than a third. This still seems like a win to me.
I’m not sure I’d risk full-time desktop use, but in a couple of years? Maybe.
Your other post mentions the screen brightness dropping with white menus open, but that is not a burn-in protection feature; it's due to limits on the total power consumption of the panel (either because of power-efficiency laws or too much heat).
The BlurBusters site has explored various other approaches over the years. Black frame insertion to reduce persistence has been tried, but requires high brightness and can create flicker. AI upscaling or interpolation stuff could help with frame rates.
I had one of these FW900s, which I got for free from work. The screen was extremely dim and it was assumed the flyback had gone bad. Since I am a card-carrying member of the "program the damn VCR" generation, I knew it was probably a bad capacitor. Throwing all caution to the wind, I removed the case and the shields around the multiple circuit boards. And with nothing more than a multimeter and a few cans of Red Bull (gives you wings, and you're going to need them) I found the offending cap on the back of the tube (IIRC, on the D-board).
With nothing more than time, a 10-cent capacitor, and the immortality of a 20-year-old, I was able to solder in a working capacitor and got a free 150lb, $2000 monitor for my effort.
If you're comfortable working on HV electronics and know how to solder and use a multimeter, getting one of these for cheap is completely doable.
Err, you missed the most important one: the knowledge to find the bad cap.
I've gone through a bunch of high refresh rate LCDs since, but nothing has matched the perfect hand-eye sync of a CRT running at high FPS and high refresh rate with a 1000 Hz mouse, zooming around dm3 in a complete state of flow.
Seriously, I'm just throwing a ball with my kid and I notice that shit now.
https://web.archive.org/web/20110927044427/https://geek.com/...
The hi-def 1080p CRTs used by Sony Broadcast R&D in the 90s were absolute beasts. Gorgeous displays; I remember so many people really were blown away by the hi-def content, and it wasn't even close to 4K! (I haven't yet seen any 8K content on an 8K screen; I prefer high refresh rates for 4K, at least 120Hz, rather than more pixels.)
They're not that expensive, relative to some of the larger TV screens[0]. Plus, when it's off, there's no bulky appliance to navigate around.
[0]: https://www.benq.com/en-us/projector/gaming/tk700/buy.html
I ended up with a 34" 16:9 Sony hi def CRT. No one wanted it. When I went to pick it up, I found out why... the thing weighed over 200 lbs!
I remember getting 3 of my friends to help me move that thing into my second story walk up apartment. I wouldn't be surprised if it's still there.
Naturally, the buyers often didn't want them either and it could become a point of contention between the parties! He even lost a couple sales because neither side would budge.
Fortunately my roommate at the time was very nice, and also a huge gym bro and was able to help me lug it back to my car after the semester was over.
I have no idea how I moved my 36" WEGA back in the 90s. I've blanked the pain out.
I don't play games anywhere close to as much as I used to, but I can't imagine working for 8-10 hours on a CRT monitor and not losing health.
But then, when you're programming with a black background and the brightness turned way, way down on a CRT, there is very little light shining into your face.
Maybe OLED will fix that, but it still has some growing up to do.
User error, IMO.
I'm of the opinion that if your monitor showing a pure white screen is painful, then either your brightness is too high or you don't have adequate ambient light.
And let's be honest, many CRTs don't look that great; they tend to suffer from bad geometry and color aberrations.
Things have changed substantially, especially regarding the availability of OLED gaming oriented monitors. CRTs were great (especially the Trinitrons), but they were stupid heavy. I’d much rather have an OLED today.
But I think my problem is that I sit VERY close to my monitor. If I'm sitting up straight, my monitor is only ~20 inches from my eyes.
My eyes are just too accustomed to focusing at that distance.
Recently I was playing Zelda 1 on the NES when the game crashed and deleted my save halfway through the 2nd quest. I tried to play it on an emulator and on the Switch, and both felt clunky compared to playing on the NES. I could probably have gotten through it, but it wouldn't have been fun.
In any case, most emulation software is geared towards gaming — many art galleries for example still use CRTs for preservation (https://www.nytimes.com/2023/10/17/t-magazine/technology-art..., https://youtu.be/rHBtmPZx82A?t=1828) and these software solutions don't really fill the gap of a general-purpose CRT display.
See the screenshots on https://dosbox-staging.github.io/
Edit: It was actually only up to 96Hz, but gaming on it didn't feel like there was too much motion blur.
For those big sizes, there was nothing comparable; everything else was so convex that it was hard to use as a monitor once you got to about 19 or 21 inches.
Throughout the 80s, 90s, and early 2000s I spent obscene amounts of money trying to minimize them. Smaller and smaller grilles, the switch to shadow masks and their infernal lines; it never stopped.
To me, any screen technology where I can look at the clock in the upper right or lower left corner and see the pixels comprising the numbers (or a blurry smear) is trash.
The same with colors. Any screen technology that cannot accurately reproduce colors is trash. Asking a CRT to accurately reproduce colors, even something as minimally acceptable as Rec.709, is like asking me to perform brain surgery: it ain't happening brah.
Edit: also the DigitalFoundry video referenced in the article is a much better watch imo https://www.youtube.com/watch?v=V8BVTHxc4LM
Oh, and thinking about it, I just had a horrible flashback to the intersection of two awful aspects of that time of my life: a day when I needed to use the interlaced mode and the neighbouring resort had one of their weekly reggae concerts with 120+ dB sound. My headache was epic.
The LCD beats it in every metric. Even in price - I can get an LCD monitor for $10 from the thrift store.
P.S. I can't hear the whine anymore in a CRT. Gettin' old.
“I had this friend his name was Marc with a C, his sister was like the heat coming off the back of an old TV”
It wasn't until OLED monitors came around that I finally felt like flat panel displays had really caught up.
I owned one, but I didn't want to. My 21" Trinitron blew up in the final month of its 5-year warranty and they didn't have a way to replace it so they sent me the 24". Crazy, must have cost a fortune for them just to ship it.
An HD image that works with the many retro light gun games.
Has humankind stopped manufacturing CRTs entirely?
And of course, modern OLED beats the pants off of both in that regard.