
Digital Foundry: PlayStation 5 and Series X tested on a CRT - a game-changer for image quality?



In the past, Digital Foundry has enthused about the quality of high-end CRTs, the display technology of yesteryear that can still deliver some aspects of image quality that no modern screen can match. We've talked about contrast, precision, motion resolution and much more. Now, two years on from us acquiring the Sony GDM-FW900 - quite possibly the best gaming CRT money can buy - we've tested out how the display works with the new wave of consoles: Xbox Series X and PlayStation 5. Are all of the benefits of CRT still there? How on earth do you connect an HDMI device to an 18-year-old display? How does today's 4K rendering stand up on a CRT screen? And since we first looked at CRTs, have modern displays made any strides in matching up to the strengths of the cathode ray experience?

You can see for yourself by checking out the video below, where I test out a range of PlayStation 5 and Xbox Series X games on my own FW900 and show off how the latest OLED screens from LG are able to compete against one of the key strengths of CRT - but first of all, let's go back to basics. What makes the FW900 so special? Put simply, it's Sony deploying its Trinitron tech to maximum effect, with a relatively large 24-inch 16:10 screen. It's able to process virtually any resolution up to 2560x1600 at 60Hz, and if you scale down the resolution, it's possible to increase the refresh rate - so yes, high refresh rate gaming is possible. The downsides? A 24-inch screen is small by today's standards, yet the FW900 is a huge, desk-dominating box, and it weighs 42kg, meaning it is hardly portable.
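To put that resolution/refresh trade-off into rough numbers: the electron gun can only draw so many scanlines per second, so the fewer lines per frame, the more frames per second it can manage. Here's a minimal sketch, assuming the commonly quoted ~121kHz maximum horizontal scan rate for the FW900 and a rough five per cent vertical blanking overhead - both figures are illustrative assumptions, not measurements:

```python
# Rough check of which video modes fit under a CRT's horizontal scan-rate limit.
# MAX_H_SCAN_KHZ is the commonly quoted FW900 ceiling and VBLANK_OVERHEAD is a
# ballpark allowance for vertical blanking - both are illustrative assumptions.
MAX_H_SCAN_KHZ = 121.0
VBLANK_OVERHEAD = 1.05

def required_scan_khz(active_lines, refresh_hz):
    """Scanlines drawn per second (in kHz) for a given mode."""
    return active_lines * VBLANK_OVERHEAD * refresh_hz / 1000.0

for width, height, hz in [(2560, 1600, 60), (1920, 1200, 85), (1280, 800, 120)]:
    khz = required_scan_khz(height, hz)
    verdict = "fits" if khz <= MAX_H_SCAN_KHZ else "exceeds the limit"
    print(f"{width}x{height}@{hz}Hz needs ~{khz:.1f} kHz -> {verdict}")
```

Halving the active line count roughly halves the scan-rate cost, and that headroom is what lets you push the refresh rate up instead.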

Inputs are via VGA or BNC, where five sockets allow for individual connection of the red, green and blue signals along with horizontal and vertical sync. In terms of connecting a modern device, HDMI to VGA adapters are readily available, and achieving 1080p60 is possible even with cheapo adapters. We've tested a Vention box from Amazon UK (US link here), which seems to allow for 1440p60 too. USB-C and DisplayPort adapters are also available which will get the job done for PC users.

At this point I think it's worth stressing that the primary use-case scenario for CRTs remains in the retro gaming space - and while Sony's high-end broadcast monitors offer the best possible experience, good consumer-level TVs still look wonderful. PC CRT monitors mostly use a 4:3 aspect ratio, so they won't be entirely suitable for modern gaming, but they still work beautifully for older games - particularly classic DOS games that target 70Hz refresh rates, which cause problems when running on the most common screens of today.

But in our tests here, the FW900 proved stunning in running Xbox Series X and PlayStation 5 games - even downscaled to 1080p. That's the first key advantage of CRT technology: the concept of native resolution isn't really relevant here. Yes, there are resolution limits set by the aperture grille or shadow mask, but the input resolution is always resolved with no scaling. Modern monitors have a fixed pixel grid structure, and if you aren't at native res, ugly interpolation is involved in the upscaling process. CRTs go about things in an entirely different way. An electron gun at the neck of the tube fires beams of electrons, accelerated by the anode, through a mesh (a shadow mask or aperture grille) and onto the phosphor-coated screen, lighting up the final image.

The end result is very different to the modern fixed-grid approach: if you opt for extra resolution, this can add detail (check out the Ratchet comparison image below), but the point is that, liberated from the concept of native resolution, a CRT with downscaled 4K content still looks stunning - even at 720p - because the way in which CRTs literally 'beam out' information changes the user's perception of detail and resolution. This is the reason why developers are tackling the challenge of 4K rendering through image reconstruction - mitigating most of the issues of modern upscaling by getting as close as possible to a 4K image without the screen's own scaler kicking in.
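To illustrate the fixed-grid problem, here's a minimal sketch of non-integer upscaling onto a fixed pixel grid. Bilinear interpolation is used as a stand-in for whatever a particular display's scaler actually does (an assumption for illustration): a hard edge in the source gets smeared across invented in-between values, which is exactly the softness a CRT sidesteps by simply scanning out whatever signal it is fed.

```python
import numpy as np

# One row of pixels containing a hard black/white edge, as you'd get from
# crisp UI text rendered at a non-native resolution.
row_src = np.array([0, 0, 0, 255, 255, 255], dtype=float)

# Resample it onto a denser fixed grid at a non-integer scale factor,
# using bilinear (linear, in 1D) interpolation.
src_positions = np.arange(len(row_src))
dst_positions = np.linspace(0, len(row_src) - 1, 11)
row_scaled = np.interp(dst_positions, src_positions, row_src)

print(np.round(row_scaled))
# The once-hard edge now passes through an invented mid-grey (~128).
```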

The other key advantage of CRT is motion resolution. Modern displays use a system called sample and hold, which drops displayed resolution significantly when the content is in motion. Remember how blurry football matches looked moving left to right on the earliest LCD and plasma displays? That's the most egregious example of sample and hold's problems. The situation has improved over time, but there's still a dramatic drop in motion resolution even on the most modern screens. The nature of CRT technology ensures that this is a non-issue. The good news is that strides are being made here and we are getting very, very close to CRT quality thanks to black frame insertion, known as Motion Pro on LG screens. By strobing quickly between content and a black frame, the artefacts of sample and hold are mitigated. LG's 2020 and 2021 OLED screens with Motion Pro are getting very, very close to matching CRT's motion quality - the minus point being a dimmer look to the content owing to the black frame strobe. Beyond that, other display manufacturers are offering their own solutions, albeit with varying levels of effectiveness.
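As a back-of-the-envelope way to see why strobing helps: perceived motion blur on a sample-and-hold display scales with how long each frame stays lit (its persistence). A minimal sketch with purely illustrative numbers, not measurements:

```python
# Approximate per-frame persistence: how long each frame remains lit.
# Sample-and-hold keeps the frame up for the whole refresh interval;
# BFI/strobing lights it for only part of the interval (the duty cycle).
def persistence_ms(refresh_hz, duty_cycle=1.0):
    return 1000.0 / refresh_hz * duty_cycle

print(f"60Hz sample-and-hold : {persistence_ms(60):5.1f} ms of smear")
print(f"120Hz sample-and-hold: {persistence_ms(120):5.1f} ms")
print(f"120Hz + BFI (50% lit): {persistence_ms(120, 0.5):5.1f} ms")
# A CRT phosphor glows for roughly a millisecond or two, which is why its
# motion stays sharp - the strobed OLED gets close, at the cost of brightness.
```

Those shrinking numbers are also why raw high refresh rates help on their own, even without strobing - though a CRT's short phosphor decay still sets the benchmark.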

So with our CRT testing with PlayStation 5 and Series X complete, what have we learned? Well, as fascinating as it is and as fantastic as the results are, modern day gaming on a CRT is very much a niche endeavour - the Sony GDM-FW900 works beautifully because it is a widescreen display and it's relatively easy to convert HDMI across to work with it. In terms of a precision look, with astonishing contrast and a crystal-clear image, it is still peerless - but it's not living room friendly to say the least, and if you do spend thousands to get one, the chances are you'll need to put the screen through an extensive calibration process to get the best results from it. It's not easy. The truth is that the popularity of CRTs with enthusiast gamers is very much in the retro space and that is very definitely the best fit for this kind of technology - the games machines and the titles on them were specifically designed for this technology.

And the good news is that modern displays are improving - dramatically so. OLED looks beautiful, screen sizes are getting larger and prices are coming down. Meanwhile, it's gaming that's driving the feature set and making these screens better - and it's all paying off with HDMI 2.1. High refresh rate 120Hz support is codified as a standard in most new living room displays, right up to 4K. Auto low latency modes ensure that games bypass the lag-inducing processing found on many screens, while we're seeing the arrival of genuinely game-changing features too - like variable refresh rate support - not to mention black frame insertion. We're never going to go back to the CRT era, but there's still something very special about gaming on the FW900 - something unique and wonderful, and something we've lost in the modern era.

 

YCoCg

Member
Must be a slow news week for DF.

They also seem obsessed with BFI on OLEDs, which, given the dimming and flickering, looks horribly distracting in my opinion.
Eventually we will get there, but the goal would be strobing at 120Hz (so in effect 240Hz, with every other frame being a full black frame) and a high enough nit count / HDR rating to offset the dimming.
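Rough arithmetic on that last point (the target figure below is made up purely for illustration): with every other frame black, the eye integrates roughly half the light, so the lit frames need to be about twice as bright to look the same.

```python
# Back-of-the-envelope brightness compensation for 50% duty-cycle BFI.
# target_perceived_nits is a hypothetical figure, not a spec of any panel.
target_perceived_nits = 400
duty_cycle = 0.5  # every other frame is fully black

required_lit_frame_nits = target_perceived_nits / duty_cycle
print(f"Lit frames need roughly {required_lit_frame_nits:.0f} nits")  # ~800
```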
 

ManaByte

Member
[Babylon 5 reaction GIF by hero0fwar]
 

RaZoR No1

Member
Was the CRT technology already at its limits or why did we change to LCD?
Weight? Size? Thickness?
Is it even possible to make a modern CRT?
 
It's funny how we dumped CRT years ago but are still chasing its image quality. I had the biggest non-projection CRT TV they made, a Sony 36-inch 4:3 HDTV; of course, when you watched HD content it went to widescreen, so the actual viewing area was smaller than 36 inches. The TV was around 2 feet deep and over 200lbs, so it wasn't exactly a good fit for smaller rooms or for sitting in an entertainment center. The HD picture on it was great though - yeah, older content looked worse, but HD was crystal clear, no ghosting during movement etc.
 

TLZ

Banned
Just let them die already.

I tried the CRT thing for a while and it's really not worth it. Takes up a lot of space and is very heavy for very, very little difference in image quality for older stuff.

Just let go John. You're obsessing.
 

TrueLegend

Member
This is the problem with DF. Not once does he point out how horrible the color reproduction is compared to modern panels. The reason we need higher brightness is that certain shades of a color are only visible to the human eye at a certain luminance level, not just to make things brighter. Also, the LG CX is a decent OLED but it has horrible color science. If anyone here is looking to buy an OLED, I suggest getting a Sony. If you've got the money, get the A90J, but you will have to wait for VRR. The CX produces comparatively cooler tones and deviates from the creator's intent significantly. If not an OLED, then the Q90A.
 

bender

What time is it?
This is the problem with DF. Not once does he point out how horrible the color reproduction is compared to modern panels. The reason we need higher brightness is that certain shades of a color are only visible to the human eye at a certain luminance level, not just to make things brighter. Also, the LG CX is a decent OLED but it has horrible color science. If anyone here is looking to buy an OLED, I suggest getting a Sony. If you've got the money, get the A90J, but you will have to wait for VRR. The CX produces comparatively cooler tones and deviates from the creator's intent significantly. If not an OLED, then the Q90A.

I jumped off the OLED bandwagon due to image retention.

[Photo of the OLED panel showing burn-in]


Less than two years of use. I know the newer panels are supposed to have improved but it's just too expensive to risk it again.

My KURO lasted 10 years (still works). Plasma has its own drawbacks: power consumption, image retention (never had a problem with mine though) and buzzing at altitude.
 

RaZoR No1

Member
A 34" XBR WEGA was about 190lbs. Dimensionally it is quite bulky as well.
Yeah... CRT was pretty heavy 😅
Still, what was the real reason? The weight is a problem, but not a reason to switch to the inferior tech. How often do we have to move the TV? In the best case, only once or twice.
 

Chiggs

Member
This is the problem with DF. Not once does he point out how horrible the color reproduction is compared to modern panels. The reason we need higher brightness is that certain shades of a color are only visible to the human eye at a certain luminance level, not just to make things brighter. Also, the LG CX is a decent OLED but it has horrible color science. If anyone here is looking to buy an OLED, I suggest getting a Sony. If you've got the money, get the A90J, but you will have to wait for VRR. The CX produces comparatively cooler tones and deviates from the creator's intent significantly. If not an OLED, then the Q90A.

A lack of technical proficiency best sums up DF, in general. Maybe Modern Vintage Gamer can help them out with some more cameo appearances. That guy at least has some technical chops.
 

bender

What time is it?
Yeah... CRT was pretty heavy 😅
Still, what was the real reason? The weight is a problem, but not a reason to switch to the inferior tech. How often do we have to move the TV? In the best case, only once or twice.

The weight and bulk aren't good for manufacturers (shipping), distributors (storage) or end users (furniture to accommodate it). Beyond that, the screen sizes people demand these days wouldn't be feasible with CRT.
 

Kilau

Gold Member
Yeah... CRT was pretty heavy 😅
Still, what was the real reason? The weight is a problem, but not a reason to switch to the inferior tech. How often do we have to move the TV? In the best case, only once or twice.
The technology doesn't scale well, and the tubes are extremely dangerous under the wrong circumstances. Going larger wasn't stable or safe.

Old content looks great on CRTs because it was designed to be displayed on them, there is no reason to use them for content that is intended for newer technology.

A lack of technical proficiency best sums up DF, in general. Maybe Modern Vintage Gamer can help them out with some more cameo appearances. That guy at least has some technical chops.
Do you mean to tell me that a plucky console with enough grit and determination can't "punch above its weight" to produce impossible results?
 

Dream-Knife

Banned
I have an old Toshiba crt for retro gaming.

Idk why you'd want to play modern games on them though. Games back then weren't built to see small details in the distance. This is like some hipster thing I guess.

I also wouldn't buy an OLED though, due to image burn-in.

HDR is a scam. Have to wear sunglasses for color accuracy? I think the artist's vision can make a compromise.
 

ethomaz

Banned
Was the CRT technology already at its limits or why did we change to LCD?
Weight? Size? Thickness?
Is it even possible to make a modern CRT?
Costs.
With LCD you could reach big screens at way lower cost.

Yes, it is possible to make a modern CRT device, but who wants that?
 

Kenpachii

Member
Was the CRT technology already at its limits or why did we change to LCD?
Weight? Size? Thickness?
Is it even possible to make a modern CRT?

Yup, that's why: weight and size. A 21-inch CRT was no joke - the thing was heavy as fuck and big. Also, CRT colors are dog shit. Flat screens came along and nobody ever wanted CRTs back.
 

old-parts

Member
The video is decent and CRTs still have some image quality advantages, but there are two things I disagree with.

1) The Sony FW900 CRT in the video was top of the line; many consumer CRTs from that era do not hold up and have image quality issues (CRTs do degrade over time).

2) Sample and hold performs better the higher the refresh rate, so motion clarity improves a bit (still not as good as CRT). This video was focused on consoles, but they really should have explained that this is part of the reason for high refresh rate displays. For example, look at the minimal blur on the Asus VG279QM at its 280Hz refresh rate on PC.

The LG CX/C1 OLED plus BFI (Motion Pro) at 120Hz is amazing and definitely the next best thing to a high quality CRT. Sony OLEDs are not gaming focused and their specific advantages are not worth it; since VRR is not compatible with BFI, the lowest input lag is what matters most. BFI input lag on the LG C1 is still lower than the input lag of the Sony A90 without BFI.

Panasonic are finally starting to take gaming seriously on their OLEDs, so they are the ones who might outdo LG with BFI and gaming.

This is a better look at the pros and cons of CRTs from Aperture Grille, very detailed.
 

Arioco

Member
I jumped off the OLED bandwagon due to image retention.

[Photo of the OLED panel showing burn-in]


Less than two years of use. I know the newer panels are supposed to have improved but it's just too expensive to risk it again.

My KURO lasted 10 years (still works). Plasma has its own drawbacks: power consumption, image retention (never had a problem with mine though) and buzzing at altitude.


Which TV do you have (brand and model)? Is that image retention or permanent burn-in? And finally, how do you use your TV? Do you vary your content? A lot of gaming?

Looks really bad to be honest. As an OLED owner I wouldn't like to find that on my panel. 😓
 

bender

What time is it?
Which TV do you have (brand and model)? Is that image retention or permanent burn-in? And finally, how do you use your TV? Do you vary your content? A lot of gaming?

Looks really bad to be honest. As an OLED owner I wouldn't like to find that on my panel. 😓

2017 LG. It's permanent. My sets are primarily used for gaming. Rocket League was the culprit, and I consider myself a heavy user. My KURO took my gaming abuse much better: it would get temporary retention, but using the built-in test patterns would fix those retention issues. Not so with the LG. I used my KURO for 10 years, the LG less than 2. I'm using a Sony LED now, which has its own set of issues. Going from a god-tier plasma to OLED to LED with local dimming zones took time to adjust to.
 

Arioco

Member
2017 LG. It's permanent. My sets are primarily used for gaming. Rocket League was the culprit, and I consider myself a heavy user. My KURO took my gaming abuse much better: it would get temporary retention, but using the built-in test patterns would fix those retention issues. Not so with the LG. I used my KURO for 10 years, the LG less than 2. I'm using a Sony LED now, which has its own set of issues. Going from a god-tier plasma to OLED to LED with local dimming zones took time to adjust to.


So sorry to hear that. Hope newer OLEDs have improved when it comes to preventing burn-in. I'm very cautious with mine, though. I don't game too much, and if I do I remove the HUD. I also have an LED TV (a Samsung QLED in my case), but yes, going from perfect blacks to dimming zones takes a while to get used to.
 

bender

What time is it?
So sorry to hear that. Hope newer OLEDs have improved when it comes to preventing burn-in. I'm very cautious with mine, though. I don't game too much, and if I do I remove the HUD. I also have an LED TV (a Samsung QLED in my case), but yes, going from perfect blacks to dimming zones takes a while to get used to.

Supposedly newer panels are better in that regard, but I'm obviously gun-shy. For a premium product, I wasn't terribly happy with the set. A lot of the 2017 models (including mine) had factory defects:


I'd definitely not do business with LG again and Sony (same panels) always seem to be behind on features that are important to me.
 
This seems to be some sort of hipster fetish or whatever. Really, most people do not want to deal with bulky CRTs or even have a place for one. Also, color reproduction is much worse than on a modern panel.
 

supernova8

Banned
I care, but there is no chance in hell that I can get one for under $3k.
I dunno, it just reminds me of that thread where people were discussing that adapter thing which makes emulated games super duper accurate to how they would've looked. Like a hardware interlacer or something. Massive waste of money IMO, but I guess some people are into that sort of thing.
 
Supposedly newer panels are better in that regard, but I'm obviously gun-shy. For a premium product, I wasn't terribly happy with the set. A lot of the 2017 models (including mine) had factory defects:


I'd definitely not do business with LG again and Sony (same panels) always seem to be behind on features that are important to me.
LG covered that factory defect even out of warranty, replacing the entire panel with a newer one. If you had this factory defect, you should have contacted LG. I did it about a year and a half ago, and a single photo of the screen with the lighter rectangle on the reds was enough to get a technician to come to my home with a new screen. I'm sure it worked like this for at least all of Europe and the USA.
 

Hoddi

Member
I wish they had found ways to keep improving CRTs. They seem pretty great.

Also, makes me want to hold on to my plasma for a bit more.

I'm holding on to mine for the time being. I've already upgraded the living room TV to OLED but ended up moving my consoles back to the plasma.

I also have an old tube TV for retrogaming and would kill for an FW900.
 
This video seems to be getting shit on here, but I enjoyed it.

I love CRTs, and while I would never actually use my PS5 on a CRT in general I would absolutely love to see a PS5 on the FW900. That monitor is just second to none. I have a small Sony CRT that I pull out sometimes for PS2, and occasionally for watching stuff like Evangelion or Paranoia Agent off my PS3. I've been on the lookout for a good (cheap and not $3K) monitor as a secondary for old PC games but my searches have found nothing.

What was that display type that somebody was developing that combined the best aspects of both CRTs and LCDs? I can't for the life of me remember the name but it sounded amazing.
 