
Old CRT TVs had minimal motion blur; LCDs have a lot. LCDs will need 1,000fps@1,000Hz to have motion blur as minimal as CRT TVs

svbarnard

Banned
I bet for a lot of you this is going to be entirely new information, because it was for me when I first read this article back in 2018: https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/ Read it if you truly want to know what causes motion blur on modern LCDs/OLEDs. The author, Mark D. Rejhon, is a pioneer in the field of display motion blur; he's practically writing the book on it, and people who have seen him give talks have even called him the Albert Einstein of his field. It's funny: before I ever read that article, I remember playing Destiny on my 60-inch LCD and noticing motion blur whenever I moved the camera quickly. I googled "motion blur on TVs" but nothing really explained it, until I stumbled upon that article. All I can say is read it and you'll see that modern flat panels have a glaring motion blur problem compared to old CRT TVs, and the only way to fix it is higher frame rates: 60fps is not good enough, 120fps is not good enough, 240fps might be good enough, but honestly we're going to need 1,000fps! Isn't that crazy? Read the article and you'll understand.

This is what bothers me about the TV industry: it keeps chasing higher resolutions instead of higher refresh rates. We're getting 8K 60Hz TVs (and after 8K they'll go to 12K, all while keeping the refresh rate at 60Hz). Honestly, I had no idea that CRT TVs had minimal motion blur and that LCDs have horrible motion blur, so most likely the executives at the TV companies aren't aware either; they're businessmen, not scientists, after all. The industry needs to shift its focus away from higher resolutions and toward higher frame rates and refresh rates. It should STOP at 4K and NOT proceed to 8K: instead of 8K TVs we should be getting 4K TVs with higher refresh rates, first 4K 240Hz, then 4K 480Hz, then 4K 960Hz. And 4K is enough, because 1080p looks absolutely fine on a 4K screen. Say everyone had a 4K 240Hz LCD: the new all-powerful Xbox Series X with 12 teraflops couldn't run games at 4K 240fps, but it could maybe do 1080p at 240fps, which would be fine because, again, 1080p looks just fine on a 4K screen.

This illustration shows how higher frame rates on LCDs/OLEDs lead to less motion blur. The illustration is copied from the article. Sample-and-hold displays are LCDs/OLEDs.



Just to be clear: you can't take a 1,000Hz display, play a 30fps game on it, and expect to see less motion blur; it doesn't work like that. You need both a display with a 1,000Hz refresh rate and games or movies running at 1,000 frames per second in order to see less motion blur. So, was this new information for you like it was for me?
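The article's core claim can be sketched with some back-of-the-envelope Python (my own illustration, not code from the article): on a sample-and-hold display your eye tracks the motion continuously while each frame sits still, so the perceived blur width is roughly tracking speed times frame persistence.

```python
def blur_width_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    """Approximate perceived motion blur (in pixels) on a sample-and-hold
    display, assuming the eye tracks a moving object perfectly and each
    frame is held for the full refresh interval."""
    persistence_sec = 1.0 / refresh_hz
    return speed_px_per_sec * persistence_sec

# An object scrolling at 1000 px/s:
for hz in (60, 120, 240, 1000):
    print(f"{hz:>5} Hz -> {blur_width_px(1000, hz):.2f} px of blur")
```

Under this simple model, 60Hz smears a 1000 px/s pan across ~17 pixels, while 1,000Hz cuts it to a single pixel, which is where the "1,000fps@1,000Hz" figure comes from.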

P.S. Also, apparently NVIDIA and ASUS now have a long-term roadmap to 1,000Hz displays by the 2030s.

One final thing: a lot of games ship with motion blur turned on in the settings (Doom 2016 and Doom Eternal, for example). Go into the settings, and if you see "motion blur", turn it the fuck off! LCDs/OLEDs already have motion blur, so why would you want more? Why game developers pile extra blur on top of displays that already blur as much as they do is beyond me. Listen, just take 15 minutes out of your day and read the article; I swear to god it will change the way you look at modern TVs.

Edit: I want to add one more thing. You know how GPUs dedicate silicon just to ray tracing, tweaking part of the chip to do one task and one task only, which greatly increases efficiency? That's how we get ray tracing on the PS5 and Xbox Series X at all; they're certainly not powerful enough to brute-force it. Well, the same thing can be done for ultra-high frame rates. It's called frame rate amplification technology: https://blurbusters.com/frame-rate-...es-frat-more-frame-rate-with-better-graphics/
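For illustration, here is a toy Python sketch of the simplest possible form of frame rate amplification, plain linear blending between two rendered frames. Real hardware would use motion-compensated or AI-generated in-betweens; the function and names here are my own, not from the linked article.

```python
def amplify(frame_a, frame_b, n_inbetween):
    """Generate n_inbetween linearly blended frames between frame_a and
    frame_b (each a flat list of pixel intensities). A toy stand-in for
    the motion-compensated interpolation real hardware would use."""
    generated = []
    for i in range(1, n_inbetween + 1):
        t = i / (n_inbetween + 1)  # 0 < t < 1, evenly spaced in time
        generated.append([(1 - t) * a + t * b
                          for a, b in zip(frame_a, frame_b)])
    return generated

# 60fps -> 240fps means 3 generated frames between each real pair:
mids = amplify([0.0, 0.0], [1.0, 1.0], 3)
```

Here `mids` holds three intermediate frames at 25%, 50%, and 75% of the way between the inputs, which is the basic idea of getting four output frames per rendered frame.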
 

cireza

Banned
Always good to remind people of this.

I have tried many HD TVs, even a good OLED one, and nothing comes close to the motion clarity of my CRT TV. I have had motion clarity problems with my consoles ever since I moved to HD TVs. I noticed the blur from the very first day I got my first LCD HD TV, and back then I simply could not understand what was going on with people and the industry.
 

StreetsofBeige

Gold Member
Was it motion blur? Or just how the lines blended together?

It was like CRTs had this built-in anti-aliasing that smoothed things out despite the shitty resolution. I don't know what resolution CRTs had, but it was lower than the 720p HD era when LCDs popped up.

Yet put a Genesis game on a CRT and it looks and moves reasonably well for its era. Load the same game on a modern flat panel monitor or TV and it looks like blocky bright shit where you can count the lines.
 
I play on a Sony OLED with 120Hz black frame insertion, the best implementation of BFI there is (Sony's). A game can run a locked 60fps with BFI engaged, and the motion is still FAR cleaner and smoother on my Trinitron. I have not yet tried 120fps gameplay with BFI on my OLED, so I'm not sure about that, but I would be surprised if it was as good as the CRT.

EDIT: Also, I wonder how much 240Hz BFI will help once TVs start having 240Hz panels.
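The trade-off in this discussion can be put in rough numbers (my own sketch; the duty-cycle model is a simplification): BFI shortens how long each frame stays visible, which cuts blur proportionally, but average brightness drops by the same factor.

```python
def persistence_ms(refresh_hz: float, visible_fraction: float = 1.0) -> float:
    """Time each frame stays lit per refresh cycle, in milliseconds.
    visible_fraction < 1.0 models black frame insertion (the rest of
    the cycle is black); brightness scales down by the same fraction."""
    return 1000.0 / refresh_hz * visible_fraction

# Plain 120Hz hold vs 120Hz with 50% BFI vs hypothetical 240Hz + 50% BFI:
print(persistence_ms(120))        # ~8.33 ms
print(persistence_ms(120, 0.5))   # ~4.17 ms, at half the brightness
print(persistence_ms(240, 0.5))   # ~2.08 ms
```

This is why 240Hz BFI is interesting: it halves persistence again without the duty cycle (and thus brightness) having to shrink any further.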
 
One final thing. A lot of games have motion blur turned on in the settings (Doom 2016 and Doom Eternal for example) go into settings and if you see "motion blur" turn it the fuck off! LCDs/OLEDs already have motion blur so why would you want more? Why game developers do that is beyond me, modern flat panel displays already have a lot of motion blur as it is so why they add more to it is crazy. Listen just take 15 minutes out of your day and read the article I swear to god it will change the way you look at modern TVs.
This isn't really true for OLED. At 30fps you need camera-style motion blur on OLED: because of the fast pixel response there's little to no perceptible display blur, so without any blur the choppiness sticks out horribly. At 60fps, however, you don't really need motion blur on OLED, though a tasteful implementation of per-object blur (meaning a high sample count, applied only to fast motion!) could still be beneficial.
 
Eeeeeh... depends on the scene. In dark scenes with bright objects, CRTs can have pretty horrendous ghosting as the object moves around. And unlike an LCD, where the blur is a result of sample-and-hold so you can just strobe the backlight, the blur on a CRT is a result of phosphor decay time; there's literally no way to improve it.
 
Eeeeeh... depends on the scene. In dark scenes with bright objects, CRTs can have pretty horrendous ghosting as the object moves around. And unlike an LCD, where the blur is a result of sample-and-hold so you can just strobe the backlight, the blur on a CRT is a result of phosphor decay time; there's literally no way to improve it.
Yes, the different phosphor decay rates for different colors result in some trailing at times, but honestly I've learned to accept it gladly for the vast benefits CRTs give. Unless you're talking about bloom, which was solved through voltage regulators on PVMs and later CRT models (although bloom actually benefits classic games, in my estimation).

I don't know though, perhaps they could've eventually solved the trailing as well; where there's a will there's a way. Different phosphor materials, maybe. They definitely could have made CRTs lighter over time.
 

nkarafo

Member
Black frame insertion sucks. It cleans up the motion blur, but it also darkens the image and kills the colors, and there is no way to re-balance them. Horrible solution.

Eeeeeh...depends on the scene. In dark scenes with bright objects in them CRTs can have pretty horrendous ghosting as the object moves around. And unlike an LCD, where the blur is a result of sample and hold so you can just strobe the backlight, the blur on a CRT is a result of the phosphor decay time, there's literally no way to improve it.
It's still much better than the whole scene being blurry while scrolling. The best example is playing 60fps 2D scrolling games (like most retro games). The blurring of the whole scene on LCDs is awful, to the point that I just can't play these games. On a CRT they are as clear in motion as when they are still; it's a huge difference. Even at 240fps (I played Monster Boy on my 240Hz screen this way) it's still not as clear as a 60fps game on a 60Hz CRT.

The only problem with 60Hz CRTs is the noticeable flickering. But if we count 75Hz+ CRTs, then there is no saving grace for LCDs, although the issue there is that locked 60fps content won't sync perfectly.
 
Black frame insertion sucks. It cleans up the motion blur, but it also darkens the image and kills the colors, and there is no way to re-balance them. Horrible solution.
BFI implementation differs with each company and display. On monitors, yes, it does suck. On Sony and Panasonic televisions it's a godsend. You also don't need the extra brightness unless it's an HDR source.
 

StreetsofBeige

Gold Member
Why did plasma die out? They had great picture.
They were too dim, and at 4K they would have been very, very power hungry and heavy.

Bright enough for a pitch-black room, but that didn't win over potential buyers on the sales floor, who only cared about brightness.
I think the key thing that started killing plasma was the PR about the sets being heavy (which they were). But even as they fizzled out, they still had the best picture quality; blacks and response speed on LED/LCD were in no way better.

But plasmas still tanked.

I never understood the weight complaint. It's not like anyone moves a TV every day; once you plunk it somewhere, it stays. Plenty of people even hung plasmas on the wall, so it's not like they were too heavy to mount, and mounting a big LCD/LED takes two people anyway.

I think another issue was that plasma technology was tapering off while LCD and the rest kept going and improving (OLED, etc.). You never really saw plasma makers announce giant innovations on the way.
 

ShirAhava

Plays with kids toys, in the adult gaming world
CRT is the past as well as the future

Come hang out with our CRT bros in this thread!

 

SkylineRKR

Member
Why did plasma die out? They had great picture.

I figure energy consumption. Even a 1080p 42-inch plasma draws much more power than a 55-inch 4K LED. They were also heavy. I don't think plasma would have been viable for 65-inch 4K sets. I had a plasma for 10 years, and in terms of viewing angle and black levels it was superior to LED, but they simply never went beyond 1080p and were phased out.
 

SantaC

Member
I figure energy consumption. Even a 1080p 42-inch plasma draws much more power than a 55-inch 4K LED. They were also heavy. I don't think plasma would have been viable for 65-inch 4K sets. I had a plasma for 10 years, and in terms of viewing angle and black levels it was superior to LED, but they simply never went beyond 1080p and were phased out.
Was there any 4K plasma?

Also, wasn't input lag lower on plasma than LED?
 

nkarafo

Member
But since we will watch this video on phones or LED TVs, how could we appreciate the CRT advantage? It's like watching an HDR comparison video on a TV without HDR.
True but you can just hear the words and imagine how it would be :p

Unless you still have a CRT monitor to compare, or you remember how it looked. If you never saw how a CRT looks, you are lucky. Stay away from them; they will completely ruin any TV/monitor you are currently using.
 

Stuart360

Member
Yeah, when I'm playing a shooter and moving 'in and out' of the screen I don't really notice it; it's only when panning left or right.
Football games really show it, as the screen is constantly panning left and right. I can live with it though, and it doesn't really bother me.
 
Was there any 4K plasma?

Also, wasn't input lag lower on plasma than LED?
No 4K plasma; it would've been too power hungry and heavy unless there was a technological breakthrough.

Also no, they didn't really have any input lag benefit compared to today's LCDs. The last Panasonic 1080p Viera measured just over 40ms of lag; Vincent at HDTVTest has a video on plasma vs OLED motion. Current Samsung LCDs are just over 10ms at 60Hz.
 

SkylineRKR

Member
Was there any 4K plasma?

Also, wasn't input lag lower on plasma than LED?

No, I think it was impossible to do 4K on plasma without sacrificing the lighting or something. Input lag on plasma was about the same, I think; my V20 had around 20ms, which is the same as my KS9000.
 

OmegaSupreme

advanced basic bitch
No 4K plasma; it would've been too power hungry and heavy unless there was a technological breakthrough.

Also no, they didn't really have any input lag benefit compared to today's LCDs. The last Panasonic 1080p Viera measured just over 40ms of lag; Vincent at HDTVTest has a video on plasma vs OLED motion. Current Samsung LCDs are just over 10ms at 60Hz.
I believe there was a 4K model, but it wasn't available to general consumers. It was 115 inches. I see it pop up on eBay every now and again for like 10 grand lol.
 
Plasma motion is the next best thing to CRT. I've still got a calibrated Panasonic as my main display.
For games, yes. For movies and sports, where you can use motion interpolation and BFI together, it can be argued the 2020 Panasonic and Sony OLEDs actually surpass plasma, with no phosphor trailing to boot. If AI interpolation eventually gets good enough to avoid artifacts (Sony's doesn't have many currently, and it can be improved) without incurring an input lag hit, it could finally be a modern solution for games that isn't inferior to CRT, years down the line.

I can imagine a micro LED with zero-artifact interpolation and high-Hz BFI as the ultimate display one day.
 

TonyK

Member
True but you can just hear the words and imagine how it would be :p

Unless you still have a CRT monitor to compare, or you remember how it looked. If you never saw how a CRT looks, you are lucky. Stay away from them; they will completely ruin any TV/monitor you are currently using.
I don't know. I played many, many years on CRTs (I'm 46), and I remember being amazed by my first HD TV even though it was a lot worse than current TVs. I think the overall blur a CRT adds fools the brain, like 24fps does for movies, and creates the illusion of better image quality.
 

99Luffy

Banned
What's even worse is that the mouse trail effect gets fainter as the refresh rate goes up, but a lot more distracting, since you can still see it, and more of it.

 

svbarnard

Banned
Black frame insertion sucks. It cleans up the motion blur, but it also darkens the image and kills the colors, and there is no way to re-balance them. Horrible solution.


It's still much better than the whole scene being blurry while scrolling. The best example is playing 60fps 2D scrolling games (like most retro games). The blurring of the whole scene on LCDs is awful, to the point that I just can't play these games. On a CRT they are as clear in motion as when they are still; it's a huge difference. Even at 240fps (I played Monster Boy on my 240Hz screen this way) it's still not as clear as a 60fps game on a 60Hz CRT.

The only problem with 60Hz CRTs is the noticeable flickering. But if we count 75Hz+ CRTs, then there is no saving grace for LCDs, although the issue there is that locked 60fps content won't sync perfectly.
"On a CRT they are as clear as when they are still." Wow, really? That's amazing! Too bad modern flat panels aren't like that; but apparently they can be, we just need 1,000fps@1,000Hz lol. The first time we ever got rid of our CRT and replaced it with an LCD was in 2008, a 42-inch Samsung, and honestly I thought it was superior to the CRT in every way lol; I didn't notice the worse motion blur at all. Like I said, reading this article was a real eye opener: https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/
 

nkarafo

Member
I don't know. I played many, many years on CRTs (I'm 46), and I remember being amazed by my first HD TV even though it was a lot worse than current TVs. I think the overall blur a CRT adds fools the brain, like 24fps does for movies, and creates the illusion of better image quality.
Dunno what you are talking about.

I still have both a CRT TV and a CRT monitor, and I can often compare them to my 240Hz LCD monitor or 60Hz LCD TV. The difference in motion clarity between the two technologies is night and day, and even the 240Hz LCD monitor can't quite catch up (though it's getting there). I have shown the difference to plenty of friends and other people, and they all leave with visible regret on their faces, either because they got rid of their CRTs or because they were assured all these years that their LCDs were superior.
 

Xdrive05

Member
I just ran the ol' UFO test on my 240hz Acer Predator, and I can confirm that is exactly how it looks at that pixel speed.

I desperately want to land a high-performance CRT, but they are increasingly unicorns. I think the holy grail, the Sony FW900, goes for like $4,000 these days on eBay. Something crazy like that.

I really want to see a company bring back high def and high refresh rate CRTs again. *dreaming on...*
 

lukilladog

Member
Yeah, motion clarity was never the same after I switched to LCD. I still use crt to watch series, sports, and old games though.
 

nkarafo

Member
"On a CRT they are as clear as when they are still" Wow really? That's amazing!
Yes, and this is no hyperbole. I have all types of panels and can compare them in real time. There is no difference in clarity between a still and a moving image on a CRT, provided the motion is synced.

60fps on a 60Hz CRT is still cleaner than 240fps on a 240Hz LCD. Though I believe 1,000fps on a 1,000Hz LCD is more than needed; judging from the improvement I see over a regular 60Hz LCD, a 480Hz monitor should be enough to reach CRT quality.
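Whether 480Hz gets there depends on how short CRT persistence really is. A rough comparison in Python (the ~1.5 ms phosphor figure is my own illustrative assumption, not a measurement from this thread): under this simple model only ~1,000Hz sample-and-hold matches a fast phosphor, though a slower-decaying phosphor would lower the bar toward 480Hz.

```python
CRT_PERSISTENCE_MS = 1.5  # assumed ballpark for phosphor decay time

def hold_persistence_ms(refresh_hz: float) -> float:
    """Persistence of an ideal sample-and-hold panel at full duty cycle."""
    return 1000.0 / refresh_hz

for hz in (60, 240, 480, 1000):
    p = hold_persistence_ms(hz)
    verdict = "matches CRT" if p <= CRT_PERSISTENCE_MS else "still blurrier"
    print(f"{hz:>4} Hz: {p:5.2f} ms  ({verdict})")
```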
 

nkarafo

Member
A friend of mine who runs his own moving company was clearing out a studio that was shutting down, and they happened to have this monitor, which they gave him for free.

That was about a year ago. He is still waiting for me to go to his place and test it (it's still in storage). I am holding back because I know I won't be able to handle the envy XD.
 
Why did plasma die out? They had great picture.
Expensive and more complex to manufacture compared to LCD, required more energy, and were heavier.

On top of it all, going past 65" would have needed a big investment to retool their production facilities, and 4K would have been a hurdle. It probably seemed easier to give up when sales weren't even that great at that point.

I bought a 65VT60 before they became unavailable though, such a good purchase.


OLED and 4K seemed like the future, so Samsung and Panasonic dropped plasma like a rock. Unfortunately.
Bright enough for a pitch-black room, but that didn't win over potential buyers on the sales floor, who only cared about brightness.
Also this. I remember how hard it was to convince people on a store sales floor that a plasma TV was the best choice; the sets didn't exactly make it easy for themselves there.
 

nkarafo

Member
which is more important for motion clarity , HZ or response time?
Response time. That makes 50/60hz CRTs look perfectly clear.

240Hz LCDs are still not as clean, but a hell of a lot cleaner than 60Hz LCDs. I can read the text on this forum while smooth-scrolling; it retains its clarity and doesn't turn into a blurry mess like on a 60Hz LCD, where I have to keep the screen still to read it. Even at 120/144Hz it's a bit too blurry for that.
 

Azurro

Banned
This is silly; you are not going to get games at close to 1,000fps unless you want to go back to N64-quality visuals, lol. It's not feasible to do that just to completely get rid of motion blur.
 

Soodanim

Member
I'm happy with the lower power consumption and lower weight given that the only trade off is motion blur.

I enjoy BFI on the monitors I've used it on, and if they can improve the technology (counteracting the brightness loss, etc.) and make it compatible with a wider range of refresh rates and VRR tech, then it will definitely be good enough for all but the smallest niche.
 

mansoor1980

Member
Response time. That makes 50/60hz CRTs look perfectly clear.

240Hz LCDs are still not as clean, but a hell of a lot cleaner than 60Hz LCDs. I can read the text on this forum while smooth-scrolling; it retains its clarity and doesn't turn into a blurry mess like on a 60Hz LCD, where I have to keep the screen still to read it. Even at 120/144Hz it's a bit too blurry for that.
I connected my PC to a Samsung LCD and the blur was awful; darker colors especially leave visible trails. Comparatively, a monitor is bearable.
 