
Old CRT TVs had minimal motion blur; LCDs have a lot of motion blur. LCDs will need 1,000fps @ 1,000Hz in order to have as little motion blur as CRT TVs did.

carsar

Member
Strobing at 10,000 nits for only 10% of the time, still only averages 1000 nits to the human eye.
To avoid waiting for such bright screens, I suggest this BFI method.
[image: proposed BFI method]

I guess FALD TVs already use the same method in normal cases; not sure about BFI.
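(For reference, the arithmetic in the quoted claim is just peak brightness times strobe duty cycle. A minimal sketch; the function name and the 60Hz / 1.67ms example values are mine, purely for illustration.)

```python
# Time-averaged brightness of a strobed backlight: peak nits * fraction of
# the refresh cycle the backlight is actually lit (illustrative sketch only).

def average_nits(peak_nits: float, flash_ms: float, refresh_hz: float) -> float:
    refresh_period_ms = 1000.0 / refresh_hz
    duty_cycle = min(flash_ms / refresh_period_ms, 1.0)
    return peak_nits * duty_cycle

# 10,000 nits lit 10% of the time averages ~1,000 nits, e.g. a ~1.67 ms flash
# per 60 Hz refresh cycle (16.7 ms):
print(average_nits(peak_nits=10_000, flash_ms=1.67, refresh_hz=60))  # ~1000
```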
 

RoboFu

One of the green rats
Yup, two big factors explain why you go back to play an old-school NES game and totally suck where you once ruled: screen lag on LCDs vs. CRTs, and controller lag, wired vs. wireless.
 

01011001

Banned
This is a hipster thing, it has to be. I grew up around CRT TVs, the resolution is shit, there's color bleeding from the analog signal and they are big boys for a tiny screen.

There's a very good reason why everyone switched to LCD.

The resolution is shit? The current PC standard, so to speak, is 1440p... there are Sony Trinitron monitors that support 2304x1440, which is basically the 16:10 version of 1440p, and at 80Hz.
 
Last edited:

Azurro

Banned
The resolution is shit? The current PC standard, so to speak, is 1440p... there are Sony Trinitron monitors that support 2304x1440, which is basically the 16:10 version of 1440p, and at 80Hz.

The great majority of them during the golden age of CRTs were standard definition. We are at 4K and dabbling with 8K, I don't understand why you'd want to go back to a deformed, small, and very heavy display.
 

01011001

Banned
The great majority of them during the golden age of CRTs were standard definition. We are at 4K and dabbling with 8K, I don't understand why you'd want to go back to a deformed, small, and very heavy display.

For some games it simply is superior. Also, I'm playing Apex Legends on PS4 because the game doesn't have cross progression yet, and there I have to play at about 720p with a shitty framerate and an image completely ruined by a terrible temporal AA implementation.

So there are plenty of worse things even in modern games compared to a bit of a warped image on the sides.

If CRTs were still standard, the game wouldn't need temporal AA in a desperate attempt to scale up the image, because on a CRT they could simply run the game at 720p or 800p and it would look good, since it doesn't need to fit a fixed pixel grid.

Also, good PC CRTs were usually in the 800p area, not SD.
 
Last edited:

Kataploom

Gold Member
Srsly? I've never noticed something like that on decent LCD screens, and I was still seeing CRT TVs on a daily basis until like 2 years ago... A cheap LCD is ugly and uncomfortable to look at, but it's not "motion blur" that I see on them...
 

Azurro

Banned
For some games it simply is superior. Also, I'm playing Apex Legends on PS4 because the game doesn't have cross progression yet, and there I have to play at about 720p with a shitty framerate and an image completely ruined by a terrible temporal AA implementation.

So there are plenty of worse things even in modern games compared to a bit of a warped image on the sides.

If CRTs were still standard, the game wouldn't need temporal AA in a desperate attempt to scale up the image, because on a CRT they could simply run the game at 720p or 800p and it would look good, since it doesn't need to fit a fixed pixel grid.

Also, good PC CRTs were usually in the 800p area, not SD.

I mean, it wouldn't need to be upscaled to a fixed resolution, but it's still not enough to show the detail in the image; it'd still really be a blurry mess. Do you honestly enjoy that?
 
Last edited:
Yeah, you just have to get used to it. Wish all games would allow you to turn off motion blur, or give a slider for how much to add. Motion blur and 30fps need to die.
 

svbarnard

Banned
BFI implementation differs with each company / display. On monitors, yes it does suck. On Sony and Panasonic televisions, it's a godsend. You also don't need extra brightness unless it's an hdr source.
You're saying Sony and Panasonic actually have a BFI implementation that really does reduce motion blur? I had a 2015 60-inch Vizio that had black frame insertion to reduce motion blur, but honestly all it did was introduce flicker. I couldn't really tell if it reduced motion blur or not; in fact, I don't think it did at all.
 
If I had a time machine and could fix three mistakes, one of those mistakes would be not keeping my last CRT TV and monitor. Should have stashed them in the basement. That monitor was lovely.
 
Last edited:

Trogdor1123

Gold Member
Posts like this always make me want to wait on my next TV. I'm currently using a 60-inch Samsung plasma (about 6 or 7 years old) and it's been good, but I want to get to 4K with VRR as it is supposed to be great. Now I just don't know.
 

Sophist

Member
I have one of the best CRTs ever, the iiyama HM704. I also had a SyncMaster, a flat Sony, ... in the past.
There is no doubt in my mind that today's IPS monitors have (much) better image quality; it's sharper and has better colors.
They are also both faster and slower at the same time, which is why you still get ghosting.

Below are the response times of the Dell Alienware AW2521H (taken from https://www.rtings.com/monitor/reviews/dell/alienware-aw2521h).
A 120Hz CRT has a constant 8.3ms response time (with a ~2ms decay).
The AW2521H is faster in most cases, but unevenly so.
For a 360Hz monitor, everything should be under 2.7ms to avoid ghosting and blur.
What monitor manufacturers should target is not raising the refresh rate but lowering the response times.

[image: AW2521H response-time table]
 
Last edited:

svbarnard

Banned
Response time. That makes 50/60hz CRTs look perfectly clear.

240Hz LCDs are still not as clean, but a hell of a lot cleaner than 60Hz LCDs. I can read the text in this forum while smooth scrolling. The text retains its clarity and doesn't look like a blurry mess like on a 60Hz LCD, where I have to keep the screen still to read it. Even at 120/144Hz, they are a bit too blurry for that.
You have a 240Hz LCD, and you're able to surf the web at 240fps? And when you scroll you can still read the text, huh? That's honestly why I said 240fps@240Hz might be good enough.

Too bad the TV industry is focusing on increasing resolution and not Hz. Otherwise 240Hz could easily be the standard by now.
 

Ywap

Member
I can't stand motion blur, so I still game on a CRT with my PC.

I replaced my plasma last year with an LG oled and Bfi does a really good job keeping the motion clarity at reasonable levels.

Most plasmas had mediocre blacks, line bleeding, a risk of permanent burn-in, and phosphor trailing. The latest versions were not that bad regarding phosphor trailing and were still a much better choice than LEDs if you appreciate good motion handling (imo).
 
Last edited:

svbarnard

Banned
Chief Blur Buster Here, inventor of TestUFO!

While some of this is true, this needs further explanation.

Display Science

For those who haven't been studying the redesigned Area 51 Display Research Section of Blur Busters lately, I need to correct some myths.

OLEDs have motion blur too, unless impulse-driven. Even 0ms GtG has lots of motion blur because MPRT is still big.

VR headsets such as the Oculus Rift can strobe (impulse-drive like a CRT), as can the LG CX OLED's BFI setting. However, the MPRT of the original Oculus Rift is 2ms, and the LG CX OLED's is about 4ms. Be careful not to confuse GtG and MPRT -- see Pixel Response FAQ: Two Pixel Response Benchmarks: GtG Versus MPRT. To kill motion blur, both GtG *and* MPRT must be low. OLED has fast GtG but high MPRT, unless strobed.

Faster GtG Can Reduce Blur -- But Only To a Point

However, faster GtG does lower the flicker fusion threshold of the stutter-to-blur continuum, where objects at low framerates "appear" to vibrate (like a slow guitar/harp string) and at high framerates vibrate so fast they blur (like a fast guitar/harp string). Regular stutter (aka 30fps) and persistence motion blur (aka 120fps) are the same thing --- the stutter-to-blur continuum is easy to watch at www.testufo.com/vrr (a frame rate ramping animation during variable refresh rate).

Slow GtG fuzzies up the fade between refresh cycles as seen in High Speed Videos of LCD Refresh Cycles (videos of LCD and OLED in high speed video), which can lower the threshold of the stutter-to-blur continuum.

Some great animations to watch to help you understand the stutter-to-blur continuum:

Stutters & Persistence Motion Blur is the Same Thing

It's simply a function of how slow/fast the regular-stutter vibration is. Super-fast stutter vibrates so fast it blends into motion blur.

Educational Takeaways of Above Animations

View all of the above on any fast-GtG display (LCD or OLED, such as TN LCD, modern "1ms-GtG"-IPS-LCD, or an OLED). You'll observe once GtG is an insignificant percentage of refresh cycle, these things are easy to observe (assuming framerate = Hz).

1. Motion blur is pixel visibility time
2. Pixel visibility time of a strobed display is the flash time
3. Pixel visibility time of a non-strobed display is the refresh cycle time
4. Stutter and persistence blur is the same thing; it's a function of the flicker fusion threshold where stutter blends to blur
5. OLED has motion blur just like LCDs
6. You need to flash briefer to reduce motion blur on an impulse driven display.

Once you've seen all the educational animations, this helps greatly improve the understanding of displays.

Now, LCDs do not need to be 1000Hz to reduce motion blur -- you can simply use strobing. But strobing (CRT, LCD, OLED) is just a humankind band-aid, because real life doesn't flicker or strobe. We need 1000Hz displays, regardless of whether OLED or LCD.

Only Two Ways To Reduce Persistence (MPRT) Display Motion Blur

There are only two ways to shorten frame visibility time (aka motion blur) -- sketched numerically just after this list:

1. Flash the frame briefer (1ms MPRT requires flashing each frame only for 1ms) -- like a CRT
2. Add more frames in more refresh cycles per second. (1ms MPRT requires 1000fps @ 1000Hz)
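(As a rough numeric illustration of the two levers above -- a minimal sketch; the helper name and the 960 px/sec speed, TestUFO's default panning speed, are just illustrative assumptions.)

```python
# Persistence blur is roughly motion speed multiplied by how long each frame
# stays visible (MPRT). Illustrative sketch, not Blur Busters code.

def blur_px(speed_px_per_sec: float, visibility_ms: float) -> float:
    return speed_px_per_sec * (visibility_ms / 1000.0)

speed = 960  # px/sec, the TestUFO default panning speed

# Lever 2: more frames in more refresh cycles (sample-and-hold, frame visible all cycle).
print(blur_px(speed, 1000 / 60))    # 60 Hz   -> ~16 px of blur
print(blur_px(speed, 1000 / 1000))  # 1000 Hz -> ~1 px of blur

# Lever 1: flash the frame briefer (CRT/strobe), at any refresh rate.
print(blur_px(speed, 1.0))          # 1 ms flash -> ~1 px of blur
```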

And, the OP forgot to post both images, which are equally important due to context ;)

[image: motion blur from persistence on sample-and-hold displays]

[image: motion blur from persistence on impulsed displays]
It says 16ms flash per frame at any Hz, but how can you have a flash for 16ms if the content you're viewing is at 120fps or higher? At 120fps, each frame is only there for 8ms before the next frame is shown. I understand that the reason we see motion blur on LCD displays is that the backlight is always on -- it doesn't strobe, it's always 100% on -- and that one way to reduce motion blur is to strobe the backlight for each frame. But you need to ELI5 (explain like I'm 5): how can you have a flash that lasts 16ms when each frame is only being shown for 8ms, as at 120fps? This is kind of heady stuff for a non-engineer type person like myself.

Listen, Mr. Rejhon, you know what you really need to do? You need to make an accompanying companion video to your 2017 article "The Amazing Journey to Future 1,000Hz Displays." Make an in-depth video where you go over every point and explain it like you're talking to five-year-olds, so people have the option of either reading the article or watching the video (the video should be linked at the very beginning of the article). Seriously, please do this?
 

nkarafo

Member
Their resolution is also insultingly low. It's a hipster thing; just because the tech has one or two aspects that are still very good by today's standards, it doesn't mean it's better in the most important factors of a TV, which are way better in what is available on the market now. I would never want to code on a curved CRT TV ever again; give me high resolution, bright colors, and ample screen space.
So everyone who might prefer a CRT monitor to play games with no input lag or better motion clarity is a hipster?

Quit being disingenuous. I could do the same. "Modern TVs are for normies who don't know any better. I would never want to play a fast paced shooter on a slow LCD ever again, give me zero delay and perfect motion clarity".


svbarnard said:
You have a 240Hz LCD, and you're able to surf the web at 240fps?
Yes. Everything I do on my desktop runs at 240fps, except playing most games, obviously. I use smooth scrolling in the browser so I don't have to stop the scrolling while I'm reading the text. This is how I've used the internet since the late '90s. Standard 60Hz LCDs were a pain for me to use because I had to re-wire my brain and make the screen stop moving in order to read. Mouse movement is also much more responsive and doesn't feel like I'm dragging the pointer.
 
Last edited:

bender

What time is it?
Their resolution is also insultingly low. It's a hipster thing; just because the tech has one or two aspects that are still very good by today's standards, it doesn't mean it's better in the most important factors of a TV, which are way better in what is available on the market now. I would never want to code on a curved CRT TV ever again; give me high resolution, bright colors, and ample screen space.

Thank God for non-curved CRTs then. :p
 
Does anyone knowledgeable on the subject know if it is technically possible to make a CRT that doesn't require so much depth?

I wonder, given unlimited funds for R & D, and zero care for what consumers bought, what display technology could produce the best display? Could they make a flat 4K CRT? Would HDR be possible with CRT or plasma?


Why did plasma die out? They had great picture.

mainly it comes down to:
+plasma TVs were more expensive to produce than LCD TVs (also heavier)
+plasma was marketed poorly and couldn't keep up with the competition from LCD

Also important, I think, is that they were more expensive for the consumer.
 

93xfan

Banned
I have a plasma still, so I'm good for now. I remember playing Forza Horizon 4 in 4K at a Microsoft store. It was the first game I'd seen on an LCD (excluding my Switch). Really did not like the way it looked in motion.
 

Tmack

Member
I think the key thing that started killing it was the PR that they were heavy (which they were). But when they started fizzling out, they still had the best picture quality. Blacks and speed were in no way better on LED/LCD.

But plasmas still tanked.

I never understood the heavy thing. It's not like anyone is moving a TV every day. Once you plunk it somewhere, it stays. And lots of people even hung plasmas on the wall, so it's not like it's so heavy it won't hang. Anyone mounting a big LCD/LED needs two people anyway.

I think another issue was that plasma technology was tapering out while LCDs and such kept going and improving (OLED, etc.). You never really saw plasma makers say giant innovations were coming.

Being heavy meant it was hard to scale to smaller and bigger sizes. Plasmas were non-existent at the low end of screen sizes, which accounted for a big chunk of the worldwide market, and they also lagged in the really-big-screen market.

And heavy also meant thicc!! Plasma was around when those really slim LCD screens started to come out... no matter how bad the image quality of some of the LCD models was, just by being slim, sleek things they became objects of desire for a lot of people.

Also, being heavy gave plasma a negative perception...
 

nkarafo

Member
The great majority of them during the golden age of CRTs were standard definition. We are at 4K and dabbling with 8K, I don't understand why you'd want to go back to a deformed, small, and very heavy display.
Personally, I'm mostly interested in high-res PC monitors, not standard-def TVs.

But even then, there is a large community of retro gamers who prefer playing retro games on such TVs. Every game from the N64 era or older was made with these TVs in mind, and they look much better (and correct) on them.
 

StreetsofBeige

Gold Member
Personally, I'm mostly interested in high-res PC monitors, not standard-def TVs.

But even then, there is a large community of retro gamers who prefer playing retro games on such TVs. Every game from the N64 era or older was made with these TVs in mind, and they look much better (and correct) on them.
Yup.

I got one of those Pandora 5 boxes hooked up to my 4K TV. Bought it for cheap-thrills 80s and 90s arcade gaming.

It works and everything, but these games blown up on a big modern screen look so blocky and too clean. Those CRT arcade machines from 1985 with cigarette fumes baked into the cabinet looked so much better. And so did playing old-ass 8- and 16-bit games on an ancient RCA TV, even though the CRT glass was curved.
 
Last edited:

Rentahamster

Rodent Whores
Chief Blur Buster Here, inventor of TestUFO!

While some of this is true, this needs further explanation. (...)
I didn't know you were on here. That's great! Your work is legendary, and it's been very educational. Thanks!
 

noonjam

Member
Does anyone knowledgeable on the subject know if it is technically possible to make a CRT that doesn't require so much depth?

I wonder, given unlimited funds for R & D, and zero care for what consumers bought, what display technology could produce the best display? Could they make a flat 4K CRT? Would HDR be possible with CRT or plasma?

Also important, I think, is that they were more expensive for the consumer.


SED and FED

Died before they could get to market, sadly. It would have been interesting to see how much more advanced they would be today if they had made it.
 

mdrejhon

Member
I didn't know you were on here. That's great! Your work is legendary, and it's been very educational. Thanks!
Thank you for the compliment!

When you say "lines of motion resolution" you're referring to frames per second right? Remember a lot of people here are casual people.
It's an ancient benchmark from the NTSC TV / plasma TV days that doesn't convert well to different refresh rates and different resolution screens.

A good article about the flaws of old "motion resolution" benchmarks is explained in: Making Of: Why Are TestUFO Animations at 960 Pixels Per Second by Default

SED and FED

Died before they could get to market, sadly. It would have been interesting to see how much more advanced they would be today if they had made it.
I am sorry to be the bearer of bad news, but SED and FED prototypes had more artifacts than plasma. Some of it caused by certain similarities to plasma, but others caused by the need to concurrently refresh multiple parts of the screen to keep the image bright, etc. On some prototypes, there was a segmented multiscanning-artifact problem (e.g. zigzag artifacts, a complex cousin of interlacing artifacts).

Here's a good example of multiscan problems, from an old but good paper by Charles Poynton:

[image: multiscan artifact example from Charles Poynton's paper]

To avoid disjoint artifacts during horizontal motion, you really don't want to do multiscan or random-access scan, because of the artifacts that would occur. So you really want to stick to global presentation, or to sequential scan (left-right, right-left, top-bottom, or bottom-top), which only has minor skewing artifacts.

Here's another TestUFO to try on a DELL or HP 60Hz screen, or even an LG OLED configured to 60 Hz: www.testufo.com/scanskew ... It only does a simple tilt on sequential-refresh displays like CRT, LCD and OLED (especially if the screen has a low max-Hz refresh rate). What really craps out is that the Scan Skew test shows zigzag patterns if you run it on certain multiscanned displays, such as certain JumboTrons. The smaller FED/SED displays had that problem.

The world went plasma instead.

Now that said, I did figure out how to eliminate multiscan artifacts -- I've come up with an algorithm for it (see Custom OLED Rolling Scans -- Custom Built OLED Monitor). This might be used in future retina-refresh-rate JumboTrons.

Technological progress could have made FED / SED better than many plasmas, but Pioneer did such a brilliant job with their plasma that I'm not even sure FED / SED would have been able to beat the legendary Pioneer plasma.

It says 16ms flash per frame at any Hz, but how can you have a flash for 16ms if the content you're viewing is at 120fps or higher? At 120fps, each frame is only there for 8ms before the next frame is shown. I understand that the reason we see motion blur on LCD displays is that the backlight is always on -- it doesn't strobe, it's always 100% on -- and that one way to reduce motion blur is to strobe the backlight for each frame. But you need to ELI5 (explain like I'm 5): how can you have a flash that lasts 16ms when each frame is only being shown for 8ms, as at 120fps? This is kind of heady stuff for a non-engineer type person like myself.

The sample-and-hold version of the image was created long after the impulsed version of the PNG, to drive home the fact that motion blur is equal to frame visibility time. So I agree the 16ms flash can have a potentially confusing interpretation, because you need 60Hz-or-below for that example. The 16ms flash is with a strobe backlight. Obviously, you'd need a refresh cycle long enough to be able to flash within it, so 60Hz is about the limit (a 16ms flash would have no black gap in between). The fact of the matter is that a 16ms flash at anything 60Hz-and-below will have identical motion blur.

And 8ms flash at anything below-120Hz will have identical motion blur, e.g. 8ms flash at 75Hz versus 8ms flash at 100Hz. Any refresh rate. The point is, strobing decouples motion blur away from refresh rate.

That's why 60Hz vs 120Hz is a much bigger motion-blur difference on LCD & OLED than 60Hz vs 120Hz was on an old CRT. Indeed, there were flicker differences, but the motion blur was the same for panning scenery. However, 60Hz strobing is useful due to 60 years of legacy 60fps 60Hz material.

Perhaps the image should say "* The refresh cycle duration needs to be long enough to fit the flash" since I recognize it is not obvious to all readers. Meaning 16ms flash would require any refresh rate 60Hz-and-below.
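(A tiny sketch of that rule -- my own illustrative helper, not from the article: the flash has to fit inside the refresh cycle, and once it does, the blur follows the flash length rather than the Hz.)

```python
# Highest refresh rate whose cycle can still contain a flash of the given
# length (illustrative only).

def max_hz_for_flash(flash_ms: float) -> float:
    return 1000.0 / flash_ms

print(max_hz_for_flash(16))  # 62.5 -> a 16 ms flash needs roughly 60 Hz or below
print(max_hz_for_flash(8))   # 125  -> an 8 ms flash fits at 120 Hz or below

# 8 ms flash at 75 Hz and 8 ms flash at 100 Hz: same MPRT, same motion blur.
```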

Many strobe backlights have adjustable flash lengths (e.g. via Blur Busters Strobe Utility on certain BenQ monitors). A brand new rewritten version of Strobe Utility will be released for an upcoming 240Hz IPS monitor also capable of 60Hz single-strobe, so you can adjust persistence (much like an adjustable-phosphor-decay CRT!)

Listen, Mr. Rejhon, you know what you really need to do? You need to make an accompanying companion video to your 2017 article "The Amazing Journey to Future 1,000Hz Displays." Make an in-depth video where you go over every point and explain it like you're talking to five-year-olds, so people have the option of either reading the article or watching the video (the video should be linked at the very beginning of the article). Seriously, please do this?

Blur Busters does not yet make videos because I was born deaf.

However, I am slowly working towards debuting videos (maybe 2022, not 2021).

I am planning some simple YouTube debut (a quarterly type thing to begin with, then maybe monthly). The problem is it takes me 2-5x more work due to my deafness and my voice (and that takes away from my ability to put food on the table). My voice is a bit quirky, so I will need some help/assistants to get clearer audio dubbed on top of my speaking when I make YouTube videos.

That said, I am told I am a vision wizard when it comes to displays. Even as a hobbyist working on things I have passion for -- I have been cited in more than 20 peer-reviewed research papers (all unpaid work).

Just like some people have a photographic memory, I have a display-emulator equivalent in my brain. My brain has the capability to emulate hypothetical displays in my head (and I can uncover display motion flaws before the display is invented). And I can emulate motion tests before I create them (this is how I invented TestUFO Persistence, TestUFO Eyetracking, TestUFO Pursuit Camera, TestUFO Variable Refresh Rate Emulator, etc). People tell me I'm the Refresh Rate Einstein because of these abilities.

I now do work with a few manufacturers though on generic display concepts that most manufacturers don't fully understand (even Samsung television researchers cited me in a recent paper in 2020). I can improve strobing on a display much more cheaply than NVIDIA can, for example.

I make great explainer articles though. Check out Blur Busters Area 51 (the purple color coded section) for some of my flagship articles. However, I fail when it comes to YouTube skillz, so I would probably need to partner-up with someone else who is willing to help. It is not for lack of trying -- just video is not easy for me.

I have done very simple PowerPoint-style videos (High Speed Video of LightBoost) before, but proper YouTube-star videos take a lot of work -- it will only be "Go Big Or Go Home" -- meaning IF I debut any more non-PowerPoint videos, it will have to be the full fancy treatment (backgrounds, lighting, talking heads, assistant, etc). And I'd rather wait until after the pandemic a bit, since getting assistants is still difficult at this time. If I go YouTube, I'll do it properly or not at all.

Can't have Chief Blur Buster release a mediocre video, shall we? ;)
 
Last edited:

mdrejhon

Member
To avoid waiting for such bright screens, I suggest this BFI method.
[image: proposed BFI method]

I guess FALD TVs already use the same method in normal cases; not sure about BFI.

Yes, I've heard of this being worked on.

That said -- there's no difference between "normal cases" and "BFI" because flicker = BFI. It's one and the same thing (assuming a proper motion-blur-reducing PWM algorithm at one pulse per visible refresh cycle).

As long as it's one flash per visible frame (aka visible refresh cycle) flicker means the same thing as BFI (as I consider strobing and sub-frame BFI essentially synonyms under the impulse-driving umbrella from a display physics point of view). :) Also, BFI doesn't have to be an integer divisor of a refresh cycle (variable ON:OFF duty cycle)

Yes, variable per-pixel BFI actually sort of emulates CRT in a sorta way. Brighter colors have longer phosphor decay on a CRT, creating higher MPRTs for bright objects than for dim objects.

It's a legitimate "compromise". There are some side effects (ghosting on bright objects), but sometimes it's certainly an acceptable compromise when there's limited brightness available. It does require extremely good timing precision, e.g. a continuum of sub-millisecond increments to generate 10 bits of greyscale on a per-LED basis (or per-OLED-pixel basis).

In theory, OLEDs could do this too, but that requires extremely complex random-access pixel refresh, which is not easily possible to do very fast with current FPGAs/ASICs on a matrix grid -- sequential scan is much faster to achieve.

However, FALD backlights are low resolution, and it is much easier to have per-LED custom timing precision to generate 10 bits of brightness on a per-LED basis. So variable-pulse-width FALD combined with BFI is definitely possible.

Future HDR FALD strobed LCDs can do this, as long as they do one pulse per refresh cycle (to avoid PWM-style duplicate-image artifacts).
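(A minimal sketch of the per-zone pulse-width idea described above -- the mapping and numbers are my own illustrative assumptions, not any manufacturer's algorithm.)

```python
# Map a 10-bit per-LED brightness level to a single pulse width within one
# refresh cycle: dim zones get short pulses (short MPRT), bright zones longer
# ones, with one pulse per refresh cycle. Illustrative only.

def pulse_width_ms(level_10bit: int, refresh_hz: float) -> float:
    refresh_period_ms = 1000.0 / refresh_hz
    return (level_10bit / 1023) * refresh_period_ms

print(pulse_width_ms(512, 120))  # ~4.2 ms pulse for a half-brightness zone
print(pulse_width_ms(64, 120))   # ~0.5 ms pulse for a dim zone -> sub-ms timing needed
```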

You're welcome! Your contributions to the space were badly needed. Question - any new blur busters approved monitors coming up on the horizon?
Four in 2021 (assuming none of them get cancelled/delayed to 2022).

Two manufacturers have committed (signed agreements) to 4 models coming out in 2021, and a third manufacturer has already unofficially committed.

Be warned, the typical monitor engineering cycle is about 12 months.

The first one is a 24 inch 240Hz IPS monitor that supports 60Hz single strobe, strobe-any-custom-Hz (60Hz through 240Hz in one analog continuum), and supports optional Strobe Utility operation (no, it's not a BenQ ZOWIE monitor).

the 1000 fps part seems silly to suggest when what you want is the response time of 1000 Hz.

If you had that, it probably wouldn't matter if frames are being doubled/interpolated or not.
This is not true.

Unfortunately 0ms GtG has lots of motion blur.

To understand display science better -- there are two different pixel response benchmarks, GtG and MPRT.

It's impossible to have low MPRT without either (A) flicker, or (B) ultrahigh framerate & Hz.

MPRT is pixel visibility time per frame (Remember MPRT response is not GtG response).

To shorten pixel visibility time to 1ms MPRT, you need to either

(A) Flash it for 1ms -- like a CRT, plasma, strobe, BFI;
--or--
(B) To do it flickerlessly, you need to fill the whole second with 1ms frames (1000 frames of 1ms each)

Motion blur is pixel visibility time. Please see the brand new educational TestUFO animations to understand better.

Doubling Hz and frame rate halves display motion blur (assuming near-0ms GtG). If you've already seen the new 120Hz iPads, it has half the display motion blur.

And if you browser-scrolled on a true-240Hz esports LCD, it has 1/4th the motion blur of a 60Hz LCD but is still not quite as clear as CRT (yet).
 
Last edited:

Rentahamster

Rodent Whores
Four in 2021 (assuming none of them get cancelled/delayed to 2022).

Two manufacturers have committed (signed agreements) to 4 models coming out in 2021, and a third manufacturer has already unofficially committed.

Be warned, the typical monitor engineering cycle is about 12 months.

The first one is a 24 inch 240Hz IPS monitor that supports 60Hz single strobe, strobe-any-custom-Hz (60Hz through 240Hz in one analog continuum), and supports optional Strobe Utility operation (no, it's not a BenQ ZOWIE monitor).
I'm glad to see more manufacturers recognize the value of the Blur Busters Approved validation process. Does "strobe any custom Hz" mean that we can finally have strobing with VRR?
 

mdrejhon

Member
I use 1080p plasma for PC gaming as much as I can - still less motion blur than either 200 or 240 Hz monitor.
Good move.

That said, strobed LCDs have greatly improved in the last few years and can now have less motion blur than the best Pioneer plasma, if you cherrypick a few things:

(A) Use a strobe backlight
(B) Cherrypick one of the better strobe-capable panels
(C) Use lots of refresh rate headroom to improve strobe quality (e.g. use 120Hz+strobe on a 240Hz+ capable panel)
(D) Use framerate=Hz

Good strobed LCDs now have 1/10th the motion blur of a Pioneer Plasma. The 119Hz and 120Hz mode of the Blur Busters Approved ViewSonic XG270 (240Hz IPS panel) is one good example of a plasma-beating motion blur reduction mode that looks much better than ugly LightBoost or other lesser strobe backlights.

However, plasmas definitely have better black levels and probably better colors (especially a good plasma, like a Pioneer). And the phosphor decay of a plasma can help make low-Hz flicker more tolerable (e.g. 60Hz flicker).

Also, many strobe backlights don't support low-Hz strobing, but many support strobing as low as 75Hz using a custom resolution (lower the Hz to make it easier to get framerate=Hz strobed motion nirvana).

All 2021-and-later Blur Busters Approved monitors will also have mandatory support for 60 Hz single strobe -- it's a requirement (in addition to very low strobe crosstalk) or it doesn't get the seal of approval.

I'm glad to see more manufacturers recognize the value of the Blur Busters Approved validation process. Does "strobe any custom Hz" mean that we can finally have strobing with VRR?

Strobing and VRR is fiendishly difficult to do concurrently without side effects (variable flicker, variable strobe crosstalk, etc) -- and I'm not yet ready to answer if any of them will do strobing & VRR concurrently.

which is more important for motion clarity, Hz or response time?

Both.

It's best explained at Pixel Response FAQ: GtG Versus MPRT.

(1) For an impulsed display, primarily MPRT response time is more important (and GtG fast enough to be hidden between refresh cycles). Refresh rate doesn't affect motion blur of a flashed display like a CRT.

(2) For a flickerfree display, GtG response time, MPRT response time, and refresh rate are all important.

That's why for (2) it is difficult to eliminate motion blur strobelessly (NO flash, NO flicker, NO BFI, NO phosphor).

This is exactly why we need 1000 Hz LCDs to achieve flickerless 1ms MPRT without flicker/flashing/bfi/strobe/phosphor/etc.


Flicker is good if you can tolerate it. But not everyone likes flicker. In other words, 1000Hz is the only way to have your cake and eat it too (blurless AND flickerless operation with zero PWM).

See CRT Nirvana Guide for Disappointed CRT to LCD Upgraders, for some motion blur options.

Have you seen browser scrolling on a true 120Hz, 240Hz and 360Hz LCD? (in non-strobed operation)

120 Hz LCDs = 1/2 the browser scrolling motion blur of 60 Hz LCD
240 Hz LCDs = 1/4 the browser scrolling motion blur of 60 Hz LCD
360 Hz LCDs = 1/6 the browser scrolling motion blur of 60 Hz LCD

But that's still not as clear as a CRT (which has less than 1/10 the browser-scrolling motion blur of a 60Hz LCD).
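(Those fractions are just the refresh-period ratios; a quick illustrative check, with a 960 px/sec scroll speed assumed purely for the example.)

```python
# Sample-and-hold scrolling blur scales with the refresh period (illustrative).

scroll_speed = 960  # px/sec

for hz in (60, 120, 240, 360, 1000):
    blur = scroll_speed / hz  # px of persistence blur per frame
    print(f"{hz:>4} Hz: ~{blur:.1f} px of scrolling blur")

# 120 Hz halves it, 240 Hz quarters it, 360 Hz is 1/6th -- while a ~1 ms
# strobed/CRT flash stays near ~1 px regardless of Hz.
```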

Real life doesn't strobe. Real life doesn't flicker. Real life is essentially infinite frame rate. To have no display-enforced motion blur above-and-beyond real life, you need refresh rates approaching analog-like levels. To be as blurless as real life (only motion blur is natural human vision/brain), requires extremely ultra-high frame rates. See Stroboscopic Effect Of Finite Frame Rates for the flaws of the humankind invention of using a series of static images to emulate moving images.

We have to keep doubling refresh rates (and frame rates) to keep halving display motion blur. The current Blur Busters upgrade recommendation is geometric display upgrades, such as 60 -> 120 -> 240 -> 480 -> 960 to punch the diminishing curve of returns. Or 60 -> 144 -> 360 upgrades. While esports may be happy with small-Hz upgrades, I recommend mainstream users double Hz during upgrades to retain human-noticeable benefits.

Even Apple is going to put 240 Hz displays in their iPhones, just because of browser scrolling motion blur -- 240Hz has 1/4 the scrolling motion blur.

Even 120Hz is slowly going to become a commodity refresh rate -- increasingly supported in more devices. Both Galaxy and iPhones will have >60Hz standard. All new game consoles now support 120 Hz (PS5, XB1, XBSX). Most new 4K TVs support 120Hz PC input. Even DELL/HP is considering adding 120Hz to office monitors later this decade (2025+). Even 240Hz and 480Hz will someday be commoditized later this decade, as technological progress allows.

240 Hz is no longer just for esports -- it helps browser scrolling -- someday, 120 Hz and 240 Hz will be a free "included" feature just like "Retina Displays". Remember, 4K once cost over $10,000 twenty years ago (the IBM T221). Today, 4K is a $299 Walmart special.

Ever since we stopped impulsing (CRT), there is now pressure to increase refresh rates to eliminate motion blur strobelessly -- and the technology is gradually becoming cheaper and cheaper to do so.
 
Last edited:

mdrejhon

Member
Below are the response times of the Dell Alienware AW2521H (taken from https://www.rtings.com/monitor/reviews/dell/alienware-aw2521h).
A 120Hz CRT has a constant 8.3ms response time (with a ~2ms decay).
The AW2521H is faster in most cases, but unevenly so.
For a 360Hz monitor, everything should be under 2.7ms to avoid ghosting and blur.
What monitor manufacturers should target is not raising the refresh rate but lowering the response times.
Yes, it is always ideal for an LCD to have GtG faster than a refresh cycle.

However, it is not the final frontier. You also ideally want to get as close to 0ms GtG where possible.

GtG is a continuum throughout a whole refresh cycle. Even GtG half a refresh cycle still can add some ghosting.

Also, don't forget GtG and MPRT acts together simultaneously to add motion artifacts. Even 0ms GtG still has motion blur too. See Pixel Response FAQ: GtG Versus MPRT, for all the sub-refresh artifacts that can occur.

At 960 pixels/second, even 8ms GtG at 60Hz, can still produce visible smears above-and-beyond MPRT response -- or even 3-4ms real-world GtG at 120Hz. GtG is simply a pixel fading from one color to another color, and can smear even between refresh cycles, even with sub-refresh GtG.

[image: GtG smearing illustration]



Now that said, does that make 360 Hz useless? No.

Remember, you can always run a 360 Hz monitor at a lower refresh rate, if you want a very clean 240 Hz or want to have zero-strobe-crosstalk during 100Hz or 144Hz strobe.

5ms GtG can easily hide in the backlight-off dark period between 1/360sec scanouts (LCDs refresh sequentially top-to-bottom like a CRT, just without flicker trailing behind).

That's why I am a big fan of the new 2020-and-later "1ms GtG IPS" panels. They're much easier to strobe-tune. Even if 1ms GtG is still 10ms real-world, it's still possible to hide 10ms GtG between 1/240sec (4ms) scanouts or 1/360sec (2.7ms) scanouts.

The sequence for a good strobe: In total darkness: scanout, then wait for GtG to finish, then strobe. Rinse and repeat. That's why refresh rate headroom is so good for LCDs; it greatly improves their lower-Hz operation. Strobed 144Hz can look much better on a 360Hz-capable LCD panel than on a 144Hz LCD for example, since there's so much more time to hide LCD GtG between refresh cycles.
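(A rough timing-budget sketch of that sequence -- the numbers are my own illustrative assumptions, not a monitor spec.)

```python
# Dark time left for GtG to settle between the end of scanout and the strobe
# flash, when a fast panel is strobed at a lower refresh rate (illustrative).

def dark_time_ms(strobe_hz: float, scanout_ms: float, flash_ms: float) -> float:
    return (1000.0 / strobe_hz) - scanout_ms - flash_ms

# 240 Hz-capable panel (~4.2 ms scanout) strobed at 120 Hz: ~3.2 ms of slack.
print(dark_time_ms(strobe_hz=120, scanout_ms=1000 / 240, flash_ms=1.0))

# Same panel strobed at its native 240 Hz: negative slack -> GtG can't hide,
# which shows up as strobe crosstalk.
print(dark_time_ms(strobe_hz=240, scanout_ms=1000 / 240, flash_ms=1.0))
```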

Most people don't properly cherrypick a brand + tweak their LCDs for the "most CRT clarity" behavior. Literally less than 10% of gaming monitors can match or beat CRT motion clarity without annoying artifacts (excessive dimness, microjitters, strobe crosstalk, etc). And they still need to be fine-tuned/tweaked for correct operation -- most gamers don't buy a 240Hz panel just to strobe it at only 120Hz, but that's the way to get even more CRT-like motion clarity....

You're saying Sony and Panasonic actually have a BFI implementation that really does reduce motion blur? I had a 2015 60-inch Vizio that had black frame insertion to reduce motion blur, but honestly all it did was introduce flicker. I couldn't really tell if it reduced motion blur or not; in fact, I don't think it did at all.

You probably had really bad strobe crosstalk that interfered with motion blur reduction.

Less than 5% of LCDs can strobe properly with CRT motion clarity. Most manufacturers just flash a backlight and be done with it, creating terrible results (See "DIY 60 Hz Strobing - I Hacked A 60 Hz LCD To Strobe, And It Strobed Badly").

[image: strobe crosstalk examples]

(From the old Strobe Crosstalk FAQ)

Lots of things need to be done properly for high quality CRT clarity strobe without bothersome flicker.

1. Cherrypicked model (good strobe engineering)
2. High enough Hz to avoid flicker (60Hz can be too low for many)
3. Framerate=Hz (CRTs needed this too, but it's harder at today's higher refresh rates, due to GPU power)
4. Refresh rate headroom diverted to massively improve strobe quality.

For example, 120fps@120Hz strobe on a 240Hz-capable LCD can be more easily engineered to look much better than 120Hz strobe on a 120Hz-only LCD, or 240Hz strobe on a 240Hz-only LCD. Basically: buy more Hz than you need, intentionally run the monitor at a lower Hz, and enable the strobe mode. This is basically a lot of laws of physics interacting with each other.

But configuring for a good CRT-clarity experience even on a good strobed LCD is a sweet spot:
A. Not too low Hz, to avoid flicker
B. Not too high Hz, or GPU can't keep framerate=Hz

It was easier to do with CRTs because they had lower refresh rates (75fps at 75Hz), but today's good strobed LCDs have higher refresh rates, and it is harder to do framerate=Hz at higher refresh rates if you don't have a powerful GPU.

Technically, I know that the LCD built into the Oculus Quest VR headset is actually essentially a defacto 240Hz-capable LCD, but they are running it at a lower Hz (72Hz, 90Hz, and soon 120Hz) to get CRT motion clarity strobing with zero strobe crosstalk.

To many people, it is counterintuitive to buy a high-Hz LCD just to run it at a lower Hz. Most people try strobing at max Hz, see it is crappy quality, and then give up. Or they only try 60Hz-only strobe on a 60Hz-only LCD TV. Or such, failing to get the right combination for great CRT motion clarity experience on an LCD. It's also a shopping frustration for many -- only a few percent of LCDs are capable of striking the right balance of motion blur reduction that's not unobtanium.
 
Last edited:

UltimaKilo

Gold Member
the 1000 fps part seems silly to suggest when what you want is the response time of 1000 Hz.

If you had that, it probably wouldn't matter if frames are being doubled/interpolated or not.


I don't think they ever sold one.

But here's a news article about it:

-> https://newatlas.com/panasonic-hd-3d/13842/


What they sold were 105-inch, 1080p PDP panels, I believe.

Well... the thing is, there was nothing stopping CRTs from doing what LCDs were doing in regards to resolution. People already pointed out the Sony FW900 PC monitor, but there were also HD CRT TVs.

Here's a thread from the time, about a Philips HD CRT and how crazy people thought it was:

-> https://adrenaline.com.br/forum/threads/xbox360-na-crt-philips-32-wide-hd.133146/

This was a really thin CRT for what it was.
I had that TV up until a year ago and it looked great, but it was HUGE. Must have weighed well over 100 pounds.

I have personally been waiting for 240hz to be the standard for over a decade now. I would have thought it would have been by 2021 had you asked me ten years ago.

I’d love it for video games, of course, but I would really want it for watching sports. Watching football and basketball with these refresh rates is so frustrating.

Let's hope manufacturers will start pushing for 240Hz over 8K. In my opinion, 4K already looks fantastic.

I understand for larger TVs, some consumers want 8K, but what ever happened to 5K? I remember seeing 80 inch 5K display in 2015 and it was a big difference at that size.

Refresh rate and color should really be the focus.
 

ahmyleg

Banned
What's even worse is that the mouse trail effect gets fainter as the refresh rate goes up, but gets a lot more distracting since you can still see it, and more of it.

[images illustrating the mouse cursor trail at increasing refresh rates]
That's actually a good thing. Eventually, like at 960Hz maybe, the mouse trail would become a complete blur to us, and voila! Actual, real motion blur has been achieved. That kind of motion blur is good; motion blur from extended image persistence on sample-and-hold displays (LCD, OLED, etc.) is baaaad. CRTs get around that, of course, because they're impulse-driven displays.
 
Last edited:

Soodanim

Gold Member
As much as I wouldn't want to go back to CRT for a main monitor or TV, I have two 144Hz monitors and a 60Hz TV (LCD), and I really dislike moving a mouse on the TV. Higher refresh rates becoming even more common is great news for me for that alone, let alone all of the advancements in motion clarity mentioned by mdrejhon (this has become a very interesting thread to read).

This has inspired me to do a bit more gaming on my monitor so that I can set up custom NVidia profiles for games where I can maintain a constant high FPS so that I can enjoy the motion clarity. I already love doing it in HL2, but I rarely think to do it for anything else.
 

mdrejhon

Member
That's actually a good thing. Eventually, like at 960Hz maybe, the mouse trail would become a complete blur to us, and voila! Actual, real motion blur has been achieved. That kind of motion blur is good; motion blur from extended image persistence on sample-and-hold displays (LCD, OLED, etc.) is baaaad. CRTs get around that, of course, because they're impulse-driven displays.
Correct.

What's even worse is that the mouse trail effect gets fainter as the refresh rate goes up, but gets a lot more distracting since you can still see it, and more of it.
Actually the mouse cursor is less distracting + easier to see at higher Hz.

Photos don't always accurately portray what motion looks like to the human eye.

Put a finger up on your screen. Pretend your fingernail is a mouse pointer. That's what 1000fps+ 1000Hz+ mouse cursors look like -- virtually analog motion. Once the pixel step is 1 pixel, it doesn't stroboscopically step at all. A 2000 pixel/sec mouse cursor at 2000 Hz would be just a continuous blur.
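(The step arithmetic, as a minimal sketch -- the 2000 px/sec speed comes from the example above; the helper is mine, purely illustrative.)

```python
# How far a moving cursor jumps between refreshes: speed divided by refresh rate.

def step_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    return speed_px_per_sec / refresh_hz

print(step_px(2000, 60))    # ~33 px jumps -- visibly discrete stroboscopic steps
print(step_px(2000, 240))   # ~8 px jumps
print(step_px(2000, 2000))  # 1 px per refresh -- reads as a continuous blur
```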

Now, when you eye-track a moving mouse cursor (eye tracking is different from a stationary eye / stationary camera), it's so much clearer and easier to track, aiming a mouse cursor faster on buttons sooner and clicking buttons faster. It's more of a pleasure to click on hyperlinks at 240 Hz....

But when you photograph (with a stationary camera) your moving finger, your fast-moving finger is just a very faint motion blur.

Again, photos don't always accurately portray what motion looks like to the human eye.
 
Last edited:

poodaddy

Gold Member
How feasible is it for a boutique start-up builder to start making custom small-batch modern CRTs for gaming? I've always wondered why someone doesn't do that. I mean, I'm guessing it'd be insanely expensive to get started with R&D and fabrication facilities, but the demand among gamers is definitely there, particularly as of late.
 
Last edited:

UnNamed

Banned
Are CRT impossible to make anymore? Seems like there would be a market for a nice premium one for gamers.
First you'd need to rebuild every component from the ground up; the cost would be very high. But this is not the main problem.

The real problem is CRTs were also not environmentally friendly -- lots of polluting elements -- so you'd need to re-engineer those components in non-polluting versions. That would cost a lot, years and years of development, for... 1,000 new CRTs a year? It's not easily feasible.
 

Calverz

Member
So basically we should get the manufacturers to go back to CRT? Man, they were so big. I remember when my parents got a 32" Panasonic and it was huge. Now I have a 65" OLED and it's almost as thin as paper!
 