
Old CRT TVs had minimal motion blur; LCDs have a lot of motion blur. LCDs will need 1,000fps @ 1,000Hz in order to have motion blur as minimal as CRT TVs

nkarafo

Member
This is silly, you are not going to get games at close to 1000 fps unless you want to go back to N64 quality visuals, lol. It's not feasible to do that just to completely get rid of motion blur.
Which is why we need to get rid of current technology and replace it with a better one.
 

Azurro

Banned
True but you can just hear the words and imagine how it would be :p

Unless you do still have a CRT monitor to compare or you remember how it looked. If you never saw how a CRT looks, you are lucky. Stay away from them. They will completely ruin any TV/monitor you are currently using.

This is a hipster thing, it has to be. I grew up around CRT TVs, the resolution is shit, there's color bleeding from the analog signal and they are big boys for a tiny screen.

There's a very good reason why everyone switched to LCD.
 
This is silly, you are not going to get games at close to 1000 fps unless you want to go back to N64 quality visuals, lol. It's not feasible to do that just to completely get rid of motion blur.
The 1000fps part seems silly to suggest when what you want is the response time of 1000Hz.

If you had that, it probably wouldn't matter if frames are being doubled/interpolated or not.

I believe there was a 4k model but it wasn't available to general consumers. It was 115 inches. I see it pop up on ebay every now and again for like 10 grand lol
I don't think they ever sold one.

But here's a news article for it:

-> https://newatlas.com/panasonic-hd-3d/13842/


What they sold were 105-inch, 1080p PDP panels, I believe.
This is a hipster thing, it has to be. I grew up around CRT TVs, the resolution is shit, there's color bleeding from the analog signal and they are big boys for a tiny screen.
Well... the thing is, there was nothing stopping CRTs from doing what LCD was doing in regards to resolution. People already pointed out the Sony FW900 PC monitor, but there were also HD CRT TVs.

Here's a thread from the time about a Philips HD CRT and how crazy people thought it was:

-> https://adrenaline.com.br/forum/threads/xbox360-na-crt-philips-32-wide-hd.133146/

This was a really thin CRT for what it was.
 
Last edited:

Pagusas

Elden Member
It is amazing to me that more attention hasn't been given to motion clarity in R&D departments.

Regardless though, all these threads praise CRTs but forget to talk about all the massive failings they had (and I'm not talking about weight or size limits). I'm talking about poor geometry, blooming from shadow mask warping, blurring at the edges, convergence issues, scanlines, poor point resolution, black-to-white decay time (smearing/streaking at times with certain models). Yes, some sets were better than others, Invar shadow masks fixed most blooming issues, and you could play the lottery to find a set with decent geometry, but just like with LCDs and OLEDs, there was always a tradeoff, and I would never go back to my 36" Sony Wega over my 82" QLED or 77" OLED.
 
Last edited:

nkarafo

Member
This is a hipster thing, it has to be. I grew up around CRT TVs, the resolution is shit, there's color bleeding from the analog signal and they are big boys for a tiny screen.

There's a very good reason why everyone switched to LCD.
The only reason I'm using an LCD is that my CRT monitor is slowly dying, and the LCD monitor is 240Hz so the downgrade is smaller.

Two of the three points you mention don't apply to a high-res CRT PC monitor.
 
Last edited:

mdrejhon

Member
This isn't really true for OLED. At 30fps, you need camera motion blur on OLED because, with the fast response time, there really isn't any perceptible motion blur, or it is minimal. Without any blur, the choppiness sticks out horribly. At 60fps, however, you don't really need motion blur on OLED, though a tasteful (meaning high sample count and only applied to fast motion!) implementation of per-object blur could still be beneficial.
Chief Blur Buster here, inventor of TestUFO!

While some of this is true, this needs further explanation.

Display Science

For those who haven't been studying the redesigned Area 51 Display Research Section of Blur Busters lately, I need to correct some myths.

OLEDs have motion blur too, unless impulse-driven. Even 0ms GtG has lots of motion blur because MPRT is still big.

VR headsets such as the Oculus Rift can strobe (impulse-drive like a CRT), as can the LG CX OLED's BFI setting. However, the MPRT of the original Oculus Rift is 2ms, and the LG CX OLED is about 4ms MPRT. Be careful not to confuse GtG and MPRT -- see Pixel Response FAQ: Two Pixel Response Benchmarks: GtG Versus MPRT. To kill motion blur, both GtG *and* MPRT must be low. OLED has fast GtG but high MPRT, unless strobed.

Faster GtG Can Reduce Blur -- But Only To a Point

However, faster GtG does lower the flicker fusion threshold of the stutter-to-blur continuum, where objects at low framerates "appear" to vibrate (like a slow guitar/harp string) and at high framerates vibrate so fast they blur (like a fast guitar/harp string). Regular stutter (aka 30fps) and persistence motion blur (aka 120fps) are the same thing -- the stutter-to-blur continuum is easily watched at www.testufo.com/vrr (a frame rate ramping animation during variable refresh rate).

Slow GtG fuzzies up the fade between refresh cycles as seen in High Speed Videos of LCD Refresh Cycles (videos of LCD and OLED in high speed video), which can lower the threshold of the stutter-to-blur continuum.

Some great animations to watch to help you understand the stutter-to-blur continuum:

Stutters & Persistence Motion Blur Are the Same Thing

It's simply a function of how slow or fast the regular-stutter vibration is. Super-fast stutter vibrates so fast it blends into motion blur.

Educational Takeaways of Above Animations

View all of the above on any fast-GtG display (LCD or OLED, such as a TN LCD, a modern "1ms-GtG" IPS LCD, or an OLED). Once GtG is an insignificant percentage of the refresh cycle, these things are easy to observe (assuming framerate = Hz).

1. Motion blur is pixel visibility time
2. Pixel visibility time of a strobed display is the flash time
3. Pixel visibility time of a non-strobed display is the refresh cycle time
4. Stutter and persistence blur is the same thing; it's a function of the flicker fusion threshold where stutter blends to blur
5. OLED has motion blur just like LCDs
6. You need to flash briefer to reduce motion blur on an impulse driven display.

Once you've seen all the educational animations, your understanding of displays improves greatly.

Now, LCDs do not need to be 1000Hz to reduce motion blur -- you can simply use strobing. But strobing (CRT, LCD, OLED) is just a band-aid, because real life doesn't flicker or strobe. Ultimately we need 1000Hz displays, whether OLED or LCD.

Only Two Ways To Reduce Persistence (MPRT) Display Motion Blur

There are only two ways to shorten frame visibility time (aka motion blur)

1. Flash the frame briefer (1ms MPRT requires flashing each frame only for 1ms) -- like a CRT
2. Add more frames in more refresh cycles per second. (1ms MPRT requires 1000fps @ 1000Hz)
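
To make the arithmetic concrete, here is a minimal Python sketch of the persistence math above (my own illustration, not Blur Busters code):

```python
# Motion blur width ~= eye-tracking speed x pixel visibility time (MPRT).
# Sample-and-hold: pixels stay lit for the whole refresh cycle.
# Strobed: pixels are visible only during the flash.

def mprt_ms(refresh_hz, flash_ms=None):
    """Pixel visibility time in milliseconds."""
    if flash_ms is not None:
        return flash_ms             # strobed: visibility = flash time
    return 1000.0 / refresh_hz      # sample-and-hold: visibility = refresh cycle

def blur_px(speed_px_per_sec, mprt):
    """Perceived blur trail, in pixels, while eye-tracking a moving object."""
    return speed_px_per_sec * mprt / 1000.0

speed = 960  # px/sec, the classic TestUFO panning speed
print(blur_px(speed, mprt_ms(60)))        # 60Hz sample-and-hold: 16 px of blur
print(blur_px(speed, mprt_ms(120)))       # 120Hz sample-and-hold: 8 px
print(blur_px(speed, mprt_ms(120, 1.0)))  # 120Hz strobed, 1ms flash: ~1 px
print(blur_px(speed, mprt_ms(1000)))      # 1000fps @ 1000Hz, no strobe: ~1 px
```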

And, the OP forgot to post both images, which are equally important due to context ;)

motion_blur_from_persistence_on_sample-and-hold-displays.png


motion_blur_from_persistence_on_impulsed-displays.png
 
Last edited:

nkarafo

Member
CRTs are not a better technology; a TV that weighs 50 kilos or whatever, that is only 35" and takes up way more room in your house, is a dead-end technology.

Again, it's hipster shit.
CRTs have better motion clarity. They also have nearly zero input lag. A high-res CRT PC monitor is better than an LCD in almost every way.

Living room TVs are different. Because size matters, there are more benefits to having a big LCD.
 
Last edited:

x@3f*oo_e!

Member
I think the key thing that started killing plasma was the PR that they were heavy (which they were). But when they started fizzling out, they still had the best picture quality. Darks and speed were in no way better on LED/LCD.

But plasmas still tanked.

I never understood the heavy thing. It's not like anyone is moving a TV every day. Once you plunk it somewhere, it stays. And lots of people even hung plasmas on the wall, so it's not like it's so heavy it won't hang. Mounting a big LCD/LED requires two people anyway.

I think another issue was that plasma technology was tapering out while LCDs and such kept going and improving (OLED, etc.). You never really saw plasma makers say giant innovations were coming.
AFAIK they were also intrinsically (more) EXPENSIVE TO MAKE .. never going to reach the bottom end and the bulk market. That's a guaranteed death sentence.
 

nkarafo

Member
Another thing we haven't mentioned about LCD monitors is that fast-response, high-refresh-rate monitors tend to use TN panels. That meant shitty blacks and shitty viewing angles.

CRT PC monitors don't have any of those negatives and they are still faster.

I know TVs are meant for different things but for desktop PC usage, CRTs are superior in every way.
 
I play on a Sony OLED with 120Hz black frame insertion, the best (Sony's) implementation of BFI there is. A game can have a locked 60fps with the BFI engaged, and the motion is still FAR cleaner and smoother on my Trinitron. I have not yet tried 120fps gameplay with BFI on my OLED, so not sure about that, but I would be surprised if it was as good as the CRT.

EDIT: Also, I wonder how much 240Hz BFI will help once TVs start having 240Hz panels.

Black frame insertion is a terrible band-aid solution though. It takes the already dim image of OLED and makes it even less bright.
 

mdrejhon

Member
CRTs have better motion clarity.
Being the founder of Blur Busters, I see thousands of displays. If you have tried an HTC Vive or Oculus Quest, you have noticed that LCDs can now beat CRTs in motion clarity. The Vive and Quest are able to do true real-world 0.3ms MPRT (measured, all color combos), with zero strobe crosstalk.

Also, at 119Hz, the Blur Busters Approved ViewSonic XG270 mostly matched/beat the Sony FW900 in ApertureGrille's tests. Click the comparison images in this section of the ApertureGrille review. CRTs still have better black levels, but a cherrypicked LCD (the best 1% of strobed LCDs) can now have less ghosting/motion blur than a CRT -- you can see the phosphor trails in the CRT pursuit camera photo.
 

OmegaSupreme

advanced basic bitch
Chief Blur Buster here, inventor of TestUFO!

While some of this is true, this needs further explanation.

Display Science

For those who haven't been studying the redesigned Area 51 Display Research Section of Blur Busters lately, I need to correct some myths.

OLEDs have motion blur too, unless impulse-driven. Even 0ms GtG has lots of motion blur because MPRT is still big.

VR headsets such as the Oculus Rift can strobe (impulse-drive like a CRT), as can the LG CX OLED's BFI setting. However, the MPRT of the original Oculus Rift is 2ms, and the LG CX OLED is about 4ms MPRT. Be careful not to confuse GtG and MPRT -- see Pixel Response FAQ: Two Pixel Response Benchmarks: GtG Versus MPRT. To kill motion blur, both GtG *and* MPRT must be low. OLED has fast GtG but high MPRT, unless strobed.

Faster GtG Can Reduce Blur -- But Only To a Point

However, faster GtG does lower the flicker fusion threshold of the stutter-to-blur continuum, where objects at low framerates "appear" to vibrate (like a slow guitar/harp string) and at high framerates vibrate so fast they blur (like a fast guitar/harp string). Regular stutter (aka 30fps) and persistence motion blur (aka 120fps) are the same thing -- the stutter-to-blur continuum is easily watched at www.testufo.com/vrr (a frame rate ramping animation during variable refresh rate).

Slow GtG fuzzies up the fade between refresh cycles as seen in High Speed Videos of LCD Refresh Cycles (videos of LCD and OLED in high speed video), which can lower the threshold of the stutter-to-blur continuum.

Some great animations to watch to help you understand the stutter-to-blur continuum:

Stutters & Persistence Motion Blur Are the Same Thing

It's simply a function of how slow or fast the regular-stutter vibration is. Super-fast stutter vibrates so fast it blends into motion blur.

View all of the above on any fast-GtG display (LCD or OLED, such as a TN LCD, a modern "1ms-GtG" IPS LCD, or an OLED). Once GtG is an insignificant percentage of the refresh cycle, these things are easy to observe (assuming framerate = Hz).

1. Motion blur is pixel visibility time
2. Pixel visibility time of a strobed display is the flash time
3. Pixel visibility time of a non-strobed display is the refresh cycle time
4. Stutter and persistence blur is the same thing; it's a function of the flicker fusion threshold where stutter blends to blur
5. OLED has motion blur just like LCDs
6. You need to flash briefer to reduce motion blur on an impulse driven display.

Once you've seen all the educational animations, your understanding of displays improves greatly.

Now, LCDs do not need to be 1000Hz to reduce motion blur -- you can simply use strobing. But strobing (CRT, LCD, OLED) is just a band-aid, because real life doesn't flicker or strobe. Ultimately we need 1000Hz displays, whether OLED or LCD.

There are only two ways to shorten frame visibility time (aka motion blur)
1. Flash the frame briefer (1ms MPRT requires flashing each frame only for 1ms)
2. Add more frames in more refresh cycles per second. (1ms MPRT requires 1000fps @ 1000Hz)

And, the OP forgot to post both images, which are equally important due to context ;)

motion_blur_from_persistence_on_sample-and-hold-displays.png


motion_blur_from_persistence_on_impulsed-displays.png
Very cool. Thanks for this.
 
Last edited:

OmegaSupreme

advanced basic bitch
Being the founder of Blur Busters, I see thousands of displays. If you have tried an HTC Vive or Oculus Quest, you have noticed that LCDs can now beat CRTs in motion clarity. The Vive and Quest are able to do true real-world 0.3ms MPRT (measured, all color combos), with zero strobe crosstalk.

Also, at 119Hz, the Blur Busters Approved ViewSonic XG270 mostly matched/beat the Sony FW900 in ApertureGrille's tests. Click the comparison images in this section of the ApertureGrille review. CRTs still have better black levels, but a cherrypicked LCD (the best 1% of strobed LCDs) can now have less ghosting/motion blur than a CRT -- you can see the phosphor trails in the CRT pursuit camera photo.
Welcome back btw.
 

nkarafo

Member
Being the founder of Blur Busters, I see thousands of displays. If you have tried an HTC Vive or Oculus Quest, you have noticed that LCDs can now beat CRTs in motion clarity. The Vive and Quest are able to do true real-world 0.3ms MPRT (measured, all color combos), with zero strobe crosstalk.

Also, at 119Hz, the Blur Busters Approved ViewSonic XG270 mostly matched/beat the Sony FW900 in ApertureGrille's tests. Click the comparison images in this section of the ApertureGrille review. CRTs still have better black levels, but a cherrypicked LCD (the best 1% of strobed LCDs) can now have less ghosting/motion blur than a CRT -- you can see the phosphor trails in the CRT pursuit camera photo.
Interesting. How does the ViewSonic XG270 fare with 60fps content?
 
I believe there was a 4k model but it wasn't available to general consumers. It was 115 inches. I see it pop up on ebay every now and again for like 10 grand lol.
Addendum on my part, you were right:

-> http://www.panasonic.com/business/plasma/pdf/TH152UX1SpecSheet.pdf

4K, 152", 3700 W
Being the founder of Blur Busters, I see thousands of displays. If you have tried an HTC Vive or Oculus Quest, you have noticed that LCDs can now beat CRTs in motion clarity. The Vive and Quest are able to do true real-world 0.3ms MPRT (measured, all color combos), with zero strobe crosstalk.

Also, at 119Hz, the Blur Busters Approved ViewSonic XG270 mostly matched/beat the Sony FW900 in ApertureGrille's tests. Click the comparison images in this section of the ApertureGrille review. CRTs still have better black levels, but a cherrypicked LCD (the best 1% of strobed LCDs) can now have less ghosting/motion blur than a CRT -- you can see the phosphor trails in the CRT pursuit camera photo.
That's surprising for me to read.

Any idea of how many lines of motion resolution that translates to? Above 600?
 
Last edited:

Diddy X

Member
We've come a long way though. The first LCDs were accepted only due to HD; IQ-wise they were trash. Still, we have these new technologies that can improve the motion aspect, and contrast has already been nailed by OLED/microLED.
 

mdrejhon

Member
Interesting. How does the ViewSonic XG270 fare with 60fps content?
Not well, unfortunately. But good news for 2021.

In the past, I have been disappointed at how many manufacturers have not done the 1-line programming fix to enable single-strobe 60 Hz.

Some of the strobe programming is easy and some of it is hard.

The Problems of Most Strobe Backlights

- Only supports a few refresh rates (e.g. ULMB 85Hz, 100Hz, 120Hz)
- Has bad strobe crosstalk on many models
- Poor colors (aka LightBoost of the old days)
- Doesn't Support 60 Hz single-strobe

Blur Busters Approved 2.0 for year 2021

- All refresh rates MUST support strobing
- 60Hz strobe must be supported to correctly emulate 60 Hz CRT
- Minimize strobe crosstalk as much as feasibly possible for the panel
- More configurable (strobe phase, strobe length, overdrive gain, etc)

Strobing is optional. But the user should have the flexibility.

Four displays from two manufacturers have finally signed up, and they will all be out by the end of 2021 (hopefully). One is already coming out April/May (announcement coming soon).

Convincing Manufacturers Is The HARD Part

I have been teaching display manufacturers -- explaining, coaxing, pleading, begging them to support the goddamned capabilities of LCD panels. As a passionate hobbyist turned business, I love my work & I believe in better strobing technologies where possible.

acfbDPk.jpg
 
Last edited:

NeoIkaruGAF

Gold Member
The issues of flat screens get more evident the bigger the screen, too.
Like I chronicled several times here and elsewhere, when I got a 55" OLED 4K TV I was instantly aware of motion issues I had never noticed before, not even on consumer LCD TVs, because of size.

OLED's fast response time is, in my opinion, disastrous for low-framerate content. Even when watching TV and movies I need some slight motion interpolation, otherwise most shows are visibly choppy. The exception is traditional cartoons, which are horrible instead with motion interpolation on.

Unfortunately, motion interpolation adds a ton of lag with games. It's definitely noticeable enough to make even slower games significantly worse. It was kinda necessary for me to play something like FF9 on OLED though, because the world map and camera movement during battles was literally like watching a slideshow, no hyperbole. Even prerendered backgrounds are a pain when scrolling. Stuff like Rare's Banjo games via Rare Replay is a pain to watch and play, and even with 60fps 2D games you can see that things get fuzzy in motion. Some people don't notice it, but for me, it's torture.

Motion was much better on plasma, no question. One problem with plasma was, even the most recent models had judder in movies. Judder-free TVs are relatively new on the scene.
Plasma was also prone to image retention and burn-in, it was absolutely less energy-efficient than LCD, and it was pretty dim. It could probably be improved in many ways regarding features, but when it became clear that consumers were actually prepared to buy screens bigger than 65" and 4K became a viable commercial goal, plasma was doomed. Shame, because 1080p from a good plasma still looks almost as good as 4K on an OLED if you're not counting pixels. Image quality was that good on plasma.

Modern TV tech would need a combination of several features to make motion look as good as with older tech. Intelligent strobing, 240+Hz and clever motion interpolation that doesn't add input lag would probably all be required in combination to achieve CRT-like motion. BFI, at least the LG implementation of it previous to the CX series (which I haven't witnessed in person), is a scam. It makes almost no difference with actual motion, and instead makes the image much darker and gives it a shimmery quality - that is, when it doesn't flicker so much that it's unwatchable. We must remember that, while CRTs used to flicker, screen size made it almost unnoticeable during actual vision. Blow that shit up on a huge modern screen, and I promise you can't unsee it.

OLED looks amazing in stills. In motion, though? 120Hz is where it starts to get acceptable (when I first saw someone using an iPad Pro, I knew it was a 120Hz screen at the exact moment the owner scrolled down an internet page. It's that noticeable)

Blur Busters is a treasure, but it's also the source of much pain.
 

mdrejhon

Member
Another thing we haven't mentioned about LCD monitors is that fast-response, high-refresh-rate monitors tend to use TN panels. That meant shitty blacks and shitty viewing angles.
Consider Post-2020 "Fast IPS" Tech

Cherrypicked models of IPS manufactured in year 2020 or later.

The good news is that since year 2020, 1ms IPS panels are already in the TN ballpark.

A cherrypicked Strobed IPS can be very good

If you've tried the IPS panels in the HTC Vive or Quest 2 VR headsets, they are superlative. Every CRT snob who has tried these VR headsets has stopped laughing and sees that IPS can beat CRT in motion clarity.

For me, the demarcation point of the new "Fast IPS" panels is so dramatic that it's like B.C. versus A.D. (after year 1). For IPS, any "2019 and earlier" IPS is blurry, and any "2020 and newer 165Hz+ IPS" is much clearer.

The problem is that the strobe programming in many monitors is very sub-par, and it takes good programming to make IPS strobing very good with zero or near-zero strobe crosstalk. For example, refresh rate headroom helps (e.g. a 240Hz IPS LCD running at 100Hz, which means a ~4ms scanout (the 100Hz refresh cycle is scanned out in 1/240sec) and ~6ms of total darkness in each 10ms refresh cycle to hide LCD GtG, allowing 1ms GtG to achieve a real-world 100% transition for most colors in the dark period between refresh cycles). Combined with very good, perfectly tuned neutral overdrive, this is part of the magic of zero strobe crosstalk -- overkill refresh rate headroom combined with a slightly lower-Hz strobe.
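
As a rough sketch of that headroom math (my own illustration; real scanout and blanking timings vary per panel):

```python
# Refresh-rate headroom: strobe a fast panel at a lower refresh rate so the
# LCD GtG transition can finish in total darkness between backlight flashes.

def strobe_budget(panel_max_hz, strobe_hz, gtg_ms):
    cycle_ms = 1000.0 / strobe_hz        # duration of one strobed refresh cycle
    scanout_ms = 1000.0 / panel_max_hz   # a full frame scans out at max panel speed
    dark_ms = cycle_ms - scanout_ms      # backlight-off time left in the cycle
    return scanout_ms, dark_ms, dark_ms >= gtg_ms

scan, dark, hidden = strobe_budget(panel_max_hz=240, strobe_hz=100, gtg_ms=1.0)
print(f"{scan:.1f}ms scanout, {dark:.1f}ms darkness, GtG hidden: {hidden}")
# -> 4.2ms scanout, 5.8ms darkness, GtG hidden: True
```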
 

mdrejhon

Member
Unfortunately, motion interpolation adds a ton of lag with games.
Adding extra frames does not have to add extra lag. You should check out Frame Rate Amplification Technology.

This also becomes a cheap lagless way to reach 1000fps by year 2030, while keeping Unreal 5 visuals.

NVIDIA DLSS 2.0 (spatial frame rate amplification) as well as Oculus ASW 2.0 (temporal frame rate amplification) are among the variants.

Work is already under way to achieve 5:1 and 10:1 frame rate amplification ratios without graphics degradation or additional lag.

The best blurless experiences I've seen are currently in VR headsets, where it's "must eliminate blur to eliminate VR nausea", so there is a bigger zeal for motion blur elimination in VR. It's shocking how much better things are in VR (blur-wise) than outside VR. They've literally spent 100x+ more engineering money on killing motion blur in VR-headset panels.
 
Last edited:

svbarnard

Banned
This is silly, you are not going to get games at close to 1000 fps unless you want to go back to N64 quality visuals, lol. It's not feasible to do that just to completely get rid of motion blur.
You know how they have dedicated silicon for ray tracing? You know how they're taking computer chips now and specially tweaking them to do one task and one task only? Well, they can do something like that for increased frame rates as well. Read this: https://blurbusters.com/frame-rate-...es-frat-more-frame-rate-with-better-graphics/
 

mdrejhon

Member
You know how they have dedicated silicon for ray tracing? You know how they're taking computer chips now and specially tweaking them to do one task and one task only? Well, they can do something like that for increased frame rates as well. Read this: https://blurbusters.com/frame-rate-...es-frat-more-frame-rate-with-better-graphics/
Correct.

ETA: Year 2030 for 1000fps Unreal5-visuals on one GPU.

We're already partially there -- slowly. The technology designed for my VR headset can double frame rate with no visual quality loss -- for playing Half-Life: Alyx. Without that old-fashioned "Sony MotionFlow interpolation" lag of your grandpa's TV.
 
Last edited:

StreetsofBeige

Gold Member
Not a direct reply, but I even found an 8K one:

-> http://www.controlcal.com/forum/showthread.php?t=1038

It's a shame they ended production that year. 145" @ 8K means they could do 4K at 72.5" at that point.
Ya. I had two Panny plasmas and thought they were great.

But when it came to 4k time (I got one during Black Friday 2017), plasmas were long dead.

Personally I don't care if a plasma is 50 lbs heavier. If plasmas were still made at 4K, with the bells and whistles of modern 4K LCDs and OLEDs, I would have stuck with a plasma.
 
Last edited:

x@3f*oo_e!

Member
Dynamic range (in the real world) is not good on CRT

For the measurements made in a completely dark room with almost no additive flare, the luminance of black on the LCD display was about 58 times the luminance of black on the CRT and the dynamic range (ratio of white to black luminances) is around 357 : 1 for the LCD display and 4351 : 1 for the CRT. On the face of it, the CRT appears to have a larger dynamic range, however, in practice the exact converse is true because a large region of the CRT's dynamic range is lost to additive flare under typical viewing conditions. In the presence of typical viewing flare, the ratio of white to black luminance for the CRT falls to 16 : 1 whereas the corresponding ratio for the LCD remains significantly higher at 209 : 1. This reversal is owing to the fact that the typical viewing flare has a much higher luminance than the CRT black in a dark room but is quite negligible as compared to the light leakage through the LCD cells already present in the LCD black. The higher white luminance for the LCD displays gives them a higher effective dynamic range than typical CRTs, which is clearly apparent in practice.

CRT dynamic range 16:1 (4 bit) vs 209:1 (>7bit) on LCD, because of flare.

This also affects color gamut :
The significantly larger gamut in the dark color regions provides the LCD displays a significant advantage over CRTs when displaying images with large dynamic range and shadow-detail. Fig. 8 presents a comparison of the LCD (wire-frame) and CRT (solid) gamuts in “absolute” CIELAB (Case 3 above). The “absolute” CRT gamut is almost entirely contained inside the LCD gamut and quite small in comparison. This is primarily due to the much higher luminances of the LCD display. These gamut comparisons also illustrate why the CRT appears quite satisfactory when viewed by itself independently of the LCD display (gamut comparisons of Fig. 7), but seems to be quite “washed out” when viewed side-by-side with the LCD display

https://doi.org/10.1117/12.452987 -- Comparative Evaluation of Color Characterization and Gamut of LCDs versus CRTs, Gaurav Sharma, Xerox Corp

AFAIK afterglow is/was still a problem on CRTs as well.

Nostalgia is a wonderful thing. Grow strong lifting those CRT boxes.
 

mdrejhon

Member
We've come a long way though. The first LCDs were accepted only due to HD; IQ-wise they were trash. Still, we have these new technologies that can improve the motion aspect, and contrast has already been nailed by OLED/microLED.
And nailed by good FALD LCDs.

Remember, I've seen thousands of displays. I've seen expensive LCDs with better blacks and better colors than some OLEDs, thanks to being driven by over 100,000 LEDs. No blooming. Even CRTs have a bit of blooming around their phosphor dots.

The problem is FALD is a luxury-priced feature. FALD isn't cheap. But building a 100,000-LED MicroLED FALD backlight is still cheaper than building a desktop OLED in many ways.

So, it's a two-horse race for OLED versus FALD-MicroLED desktop gaming monitors.

I also see commoditization of FALD sheets coming; ETA 2025. LED lighting became cheap in other markets (e.g. light bulbs, ribbons, edgelights, etc.) and it's time for FALD sheets to become cheap -- about ****ing time.

What we can buy for 3-figure prices is almost a joke in quality compared to what you can witness in laboratories or on ultra-niche displays. Yes, credit where credit is due -- gaming monitors only sell one-hundredth the quantity of a television model -- and economies of scale for many quality attributes are harder with a 24" display than a 55" display. But it's still sometimes immensely disappointing that what I've seen in labs isn't on our desktops yet.
 
Last edited:
Chief Blur Buster here, inventor of TestUFO!

While some of this is true, this needs further explanation.
Very interesting stuff about where the future of displays needs to be, and thank you, but I mostly just meant that the motion blur on my OLED isn't really visible to my eyes during normal viewing, especially with BFI engaged. I can definitely see (camera) trailing if I look for it -- yes, we need absurdly high refresh rates to deal with that -- but the stutter at 30fps dominates the screen due to the low motion blur on OLED compared to LCD. Basically, the response is too fast to help mask the low fps. Even at 60fps with 120Hz BFI engaged, I do still see trailing blur if I look for it, but if I don't look it's not readily apparent.

Phunkydiabetic Phunkydiabetic the lowered brightness is really not an issue in a dark room on OLED, unless it's an HDR source. However, HDR is still impactful to me with BFI engaged in a dark room, just significantly less so. The 1300-nit Sony A90J will help tremendously with HDR content.
 
Last edited:

mdrejhon

Member
Even at 60fps with 120Hz BFI engaged, I do still see trailing blur if I look for it, but if I don't look it's not readily apparent.
For further improvement, BFI/strobing needs framerate=Hz

This isn't motion blur, but occurs with all flashed displays, 30fps@60Hz or 60fps@120Hz -- regardless of CRT/plasma/strobed OLED/strobed LCD.

Remember the old CRT 30fps @ 60Hz effect? It's also a problem for 60fps at 120Hz strobed.

The problem is you need to reduce your refresh rate to 60Hz for 60fps games, for things to look better (both on OLED and LCD).

I can emulate the double-image effect in software-based black frame insertion too:
TestUFO: Software-Based Double Image Effect

(Looks best viewed on a 240Hz display, as emulating it in software requires 4 refresh cycles per frame, but it's still interpretable on a 120Hz display. 60Hz displays will not accurately convey the educational point of the above animation.)

It's a big jump in motion clarity when you do VSYNC ON framerate=Hz with any impulsed display of any kind (and there are low-lag VSYNC tricks for those VSYNC OFF users who also want to eliminate motion blur).

The problem is that to do this with an LCD, you need a monitor that supports single-strobe 60 Hz. Most manufacturers have refused to add 60 Hz strobe even though it's technically easy to engineer.

nqxW3Oz.png
 
Last edited:
You need framerate=Hz when you're strobing.

This isn't motion blur, but occurs with all flashed displays, 30fps@60Hz or 60fps@120Hz -- regardless of CRT/plasma/strobed OLED/strobed LCD.

nqxW3Oz.png
I actually had a discussion about that with JeloSWE JeloSWE and he explained that to me, but even using the test he linked me on your site, I could not see an increase in duplication with my 60fps game at 120Hz. Even with BFI not enabled it had image trailing if I looked, and this did not change for me when I turned BFI on. This is on my Sony A8H and X900E, but he says he noticed duplication due to the way his Z9F handled strobing.

Perhaps you can link me a proper test for these displays and I could take a second look?
 
Last edited:

carsar

Member
1000Hz is not a panacea. In order to get a crisp, clear object moving from the left to the right of the screen in a one-second period, you need 4K @ 3840Hz (3840 pixels/sec of motion means one pixel of movement per refresh).
And yes, you can pursuit-track objects moving at such speed, and the difference between 1000Hz, 2000Hz and 3000Hz is visible
 
Last edited:
Very interesting stuff about where the future of displays needs to be, and thank you, but I mostly just meant that the motion blur on my OLED isn't really visible to my eyes during normal viewing, especially with BFI engaged. I can definitely see (camera) trailing if I look for it -- yes, we need absurdly high refresh rates to deal with that -- but the stutter at 30fps dominates the screen due to the low motion blur on OLED compared to LCD. Basically, the response is too fast to help mask the low fps. Even at 60fps with 120Hz BFI engaged, I do still see trailing blur if I look for it, but if I don't look it's not readily apparent.

Phunkydiabetic Phunkydiabetic the lowered brightness is really not an issue in a dark room on OLED, unless it's an HDR source. However, HDR is still impactful to me with BFI engaged in a dark room, just significantly less so. The 1300-nit Sony A90J will help tremendously with HDR content.

Yeah sorry, if I need to make sure I'm in a room that is dark and not use HDR just to enjoy black frame insertion, then the feature is shit. I'll just deal with the motion blur at that point.
 

mdrejhon

Member
Yeah sorry, if I need to make sure I'm in a room that is dark and not use HDR just to enjoy black frame insertion, then the feature is shit. I'll just deal with the motion blur at that point.
I've seen prototype BFI exceed 1000 nits.

It's a function of the Talbot-Plateau theorem: you need to flash twice as bright to strobe half as long, to maintain the same nits. LED can become stadium-bright. It's just a function of cramming brighter LEDs into a backlight.

I've seen the Sony 10,000-nit prototype LCD display. You can still have 1,000nits while reducing persistence by 90% on that.

High-speed videos of CRTs show that the CRT electron gun is briefly incredibly bright (10,000 nits and up), so that puts things into perspective.

It's easier to make a brighter LCD than an OLED because you can simply outsource the brightness to a heatsinked or watercooled LED backlight, but you can't easily do the same thing with direct-view OLED pixels. So the world's brightest laboratory flat panels I've seen are all LCD.

It's why the Oculus Quest can do 0.3ms MPRT (LCD VR) while the Oculus Rift can only do 2ms MPRT (OLED VR). I have multiple VR headsets here too, including both, and the LCD Quest has ~1/6th the motion blur of the OLED Rift. Eliminating head-turn display motion blur requires a low MPRT.

With the exception of MicroLED microdisplays (2-million-nit LED displays), but those are for future projectors where "the projector bulb is the display" :D

1000Hz is not a panacea. In order to get a crisp, clear object moving from the left to the right of the screen in a one-second period, you need 4K @ 3840Hz (3840 pixels/sec of motion means one pixel of movement per refresh).
And yes, you can pursuit-track objects moving at such speed, and the difference between 1000Hz, 2000Hz and 3000Hz is visible
Correct, too.

For a non-strobed display, 1000Hz isn't even the final frontier -- especially for huge high-resolution wide-FOV displays.

The Vicious Cycle Effect means a higher resolution display makes it easier to see motion blur.

- Higher resolution means more pixels to motionblur over
- Wider FOV means more time to track moving objects
- Closer viewing distance means easier to see motion blur

8K display panning 1 screenwidth per second = about 8 pixels of motion blur at 1ms MPRT.

8K 60Hz motion blur is godawful -- 128 pixels of display motion blur at 60Hz, if you're tracking your eyes on panning motions. 7680 pixels/sec pan, divided by 60Hz, equals 128 pixel jump between frames, which is the persistence.

This doesn't matter much if you're viewing a medium TV from across a room, but matters a lot when you've got a 180-degree VR headset strapped to your face. A slow head turn can be 8000 pixels/second pan, so 1ms MPRT = 8 pixels of motion blur for 8K!

That's why the Valve Index and Oculus Quest are both 0.3ms MPRT now. To do 0.3ms MPRT without strobing, you need 3333fps@3333Hz to match the motion clarity of a Valve Index or Oculus Quest headset.
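
Those numbers are easy to sanity-check (a quick sketch of the arithmetic above):

```python
# Sanity-checking the persistence numbers above.

def blur_px(pan_px_per_sec, mprt_ms):
    return pan_px_per_sec * mprt_ms / 1000.0

print(blur_px(7680, 1000 / 60))  # 8K @ 60Hz sample-and-hold -> 128 px of blur
print(blur_px(8000, 1.0))        # 8000 px/sec head turn @ 1ms MPRT -> 8 px
print(1000 / 0.3)                # sample-and-hold Hz matching 0.3ms MPRT -> ~3333
```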
 
Last edited:
Yeah sorry, if I need to make sure I'm in a room that is dark and not use HDR just to enjoy black frame insertion, then the feature is shit. I'll just deal with the motion blur at that point.
For a bright room with lots of natural light, yes, a high-end FALD LCD is the way to go. But my room isn't even pitch black; it just has some darkening curtains, and there are uncovered windows. In SDR, even in the daytime, it's still bright, the only issue being reflections.

I am willing to go the extra mile for the best picture, and OLED gives you that. Not everything I do is in HDR; hell, it's honestly a small fraction of content.
 
Last edited:

Soltype

Member
I have been saying this for years: everyone suddenly just accepted form over function. We have been fighting uphill to get back to the level we had with CRTs. I just don't understand why companies never pursued further development of CRTs.
 
I've seen prototype BFI exceed 1000 nits.

It's a function of the Talbot-Plateau theorem: you need to flash twice as bright to strobe briefer. LED can become stadium-bright. It's just a function of cramming brighter LEDs into a backlight.

I've seen the Sony 10,000-nit prototype LCD display. You can still have 1,000nits while reducing persistence by 90% on that.

High-speed videos of CRTs show that the CRT electron gun is briefly incredibly bright (10,000 nits and up), so that puts things into perspective.

It's easier to make a brighter LCD than an OLED because you can simply outsource the brightness to a heatsinked or watercooled LED backlight, but you can't easily do the same thing with direct-view OLED pixels. So the world's brightest laboratory flat panels I've seen are all LCD. With the sole exception of MicroLED microdisplays (2-million-nit LED displays), but those are for future projectors where "the projector bulb is the display" :D


Correct, too.

For a non-strobed display, 1000Hz isn't even the final frontier -- especially for huge high-resolution wide-FOV displays.

The Vicious Cycle Effect means a higher resolution display makes it easier to see motion blur.

- Higher resolution means more pixels to motionblur over
- Wider FOV means more time to track moving objects
- Closer viewing distance means easier to see motion blur

I'm referring to OLED BFI being useless because of the already weak brightness 💅


But yes, solve the brightness problem and BFI becomes a much more useful feature. Unfortunately, that just won't be happening with OLED. OLED is a stagnant tech; consumer-grade MicroLED can't get here soon enough.
 
Last edited:
mdrejhon mdrejhon As I understand it, the max BFI setting on my displays is 60Hz BFI, which matches 60fps content, but I can't use it due to flicker. Since you're here, does it flicker because it is only 60Hz BFI, or does it flicker because the BFI matches the refresh? I mean, 120Hz BFI won't flicker with 120fps content, will it?

120Hz BFI really improves the smoothness of 30 and 60fps content, but it'd be good to know it'll be duplicate-free on 120fps content.
 
Last edited:

mdrejhon

Member
I actually had a discussion about that with JeloSWE JeloSWE and he explained that to me, but even using the test he linked me on your site, I could not see an increase in duplication with my 60fps game at 120Hz. Even with BFI not enabled it had image trailing if I looked, and this did not change for me when I turned BFI on. This is on my Sony A8H and X900E, but he says he noticed duplication due to the way his Z9F handled strobing.

Perhaps you can link me a proper test for these displays and I could take a second look?

1. Turn strobing or BFI ON.
2. Turn VSYNC ON (if you turned VSYNC OFF in NVCP for all fullscreen apps)
3. Verify framerate = Hz
4. View this text-scrolling animation in full screen mode
TestUFO Text Scrolling Animation
5. View this DOTA animation in full screen mode
DOTA Scrolling Animation - Click if you're running at 120 Hz
DOTA Scrolling Animation - Click if you're running at 60 Hz

If you've enabled strobing before you view the above animations, you will see double images at half framerate when trying to read text or nametags. You should easily see that the floating nametags have a double image at the half frame rate, making them hard to read while scrolling.

If your game erratically stutters, you won't see as much difference between single-strobe and double-strobe. The effect is much easier to see when frame rate is consistently matching refresh rate (the moment of strobed nirvana).

(P.S. If you've got an LG CX OLED, configure BFI to maximum. Do not use the medium setting for 60 Hz; the CX firmware has a quirk/bug causing an additional double strobe, so you will get a double-image versus quadruple-image comparison instead of a single-image versus double-image comparison.)
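
For the curious, the double-image geometry is simple arithmetic (a sketch, assuming perfectly consistent frame pacing):

```python
# Why 60fps on a 120Hz strobed display shows a double image: each unique frame
# is flashed Hz/fps times, while the tracking eye keeps moving between flashes.

def strobe_images(strobe_hz, fps, pan_px_per_sec):
    copies = round(strobe_hz / fps)      # flashes per unique frame
    gap_px = pan_px_per_sec / strobe_hz  # eye travel between successive flashes
    return copies, gap_px

print(strobe_images(120, 60, 960))   # -> (2, 8.0): double image, copies 8 px apart
print(strobe_images(120, 120, 960))  # -> (1, 8.0): one crisp image per frame
```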
 
Last edited:
1. Turn strobing or BFI ON.
2. Turn VSYNC ON (if you turned VSYNC OFF in NVCP for all fullscreen apps)
3. Verify framerate = Hz
4. View this text-scrolling animation in full screen mode
TestUFO Text Scrolling Animation
5. View this DOTA animation in full screen mode
DOTA Scrolling Animation - Click if you're running at 120 Hz
DOTA Scrolling Animation - Click if you're running at 60 Hz

If you've enabled strobing before you view the above animations, you will see double images at half framerate when trying to read text or nametags. You should easily see that the floating nametags have a double image at the half frame rate, making them hard to read while scrolling.

If your game erratically stutters, you won't see as much difference between single-strobe and double-strobe. The effect is much easier to see when frame rate is consistently matching refresh rate (the moment of strobed nirvana).

(P.S. If you've got an LG CX OLED, configure BFI to maximum. Do not use the medium setting for 60 Hz; the CX firmware has a quirk/bug causing an additional double strobe, so you will get a double-image versus quadruple-image comparison instead of a single-image versus double-image comparison.)
Thank you very much, I will test later when I get home and let you know what I see.
 

carsar

Member
You can still have 1,000nits while reducing persistence by 90% on that.
But why do we need BFI for such bright objects (the sun, flashlights, etc.)?

As I understand it, we can control each backlight LED separately, so we can handle 90% of LDR (100 nits and lower) with BFI-flickering diodes in order to get a clear image, and the other diodes (which show bright highlights) can be set flickerless and as bright as possible.

The only problem: the TV has to do some real-time processing to understand where the brightest pieces are placed.
 
Last edited:

mdrejhon

Member
Since you're here, does it flicker because it is only 60Hz BFI, or does it flicker because the BFI matches the refresh?

Correct. "Because it is only 60Hz BFI".

You want frame rates above the flicker fusion threshold (aka 120fps) to get beautiful single-strobe nirvana.

Side Note - this is the benefit of custom refresh rate support

This is why I've reformulated Blur Busters Approved for year 2021 (the programme where I help gaming display manufacturers improve strobing) to support strobing at any refresh rate. Whether you prefer 60 Hz single strobe or 85 Hz single strobe or 97.3579 Hz single strobe or 120 Hz strobe, etc.

Most gaming monitors now support custom refresh rates in 0.001Hz increments, so you can raise the refresh rate to the edge of your GPU capability (e.g. 100fps 100Hz) and get great strobing while still keeping frame rates above flicker detectability threshold.

That way, you can run your GPU at non-unobtainium frame rates, such as playing Cyberpunk 2077 at approximately 85fps strobed (it's much easier to run at 85fps than at 120fps, and 85Hz strobing is much more flicker-free than 60Hz strobing).

While many don't mind 60 Hz single-strobe, many don't like the flicker. But all users should still have the choice of optionally strobing at any refresh rate they want (not just presets).

I mean, 120Hz BFI won't flicker with 120fps content, will it?
Nope. 120Hz BFI won't flicker at any frame rate (0fps to 120fps to beyond). It'll just stutter more when frame rates are not matching refresh rate.

This is the same for CRTs. The lack of motion blur of CRTs made stutters easier to see too (kind of an advantage of LCD motion blur for stutter-haters), but it's always better to have zero-stutter AND zero-blur.

The only way to do zero-blur and zero-stutter simultaneously at current low native frame rates (60fps or 120fps) is via strobing / BFI / impulse driving (ala CRT/plasma) while having framerate = Hz.
 
Last edited:

mdrejhon

Member
But why do we need BFI for such bright objects (the sun, flashlights, etc.)?
I'm not sure I understand your exact question, so I will disambiguate it into two completely different questions.

Imagine you can only flash the screen for 1/10th of a refresh cycle. The screen is OFF 90% of the time, so BFI will dim the brightness by 90% in that case (without any tricks such as voltage-boosting like the backlight-based BFI strobe used in BenQ XL2546 to achieve the 300nit strobe it already does).

Remember, the CRT electron gun shines at over 10,000 nits at the electron beam dot. That's why high-speed videos of a CRT often overexpose the electron beam dot location while the rest of the screen is dark. It's kind of a de facto sequential one-pixel-at-a-time strobe, as it raster-scans the tube's phosphor.

1. Why do we need SDR BFI to be so bright?
You need to flash twice as bright for half as long, to maintain the same average nits.

Strobing at 1000 nits for only 10% of the time still only averages 100 nits to the human eye.

That's only enough for SDR.


2. Why do we still need BFI *and* HDR?
You need insane brightness to combine BFI and HDR.

Strobing at 10,000 nits for only 10% of the time still only averages 1000 nits to the human eye.

That's just enough for HDR.
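
Put as a formula, the Talbot-Plateau arithmetic above is just peak brightness times duty cycle (a minimal sketch, my own illustration):

```python
# Talbot-Plateau: perceived brightness of a strobed display averages out to
# peak nits x duty cycle (the fraction of time the light is actually on).

def perceived_nits(peak_nits, flash_ms, strobe_hz):
    duty_cycle = flash_ms / (1000.0 / strobe_hz)
    return peak_nits * duty_cycle

print(perceived_nits(1000, 1.0, 100))   # 1000-nit flashes, 10% duty -> 100 nits (SDR)
print(perceived_nits(10000, 1.0, 100))  # 10,000-nit flashes, 10% duty -> 1000 nits (HDR)
```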
 
Last edited:

Azurro

Banned
CRTs have better motion clarity. They also have nearly zero input lag. A high-res PC monitor is better than an LCD in almost every way.

Living room TVs are different. Because size matters, there are more benefits having a big LCD.

Their resolution is also insultingly low. It's a hipster thing; just because the tech has one or two aspects that are still very good by today's standards doesn't mean it beats what's available on the market now in the factors that matter most for a TV. I would never want to code on a curved CRT TV ever again; give me high resolution, bright colors and ample screen space.
 
Last edited: