
Old CRT TVs had minimal motion blur, LCDs have a lot of motion blur, and LCDs will need 1,000fps@1,000Hz in order to have as little motion blur as CRT TVs

Chronicle

Member
How much do large 55 inch CRTs weigh? How deep are they? That might put some perspective on this "CRTs are the greatest" bullcrap going around here these days.
 
480hz monitors are coming soon.
At what price though?

I feel that in 2022, 2160p (4K) should be available at a 50% premium over 1080p 75 Hz screens. Instead they cost double and are often just 60 Hz.

I also feel 120 Hz screens (and 4K) should be commonplace once you get into the €300 territory, but it's nowhere near that.


So 480 Hz will be very expensive, and I'd like them to focus on giving me 240 Hz at a price where I don't have to sell a kidney.
How much do large 55 inch CRTs weigh? How deep are they? That might put some perspective on this "CRTs are the greatest" bullcrap going around here these days.
If that was the criteria, projectors are unbeatable.
 

Meicyn

Gold Member
480hz monitors are coming soon.
https://blurbusters.com/expose-stea...e-oleds-by-boe-china-surprising-blur-busters/

So the old CRT TVs had essentially zero motion blur, whereas modern-day LCDs/OLEDs have a lot of it, and according to Blur Busters the only way to fix this is to run 1,000 frames per second. And believe it or not, it doesn't stop there: we are eventually going to need 10,000fps. Why? Because there is a screen artifact called the phantom array effect that doesn't go away until you hit 10,000 frames per second. That means we would need a computer capable of playing a game at 10,000 fps and a screen capable of a 10,000Hz refresh rate (I think this will take 50 years or more to accomplish). Look at the screenshot; that's the founder of Blur Busters, and it's from the Blur Busters forum.

[screenshot of a Blur Busters forum post by the site's founder]
I'm curious what the perceived difference is. What I mean is that most of the audio we consume daily is lossy. Likewise video. And it's fine right now, a massive improvement over what lossy audio and video were like ten years ago. I guess what I'm asking is: is this a genuine issue, or is this the usual enthusiast fodder where the 0.0001% dude listens to a 256 kbps VBR MP3 and swears he can hear the compression artifacts?
 
We still have a Sony 34XBR960 CRT, and lately I've been playing Ghosts 'n Goblins Resurrection on it via Switch (the TV has 1 HDMI input, thankfully). It really is crazy how much smoother 60fps/60Hz looks on a CRT vs. even my main 144Hz PC LCD, and I won't even get into how much better the contrast ratio & black levels are on a decent CRT.

But the convergence has slipped a bit on the XBR960 over the years; that's one downside with CRTs, they need occasional calibration to keep things looking crisp.
I agree, this is in part why I can't stand today's gaming. Between the shitty-looking panels, especially in motion, and the introduction of ambient occlusion into developers' rendering pipelines almost a decade ago (blown-out high contrast, glowing characters etc.), it's just awful.
 
I’m curious what the perceived difference is.
Quite huge actually. We're used to it, but LCD screens are very blurry in motion, while OLED is almost too clear, yet not perfect either.

LCDs effectively resolve only about a quarter of a frame per refresh, so around 300 lines on a 1080-line panel, and they're also not that fast at transitions, taking a few milliseconds to stabilize each change.

If you go to OLED you still have the line issue, but transitions are so fast that 60 Hz sources end up creating a motion problem, with repeating frames producing a jarring effect.

Regardless, 240 Hz becoming common could improve LCDs a lot, as it already did on the TV front. 480 Hz is just even better.
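To put rough numbers on that, here's a quick back-of-the-envelope Python sketch of how eye-tracked blur on a plain sample-and-hold panel shrinks as the refresh rate climbs; the scroll speed is just an illustrative assumption and the model ignores LCD response time:

```python
# Back-of-the-envelope: eye-tracked blur width on a plain sample-and-hold panel.
# Assumes instant pixel transitions; real LCD response times add more smearing.
def blur_width_px(scroll_speed_px_per_s: float, refresh_hz: float) -> float:
    """Blur ~= how far the content moves while one frame stays on screen."""
    return scroll_speed_px_per_s / refresh_hz

speed = 960  # px/s, roughly a TestUFO-style scroll speed (assumed figure)
for hz in (60, 120, 240, 480, 1000):
    print(f"{hz:>4} Hz: ~{blur_width_px(speed, hz):5.1f} px of persistence blur")
```

Blur roughly halves with each doubling of the refresh rate, which is why 240 Hz and 480 Hz are meaningful steps long before anything looks CRT-like.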
What I mean is that most of the audio we consume daily is lossy. Likewise video. And it's fine right now, a massive improvement over what lossy audio and video were like ten years ago. I guess what I'm asking is: is this a genuine issue, or is this the usual enthusiast fodder where the 0.0001% dude listens to a 256 kbps VBR MP3 and swears he can hear the compression artifacts?
Video improved a lot, but audio didn't.

We're basically exactly where we were 10 years ago with regard to how 320 kbps VBR sounds. Streaming services are still not as good as CD, even when they have "high quality streaming" options.

It's not so much about compression artifacts but how sound compression works. It can sound fine but still be missing information. I actually think the overall quality and choice of the things they call high-end audio has decreased. Bluetooth is shite and not premium, impedance support and sometimes even the headphone jack are missing, and dynamic speakers/earbuds are easy to expose as shit whenever that becomes apparent. They're just convenient.

JPEGs are fine, but the quality is not as good as RAW. It's the same thing here.


Video is perfectly passable now, but streamed video is still a regression from the Blu-ray quality we had 10 years ago, especially compared against 4K Blu-ray. Granted, Blu-ray is also lossy and compressed, but you get the "improvement is relative" point.
 

Chronicle

Member
At what price though?

I feel that in 2022, 2160p (4K) should be available at a 50% premium over 1080p 75 Hz screens. Instead they cost double and are often just 60 Hz.

I also feel 120 Hz screens (and 4K) should be commonplace once you get into the €300 territory, but it's nowhere near that.


So 480 Hz will be very expensive, and I'd like them to focus on giving me 240 Hz at a price where I don't have to sell a kidney.

If that was the criteria, projectors are unbeatable.
Projectors are garbage and need to be used in pitch-black rooms. And it's not the criterion. CRTs are gone for a reason. You can buy one if you want. You can buy a projector too. No one is stopping you (except the drawbacks).
 

svbarnard

Banned
What's even worse is that the mouse trail effect gets fainter as the refresh rate goes up, but it gets a lot more distracting since you can still see it, and more of it.

Yes, the phantom array effect, also known as the mouse cursor stepping effect, will not disappear on LCDs/OLEDs until we hit 10,000 frames per second (yes, that's right, ten thousand frames per second!!!). I imagine it will be 50 years or more before we ever see screens with a 10,000Hz refresh rate. And don't take it from me, take it from the founder of Blur Busters; look at the screenshot, it's from the Blur Busters forum.

TV manufacturers need to quit focusing on high screen resolution and instead focus on high refresh rates; 4K is good enough, but 60Hz isn't. The future of gaming is ultra high refresh rates, and in fact we are eventually going to need 10,000fps@10,000Hz!!!!

[screenshot of a Blur Busters forum post by the site's founder]
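For a rough sense of where that 10,000Hz figure comes from, here's a tiny Python sketch; the flick speeds and the ~1 pixel visibility threshold are my own assumptions for illustration, not Blur Busters measurements:

```python
# Toy estimate: the phantom array shows the cursor at discrete positions spaced
# (speed / refresh rate) pixels apart. Assume the steps stop being visible once
# they are under ~1 pixel apart (my threshold, not a Blur Busters figure).
def hz_to_hide_steps(cursor_speed_px_per_s: float, max_gap_px: float = 1.0) -> float:
    return cursor_speed_px_per_s / max_gap_px

for speed in (1000, 4000, 10000):  # px/s, assumed mouse-flick speeds
    print(f"{speed:>5} px/s flick -> ~{hz_to_hide_steps(speed):,.0f} Hz before the steps merge")
```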
 

svbarnard

Banned
I'm happy with the lower power consumption and lower weight given that the only trade off is motion blur.

I enjoy BFI on the monitors I've used it on, and if they could improve that technology (brightness counteraction, etc) and make it compatible with a wider range of refresh rates and VRR tech then it will definitely be good enough for all but the smallest niche.
BFI (black frame insertion) is what all VR headsets are currently using to combat the atrocious motion blur that LCDs/OLEDs have. Every single VR headset is using black frame insertion, it was the only way to fix the horrible motion blur. I know this because I was reading the blog of Michael Abrash who is a virtual reality pioneer and is now developing the Oculus Quest for Zuckerberg (I would provide the link to the blog but unfortunately it was removed from the internet for some reason).
 
But the convergence has slipped a bit on the XBR960 over the years; that's one downside with CRTs, they need occasional calibration to keep things looking crisp.
Some more expensive Sony professional-grade sets were able to auto-calibrate geometry. That would be the future, I think.

Projectors are garbage and need to be used in pitch-black rooms. And it's not the criterion. CRTs are gone for a reason. You can buy one if you want. You can buy a projector too. No one is stopping you (except the drawbacks).
Projectors can be good. I was pointing out that if it's down to weight, they give you more inches while weighing less.

Good CRTs are awesome, but of course they were phased out for a few reasons. The biggest was not quality though, it was cost. Weight and size mean extra cost during transport too. They had to invest massively to make CRT HDTVs, and the TV size increases they wanted would only make CRTs weigh more, so it was time to kill the tech instead of letting it reach those heights and compare favourably against LCD.

I think it's not so much that CRTs were a really good tech, but how much we took for granted things that we're still trying to tame on the succeeding tech after 20 years.

The truth is that, as good as LCD can be these days, the base design is cheap but not very good. Plasma was better; SED never took off, and neither did LPD, though both would have held up better against LCD. They were all more complex (plasma was also killed because of the predicted cost of making the jump to 4K), but they were better, or they would have had no reason to exist.

OLED introduces a whole range of other issues (while keeping some of the issues from LCD tech). Even if on paper it is the best thing ever for the "direct emitting" gold-standard goal, it's just not very good at handling motion. The main reason it made it to market, though, is that it can be as cheap as LCD.
BFI (black frame insertion) is what all VR headsets are currently using to combat the atrocious motion blur that LCDs/OLEDs have. Every single VR headset is using black frame insertion, it was the only way to fix the horrible motion blur. I know this because I was reading the blog of Michael Abrash who is a virtual reality pioneer and is now developing the Oculus Quest for Zuckerberg (I would provide the link to the blog but unfortunately it was removed from the internet for some reason).
Yes. And BFI is basically simulating how CRT operated in the first place. :)

It just works, but you need a lot of hertz and brightness on an LCD/OLED to be able to pull it off.
 

svbarnard

Banned
TV manufacturers absolutely need to quit focusing on higher and higher screen resolution (4K is good enough, we do not need 8K!!!) and instead focus on ultra high refresh rates. We need ultra high refresh rates!!!!

Ultra high refresh rates is the future of gaming!
 

NeoIkaruGAF

Gold Member
Now that I think of it, I wonder what spurred the massive LCD takeover in the mid-2000s. It wasn't even a fight - CRT was dropped almost in a heartbeat, even though early LCD TVs were pretty terrible and 720p brought nothing to the table in a market where DVD was 99+% of home video sales. Nobody seemed to care about video quality. Was the prospect of having a 32” screen that wasn't two feet deep and 30 kilos enough for most people to buy that many new TVs? I doubt PS3 and X360 alone could have justified the massive market shift.

I actually recently “downgraded” to LED from OLED. OLED motion is completely unbearable to me for games. It's a great tech for movies, provided you use some slight motion interpolation, but I just can't stand it with sub-60fps gaming and non-interpolated 24fps movies.
 

NeoIkaruGAF

Gold Member
I wonder if MicroLED will improve motion clarity when they go mainstream.
16k will probably be mainstream before TV manufacturers start offering effective motion features beyond interpolation and TVs go beyond 120Hz. LG going backwards with BFI on the C2 isn’t a good sign. I don’t think microLED offers inherently better motion than the tech we have today, and you can bet it will be pushed first and foremost with uber-high resolution and other buzzwords when it’s ready for the mass market.
 

cireza

Banned
We need excellent BFI rather than very high refresh rates. This might become true for monitors, but for TVs, especially smaller ones, I am not sure we will see this someday. I play on a 32 inch TV (I don't need anything bigger), so this is a bit frustrating.
 

Fafalada

Fafracer forever
Yes. And BFI is basically simulating how CRT operated in the first place. :)
It isn't quite, though: a CRT's refresh pulse follows a decay curve, not an even on/off split, not to mention being refresh-rate limited to begin with.

BFI (black frame insertion) is what all VR headsets are currently using to combat the
I'm not sure that they are though; afaik VR HMDs keep pixels lit for only around 1/3 to 1/4 of the refresh, which would mean their panels would have to run at ~480Hz if it were just BFI. But maybe some hw engineer can answer how viable that was over 6 years ago.
Also I seem to remember some talk about HMDs that can even simulate a pulse (not just on/off), but I could be imagining things, getting old and all.
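A quick sketch of that persistence math in Python; the refresh rates and lit fractions are illustrative assumptions, not specs from any particular headset:

```python
# Sketch of the persistence math: a strobed panel lit for only a fraction of each
# refresh blurs like a much faster sample-and-hold panel. Numbers are illustrative.
def equivalent_sample_and_hold_hz(refresh_hz: float, lit_fraction: float) -> float:
    persistence_s = lit_fraction / refresh_hz  # how long each image actually stays lit
    return 1.0 / persistence_s

for hz, frac in ((90, 1 / 3), (120, 1 / 4), (60, 1.0)):
    print(f"{hz:>3} Hz panel lit {frac:.0%} of the time -> "
          f"blur like ~{equivalent_sample_and_hold_hz(hz, frac):.0f} Hz sample-and-hold")
```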
 

Kataploom

Gold Member
I know there is motion blur on my OLED and my LED, but I don't know that I am even bothered by it or notice it.
Same here, what's that motion blur people talk about? I played on a CRT TV up until 10 years ago, alongside playing on LCD, and never noticed it... I have 20/20 vision btw, so clearly my eyes are not the problem here
 

Ulysses 31

Member
Same here, what's that motion blur people talk about? I played on a CRT TV up until 10 years ago, alongside playing on LCD, and never noticed it... I have 20/20 vision btw, so clearly my eyes are not the problem here
He may be confusing motion blur with low motion clarity. (motion blur can come from the source itself and not the display)

When moving, the sharpness/detail on modern displays tends to fade a lot which shouldn't be the case ideally.

[motion blur FAQ illustration]


Just try to read something like text/signs when the camera scrolls/pans in games or text/signs on a moving truck across the screen.

www.testufo.com shows the motion clarity improving the more hz your display can do.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Same here, what's that motion blur people talk about? I played on a CRT TV up until 10 years ago, alongside playing on LCD, and never noticed it... I have 20/20 vision btw, so clearly my eyes are not the problem here
I notice the stutter (what you guys are calling blur) when I look for it on my OLED. I can see it on 60fps content and barely on 120 fps content. Unfortunately I can't use BFI when VRR is enabled so I can't tell you if that will make a true difference.

However, where CRT truly shines for me is how it handles lower resolution content. 480p content looks so godawful on a 4K TV that I don't bother, but on a CRT it still looks respectable.
 
How much do large 55 inch CRTs weigh? How deep are they? That might put some perspective on this "CRTs are the greatest" bullcrap going around here these days.
I wanna say they don't exist (someone correct me if I'm wrong), that is, specifically talking about a glass-screen CRT TV. Once they started to sell 55” screens at retail, it was a plexiglass screen (which introduced a whole new set of problems for image quality), and those were rear-projection TVs at a 4, 9 ratio.
 

Chronicle

Member

I wanna say they don't exist (someone correct me if I'm wrong), that is, specifically talking about a glass-screen CRT TV. Once they started to sell 55” screens at retail, it was a plexiglass screen (which introduced a whole new set of problems for image quality), and those were rear-projection TVs at a 4, 9 ratio.
There are no 55 inch CRTs unless they're projection TVs. They're horrible.
 

Kataploom

Gold Member
He may be confusing motion blur with low motion clarity. (motion blur can come from the source itself and not the display)

When moving, the sharpness/detail on modern displays tends to fade a lot which shouldn't be the case ideally.

[motion blur FAQ illustration]


Just try to read something like text/signs when the camera scrolls/pans in games or text/signs on a moving truck across the screen.

www.testufo.com shows the motion clarity improving the more hz your display can do.
Ok, that I can understand, because these days I was looking for a way to avoid 30fps looking so choppy on my TV; that's why I hate 30fps. But maybe that's the motion clarity thing some people talk about, and there's nothing you can do against it 🤷
 

Fafalada

Fafracer forever
Ok, that I can understand, because these days I was looking for a way to avoid 30fps looking so choppy on my TV; that's why I hate 30fps. But maybe that's the motion clarity thing some people talk about, and there's nothing you can do against it 🤷
30fps indeed further degrades motion clarity - the dreaded 'soap-opera' effect is in part due to increased temporal clarity from higher refresh allowing us to see details that are otherwise mushed together at lower framerate.
But even 60fps still loses about 70% of resolution in motion on most modern displays (a standard 1080p 60Hz LCD/OLED panel resolves only around 300 lines in motion). With BFI, that number can be increased to 800-900 on many TVs.
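A rough numeric illustration of those figures in Python; the pan speed (one picture height per second) and the 25% BFI duty cycle are assumptions picked for illustration, not measurements:

```python
# Rough check on those figures. Blur width ~= pan speed * time the frame is visible.
# The pan speed (one picture height per second) and 25% BFI duty cycle are assumptions.
def blur_px(speed_px_per_s: float, refresh_hz: float, duty_cycle: float) -> float:
    return speed_px_per_s * duty_cycle / refresh_hz

speed = 1080  # px/s
sample_and_hold = blur_px(speed, 60, 1.0)
with_bfi = blur_px(speed, 60, 0.25)
print(f"plain 60 Hz: ~{sample_and_hold:.0f} px blur, 60 Hz + BFI: ~{with_bfi:.1f} px blur")
print(f"-> roughly {sample_and_hold / with_bfi:.0f}x sharper in motion, the same ballpark "
      f"as the ~300 vs ~800-900 line figures")
```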
 

StreetsofBeige

Gold Member
I used to have a 60" Panasonic plasma (which I think was 1080p) and replaced it with a 65" Sony 4K 900e.

Call me crazy, but I don't think the 900e was any better. It definitely gave me more trouble with dead pixels and audio issues. And I bought the Panny in like 2011.

I didn't do a side-by-side comparison, so I'll assume the resolution is much better on the 4K. But I think everything else was better on my plasma.
 
How much do large 55 inch CRTs weigh? How deep are they? That might put some perspective on this "CRTs are the greatest" bullcrap going around here these days.
I had a 36 inch Sony CRT TV; that thing weighed over 200 lbs. It had an amazing picture though; sadly it was 4:3, so HD content was letterboxed. That was pretty much the biggest CRT TV that was made; there may have been some other kind of specialty CRT out there somewhere, but that was the biggest one you could find in a regular store.
 
the dreaded 'soap-opera' effect is in part due to increased temporal clarity from higher refresh allowing us to see details that are otherwise mushed together at lower framerate.
this makes it sound like interpolation is uncovering new details in source material (or un-mushing it), which isn't really true.

interpolation generates new frames between each frame of the original source material, but the new frames tend to be inaccurate/imperfect, which is why you get the weird "soap opera" effect.
 
I used to have a 60" Panasonic plasma (which I think was 1080p) and replaced it with a 65" Sony 4K 900e.

Call me crazy, but I don't think the 900e was any better. It definitely gave me more trouble with dead pixels and audio issues. And I bought the Panny in like 2011.

I didn't do a side-by-side comparison, so I'll assume the resolution is much better on the 4K. But I think everything else was better on my plasma.
Was it a VT20?

Awesome set. No LCD is an upgrade to that when it comes to image, and OLED comes with image clarity issues.

I have a 2013 65VT60 at home. It's awesome. 4K is not interesting unless it improves motion.
 

Fafalada

Fafracer forever
this makes it sound like interpolation is uncovering new details in source material (or un-mushing it), which isn't really true.
interpolation generates new frames between each frame of the original source material, but the new frames tend to be inaccurate/imperfect, which is why you get the weird "soap opera" effect.
So there's two separate things here - one is perception.
While interpolation can cause errors - I've never seen any studies that associated errors specifically with perception of 'soap-opera' (which - to my knowledge references actual TV broadcasts that were originally aired at 50/60hz). More so people exposed to true 48hz (or 60) content with high-production values (eg. The Hobbit), still often reported soap-opera look even though no interpolation was present.
But if you have data suggesting otherwise, I'd be interested to read up on it.

The other is interpolation quality itself. While specifics of 'every' implementation can't be pinned down given how many products have been on the market over decades, as a general approach, we're talking temporal image reconstruction algorithms, not unlike what is commonly in-use in modern games for AA/resolution upscale.
The key difference is that motion vectors are extracted from frame-history analysis (hence why most interpolation modes introduced massive input-lag, they need a long history buffer to produce motion vectors), so there's additional source of error, but the really big issues (disocclusion etc.) are the same as with every temporal-screen space algorithm we know today.
That said - yes, inserting intermediate frames, even when they're not 'perfect' allows us to perceive detail that is obscured in 30fps feed from the same source video (it's not new detail, ultimately those pixels are always there, just reprojected in interpolated frames). Some of it 'may' be fake detail, not unlike how DLSS uses AI to imagine certain details(or indeed, more modern AI frame-interpolators) - but end-result is you get more information to your eyes.
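To make that "temporal reconstruction" framing concrete, here's a deliberately minimal Python/NumPy sketch; it uses a single global motion vector and ignores disocclusion, unlike any real TV or engine implementation:

```python
import numpy as np

# Deliberately minimal motion-compensated interpolation: estimate one global
# motion vector from two past frames, then reproject both halfway to synthesize
# the in-between frame. Real TV/engine interpolators work per block/pixel and
# handle disocclusions; this toy does neither.
def estimate_global_motion(prev: np.ndarray, curr: np.ndarray, max_shift: int = 8):
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.mean((np.roll(prev, (dy, dx), axis=(0, 1)) - curr) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best  # (dy, dx) taking prev toward curr

def interpolate_midpoint(prev: np.ndarray, curr: np.ndarray) -> np.ndarray:
    dy, dx = estimate_global_motion(prev, curr)
    fwd = np.roll(prev, (dy // 2, dx // 2), axis=(0, 1))          # prev pushed half a step forward
    bwd = np.roll(curr, (-(dy // 2), -(dx // 2)), axis=(0, 1))    # curr pulled half a step back
    return (fwd + bwd) / 2.0

# toy usage: an 8x8 bright square moving 4 px to the right between two frames
prev = np.zeros((32, 32)); prev[12:20, 8:16] = 1.0
curr = np.zeros((32, 32)); curr[12:20, 12:20] = 1.0
print("estimated motion (dy, dx):", estimate_global_motion(prev, curr))
mid = interpolate_midpoint(prev, curr)  # square lands halfway, at columns 10:18
```

The long frame-history buffer a real set needs for this kind of motion search is also where most of the added input lag comes from, as noted above.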
 

StreetsofBeige

Gold Member
Was it a VT20?

Awesome set. No LCD is an upgrade to that when it comes to image, and OLED comes with image clarity issues.

I have a 2013 65VT60 at home. It's awesome. 4K is not interesting unless it improves motion.
VT? No, not that model; I think those were the granddaddy models! I had the base model, S30 I think. My brother got a 65" Panny plasma a few years later, which I think was the ST60 model. He still has it and it still looks fantastic.
 

Gamer79

Predicts the worst decade for Sony starting 2022
Go try to find a 55 inch CRT? LMFAO

The thing would weigh well over 500 lbs. Also they were power hogs and mostly did not come in widescreen.
 
VT? No, not that model; I think those were the granddaddy models! I had the base model, S30 I think. My brother got a 65" Panny plasma a few years later, which I think was the ST60 model. He still has it and it still looks fantastic.
S30/ST30 was awesome. I forgot they made those with up to 65".

ST50 was the pinnacle for gaming as ST60 had increased input lag (but slightly better image quality). This was solved on GT60, VT60 and ZT60 models.

Such good years for image quality.
 
So what I’m reading here is we need plasmas back? Right?
They're really good but not going to happen. Power consumption is high, as is circuit/panel complexity. Peak brightness was also a problem against LCD.

It's a bit surprising they lasted in production as long as they did.

Micro-LED, good processing along with lots of hertz and BFI is the intersection where it might just not matter anymore.
How about laser-powered phosphor displays?
Not going to happen. The tech is good but it was developed and patented by a videowall company called Prysm.

Which means several things. First, it was made as a module system so you can build your own custom videowall. Each block is 320×240 and the dot pitch is above 1 mm per pixel. You can't do a home HDTV with that kind of pixel pitch; you'd need to manufacture a roughly 90" set to do 1920×1080 at the current dot pitch, perhaps even more as I'm not accounting for the space between pixels (rough math at the end of this post).

Second, seeing as it's patented, no manufacturer (Samsung, Sony, LG, AUO, Sharp, Innolux, Chimei) is going to show up with similar tech.

Complexity would be a problem too. The latest modules are from 2016, so I guess they're losing against cheaper LED videowalls, which suck balls but are comparatively cheap, using off-the-shelf crap.


Current laser-powered phosphor would be perfect for planetariums, videowalls with TV-quality image, or IMAX setups without a projector. Sadly very niche. And they seem complex compared to LCD and OLED. I'd say it's in an even worse place to pull off 4K, 8K and beyond than plasma was.
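The pixel-pitch point from earlier in this post, sketched in Python; the 1.0 mm pitch is the rough figure quoted above, not an official Prysm spec:

```python
import math

# The pixel-pitch argument in numbers. The 1.0 mm pitch is the rough figure from
# the post above, not an official Prysm spec.
pitch_mm = 1.0
width_mm, height_mm = 1920 * pitch_mm, 1080 * pitch_mm
diag_in = math.hypot(width_mm, height_mm) / 25.4
print(f'1080p at {pitch_mm} mm pitch: {width_mm/1000:.2f} m x {height_mm/1000:.2f} m, ~{diag_in:.0f}" diagonal')
```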
 
As I remember it, CRTs still had that wagon wheel issue. Just wanted to point that out.
Doesn't the brain itself have a wagon-wheel-like effect, as the brain and eye only have finite Hz processing?
The fact that a perceptual experience akin to the familiar wagon-wheel illusion in movies and on TV can occur in the absence of stroboscopic presentation is intriguing because of its relevance to visuo-temporal parsing. The wagon-wheel effect in continuous light has also been the source of considerable misunderstanding and dispute, as is apparent in a series of recent papers. Here we review this potentially confusing evidence and suggest how it should be interpreted.
Some years ago we wrote a paper that described and analyzed an intriguing perceptual phenomenon, pointing out its possible implications [1]: when the spokes of a wheel or other stimuli with elements that move continuously in one direction are observed in sunlight, the elements are sometimes seen to be moving in the opposite direction. Because of the general similarity to the backwards motion of wagon wheels in movies, TV or other forms of stroboscopic presentation, we called this phenomenon the ‘wagon-wheel illusion in continuous light’ (see Box 1) and suggested on this basis that the visual system can segregate visual information into meaningful episodes from which perception is then constructed.
 

ClosBSAS

Member
I remember my Samsung CRT HDTV... best screen I've ever used. 360-era Gears of War on it was a true next-gen experience. So much better than the shit we have now.
 

Chronicle

Member
I had a 36 inch Sony CRT TV; that thing weighed over 200 lbs. It had an amazing picture though; sadly it was 4:3, so HD content was letterboxed. That was pretty much the biggest CRT TV that was made; there may have been some other kind of specialty CRT out there somewhere, but that was the biggest one you could find in a regular store.
I know. I used to sell them. One reached 40 inches - the Sony XBR Trinitron. Fucking humongous and weighed like a 747. 4:3 aspect.
 
I've been using a software BFI program I grabbed off the Blur Busters forums to do some.. unethical... human testing, on myself 🤣
Right now, I am driving my Asus XG279Q at a custom refresh rate of 150hz, and using this software BFI tool to draw 1 real frame followed by 4 black frames for an effective refresh rate of 30hz. Yep, it's a flickery mess. But let me tell you something, emulation and old games built around 30 fps, have NEVER looked so sharp and smooth. When you single cycle strobe, it doesn't matter what refresh/frame rate you're targeting, it will appear sharp and smooth. I've even tried 25 fps from the old Diablo 2 days, and the game looks crystal clear sharp.
And to be clear, this isn't a good LCD. It's a crappy fake IPS screen with around 3-4ms response times. I am so eager to try an OLED with software BFI combined with hardware BFI for these lower fps games and see not only the incredible smoothness, but the deeper contrast too.

Of course, this isn't ideal and no normal human being wants to be subjected to sub 60hz flicker. I can tolerate it because I'm some kind of ungodly beast of a man. But the good news is, if we can reach 960hz screens, then we can simulate rolling scans on just about any "low" fps content. Eg - for 60 fps feeds, you'd have 16 refresh cycles per 60hz frame to simulate a rolling scanout. This would dramatically reduce flicker and also greatly help boost brightness and colors. You don't even need to send a 960hz signal to the display, it can be done locally onboard the panel's built in processor. Just feed it a regular old 60hz signal and let the display do all the heavy lifting of simulating 16 simulated scanout frames from the source. Done.

I hope I live to see the day this can be done because it's going to be so fucking cool.
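In the spirit of that setup, here's a toy Python sketch of the pacing; present_frame() is a hypothetical stand-in for whatever the real tool does with the swapchain, so only the timing logic is shown:

```python
import time

# Toy version of the pacing described above: run the display at a high refresh rate
# and show 1 real frame followed by N black ones. present_frame() is a hypothetical
# stand-in for the real tool's swapchain flip; only the timing logic is sketched here.
def run_software_bfi(refresh_hz: int = 150, black_per_real: int = 4, seconds: float = 1.0):
    frame_time = 1.0 / refresh_hz      # 150 Hz -> ~6.67 ms per refresh
    cycle = black_per_real + 1         # 1 real + 4 black frames per content frame
    for i in range(int(seconds * refresh_hz)):
        show_black = (i % cycle) != 0
        # present_frame(black=show_black)   # real code would flip a black or game frame here
        time.sleep(frame_time)             # crude stand-in for vsync-paced presentation
    print(f"content rate: {refresh_hz / cycle:.0f} fps, "
          f"visible persistence per frame: {1000 / refresh_hz:.2f} ms")

run_software_bfi()
```

The 960Hz rolling-scan idea above is the same trick taken further: more refresh slots per content frame means the dark period can be swept down the panel instead of flashed all at once.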
 

rofif

Can’t Git Gud
He may be confusing motion blur with low motion clarity. (motion blur can come from the source itself and not the display)

When moving, the sharpness/detail on modern displays tends to fade a lot which shouldn't be the case ideally.

[motion blur FAQ illustration]


Just try to read something like text/signs when the camera scrolls/pans in games or text/signs on a moving truck across the screen.

www.testufo.com shows the motion clarity improving the more hz your display can do.
There are diminishing returns with natural motion blur. You can't expect to read or see details that scroll by fast.
The only thing you can expect is for motion blur to look natural, and that's not as easy.

I think you need artificial motion blur to convey speed, motion, weight and so on.
Movies cheat it almost naturally because each frame captures a span of time, not a frozen snapshot. A 24fps movie built from these captured blurry moments comes across as motion. That's why a movie frame is always blurry when you pause.

But ideally we don't want that in a game. You won't capture spans of time in a game. So we really need at least 240 or even 1000 fps to be able to ideally disable fake motion blur. With that many frames there is enough data for your brain to build the whole motion.
In my tests, 240Hz without motion blur looked identical to 60Hz with motion blur in Doom Eternal. At least to my eyes/brain.

But the problem is that even when you have 1000fps, disabled motion blur and thousands of frames to build perfect natural motion blur with every frame sharp... the screen is still small. A car driving by fast next to you in reality is big. It creates a different sensation of speed in reality than the same thing in a game; it's not the same size on the screen relative to your eyes.

It's a complicated topic and I think some sort of faked blur will always be needed. Trying to build game motion from still snapshots frozen in time is a mistake.
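As a concrete illustration of the "captured span of time vs frozen snapshot" distinction, here's a small Python/NumPy sketch; render() is a made-up toy scene and the 0.5 shutter fraction is just the common 180-degree film convention, not anything from the post:

```python
import numpy as np

# Illustration of "captured span of time vs frozen snapshot": approximate film-style
# blur by averaging sub-frames rendered inside one output frame. render() is a toy
# scene of my own; the 0.5 shutter fraction is just the common 180-degree convention.
def film_style_frame(render, t: float, frame_dt: float, shutter: float = 0.5,
                     samples: int = 8) -> np.ndarray:
    times = t + np.linspace(0.0, frame_dt * shutter, samples)
    return np.mean([render(ti) for ti in times], axis=0)

def render(t: float, width: int = 64, speed: float = 960.0) -> np.ndarray:
    scene = np.zeros(width)                 # 1D "scene" with one bright moving dot
    scene[int(speed * t) % width] = 1.0
    return scene

movie_frame = film_style_frame(render, t=0.0, frame_dt=1 / 24)  # one 24 fps frame
print("pixels the dot touched during the exposure:", int((movie_frame > 0).sum()))
```

A game that snapshots the scene once per frame skips this averaging entirely, which is why it either needs fake blur or a very high frame rate to read as smooth.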
 

cortadew

Member
CRTs have better motion clarity. They also have nearly zero input lag. A high-res CRT PC monitor is better than an LCD in almost every way.

Living room TVs are different. Because size matters, there are more benefits to having a big LCD.
Good luck trying to find a high-res CRT PC monitor that isn't in the thousands of dollars.
 