So I thought the whole point was about eliminating motion blur? I thought the whole point was that modern-day flat panel TVs such as our LCDs/OLEDs have a shit ton of motion blur compared to the old CRT TVs, and the only way to eliminate this motion blur was to achieve 1,000 frames per second?
Right Tool For Right Job.
Don’t confuse the screwdriver with the hammer in the toolbox.
We’re simply choosing CRT simulation as a superior method of BFI. The HDR allows brighter BFI, and the rolling scan ensures that photons are hitting eyeballs at all times, rather than a harsher square-wave flicker. A 60 Hz CRT’s flicker causes less eyestrain than 60 Hz square-wave BFI.
Running an emulator with
BOTH spatial HLSL filters (existing technology)
AND temporal HLSL filters (my CRT electron gun simulator idea) will make it look both spatially and temporally correct. Basically a superior method of BFI.
The technology scales to better and better temporal accuracy the more Hz you throw at it, so it can be designed to be Hz-scaling, much like spatial CRT filters look more and more accurate at higher resolutions, and even more accurate on OLED (good blacks, etc.).
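To illustrate the Hz-scaling idea, here’s a minimal Python sketch (function name and numbers are my own, purely illustrative, not any shipping implementation) of how each emulated 60 Hz frame could subdivide into finer rolling-scan slices as the output refresh rate rises:

```python
# Hz-scaling sketch: more output Hz = finer temporal subdivision of each
# emulated 60 Hz CRT frame. Names and values are illustrative assumptions.

def slices_per_frame(display_hz: int, content_hz: int = 60) -> int:
    """Each emulated frame splits into display_hz / content_hz rolling-scan slices."""
    assert display_hz % content_hz == 0, "cleanest when Hz is an integer multiple"
    return display_hz // content_hz

# More Hz -> smaller per-slice scan window -> closer to a true electron beam.
for hz in (120, 240, 480, 960):
    n = slices_per_frame(hz)
    print(f"{hz} Hz: {n} slices, each covering 1/{n} of the screen height")
```

At 240 Hz you get 4 coarse bands per emulated frame; at 960 Hz, 16 finer ones, which is why the simulation keeps improving as panels get faster.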
Again, remember, Right Tool For Right Job. Not everyone wants to emulate a CRT, but there are use cases where you
DO want to simulate a CRT both spatially
AND temporally. (Or even, only temporally).
A phosphor fade-behind rolling scan is the gentlest possible flicker for a given Hz.
So (if you’re stuck at low Hz) AND (you don’t want extra frames) THEN (rolling scan + fade logic) is the gentlest way to flicker at a specific “X Hz” in situations where you actually want to flicker.
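A minimal sketch of what “rolling scan + fade logic” could mean in code (all names and decay values are my own illustrative assumptions): each refresh lights one horizontal band at full brightness while previously scanned bands decay like phosphor, so some part of the screen is always emitting light:

```python
# Rolling-scan + phosphor-fade sketch (illustrative values, not a real shader):
# one band is freshly "scanned" per refresh; older bands fade exponentially.

def band_intensity(band: int, beam_band: int, num_bands: int,
                   decay: float = 0.5) -> float:
    """Intensity multiplier for a horizontal band, given the beam's current band.

    decay: fraction of brightness remaining after one band-time (phosphor fade).
    """
    age = (beam_band - band) % num_bands  # refreshes since the beam passed here
    return decay ** age                   # freshly scanned = 1.0, then fades

# One emulated frame at 4 slices (e.g. 240 Hz output for 60 Hz content):
num_bands = 4
for beam in range(num_bands):
    row = [round(band_intensity(b, beam, num_bands), 3) for b in range(num_bands)]
    print(f"refresh {beam}: {row}")
```

Because the decayed bands never hit zero, the light output is spread across the whole refresh instead of arriving as one square-wave pulse, which is the “gentler flicker” argument.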
Also, the dimness of BFI can be compensated by HDR nit surges at the small window sizes of a tight rolling scan. A 10,000-nit HDR display (I saw a prototype at CES 2020) can still do 500 nits at 1/20th persistence, which is great for simulating the electron beam dot (which is incredibly bright) at only a 5% window.
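The arithmetic behind that claim, as a tiny sketch (function name and numbers are illustrative only): average perceived brightness is roughly peak nits times the duty cycle of the lit window:

```python
# Back-of-envelope for the "HDR nit surge" point (illustrative numbers):
# average brightness ~= peak nits x fraction of time/area the window is lit.

def average_nits(peak_nits: float, duty_cycle: float) -> float:
    return peak_nits * duty_cycle

# A 10,000-nit panel flashing a 5% rolling-scan window (1/20th persistence)
# still averages roughly 500 nits, enough to offset classic BFI dimness.
print(average_nits(10_000, 1 / 20))
```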
Again, repeating: Right Tool For Right Job.
Sometimes the superior tool is blur-free sample-and-hold (1000fps+ at 1000Hz+), but it’s not always the Right Tool For The Right Job (e.g. faithful retro simulation).
I don't understand why we would try to use software to simulate a CRT TV on LCDs/OLEDs? I thought the two main things we just wanted to eliminate are motion blur and the "phantom array effect AKA wagon wheel effect".
Again, repeating: “Right Tool For Right Job”.
Sometimes you want to, and sometimes you don’t want to.
But if you had a display with a 1920Hz refresh rate (and a computer capable of playing a video game at 1920fps) you wouldn't have motion blur anymore because according to blurbusters motion blur is eliminated once you hit 1,000 frames per second. (And also according to blurbusters we will need a screen with a 10,000Hz refresh rate in order to eliminate the phantom array effect).
Emulators can’t do 1000 frames per second without violating faithfulness.
For some specific applications you don’t want interpolation: if you’re a human who’s specifically preservation-focused and not flicker-sensitive, but want a brighter BFI with gentler flicker than old-fashioned digital square-wave BFI. So if you asked “What’s the world’s best BFI algorithm?”, the answer is a CRT, or a perfect simulation thereof.
In other words, “My post is for people who want the world’s best BFI algorithm.” With that perspective, go back to the post and reread it; it’s for situations where other tools are unsuitable for a specific application’s needs. I was just confirming that that person is correct: display-algorithm simulators will eventually (10-20 years from now) become a popular substitute for a CRT purchase.
You’d still use the 1000fps 1000Hz mode to play your PC esports game like you describe, and then when you launch an emulator, you’d switch to the CRT simulator as a BFI far better than today’s.
Again, Right Tool For Right Job.
It’s not black and white.
What I'm saying is, what's all this talk about using software to simulate CRT TVs? I thought the point was about eliminating motion blur on flat panel TVs? Here's what I want: just eliminate the motion blur and the phantom array effect and I'll be happy. But I don't suppose we'll be getting LCDs or OLEDs with a 10,000Hz refresh rate for at least 50 years from now, correct?
Again, Right Tool For Right Job.
Don’t confuse the screwdriver with the hammer, metaphorically speaking…
You’re talking about a different legitimate tool than I am — they both co-exist on the same display and you can switch between display algorithms instantly.
Turn the CRT simulator mode (or plasma simulator mode) ON/OFF like turning BFI ON/OFF.
As a superior version of BFI for retro-friendly preservation of the 60 years of legacy 60fps 60Hz material, where you actually want to preserve the original CRT flicker, the original (low/zero) blur, the original phosphor decay, the original phantom array effect, etc.: all the original artifacts. Whether at home, in a museum, or in a MAME arcade cabinet.
You might wish to re-read it through a corrected lens:
https://www.neogaf.com/threads/old-...blur-as-crt-tvs.1593080/page-7#post-266446063
One moment, your display is simulating a CRT so perfectly (to human-vision margins) that it matches a Sony FW900 CRT tube. Even passing an A/B blind test behind a fake bezel! Including shadowmask/aperture-grille texture, fuzziness, resolution independence, brightness, phosphor ghosting, zero blur, etc. ALL of the attributes correctly simulated, spatially AND temporally, to human-retina league.
Next moment, your display is an ordinary PC 1000fps+ 1000Hz+* (choose any quadruple-digit number) blurless sample-and-hold display.
Note: *1000 Hz may not yet be enough to pass blind tests against a CRT; it may require, say, 4000+ Hz. However, 1000 should get pretty close, assuming extremely bright HDR pulses are available.
The same screen is thus capable of chameleoning into every display in humankind’s history.
Just look at all the CRT lovers here: they’ve already “Liked” my earlier post almost a dozen times. A true chameleon of a display, which becomes technologically possible once resolution AND refresh rate AND HDR are all simultaneously retina’d.
The higher the refresh rate (and the more “retina” the resolution and HDR is), the more likely a temporal-domain retro display simulator will pass an A/B blind test with the original display — i.e. passing an A/B blind test with a flat CRT tube versus a flat panel (behind an equally thick glass front layer, anyway).
And it can simulate an infinite number of displays on the same panel, at a moment’s notice. Like an infinite number of custom BFI modes.
TL;DR: Right Tool For Right Job