The question is whether or not we can have this beautiful future without terrible viewing angles. I don’t ever plan to get a TN panel again after getting IPS.

Quote:
Correct.
Actually the mouse cursor is less distracting + easier to see at higher Hz.
Photos don't always accurately portray what motion looks like to the human eye.
Put a finger up on your screen. Pretend your fingernail is a mouse pointer. That's what 1000fps+ 1000Hz+ mouse cursors look like -- virtually analog motion. Once the pixel step is 1 pixel, it doesn't stroboscopically step at all. A 2000 pixel/sec mouse cursor at 2000 Hz would be just a continuous blur.
Now, when you eye-track a moving mouse cursor (eye tracking is different from a stationary eye / stationary camera), it's so much clearer and easier to track: you aim the cursor at buttons sooner and click them faster. It's more of a pleasure to click on hyperlinks at 240 Hz....
But when you photograph (with a stationary camera) your moving finger, your fast-moving finger is just a very faint motion blur.
Again, photos don't always accurately portray what motion looks like to the human eye.
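A minimal sketch of the pixel-step arithmetic above, assuming the 2000 pixel/sec cursor from the post:

```python
# Pixel step per refresh for a moving mouse cursor:
# step (px) = speed (px/sec) / refresh rate (Hz)
def pixel_step(speed_px_per_sec: float, hz: float) -> float:
    return speed_px_per_sec / hz

for hz in (60, 240, 1000, 2000):
    print(f"{hz:>5} Hz -> {pixel_step(2000, hz):.2f} px per refresh")
# At 2000 Hz the step reaches 1 px -- the motion stops visibly stepping
# and looks virtually analog.
```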
Depends. I have a plasma and I worried a lot about this when I first got it. I put a lot of effort into varying content, letting it run for an hour or two with just dead-channel snow or random TV. I think it also has an option to shift the image around a few pixels at a time. I eventually stopped caring so much and am still using it twelve years later. I sometimes get stuck doing something and come back many hours later to a static menu. While there can be some image retention, if I play a movie for half an hour, it's gone.

Quote:
And they could also burn in, as many Destiny 1 players found out.
Could you add new videogame motion tests that imitate the camera panning of FPS and TPS games?

Quote:
Correct.
I think what I was saying is what you've said here:

Quote:
This is not true.

Quote:
the 1000 fps part seems silly to suggest when what you want is the response time of 1000 Hz.
If you had that, it probably wouldn't matter if frames are being doubled/interpolated or not.
Specifically through interpolation (or frame doubling with a sufficient framerate like 120 or 240), or intelligent frame creation (although I didn't write that).

Quote:
Adding extra frames does not have to add extra lag. You should check out Frame Rate Amplification Technology.
This also becomes a cheap lagless way to reach 1000fps by year 2030, while keeping Unreal 5 visuals.
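A hedged sketch of why conventional interpolation adds lag while frame generation need not (a toy model with assumed numbers; interpolation must buffer the next source frame, while reprojection-style extrapolation, as used in VR timewarp, does not):

```python
# Toy model: an interpolator must hold frame N until frame N+1 arrives
# before it can compute the in-between frame, so it buffers one full
# source frame of latency. Extrapolation/reprojection does not wait.
def interpolation_added_lag_ms(source_fps: float) -> float:
    return 1000.0 / source_fps  # time spent waiting for the next frame

print(interpolation_added_lag_ms(60))   # ~16.7 ms extra at a 60fps source
print(interpolation_added_lag_ms(240))  # ~4.2 ms extra at a 240fps source
# Reprojection-style frame generation works from the newest frame plus
# fresh input, adding roughly zero buffering delay.
```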
No, no. It was a metric a few years back; motion resolution on LCD/OLED spec sheets used to come up as 300p or 300 lines.

Quote:
When you say "lines of motion resolution" you're referring to frames per second, right? Remember, a lot of people here are casual people.
I have to point out that burn-in on a plasma is very different than on an OLED, mind you. Way less definitive.

Quote:
And they could also burn in, as many Destiny 1 players found out.
Samsung made some models that tried to “flatten” the tube in the early 2000s. They shaved some centimeters off the TV, but at the price of terrible, terrible distortion. I desperately wanted one until I saw one in person. Straight lines distorted like they do with a cheap camera lens.

Quote:
Does anyone knowledgeable on the subject know if it is technically possible to make a CRT that didn't require so much depth?
I’ve wondered about this too, but there are plenty of reasons why the whole thing would be unfeasible.

Quote:
How feasible is it for a start-up boutique builder to start making custom small-batch modern CRTs for gaming? I've always wondered why someone doesn't do that. I mean, I'm guessing it'd be insanely expensive to get started with R&D and fabrication facilities, but the demand among gamers is definitely there, particularly as of late.
A modern CRT would never be curved outward and stuck at 480i at 14", just like LCDs are no longer passive-matrix, 10", and limited to 480p these days.

Quote:
The great majority of them during the golden age of CRTs were standard definition. We are at 4K and dabbling with 8K; I don't understand why you'd want to go back to a deformed, small, and very heavy display.
Philips had better success, although granted, geometry always tended to suffer a bit with flat CRT displays too.

Quote:
Samsung made some models that tried to “flatten” the tube in the early 2000s. They shaved some centimeters off the TV, but at the price of terrible, terrible distortion. I desperately wanted one until I saw one in person. Straight lines distorted like they do with a cheap camera lens.
Why is that? It seems very counterintuitive; can you ELI5 (explain like I'm five)? Strobing at 120fps on a 240Hz screen looks better than strobing at 120fps on a 120Hz screen?

Quote:
For example, 120fps@120Hz strobe on a 240Hz-capable LCD can be more easily engineered to look much better than 120Hz strobe on 120Hz-only LCDs, or 240Hz strobe on 240Hz-only LCDs. Basically, buying too much Hz, then intentionally running the monitor at a lower Hz, and enabling the strobe mode on it. This is basically a lot of laws of physics interacting with each other.
My Sony A1 dims the hell out of the screen with BFI on...

Quote:
BFI implementation differs with each company / display. On monitors, yes, it does suck. On Sony and Panasonic televisions, it's a godsend. You also don't need extra brightness unless it's an HDR source.
This
I use a 1080p plasma for PC gaming as much as I can -- still less motion blur than either a 200 or 240 Hz monitor.
I have mild tinnitus so I basically hear this noise freq all the time in silence. Or maybe there’s just a huge CRT monitor stalking me.

Quote:
Hate the PS5 coil whine? Wait until you get a dose of that wonderful, persistent 15.625 kHz whine that NTSC CRT displays have!
They want to sell screens.

Quote:
Also, are the executives at Samsung, Sony, LG, and TCL aware of this? Are they aware of the horrible motion blur LCDs have, and that there are solutions to fix it, namely the solutions Blur Busters has come up with? That's why I started this thread; I'm trying to raise awareness about this!
It's accurate. I had a Dell 165Hz monitor. The mouse cursor was pretty distracting compared to my 75Hz monitor, and I returned the Dell for that reason. It's similar to turning on the 'cursor trail' feature in Windows.

Quote:
Correct.
Thank fucking god we moved away from CRTs. They were super low res (higher res ones cost a fortune) and worst of all they destroyed your eyes if you used them for any significant time. My eyes would tear up if I gamed or used the PC too much in the CRT days, now with LCD it's a non-issue.
At this time, it would probably be cheaper to create a 1000 Hz LCD/OLED just to emulate a low-Hz CRT.

Quote:
How feasible is it for a start-up boutique builder to start making custom small-batch modern CRTs for gaming? I've always wondered why someone doesn't do that. I mean, I'm guessing it'd be insanely expensive to get started with R&D and fabrication facilities, but the demand among gamers is definitely there, particularly as of late.
First, take a look at testufo.com/persistence and testufo.com/eyetracking.

Quote:
I thought that, from the get-go, motion blur is some sort of natural effect in visual perception beyond display technology. I mean, go outside and look at any object moving at a reasonable speed, and you will realize you don't see it in 100% clarity during its trajectory; the object sort of... well... blurs in your vision. "Per-object" motion blur in media is supposed to imitate that effect while we're watching TV, so the choppiness that we would otherwise perceive is avoided or mitigated.
Considering that, I don't know if minimizing this effect, or even getting completely rid of it in TVs, would produce a natural picture, which is precisely what we're supposed to aim at.
Where do you see the future of the TV industry going? Don't you agree with me that they should stop at 4K and not proceed to 8K, and focus instead on 4K screens with higher refresh rates? We need 4K 240Hz as soon as possible. Also, are there any problems with 8K or higher resolutions at 1,000fps@1,000Hz, like stuttering or anything?
Also, are the executives at Samsung, Sony, LG, and TCL aware of this? Are they aware of the horrible motion blur LCDs have, and that there are solutions to fix it, namely the solutions Blur Busters has come up with? That's why I started this thread; I'm trying to raise awareness about this!
CRT also has very good viewing angles, no bleed or glow, and OLED-like contrast.

Quote:
I'm happy with the lower power consumption and lower weight, given that the only trade-off is motion blur.
I enjoy BFI on the monitors I've used it on, and if they could improve that technology (brightness counteraction, etc) and make it compatible with a wider range of refresh rates and VRR tech then it will definitely be good enough for all but the smallest niche.
On any decent CRT display you get numerous options in the OSD to correct and change geometry. Many higher-end ones even have service menus, which you can often get into with some combination of monitor or remote button presses, that offer even more options.

Quote:
How about geometry issues?
Agreed -- with a big "BUT".

Quote:
People should play games instead of focusing on every technical imperfection in the specs because... NOTHING IS PERFECT.
Yeah, LCDs are not perfect, so what? CRTs weren't either. But all were and are good enough in their respective eras.
I can't stop laughing at those streamers/people with an fps counter in the corner, like "if you really need this to notice 60fps, why do you even care in the first place?".
OCD will never die. ENJOY. YOUR. GAMES.
Excellent Science Question: Why is low-Hz strobing superior on high-Hz LCD?
It is only counterintuitive because of LCD.
LCDs have a finite pixel response time called "GtG", which is Grey-to-Grey.
It's how an LCD pixel "fades" from one color to the next.
You can see LCD GtG in high speed video: High Speed Video of LCD Refreshing
Strobing at max Hz means you don't have enough time to hide LCD GtG between refresh cycles.
Not all pixels refresh at the same time. A 240Hz display takes 1/240sec to refresh from the first pixel to the last pixel. GtG pixel transition time is additional, above and beyond that!
You want LCD GtG to occur hidden, unseen by human eyes. See High Speed Video of LightBoost to understand.
Most GtG is measured from the 10% to 90% point of the curve, as explained at blurbusters.com/gtg-vs-mprt.
But that means the 10% point of a black-to-white GtG is still a dark grey, and the 90% point is a very, very light grey. This can create strobe crosstalk (duplicate images trailing behind moving objects).
Slow GtG can amplify strobe crosstalk, and we need GtG to reach nearly 100% really fast. A "1ms GtG" measured at 90% often takes more than 10ms to complete 99% or 100% of the transition.
Pixel colors coast very slowly to their final values near the end, like a ball rolling slowly at the end of its journey -- and that lingering tail is a strobe crosstalk problem.
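To see why the last few percent take so long, here is a toy first-order (exponential) model of a pixel transition; a sketch only, since real overdriven LCD transitions settle even more slowly than this idealized curve:

```python
import math

# First-order model: pixel value follows v(t) = 1 - exp(-t / tau).
# If the advertised "1ms GtG" is the 10%->90% time, tau = 1ms / ln(9).
t10_90_ms = 1.0
tau = t10_90_ms / math.log(9)        # ~0.455 ms
print(f"99% complete at ~{tau * math.log(100):.2f} ms")     # ~2.10 ms
print(f"99.9% complete at ~{tau * math.log(1000):.2f} ms")  # ~3.14 ms
# Real panels are not single-exponential: measured 99-100% completion
# can take 10ms+, and that lingering tail is what strobing exposes.
```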
More advanced reading about this can be found in Advanced Strobe Crosstalk FAQ, as well as Electronics Hacking: Creating A Strobe Backlight. (Optional reading, not germane to the ELI5, but useful if you want to read more.)
Now, imagine a 240Hz IPS LCD. It has an advertised 1ms GtG, but its real-world GtG is closer to about 5ms for most color combinations. GtG speeds can be different for different colors, so you've got a "weakest link of the chain" problem!
Example 1: Strobe 240Hz LCD at 240Hz
1/240sec = 4.166 milliseconds.
.....Thus, repeat every refresh cycle:
T+0ms: Backlight is OFF
T+0ms: Begin Refreshing
T+4ms: End Refreshing (240Hz refresh cycle takes 1/240sec)
T+4ms: Wait for GtG to Finish before turning on backlight (ouch, not much time)
T+4.1ms: Backlight is ON
T+4.166ms: Backlight is OFF
.....Rinse and repeat.
Problems:
- Not enough time for GtG to finish between refresh cycles
- Not enough time to flash long enough for a bright strobe backlight
- We lose lots in strobe quality
Example 2: Configure 240Hz LCD to 100Hz and strobe at 100Hz
1/240sec = 4.166 milliseconds.
1/100sec = 10 milliseconds
T+0ms: Backlight is OFF
T+0ms: Begin Refreshing
T+4ms: End Refreshing (100Hz refresh cycle scanned out in approx 1/240sec, thanks to the fast panel)
T+4ms: Wait for GtG to finish
T+9ms: Finally finished waiting for GtG (5ms in total darkness)
T+9ms: Backlight is ON
T+10ms: Backlight is OFF
.....Rinse and repeat.
Voila! We win!
- LCD GtG completely hidden by human eyes (in the backlight OFF period)
- Requires LCD that supports quick scanout at lower refresh rates (most do, if manufacturers bother)
- Only fully-refreshed refresh cycles are seen by human eyes (more perfectly complete GtG).
- Strobe crosstalk goes to zero (or almost zero)
Internally we call it the “Large Vertical Total” technique (large Vertical Blanking Interval aka VBI) because it's a large interval BETWEEN refresh cycles. The interval/pause between refresh cycles (VBI) can be several times longer (in milliseconds) than the visible refresh cycle, to allow more time for GtG to complete between refresh cycles!
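A rough timing-budget calculator for the two examples above (a sketch with assumed numbers, presuming the panel always scans out at its max-Hz speed even when run at a lower refresh rate):

```python
# The dark pause between refresh cycles (the VBI) must be long enough
# to hide GtG settling plus the backlight flash.
def strobe_budget_ms(strobe_hz, panel_max_hz, gtg_settle_ms, flash_ms=1.0):
    refresh_period = 1000.0 / strobe_hz     # total time per refresh cycle
    scanout = 1000.0 / panel_max_hz         # time to paint top-to-bottom
    vbi = refresh_period - scanout          # pause between refresh cycles
    slack = vbi - gtg_settle_ms - flash_ms  # negative = visible crosstalk
    return vbi, slack

# Example 1: 240Hz strobe on a 240Hz panel, ~5ms real-world GtG:
print(strobe_budget_ms(240, 240, 5.0))  # VBI ~0 ms, slack -6 ms: crosstalk
# Example 2: 100Hz strobe on the same 240Hz-capable panel:
print(strobe_budget_ms(100, 240, 5.0))  # VBI ~5.8 ms, slack ~0: near-clean
```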
Here is a before/after comparison for www.testufo.com/crosstalk
<Appendix: Advanced Optional Reading>
Sometimes GtG is not even 100% complete between refresh cycles, but the worst GtG incompletions can often be “pushed” into the top/bottom edges of the screen via special strobe phase timing delays relative to the LCD refresh cycles. So 7ms GtG can still be mostly hidden by a 5ms VBI, with only small leakage into visibility. The backlight timing phase is therefore sometimes slightly overlapped with the refresh cycle to account for the GtG lag-behind effects, measured using a photodiode oscilloscope.
Reminder: Not all pixels refresh at the same time. There aren't millions of wires to refresh all pixels simultaneously. Digital panels use wire grids: activate one vertical wire and one horizontal wire to refresh essentially one pixel at a time (where the wires meet) -- in ultra-high-speed fashion, left to right, top to bottom.
Displays have been raster-refreshing for a century, from the first analog TVs of the 1920s through today's 2020s DisplayPort LCDs. Like reading a book -- you start at the upper-left corner, scan towards the right, then go to the next row. This is how two-dimensional images (refresh cycles) are delivered sequentially over a one-dimensional medium (an analog or digital video cable, a TV radio channel, or a display's own refreshing electronics). A refresh cycle is metaphorically like building mosaic art one square at a time, repeated every single refresh cycle.
You can see that most displays refresh roughly one pixel at a time in high speed videos. Otherwise, we'd end up needing millions of miles of wires just to refresh all pixels of an 8K display simultaneously. Even "global" refresh displays (plasmas, DLPs) still sequential-refresh, just with ultrafast sequential scanouts (e.g. 1/1000sec for a DLP chip). Either way, to save money and engineering, thin digital displays only have wire grids, and they essentially (more or less) refresh one pixel row at a time. It takes time to refresh from the first pixel through the last pixel.
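A quick sketch of that top-to-bottom scanout timing (toy model: uniform scan speed, blanking porches ignored):

```python
# Time at which a given row is refreshed during a 1/240sec scanout.
def row_refresh_time_ms(row: int, total_rows: int = 1080,
                        scanout_ms: float = 4.166) -> float:
    return (row / total_rows) * scanout_ms

print(row_refresh_time_ms(0))     # top row:    0.00 ms into the scanout
print(row_refresh_time_ms(540))   # middle row: ~2.08 ms
print(row_refresh_time_ms(1079))  # bottom row: ~4.16 ms
```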
Good strobing (during a long VBI that allows GtG to finish unseen by eyes) lets you hide slow LCD GtG (in total darkness with the backlight OFF) while keeping a short MPRT (the length of the strobe flash), allowing certain LCDs, such as the Oculus Quest and Valve Index VR headsets, to have less motion blur than an average CRT.
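Persistence converts to blur with simple arithmetic -- the Blur Busters rule of thumb that 1ms of persistence equals 1 pixel of eye-tracked motion blur per 1000 pixels/sec of motion:

```python
# Eye-tracked motion blur (px) = speed (px/sec) * persistence (sec).
def blur_px(speed_px_per_sec: float, persistence_ms: float) -> float:
    return speed_px_per_sec * persistence_ms / 1000.0

print(blur_px(1000, 16.7))  # 60Hz sample-and-hold: ~16.7 px of blur
print(blur_px(1000, 1.0))   # 1 ms strobe flash (MPRT): ~1 px of blur
print(blur_px(1000, 0.3))   # ~0.3 ms VR-style flash: CRT-beating clarity
```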
Few people realize that a cherrypicked well-engineered strobed LCD can beat CRT in motion clarity (zero ghosting, zero blur, zero strobe crosstalk, zero afterimages, perfect clarity, no phosphor ghosting), especially during perfect framerate=Hz (VSYNC ON or similar technologies).
Nonetheless, it is my belief that users should have the choice of strobing at max Hz (lower lag but lower quality than CRT), or well-tuned lower-Hz strobing on a high-Hz panel. I'd prefer manufacturers uncap the arbitrary strobe refresh rate presets/ranges, and let users choose.
</Appendix: Advanced Optional Reading>
Hope this helps explain (in sufficiently simple science) why refresh rate headroom is very good for LCD strobe backlights.
TL;DR: Refresh rate headroom gives more time for LCD GtG to complete between refresh cycles at a lower strobe Hz. This allows strobing to get MUCH closer to CRT motion clarity.
You not only need a display with a 1,000Hz refresh rate, but you need to play games or movies at 1,000 frames per second in order to see less motion blur.
Oh, don't worry about it that much. After age 30 or so you won't be able to hear it anymore.

Quote:
Hate the PS5 coil whine? Wait until you get a dose of that wonderful, persistent 15.625 kHz whine that NTSC CRT displays have!
I watched Mad Max: Fury Road in 3D on my Quest 2 and it’s a better experience than the theatre.

Quote:
Agreed -- with a big "BUT".
Many people should just focus on the game rather than the details. But.
1. Have you tried a recent VR headset? (2020 or newer, with a good VR LCD.) Display motion blur is 100x+ more nauseating/headache-inducing in virtual reality, so blur reduction is sometimes necessary in some display technologies. What looks fine on a small TV from a distance can be motion nausea in VR. The breakthroughs that perfectly zero out LCD motion blur (better than CRT) in some 2020-and-newer LCD VR headsets made them more comfortable than movie glasses (RealD 3D, Disney 3D), and modern VR is now more comfortable than cinema 3D.
2. Also, some people are very flicker sensitive, while other people are very motion blur sensitive (headaches from motion blur!!). Everybody sees differently. 12% of the population is color blind. Not everyone wears the same eyeglasses prescription. Even "motion blindness" exists (akinetopsia). Some people metaphorically have a "dyslexia equivalent" for certain motion that makes them more stutter sensitive (they can't see game motion well because of stutter, which creates headaches that don't bother you). One display motion artifact that is faint to you can be a bleeping migraine beacon to the next individual. Displays don't correctly emulate a view of real life, and there is a lot that prevents some people from being able to play comfortably. So give up "visionsplaining" other people about things that don't bother you.
TL;DR: What doesn't bother you, can sometimes be a big problem to the other person.
You would need a multisync monitor that can handle 15kHz scan rates. Almost any more modern CRT monitor won't do that, as they expect 640x480 and up.

Quote:
I currently have a laptop connected to a ViewSonic GS815. Using RetroArch, retro gaming on a CRT is nice. Not quite the 240p experience (I’m having difficulties forcing 240p on the monitor), but the input response, smooth motion and black level is nice. I don’t have to worry about pixel-perfect aspect ratios either.
Has anyone had any luck with forcing 240p on a CRT monitor? My laptop has an Nvidia 950m and integrated Intel graphics. The only way I can change resolution is via the Intel settings menu, so that might be the limiting factor, but apparently the 950m is being used as the GPU for games. Any tips would be appreciated.
Welcome to Samsung's shitty PWM (pulse-width modulation) implementation. They basically flicker the backlight at a constant 120Hz regardless of picture mode or source refresh rate. This leads to 2 flickers per frame at 60Hz, thus giving you double images when things pan on screen. This is one of the biggest reasons I booted the 3 trial Samsungs I went through before settling on a Sony. Samsung does increase the PWM rate, to around 960Hz which virtually eliminates image doubling, in Movie mode ONLY though, which is baffling as they could keep it that way in all modes. And Sony did use to do 720Hz in all modes, but only supports it in Graphics and Game modes on the newer models. This is utter bullshit from Samsung and a bad trend for Sony; it's likely laziness or cost-cutting, and unfortunately few reviewers and consumers are educated enough about the use scenarios where it crops up. It is, however, very visible when scrolling text, or when panning the camera in games with graphics containing thin lines where there is no in-game motion blur applied to the camera.

Quote:
I have a Samsung The Frame 32 inches (small size, as I game on a small TV). Still the best TV I found in this size.
There is a very noticeable doubling of the picture when scrolling fast. My previous TV was a Sony KDL (same size), it was extremely blurry for fast scrolling. Despite me disabling all the features etc...
These HD TVs are a complete joke for video-games. They are ok for watching movies though.
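The image-doubling arithmetic described above is straightforward; a small sketch with the numbers from that post:

```python
# A backlight flickering at a fixed PWM rate re-illuminates each content
# frame several times; during eye tracking, every flash paints a copy.
def duplicate_images(pwm_hz: float, content_fps: float) -> float:
    return pwm_hz / content_fps

print(duplicate_images(120, 60))  # 2.0  -> visible double image
print(duplicate_images(960, 60))  # 16.0 -> copies merge into smooth blur
```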
svbarnard said:
You not only need a display with a 1,000Hz refresh rate, but you need to play games or movies at 1,000 frames per second in order to see less motion blur.
Basically, you won't see much benefit to motion clarity if you don't increase the frame rate of the game or movie as well. A 24fps movie or a 60fps game will look just as blurry to the eye regardless of whether the display is updating at 120Hz or 1000Hz, because the frames will just be doubled to fill out the extra Hz. So when your eyes track motion on the screen, the same frame will still stay in the same place for the same duration, causing the same on-retina motion blur as a lower-Hz screen.
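A small sketch of that point: with repeated/doubled frames on a sample-and-hold display, each unique frame's on-screen time (and therefore the eye-tracked blur) depends only on the content frame rate, not the display Hz:

```python
# On-screen time of each UNIQUE frame when low-fps content is shown on
# a high-Hz sample-and-hold display by repeating frames.
def unique_frame_persistence_ms(content_fps: float, display_hz: float) -> float:
    repeats = display_hz / content_fps      # e.g. 60fps @ 1000Hz: ~16.7x
    return repeats * (1000.0 / display_hz)  # simplifies to 1000/content_fps

print(unique_frame_persistence_ms(60, 120))    # 16.7 ms of persistence
print(unique_frame_persistence_ms(60, 1000))   # 16.7 ms -> same blur
print(unique_frame_persistence_ms(1000, 1000)) # 1.0 ms -> needs real frames
```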
Yes, it's impressive how they've made VR headsets more comfortable to view than cinema 3D glasses. Less blur than a plasma, less blur than a CRT, less blur than a 35mm projector. And far less nauseating 3D.

Quote:
I watched Mad Max: Fury Road in 3D on my Quest 2 and it’s a better experience than the theatre.
Hello, Mr. Rejhon. I have two more questions for you.
1. So where do you see the TV industry going from here? In my personal opinion, I wish the TV industry would STOP at 4K and NOT proceed to 8K, but that's not happening; I wish they would just stick with 4K and focus on higher refresh rates. Do you see the TV industry going to 16K eventually? When do you think the TV industry will realize the seriousness of motion blur on flat panel displays and try to fix it?
2. In your article "The Amazing Journey to 1,000Hz Displays" you mentioned something about stuttering at 8K at 1,000Hz? What's the difference between 4K@1,000Hz and 8K@1,000Hz? So you're saying there are other problems you run into at ultra-high resolutions?
OLEDs are beautiful in SDR and can give you a near-perfect image in theory. If manufacturers and consumers didn't care about burn-in, we would easily have 2000+ nit HDR-capable screens with no Automatic Brightness Limiter or static logo dimming. And Panasonic has almost eliminated near-black gamma issues and flickering. You could likely also do bright BFI if they just pushed the pixels harder in between the black screens. But as it stands, with those pixels being organic, they age too quickly, so we are stuck with LCD for really bright HDR, and for desktop work where you have mostly static content, for a bit longer. MiniLED LCD and eventually microLED or QD-LEDs are going to replace OLED.

Quote:
I'm going to say first that the image quality of an OLED panel continues to blow my shit away. It's the best looking panel technology my lizard brain has ever seen.
I do miss the motion clarity of CRTs. Always been a proponent of 120hz, or at the very least 60hz, on EVERYTHING. Games, movies, phones, doesn't matter. It's a shame that movies will never go there except in niche situations, and games will always strive for fidelity over smoothness the further along the hardware gets.
I noticed that too in The Hobbit. Galadriel was doing this slow, pronounced, fake slow-mo walk that would have looked cool at 24fps but looked ridiculous at 48fps. I liked the higher frame rate regardless, but...

Quote:
OLEDs are beautiful in SDR and can give you a near-perfect image in theory. If manufacturers and consumers didn't care about burn-in, we would easily have 2000+ nit HDR-capable screens with no Automatic Brightness Limiter or static logo dimming. And Panasonic has almost eliminated near-black gamma issues and flickering. You could likely also do bright BFI if they just pushed the pixels harder in between the black screens. But as it stands, with those pixels being organic, they age too quickly, so we are stuck with LCD for really bright HDR, and for desktop work where you have mostly static content, for a bit longer. MiniLED LCD and eventually microLED or QD-LEDs are going to replace OLED.
Regarding frame rates, I'd love to see more movies going for 120Hz. I can't watch anything without motion smoothing these days, as any 24fps motion on screen just looks blurry. I have a Sony, so the algorithm is really good and doesn't look too unnatural. I saw Gemini Man at 60fps, and you get used to it quickly, but I can also understand some of the arguments against it, such as things looking too real, LOL. It puts a really high bar on makeup, VFX, lighting, acting, etc. that's hidden behind low frame rate and motion blur.
Interestingly, I think overacting looks really weird at higher fps; it's the same way it would look if your friend started overacting next to you in real life. So I think in those scenarios actors need to dial it back or act more realistically for it to look believable at higher fps.
Sample-and-hold 48fps "HFR" is nauseating to me.

Quote:
I noticed that too in The Hobbit. Galadriel was doing this slow, pronounced, fake slow-mo walk that would have looked cool at 24fps but looked ridiculous at 48fps. I liked the higher frame rate regardless, but...
Plasma motion is the next best thing to CRT. I've still got a calibrated Panasonic as my main display.
In year 1999, I got some training from an ISF technician and learned to calibrate CRT projectors back in my twenties. Zone convergence, astig, keystone, bow, pincushion, etc. And those were steps I had to do before the color work.

Quote:
You and me both
I will try playing in Movie mode to see if there is any difference. I suppose input lag will be one...

Quote:
Welcome to Samsung's shitty PWM (pulse-width modulation) implementation. They basically flicker the backlight at a constant 120Hz regardless of picture mode or source refresh rate. This leads to 2 flickers per frame at 60Hz, thus giving you double images when things pan on screen. This is one of the biggest reasons I booted the 3 trial Samsungs I went through before settling on a Sony. Samsung does increase the PWM rate, to around 960Hz which virtually eliminates image doubling, in Movie mode ONLY though, which is baffling as they could keep it that way in all modes. And Sony did use to do 720Hz in all modes, but only supports it in Graphics and Game modes on the newer models. This is utter bullshit from Samsung and a bad trend for Sony; it's likely laziness or cost-cutting, and unfortunately few reviewers and consumers are educated enough about the use scenarios where it crops up. It is, however, very visible when scrolling text, or when panning the camera in games with graphics containing thin lines where there is no in-game motion blur applied to the camera.