
You can play all the games you want at 60 FPS on Bravia TV if you want

DeepEnigma

Gold Member
If you were on an island by yourself, and blue waffles were the only consumable thing.... Would you rather eat it, or experience the rest of your life with motion interpolation?

Well, I die.
 

Knightime_X

Member
Do all of that and try to play fighting games online.
Or NES games that require timing precision.

Realize what grave choices you've made, then proceed to bury the thought forever.
 

rofif

Can’t Git Gud
I mean, I love [good] motion blur, and even film grain if it helps with banding.
So maybe I'm not one to criticize, but this is on another level :p
 

onesvenus

Member
I'm totally ignorant about TV tech. About console hardware tech, well, we can discuss whatever you want and see.

I know that a while ago some developers tried interpolation (or something similar) in a Star Wars game on PS360 to fake 60 fps; honestly, I don't see the connection between ignorance about TV motionflow and the whole console hardware argument, frankly. I mean, there are aspects of the tech we can be ignorant of, it's a huge, vast subject, but that doesn't imply being totally ignorant about the whole argument. Otherwise there wouldn't be dedicated developers for different parts of the graphics pipeline. It's like medicine, with its different specializations.
I'd argue that you'd be hard pressed to find someone who works in rendering, on either the hardware or the software side, who doesn't know what motion interpolation is and how it works.
It's a vast field of knowledge, you're right, but there are some basics that need to be known before being able to talk about other things.
 

JT14

Banned
This gotta be a troll post, unless he likes horrible input lag. I only use true motion for movies, nothing else
 

DonkeyPunchJr

World’s Biggest Weeb
Is there some kind of box that can apply this to the video signal? I’m thinking it would be amazing to upscale the video output to 60FPS, then have the TV interpolate that to 120 FPS. Someday!
 

Shtef

Member
This thread will go down in history like the “polish car wash” thread.

 

AGRacing

Gold Member
So I'm totally ignorant because I genuinely admitted to never having followed TV tech (never interested at all, mea culpa) and I posted a naive, enthusiastic thread? Jesus Christ, people, you're laughable, just out to stir up a good member war :messenger_tears_of_joy: But go on, I don't want to ruin your fun.
You seem like a nice guy, buddy. Don't take it personally. You learned something important in the world of gaming technology today. Chalk that up as a win.
 

Rentahamster

Rodent Whores
Can’t do it due to lag; there’s a reason game mode exists. Some games and gamers can obviously adapt better than others.

What’s puzzling is that interpolation isn’t an option in game engines. TVs only work on flat frames to generate a midpoint, whereas a game could use multiple data points: rotation, velocity vectors, etc. A game could achieve a much better result and optimise latency.
The technology exists, but no one uses it.
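Roughly what that engine-side interpolation could look like per pixel, as a minimal hypothetical sketch in C++ (not anyone's actual implementation; `Color`, `Vec2`, `fetch` and `interpolateHalfFrame` are made-up names): warp the previous frame along half of each pixel's motion vector to approximate the in-between image.

```cpp
// Hypothetical sketch: build an in-between frame by warping the previous frame
// along the engine's per-pixel velocity buffer. velocity[p] is assumed to store
// how far, in pixels, the surface visible at p has moved since the previous frame.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

struct Color { std::uint8_t r, g, b; };
struct Vec2  { float x, y; };

// Clamp-and-fetch so warps that land off-screen stay valid.
static Color fetch(const std::vector<Color>& img, int w, int h, int x, int y)
{
    x = std::clamp(x, 0, w - 1);
    y = std::clamp(y, 0, h - 1);
    return img[y * w + x];
}

// Backward-warp the previous frame by half of each pixel's motion to
// approximate the image halfway between the previous and the next frame.
std::vector<Color> interpolateHalfFrame(const std::vector<Color>& prevFrame,
                                        const std::vector<Vec2>&  velocity,
                                        int w, int h)
{
    std::vector<Color> midFrame(w * h);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            const Vec2 v = velocity[y * w + x];
            // Half a frame ago, this surface point was roughly half its
            // per-frame motion back along the velocity vector.
            const int sx = static_cast<int>(std::lround(x - 0.5f * v.x));
            const int sy = static_cast<int>(std::lround(y - 0.5f * v.y));
            midFrame[y * w + x] = fetch(prevFrame, w, h, sx, sy);
        }
    }
    return midFrame;
}
```

On a GPU this would be a trivially parallel full-screen pass, and a real version would still have to handle disocclusions, transparency and the HUD separately. Even this naive version shows the engine's advantage over the TV, though: it reads the true motion vectors instead of estimating them from two flat images.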


At the recent SIGGRAPH 2010, LucasArts coder Dmitry Andreev showed off a quite remarkable tech demo based on work he carried out during the development of Star Wars: The Force Unleashed II. In a video demonstration running on Xbox 360, he showed the game operating at its default 30FPS, but then seemingly magically running at 60FPS - with no apparent graphical compromises aside from the removal of motion blur.

Andreev first got the idea for the technique by studying 120Hz TVs that interpolate two frames in order to produce an intermediate image, producing a smoother picture. Software filters on some media players (for example Philips' Trimension as seen on the WinDVD player) were also considered. If this approach could be replicated within the game engine, an effect far more pleasing than most motion blur algorithms could be produced. Discussions after SIGGRAPH 2008 soon led to prototyping.

"So as soon as I got back home, I started to play with it and soon after that realised that there are a lot of issues," Andreev reveals.

"Mostly the artifacts of a different kind, that appear in more or less complex scenes, as well as performance issues (it is really slow when done properly). And to better understand the problem, I made a very quick and simple prototype to play with."

"We already know how things are moving as we have full control over them. This way we don't need to do any kind of estimation," Andreev says.

"Moreover, when we do the interpolation, we can handle different things differently, depending on the kind of quality we are happy with. On top of that, we can use different interpolation techniques for different parts of the image, such as layers of transparency, shadows, reflections and even entire characters."

The key is to re-use as much of the available processing as possible. In the case of Andreev's demo, the depth buffer and velocity map for the next full frame are generated, but directly after this, midway through the processing, this data, combined with elements from the last frame, is used to interpolate the intermediate image before calculations on the next real frame continue.

You'd think that this technique would cause lag, but as the interpolated image is being generated using elements from the next "real" frame, it actually reduces latency. Andreev's technique is single-frame based rather than dual-frame. The latter approach would require buffering two images so has a big memory and latency overhead, while the technique Andreev used effectively interpolates on the fly using past and future rendering elements.

"The most simple and efficient solution is to do the interpolation in-place, during the current frame, while the previous one is on screen. This way the previous front buffer can be mapped as a texture (on Xbox 360) and used for interpolation directly," he explains.

"In terms of latency there is something interesting going on. I said that there is no extra latency, which is true. But if you think about it, latency is actually reduced because we get the new visual result 16.6 ms earlier. You see the result of your actions earlier."



 

Rentahamster

Rodent Whores
If our TVs were powerful enough to do this in real time without noticeable lag, that would be amazing, but unfortunately, we are not at that level yet.

 

Great Hair

Banned
watch it at 0.25 to 0.50 speed

The left looks way nicer, is that what OP is talking about?
Yes, I think so. Nothing to do with 30 to 60 fps. I tried it on an LG LCD, it was crap.

Sansuns, Feelips TVs are much better though. If the game has severe frame pacing issues, you will still notice them ... just moving the camera will "feel" smoother, as seen in HZD (distant objects don't stutter when you move the camera).

Enabling locked 120 Hz at the system level on PS5 might improve this even further.

30 fps + 120 Hz + natural motion interpolation = better? Who knew we had VRR for years ... :p partially joking
 

NeoIkaruGAF

Gold Member
And this is why there is no place like NeoGAF.


This thread will go down in history like the “polish car wash” thread.

OMG :messenger_tears_of_joy::messenger_tears_of_joy:
 
If our TVs were powerful enough to do this in real time without noticeable lag, that would be amazing, but unfortunately, we are not at that level yet.





I time-stamped why AI interpolation is still shit for most applications and why it will be a very long time before something with the computational power of a console could do a decent job of it.

Watch the arms on the astronaut in front. They completely disappear multiple times. Hilarious that the guy making the video chose that moment to say "I would have a hard time telling them apart from the AI-generated ones". Maybe the frames where the arms are gone would be a good starting point.
 