
AMD Oberon PlayStation 5 SoC Die Delidded and Pictured

rnlval

Member
Do these benchmarks use the same clocks? The PS5's CPU tops at 3.5GHz whereas the 4750G goes all the way up to 4.4GHz.




The Zen3 APUs also double the L3 cache, have better branching prediction, larger L1 + op caches with higher bandwidth and a bunch of other upgrades. The higher performance isn't just coming from the unified L3.


The Ryzen 7 4750G Pro is a Zen 2 CPU-based APU and it has beaten the AMD 4700S with its higher boost clock. I'm already aware that the 4700S itself has a higher boost clock than the PS5.

The 4700S shows reduced AVX/FPU resources compared to the Ryzen 7 4750G Pro.
 

rnlval

Member
Whatever is in the PS5 the console has already proven that its plenty powerful enough to deliver bleeding edge games. It's going to be an amazing Gen for Sony games.
Comparison with the Jaguar CPU is a very low bar to beat. Even Zen 1 beats the very low-cost mobile/embedded Jaguar CPU, let alone the PS5's cut-down Zen 2**.

**FPU resource reduced.
 

rnlval

Member
Obviously, the 4700S's CPU clocks are set lower than the 4750G's.
The 4700S's CPU has a 4 GHz boost mode, and since the 4700S was re-targeted for the PC market it is not bound by the PS5's shared GPU TDP limits.
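
As a rough sketch using only the boost clocks quoted in this thread (an estimate, not a measured result), clock speed alone predicts only about a 10% gap between the two:

```python
# Back-of-the-envelope clock scaling from the boost clocks quoted in this
# thread (4750G: 4.4 GHz, 4700S: 4.0 GHz, PS5 CPU: 3.5 GHz).
print(4.4 / 4.0)   # ~1.10 -> ~10% expected for the 4750G over the 4700S from clocks alone
print(4.0 / 3.5)   # ~1.14 -> headroom the 4700S boost has over the PS5's 3.5 GHz cap
# Any gap larger than that points to something other than clocks,
# e.g. the reduced AVX/FPU resources mentioned earlier.
```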

Cinebench R15 and R20 benchmarks for Ryzen 7 4800H and 4800HS from https://www.techspot.com/review/2018-amd-ryzen-4800h/

For Cinebench R15, there's a noticeable difference between the 4700S's "Zen 2" CPU and the Ryzen 7 4800H's and 4800HS's "Zen 2" CPUs.

 

azertydu91

Hard to Kill
Yes, it's kind of like when Blue said the PS5 had no hardware-based RT. And then we see the results that prove that it does. Exactly the same situation here.

People wanting to know what the PS5 is capable of only need to look at the games. While I do believe improvements will come, I don't believe either system has secret tech that will massively improve the level of performance with the flip of a switch. That's Blue guy's level of thinking.
Damn, I had forgotten about him... Crazy how he has stopped being followed here. It's a good thing though, but was it him or Battaglia that said it? Because I remember Alex saying he didn't believe the PS5 had raytracing even after it was announced and confirmed by Sony.
 
Damn, I had forgotten about him... Crazy how he has stopped being followed here. It's a good thing though, but was it him or Battaglia that said it? Because I remember Alex saying he didn't believe the PS5 had raytracing even after it was announced and confirmed by Sony.

I think Battaglia said he didn't believe it before Sony announced it. Blue was the one that called Cerny a liar. That's if I'm remembering it correctly.
 

azertydu91

Hard to Kill
I think Battaglia said he didn't believe it before Sony announced it. Blue was the one that called Cerny a liar. That's if I'm remembering it correctly.
That's not what happened; here's the video where he says he doesn't believe in hardware raytracing (in reaction to the PS5 spec reveal, I hope I timestamped it right).



Which would make him basically call Cerny a liar too.
 

Md Ray

Member
So, the PS5 and series X are weaker than the 6600xt?
Same number of ROPs… higher clock.
In terms of Gtri/s and Gpix/s yes.
What ethomaz said.
1. The pixel fill rate is bound by memory bandwidth.
2. Being ROPs-bound can be bypassed via a compute shader path writing to a UAV texture buffer.



Baseline RDNA 1 and RDNA 2 ROPs are connected to a multi-MB L2 cache to mitigate memory-bandwidth-bound issues. PC RDNA 2 dGPUs add a larger L3 cache. The usual optimization path involves software tiled caching (processing via cache localization) on top of immediate-mode render techniques.

XSX GPU = 5 MB L2 cache.
PS5 GPU = 4 MB L2 cache.

PC NAVI 21 = 4 MB L2 cache + 128 MB L3 cache.
PC NAVI 22 = 3 MB L2 cache + 96 MB L3 cache.
PC NAVI 23 = 2 MB L2 cache + 32 MB L3 cache.
Yeah, but what about 22% higher rasterization throughput? Doesn't it mean anything?

I think we can attribute scenarios like this to PS5 GPU's higher clock speed:
[benchmark screenshot]


And in many other games where we see neck and neck perf... The recent one being F1 2021. Obviously, games/scenes where XSX takes point can be attributed to the advantages of the XSX GPU: 18% higher computational power, higher mem bandwidth, and whatnot. I believe we'll see this perf advantage flip-flopping on both sides throughout this gen.
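
For what it's worth, the usual peak-throughput formulas with the commonly quoted specs (36 CUs @ 2.23 GHz and 64 ROPs for PS5, 52 CUs @ 1.825 GHz and 64 ROPs for XSX) reproduce both of those percentages; a quick sketch, peak numbers only:

```python
# Peak-throughput arithmetic from the commonly quoted console specs
# (PS5: 36 CUs @ 2.23 GHz, 64 ROPs; XSX: 52 CUs @ 1.825 GHz, 64 ROPs).
ps5_clk, xsx_clk = 2.23, 1.825   # GHz
ps5_cu, xsx_cu = 36, 52
rops = 64                        # both consoles

# Peak pixel fill rate (Gpix/s) = ROPs x clock, so with equal ROP counts
# the gap tracks the clock difference.
ps5_fill = rops * ps5_clk        # ~142.7 Gpix/s
xsx_fill = rops * xsx_clk        # ~116.8 Gpix/s
print(ps5_fill / xsx_fill)       # ~1.22 -> the "22%" rasterization figure

# Peak FP32 (TFLOPS) = CUs x 64 lanes x 2 ops (FMA) x clock (GHz) / 1000
ps5_tf = ps5_cu * 64 * 2 * ps5_clk / 1000   # ~10.3 TFLOPS
xsx_tf = xsx_cu * 64 * 2 * xsx_clk / 1000   # ~12.2 TFLOPS
print(xsx_tf / ps5_tf)                      # ~1.18 -> the "18%" compute figure
```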
 
That's not what happened; here's the video where he says he doesn't believe in hardware raytracing (in reaction to the PS5 spec reveal, I hope I timestamped it right).



Which would make him basically call Cerny a liar too.

Let's be fair here; they likely meant hardware RT in terms of dedicated RT cores and acceleration for RT outside of the compute units and TMUs. In which case he'd have been correct.

Though he should've phrased it better, like "hardware-accelerated RT". Which would more or less be true of all the current-gen systems because dedicated silicon for RT is one of AMD's biggest weaknesses currently (I mean even Intel are leapfrogging them there and could end up beating Nvidia in RT too going by what's been shared so far).
 

azertydu91

Hard to Kill
Let's be fair here; they likely meant hardware RT in terms of dedicated RT cores and acceleration for RT outside of the compute units and TMUs. In which case he'd have been correct.

Though he should've phrased it better, like "hardware-accelerated RT". Which would more or less be true of all the current-gen systems because dedicated silicon for RT is one of AMD's biggest weaknesses currently (I mean even Intel are leapfrogging them there and could end up beating Nvidia in RT too going by what's been shared so far).
That would've been a great explanation if he had said the same thing about the XSX or the XSS, but not once has he questioned raytracing on Xbox. I mean, he clearly says that the PS5 having RT capabilities was out of left field, which means he clearly based his speculation on something other than what the manufacturer had just announced.
And he never went back and corrected himself. I would've been happy if he had corrected it exactly like you said, meaning dedicated RT cores.
But it is far from the only time he has been dismissive about PS5 RT, which has since been proven to be there and working great (even though it's not as good as Nvidia's; I still use my 2080 and it is amazing).
 

ethomaz

Banned
Let's be fair here; they likely meant hardware RT in terms of dedicated RT cores and acceleration for RT outside of the compute units and TMUs. In which case he'd have been correct.

Though he should've phrased it better, like "hardware-accelerated RT". Which would more or less be true of all the current-gen systems because dedicated silicon for RT is one of AMD's biggest weaknesses currently (I mean even Intel are leapfrogging them there and could end up beating Nvidia in RT too going by what's been shared so far).
That is not what he tried to say.
Because he agreed that the Series X has "hardware-accelerated RT", which is exactly what the PS5 has... or what RDNA 2 has.

In his delirium, the Series consoles and RDNA 2 have RT hardware but the PS5 does not.
 
That would've been a great explanation if he had said the same thing about the XSX or the XSS, but not once has he questioned raytracing on Xbox. I mean, he clearly says that the PS5 having RT capabilities was out of left field, which means he clearly based his speculation on something other than what the manufacturer had just announced.
And he never went back and corrected himself. I would've been happy if he had corrected it exactly like you said, meaning dedicated RT cores.
But it is far from the only time he has been dismissive about PS5 RT, which has since been proven to be there and working great (even though it's not as good as Nvidia's; I still use my 2080 and it is amazing).
Fair point. But I think at the time he made that observation they based it on the Oberon leaks, and RT was disabled in those tests (either because the RT units were not operational or they were not needed for the PS4 and PS4 Pro regression testing). He might never have considered those possibilities or come across them, though they were mentioned on B3D and he frequents it somewhat regularly; possibly he missed that specific speculation.

I mean, keep in mind there were rumors back then that the PS5 had a separate RT block with PowerVR tech, which a lot of people considered plausible (though in reality it wouldn't have been, unless it were integrated into the GPU as an accelerated hardware block, otherwise it'd be extremely bandwidth-starved). So as crazy as it is to think people thought the PS5 had no RT abilities whatsoever at one point, with the limited info from leaks at the time it was a plausible possibility for a lot of people, even folks like Dictator.
 

azertydu91

Hard to Kill
Fair point. But I think at the time he made that observation they based it on the Oberon leaks, and RT was disabled in those tests (either because the RT units were not operational or they were not needed for the PS4 and PS4 Pro regression testing). He might never have considered those possibilities or come across them, though they were mentioned on B3D and he frequents it somewhat regularly; possibly he missed that specific speculation.

I mean, keep in mind there were rumors back then that the PS5 had a separate RT block with PowerVR tech, which a lot of people considered plausible (though in reality it wouldn't have been, unless it were integrated into the GPU as an accelerated hardware block, otherwise it'd be extremely bandwidth-starved). So as crazy as it is to think people thought the PS5 had no RT abilities whatsoever at one point, with the limited info from leaks at the time it was a plausible possibility for a lot of people, even folks like Dictator.
I get your point, but why would an analyst form a supposedly educated opinion based on rumors and leaks instead of the manufacturer's announcement with the specs revealed?
Even worse when it is an analysis of said reveal.
I mean, when I watch an analysis of a spec reveal I want people to talk about the spec reveal, not false specs based on incorrect data.
If they were to analyse a game and base the analysis on a rumor that said game performs better on whatever platform, ignoring the real data they have, would you think it is an impartial analysis?
This was just an uneducated opinion (on RT only) contradicting what had been announced, which is the opposite of analysis.
He can produce some great analysis, but this was not it.
I mean, just listen to his words: "I still don't believe it has hardware raytracing." It is indeed a belief based on rumors; is that professional?
 

rnlval

Member
What ethomaz said.

Yeah, but what about 22% higher rasterization throughput? Doesn't it mean anything?

I think we can attribute scenarios like this to PS5 GPU's higher clock speed:
[benchmark screenshot]


And in many other games where we see neck and neck perf... The recent one being F1 2021. Obviously, games/scenes where XSX takes point can be attributed to the advantages of the XSX GPU: 18% higher computational power, higher mem bandwidth, and whatnot. I believe we'll see this perf advantage flip-flopping on both sides throughout this gen.

[benchmark screenshot]


Higher ray-traced reflections usage compared to your screenshot. For missed 60 Hz targets, the XSX has working HDMI 2.1 VRR.
 

rnlval

Member
Let's be fair here; they likely meant hardware RT in terms of dedicated RT cores and acceleration for RT outside of the compute units and TMUs. In which case he'd have been correct.

Though he should've phrased it better, like "hardware-accelerated RT". Which would more or less be true of all the current-gen systems because dedicated silicon for RT is one of AMD's biggest weaknesses currently (I mean even Intel are leapfrogging them there and could end up beating Nvidia in RT too going by what's been shared so far).
FYI, Intel's highest gaming Arc GPU (with a 256-bit bus) is aimed at roughly the RTX 3070 (256-bit bus) and RX 6700 XT (192-bit bus + 96 MB Infinity Cache).

AMD's RT hardware accelerates two of the three BVH raytracing workloads, i.e. the ray/bounding-box intersection test and the ray/triangle intersection test. The current real-time BVH raytracing method still needs good raster performance, since the RT cores' output is used to guide the pixel shader's color pass. The BVH traversal workload is processed on the compute shaders on AMD RDNA 2 GPUs.
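
To illustrate that split, here's a minimal sketch in plain Python (hypothetical pseudocode, not real shader or driver code): the two intersection helpers stand in for what the fixed-function RT hardware does, while the traversal loop is the part that runs as ordinary compute-shader code.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical helper types for this sketch.
@dataclass
class Hit:
    t: float                                    # distance along the ray

@dataclass
class Node:
    lo: List[float]                             # AABB min corner
    hi: List[float]                             # AABB max corner
    children: List["Node"] = field(default_factory=list)
    tris: List[list] = field(default_factory=list)   # leaf triangles

def ray_box(orig, direc, node) -> bool:
    """Slab test: the kind of ray/box test the fixed-function RT unit runs."""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        inv = 1.0 / direc[a] if direc[a] != 0.0 else 1e30
        t0 = (node.lo[a] - orig[a]) * inv
        t1 = (node.hi[a] - orig[a]) * inv
        if t0 > t1:
            t0, t1 = t1, t0
        tmin, tmax = max(tmin, t0), min(tmax, t1)
    return tmin <= tmax

def ray_tri(orig, direc, tri) -> Optional[Hit]:
    """Placeholder for the hardware ray/triangle test (math omitted here)."""
    return None

def trace(orig, direc, root: Node) -> Optional[Hit]:
    """BVH traversal loop: on RDNA 2 this part runs as ordinary shader code."""
    stack, closest = [root], None
    while stack:
        node = stack.pop()
        if not ray_box(orig, direc, node):      # accelerated test #1
            continue
        if node.tris:                           # leaf node
            for tri in node.tris:
                hit = ray_tri(orig, direc, tri)  # accelerated test #2
                if hit and (closest is None or hit.t < closest.t):
                    closest = hit
        else:
            stack.extend(node.children)
    return closest   # shading and the denoise pass then run on the shader units
```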



FYI, RDNA CUs' shader cores have improved scalar units with their own schedulers to help mitigate SIMD/SIMT divergent workloads.
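
A toy illustration (hypothetical Python, not actual GPU code) of why divergent branches hurt a SIMD/SIMT wave, and where a scalar unit helps with wave-uniform work:

```python
# Toy model of a single SIMD/SIMT wave: when lanes disagree on a branch, the
# wave executes BOTH sides under an execution mask, so the cost is roughly the
# sum of both paths.
def run_wave(values, threshold):
    take_a = [v > threshold for v in values]     # lanes wanting path A
    results = list(values)
    if any(take_a):                               # whole wave runs path A
        for i, take in enumerate(take_a):
            if take:
                results[i] = values[i] * 2.0      # path A work
    if not all(take_a):                           # whole wave also runs path B
        for i, take in enumerate(take_a):
            if not take:
                results[i] = values[i] + 1.0      # path B work
    return results

# If the condition is wave-uniform (all lanes agree), only one path executes,
# and uniform bookkeeping like this can be evaluated once per wave on a scalar
# unit instead of once per lane on the vector ALUs.
print(run_wave([1.0, 5.0, 3.0, 7.0], threshold=4.0))   # divergent: both paths run
print(run_wave([5.0, 6.0, 7.0, 8.0], threshold=4.0))   # uniform: only path A runs
```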



The raytracing denoise pass is processed on the shader units on both RDNA 2 and RTX.

GA102 holds the TFLOPS "high ground" when compared to NAVI 21.

Most TFLOPS figures quoted for NAVI GPUs count only the vector units and don't factor in the scalar units. A CPU likewise has both scalar and vector units.
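
For example, the vector-only figures usually quoted look like this (approximate boost clocks, taking the RTX 3090 as the top GA102 part and the RX 6900 XT as the top NAVI 21 part):

```python
# Vector-only peak FP32, the way TFLOPS are usually quoted (approximate boost
# clocks; RTX 3090 as the top GA102 part, RX 6900 XT as the top NAVI 21 part).
navi21_tf = 80 * 64 * 2 * 2.25 / 1000    # ~23.0 TFLOPS: 80 CUs x 64 lanes x FMA
ga102_tf = 82 * 128 * 2 * 1.70 / 1000    # ~35.7 TFLOPS: 82 SMs x 128 FP32 lanes x FMA
print(ga102_tf, navi21_tf)
# Caveats: half of Ampere's 128 FP32 lanes per SM share datapaths with INT32
# work, and neither figure counts RDNA's separate scalar units, which is the
# point being made above.
```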
 

Md Ray

Member
[benchmark screenshot]


Higher ray-traced reflections usage compared to your screenshot.
Obviously, games/scenes where XSX takes point can be attributed to the advantages of the XSX GPU: 18% higher computational power, higher mem bandwidth, and whatnot. I believe we'll see this perf advantage flip-flopping on both sides throughout this gen.
For missed 60 Hz targets, the XSX has working HDMI 2.1 VRR.
So will PS5.
 

rnlval

Member
So will PS5.
The PS5 is still missing HDMI 2.1 VRR functionality.

PS: I have an RMA form for my Yamaha RX-V6A. Both the PS5 and the Yamaha RX-V6A claim HDMI 2.1 support via a Panasonic HDMI encoder IC.

The PS5 uses the Panasonic HDMI encoder IC MN864739.


I used an MSI RTX 3080 Ti Gaming X Trio to verify HDMI 2.1 functionality. NVIDIA RTX Ampere doesn't use a Panasonic HDMI encoder IC.

Reference

I do NOT believe HDMI 2.1 claims from Japan's Panasonic.
 

Leyasu

Banned
What ethomaz said.

Yeah, but what about 22% higher rasterization throughput? Doesn't it mean anything?

I think we can attribute scenarios like this to PS5 GPU's higher clock speed:
[benchmark screenshot]


And in many other games where we see neck and neck perf... The recent one being F1 2021. Obviously, games/scenes where XSX takes point can be attributed to the advantages of the XSX GPU: 18% higher computational power, higher mem bandwidth, and whatnot. I believe we'll see this perf advantage flip-flopping on both sides throughout this gen.
Now, I am not saying that the PS5 is not up to snuff.

But let me give you an alternative scenario: multiplat devs are just hitting, or getting as near as possible to, the marketing checkbox and leaving it there. Seeing as you mention F1, why don't you go and compare it to the Forza Horizon 5 footage that is everywhere? Even Forza Horizon 4.
 

rnlval

Member
Fair point. But I think at the time he made that observation they based it on the Oberon leaks and RT was disabled on those tests (either because RT units were not operational or they were not needed due to regression testing for PS4 and PS4 Pro). He might've never considered those possibilities or came across them, tho they were mentioned on B3D and he frequents them somewhat regularly, possibility he missed that specific speculation though.

I mean keep in mind there were rumors back then the PS5 had a separate RT block with PowerVR tech that a lot of people considered plausible (though in reality it wouldn't of been unless it were integrated into the GPU as an accelerated hardware block otherwise it'd be extremely bandwidth-starved). So as crazy as it is to think people thought PS5 had no RT abilities whatsoever at one point, with the limited info from leaks at the time it made that a plausible possibility for a lot of people, even folks like Dictator.
When DX12U features are not used, NAVI 2x and NAVI 1x are similar in most cases.

NAVI 21 is effectively two NAVI 10s "super-glued" together, e.g. 2x 64 ROPs and 2x 40 CUs. Third-generation 7 nm improves perf/watt for NAVI 2x over the second-generation 7 nm NAVI 10.

AMD attached 128 MB of Infinity Cache to NAVI 21 to avoid an expensive 512-bit GDDR6 bus and PCB design.
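
Rough bandwidth arithmetic behind that trade-off (a sketch assuming 16 Gbps GDDR6, as used on the RX 6800/6900 cards):

```python
# GDDR6 bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
def gddr6_bandwidth_gbs(bus_width_bits, gbps_per_pin=16):
    return bus_width_bits * gbps_per_pin / 8    # GB/s

print(gddr6_bandwidth_gbs(256))   # 512 GB/s  -> what NAVI 21 actually ships with
print(gddr6_bandwidth_gbs(512))   # 1024 GB/s -> the wide, expensive PCB alternative
# The 128 MB Infinity Cache is there to serve enough requests on-die that the
# narrower 256-bit bus behaves more like the wider configuration would.
```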
 

kyliethicc

Member
PS5 still has missing HDMI 2.1 VRR functionality.

PS; I have RMA form for my Yamaha RX-V6A. Both PS5 and Yamaha RX-V6A have claims for HDMI 2.1 with Panasonic HDMI Encoder IC.

PS5 uses a Panasonic HDMI Encoder IC MN864739.


I used MSI RTX 3080 Ti Gaming X Trio to verify HDMI 2.1 functionality. NVIDIA RTX Ampere doesn't use Panasonic HDMI Encoder IC.

Reference

I do NOT believe HDMI 2.1 claims from Japan's Panasonic.

1 - the PS5 is HDMI 2.1 and already outputs at 4K 120.

2 - VRR and HDMI 2.1 are different things; you can have HDMI 2.0 and still have VRR / FreeSync / G-Sync.

3 - the PS5 will have VRR added later. Sony aren't gonna enable VRR on PS5 until they add it to their TVs, and the HDMI Forum has to finalize their VRR format first.

4 - The issue with your AVR was with HDMI 2.1 using FRL w/ Display Stream Compression, used by some Nvidia GPUs and the Xbox Series X. Its chip cannot pass through an FRL signal that uses DSC. The PS5 does not use DSC, and thus its 4K120 output works totally fine with that AVR.
 

Zathalus

Member
That's not VRR.
It's literally in the article:
HDMI Specification 2.1 Features Include:
Enhanced refresh rate features ensure an added level of smooth and seamless motion and transitions for gaming, movies and video. They include:


  • Variable Refresh Rate (VRR) reduces or eliminates lag, stutter and frame tearing for more fluid and better detailed gameplay.
 

ethomaz

Banned
That was done years ago.

That is the specification of the HDMI 2.1 features,
not the specification of how these features need to work.

The HDMI 2.1 VRR implementation is not finalized yet, because when you are implementing it you find new issues... after HDMI creates a solution for each case, the implementation documentation is updated.

I can't even believe you thought you could implement it just from that feature specification (https://www.hdmi.org/spec21sub/variablerefreshrate) lol

There is no way that works if there is no implementation documentation to make sure every party is on the same page.

It's literally in the article:
So tell me, how do you implement that text in your device? lol
 

Zathalus

Member
That is the specification of the HDMI 2.1 features,
not the specification of how these features need to work.

The HDMI 2.1 VRR implementation is not finalized yet.
Proof? All information I can find from the HDMI forum is that it was finalised years ago. Hence the reason for Nvidia, AMD, Intel, Samsung and LG all supporting it.

It's even on the official presentation document that was released:

https://hdmiforum.org/wp-content/up...1-November-Release-Presentation-EN.pdf?x96243 - page 34

As well as the official press release:


VRR is part of the base HDMI 2.1 specification.

Mentioned again here:


As well as under the certification for HDMI 2.1 cables:

 

Kilau

Member
That is the specification of the HDMI 2.1 features,
not the specification of how these features need to work.

The HDMI 2.1 VRR implementation is not finalized yet, because when you are implementing it you find new issues... after HDMI creates a solution for each case, the implementation documentation is updated.

I can't even believe you thought you could implement it just from that feature specification (https://www.hdmi.org/spec21sub/variablerefreshrate) lol

There is no way that works if there is no implementation documentation to make sure every party is on the same page.
I believe the hold-up for Sony is that they are waiting on the HDMI Forum to certify the chipsets. Other manufacturers have gone ahead with enabling the feature, but Sony always waits for certification.

I don’t know why it’s taking so long but the rollout of HDMI 2.1 has not been smooth at all.
 

saintjules

Member
Proof? All information I can find from the HDMI forum is that it was finalised years ago. Hence the reason for Nvidia, AMD, Intel, Samsung and LG all supporting it.

It's even on the official presentation document that was released:

https://hdmiforum.org/wp-content/up...1-November-Release-Presentation-EN.pdf?x96243 - page 34

As well as the official press release:


VRR is part of the base HDMI 2.1 specification.

Mentioned again here:


As well as under the certification for HDMI 2.1 cables:


Thorough investigation (y)
 

ethomaz

Banned
Proof? All information I can find from the HDMI forum is that it was finalised years ago. Hence the reason for Nvidia, AMD, Intel, Samsung and LG all supporting it.

It's even on the official presentation document that was released:

https://hdmiforum.org/wp-content/up...1-November-Release-Presentation-EN.pdf?x96243 - page 34

As well as the official press release:


VRR is part of the base HDMI 2.1 specification.

Mentioned again here:


As well as under the certification for HDMI 2.1 cables:

Yes, the feature set was defined years ago.
The implementation is still in development, and HDMI has a lot of changes to make because there are several issues still to be fixed.

"What they are waiting for is HDMI is going to be updating some of their specs on CEC. The CEC link is going to be changing and so is ARC (Audio Return Channel) and eARC (Enhanced Audio Return Channel). There will be some protocol changes in the way that HDMI is configuring that and Sony engineers want to wait and see. They have one firmware update for that."




Nvidia, AMD, Intel, Samsung, LG, etc will all have to update their HDMI 2.1 implementation too.

BOOM!!!

But I agree that Sony should have launched an early HDMI 2.1 implementation like the others... after all, the issues being found are due to those other brands' implementations, and that is why the HDMI Forum is updating its HDMI 2.1 implementation.
 

kyliethicc

Member
It's literally in the article:
Thorough investigation (y)

Yes and the HDMI Forum VRR specification hasn't been finalized yet.

The HDMI Forum finalized the HDMI 2.1 specification.

They're different.

This guy from Sony TVs says, at the 33:00 timestamp, that they're waiting for the HDMI Forum to "have set the standards for VRR and have got them set in stone, so we'll move on that once that happens basically." Maybe he's mistaken or lying, but I doubt it.

"
 

ethomaz

Banned
Yes and the HDMI Forum VRR specification hasn't been finalized yet.

The HDMI Forum finalized the HDMI 2.1 specification.

They're different.

This guy from Sony TVs says, at the 33:00 timestamp, that they're waiting for the HDMI Forum to "have set the standards for VRR and have got them set in stone, so we'll move on that once that happens basically." Maybe he's mistaken or lying, but I doubt it.

"

The specification is just the feature set that version 2.1 will have, and it is finalized… any new feature will come with 2.2 (or so).

But the implementation standard of HDMI 2.1 is not finalized… how these features are implemented, how they work, how the exceptions and combinations with other factors behave, and how the bits are used to make these features work are not finalized yet.

Not only VRR; there are a lot of issues still to be fixed with HDMI 2.1… a big update to the implementation standard is expected late this year, and of course all manufacturers will have to update their implementations via firmware.

Sony waiting is just a choice, but a choice I don't agree with.
 

Zathalus

Member
Yes, the feature set was defined years ago.

The implementation is still in development, and HDMI has a lot of changes to make.

"What they are waiting for is HDMI is going to be updating some of their specs on CEC. The CEC link is going to be changing and so is ARC (Audio Return Channel) and eARC (Enhanced Audio Return Channel). There will be some protocol changes in the way that HDMI is configuring that and Sony engineers want to wait and see. They have one firmware update for that."


The owner of Value Electronics is hardly proof, dude. It's pure speculation and goes against the released, official information from the HDMI Forum.

Yes and the HDMI Forum VRR specification hasn't been finalized yet.

The HDMI Forum finalized the HDMI 2.1 specification.

They're different.

This guy from Sony says, at the 33:00 timestamp, that they're waiting for the HDMI Forum to "have set the standards for VRR and have got them set in stone, so we'll move on that once that happens basically."

VRR is part of the basic HDMI 2.1 spec and has been finalised. This is repeatedly mentioned in all press releases and announcements the HDMI Forum has made. There is no announcement from the HDMI Forum that VRR is still in the works or not finalised, despite the HDMI specification being set in stone almost 4 years ago. That Sony rep is basically making that up wholesale. He is directly contradicting the information from the HDMI Forum itself. Which makes sense, as he is trying to come up with reasons as to why they have not implemented VRR yet, as you would expect from a marketing manager.

VRR being stated as a HDMI 2.1 feature for certifying HDMI 2.1 cables:
To ensure quality Ultra High Speed HDMI® cables reach the market and support 4K and 8K video, HDR, VRR, eARC, and all other HDMI 2.1 features, HDMI Forum, Inc. today announced a mandatory certification program for all Ultra High Speed HDMI cables. Ultra High Speed HDMI cable certification includes testing to meet current EMI requirements to minimize wireless interference. All certified cables of any length must pass certification testing at an HDMI Authorized Testing Center (ATC).

Heck, even Sony themselves have an article stating that VRR is part of the HDMI 2.1 standard.

Feel free to show any proof from the HDMI forum themselves that HDMI 2.1 or VRR is not finalised, or that VRR is not part of the 2.1 specification.

After further investigation, it appears the issue has bugger-all to do with the HDMI Forum and everything to do with the crappy MediaTek chipset that both Sony and Panasonic use. Check this out:



It appears 4K+120Hz+VRR is going to be a problem with TVs that have that MediaTek chipset, and it currently causes half-resolution issues. It is telling that Panasonic has an upcoming "announcement" regarding 4K+120Hz+VRR.

This explains the issue rather well.

Personally, I am a bit pissed off that issues with Sony TVs are holding back my PS5 from having VRR enabled.
 

kyliethicc

Member
The owner of Value Electronics is hardly proof dude. It's pure speculation and goes against the released, official information from the HDMI forum.


VRR is part of the basic HDMI 2.1 spec and has been finalised. This is repeatedly mentioned in all press releases and announcements the HDMI forum has made. There is no announcement from the HDMI forum that VRR is still in the works or not finalised, despite the HDMI specification being set in stone almost 4 years ago. That Sony rep is basically making that up wholesale. He is directly contradicting the information directly from the HDMI forum themselves. Which makes sense, as he is trying to come up with reasons as to why they have not implemented VRR yet, as you would expect from a Marketing Manager.

VRR being stated as a HDMI 2.1 feature for certifying HDMI 2.1 cables:


Heck, even Sony themselves have an article stating that VRR is part of the HDMI 2.1 standard.

Feel free to show any proof from the HDMI forum themselves that HDMI 2.1 or VRR is not finalised, or that VRR is not part of the 2.1 specification.

After further investigation it appears the issue has got buggerall to do with the HDMI forum, but the crappy Mediatek chipset that both Sony and Panasonic use. Check this out:



It appears 4k+120Hz+VRR is going to be a problem with TVs that have that Mediatek chipset, and currently causes half resolution issues. It is telling that Panasonic has an upcoming "announcement" regarding 4k+120Hz+VRR.

This explains the issue rather well.

Personally, I am a bit pissed off that issues with Sony TVs are holding back my PS5 from having VRR enabled.

You still are misunderstanding how the HDMI 2.1 spec is final but the HDMI Forum VRR spec is not final yet. They're different things.

And the Ultra High Speed HDMI cables are another separate thing as well.
 

Zathalus

Member
You still are misunderstanding how the HDMI 2.1 spec is final but the HDMI Forum VRR spec is not final yet. They're different things.

And the Ultra High Speed HDMI cables are another separate thing as well.
OK, I'm willing to believe you, but can you post evidence of this at all? All the information directly from the HDMI forum indicates VRR is part of the HDMI 2.1 spec and both are final.
 

kyliethicc

Member
OK, I'm willing to believe you, but can you post evidence of this at all? All the information directly from the HDMI forum indicates VRR is part of the HDMI 2.1 spec and both are final.
The HDMI 2.1 spec is like a recipe for bread. It has flour, water, salt, etc.

The VRR spec is like a recipe for flour. An ingredient in bread sure, but different.

The HDMI Forum all agree that HDMI 2.1 (bread) includes VRR (flour), but they haven't agreed on the exact specs of their VRR implementation (what kind of flour, how much, etc).
 

Zathalus

Member
The HDMI 2.1 spec is like a recipe for bread. It has flour, water, salt, etc.

The VRR spec is like a recipe for flour. An ingredient in bread sure, but different.

The HDMI Forum all agree that HDMI 2.1 (bread) includes VRR (flour), but they haven't agreed on the exact specs of their VRR implementation (what kind of flour, how much, etc).
I get that; I'm asking where on earth that is stated. Every single press release or announcement states that HDMI 2.1 and VRR go together, and the official specifications for HDMI 2.1 - including all features such as VRR - have been finalised. Sure, you can have a HDMI 2.1 connection without VRR, but VRR is still part of the HDMI 2.1 specification.

Hence VRR working the same across Nvidia, AMD, Intel, Samsung, LG, ASUS, Acer and Gigabyte.

Show me a reliable source (that is not a Sony marketing manager) that shows this is not the case.
 

ethomaz

Banned
The owner of Value Electronics is hardly proof dude. It's pure speculation and goes against the released, official information from the HDMI forum.


VRR is part of the basic HDMI 2.1 spec and has been finalised. This is repeatedly mentioned in all press releases and announcements the HDMI forum has made. There is no announcement from the HDMI forum that VRR is still in the works or not finalised, despite the HDMI specification being set in stone almost 4 years ago. That Sony rep is basically making that up wholesale. He is directly contradicting the information directly from the HDMI forum themselves. Which makes sense, as he is trying to come up with reasons as to why they have not implemented VRR yet, as you would expect from a Marketing Manager.

VRR being stated as a HDMI 2.1 feature for certifying HDMI 2.1 cables:


Heck, even Sony themselves have an article stating that VRR is part of the HDMI 2.1 standard.

Feel free to show any proof from the HDMI forum themselves that HDMI 2.1 or VRR is not finalised, or that VRR is not part of the 2.1 specification.

After further investigation it appears the issue has got buggerall to do with the HDMI forum, but the crappy Mediatek chipset that both Sony and Panasonic use. Check this out:



It appears 4k+120Hz+VRR is going to be a problem with TVs that have that Mediatek chipset, and currently causes half resolution issues. It is telling that Panasonic has an upcoming "announcement" regarding 4k+120Hz+VRR.

This explains the issue rather well.

Personally, I am a bit pissed off that issues with Sony TVs are holding back my PS5 from having VRR enabled.

You are just in denial of the truth lol
 

SmokSmog

Member
There's no Infinity Cache in the PS5 for the GPU, and the CPU is slower than mobile Zen 2 in laptops; it also doesn't have a unified L3 cache for the CPU. Same with the Xbox Series S/X. Get over it. You get what you paid for.

The 4700S, which is the PS5 CPU but with higher clocks, gets mogged by mobile Zen 2, let alone Zen 3.

Console CPUs have only 2x 4 MB of L3 cache, they have lower clocks than desktop and laptop parts, and as if that were not enough, both the PS5 and XSX are connected to high-latency GDDR6.

This PS5-based APU boosts to 4 GHz.

 

ethomaz

Banned
Stellar debating skills. I post numerous links from the official HDMI Forum site, yet I'm the one in denial.

How's that belief that PS5 has L3 GPU cache coming along?
I showed you proof of what Sony said it is waiting for... I don't agree with that, but it is their choice.

You are still in denial.

HDMI 2.1 implementation is still being updated.
 

kyliethicc

Member
I get that, I'm asking where on earth that is stated? Every single press release or announcement states HDMI 2.1 and VRR go together and the official specifications for HDMI 2.1 - including all features such as VRR has been finalised. Sure, you can have a HDMI 2.1 connection without VRR, but VRR is still part of the HDMI 2.1 specification.

Hence VRR working the same across Nvidia, AMD, Intel, Samsung, LG, ASUS, Acer and Gigabyte.

Show me a reliable source (that is not a Sony marketing manager) that shows this is not the case.
Currently, HDMI Forum VRR is not the same on all the TVs that support it.

Each implementation is different at this point. Minimum refresh, maximum refresh, supported resolutions, required bandwidth, impact on picture accuracy, picture modes compatibility, hdr & sdr or just sdr, with/without dolby vision, etc.

None of it is finalized. It varies TV model by TV model.

And FreeSync / G-Sync is, again, a totally different thing.
 

Zathalus

Member
Currently, HDMI Forum VRR is not the same on all the TVs that support it.

Each implementation is different at this point. Minimum refresh, maximum refresh, supported resolutions, required bandwidth, impact on picture accuracy, picture modes compatibility, hdr & sdr or just sdr, with/without dolby vision, etc.

None of it is finalized. It varies TV model by TV model.

And FreeSync / G-Sync is, again, a totally different thing.
None of that has got anything to do with VRR. Having different VRR ranges and the rest is not defined by the specification itself. Just as Gsync and Freesync have widely different ranges and display types.

Also, VRR is the exact same thing as Gsync or Freesync. A form of adaptive sync.
 
HDMI 2.1 will be fantastic for gaming, as much of it will be beneficial, but I think it's by far the most poorly implemented HDMI spec so far. There's certainly never been as much confusion about what is specific to HDMI 2.1 and what is also possible via HDMI 2.0. To be fair, it hasn't been easy for the hardware manufacturers either, as you can tell from the number of issues there have been with various pieces of AV hardware from companies that you would normally expect to get things spot on.
There is of course Forum VRR, FreeSync and G-Sync, and each does have its own specs. By that I don't mean just that there are differing higher-end and lower-end limits on different hardware; I mean that the specs for all three are simply not the same, and supporting one doesn't mean that you support all three.
I think a large part of the confusion this time simply originates with the fact that there were two very high-profile console releases from Microsoft and Sony at pretty much the same time the HDMI 2.1 spec was announced, which means that far more people than usual are having to think about the intricacies of spec certification.
In reality, that's how it should be for the vast majority of people. Ideally there should be a big label on the box that simply has a big YES or NO on it, and that's all people will need to see.
Of course, it will never be that simple, but while we're still waiting for the HDMI Forum as end users, it means that the manufacturers are also still waiting for them, and until everything lines up it's always going to be... a little uneven.
I'm happily waiting a year or so before buying a new TV. That in no way means that you can't buy a fantastic TV now that works fine and does everything you want, but I'm just more comfortable waiting until things settle down a little.
 

rnlval

Member
1 - the PS5 is HDMI 2.1 and already outputs at 4K 120

2 - VRR and HDMI 2.1 are different things, you can have HDMI 2.0 and still have VRR / Freesync / Gsync

3 - the PS5 will have VRR added later. Sony aren't gonna enable VRR on PS5 until they add it to their TVs. And the HDMI Forum has to finalize their VRR format first.

4 - The issue with your AVR was with HDMI 2.1 using FRL w/ Display Stream Compression, used by some Nvidia GPUs and the Xbox SX. Its chip cannot pass through a FRL signal that uses DSC. The PS5 does not use DSC and thus its 4K120 output works totally fine with that AVR.
1. The argument is that the PS5 doesn't have VRR functionality; your argument is a red herring. The PS5 can output 4K 120 Hz, but with a compromised ("compressed pixels", LOL) 32 Gb/s bandwidth. READ https://gamingbolt.com/ps5s-hdmi-2-1-bandwidth-reportedly-limited-to-32-gb-s

PS5's 4K 120 Hz output uses reduced 4:2:2 chroma subsampling within that 32 Gb/s bandwidth (see the rough arithmetic at the end of this post).

2. HDMI 2.0b's VRR uses the HDMI 2.1 VRR data format. My Ryzen 7 Pro 4750U APU's RX Vega driver control panel makes a noted distinction between FreeSync and VRR.

My current PC GPUs
MSI GeForce RTX 3080 Ti Gaming X Trio, HDMI 2.1
MSI GeForce RTX 2080 Ti Gaming X Trio, HDMI 2.0b
ASUS Geforce RTX 2080 Evo Duo, HDMI 2.0b
Ryzen 7 Pro 4750U APU's RX Vega iGPU (7 nm generation, HDMI 2.0b).
Ryzen 5 2500U APU's RX Vega iGPU (14 nm generation, HDMI 2.0b).

3. Panasonic's broken HDMI 2.1 encoder IC is problematic.

4. Read https://www.flatpanelshd.com/news.php?subaction=showfull&id=1629971809

"Yamaha HDMI 2.1 receivers capped to 24 Gbps even after board replacement". LOL

Both the PS5's and Yamaha's HDMI 2.1 encoder chips were designed by Panasonic. I plan to ditch my less-than-a-year-old Yamaha receiver. Like other NVIDIA RTX and Xbox owners, I don't plan to buy any Yamaha in the future. I'm going to throw $$$ at the HDMI 2.1 problem.

PS5's 4K 120 Hz output uses reduced 4:2:2 chroma subsampling within that 32 Gb/s bandwidth.
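
As a rough cross-check of that claim, here's the raw-payload arithmetic (a sketch only: it ignores FRL encoding overhead and audio, and assumes the standard 4400x2250 total timing for 4K120):

```python
# Raw video payload for 4K120 at different pixel formats vs. a ~32 Gb/s link.
# Assumes the standard 4400 x 2250 total timing (1188 MHz pixel clock).
pixel_rate = 4400 * 2250 * 120          # ~1.188e9 pixels/s, blanking included

def payload_gbps(bits_per_pixel):
    return pixel_rate * bits_per_pixel / 1e9

print(payload_gbps(30))   # ~35.6 Gb/s: 10-bit RGB / 4:4:4  -> exceeds 32 Gb/s
print(payload_gbps(24))   # ~28.5 Gb/s: 8-bit RGB / 4:4:4   -> fits
print(payload_gbps(20))   # ~23.8 Gb/s: 10-bit 4:2:2 payload -> fits comfortably
```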
 

rnlval

Member
I get that, I'm asking where on earth that is stated? Every single press release or announcement states HDMI 2.1 and VRR go together and the official specifications for HDMI 2.1 - including all features such as VRR has been finalised. Sure, you can have a HDMI 2.1 connection without VRR, but VRR is still part of the HDMI 2.1 specification.

Hence VRR working the same across Nvidia, AMD, Intel, Samsung, LG, ASUS, Acer and Gigabyte.

Show me a reliable source (that is not a Sony marketing manager) that shows this is not the case.

HDMI VRR and FreeSync have different data formats. AMD's FreeSync is based on VESA's Adaptive-Sync implementation. Both AMD and NVIDIA support VESA's Adaptive-Sync implementation over DisplayPort.

For LG 27UL600
RTX 2080 Ti doesn't support FreeSync over HDMI port.
RTX 2080 Ti supports FreeSync over VESA's DisplayPort.
Ryzen 7 4750U's RX Vega iGPU supports FreeSync over HDMI port.

For LG NANO91
RTX 2080 Ti supports VRR HDMI 2.0b port
Ryzen 7 4750U's RX Vega iGPU supports VRR HDMI 2.0b port
Ryzen 7 4750U's RX Vega iGPU supports FreeSync Premium HDMI 2.0b port

Conclusion: HDMI VRR is different from FreeSync over HDMI.

I have LG NANO91 and RTX 3080 Ti as the HDMI 2.1 reference setup.

Both of my RTX 3080 Ti and RTX 2080 Ti are MSI Gaming X Trio AIB OC SKUs.
 