> Microsoft buying Quantic Dreams confirmed.

Next Wednesday!!
Nope. AI-enhanced software is something I have stated for the last 20 years will garner performance gains of up to 85% on this generation of hardware - so it's not an MS propaganda thing, more a computer scientist's rationale.
Both? Both have a geometry engine.
> Compared to RX 5700 XT, XSX GPU has the following...

Where would you run more wavefronts off-chip? Again, you are double counting things... you have the same L1 cache feeding 7 DCUs on one side and 5 DCUs on the other... period. You want more TFLOPS (more wavefronts/threads/operations) without additional Shader Engines, which have a bigger fixed cost hardware-wise? Sure, but the tradeoff is that the L1 cache is shared with more DCUs.
Compared to the RX 5700 XT, the XSX GPU has the following:
30% increase in DCUs
25% increase in memory bandwidth, i.e. 448 GB/s to 560 GB/s
25% increase in L2 cache, i.e. 4 MB to 5 MB
For a quick Gears 5 port, XSX rivals the RTX 2080, which is about 20% ahead of the RX 5700 XT's results. Being memory bandwidth bound is real.
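The percentage deltas above can be sanity-checked with a few lines of arithmetic (assuming the commonly cited figure of 26 DCUs for XSX versus 20 for the RX 5700 XT):

```python
# Rough sanity check of the XSX vs. RX 5700 XT deltas quoted above.
# The DCU counts (26 vs. 20) are the commonly cited figures.
xsx = {"DCUs": 26, "bandwidth GB/s": 560, "L2 MB": 5}
rx_5700_xt = {"DCUs": 20, "bandwidth GB/s": 448, "L2 MB": 4}

for key in xsx:
    gain = (xsx[key] / rx_5700_xt[key] - 1) * 100
    print(f"{key}: +{gain:.0f}%")
# DCUs: +30%, bandwidth GB/s: +25%, L2 MB: +25%
```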
> Wrong and extremely wrong, sir. It states in 7 books, across 13 books in totality, in the computer science curriculum that AI utilization will take this generation of hardware and increase its performance 135%.

You need dedicated hardware to support such an enhancement. You take a bunch of tech stuff heard here and there, and you come to an overoptimistic conclusion, mixing it all in the bag without rationality. It doesn't work like that.
> I also heard about machine unlearning, the technique 343i used for Halo Infinite, shown on Jul 23rd. OK, kidding aside, why haven't we seen any impressive XSX showcase yet?

They will. They were far behind in the SDK schedule. Honestly, I expect something in line with what we have seen or will see on PS5, maybe with some refinement in some tech areas.
> Wrong and extremely wrong, sir. It states in 7 books, across 13 books in totality, in the computer science curriculum that AI utilization will take this generation of hardware and increase its performance 135%.

Can you stop saying absurdities, please? You take what MS said as the holy bible; you mix a bunch of notions with a lot of confusion, if I may say. That's embarrassing. We are very lucky to get a 20-30% boost in hardware performance from software optimization, but an increase beyond that is just... how? Hardware has its physical limits.
As a computer scientist/someone that has finished all the curriculum (across 13 books of coursework), I find it pertinent to inform everyone here that software improvements bolstered by AI alone (machine learning) are due to deliver upwards of 305% performance gains with this generation's hardware, if utilized properly - with or without dedicated ML hardware (solely utilizing ML-optimized software).

And anyone that has read and understands the curriculum can easily vouch for that here.

We are essentially in for a "machine learning" revolution... nay... a quadrillion-fold quantum leap in compute... and multiple new paradigm shifts for computing, bolstered solely by machine-learning-infused software.

As anyone who has read the curriculum/passed the coursework will attest, it is blatantly plastered across 5 of 7 books. That paradigm begins with these consoles/GPUs/CPUs being released.

Particularly beginning in 2021.

Given that Microsoft has clearly stated it can dedicate a portion of its hardware to DirectML, a solution it created and hardcoded into DX12 Ultimate, I expect no less than an 86% performance increase when this feature is utilized properly.

With that said, I am also an avid transhumanist/singularitarian - also something Microsoft has nothing to do with.
Each DCU has a Local Data Share, which scales with DCU count. XSX's 26 DCUs of LDS / PS5's 18 DCUs of LDS = ~44% advantage for XSX. LOL
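Since LDS is a fixed per-DCU resource, the ~44% figure falls straight out of the DCU counts (26 and 18 are the counts claimed above):

```python
# Total LDS scales linearly with DCU count, so the advantage is just the ratio.
xsx_dcus, ps5_dcus = 26, 18
advantage_pct = (xsx_dcus / ps5_dcus - 1) * 100
print(f"~{advantage_pct:.0f}% more total LDS on XSX")
# ~44% more total LDS on XSX
```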
More DCUs provide the following:
1. more wave32 processing on the chip instead of outside the chip
2. more texture filter units
3. more texture load/store units
4. more L1 cache
5. more branch & message units
6. more scalar units
7. more RT function blocks (RDNA 2)
Trips to the external memory bus have a higher cost.
> Can you stop saying absurdities, please? You take what MS said as the holy bible. That's embarrassing.

Again, nothing I've stated in my previous post has anything to do with what "Microsoft" taught its fans; this is computer science curriculum, sir.
> As a computer scientist/someone that has finished all the curriculum (across 13 books of coursework), I find it pertinent to inform everyone here...

Is this some kind of NeoGAF meme that I don't know? Please, tell me this is irony and not lunacy...
> Again, nothing I've stated in my previous post has anything to do with what "Microsoft" taught its fans; this is computer science curriculum, sir.

Based on what, exactly? You read a bunch of books, so you think any hardware with a finger of machine learning in it can be pushed 80% and beyond just because it uses machine learning? That's crazy stuff.
These software optimizations will provide these performance benefits - eventually, for both consoles - and as I stated previously, anyone that has read the coursework will attest to an average performance increase of 105% due to MACHINE LEARNING. This is not science fiction and, as with all things computer science, will be proved science fact by 2021 and beyond.
> Is this some kind of NeoGAF meme that I don't know? Please, tell me this is irony and not lunacy.

Yes, only on NeoGAF would computer science - and it's in fact 13 books of coursework, if you also elect to take the philosophical courses - be scoffed at.
> Yes, only on NeoGAF would computer science - and it's in fact 13 books of coursework, if you also elect to take the philosophical courses - be scoffed at.

Unbelievable. So you are really serious.
Based on what, exactly?
Additionally, there is the small matter of XSX's RAM pool not being 560 GB/s in its entirety. There is the 336 GB/s portion sharing the same address space to consider. Not every game can stay within the 10 GB (CPU usage included).

Keep double counting things (higher bandwidth -> more MCs -> more L2), adding charts, etc... Beyond agreeing that it has more memory bandwidth, and also a higher TFLOPS rating to feed, I'm not sure we can do more than exchange platitudes and talk over each other all day. You know what is also a thing? Memory contention... one system is feeding more units off the same L1 cache than the other.

You can of course fall back on the bigger, shared L2, and if you can keep enough threads running on the chip you can hide the higher latency that the increased L1 cache misses will cause... but you are still giving a latency advantage to the GPU feeding a smaller number of CUs from the same L1 pool.
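For what it's worth, the split pool can be ballparked: XSX exposes 10 GB at 560 GB/s and 6 GB at 336 GB/s in one address space. Weighting naively by pool size (a crude assumption; real contention depends on which pool a game's hot data lands in) gives:

```python
# Naive blended bandwidth for XSX's split memory pool,
# weighting each pool by its share of the 16 GB address space.
fast_gb, fast_bw = 10, 560   # GPU-optimal pool
slow_gb, slow_bw = 6, 336    # standard pool (shared with CPU)
blended = (fast_gb * fast_bw + slow_gb * slow_bw) / (fast_gb + slow_gb)
print(f"blended: {blended:.0f} GB/s")
# blended: 476 GB/s
```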
> It's not computer science here. You mix a lot of notions without context; machine learning on Series X (as on PS5) is very limited. The hardware parts dedicated to it have to co-work on other stuff; they aren't completely dedicated to that scope. That's the double-edged sword. When a console has "true" dedicated machine learning hardware, we'll see how it ends up.

AGAIN - Computer Science.
But you know what, I'm sure others here will eventually chime in and be more than happy to politely attest to the fact that it states machine learning will clearly bolster hardware performance over 105%, across various computer science curricula.

And why wouldn't A.I. be able to achieve this, is the real question? Why is this concept of A.I. bolstering hardware performance through software optimization so hard to grasp?

Because you haven't heard of it?

Because sci-fi does not already illustrate massive advancement due to AI - without hammering in that all of it has been borrowed verbatim from computer science coursework?

That, to me, is almost as crazy as insinuating (factually stating, in this case) that machine learning will bolster future hardware over 300% through software optimization. Which it will.
> You can read 14 books, but nobody is seeing the results of those books in the posts you have made. You are not even quoting things out of context; you are just hinting at things you could quote out of context.

The results of "these books" are evident in every single piece of hardware you have ever owned since the pre-90's.
Look, I've told you all verbatim what this coursework teaches, offhand, as I am in fact a computer scientist - I don't need to reference literature I've slaved over already to produce correct responses here. But I've told you it is blatantly plastered across 5 of the 7 books on the subject, a subject extending to 13 books in totality. If you're that interested, look into the computer science curriculum and get set to be amazed at the future we are approaching. With that said, I promise others here will probably vouch that this data is correct - eventually - and I'd rather look to that response from other users to resolve your misgivings than needlessly try to prove I'm not lying.
Microsoft buying Quantic Dreams confirmed.
> Overall, I think that the pure analysis of the hardware shows an advantage for Microsoft, but experience tells us that hardware is only part of the equation: Sony showed in the past that their consoles could deliver the best-looking games because their architecture and software were usually very consistent and efficient. And this will repeat in the new gen.

Or what?
p-3dconv, particularly. But that isn't the only instance. That one's important, though: 135% performance uplift due to machine learning.
edit: should I have waited a bit here, so you all think I hit Google for that? I solved that in under a minute, which I hope clears up any misconceptions.
I also heard about machine unlearning, the technique 343i used for Halo Infinite, shown on Jul 23rd. OK, kidding aside, why haven't we seen any impressive XSX showcase yet? The hardware is there, easy to develop for, everything is ready in the hardware. Why didn't MS think about showing something really impressive on the XSX? This is a legitimate question, to say the least.
> I also heard about machine unlearning, the technique 343i used for Halo Infinite, shown on Jul 23rd. OK, kidding aside, why haven't we seen any impressive XSX showcase yet?

Well, MS did say they waited for all the RDNA 2 features to be completed by AMD, and their GDK dev kit wasn't complete until June.
> David Cage, CEO and founder of Quantic Dream, highlighted the Xbox Series X's shader cores as more suitable for machine learning tasks, which could allow the console to perform a DLSS-like performance-enhancing image reconstruction technique.
>
> Xbox Series X's Advantage Could Lie in Its Machine Learning-Powered Shader Cores, Says Quantic Dream
> Looking at the hardware of Xbox Series X and PS5, Quantic Dream believes Microsoft's advantage could lie in the console's ML-powered shaders. (wccftech.com)

Bbbut they're on the M$ payroll!!
> I'll take BS for $100, Alex...

I know, right? If I were a dev whose last 3 projects were Sony exclusives, I'd lie for MS too. It's not like I know more about system architecture than a GAF user.
> You need dedicated hardware to support such an enhancement. You take a bunch of tech stuff heard here and there, and you come to an enthusiast's conclusion, mixing it all in the bag without rationality. It doesn't work like that.

Considering they will have ML at launch, you're being disingenuous. It probably won't ever be upscaling like Nvidia's, but there will be other uses. If anyone can find a way to make ML upscaling more efficient, it will be Microsoft. I see it as a cool new feature that will get more use as the generation goes on. It's sad that it's being downplayed because Sony isn't talking it up like SSD SSD, Tempest Tempest, controller controller.
> I never heard that third parties are worried about the power difference between the two consoles.

It's the start of the next gen. There's a pandemic occurring, if you have not noticed; many devs are working from home in 2020. I expect MS first party to shine in 2022. You swear PS5 has multiple exclusives at launch that show off their hardware?

This is the third-party year. You're buying new consoles to play those games with better graphics.

Either way, PS5 first-party devs don't worry about the Series X power difference. It only matters to third-party devs.
> Considering they will have ML at launch, you're being disingenuous. It probably won't ever be upscaling like Nvidia's, but there will be other uses. If anyone can find a way to make ML upscaling more efficient, it will be Microsoft.

Machine learning is on PS5 too. Cerny mentioned it in the Road to PS5 talk. But I doubt you ever cared to know, when all you heard about PS5 was SSD, Tempest, or the controller... Neither machine will offer something significant in that sense; there are no exclusively dedicated hardware parts as with Nvidia.
> I'll take "I don't give a flying fuck what David Cage says or does" for $200, Alex.

I mean, you are here... in a David Cage thread, reacting to his quotes... giving away your fucks like they're going out of style.
I'll take "I don't give a flying fuck what David Cage says or does" for $200, Alex.
Truth is truth, fucks be damned
I'll take "someone reputable said something I don't like, therefore I'll act like it is not reliable info" for $1000, Alex!
Lmao, now who's being obtuse?
What's there to like or not to like?
Edit: Oh snap, looking at your post history you should be in my Hall of Fame. Congratulations, you just made it! Better late than never...
> Machine learning is on PS5 too. Cerny mentioned it in the Road to PS5 talk. Neither machine will offer something significant in that sense; there are no exclusively dedicated hardware parts as with Nvidia.

Exactly, it is a feature that will be used because both sides have it. That is why there should be no downplaying of it. Sony should be front and center with it to push it.
> Exactly, it is a feature that will be used because both sides have it. That is why there should be no downplaying of it. Sony should be front and center with it to push it.

In what way is it downplayed? We are talking about it. But let's not pretend it's something miraculous just because MS sells it that way. It's MS, if I'm not wrong, that started this meme of incredible and unprecedented achievements with ML in the new DirectX, but it's not even entirely hardware; there are a lot of ifs around its benefits.
> In what way is it downplayed? We are talking about it. But let's not pretend it's something miraculous just because MS sells it that way.

Miracle? Lol, they are selling it as a cool feature; see Auto HDR. You get too easily triggered by a company that will have 25-30% market share. Sony will still win with ease; no reason to be triggered.
> Miracle? Lol, they are selling it as a cool feature; see Auto HDR. You get too easily triggered by a company that will have 25-30% market share. Sony will still win with ease; no reason to be triggered.

Sure, convincing people they will get an 80% boost in performance is the right way to keep the userbase's attitude cool and genuine; I've seen the result. They are not new to such practices, like selling TF as the only factor by which to value console specs.
> Yeah, and I added my facial recognition humour just after...

Yes, P3MM is also scheduled for massive gains solely due to machine learning. Everyone expects caching to see large performance boosts, but not everyone believes ML software optimizations will yield large performance gains.