
The Last of Us 2 Is Targeting the Base PS4 Model, Not the Pro

ethomaz

Banned
You haven't even presented a counterpoint, that situation presents itself when you're wrong, buddy.

Here's the reality: this whole post not only definitively dispels your delusional notion of the weaker hardware limiting the stronger, it also dispels the idea that it has more to give.

Game Over, end of the road. I don't need to say anything else, this factually and entirely ends both discussions.
Again.

You repeat false info.
Xbox One X and Pro are underutilized hardware that will never live up to their potential because they are limited by the base hardware.

That won't change no matter how you wish or dream, because MS or Sony won't develop an exclusive game for these machines.
 
Again.

You repeat false info.
Xbox One X and Pro are underutilized hardware that will never live up to their potential because they are limited by the base hardware.
There's nothing false.

This is all facts. You're dead in the water.

Don't project the Pro's underwhelming showings relative to the base system onto the X as if they somehow share that.

We were talking about Horizon 4 before so let's continue that.

Better and/or only on the X beyond just resolution

B = Better than / O = Only on

  • (B) World lighting
  • (B) Ambient lighting (objects passing reflected light onto other objects)
  • (O) Starburst lensflares
  • (B) Shadow quality
  • (O) Dynamic shadow casting
  • (B) Reflections
  • (B) Motion blur rendering
  • (B) World LoD
  • (B) Car LoD
  • (B) Draw distance
  • (B) Foliage density
  • (B) Textures
  • (B) Texture blend shading (where surfaces merge)
  • (B) Crowd density
  • (O) Ambient occlusion

If you applied these details to the base system it would collapse the build: the framerate would buckle to single digits and it would in all likelihood crash. Even the 1080p 60 FPS mode on the X is graphically a step above the base system's settings.

In absolutely no way is the base system limiting. I don't know how many different ways it needs to be explained to you guys that developers are by no means limited by these weaker base consoles. The work they do on the more powerful ones can be, and is, taken and scaled back to run on the base. Forza Horizon 4 is by a considerable margin the best-looking racing game in existence (where applicable, X/PC), so to say it would look better if the base system weren't factored into development is totally asinine. The fact that they hit certain hardware limits, and that a higher ceiling for some things is reserved only for higher-level PCs, also dispels the notion that they could push harder on the X.

The hardware is tapped out, your ethos is ridiculous.
 
Again false info.

Ask any developer; they will be shocked lol
You have no counterargument.

This is done, peace.

Don't project the Pro's underwhelming showings relative to the base system onto the X as if they somehow share that.

We were talking about Horizon 4 before so let's continue that.

Better and/or only on the X beyond just resolution

B = Better than / O = Only on

  • (B) World lighting
  • (B) Ambient lighting (objects passing reflected light onto other objects)
  • (O) Starburst lensflares
  • (B) Shadow quality
  • (O) Dynamic shadow casting
  • (B) Reflections
  • (B) Motion blur rendering
  • (B) World LoD
  • (B) Car LoD
  • (B) Draw distance
  • (B) Foliage density
  • (B) Textures
  • (B) Texture blend shading (where surfaces merge)
  • (B) Crowd density
  • (O) Ambient occlusion

If you applied these details to the base system it would collapse the build: the framerate would buckle to single digits and it would in all likelihood crash. Even the 1080p 60 FPS mode on the X is graphically a step above the base system's settings.

In absolutely no way is the base system limiting. I don't know how many different ways it needs to be explained to you guys that developers are by no means limited by these weaker base consoles. The work they do on the more powerful ones can be, and is, taken and scaled back to run on the base. Forza Horizon 4 is by a considerable margin the best-looking racing game in existence (where applicable, X/PC), so to say it would look better if the base system weren't factored into development is totally asinine. The fact that they hit certain hardware limits, and that a higher ceiling for some things is reserved only for higher-level PCs, also dispels the notion that they could push harder on the X.

The hardware is tapped out, your ethos is ridiculous.
 

ethomaz

Banned
You have no counterargument.

This is done, peace.
I already gave you all the arguments, and most of your arguments just exemplify what I posted, working against you (Crysis, FH4, etc.).

When you realize how wrong you are it will be funny.

Xbox One X has huge potential that will never be used by any game because it is not the base hardware.

But it seems Xbox users tend to believe in secret sauce or magical coding.
 
I already gave you all the arguments, and most of your arguments just exemplify what I posted.
You haven't given me anything. You have no argument.

The X is totally tapped out and, on every fundamental rendering level, is doing things the base system could never handle.

Where do you have to go from there? Nowhere.
 

ethomaz

Banned
You haven't given me anything. You have no argument.

The X is totally tapped out and, on every fundamental rendering level, is doing things the base system could never handle.

Where do you have to go from there? Nowhere.
The X is underutilized because games are made to run on the Xbox One.

Any developer targeting only the X could do way better than anything you've seen until now.

That is basically why devs were mad about Lockhart, to the point that MS canned it.
 
This debate is pointless.

There are 2 ways to max out hardware (100% utilization).

Either you get a base console game and crank it up in terms of resolution and/or fps (that's what beefy PCs, mid-gen consoles and last-gen remasters do) or you make a more expensive game maxing out graphics at 1080p30 (that's what next-gen is supposed to do).

In both cases you will have 100% CPU/GPU usage (so technically both are maxed out), but only in the 2nd case you will have a next-gen game.

One example of this is comparing TLOU Remastered and Uncharted 4. Both of them max out the hardware, but they're not on the same level in terms of graphics fidelity, AI/NPC behavior etc.

To be fair, I don't think XB1X (let alone PS4 Pro) would be able to provide a next-gen leap just because it has a faster GPU. You also need a next-gen CPU (Zen 2), next-gen I/O storage (NVMe SSD) and last but not least, next-gen rendering techniques (RTX ON).
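The resolution/fps vs. graphics-budget trade-off described above can be put into rough numbers. A back-of-envelope sketch in Python; the throughput figure is invented purely for illustration and stands in for whatever a real GPU can actually shade per second:

```python
# Hypothetical fixed shading throughput; the number is made up for illustration.
GPU_PIXELS_PER_SECOND = 250_000_000

def per_pixel_budget(width, height, fps):
    # Work the GPU can afford per pixel, per frame, at a given target.
    return GPU_PIXELS_PER_SECOND / (width * height * fps)

budget_1080p30 = per_pixel_budget(1920, 1080, 30)
budget_4k30 = per_pixel_budget(3840, 2160, 30)

# Quadrupling the pixel count at the same fps leaves a quarter of the
# per-pixel budget: the same "100% usage" buys either more pixels or
# more expensive pixels, never both.
assert round(budget_1080p30 / budget_4k30, 6) == 4.0
```

Either target saturates the GPU, which is the point being made: 100% utilization by itself says nothing about which kind of game you spent it on.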
 

ethomaz

Banned
This debate is pointless.

There are 2 ways to max out hardware (100% utilization).

Either you get a base console game and crank it up in terms of resolution and/or fps (that's what beefy PCs, mid-gen consoles and last-gen remasters do) or you make a more expensive game maxing out graphics at 1080p30 (that's what next-gen is supposed to do).

In both cases you will have 100% CPU/GPU usage (so technically both are maxed out), but only in the 2nd case you will have a next-gen game.

One example of this is comparing TLOU Remastered and Uncharted 4. Both of them max out the hardware, but they're not on the same level in terms of graphics fidelity, AI/NPC behavior etc.

To be fair, I don't think XB1X (let alone PS4 Pro) would be able to provide a next-gen leap just because it has a faster GPU. You also need a next-gen CPU (Zen 2), next-gen I/O storage (NVMe SSD) and last but not least, next-gen rendering techniques (RTX ON).
I agree with you.

But "max out hardware (100%)" is a lazy term because it is impossible on any hardware.

If you do more optimizations, a better code path, etc., you will always get better results (performance or graphics)... so even if a game uses 100% of the CPU or GPU, it could still be far from using the full potential of that hardware, because you can always do better if you have time to make optimizations.

That is why the base hardware holds back the stronger hardware... you need to make sure the game runs on both machines, and you won't have the time (costs) to fully optimize for a single piece of hardware and get everything from it... you will maybe do optimizations to add some features or reach some level of performance... but never fully optimize it.

An increase in scale results in a decrease in optimization... the more scalable the code is, the less optimized it is for specific hardware... for example, every time you have to code a logical if to make your code scalable, you are adding overhead to the processing time.

So when you code directly for one specific piece of hardware you don't need 90% of the logic you write to make the code scalable, and that way you get much closer to the metal, without most of the overhead you have in a scalable system.

If a dev wished to make a game that runs only on Xbox One X, they could reach levels of optimization that would make you think they created a game for a new generation, not this one.

That is the main point.

Base hardware holds back mid-gen upgrades forever... there is no way to avoid that unless you create two separate code branches, one per hardware target, and optimize each for its own hardware... and that can exponentially increase the game's budget, and no publisher/developer will ever do that.

You know UE4 is available for almost every piece of hardware on the market... you still need specific code paths if you want the game to run even better on specific hardware... like a code path that runs faster than the default code UE4 uses for nVidia RTX... and that's without talking about the zillions of lines of code with ifs that UE4 uses to make the scalability work.

In the end... specific hardware optimizations are expensive.
UE4 scalability is a hell of a lot cheaper.
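The point about scalability ifs adding overhead can be sketched in Python. This is a toy illustration; the feature flags and shading functions are made up and stand in for real per-hardware code paths:

```python
def shade_scalable(pixels, hw):
    # One code path that must run everywhere: feature checks on every item.
    # These branches are the "logical ifs" that scalability costs you.
    out = []
    for p in pixels:
        v = p * 2
        if hw.get("hdr"):
            v += 1
        if hw.get("msaa"):
            v += 1
        out.append(v)
    return out

def shade_specialized(pixels):
    # Written for one known target (hdr + msaa always on): the feature
    # set is baked in, so there are no runtime branches at all.
    return [p * 2 + 2 for p in pixels]

pixels = list(range(1000))
target_hw = {"hdr": True, "msaa": True}

# Same results; the specialized path just skips all the scalability checks.
assert shade_scalable(pixels, target_hw) == shade_specialized(pixels)
```

The toy branches are cheap here, but the structural point carries: code that has to ask "what hardware am I on?" at every step cannot be as tight as code that already knows.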
 
This debate is pointless.

There are 2 ways to max out hardware (100% utilization).

Either you get a base console game and crank it up in terms of resolution and/or fps (that's what beefy PCs, mid-gen consoles and last-gen remasters do) or you make a more expensive game maxing out graphics at 1080p30 (that's what next-gen is supposed to do).

In both cases you will have 100% CPU/GPU usage (so technically both are maxed out), but only in the 2nd case you will have a next-gen game.

One example of this is comparing TLOU Remastered and Uncharted 4. Both of them max out the hardware, but they're not on the same level in terms of graphics fidelity, AI/NPC behavior etc.

To be fair, I don't think XB1X (let alone PS4 Pro) would be able to provide a next-gen leap just because it has a faster GPU. You also need a next-gen CPU (Zen 2), next-gen I/O storage (NVMe SSD) and last but not least, next-gen rendering techniques (RTX ON).
I appreciate you actually making a real post.

Here's the thing though, Uncharted 4 is exactly the same on the Pro vs. the base bar the resolution increase, as is Lost Legacy, and TLOU2 will likely be the same. They tapped out their system resources for rendering with resolution; the system has nothing more to give without causing net negative results such as impacting the frame rate.

For my example, Forza Horizon 4 on the X quadruples the resolution from 1920x1080 to 3840x2160 while maintaining the 30 FPS framerate, and it does all of this while also exceeding the base version's settings.
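The pixel arithmetic behind that claim is easy to verify:

```python
# 4K has exactly four times the pixels of 1080p, so holding 30 FPS at 4K
# means pushing 4x the pixels per frame in the same frame time.
base_pixels = 1920 * 1080   # 2,073,600
x_pixels = 3840 * 2160      # 8,294,400
assert x_pixels == 4 * base_pixels
```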

Better and/or only on the X beyond just resolution

B = Better than / O = Only on

  • (B) World lighting
  • (B) Ambient lighting (objects passing reflected light onto other objects)
  • (O) Starburst lensflares
  • (B) Shadow quality
  • (O) Dynamic shadow casting
  • (B) Reflections
  • (B) Motion blur rendering
  • (B) World LoD
  • (B) Car LoD
  • (B) Draw distance
  • (B) Foliage density
  • (B) Textures
  • (B) Texture blend shading (where surfaces merge)
  • (B) Crowd density
  • (O) Ambient occlusion

It runs graphical features that are not available on the base system at all, and it takes what is there and goes far beyond it, because the game was developed far beyond the base and then scaled down for it.

This is what happens when you develop for the top platform: you are able to take full advantage of it while scaling back the work done on it for the lesser system. Sony's work is the result of the opposite approach, taking the work done on the base system and simply increasing its resolution.

That other guy has a delusional notion that the base limits the top end. It doesn't, unless you're Sony, because they tackle things opposite to Microsoft.
 
But "max out hardware (100%)" is a lazy term because it is impossible on any hardware.

If you do more optimizations, a better code path, etc., you will always get better results (performance or graphics)... so even if a game uses 100% of the CPU or GPU, it could still be far from using the full potential of that hardware, because you can always do better if you have time to make optimizations.
Well, ND was the first in the industry to claim they achieved 100% SPU usage on Cell and it really showed on Uncharted 2 vs UC1 (only 30%).

Uncharted 3 wasn't a huge step up over UC2 and there's a reason for that. Yeah, they optimized some stuff (they had to for 3D), but 100% is 100%. Not a lot of juice left for another monumental UC1 vs UC2 leap.

Here's the thing though, Uncharted 4 is exactly the same on the Pro vs. the base bar the resolution increase, as is Lost Legacy, and TLOU2 will likely be the same.
They're not the same. Same for other PS4 exclusives.

For example, shadows have lower resolution on OG PS4. Lighting is also different. Detroit for example has lower-quality volumetric lighting on OG PS4 vs Pro.

For my example, Forza Horizon 4 on the X quadruples the resolution from 1920x1080 to 3840x2160 while maintaining the 30 FPS framerate, and it does all of this while also exceeding the base version's settings.
Would you say FH4 on XB1X constitutes a next-gen experience vs FH4 on OG XB1?

Also, how would FH4 look if it had a 1080p30 target on XB1X?

That's the thing, the more you lower the resolution/fps, the more budget you have to work with.

Personally, I value fps over graphics/resolution, but some people would rather have higher image quality. To each his own.
 

ethomaz

Banned
Well, ND was the first in the industry to claim they achieved 100% SPU usage on Cell and it really showed on Uncharted 2 vs UC1 (only 30%).

Uncharted 3 wasn't a huge step up over UC2 and there's a reason for that. Yeah, they optimized some stuff (they had to for 3D), but 100% is 100%. Not a lot of juice left for another monumental UC1 vs UC2 leap.


They're not the same. Same for other PS4 exclusives.

For example, shadows have lower resolution on OG PS4. Lighting is also different. Detroit for example has lower-quality volumetric lighting on OG PS4 vs Pro.


Would you say FH4 on XB1X constitutes a next-gen experience vs FH4 on OG XB1?

Also, how would FH4 look if it had a 1080p30 target on XB1X?

That's the thing, the more you lower the resolution/fps, the more budget you have to work with.

Personally, I value fps over graphics/resolution, but some people would rather have higher image quality. To each his own.
Games after UC3 used those 100% of the SPUs even better... TLOU, for example, from the same devs.
Optimization is a forever task... you will always get more using the same hardware.

I don't want to get into that FH4 debate, because it is clear that if Playground Games could work only on the Xbox One X, the game could deliver way more than what they did across multiple hardware targets.
 
Well, ND was the first in the industry to claim they achieved 100% SPU usage on Cell and it really showed on Uncharted 2 vs UC1 (only 30%).

Uncharted 3 wasn't a huge step up over UC2 and there's a reason for that. Yeah, they optimized some stuff (they had to for 3D), but 100% is 100%. Not a lot of juice left for another monumental UC1 vs UC2 leap.


They're not the same. Same for other PS4 exclusives.

For example, shadows have lower resolution on OG PS4. Lighting is also different. Detroit for example has lower-quality volumetric lighting on OG PS4 vs Pro.


Would you say FH4 on XB1X constitutes a next-gen experience vs FH4 on OG XB1?

Also, how would FH4 look if it had a 1080p30 target on XB1X?

That's the thing, the more you lower the resolution/fps, the more budget you have to work with.

Personally, I value fps over graphics/resolution, but some people would rather have higher image quality. To each his own.
The Sony status quo for the Pro has basically been "take X game, increase its resolution and/or checkerboard it". There are some edge cases, but that's been their MO for Pro work. Microsoft has gone a very different direction, focusing on producing defined graphical differences while also increasing the resolution.

No, it doesn't constitute a next-gen experience, but it does constitute a considerably superior experience within the confines of this generation. Why is that? Because they targeted the top platform during development to take full advantage of the graphical superiority it can offer, and then scaled back what needed to be scaled back for the base system.

Like I also said, Horizon 4 has a 60 FPS mode at 1080p, and even in this mode some of those extra graphical features seen on the X and not on the base carry over.
 
Games after UC3 used those 100% of the SPUs even better... TLOU, for example, from the same devs.
TLOU sacrificed graphics quality in some areas to offer bigger maps.

Just compare UC2 cars (Nepal) vs the cars on TLOU1. Huge downgrade, but it makes sense if you want bigger maps for exploration. Not to mention that UC2 cars could be exploded, so Cell had to do more physics calculations.

It's a zero-sum game after a certain point.

Optimization is a forever task... you will always get more using the same hardware.
I remember Santa Monica saying the same thing about GoW Ascension, but I don't think endless optimization after you've already hit 100% is going to yield GoW PS4 results. :)
 
TLOU sacrificed graphics quality in some areas to offer bigger maps.

Just compare UC2 cars (Nepal) vs the cars on TLOU1. Huge downgrade, but it makes sense if you want bigger maps for exploration. Not to mention that UC2 cars could be exploded, so Cell had to do more physics calculations.

It's a zero-sum game after a certain point.


I remember Santa Monica saying the same thing about GoW Ascension, but I don't think endless optimization after you've already hit 100% is going to yield GoW PS4 results. :)
Not to mention the PS3 was just a development minefield while x86 is a pretty straightforward pipeline for development. It's PC, we already know all the limitations especially with architectures which have existed for years, and years, and years.
 
Not to mention the PS3 was just a development minefield while x86 is a pretty straightforward pipeline for development. It's PC, we already know all the limitations especially with architectures which have existed for years, and years, and years.
The "Cell" (vector processor) of this generation is the Radeon GPGPU.

It's not easy to max it out (hence the stigma that GCN is less "efficient"), unless you have the proper tools/profilers (PIX for MS). MS did crazy stuff on Gears 5, including software raytracing. Cell could also do similar stuff.

Regarding the x86 CPU, multi-threading is not easy either. Maxing out 6-7 Jaguar cores requires some expertise. 2013-2014 PS4/XB1 games used old, single-threaded engines, which limited draw calls and the scope of those games.

Next-gen will still be x86/GCN-based, but it will also offer some paradigm shifts (such as deep learning acceleration on the Navi GPGPU). These paradigm shifts will take a while to become mainstream, they won't be here day 1. It's the same thing every gen, regardless of the hardware.
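The multi-threading point can be sketched in Python. The worker count and the "draw call" stand-in are made up; real engines record command buffers, not strings:

```python
from concurrent.futures import ThreadPoolExecutor

def record_draw_call(obj_id):
    # Stand-in for building one object's draw command.
    return f"cmd:{obj_id}"

def record_frame_single(objects):
    # The old single-threaded engine model: one core does everything.
    return [record_draw_call(o) for o in objects]

def record_frame_parallel(objects, workers=6):
    # Spreading submission across workers, loosely mirroring the 6-7
    # Jaguar cores a console game could use. map() preserves order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(record_draw_call, objects))

scene = list(range(100))
assert record_frame_single(scene) == record_frame_parallel(scene)
```

Getting real speedups this way is the hard part the post alludes to: the work has to be independent enough to split, which is why old single-threaded engines capped draw calls and game scope.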
 
The "Cell" (vector processor) of this generation is the Radeon GPGPU.

It's not easy to max it out (hence the stigma that GCN is less "efficient"), unless you have the proper tools/profilers (PIX for MS). MS did crazy stuff on Gears 5, including software raytracing. Cell could also do similar stuff.

Regarding the x86 CPU, multi-threading is not easy either. Maxing out 6-7 Jaguar cores requires some expertise. 2013-2014 PS4/XB1 games used old, single-threaded engines, which limited draw calls and the scope of those games.

Next-gen will still be x86/GCN-based, but it will also offer some paradigm shifts (such as deep learning acceleration on the Navi GPGPU). These paradigm shifts will take a while to become mainstream, they won't be here day 1. It's the same thing every gen, regardless of the hardware.
Next-gen is not GCN, it's RDNA.
 

DeepEnigma

Gold Member

But he's not though.



 
I read that the day it was released.

RDNA is still compatible with GCN assembly instructions. The Wave64 mode is telling enough.

An AMD engineer clarified it a long time ago:


nVidia has also adhered to the same RISC ISA since G80, otherwise CUDA would not offer BC. :)

edit: ninja'd by DeepEnigma :p
 
But he's not though.



You do realize that these consoles are using next-gen RDNA, correct, not 1.0?

It's not GCN ISA, it's not of the 5700 lineage we currently see in the discrete graphics market. What is hypothesized as Navi 20 is apparently going to be the basis for these consoles, which is also a true break in architecture.

You are wrong again... RDNA is GCN ISA.
And 99% sure next AMD's architectures will be too.
Stop trying to dogpile; you don't know anything. You're trying to piggyback on this conversation.
 

ethomaz

Banned
You do realize that these consoles are using next-gen RDNA, correct, not 1.0?

It's not GCN ISA, it's not of the 5700 lineage we currently see in the discrete graphics market.
What you call next-gen RDNA is GCN ISA too.

Stop trying to dogpile; you don't know anything. You're trying to piggyback on this conversation.
I will stop when you stop posting nonsense like "RDNA is not GCN ISA" lol
I can't dogpile you when you post correct things ;) it is that simple.
 

DeepEnigma

Gold Member
You do realize that these consoles are using next-gen RDNA, correct, not 1.0?

It's not GCN ISA, it's not of the 5700 lineage we currently see in the discrete graphics market.

I think you are confused about what the ISA and micro-architecture are.

The micro-architecture is more advanced, evolved, refined, efficient, whatever word you want to throw at it, but the ISA still remains the same for compatibility. You can still have the 1.25x+ performance increase of RDNA-on-GCN-ISA over GCN-on-GCN-ISA.

Just like the CPU example: Pentium, Celeron, Core i, etc. are all micro-architectures based on the core x86/64 ISA.
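The ISA-vs-micro-architecture distinction can be illustrated with a toy interpreter in Python. This is a loose analogy, not how GPUs actually work: the fixed instruction list plays the role of the ISA, and the two execution strategies play the role of different micro-architectures honoring it:

```python
# The "ISA": a fixed stack-machine instruction set that programs target.
PROGRAM = [("push", 2), ("push", 3), ("add", None), ("push", 4), ("mul", None)]

def run_simple(program):
    # "Micro-architecture A": a straightforward stack machine.
    stack = []
    for op, arg in program:
        if op == "push":
            stack.append(arg)
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack[-1]

def run_folded(program):
    # "Micro-architecture B": same ISA contract, different internals.
    # Here it pre-folds push/push/add into one push before executing.
    folded, i = [], 0
    while i < len(program):
        if (i + 2 < len(program)
                and program[i][0] == program[i + 1][0] == "push"
                and program[i + 2][0] == "add"):
            folded.append(("push", program[i][1] + program[i + 1][1]))
            i += 3
        else:
            folded.append(program[i])
            i += 1
    return run_simple(folded)

# Both "cores" must agree on every program, or the ISA contract is broken.
assert run_simple(PROGRAM) == run_folded(PROGRAM) == 20
```

That contract is why compatibility survives micro-architecture overhauls, in the same way a reworked execution core can keep honoring the same instruction set.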
 
I think you are confused with that the ISA and micro-architecture are.

The micro-architecture is more advanced, evolved, refined, efficient, whatever the word you want to throw at it, but the ISA still remains the same for compatibility. You can still have the same 1.25+ performance increase over GCN/GCN with RDNA/GCN.

Just like the CPU example. Pentium, Celeron, Core i, etc., are all micro-architectures based off the core x86/64 ISA.
I'm not confused, I understand what is being discussed here.

"The architecture features a new processor design, although the first details released at AMD's Computex keynote hints at aspects from the previous a Graphics Core Next architecture being present. It will feature multi-level cache hierarchy and an improved rendering pipeline, with support for GDDR6 memory. A completely redesigned (non-hybrid) architecture is planned as a successor (rumor)"

This is apparently Navi 20, next-gen, what will be the basis for these new consoles, incorporating RT and the like. It's supposed to be a complete break, but that doesn't mean compatibility will not be a factor.

I've kept up on all of this, I'm not flying blind pissing into the wind.
 

DeepEnigma

Gold Member
I'm not confused, I understand what is being discussed here.

"The architecture features a new processor design, although the first details released at AMD's Computex keynote hints at aspects from the previous a Graphics Core Next architecture being present. It will feature multi-level cache hierarchy and an improved rendering pipeline, with support for GDDR6 memory. A completely redesigned (non-hybrid) architecture is planned as a successor (rumor)"

This is apparently Navi 20, next-gen, what will be the basis for these new consoles, incorporating RT and the like. It's supposed to be a complete break, but that doesn't mean compatibility will not be a factor.

I've kept up on all of this, I'm not flying blind pissing into the wind.

We will see if that (rumor) pans out. What we do have, however, is out of the mouths of people who currently work on the architecture itself.

I bet it will still have GCN ISA in there in some form, just as nVidia is still using the same RISC ISA even in their RTX RT cards.
 
We will see if that (rumor) pans out. What we do have, however, is out of the mouths of people who currently work on the architecture itself.

I bet it will still have GCN ISA in there in some form, just as nVidia is still using the same RISC ISA even in their RTX RT cards.
We'll have to see I guess.
 

ethomaz

Banned
You know you can have a completely new break in architecture while using the same ISA.

RDNA is GCN ISA.

The successor will be GCN ISA too... there is no point in creating a new ISA (a la Itanium).
 

ethomaz

Banned
If we get a totally new ISA, then I will understand the lack of PS4 BC, otherwise I totally expect it.
If we get a totally new ISA, PS4 BC will be a minor concern... current PC OSes and drivers won't work, games won't work, apps won't work, APIs won't work, engines won't work, devtools won't work, etc., etc.

There is a reason Itanium died while x86 is already over 30 years old... software support.
 

Fahdis

Member
Yes I do. I think it's extremely shady to remaster games a year in.

It's not a remaster if it's BC. Wtf? I don't think you know how that works. I don't think there will be a Last of Us Part II remaster, because PS4 and PS5 will be on the same architecture. It will just be enhanced.
 

ethomaz

Banned
It's not a remaster if it's BC. Wtf? I don't think you know how that works. I don't think there will be a Last of Us Part II remaster, because PS4 and PS5 will be on the same architecture. It will just be enhanced.
I hope there is a remaster... they did a good job with the TLOU remaster.
 

TimFL

Member
It's not a remaster if it's BC. Wtf? I don't think you know how that works. I don't think there will be a Last of Us Part II remaster, because PS4 and PS5 will be on the same architecture. It will just be enhanced.
I was replying to the people saying they wait to buy the "definitive edition" on the PS5. I am in favor of playing PS4 titles on PS5 (with potential enhancements via patches or w/e), I am not in favor of them releasing a "remaster" with better framerate or graphics for PS5 (which they probably will do anyways).
 
I was replying to the people saying they wait to buy the "definitive edition" on the PS5. I am in favor of playing PS4 titles on PS5 (with potential enhancements via patches or w/e), I am not in favor of them releasing a "remaster" with better framerate or graphics for PS5 (which they probably will do anyways).
They should have just delayed it until the PS5 launch and done what Halo Infinite is doing and release as cross-gen. We all know a year on from its release they're going to hit everyone with a PS5 version, this is Sony we're talking about here.
 
They should have just delayed it until the PS5 launch and done what Halo Infinite is doing and release as cross-gen. We all know a year on from its release they're going to hit everyone with a PS5 version, this is Sony we're talking about here.
I don't think Sony can justify paid remasters anymore, since we're talking about the same ISA.

Both companies will offer free game updates to boost next-gen console sales.
 

meirl

Banned
They should have just delayed it until the PS5 launch and done what Halo Infinite is doing and release as cross-gen. We all know a year on from its release they're going to hit everyone with a PS5 version, this is Sony we're talking about here.

Yeah, definitely. There is really no good launch title announced... "Godfall"? Meh. And everyone will have already played it by then... so they should have waited.
 

sunnysideup

Banned
The Pro should be seen like a high-end DVD player. You get better fidelity, but essentially it's still a DVD player.

No one should target the Pro with specific development.
 

Trimesh

Banned
It was very hot that time. 'member? : )

TLOU was the game that finally killed my CECHA00, although it had been alarmingly noisy for some time before that and had suffered from occasional shutdowns, so I guess it was just a matter of time anyway. Even the (slim) replacement machine got pretty hot with the fans spooled up.
 