
Can we stop pretending switch docked mode is of secondary importance to portable play?

Is Switch more of a home console or a handheld?


Total voters: 376
The only time I undock my Switch is when I'm cleaning/dusting. Ever since the PSP/DS/smartphone era kicked off, I've felt like I fell through a wormhole into some alternate bizarro universe where everything is backwards and people generally prefer playing video games and watching movies/shows on tiny little handheld devices instead of on nice big widescreen TVs.
Never underestimate the convenience of being able to take your media consumption with you. If I had a nickel for every time I missed a show or didn't get to play games because somebody wanted to watch something...
 

Neff

Member
That the docking station is bundled in instead of being sold separately is your argument? Wow... By that ridiculous logic, a laptop bundled with a docking station and a BT controller would automatically become a desktop system. :messenger_tears_of_joy::messenger_tears_of_joy::messenger_tears_of_joy:
We're seriously getting into silly territory, as this goes beyond grasping at straws already.

Oh dear
 

Jsisto

Member
Can't remember the last time I played undocked. I pretty much exclusively play docked, but then, I also live alone and don't have to worry about sharing the TV. It's just nice to have options; that's what's so wonderful about the Switch.
 
That makes no sense and would defeat the entire purpose behind the Switch. The Lite is only aimed at people who want a cheap and/or handheld-only option. I prefer playing in handheld, and it makes way more sense to get the real Switch because you can still take that anywhere with you, and it has better battery life, which, y'know, is important for a handheld device.

The Switch Lite is more portable though. Without it, Nintendo wouldn't really have anything to compete with mobile phones.

The idea of the Switch, first and foremost, was to consolidate development time into a variable spec standard.
 

NinjaBoiX

Member
That makes no sense and would defeat the entire purpose behind the Switch. The Lite is only aimed at people who want a cheap and/or handheld-only option. I prefer playing in handheld, and it makes way more sense to get the real Switch because you can still take that anywhere with you, and it has better battery life, which, y'know, is important for a handheld device.
It makes way LESS sense to get the real switch if you only play handheld; it’s far less comfortable, a decent bit heavier and way more expensive.

Are you serious?
 

Alexios

Cores, shaders and BIOS oh my!
People here, as well as those at digital foundry, seem to call it a portable console as a way to deflect criticism from its docked performance.
You really made a thread just for that? Lol. Even if it wasn't at all portable, which it very much is whether it connects to a TV or not, people would still - not defend, but rather point out - that it's a tablet-sized little device topping out at 15W, and thus, yes, those ports would still be miracle ports. Get over it.
it’s in the name *Switch* and it isn’t “Nintendo’s handheld device that can also be played on the tv”.
Guess the Switch Lite also having it "in the name", and having all the same innards, doesn't occur to your logic either. Maybe you'd be fine with "people here, as well as those at digital foundry" if they were testing it on the Lite; you wouldn't whine about them not testing TV mode and would just accept it, yeah.
 

PaintTinJr

Member
PaintTinJr

I did find this DF video though, and the Wii U version has those frame pacing issues in split screen as well.



And on Switch, there are no frame pacing issues in DF's report; not only that, but he said the 3-player mode refreshes the same way on Wii U. Skip to 3:41.


I haven't watched those videos - other than the first 30 seconds of the first one - but I do remember the MK8 Wii U DF "analysis" where they presented a non-issue - claimed it dropped 1 frame when the source footage they provided showed unique histograms for each frame - just to give the game more coverage disguised as analysis.

Going by the footage, their analysis shows nothing because their methodology is flawed, and the quality of the racing isn't representative of real players pushing the systems.

As someone who owns both versions and consoles, and has been playing the Cube version for nearly 2 decades - with friends - the Wii U version of MK8 is the better driving experience in split screen with friends - and with AI enabled.

Looking at the Switch's inability to do the (Havok?) physics of the original SMB1 and SMB2, Deluxe or Wii Blitz mini games - like Monkey Target - in Banana Mania, which plays like the same Adobe Flash game as the ARM-powered DS and Vita versions - that is all down to the Tegra being too weak to even match a 400MHz POWER CPU in 4-player split.

Then we've got Splatoon 1 on the Wii U, which has the ability for 2 players to play online in splitscreen - and although the excuse when the mode was dropped for Splatoon 2 on Switch was that no one used it - it is clear that the system lacks the CPU capability to match the more powerful brawny processor, because CPU cores - with the exception of the Cell BE's SPUs - are limited by the effectiveness of the primary CPU core that continually supplies and coordinates the work.

Even on memory bandwidth - the only paper spec you are comparing - the Wii U system is better than the Switch. Paper specs don't translate between different architectures, and certainly not when comparing a brawny architecture designed for the highest-end business servers and then scaled down, as opposed to a puny architecture designed for smartphones. Just try comparing a Raspberry Pi 4 trying to run a Minecraft server against a Core 2 Duo Windows laptop.

The difference between architectures is maybe most easily seen in the supporting chipset features - because they give insight into what the CPU can really handle, as per Raspberry Pi 4 versus Core 2 Duo - and the faster memory bandwidth of the Wii U is a by-product of it being a stronger architecture. If the Wii U GPU was given the same advantages as the Switch's Tegra, the Wii U would be even more convincing as a home console than the Switch, just like the Wii and Cube still are, IMHO.
 
I picked up a Switch to replace the Wii U - 1080p/60 Hyrule Warriors was too good not to upgrade - so I see it as more of a home console myself; games also seem to control better when you move away from the Joy-Cons. The main obstacle at times, though, is the lack of decent docked profiles in a fair number of games; several I have look the same docked as in portable. I still use portable sometimes, mainly when on review duties, but I find it a little cumbersome for a handheld as it's quite bulky compared to the Vita and New 3DS XL I occasionally carry around instead (I can even fit one of those handhelds in the pouch of my Switch carry case).

To each his own tho, makes no difference to me how people use their Switch.
 
I haven't watched those videos - other than the first 30 seconds of the first one - but I do remember the MK8 Wii U DF "analysis" where they presented a non-issue - claimed it dropped 1 frame when the source footage they provided showed unique histograms for each frame - just to give the game more coverage disguised as analysis.

Going by the footage, their analysis shows nothing because their methodology is flawed, and the quality of the racing isn't representative of real players pushing the systems.

As someone who owns both versions and consoles, and has been playing the Cube version for nearly 2 decades - with friends - the Wii U version of MK8 is the better driving experience in split screen with friends - and with AI enabled.

Looking at the Switch's inability to do the (Havok?) physics of the original SMB1 and SMB2, Deluxe or Wii Blitz mini games - like Monkey Target - in Banana Mania, which plays like the same Adobe Flash game as the ARM-powered DS and Vita versions - that is all down to the Tegra being too weak to even match a 400MHz POWER CPU in 4-player split.

Then we've got Splatoon 1 on the Wii U, which has the ability for 2 players to play online in splitscreen - and although the excuse when the mode was dropped for Splatoon 2 on Switch was that no one used it - it is clear that the system lacks the CPU capability to match the more powerful brawny processor, because CPU cores - with the exception of the Cell BE's SPUs - are limited by the effectiveness of the primary CPU core that continually supplies and coordinates the work.

Even on memory bandwidth - the only paper spec you are comparing - the Wii U system is better than the Switch. Paper specs don't translate between different architectures, and certainly not when comparing a brawny architecture designed for the highest-end business servers and then scaled down, as opposed to a puny architecture designed for smartphones. Just try comparing a Raspberry Pi 4 trying to run a Minecraft server against a Core 2 Duo Windows laptop.

The difference between architectures is maybe most easily seen in the supporting chipset features - because they give insight into what the CPU can really handle, as per Raspberry Pi 4 versus Core 2 Duo - and the faster memory bandwidth of the Wii U is a by-product of it being a stronger architecture. If the Wii U GPU was given the same advantages as the Switch's Tegra, the Wii U would be even more convincing as a home console than the Switch, just like the Wii and Cube still are, IMHO.
Uh, ok. That 59 fps frame skip is a real issue on Wii U, and I played near 400 hours of that version. I don't know what else to tell you, but Switch is more powerful than Wii U, man. You can discuss that on beyond3d if you want, or ask the developers (though it sounds like you might not believe them, judging by your Splatoon comment).

I mean, we had developers like 4A Games of Metro: Last Light notoriously saying the Wii U CPU was *horribly* slow, and backing out of the deal to port the game because of that. Look that up if ya want. The Switch CPU is generally better than the Xbox 360's, which was absolutely more powerful than the Wii U's (CPU-wise). The Dynasty Warriors developer said Wii U was slower than the 360 CPU; I was there back then watching Wii U news. Look at how Ubisoft open-world games ran slower on Wii U vs 360. Check out DF vids. Why? CPU. Though I guess DF isn't credible to you lol, so *shrug*.

Speaking of Dynasty Warriors, Hyrule Warriors on Wii U couldn't even hold a steady 30fps, but the Switch version can run up to twice as fast and has a 60fps target. Fire Emblem Warriors runs at 60fps as well. Switch's better CPU can handle more enemies on screen; the games from Koei Tecmo, as well as their Wii U comments, empirically prove this.

But yeah, I was there, playing these games and hearing developers comment on Wii U specs, and your info is simply false.

The Wii U GPU is rated at 176 GFLOPS; Switch (while docked) is more than double that at 384, on a superior architecture. There's no grand conspiracy pulling the wool over your eyes with Switch's power; MK8 runs at 1080p vs 720p on Wii U, and I've never witnessed, nor heard of, the issues you claim, and I'm going to leave it at that.
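(For the curious: those GFLOPS figures follow from a simple shaders-times-clock calculation. A minimal sketch, assuming the commonly cited 160 shaders at 550MHz for Wii U's Latte and 256 CUDA cores at ~750MHz for the docked Switch; the clocks are approximations, not official specs.)

```python
# Rough FP32 throughput: cores x 2 ops/cycle (one fused multiply-add) x clock in GHz.
# Shader counts and clocks are the commonly cited figures, not confirmed specs.
def gflops(cores: int, clock_ghz: float) -> float:
    return cores * 2 * clock_ghz

print(f"Wii U Latte:   {gflops(160, 0.550):.0f} GFLOPS")  # ~176
print(f"Switch docked: {gflops(256, 0.750):.0f} GFLOPS")  # ~384 (~393 at the often-quoted 768MHz)
```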
 

PaintTinJr

Member
Uh, ok. That 59 fps frame skip is a real issue on Wii U, and I played near 400 hours of that version. I don't know what else to tell you,
Feel free to elaborate on how exactly you detected the drop and in which situation - although I was actually not talking about CPU weakness in regard to single-scene SP @ 60fps, but four-scene 4-player splitscreen @ 30fps (a 120fps workload), if you remember my original post in this thread.

but Switch is more powerful than Wii U, man. You can discuss that on beyond3d if you want, or ask the developers (though it sounds like you might not believe them, judging by your Splatoon comment).

I mean, we had developers like 4A Games of Metro: Last Light notoriously saying the Wii U CPU was *horribly* slow, and backing out of the deal to port the game because of that. Look that up if ya want.
The game isn't split screen, so you are conflating technical spec performance with how performant a CPU is at handling rapidly switching workloads; the guy who was the lead software architect on the PS5 tweeted in the build-up to the PS5 release about how difficult handling such small time-slices is for gaming.
But even then, the Switch has a huge GPU advantage to offload CPU work, because it is only rendering at 30fps, so latency is less of an issue. The Wii U could have taken a custom optimised version with massively degraded graphics, but the console wasn't a 90m-selling handheld hybrid with lots of 3rd party support - like all Nintendo handhelds get - and was instead a true Nintendo home console, the kind that gets poor support, as they have for years. The Wii U even had games like Odyssey cancelled and moved to the Switch, IIRC, because it was only really supported for 3 of its 4-year life, so it's hardly a surprise Metro and Project Cars never got compromised versions.

The Switch CPU is generally better than the Xbox 360's, which was absolutely more powerful than the Wii U's (CPU-wise). The Dynasty Warriors developer said Wii U was slower than the 360 CPU; I was there back then watching Wii U news. Look at how Ubisoft open-world games ran slower on Wii U vs 360. Check out DF vids. Why? CPU. Though I guess DF isn't credible to you lol, so *shrug*.
You are mixing things again. It isn't more performant at 120fps workloads than the 360 Xenon. That was a 2-way tri-core at 3.6GHz per core - the same family as the Cell PPU, all aspects of which even the PS4 struggled to handle in the PS4 port of PS3's Journey. Clock speed of a brawny CPU still makes a huge difference. But again, you are using SP games that don't rapidly switch workloads as your proof cases.
Speaking of Dynasty Warriors, Hyrule Warriors on Wii U couldn't even hold a steady 30fps, but the Switch version can run up to twice as fast and has a 60fps target. Fire Emblem Warriors runs at 60fps as well. Switch's better CPU can handle more enemies on screen; the games from Koei Tecmo, as well as their Wii U comments, empirically prove this.

But yeah, I was there, playing these games and hearing developers comment on Wii U specs, and your info is simply false.

The Wii U GPU is rated at 176 GFLOPS; Switch (while docked) is more than double that at 384, on a superior architecture. There's no grand conspiracy pulling the wool over your eyes with Switch's power; MK8 runs at 1080p vs 720p on Wii U, and I've never witnessed, nor heard of, the issues you claim, and I'm going to leave it at that.
Again, if you gave the WiiU the same GPU capabilities the weak POWER chip - by PC and home console standards - in the WiiU would deliver far more than the docked ARM Tegra in the Switch for splitscreen multiplayer. I had an early tegra based Sony Android tablet (the tablet S) and for general android apps the hardware was rubbish - even giving it consideration that the software could have been better.
 
Mate, you're not even getting basic info right. The 360 has 3 cores at 3.2GHz. Lots of wrong technical jargon here. I suggest taking this discussion to beyond3d.
 

Redneckerz

Those long posts don't cover that red neck boy
I like how the OP has refrained from understanding that the marketing focus of the Switch is on portable play, but seems content with anyone saying it's both a portable and a home console - which is literally what the Switch is, but that isn't discussed, lol.

the great thing is the Switch is whatever form factor you want it to be - handheld, console or tabletop.
But that's not enough for OP - he does not even emphasize this enough, I feel - for him, the Switch should be more about docked performance first for developers, and Nintendo should reflect that as well.

But that's literally not what the Switch is made for. Docked play is more an option given the hardware.
The Switch CPU is generally better than the Xbox 360's, which was absolutely more powerful than the Wii U's (CPU-wise). The Dynasty Warriors developer said Wii U was slower than the 360 CPU; I was there back then watching Wii U news. Look at how Ubisoft open-world games ran slower on Wii U vs 360. Check out DF vids. Why? CPU. Though I guess DF isn't credible to you lol, so *shrug*.
Whilst in general this is correct, it may also have to do with the fact that the GPU itself also isn't that powerful - a 160-shader part. Think of it as a Radeon HD 6450, but with DX10/OpenGL 3-like equivalents.

It's impressive on power draw, but 160 shaders only get you so far.

If the Wii U GPU was given the same advantages as the Switch's Tegra, the Wii U would be even more convincing as a home console than the Switch, just like the Wii and Cube still are, IMHO.
I don't find that believable - Latte is a TeraScale 1-derived product, and not even its DX11 incarnation, but more in line with Radeon HD 3000 and HD 4000. Basically, an HD 6450 shoehorned into the Radeon HD 4000 series. It's not even TeraScale 2-based, and that architecture was already on the market in 2009/2010.
You are mixing things again. It isn't more performant at 120fps workloads than the 360 Xenon. That was a 2-way tri-core at 3.6GHz per core - the same family as the Cell PPU, all aspects of which even the PS4 struggled to handle in the PS4 port of PS3's Journey. Clock speed of a brawny CPU still makes a huge difference. But again, you are using SP games that don't rapidly switch workloads as your proof cases.
No PS360 game to my knowledge runs at 120 fps, so I am at odds as to what you are on about.
I had an early Tegra-based Sony Android tablet (the Tablet S), and for general Android apps the hardware was rubbish - even allowing that the software could have been better.
Early Tegra (APX2600/Tegra 2) suffered because it didn't support ARM NEON - when every other Cortex-derived SoC did. That's why the hardware was rubbish - not because the GPU couldn't back it up.
 

Owari

Member
When I am playing video games on a TV, I prefer the resolution of said game to be relatively modern. The 720p average the docked Switch shits out is unacceptable for something considered a home console.
Switch is a handheld first. Let me know when they do the home console box that can hit the handheld performance metrics; then we'll talk.
 

PaintTinJr

Member
Mate, you're not even getting basic info right. The 360 has 3 cores at 3.2GHz. Lots of wrong technical jargon here. I suggest taking this discussion to beyond3d.
Yeah, one poorly remembered frequency still doesn't invalidate the point made.

Do you keep pushing beyond3d because you aren't making your own argument based on your own knowledge - and making a proxy argument from elsewhere?

I notice that you completely ignored the Core 2 Duo laptop versus Raspberry Pi 4 Minecraft server comparison I made - Linux and Java are heavily optimised for the Pi, with good multi-core support, and yet compared to a lowly old laptop it struggles with that CPU-intensive task. And you ignored the Monkey Ball mini game physics issues on every ARM version - which aren't present in the original on Cube, or on the PS2/original Xbox or Wii versions - where the common thread is brawny CPUs versus ARM.
 

PaintTinJr

Member
I don't find that believable - Latte is a TeraScale 1-derived product, and not even its DX11 incarnation, but more in line with Radeon HD 3000 and HD 4000. Basically, an HD 6450 shoehorned into the Radeon HD 4000 series. It's not even TeraScale 2-based, and that architecture was already on the market in 2009/2010.
The Switch doesn't render techniques that couldn't be done on first-release OpenCL-capable GPUs, so if performance was equal on the Wii U - for the GPU - I don't see why your comment would have any bearing on the point I made. Most of the Switch's lighting is prebaked.
No PS360 game to my knowledge runs at 120 fps, so I am at odds as to what you are on about.
I wasn't directly referencing 120fps at that point, just that the PS4 CPU, which is definitely superior to the Switch's, isn't able to usurp all aspects of the PPU in the PS3, which it shared with the 360 Xenon.
Early Tegra (APX2600/Tegra 2) suffered because it didn't support ARM NEON - when every other Cortex-derived SoC did. That's why the hardware was rubbish - not because the GPU couldn't back it up.
I was talking about the CPU being rubbish, rather than its GPU, but from what you said it probably wasn't a fair comparison I made between the Switch Tegra and the Tablet S CPU.
 

Redneckerz

Those long posts don't cover that red neck boy
The Switch doesn't render techniques that couldn't be done on first-release OpenCL-capable GPUs, so if performance was equal on the Wii U - for the GPU - I don't see why your comment would have any bearing on the point I made.
I do tend to believe Maxwell is more refined - purely on the notion that it's a more modern architecture (2015 vs 2008). That's 7 years of GPU architecture changes.
Most of the Switch's lighting is prebaked.
In what games? Because this can change heavily depending on the game - Dying Light is a deferred renderer, for instance, with lots of lights thrown around. They also use PBR (Alien: Isolation). Mostly, it's baked + dynamic light probes.
I wasn't directly referencing 120fps at that point, just that the PS4 CPU, which is definitely superior to the Switch's, isn't able to usurp all aspects of the PPU in the PS3, which it shared with the 360 Xenon.
PS3 and X360 CPUs have more raw gigaflops than PS4 or XBO, but their GPUs are now outdated. PS4/XBO have it the opposite way - low-power CPUs, more raw GPU power. And with AMD's so-called Fine Wine technology, the Radeon HD 7000-based hardware still delivers - Crysis 3 is still 1080p on 7-8 year old hardware.
I was talking about the CPU being rubbish, ratherthan its GPU, but by what you said it probably isn't a fair comparison I made between the Switch Tegra and Tablet S CPU.
It's not the fault of the Tegra GPU though - prior to Tegra K1, Nvidia used separate pixel/vertex shaders mimicking the GeForce 6 series. It delivered a lot of raw power, but everyone else had gone to unified shaders. What Tegra majorly lacked was NEON support - being a ubiquitous standard, it's still bizarre that Tegra 2 lacked it wholesale.
 
Yeah, one poorly remembered frequency still doesn't invalidate the point made.

Do you keep pushing beyond3d because you aren't making your own argument based on your own knowledge - and making a proxy argument from elsewhere?

I notice that you completely ignored the Core 2 Duo laptop versus Raspberry Pi 4 Minecraft server comparison I made - Linux and Java are heavily optimised for the Pi, with good multi-core support, and yet compared to a lowly old laptop it struggles with that CPU-intensive task. And you ignored the Monkey Ball mini game physics issues on every ARM version - which aren't present in the original on Cube, or on the PS2/original Xbox or Wii versions - where the common thread is brawny CPUs versus ARM.
So much of what you say is wrong, as well as you talking about things in Mario Kart that I've never witnessed nor heard about despite my many, many hours with both versions. I'm suggesting beyond3d to you so you may challenge your assertions with more people than just me, because it's tiring when you're this wrong and you're not going to listen to one GAF poster. Hell, even DF have useless tools according to you.

Like your statement on Wii U bandwidth: no, it isn't a "by-product of Wii U's architecture" regarding its GPU and CPU. Wii U's 32MB of eDRAM (it also has 3MB extra to emulate Wii) was a choice that could just as easily have been a solution with a unified pool of GDDR3 or 5. In addition, Wii U CAN provide more bandwidth than Switch, but only if a developer puts in the work to optimize for the eDRAM. Switch's single pool is twice as fast as Wii U's main pool, with superior color compression to boot. But Mario Kart 8 clearly got the most out of Wii U, so I would not be shocked if Switch dipped more in 4-player split screen, despite me not having personally witnessed these drops on Switch.

Switch has a 64-bit bus; if it had even a 96-bit bus, Wii U would have no advantage in any scenario, and with a 128-bit bus plus Maxwell color compression it would be in another class entirely.
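(Side note: the bus math behind those claims is easy to check. A sketch assuming the commonly cited memory configs - 64-bit LPDDR4 at 3200 MT/s for Switch, 64-bit DDR3 at 1600 MT/s for the Wii U's main pool; the Wii U's eDRAM and Maxwell's color compression sit on top of these raw numbers.)

```python
# Peak theoretical bandwidth = (bus width in bytes) x (transfer rate in MT/s).
# Memory configs are the commonly cited ones, not confirmed specs.
def bandwidth_gbps(bus_bits: int, mt_per_s: int) -> float:
    return bus_bits / 8 * mt_per_s / 1000  # GB/s

print(bandwidth_gbps(64, 1600))   # 12.8 - Wii U DDR3 main pool (eDRAM is separate)
print(bandwidth_gbps(64, 3200))   # 25.6 - Switch LPDDR4, twice the Wii U main pool
print(bandwidth_gbps(96, 3200))   # 38.4 - the hypothetical 96-bit bus mentioned above
print(bandwidth_gbps(128, 3200))  # 51.2 - the hypothetical 128-bit bus
```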
 

PaintTinJr

Member
I do tend to believe Maxwell is more refined - purely on the notion that it's a more modern architecture (2015 vs 2008). That's 7 years of GPU architecture changes.

In what games? Because this can change heavily depending on the game - Dying Light is a deferred renderer, for instance, with lots of lights thrown around. They also use PBR (Alien: Isolation). Mostly, it's baked + dynamic light probes.
But we aren't seeing techniques on Switch that wouldn't have worked on the PS3 - despite the age of the RSX and Cell BE - which, as you say, was largely deferred rendering with deferred lighting fx - so even if the Tegra X1 T210 of the Switch is more refined in that scenario, it wouldn't matter, because the base performance hit of rendering anything at the Switch's 1080p30 isn't leaving enough performance to move up even to XB1-level graphics; the techniques are all still just resolution/frame-rate improvements on Wii U visuals, IMHO.

And obviously deferred rendering doesn't fit with splitscreen rendering anyway, because you don't have time to defer 4 different frustums' passes looking in different directions - along with four times more CPU processing for user input, game logic, geometry culling, collision tests, physics and audio - within 33ms, so it is all forward rendering with largely pre-baked lighting in the splitscreen (MK8, Super Monkey Ball, Splatoon) situations I've been discussing.
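(To put rough numbers on that: the "120fps workload" framing is just the per-viewport budget. A sketch, assuming the four viewports' scene passes can't meaningfully overlap and each needs its own culling/logic/draw submission.)

```python
# Four independent viewports at 30fps: the CPU prepares four scene passes
# per ~33.3ms frame, i.e. the per-scene budget of a 120fps single-player game.
frame_ms = 1000 / 30         # ~33.3 ms total frame budget at 30fps
per_scene_ms = frame_ms / 4  # ~8.3 ms per viewport's pass
print(f"{per_scene_ms:.1f} ms/scene -> {1000 / per_scene_ms:.0f} scene passes per second")
```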
 

Belthazar

Member
Yup, I've played it handheld like twice since I've owned one. It's a home console to me, especially since I play using a DualShock 4.
 

PaintTinJr

Member
So much of what you say is wrong, as well as you talking about things in Mario Kart that I've never witnessed nor heard about despite my many, many hours with both versions. I'm suggesting beyond3d to you so you may challenge your assertions with more people than just me, because it's tiring when you're this wrong and you're not going to listen to one GAF poster. Hell, even DF have useless tools according to you.
If you were providing information - and it was correct - I'd happily accept your info as fact, but you keep telling me things without substantiation - seemingly because you don't want to accept that a cornerstone of a Nintendo home console - splitscreen multiplayer - is better served on the actual home console they released 4 years prior to their hybrid console's launch.

I asked you to elaborate on your asserted 59fps drops in MK8 on Wii U because I actually checked their video back in the day with a video editing package, histogram viewer on, stepped through their claimed frame-drop section, and got unique histogram graphs for every frame - concluding all frames were unique. And as you were so adamant that you can tell it is 59fps, I was happy to let you elaborate.
Like your statement on Wii U bandwidth: no, it isn't a "by-product of Wii U's architecture" regarding its GPU and CPU. Wii U's 32MB of eDRAM (it also has 3MB extra to emulate Wii) was a choice that could just as easily have been a solution with a unified pool of GDDR3 or 5.
Not according to the available info. The Wii U Espresso CPU is largely based on high-end PowerPC 750 tech like the Cube/Wii chips, but the eDRAM was one of the IBM Watson POWER7 design feature enhancements AFAIK - despite it not being a POWER7-designed chip - although it apparently has the POWER7 instruction set.

Compared to the Switch's Tegra X1 T210's 4x Cortex-A57 (the big core cluster), which is used when the Switch is docked (AFAIK), the Espresso has an extra 1MB of L2 cache - 3MB in total (512KB for the primary core dividing up the work, 2MB for the 2nd core, and another 512KB for the third core) - compared to the 2MB shared cache on the Cortex-A57.

From looking at the technical info available on both chips, the brawny POWER chip has far more advanced features for cache performance, as you'd expect from downscaled IBM server tech (as a by-product), which again explains a lot about why the Switch's improved GPU and additional RAM alone aren't enough to usurp the Wii U in splitscreen multiplayer.

If you need the by-product argument made more convincing, tell me why you can't buy an Intel X299 motherboard with the ability to use a Pentium Gold CPU in it. The answer is: the exact same reason. The chipset is designed to complement the CPU capabilities, so there's no point having quad-channel memory interfaces, abundant PCIe lanes, etc. if the CPU using the chipset lacks the design, cores and cache hierarchy to exploit those features.

In addition, Wii U CAN provide more bandwidth than Switch, but only if a developer puts in the work to optimize for the eDRAM. Switch's single pool is twice as fast as Wii U's main pool, with superior color compression to boot. But Mario Kart 8 clearly got the most out of Wii U, so I would not be shocked if Switch dipped more in 4-player split screen, despite me not having personally witnessed these drops on Switch.
That's why MK8 (and Splatoon 2) are quite revealing; they are marquee titles that get the best out of their consoles - with no expense spared on optimisation for either.
 

DarkestHour

Banned
The Switch Lite proves that it is nothing but a portable device that happens to run faster using the dock. End of story.

I want a Switch console.
 

Edgelord79

Gold Member
They don't just say "oh well, people buy this for a portable, so let's make the docked experience worse than it should be." That doesn't happen. Equal amounts of effort are being put into both versions and indeed, sometimes more so into the docked version, where they have more performance headroom.
Of course they don't try and make something worse, but they certainly know how their bread is buttered. Metrics drive everything now in companies and they may focus a little more on certain aspects versus others. Nothing is equal when it comes to money. All factors are weighed and considered.

Also, this random poll isn't really indicative of how the greater population uses their Switch.
 
Anecdotal, but in retail, every time I sold a Switch to a family who was asking about the Switch and Switch Lite, basically nobody wanted the Switch Lite. Most of the time the Switch Lite was bought as a cheaper option for people's young kids.

People don't realize that a lot of the Switch's success comes from families and friends wanting a Wii-like experience - playing games like Mario Party and Mario Kart on a big screen.

Initially in the thread I acknowledged that more people probably buy it for the handheld experience, as there is no 3DS anymore. But I think the gap is smaller than you think.
 

Artistic

Member
Think I'd still be more inclined to use it in portable mode if I had a regular Switch.

Guess both are equally important though.
 

Stoneberg7

Neo Member
sadly the build quality of these is miserable (like almost every controller Hori releases honestly) my left stick for example on these doesn't press LS anymore if I move it too far to the edge. and my right stick has drift upwards.
and the thing is, I maybe used these a total of 15h max. I played a little bit of Mario 3D World with these and I played through Metroid Dread with them, that's about it. so no excuse for them to be at that state with so little use... absolutely atrocious quality... like sadly almost every third party controller no matter how expensive or "premium", there is always something wrong with them -_-

this sucks especially in this case because Nintendo's official versions suck as well... so you are stuck between crap and crap here and you have to choose between which kind of crap you want -_-
The lack of rumble is what keeps me from using them more often. I know that it would likely be hell on the Switch's battery and the price of the cons would go up, but after years of controllers with rumble, it feels like playing with a Sixaxis controller.
 
There is no point in playing it on a big screen; it looks ugly as hell. With emulation on PC, the Switch is pretty much a handheld, and with the Steam Deck coming out, the Switch will be mighty outdated on every level.

So yeah, as of now I would just say it's a handheld, as that's the form factor it has. The video output to a screen doesn't change that.

Nope. The Steam Deck's display will look like poop compared to the Switch OLED.
 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
Of course it's primarily a handheld. You can literally buy a version that has zero docking capability and it's still called a Switch, and their latest hardware upgrade for an "actual" Switch has zero effect on the docked mode.

Look at the sales numbers between Nintendo's consoles and handhelds and tell me which market you think they're primarily targeting.
 
I love being able to swap back and forth, but the fact remains - the Switch is a very impressive handheld console, but a very unimpressive home console.

For that reason, I consider docked mode a very nice bonus, much like playing Vita through PS TV, PSP via video out or playing GameBoy Advance games on the TV through the GameCube adapter.
 

GymWolf

Member
It is secondary; it's the main reason why the Switch is a tablet and not a classic powerful home console.

And Switch games on a big 4K screen look like shit.
 

tygertrip

Member
I always imagine it's a Vizio or entry-level TCL/Hisense unless they tell me otherwise lol.

That said, 82 inches, dang that's huge. What model of Samsung do you have?
It's a QN82Q6FNA. The kids and I love it. Our couch is extremely easy to push further away or closer, so we can adjust for whatever we're using it for.
 

GeorgPrime

Banned
So, I just don't accept the narrative that the Switch is primarily a handheld device, because it obviously is not; it's in the name *Switch* and it isn't "Nintendo's handheld device that can also be played on the tv".

People here, as well as those at digital foundry, seem to call it a portable console as a way to deflect criticism from its docked performance.

Yet if we look at the launch of the console, we saw games made by Nintendo that prioritized use of the Joy-Cons on a home display, i.e. Arms and 1-2-Switch. The Joy-Cons themselves are primarily for home play.

If we look at Nintendo's main Mario games, there's enough evidence to reasonably claim Mario Odyssey and especially Bowser's Fury are designed around the docked experience first.

Bowser's Fury runs at 30fps in handheld mode, which is enough for me to say this mode was second in importance to the developers. This is Mario we're talking about; 60fps has been a franchise staple for the majority of the series.

Mario Odyssey does not run at the native resolution of the handheld, and is obviously using some shaky upscaling method from 480p or something; like they made these HD assets and just cut the game down for handheld mode.

So yeah, since day 1 and from Nintendo's own studios, docked play has not been second to portable play; it's in the NAME *Switch*, and many of us primarily play docked mode.

I just wanted to say this cute deflecting with "it's a portable console" as a defense of anything the Switch doesn't do so well is a cop-out and should be called out.

I love the Switch, and many games are beautiful and run smoothly. But the fact is it could have been a better experience with regard to docked play in a number of games. Just because there has been no alternative to handheld play all these years doesn't mean the Switch should be free from criticism in its docked mode.

Rant over.

Let me just say... if the Switch didn't have a docked mode and was portable only, I would never buy one.
 

Banjo64

cumsessed
sadly the build quality of these is miserable (like almost every controller Hori releases honestly) my left stick for example on these doesn't press LS anymore if I move it too far to the edge. and my right stick has drift upwards.
and the thing is, I maybe used these a total of 15h max. I played a little bit of Mario 3D World with these and I played through Metroid Dread with them, that's about it. so no excuse for them to be at that state with so little use... absolutely atrocious quality... like sadly almost every third party controller no matter how expensive or "premium", there is always something wrong with them -_-

this sucks especially in this case because Nintendo's official versions suck as well... so you are stuck between crap and crap here and you have to choose between which kind of crap you want -_-
I have an 8bitdo pro plus and it’s as good as any first party pad I’ve owned. I agree that pretty much every other third party has build quality issues though.
 

Redneckerz

Those long posts don't cover that red neck boy
But we aren't seeing techniques on Switch that wouldn't have worked on the PS3 - despite the age of the RSX and Cell BE - which, as you say, was largely deferred rendering with deferred lighting fx - so even if the Tegra X1 T210 of the Switch is more refined in that scenario, it wouldn't matter, because the base performance hit of rendering anything at the Switch's 1080p30 isn't leaving enough performance to move up even to XB1-level graphics; the techniques are all still just resolution/frame-rate improvements on Wii U visuals, IMHO.
The thing is this - Maxwell is a more modern part than Latte, which is rightly evident in the fact that it can use DX11, or Vulkan - Nintendo's NVN. I agree the Switch does not have the legs to match XB1 performance, but the GPU feature set is equivalent - hence why I say it's DX11-feature-set-equivalent with the other consoles.

Latte is a more modern part than Xenos or RSX and so it can use more modern effects, but you can't squeeze more out of 160 cores than what was seen on Wii U.
And obviously deferred rendering doesn't fit with splitscreen rendering anyway, because you don't have time to defer 4 different frustums' passes looking in different directions - along with four times more CPU processing for user input, game logic, geometry culling, collision tests, physics and audio - within 33ms, so it is all forward rendering with largely pre-baked lighting in the splitscreen (MK8, Super Monkey Ball, Splatoon) situations I've been discussing.
I am not sure how forward rendering and largely pre-baked lighting work for these games. These aren't general principles.
Not according to the available info. The Wii U Espresso CPU is largely based on high-end PowerPC 750 tech like the Cube/Wii chips, but the eDRAM was one of the IBM Watson POWER7 design feature enhancements AFAIK - despite it not being a POWER7-designed chip - although it apparently has the POWER7 instruction set.
This is simply untrue. Watson and Espresso have only a superficial relationship to each other - namely, multicore and cache support. Everything else is not true: Espresso is quite literally an IBM Gekko with 3 cores, first and foremost.
From looking at the technical info available on both chips, the brawny POWER chip has far more advanced features for cache performance, as you'd expect from downscaled IBM server tech (as a by-product), which again explains a lot about why the Switch's improved GPU and additional RAM alone aren't enough to usurp the Wii U in splitscreen multiplayer.
Are you looking at the developer SDK for that kind of info or?
 

PaintTinJr

Member
The thing is this - Maxwell is a more modern part than Latte, which is rightly evident in the fact that it can use DX11, or Vulkan - Nintendo's NVN. I agree the Switch does not have the legs to match XB1 performance, but the GPU feature set is equivalent - hence why I say it's DX11-feature-set-equivalent with the other consoles.

Latte is a more modern part than Xenos or RSX and so it can use more modern effects, but you can't squeeze more out of 160 cores than what was seen on Wii U.
Nintendo don't use Microsoft's DirectX; they use OpenGL (ES now) AFAIK, or list their hardware by OpenGL as reference, and have done since the GameCube. DirectX capabilities are always behind OpenGL extensions, so it has no bearing on the discussion; if performance was equal between the Wii U GPU and the Switch, the age of the desktop chipset versus the mobile Maxwell chipset would play no part in rendering 2006 PS360 techniques.
I am not sure how forward rendering and largely pre-baked lighting work for these games. These aren't general principles.
They are general principles - see latest Halo lighting as a modern example.

As you reduce the rendering window per frame, your bottleneck becomes the CPU cache when scaling resolution and fx on the GPU as needed. This issue was repeatedly quoted (a tweet from the PS5 lead software architect) in our NeoGAF next-gen thread - leading up to the PS5/XSX launches - around the difficulty of higher frame-rate rendering. More passes means more workload switching - something that forward rendering, with at most 2-pass lighting for shadows/cubemap reflections, keeps to a minimum - and workload switching stresses the caches far more, because every workload switch has another workload setup cost to amortise; even more so when the 120fps is four independent 1/4-resolution viewports, as you quadruple the non-redundant CPU workloads too.

This is simply untrue. Watson and Espresso have only a superficial relationship to each other - namely, multicore and cache support. Everything else is not true: Espresso is quite literally an IBM Gekko with 3 cores, first and foremost.

Are you looking at the developer SDK for that kind of info or?
No, I found out the CPUs from the Tegra X1 Wikipedia page and the Espresso wiki, then looked at Cortex-A57 benchmarks with regard to cache (three configs: 512KB, 1MB and 2MB), and the PowerPC 750 info from IBM. The Espresso info mentions the eDRAM/instruction set came from POWER7, with larger improved caches and multicore, and it lists the info as being from those reverse-engineering the Wii U (Hot Chips?), not from Nintendo.

If that public info is wrong, then my info by extension would be wrong.
 

Josemayuste

Member
But.. what if it is of secondary importance to me indeed.. because.. I already have got consoles to play on a big tv.. yeah..
 

Redneckerz

Those long posts don't cover that red neck boy
Nintendo don't use Microsoft's DirectX; they use OpenGL (ES now) AFAIK, or list their hardware by OpenGL as reference, and have done since the GameCube. DirectX capabilities are always behind OpenGL extensions, so it has no bearing on the discussion; if performance was equal between the Wii U GPU and the Switch, the age of the desktop chipset versus the mobile Maxwell chipset would play no part in rendering 2006 PS360 techniques.
*Sigh* See, this is what I find particularly annoying that I apparently have to do. I say DX11-equivalent feature set because I know they use native APIs - I use this term to describe a standard which all machines have (DX11-equivalent GPUs).

It is thus annoying that you ignore this and then proceed to state the obvious.

The rest of this part I cannot explain.

They are general principles - see latest Halo lighting as a modern example.
No, you don't understand, but I don't feel compelled to explain in full. Yes, most games have a mixture of baked/dynamic lighting, but not all games are the same.
No, I found out the CPUs from the Tegra X1 Wikipedia page and the Espresso wiki, then looked at Cortex-A57 benchmarks with regard to cache (three configs: 512KB, 1MB and 2MB), and the PowerPC 750 info from IBM. The Espresso info mentions the eDRAM/instruction set came from POWER7, with larger improved caches and multicore, and it lists the info as being from those reverse-engineering the Wii U (Hot Chips?), not from Nintendo.
Yes, and that's at best a superficial improvement. The Espresso cores are derived from Broadway and Gekko.
If that public info is wrong, then my info by extension would be wrong.
It is indeed. The GPU info I have is directly from the SDK, as well as other specifics.
 

PaintTinJr

Member
*Sigh* See, this is what I find particularly annoying that I apparently have to do. I say DX11-equivalent feature set because I know they use native APIs - I use this term to describe a standard which all machines have (DX11-equivalent GPUs).

It is thus annoying that you ignore this and then proceed to state the obvious.
I didn't ignore it; it's just that DX11 isn't the industry standard for graphics, no matter how Microsoft presents it. So if you are a developer with access to all of Nintendo's SDK documentation, why would you even mention DirectX?
The rest of this part I cannot explain.


No, you don't understand, but I don't feel compelled to explain in full. Yes, most games have a mixture of baked/dynamic lighting, but not all games are the same.
Yes, I didn't argue that point, but most high frame-rate games that scale well with hardware and resolution over time are forward renderers - unless Carmack's id Tech, which is used as the basis of many other shooters like Titanfall, was doing it wrong in your opinion? Moving into high frame-rate VR - again, where high frame rate is necessary - seems like a good fit for his skills, no?
Yes, and that's at best a superficial improvement. The Espresso cores are derived from Broadway and Gekko.

It is indeed. The GPU info I have is directly from the SDK, as well as other specifics.
Well, what part of the Cortex-A57 info for the Tegra X1 T210 is wrong - a bigger cache? - that would make it superior at high frame-rate 4-player splitscreen? And vice versa with the Wii U Espresso info: does it not have an extra megabyte of L2 cache over the T210, or higher bandwidth and lower latency via the eDRAM?
 
Oh yeah I forgot to mention the horrendous battery life. Play for 30 minutes and the battery is at 60%. Nope.
Which version do you have? The V2 runs like 5 hours on a charge - much better than the OG model, as it has a bigger battery. The OLED, I hear, is the same as the V2.
 
Nintendo does act like docked mode is of secondary importance. They released a portable-only version of their console, but no console-only version so far.

So the switch is mostly a portable console that can come with a dock that allows it to work on the TV.
 