
I just tried Metroid Prime 3 on the Wii using an old CRT TV and it blew my mind.

cireza

Banned
"Recommended" was not the right word. I should say "intended" maybe?

Developers knew 99% of consumers would play these games over RF or composite/regular SCART, so they made the games catering to that crowd. And they took advantage of all the quirks and artifacts these connections cause while they were at it. The Sonic waterfall has already been posted, but there are countless other examples that use dithering to fake extra colors, smoother gradients and transparencies. With a better connection you don't get these quirks, so you lose some of the intended effects.

So yeah, you gain some, you lose some. It's a "pick your poison" situation. Sure, I don't get as sharp an image, but I prefer playing pre-DC games on composite because I know I won't lose any of the quirks that complete the picture the artists intended.
In France at least, every single SEGA console was bundled with an RGB SCART cable, so I don't think many people here have seen the intended waterfall. But I agree with what you said, of course.
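(If anyone wants to see roughly how that dithering trick works, here's a quick sketch in Python; the neighbour-averaging step is just a crude stand-in for composite's limited chroma bandwidth, not a real NTSC model, and the colours are made up.)

    # Checkerboard dithering -> "fake transparency" once the signal blurs horizontally.
    import numpy as np

    fg = np.array([64, 128, 255], dtype=float)   # hypothetical "water" colour
    bg = np.array([200, 160, 80], dtype=float)   # hypothetical background colour

    # What the artist actually draws: an 8x8 checkerboard of the two colours.
    yy, xx = np.mgrid[0:8, 0:8]
    dither = np.where(((xx + yy) % 2)[..., None] == 0, fg, bg)

    # Crude stand-in for composite's horizontal colour blur: average each pixel with its neighbour.
    blurred = (dither + np.roll(dither, 1, axis=1)) / 2

    print(dither[0, :2])    # alternating pure colours, which is what RGB shows you
    print(blurred[0, :2])   # the 50/50 mix the artist was after, i.e. a "transparent" blend
    print((fg + bg) / 2)    # a colour that exists in neither palette entry

On RGB you see the raw checkerboard; on composite the signal (and your eye) gets something close to the averaged colour, which is the whole point of the trick.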
 

nkarafo

Member
In France at least, every single SEGA console was bundled with an RGB SCART cable, so I don't think many people here have seen the intended waterfall. But I agree with what you said, of course.
Was it an RGB SCART cable or a regular SCART? Or was it composite with a SCART adapter? Also, did all/most of the TVs have an RGB input?

I'm asking because I don't know/remember. I do remember our main TV in the late '80s had a SCART input, but it wasn't RGB. So if you connected an RGB SCART cable it would show the image, but it wouldn't be as sharp as on a proper RGB screen.

Or maybe I'm wrong and all SCART inputs supported RGB? That can't be, because our next TV had two SCART inputs and only one of them was RGB (it was labeled as such).

Anyway, I live in Greece, and during the 8/16-bit days consoles were bundled with RF. At least the ones I bought were (a Master System 2 and later a Mega Drive). In the PS1/N64 gen they were bundled with composite and a SCART adapter you plugged the composite cables into.
 
Last edited:

cireza

Banned
Was it an RGB SCART cable or a regular SCART? Or was it composite with a SCART adapter? Also, did all/most of the TVs have an RGB input?

I'm asking because I don't know/remember. I do remember our main TV in the late '80s had a SCART input, but it wasn't RGB. So if you connected an RGB SCART cable it would show the image, but it wouldn't be as sharp as on a proper RGB screen.

Or maybe I'm wrong and all SCART inputs supported RGB? That can't be, because our next TV had two SCART inputs and only one of them was RGB (it was labeled as such).

Anyway, I live in Greece, and during the 8/16-bit days consoles were bundled with RF. At least the ones I bought were (a Master System 2 and later a Mega Drive). In the PS1/N64 gen they were bundled with composite and a SCART adapter you plugged the composite cables into.
In France SEGA were including true SCART RGB cables with all consoles (not composite through SCART). They were the only manufacturer to do this.
 
Last edited:

Moonjt9

Member
I wonder if companies will ever produce CRTs again; even in short supply, like vinyl is nowadays, it would be awesome.
That's my wish. There is certainly a market for CRT televisions. Or maybe they could make improvements to the tech and modernize it. Until then I baby my PVM.
 
Yeah I had a Samsung HD CRT from Best Buy. It was ok but I upgraded to a plasma soon after and it was much better.
Plasma TVs at that time sucked ass big time. If you did not pay attention, the screen would get burned in in no time.

I had a big Sony flatscreen CRT with an RGB SCART cable. It was great, beautiful picture quality, one of the best TVs I ever had. But that thing was heavy as fuck; you needed 3 people to lift it up and put it in another place....😳
 
Last edited:
I'm asking because I don't know/remember. I do remember our main TV in the late '80s had a SCART input, but it wasn't RGB. So if you connected an RGB SCART cable it would show the image, but it wouldn't be as sharp as on a proper RGB screen.
Yeah, SCART was something akin to a "container": it supported composite, S-Video, RGB, component and sometimes even VGA with a progressive signal.

Of course, some TVs were wired differently to economize on traces, so some topped out at composite or S-Video signals. And most didn't support either component or VGA being passed through.

When RGB SCART cables fail to display RGB they usually still show a picture, because they use the composite channel as the sync channel.

It's a shame that SCART wasn't used in the USA; it was years ahead of S-Video. The '90s were a mess: the best TVs were Japanese or European thanks to the extra plugs, while the best consoles/game versions were American/Japanese. Only Japan got the best of both worlds.
Or maybe I'm wrong and all SCART inputs supported RGB? That can't be, because our next TV had two SCART inputs and only one of them was RGB (it was labeled as such).
You're not wrong. ;)
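(For context on the "container" point: the TV picks what to decode from the voltages on SCART's status pins. From memory, so double-check against the actual SCART/Peritel spec: pin 16 above roughly 1 V requests RGB, pin 8 at roughly 9.5-12 V selects 4:3 AV mode and roughly 5-8 V selects 16:9, and the RGB lines take their sync from the composite signal on pin 20. A rough Python sketch of that switching logic:)

    # Hedged sketch of a SCART TV's input switching; pin numbers and voltage thresholds
    # are from memory and approximate, so verify before relying on them.
    def scart_mode(pin8_volts: float, pin16_volts: float) -> str:
        if pin8_volts < 2.0:
            return "ignore the SCART input (stay on the tuner)"
        aspect = "16:9" if pin8_volts < 8.0 else "4:3"
        if pin16_volts >= 1.0:
            # RGB on pins 7/11/15, but sync still comes from the composite signal on pin 20,
            # which is why a "failed" RGB hookup usually falls back to a composite picture.
            return f"RGB, {aspect}"
        return f"composite (or S-Video, if the set is wired for it), {aspect}"

    print(scart_mode(12.0, 2.0))  # console with an RGB SCART cable -> RGB, 4:3
    print(scart_mode(12.0, 0.0))  # composite-through-SCART adapter -> composite, 4:3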
 
Last edited:

zeomax

Member
Last edited:
I'm listening???
Lol, sorry.

Well, I think it's great. Yeah, it's a little more streamlined than the first two, a little more condensed, but I think that works in its favor sometimes, esp. compared to Echoes. Bryyo and SkyTown, Elysia are great areas.

Also, the Wiimote controls, aside from a few gimmicky motions to take out the fuel batteries or pull a lever, were fantastic and made it feel so much better to play vs. the first 2. And yeah, the Prime Trilogy exists, but the controls still feel more engaging in 3.

The worst game Retro did has to be DK Country Returns. A good game, really good, but it feels safe. Tropical Freeze was noticeably better in every way, and I would still put it below Corruption on the whole. I love the DK games, but they're less grand experiences.
 

mcjmetroid

Member
I played it on Dolphin recently with Wiimote support.

Excellent game, but they need to lose that "Triforce hunt" endgame that's in all the games. There's no need for it. Echoes was by far the worst, but it's extra disappointing in 3 considering how well paced the rest of the game is; it brings the game to a grinding halt.
 
Plasma TVs at that time sucked ass big time. If you did not pay attention, the screen would get burned in in no time.
Note that Plasma burn-in is nothing like OLED burn-in.

On plasma, what we call burn-in is image persistence caused by over-excited phosphor maintaining a different (brighter) behaviour after being 100% lit for a while. Meaning it would actually go away eventually; it's just that it could take quite a while to do so.
That's my wish. There is certainly a market for CRT televisions. Or maybe they could make improvements to the tech and modernize it. Until then I baby my PVM.
It'll never happen at this point, but there were some technologies aside from plasma that could mimic CRT strengths. See SED, FED and LPD.

We were lucky we even got plasma.
Where? There are no "new" CRTs. The components inside undergo chemistry and can still degrade.
See Thomas Electronics. They manufacture and refurbish. They're obviously positioned for big clients like the army and big contracts (not your old television), but it can be done. They operate in America and Europe.

Some CRTs can be "rebuilt" by technicians and enthusiasts using rejuvenation tools (basically recalibrating and increasing voltage, much like you do with a failing laser on CD drives), and although in some cases that's a temporary fix, something like a PVM can last a while before it returns to the state it was in prior to the intervention. If there was enough interest, you could replace the electron guns and phosphor inside a cathode ray tube (which is probably what Thomas Electronics does).

They were manufactured in India and China at least until a few years back, perhaps also Russia; those guys still build ancient cars. There are advantages to old tech when it comes to low-scale production: no R&D, cheap manufacturing machinery and cheap maintenance of said equipment. The tech is certainly on the way out, but a niche market is a niche market. People paying a lot of money for some old CRT is what makes something like the RetroTINK exist, so I wouldn't be surprised if more CRT restoration facilities pop up or target more regular customers. Mass manufacturing is not possible unless they do what photo film manufacturers are doing and invest in R&D to come up with less-polluting processes.

Well, I think it's great. Yeah, it's a little more streamlined than the first two, a little more condensed, but I think that works in its favor sometimes, esp. compared to Echoes. Bryyo and SkyTown, Elysia are great areas.
Metroid Prime 3 is great, but it moves further away from what made Metroid Prime 1 so great: being alone in space, with organic transitions between areas. In a lot of ways it was going the way Skyward Sword went, before Skyward Sword existed.

It also invests a bit more in texture detail than geometry, which Metroid Prime used to amazing effect. So in some ways I think the artistic direction of Prime 1 feels more modern. Metroid Prime was all about geometric detail, like holes in walls and stuff, with 64x64 pixel textures. Metroid Prime 3 has some of that, yes, but it leans on a lot of 512x512 textures. Thing is, geometric detail looks awesome when you blow it up; 512x512 textures are huge for the Wii, but they're still blurry in HD. It looks positively amazing on a Wii plugged into a CRT, though.

Echoes is a motherfucker to get into, due to the game palette being very "dune-ish", and you can't stop playing until you finish or you'll be lost, but the level design is super tight.
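(Just to put rough numbers on the 64x64 vs 512x512 point; this assumes uncompressed 32-bit texels and a 4-bit block-compressed format like the GameCube/Wii hardware supports, so it's ballpark maths, not figures pulled from the actual game assets.)

    # Back-of-the-envelope texture sizes; illustrative only, not measured from the games.
    def tex_kib(width, height, bits_per_texel):
        return width * height * bits_per_texel / 8 / 1024

    print(tex_kib(64, 64, 32))     # 16.0 KiB   uncompressed 64x64 (Prime 1 style)
    print(tex_kib(512, 512, 32))   # 1024.0 KiB uncompressed 512x512 (Prime 3 style)
    print(tex_kib(512, 512, 4))    # 128.0 KiB  same texture in a 4-bit compressed format

With the Wii only having on the order of ~88 MB of main memory, a pile of 512x512 textures really is "huge" for that hardware, yet a 512x512 map still covers far fewer texels than a 1080p screen has pixels, which is why it reads as blurry in HD.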
 
Note that Plasma burn-in is nothing like OLED burn-in.

On plasma, what we call burn-in is image persistence caused by over-excited phosphor maintaining a different (brighter) behaviour after being 100% lit for a while. Meaning it would actually go away eventually; it's just that it could take quite a while to do so.

It'll never happen at this point, but there were some technologies aside from plasma that could mimic CRT strengths. See SED, FED and LPD.

We were lucky we even got plasma.

See Thomas Electronics. They manufacture and refurbish. They're obviously positioned for big clients like the army and big contracts (not your old television), but it can be done. They operate in America and Europe.

Some CRTs can be "rebuilt" by technicians and enthusiasts using rejuvenation tools (basically recalibrating and increasing voltage, much like you do with a failing laser on CD drives), and although in some cases that's a temporary fix, something like a PVM can last a while before it returns to the state it was in prior to the intervention. If there was enough interest, you could replace the electron guns and phosphor inside a cathode ray tube (which is probably what Thomas Electronics does).

They were manufactured in India and China at least until a few years back, perhaps also Russia; those guys still build ancient cars. There are advantages to old tech when it comes to low-scale production: no R&D, cheap manufacturing machinery and cheap maintenance of said equipment. The tech is certainly on the way out, but a niche market is a niche market. People paying a lot of money for some old CRT is what makes something like the RetroTINK exist, so I wouldn't be surprised if more CRT restoration facilities pop up or target more regular customers. Mass manufacturing is not possible unless they do what photo film manufacturers are doing and invest in R&D to come up with less-polluting processes.


Metroid Prime 3 is great, but it moves further away from what made Metroid Prime 1 so great: being alone in space, with organic transitions between areas. In a lot of ways it was going the way Skyward Sword went, before Skyward Sword existed.

It also invests a bit more in texture detail than geometry, which Metroid Prime used to amazing effect. So in some ways I think the artistic direction of Prime 1 feels more modern. Metroid Prime was all about geometric detail, like holes in walls and stuff, with 64x64 pixel textures. Metroid Prime 3 has some of that, yes, but it leans on a lot of 512x512 textures. Thing is, geometric detail looks awesome when you blow it up; 512x512 textures are huge for the Wii, but they're still blurry in HD. It looks positively amazing on a Wii plugged into a CRT, though.

Echoes is a motherfucker to get into, due to the game palette being very "dune-ish", and you can't stop playing until you finish or you'll be lost, but the level design is super tight.
That's interesting to hear about the geometry, because Prime 3 is denser than both 1 and 2.

I mean, just landing on Bryyo there are a lot of things jutting out of the walls... But yeah, maybe it's more emphasized in 1? I can't remember.
 

Dorago

Member
I wonder if companies will ever produce CRTs again; even in short supply, like vinyl is nowadays, it would be awesome.
You need the tubes with the phosphor coating and mask. You need the electron gun. Everything else is off the shelf components or at least close enough. The indie guys making their own 8K panels show the way I think.
 
PS2/GC/Xbox/Wii are all SD devices. They don't benefit from HD displays; in fact, HD displays will make them look worse.

But this isn't even about resolution. It's about motion clarity. Even a modern game will have much better motion clarity on a CRT.
Dreamcast too: the output through VGA on a CRT is glorious. The only way to play it (and the way it was meant to anyway).

I miss my CRT. I probably should buy one.
 
That's interesting to hear about the geometry, because Prime 3 is denser than both 1 and 2.

I mean, just landing on Bryyo there are a lot of things jutting out of the walls... But yeah, maybe it's more emphasized in 1? I can't remember.
More emphasized, yes. But what I said was actually sourced from an interview they did back then that stuck with me.

Metroid Prime 3 Art Director Todd Keller: "You can ask any artist here what the first 'Prime' was about and they'll say cracks. All we did was put thousands of cracks everywhere. For some reason at the time I was real big into cracks and everything had to be beveled. Every crack was custom. There is not one crack that was copied around. I made them chop up everything. We chopped up every stone that was unique on the game. Every pebble." So what is "Prime 3" about? "Texture detail." Keller said that in the Wii game, every texture -- the flat pieces of art that coat every side of every figure, object and piece of terrain in any 3-D game -- was handmade, ideally at 512 pixels wide, double the resolution of textures in the two earlier "Prime" games.

"All those mushrooms in the Bryyo world -- those big, spiky mushrooms -- those aren't really copied around at all," Keller said. "They're kind of made new each time, just because we didn't want the mushrooms to be similar. We would take all the vertices and move them different places or extrude, pull out new polygons to make the mushrooms fit into the hill differently. When we get closer to [the toxic energy called] Phazon they are a little more corrupted."
What his team did for mushrooms they tried to do for everything in the game, from the enemies to the chambers on each of the game's planets. "Our focus is to make every room its own custom stage," Keller said. "We think it's up to us to present something that's very high-quality for the player to enjoy. We don't want to copy our own rooms or textures if we can. Because we want it to be new for you every time you walk in one of those doors."
Source: https://www.mtv.com/news/y2lcs9/met...ushed-wii-graphics-and-that-famous-controller

Of course, polygon counts didn't go down. Metroid Prime 3 still has cracks, and if Metroid Prime was a 10-million-polygons-per-second game, I'm sure Metroid Prime 3 is pushing more polygons than that (doors are rounder, everything is higher-poly), but surfaces are flatter and the game is cleaner. The "detail" is in having more objects, rounder shapes and better textures.

It's just that it still feels cleaner, less organic to me.
You need the tubes with the phosphor coating and mask. You need the electron gun. Everything else is off the shelf components or at least close enough. The indie guys making their own 8K panels show the way I think.
Not aware of that but seems interesting. Can you hook me up?
 
Last edited:
I truly miss CRTs. I still have a Trinitron in storage but even though it’s a small 13” I just don’t have room for it right now so stuff like the GameCube is hooked up via Kaico and my BC PS3 handles all old PlayStation.

The biggest travesty though is the abandonment of SED displays.

Seriously look up SED if you don’t know what it is.

 

Kenpachii

Member
Played Counter-Strike on CRTs back in the day, it was so good. Then I moved to LCD and couldn't even play shooters anymore; the motion blur was just horrible.
 

Type_Raver

Member
Yeah, i wasn't expecting it to be that good. I know old CRTs have some qualities that modern panels haven't reached yet, like the complete lack of motion blur because they don't use the "sample and hold" method modern panels use.

So I knew it was going to look sharp in motion and all that. But I wasn't expecting such a huge difference. And this comes from someone who has a 240Hz PC monitor and regularly plays some older games at 240fps. On modern panels, the more fps you get, the less motion blur there is. At 240Hz/fps, the motion is very clear compared to the 60Hz/fps blur fest. But even that doesn't come close to the clarity of the CRT, despite it only being 60Hz/fps (the game runs at 60fps). It looks so sharp in motion and everything feels like it has more depth. It's a feast for the eyes, and that's with a crappy composite cable.

Younger people who never played on a CRT don't know what they are missing and this is kind of a blessing. Because once you see how much more clarity CRTs offer, it's hard to go back and not think about it. It's like display technology took a few steps forward and a massive step back. That's without even mentioning the lower input lag a CRT offers as well.

Now, I get that the Wii on that CRT only outputs a 480i image. Yet my eyes felt better than when playing a game on my 2020 Samsung LCD at 4K, or my 1080p Alienware 240Hz PC monitor. It was better than playing the same game upscaled on the Dolphin emulator. Yes, modern games look amazing and very detailed on modern panels when they don't move. But when they start moving, everything goes down the drain. So, if I had to choose between lower resolution/size plus CRT clarity vs. the best-quality still images, I would choose the CRT. And if I needed higher resolution, a PC CRT monitor would be even better. But I find it's too difficult to connect a modern PC/console to an old 480i CRT. Has anyone ever done this? I do have some CRT PC monitors, but all of them have issues so I can't use them.
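(Rough numbers on why sample-and-hold blurs so much more than a CRT, using the usual approximation that perceived blur while eye-tracking ≈ scroll speed × the time each frame stays lit; the persistence figures below are ballpark assumptions, not measurements.)

    # Perceived motion blur while eye-tracking, approximated as speed * persistence.
    def blur_px(speed_px_per_s, persistence_ms):
        return speed_px_per_s * persistence_ms / 1000.0

    speed = 960  # px/s, e.g. a pan crossing a 1920-wide screen in two seconds

    print(blur_px(speed, 1000 / 60))   # ~16 px : 60 Hz sample-and-hold LCD/OLED
    print(blur_px(speed, 1000 / 240))  # ~4 px  : 240 Hz sample-and-hold
    print(blur_px(speed, 1.5))         # ~1.4 px: CRT, assuming ~1-2 ms of phosphor glow

Which lines up with the post above: even a 240Hz sample-and-hold panel still smears more than a 60Hz CRT does.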

Originally, I hooked up my Wii to a CRT monitor (using a 3rd-party Wii HDMI dongle into an HDMI-to-VGA adapter) and it was amazing to see it in motion! Subsequently, I've now got my Wii U hooked up instead (as it has my eShop transfers on it). I've also got my Switch connected to it (outputting 720p) and I'm absolutely enjoying playing Xenoblade Chronicles 3 this way!

What I wanted to ask though, has anyone successfully plugged a SNES Mini Classic to a CRT TV?

It's so convenient having all the games on one console with an original SNES pad, instead of hunting down expensive SNES carts.
 

StreetsofBeige

Gold Member
Then I moved to LCD and couldn't even play shooters anymore; the motion blur was just horrible.
A shame LCDs kicked plasma to the curb. During that early HD era, I jumped from a CRT to a small 42" Panasonic plasma. This was somewhere around 2007, I think. Nice picture quality. At the time, LCD TVs had terrible ghosting when 8ms response times were the norm, but everyone was still amped up about getting these shit TVs even though you could see the ghosting on these 720p models. I guess the marketing worked, where plasma's big cons were burn-in and being heavy as hell. I don't remember the price, so maybe plasmas were way more expensive for a comparable size? I don't remember. Maybe that was a killer factor.

I then got a bigger Panasonic plasma around 2011. Never had burn-in. And I'd leave my TV on for hours after dinner if I was surfing the net or watching sports where there'd be a fixed UI on screen. I agree plasmas are heavy, as I always made sure my brother or a buddy helped me pick up the 60" TV from the store or put it in the right spot when I moved, but there were times I said fuck it and just heaved it myself onto the metal stand. Heavy, yes, but it's not like anyone is going to be moving a TV around the house every week, so who cares. I must have left that 60" TV in the same spot on the TV stand for 7 years once it was all set in place.
 
Last edited:

Hoddi

Member
What I wanted to ask though, has anyone successfully plugged a SNES Mini Classic to a CRT TV?
I don’t think that’s really feasible without losing 240p support. I’ve never seen an HDMI adapter that supports 240p, in any case.

I think your much better bet would be getting a Raspberry Pi since the SNES Mini is just using emulation anyway. You can get native composite output from the Pi at 240p or even use S-video or RGB SCART if your TV supports them. It’s basically the same thing but with much better output options.
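(If you go the Pi route, composite output is switched on in config.txt; the options below are the standard Raspberry Pi firmware ones, but treat the values as a starting point, and note that true 240p rather than interlaced output usually needs extra per-emulator video-mode setup that I won't guess at here.)

    # /boot/config.txt (excerpt) - enable the composite ("TV") output
    enable_tvout=1     # needed on the Pi 4, where composite is off by default
    sdtv_mode=0        # 0 = NTSC, 2 = PAL
    sdtv_aspect=1      # 1 = 4:3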
 

Hoddi

Member
A shame LCDs kicked plasma to the curb. During that early HD era, I jumped from a CRT to a small 42" Panasonic plasma. This was somewhere around 2007, I think. Nice picture quality. At the time, LCD TVs had terrible ghosting when 8ms response times were the norm, but everyone was still amped up about getting these shit TVs even though you could see the ghosting on these 720p models. I guess the marketing worked, where plasma's big cons were burn-in and being heavy as hell. I don't remember the price, so maybe plasmas were way more expensive for a comparable size? I don't remember. Maybe that was a killer factor.

I then got a bigger Panasonic plasma around 2011. Never had burn-in. And I'd leave my TV on for hours after dinner if I was surfing the net or watching sports where there'd be a fixed UI on screen. I agree plasmas are heavy, as I always made sure my brother or a buddy helped me pick up the 60" TV from the store or put it in the right spot when I moved, but there were times I said fuck it and just heaved it myself onto the metal stand. Heavy, yes, but it's not like anyone is going to be moving a TV around the house every week, so who cares. I must have left that 60" TV in the same spot on the TV stand for 7 years once it was all set in place.
I’m still in the plasma holdout camp. I upgraded to OLED in 2019 and brought my old 2010 plasma into the home office. I mostly just thought I’d use it with my PS3 and Wii U but my PS4 Pro only lasted a few months on the new TV before I brought it into the office as well.

I currently have my PS3, PS5, Wii U, and Series S all plugged into the plasma. I also have an SX plugged into the OLED but I basically never use it. 4k HDR is certainly nice enough but not enough for me to forgo the plasma.

Needless to say, I’m still pretty mad about the whole ‘LED TV’ marketing bullcrap that killed plasmas. Like changing the backlight would make LCDs any less crap.
 
A shame LCDs kicked plasma to the curb. During that early HD era, I jumped from a CRT to a small 42" Panasonic plasma. This was somewhere around 2007, I think. Nice picture quality. At the time, LCD TVs had terrible ghosting when 8ms response times were the norm, but everyone was still amped up about getting these shit TVs even though you could see the ghosting on these 720p models. I guess the marketing worked, where plasma's big cons were burn-in and being heavy as hell. I don't remember the price, so maybe plasmas were way more expensive for a comparable size? I don't remember. Maybe that was a killer factor.
Power consumption and heat generation were two more negatives.
Plus, plasma generally couldn't get quite as bright as LCDs, was a bit more expensive, and had fewer models to choose from.

So to the layman browsing in a store, you could either buy the heavier, more expensive, costlier-to-run, dimmer, hotter plasma (& you get to worry about burn-in too), or... you could buy the "superior" LCD, which is what "everyone else is buying" too.
 

Type_Raver

Member
I don’t think that’s really feasible without losing 240p support. I’ve never seen an HDMI adapter that supports 240p, in any case.

I think your much better bet would be getting a Raspberry Pi since the SNES Mini is just using emulation anyway. You can get native composite output from the Pi at 240p or even use S-video or RGB SCART if your TV supports them. It’s basically the same thing but with much better output options.
Since posting my question, this is what I've been considering too.

Thanks for your reply.
 

NeoIkaruGAF

Gold Member
Note that Plasma burn-in is nothing like OLED burn-in.

On plasma, what we call burn-in is image persistence caused by over-excited phosphor maintaining a different (brighter) behaviour after being 100% lit for a while. Meaning it would actually go away eventually; it's just that it could take quite a while to do so.
In my own experience, retention on plasma was very real, and very scary.
TV channel logos and game HUDs would get stamped on the screen in a matter of minutes (if pure white) to a couple of hours, and some would remain faintly visible for weeks after you stopped actively displaying them. Temporary retention was guaranteed if you spent just an hour on a single video game. I would always, always leave the TV on an empty channel with static noise for at least 15 minutes before turning it off after playing a game.

With OLED it’s very different. Only pure white static elements persist onscreen, and it takes longer than on plasma. They go away quickly and easily. I played Elden Ring for 190 hours and that compass that’s always onscreen didn’t leave a trace behind.



What I wanted to ask though, has anyone successfully plugged a SNES Mini Classic to a CRT TV?
I used a very cheap HDMI-to-SCART adapter from Amazon.
The problem with the Classic Mini consoles is they always display a 16:9 screen with borders. So on a 4:3 TV, the actual game’s screen is squashed. A pity, because even if it isn’t 240p, it’s still interesting to see the results.

The Switch instead, it switches (heh) to 4:3 automatically and it’s oddly satisfying to see a 1080p game like MK8 running on an analog screen. Motion is absolutely flawless.
 

small_law

Member
It's been a pain in the ass, but I've hung on to my 27-in Sony Wega CRT with a flat display that I bought in 2001. It's heavier than the sun and I've moved it six or seven times now. Every time I think about getting rid of it, I read a post like this and I'm very glad I hung onto it.
 

Hoddi

Member
In my own experience, retention on plasma was very real, and very scary.
TV channel logos and game HUDs would get stamped on the screen in a matter of minutes (if pure white) to a couple of hours, and some would remain faintly visible for weeks after you stopped actively displaying them. Temporary retention was guaranteed if you spent just an hour on a single video game. I would always, always leave the TV on an empty channel with static noise for at least 15 minutes before turning it off after playing a game.
Ya, I think it largely depended on the use case. My 2010 Panasonic is still free of burn-in but I only ever used it for films and games. My friend’s 2013 plasma got badly burned in (in half the time) because he also used it a lot for TV broadcasts and football etc.

My OLED is similarly still fine and probably for the same reason. It’s mostly just a movie TV outside of the occasional game I play on it.
 
In my own experience, retention on plasma was very real, and very scary.
TV channel logos and game HUDs would get stamped on the screen in a matter of minutes (if pure white) to a couple of hours, and some would remain faintly visible for weeks after you stopped actively displaying them. Temporary retention was guaranteed if you spent just an hour on a single video game. I would always, always leave the TV on an empty channel with static noise for at least 15 minutes before turning it off after playing a game.
Weeks is not forever and still "mostly" temporary; it just takes quite a while to go away. Now, don't get me wrong, early plasmas were quite lazy when it came to going back to normal and were quite easy to "excite" (and their expected lifespan was also lower); they also definitely had a burn-in period where the phosphor was even less "trained", so you had some disasters that at the very least took quite a while to clean up, and at worst took so long that they were deemed definitive burn-in. In reality I doubt it ever was; I've never seen permanent burn-in that wasn't getting better once you discontinued the abuse. It was just very slow in some cases.

I don't have such issues with any of the plasmas I own though, the only care I have is not using Dynamic picture mode. There's retention of course, after hundreds of hours playing a game, but after I play another equally long game, it's gone. You only ever notice it in regular use with pure black screens anyway.

OLED is different, alright. Plasma retention is not pixel half-life, it's over-excited phosphor, so it actually translates to more brightness on a pixel that should be even with the brightness next to it. With OLED it's less brightness on the "burnt" spot, and what modern OLED TVs are doing as a calibration measure is "burning" the pixels that are not burned so everything is burned in an even fashion. I despise that methodology. Some videos on YouTube demonstrate how they do it; the brightness goes down evenly for the whole panel in a constant way. It's basically the equivalent of sharpening a pencil.

With OLED, our TV is gradually getting worse; that's a constant given (and that's the best-case scenario, or we'd notice it).
One last rather drastic feature OLED TVs provide to address potential screen burn issues is Panel or Pixel Refreshing. This feature essentially attempts to equalise the ‘wear’ on the organic elements in the picture so that no specific parts of the image age faster than the rest.

(...) The feature has short term and long term elements. The short term one sees your TV running a pixel refresh that takes around 10 minutes when you turn the TV into standby after running it for more than four uninterrupted hours. You won’t even know that this process is taking place.

The long term LG pixel refresher kicks in every 2000 hours of use, and involves a much longer ‘equalisation’ process that takes up to an hour to complete. Here the user will get a message that the refresh system is about to run, and see a white horizontal line on screen when the process is almost finished.

Many OLED TVs also let you run a pixel/panel refresh routine as and when you wish, via dedicated options in the onscreen menus. (...) Sony’s advice on its own Panel Refresh feature is that running it ‘may affect the panel. [So] as a reference… do not perform it more than once a year, as it may affect the usable life of the panel’.
Source: https://www.whathifi.com/advice/ole...e-worried-about-it-and-how-can-you-prevent-it

It prevents burn-in by aging every pixel. That means that yes, burn-in is gone for most people (it's still a problem in PC environments and in prolonged gaming with lots of menus or Dynamic mode), but if that behaviour wasn't enforced, that wouldn't be the case, as uneven "burn-in" is inherent to OLED in regular use.
With OLED it’s very different. Only pure white static elements persist onscreen, and it takes longer than on plasma. They go away quickly and easily. I played Elden Ring for 190 hours and that compass that’s always onscreen didn’t leave a trace behind.
Yes, but it probably burned the rest of the pixels to match the burned ones. If you measured your OLED now it would certainly output less brightness than it did as new.

Or, alternatively, it's spending more energy in order to increase the brightness output (which also means less lifespan, but I doubt they do that, as they would have to invest extra in the power supply).

OLED has gotten good at masking the way it works and making it unnoticeable in regular use, but be aware that image quality and actual specs are always decreasing. I wouldn't want to buy a used OLED, ever.
Ya, I think it largely depended on the use case. My 2010 Panasonic is still free of burn-in but I only ever used it for films and games. My friend’s 2013 plasma got badly burned in (in half the time) because he also used it a lot for TV broadcasts and football etc.
Probably overuse of the Dynamic Mode as well.

If he discontinued all that mistreatment for a while and completely changed habits, it could be alright; I doubt it reached anything remotely near half-life (a minimum of 100,000 hours for a 2013 plasma).
 
Last edited:

sn0man

Member
PS2/GC/Xbox/Wii are all SD devices. They don't benefit from HD displays; in fact, HD displays will make them look worse.
It’s a trade off for most of those systems. You get more accurate temporal resolution (e.g., less blurring) but you lose out on higher resolution such as 480p/720p/1080i in some cases.
 

01011001

Banned
PS2/GC/Xbox/Wii are all SD devices. They don't benefit from HD displays; in fact, HD displays will make them look worse.

The Xbox has a handful of 720p titles tho, and those will definitely benefit from HD displays.

many of the Tony Hawk games for example, or Soul Calibur 2
 

Dr_Salt

Member
I like how we actually see the games for like 3 percent of the video and the other 97 percent is just those 2 dudes' faces. YouTubers' egos have become ridiculous; all those videos are unwatchable, you learn nothing except that those 2 dudes like to suck their own dicks and could do it all day long.
There is no use seeing the games on a CRT through a camera anyway; you won't appreciate it unless you have them in front of you.
 
If I can see this effect in these photos on my flat screen, why can't these effects be faked (in a good way, people always complain about the fake crt filters)?

The Kega Mega Drive emulator has a really effective filter that emulates this effect almost perfectly; even the nice 'transparent' water effect you see in Sonic games works perfectly.
 

sachos

Member
The CRT Royale Composite filters in RetroArch can emulate the waterfall effect too; those filters are crazy good.
But as the OP mentioned, this is about the motion clarity of CRTs, and that shit is insane. I recently got a SyncMaster 793s after watching those DF CRT Hype videos and I gotta say they are right: it looks insanely smooth even at "just" 60Hz.

The SyncMaster 793s can do a 4:3 aspect ratio at 640x480@135Hz, 736x552@120Hz, 800x600@110Hz and 1024x768@85Hz, and a 16:9 aspect ratio at 976x549@120Hz or 720p@85Hz.

I tried taking some pics but my phone does not do it justice; as you can see, these things are better when viewed in the dark.
Dark Forces 2 at 135Hz is so smooth, and fighting games look amazing on this (that, btw, is a RetroPie connected via an HDMI-to-VGA cable). I tried emulating PS2 with Tony Hawk Pro Skater 3 in 4K downsampled to 1280x960@60Hz; I've never seen Tony skate so smoothly before. For some reason I can only get Quake 2 to run at 800x600@110Hz, but it's still insanely smooth. The Doom pics are 736x552 for 4:3 and 976x549 for 16:9, both at 120Hz.

The only downsides I've seen so far are blooming and phosphor decay/trails, and this monitor is kind of failing so sometimes you get wrong colors, but it fixes itself after a while.

[attached photos of the games running on the monitor]
 
Last edited:

BlackTron

Member
This post hurt my feelings. Yesterday I talked myself out of getting this 27" Toshiba CRT with s-video off craigslist for $25. They're huge, they're heavy, I don't have anywhere to put it, I accumulate too much stuff, be more minimalist, blah blah.

And then you come in here with this soul-crushing wall of text, god dammit...

I had more pressing matters and couldn't deal with getting this TV but by some Grace of the gaming Gods it was still available today and I snagged it. Almost 3 weeks later.

I tested it out with a variety of old games and yeah, there's pretty much no contest, I thought my gaming monitor was good, but while using this my eyes feel like they're going on vacation.

I thought I was rusty when I went back to Sonic Adventure and failed to beat the first level in under 2 minutes on other solutions, including some I thought were really good, like Flycast on Series X. I was wrong; everything else is complete garbage compared to a real DC on a CRT. I got under 2:00 without paying attention, while taking the long path on purpose as a graphics test. Everything just clicked and felt buttery smooth; not only was it very pleasing to the eye, but I was also astonished at what a difference it made in my gameplay.

Every other way I've tried to play Rogue Squadron II is essentially unplayable compared to this, with all the swinging around and aiming at targets. Complete game changer.

I might actually have to connect my Series X to this to play Halo:CE. HOLY SHIT THAT'S POSSIBLE

[attached photos of the CRT setup]
 

Type_Raver

Member
snip...


I used a very cheap HDMI-to-SCART adapter from Amazon.
The problem with the Classic Mini consoles is they always display a 16:9 screen with borders. So on a 4:3 TV, the actual game’s screen is squashed. A pity, because even if it isn’t 240p, it’s still interesting to see the results.

The Switch instead, it switches (heh) to 4:3 automatically and it’s oddly satisfying to see a 1080p game like MK8 running on an analog screen. Motion is absolutely flawless.
So I've decided to go a different route.
As mentioned, getting full-screen 4:3 out of the SNES Mini generally won't work without some other equipment to scale/process the image.

The Raspberry Pi 4 was my initial idea, but while it is quite cheap on its own, I'd still need a 40-pin component/S-Video HAT to connect to it. However, those are either difficult to find now (I've had no luck) or have been superseded by newer but more expensive models. I'm not closed to this idea, but it's less urgent now, especially since I've figured out the next solution.

Instead, I have installed the Homebrew Channel on my old Wii, which is working really well, as well as composite cables allow. I can't find my original component cables so I've had to order a 3rd-party replacement. Nonetheless, NES and SNES look and run so well, though N64 (using F-Zero X as the test case) has frame dips/stutters which are quite noticeable. Maybe there are some tweaks I can perform, but what annoys me more is that Wii64 doesn't offer the finer video options the other two aforementioned console emus do (e.g. I want to choose NTSC 480i).
 

tygertrip

Member
The art wasn't made that way, though. The art was not made to be seen with each pixel showing only its own information; the artists created the art knowing the CRT would "bleed." The artist created that sprite to look like the left image, using the right one to achieve it. Perfect example below: do you think they really wanted that nice, perfect red square dot on the right, or do you think they were trying to achieve the left?

[image: Castlevania: Symphony of the Night comparison]


Hell, the concept of a "pixel" is tenuous at best on old systems, because there is so much analog stuff going on, with non-square pixels and all sorts of weird shit. Like, if an artist wants to display a circle on an SNES, the sprite would be 20x16 pixels, not 20x20 pixels.

"Pixel art" is unbelievably misunderstood, and playing emulators on a computer for decades by people who have never even used an actual console on a CRT has completely distorted the perception of it.
MY MAN. I preached this to my kids (they like retro games... even dug my old Atari 2600 out of my parents' attic). Unfortunately, I soon came to regret my proselytizing, because my oldest son found a 32" Toshiba behemoth for like $20. Guess who got to haul it down a flight of stairs and through the house? The old games look fantastic on it, though!
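(The 20x16 circle example checks out if you run the numbers, assuming the common case of a 256x240 frame stretched to a 4:3 display; with 224 active lines it comes out closer to 20x17, so treat the exact figure as illustrative.)

    # Non-square pixels: how wide a sprite must be drawn to look circular on a 4:3 CRT.
    def circle_width_px(height_px, frame_w, frame_h, display_aspect=4/3):
        par = display_aspect * frame_h / frame_w   # pixel aspect ratio (width/height of one pixel)
        return height_px / par

    print(circle_width_px(20, 256, 240))  # 16.0  -> draw the "circle" 20 tall x 16 wide
    print(circle_width_px(20, 256, 224))  # ~17.1 -> with 224 active lines instead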
 
Last edited:

tygertrip

Member
A shame LCDs kicked plasma to the curb. During that early HD era, I jumped from a CRT to a small 42" Panasonic plasma. This was somewhere around 2007, I think. Nice picture quality. At the time, LCD TVs had terrible ghosting when 8ms response times were the norm, but everyone was still amped up about getting these shit TVs even though you could see the ghosting on these 720p models. I guess the marketing worked, where plasma's big cons were burn-in and being heavy as hell. I don't remember the price, so maybe plasmas were way more expensive for a comparable size? I don't remember. Maybe that was a killer factor.

I then got a bigger Panasonic plasma around 2011. Never had burn-in. And I'd leave my TV on for hours after dinner if I was surfing the net or watching sports where there'd be a fixed UI on screen. I agree plasmas are heavy, as I always made sure my brother or a buddy helped me pick up the 60" TV from the store or put it in the right spot when I moved, but there were times I said fuck it and just heaved it myself onto the metal stand. Heavy, yes, but it's not like anyone is going to be moving a TV around the house every week, so who cares. I must have left that 60" TV in the same spot on the TV stand for 7 years once it was all set in place.
The plasmas of the time were leagues better than the LCDs of the time, no doubt. SD systems (like the Wii) still looked good on an HD plasma screen (though not as good as on a 480i or 480p CRT, of course), while looking like absolute ASS on an HD LCD. They look like ass on modern LCDs too.
 

tygertrip

Member
Power consumption and heat generation were two more negatives.
Plus, plasma generally couldn't get quite as bright as LCDs, was a bit more expensive, and had fewer models to choose from.

So to the layman browsing in a store, you could either buy the heavier, more expensive, costlier-to-run, dimmer, hotter plasma (& you get to worry about burn-in too), or... you could buy the "superior" LCD, which is what "everyone else is buying" too.
You just know it was the same mouth breathers that went home and connected their DVD player to their new HD LCD with a composite video cable, and then bragged about how good it looks. No buddy, it looks like absolute ass, get a component cable. I swear, they must have been blind.
 

tygertrip

Member
The CRT Royale Composite filters in RetroArch can emulate the waterfall effect too; those filters are crazy good.
But as the OP mentioned, this is about the motion clarity of CRTs, and that shit is insane. I recently got a SyncMaster 793s after watching those DF CRT Hype videos and I gotta say they are right: it looks insanely smooth even at "just" 60Hz.

The SyncMaster 793s can do a 4:3 aspect ratio at 640x480@135Hz, 736x552@120Hz, 800x600@110Hz and 1024x768@85Hz, and a 16:9 aspect ratio at 976x549@120Hz or 720p@85Hz.

I tried taking some pics but my phone does not do it justice; as you can see, these things are better when viewed in the dark.
Dark Forces 2 at 135Hz is so smooth, and fighting games look amazing on this (that, btw, is a RetroPie connected via an HDMI-to-VGA cable). I tried emulating PS2 with Tony Hawk Pro Skater 3 in 4K downsampled to 1280x960@60Hz; I've never seen Tony skate so smoothly before. For some reason I can only get Quake 2 to run at 800x600@110Hz, but it's still insanely smooth. The Doom pics are 736x552 for 4:3 and 976x549 for 16:9, both at 120Hz.

The only downsides I've seen so far are blooming and phosphor decay/trails, and this monitor is kind of failing so sometimes you get wrong colors, but it fixes itself after a while.

[attached photos of the games running on the monitor]
Fucking awesome.
 

kunonabi

Member
I got a 36" Toshiba that I refuse to let go of, especially now that I've found out it supports component. I just moved, though, and am going to have to let go of the CRT PC monitor I was using for my Dreamcast.
 