Haha so funny. Can I laugh now?
Nothing wrong with pushing graphical power at the cost of 30fps. I prefer better graphics.
Nothing wrong for you and the games you gravitate towards, perhaps.
I think 30fps shouldn't even exist in a medium where you are directly controlling what is happening on the screen. I realize it will likely be a staple for a long time, as there will always be low-end console hardware and aging platforms; I have a Switch, so obviously I play a lot of games at 30fps, but I don't have to pretend to like it. Some games are "okay" to play at 30fps, but if I revisit those same experiences later with better hardware at 60fps, the experience is always much better from a gameplay perspective. To me that matters far more than graphical fidelity, which is often not really noticeable in motion during gameplay anyway.
I guess it depends on what games you're playing, though. To me, limiting a game to 30fps just so that you can have better cutscenes and a better photo mode is a failure.
In many ways 24fps is more cinematic on an emotional level for the viewer. This is why 60/120fps films often don't sit well with a lot of people.
If you took a very powerful system and used that power to create very accurate frames (compared to motion video), addressing the motion blur, lighting, and pacing issues that come with that, most people would probably view the result in a more favorable light. I'm speaking of cutscenes, not gameplay. It's just how the human mind is conditioned to work, thanks to our heavy exposure to low-fps video.
Here's a quick (non-scientific) article I found that kind of references this phenomenon in film:
https://www.vulture.com/2019/10/how-high-frame-rate-technology-killed-ang-lees-gemini-man.html
That's just film and cutscenes, though. I don't control the character's actions in a film, and I tend to avoid games with a lot of cutscenes interrupting the game. Personal preferences, I suppose.
In movies, however, directors work with a known output framerate and make the movie with that in mind. When the shitty Hobbit movies came out I watched the first one at 48fps. The sweeping outdoor camera shots were incredible at 48fps, but the rest of the movie felt off, almost like someone was filming it with a phone or something. I watched the sequels in regular theatres. The wide sweeping outdoor shots looked awful at 24fps, but the rest of the films felt better to watch, even though technically they weren't. It's all in my head, of course; absolutely everything about the 48fps version looked better: less motion blur, smoother transitions, less detail lost in the steps between frames that my eyes have to track. I'm used to 24fps in film, and unwittingly my brain expects 24fps when I watch a movie. 48fps or 60fps in film is something I will have to get used to before my brain stops telling me something is weird.
Games are different, though. I played most games at 60Hz in the 8-bit and 16-bit days. When polygons hit the scene I had to get used to 30fps (or lower), with the odd 60fps game like F-Zero or Einhander looking incredibly fluid, and often in the GameCube days running at 60fps wasn't at the expense of the visuals: games like the Star Wars titles and Metroid Prime were amazing and still ran at a locked 60fps. Racing games, which are my mainstay, absolutely must run at 60fps+. Through the next few gens I gravitated to games with higher framerates, and since getting into PC gaming I've revisited games that I'd originally played at 30fps and found them much more enjoyable at higher refresh rates.