No, not really. While I really appreciate the option to have both in games, I prefer to max out IQ before reducing it to increase fps. As long as the fps is consistently 30 or 60 (or 120 on my PC since I can do 1440p120Hz there, though I rarely do) I don't really mind. Exceptions ofc: racing games, FPSes and other games where delay is accentuated because the game is asking for rapidly changing input.
I've been gaming since the start of modern GPUs, and on a 75Hz monitor I never cared about getting the game to run at 60 or 75; as long as it was above 30 I was happy. The excitement for me came from turning the settings/res up and having better lighting or so much more clarity in the image. There was such a huge difference between 640x480 and 1280x960, and especially the hallowed 1600x1200, like the difference between actually being able to read a sign in a game or identify an enemy at distance.
Nowadays obviously, for the most part, you aren't missing out on gameplay or unable to read the world at 1080p vs 2160p, but for me it's shifted to the microdetail they push for in modern games: being able to perceive minute texture detail when you're at a medium distance from a wall, or the texture of a baddie's skin. Like in the intro to Spiderman: MM, you miss out on all this detail on the first thug you see in the first battle. I played the game in Fidelity mode first and I was blown away by how good that first thug looked, such crazy micro detail; you can even see that their eyelashes are extremely well rendered.
After beating it I started NG+ in Performance mode, played for a couple of hours and got bored. There wasn't a meaningful difference in input lag or responsiveness to me, and the loss of clarity I've described above made the game feel less cool/exciting/next-gen/buzzword for me.
I'm hoping that RE8 keeps the native 4K mode even if the framerate drops really low in parts (I'm sure it will given the chandelier causes drops to the 50s), because if they do put in a 60fps mode I can guarantee they will still bias towards image quality, i.e. if 1800p gives 60 during almost all gameplay but cutscenes drop to the low 40s due to a massive delta in GPU usage compared to the rest of the game, they won't drop to 1440p to sort that out. I would think that those who favour performance modes/playing at 60fps mostly care that the gameplay itself feels better and don't mind if the cutscenes drop, but the cutscenes are just as important as the gameplay to me, maybe more so in some cases, so seeing them without distraction is a massive boon. If a film had a slight stutter every 5 minutes I wouldn't watch it either; I don't care that I'm not in control of it.
I will probably play more games at 30fps going forwards tbh, since my GPU is now too old for 4K pixel pushing + high gfx settings, and next-gen won't achieve native 4K without dropping to 30 in most cases, unless a game has a simpler art style maybe. I played Assassin's Creed 1 at 1080p when it came out; I want to move on to higher resolutions as a base now, and many devs seem to be embracing that with the amount of detail they put in that's lost when you aren't very near it or at native 2160p.
There are a lot of multiplatform games from the last two years that people said had "average" or "shit" graphics on last-gen consoles, and when I compare 1080p and near/native 4K in the PC version of the same game, it's like the difference between streaming versions and ISOs/remuxes of films.