indeed, the 750 Ti was never better. it was a 1.3 TFLOPS Maxwell GPU, a far cry from the PS4's 1.8 TFLOPS.
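if you want the napkin math behind those numbers, it's just shaders × 2 ops per clock × clock. these are public spec-sheet figures; I'm using the 750 Ti's base clock here:

```python
# FP32 throughput back-of-envelope: shaders * 2 ops per clock (FMA) * clock.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"GTX 750 Ti: {tflops(640, 1020):.2f} TFLOPS")  # ~1.31 at base clock
print(f"PS4 GPU:    {tflops(1152, 800):.2f} TFLOPS")  # ~1.84
```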
it performed better than the PS4 in some titles because of the CPU difference. let's take Witcher 3 for example: on PS4 you have to accept a 30 fps cap because you cannot realistically get a locked 60 out of the Jaguar cores. I'm "pretty" sure PS4's GPU is capable of more than 30 frames in Witcher 3 at 1080p. the 750 Ti had the privilege of running those early-gen games at unlocked framerates next to much better CPUs, so naturally it "seemed" to perform better.
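the whole effect boils down to a min() on frame rate: whichever of the CPU and GPU is slower sets what you actually get. a toy sketch, with made-up illustrative numbers rather than measurements:

```python
# Delivered fps is capped by whichever side finishes its frame work last.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

# Illustrative numbers only, not benchmarks:
print(delivered_fps(cpu_fps=40, gpu_fps=50))  # PS4-ish: Jaguar-bound, so devs lock to 30
print(delivered_fps(cpu_fps=90, gpu_fps=50))  # 750 Ti + desktop CPU: GPU-bound, runs unlocked
```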
this is like saying a GTX 1070 is better than a PS5. is it? absolutely not. can you run RDR2 at 1080p/60 fps with a GTX 1070 on PC? yes. why can't you do that on PS5? because Rockstar is to blame for not providing a simple next-gen patch for 2 years now.
in that era, all 2 GB GPUs suffered heavily, which is what makes this so misleading. their raw performance could still match, or trail just behind, the PS4; it's their VRAM buffer that wasn't up to the task. it was a 2 GB buffer versus the PS4's 3.5-4 GB buffer (the console had 8 GB total, but only around 4 GB was usable for GPU operations and 2 GB for CPU operations, with the rest reserved by the OS).
take a look at this picture, it's very informative: you have a 2 GB GTX 960, with its limited 2 GB buffer, and a 4 GB variant with the exact same GPU core.
one renders 14 frames, the other renders 32 frames. that's over a 2x difference. this is what a LACK of buffer looks like.
now you might ask, what does this have to do with our discussion? people like these keep yapping about how "those GPUs cannot run games these days!!" well, that's the problem: almost all of the PS4-equivalent GPUs from that time shipped with 2 GB, and all of them suffer heavy performance losses due to the limited VRAM buffer. if some of them had a 4 GB buffer, you would see them perform very close to the actual PS4 hardware.
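to make the mechanism concrete, here's a crude toy model: frames render at full speed while the working set fits in VRAM, and anything that spills gets streamed over PCIe, inflating frame time. the penalty constant is completely made up, picked only so the output lands near that 14 vs 32 fps GTX 960 result:

```python
# Toy model of the 2 GB wall. penalty_per_gb is a made-up tuning knob,
# not a measured value.
def effective_fps(base_fps, working_set_gb, vram_gb, penalty_per_gb=1.3):
    spill_gb = max(0.0, working_set_gb - vram_gb)  # what doesn't fit in VRAM
    return base_fps / (1 + penalty_per_gb * spill_gb)

base = 32  # fps when nothing spills (the 4 GB 960)
print(f"4 GB card: {effective_fps(base, 3.0, 4.0):.0f} fps")  # 32
print(f"2 GB card: {effective_fps(base, 3.0, 2.0):.0f} fps")  # ~14
```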
HD 7850? 2 GB.
GTX 760? 2 GB.
HD 7870? 2 GB.
PS4 sits somewhere between an HD 7850 (16 CUs) and an HD 7870 (20 CUs). at its core, PS4 has a 7870-class chip (20 CUs) with two CUs disabled, leaving 18 CUs clocked at 800 MHz. so it is not a complete replica of the HD 7850, and it's not a 7870 either; it just sits in between (see the quick spec rundown below). and what do we have in between? lots of 2 GB cards.
R7 265? 2 GB.
R7 270? 2 GB.
GTX 660? 2 GB.
take into account that PS4's memory bandwidth is competitive with all of these GPUs too: most of them sit roughly between 144 and 192 GB/s, whereas PS4 has 176 GB/s total, with maybe 140-150 GB/s of that realistically available for GPU operations. so I don't know exactly where it sits in this comparison. it may even outpace an HD 7870 in actual scenarios, since the unified pool keeps it fed without PCIe copies eating into that bandwidth.
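here's the spec rundown in one place. shader counts, clocks and bandwidths are public spec-sheet figures; the PS4 "usable" notes are my estimates from the memory split above, not official numbers:

```python
cards = [
    # (name, shaders, clock MHz, VRAM GB, bandwidth GB/s)
    ("HD 7850", 1024,  860, 2, 153.6),
    ("PS4 GPU", 1152,  800, 4, 176.0),  # 4 GB = GPU budget per the split above; ~140-150 GB/s usable
    ("HD 7870", 1280, 1000, 2, 153.6),
    ("GTX 760", 1152,  980, 2, 192.3),
    ("GTX 660",  960,  980, 2, 144.2),
]
for name, shaders, mhz, vram_gb, bw in cards:
    print(f"{name:8s} {shaders * 2 * mhz / 1e6:4.2f} TFLOPS, {vram_gb} GB, {bw:.1f} GB/s")
```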
let's see how the RX 460 performs then. in RDR2, in one of the heaviest locations in the game, at 1080p with ultra textures and optimized settings, it gets a nice 35 fps average. can this GPU really match a PS4? absolutely. can it perform a bit better? yeah.
can consoles get better performance? absolutely. with their close-to-the-metal APIs, they are bound to get better performance. and we can see it: the 460 barely manages 900p at PS4-matched settings and 30 fps in Horizon Zero Dawn.
does it depend on the dev? yeah. is it still playable? also yeah. is it "smoking" the PS4? no. since the settings are matched, it seems the PS4 performs 20-25% above a 460 in this title, which is pretty respectable and admirable work from the devs. then again, the RX 460 only has 112 GB/s of bandwidth compared to the 176 GB/s total the PS4 has (140-150 GB/s usable for GPU operations). so there's still some room for more performance in the 460, but at this point I've proven my point.
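quick ratio check on that gap. again, the 140-150 GB/s usable figure is my estimate, not an official spec:

```python
rx460_bw = 112.0  # GB/s, spec-sheet figure
for ps4_usable in (140.0, 150.0):  # estimated usable PS4 bandwidth
    print(f"{ps4_usable / rx460_bw:.2f}x")  # 1.25x and 1.34x, right around that 20-25% gap
```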
in the case of the 3060 and 3080, this will never happen. consoles have a ~10 GB buffer for GPU operations, and the 3060's 12 GB is more than plenty. the 3080's 10 GB will get by. 8 GB GPUs will have to make cutbacks. 4-6 GB GPUs will get pretty whacked though, just like the 2 GB GPUs of yore.
as for the bandwidth case, the 2070, 2080 and co. all have a full 448 GB/s, so they're always nicely fed.
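same buffer check, current gen. the ~10 GB console GPU budget is the figure from above; VRAM sizes and bandwidths are public spec numbers:

```python
console_gpu_budget_gb = 10  # per the estimate above
cards = [
    # (name, VRAM GB, bandwidth GB/s)
    ("RTX 3060", 12, 360),
    ("RTX 3080", 10, 760),
    ("RTX 2070",  8, 448),
    ("RTX 2080",  8, 448),
]
for name, vram_gb, bw in cards:
    verdict = "fits" if vram_gb >= console_gpu_budget_gb else "needs cutbacks"
    print(f"{name}: {vram_gb} GB ({verdict}), {bw} GB/s")
```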