Personally, I see little reason to upgrade from my AM4 platform. It's never been easier to hold on to aging hardware with the advent of DLSS stretching older cards further, diminishing returns on the newer gen GPUs, and the 'realism' of video games plateauing.
Last year I said I should have upgraded my 1060 the year before.
I bought it second hand 7 years ago and it is still the same price.
I don't do much gaming, and it runs Immich / etc light inference just fine. One thing I don't regret is getting 32GB of DDR4 when I built the system around the time of the GPU upgrade.
7 years ago it was the same nominal price, but then again, the last 7 years have involved elevated inflation. So, the same sticker price is actually a lower real price.
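The nominal-vs-real price point is easy to put numbers on. A quick sketch (the $200 price and the ~3.5% average annual inflation rate are illustrative assumptions, not figures from the thread):

```python
# Sketch: what an unchanged sticker price means in real terms after
# several years of inflation. Both numbers below are assumptions
# chosen for illustration.
nominal_price = 200.0    # hypothetical used-GPU price, unchanged for 7 years
annual_inflation = 0.035 # assumed average annual inflation rate
years = 7

# Deflate today's nominal price back into year-0 dollars.
real_price = nominal_price / (1 + annual_inflation) ** years
print(f"The same ${nominal_price:.0f} sticker is ~${real_price:.0f} in year-0 dollars")
```

Under those assumptions the "same price" card effectively costs about a fifth less in real terms than it did at purchase.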
If you're looking for a card in the sane $300 area, the Intel Arc B580 (12GB) or the RX 9060 XT (8GB) are reasonable value. If you want 12GB+ from Nvidia or AMD, the used market in previous generations is a good place to look: maybe something like an RTX 3060 (12GB) or RX 6800 XT (16GB).
I personally don't think the GPU market is incredibly miserable. Maybe I am just used to the pain or something? Nvidia has a bit of a tax, but something like the RX 9070 XT is basically the 3rd fastest gaming GPU money can buy, and it's around $700. (I'm not sure why the 5070 Ti costs $200 more, even given Nvidia's software advantages. It performs almost identically; it just doesn't make sense as a purchase.)
2017 GTX 1070 and 32GB RAM. I don’t run games at 4K and still haven’t had any problems running reasonably recent stuff at decent settings.
It may sound like pseudo-Buddhist claptrap, but it's also true. Or, I suppose, Fight Club claptrap. It's still true.
The choice is "do you want to participate in society, its benefits and drawbacks". You can't have only one side of that.
I used to think the plateau was here when the Xbox 360 and PS3 came out.
I don't mind that graphics have plateaued, because they aren't the important bit. If anything, I would rather that devs stop trying to chase graphics and make more games with shorter dev cycles.
Partially this is because there was usually an overlap in sales for early PS4 and late PS3, etc. If a game has to support both console generations, it won’t truly be able to take advantage of the newer gen hardware.
I've kept playing games and upgrading my GPU every other generation, and the new cards are still fully utilized, but I can't really see where the additional compute and money is going. My biggest visual upgrade during that time was actually going from an LED-backlit LCD to an HDR OLED, which requires virtually no additional processing power.