GPU advances have also slowed way down.
For many years, each GPU generation brought a node shrink and (if we simplify it to the top-tier card) huge performance gains with only slightly more power usage. The last 4-5 generations have seen the exact opposite: huge power increases scaling almost one-to-one with the performance increases, which means performance per watt has barely moved. That is literally stagnation. They are also reaching the limit of node shrinkage with current silicon technology, which is leading to larger dies and way more heat just to get anywhere close to the same generational performance gain.
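To make the perf-per-watt point concrete, here is a minimal sketch with made-up, illustrative numbers (not real GPU specs): when power scales one-to-one with performance, efficiency stays flat across generations.

    # Illustrative only: hypothetical figures, not real GPU specs.
    generations = [
        # (name, relative_performance, board_power_watts)
        ("gen N",   1.00, 250),
        ("gen N+1", 1.35, 340),  # ~35% faster, ~36% more power
        ("gen N+2", 1.80, 450),  # ~80% faster, ~80% more power
    ]

    base_perf, base_power = generations[0][1], generations[0][2]
    base_eff = base_perf / base_power  # perf per watt of the baseline card

    for name, perf, power in generations:
        eff = perf / power
        print(f"{name}: {perf / base_perf:.2f}x perf, "
              f"{power / base_power:.2f}x power, "
              f"{eff / base_eff:.2f}x perf-per-watt")

Running it prints roughly 1.00x perf-per-watt on every row: each generation is "faster", but only because it burns proportionally more power.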
Luckily they found other uses for GPU acceleration. But just because there is an increase in demand for a new use case does not, in any way, mean that the development of the device itself is still moving at the same pace.
That's like saying a chair leg is reaching new heights of technological advancement because someone also used the same leg on a table.
It is a similar story with memory. Outside of a couple of niche, ultra-expensive processes made for data centers, they are literally just packing more dies onto a PCB or stacking them.
My original comment was also correct. There is a reason why >10-year-old MCUs are still used in embedded devices today. That doesn't mean it can't still be exciting to find novel uses for the same technology.
Again, stagnation ≠ bad
The area where electronics technology really has progressed quite a bit is signal integrity and EMC. The things we know and can measure now are pretty crazy, and they enable the ultra-high frequencies and data rates that come out in the new standards.
This is not about pro-gamer upgrades. This is about the silicon-based electronics industry as a whole (I am an electronics engineer).