I suspect the reason we don’t just use repurposed GPUs in place of current CPUs is that most software can’t take advantage of them.
Most software is still written for one to four cores at most. Parallelism on the scale of hundreds of threads is something the industry just isn’t doing, and it isn’t practical for most workloads anyway.
I generally agree: you only need so many threads to handle the GUI, most non-compute transforms, and simple control logic.
Maybe AIs running on GPUs will be able to break serial processing problems down into parallel compute problems in the future.
Humans are getting better at designing and developing parallel software, but many of the strides have come simply from building better tools. A sketch of what that looks like is below.
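As a minimal sketch of what "better tools" means in practice, here's what data parallelism looks like in Rust with the rayon crate (my choice of example, not something from the discussion above): turning a serial iterator chain into a parallel one is a one-method change, and the library handles the thread scheduling.

```rust
// A minimal sketch of modern tooling lowering the bar for parallelism.
// Assumes the rayon crate is available as a dependency.
use rayon::prelude::*;

fn main() {
    let inputs: Vec<u64> = (0..1_000_000).collect();

    // Serial version: runs on a single core.
    let serial: u64 = inputs.iter().map(|&x| x * x % 1_000_003).sum();

    // Parallel version: identical logic, but `par_iter` lets rayon
    // split the work across all available cores automatically.
    let parallel: u64 = inputs.par_iter().map(|&x| x * x % 1_000_003).sum();

    assert_eq!(serial, parallel);
    println!("sum = {serial}");
}
```

The point isn't this particular library; it's that the programmer no longer has to hand-roll thread pools and locking to get basic data parallelism.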
Perhaps AI will be the ultimate programmer in the end.
But for a laptop or a desktop, a standardized ARM system with PCIe slots and a standard boot ROM would go a long way toward bringing ARM to developers.
The Mac is close to that, but not really. I generally like Macs, but other than owning some old, obsolete ones that will go to a boneyard for recycling in the near future, I don’t run any. I don’t like the fact that Apple abandons you after a decade. There are ways to keep your customers going until their hardware dies without dropping support for it.