I do expect personal AI machines to take off in a few years once local models and local hardware hit an inflection point. The M5 Max is a major improvement for local inference thanks to its added matmul accelerators, but RAM capacity and memory bandwidth remain huge bottlenecks.
That said, enterprise AI chips will still take the cake when it comes to margins.