It's not crazy to want to train or run models like these, it's actually quite popular right now! :) The questions for you to answer are how handy you are with scikit-learn and pandas, and how much you want to be on the bleeding edge of things. Most stuff comes out for CUDA first, since that's what the industrial-grade GPUs (A100s) use, so on Apple Arm you either have to wait for someone to port it, or port it yourself.
On the other hand, getting more than 8 GiB of VRAM on a laptop GPU is rare, and you're definitely not getting 128 GiB. Apple Arm, with its unified memory, lets the GPU use system RAM, so a machine with 32 or 64 GiB of RAM (get 128 if you can afford it) gives you far more usable memory for training/inference.
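If you do land on PyTorch, one of the frameworks whose Apple Arm port (the MPS backend) already exists, the "CUDA first, fall back elsewhere" situation shows up as device-selection logic. This is a hedged sketch: `pick_device` is a made-up helper, and in real code you'd feed it `torch.cuda.is_available()` and `torch.backends.mps.is_available()`.

```python
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Prefer CUDA (NVIDIA), then MPS (Apple Silicon), else plain CPU.

    The booleans stand in for the framework's own runtime checks,
    e.g. torch.cuda.is_available() / torch.backends.mps.is_available().
    """
    if cuda_available:
        return "cuda"
    if mps_available:
        return "mps"  # Apple's Metal Performance Shaders backend
    return "cpu"

# On an Apple Arm laptop, CUDA is absent but MPS is present:
print(pick_device(False, True))  # → mps
```

The practical catch is the "wait or port it yourself" part: even when the framework runs on MPS, individual ops or third-party CUDA kernels may not, so the CPU fallback still matters.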