Anyone capable of predicting the future well enough to reliably answer your question would be spending their time making a killing on the stock market, not playing fortune teller on HN. Nobody actually knows the answer any better than you do. It's all guessing with varying amounts of undeserved confidence.
My personal position is that it hardly matters: a takeoff scenario where AI can properly replace complex knowledge workers (I assume this is what you're worried about) is more or less assured to spread to every other kind of work in a fairly short timeframe as it continues to improve. A machine intelligence exhibiting exponential growth that can already engineer software won't be far off from designing and manufacturing the equipment to efficiently produce physical machines it can operate. The question everyone's actually arguing about is whether there's any intelligence in the system in the first place. If there is, it seems very unlikely to hit a convenient growth wall right between human-level software engineering and marginally superhuman robotics design.
I'm continuing with business as usual, on the assumption that if I get bitten by AI, everyone else will soon follow, and I won't be at any particular disadvantage for long.