These are a lot of unfounded assumptions.
You don't need Moore's Law. GPUs aren't even designed primarily with ML training in mind. You don't need exponential growth for anything. The money OpenAI spent training GPT-4 a year ago could train a model twice as large today, and that amount is a drop in the bucket for the R&D budgets of large corporations. Microsoft gave OpenAI $10B; Amazon gave Anthropic $4B.
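The arithmetic behind "same budget, twice the model" is simple. A rough sketch (every number below is a made-up placeholder for illustration, not an actual GPT-4 or market figure; the only assumption is that the effective price per unit of training compute halved over the year):

```python
# Illustrative only: if the price per unit of training compute halves,
# a fixed budget buys twice the compute. All values are hypothetical.

def trainable_compute(budget_usd: float, price_per_pflop_day: float) -> float:
    """Total training compute (in PFLOP-days) a fixed budget buys."""
    return budget_usd / price_per_pflop_day

BUDGET = 100e6      # hypothetical training budget in dollars
PRICE_THEN = 2.0    # hypothetical $ per PFLOP-day a year ago
PRICE_NOW = 1.0     # hypothetical $ per PFLOP-day today (halved)

compute_then = trainable_compute(BUDGET, PRICE_THEN)
compute_now = trainable_compute(BUDGET, PRICE_NOW)
print(compute_now / compute_then)  # 2.0: same spend, twice the compute
```

The point isn't the specific numbers; it's that the argument needs only a falling price per FLOP, not any particular hardware law.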
>So compute will not fall at the rate you would need it to for LLMs to actually compete in any meaningful way with human software engineers.
I don't think the compute required is anywhere near as high as you think it is.
https://arxiv.org/abs/2309.12499
>We are not guaranteed to continue to progress in anything just because we have in the past.
Nothing is guaranteed. But the scaling plots show no indication of a slowdown, so it's on you to provide a concrete reason this object in motion is going to stop immediately and conveniently right now. If all you have is "well, it just can't keep getting better, right?", then go read the GPT-2 and GPT-3 threads to see how meaningless such unfounded assertions are.