As far as what you linked, Altman is saying the same thing I'm saying:
> That doesn’t mean that OpenAI won't continue to try to make the models bigger, it just means they will likely double or triple in size each year rather than increasing by many orders of magnitude.
This is exactly my point: doubling or tripling the size will be possible, but it won't result in a doubling of performance. We won't see a GPT-5 that's twice as good as GPT-4, for example. The jump from 2 to 3 was exponential. The jump from 3 to 4 was also exponential, though less so. The jump from 4 to 5 will follow that curve, according to Altman, which means exactly what he said in my quote: the value will continue to decrease. For a 2-to-3 type jump, GPU technology would have to completely transform in capability, and there's no indication we've found that innovation.