It's been like this since GPT-3.5. It's not a limitation; it's generally considered a natural outcome of the training process.
So there's no major update in the sense you might be thinking of. Most of the time there's not even an announcement if/when a training cutoff is updated; it's just a passing mention in the release notes.
A six-month lag between the training cutoff and release seems to be the standard across frontier models.