There are way too many personal definitions of what "Moore's Law" even means to have a useful discussion without agreeing on a shared definition beforehand.
But Goodhart's law ("When a measure becomes a target, it ceases to be a good measure") directly applies here: Moore's Law was used to set long-term plans at semiconductor companies, even though Moore had no empirical evidence it would continue.
If you, say, arbitrarily pick CPU performance, or worse, single-core performance as your measurement, it hasn't held for well over a decade.
If you track minimum feature size without regard to cost, it is still holding.
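To see how much the metric choice matters, here is a quick back-of-the-envelope sketch. The growth figures are assumptions for illustration (Moore's classic ~2-year doubling vs. a commonly cited single-digit yearly gain for single-core performance in the post-2010 era), not measured data:

```python
import math

# Assumption: Moore's classic cadence, doubling roughly every 2 years.
moore_doubling_years = 2.0
moore_annual_growth = 2 ** (1 / moore_doubling_years) - 1  # ~41% per year

# Assumption: single-core performance improving ~5% per year,
# a rough post-2010 ballpark used purely for illustration.
single_core_growth = 0.05
single_core_doubling_years = math.log(2) / math.log(1 + single_core_growth)

print(f"Moore cadence: ~{moore_annual_growth:.0%}/yr, doubling every {moore_doubling_years:.0f} years")
print(f"5%/yr cadence: doubling every {single_core_doubling_years:.1f} years")
```

Under these assumptions one metric doubles every 2 years and the other takes about 14, which is the whole argument in two numbers: the conclusion follows from the yardstick.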
What you want to prove usually dictates which interpretation you choose.
That said, where LLM scaling laws top out is still unknown, but you can game them in similar ways.
GPT-4 was already hinting at an asymptote on MMLU, but the question is whether that benchmark is valid for real work.
Time will tell, but I am seeing far less optimism from my sources; that is just anecdotal, though.