>I've already shared one specific pattern above
If you mean this:
"Say you have a table of inventory movements and want instead a snapshot table of inventory at point in time"
Then I can make my own data to play with - I only want to be clear about the constraints. Would 500K records be enough to make the difference between the naive and non-naive approaches obvious? Can you quantify "struggle", even roughly?
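For concreteness, here's how I currently picture the naive vs. non-naive distinction - re-summing all movements from scratch for every row versus carrying a running total. This is just a Python sketch of the asymptotics (O(n^2) vs. O(n)), not M, and the data is made up:

```python
import random

# Fake inventory movements: (day, quantity_change), one row per day
movements = [(day, random.randint(-5, 5)) for day in range(5_000)]

def naive_snapshot(moves):
    # Naive: for each row, re-sum every movement up to that row's day.
    # Total work is O(n^2) - presumably this is what "struggles" at 500K rows.
    return [sum(q for d, q in moves if d <= day) for day, _ in moves]

def running_snapshot(moves):
    # Non-naive: one pass, carrying a running total. O(n).
    total, out = 0, []
    for _, qty in moves:
        total += qty
        out.append(total)
    return out

# Both produce the same point-in-time balances
assert naive_snapshot(movements[:500]) == running_snapshot(movements[:500])
```

If that's the right mental model, the crossover should be easy to see: the naive version's cost quadruples every time the row count doubles.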
I have used Table.Buffer, but I probably don't thoroughly understand its use yet.
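My (possibly shaky) mental model of it: M steps are lazy, so a step referenced by several later steps can get re-evaluated each time, and Table.Buffer pins one materialized copy in memory instead. A rough Python analogy (the names here are invented for illustration, and M re-evaluation is only approximated by re-calling a function):

```python
calls = {"n": 0}

def expensive_source():
    # Stand-in for a costly query step (e.g. hitting the database).
    calls["n"] += 1
    return [x * x for x in range(5)]

# Unbuffered lazy step: every reference re-runs the source.
a = expensive_source()
b = expensive_source()
assert calls["n"] == 2    # paid the evaluation cost twice

# "Buffered": evaluate once, then every reference hits the cached copy -
# roughly what Table.Buffer does for a table step.
buffered = expensive_source()
c = list(buffered)
d = list(buffered)
assert calls["n"] == 3    # only one more evaluation, reused twice
```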
(I belatedly realized your problem is something I've done recently with SharePoint list history, but with far fewer records, so I'm going to look for a public dataset to try.)
P.P.S. It also makes me think: I frequently get my data from an Oracle database, so if something is easier to do there, I'd push it into the SQL. Analytic functions are convenient.
P.P.P.S. Aha! I found a file of parking meter transactions for 2020 in San Diego, which is about 140MB and almost 2 million records. This seems like a good test: not only is it well over the number you said was problematic, it's also well over the number of rows one Excel sheet can hold (1,048,576).
https://data.sandiego.gov/datasets/parking-meters-transactio...