> ask law enforcement to run Apple’s algorithm on data sets Apple themselves don’t have access to
Sounds like that's what they did, since they say they're matching against hashes provided by NCMEC, generated from its corpus of roughly 200k known CSAM images.
[edit: Ah, in the PDF someone else linked, "First, Apple receives the NeuralHashes corresponding to known CSAM from the above child-safety organizations."]
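For illustration only, the matching step described there boils down to a set-membership check against the org-provided hash list. A toy sketch (all names and hash values here are made up; Apple's real system layers a private set intersection protocol on top of NeuralHash, so the device never learns match results and the server never sees non-matching hashes):

```python
# Hypothetical illustration, NOT Apple's actual code or data.
# NCMEC-style orgs provide perceptual hashes of known CSAM;
# a candidate image's hash is checked for membership in that set.

known_hashes = {
    "9f2a7c1e",  # made-up placeholder hash values
    "b4173d02",
}

def matches_known_hash(neural_hash: str) -> bool:
    """Plain set lookup standing in for the (blinded) matching step."""
    return neural_hash in known_hashes

print(matches_known_hash("9f2a7c1e"))  # True
print(matches_known_hash("00000000"))  # False
```

The real protocol blinds both sides of this lookup cryptographically, but the underlying question is the same: is this image's hash in the provided set?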