The difference is that Apple has been doing this on-device for maybe 4-5 years already with the Neural Engine. Every iOS version has brought more stuff you can search for.
The new part is "just" a natural language interface layered on top of data the device already has about your photos (on device, not in the cloud).
My iPhone 14 can, for example, correctly detect my dog's breed from photos, and it can search for a specific pet by name. Again, this happens on-device, not by sending my photos to Google's cloud to be analysed.
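For anyone curious what "on-device" looks like in practice: here's a minimal sketch using Apple's public Vision framework, which runs its built-in image classifier locally (accelerated by the Neural Engine where available). This is my own illustration of the general approach, not the actual pipeline Photos uses, and `photoURL` is just a placeholder for a local image file.

```swift
import Foundation
import Vision

// Sketch: classify a local photo entirely on-device with Vision's
// built-in classifier and print the top label guesses.
// `photoURL` is a hypothetical file URL to an image on disk.
func printTopLabels(for photoURL: URL) throws {
    let request = VNClassifyImageRequest()                  // built-in, on-device model
    let handler = VNImageRequestHandler(url: photoURL, options: [:])
    try handler.perform([request])                          // no network involved

    let observations = request.results ?? []
    for obs in observations.prefix(5) {
        print("\(obs.identifier): \(obs.confidence)")       // e.g. "dog: 0.93"
    }
}
```

Point being, the classification and indexing can all happen locally; a natural language search feature then only needs to query that local index.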