Hypothetically you have hashes for two people of gender X (let's be honest, given the popularity of different types of porn, two men). This is not meaningfully different from an opaque hash of "CSAM".
But you're missing the point:
Step 1. Generate some opaque hash of the "semantics" of an image.
Step 2. Compare those hashes against some list of hashes of "CSAM", which, again, fundamentally cannot be audited.
Step 3. Report any hits to law enforcement.
Step 4. Person X is investigated for alleged violations of laws against child abuse.
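The steps above can be sketched as code. This is purely illustrative, not any real scanner's API: the names (`semantic_hash`, `blocklist`, `scan`) are made up, and SHA-256 stands in for whatever perceptual/"semantic" hash a real system would use. The point it illustrates is that a blocklist entry is just an opaque digest: nothing about it reveals what image produced it.

```python
import hashlib

def semantic_hash(image_bytes: bytes) -> str:
    # Step 1: produce an opaque digest of the image's "semantics".
    # (Stand-in: a real system would use a perceptual hash, not SHA-256.)
    return hashlib.sha256(image_bytes).hexdigest()

# Step 2's list: a set of opaque digests supplied by some authority.
# An entry derived from actual CSAM is indistinguishable from one
# derived from, say, a photo of a flag -- that is the audit problem.
blocklist = {semantic_hash(b"whatever-the-authority-put-on-the-list")}

def report_to_law_enforcement(digest: str) -> None:
    # Step 3/4: in a real deployment this triggers an investigation.
    print(f"reported hit: {digest}")

def scan(image_bytes: bytes) -> bool:
    h = semantic_hash(image_bytes)
    if h in blocklist:
        report_to_law_enforcement(h)
        return True
    return False
```

Note that `scan` can only say "this digest is on the list"; neither the user nor any auditor can work backwards from the list to verify what the entries actually represent.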
Basically: how do you design a system in which the state provides "semantic" hashes of "CSAM" that cannot be trivially abused, either by the inclusion of non-CSAM as "CSAM" or by laws mandating the inclusion of things that are objectively not CSAM? Hypothetically: hashes that match Christian crosses, the Star of David, the Muslim star and/or crescent, etc. Or, in the US, the DNC, RNC, pride, etc. flags. Recall that, definitionally, no one can audit the hashes that would trigger notifying law enforcement.