If you produce a list of hashes, those hashes can be reversed for any guessable URL simply by hashing candidates and comparing against the list.
So this still isn't private for any URLs on the list, but it does anonymize whatever URLs get transmitted in case your whitelisting goes bad and starts allowing everything. It also makes it explicit that you're only transmitting exact matches, not fuzzy matches, and that only the URL itself is transmitted, with no query parameters, which also matters for privacy (otherwise any URLs embedded in redirect parameters get hoovered up as well).
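Roughly what I mean, as a minimal sketch. The SHA-256 choice, the Node/TypeScript setup, and the normalization rule (origin + path only, no query or fragment) are my assumptions here, not anything flattr actually does:

```typescript
import { createHash } from "crypto";

// Normalize to just origin + path: query strings and fragments are dropped
// before hashing, so they can never leak even if the match logic misfires.
function normalize(rawUrl: string): string {
  const u = new URL(rawUrl);
  return u.origin + u.pathname;
}

function urlHash(rawUrl: string): string {
  return createHash("sha256").update(normalize(rawUrl)).digest("hex");
}

// Hypothetical whitelist: hashes of normalized URLs, never the URLs themselves.
const whitelist = new Set([urlHash("https://example.com/donate")]);

// Exact match only: no fuzzy matching, no parameters involved.
function isWhitelisted(rawUrl: string): boolean {
  return whitelist.has(urlHash(rawUrl));
}

console.log(isWhitelisted("https://example.com/donate?utm_source=feed")); // true
console.log(isWhitelisted("https://example.com/other"));                  // false
```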
I haven't looked at the flattr code, but if it's regex matching without going through proper URL parsing, I wouldn't trust it not to go bad at some point; there are too many edge cases, from the tricky to the mundane, such as matching on http://example.com#twitter.com , etc.
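For example (hypothetical, since again I haven't read their code), a naive regex over the raw string happily "matches" twitter.com in the fragment, while an actual URL parser looking at the hostname does not:

```typescript
const tricky = "http://example.com#twitter.com";

// Naive approach: regex over the whole URL string wrongly matches.
const naiveMatch = /twitter\.com/.test(tricky); // true, but wrong

// Proper approach: parse first, then compare only the hostname.
const parsedMatch = new URL(tricky).hostname === "twitter.com"; // false, correct

console.log({ naiveMatch, parsedMatch });
```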
A side benefit of hashes for Safe Browsing, though perhaps a bad thing for this use case, is that they also let you effectively anonymize the whitelist from your end users: clients can compare hashes client-side without the plaintext list ever being leaked.
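The flip side in miniature: the published list is opaque on its face, but anyone holding it can still test guesses against it, which is all "reversing" takes (names and list contents below are made up for illustration):

```typescript
import { createHash } from "crypto";

const sha256hex = (s: string) => createHash("sha256").update(s).digest("hex");

// The list as published: hashes only, no plaintext URLs.
const hashedWhitelist = new Set([sha256hex("https://example.com/donate")]);

// Anyone can dictionary-test candidate URLs against it, so it is
// opaque rather than truly private. Fine for Safe Browsing, maybe not here.
for (const guess of ["https://example.com/donate", "https://example.com/blog"]) {
  if (hashedWhitelist.has(sha256hex(guess))) {
    console.log(`guessed a list entry: ${guess}`);
  }
}
```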