If the storage media (spinning rust, SSD, USB key, etc.) is to see sensitive data, the device must be encrypted at the block level so that plaintext never touches it.
At that point, destroying it is a "nice to have," but not really required. Without the encryption key, it's just a source of pseudorandom data, and destroying a key securely is an awful lot more reliable than destroying a large amount of data.
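On Linux, both the setup and the key destruction go through cryptsetup. A sketch, assuming LUKS; /dev/sdX and the mapping name "securedisk" are placeholders, and these commands need root and will destroy whatever is on the target:

```shell
# Format the raw device with LUKS so only ciphertext ever hits the media.
# /dev/sdX is a placeholder -- point this at the correct device.
cryptsetup luksFormat /dev/sdX

# Open it and build the filesystem on the mapped (plaintext) layer.
cryptsetup open /dev/sdX securedisk
mkfs.ext4 /dev/mapper/securedisk

# Later, "destroying the data" is just destroying the key material:
# luksErase wipes every keyslot, leaving the device pseudorandom noise.
cryptsetup close securedisk
cryptsetup luksErase /dev/sdX
```

Note that luksErase only removes the keyslots; if you have a header backup or the passphrase stashed anywhere, destroy those too.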
However, if you have some media that you don't want recovered, what you do to it depends somewhat on the media.
If the drive is encrypted and spinning rust, I totally trust a dd of /dev/zero, followed by a hexdump of the device to verify that it's all zeros (hexdump, by default, won't show repeated lines, so if you see a line of zeros, a `*` line, and then the final offset, the drive is zeroed).
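A minimal sketch of that zero-and-verify pass, using a scratch file in place of the real device (on actual hardware you'd point both dd and hexdump at the block device itself, e.g. /dev/sdX, which is a placeholder name here):

```shell
# Stand-in for the drive: a 4 MiB scratch file full of "old data".
disk=scratch.img
dd if=/dev/urandom of="$disk" bs=1M count=4 status=none

# Overwrite everything with zeros and flush it out.
dd if=/dev/zero of="$disk" bs=1M count=4 conv=fsync status=none

# hexdump collapses repeated lines into a single '*', so a fully zeroed
# target prints exactly: one line of zeros, a '*', and the final offset.
hexdump -C "$disk"
```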
For SSDs that either haven't contained sensitive data or have been encrypted, blkdiscard is fine, though enough devices don't implement it properly that you must hexdump the device afterwards; if that shows anything, give 'er a couple passes of /dev/zero with a blkdiscard after. Then let it sit, powered on, for a couple hours to finish processing whatever it has queued.
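The verify-and-fall-back step can be sketched the same way. Here a scratch file stands in for an SSD whose discard didn't actually clear the flash (blkdiscard itself needs a real block device and root, so it appears only in comments):

```shell
# Stand-in for the SSD: 2 MiB of data that a faulty discard left behind.
disk=ssd.img
dd if=/dev/urandom of="$disk" bs=1M count=2 status=none

# On real hardware the first attempt would be:  blkdiscard /dev/sdX
# Then verify. More than three lines of hexdump output (one line of
# zeros, '*', final offset) means data survived, so fall back to zeroing.
if [ "$(hexdump -C "$disk" | wc -l)" -gt 3 ]; then
    dd if=/dev/zero of="$disk" bs=1M count=2 conv=fsync status=none
    # ...and on real hardware, follow with another blkdiscard and leave
    # the drive powered on so its firmware can finish the queued work.
fi
hexdump -C "$disk"
```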
If a drive has sensitive plaintext on it, you should zero it a few times, then, how's your industrial grinder look?
For spinning rust, to actually remove any residual data, you'll want to bring the platters above the Curie temperature, which will scramble the magnetic domains. "Glowing red from a propane torch" should do this, though thermite may be a bit more fun.
Putting bullets through them will satisfy the requirement for preventing casual recovery, but any residual data is still there if you wanted to get really fancy, even with a bullet hole or few in the drive.
For SSDs, I would generally take a good torch to the drive, get it "as glowing as possible," then sledgehammer it. There shouldn't be much left beyond powder after that, and at that point, the data is well and truly gone. If you want to be sure, a good "Will it Blend?" test of the remains should cover your edge cases, though I wouldn't use a blender you ever intend to use on food...
But, really, if you're at all concerned about this realm of data destruction, you must be using block device encryption.
Years of using FB when I was younger, general lapses in judgement or oversights in privacy in tense or pressing situations, compromises I've made in using services with certain people or jobs, many regretful apps, purchases, and sign-ups, years of my email and phone number shared with 3rd parties, I could go on...
And that's just the data I've shared knowingly, mostly due to the social contract when functioning in certain groups. The odds are stacked against the individual.
The reason I mentioned that it's a platonic ideal is that it takes a lot of education and experience for one to even _know_ how to be anonymous. We're not born with an a priori understanding of privacy wrt technology, and we don't learn about tech privacy early, say, compared to privacy in the physical world. Maybe older folks who were wise to all of these violations as they became prevalent, but younger people are already profiled and marketed to before they understand any of these concepts; it's that dire. I think an approach that may work is for the blocking and scrubbing to happen at the hardware and software vendor level, with privacy options baked in and set as the default. A given, sort of like how you expect a bathroom stall to be private.
Another solution, assuming we can't escape being tracked, would be to weaponise our data with tools like trackmenot[0] and adnauseam[1].
Always a good read, I'll see if I find anything in particular that stands out as lacking or wrong!
For a historic example, '...although Newton's solution was anonymous, he was recognized by Bernoulli as its author; "tanquam ex ungue leonem" (we recognize the lion by his claw).' https://en.wikipedia.org/wiki/Later_life_of_Isaac_Newton#Ber... I am guessing the proof techniques were so hard/original that Bernoulli concluded only Newton could've come up with it at that time.
Likewise, I would assume there's only a handful of people in the world who have this level of expertise on online anonymity. So do you then have to keep this expertise completely isolated to one particular anonymous account to avoid being correlated?
My concern applies to knowledge that isn't particularly world class too, as long as it's in a unique combination. I guess in some sense what we know makes us who we are, so it's hard to avoid being "identified" that way without splitting your personality?
I think you left out the word "know" from the above paragraph.
For a lot of people, their biggest threats will be personal relationships: an abusive relative or ex, someone they are currently divorcing. That doesn't seem to make your threat assessment chart, and I don't know if that's a good thing or not.
I'm not sure this level of "paranoia" is required to evade an abusive ex in most cases. But in any case, if a reader thinks it is, the guide can help for sure.