The two best conversations I can recall: one customer whose email address we changed about a half dozen times over a year because "hackers were getting in and sending them emails" (internal customer note: stop signing up for porn sites), and another whose computer could barely browse the web because they were running about five software firewalls, convinced they were "under surveillance by the NSA" (internal customer note: schizophrenia).
The expected value of processing requests like this any way other than patting the reporter on the head, assuring them the company will research it, and sending them on their way with a new device while chucking the old one in the "reflash" pile isn't just zero; it's sharply negative.
The author's mistake was not posting somewhere like NANOG or Full-Disclosure with a detailed write-up. The right circles would've seen it, the detailed write-up would've revealed that the author wasn't an idiot or paranoid, and the popped device might've been researched.
This is an organizational equivalent of a code smell. Something is off when support people aren't writing up the anomalies and escalating them.
Some of the most serious security issues I've ever had to deal with started with either a sales rep getting a call or a very humble ticket that a level one escalated up. The problem is that for every serious security issue that gets written up, forty-two or so get ignored, because the support agent is evaluated on tickets per hour or some other metric that incentivizes ignoring anything that can't be closed by sending a knowledge base article.
What is described in the article is a fantastic hack. Given my organization's structure and skills, you'd need to send it straight past three layers of support and several layers of engineering before you find someone who'd be able to assemble a team to analyze the claims. We'd spend four figures an hour just to confirm the problem actually exists - then we'd all go "oh shit, how do we get in touch with the FBI, because this is miles above our paygrade."
An average cable internet user walks into a retail ISP location, sets a cable modem on the counter, and says "this is hacked". What is the probability you'd assign to them being correct? How much of your budget are you willing to spend to prove/disprove their theory? How often are you willing to spend that - remembering Cox has 3.5 million subscribers.
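The budget question above can be made concrete with a back-of-the-envelope expected-value calculation. All numbers here are made-up assumptions for illustration, not Cox's actual figures:

```python
# Hypothetical expected-value sketch. Every number below is an assumption
# chosen for illustration; none come from Cox or any real ISP.
base_rate = 1e-6            # assumed probability a walk-in "hacked modem" claim is real
investigation_cost = 5_000  # assumed cost (USD) to seriously triage one claim
claims_per_year = 10_000    # assumed number of such walk-ins across millions of subscribers

expected_real_incidents = base_rate * claims_per_year
total_cost = investigation_cost * claims_per_year
cost_per_real_incident = total_cost / expected_real_incidents

print(f"Expected real incidents/year: {expected_real_incidents:.2f}")
print(f"Annual triage spend: ${total_cost:,.0f}")
print(f"Cost per real incident found: ${cost_per_real_incident:,.0f}")
```

Under these (invented) numbers, investigating every claim costs billions per genuine incident found, which is the point the comment is making about base rates.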
Friction is good. Hell, it's underrated! Introduce it to filter out fantastic claims: the stupid and paranoid are ignored quickly, leaving the ones that make it through as more likely to be real.
And "code smell" doesn't apply, even metaphorically, to cable modem support personnel. Those people aren't supposed to know how to escalate a case of a customer bringing in a suspected hacked modem. If they did that for every idiot customer who brought in a "suspicious" modem, the company's tech support staff wouldn't be able to get anything done. 99.999999999999% of the cases would not in fact be a hacked modem, so there really shouldn't be any pathway to escalate this as a serious issue.
I think after reading this I’ll continue that habit. Putting the phrase “code smell” in a review is like using the dark side of the force: you’re just being an ass.
The support staff probably wouldn't get anything done with any given bad-modem ticket, but the analyst looking at support data for the week or month might notice that we've had 82 reports of defective modems of a specific model in a short time frame, and that this is a new problem: one where we should probably grab a defective unit, get the vendor involved, and take a look to make sure we don't have a bigger issue. (The working assumption might be defective hardware, but that's why you gather evidence and investigate further.)
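The aggregate analysis described here can be sketched in a few lines: bucket tickets by model and time window, then flag buckets that spike relative to the previous window. The model names, counts, and threshold are all hypothetical:

```python
from collections import Counter

# Hypothetical ticket data as (modem_model, week) pairs; names and counts are made up.
tickets = (
    [("CM-800", 1)] * 3 + [("CM-900", 1)] * 2 +
    [("CM-800", 2)] * 4 + [("CM-900", 2)] * 82   # sudden week-2 spike for CM-900
)

def flag_spikes(tickets, threshold=5.0):
    """Flag (model, week) buckets whose ticket count is at least `threshold`
    times that model's count in the previous week. Crude and illustrative only."""
    counts = Counter(tickets)
    flagged = []
    for (model, week), n in counts.items():
        prev = counts.get((model, week - 1), 0)
        if prev and n / prev >= threshold:
            flagged.append((model, week, n))
    return flagged

print(flag_spikes(tickets))  # the CM-900 week-2 spike stands out
```

A real pipeline would normalize by install base and use a proper anomaly detector, but even this crude ratio check surfaces the "82 reports of one model" pattern the comment describes.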
The aunt who is convinced she has a stalker who is hacking all her devices, moving icons around, and renaming files to mess with her. (Watching her use the computer, she clearly has trouble with clicking and double-clicking and keeps brushing against the mouse/trackpad; call her out on it and she says she didn't do it.)
The coworker who was a college football player and now has TBI-induced paranoia. He was changing his passwords about three times a day. The last thing I heard about him before he was cut out of my social circle was that he got into a car accident because he was changing a password while driving.
Meanwhile I know zero people who have found any real vulnerabilities.
Admittedly, this is anecdotal, and it was a small company, and my skillset was being very underutilized at the time. However, I don't think it's hard to imagine a version of me closed-minded enough to normalize my own experiences and expect the same of others. In fact, I'd say I still fight that tendency, despite having seen otherwise firsthand.
Where's the lie? We all are.
Source: was a volunteer front-desk person at a museum. Spent a lot of my life dealing with people. They were sure of incorrect things all the time and could not be relied on to know.
In retrospect, Sam should definitely have hit the responsible disclosure page (if such a thing even existed in 2021) but I don't fault anyone for the choices they made here.