https://starecat.com/content/wp-content/uploads/control-of-i...
I would agree with you if Telegram actually had e2ee like Signal. But it doesn't: public content there is not end-to-end encrypted, so no encryption needs to be broken to moderate it.
Bad and criminal behavior has always existed. I see no evidence that it has been made any rarer by giving massively powerful, legally empowered organizations the right to monitor whatever they like at their self-righteously couched discretion.
Also note this part:
> IWF said that the company did remove CSAM once material was confirmed but said it was slower and less responsive to day-to-day requests.
So in the end Telegram removed the content.
I think it would be better if Telegram used the hash lists. However, they should rely on manual review rather than automatic removal, because this is a US platform that could theoretically be misused to remove legal content the US government doesn't like.
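The hash-list-plus-manual-review workflow described above could be sketched roughly as follows. This is a minimal illustration with made-up names: real systems use perceptual hashes (e.g. PhotoDNA) supplied by organizations like the IWF or NCMEC, not a plain SHA-256 set as here.

```python
import hashlib

# Hypothetical hash list of known illegal content. In practice these lists
# come from third-party organizations and use perceptual hashing, not SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-example").hexdigest(),
}

# Items flagged for a human moderator to confirm; nothing is removed automatically.
review_queue = []

def check_upload(content: bytes, item_id: str) -> str:
    """Flag hash-list matches for manual review instead of auto-removing them."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in KNOWN_HASHES:
        review_queue.append(item_id)  # a moderator confirms before any removal
        return "queued_for_review"
    return "allowed"
```

The point of the design is that a match only queues the item; removal still requires a human decision, which guards against poisoned or over-broad hash lists.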
And the capability to remove anything means they have to respond to secret orders from the government to remove something.
So with this attack on Telegram's encryption, the EU surely has no interest in seeing what political opponents are doing or who is organizing which protest, undermining them before they happen. "We're just hunting pedophiles, what's your problem?"
It seems beyond a "norm" when your CEO is jailed for not "conforming".
But let's face it many on HN deny the norm and couldn't care less if Telegram is used for criminal content. It's undeniable that the app has a certain reputation.
For those who have built or participated in building large-scale social networks (never mind global scale), you learn the Tim Ferriss rule very quickly: one in a million is a common occurrence.
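The back-of-the-envelope arithmetic behind that rule is simple. The figures below are illustrative assumptions, not any platform's actual numbers:

```python
# At large scale, "one in a million" events become routine.
daily_actions = 500_000_000       # assumed: messages/uploads per day on a big network
event_rate = 1 / 1_000_000        # a "one in a million" occurrence
expected_per_day = daily_actions * event_rate
print(expected_per_day)           # 500.0 — hundreds of such events every single day
```

At half a billion daily actions, even a vanishingly rare abuse pattern surfaces hundreds of times a day, which is why moderation load grows with scale rather than disappearing into the noise.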
As soon as you have a social network you’ll experience a massive industry, trained over decades & with plenty of financial backers, farming it for victims.
Government bodies are hand-waving in the same way companies are: moderation is a difficult, unsolved problem, and if you solved it you'd face a new set of difficult problems (each side feeling it is censored more than the others). No one has a solution, which is why regulation amounts entirely to demanding you "do enough to prevent the problem." It can't be done.
What I've seen expressed on HN has been the central premise of information systems since the beginning: if you require your information system to be free of crime and abuse, you will not have an information system anyone can use.
My personal stance is that an issue does not become a moral one until there is an actual solution on the table or the will to fund its development for the public good.
Where are the EU grants to solve this problem?
It seems fairly clear that the EU has invented a revenue engine that collects rent from tech companies to alleviate the pressure of its own unpopular cost burden on member countries. It's smart, probably inevitable, and a reality tech companies will have to contend with for the foreseeable future.
I'm not sure if Signal has that feature?