Bad actors can use encrypted services just as they use guns, even where those are illegal. Meanwhile, there's a spike in arrests over social media posts when people express opinions or content contrary to preferred narratives.
Fortunately, there are people exposing NGO money flows and who benefits from them. And fortunately, the US keeps free speech sacred.
I'm immensely grateful to the founding fathers and their ability to come up with something so helpful so many years down the line.
Can you speak freely in Russia, China, the UK, and beyond like you do in the US? How many people are incarcerated in each of those countries just for saying stuff vs. in the US?
As for the president, you can't stop him from pulling every lever to push his agenda, but at least the system allows for accountability.
In the meantime, you can tell as many people as possible what's going on, so that when voters vote again, they can be made aware of contrary information.
It's not a perfect system, but I think it's way better than exposing people to the risk of a crooked elite infiltrating the propaganda and censorship layer and making it impossible for contrary ideas to be shared.
For most of my life, news orgs have treated the national intelligence and law enforcement agencies as if they have a history of truth-telling. Whenever a press conference comes up, journalists and editors reliably forget that they've never been told a meaningful truth in one of these.
If the people whose job it is to highlight the lies of the powerful usually don't, what hope is there for the proletariat?
I admire that they're saying this, and wish other VPN companies would do similar public relations to highlight the risks of ad targeting.
Tbh, I'm a customer. Before Mullvad, I used PIA.
https://mullvad.net/en/blog/advertising-that-targets-everyon...
> And then?
> And then the European Parliament, in an almost historic consensus, voted against the proposal and called Chat Control nothing but mass surveillance. As one of the members of the parliament said: “The Commission wasn’t focusing on protecting children but wanted mass surveillance.”
As in Tom and Jerry: "This shouldn't happen to a dog, said dogs".
Or in Orwell's Animal Farm: "All animals are equal, but some animals are more equal than others."
and
"The creatures outside looked from pig to man, and from man to pig, and from pig to man again; but already it was impossible to say which was which."
It's true that servers get flagged, but every VPN has that issue. Usually switching to a new server resolves it, and I've noticed some servers aren't used much; those tend to be very fast and not flagged by many websites.
What I like about Mullvad is not only the commitment to privacy but also the VPN speeds. I get 300-500 MB/s pretty regularly. Some servers get congested during peak times, but after switching I'll usually find a fast one in a desirable country very quickly.
Every VPN provider's IPs are blocked now. IP data providers finally got serious about identifying them, 5 or 10 years back.
If you want to look for alternatives: https://kumu.io/embed/9ced55e897e74fd807be51990b26b415#vpn-c...
It's a noble fight trying to make E2EE compatible with the law. But I think some perspective is due for privacy advocates. People don't want freedom and privacy at the cost of their own security. We shouldn't have to choose, but if nothing else, the government's single most important role is not safeguarding freedoms but ensuring the safety of its people.
No government, no matter how free or wealthy, can abdicate its role in securing its people. There must be a solution for fighting harmful (not necessarily illegal) content that can be incorporated into secure messaging solutions. I'm not arguing for backdoors in this post, but even things like Apple's CSAM-scanning approach are met with fierce resistance from the privacy advocate community.
This stance that "No, we can't have any solutions, leave E2EE alone" is not a practical stance.
Speaking purely as a citizen, if you're telling me "you will lose civil liberties and democracy if you let governments reduce CSAM", my response would be "what's the hold-up?". Even if governments are just using that as an excuse. As someone slightly familiar with the topic, of course I wouldn't want to trade away my liberties and freedoms, but is anyone working on a solution? Are there working groups? Why did Apple get so much resistance, yet there are no open-source solutions?
There are solutions for anonymous payments using homomorphic encryption. Things like Zcash and Monero exist. But you're telling me privacy preserving solutions to combat illicit content are impossible? My problem is with the impossible part. Are there researchers working to make this happen using differential privacy or some other solution? How can I help? Let's talk about solutions.
If your position is that governments (who represent us, the voters) should accept the status quo and just let their people suffer injustice, I don't think I can support that.
Mullvad is also in for a rude awakening. If criminals use Tor or VPNs, those will also face a ban. We need to give governments solutions that let them do what they claim they want to do (protect the public from victimization) while preserving privacy, to avoid a very real dystopia.
Freedoms and liberties must not come at the cost of injustice. And as I argued elsewhere on HN, in the end, ignoring ongoing injustice will result in even fewer freedoms and liberties. If there were a pluralistic referendum in the EU over Chat Control, I would be surprised if the result weren't a law even worse than Chat Control.
EDIT: Here is one idea I had: sign images/videos with hardware-secured chips (camera sensor or GPU?) so they are traceable to the device. When images are further processed/edited, they would then be subject to differential-privacy scanning. This could also combat deepfakes, if image authenticity can be proven by the device that took the image.
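A minimal sketch of the signing part of that idea, under loud assumptions: the device key, device ID, and function names are all hypothetical, and HMAC with a shared secret stands in for the asymmetric attestation key a real secure element would use (verifiers should never hold a signing secret in practice):

```python
import hashlib
import hmac

# Hypothetical per-device secret. Real hardware would keep this in a
# secure element and expose only a signing operation, never the key.
DEVICE_KEY = b"secret-provisioned-at-manufacture"
DEVICE_ID = "camera-0001"

def sign_capture(image_bytes: bytes) -> dict:
    """Attach a device-traceable signature at capture time."""
    tag = hmac.new(DEVICE_KEY, image_bytes, hashlib.sha256).hexdigest()
    return {"device_id": DEVICE_ID, "image": image_bytes, "sig": tag}

def verify_capture(record: dict, device_key: bytes) -> bool:
    """A verifier holding the device's key can check authenticity."""
    expected = hmac.new(device_key, record["image"], hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])

record = sign_capture(b"raw pixel data")
print(verify_capture(record, DEVICE_KEY))   # True: untouched capture verifies
record["image"] = b"edited pixel data"
print(verify_capture(record, DEVICE_KEY))   # False: any edit breaks the signature
```

An edit breaking the signature is exactly what would route the file into the proposed scanning path, while untouched captures stay provably device-authentic.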
Yes. You cannot have a system that positively associates illicit content with an owner while preserving privacy.
Apple tried and made good progress. They had bugs which could have been resolved, but your insistence that it couldn't be done caused too much of an uproar.
You can have a system that flags illicit content with some confidence level and have a human review that content. You can require that any model or heuristic used be publicly logged and audited. You can anonymously flag that content to reviewers, and once a human deems it actually illicit, the hash or some other signature of the content can be published globally to reveal the devices, and owners of the devices, holding it. You can presume innocence (such as a parent taking a pic of their kids bathing) and question suspects discreetly without an arrest. You can require cops to build multiple sufficient points of independently corroborated evidence before arresting people.
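The flag, review, and publish steps above can be sketched roughly as follows. All names are hypothetical, and exact SHA-256 stands in for the perceptual hashing a real system would need (any re-encode changes an exact hash), so this is only the shape of the workflow, not a workable detector:

```python
import hashlib

published_hashes: set[str] = set()   # globally published content signatures

def flag_for_review(content: bytes, confidence: float) -> bool:
    """Step 1: a logged, audited model/heuristic flags above a threshold."""
    return confidence >= 0.9

def human_confirms(content: bytes) -> bool:
    """Step 2: an anonymized human reviewer confirms or rejects the flag."""
    return True  # stand-in for the manual decision

def publish_if_confirmed(content: bytes, confidence: float) -> None:
    """Step 3: only human-confirmed content has its hash published."""
    if flag_for_review(content, confidence) and human_confirms(content):
        published_hashes.add(hashlib.sha256(content).hexdigest())

def device_matches(local_content: bytes) -> bool:
    """Step 4: devices check local content against the published list."""
    return hashlib.sha256(local_content).hexdigest() in published_hashes

publish_if_confirmed(b"illicit-sample", confidence=0.97)
print(device_matches(b"illicit-sample"))   # True
print(device_matches(b"holiday-photo"))    # False
```

The point of the ordering is that no hash leaves the review pipeline until a human has confirmed it, so devices are only ever matched against human-vetted signatures.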
These are just some of the things that are possible that I came up with in the last minute of typing this post. Better and more well thought out solutions can be developed if taken seriously and funded well.
However, your response of "Yes." is materially false; lawmakers will catch on to that and discredit everything the privacy community has been advocating. Even simple heuristics that don't use ML models can have a higher true-positive rate for identifying criminal activity than eyewitness testimony, which is used to convict people of serious crimes. And I suspect you meant security, not privacy, because as I mentioned, for privacy, humans can review before a decision is made to search for the confirmed content across devices.
The world won't fall apart because people have secrets.
The main problem is that there are no products that solve what Chat Control aims to solve without massively infringing on everyone's privacy (children included). The suggestions that do exist come with serious risks or complexities; e.g., homomorphic encryption is a relatively new area with expensive computational requirements.
The reason is that it's easier to encrypt data than to develop some kind of system with a magical key that only authorized people can use under certain circumstances.
What Mullvad highlights is that the whole Chat Control proposal is mired in corruption. A particular individual with an agenda to sell something has financial interests adjacent to being part of the solution. No doubt they will want funding for "research", because they don't actually have a solution everyone can use. They try to make it appear as if they do (grift) to get the politicians on board. Then there's a harassment-campaign component (specifically the EU Survivors Taskforce) which aims to apply public emotional pressure on any remaining politicians who have concerns.
In the end, everyone else (companies, developers, etc.) will have to do the heavy lifting to try to find some way to comply, under their own legal interpretation, with whatever vague brain fart is passed into law.
Make no mistake about it: this proposal has nothing to do with child protection; it is all about demonizing the use of encryption. Law enforcement would love to be able to simply argue that the presence of encryption means there is likely to be offending. This is why they fight so hard in the UK over Apple having default encryption on ADP: you can't argue to a court that owning an iPhone makes you a criminal, for instance.
The endgame, and the goalpost movement, will simply be to argue that people used non-compliant software/products. If they do have something on the person, this will be used to argue that further offenses were likely concealed (even if that is not the case) and that effort went into concealing them (premeditation). It's a gift that keeps giving all along the trial process.
> EDIT: Here is one idea I had: sign images/videos with hardware-secured chips (camera sensor or GPU?) so they are traceable to the device. When images are further processed/edited, they would then be subject to differential-privacy scanning. This could also combat deepfakes, if image authenticity can be proven by the device that took the image.
And there obviously will be totally, like, no way to not do that and then have an anonymous photo. What are you going to do, confiscate all the computers, phones, and cameras that already exist and don't have this special "hardware secure chip"? Honestly, at this point I think you're a troll.
> If your position is that governments (who represent us, the voters) should accept the status quo and just let their people suffer injustice, I don't think I can support that.
Things can always be worse, and you shouldn't assume that the powers that be will only use these tools to prosecute the things you find morally offensive. Which is another problem in itself.
> Mullvad is also in for a rude awakening. If criminals use Tor or VPNs, those will also face a ban. We need to give governments solutions that let them do what they claim they want to do (protect the public from victimization) while preserving privacy, to avoid a very real dystopia.
The space will innovate regardless of what governments want, so that's the rude awakening. Criminals always will be criminals and they'll just get better at doing what they want to do regardless.
> Freedoms and liberties must not come at the cost of injustice. And as I argued elsewhere on HN, in the end, ignoring ongoing injustice will result in even fewer freedoms and liberties. If there were a pluralistic referendum in the EU over Chat Control, I would be surprised if the result weren't a law even worse than Chat Control.
Okay, then I guess we can all "think of the children" whenever anyone worries about the injustice caused by abuse of these new powers.
> I understand that you seem to think that adding systems like this will placate governments around the world but that is not the case. We have already conceded far more than we ever should have to government surveillance for a false sense of security.
Placation of government and law enforcement is never complete. For them, every goalpost moved is perceived as making their job easier. They only have one job, and that's to convict people of things; that is the only metric they care about. That includes making up new offences to charge people with, including "the defendant used non-compliant products to hide their offending, which may or may not exist". That's not a crime in the EU right now, but you can bet it will be the next step if people refuse to use compliant products.
> Let me post a longer reply later. But for your last point, we do have automated machine generated alarms in form of smoke detectors. We're legally required to have them in our homes.
A smoke alarm has very little room for abuse as it only does one thing which largely aligns with the occupant's interests. A more comparable argument would be that you must have cameras in every room in your house to record burglars, home invaders and potential child abductors. We need not look any further than the abuse of door bell cameras in the US to see how that plays out.
Funny how nobody has ever made that argument.