Are there exceptions? I'm sure. Will I sometimes err by being too cautious? Definitely. But there is really not much of an alternative these days.
The chance of the data leaking nears 100% with time.
The corporate cloud is a seriously unsafe place to be. It's a dangerous place to store your intimate secrets and a shaky foundation on which to build a culture.
Edit: at least when it comes to PII, which I presume should include photos of you or any personal detail about you. The content you may have posted there up until then, though, might be a different story.
Am I too idealistic? If such apps are not aggressively seeking hyper growth, it seems like these more trustworthy services could be deployed to cheap servers and let people use them for cheap without having to resort to selling user data.
The real problem is how to trust that a "privacy-focused" app is actually privacy-focused. You certainly can't take the publisher's word for it.
The only safe stance is to withhold as much personal information from as much software and services as possible.
I think the real problem is actually that legislative bodies will make privacy focused apps illegal. California AB 1043 is an example of what can happen.
On one axis, you have privacy: at one extreme, the most private of people don't even use social apps; they are traditionally private people. At the other extreme, you have the heaviest consumers of apps, the people who demand the most sharing.
On the other axis, you have technical acuity: at one extreme, people who can audit the software they use and verify that it actually does what it says; at the other, people who have no clue and will believe whatever sounds convincing.
Given this, the market for "app that enables sharing, but has privacy controls, and is verifiably so" is a tiny circle somewhere in the middle of this grid.
Unless the software sends data off to the cloud or a server somewhere. You can't audit what happens there.
Ten years later, the social media revolution was in full swing, the relatively small service they had built, which catered mostly to nerds, was suddenly lucrative, and they sold to Match Group. And this happens.
To be entirely fair to these guys, I don't think they came into it with selling out as their long-term goal. But four guys who got into data analytics in college also didn't particularly want to find themselves, as their mid-30s approached, running a dating service for the rest of their lives, either.
Whatever happened to FetLife? If any dating service had to be privacy-focused, that was it.
Open source developers are wildly idealistic. In the rest of the world, I have finally internalized...
1. Most people say they care about privacy... but won't spend even $1 for it. They care about their privacy about as much as an open source developer cares about user experience. Just extract the tarball, it's not that hard.
2. Most people don't care about technology and want it out of their lives. They don't want to know what sideloading is. They don't want to learn how to discern safe from dangerous. And they aren't wrong. How many open source developers know how to drive manual? Car enthusiasts have just as righteous a claim to attention, after all. The model railroad enthusiasts are also upset by our community's lack of attention. Enthusiasts in every field, hundreds of fields, are upset by the lack of mainstream attention, and this will never change.
3. Linux and open source software in general are not even close to being popular on the desktop. Gaming and web browsing are a tiny subset of what people buy PCs to do, and Linux isn't even close on the rest. Even the gaming success is so niche it's irrelevant in the grand scheme of things (in its first 24 hours, the Switch 2 outsold three years of Steam Deck sales).
4. Some of this optimism was deluded from the start. Like when Stallman said we could defeat proprietary software with open source, then openly admitted he had no idea how any open source developers could afford rent. "If everyone works for free while the big companies stop working, we can get ahead" is gobsmackingly naive, and it's honestly astounding anyone fell for it.
Maybe they are smarter than you: they've noticed that trust is being violated constantly, so paying for privacy in no way means you will obtain it, and is just a waste of money.
I mean, an app that starts out as "privacy focused" won't necessarily stay that way.
If we had some sort of "federated" system, we'd still have this problem, because you can always find yourself federated with someone who just wants to sell the information.
It's a cultural problem within this hyper-aggressive version of Capitalism that we've adopted, that even data about people has value. Until we decide as a culture that this kind of data sale or data use is shameful and unacceptable we'll be in this situation no matter what technical solution we adopt.
But it seems to be the natural outcome of the incentives, of an organization made of organisms in an entropy-based simulation.
i.e. the problem might be slightly deeper than an economic or political model. That being said, we might see something approximating post-scarcity economics in our lifetimes, which will be very interesting.
In the meantime... we might fiddle with the incentives a bit ;)
Can you elaborate more on this? All I see is growing inequality.
This deserves a few qualifiers. I think this should be applied to any service that is
- "free" or "freemium"
- wrapped as a black box which gives no way out for customers.
There are plenty of companies out there who provide services based on FOSS, but we collectively shy away from paying them because it seems "silly" to pay for software that people can run for free.
Most AI startups will never be profitable.
Reduces anxiety.
Banking is anxiety-inducing, but other than that I'm probably better off. I don't really send anything sensitive.
So... Their punishment for breaking the law is having to promise to follow the law going forward?
I wish I had that superpower, too.
I was dismissed. "The privacy policy doesn't allow it"
Peeps: privacy policies are not binding agreements, and even if they were, the policy always allows the corporation to sell your data.
Always.
No matter what it says today, because literally tomorrow they can change it to whatever they want.
[0]: https://en.wikipedia.org/wiki/Biometric_Information_Privacy_...
Per violation. Wow.
> OkCupid and Match do not have to pay a financial penalty in a deal made with the FTC over an incident from 2014. OkCupid and Match did not admit or deny the allegations but agreed to a permanent prohibition barring them from misrepresenting how they use and share personal data
And
> In September 2014, the CEO of Clarifai, Inc. e-mailed one of OkCupid’s founders requesting that Humor Rainbow give Clarifai, Inc. (i.e., the Data Recipient) access to large datasets of OkCupid photos. Despite not having any business relationship with Humor Rainbow, the Data Recipient sought Humor Rainbow’s assistance because each of OkCupid’s founders, including Humor Rainbow’s President and Match Group, LLC’s CEO, were financially invested in the Data Recipient.
Could this be the backstabbing surveillance capitalism incident that finally gives pause to tech executives?
Money was already being made off the dating alone, and the accumulating facial data was a no-cost item from the beginning.
Even though the data is mainly just a working foundation for the dating service, eventually the database got so big that lots of value could be extracted in other ways.
It would be difficult to put an exact dollar figure on the value of a database like that itself for sure.
And selling it outright could be considered unethical in some people's eyes, so those in control may well have decided to start that adjacent facial recognition company instead. After all, OkCupid isn't handing the data over to a different company for good: the dating company loses nothing and receives no compensation for it. OkCupid just keeps going as normal while the new face-recognition company springs up.
This is AI. This "limited" facial recognition approach doesn't require ownership of the data; they just needed to "borrow" it for a while.
One counter-pressure is regulation. But hey, the US has a fetish for deregulation, and so here we are.
I don't participate in this stuff anymore; the dating app algos have put me in the ugly stack. Sad but true.
Also, nowadays it's hard to tell if people are real.
Why? There is no re-education that could make someone like Sam Altman, Elon Musk, Donald Trump or the people behind Match Group be a net positive contributor to society again.
Therefore... I'm fine with everything that makes them suffer, just like they made us all suffer.
- Card payment (non-prepaid cards)
- Government ID photo or passport
- Live video recording
I mean, come on. This bullshit is what you said before.
You haven't changed; you're just pissed off you got caught, but a bit smug you got away with it scot-free.
All considered, I can't think of a worse database to train facial recognition on.