https://seekingalpha.com/news/3427520-apple-banning-facebook...
> "Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple," Apple says. "Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.”
[1] I previously posted this development as a story, but it got flagged as a dupe. It seems like this is the more truthful angle, since Apple seems to be the real agent that caused this shutdown.
I think I found it? Was it called Onavo on Android? https://play.google.com/store/apps/details?id=com.onavo.spac...
It is only available via referral.
We collect information such as:
* The apps installed on your phone
* Time you spend using apps
* Mobile and Wi-Fi data you use per app
* The websites you visit
* Your country, device and network type
We use this data to:
* Improve and operate the app
* Analyze apps usage
* Help improve Facebook Products
* Build better experiences for our community
Reasons for the contrary (i.e., that Apple could not have known): my premise is incorrect; the metadata is insufficient; Facebook or in-house apps in general are much more widely used than presumed and are often deployed from public-facing servers, resulting in a location distribution too similar to the actual US population distribution; etc.
I would hardly expect Apple to be scrutinizing the install patterns of an enterprise app. Those resources would be better spent improving the App Store review process and TestFlight.
I have two objections to that. First, they're almost certainly not making that deal from an informed position. Security and privacy are very complicated matters, and most people won't even know to ask questions about data retention, aggregation, sharing, differential privacy, etc. Instead, they'll substitute in an easier, more salient question (https://en.wikipedia.org/wiki/Attribute_substitution), like "Is $20 a month something I want?"
The second issue is that MITMing all network traffic on a phone will necessarily scoop up the user's credentials, as well as private messages and metadata from that user's friends and family. It's naive to think about privacy as something an individual can accomplish. When you sell data from your networked social device (i.e., your phone), you're also selling out all your friends and family. Given all that really hot data, what incentives are there for the data collector to act responsibly and protect that data? The reality is pretty much none.
After spending an hour reading similar comments to yours, I have to ask this question.
How informed should the user be? What qualifies as an informed user? This is getting into some dangerous territory, because it implies some sort of contract-literacy requirement.
>The second issue is that MITMing all network traffic on a phone will necessarily scoop up the user's credentials, as well as private messages and metadata from that user's friends and family.
Do they not already do that with access to Facebook Messenger and Instagram? Why is this not screamed from the top of every hill?
This data collection is also different because they literally say it is for research. (They are legally protected, unfortunately.)
>Given all that really hot data, what incentives are there for the data collector to act responsibly and protect that data?
Hmm, how about getting fined? (The issue is how much they should be fined, and the answer, in my opinion, should be similar to how the SEC prosecutes insider trading: a fine on top of whatever you made, to strongly discourage you from doing it again, or jail time.)
As a company they want to collect data for whatever reason (a research project, training data for an ML system, creating a new product, refining an existing one or assessing the overall market). So they built software for data collection and are paying people money in exchange. How is this different from Amazon Mechanical Turk or any crowdsourcing platform which pays people for data?
I admit I haven't seen what the contract looks like, but are people suggesting that the end user still does not understand that they are providing user data to be used by Facebook in exchange for money, or has no power to decline such an offer?
But it's much easier for the press to run click-baity titles like "Facebook spies on teens! Again!!" and many folks on HN quickly jump on the bandwagon.
That said, the part of this backlash I sympathize with is when someone hates benefiting FB or having any of their data in its hands, but then has a friend who uses this service and uploads all their collected chat conversations to FB.
and no, I don't work for them.
https://seekingalpha.com/news/3427520-apple-banning-facebook...
> Apple says. "Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.”
It sucks huge amounts of power, and I assume the Facebook app is collecting all sorts of nefarious metadata off phones at no benefit to the user. Fine-grained controls would allow users to prevent Facebook from accessing their information without the user's explicit choice, which would be a huge improvement.
There should be a big "Facebook wants to read and analyze ALL of your contacts. OK or Cancel" prompt before each read, or you should have to explicitly allow "Allow Facebook to read and analyze all of your SMS messages at any time"
Apple's revenue last quarter was a "disappointing" 60-some billion dollars, based on devices that have a multitude of functions beyond Facebook. Not having Facebook available on the device would probably decrease their value somewhat, but with Facebook getting tons of bad press it could work to Apple's and alternative social networks' long-term benefit, further cementing Apple as the protector of your privacy and Apple-blessed social networks as better than that trash Android users run on their comparatively open platform. Users already don't seem to value openness much and place higher stock in privacy and security.
Facebook's revenue in the same time frame was around 16 billion. Facebook depends heavily on access to mobile platforms to reach users. Apple is only 10-ish percent of the global market but at least a third of the US market, which Facebook needs. Since people are unlikely to chuck their $1000 phones in the trash before 1-2 years are up, it seems likely that most users would keep their Facebook account to stay in contact with friends and family on Facebook, but would come to rely on other chat apps on their phone and might ultimately use those apps instead of Facebook, on and off their phones.
From Facebook's perspective, its users wouldn't immediately drop, but engagement could drop through the floor for that segment while offering substantial help to competitors that may, in the long run, turn Facebook into MySpace 2.0. Apple losing Facebook would be a negative. Facebook losing Apple would be an existential threat. If removed from the Play Store as well, it would basically be dead in the water. This would probably never happen, because if Apple and/or Google threatened to do so, Facebook would basically have to give them its left arm and throw in a leg to secure its continued existence. Its bargaining position with either is terrible.
I think both should work together to force Facebook to do a better job of protecting its users' privacy.
As an aside, I think that in the longer run Facebook could consider bypassing the respective app stores entirely and operating purely as a website, but I don't think that's a likely short-term prospect.
Apple has $245 billion cash on hand. That's enough to build their own feature equivalent social network and give $20 to anyone who signs up.
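The arithmetic behind that hyperbole checks out with room to spare; a quick sanity check (figures taken from the comment above, purely illustrative):

```python
cash_on_hand = 245_000_000_000  # Apple's reported cash, per the comment
signup_bonus = 20               # hypothetical $20 paid per signup

# How many $20 signup bonuses could that cash pile fund?
max_signups = cash_on_hand // signup_bonus
print(max_signups)  # -> 12250000000, more than the world's population
```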
FB only stops doing things that are obviously morally reprehensible when it gets caught or called out in the media. It never proactively does the right thing. :-(
Also, Google updated Android's certificate-trust behavior so that apps only trust built-in roots by default. https://android-developers.googleblog.com/2016/07/changes-to... might explain why "project atlas" is only available for Android devices running Marshmallow and earlier (they can't snoop encrypted app traffic on later versions)
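For context: since Nougat (API 24), an app has to opt in explicitly before user-installed root CAs, like the one this research app required, are trusted by its TLS stack. A minimal sketch of that opt-in via Android's network security config (file name and manifest reference follow the standard convention):

```xml
<!-- res/xml/network_security_config.xml, referenced from the manifest via
     android:networkSecurityConfig="@xml/network_security_config" -->
<network-security-config>
    <base-config>
        <trust-anchors>
            <!-- System roots are trusted by default on API 24+ -->
            <certificates src="system" />
            <!-- User-installed CAs must be opted into explicitly -->
            <certificates src="user" />
        </trust-anchors>
    </base-config>
</network-security-config>
```

Without the `src="user"` line, a user-installed root certificate is ignored by the app's TLS connections, which is why on-device MITM of app traffic stopped working by default after Marshmallow.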
https://www.betabound.com/referral-instructions-for-project-...
He turned down our offer to take a job building a data-analytics platform at Facebook: "Oh, it was just too exciting a tech not to work on."
I mean consider this scenario: A, B, C, D, E are in a meeting to discuss a new project that A has planned. D and E like the idea. B is not committed either way. C says they can't see the project being a good thing but uses the magic words "disagree and commit". How often does it cause A to go back and say ok maybe my idea was not good?
The solution to these problems is oversight from security, compliance, and privacy teams on all systems dealing with consumer information, and privacy education for all employees on a regular basis. GDPR is a step in that direction.
Come on guys we just want to build a better ad what could possibly go wrong!
That's another interesting move.
Ouch. Yes that's really bad, even if it wasn't a violation of Apple's terms of service (it was, apparently).
Wonder what it will reappear as next?
"We totally apologize for making this mistake, which is totally on me and we promise to clean up our act and behave better in the future and we're sorry if you feel offended"
apology by Mr. Zuckerberg a few month later.
(inspired by https://knowyourmeme.com/memes/milkshake-duck)
The Apple gatekeeping here is terrible.
1) Facebook said to the media that they’d stop distributing it to iOS users.
2) Apple revoked Facebook’s certificates from the enterprise signing program.
https://support.google.com/audiencemeasurement/answer/757381...
https://support.google.com/audiencemeasurement/answer/757389...
I have previously worked with developers in start-ups and was amazed to hear some of their backgrounds. One previously worked for a company that bundles spyware with freeware MSI products. Another worked for an airline agency, where they advertised a fake discount on tickets when the price was in fact higher than market. Needless to say, their actions outside the work environment matched their actions inside.
A $200k/year graduate starting salary is enough to get a lot of people to set their values to one side for a while.
Personally, this doesn't really go against my morals or values. They installed the app. No one forced them to. Not everyone has the same set of morals and values you do. If it's not illegal and no one is harmed, I don't care so much.
I can't prove a negative. So you would have to prove the positive, what harm came from this app?
It’s an ongoing source of debate and confusion as to what Software ‘Engineer’ means.
Someone likely will do nearly every unethical thing, but that doesn't mean it is right for anyone to do it.
It would be too much to expect engineers to somehow be angels without the threat of punishment when even doctors need these systems.
But on the other hand, imagine if you heard about this agile thing but you can’t legally apply it because waterfall is mandated and if you apply agile you might be stripped of your license to practice software engineering.
This isn't a very meaningful statement as people widely disagree about what the ethical action is.
Facebook in general seems to have good intentions and terrible awareness of what is crossing a line. They really wanted their platform to be the center of everyone’s lives and now appear completely unprepared for the consequences of pushing that agenda so hard.
> "It is difficult to get a man to understand something, when his salary depends upon his not understanding it!"
-- Upton Sinclair (https://en.wikipedia.org/wiki/Upton_Sinclair)
These are good people. They just don’t care.
I think these two statements contradict each other.
It's a given assumption that anyone who once worked for a shady company is also shady?
EDIT: https://seekingalpha.com/news/3427520-apple-banning-facebook...
If FB hires these people, is there a difference?
Do you think a random admin assistant knows more than a techy teenager about the dangers of data privacy? Or is she just trying to scrape by for another week of rent?
This sounds like alarm from the rich and well to do about an ‘immoral’ way for the poor to make money.
Interesting nonetheless.
No harm done...
less than 8 hours after the original post hits #1 on HN.
Could Google Play?
I am aware that's not a wanted discussion topic normally, but if behaviour related to an article about dirty behaviour smells dirty it feels a bit relevant.
(in contrast, the similarly highly-upvoted Facetime bug thread [0] seemed to stay up longer, but no official fix had been made in the ~18 hours since it was first discussed)
Facebook will shut down its c̶o̶n̶t̶r̶o̶v̶e̶r̶s̶i̶a̶l̶ dastardly market research app for iOS
"Your honor, I swear! He clicked on the button saying he was over 18!!! How was my porn site supposed to know he was lying?"
If Facebook wanted to do something useful, it could devote some of that $400 billion empire to identity verification and ways to prevent deceptive practices online. Something that goes beyond Stone-Age practices like SSNs and CAPTCHAs that we rely on today.
Now the new controversy is data. Don't ever sell your data! Don't give up your data! But teens are doing it widely and profusely.
At some point you just have to accept that new people will be born and will have new ideas and just won't give a fuck. As a business you can either sit on the sidelines and watch or capitalize on it.
Facebook will lead the way to allow for smaller players to get away with doing this as well.
That's not how any of this works. It's hard to 'move fast and break things' when a company like Facebook has used all the loopholes and gotten caught.
Small players will never be able to replicate Facebook or their scummy tactics because at this point Facebook wants regulations. Regulations will ensure Facebook can and will continue to operate (can pay any fine) while the competition is hamstrung. If your competition is hamstrung it makes it easier to buy em up or wipe em out.
Just to come full circle: Facebook purchased Onavo so they could spy on users and figure out which apps they were using and how often. Facebook then used this data to buy up companies.
https://www.fool.com/investing/2018/12/05/facebooks-onavo-sp...
I don't believe the behavior you're describing actually falls along generational lines, though, rather than technical competence or awareness. It isn't only "teens" who use social media, nor is it only "old people" who are concerned about privacy. In fact, younger generations are leaving Facebook and social media because of privacy concerns and the negative effect it has on their lives.
Your arguments here are stereotypical ageism and lack enough nuance to be convincing.
* Apple is probably the least guilty here; however, as long as their OS remains closed, we don't actually know how much personal data they are harvesting.
This is quite a step up from sending home some telemetry data.
Facebook's market research app isn't "redirecting ALL INTERNET TRAFFIC through their servers" either. If you read the article it states that the app monitors phone and web activity and sends it back to Facebook. This is not really any different from sending home telemetry data. Google, Microsoft and Apple all send home encrypted data, most of which is not verifiable using Wireshark.
I'm no fan of Facebook, far from it in fact, but at least this is an opt-in service and users are being compensated for sacrificing personal data. The same can't be said for the other three companies.
What's next? People will start harassing students/researchers that do paid studies?
Pushing the app distribution thing aside, being aboveboard in every other respect is still insufficient justification for ethically questionable processes.
Facebook has a history of unscrupulous untrustworthiness which should not be overlooked when examining the implications of the scheme, particularly the requirement to install a root certificate. To ignore the context of the polemic, to pretend Facebook is just another company rather than one of the largest collectors of personal, private information on the planet, multiple times caught invading peoples' privacy through less than honourable means, is foolhardy at best and dangerous at worst.
Will it blow up in your face? Perhaps. I could not care less. It's your problem. You didn't know what you were being asked for? Again - it's your problem.
People agreed to do that on their own, they got paid for it, and, most likely, they don't care if FB knows what kind of porn they browse.
All the stuff about ethics and "honourable means" is irrelevant in this argument. Is war ethical? Is spying honourable? Depends on whom you ask.
>requirement to install a root certificate
Requirement? You can just tell them to f*ck off.
In any case, I could not care less about this, but what annoys me is the people that pretend to be super-nannies that are gonna save the world by telling others what they should do. Throughout history, this has never worked.
That isn't the point. The point is that Facebook preyed on technical illiteracy and a general and widespread lack of understanding of the implications of participating in the program. Effectively, the majority of participants were tricked. The age range for participants also included people who were not of age, and therefore not legally responsible for their actions. Facebook must accept that responsibility.
The point is that Facebook flouted Apple's guidelines for the distribution of apps outside the App Store, showing a blatant disregard for the protections put in place to protect consumers from bad actors. Facebook has positioned itself as a bad actor through their actions, not only in the questionable collection of data the implications of which the users will likely be unaware, but also in how such a program was distributed: in direct violation of the protections offered by the App Store.
The point is not about the choices made by the end users, but rather the unscrupulousness of a big company that knows better — but has done this before, far too many times.
Finally, the very nature of online interactions in the modern era means that people aren't just signing away their own privacy but also, to a lesser extent, the privacy of those with whom they interact. Facebook is perfectly aware of this potentiality, but users are not and, on the whole, will never be because it isn't their job to understand the technical dimensions of online communication. A big company like Facebook, however, does know and should have behaved accordingly.
> [...] is irrelevant in this argument. Is war ethical? Is spying honourable?
What do these two examples have to do with the actual circumstance? "Market research" is not war, and it certainly should never be construed as synonymous with spying. Do not construct a strawman against which to argue, it demeans your argument.
> Requirement? You can just tell them to f*ck off.
You could, but then you would not be adequately participating in the research. You would not be using the VPN as described. You would not earn the $20. There would be no discussion.
Of course, you're failing to account for the fact that none of the participants will have had the privacy and security implications of the root certificate explained to them in a way that made sense to them. They'll have simply followed instructions to get their money.
I do not think that people should ever be blamed for being deceived as to the severity of their actions in situations such as this; a big company like Facebook does not escape scrutiny here. Clearly you believe differently, although the downvotes will tell you how well-received such a laissez-faire attitude to other peoples' private lives and preying on their technical ignorance is seen, so I shan't bother to comment any further.
> what annoys me is the people that pretend to be super-nannies that gonna save the world by telling what the others should do
Thankfully, that's not the situation *at all*, and I fear you're simply projecting some negative feelings onto this article in order to justify an unjustifiable gripe. Here's what's happening:
- Facebook previously had a VPN service that it advertised as being for market research purposes. It was removed from the App Store.
- Facebook then started using Apple's alternative app distribution method intended only for use in enterprise situations, not for the general public.
- They were found out.
- Facebook voluntarily ended the program for iOS users.
- Apple revoked Facebook's certificate as punishment for flouting the rules.
Who is telling whom to do what, here? Facebook did many things wrong, was found out, and was punished appropriately. The technical details of its actions were analysed and found to be vastly overstepping its bounds, yet in step with its continual and repetitive breaches of personal privacy.
There's nothing more to it than that, so put the strawman back on the farm where it belongs.
I'm very interested to see what Apple's response is going to be. I'd not be shocked (in fact I'd be delighted) to see them penalize FB in some way, perhaps suspend their App Store account or something.
OK, let's imagine that you have a close friend or family member with some confidential issue - maybe an illness, maybe debt, maybe they are in the closet. Occasionally they message you, on old-fashioned SMS or email, mentioning something about it.
How many dollars is a reasonable trade to tell a data collection agency everything you know so they can add it to their file on your friend/relative?
I probably wouldn't, unless it was in excess of maybe $1000/month. And even then I'd probably just get a new phone. But people should have the right to sign contracts, even if they seem exploitative, as long as they are aware of what they are agreeing to.
The main problem it seems here is that a lot of the people were underage.
Even if you stick with apps like Telegram or Wire (my choice), you have to keep in mind that your phone might have a keylogger on it (looking at Xiaomi and Huawei).
Well, there's the rub, isn't it. I don't think most people would consciously decide to rat out their friends' secrets for $20 - or indeed for any price. But somehow, it's happening.