The risk Facebook runs is in having lost that cachet.
Facebook transitioned from "the cool kids" to "all the kids". They've been playing a highly defensive game, through pricey acquisitions, of seeing that they remain "all the kids", though with significant leakage forming.
No other party has decided to enter the fray with a decentralised social networking model (Google could have, didn't, and at this point I'm suspecting won't, more on that: https://plus.google.com/104092656004159577193/posts/3L3Z5GhJ...).
One thought that's been bubbling up in my head is that we still don't fully understand networks, values, and costs. First it was Sarnoff (V = n), then Metcalfe (V = n^2), then Tilly-Odlyzko (V = n log(n)).
I argue that V = n log(n) - kn
That is: the value each additional user adds grows only slowly (as log(n)), while each user imposes a roughly constant cost k. Where log(n) < k, adding additional users hurts you.
You also risk a smaller, higher-value network beating you out.
There are a few implications, one of which is that network size is highly dependent on 'k'. This strikes me as a very general characteristic, common for pretty much any network-type structure: social networks, phone systems (crank and sales calls), cities (crime, disease, congestion), computer chips (heat, noise, defects), and more.
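To make the dependence on 'k' concrete, here's a quick sketch of the toy model above (natural log, and the particular values of k, are my assumptions, not anything rigorous):

```python
import math

def net_value(n: float, k: float) -> float:
    """Toy model: V(n) = n*log(n) - k*n, with a constant per-user cost k."""
    return n * math.log(n) - k * n

def breakeven(k: float) -> float:
    """Total value crosses zero where log(n) = k, i.e. at n = e^k."""
    return math.exp(k)

# A small change in the per-user cost k moves the break-even network
# size exponentially -- which is one way to read "network size is
# highly dependent on k".
for k in (2.0, 4.0, 6.0):
    print(f"k={k}: net value positive above n ~ {breakeven(k):.0f}")
```

Below the break-even point the network is net-negative, which is where a smaller network with a lower k (less noise, less crime, less spam) can out-compete a bigger one.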
There are also some psychological and perceptual limitations at play. I've been trying to find what I can on human limits of information consumption. Several sources (Stephen Wolfram, Walt Mossberg) suggest ~100-300 emails/day is a near upper limit; the NY Times comments moderation desk averages slightly fewer than 800 comments/day per person. That's about one every 36 seconds over an eight-hour day, nonstop.
I know my own bandwidth for articles, comments, posts, and books is limited, as well as my capacity for handling longer forms once I've started consuming shorter-form content.
I'm tumbling this around to see what I can get out of it. Not sure yet.