Probably not true for "this saved costs": from what I've seen, virtual machines usually cost more than twice as much per month as renting the equivalent physical machine.
They could have used dedicated servers instead; there are more dedicated-server providers than VM providers, so they could have achieved the same goal less expensively.
Probably not true for "better uptime" either; VMs are still hosted on real hardware, which fails too. (Although distributing the work across more independent machines can improve uptime.)
1) Hardware-seizure expenses versus law enforcement merely duplicating the hard drive of a virtual machine.
2) TPB needs to locate itself in disparate jurisdictions to take advantage of different legal situations. With physical hardware, that would involve a ton of shipping costs, probably more lost hardware, and paying for remote hands.
3) They had been paying a premium for 'bulletproof' hosting.
There are dedicated server providers in every part of the world, including MPAA-proof countries.
Is there a way the codebase could be entirely encrypted and not even accessible to the cloud provider (with some 'boot password' needed each time the server starts up)?
I don't know how accurate this is, though.
Encryption (with the decryption key fetched at boot from, say, a particular .onion address) would protect against someone copying the backups, but it won't protect against an adversary with admin access to the host while the virtual server is running.
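The fetch-the-key-at-boot idea can be sketched roughly like this. This is a toy stand-in, not a real deployment: the key server here is a local HTTP server I made up for illustration, where a real setup would point the request through Tor at a .onion address and pipe the passphrase into something like `cryptsetup open` to unlock the encrypted volume.

```python
# Sketch of "decryption key fetched at boot from a remote address".
# The local HTTP server below is a hypothetical stand-in for the
# operator-controlled key server; in practice the request would go
# over Tor to a .onion and the passphrase would feed cryptsetup.
import http.server
import threading
import urllib.request

PASSPHRASE = b"correct horse battery staple"  # hypothetical key material

class KeyHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # A real key server would authenticate the requester first,
        # and could refuse to serve the key after a seizure.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(PASSPHRASE)

    def log_message(self, *args):
        pass  # keep output quiet

def fetch_boot_key(url: str) -> bytes:
    """Fetch the disk passphrase from the (assumed) key server."""
    with urllib.request.urlopen(url) as resp:
        return resp.read()

server = http.server.HTTPServer(("127.0.0.1", 0), KeyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
key = fetch_boot_key(f"http://127.0.0.1:{server.server_port}/key")
print(key == PASSPHRASE)  # True: the VM gets its passphrase without storing it on disk
server.shutdown()
```

The point of the design is that the passphrase never rests on the VM's disk, so a copied backup is useless on its own; it does nothing, as noted above, against someone inspecting the running machine's memory.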
I'd love to hear a little more about the architecture.
[1] https://www.ipredator.se/ [2] http://torrentfreak.com/pirate-bay-announces-ipredator-globa...
I don't know, that's why I'm asking though.
That level of hardware/cores seems a bit over the top given what TPB does.
When I was a boy we had this thing called 'Alta Vista'. It was the search engine before Bing! came along. Processors did not run at gigahertz speeds back then and a large disk was 2 GB. Nonetheless most offices had the internet, and when people went searching, 'Alta Vista' was the first port of call for many.
TPB has an index of a selective part of the internets, i.e. movies, software, music, that sort of thing. Meanwhile, back in the 1990s, AltaVista indexed everything, as in the entire known internets, with everything stored away in less than the 620 GB used by TPB for their collection of 'stolen' material.
From http://en.wikipedia.org/wiki/AltaVista
Alta Vista is a very large project, requiring the cooperation of at least 5 servers, configured for searching huge indices and handling a huge Internet traffic load. The initial hardware configuration for Alta Vista is as follows:
Alta Vista -- AlphaStation 250 4/266; 4 GB disk; 196 MB memory. Primary web server for gotcha.com; queries directed to WebIndexer or NewsIndexer.
NewsServer -- AlphaStation 400 4/233; 24 GB of RAID disks; 160 MB memory. News spool from which the news index is generated; serves articles (via HTTP) to those without a news server.
NewsIndexer -- AlphaStation 250 4/266; 13 GB disk; 196 MB memory. Builds the news index using articles from NewsServer; answers news-index queries from Alta Vista.
Spider -- DEC 3000 Model 900 (replacement for Model 500); 30 GB of RAID disk; 1 GB memory. Collects pages from the web for WebIndexer.
WebIndexer -- Alpha Server 8400 5/300; 210 GB RAID disk (expandable); 4 GB memory (expandable); 4 processors (expandable). Builds the web index using pages sent by Spider; answers web-index queries from Alta Vista.
They also didn't get as much traffic as TPB, since there weren't that many people connected back then.
I would also imagine that they didn't have to HIDE their services either.
IIRC there were (quite) a few before Bing. More to the point, Google was the pinnacle of web search long before Bing came into existence.
Alta Vista started out with a modest-sized index of 20 million pages. Let's imagine those pages were all of 1 KB in size; then 20 × 10^6 × 10^3 comes to 20 × 10^9 bytes, or 20 GB. So, in terms of stuff indexed, that is considerably larger than TPB. Agreed?
Well, maybe not. They could have used compression to get the vastness of TPB onto that USB stick. Around that time - 2012 - they had 1.6 million torrents. That is still some way off the gigabytes that AltaVista indexed, no matter how you bloat the maths. Sad to say, but in the 1990s the internet was actually larger than your porn collection.
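To make the comparison concrete, here's the back-of-envelope arithmetic. The 20 million pages and 1 KB per page come from the comment above; the ~100 bytes per torrent entry is purely my assumption for a compact record (infohash plus title), so the TPB figure is illustrative only.

```python
# Back-of-envelope index-size comparison from the thread's numbers.
altavista_pages = 20_000_000
bytes_per_page = 1_000                  # assume 1 KB per indexed page (from the thread)
altavista_index = altavista_pages * bytes_per_page

tpb_torrents = 1_600_000                # circa 2012, per the thread
bytes_per_torrent = 100                 # my assumption: compact entry (hash + title)
tpb_index = tpb_torrents * bytes_per_torrent

print(altavista_index)                  # 20000000000 -> 20 GB
print(tpb_index)                        # 160000000   -> ~160 MB, small enough for a USB stick
print(altavista_index // tpb_index)     # 125 -> AltaVista's index ~125x larger
```

Under those assumptions AltaVista's early index dwarfs a bare TPB torrent listing by two orders of magnitude, which is the point being argued.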
How useful is reqs/second anyway? By that score Google probably does very badly as a search usually returns the answer on the first page. With old-style search engines you might need to go through scores of pages before getting what you want. I found TPB to be a bit like that too, wading through results pages more than necessary.
TPB is not 'safe for work' and in a lot of jurisdictions you cannot even access it from home. In the UK (which is a small but well populated country) it is not that easy to get onto TPB - you have to have hacker voodoo skills to do that or route through a VPN as none of the main ISPs will let you on. Most of the civilised world has the same need to protect citizens from the evils of TPB so places where it can be accessed are not that common. Even if you could access it, would you? Probably...
Meanwhile, back in 1998 - a year or two before the dotcom crash - plenty of people were using search engines such as AltaVista (which was the best back then) for actual work. Maybe not everyone, but enough people knew about computers and things like AOL disks, modems and what not. The internet was big.
Which reminds me of my main point, the one you thought so important to downvote rather than give kudos for being insightful. TPB uses a constellation of computers and consumes vastly more resources than the biggest search engine of the 1990s, yet the utility of TPB is limited to the few fortunate enough to live somewhere it can be accessed. What can be searched for on TPB is a mere subset of what was on AltaVista, albeit different and not-so-useful stuff. I would say that AltaVista was doing far more with what it had, reaching a better audience, doing something more useful for the world (than serving weight-loss adverts) and altogether performing a miracle. TPB is a slouch in comparison.