Not to perpetuate any stereotypes, but Fleshlight, male-enhancement products, and so forth would be happy to advertise there.
By the way, to some of the posters commenting on 4chan degrading morality: it's an anon forum, and it's just a reflection of our society! The Guardian summarized 4chan as "lunatic, juvenile... brilliant, ridiculous and alarming". Perhaps the best description of 4chan.
And remember, 4chan gave us LOLcats, memes, rage faces, etc.
P.S. Why did the mods kill the other thread?
4chan gave us a voice. We are now legion.
Shared culture, sure - but it showed us there was an "us" to share that culture among. Its real effects are much, much larger.
There's no accounting for an inconsiderate user base. This has always been a big problem for 4chan. I remember several instances where the site was effectively taken offline for days by a JavaScript virus embedded in PNG images, one that spread by getting users to follow the directions in the image.
Getting the extension authors on board is probably the best option, because there are only a few of them, and they are not stupid. Stupid users don't have the skills to write broken extensions for themselves.
The only alternative would be (as others have said) to add auto-refresh JS to every page and thus bypass the need for extensions entirely. If the number of people who currently use these extensions is small enough, however, this could just as easily drive the site's traffic up, because now every user gets the same functionality without installing anything.
Wait, so he only has a single image server?
That Google Analytics "guess" was abysmal. Client-side logging is fine as a complement to server-side logging.
I'm not a scaling wizard, but I'd guess that 99 times out of 100, the reason CRUD apps have problems scaling is that they are over-engineered, and that there is a tendency to solve scaling issues by adding another layer of complexity instead of optimizing the root application.
Besides, how does caching solve the "My bandwidth bills are killing my wallet!" issue?
I'd argue that all of the text could be served from one nice box if you wanted to (multiple boxes make it more complicated, but not that much more). Send each new post to the box, add it to a table/indexes in memory, and write the post to disk and backups for recovery purposes. Then either update and re-cache every page the new post affects, or mark those pages dirty and update and re-cache them the next time they are requested.
Done: all pages served out of memory, super fast. What am I missing?
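To make that concrete, here's a toy sketch of the mark-dirty-and-re-render-lazily scheme (all the names here, like `Board` and `render_page`, are invented for illustration; a real board would also dirty index pages and persist posts to disk):

```python
# Toy in-memory board: keep rendered pages in a dict, mark affected
# pages dirty when a post arrives, and rebuild a dirty page only
# the next time someone requests it.

class Board:
    def __init__(self):
        self.posts = []        # canonical post list (would also be written to disk)
        self.page_cache = {}   # page number -> rendered page text
        self.dirty = set()     # pages whose cached copy is stale
        self.per_page = 10

    def add_post(self, text):
        self.posts.append(text)
        # Only the page containing the new post is marked dirty here.
        page = (len(self.posts) - 1) // self.per_page
        self.dirty.add(page)

    def render_page(self, page):
        start = page * self.per_page
        return "\n".join(self.posts[start:start + self.per_page])

    def get_page(self, page):
        # Lazy re-render: rebuild only if dirty or never cached;
        # otherwise it's a plain dict lookup.
        if page in self.dirty or page not in self.page_cache:
            self.page_cache[page] = self.render_page(page)
            self.dirty.discard(page)
        return self.page_cache[page]
```

With everything resident in memory, serving a page is a dict lookup; disk writes are only for durability.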
As for the bandwidth bills, most browsers observe cache settings and won't re-download what they have already downloaded. His complaint is about getting hit too hard serving the HTML/text, not the images.
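That's the conditional-GET mechanism: the server tags each response with an ETag, and a repeat visitor sends it back in If-None-Match, getting a 304 with an empty body if the page hasn't changed. A minimal sketch of that logic (the `respond` helper is invented, not any particular framework's API):

```python
import hashlib

def etag_for(body):
    # Strong ETag derived from content; any change to the page
    # changes the tag and invalidates clients' cached copies.
    return '"%s"' % hashlib.sha1(body.encode("utf-8")).hexdigest()

def respond(body, if_none_match=None):
    # Returns (status, headers, payload). On a tag match we send
    # 304 Not Modified with an empty payload, saving the bandwidth
    # of re-sending the HTML.
    tag = etag_for(body)
    headers = {"ETag": tag, "Cache-Control": "public, max-age=60"}
    if if_none_match == tag:
        return 304, headers, ""
    return 200, headers, body
```

The image servers get the same benefit for free, since image bytes never change once posted and can be cached far more aggressively.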