That said, this looks at now well-established startups, most of which began their lives 5+ years ago. If I were picking a back-end stack then, I'd probably have hesitated before picking the .NET Framework. But these days I'd easily pick .NET Core.
Likewise, I'd want a statically typed back end, and five years ago I'd probably have hesitated before using TypeScript and Node together. Now I do it regularly.
Would be super interesting to see the same chart in 5 years with companies starting now.
But since then, that space has gotten a lot more crowded from all sides. There is now a pretty big selection of mature technology to pick from depending on your needs.
In the case of Python, it also got a massive boost from its popularity in data science. A couple of years ago I thought Python web development was slowly dying out, but it seems to have swung back quite a bit.
That said, last I checked on performance, Rails and Django were pretty much neck and neck, with Rails very slightly faster by a negligible margin (IIRC).
I think a big reason Ruby features so heavily on this list is its focus on developer productivity. It's a nice mesh of flexibility and convention: generally all projects are set up and organized the same way, yet you can still write a conditional a dozen ways. In my personal experience with Django and Python, by contrast, the woes of getting everything configured and set up for each project are frankly a pain. Also, Python is fairly regimented, trading away a lot of that flexibility, which has its perks too.
Comparatively, I'm sure a chart with database swapped out for language would show a huge over-representation of NoSQL products like Mongo and Couchbase. Yet for the overwhelming majority of projects the right answer is just use a solid SQL product like Postgres.
Again, for SV hyper-growth startups the calculus is different. Mongo has tons of downsides and pitfalls. But the one thing it has going for it is, you can get started fast without having to think about it. Just throw shit in a giant nested key-val map and pull it out later. You don't have to design schemas or provision ahead of time. Change the JSON on the fly, and if you hit performance issues, just throw more hardware at it. Move fast and break things.
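That "throw shit in a giant nested key-val map and pull it out later" workflow really is only a few lines of code. A minimal sketch, using SQLite and the standard library as a stand-in for a document store (the table name and fields here are hypothetical, just to illustrate the pattern):

```python
import json
import sqlite3

# A single (id, doc) table stands in for a schemaless document store:
# no schema design up front, just serialize whatever nested map you have.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id TEXT PRIMARY KEY, doc TEXT)")

def put(doc_id, doc):
    conn.execute(
        "INSERT OR REPLACE INTO docs VALUES (?, ?)", (doc_id, json.dumps(doc))
    )

def get(doc_id):
    row = conn.execute("SELECT doc FROM docs WHERE id = ?", (doc_id,)).fetchone()
    return json.loads(row[0]) if row else None

# "Change the JSON on the fly": adding a field needs no migration.
put("user:1", {"name": "Ada", "tags": ["admin"]})
put("user:1", {**get("user:1"), "signup_source": "hn"})
print(get("user:1")["signup_source"])  # hn
```

The catch, as described below, is that every implicit schema decision made this way still exists — it just lives in whatever code reads the blobs, which is exactly the mess that has to be detangled later.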
For 99% of projects that kind of attitude comes back to bite you in the ass. For every hour that you save today, you'll end up spending ten hours in a year or so from now when you're eventually forced to detangle a crusty sloppy mess.
But for a unicorn-aspiring SV startup, that tradeoff works. Almost any startup would gladly spend a hundred man-hours in a year or two, to get back one man hour today. And that works, because the growth rates are astronomical. In a few years, you'll hopefully be a billion-dollar unicorn with tons of resources to throw at the minefield of quick and dirty technical decisions you made at the seed stage. (More realistically, you simply won't be around anymore, at which point the pitfalls in waiting also don't matter.)
The point is don't necessarily pay too much attention to the decisions made by YC startups. This is true even for regular startups with more prosaic aspirations. (Some of us are more than happy to own $10 million companies, and aren't aiming for billionaire or bust.) If your business model is rooted in 1000% hyper-growth rates, that encourages many tradeoffs that are otherwise deeply pathological.
We have used Postgres from day 1 in March and are now 1200 people. Managing growth with a JSON blob sounds absolutely insane.
You need _more_ guarantees, not fewer.
Ignorance of a technology is a poor excuse to crap on it. I've used many different technologies over my 15-year career: SQL Server, MySQL, Postgres, MongoDB, Firebase, etc. Each worked well enough; each had its drawbacks. I'm not here to tell you MongoDB is a silver bullet for every use case, but from personal experience I haven't run into an instance where it was a hindrance.
Disclaimer: I currently work for MongoDB.
And this is a financial company.
There are cases where it's better to use a JSON blob inside SQL. Consider the common scenario where the pure-SQL solution would be one abstract model with 50 children that inherit from it, each differing by only one or two fields.
Having one SQL model that gets queried in one way and in one place, but has 50 different JSON validators, is a lot easier to understand and work with than having 50 different SQL models that get queried in 50 different ways and places.
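A minimal sketch of that shape, with SQLite standing in for the SQL database and plain Python callables standing in for the per-kind JSON validators (the event kinds and field names are hypothetical):

```python
import json
import sqlite3

# One "events" table serves all kinds; per-kind validators replace
# 50 near-identical child tables.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE events (id INTEGER PRIMARY KEY, kind TEXT, payload TEXT)"
)

# Each validator checks only the one or two fields its kind adds.
VALIDATORS = {
    "email_sent":  lambda p: isinstance(p.get("recipient"), str),
    "page_viewed": lambda p: isinstance(p.get("url"), str),
    # ... one entry per kind, instead of one table per kind
}

def insert_event(kind, payload):
    if not VALIDATORS[kind](payload):
        raise ValueError(f"invalid payload for {kind}: {payload!r}")
    conn.execute(
        "INSERT INTO events (kind, payload) VALUES (?, ?)",
        (kind, json.dumps(payload)),
    )

insert_event("email_sent", {"recipient": "a@example.com"})
# One query path serves every kind:
rows = conn.execute("SELECT kind, payload FROM events").fetchall()
```

The trade-off is that the validators only run in application code; unlike 50 real tables, the database itself no longer enforces the per-kind fields.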
I try Rust, Ruby, and others on the side. But Python is just so heavily fortified now that I constantly suggest that startups building non-trivial tech select Python. The whole Linux, deployment, etc. space is also filled with Python.
Biology, Physics, Math folks also use more Python than any other language.
https://docs.microsoft.com/en-us/aspnet/core/host-and-deploy...
It's also faster than Node. Significantly faster. I was troubleshooting a connection issue to a db from a .NET Core API so I set up a NodeJS API hitting the same db to isolate it to the connection. After I was done fixing things, I compared the API response speeds for the same thing. .NET Core outperformed Node dramatically just like in benchmarks.
or Scala if you need bigger scale?
After that there's a fairly healthy dose of C and Fortran from the performance-oriented crowd, quite a few people use Python (but it's hard to express just how completely and totally alien their uses are from one another), and the closer you shift to biology or stats the more likely you are to see somebody using R. A few enthusiasts are extremely excited about Julia.
The result of that is that if I look up basic concepts in web dev today, I'll still find a lot of tutorials and courses aimed at total beginners for Ruby. Meaning that even today, Ruby would be a good choice for CS students in a program with no web dev course if they wanted to get a small website running.
Now Python is gaining adoption as an introductory language (MIT ditched Scheme for it!). My bet is that a lot of the time Python was chosen simply because everyone on the team knew it.
If I knew the scope of the project right when running git init I wouldn't pick these two, but my gut feeling is that a lot of these started as "let's get a demo working, we won't have more than 10 users ever anyways unless we get funding!"
So, relieving that burden gave me a playground to pseudocode a program and then translate that logic into a more optimized implementation.
I still use that strategy in forming mental models.
But now it's a matter of placing weak joints on a pipeline and accepting bottlenecks where they arise.
Can I MVP in 2-6 weeks for those 10 (the startup's) alpha users? Fine. Can I leverage my engineering strengths and understand replacement priorities? That's my goal, and I run very far from startups that want to build in a panic with toothpicks and glue.
We have been ripping out Node because basic backend frameworks are too small and mercurial, so breaking API changes every year on simple things like making an HTTP request has meant a lot of time not spent on our users. The ROI is often ultimately minuscule perf and, ironically, CVE fixes from such a cowboy culture. (We are also ripping out Node because of its current inability to work well with the data ecosystem, plus the shitshow that is build times / maintaining them.)
Frontend has been getting better. Part of that is we have largely stopped adding any dependencies and been chiseling down to react + a few components, and between them + the webpack team, focusing on the few teams we trust to respect end user code more than they value whatever feature idea that month.
I spent years on making JS better and we have built stuff to make it sing, so bittersweet business decision driven more by the unmaintainable code culture than the language (npm/node are growing up, V8 is amazing, React achieved a lot of what early framework people marched for, ... ).
EDIT 1: Part of the technical problem may be around npm <> semver and, in turn, the dev culture around it. Dutifully bumping versions upon breaking changes may have given a false sense of being friendly for maintenance. Major upgrades are still necessary in practice due to peer-dependency requirements triggered by CVEs etc. It's ~awesome to now see breaking changes flagged in better modules. However, having to find them by searching GitHub stinks, and it's horrifying how many there are. This is still a far cry from not consistently breaking everywhere, gofmt, etc.
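For concreteness, this is roughly the promise a caret range like `^1.2.3` makes under npm's semver convention — a simplified sketch assuming plain `major.minor.patch` versions with no pre-release tags (real range parsing has more cases):

```python
def parse(v):
    """Parse 'major.minor.patch' into a comparable tuple of ints."""
    return tuple(int(x) for x in v.split("."))

def caret_satisfies(spec, version):
    """Does `version` fall inside a caret range like '^1.2.3'?

    Caret means "no breaking change by convention": same major version,
    except that for 0.x releases the minor (or even patch) is treated
    as the breaking boundary.
    """
    base, ver = parse(spec.lstrip("^")), parse(version)
    if ver < base:
        return False
    if base[0] > 0:
        return ver[0] == base[0]      # ^1.2.3 -> >=1.2.3 <2.0.0
    if base[1] > 0:
        return ver[:2] == base[:2]    # ^0.2.3 -> >=0.2.3 <0.3.0
    return ver == base                # ^0.0.3 -> exactly 0.0.3

print(caret_satisfies("^1.2.3", "1.9.0"))  # True: "safe" by convention
print(caret_satisfies("^1.2.3", "2.0.0"))  # False: declared breaking
```

The rub is that this boundary is only a social contract: a patch bump can still break you in practice, and a CVE in a transitive dependency can force you across the major boundary regardless of what your ranges say.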
EDIT 2: I don't blame OSS devs for hobby projects unrelated to their work and with no claims of aiming for prod ready / corp use. But as soon as we get to core node infra teams and their VC sponsors causing a bait-and-switch on infra focus, or tech leads at unicorns & bigco's doing their here today / gone tomorrow thing, etc., I get heart pain around adding any dependencies around this broken tech<>social ecosystem.
And for ML and AI-focussed startups, Python is a no-brainer.
More startups on the list use Ruby than any other language, and it accounts for over fifty percent of the valuation. It was very popular with startups ten, twelve years ago, but may since have slipped?
On the other hand, among younger companies one might expect to see more Go, Node, and maybe Rust?
Still, an interesting analysis.
Ruby is still popular with startups, and there's no falloff in Rails usage, if the gem download graph is anything to go by.
I’m guessing maybe not having a single central repository like Gems will prevent this data from being so easily accessible.
> Note: Ruby and ruby on rails was a popular choice for YC startups around 2010 - 2012. Anecdotally, ~40% of YC startups used ruby during its peak popularity.
Reading this list, I'm struck by just how mainstream the languages are. I don't have anything against Python or Ruby, but it'd be hard to describe either as a secret weapon — indeed, about the only "secret weapon" languages on that list are Lisp and Elixir, each of which shows up only once.
Nowadays, I'd argue that the secret weapon is never the language itself. Ruby's popularity was never about Ruby itself; it was about Rails. Python is, IMO, an ugly hack of a language, but it's still the one I'm always pushing for at work, because its unbelievably lush open source ecosystem means that choosing Python means you'll end up having to do a lot fewer things yourself. Java's appeal was originally about cross-platform deployment, but, now that it's been a decade or so since anyone actually worried about interpreted languages, I'd guess it's now more the fact that Java developers are cheap and easy to hire.
I'm also honestly not super impressed by Graham's comments on programming languages. Sure, Viaweb sold for a lot of money. But then it turned out that it was an unmaintainable mess that needed to be rewritten. Graham has done a good job of pitching the idea that this is because people can't understand the obvious genius of Lisp, and I'll admit, as a Lisper, that that story once beguiled me. But, now that I've been around the block a few times, I realize that code that can only be maintained by its own author is never good code, that sale price is rarely a good proxy for quality (especially when the purchaser is Yahoo!), and that sheer dumb luck plays a much larger part in entrepreneurs' success than any business essayist cares to admit, least of all the ones writing autobiographical essays.
I think that's because the code after the sale had to meet different requirements than the code before the sale.
Before the sale, the chief requirement, at least from what I gather from pg's essays, was fast implementation of new features. That was the secret weapon, and Lisp was a key enabler for it. The number of people touching the code was very small, so having the code be understandable by others was not a high priority.
After the sale, the chief requirement, I suspect, was maintainability while running at scale. The before-the-sale period had already sufficiently explored the feature space that fast implementation of new features was no longer a requirement, so the advantages of Lisp were no longer crucial. But having code that lots of people could understand and modify reliably was crucial.
So the code had to be rewritten after the sale. That doesn't mean the code before the sale was bad, just that it was tailored to different requirements.
I think the ultimate test of code is whether it generates money. That is the sole reason the world cares so much about code.
It’s an unpopular definition, and I don’t like it, but it seems correct.
I'd say the ecosystem has moved considerably in the direction he advocated. Even Java has moved considerably from 2001 Java.
Looks like HN brought it down.
Run something like Ruby/Python for the majority of your web services, and deploy Elixir/Go/Kotlin/C++/Swift whenever you hit a data-munching need that Ruby/Python can't handle.
This doesn't fly if you're doing games, or augmented reality (AR), or mobile apps, or embedded stuff... but the web is still a pretty big place to build.
The premature optimization adage has been touted around since forever but it can be widely applied.
He's on HN too.
[0]: https://en.m.wikipedia.org/wiki/David_Heinemeier_Hansson
It could just be a statistical fluke, given this sample size and all the selection effects at play, but I still have to wonder.
Moreover, you need to find developers who want to work on the .NET platform. Most of those people work in banking or in large organizations, and are unlikely to want to jump ship to a startup.
No more lock-in with Windows Server, no more SQL Server. Bring your favourite Linux distro and open-source RDBMS.
I left .NET for Ruby in 2013, and in the last two years, I've come sprinting back to .NET Core. Microsoft is definitely hot again.
Ruby, and to a large extent Python, started from zero, and gained a ton of traction during the whole Web 2.0 rumble.
Microsoft is starting from a negative perception amongst a lot of engineers, and unless they can provide some sort of killer reason to use .NET technologies -- one that doesn't lock you into their ecosystem -- then I don't see them unseating the established players.
Swift and its descendants will be around as long as Apple is, same story for Java and Android. Ruby/Python/Node will continue on in the Web space, with Go/Rust/C++ duking it out for backend services.
This game changes should Microsoft blow a new market open, though, like Apple and Google did with smartphones.
I did end up on a project a little while ago where they picked .NET and Azure as their technologies of choice. Not sure what factors were in play there.
[0] https://nickcraver.com/blog/2016/02/17/stack-overflow-the-ar...
That's perhaps down to experience having a hardening effect. You've made the mistakes, you don't want to make them again, so you create solutions that are more robust and by definition "better code" and "better designed". Unfortunately, this can cause you to take longer to get to market, and if you do have to pivot, it can be harder to pivot off of a larger, purpose-built codebase.
It worked perfectly fine until it was moved into a proper database, with in-house-built portals to handle the tasks.
HN was more a response to PG wanting more moderation control over the content.
Engineering is about building and maintaining things fit for their purpose.
A bridge that only needs to last for one year is designed very differently than a bridge intended to be used for twenty years.
In the case of software, that means (a) building what your users and business need; (b) designing software that is easy to change; and (c) delivering enough scale to meet your expected needs for the current business cycle.
For a new startup, you can probably get by with a Rails app on a single Digital Ocean droplet, with plans to scale horizontally. For an established business, you might have a large data warehousing operation running custom C++ code across multiple datacenters, of which the web frontend is a tiny (but important) part.
Good software engineering makes the steps between "New Startup" and "Established Business" as painless as possible, with a minimum of additional overhead.
Successful startups are analogous to surprising research results in academia, in the sense that we have much more information on them and therefore we can draw better conclusions about these entrepreneurial/academic successes.
Today, every good scientist is aware that having plentiful information on unsurprising research is almost equally important, for many obvious and non-obvious reasons. This has resulted in journals created exclusively for publishing unsurprising research. [1]
If the most rigorous among us, with respect to taking care of our collective body of knowledge, are only starting to do it seriously, we can hope that the industry might ramp up its capacities in this regard at some point. I wouldn't hold my breath that it's going to happen soon, though.
Did using Node.js or TypeScript correlate with business failure, or is it just more recent?
I'm also particularly interested in whether trying to use Kubernetes correlates with failure. Kubernetes, for all its good, is a rat's nest of complexity.
So it's more of a background/instincts thing than a language thing. C# technically has nothing that makes it a bad choice for startups.
No. It may even be an advantage for some verticals such as finance, and some business/deployment models like on-prem enterprise software, especially if being able to fit seamlessly into a customer's existing Microsoft setup would be a major selling point.
That said, if you find yourself contemplating how to scale your Active Directory and SQL Server deployment[0] to accommodate your growing user base of personalized emoji GIF makers, consider that your prior experience with your chosen platform may have led you astray somewhere.
[0] Apologies if the metaphor lacks punch, my experience with the MS stack is a decade or two out of date at this point.
Edit: oops, I see now it is the initial language. Indeed PagerDuty started using Elixir only later on. Although one could argue Elixir was part of Podium in its initial growth.
It was by no means a perfect science - was a lot of digging around on the internet for the oldest reference I could find!
Then they moved forward with Ruby for the whole platform, maybe because that's what he understood and what the alternative was back in the day.
However, for any of these businesses, my language of choice if starting today would be Rust.
I really feel you get a much more maintainable codebase than with a Ruby project.
As I'm sure you're aware, this is entirely subjective. I really feel the opposite.
There are plenty of people who have spent time learning Rust but haven't had a chance to use it on a production project. I would start looking there.
It has not been proven that large, long-lived Rust projects have fewer bugs than large, long-lived Python/Ruby projects.
That's not how startups work.
[0] Dropbox is listed as Python only, but it is well known (and documented) that they have re-written key elements of their back end in Rust. https://dropbox.tech/infrastructure/rewriting-the-heart-of-o...
They will (for instance) avoid certain tech stacks, regardless of how useful the software being written is (eg “I wouldn't work for Wikipedia, they use PHP”).
I suspect any tech stack that they refuse to work with is at a systematic advantage when it comes to producing useful work, because engineers who don’t want to think about fiscal outcomes can readily lead you to ruin.
The CTO imposes pointless arbitrary rules, tools and languages and then talented developers figure out a way to deliver value within those arbitrary constraints. Developers will use whatever language their boss tells them, then later when the project succeeds, they will praise that language, those tools and their bosses.
People will always praise the leaders of a successful company, no matter how incompetent they are.
People who succeed always think that it was because of good decision making across the board. They don't admit to themselves that the only decisions that actually matter are who the CEO is and who their friends are. Our system is crony-capitalism, no doubt about it. Nothing to do with value creation; the evidence is everywhere.
Any good developer who analyzes cryptocurrency projects, for example, will realize that there is no correlation between quality (or scalability) of the technology and ranking/market cap of the project. The top, most valuable project is Bitcoin and consumes the same amount of electricity as the nation of Ireland to process a measly 4 transactions per second. Anyone who thinks that Bitcoin is the most valuable project due to technical merit is an idiot. It's 100% network effects.
I can say from experience across different tech industries that the current market selection process works the same way across the entire tech sector, not just cryptocurrency.
Every time a project succeeds, the people who built it will try to claim credit for that success. The people who are actually responsible for that success (through their personal connections) will happily let the technical people claim the credit because it diverts attention away from the much more cynical and unjust reality.
Any attempt to insinuate that success has anything to do with choice of tech is either misguided or deeply corrupt as the dogma harms real people (who will be forced to use tools they don't like) and causes real loss in productivity.
Facebook started out using PHP and it was so bad they had to almost rewrite the language from scratch. PHP is now pretty decent and Facebook is doing pretty OK.