It was pretty clear, even 20 years ago, that OOP had major problems in terms of what Casey Muratori now calls "hierarchical encapsulation" of problems.
One thing that really jumped out at me was his quote [0]:
> I think when you're designing new things, you should focus on the hardest stuff. ... we can always then take that and scale it down ... but it's almost impossible to take something that solves simple problems and scale it up into something that solves hard [problems]
I understand the context but this, in general, is abysmally bad advice. I'm not sure about language design or system architecture but this is almost universally not true for any mathematical or algorithmic pursuit.
I don't agree. While starting with the simplest case and expanding out is a valid problem-solving technique, it is also often the case in mathematics that we approach a problem by solving a more general problem and getting our solution as a special case. It's a bit paradoxical, but a problem that would be completely intractable if attacked directly can be trivial if approached with a sufficiently powerful abstraction. And our problem-solving abilities grow with our toolbox of ever more powerful and general abstractions.
Also, it's a general principle in engineering that the initial design decisions, the assumptions underlying everything else, are themselves the least expensive part of the process but have an outsized influence on the entire rest of the project. The civil engineer who halfway through the construction of his bridge discovers there is a flaw in his design is having a very bad day (and likely year). With software things are more flexible, so we can build our solution incrementally from a simpler case and swap bits out as our understanding of the problem changes; but even there, if we discover there is something wrong with our fundamental architectural decisions, with how we model the problem domain, we can't fix it just by rewriting some modules. That's something that can only be fixed by a complete rewrite, possibly even in a different language.
So while I don't agree with your absolute statement in general, I think it is especially wrong given the context of language design and system architecture. Those are precisely the kind of areas where it's really important that you consider all the possible things you might want to do, and make sure you're not making some false assumption that will massively screw you over at some later date.
This is a really good point. LLL and "Feynman's" integral trick come to mind. There are many others.
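To spell out the Feynman example for anyone unfamiliar: "differentiation under the integral sign" solves a specific integral by embedding it in a more general, parameterized family, exactly the "solve the general problem, read off the special case" move described above. The classic instance is the Dirichlet integral:

```latex
% Generalize the target integral with a parameter t >= 0:
I(t) = \int_0^\infty e^{-tx} \, \frac{\sin x}{x} \, dx,
\qquad
I'(t) = -\int_0^\infty e^{-tx} \sin x \, dx = -\frac{1}{1+t^2}.
% Since I(t) -> 0 as t -> infinity, integrating I'(t) back gives:
I(t) = \frac{\pi}{2} - \arctan t,
\qquad\text{so}\qquad
I(0) = \int_0^\infty \frac{\sin x}{x} \, dx = \frac{\pi}{2}.
```

The original integral $I(0)$ is hard to attack directly, but the more general $I(t)$ has a derivative that is elementary.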
I got it in my head that this doesn't apply to NP-complete problems, so it should be discounted. When trying to "solve" NP-complete problems, the usual tactic is to restrict the problem domain to something tractable and then try to branch out into other regions of applicability.
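As a concrete instance of that tactic: maximum independent set is NP-hard on general graphs, but restricting the domain to trees makes it solvable in linear time with a textbook dynamic program. A sketch (the function name and interface here are mine, for illustration):

```python
from collections import defaultdict

# Maximum independent set is NP-hard in general, but on trees a simple
# DP over subtrees solves it in O(n).
def max_independent_set_size(n, edges):
    """Size of a maximum independent set in a tree with nodes 0..n-1."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    # take[v] = best size in v's subtree if v is in the set;
    # skip[v] = best size if v is not.
    take = [1] * n
    skip = [0] * n
    # iterative post-order DFS to avoid recursion limits on deep trees
    stack = [(0, -1, False)]
    while stack:
        v, parent, processed = stack.pop()
        if processed:
            for c in adj[v]:
                if c != parent:
                    take[v] += skip[c]
                    skip[v] += max(take[c], skip[c])
        else:
            stack.append((v, parent, True))
            for c in adj[v]:
                if c != parent:
                    stack.append((c, v, False))
    return max(take[0], skip[0])

# Path 0-1-2-3: the maximum independent set is {0, 2} or {1, 3}.
print(max_independent_set_size(4, [(0, 1), (1, 2), (2, 3)]))  # → 2
```

The "branch out" step is then relaxing the restriction, e.g. to bounded-treewidth graphs, where essentially the same DP still works.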
> Those are precisely the kind of areas where it's really important that you consider all the possible things you might want to do, and make sure you're not making some false assumption that will massively screw you over at some later date.
I will say that abstraction is its own type of optimization and generalization like this shouldn't be done without some understanding of the problem domain. My guess is that we're in agreement about this point and the talk essentially makes this argument explicitly.
I would say, if you have to design a good consensus algorithm, PBFT is a much better starting point, and can indeed be scaled down. If you have to run something tomorrow, the majority-vote code probably runs as-is, but doesn't help you with the literature at all. It's essentially the iron triangle - good vs. cheap. In the talk the speaker was clearly aiming for quality above all else.
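The contrast can be made concrete. The "runs as-is" majority-vote version really is only a few lines; a hypothetical sketch (it gives you no Byzantine fault tolerance, view changes, or any of the guarantees PBFT is built to provide):

```python
from collections import Counter

# Naive majority vote: runs tomorrow, but tolerates no faulty or
# malicious voters and says nothing about agreement across rounds.
def majority_value(votes):
    """Return the value backed by a strict majority of votes, or None."""
    if not votes:
        return None
    value, count = Counter(votes).most_common(1)[0]
    return value if count > len(votes) // 2 else None

print(majority_value(["commit", "commit", "abort"]))  # → commit
print(majority_value(["commit", "abort"]))            # → None
```

Everything PBFT adds (pre-prepare/prepare/commit phases, quorum certificates, view changes) is precisely the hard part that doesn't emerge from scaling this up.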
Unfortunately, the "history" omits prototype-based OO (Self, Io, Lua, etc.) which doesn't suffer from many of the "issues" cited by the speaker.
Having said that: why would he though? In this particular talk he's trying to argue to people who program in C++ why the historical C++ architectures are limiting it, he's not trying to convince anyone to switch languages. So those languages aren't his audience.
The context, for the record, is inventing good general software architectures (and by extension generalized programming paradigms) for everyone to use. I agree with you that this is bad advice for generally fixing things, but for this context it absolutely makes sense to me. The hard problems are more likely to cover all the walls you'd bump into if you start from the oversimplified ones, so they are much better use-cases to battle-test ideas of what good architectures or programming paradigms are.
There is an appeal to building complexity through emergence, where you design several small self-contained pieces that have rich interactions with each other, and through those rich interactions you can accomplish more complex things. It's how the universe seems to work. But I also think that the kinds of tools we have make designing things like this largely impossible. Emergence tends to produce things we don't expect, and for precise computation and engineering, it feels like we are not close to accomplishing this.
So the idea that we need a sense of 'omniscience' for designing programs on individual systems feels like it is the right way to go.
So if you look at it through that lens, the need for a little omniscience seems natural. The mistake was in thinking that the program was identified with the objects that the laws govern, when really you have to cover those AND the laws themselves.
https://nothings.org/gamedev/thief_rendering.html
https://www.gamedeveloper.com/design/postmortem-i-thief-the-...
I skipped a fair chunk of the middle of this video as I really wanted to get to the Sketchpad discussion, which I found very valuable (starting around 1:10).
I think Casey was fairly balanced, and emphasized near the end of the talk that some of the things under the OOP umbrella aren't necessarily bad, just overused. For example, actors communicating with message passing could be a great way to model distributed systems. Just not, maybe, a game or editor. Along similar lines, I love this old post "reconstructing" OOP ideas with a much simpler take similar to what Casey advocates for:
https://gamedev.net/blogs/entry/2265481-oop-is-dead-long-liv...
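For what it's worth, the actor point above can be sketched in a few lines. A toy, in-process simulation (queues and threads standing in for distributed nodes; names are illustrative):

```python
import queue
import threading

class Counter:
    """A toy actor: private state, mutated only by its own message loop."""
    def __init__(self):
        self.inbox = queue.Queue()
        self.count = 0
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def _run(self):
        while True:
            msg = self.inbox.get()  # block until a message arrives
            if msg == "stop":
                break
            if msg == "inc":
                self.count += 1

    def send(self, msg):
        self.inbox.put(msg)

c = Counter()
for _ in range(3):
    c.send("inc")
c.send("stop")
c.thread.join(timeout=2)  # FIFO queue: all "inc"s processed before "stop"
print(c.count)  # → 3
```

The state is only ever touched by the actor's own loop, which is why the model maps so naturally onto processes on separate machines, and why it's arguably overkill for a single-threaded game or editor.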
But I of course enjoyed him calling out the absolutely dire state of OOP education/tutorials. I satirized this on my own blog ages ago:
https://crabmusket.net/how-i-learned-oop/
In that post I referenced Sandi Metz as an antidote to awful OOP education. I may just have to include Casey as well.
Would you be so kind as to elaborate how/where? (Other than the "arpanet in the 90s")
But I gotta say, I find the graphical background (the blurry text around the edge of the screen that's constantly moving and changing) supremely annoying, not to mention completely unnecessary.
Dear presenters and conference producers: please, please don't do that.
That might have been important for the performance aspects that drove the resurgence in ECS, though I know in this talk he's focused more on how ECS also improves the structure for understanding and implementing complex systems. In the 70s and early 80s, memory latency probably hadn't begun diverging from instruction rate to such an extreme degree, but in disks it was always a big issue.
Also would like to hear more about Thinglab and if it had some good stuff to it.
https://archive.esug.org/HistoricalDocuments/ThingLab/ThingL...
Ten years later, he has no game, only a rudimentary, tile-based dungeon-crawler engine, and reams of code he's written and re-written (as API sands shifted beneath his feet), and the project seems to be permanently on hiatus now. Thus, Casey inadvertently proved himself wrong, and the conventional wisdom (use an existing engine) correct.
As far as OOP goes, 45 years has shown that it makes developers highly productive, and ultimately, as the saying goes, "real heroes (handmade or otherwise) ship." Casey's company was founded 20 years ago, and he's never shipped a software product.
He complains often about software getting slower, which I agree with. Yet how many mainstays of Windows 95/98 desktop software were written in a significantly OO style using C++ with MFC?
First, Casey offers refunds on the handmade website for anyone who purchased the pre-order. Second, the pre-orders were primarily purchased by people who wanted to get the in-progress source code of the project, not people who just wanted to get the finished game. I'm not aware of anyone who purchased the pre-order solely to get the finished game itself. (Though it's certainly possible that there were some people.) Whether that makes a difference is up to the reader I suppose, since the original versions of the site didn't say anything about how likely the project was to finish and did state that the pre-order was for both the source-code and the finished game.
Third, the ten-year timeline (I believe the live streams only spanned 8 years) should be taken with the note that this is live streaming for just one hour per day on weekdays, or for two hours two or three times a week later in the project. There's roughly 1000 hours of video content, not including the Q&As at the end of every video. The 1000 hours includes instructional content and whiteboard explanations in addition to the actual coding, which was done while explaining the code itself as it was written. (Also, he wrote literally everything from scratch, something which he stated multiple times probably doesn't make sense in a real project.)
Taking into account the non-coding content, and the slower rate of coding while explaining what is being written, I'd estimate somewhere between 2 and 4 months of actual (40hr/week) work was completed, which includes both a software and a hardware renderer. No idea how accurate that estimate is, but it's definitely far less than 10 years and doesn't seem very indicative that the coding style he was trying to teach is untenable for game projects. (To be clear, it might be untenable. I don't know. I just don't see how the results of the Handmade Hero project specifically are indicative either way.)
How much of that is due to the programming practices he espouses, I'm not sure. Ironically, if he went all-in on OOP with Smalltalk, I could see the super productivity that environment provides actually making it harder for him to finish anything, given how much it facilitates prototyping and wheel-reinvention. You see this with Pharo, where they rewrite the class browser (and other tools) every 2-3 years.
But his track record doesn't support the reputation he's built for himself.
> for game projects
That's the problem. Casey holds up a small problem domain, like AAA games, where OOP's overhead (even C++'s OOP) may genuinely pose a real performance problem, and suggests that it's representative of software as a whole; as if vtables are the reason Visual Studio takes minutes to load today vs. seconds 20 years ago.
[[citation needed]]
As an external industry observer, I've seen many claims, but no actual direct evidence.
the big one is immediate mode UIs, which casey popularized back in 2005. Unity's editor uses it to this day, and if you do editor scripting, you'll be using it. for in-game UI, they switched to a component-based one, which also somewhat aligns with casey's opinions. and they shipped DOTS, which aligns even more with what he's saying
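for anyone who hasn't seen the pattern, here's a toy sketch of immediate mode (purely illustrative, not Unity's actual IMGUI API): widgets are plain function calls re-issued every frame, and the only retained state is which widget is currently active, so there's no widget object tree to keep in sync with your data.

```python
class UI:
    def __init__(self):
        self.mouse_pos = (0, 0)
        self.mouse_down = False
        self.active = None  # id of the widget a press started on

    def button(self, id, label, rect):
        """Declare a button for this frame; return True if it was clicked."""
        x, y, w, h = rect
        mx, my = self.mouse_pos
        hovered = x <= mx < x + w and y <= my < y + h
        clicked = False
        if hovered and self.mouse_down and self.active is None:
            self.active = id  # press began on this widget
        elif self.active == id and not self.mouse_down:
            clicked = hovered  # release over the widget counts as a click
            self.active = None
        # a real implementation would emit draw calls for `label` here
        return clicked

ui = UI()
ui.mouse_pos, ui.mouse_down = (15, 15), True   # frame 1: mouse pressed
ui.button("ok", "OK", (10, 10, 50, 20))
ui.mouse_down = False                          # frame 2: mouse released
print(ui.button("ok", "OK", (10, 10, 50, 20)))  # → True
```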
i think his lack of shipping is mostly because he switched to teaching and has absolutely no pressure to ship, rather than his approach being bad
Meanwhile you could probably surpass Handmade Hero with any off the shelf engine with a tutorial and a few hours' work, or even a project template from an asset store. The biggest problem I have with Handmade Hero is that because Casey is putting so much effort into the coding and architecture up front, the game itself isn't interesting. It's supposed to be a "AAA game" but it's little more than a tech demo.
And that's why you use off the shelf engines - they allow you to put effort into designing the game rather than reinventing the wheel.
The vast majority of developers use these engines, so you would expect the vast majority of games to be stuff that's easy to make within those engines.
With how samey new games are, it's hard to argue that what we see comes close to the full design space of possible interesting games. That's partially developers copying games they've seen work and sell, but it's also developers making what is reasonably easy to make within Unity or Unreal with the resources they have.
Thought he was just producing filler content on youtube but this really shows how magical it can be to put real effort into something.
https://www.youtube.com/watch?v=QjJaFG63Hlo
Also, earlier versions of Smalltalk did not have inheritance. Kay talks about this in his 1993 article on the history of the language:
https://worrydream.com/EarlyHistoryOfSmalltalk/
Dismissing all of this as insignificant quips is ludicrous.
https://youtu.be/wo84LFzx5nI?t=823
He mentions Alan Kay about a dozen times and uses quotes and dates to create a specific narrative about Smalltalk. That narrative is demonstrably false.
click Casey's links more
click Show transcript

Is there any way to extract just that text into a document I can read?