It takes experience to unlearn this bad habit and realize that “duplication is cheaper than the wrong abstraction”[1].
While this post may not provide a perfect example, I think it gestures in the general direction of this very important principle.
1. https://sandimetz.com/blog/2016/1/20/the-wrong-abstraction
On the other hand, DRY as a principle shines when it allows logical changes to a program to require physical changes to the code in only one place. E.g., in <horrible but self-contained algorithm> it's plausible that bugs might exist, and you'd really like bug fixes to apply to all implementations. The easiest way to manage that is to have only a single implementation. Likewise, to the extent that they're sometimes necessary, your magic strings should be given a name so that your compiler can catch minor typos (supposing the edit distance between various names in your program is largish).
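The magic-string point can be sketched in a few lines (toy Python, hypothetical names):

```python
# Toy example: a typo in a magic string fails silently at runtime,
# while a typo in a named constant fails loudly the moment it's used.

STATUS_ACTIVE = "active"
STATUS_SUSPENDED = "suspended"

def can_log_in(user_status: str) -> bool:
    # Misspelling STATUS_ACTIVE here raises NameError immediately;
    # misspelling the bare string "active" would just return False forever.
    return user_status == STATUS_ACTIVE

print(can_log_in("active"))     # True
print(can_log_in("suspended"))  # False
```

The same reasoning applies in languages with real compilers, where the typo is caught before the program even runs.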
It’s really silly to argue about this stuff. This claim has no evidence. For example, I feel the exact opposite, that abstracting early always makes it easier to refactor, and does not prevent ending up at better abstractions later on in any way.
I _feel_ like that’s true. And you can’t prove or disprove either side without agreeing on a cost model.
In terms of your algorithm, you would want to decouple your ingredient prep from your cooking algorithm. Otherwise, if prepping ingredients takes longer because you buy a new prep tool, your food winds up over or under-cooked. Secondly, you want to decouple your cooking algorithm from your equipment model. Otherwise, every time you upgrade your oven you need to rewrite every recipe. But this is all a digression.
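A rough sketch of those two decouplings (toy Python, all names made up): prep produces ready ingredients, the recipe only sees prepped inputs, and the oven is passed in, so upgrading equipment doesn't touch any recipe.

```python
# Hypothetical layering: prep -> recipe -> equipment.

def prep(ingredients):
    # Prep time can change (new prep tool!) without affecting cook time,
    # because cooking only starts once everything is prepped.
    return [f"prepped {i}" for i in ingredients]

def bake(oven, items, minutes):
    # The recipe talks to an oven interface, not a specific oven model.
    return oven(items, minutes)

def toy_oven(items, minutes):
    return [f"{item} baked {minutes}m" for item in items]

ready = prep(["flour", "eggs"])
print(bake(toy_oven, ready, 10))
```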
In the future, if you want to make a point about software, I would recommend using either English or real code, not a stressed analogy to a novel domain. But in general, it seems you are still learning the craft. It's really great that you are thinking about the evolution of a codebase over time, as this is a key area that people earlier in their career miss, and IMHO one of the greatest learning experiences for a programmer is maintaining a non-trivial system over an extended period as the environment and requirements change.
Oh, and check out https://web.archive.org/web/20021105191447/http://anthus.com... (1985).
Proving anything about the OP's solution would be extremely complex. Extracting new knowledge and abstracting the solution in the future would be nearly impossible without a complete rewrite.
For example: what if we need logging? Timing of the steps taken? A list of dish washing tasks generated? Parallelism in the tasks, given an extra cook? Exception control? Unit testing of the dough?
Adding an extra recipe is not the only possible new requirement you can have. Anticipating and preparing the right abstractions, that’s what good software engineering is about.
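To illustrate one of those new requirements: if the steps are plain functions, timing (or logging) can be layered on in one place rather than rewritten into every step. A sketch with hypothetical names:

```python
import functools
import time

def timed(step):
    # Wraps any step function to record how long it took,
    # meeting a new cross-cutting requirement without touching the step.
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = step(*args, **kwargs)
        wrapper.elapsed = time.perf_counter() - start
        return result
    return wrapper

@timed
def mix_dough(flour_g, water_g):
    return flour_g + water_g  # stand-in for real work

print(mix_dough(500, 300))  # 800, with mix_dough.elapsed recorded as a side effect
```

Whether this counts as "the right abstraction" or premature machinery is, of course, exactly what the thread is arguing about.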
It is, unfortunately, very easy to write difficult to read code, especially with good intentions and principles.
In practice, I've found that the most important principle is Locality: avoiding nested indirections and unnecessary abstractions.
I completely agree with the author of the article here: the simple and dumb recipe with local constant values is both easy to read and easy to maintain.
It might seem like duplication, but the complexity has to live somewhere, and it is more manageable when it is not scattered.
The issue I often see is that when abstractions are created, the thought process or design behind those abstractions isn't well explained. When you create abstractions, more verbose documentation and design discussion is needed to share the ideas. Anyone using this in the future needs to understand your abstractions, and there's a cognitive cost to dealing with them.
You reduce this cost when you explain everything well, give examples, show use cases, etc. When you don't provide this sort of verbose documentation, you might as well have made it a large sequential program because it likely would have been easier for the next person to understand.
The author chose a simplified example to demonstrate a point.
Also, without a proven need, building for these considerations would result in an over-engineered solution.
Shrug. I too can make any point if I get to choose my own contrived examples. And when a bad example is chosen, like here, any reactions will devolve into bikeshedding about the appropriateness of that example (as is clearly seen in this thread).
This is a completely unnecessary personal attack on OP.
The code is the data here. Imagine instead of a program to make cookies it is two different scheduling algorithms for an operating system.
I like to think of this in terms of the Charizard Pokémon card.
For context: in this example I have this card, and I'm sensitive about damage to it.
So in this OO example I put the card in a box and allow you to interact with it only in very limited ways. You cannot use anything you're used to, like your own gloves or hands, to interact with it, just my "methods". I might give you a tiny hole to look at it through. You could still damage it through the hole, so I have lots of logic to ensure you cannot poke it incorrectly. Hopefully the verbosity on both your side and mine is/was worth it, and is bug-free and not missing cases, and hopefully my hole was in the right place for your uses.
Obviously I can't give you too many holes in the box, otherwise what's the point of the box? I need the box to maintain my sanity.
The other alternative is that I just give you the card and take the risk that you might damage it, which is a disaster for my well-being. Or I duplicate the card perfectly and give you the duplicate, in which case I don't care what happens to the duplicate. MUCH easier, in my opinion. So please:
Stop creating hellish boxes with holes for other developers to peek through. Just choose a language with efficient immutability as the default, or use pass-by-value semantics with mostly pure functions.
Reserve your classes for things that are truly data structures in the general sense, not bs domain stuff like "bowl". A bowl is not a fundamental type of computer science like an integer; a bowl is just data and should be treated as such (https://www.youtube.com/watch?v=-6BsiVyC1kM). It can have a schema and such, but don't put it in some kind of worry box, or your program may end up being more about managing boxes and peek-holes than about Pokémon cards.
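The "duplicate the card" option looks something like this (toy Python sketch; the card fields are made up):

```python
from dataclasses import dataclass, replace

# An immutable record: callers can't damage my copy, because
# "changes" produce new values instead of mutating the original.

@dataclass(frozen=True)
class Card:
    name: str
    condition: str

mine = Card("Charizard", "mint")
yours = replace(mine, condition="bent corner")  # a new card, not mine

print(mine.condition)   # still "mint"
print(yours.condition)  # "bent corner"
```

No box, no peek-holes, no defensive logic; the language guarantees my card stays mint.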
A major benefit of OO is that you can actually enforce this. Encapsulation is useful for data objects where some configurations of bits are valid and some are invalid. Careful interfaces let you ensure that the object is always in a valid state and does not permit you to do a thing when it is not valid. The fact that you'd be unsure of these questions is an indication that your interface is done poorly.
Granted, this is really hard to get right. Doing it badly leads to the nightmarish combination of easily mutable state that isn't easily visible.
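A small sketch of what "enforce this" means (toy Python; the Account domain is made up): the constructor and methods refuse to put the object into an invalid state, so every instance you can observe satisfies the invariant.

```python
# Encapsulation used to keep an invariant: balance is never negative.

class Account:
    def __init__(self, balance: int):
        if balance < 0:
            raise ValueError("balance must be non-negative")
        self._balance = balance

    def withdraw(self, amount: int) -> None:
        # Refuses to perform the operation when it would break the invariant.
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    @property
    def balance(self) -> int:
        return self._balance

a = Account(100)
a.withdraw(30)
print(a.balance)  # 70
```

Python only enforces this by convention, but the same shape in a language with real access control makes the invariant airtight.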
"Copy everything" can be a really compelling option for many programs and there are persistent data types that help do this in a mostly scalable fashion. But there are plenty of cases where it just won't work. In my job our system primarily works on a data object that is too large to meaningfully copy everywhere. The solution is extremely judicious use of "const" and clear rules for automatically invalidating certain dependent program state when the underlying state we are working with changes. Lots of work, but in the end you get a ton of very strong invariants that make it really easy to work with the data.
> not bs domain stuff like "bowl", bowl is not a fundamental type of computer science like integer, bowl is just data and it should be treated as such
To an extreme: if your abstraction isn't formally verified, kill it?
Assuming as truth the idea that abstractions follow organizational structure, then only divide an organization when you have a formal abstraction for each division?
I wish there was a way to reason about this stuff that isn't so artful. I intuitively understand things like DRY, SOLID, etc, but being absolutely confident that they are true or whether they have been applied correctly is art, and I would prefer it to be math.
My usual view of recipes is poor - I always see them being something like this:
1. Blophicate the chicken for 5 minutes or until soft. (I made up that word, but you should be a good enough cook to have some idea what it means.)
2. Coat with a paste made from the garlic, herbs and butter. (You did know you should've made that earlier, right?)
3. Now add them to the fat you've been heating up for the past ten minutes. (Come on, surely you had that ready?)
4. Serve on a bed of hand-soaked couscous, which you prepared yesterday using this mini recipe:
4a. ...
I was going to say I'm just going to try this out on one of the recipes I've recently used, but someone did that already:
https://web.archive.org/web/20170420110020/http://www.matthe...
How is that not strictly superior to traditional "word-problem" recipes?
--
Also, one thing that I hate about recipes, as a person who cooks only occasionally, is the "to taste" direction. I know what to do when I've made a given dish 10 times. But the first time around? Why do no recipes ever provide any kind of bounds? "Add to taste; between 0.5 and 5 tsp, 2 tsp is typical."
(Truly, cooking is what happens to process chemistry when you care so little about the quality of the outcome that you can wing every part of the process.)
When I was learning to cook my process was to read the recipe thoroughly, including any instruction on techniques. A good book will tell you how to chop an onion and sauté it. I would then distil this into a dependency graph which I would write down in a book. I still have this book. A recipe in there looks something like this (excuse the bad ASCII art):
2 eggs -|
200g sugar -|---- beat together---|
200g butter -| |
|--- combine --- bake
240g flour -| |
1tsp baking pdr -|-- combine ------|
1tsp vanilla -|
Notice that, just like make, I don't need to write down how to beat or combine stuff together. I have a library of known steps to get from those ingredients to the desired state.

Nowadays I don't have to actually write this down because I stick to a few cuisines that I know well (English, French, Italian and Indian, generally). I know 90% of the techniques I'll need for any recipe, so I can simply read the recipe, assimilate it, then execute it in the kitchen.
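The make analogy can be sketched as a tiny dependency resolver (toy Python; step names and actions are made up stand-ins for the real techniques):

```python
# Each target declares its inputs and an action, like a make rule.
# Raw ingredients have no rule and count as already "done".

steps = {
    "wet":    (["eggs", "sugar", "butter"], lambda *xs: "beaten"),
    "dry":    (["flour", "baking powder", "vanilla"], lambda *xs: "combined"),
    "batter": (["wet", "dry"], lambda *xs: "combined"),
    "cake":   (["batter"], lambda *xs: "baked"),
}

def make(target, done=None):
    done = {} if done is None else done
    if target not in steps:          # raw ingredient
        return target
    deps, action = steps[target]
    if target not in done:           # resolve dependencies first, once each
        done[target] = action(*(make(d, done) for d in deps))
    return done[target]

print(make("cake"))  # "baked"
```

The recipe itself is just the dependency table; the "library of known steps" is the set of actions.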
The two biggest mistakes I see new cooks making are not reading a recipe through first and not building a library of common techniques. Instead I see people taking the original recipe right into the kitchen, often on their phone these days, and executing it as they read it through for the first time. This usually leads to an incredible amount of wasted time due to poor scheduling. Always aim to be free of the recipe. Like a musician, you should eventually be able to play the piece without the music.
I tend toward configuration-driven design, the more I get into operations (not necessarily development in the purest sense).
If I'm writing things that I want people to use, I want them to describe what they want - I don't want them writing code unless they need to extend what I've already done.
Configuration as code can definitely work and make some things more clear (at least, until the point an edge-case has to be added to the core routines to account for a new/custom type of configuration process).
Using a lisp tends to treat code as data, which solves all the problems in one fell swoop.
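The code-as-data idea carries over even outside a Lisp; in a toy Python sketch (step names made up), the recipe becomes a plain data structure and a small interpreter executes it, so a new concern is handled once in the interpreter rather than in every recipe:

```python
# The recipe is data: a list of (operation, argument) pairs.
recipe = [
    ("mix", ["flour", "sugar", "eggs"]),
    ("bake", 10),
    ("cool", 5),
]

def run(recipe, log):
    # One place to add logging, timing, dishwashing lists, etc.
    for op, arg in recipe:
        log.append(f"{op}: {arg}")

log = []
run(recipe, log)
print(log)
```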
So I think specifically the post's analogy of "it's hard to know how-much of each ingredient to use in each step" doesn't really map very well.
Adding indirection can make things more difficult to read. If the details you need to know are placed in multiple places, this is complex and adds cognitive load to understanding the code. -- Ruby is nice to write, but a PITA to refactor, because the 'type' of a method's argument is implicit. Whereas with languages with ADTs and records, a piece of code can be made 'smaller' and more explicit, and easier to refactor.
Maybe the post's argument can be adjusted where with some baking items, an additional step may-or-may-not be taken.. where an indirect style makes it harder to get an understanding of what's going on. -- But it's also important to note that sometimes the system being modeled is complicated and benefits from the added indirection.
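The records point can be made concrete even in Python (a sketch with hypothetical names): naming the argument's shape makes the code self-documenting and mechanical to refactor.

```python
from dataclasses import dataclass

@dataclass
class Ingredient:
    name: str
    grams: int

def total_weight(ingredients: list[Ingredient]) -> int:
    # Renaming `grams` is now a tool-assisted change across the codebase;
    # with an untyped dict you'd grep for string keys and hope.
    return sum(i.grams for i in ingredients)

print(total_weight([Ingredient("flour", 500), Ingredient("sugar", 200)]))  # 700
```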
I also hate when people make oversimplified analogies about programming.
Or maybe it's just me and my brain being wired abnormally for cooking. I'd love to learn it, but I keep bouncing off it, hard.
Pick any random ingredient you enjoy and cook it using a few different techniques and a few different time/temperature combinations. Try to understand what is happening with that ingredient, why they taste different and which you prefer. You'll soon get a feel for how different combinations of time and temperature affect different ingredients and will often be able to guess how new ingredients will behave based on your experience with similar ingredients.
I personally consider Alton Brown's old TV show Good Eats a great introduction to cooking following this approach. Most episodes are dedicated to one ingredient or one technique and really break down the science behind everything and how different factors affect the outcome. Once you understand the basic techniques and ingredients, putting together recipes becomes a lot easier.
On a piece of paper, draw 5 nodes and make a fully connected graph. What shape does it have? Well, whatever way you drew it, the shape is quite recognizable and distinct.
Now on a piece of paper, draw 1 million nodes and make a fully connected graph. You can use a computer if you want. What shape does it have? You can't tell, because all the lines are in the way? Alright, well, what if we make 5 clusters and represent them as nodes; since it's a fully connected graph, you can just connect those 5 nodes fully. It has a recognizable shape again! Nice :) There is the tiny little caveat that you now have a cluster of nodes represented as one node, but what could go wrong?
More layers of indirection, that could go wrong. The more concrete stuff you have, the more you need to abstract away in order to maintain a high level overview, but the tradeoff is that while it is high level, it is less concrete (more abstract).
> Step 5: Bake for 10 minutes, cool for 5, enjoy!
Yet even the initial implementation gets things wrong:

    sheet.cool(10)

u/rgoulter wrote:
"I'd note that the recipe used in the post doesn't include a list of ingredients."
Yup. Also missing are preconditions, assumptions, defensive programming. Maybe forgivable omissions from a blog entry. But those "ingredient" steps are what allow the "recipe" to be simple.
u/contingencies wrote:
"decouple your ingredient prep from your cooking algorithm."
This is The Correct Answer[tm].
But I don't see anyone explaining why: It makes the code testable, directly.
Stated another way:
Decouple all the async, blocking, I/O stuff from the business logic. And do not interleave those tasks.
How you know you're doing it wrong:
Any and all use of mocking, dependency injection, inversion of control is wrong. Therefore, the presence of Spring and Mockito (and their knockoffs) is strong evidence you're doing things wrong.
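What "testable directly" looks like in a toy Python sketch (hypothetical names): the business logic is a pure function with no I/O, so it can be tested with plain assertions and no mocks, while all the I/O lives at the edge.

```python
# "Functional core, imperative shell" sketch.

def scale_recipe(quantities, factor):
    # Pure: no I/O, no clock, no network -- test with plain asserts.
    return {name: qty * factor for name, qty in quantities.items()}

def main():
    # Imperative shell: the only place that touches the outside world.
    quantities = {"flour": 500, "sugar": 200}  # imagine: read from a file
    print(scale_recipe(quantities, 2))         # imagine: write to stdout

main()
```

The pure core never needs Mockito-style machinery, because there is nothing to mock.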
My professional cookery text book [1] doesn't write recipes like that. It starts each recipe with 'mise en place' [2]. It also requires that you are familiar with the previously described general process for recipes of the type.
It's a shame that most people are only familiar with the trash that comprises the bulk of cookery books.
Sorry, rant over.
[1] Professional Cookery: The Process Approach, Daniel R. Stevenson, https://www.amazon.com/dp/0091583314
https://itnext.io/solid-principles-explanation-and-examples-...
Representing a simple recipe as a process may work, but try modeling an increasingly complex system (from a simple local football-player tracking system to something more extreme like a payroll system or an air traffic control system) in such a way.
It has been tried before with limited success during the decade of structured analysis and dataflow diagrams.
Isn’t such an approach suited to simple transformational problems only?
It reminded me of how FP is applied to a program: the instructions and intent of the developer are encoded to configure the system (the recipe), and then an execute function is called. E.g., the onion concept.
Far from being an expert on the topic but was happy about recognizing the pattern.
To the article I would like to add that services can be designed as cooks so each and everyone has a purpose and a separation of concern.
My recipes are vague inspirations at best. Cooking is done by listening, smelling, tasting and looking, not by reading a set of instructions. Not sure how that translates to coding.
When making dinner for your family, sure. When needing to turn out 1000s of identical cookies day after day and you don't even know who will actually be making the cookies next week, not so much.
But yeah, none of those solutions seem great.
Two of the best programming books I read were UML Distilled (which introduced the idea of some patterns besides teaching UML) and Design Patterns Explained (Shalloway, Trott).
The moment I started reading this post I thought “that’s a strategy pattern use case”. And that’s the conclusion.
A lot of people fret about reinventing the wheel ("use a library instead!"), but then they do exactly that very often with software design, where a lot of problems are not only well researched but also peer reviewed and properly described, along with the consequences that come with them.
If you enjoyed this article, I would recommend picking up a book on design patterns (any popular one will do), as there are many more prefabricated solutions to choose from.