Sometimes the badness was really my laziness: the effort of unfamiliarity, of not instantly understanding what I was looking at.
Sometimes it was because it disagreed with whatever framework or methodology I was using to give me confidence in the face of ignorance. I feel like an imposter, but at least I know design patterns, so this guy who did MVC wrong is worse.
Sometimes it was looking at something genuinely bad.
Now, later on, maybe my emphasis is more on business outcome than perfect implementation or maybe I've been involved in making enough abominations due to time pressures and architectural compromises that I can read those forces in other people's work.
Either way, I don't feel that kind of disgust anymore. It's code. No one is going to read it. It will be replaced next year. It works or it doesn't. Having to rip stuff out when the business changes or someone wants to use a different stack for resume reasons is part of life.
I wonder if this is an adaptation or a maladaptation.
They weren't really based on anything more than sounding like they were true.
I'd hop on every paradigm that sounded correct. Clean code. Pure functions. Effective java. Pragmatic programming. Defensive programming. Like it has that righteous vibe to it. I'd totally strap a bucket on my head and go conquer the holy land under any of those banners.
If only we could do it my way, I thought, then we wouldn't have to put up with all these chafing points that annoyed me. Never do this! Always do that! My mind was like thumbnails from fitness youtube.
Along the way I discovered that when I got to do things my way, it turned out that there were actually still a bunch of chafing points. Different, but it sure wasn't great. Maybe my 30-year-old ass didn't know everything.
Eventually I came to the insight that I've built what, 15 applications in the course of my private and professional career. I've worked with 3-4 programming languages in enough depth to be competent with them. I've tried a few architectural paradigms. If I work until I'm in my 60s, I'll maybe double that. Life isn't long enough to get much deeper than that into the craft.
Given this pitiful sample size, it's nothing but hubris to think that I or anyone else would have a clear grasp of what is the best way of doing things.
Whether you wrote 15 applications or just one or two, does that really matter? Designing, exploring, writing, iterating on, and maintaining these applications _for years_ has given you insights, battle scars, and tacit knowledge that can only be gained through experience and continuous learning. Not to mention the different environments, technologies, and foundational knowledge you explored and internalized.
You've accumulated a hard-earned skill set and the ability to make wide-reaching, pragmatic decisions. Do you or someone else _know_ what the _best_ way of doing things is? Probably not. But I bet you have developed opinions, taste, and a toolbox of approaches with different trade-offs.
That's maybe where the OP is coming from as well. The mindset of being opinionated is very valuable if you can back it up.
That doesn't mean you're always right and don't let others speak. That doesn't mean you can't change your mind or that your approach excludes other people's perspectives and incentives.
It means you can strive for _better_ and that you're crazy enough to make bold decisions when necessary.
I was often the dumbest guy in the room, and I'm smarter than the average bear.
Dealing with these folks could be infuriating. Every time I would suggest orthogonal approaches (because, like, software is different from hardware), I'd be called "lazy," or "sloppy."
It made me write good code, though.
If those folks saw the way I work now, they'd be horrified. They'd call me a "reckless cowboy," or something to that effect.
But most folks in today's software industry think I'm a stuck-up prig.
I work quickly. I leave good, highly-documented code, that lasts a long time (For example, one of my C SDKs was still in use, 25 years later), and I don't want to toss my cookies, whenever I look at my old code (sometimes, though, I shake my head, and wonder what I was thinking).
I'm my own best customer. I'm the one that usually needs to go into my old codebases, and tweak them, so I write code that I want to see, in the future.
I've come to realize that the term "over-engineered" can mean a couple of things:
1) This code is overly complex and byzantine, which makes it prone to bugs, inflexible, and difficult to maintain; or
2) I don't understand this code. That makes it bad.
I used to have an employee who was "on the spectrum."
Best damn programmer I've ever known. Crazy awesome. Had a high school diploma, and regularly stunned the Ph.Ds in Japan.
His code was written very quickly, was well-designed, well-structured, well-documented, bug-free, highly optimized, and an absolute bitch to understand.
If you're building a quick demo of a product to get user feedback, and you write perfect code that's highly maintainable, you've wasted time - better to throw together something as quick as you can and rebuild it if it's actually going to be used by/sold to customers. That's really overengineering in my mind - doing a poor job with the quality/speed tradeoff given the purpose of the thing you're building.
Then it truly is awful. Deeply awful. It sounds like you've never progressed past dealing with terrible code, so you have my condolences. Good code is read. Good code is not replaced in a year. Even most bad code is not replaced in a year. Truly you live in a world of absolute shit code.
I think it's a good thing. Much of what used to get me irritated (and, from my observations, what gets others irritated) are just matters of style, and what style is being used isn't really important.
If code doesn’t really matter cause “business,” then I think you are right when you say business is more your interest. That’s cool! I go through similar feelings at times.
For example, they swear by writing low-volume web backends in C++ "for performance" and object to any kind of framework. Ironically, having to move more slowly and carefully as a result has led to big compute inefficiencies, on top of the more important hit to dev productivity.
Very often, the mathematically or logically "best" approach is not the correct engineering approach because it doesn't meet those practical requirements.
Engineers who insist on a sort of purity are, in my opinion, not the best engineers even if they are genius at writing code.
I've never seen this happen even once in my 25 years as a developer.
Quite the opposite, in fact. Codebases grow and become more entrenched every year the organization stays in business. The goal becomes to shoehorn into the product more and more features that the original designers never dreamed of. And those original devs are usually long gone. To do that shoehorning long-term in the face of developer churn requires intense discipline around communicating to the next person what you are doing and why.
But we deliver the message more or less.
If we had to have perfectly rehearsed sentences,
We'd never talk
When you are early to mid career, it is crucial to look for ways to amplify the good you can do in your workplace and solidify your brand as an individual. To do this, you should be looking, ironically, to elevate others. Doing so is the only way to build a reputation that people are going to actively WANT to talk about (e.g. "oh, having trouble? You should call in Jim, he helped me with a related thing"). This is invaluable.
Perhaps I am speaking through a lens, but had I taken the author's advice and taken a more combative role at such a juncture, I believe I would have far fewer opportunities now.
The key is illustrated in the book club parable: The elitism is directed outside of the group and becomes only a means of alleviating the fear of judgement for misjudging the paper. The grad student's approach clearly communicates the socially agreed upon reality: the whole paper is crap. This stance and boundary provides a clear decision space to the learning junior members: "if you think you see a mistake, those here will be happy to hear it; no sacred cows".
Bringing this practice into a situation where the target is a member of the group's work changes the dynamics such that you have to mind your Ps and Qs again -- and so, dampens learning.
In order to learn and make things better you _have to_ be critical. But the way things are communicated is very important so everyone is on board.
"We could do better here" - no matter who exactly was responsible in the past.
"This made sense at the time but with what we learned..." - remind each other that improvement and learning is part of the whole deal.
"I like the simple and expressive core idea of this, but if we expand this further..." - elevate and develop the good stuff that's already there.
I make jokes about my mistakes, bring them up early and often. Everything is a bit lighter and easier with a bit of humor and without the fear of making mistakes.
And vice versa it is just as detrimental to be afraid to bring mistakes and inadequacies up and criticize them. It's much more fun and productive if things are continuously improving.
If you start "amplifying" too early, you won't be amplifying selectively, and your coworkers would be just as well off sorting through search results themselves.
(Of course it's a progressive transition, and you're never too inexperienced to advise a coworker not to force-push master or submit a PR with failing tests.)
With a decade under my belt I'm feeling that I recently finally started learning and the conclusion so far is that half of the effectiveness of software engineering comes from obeying ultimately simple and common sense rules that anyone can follow, like "use idiomatic expressions", "read the documentation", "prefer pure functions and immutable data structures".
I'm an average(and kind of lazy) developer, but I found early on that I have an edge over more talented and hard-working people - I gather knowledge instead of compensating for the lack of it with hard work.
Can you frame it as elitism? I hope not, because I deeply believe the worst and laziest developers can use some of those rules so that they're both effective and still bad and lazy.
This sounds like common sense until it becomes common nonsense, because you use Python or Javascript or Ruby or whatever language where you don't have an optimizing compiler and optimized immutable data structures, so what could have been a single-pass, low-memory scan over a big dataset in 10 minutes that could run on a toaster is implemented as an inefficient clusterfuck that takes 12 hours and 4 GB of memory.
It's absolutely important to treat mutability as either a side effect or a local optimization from a design perspective. But Python is not Clojure no matter how hard you try, and at some point you're going to want to mutate a dictionary.
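A minimal Python sketch of that middle ground (the function name and data are my own illustration, not from any library): the function is pure from the caller's point of view, takes input and returns a fresh result, while internally mutating a local accumulator so a big dataset gets a single pass instead of rebuilding immutable copies.

```python
from collections import defaultdict

def word_counts(lines):
    """Looks pure from outside: no arguments are modified, a new dict comes back.
    Inside, a local mutable accumulator makes it a single-pass scan."""
    counts = defaultdict(int)  # local mutable state; never escapes this scope
    for line in lines:
        for word in line.split():
            counts[word] += 1  # the "local optimization": in-place mutation
    return dict(counts)  # return a plain dict; the local state dies here

print(word_counts(["a b a", "b c"]))  # {'a': 2, 'b': 2, 'c': 1}
```

The naive "immutable" alternative, rebuilding a new dict per word, turns the same job quadratic in allocations, which is exactly the kind of clusterfuck described above.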
Anyway, the point is that what might seem like "common sense" in some situations is not always obvious or unambiguous in general.
The draining nature of the "this is shit, we need to do clean code" and "TDD is the way" discussions, repeated day in and day out, can quickly kill any interest in working on code. It's not that far from being forced to write a book using a list of the 100 most common words, where if you use any word outside of that, you're a shit writer because you made it harder for others to read.
I'm quite glad that I moved away from that corporate environment into startup space a year later, which showed a whole different perspective. Where code itself is useless and what mattered is if it delivered value. If your hacky solution can deliver value - then you can justify making it better. Otherwise - who cares.
I do think there's way too much attachment to code & its perceived quality in the dev community. On the other hand, if you work in a team where the majority of people are well into their careers, there's a lot more nuance when it comes to the extremes such as "TDD all day all night" and "daily pair programming". They are seen as tools to utilise when appropriate, rather than mantras to be repeated mindlessly.
One of the reasons is small sample size. Children don't have the sample size to see why their simplistic solutions don't work. Neither does a programmer with two years of experience.
My colleague has 22 years of engineering experience across a wide range of problem spaces and industries and team sizes (large bigco to mom/pop orgs).
Another guy (X) started a year out of high school.
X insists that 'tech XYZ' is the best. Current tech stack was partially rebuilt by my colleague, but wasn't finished (because... lots of reasons, mostly resourcing).
XYZ is not only not a great fit, but the ecosystem supporting the problem space is small, especially considering what's already in place. Terabytes and years of data need to be migrated (both physically to new data centers and code-wise - new structure handling has to be added to accommodate current and future needs).
In planning meetings, "all voices need to be heard"... so X pushes XYZ a lot. And randomly rebuilds small bits in XYZ. And when it doesn't work - blames everything else (it's the network, it's the supporting libraries, it's ...).
The CTO will not push back. "That sounds great! That sounds like it'll solve all our issues!". There's a criminal deficiency in the understanding of the current tech stack or problems, along with no experience in migrating anything. But any criticism is taken as "we need to be more inclusive and let more people speak up - some of the best ideas can come from people who've not traditionally been heard".
Up to a point, that can make sense. But when do you draw the line? 3 months? 6 months? 18 months? People insisting on promoting child-like understandings of problems and solutions - while not ever delivering anything resembling a working solution - at some point should not be listened to.
Why does my colleague stay? He's only part time right now, and was close to leaving, but there's been some shift to refocus the CTO on something else, which may - over the next month or so - leave the few competent people there alone enough to get things back on track. I think if this was a 'full time' gig for him, he'd have left already.
I thought you meant the die-hard proponents of these software dev tropes are the childlike ones.
IMO that is more apt. I see a lot of resources wasted in the name of conforming to standards when the standard doesn't really apply to the particular scenario.
Now that is a quote for the ages.
I think my point is engineering may lend itself to strong opinions in people with poor social skills.
So, in the one example you gave, they were right. And yet I get downvoted for suggesting: hey, guys, that senior developer who's insisting on doing something a certain way? Sorry you have to hear this, but they're often right.
For your first point, no, they were wrong, I did very well on bitcoin - hence my example.
For your second point - I am confused - I agree senior developers know what they are talking about, I was one for nearly 2 decades. I was referring to peers not superiors. I was saying I have encountered narrow mindedness and ignorance. I did not say that is all I encountered. I also encountered kind, intelligent, superior and inspirational people.
Edit> On second thoughts - I re-read your comment - who downvoted you? When? Was it in regards to this article?
What on earth are you talking about?
Are you chatgpt?
Imagine I look at the Linux kernel source code and I feel it's lacking in automated integration tests, and that C is a poor choice of language for security-critical code.
Am I a competent professional, applying objective quality criteria?
Or am I an arrogant dilettante, to imagine I know better than some of the most influential living programmers?
Objectivity does not imply that it's easy to discern adequate criteria or that they are easy to know or that there is a consensus about them, just that it isn't purely subjective, and code isn't.
People with decades of experience are often integrating thousands of lessons they've learned, in order to satisfy hundreds of individual constraints as best as they can. Consider the "Thinking Fast and Slow" anecdote about the fire chief who orders everyone out of the building, despite nothing obvious being wrong, because they have a gut feeling. If there's a culture of "Who made you the boss? Look at this guy, acting like he's the smartest guy in the room. If you're so smart, you should be able to explain to me why you believe we should all leave the building. All voices deserve to be heard. Sometimes the best ideas can come from ..." then everyone dies.
Look, I know we hate it when that arrogant sportsman is actually good at his sport. It's a bruising to our ego when the senior developer seemingly arrogantly insists on some standard or some change that isn't immediately obvious to you, and she doesn't immediately have the time to explain to you the 600 reasons why. She's probably a hell of a lot more critical of the idea than you are! She's already thought of, and worked through, every single objection you're raising, plus 100 more, and still decided this was the way to go. Yes, mentoring is important, and she probably does spend a lot of time doing that. (Maybe you're not willing to listen?) But not every moment has to be a healthy-debate all-ideas-welcome teaching moment. Sometimes you just fucking listen to the smartest guy in the room before the building collapses.
You might be inclined to disagree with "Well, he said 'this technical elitism' and then gave his own definitions so...". Yes, he said that, and then proceeded to suggest it's not just acceptable, but desirable to view efforts below your standards with disgust.
There was no implication of mindfulness or humility in this process, only that it was an effective way to self-motivate. There is a difference between what is sufficient and what is unacceptable. If you set your standard at excellence, then anything less would be unacceptable. Tempting as that mechanism may be for self-judgement, it is a slippery slope, because we often judge others by the standards we set for ourselves.
if I had only one wish, it would be the end of this reference. Python has no zen, at least not anymore, and if you just want to refer to some abstract moral or best concept, please pick another quote
Says the language whose code-block-endings are invisible! I will never understand the desire of people to make the important punctuation in their language invisible. Python, CoffeeScript, YAML. These people just hate being able to see the important flow-determining punctuation in their language!
Discuss.
(I did read the article)
I think it’s even simpler than this white/black belt metaphor.
Driving your career forward requires delegation, scaling yourself through others. Doing this effectively requires having strong opinions. The author is referring to these opinions as elitism, which is jarring to me. It could be elitism or simple pragmatism.
Quite a few posts here are referring to code quality. People often forget that programmers aren’t paid to write the prettiest code or have the most beautiful abstractions. VALUE is what we want to produce.
I once found myself insulting a monolithic code base, only to later realize that mess of a code base had shipped in over 10 million devices, in a product rated over 4.5 stars on Best Buy, Amazon, and many more retailers. Its entire ecosystem had directly and indirectly generated billions of dollars in sales.
Meanwhile, my own teams’ clean code with well thought abstractions hadn’t generated any revenue at all. In fact, this other “piece of shit” that came before paid for all our compensation.
However, the ethics of posting a poorly written chat bot's nonsense points to a deeper issue. We must consider whether derivative copyrighted material, when misappropriated by users, is still plagiarism due to missing citations.