My fellow devs often laugh when I tell them that my way to feel good isn't to look at all the long-dead techs that are no longer useful to me, but to look back at all the long-dead techs I never bothered to learn.
And, geez, is the graveyard huge.
> Java Applets were also a big thing once upon a time. They were slow, and having the correct version of Java installed on your computer was always a mess.
Java applets were never that big. They didn't work very well (for the reason you mention) and weren't ubiquitous. They also nearly all looked like shit.
But Java isn't disappearing anytime soon. Java is huge, and it'll leave a legacy dwarfing COBOL's many times over. Many may not find Java sexy, but the JVM is one heck of a serious piece of tech, with amazing tooling available.
And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.
Did anyone in the mid-to-late nineties / early 2000s really discover Java and Java applets and think: "Java applets are the tech that'll catch on, I'll invest in that" and not invest in Java itself? To me Java was the obvious winner (not that it was that great, but it was clear to me Sun was on to something). And, well, compared to the other dead techs, at least if you learned Java applets you got to learn Java too, so it's not all lost.
Guilty as charged! I hate using Java because everything written in java seems to blend into the same indistinguishable swamp of classes with meaningless names, full of methods that constantly find new and interesting ways to obscure what your program is actually trying to do. Debugging very large Java codebases feels like living through Terry Gilliam's 1985 film Brazil.
I think the problem is cultural, not technological. It seems like there are a lot of people in the Java community who still think OO is a great idea. Who think Bob Martin's awful, muddled code examples in Clean Code are something to aspire to. People who claim the path of enlightenment requires them to replace concrete classes with interfaces - even when those interfaces have only one implementation.
Anyone who writes class names like FactoryBuilderFactoryImpl and claims to walk in the light is at best a scoundrel, and at worst a follower of a religion so dark we don't name it in polite company.
This is what makes IntelliJ so impressive. It takes a master craftsman to forge something beautiful from such an unholy material. VS Code pulls off the same feat on top of Electron. In each case I can't tell whether to be horrified or impressed! Either way, I'm a huge fan.
The IDE that's written in Java most likely shares nothing with the parts of the Java ecosystem they "hate" (I'm guessing all of the EJB, applets, and general '90s Java enterprise stuff). The only irony here is you thinking it does.
The '90s Java was a thing to hate: running a fat, complex app server with a bunch of code-as-XML configuration just to serve a simple API app. IDEs are not doing that (although Eclipse was a bit of a monster of glued-together parts).
> Did anyone in the mid to late nineties / early 2000s really discover Java and Java applets and thought: "Java applets is the tech that'll catch on, I'll invest in that" and not invest in Java itself?
...the entirety of management, it seems; it was the most popular hammer for the problem in the enterprise for quite some time. Hell, we still have some less-than-decade-old servers needing Java applets for KVM access.
I share this sentiment. All my adult life I've been in contact with the "tech-anxious" kind of dev that needs to be up to date with all the new frameworks all the time. While I've entertained several new trends, I found it counter-productive to my well-being to focus on more than 1-2 stable stacks at a time. There will always be an order of magnitude more failed frameworks than successful ones, and this perspective keeps me healthy.
Sexy? nope. Hot new thing? neither. But the jobs are out there.
Similarly that mention of RoR being "in danger" seems incorrect. It's just that Rails is not the topmost big player in the market, and I don't think it ever was, even at its early peak. Sure it made rounds in the news and had a big impact but even back then it certainly wasn't The Tech That Everyone And Their Dog Uses.
That's fine. A tech doesn't need to be in the top 3 market leaders to be useful, successful, and lively.
Reminds me of so many devs hating on Ruby and the number of times I heard that Ruby isn't used for anything relevant, while the huge majority of them either use GitHub or Gitlab for their work.
This is how I currently feel about Kubernetes and Docker. I'm having all sorts of fun with the JVM, a monolithic jar, and about 20 lines of shell script. I can deploy in 20 seconds without much fuss.
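To give a flavor of how little is needed, here's a sketch of such a deploy script. Everything in it is hypothetical (host, jar path, service name), and `DRY_RUN=1` just echoes the steps instead of running them:

```shell
#!/bin/sh
# Minimal "20 lines of shell" deploy sketch. HOST, JAR, and the service
# name "app" are made-up placeholders - adapt to your setup.
HOST="deploy@app-server"        # hypothetical target machine
JAR="build/libs/app-all.jar"    # hypothetical fat-jar path

run() {
  # Echo the command in dry-run mode, execute it otherwise.
  if [ "${DRY_RUN:-0}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

deploy() {
  run ./gradlew shadowJar                       # build the monolithic jar
  run scp "$JAR" "$HOST:/opt/app/app.jar.new"   # upload beside the live jar
  run ssh "$HOST" "mv /opt/app/app.jar.new /opt/app/app.jar && systemctl restart app"
}

DRY_RUN=1 deploy    # show the steps without touching anything
```

Uploading to `app.jar.new` and then `mv`-ing keeps the swap on the server atomic, so a half-transferred jar never goes live.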
That's how I feel about design patterns. Talk about an overemphasized pile of academic bullshit that "every programmer" supposedly must know before going into an interview. There are some domains where it may be necessary to know them off the top of one's head, but for the most part there's usually no utility in filling your head with individual ideas that worked in one circumstance but may lead to bad patterns in another. I'm so glad I gave up memorizing design patterns, especially given that many employers seem to have lost interest in trivia questions about them as well.
Don't forget Confluence and JIRA and all Atlassian products. Minecraft too. I facepalm when people say Java this and that
I'm not sure why this is a salient point. There's no contradiction in hating Java and using an IDE written in Java. Lots of people hate C, too, but that doesn't make it strange that they use lots of software written in it.
And using AWS that has most services written in Java.
Java improved a lot in the last few years and it’s not going anywhere.
Didn't they also make Kotlin so they could use the JVM without Java?
I don't use an IDE (or it's not written in Java, depending on how you define it), but I think I can reliably tell a Java app - if not before - then when it gives me an error.
They always (or often - I can't say it's definitely not Java when I haven't identified it) seem to spew stack traces that the end user doesn't care about, or in many cases (outside an IDE) wouldn't even recognize.
Fast forward to now and PHP jobs are paying over twice the old rate and grinding along while Flash is long dead and buried. Follow your instincts, not your friend's advice. :P
I feel that's substantially different with Python, Go, JS and ... C/C++.
Not sure if it's the same thing, but... when I was 16 or 17 I did tech support for an uncle who did trading with a trading client application written in Java, among some other tech stuff.
The Java thing was launched via a desktop launcher with a splash screen, which I later found out was Java Web Start (.jnlp extension, IIRC).
I didn't see the value at the time, but now that I think of it, Java Web Start was a nice solution to manage installed "apps" and also fetch updates automatically. IIRC there was a setting to always download the latest version, if possible.
Needless to say, this was ahead of its time.
As I move forward in my career I always find some new interesting thing about the Java virtual machine; it truly is a marvel of technology.
The latest things that amazed me were the Flight Recorder and Mission Control toolbox (very cool to be able to see in real time what your application is doing and why it's performing the way it is) and the new garbage collectors (we're currently using G1, but I want to propose testing Shenandoah or ZGC).
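For the curious, turning those on is mostly a flag away. A sketch - `app.jar` and `<pid>` are placeholders; ZGC is production-ready from JDK 15 on, and Shenandoah ships only in some JDK builds:

```shell
# Record two minutes of Flight Recorder data at startup, then open
# profile.jfr in Mission Control:
java -XX:StartFlightRecording=duration=120s,filename=profile.jfr -jar app.jar

# Or attach a recording to an already-running JVM by pid:
jcmd <pid> JFR.start duration=120s filename=profile.jfr

# Trying the low-pause collectors instead of G1:
java -XX:+UseZGC -jar app.jar
java -XX:+UseShenandoahGC -jar app.jar
```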
A Java applet could be small and fast-loading, but that required careful scoping, design, and coding. In JDK 1.0 you didn't have JAR files, and there was no serialization built into the platform. I wrote a program that would let you edit finite element meshes and submit them to a FORTRAN code running as a CGI, and the serdes code was about as large as the rest of the program.
It's also entirely possible you were encountering them more than you thought. My high school website, for example, had around 4-8 on the homepage - minimalistic applets that simply showed and cycled through images, placed in a grid in the header area. They could just as well have been JavaScript; you couldn't tell the difference just by using the page.
I did. Getting people to install JRE didn't really seem that hard of a problem, and neither did sandboxing.
Around the same time Rockstar Games released GTA 3 with its third-person camera, and I thought this is the worst idea ever. 20 years and 400 million sales later, I still hate it.
I wrote a very simple applet in the early days - it leaked memory - nothing I could do.
Also, the deployment story - with a single shared version of the JVM (at a time when it was moving quite fast) in the browser - was an issue.
Quickly decided Applets were dud, but Java itself was fantastic. It's easy to forget what the other options were at the time if you wanted to write complex servers or cross-platform desktop apps.
Not only that, what about the guest-language folks spitting on the Java libraries and runtime infrastructure (written in a mix of C++ and Java) that make their shiny snowflake possible to start with?
Same applies to the C# bashing on the CLR side.
Sometimes I think that this is just the nature of software development. Most of the stuff I build is built to solve an immediate business problem. It probably lasts 5-10 years, and then someone rewrites it in a new language, or, more often, the task isn't relevant anymore so the code gets deleted.
I find myself thinking that maybe if I’d been in civil engineering or something then I’d be building stuff that lasts, but speaking to people who’ve worked a long time in construction has taught me that it’s the same there. Most of the buildings that go up, come down again in a few decades once regulations/fashions change or the new owner of the site wants something else.
Every so often something like a Cathedral gets built, and those get built to last. But most people don’t get to work on those. If there’s a software equivalent of a Cathedral then I still haven’t found it.
But even a Cathedral changes over time, and your work may not last; all human work is a shrill scream against the eternal void - all will be lost in time, like tears in rain. The best we can do is our best with what we have in front of us. And maybe all the work you did to make sure your one-off database code correctly handled the Y2038 problem back in 2000 will never be noticed - because your software is still running and didn't fail.
My very first job was to work on the backend of some software for internal use. You have probably all bought products that were "administered" in said software. When I worked on it, it was 15 years old. It evolved over this time, of course, but original bits and pieces were all around me, and I evolved bits and pieces of it, as did others. By now another 15 years have passed, and while I know they did a rewrite of some parts of the system, I bet some of my code - and some of the code I found that had been written almost 15 years before I started there - is still around, and you and I keep buying products touched by that software without knowing. The rewrite was also done by at least one person who was around when the original was built. He learned a new language to implement it, but he took all of his domain and business-logic knowledge over.
It's even funnier because I knew the guy who wrote some of the code I worked with directly, from a different company, but I had no idea he had worked there or on that project until I saw his name in the CVS history. Yes, CVS. I personally moved it to SVN to "modernize". I hope that was recognized as "technical debt" and they moved to git, but I wouldn't know.
But I like to think that ideas and solutions and products can be legacies.
It's semi-uncommon to write code that legitimately lasts 5+ years.
But it's very common to work on projects/products/companies that last 15+ years.
And I have to be content with that.
Probably something like parts of Windows or Linux, or the GNU tools - things that, while still being updated, also have ancient components hanging around.
Right. People who assume otherwise aren't spending much time browsing the relevant subjects on Wikipedia or historical registers or just paying attention to their municipality. Simple demonstration: look into how many Carnegie libraries that were built are now gone versus how many are still around.
Actually you have. It’s HN! The website that hasn’t changed in decades… used by the most tech savvy people in the world!
These sorts of constructions have been repaired and re-set hundreds of times over their existence, and have sometimes gone through periods of destruction during war and natural disasters, disrepair then subsequent periods of restoration and reuse. At a certain point, very little or nothing of the original construction really remains, but you can nevertheless draw a line through hundreds or thousands of years of history.
Software may be more like this: continually rebuilt and maintained, but still physically or philosophically related back to some original construction. Nobody uses Multics any more, but almost everything in use today is derived from it in some way.
Imho, there’s freedom in accepting that nothing I produce will last a long time.
That one old file dialog window that still somehow shows up in windows 11 from time to time?
Or the Linux kernel.
In general, looking back at old code I wrote, it seems my solutions were better when I was more naive / less experienced, as I would often go for the immediately obvious and simple solution (which is the right one in 90% of cases). Working many years in software development and reading HN seems to have made me more insecure regarding software, as I tend to over-engineer systems and constantly doubt / second-guess my technical decisions. So one thing I'm actively trying to do is go back to using simpler approaches again, and to care less about perfect code.
It's because I've so often seen the cycle here of "X is brilliant" and then 2 years later "how we switched off X and saved millions of manhours!".
When I was more naive, I'd just stick with the first solution that worked. Now I find that I'm always worried that someone is going to try to input a 30,000-line CSV into my tool, and that I must use the slightly faster way because it's the right thing to do.
One of my tools to counter this has been the non-accepted answers on Stack Overflow. I find they are generally of similar quality to the accepted answer, but because they are less tailored to the asker's (necessarily different) problem, they fit it less well and had less chance of being accepted - despite being more useful to more people.
I find this as well. I also think that there is a sub-conscious fear as you become more senior that you need to justify that with more elaborate/complex solutions. In my experience there are also a lot of people in software who never come out of the other side of that view and constantly equate "complex" with "good". Being in an environment with lots of people like this makes it hard sometimes. Ultimately you need to trust yourself and do what you think best. If it goes wrong, at least you know you did it for the right reasons.
Recent example: I have a side project with Typescript on the backend and frontend which also uses an Audio Worklet, so it loads a JS file at runtime to be run in a separate process, isolated from anything else.
I also have a class which I need to use in all three mentioned pieces.
Three years ago I would spend hours trying to figure out how to make this work with the build system so as to not have duplicated code and maybe eventually arrive at a mostly working solution before the e.g. frontend framework maintainers update their build system version which may or may not break something.
This time I just copied the damn file everywhere - the Audio Worklet got the compiled JS output pasted at the start. It's a class, it works and I have maybe one idea what changes I could make there in the future at which point I'm just going to copy everything again.
It can be hard to distinguish overengineered messes from Good Software: an idiot admires complexity, et cetera.
No, it's just a fucking product you made. The fact it has to be maintained doesn't mean it is "debt", it's just like any other asset.
You don't get into your car and think "that's technical debt". There is nothing "debt" about it. It's a tool with maintenance needs.
The difference is choosing worse now to get it faster - that's technical debt.
The author navigates the technology world very differently than I do.
The rest of commenters for some reason seem to equate "technical debt" with old technology and "legacy" projects.
They just expect the existing thing to keep working in the background and, more importantly, to do the new stuff that is wanted.
It's generally a debt, because engineers are not given time to maintain it properly. Imagine the car never getting to go to a garage to be serviced, because it's needed for driving all the time. There is your debt.
Right, but we always do that. We always make some tradeoff to get things shipped faster. And that's fine, otherwise we wouldn't ship things. It's a balance - and the general meta of "all technical debt bad" is harmful to actually building working software.
Yes, biased old man that saw 30 years of code get replaced with the latest "thing that kind of works like the old thing, but not really as good, but we have to do it to keep up with technology or we'll have debt". I've seen old VB6 apps fulfill a business need just as well as anything written in a full LAMP stack that takes a team to implement. (not that I'd ever use VB6, but hey, it did a job).
Take me out back of the shed and end it quick while I'm hunched over programming something cool.
I wouldn’t consider myself conservative by any means, but increasingly in my 30s I’m beginning to think everyone needs to stop messing with stuff and accept imperfection.
[1] Was looking at another HN thread, CS 61B Data Structures, Spring 2023 UC Berkeley, https://news.ycombinator.com/item?id=35957811 and in one of the videos, Lecture 27 - Software Engineering I, https://youtu.be/fHEVKqYb9x8?t=387 the professor says "Programming is an act of almost pure creativity"
[2] Japanese Joinery, https://www.youtube.com/watch?v=P-ODWGUfBEM
I hear tales of several layers of emulators whose sources are forever lost, with the bottommost of them running some very mission-critical early COBOL or ALGOL 68 (or Lisp, who knows) code, written back in the days when it was that newfangled thing the young guns liked, and which no one can reimplement today (maybe because the elves have died out from smog and the golden age is over) - but they may be bunk.
I will leave that to juniors who probably appreciate doing the coding.
* Simple, but reliable file formats: My old photos collection is 25 years old (jpeg) and I still look at them occasionally.
* My influxdb/grafana/openhab setup is now 8 years old and has been operational with only minimal hiccups.
* Some home automation scripts I wrote have endured for 8 years now.
* My Linux/Unix knowledge is 26 years old (although much has changed since Slackware).
* My C and C++ programming language knowledge is 28 years old (although I rarely use it anymore, because of Rust). Occasionally it is useful during debugging performance problems in kernels.
* My knowledge of distributed systems (stuff like leader election, distributed and centralised algorithms) is 15 years old. It is still very relevant to understand both existing systems and to evaluate new systems.
It is crucial to distinguish the knowledge that stands the test of time from the fleeting trends. And finally then there are the non-technical skills, which pay off even more. Effective communication, negotiation skills, organisational finesse, inspirational leadership, adept management... These are the true constants that support us during our professional life.
CTO's change. They started replacing it with a "proper" ERP system around the time I left and it's been nearly 6 years of pain. Joel spaketh: https://www.joelonsoftware.com/2000/04/06/things-you-should-...
(As usual: of course you can break the rule, but first deeply understand the risk you're taking.)
Sometimes you don't need fancy new stuff. Just learn the things at your disposal very well. Heck, sed, awk, and a bunch of simple CLI tools still work well on big data sets :)
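A toy illustration of that point - an ad-hoc group-by with nothing but awk, sort, and head (the file and its columns are invented for the example):

```shell
# sales.csv is a made-up two-column file: region,amount
cat > sales.csv <<'EOF'
eu,10
us,25
eu,5
us,15
apac,7
EOF

# Sum the amounts per region, then print the biggest region.
awk -F, '{ sum[$1] += $2 } END { for (r in sum) print r, sum[r] }' sales.csv \
  | sort -k2 -rn | head -1    # → us 40
```

Because awk streams line by line, the same one-liner works unchanged on a file far too big for memory; prefixing `LC_ALL=C` often speeds up the sort considerably.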
20 years from now, C will probably still be a core language in its niche.
Dr. Hipp spent years achieving this, and any reimplementation will suffer his travails, regardless of language safety.
DO-178B means that SQLite can be used in avionics. No other major database has reached this level of code quality; it is the database written for "programmers who are not yet born."
[1] There have been multiple articles on HN where a single person has written a C compiler.
I think C and its direct descendants will slowly fade away over the next 20 years as new developers want to get away from the legacy of language specifications that span existing codebases. There are so many sharp edges in C that have automatic fixes/detections/etc. in newer languages, and don't get me started on multiprocessing complexity in C.
I met a traveller from an antique land,
Who said—“Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.”

When I was a King and a Mason - a Master proven and skilled
I cleared me ground for a Palace such as a King should build.
I decreed and dug down to my levels. Presently under the silt
I came on the wreck of a Palace such as a King had built.
There was no worth in the fashion - there was no wit in the plan -
Hither and thither, aimless, the ruined footings ran -
Masonry, brute, mishandled, but carven on every stone:
"After me cometh a Builder. Tell him I too have known.
Swift to my use in the trenches, where my well-planned ground-works grew,
I tumbled his quoins and his ashlars, and cut and reset them anew.
Lime I milled of his marbles; burned it, slaked it, and spread;
Taking and leaving at pleasure the gifts of the humble dead.
Yet I despised not nor gloried; yet, as we wrenched them apart,
I read in the razed foundations the heart of that builder’s heart.
As he had written and pleaded, so did I understand
The form of the dream he had followed in the face of the thing he had planned.
When I was a King and a Mason, in the open noon of my pride,
They sent me a Word from the Darkness. They whispered and called me aside.
They said - "The end is forbidden." They said - "Thy use is fulfilled.
"Thy Palace shall stand as that other’s - the spoil of a King who shall build."
I called my men from my trenches, my quarries, my wharves, and my sheers.
All I had wrought I abandoned to the faith of the faithless years.
Only I cut on the timber - only I carved on the stone:
"After me cometh a Builder. Tell him, I too have known."

The small amount of code that's useful for years or decades needs to be built on a stable foundation and with few dependencies. This probably matters a lot more than the code being a bit messy. It should require as little maintenance as possible.
I have two pieces of PHP code I wrote more than 10 years ago, in PHP 5.1, that I still use. Both are messy, one is outright horrible, but they have no dependencies (the horrible one optionally depends on GeSHi, but there's a runtime check for its presence and has a "degraded" mode for when it's not there). They work and are relatively bug-free (the horrible one in particular is battle-tested). I didn't have much to do to run them on PHP 8.2. A few superficial fixes that were actual issues (and took half an hour to fix). By the way, I can't figure out how PHP 5 was even able to run this code. You could have mandatory parameters after ones with default values. Yuk.
It’s ugly, but it gets the job done.
Best thing is that front-facing HTML and JS from 10+ years ago still works flawlessly, even though the UI is certainly dated.
But I guess that’s the same with HN, where the design never really changed at all and just works.
First I wrote single-threaded code with automatic memory management, then single-threaded synchronous with manual memory management, then synchronous multi-threaded, then async, and then async lock-free.
Now I am writing async lock free, and the compiler is helping me prove it is data-race-free and memory safe.
Each time I rewrite this stuff, someone hands me 6-7 figures. This is awesome.
Lock-free goes way back. Multi-threaded goes way back. Both more than 20 years. SIMD goes way back. GPGPU goes way back.
What is newer-ish are large scale distributed systems. But even that isn't so new any more.
Citation needed. It seems it's just us getting more and more abstracted, which can make things look easier but doesn't necessarily make them easier.
The hardest part for the next gen of developers is not having Moore's law to save them from crappy coding.
Most of the programs I use (besides the browser) are 20-50 years old (paste, cut, xterm, grep, emacs, etc.); almost any new thing I try is horribly slow (Slack, WhatsApp, the Mac terminal app; even the Mail app is slow compared to mutt).
Open a web page on a computer without an adblocker and look at what we have built.
https://www.youtube.com/watch?v=pq7NLMwynYg (funny video demonstrating the state of the modern web)
I am writing async lock-free code as well, but which compiler is helping you? Rust? As far as I know, it's cutting-edge research to figure out a type system that fixes this for you. But maybe I'm not up to date.
> stuff gets better so fast that we get to reimplement the same thing
So you are leaving the API intact and just changing the implementation with new techniques? Or, even better, you just rewrite the low-level libraries you are using? Sounds like the ideal job to me.
I envy you.
Making something that really lasts is hard. My favorite example is the Clock of the Long Now, a timepiece designed to operate for the next 10,000 years:
Rails is kicking ass. A huge number of people are coming back from bloated JS frameworks to realize that Rails just keeps getting better every year.
Hotwire and similar technologies make the argument for SPAs look very questionable.
And let's not forget where we are; over 75% of the raw gross value created by YC-backed companies comes from companies using Rails. So it seems like Rails is only popular with successful startups.
I think TypeScript is important because of how badly JS operators are defined (the operator coercion matrix is just plain stupid; throwing an error would be a better option), but everything else is optional. Otherwise JS is good enough.
On the other hand, Hotwire (moving HTML around) costs real latency and money for mobile users with data plans. Since when is it better than just executing code on the (for most people, powerful enough) mobile device at close to zero cost?
C was the 2nd language I learned and I'm still using it. The big surprise for me has been javascript - it's so...bad it became good or at least ubiquitous.
I just left a job with a guy who kept going on about how he was a Flash hotshot back in the Y2K days, lamenting "Steve Jobs killing it." Like, dude, everyone with two neurons to rub together told you it was a fundamentally terrible idea, doomed from the start. Gawd.
I also fell for a few of his picks — Angular was a rough blow — but what a list of lousy bets.
The POSIX standards, flawed as they may be, have incredible staying power. These standards run our phones and embedded devices, supercomputers, current Apple workstations, and game consoles, and are significant in many other places.
Microsoft itself implemented POSIX in Windows from the beginning (likely recognizing its importance as the former vendor of Xenix), and while this has waxed and waned, running the "wsl.exe -l -o" command on modern Windows will catalog Ubuntu, Kali, and Oracle Linux that are not Linux, but serviced by the Windows kernel's POSIX layer under WSL1.
Applications that implement or greatly enhance POSIX have staying power.
Those who seek code longevity would do well to study it.
that's not how that works.
WSL is a Linux kernel running under a hypervisor integrated with Windows.
I'm sure some not insignificant parts of Windows, Linux, tools, libraries, browsers etc. etc. are fairly old code that just keeps working .. perhaps with fixes and improvements.
Good code lasts a long time. Technical debt is something you are continuously paying "interest" on. Just like any debt, sometimes it's a good thing and sometimes it's a bad thing.
in my experience, if there's no will or budget for a rewrite, it gets virtualized, locked away behind a firewall/corporate network with a restricted set of users, and will basically run forever as long as the ISA and virtual storage is supported by emulation and the business process still exists and is able to support the infra/people involved.
think of it like a zero coupon 100 year bond in technical debt issuance underwritten by the central bank (corporate hq). and highly liquid in the sense that a virtual image is easy to move around.
the good part is it just gets faster and takes up less % resources as hardware improves, and forever-bugs are usually worked around and documented by the people pushing the buttons.
I think we've done ourselves a very big disservice as an industry by focusing on this boogeyman.
Products don't become technical debt; they merely depreciate along with any associated technical debt. This is why I don't like the term "technical debt": it implies something that must be paid back. But you don't have to pay back the "debt" on a product that will be completely replaced anyway.
When I have a long list of todo items to perfect my work for a client, and then they run out of money for the whole project and end my contract, I don't say "oh no, now I'll never dot those i's and cross those t's!" I say "yay, I get to cross everything off my todo list forever."
Code, most code, will eventually turn into some sort of sanitised applied maths corpus (with a heavy overlay of "AI" analytical tools).
In applied mathematics you have cultural obsolescence (stuff that we no longer find interesting or relevant, like an asymptotic formula for the Airy function [0]), but the corpus is intrinsically add-only. Once something is solved it does not have to be solved again. Its essence is eternal, so to speak.
Think about code development in such a mature state. It will mostly work by pulling together (using AI prompts) logical units from a future version of Rosetta [1].
When we get to that stage it will be hard to add something truly new. Just like it takes a long incubation and a PhD to add some marginal new thing to applied mathematics, it will take effectively trained applied mathematicians to add something minor to the evolving code corpus.
Coding for the common folk will be mostly compositing. It might be productive and even fun, but not quite the same.
We lived through a period of widespread democratization of development. Cherish the freedom of reinventing the wheel in countless flawed ways :-)
I agree with you. That end state is of course all open source. Anyone not working toward that state is making money (understandably) as the higher priority.
I'm still not sure why businesses do so much proprietary customization as opposed to working in an upstream OSS project. I suppose the OSS foundation in those cases isn't solid enough yet.
I have worked in the field of AI for 40 years, and we have had a huge trash heap of technologies that ended up being useless except for lessons learned from failure.
I have always been motivated to work for just two reasons: supporting myself and my family, and learning and using new tech. The great fun is in learning new things. That said, sometimes there is good short term work supporting old tech that companies still want to use.
The joke is, even the lessons will deprecate at some point, either when younger generations who have not yet learned or understood them enter the playing field, or because the progress of technology invalidates the lessons.
Nothing is really meant to stay.
You needed to completely change frameworks and even languages 3 times in not even 30 years?! Just to keep things maintainable?
I don't feel that it's the same for other areas of programming: desktop apps, embedded stuff. Maybe I'm wrong...
Our cathedrals are surely [L,U]inux and the C programming language, HTML has done pretty well too.
We're not getting that money for generating realistic-looking syntax, but for discussing with stakeholders and choosing appropriate designs with an eye for both the bigger picture and details. These are language-and-library-independent things, mostly, and incredibly difficult work.
You may be able to give an LLM directions to do something like what you would have been able to do, but the money is not in the parts the LLM is able to accomplish, it's in the direction you give it.
"Grandfather, what did you spend your life on?"
"I made people click ads. Don't worry, it's all technical debt or deprecated now ..."
"Your grandfather was a brilliant man and his peers all praised him. Go ask him about it!"
"Grandfather, what did you spend your life on?"
"I fixed people's shoes. Don't worry, ..."
I generally think of "fad" tech as frameworks and ecosystems that are built around a commercial interest or novel idea. They often fail to overcome network effects. And this leads them into obscurity to await deprecation.
There's a lot of churn that happens in the frothy red waters of "trying to make programming easier/faster/accessible-to-non-programmers".
And we're not so great at maintaining our legacy, the state of the art, the pedagogy and history of our science. Over a twenty-plus year career I've seen people re-invent solutions to the same problems over and over. Each time it's a revolution. You don't want to dishearten the young and eager, but seeing them run into the same problems, reaching the same conclusions, etc. means we've not been doing a great job at teaching and mentoring and all that.
MS Visual Basic 6, MS ActiveX, MS Silverlight, MS Visual Foxpro, MS C# .NET Compact Framework, MS ASP.NET WebForms, MS ASP.NET MVC, MS Windows Communication Foundation...
I think there is a common theme there. But shhh, don't tell him.
# Job 1 (3 yrs)
- Worked on around three products, all shut down, code probably lives in some SVN archive.
- Learnt advanced JS, PHP, MySQL, Photoshop, jQuery etc (skills mostly relevant)
# Job 2 (1.6 yrs)
- Project never launched, code never saw the light of day. Probably lives in some Git archive.
- Learnt a few in-house frameworks (irrelevant) but also learnt Git (relevant)
# Job 3 (8.5 yrs)
- Worked on several products; the biggest one is still active and seeing millions of users weekly. The rest got shut down and live in a Git repo.
- Learnt about some inhouse frameworks (irrelevant). React, React Native (skills still relevant)
# Job 4 (2 yrs)
- Actively working
- Learnt Vue (skills still relevant)

> I would argue that any apps written in Objective C are probably technical debt now
I switched to Swift a few years ago after many years of obj-c. At first I was reluctant, as there were still many things I liked about obj-c, but Swift won me over. I thought I would never touch obj-c much again, until I had to integrate a C++ library; I can't believe how much I forgot in such a short time. It was interesting to find out how to structure the obj-c code to translate nicely to Swift though.
visual basic, asp, coldfusion, foxpro, activex, flash, and silverlight, windows ce, asp.net, webforms? proprietary, proprietary, proprietary, proprietary, proprietary, proprietary, proprietary, and proprietary
and mostly pretty dead as a result
how about the non-proprietary things in the list? html, css, js, fortran, java, ruby, and rails are all just about as alive as they were 10 or 20 years ago, if not more so, except that rails didn't exist then
the exceptions are perl, objective-c, and the js frameworks. perl, ember, and backbone aren't going to disappear anytime soon but they will likely continue to stagnate. but unlike silverlight or windows ce, you can probably run your perl and backbone and angular and react and swift code 10 or 20 or 30 years from now on whatever platform people like then
unless the platform is centrally controlled by an owner who forbids it, so try to avoid platforms like that
(java applets were already dead 20 years ago, and soap sucked from the beginning, so these examples are out of place)
there is certain knowledge with a very limited half-life. but hopefully you aren't spending most of your time learning react apis or wasm instructions or editor keystrokes or chromium bugs, but rather general principles that transfer across domains. algorithms, reasoning, type theory, math, writing skills, hierarchical decomposition, scientific debugging, generative testing, heuristic search, that kind of thing. a lot of that stuff goes back before computing
and most code has an even shorter lifespan than the knowledge we use to build it. which is as it should be: most code is written to solve a problem that won't last decades or even years, but it's still profitable for companies to have it written. modifying a big system is harder than modifying a small one, so it's better to maintain just the code you need for today's problems. writing code and throwing it away is mostly fine
still, i've spent my career on free-software tools, and their half-life seems to be a lot longer, about 25 years. i reviewed some of the stuff i'm using right now in a comment on here 10 days ago, https://news.ycombinator.com/item?id=35829663
i know a guy whose preferred programming editor is ex. the non-full-screen version of vi
It's intriguing, because that implies he can work on most things in a way that "fits inside one's head."
Do you know if he extensively calls out to external tools, custom aliases, and the like?
I imagine there is some fullscreen mode where he can--temporarily--suspend ex, make use of custom sourced mappings, and the output is piped back into his editing session.
It's hard to find certain tidbits, so I'm always glad for even the smallest details.
Thank-you.
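For anyone curious what working in ex actually looks like: since it is a line editor, the same commands you type interactively can also be scripted. A minimal sketch, assuming a POSIX-ish system where `ex` is installed (the filename is made up for illustration):

```shell
# Create a throwaway file to edit
printf 'hello world\n' > /tmp/greeting.txt

# Drive ex non-interactively: -s is silent/batch mode, commands come
# from stdin exactly as you would type them at the ex ':' prompt
ex -s /tmp/greeting.txt <<'EOF'
%s/world/there/
wq
EOF

cat /tmp/greeting.txt
```

Interactively you would see only the lines you ask for (`p`, `.,+5p`, etc.) rather than a full screen, which is presumably why it forces a program to "fit inside one's head".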
And let's not forget the influx of junior developers during that time. I mean, we can't blame them entirely, can we? SOAP standards were complex and enormous. It's no wonder they struggled to grasp the underlying paradigms. We had an army of fresh faces flooding the scene, and the sheer complexity overwhelmed them.
So, SOAP ended up being the baby tossed out with the bathwater. It had its merits, but the challenges it faced were just too much to bear. Still, it's worth reflecting on its strengths and the lessons we can learn. Maybe someday we'll find a way to strike a balance between the elegant core idea and the practical realities of implementation.
Ruby is the 8th most popular programming language. Ahead of C, C#, and a bunch of "cool" languages like Scala, Kotlin, and Rust.
Source: https://madnight.github.io/githut/#/pull_requests/2023/1
Remember Crystal Reports? Oh god, the pain. The pain.
OSS tends to have a much longer relevant shelf-life, and experience with it, especially its internals, remains a highly transferable skill.
My work in and around Apache and CNCF ecosystems has been the only code I have written that truly endures in a good way rather than an ossified and decrepit legacy way.
(I guess here I should mention the context that I'm not coming up on 30 years; instead it's a little over 10.)
The "no longer in use" part is something I think I ultimately disagree with. It's kind of an "application of Theseus" situation. Where did this data really come from? If it was an older application, did that application ever really go away or did it just become what replaced it? Anyway, I guess I just have to hope I still have this outlook in ~20 years.
My dream retirement job is to work at a tax agency, like CRA or IRS, and help maintain their mountain of COBOL. I don't know COBOL, I don't know the ecosystem around it, but I absolutely know that I would love to learn it.
My first job was in a similar environment, supporting a homegrown application (which had grown out of a long-defunct commercial application) running on a Pick-style database system (UniVerse). The whole thing could trace its roots back to a Prime mainframe. Reading the code, especially the older stuff, was such an adventure.
ScarletDME[0] is seriously scratching the itch to play in this world.
The barnacle did provide more lifetime business value after all.
(I'm more committed to the bit than this position though, it's a judgement call that an engineer must make relative to the requirements and resources available.)
Utilizing the Lindy Effect to my benefit, I've been able to almost entirely avoid the typical framework frustrations, such as breaking changes and abandoned dependencies.
Instead, I've been able to focus on features, figure out a good code style, and support nearly all mainstream or once-mainstream browsers. On the back-end, I'm working on the Windows install process, but on *nix it's fairly uniform across different flavors and lineages.
For those curious, it is Perl, text files, PGP, HTML, CSS, low-sugar JS, and a little bit of sh, Python and PHP for server glue. Now probably to include batch files...
- you solve said problem
- profit
- competitors catch up and hit the same problem
- 3rd party (or one of the 1st parties pivots) notices all of you have the same problem. implements solution to fix the generic version of the problem: a standard is born
- hubs appear that make it easy for your competitors to eat into your market share because they now use the 3rd party solution and are more agile.
- you can't integrate because your solution is not complying to the new "standard"
- rewrite is needed
(edit: I hate HN formatting)
How did technical debt become so bloated and meaningless? Isn't it "remaining half-baked/incorrect code, known edge cases, bad/slow implementations, or even bugs due to constraints imposed on the engineering team"? How are dead technologies imposed constraints?
Author is not talking about technical debt but experience that is not directly applicable anymore.
The typical thing some of us tried to avoid while others (or their employers) totally embraced these comfortable (advertised as productive and fast) and often proprietary tools. The contrast would be HTML, JS, C/C++, all closer to the bare metal of their respective targets. Or Java and Python as the two that won the race at the purely language level, as Rust is doing right now. Java has gotten more and more connected to its typical business frameworks these days, so beware :)
Some of them died just with their platform as ObjC/Swift would die with Apple, Kotlin with Android these days.
It seems a split between language and environment/frameworks has proven to provide some stability at least.
Rare (and funny) to see such a "consistent" list, though :) There might be something like "consulting work" as the recurring theme in it. As a product/system/platform developer one might have put (and would have been forced to put) more effort into selecting tools for long-term availability, where they are not forced upon you by the platform or time-to-market considerations.
Sure, all code rots, frameworks and even languages come and go. That's not technical debt, to me, though.
Put another way, if everything is tech debt, then there's really no point to having the term at all.
Had the author invested in learning stuff like math, physics, cryptography, advanced data structures and algorithms, functional programming theory, 3d programming, compiler theory, database theory, etc...
None of it would be obsolete today, and all of it would be almost immediately applicable in any language that happens to be the flavor of the day at a given point in time.
Let that be a lesson to all of the folks who become extremely proficient in the latest react-like fad javascript framework: in 10 years time, all your knowledge will be useful for one thing: maintaining and patching old crumbling code piles that no one wants to touch.
Do yourself a favor instead and go learn timeless things that will still be completely relevant 20 years from now. Then spend a minimal amount of time learning how to apply it in whatever language/framework of the day is fashionable at your job.
C++ merrily carried my backend solutions over the same timeframe with the same results.
Same for C when writing firmware.
Same for JavaScript / HTML for browser based front ends.
In all of the above I used some domain specific libs but stayed away from big frameworks as those come and go pretty fast.
I consider none of these tech debt, as they let me concentrate on the product rather than dwelling on what tech I use. I've never felt inferior for not using the new and shiny doodad, as I've always delivered superior products, and that is what mattered to my clients.
Yes, I had to program in a whole bunch of other languages upon client requests, but this was rather rare. I have a good track record in creating new products from scratch and that is what my clients really want. They mostly do not care what I use for development.
So no. I do not feel that tech debt at all and I still play with other tech a little to stay current in case it is needed by client.
If you pick new technologies, don't be surprised when they are quickly overridden by something else. Pick stable and boring frameworks and languages. E.g. lots of old Java Swing applications run on today's computers without recompilation.
Lisp on the side, plus the Unix cruft that goes with build systems: shell, awk, make, ...
I went through the language churn as a kid. BASIC, assembly languages, Pascal, Modula-2. Studying CS pulled me into C world; all the upper level coursework at that time was systems programming in C on Unix, whether it be compilers, networking, distributed systems, operating systems or computer graphics or what have you.
I didn't think I'd be cranking out C for another 30 years after that, but I also didn't think of any reasons I wouldn't be.
Not everything I worked on is around; but the skills left behind are entirely relevant. There is hardly any technique I ever used that is inapplicable.
On the other hand I'm currently working on Android, where everything seems to be obsolete within a year...
Hardly.
> What once made it unique is now available in other languages.
Sure, yay open source.
You still can't get as much out of the box with one CLI command for a new rails app in node.js. I would love to be corrected on this.
Still works and the codebase is still being incrementally updated ~20 years in.
A stint with Laravel, which was a huge task to add, and an even bigger task to eventually remove; an update by Laravel that was not backwards compatible made us rethink the cost/value.
Javascript on the frontend, with no libraries as critical infra except for HTMX.
Was once a contributor to Mootools; now use quite a bit of homegrown funcs, and use libraries for spot instances (Alpine, Uppy, QuillJS - all of which are written modular with intention to be traded out.)
MySQL and eventually Postgres. "NoSQL" only when it was added (as JSONB) to Postgres.
All of those years of coding are an asset, not a liability.
Don't go chasing the new and shiny, it doesn't last. (And steer clear of Web3, will ya?)
EDIT: We use Tailwind - and time will tell if that is a mistake.
I beg to differ. There are 1000+ Rails developers actively looking for work on railsdevs.com
- SQL
- Terminal
- OO programming (travelled pretty well from C#/Java to Objective-C to Swift)
Then there's stuff that is still called the same but completely unrecognizable compared to the time I was good at it like HTML / JS / CSS
1. If the codebase would still work on a "dead platform", but you are targeting something new, you are chasing after hype, to some degree.
2. If the code involves significant data entry, you probably want to leverage a spreadsheet, because eventually you will want to edit something tabular and also make charts and reports. The things "dashboards" do are also very reasonable to do from inside a spreadsheet - make a tiny shim to pipe in some data, and then make the dashboard frontend using all the built-in goodies.
3. If the code is mostly about making custom UI, you are on the path to saleable application software, or at minimum, a tech demo that people will talk about.
C is not deprecated. Maybe in a 100 years it will be, but it seems unlikely.
It's very rare to find someone who has C as their favourite language, but it's incredibly persistent.
C is the opposite of the latest JS framework.
I regularly fix bugs in open source, where when checking when the bug was introduced the trail often runs cold in the mid to early 90s, as it predates that project's use of source control.
But yeah, if you have a 20 year career of following what is clearly the latest fad, then you'll have that experience.
Java applets were never "big". They were a fad, and promise of big. But they always sucked, even by the standards at the time.
One day the apps (products) may themselves become technical debt, but that is not the fault of the language they were written in.
I have heard this with great emotion for at least 6 years now and nothing. It’s starting to look like bitcoin: all these dreams and ideals that come to nothing in practice but a Ponzi scheme.
The funny thing is that it clearly indicates nobody knows what they are doing. People advocating for WASM want DOM bindings because the DOM is great and JavaScript is the great evil. Most people who actually write JavaScript professionally feel equally about the DOM, a great evil, and so they are completely reliant on some massive framework to justify their existence.
Sorta like how LISP might be the "Guns N' Roses" of coding; not everyone likes it, but in their hearts they know it has greatness on some level. And no one can argue with their staying power :D
I guess I don't particularly care if my code lives on or not. I build something. It's hopefully useful for others or myself for some period of time, then its gone.
It's just lines I've written. Why would that be more important than other lines I've written? In other careers I never thought about how my work was eventually going to be forgotten. That's just how life goes.
If you want to build something permanent and long-lasting, go build bridges. It does seem kind of silly to expect software to be around for a long time.
In my career I've gone through so many tech stacks, most of which are dead or dying today. A lot of my projects you could today replace with a couple API calls or an AWS product.
At least I like to think I learnt a few things along the way that make me a better developer. I never know what to say when recruiters or interviewers ask me about my tech stack experience expecting some sort of very specific answer; to me it makes as much sense as asking whether I have Firefox experience, or just Chrome.
Javascript is an ECMA standard. I'll just do the JSDoc hack and cheer for TC39 rather than waste my time with a technology that's great for now but will probably end up being technical debt.
Think of how little somebody who uses C, bash, vim, and javascript has had to adapt in the past several decades, all because they avoided technologies where the buck stops with a community rather than a corporation.
Corporations are cool and all. I'm not a dirty hippie. But at the end of the day they don't make money when things don't change.
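The "JSDoc hack" mentioned above generally means annotating plain JavaScript with JSDoc types and letting a checker (such as `tsc` with `checkJs`, or an editor) verify them, so there is no compile step at runtime. A minimal sketch; `toSelector` is a made-up example function:

```javascript
// @ts-check
// Plain JavaScript, typed entirely via comments. A type checker reads
// the JSDoc annotations; at runtime this is ordinary untyped JS.

/**
 * Join a list of tag names into a lowercase CSS selector.
 * @param {string[]} tags
 * @returns {string}
 */
function toSelector(tags) {
  return tags.map((t) => t.toLowerCase()).join(", ");
}

console.log(toSelector(["DIV", "SPAN"])); // prints "div, span"
```

Passing, say, a number in `tags` is flagged by the checker but costs nothing in the shipped code, which is the appeal: the types live in standard comments that TC39 can never break.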
Do walled gardens not last as long I wonder?
ALL of your code will be dead in a few years. The company goes under, or some punk CTO comes in and commands a "full rewrite ASAP!". Even with normal evolution and refactoring, your original work will be unrecognizable.
Either way, you take experience and friendships from your employment, you are not going to be the Picasso of software engineering.
But I appreciate the Picasso joke. Too many devs try applying over-engineered abstraction to simple problems.
------
From this place I want to greet one such "Picasso" whose code from over 20 years ago I was debugging last week... I know your name...
My favorite job -- to refactor something old. If only businesses wanted to pay for it.
If it's that bad in hardware land I don't feel so bad about this virtual detritus we software people spew. At least it's cheap, eh?
Other than that, yeah, most of what I’ve made is no longer used by any significant number of people.
Not sure if there’s any wisdom to draw from this anecdote, but I did spend a lot of cycles in trying to perfect that library!
One exception: I think the jq tool could stay viable, if it made it into a standard toolset. It solves a defined problem.
> Basic, Silverlight, ColdFusion, asp...
I'd add: Lingo, Flash
Basically if Microsoft, Adobe or any other entrenched player makes it easy for you to develop in it, and they offer the only dev stack, then it's probably going to die quickly.
Counter case: Apple seems to be doing fine with Swift. So I might be wrong here.
The whole project will eventually disappear into history in favour of a rewrite which has been around since before the pandemic.
That would probably include the lion's share of system code in the Apple OS ecosystem.
I'll bet a lot of it hearkens back to the NextStep days.
Since it is a UNIX OS (all of the OSes), then there's plenty of good ol' ANSI C, as well.
I'm almost positive that every AAA app is still an ObjC BoM (Ball of Mud).
> Over time, you can see how almost everything you create gets scrapped and replaced for various reasons or is now based on old technology.
Is this problem fixable by using some great technologies? React code does not rot, and Common Lisp applications are famous for going the longest without rotting among programming languages.
The underlying concepts stay the same. Your own experience is much improved.
Some old technology gets replaced? I celebrate the demise of Flash.
I am not Flash or even a Flash developer. I am a software developer.
Am I a bit sad Dlang is not more used? Yes, I am. But I am not a "Dlang" developer.
Tying yourself to some technology seems self limiting.
I wouldn't be so sure about that. Fallen out of fashion perhaps, but still very stable, easy work with, predictable and with pretty clear best practices that don't change every three months.
I am glad I am not the only one who hit this roadblock with Elasticsearch. It felt like always trying to play catch up.
This is probably the most damning thing about the field. We don't invest in developers, so why would they invest in the business?
I mean, I don't like it but...
Contrary to the author, for me, over my entire career, I feel like I have been relatively incredibly lucky with my choices of 1) What (web) technologies to invest lots of time into, and 2) Therefore what tools I use in my own projects and at companies I have worked for.
I think that 50% of the reason is down to when I was born and therefore when I started my software engineering career (for example, therefore avoiding lots of bad alleyways that web dev went down), and 50% was just common sense (realizing early-on that X tool was obviously going to be superior to Y tool).
1. I was fortunate enough to be born at a time such that I just about missed the AngularJS --> Angular2 betrayal + debacle.
2. I realized, through playing around with the Angular2 beta around 2014 (IIRC), that it was going to be inferior to React. I remember that at the time React was this tiny thing that had existed for around a year or so, but it was clearly conceptually superior, with more talented Facebook developers behind it vs the Google developers behind the early Angular2.
Not to be too disparaging or disrespectful, but at least in the early days, it felt like Angular2 was made by a team of your usual C++ Google engineers who didn't have a lot of experience with web dev. Just being honest here...
This led me to A) selecting React for my personal projects, and B) advising companies I worked for to pick React. This meant that there are React codebases that I contributed to ~8 years ago that are still active at companies today.
3. Through sheer luck, I started my true web dev career after the "random crazy insecure web technologies" era such as Flash, Java applets, and all that nonsense.
4. Through sheer luck, I started my true web dev career shortly after Typescript was created, and I just happened to use a framework (now non-existent) which used it, co-introducing it to me in ~2014. This meant that I have been able to create relatively maintainable Typescript codebases, mostly avoiding creating any quick-to-deprecate JS mess-heaps.
There are a couple more examples, but I think that captures why I feel so lucky. I feel bad for the engineers that, through a combination of bad luck (time born, time entering the field, etc.) and lack of foresight, were led down paths causing them to invest a tonne of time into technologies that were just never going to be around for the long-run.
My time will likely come where my luck runs out, and I end up investing a tonne of time into some nonsense tech that initially seems solid but ends up rubbish for whatever reason, but for now, I'm quite pleased :)
It's pretty frustrating. I keep trying to do courses to keep my skills up-to-date, but it doesn't matter. Those courses by themselves are already outdated a year later. The only things that have been solid in my career, is the CS knowledge and theoretical knowledge from books like 'designing data intensive applications'. That stuff rarely changes. The rest is like waves in a sea. They come and go.
Paint that shed.