One of the strangest things is when people over-interpret modest remarks as declarations of war against the fabric of sensibility and human decency. This seems to be a trend nowadays.
I don’t think the author of this article/talk meant to discourage people from all software other than the kinds he discusses, and it doesn’t read that way. It reads like a modest appeal.
* Prefer
* "Gravitating away"
* "even one of these suggestions"
* "better than nothing"
* "This isn't always possible. Sometimes we have to use software with millions of users. But I think it helps to rely on small-scale tools as much as possible, and to question tools for large crowds of people at every opportunity."
(I'm the author.)
The problem is that your presentation doesn't cover the various downsides of choosing "small-scale" software. There are tradeoffs, and not highlighting them explicitly is a disservice to readers.
E.g. you mention that large-scale software for millions is "expensive". But small-scale software is also expensive, in different ways. (Software used by only a few can cost more in time/labor/hassle because of missing features, required workarounds, lack of tutorials, etc.)
I've written several utilities in the "small-scale" software category for my friends to use, and that experience taught me that most people (who are not hackers & techies) should use "software for millions" as the default choice.
If you're someone who chooses small-scale software (e.g. your old Lua v5.1 anecdote), I think you're already part of a self-selected group and don't need blogs suggesting it to you. You're also willing to overlook the downsides.
I like most of Clay Shirky's writings on various topics but his particular essay on "Situated Software" which you cited is incomplete and misleading because it doesn't cover "software rot": https://en.wikipedia.org/wiki/Software_rot
I used to try a lot of apps to produce documents and structure my thoughts: Notion, Google Docs, Word, Calc, Excel... It was fine, but I agree that it's sometimes a pain. Google Docs runs very slowly on my computer, Windows overheats my laptop, etc.
Finally I discovered Emacs. I'm pretty bad with Elisp, but I love it. Now I just use Emacs, a couple of web browsers, bash, Python, and some Perl scripts.
While this article oversimplifies some things, it has valid points for specific cases. BTW, I like this web design!
Do you mean Emacs with Org-mode, or just plain Emacs? I'm curious, because I don't know much about Emacs but I've heard a lot of good things about Org-mode.
Org-mode is just a mode; Emacs has a ton of modes, major and minor. Vanilla GNU Emacs already comes with Org-mode.
It was six months ago. Org-mode blew my mind. It was a transcendental experience; I was thrilled. It had everything I wished for and much more. Thanks to it I discovered Emacs, and I love it: it makes my life easier. What's great about Emacs is that you can answer any question you have with the manual. It's awesome, mind-blowing.
But it has a steep learning curve. I got through the vim tutorial in less than an hour. Going through the Emacs tutorial took me four afternoons, one hour each day. It took four days because it's too much for one sitting. I almost gave up; I'm happy I didn't. You have to take it slow: there are a lot of crucial key bindings to learn.
If you like writing or organizing ideas, notes, etc., I recommend it; at least try it and then decide for yourself. But you have to put in some time to get familiar with the Emacs ecosystem. Approach it like a pianist: slow and steady makes you learn twice as fast.
I'm just a novice, but this summer I want to learn more Elisp and try to write a package. I hope this isn't off topic. Emacs fits the description in the article, right?
As someone who could've developed an IT/programming career, but didn't because I felt things were already bloating back in the '00s, I agree with the majority: "harvesting your own food" can be rewarding but also a tedious and thankless job. It's certainly not for everyone, but if it works for some people then it is (let's put efficiency aside for a moment) perfectly valid. In fact, being more of a H/W guy I find myself gravitating towards this approach more often than not. Leanness and reproducibility are key for my workflow (I went the RF-world path); I can't afford different end results when a dependency changes/breaks something.
IMHO, keeping up with the modern paradigms for S/W development looks like a never-ending nightmare. Yes it's the modern way, yeah it's the state of the art. Still, I didn't feel it was a wise investment of my time to learn all those "modern dev" ropes, and I still feel that 20 years later. I'm nowhere near antiquated and I'm on top of all things tech (wouldn't read HN otherwise), it's just...
I see former friends/classmates that went this way, and they're in a constant cat-and-mouse game where 50% of the time they're learning/setting up something dev-chain related, the other 50% doing actual work, and 98% of it feeling way too stressed. I see modern Android devices with their multi-MB apps, bloated to hell and beyond for a simple UI that takes ages to open on multi-core, multi-GHz SoCs. I see people advocating that unused RAM is wasted RAM, never satisfied until every byte is put to good use, reluctant to admit that said good use is just leaving the machine there "ready" to do something, but not doing anything _productive_ actually.
And yet.
Without that bloat, without the convenience of pre-made libraries and assist tools for almost every function one could desire, we wouldn't be where we are now. Imagine for a moment doing AI work, 3D movie rendering, data science, etc. with a DB-less approach on single-core machines with every resource micro-managed to eke out the most performance. It's simply not feasible; we would still be in the '90s... just a bit more hipster.
This article resonates so well with me. And at the same time, it feels so distant.
But an essential component of this plan is for non-programmers to articulate early and often their desire to migrate away from the current monopoly they are forced to use.
With all that said, my sense is that hardware engineering has its own heap of Sisyphean problems and complexities. I definitely would not go back to working on hardware engineering problems like I did super early in my career (a mix of embedded firmware, device drivers, PCB design, and web development). I shudder at the thought of ever working with anything Verilog/VHDL, Xilinx, or SPICE ever again, or debugging PCB designs on the bench top in the lab with an oscilloscope and a logic probe. At least in school I ran more than a few bodge wires to patch a mistake in a PCB design iteration. Maybe in some sense, it's a blessing that those linear systems theory abstractions fall apart utterly in RF engineering problems, and one has to contend with the fact that all circuits radiate. At least circuits that still contain the magic smoke.
In many ways it's still the same. Transformers use matrix multiplication as their main operation, and the underlying matrix multiplication libraries have mostly seen incremental performance improvements over the last two decades or so. Most other ops in e.g. core PyTorch are implemented using C++ templates and would be mostly familiar to a 2008 C++ programmer. Most of my work is largely C++/Python/Cython, as it has been for the last 1-2 decades. Sure, the machine learning models have changed, but those are relatively easy to pick up.
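To illustrate the point (a rough sketch, not code from anyone in this thread): a single scaled dot-product attention head, the core of a transformer, really is just a few matrix multiplications plus a softmax, the same primitives BLAS libraries have been optimizing for decades.

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention: three matmuls and a softmax."""
    scores = q @ k.T / np.sqrt(k.shape[-1])          # (seq, seq) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v                               # weighted sum of values

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((4, 8)) for _ in range(3))
out = attention(q, k, v)
print(out.shape)  # (4, 8)
```

Everything above bottoms out in `@`, i.e. GEMM, which is why the low-level performance story has changed so little.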
I understand the desire to grow your own food and bake your own bread, as a response to the onerous, distressing complexity of the systems we live inside, but you have to accept the cost of that being more work and less connection.
Not to mention platforms like the one we develop make deploying free-software-based self-hosted solutions push-button, so there really isn't any excuse for sticking to the faceless "crowdware" providers.
Those are not alternatives, though. It most certainly does mean more work, and even something as good as achieving mastery & empowerment in this area of your life has an opportunity cost that might not be worth it for everyone.
Which isn't to say I think I should have been a front-end webdev chasing after the next new shiny every 3-6 months so I felt like I had my pulse right on the bleeding edge of new technology.
But I use a Jetbrains IDE now quite a bit to write code when I used to be a vi/vim user.
The amount of quality variation that we are willing to accept in our houses is much greater than what we are willing to accept in an automobile. The quality of the automobile is BECAUSE we make millions of the same car over and over again.
I suppose your point is even more valid if we switch to planes. But I'd counter by comparing the levels of regulation between planes and apps. Houses vs apps feels like a fairer comparison on that score.
With software would you expect better quality from an operating system that was one of 50 options each with 100 million of users or something that was one of 5,000,000 options each with 5,000 users? If operating system is an odd corner case, then substitute "Library to Handle Timezones" or something else.
I guess that explains why things like modern refrigerators are so much more reliable and durable than the ones that were made 20 years ago. /s
By that reckoning, modern cars should be just about perfect, modulo certain modern expectations like not emitting greenhouse gases?
The reliability of a modern mass-produced automobile is incredible. If a software company could produce software with as few defects as a modern automobile, they would rule the world.
Yep, guess which software shows up as a toy CTF challenge for the weekend? Just because you can understand how something works doesn’t mean it’s secure.
* Starting from Lua which seems to have a decent security story;
* Changing a few lines of _safe_ Lua for yourself without introducing new buffer overflows and so on;
* and limiting the reach of those changes to a few thousand people _at most_. (99% of forks won't have even that, thanks to the tyranny of the power law.)
Your comment is very much something I think about. I don't think it's as cut and dried as you make it sound. It seems worth exploring. It seems analogous to doing controlled burns every year to avoid humongous wildfires.
> My first resolution is just to bring less software into my life. It is still early days, we don't really understand computers yet.
While philosophically kind-of true, this approach - especially for things to come - will quickly get you lost in the modern "software-eaten" world, IMHO.
https://spectra.video/w/a1dvx7y6gy1k9pnqDjQBoD (PeerTube)
https://archive.org/details/freewheeling (Internet Archive)
If oversimplifications are being used to prove a point, the argument becomes weak.
I actually read it some months ago, because I'm interested in the smol net, and I use and like the Gemini protocol quite a lot.
Unfortunately this post rants against perceived software obesity rather uncritically.
https://pavelfatin.com/typing-with-pleasure/
which shows Gvim on Windows with a maximum latency of 1.2ms - much faster than the 30ms of the vintage Apple //e or TI 99/4. Even Eclipse, which usually feels slow to me, came in at a max of 20.8ms. (1.2ms seems surprisingly fast to me, as I would expect ~2ms of average frame latency even on a 240Hz monitor.)
Which study is correct? Are both of them measuring touch-to-display latency?
It's also worth noting that those vintage 8-bit microcomputers typically drove CRT TVs at NTSC's ~60Hz field rate (30 interlaced frames per second), while modern monitors often run at 60Hz, 120Hz, or more (possibly with variable refresh), so the modern displays can have lower frame latency.
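Back-of-the-envelope, and assuming no synchronization between input and refresh: an update waits on average half a refresh interval before it can reach the screen, so the display alone contributes ~8.3ms at 60Hz but only ~2.1ms at 240Hz.

```python
def avg_display_latency_ms(refresh_hz):
    """Average wait until the next refresh: half the frame interval."""
    return 1000.0 / refresh_hz / 2

for hz in (60, 120, 240):
    print(f"{hz:3d} Hz -> {avg_display_latency_ms(hz):.2f} ms average")
# 60 Hz -> 8.33 ms, 120 Hz -> 4.17 ms, 240 Hz -> 2.08 ms
```

That 240Hz figure is where the ~2ms expectation mentioned above comes from; any end-to-end measurement below it would imply the tool is racing the refresh rather than waiting for it.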
While I haven't tested it myself, I believe newer iPad/Apple Pencil 2 combinations have improved their input latency (they claim 9ms but I'm not sure if that's actually end-to-end touch-to-display latency in something like Notes or Procreate.)
By a helpful conclusion I don't mean just stating the facts or comparing transistor counts or input latency with network latency, as if developers stopped caring and created crappy software on purpose.
The post compares an Apple //e, a single-tasking system that just echoes the pressed key in the BASIC interpreter on the screen, with modern devices, where it's not always clear what kind of app or setup is being used. But we know there are plenty of layers of GUI and OS code that most people don't want to do without. Not to mention that most of the higher input lag is not detectable by humans under normal working conditions.
Yes, there were years when CPU performance couldn't keep up with added features, like live spell checking. I used computers through all those years and know this first-hand.
And I never dismissed the importance of input lag. I pointed out the oversimplification used to support the main argument of the linked post, which suffers from it as a result.