It's a wonder the thing ever worked at all.
Of course, things got better as time went on thanks to process improvement. I started in 1991, and I remember driving over to the NT team's building with a large (for the time) hard drive to grab a physical copy of the source tree. This was before NT was first released - when you tried running your build and went to shut it down, you had to watch the activity LED on the drive flash a few times to be sure the cache had synced to disk and powering down was safe. Fast forward a few years, and building all of WinNT was more routine, to the point it was just another component built by the automated VC++ checkin procedure (we called it submitting code to The Gauntlet), along with Excel and other Office components.
I might be misremembering if NT was part of Gauntlet, but it was definitely something we could and would build as desired.
Windows 2000 was certainly complex, but was it really substantially more complex than a full Linux distribution (including compilers, desktop environment, office suite, etc)? Why was it so difficult to build from scratch?
I'm pretty sure XP came with a digital app store right at the top of the redesigned Start menu, and Windows Media Player had ten music stores integrated, all selling DRM'd WMA files...
The system requirements, especially, must have created a lot of work right down to the kernel team.
Like, rhetorical dude from 2002, you're mad that Windows XP will not let you remove Internet Explorer easily and that it requires online or phone activation to work? Let me tell you about Windows 11...
Yet another thing where Microsoft was ahead of the curve; nowadays we get Electron (aka Chrome) all over the place.
People even buy laptops where the browser has turned into the OS!
Apologies. This was me. Pretty much all 10. Most of them were just white-labels of the same code. Believe me, I hated doing it. MS didn't want to do the right thing and vertically integrate everything like Apple was doing, which was the better solution, as then you owned the entire user experience end to end.
We know how that story ended.
I think people are forgetting how unreliable Windows was in its early days. If you were doing anything complex (programming, editing pictures, ...) Windows couldn't run for two hours without crashing.
If anything, the core of the Windows operating system has only gotten better with time. Yes, they keep adding fluff to the desktop environment, but that doesn't take away from the progress they have made in stabilizing their core operating system.
I'm really curious which version of Windows you mean?
Because I don't remember this on Win 3.11, Win XP, Win 95, etc. Of course there were sometimes HW/driver issues, and sometimes a program would corrupt system files, etc. But crashing every so often... that's strange.
Not to mention being as easy to attack as a house made of butter.
That sounds like Windows 3.1, where applications could easily take down the operating system. Windows 9x wasn't quite as bad. If I recall correctly, properly written applications could not take down the operating system, though drivers certainly could. That said, there were certainly ways for developers to break the rules since there was little (if any) enforcement, so some applications did take down the operating system. With the Windows NT series, there was sufficient isolation, and enforcement of that isolation, that it was very reliable. Drivers could be an issue, as could bugs in Microsoft's code, but that was nothing in comparison to contemporary versions of 3.1 and 9x.
On the whole, I don't think it is reasonable to blame Microsoft for the unreliability of their operating system. There were certainly design issues that resulted in it being unreliable, especially when running third-party code. On the other hand, the operating system was basically an evolution of a product line that started on the 8088 with very limited memory (I'm speaking of PC-DOS here), and a great degree of compatibility had to be maintained. Keep in mind, the computer industry did not all move at the same pace: features had to wait until processors incorporated them, processor adoption had to wait for manufacturers to build them into their systems, and then consumers had to buy those systems in sufficient numbers. For example: the 286 was introduced in early 1982, but the IBM PC AT did not come out for another 2.5 years. Microsoft was also limited by the hardware their customers owned, even when newer hardware supported particular features. Life is much harder when you cannot throw memory at the problem because people had 2 or 4 or 8 MB of RAM.
On the other hand, Windows NT was a completely different product. There was much less concern over compatibility. There was much more intent to throw away baggage to create a modern (for the time) operating system. It did not crash every two hours.
1. budget devices from OEMs that cut corners wherever they could
2. the capacitor plague, with merchants unable to guarantee good capacitors from any source
Windows, despite its legitimately annoying monetization strategy, has absolutely done the opposite - it does More Stuff every release, and the stuff it did before largely still works.
Do you have some examples of how macOS is doing less / capable of less today, than say 1 or 2 or 3 releases ago?
Another would be fragmenting the settings between the Control Panel and the new Settings app. It does more stuff (you have twice as many settings apps!) but it is less useful, because you are less likely to find the setting you are looking for.
Another example of doing more and becoming less useful is requiring a TPM for Windows 11. My security should be my decision. Not letting one install Windows obviously makes Windows less useful than if it could be installed.
In general (i.e., not a Windows-specific issue), ever-growing hardware requirements make the software less useful over time, as it can only run on a smaller and smaller subset of hardware. As software gets better, it should run on more hardware than it did before, not less. Windows will simply not run on hardware from 15-20 years ago that is otherwise fully functional. That means it is less useful than it was before.
I wouldn't say "doing more" is better. I'd be happy if it did a lot less. I don't care about most of the big new features in Windows. I'd be a lot happier if they'd rework their old antiquated stuff that keeps causing problems (drivers, registry, focus handling, etc.).
> Apple is making its desktop OS more "secure" (read: convoluted and does less stuff)
What is Apple really making less useful with time? For me, I really like many of the new features. The only reason I stick to Windows is that gaming is still horrible on macOS.
I agree there are definitely shitty chunks of Windows, but there are still some very solid foundations there to this day.
https://devblogs.microsoft.com/dotnet/working-through-things...
Also, they have about 10k people working on Windows (and devices) and about 10k people working on ads nowadays (that paints a good story of priorities).
Source: 2nd hand from MS friends
But I miss the ability to pull down only a portion of a monorepo, the ability to remap where folders are, and the ability to pull down a single folder into multiple locations.
So much bullshit with monorepos in Git land exists because Git doesn't support things that Source Depot (and Perforce, I presume) supported decades ago.
As an aside for those who don't know what I am talking about, when pulling down a repo in source depot you can specify which directories to pull down locally, and you can also remap directories to a different path. This is super useful for header files, or any other sort of shared dependency. Instead of making the build system get all funky and fancy, the source control system handled putting files into expected locations.
So imagine a large monorepo for a company: you can have some shared CSS styles that exist in one place, and they always end up in every project's `styles` folder or what have you.
Or the repo keeps all shared styles in a single place and you can then import them into your project, but instead of build system bullshit you just go to your mappings and tell it to pull the proper files and put them into a sub-directory of your project.
It is a really damn nice feature to have. (That also got misused a ton...)
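For anyone who never saw it: Source Depot was derived from Perforce, so a Perforce-style client view gives a rough idea of what those mappings looked like (the client name and depot paths here are made up purely for illustration):

```
Client: alice-excel
Root:   C:\src\excel

View:
    //depot/main/excel/...          //alice-excel/...
    //depot/main/shared/styles/...  //alice-excel/styles/...
```

The second line is the remapping trick: the shared styles directory gets synced straight into the project's own `styles` subfolder, so the build never has to know those files actually live elsewhere in the depot.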
We have all that with git in Microsoft though. We don't check out the entire office monorepo - only the parts relevant to what you're working on (Excel in my case).
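For the curious, the rough Git equivalent is a partial clone plus cone-mode sparse checkout; this is only a sketch, and the repo URL, branch, and directory names are invented:

```
# clone without file contents or a working tree, then pick the directories you need
git clone --filter=blob:none --no-checkout https://example.com/office-monorepo.git
cd office-monorepo
git sparse-checkout init --cone
git sparse-checkout set excel shared/headers
git checkout main   # materializes only the selected directories
```

Sparse checkout limits what gets materialized; it doesn't remap paths the way a client view does, so shared code still sits at its real path in the tree.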
Also, sharing stuff in SourceDepot wasn't the bad part (you got links to changelists and those opened in a desktop program). The bad part was the branching model, the commits, no real/good CI (we had a commit queue), etc. SourceDepot was just overall a bad SCM for us.
I’m moderately confident the correct path is monorepo + centralization + virtual filesystem. Not every tool plays nice with VFS but at this point most do.
The D in DVCS is almost entirely a waste. Source control systems should, imho, trivially support petabytes of history and terabyte scale clones.
We did move stuff we could to other git repos inside Microsoft.
SourceDepot is still running for some stuff and is still awful but git is working great.
> Also, they have about 10k people working on Windows (and devices) and about 10k people working on ads nowadays (that paints a good story of priorities).
I'm not sure I'm privy to all the information, but looking at the org chart, this part is false. The ads org is much, much smaller than E+D.
> Windows is only $5m a year
https://news.ycombinator.com/item?id=34934946
I was very impressed to work out that's only about $416k/mo. Ever since I read it I've been thinking "that can't be right." (There's certainly no qualification of scope to work with.) That's roughly 15-20 senior developer salaries (~$250k-$333k each).
I'm very curious how and where Windows practically fits into the pie chart nowadays, mostly just from the perspective of a passively curious person who likes to file away watermarks and yardsticks :)
There's probably some perfectly good externally-facing info out there under a rock somewhere; I'm just not sure where to look...
Look at Panos' org and compare to the WebXT org (both under E+D).
An interesting question is: why are we still having the same problems today? why haven't they been solved yet?
https://www.amazon.de/Show-Stopper-Cloth-BREAKNECK-GENERATIO...
Dave was an engineer on NT and creator of Task Manager and zip folders. Lots of interesting stories and anecdotes from that period on that channel.
Serialized Development (the model from NT 3.1 -> Windows 2000):

- All developers on the team check in to a single mainline branch
- The master build lab syncs to the main branch and builds and releases from that branch
- A checked-in defect affects everyone waiting for results

Diagram: Developer, Developer, Developer, Developer -> Single Main Branch -> Product Build Machine -> Product Release Server

Ouch, and it looks like they only had version control with branching for the last nine months of development.
You had patches you'd float as "changelists" on top of enlistments. Each part of the org large enough (for example Excel or Word) gets a "branch", and it gets "forward integrated" and "reverse integrated" to and from the main "branch".
From your perspective, using the tool to submit stuff (usubmit, usually), you just push to the same branch as everyone else in your org, and if your code breaks things it gets "backed out" by an automatic process.
Using git now is so much nicer.
Windows sources 20 years ago used to have a ridiculously complicated branching strategy, driven by middle managers and made worse by having actual devs sneak around the edges to do "buddy builds" of changes with some godawful batch file that I heard may have originated with RaymondC (who was exactly the kind of person to make ridiculous MSFT somehow bearable for the rest of us). It was Conway's Law, somehow twisted and applied to version control. With permissions SNAFUs.
I still see companies today trying to map their org chart into their branching strategy and just shake my head . . . and run away.