This is so often repeated, but I genuinely don't understand why. Could you try selling me on it? I ended up going the sysadmin/devops route instead after college, but the more I learn about Linux, the less I understand why anyone would choose it for personal, active manual use.
I can understand server deployments: it works well enough, it's available at no cost, Windows Server is way out at the other end of current desired behavior, and whatever pains it has, you get paid to make up for them. None of which applies at the personal-device level.
The most common selling points I see are more performance and less "spying". I find neither of these very persuasive, and I'm not interested in ideological rationales either (supporting free software). If you have anything else, I'm all ears.
I find this FOSS notion that developers only use Linux hilarious; I wonder who writes software for all the other operating systems in the world.
Windows still offers other options, even if MS itself tends to ignore them.
> If you are a developer you should have switched to Linux years ago anyway.
Developers are a much broader set than web developers, and even then the advantages of Linux escape me.
These days, WSL2 effectively eliminates the need for that for most developers.
Let's recall that the Godot developers learned this lesson as well, regarding their rendering backends.
The ICD mechanism is a kind of escape hatch left over for backwards compatibility, and even user-mode drivers build on top of DirectX runtime infrastructure.
It really depends how you define scope, but I don't think I would've taken on another GPU backend for that.
While I doubt this had anything to do with the Zed team's decision to make their own toolkit, it is something that's becoming more common. Hopefully it doesn't start happening in the encryption space.
There aren't too many epithets floating around that offend me specifically. And I haven't heard anyone say I shouldn't/don't exist. So it's hard for me personally to feel the need for CoC and the like. But I'm all for policy that protects everyone against that kind of abuse -- which seems to be on the rise. Are there better alternatives?
With that, this presents a HUGE opportunity for someone to build something akin to Zed, but without the baggage that their technical strategy brings.
Not sure it’s so clean-cut. More than avoiding baggage, you’re just shifting it elsewhere. The question is if you want to own (and can handle) the baggage and benefit from the control that brings.
If I switch to vanilla macOS, it’s basically unusable
Clean, but unusable
Zedless: Zed fork focused on privacy and being local-first - https://news.ycombinator.com/item?id=44964916
Sequoia backs Zed - https://news.ycombinator.com/item?id=44961172
It isn't just text editors—nowadays, everything renders on your GPU, even your desktop and terminal (unless you're on a tty). For example, at the bottom of Chromium's, Electron's, and Avalonia's graphics stacks is Skia, a cross-platform, GPU-accelerated 2D graphics library.
GPU compositing is what allows transparency, glass effects, and shadowing, and it makes actually writing these programs much easier, as everything uses the same interface and the same rendering pipeline as everything else.
A window in front of another, or a window partially outside the display? No big deal, just set the 3D coordinates, width, and height correctly for each window, and the GPU will do hidden-surface removal and viewing frustum clipping automatically and for free, no need for any sorting. Want a 'preview' of the live contents of each window in a task bar or during Alt-Tab, like on Windows 7? No problem, render each window to a texture and sample it in the taskbar panels' smaller viewports. Want to scale or otherwise squeeze/manipulate the contents of each window during minimise/maximise, like macOS does? Easy, write a shader.
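As a toy illustration of the quad-per-window idea (my own sketch, not any particular compositor's code): the compositor only has to map each window's pixel rectangle into clip space, and the GPU's fixed-function clipping handles off-screen portions for free.

```rust
// Hypothetical sketch: a compositor treats each window as a textured quad.
// Map a window's pixel rectangle to clip-space coordinates; the GPU's
// fixed-function clipper then discards anything outside [-1, 1], so a
// partially off-screen window needs no special handling at all.
fn window_to_clip(x: f32, y: f32, w: f32, h: f32, screen_w: f32, screen_h: f32) -> [f32; 4] {
    // Returns [left, top, right, bottom] in clip space (y points up).
    let l = 2.0 * x / screen_w - 1.0;
    let r = 2.0 * (x + w) / screen_w - 1.0;
    let t = 1.0 - 2.0 * y / screen_h;
    let b = 1.0 - 2.0 * (y + h) / screen_h;
    [l, t, r, b]
}
```

A full-screen window maps exactly to [-1, 1, 1, -1]; a window hanging off the right edge simply produces a right coordinate greater than 1, and the GPU clips it with no sorting or special-casing in the compositor.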
This was a big deal in the early 2000s when GPUs finally had enough raw compute to always run everything, and basically every single OS and compositor switched to GPU rendering in roughly the same time frame—Quartz Extreme on Mac OS X, DWM.exe on Windows, and Linux's variety of compositors, including KWin, Compiz, and more.
There's a reason OSs from that time frame had so many glassy, funky effects—it was primarily to show off just how advanced their GPU-powered compositors were. It was also a big reason Windows Vista fell so hard on its face: its compositor was especially hard on the scrawny integrated GPUs of the time, enough that two themes—Aero Basic and Aero Glass—had to be shipped for different tiers of GPU.
If you're saying that Zed is built on something like Skia, then it would already be cross-platform and not have to worry about Vulkan vs. DirectX, right?
Since it's more than quick enough to do this on the CPU, they're likely doing it for things like animations and very high quality font rendering. There's image processing going on when you really care about quality: oversampling and filtering.
I suspect one could do most of what Zed does without a GPU, but about 10 to 20% uglier, depending on how discerning the user is about such things.
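The oversample-then-filter idea can be sketched in a few lines (a toy example of mine, not Zed's actual pipeline): rasterise glyph coverage at 2x resolution, then box-filter each 2x2 block down to one output pixel.

```rust
// Toy sketch of oversampled rasterisation (not Zed's actual code):
// render coverage at 2x resolution, then average each 2x2 block down to
// one output pixel. Real renderers use better filters than a box filter,
// but the principle is the same: extra samples + filtering = smoother edges.
fn downsample_2x(hi: &[f32], hi_w: usize, hi_h: usize) -> Vec<f32> {
    let (w, h) = (hi_w / 2, hi_h / 2);
    let mut out = vec![0.0f32; w * h];
    for y in 0..h {
        for x in 0..w {
            let mut sum = 0.0;
            for (dy, dx) in [(0, 0), (0, 1), (1, 0), (1, 1)] {
                sum += hi[(2 * y + dy) * hi_w + (2 * x + dx)];
            }
            out[y * w + x] = sum / 4.0; // average of 4 samples = box filter
        }
    }
    out
}
```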
This is true until it isn't. A modern-ish CPU at 1080p 60 Hz will be fine. At 4K 120 Hz, even the fastest CPU on the market won't keep up. And then there's 8K.
> they're likely doing it for things like animations and very high quality font rendering
Since they're using native render functions this probably isn't the case.
It doesn’t really do a tech breakdown of why it’d be impossible CPU-side, but it mentions a couple of things about their design process for it.
I tried it and the experience (mainly visual: fonts, colours, etc.) wasn't very good, so I can understand why the Zed developers are reluctant to formally release Windows binaries.
This is incongruous given Zed uses modern frameworks (which is why they moved to D3D11 from Vulkan in the first place).
If Zed really wanted to target 'old Windows' then they might have used Win32 and GDI+, not D3D11. In fact they could've stuck to D2D (which was released with Windows 7 and back-ported to Vista), and not used their own rendering at all, since D2D is already a GPU-accelerated text-rendering API, and then used Win32 windowing primitives for everything else.
> but we got reports from users that Zed didn't run on their machines due to the Vulkan dependency
This single sentence is abstracting a lot of detail. Vulkan runs on Windows, and quite well. Looking at the bug reports, especially the last one[1]...
> Rejected for device extension "VK_KHR_dynamic_rendering" not supported
Aha, ambitious devs >:) The dynamic rendering extension is pretty new, released with Vulkan 1.3. I suspect targeting Vulkan 1.1 or 1.2 might've been a little more straightforward than... rewriting everything to target DX11. Large games with custom engines (RDR2, Doom, Doom Eternal) were shipped before this was main-lined into Vulkan.
But thinking about it a little more, I suspect switching out the back-end to a dynamic rendering-esque one (which is why D3D11 rather than D3D12) was easier than reverting their Rust code to pre-dynamic rendering Vulkan CPU calls; the Rust code changes are comparatively light and the biggest change is the shader.
That being said, it's a bit annoying to manually write render-passes and subpasses, but it's not the worst thing, and more importantly extremely high performance is less critical here, as Zed is rendering text, not shading billions of triangles. The singular shader is also not necessarily the most complex[2]; a lot of it is window-clipping which Windows does for free.
> we had two implementations of our GPU shaders: one MSL implementation for macOS, and one WGSL implementation for Vulkan. To use DirectX 11, we had to create a third implementation in HLSL.
I wonder why HLSL wasn't adopted from the outset, given roughly 99.999% of shaders—which mostly ship with video games, which mostly target Windows—are written in HLSL and then compiled with dxc to SPIR-V. HLSL is widely considered the best-specified, most feature-complete, and best-documented shader language. I'm writing a Vulkan engine on Windows and Linux, and I only use HLSL. Additionally, Vulkan runs on macOS with MoltenVK (and now 'KosmicKrisp'), but I suppose the Zed spirit is 'platform-native and nothing else'.
> symbolicating stack traces requires a .pdb file that is too large to ship to users as part of the installer.
Perhaps publishing a symbol server[3] is a good idea here, rather than users shipping dump files which may contain personally-identifiable information; users can then use WinDbg or Visual Studio to debug the release-mode Zed at their leisure.
[1]: https://github.com/zed-industries/zed/issues/35205
[2]: https://github.com/zed-industries/zed/blob/c995dd2016a3d9f8b...
[3]: https://randomascii.wordpress.com/2020/03/14/creating-a-publ...
You're right that we may be able to get rid of our WGSL implementation, and instead use the HLSL one via SPIR-V. But also, at some point we plan to port Zed to run in a web browser, and will likely build on WebGPU, where WGSL is the native shading language. Honestly, we don't change our graphics primitives that frequently, so the cost of having the three implementations going forward isn't that terrible. We definitely would not use MoltenVK on macOS, vs just using Metal directly.
Good point that we should publish a symbol server.
Except that everything has effectively converged on HLSL (via Slang, which is essentially HLSL++) and SPIR-V (coming with Shader Model 7).
So, your pipelines, shader language, and IR code would all look mostly the same between Windows and Linux if you threw in with DX12 (which looks much more like Vulkan) rather than DX11. And you'd get the ability to multi-thread through the GPU subsystem via DX12/Vulkan.
And, to be fair, we've seen that MoltenVK gets you about 80-90% of native Metal performance on macOS, so you wouldn't have to maintain a Metal backend, anymore.
And you'd gain the ability to use all the standard GPU debugging tools from Microsoft, NVIDIA, and AMD rather than just RenderDoc.
You'd abandon this all for some mythical future compatibility with WebGPU--which has deployment counts you can measure with a thimble?
Modern Direct3D is almost indistinguishable from Vulkan, on the other hand. So it shouldn't be difficult for them to add.
I also agree with your HLSL comment. It sounds like these guys don’t have much prior graphics or game development experience.
Not everywhere. See the middle bug report, "Zed does not work in Remote Desktop session on windows" (https://github.com/zed-industries/zed/issues/26692).
Most Remote Desktop/Terminal Services environments won't have any Vulkan devices available, unless you ship your own software renderer (like SwiftShader).
Also, NVIDIA only supports Vulkan from Kepler (GTX 600 series) onward, AMD from GCN 1.0 (Radeon HD 7000 series), and most importantly, Intel from Skylake (Core 6000 series). Especially on the Intel side, there are plenty of old but still-supported Windows 10 machines that lack Vulkan support. For many applications that's OK, but IMO not for a text editor.
I suspect it's because a huge number of software engineers develop on MacBooks and consider Linux second and Windows third. Culturally, I think there's a difference in tooling between graphics developers (who would go straight for HLSL, cross-platform Vulkan, or even SDL3) and Mac users (who reach for Apple tools first).