> bun build --compile ./foo.ts
> This lets you distribute your app as a single executable file, without requiring users to install Bun.
> ./foo
This is big! Part of Go's popularity is due to how easy it is to produce self-contained executables.
And according to this, it seems to support both amd64 and arm64: https://twitter.com/jarredsumner/status/1657964313888575489
Though I believe Bun doesn't use Node.js, there's a reference to it
(At least it's not JVM Hotspot...)
The worrying part is that this is mostly code, not dead junk simply occupying space. This is part of the code path, filling the caches, or should I say thrashing the caches…
This definitely took me from the 'eh, kinda cool project' camp to the 'I can't wait to try this out immediately' camp.
The binaries are pretty huge, hoping they can bring that down in time.
I'm surprised the numbers are as high as they are and hope they can reduce them... but they'll never get down to the kind of numbers Go and Rust get to because Bun depends on JavaScriptCore, which isn't small, and unless they're doing some truly insane optimizations they're not going to be able to reduce its size.
FWIW QuickJS is a tiny JS runtime by Fabrice Bellard... that also supports creation of standalone executables: https://bellard.org/quickjs/quickjs.html#Executable-generati... though its runtime speed is considerably worse than JSC or V8.
`xz -z -9e` is good to compress it for distribution.
Rust too... :D
Zig statically links by default, so it probably produces the smallest binaries, smaller than statically linked C.
also: there is a bug in `bun build --compile` I am currently working on fixing. Expect a v0.6.1 in a bit
Given that Oven has taken $7m in VC funding, how do you plan to monetize Bun, etc?
https://twitter.com/jarredsumner/status/1475238259127058433?...
I have been doing this (using ES and CommonJS modules in the same file) in client-side code via Browserify or Rollup ever since ESM got popular, but it's a bit more nuanced and annoying to do in Node.js
How did you achieve that? Are there some shortcuts you took, or some feature you deemed not in scope (yet)?
Bun is written in Zig, but it takes the same approach that esbuild took to make things fast: a linear, synchronous pipeline fitting as much as possible in the CPU cache with as little allocation and other overhead as possible. Zig has more knobs to turn than Go.
Bun goes one step further and does a great job of offloading work to the kernel when possible, too (i.e. the large file I/O improvement with the Linux kernel)
Understatement.
Still, in the native-starved web space this sorta meal will be considered haute cuisine.
Any plans on adding "in-memory" / "virtual" file support to Bun.build? I'd be interested in using it for notebook-style use cases
--
Also, ways to do "on-the-fly" "mixed client/server" components (ala hyperfiddle/electric) + sandboxing (ala Deno) would be extremely exciting
Some projects in this vein - https://github.com/jhmaster2000/bun-repl and https://www.val.town/
Also, bun macros are very cool -- they let you write code that writes over itself with GPT-4. Just mentioning as a thing to keep on your radar as you keep pushing the boundaries of what's possible in javascript :) making it more lispy and preserving eval-ability is great
I have seen some desire and works expressed towards using Bun with Electron or Electron alternatives; this interests me greatly. Do you have any plans or aspirations to make any strong push in this direction?
2. Do you cross-compile Bun? If you do, how has your experience been cross-compiling with Zig when you have a C++ dependency?
Yes, that is correct and not good. Pedantically, a file that doesn't exist at the moment you check can be created between the existence check and whatever you do next. In practice, though, handling that correctly is pretty annoying
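The race described above is the classic TOCTOU (time-of-check to time-of-use) problem. A minimal sketch of the usual workaround, assuming Node-style `fs` APIs: skip the existence check entirely, attempt the operation, and handle `ENOENT` as the "doesn't exist" signal.

```typescript
import { open } from "node:fs/promises";

// Instead of existsSync() followed by a separate open (racy: the file
// can appear or vanish between the two calls), try the open directly
// and treat ENOENT as the "not there" answer.
async function readIfPresent(path: string): Promise<string | null> {
  try {
    const fh = await open(path, "r");
    try {
      return await fh.readFile({ encoding: "utf8" });
    } finally {
      await fh.close();
    }
  } catch (err: unknown) {
    if ((err as NodeJS.ErrnoException).code === "ENOENT") {
      return null; // never existed, or vanished before we opened it
    }
    throw err;
  }
}
```

The same pattern applies to creation: `open(path, "wx")` fails with `EEXIST` if someone else created the file first, which is atomic where a check-then-create is not.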
> 2. Do you cross-compile Bun? If you do, how has your experience been cross-compiling with Zig when you have a C++ dependency?
We cross-compile the Zig part but not the C++ dependencies. zig c++ was too opinionated for us the last time we gave it a try. I'm optimistic this will improve in the future though.
How standalone are the standalone executables produced by `bun build`? Is a libc or equivalent expected to be present?
We haven't implemented polyfills for everything yet though, like Bun uses posix_spawn and doesn't have an exec + fork fallback
Bun's dependencies otherwise are statically linked
Glad to see you leading this, incredible work and nice to see the positive reception.
Is it just because most Javascript developers have never learnt from any of the lessons that came from decades of compiled languages? (compilers, compiler tools, operating system and kernel development, etc).
Is there some benefit that Javascript bundlers have that I'm unaware of?
Truly curious.
2. Running JS outside of the browser is similarly a layer that was built before any kind of standard existed for it (which still doesn't, really)
The browser standards are the only real standards. Everything else (which has turned into a lot) is "standard by implementation". Implementations usually try to agree with each other, because that's obviously beneficial for everybody, but sometimes they make choices to deviate either out of necessity or in an attempt to improve the ecosystem
So it's all pretty ad-hoc, but in practice most things are mostly compatible most of the time. They orbit the same general things, and the orbit has narrowed in the last few years as most of the big problems have been solved and the community is zeroing in on the agreed solutions (with the caveat of often having to maintain legacy compatibility)
Deno takes a stricter philosophy than most, where it prescribes a ~good way that everything should be done (which is almost entirely based on browser standards which have evolved since all this started), even though it runs outside of a browser, and requiring the ecosystem to fall in line
Bun on the other hand takes a maximalist approach to compatibility; it does its best to make everything from every other sub-ecosystem Just Work without any modifications
It's a pretty unique use-case that not many other programming languages deal with or care about. It's almost as if you are a demo scene coder trying to optimize for the absolute smallest code possible.
Linking native code doesn't really care about optimizing the size of the output and is just trying to make sure all code paths can be called with some kind of code.
That is not true at all. There are many use cases where native code size is very important. Native code toolchains often target platforms with extremely limited ROM/RAM. Even on big machines, RAM is not free and cache even less so.
Native code linkers will GC unused code sections (`-Wl,--gc-sections`), fold identical data to remove duplication (see COMDAT). Native code compilers have optimization modes that specifically optimize for small code size (`-Os` and the like).
I guess Javascript uses a slightly unusual executable format, text instead of binary. Otherwise, it seems like very much the same thing?
This creates a situation where you need bundlers, a concept other languages don't have at all, just to minimize download time. (And honestly, while we end up making rather large apps compared to web pages, they're pretty small compared to other kinds of applications.) And then bundles are too opaque to share common code between applications.
And because there's no chance to benefit across projects from sharing, there's no force driving standardization of bundling, or adoption of said standard.
edit: Oh well, after navigating to some pages on the blog I see that everything was already on browser cache, so that's why it was so fast. Reminds me I need to overwrite Netlify's cache-control on my website, even though it's already very fast to load (Netlify sets max-age=0, must-revalidate by default).
Netlify was equally fast (on the paid plan, not the free one).
When you say paid Netlify is as fast as Cloudflare do you mean the Pro plan or the Enterprise plan? AFAIK the enterprise plans run on a different network, with more distributed servers, although I could be wrong.
It seems at least part of the noticed difference in speed has to do with my region, as pagespeed insights gives me sub-second FCP and LCP on my Netlify website [0], which feels a bit better than what I get at home (with 500mbps fiber). It's possible my ISP is at fault, but I'm not sure how I could diagnose this much better.
cache-control: public, max-age=0, must-revalidate
A few things I notice: - It uses Cloudflare cache (as you pointed out).
- All CSS is in the HTML file, so only one request is needed to display the page.
- The compressed webpage is reasonably lean considering it has all CSS in the same file and uses Tailwind.

I've tried a couple and struggled with configuration and, on top of it all, bun is simply faster.
So, if you want to write a bunch of `.ts` files and point something at them, I really recommend `bun` (and, frankly, why would you write `.js` in 2023? Probably because you've not tried bun).
Edit: I don't care about bundle sizes, because I'm just using bun to run my @benchristel/taste sub-second test suite.
https://medium.com/deno-the-complete-reference/deno-vs-bun-p...
It's surprisingly convenient for munging around random json files or calling random apis that return json (or for dealing with an interaction between the two), especially now that fetch ships with node.
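To illustrate the kind of throwaway JSON munging the parent means, here is a hedged sketch (the `User` shape and field names are made up for the example): parse a JSON payload, filter it, and pull out the fields you care about. The same transform works whether the string came from a local file or a `fetch` response body.

```typescript
// Hypothetical record shape, purely for illustration.
interface User {
  name: string;
  active: boolean;
}

// Typical one-off munging: parse, filter, project.
function activeNames(json: string): string[] {
  const users: User[] = JSON.parse(json);
  return users.filter((u) => u.active).map((u) => u.name);
}
```

With `fetch` built into Node now, the usual pipeline is just `const data = await (await fetch(url)).json();` followed by a transform like the one above.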
i'm sorry
can you file an issue with some code that reproduces it? will take a look
Edit: just saw a comment from the author indicating glibc is required.
Is Bun in a state where I can start thinking about replacing my Node.js production toolchains with it?
• Bun is not stable yet (0.6.0)
• Zig, the language Bun is built upon is not stable either (0.11.0)
Nothing against these awesome projects, I'm all in for a streamlined toolchain (TypeScript, bundling, binary generation...) and other excellent goals driving the Deno and Bun teams.
But...
• Node.js is a powerful piece of software, it's stable and full of battle-tested buttons and knobs
• NPM and Bun/Deno are not real friends at the moment, just acquaintances
• Take benchmarks with a pinch of salt. Real-world app performance depends on a well-greased system, not a particular 60,000 req/s react render benchmark. Remember the adage: your app will only be as fast as its slowest component.
On a side note, lately I've been extending Node.js with Rust + N-API (e.g. napi-rs or neon) and it opens up excellent possibilities.
This seems like the most obvious thing yet to be built.
I wonder how hard it would be to take an existing systems language and add TS syntax to it. Seeing as they are all built on LLVM. Or maybe you could transpile TS to Zig.
was it ever offered for standardisation?
13.3.12.1.1 HostGetImportMetaProperties ( moduleRecord )
The host-defined abstract operation HostGetImportMetaProperties takes argument moduleRecord (a Module Record) and returns a List of Records with fields [[Key]] (a property key) and [[Value]] (an ECMAScript language value). It allows hosts to provide property keys and values for the object returned from import.meta.
https://tc39.es/ecma262/#sec-hostgetimportmetaproperties

State-of-the-art HTTP servers already do a pretty damn good job gzipping stuff on the fly, do we need this garbage?
If it is for obfuscation, fine, can we just call it that?
There are some Steve Souders books on optimization that are pretty good and still pretty relevant.
It would greatly simplify deployment to have the source and deployed code be identical. Obfuscation aside, given JS is an interpreted language, there is no reason to not use it for what it is. We've turned deploying JS into the same level of complexity as deploying C++ by adding building and packaging steps. Interpreted languages should never need build steps, and deployment should be no more than a simple rsync.
This is beyond the stripping comments, whitespace, renaming all symbols to short identifiers, replacing idiomatic constructs with shorter equivalents, etc. that people expect minifiers to be doing.
Bundling with tree shaking can get rid of a lot of code that would otherwise need to be downloaded. This is especially the case when using only a subset of functionality from a larger library.
Otherwise, for most libraries, if you pull in a single function, every single module from that library will also get loaded. This applies both to pre-bundled libraries (i.e. where there is only one large module, so obviously everything gets downloaded) and to non-bundled libraries, because most libraries have you import from an "index.(m)js" module that exports all the various public API of the library. Which means a browser with import maps will need to download all those files, and all files they import, which will be basically every module in the library.
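A tiny sketch of what tree shaking buys you (the library and function names here are invented for the example): a library's index module re-exports everything, but the app only ever reaches `add`, so a tree-shaking bundler drops the heavy export from the output entirely.

```typescript
// Imagine this file is a hypothetical library's index module, which
// re-exports its whole public API. An app does:
//
//   import { add } from "somelib";
//
// Without bundling, the browser fetches the index module plus every
// module it re-exports. A tree-shaking bundler keeps only what is
// reachable from `add`:

export function add(a: number, b: number): number {
  return a + b;
}

// Never imported by the app, so tree shaking (or a minifier's
// dead-code elimination) removes it, along with anything only
// this function imported.
export function heavyMatrixOps(size: number): number[][] {
  return Array.from({ length: size }, () => new Array(size).fill(0));
}
```

The win compounds: eliminating `heavyMatrixOps` also eliminates every module that only it depended on.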
Minifiers themselves often also have some sophisticated dead code elimination. Indeed, one potential (but inefficient) way to implement tree shaking is simply to bundle without shaking, using module concatenation, and then pass the result to a minifier with good dead code elimination capabilities. This would be able to eliminate basically anything tree shaking could, and more. The "more" comes both from eliminable code the bundler would not know how to remove, and from being able to eliminate anything that was only imported by said dead code.
This is one of the reasons why uncompressed minified code can sometimes beat out the compressed original code, and is still at least somewhat compressible itself. Of course, I have not even touched on having fewer files to download (which still has meaningful overhead), nor the smaller resulting codebase being faster to parse.
Lastly, but not least, many people want to use typescript or jsx when writing their code, which means the code needs to be pre-processed before a JavaScript engine will read it. If you already have a compile step, then adding bundling and minimizing on top of that can be relatively simple and make good sense for the above reasons. (Note can be simple. It depends a lot on what tools you use. Webpack for example can get really complicated, but it also offers some really powerful features.)
> many people want to use typescript or jsx when writing their code
However, if this is true, why don't we just add these languages to browsers (<script type="text/jsx">, <script type="text/typescript">) with some kind of client-side processor for old browsers that turns them into "text/javascript" in-line in the DOM if the browser doesn't support it?
It's kind of weird that JavaScript is becoming a common "bytecode" among other better-structured languages, one would think the bytecode language that becomes the compilation target should be one that is better designed of its own.
If we're compiling JS, JSX, and TS all to some sort of assembly, or even a statically and strongly typed language like C++, I would feel a bit better.
Does Node.js have lots of bugs still?
Let the downvoting of this simple observation begin.