Every time I look, I find repos that look promising at first but are either unmaintained or run as a side project by a team of just one or two maintainers.
I want my sandbox to be backed by a large, well-funded security team working on a product with real money on the line if there are any holes!
(Playing with Cloudflare workerd this morning, which seems like it should cover my organizational requirements at least.)
Update: Classic, even Cloudflare workerd has "WARNING: workerd is not a hardened sandbox" in the README! https://github.com/cloudflare/workerd?tab=readme-ov-file#war...
workerd does not include any sandboxing layers other than V8 itself. If someone has a V8 zero-day exploit, they can break out of the sandbox.
But putting aside zero-day exploits for a moment, workerd is designed to be a sandbox. That is, applications by default have access to nothing except what you give them. There is only one default-on type of access: public internet access (covering public IPs only). You can disable this by overriding `globalOutbound` in the config (with which you can either intercept internet requests, or just block them).
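Concretely, that override lives in workerd's Cap'n Proto config. A sketch of the shape (field names are from memory of the `workerd.capnp` schema; check them against the current schema before relying on this):

```capnp
using Workerd = import "/workerd/workerd.capnp";

const config :Workerd.Config = (
  services = [
    ( name = "main",
      worker = (
        serviceWorkerScript = embed "worker.js",
        compatibilityDate = "2024-01-01",
        # Defaults to "internet" (public IPs only). Point it at another
        # service to intercept outbound fetches, or at one that refuses
        # everything to cut off network access entirely.
        globalOutbound = "outbound-gate"
      )
    )
    # "outbound-gate" would be another service you define, e.g. a worker
    # that inspects, rewrites, or rejects each outgoing request.
  ],
  sockets = [ ( name = "http", address = "*:8080", http = (), service = "main" ) ]
);
```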
This is pretty different from e.g. Node, which starts from the assumption that apps should have permission to run arbitrary native code, limited only by the permissions of the user account under which Node is running.
Some other runtimes advertise various forms of permissions, but workerd is the only one I know of where this is the core intended use case, and where all permissions (other than optionally public internet access, as mentioned) must be granted via capability-based security.
Unfortunately, JavaScript engines are complicated, which means they tend to have bugs, and these bugs are often exploitable to escape the sandbox. This is not just true of V8, it's true of all of them; anyone claiming otherwise is being naive. Cloudflare in production has a multi-layer security model to mitigate this, but our model involves a lot of, shall we say, active management which can't easily be packaged up into an open source product.
With all that said, not all threat models require you to worry about such zero-day exploits, and you need to think about risk/benefit tradeoffs. We obviously have to worry about zero-days at Cloudflare since anyone can just upload code to us and run it. But if you're not literally accepting code directly from anonymous internet users then the risk may be a lot lower, and the overall security benefit of fine-grained sandboxing may be worth the increased exposure to zero-days.
The problem I have is that I'm just one person and I don't want to be on call 24/7 ready to react to sandbox escapes, so I'm hoping I can find a solution that someone else built where they are willing to say "this is safe: you can feed in a string of untrusted JavaScript and we are confident it won't break out".
I think I might be able to get there via WebAssembly (e.g. with QuickJS or MicroQuickJS compiled to WASM) because the whole point of WebAssembly is to solve this one problem.
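Wasm's import model is what makes that deny-by-default property concrete: a module can only call what the host explicitly hands it. A minimal, stdlib-only sketch, with a hand-assembled module standing in for a real QuickJS-compiled-to-Wasm build (e.g. quickjs-emscripten):

```javascript
// A minimal wasm module exporting add(a, b), instantiated with an empty
// import object: it cannot reach the filesystem, the network, or any
// host object the embedder does not pass in.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm", version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1, i32.add, end
]);

const instance = new WebAssembly.Instance(
  new WebAssembly.Module(bytes),
  {} // no capabilities granted
);
console.log(instance.exports.add(2, 3)); // 5
```

A JS engine compiled to Wasm works the same way: the untrusted JavaScript it interprets can only reach host functions the embedder chooses to import into the module.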
> But if you're not literally accepting code directly from anonymous internet users then the risk may be a lot lower
That's the problem: this is exactly what I want to be able to do!
I want to build extension systems for my own apps such that users can run their own code or paste in code written by other people and have it execute safely. Similar to Shopify Functions: https://shopify.dev/docs/apps/build/functions
I think the value unlocked by this kind of extension mechanism is ready to skyrocket, because users can use LLMs to help write that code for them.
Featured recently on HN.
A high budget is no guarantee of an absence of critical bugs in an engine; if anything, somewhat the opposite: on a big team the incentives are aligned with shipping more features (since nobody gets promoted for maintenance, especially at Google) -> increasing complexity -> increasing bug surface.
If speed is less important and you can live without a JIT, that expands your options dramatically and eliminates a large class of bugs. You could take a lightweight engine and compile it to a memory-safe runtime; that'd give you yet another security layer for peace of mind. Several projects have done such ports to Wasm/JS/Go. For example, your browser likely runs QuickJS to interpret the JavaScript embedded inside PDFs (https://github.com/mozilla/pdf.js.quickjs)
How much are you ready to pay for a license?
But generally, I think the best bet is to offload such things to e.g. a Lambda per tenant.
That's a staggering accomplishment.
It'll be interesting to see how much it will affect React Native apps as it gets more and more optimized for this use case.
At one point I really thought that Flutter would outclass it, but typical Google project stuff has really put a damper on it, from all I can see.
It’s not better than native apps, but as far as cross-platform GUIs go it’s still very, very good.
Flutter is amazing, but they really shot themselves in the foot with Dart (and I say that as someone who doesn't mind Dart).
As a React Native developer for, what, 6 years, I don’t have much positivity left to offer. Bug reports to the core team that went nowhere, the Android crash on remote images without dimensions, all the work offloaded to Expo, etc.
Google couldn’t really have done better; maybe Flutter should’ve become independent after the initial release.
And SpiderMonkey seems... not up there compared to the other 2
I believe that, long term, V8 will become the undisputed champ again, as Google has a lot more incentive than Apple to make the fastest engine. But this is just a wild guess of mine, and I'm biased, being a Node.js Collaborator.
I've been hearing for a while that JSCore has a more elegant internal architecture than V8, and seeing the V8 team make big architectural changes as we speak seems to support that [1]. But like I said, hopefully those changes will pay off long term.
[1]
I just ran the JetStream2 benchmark and got:
- Firefox: 159 score
- Chromium: 235 score
That's on latest Fedora Linux and Ryzen 3600 CPU.
I'm curious to know what Firefox's problem is. For example, the 3d-raytrace-SP benchmark is nearly three times faster on Edge than on Firefox on my i7 laptop. The code of that benchmark is very simple and consists mostly of basic math operations and array accesses. Maybe canvas operations are particularly slow on Firefox? This seems like an example the developers should take a look at.
- Firefox: 253.584
- Safari: 377.470
- Chrome: 408.332
- Edge: 412.005
My n=1 as a long time Firefox user is that performance is a non-issue (for the sites I frequent). I’m much more likely to switch browsers because of annoying bugs, like crashes due to FF installed as a snap.
It honestly is pretty surprising, given that the JS runtime runs website code single-threaded.
The gap is not so big these days. JavaScriptCore, Spidermonkey, and V8 are all competent.
How many of these engines are chasing benchmarks at the cost of increased memory usage?
A few years ago I started work on a kind of abstraction layer that would let you plug Rust code into multiple different engines. I got as far as a proof of concept for JavaScriptCore and QuickJS (at the time I had iOS and Android in mind as targets). I still think there’s some value in the idea, to avoid making too heavy a bet on one single JS engine.
The amount of work just to aggregate and compare is admirable, let alone the effort behind the engines themselves.
https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
Just keep benchmark code limited to standard ECMAScript, don't expect any browser or Node APIs besides console.log() or print().
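Something like this stays within standard ECMAScript and runs unchanged under the d8, jsc, and SpiderMonkey shells as well as Node (a trivial microbenchmark, purely illustrative):

```javascript
// Engine-portable microbenchmark: standard ECMAScript only.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

const t0 = Date.now();
const result = fib(30); // 832040
const elapsed = Date.now() - t0;

// Engine shells expose print(); browsers and Node expose console.log.
const log = typeof console !== "undefined" ? console.log.bind(console) : print;
log("fib(30) =", result, "(" + elapsed + " ms)");
```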