Don't worry about syncing formatting/linting/import-alias configuration with VSCode: all you need is the Deno plugin and you get all the benefits of working with TS in VSCode.
Packages are obtained (and heavily cached) from any URL instead of relying on a centralized repository. Obtain your dependencies however you want, whether it's Deno's proxy, directly from raw.githubusercontent.com, your own HTTP server, or anything else accessible through a URL.
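For illustration, URL imports look like this (the module URLs below are just examples; Deno fetches them on first run and caches them locally):

```typescript
// mod.ts — dependencies come straight from URLs, no registry involved.
import { copy } from "https://deno.land/std@0.120.0/bytes/mod.ts";
// ...or from a Git host or your own HTTP server (hypothetical URL):
// import { helper } from "https://example.com/libs/helper.ts";
```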
In the end, the permission system is the least interesting part of the project, imho. It's useful (if you're writing a CLI that just reads stdin, processes it, and prints to stdout, you can block all disk and network access), but apart from that it's really limited because of the nature of JS itself. (Maybe for the next trendy language we could think about the object-capability model before it's too late. https://en.wikipedia.org/wiki/Object-capability_model)
The thing I value the most is consistency and having a fully working development environment out of the box to be productive.
There is an object-capability model in the upcoming OCaml 5.0; however, it's only in the Eio library, which deals with IO: https://github.com/ocaml-multicore/eio#design-note-object-ca.... There's also Emily, a subset of OCaml based on POLA (Principle of Least Authority): https://www.hpl.hp.com/techreports/2006/HPL-2006-116.pdf. I'm unaware of any plan to extend OCaml in that direction, though.
What I really like about a new Node.js (e.g. Deno) is actually its module system. I don't like that `npm i` in Node.js pulls in hundreds of modules; a standard library that contains the most commonly needed modules is the key. If Deno doesn't do that, I'll probably never try it (since I can put the bundle of tooling together myself in an hour or two; vite/volar now makes this even simpler).
In short, I'd prefer a gold-quality set of modules or APIs (e.g. glibc) over the 100% flexibility of "you pull whatever you want freely, now even over an HTTP URL", for security and stability reasons.
Because modules/packages routinely have dependencies. And their dependencies have dependencies. And...
Deno changes nothing in that regard with one single exception:
> a standard library that contains most commonly needed modules is the key
^ This is the bane of JavaScript, yes. But having a standard library doesn't somehow prevent modules from having multiple dependencies and subdependencies.
> "you pull whatever you want freely now even over http URL", for security and stability reasons.
- If you pull your deps from a random URL and that URL goes away, how do you solve that?
- If your deps pull other subdeps from a random URL and that URL goes away, how do you solve that?
- For security, how do you vet what your dependencies keep on pulling from random URLs?
For node, the answer is: run your own registry, and don't load anything from outside. That's how many companies operate. How can this be solved with Deno?
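One (partial) answer Deno offers is import maps: you can remap URL prefixes so that code written against public URLs resolves against a mirror you control. A sketch, with a made-up internal registry host:

```json
{
  "imports": {
    "https://deno.land/": "https://registry.internal.example/deno.land/",
    "https://raw.githubusercontent.com/": "https://registry.internal.example/github/"
  }
}
```

Passed via `deno run --import-map=import_map.json`, and combined with a lock file (`--lock`) for integrity checking. It's still not equivalent to npm's single-registry model, though, since nothing forces every transitive import through the map's prefixes.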
The Deno API, which comes bundled with Deno and is present globally, always, without importing: https://doc.deno.land/deno/stable. It includes basic stuff like stdin, stdout, stderr, file management, etc., plus standard Web APIs like the Fetch API, WebAssembly interoperability, localStorage, FormData, TextEncoder/TextDecoder, etc.
The Deno std lib, an official library that provides what you'd expect from a standard library, built on the Deno API: https://deno.land/std@0.120.0. You do need to import it, because it's no different from any other module in the wild. MIME types, more encoding options, extended file system management, etc.
https://github.com/denoland/deno/issues/11964
https://github.com/denoland/deno/issues/9750
API-based access control can't possibly work because it's nearly impossible to predict the effect of any single permission. For example, "permission to run a specific command" makes no sense without checking the integrity of the binary, controlling the environment against LD_PRELOAD-like hacks, and evaluating the code of that command for possible escape hatches. If you want to isolate a program, you need to do it at the OS level.
> deno run --allow-read=./assets
works as intended, preventing most execution of local code and preventing writes to disk. I think it's a useful real-world use case that is complicated to replicate with Node.js. That said, I think one should still be wary of running random code, but at least Deno makes it a little easier for honest authors to adhere to the principle of least privilege?
Is there an alternative tool you can suggest to let us securely run arbitrary JS? I was looking at Apple's JavaScriptCore to run JS, and if it happens that I need any level of access to the system (i.e. files), I'd simply handle that in Swift and pass the file to the JS. Would that be a secure approach?
If you want to isolate your program, you should use an OS-level sandbox like bubblewrap or a lightweight VM like Firecracker. I’m not familiar with Apple’s JavaScriptCore, but if it doesn’t provide any access to the system (and instead relies on passing arguments from Swift code), it might also be a viable approach.
The second issue does seem concerning, given how long it has taken to resolve.
The problem with the Deno security model is that it’s hard to predict how granting any specific permission would affect overall security. For example, it may seem to be kinda reasonable for an application to ask for `--allow-write=~/.config` to create config directories & files, but it’s probably exploitable to escape the sandbox. Is `--allow-env` + `--allow-write=whatever` dangerous? I don’t know. If Deno runtime spawns a subprocess at some point, it could be used to execute arbitrary code via `LD_PRELOAD`. Is there a guarantee that Deno runtime will never spawn subprocesses? There is no way to know.
I read through your bug reports now.
These are sound and very very useful.
I must admit I pattern-matched on your language and answered based on that below, and therefore my answer, even if it is maybe somewhat(?) correct, is extremely wrong in tone and in what it implies.
Sorry.
I still think it would be better if you were somewhat more specific. Everything in this thread seems to be related to subprocesses, which is a scary thing anyway for anything internet-facing, isn't it?
False. V8 is a JS runtime with sandboxing built into its core design; it's not a language, and it doesn't guarantee sandboxing of the JS runtime.
> that makes it impossible
False, breaking out of the sandbox is trivial in environments which allow native addons.
Has Deno undergone some kind of security audit to verify its claims regarding security?
EDIT: I see some referenced issues in the comments below involving the --allow-read/write flags. I'm not interested in that. I'm interested in whether anyone can prove that, with no permissions granted at all, they can break out of the sandbox and achieve ACE (arbitrary code execution).
I’m academically interested if there are other such exploits, too. But I’d expect if they’re found they’ll be patched before they’re disclosed (or they’ll be exploited in the wild).
Also related: https://microsoftedge.github.io/edgevr/posts/Super-Duper-Sec...
Most browser exploits these days use heap-spraying attacks that try to corrupt the state of the sandbox between bindings and native libraries (or the data structures transferred between contexts). So technically, a JIT VM always opens up possibilities for breakouts when there is a discrepancy between the optimizer's and deoptimizer's assumptions (e.g. regarding the call stack, garbage, memory ownership, etc.).
Also: there's a legacy navigator.plugins C-bridge-based API which hasn't been maintained or redesigned/refactored since the late '90s, yet it is still active in most browsers.
It is the usual case of worse-is-better, and Node.js, for better or worse, does its job.
I use Deno mainly for simple scripts with Typescript. In nodejs I always find myself having to configure the environment, while in Deno it mostly just runs.
I also like the Deno standard library.
Of course, it's up to each and every developer that contributes, but looking at some of the most-starred Deno modules, that's the direction it's heading.
This isn't necessarily enforced by Deno itself, right? That seems more like a side effect of the self-selection of its users. Once the ecosystem grows and all the "normies" come in, this doesn't seem guaranteed at all.
Also, I don't want smaller modules. I want bigger and better-maintained modules with few dependencies. Small modules are what makes the npm ecosystem not that great.
This perspective is making Deno feel easy for me to jump to for my next backend project, now that I'm actively fighting against dependencies.
I don't even care nodejs exists.
^ I'm bundling server-side for HMR/watch functionality in a monorepo with a lot of code/modules shared across sides.
I don't even remember the details, but I remember it all feeling very rickety. I'd never push anything that fragile into production; it didn't even survive 25 days of doing small puzzles without constant nursing.
Meanwhile, guys like node-fetch and chalk ask us why we don't just adopt ESM.
What I'm trying to do there is have a client and server in the same ./src, hot-reloaded per-module on save only when a relevant part changes, and have vscode typecheck both at the same time. The language similarity is also a goal. It's a little more than a traditional lazy-compile-restart cycle.
Non-monorepo, non-ts-only folks don't experience my issues, because they only have one environment per project (or per src-<target>), and don't try to make their build configs incompatible with other parts of a build system. I tried to push it as far as it could go to evaluate the state of things for writing non-standard, slightly different web apps. To make a ts-react app, they just use CRA; for a backend, they just run node main.js.
But anyway this shows how interdependent this ecosystem is, instead of being full of orthogonal possibilities.
Can you elaborate?
Overall, it just makes sense. It feels like you’re using one syntax for front and backend instead of having to use two different ones.
With Deno it should be easier to do this: you set up your own CDN, upload plain JS files, and point to them from your import map [1]; the browser will take care of downloading/caching them all.
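Concretely, the browser side of this idea looks like the following (the CDN host is hypothetical; import maps were natively supported only in Chromium-based browsers at the time):

```html
<!-- index.html — maps a bare specifier to a file on your own CDN -->
<script type="importmap">
{
  "imports": {
    "lodash": "https://cdn.example.com/js/lodash.mjs"
  }
}
</script>
<script type="module">
  import _ from "lodash"; // resolved via the import map above
</script>
```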
Exactly. And it's quite easy to do.
> With deno it should be easier to do this, you setup your own cdn, just upload plain js files and point it from your import map
I was waiting for the inevitable just.
- Just set up your own CDN.
- Just upload a plain js file there (where do I get those files from?)
- Just point to dependencies using a feature that, quote "is not a W3C Standard nor is it on the W3C Standards Track"
- And then the browser... record scratch who said anything about a browser?
Where do you get these lockfiles from?
> This is no different to module loading in Node.
This is very much different from module loading in Node: https://news.ycombinator.com/item?id=29871936
> If you don’t trust your registry, you should not be loading code from it!
So you immediately pinpointed the difference: with Node I can run my own registry and easily set up npm/yarn to never load packages from anywhere else. Deno loads code from random URLs.
The different approach would be yarn's saving of zipped versions of packages; I don't know if Deno supports something like that.
There is. For starters, I can run my company's registry and make sure all npm packages are downloaded from there, since the resolution mechanism for npm/yarn is well known.
How do I tell deno to download <random-url> from my own registry?
Looking at how Deno "solves" this, I can't stop laughing [1]
--- start quote ---
In Deno there is no concept of a package manager as external modules are imported directly into local modules. This raises the question of how to manage remote dependencies without a package manager. In big projects with many dependencies it will become cumbersome and time consuming to update modules if they are all imported individually into individual modules.
The standard practice for solving this problem in Deno is to create a deps.ts file. All required remote dependencies are referenced in this file and the required methods and classes are re-exported. The dependent local modules then reference the deps.ts rather than the remote dependencies...
With all dependencies centralized in deps.ts, managing these becomes easier.
--- end quote ---
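The pattern the quoted docs describe is just a re-export module. A minimal sketch (module URLs are illustrative):

```typescript
// deps.ts — the only place remote URLs (and versions) appear
export { copy } from "https://deno.land/std@0.120.0/bytes/mod.ts";
export { serve } from "https://deno.land/std@0.120.0/http/server.ts";

// elsewhere, e.g. main.ts:
// import { serve } from "./deps.ts"; // bump versions in deps.ts only
```

In other words, dependency management by hand-maintained convention rather than tooling.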
Are you for real?
erm, how is that any simpler than `export const/function/class $definition`?
I guess if you only touch JS once in a blue moon it's difficult to remember?
Also, CommonJS has some acknowledged issues around cyclic dependencies, and it's incredibly fudgeable at runtime, which makes static analysis and linting a pain.
Honestly though, ESM is more complex than it should be (especially regarding how live exports happen).
I find sync is often simpler to code for.
// main.js (sketch: require_async is a hypothetical API, it doesn't exist)
const fs = require("fs")                // sync require, as today
const foo = await require_async("foo")  // proposed async counterpart
await foo.sleep(1000)

// other.js
require("./main.js")
// RequireError: main.js returned an unsettled promise, use require_async()
Wouldn't that be the best of both worlds? But sometimes I'd like the same code to also work in the browser. Then I should be using ES6 modules, but they don't work very smoothly with Node.js, and I fear there may be complications if I have to wait (or "await") for the modules to load.
The ES6 "dynamic imports" add more considerations to the mix. Surely their exports cannot be used until the "await" is over?
I think it would simplify things if I didn't have to choose between module systems when I really just want to choose between sync and async.