Typescript has just as many[1] quirks as Javascript (in fact more, as it's a superset). I like using it (and it makes JS type-safe-ish), but it's not really some kind of paradigm shift.
Not sure how I feel about import maps. They are quite literally the same thing as package.json. In fact, converting between the two takes about 20 lines of code[2]. I'd bet my bottom dollar that everyone's going to use them, which is going to lead to exactly the same types of problems as Node.
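To make the "about 20 lines" claim concrete, here is a minimal sketch of one direction of that conversion. Assumptions are mine: only plain `dependencies` entries, caret/tilde ranges naively stripped, and an esm.sh-style CDN URL layout; a real converter would resolve ranges against a registry.

```typescript
// Sketch: derive a Deno-style import map from a package.json dependency list.
interface PackageJson {
  dependencies?: Record<string, string>;
}

interface ImportMap {
  imports: Record<string, string>;
}

function toImportMap(pkg: PackageJson, cdn = "https://esm.sh"): ImportMap {
  const imports: Record<string, string> = {};
  for (const [name, version] of Object.entries(pkg.dependencies ?? {})) {
    // Naive: strip a ^ or ~ range prefix instead of resolving it properly.
    const pinned = version.replace(/^[\^~]/, "");
    imports[name] = `${cdn}/${name}@${pinned}`;
  }
  return { imports };
}

const map = toImportMap({ dependencies: { lodash: "^4.17.21" } });
console.log(map.imports["lodash"]); // https://esm.sh/lodash@4.17.21
```

The round trip back (import map to package.json) is similarly mechanical, which is exactly why the two formats feel redundant.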
Of course it's still a far cry from being as safe as, for example, Rust.
And yeah, the inconsistency of JavaScript/TypeScript can be frustrating for sure. I think my dream language is simply TypeScript cleaned up to be consistent, one that sheds a lot of the features and retains a simple core.
For lower C# versions and Java, you can get the equivalent of strict mode via static analysis tools.
I have been using Sonar on CI/CD builds since 2008. Static analysis errors break the build, plain and simple.
Also quite convenient for writing sane C and C++ code by the way.
I agree. I think this probably looks a lot like a Rust-lite (GC instead of lifetimes for memory management) or a Go with generics or a mature ReasonML?
However there are tradeoffs with everything you use.
It really comes down to which tradeoffs are the right ones for you, which may not be the same for someone else.
I have worked extensively with Typescript, which I consider to be the minimum viable solution within the Node ecosystem. Typescript shouldn't even be seen as a 'nice-to-have'. It should be seen as a required remedy for some of the mistakes of a terribly engineered language. Javascript should never have been considered a serious language for back-end development. Node.js seems to me like someone's 'Frankenstein's monster'-esque experiment that escaped its captivity and wreaked wide-scale havoc on the surrounding world.
Javascript is a breath of fresh air after writing async code in almost every other ecosystem from Go to Java to Python to Swift. Pretty much any async code snippet in those languages can be improved by porting it to Javascript.
Async/await + the ubiquitous Promise make it my go-to choice for writing anything networked. Especially over the other popular dynamically typed languages.
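The ergonomics being praised here are easy to show. A toy sketch (names are mine, and timers stand in for real network I/O): fanning out several requests and joining them is a one-liner with `Promise.all`.

```typescript
// Toy sketch: both "requests" start immediately, so the total wait is
// roughly 50ms rather than 100ms (sequential awaits would serialize them).
const fakeFetch = (ms: number, value: string): Promise<string> =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

async function loadDashboard(): Promise<string[]> {
  const [users, posts] = await Promise.all([
    fakeFetch(50, "users"),
    fakeFetch(50, "posts"),
  ]);
  return [users, posts];
}

loadDashboard().then((r) => console.log(r)); // r is ["users", "posts"]
```

The same fan-out in a callback-based or thread-pool API typically needs explicit synchronization; here the event loop does the bookkeeping.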
More to the point, I think, is that Node (and Deno, FWICT) lacks general native or green-thread support for true multi-processing without serialization to separate clusters, so you are forced to use async and timers for long-running or parallel work.
Promises just mean having to manually build and manipulate cooperatively multitasking green-threaded call-stacks. It was a dirty necessity following the inability of the earlier callback patterns to manage the level of complexity people were attempting to express in the language.
The problem with async/await: http://journal.stuffwithstuff.com/2015/02/01/what-color-is-y...
A solution proposed by none other than Java (available as an experimental feature in JDK 15): http://cr.openjdk.java.net/~rpressler/loom/loom/sol1_part1.h...
This seems to me to obsolete async/await, quite honestly.
There are many ways of doing I/O, but nowadays everyone seems to do everything async without giving it a second thought.
There's a legitimate concern about async/await: it has no built-in cancellation mechanism. That may not be a big thing for the server, but it's definitely a big thing for the client.
See, for example: https://twitter.com/getify/status/1171820070538022914
https://medium.com/flutter-community/the-ultimate-javascript...
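For what it's worth, the platform's current answer is `AbortController`/`AbortSignal`, which you have to thread through your async code by hand rather than getting cancellation for free. A minimal sketch (the `cancellableDelay` helper is mine):

```typescript
// Sketch: a cancellable delay built on AbortSignal. fetch() accepts a
// signal directly; for your own async code you wire it up manually.
function cancellableDelay(ms: number, signal: AbortSignal): Promise<void> {
  return new Promise((resolve, reject) => {
    if (signal.aborted) return reject(new Error("aborted"));
    const id = setTimeout(resolve, ms);
    signal.addEventListener("abort", () => {
      clearTimeout(id); // release the timer so nothing leaks
      reject(new Error("aborted"));
    });
  });
}

const controller = new AbortController();
const pending = cancellableDelay(10_000, controller.signal).catch((e) => e.message);
controller.abort(); // e.g. the user navigated away
pending.then((msg) => console.log(msg)); // "aborted"
```

Every layer of the call stack has to accept and forward the signal, which is exactly the manual plumbing the linked thread complains about.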
I like the idea of a standard library. This will hopefully only improve with time. It sort of brings the ease of use factor of PHP to a server side JavaScript environment.
I'll be watching Deno very closely these next few months!
Node got popular because it enabled frontend devs who only knew JavaScript to move easily to the backend, not for any technical reason.
However, not every project will involve pulling in such crazy stacks. I recently experimented with Hapi + TypeORM. Hapi's core goal is to use only what they directly maintain, and I think it works really well! Node_modules is still not tiny, but it's at least an order of magnitude better than what I'm used to.
The one big problem, to me, is that even though you have types in your TS code, you're basically throwing them away at runtime, wasting huge optimisation opportunities that even V8 can't recover. If V8 had support for strictly typed TS code, can you imagine how fast it could get it to run?! I think that's the next stage in the evolution of JavaScript/Node/Deno: Node -> Deno -> Done.
GitHub: https://github.com/sholladay/pogo
Video tutorial: https://www.youtube.com/watch?v=Fe4XdAiqaxI
Extraordinary claims require extraordinary evidence.
Additionally, I would like to pay for a thorough security review when we have more features and users. I doubt any of the other frameworks will do that as it's extremely rare in OSS. Of course, that means very little until it actually happens. But know that my intention is to deliver the first Deno framework that I would personally feel comfortable using in production.
deno run --allow-net myWebserver.ts
With SELinux, one can specify the port range and network interface that the application is allowed to access. It also provides an audit log that can be examined by the admin. Maybe there is no need to reinvent the wheel; just use some form of MAC if you really care about security.
This is what I think I'd like to see as well. The most common case isn't that I don't trust the program I'm running, it's that the level of trust for my dependencies plus their dependencies is essentially opaque.
Each package published to Deno could come with a set of declared permissions (similarly to Android apps).
When importing the package in a module, Deno should detect that permissions scoped at current module level are wider than what the package requires, and automatically narrow down the list of authorized calls.
This would probably be very costly. Suppose that I'm importing a function from lodash (that requires no permissions) and my module calls it repeatedly while also accessing the file system...
> --allow-net=<allow-net> Allow network access. You can specify an optional, comma-separated list of domains to provide a whitelist of allowed domains.
So it seems to allow for a bit more fine-grained configs than just opening up everything.
All the examples I’ve seen so far use the ‘deno’ command to run .ts or .js scripts. Can it also package standalone binaries like Go/Rust, to make a CLI tool, for instance? If so, how do permissions work in that scenario? Does the user need to grant permissions on every invocation, or is there some way to whitelist a script/binary?
- Yarn helped solve that, but because of its backwards compatibility with node_modules, you could not have different versions sitting side-by-side.
- Node_modules could have a different version installed vs lock file and no one would know without looking.
It seems Deno solves the side-by-side versions by distributing the 'lock' into the file itself, and the Deno team is trying to create a 'map' file to consolidate the 'distributed version' issue.
Sadly, Ruby's Bundler has solved this for years and while I love TypeScript, I'm always saddened by the state of package management in the Node space.
I'm not saying Bundler is perfect, but its canonical lockfile and its ability to have side-by-side versions allow me not to think about that issue.
> - Node_modules could have a different version installed vs lock file and no one would know without looking.
> Sadly, Ruby's Bundler has solved this for years [...]
I don't understand your first point. Different projects can use different versions since the modules are installed locally (inside the `node_modules` directory). And nested modules can also have different dependency versions, e.g.:
A
=> depends on B @ 1.0
=> depends on C, and C can depend on B @ 2.0
Regarding your second point, I haven't ever seen that happen in practice, and IIUC it's mostly a property of the fact that `require 'bundler/setup'` checks your dependency versions; you could implement something similar for JS (e.g. traverse node_modules directories recursively, checking that the versions declared in the package.json of dependencies match the ones in your root lockfile).

Since we're on the topic of Ruby and JS, Ruby's module system is probably one of the worst I've ever seen and JS one of the best.
In Ruby, just like in Python, everything in a file is public by default and the only way to make things private, AFAIK, is using Module#private_constant, and that only works for methods/inner classes/things in a class scope.
And, unlike Python's import, require is side-effectful! If you have file a.rb that requires b.rb, and b.rb requires c.rb, everything in c.rb will be visible in a.rb. This is terrible.
JS's module system is one of the best IMO (better than Haskell's, Python's, Java's, etc.):
- simple mental model: a file is a module
- everything in a module is private by default, you have to explicitly mark things with `export` to make them public
- You can either qualify imports or explicitly import individual functions, so it's always possible to find out where something is defined by simply looking at a file. Most languages fail here. This is useful for beginners and in places where you don't have an IDE available, like GitHub
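Concretely, the last point looks like this (using a built-in module available in both Node and Deno so the snippet is self-contained; the same syntax applies to your own files):

```typescript
// Named import: it's obvious where `posix` comes from just by reading the file.
import { posix } from "node:path";
// Qualified import: everything is accessed through an explicit namespace.
import * as path from "node:path";

console.log(posix.join("a", "b"));      // "a/b"
console.log(path.posix.join("a", "b")); // "a/b"

// In your own modules, only `export`ed names are visible to importers:
//   // math.ts
//   export const add = (a: number, b: number) => a + b; // public
//   const secret = 42;                                  // private by default
```

Either way, the importing file names its sources explicitly, so "where is this defined?" is answerable without tooling.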
I'm speaking about within the same project. It's not hard to have problems over time when node upgrades (for example[0]) or to get a different version than expected.
Any project that's lived long enough runs into some sort of version mis-match where the solution is `rm -rf node_modules`.
Deleting and reinstalling the package folder as a regular fix is symptomatic of a deeper package issue.
Deno solves parts of this by giving module versions their own explicit folder. I'm concerned that, if it still stores the package locally, you can still run into a Deno version mismatch.
.rbenv + Bundler's folder structure has been `.rbenv/versions/2.6.5/lib/ruby/gems/2.6.0/gems/mime-types-3.3`
The version of ruby and the version of the gem are explicit allowing separation.
Again, far from perfect, but this keeps out so many problems.
> Since we're on the topic of Ruby and JS, Ruby's module system is probably one of the worst I've ever seen and JS one of the best.
This thread is about package management. While fair criticism, it's too sideways.
[0] https://stackoverflow.com/questions/46384591/node-was-compil...
node_modules can only have one version of a package at a given level, and it's not hard to have version drift even with a lock file. The standard answer is `rm -rf node_modules` and reinstall. Often that fixes whatever problem crept in.
Blowing away a package directory to solve problems for years should not be the answer.
This means that if I write an application that requires filesystem access and has external dependencies, I'm essentially giving them access to the filesystem even if they don't need it.
These dependencies could silently check whether they have permissions and do something fishy only if that is the case.
It would be nice to be able to import dependencies in a nested sandbox but I guess it is not a simple problem.
Access to security-sensitive areas or functions requires permissions to be granted to a Deno process on the command line. [1]
The only other mention of permissions in documentation is that a program may query or revoke permissions.
[1] https://deno.land/manual/getting_started/permissions [2] https://deno.land/manual/examples/permissions
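For reference, the query/revoke API mentioned there looks roughly like this. It is a sketch: the `Deno` global only exists under the Deno runtime, so it's guarded here to be a no-op elsewhere.

```typescript
// Sketch of Deno's runtime permission API (Deno.permissions.query/revoke).
// Guarded so the snippet does nothing when run outside of Deno.
const deno = (globalThis as any).Deno;

if (deno?.permissions) {
  // Ask what state a permission is in: "granted", "prompt", or "denied".
  const status = await deno.permissions.query({ name: "env" });
  console.log(status.state);

  // A program can voluntarily drop a permission it no longer needs,
  // e.g. after reading its config from the environment at startup.
  if (status.state === "granted") {
    await deno.permissions.revoke({ name: "env" });
  }
}
```

Revoking early is one way to shrink the window in which a compromised dependency could abuse a broad process-level grant.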
When you're putting a database password in the env, most likely you need to permit env var and network access for your database client library. Then, at the same time, the library in the example can do that malicious thing.
The problem here is that Deno requests permissions per process, not per library import.
It's not as fine-grained as allowing libraries specific permissions, but it gets you part of the way there.
RIP every other engine, I guess?
In practice, pretty much :(
If so, I can see this type of system being mostly worthless.
The browser actually does something quite a bit like this with iframes. Iframes are sandboxed and can only communicate through postMessage. There's more to it but at a simple level it looks like this.
Chrome nowadays even runs iframes in a separate process! Finally... https://www.chromium.org/developers/design-documents/oop-ifr...
This is actually quite impressive because it presents a decent illusion to JS that all frames are running under the same thread.
At some point I remember writing some gpg wrappers with Node.js and I remember the subprocess API being one of the more pleasant ones to work with. In the case of more stringent Deno process sandboxing, the parent process would spawn another Deno process with a smaller set of capabilities.
Using something like Caja https://developers.google.com/caja/docs/about might work, using object capabilities rather than ambient privileges. Not sure if Deno helps at all there though.
- It would be nice to see a built-in Maven/Gradle-like standard build system; not having one in Node.js was, IMHO, a major letdown for me. Sure, you have gulp/grunt, but having to write repetitive (and potentially buggy!) code for running a compiler/tests/packager? With another set of plugins? Just give me a standard tool and let me call the equivalent of "mvn package", which will compile/test/package my application with sane defaults.
- The dependencies-as-URLs approach is going to be hit hard as soon as some big-fish corporation wants to use Deno for in-house projects. What if those corps disallow calling https://deno.land/my/dep and expect an internal repo to be used? Now all third-party dependencies won't work unless manually modified to point at the internal corp repo.
- I predict that --allow-this/--allow-that will evolve into SecurityManager-like complexity that only the early enthusiasts will understand; the rest will simply use the equivalent of "--allow-all" and let the ops team deal with security issues.
So why do people not throw the same kind of fit about nearly every other programming environment as they do for Node/NPM? And frankly, why do those other environments not have the ridiculous security breaches we have seen in Node/NPM land?
The real problem with Node/NPM, I suspect, is the lack of a standard library. Simply having a standard library would have greatly reduced dependency and package hell. Further, it would mean people would be more willing to write a little more code rather than pull in a new dependency.
- dependencies are carefully considered by users
- dependencies are not added recursively
- dependencies try to be dependency-free themselves to assist with the previous point
- dependencies are not blindly nor automatically updated
- dependencies solve important domain problems, they are not trivial one-line-functions
- dependencies are typically developed and tested by a known team or company, which you trust, not just someone random
- binaries can be signed
- support contracts are a thing
- etc etc etc...
I believe these points

> dependencies are carefully considered by users
> dependencies try to be dependency-free themselves to assist with the previous point
> dependencies solve important domain problems, they are not trivial one-line-functions
> dependencies are typically developed and tested by a known team or company, which you trust, not just someone random

would be solved by the parent comment's proposed standard library.
I believe there was a time when C++ did not yet have a standard library. But now it does. JavaScript should have a standard library, not "Deno".
In the JVM you can use the SecurityManager [1] and limit file access and access to similarly sensitive areas. If you want, you can fully guarantee that nothing is accessed randomly.
Of course that builds on the JVM not having a zero-day bug.
Was this all true of NPM as well?
That's not actually completely right... there are problems with deploying a single jar. With Java 9 modules, you're throwing away module encapsulation if you deploy an uberjar. The current state of the art is to deploy the whole app plus the JVM in a jlink image, which requires no uber jar.
Deno -> Java
Runtime security options -> Security Manager.
URL based packages with simple HTTP-> Maven works same way.
Bigger standard library -> Java's is huge.
Types -> Java, yes.
Single executable -> Fatjars.
All the features mentioned in this article have been in Java for over a decade. It's a relief to see a JS runtime that finally gives in to enterprise niceties. Us Java devs like to crap on JS for reasons besides being "boomers": the features Deno brings were all real reasons to use Java instead of JS up until this point.
Now if they would only fix threading, I would consider Deno/JS a real contender for backend dev.
This makes me think: is a file-based code-split strategy enough?
What if we'd put code (not data) behind GraphQL and request only the actual piece of code we need to use?
import { foo } from 'https://graphql/{ give: { me: { foo }}}'
One can simply say, 'Deno is like Node but done right'.
In 2020, Deno is just a stone's throw from `ts-node server.ts`.
And the permissions system lacks the granularity to be useful.
FYI, I read the whole article without JS fine, thanks for making the site work without it even if it says it doesn't.
Now he's fighting his own monstrosity.
Anecdotally, there are several languages (French and "Argentinian" Spanish, to name a couple) where it's common to rearrange the syllables of words backward-ish, more often than not for slang (argot in France, lunfardo in Argentina).
Another nitpick which may be totally invalid and I'm open to being educated on this. It seems odd (i.e. inaccurate) that, at the end, the myProgram.bundle.js is being called an "executable binary." It's just minified JavaScript. Is it a binary because it has been minified and optimized, or is all JavaScript suddenly considered "binary?"