I've wondered why the frontend community hasn't gotten together and said, "The next version of JavaScript is TypeScript!" I've been using TypeScript professionally for five years now and cannot overstate how much easier it has made large frontend (not just web, but mobile and desktop) projects. Surely enough thought and work has been put into TypeScript to make it the next standard.
There's not really much positive value in having browsers run TypeScript natively. The main feature is static checking, but static checking doesn't benefit end users. When I go to my bank's website and their front-end code has a type error, it's not like I can fix it right then and there.
The type system is mostly a developer-time feature, so it makes sense to leave it out of the core runtime environment.
In other words, think of JavaScript/ECMAScript more like the architecture that browsers support. That needs to be slow-moving since it's deployed across billions of devices. TypeScript then just targets that.
Adding TypeScript directly to JavaScript would improve the world to about the same degree that adding C++ features directly to x64 machine code would.
As a developer, in the case of an API change that violated my assumptions, I would personally prefer my applications to fail hard at the point of the API call, rather than have my scripts run merrily on and only error in some other code far away from the root cause when one of those assumptions fails.
However, I have a lot of hope that runtime type checking based on auto-code generation around TypeScript's interfaces could be developed in a future version of TypeScript.
Recently, I've been using Zod [2] and find it to be a satisfying equivalent: you define a schema, and then you get both a TS type AND a JS parser/validator (which works as a TS typeguard).
[1] https://github.com/microsoft/TypeScript/wiki/TypeScript-Desi...
[2] https://github.com/colinhacks/zod
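The shape of that pattern can be sketched without the library; here is a hand-rolled runtime validator that doubles as a TS type guard (illustrative only, and far less capable than Zod):

```typescript
interface User {
  id: number;
  name: string;
}

// A runtime check that also narrows the static type
function isUser(value: unknown): value is User {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.id === "number" && typeof v.name === "string";
}

const input: unknown = JSON.parse('{"id": 1, "name": "Ada"}');
if (isUser(input)) {
  console.log(input.name); // input is typed as User inside this branch
}
```

What makes Zod satisfying is that it generates both halves, the User type and the runtime check, from a single schema definition.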
You could argue that that's not the best kind of language for users. I wouldn't disagree. I work on Dart which used to be optionally typed but now has a fully sound type system with runtime checks.
But that's orthogonal to whether browsers should support TypeScript directly. If they supported a statically typed language that had the runtime checks to be sound, that might be a great language, but it wouldn't be TypeScript.
ES4 was the first shot at adding types to JavaScript, and it failed because of how big its ambitions were. I'm not sure if there is any more appetite for adding types to JS, though.
In very large, demanding applications like a 3D editor, you can fall back to WebAssembly and use your favorite typed language. For smaller apps, TypeScript is good enough. This way JavaScript stays simple and lean.
There are a few languages where types exist only at compile time (for the most part, though with exceptions and hacks here and there), like Rust, C++, or even Haskell, IIRC.
try {
  …
} catch (e) {
  if (e instanceof FooError) {
    …
  } else if (e instanceof BarError) {
    …
  } else {
    throw e;
  }
}
There was once a Mozilla extension (https://web.archive.org/web/20200111091805/https://developer...) that allowed you to abbreviate the above to:

try {
  …
} catch (e if e instanceof FooError) {
  …
} catch (e if e instanceof BarError) {
  …
}
It was never standardized, but since it's just syntactic sugar, if there were demand, it could be standardized without bringing in an entirely new type system.

There would be at least two problems with using TypeScript types for this. First, TypeScript's types are unsound in a number of intentional and unintentional ways, meaning that the compile-time and runtime types can disagree even in fully typed code. Second, TypeScript can express many types that cannot be tested at runtime; for example, there is no way to ask a function at runtime whether it accepts a string as an argument, or to guess the inner type of an empty array.
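A small illustration of both points, assuming default compiler options:

```typescript
// Unsoundness: an out-of-bounds index is statically typed number,
// but holds undefined at runtime
const xs: number[] = [];
const first = xs[0];
console.log(typeof first); // "undefined", despite the static type

// Erasure: there is no runtime way to ask f what argument types it accepts
const f: (s: string) => number = s => s.length;
console.log(typeof f); // just "function"; the signature is gone
```

So a `catch (e if e: FooError)` construct built on TS types would have nothing reliable to check against at runtime.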
Two things. First, TS conceives of itself as having no runtime component. If it did, I think people (including the TS devs) would be more confused.
Second, I'd say rather that we need a runtime type system. In fact I've tried my hand at writing one in the most minimalist way possible, and have been working on it recently [1]. The type system is explicit in that a type is a JSON-like object, similar to JSON Schema, but with 100x less code.
[1] https://github.com/javajosh/simpatico/blob/master/friendly.h... This is effectively the test harness for the module.
It's a nice feature request but there's no one feature that makes a programming language "real".
https://github.com/microsoft/TypeScript/blob/main/LICENSE.tx...
(Here’s the specification for ECMAScript, for example: https://tc39.es/ecma262/)
To be clear, I am not bashing Microsoft here—just pointing out a reason that TypeScript can’t be declared “the next version of JavaScript”, which is the context of this thread.
It's very much designed to be a layer of abstraction over JS.
I'm as much a fan of TS as anyone; however, it's not likely ever going to become the thing that it's really close to being.
[1] https://github.com/Microsoft/TypeScript/wiki/TypeScript-Desi...
From the link I see this example:
In src/deps.ts:
// Add a new dependency to "src/deps.ts", used somewhere else.
export { xyz } from "https://unpkg.com/xyz-lib@v0.9.0/lib.ts";
Then essentially a create/update lock-file command is run.
Then the lock file is checked into version control.
Then another developer checks it out and runs a cache reload command.
As you mentioned, in practice it's definitely a bit too manual, but it should be one of those things that can be automated, so it's not the end of the world. Having said that, I think having it in the import syntax would provide a few benefits:
1. No extra steps need to be run & hopefully IDEs could auto-complete the hash.
2. Would hopefully be standardized with the browser allowing for native browser support as well (or perhaps lock.json could be standardized with something like import maps)
3. Having it right there provides an extra level of assurance that the integrity hash is going to be used (especially in files intended to be used in the browser and in deno ... not sure how common that is though).
I think they're much better as a part of import maps, and later fetch maps so they can apply to non-JS resources like CSS.
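If I recall correctly, an integrity section was later added to the import maps proposal, roughly this shape (the URL is taken from the example above; the hash is illustrative and truncated):

```json
{
  "imports": {
    "xyz": "https://unpkg.com/xyz-lib@v0.9.0/lib.ts"
  },
  "integrity": {
    "https://unpkg.com/xyz-lib@v0.9.0/lib.ts": "sha384-..."
  }
}
```

This keeps the hashes in one generated file rather than scattered across import statements, which is the trade-off being discussed here.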
Cycles are definitely an issue; I'm not sure there is even a way to work around that, except to pull the cycles apart (which may not always be possible, but is usually good programming practice when it is). At the library level, however, libraries tend not to circularly import each other. If it's being done inside a project, the build tool would be generating it, so dealing with the module graph being invalidated may not be a hassle (or even necessarily a bad thing). In that case the tool could modify the files, or it could generate a lock file / import map. I agree the latter has the benefit of not forcing every source file to be transpiled, but some of that probably still has to happen for module reloading anyway (e.g. appending search parameters to the module path for cache invalidation during development, as Vite does). And realistically, given the nature of the ecosystem, some transpilation is going to have to happen regardless, either because of .ts or just because of browser differences.
For a top-level deps.ts / deps.js file pattern there probably won't be any cycles. That pattern is to declare a root deps.js file for your project that locks things down and re-exports from third-party libraries, and to use the exports from that as the basis for other imports. For this pattern I think SRI would be extremely helpful and add enough benefit to justify it (even though SRI may not be used in the cases you listed).
Also for smaller projects or main modules having the SRI hash inline is really helpful.
Eich was pushing for them in 2011 and they still haven't arrived.
The proposal is humming along through the stages at a good pace.
They don't; using libraries that guarantee runtime immutability has a heavy performance cost in JS currently.
As this is an engine feature rather than a spec thing, there is nothing me (or any other TC39 delegate) can do.
Google and Mozilla simply chose to ignore the spec.
Allows you to contain temporary variables to where they are needed, rather than having them remain active through the remainder of the current scope. You could do that with immediately invoked function expressions, but they are too verbose to be really viable for this. You could also use extra functions, but extra functions don't always make sense. Currently I'm frequently doing this:
Without do expressions:

let result;
{
  let tmp = 123;
  result = tmp * 2;
}

With do expressions it would become this:

let result = do {
  let tmp = 123;
  tmp * 2;
}

I believe an Electron alternative is an important part of the Deno stack, so hopefully we'll ship a first iteration next year.
Then there are others of us who have been saying no to Node and legacy JS for years. We have no such legacy to maintain and no intention of ever creating any. But some of us (at least I) would reconsider platforms built from scratch on a new TypeScript foundation rather than layered on a pre-ES6 foundation. That would include a Deno-based Electron. You might have more luck converting people who don't use Node than getting Node users to abandon their legacy.
The state of cross-platform desktop apps is terrible. All attention is on mobile, and desktop OS makers have almost zero interest in supporting cross-platform desktop apps. (MS cares a little more than zero, Apple less than zero and barely tolerates their own Mac-only developers.) Only something browser/Chromium based seems realistic for the next few years.
On the server, there are a lot of alternatives to Node that are considered better by (and very popular with) large segments of the market. Deno will be one of them, I think. But for cross-platform desktop apps, Electron would be rejected completely if the alternatives weren't so bad and unlikely to get better. A better Electron, despite its inherent problems, could end up more popular than server-side Deno. Just a thought.
This may help development heavily, and would also make the browser and Deno nearly identical environments.
Huge fan of the achievements that Deno has made in recent years. Several questions:
How do you aim to promote Deno as a viable alternative to Node, considering its significant network effect and legacy?
Do you believe that Deno's 'more sane' defaults for security will appeal to developers in the long term? Do you think that the front end community will be receptive to these defaults?
Choosing TS as your de jure language, do you think that you will alienate any dyed-in-the-wool JS devs? Can we expect that TS is now effectively the superset of JS going forward?
Do you believe Deno's lack of support for NPM-style package management will result in cleaning up the frontend community's over-reliance on 'leftpad'-style packages? Do you think that Deno's approach to dependencies fosters a more considered approach to transitive dependency bloat?
Again, huge fan of Deno, and happy to hear about this announcement.
This is a great question, but not one I can answer in a small HN comment :-). I may write a blog post about it one day. The core of the argument is that Deno can save you an insane amount of time / discussion (OOTB linting, formatting, testing, standard library, etc). It aims to unify the ecosystem into a single style, like in Go.
> Do you believe that Deno's 'more sane' defaults for security will appeal to developers in the long term? Do you think that the front end community will be receptive to these defaults?
I think many developers do not care about permissions, and will not in the future either. This is a problem, but not something that can be tackled overnight. Security is often not emphasized enough in our industry, unfortunately. Because of this I think sane defaults and opt-ins are good: they push people to think about security at the most basic level. Maybe the log4shell attack will also show people that it is a good idea to sandbox server-side scripts aggressively (something we have been pushing for), to prevent large-scale system takeovers through a single vulnerable entrypoint.
> Choosing TS as your de jure language, do you think that you will alienate any dyed-in-the-wool JS devs? Can we expect that TS is now effectively the superset of JS going forward?
There is work being done on this. I don't have too much to share right now, but expect some updates on this early next year. JS has to evolve to support some form of type annotations first class to stay relevant.
> Do you believe Deno's lack of support for NPM-style package management will result in cleaning up the frontend community's over-reliance on 'leftpad'-style packages? Do you think that Deno's approach to dependencies fosters a more considered approach to transitive dependency bloat?
Maybe, maybe not. I think it is still too early to tell. I do think that so far it is looking like it: people seem to be doing less weird stuff like "leftpad" with Deno. Ideally all these little helper modules should just be part of JS directly (hit me up with suggestions!)
> Again, huge fan of Deno, and happy to hear about this announcement.
Thanks, glad you like it :-)
What are you thinking in this regard? Just defining type annotations that can be written but will not necessarily be type-checked at runtime? Having them serve as inputs to the interpreter for optimisations? Or even breaking at runtime if values don't match their annotations?
The medical profession, aviation, even rail transportation [1] have all progressed past the point where avoidable failures are entirely the responsibility of the individual.
Of course, there was resistance in those fields as well because some considered themselves an "above-average" doctor or pilot who didn't need safeguards, checklists, union rules, or laws. But it empirically improved outcomes.
> Better support for explicit resource management

What does this refer to?
And all this while vanilla JS keeps very energetically absorbing new features from *-Scripts, thanks to TC39 seemingly picking them intentionally?
'clamp' and 'sortBy', as in Lodash.
Set methods for union, intersection, difference, etc.
let bar = [{id: x}, {id: y}, ...]
let foo = {}
bar.forEach(v => foo[v.id] = v)
I'd love to have something like:
let foo = bar.toMap(v => [v.id, v])
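For what it's worth, Object.fromEntries and the Map pairs constructor already get close to this today:

```typescript
const bar = [{ id: "a", n: 1 }, { id: "b", n: 2 }];

// Plain-object index, equivalent to the forEach version above
const foo = Object.fromEntries(bar.map(v => [v.id, v]));

// Or an actual Map, via its array-of-pairs constructor
const fooMap = new Map(bar.map(v => [v.id, v]));

console.log(foo["a"].n);         // 1
console.log(fooMap.get("b")?.n); // 2
```

A dedicated `toMap` would mainly save the intermediate `.map(v => [k, v])` step.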
It started as a function composition proposal (using the pipe operator |>), but after a change of leadership it has turned into something very different. We might need another perspective on the current trajectory of this proposal, as in its current form it seems to many in the community that it might take JS in the wrong direction.
Thanks!
await chain(
  Object.keys(envars)
    .map(envar => `${envar}=${envars[envar]}`)
    .join(' '),
  text => `$ ${text}`,
  text => chalk.dim(text, 'node', args.join(' ')),
  colored => console.log(colored),
)
I don’t want to sound too negative, but this seems like a pointless extension that only brings the burden of supporting some syntactic flavor.

The React example is special, though. It feels very helpful in an expression-only context, but in-array ifs and fors would do better there:
<ul>
  {[
    for (v of vs) {
      const text = foo(v)
      yield <li>{text}</li>
    },
    if (vs.length == 0) {
      yield <li>No items</li>
    },
  ]}

An example with Workers: one script might only need to fetch from Backblaze. I’d like to set their host as a whitelisted address, so even if a log4j-type vuln happens, it can’t go anywhere except Backblaze.
I think this could even work in browser-land? If you don’t need to pull in any resources outside the original host, deny any fetch made unless it’s added to a whitelist. For browsers this would need to be opt-in for backwards compatibility, but an ideal state would be opt-out (to allow all).
[0]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co...
Of course, a lesser form of the vulnerability -- data leaks rather than RCE -- would still be possible. I agree that being able to restrict outbound traffic would be useful to mitigate that.
As a hack that works now, you could monkey-patch `fetch()` to intercept calls and deny them based on URL.
(I'm the tech lead of Cloudflare Workers.)
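A minimal sketch of that monkey-patching hack (the allowed hostname is illustrative):

```typescript
// Hypothetical allowlist wrapper around the global fetch
const allowedHosts = new Set(["api.backblazeb2.com"]);
const realFetch = globalThis.fetch;

globalThis.fetch = (input: RequestInfo | URL, init?: RequestInit): Promise<Response> => {
  const url = new URL(input instanceof Request ? input.url : input);
  if (!allowedHosts.has(url.hostname)) {
    // Reject before any connection is attempted
    return Promise.reject(new Error(`fetch blocked: ${url.hostname}`));
  }
  return realFetch(input, init);
};
```

Of course, code that can patch `fetch` can also re-patch it, which is why a native, capability-style mechanism would be more robust.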
Monkey patching is an option of course, but a native solution would be nice.
But speaking more broadly, do you have any examples of this kind of behavior being defined at the language specification level (and not in a platform API)? I can't think of any presently.
It seems problematic for a number of reasons, but if there's other examples to work backwards from that might be helpful for me to grok how this would work in a general sense.
> Better support for explicit resource management
+1
Since everyone is making feature requests, I'd like to point out `ArrayBuffer.transfer`[1]: the ability to effectively move data without copying would do wonders for low-level/high-performance code in JS.
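Not the proposal's API, but the same move-without-copy behavior is already observable today via structuredClone's transfer list:

```typescript
const buf = new ArrayBuffer(8);
new Uint8Array(buf)[0] = 42;

// The bytes move to the clone without copying; the original is detached
const moved = structuredClone(buf, { transfer: [buf] });
console.log(new Uint8Array(moved)[0]); // 42
console.log(buf.byteLength);           // 0, the source is now detached
```

A first-class `transfer` method would make this pattern direct instead of piggybacking on the cloning machinery.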
Great news! I wrote an open source library called axax that adds a number of utility methods to async iterators - map, filter etc.
I think having them as part of the language would be awesome.
Standardising async cancellation would be neat too if Deno wants a challenge....
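For illustration, a hand-rolled version of that kind of helper (not axax's actual API) is tiny:

```typescript
// Async-iterator map: applies fn to each value as it arrives
async function* map<T, U>(
  fn: (value: T) => U,
  iterable: AsyncIterable<T>,
): AsyncGenerator<U> {
  for await (const item of iterable) yield fn(item);
}

async function* nums() { yield 1; yield 2; yield 3; }

const out: number[] = [];
for await (const n of map((x: number) => x * 2, nums())) out.push(n);
console.log(out); // [2, 4, 6]
```

Having these in the standard library would avoid every project growing its own slightly different versions.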
Whatever happens, please never give in to any misguided pushes to support CommonJS/AMD/UMD or any of the other non-standardised, disastrous module formats that make Node and npm etc. so painful! It's only very recently that modern build tools have managed to overcome such poor foundations...
Asset References: https://github.com/tc39/proposal-asset-references
Alternative module reflections (wasm imports): https://github.com/tc39/proposal-import-reflection
The main problem to me is this push toward ESM, and I don't know what it brings me. I understand it's a frontend thing, so I'm not sure why Node.js and npm need to be impacted.
- It has language syntax for importing and exporting instead of relying on an implicit global.
- It is asynchronous, allowing for top level await.
- It is reliably statically analyzable.
- Because it is asynchronous, module asset loading can happen in parallel which can mean a significant startup speedup for larger projects.
- It is standardized, so it behaves the same across Node, Deno, the web, bundlers, linters, and other tooling.
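A tiny illustration of the top-level await point (this is only legal when the file is loaded as a module):

```typescript
// In CommonJS this would be a syntax error outside an async function;
// in an ES module it just works (the resolved value stands in for any async load)
const config = await Promise.resolve({ port: 8080 });
console.log(config.port); // 8080
```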