I’m not saying Million is that tool, just that such dismissal of the problem addressed rings a little hollow.
I’m sure I’ll get a lot of downvotes for that.
But after 10 years of using React, most performance issues I've seen were really just poor state management in the wrong locations, not the virtual DOM holding it back.
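As a toy illustration of "state in the wrong location" (plain JS, no React; the counters just stand in for component render calls, and none of these names are real React APIs): with state owned at the root, every change re-runs every child, while colocated state re-runs only its owner:

```javascript
// Illustrative only: render counters simulate React re-renders;
// none of these function names are real React APIs.
const renders = { header: 0, list: 0, footer: 0 };

// Anti-pattern: search `query` state owned by the root component,
// so changing it re-renders the whole tree.
function renderAppWithTopLevelQuery(state) {
  renders.header++; // Header shows the query box
  renders.list++;   // List doesn't use query at all, but re-renders anyway
  renders.footer++; // Footer doesn't either
}

// Fix: colocate the query state inside the header; only it re-renders.
function renderHeaderOnly() {
  renders.header++;
}

renderAppWithTopLevelQuery({ query: 'a' }); // one keystroke, 3 renders
renderHeaderOnly();                         // one keystroke, 1 render
console.log(renders); // { header: 2, list: 1, footer: 1 }
```

Same virtual DOM either way; the difference is purely where the state lives.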
Can you expand on that? I've never hit perf issues with React, but I've been curious how it commonly happens.
This sounds very hand-wavy. What does it mean to "use a compiler to directly update dynamic nodes"?
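Roughly, the idea (a hedged plain-JS sketch; `compileView` is a made-up name and plain objects stand in for DOM nodes, so this is not Million's actual API or internals): a compile step records direct references to the dynamic "holes" in a template, so an update writes straight into them instead of diffing the whole tree:

```javascript
// Illustrative only: plain objects stand in for DOM nodes, and
// compileView is a hypothetical name, not Million's real API.
function compileView() {
  // The static structure is built once.
  const root = { tag: 'li', children: [{ text: '' }, { text: ' x ' }, { text: '' }] };
  // The "compiler" knows which nodes are dynamic and keeps direct refs to them.
  const holes = [root.children[0], root.children[2]];
  return {
    root,
    // O(number of dynamic nodes) per update, not O(tree size):
    update(name, count) {
      holes[0].text = name;
      holes[1].text = String(count);
    },
  };
}

const view = compileView();
view.update('apples', 3);
console.log(view.root.children.map(n => n.text).join('')); // "apples x 3"
```

The static parts of the tree are never touched again after the first render.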
https://www.reddit.com/r/javascript/comments/x2iwim/askjs_mi...
Around 2 yrs ago I messed up benchmarks with Million v1. I'm sorry about putting out false information. Once the Reddit post came out I stopped working on/advertising Million entirely. I then spent 3 months redesigning the entire library. I tried my best to make it as fast and as accurate as possible. Benchmarks are real now, see here: https://krausest.github.io/js-framework-benchmark/current.ht...
"Lie" is a bit edgey but I think adults should be able to stomach a little sourness instead of, ironically, accusing people of dirt and malice.
Lying is a deliberate action. The thread you link to seems rather to expose accidental bad benchmarking and poor communication/explanation of results, probably in good faith and due to inexperience.
Real world applications are mostly deep trees of stateful components.
Seems like these are two conceptually similar things.
> React traverses the virtual DOM tree to update the UI, resulting in O(n) time complexity.
That's the worst case, on initial load. On most UI changes, nothing stops React from updating only local portions of the tree: the elements whose state actually changed.
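A toy sketch of that bailout (plain JS, illustrative only; React's actual reconciler is far more involved): a diff that skips any subtree whose reference is unchanged visits only the changed path, not all n nodes:

```javascript
// Illustrative only: a toy diff that bails out on unchanged subtree
// references, roughly what React does for identical/memoized elements.
let visited = 0;

function diff(prev, next) {
  if (prev === next) return;  // unchanged subtree reference: skip it entirely
  if (!prev || !next) return; // insertion/removal handling omitted in this sketch
  visited++;
  const len = Math.max(prev.children.length, next.children.length);
  for (let i = 0; i < len; i++) {
    diff(prev.children[i], next.children[i]);
  }
}

const leaf = () => ({ children: [] });
const a = { children: [leaf(), leaf()] };
const b = { children: [leaf(), leaf()] };
const oldRoot = { children: [a, b] };

// The new tree reuses `a` and b's second child by reference;
// only one leaf actually changed.
const newB = { children: [leaf(), b.children[1]] };
const newRoot = { children: [a, newB] };

diff(oldRoot, newRoot);
console.log(visited); // 3: the root, newB, and the single changed leaf
```

Out of seven nodes, only three are visited, because unchanged subtrees are shared by reference between the old and new trees.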
Educated guess: in both Million's and React's case, I think the major bottleneck is the browser's re-layout mechanism, not the JS side.
In practice, it still re-renders a lot. It’s easy to get a significant performance increase by not using React (usually at least one order of magnitude) - browsers have improved a lot and what looked like an optimization for IE6 is largely overhead now.
Neither framework will fix a programmer's errors.
This:

    let i = 0;
    for (const child of container.children)
      child.innerHTML = getHtmlFor(i++);

will always be slower than this:

    let html = "";
    for (let i = 0; i < N; ++i)
      html += getHtmlFor(i);
    container.innerHTML = html;

N transactions versus one transaction. I mean that making N separate "small" DOM access calls is not always faster than one integral update.
Edit: My comment is probably not accurate. Please ignore what I said.
Solid doesn't use a virtual DOM like React. Solid is close to Svelte; Vue sits in the middle, leaning towards Solid and Svelte today, and will be alongside them once Vapor comes out; React is on the other end.
Solid and React do feel similar to the developer because they both use JSX and similar APIs, but they aren't compatible.
Theoretically, could this be merged in the main React project or would this break something?
Nobody has ever complained about their app feeling better to use because it performs better.
Users do not sit there like “man, I really wish the web was SLOWER”.
Additionally, assuming this lives up to the claims, or even lives up to a quarter of the claims, then the optimization is, by definition, NOT premature. Premature optimization is the act of optimizing before you even know if something is slow, or before you measure.
I suppose you are probably working under the Functional Programming definition of “premature optimization” where they tell you to never measure (because it just makes FP look bad).