Why I’m skeptical of rewriting JavaScript tools in “faster” languages

I’ve written a lot of JavaScript. I like JavaScript. And more importantly, I’ve built up a set of skills in understanding, optimizing, and debugging JavaScript that I’m reluctant to give up on.

So maybe it’s natural that I get a worried pit in my stomach over the current mania to rewrite every Node.js tool in a “faster” language like Rust, Zig, Go, etc. Don’t get me wrong – these languages are cool! (I’ve got a copy of the Rust book on my desk right now, and I even contributed a bit to Servo for fun.) But ultimately, I’ve invested a ton of my career in learning the ins and outs of JavaScript, and it’s by far the language I’m most comfortable with.

So I acknowledge my bias (and perhaps over-investment in one skill set). But the more I think about it, the more I feel that my skepticism is also justified by some real objective concerns, which I’d like to cover in this post.

Performance

One reason for my skepticism is that I just don’t think we’ve exhausted all the possibilities of making JavaScript tools faster. Marvin Hagemeister has done an excellent job of demonstrating this, by showing how much low-hanging fruit there is in ESLint, Tailwind, etc.

In the browser world, JavaScript has proven itself to be “fast enough” for most workloads. Sure, WebAssembly exists, but I think it’s fair to say that it’s mostly used for niche, CPU-intensive tasks rather than for building a whole website. So why are JavaScript-based CLI tools rushing to throw JavaScript away?

The big rewrite

I think the perf gap comes from a few different things. First, there’s the aforementioned low-hanging fruit – for a long time, the JavaScript tooling ecosystem has been focused on building something that works, not something fast. Now we’ve reached a saturation point where the API surface is mostly settled, and everyone just wants “the same thing, but faster.” Hence the explosion of new tools that are nearly drop-in replacements for existing ones: Rolldown for Rollup, Oxlint for ESLint, Biome for Prettier, etc.

However, these tools aren’t necessarily faster because they’re using a faster language. They could just be faster because 1) they’re being written with performance in mind, and 2) the API surface is already settled, so the authors don’t have to spend development time tinkering with the overall design. Heck, you don’t even need to write tests! Just use the existing test suite from the previous tool.

In my career, I’ve often seen a rewrite from A to B resulting in a speed boost, followed by the triumphant claim that B is faster than A. However, as Ryan Carniato points out, a rewrite is often faster just because it’s a rewrite – you know more the second time around, you’re paying more attention to perf, etc.

Bytecode and JIT

The second class of performance gaps comes from the things browsers give us for free, and that we rarely think about: the bytecode cache and JIT (Just-In-Time compiler).

When you load a website for the second or third time, if the JavaScript is cached correctly, then the browser doesn’t need to parse and compile the source code into bytecode anymore. It just loads the bytecode directly off disk. This is the bytecode cache in action.

Furthermore, if a function is “hot” (frequently executed), it will be further optimized into machine code. This is the JIT in action.

In the world of Node.js scripts, we don’t get the benefits of the bytecode cache at all. Every time you run a Node script, the entire script has to be parsed and compiled from scratch. This accounts for a big part of the reported perf gap between JavaScript and non-JavaScript tooling.

Thanks to the inimitable Joyee Cheung, though, Node is now getting a compile cache. You can set an environment variable and immediately get faster Node.js script loads:

export NODE_COMPILE_CACHE=~/.cache/nodejs-compile-cache

I’ve set this in my ~/.bashrc on all my dev machines. I hope it makes it into the default Node settings someday.
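
There’s also a programmatic version of this. If I’m reading the docs right, Node 22.1+ exposes module.enableCompileCache() from node:module, so a CLI can opt in on behalf of its users instead of relying on an environment variable. A rough sketch:

// Rough sketch – assumes Node.js 22.1+, which added module.enableCompileCache().
// A CLI entry point can opt into the compile cache itself, instead of asking
// every user to export NODE_COMPILE_CACHE.
const { enableCompileCache } = require('node:module');

// With no argument, Node picks a default cache directory.
// The returned object describes whether the cache was enabled.
const result = enableCompileCache();
console.log(result.status);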

As for the JIT, this is another thing that (sadly) most Node scripts can’t really benefit from. A function has to run repeatedly before it becomes “hot,” so the JIT is more likely to kick in for long-running servers than for one-off CLI scripts.

And the JIT can make a big difference! In Pinafore, I considered replacing the JavaScript-based blurhash library with a Rust (Wasm) version, before realizing that the performance difference was erased by the time we got to the fifth iteration. That’s the power of the JIT.
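
To make the warm-up effect concrete, here’s a toy benchmark – the function and numbers are made up for illustration, not the actual blurhash code. The first iteration pays for unoptimized execution; later iterations run against JIT-compiled machine code:

// Toy illustration of JIT warm-up (not the real blurhash benchmark):
// time the same function across repeated iterations. On V8, the first
// iteration is typically noticeably slower than the later ones.
function decodePixels(input) {
  let sum = 0;
  for (let i = 0; i < input.length; i++) {
    sum += Math.sin(input[i]) * Math.cos(input[i]);
  }
  return sum;
}

const input = new Float64Array(1_000_000).map((_, i) => i % 255);
for (let iteration = 1; iteration <= 5; iteration++) {
  const start = performance.now();
  decodePixels(input);
  console.log(`iteration ${iteration}: ${(performance.now() - start).toFixed(1)}ms`);
}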

Maybe eventually a tool like Porffor could be used to do an AOT (Ahead-Of-Time) compilation of Node scripts. In the meantime, though, this is still an area where AOT-compiled native languages have an edge over JavaScript.

I should also acknowledge: there is a perf hit from using Wasm versus pure-native tools. So this could be another reason native tools are taking the CLI world by storm, but not necessarily the browser frontend.

Contributions and debuggability

I hinted at it earlier, but this is the main source of my skepticism toward the “rewrite it all in native” movement.

JavaScript is, in my opinion, a working-class language. It’s very forgiving of types (this is one reason I’m not a huge TypeScript fan), it’s easy to pick up (compared to something like Rust), and since it’s supported by browsers, there is a huge pool of people who are conversant with it.

For years, we’ve had both library authors and library consumers in the JavaScript ecosystem largely using JavaScript. I think we take for granted what this enables.

For one: the path to contribution is much smoother. To quote Matteo Collina:

Most developers ignore the fact that they have the skills to debug/fix/modify their dependencies. They are not maintained by unknown demigods but by fellow developers.

This breaks down if JavaScript library authors are using languages that are different from (and more difficult than!) JavaScript. They may as well be demigods!

For another thing: it’s straightforward to modify JavaScript dependencies locally. I’ve often tweaked something in my local node_modules folder when I’m trying to track down a bug or work on a feature in a library I depend on. Whereas if it’s written in a native language, I’d need to check out the source code and compile it myself – a big barrier to entry.

(To be fair, this has already gotten a bit tricky thanks to the widespread use of TypeScript. But TypeScript is not too far from the source JavaScript, so you’d be amazed how far you can get by clicking “pretty print” in the DevTools. Thankfully most Node libraries are also not minified.)

Of course, this also leads us back to debuggability. If I want to debug a JavaScript library, I can simply use the browser’s DevTools or a Node.js debugger that I’m already familiar with. I can set breakpoints, inspect variables, and reason about the code as I would for my own code. This isn’t impossible with Wasm, but it requires a different skill set.

Conclusion

I think it’s great that there’s a new generation of tooling for the JavaScript ecosystem. I’m excited to see where projects like Oxc and VoidZero end up. The existing incumbents are indeed exceedingly slow and would probably benefit from the competition. (I get especially peeved by the typical eslint + prettier + tsc + rollup lint+build cycle.)

That said, I don’t think that JavaScript is inherently slow, or that we’ve exhausted all the possibilities for improving it. Sometimes I look at truly perf-focused JavaScript, such as the recent improvements to the Chromium DevTools using mind-blowing techniques like Uint8Arrays as bit vectors, and I feel that we’ve barely scratched the surface. (If you really want an inferiority complex, see other commits from Seth Brenith. They are wild.)
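
For the curious, the bit-vector trick is roughly this – a simplified sketch, not the actual DevTools code, but it shows the idea of packing one flag per bit into a Uint8Array instead of using a Set or an array of booleans:

// Simplified sketch of a bit vector backed by a Uint8Array
// (illustrative only – not the actual Chromium DevTools implementation).
class BitVector {
  constructor(size) {
    // One bit per flag, eight flags per byte.
    this.bytes = new Uint8Array(Math.ceil(size / 8));
  }
  set(index) {
    this.bytes[index >> 3] |= 1 << (index & 7);
  }
  has(index) {
    return (this.bytes[index >> 3] & (1 << (index & 7))) !== 0;
  }
}

const seen = new BitVector(1_000_000); // a million flags in ~122 KB
seen.set(42);
console.log(seen.has(42), seen.has(43)); // true false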

I also think that, as a community, we have not really grappled with what the world would look like if we relegate JavaScript tooling to an elite priesthood of Rust and Zig developers. I can imagine the average JavaScript developer feeling completely hopeless every time there’s a bug in one of their build tools. Rather than empowering the next generation of web developers to achieve more, we might be training them for a career of learned helplessness. Imagine what it will feel like for the average junior developer to face a segfault rather than a familiar JavaScript Error.

At this point in my career, I’m senior enough that I have little excuse to cling to my JavaScript security blanket. It’s part of my job to dig down a few layers deeper and understand how every part of the stack works.

However, I can’t help but feel like we are embarking down an unknown path with unintended consequences, when there is another path that is less fraught and could get us nearly the same results. The current freight train shows no signs of slowing down, though, so I guess we’ll find out when we get there.

22 responses to this post.

  1. Posted by akbar akkah on October 20, 2024 at 2:54 PM

    Great blog! Great read!

    • Posted by Chris on October 22, 2024 at 11:49 PM

      Oh no, no, no, no… You’re missing the point entirely. It’s not more performant. It’s more portable.

      I’m tired of being forced into installing Node every time I want to use the smallest frontend linter, formatter, or transpiler on a project that otherwise doesn’t depend on Node.

      I want something that has bindings for my language or, at the very least, distributes a single binary. Something that I can package in my language and fire up with my framework when I start the server.

  2. Posted by HN user
  3. Posted by David on October 20, 2024 at 5:49 PM

    I have a tool that I am writing that I would like to write in JavaScript – but will probably end up writing in Rust for a reason you haven’t included here: most developer machines have 8 or 16 cores and JS can only use one of them at a time … so this allows for a 10x speedup for tasks that are well parallelizable …

    • Posted by Nolan Lawson

      Good point! Node does have native support for Workers now, but it’s hard to share memory between them. “Embarrassingly parallel” tasks definitely make a lot of sense to do in Rust.

      Edit: shared structs look like a proposal to solve exactly this!
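
      For anyone curious, here’s a rough sketch of what sharing memory between Node workers looks like today with worker_threads and SharedArrayBuffer – purely illustrative, not code from any particular tool:

      // Rough sketch: sharing a numeric buffer between Node.js worker threads.
      // Sharing arbitrary objects is what the shared structs proposal would address.
      const { Worker, isMainThread, workerData } = require('node:worker_threads');

      if (isMainThread) {
        // A SharedArrayBuffer is shared with the worker rather than copied.
        const shared = new SharedArrayBuffer(8 * Float64Array.BYTES_PER_ELEMENT);
        const view = new Float64Array(shared);
        const worker = new Worker(__filename, { workerData: shared });
        worker.on('exit', () => {
          console.log('filled by worker:', view[0], view[7]); // no copying involved
        });
      } else {
        // The worker writes directly into the shared memory.
        const view = new Float64Array(workerData);
        for (let i = 0; i < view.length; i++) view[i] = i * i;
      }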

  4. Posted by tracker1 on October 20, 2024 at 11:39 PM

    I’m genuinely very mixed on this. I learned to program around when JS started and grew up with it as my main language through the years. I still consider it my strongest language by a wide margin.

    I was thrilled when Node took off and JS tooling was finally getting built with JS. 6to5 (Babel) was a blessing, and the advances to the browser engines have been amazing. I absolutely love a lot of what’s come out of it all and wouldn’t go back to 90s-style dev at all.

    That said, waiting seconds every commit for linting and formatting sucks. The first time I used Biome (then Rome) I was sold. It was blink-of-an-eye fast. While there’s something to be said for being able to work on your own tooling, and many do, not everyone needs to.

    Having a few developer gods among men work on better tools isn’t a bad thing. Relatively few write JS engines. And there aren’t a ton of C compilers in wide use. It’s definitely time for a few winners to rise with the lessons learned. I’m all for it.

    I think the real reason for these faster tools is much the same as why an NVMe drive is now essential for web development: there are so many touch points that all those milliseconds just add up to too much real time.

  5. Posted by t1oracle on October 21, 2024 at 4:12 AM

    I use JavaScript because that’s the only thing that Web Browsers understand. If I could use Rust instead I would. Unfortunately, that just isn’t practical yet.

  6. Posted by Fred on October 21, 2024 at 6:15 AM

    And then there’s the new Deno runtime, which is supposed to make things faster.

  7. Posted by NTR on October 21, 2024 at 10:50 PM

    We need to replace legacy JavaScript tools, apps, and ecosystem utilities with ones written in a modern, more maintainable language, without the now 30 years of legacy slowing down developer productivity.

    The companies and people on the ECMA, W3C, and web standards boards aren’t going to address the overwhelming need for a simpler ecosystem, with simpler tools and a much simpler, consistent main programming language that does not have 30 years of bandages piled on top.

    It needs to be simple: a language with fewer special cases, fewer keywords, type checking, none of the worries of hoisting or self = this, multithreading built in, asynchrony built in, and a proper way to call remote services without the binary -> text -> internet transfer -> text -> JSON -> binary sequence for a one-way call, and then the same series of expensive, buggy, security-lax transformations in reverse for the reply.

    The standards boards, major corporations, software vendors, and framework teams won’t fix any of this within any short number of years.

    For example, there is no built-in way to package an HTML fragment, its CSS formatting, and its JavaScript code in one package and then reuse it on many different web pages without using a supply-chain-attackable set of 500 or more JavaScript packages.

    The development community needs to start a new standards working group without it being co-opted by large companies or existing standards boards who favor more of the status quo.

    Some goals to possibly consider:

    • A much smaller programming language for the web, with simple syntax and not the level of nice-to-have OOP features found in C++, Scala, C#, Java, …
    • Fix the mismatch and large maintenance cost of the existing mess of matching code (JavaScript) to layout (HTML) using magic strings
    • Fix the problem of the web layout using a legacy language (HTML is a variant of XML) instead of a simpler layout language.
    • Fix the problem that any larger web app has to have code, layout, and styling all in different locations, preventing reuse and introducing bugs
    • Fix the problem of everything being a variant data type, with implicit typing and band-aids like “===”.
    • Larger than the rest by far: fix the large amount of code needed just to make a WebAPI call, with all of the HTTP header, body, form, POST/GET/PATCH/DELETE complexity
      • This needs to be in two parts
        • A transport part – Did the API call get to the server? Yes/No. If no, what was the error code and error message?
        • The business part – Was there an error on the server due to a business issue, such as trying to delete a customer who does not exist? Or was it due to an internal error, such as the server process being unable to connect to an external service?
    • Support for non-blocking asynchronous methods and API calls as first-class citizens, without the messy promises, callbacks, self = this, and other costly maintenance problems

    Each of these may be addressable by a small fix or ‘simple’ JavaScript tweak, but the net cumulative effect of them is to lower developer productivity, increase development time, increase short- and long-term web app costs for corporations, and prevent forward progress.

    The JavaScript/HTML/CSS/WebAPI/REST stack, even with React/Angular, is getting older and is much nearer to the status of COBOL than to modern development.

    Requiring developers to be expert-level language lawyers in JavaScript, HTML, CSS, Angular/React, WebAPI, web security, etc. should not be needed.

  8. Posted by Agent47 on October 22, 2024 at 7:23 PM

    Some are due to political reasons. Some are for money. Some are simply for making trouble for no reason.

  9. Posted by JavaScript makes me sad on October 23, 2024 at 9:26 AM

    The sooner we move away from JavaScript the better. The language is horrible and full of pitfalls… the only reason it’s so widespread is because “it’s available on browsers”.

    I’m sorry, but that’s not a good reason at all. By the same token we should all build stuff in BASH/SH because it’s widely available?

    Don’t get me wrong, the language is much much much better than it used to be, but it’s still the same horrible language with a barely-decent facade.

    I was at first optimistic about TypeScript, at least to improve the types situation, but the reality is that it doesn’t go far enough and it doesn’t solve much. In fact it’s often in the way, because types are often either missing or wrong… and they’re not really checked at runtime anyway, so it’s just a developer tool.

    The only good reason to use JavaScript is because “it’s available”. It’s a shame so many people are starting programming with such a horrible language, which makes you do a lot of stupid things and often not even complain about it (the implicit type conversions are a nightmare).

    I’ve used JavaScript for a long time and it still surprises me in really bad ways to this day. I could give you examples, but we are all familiar with it and know very well what I’m talking about.

    The truth is we should have stopped using it years ago instead of starting to use it on the server side and whatnot. It is an abomination. The standard library is still confusing and half-arsed (look at Set, or try to do anything trivial with datetimes).

    I think writing these tools in Rust is a gateway to opening people’s eyes and possibly slowly moving away from JS as much as possible… allowing people to easily and quickly shoot themselves in the foot is not a feature, it’s a bug. I’d rather have the Rust compiler tell me off when I’m trying to do something idiotic… even if it may be a bit frustrating in the short term!

    • Posted by JavaScript makes me sad on October 23, 2024 at 9:44 AM

      …and I haven’t even started to scratch the surface of how horrible the JS/TS ecosystem is!

      Everything is a mess of transpilation and overcomplicated tools with a million options which just get in the way of simplicity (eslint/prettier/jest/jasmine/whatever/esbuild/webpack/tsconfig and what not)…

      And of course every 2 weeks everything changes for the fancy new tool or framework which is guaranteed to solve all problems, is faster, and is supposed to be 99% compatible with the older tool… but guess what, it never is. NPM is a mess and you can easily end up with 5 different versions of the same dependency. Or of course someone borks an 11-line package and brings down civilisation…

      Compare that to rustup/cargo/clippy/rustfmt – it may not be perfect, but you have to experience it to fully appreciate what an overcomplicated mess JS tooling is.

  10. Posted by slidertom

    If we are talking about the browser, TypeScript is also a waste of time and additional useless complexity. On the server side everything changes: in most cases, writing an effective server application using only JavaScript is extremely dangerous and expensive, and sooner or later it will be switched to Rust or C/C++ based solutions, or at least some complex parts will be rewritten in other languages which provide low-level access to SIMD, GPU, and parallel CPU operations to optimize performance.

  11. Posted by Lazar Ljubenović on October 29, 2024 at 12:53 PM

    You’re beyond delusional if you believe the only reason these things are faster is because they’re a rewrite.

    • Posted by Nolan Lawson

      I have no doubt that optimal Rust is faster than optimal JavaScript. And that idiomatic Rust is faster than idiomatic JavaScript. But I think we may have jumped from idiomatic JavaScript to optimal Rust, so I don’t know what would have happened if we had tried for optimal JavaScript first.

  12. Posted by Pupa on November 16, 2024 at 7:48 PM

    This senior JavaScript developer is very bad. After the paragraph about the niche of WebAssembly, you don’t have to read any further.

    A good senior developer keeps track of current trends in his field.

  13. Posted by Patrick Kerschbaum on December 16, 2024 at 3:16 AM

    Prisma ORM is also migrating its core from Rust to TypeScript to allow easier collaboration: https://www.prisma.io/blog/prisma-orm-manifesto#4-enabling-community-extension-and-collaboration

  14. Posted by luk on March 26, 2025 at 2:16 AM

    JavaScript should be banned from the desktop. It is an ABOMINATION that a ‘simple’ desktop app needs 0.5–1 GB of space and RAM!!! whereas waaaaayyyy more complex apps (look at any game, for example) a few years ago needed 1–5 MB….

  15. Posted by luk on March 26, 2025 at 2:20 AM

    “In the browser world, JavaScript has proven itself to be “fast enough” for most workloads.”

    oh yeah, because we have modern super fast 8 or 16 core CPUs to run web sites….

  16. Posted by makalin on April 22, 2025 at 12:16 AM

    I respect your perspective, but rewriting in faster languages can bring long-term performance and scalability benefits that outweigh the initial complexity. Sometimes, the right tool for the job isn’t just about developer convenience, but about pushing efficiency where it really matters.

  17. Posted by mikeschinkel

    Funny how the biggest problem (I have found) with JS-based CLIs was not even mentioned, and that problem is the reason I do everything possible to avoid using anything that needs to be installed with npm install: the absolute dependency hell!

    I have spent probably a year’s worth of total time over the past 25 years trying to fix damn JS apps that are broken when I install them. Sure, it’s “nice” to be able to edit a node_module, but what is a LOT NICER is not having to track down a bunch of incompatibilities every single time you install a JavaScript app!

    OTOH, how many times have I installed a Rust or Go app that didn’t “just work”? Zero.

    Simply put, for anyone who does not work 60+ hours a week in JS, JS CLI apps suck all the joy out of life.

    P.S. If all JS apps start being shipped as single-file executables then I will have a different opinion.
