
As Lee Robinson mentioned, and as I have said before [0], Rust and other compiled languages are the future of tooling. We should optimize tooling for speed, not only for how easy it is to contribute to. We can always have a JS layer for configuration while the core runs in Rust, much as many Python libraries keep the interface in Python while the computation happens in C++.
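To make the analogy concrete, here is a minimal Python sketch of that split: numpy's user-facing API is plain Python, while the array arithmetic is dispatched to compiled native code under the hood. The array size is arbitrary; the point is only where the work happens.

```python
import numpy as np

# Interface in Python, computation in compiled native code: the same
# split proposed above for a JS-configured, Rust-powered build tool.
a = np.arange(1_000_000, dtype=np.float64)

# One Python-level call; the million multiply-adds run inside numpy's
# compiled core, not in the interpreter.
b = a * 2.0 + 1.0

# The pure-Python equivalent pays interpreter overhead on every element.
c = [x * 2.0 + 1.0 for x in a]
```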

I was also looking forward to Rome (a linter, formatter, and bundler also built in Rust, but still under development) [1], but it looks like Vercel beat them to the punch.

[0] https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

[1] https://rome.tools



How often does bundling happen, relative to loading the bundle, for 5 seconds of savings to matter?

There's this fairly recent notion that speed rules everything. Perhaps maintainability and releasing non-alpha-quality software could have a day or two in the sun as well?


On a large team it happens quite often: in CI infrastructure, and whenever anyone tests on a local machine.

> There's this fairly recent notion that speed rules everything

Recent? I'd say it's very old. It fell out of fashion as everyone moved to slow dynamic languages like Ruby (for Rails), Python (Django), and JS, because of their perceived productivity gains over typed languages like Java, roughly 15 years ago. The pendulum is simply swinging back the other way now.


> We can always have a JS layer for configuration while the core runs in Rust, much like for many Python libraries, the interface is in Python while the computation happens in C++.

And this behind-the-scenes native code pretending to be Python is the reason most Python stuff I try fails to run on the first try.

Half the time I give up, because I can't get things running even after going down the weirdest error rabbit holes.

No matter whether Debian, Ubuntu, Windows 7/10, Cygwin, WSL2, Raspberry Pi, x64, ARM ...
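For what it's worth, a quick stdlib check can reveal whether an import actually resolves to a compiled extension, which is usually where these platform-specific failures originate. This is a minimal sketch; `is_native` is a name introduced here for illustration, not an existing helper:

```python
import importlib.machinery
import importlib.util

def is_native(module_name: str) -> bool:
    """Return True if the module resolves to a compiled extension
    (.so/.pyd), i.e. native code behind a Python-looking import."""
    spec = importlib.util.find_spec(module_name)
    return bool(
        spec
        and spec.origin
        and spec.origin.endswith(tuple(importlib.machinery.EXTENSION_SUFFIXES))
    )

print(is_native("_ctypes"))  # True on CPython: a stdlib C extension
print(is_native("json"))     # False: pure Python (with an optional C accelerator)
```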


What are you running? I've done multiple large deployments for machine learning work, and they all worked fine: numpy, scipy, pandas, pytorch, etc., which are the main C++-backed libraries I've encountered. I wouldn't expect ordinary libraries to be native, though, since they're usually not speed-critical; Django, for example, I'm pretty sure is pure Python.
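A rough micro-benchmark along these lines shows why those particular libraries need native cores while a framework like Django doesn't: for them, the hot loop is the whole workload. The timings below are illustrative and will vary by machine:

```python
import time
import numpy as np

xs = np.arange(2_000_000, dtype=np.float64)

t0 = time.perf_counter()
native_sum = xs.sum()          # one call into numpy's compiled reduction
t1 = time.perf_counter()

py_sum = 0.0
t2 = time.perf_counter()
for x in xs:                   # the same reduction in interpreted Python
    py_sum += x
t3 = time.perf_counter()

print(f"native: {t1 - t0:.4f}s  pure Python: {t3 - t2:.4f}s")
```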


Turbopack is just a bundler, while Rome is the whole package. I think only Deno strives for the same, but it still can't replace things like ESLint.


Sure, but Rome is being eaten by the other things Vercel has, such as SWC replacing Terser and Babel. Pretty soon all of Rome's niches will already be covered by other tools, I'd wager. I think Rome simply doesn't have a team as large and well-funded as Vercel's, or as those behind other tools like Vite.


I think this could benefit Rome. Both projects are written in Rust, and Rome already has a linter and a formatter; now they could integrate Turbopack, package it behind "rome build", and skip building their own bundler and transpiler.


IMO, the whole idea behind Rome is that it will be one tool that does everything (lint, prettify, minify, build, etc.), so every part of the pipeline stays in sync. Using other tools for parts of it (build, etc.) would go against that philosophy.


Is Rome being used anywhere? I haven't heard about it for a while. Kinda figured it was dead tbh



