yt-dlp was also the first application that came to my mind. I've got my fingers crossed for this integration. It was interesting to learn how to hijack my own cookies but, nonetheless, rather uncomfortable, to say the least.
I also ran into some weird issues where sometimes the binary isn't executable and you have to chmod +x it - including in GitHub Actions workflows. I had to work around it like this: https://github.com/simonw/denobox/blob/8076ddfd78ee8faa6f1cd...
  - name: Run tests
    run: |
      chmod +x $(python -c "import deno; print(deno.find_deno_bin())")
      python -m pytest

  uvx deno --version

One-liner to run Deno without a separate step to install it first.

The wheel comes in five flavors: https://pypi.org/project/deno/#files - Windows x86, manylinux x86 and ARM64, macOS x86 and ARM64.
That's a lot of machines that can now get a working Deno directly from PyPI.
The yt-dlp project also raised concerns that the manylinux wheel incorrectly advertises older glibc support.
... but I always bristle a bit at the "one-liner to run without installing" description. Sure, the ergonomics are great, but you do still have to download the whole thing, and it does create a temporary installation that is hard-linked from a cache folder that is basically itself an installation.
However, other comments make it sound like a bunch of other projects have discovered that PyPI is a good distribution channel. Which, to me, sounds like using the Internet Archive as your CDN. Is PyPI the next apt/yum/brew or what?
(I hope this doesn't become a pattern that puts excessive pressure on PyPI. IMO it should only be used for things that are specifically known to be useful in the Python ecosystem, as a last resort when proper Python API bindings would be infeasible or the developer resources aren't there for it. And everyone should keep in mind that PyPI is just one index, operating a standard protocol that others can implement. Large companies should especially be interested in hosting their own Python package index for supply-chain security reasons. Incidentally, there's even an officially blessed mirroring tool, https://pypi.org/project/bandersnatch/ .)
Lots of packages rely on other languages and runtimes. For example, tabula-py[1] depends on Java.
So if my-package requires a JS runtime, it can add this deno package as its own dependency.
The benefit is that consumers only need to specify my-package as a dependency, and the Deno runtime will be fetched for free as a transitive dependency. This avoids every consumer needing to manage their own JavaScript runtime/environment.
If you get Deno from the system package manager, or from deno.com directly, you're more constrained. It seems you can set an environment variable to control where the installer from the Deno home page puts the binary, but then you still need to make your Python program aware of that path.
Whereas a native Python package can (and does, in this case, and also e.g. in the case of `uv`) provide a shim that can be imported from Python and which tells your program what the path is. So even though the runtime doesn't itself have a Python API, it can be used more readily from a Python program that depends on it.
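A rough sketch of what that shim buys you (using the find_deno_bin() helper shown in the workflow snippet above; the `eval` invocation and the script contents are only illustrative):

    # Minimal sketch: my-package declares `deno` in its own dependencies
    # (e.g. dependencies = ["deno"] in pyproject.toml) and then locates the
    # bundled runtime at run time -- no separate Deno install step needed.
    import subprocess

    import deno  # the PyPI package that ships the Deno binary

    deno_bin = deno.find_deno_bin()  # absolute path to the bundled executable

    # Run a throwaway JavaScript snippet with the bundled runtime.
    result = subprocess.run(
        [deno_bin, "eval", "console.log(1 + 1)"],
        capture_output=True,
        text=True,
        check=True,
    )
    print(result.stdout.strip())  # -> 2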
Also, it's VERY convenient for companies already using Python as the primary language, because they can manage the dependency with uv rather than introducing a second package manager for devs. (For example, if you run Deno code but don't maintain any JS yourself.)
But at the individual project level this definitely isn't new. Aside from the examples cited in https://news.ycombinator.com/item?id=46561197, another fairly obvious example of a compiled binary hosted on PyPI is... uv.
Years ago, there were some development tools coming out of the Ruby world – SASS for sure, and Vagrant if I remember correctly – whose standard method of installation was via a Ruby gem. Ruby on Rails was popular, and I am sure that for the initial users this had almost zero friction. But the tools began to be adopted by non-Ruby-devs, and it was frustrating. Many Ruby libraries had hardcoded file paths that didn't jibe with your distro's conventions, and they assumed newer versions of Ruby than existed in your package repos. Since then I have seen the same issue crop up with PHP and server-side JavaScript software.
It’s less of a pain today because you can spin up a container or VM and install a whole language ecosystem there, letting it clobber whatever it wants to clobber. But it’s still nicer when everything respects the OS’s local conventions.
Golang has really fast compile times, unlike Rust, and it cross-compiles easily (usually; yes, I know CGo can be considered a menace).
Golang binary applications can also be installed rather simply.
I really enjoy the golang ecosystem.
PyPI: https://pypi.org/project/deno/
GitHub: https://github.com/denoland/deno_pypi
(Note that the GitHub link in the first post of the issue linked by this HN post now redirects to the official location, as of the time I write this.)
You end up with old versions as default installs that are hard to upgrade.
Other cool tools you can install from PyPI (see the sketch after the list):
1. https://pypi.org/project/cmake/
2. https://pypi.org/project/ninja/
3. an entire c/c++/zig toolchain: https://pypi.org/project/ziglang/
4. the nvcc cuda compiler: https://pypi.org/project/nvidia-cuda-nvcc/
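As a rough illustration of why this is handy, here is a minimal sketch assuming the cmake and ninja wheels above have been installed (e.g. `pip install cmake ninja`) and that ./myproj with its CMakeLists.txt is a hypothetical stand-in for a real project:

    # Sketch only: uses the cmake and ninja executables that the PyPI wheels
    # place on the environment's PATH, so no system toolchain packages are needed.
    import subprocess

    # Configure with the Ninja generator, then build.
    subprocess.run(["cmake", "-S", "myproj", "-B", "build", "-G", "Ninja"], check=True)
    subprocess.run(["cmake", "--build", "build"], check=True)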
Some time ago, someone made npm packages that install fonts via npm and use the CDN system it provides to serve them.
I think it's more private than many competitors out there. A commonly suggested Google Fonts alternative is coollabs, which uses Bunny CDN under the hood, but using npm's infrastructure, which is usually provided by Cloudflare, is another great idea as well.
Also, you're forgetting that these registries benefit from economies of scale.
And PyPI isn't where they distribute the official version of Deno, nor is it the only way they distribute Deno. If it were, you could say that would burn through a lot of bandwidth goodwill, but the current use case probably amounts to tens, or at most hundreds, of gigabytes per day. It's just a distribution method for environments where Python is usually already installed, and it simplifies installing Deno a lot; there are some genuinely beneficial use cases driving it, including, recently, yt-dlp.
It's a good idea, for what it's worth.
For context, jsDelivr delivered 20,572 TB of data per month for free.
I genuinely think Deno's Python package might not even reach 100 GB of data per month, and even that is a stretch. Python CUDA modules are usually the largest bandwidth eaters, imho.
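To put rough numbers on that, a back-of-the-envelope sketch using only the figures above (the 100 GB/month cap is my guess, not a measured number):

    # Rough arithmetic: how the assumed Deno-on-PyPI traffic compares to the
    # jsDelivr figure quoted above.
    jsdelivr_gb_per_month = 20_572 * 1_000   # 20,572 TB expressed in GB
    deno_pypi_gb_per_month = 100             # deliberately generous assumed cap

    share = deno_pypi_gb_per_month / jsdelivr_gb_per_month
    print(f"{share:.6%} of jsDelivr's monthly volume")  # ~0.000486%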
All in all, this is a valid implementation/idea. The abuse-of-goodwill complaint doesn't hold much weight.