“ I realize that this article is quite long, and unfortunately this is the short version. Much nuance and history is lost here, even though the post is already long enough that it is destined for the limbo of a bookmarks folder called "to read later". Despite the fact that no one will read more than the next 3 paragraphs…” - pro tip - if you want to make the article shorter and easier to read don’t spend an entire paragraph blabbering about how long it is and how nobody reads anymore. I bet if the author had aimed to be succinct, the article would be half the current size and much quicker to read and understand.
As someone who actively contributes to Python packages using the setuptools CLI, this is jarring. I had no idea. I guess I'll move my build process to some other tool.
I started contributing to Python about a year ago. At that time, all of the results when you googled "how to build a python package" referred to the setuptools CLI with setup.py. I put it in my workflow and, bam, I've been using it ever since without incident.
You don't need to completely change if you don't want to. setuptools will most likely continue supporting the old way for a long time since there is so much Python code out there that will probably never update to the new standards.
Also, it takes only minimal changes to bring a setuptools-based project into the present. You just need to add a `pyproject.toml` file to your project with the following:
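  [build-system]
  requires = ["setuptools", "wheel"]
  build-backend = "setuptools.build_meta"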
Then when it's time to distribute the project, instead of `setup.py sdist` and `setup.py bdist_wheel`, you can just use the official build[1] tool like so:
pyproject-build --sdist --wheel
And that's all there is to it. Once that's in place and if you're feeling frisky, you can try taking advantage of new setuptools features that can allow you to eliminate `setup.py` entirely and specify everything in `setup.cfg`[2] so that all your project's information is static and no code needs to be run when installing it, assuming your project doesn't contain any compiled code.
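For illustration, a minimal declarative setup.cfg might look something like this (the project name and dependencies here are placeholders, not from any particular project):

  [metadata]
  name = myproject
  version = 1.0.0

  [options]
  packages = find:
  install_requires =
      requests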
As an added benefit, by putting pyproject.toml in your projects now, if you want to switch over to another tool like poetry or flit later on, the act of packaging the project doesn't have to change, as long as you update the pyproject.toml file with the new build system. The same `pyproject-build` command will work for all three systems.
Note that with the recent versions of `build` it's recommended not to pass `--sdist --wheel`. Just do:
python -m build
With this invocation it'll produce an sdist and then build a wheel from that tarball, not from your Git checkout, which is much closer to what pip does in the wild. When building both artifacts from a Git checkout, it's possible to mess up the packaging in a way that makes it impossible to produce a wheel from an sdist (which is necessary in many places, including installs from tarballs that pip tries to build wheels from, and downstream packaging in different OS ecosystems).
Sadly there's no easy way to mark all those Google searches out of date... One of the big goals of this project is to spread the knowledge though, so you reading it means success.
I find myself frequently using the date filter in Google when searching programming topics. Especially if I'm looking up an error message, almost anything older than a year is noise.
Once you get into distributing Python packages, it is such a nightmare. Relative imports were also not easy to understand.
And C/C++ extensions in your package make things even worse: CMake for the extension, setup.py for the Python side, if you hope to build cross-platform.
Pybind11 helped a bit, but you have to compile the wheel again for each Python version, even with the manylinux Docker image. Imagine learning Docker just so you can release a wheel!
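Concretely, inside the manylinux container it goes something like this (the /io mount point and the Python versions follow the pypa docs' conventions; your package path will differ):

  # each bundled interpreter builds its own wheel
  /opt/python/cp38-cp38/bin/pip wheel /io -w /io/dist
  /opt/python/cp39-cp39/bin/pip wheel /io -w /io/dist
  # then tag the wheels as manylinux-compatible
  auditwheel repair /io/dist/*.whl -w /io/wheelhouse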
The other language I often use, C++, is no better. I am drawn to Rust and Nim lately (not entirely because of these issues).
There was one way to do things. It was simple, and it worked.
Then python 3 wanted to become JavaScript. Async, Unicode, floating point division. Move fast and break things. No one way to do things, it's however you want!
I had to explain python deployments to a novice the other day. What a nightmare.
Python 3.5 to 3.8 just kept drastically changing things. My first experience wasn't pleasant, and for the first time I needed virtual environments and custom apt repositories to install multiple versions of python.
But it finally seems things have settled a bit. Here's hoping the Python cultural revolution settles down and the language stops trying to be trendy and becomes a dependable one instead.
It's just an opinion, but in exchange for the speed hit you get a huge, decent standard library and lots of high-quality libraries that you can stitch together much more easily than in C.
In the vast majority of my use cases, I'll take the half second load hit in exchange for that (if it's even that long).
Yeah, it's totally dependent on what you're doing. It would be insane to write a simple script in C, and it would be insane to write an operating system in Python. Most code lies somewhere between those two extremes.
I fully admit that the cult of performance is silly - and anyway, it can sometimes be easier to write performant code in a slightly more high-level language that you're extremely adept at writing. For instance, I'd wager most people could write a more performant web server in Go than they could in C, even if they could write a web server in C.
Sorry, I could have been more clear there, and I've updated the post. I have the "curse of knowledge" and took it as a given that you can only do this if your particular build configuration fits in setup.cfg. For me, the fact that you have the option to use a setup.py is one of the biggest selling points of setuptools. For other build backends, it's common to have a great experience when you are on the "happy path" and then have no options when you hit any of a million exceedingly rare deviations. So if I had a complicated build like you have (and I do have some mildly complicated builds where this is the case), I'd look at that bullet point and think, "Ah, that doesn't apply to this, but it's good to know for my simpler projects".
That said, it would be really nice if at least the C-extension part of this worked with the declarative format. My OSS time has been severely curtailed lately and I'm spread way too thin, so I probably won't be able to execute on this, but my long term vision for setuptools has always been that most common build configurations should be achievable using declarative config only, and everything else has setup.py as an escape hatch. This is similar to the way almost all Rust projects are configured with Cargo.toml, but in rare cases you can use build.rs to execute some Rust code as part of the build.
The thing is, I don't think I have an option to use anything other than setup.py. For example, I read:
> and as part of the deprecation of distutils, having setuptools installed at all even when you don't import it can change the behavior of import distutils.
from setuptools import Extension, setup
from distutils.command.build_ext import build_ext

# Per-compiler-type compile and link flags for the extension modules.
copt = {'msvc': ['/openmp', '/Ox', '/fp:fast', '/favor:INTEL64', '/Og'],
        'mingw32': ['-fopenmp', '-O3', '-ffast-math', '-march=native']}
lopt = {'mingw32': ['-fopenmp']}

class build_ext_subclass(build_ext):
    def build_extensions(self):
        # Apply whichever flags match the compiler distutils selected.
        c = self.compiler.compiler_type
        if c in copt:
            for e in self.extensions:
                e.extra_compile_args = copt[c]
        if c in lopt:
            for e in self.extensions:
                e.extra_link_args = lopt[c]
        build_ext.build_extensions(self)
If distutils is deprecated, and setuptools causes problems ... what should I do?
I know setuptools is a "hot mess". It took hours of puzzling to figure out how to compile things correctly, and includes input from other people who spent hours puzzling things out.
So I am ... sad isn't quite right. Depressed? Down? Feeling like an outsider? ... every time I am reminded that I am not one of the "most people" in "works for most people".
While I (and you) are here, is there some way to distinguish between "developer" build environments and "deployment" build environments?
I was looking more at the general topic, trying to figure out how to migrate. I saw this example:
I have two build environments. A development build environment requires Cython (I have hand-written Python/C extensions and Cython extensions). On the other hand, source distributions include the "cythonized" C code from Cython, so people who install and deploy from source don't need Cython. [1]
Is there a way to distinguish these two environments in pyproject.toml?
[1] Other than 'setuptools' and the need for a C compiler, my package has no dependencies when building from a source distribution.
EDIT: P.S. A reason to stick with "python setup.py develop" instead of "pip install -e ." is to avoid that frequent nag to "consider upgrading" pip. ;)
> setuptools will detect at build time whether Cython is installed or not. If Cython is not found setuptools will ignore pyx files.
> To ensure Cython is available, include Cython in the build-requires section of your pyproject.toml:
[build-system]
requires = [..., "cython"]
> When building with pip 10 or later, that declaration is sufficient to include Cython in the build. For broader compatibility, declare the dependency in the setup_requires section of your setup.cfg:
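  [options]
  setup_requires =
      cython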
The only case where I still invoke setup.py directly is "python setup.py test". I already knew it's deprecated, but don't understand what I'm supposed to do instead.
The article recommends tox, but how did tox get installed?
I can currently create and activate a new empty virtualenv, "pip install -U pip", clone and cd into my project, and run "python setup.py test". My test dependencies are automatically set up and my preferred runner (pytest) is used. If I change those in the future, the same commands will keep working because I am configuring those in setup.cfg, etc.
If I am now expected to manually "pip install tox" before testing my project, and to change that in the future if we start using something else, isn't that a step backward to the situation pyproject.toml et al. were supposed to solve, where you need to manually install specific stuff before using a project?
This is at worst a lateral move — you need something installed that handles your test invocation for you. It could be `make`, `tox`, `nox`, `poetry` or whatever.
For a number of reasons (detailed in the article), that thing shouldn't be `setuptools`. Even if the setuptools maintainers wanted to continue supporting this use case, setuptools is uniquely unsuited to working with this, because of the way its hooks don't fire until all its dependencies have been imported already. With `tox` you just need to make sure you have `tox` installed, and you can even specify a minimum version in the `tox.ini` file.
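For example, a minimal `tox.ini` might look something like this (the Python version and test runner here are just placeholders):

  [tox]
  minversion = 3.20
  envlist = py39

  [testenv]
  deps = pytest
  commands = pytest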
Additionally, tox doesn't need to be installed in your Python environment (in fact, it's an implementation detail that it's Python at all — if it were instead a Go executable, it would work the same). You just need the tool available to be invoked. This means you can install it with pipx, in its own virtual environment, through your system package manager, whatever is most convenient for you.
Python is such a joy to write and such a nightmare to set up, package and distribute.
The Python Foundation needs to adopt or create standard and supported ways of doing things. I spend so much time dealing with issues that should have been solved a long time ago. Special shout out to the PyInstaller devs for providing so much value while constantly fighting to deal with upstream changes.
Right, but the community is different than the foundation. I would like an authoritative "this is how you do it" doc, from the foundation, about setup/packaging/dist, and I want it to work. And if said doc advises using some third party tool, then the foundation needs to fund or take over maintenance of that tool so that it continues to work and the "standard" way of doing it doesn't change every year.
There have been like three or four iterations of the "here's the official way to package and distribute python code" document. Here's the latest: https://packaging.python.org/
I've seen that before, but I dunno, it seems kind of anti-python[0] to have an "official doc" that basically says "here's a third-party tool we like, but also here's 5 other tools you could use instead." I want ONE tool, built and maintained by the same people who build and maintain python, so that it always works.
[0]: There should be one-- and preferably only one --obvious way to do it. (from the zen of python)
That’s never been true for packages. Just because someone else did it doesn’t mean you can’t do it better. The goal in the last few years has been to make it possible for anyone to write a build system or tooling, just like any other library in Python. Otherwise there is no innovation.
I maintain a library implemented mostly in python with some simple cpython extension modules for performance. Since you can't run unit tests against an extension module before you've built it, my Makefile invokes setup.py just to get things compiled:
python setup.py build_ext --inplace
I don't see a valid replacement for this invocation, but fortunately, it's the only one.
When I include the library from a project that uses it, the same `pip install` command that brings the library in also performs all the compilation described by my setup.py.
All I'd need to move away from the deprecated syntax entirely is a way to build the module when I'm testing it within its own source directory. Until then I won't be changing anything.
It's less crazy than you think. The foundation historically never had developer employees (today it has just one: a CPython core developer who started 3 months ago). The only way to take over a project and make it the de facto standard would be to have (IMHO at least) 5 full-time employees working on it. That's a big investment the foundation doesn't have, and no corporation has committed to support that (for at least 3-4 years). Also, there's the huge backlash the PSF would have to deal with from people who inevitably don't like the chosen standard.
I used to recommend articles with titles like "modern Python package structure and tooling" to graduates. I stopped because the tools (poetry, pipenv, etc.) were changing so much. I try to fit as much of our workflows as possible into hand-made templates and PyCharm now.
There was a post here a while ago about packaging or virtual environments. Within a few minutes, there were quite a few responses about better ways to do it.
Unfortunately, each response had a different recommendation. I wish I could find that thread again.
We have been talking about how to support customers who want to use python. At least for me it isn't very appealing, seems like it could take a lot of attention to keep up with what's expected.
The difference between the community adopting a "standard" and something becoming the standard/being included into the language by direct decree is enormous. Especially with Python's philosophy.
Nowadays I use Poetry and PyInstaller, which make things easier.
For some of my projects, I produce a "compiled"/bundled executable that takes all the pain out of distribution. Here's my config, maybe it will be useful to you:
I’ve noticed Poetry being used more and more for projects I set up locally, and I must say, it does seem to be very, very good. Minimal overhead, easy to adopt, etc. I’ll need to set up a proper workflow with it for my current projects. Being able to abstract dev/prod environments in particular is really handy.
I'm sort of the Python hero at work, and, for me, the one supremely magical thing that Poetry brings to the table is that I am finally able to spec out a Python development workflow that I can easily document and explain to team members who don't already have a lot of detailed Python expertise.
Maybe the single most underappreciated thing about Poetry compared to other options is its tendency to fail safe.
I managed to hit an annoying situation with poetry: if you specify a range of versions for a dependency, and one of your other dependencies specifies a slightly different range of versions for that same dependency, it will refuse to proceed. (I was expecting it to choose something in the intersection of the ranges, but it did not.)
I suspect in GP's case there was something blocking the intersected range. That should work perfectly.
The poetry author doesn't want to support package version range overrides like yarn does [1]. His call, I suppose, but I think it's the one major flaw in the otherwise giant step forward that poetry represents.
I've stopped recommending Python for learning programming, and instead have started recommending Go.
Python's environment difficulties make it exponentially harder for beginners to focus on actual programming concepts. Things like pyenv or virtualenvs are black boxes to beginners.
Go is simple, it's tightly coupled to version control so you can teach git at the same time in a meaningful way, and it's low level enough that CS concepts come up frequently, while being powerful enough that it will scale with a beginner's knowledge.
Coming from Java, my reaction to pyenv and virtualenv was "why is this even needed?" Package management and SDKs are perennial fun times for all languages, but pyenv and virtualenv were just weird.
If you have a specific library or platform requirement then this advice is obviously not applicable, but for learning programming and software engineering principles, I honestly can't think of a better introductory language than Go.
I love Python, and the ecosystem is vast, you're right, but it has a serious issue with packaging, delivery, and dependency management. I have almost a decade of software engineering under my belt, and Python environment issues give me a hard time _regularly_. For beginners, it is a major demotivating factor that can be a death sentence for learning.
> Python is such a joy to write and such a nightmare to set up, package and distribute.
Here's my simple-minded metric on when this problem can be called "solved":
Get a VPS from GoDaddy.
Deploy your Django app as easily as you can deploy a PHP app, say, Wordpress.
I have written about this before. I love Django. And yet I think that they have done a huge disservice with the development server and DB configuration out of the box. Even for a simple application, going from developing on your desktop to deploying the same application on, as an example, GoDaddy, is in a range between nightmare and impossible. I have personally given up multiple times and just said "Fuck it! Just use WP" even when I really, truly wanted to stay in the Python/Django ecosystem.
Try it. Develop a simple photo album application. Get a VPS from GoDaddy and deploy it.
No, Heroku et al. are not solutions. They are indicative of the problem.
I 100% agree. WordPress is the gold standard for ease of deployment. I specifically picked WordPress over other solutions because I knew the website will need maintenance without my help, and WP is simple enough for that.
If you can use Docker, that solves pretty much all of the Python packaging mess and makes the challenge of deploying a random Python application to a random VPS pretty simple.
I'm not saying that Python packaging isn't a mess, but it can be largely mitigated.
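For example, a minimal Dockerfile for a typical app looks something like this (the entry point and the requirements.txt layout are assumptions, not any specific project):

  FROM python:3.9-slim
  WORKDIR /app
  # install pinned dependencies first so this layer is cached
  COPY requirements.txt .
  RUN pip install --no-cache-dir -r requirements.txt
  COPY . .
  CMD ["python", "app.py"]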
As someone else said, Docker introduces yet another level of crazy to having to deploy a Python/Django app. I've looked into and used all of these technologies. We have developed internal "How To" documents on how to deploy different types of apps into a variety of environments. Reading through the Python/Django "How To" is a sobering experience.
This coming from someone who would truly like to use Python/Django exclusively. I hate, hate, hate telling people to just use WP or go a different route. I do not enjoy touching PHP. And yet...
Docker takes away some problems and creates other, smaller problems. It also results in more complexity. That complexity is a problem.
I can teach a slightly technical person to deploy WordPress on a shared host with a pre-installed shared LAMP stack. I cannot do this with Docker, not to the extent that it's sustainable without ongoing help from me.
Sorry, but GoDaddy is a disgrace, and your comment makes no sense at all: you are comparing WP with Django. And instead of recommending a VPS as a solution, you recommend the worst provider of these services?
I have been using GoDaddy for, well, I forget how long, maybe two decades. I have run websites for multiple companies from their servers. I can't remember a single issue. Not one. Even hosting and managing email. Frankly, I don't know what you are talking about.
I have also used Linode, AWS, Dreamhost, Rackspace and a bunch of others for different kinds of work. Not sure where the hatred for GoDaddy comes from.
I also know a bunch of people using GoDaddy servers. I think people repeat shit just because they think they sound "cool" and yet have no clue what they are talking about. Also, GoDaddy support, on the rare occasions they are needed, has always been top notch. So, yeah, no clue what you are talking about.
GoDaddy is shady and I would not recommend it ever, but they are a good representative of the target market - something people pick despite the shadiness and inflexibility, because they require zero technical competency to get something up and running.
I almost wonder if we need something like deno but for the python language--a reboot that uses the same language and syntax, but jettisons the cruft of packaging, etc. and starts over with a better developer experience. Perhaps just reference and import packages directly by URLs and forget all the pain and struggle of curating system-wide and virtual environments.
> Python is such a joy to write and such a nightmare to set up, package and distribute.
Agreed. Though I would not call it a "nightmare," the python ecosystem and runtime are cumbersome and annoying to work with when compared to some other languages that have put in a lot of thought into DX.
Once you hit the sweet spot of developing for cross-platform (even just Linux, MacOS, and Windows) and supporting normal average-people users and have (even optional!) C dependencies, Python's packaging situation quickly deteriorates into "nightmare" territory.
I began my professional career writing Rails apps (previous experience in software was cobbled together Python and .NET scripts), and 1-2 years later made the full transition to Django and Python, mostly because my mentors and friends at the time were writing projects in Python. Plus, the job market for fullstack Python devs was much more lucrative at the time.
I must say though, I do miss Ruby and the syntactic sugar, packaging/dependency system, etc. The transition itself was also very painless. I’m keen on checking out the current state of Ruby and/or Rails 4-5 years later, as I’m sure much has changed!
The two main frameworks I use(d) have relative equivalents in both languages which is super nice (Flask and Sinatra, Rails and Django). Right now I’m in the FastAPI craze, but perhaps I’ll opt for Rails for my next personal project.
Honestly, I don't think you'll find it feels like much has changed at all - modern Rails is still pretty consistent with Rails 4+, modern Ruby is still pretty consistent with Ruby 2+. It's one of my favorite things about the language and framework combo: it feels like once we got past Ruby 1.9.3 and Rails 3, things have been pretty well locked in.
There are new niceties and improvements, but bundler is still bundler, everything is still an object, etc.
I'm convinced the only people who like Ruby are those who want to delve into metaprogramming. It might be fun for the person who constructs it, but it is a nightmare as an outsider looking into a project.
I know this is more rails but the fact that autoloading is even a thing speaks to the wild nature of ruby.
how? In ruby you are supposed to use rubygems and bundler for dependencies, and things generally work fine, and it's been like that for a decade or so.
There have been a few version management tools (rvm, rbenv), but those were all "fine" and still are, AFAICT, and you don't strictly need them.
Meanwhile for inexplicable reasons in pythonland we keep reinventing package/version/environment management.
I had a friend who managed to mess up their local env this morning to the point of screwing up their OS install, and it reminded me of the XKCD on python env[0].
It's at least 3 years old, and things have not improved.
I used to love Ruby. But it’s too exciting. Python is boring, in a good way. Like mowing the lawn. Or buying potatoes. I no longer want excitement in the process, only in the results.
Agreed. There has been so much churn and fragmentation with build tools over the last few years. So much complexity for what is not such a complicated problem to solve. Hopefully this is the "has to get worse before it can get better" part.
> So much complexity for what is not such a complicated problem to solve.
As someone who actually spent weeks of their spare time working on this, I beg to differ. This is a very complicated problem to solve. Everyone has their own opinion on how things should work, which is governed by their narrow use case, but a "standard" packaging tool needs to be the opposite of a narrow use case. The main reason Anaconda Inc exists is because they wanted to solve this for data science. Even with them being a relatively big corporation, their "solution" is not loved universally, but works OK most of the time.
I disagree. I actually think it's a complicated problem because there are so many issues that have to be solved all at once.
What has been happening is that someone writes a build tool to solve a specific problem that they have but they neglect to solve all the other ones. Why would they? They’re not being funded.
That’s what caused (and still causes) a lot of the churn in JS too.
Poetry helps alleviate some of the issues mentioned here, and lets you separate dev vs prod dependencies. As far as the "new PHP" goes, it'd be great if Python ever powered almost 80% of websites: https://kinsta.com/blog/is-php-dead/
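For example, the dev/prod split is just two sections in pyproject.toml (package names here are placeholders):

  [tool.poetry.dependencies]
  python = "^3.9"
  requests = "^2.25"

  [tool.poetry.dev-dependencies]
  pytest = "^6.2"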
Docker has been the saving grace for Python. I once used to build deb and rpms of a Python project of mine and I'd rather stick a finger in my eye than go through that pain again.
Is this really an issue? While there are a lot of unmaintained packages doing it (which I assume will still be working for some years?), I don't think I've seen a new package doing it in years.
In the off chance you end up depending on that legacy code, repackaging those one or two dependencies shouldn't be that big of a deal.
I think that it is true that I should have written this article or something like it years ago, but there's still a long tail of people who don't even know that invoking setup.py is bad, or that the goal is to get rid of all setup.py commands.
Probably the biggest stragglers will be build scripts, and as recently as mid-2020, the answer to "how do I build distributable artifacts without using setup.py" was still "you can do it but there's no good definitive answer": https://stackoverflow.com/q/58753970/467366
There's also been a lot of mixed messaging in this space, so I think there's a decent population of people who think that the way to stop invoking setup.py is to stop using setuptools entirely, mixing up "don't invoke setup.py" with "use a backend that doesn't have a setup.py".
But you are right to say that this has gotten a lot better over time and the word is slowly getting out.
To be honest, as more of a casual user of Python I wasn't aware of the situation at all so I found your post very informative!
I did always find the setup.py approach misguided for several reasons, so it's more of a casual observation of mine that I thankfully rarely encounter it these days.
Py3 feels to me like a massive case of "screw backwards compatibility" for very little gain.
I feel like Python the language peaked somewhere between 2.4 and 2.7.
Python was much slower than most other languages, but in return you got great readability ("runnable pseudocode") and rapid prototyping. Now, with all the additions to the language (more syntax and features, static typing), those advantages are weakening but you're still not getting the advantages of faster compiled languages.
It seems to me that Go or Nim (or D or Zig or ...) are preferable for readability, and that Rust is more worth one's time if you don't mind the complexity.
I agree 100% with you. I built my career on top of Python, and 2.7 is the last version I've used. I don't like that Python 3 is adding any feature under the sun for whatever reason. Have you seen their sad excuse of a pattern matching proposal? And packaging is still a bloody mess.
There are worse languages, and it definitely gets the job done, but I'd rather stick with better ones.
I think Python dependency/build/install/test stuff would be helped greatly by a library or module that did for it what pathlib did for file & path manipulation, eh?
I'm surprised the article, for being so thorough, doesn't have a single mention of Poetry.
Which, in terms of "works for most people, for most use cases", really has that down. And of course it uses the same PEP standards as the other tools mentioned.
It also has a lot of downsides, which is why I, for one, avoid it. The purpose of the article is not to point at other tools but to show how to use setuptools better.
Python distribution is honestly horrific. The worst thing is that I actually see people defending it or saying it isn't that bad if you simply learn it "properly" but I have no idea what they mean when they say this.
I simply don't understand this Python packaging insanity to this day. I tried, even released a few modules on PyPI. Why can't it just make sense? Like Ruby gems? Like Rust packages? Even Javascript packages are easier to understand. It's actually easier to make a PKGBUILD file for my C project.
I love making libraries but this Python packaging stuff is just so painful. Such a shame, Python has a really nice module system. There's one feature that I love though: editable mode. Finally I can develop and use a library at the same time! I've been trying to accomplish this with git submodules since forever.
Stockholm syndrome can be very real in the python world. There needs to be room for critical discussion of the pain points in order to address them and improve the ecosystem.
I'm no js/node fan, but I'd say it's a hell of a lot easier to install and manage dependencies with npm (or yarn) than in python. To the extent that people joke about/are horrified by the number of dependencies managed by them in a typical project. There's just a lot more tooling around it that's somewhat official/default/the one way.
With pip you can manually pin versions, or 'freeze' what's currently installed, but then updating them is a manual effort or requires a third-party service like pyup.io.
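The pinning workflow itself is just:

  pip freeze > requirements.txt      # pin whatever is installed right now
  pip install -r requirements.txt   # reproduce it elsewhere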
> which languages do people feel have excellent package management?
Rust (technically cargo) is probably the best I've seen or used.
I think the most important thing Node did right was shipping an official package manager/manifest format/etc. alongside the runtime, from the very beginning. There is no fragmentation; yarn exists but it's a 100% equivalent drop-in replacement. Everybody's on the same page. Check out the source, run one command, and you're ready to go.
The second most important thing they did right was install dependencies locally in the project directory. There is no global state or environment dependency (unless you go out of your way to install some tool globally). Want to do a clean install? rm -rf node_modules and you're done. The only system-wide prerequisite is a single Node installation. No environment variables, nothing.
One mistake they did make was punting on the management of different Node versions, so you have to reach for nvm and any swapping-out has to happen at a global level. Cargo made the right decision to in-house this. However, even then, in my experience it's rarely an issue because Node is so stable.
Rust's cargo is the high water mark today (and is a substantial reason for Rust's popularity).
Ruby's rubygems, node's npm, and the elm package manager all get different things right and wrong, but at least each is a common approach that is somewhat sane. Python is total anarchy. I think most people would disagree with you about .NET, Go, and JavaScript today vs. Python.
Please elaborate. I've found Ruby gems to be the simplest packaging system. You make a file containing the gem specification and that's pretty much it. The most annoying part for me was bikeshedding the neatest way to glob all files in the project.
I will say that Ruby gems containing C extensions are really bad. It's cemented in my mind the notion that foreign function interfaces must be built into the language so that compiling code is never necessary.
> which languages do people feel have excellent package management?
Rust packages are really impressive. They just work. It's an isolated ecosystem like all the others, which is weird for a native language like Rust. Dependencies are mostly used at build time, and many don't make it into the Linux distributions.
.NET is just NuGet. It seems pretty straightforward. The last time I looked for a "blessed" solution in Python, it was some third-party solution surrounded by some kind of internet/Twitter drama.