At least for Python dependency management, I'd hardly call it a mess these days, and yes, it was horrible 10 years ago. The de-facto standard of venv and pip (using a requirements.txt file) is generally painless these days, to the extent that moving between macOS, Linux and Windows is feasible for most (of my) Python work.
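For concreteness, the workflow I mean is just a handful of standard commands (the activate path differs slightly on Windows):

python3 -m venv .venv
source .venv/bin/activate          # .venv\Scripts\activate on Windows
pip install -r requirements.txt    # install the pinned dependencies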
And the Python 2/3 pain stopped being an issue a few years back. I haven't been forced by a library requirement to use Python 2 in years.
Your criticisms WERE valid 5 or 10 years ago. In my experience, the ecosystem has improved greatly since then. I wouldn't use Python for a large, shared developer code base, but for one-man experiments and explorations, Python is pretty compelling these days in my opinion.
> The de-facto standard of venv and pip (using a requirements.txt file) is generally painless these days.
Except Python themselves now recommend the third-party tool pipenv. [0] Which means that there are multiple authoritative voices saying different things, and that produces confusion.
That recommendation being from 2017 ;)
I suspect no-one has touched that file properly since then because of old scars :(
Part of the issue is there's not a lot of differentiation between most of the "project management" tools (apart from their age, which in some cases explains certain design choices), and the coupling of most of these to their PEP 517 backend doesn't help. There are backends where there is differentiation, such as scikit-build, meson-python/mesonpy and enscons for wiring into pre-existing general-purpose build systems (CMake, Meson and SCons respectively), maturin for Rust, and flit-core as a bootstrapper, but given setuptools can be used without a setup.py now, I'm not clear on what some of the others provide.
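To make the coupling concrete: the backend is just whatever pyproject.toml names in its [build-system] table, so switching tools mostly means swapping that block. A minimal sketch with setuptools; the version pin and metadata are only illustrative:

[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"   # or e.g. "flit_core.buildapi", "mesonpy"

[project]
name = "example"
version = "0.1.0"
dependencies = ["requests"]               # illustrative runtime dependency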
That's why I deliberately described the pip/venv solution as a "de-facto" standard.
Is Python perfect in this regard? Absolutely not, but the situation is FAR less painful than it was 10 years ago. The original comment claimed nothing has changed in 10 years. I used Python 10 years ago and disliked it intensely, but I've recently (in the last 3 years) used it again, and it is simply not true that nothing has improved in 10 years.
I still use python for stuff that never leaves my computer, but in most cases if I know I need to run it on Someone Else's Machine (or even a server of mine) I'll reach for Go instead.
I have a Taskfile for local compilation and running (macOS) and when I need to release I can just say `task publish` and it'll cross-compile a Linux binary and copy it to my server.
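The publish task itself is nothing magic; it boils down to roughly this, with the binary name and host as placeholders:

GOOS=linux GOARCH=amd64 go build -o dist/myapp .    # cross-compile for Linux
scp dist/myapp me@myserver:/usr/local/bin/myapp     # copy it to the server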
Eh, poetry, pipenv, conda, setup.py, GCC and all dev tools to possibly compile some C shite you didn't know you needed? It's a complete shit show still...
I started using Dagster a few days ago, which runs on Python.
As far as I can tell, I have to install the packages with pip. Wait, don't forget venv first. Oh, but I'm not supposed to commit the venv, so how is anybody to know what I did? Okay, I have to set up a requirements.txt... but now what did I install in the venv? Okay, there's a way to dump the venv... but it's giving me all the dependencies, not the root package I installed?!
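For the record, this is all I actually did (dagster being the only package I asked for):

python -m venv .venv
source .venv/bin/activate
pip install dagster
pip freeze > requirements.txt   # spits out every transitive dependency, not just dagster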
Okay, there's something called pipenv that seems smart... Oh wait, no, this Dagster project has a pyproject.toml, maybe I should use poetry? Wait, poetry doesn't work; it seems like I'm supposed to use setup.py... so what's the point of pyproject.toml?
Okay, I put the package into setup.py. Great, now I have to figure out the correct version dependencies by hand?! I don't get it, Python is a nightmare.
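For reference, this is roughly where I ended up; the project name is a placeholder and the version pin is a guess I had to make myself:

# setup.py
from setuptools import setup, find_packages

setup(
    name="my_dagster_project",           # placeholder name
    packages=find_packages(),
    install_requires=["dagster>=1.0"],   # version bound picked by hand
)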
Compare to .NET:
dotnet add package
On build:
dotnet run # automatically restores packages to a project level scope