
At least for Python dependency management, I'd hardly call it a mess these days, and yes, it was horrible 10 years ago. The de-facto standard of venv and pip (using a requirements.txt file) is generally painless now, to the extent that moving between macOS, Linux and Windows is feasible for most (of my) Python work.
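For anyone who hasn't looked recently, the whole workflow is a handful of commands (a sketch; the .venv name is just a convention, and activation differs on Windows):

    python3 -m venv .venv
    . .venv/bin/activate              # .venv\Scripts\activate on Windows
    pip install -r requirements.txt   # restore pinned dependencies
    pip freeze > requirements.txt     # record what's currently installed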

And the Python 2/3 pain stopped being an issue a few years back. I haven't been forced to use Python 2 by a library requirement in years.

Your criticisms WERE valid 5 or 10 years ago. In my experience, the ecosystem has improved greatly since then. I wouldn't use Python for a large, shared codebase, but for one-man experiments and explorations, Python is pretty compelling these days in my opinion.



> The de-facto standard of venv and pip (using a requirements.txt file) is generally painless these days.

Except the Python packaging docs themselves now recommend the third-party pipenv. [0] Which means there are multiple authoritative voices saying different things, and that produces confusion.

[0] https://packaging.python.org/en/latest/tutorials/managing-de...
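For context, the pipenv flow that tutorial recommends replaces requirements.txt with a Pipfile/Pipfile.lock pair, roughly (requests is just an example package):

    pipenv install requests   # adds it to the Pipfile and updates the lock file
    pipenv shell              # spawn a shell inside the managed virtualenv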


That recommendation being from 2017 ;) I suspect no one has touched that page properly since then because of old scars :(

Part of the issue is there's not a lot of differentiation between most of the "project management" tools (apart from their age, which in some cases explains certain design choices), and the coupling of most of them to their PEP 517 backend doesn't help. There are backends that do differentiate themselves: scikit-build, meson-python (mesonpy) and enscons wire into pre-existing general-purpose build systems (cmake, meson and scons respectively), maturin handles Rust, and flit-core serves as a bootstrapper. But given that setuptools can be used without a setup.py now, I'm not clear on what some of the others provide.
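To illustrate the "setuptools without a setup.py" point, a minimal pyproject.toml along these lines is enough to build (the package name is hypothetical):

    [build-system]
    requires = ["setuptools"]
    build-backend = "setuptools.build_meta"

    [project]
    name = "example-package"
    version = "0.1.0"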


That's why I deliberately described the pip/venv solution as a "de-facto" standard.

Is Python perfect in this regard? Absolutely not, but the situation is FAR less painful than it was 10 years ago. The original comment claimed nothing has changed in 10 years. I used Python 10 years ago and disliked it intensely, but I've used it again recently (in the last 3 years) and it is simply not true that nothing has improved.


Poetry is a reasonable solution and has been for a few years now. We use it in production for deployments, and it's worked well for us.
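For the unfamiliar, the day-to-day surface is small (requests is just an example dependency):

    poetry add requests   # records it in pyproject.toml and pins it in poetry.lock
    poetry install        # recreates the locked environment on another machine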


It still won't beat the deployment speed of

   scp executable user@host:direc/tory/
that you get with Go.

I still use python for stuff that never leaves my computer, but in most cases if I know I need to run it on Someone Else's Machine (or even a server of mine) I'll reach for Go instead.


At the expense of absolutely massive, gargantuan, entire-OS-sized binaries.

Just looking at my bin/ folder eg

    k9salpha - a relatively simple curses app - 56MB
    argocd - a cli for managing argocd - 155MB


It's like what, $0.01 per GB these days? I bought a 2 TB NVMe for Linux 4 years ago, on which I also game, and it's 50% full.

Disk space hasn't been a serious concern in two decades.


Until you start paying for it on cloud deployments per GB transfer.


        strip --strip-all k9salpha argocd
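Or let the Go linker do the equivalent at build time, which is generally safer than running strip on Go binaries after the fact:

    go build -ldflags="-s -w"   # -s omits the symbol table, -w omits DWARF debug info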


I agree with you, though if you live in a Linux+podman world, you can literally scp whole containers, so at least we made some progress.
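A sketch of that pattern, assuming an image already built locally under the (made-up) name myimage:

    podman save myimage | ssh user@host podman load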


Sure but that's not going to work across platforms, right? So not really an apples to apples comparison.


Even the most ardent of Go-haters have to be fair and admit it makes cross-compilation really nice.

    GOOS=linux GOARCH=amd64 go build && scp executable user@host:direc/tory/
Out of the box, no toolchains to manage.


This is pretty much what I do.

I have a Taskfile for local compilation and running (macOS) and when I need to release I can just say `task publish` and it'll cross-compile a Linux binary and copy it to my server.
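A minimal sketch of such a Taskfile (task name, binary and paths are made up):

    version: '3'
    tasks:
      publish:
        cmds:
          - GOOS=linux GOARCH=amd64 go build -o myapp
          - scp myapp user@host:/srv/myapp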


Sure, it's nice in that regard.

But honestly running poetry install for python is not bad either.


Except in 3 years when poetry is bad for some reason and now there's a new better(?) tool again :D


Like the sibling comment said, cross-compiling is literally setting two environment variables.


Eh, poetry, pipenv, conda, setup.py, GCC and all dev tools to possibly compile some C shite you didn't know you needed? It's a complete shit show still...
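At minimum you can make pip fail fast instead of discovering a missing compiler halfway through an install, by refusing source builds:

    pip install --only-binary :all: -r requirements.txt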


+1


I started using Dagster a few days ago, which runs on Python.

As far as I can tell, I have to install the packages with pip. Wait - don't forget venv first. Oh, but I'm not supposed to commit the venv, so how is anybody to know what I did? Okay, I have to set up a requirements.txt... but now, what did I install in the venv? Okay, there's a way to dump the venv... but it's giving me all the dependencies, not the root package I installed?!

Okay, there's something called pipenv that seems smart... Oh wait, no, this Dagster has a pyproject.toml, maybe I should use poetry? Wait, poetry doesn't work, it seems like I'm supposed to use setup.py... so what's the point of pyproject.toml?

Okay, I put the package into the setup.py. Great, now I have to figure out the correct version dependencies by hand?! I don't get it; Python is a nightmare.
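For what it's worth, pip can list only the top-level packages rather than the whole dependency closure, which addresses the freeze complaint above (still not a lock file, though):

    pip list --not-required --format=freeze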

Compare to .NET: dotnet add package

On build: dotnet run # automatically restores packages to a project-level scope


Setting up a requirements.txt file and venv is _not_ a difficult ask. Seems like a skills issue here frankly.


Your attitude towards a new developer in the community is disappointing.

I just wrote in detail why I found it difficult/unintuitive and you dismissed everything without addressing it.


If that's too difficult then you might as well not be in the profession, because it's one of the simplest things in the world.



