
I use Anaconda exclusively and deployments (with virtual environments) have been fairly ok.

That said, I do run into trouble when I have a dependency that requires compilation on Windows (e.g. the popular turbodbc) because, say, a wheel isn't available for a particular Python version. Any time compilation is needed, it's a headache. Windows machines don't come with compilers, so one has to download and install a multi-gigabyte Visual Studio Build Tools package just to compile. Sometimes the compilation fails for various reasons.
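
If you'd rather have pip fail fast than attempt a local build, it can be told to accept wheels only. A minimal sketch -- turbodbc here just stands in for any compiled dependency:

  # Refuse source distributions entirely; error out if no wheel
  # exists for this platform/Python version instead of compiling.
  pip install --only-binary :all: turbodbc

  # Or prefer wheels but fall back to building from source:
  pip install --prefer-binary turbodbc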

Requiring gcc compilation is a headache for installing dependencies inside Docker containers too -- you have to install gcc just to install the Python dependencies, and then remove it afterwards to keep the image small.
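
A multi-stage Dockerfile is one way to avoid the install-then-remove dance: build wheels in a throwaway stage that has gcc, then copy only the wheels into the final image. A rough sketch, assuming a requirements.txt with at least one source-only dependency (the base image tag is just an example):

  # Stage 1: has a compiler; builds wheels for every requirement
  FROM python:3.11-slim AS builder
  RUN apt-get update && apt-get install -y --no-install-recommends gcc libc6-dev
  COPY requirements.txt .
  RUN pip wheel --wheel-dir /wheels -r requirements.txt

  # Stage 2: no compiler; installs only the prebuilt wheels
  FROM python:3.11-slim
  COPY --from=builder /wheels /wheels
  COPY requirements.txt .
  RUN pip install --no-index --find-links /wheels -r requirements.txt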

I think requiring local compilation (instead of just delivering a binary) is a UNIX mindset that is holding back many packaging solutions. A lot of pain would be alleviated if we could somehow mandate centralized wheel creation for all supported Python versions; otherwise the package manager marks the package as broken or unavailable and falls back to the last version that has a wheel.
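
You can already approximate that "is there a wheel for my platform?" check client-side against PyPI's JSON API. A sketch -- the function name is mine, and the substring match on the filename is a crude stand-in for proper wheel-tag matching:

  import json
  import urllib.request

  def latest_release_has_wheel(package: str, tag: str) -> bool:
      # Fetch the latest-release metadata from PyPI's JSON API and
      # check whether any file is a wheel whose filename contains
      # the given platform tag.
      url = f"https://pypi.org/pypi/{package}/json"
      with urllib.request.urlopen(url) as resp:
          data = json.load(resp)
      return any(
          f["packagetype"] == "bdist_wheel" and tag in f["filename"]
          for f in data["urls"]
      )

  # e.g. does the latest turbodbc release ship a 64-bit Windows wheel?
  print(latest_release_has_wheel("turbodbc", "win_amd64"))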

Also, if only we applied standards like R's CRAN repo does -- i.e. if a package doesn't pass error checks or doesn't build on certain architectures (enforced by a centralized CI/CD build pipeline in the package repo), it doesn't get published -- the Python packaging experience would be much improved.



Yeah, if PyPI was as annoying as CRAN with respect to new versions, then a lot of this pain would go away.

For those who don't realise, when there's a new version of R, anything that doesn't build without errors/warnings is removed from the archive.

This is really annoying if you want something to keep running, but it prevents the kind of dependency rot common in Python (I recently found a dependency that was four years out of date).



