pyx: a Python-native package registry, now in Beta

pyx is a Python-native package registry from the creators of uv.

Related discussion:
pyx: a Python-native package registry, now in Beta
Sorry, what is this really? I don't seem to encounter the problems mentioned in that blog post 🧐
I feel like a lot of the problems people encounter with Python are due to the non-Python dependencies that projects use and are mostly restricted to certain fields. My guess is that the people complaining are a mix of:
Hopefully the next blog post will walk through examples of "this is the problem people encounter with existing tools, this is what it looks like, this is what it looks like with our tools, this is how it works". The "how it works" could be split off into another blog post too.
Right now, this blog post just made me aware of pyx but that's about it.
you're describing the use case for conda, which is something astral is trying to be an upgrade path from. conda is a mess, but it lets people work.
not on Linux
We're on macOS using docker, which is basically Linux, and building numpy et al is a pain. Sometimes we have failures on our build pipelines (Linux), sometimes on our dev machines, and it always takes forever.
We do a little of everything:
That said, we won't be paying for pyx. Our problems can be solved by building our dev docker images in our pipelines and just pulling them down for devs (a rough sketch of that is below); we just haven't bothered because the above is somewhat rare.
I don't know who this is for, because rebuilding numpy et al should be somewhat rare, and the annoyance is usually only a few minutes when it happens.
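For concreteness, the prebuilt-dev-image approach looks roughly like the sketch below; the base image, registry, and requirements file are hypothetical, and the point is only that the slow compiles happen once in CI on the target architecture rather than on every dev machine:

    # Dockerfile.dev -- illustrative sketch; names and package list are made up.
    FROM python:3.12-slim

    WORKDIR /app

    # Toolchain needed only while compiling source distributions (numpy et al.).
    RUN apt-get update \
        && apt-get install -y --no-install-recommends build-essential \
        && rm -rf /var/lib/apt/lists/*

    # Install (and, where necessary, compile) the heavy dependencies once here,
    # so CI can push the finished image and dev machines only ever pull it.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

CI would then build and push this image to an internal registry on each dependency change, and developers pull that tag instead of building locally.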
macOS and Docker together are asking for trouble. They require all kinds of hacks and awareness of where things are being built or mounted, which architecture is being emulated (since it all runs in a VM), memory limits, swapping, and so on. It's no surprise build times are a problem.
I imagine your build pipeline targets aarch64, you have made modifications to numpy itself (build flags or something), you're using a special distro in the pipelines, or the rebuild is unnecessary. For me, numpy has always been a simple "pip install numpy" or "poetry install" when not on NixOS.
My guess is that if you used non-emulated Linux, your local build issues would be reduced and your builds would also be faster. But I can't say that with any certainty, because I neither have nor require Nvidia in my Linux rig, nor do I know your setup very well.
You do, however, fall quite nicely into the criteria I shared. Were you outside of that, Python would probably be much, much easier (i.e. no non-Python deps).
For me, it feels like they're just trying to commercialize on top of open source and will slowly destroy it.
Now I'm even more wary of their products; every new piece feels more and more like they're attempting to take over Python with apparently unlimited resources.
Exactly.
the first piece of the Astral platform
Because obviously, everything needs to be a platform when you're VC-backed... 😮💨
Well, their tools are modular; you can use one without the other. PyPI isn't going anywhere either.
yeah i didn't really expect this. i've been using uv a lot lately because it's so damn fast and degrades gracefully to just a normal venv, but if they're tying it to their own platform i'm going to be careful going forward.
Sick, I have a project where 80% of build time is on psycopg. It really doesn't make sense to rebuild it every time. If it's easy to set up a shared registry, we could even use it during development.
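For what it's worth, uv can already be pointed at an additional package index, so a shared registry of prebuilt wheels (whether pyx or something internal) should slot in without changing the day-to-day workflow much. A minimal sketch, assuming a hypothetical internal index URL:

    # pyproject.toml -- the "internal-wheels" index and its URL are hypothetical.
    [project]
    name = "myapp"
    version = "0.1.0"
    requires-python = ">=3.10"
    dependencies = ["psycopg"]

    # An extra index serving wheels built once in CI; uv consults it alongside
    # the default index, so psycopg can resolve to a prebuilt wheel instead of
    # being compiled on every machine.
    [[tool.uv.index]]
    name = "internal-wheels"
    url = "https://wheels.example.internal/simple"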
Also, installing packages according to the presence of GPU hardware has always been a pain. We've been dependent on tools like conda and, more recently, pixi. Maybe pyx can alleviate that too; we'll see.
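On the GPU point: uv's docs describe a pattern along these lines for choosing CPU vs CUDA builds of PyTorch via mutually exclusive extras pinned to different indexes; whether pyx makes the hardware detection itself automatic remains to be seen. A rough sketch (package versions and the CUDA variant are illustrative):

    # pyproject.toml -- illustrative; pick the extra that matches the machine.
    [project]
    name = "myapp"
    version = "0.1.0"
    requires-python = ">=3.10"

    [project.optional-dependencies]
    cpu = ["torch>=2.5"]
    cu124 = ["torch>=2.5"]

    [tool.uv]
    # The two extras are mutually exclusive in the lockfile.
    conflicts = [[{ extra = "cpu" }, { extra = "cu124" }]]

    # Route torch to a different index depending on which extra is selected.
    [tool.uv.sources]
    torch = [
      { index = "pytorch-cpu", extra = "cpu" },
      { index = "pytorch-cu124", extra = "cu124" },
    ]

    [[tool.uv.index]]
    name = "pytorch-cpu"
    url = "https://download.pytorch.org/whl/cpu"
    explicit = true

    [[tool.uv.index]]
    name = "pytorch-cu124"
    url = "https://download.pytorch.org/whl/cu124"
    explicit = true

Installation then becomes "uv sync --extra cpu" on laptops and "uv sync --extra cu124" on GPU boxes; the check for which hardware is present still has to live in a script or CI variable.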