-
jax itself is just pip installable, so this should be drop-in. CUDA is its own headache, and that's where mamba (now pixi[1]) really shines.
[1]: https://pixi.sh/
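For anyone setting this up, a quick sanity check (assuming jax is already installed) to see whether the CUDA backend was actually picked up rather than silently falling back to CPU:

    import jax

    # "gpu" means the CUDA wheels and a compatible driver were found;
    # a CPU-only or broken CUDA install reports "cpu" instead.
    print(jax.default_backend())
    print(jax.devices())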
-
simple-repository-server
A tool for running a PEP-503 simple Python package repository, including features such as dist metadata (PEP-658) and JSON API (PEP-691)
When testing previous versions of uv, I saw it do that too. But uv uses other tricks to speed things up: it takes advantage of PEP-658 metadata (which means it doesn't need to download the package at all), and if that metadata is missing it will next try byte-range requests to grab just the metadata out of the wheel, and so on. pip has been learning some of these tricks in recent releases too.
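To make that concrete, here's a rough sketch (standard library only, not uv's actual code; the wheel URL is a placeholder) of those two shortcuts: first the PEP-658 sidecar metadata file, then a byte-range request for the tail of the wheel, which is where the zip central directory lives:

    import urllib.error
    import urllib.request

    WHEEL_URL = "https://example.invalid/packages/somepkg-1.0-py3-none-any.whl"  # placeholder

    def fetch_pep658_metadata(wheel_url):
        # PEP 658: the METADATA file is served next to the wheel at "<url>.metadata"
        try:
            with urllib.request.urlopen(wheel_url + ".metadata") as resp:
                return resp.read()
        except urllib.error.HTTPError as exc:
            if exc.code == 404:
                return None  # index doesn't serve PEP 658 metadata
            raise

    def fetch_wheel_tail(wheel_url, tail_bytes=64_000):
        # fallback: ask for just the end of the wheel, where the zip central
        # directory sits, so METADATA can be located without a full download
        req = urllib.request.Request(wheel_url, headers={"Range": f"bytes=-{tail_bytes}"})
        with urllib.request.urlopen(req) as resp:
            if resp.status != 206:  # server ignored the Range header
                return None
            return resp.read()

    metadata = fetch_pep658_metadata(WHEEL_URL)
    if metadata is None:
        tail = fetch_wheel_tail(WHEEL_URL)
        # a real resolver parses the central directory in `tail` to pull out
        # *.dist-info/METADATA; if even that fails, it downloads the whole wheel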
One problem we have is that support for any repository features beyond PEP-503 (the 'simple' HTML index) is limited or entirely missing in every repo implementation except Warehouse, the software that powers PyPI. So if you run an internal repository on Artifactory, AWS CodeArtifact, Sonatype Nexus, etc., PEP-658 and PEP-691 support will be missing and uv runs slower; you may not even have Accept-Ranges support. (And if you want Dependabot, your repository needs to implement parts of the Warehouse JSON API - https://warehouse.pypa.io/api-reference/json.html - for it to understand your internal packages.)
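You can check what your index supports using PEP-691 content negotiation directly; something like this (index URL and project name are just examples):

    import urllib.request

    INDEX = "https://pypi.org/simple"  # swap in your internal index

    def get_project_page(project):
        req = urllib.request.Request(
            f"{INDEX}/{project}/",
            headers={
                # prefer the PEP 691 JSON API, fall back to PEP 503 HTML
                "Accept": "application/vnd.pypi.simple.v1+json, text/html;q=0.1",
            },
        )
        with urllib.request.urlopen(req) as resp:
            return resp.headers.get_content_type(), resp.read()

    content_type, body = get_project_page("requests")
    if content_type == "application/vnd.pypi.simple.v1+json":
        print("PEP 691 JSON supported")
    else:
        print("PEP 503 HTML only - expect slower resolves with uv/pip")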
I've been playing with https://github.com/simple-repository/simple-repository-serve... as a proxy to make our internal servers suck less; it's a very small codebase and easy to change. Its internal caching implementation isn't great, so I wrapped nginx around it too, caching aggressively and using stale-while-revalidate to cut round trips. That made our Artifactory much less painful to use, even with pip.
-
I would say, as someone who works on the performance of pip: no one else was able to reproduce OP's severe performance issue. That's not to say it didn't happen, just that it was an edge case on specific hardware (I am assuming it was this issue: https://github.com/pypa/pip/issues/12314).
Since it was posted, a lot of work has gone into the areas that likely caused the problem, and I would expect at least a doubling in performance with the latest version of pip. E.g. I created a scenario similar to OP's that dropped from 266 seconds to 48 seconds on my machine, and more improvements have landed since then. However, OP has never followed up to let us know if it improved.
Now, that's not to say you shouldn't use uv; its performance is great. But a lot of volunteer work has gone in over the last year (well before uv was announced) to improve default Python package install performance. And one last thing:
> for a non-compiler language?
Installing packages from PyPI can involve compiling C, C++, Rust, etc. Python's packaging is very, very flexible, and in lots of cases that flexibility means installs take a lot of time.
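To illustrate (a made-up package, not anything from the thread): an sdist that declares a C extension is enough to make pip run a build backend and invoke your C toolchain at install time, unless a prebuilt wheel is available for your platform.

    from setuptools import Extension, setup

    setup(
        name="example-native",  # hypothetical package
        version="0.1.0",
        ext_modules=[
            # installing this from an sdist requires a working C compiler
            Extension("example_native._speedups", sources=["src/_speedups.c"]),
        ],
    )

If you'd rather have installs fail than compile, pip's --only-binary=:all: restricts it to wheels.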