Holy shit https://astral.sh/blog/uv
# random
d
🤯 2
🔥 6
n
I have no idea why it is so fast - is it possible to write a Python import loader in Rust, so we can finally have faster CLIs in Python?
❤️ 1
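For context on the CLI-speed question: a lot of Python CLI startup time is import time, and CPython has a built-in way to measure it. A minimal sketch using the -X importtime flag, with "json" as an arbitrary stand-in for a heavy CLI package (not related to what uv itself does):
# sketch: show where Python CLI startup time goes using CPython's built-in
# import profiling; "json" is just a stand-in for any heavy CLI package
import subprocess
import sys

# prints a per-module import-time breakdown to stderr
subprocess.run([sys.executable, "-X", "importtime", "-c", "import json"])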
i
Apparently it uses this version solver: https://github.com/pubgrub-rs/pubgrub. Some discussion from the Hacker News post on this, which I'm paraphrasing as "why doesn't the Python steering council come out with a permanent solution for package management". Too big a job? Too slow a council?
j
the Python Packaging Authority (not the SC - the latter oversees the language syntax) has been slowly improving pip over the years. many people don’t realise that some of the limitations are on PyPI (pypa/warehouse) and the distribution format instead. for example, backfilling to have metadata separated from the wheel .zip just started happening this week! regarding lock files, Brett Cannon, from the VSCode team, has tried to standardise 2 PEPs over the past three years. it was not the PyPA (let alone the SC) blocking that, but the community. hundreds of comments of discussion and it was impossible to find consensus.
👍 3
and finally, one has to understand that compiling code is hard, and doing so on Windows is harder. the wheel format was born to try to fix that UX issue and it’s been a massive success. but we’re in a local optimum now, and finding alternatives requires huge community momentum and buy-in before one can declare pip dead. I’m hopeful Astral and Prefix will inject much needed energy and innovation into the ecosystem
👍 1
one last thing: for pip to become significantly faster it would need to do concurrent downloads. but I think at this point people prefer to write the stuff in Rust rather than having to deal with asyncio, one of the biggest headaches of the stdlib
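For a sense of what "concurrent downloads" would involve, here is a minimal sketch using asyncio plus the third-party aiohttp library; the URLs are placeholders and this is not something pip actually does today:
# sketch: download several wheels concurrently with asyncio + aiohttp
import asyncio
import aiohttp

URLS = [
    "https://example.com/wheels/pkg_a-1.0-py3-none-any.whl",
    "https://example.com/wheels/pkg_b-2.0-py3-none-any.whl",
]

async def fetch(session, url):
    # one GET per wheel; all requests share the same event loop
    async with session.get(url) as resp:
        resp.raise_for_status()
        return await resp.read()

async def main():
    async with aiohttp.ClientSession() as session:
        wheels = await asyncio.gather(*(fetch(session, u) for u in URLS))
        print([len(w) for w in wheels])

asyncio.run(main())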
i
@Juan Luis would this backfill make it so pip is able to find candidate packages' transitive dependencies without actually needing to download the wheels? In my experience that's one of the slowest parts in resolving some nightmare environments we have.
j
would this backfill make it so pip is able to find candidate packages' transitive dependencies without actually needing to download the wheels?
exactly
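For context on what that backfill enables: under PEP 658 each wheel on PyPI gets a sidecar file at the wheel URL plus a .metadata suffix containing just the METADATA text, so a resolver can read Requires-Dist without downloading the .whl. A rough sketch; the wheel URL below is a placeholder, real ones come from the Simple API index:
# sketch: read a package's dependencies from the PEP 658 metadata sidecar
# instead of downloading the wheel itself
from urllib.request import urlopen

wheel_url = "https://files.pythonhosted.org/packages/aa/bb/example_pkg-1.0-py3-none-any.whl"  # placeholder
metadata = urlopen(wheel_url + ".metadata").read().decode()
for line in metadata.splitlines():
    if line.startswith("Requires-Dist:"):
        print(line)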
i
Looking through the docs, it looks like they're ignoring rtx/mise, which is already implemented in Rust and has perfectly replaced pyenv in my workflows. I wonder whether they will actually go through with building another replacement for pyenv or if they'll work with jdx on this like they are doing with Armin from rye
j
yeah I did a quick test a moment ago and didn't find my mise-installed Python versions. too bad, but I hope it's temporary 🙃
i
Using it like this, it works:
mise exec python@$mise_source_env -- uv venv $virtual_env
https://github.com/inigohidalgo/.dotfiles/commit/94dbfa56ef0a0dc7a978ce450b804b02fb6a125c
👀 1
j
uv pip install kedro
took 7 seconds 😱🔥 https://astral.sh/blog/uv
i
From what I was reading, once its cache is built it should be even faster. I wonder how long it takes on the second go 😂
Unfortunately I’m having some issues getting it to play nice with our internal pip mirror
j
7 seconds was with a cold cache. with a warm one, it was less than 1 second
well, I tested this and tried to break it in several ways. it mostly works, the only weird thing I spotted is that uv generates executables with the full absolute path of the interpreter in the shebang, and in my case the full path happened to contain spaces, which isn't allowed. other than that, it works with normal Python venvs, mise-generated venvs, and even conda/[micro]mamba environments.
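For context, the breakage looks roughly like this: console scripts are plain Python files whose first line is a shebang pointing at the venv's interpreter, and the kernel splits the shebang at the first space. The path and package below are made up for illustration, not uv's exact output:
#!/home/user/my project/.venv/bin/python
# hypothetical console script in a venv whose path contains a space; the kernel
# splits the shebang at the first space, tries to run /home/user/my, and fails
import sys
from mytool.cli import main  # "mytool" is a made-up package

if __name__ == "__main__":
    sys.exit(main())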
i
Yeah I reported my index feed issue too, but I saw somebody else had reported it last night 🤣 Since the announcement the number of issues has been steadily trending up
d
it's a weird feeling at this point in my career to recognise half the GitHub handles in those issues 😂
j
@sbrugman tried it 2 hours ago too 🙃
n
it takes me 1 second to run this on GitPod - not sure if caches exist already
😅
(.venv) gitpod /workspace/kedro-plugins (main) $ uv pip install -e "kedro-datasets[all]"
Built file:///workspace/kedro-plugins/kedro-datasets
Built 1 editable in 1.54s
error: Failed to download: google-cloud-bigquery==1.28.2
Caused by: Couldn't parse metadata of google_cloud_bigquery-1.28.2-py2.py3-none-any.whl from https://files.pythonhosted.org/packages/ce/af/89ccb3dd70a86516cb408dd7b7484d2fdd073bdce6405f722f75e6058e66/google_cloud_bigquery-1.28.2-py2.py3-none-any.whl
Caused by: after parsing 2.0, found "de" after it, which is not part of a valid version
pyarrow (<2.0de,>=1.0.0) ; (python_version >= "3.5") and extra == 'all'
^^^^^^
😅 1
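The root cause is that "2.0de" is not a valid PEP 440 version, and uv parses requirement metadata strictly. A quick way to see the same failure with the PyPA packaging library (recent releases of which are equally strict):
# sketch: why the pyarrow (<2.0de,>=1.0.0) specifier is rejected -
# "2.0de" is not a valid PEP 440 version string
from packaging.version import Version, InvalidVersion

for candidate in ("2.0", "2.0de"):
    try:
        print(candidate, "->", Version(candidate))
    except InvalidVersion as exc:
        print(candidate, "-> invalid:", exc)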
d
Raise an issue!
j
…to Google 😂
😂 1
m
This is big news! I like the bold statement of them trying to create cargo for Python. It’s by far the best package manager/build tool I have worked with!
👍🏼 1
👍 2
a
7 seconds is crazy
d
I did a full install of kedro, duckdb, ibis earlier and it was ~20 seconds which is still amazing
🥳 1
j
pip install kedro-datasets[test] --no-cache-dir
took 19 minutes (not joking). sadly, uv chokes on an invalid wheel of a transitive dependency, as @Nok Lam Chan found out, so we can't compare for now.
👀 2
i
I am chomping at the bit to be able to use uv for my local pkg workflows. The few times I've used it to install dependencies that don't depend on internal indices, I've been mind-blown
n
19 mins - are you installing from PyPI or JFrog?
c
it took me over 20 minutes yesterday to pip install pyspark -- tried w/ uv and it failed. but that's mainly my slow internet and I was downloading other things too
😅 2
d
Super interesting to see the Pixi folks pivot https://prefix.dev/blog/uv_in_pixi
c
oh that's awesome -- was wondering how this would shake out given they seemed competitive. but now easy mapping of uv -> pip and pixi -> conda, with a superset. nice
❤️ 1
d
Yeah I’ve just had a colleague join Prefix so excited to see what they build
🔥 1
Given Wolf is behind Mamba and the effort there it makes total sense
d
I honestly think it takes a huge amount of humility to do that
❤️ 1
but also a slightly amusing case of nominative determinism
😂 2
honestly the novelty doesn’t fade
i
Private indices issue is fixed 🙂 https://github.com/astral-sh/uv/releases/tag/0.1.9
j
🔥 goooooooo
i
it's unbelievable. even on a package we have with torch dependencies, where the lion's share of pip-install time was downloads, the speed-up is incredible: 6 minutes -> 50 seconds on a cold cache, 25 seconds warm
🔥 1
image.png
👍🏼 1
j
still not working with our JFrog index I fear 😅
i
on the latest release? 0.1.9 😕
what sort of auth are you using for that?
n
Have been trying to learn Rust, not much progress other than reading the book 🥲 Still learnt a few interesting concepts though.
j
an issue with range requests, but it's fixed in a PR already! https://github.com/astral-sh/uv/issues/1709#issuecomment-1960950953
😅 1
🙌 1
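"Range requests" here are HTTP Range headers: when an index has no metadata sidecar, uv asks for only a slice of the wheel to read its METADATA instead of downloading the whole file, and some proxies handle that poorly. A rough illustration with a placeholder URL; servers that honour Range reply with 206 Partial Content, others just send the full file:
# sketch: an HTTP range request - fetch only the last 64 KiB of a wheel
from urllib.request import Request, urlopen

wheel_url = "https://files.pythonhosted.org/packages/aa/bb/example_pkg-1.0-py3-none-any.whl"  # placeholder
req = Request(wheel_url, headers={"Range": "bytes=-65536"})
with urlopen(req) as resp:
    print(resp.status, len(resp.read()))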
m
@Nok Lam Chan same here. Have read the book, did some exercises on Exercism and made some toy CLIs. Learned a lot by doing so, but I really miss an actual project to learn from. Perhaps I should look into contributing to polars-ds?
👍🏼 1
n
Ya I think it's actually the only way that works for me... I struggle a lot to do that. Mostly I spend my time in Python; a compiled language feels very different