flag to force use of --find-links and ignore later PyPI versions #2090
Comments
Can you give a specific example? I believe the case you mentioned works as expected - any distributions available via PyPI and via --find-links are considered together. If the issue is that you need a binary wheel and the PyPI distribution is source-only, then #2084 may be relevant to you. At present, there's no way to say to pip "only consider wheels", so a newer source-only distribution on PyPI will be preferred to an older wheel.
For example, scikit-image requires matplotlib>=1.1.0, but the newest matplotlib on PyPI doesn't support Python 3.2, so on that interpreter I need the older matplotlib wheel from my wheelhouse rather than the latest release.
Otherwise, we need to make sure ALL of the required dependencies are hosted on that site, even if they can just as easily be gotten from PyPI (e.g. tifffile).
How about … ?
I'm afraid you're not missing anything :/ The problem is that there's no way to specify pip options per requirement.
This would indeed solve your problem, but I have the impression the root problem is what I described above, as you are specifically concerned with using a requirements file, and there is already a combination of pip options (--no-index together with --find-links) for restricting where packages come from. You might take a look at #271, which is concerned with adding per-requirement options. There you will find #271 (comment) with a link to an SO question with a workaround (using a shell script and invoking pip for every requirement).
Could you put matplotlib==1.1.0 in your requirements? Then it would hopefully prefer your local wheel for matplotlib and get the rest from PyPI, no?
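For concreteness, a sketch of the pinning approach suggested above; the wheelhouse path, package names, and versions are placeholders:

```sh
# Pin matplotlib to the exact version that lives in the wheelhouse, so the
# only candidates pip can consider are the 1.1.0 builds -- and the local
# wheel wins that tie.
cat > requirements.txt <<'EOF'
matplotlib==1.1.0
numpy>=1.8
EOF

# Let pip consult both the wheelhouse and PyPI; everything not in the
# wheelhouse still comes from PyPI.
pip install --find-links="$WHEELHOUSE" -r requirements.txt
```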
What I ended up doing is editing the requirements file to pin the exact versions I want.
So I wonder if a pip change is still desired or if it's not needed since you were able to get things working?
I would call what I did a hack, and still advocate for a built-in solution.
How does it help?
It is working for us. We have a Travis build specifically against minimum dependencies, and we just replace all the >= in our requirements file with ==. |
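A sketch of that substitution, assuming the requirements file only uses simple name>=version lines:

```sh
# Turn ">=" specifiers into "==" so the minimum-dependency build installs
# exactly the versions named in the requirements file.
sed 's/>=/==/g' requirements.txt > requirements-minimum.txt
pip install -r requirements-minimum.txt
```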
I need this to be able to point to local Windows binaries (of lxml, but it could be anything) that aren't available online. The behaviour would be to check the local folder first and only go online if something is missing from it. This would allow configuration via pip.ini, which reduces typos and the need to explain this to users (an unwelcome hurdle when working with newbies on Windows).
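For reference, a pip configuration along these lines might look like the sketch below (paths are placeholders). Note that find-links only adds the folder as a source; on its own it does not make pip prefer that folder over newer releases on an index, which is exactly what this issue is asking for.

```sh
# Per-user pip config: %APPDATA%\pip\pip.ini on Windows,
# ~/.config/pip/pip.conf on Linux.
mkdir -p "$HOME/.config/pip"
cat > "$HOME/.config/pip/pip.conf" <<'EOF'
[global]
find-links = /srv/local-wheels
EOF

# With the config in place, a plain install also searches the local folder.
pip install lxml
```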
Does anyone know why this helps here?
Yes. By restricting it to just "1.1.0", pip ends up sorting only the v1.1.0 distributions from find-links and PyPI, and our algorithm currently sorts find-links distributions first.
Personally, what I would like is an "only use versions that have wheels" option. Maybe on a per-requirement basis. That sounds like the actual issue here.
Well, we do currently prefer wheels over sdists within a given version; it would need to be something more complicated, like a flag to only consider wheels, or a flag to prefer an older wheel over a newer source-only release.
A reasonable idea, but for this case, wheels are secondary I think. The core issue is that he needed an additional version constraint. The flag you described might not have helped, since this project has mac wheels on PyPI, and he apparently was installing mac wheels from his wheelhouse link.
From what you described, you needed an additional version constraint, so IMO you should just add a version constraint using the support that pip provides for that, which is requirements files. So, it doesn't seem like a hack to me. As it stands, I'm changing the description to be clearer, and closing, since I think it's doubtful something like the 2 flags I mentioned above would be implemented. BUT, if people find their way to this issue and vote it up, then that's another matter.
If the Py3.2 case had the correct upper bound in the project's install_requires, wouldn't that take care of it?
The project is "scikit-image"; the issue is needing matplotlib<1.4.0 on Python 3.2.
To respond to what @Ivoz may have been trying to get at... : ) Yes, "scikit-image" can have something like "matplotlib>=1.1.0,<1.4.0" in its install_requires, but from how I read the conversation, the problem is wanting an "override mechanism", I think, for just py3.2.
@blink1073 you could vary install_requires conditionally based on the Python version.
Yes, @qwcode, that is probably the best option going forward.
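A sketch of what the version-specific constraint could look like in practice (versions follow the discussion above; the file names are made up):

```sh
# (a) Keep a separate requirements file that is only used on the Py3.2 builder:
cat > requirements-py32.txt <<'EOF'
matplotlib>=1.1.0,<1.4.0
EOF
pip install --find-links="$WHEELHOUSE" -r requirements-py32.txt

# (b) With newer pip/setuptools, an environment marker can express the same
#     thing in a single requirements file or in install_requires:
#         matplotlib>=1.1.0,<1.4.0 ; python_version == "3.2"
```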
FWIW zc.buildout has two flags, -n (newest) and -N (non-newest), which make the behaviour explicit; we already have something similar in pip.
"-n" is the default for pip upgrades, not for pip installs. If I "pip install SomeProject" it will accept the locally installed distribution in any case, so "-n" vs "-N" doesn't seem to be relevant here. It's about PyPI vs find-links preference, not new-install vs already-installed. The 2 flags you mention don't apply in this case?
Sure they do when it comes to using a local cache as specified by find-links. This can't be overridden with version pinning, so a flag to favour local, cached files over an index is necessary. This is a common requirement for deployments.
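One way to make "favour the local cache" explicit with today's options is to populate the cache up front and then install offline; a rough sketch, assuming the requirements file lists everything the deployment needs:

```sh
# Build (or fetch) every requirement into a local wheel cache once...
pip wheel -r requirements.txt -w ./wheelhouse

# ...then deploy strictly from that cache: --no-index guarantees nothing is
# pulled from PyPI, so the cached files always win.
pip install --no-index --find-links=./wheelhouse -r requirements.txt
```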
again, -n/-N is about using latest vs installed.
I don't follow. Again, his core issue was needing an additional version constraint for Py32. So, adding an upper bound or pinning (either in a Py32 requirements file or conditionally in install_requires) is the most straightforward solution IMO.
pip does favor find-links distributions over index distributions when the versions being compared are otherwise equal.
I am running into the exact same issue actually. I need a way to tell pip to prefer the local version over the PyPI version. Is there still no way to do so, 2 years after this issue was opened?
Why do you need to do that, and why is specifying the explicit version you have locally not an acceptable solution?
We build numpy and scipy locally, linked against MKL, which offers better performance than the wheel version from PyPI. We want that when a user types "pip install numpy", it installs our MKL-linked build, not the one from PyPI. Previously, numpy was not available in wheel format on PyPI, so we had set up a wheelhouse (pointed at via find-links) containing our own builds, and pip would pick those up. This no longer works.
Running pip install in verbose mode, I see: Local files found: /cvmfs/soft.computecanada.ca/custom/python/wheelhouse/generic/numpy-1.12.0-cp27-cp27mu-linux_x86_64.whl which makes no sense to me. It finds an acceptable link locally, it is the most recent version, and yet it retrieves the online version.
Looks like it is related to #3844
Indeed, it does look related.
Agreed, this seems related to #3844. It also sounds like it's related to the discussions that have gone on in the past (sorry, I don't have an issue number) about letting people specify finer-grained custom compatibility flags ("bound to MKL" may be viewed as a binary compatibility restriction, although it's more a preference than a hard requirement).
In an HPC environment, running the best performing binary is kind of a requirement....
Understood, but as I understand it, MKL is specifically a numeric library, so we wouldn't want to reject a binary of lxml (for example) just because it's not tagged "MKL", given that it wouldn't actually link in any of MKL in any case.
No, indeed. All I'm saying is that if I bother providing a wheelhouse with a find-links, the packages I put there had better be given priority... there's no reason to do this otherwise...
Well, the best match algorithm is:
1. Version
2. Compatibility tag priority (conceded the manylinux1 vs linux issue needs resolving here)
The various sources (--index-url and --find-links) don't affect this ordering. If you're arguing that, version and compatibility tag being equal, local taking priority over remote[1] should be added as a rule, then we're still looking for a reason for this (your use case should be covered by compatibility tags; MKL and non-MKL builds aren't compatible, after all). If you're suggesting that local vs remote override the above list, then that's a major change in semantics, and I can't imagine it happening.
[1] Any such proposal would need to be explicit about what counts as local, though. Is a local devpi instance accessed via --index-url local or remote? Does the answer change if it's a public wheel hosting service? Is --find-links pointing at an http location local or remote?
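To illustrate the ordering described above with the numpy case from this thread (a conceptual example, not actual pip output):

```sh
# Suppose pip can see both of these candidates:
#   ./wheelhouse/numpy-1.12.0-cp27-cp27mu-linux_x86_64.whl     (via --find-links)
#   numpy-1.12.1-cp27-cp27mu-manylinux1_x86_64.whl on PyPI      (via the index)
# Version is compared first, so 1.12.1 from PyPI is selected even though a
# perfectly usable 1.12.0 wheel sits in the local wheelhouse.
pip install --find-links=./wheelhouse numpy
```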
You could also do 1.12.0+mkl or something, and that will be newer than 1.12.0 (but older than 1.12.1).
manylinux1, linux, mkl or not... all of those are compatible with our system.
How would one add the additional tag?
That's not fine grained enough; what if you have two things specified with --find-links? How do you decide the priority between them? We have to make some choice on which one we install, so we sort all of the available files using a priority of Version, Format, Specificity, and finally Location (mostly by nature of the fact we use a stable sort, and we scan the URLs in order). It would not be particularly difficult to use the specificity sort to allow you to define a custom platform tag for your platform that takes precedence over all others (but a newer version would still be chosen over an older version). It mostly just takes someone to write the PR for it. Trying to allow pinning to a specific location or something like that is not likely to happen without a revamp of our configuration and repository support (something I have planned), because it's hard to get fine grained enough with our current setup where we are passing in repository locations via CLI flags and environment variables.
It's a version number; when you build your wheel you'd just add +whatever onto the end of the version. This is treated as both == to the non-+whatever version and newer than it. This means if you pin to ==1.12.0 it will still select 1.12.0+mkl, and since it considers it newer than 1.12.0, it will always be selected over any other 1.12.0 version.
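A sketch of how the local version label interacts with pinning (version numbers follow the example above):

```sh
# The wheelhouse holds numpy-1.12.0+mkl-...; PyPI has plain 1.12.0 and 1.12.1.
# "1.12.0+mkl" satisfies ==1.12.0 and sorts above plain 1.12.0, so the local
# MKL build wins the tie -- but an unpinned install would still pick 1.12.1
# from PyPI, since a newer version outranks the local label.
pip install --find-links=./wheelhouse "numpy==1.12.0"
```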
Actually, local vs non-local does not make a difference. What does is default vs non-default. If I specify a URL, online or local, I do expect that those packages are given priority if found. I would even go as far as saying that it should be possible to trump the version and tag priority in favour of location through some configuration. A well-optimized build is much more likely to make a difference in performance than 1.12.0 vs 1.12.1.
So... I usually build with
python setup.py bdist_wheel
I would put the +whatever where?
If you have two things specified with find-links, prioritize the first one over the last one.
You'd likely need to modify the version in your setup.py (or wherever the project's version is defined) to append the local version label before building the wheel.
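A sketch of the two steps, assuming the version string sits directly in setup.py (if it is generated elsewhere, the label goes wherever the version is defined):

```sh
# 1) In setup.py, append the local version label to the version argument,
#    e.g.  setup(..., version="1.12.0+mkl", ...)
#    (shown here as a comment; edit the file by hand or via your build script)

# 2) Build the wheel as usual; the filename picks up the label automatically.
python setup.py bdist_wheel
ls dist/   # e.g. numpy-1.12.0+mkl-cp27-cp27mu-linux_x86_64.whl
```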
Well, that's rather useless as a general solution....
I'm in the same boat. We have a custom-built h5py wheel (version 2.9.0, the same version as the latest PyPI manylinux wheel). We want to always use our custom-built wheel, because it links against the system HDF5 library instead of using a bundled one like the manylinux wheel does, which is a requirement for us. EDIT: I guess our situation is more related to #3844
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Original issue description:
Presumably, if the user specifies --find-links, they would prefer to install packages from that directory versus PyPI. For example, if I have a wheelhouse with matplotlib, and I specify pip install -f $WHEELHOUSE matplotlib, it should install from there. However, it will default to installing from PyPI unless I ALSO specify --no-index.
You might say that this is the desired behaviour. However, what if I have a requirements.txt file and some of the packages are hosted in my wheelhouse and the others are available on PyPI? There is no pip command that will allow me to install my packages using the requirements.txt file, unless I am missing something.
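One workaround for the situation described here (a sketch; the package name and wheelhouse path are placeholders) is to split the install into two passes:

```sh
# Pass 1: install the packages shipped in the wheelhouse, refusing to touch PyPI.
pip install --no-index --find-links="$WHEELHOUSE" matplotlib

# Pass 2: install everything else; requirements already satisfied by pass 1
# are left alone (unless a specifier forces a different version).
pip install -r requirements.txt
```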