Provide option to disable hash-checking #4344
This is a very annoying problem that makes it impossible to use hashes. There should be an option to not require hashes for VCS requirements. I just don't see any sane solution to this except forking pip and patching that bit. We could use a separate requirements file for the VCS requirements, but the problem is that many services assume a single requirements file, since you would normally refer to other files from it.
So I have to split the requirements into two files:
The reqs with hashes have to be installed first, and then the VCS reqs. This is all pretty painful to get running. Why can't you just skip requiring hashes for VCS requirements? 😡
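The two-file split described above can be automated with a small helper. This is only a sketch; the `split_requirements` function, the example requirement lines, and the placeholder hash are my own illustrations, not part of pip:

```python
# Hypothetical helper: partition requirement lines so the hashed lines can be
# installed with `pip install --require-hashes -r ...` in a first pass, and the
# VCS lines in a second pass without hash checking.
VCS_PREFIXES = ("git+", "hg+", "svn+", "bzr+")

def split_requirements(lines):
    """Return (pinned, vcs): hashed/pinned requirement lines vs VCS URL lines."""
    pinned, vcs = [], []
    for line in lines:
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # skip blanks and comments
        # Editable requirements ("-e git+...") should count as VCS too.
        target = stripped[3:].lstrip() if stripped.startswith("-e ") else stripped
        (vcs if target.startswith(VCS_PREFIXES) else pinned).append(stripped)
    return pinned, vcs

reqs = [
    "requests==2.27.1 --hash=sha256:0000",  # placeholder hash, not a real digest
    "git+https://github.com/example/fork.git#egg=fork",
    "-e git+https://github.com/example/other.git#egg=other",
]
pinned, vcs = split_requirements(reqs)
print(len(pinned), len(vcs))  # → 1 2
```

The two output lists would then be written to separate files and installed in two `pip install` invocations, only the first of which passes `--require-hashes`.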
In my case, I want to provide a binary package that the original author doesn't provide upstream on PyPI. However, when pip goes to install it (even if I give it the full URL to my whl file), it still checks the upstream versions and hashes and refuses to install it.
Closing in favor of #6469, since that is (IIUC) the same use case and has a lot more discussion.
I would love to provide hashes for the dependencies I use. Specifically I want to protect against remote tampering with only the code we deploy to production.
In our application we have central dependencies, our own code in VCS, some additional forked libraries on GitHub, dependencies for testing, and separate dev dependencies - a setup I assume is not particularly unique.
What I would like is to leverage the ability to reference one requirements file from another, to allow anyone to simply run `pip install -r X`, such that a developer can run `pip install -r requirements-dev.txt` (which would include `-r requirements.txt`) and our build pipeline can run `pip install -r requirements.txt` (and hopefully have the central dependency hashes checked).
The first problem is that I cannot put our VCS URLs into the requirements.txt file, as `--require-hashes` will fail when it encounters a VCS URL (in spite of the fact that ssh/https already provides some protection against tampering, and the VCS code is ironically the only dependency code we have vetted).
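For concreteness, the layered layout described above might look like the following (package names are illustrative and the hash is a placeholder, normally generated with `pip hash` or `hashin`):

```
# requirements.txt — pinned production deps, hash-checked in CI
requests==2.27.1 --hash=sha256:<placeholder>

# requirements-dev.txt — pulls in production deps plus dev-only tools
-r requirements.txt
pytest
```

This layout is exactly where the problem bites: because requirements.txt contains hashes, pip enables hash-checking mode for the entire install, so the unhashed `pytest` line (or any VCS URL) makes `pip install -r requirements-dev.txt` fail.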
Even if `--require-hashes` allowed VCS URLs, the second problem is that when using hashes I'm forced to pin and generate hashes for all developer and test dependencies, which in my scenario is unnecessary: for tamper protection I only care about the code we deploy to production. After all, I am unable to vet every piece of code installed on a developer's machine anyway.
I can't see an elegant solution.
I'm interested in what approaches other teams take. Does everyone hand-craft their own special scripts that manage a combination of pip/pip-compile/hashin from separate requirements.in files for dev, testing, and production? Or does everyone package all VCS deps into a local PyPI server and reference the hash of that (though if we manage local copies of all our dependencies, it somewhat negates the point of hashing in the first place)?
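For anyone landing here, the pip-tools workflow alluded to above looks roughly like this (a sketch; the file names and the `pytest` entry are illustrative):

```
# Hypothetical pip-tools layout: one .in file per environment, compiled
# to fully pinned, hashed .txt files with:
#   pip-compile --generate-hashes requirements.in
#   pip-compile --generate-hashes requirements-dev.in
#
# requirements-dev.in — dev tools, constrained to the production pins
-c requirements.txt
pytest
```

The `-c` constraints line keeps dev resolutions consistent with production pins without re-listing them, but as the comment notes, it still does not help with VCS URLs, which cannot be hashed at all.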