Description
A side-effect of no longer propagating import paths via the `PYTHONPATH`
envvar is that subprocesses don't inherit those paths. This is usually a good thing, but it ends up breaking plain invocations of `python` that assume they'll inherit the current interpreter's settings.
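A minimal sketch of the underlying behavior (the module name here is illustrative, not from rules_python): when the parent process extends `sys.path` directly instead of exporting `PYTHONPATH`, a plain child `python` process doesn't see those entries, but it does see entries forwarded through the envvar.

```python
import os
import subprocess
import sys
import tempfile

# Create a directory containing a module, and put it on the parent's
# sys.path directly -- analogous to bootstrap_impl=script, which edits
# sys.path without exporting PYTHONPATH.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "mypkg.py"), "w") as f:
    f.write("VALUE = 42\n")
sys.path.insert(0, tmp)

import mypkg  # works in the parent process
assert mypkg.VALUE == 42

# A plain child interpreter does not inherit sys.path edits...
child = subprocess.run(
    [sys.executable, "-c", "import mypkg"],
    capture_output=True,
)
assert child.returncode != 0  # ModuleNotFoundError in the child

# ...but it does inherit PYTHONPATH, the legacy propagation
# mechanism this issue is about.
env = dict(os.environ, PYTHONPATH=tmp)
child = subprocess.run(
    [sys.executable, "-c", "import mypkg; print(mypkg.VALUE)"],
    capture_output=True,
    env=env,
)
assert child.stdout.strip() == b"42"
```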
An example is pre-commit and its invocation of `virtualenv`:
```starlark
# add pre-commit to requirements and process through pip.parse
# BUILD.bazel
load("//python/entry_points:py_console_script_binary.bzl", "py_console_script_binary")

py_console_script_binary(
    name = "pre-commit",
    pkg = "@dev_pip//pre_commit",
    script = "pre-commit",
)
```
```shell
bazel run --@rules_python//python/config_settings:bootstrap_impl=script //:pre-commit
```
Eventually, pre-commit runs `[sys.executable, '-mvirtualenv', ...]`. That subprocess's `sys.path` contains just the stdlib, so it fails to import `virtualenv`.
This is sort of working-as-intended: part of the purpose of `bootstrap_impl=script` is to stop using the envvar, so that `PYTHONPATH` doesn't grow too long and bleed into subprocesses.
I'm not sure how to work around this. Perhaps a legacy option to keep setting the env var?
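One possible shape for such a workaround (purely illustrative; this helper is not an existing rules_python or pre-commit API): a tool that needs plain `python` subprocesses to see its import paths could forward them explicitly.

```python
import os
import subprocess
import sys


def run_python_with_inherited_path(args, **kwargs):
    """Spawn a python subprocess that sees the parent's sys.path.

    Illustrative helper, not part of rules_python: it rebuilds
    PYTHONPATH from the live sys.path, mimicking what the old
    PYTHONPATH-based bootstrap did implicitly.
    """
    env = dict(kwargs.pop("env", os.environ))
    env["PYTHONPATH"] = os.pathsep.join(p for p in sys.path if p)
    return subprocess.run([sys.executable, *args], env=env, **kwargs)
```

The obvious downside is the same one `bootstrap_impl=script` was meant to avoid: the rebuilt `PYTHONPATH` can get very long and leaks into every transitive subprocess.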
I'm also not sure how this is supposed to work outside of Bazel. Does pre-commit assume it's invoked from within a venv or something? The surrounding code seems to indicate it's setting up a venv for pre-commit itself...or something. This all seems odd: I would have to create a venv with `virtualenv` in it to run pre-commit, so that pre-commit can create its own venv? That doesn't sound right.