
Tracking issue: PyPI dependencies pixi can't manage yet. #771

Open
ruben-arts opened this issue Feb 2, 2024 · 79 comments
Labels
pypi Issue related to PyPI dependencies

Comments

@ruben-arts
Contributor

There are a few PyPI packages that pixi can't install yet but pip can.

Please paste your examples in this issue so that we have a list of known packages to track, test, and benchmark along the way.

Information we would like:

  1. What did you run and what was the outcome?
    e.g. pixi add --pypi packagex
    If the issue doesn't reproduce in an empty environment, please share a pixi.toml that reproduces it.

  2. What error did pixi return?
    e.g.

    × RECORD file doesn't match wheel contents: missing hash for mediapipe/version.txt (expected sha256=-fE2KU)
    
  3. Can pip install the package?
    Does pip install packagex work?

  4. What platform are you on?
    e.g. linux-64

  5. Did you find a workaround? If so, please explain.
    e.g. build it into a conda package, using a custom fork, etc.

Your input would greatly help us improve pixi's user experience!
Thanks in advance! ❤️
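
If it helps, a minimal manifest for reproducing such a failure could look like the sketch below (`packagex` is a placeholder, not a real package):

```toml
# Hypothetical minimal pixi.toml for reproducing a failing PyPI install.
# Replace "packagex" with the package that fails for you, and adjust
# the platform to the one you are on.
[project]
name = "pypi-repro"
version = "0.1.0"
channels = ["conda-forge"]
platforms = ["linux-64"]

[dependencies]
python = "3.11.*"

[pypi-dependencies]
packagex = "*"
```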

@pablovela5620
Contributor

Mediapipe

  1. pixi add --pypi mediapipe
  2. × RECORD file doesn't match wheel contents: missing hash for mediapipe/version.txt (expected sha256=-fE2KU)
  3. python -m pip install mediapipe does work
  4. platform osx-arm64
  5. workaround was just to use pip to install via pixi tasks:
[project]
name = "ipdscan"
version = "0.1.0"
description = "Add a short description here"
authors = ["pablovela5620 <[email protected]>"]
channels = ["conda-forge"]
platforms = ["osx-arm64"]

[tasks]
mp-install = "python -m pip install mediapipe"

[dependencies]
python = "3.11.*"
pip = ">=23.3.2,<23.4"
rerun-sdk = ">=0.12.0,<0.13"
requests = ">=2.31.0,<2.32"
tqdm = ">=4.66.1,<4.67"

[pypi-dependencies]
imutils = "*"

@tylerjw

tylerjw commented Feb 4, 2024

I wanted to use this library, which is packaged on PyPI: https://github.com/spirali/elsie

Here is the error I was seeing when trying to use pixi add:

 WARN rattler_installs_packages::index::package_database: errors while processing source distributions:
  × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ No metadata could be extracted for the following available artifacts:
      	- lxml-4.6.5.tar.gz

Error:   × error while processing source distribution 'lxml-4.6.5.tar.gz':
  │  could not build wheel: <string>:67: DeprecationWarning: pkg_resources is deprecated as an API. See https://
  │ setuptools.pypa.io/en/latest/pkg_resources.html
  │ 
  help: Probably an error during processing of source distributions. Please check the error message above.

Posting in the discord channel got it working with this fix:

pixi add python lxml
pixi add --pypi elsie

The reason for this is:

there seems to be an issue with one of the lxml source distributions, meaning the package has to be built locally, but that fails with the error it gave you. pixi reports an error for such erroneous packages, whereas pip probably continues trying other versions.

@liquidcarbon
Contributor

liquidcarbon commented Feb 12, 2024

pixi add duckdb / pixi add --pypi duckdb ❌ vs pixi run pip install duckdb ✅ on Windows

PS C:\code> pixi init duckdb-pip
✔ Initialized project in C:\code\duckdb-pip
PS C:\code> cd .\duckdb-pip\
PS C:\code\duckdb-pip> pixi add python=3.11
✔ Added python=3.11
PS C:\code\duckdb-pip> pixi add duckdb     
  × could not determine any available versions for duckdb on win-64. Either the package could not be found or version constraints on other
  │ dependencies result in a conflict.
  ╰─▶ Cannot solve the request because of: No candidates were found for duckdb *.


PS C:\code\duckdb-pip> pixi add --pypi duckdb
  × could not build wheel: warning: no files found matching '*.h' under directory 'duckdb'
  │ warning: no files found matching '*.hpp' under directory 'duckdb'
  │ warning: no files found matching '*.cpp' under directory 'duckdb'
  │ warning: no files found matching '*.h' under directory 'src'
  │ warning: manifest_maker: MANIFEST.in, line 6: 'recursive-include' expects <dir> <pattern1> <pattern2> ...
  │
  │ C:\Users\AKISLU~1\AppData\Local\Temp\.tmpkvrxrc\venv\Lib\site-packages\setuptools\command\build_py.py:207: _Warning: Package 'duckdb-      
  │ stubs.value' is absent from the `packages` configuration.
  │ !!
  │
  │         ********************************************************************************
  │         ############################
  │         # Package would be ignored #
  │         ############################
  │         Python recognizes 'duckdb-stubs.value' as an importable package[^1],
  │         but it is absent from setuptools' `packages` configuration.
  │
  │         This leads to an ambiguous overall configuration. If you want to distribute this
  │         package, please make sure that 'duckdb-stubs.value' is explicitly added
  │         to the `packages` configuration field.
  │
  │         Alternatively, you can also rely on setuptools' discovery methods
  │         (for example by using `find_namespace_packages(...)`/`find_namespace:`
  │         instead of `find_packages(...)`/`find:`).
  │
  │         You can read more about "package discovery" on setuptools documentation page:
  │
  │         - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html
  │
  │         If you don't want 'duckdb-stubs.value' to be distributed and are
  │         already explicitly excluding 'duckdb-stubs.value' via
  │         `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
  │         you can try to use `exclude_package_data`, or `include-package-data=False` in
  │         combination with a more fine grained `package-data` configuration.
  │
  │         You can read more about "package data files" on setuptools documentation page:
  │
  │         - https://setuptools.pypa.io/en/latest/userguide/datafiles.html
  │
  │
  │         [^1]: For Python, any directory (with suitable naming) can be imported,
  │               even if it does not contain any `.py` files.
  │               On the other hand, currently there is no concept of package data
  │               directory, all directories are treated like packages.
  │         ********************************************************************************
  │
  │ !!
  │   check.warn(importable)
  │ [the same "Package would be ignored" warning repeats for
  │ 'duckdb-stubs.value.constant' and 'duckdb.experimental']
  │
  │ error: Microsoft Visual C++ 14.0 or greater is required. Get it with "Microsoft C++ Build Tools": https://visualstudio.microsoft.com/      
  │ visual-cpp-build-tools/
  │

PS C:\code\duckdb-pip> pixi add pip          
✔ Added pip
PS C:\code\duckdb-pip> pixi run pip install duckdb
Collecting duckdb
  Downloading duckdb-0.9.2-cp311-cp311-win_amd64.whl.metadata (798 bytes)
Downloading duckdb-0.9.2-cp311-cp311-win_amd64.whl (10.3 MB)
   ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.3/10.3 MB 21.8 MB/s eta 0:00:00
Installing collected packages: duckdb
Successfully installed duckdb-0.9.2

@liquidcarbon
Contributor

liquidcarbon commented Feb 12, 2024

  1. workaround was just to use pip to install via pixi tasks

@pablovela5620 I am also finding this necessary quite often, probably more often than @ruben-arts would like :)

In this pattern the environment definition is fragmented between pixi.toml tasks and pixi.lock. It could work, especially with pinned pip installs in tasks, but is that the intention?

@ruben-arts
Contributor Author

In this pattern the environment definition is fragmented between pixi.toml tasks and pixi.lock. It could work, especially with pinned pip installs in tasks, but is that the intention?

@liquidcarbon This is indeed not the UX we want, but we simply need to develop more to support all PyPI packages. It's a weird bunch of requirements we have to support to be equivalent to pip. So please keep posting non-working packages!

@ruben-arts
Contributor Author

@liquidcarbon the duckdb package on conda-forge has been continued as python-duckdb, which should be available on Windows. We'll keep the example for PyPI to test more!

@tdejager
Contributor

tdejager commented Feb 15, 2024

@liquidcarbon I cannot reproduce this, on either macOS or Windows. It's strange that it does not select the .whl for some reason.

@tdejager
Contributor

@tylerjw actually it also fails for the same version with pip on Apple silicon; like pip, we error out when we cannot build the first source distribution.

@tdejager
Contributor

tdejager commented Feb 15, 2024

@pablovela5620 so it seems mediapipe 10.9 is a package with an invalid RECORD file; I manually checked it and it's incorrect.

This is mentioned in the PyPA spec:

During extraction, wheel installers verify all the hashes in RECORD against the file contents. Apart from RECORD and its signatures, installation will fail if any file in the archive is not both mentioned and correctly hashed in RECORD.

Mediapipe has a version.txt that is not mentioned in the RECORD for the 10.9 release.

Which in this case triggers the error. I'm unsure why pip does not do this, but I feel it's good to adhere to the standard here.

In any case, mediapipe 10.8 does seem to work; you could use that instead.
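
Pinning the working release in pixi.toml might look like this (a sketch; the exact version string is an assumption about how the release is numbered on PyPI):

```toml
# Pin mediapipe below the release whose RECORD file is invalid.
# "0.10.8" is assumed to be the PyPI version corresponding to "10.8" above.
[pypi-dependencies]
mediapipe = "==0.10.8"
```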

Also see: google-ai-edge/mediapipe#5025

tdejager pushed a commit to prefix-dev/rip that referenced this issue Feb 15, 2024
This PR addresses two major issues and one small issue:
* zip file stamps issue
alexcrichton/tar-rs#349, which we encountered
when installing tomli-2.1.0
* changed the from_filename signature so it can also produce Stree, which
is needed on the pixi side
* lowercase Root-Is-Purelib so we can install the elsie library (
prefix-dev/pixi#771 (comment))
@liquidcarbon
Contributor

liquidcarbon commented Feb 15, 2024

@liquidcarbon I cannot reproduce this, on either macOS or Windows. It's strange that it does not select the .whl for some reason.

@tdejager Just tried on Win10 (the original comment was on Win11); same thing:

PS C:\Users\a\Desktop\code\duckdb-pip> pixi add python=3.11
✔ Added python 3.11.*
PS C:\Users\a\Desktop\code\duckdb-pip> pixi add duckdb
× could not determine any available versions for duckdb on win-64. Either the package could not be found or version
│ constraints on other dependencies result in a conflict.
╰─▶ Cannot solve the request because of: No candidates were found for duckdb *.

I was on pixi 0.9.1 and upgraded to 0.13.0; same thing.

But pixi add --pypi duckdb works. 🤷‍♂️

@ruben-arts
Contributor Author

The conda package you add with pixi add duckdb should be pixi add python-duckdb

https://prefix.dev/channels/conda-forge/packages/python-duckdb

@liquidcarbon
Contributor

@ruben-arts noted -- I just wasn't sure which part @tdejager was trying to reproduce

@awray3

awray3 commented Feb 25, 2024

tensorflow-metal on Apple silicon, macOS 14.2.1:

pixi init tf-metal && cd tf-metal
pixi add "python>=3.11" "tensorflow>=2.13" pip
pixi add --pypi tensorflow-metal
> × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ The following packages are incompatible
      └─ tensorflow-metal * cannot be installed because there are no viable options:
         └─ tensorflow-metal 0.1.0 | 0.1.1 | 0.1.2 | 0.2.0 | 0.3.0 | 0.4.0 | 0.5.0 | 0.5.1 | 0.6.0 | 0.7.0 | 0.7.1
      | 0.8.0 | 1.0.0 | 1.0.1 | 1.1.0 is excluded because none of the artifacts are compatible with the Python
      interpreter or glibc version and there are no supported sdists

pixi run pip install tensorflow-metal
> Successfully installed tensorflow-metal-1.1.0

@baszalmstra
Contributor

Most likely you are missing a system requirement: https://pixi.sh/latest/configuration/#the-system-requirements-table

Most likely macos=12.0
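
Expressed in pixi.toml, that would be something like (a sketch; the exact macOS version your wheels require may differ):

```toml
[system-requirements]
macos = "12.0"
```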

@awray3

awray3 commented Feb 25, 2024

Nice, that fixed it! But would the generated pixi.toml work on a Linux machine? I guess I would have to mark tensorflow-metal as platform-dependent somehow. My goal is to have a pixi.toml that installs tensorflow-metal on a Mac and tensorflow with GPU support on Linux.

EDIT: Nevermind, I found the example demonstrating how to do this. Thank you!
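
For reference, the per-platform split can be sketched with target tables (the linux-64 entry is illustrative; the exact GPU-enabled package and channel setup may differ):

```toml
# macOS-only PyPI dependency
[target.osx-arm64.pypi-dependencies]
tensorflow-metal = "*"

# Linux-only dependency (illustrative; adjust for your GPU setup)
[target.linux-64.dependencies]
tensorflow = "*"
```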

@ruben-arts ruben-arts added the pypi Issue related to PyPI dependencies label Feb 29, 2024
@roaldarbol

Installing opencv-python-headless works in a conda environment (via python -m pip install opencv-python-headless), but not with pixi. I'm on osx-64.

(base) ➜  idtracker pixi add --pypi opencv-python-headless
  × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ No metadata could be extracted for the following available artifacts:
        - opencv-python-headless-4.9.0.80.tar.gz

Error:   × error while processing source distribution 'opencv-python-headless-4.9.0.80.tar.gz':
  │  could not build wheel: Traceback (most recent call last):
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/build_frontend.py", line 124, in <module>
  │     get_requires_for_build_wheel(backend, work_dir)
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/build_frontend.py", line 58, in get_requires_for_build_wheel
  │     result = f()
  │              ^^^
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in
  │ get_requires_for_build_wheel
  │     return self._get_build_requires(config_settings, requirements=['wheel'])
  │            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
  │     self.run_setup()
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 487, in run_setup
  │     super().run_setup(setup_script=setup_script)
  │   File "/var/folders/rh/y9ws4bgj7x50twypcwtk2ypr0000gp/T/.tmpvjLm8l/venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 311, in run_setup
  │     exec(code, locals())
  │   File "<string>", line 10, in <module>
  │ ModuleNotFoundError: No module named 'skbuild'
  │ 
  help: Probably an error during processing of source distributions. Please check the error message above.

@baszalmstra
Contributor

@roaldarbol The error message is absolutely terrible, but if you add:

[system-requirements]
macos = "11.0"

It should work.

@roaldarbol

It does indeed! Thanks!

@jacobbieker

Trying to install torch-geometric-temporal with pixi add. It fails as it cannot find torch, which had already been installed previously.

  1. pixi add --pypi torch-geometric-temporal
  2. python -m pip install torch-geometric-temporal does work, after the other dependencies are installed with pixi
  3. platform linux-64
  4. workaround was just to use pip to install via pixi tasks.
    Configuration:
[project]
name = "graph_weather"
version = "0.1.0"
description = "Add a short description here"
authors = ["Jacob Bieker <[email protected]>"]
channels = ["pyg", "nvidia", "conda-forge", "pytorch"]
platforms = ["linux-64"]

[tasks]
tinstall = "python -m pip install torch-geometric-temporal"

[dependencies]
python = "3.11.*"
torchvision = ">=0.16.1,<0.17"
pytorch-cluster = ">=1.6.3,<1.7"
pytorch-scatter = ">=2.1.2,<2.2"
pytorch-cuda = "12.1.*"
xarray = ">=2024.2.0,<2024.3"
pytorch-spline-conv = ">=1.2.2,<1.3"
pytorch = ">=2.1"
pandas = ">=2.2.1,<2.3"
h3-py = ">=3.7.6,<3.8"
numcodecs = ">=0.12.1,<0.13"
scipy = ">=1.12.0,<1.13"
zarr = ">=2.17.0,<2.18"
pyg = ">=2.5.0,<2.6"
tqdm = ">=4.66.2,<4.67"
pytorch-sparse = ">=0.6.18,<0.7"
lightning = ">=2.2.0.post0,<2.2.1"
einops = ">=0.7.0,<0.8"
fsspec = ">=2024.2.0,<2024.3"
datasets = ">=2.18.0,<2.19"
pip = ">=24.0,<25"


[pypi-dependencies]
pytest = "*"  # This means any version (this `*` is custom in pixi)
pre-commit = "*"
pysolar = "*"

@ruben-arts
Contributor Author

Trying to install torch-geometric-temporal with pixi add. It fails as it cannot find torch, which had already been installed previously.

It does indeed not work. Testing the repro gives me this error:

  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ├─▶ Failed to download and build: torch-scatter==2.1.2
  ├─▶ Failed to build: torch-scatter==2.1.2
  ╰─▶ Build backend failed to determine extra requires with `build_wheel()`:
      --- stdout:
      
      --- stderr:
      Traceback (most recent call last):
        File "<string>", line 14, in <module>
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in
      get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 295, in
      _get_build_requires
          self.run_setup()
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 487, in
      run_setup
          super().run_setup(setup_script=setup_script)
        File "/home/rarts/.cache/rattler/cache/uv-cache/.tmpvVNUWW/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 311, in
      run_setup
          exec(code, locals())
        File "<string>", line 8, in <module>
      ModuleNotFoundError: No module named 'torch'
      ---

This happens in both the uv and rip workflows, although import torch does work in pixi run python for this pixi.toml.

Thanks for the info @jacobbieker

@tdejager
Contributor

tdejager commented Mar 4, 2024

Trying to install torch-geometric-temporal with pixi add. It fails as it cannot find torch, which had already been installed previously.

It does indeed not work. Testing the repro gives me this error: (same ModuleNotFoundError: No module named 'torch' trace as above)

It would be kind of useful if we could keep the build environments for uv as well, so it's easier to debug these things.

@liblaf

liblaf commented Mar 12, 2024

What did you run and what was the outcome?

pixi add --pypi -vv "trimesh[all]"

What error did pixi return?

Log

 INFO pixi::lock_file::outdated: the pypi dependencies of environment 'default' for platform linux-64 are out of date because the requirement 'trimesh[all]' could not be satisfied (required by '<environment>')
 INFO pixi::lock_file::update: updated conda packages in the 'default' prefix in 5ms 378us 132ns
 INFO resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve: the following python packages are assumed to be installed by conda: libexpat 2.6.1, xz 5.2.6, readline 8.2, openssl 3.2.1, ld-impl-linux-64 2.40, libsqlite 3.45.1, libuuid 2.38.1, libffi 3.4.2, ncurses 6.4, bzip2 1.0.8, libzlib 1.2.13, tk 8.6.13, tzdata 2024a0, libxcrypt 4.4.36, libnsl 2.0.1, libgcc-ng 13.2.0, python 3.11.8, ca-certificates 2024.2.2, libgomp 13.2.0
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: trimesh[all] @ 4.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: trimesh @ 4.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: trimesh[test] @ 4.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: trimesh[easy] @ 4.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: trimesh[recommend] @ 4.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: numpy @ 1.26.4    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: pytest-cov @ 4.1.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: coveralls @ 3.3.1    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: mypy @ 1.9.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: ezdxf @ 1.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: pytest @ 8.1.1    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: pymeshlab @ 2022.2.post3    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: pyinstrument @ 4.6.2    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: matplotlib @ 3.8.3    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: ruff @ 0.3.2    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: typeguard @ 4.1.5    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: colorlog @ 6.8.2    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: mapbox-earcut @ 1.0.1    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: chardet @ 5.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: lxml @ 5.1.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: jsonschema @ 4.21.1    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: networkx @ 3.2.1    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: svg-path @ 6.3    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: pycollada @ 0.8    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: setuptools @ 69.1.1    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: shapely @ 2.0.3    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: xxhash @ 3.4.1    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: rtree @ 1.2.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: httpx @ 0.27.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: scipy @ 1.12.0    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied:
   embreex <2.17.7 | >2.17.7, <2.17.7.post1 | >2.17.7.post1, <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post1 | >2.17.7.post1, <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post4 | >2.17.7.post4 is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: backtrack to DecisionLevel(5)    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied:
   embreex ==2.17.7.post4 is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex * is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: trimesh[easy] ==4.2.0 is forbidden    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: backtrack to DecisionLevel(2)    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: not adding trimesh[easy] @ 4.1.8 because of its dependencies    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: not adding trimesh[easy] @ 4.1.7 because of its dependencies    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: not adding trimesh[easy] @ 4.1.6 because of its dependencies    
...
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: not adding trimesh[easy] @ 1.9.15 because of its dependencies    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: not adding trimesh[easy] @ 1.9.14 because of its dependencies    
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: not adding trimesh[easy] @ 1.9.13 because of its dependencies    
 INFO resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata trimesh==1.9.12}:get_or_build_wheel_metadata{dist=trimesh==1.9.12}:build_source_dist_metadata:setup_build{package_id="trimesh==1.9.12" subdirectory=None}:solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0    
 INFO resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata trimesh==1.9.12}:get_or_build_wheel_metadata{dist=trimesh==1.9.12}:build_source_dist_metadata:setup_build{package_id="trimesh==1.9.12" subdirectory=None}:solve: pubgrub::internal::partial_solution: add_decision: setuptools @ 69.1.1    
 INFO resolve_pypi{group=default platform=linux-64}:process_request{request=Metadata trimesh==1.9.12}:get_or_build_wheel_metadata{dist=trimesh==1.9.12}:build_source_dist_metadata:setup_build{package_id="trimesh==1.9.12" subdirectory=None}:solve: pubgrub::internal::partial_solution: add_decision: wheel @ 0.43.0    
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ├─▶ Failed to download and build: trimesh==1.9.12
  ├─▶ Failed to build: trimesh==1.9.12
  ╰─▶ Build backend failed to determine extra requires with `build_wheel()` with exit status: 1
      --- stdout:
      
      --- stderr:
      Traceback (most recent call last):
        File "<string>", line 9, in <module>
      ModuleNotFoundError: No module named 'pypandoc'
      
      During handling of the above exception, another exception occurred:
      
      Traceback (most recent call last):
        File "<string>", line 14, in <module>
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 325, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 295, in _get_build_requires
          self.run_setup()
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 487, in run_setup
          super().run_setup(setup_script=setup_script)
        File "/home/liblaf/.cache/rattler/cache/uv-cache/.tmpc8Jzlx/.venv/lib/python3.11/site-packages/setuptools/build_meta.py", line 311, in run_setup
          exec(code, locals())
        File "<string>", line 12, in <module>
      FileNotFoundError: [Errno 2] No such file or directory: 'README.md'
      ---

Can pip install the package? Does pip install "trimesh[all]" work?

Yes.

What platform are you on?

linux-64

The above log shows that the problem seems to be with embreex, so I tried adding it directly, and got the following error:

$ pixi add --pypi -vv embreex
 INFO pixi::lock_file::outdated: the pypi dependencies of environment 'default' for platform linux-64 are out of date because the requirement 'embreex' could not be satisfied (required by '<environment>')
 INFO pixi::lock_file::update: updated conda packages in the 'default' prefix in 6ms 587us 9ns
 INFO resolve_pypi{group=default platform=linux-64}: pixi::lock_file::resolve: the following python packages are assumed to be installed by conda: libxcrypt 4.4.36, readline 8.2, libgomp 13.2.0, setuptools 69.1.1, xz 5.2.6, libnsl 2.0.1, tzdata 2024a0, libuuid 2.38.1, wheel 0.42.0, libgcc-ng 13.2.0, bzip2 1.0.8, ca-certificates 2024.2.2, libsqlite 3.45.1, ncurses 6.4, pip 24.0, openssl 3.2.1, ld-impl-linux-64 2.40, libzlib 1.2.13, libffi 3.4.2, libexpat 2.6.1, python 3.11.8, tk 8.6.13
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::partial_solution: add_decision: root @ 0a0.dev0
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: Start conflict resolution because incompat satisfied:
   embreex <2.17.7 | >2.17.7, <2.17.7.post1 | >2.17.7.post1, <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post1 | >2.17.7.post1, <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post2 | >2.17.7.post2, <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post3 | >2.17.7.post3, <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex <2.17.7.post4 | >2.17.7.post4 is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: embreex * is forbidden
 INFO resolve_pypi{group=default platform=linux-64}:solve: pubgrub::internal::core: prior cause: root ==0a0.dev0 is forbidden
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only the following versions of embreex are available:
          embreex==2.17.7
          embreex==2.17.7.post1
          embreex==2.17.7.post2
          embreex==2.17.7.post3
          embreex==2.17.7.post4
      and embreex==2.17.7 is unusable because no wheels are available with a matching platform, we can conclude that embreex<2.17.7.post1 cannot be used.
      And because embreex==2.17.7.post1 is unusable because no wheels are available with a matching platform, we can conclude that embreex<2.17.7.post2 cannot be used.
      And because embreex==2.17.7.post2 is unusable because no wheels are available with a matching platform and embreex==2.17.7.post3 is unusable because no wheels are
      available with a matching platform, we can conclude that embreex<2.17.7.post4 cannot be used.
      And because embreex==2.17.7.post4 is unusable because no wheels are available with a matching platform and you require embreex, we can conclude that the
      requirements are unsatisfiable.

@evetion

evetion commented Mar 15, 2024

What did you run and what was the outcome?

pixi add --platform osx-arm64 --pypi "meshkernel==4.1"

What error did pixi return?

  × failed to resolve `pypi-dependencies`, due to underlying error
  ╰─▶ The following packages are incompatible
      └─ meshkernel ==4.1 cannot be installed because there are no viable options:
         └─ meshkernel 4.1.0 is excluded because none of the artifacts are compatible with the Python interpreter or glibc version and there are no supported sdists

Can pip install the package?

pixi run pip install "meshkernel==4.1" works on the osx-arm64 platform.

What platform are you on?

This happens on the osx-64 and osx-arm64 platforms (it works on win-64 and linux-64). I suspect a mismatch in the name or metadata of the macOS-specific .whl files.

Did you find a workaround, if so please explain.

Not yet. It would help if the resolver could be more verbose (-vvv has no effect on it) about how it matches the platform/interpreter/glibc version and what's incompatible. Possibly renaming the wheels on the meshkernel side would be enough to fix it.

Encountered in Deltares/Ribasim#1137

@wolfv
Member

wolfv commented Mar 15, 2024

@evetion that error often means that the system-requirements are not set high enough. I can see that wheels are only available for macos = 14.0 (for arm64). We should do a better job of explaining this.

Unfortunately, setting that in my pixi.toml didn't work just now, so we might also still have to do some more debugging of the uv integration! I'll take a look :)

@wolfv
Member

wolfv commented Mar 15, 2024

Sorry, my bad. meshkernel actually does work fine when you add the following:

[system-requirements]
macos = "14.0"

We should make the error more actionable and figure out a better default behavior.

@wolfv
Member

wolfv commented Mar 15, 2024

@liblaf I wonder if for you it's a similar issue and adding glibc 2.28 or higher would fix it.

e.g.

[system-requirements]
glibc = "2.28"

@liblaf

liblaf commented Mar 15, 2024

@liblaf I wonder if for you it's a similar issue and adding glibc 2.28 or higher would fix it.

e.g.

[system-requirements]
glibc = "2.28"

@wolfv thx! pixi add --pypi "trimesh[all]" now works for me with the following config:

[system-requirements]
libc = "2.39"

@floringogianu

@ruben-arts got around to test this again and now I'm not seeing this error anymore 🤷.

@traversaro
Contributor

traversaro commented Oct 23, 2024

On pixi 0.34.0, with drake (https://github.com/RobotLocomotion/drake) we have this problem:

traversaro@IITBMP014LW012:~/drakepixi$ rm -rf *
traversaro@IITBMP014LW012:~/drakepixi$ pixi init .
✔ Created /home/traversaro/drakepixi/pixi.toml
traversaro@IITBMP014LW012:~/drakepixi$ pixi add python==3.11.* pip
✔ Added python==3.11.*
✔ Added pip >=24.2,<25
traversaro@IITBMP014LW012:~/drakepixi$ pixi add --pypi drake
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only drake>=0.35.0 is available and drake>=0.35.0,<=1.21.0 has no wheels with a matching Python ABI tag,
      we can conclude that drake<0.36.0 cannot be used.
      And because drake>=1.22.0 has no wheels with a matching Python implementation tag and you require drake, we can
      conclude that your requirements are unsatisfiable.

      hint: Pre-releases are available for drake in the requested range (e.g., 0.35.0b1), but pre-releases weren't
      enabled (try: `--prerelease=allow`)
traversaro@IITBMP014LW012:~/drakepixi$ pixi info
System
------------
      Pixi version: 0.34.0
          Platform: linux-64
  Virtual packages: __unix=0=0
                  : __linux=5.15.153.1=0
                  : __glibc=2.39=0
                  : __cuda=12.3=0
                  : __archspec=1=skylake
         Cache dir: /home/traversaro/.cache/rattler/cache
      Auth storage: /home/traversaro/.rattler/credentials.json
  Config locations: No config files found

Global
------------
           Bin dir: /home/traversaro/.pixi/bin
   Environment dir: /home/traversaro/.pixi/envs
      Manifest dir: /home/traversaro/.pixi/manifests/pixi-global.toml

Project
------------
              Name: drakepixi
           Version: 0.1.0
     Manifest file: /home/traversaro/drakepixi/pixi.toml
      Last updated: 23-10-2024 17:06:51

Environments
------------
       Environment: default
          Features: default
          Channels: conda-forge
  Dependency count: 2
      Dependencies: python, pip
  Target platforms: linux-64

@ruben-arts do you prefer this as a new issue or here? Thanks!

fyi @xela-95

@traversaro
Contributor

@ruben-arts do you prefer this as a new issue or here? Thanks!

@ruben-arts no hurry on our side, but just to understand: do you prefer this bug report as a comment here or in a separate issue? Thanks!

@ruben-arts
Contributor Author

ruben-arts commented Oct 30, 2024

@traversaro That problem has to do with the system requirements. I really want to get rid of this issue through #346

Adding the following to the pixi.toml fixes it:

[system-requirements]
libc = "2.35"

Edit: sometimes it helps to look at https://pypi.org/simple/drake/ available wheels.

@traversaro
Contributor

@traversaro That problem has to do with the system requirements. I really want to get rid of this issue through #346

Adding the following to the pixi.toml fixes it:

[system-requirements]
libc = "2.35"

Edit: sometimes it helps to look at https://pypi.org/simple/drake/ available wheels.

@xela-95

@traversaro
Contributor

@traversaro That problem has to do with the system requirements. I really want to get rid of this issue through #346

Adding the following to the pixi.toml fixes it:

[system-requirements]
libc = "2.35"

Edit: sometimes it helps to look at https://pypi.org/simple/drake/ available wheels.

Thanks @ruben-arts, this fixed our problem! Cross-linking #346 that seems to be the actual root issue here.

@hameerabbasi
Contributor

hameerabbasi commented Nov 14, 2024

Hello, my pixi.toml and the error are below:

Error message
❯ pixi install
 WARN The feature 'notebooks' is defined but not used in any environment
 WARN The feature 'matrepr' is defined but not used in any environment
  × failed to solve the pypi requirements of 'mlir-dev' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only finch-mlir<=0.0.2 is available and finch-mlir==0.0.2 has no wheels with a matching Python implementation tag, we can
      conclude that finch-mlir>=0.0.2 cannot be used.
      And because you require finch-mlir>=0.0.2, we can conclude that your requirements are unsatisfiable.
pixi.toml
[project]
authors = ["Hameer Abbasi <[email protected]>"]
channels = ["conda-forge"]
name = "sparse"
platforms = ["osx-arm64", "osx-64", "linux-64", "win-64"]

[pypi-dependencies]
sparse = { path = ".", editable = true }

[system-requirements]
libc = "2.35"

[dependencies]
python = ">=3.10"
numpy = ">=1.17"
numba = ">=0.49"

[feature.extras.dependencies]
dask = ">=2024"
scipy = ">=0.19"
scikit-learn = "*"

[feature.docs.pypi-dependencies]
mkdocs-material = "*"
mkdocstrings = { version = "*", extras = ["python"] }
mkdocs-gen-files = "*"
mkdocs-literate-nav = "*"
mkdocs-section-index = "*"
mkdocs-jupyter = "*"

[feature.tests.tasks]
test = "pytest --pyargs sparse -n auto"
test-mlir = { cmd = "pytest --pyargs sparse.mlir_backend -v" }
test-finch = { cmd = "pytest --pyargs sparse/tests -n auto -v", depends-on = ["precompile"] }

[feature.tests.dependencies]
pytest = ">=3.5"
pytest-cov = "*"
pytest-xdist = "*"
pre-commit = "*"
pytest-codspeed = "*"

[feature.notebooks.dependencies]
nbmake = "*"
matplotlib = "*"

[feature.matrepr.dependencies]
matrepr = "*"

[feature.finch.tasks]
precompile = "python -c 'import finch'"

[feature.finch.pypi-dependencies]
scipy = ">=0.19"
finch-tensor = ">=0.1.31"

[feature.finch.activation.env]
SPARSE_BACKEND = "Finch"

[feature.finch.target.osx-arm64.activation.env]
SPARSE_BACKEND = "Finch"
PYTHONFAULTHANDLER = "${HOME}/faulthandler.log"

[feature.mlir.dependencies]
scipy = ">=0.19"

[feature.mlir.pypi-dependencies]
finch-mlir = ">=0.0.2"

[feature.mlir.activation.env]
SPARSE_BACKEND = "MLIR"

[environments]
tests = ["tests", "extras"]
docs = ["docs", "extras"]
mlir-dev = {features = ["tests", "mlir"], no-default-feature = true}
finch-dev = {features = ["tests", "finch"], no-default-feature = true}

To reproduce, simply run pixi install. This happens on pixi 0.34 through 0.36.

@enrico5

enrico5 commented Nov 15, 2024

The pypi dependency pypylon works fine when the platform is win-64 but fails with linux-64. The wheel name is pypylon-4.0.0-cp312-cp312-manylinux_2_31_x86_64.whl.

pixi init pixi-test --format pyproject
cd pixi-test
pixi add "python>=3.12.0,<3.13"
pixi add --pypi pypylon
× failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only the following versions of pypylon are available:
          pypylon<=2.0.0
          pypylon>=2.2.0,<=3.0.1
          pypylon>=4.0.0
      and pypylon<=1.4.0 has no wheels with a matching Python implementation tag, we can
      conclude that all of:
          pypylon<1.4.0
          pypylon>2.0.0,<2.2.0
          pypylon>3.0.1,<4.0.0
       cannot be used.
      And because all of:
          pypylon>=1.5.4,<=2.0.0
          pypylon>=2.2.0,<=3.0.1
          pypylon>=4.0.0
      have no wheels with a matching Python ABI tag and you require pypylon, we can
      conclude that your requirements are unsatisfiable.

@ruben-arts
Contributor Author

Hey @enrico5,

pypylon is only built for glibc 2.31 and higher. If you check pixi info you should find the glibc version your system has available, and you can then tell pixi to expect that version of glibc for the solve.

For my machine that looks like this:

❯ pixi info | grep glibc
                  : __glibc=2.35=0

Adding the given version to the system-requirements:

[tool.pixi.system-requirements]
libc = "2.35"
❯ pixi add --pypi pypylon
✔ Added pypylon >=4.0.0, <5
Added these as pypi-dependencies.

@enrico5

enrico5 commented Nov 15, 2024

That works great. Thank you!

@Goooyi

Goooyi commented Feb 18, 2025

  1. What did you run and what was the outcome?
    pixi global install openi returns an error message.
    I just installed pixi with brew. Installing ruff with pixi global install ruff worked, but pixi global install openi returns an error.

  2. What error did pixi return?

Error: 
  × Couldn't install openi
  ├─▶ Failed to determine virtual packages for environment openi
  ╰─▶ Cannot solve the request because of: No candidates were found for openi *.
  3. Can pip install the package? Yes, my local pip can install openi, so pip install openi works.

  4. What platform are you on? M1 Mac mini

System
------------
       Pixi version: 0.41.3
           Platform: osx-arm64
   Virtual packages: __unix=0=0
                   : __osx=15.2=0
                   : __archspec=1=m1
          Cache dir: /Users/xxx/Library/Caches/rattler/cache
       Auth storage: /Users/xxx/.rattler/credentials.json
   Config locations: No config files found

Global
------------
            Bin dir: /Users/xxx/.pixi/bin
    Environment dir: /Users/xxx/.pixi/envs
       Manifest dir: /Users/xxx/.pixi/manifests/pixi-global.toml

Did you find a workaround, if so please explain.

  • No

@ruben-arts
Contributor Author

@Goooyi pixi global doesn't support pypi dependencies: #2261

@pablovela5620
Contributor

pablovela5620 commented Feb 25, 2025

Having some weirdness with rerun-sdk; the conda-forge feedstock isn't yet merged for the latest 0.22.1 update. If I try to add it via the CLI:

(pi0-lerobot) ⚡ hocap-dataset ~/pi0-lerobot pixi add rerun-sdk==0.22.1 --pypi
Error: 
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because rerun-sdk==0.22.1 has no wheels with a matching platform tag (e.g., `manylinux_2_28_x86_64`) and you require rerun-sdk==0.22.1, we
      can conclude that your requirements are unsatisfiable.
      
      hint: Wheels are available for `rerun-sdk` (v0.22.1) on the following platforms: `manylinux_2_31_aarch64`, `manylinux_2_31_x86_64`,
      `macosx_10_12_x86_64`, `macosx_11_0_arm64`, `win_amd64`

Error: 
  × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only the following versions of rerun-sdk are available:
          rerun-sdk<=0.21.0
          rerun-sdk==0.22.0
          rerun-sdk==0.22.1
      and rerun-sdk>=0.21.0 has no wheels with a matching platform tag (e.g., `manylinux_2_28_x86_64`), we can conclude that rerun-sdk>=0.21.0
      cannot be used.
      And because lerobot==0.1.0 depends on rerun-sdk>=0.21.0, we can conclude that lerobot==0.1.0 cannot be used.
      And because only lerobot[pusht]==0.1.0 is available and you require lerobot[pusht], we can conclude that your requirements are unsatisfiable.
      
      hint: Pre-releases are available for `rerun-sdk` in the requested range (e.g., 0.22.1rc1), but pre-releases weren't enabled (try:
      `--prerelease=allow`)
      
      hint: Wheels are available for `rerun-sdk` (v0.22.1) on the following platforms: `manylinux_2_31_aarch64`, `manylinux_2_31_x86_64`,
      `macosx_10_12_x86_64`, `macosx_11_0_arm64`, `win_amd64`

but if I link directly to the wheel URL:

rerun-sdk = { url = "https://files.pythonhosted.org/packages/1e/db/3ce2be017d7d4ac0948fb064bf7417b36c96bc07d1b8a922017606ba03c6/rerun_sdk-0.22.1-cp38-abi3-manylinux_2_31_x86_64.whl" }

it works just fine
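
The hint in the log shows that only manylinux_2_31 wheels exist for rerun-sdk 0.22.1, while the resolver was checking against manylinux_2_28. Assuming the host glibc is at least 2.31, the system-requirements workaround used earlier in this thread should also make the wheel eligible without pinning the URL (a sketch, not verified for this package):

```toml
# Sketch: raise the glibc version the resolver may assume,
# so the manylinux_2_31 wheel becomes eligible.
[system-requirements]
libc = "2.31"
```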

@traversaro
Contributor

feedstock isnt yet merged for latest 0.22.1 update

Kind OT, but anyhow: I pinged in conda-forge/rerun-sdk-feedstock#61, feel free to ping yourself in the future if something like this happens, thanks!

@pablovela5620
Contributor

Yes, I should have started with that!

feedstock isnt yet merged for latest 0.22.1 update

Kind OT, but anyhow: I pinged in conda-forge/rerun-sdk-feedstock#61, feel free to ping yourself in the future if something like this happens, thanks!

@dlyz

dlyz commented Apr 3, 2025

  1. What did you run and what was the outcome?

    Command:
    pixi add --pypi flash-attn

    pixi.toml

    [workspace]
    channels = ["conda-forge"]
    name = "pixi-example"
    platforms = ["win-64"]
    version = "0.1.0"
    
    [tasks]
    
    [dependencies]
    python = ">=3.11,<3.13"
    pip = ">=25.0.1,<26"
    setuptools = ">=75.8.2,<76"
    
    [pypi-dependencies]
    torch = { version = "<2.6", index = "https://download.pytorch.org/whl/cu124" }
    torchvision = { version = "<0.21", index = "https://download.pytorch.org/whl/cu124" }
    torchaudio = { version = "<2.6", index = "https://download.pytorch.org/whl/cu124" }
    packaging = ">=24.2, <25"
    einops = ">=0.8.1, <0.9"
  2. What error did pixi return?

    Error:   × Failed to update PyPI packages for environment 'default'
      ├─▶ Failed to prepare distributions
      ├─▶ Failed to build `flash-attn==2.7.4.post1`
      ├─▶ The build backend returned an error
      ╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel` failed (exit code: 1)
          
          [stderr]
          Traceback (most recent call last):
            File "<string>", line 14, in <module>
            File "C:\Users\***\AppData\Local\rattler\cache\uv-cache\builds-v0\.tmp36pAkT\Lib\site-packages\setuptools\build_meta.py", line 334, in get_requires_for_build_wheel
              return self._get_build_requires(config_settings, requirements=[])
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
            File "C:\Users\***\AppData\Local\rattler\cache\uv-cache\builds-v0\.tmp36pAkT\Lib\site-packages\setuptools\build_meta.py", line 304, in _get_build_requires
              self.run_setup()
            File "C:\Users\***\AppData\Local\rattler\cache\uv-cache\builds-v0\.tmp36pAkT\Lib\site-packages\setuptools\build_meta.py", line 522, in run_setup
              super().run_setup(setup_script=setup_script)
            File "C:\Users\***\AppData\Local\rattler\cache\uv-cache\builds-v0\.tmp36pAkT\Lib\site-packages\setuptools\build_meta.py", line 320, in run_setup
              exec(code, locals())
            File "<string>", line 22, in <module>
          ModuleNotFoundError: No module named 'torch'
    
          hint: This error likely indicates that `[email protected]` depends on `torch`, but doesn't declare it as a build dependency. If `flash-attn` is a first-party package, consider adding `torch` to its `build-system.requires`. Otherwise, `uv pip install torch` into the environment and re-run with `--no-build-isolation`.

    import torch works of course using pixi run python

  3. Can pip install the package?
    It works in a pure conda env, but does not work with pip inside pixi (pixi run pip install ...), which fails with ImportError: cannot import name 'setup' from 'setuptools' (unknown location).

  4. What platform are you on?
    win-64

  5. Did you find a workaround, if so please explain.
    I guess one could install torch using conda packages (pytorch-gpu, etc.), but there is no torchaudio for win-64, and I couldn't manage to install torchaudio separately using --pypi either. So no workaround so far.

@traversaro
Contributor

The hint is relevant:

      hint: This error likely indicates that `[email protected]` depends on `torch`, but doesn't declare it as a build dependency. If `flash-attn` is a first-party package, consider adding `torch` to its `build-system.requires`. Otherwise, `uv pip install torch` into the environment and re-run with `--no-build-isolation`.

You can probably disable build isolation for flash-attn, see https://pixi.sh/latest/reference/pixi_manifest/#no-build-isolation. Note that in that case you probably also need to install the other build dependencies into the environment explicitly.
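
As a sketch of what that might look like in the manifest (key names per the linked docs; not verified against this exact project):

```toml
# Sketch: build flash-attn without an isolated build environment,
# so its build deps (torch, setuptools, ...) must already be
# installed in the environment via [dependencies]/[pypi-dependencies].
[pypi-options]
no-build-isolation = ["flash-attn"]
```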

@dlyz

dlyz commented Apr 3, 2025

Probably you can disable build-isolation for flash-attn

Thank you, it helped. I overlooked the note about build isolation and got confused after seeing uv mentioned, sorry. It would be great if pixi printed a message with the link you provided, though I'm not sure it's worth the investment.

@dhirschfeld
Contributor

When trying to install a pyproject.toml project where a dependency depends on pywin32, pixi will throw a failed to resolve pypi dependencies error, even if the dependency is marked with sys_platform == 'win32'.

Example pyproject.toml

[project]
authors = [{name = "Dave Hirschfeld", email = "[email protected]"}]
name = "deleteme"
requires-python = ">= 3.11"
version = "0.1.0"
dependencies = [
  "httpx>=0.27.0",
  "httpx-negotiate-sspi>=0.28.1; sys_platform == 'win32'",
]

[build-system]
build-backend = "hatchling.build"
requires = ["hatchling"]



[tool.pixi.workspace]
channels = ["conda-forge"]
platforms = [
    "linux-64",
    "win-64"
]


[tool.pixi.pypi-dependencies]
deleteme = { path = ".", editable = true }
❯ pixi install --all
Error:   × failed to solve the pypi requirements of 'default' 'linux-64'
  ├─▶ failed to resolve pypi dependencies
  ╰─▶ Because only the following versions of pywin32 are available:
          pywin32<=223
          pywin32==224
          pywin32==225
          pywin32==226
          pywin32==227
          pywin32==228
          pywin32==300
          pywin32==301
          pywin32==302
          pywin32==303
          pywin32==304
          pywin32==305
          pywin32==306
          pywin32==307
          pywin32==308
          pywin32==309
          pywin32==310
      and pywin32>=223,<=305 has no wheels with a matching Python ABI tag (e.g., `cp312`), we can conclude that pywin32>=223,<=305 cannot
      be used.
      And because pywin32>=306 has no wheels with a matching platform tag (e.g., `manylinux_2_28_x86_64`) and httpx-negotiate-sspi==0.28.1
      depends on pywin32>=223, we can conclude that httpx-negotiate-sspi==0.28.1 cannot be used.
      And because only httpx-negotiate-sspi<=0.28.1 is available and you require httpx-negotiate-sspi>=0.28.1, we can conclude that your
      requirements are unsatisfiable.
      
      hint: You require CPython 3.12 (`cp312`), but we only found wheels for `pywin32` (v305) with the following Python ABI tags: `cp36m`,
      `cp37m`, `cp38`, `cp39`, `cp310`, `cp311`
      
      hint: Wheels are available for `pywin32` (v310) on the following platforms: `win32`, `win_amd64`, `win_arm64`

Note

This is when trying to install the project from a linux box.

Contributor

tdejager commented Apr 10, 2025

Yes, you are right, this is kind of a big issue: we should pass the environment markers correctly to uv, which we currently do not seem to do.

@dhirschfeld
Contributor

I have to have things working on both Windows and Linux so this one is definitely biting me.

It's not a show-stopper though - I just make those deps [tool.pixi.target.win-64.dependencies] which works fine for pixi envs. Things are a little broken for pip users, but I'm trying to discourage the use of pip anyway! 🤣
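
For reference, the workaround described above might look roughly like this in the pyproject.toml (a sketch; which table fits depends on whether the package is available on conda-forge or only on PyPI):

```toml
# Sketch: move the marker-guarded dependency out of
# [project].dependencies into a win-64-only target table.
# (Use target.win-64.pypi-dependencies instead if the package is
# only on PyPI -- an assumption, not stated in the comment above.)
[tool.pixi.target.win-64.dependencies]
httpx-negotiate-sspi = ">=0.28.1"
```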

Contributor

Aha okay! Good to know 😄

@atonderski

Also ran into this issue, and the proposed workaround breaks compatibility with, for example, a direct uv pip install.

Any estimate on if/when this will be fixed?
