This repository was archived by the owner on Oct 25, 2024. It is now read-only.

No effect while running models with `trust_remote_code=True` #1352

@mudler

Description


Hi 👋

I'm the LocalAI author here, and I'm trying to implement transformers support for Intel GPUs in mudler/LocalAI#1746.

I'm struggling to make the example here work. Following the quick start in this repository on top of the oneAPI container image (and installing intel-extension-for-transformers with pip), the `trust_remote_code` option seems to be completely ignored:

/usr/local/lib/python3.10/dist-packages/torchvision/io/image.py:13: UserWarning: Failed to load image Python extension: '' If you don't plan on using image functionality from `torchvision.io`, you can ignore this warning. Otherwise, there might be something wrong with your environment. Did you have `libjpeg` or `libpng` installed before building `torchvision` from source?
  warn(
2024-03-06 19:29:10,005 - datasets - INFO - PyTorch version 2.1.0a0+cxx11.abi available.
qwen.tiktoken: 100%|██████████████████████████████████████████████████████████████| 2.56M/2.56M [00:00<00:00, 4.06MB/s]
config.json: 100%|████████████████████████████████████████████████████████████████████| 911/911 [00:00<00:00, 2.99MB/s]
The repository for Qwen/Qwen-7B contains custom code which must be executed to correctly load the model. You can inspect the repository content at https://hf.co/Qwen/Qwen-7B.
You can avoid this prompt in future by passing the argument `trust_remote_code=True`.

To note, I have the latest transformers (4.38.2) and I just followed the documentation. Everything else seems to work, but `trust_remote_code` appears to be completely ignored.
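For reference, this is roughly the call path I'd expect to work: the backend collects the loading options and forwards them unchanged to `from_pretrained`, so the `trust_remote_code=True` flag should suppress the interactive prompt for Qwen's custom code. A minimal sketch (the `build_load_kwargs` helper is hypothetical, just to illustrate that the flag must survive the forwarding step; the actual `from_pretrained` call is commented out since it downloads the model):

```python
def build_load_kwargs(model_name: str, trust_remote_code: bool = True) -> dict:
    """Collect the kwargs that must reach AutoModelForCausalLM.from_pretrained.

    Hypothetical helper mirroring what a backend like LocalAI does when it
    forwards user-supplied loading options to transformers.
    """
    return {
        "pretrained_model_name_or_path": model_name,
        # If this key is dropped anywhere along the way, transformers falls
        # back to the interactive "contains custom code" prompt seen above.
        "trust_remote_code": trust_remote_code,
    }

kwargs = build_load_kwargs("Qwen/Qwen-7B")

# With the flag intact, loading should proceed without prompting:
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained(**kwargs)
```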
