
Bug: RPC server doesn't load GPU if I use Vulkan  #8536

Closed
@metal3d

What happened?

I compiled llama.cpp with the Vulkan backend. The "rpc-server" binary is linked against libvulkan, but it never uses my GPUs, while "llama-cli" works fine.
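My guess (not verified) is that rpc-server picks its backend at compile time and simply has no Vulkan branch, so it always falls back to the CPU backend; the "create_backend: using CPU backend" line in the log output below matches that. A minimal sketch of what a Vulkan-aware selection could look like, assuming ggml's Vulkan header (ggml-vulkan.h) and ggml_backend_vk_init(dev_num) are available when built with GGML_VULKAN:

// Sketch only, not the actual rpc-server code: backend selection with a
// Vulkan branch, falling back to CPU when no GPU backend is compiled in.
#include "ggml-backend.h"
#ifdef GGML_USE_VULKAN
#include "ggml-vulkan.h"
#endif
#include <cstdio>

static ggml_backend_t create_backend() {
    ggml_backend_t backend = NULL;
#ifdef GGML_USE_VULKAN
    fprintf(stderr, "%s: using Vulkan backend\n", __func__);
    backend = ggml_backend_vk_init(0); // first Vulkan device
    if (!backend) {
        fprintf(stderr, "%s: ggml_backend_vk_init() failed\n", __func__);
    }
#endif
    if (!backend) {
        // no GPU backend compiled in (or it failed to initialize) -> CPU fallback,
        // which is what the log output below shows
        fprintf(stderr, "%s: using CPU backend\n", __func__);
        backend = ggml_backend_cpu_init();
    }
    return backend;
}

With something like this, rpc-server would pick up a Vulkan device the same way llama-cli does, and would still fall back to CPU when Vulkan is not compiled in.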

Name and Version

version: 3384 (4e24cff)
built with cc (GCC) 14.1.1 20240701 (Red Hat 14.1.1-7) for x86_64-redhat-linux

What operating system are you seeing the problem on?

Linux

Relevant log output

./rpc-server
create_backend: using CPU backend
Starting RPC server on 0.0.0.0:50052, backend memory: 23967 MB


ldd ./rpc-server
        linux-vdso.so.1 (0x00007f18759f2000)
        libllama.so => /home/metal3d/Projects/ML/llama.cpp/build-rpc/src/libllama.so (0x00007f1875879000)
        libggml.so => /home/metal3d/Projects/ML/llama.cpp/build-rpc/ggml/src/libggml.so (0x00007f1875400000)
        libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f1875000000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f187531c000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f187582b000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f1874e0f000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f18759f4000)
        libvulkan.so.1 => /lib64/libvulkan.so.1 (0x00007f18757af000)
        libgomp.so.1 => /lib64/libgomp.so.1 (0x00007f18752c6000)

Labels

bug-unconfirmed, low severity (used to report low severity bugs in llama.cpp, e.g. cosmetic issues, non-critical UI glitches)
