
Build with hipblas failed after recent changes #4525


Closed
sorasoras opened this issue Dec 18, 2023 · 3 comments · Fixed by #4528

Comments

@sorasoras

sorasoras commented Dec 18, 2023

# Prerequisites

Please answer the following questions for yourself before submitting an issue.

- [x] I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- [x] I carefully followed the README.md.
- [x] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [x] I reviewed the Discussions, and have a new bug or useful enhancement to share.

# Expected Behavior

I have compiled successfully many times before, but after the changes in b1658 I was no longer able to compile.

# Current Behavior

Build commands used:
cmake .. -G "Ninja" -DCMAKE_BUILD_TYPE=Release -DLLAMA_HIPBLAS=ON -DLLAMA_CUDA_DMMV_X=64 -DLLAMA_CUDA_MMV_Y=4 -DCMAKE_C_COMPILER="C:/Program Files/AMD/ROCm/5.5/bin/clang.exe" -DCMAKE_CXX_COMPILER="C:/Program Files/AMD/ROCm/5.5/bin/clang++.exe" -DAMDGPU_TARGETS="gfx1100"

cmake --build . -j 16

Please provide a detailed written description of what llama.cpp did, instead.
build_log.txt

Windows 11, full ROCm SDK 5.5

cmake version 3.26.4

GNU Make 4.4.1
Built for x86_64-w64-mingw32

g++.exe (MinGW-W64 x86_64-msvcrt-posix-seh, built by Brecht Sanders) 13.1.0
Copyright (C) 2023 Free Software Foundation, Inc.


# Failure Information (for bugs)

Please help provide information about the failure / bug.


# Steps to Reproduce

Please provide detailed steps for reproducing the issue. We are not sitting in front of your screen, so the more detail the better.

1. Configure:
cmake .. -G "Ninja" -DCMAKE_BUILD_TYPE=Release -DLLAMA_HIPBLAS=ON -DLLAMA_CUDA_DMMV_X=64 -DLLAMA_CUDA_MMV_Y=4 -DCMAKE_C_COMPILER="C:/Program Files/AMD/ROCm/5.5/bin/clang.exe" -DCMAKE_CXX_COMPILER="C:/Program Files/AMD/ROCm/5.5/bin/clang++.exe" -DAMDGPU_TARGETS="gfx1100"
2. Build:
cmake --build . --config Release

# Failure Logs
```
FAILED: CMakeFiles/ggml-rocm.dir/ggml-cuda.cu.obj
C:\PROGRA~1\AMD\ROCm\5.5\bin\CLANG_~1.EXE -DGGML_CUDA_DMMV_X=128 -DGGML_CUDA_MMV_Y=4 -DGGML_USE_CUBLAS -DGGML_USE_HIPBLAS -DK_QUANTS_PER_ITERATION=2 -D_CRT_SECURE_NO_WARNINGS -D_XOPEN_SOURCE=600 -D__HIP_PLATFORM_AMD__=1 -D__HIP_PLATFORM_HCC__=1 -isystem "C:/Program Files/AMD/ROCm/5.5/include" -O3 -DNDEBUG -D_DLL -D_MT -Xclang --dependent-lib=msvcrt -std=gnu++14 -mllvm -amdgpu-early-inline-all=true -mllvm -amdgpu-function-calls=false -x hip --offload-arch=gfx1100 -MD -MT CMakeFiles/ggml-rocm.dir/ggml-cuda.cu.obj -MF CMakeFiles\ggml-rocm.dir\ggml-cuda.cu.obj.d -o CMakeFiles/ggml-rocm.dir/ggml-cuda.cu.obj -c "W:/git/New folder/llama.cpp/ggml-cuda.cu"
W:/git/New folder/llama.cpp/ggml-cuda.cu:8394:5: error: unknown type name 'cublasComputeType_t'
    cublasComputeType_t cu_compute_type = CUBLAS_COMPUTE_16F;
    ^
W:/git/New folder/llama.cpp/ggml-cuda.cu:8395:5: error: unknown type name 'cudaDataType_t'
    cudaDataType_t      cu_data_type    = CUDA_R_16F;
    ^
2 errors generated when compiling for gfx1100.
```


[build_log.txt](https://github.com/ggerganov/llama.cpp/files/13707336/build_log.txt)

@selvaunidi

@slaren Can you explain to me better what it means, I'm a newbie at this. Does the problem have a solution? Or is the board not useful for inference? I'm not a developer, just a user. Thank you very much for helping me! hugs.

@slaren
Member

slaren commented Jan 25, 2024

I am not sure what you are asking, this issue was already fixed.

@selvaunidi

> @slaren Can you explain to me better what it means, I'm a newbie at this. Does the problem have a solution? Or is the board not useful for inference? I'm not a developer, just a user. Thank you very much for helping me! hugs.

I'm sorry for my ignorance, I know I'm in the wrong place. But, if you said that the problem was corrected, that's a sign that I can make an inference with the board. Thank you very much. Now, I need to go somewhere where they give me a step by step guide. I know this isn't the place, but you gave me hope. I'm going to bother other people on discord and reddit. Again, thank you very much for your promptness! you are innovating the world!
