
cuBLAS, GPU compile instructions not working #213

Closed as not planned

Description

@Free-Radical
          > @Free-Radical Try with `CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 pip install llama-cpp-python`.

This did not work; the build still reports BLAS = 0.

Originally posted by @Free-Radical in #113 (comment)
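For anyone hitting the same BLAS = 0 result, below is a minimal sketch of a clean cuBLAS rebuild plus a quick verification step. It assumes the common cause that pip reused a previously built CPU-only wheel; `--upgrade`, `--force-reinstall`, and `--no-cache-dir` are standard pip options, and the model path in the check is a placeholder.

```sh
# Force a clean source build with cuBLAS enabled; --no-cache-dir prevents
# pip from reusing a previously built CPU-only wheel.
CMAKE_ARGS="-DLLAMA_CUBLAS=on" FORCE_CMAKE=1 \
  pip install --upgrade --force-reinstall --no-cache-dir llama-cpp-python

# Quick check: loading a model prints llama.cpp's system info line;
# "BLAS = 1" means the cuBLAS build took effect (model path is a placeholder).
python -c "from llama_cpp import Llama; Llama(model_path='/path/to/model.bin', n_gpu_layers=32)"
```

If BLAS still shows 0 after this, checking the pip build output for whether CMake actually found the CUDA toolkit is a reasonable next step.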
