
Commit 478b500

chore: update flags to stay consistent with llama.cpp
1 parent c097d5c


2 files changed: 4 additions, 3 deletions

README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -28,8 +28,8 @@ There are several installation options:
 # use OpenBLAS
 $ gem install llama_cpp -- --with-openblas

-# use cuBLAS
-$ gem install llama_cpp -- --with-cublas
+# use CUDA
+$ gem install llama_cpp -- --with-cuda
 ```

 Those options are defined in [extconf.rb](https://github.com/yoshoku/llama_cpp.rb/blob/main/ext/llama_cpp/extconf.rb) by with_config method.
````
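For context, `with_config` comes from Ruby's standard mkmf library and reports whether a `--with-<name>` flag was passed after the `--` separator at `gem install` time. A minimal sketch of that mechanism, using the flag name from the README example (illustrative only, not the gem's full extconf.rb):

```ruby
# Sketch: reading an install-time flag in an extconf.rb via mkmf.
# Invoked as:  gem install llama_cpp -- --with-cuda
require 'mkmf'

# with_config('cuda') is truthy when --with-cuda (or --with-cuda=VALUE)
# was supplied on the gem install command line.
if with_config('cuda')
  puts 'CUDA support requested'
end
```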

ext/llama_cpp/extconf.rb

Lines changed: 2 additions & 1 deletion

```diff
@@ -15,7 +15,8 @@
 make_envs << ' LLAMA_NO_ACCELERATE=1' if with_config('no-accelerate')
 make_envs << ' LLAMA_OPENBLAS=1' if with_config('openblas')
 make_envs << ' LLAMA_BLIS=1' if with_config('blis')
-make_envs << ' LLAMA_CUBLAS=1' if with_config('cublas')
+make_envs << ' LLAMA_CUBLAS=1' if with_config('cublas') # Deprecated, use --with-cuda instead
+make_envs << ' LLAMA_CUDA=1' if with_config('cuda')
 make_envs << ' LLAMA_CLBLAST=1' if with_config('clblast')
 make_envs << ' LLAMA_HIPBLAS=1' if with_config('hipblas')
 make_envs << ' LLAMA_MPI=1' if with_config('mpi')
```
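After this change, `--with-cublas` still sets `LLAMA_CUBLAS=1` but is marked deprecated, while the new `--with-cuda` sets `LLAMA_CUDA=1`, matching the flag name used upstream in llama.cpp. A hedged sketch of how the deprecated spelling could additionally emit a warning (the warning itself is not part of this commit; it is only an illustration):

```ruby
# Illustrative only: warn when the deprecated flag is used, while still
# honoring it. Not taken from the repository.
require 'mkmf'

make_envs = +''
if with_config('cublas')
  warn 'llama_cpp: --with-cublas is deprecated, use --with-cuda instead'
  make_envs << ' LLAMA_CUBLAS=1'
end
make_envs << ' LLAMA_CUDA=1' if with_config('cuda')
```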
