
Conversation

@edgett (Contributor) commented Dec 24, 2023

This fixes the build step: `Compile (Windows) (avx512, -DLLAMA_AVX512=ON -LLAMA_AVX512_VBMI=ON -DLLAMA_AVX512_VNNI=ON)`

There is a typo in the cmake command:

`cmake .. -DLLAMA_NATIVE=OFF -DLLAMA_BUILD_TESTS=OFF -DLLAMA_BUILD_EXAMPLES=OFF -DLLAMA_BUILD_SERVER=OFF -DBUILD_SHARED_LIBS=ON -DLLAMA_AVX512=ON -LLAMA_AVX512_VBMI=ON -DLLAMA_AVX512_VNNI=ON`
`cmake --build . --config Release -j ${env:NUMBER_OF_PROCESSORS}`

`-LLAMA_AVX512_VBMI` should be `-DLLAMA_AVX512_VBMI`.

Here is a link to the build error: https://github.com/edgett/LLamaSharp/actions/runs/7316984325/job/19932084747#logs

And a working build after this change:
https://github.com/edgett/LLamaSharp/actions/runs/7317114102/job/19932341049
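
For reference, a sketch of what the corrected step would run, based on the description above (the only change is adding the `-D` prefix to `LLAMA_AVX512_VBMI`; all other flags are copied unchanged from the existing workflow):

```powershell
# Configure: note -DLLAMA_AVX512_VBMI=ON (the -D prefix was previously missing)
cmake .. -DLLAMA_NATIVE=OFF -DLLAMA_BUILD_TESTS=OFF -DLLAMA_BUILD_EXAMPLES=OFF -DLLAMA_BUILD_SERVER=OFF -DBUILD_SHARED_LIBS=ON -DLLAMA_AVX512=ON -DLLAMA_AVX512_VBMI=ON -DLLAMA_AVX512_VNNI=ON

# Build in Release mode, parallelized across the runner's processors
cmake --build . --config Release -j ${env:NUMBER_OF_PROCESSORS}
```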

@martindevans (Member) commented:

Thank you! I'm sure that would have caused me trouble next time I came to do a binary update!

@martindevans martindevans merged commit 889d99b into SciSharp:master Dec 24, 2023