
Conversation

@Coopaguard (Contributor) commented Feb 13, 2025

  • Add musl compilation of the LlamaCPP libs
  • Detect Alpine OS (see the detection sketch after the notes below)
  • Package

Notes:
Vulkan & CUDA will not be supported at this time.
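
For illustration only (a hedged sketch, not the code added by this PR), one way a .NET loader could detect an Alpine/musl environment at runtime:

```csharp
// Hypothetical sketch: detect a musl-based Linux (e.g. Alpine) so a loader
// could pick musl-compiled llama.cpp binaries instead of the glibc ones.
// Not the PR's actual implementation.
using System;
using System.IO;
using System.Runtime.InteropServices;

static class MuslDetection
{
    public static bool IsMuslRuntime()
    {
        if (!RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
            return false;

        // .NET reports musl-specific runtime identifiers such as "linux-musl-x64".
        if (RuntimeInformation.RuntimeIdentifier.Contains("musl", StringComparison.OrdinalIgnoreCase))
            return true;

        // Fallback: Alpine ships a release marker file.
        return File.Exists("/etc/alpine-release");
    }
}
```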

@Coopaguard marked this pull request as draft February 14, 2025 09:58
@Coopaguard marked this pull request as ready for review February 14, 2025 10:41
@martindevans (Member) commented:
Hi @Coopaguard, thanks for this PR. I don't personally know much at all about musl or Alpine; could you explain a bit about what they're for? From a bit of research it looks like this would improve compatibility with Docker?

@martindevans (Member) commented Feb 14, 2025

Important Note To Reviewers: BEFORE MERGING do a binary update that includes the new musl binaries in deps.zip!

@Coopaguard (Contributor, Author) commented Feb 15, 2025

Important Note: BEFORE MERGING do a binary update that includes the new musl binaries in deps.zip!

Hi @martindevans, I don't know how to do so.

I can't run the Update Binaries action manually, so I built it on my fork of LLamaSharp and got a deps.json file, but I don't know how to add it to the project through git.

@Coopaguard (Contributor, Author) commented:
Hi @Coopaguard, thanks for this PR. I don't personally know much at all about musl or Alpine; could you explain a bit about what they're for? From a bit of research it looks like this would improve compatibility with Docker?

Alpine Linux uses the musl C standard library instead of glibc, which most other Linux distributions use, so native libraries generally need to be compiled against musl in order to run on Alpine.

.NET 8, Alpine & Docker
When creating Docker containers for .NET 8 applications, Alpine Linux is often used as the base image due to its small size and security.

So yes, this should allow us to create Docker images of LLama.WebAPI or LLama.Web.
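
As a rough sketch of why the musl build matters for such images (hypothetical directory names, not LLamaSharp's actual native-library layout), a resolver could branch on the runtime identifier:

```csharp
// Hypothetical sketch: choose between glibc and musl builds of libllama.so
// based on the runtime identifier. Paths are illustrative only.
using System;
using System.IO;
using System.Runtime.InteropServices;

class NativePicker
{
    static void Main()
    {
        string rid = RuntimeInformation.RuntimeIdentifier;   // e.g. "linux-x64" or "linux-musl-x64"
        string nativeDir = rid.Contains("musl", StringComparison.OrdinalIgnoreCase)
            ? Path.Combine("runtimes", "linux-musl-x64", "native")
            : Path.Combine("runtimes", "linux-x64", "native");

        string libPath = Path.Combine(nativeDir, "libllama.so");
        Console.WriteLine($"Would load native library from: {libPath}");
        // NativeLibrary.Load(libPath); // actual load, once the musl binaries are in deps.zip
    }
}
```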

@martindevans (Member) commented Feb 15, 2025

Hi @martindevans, I don't know how to do so.

Really sorry, I should have been clearer there! I've updated the comment to clarify - that was a note to people reviewing/merging to do that before merging.

@martindevans (Member) left a comment


Thanks for your work putting this together! I'll include this when I'm doing the next binary update in the coming weeks, and it should be included in the next version of LLamaSharp.

@martindevans mentioned this pull request Mar 9, 2025
@martindevans (Member) commented:
These changes have been merged into #1126 and will land when that PR is merged, so I'll close this one. Thanks for the hard work @Coopaguard ❤️
