Conversation

saddam213
Collaborator

This builds on top of PR #64, as that brought in the new native API changes needed to allow multiple contexts per model instance.

Due to some unfortunate naming there are quite a few file changes, but I did the renames in a separate commit to make review a bit less painful.

LLamaModel was renamed to LLamaModelContext and has very few changes; it now just takes a model as a parameter.

The basic flow is now:
LLamaModel -> LLamaModelContext -> ILLamaExecutor

All examples should be updated and working, and the Web interface actively uses multiple contexts; just open more tabs.
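
To illustrate the new flow, here is a minimal sketch of what caller code might look like. The type names follow the description above, but the constructor and executor details (ModelParams, InteractiveExecutor, Infer) are assumptions for illustration rather than code taken from this PR:

```csharp
using System;
using LLama;
using LLama.Common;

// A minimal sketch, assuming LLamaModelContext takes the model as its
// constructor argument and InteractiveExecutor implements ILLamaExecutor.
var modelParams = new ModelParams("path/to/model.bin");

// One model loaded once...
using var model = new LLamaModel(modelParams);

// ...shared by several independent contexts (e.g. one per browser tab in the Web demo).
using var contextA = new LLamaModelContext(model);
using var contextB = new LLamaModelContext(model);

// Each context feeds its own executor, so the conversations do not interfere
// while sharing the same set of model weights in memory.
var executorA = new InteractiveExecutor(contextA);
var executorB = new InteractiveExecutor(contextB);

foreach (var token in executorA.Infer("Hello from context A"))
    Console.Write(token);
```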

@martindevans
Member

Thanks for doing this; I was planning to investigate something similar once my other PR was merged. LGTM 👍

In future PRs I'm hoping to push some of what's done in the LLamaModelContext down into the SafeLlamaContextSafeHandle, with the intent that anything you can do with a llama_context in llama.cpp can be done with a SafeLlamaContextSafeHandle in LlamaSharp. I don't think this PR blocks that change though.

@martindevans martindevans mentioned this pull request Jul 28, 2023
@saddam213 saddam213 closed this Aug 3, 2023
@saddam213 saddam213 deleted the Multi_Context branch August 3, 2023 06:41