Description
I just upgraded the NuGet packages from 0.16 to 0.18 (CPU backend). The simple test console program below still compiles, but it no longer runs. Inside ChatSession.ChatAsync I get the exception:
System.NullReferenceException: 'Object reference not set to an instance of an object.'
Stack trace:
LLama.InteractiveExecutor.InferInternal(LLama.Abstractions.IInferenceParams, LLama.StatefulExecutorBase.InferStateArgs)
LLama.StatefulExecutorBase.InferAsync(string, LLama.Abstractions.IInferenceParams, System.Threading.CancellationToken)
LLama.LLamaTransforms.KeywordTextOutputStreamTransform.TransformAsync(System.Collections.Generic.IAsyncEnumerable<string>)
LLama.LLamaTransforms.KeywordTextOutputStreamTransform.TransformAsync(System.Collections.Generic.IAsyncEnumerable<string>)
LLama.ChatSession.ChatAsyncInternal(string, LLama.Abstractions.IInferenceParams, System.Threading.CancellationToken)
LLama.ChatSession.ChatAsyncInternal(string, LLama.Abstractions.IInferenceParams, System.Threading.CancellationToken)
LLama.ChatSession.ChatAsync(LLama.Common.ChatHistory.Message, bool, LLama.Abstractions.IInferenceParams, System.Threading.CancellationToken)
LLama.ChatSession.ChatAsync(LLama.Common.ChatHistory.Message, bool, LLama.Abstractions.IInferenceParams, System.Threading.CancellationToken)
LLamaSharpConsole.Program.Main(string[]) in Program.cs
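The exception surfaces at the await foreach over session.ChatAsync in the program below. A minimal way to capture the full trace (just a sketch for isolating the failure, not part of the original program) is to wrap that loop:

try
{
    await foreach (var text in session.ChatAsync(
        new ChatHistory.Message(AuthorRole.User, userInput + "\nAssistant: "),
        inferenceParams))
    {
        Console.Write(text);
    }
}
catch (NullReferenceException ex)
{
    // The NRE is thrown from LLama.InteractiveExecutor.InferInternal;
    // logging ex here gives the full stack shown above.
    Console.Error.WriteLine(ex);
}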
Reproduction Steps
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using LLama;
using LLama.Common;
using LLama.Sampling;

namespace LLamaSharpConsole
{
    internal class Program
    {
        static async Task Main(string[] args)
        {
            string modelPath = @"D:\Projects\dotNet\Solutions\LLMPlayground\Models\model.gguf";
            //string modelPath = @"D:\Projects\dotNet\Solutions\LLMPlayground\Models\model.quantized.gguf";

            var parameters = new ModelParams(modelPath)
            {
                ContextSize = 4096,
                GpuLayerCount = 10
            };

            using var model = LLamaWeights.LoadFromFile(parameters);
            using var context = model.CreateContext(parameters);
            var executor = new InteractiveExecutor(context);

            ChatHistory chatHistory = new ChatHistory();
            ChatSession session = new(executor, chatHistory);
            session.WithOutputTransform(new LLamaTransforms.KeywordTextOutputStreamTransform(
                new string[] { "User:", "Assistant:", "�" },
                redundancyLength: 8));

            InferenceParams inferenceParams = new InferenceParams()
            {
                AntiPrompts = new List<string> { "User:", "Assistant:" },
                //SamplingPipeline = new Mirostat2SamplingPipeline()
                //{
                //    Tau = 1f,
                //    Eta = 0.2f
                //}
            };

            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.WriteLine("The chat session has started.");
            Console.Write("\r\nReady> ");

            // show the prompt
            Console.ForegroundColor = ConsoleColor.Green;
            string userInput = Console.ReadLine() ?? "";

            while (userInput != "exit")
            {
                Console.ForegroundColor = ConsoleColor.White;
                await foreach (
                    var text
                    in session.ChatAsync(
                        new ChatHistory.Message(AuthorRole.User, userInput + "\nAssistant: "),
                        inferenceParams))
                {
                    Console.Write(text);
                }

                Console.ForegroundColor = ConsoleColor.Yellow;
                Console.Write("\r\nReady> ");
                Console.ForegroundColor = ConsoleColor.Green;
                userInput = Console.ReadLine() ?? "";
            }

            Console.ForegroundColor = ConsoleColor.White;
        }
    } //end of class Program
}
Environment & Configuration
- Windows 10
- .NET 8.0
- LLamaSharp version: 0.18
- CUDA version (if you are using cuda backend): none, CPU backend
- CPU & GPU device: x64
Known Workarounds
No response