
.Net: Bug: When using ChatCompletionAgent and the locally deployed llama3.2:3b model, the user's Chinese question became garbled in the function call parameters. #12103

Open
yong-zhang-newtera opened this issue May 16, 2025 · 0 comments
Assignees
Labels
bug Something isn't working .NET Issue or Pull requests regarding .NET code

Comments


yong-zhang-newtera commented May 16, 2025

Describe the bug
Framework: Microsoft Semantic Kernel 1.49.0

I am testing ChatCompletionAgent with a locally deployed llama3.2:3b model to query a knowledge base containing Chinese data via a text search plugin. When a user asks a question in Chinese, the agent does invoke the text search plugin, but the Chinese text in the function call parameters is garbled, which causes the search to fail. Please see the screenshot below:

Screenshots

[Screenshot: the search plugin is invoked with garbled Chinese text in its function call parameters]

I attach part of my code below:

```csharp
kernelBuilder.Services.AddOllamaChatCompletion(
    modelId: LLMConfig.Instance.ConfigModel.ModelId,
    endpoint: new Uri(LLMConfig.Instance.ConfigModel.ApiEndpoint));

// Build the kernel before registering the search plugin on it.
var kernel = kernelBuilder.Build();

var textEmbeddingGeneration = vectorStoreFixture.TextEmbeddingGenerationService;
var vectorSearch = vectorStoreFixture.VectorStoreRecordCollection;
var customVectorSearch = new CustomVectorSearch(vectorSearch, threshold);

// Create a text search instance using the InMemory vector store.
var textSearch = new VectorStoreTextSearch<VectorRecordModel>(
    customVectorSearch,
    textEmbeddingGeneration);

// Expose the text search as a plugin so the agent can call it via function calling.
var searchPlugin = KernelPluginFactory.CreateFromFunctions(
    pluginName, description,
    [textSearch.CreateGetTextSearchResults(searchOptions: searchOptions)]);

kernel.Plugins.Add(searchPlugin);

// Agent configured to let the model choose functions automatically.
ChatCompletionAgent faqAgent =
    new()
    {
        Name = "SearchFAQAgent",
        Instructions = LLMConfig.Instance.ConfigModel.Instructions,
        Kernel = kernel,
        Arguments =
            new KernelArguments(new OllamaPromptExecutionSettings()
            {
                FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
            })
    };
```
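For reference, a minimal sketch of how the agent is then invoked (not part of the original snippet): the exact `InvokeAsync` overload, the response shape, and the sample question are assumptions and may differ slightly between Semantic Kernel versions.

```csharp
// Sketch only: the overload and response type below are assumptions and may vary by SK version.
// The user's question is in Chinese; the agent forwards it to the search plugin,
// and that is where the text shows up garbled.
ChatMessageContent question = new(AuthorRole.User, "知识库中关于退货政策的规定是什么？");

await foreach (var response in faqAgent.InvokeAsync(question))
{
    Console.WriteLine(response.Message.Content);
}
```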

Platform

  • Language: C#
  • AI model: llama3.2:3b
  • IDE: Visual Studio
  • OS: Windows
@yong-zhang-newtera yong-zhang-newtera added the bug Something isn't working label May 16, 2025
@markwallace-microsoft markwallace-microsoft added .NET Issue or Pull requests regarding .NET code triage labels May 16, 2025
@github-actions github-actions bot changed the title Bug: When using ChatCompletionAgent and the locally deployed llama3.2:3b model, the user's Chinese question became garbled in the function call parameters. .Net: Bug: When using ChatCompletionAgent and the locally deployed llama3.2:3b model, the user's Chinese question became garbled in the function call parameters. May 16, 2025