How to get Multi Turn responses? #455
Hi, I want to get multi-turn responses. ChatGPT has suggested I format the message array like this:

```json
[
    {
        "role": "system",
        "content": "You are a helpful assistant."
    },
    {
        "role": "user",
        "content": "Hello, how are you?"
    },
    {
        "role": "assistant",
        "content": "I'm good, thank you!"
    },
    {
        "role": "user",
        "content": "What is the weather in Tokyo?"
    }
]
```

However, this does not work; it makes the model respond weirdly. Please help, thanks!
Replies: 1 comment
You can set the current chat history of a `LlamaChatSession` and then call the `.prompt(...)` function with the new prompt:

```typescript
import path from "path";
import {fileURLToPath} from "url";
import {getLlama, LlamaChatSession, ChatHistoryItem} from "node-llama-cpp";

// node-llama-cpp is an ESM package, so __dirname has to be derived manually
const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "Meta-Llama-3.1-8B-Instruct.Q4_K_M.gguf")
});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// Restore the earlier turns of the conversation.
// Note that node-llama-cpp uses type "model" (not "assistant"),
// and the model's message is an array of response segments.
const chatHistory: ChatHistoryItem[] = [
    {
        type: "system",
        text: "You are a helpful assistant."
    },
    {
        type: "user",
        text: "Hello, how are you?"
    },
    {
        type: "model",
        response: ["I'm good, thank you!"]
    }
];
session.setChatHistory(chatHistory);

// The new user message goes to .prompt(...), not into the history
const res = await session.prompt("What is the weather in Tokyo?");
console.log("AI: " + res);
```
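If your conversation is already stored in the OpenAI-style `{role, content}` shape from the question, it can be converted mechanically. This is a minimal sketch: `HistoryItem` below is a local stand-in for the shapes node-llama-cpp's `ChatHistoryItem` uses above, and `toChatHistory` is a hypothetical helper for illustration, not part of the library.

```typescript
// OpenAI-style message, as in the question
type OpenAiMessage = {
    role: "system" | "user" | "assistant",
    content: string
};

// Local stand-in mirroring the ChatHistoryItem shapes used above
type HistoryItem =
    | {type: "system", text: string}
    | {type: "user", text: string}
    | {type: "model", response: string[]};

function toChatHistory(messages: OpenAiMessage[]): HistoryItem[] {
    return messages.map((message): HistoryItem => {
        // "assistant" maps to "model", with the text wrapped in an array
        if (message.role === "assistant")
            return {type: "model", response: [message.content]};

        // "system" and "user" keep their role and use a plain text field
        return {type: message.role, text: message.content};
    });
}
```

Pass everything except the final user message through `setChatHistory(...)`, and send that final message via `.prompt(...)`.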