
How to get Multi Turn responses? #455

Closed · Answered by giladgd
SpicyMelonYT asked this question in Q&A

You can set the current chat history of a LlamaChatSession and then call the .prompt(...) function with the new prompt:

import path from "path";
import {fileURLToPath} from "url";
import {getLlama, LlamaChatSession, ChatHistoryItem} from "node-llama-cpp";

// __dirname is not available in ES modules, so derive it from import.meta.url
const __dirname = path.dirname(fileURLToPath(import.meta.url));

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "Meta-Llama-3.1-8B-Instruct.Q4_K_M.gguf")
});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

const chatHistory: ChatHistoryItem[] = [
    {
        type: "system",
        text: "You are a helpful assistant."
    },
    {
        type: "user",
        text: "Hello, how are y…
