Langchain js ConversationalRetrievalQAChain and prompt


I'm using langchain.js and experimenting with ConversationalRetrievalQAChain and a custom prompt.

I'm trying to add a variable called `lang` to my prompt and set its value during the call, but I always get this error:

throw new Error(`Missing value for input ${node.name}`);

I can't figure out why. If I replace `{lang}` with a hard-coded language name like 'Spanish', it works fine.
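The error comes from prompt-template substitution: the template parser extracts every `{variable}` and requires a value for each one at format time. A dependency-free sketch of that mechanism (`formatTemplate` is a hypothetical stand-in for what langchain.js does internally, not its actual API):

```javascript
// Minimal sketch of why the error occurs: every {variable} found in the
// template must have a matching value, otherwise an error is thrown.
function formatTemplate(template, values) {
  return template.replace(/\{(\w+)\}/g, (_, name) => {
    if (!(name in values)) {
      // langchain.js throws a similar "Missing value for input" error
      throw new Error(`Missing value for input ${name}`);
    }
    return values[name];
  });
}

// Every variable supplied: substitution succeeds.
console.log(formatTemplate("Answer in {lang}.", { lang: "Italian" }));

// {lang} appears in the template but not in the call: throws.
try {
  formatTemplate("Answer in {lang}.", {});
} catch (err) {
  console.log(err.message); // "Missing value for input lang"
}
```

Because `qaTemplate` builds its prompt expecting only `{context}` and `{question}`, an extra `{lang}` in the template string has no bound value, hence the error.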



const QA_PROMPT = `You are an Assistant that speaks and writes only in {lang}. Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say you don't know. DO NOT try to make up an answer.
If the question is not related to the context, politely respond that you are tuned to only answer questions that are related to the context.

{context}

Question: {question}
Helpful answer written in {lang}:`;

const chain = await ConversationalRetrievalQAChain.fromLLM(
    model,
    vectorStore.asRetriever(),
    // we should also specify k here
    {
        qaTemplate: QA_PROMPT,
        returnSourceDocuments: true,
    },
);

let res = "";
const stream = await chain.call(
    {
        question: "Chi erano i partecipanti al verbale?",
        chat_history: [],
        lang: "Italian", // this is where we'd like to specify the language
    },
    [{
        handleLLMNewToken(token) {
            res += token;
            console.clear();
            console.log(res);
        },
    }],
);

I'm expecting to be able to pass a custom language as an input to the prompt.

I tried passing a prompt object or finding another way to set the prompt for ConversationalRetrievalQAChain, but couldn't, since qaTemplate only accepts a string.

There is 1 answer

Answer by Nicolas Alexandre

Change your code to:

const chain = await ConversationalRetrievalQAChain.fromLLM(
    model,
    vectorStore.asRetriever(),
    // we should also specify k here
    {
        qaChainOptions: {
            type: 'stuff',
            prompt: PromptTemplate.fromTemplate(QA_PROMPT),
        },
        returnSourceDocuments: true,
    },
);

I'm a junior, so I can't explain why it works, but I found the answer by looking at the langchain tests: https://github.com/langchain-ai/langchainjs/blob/main/langchain/src/chains/tests/conversational_retrieval_chain.int.test.ts
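If you also want `lang` bound before the chain runs rather than baked into the template string, one option worth checking against your installed langchain.js version (this is an assumption, not confirmed by the answer above) is `PromptTemplate`'s `partialVariables`, which pre-fills a variable so the chain only has to supply `{context}` and `{question}`. The underlying pattern is just partial application over the template; a dependency-free sketch with a hypothetical `makePrompt` helper:

```javascript
// Pre-bind some template variables ("partials") so later calls only need
// to supply the rest -- the same idea as PromptTemplate's partialVariables.
function makePrompt(template, partials = {}) {
  return (values) =>
    template.replace(/\{(\w+)\}/g, (_, name) => {
      const all = { ...partials, ...values };
      if (!(name in all)) throw new Error(`Missing value for input ${name}`);
      return all[name];
    });
}

// lang is fixed once; question varies per call.
const prompt = makePrompt("Answer in {lang}: {question}", { lang: "Italian" });
console.log(prompt({ question: "Chi erano i partecipanti?" }));
// → "Answer in Italian: Chi erano i partecipanti?"
```

This keeps the template reusable: build one prompt per language, then call it with only the per-request inputs.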