I'm new to LangChain and I made a chatbot with Next.js (so the JavaScript library) that answers questions about a CSV with soccer stats. Specific questions, for example "How many goals did Haaland score?", are answered correctly, since the chain retrieves the info about Haaland from the CSV (I'm embedding the CSV and storing the vectors in Pinecone).
The problem starts when I ask general questions, meaning questions without obvious keywords, for example "Who made the most assists?" or something extreme like "How many rows are there in the CSV?". These fail completely. My guess is that the chain only retrieves the chunks most similar to the query from the vector DB, so it never sees enough of the data to answer aggregate questions like these.
I'm using ConversationalRetrievalQAChain from LangChain.
chain.ts
// import paths may differ depending on the LangChain JS version
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { PineconeStore } from "langchain/vectorstores/pinecone";
import { ConversationalRetrievalQAChain } from "langchain/chains";

/* create vectorstore from the existing Pinecone index */
const vectorStore = await PineconeStore.fromExistingIndex(
  new OpenAIEmbeddings({}),
  {
    pineconeIndex,
    textKey: "text",
  }
);

// build the chain on top of the vector store's retriever
return ConversationalRetrievalQAChain.fromLLM(
  model,
  vectorStore.asRetriever(),
  { returnSourceDocuments: true }
);
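To illustrate my guess, here is a minimal sketch (reusing the vectorStore above) of what the retriever actually hands to the model for a general question; getRelevantDocuments is the method in my LangChain version, newer versions may expose invoke() instead:

// sketch: inspect which chunks come back for a "general" question
const retriever = vectorStore.asRetriever(); // default k (usually 4) documents
const docs = await retriever.getRelevantDocuments("who made the most assists?");
console.log(docs.length);                    // only a handful of chunks, not the whole CSV
console.log(docs.map((d) => d.pageContent)); // rows picked purely by similarity to the query

So for something like "how many rows are there in the CSV?" the model only ever sees those few chunks, which would explain why it fails.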
And I'm calling it from my Next.js API route.
route.ts
// build chat_history by joining the previous message contents
const res = await chain.call({
  question: question,
  chat_history: history
    .map((h) => h.content) // the arrow must return h.content, not a block with no return
    .join("\n"),
});
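For reference, this is roughly the shape of the history array I pass in (the role/content fields are just my own structure), and what the joined chat_history string ends up looking like:

// hypothetical shape of the history coming from the client
const history = [
  { role: "user", content: "How many goals did Haaland score?" },
  { role: "assistant", content: "Haaland scored X goals last season." },
];

// after .map((h) => h.content).join("\n") chat_history becomes:
// "How many goals did Haaland score?\nHaaland scored X goals last season."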
Any suggestions are welcome and appreciated. Also, feel free to ask any questions. Thanks in advance!