I'm new to LangChain, OpenAI and Python (the current trifecta). Inside a Flask route I'm using:
from flask import jsonify
from langchain.agents import create_csv_agent
from langchain.llms import OpenAI

agent = create_csv_agent(
    OpenAI(temperature=0, max_tokens=500), file_path, verbose=True)
prompt = query  # e.g. "Which product line had the lowest average price"
if prompt is None or prompt == "":
    return jsonify({"error": "No user question provided"}), 400
response = agent.run(prompt)
But every time a new user comes to the site and asks a question about the same CSV file, OpenAI charges again because the file's contents get sent along with the request. I want to store a fairly large CSV file once, in the same kind of dataframe the agent already uses. I like the results I'm getting, so I want to stay as close to this setup as possible. create_csv_agent takes a file_path; is there a way it could take a faiss.index file instead?
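Here is roughly what I was imagining, based on the LangChain FAISS examples (untested sketch; "faiss_index" is just a folder name I picked), but I don't know whether something like this can be wired back into create_csv_agent:

from langchain.chains import RetrievalQA
from langchain.document_loaders import CSVLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.llms import OpenAI
from langchain.vectorstores import FAISS

embeddings = OpenAIEmbeddings()

# One-time step: embed the CSV rows and save the FAISS index to disk
docs = CSVLoader(file_path=file_path).load()
db = FAISS.from_documents(docs, embeddings)
db.save_local("faiss_index")

# Per-request step: load the saved index instead of re-sending the CSV
db = FAISS.load_local("faiss_index", embeddings)
qa = RetrievalQA.from_chain_type(
    llm=OpenAI(temperature=0, max_tokens=500),
    chain_type="stuff",
    retriever=db.as_retriever())
response = qa.run(prompt)

My worry is that the csv agent runs pandas code over the whole dataframe, while this only retrieves the rows that match the question, so an aggregate question like the average-price one might not come out the same.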