Is there a way to use llama-index only on the indexing side?


When I create a VectorStoreIndex object, why does it require ChatGPT or some other LLM that will only be used at query time?

Is there a way to use LlamaIndex only on the indexing side? Is there no clean separation between the indexing and query pipelines in llama-index? A minimal sketch of what I'm trying is below.
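For context, this is roughly what I have in mind. It's a sketch based on my assumptions, not something I've confirmed works: I'm assuming that setting `Settings.llm = None` skips the default OpenAI LLM, and that pure retrieval via `as_retriever()` needs no LLM at all. The `data/` path and the query string are placeholders.

```python
# Sketch: build a VectorStoreIndex and retrieve nodes without any LLM.
# Assumptions: llama-index >= 0.10 style imports; an embedding model is still
# required to build the vectors (the default is OpenAI embeddings, but a local
# embedding model could presumably be plugged into Settings.embed_model).
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex

# Assumption: explicitly disabling the LLM so only embeddings are used.
Settings.llm = None

# Indexing side only: load documents and build the vector index.
documents = SimpleDirectoryReader("data").load_data()  # "data/" is a placeholder
index = VectorStoreIndex.from_documents(documents)

# Query side as pure retrieval, no LLM synthesis step.
retriever = index.as_retriever(similarity_top_k=3)
nodes = retriever.retrieve("example query")  # placeholder query
for node_with_score in nodes:
    print(node_with_score.score, node_with_score.node.get_content()[:80])
```

If this is the intended way to separate the two sides, is there documentation that spells it out?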
