We are building a very small index (roughly 4k docs in the vector store) and we are comparing a local FAISS implementation against the Vertex AI Vector Search solution provided by Google.
While the FAISS index takes about 5 minutes to build (including creating the embeddings), it takes almost a full hour for the Vector Search index to be created and deployed. Why is that? And is there any way to speed up the process?
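For context, here is a rough sketch of the two paths we are comparing. This is simplified and not our exact code: the embedding dimension, project/bucket names, display names, and the random placeholder embeddings are illustrative assumptions only.

```python
import numpy as np
import faiss
from google.cloud import aiplatform

DIM = 768       # embedding dimension (assumption)
N_DOCS = 4000   # roughly 4k documents

# --- Local FAISS: builds in ~5 minutes, most of it embedding time ---
embeddings = np.random.rand(N_DOCS, DIM).astype("float32")  # placeholder for real embeddings

faiss_index = faiss.IndexFlatL2(DIM)   # exact, brute-force index
faiss_index.add(embeddings)            # near-instant for 4k vectors
distances, ids = faiss_index.search(embeddings[:5], 10)  # sanity-check query

# --- Vertex AI Vector Search: same data, ~1 hour to create + deploy ---
aiplatform.init(project="my-project", location="us-central1")  # placeholder project/region

index = aiplatform.MatchingEngineIndex.create_tree_ah_index(
    display_name="docs-index",
    contents_delta_uri="gs://my-bucket/embeddings/",  # JSONL embeddings uploaded beforehand
    dimensions=DIM,
    approximate_neighbors_count=10,
)  # index creation alone takes a large share of the hour

endpoint = aiplatform.MatchingEngineIndexEndpoint.create(
    display_name="docs-endpoint",
    public_endpoint_enabled=True,
)
endpoint.deploy_index(index=index, deployed_index_id="docs_deployed")  # deployment adds the rest
```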
My fear is that the product might not be suited for our use case: the computational overhead makes sense if you have billions of docs, not 4k. Is that correct? If so, can you recommend other solutions that are faster and built for smaller indexes?