I am seeing a "cannot import name 'Vector' from azure.search.documents.models" error when I invoke my chain. The error originates from line 434 of langchain/vectorstores/azuresearch.py (from azure.search.documents.models import Vector).

This is the relevant code snippet; the import error is raised when I execute rag_chain.invoke(question):

from langchain.schema.runnable import RunnablePassthrough
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models.azure_openai import AzureChatOpenAI

question = "my question.."

# vector_store is initialized using AzureSearch(), not including that snippet here
retriever = vector_store.as_retriever()

template = ''' 
Answer the question based on the following context: 
{context}

Question: {question} 
'''

prompt = ChatPromptTemplate.from_template(template=template)

llm = AzureChatOpenAI(
    deployment_name='MY_DEPLOYMENT_NAME',
    model_name='MY_MODEL',
    openai_api_base=MY_AZURE_OPENAI_ENDPOINT,
    openai_api_key=MY_AZURE_OPENAI_KEY,
    openai_api_version='2023-05-15',
    openai_api_type='azure',
)

rag_chain = {'context': retriever, 'question': RunnablePassthrough()} | prompt | llm
rag_chain.invoke(question)

my package versions

  • langchain==0.0.331
  • azure-search-documents==11.4.0b11
  • azure-core==1.29.5
  • openai==0.28.1
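
The failure also reproduces outside the chain. Below is a minimal check with these versions installed; it is just the same import that azuresearch.py performs at line 434, wrapped so the error message is printed:

# Minimal repro, independent of LangChain: the same import that
# langchain/vectorstores/azuresearch.py line 434 executes.
try:
    from azure.search.documents.models import Vector  # noqa: F401
except ImportError as exc:
    # With azure-search-documents==11.4.0b11 installed, this prints the same
    # "cannot import name 'Vector'" message seen when invoking the chain.
    print(exc)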

1 Answer

Venkatesan (best answer):

"cannot import name 'Vector' from azure.search.documents.models" error when I invoke my chain. Origin of my error is line 434 in lanchain/vectorstores/azuresearch.py (from azure.search.documents.models import Vector)

According to this document, you need to install azure-search-documents==11.4.0b8 to use the Azure Search vector store with LangChain.
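
After downgrading (pip install azure-search-documents==11.4.0b8), the quick sanity check below, a sketch rather than anything required, confirms which build is active and that the Vector class langchain 0.0.331 expects can be imported again:

# Sanity-check sketch: report the installed azure-search-documents build and
# verify that the legacy Vector class (used by langchain's AzureSearch) resolves.
from importlib.metadata import version

print(version("azure-search-documents"))  # expected to show 11.4.0b8 after the downgrade

try:
    from azure.search.documents.models import Vector  # noqa: F401
    print("Vector imports fine; AzureSearch should work")
except ImportError as exc:
    print(f"Still on an incompatible build: {exc}")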

You can now use the code below, which I tested in my environment:

Code:

from langchain.prompts import ChatPromptTemplate
from langchain.chat_models.azure_openai import AzureChatOpenAI
from langchain.vectorstores.azuresearch import AzureSearch
from langchain.embeddings import OpenAIEmbeddings
from langchain.schema import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough
import os


model = "xxxxx"
chunk_size = 1

os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "xxxx"
os.environ["OPENAI_API_KEY"] = "xxxx"
os.environ["OPENAI_API_VERSION"] = "2023-05-15"

MY_AZURE_OPENAI_ENDPOINT="xxxx"
OPENAIKEY="xxxxx"
vector_store_address = "xxxx"
vector_store_password = "xxxxx"
index_name = "sample-index"
embeddings = OpenAIEmbeddings(deployment=model, chunk_size=chunk_size)
vector_store = AzureSearch(
    azure_search_endpoint=vector_store_address,
    azure_search_key=vector_store_password,
    index_name=index_name,
    embedding_function=embeddings.embed_query,
)

retriever = vector_store.as_retriever()


template = """Answer the question based only on the following context:

{context}

Question: {question}
"""
prompt = ChatPromptTemplate.from_template(template)

llm = AzureChatOpenAI(
    deployment_name='gpt-35-turbo',
    openai_api_base=MY_AZURE_OPENAI_ENDPOINT,
    openai_api_key=OPENAIKEY,
    openai_api_version='2023-05-15',
    openai_api_type='azure',
)

# Join the retrieved documents into a single context string for the prompt
def format_docs(docs):
    return "\n\n".join([d.page_content for d in docs])


chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)
print(chain.invoke("What did the president say about technology?"))

Output:

> The president mentioned the importance of investing in emerging
> technologies and American manufacturing to compete with China and
> other competitors. He also mentioned the role of computer chips in
> powering everyday technology and the potential for Intel to increase
> its investment in manufacturing from $20 billion to $100 billion.


Reference: langchain_expressions.ipynb in the Coding-Crashkurse/Langchain-Full-Course repository on GitHub