Error using Langchain AzureMLOnlineEndpoint chain.run


I am trying the sample implementation ("Dolly with LLMChain") from: https://python.langchain.com/docs/integrations/llms/azureml_endpoint_example

I also added deployment_name="<DOLLY_ENDPOINT_URL_deployment_name>" to the constructor:

from langchain import PromptTemplate
from langchain.chains import LLMChain
# AzureMLOnlineEndpoint must be imported as well; the snippet fails with a
# NameError without it.
from langchain.llms.azureml_endpoint import (
    AzureMLOnlineEndpoint,
    DollyContentFormatter,
)

formatter_template = "Write a {word_count} word essay about {topic}."

prompt = PromptTemplate(
    input_variables=["word_count", "topic"], template=formatter_template
)

content_formatter = DollyContentFormatter()

llm = AzureMLOnlineEndpoint(
    endpoint_api_key="<DOLLY_ENDPOINT_API_KEY>",
    endpoint_url="<DOLLY_ENDPOINT_URL>",
    deployment_name="<DOLLY_ENDPOINT_URL_deployment_name>",
    model_kwargs={"temperature": 0.0, "max_tokens": 300},
    content_formatter=content_formatter,
)

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run({"word_count": 100, "topic": "how to make friends"}))
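The "str type expected" error below means the content formatter handed LangChain something that is not a plain string (likely the parsed JSON payload itself). A possible workaround is a custom formatter whose response handling explicitly extracts the generated text as a `str`. This is only a sketch of that parsing step: the response shapes (`[{"0": "..."}]` or `{"0": "..."}`) are assumptions based on what Dolly endpoints commonly return, so the keys must be adjusted to the actual payload. In LangChain, this logic would live in the `format_response_payload` method of a `ContentFormatterBase` subclass.

```python
import json


def parse_dolly_response(response_bytes: bytes) -> str:
    """Parse a raw endpoint response into a plain string.

    Assumes the endpoint returns JSON shaped like [{"0": "<text>"}] or
    {"0": "<text>"}; adjust the keys to match your deployment's payload.
    """
    data = json.loads(response_bytes)
    if isinstance(data, list):
        # Unwrap a single-element list response.
        data = data[0]
    if isinstance(data, dict):
        # Take the first value and force it to str so downstream
        # pydantic validation (Generation.text) accepts it.
        return str(next(iter(data.values())))
    return str(data)
```

Returning a `str` from the formatter (rather than the decoded JSON object) is what keeps the pydantic `Generation` model happy.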

No matter what I do, I always get this error:

---------------------------------------------------------------------------
ValidationError                           Traceback (most recent call last)
Cell In[109], line 22
     13 llm = AzureMLOnlineEndpoint(
     14     endpoint_api_key="<API key>",
     15     endpoint_url="<enpoint>",
   (...)
     18     content_formatter=content_formatter,
     19 )
     21 chain = LLMChain(llm=llm, prompt=prompt)
---> 22 print(chain.run({"word_count": 100, "topic": "how to make friends"}))

...

File ~\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\langchain\llms\base.py:606, in LLM._generate(self, prompts, stop, run_manager, **kwargs)
    600 for prompt in prompts:
    601     text = (
    602         self._call(prompt, stop=stop, run_manager=run_manager, **kwargs)
    603         if new_arg_supported
    604         else self._call(prompt, stop=stop, **kwargs)
    605     )
--> 606     generations.append([Generation(text=text)])
    607 return LLMResult(generations=generations)

File ~\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\langchain\load\serializable.py:74, in Serializable.__init__(self, **kwargs)
     73 def __init__(self, **kwargs: Any) -> None:
---> 74     super().__init__(**kwargs)
     75     self._lc_kwargs = kwargs

File ~\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\pydantic\main.py:341, in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for Generation
text
  str type expected (type=type_error.str)

The endpoint itself works with both:

https://github.com/Azure/azureml-examples/blob/main/sdk/python/foundation-models/system/inference/text-generation/text-generation-online-endpoint-dolly.ipynb (section 5, "Test the endpoint with sample data")

OR

a raw urllib.request.Request(url, body, headers) call

So, the endpoint is actually accessible.
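For reference, the raw request described above can be built like this. The payload shape (`{"input_data": {"input_string": [...]}}`) is an assumption taken from the Azure ML text-generation sample notebook and may need adjusting for a specific deployment:

```python
import json
import urllib.request


def build_dolly_request(url: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build a raw scoring request against an Azure ML online endpoint.

    The body shape follows the text-generation sample notebook; adjust
    it if your deployment expects a different schema.
    """
    body = json.dumps({"input_data": {"input_string": [prompt]}}).encode("utf-8")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    return urllib.request.Request(url, body, headers)


# Usage (performs a real network call, so it is commented out here):
# resp = urllib.request.urlopen(build_dolly_request(endpoint_url, key, "Hello"))
# print(resp.read())
```

If this raw call succeeds while the LangChain path fails, the problem is in the response formatting, not in the endpoint or credentials.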

I would really appreciate it if someone could guide me to a fix. I need the LangChain solution to work for embeddings.


There are 0 answers