How to convert OpenAI functions to a PromptTemplate in LangChain when using local LLMs?


Is it possible to convert OpenAI function definitions into a PromptTemplate in LangChain when using local LLMs, and have the final output returned in the same format as the OpenAI API function-calling response?

import langchain
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "The city and state, e.g. San Francisco, CA",
                },
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    }
]
llm = ...  # instantiate a local LLM here, e.g. LlamaCpp or HuggingFacePipeline

# Pass the function definitions into a PromptTemplate in LangChain

# Format the prompt so the model answers in the OpenAI function-call style
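One possible approach (just a rough sketch, assuming the local model can follow JSON instructions, and that llm is any LangChain LLM wrapper such as LlamaCpp or HuggingFacePipeline) is to serialize the function schemas into the prompt text and then parse the model's reply back into the OpenAI-style {"name": ..., "arguments": ...} shape:

import json

from langchain.prompts import PromptTemplate

# Sketch only: the template text, variable names, and parsing below are
# illustrative, not an official LangChain feature.
FUNCTION_CALL_TEMPLATE = """You have access to the following functions:

{functions}

Answer the user's request by replying ONLY with a JSON object of the form
{{"name": "<function name>", "arguments": {{...}}}} matching one of the schemas above.

User request: {input}
"""

prompt = PromptTemplate(
    template=FUNCTION_CALL_TEMPLATE,
    input_variables=["functions", "input"],
)

prompt_text = prompt.format(
    functions=json.dumps(functions, indent=2),
    input="What is the weather like in Boston?",
)

# llm is assumed to be a local LangChain LLM wrapper (e.g. LlamaCpp, GPT4All,
# or HuggingFacePipeline); calling it with the prompt returns the raw completion.
raw_output = llm(prompt_text)

# Parse the reply into the same shape the OpenAI API puts under "function_call":
# {"name": ..., "arguments": {...}}.
function_call = json.loads(raw_output)
print(function_call["name"], function_call["arguments"])

Local models do not always emit clean JSON, so in practice the json.loads step usually needs retries, a stricter output parser, or a grammar-constrained backend before the result is as reliable as the OpenAI function-calling API.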


