How to provide system message variables and human message variables to an LLMChain (LangChain with Python)?


I have a problem sending system message variables and human message variables to a prompt through an LLMChain.

I have the following code:

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)

prompt = ChatPromptTemplate.from_messages(
    [
        SystemMessagePromptTemplate.from_template(
            "You are a {role} having a conversation with a human. Talk as a {role}."
        ), 
        MessagesPlaceholder(
            variable_name="chat_history"
        ),  
        HumanMessagePromptTemplate.from_template(
            "{human_input}"
        ),  
    ]
)

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

llm = ChatOpenAI()

chat_llm_chain = LLMChain(
    llm=llm,
    prompt=prompt,
    verbose=True,
    memory=memory,
)

chat_llm_chain.predict(human_input="Hi there my friend", role='dog')

This gives me the following error:

raise ValueError(f"One input key expected got {prompt_input_keys}")
ValueError: One input key expected got ['human_input', 'role']

How can I send the role variable? Is there any better way to handle this?

Thanks


1 Answer

Answered by lif cc:

If your chain has multiple inputs, you should tell the memory which one is the input key, like this:

memory = ConversationBufferMemory(
    memory_key="chat_history",
    return_messages=True,
    input_key="human_input",  # tells the memory which input to save
)

Note that an output_key is not necessary here; it only matters when the chain returns more than one output.
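With that single change, the chain from the question runs unchanged (a minimal sketch; prompt, llm, and the chain are exactly as defined in the question):

chat_llm_chain = LLMChain(
    llm=llm,
    prompt=prompt,
    verbose=True,
    memory=memory,
)

# Both variables can be passed now; only human_input is written to chat_history.
chat_llm_chain.predict(human_input="Hi there my friend", role="dog")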

For more information, see the LangChain documentation on memory.

Alternatively, you can remove the memory entirely: the error is raised when the conversation turn finishes and the chain tries to save the chat history to memory, and with more than one input it cannot tell which value is the human input to record.
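Another way to avoid the multiple-input problem is to bind role into the prompt ahead of time with .partial(), so the chain has a single runtime input and the default memory behaviour works without an input_key. A minimal sketch using the same classes as above:

# Bind 'role' at prompt-construction time; the chain is left with a
# single runtime input ('human_input'), so the memory no longer needs
# an explicit input_key.
dog_prompt = prompt.partial(role="dog")

chat_llm_chain = LLMChain(
    llm=ChatOpenAI(),
    prompt=dog_prompt,
    verbose=True,
    memory=ConversationBufferMemory(
        memory_key="chat_history", return_messages=True
    ),
)

chat_llm_chain.predict(human_input="Hi there my friend")

The trade-off is that the role is fixed for the lifetime of the chain rather than chosen per call.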