How to Access Model Output's Logits in LLMChain?


I am working on a prediction project using Llama 2 with LangChain. I am interested in obtaining the logits from the model's output for further analysis. Is there a way to access these logits directly from LLMChain?

I implemented the model without LangChain and obtained the logits directly from the underlying model using the following code:

import torch

# Forward pass on the raw Hugging Face model
outputs = model(input_ids)
logits = outputs.logits.detach().cpu()
# Log-probabilities over the vocabulary
log_probs = torch.log_softmax(logits.float(), dim=-1)

However, I now want to access the logit values for output produced through LangChain:

prompt = PromptTemplate(template=template, input_variables=["text"])
llm_chain = LLMChain(prompt=prompt, llm=llm)
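For reference, a possible workaround would be to bypass the chain for the forward pass: since LLMChain only returns generated text, one could reproduce the prompt formatting outside the chain and call the underlying model directly. This is only a sketch; `model` and `tokenizer` are assumed to be the already-loaded Llama 2 objects, and the template string below is a placeholder, not my real template.

```python
# Sketch only: `model` and `tokenizer` are assumed to be the Llama 2
# objects already loaded via transformers; the template is a placeholder.

def log_probs_for_prompt(model, tokenizer, prompt_text):
    """One forward pass on the raw model, returning token log-probabilities."""
    import torch  # imported here so the sketch stands alone
    input_ids = tokenizer(prompt_text, return_tensors="pt").input_ids
    with torch.no_grad():
        outputs = model(input_ids)  # same call as in the non-LangChain code
    logits = outputs.logits.detach().cpu()
    return torch.log_softmax(logits.float(), dim=-1)

# LLMChain fills the PromptTemplate before calling the model; the same
# string can be built by hand, so the forward pass can skip the chain:
template = "Summarize the following text: {text}"  # placeholder template
prompt_text = template.format(text="LangChain hides the raw model outputs.")
# log_probs = log_probs_for_prompt(model, tokenizer, prompt_text)
```

This keeps the chain for generation while recovering logits from a separate, manual forward pass on the same formatted prompt.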

Any guidance or references would be greatly appreciated.

