A TypeError when building a LangChain application
2024-01-07 17:43:47
While working through some of the examples on the LangChain website, the code raised an error when actually run. The example code is as follows:
from langchain.prompts import ChatPromptTemplate
from langchain_community.chat_models import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser
prompt = ChatPromptTemplate.from_template("tell me a short joke about {topic}")
model = ChatOpenAI()
output_parser = StrOutputParser()
chain = prompt | model | output_parser
chain.invoke({"topic": "ice cream"})
Error 1: TypeError: Expected a Runnable, callable or dict.Instead got an unsupported type: <class 'langchain_core.output_parsers.string.StrOutputParser'>
The full traceback is:
Traceback (most recent call last):
File "~/PycharmProjects/LangChain/main.py", line 14, in <module>
chain = prompt | model | output_parser
File "~/opt/anaconda3/envs/langchain/lib/python3.8/site-packages/langchain/schema/runnable/base.py", line 1165, in __or__
last=coerce_to_runnable(other),
File "~/opt/anaconda3/envs/langchain/lib/python3.8/site-packages/langchain/schema/runnable/base.py", line 2774, in coerce_to_runnable
raise TypeError(
TypeError: Expected a Runnable, callable or dict.Instead got an unsupported type: <class 'langchain_core.output_parsers.string.StrOutputParser'>
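Before changing any code, it is worth confirming which LangChain packages are actually installed, since this kind of coercion failure typically comes from mixing an older langchain release with the newer langchain-core / langchain-community packages. A minimal check (a sketch; the names below are the standard PyPI package names):
# Print the installed versions of the relevant LangChain packages (sketch).
from importlib.metadata import version, PackageNotFoundError

for pkg in ("langchain", "langchain-core", "langchain-community"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")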
This error occurs because the chain does not accept the output parser: as the traceback shows, coerce_to_runnable in the installed langchain release does not recognize the StrOutputParser class coming from langchain_core, so StrOutputParser cannot be used here. Instead, we can implement our own parser based on langchain.schema.BaseOutputParser, like this:
class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        """Parse the output of an LLM call."""
        # Note: this simplified version only strips whitespace; it does not split on commas.
        return text.strip()
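(For reference, the parser of this name in the LangChain getting-started docs actually splits the text into a list; if that behaviour is wanted, parse could look like the sketch below, which splits on ", ".)
# Variant (sketch): return an actual comma-separated list, matching what the
# class name and docstring suggest.
from langchain.schema import BaseOutputParser

class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        """Split the LLM output on ", " into a list of strings."""
        return text.strip().split(", ")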
The updated code is as follows:
import os
from langchain.prompts import ChatPromptTemplate
from langchain_community.chat_models import ChatOpenAI
from langchain.schema import BaseOutputParser
os.environ["OPENAI_API_KEY"] = "xxx"
class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        """Parse the output of an LLM call."""
        return text.strip()
prompt = ChatPromptTemplate.from_template("tell me a short joke about {topic}")
model = ChatOpenAI()
output_parser = CommaSeparatedListOutputParser()
chain = prompt | model | output_parser
chain.invoke({"topic": "ice cream"})
Running this still fails, this time with a second error.
Error 2: TypeError: Got unknown type ('messages', [HumanMessage(content='tell me a short joke about ice cream')])
The cause of this error is the import used for ChatOpenAI. The original import was from langchain_community.chat_models import ChatOpenAI; changing it to from langchain.chat_models import ChatOpenAI keeps the chat model in the same langchain package as the prompt template and the runnable machinery, so the message types it receives are ones it understands. With these two changes in place, the code runs correctly.
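As a quick sanity check that the two import paths really resolve to different classes on this kind of mixed installation, you can compare the modules they come from (a sketch; the printed module paths depend on the installed versions):
# Sketch: compare where the two ChatOpenAI imports actually come from.
from langchain.chat_models import ChatOpenAI as LangchainChatOpenAI
from langchain_community.chat_models import ChatOpenAI as CommunityChatOpenAI

print(LangchainChatOpenAI.__module__)    # e.g. langchain.chat_models.openai
print(CommunityChatOpenAI.__module__)    # e.g. langchain_community.chat_models.openai
print(LangchainChatOpenAI is CommunityChatOpenAI)  # False on a mixed install like this one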
The complete working code:
import os
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langchain.schema import BaseOutputParser
os.environ["OPENAI_API_KEY"] = "xxx"
class CommaSeparatedListOutputParser(BaseOutputParser):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str):
        """Parse the output of an LLM call."""
        return text.strip()
prompt = ChatPromptTemplate.from_template("tell me a short joke about {topic}")
model = ChatOpenAI()
output_parser = CommaSeparatedListOutputParser()
chain = prompt | model | output_parser
res = chain.invoke({"topic": "ice cream"})
print(res)
Output:
Why did the ice cream go to therapy?
Because it had too many toppings and couldn't keep its sprinkles together!
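As an aside, on a consistent, newer LangChain installation (0.1 or later, with the langchain-openai partner package installed), the original example from the docs should run unmodified. A minimal sketch of that setup, assuming OPENAI_API_KEY is set in the environment:
# Sketch for a LangChain >= 0.1 environment (assumes pip install langchain langchain-openai).
# ChatOpenAI then comes from langchain_openai, and the stock StrOutputParser
# from langchain_core works directly in the pipe.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("tell me a short joke about {topic}")
model = ChatOpenAI()
output_parser = StrOutputParser()

chain = prompt | model | output_parser
print(chain.invoke({"topic": "ice cream"}))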
Source: https://blog.csdn.net/weixin_43495948/article/details/135388077