
Natural Language Processing from Beginner to Application: LangChain Chains [General Functionality: Saving (Serializing) and Loading (Deserializing) Chains]

Category: master table of contents of the Natural Language Processing from Beginner to Application series


This article describes how to save (serialize) chains to disk and load (deserialize) them from disk. The serialization formats used are json and yaml. At the moment only some chains support this kind of serialization; the number of supported chains will grow over time.

Saving (Serializing) a Chain to Disk

First, we can save a chain to disk with the .save method, passing a file path with a json or yaml extension.

from langchain import PromptTemplate, OpenAI, LLMChain

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])
llm_chain = LLMChain(prompt=prompt, llm=OpenAI(temperature=0), verbose=True)
llm_chain.save("llm_chain.json")

Now let's look at the contents of the saved file:

!cat llm_chain.json

Output:

{"memory": null,"verbose": true,"prompt": {"input_variables": ["question"],"output_parser": null,"template": "Question: {question}\n\nAnswer: Let's think step by step.","template_format": "f-string"},"llm": {"model_name": "text-davinci-003","temperature": 0.0,"max_tokens": 256,"top_p": 1,"frequency_penalty": 0,"presence_penalty": 0,"n": 1,"best_of": 1,"request_timeout": null,"logit_bias": {},"_type": "openai"},"output_key": "text","_type": "llm_chain"
}
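Because .save dispatches on the file extension, the same chain can also be written out as yaml. A minimal sketch (the llm_chain.yaml filename is just an illustrative choice; load_chain below should accept a yaml path in the same way):

# Same configuration as above, serialized as YAML instead of JSON.
llm_chain.save("llm_chain.yaml")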

Loading (Deserializing) a Chain from Disk

We can use the load_chain method to load a chain from disk:

from langchain.chains import load_chain
chain = load_chain("llm_chain.json")
chain.run("whats 2 + 2")

Log output:

> Entering new LLMChain chain...
Prompt after formatting:
Question: whats 2 + 2

Answer: Let's think step by step.

> Finished chain.

Output:

' 2 + 2 = 4'
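As a quick sanity check, the reloaded chain should carry the same prompt template and model settings as the original. A minimal sketch, reusing the llm_chain and chain objects defined above:

# Compare a few fields of the original chain and the deserialized copy.
assert chain.prompt.template == llm_chain.prompt.template
assert chain.llm.temperature == llm_chain.llm.temperature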

Saving Components Separately

In the example above, the prompt and the LLM configuration are saved in the same json file as the rest of the chain, but we can also save them separately. This usually helps make the saved components more modular. To do so, we simply specify llm_path instead of the llm component and prompt_path instead of the prompt component.

llm_chain.prompt.save("prompt.json")

Input:

!cat prompt.json

Output:

{"input_variables": ["question"],"output_parser": null,"template": "Question: {question}\n\nAnswer: Let's think step by step.","template_format": "f-string"
}

Input:

llm_chain.llm.save("llm.json")

Input:

!cat llm.json

Output:

{"model_name": "text-davinci-003","temperature": 0.0,"max_tokens": 256,"top_p": 1,"frequency_penalty": 0,"presence_penalty": 0,"n": 1,"best_of": 1,"request_timeout": null,"logit_bias": {},"_type": "openai"
}

Input:

config = {"memory": None,"verbose": True,"prompt_path": "prompt.json","llm_path": "llm.json","output_key": "text","_type": "llm_chain"
}import jsonwith open("llm_chain_separate.json", "w") as f:json.dump(config, f, indent=2)

Input:

!cat llm_chain_separate.json

Output:

{"memory": null,"verbose": true,"prompt_path": "prompt.json","llm_path": "llm.json","output_key": "text","_type": "llm_chain"
}

We can load it in the same way:

chain = load_chain("llm_chain_separate.json")
chain.run("whats 2 + 2")

Log output:

> Entering new LLMChain chain...
Prompt after formatting:
Question: whats 2 + 2

Answer: Let's think step by step.

> Finished chain.

Output:

' 2 + 2 = 4'
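The separately saved files are ordinary prompt and LLM configurations, so they can also be loaded back individually. A minimal sketch, assuming the load_prompt and load_llm loaders that ship with the LangChain versions this series targets:

from langchain.prompts import load_prompt
from langchain.llms.loading import load_llm

# Load the components saved above without going through load_chain.
prompt = load_prompt("prompt.json")
llm = load_llm("llm.json")
print(prompt.template)
print(llm.model_name)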

Loading from LangChainHub

This section shows how to load a chain from LangChainHub.

from langchain.chains import load_chain

chain = load_chain("lc://chains/llm-math/chain.json")
chain.run("whats 2 raised to .12")

Log output:

> Entering new LLMMathChain chain...
whats 2 raised to .12
Answer: 1.0791812460476249
> Finished chain.

Output:

'Answer: 1.0791812460476249'

Sometimes a chain requires extra arguments that were not included when it was serialized. For example, a chain that does question answering over a vector database needs the vector database to be passed in as an argument.

from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.text_splitter import CharacterTextSplitter
from langchain import OpenAI, VectorDBQA
from langchain.document_loaders import TextLoader
loader = TextLoader('../../state_of_the_union.txt')
documents = loader.load()
text_splitter = CharacterTextSplitter(chunk_size=1000, chunk_overlap=0)
texts = text_splitter.split_documents(documents)
embeddings = OpenAIEmbeddings()
vectorstore = Chroma.from_documents(texts, embeddings)
# Running Chroma using direct local API.
# Using DuckDB in-memory for database. Data will be transient.
chain = load_chain("lc://chains/vector-db-qa/stuff/chain.json", vectorstore=vectorstore)
query = "What did the president say about Ketanji Brown Jackson"
chain.run(query)

Output:

" The president said that Ketanji Brown Jackson is a Circuit Court of Appeals Judge, one of the nation's top legal minds, a former top litigator in private practice, a former federal public defender, has received a broad range of support from the Fraternal Order of Police to former judges appointed by Democrats and Republicans, and will continue Justice Breyer's legacy of excellence."

