{"id":1358,"date":"2024-05-06T05:31:45","date_gmt":"2024-05-06T05:31:45","guid":{"rendered":"https:\/\/www.nicekj.com\/?p=1358"},"modified":"2024-05-06T05:32:01","modified_gmt":"2024-05-06T05:32:01","slug":"langchainxiliejiaochengshiyonglangchaindellmsjinxingduihuajiyi-2","status":"publish","type":"post","link":"https:\/\/www.nicekj.com\/langchainxiliejiaochengshiyonglangchaindellmsjinxingduihuajiyi-2.html","title":{"rendered":"LangChain\u7cfb\u5217\u6559\u7a0b\uff1a\u4f7f\u7528Langchain\u7684LLMs\u8fdb\u884c\u5bf9\u8bdd\u8bb0\u5fc6"},"content":{"rendered":"<p>\u5bf9\u8bdd\u8bb0\u5fc6\u662f\u6307\u804a\u5929\u673a\u5668\u4eba\u5982\u4f55\u4ee5\u5bf9\u8bdd\u65b9\u5f0f\u54cd\u5e94\u591a\u4e2a\u67e5\u8be2\u3002\u5b83\u4f7f\u5bf9\u8bdd\u8fde\u8d2f\uff0c\u5e76\u4e14\u5982\u679c\u6ca1\u6709\u5b83\uff0c\u6bcf\u4e2a\u67e5\u8be2\u90fd\u5c06\u88ab\u89c6\u4e3a\u5b8c\u5168\u72ec\u7acb\u7684\u8f93\u5165\uff0c\u800c\u4e0d\u8003\u8651\u8fc7\u53bb\u7684\u4ea4\u4e92\u3002<\/p>\n<p>\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/www.nicekj.com\/wp-content\/uploads\/replace\/6b4bff620934b6266cfd6b397268ee68.png\" alt=\"image.png\" \/><\/figure>\n<\/p>\n<p>\u8fd9\u79cd\u8bb0\u5fc6\u4f7f\u5f97\u5927\u578b\u8bed\u8a00\u6a21\u578b\uff08LLM\uff09\u80fd\u591f\u8bb0\u4f4f\u4e0e\u7528\u6237\u4e4b\u524d\u7684\u4e92\u52a8\u3002\u9ed8\u8ba4\u60c5\u51b5\u4e0b\uff0cLLMs\u662f\u65e0\u72b6\u6001\u7684\uff0c\u8fd9\u610f\u5473\u7740\u6bcf\u4e2a\u4f20\u5165\u7684\u67e5\u8be2\u90fd\u662f\u72ec\u7acb\u5904\u7406\u7684\uff0c\u4e0d\u8003\u8651\u5176\u4ed6\u4e92\u52a8\u3002\u5bf9\u4e8e\u4e00\u4e2a\u65e0\u72b6\u6001\u7684\u4ee3\u7406\uff0c\u552f\u4e00\u5b58\u5728\u7684\u662f\u5f53\u524d\u7684\u8f93\u5165\uff0c\u6ca1\u6709\u5176\u4ed6\u4fe1\u606f\u3002<\/p>\n<p>\u6709\u8bb8\u591a\u5e94\u7528\u7a0b\u5e8f\u5728\u5176\u4e2d\u8bb0\u4f4f\u4e4b\u524d\u7684\u4e92\u52a8\u975e\u5e38\u91cd\u8981\uff0c\u6bd4\u5982\u804a\u5929\u673a\u5668\u4eba\u3002\u5bf9\u8bdd\u8bb0\u5fc6\u5141\u8bb8\u6211\u4eec\u505a\u5230\u8fd9\u4e00\u70b9\u3002<\/p>\n<p>\u6709\u51e0\u79cd\u65b9\u6cd5\u53ef\u4ee5\u5b9e\u73b0\u5bf9\u8bdd\u8bb0\u5fc6\u3002\u5728<a href=\"https:\/\/link.juejin.cn?target=https%3A%2F%2Fchat.openai.com%2Flearn%2Flangchain-intro%2F\" target=\"_blank\" title=\"https:\/\/chat.openai.com\/learn\/langchain-intro\/\" ref=\"nofollow noopener noreferrer\" rel=\"noopener\">LangChain<\/a>\u7684\u80cc\u666f\u4e0b\uff0c\u5b83\u4eec\u90fd\u662f\u57fa\u4e8eConversationChain\u6784\u5efa\u7684\u3002<\/p>\n<h1 data-id=\"heading-0\">ConversationChain\uff08\u5bf9\u8bdd\u94fe\uff09<\/h1>\n<p>\u6211\u4eec\u53ef\u4ee5\u901a\u8fc7\u521d\u59cb\u5316ConversationChain\uff08\u5bf9\u8bdd\u94fe\uff09\u5f00\u59cb\u3002\u6211\u4eec\u5c06\u4f7f\u7528OpenAI\u7684text-davinci-003\u4f5c\u4e3aLLM\uff0c\u4f46\u4e5f\u53ef\u4ee5\u4f7f\u7528\u5176\u4ed6\u6a21\u578b\uff0c\u5982gpt-3.5-turbo\u3002<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">ini<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-ini code-block-extension-codeShowNum\" lang=\"ini\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">from langchain import OpenAI<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">from langchain.chains import ConversationChain<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\"><span class=\"hljs-comment\"># first initialize the large language model<\/span><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\"><span class=\"hljs-attr\">llm<\/span> = OpenAI(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\">\t<span class=\"hljs-attr\">temperature<\/span>=<span class=\"hljs-number\">0<\/span>,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"7\">\t<span class=\"hljs-attr\">openai_api_key<\/span>=<span class=\"hljs-string\">\"OPENAI_API_KEY\"<\/span>,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"8\">\t<span class=\"hljs-attr\">model_name<\/span>=<span class=\"hljs-string\">\"text-davinci-003\"<\/span><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"9\">)<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"10\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"11\"><span class=\"hljs-comment\"># now initialize the conversation chain<\/span><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"12\"><span class=\"hljs-attr\">conversation<\/span> = ConversationChain(llm=llm)<\/span>\n<\/code><\/pre>\n<p>\u6211\u4eec\u53ef\u4ee5\u8fd9\u6837\u67e5\u770bConversationChain\u4f7f\u7528\u7684\u63d0\u793a\u6a21\u677f\uff1a<\/p>\n<p>In[8]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">print<\/span>(conversation.prompt.template)<\/span>\n<\/code><\/pre>\n<p>Out[8]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">The following is <span class=\"hljs-selector-tag\">a<\/span> friendly conversation between <span class=\"hljs-selector-tag\">a<\/span> human and an AI. The AI is talkative and provides lots of specific <span class=\"hljs-selector-tag\">details<\/span> <span class=\"hljs-selector-tag\">from<\/span> its context. If the AI does not know the answer <span class=\"hljs-selector-tag\">to<\/span> <span class=\"hljs-selector-tag\">a<\/span> question, it truthfully says it does not know.<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">Current conversation:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">{history}<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">Human: {<span class=\"hljs-selector-tag\">input<\/span>}<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\">AI:<\/span>\n<\/code><\/pre>\n<p>\u5728\u8fd9\u91cc\uff0c\u63d0\u793a\u901a\u8fc7\u544a\u8bc9\u6a21\u578b\u4ee5\u4e0b\u5185\u5bb9\u662f\u4e00\u4e2a\u4eba\u7c7b\uff08\u6211\u4eec\uff09\u548c\u4e00\u4e2aAI\uff08text-davinci-003\uff09\u4e4b\u95f4\u7684\u5bf9\u8bdd\u6765\u5f15\u5bfc\u6a21\u578b\u3002\u63d0\u793a\u8bd5\u56fe\u901a\u8fc7\u9648\u8ff0\u4ee5\u4e0b\u5185\u5bb9\u6765\u51cf\u5c11\u5e7b\u89c9\uff08\u6a21\u578b\u7f16\u9020\u4e8b\u7269\u7684\u60c5\u51b5\uff09\uff1a<\/p>\n<blockquote>\n<p>&#8220;If the AI does not know the answer to a question, it truthfully says it does not know.&#8221;<\/p>\n<\/blockquote>\n<p>\u8fd9\u6709\u52a9\u4e8e\u51cf\u5c11\u5e7b\u89c9\u7684\u53d1\u751f\uff0c\u4f46\u5e76\u4e0d\u80fd\u5b8c\u5168\u89e3\u51b3\u5e7b\u89c9\u7684\u95ee\u9898\uff0c\u4f46\u6211\u4eec\u5c06\u628a\u8fd9\u4e2a\u95ee\u9898\u7559\u5230\u672a\u6765\u7ae0\u8282\u8ba8\u8bba\u3002<\/p>\n<p>\u5728\u521d\u59cb\u63d0\u793a\u4e4b\u540e\uff0c\u6211\u4eec\u770b\u5230\u4e24\u4e2a\u53c2\u6570\uff1b{history} \u548c {input}\u3002{input} \u662f\u6211\u4eec\u5c06\u653e\u7f6e\u6700\u65b0\u7684\u4eba\u7c7b\u67e5\u8be2\u7684\u5730\u65b9\uff1b\u5b83\u662f\u8f93\u5165\u5230\u804a\u5929\u673a\u5668\u4eba\u6587\u672c\u6846\u7684\u8f93\u5165\u5185\u5bb9\uff1a<\/p>\n<p>\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/www.nicekj.com\/wp-content\/uploads\/replace\/b36dd9d25280a4b3a18b248ed0f4cf33.png\" alt=\"image.png\" \/><\/figure>\n<\/p>\n<p>{history} \u662f\u7528\u4e8e\u4f7f\u7528\u5bf9\u8bdd\u8bb0\u5fc6\u7684\u5730\u65b9\u3002\u5728\u8fd9\u91cc\uff0c\u6211\u4eec\u63d0\u4f9b\u4e86\u5173\u4e8e\u4eba\u7c7b\u548cAI\u4e4b\u95f4\u5bf9\u8bdd\u5386\u53f2\u7684\u4fe1\u606f\u3002<\/p>\n<p>\u8fd9\u4e24\u4e2a\u53c2\u6570\u2014\u2014{history} \u548c {input}\u2014\u2014\u4f20\u9012\u7ed9\u4e86\u6211\u4eec\u521a\u521a\u770b\u5230\u7684\u63d0\u793a\u6a21\u677f\u4e2d\u7684LLM\uff0c\u800c\u6211\u4eec\uff08\u5e0c\u671b\u5982\u6b64\uff09\u8fd4\u56de\u7684\u8f93\u51fa\u53ea\u662f\u5bf9\u8bdd\u7684\u9884\u6d4b\u7ee7\u7eed\u90e8\u5206\u3002<\/p>\n<h1 data-id=\"heading-1\">\u5bf9\u8bdd\u8bb0\u5fc6\u7684\u5f62\u5f0f<\/h1>\n<p>\u6211\u4eec\u53ef\u4ee5\u5728ConversationChain\u4e2d\u4f7f\u7528\u591a\u79cd\u7c7b\u578b\u7684\u5bf9\u8bdd\u8bb0\u5fc6\u3002\u5b83\u4eec\u4fee\u6539\u4f20\u9012\u7ed9{history}\u53c2\u6570\u7684\u6587\u672c\u3002<\/p>\n<h2 data-id=\"heading-2\">\u5bf9\u8bdd\u7f13\u51b2\u8bb0\u5fc6<\/h2>\n<p>ConversationBufferMemory\u662fLangChain\u4e2d\u6700\u76f4\u63a5\u7684\u5bf9\u8bdd\u8bb0\u5fc6\u5f62\u5f0f\u3002\u6b63\u5982\u6211\u4eec\u4e0a\u9762\u6240\u63cf\u8ff0\u7684\uff0c\u8fc7\u53bb\u4eba\u7c7b\u548cAI\u4e4b\u95f4\u7684\u5bf9\u8bdd\u7684\u539f\u59cb\u8f93\u5165\u4ee5\u5176\u539f\u59cb\u5f62\u5f0f\u4f20\u9012\u5230{history}\u53c2\u6570\u4e2d\u3002<\/p>\n<p>In[11]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">ini<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-ini code-block-extension-codeShowNum\" lang=\"ini\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">from langchain.chains.conversation.memory import ConversationBufferMemory<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"><span class=\"hljs-attr\">conversation_buf<\/span> = ConversationChain(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">    <span class=\"hljs-attr\">llm<\/span>=llm,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">    <span class=\"hljs-attr\">memory<\/span>=ConversationBufferMemory()<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\">)<\/span>\n<\/code><\/pre>\n<p>In[32]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">conversation_buf<\/span>(\"Good morning AI!\")<\/span>\n<\/code><\/pre>\n<p>Out[32]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">{'<span class=\"hljs-selector-tag\">input<\/span>': <span class=\"hljs-string\">'Good morning AI!'<\/span>,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"> <span class=\"hljs-string\">'history'<\/span>: <span class=\"hljs-string\">''<\/span>,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"> <span class=\"hljs-string\">'response'<\/span>: <span class=\"hljs-string\">\" Good morning! It's a beautiful day today, isn't it? How can I help you?\"<\/span>}<\/span>\n<\/code><\/pre>\n<p>\u6211\u4eec\u8fd4\u56de\u4e86\u5bf9\u8bdd\u4ee3\u7406\u7684\u7b2c\u4e00\u4e2a\u54cd\u5e94\u3002\u8ba9\u6211\u4eec\u7ee7\u7eed\u5bf9\u8bdd\uff0c\u7f16\u5199\u53ea\u6709\u5728\u8003\u8651\u5bf9\u8bdd\u5386\u53f2\u65f6LLM\u624d\u80fd\u56de\u7b54\u7684\u63d0\u793a\u3002\u6211\u4eec\u8fd8\u6dfb\u52a0\u4e86\u4e00\u4e2acount_tokens\u51fd\u6570\uff0c\u4ee5\u4fbf\u6211\u4eec\u53ef\u4ee5\u67e5\u770b\u6bcf\u6b21\u4ea4\u4e92\u4f7f\u7528\u4e86\u591a\u5c11\u6807\u8bb0\u3002<\/p>\n<p>In[6]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">python<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-python code-block-extension-codeShowNum\" lang=\"python\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-keyword\">from<\/span> langchain.callbacks <span class=\"hljs-keyword\">import<\/span> get_openai_callback<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"><span class=\"hljs-keyword\">def<\/span> <span class=\"hljs-title function_\">count_tokens<\/span>(<span class=\"hljs-params\">chain, query<\/span>):<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">    <span class=\"hljs-keyword\">with<\/span> get_openai_callback() <span class=\"hljs-keyword\">as<\/span> cb:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">        result = chain.run(query)<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\">        <span class=\"hljs-built_in\">print<\/span>(<span class=\"hljs-string\">f'Spent a total of <span class=\"hljs-subst\">{cb.total_tokens}<\/span> tokens'<\/span>)<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"7\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"8\">    <span class=\"hljs-keyword\">return<\/span> result<\/span>\n<\/code><\/pre>\n<p>In[33]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_buf, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"My interest here is to explore the potential of integrating Large Language Models with external knowledge\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[33]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">179<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[33]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scala<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scala code-block-extension-codeShowNum\" lang=\"scala\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">' <span class=\"hljs-type\">Interesting<\/span>! <span class=\"hljs-type\">Large<\/span> <span class=\"hljs-type\">Language<\/span> <span class=\"hljs-type\">Models<\/span> are a <span class=\"hljs-class\"><span class=\"hljs-keyword\">type<\/span> <span class=\"hljs-title\">of<\/span> <span class=\"hljs-title\">artificial<\/span> <span class=\"hljs-title\">intelligence<\/span> <span class=\"hljs-title\">that<\/span> <span class=\"hljs-title\">can<\/span> <span class=\"hljs-title\">process<\/span> <span class=\"hljs-title\">natural<\/span> <span class=\"hljs-title\">language<\/span> <span class=\"hljs-title\">and<\/span> <span class=\"hljs-title\">generate<\/span> <span class=\"hljs-title\">text<\/span>. <span class=\"hljs-title\">They<\/span> <span class=\"hljs-title\">can<\/span> <span class=\"hljs-title\">be<\/span> <span class=\"hljs-title\">used<\/span> <span class=\"hljs-title\">to<\/span> <span class=\"hljs-title\">generate<\/span> <span class=\"hljs-title\">text<\/span> <span class=\"hljs-title\">from<\/span> <span class=\"hljs-title\">a<\/span> <span class=\"hljs-title\">given<\/span> <span class=\"hljs-title\">context<\/span>, <span class=\"hljs-title\">or<\/span> <span class=\"hljs-title\">to<\/span> <span class=\"hljs-title\">answer<\/span> <span class=\"hljs-title\">questions<\/span> <span class=\"hljs-title\">about<\/span> <span class=\"hljs-title\">a<\/span> <span class=\"hljs-title\">given<\/span> <span class=\"hljs-title\">context<\/span>. <span class=\"hljs-title\">Integrating<\/span> <span class=\"hljs-title\">them<\/span> <span class=\"hljs-keyword\">with<\/span> <span class=\"hljs-title\">external<\/span> <span class=\"hljs-title\">knowledge<\/span> <span class=\"hljs-title\">can<\/span> <span class=\"hljs-title\">help<\/span> <span class=\"hljs-title\">them<\/span> <span class=\"hljs-title\">to<\/span> <span class=\"hljs-title\">better<\/span> <span class=\"hljs-title\">understand<\/span> <span class=\"hljs-title\">the<\/span> <span class=\"hljs-title\">context<\/span> <span class=\"hljs-title\">and<\/span> <span class=\"hljs-title\">generate<\/span> <span class=\"hljs-title\">more<\/span> <span class=\"hljs-title\">accurate<\/span> <span class=\"hljs-title\">results<\/span>. <span class=\"hljs-title\">Is<\/span> <span class=\"hljs-title\">there<\/span> <span class=\"hljs-title\">anything<\/span> <span class=\"hljs-title\">else<\/span> <span class=\"hljs-title\">I<\/span> <span class=\"hljs-title\">can<\/span> <span class=\"hljs-title\">help<\/span> <span class=\"hljs-title\">you<\/span> <span class=\"hljs-keyword\">with<\/span><span class=\"hljs-title\">?<\/span>'<\/span><\/span>\n<\/code><\/pre>\n<p>In[34]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">count_tokens(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_buf,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"<span class=\"hljs-selector-tag\">I<\/span> just want <span class=\"hljs-selector-tag\">to<\/span> analyze the different possibilities. What can you think of?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[34]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">268<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[34]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' Well, integrating Large Language Models with external knowledge can open up a lot of possibilities. For example, you could use them to generate more accurate and detailed summaries of text, or to answer questions about a given context more accurately. You could also use them to generate more accurate translations, or to generate more accurate predictions about future events.'<\/span><\/span>\n<\/code><\/pre>\n<p>In[35]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">count_tokens(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_buf, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"Which data source types could be used <span class=\"hljs-selector-tag\">to<\/span> give context <span class=\"hljs-selector-tag\">to<\/span> the model?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[35]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">360<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[35]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">'  There are a variety of data sources that could be used to give context to a Large Language Model. These include structured data sources such as databases, unstructured data sources such as text documents, and even audio and video data sources. Additionally, you could use external knowledge sources such as Wikipedia or other online encyclopedias to provide additional context.'<\/span><\/span>\n<\/code><\/pre>\n<p>In[36]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_buf, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"What is my aim again?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[36]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">388<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[36]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' Your aim is to explore the potential of integrating Large Language Models with external knowledge.'<\/span><\/span>\n<\/code><\/pre>\n<p>LLM\u663e\u7136\u53ef\u4ee5\u8bb0\u4f4f\u5bf9\u8bdd\u7684\u5386\u53f2\u3002\u8ba9\u6211\u4eec\u770b\u770bConversationBufferMemory\u662f\u5982\u4f55\u5b58\u50a8\u8fd9\u4e2a\u5bf9\u8bdd\u5386\u53f2\u7684\uff1a<\/p>\n<p>In[37]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">print<\/span>(conversation_buf.memory.buffer)<\/span>\n<\/code><\/pre>\n<p>Out[37]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><span class=\"hljs-symbol\">Human:<\/span> Good morning AI!<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"><span class=\"hljs-symbol\">AI:<\/span>  Good morning! It<span class=\"hljs-comment\">'s a beautiful day today, isn't it? How can I help you?<\/span><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\"><span class=\"hljs-symbol\">Human:<\/span> My interest here <span class=\"hljs-built_in\">is<\/span> <span class=\"hljs-keyword\">to<\/span> explore the potential <span class=\"hljs-keyword\">of<\/span> integrating Large Language Models <span class=\"hljs-keyword\">with<\/span> external knowledge<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\"><span class=\"hljs-symbol\">AI:<\/span>  Interesting! Large Language Models are a type <span class=\"hljs-keyword\">of<\/span> artificial intelligence that can process natural language <span class=\"hljs-built_in\">and<\/span> generate <span class=\"hljs-keyword\">text<\/span>. They can be used <span class=\"hljs-keyword\">to<\/span> generate <span class=\"hljs-keyword\">text<\/span> <span class=\"hljs-keyword\">from<\/span> a given context, <span class=\"hljs-built_in\">or<\/span> <span class=\"hljs-keyword\">to<\/span> answer questions about a given context. Integrating them <span class=\"hljs-keyword\">with<\/span> external knowledge can help them <span class=\"hljs-keyword\">to<\/span> better understand the context <span class=\"hljs-built_in\">and<\/span> generate more accurate results. <span class=\"hljs-built_in\">Is<\/span> there anything <span class=\"hljs-keyword\">else<\/span> I can help you <span class=\"hljs-keyword\">with<\/span>?<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\"><span class=\"hljs-symbol\">Human:<\/span> I just want <span class=\"hljs-keyword\">to<\/span> analyze the different possibilities. What can you think <span class=\"hljs-keyword\">of<\/span>?<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"7\"><span class=\"hljs-symbol\">AI:<\/span>  Well, integrating Large Language Models <span class=\"hljs-keyword\">with<\/span> external knowledge can open up a lot <span class=\"hljs-keyword\">of<\/span> possibilities. <span class=\"hljs-keyword\">For<\/span> example, you could use them <span class=\"hljs-keyword\">to<\/span> generate more accurate <span class=\"hljs-built_in\">and<\/span> detailed summaries <span class=\"hljs-keyword\">of<\/span> <span class=\"hljs-keyword\">text<\/span>, <span class=\"hljs-built_in\">or<\/span> <span class=\"hljs-keyword\">to<\/span> answer questions about a given context more accurately. You could also use them <span class=\"hljs-keyword\">to<\/span> generate more accurate translations, <span class=\"hljs-built_in\">or<\/span> <span class=\"hljs-keyword\">to<\/span> generate more accurate predictions about future events.<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"8\"><span class=\"hljs-symbol\">Human:<\/span> Which data source types could be used <span class=\"hljs-keyword\">to<\/span> give context <span class=\"hljs-keyword\">to<\/span> the model?<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"9\"><span class=\"hljs-symbol\">AI:<\/span>   There are a variety <span class=\"hljs-keyword\">of<\/span> data sources that could be used <span class=\"hljs-keyword\">to<\/span> give context <span class=\"hljs-keyword\">to<\/span> a Large Language Model. These include structured data sources such <span class=\"hljs-keyword\">as<\/span> databases, unstructured data sources such <span class=\"hljs-keyword\">as<\/span> <span class=\"hljs-keyword\">text<\/span> documents, <span class=\"hljs-built_in\">and<\/span> even audio <span class=\"hljs-built_in\">and<\/span> video data sources. Additionally, you could use external knowledge sources such <span class=\"hljs-keyword\">as<\/span> Wikipedia <span class=\"hljs-built_in\">or<\/span> other online encyclopedias <span class=\"hljs-keyword\">to<\/span> provide additional context.<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"10\"><span class=\"hljs-symbol\">Human:<\/span> What <span class=\"hljs-built_in\">is<\/span> my aim again?<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"11\"><span class=\"hljs-symbol\">AI:<\/span>  Your aim <span class=\"hljs-built_in\">is<\/span> <span class=\"hljs-keyword\">to<\/span> explore the potential <span class=\"hljs-keyword\">of<\/span> integrating Large Language Models <span class=\"hljs-keyword\">with<\/span> external knowledge.<\/span>\n<\/code><\/pre>\n<p>\u6211\u4eec\u53ef\u4ee5\u770b\u5230\u7f13\u51b2\u4fdd\u5b58\u4e86\u804a\u5929\u5386\u53f2\u4e2d\u7684\u6bcf\u4e00\u6b21\u4e92\u52a8\u3002\u8fd9\u79cd\u65b9\u6cd5\u6709\u4e00\u4e9b\u4f18\u70b9\u548c\u7f3a\u70b9\u3002\u7b80\u800c\u8a00\u4e4b\uff0c\u5b83\u4eec\u662f\uff1a<\/p>\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<table><thead><tr><th>Pros<\/th><th>Cons<\/th><\/tr><\/thead><tbody><tr><td>Storing everything gives the LLM the maximum amount of information<\/td><td>More tokens mean slowing response times and higher costs<\/td><\/tr><tr><td>Storing everything is simple and intuitive<\/td><td>Long conversations cannot be remembered as we hit the LLM token limit (4096 tokens for text-davinci-003 and gpt-3.5-turbo)<\/td><\/tr><\/tbody><\/table>\n<p>ConversationBufferMemory\u662f\u4e00\u4e2a\u5f88\u597d\u7684\u5f00\u59cb\u9009\u9879\uff0c\u4f46\u53d7\u5230\u4fdd\u5b58\u6bcf\u6b21\u4e92\u52a8\u7684\u5b58\u50a8\u9650\u5236\u3002\u8ba9\u6211\u4eec\u770b\u770b\u5176\u4ed6\u53ef\u4ee5\u5e2e\u52a9\u89e3\u51b3\u8fd9\u4e2a\u95ee\u9898\u7684\u9009\u9879\u3002<\/p>\n<h2 data-id=\"heading-3\">\u5bf9\u8bdd\u6458\u8981\u8bb0\u5fc6<\/h2>\n<p>\u4f7f\u7528ConversationBufferMemory\uff0c\u6211\u4eec\u5f88\u5feb\u5c31\u4f7f\u7528\u4e86\u5927\u91cf\u7684\u6807\u8bb0\uff0c\u751a\u81f3\u8d85\u51fa\u4e86\u5f53\u524d\u6700\u5148\u8fdb\u7684LLMs\u7684\u4e0a\u4e0b\u6587\u7a97\u53e3\u9650\u5236\u3002<\/p>\n<p>\u4e3a\u4e86\u907f\u514d\u8fc7\u591a\u7684\u6807\u8bb0\u4f7f\u7528\uff0c\u6211\u4eec\u53ef\u4ee5\u4f7f\u7528ConversationSummaryMemory\u3002\u6b63\u5982\u5176\u540d\u79f0\u6240\u793a\uff0c\u8fd9\u79cd\u5f62\u5f0f\u7684\u8bb0\u5fc6\u5728\u4f20\u9012\u7ed9{history}\u53c2\u6570\u4e4b\u524d\u5bf9\u5bf9\u8bdd\u5386\u53f2\u8fdb\u884c\u4e86\u603b\u7ed3\u3002<\/p>\n<p>\u6211\u4eec\u53ef\u4ee5\u8fd9\u6837\u521d\u59cb\u5316ConversationChain\uff0c\u4f7f\u7528\u603b\u7ed3\u8bb0\u5fc6\uff1a<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">ini<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-ini code-block-extension-codeShowNum\" lang=\"ini\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">from langchain.chains.conversation.memory import ConversationSummaryMemory<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"><span class=\"hljs-attr\">conversation<\/span> = ConversationChain(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">\t<span class=\"hljs-attr\">llm<\/span>=llm,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">\t<span class=\"hljs-attr\">memory<\/span>=ConversationSummaryMemory(llm=llm)<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\">)<\/span>\n<\/code><\/pre>\n<p>\u5f53\u4f7f\u7528ConversationSummaryMemory\u65f6\uff0c\u6211\u4eec\u9700\u8981\u5c06\u4e00\u4e2aLLM\u4f20\u9012\u7ed9\u8be5\u5bf9\u8c61\uff0c\u56e0\u4e3a\u603b\u7ed3\u662f\u7531\u4e00\u4e2aLLM\u63d0\u4f9b\u652f\u6301\u7684\u3002\u6211\u4eec\u53ef\u4ee5\u5728\u8fd9\u91cc\u770b\u5230\u7528\u4e8e\u6b64\u64cd\u4f5c\u7684\u63d0\u793a\uff1a<\/p>\n<p>In[19]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">print<\/span>(conversation_sum.memory.prompt.template)<\/span>\n<\/code><\/pre>\n<p>Out[19]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Progressively summarize the lines <span class=\"hljs-keyword\">of<\/span> conversation provided, adding onto the previous summary returning a <span class=\"hljs-built_in\">new<\/span> summary.<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">EXAMPLE<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">Current summary:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">The human asks what the AI thinks <span class=\"hljs-keyword\">of<\/span> artificial intelligence. The AI thinks artificial intelligence <span class=\"hljs-built_in\">is<\/span> a force <span class=\"hljs-keyword\">for<\/span> good.<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"7\"><span class=\"hljs-built_in\">New<\/span> lines <span class=\"hljs-keyword\">of<\/span> conversation:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"8\"><span class=\"hljs-symbol\">Human:<\/span> Why <span class=\"hljs-keyword\">do<\/span> you think artificial intelligence <span class=\"hljs-built_in\">is<\/span> a force <span class=\"hljs-keyword\">for<\/span> good?<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"9\"><span class=\"hljs-symbol\">AI:<\/span> Because artificial intelligence will help humans reach their full potential.<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"10\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"11\"><span class=\"hljs-built_in\">New<\/span> summary:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"12\">The human asks what the AI thinks <span class=\"hljs-keyword\">of<\/span> artificial intelligence. The AI thinks artificial intelligence <span class=\"hljs-built_in\">is<\/span> a force <span class=\"hljs-keyword\">for<\/span> good because it will help humans reach their full potential.<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"13\"><span class=\"hljs-keyword\">END<\/span> <span class=\"hljs-keyword\">OF<\/span> EXAMPLE<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"14\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"15\">Current summary:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"16\">{summary}<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"17\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"18\"><span class=\"hljs-built_in\">New<\/span> lines <span class=\"hljs-keyword\">of<\/span> conversation:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"19\">{new_lines}<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"20\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"21\"><span class=\"hljs-built_in\">New<\/span> summary:<\/span>\n<\/code><\/pre>\n<p>\u4f7f\u7528\u8fd9\u4e2a\u65b9\u6cd5\uff0c\u6211\u4eec\u53ef\u4ee5\u603b\u7ed3\u6bcf\u4e00\u6b21\u65b0\u7684\u4e92\u52a8\uff0c\u5e76\u5c06\u5b83\u6dfb\u52a0\u5230\u6240\u6709\u8fc7\u53bb\u4e92\u52a8\u7684\u201c\u8fd0\u884c\u6458\u8981\u201d\u4e2d\u3002\u8ba9\u6211\u4eec\u518d\u6b21\u8fdb\u884c\u4e00\u6b21\u5229\u7528\u8fd9\u79cd\u65b9\u6cd5\u7684\u5bf9\u8bdd\u3002<\/p>\n<p>In[40]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">rust<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-rust code-block-extension-codeShowNum\" lang=\"rust\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"># without count_tokens we<span class=\"hljs-symbol\">'d<\/span> call `<span class=\"hljs-title function_ invoke__\">conversation_sum<\/span>(<span class=\"hljs-string\">\"Good morning AI!\"<\/span>)`<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"># but <span class=\"hljs-keyword\">let<\/span><span class=\"hljs-symbol\">'s<\/span> keep track of our tokens:<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"><span class=\"hljs-title function_ invoke__\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">    conversation_sum, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">    <span class=\"hljs-string\">\"Good morning AI!\"<\/span><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\">)<\/span>\n<\/code><\/pre>\n<p>Out[40]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">290<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[40]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">\" Good morning! It's <span class=\"hljs-selector-tag\">a<\/span> beautiful day today, isn't it? How can <span class=\"hljs-selector-tag\">I<\/span> help you?\"<\/span>\n<\/code><\/pre>\n<p>In[41]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_sum, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"My interest here is to explore the potential of integrating Large Language Models with external knowledge\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[41]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">440<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[41]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">\" That sounds like an interesting project! <span class=\"hljs-selector-tag\">I<\/span>'m familiar with Large Language Models, but <span class=\"hljs-selector-tag\">I<\/span>'m not sure how they could be integrated with external knowledge. Could you tell me more about what you have in mind?\"<\/span>\n<\/code><\/pre>\n<p>In[42]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">count_tokens(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_sum, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"<span class=\"hljs-selector-tag\">I<\/span> just want <span class=\"hljs-selector-tag\">to<\/span> analyze the different possibilities. What can you think of?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[42]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">664<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[42]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' I can think of a few possibilities. One option is to use a large language model to generate a set of candidate answers to a given query, and then use external knowledge to filter out the most relevant answers. Another option is to use the large language model to generate a set of candidate answers, and then use external knowledge to score and rank the answers. Finally, you could use the large language model to generate a set of candidate answers, and then use external knowledge to refine the answers.'<\/span><\/span>\n<\/code><\/pre>\n<p>In[43]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">count_tokens(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_sum, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"Which data source types could be used <span class=\"hljs-selector-tag\">to<\/span> give context <span class=\"hljs-selector-tag\">to<\/span> the model?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[43]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">799<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[43]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' There are many different types of data sources that could be used to give context to the model. These could include structured data sources such as databases, unstructured data sources such as text documents, or even external APIs that provide access to external knowledge. Additionally, the model could be trained on a combination of these data sources to provide a more comprehensive understanding of the context.'<\/span><\/span>\n<\/code><\/pre>\n<p>In[44]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_sum, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"What is my aim again?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[44]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">853<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[44]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' Your aim is to explore the potential of integrating Large Language Models with external knowledge.'<\/span><\/span>\n<\/code><\/pre>\n<p>\u5728\u8fd9\u79cd\u60c5\u51b5\u4e0b\uff0c\u603b\u7ed3\u5305\u542b\u8db3\u591f\u7684\u4fe1\u606f\uff0c\u4ee5\u4f7fLLM\u201c\u8bb0\u4f4f\u201d\u6211\u4eec\u6700\u521d\u7684\u76ee\u6807\u3002\u6211\u4eec\u53ef\u4ee5\u50cf\u8fd9\u6837\u67e5\u770b\u603b\u7ed3\u7684\u539f\u59cb\u5f62\u5f0f\uff1a<\/p>\n<p>In[45]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">print<\/span>(conversation_sum.memory.buffer)<\/span>\n<\/code><\/pre>\n<p>Out[45]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">sql<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-sql code-block-extension-codeShowNum\" lang=\"sql\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">The human greeted the AI <span class=\"hljs-keyword\">with<\/span> a good morning, <span class=\"hljs-keyword\">to<\/span> which the AI responded <span class=\"hljs-keyword\">with<\/span> a good morning <span class=\"hljs-keyword\">and<\/span> asked how it could help. The human expressed interest <span class=\"hljs-keyword\">in<\/span> exploring the potential <span class=\"hljs-keyword\">of<\/span> integrating <span class=\"hljs-keyword\">Large<\/span> <span class=\"hljs-keyword\">Language<\/span> Models <span class=\"hljs-keyword\">with<\/span> <span class=\"hljs-keyword\">external<\/span> knowledge, <span class=\"hljs-keyword\">to<\/span> which the AI responded positively <span class=\"hljs-keyword\">and<\/span> asked <span class=\"hljs-keyword\">for<\/span> more information. The human asked the AI <span class=\"hljs-keyword\">to<\/span> think <span class=\"hljs-keyword\">of<\/span> different possibilities, <span class=\"hljs-keyword\">and<\/span> the AI suggested three options: <span class=\"hljs-keyword\">using<\/span> the <span class=\"hljs-keyword\">large<\/span> <span class=\"hljs-keyword\">language<\/span> model <span class=\"hljs-keyword\">to<\/span> generate a <span class=\"hljs-keyword\">set<\/span> <span class=\"hljs-keyword\">of<\/span> candidate answers <span class=\"hljs-keyword\">and<\/span> <span class=\"hljs-keyword\">then<\/span> <span class=\"hljs-keyword\">using<\/span> <span class=\"hljs-keyword\">external<\/span> knowledge <span class=\"hljs-keyword\">to<\/span> <span class=\"hljs-keyword\">filter<\/span> <span class=\"hljs-keyword\">out<\/span> the most relevant answers, score <span class=\"hljs-keyword\">and<\/span> rank the answers, <span class=\"hljs-keyword\">or<\/span> refine the answers. The human <span class=\"hljs-keyword\">then<\/span> asked which data source types could be used <span class=\"hljs-keyword\">to<\/span> give context <span class=\"hljs-keyword\">to<\/span> the model, <span class=\"hljs-keyword\">to<\/span> which the AI responded that there <span class=\"hljs-keyword\">are<\/span> many different types <span class=\"hljs-keyword\">of<\/span> data sources that could be used, such <span class=\"hljs-keyword\">as<\/span> structured data sources, unstructured data sources, <span class=\"hljs-keyword\">or<\/span> <span class=\"hljs-keyword\">external<\/span> APIs. Additionally, the model could be trained <span class=\"hljs-keyword\">on<\/span> a combination <span class=\"hljs-keyword\">of<\/span> these data sources <span class=\"hljs-keyword\">to<\/span> provide a more comprehensive understanding <span class=\"hljs-keyword\">of<\/span> the context. The human <span class=\"hljs-keyword\">then<\/span> asked what their aim was again, <span class=\"hljs-keyword\">to<\/span> which the AI responded that their aim was <span class=\"hljs-keyword\">to<\/span> explore the potential <span class=\"hljs-keyword\">of<\/span> integrating <span class=\"hljs-keyword\">Large<\/span> <span class=\"hljs-keyword\">Language<\/span> Models <span class=\"hljs-keyword\">with<\/span> <span class=\"hljs-keyword\">external<\/span> knowledge.<\/span>\n<\/code><\/pre>\n<p>\u8fd9\u6b21\u5bf9\u8bdd\u4e2d\u4f7f\u7528\u7684\u6807\u8bb0\u6570\u91cf\u6bd4\u4f7f\u7528ConversationBufferMemory\u591a\uff0c\u6240\u4ee5\u4f7f\u7528ConversationSummaryMemory\u662f\u5426\u6709\u4efb\u4f55\u4f18\u52bf\u5462\uff1f<\/p>\n<p>\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/www.nicekj.com\/wp-content\/uploads\/replace\/52d7729359841014e7be4c13d21d7f1c.png\" alt=\"image.png\" \/><\/figure>\n<\/p>\n<p>\u5bf9\u4e8e\u66f4\u957f\u7684\u5bf9\u8bdd\u6765\u8bf4\uff0c\u662f\u7684\u3002\u5728\u8fd9\u91cc\uff0c\u6211\u4eec\u6709\u4e00\u4e2a\u8f83\u957f\u7684\u5bf9\u8bdd\u3002\u5982\u4e0a\u6240\u793a\uff0c\u6458\u8981\u5185\u5b58\u6700\u521d\u4f7f\u7528\u7684\u6807\u8bb0\u8981\u591a\u5f97\u591a\u3002\u7136\u800c\uff0c\u968f\u7740\u5bf9\u8bdd\u7684\u8fdb\u884c\uff0c\u603b\u7ed3\u7684\u65b9\u6cd5\u589e\u957f\u901f\u5ea6\u8f83\u6162\u3002\u76f8\u6bd4\u4e4b\u4e0b\uff0c\u7f13\u51b2\u5185\u5b58\u4f1a\u968f\u7740\u804a\u5929\u4e2d\u7684\u6807\u8bb0\u6570\u91cf\u7ebf\u6027\u589e\u957f\u3002<\/p>\n<p>\u6211\u4eec\u53ef\u4ee5\u603b\u7ed3ConversationSummaryMemory\u7684\u4f18\u7f3a\u70b9\u5982\u4e0b\uff1a<\/p>\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<table><thead><tr><th>Pros<\/th><th>Cons<\/th><\/tr><\/thead><tbody><tr><td>Shortens the number of tokens for long conversations.<\/td><td>Can result in higher token usage for smaller conversations<\/td><\/tr><tr><td>Enables much longer conversations<\/td><td>Memorization of the conversation history is wholly reliant on the summarization ability of the intermediate summarization LLM<\/td><\/tr><tr><td>Relatively straightforward implementation, intuitively simple to understand<\/td><td>Also requires token usage for the summarization LLM; this increases costs (but does not limit conversation length)<\/td><\/tr><\/tbody><\/table>\n<p>\u5bf9\u4e8e\u9884\u671f\u6709\u957f\u5bf9\u8bdd\u7684\u60c5\u51b5\uff0c\u5bf9\u8bdd\u603b\u7ed3\u662f\u4e00\u4e2a\u4e0d\u9519\u7684\u65b9\u6cd5\u3002\u7136\u800c\uff0c\u5b83\u4ecd\u7136\u57fa\u672c\u4e0a\u53d7\u5230\u6807\u8bb0\u9650\u5236\u7684\u9650\u5236\u3002\u7ecf\u8fc7\u4e00\u6bb5\u65f6\u95f4\u540e\uff0c\u6211\u4eec\u4ecd\u7136\u4f1a\u8d85\u51fa\u4e0a\u4e0b\u6587\u7a97\u53e3\u7684\u9650\u5236\u3002<\/p>\n<h2 data-id=\"heading-4\">\u5bf9\u8bdd\u7f13\u51b2\u7a97\u53e3\u5185\u5b58<\/h2>\n<p>&#8220;ConversationBufferWindowMemory&#8221; \u7684\u4f5c\u7528\u4e0e\u4e4b\u524d\u7684 &#8220;\u7f13\u51b2\u5185\u5b58&#8221; \u76f8\u540c\uff0c\u4f46\u5b83\u5728\u5185\u5b58\u4e2d\u6dfb\u52a0\u4e86\u4e00\u4e2a\u7a97\u53e3\u3002\u8fd9\u610f\u5473\u7740\u6211\u4eec\u53ea\u4fdd\u7559\u4e00\u5b9a\u6570\u91cf\u7684\u8fc7\u53bb\u4ea4\u4e92\uff0c\u7136\u540e\u5c06\u5176\u201c\u9057\u5fd8\u201d\u3002\u6211\u4eec\u4f7f\u7528\u5b83\u7684\u65b9\u5f0f\u5982\u4e0b\uff1a<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">ini<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-ini code-block-extension-codeShowNum\" lang=\"ini\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">from langchain.chains.conversation.memory import ConversationBufferWindowMemory<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\"><span class=\"hljs-attr\">conversation<\/span> = ConversationChain(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">\t<span class=\"hljs-attr\">llm<\/span>=llm,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">\t<span class=\"hljs-attr\">memory<\/span>=ConversationBufferWindowMemory(k=<span class=\"hljs-number\">1<\/span>)<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"6\">)<\/span>\n<\/code><\/pre>\n<p>\u5728\u8fd9\u4e2a\u793a\u4f8b\u4e2d\uff0c\u6211\u4eec\u8bbe\u7f6e\u4e86 k=1 \u2014 \u8fd9\u610f\u5473\u7740\u7a97\u53e3\u4f1a\u8bb0\u4f4f\u4eba\u7c7b\u548c\u4eba\u5de5\u667a\u80fd\u4e4b\u95f4\u7684\u6700\u65b0\u4e00\u6b21\u4ea4\u4e92\u3002\u4e5f\u5c31\u662f\u6700\u65b0\u7684\u4eba\u7c7b\u56de\u590d\u548c\u6700\u65b0\u7684\u4eba\u5de5\u667a\u80fd\u56de\u590d\u3002\u6211\u4eec\u53ef\u4ee5\u770b\u5230\u8fd9\u4e2a\u6548\u679c\u5982\u4e0b\uff1a<\/p>\n<p>In[61]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_bufw, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"Good morning AI!\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[61]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">85<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[61]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">\" Good morning! It's <span class=\"hljs-selector-tag\">a<\/span> beautiful day today, isn't it? How can <span class=\"hljs-selector-tag\">I<\/span> help you?\"<\/span>\n<\/code><\/pre>\n<p>In[62]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_bufw, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"My interest here is to explore the potential of integrating Large Language Models with external knowledge\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[62]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">178<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[62]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' Interesting! Large Language Models are a type of artificial intelligence that can process natural language and generate text. They can be used to generate text from a given context, or to answer questions about a given context. Integrating them with external knowledge can help them to better understand the context and generate more accurate results. Do you have any specific questions about this integration?'<\/span><\/span>\n<\/code><\/pre>\n<p>In[63]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">count_tokens(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_bufw, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"<span class=\"hljs-selector-tag\">I<\/span> just want <span class=\"hljs-selector-tag\">to<\/span> analyze the different possibilities. What can you think of?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[63]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">233<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[63]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' There are many possibilities for integrating Large Language Models with external knowledge. For example, you could use external knowledge to provide additional context to the model, or to provide additional training data. You could also use external knowledge to help the model better understand the context of a given text, or to help it generate more accurate results.'<\/span><\/span>\n<\/code><\/pre>\n<p>In[64]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">count_tokens(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_bufw, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"Which data source types could be used <span class=\"hljs-selector-tag\">to<\/span> give context <span class=\"hljs-selector-tag\">to<\/span> the model?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[64]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">245<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[64]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' Data sources that could be used to give context to the model include text corpora, structured databases, and ontologies. Text corpora provide a large amount of text data that can be used to train the model and provide additional context. Structured databases provide structured data that can be used to provide additional context to the model. Ontologies provide a structured representation of knowledge that can be used to provide additional context to the model.'<\/span><\/span>\n<\/code><\/pre>\n<p>In[65]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">count_tokens<\/span>(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    conversation_bufw, <\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">    \"What is my aim again?\"<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">)<\/span>\n<\/code><\/pre>\n<p>Out[65]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">css<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-css code-block-extension-codeShowNum\" lang=\"css\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\">Spent <span class=\"hljs-selector-tag\">a<\/span> total of <span class=\"hljs-number\">186<\/span> tokens<\/span>\n<\/code><\/pre>\n<p>Out[65]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-comment\">' Your aim is to use data sources to give context to the model.'<\/span><\/span>\n<\/code><\/pre>\n<p>\u5728\u5bf9\u8bdd\u7ed3\u675f\u65f6\uff0c\u5f53\u6211\u4eec\u95ee\u201c\u6211\u7684\u76ee\u6807\u662f\u4ec0\u4e48\uff1f\u201d\u65f6\uff0c\u7b54\u6848\u5305\u542b\u5728\u4e09\u6b21\u4ea4\u4e92\u524d\u7684\u4eba\u7c7b\u56de\u5e94\u4e2d\u3002\u7531\u4e8e\u6211\u4eec\u53ea\u4fdd\u7559\u4e86\u6700\u8fd1\u7684\u4e00\u6b21\u4ea4\u4e92\uff08k=1\uff09\uff0c\u6a21\u578b\u5df2\u7ecf\u5fd8\u8bb0\u4e86\uff0c\u65e0\u6cd5\u7ed9\u51fa\u6b63\u786e\u7684\u7b54\u6848\u3002<\/p>\n<p>\u6211\u4eec\u53ef\u4ee5\u8fd9\u6837\u770b\u5230\u6a21\u578b\u7684\u6709\u6548\u201c\u8bb0\u5fc6\u201d\uff1a<\/p>\n<p>In[66]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">ini<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-ini code-block-extension-codeShowNum\" lang=\"ini\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-attr\">bufw_history<\/span> = conversation_bufw.memory.load_memory_variables(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    <span class=\"hljs-attr\">inputs<\/span>=[]<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">)<span class=\"hljs-section\">['history']<\/span><\/span>\n<\/code><\/pre>\n<p>In[67]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">scss<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-scss code-block-extension-codeShowNum\" lang=\"scss\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-built_in\">print<\/span>(bufw_history)<\/span>\n<\/code><\/pre>\n<p>Out[67]:<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">vbnet<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-vbnet code-block-extension-codeShowNum\" lang=\"vbnet\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-symbol\">Human:<\/span> What <span class=\"hljs-built_in\">is<\/span> my aim again?<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\"><span class=\"hljs-symbol\">AI:<\/span>  Your aim <span class=\"hljs-built_in\">is<\/span> <span class=\"hljs-keyword\">to<\/span> use data sources <span class=\"hljs-keyword\">to<\/span> give context <span class=\"hljs-keyword\">to<\/span> the model.<\/span>\n<\/code><\/pre>\n<p>\u5c3d\u7ba1\u8fd9\u79cd\u65b9\u6cd5\u4e0d\u9002\u7528\u4e8e\u8bb0\u4f4f\u8fdc\u8ddd\u79bb\u7684\u4ea4\u4e92\uff0c\u4f46\u5b83\u5f88\u64c5\u957f\u9650\u5236\u4f7f\u7528\u7684\u6807\u8bb0\u6570\u91cf \u2014 \u8fd9\u662f\u4e00\u4e2a\u53ef\u4ee5\u6839\u636e\u6211\u4eec\u7684\u9700\u6c42\u589e\u52a0\u6216\u51cf\u5c11\u7684\u6570\u91cf\u3002\u5bf9\u4e8e\u6211\u4eec\u4e4b\u524d\u6bd4\u8f83\u4e2d\u4f7f\u7528\u7684\u66f4\u957f\u5bf9\u8bdd\uff0c\u6211\u4eec\u53ef\u4ee5\u8bbe\u7f6e k=6\uff0c\u5728\u603b\u517127\u6b21\u4ea4\u4e92\u540e\u8fbe\u5230\u6bcf\u6b21\u4ea4\u4e92\u7ea61.5K\u6807\u8bb0\uff1a<\/p>\n<p>\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/www.nicekj.com\/wp-content\/uploads\/replace\/ac71e10bbe3f807af8afddca048b01f7.png\" alt=\"image.png\" \/><\/figure>\n<\/p>\n<p>\u5982\u679c\u6211\u4eec\u53ea\u9700\u8981\u6700\u8fd1\u4ea4\u4e92\u7684\u8bb0\u5fc6\uff0c\u8fd9\u662f\u4e00\u4e2a\u5f88\u597d\u7684\u9009\u62e9\u3002\u7136\u800c\uff0c\u5982\u679c\u9700\u8981\u540c\u65f6\u8003\u8651\u8fdc\u8ddd\u79bb\u548c\u6700\u8fd1\u4ea4\u4e92\uff0c\u8fd8\u6709\u5176\u4ed6\u9009\u62e9\u3002<\/p>\n<h2 data-id=\"heading-5\">\u5bf9\u8bdd\u603b\u7ed3\u7f13\u51b2\u5185\u5b58<\/h2>\n<p>&#8220;ConversationSummaryBufferMemory&#8221; \u662f\u5bf9\u8bdd\u603b\u7ed3\u5185\u5b58\u548c\u5bf9\u8bdd\u7f13\u51b2\u7a97\u53e3\u5185\u5b58\u7684\u6df7\u5408\u4f53\u3002\u5b83\u603b\u7ed3\u4e86\u5bf9\u8bdd\u4e2d\u6700\u65e9\u7684\u4ea4\u4e92\uff0c\u540c\u65f6\u4fdd\u7559\u4e86\u5bf9\u8bdd\u4e2d\u6700\u8fd1\u7684\u6807\u8bb0\u6570\u4e0d\u8d85\u8fc7\u6700\u5927\u6807\u8bb0\u9650\u5236\u3002\u521d\u59cb\u5316\u5982\u4e0b\u6240\u793a\uff1a<\/p>\n<pre><\/div><div class=\"code-block-extension-headerRight\"><span class=\"code-block-extension-lang\">ini<\/span><div class=\"code-block-extension-copyCodeBtn\">\u590d\u5236\u4ee3\u7801<\/div><\/div><\/div><code class=\"hljs language-ini code-block-extension-codeShowNum\" lang=\"ini\"><span class=\"code-block-extension-codeLine\" data-line-num=\"1\"><span class=\"hljs-attr\">conversation_sum_bufw<\/span> = ConversationChain(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"2\">    <span class=\"hljs-attr\">llm<\/span>=llm, memory=ConversationSummaryBufferMemory(<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"3\">        <span class=\"hljs-attr\">llm<\/span>=llm,<\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"4\">        <span class=\"hljs-attr\">max_token_limit<\/span>=<span class=\"hljs-number\">650<\/span><\/span>\n<span class=\"code-block-extension-codeLine\" data-line-num=\"5\">)<\/span>\n<\/code><\/pre>\n<p>\u5c06\u8fd9\u4e2a\u65b9\u6cd5\u5e94\u7528\u5230\u4e4b\u524d\u7684\u5bf9\u8bdd\u4e2d\uff0c\u6211\u4eec\u53ef\u4ee5\u5c06max_token_limit\u8bbe\u7f6e\u4e3a\u4e00\u4e2a\u8f83\u5c0f\u7684\u6570\u5b57\uff0c\u4f46\u8bed\u8a00\u6a21\u578b\u4ecd\u7136\u53ef\u4ee5\u8bb0\u4f4f\u6211\u4eec\u4e4b\u524d\u7684\u201c\u76ee\u6807\u201d\u3002<\/p>\n<p>\u8fd9\u662f\u56e0\u4e3a\u8fd9\u4e9b\u4fe1\u606f\u88ab\u201c\u603b\u7ed3\u201d\u7ec4\u4ef6\u6355\u83b7\uff0c\u5c3d\u7ba1\u88ab\u201c\u7f13\u51b2\u7a97\u53e3\u201d\u7ec4\u4ef6\u5ffd\u7565\u4e86\u3002<\/p>\n<p>\u5f53\u7136\uff0c\u8fd9\u4e2a\u7ec4\u4ef6\u7684\u4f18\u7f3a\u70b9\u662f\u57fa\u4e8e\u5176\u6240\u4f9d\u8d56\u7684\u4e4b\u524d\u7684\u7ec4\u4ef6\u7684\u6df7\u5408\u3002<\/p>\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n<table><thead><tr><th>Pros<\/th><th>Cons<\/th><\/tr><\/thead><tbody><tr><td>Summarizer means we can remember distant interactions<\/td><td>Summarizer increases token count for shorter conversations<\/td><\/tr><tr><td>Buffer prevents us from missing information from the most recent interactions<\/td><td>Storing the raw interactions \u2014 even if just the most recent interactions \u2014 increases token count<\/td><\/tr><\/tbody><\/table>\n<p>\u5c3d\u7ba1\u9700\u8981\u66f4\u591a\u7684\u8c03\u6574\u6765\u786e\u5b9a\u8981\u603b\u7ed3\u548c\u5728\u7f13\u51b2\u7a97\u53e3\u5185\u4fdd\u7559\u7684\u5185\u5bb9\uff0c\u4f46ConversationSummaryBufferMemory\u786e\u5b9e\u4e3a\u6211\u4eec\u63d0\u4f9b\u4e86\u5f88\u5927\u7684\u7075\u6d3b\u6027\uff0c\u800c\u4e14\u5b83\u662f\u6211\u4eec\u76ee\u524d\u552f\u4e00\u80fd\u591f\u8bb0\u4f4f\u8fdc\u8ddd\u79bb\u4ea4\u4e92\u5e76\u4ee5\u539f\u59cb\u548c\u6700\u4fe1\u606f\u4e30\u5bcc\u7684\u5f62\u5f0f\u5b58\u50a8\u6700\u8fd1\u4ea4\u4e92\u7684\u8bb0\u5fc6\u7c7b\u578b\u4e4b\u4e00\u3002<\/p>\n<p>\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" src=\"https:\/\/www.nicekj.com\/wp-content\/uploads\/replace\/eb95ee6851d3a9a8cfe4d5ff180df826.png\" alt=\"image.png\" \/><\/figure>\n<\/p>\n<p>\u6211\u4eec\u8fd8\u53ef\u4ee5\u770b\u5230\uff0c\u5c3d\u7ba1\u5305\u62ec\u4e86\u5bf9\u8fc7\u53bb\u4ea4\u4e92\u7684\u603b\u7ed3\u548c\u6700\u8fd1\u4ea4\u4e92\u7684\u539f\u59cb\u5f62\u5f0f\uff0cConversationSummaryBufferMemory\u7684\u6807\u8bb0\u6570\u91cf\u589e\u52a0\u4e0e\u5176\u4ed6\u65b9\u6cd5\u76f8\u7ade\u4e89\u3002<\/p>\n<h2 data-id=\"heading-6\">\u5176\u4ed6\u8bb0\u5fc6\u7c7b\u578b<\/h2>\n<p>\u6211\u4eec\u5728\u8fd9\u91cc\u4ecb\u7ecd\u7684\u8bb0\u5fc6\u7c7b\u578b\u975e\u5e38\u9002\u5408\u5165\u95e8\uff0c\u80fd\u591f\u5728\u5c3d\u91cf\u8bb0\u4f4f\u5c3d\u53ef\u80fd\u591a\u7684\u4fe1\u606f\u548c\u6700\u5c0f\u5316\u6807\u8bb0\u4e4b\u95f4\u53d6\u5f97\u826f\u597d\u7684\u5e73\u8861\u3002<\/p>\n<p>\u7136\u800c\uff0c\u6211\u4eec\u8fd8\u6709\u5176\u4ed6\u9009\u62e9\uff0c\u7279\u522b\u662fConversationKnowledgeGraphMemory\u548cConversationEntityMemory\u3002\u6211\u4eec\u5c06\u5728\u63a5\u4e0b\u6765\u7684\u7ae0\u8282\u4e2d\u66f4\u8be6\u7ec6\u5730\u4ecb\u7ecd\u8fd9\u4e9b\u4e0d\u540c\u5f62\u5f0f\u7684\u8bb0\u5fc6\u3002<\/p>\n<p>\u8fd9\u5c31\u662f\u5173\u4e8e\u4f7f\u7528LangChain\u7684LLMs\u7684\u5bf9\u8bdd\u8bb0\u5fc6\u7684\u7b80\u4ecb\u3002\u6b63\u5982\u6211\u4eec\u6240\u89c1\uff0c\u6709\u5f88\u591a\u9009\u9879\u53ef\u4ee5\u5e2e\u52a9\u65e0\u72b6\u6001\u7684LLMs\u4ee5\u4eff\u4f5b\u5728\u6709\u72b6\u6001\u7684\u73af\u5883\u4e2d\u8fdb\u884c\u4ea4\u4e92\uff0c\u80fd\u591f\u8003\u8651\u548c\u53c2\u8003\u4ee5\u524d\u7684\u4ea4\u4e92\u3002<\/p>\n<p>\u6b63\u5982\u63d0\u5230\u7684\uff0c\u8fd8\u6709\u5176\u4ed6\u5f62\u5f0f\u7684\u8bb0\u5fc6\u6211\u4eec\u53ef\u4ee5\u6db5\u76d6\u3002\u6211\u4eec\u8fd8\u53ef\u4ee5\u5b9e\u73b0\u81ea\u5df1\u7684\u8bb0\u5fc6\u6a21\u5757\uff0c\u5728\u540c\u4e00\u94fe\u4e2d\u4f7f\u7528\u591a\u79cd\u7c7b\u578b\u7684\u8bb0\u5fc6\uff0c\u5c06\u5b83\u4eec\u4e0e\u4ee3\u7406\u7ed3\u5408\u4f7f\u7528\uff0c\u7b49\u7b49\u3002\u6240\u6709\u8fd9\u4e9b\u5185\u5bb9\u90fd\u5c06\u5728\u672a\u6765\u7684\u7ae0\u8282\u4e2d\u4ecb\u7ecd\u3002<\/p>","protected":false},"excerpt":{"rendered":"<p>\u5bf9\u8bdd\u8bb0\u5fc6\u662f\u6307\u804a\u5929\u673a\u5668\u4eba\u5982\u4f55\u4ee5\u5bf9\u8bdd\u65b9\u5f0f\u54cd\u5e94\u591a\u4e2a\u67e5\u8be2\u3002\u5b83\u4f7f\u5bf9\u8bdd\u8fde\u8d2f\uff0c\u5e76\u4e14\u5982\u679c\u6ca1\u6709\u5b83\uff0c\u6bcf\u4e2a\u67e5\u8be2\u90fd\u5c06\u88ab\u89c6\u4e3a\u5b8c\u5168\u72ec\u7acb\u7684\u8f93\u5165\uff0c\u800c\u4e0d\u8003\u8651\u8fc7\u53bb\u7684\u4ea4\u4e92\u3002 \u8fd9\u79cd\u8bb0\u5fc6\u4f7f\u5f97\u5927\u578b\u8bed\u8a00\u6a21\u578b\uff08LLM\uff09\u80fd\u591f\u8bb0\u4f4f\u4e0e\u7528\u6237\u4e4b\u524d\u7684\u4e92\u52a8\u3002\u9ed8\u8ba4<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"rank_math_title":"","rank_math_description":"","rank_math_focus_keyword":"","views":"","footnotes":""},"categories":[3],"tags":[126,127,128,129,136],"collection":[],"class_list":["post-1358","post","type-post","status-publish","format-standard","hentry","category-fenlei2","tag-gpt","tag-ai","tag-128","tag-129","tag-136"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/posts\/1358","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/comments?post=1358"}],"version-history":[{"count":0,"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/posts\/1358\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/media?parent=1358"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/categories?post=1358"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/tags?post=1358"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.nicekj.com\/nicekj2024\/wp\/v2\/collection?post=1358"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}