Update:
This post answers the first part of the OP's question: how to add memory to RetrievalQA.from_chain_type?
For the second part, i.e. how to add a custom prompt to ConversationalRetrievalChain, see @andrew_reece's answer: https://stackoverflow.com/a/76257734/6947937
Original:
Have you tried passing in chain_type_kwargs (there is a screenshot of the source code at the bottom for quick reference)?
The documentation doesn't make it easy to see what's going on under the hood, but here is something that achieves your goal.
You can find the notebook at this GitHub link: https://github.com/ShumzZzZz/GPT-Rambling/blob/main/LangChain%20Specific/langchain_add_memory_to_RetrievalQA.ipynb
Setup:
from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA
from langchain.memory import ConversationBufferMemory
from langchain import PromptTemplate
from langchain.retrievers import TFIDFRetriever
retriever = TFIDFRetriever.from_texts(
    ["Our client, a gentleman named Jason, has a dog whose name is Dobby",
     "Jason has a good friend called Emma",
     "Emma has a cat whose name is Sullivan"])
Then define your custom prompt:
template = """
Use the following context (delimited by <ctx></ctx>) and the chat history (delimited by <hs></hs>) to answer the question:
------
<ctx>
{context}
</ctx>
------
<hs>
{history}
</hs>
------
{question}
Answer:
"""
prompt = PromptTemplate(
    input_variables=["history", "context", "question"],
    template=template,
)
Take note of the input variables you use, especially 'history' and 'question', because you will need to match these when setting up the memory:
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(),
    chain_type='stuff',
    retriever=retriever,
    verbose=True,
    chain_type_kwargs={
        "verbose": True,
        "prompt": prompt,
        "memory": ConversationBufferMemory(
            memory_key="history",
            input_key="question"),
    }
)
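To see why the names must line up, here is a plain-Python sketch (no LangChain required) of how the "stuff" chain fills the template: the retriever supplies context, the memory supplies history under its memory_key, and the user's query arrives under input_key as question. The filled-in strings below are illustrative, not actual chain output.

```python
# Stand-alone illustration of how the template's input variables each get
# supplied by exactly one source: the retriever ("context"), the memory
# ("history"), and the user ("question"). If memory_key did not match the
# template's variable name, "history" would be left unfilled and the chain
# would raise a missing-variable error.
template = (
    "Use the following context (delimited by <ctx></ctx>) and the chat "
    "history (delimited by <hs></hs>) to answer the question:\n"
    "<ctx>\n{context}\n</ctx>\n<hs>\n{history}\n</hs>\n{question}\nAnswer:"
)

inputs = {
    "context": "Jason has a good friend called Emma",        # from the retriever
    "history": "Human: who's the client's friend?\nAI: Emma",  # from the memory
    "question": "and her pet's name is?",                     # the user's query
}

final_prompt = template.format(**inputs)
print(final_prompt)
```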
Now you can call qa.run({"query": "who's the client's friend?"})
"The client's friend is Emma."
and then qa.run("and her pet's name is?")
"Emma's pet's name is Sullivan."
To inspect and verify the memory/chat history: qa.combine_documents_chain.memory
ConversationBufferMemory(chat_memory=ChatMessageHistory(messages=[HumanMessage(content="who's the client's friend?", additional_kwargs={}), AIMessage(content="The client's friend is Emma.", additional_kwargs={}), HumanMessage(content="and her pet's name is?", additional_kwargs={}), AIMessage(content="Emma's pet's name is Sullivan.", additional_kwargs={})]), output_key=None, input_key='question', return_messages=False, human_prefix='Human', ai_prefix='AI', memory_key='history')
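As a rough mental model (an illustrative sketch, not LangChain's actual implementation), ConversationBufferMemory renders that message list into a single transcript string using its human_prefix and ai_prefix, and that string is what gets substituted for {history} on the next call:

```python
# Illustrative stand-in for how ConversationBufferMemory renders its buffer:
# the stored (prefix, text) messages are joined into one transcript string,
# which replaces the {history} variable in the prompt on the next run.
messages = [
    ("Human", "who's the client's friend?"),
    ("AI", "The client's friend is Emma."),
    ("Human", "and her pet's name is?"),
    ("AI", "Emma's pet's name is Sullivan."),
]
history = "\n".join(f"{prefix}: {text}" for prefix, text in messages)
print(history)
```

This is why the follow-up question "and her pet's name is?" resolves "her" correctly: the model sees the earlier exchange in the transcript.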
![enter image description here](https://i.stack.imgur.com/3PcuX.png)