sunny-green
15mo ago

Creating Memory for a Bot Referencing a KB

I am making a bot to help customer service agents quickly look up information on different products while they are on a call with customers. To make the conversation more natural, I want to use memory so that if the user asked about the product "Blood Sugar Protector", the bot knows that follow-up questions are also about "Blood Sugar Protector". For example:

Q: "What is Blood Sugar Protector?"
A: "Blood Sugar Protector is a blood sugar supplement..."
Q: "What ingredients does it contain?"
A: "Blood Sugar Protector contains..."
Q: "What are some benefits?"
A: "Benefits of Blood Sugar Protector include..."

The way I set this up is to assign the question, i.e. {last_utterance}, to a new variable {memory} in one block, and then instruct my Response AI block in the next block to answer {last_utterance} based on the variable {memory}. When the user asks a second question, I route the bot back to the Response AI block, hoping it will use memory to answer without the user needing to specify the product name again. This flow runs on a loop so the user can keep asking questions.

This setup has been working for the second question every time, but on the third and fourth questions the bot sometimes forgets the context and answers while referencing a different product than the one asked about in the first question. Any idea what's going on or how I can troubleshoot this? Thank you all in advance.
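The flow described above can be sketched in plain Python to make the failure mode easier to see. This is a hypothetical stand-in, not Voiceflow's API: {memory} is set once from the first {last_utterance}, so later answers lean entirely on whatever built-in memory the AI step has, which is where drift can creep in.

```python
def answer(last_utterance: str, memory: str) -> str:
    # Stand-in for the Response AI block: "answer {last_utterance}
    # based on the variable {memory}".
    return f"Answering {last_utterance!r} in the context of {memory!r}"

# {memory} is assigned once on the first pass and never updated.
memory = "What is Blood Sugar Protector?"

follow_ups = [
    "What ingredients does it contain?",
    "What are some benefits?",
    "How should it be taken?",
]

# The loop routes every follow-up back through the same answer step.
# Since {memory} only ever holds the first question, nothing in this
# flow itself prevents the AI from latching onto a different product
# mentioned elsewhere in its own conversation history.
for question in follow_ups:
    print(answer(question, memory))
```

One thing this sketch suggests checking: whether {memory} is being overwritten on later loop iterations, or whether the AI step's own turn history is supplying competing product names.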
2 Replies
Tico | Voiceflow Assistant
To ensure your bot retains context across multiple questions, you can leverage the Memory feature in Voiceflow's AI steps. Here's how to set this up:

Leveraging Memory in AI Steps
1. Use Memory and Prompt: This option will include the previous 10 turns in the conversation along with your prompt, providing the most context for the AI to generate accurate responses.
2. Set Up Memory in AI Steps:
- Set AI Step: If you're using the Memory option on any of your AI Steps, we will automatically include the previous 10 turns (user inputs and system responses) in the conversation into the prompt to provide context.
- Response AI Step: You have the option to use Memory only, which will pass only the previous 10 turns of the conversation to the AI and allow it to respond without any guidance from you.

Troubleshooting Context Loss
1. Check Memory Configuration: Ensure that the Memory option is enabled on all relevant AI steps. This will help the AI retain the context of the conversation.
2. Token Limits: Be mindful of token usage. If the conversation exceeds the token limit, the AI might lose context. Voiceflow saves the last 10 turns, but if the conversation is too lengthy, earlier context might be lost.
3. Testing: Use the Test Tool to simulate conversations and observe how the AI retains context. This will help you identify where the context might be getting lost.

Summary
By configuring your AI steps to use both Memory and Prompt, you can ensure that the bot retains the context of the conversation across multiple questions. Regular testing and monitoring token usage will help maintain the accuracy of responses. For more detailed information, you can refer to the Conversation Memory documentation.
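The "previous 10 turns" mechanism described above can be illustrated with a minimal sketch. Voiceflow manages this window internally when Memory is enabled; the code below only shows the general idea of a rolling turn buffer prepended to a prompt, and all names here are illustrative, not part of Voiceflow's API.

```python
from collections import deque

MAX_TURNS = 10  # assumption: mirrors the 10-turn window described above

def build_prompt(history: deque, prompt: str) -> str:
    # Prepend the stored turns to the prompt so the model sees
    # recent context alongside the current instruction.
    transcript = "\n".join(f"{role}: {text}" for role, text in history)
    return f"{transcript}\n\n{prompt}"

# deque(maxlen=...) silently drops the oldest turn once full, which is
# why very long conversations can lose their earliest context.
history = deque(maxlen=MAX_TURNS)
history.append(("user", "What is Blood Sugar Protector?"))
history.append(("assistant", "Blood Sugar Protector is a blood sugar supplement..."))
history.append(("user", "What ingredients does it contain?"))

print(build_prompt(history, "Answer the user's latest question."))
```

This also shows why the original question can fall out of scope around the third or fourth exchange: each question-and-answer pair consumes two turns, so the first mention of the product is pushed toward the edge of the window as the loop continues.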
W. Williams (SFT)
This is complicated to explain, but it can be done. Just DM me.
