LLM Intent search even when using "Only Intents For This Step"
I have "Only Intents For This Step" turned on, but it appears it's still using the LLM to search, consuming unnecessary tokens.
I'm using Capture User Reply and then Response AI via KB.
In Voiceflow, the "Only Intents For This Step" option is used to limit the scope of the Intent step to only consider the intents that are directly linked to it. However, this doesn't necessarily prevent the Large Language Model (LLM) from generating responses based on the user's input.
The LLM is a part of the Response AI step, which is designed to generate dynamic responses based on the user's input and the context of the conversation. Even when you're using the "Only Intents For This Step" option, the LLM can still generate responses if the user's input matches the prompt you've provided in the Response AI step.
If you're using the Capture User Reply step followed by the Response AI via Knowledge Base (KB), the LLM will generate a response based on the user's entire reply that was captured and the information available in the KB. The LLM doesn't directly search the KB, but it uses the information from the KB to inform its response.
If you want to limit the use of tokens, you might consider simplifying your prompts or reducing the complexity of your conversation design. Remember that each interaction with the LLM, including each generated response, will use tokens.
Also, keep in mind that the Response AI step is an experimental feature and it's not recommended for production use cases due to its potential to generate misleading or false information.
For more detailed information on how to use these features, you can refer to the Voiceflow documentation on Conversation Memory and Response AI.
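To make the behavior described above concrete, here is a minimal sketch of the capture → intent classification → KB response flow. This is illustrative pseudocode only: Voiceflow does not expose a code API, and every name here (`classify_intent`, `kb_response`, `handle_capture`) is hypothetical.

```python
# Illustrative sketch of the reported behavior; all names are hypothetical,
# not real Voiceflow APIs.

def classify_intent(utterance, scope):
    """Stand-in for the NLU/LLM intent classifier. In the behavior reported
    in this thread, this step still runs (and consumes tokens) even when the
    capture is scoped to "Only Intents For This Step"."""
    for intent in scope:
        if intent in utterance.lower():
            return intent
    return None  # no match: fall through to the step's no-match path

def kb_response(utterance, kb):
    """Stand-in for Response AI via Knowledge Base: the LLM is given the
    captured reply plus retrieved KB snippets, rather than searching the KB
    directly."""
    snippets = [text for key, text in kb.items() if key in utterance.lower()]
    return " ".join(snippets) or "Sorry, I couldn't find that in the KB."

def handle_capture(utterance, step_intents, kb):
    # Classification runs first even with an empty or scoped intent list --
    # this is the unexpected token cost the thread is about.
    classify_intent(utterance, scope=step_intents)
    # The KB-backed response generation then consumes tokens as expected.
    return kb_response(utterance, kb)
```

The point of the sketch is where the token costs land: one in `classify_intent` (unexpected when the step has no local intents) and one in `kb_response` (expected).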
Right, but I don't even want intent to be on at all. I just need the AI to use the KB to form a response based on the conversation.
I don't understand why it'd need to decipher intent in this case. I've turned off global intents because I'm obviously already in the intent that I intended the user to land in.
Turn off intents for that capture then
I did, it's set to only intents for this step
Unless there is a non-obvious way to fully turn off intents
This is becoming a major issue for me; it doesn't happen when the LLM is off.


Can you DM me and I'll have a look
@denys @bamelement Looks like there is an issue with the LLM intents. If you set the scope to local and there is no intent within that capture step, it still runs the LLM intent checking.
I have seen this and was able to repro the issue
@NiKo | Voiceflow thought you might wanna see this
foreign-sapphire•17mo ago
I was also able to reproduce it.
@denys @bamelement any update on this bug?
Hey all, any update? I know tokens don't cost a ton, but I'd really rather not go live until this is fixed. If each user makes a ticket, it'll cost me a minimum of 1500 tokens because I ask for username, email and description of the issue.
Each one is using tokens even though I have Intent Scoping set to Only intents in this step.
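For context, the rough math behind the 1500-token figure above. The per-classification cost is inferred from the numbers in this thread, not a documented Voiceflow rate:

```python
# Back-of-envelope token cost per support ticket.
# NOTE: 500 tokens per classification is an assumption inferred from the
# "minimum of 1500 tokens" figure, not a documented rate.
fields_per_ticket = 3            # username, email, issue description
tokens_per_classification = 500  # assumed per-capture classification cost
tokens_per_ticket = fields_per_ticket * tokens_per_classification
print(tokens_per_ticket)  # 1500
```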

genetic-orange•17mo ago
Good video on this just dropped from Conner https://youtu.be/4Pe6bhzE3Kc?si=UUtC9RBp0FX7nIcL
adverse-sapphire•16mo ago
Hey @Brandersnatcher - apologies for the delay. This isn't a bug per se; it has always worked this way, just less visibly until now. Even though the step is intent scoped, all user utterances go through our NLU for classification every time a query is provided. The team is evaluating an option to turn this off so you are not consuming tokens in the example provided.
Interesting, thanks for informing me.
I'll check this vid out today @bloxtersamiad ty
Just watched, it didn't really tell me anything new. He just mentioned that he is also having this issue.
IMO this is a bug, or we need a way to make it clear that the user is filling out a form, where regardless of what they type it goes to the next step. I just need their username and email, and I don't want to be charged tokens for collecting them. I'm not really interested in the user telling me they want to start over or change paths at this point in the flow @bamelement
Is there a place I can subscribe to get updates on changes like this, should it get fixed?