Joe Valley · 3mo ago

AI responses just randomly stop mid-sentence.

I'm trying to hold a continuous conversation thread with the chatbot, but it keeps stopping mid-sentence throughout the conversation. I've tried adjusting the agent by editing the script details, to no avail. Here's an example of text from the AI agent: "Is there a particular aspect of this that you'd like me to dive deeper into? I'm curious what." And that's it. It doesn't finish the sentence. Any help is appreciated!
3 Replies
Tico | Voiceflow Assistant
Most of the time, a reply that cuts off mid-sentence means the step-level Max Tokens setting is too low: the model hits that limit and stops.

What to check:
1. Open the AI step that's producing the reply (Response AI, Prompt, or Set AI).
2. Click "Settings" → "Model settings".
3. Raise Max tokens to a larger value (e.g. 200–400 for short answers, 800+ for long-form).
   • This is the maximum number of output tokens the model is allowed to generate.
   • If it's set to something small (50, 100, …), the model stops once that limit is reached, usually right in the middle of a sentence.
4. Save, then test again in Preview or the chat widget.

Extra tips:
• Each AI step has its own Max tokens value, so repeat the change anywhere you call the model.
• If you want one place to manage it, move the reply logic into a single Prompt step or Function and adjust the Max tokens there.
• Very long conversations can also hit the model's total context limit (input + output). If your flows are extremely long, shorten the system/user messages you include or trim vf_memory in Settings > Behaviour > General.

After increasing Max tokens you should get full, continuous answers instead of truncated ones. Docs: https://docs.voiceflow.com
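To see why a low cap produces exactly this symptom, here is a minimal, self-contained sketch. It is a simulation, not Voiceflow or any model API: `truncate_reply` is a hypothetical helper, and whitespace splitting stands in for a real tokenizer. It mirrors the `"stop"` vs `"length"` finish reasons most LLM APIs report.

```python
def truncate_reply(text: str, max_tokens: int) -> tuple[str, str]:
    """Simulate a model output cap: emit at most max_tokens tokens.

    Returns (reply, finish_reason): "stop" if the model finished on its
    own, "length" if the cap was hit and the reply was cut short.
    """
    tokens = text.split()  # crude stand-in for a real tokenizer
    if len(tokens) <= max_tokens:
        return text, "stop"
    return " ".join(tokens[:max_tokens]), "length"


full = ("Is there a particular aspect of this that you'd like me to "
        "dive deeper into? I'm curious what part interests you most.")

reply, reason = truncate_reply(full, max_tokens=18)
print(reason)  # "length": the cap was hit
print(reply)   # ends "... I'm curious what", just like the thread's example
```

With the cap at 18 tokens, the output stops exactly where the original poster's chatbot did; raising the cap lets the sentence complete and the finish reason becomes `"stop"`.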
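The context-limit tip (trimming long histories) can be sketched the same way. This is a generic sliding-window illustration, not Voiceflow's actual vf_memory behavior: `trim_history` is a hypothetical helper, and word count is a rough proxy for tokens.

```python
def trim_history(messages: list[str], budget: int) -> list[str]:
    """Keep only the most recent messages whose combined (approximate)
    token count fits within budget, preserving chronological order."""
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):   # walk from newest to oldest
        cost = len(msg.split())      # word count as a token proxy
        if used + cost > budget:
            break                    # older messages no longer fit
        kept.append(msg)
        used += cost
    return list(reversed(kept))      # restore oldest-first order
```

Trimming the oldest turns first keeps the recent context the model needs while freeing room in the input + output budget for a complete reply.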
W. Williams (SFT)
Increase the max tokens
Joe Valley (OP) · 3mo ago
Thank you!