Break down agent's response into multiple messages
Is there a way to force the Agent step to split its response into multiple messages, like a natural chat, instead of returning everything in one response the way LLMs usually do?
I have tried to instruct it in the prompt, but it does not work.
1 Reply
At the moment, an Agent step always returns a single text trace per turn. The LLM sits behind that step, and Voiceflow renders whatever content comes back as one chat bubble, so adding "please reply in several messages" to the prompt won't split it; the platform treats the whole reply as one message.
Workarounds if you want the user to see a stream of smaller bubbles:
• Mix in deterministic Message (or other Talk) steps. Let the Agent finish its turn, follow an exit path, and then send two or three Message steps, each with the message-delay option (e.g. 500-1000 ms) to create a natural cadence.
• Have the Agent return a short reply first and route back to another Agent or Message sequence for the rest of the information.
• If you need the LLM itself to decide where to break the content, call a Function or API step that returns the parts and loop over them with Message steps (see the sketch after this list); the splits still have to be delivered as separate steps.
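For the Function route in the last bullet, the splitting itself is just string handling. Here is a minimal sketch in plain TypeScript; it is not tied to any specific Voiceflow Function signature, and the `splitReply` helper, `maxLen` parameter, and `part_1`/`part_2`/… variable names are made up for illustration. You would map the resulting chunks onto flow variables and read one per Message step.

```typescript
// Hypothetical helper (illustration only, not a Voiceflow API): split a long
// LLM reply into chat-sized chunks, one per paragraph, capping each at maxLen.
function splitReply(reply: string, maxLen = 280): string[] {
  const paragraphs = reply
    .split(/\n\s*\n/)            // break on blank lines
    .map((p) => p.trim())
    .filter((p) => p.length > 0);

  const parts: string[] = [];
  for (const para of paragraphs) {
    if (para.length <= maxLen) {
      parts.push(para);
      continue;
    }
    // Over-long paragraph: break it again at sentence boundaries.
    let current = "";
    for (const sentence of para.split(/(?<=[.!?])\s+/)) {
      if (current && current.length + sentence.length + 1 > maxLen) {
        parts.push(current);
        current = sentence;
      } else {
        current = current ? `${current} ${sentence}` : sentence;
      }
    }
    if (current) parts.push(current);
  }
  return parts;
}

// Example: turn the chunks into variables that downstream Message steps can read.
const chunks = splitReply(
  "First thought.\n\nSecond thought with more detail.\n\nA short wrap-up."
);
// e.g. { part_1: "...", part_2: "...", part_3: "..." }, one per Message step
const outputVars = Object.fromEntries(
  chunks.map((text, i): [string, string] => [`part_${i + 1}`, text])
);
console.log(outputVars);
```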
So, there isn’t a setting inside the Agent step that will automatically split its own response today; you’ll need to structure the flow with additional Message/Talk steps to get the multi-bubble effect.
Docs reference:
Message step → Message delay section – https://docs.voiceflow.com/docs/message-step