Break down the agent's response into multiple messages
Is there a way to force the agent step to split its response across multiple messages, like a natural chat conversation, instead of returning everything in one long completion?
I have tried to instruct it in the prompt, but it does not work.
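To clarify what I'm after, here is a minimal client-side workaround sketch: the prompt asks the model to separate logical messages with a delimiter, and the single completion is then split into separate chat messages before sending. The delimiter choice and the function name are my own, not from any framework.

```python
# Hypothetical workaround: prompt the model to emit a delimiter (e.g. "---")
# between logical messages, then split its single completion client-side
# and send each chunk as its own chat message.

def split_into_messages(response: str, delimiter: str = "---") -> list[str]:
    """Split one LLM completion into multiple chat messages."""
    parts = [p.strip() for p in response.split(delimiter)]
    return [p for p in parts if p]  # drop empty chunks

completion = (
    "Hi there!\n"
    "---\n"
    "First, enable streaming in your config.\n"
    "---\n"
    "Let me know if that works."
)
for message in split_into_messages(completion):
    print(message)  # each chunk would be sent as a separate message
```

This works, but it still depends on the model reliably emitting the delimiter, which is why I'm hoping there is a built-in way to do this.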