Voiceflow Partners
Marlon · 8mo ago · 11 replies

LLM Streaming Question

Hey all,

I’m curious whether Voiceflow will implement something in the future like Retell has, where the agent step starts speaking as soon as it receives the first tokens of the LLM output, instead of waiting for the entire LLM response before responding (which is Voiceflow’s current behavior).

This would be a game-changer for voice calls because of the latency reduction.
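The latency win comes from flushing partial output to TTS at natural boundaries while later tokens are still being generated. A minimal sketch of that idea (the token source and sentence-boundary heuristic are illustrative, not Voiceflow's or Retell's actual implementation):

```python
def fake_llm_stream():
    """Stand-in for an LLM token stream; a real client would yield
    tokens over the network as they arrive (hypothetical example)."""
    for token in ["Sure, ", "I can ", "help ", "with ", "that. ",
                  "What ", "would ", "you ", "like ", "to know?"]:
        yield token

def stream_to_speech(token_stream, flush_chars=".?!"):
    """Buffer tokens and hand each completed sentence to TTS as soon
    as it is ready, instead of waiting for the full LLM response."""
    buffer = ""
    spoken = []
    for token in token_stream:
        buffer += token
        # Flush at sentence boundaries so playback can start while
        # the rest of the response is still streaming in.
        if buffer.rstrip()[-1:] in flush_chars:
            spoken.append(buffer.strip())
            buffer = ""
    if buffer.strip():
        spoken.append(buffer.strip())
    return spoken

chunks = stream_to_speech(fake_llm_stream())
print(chunks)
```

Here the agent could begin speaking after the fifth token (the first complete sentence) rather than after all ten, which is where the perceived-latency reduction on voice calls comes from.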
Recent Announcements
jacklyn · 4mo ago

# Let your agents search the web 🔎

Hey everyone! We just released the web search tool for the agent step! Your agents can now automatically search the web for information, letting your agent supplement the LLM's knowledge and the data in its knowledge base with live, up-to-date information. Plus...

* You can restrict searches to specific domains, so your agent only searches sites that you own
* This is a tool, so you remain in control of when the agent searches the web
* Results are automatically summarized in a way that your agent can automatically understand

Under the hood, we're using OpenAI's web search API. Give it a try, and let us know what you think!

https://docs.voiceflow.com/changelog/native-web-search-tool
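The domain-restriction option described above can be pictured as a post-filter on search results. A small sketch under assumed names (the result shape and function names are illustrative; the announcement doesn't show Voiceflow's internals):

```python
from urllib.parse import urlparse

def restrict_to_domains(results, allowed_domains):
    """Keep only results whose hostname is in the allow-list,
    mirroring the 'only search sites that you own' option.
    (Hypothetical helper; result dicts are an assumed shape.)"""
    allowed = set(allowed_domains)
    return [r for r in results if urlparse(r["url"]).hostname in allowed]

results = [
    {"url": "https://docs.example.com/guide", "summary": "Setup guide"},
    {"url": "https://random-blog.net/post", "summary": "Unrelated post"},
]
filtered = restrict_to_domains(results, ["docs.example.com"])
print(filtered)
```

Because the search is exposed as a tool, the agent's LLM decides when to invoke it, and a filter like this would apply afterward so only allow-listed sources reach the summarization step.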

