HA
HA2mo ago

A Question About LLM

Good afternoon! Are the LLMs used in Voiceflow called cloud LLMs? A client wants their own LLM and I wasn't sure if that's even possible.
1 Reply
Tico | Voiceflow Assistant
Hi there! Voiceflow’s built-in “LLM” blocks (and the default Assistant model) call hosted cloud models such as OpenAI GPT-3.5/4, Anthropic Claude, and Cloudflare Workers AI – so yes, they are cloud-hosted LLMs.

If your client wants to use their own LLM (e.g. a model they host in their own VPC or on-prem), that’s totally possible – you simply “bring your own LLM”:
• Expose the model through an HTTP endpoint (ideally OpenAI-compatible chat-completion JSON, but any REST API works).
• Inside Voiceflow, call that endpoint with an API Step or a Function/Code Step, pass the user’s messages, and return the model’s response into a variable that you then send back to the user.

Because all of the heavy lifting happens over HTTPS, nothing needs to run locally inside Voiceflow – you keep full control of the model, infra, and data residency.

Tutorial + sample project:
https://docs.voiceflow.com/ai-resources/bring-your-own-llm
(or search “Bring your own LLM – Voiceflow”) So, out-of-the-box models = cloud LLMs, but you can absolutely plug in a self-hosted/private model whenever you need.
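To make the flow above concrete, here’s a minimal sketch of the request/response shapes involved, assuming your self-hosted model speaks the OpenAI-compatible chat-completions format. The endpoint URL, model name, and helper names here are illustrative placeholders, not Voiceflow or OpenAI APIs:

```javascript
// Hypothetical helpers for a Voiceflow Function/Code step that calls a
// self-hosted, OpenAI-compatible LLM endpoint. Names and URL are assumptions.

// Build the JSON body for an OpenAI-style /v1/chat/completions request.
function buildChatPayload(userMessage, history = []) {
  return {
    model: "my-private-model", // assumption: whatever your server exposes
    messages: [...history, { role: "user", content: userMessage }],
    temperature: 0.7,
  };
}

// Pull the assistant's reply text out of an OpenAI-style response body.
function extractReply(responseBody) {
  return responseBody?.choices?.[0]?.message?.content ?? "";
}

// Inside the Function step you would then do roughly:
//   const res = await fetch("https://llm.internal.example.com/v1/chat/completions", {
//     method: "POST",
//     headers: { "Content-Type": "application/json", Authorization: `Bearer ${token}` },
//     body: JSON.stringify(buildChatPayload(userQuestion)),
//   });
//   const reply = extractReply(await res.json());
// ...and assign `reply` to a Voiceflow variable to send back to the user.
```

Because the payload is standard chat-completion JSON, the same step works whether the endpoint is a VPC-hosted open model behind vLLM/Ollama or any other server exposing that format.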
