stormy-gold•15mo ago
Audio playback in a Chatbot project or text display in a Voice project
Hey, for a project I need to use the ElevenLabs Text2Speech API to make the bot talk with a custom voice.
The problem I have is that the normal Chatbot project doesn't have an audio block to play audio streams, and the Voice project doesn't put the spoken text in the chat.
Does any of you know how to play an audio data URI stream in a chat project, or how to make the bot display text in a voice project?
I tried using a text block with the following code to play the audio in the chat project, but no audio is played:
<audio autoplay>
  <source src="{audioDataURI}" type="audio/mpeg">
</audio>
Thanks in advance!
1 Reply
The ElevenLabs API can be used to synthesize text to speech for your projects: a Node.js application takes text and voice settings as input and returns the synthesized audio as a data URI. The docs don't cover the exact steps for playing that audio inside a Chatbot project or displaying text in a Voice project, though.
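As a rough sketch of that Node.js side (the endpoint path and `xi-api-key` header follow the public ElevenLabs docs, but verify them against the current API reference; `synthesize` and `toDataURI` are just illustrative names):

```javascript
// Pure helper: wrap raw audio bytes in a data URI a chat widget can reference.
function toDataURI(buffer, mimeType = "audio/mpeg") {
  return `data:${mimeType};base64,${buffer.toString("base64")}`;
}

// Hypothetical wrapper around the ElevenLabs text-to-speech endpoint.
// Requires Node 18+ for the built-in fetch.
async function synthesize(text, voiceId, apiKey) {
  const res = await fetch(
    `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    {
      method: "POST",
      headers: {
        "xi-api-key": apiKey,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ text }),
    }
  );
  if (!res.ok) throw new Error(`TTS request failed: ${res.status}`);
  const audio = Buffer.from(await res.arrayBuffer()); // MP3 bytes
  return toDataURI(audio);
}
```

The data URI returned by `synthesize` is what you would hand to the front end for playback.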
For more detailed instructions and code examples, you can refer to the following resources:
- ElevenLabs API Integration
- Custom Channel API
These resources provide comprehensive guides on integrating custom TTS and embedding audio in your Voiceflow projects. Please note that a custom launch solution requires the ability to host and maintain code on your own infrastructure.