Is it normal for each AI response to use around 1400-1800 tokens?
Just wondering if I should be using the knowledge base query API, or whether the KB response shown in the UI amounts to the same thing.
I'm a bit nervous about launching my bot and burning through the quota once our users start trying it out.