modern-teal•2y ago
unable to find relevant answer
I have a good KB of docs, but when I ask the bot questions, it keeps saying "unable to find relevant answer," and it consumes over 65,000 tokens without helping at all. What is going on? Please help.

8 Replies
modern-tealOP•2y ago
I have now lost all my tokens in just a few seconds. Please help, as I already have some models deployed.

modern-tealOP•2y ago
The models I have deployed are saying [token quota exceeded]. You are passing 66,000 tokens into your AI response as context.
modern-tealOP•2y ago
In my instructions, I have a number of URLs to help the bot link a product to the answer.
I had a restriction of about 800 tokens max.
The questions I ask it are just one-sentence questions.
conscious-sapphire•17mo ago
@drbanny101 [Lvl. 2] Did you find a solution to this?
modern-tealOP•17mo ago
No, I've gotten no help from VF so far.
optimistic-gold•17mo ago
I think you may need to reorganize your project. Instead of including the URLs in your prompt, put them in your KB. That way you don't chew up tokens trying to ground the LLM with URLs in every request. Vectorize everything in the KB so the LLM can retrieve the right source on its own without draining your token quota. Token management is a huge consideration in LLM ops.
modern-tealOP•17mo ago
Thank you. I re-purchased tokens and improved the bot a few days ago.