modern-teal
modern-teal2y ago

unable to find relevant answer

I have a good KB of docs, but when I ask the bot questions, it keeps saying "unable to find relevant answer," and it consumes over 65,000 tokens without helping at all. What is going on? Please help.
8 Replies
modern-teal
modern-tealOP2y ago
I have now lost all my tokens in just a few seconds. Please help, as I already have some models deployed.
modern-teal
modern-tealOP2y ago
The models I have deployed are saying [token quota exceeded].
W. Williams (SFT)
You are passing in 66,000 tokens as context in your AI response.
modern-teal
modern-tealOP2y ago
In my instructions, I have a number of URLs to help the bot link a product to the answer. I had set a restriction of about 800 tokens max, and the questions I ask it are just one-sentence questions.
conscious-sapphire
conscious-sapphire17mo ago
@drbanny101 [Lvl. 2] Did you find a solution to this?
modern-teal
modern-tealOP17mo ago
No, I got no help from VF at the moment.
optimistic-gold
optimistic-gold17mo ago
I think you may need to reorganize your project. Instead of including the URLs in your prompt, put them in your KB. That way you don't chew up tokens trying to ground the LLM with URLs on every request; instead, vectorize everything in the KB so the LLM can retrieve and source it for you without being such a token drain. Token management is a huge consideration in LLM ops.
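To see why this matters, here is a rough back-of-the-envelope sketch of the token cost of inlining URLs in the instructions versus keeping the prompt lean. The ~4-characters-per-token rule of thumb, the URL count, and the example URLs are all assumptions for illustration; real tokenizers and real prompts will differ.

```python
def approx_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token (rule of thumb)."""
    return max(1, len(text) // 4)

# Hypothetical product URLs pasted into the bot's instructions.
urls = [f"https://example.com/products/item-{i}" for i in range(200)]

# Prompt with every URL inlined: this cost is paid on EVERY request.
inline_prompt = "Answer using these links:\n" + "\n".join(urls)

# Lean prompt that defers the URLs to the vectorized KB.
lean_prompt = "Answer using the knowledge base; cite the matched source URL."

print(approx_tokens(inline_prompt))  # on the order of thousands of tokens
print(approx_tokens(lean_prompt))    # a handful of tokens
```

With retrieval, only the few KB chunks that actually match the question are added to the context, so per-request token usage stays close to the lean-prompt figure rather than the inlined one.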
modern-teal
modern-tealOP17mo ago
Thank you. I re-purchased tokens and improved the bot a few days ago.