LLM empty response fix #423

Open
wants to merge 15 commits into main
Conversation

hiamitabha
Contributor

Currently wirepod crashes if the LLM returns empty content as part of its streaming responses.

For some providers/models this appears to be a common occurrence.

This fix prevents wirepod from crashing. Without it, wirepod panics with:

Bot 00902161 Transcribed text: what is two plus three
Not a custom intent
Making LLM request for device 00902161...
Using deepseek-ai/DeepSeek-V3
Using remembered chats, length of 0 messages
LLM stream response:
Hmm, let me think about that!
Bot 00902161 Intent Sent: intent_greeting_hello
No Parameters Sent
{{playAnimationWI||thinking}} Hmm, let me think...
panic: runtime error: index out of range [0] with length 0

goroutine 147 [running]:
github.com/kercre123/wire-pod/chipper/pkg/wirepod/ttr.StreamingKGSim.func2()
/mnt/ssddrive/alldata/code/wire-pod/chipper/pkg/wirepod/ttr/kgsim.go:348 +0xe10
created by github.com/kercre123/wire-pod/chipper/pkg/wirepod/ttr.StreamingKGSim in goroutine 26
/mnt/ssddrive/alldata/code/wire-pod/chipper/pkg/wirepod/ttr/kgsim.go:294 +0x10b8
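For reference, below is a minimal sketch of the kind of guard this fix describes, assuming a sashabaranov/go-openai style streaming loop; the actual code in kgsim.go may differ, and the model/key values are placeholders:

```go
package main

import (
	"context"
	"errors"
	"fmt"
	"io"

	openai "github.com/sashabaranov/go-openai"
)

// streamLLM accumulates the streamed completion, skipping chunks that
// carry no choices or empty delta content instead of indexing into them.
func streamLLM(ctx context.Context, client *openai.Client, req openai.ChatCompletionRequest) (string, error) {
	stream, err := client.CreateChatCompletionStream(ctx, req)
	if err != nil {
		return "", err
	}
	defer stream.Close()

	var full string
	for {
		response, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			return full, err
		}
		// Some providers/models send chunks with an empty Choices slice.
		// Indexing response.Choices[0] unconditionally panics with
		// "index out of range [0] with length 0", so skip such chunks.
		if len(response.Choices) == 0 {
			continue
		}
		delta := response.Choices[0].Delta.Content
		if delta == "" {
			continue
		}
		full += delta
	}
	return full, nil
}

func main() {
	// Hypothetical client setup for illustration only.
	client := openai.NewClient("YOUR_API_KEY")
	text, err := streamLLM(context.Background(), client, openai.ChatCompletionRequest{
		Model:    "deepseek-ai/DeepSeek-V3",
		Messages: []openai.ChatCompletionMessage{{Role: openai.ChatMessageRoleUser, Content: "what is two plus three"}},
		Stream:   true,
	})
	if err != nil {
		fmt.Println("stream error:", err)
	}
	fmt.Println(text)
}
```

The key point is simply checking len(response.Choices) before indexing, so a chunk with no choices (or no content) is ignored rather than crashing the goroutine.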
