Gemini text and function call in the same response #1333
Replies: 4 comments
-
I am not sure if the streaming chat model would do that, but you can certainly try.
-
I have not implemented streaming in my PR
-
Thanks @geoand @lordofthejars. Alex, I know, I just wanted to keep you in the loop given your Gemini interest. Yes, it is probably not a streaming-model case; I've been thinking whether having something like …
-
It is not a blocker for me, but it is good to see that LangChain4j already offers this interface. Actually, this reminds me of some Ollama code I saw in Quarkus LangChain4j... I guess it can be sorted out easily enough.
-
I have a chat bot application, with Gemini returning this content during the chat:
I'd like to get that text sent to the user, let Gemini continue with its function calls, and then, once it is done, return the final response.
Does this mean Quarkus LangChain4j needs to support StreamingChatModel for Gemini, or is there another way?
CC @geoand @lordofthejars
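For context, the pattern described here — forward the model's text to the user as it arrives, buffer any function calls in the same response, and execute them once the turn completes — can be sketched without the library. The sketch below is self-contained and the types (`Handler`, `ToolCall`, `Turn`, `onPartialText`, `onToolCall`, `onComplete`) are simplified stand-ins, not the actual LangChain4j API, whose streaming handler interfaces differ in naming and detail:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Simplified stand-ins illustrating "text and function call in the same
// response": stream text tokens out immediately, collect tool calls, and
// hand both back when the model's turn finishes.
public class StreamingToolDemo {

    record ToolCall(String name, String argsJson) {}

    record Turn(String text, List<ToolCall> toolCalls) {}

    static class Handler {
        private final StringBuilder text = new StringBuilder();
        private final List<ToolCall> toolCalls = new ArrayList<>();
        private final Consumer<String> userSink;

        Handler(Consumer<String> userSink) {
            this.userSink = userSink;
        }

        // Text tokens are forwarded to the user as they stream in.
        void onPartialText(String token) {
            text.append(token);
            userSink.accept(token);
        }

        // Tool calls are buffered; they run after the turn completes.
        void onToolCall(ToolCall call) {
            toolCalls.add(call);
        }

        Turn onComplete() {
            return new Turn(text.toString(), List.copyOf(toolCalls));
        }
    }

    public static void main(String[] args) {
        StringBuilder shownToUser = new StringBuilder();
        Handler h = new Handler(shownToUser::append);

        // Simulated model output: text first, then a function call in the
        // same response, as in the question above.
        h.onPartialText("Let me check the weather. ");
        h.onToolCall(new ToolCall("getWeather", "{\"city\":\"Athens\"}"));
        Turn turn = h.onComplete();

        System.out.println(shownToUser);
        System.out.println(turn.toolCalls().size());
    }
}
```

After `onComplete()`, the caller would execute the buffered tool calls, append their results to the conversation, and invoke the model again until a turn arrives with no tool calls, which becomes the final response.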