[WIP] chore: bump vercel ai #512
base: main
Conversation
Walkthrough

This pull request updates the handling of streaming data.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant API
    participant Stream
    participant CallbackManager
    participant ChatEngine
    Client->>API: POST /chat request
    API->>Stream: createDataStreamResponse()
    API->>CallbackManager: Set up callbacks (onFinal)
    CallbackManager->>ChatEngine: Process chat events
    ChatEngine-->>CallbackManager: Return event data
    CallbackManager->>API: Finalize response
    API->>Client: Stream response with annotations
```
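The core of the flow in the diagram, forwarding chunks as they arrive and firing an `onFinal` callback only once the source stream is exhausted, can be sketched with plain Web Streams and no SDK dependency. `mergeWithOnFinal` and all names here are illustrative, not part of the Vercel AI SDK or this repo:

```typescript
// Illustrative sketch: consume a source stream chunk by chunk and
// invoke onFinal with the accumulated text only after the stream ends,
// mirroring the onFinal hook the PR wires into the data stream.
async function mergeWithOnFinal(
  source: ReadableStream<string>,
  onFinal: (full: string) => void,
): Promise<string[]> {
  const chunks: string[] = [];
  const reader = source.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value); // forward each chunk as it arrives
  }
  onFinal(chunks.join("")); // fires only once the source is fully consumed
  return chunks;
}

// Example: a two-chunk source stream.
const source = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("Hello, ");
    controller.enqueue("world");
    controller.close();
  },
});

mergeWithOnFinal(source, (full) => console.log(full)); // prints "Hello, world"
```

The point of the pattern is that `onFinal` sees the complete response (e.g. to persist it or attach annotations) without blocking the incremental delivery of chunks to the client.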
```ts
LlamaIndexAdapter.mergeIntoDataStream(response, {
  dataStream: vercelStreamData,
  callbacks: { onFinal },
});
```
It looks like this merge function doesn't work as expected: the markdown content is still fully generated rather than streamed incrementally.
`StreamData` is deprecated; use `createDataStream`, `createDataStreamResponse`, and `pipeDataStreamToResponse` instead.
Change in this PR:
https://github.com/vercel/ai/pull/3919/files#diff-44d4973726b4ae9362e9d6f34398def14a0fd80dc2d97fee6b6160f3910aa27e
Follow this example to update:
https://github.com/vercel/ai/blob/main/examples/next-openai/app/api/use-chat-persistence-single-message-tools/route.ts
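Following that example, the deprecated `StreamData` plumbing would be replaced by `createDataStreamResponse`, which hands the handler a writer for streaming data and message annotations. A rough sketch of what the updated route could look like; the `chatEngine` setup and the annotation payload are placeholders, not code from this repo, and the exact option names should be checked against the linked example:

```ts
import { createDataStreamResponse, LlamaIndexAdapter } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  return createDataStreamResponse({
    execute: async (dataStream) => {
      // chatEngine is assumed to be constructed elsewhere in the route.
      const response = await chatEngine.chat({
        message: messages[messages.length - 1].content,
        stream: true,
      });

      LlamaIndexAdapter.mergeIntoDataStream(response, {
        dataStream,
        callbacks: {
          // Runs once the LlamaIndex stream is fully consumed.
          onFinal: async () => {
            dataStream.writeMessageAnnotation({ type: "sources", data: [] });
          },
        },
      });
    },
    onError: (error) => `Error: ${error}`,
  });
}
```

This removes the standalone `StreamData` instance entirely: the writer passed to `execute` takes over both chunk forwarding and annotation writes, which is the migration the deprecation notice points at.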