<!-- START: Include on https://convex.dev/components -->

This Convex component enables persistent text streaming. It provides a React
hook for streaming text from HTTP actions while simultaneously storing the data
in the database. This persistence allows the text to be accessed after the
stream ends or by other users.

The most common use case is AI chat applications. The example app (found in the
`example` directory) is just such a simple chat app that demonstrates use of
the component.

Here's what you'll end up with! The left browser window is streaming the chat
body to the client, and the right browser window is subscribed to the chat body
via a database query. The message is only updated in the database on sentence
boundaries, whereas the HTTP stream sends tokens as they come:

![example-animation](./anim.gif)

## Pre-requisite: Convex

You'll need an existing Convex project to use the component. Convex is a hosted
backend platform, including a database, serverless functions, and a ton more you
can learn about [here](https://docs.convex.dev/get-started).

Run `npm create convex` or follow any of the
[quickstarts](https://docs.convex.dev/home) to set one up.

## Installation

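
Installing a Convex component generally means adding its npm package and
registering it in `convex/convex.config.ts`. A minimal sketch, assuming the
package is published as `@convex-dev/persistent-text-streaming` (check the
package's own docs for the exact name), after running
`npm install @convex-dev/persistent-text-streaming`:

```ts
// convex/convex.config.ts — hedged sketch; the package name is an assumption.
import { defineApp } from "convex/server";
import persistentTextStreaming from "@convex-dev/persistent-text-streaming/convex.config";

const app = defineApp();
// Register the component so its tables and functions are mounted in your app.
app.use(persistentTextStreaming);

export default app;
```
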
In `convex/chat.ts`:

```ts
const persistentTextStreaming = new PersistentTextStreaming(
  components.persistentTextStreaming,
);

// Create a stream using the component and store the id in the database with
// ...

export const getChatBody = query({
  // ...
  handler: async (ctx, args) => {
    return await persistentTextStreaming.getStreamBody(
      ctx,
      args.streamId as StreamId,
    );
  },
});

// Create an HTTP action that generates chunks of the chat body
// and uses the component to stream them to the client and save them to the database.
export const streamChat = httpAction(async (ctx, request) => {
  const body = (await request.json()) as { streamId: string };
  const generateChat = async (ctx, request, streamId, chunkAppender) => {
    await chunkAppender("Hi there!");
    await chunkAppender("How are you?");
    // ...
  };

  const response = await persistentTextStreaming.stream(
    ctx,
    request,
    body.streamId as StreamId,
    generateChat,
  );

  // Set CORS headers appropriately.
  // ...
});

// ...

http.route({
  // ...
});
```
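
The comment above mentions creating a stream and storing its id in the
database; that mutation is elided here. A hypothetical sketch of what it might
look like, assuming a `chats` table with a `prompt` field (both names are
illustrative, not prescribed by the component):

```ts
// Hypothetical sketch of the elided mutation: create a stream via the
// component and store its id on a chat document. The "chats" table and
// "prompt" field are illustrative assumptions.
export const createChat = mutation({
  args: { prompt: v.string() },
  handler: async (ctx, args) => {
    const streamId = await persistentTextStreaming.createStream(ctx);
    await ctx.db.insert("chats", { prompt: args.prompt, streamId });
    return streamId;
  },
});
```
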

Finally, in your app, you can now create chats and then subscribe to them via
stream and/or database query as optimal:

```ts
// chat-input.tsx, maybe?
// ...
const { text, status } = useStream(
  api.chat.getChatBody, // The query to call for the full stream body
  new URL(`${convexSiteUrl}/chat-stream`), // The HTTP endpoint for streaming
  driven, // True if this browser session created this chat and should generate the stream
  chat.streamId as StreamId, // The streamId from the chat database record
);
```
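
The `convexSiteUrl` in this snippet is your deployment's HTTP-actions origin:
Convex serves HTTP actions from the `.convex.site` counterpart of the
`.convex.cloud` deployment URL. A small sketch (the deployment name is made
up):

```ts
// Derive the HTTP-actions origin from the deployment URL — e.g. the value
// of VITE_CONVEX_URL in a Vite app. The deployment name below is fictional.
const convexUrl = "https://happy-animal-123.convex.cloud";
const convexSiteUrl = convexUrl.replace(/\.convex\.cloud$/, ".convex.site");
```
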

Let's examine each approach in isolation.

- **HTTP streaming only**: If your app _only_ uses HTTP streaming, then the
  original browser that made the request will have a great, high-performance
  streaming experience. But if that HTTP connection is lost, if the browser
  window is reloaded, if other users want to view the same chat, or if this
  user wants to revisit the conversation later, it won't be possible. The
  conversation is only ephemeral because it was never stored on the server.

- **Database persistence only**: If your app _only_ uses database persistence,
  it's true that the conversation will be available for as long as you want.
  Additionally, Convex's subscriptions will ensure the chat message is updated
  as new text chunks are generated. However, there are a few downsides: one,
  the entire chat body needs to be resent every time it is changed, which is a
  lot of redundant bandwidth to push into the database and over the websockets
  to all connected clients. Two, you'll need to make a difficult tradeoff
  between interactivity and efficiency. If you write every single small chunk
  to the database, this will get quite slow and expensive. But if you batch up
  the chunks into, say, paragraphs, then the user experience will feel laggy.

This component combines the best of both worlds. The original browser that makes
the request will still have a great, high-performance streaming experience. But
the chat body is also stored in the database, so it can be accessed by the
client even after the stream has finished, or by other users, etc.
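
The sentence-boundary batching mentioned earlier is how this balance is
struck: stream each token to the client as it arrives, but only flush complete
sentences to persistence. A minimal, framework-free sketch of the idea (not
the component's actual internals):

```ts
// Sketch: buffer streamed tokens; flush to the database only when a
// sentence boundary (. ! or ?) appears, plus any trailing text at the end.
function makeSentenceBatcher(flushToDb: (sentence: string) => void) {
  let buffer = "";
  return {
    // Called for every token as it is streamed to the client.
    append(token: string) {
      buffer += token;
      // Greedily match everything up to the last sentence-ending character.
      const match = buffer.match(/^[\s\S]*[.!?]/);
      if (match) {
        flushToDb(match[0]);
        buffer = buffer.slice(match[0].length);
      }
    },
    // Flush any trailing partial sentence when the stream ends.
    end() {
      if (buffer.length > 0) flushToDb(buffer);
      buffer = "";
    },
  };
}
```

With this shape, every token reaches the client immediately over HTTP, while
the database sees at most one write per sentence — the interactivity/efficiency
tradeoff from the bullets above.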

## Background

This component is largely based on the Stack post
[AI Chat with HTTP Streaming](https://stack.convex.dev/ai-chat-with-http-streaming).

<!-- END: Include on https://convex.dev/components -->