
Commit 6ea6366

format
1 parent 395aa15 commit 6ea6366

File tree

21 files changed: +316 −313 lines changed


.prettierrc.json

Lines changed: 2 additions & 1 deletion
@@ -1,3 +1,4 @@
 {
-  "trailingComma": "es5"
+  "trailingComma": "all",
+  "proseWrap": "always"
 }
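The new `"proseWrap": "always"` option is what drives the bulk of this commit: Prettier hard-wraps markdown prose at `printWidth` (80 columns by default), producing the rewrapped paragraphs throughout the README and example docs. A rough stand-in for that reflow, sketched with `fold` rather than Prettier itself (Prettier's reflowing is smarter; the sample sentence is taken from the README):

```shell
# Approximate Prettier's proseWrap "always": reflow prose at 80 columns.
# `fold -s` breaks lines at spaces rather than mid-word.
text="This Convex component enables persistent text streaming. It provides a React hook for streaming text from HTTP actions while simultaneously storing the data in the database."
echo "$text" | fold -s -w 80
```

Every rewrapped hunk in this commit is essentially this 80-column reflow applied by Prettier.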

CONTRIBUTING.md

Lines changed: 2 additions & 1 deletion
@@ -42,7 +42,8 @@ git push --tags
 
 #### Alpha release
 
-The same as above, but it requires extra flags so the release is only installed with `@alpha`:
+The same as above, but it requires extra flags so the release is only installed
+with `@alpha`:
 
 ```sh
 npm version prerelease --preid alpha
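The `--preid alpha` flag only affects the version string; keeping the release off the default `latest` dist-tag presumably happens at publish time via `npm publish --tag alpha`. A throwaway sketch of the version bump itself, assuming `npm` and `node` are on your PATH (the package name `demo` is made up):

```shell
# Show what `npm version prerelease --preid alpha` does to a version,
# in a throwaway directory so no real project is touched.
dir=$(mktemp -d)
cd "$dir"
printf '{"name":"demo","version":"0.1.0"}' > package.json
npm version prerelease --preid alpha --no-git-tag-version
node -p "require('./package.json').version"   # 0.1.1-alpha.0
# A publish under a non-default dist-tag would then look like:
#   npm publish --tag alpha
# so plain `npm install <pkg>` keeps resolving to latest, and only
# `npm install <pkg>@alpha` opts in to the prerelease.
```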

README.md

Lines changed: 36 additions & 34 deletions
@@ -4,29 +4,30 @@
 
 <!-- START: Include on https://convex.dev/components -->
 
-This Convex component enables persistent text streaming. It provides a React hook
-for streaming text from HTTP actions while simultaneously storing the data in the
-database. This persistence allows the text to be accessed after the stream ends
-or by other users.
+This Convex component enables persistent text streaming. It provides a React
+hook for streaming text from HTTP actions while simultaneously storing the data
+in the database. This persistence allows the text to be accessed after the
+stream ends or by other users.
 
-The most common use case is for AI chat applications. The example app (found in the
-`example` directory) is a just such a simple chat app that demonstrates use of the
-component.
+The most common use case is for AI chat applications. The example app (found in
+the `example` directory) is a just such a simple chat app that demonstrates use
+of the component.
 
-Here's what you'll end up with! The left browser window is streaming the chat body to the client,
-and the right browser window is subscribed to the chat body via a database query. The
-message is only updated in the database on sentence boundaries, whereas the HTTP
-stream sends tokens as they come:
+Here's what you'll end up with! The left browser window is streaming the chat
+body to the client, and the right browser window is subscribed to the chat body
+via a database query. The message is only updated in the database on sentence
+boundaries, whereas the HTTP stream sends tokens as they come:
 
 ![example-animation](./anim.gif)
 
 ## Pre-requisite: Convex
 
-You'll need an existing Convex project to use the component.
-Convex is a hosted backend platform, including a database, serverless functions,
-and a ton more you can learn about [here](https://docs.convex.dev/get-started).
+You'll need an existing Convex project to use the component. Convex is a hosted
+backend platform, including a database, serverless functions, and a ton more you
+can learn about [here](https://docs.convex.dev/get-started).
 
-Run `npm create convex` or follow any of the [quickstarts](https://docs.convex.dev/home) to set one up.
+Run `npm create convex` or follow any of the
+[quickstarts](https://docs.convex.dev/home) to set one up.
 
 ## Installation
 
@@ -59,7 +60,7 @@ In `convex/chat.ts`:
 
 ```ts
 const persistentTextStreaming = new PersistentTextStreaming(
-  components.persistentTextStreaming
+  components.persistentTextStreaming,
 );
 
 // Create a stream using the component and store the id in the database with
@@ -87,15 +88,15 @@ export const getChatBody = query({
   handler: async (ctx, args) => {
     return await persistentTextStreaming.getStreamBody(
       ctx,
-      args.streamId as StreamId
+      args.streamId as StreamId,
     );
   },
 });
 
 // Create an HTTP action that generates chunks of the chat body
 // and uses the component to stream them to the client and save them to the database.
 export const streamChat = httpAction(async (ctx, request) => {
-  const body = (await request.json()) as {streamId: string};
+  const body = (await request.json()) as { streamId: string };
   const generateChat = async (ctx, request, streamId, chunkAppender) => {
     await chunkAppender("Hi there!");
     await chunkAppender("How are you?");
@@ -106,7 +107,7 @@ export const streamChat = httpAction(async (ctx, request) => {
     ctx,
     request,
     body.streamId as StreamId,
-    generateChat
+    generateChat,
   );
 
   // Set CORS headers appropriately.
@@ -126,8 +127,8 @@ http.route({
 });
 ```
 
-Finally, in your app, you can now create chats and them subscribe to them
-via stream and/or database query as optimal:
+Finally, in your app, you can now create chats and them subscribe to them via
+stream and/or database query as optimal:
 
 ```ts
 // chat-input.tsx, maybe?
@@ -149,7 +150,7 @@ const { text, status } = useStream(
   api.chat.getChatBody, // The query to call for the full stream body
   new URL(`${convexSiteUrl}/chat-stream`), // The HTTP endpoint for streaming
   driven, // True if this browser session created this chat and should generate the stream
-  chat.streamId as StreamId // The streamId from the chat database record
+  chat.streamId as StreamId, // The streamId from the chat database record
 );
 ```
 
@@ -162,28 +163,29 @@ let's examine each approach in isolation.
 - **HTTP streaming only**: If your app _only_ uses HTTP streaming, then the
   original browser that made the request will have a great, high-performance
   streaming experience. But if that HTTP connection is lost, if the browser
-  window is reloaded, if other users want to view the same chat, or this
-  users wants to revisit the conversation later, it won't be possible. The
+  window is reloaded, if other users want to view the same chat, or this users
+  wants to revisit the conversation later, it won't be possible. The
   conversation is only ephemeral because it was never stored on the server.
 
 - **Database Persistence Only**: If your app _only_ uses database persistence,
   it's true that the conversation will be available for as long as you want.
   Additionally, Convex's subscriptions will ensure the chat message is updated
-  as new text chunks are generated. However, there are a few downsides: one,
-  the entire chat body needs to be resent every time it is changed, which is a
-  lot redundant bandwidth to push into the database and over the websockets to
-  all connected clients. Two, you'll need to make a difficult tradeoff between
+  as new text chunks are generated. However, there are a few downsides: one, the
+  entire chat body needs to be resent every time it is changed, which is a lot
+  redundant bandwidth to push into the database and over the websockets to all
+  connected clients. Two, you'll need to make a difficult tradeoff between
   interactivity and efficiency. If you write every single small chunk to the
-  database, this will get quite slow and expensive. But if you batch up the chunks
-  into, say, paragraphs, then the user experience will feel laggy.
+  database, this will get quite slow and expensive. But if you batch up the
+  chunks into, say, paragraphs, then the user experience will feel laggy.
 
-This component combines the best of both worlds. The original browser that
-makes the request will still have a great, high-performance streaming experience.
-But the chat body is also stored in the database, so it can be accessed by the
+This component combines the best of both worlds. The original browser that makes
+the request will still have a great, high-performance streaming experience. But
+the chat body is also stored in the database, so it can be accessed by the
 client even after the stream has finished, or by other users, etc.
 
 ## Background
 
-This component is largely based on the Stack post [AI Chat with HTTP Streaming](https://stack.convex.dev/ai-chat-with-http-streaming).
+This component is largely based on the Stack post
+[AI Chat with HTTP Streaming](https://stack.convex.dev/ai-chat-with-http-streaming).
 
 <!-- END: Include on https://convex.dev/components -->

example/README.md

Lines changed: 10 additions & 9 deletions
@@ -1,11 +1,11 @@
 # Chat Example App
 
 This is a simple chat app that uses the persistent text streaming component.
-When a new prompt is submitted, the app will stream the response from the OpenAI API
-back to the client and write the prompt and response to the database in an efficient
-way. When the app is refreshed, it will use the database to value to restore the
-chat. Other concurrent browser sessions will also see the chat updates by subscribing
-to the Convex database records in the usual way.
+When a new prompt is submitted, the app will stream the response from the OpenAI
+API back to the client and write the prompt and response to the database in an
+efficient way. When the app is refreshed, it will use the database to value to
+restore the chat. Other concurrent browser sessions will also see the chat
+updates by subscribing to the Convex database records in the usual way.
 
 ## Running the app
 
@@ -24,11 +24,12 @@ npm run dev:frontend # in another terminal
 
 ### Establishing your OPENAI_API_KEY
 
-This chat app talks to OpenAI's API. You need to set the `OPENAI_API_KEY` environment variable
-inside your Convex backend.
+This chat app talks to OpenAI's API. You need to set the `OPENAI_API_KEY`
+environment variable inside your Convex backend.
 
 1. Download an API key from [OpenAI](https://platform.openai.com/api-keys).
 2. Run `npx convex env set OPENAI_API_KEY=<your-key>`
 
-Then you should be able to chat successfully with the app. Pop up the Convex dashboard to
-debug any issues (the `npx convex dashboard` command will get you to the right place).
+Then you should be able to chat successfully with the app. Pop up the Convex
+dashboard to debug any issues (the `npx convex dashboard` command will get you
+to the right place).

example/convex/README.md

Lines changed: 5 additions & 5 deletions
@@ -1,7 +1,7 @@
 # Welcome to your Convex functions directory!
 
-Write your Convex functions here.
-See https://docs.convex.dev/functions for more.
+Write your Convex functions here. See https://docs.convex.dev/functions for
+more.
 
 A query function that takes two arguments looks like:
 
@@ -85,6 +85,6 @@ function handleButtonPress() {
 }
 ```
 
-Use the Convex CLI to push your functions to a deployment. See everything
-the Convex CLI can do by running `npx convex -h` in your project root
-directory. To learn more, launch the docs with `npx convex docs`.
+Use the Convex CLI to push your functions to a deployment. See everything the
+Convex CLI can do by running `npx convex -h` in your project root directory. To
+learn more, launch the docs with `npx convex docs`.

example/convex/chat.ts

Lines changed: 1 addition & 1 deletion
@@ -40,7 +40,7 @@ export const streamChat = httpAction(async (ctx, request) => {
       // Append each chunk to the persistent stream as they come in from openai
       for await (const part of stream)
         await append(part.choices[0]?.delta?.content || "");
-    }
+    },
   );
 
   response.headers.set("Access-Control-Allow-Origin", "*");

example/convex/messages.ts

Lines changed: 2 additions & 2 deletions
@@ -45,10 +45,10 @@ export const getHistory = internalQuery({
        userMessage,
        responseMessage: await streamingComponent.getStreamBody(
          ctx,
-          userMessage.responseStreamId as StreamId
+          userMessage.responseStreamId as StreamId,
        ),
      };
-    })
+    }),
  );
 
  return joinedResponses.flatMap((joined) => {

example/convex/streaming.ts

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@ import { components } from "./_generated/api";
 import { query } from "./_generated/server";
 
 export const streamingComponent = new PersistentTextStreaming(
-  components.persistentTextStreaming
+  components.persistentTextStreaming,
 );
 
 export const getStreamBody = query({
@@ -17,7 +17,7 @@ export const getStreamBody = query({
   handler: async (ctx, args) => {
     return await streamingComponent.getStreamBody(
       ctx,
-      args.streamId as StreamId
+      args.streamId as StreamId,
     );
   },
 });

example/src/App.css

Lines changed: 4 additions & 2 deletions
@@ -3,7 +3,9 @@
 /* Basic styling for markdown content */
 .md-answer {
   color: #333;
-  font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen, Ubuntu, Cantarell, "Open Sans", "Helvetica Neue", sans-serif;
+  font-family:
+    -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto, Oxygen, Ubuntu,
+    Cantarell, "Open Sans", "Helvetica Neue", sans-serif;
   line-height: 1.6;
   max-width: 800px;
   margin: 0 auto;
@@ -235,4 +237,4 @@
   color: #6a737d;
   border-top: 1px solid #eaecef;
   margin-top: 2em;
-}
+}
