pattern · typescript · Tip
Vercel AI SDK useChat hook manages message state and streaming out of the box
ai@3.x
vercel-ai-sdk · useChat · streaming · react · streamText · toDataStreamResponse
Problem
Building a streaming chat UI from scratch requires managing partial response state, abort controllers, loading indicators, and error handling. Doing this manually is error-prone and verbose.
Solution
Use the Vercel AI SDK's useChat hook. It handles streaming, message state, loading/error states, and abort logic automatically. On the server, use the streamText() function with toDataStreamResponse() to produce a compatible stream.
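A minimal sketch of the matching server route, assuming a Next.js App Router project with the `@ai-sdk/openai` provider installed; the model name is illustrative and an `OPENAI_API_KEY` environment variable is required at runtime:

```typescript
// app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  // useChat POSTs { messages } to this endpoint by default
  const { messages } = await req.json();

  // streamText starts the completion and streams tokens as they arrive
  const result = await streamText({
    model: openai('gpt-4o-mini'), // illustrative model choice
    messages,
  });

  // toDataStreamResponse() frames the stream in the protocol
  // that the useChat hook knows how to consume
  return result.toDataStreamResponse();
}
```

Because the route lives at app/api/chat/route.ts, it matches the hook's default /api/chat endpoint with no extra client configuration.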
Why
The Vercel AI SDK was designed specifically for this pattern — it provides a unified protocol between the server stream and the React hook, handling chunked transfer encoding and SSE framing transparently.
Gotchas
- useChat posts to /api/chat by default; set the `api` option if your endpoint differs
- The initialMessages option is for static pre-population only; don't update it dynamically
- append() adds a message programmatically and triggers a new completion; use setMessages() to update local state without making a request
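The customization points above can be sketched as follows; the `/api/assistant` endpoint, message texts, and component name are illustrative, not SDK defaults:

```typescript
'use client';
import { useChat } from 'ai/react';

export default function Assistant() {
  const { messages, append, setMessages } = useChat({
    api: '/api/assistant', // override the default /api/chat endpoint
    // Static pre-population only; don't feed live state into this option
    initialMessages: [
      { id: 'sys-1', role: 'system', content: 'Answer tersely.' },
    ],
  });

  // append() adds the message AND triggers a new completion request
  const ask = () => append({ role: 'user', content: 'Summarize this page.' });

  // setMessages() edits local state without contacting the server
  const clear = () => setMessages([]);

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>{m.role}: {m.content}</div>
      ))}
      <button onClick={ask}>Ask</button>
      <button onClick={clear}>Clear</button>
    </div>
  );
}
```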
Code Snippets
Vercel AI SDK useChat example
// app/page.tsx
'use client';
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();
  return (
    <div>
      {messages.map(m => (
        <div key={m.id}>{m.role}: {m.content}</div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}
Context
Building chat UIs in Next.js or React applications