# React Quickstart
Build a working chat UI with React in under 10 minutes. This guide builds on the Quickstart — if you haven't read it yet, start there to understand the core SDK concepts.
Prefer to build with an AI coding agent? Skip to the Vibe Coding Guide — it goes from empty folder to running chat with Claude Code, Codex, or Antigravity.
## Prerequisites
- A React 18+ or Next.js 13+ project
- The SDK installed (see Installation)
## Step 1: Create a basic chat component

The `useChatSession` hook manages the entire chat lifecycle: connection, streaming, message state, errors, and input.
```tsx
import { useChatSession, MessagePart } from 'gecx-chat/react';

export function SupportChat() {
  const chat = useChatSession();

  return (
    <div>
      <h2>Support Chat</h2>

      {/* Message list */}
      <div>
        {chat.messages.map((msg) => (
          <div key={msg.id} style={{ textAlign: msg.role === 'user' ? 'right' : 'left' }}>
            {msg.parts.map((part) => (
              <MessagePart key={part.id} part={part} />
            ))}
          </div>
        ))}
      </div>

      {/* Error display */}
      {chat.error && <p style={{ color: 'red' }}>{chat.error.message}</p>}

      {/* Input form */}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          const text = chat.input.value.trim();
          if (!text) return; // don't send empty messages
          chat.sendText(text);
          chat.input.clear();
        }}
      >
        <input
          value={chat.input.value}
          onChange={(e) => chat.input.set(e.currentTarget.value)}
          disabled={!chat.canSend}
          placeholder="Type a message..."
        />
        <button type="submit" disabled={!chat.canSend}>
          Send
        </button>
      </form>
    </div>
  );
}
```
This gives you a fully functional chat with streaming responses, automatic message normalization, and disabled input while the AI is responding.
## Step 2: Add configuration
Pass a config object to control auth, transport, storage, and tools:
```tsx
import { useChatSession, MessagePart } from 'gecx-chat/react';
import { tokenEndpointAuth } from 'gecx-chat';

export function SupportChat() {
  const chat = useChatSession({
    config: {
      auth: tokenEndpointAuth({ endpoint: '/api/gecx-chat-token' }),
      storage: { mode: 'session', consent: 'functional' },
    },
  });

  // ... same JSX as above
}
```
For local development, you can skip the config entirely — the hook uses mock auth and mock transport by default.
## Step 3: Set up a server token route
Browser code must never touch service account keys or long-lived credentials. The SDK ships a ready-made handler for Next.js:
```ts
// app/api/gecx-chat-token/route.ts
import { createChatTokenHandler, mockIssueToken } from 'gecx-chat/server';

const handler = createChatTokenHandler({
  allowedOrigins: [process.env.NEXT_PUBLIC_APP_URL ?? 'http://localhost:3000'],
  tokenTtlMs: 15 * 60_000, // 15 minutes
  issueToken: mockIssueToken(), // swap for your real token broker in production
});

export async function POST(request: Request) {
  return handler(request);
}
```
## Step 4: Add client tools
Define tools and pass them in the config:
```tsx
import { useChatSession, MessagePart } from 'gecx-chat/react';
import { defineClientTool } from 'gecx-chat';

const lookupOrder = defineClientTool({
  name: 'lookup_order',
  description: 'Look up an order by ID',
  inputSchema: {
    type: 'object',
    required: ['orderId'],
    properties: { orderId: { type: 'string' } },
  },
  execute: async ({ orderId }) => {
    const res = await fetch(`/api/orders/${encodeURIComponent(orderId)}`);
    return res.json();
  },
});

export function SupportChat() {
  const chat = useChatSession({
    config: {
      tools: [lookupOrder],
    },
  });

  // ... JSX
}
```
For tools that change state (add to cart, apply refund), require user approval:
```tsx
const addToCart = defineClientTool({
  name: 'add_to_cart',
  description: 'Add a product to the shopping cart',
  inputSchema: {
    type: 'object',
    required: ['productId', 'quantity'],
    properties: {
      productId: { type: 'string' },
      quantity: { type: 'number' },
    },
  },
  permissions: { requiresUserApproval: true },
  execute: async ({ productId, quantity }) => {
    // your cart logic here
    return { added: true, productId, quantity };
  },
});
```
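The `inputSchema` objects above are JSON Schema. To make concrete what the `add_to_cart` schema asserts, here is a minimal hand-rolled check for that one shape — purely illustrative (a general-purpose validator such as Ajv covers the full spec), not something the SDK requires you to write:

```typescript
// Illustrative check mirroring the add_to_cart inputSchema above:
// an object with a string productId and a numeric quantity, both required.
type AddToCartInput = { productId: string; quantity: number };

function isAddToCartInput(value: unknown): value is AddToCartInput {
  if (typeof value !== 'object' || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.productId === 'string' && typeof v.quantity === 'number';
}
```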
## Step 5: Render suggestion chips

The AI can suggest quick replies via suggestion chips. Use the `chat.messages` array to detect them:
```tsx
export function SupportChat() {
  const chat = useChatSession();
  const lastMessage = chat.messages[chat.messages.length - 1];
  const suggestions = lastMessage?.parts.filter((p) => p.type === 'suggestion-chips') ?? [];

  return (
    <div>
      {/* ... message list and input from Step 1 ... */}

      {/* Suggestion chips */}
      {suggestions.map((part) =>
        // filter() doesn't narrow the union type, so re-check before using part.chips
        part.type === 'suggestion-chips' &&
        part.chips.map((chip) => (
          <button
            key={chip.label}
            onClick={() => chat.sendText(chip.label)}
            disabled={!chat.canSend}
          >
            {chip.label}
          </button>
        ))
      )}
    </div>
  );
}
```
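If you want to unit-test the chip extraction, the logic above can be lifted into a plain helper. The part shapes below mirror the snippet (a `type` field, plus `chips` entries with a `label`), but they are local illustrative types, not imports from the SDK:

```typescript
// Collect suggestion-chip labels from the last message, if any.
// Types are local sketches mirroring the snippet above, not SDK exports.
type Part =
  | { type: 'text'; text: string }
  | { type: 'suggestion-chips'; chips: { label: string }[] };

type Message = { id: string; role: string; parts: Part[] };

function lastSuggestions(messages: Message[]): string[] {
  const last = messages[messages.length - 1];
  if (!last) return [];
  return last.parts
    // a type-predicate filter narrows the union, unlike a plain boolean filter
    .filter((p): p is Extract<Part, { type: 'suggestion-chips' }> => p.type === 'suggestion-chips')
    .flatMap((p) => p.chips.map((c) => c.label));
}
```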
## Step 6: Use pre-built components
The SDK ships pre-built, accessible components that handle common patterns:
```tsx
import {
  ChatProvider,
  ChatSurface,
  MessageList,
  Composer,
  SuggestionBar,
} from 'gecx-chat/react';
import { createChatClient } from 'gecx-chat';

const client = createChatClient({ /* your config */ });

export function App() {
  return (
    <ChatProvider client={client}>
      <ChatSurface title="Help Center">
        <MessageList />
        <SuggestionBar />
        <Composer />
      </ChatSurface>
    </ChatProvider>
  );
}
```
## Step 7: Customize message rendering
Override how any message part type renders by providing a custom renderer registry:
```tsx
import { ChatProvider, createRendererRegistry } from 'gecx-chat/react';

const renderers = createRendererRegistry({
  text: ({ part }) => <p className="my-text">{part.text}</p>,
  'product-carousel': ({ part }) => (
    <div className="product-grid">
      {part.products.map((p) => (
        <div key={p.id} className="product-card">
          <h3>{p.name}</h3>
          <span>${p.price}</span>
        </div>
      ))}
    </div>
  ),
});

// `client` is the chat client created in Step 6
export function App() {
  return (
    <ChatProvider client={client} renderers={renderers}>
      {/* your chat UI */}
    </ChatProvider>
  );
}
```
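Conceptually, a renderer registry is just a lookup from `part.type` to a render function, with a fallback for unregistered types. A framework-free sketch of that dispatch (illustrative only, not the SDK's internals):

```typescript
// Illustrative dispatch: pick a renderer by part.type, fall back otherwise.
// Renders to strings here to stay framework-free; the SDK renders components.
type PartLike = { type: string; [key: string]: unknown };
type Renderer = (part: PartLike) => string;

function createRegistry(renderers: Record<string, Renderer>, fallback: Renderer) {
  return (part: PartLike): string => (renderers[part.type] ?? fallback)(part);
}
```
A fallback matters in practice: the model may emit part types your UI has no renderer for, and dropping them silently is usually worse than a visible placeholder.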
## What's next
- React Integration Guide — full API reference for hooks and components
- Custom Renderers Guide — advanced rendering patterns
- Client Tools Guide — approval workflows, timeouts, validation
- Generative UI Guide — let the AI design the UI layout
- Explore the Showcase — see every feature in action