useAiChat Reactive Data Stream
@yh-ui/hooks provides headless hook abstractions for AI workflows. The useAiChat and useAiConversations data engines fully separate pure data modeling from visual rendering.
In a traditional setup, calling an AI model's REST API means parsing token streams, wiring up stop/abort handling, managing loading states, and keeping conversation history synchronized.
With useAiChat, you only inject a custom request adapter; the hook then drives the full request/state lifecycle for you.
Basic Integration
Return a plain Promise from your request adapter and the hook takes over the message state for you, with no manual messages.push required.
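A Promise-based adapter can be a plain async function. The sketch below assumes the `(message, history, abortSignal)` signature from the options table; the reply is a deterministic stand-in for a real API call.

```typescript
// Sketch of a Promise-based request adapter for useAiChat.
// The (message, history, abortSignal) signature follows the options
// table below; a real adapter would fetch your backend here.
interface AiChatMessage {
  id: string;
  role: "user" | "assistant" | "system";
  content: string;
}

async function request(
  message: string,
  history: AiChatMessage[],
  abortSignal: AbortSignal
): Promise<string> {
  // Forward `abortSignal` to fetch() in a real adapter so stop()
  // can cancel the request. Here we just echo deterministically.
  return `You said: ${message} (history length: ${history.length})`;
}

// useAiChat({ request }) would append the resolved string as the
// assistant reply -- no manual messages.push required.
```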
Native Event Stream and Instant Abort
Most large language model APIs stream responses over text/event-stream to reduce time to first byte (TTFB). Return an AsyncGenerator (or yield stream chunks) from your request adapter, and the hook core converts each incoming chunk into a smooth, continuous typewriter output.
Calling stop() aborts the in-flight request through the browser's native AbortController, safely halting any subsequent fetches.
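The generator shape and the abort path can be sketched in isolation. The chunk array below stands in for a real text/event-stream reader; the consume loop mirrors (only roughly) how the hook concatenates chunks into the assistant message.

```typescript
// Sketch of a streaming request adapter: an AsyncGenerator that
// yields chunks and bails out as soon as the abort signal fires.
async function* streamRequest(
  message: string,
  history: unknown[],
  abortSignal: AbortSignal
): AsyncGenerator<string> {
  const chunks = ["Hel", "lo, ", "wor", "ld!"]; // stand-in for SSE chunks
  for (const chunk of chunks) {
    // stop() aborts the controller; stop yielding immediately.
    if (abortSignal.aborted) return;
    yield chunk;
  }
}

// Rough model of the hook's consumption loop: concatenate chunks
// into the assistant message as they arrive.
async function consume(signal: AbortSignal): Promise<string> {
  let text = "";
  for await (const chunk of streamRequest("hi", [], signal)) {
    text += chunk;
  }
  return text;
}
```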
Conversation Sidebar Navigation (useAiConversations)
A complete ChatGPT-style client also needs a conversation sidebar. The useAiConversations hook manages the conversation list and active-conversation state for you.
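The state the hook manages boils down to a conversation list plus an active-conversation pointer. The class below is a hypothetical plain-TypeScript model of that shape, not the library's actual implementation; method names are illustrative.

```typescript
// Hypothetical model of the sidebar state useAiConversations manages:
// a conversation list plus an active-conversation id.
interface Conversation {
  id: string;
  title: string;
}

class ConversationList {
  conversations: Conversation[] = [];
  activeId: string | null = null;

  create(title: string): Conversation {
    const conv = { id: String(this.conversations.length + 1), title };
    this.conversations.push(conv);
    this.activeId = conv.id; // newly created conversations become active
    return conv;
  }

  switchTo(id: string): void {
    if (this.conversations.some((c) => c.id === id)) this.activeId = id;
  }

  remove(id: string): void {
    this.conversations = this.conversations.filter((c) => c.id !== id);
    // If the active conversation was removed, fall back to the first one.
    if (this.activeId === id) {
      this.activeId = this.conversations[0]?.id ?? null;
    }
  }
}
```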
API Reference
useAiChat Options
| Property | Description | Type |
|---|---|---|
| initialMessages | Array of default chat records | AiChatMessage[] |
| request | Request adapter with signature (message, history, abortSignal) => AsyncGenerator \| Promise<string \| Response> | Function |
| idGenerator | Replaces the default random ID generator | () => string |
| parser | SSE / stream chunk parser for different vendor formats | StreamChunkParser |
| typewriter | Enable or disable the built-in typewriter effect | boolean |
| charsPerFrame | Number of characters rendered per frame in typewriter mode | number |
| systemPrompt | System prompt automatically prepended to the request history | string |
| onError | Error callback | (err) => void |
| onFinish | Callback fired when the assistant message finishes | (message) => void |
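Putting the options together, a configuration object might look like the sketch below. The interface is declared locally from the table above (the library's exported types may differ), and the echo adapter is purely illustrative.

```typescript
// Local sketch of the options shape from the table above; the real
// exported types from @yh-ui/hooks may differ.
interface AiChatMessage {
  id: string;
  role: string;
  content: string;
}

interface UseAiChatOptions {
  initialMessages?: AiChatMessage[];
  request: (
    message: string,
    history: AiChatMessage[],
    abortSignal: AbortSignal
  ) => Promise<string>;
  idGenerator?: () => string;
  typewriter?: boolean;
  charsPerFrame?: number;
  systemPrompt?: string;
  onError?: (err: unknown) => void;
  onFinish?: (message: AiChatMessage) => void;
}

let nextId = 0;
const options: UseAiChatOptions = {
  initialMessages: [
    { id: "welcome", role: "assistant", content: "How can I help?" },
  ],
  request: async (message) => `echo: ${message}`, // stand-in adapter
  idGenerator: () => `msg-${++nextId}`,           // sequential ids
  typewriter: true,
  charsPerFrame: 2, // render two characters per animation frame
  systemPrompt: "You are a helpful assistant.",
};
```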
useAiChat Returns
The hook returns all of the following state and methods, ready to destructure:
| Property | Description | Type |
|---|---|---|
| messages | Reactive message list bound directly to the conversation. | Ref<AiChatMessage[]> |
| isGenerating | Whether the assistant is currently generating. | Ref<boolean> |
| isSending | Semantic alias of isGenerating. | ComputedRef<boolean> |
| sendMessage | Appends a user message and triggers generation. | (content: string) => Promise<void> |
| stop | Stops the current generation and settles the message state. | () => void |
| clear | Clears the local message history. | () => void |
| removeMessage | Removes a single message by id. | (id: string) => void |
| updateMessage | Updates a specific message in place. | (id: string, patch: Partial<AiChatMessage>) => void |
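The two mutation helpers are plain list operations. The sketch below models removeMessage and updateMessage over a bare array to show the expected semantics; the real hook applies these to a reactive Ref<AiChatMessage[]>.

```typescript
// Plain-array model of the list mutations useAiChat exposes.
// In the real hook these mutate a Ref<AiChatMessage[]> in place.
interface AiChatMessage {
  id: string;
  role: string;
  content: string;
}

// removeMessage(id): drop the message with the matching id.
function removeMessage(list: AiChatMessage[], id: string): AiChatMessage[] {
  return list.filter((m) => m.id !== id);
}

// updateMessage(id, patch): shallow-merge the patch into the
// matching message, leaving all other messages untouched.
function updateMessage(
  list: AiChatMessage[],
  id: string,
  patch: Partial<AiChatMessage>
): AiChatMessage[] {
  return list.map((m) => (m.id === id ? { ...m, ...patch } : m));
}
```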