
useAiChat Reactive Data Stream

@yh-ui/hooks provides headless hook abstractions for AI workflows. The useAiChat and useAiConversations data engines fully separate visual rendering from data modeling.

In a traditional setup, integrating an AI model's REST API means maintaining token streams, wiring up stop/abort handling, managing loading states, and keeping conversations synchronized.

With useAiChat, you simply inject a custom request adapter and the hook manages the full state lifecycle for you.

Basic Integration

Return a plain Promise from your request adapter and the hook takes over message state on its own; there is no need to call messages.push manually.

Minimal Engine
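As a sketch of the idea (the message shape and echo reply below are illustrative assumptions, not part of the library), a minimal Promise-based adapter might look like:

```typescript
// Shape of a chat record; the exact fields are illustrative assumptions.
type AiChatMessage = { id: string; role: 'user' | 'assistant'; content: string };

// A Promise-based request adapter: useAiChat appends the user message,
// awaits the resolved string, and pushes the assistant reply itself.
// The echoed reply stands in for a real HTTP call such as fetch('/api/chat').
async function request(
  message: AiChatMessage,
  history: AiChatMessage[],
): Promise<string> {
  return `Echo: ${message.content} (history length: ${history.length})`;
}

// Wiring it up (hook call shown for context):
// const { messages, sendMessage, isGenerating } = useAiChat({ request });
```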

Native Event Stream and Instant Abort

Most large language model APIs stream responses as text/event-stream to reduce time to first byte (TTFB). Write your request adapter as an AsyncGenerator (or return a stream) and the hook core converts each yielded chunk into a smooth, continuous typewriter output.

Calling stop() aborts the in-flight request through the native browser AbortController, whose signal is passed to your adapter as abortSignal.

Data Streaming with Native Abort
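A streaming adapter can be sketched like this; the hard-coded chunks stand in for SSE data from a real fetch, and the message shape is an illustrative assumption:

```typescript
// Shape of a chat record; the exact fields are illustrative assumptions.
type AiChatMessage = { id: string; role: 'user' | 'assistant'; content: string };

// An AsyncGenerator adapter: the hook consumes each yielded chunk and feeds
// the typewriter. When stop() is called, the hook aborts its AbortController
// and the generator sees abortSignal.aborted on the next iteration.
async function* streamingRequest(
  _message: AiChatMessage,
  _history: AiChatMessage[],
  abortSignal: AbortSignal,
): AsyncGenerator<string> {
  const chunks = ['Hel', 'lo, ', 'world']; // stand-in for parsed SSE chunks
  for (const chunk of chunks) {
    if (abortSignal.aborted) return; // stop() halts the stream mid-flight
    yield chunk;
  }
}

// const { sendMessage, stop } = useAiChat({ request: streamingRequest });
```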

Conversation Sidebar Navigation (useAiConversations)

A complete ChatGPT-style client also needs a conversation sidebar. The useAiConversations hook manages the conversation list that backs it.

Conversation Management Hook

API Reference

useAiChat Options

| Property | Description | Type |
| --- | --- | --- |
| `initialMessages` | Array of default chat records | `AiChatMessage[]` |
| `request` | Request adapter with signature `(message, history, abortSignal) => AsyncGenerator \| Promise<string \| Response>` | `Function` |
| `idGenerator` | Replaces the default random ID generator | `() => string` |
| `parser` | SSE / stream chunk parser for different vendor formats | `StreamChunkParser` |
| `typewriter` | Enables or disables the built-in typewriter effect | `boolean` |
| `charsPerFrame` | Number of characters rendered per frame in typewriter mode | `number` |
| `systemPrompt` | System prompt automatically prepended to the request history | `string` |
| `onError` | Error callback | `(err) => void` |
| `onFinish` | Callback fired when the assistant message finishes | `(message) => void` |
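For reference, the options shape can be reconstructed from the table above; which fields are optional, the `StreamChunkParser` alias, and the exact callback types are assumptions where the docs leave them loose:

```typescript
// Shapes reconstructed from the options table; fields marked optional and
// the StreamChunkParser alias are assumptions, not the library's source.
type AiChatMessage = { id: string; role: 'user' | 'assistant'; content: string };
type StreamChunkParser = (chunk: string) => string; // assumed signature

interface UseAiChatOptions {
  initialMessages?: AiChatMessage[];
  request: (
    message: AiChatMessage,
    history: AiChatMessage[],
    abortSignal: AbortSignal,
  ) => AsyncGenerator<string> | Promise<string | Response>;
  idGenerator?: () => string;
  parser?: StreamChunkParser;
  typewriter?: boolean;
  charsPerFrame?: number;
  systemPrompt?: string;
  onError?: (err: unknown) => void;
  onFinish?: (message: AiChatMessage) => void;
}

// An illustrative configuration:
const options: UseAiChatOptions = {
  systemPrompt: 'You are a concise assistant.',
  typewriter: true,
  charsPerFrame: 2,
  request: async (message) => `You said: ${message.content}`,
  onError: (err) => console.error(err),
};
```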

useAiChat Returns

The hook returns the following reactive state and methods:

| Property | Description | Type |
| --- | --- | --- |
| `messages` | Reactive message list bound to the current conversation. | `Ref<AiChatMessage[]>` |
| `isGenerating` | Whether the assistant is currently generating. | `Ref<boolean>` |
| `isSending` | Semantic alias of `isGenerating`. | `ComputedRef<boolean>` |
| `sendMessage` | Appends the user query and triggers the request. | `(content: string) => Promise<void>` |
| `stop` | Stops the current generation and settles message state. | `() => void` |
| `clear` | Clears the local message history. | `() => void` |
| `removeMessage` | Removes a single message by ID. | `(id: string) => void` |
| `updateMessage` | Updates a specific message in place. | `(id: string, patch: Partial<AiChatMessage>) => void` |

Released under the MIT License.