# useAiStream Streaming Request Engine 🌊
`useAiStream` is a low-level engine hook designed for streaming AI output, featuring:

- 🏭 Multi-vendor adapters (OpenAI / DeepSeek / Wenxin / Tongyi)
- 🖋️ Typewriter throttling (based on `requestAnimationFrame`)
- 🛑 `AbortController` support
## 🖊️ Typewriter Effect Demo (Local AsyncGenerator)
Simulate the rhythm of real AI output with an `AsyncGenerator`: the full typewriter experience without networking or API keys.
For example: "What are the core advantages of the Vue 3 Composition API?" In the live demo, select a question and click "Generate Answer" to experience the typewriter effect ✨
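Under the hood, this demo's "request adapter" is nothing more than a local `AsyncGenerator`. A minimal, self-contained sketch (the chunk text, delay, and `collect` helper are illustrative, not part of the library; a real adapter also receives an `AbortSignal`):

```ts
// A mock request adapter: an AsyncGenerator that drip-feeds chunks,
// mimicking how an AI backend streams tokens over the wire.
async function* mockAnswer(query: string): AsyncGenerator<string> {
  const chunks = ['Composition API ', 'groups logic by feature, ', 'not by option.']
  for (const chunk of chunks) {
    await new Promise(resolve => setTimeout(resolve, 20)) // simulated latency
    yield chunk
  }
}

// Drain the generator the way a consumer would, accumulating the full content.
async function collect(gen: AsyncGenerator<string>): Promise<string> {
  let full = ''
  for await (const chunk of gen) full += chunk
  return full
}
```

Passing a generator like `mockAnswer` as the `request` option lets the hook render the typewriter effect entirely offline.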
## 🌐 Connecting Real SSE APIs
Swap the request adapter for a real API call and pass your key to enable production-grade streaming conversation.

📋 Replace `YOUR_API_KEY` with your SiliconFlow / OpenAI / DeepSeek-compatible key to get real streaming output.

👉 Supported: OpenAI · DeepSeek · SiliconFlow · iFlytek · Moonshot · MiniMax, and any service using the OpenAI SSE format.
## API

### Options
| Param | Type | Default | Description |
| --- | --- | --- | --- |
| `request` | `(query, signal, ...args) => AsyncGenerator \| Promise<Response>` | Required | Request adapter; supports `AbortSignal` |
| `parser` | `StreamChunkParser` | `plainTextParser` | Stream chunk parser (multi-vendor) |
| `typewriter` | `boolean` | `true` | Enable the typewriter effect |
| `charsPerFrame` | `number` | `3` | Characters output per frame (speed control) |
| `onUpdate` | `(chunk, fullContent) => void` | - | Incremental update callback |
| `onFinish` | `(content) => void` | - | Final completion callback |
| `onError` | `(err) => void` | - | Error callback |
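Conceptually, `charsPerFrame` caps how many characters are flushed per `requestAnimationFrame` tick. A simplified sketch of just the chunking math (a hypothetical helper, not the hook's actual internals):

```ts
// Split buffered text into per-frame slices of at most `charsPerFrame` chars.
function splitIntoFrames(buffer: string, charsPerFrame = 3): string[] {
  const frames: string[] = []
  for (let i = 0; i < buffer.length; i += charsPerFrame) {
    frames.push(buffer.slice(i, i + charsPerFrame))
  }
  return frames
}
```

At a typical 60 fps, `charsPerFrame: 3` works out to roughly 180 characters per second.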
### Returns
| Field | Type | Description |
|---|---|---|
| `isStreaming` | `Ref<boolean>` | Whether streaming is in progress |
| `currentContent` | `Ref<string>` | Full content received so far |
| `fetchStream` | `(query, ...args) => Promise<void>` | Trigger the request |
| `stop` | `() => void` | Abort the request |
### Built-in Parsers
| Parser | Vendor | Description |
|---|---|---|
| `openaiParser` | OpenAI, DeepSeek, SiliconFlow, iFlytek | Compatible with the OpenAI SSE format |
| `ernieParser` | Baidu Wenxin | Compatible with the Wenxin SSE format |
| `qwenParser` | Alibaba Tongyi (direct) | Compatible with the Tongyi SSE format |
| `plainTextParser` | Raw AsyncGenerator | Yielded strings are appended directly |
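For intuition, an OpenAI-style SSE parser essentially extracts `choices[0].delta.content` from each `data:` line. A simplified, hypothetical sketch of that step (the real `openaiParser` also handles buffering partial chunks and other edge cases):

```ts
// Parse a single OpenAI-format SSE line into its text delta ('' if none).
function parseOpenAiLine(line: string): string {
  if (!line.startsWith('data:')) return ''
  const payload = line.slice(5).trim()
  if (payload === '[DONE]') return '' // end-of-stream sentinel
  try {
    const json = JSON.parse(payload)
    return json.choices?.[0]?.delta?.content ?? ''
  } catch {
    return '' // ignore malformed or partial JSON
  }
}
```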
## Connecting Other AI Platforms
```ts
import { useAiStream, openaiParser } from '@yh-ui/hooks'

// ── SiliconFlow ──
const { fetchStream } = useAiStream({
  parser: openaiParser,
  request: async (query, signal) =>
    fetch('https://api.siliconflow.cn/v1/chat/completions', {
      method: 'POST',
      signal,
      headers: {
        Authorization: 'Bearer YOUR_SF_KEY',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        model: 'Qwen/Qwen2.5-7B-Instruct',
        stream: true,
        messages: [{ role: 'user', content: query }]
      })
    })
})

// ── DeepSeek ──
const { fetchStream: deepseekStream } = useAiStream({
  parser: openaiParser,
  request: async (query, signal) =>
    fetch('https://api.deepseek.com/chat/completions', {
      method: 'POST',
      signal,
      headers: {
        Authorization: 'Bearer YOUR_DEEPSEEK_KEY',
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        model: 'deepseek-chat',
        stream: true,
        messages: [{ role: 'user', content: query }]
      })
    })
})
```