docs: document useChat() from Vercel AI SDK #250

Merged · 8 commits · Aug 31, 2024
80 changes: 78 additions & 2 deletions docs/content/1.docs/2.features/ai.md
@@ -151,5 +151,81 @@ export default defineEventHandler(async () => {

Explore open source templates made by the community:

- [Atidraw](https://github.com/atinux/atidraw): leverage AI to generate the alt text of the user drawing and generate an alternative image with AI ([blog post](/blog/cloudflare-ai-for-user-experience)).
- [Hub Chat](https://github.com/ra-jeev/hub-chat): a chat interface to interact with various text generation AI models ([blog post](https://rajeev.dev/create-cloudflare-workers-ai-llm-playground-using-nuxthub-and-nuxtui)).
::card-group
::card{title="Atidraw" to="https://github.com/atinux/atidraw"}
Generate the alt text of the user drawing and generate an alternative image with AI.
::
::card{title="Hub Chat" to="https://github.com/ra-jeev/hub-chat"}
A chat interface to interact with various text generation AI models.
::
::

## Vercel AI SDK

NuxtHub AI is compatible with some functions of the [Vercel AI SDK](https://sdk.vercel.ai), so you can use the SDK on top of Cloudflare Workers AI to stream responses to the client.

First, install the Vercel AI SDK in your project:

```[Terminal]
npx nypm add ai @ai-sdk/vue
```

::note
[`nypm`](https://github.com/unjs/nypm) will detect your package manager and install the dependencies with it.
::

### `useChat()`

To leverage the `useChat()` Vue composable, you need to create a `POST /api/chat` endpoint that uses the `hubAI()` server composable and returns a compatible stream for the Vercel AI SDK.

```ts [server/api/chat.post.ts]
import { AIStream, formatStreamPart } from 'ai'

export default defineEventHandler(async (event) => {
const { messages } = await readBody(event)

const stream = await hubAI().run('@cf/meta/llama-3.1-8b-instruct', {
messages,
stream: true
}) as ReadableStream

// Return a compatible stream for the Vercel AI SDK
return AIStream(
new Response(stream),
data => formatStreamPart('text', JSON.parse(data).response)
)
})
```
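Under the hood, Workers AI streams server-sent events whose payloads look like `data: {"response":"…"}`, and the transform above re-emits each chunk as an AI SDK "text" stream part. The sketch below illustrates that mapping with a hypothetical standalone helper (it is not part of either SDK — `AIStream` and `formatStreamPart` do this for you):

```typescript
// Hypothetical helper mirroring what the endpoint above does per SSE line:
// extract the `response` field from the Workers AI payload and re-emit it
// as an AI SDK data-stream "text" part (`0:<JSON-encoded string>` + newline).
function toTextStreamPart(sseLine: string): string | null {
  const payload = sseLine.replace(/^data: /, '').trim()
  if (!payload || payload === '[DONE]') return null // end-of-stream marker
  const { response } = JSON.parse(payload) as { response?: string }
  return response ? `0:${JSON.stringify(response)}\n` : null
}

console.log(toTextStreamPart('data: {"response":"Hello"}'))
```

This is why the endpoint parses each chunk with `JSON.parse(data).response` before handing it to `formatStreamPart('text', …)`.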

Then, create a chat page that uses the `useChat()` composable:

```vue [app/pages/chat.vue]
<script setup lang="ts">
import { useChat } from '@ai-sdk/vue'

const { messages, input, handleSubmit, isLoading, stop, error, reload } = useChat()
</script>

<template>
<div v-for="m in messages" :key="m.id">
{{ m.role }}: {{ m.content }}
</div>
<div v-if="error">
<div>{{ error.message || 'An error occurred' }}</div>
<button @click="reload">retry</button>
</div>
<form @submit="handleSubmit">
<input v-model="input" placeholder="Type here..." />
<button v-if="isLoading" @click="stop">stop</button>
<button v-else type="submit">send</button>
</form>
</template>
```
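For orientation, each entry in `messages` is roughly the following shape — a simplified sketch, assuming only the fields used in the template above (the SDK's actual `Message` type carries more optional fields):

```typescript
// Simplified sketch of a chat message (assumption: the real AI SDK
// `Message` type has additional optional fields beyond these three).
interface ChatMessage {
  id: string
  role: 'system' | 'user' | 'assistant'
  content: string
}

// `messages` grows as the user submits input and the model streams back.
const history: ChatMessage[] = [
  { id: '1', role: 'user', content: 'Hi!' },
  { id: '2', role: 'assistant', content: 'Hello! How can I help?' }
]

console.log(history.map(m => `${m.role}: ${m.content}`).join('\n'))
```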

Learn more about the [`useChat()` Vue composable](https://sdk.vercel.ai/docs/reference/ai-sdk-ui/use-chat).

::callout
Check out our [`pages/ai.vue` full example](https://github.com/nuxt-hub/core/blob/main/playground/app/pages/ai.vue) with Nuxt UI & Nuxt MDC.
::
99 changes: 20 additions & 79 deletions playground/app/pages/ai.vue
```diff
@@ -1,93 +1,34 @@
 <script setup lang="ts">
-import destr from 'destr'
+import { useChat } from '@ai-sdk/vue'
 
-interface Message {
-  role: 'ai' | 'user'
-  message: string
-}
-
-const prompt = ref('')
-const loading = ref(false)
-const messages = ref<Message[]>([])
-const currentAIResponse = ref('')
-
-async function sendPrompt() {
-  if (loading.value) return
-  loading.value = true
-  currentAIResponse.value = ''
-  messages.value.push({
-    role: 'user',
-    message: prompt.value
-  })
-  const promptToSend = prompt.value
-  prompt.value = ''
-  const body = await $fetch('/api/ai', {
-    method: 'POST',
-    responseType: 'stream',
-    body: {
-      prompt: promptToSend
-    }
-  })
-  const reader = body.getReader()
-  const decoder = new TextDecoder()
-  let result = await reader.read()
-  while (!result.done) {
-    const text = decoder.decode(result.value)
-    for (const line of text.split('\n')) {
-      if (!line) continue
-      const data: any = destr(line.replace('data: ', '')) || {}
-      if (data?.response) {
-        currentAIResponse.value += data.response
-      }
-    }
-    result = await reader.read()
-  }
-  messages.value.push({
-    role: 'ai',
-    message: currentAIResponse.value
-  })
-  currentAIResponse.value = ''
-  loading.value = false
-}
+const { messages, input, handleSubmit, isLoading, stop, error, reload } = useChat()
 </script>
 
 <template>
   <UCard>
-    <div class="h-full overflow-auto chat-messages">
-      <div v-for="(message, i) in messages" :key="i" class="flex flex-col p-4">
-        <div v-if="message.role === 'ai'" class="pr-8 mr-auto">
-          <div class="p-2 mt-1 text-sm text-gray-700 bg-gray-200 rounded-lg text-smp-2 whitespace-pre-line">
-            {{ message.message }}
-          </div>
-        </div>
-        <div v-else class="pl-8 ml-auto">
-          <div class="p-2 mt-1 text-sm text-white bg-blue-400 rounded-lg whitespace-pre-line">
-            {{ message.message }}
-          </div>
+    <div class="h-full overflow-auto">
+      <div v-for="message in messages" :key="message.id" class="flex flex-col p-4">
+        <div :class="message.role === 'assistant' ? 'pr-8 mr-auto' : 'pl-8 ml-auto'">
+          <MDC
+            class="p-2 mt-1 text-sm rounded-lg text-smp-2 whitespace-pre-line"
+            :class="message.role === 'assistant' ? 'text-white bg-blue-400' : 'text-gray-700 bg-gray-200'"
+            :value="message.content"
+          />
         </div>
       </div>
-      <div v-if="currentAIResponse" class="flex flex-col p-4">
-        <div class="pr-8 mr-auto">
-          <div class="p-2 mt-1 text-sm text-gray-700 bg-gray-200 rounded-lg text-smp-2 whitespace-pre-line">
-            {{ currentAIResponse }}
-          </div>
+      <div v-if="error" class="flex items-center justify-center gap-2">
+        <div class="text-red-500">
+          {{ 'An error occurred' }}
         </div>
+        <UButton color="gray" size="xs" @click="reload">
+          retry
+        </UButton>
       </div>
     </div>
-    <form class="flex items-center w-full p-2 gap-2" @submit.prevent="sendPrompt">
-      <UInput
-        v-model="prompt"
-        type="text"
-        placeholder="Type here..."
-        class="w-full"
-        :disabled="loading"
-      />
-      <UButton
-        :loading="loading"
-        icon="i-heroicons-paper-airplane"
-        type="submit"
-        color="black"
-      />
+    <form class="flex items-center w-full p-2 gap-2" @submit.prevent="handleSubmit">
+      <UInput v-model="input" placeholder="Type here..." class="w-full" :disabled="Boolean(error)" />
+      <UButton v-if="isLoading" icon="i-heroicons-stop" color="black" @click="stop" />
+      <UButton v-else icon="i-heroicons-paper-airplane" type="submit" color="black" />
     </form>
   </UCard>
 </template>
```
3 changes: 2 additions & 1 deletion playground/nuxt.config.ts
```diff
@@ -7,6 +7,7 @@ export default defineNuxtConfig({
 
   modules: [
     '@nuxt/ui',
+    '@nuxtjs/mdc',
    '@kgierke/nuxt-basic-auth',
    module
  ],
@@ -20,7 +21,7 @@
     bindings: {
       // Used for /api/hyperdrive
       hyperdrive: {
-        POSTGRES: '08f7bc805d1d409aac17e72af502abd0'
+        POSTGRES: '8bb2913857b84c939cd908740fa5a5d5'
       }
     }
     // projectUrl: ({ branch }) => branch === 'main' ? 'https://playground.nuxt.dev' : `https://${encodeHost(branch).replace(/\//g, '-')}.playground-to39.pages.dev`
```
3 changes: 3 additions & 0 deletions playground/package.json
```diff
@@ -8,10 +8,13 @@
     "preview": "nuxi preview"
   },
   "dependencies": {
+    "@ai-sdk/vue": "^0.0.45",
     "@iconify-json/simple-icons": "^1.1.114",
     "@kgierke/nuxt-basic-auth": "^1.6.0",
     "@nuxt/ui": "^2.18.4",
     "@nuxthub/core": "latest",
+    "@nuxtjs/mdc": "^0.8.3",
+    "ai": "^3.3.21",
     "drizzle-orm": "^0.33.0",
     "nuxt": "^3.13.0",
     "postgres": "^3.4.4"
```
15 changes: 0 additions & 15 deletions playground/server/api/ai.post.ts

This file was deleted.

20 changes: 20 additions & 0 deletions playground/server/api/chat.post.ts
```diff
@@ -0,0 +1,20 @@
+import { AIStream, formatStreamPart } from 'ai'
+
+export default defineEventHandler(async (event) => {
+  const { messages } = await readBody(event)
+
+  const stream = await hubAI().run('@cf/meta/llama-3.1-8b-instruct', {
+    messages,
+    stream: true
+  }) as ReadableStream
+
+  // For testing purposes, we'll randomly throw an error
+  // if (Math.round(Math.random()) === 1) {
+  //   throw createError({
+  //     status: 500,
+  //     statusMessage: 'Nope'
+  //   })
+  // }
+
+  return AIStream(new Response(stream), data => formatStreamPart('text', JSON.parse(data).response))
+})
```