Vercel AI Provider for running LLMs locally using Ollama
The Ollama Provider for the Vercel AI SDK contains language model support for the Ollama APIs and embedding model support for the Ollama embeddings API.
This provider requires Ollama >= 0.5.0.

The Ollama provider is available in the `ollama-ai-provider` module. You can install it with:
```bash
npm i ollama-ai-provider
```
You can import the default provider instance `ollama` from `ollama-ai-provider`:
```ts
import { ollama } from 'ollama-ai-provider';
```
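
If your Ollama server is not reachable at the default endpoint, you can create a customized provider instance instead. A minimal sketch, assuming the `createOllama` factory and its `baseURL` setting; the host shown is hypothetical:

```ts
import { createOllama } from 'ollama-ai-provider';

// Create a provider instance pointing at a non-default Ollama endpoint.
const ollama = createOllama({
  baseURL: 'http://my-ollama-host:11434/api', // hypothetical host
});
```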
You can then use the provider instance with functions from the Vercel AI SDK, such as `generateText`:

```ts
import { ollama } from 'ollama-ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
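
The provider also exposes embedding models for the Ollama embeddings API. A minimal sketch using `embed` from the AI SDK, assuming the `nomic-embed-text` model has been pulled locally:

```ts
import { ollama } from 'ollama-ai-provider';
import { embed } from 'ai';

// Embed a single value with an Ollama embedding model.
const { embedding } = await embed({
  model: ollama.embedding('nomic-embed-text'), // assumes this model is pulled locally
  value: 'sunny day at the beach',
});
```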
Please check out the Ollama provider documentation for more information.