The **[Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index)** provider for the [Vercel AI SDK](https://ai-sdk.dev/docs) provides language model support for thousands of models through multiple inference providers via the Hugging Face router API.
The Hugging Face provider is available in the `@ai-sdk/huggingface` module. You can install it with:
```bash
npm i @ai-sdk/huggingface
```
If you use coding agents such as Claude Code or Cursor, we highly recommend adding the AI SDK skill to your repository:
```shell
npx skills add vercel/ai
```
You can import the default provider instance `huggingface` from `@ai-sdk/huggingface`:
```ts
import { huggingface } from '@ai-sdk/huggingface';
```
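If you need a customized setup (for example, passing your Hugging Face token explicitly instead of reading it from the environment), AI SDK providers conventionally expose a `create*` factory. A minimal sketch, assuming `@ai-sdk/huggingface` follows that convention with a `createHuggingFace` function and an `apiKey` option:

```ts
import { createHuggingFace } from '@ai-sdk/huggingface';

// Assumed factory and option names, following the common AI SDK provider convention.
const huggingface = createHuggingFace({
  apiKey: process.env.HF_TOKEN ?? '', // env var name assumed; check the provider docs
});
```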
You can then use it to generate text:

```ts
import { huggingface } from '@ai-sdk/huggingface';
import { generateText } from 'ai';

const { text } = await generateText({
  model: huggingface('meta-llama/Llama-3.1-8B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```
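When you want tokens as they arrive rather than the full completion, the same model instance works with the AI SDK's `streamText` function. A minimal sketch, reusing the model ID from the example above:

```ts
import { huggingface } from '@ai-sdk/huggingface';
import { streamText } from 'ai';

// Stream the completion incrementally instead of waiting for the full text.
const result = streamText({
  model: huggingface('meta-llama/Llama-3.1-8B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe.',
});

// Print each text chunk as soon as it arrives.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```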
Please check out the Hugging Face provider documentation for more information.