Local LLM inference for Node.js. GPU-accelerated. Zero config. Works standalone or with Vercel AI SDK.
npm install @tryhamster/gerbil
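A minimal standalone sketch of what "zero config" local inference could look like. The import shape, `Gerbil` class, `loadModel()`, and `generate()` names below are assumptions for illustration only, not the package's confirmed API; check the package's own documentation for the real surface.

```ts
// Hypothetical standalone usage -- the Gerbil class, loadModel(), and
// generate() names are placeholders, not the package's confirmed API.
import { Gerbil } from "@tryhamster/gerbil";

async function main() {
  // Load a local model; GPU acceleration is handled by the library itself.
  const gerbil = new Gerbil();
  const model = await gerbil.loadModel("llama-3.2-1b"); // placeholder model id

  // Run a completion entirely on the local machine, no network calls.
  const { text } = await model.generate({
    prompt: "Write a haiku about gerbils.",
  });
  console.log(text);
}

main().catch(console.error);
```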
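For use with the Vercel AI SDK, a sketch under stated assumptions: `generateText()` is the AI SDK's real API, but the `gerbil()` provider factory imported below is a placeholder name assumed for illustration, since the package's actual provider export is not documented here.

```ts
// Vercel AI SDK integration sketch -- generateText() comes from the real
// "ai" package; the gerbil() provider factory is a hypothetical export.
import { generateText } from "ai";
import { gerbil } from "@tryhamster/gerbil"; // assumed provider factory

const { text } = await generateText({
  model: gerbil("llama-3.2-1b"), // placeholder model id
  prompt: "Summarize why local inference avoids per-token API costs.",
});

console.log(text);
```

Run with a recent Node.js release and `"type": "module"` in package.json so top-level `await` works.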