📦 Drizzle ORM migrator that applies migrations in the browser, for PGlite, SQLite, and DuckDB WASM!
> [!NOTE]
>
> This project is part of (and also associated with) Project AIRI, where we aim to build an LLM-driven VTuber like Neuro-sama (subscribe if you haven't already!). If you are interested, please give the live demo a try.
Pick the package manager of your choice:
```shell
ni @proj-airi/drizzle-orm-browser-migrator -D # from @antfu/ni, can be installed via `npm i -g @antfu/ni`
pnpm i @proj-airi/drizzle-orm-browser-migrator -D
yarn add @proj-airi/drizzle-orm-browser-migrator -D
npm i @proj-airi/drizzle-orm-browser-migrator -D
```
Then apply your migrations from application code, for example against PGlite:

```typescript
import { IdbFs, PGlite } from '@electric-sql/pglite'
import { migrate } from '@proj-airi/drizzle-orm-browser-migrator/pglite'
import migrations from 'drizzle-migrations.sql'
import { drizzle } from 'drizzle-orm/pglite'

// Create a PGlite instance persisted to IndexedDB and wrap it with Drizzle
const pgLite = new PGlite({ fs: new IdbFs('pglite-database') })
const db = drizzle({ client: pgLite })

// Apply the bundled migrations against the in-browser database
await migrate(db, migrations)
```
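The `migrations` module imported above is expected to contain the SQL migrations generated by `drizzle-kit` (how they get bundled into a single importable module depends on your build setup and is not shown here). As a minimal sketch, assuming your schema lives at `./src/db/schema.ts` and migrations are emitted to `./drizzle`, a typical `drizzle.config.ts` could look like:

```typescript
// drizzle.config.ts - a minimal sketch; the schema and output paths are
// assumptions, adjust them to match your project layout.
import { defineConfig } from 'drizzle-kit'

export default defineConfig({
  dialect: 'postgresql', // PGlite speaks the Postgres dialect
  schema: './src/db/schema.ts',
  out: './drizzle',
})
```

Running `drizzle-kit generate` then emits the SQL migration files that the snippet above applies in the browser.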
Other side projects born from Project AIRI:

- Awesome AI VTuber: A curated list of AI VTubers and related projects
- unspeech: Universal endpoint proxy server for /audio/transcriptions and /audio/speech, like LiteLLM but for any ASR and TTS
- hfup: tools to help on deploying, bundling to HuggingFace Spaces
- xsai-transformers: Experimental 🤗 Transformers.js provider for xsAI.
- WebAI: Realtime Voice Chat: Full example of implementing ChatGPT's realtime voice from scratch with VAD + STT + LLM + TTS.
- @proj-airi/drizzle-duckdb-wasm: Drizzle ORM driver for DuckDB WASM
- @proj-airi/duckdb-wasm: Easy to use wrapper for @duckdb/duckdb-wasm
- Airi Factorio: Allow Airi to play Factorio
- Factorio RCON API: RESTful API wrapper for Factorio headless server console
- autorio: Factorio automation library
- tstl-plugin-reload-factorio-mod: Reload Factorio mod when developing
- Velin: Use Vue SFC and Markdown to write easy to manage stateful prompts for LLM
- demodel: Easily boost the speed of pulling your models and datasets from various inference runtimes.
- inventory: Centralized model catalog and default provider configurations backend service
- MCP Launcher: Easy to use MCP builder & launcher for all possible MCP servers, just like Ollama for models!
- 🥺 SAD: Documentation and notes for self-host and browser running LLMs.