Showing 1-20 of 72,121 packages
Onairos React Native SDK for social media authentication and AI model training
An AI-powered plugin that integrates [Replicate](https://replicate.com/) machine learning models with your Deenruv server for model training and order prediction tasks.
TensorFlow layers API in JavaScript
TypeScript client for gagara-boost model training service
Utilities for in-browser visualization with TensorFlow.js
Detects whether a string appears to be gibberish, using Markov chaining
👉 https://hyper.fun/c/material-icon-model-training-twotone/1.3.0
👉 https://hyper.fun/c/material-icon-model-training-outlined/1.3.0
SAP Cloud SDK for AI is the official Software Development Kit (SDK) for **SAP AI Core**, **SAP Generative AI Hub**, and **Orchestration Service**.
Self-learning LLM orchestration with SONA adaptive learning, HNSW memory, RLM recursive retrieval, FastGRNN routing, and SIMD inference
👉 https://hyper.fun/c/material-icon-model-training-round/1.3.0
Filter Cypress tests using title or tags
👉 https://hyper.fun/c/material-icon-model-training-sharp/1.3.0
Capture, store, and analyze coding sessions for AI model training. A production-ready CLI tool with cloud sync.
With this package, you can generate predictions from machine learning models trained with YDF, both in the browser and in Node.js.
A comprehensive toolkit for data management, model training, and project scaffolding
ProseMirror's document model
High-performance WebAssembly attention mechanisms for transformers and LLMs: Multi-Head, Flash Attention, Hyperbolic, Linear (Performer), MoE, Local-Global, and CGT Sheaf Attention with coherence gating. GPU-accelerated with SIMD fallback.
The **[Perplexity provider](https://ai-sdk.dev/providers/ai-sdk-providers/perplexity)** for the [AI SDK](https://ai-sdk.dev/docs) contains language model support for Perplexity's Sonar API - a powerful answer engine with real-time web search capabilities.
A browser-native implementation of GPT language models built on TensorFlow.js, developed as part of the Finnish Generation AI research project. This library enables training, fine-tuning, and inference of transformer-based language models entirely in the browser.
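To illustrate the Perplexity provider entry above, here is a minimal sketch of calling a Sonar model through the AI SDK. The package name `@ai-sdk/perplexity`, the `sonar-pro` model ID, and the `PERPLEXITY_API_KEY` environment variable are assumptions based on common AI SDK provider conventions, not details taken from this listing.

```ts
// Minimal sketch: querying Perplexity's Sonar API via the AI SDK.
// Assumes the provider package is `@ai-sdk/perplexity`, the model ID is
// `sonar-pro`, and the API key is read from PERPLEXITY_API_KEY.
import { generateText } from 'ai';
import { perplexity } from '@ai-sdk/perplexity';

const { text } = await generateText({
  model: perplexity('sonar-pro'),
  prompt: 'Summarize the latest TensorFlow.js release notes.',
});

console.log(text);
```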