Blazing-fast, zero-dependency uploader for CloudKu. Supports auto-conversion, chunked uploads, and TypeScript. Easily upload images, videos, audio, and documents via Node.js.
- Dual endpoints (cloudkuimages.guru and cloudkuimages-guru.us.itpanel.app) for load balancing and failover
- `Promise.allSettled` for resilient batch operations

## 📦 Installation
```bash
npm install cloudku-uploader
```

```bash
yarn add cloudku-uploader
```

```bash
pnpm add cloudku-uploader
```
## 🚀 Usage
### Quick Start (ESM)
```javascript
import cloudku from 'cloudku-uploader'

const buffer = await fetch('image.jpg').then(r => r.arrayBuffer())
const result = await cloudku.uploadFile(buffer, 'image.jpg')
console.log(result.url)
```
### CommonJS
```javascript
const cloudku = require('cloudku-uploader')
const fs = require('fs').promises

async function upload() {
  const buffer = await fs.readFile('image.jpg')
  // Pass an ArrayBuffer view of the Buffer's own bytes
  const result = await cloudku.uploadFile(
    buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength),
    'image.jpg'
  )
  console.log(result.url)
}

upload()
```
### Browser Upload
```javascript
import cloudku from 'cloudku-uploader'

document.querySelector('#fileInput').addEventListener('change', async (e) => {
  const file = e.target.files[0]
  const buffer = await file.arrayBuffer()
  const result = await cloudku.uploadFile(buffer, file.name)
  console.log('Uploaded to:', result.url)
})
```
### Node.js File Upload
```javascript
import cloudku from 'cloudku-uploader'
import { readFile } from 'fs/promises'

const buffer = await readFile('./photo.jpg')
// Slice to the Buffer's own bytes: .buffer alone may expose Node's shared pool
const arrayBuffer = buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength)
const result = await cloudku.uploadFile(arrayBuffer, 'photo.jpg')
console.log('File URL:', result.url)
```
### Stream Upload
```javascript
import cloudku from 'cloudku-uploader'
import { createReadStream } from 'fs'

const stream = createReadStream('./large-video.mp4')
const chunks = []
for await (const chunk of stream) {
  chunks.push(chunk)
}
const buffer = Buffer.concat(chunks)
// Slice to the Buffer's own bytes before handing over the ArrayBuffer
const result = await cloudku.uploadFile(
  buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength),
  'video.mp4'
)
console.log('Stream uploaded:', result.url)
```
### Progress Tracking
```javascript
import cloudku from 'cloudku-uploader'
import { createReadStream } from 'fs'
import { stat } from 'fs/promises'

const filePath = './movie.mp4'
const fileStats = await stat(filePath)
const totalSize = fileStats.size
const stream = createReadStream(filePath)

let uploadedSize = 0
const chunks = []
for await (const chunk of stream) {
  chunks.push(chunk)
  uploadedSize += chunk.length
  const progress = (uploadedSize / totalSize * 100).toFixed(2)
  console.log(`Progress: ${progress}%`)
}

const buffer = Buffer.concat(chunks)
const result = await cloudku.uploadFile(
  buffer.buffer.slice(buffer.byteOffset, buffer.byteOffset + buffer.byteLength),
  'movie.mp4'
)
console.log('Complete:', result.url)
```
### Chunked Upload (Large Files)
```javascript
import cloudku from 'cloudku-uploader'

const buffer = await fetch('4k-video.mp4').then(r => r.arrayBuffer())
const result = await cloudku.uploadLarge(
  buffer,
  '4k-video.mp4',
  16 * 1024 * 1024 // 16MB chunks
)
console.log('Large file URL:', result.url)
```
### Batch Upload
```javascript
import cloudku from 'cloudku-uploader'

const files = [
  { buffer: buffer1, name: 'photo1.jpg' },
  { buffer: buffer2, name: 'photo2.png' },
  { buffer: buffer3, name: 'document.pdf' }
]

const results = await cloudku.uploadBatch(files)

const successful = results.filter(r => r.status === 'fulfilled')
const failed = results.filter(r => r.status === 'rejected')
console.log(`✓ ${successful.length} uploaded successfully`)
console.log(`✗ ${failed.length} failed`)

results.forEach((result, index) => {
  if (result.status === 'fulfilled') {
    console.log(`[${index + 1}] ${files[index].name}: ${result.value.url}`)
  } else {
    console.error(`[${index + 1}] ${files[index].name}: ${result.reason.message}`)
  }
})
```
## 📚 API Reference
### uploadFile(buffer, name?)
Main upload method with automatic strategy selection based on file size.
Parameters:
- buffer {ArrayBuffer} - File content as ArrayBuffer (required)
- name {string} - Filename with extension (optional, default: 'file.bin')
Returns: `Promise<UploadResult>`
Behavior:
- Files ≤ 100MB: Uses single POST request
- Files > 100MB: Automatically switches to chunked upload
Example:
```javascript
const buffer = await file.arrayBuffer()
const result = await cloudku.uploadFile(buffer, 'photo.jpg')
```
Response Schema:
```javascript
{
  status: 'success',
  url: 'https://cloudkuimages.guru/files/abc123.jpg',
  filename: 'photo.jpg',
  size: 2048576
}
```
---
### uploadLarge(buffer, name?, chunkSize?)
Explicit chunked upload for large files with progress control.
Parameters:
- buffer {ArrayBuffer} - File content as ArrayBuffer (required)
- name {string} - Filename with extension (optional, default: 'file.bin')
- chunkSize {number} - Chunk size in bytes (optional, default: 8388608 = 8MB)
Returns: `Promise<UploadResult>`
Implementation Details:
1. Generates UUID v4 as fileId for chunk tracking
2. Splits buffer into chunks of specified size
3. Uploads each chunk with metadata:
   - chunk: Current chunk index (0-based)
   - chunks: Total number of chunks
   - filename: Original filename
   - fileId: UUID for tracking
   - size: Total file size in bytes
4. Sends finalization request with chunked=1&finalize=1 query params
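The splitting in step 2 can be sketched as follows; this is an illustration of the behavior described above, not the package's actual source:

```javascript
// Sketch: split an ArrayBuffer into chunkSize-byte slices (0-based index)
function* sliceChunks(buffer, chunkSize = 8 * 1024 * 1024) {
  const total = Math.ceil(buffer.byteLength / chunkSize)
  for (let i = 0; i < total; i++) {
    const start = i * chunkSize
    // Each yielded slice maps to one chunk-upload request
    yield { chunk: i, chunks: total, data: buffer.slice(start, start + chunkSize) }
  }
}
```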
Example:
```javascript
const result = await cloudku.uploadLarge(
  buffer,
  'movie.mkv',
  10 * 1024 * 1024 // 10MB chunks
)
```
Chunk Upload Request:
```javascript
FormData {
  file: Blob(chunk),
  chunk: 0,
  chunks: 12,
  filename: 'movie.mkv',
  fileId: '550e8400-e29b-41d4-a716-446655440000',
  size: 104857600
}
```
Finalization Request:
```http
POST /upload.php?chunked=1&finalize=1
Content-Type: application/json

{
  "fileId": "550e8400-e29b-41d4-a716-446655440000",
  "filename": "movie.mkv",
  "chunks": 12
}
```
---
### uploadBatch(files)
Upload multiple files concurrently with individual error handling.
Parameters:
- files {FileObject[]} - Array of file objects (required)
FileObject Schema:
```typescript
{
  buffer: ArrayBuffer,
  name: string
}
```
Returns: `Promise<PromiseSettledResult<UploadResult>[]>`
Example:
```javascript
const results = await cloudku.uploadBatch([
  { buffer: buffer1, name: 'image1.jpg' },
  { buffer: buffer2, name: 'image2.png' },
  { buffer: buffer3, name: 'video.mp4' }
])

results.forEach((result, index) => {
  if (result.status === 'fulfilled') {
    console.log(`Success: ${result.value.url}`)
  } else {
    console.error(`Failed: ${result.reason}`)
  }
})
```
## 🔧 Advanced Usage
### Error Handling
```javascript
try {
  const result = await cloudku.uploadFile(buffer, 'image.jpg')
  if (result.status === 'error') {
    throw new Error(result.message)
  }
  console.log('Uploaded:', result.url)
} catch (error) {
  console.error('Upload failed:', error.message)
}
```
### Retry Logic
```javascript
async function uploadWithRetry(buffer, name, maxRetries = 3) {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await cloudku.uploadFile(buffer, name)
    } catch (error) {
      if (i === maxRetries - 1) throw error
      // Linear backoff: wait 1s, 2s, 3s...
      await new Promise(resolve => setTimeout(resolve, 1000 * (i + 1)))
    }
  }
}
```
### Adaptive Chunk Size
```javascript
function getOptimalChunkSize(connectionType) {
  const sizes = {
    '4g': 16 * 1024 * 1024,
    'wifi': 10 * 1024 * 1024,
    '3g': 4 * 1024 * 1024,
    'slow-2g': 1 * 1024 * 1024
  }
  return sizes[connectionType] || 8 * 1024 * 1024
}

const connection = navigator.connection?.effectiveType || 'wifi'
const chunkSize = getOptimalChunkSize(connection)
const result = await cloudku.uploadLarge(buffer, 'file.zip', chunkSize)
```
### Controlled Concurrency
```javascript
async function uploadBatchWithLimit(files, limit = 3) {
  const results = []
  for (let i = 0; i < files.length; i += limit) {
    const batch = files.slice(i, i + limit)
    const batchResults = await cloudku.uploadBatch(batch)
    results.push(...batchResults)
  }
  return results
}
```
## ⚙️ Technical Details
### Endpoint Selection
The uploader randomly selects one of two endpoints per request:
- https://cloudkuimages.guru
- https://cloudkuimages-guru.us.itpanel.app
This provides basic load balancing and failover capability.
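A minimal sketch of what that selection amounts to (`pickBase` is internal to the package; this is not its actual source):

```javascript
// Sketch: random choice between the two documented endpoints
const BASES = [
  'https://cloudkuimages.guru',
  'https://cloudkuimages-guru.us.itpanel.app'
]

const pickBase = () => BASES[Math.floor(Math.random() * BASES.length)]
```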
### Upload Flow
Small file (≤ 100MB):
```
Client → pickBase() → POST /upload.php → Response
```
Large file (> 100MB):
```
Client → Generate UUID
       → Split into chunks
       → For each chunk:
           → POST /upload.php (with metadata)
       → POST /upload.php?chunked=1&finalize=1
       → Response
```
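Spelled out as requests, the large-file flow looks roughly like the sketch below. It is illustrative only: the package performs these steps internally, and the field names come from the API reference above.

```javascript
// Illustrative sketch of the chunked flow; not the package's actual source
async function chunkedUploadSketch(base, buffer, filename, chunkSize) {
  const fileId = crypto.randomUUID() // UUID v4 for chunk tracking
  const chunks = Math.ceil(buffer.byteLength / chunkSize)

  for (let i = 0; i < chunks; i++) {
    const form = new FormData()
    form.append('file', new Blob([buffer.slice(i * chunkSize, (i + 1) * chunkSize)]))
    form.append('chunk', String(i))       // 0-based chunk index
    form.append('chunks', String(chunks)) // total chunk count
    form.append('filename', filename)
    form.append('fileId', fileId)
    form.append('size', String(buffer.byteLength))
    await fetch(`${base}/upload.php`, { method: 'POST', body: form })
  }

  // Finalization request assembles the chunks server-side
  const res = await fetch(`${base}/upload.php?chunked=1&finalize=1`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ fileId, filename, chunks })
  })
  return res.json()
}
```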
### Request Headers
All requests include:
```javascript
{
  'User-Agent': 'cloudku-uploader/5.0',
  'Accept': 'application/json'
}
```
### Size Limits
- Single upload: Recommended up to 100MB
- Chunked upload: No hard limit (tested up to 5GB)
- Default chunk size: 8MB (8,388,608 bytes)
- Recommended chunk range: 4MB - 16MB
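Note that `uploadFile` already switches strategies automatically at 100MB. If you want the choice to be explicit, for instance to pass a custom chunk size for big files, a small wrapper works; `smartUpload` below is a hypothetical helper, not part of the package:

```javascript
import cloudku from 'cloudku-uploader'

const SINGLE_LIMIT = 100 * 1024 * 1024 // 100MB single-upload threshold from above

// Hypothetical wrapper: make the strategy explicit and pick a larger chunk for big files
async function smartUpload(buffer, name) {
  if (buffer.byteLength <= SINGLE_LIMIT) {
    return cloudku.uploadFile(buffer, name) // single POST
  }
  // 16MB chunks, at the top of the recommended 4MB-16MB range
  return cloudku.uploadLarge(buffer, name, 16 * 1024 * 1024)
}
```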
## 🌐 Environment Support
### Browsers
- ✅ Chrome 90+
- ✅ Firefox 88+
- ✅ Safari 14+
- ✅ Edge 90+
### Node.js
- ✅ Node.js 14.x
- ✅ Node.js 16.x
- ✅ Node.js 18.x
- ✅ Node.js 20.x
### Frameworks
- ✅ React / Next.js
- ✅ Vue / Nuxt
- ✅ Angular
- ✅ Svelte / SvelteKit
### Module Systems
- ✅ ESM (ES Modules)
- ✅ CommonJS
- ✅ UMD (via bundlers)
## 🛠️ Module Formats
### ESM
```javascript
import cloudku from 'cloudku-uploader'
```
### CommonJS
```javascript
const cloudku = require('cloudku-uploader')
```
### TypeScript
```typescript
import cloudku from 'cloudku-uploader'
import type { UploadResult, FileObject } from 'cloudku-uploader'

const result: UploadResult = await cloudku.uploadFile(buffer, 'file.jpg')
```
## 📝 Type Definitions
```typescript
interface UploadResult {
  status: 'success' | 'error'
  url?: string
  filename?: string
  size?: number
  message?: string
}

interface FileObject {
  buffer: ArrayBuffer
  name: string
}

interface CloudkuUploader {
  uploadFile(buffer: ArrayBuffer, name?: string): Promise<UploadResult>
  uploadLarge(buffer: ArrayBuffer, name?: string, chunkSize?: number): Promise<UploadResult>
  uploadBatch(files: FileObject[]): Promise<PromiseSettledResult<UploadResult>[]>
}
```
## 🤝 Contributing
Contributions are welcome! Please follow these guidelines:
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Commit your changes: `git commit -m 'Add amazing feature'`
4. Push to the branch: `git push origin feature/amazing-feature`