A media storage helper, allowing storage to Firebase Storage, Cloudflare R2, AWS S3, and more buckets.

A powerful and modular Node.js media storage library that provides a unified interface for managing file uploads, integrity verification, and deletions across multiple providers, including Cloudflare R2, Firebase Storage (GCS), and Google Drive.
---
- 🚀 Unified API for multiple storage providers (R2, Firebase, Drive)
- 🔐 Automatic integrity verification with Subresource Integrity (SRI)
- 🧠 Smart caching and checksum validation (sha256)
- 🧩 Pluggable architecture for extending storage backends
- ⚙️ Strongly typed (TypeScript)
---
```bash
npm install universal_media_storage
```

or with yarn:

```bash
yarn add universal_media_storage
```
---
| Provider | Module | Notes |
|-----------|---------|-------|
| Cloudflare R2 | `CloudFlareR2StorageService` | S3-compatible; uses `@aws-sdk/client-s3` |
| Firebase Storage (GCS) | `FirebaseStorageService` | Uses `@google-cloud/storage` |
| Google Drive | `GoogleDriveStorageService` | Uses `googleapis` Drive v3 |
---
All providers are exposed through `MediaStorage` with a unified `uploadFile` API. For deletions, you can either call the provider's `deleteFile` method directly or use the `deleteFileFromStorage` helper with a `StorageResult.locator`.

```ts
interface UploadParams {
  file: {
    name: string;
    data: Buffer;
    mimetype: string;
    uri?: string;
  };
  uploadPath?: string;
  parentPathIds?: string[];
  cacheControl?: string;
}
```

Example:

```ts
import dotenv from "dotenv";
dotenv.config();
import express from "express";
import { S3Client } from "@aws-sdk/client-s3";
import { CloudFlareR2StorageService } from "../src/services/cloudFlareR2Storage";
import { FirebaseStorageService } from "../src/services/firebaseStorage";
import { GoogleDriveStorageService } from "../src/services/googleDriveStorage";
import { MediaStorage } from "../src/MediaStorage";

function init() {
  const fb_storage = new MediaStorage({
    config: {
      firebase_service_account_key_base64: process.env.FIREBASE_SERVICE_ACCOUNT_BASE64 || '',
      firebase_storage_bucket: process.env.FIREBASE_STORAGE_BUCKET || '',
    },
    service: new FirebaseStorageService(),
  });

  const r2_storage = new MediaStorage({
    service: new CloudFlareR2StorageService(),
    config: {
      r2_account_id: process.env.R2_ACCOUNT_ID || '',
      r2_bucket: process.env.R2_BUCKET || '',
      r2_access_key_id: process.env.R2_ACCESS_KEY_ID || '',
      r2_access_key_secret: process.env.R2_ACCESS_KEY_SECRET || '',
      r2_cdn_base: process.env.R2_CDN_BASE || '',
    },
  });

  const gd_storage = new MediaStorage({
    service: new GoogleDriveStorageService(),
    config: {
      gcp_service_account_key_base64: process.env.GCP_SERVICE_ACCOUNT_KEY_BASE64 || '',
      // Note: this is usually not needed; if you experience a hang, try removing it.
      // Scopes are already handled internally.
      gcp_drive_scopes: process.env.GCP_DRIVE_SCOPES || '',
    },
  });

  const r2Client = new S3Client({
    region: 'auto',
    endpoint: `https://${process.env.R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
    credentials: {
      accessKeyId: process.env.R2_ACCESS_KEY_ID || '',
      secretAccessKey: process.env.R2_ACCESS_KEY_SECRET || '',
    },
  });

  gd_storage.uploadFile({
    // For Google Drive, you can specify parent folder IDs to organize files
    parentPathIds: [
      process.env.PARENT_FOLDER_ID || '',
    ],
    file: {
      name: 'test.txt',
      mimetype: 'text/plain',
      data: Buffer.from('Hello, world!'),
    },
    uploadPath: 'test',
  }).then(result => {
    console.log('File uploaded:', result);
    if (process.env.DELETE_AFTER_UPLOAD === 'true') {
      const fileId = result.locator?.provider === 'drive' ? result.locator.fileId : undefined;
      if (fileId) {
        gd_storage.deleteFile(fileId).then(() => {
          console.log('Drive file deleted');
        }).catch(err => console.error('Drive delete error:', err));
      }
    }
  }).catch(err => {
    console.error('Upload error:', err);
  });

  r2_storage.uploadFile({
    file: {
      name: 'delete-me.txt',
      mimetype: 'text/plain',
      data: Buffer.from('Delete me after upload'),
    },
    uploadPath: 'example',
  }).then(async result => {
    console.log('R2 file uploaded:', result);
    if (process.env.DELETE_AFTER_UPLOAD === 'true') {
      const key = result.locator?.provider === 'r2' ? result.locator.key : undefined;
      if (key) {
        await r2_storage.deleteFile(undefined, key);
        console.log('R2 file deleted');
      }
    }
  }).catch(err => {
    console.error('R2 upload error:', err);
  });
}

init();
```

Firebase app reuse (safe to call multiple times in the same process):

```ts
import { FirebaseStorageService } from "../src/services/firebaseStorage";

const serviceA = new FirebaseStorageService();
const serviceB = new FirebaseStorageService(); // reuses the existing Firebase app
```

---
🔐 Integrity Verification
Every upload generates a sha256 SRI hash that can later be validated using the universal verifier:
```ts
import { verifyStorage } from 'universal_media_storage';

const outcome = await verifyStorage(result, { r2: { s3: new S3Client() } });
console.log(outcome);
```

Sample output:

```json
{
"exists": true,
"integrityMatches": true,
"sizeMatches": true
}
```
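A minimal guard over this outcome might look like the following hedged sketch (the `VerifyOutcome` shape is inferred from the sample output above, not exported by the library):

```typescript
// Hedged sketch: treat a stored file as healthy only when all three checks pass.
type VerifyOutcome = {
  exists: boolean;
  integrityMatches: boolean;
  sizeMatches: boolean;
};

function isHealthy(outcome: VerifyOutcome): boolean {
  return outcome.exists && outcome.integrityMatches && outcome.sizeMatches;
}

console.log(isHealthy({ exists: true, integrityMatches: true, sizeMatches: true }));  // true
console.log(isHealthy({ exists: true, integrityMatches: false, sizeMatches: true })); // false
```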
---

📤 Uploading Files
The `uploadFile` method returns a `StorageResult` with URLs, integrity, size, and a provider-specific locator that includes `fileId` and `filePath`.

Cloudflare R2:

```ts
const r2Result = await r2_storage.uploadFile({
file: {
name: 'avatar.png',
mimetype: 'image/png',
data: Buffer.from('...'),
},
uploadPath: 'profiles/user123',
});
```

Firebase Storage:

```ts
const fbResult = await fb_storage.uploadFile({
file: {
name: 'avatar.png',
mimetype: 'image/png',
data: Buffer.from('...'),
},
uploadPath: 'profiles/user123',
});
```

Google Drive:

```ts
const gdResult = await gd_storage.uploadFile({
parentPathIds: [''],
file: {
name: 'avatar.png',
mimetype: 'image/png',
data: Buffer.from('...'),
},
});
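// Each call resolves to a StorageResult. The result.locator is provider-specific,
// e.g. { provider: 'r2', key: ... } for R2 or { provider: 'drive', fileId: ... }
// for Drive, and is what the deletion helpers below expect.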
```

🗑️ Deleting Files

You can delete by `fileId` or `filePath`, depending on the provider.

Cloudflare R2:

```ts
await r2_storage.deleteFile(undefined, r2Result.locator?.filePath);
```

Firebase Storage:

```ts
await fb_storage.deleteFile(undefined, fbResult.locator?.filePath);
```

Google Drive:

```ts
await gd_storage.deleteFile(gdResult.locator?.fileId);
```

🔗 Presigned URLs (Cloudflare R2)

For private R2 buckets, generate a time-limited presigned URL for downloads. Use the object `key` returned from `uploadFile` (or stored in your DB).

```ts
const r2Service = r2_storage.getStorageService() as CloudFlareR2StorageService;
const key = r2Result.locator?.provider === 'r2' ? r2Result.locator.key : '';
const downloadUrl = await r2Service.getPresignedUrl(key, 600); // 10 minutes
```

You can also create a presigned PUT URL for direct client uploads:

```ts
const uploadUrl = await r2Service.getPresignedUploadUrl(key, "text/plain", 600);
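// A client can then upload directly with this URL. Hedged sketch using plain fetch;
// the Content-Type of the PUT request must match the one the URL was signed for:
//   await fetch(uploadUrl, { method: 'PUT', headers: { 'Content-Type': 'text/plain' }, body: data });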
```

🧹 Universal Deletion Helper

Use the `deleteFileFromStorage` helper with the locator returned by `uploadFile`.

```ts
import { deleteFileFromStorage } from 'universal_media_storage';
import { S3Client } from '@aws-sdk/client-s3';

const r2Client = new S3Client({
  region: 'auto',
  endpoint: `https://${process.env.R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
});
await deleteFileFromStorage(result, { r2: { s3: r2Client } });
```

---

⚙️ Environment Configuration
Environment variables are managed by the built-in `EnvironmentRegister` class. You can register them at runtime or load them from `process.env`.

```ts
import EnvironmentRegister from 'universal_media_storage/register';

const env = EnvironmentRegister.getInstance();
env.loadFromProcessEnv();
```

For Google Drive, you can authenticate with either a service account (`GCP_SERVICE_ACCOUNT_KEY_BASE64`) or a user OAuth token (`GCP_OAUTH_ACCESS_TOKEN`, optionally with a refresh token plus client id/secret).

🔑 Google Drive Authentication
Service account (Shared Drives only):
- Create a service account in Google Cloud Console and generate a JSON key.
- Base64-encode the JSON file and set `GCP_SERVICE_ACCOUNT_KEY_BASE64`.
- Share the target Shared Drive or folder with the service account email.

User OAuth (My Drive quota):
- Create an OAuth client (Desktop/Web) in Google Cloud Console and note the client id/secret.
- Run an OAuth consent flow with the Drive scope you need (e.g. `https://www.googleapis.com/auth/drive.file`).
- Use the resulting access token as `GCP_OAUTH_ACCESS_TOKEN`. For long-lived use, request offline access and store the refresh token as `GCP_OAUTH_REFRESH_TOKEN`.
📄 Example .env

```bash
R2_ACCOUNT_ID=your-account
R2_BUCKET=media
R2_ACCESS_KEY_ID=xxxx
R2_ACCESS_KEY_SECRET=xxxx
R2_CDN_BASE=https://cdn.example.com
FIREBASE_STORAGE_BUCKET=my-app.appspot.com
FIREBASE_SERVICE_ACCOUNT_BASE64=...base64...
GCP_SERVICE_ACCOUNT_KEY_BASE64=...base64...
GCP_DRIVE_SCOPES=https://www.googleapis.com/auth/drive.file
GCP_OAUTH_ACCESS_TOKEN=ya29... # optional alternative to service account
GCP_OAUTH_REFRESH_TOKEN=1//... # optional, enables refresh when paired with client id/secret
GCP_OAUTH_CLIENT_ID=...apps.googleusercontent.com
GCP_OAUTH_CLIENT_SECRET=...
```

---

🧪 Testing

Run the Jest test suite:

```bash
npm test
```

Key tests:
- `cloudflareR2.spec.ts`: verifies R2 upload, integrity, and race conditions
- `firebaseStorage.spec.ts`: validates Firebase metadata and size checks
- `googleDriveStorage.spec.ts`: tests Drive uploads and mock API verification
- `environmentRegister.spec.ts`: ensures correct env registration and immutability
- `baseStorage.spec.ts`: validates integrity computation and result normalization
---
🧱 Project Structure

```
media_storage/
├── src/
│   ├── register.ts                      # Environment config
│   ├── services/                        # Provider implementations
│   │   ├── cloudFlareR2Storage.ts
│   │   ├── firebaseStorage.ts
│   │   └── googleDriveStorage.ts
│   ├── utils/                           # Common utilities
│   │   ├── encryptions.ts
│   │   ├── deleteFileFromStorage.ts
│   │   ├── integrity.ts
│   │   ├── universalIntegrityVerifier.ts
│   │   └── validate.ts
│   └── types.ts                         # Type definitions
│
├── __tests__/                           # Jest test suite
├── package.json
└── README.md
```

MIT License © 2025 [Rookie Players]