
StorageBucket

new StorageBucket(fetch, bucketName): StorageBucket

Parameters:
  • fetch (FluxbaseFetch)
  • bucketName (string)

Returns: StorageBucket

abortResumableUpload(sessionId): Promise<object>

Abort an in-progress resumable upload

Parameters:
  • sessionId (string): The upload session ID to abort

Returns: Promise<object>
  • error: null | Error
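
For illustration (assuming a `storage` client as in the other examples on this page; the session ID is a placeholder captured from an earlier uploadResumable progress callback):

```ts
// Cancel an in-progress resumable upload by its session ID
const { error } = await storage.from('uploads').abortResumableUpload('session-123');
if (error) console.error('Failed to abort:', error.message);
```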

copy(fromPath, toPath): Promise<object>

Copy a file to a new location

Parameters:
  • fromPath (string): Source file path
  • toPath (string): Destination file path

Returns: Promise<object>
  • data: null | object
  • error: null | Error
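
For illustration (assuming a `storage` client; paths are hypothetical):

```ts
// Copy a report into an archive folder, leaving the original in place
const { data, error } = await storage
  .from('documents')
  .copy('reports/2024.pdf', 'archive/reports/2024.pdf');
if (error) console.error('Copy failed:', error.message);
```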

createSignedUrl(path, options?): Promise<object>

Create a signed URL for temporary access to a file. Optionally include image transformation parameters.

Parameters:
  • path (string): The file path
  • options? (SignedUrlOptions): Signed URL options including expiration and transforms

Returns: Promise<object>
  • data: null | object
  • error: null | Error

// Simple signed URL (1 hour expiry)
const { data, error } = await storage.from('images').createSignedUrl('photo.jpg');

// Signed URL with custom expiry
const { data, error } = await storage.from('images').createSignedUrl('photo.jpg', {
  expiresIn: 7200 // 2 hours
});

// Signed URL with image transformation
const { data, error } = await storage.from('images').createSignedUrl('photo.jpg', {
  expiresIn: 3600,
  transform: {
    width: 400,
    height: 300,
    format: 'webp',
    quality: 85,
    fit: 'cover'
  }
});

download(path): Promise<object>

Download a file from the bucket

Parameters:
  • path (string): The path/key of the file

Returns: Promise<object>
  • data: null | Blob
  • error: null | Error

// Default: returns Blob
const { data: blob } = await storage.from('bucket').download('file.pdf');

// Streaming: returns { stream, size } for progress tracking
const { data } = await storage.from('bucket').download('large.json', { stream: true });
if (data) {
  console.log(`File size: ${data.size} bytes`);
  // Process data.stream...
}

download(path, options): Promise<object>

Overload: when options.stream is true, resolves with StreamDownloadData ({ stream, size }) instead of a Blob.

Parameters:
  • path (string)
  • options (object)
  • options.signal? (AbortSignal)
  • options.stream (true)
  • options.timeout? (number)

Returns: Promise<object>
  • data: null | StreamDownloadData
  • error: null | Error

download(path, options): Promise<object>

Overload: when options.stream is false or omitted, resolves with a Blob.

Parameters:
  • path (string)
  • options (object)
  • options.signal? (AbortSignal)
  • options.stream? (false)
  • options.timeout? (number)

Returns: Promise<object>
  • data: null | Blob
  • error: null | Error

downloadResumable(path, options?): Promise<object>

Download a file with resumable chunked downloads for large files. Returns a ReadableStream that abstracts the chunking internally.

Features:

  • Downloads file in chunks using HTTP Range headers
  • Automatically retries failed chunks with exponential backoff
  • Reports progress via callback
  • Falls back to regular streaming if Range not supported

Parameters:
  • path (string): The file path within the bucket
  • options? (ResumableDownloadOptions): Download options including chunk size, retries, and progress callback

Returns: Promise<object> with a ReadableStream and file size (the consumer doesn't need to know about chunking)
  • data: null | ResumableDownloadData
  • error: null | Error

const { data, error } = await storage.from('bucket').downloadResumable('large.json', {
  chunkSize: 5 * 1024 * 1024, // 5MB chunks
  maxRetries: 3,
  onProgress: (progress) => console.log(`${progress.percentage}% complete`)
});

if (data) {
  console.log(`File size: ${data.size} bytes`);
  // Process data.stream...
}

getPublicUrl(path): object

Get a public URL for a file

Parameters:
  • path (string): The file path

Returns: object
  • data: object
  • data.publicUrl: string
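
A quick illustration (assuming a `storage` client as in the other examples on this page):

```ts
// Returns synchronously: building a public URL requires no network call
const { data } = storage.from('images').getPublicUrl('photo.jpg');
console.log(data.publicUrl);
```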

getResumableUploadStatus(sessionId): Promise<object>

Get the status of a resumable upload session

Parameters:
  • sessionId (string): The upload session ID to check

Returns: Promise<object>
  • data: null | ChunkedUploadSession
  • error: null | Error
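
For illustration (the session ID is a placeholder; it would come from an uploadResumable progress callback):

```ts
// Inspect the ChunkedUploadSession before deciding whether to resume or abort
const { data, error } = await storage.from('uploads').getResumableUploadStatus('session-123');
if (data) console.log(data);
```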

getTransformUrl(path, transform): string

Get a public URL for a file with image transformations applied. Only works for image files (JPEG, PNG, WebP, GIF, AVIF, etc.).

Parameters:
  • path (string): The file path
  • transform (TransformOptions): Transformation options (width, height, format, quality, fit)

Returns: string

// Get a 300x200 WebP thumbnail
const url = storage.from('images').getTransformUrl('photo.jpg', {
  width: 300,
  height: 200,
  format: 'webp',
  quality: 85,
  fit: 'cover'
});

// Get a resized image maintaining aspect ratio
const url = storage.from('images').getTransformUrl('photo.jpg', {
  width: 800,
  format: 'webp'
});

list(pathOrOptions?, maybeOptions?): Promise<object>

List files in the bucket. Supports both Supabase-style list(path, options) and Fluxbase-style list(options).

Parameters:
  • pathOrOptions? (string | ListOptions): The folder path or list options
  • maybeOptions? (ListOptions): List options when first param is a path

Returns: Promise<object>
  • data: null | FileObject[]
  • error: null | Error
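
Both calling styles, for illustration (assuming a `storage` client; ListOptions fields are not shown here):

```ts
// Supabase-style: folder path first
const { data, error } = await storage.from('documents').list('reports');

// Fluxbase-style: no path argument, default options
const { data: all } = await storage.from('documents').list();
```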

listShares(path): Promise<object>

List users a file is shared with (RLS)

Parameters:
  • path (string): The file path

Returns: Promise<object>
  • data: null | FileShare[]
  • error: null | Error
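
For illustration (assuming a `storage` client; the path is hypothetical):

```ts
// Each entry is a FileShare describing a user the file is shared with
const { data: shares, error } = await storage.from('documents').listShares('reports/2024.pdf');
if (shares) console.log(`Shared with ${shares.length} user(s)`);
```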

move(fromPath, toPath): Promise<object>

Move a file to a new location

Parameters:
  • fromPath (string): Current file path
  • toPath (string): New file path

Returns: Promise<object>
  • data: null | object
  • error: null | Error
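
For illustration (assuming a `storage` client):

```ts
// Rename a file by moving it within the same folder
const { error } = await storage.from('documents').move('draft.docx', 'final.docx');
if (error) console.error('Move failed:', error.message);
```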

remove(paths): Promise<object>

Remove files from the bucket

Parameters:
  • paths (string[]): Array of file paths to remove

Returns: Promise<object>
  • data: null | FileObject[]
  • error: null | Error
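
For illustration (assuming a `storage` client; paths are hypothetical):

```ts
// Remove several files in one call; data lists the removed FileObjects
const { data, error } = await storage.from('temp').remove(['a.tmp', 'b.tmp', 'logs/c.tmp']);
if (error) console.error('Remove failed:', error.message);
```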

revokeShare(path, userId): Promise<object>

Revoke file access from a user (RLS)

Parameters:
  • path (string): The file path
  • userId (string): The user ID to revoke access from

Returns: Promise<object>
  • data: null
  • error: null | Error
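
For illustration (the user ID is a placeholder):

```ts
const { error } = await storage.from('documents').revokeShare('reports/2024.pdf', 'user-uuid');
if (!error) console.log('Access revoked');
```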

share(path, options): Promise<object>

Share a file with another user (RLS)

Parameters:
  • path (string): The file path
  • options (ShareFileOptions): Share options (userId and permission)

Returns: Promise<object>
  • data: null
  • error: null | Error
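
For illustration (the user ID is a placeholder, and 'read' is an assumed permission value; see ShareFileOptions for the accepted set):

```ts
const { error } = await storage.from('documents').share('reports/2024.pdf', {
  userId: 'user-uuid',
  permission: 'read', // assumed value; check ShareFileOptions
});
```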

upload(path, file, options?): Promise<object>

Upload a file to the bucket

Parameters:
  • path (string): The path/key for the file
  • file (Blob | ArrayBufferView | ArrayBuffer | File): The file to upload (File, Blob, ArrayBuffer, or ArrayBufferView like Uint8Array)
  • options? (UploadOptions): Upload options

Returns: Promise<object>
  • data: null | object
  • error: null | Error
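
For illustration (assuming a `storage` client):

```ts
// Upload a small Blob; larger files are better served by uploadLargeFile or uploadResumable
const blob = new Blob(['hello world'], { type: 'text/plain' });
const { data, error } = await storage.from('notes').upload('hello.txt', blob);
```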

uploadLargeFile(path, file, options?): Promise<object>

Upload a large file using streaming for reduced memory usage. This is a convenience method that converts a File or Blob to a stream.

Parameters:
  • path (string): The path/key for the file
  • file (Blob | File): The File or Blob to upload
  • options? (StreamUploadOptions): Upload options

Returns: Promise<object>
  • data: null | object
  • error: null | Error

const file = new File([...], 'large-video.mp4');
const { data, error } = await storage
  .from('videos')
  .uploadLargeFile('video.mp4', file, {
    contentType: 'video/mp4',
    onUploadProgress: (p) => console.log(`${p.percentage}% complete`),
  });

uploadResumable(path, file, options?): Promise<object>

Upload a large file with resumable chunked uploads.

Features:

  • Uploads file in chunks for reliability
  • Automatically retries failed chunks with exponential backoff
  • Reports progress via callback with chunk-level granularity
  • Can resume interrupted uploads using session ID

Parameters:
  • path (string): The file path within the bucket
  • file (Blob | File): The File or Blob to upload
  • options? (ResumableUploadOptions): Upload options including chunk size, retries, and progress callback

Returns: Promise<object> with upload result and file info
  • data: null | object
  • error: null | Error

const { data, error } = await storage.from('uploads').uploadResumable('large.zip', file, {
  chunkSize: 5 * 1024 * 1024, // 5MB chunks
  maxRetries: 3,
  onProgress: (p) => {
    console.log(`${p.percentage}% (chunk ${p.currentChunk}/${p.totalChunks})`);
    console.log(`Speed: ${(p.bytesPerSecond / 1024 / 1024).toFixed(2)} MB/s`);
    console.log(`Session ID (for resume): ${p.sessionId}`);
  }
});

// To resume an interrupted upload:
const { data, error } = await storage.from('uploads').uploadResumable('large.zip', file, {
  resumeSessionId: 'previous-session-id',
});

uploadStream(path, stream, size, options?): Promise<object>

Upload a file using streaming for reduced memory usage. This method bypasses FormData buffering and streams data directly to the server. Ideal for large files where memory efficiency is important.

Parameters:
  • path (string): The path/key for the file
  • stream (ReadableStream<Uint8Array>): ReadableStream of the file data
  • size (number): The size of the file in bytes (required for the Content-Length header)
  • options? (StreamUploadOptions): Upload options

Returns: Promise<object>
  • data: null | object
  • error: null | Error

// Upload from a File's stream
const file = new File([...], 'large-video.mp4');
const { data, error } = await storage
  .from('videos')
  .uploadStream('video.mp4', file.stream(), file.size, {
    contentType: 'video/mp4',
  });

// Upload from a fetch response stream
const response = await fetch('https://example.com/data.zip');
const size = parseInt(response.headers.get('content-length') || '0');
const { data, error } = await storage
  .from('files')
  .uploadStream('data.zip', response.body!, size, {
    contentType: 'application/zip',
  });