StorageBucket

new StorageBucket(fetch, bucketName): StorageBucket

| Parameter | Type |
| --- | --- |
| fetch | FluxbaseFetch |
| bucketName | string |

StorageBucket

abortResumableUpload(sessionId): Promise<{ error: Error | null; }>

Abort an in-progress resumable upload

| Parameter | Type | Description |
| --- | --- | --- |
| sessionId | string | The upload session ID to abort |

Promise<{ error: Error | null; }>
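
A minimal sketch of aborting a stalled upload; the bucket name and session ID are illustrative (the session ID would normally come from an uploadResumable progress callback):

// Session ID captured earlier, e.g. from an onProgress callback
const sessionId = 'previous-session-id';
const { error } = await storage.from('uploads').abortResumableUpload(sessionId);
if (error) console.error(`Failed to abort: ${error.message}`);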


copy(fromPath, toPath): Promise<{ data: { path: string; } | null; error: Error | null; }>

Copy a file to a new location

| Parameter | Type | Description |
| --- | --- | --- |
| fromPath | string | Source file path |
| toPath | string | Destination file path |

Promise<{ data: { path: string; } | null; error: Error | null; }>
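
A minimal sketch, with illustrative bucket and file names:

// Duplicate a file; the destination path is returned on success
const { data, error } = await storage.from('images').copy('photo.jpg', 'backup/photo.jpg');
if (data) console.log(`Copied to ${data.path}`);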


createSignedUrl(path, options?): Promise<{ data: { signedUrl: string; } | null; error: Error | null; }>

Create a signed URL for temporary access to a file. Optionally include image transformation parameters.

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path |
| options? | SignedUrlOptions | Signed URL options including expiration and transforms |

Promise<{ data: { signedUrl: string; } | null; error: Error | null; }>

// Simple signed URL (1 hour expiry)
const { data, error } = await storage.from('images').createSignedUrl('photo.jpg');

// Signed URL with custom expiry
const { data, error } = await storage.from('images').createSignedUrl('photo.jpg', {
  expiresIn: 7200 // 2 hours
});

// Signed URL with image transformation
const { data, error } = await storage.from('images').createSignedUrl('photo.jpg', {
  expiresIn: 3600,
  transform: {
    width: 400,
    height: 300,
    format: 'webp',
    quality: 85,
    fit: 'cover'
  }
});

download(path): Promise<{ data: Blob | null; error: Error | null; }>

Download a file from the bucket

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The path/key of the file |

Promise<{ data: Blob | null; error: Error | null; }>

// Default: returns Blob
const { data: blob } = await storage.from('bucket').download('file.pdf');

// Streaming: returns { stream, size } for progress tracking
const { data } = await storage.from('bucket').download('large.json', { stream: true });
console.log(`File size: ${data.size} bytes`);
// Process data.stream...

download(path, options): Promise<{ data: StreamDownloadData | null; error: Error | null; }>

Download a file from the bucket

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The path/key of the file |
| options | { signal?: AbortSignal; stream: true; timeout?: number; } | - |
| options.signal? | AbortSignal | - |
| options.stream | true | - |
| options.timeout? | number | - |

Promise<{ data: StreamDownloadData | null; error: Error | null; }>


download(path, options): Promise<{ data: Blob | null; error: Error | null; }>

Download a file from the bucket

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The path/key of the file |
| options | { signal?: AbortSignal; stream?: false; timeout?: number; } | - |
| options.signal? | AbortSignal | - |
| options.stream? | false | - |
| options.timeout? | number | - |

Promise<{ data: Blob | null; error: Error | null; }>


downloadResumable(path, options?): Promise<{ data: ResumableDownloadData | null; error: Error | null; }>

Download a file with resumable chunked downloads for large files. Returns a ReadableStream that abstracts the chunking internally.

Features:

  • Downloads file in chunks using HTTP Range headers
  • Automatically retries failed chunks with exponential backoff
  • Reports progress via callback
  • Falls back to regular streaming if Range not supported

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path within the bucket |
| options? | ResumableDownloadOptions | Download options including chunk size, retries, and progress callback |

Promise<{ data: ResumableDownloadData | null; error: Error | null; }>

A ReadableStream and the file size (the consumer doesn't need to know about the chunking)

const { data, error } = await storage.from('bucket').downloadResumable('large.json', {
  chunkSize: 5 * 1024 * 1024, // 5MB chunks
  maxRetries: 3,
  onProgress: (progress) => console.log(`${progress.percentage}% complete`)
});
if (data) {
  console.log(`File size: ${data.size} bytes`);
  // Process data.stream...
}

getPublicUrl(path): object

Get a public URL for a file

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path |

object

| Name | Type |
| --- | --- |
| data | object |
| data.publicUrl | string |
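
A minimal sketch (note that this method is synchronous, so no await is needed); the bucket and file names are illustrative:

const { data } = storage.from('images').getPublicUrl('photo.jpg');
console.log(data.publicUrl);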

getResumableUploadStatus(sessionId): Promise<{ data: ChunkedUploadSession | null; error: Error | null; }>

Get the status of a resumable upload session

| Parameter | Type | Description |
| --- | --- | --- |
| sessionId | string | The upload session ID to check |

Promise<{ data: ChunkedUploadSession | null; error: Error | null; }>
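
A minimal sketch for inspecting a session before deciding to resume or abort; the session ID is illustrative, and since the fields of ChunkedUploadSession are not listed here, the example just logs the object:

const { data: session, error } = await storage
  .from('uploads')
  .getResumableUploadStatus('previous-session-id');
if (session) {
  // Inspect the session, then resume via uploadResumable's resumeSessionId option
  console.log(session);
}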


getTransformUrl(path, transform): string

Get a public URL for a file with image transformations applied. Only works for image files (JPEG, PNG, WebP, GIF, AVIF, etc.).

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path |
| transform | TransformOptions | Transformation options (width, height, format, quality, fit) |

string

// Get a 300x200 WebP thumbnail
const url = storage.from('images').getTransformUrl('photo.jpg', {
  width: 300,
  height: 200,
  format: 'webp',
  quality: 85,
  fit: 'cover'
});

// Get a resized image maintaining aspect ratio
const url = storage.from('images').getTransformUrl('photo.jpg', {
  width: 800,
  format: 'webp'
});

list(pathOrOptions?, maybeOptions?): Promise<{ data: FileObject[] | null; error: Error | null; }>

List files in the bucket. Supports both Supabase-style list(path, options) and Fluxbase-style list(options).

| Parameter | Type | Description |
| --- | --- | --- |
| pathOrOptions? | string \| ListOptions | The folder path or list options |
| maybeOptions? | ListOptions | List options when first param is a path |

Promise<{ data: FileObject[] | null; error: Error | null; }>
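
A minimal sketch of both calling styles, using only the documented parameters; bucket and folder names are illustrative:

// Supabase-style: list files under a folder path
const { data, error } = await storage.from('images').list('avatars');

// Fluxbase-style: omit the path and pass only options (or nothing)
const { data: all } = await storage.from('images').list();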


listShares(path): Promise<{ data: FileShare[] | null; error: Error | null; }>

List users a file is shared with (RLS)

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path |

Promise<{ data: FileShare[] | null; error: Error | null; }>
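
A minimal sketch with illustrative names; the shape of FileShare is not listed here, so the example just logs each entry:

const { data: shares, error } = await storage.from('documents').listShares('report.pdf');
if (shares) {
  for (const share of shares) console.log(share);
}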


move(fromPath, toPath): Promise<{ data: { message: string; } | null; error: Error | null; }>

Move a file to a new location

| Parameter | Type | Description |
| --- | --- | --- |
| fromPath | string | Current file path |
| toPath | string | New file path |

Promise<{ data: { message: string; } | null; error: Error | null; }>
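
A minimal sketch, with illustrative paths:

const { data, error } = await storage
  .from('documents')
  .move('drafts/report.pdf', 'published/report.pdf');
if (data) console.log(data.message);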


remove(paths): Promise<{ data: FileObject[] | null; error: Error | null; }>

Remove files from the bucket

| Parameter | Type | Description |
| --- | --- | --- |
| paths | string[] | Array of file paths to remove |

Promise<{ data: FileObject[] | null; error: Error | null; }>
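
A minimal sketch, with illustrative paths:

const { data, error } = await storage.from('images').remove(['old/a.jpg', 'old/b.jpg']);
if (error) console.error(`Remove failed: ${error.message}`);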


revokeShare(path, userId): Promise<{ data: null; error: Error | null; }>

Revoke file access from a user (RLS)

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path |
| userId | string | The user ID to revoke access from |

Promise<{ data: null; error: Error | null; }>
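
A minimal sketch; the path and user ID are illustrative:

const { error } = await storage.from('documents').revokeShare('report.pdf', 'user-123');
if (!error) console.log('Access revoked');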


share(path, options): Promise<{ data: null; error: Error | null; }>

Share a file with another user (RLS)

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path |
| options | ShareFileOptions | Share options (userId and permission) |

Promise<{ data: null; error: Error | null; }>
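
A minimal sketch, assuming ShareFileOptions takes the userId and permission described above; the 'read' permission value is illustrative:

const { error } = await storage.from('documents').share('report.pdf', {
  userId: 'user-123',
  permission: 'read', // illustrative permission value
});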


upload(path, file, options?): Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>

Upload a file to the bucket

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The path/key for the file |
| file | ArrayBuffer \| Blob \| File \| ArrayBufferView<ArrayBufferLike> | The file to upload (File, Blob, ArrayBuffer, or ArrayBufferView like Uint8Array) |
| options? | UploadOptions | Upload options |

Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>
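
A minimal sketch of the two most common inputs (a File and raw bytes); bucket and file names are illustrative:

// Upload a File (e.g. from an <input type="file"> element)
const file = new File(['hello world'], 'photo.jpg');
const { data, error } = await storage.from('images').upload('avatars/photo.jpg', file);
if (data) console.log(`Uploaded to ${data.fullPath} (id: ${data.id})`);

// Upload raw bytes as an ArrayBufferView
const bytes = new TextEncoder().encode('hello');
await storage.from('files').upload('notes/hello.txt', bytes);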


uploadLargeFile(path, file, options?): Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>

Upload a large file using streaming for reduced memory usage. This is a convenience method that converts a File or Blob to a stream.

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The path/key for the file |
| file | Blob \| File | The File or Blob to upload |
| options? | StreamUploadOptions | Upload options |

Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>

const file = new File([/* ... */], 'large-video.mp4');
const { data, error } = await storage
  .from('videos')
  .uploadLargeFile('video.mp4', file, {
    contentType: 'video/mp4',
    onUploadProgress: (p) => console.log(`${p.percentage}% complete`),
  });

uploadResumable(path, file, options?): Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>

Upload a large file with resumable chunked uploads.

Features:

  • Uploads file in chunks for reliability
  • Automatically retries failed chunks with exponential backoff
  • Reports progress via callback with chunk-level granularity
  • Can resume interrupted uploads using session ID

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The file path within the bucket |
| file | Blob \| File | The File or Blob to upload |
| options? | ResumableUploadOptions | Upload options including chunk size, retries, and progress callback |

Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>

Upload result with file info

const { data, error } = await storage.from('uploads').uploadResumable('large.zip', file, {
  chunkSize: 5 * 1024 * 1024, // 5MB chunks
  maxRetries: 3,
  onProgress: (p) => {
    console.log(`${p.percentage}% (chunk ${p.currentChunk}/${p.totalChunks})`);
    console.log(`Speed: ${(p.bytesPerSecond / 1024 / 1024).toFixed(2)} MB/s`);
    console.log(`Session ID (for resume): ${p.sessionId}`);
  }
});

// To resume an interrupted upload:
const { data, error } = await storage.from('uploads').uploadResumable('large.zip', file, {
  resumeSessionId: 'previous-session-id',
});

uploadStream(path, stream, size, options?): Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>

Upload a file using streaming for reduced memory usage. This method bypasses FormData buffering and streams data directly to the server. Ideal for large files where memory efficiency is important.

| Parameter | Type | Description |
| --- | --- | --- |
| path | string | The path/key for the file |
| stream | ReadableStream<Uint8Array<ArrayBufferLike>> | ReadableStream of the file data |
| size | number | The size of the file in bytes (required for Content-Length header) |
| options? | StreamUploadOptions | Upload options |

Promise<{ data: { fullPath: string; id: string; path: string; } | null; error: Error | null; }>

// Upload from a File's stream
const file = new File([/* ... */], 'large-video.mp4');
const { data, error } = await storage
  .from('videos')
  .uploadStream('video.mp4', file.stream(), file.size, {
    contentType: 'video/mp4',
  });

// Upload from a fetch response stream
const response = await fetch('https://example.com/data.zip');
const size = parseInt(response.headers.get('content-length') || '0', 10);
const { data, error } = await storage
  .from('files')
  .uploadStream('data.zip', response.body!, size, {
    contentType: 'application/zip',
  });