
Upload From Your Local

BlendVision provides the capability to upload video files directly from your local device for encoding jobs.

File Type | Accepted Formats                       | Maximum Size
Video     | avi, mpg, mp4, ts, m2ts, mov, mkv, wmv | 70 GB

To keep uploads efficient and smooth, partition your file into the specified parts and upload each part to its provided presigned_url with the PUT method. Each part is a contiguous portion of the file's data. You can upload these parts independently and in any order.

Uploading a large file requires these four steps (a short sketch of step 2 follows the list):

  1. Initiate multipart upload for your file
  2. Logically split your file into byte-range chunks
  3. Upload the chunks in parallel
  4. Assemble the chunks to create the larger file
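
For illustration, here is a minimal sketch of step 2: slicing a file into byte-range chunks in the browser. The splitIntoChunks helper and its partCount parameter are placeholders; in practice the part count matches the number of presigned URLs returned by the upload API described below.

// Minimal sketch: split a File into byte-range chunks (step 2).
// partCount is a placeholder; in practice it matches the number of
// presigned URLs returned by the upload API.
function splitIntoChunks(file: File, partCount: number): Blob[] {
  const chunkSize = Math.ceil(file.size / partCount);
  const chunks: Blob[] = [];
  for (let i = 0; i < partCount; i++) {
    // Blob.slice clamps the end offset to the file size for the last chunk.
    chunks.push(file.slice(chunkSize * i, chunkSize * (i + 1)));
  }
  return chunks;
}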

Multipart Upload

Here are the steps to complete this process:

Create Multipart Upload

An upload URL is an on-demand, presigned location where you can place your file.

To obtain the URL for uploading a file, make a POST request to the following API with the name and size of your file specified in the request body:

POST /bv/cms/v1/library/files:upload

Here is an example of the request body:

"file":{
"type":"FILE_TYPE_VIDEO",
"source":"FILE_SOURCE_ADD_VOD",
"name":"file_name",
"size":"file_size_in_bytes"
}

When you request to upload a file that exceeds 100 MB, multiple URLs are provided; split your file into pieces and upload each piece to its own URL (5 MiB–5 GiB per piece, with no limit on the last piece).

Here's an example of the response:

{
  "file": {
    "id": "library-id",
    "upload_data": {
      "id": "upload-id",
      "parts": [
        {
          "part_number": 0,
          "presigned_url": "string"
        },
        {
          "part_number": 1,
          "presigned_url": "string"
        }
      ]
    }
  }
}

In the response:

  • The id within the file object represents the library ID of the file.
  • The id inside the upload_data object denotes the upload ID of the file.
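
Putting the request and response together, here is a minimal sketch of creating a multipart upload with fetch. The base URL, the Bearer authentication header, and the createMultipartUpload helper name are assumptions for illustration; use the host and authentication scheme of your own account.

// Minimal sketch: create a multipart upload and read the IDs and presigned URLs.
// The base URL and Authorization header below are assumptions.
async function createMultipartUpload(name: string, sizeInBytes: number, apiToken: string) {
  const response = await fetch('https://api.example.com/bv/cms/v1/library/files:upload', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiToken}`, // assumed auth scheme
    },
    body: JSON.stringify({
      file: {
        type: 'FILE_TYPE_VIDEO',
        source: 'FILE_SOURCE_ADD_VOD',
        name,
        size: sizeInBytes.toString(),
      },
    }),
  });

  const { file } = await response.json();
  return {
    libraryId: file.id,               // library ID of the file
    uploadId: file.upload_data.id,    // upload ID of the file
    parts: file.upload_data.parts,    // presigned URLs, one per part
  };
}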

Split and Upload Parts

It's recommended to upload these parts in parallel, but with limited concurrency, to keep uploads efficient and smooth. To illustrate the process, consider the following TypeScript example:

// Imports necessary modules and libraries for file upload functionality.
import { CustomFile } from '@sphere/web-ui';
import axios from 'axios';
import Base64 from 'crypto-js/enc-base64';
import WordArray from 'crypto-js/lib-typedarrays'; // WordArray with typed-array support, used to hash the file contents.
import sha1 from 'crypto-js/sha1';

// Import API actions for file upload process.
import { cancelUploadFile } from 'features/library/apis/cancelUploadFile';
import { completeUploadFile, CompleteUploadFileResponse, Part } from 'features/library/apis/completeUploadFile';
import { uploadFile } from 'features/library/apis/uploadFile';
// Import types for file sources and file types.
import { FileSourceType, FileType } from 'features/library/types';

// Interface for uploader parameters, detailing the structure of the expected input.
export interface UploaderParams {
  file: CustomFile;
  type: FileType;
  source: FileSourceType;
  onError: (error: unknown) => void;
  onComplete: (result: CompleteUploadFileResponse) => void;
  onProgress?: (progress: number) => void;
}

// Class definition for handling multipart file uploads.
class MultiPartUploader {
  // Class properties for storing file details, upload progress, and callback functions.
  file: CustomFile;
  type: FileType;
  source: FileSourceType;
  onComplete: (result: CompleteUploadFileResponse) => void;
  onError: (error: Error) => void;
  parts: Part[] = [];
  onProgress?: (progress: number) => void;

  // Variables related to the upload process status and control.
  fileId = '';
  uploadId = '';
  uploadPartSize = 0; // Number of parts returned by the upload API.
  controller = new AbortController(); // For aborting HTTP requests.

  // Constructor initializes the class properties with provided parameters.
  constructor({ file, type, source, onProgress, onError, onComplete }: UploaderParams) {
    this.file = file;
    this.type = type;
    this.source = source;
    this.onProgress = onProgress;
    this.onError = onError;
    this.onComplete = onComplete;

    // Resetting parts, fileId, and uploadId for a new upload session.
    this.parts = [];
    this.fileId = '';
    this.uploadId = '';
    this.uploadPartSize = 0;
  }

  // Main method to start the upload process.
  async upload() {
    try {
      // Only proceed if a file has been provided.
      if (this.file) {
        // Extracting name and size from the file object.
        const { name, size } = this.file;
        // Initial API call to get upload IDs and part information.
        const {
          file: { id: fileId },
          upload_data: { id: uploadId, parts },
        } = await uploadFile({
          file: {
            name,
            size: size.toString(),
            source: this.source,
            type: this.type,
          },
        });

        // Setting up class properties with the API response for further use.
        this.fileId = fileId;
        this.uploadId = uploadId;
        this.uploadPartSize = parts.length ?? 0;
        // Calculating the size of each chunk based on total file size and part count.
        // Math.ceil keeps every chunk boundary an integer byte offset.
        const chunkSize = Math.ceil(this.file.size / this.uploadPartSize);

        // Loop through all parts and upload them individually.
        // The calls are not awaited here, so the parts upload in parallel.
        for (let i = 0; i < this.uploadPartSize; i++) {
          const { part_number, presigned_url } = parts[i];
          // Slice the file into the current part's chunk.
          const chunk = this.file.slice(chunkSize * i, chunkSize * (i + 1));
          // Upload the current chunk to the provided presigned URL.
          this.uploadParts({ presignedUrl: presigned_url, partNumber: part_number, chunk });
        }
      }
    } catch (error) {
      // Error handling during the upload initialization.
      this.onError(error as Error);
    }
  }

  // Method for uploading individual parts of the file.
  async uploadParts({ presignedUrl, chunk, partNumber }: { presignedUrl: string; chunk: Blob; partNumber: number }) {
    try {
      // Check to ensure that the uploadPartSize is valid.
      if (this.uploadPartSize === 0) {
        throw new Error('upload part size error');
      }

      // Create a new Axios instance for HTTP requests.
      const AxiosInstance = axios.create();
      // Setting necessary headers for the upload request.
      AxiosInstance.defaults.headers.put['Content-Type'] = 'application/octet-stream';
      // Perform the PUT request to the presigned URL with the file chunk.
      const data = await AxiosInstance.put(presignedUrl, chunk, {
        signal: this.controller.signal, // Allows for request cancellation.
      });

      // Update the parts array with the ETag and part number from the response.
      this.parts = [...this.parts, { etag: data.headers.etag ?? '', part_number: partNumber }];
      // If provided, call the onProgress callback with the progress as a percentage (two decimal places).
      if (this.onProgress) {
        this.onProgress(Math.floor((this.parts.length / this.uploadPartSize) * 10000) / 100);
      }

      // Once all parts are uploaded, finalize the upload process.
      if (this.parts.length === this.uploadPartSize) {
        this.completeUpload();
      }
    } catch (error: unknown) {
      // Handle any errors during the part upload process.
      this.onError(error as Error);
    }
  }

  // Finalizes the upload process by sending a completion request to the server.
  async completeUpload() {
    try {
      // Ensure that fileId and uploadId are valid before proceeding.
      if (!this.fileId) {
        throw new Error('No file id');
      }
      if (!this.uploadId) {
        throw new Error('No upload id');
      }

      // Compute the base64-encoded SHA-1 digest of the original file content.
      // Assumption: the file object exposes the standard Blob.arrayBuffer() method;
      // for very large files, consider computing the digest incrementally instead.
      const fileBuffer = await this.file.arrayBuffer();
      const checksum = Base64.stringify(sha1(WordArray.create(fileBuffer)));

      // Call the completeUploadFile API with necessary data to finalize the upload.
      const data = await completeUploadFile({
        id: this.fileId,
        complete_data: {
          checksum_sha1: checksum,
          id: this.uploadId,
          parts: this.parts,
        },
      });
      // Call the onComplete callback with the final response data.
      this.onComplete(data);
    } catch (error) {
      // Handle any errors during the upload completion process.
      this.onError(error as Error);
    }
  }

  // Aborts the upload process and cancels any ongoing HTTP requests.
  async abort() {
    this.controller.abort(); // Sends an abort signal to any in-flight PUT requests.
    // Call the cancelUploadFile API to notify the server of the cancellation.
    await cancelUploadFile(this.fileId, { upload_id: this.uploadId });
  }
}

// Export the MultiPartUploader class for external use.
export default MultiPartUploader;

Key functionalities in the sample code (a brief usage sketch follows the list):

  • Chunked File Upload: The file is divided into smaller parts, or chunks, which are uploaded separately. This approach is beneficial for large files, improving reliability and allowing uploads to be resumed in case of failure.
  • Error Handling and Progress Reporting: A mechanism is provided to report upload progress and handle errors gracefully, keeping the application responsive and the user informed of the upload status.
  • Abort Capability: The ability to abort the upload process is crucial for user control, especially for large uploads that the user might need to stop.
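
Here is a brief usage sketch of the MultiPartUploader class. The module path, the enum member names, and the way the file is obtained are assumptions for illustration.

// Brief usage sketch for the MultiPartUploader class shown above.
import { CustomFile } from '@sphere/web-ui';
import { FileSourceType, FileType } from 'features/library/types';
import MultiPartUploader from './MultiPartUploader'; // assumed module path

const input = document.querySelector('input[type="file"]') as HTMLInputElement;
const selectedFile = input.files?.[0];

if (selectedFile) {
  const uploader = new MultiPartUploader({
    file: selectedFile as unknown as CustomFile,  // cast is an assumption about CustomFile
    type: FileType.FILE_TYPE_VIDEO,               // assumed enum member name
    source: FileSourceType.FILE_SOURCE_ADD_VOD,   // assumed enum member name
    onProgress: (progress) => console.log(`Uploaded ${progress}%`),
    onError: (error) => console.error('Upload failed', error),
    onComplete: (result) => console.log('Upload registered', result),
  });

  uploader.upload();

  // To cancel an in-flight upload, for example from a Cancel button:
  // uploader.abort();
}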

Finalize Multipart Upload

After uploading all parts of the file to the URLs obtained in the first step, send a registration request for the file using the following API:

POST /bv/cms/v1/library/files/{id}:complete-upload

Replace the path parameter id with the library ID of the uploaded file.

Here's an example of the request body:

"complete_data":{
"checksum_sha1":"base64-encoded-SHA-1-160bit-digest",
"id":"upload-id",
"parts":[
{
"etag":"uploaded-part-ETag",
"part_number":1
}
]
}

In the request:

  • The checksum_sha1 represents the base64-encoded, 160-bit SHA-1 digest of the original file content.
  • The parts array contains objects that detail each uploaded part, specifying its ETag and the associated part number.
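
As an illustration, here is a minimal sketch of sending the completion request with fetch. The base URL, the Bearer authentication header, and the finalizeUpload helper name are assumptions; the checksum and parts values are expected to come from the earlier upload steps.

// Minimal sketch: finalize the multipart upload for a given library ID.
// The base URL and Authorization header below are assumptions.
async function finalizeUpload(
  libraryId: string,
  uploadId: string,
  checksumSha1: string,
  parts: { etag: string; part_number: number }[],
  apiToken: string,
) {
  const response = await fetch(
    `https://api.example.com/bv/cms/v1/library/files/${libraryId}:complete-upload`,
    {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        Authorization: `Bearer ${apiToken}`, // assumed auth scheme
      },
      body: JSON.stringify({
        complete_data: {
          checksum_sha1: checksumSha1, // base64-encoded SHA-1 digest of the file content
          id: uploadId,
          parts, // ETag and part number of every uploaded part
        },
      }),
    },
  );
  return response.json();
}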

Workflow diagram