Upload File

Uploads a file from the local filesystem to a Google Cloud Storage bucket.

Common Properties

  • Name - The custom name of the node.
  • Color - The custom color of the node.
  • Delay Before (sec) - Waits in seconds before executing the node.
  • Delay After (sec) - Waits in seconds after executing the node.
  • Continue On Error - The automation will continue regardless of any error. The default value is false.
info

If the Continue On Error property is set to true, errors are not caught when the project is executed, even if a Catch node is used.

Inputs

  • GCS Client Id - The client identifier obtained from the Connect node. Optional if credentials are provided directly.
  • Bucket Name - The name of the destination bucket where the file will be uploaded.
  • File Path - The local filesystem path of the file to upload (absolute or relative path).

Options

  • File Name - Custom name for the uploaded object in GCS. If not specified, the original filename will be used.
  • Timeout - Maximum time in seconds to wait for the upload to complete. Default is 30 seconds. Use higher values for large files.
  • Credentials - Google Cloud service account credentials (optional). Use this instead of GCS Client Id for direct authentication without a Connect node.

Output

  • File Path - The Google Cloud Storage URI of the uploaded file in the format gs://bucket-name/object-name. Stored in message context as filePath.

How It Works

The Upload File node transfers a file from the local filesystem to a Google Cloud Storage bucket. When executed, the node:

  1. Establishes a connection using either GCS Client Id or provided credentials
  2. Validates all required inputs (Bucket Name, File Path)
  3. Opens the local file for reading
  4. Determines the object name (custom or original filename)
  5. Uploads the file content to the specified bucket
  6. Returns the GCS URI of the uploaded file
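
For illustration only, the steps above correspond roughly to the following JavaScript sketch using the official @google-cloud/storage client. This is not the node's actual implementation (the error names under Error Handling suggest the node is written in Go), and all names are placeholders:

// Rough JavaScript equivalent of the node's steps (sketch only)
const { Storage } = require('@google-cloud/storage');
const path = require('path');

async function uploadFile(bucketName, filePath, fileName) {
  const storage = new Storage();                           // 1. connect with ambient credentials
  if (!bucketName || !filePath) {                          // 2. validate required inputs
    throw new Error('ErrInvalidArg');
  }
  const objectName = fileName || path.basename(filePath);  // 4. custom or original filename
  await storage.bucket(bucketName).upload(filePath, {      // 3 + 5. read and upload the file
    destination: objectName
  });
  return `gs://${bucketName}/${objectName}`;               // 6. GCS URI of the uploaded object
}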

Example

Basic File Upload

// After Connect node or with direct credentials
// Input:
// Bucket Name: "my-data-bucket"
// File Path: "/home/user/reports/sales-2024.csv"
// File Name: (empty - uses original name)

// Output:
// message.filePath = "gs://my-data-bucket/sales-2024.csv"

Upload with Custom Name

// Upload file with a different name in GCS
// Input:
// Bucket Name: "backup-bucket"
// File Path: "/tmp/data.xlsx"
// File Name: "reports/monthly-report-2024-01.xlsx"

// Output:
// message.filePath = "gs://backup-bucket/reports/monthly-report-2024-01.xlsx"

Upload with Folder Structure

// Create folder-like organization in GCS
// Input:
// File Path: "/downloads/invoice.pdf"
// File Name: "invoices/2024/january/inv-001.pdf"

// The object will be accessible at:
// gs://my-bucket/invoices/2024/january/inv-001.pdf

Batch Upload with Loop

// Loop through files and upload each one
const files = [
  "/data/file1.csv",
  "/data/file2.csv",
  "/data/file3.csv"
];

// Use Loop node to iterate and upload each file
// Each iteration uploads one file to GCS
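
To make the list available to a Loop node, it can first be stored in the message context from a JavaScript node. A minimal sketch; the key name files is hypothetical and depends on how your Loop node is configured:

// In a JavaScript node before the loop ("files" is a hypothetical key)
message.files = [
  "/data/file1.csv",
  "/data/file2.csv",
  "/data/file3.csv"
];
// Configure the Loop node to iterate over this list and bind the
// Upload File node's File Path input to the current item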

Upload with Timestamp

// In a JavaScript node before Upload:
const timestamp = new Date().toISOString().replace(/:/g, '-');
const filename = `backup-${timestamp}.json`;
message.customFileName = filename;

// Then in Upload node:
// File Name: {{customFileName}}
// Result: gs://bucket/backup-2024-01-15T10-30-45.123Z.json

Requirements

  • Either a valid GCS Client Id from a Connect node OR credentials provided directly
  • Valid bucket name that exists in Google Cloud Storage
  • Valid local file path that exists and is readable
  • Appropriate IAM permissions (see the example after this list):
    • storage.objects.create permission
    • roles/storage.objectCreator or higher
  • Sufficient network connectivity and bandwidth
  • Adequate timeout for large file uploads
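
For example, the required role can be granted with the gcloud CLI; the bucket and service-account names below are placeholders:

// Run in a terminal (names are placeholders):
// gcloud storage buckets add-iam-policy-binding gs://my-data-bucket \
//     --member="serviceAccount:uploader@my-project.iam.gserviceaccount.com" \
//     --role="roles/storage.objectCreator"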

Error Handling

The node will return specific errors in the following cases:

  • ErrInvalidArg - Bucket Name or File Path is empty or invalid
  • os.Open error - The local file doesn't exist or isn't readable
  • io.Copy error - The upload failed during transfer
  • Writer.Close error - Upload finalization failed

Common error scenarios:

  • Empty or invalid GCS Client Id
  • Empty or invalid Bucket Name
  • Empty or invalid File Path
  • File does not exist at specified path
  • Insufficient permissions to read local file
  • Bucket does not exist or is inaccessible
  • Insufficient permissions to upload files
  • Upload timeout exceeded (large files)
  • Network connectivity issues
  • Disk I/O errors
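
Because Continue On Error suppresses errors (see the note under Common Properties), a follow-up JavaScript node can confirm that the upload actually produced a GCS URI. A minimal sketch using the filePath output described above; the uploadFailed flag is hypothetical:

// Verify the upload result when Continue On Error is enabled
if (!message.filePath || !message.filePath.startsWith('gs://')) {
  message.uploadFailed = true; // hypothetical flag for a later branch or notification node
}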

Usage Notes

  • The File Name option allows you to rename the file during upload
  • If File Name is not specified, the original filename from File Path is used
  • Object names can include forward slashes to create folder-like paths
  • GCS doesn't have actual folders - paths are part of the object name
  • The Timeout option helps prevent hanging operations for large files
  • Default timeout is 30 seconds; increase for files over 10MB (see the sizing sketch after this list)
  • The output File Path provides a direct GCS URI reference
  • Existing objects with the same name will be overwritten without warning
  • All file types are supported (text, binary, images, etc.)
  • File size limits depend on your GCS bucket configuration
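
For the timeout guidance above, a JavaScript node can derive a Timeout value from the file size before the upload runs. A sketch that assumes the JavaScript node can use Node's fs module; the 2 seconds per MB rate is a placeholder to tune for your network:

const fs = require('fs');

// Size the timeout to the file: 30-second floor, ~2 s per MB (placeholder rate)
const filePath = '/home/user/reports/sales-2024.csv';
const sizeMB = fs.statSync(filePath).size / (1024 * 1024);
message.uploadTimeout = Math.max(30, Math.ceil(sizeMB * 2));
// Bind the Upload File node's Timeout option to {{uploadTimeout}}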

Tips for Effective Use

  • Use descriptive File Name patterns for better organization
  • Implement folder-like structures with slashes in File Name
  • Set appropriate timeout based on file size and network speed
  • Use timestamp or UUID in filenames to avoid overwriting
  • Implement error handling for network failures
  • Consider compression for large files before upload
  • Use batch uploads with Loop nodes for multiple files
  • Monitor upload progress for very large files
  • Validate file existence before upload attempt (see the sketch after this list)
  • Clean up local files after successful upload if needed
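
For the validation tip above, a JavaScript node placed before Upload File can fail fast when the file is missing. Again, this assumes the JavaScript node has access to Node's fs module:

const fs = require('fs');

// Fail fast if the file to upload is missing or unreadable
const filePath = '/home/user/reports/sales-2024.csv';
fs.accessSync(filePath, fs.constants.R_OK); // throws if the file cannot be read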

Common Errors and Solutions

Error: "Bucket Name cannot be empty"

  • Solution: Ensure bucket name is properly set in inputs
  • Check variable bindings and message context

Error: "File Path cannot be empty"

  • Solution: Verify the file path input is correctly specified
  • Use absolute paths to avoid confusion

Error: "os.Open: no such file or directory"

  • Solution: Verify the file exists at the specified path
  • Check file path for typos or incorrect directory
  • Ensure the file hasn't been moved or deleted

Error: "Writer.Close: context deadline exceeded"

  • Solution: Increase the Timeout value
  • Check network connectivity and speed
  • Consider uploading during off-peak hours for large files

Upload succeeds but file is corrupted

  • Solution: Verify the local file isn't corrupted
  • Check disk space during upload
  • Ensure stable network connection

Permission denied errors

  • Solution: Check service account has storage.objects.create permission
  • Verify bucket allows uploads from your service account
  • Check bucket IAM policies

Use Cases

Automated Backups

  • Schedule daily backups of critical files to GCS
  • Archive logs and reports to cloud storage
  • Maintain versioned backups with timestamp naming

Data Integration

  • Upload processed data files to shared GCS buckets
  • Transfer files between systems via cloud storage
  • Stage files for cloud-based data processing

Report Distribution

  • Upload generated reports to accessible GCS locations
  • Share files with team members via GCS URIs
  • Archive historical reports with organized folder structure

File Processing Workflows

  • Upload files for further cloud processing
  • Store intermediate results in GCS
  • Archive processed files with metadata

Multi-Region Distribution

  • Upload files to regional buckets for CDN distribution
  • Replicate files across different GCS buckets
  • Distribute content geographically