
Uploading Multiple Files at the Same Time Using Multithreading in Node.js
Node.js is often described as a single-threaded environment, but this is a common misconception. While the main event loop is single-threaded, Node.js offers powerful modules such as worker_threads and child_process to enable multithreading and parallelism.
worker_threads runs JavaScript in parallel threads inside the same process and can share memory with the main thread (via SharedArrayBuffer), making it ideal for lightweight background tasks. child_process spawns independent processes with separate memory, which is useful for CPU-heavy tasks or OS-level commands.
In this tutorial, you will learn how to use the worker_threads module to upload multiple files in parallel to Google Cloud Storage using Streams. This approach boosts performance and makes your application more responsive.
🧪 What You Will Build
You will create a Node.js application that:
- Reads files from a local folder
- Spawns multiple worker threads to handle uploads in parallel
- Uploads each file using Google Cloud Storage's streaming API
By the end, you’ll have a deeper understanding of Node.js multithreading and practical knowledge to use it in I/O-heavy applications.
🛠️ Prerequisites
- Node.js 16.16 or later
- A Google Cloud account with a Storage bucket and a service account credentials file
Install the required dependency:

```shell
npm install @google-cloud/storage
```
📁 Creating the Storage Service (cloudStorageFileService.js)
This file will encapsulate all logic for uploading files to Google Cloud Storage.
```javascript
const { Storage } = require('@google-cloud/storage');
const path = require('path');

// Path to your service account key file
const serviceKey = path.join(__dirname, '../gkeys.json');

class CloudStorageFileService {
  constructor() {
    this.storage = new Storage({
      projectId: 'my-project-id', // replace with your GCP project ID
      keyFilename: serviceKey,
    });
  }

  // Returns a writable stream that uploads directly to the bucket.
  // createWriteStream() is synchronous, so no async/await is needed here.
  uploadFile(bucketName, destFileName) {
    return this.storage
      .bucket(bucketName)
      .file(destFileName)
      .createWriteStream();
  }
}

module.exports = CloudStorageFileService;
```
🔍 Explanation:
- Connects to GCP using a service account key file
- Returns a writable stream so content can be uploaded directly
🧠 Controlling the Threads (threadController.js)
This file manages the creation and execution of worker threads.
```javascript
const { Worker } = require('node:worker_threads');
const { readdir } = require('fs/promises');
const path = require('path');

class ThreadController {
  constructor(threadsNumber) {
    this.files = [];
    this.threadsNumber = threadsNumber;
    this.count = 0;
  }

  // Lists every file in the local content folder.
  async loadFiles() {
    this.files = await readdir(path.join(__dirname, 'content'));
  }

  // Wraps one worker run in a promise so a batch can be awaited with Promise.all.
  uploadThread(filePath) {
    return new Promise((resolve, reject) => {
      const worker = new Worker(path.join(__dirname, 'fileUploadWorker.js'), {
        workerData: { file: filePath },
      });
      worker.once('error', reject);
      worker.on('exit', (code) => {
        if (code === 0) resolve(filePath);
        else reject(new Error(`Worker for ${filePath} exited with code ${code}`));
      });
    });
  }

  async execute() {
    const init = performance.now(); // performance is a global in Node.js 16+
    await this.loadFiles();
    let promises = [];
    while (this.count < this.files.length) {
      // Start up to threadsNumber workers for the current batch.
      for (let i = this.count; i < this.count + this.threadsNumber; i++) {
        if (this.files[i]) {
          promises.push(this.uploadThread(this.files[i]));
        }
      }
      const result = await Promise.all(promises);
      promises = [];
      this.count += this.threadsNumber;
      console.log('Uploaded files:', result);
    }
    const end = performance.now();
    console.log(`Total time: ${Math.round(end - init)}ms`);
  }
}

module.exports = ThreadController;
```
🔍 Explanation:
- Reads all files from the content folder
- Assigns threads in batches according to the desired concurrency
- Tracks and logs upload results and timing
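The batching logic inside execute() can be distilled into a small standalone helper. This is a sketch of the same pattern with a simulated upload task, not the article's actual uploader:

```javascript
// Sketch of the batching pattern execute() uses:
// process items N at a time, awaiting each batch with Promise.all.
async function runInBatches(items, batchSize, task) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // The next batch only starts once every task in this one has settled.
    results.push(...(await Promise.all(batch.map(task))));
  }
  return results;
}

// Demo: the "upload" is simulated with an immediately-resolved promise.
runInBatches(['a.txt', 'b.txt', 'c.txt'], 2, async (f) => `uploaded ${f}`)
  .then((r) => console.log(r.join(', '))); // prints "uploaded a.txt, uploaded b.txt, uploaded c.txt"
```

One trade-off worth noting: because each batch waits for its slowest member, a single large file can idle the other threads; a work-queue that refills slots as they free up would keep utilization higher.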
⚙️ Writing the Upload Worker (fileUploadWorker.js)
This is the worker thread that receives a file path and uploads the file.
```javascript
const { isMainThread, workerData } = require('node:worker_threads');
const path = require('path');
const { pipeline } = require('stream/promises');
const { createReadStream } = require('fs');
const CloudStorageFileService = require('./cloudStorageFileService');

class FileUploadWorker {
  constructor() {
    this.storage = new CloudStorageFileService();
    this.filePath = path.join(__dirname, 'content', workerData.file);
    this.fileName = workerData.file;
  }

  async upload() {
    // Guard against this file accidentally being run on the main thread.
    if (!isMainThread) {
      // Stream the file from disk straight into the bucket's write stream.
      await pipeline(
        createReadStream(this.filePath),
        await this.storage.uploadFile('myfileuploads', this.fileName)
      );
    }
  }
}

(async () => {
  const fileUploader = new FileUploadWorker();
  await fileUploader.upload();
})();
```
🔍 Explanation:
- Each worker receives a single file path from workerData
- Uses the stream pipeline to upload the file efficiently
- Ensures each thread handles only its assigned job
🧪 Running the Controller (index.js)
Create an index.js that instantiates the controller and runs it:

```javascript
const ThreadController = require('./threadController');

// Change the number to test different levels of concurrency.
const controller = new ThreadController(9);

(async () => {
  await controller.execute();
})();
```

Then start the program with node index.js.
🧾 Conclusion & Key Takeaways
- Node.js supports multithreading through worker_threads, despite its reputation as single-threaded
- Uploading files with streams helps avoid memory bottlenecks
- Combining threads and streams lets you scale heavy I/O tasks effectively
- This pattern is ideal for cloud processing pipelines, background tasks, and parallel uploads
Now you have a reusable and scalable pattern to boost performance in your Node.js applications!