How to decompress files in Node.js with zlib

Decompressing files in Node.js with zlib lets you extract compressed data efficiently for file processing, data analysis, and storage management. As the creator of CoreUI with over 11 years of Node.js development experience, I've implemented file decompression extensively in data processing pipelines, backup systems, and file management applications. In my experience, the most reliable approach is to use zlib decompression streams with proper error handling and pipeline management. This pattern enables memory-efficient decompression of large files while maintaining application performance and stability.

Use zlib decompression streams with pipeline for memory-efficient file extraction and proper error handling.

const fs = require('fs')
const zlib = require('zlib')
const { pipeline } = require('stream')
const path = require('path')

function decompressFile(inputFile, outputFile) {
  return new Promise((resolve, reject) => {
    const readStream = fs.createReadStream(inputFile)
    const gunzipStream = zlib.createGunzip()
    const writeStream = fs.createWriteStream(outputFile)

    pipeline(
      readStream,
      gunzipStream,
      writeStream,
      (error) => {
        if (error) {
          console.error('Decompression failed:', error.message)
          reject(error)
        } else {
          console.log('File decompressed successfully')
          resolve()
        }
      }
    )
  })
}

// Decompress multiple files
async function decompressFiles(files) {
  for (const file of files) {
    try {
      // Strip only the trailing .gz extension (a bare '.gz' replace
      // would also match '.gz' in the middle of a filename)
      const outputFile = file.replace(/\.gz$/, '')
      await decompressFile(file, outputFile)

      // Compare file sizes
      const compressedSize = fs.statSync(file).size
      const decompressedSize = fs.statSync(outputFile).size
      const ratio = ((decompressedSize / compressedSize - 1) * 100).toFixed(2)

      console.log(`Expansion ratio: ${ratio}%`)
    } catch (error) {
      console.error(`Failed to decompress ${file}:`, error.message)
    }
  }
}

// Usage
decompressFiles(['data.txt.gz', 'logs.txt.gz'])

Here zlib.createGunzip() creates a decompression stream that transforms gzipped data back to its original format. The pipeline() function safely connects the read stream, decompression stream, and write stream with automatic error handling and cleanup. The Promise wrapper enables async/await usage for better control flow and error management.

Best Practice Note:

This is the same approach we use in CoreUI backend systems for log file processing, backup restoration, and data import workflows in production environments. Always validate compressed file integrity before decompression and implement proper cleanup of temporary files to maintain system resources effectively.

