
How to Manage Files in Salesforce With ZipWriter and ZipReader

By Bassem Marji

The Salesforce Spring ’25 release introduced native file compression support in Apex, featuring the new ZipWriter and ZipReader classes in the Compression namespace for seamless creation and extraction of .zip files.

These built-in tools empower developers to efficiently manage file compression and extraction directly in Apex, streamlining processes that previously required external libraries, third-party integrations, or other workarounds.

This article will provide a comprehensive guide on leveraging these new utilities while exploring practical examples and real-world use cases where these tools can deliver significant value.

Key Takeaways and Actionable Ideas

Salesforce’s native support for zip operations enables a wide range of applications, including:

  • Bundling multiple documents into a single downloadable .zip file.
  • Compressing large files to reduce storage use and improve transfer efficiency.
  • Extracting files from uploaded ZIP archives for further processing.
  • Sending compressed email attachments, such as reports or test logs.
  • Uploading zipped data for batch jobs.
  • Integrating with APIs that require ZIP input/output.

Getting Started with ZipWriter and ZipReader

ZipWriter

The ZipWriter class lets you dynamically create a .zip archive and return the content as a Blob. You can then:

  • Attach the Blob to an email (Messaging.EmailFileAttachment)
  • Store it in Salesforce Files using ContentVersion
  • Add it as an attachment to a Salesforce object record.

The ZipWriter class provides methods to build and manage a list of files and their contents, referred to as “entries”, to be packaged into a ZIP archive. At its simplest, the addEntry() method accepts two parameters: the file path/name within the archive and the corresponding Blob content to be added. 

Additionally, there is an addEntries() method that allows multiple entries to be added in a single call. Once all entries have been defined, the getArchive() method compiles them into a complete ZIP archive and returns it as a new Blob instance. 
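The flow above can be sketched as follows. This is a minimal, illustrative example; the file names, contents, and ContentVersion title are hypothetical, and the addEntry()/getArchive() calls follow the behavior described above.

```apex
// Hedged sketch: bundle two files into a ZIP and store it as a Salesforce File.
Compression.ZipWriter writer = new Compression.ZipWriter();

// Each entry is a path/name within the archive plus its Blob content.
writer.addEntry('reports/summary.txt', Blob.valueOf('Quarterly summary...'));
writer.addEntry('reports/details.csv', Blob.valueOf('id,amount\n1,100'));

// Compile all entries into a single ZIP archive Blob.
Blob zipArchive = writer.getArchive();

// Store the archive in Salesforce Files.
ContentVersion cv = new ContentVersion();
cv.Title = 'Reports Bundle';
cv.PathOnClient = 'reports.zip';
cv.VersionData = zipArchive;
insert cv;
```

The same Blob could instead be attached to an email via Messaging.EmailFileAttachment or to a record as an Attachment, as listed above.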

ZipReader

The ZipReader class extracts and reads files from a zipped Blob, making it ideal for processing ZIP uploads in Apex.

This class includes the getEntries() method, which returns a list of all entries (files and directories) contained within a ZIP archive. This allows you to inspect the contents of a ZIP file and process each entry as needed.
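As a minimal sketch, the following reads a zipped Blob and lists its entries. The SOQL query and file title are hypothetical; the ZipReader constructor and getEntries() usage follow the description above.

```apex
// Hedged sketch: inspect the contents of a zipped Blob.
// Assumes a ContentVersion titled 'Reports Bundle' holds a valid ZIP archive.
Blob zipBlob = [SELECT VersionData FROM ContentVersion
                WHERE Title = 'Reports Bundle' LIMIT 1].VersionData;

Compression.ZipReader reader = new Compression.ZipReader(zipBlob);

// getEntries() returns every file and directory in the archive.
for (Compression.ZipEntry entry : reader.getEntries()) {
    System.debug('Found entry: ' + entry.name);
}
```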

ZipEntry

The ZipEntry class provides granular control over individual entries within a ZIP archive. Each entry represents a single file or directory in the archive. Below is a breakdown of its key properties:

Name | Type | Description
name | String | The name of the entry.
data | Blob | The binary data of the entry.
size | Long | The uncompressed size of the entry in bytes.
compressedSize | Long | The size of the entry after compression.
isDirectory | Boolean | Indicates whether the entry represents a directory.
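Putting these properties to work, a hedged sketch of entry-by-entry inspection might look like this (assuming zipBlob holds a valid archive and the property accessors listed above):

```apex
// Hedged sketch: examine each entry's metadata and content.
Compression.ZipReader reader = new Compression.ZipReader(zipBlob);
for (Compression.ZipEntry entry : reader.getEntries()) {
    if (entry.isDirectory) {
        System.debug('Directory: ' + entry.name);
        continue;
    }
    System.debug(entry.name + ': ' + entry.size + ' bytes ('
        + entry.compressedSize + ' bytes compressed)');
    // Converting to String is only safe for text entries.
    String contents = entry.data.toString();
}
```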

Use Cases

Let’s roll up our sleeves and explore how to leverage these features in practice! The following ZipManipulation class illustrates Salesforce’s built-in ZIP capabilities while tracking system resources via its runTest method. This implementation blueprint covers:

  • ZIP Archive Creation: Bundle multiple files into a single compressed archive.
  • Content Extraction: Retrieve and inspect individual files from the zipped blob.
  • Performance Monitoring: Track heap memory and CPU consumption at each process stage.
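A minimal sketch of such a class is shown below. The sample file names and contents are hypothetical; the resource tracking uses the standard Limits.getHeapSize() and Limits.getCpuTime() methods.

```apex
public class ZipManipulation {
    public static void runTest() {
        logResources('Initial');

        // Stage 1: bundle two sample files into a ZIP archive.
        Compression.ZipWriter writer = new Compression.ZipWriter();
        writer.addEntry('file1.txt', Blob.valueOf('Hello, ZipWriter!'));
        writer.addEntry('file2.txt', Blob.valueOf('Hello again.'));
        Blob archive = writer.getArchive();
        logResources('Post Compression');

        // Stage 2: extract the entries and verify they round-trip intact.
        Compression.ZipReader reader = new Compression.ZipReader(archive);
        for (Compression.ZipEntry entry : reader.getEntries()) {
            System.debug(entry.name + ' => ' + entry.data.toString());
        }
        logResources('Post Extraction');
    }

    // Log heap and CPU consumption at each stage of the process.
    private static void logResources(String stage) {
        System.debug(stage + ' | Heap: ' + Limits.getHeapSize()
            + ' bytes | CPU: ' + Limits.getCpuTime() + ' ms');
    }
}
```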

Running ZipManipulation.runTest(); produces execution logs capturing heap and CPU usage at each stage.

Analyzing the generated log reveals the following:

Stage | Heap Size (bytes) | CPU Time (ms)
Initial | 1,091 | 1
Post Compression | 8,004 | 100
Post Extraction | 11,520 | 189
  1. Heap Size Growth: The heap size has increased from about 1 KB initially to around 11.5 KB after extraction. This increase is expected since compressing and extracting files involves creating Blob objects and storing file contents in memory.
  2. CPU Time Consumption: CPU time rose from 1ms initially to 100ms after compression and 189ms after extraction. This indicates that the compression and extraction operations are moderately CPU-intensive but well within typical governor limits.
  3. File Content Matching: The extracted file contents match the originals.

Key Considerations

Limitations

Salesforce’s built-in ZIP functionality enables efficient file compression and extraction within Apex, but to maximize reliability and performance, it’s essential to be aware of the platform’s limitations and constraints. The following outlines critical considerations to keep in mind when working with ZIP files:

  • Governor Limits: Apex heap size limits apply when compressing or decompressing files, as these operations may consume significant memory. You should monitor heap usage carefully to avoid runtime exceptions. Salesforce advises using asynchronous Apex methods like Queueable or Batchable to avoid hitting governor limits on Max heap sizes.
Limit Type | Synchronous Apex | Asynchronous Apex
Max BLOB Size | 6 MB | 12 MB
Heap Size | 6 MB | 12 MB
  • File Input Requirements for ZipWriter: The ZipWriter class requires files to be added as BLOBs, with filenames provided as strings including relative paths. Issues can arise if ZIP entries have file names with spaces or special characters, potentially causing problems when extracting or referencing files within Salesforce.
  • Bulk API Limitations: The Bulk API is a powerful tool for processing large volumes of data asynchronously. When using ZIP functionality in conjunction with the Bulk API, Salesforce imposes specific restrictions to ensure scalability and performance: 
    • Maximum Number of Files per ZIP:
      You can include up to 1,000 files in a single ZIP archive. This limit ensures that the system does not get overwhelmed by a large number of files, which could lead to inefficiencies or timeouts during processing.
    • Maximum Uncompressed Total File Size:
      The total size of all uncompressed files within the ZIP archive must not exceed 20 MB. This limit helps control memory usage and ensures that the system can handle the data without exceeding heap size or other governor limits.
    • Maximum Compressed ZIP Size:
      The size of the compressed ZIP file itself must not exceed 10 MB. This constraint ensures that the file remains manageable for upload, download, and processing within the Bulk API framework.

These limitations highlight the importance of carefully managing ZIP file size, number of files, file naming conventions, and processing methods to ensure reliable and performant solutions within the Salesforce platform constraints.

Best Practices

To guarantee reliable and efficient ZIP file operations, it’s important to follow proven best practices. These guidelines help you manage resource limits, maintain data integrity, and enhance security throughout compression and extraction processes:

  1. Test Compression with Varying File Sizes to Stay Within Heap Limits: When compressing files into a ZIP archive, the uncompressed file data resides in memory until the compression process completes. If the total size of the uncompressed files exceeds the heap limit, your code will fail with a System.LimitException.
  2. Use Asynchronous Methods for Large Files to Avoid Timeouts: Compressing or extracting large files can be computationally intensive and may exceed synchronous CPU time and heap limits, resulting in timeouts or incomplete operations. Queueable or Batchable Apex provides higher limits for these operations.
  3. Handle Exceptions Gracefully to Prevent Incomplete Operations or Data Loss: Errors during compression or extraction can lead to incomplete operations, corrupted files, or data loss. Graceful error handling ensures that your code can recover from unexpected issues without compromising data integrity.
  4. Log Processing Details for Debugging and Monitoring: Logging is essential for understanding how your code behaves during execution.
  5. Security: Ensure sensitive data is encrypted before compression to prevent unauthorized access. Validate ZIP files for malicious content before extraction.
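Practices 2 and 3 can be combined in a Queueable job. The following is a hedged sketch under stated assumptions: the job name, record selection, and archive title are hypothetical, and the try/catch illustrates the graceful-failure pattern described above.

```apex
// Hedged sketch: offload compression to asynchronous Apex (Queueable),
// gaining the higher 12 MB heap limit, with graceful error handling.
public class ZipCompressionJob implements Queueable {
    private List<Id> contentDocumentIds;

    public ZipCompressionJob(List<Id> contentDocumentIds) {
        this.contentDocumentIds = contentDocumentIds;
    }

    public void execute(QueueableContext ctx) {
        try {
            Compression.ZipWriter writer = new Compression.ZipWriter();
            for (ContentVersion cv : [SELECT Title, VersionData
                                      FROM ContentVersion
                                      WHERE ContentDocumentId IN :contentDocumentIds
                                      AND IsLatest = true]) {
                writer.addEntry(cv.Title, cv.VersionData);
            }
            insert new ContentVersion(
                Title = 'Archive',
                PathOnClient = 'archive.zip',
                VersionData = writer.getArchive());
        } catch (Exception e) {
            // Log the failure for monitoring; no partial archive is written
            // because the insert never runs when compression throws.
            System.debug(LoggingLevel.ERROR, 'Zip job failed: ' + e.getMessage());
        }
    }
}
// Usage: System.enqueueJob(new ZipCompressionJob(docIds));
```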

By adhering to these best practices, you can ensure that your compression and decompression operations in Salesforce are efficient, reliable, and scalable. Testing with varying file sizes, leveraging asynchronous methods, handling exceptions gracefully, and logging processing details are critical steps to building robust solutions that comply with governor limits and deliver optimal performance.

These practices not only help prevent runtime errors but also make your code easier to debug, monitor, and maintain over time. For further details, read the full Compression namespace API reference documentation.

Final Thoughts

Leveraging the Compression namespace empowers developers to simplify data storage, improve file transfer efficiency, and boost overall system performance. Whether generating reports, managing uploads, or facilitating data transfers via email, mastering ZipWriter and ZipReader makes working with compressed files straightforward.

Start incorporating these utilities in your development toolkit to elevate file handling in your Salesforce solutions.

The Author

Bassem Marji

Bassem is a certified Salesforce Administrator, a Project Implementation Manager at BLOM Bank Lebanon, and a Technical Author at Educative. He is a big fan of the Salesforce ecosystem.
