My name is Rich Geldreich, and I’m a game developer, graphics programmer, and data compression specialist. I’ve been in the trenches working on the creation, optimization, or enhancement of several major game engines at companies like Ensemble Studios (Microsoft) and Valve over the last 20 years.
Over the years, usually late at night between shipping games, I’ve worked on several closed- and open-source compression libraries originally intended for game developers, such as crunch, LZHAM, miniz, and jpeg-compressor. I also wrote picojpeg, a JPEG image decoder optimized for extremely low-memory 8-bit embedded CPUs. Outside of game products, these libraries have been used in a surprising variety of interesting applications: on picosatellites, for WebGL content delivery, for GPU-accelerated UHD video decoding, and as educational material.
According to world expert Matt Mahoney, data compression is “the art of reducing the number of bits needed to store or transmit data”. At a different level, compression is an essential, strategically enabling technology that can save time, reduce storage space, reduce memory utilization, or reduce bandwidth. Compression can make possible things that were previously impractical or uneconomic due to available hardware, storage, or transmission resources.
There are two major classes of compression systems: specialized lossy systems (think JPEG or MP3) and generic lossless systems (think .ZIP). Within these two categories is an almost endless variety of applications, approaches, and specialized algorithms. Some of the most essential and valuable compression systems become worldwide standards and are implemented directly in hardware using specialized integrated circuits.
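The defining property of the lossless class is the exact round-trip: decompressing gives back every original bit. As a minimal sketch of that idea (my illustration, using Python's standard zlib module rather than any Unity code):

```python
import zlib

# Repetitive data, like many game assets, compresses well.
data = b"the quick brown fox jumps over the lazy dog; " * 100

packed = zlib.compress(data, 9)          # level 9 = best compression

# Lossless means the round-trip is exact, bit for bit.
assert zlib.decompress(packed) == data

print(len(data), "bytes ->", len(packed), "bytes")
```

A lossy codec like JPEG instead discards detail the eye is unlikely to miss, which is why it can reach far higher ratios but can never reproduce the input exactly.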
The Data Compression Team’s domain is basically anything related to compression. Internally, Unity already utilizes a number of custom and off-the-shelf compression systems for game assets like sounds, textures, animations, meshes, and asset bundles. Needless to say, without these systems Unity as a product would require an impractical amount of memory on many platforms, especially on mobile devices.
One of our team’s background responsibilities is to tune, optimize, and maintain our existing set of compression systems. In the near term, we’re focusing on writing a new offline and real-time generic binary data delta compressor for use by several teams within Unity. Our team’s most significant long-term goal is to examine Unity’s entire software stack and determine how to break down artificial software barriers that are preventing us from getting the best possible compression solutions.
Since coming to Unity, Alexander Suvorov and I have dived in and started deeply studying the current state of the lossless compression field. Lossless compression technology allows Unity’s downloadable asset bundle files to require significantly less space and time to download. Our goal was not only to identify where the state of the art is, but also to predict where the field is going. We’ve also talked about our long-term view of the field of GPU texture compression.
During our lossless survey, we found at least one major feature we can add to current lossless compressors that would enable us to readily build new types of custom texture, geometry, and animation compressors. We’re also considering completely redefining how data is fed into a lossless compressor. After some great discussions with Unity developers on the Cloud Build and Services teams, we’ve begun researching and planning what it would take to modify our current offline mobile delta compressor to work in real-time.
Finally, while doing this work, we realized the key long-term problem the Compression Team should be working on: How do we build a data compression engine that Unity can talk to better? Our long-term goal is to build several new lossy and lossless compression engines optimized specifically for Unity’s data.
These are very exciting times at Unity, and I can’t wait to see what the future holds.