General Advice on Data Size and a Look to the Future
Nanite and Virtual Texturing systems, coupled with fast SSDs, have lessened concerns over runtime budgets for geometry and textures. The biggest bottleneck now is delivering this data to the user.
Data size on disk is an important factor when considering how content is delivered — on physical media or downloaded over the internet — and compression technology can only do so much. The average end user's internet bandwidth, optical media sizes, and hard drive sizes have not scaled at the same rate as hard drive bandwidth, access latency, GPU compute power, and software technology like Nanite. Pushing that much data to users is proving challenging.
Rendering highly detailed meshes efficiently is less of a concern with Nanite, but the storage of its data on disk is now the key area that must be kept in check. Beyond compression, future releases of Unreal Engine should see tools that support more aggressive reuse of repeated detail, and tools that enable trimming data late in production to bring package size in line, allowing artists to safely overshoot their quality bar instead of undershooting it.
Looking to the future development of Nanite, many parallels can be drawn to how texture data is managed, an area with decades more industry experience behind it, such as:
- Texture tiling
- UV stacking and mirroring
- Detail textures
- Texture memory reports
- Dropping mip levels for final packaged data
Similar strategies are being explored and developed for geometry in Unreal Engine 5.
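To give a sense of why dropping mip levels is such an effective size lever, note that each mip level holds a quarter of the texels of the level above it, so the single largest mip accounts for roughly 75% of a texture's total storage. The sketch below (plain Python, not an Unreal Engine API; `mip_chain_bytes` is a hypothetical helper for illustration) estimates uncompressed mip chain sizes under that assumption:

```python
# Hypothetical illustration (not an Unreal Engine API): estimate the bytes a
# full mip chain occupies, and how much is saved by dropping the largest
# mip levels before packaging. Assumes uncompressed texels of a fixed size.

def mip_chain_bytes(width, height, bytes_per_texel=4, dropped_top_mips=0):
    """Total bytes for a full mip chain, optionally skipping the N largest mips."""
    total = 0
    level = 0
    while True:
        if level >= dropped_top_mips:
            total += width * height * bytes_per_texel
        if width == 1 and height == 1:
            break  # reached the 1x1 mip; chain is complete
        # each successive mip halves each dimension (clamped at 1)
        width = max(1, width // 2)
        height = max(1, height // 2)
        level += 1
    return total

full = mip_chain_bytes(4096, 4096)
trimmed = mip_chain_bytes(4096, 4096, dropped_top_mips=1)
print(f"full chain:       {full:,} bytes")
print(f"top mip dropped:  {trimmed:,} bytes ({trimmed / full:.1%} of original)")
```

Dropping just the top mip of a 4096x4096 texture leaves about a quarter of the original chain on disk, which is the same kind of late-in-production trade-off the geometry tools described above aim to offer.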