Voxel Engine

This project is somewhat special to me, as it's actually one of the first projects I ever started. I've built on it for so many years now, updating, replacing and rewriting it as I got better at programming, that it looks nothing like it did when I started. It's a real ship of Theseus!

The project is meant to be a sandbox space game.

The main part of the project is a voxel engine I've slowly been working on for a few years now, and I thought I'd write a little about it and what it can do.

The Chunking System

At the highest level the engine consists of Substrates, which are the method by which chunks are laid out and LODed. Currently the engine can handle FractalSubstrates and LatticeSubstrates.

A FractalSubstrate is what I call an octree-based chunk layout. This is used for things with finite size, roughly cube-shaped bounds and a need for large-scale LODing. Currently it's primarily used for making spherical planets that can LOD the terrain around the player, so things in the distance can be seen without causing performance concerns. It also allows you to see the planet from space and have a (somewhat) smooth transition as you approach it.

A LatticeSubstrate is what I call the more conventional style of chunking, where you have an infinite grid in which chunks are loaded around the player as they move. This method is better for things of unknown size where the number of chunks may shrink or grow, like a player-built structure.

The substrates essentially define a set of points where chunks can be loaded, governed by a set of rules. Both kinds can be attached to a transform, allowing things like orbiting planets and flying player-made spaceships.
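
To make that a bit more concrete, here's a minimal sketch of what such a substrate abstraction could look like. All names and signatures here (ISubstrate, ActiveChunks, the load radius) are made up for illustration and aren't the engine's actual API.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: a substrate decides which chunk positions should be
// resident (and at what LOD) for a given viewer position.
public interface ISubstrate
{
    // Transform the substrate is attached to, so chunks follow an orbiting
    // planet or a flying player-built ship.
    Transform Root { get; }

    // Chunk coordinates (plus LOD level) that should currently be loaded.
    IEnumerable<(Vector3Int coord, int lod)> ActiveChunks(Vector3 viewerWorldPos);
}

// Grid-style ("lattice") substrate: an unbounded grid of chunk slots, of which
// only the ones within a load radius of the viewer are enumerated.
public sealed class LatticeSubstrate : ISubstrate
{
    public Transform Root { get; }
    readonly float chunkSize;
    readonly int loadRadius; // in chunks

    public LatticeSubstrate(Transform root, float chunkSize, int loadRadius)
    {
        Root = root;
        this.chunkSize = chunkSize;
        this.loadRadius = loadRadius;
    }

    public IEnumerable<(Vector3Int coord, int lod)> ActiveChunks(Vector3 viewerWorldPos)
    {
        // Work in the substrate's local space so the grid moves with Root.
        Vector3 local = Root.InverseTransformPoint(viewerWorldPos);
        Vector3Int center = Vector3Int.FloorToInt(local / chunkSize);

        for (int x = -loadRadius; x <= loadRadius; x++)
            for (int y = -loadRadius; y <= loadRadius; y++)
                for (int z = -loadRadius; z <= loadRadius; z++)
                    yield return (center + new Vector3Int(x, y, z), 0); // single LOD here
    }
}
```

A FractalSubstrate would implement the same kind of interface but walk an octree instead, subdividing cells near the viewer into finer chunks.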

FractalSubstrate detail visualization of a planet around the player

Terrain System

All substrates hold a type of chunk. There are different types for different things, but for now I'll focus on the terrain chunks.

This chunk type uses a DataGenerator (a generalized voxel sampler) to produce a 3D array of voxel data on a set of worker threads, before meshing the data with a marching cubes algorithm. The chunk then grabs a prepared GameObject from a pool to visualize itself if needed; otherwise it can just sit in memory without being rendered. The system is also built so that chunks are completely event driven, meaning there's no tick or update loop running in the background eating performance.
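
As a hedged sketch of that pipeline (the delegates stand in for the real DataGenerator, mesher and pool, and the actual chunk code certainly looks different), the flow is roughly:

```csharp
using System;
using System.Threading.Tasks;
using UnityEngine;

// Plain-array mesh output so the worker thread never touches Unity's Mesh API.
public struct MeshData
{
    public Vector3[] Vertices;
    public int[] Triangles;

    public Mesh ToMesh()
    {
        var mesh = new Mesh { vertices = Vertices, triangles = Triangles };
        mesh.RecalculateNormals();
        return mesh;
    }
}

// Hypothetical shape of the terrain chunk pipeline: sample voxel data and run
// marching cubes on a worker thread, then only rent a pooled GameObject if the
// chunk actually needs to be visible.
public sealed class TerrainChunk
{
    float[,,] voxels;    // kept in memory even when the chunk isn't rendered
    GameObject view;     // pooled visual, null for data-only chunks

    public async Task GenerateAsync(
        Func<float[,,]> sampleVoxels,             // DataGenerator stand-in
        Func<float[,,], MeshData> marchingCubes,  // mesher stand-in
        Func<GameObject> rentPooledObject,        // GameObject pool stand-in
        bool visible)
    {
        // Heavy work happens off the main thread.
        MeshData meshData = await Task.Run(() =>
        {
            voxels = sampleVoxels();
            return marchingCubes(voxels);
        });

        // Back on the calling (main) thread via Unity's synchronization context:
        // attach a visual only if something is going to look at this chunk.
        if (!visible) return;
        view = rentPooledObject();
        view.GetComponent<MeshFilter>().sharedMesh = meshData.ToMesh();
    }
}
```

Because nothing polls the chunk, it only does work when an event (generation, an edit, a LOD change) actually reaches it.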

The chunks can also be interacted with through their substrate. Terrain chunks, for instance, can accept arbitrary SDF formulas to modify the chunk data. That means a player tool can simply send an SDF of choice to the substrates in range to achieve smooth terrain deformations.
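
For illustration, here's roughly what a single SDF edit could look like, assuming the voxel data is stored as signed distances (negative inside terrain); the engine's actual convention and edit plumbing may well differ.

```csharp
using UnityEngine;

// Illustrative only: apply a sphere SDF as a CSG edit to a chunk's voxel data.
// Convention assumed here: each voxel stores a signed distance, negative inside
// solid terrain. Subtracting the sphere digs a hole; a union would add material.
public static class SdfEdits
{
    public static void SubtractSphere(float[,,] voxels, Vector3 chunkOrigin, float voxelSize,
                                      Vector3 sphereCenter, float radius)
    {
        for (int x = 0; x < voxels.GetLength(0); x++)
        for (int y = 0; y < voxels.GetLength(1); y++)
        for (int z = 0; z < voxels.GetLength(2); z++)
        {
            // World position of this voxel sample.
            Vector3 p = chunkOrigin + new Vector3(x, y, z) * voxelSize;

            // Sphere SDF: negative inside the sphere, positive outside.
            float sphere = Vector3.Distance(p, sphereCenter) - radius;

            // CSG subtraction: terrain minus sphere = max(terrain, -sphere).
            voxels[x, y, z] = Mathf.Max(voxels[x, y, z], -sphere);
        }
    }
}
```

A dig tool would then send an edit like this to every chunk whose bounds overlap the sphere and let those chunks re-mesh.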

Almost the entire engine is multithreaded to speed up chunk generation and deformation. I've also investigated using the GPU at some points, but ultimately ruled against it because chunk generation stole all the GPU time and caused frame drops.

Texturing System

Each voxel in a terrain chunk has a VoxelMaterial. These VoxelMaterials hold, among other things, the different texture maps for the PBR material.

A good question now becomes how to texture the terrain, given that we potentially have hundreds of unique VoxelMaterials in a single chunk mesh, not to mention the hundreds of chunks visible at any one time in the scene.

The way I solve it is by automatically finding all VoxelMaterials on game boot, reading their texture maps, and then stitching them together into a set of atlases. That solves the problem of getting all the textures to the GPU, but how do we UV map the terrain when it can generate with effectively any geometry? The way I do it is with a custom HLSL shader. It implements a combination of triplanar mapping and splat mapping, but with a twist!
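
As a rough illustration of the boot-time step (the VoxelMaterial fields, the Resources folder and the use of Unity's Texture2D.PackTextures are all assumptions for the sketch, not necessarily how the engine actually does it):

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Minimal stand-in for the engine's VoxelMaterial; only the albedo map is shown.
// The other PBR maps would get their own atlases, packed in the same order so a
// single atlas id stays valid across all of them.
public class VoxelMaterial : ScriptableObject
{
    public Texture2D Albedo;
}

// Rough sketch of the boot-time atlas build.
public static class VoxelAtlasBuilder
{
    public static (Texture2D atlas, Dictionary<VoxelMaterial, int> atlasIds, Rect[] uvRects)
        BuildAlbedoAtlas()
    {
        // Find every VoxelMaterial (assumed here to live under a Resources folder).
        VoxelMaterial[] materials = Resources.LoadAll<VoxelMaterial>("VoxelMaterials");

        // Stitch all albedo maps into one atlas; PackTextures returns where each
        // texture ended up as a normalized Rect.
        var atlas = new Texture2D(4096, 4096);
        Rect[] uvRects = atlas.PackTextures(materials.Select(m => m.Albedo).ToArray(), 2, 4096);

        // Each material's index doubles as the "atlas id" the mesh encodes per vertex.
        var atlasIds = new Dictionary<VoxelMaterial, int>();
        for (int i = 0; i < materials.Length; i++)
            atlasIds[materials[i]] = i;

        return (atlas, atlasIds, uvRects);
    }
}
```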

While the chunks are generating their mesh, we decide which voxel has the most influence on each vertex in the mesh and encode that material's texture atlas id in the vertex colors for that vertex. For each triangle in the mesh, the encoded atlas id goes in the R, G or B channel corresponding to that vertex's index within the triangle (0, 1 or 2), and that same index goes in the A channel.

To recap: each vertex now has a vertex color carrying its own atlas id in the channel matching its index within its triangle (vertex 0 in R, vertex 1 in G, vertex 2 in B), zero in the other two channels, and that index itself in A.
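
On the meshing side, that packing could look something like this (hypothetical code; the 8-bit id range and the 0/0.5/1 encoding of the index are assumptions made for the sketch):

```csharp
using UnityEngine;

// Illustrative packing of atlas ids into vertex colors during meshing.
// Assumes at most 256 atlas ids and that each triangle owns its own three
// vertices, so every vertex belongs to exactly one triangle.
public static class AtlasIdPacking
{
    // cornerAtlasIds[t * 3 + i] = atlas id of the dominant VoxelMaterial for
    // vertex i (0..2) of triangle t.
    public static Color[] PackVertexColors(int[] cornerAtlasIds)
    {
        var colors = new Color[cornerAtlasIds.Length];
        for (int v = 0; v < cornerAtlasIds.Length; v++)
        {
            int indexInTriangle = v % 3;                  // 0, 1 or 2
            float encodedId = cornerAtlasIds[v] / 255f;   // id squeezed into a color channel

            // The id goes in the channel matching this vertex's slot in the
            // triangle; the slot itself goes in A (as 0, 0.5 or 1 so it
            // survives 8-bit vertex colors).
            var c = new Color(0f, 0f, 0f, indexInTriangle * 0.5f);
            c[indexInTriangle] = encodedId;               // 0 -> R, 1 -> G, 2 -> B
            colors[v] = c;
        }
        return colors;
    }
}
```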

In the vertex shader we read the vertex color's A channel to get the vertex's index within its triangle. We then create a float3 that is 0 in every channel except the one corresponding to that index, where we put a 1, and pass that float3 to the pixel shader alongside the RGB part of the vertex color.

In the pixel shader, the vertex color values from the three vertices have been interpolated together, mangling the atlas ids inside. But we can descramble them by dividing the interpolated vertex color by the extra float3 we passed along. Since each vertex put its 1 in the same channel as its atlas id, the interpolated float3 holds exactly the weight each id was multiplied by: we've captured the barycentric weights! Dividing by them undoes the interpolation, leaving the encoded atlas ids from all three vertices of the triangle in their respective R, G and B channels for this pixel. Undoing the id encoding and converting the ids to atlas UV coordinates lets us sample the atlases.

Finally, we blend the sampled colors back together using the captured weights and some splat maps to get a more natural look. For the UV coordinate within each material, we use triplanar mapping with a hard blend edge.
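
To make the decode step concrete, here's the same per-pixel math written out as plain C# rather than HLSL, purely for readability; the id range matches the packing sketch above and is an assumption.

```csharp
using UnityEngine;

// The per-pixel decode described above, as plain C# (in the engine this math
// lives in the HLSL pixel shader).
//   interpolatedColor.rgb = (id0 * w0, id1 * w1, id2 * w2)
//   interpolatedMask      = (w0, w1, w2), the one-hot float3 from the vertex shader,
// where w0..w2 are the pixel's barycentric weights inside its triangle.
public static class AtlasIdDecode
{
    public static (int id0, int id1, int id2, Vector3 blendWeights)
        Decode(Color interpolatedColor, Vector3 interpolatedMask)
    {
        // Dividing each channel by its barycentric weight undoes the interpolation
        // and recovers the value each vertex originally wrote. The epsilon guards
        // against division by zero; when a weight is ~0 the recovered id is garbage,
        // but its blend weight is also ~0 so it never shows up on screen.
        float r = interpolatedColor.r / Mathf.Max(interpolatedMask.x, 1e-5f);
        float g = interpolatedColor.g / Mathf.Max(interpolatedMask.y, 1e-5f);
        float b = interpolatedColor.b / Mathf.Max(interpolatedMask.z, 1e-5f);

        // Undo the id encoding from the meshing side (id was stored as id / 255).
        int id0 = Mathf.RoundToInt(r * 255f);
        int id1 = Mathf.RoundToInt(g * 255f);
        int id2 = Mathf.RoundToInt(b * 255f);

        // The mask itself doubles as the blend weight for each id's sampled texture.
        return (id0, id1, id2, interpolatedMask);
    }
}
```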


There's so much more to this project, but for now I'll end it here.