Foliage Brush | Specialization
For my specialization, I consulted the designers and artists on Solar Uprising about a tool that would be of genuine use to them. Among the handful of ideas they presented, the one that stood out was a foliage brush. From the initial conception of the idea, I knew it would be a great way to challenge myself.
In searching for published implementations of such a tool, I only came across terrain brushes. While those implementations utilize virtual texturing, my research into the technique led me to conclude that I would not be able to implement it within the timeframe of our specialization course. I therefore opted to design the entire tool independently, without any external guiding resources.
I knew from the beginning that I could retrieve the world position under the mouse cursor from the picking logic and place each foliage object there. The problem with this approach is that it yields only a single position, so foliage can only be placed at a single point; if I tried to distribute objects over an area around that point, I could not guarantee that they would not end up inside other objects. So I turned to decals. Because a decal is just a projection of an image, it solves two problems with a single solution: the placement problem just described, and visualizing the area affected by the brush.
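The picking step mentioned above can be sketched as a standard unprojection: take the cursor's normalized device coordinates plus the depth-buffer value and push them through the inverse view-projection matrix. This is a minimal illustration, not the project's actual picking code; the matrix layout and function names are assumptions.

```cpp
#include <array>

using Vec4 = std::array<float, 4>;
using Mat4 = std::array<Vec4, 4>; // row-major 4x4 matrix

// Multiply a row-major 4x4 matrix with a column vector.
Vec4 mul(const Mat4& m, const Vec4& v) {
    Vec4 r{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            r[i] += m[i][j] * v[j];
    return r;
}

// Unproject the mouse cursor: ndcX/ndcY in [-1,1], depth read from the
// depth buffer, invViewProj = inverse of the camera's view-projection.
Vec4 mouseToWorld(float ndcX, float ndcY, float depth, const Mat4& invViewProj) {
    Vec4 world = mul(invViewProj, Vec4{ndcX, ndcY, depth, 1.0f});
    const float invW = 1.0f / world[3];
    for (float& c : world) c *= invW; // perspective divide
    return world;
}
```

The resulting world position is where the decal (and therefore the brush) is centered.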
For the GPU side of the process, I opted to store the data in a texture that replicates the decal. To achieve this, an RWTexture was bound to a compute shader that cycles through every pixel of the position texture.
Step by step
1. Place the decal texture at the picking position (the world-space mouse position).
2. Get the world positions from the area affected by the decal.
3. Transform the decal-local position into a UV coordinate that we use to write into a texture.
4. Check whether the brush texture's alpha value at that UV coordinate is above zero; otherwise, discard the pixel.
5. Write the data into an RWTexture at the calculated UV coordinate.
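The steps above can be sketched as follows. This is a CPU-side illustration of the compute shader's per-pixel logic, not the shader itself; it assumes a top-down decal projected onto the XZ plane, and all names (`gatherBrushPixels`, `brushAlpha`, and so on) are hypothetical.

```cpp
#include <vector>

struct Float3 { float x, y, z; };

// CPU sketch of the compute pass.
// worldPos:    per-screen-pixel world positions (the position texture)
// decalCenter: world-space picking position the decal is centered on
// decalSize:   world-space extent of the decal on the XZ plane
// brushAlpha:  samples the brush texture's alpha at a UV coordinate
// out:         stands in for the RWTexture (resolution outRes x outRes)
void gatherBrushPixels(const std::vector<Float3>& worldPos,
                       Float3 decalCenter, float decalSize,
                       float (*brushAlpha)(float u, float v),
                       std::vector<Float3>& out, int outRes) {
    for (const Float3& p : worldPos) {
        // Steps 2-3: world position -> decal-local -> UV in [0,1]
        float u = (p.x - decalCenter.x) / decalSize + 0.5f;
        float v = (p.z - decalCenter.z) / decalSize + 0.5f;
        if (u < 0.f || u > 1.f || v < 0.f || v > 1.f) continue; // outside decal
        // Step 4: discard pixels where the brush texture is transparent
        if (brushAlpha(u, v) <= 0.f) continue;
        // Step 5: write into the RWTexture at the calculated UV
        int x = static_cast<int>(u * (outRes - 1));
        int y = static_cast<int>(v * (outRes - 1));
        out[y * outRes + x] = p;
    }
}
```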
However, this approach has its downsides, illustrated in the images by the holes in the RWTexture. The number of screen pixels covered by the brush limits how much data is stored in the texture: if the brush covers only 100 screen pixels, at most 100 different pixels are saved in the texture. Additionally, more than one screen pixel may generate the same UV coordinate, causing them to write to the same location.
One solution to these problems is to iterate over each pixel of the decal texture instead of the screen. This guarantees that no pixels are overwritten and that no unwanted gaps appear in the texture, and it would also significantly improve performance. Unfortunately, time constraints prevented the implementation of this technique.
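Since this alternative was never implemented, the following is only a sketch of how it could look: each texel of the decal texture is visited exactly once, so there are no collisions and no gaps. The `sceneHeight` callback stands in for whatever scene lookup (for example sampling the depth buffer under the texel) would supply the surface height; all names are assumptions.

```cpp
#include <vector>

struct Float3 { float x, y, z; };

// Proposed (unimplemented) alternative: iterate over the decal texture's
// own texels instead of the screen pixels it covers.
void gatherFromDecalTexels(Float3 decalCenter, float decalSize,
                           float (*brushAlpha)(float u, float v),
                           float (*sceneHeight)(float x, float z),
                           std::vector<Float3>& out, int res) {
    for (int y = 0; y < res; ++y) {
        for (int x = 0; x < res; ++x) {
            float u = (x + 0.5f) / res;
            float v = (y + 0.5f) / res;
            if (brushAlpha(u, v) <= 0.f) continue; // transparent texel
            // UV -> world position on the decal's XZ footprint
            float wx = decalCenter.x + (u - 0.5f) * decalSize;
            float wz = decalCenter.z + (v - 0.5f) * decalSize;
            out[y * res + x] = Float3{wx, sceneHeight(wx, wz), wz};
        }
    }
}
```

Because each thread owns exactly one output texel, no two writes can target the same location.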
Furthermore, a way to erase foliage already placed in the scene was needed. The logic that checks whether a pixel is inside the decal was reused, but instead of placing foliage, the foliage ID is written to a structured buffer, which the CPU side uses later in the process to delete the objects inside the brush.
For the CPU side, I opted to store the foliage data inside foliage objects. These objects, which are stored in binary files to minimize loading time, contain a list of data structures that each store a model together with a list of matrices describing every instance of that model.
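The layout described above could look roughly like this; the type and field names are assumptions for illustration, not the project's actual identifiers.

```cpp
#include <array>
#include <string>
#include <vector>
#include <cstddef>

using Matrix4x4 = std::array<float, 16>; // one transform per placed instance

// One model plus every instance transform of that model.
struct FoliageEntry {
    std::string modelPath;
    std::vector<Matrix4x4> instances;
};

// The foliage object serialized to a binary file.
struct FoliageObject {
    std::vector<FoliageEntry> entries;
};

// Total instance count across all models (the per-model counts are what
// the tool's interface displays).
size_t instanceCount(const FoliageObject& obj) {
    size_t n = 0;
    for (const FoliageEntry& e : obj.entries) n += e.instances.size();
    return n;
}
```

Keeping the matrices contiguous per model also lines the data up for instanced rendering later.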
After the data from the GPU has been stored, we can pick a random pixel in the texture and check that its alpha value is valid. If it is, we create an instance of the selected model at that pixel's position.
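That selection step might look like the sketch below: pick a random pixel from the texture the GPU filled in, and reject it if its alpha marks it as discarded. The `Pixel` layout and function name are hypothetical.

```cpp
#include <vector>
#include <random>

struct Pixel { float x, y, z, a; }; // world position + alpha from the RWTexture

// Pick a random pixel and validate its alpha. Returns false when the
// chosen pixel was discarded earlier (alpha <= 0), so the caller can
// simply retry on the next placement tick.
bool tryPickSpawnPoint(const std::vector<Pixel>& texture,
                       std::mt19937& rng, Pixel& outPoint) {
    if (texture.empty()) return false;
    std::uniform_int_distribution<size_t> dist(0, texture.size() - 1);
    const Pixel& p = texture[dist(rng)];
    if (p.a <= 0.f) return false; // invalid alpha -> no instance this time
    outPoint = p;
    return true;
}
```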
Once the logic was complete, I started developing the tool's interface. The first step was to let users add new foliage objects or select one that is already in the scene. An option was also added to change the brush's size and to toggle its mode between painting and erasing. Each model is listed in the interface together with its number of instances, and the outline of the model currently selected for the brush is highlighted to maximize clarity. Every model also has its own subset of settings that control the speed at which the brush places models, as well as the randomization of their size, rotation, and position offsets. By selecting a checkbox, the user can make each object's rotation match the normal of the surface it is painted onto. Lastly, to add a new model the user only has to drop a prefab file from the asset browser onto the drop box at the bottom.
Future improvements
Implementing the aforementioned technique of iterating over the decal texture to gather the data on the GPU side.
Implementing instanced rendering.
Increasing the customization options for the brush and adding the ability to draw several different models at once.