Next steps for Bevy materials in 0.18
Bevy saw an enormous milestone in the 0.16 release in its transition to a GPU-driven render architecture, led by the work of @pcwalton. However, this work had the consequence of more thoroughly bifurcating Bevy rendering into a world of 3d materials, which have access to high-performance patterns, and everything else, including 2d and UI materials, but also all other uses of the mid-level rendering APIs.
"GPU driven rendering" or "bindless" are catch-alls for a combination of many techniques1, some of which may individual be useful to custom rendering code:
- Efficient mesh upload and packing of vertex data, which requires complicated machinery to track where a given mesh's data exists in a vertex buffer when calling draw.
- Bind group creation and upload, particularly for bindless, which requires using advanced features of `AsBindGroup` and ideally supports transparent switching between bindless and bindful modes (see the sketch after this list).
- GPU pre-processing, which enables the use of multi-draw indirect rendering, as well as GPU frustum culling and experimental occlusion culling.
- Cached specialization information, which stores a renderable item's pipeline on the basis of change-detection data.
- Retained draw bins, used mainly for opaque items, which prevent unnecessary bookkeeping in the queue phase.
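To make the `AsBindGroup` point concrete, here's a rough sketch of a material that opts into bindless, with the derive falling back to ordinary bindful bind groups on platforms without binding-array support. `SketchMaterial` is illustrative, not real Bevy API, and the exact `bindless` attribute syntax varies between Bevy versions:

```rust
use bevy::prelude::*;
use bevy::render::render_resource::AsBindGroup;

// Sketch only: the `bindless` attribute syntax here is approximate.
#[derive(Asset, TypePath, AsBindGroup, Debug, Clone)]
#[bindless]
struct SketchMaterial {
    // In bindless mode these become entries in shared binding arrays,
    // indexed per-instance; in bindful mode they are a plain bind group.
    #[uniform(0)]
    color: LinearRgba,
    #[texture(1)]
    #[sampler(2)]
    color_texture: Option<Handle<Image>>,
}
```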
The material system is also limited in some ways. Namely, it's tightly coupled to the `Material` trait, which also makes certain assumptions about how a renderable item is structured:
- It assumes that it has a mesh and that the mesh is managed by the material mesh allocator. This is a problem for items that may wish to implement techniques like vertex pulling, compute-driven mesh generation, or hard-coding vertex data in the vertex shader.
- It requires using the `DrawMaterial` draw function, which makes certain assumptions about the layout of a material and prevents techniques like binding additional bind groups or customizing the view bind group (see the sketch after this list).
- It does not have mechanisms for techniques like instance-step vertex data or customizing additional vertex buffers.
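For reference, `DrawMaterial` is, schematically, a fixed tuple of render commands along the following lines (simplified from `bevy_pbr`; treat the bind group indices and import paths as approximate rather than a copy of the real definition). Every material is funneled through this exact sequence, which is what forecloses extra bind groups or a customized view bind group:

```rust
use bevy::pbr::{DrawMesh, SetMaterialBindGroup, SetMeshBindGroup, SetMeshViewBindGroup};
use bevy::render::render_phase::SetItemPipeline;

// Simplified schematic of bevy_pbr's draw function.
type DrawMaterialSketch<M> = (
    SetItemPipeline,            // pipeline chosen during specialization
    SetMeshViewBindGroup<0>,    // the view bind group, at a fixed slot
    SetMeshBindGroup<1>,        // mesh data owned by the mesh allocator
    SetMaterialBindGroup<M, 2>, // the material's own bind group
    DrawMesh,                   // always draws an allocator-managed mesh
);
```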
The following are some of my personal thoughts about where Bevy rendering should, and hopefully will, go during the 0.18 dev cycle.
Towards data-driven rendering
Ideally, most custom rendering use cases would be able to make use of the material system without needing to write additional mid-level rendering code. The goal here is to make material rendering more data-driven, allowing all of its data to be specified and customized at runtime, without making specific assumptions about the shape of a higher-level material API.
A first step towards this was the move to "type-erased materials", which removed the `Material` trait bound from all render-world systems, allowing for the first time the possibility of creating a material that does not pass through the `Material` trait. While this is still a difficult API to work with, it demonstrates the way in which we can start to experiment with the high-level material API by decoupling it from the actual render implementation. The `Material` trait that most users interact with will remain stable, and most of the changes described in this post should not affect most users.
The goal here is that rendering should be ECS driven. Rendering customization should be configured on the basis of components that are added or removed from a material asset-entity. For example, the prepass can know whether it needs to run on the basis of a variety of components being present instead of being hard-coded into the concept of "material".
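As a hypothetical sketch of what that could look like (none of these component names exist in Bevy today), the prepass machinery would simply query for opt-in markers on material asset-entities:

```rust
use bevy::prelude::*;

// Hypothetical opt-in marker placed on a material's asset-entity.
#[derive(Component)]
struct NeedsDepthPrepass;

// A render-world system can then decide per-entity whether the prepass
// has any work to do, instead of that being baked into `Material`.
fn queue_prepass(materials: Query<Entity, With<NeedsDepthPrepass>>) {
    for _material in &materials {
        // ...queue this material into the prepass phase...
    }
}
```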
What even is a material?
There's a definition that is derived from PBR rendering, which is that a material is a surface that receives lighting. In other words, a material is a BRDF used for shading. However, there's another, more mechanical definition, which is just that a material is an item that's being rendered. In this sense, it's worth asking what the minimal definition of a material is: what does a material require? Well, in its simplest form, a material is just a render pipeline we'll call draw on. It doesn't even require a fragment shader, just a vertex shader.
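To make that concrete, here's a minimal sketch in raw `wgpu` (field names per recent `wgpu` versions; details shift between releases): a depth-only "material" that is nothing but a pipeline with a vertex shader and no fragment stage.

```rust
fn minimal_material(
    device: &wgpu::Device,
    shader: &wgpu::ShaderModule,
) -> wgpu::RenderPipeline {
    device.create_render_pipeline(&wgpu::RenderPipelineDescriptor {
        label: Some("minimal_material"),
        layout: None, // inferred from the shader; no bind groups required
        vertex: wgpu::VertexState {
            module: shader,
            entry_point: Some("vertex"),
            compilation_options: Default::default(),
            buffers: &[], // not even a vertex buffer is strictly needed
        },
        primitive: wgpu::PrimitiveState::default(),
        // Depth-only: write depth, attach no color targets.
        depth_stencil: Some(wgpu::DepthStencilState {
            format: wgpu::TextureFormat::Depth32Float,
            depth_write_enabled: true,
            depth_compare: wgpu::CompareFunction::GreaterEqual,
            stencil: wgpu::StencilState::default(),
            bias: wgpu::DepthBiasState::default(),
        }),
        multisample: wgpu::MultisampleState::default(),
        fragment: None, // no fragment shader required
        multiview: None,
        cache: None,
    })
}
```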
2d/3d unification
The existence of the 2d rendering API has long been a source of technical debt within Bevy rendering code. While it is important to provide high-level rendering APIs that are tailored to 2d-specific use cases, the existence of separate rendering paths for 2d materials has led to a copy/paste problem where every new rendering feature must be duplicated into the 2d codebase. This has led to 2d trailing behind 3d in an unfortunate way.
From the perspective of the renderer, everything is 3d anyway. GPUs speak 3d and in this sense 2d is just a 3d scene with an orthographic projection where Z is used as a kind of Z-index. Whether it's rendering a 3d or 2d opaque item, our renderer shouldn't really have to care.
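A small sketch of this observation with Bevy's own types (assuming a recent Bevy where `OrthographicProjection::default_3d` exists; treat the details as illustrative): a "2d camera" is just a 3d camera with an orthographic projection, with z acting as the z-index.

```rust
use bevy::prelude::*;

fn setup(mut commands: Commands) {
    // A "2d" view, expressed with the 3d camera types: an orthographic
    // projection looking down -Z, where an item's z simply orders it.
    commands.spawn((
        Camera3d::default(),
        Projection::Orthographic(OrthographicProjection::default_3d()),
        Transform::from_xyz(0.0, 0.0, 1000.0),
    ));
}
```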
Additionally, while some rendering features may be a bit odd or unusual to use in 2d sprite-based games, there's fundamentally no reason that we should prohibit 2d from using things like the prepass or deferred lighting. We may not choose to surface these techniques at the highest level of the 2d API, but Bevy's interest is in being a modular renderer, and we should never remove the ability to do something because "it doesn't make sense." Games are ultimately about creative expression, and we should give users the tools to do whatever they want even as we express more opinionated APIs.
As such, our goal is to fully unify the 2d and 3d rendering code paths. These changes primarily affect render-world implementation details. While this may take a few steps and will not impact any code that is visible to most users, our goal is that new features implemented for 3d should be possible to provide to 2d. This will result in less tech debt and more opportunities to focus on high-level 2d APIs instead of copy/pasting rendering code.
bevy_render-less scenes and a wgpu-types-driven API
As a corollary, it's become necessary for us to spin out much of the rendering material code into a new crate that can be shared between 2d and 3d.
In the 0.17 cycle, @atlv24 accomplished a number of crate refactors aimed at making it possible to fully define scenes without any dependency on `bevy_render`, i.e. on `wgpu`. The heuristic that's been used is essentially to reify `wgpu-types` as the public API for the renderer.
Our goal here as it relates to materials is to create a new `bevy_material` crate that includes mid- and high-level material APIs. In order to do so, we need to make some changes to how `BindGroupLayout`s are managed, as these are a primary `wgpu` item that is currently exposed via the material API. Our proposal is to specify things in terms of `BindGroupLayoutEntries` via the introduction of a caching layer that manages layout creation, so that even `RenderPipelineDescriptor` can be used in a purely `wgpu-types` context.
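A hypothetical sketch of that caching layer (no such type exists yet; `BindGroupLayoutCache` and its method are illustrative): callers describe layouts purely as `wgpu-types` data and never touch the device, while the cache creates each distinct layout exactly once.

```rust
use std::collections::HashMap;

#[derive(Default)]
struct BindGroupLayoutCache {
    layouts: HashMap<Vec<wgpu::BindGroupLayoutEntry>, wgpu::BindGroupLayout>,
}

impl BindGroupLayoutCache {
    // A real implementation would avoid allocating the key on the hot
    // path; this sketch favors clarity over performance.
    fn get_or_create(
        &mut self,
        device: &wgpu::Device,
        entries: &[wgpu::BindGroupLayoutEntry],
    ) -> &wgpu::BindGroupLayout {
        self.layouts.entry(entries.to_vec()).or_insert_with(|| {
            device.create_bind_group_layout(&wgpu::BindGroupLayoutDescriptor {
                label: Some("cached_layout"),
                entries,
            })
        })
    }
}
```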
While the exact line of demarcation between `bevy_material` and code that must live in `bevy_render` still isn't entirely clear, it's our hope that a large amount of lower-level material code can exist in `bevy_material`, opening up the possibility of a non-`wgpu` backend making use of a significant portion of our lower-level material API.
Bindless UI and a UI camera
Bevy's UI rendering has been somewhat neglected and currently exists in an awkward place relative to camera-driven rendering, as described in a recent rant about our camera APIs. Like 2d, we ultimately want UI to share the same rendering paths and be able to take advantage of things like bindless UI materials.
This will likely require the introduction of a dedicated `CameraUi` component that allows UI to follow the same patterns as 2d/3d. While this may surface some issues discussed in the post linked above, like requiring users to better understand how compositing with post-processing effects works in Bevy, it will ultimately allow us to better bring UI into the fold of the renderer rather than it feeling a bit like a side-project.
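A hypothetical sketch of the pattern (`CameraUi` does not exist yet; the component and the ordering here are illustrative):

```rust
use bevy::prelude::*;

// Hypothetical marker; today UI piggybacks on an existing camera instead.
#[derive(Component)]
struct CameraUi;

fn setup(mut commands: Commands) {
    commands.spawn((
        Camera2d,
        CameraUi,
        Camera {
            // Composite UI after the main world camera has rendered.
            order: 1,
            ..default()
        },
    ));
}
```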
My hope is that this will also allow more experimentation and interoperability for use cases like worldspace UI, which users have long requested. Additionally, while UI performance is not a primary concern for most games, having access to all of the GPU-driven rendering techniques may help some more advanced use cases, including a future Bevy editor or other desktop apps that make extensive use of Bevy UI.
Contentious!
This is likely a somewhat controversial change, as there are advantages to the auto-configured parasitic camera UI pattern. However, I personally feel pretty strongly that, given the existing camera-driven rendering idioms, it makes sense to think of compositing at a multi-camera level so that users can consider which fullscreen effects apply per-camera and which apply to the final render target. At the very least, this is something that deserves to be debated with actual code on the table.
Custom render phases
Describing the relationship between cameras, render phases, and render graphs is best left to another post. But one piece of extensibility that is currently missing is the ability to define custom render phases while still taking advantage of material-driven rendering.
In short, `View`s have render phases, and render phases are used by a variety of `View`s; that is, they exist in a many-to-many relation. It's my hope that some form of many-to-many relations can land soon, which would enable us to start exploring the use of more ECS techniques within the renderer itself.
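As a purely hypothetical illustration of the shape of that relation (hand-rolled here; a future relations API would presumably manage both sides automatically):

```rust
use bevy::prelude::*;

// Each view entity lists the phase entities it renders with...
#[derive(Component, Default)]
struct ViewPhases(Vec<Entity>);

// ...and each phase entity lists the views that use it. Keeping these
// two sides in sync by hand is exactly the bookkeeping a first-class
// many-to-many relations API would absorb.
#[derive(Component, Default)]
struct PhaseViews(Vec<Entity>);
```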
The addition of custom render phases would provide a final missing piece, allowing users to fully customize their render graph while still taking advantage of Bevy's sophisticated rendering code.
Conclusion
One of Bevy's core principles is modularity. And while the renderer has taken great steps in terms of both features and performance, it's important that we continue to push on making the renderer more fully modular at all levels. Not only will this benefit our own codebase, but it will hopefully also provide many opportunities for people to express more advanced rendering techniques without reinventing the wheel.
We'll see how much gets done during our 0.18 cycle, but I'm extremely excited about this work and the kinds of possibilities it opens up for more future-facing and experimental work on what a more fully ECS-driven high-level material API might look like!
Footnotes
1. Indeed, some might argue that we still do not have "real" bindless in Bevy due to the lack of mutable bind groups. ↩