Bevy Rendering

Rendering in Bevy is done through the wgpu library, which allows Bevy to render in both native and web-based environments.

wgpu is a safe, portable graphics library for Rust based on the WebGPU API.

In your Bevy game there are two big things going on each frame:

  1. Simulation, where we run our game logic
  2. Rendering, where we actually draw things on the screen

These two parts run in parallel, and rendering is by far the less familiar of the two since most of it happens behind the scenes.

The original rendering pipeline Bevy had was overcomplicated, with self-invented abstractions that made it quite hard to learn. Sprite rendering was slow, rendering to multiple windows was difficult, and its custom integration of wgpu lagged behind upstream.

In Bevy 0.6 we got the first implementation of this new render pipeline, showcased as faster, simpler, and more modular while delivering better looking renders.

The new renderer was also “ECS-driven”, meaning the “Render World” is populated with data extracted from the “Main World”. It also means that views such as a camera can be modified with additional rendering components.

Because drawing things on screen is quite expensive, Bevy automatically uses the camera’s viewpoint to determine which things to draw and which to ignore, in a process called “frustum culling”.
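
Culling can occasionally be wrong, for example when a shader displaces vertices beyond an entity’s computed bounds. Bevy’s NoFrustumCulling marker component opts an entity out; a minimal fragment, assuming commands plus mesh_handle and material_handle are already in scope:

use bevy::render::view::NoFrustumCulling;

commands.spawn((
    PbrBundle {
        mesh: mesh_handle,
        material: material_handle,
        ..default()
    },
    // Never cull this entity, even when it is outside the camera frustum
    NoFrustumCulling,
));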

The Render Pipeline

There are 5 steps to the Bevy rendering pipeline, sketched in code right after this list:

  1. Extract, where all the info required to render is copied out of our game world
  2. Prepare, where we set up all the vertex data and write the vertex buffers
  3. Queue, where we create the pipeline, set up the bind groups, and add entities to a “render phase” (the list of items we will use to perform our draw calls)
  4. Render Graph, which sits outside the normal ECS system flow and calls out to each node to generate draw calls
  5. Draw Functions, where we use the RenderCommands generated in the previous step to actually draw to the screen
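
To make the steps concrete, here is a hedged sketch of how systems for these stages get registered (Bevy 0.11-era schedule and set names, which have shifted between versions; the glow systems are hypothetical stubs we flesh out below):

use bevy::prelude::*;
use bevy::render::{ExtractSchedule, Render, RenderApp, RenderSet};

// Hypothetical stubs standing in for real extract/prepare/queue systems
fn extract_glows() {}
fn prepare_glows() {}
fn queue_glows() {}

struct GlowRenderPlugin;

impl Plugin for GlowRenderPlugin {
    fn build(&self, app: &mut App) {
        // Rendering systems live in a separate render sub-app, not the main App
        let render_app = app.sub_app_mut(RenderApp);
        render_app
            .add_systems(ExtractSchedule, extract_glows)
            .add_systems(
                Render,
                (
                    prepare_glows.in_set(RenderSet::Prepare),
                    queue_glows.in_set(RenderSet::Queue),
                ),
            );
    }
}

The render graph and draw function steps are not plain systems, which is why nothing is registered for them here.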

Extraction

The goal of this step is to extract all the data we need from the game world so both simulation and rendering can continue without worrying about each other.

It acts as a sync point: both simulation and rendering lock during this step and cannot continue until it is finished.

Because of its blocking nature it’s important to keep this step as fast as possible: just copy values and avoid any heavy algorithms.

Two systems are added to your application to handle this: one to extract the camera view and another to extract the UI nodes. The camera extraction produces an ExtractedView component, which is wrapped into the render phase used during the queue step.

We need the ExtractedView, which takes our camera’s view into account, so we can project our nodes onto the screen relative to its viewpoint.
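
Custom extraction systems follow the same pattern. A minimal sketch fleshing out the hypothetical extract_glows stub from earlier:

use bevy::prelude::*;
use bevy::render::Extract;

// Main World component we want to render (hypothetical)
#[derive(Component, Clone, Copy)]
struct Glow {
    strength: f32,
}

// Render World copy produced during extraction (hypothetical)
#[derive(Component)]
struct ExtractedGlow {
    strength: f32,
}

// Runs in the Render World, but the Extract parameter lets it read the
// Main World. Keep it cheap: just copy the values you need.
fn extract_glows(mut commands: Commands, glows: Extract<Query<(Entity, &Glow)>>) {
    for (entity, glow) in glows.iter() {
        commands
            .get_or_spawn(entity)
            .insert(ExtractedGlow { strength: glow.strength });
    }
}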

Prepare

Here our goal is to write the vertex and bind group data to the UiMeta resource.
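
A hedged sketch of what such a resource and its prepare system look like, continuing the hypothetical glow example (this mirrors the shape of bevy_ui’s UiMeta and assumes the bytemuck crate for the Pod derives, as Bevy’s own renderer uses):

use bevy::prelude::*;
use bevy::render::render_resource::{BindGroup, BufferUsages, BufferVec};
use bevy::render::renderer::{RenderDevice, RenderQueue};
use bytemuck::{Pod, Zeroable};

// One vertex of our hypothetical glow quads
#[repr(C)]
#[derive(Copy, Clone, Pod, Zeroable)]
struct GlowVertex {
    position: [f32; 3],
    uv: [f32; 2],
    color: [f32; 4],
}

// CPU-side staging data, mirroring the role of UiMeta
#[derive(Resource)]
struct GlowMeta {
    vertices: BufferVec<GlowVertex>,
    view_bind_group: Option<BindGroup>,
}

impl Default for GlowMeta {
    fn default() -> Self {
        Self {
            vertices: BufferVec::new(BufferUsages::VERTEX),
            view_bind_group: None,
        }
    }
}

// Prepare: fill the vertex list and upload it to the GPU
fn prepare_glows(
    render_device: Res<RenderDevice>,
    render_queue: Res<RenderQueue>,
    mut meta: ResMut<GlowMeta>,
) {
    meta.vertices.clear();
    // ...push one GlowVertex per extracted item here...
    meta.vertices.write_buffer(&render_device, &render_queue);
}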

Queue

Here we are setting up our UiPipeline by telling the GPU how we laid out our vertex data in the previous prepare step.

We then set up the vertex and fragment shaders. Bevy also uses some caching here: it checks the handle to this pipeline to see if it has changed, and reuses the cached result otherwise.
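
The vertex layout half of that description looks roughly like this, matching the GlowVertex sketch from the prepare step (offsets assume tightly packed attributes, 12 + 8 + 16 = 36 bytes per vertex):

use bevy::render::render_resource::{
    VertexAttribute, VertexBufferLayout, VertexFormat, VertexStepMode,
};

// Tells the GPU how to interpret each 36-byte vertex in the buffer
fn glow_vertex_layout() -> VertexBufferLayout {
    VertexBufferLayout {
        array_stride: 36,
        step_mode: VertexStepMode::Vertex,
        attributes: vec![
            // position: vec3<f32> at offset 0
            VertexAttribute { format: VertexFormat::Float32x3, offset: 0, shader_location: 0 },
            // uv: vec2<f32> after the 12 bytes of position
            VertexAttribute { format: VertexFormat::Float32x2, offset: 12, shader_location: 1 },
            // color: vec4<f32> after position + uv (20 bytes)
            VertexAttribute { format: VertexFormat::Float32x4, offset: 20, shader_location: 2 },
        ],
    }
}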

Render Graph

In this step we create an acyclic render graph containing all the steps needed to render our nodes to the screen.

Render Graphs are a way to logically model GPU command construction in a modular way. Graph Nodes pass GPU resources like Textures and Buffers (and sometimes Entities) to each other, forming a directed acyclic graph.

When a Graph Node runs, it uses its graph inputs and the Render World to construct GPU command lists.

Render Graphs also support sub-graphs (basically namespaced graphs) that can be called by any node (e.g. a “2d” sub-graph and a “3d” sub-graph for different parts of the game or multiple windows).

This will also call out to wgpu to begin a render pass.
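
A node is just a type implementing the Node trait. A minimal sketch (GlowPassNode and its empty body are placeholders):

use bevy::prelude::*;
use bevy::render::render_graph::{Node, NodeRunError, RenderGraphContext};
use bevy::render::renderer::RenderContext;

struct GlowPassNode;

impl Node for GlowPassNode {
    // Called every frame with the graph inputs and the Render World
    fn run(
        &self,
        _graph: &mut RenderGraphContext,
        _render_context: &mut RenderContext,
        _world: &World,
    ) -> Result<(), NodeRunError> {
        // A real node would record GPU commands here, e.g. begin a render
        // pass through _render_context.command_encoder()
        Ok(())
    }
}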

Draw functions

Our render pass has begun and we start rendering our phase items.

For every item we added to the render phase we call draw on its DrawFunction.

Draw is actually just a trait we can implement ourselves, or we can use a RenderCommand.

A RenderCommand can itself be a tuple of RenderCommands, so a full draw function can be composed from small reusable commands.
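
For example, bevy_ui’s own draw function is composed this way (adapted from the Bevy source; exact names shift between versions):

pub type DrawUi = (
    SetItemPipeline,          // set the cached render pipeline
    SetUiViewBindGroup<0>,    // bind the view uniforms at group 0
    SetUiTextureBindGroup<1>, // bind the node's texture at group 1
    DrawUiNode,               // issue the actual draw call
);

Each element runs in order for every phase item, which is how small reusable commands add up to a complete draw.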

Textures

A texture is a two-dimensional image used to add detail, color, and pattern to the surface of a 3D model. Textures are typically created in image editing software such as Photoshop or GIMP.

They can be simple images, such as a picture of wood grain, or complex maps that define different aspects of the material, such as a diffuse map, normal map, specular map, etc.

Textures are often stored as files in formats like PNG or JPEG.

A TextureAtlas packs many sprites into a single texture and is used for tilemaps or sprite sheets, giving us a way to navigate them by index:

let texture_handle = asset_server.load("textures/rpg/chars/gabe/gabe-idle-run.png");
let texture_atlas =
    TextureAtlas::from_grid(texture_handle, Vec2::new(24.0, 24.0), 7, 1, None, None);
let texture_atlas_handle = texture_atlases.add(texture_atlas);
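
We can then spawn a sprite that indexes into the atlas; SpriteSheetBundle is the atlas-aware sibling of SpriteBundle (a fragment continuing the snippet above):

commands.spawn(SpriteSheetBundle {
    texture_atlas: texture_atlas_handle,
    // Draw the first of the 7 frames in the grid
    sprite: TextureAtlasSprite::new(0),
    ..default()
});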

Materials

A Material defines how light interacts with the surface of an object. It determines the visual properties, such as color, reflectivity, shininess, transparency, and more.

Your textures are applied to a material. You can think of a texture as the visual details of what an object should look like, while the material contains the rules for how those details should appear in a given environment.

In Bevy, materials are defined using shaders, which are programs that run on the GPU and calculate how light interacts with the geometry of an object.

Physically based rendering (PBR) uses a series of properties to mimic real-life materials:

  • Color
  • Metallic
  • Roughness
  • Reflectance
  • Clear coat
  • Clear coat roughness
  • Anisotropy (directional highlights, as on brushed metal)
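
Most of these map directly onto fields of StandardMaterial, Bevy’s built-in PBR material. A fragment with arbitrary example values, assuming a materials: ResMut<Assets<StandardMaterial>> system parameter:

let material_handle = materials.add(StandardMaterial {
    base_color: Color::rgb(0.8, 0.7, 0.6),
    metallic: 0.2,
    // Bevy exposes roughness as perceptual_roughness
    perceptual_roughness: 0.6,
    reflectance: 0.5,
    ..default()
});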

Meshes

When rendering a game object, the mesh provides the underlying geometry on which textures and materials are applied.

The mesh’s vertices store position information, which determines the shape and structure of the object. Edges connect the vertices, and faces define the polygons that form the visible surface of the object.

Textures are often mapped onto the mesh using UV coordinates, which define how the texture image is wrapped around the geometry. UV coordinates assign specific points on the mesh’s surface to corresponding pixels in the texture image.

The material is responsible for determining how light is reflected or absorbed by different parts of the mesh, giving it a specific visual appearance.

In Bevy we are provided with some built-in meshes that represent common shapes, each convertible into a mesh asset as shown after this list:

  • Cube
  • Box
  • Quad
  • Plane
  • Capsule
  • Cylinder
  • Icosphere, a sphere made from a subdivided icosahedron
  • RegularPolygon
  • Torus
  • UVSphere, a sphere made of sectors and stacks
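
A fragment assuming a meshes: ResMut<Assets<Mesh>> system parameter (note the Icosphere conversion is fallible because its subdivisions are capped):

let cube = meshes.add(Mesh::from(shape::Cube { size: 1.0 }));
let sphere = meshes.add(
    Mesh::try_from(shape::Icosphere { radius: 0.5, subdivisions: 4 }).unwrap(),
);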

Rendering entities

We can actually render our entities on the screen by giving them a Material (which contains a Texture) and a Mesh:

use bevy::{
    prelude::*,
    sprite::MaterialMesh2dBundle
};

fn main() {
    App::new()
        .add_plugins(DefaultPlugins)
        .add_systems(Startup, setup)
        .run();
}

fn setup(
    mut commands: Commands,
    mut meshes: ResMut<Assets<Mesh>>,
    mut materials: ResMut<Assets<ColorMaterial>>,
    asset_server: Res<AssetServer>
) {
    // Spawn our viewport so we can see things
    commands.spawn(Camera2dBundle::default());

    // Circle mesh
    commands.spawn(MaterialMesh2dBundle {
        mesh: meshes.add(Mesh::from(shape::Circle::new(50.))).into(),
        material: materials.add(ColorMaterial::from(Color::PURPLE)),
        transform: Transform::from_xyz(-150., 0., 0.),
        ..default()
    });

    // Sprite
    commands.spawn(SpriteBundle {
        texture: asset_server.load("enemy.png"),
        transform: Transform::from_translation(Vec3::new(-50., 0., 0.)),
        ..default()
    });
}

The calls to into() let the compiler convert the plain Handle<Mesh> returned by meshes.add into the Mesh2dHandle the bundle expects, without doing it by hand.

When we called materials.add or meshes.add we inserted an Asset into our Assets collection and got back a handle. This also emits an AssetEvent::Created which we can read in other systems if we want to.
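
For example, a system that logs whenever a new mesh asset is created (the AssetEvent variants were renamed in later Bevy versions):

use bevy::prelude::*;

fn on_mesh_created(mut events: EventReader<AssetEvent<Mesh>>) {
    for event in events.iter() {
        if let AssetEvent::Created { handle } = event {
            info!("Created mesh asset: {:?}", handle);
        }
    }
}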

Text

We can either render the text as part of the scene using Text2dBundle:

let font = asset_server.load("fonts/FiraSans-Bold.ttf");
let text_style = TextStyle {
    font: font.clone(),
    font_size: 60.0,
    color: Color::WHITE,
};
let text_alignment = TextAlignment::Center;
// 2d camera
commands.spawn(Camera2dBundle::default());
// Text rendered in world space at the default translation
commands.spawn(Text2dBundle {
    text: Text::from_section("translation", text_style.clone())
        .with_alignment(text_alignment),
    ..default()
});

Or render it as part of the UI with TextBundle:

commands.spawn((
    // Create a TextBundle that has a Text with a single section.
    TextBundle::from_section(
        // Accepts a `String` or any type that converts into a `String`, such as `&str`
        "hello\nbevy!",
        TextStyle {
            font: asset_server.load("fonts/FiraSans-Bold.ttf"),
            font_size: 100.0,
            color: Color::WHITE,
        },
    ) // Set the alignment of the Text
    .with_text_alignment(TextAlignment::Center)
    // Set the style of the TextBundle itself.
    .with_style(Style {
        position_type: PositionType::Absolute,
        bottom: Val::Px(5.0),
        right: Val::Px(15.0),
        ..default()
    }),
    // A marker component we would define ourselves to identify this text
    ColorText,
));

Lighting

Common bundles:

  • PointLightBundle, a light that emits in all directions from a central point
  • SpotLightBundle, a light that emits in a given direction from a central point
  • DirectionalLightBundle, a directional light from very far away (like the sun)
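
Spawning one looks like the other bundles we have seen; a typical point light setup:

use bevy::prelude::*;

fn spawn_light(mut commands: Commands) {
    commands.spawn(PointLightBundle {
        point_light: PointLight {
            intensity: 1500.0,
            shadows_enabled: true,
            ..default()
        },
        // Place the light above and to the side of the scene
        transform: Transform::from_xyz(4.0, 8.0, 4.0),
        ..default()
    });
}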
