(MTL S01E09) Rendering pt.3: Shaders
Today, we’ll dive into vertex and fragment shaders. Previously, we explored the Metal Shading Language, the rendering pipeline and its setup, so now our focus shifts specifically to shaders. The examples will remain simple and concise, as different tasks often require unique solutions. To isolate shader concepts, I’ll be using the KodeLife shader editor, which lets us experiment with shaders independently from other pipeline stages.
Rendering pipeline again
Now, we can set aside the details of the entire rendering pipeline and concentrate solely on these three steps:
- The vertex shader runs once for every vertex of the submitted geometry, calculating its projected position on screen (along with any other per-vertex outputs).
- Rasterization then generates fragments (potential pixels carrying interpolated vertex parameters) based on the chosen primitives and the calculated vertex positions.
- The fragment shader runs for each visible fragment, calculating color, depth, or stencil values for the output texture(s).
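To build some intuition for the rasterization step, here is a rough CPU-side sketch in plain C++ (not Metal code). For a triangle, a fragment's interpolated attribute is a barycentric blend of the three vertex values; the helper name `interpolateAttribute` is mine, not an API:

```cpp
#include <cassert>
#include <cmath>

// A fragment's attribute is a weighted blend of the three vertex
// values; the barycentric weights w0, w1, w2 sum to 1.
float interpolateAttribute(float v0, float v1, float v2,
                           float w0, float w1, float w2) {
    return v0 * w0 + v1 * w1 + v2 * w2;
}
```

A fragment at the triangle's centroid (all weights 1/3) receives the average of the three vertex values; a fragment sitting exactly on a vertex receives that vertex's value unchanged. The GPU does this per attribute, per fragment, for positions, normals, texture coordinates, and anything else the vertex shader outputs.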
KodeLife brief intro
Building the entire Metal infrastructure just for shader development can feel redundant, so let’s use a shader-editing tool instead:
- The editor area, displaying your shader code with the rendered result in the background.
- Tabs for different passes (yes, multipass effects are possible) and shader types (for OpenGL, there are more than two types available per pass).
- Project settings, including output resolution, shading language, and various common parameters.
- Pass parameters: render/compute modes, primitive types, model (primitive), instance count, projection, clear colors, etc.
- Shader parameters (you can also “watch” a file to use an external editor).
Preparations
This setup is sufficient for our current purpose. Now, let’s get prepared:
- Create a default project.
- In the `Renderer` section (3), select `Metal`; you should see something like the screen above (1).
- Set the `primitive` to `Sphere`, the type to `LINE_STRIP`, and the instance count to, say, `5`.
- Finally, set `Projection` to `Perspective`.
Review “default” shaders
Let’s begin with the “default” shader provided by KodeLife. While a real-world shader might look quite different, this starting point is effective and works well for most simple tasks.
NOTE: I’ve made some slight adjustments to the original KodeLife naming.
Vertex shader
#include <metal_stdlib> // (1)
using namespace metal;  // (2)

struct VertexInput { // (3)
    float4 position [[attribute(0)]];
    float3 normal   [[attribute(1)]];
    float2 texcoord [[attribute(2)]];
};

struct VertexOutput { // (4)
    float4 position [[position]];
    float3 normal;
    float2 texCoord;
};

struct Parameters { // (5)
    float time;
    float2 resolution;
    float2 mouse;
    float3 spectrum;
    float4x4 mvp;
};

vertex VertexOutput vs_main( // (6)
    VertexInput input [[stage_in]],                // (7)
    constant Parameters& uniform [[buffer(16)]]) { // (8)
    VertexOutput out;
    out.position = uniform.mvp * input.position;   // (9)
    out.normal = input.normal;
    out.texCoord = input.texcoord;
    return out;
}
- Including the standard Metal library. This header includes other necessary libraries, so you only need this one.
- Using Metal’s namespace. Remember, Metal Shading Language is a subset of C++14. If you prefer using `metal::` explicitly, you can skip this step.
- Defining a structure for vertex attributes. As discussed in the previous episode, you can either access buffers directly or map them to a structure. Here’s what they look like on the shader side.
- Creating a structure for the shader’s output parameters (the actual vertex shader output). It must include `[[position]]`, representing the vertex’s position in the viewport. Without this, rendering won’t work, as there would be no displayable position.
- Defining a structure for uniform parameters — common parameters for each vertex. KodeLife provides `time`, output `resolution`, `mouse` position and status, `spectrum` (just low, mid, high), and `mvp` — the Model, View, Projection matrix. This is actually `Projection * View * Model`, used to calculate vertices’ final positions in the viewport.
- Defining the `vertex` function. It returns `VertexOutput`; after rasterization, the same fields arrive (interpolated) at the fragment shader as its `[[stage_in]]` input. The function name can vary, but KodeLife uses `vs_main`.
- Specifying the input vertex structure. For each call (one per vertex), Metal maps values to `VertexInput` attributes from the corresponding buffers. For more details, refer to the previous episode.
- Setting up uniform parameters (common for all calls). This could include buffers, textures, etc. If we pass a buffer, it remains the same across calls, allowing access to different items via instance and/or vertex indices (explained further below).
- Calculating and returning output vertex parameters, which are then passed to the rasterization stage.
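The order in `mvp` matters: `mvp * position` applies Model first, then View, then Projection. Below is a rough plain-C++ sketch (not shader or Metal API code) using column-major matrices like MSL's `float4x4`; the helpers `mul`, `identity`, `translate`, and `scale` are my own illustrative names. It shows that pre-combining the matrices gives the same result as applying them one at a time:

```cpp
#include <array>
#include <cassert>
#include <cmath>

using Vec4 = std::array<float, 4>;
using Mat4 = std::array<Vec4, 4>; // four columns, like MSL's float4x4

// Matrix * vector for column-major storage: sum each column
// scaled by the matching vector component.
Vec4 mul(const Mat4& m, const Vec4& v) {
    Vec4 r{0.0f, 0.0f, 0.0f, 0.0f};
    for (int c = 0; c < 4; ++c)
        for (int row = 0; row < 4; ++row)
            r[row] += m[c][row] * v[c];
    return r;
}

// Matrix * matrix: column c of (A * B) is A * (column c of B).
Mat4 mul(const Mat4& a, const Mat4& b) {
    Mat4 r{};
    for (int c = 0; c < 4; ++c)
        r[c] = mul(a, b[c]);
    return r;
}

Mat4 identity() {
    Mat4 m{};
    for (int i = 0; i < 4; ++i)
        m[i][i] = 1.0f;
    return m;
}

// Translation lives in the last column of a column-major matrix.
Mat4 translate(float x, float y, float z) {
    Mat4 m = identity();
    m[3] = {x, y, z, 1.0f};
    return m;
}

Mat4 scale(float s) {
    Mat4 m = identity();
    for (int i = 0; i < 3; ++i)
        m[i][i] = s;
    return m;
}
```

With a model matrix that translates and a stand-in "view" that scales, `mul(view, model)` applied to a point equals applying `model` first and `view` second, which is exactly why the combined matrix is written `Projection * View * Model`.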
Fragment shader
#include <metal_stdlib>
using namespace metal;

struct FragmentInput { // (1)
    float3 normal;
    float2 texCoord;
};

struct Parameters { // (2)
    float time;
    float2 resolution;
    float2 mouse;
    float3 spectrum;
};

fragment float4 fs_main( // (3)
    FragmentInput In [[stage_in]], // (4)
    constant Parameters& params [[buffer(16)]]) {
    float2 uv = -1. + 2. * In.texCoord;
    float4 col = float4(
        abs(sin(cos(params.time + 3. * uv.y) * 2. * uv.x + params.time)),
        abs(cos(sin(params.time + 2. * uv.x) * 3. * uv.y + params.time)),
        params.spectrum.x * 100.,
        1.0);
    return col; // (5)
}
- Defining a structure for the input fragment data. The `[[position]]` parameter is omitted here, but you can include it if needed.
- Creating a structure for common parameters.
- Defining the `fragment` function. It returns a `float4` since there’s only one color attachment. If needed, you can define an output structure with multiple members annotated `[[color(0)]]`, `[[color(1)]]`, `[[depth(any)]]`, etc.
- Specifying input fragment data, which comes from rasterization with interpolated vertex parameters.
- Returning a calculated color value. In this example, it builds a gradient based on texture coordinates and the spectrum value, adding a slight animation.
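The first line of the shader body is worth spelling out: `uv = -1. + 2. * In.texCoord` remaps each texture coordinate from [0, 1] to [-1, 1], centering the gradient around the middle of the surface. A one-line plain-C++ sketch (the name `remap` is mine):

```cpp
#include <cassert>
#include <cmath>

// Maps a [0, 1] texture coordinate component to [-1, 1],
// as `uv = -1. + 2. * In.texCoord` does per component.
float remap(float t) {
    return -1.0f + 2.0f * t;
}
```

So the center of the coordinate range maps to 0, and the edges to -1 and 1, which is why the sine/cosine terms produce a pattern symmetric about the middle.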
Improvements
Having five spheres in one spot is obviously quite dull. Let’s enhance the composition by moving them around the viewport and adding some animation.
Vertex shader
Since we render five instances, there’s no need for additional models; we can reuse the original geometry and distinguish the copies by instance index. Let’s add a few lines to our function to make the composition more dynamic:
vertex VertexOutput vs_main(
    unsigned int vid [[vertex_id]],   // (1)
    unsigned int iid [[instance_id]], // (2)
    VertexInput input [[stage_in]],
    constant Parameters& params [[buffer(16)]]
) {
    VertexOutput out;

    float4 position = input.position; // (3)

    float wave = 1.0 + sin(vid * 0.01 + params.time + iid) * 0.1;
    position.xyz *= wave; // (4)

    float a = iid;
    position = float4x4(
        float4( cos(a), sin(a), 0, 0),
        float4(-sin(a), cos(a), 0, 0),
        float4(      0,      0, 1, 0),
        float4(      0,      0, 0, 1)
    ) * position; // (5)

    float3 offset = float3(iid - 3.0, iid % 3 - 1.0, iid * 0.4);
    position.xyz += offset; // (6)

    out.position = params.mvp * position; // (7)
    out.normal = input.normal;
    out.texCoord = float2(iid, vid); // (8)
    return out;
}
- Vertex indices: Even when using vertex attributes, vertex indices remain accessible. Within each instance they start again from the first vertex rather than incrementing continuously across instances. This is often useful for calculating vertex parameters programmatically or for directly accessing vertex attributes in buffers.
- Instance indices: These can be used for handling instances differently — for example, by accessing separate transformation data for each instance from a buffer using this index.
- We define a variable for position to modify it, rather than passing the input value directly.
- Adding a wave effect to the sphere’s surface. Here, we could base the effect on only the `z`-coordinate, but using vertex indices can be helpful, depending on the context.
- Applying rotation around the z-axis using the instance index. Ideally, we’d also rotate normals and recalculate, but for the sake of simplicity, we’ll keep it as is.
- Adding an offset for each instance. Having multiple instances in the same position is usually unnecessary.
- Projecting the calculated positions to the viewport.
- Passing instance and vertex indices to the fragment in `texCoord`. Ideally, this field should be renamed.
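One detail in step (5): MSL's `float4x4` constructor takes columns, so the matrix above is the standard column-major z-axis rotation with columns `(cos a, sin a, 0, 0)` and `(-sin a, cos a, 0, 0)`. A plain-C++ sketch of the same rotation, reduced to 2D (the `Vec2`/`rotateZ` names are mine):

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float x, y; };

// Rotates a point around the z-axis by angle `a` (radians), matching
// the shader's matrix columns (cos a, sin a) and (-sin a, cos a).
Vec2 rotateZ(Vec2 p, float a) {
    return { std::cos(a) * p.x - std::sin(a) * p.y,
             std::sin(a) * p.x + std::cos(a) * p.y };
}
```

Rotating the point (1, 0) by 90° yields (0, 1), i.e. a counter-clockwise rotation; each instance gets a rotation of `iid` radians, so the five spheres end up at different orientations.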
Fragment shader
It’s still a bit dull, even though the shapes have become more interesting. Now, let’s focus on improving the coloring:
struct FragmentInput {
    float4 position [[position]]; // (1)
    float3 normal;
    float2 texCoord;
};

fragment float4 fs_main(
    FragmentInput In [[stage_in]],
    constant Parameters& params [[buffer(16)]]) {
    // (2)
    float wave = sin(5 * fract(In.texCoord.y * 0.02 + params.time + In.texCoord.x));
    wave = 1.0 - smoothstep(0.0, 0.3, wave);

    // (3)
    float3 lightPos = float3(1000.0, 100.0, 1.0);
    float3 lightDir = normalize(lightPos - In.position.xyz);
    float lightness = clamp(dot(lightDir, In.normal), 0.0, 1.0);
    lightness = mix(0.2, 2.0, lightness);

    // (4)
    return float4(In.texCoord.x * 0.2, wave, In.texCoord.y * 0.0005, 1.0) * lightness;
}
- Although `[[position]]` isn’t required in the fragment input, it can be useful for certain tasks; in the fragment stage it holds the fragment’s window coordinates. In this example, we use it to calculate the lightness value.
- Adding a color wave effect (similar to the surface wave in the vertex shader). Since the sphere’s vertices follow a spiral pattern, driving the wave with the vertex index makes that spiral structure visible.
- Implementing a simple lightness model by checking the relative orientation of the surface’s normal in the fragment against the light direction. We use `mix` to set the minimum and maximum lightness values.
- Constructing the final color based on vertex and instance indices (remember, we stored them in `texCoord`), the spiral wave, and the calculated lightness.
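For reference, `smoothstep` and `mix` are standard `metal_stdlib` functions. Their usual definitions, plus the shader's lightness remap, can be sketched in plain C++ (the `*Ref` and `lightness` names are mine):

```cpp
#include <cassert>
#include <cmath>

// Hermite step between edges e0 and e1, as in smoothstep().
float smoothstepRef(float e0, float e1, float x) {
    float t = (x - e0) / (e1 - e0);
    t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
    return t * t * (3.0f - 2.0f * t);
}

// Linear blend between a and b, as in mix().
float mixRef(float a, float b, float t) {
    return a + (b - a) * t;
}

// The shader's remap: clamp the Lambert term to [0, 1], then
// mix(0.2, 2.0, t), so back-facing fragments keep some ambient light.
float lightness(float nDotL) {
    float t = nDotL < 0.0f ? 0.0f : (nDotL > 1.0f ? 1.0f : nDotL);
    return mixRef(0.2f, 2.0f, t);
}
```

A surface facing away from the light (`nDotL <= 0`) still gets a lightness of 0.2 instead of going fully black, while a surface facing the light directly is boosted to 2.0, which is what `mix(0.2, 2.0, lightness)` achieves in the shader.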
Conclusion
- We explored vertex and fragment shaders, experimenting with their functionalities.
- The vertex shader, based on input vertex attributes, calculates positions and other parameters.
- The fragment shader is applied to each fragment, calculating output values for attachments.
- Using instance and vertex indices allows us to access buffer elements and create dynamic, unique effects with minimal geometry.
- Tools like KodeLife simplify shader experimentation, allowing us to focus on core concepts without needing a full pipeline setup.