3D Fundamentals Tutorial 9
In this tutorial we incorporate our first shader stage into the 3D pipeline: the pixel shader stage. We also explore some basic example pixel shaders (more heavy-duty stuff to come later).
Video
The tutorial video is on YouTube here.
- What is a Pixel Shader and why is it used? 0:20
- We want to make color mapping of the triangles in the rendering pipeline configurable
- One way is to use templated function objects (functors) as "plug-in" code
- These functors determine the color of the pixel based on their input
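As a rough, self-contained illustration of the functor idea (this is not the tutorial's code; the simplified Color and Vertex structs and the InvertColor name below are stand-ins, not the framework's types):

#include <cstdint>

// simplified stand-ins for the framework's Color and Vertex types
struct Color { std::uint8_t r,g,b; };
struct Vertex { float x,y,z; Color c; };

// a pixel shader functor: a callable object the pipeline can invoke per pixel
// to turn an interpolated vertex into a final color
struct InvertColor
{
	Color operator()( const Vertex& v ) const
	{
		// example effect: return the inverted vertex color
		return Color{ std::uint8_t( 255 - v.c.r ),
		              std::uint8_t( 255 - v.c.g ),
		              std::uint8_t( 255 - v.c.b ) };
	}
};

A pipeline holding such an object can call it exactly like a function on the interpolated vertex data.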
- Refactoring of the gfx.PutPixel(...) function in Pipeline.h 2:04
- This is where we are ultimately determining the color of the pixel
- Inside the PutPixel function, we want to call a shader object that defines the coloring behavior
- Implementation of the programmable Pixel Shader in the rendering pipeline 3:11
- We template the pipeline class on an <class Effect>
- We will adjust the definition of the Vertex class depending on what effect we are using
- We adjust gfx.PutPixel(...) to take a pixel shader object, and call the function operator with the interpolated Vertex data: gfx.PutPixel( x,y,effect.ps( iLine ) );
- Changes to the Pipeline class 5:37
- - The Pipeline object now holds an Effect object (which holds all shader data and methods)
- - Coding the Effect class 6:03
- - Coding the PixelShader class 6:22
- Putting it all together in a new CubeSkinScene.h class 7:26
- - The code now renders the same scene, but has become completely configurable (see the sketch below)
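A minimal sketch of the resulting shape (the stand-in Graphics/Color types and the SomeEffect and PutPixelShaded names are assumptions; the real Pipeline also does transformation, clipping and rasterization):

// stand-ins for framework types
struct Color { unsigned char r,g,b; };
struct Graphics
{
	void PutPixel( int x,int y,Color c ) { /* write c into the screen buffer at (x,y) */ }
};

// an example effect: bundles the vertex layout with the pixel shader that consumes it
struct SomeEffect
{
	struct Vertex { float x,y,z; Color c; };
	struct PixelShader
	{
		Color operator()( const Vertex& v ) const { return v.c; }
	};
	PixelShader ps;
};

// the pipeline is templated on the effect, so the shader call can be fully inlined
template<class Effect>
class Pipeline
{
public:
	typedef typename Effect::Vertex Vertex; // vertex type is dictated by the effect
	Pipeline( Graphics& gfx ) : gfx( gfx ) {}
	Effect effect; // holds all shader data and methods
	// called for every rasterized pixel with the interpolated vertex iLine
	void PutPixelShaded( int x,int y,const Vertex& iLine )
	{
		gfx.PutPixel( x,y,effect.ps( iLine ) ); // the pixel shader decides the color
	}
private:
	Graphics& gfx;
};

// usage: Pipeline<SomeEffect> pipeline( gfx );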
- Making a new Effect: Color Blending 8:31
- The VertexColorEffect functor does not hold a texture or return interpolated texture coordinates
- Instead, it interpolates colors, operating on a Vec3 object that holds RGB values (we encode the Vertex colors as floats; see the sketch below)
- Cube definition now needs to hold color data for each Vertex 10:07
- We need a conversion operator and a conversion constructor to translate between the Vec3 and Color representations of colors 11:12
- Changes to the Pipeline class: generalizing Vertex transformations (independent of the pixel shader effect) 11:45
- Adding a static function GetPlain() in Cube.h to get the Vertex colors 13:39
template<class V> static IndexedTriangleList<V> GetPlain(float size = 1.0f) {...}
- Coding the scene class CubeVertexColorScene.h 14:13
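A rough sketch of the color blending effect (the VertexColorEffect name is from the tutorial, but the member layout is simplified, the Vec3/Color structs are stand-ins, and the tutorial performs the float-to-Color conversion with a conversion operator/constructor rather than inline as done here):

// stand-ins for the framework's Vec3 and Color
struct Vec3 { float x,y,z; };
struct Color { unsigned char r,g,b; };

// effect whose vertices carry an RGB color as floats so it can be interpolated
// smoothly across the triangle before being converted back to an 8-bit Color
struct VertexColorEffect
{
	struct Vertex
	{
		Vec3 pos;
		Vec3 color; // r,g,b stored in x,y,z as floats (0.0f - 255.0f)
	};
	struct PixelShader
	{
		Color operator()( const Vertex& v ) const
		{
			// convert the interpolated float color back to bytes
			return Color{ (unsigned char)v.color.x,
			              (unsigned char)v.color.y,
			              (unsigned char)v.color.z };
		}
	};
	PixelShader ps;
};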
- Making a new Effect: Solid Colors 14:47
- Need to address an issue: we can't store per-face color data in the 8 unique cube vertices (each vertex is shared by three faces)
- Solution: make the faces of the cube independent (no shared vertices) 16:22
- - This requires 6 faces x 4 vertices per face = 24 vertices
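A hypothetical sketch of the independent-face idea (the helper name and vertex layout are mine, indices and winding order are omitted, and the tutorial's Cube.h is organized differently): 24 vertices are generated, 4 per face, so every face can carry its own solid color.

#include <array>

// stand-ins for the framework types
struct Vec3 { float x,y,z; };
struct Color { unsigned char r,g,b; };
struct Vertex { Vec3 pos; Color c; };

// build 6 faces x 4 corners = 24 vertices, duplicating corner positions so that
// per-face data (a solid color) never has to be shared with a neighboring face
std::array<Vertex,24> MakeSolidColorCubeVertices( float size,const std::array<Color,6>& faceColors )
{
	const float s = size / 2.0f;
	const float u[4] = { -s, s, s,-s }; // corner offsets within a face
	const float v[4] = { -s,-s, s, s };
	std::array<Vertex,24> verts{};
	for( int face = 0; face < 6; face++ )
	{
		for( int i = 0; i < 4; i++ )
		{
			Vec3 p{};
			switch( face )
			{
			case 0: p = { -s,u[i],v[i] }; break; // -x face
			case 1: p = {  s,u[i],v[i] }; break; // +x face
			case 2: p = { u[i],-s,v[i] }; break; // -y face
			case 3: p = { u[i], s,v[i] }; break; // +y face
			case 4: p = { u[i],v[i],-s }; break; // -z face
			case 5: p = { u[i],v[i], s }; break; // +z face
			}
			verts[face * 4 + i] = Vertex{ p,faceColors[face] };
		}
	}
	return verts;
}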
- Reflections on our approach to Pixel Shading 19:58
- With our approach, the sky is the limit in terms of adding options/effects to the Pixel Shader
- However, because the whole pipeline is templated, each effect requires its own Pipeline instantiation
- - Advantage of using templates: enables aggressive inlining / compiler optimization
- - Disadvantage: we can't switch effects easily at runtime
- There are differences between this approach and how Hardware 3D APIs (like Direct3D and OpenGL) implement this flexibility
- - These let you switch components (e.g. by binding a different shader dynamically)
- - Another difference: use of texturing units separate from the shader object 21:14
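To illustrate the contrast being drawn (a hypothetical alternative, not code from the tutorial or from any hardware API): a pipeline that accepts its shader through a virtual interface can re-bind shaders at runtime, but pays for an indirect call per pixel that the compiler cannot inline the way it can inline the templated effect call.

#include <memory>
#include <utility>

struct Color { unsigned char r,g,b; };
struct Vertex { float x,y,z; Color c; };

// runtime-polymorphic shader interface: any shader can be bound while the program runs
struct IPixelShader
{
	virtual Color Shade( const Vertex& v ) const = 0;
	virtual ~IPixelShader() = default;
};

struct PassthroughShader : IPixelShader
{
	Color Shade( const Vertex& v ) const override { return v.c; }
};

struct RuntimePipeline
{
	// every shaded pixel now goes through a virtual call instead of an inlined functor
	std::unique_ptr<IPixelShader> ps;
	void BindShader( std::unique_ptr<IPixelShader> shader ) { ps = std::move( shader ); }
};

// usage: pipeline.BindShader( std::make_unique<PassthroughShader>() );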
Downloads
The GitHub repository for the tutorial code is here.