-
In the past two exhibits, we rendered 2D scenes to a texture.
WebGL is primarily used for 3D rendering, and we can certainly render 3D scenes to a texture.
However, to render a 3D scene, we typically need a depth buffer,
and the depth buffer also needs to be attached to the FBO.
In this exhibit, we'll show you how.
-
Instead of using a texture as a depth buffer, we'll use a render buffer.
-
A render buffer is similar to a texture, but it supports fewer operations.
- You cannot use WebGL commands to read anything from a render buffer.
- You cannot access a render buffer in a shader.
- They only serve as targets for rendered images and must be used with an FBO.
- However, they are less of a hassle to maintain than a texture.
-
A render buffer is well suited for use as a depth buffer if you don't intend to use
the depth buffer for anything other than depth testing.
- There are other uses such as shadow mapping, but these are not covered in this course.
-
Here's how to create a render buffer to use as a depth buffer.
// Step 1: Create the render buffer.
var depthBuffer = gl.createRenderbuffer();
// Step 2: Bind the render buffer.
gl.bindRenderbuffer(gl.RENDERBUFFER, depthBuffer);
// Step 3: Allocate the render buffer.
gl.renderbufferStorage(
    // First argument is always gl.RENDERBUFFER.
    gl.RENDERBUFFER,
    // Second argument indicates the internal pixel format.
    // For a depth buffer in WebGL 1, this is gl.DEPTH_COMPONENT16.
    gl.DEPTH_COMPONENT16,
    // Third argument is the width of the buffer.
    512,
    // Fourth argument is the height of the buffer.
    512
);
// Step 4: Unbind the render buffer.
gl.bindRenderbuffer(gl.RENDERBUFFER, null);
-
Notice that when creating a render buffer, we do not have to specify many of the
parameters that we specify with a texture. These include:
- Minification filter.
- Magnification filter.
- Wrapping modes.
-
To attach a render buffer as a depth buffer to an FBO, use the
gl.framebufferRenderbuffer command and the gl.DEPTH_ATTACHMENT slot.
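As a concrete sketch, the attachment might look as follows. The helper name
attachDepthBuffer is ours, and we assume the FBO and the render buffer were
created as shown above; the optional completeness check is not required, but
it is a useful sanity test.

```javascript
// Hypothetical helper (the name attachDepthBuffer is ours): attach a render
// buffer to the FBO's depth attachment slot, then check FBO completeness.
function attachDepthBuffer(gl, fbo, depthBuffer) {
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferRenderbuffer(
        gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, depthBuffer);
    // Optional sanity check: the FBO should be complete once all of its
    // attachments (color and depth) are in place.
    var complete =
        gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    return complete;
}
```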
-
Here's a convenience function that facilitates rendering to a double buffer with an optional
render buffer to act as the depth buffer.
function drawToBufferAndSwap(gl, fbo, buffer, depthBuffer, drawFunc) {
    // Bind the FBO.
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    // Attach the write buffer to the zeroth color attachment slot.
    gl.framebufferTexture2D(
        gl.FRAMEBUFFER,
        gl.COLOR_ATTACHMENT0,
        gl.TEXTURE_2D,
        buffer.getWriteBuffer(),
        0);
    // Attach the depth buffer to the depth attachment slot.
    if (depthBuffer != null) {
        gl.framebufferRenderbuffer(
            // First argument is always gl.FRAMEBUFFER.
            gl.FRAMEBUFFER,
            // Second argument indicates the attachment slot.
            // Since we want the buffer to serve as the depth buffer,
            // we use the gl.DEPTH_ATTACHMENT slot.
            gl.DEPTH_ATTACHMENT,
            // We are giving a render buffer instead of a texture, so we use gl.RENDERBUFFER.
            gl.RENDERBUFFER,
            // Lastly, the render buffer itself.
            depthBuffer);
    }
    // Let the user's code do its magic.
    drawFunc();
    // Detach the color buffer.
    gl.framebufferTexture2D(
        gl.FRAMEBUFFER,
        gl.COLOR_ATTACHMENT0,
        gl.TEXTURE_2D,
        null,
        0);
    // Detach the depth buffer.
    if (depthBuffer != null) {
        gl.framebufferRenderbuffer(
            gl.FRAMEBUFFER,
            gl.DEPTH_ATTACHMENT,
            gl.RENDERBUFFER,
            null);
    }
    // Unbind the FBO.
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
    // Swap the buffers.
    buffer.swap();
}
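The function above assumes a double-buffer object with getWriteBuffer,
getReadBuffer, and swap methods. A minimal sketch of such an object could look
like this; the name DoubleBuffer and the idea of storing the two textures in an
array are our assumptions, not the course's implementation.

```javascript
// Minimal sketch of a double buffer (names are ours): it holds two textures
// and swaps their roles after each rendering pass.
function DoubleBuffer(textureA, textureB) {
    this.textures = [textureA, textureB];
    this.readIndex = 0;
}

// The read buffer holds the result of the last completed pass.
DoubleBuffer.prototype.getReadBuffer = function () {
    return this.textures[this.readIndex];
};

// The write buffer is the texture we are about to render into.
DoubleBuffer.prototype.getWriteBuffer = function () {
    return this.textures[1 - this.readIndex];
};

// After a pass finishes, the write buffer becomes the new read buffer.
DoubleBuffer.prototype.swap = function () {
    this.readIndex = 1 - this.readIndex;
};
```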
-
Here is the overall structure of our WebGL program:
// Draw the scene to the frame buffer.
drawToBufferAndSwap(gl, fbo, renderBuffer, depthBuffer, function() {
    gl.clearColor(0.75, 0.75, 0.75, 1.0);
    gl.clearDepth(1.0);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    gl.enable(gl.DEPTH_TEST);
    // Code elided for brevity.
    gl.disable(gl.DEPTH_TEST);
    gl.flush();
});
// Copy pixels from the render buffer to the image buffer.
{
    drawToBufferAndSwap(gl, fbo, imageBuffer, null, function() {
        gl.useProgram(textureCopyProgram);
        // Code elided for brevity.
        drawFullScreenQuad(gl, textureCopyProgram);
        gl.flush();
    });
}
if ($("#blurXCheckBox").is(":checked")) {
    drawToBufferAndSwap(gl, fbo, imageBuffer, null, function() {
        gl.useProgram(blurXProgram);
        // Code elided for brevity.
        drawFullScreenQuad(gl, blurXProgram);
        gl.useProgram(null);
        gl.flush();
    });
}
if ($("#blurYCheckBox").is(":checked")) {
    drawToBufferAndSwap(gl, fbo, imageBuffer, null, function() {
        gl.useProgram(blurYProgram);
        // Code elided for brevity.
        drawFullScreenQuad(gl, blurYProgram);
        gl.useProgram(null);
        gl.flush();
    });
}
if ($("#srgbCheckBox").is(":checked")) {
    drawToBufferAndSwap(gl, fbo, imageBuffer, null, function() {
        gl.useProgram(srgbProgram);
        // Code elided for brevity.
        drawFullScreenQuad(gl, srgbProgram);
        gl.useProgram(null);
        gl.flush();
    });
}
// Copy pixels from the image buffer to the screen.
{
    gl.useProgram(textureCopyProgram);
    if (textureCopyProgram.texture != null) {
        gl.activeTexture(gl.TEXTURE0);
        gl.bindTexture(gl.TEXTURE_2D, imageBuffer.getReadBuffer());
        gl.uniform1i(textureCopyProgram.texture, 0);
    }
    drawFullScreenQuad(gl, textureCopyProgram);
}
-
One of the cool things we did here is use the rendered image as a texture
in the scene.
This allows us to implement effects such as a video camera taping a flat screen
that is showing the camera's own footage.
Doing so is not hard at all: render to the write buffer, and use the read buffer
as a texture.
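As a sketch of this feedback trick (the function name drawSceneWithFeedback and
the drawScene callback are ours): inside the draw function passed to
drawToBufferAndSwap, the read buffer still holds the previous frame, so it can
safely be bound as a texture while the write buffer receives the current frame.

```javascript
// Hypothetical draw-function body (names are ours). While drawToBufferAndSwap
// renders into the write buffer, the read buffer still holds the previous
// frame, so it is safe to sample it as a texture.
function drawSceneWithFeedback(gl, sceneBuffer, drawScene) {
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, sceneBuffer.getReadBuffer());
    drawScene();
}
```

You would call this from the drawFunc argument of drawToBufferAndSwap; after
the swap, the frame just rendered becomes the next frame's read buffer, closing
the feedback loop.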