WebGL: Hands On

DevCon5 Austin 2011

Steve Baker

Senior Software Engineer,

Intific Inc, Austin, TX.

<steve@sjbaker.org>

Agenda:

http://sjbaker.org/devcon5/

  • Introduce WebGL and some history.
  • Show a complete example with source code.
  • Demonstrate some examples from the web.
  • Discuss performance issues.

Introduction:

WebGL is best viewed as a set of JavaScript bindings for OpenGL ES 2.0 - an established 2D and 3D rendering API that's well supported on everything from desktops to phones and pads.

  • Additional convenience functions for loading textures, etc via HTTP.
  • Mechanisms for interacting with the <canvas> system.
  • Hardware acceleration - you can draw millions of textured, lit triangles and still achieve interactive frame rates.
  • Shaders introduce an additional programming language called GLSL that runs natively on the GPU on most graphics cards.

OpenGL programmers have to learn JavaScript.
JavaScript programmers must learn OpenGL.
Everything is mature and well understood because it builds on solid ground.

Processing stages.

In a typical application:

  • On startup: use JavaScript to download/create data structures and hand them to WebGL:
    • Lists of triangles for each 'object' in your scene.
    • Texture map images.
    • Shader program source code.
  • At runtime: use JavaScript to transmit position data for each object to the GPU. Repeat ~10 to ~60 times per second for smooth motion (sketched after this list).
  • In parallel: The GPU draws your objects using the textures and shaders that you provided.
  • Finally: The <canvas> system composites the resulting image into the HTML5 page.
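
A minimal sketch of that runtime loop, assuming helper functions updateObjectPositions() and drawScene() of your own (the names here are hypothetical):

  function tick ()
  {
    updateObjectPositions () ;  // JavaScript: compute the new position of each object
    drawScene () ;              // Hand the new positions to WebGL and redraw
  }
  // Roughly 60 frames per second (requestAnimationFrame is an alternative).
  setInterval ( tick, 1000 / 60 ) ;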

Shaders:

  • A shader is a (typically short) program, written in a C-like language called 'GLSL'.
  • GLSL is fully compiled native code that runs inside the GPU.
  • The GPU is the ultimate 'sandboxed' environment, so this is OK.
  • Shaders come in two flavors:
    • Vertex shaders: run once for every vertex of your object.
    • Fragment shaders: run once for every pixel of every triangle that results.
  • Shaders are 'stateless' - they retain no memory from one vertex/pixel to the next.
  • This enables the extreme parallelism of the GPU to be fully realised.

The Rendering model:

  • The CPU/JavaScript sends lists of vertex and triangle data to the GPU.
  • The GPU runs in parallel:
    • The Vertex Shader transforms those vertices into screen-space.
    • Vertices are reassembled into triangles and clipped to the edges of the <canvas>.
    • The triangles are then 'rasterized' - chopped up into individual pixels with per-pixel data generated by interpolating vertex data.
    • The Fragment Shader takes each pixel and uses the interpolated data to generate an RGBA color.
    • The Z-depth of the pixel is tested to perform hidden-surface culling.
    • The resulting pixel is then alpha-blended into the image buffer.
  • The <canvas> system composites the buffer into the HTML page.

From CPU to GPU.

  • Use the typed array extension to create data buffers representing the vertices of each triangle of your model.
  • These are handed to WebGL as vertex buffer objects (VBOs).
  • A VBO is:
    • A collection of 'attributes' (position, color, texture coordinates, etc) for each vertex.
    • Attributes can either be interleaved or separate blocks of data.
    • An 'index buffer' that says which three vertices make up each triangle.
  • Additionally, JavaScript provides the GPU with "uniform variables" - parameters that are passed into the vertex and fragment shaders.
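
For example, a per-object transform matrix might be handed to the vertex shader as a uniform like this - a sketch, assuming an already-linked and active 'program' and a 16-element Float32Array called 'modelToCamera':

  // Look the uniform up once, after linking the shader program:
  program.ModelToCamera = gl.getUniformLocation ( program, "ModelToCamera" ) ;
  // ...then, each frame (with the program active), send the current matrix:
  gl.uniformMatrix4fv ( program.ModelToCamera, false, modelToCamera ) ;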

The Vertex Shader:

  • A GLSL program, typically one or two dozen lines of code.
  • Each vertex is processed by a logically separate copy of the shader.
    • Take the data for one vertex.
    • Process it in some manner.
    • Emit one vertex in 'screen space'.
    • Exit.
  • Vertex shaders typically:
    • Rotate, translate and scale each vertex of the 3D model to place it correctly relative to the virtual camera (which is always at the origin).
    • Perform perspective calculations.
    • Perform some of the lighting math.
    • Handle complex animation tasks.

Example Vertex Shader

attribute vec3 POSITION ;
attribute vec3 COLOR    ;
uniform mat4 ModelToCamera  ;
uniform mat4 CameraToScreen ;
varying vec4 outColor   ;

void main(void)
{
  outColor.rgb  = COLOR.rgb ;
  outColor.a    = 1.0 ;
  vec4 worldPos = ModelToCamera * vec4 ( POSITION, 1.0 ) ;
  gl_Position   = CameraToScreen * worldPos ;
}

The Fragment Shader:

  • A GLSL program, typically between three and a hundred lines of code.
  • Each 'fragment' (pixel) is processed by a logically separate copy of the shader.
    • Take the data for one pixel.
    • Process it in some manner.
    • Emit the final color for that pixel.
    • Exit.
  • Fragment shaders typically:
    • Apply texture.
    • Perform lighting calculations.
    • Add in 'atmospheric' effects: fog, etc.

Example Fragment Shader

precision mediump float ;
varying vec4 outColor   ;

void main(void)
{
  gl_FragColor = outColor ;
}

A worked example.

It's tough to provide a complete example in a few slides - so this is going to be super-minimal! It's adapted from Giles Thomas' excellent "Learning WebGL" tutorials:

http://learningwebgl.com/

  • Draw a single colored triangle in 2D.
  • Error checking removed for the sake of clarity.

Vertex Shader

Copies vertex position and color directly to output:

  attribute vec3 POSITION;
  attribute vec4 COLOR;
  varying vec4 vColor;

  void main(void)
  {
    gl_Position = vec4(POSITION, 1.0);
    vColor = COLOR;
  }

Fragment Shader

Copies interpolated input color directly to output:

  precision mediump float ;
  varying vec4 vColor ;

  void main(void)
  {
    gl_FragColor = vColor ;
  }

Initialization

  var gl ;
  function initGL ( canvas )
  {
    try {
      gl = canvas.getContext("experimental-webgl");
      gl.viewportWidth = canvas.width;
      gl.viewportHeight = canvas.height;
    } catch (e) {}

    if ( !gl ) alert( "Couldn't initialise WebGL");
  }

It's convenient to send the user to http://get.webgl.org/ if something goes wrong.
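
One way to do that, in place of the alert() above:

  if ( ! gl ) window.location = "http://get.webgl.org/" ;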

Installing Shaders - Part 1

Shaders are sent as source code to WebGL and can be embedded into HTML5 directly using <script> tags:

  <script id="shader-vs" type="x-shader/x-vertex">
  attribute vec3 POSITION;
  attribute vec4 COLOR;
  ...
  </script>
  <script id="shader-fs" type="x-shader/x-fragment">
  precision mediump float;
  varying vec4 vColor;
  ...
  </script>

Installing Shaders - Part 2

  function getShader(gl,id)
  {
    var shader;
    // Grab shader source code from HTML element:
    var script = document.getElementById(id);
    if (script.type == "x-shader/x-vertex")
      shader = gl.createShader(gl.VERTEX_SHADER);
    else
    if (script.type == "x-shader/x-fragment")
      shader = gl.createShader(gl.FRAGMENT_SHADER);
    // Pass the source code to WebGL
    gl.shaderSource(shader, script.text);
    // Compile it!
    gl.compileShader(shader);
    return shader;
  }
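
Error checking was stripped for clarity, but real code should at least ask WebGL whether the compile succeeded - something like this, just before the 'return shader;' line:

  if ( ! gl.getShaderParameter ( shader, gl.COMPILE_STATUS ) )
  {
    // Show the GLSL compiler's error messages.
    alert ( gl.getShaderInfoLog ( shader ) ) ;
    return null ;
  }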

Installing Shaders - Part 3

var program ;
function initShaders () {
  // Grab & Compile both shaders
  var fragmentShader = getShader(gl, "shader-fs");
  var vertexShader   = getShader(gl, "shader-vs");
  program = gl.createProgram(); // Create a 'program'
  gl.attachShader(program, vertexShader  );  // Attach vertex shader
  gl.attachShader(program, fragmentShader);  // Attach frag shader
  gl.linkProgram(program);                   // Link them together
  gl.useProgram(program); // Make this program active
  // Enable shader attributes.
  program.POSITION = gl.getAttribLocation(program, "POSITION");
  gl.enableVertexAttribArray(program.POSITION);
  program.COLOR    = gl.getAttribLocation(program, "COLOR");
  gl.enableVertexAttribArray(program.COLOR);
}

Setting up triangle buffers

  var buffer;
  function initGeometry()
  {
    // Create a WebGL buffer and bind it.
    buffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    // A single triangle with red, green and blue vertices
    var vertexData = [
    // X     Y    Z    R    G    B    A
      0.0,  0.8, 0.0, 1.0, 0.0, 0.0, 1.0, // Vertex 1
     -0.8, -0.8, 0.0, 0.0, 1.0, 0.0, 1.0, // Vertex 2
      0.8, -0.8, 0.0, 0.0, 0.0, 1.0, 1.0  // Vertex 3
    ] ;
    // Pass data to WebGL.
    gl.bufferData(gl.ARRAY_BUFFER,
       new Float32Array(vertexData), gl.STATIC_DRAW);
  }

Drawing the scene

  function drawScene()
  {
    // Set up the viewport and clear the screen.
    gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    // Bind the vertex attribute array buffer
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    // Seven values per vertex (X,Y,Z plus R,G,B,A)
    var stride = 7 * Float32Array.BYTES_PER_ELEMENT;
    // Set up the attribute streams
    gl.vertexAttribPointer(program.POSITION,
                     3, gl.FLOAT, false, stride, 0);
    gl.vertexAttribPointer(program.COLOR,
                     4, gl.FLOAT, false, stride,
                     3 * Float32Array.BYTES_PER_ELEMENT);
    gl.drawArrays(gl.TRIANGLES, 0, 3); // Draw it!
  }

Watch it work!
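
To tie the pieces together, an entry point along these lines (the function name and canvas id are my own, not part of the original tutorial) can be called from <body onload="webGLStart()">:

  function webGLStart ()
  {
    var canvas = document.getElementById ( "webgl-canvas" ) ; // assumed canvas id
    initGL       ( canvas ) ;
    initShaders  () ;
    initGeometry () ;
    gl.clearColor ( 0.0, 0.0, 0.0, 1.0 ) ;  // Clear to opaque black
    drawScene () ;
  }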

A more real example

Hamperball

Efficiency:

  • JavaScript is orders of magnitude slower than GLSL - so push as much work as you can onto the GPU.
    • GLSL is fully compiled to native GPU machine-code.
    • It has four-way parallelism at the instruction level.
    • It typically runs on a massively parallel computer with dozens to hundreds of 'cores'.
  • Minimize the number of draw calls. Ideally, you should be drawing hundreds to thousands of triangles with each draw operation.
  • Learn about OpenGL optimisation:
    • Minimize the number of times you re-bind buffers, textures and shaders (see the sketch below).
    • Also minimize the number of uniform variable changes and other kinds of state switching.
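
One common pattern (a sketch of my own, not from the slides): keep a draw list sorted by shader and texture, and only change state when it actually differs from the previous object:

  // 'drawList' is a hypothetical array of { program, texture, buffer, numVerts }
  // records, pre-sorted so that objects sharing a shader/texture are adjacent.
  var currentProgram = null ;
  var currentTexture = null ;
  for ( var i = 0 ; i < drawList.length ; i++ )
  {
    var obj = drawList [ i ] ;
    if ( obj.program !== currentProgram )   // Only switch shaders when needed...
    {
      gl.useProgram ( obj.program ) ;
      currentProgram = obj.program ;
    }
    if ( obj.texture !== currentTexture )   // ...and likewise for textures.
    {
      gl.bindTexture ( gl.TEXTURE_2D, obj.texture ) ;
      currentTexture = obj.texture ;
    }
    gl.bindBuffer ( gl.ARRAY_BUFFER, obj.buffer ) ;
    // (set up the vertexAttribPointer calls for this buffer here)
    gl.drawArrays ( gl.TRIANGLES, 0, obj.numVerts ) ;
  }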

Some other techniques:

In most cases, you can pick up a book on 3D rendering and the techniques and shaders it describes will work "as is" in WebGL. However, there are some limitations in the API that cause problems for a few common tricks:

  • Modeling content for a 3D world.
  • Shadow casting.
  • Picking.
  • Particle systems.
  • Skeletal mesh deformation.

Modeling content for a 3D world

  • Sooner or later, you're going to get tired of making triangles and cubes and such.
  • Many sites have cheap/free 3D content to get you started. (TurboSquid.com, TheFree3dModels.com, Artist-3d.com, etc)
  • In most applications, to make useful 3D models, you would use a tool like 3D Studio, Maya, Lightwave or the free, open-source tool 'Blender'.
  • Either way, create an exporter to write data into Collada, XML, JSON or whatever format you choose.
  • Download models during program startup (a sketch of this follows the list).
  • Texture maps are just conventional RGB images in PNG (JPEG is 'iffy').
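
A sketch of the download step, assuming your exporter wrote a JSON file containing a flat 'vertices' array (the file layout and names here are illustrative):

  function loadModel ( url, onLoaded )
  {
    var request = new XMLHttpRequest () ;
    request.open ( "GET", url ) ;
    request.onreadystatechange = function ()
    {
      if ( request.readyState == 4 )   // Download complete
      {
        var model  = JSON.parse ( request.responseText ) ;
        var buffer = gl.createBuffer () ;
        gl.bindBuffer ( gl.ARRAY_BUFFER, buffer ) ;
        gl.bufferData ( gl.ARRAY_BUFFER,
                        new Float32Array ( model.vertices ), gl.STATIC_DRAW ) ;
        onLoaded ( buffer, model ) ;
      }
    } ;
    request.send () ;
  }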

Shadow Casting

The conventional approach to shadow casting in OpenGL is as follows:

  • Render the scene from the perspective of the light source.
  • Store the Z-buffer into a 'shadow depth map' texture.
  • Render the scene from the perspective of the camera.
  • In the fragment shader, compare the distance from this pixel to the light source to the value in the shadow depth map. If it's further away, then you're in shadow.

Sadly, WebGL doesn't let you perform that second step. So instead, you must write the Z-depth of the pixel into the Red/Green/Blue planes of the image - storing just a few bits into each. On systems with only 4 or 5 bits of RGB, this is a challenge!
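
A rough sketch of that packing trick for the light-source pass, assuming the vertex shader delivers a 0..1 depth value in a varying called 'vDepth' (the names here are illustrative):

  precision mediump float ;
  varying float vDepth ;   // Depth scaled into the 0..1 range by the vertex shader

  // Pack a 0..1 value into the four 8-bit RGBA channels.
  vec4 packDepth ( float depth )
  {
    const vec4 bitShift = vec4 ( 256.0*256.0*256.0, 256.0*256.0, 256.0, 1.0 ) ;
    const vec4 bitMask  = vec4 ( 0.0, 1.0/256.0, 1.0/256.0, 1.0/256.0 ) ;
    vec4 comp = fract ( depth * bitShift ) ;
    comp -= comp.xxyz * bitMask ;
    return comp ;
  }

  void main ( void )
  {
    gl_FragColor = packDepth ( vDepth ) ;
  }

The camera-pass fragment shader then reverses the packing (a dot product with the reciprocal shift values) before comparing the result against its own distance to the light.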

Picking.

Standard OpenGL has mechanisms to help figure out which object the mouse clicked on. Sadly, WebGL lacks that feature, and doing it with ray-casting in JavaScript is painful. So we have to get creative and do it in the GPU.

  • When the mouse is clicked...
  • Render the scene as usual - but render each object in the scene in a different color.
  • Read back the pixel under the mouse pointer...that tells you which object you clicked (see the sketch below).
  • Clear the screen and re-render the scene in the proper colors.
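
A sketch of the read-back step, assuming each object was just rendered with a flat color whose red channel holds its object number, and that 'mouseX'/'mouseY' are canvas-relative mouse coordinates (my names, not WebGL's):

  // gl.readPixels uses a bottom-left origin, so flip the Y coordinate.
  var pixel = new Uint8Array ( 4 ) ;
  gl.readPixels ( mouseX, canvas.height - mouseY, 1, 1,
                  gl.RGBA, gl.UNSIGNED_BYTE, pixel ) ;
  var pickedObject = pixel [ 0 ] ;   // Red channel = object number (our convention)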

Particle systems

Used for special effects like fire, smoke, sparkles and magical effects.

  • Created by animating a bunch of translucent, textured triangles.
  • Traditionally, this animation is performed in the CPU.
  • But JavaScript is slow.

So...use the GPU...

Particle systems in the GPU

  • Devise a parameterized equation that describes the paths of your particles as a function of time.
  • Create a bunch of triangles at the origin and give the vertices of each a 'ParticleNumber' attribute that runs from zero to the number of particles.
  • Send the vertex shader a uniform variable "AgeOfParticleSystem" and another "ParticleCreationRate".
  • Use vertex shader code to calculate the age of each particle: AgeOfParticleSystem + ParticleNumber / ParticleCreationRate.
  • From that, compute its position (see the shader sketch after this list).
  • You can parametrically define other properties such as color, size, etc.
  • Use an additional vertex attribute to provide randomness.
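
A rough vertex-shader sketch of that idea - the uniforms and attributes follow the list above, but the equation of motion (a simple recycling fountain under gravity) and the 'ParticleLifetime' and 'ModelToScreen' uniforms are just illustrative:

  attribute vec3  POSITION ;             // Base position of this particle's triangle
  attribute float ParticleNumber ;       // 0 .. numParticles-1
  attribute vec3  RandomDirection ;      // Per-particle launch direction (the randomness)
  uniform   float AgeOfParticleSystem ;
  uniform   float ParticleCreationRate ;
  uniform   float ParticleLifetime ;     // Seconds before a particle is recycled
  uniform   mat4  ModelToScreen ;

  void main ( void )
  {
    // Stagger each particle's age, wrapping so the system recycles forever.
    float age = mod ( AgeOfParticleSystem + ParticleNumber / ParticleCreationRate,
                      ParticleLifetime ) ;
    // Simple ballistic path: launch velocity plus gravity.
    vec3 offset = RandomDirection * age + vec3 ( 0.0, -4.9 * age * age, 0.0 ) ;
    gl_Position = ModelToScreen * vec4 ( POSITION + offset, 1.0 ) ;
  }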

Skeletal mesh deformation.

Skeletal mesh animation is for things like people or animals where an 'organic' motion is required.

  • Conventionally: Assign each vertex of the 'skin' of the creature to a 'bone'.
  • The physics/animation system moves the bones and transforms each skin vertex using its assigned bone's matrix.
  • This is often performed in the CPU - but transforming hundreds of thousands of skin vertices in JavaScript is a non-starter.

So, we need to do this inside the GPU:

  • Pack all of the bone positions and rotations into 'uniform variables'.
  • Provide the bone number for each vertex as a vertex attribute.
  • Have the shader access the appropriate bone and transform the vertex as required.
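
A vertex-shader sketch of that approach (the bone count and names are illustrative - and note that the limited number of uniforms guaranteed by GLSL ES caps how many bones fit into a single draw call):

  attribute vec3  POSITION ;
  attribute float BoneNumber ;          // Which bone this skin vertex follows
  uniform   mat4  BoneMatrix [ 30 ] ;   // One transform per bone; 30 is arbitrary
  uniform   mat4  CameraToScreen ;

  void main ( void )
  {
    // Move the vertex with its bone, then project it onto the screen.
    vec4 skinned = BoneMatrix [ int ( BoneNumber ) ] * vec4 ( POSITION, 1.0 ) ;
    gl_Position  = CameraToScreen * skinned ;
  }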

Support Libraries

There are at least a dozen libraries and 'game engine' packages out there to help out with mundane stuff like matrix math - and to handle higher level functionality:

  • There is a list of them here:
    http://www.khronos.org/webgl/wiki/User_Contributions#Frameworks
  • Many support Collada as their base model file format.

Resources

  • Khronos group WebGL Wiki:
    http://www.khronos.org/webgl/wiki/Main_Page
  • WebGL Developers' List:
    https://groups.google.com/group/webgl-dev-list
  • Gregg Tavares' Google I/O 2011 talk on WebGL Techniques and Performance:
    http://www.google.com/events/io/2011/sessions/webgl-techniques-and-performance.html
  • Giles Thomas' WebGL blog:
    http://learningwebgl.com/blog/
  • Mozilla's WebGL articles:
    http://hacks.mozilla.org/category/webgl/
  • WebGL Chrome Experiments:
    http://www.chromeexperiments.com/webgl/

Q & A