WebGL Raytracer

Live Demo

WebGL harness

I used the same base code I developed for the HTML5 pointcloud viewer for the raytracer, though much simplified. The whole ray tracer is implemented in the fragment shader; the vertex shader only creates the primary rays. The result is displayed using a full-screen quad, which acts as the input primitive. I still use my camera class that provides the OpenGL matrices to set up the ray tracing. That way I can easily support different projection methods etc. and maintain some compatibility with other WebGL code.

Ray creation -- the Vertex Shader

The vertex shader sets up all primary rays that intersect the scene. I am drawing a full-screen quad with 2D vertices in [0--1]. These vertices are first transformed into full clip-space coordinates and from there to world coordinates (don't forget to divide by w!), giving both the near- and far-plane coordinates for each corner vertex. The ray origin is always the near-plane position, the ray direction the vector from the near-plane to the far-plane position. We do this in the vertex shader for all 4 corner vertices, and the result is interpolated for the fragment shader.

Here is the full code of the shader:


// input position in [0..1]
attribute vec2 positionIn;

// output ray origin and direction 
varying vec3 rayOrigin;
varying vec3 rayDirection;

// the inverse ModelViewProjectionMatrix from our OpenGL cameras
uniform mat4 inverseMVP;

void main()
{

  // expand the vertex to [-1..1]
  vec2 vertex = positionIn * 2.0 - vec2(1.0);

  // Calculate the farplane vertex position. 
  // Input is clipspace, output is worldspace
  vec4 farPlane = inverseMVP * vec4(vertex, 1.0, 1.0);
  farPlane /= farPlane.w;

  // Same as above for the near plane. 
  // Remember the near plane in OpenGL is at z=-1
  vec4 nearPlane = inverseMVP * vec4(vertex, -1.0, 1.0);
  nearPlane /= nearPlane.w;

  // No need to normalize ray direction here, as it might get 
  // interpolated (and become non-unit) before going to the fragment 
  // shader.
  rayDirection = (farPlane.xyz - nearPlane.xyz); 
  rayOrigin = nearPlane.xyz;

  // output the actual clipspace position for OpenGL
  gl_Position = vec4(vertex, 0.0, 1.0);
}
        

Ray intersection -- the Fragment Shader

The actual ray tracing happens here. To keep it super-simple, I packed the full source of a simple ray tracer into the shader. A ray is created from the inputs of the vertex shader and used to intersect a couple of spheres and the ground floor. The spheres have a simple material system: a diffuse color and a 'reflectance' value that determines whether a secondary ray is spawned and how much the new hit contributes. Usually this would run recursively until a threshold value is reached; in GLSL, however, we are stuck with for loops with static initialization, so we have to go through N iterations anyway.

If a hit is recorded and the current hit is closer to the ray origin than any previous hit (lower line parameter t), we calculate its shading based on the material, position and normal. A secondary ray is spawned towards the (only) light source to see if the pixel is lit or not.

Here is (most of) the source of the fragment shader:

varying vec3 rayDirection;
varying vec3 rayOrigin;

// point light
uniform vec3 light0;

struct Ray
{
  vec3    origin;
  vec3    direction;
};

struct Material
{
  vec3  diffuse;
  float   reflectance;
};


struct Hit
{
  vec3    position;
  vec3    normal;
  float   t;

  Material  material;
};

struct Sphere
{
  // coords + radius
  vec3    center;
  float   radius;
  Material  material;
};


// a small epsilon to work around the biggest problems with floats
const float EPS = 0.0001;
// 'clear value' for the ray 
const float START_T = 100000.0;
// maximum recursion depth for rays
const int MAX_DEPTH = 3;


Ray initRay()
{
  Ray r;
  r.origin = rayOrigin;
  r.direction = normalize(rayDirection);
  return r;
}

// intersects the ray with a sphere and returns the line parameter t for
// a hit or -1 if no hit occurred
float intersectRaySphere(in Ray ray, in Sphere sphere)
{
  ...
}

// simplified ray-sphere hit test, returns true on a hit
bool intersectRaySphereOcclusion(in Ray ray, in Sphere sphere)
{
  ... 
}

// intersects the ray with the ground plane at Y=0 and returns the line
// parameter t where it hit
float intersectRayYPlane(in Ray ray)
{
  return -ray.origin.y / ray.direction.y;
}


// the scene consists of 7 spheres, a ground plane and a light
Sphere spheres[7];

// initializes the scene
void initScene()
{
  // set the center, radius and material of the first sphere
  spheres[0].center = vec3(0.0, 4.0, 0.0);
  spheres[0].radius = 2.0;
  spheres[0].material.diffuse = vec3(0.7, 0.0, 0.0);
  spheres[0].material.reflectance = 0.1;

  // and so on 
  ...
}


// traces the scene and reports the closest hit
bool traceScene(in Ray ray, inout Hit hit)
{


  hit.t = START_T;

  // first intersect the ground
  float t = intersectRayYPlane(ray);
  if (t >= 0.0 && t <= hit.t)
  {
    hit.t = t;
    hit.position = ray.origin + ray.direction * t;
    hit.normal = vec3(0,1,0);
    hit.material.diffuse  = vec3(0.6);
    hit.material.reflectance = 0.0;//05;
  }


  // then check each of the seven spheres
  for (int i = 0; i < 7; ++i)
  {
    t = intersectRaySphere(ray, spheres[i]);

    // only keep this hit if it's closer (smaller t)
    if (t >= 0.0 && t <= hit.t)
    {

      vec3 pos = ray.origin + ray.direction * t; 
      vec3 N = normalize(pos - spheres[i].center);

      hit.t = t;
      hit.normal = N;
      hit.material = spheres[i].material;
      hit.position = pos;

    }
  }

  return hit.t < START_T;
}


// shades a given hit and returns the final color
vec3 shadeHit(in Hit hit)
{
  vec3 color = hit.material.diffuse;

  // ray to the light
  vec3 L = normalize(light0 - hit.position);

  // test for shadows
  Ray r;
  r.origin = hit.position + hit.normal * EPS;
  r.direction = L;

  // Diffuse (Lambertian) shading with a 0.2 minimum ambient contribution. 
  float s = max(0.2, dot(L,hit.normal));

  for (int i = 0; i < 7; ++i)
    if (intersectRaySphereOcclusion(r, spheres[i]))
    {
      s = 0.2;
      break;
    } 
  color *= s;

  return color;
}


void main()
{

  // create the primary ray
  Ray ray = initRay();

  // create the scene
  initScene();

  // the 'clear color' is the ray direction (useful for debugging)
  vec3 color = ray.direction;


  Hit hit;
  // trace the scene and see if we hit something
  if (traceScene(ray, hit)) 
  {

    // if we do, shade the hit
    color = shadeHit(hit);


    // cannot use a while loop in glsl :( -- iterate for N recursions
    for (int i = 0; i < MAX_DEPTH; ++i)
    {

      // add reflections if the material is reflective
      float r = hit.material.reflectance;
      if (r > 0.0)
      {

        vec3 R = reflect(ray.direction, hit.normal);
        ray.origin = hit.position + hit.normal * EPS;
        ray.direction = R;


        vec3 color2;

        // again, trace the scene and shade the hit if successful
        if (traceScene(ray, hit))
          color2 = shadeHit(hit);
        else
          color2 = ray.direction;
        
        // add the color based on the reflection value
        color = mix(color, color2, r);
        
      }

    }

  }

  gl_FragColor = vec4(color, 1.0);

}
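
The bodies of the two ray-sphere functions were elided above. A minimal sketch of what they could look like, based on the standard quadratic formulation and assuming a unit-length ray direction (which initRay provides) -- not necessarily the exact code used in the demo:

// possible implementation of the elided ray-sphere test (sketch only)
float intersectRaySphere(in Ray ray, in Sphere sphere)
{
  // vector from the sphere center to the ray origin
  vec3 oc = ray.origin - sphere.center;

  // coefficients of t^2 + b*t + c = 0 (the quadratic coefficient is 1.0
  // because ray.direction is normalized)
  float b = 2.0 * dot(ray.direction, oc);
  float c = dot(oc, oc) - sphere.radius * sphere.radius;
  float discriminant = b * b - 4.0 * c;

  // negative discriminant: the ray misses the sphere
  if (discriminant < 0.0)
    return -1.0;

  // prefer the nearer intersection; fall back to the farther one if the
  // nearer lies behind the ray origin
  float sqrtD = sqrt(discriminant);
  float t = (-b - sqrtD) * 0.5;
  if (t < 0.0)
    t = (-b + sqrtD) * 0.5;
  return t;
}

// the occlusion variant only needs a yes/no answer, so the square root
// (and the exact t) can be skipped
bool intersectRaySphereOcclusion(in Ray ray, in Sphere sphere)
{
  vec3 oc = ray.origin - sphere.center;
  float b = 2.0 * dot(ray.direction, oc);
  float c = dot(oc, oc) - sphere.radius * sphere.radius;

  // a hit in front of the origin exists if the discriminant is
  // non-negative and the larger root is not negative
  return (b * b - 4.0 * c) >= 0.0 && (b <= 0.0 || c <= 0.0);
}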