Here I'll cover an approach for creating a retro-style shader material. Three.js has some support for this kind of effect, but I decided not to use it so I'd have more control over, and a better understanding of, what's happening behind the scenes. That library is also a bit larger than I'd like to pull in just yet, at around 500kb minified.
This post has three main parts: building the plane geometry, generating barycentric coordinates for the wireframe effect, and writing the vertex and fragment shaders.
The first step in this experiment was creating a plane geometry. You might think this is trivial, but it's actually a bit tricky because there's no simple way to tell WebGL to just "draw a plane". Instead, you have to construct the plane from a geometry primitive like a triangle strip, which is what I've done here. With a triangle strip, WebGL creates a triangle from every run of three consecutive vertices in the position data.
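To make that ordering concrete, here's a tiny helper (purely illustrative, not part of the project code) that lists which vertices each triangle in a strip uses. The winding order alternates in a real strip, but the vertex membership is what matters here.
// For a strip of n vertices, triangle i uses vertices i, i+1 and i+2.
const stripTriangles = (vertexCount: number): Array<[number, number, number]> => {
  const triangles: Array<[number, number, number]> = [];
  for (let i = 0; i + 2 < vertexCount; i++) {
    triangles.push([i, i + 1, i + 2]);
  }
  return triangles;
};

stripTriangles(5); // [[0, 1, 2], [1, 2, 3], [2, 3, 4]]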
The algorithm below creates a regularly spaced grid of coordinates forming a plane across the X and Z axes. For each vertex, I support passing a function that returns the Y coordinate (elevation) based on the current X and Z indices. This makes it easy to map some pregenerated noise onto the plane as elevation values.
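One note before the code: the VertexData return type (and the VertexAttributeData type used later for attribute setup) isn't defined in this section, so here's the shape I'm assuming based on how the objects are used.
// Assumed shapes, inferred from how the helpers below are used.
interface VertexData {
  data: number[]; // flat list of vertex components
  size: number;   // components per vertex (3 here: x, y, z)
  count: number;  // number of vertices described by `data`
}

interface VertexAttributeData extends VertexData {
  attributeName: string; // name of the matching `in` attribute in the vertex shader
}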
const createPlaneGeometry = (
  planeWidth: number,
  planeHeight: number,
  getY: (xi: number, zi: number) => number = () => 0
): VertexData => {
  const data: number[] = [];
  const rectWidth = 25;
  // Split the plane into strips which represent a column of squares
  for (let strip = 0; strip < planeWidth - 1; strip++) {
    for (let row = 0; row < planeHeight; row++) {
      for (let col = 0; col < 2; col++) {
        // Each new strip must define vertices in the opposite direction
        // e.g. up and to the right, or down and to the left.
        const even = strip % 2 === 0;
        const colAdd = Math.floor(strip / 2) * 2;
        const xi = even ? col + colAdd : 2 - col + colAdd;
        const zi = even ? row : planeHeight - row - 1;
        // push x,y,z coordinates for this vertex
        data.push(rectWidth * xi);
        data.push(getY(xi, zi));
        data.push(rectWidth * zi);
      }
    }
  }
  return {
    data,
    size: 3,
    count: data.length / 3
  };
};
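As a quick sanity check (hypothetical usage, not from the post), a 3×3 plane with the default flat getY produces two strips of six vertices each:
const flat = createPlaneGeometry(3, 3); // getY defaults to () => 0
console.log(flat.count); // 12 = 2 strips x 3 rows x 2 columns
console.log(flat.data.slice(0, 6)); // first two vertices: [0, 0, 0, 25, 0, 0]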
There are two main things that make the shader code work.
Barycentric coordinates provide a way to measure how close any point inside a triangle is to each of the triangle's edges. The way I use them is pretty simple: when the fragment shader is drawing a pixel, it changes the color depending on how close that pixel is to an edge of its triangle face.
This is surprisingly easy to do thanks to how the fragment shader receives values from the vertex shader. When data is passed from the vertex shader to the fragment shader, the values output by each vertex of a face are interpolated to produce a value for every pixel of that face. For example, if vertex v0 outputs 0.0 and vertex v1 outputs 1.0, the pixels along the line from v0 to v1 receive a smooth gradient of values from 0.0 to 1.0.
This means I can assign one of the three unit basis vectors to each vertex in the geometry. Then, in the fragment shader, the interpolated vector tells me how close the current pixel is to each edge of its triangle: whenever any component approaches zero, the pixel is near an edge.
export const createBarycentricCoords = (vertexCount: number): VertexData => {
  const data: number[] = [];
  for (let i = 0; i < vertexCount; i++) {
    // each vertex gets its own vector: 1,0,0 ; 0,1,0 ; 0,0,1
    const vector = [0, 0, 0];
    vector[i % 3] = 1;
    data.push(...vector);
  }
  return {
    data,
    size: 3,
    count: vertexCount
  };
};
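For instance (again just an illustrative check), the pattern repeats every three vertices, and since triangle n of a strip uses vertices n, n+1 and n+2, every triangle gets the three distinct basis vectors at its corners:
const bary = createBarycentricCoords(6);
console.log(bary.data);
// [1, 0, 0,  0, 1, 0,  0, 0, 1,  1, 0, 0,  0, 1, 0,  0, 0, 1]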
Now the vertex data can be passed along with the barycentric coordinate data, each with its own vertexAttribPointer.
const width = 36;
const height = 36;
const noise = createNoise(width, height);
const plane = createPlaneGeometry(
  width,
  height,
  (xi, zi) => noise[xi + zi * height]
);
const barycentric = createBarycentricCoords(plane.count);
const vertexData: Array<VertexAttributeData> = [
  { ...plane, attributeName: "a_position" },
  { ...barycentric, attributeName: "a_barycentric" }
];
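The buffer setup itself isn't shown in this post, but with that array it might look roughly like the sketch below. I'm assuming a WebGL2 context gl and an already compiled and linked program; the helper name is my own.
const uploadAttributes = (
  gl: WebGL2RenderingContext,
  program: WebGLProgram,
  attributes: Array<VertexAttributeData>
) => {
  for (const attribute of attributes) {
    // One buffer per attribute, filled with the flat vertex data
    const buffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(attribute.data), gl.STATIC_DRAW);
    // Point the shader's `in` attribute at the buffer:
    // `size` floats per vertex, not normalized, tightly packed
    const location = gl.getAttribLocation(program, attribute.attributeName);
    gl.enableVertexAttribArray(location);
    gl.vertexAttribPointer(location, attribute.size, gl.FLOAT, false, 0, 0);
  }
};
The vertex shader declares the two matching in attributes and passes the barycentric vector straight through to the fragment shader: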
#version 300 es
in vec4 a_position;
in vec3 a_barycentric;
uniform mat4 u_matrix;
out vec3 v_barycentric;
void main() {
  gl_Position = u_matrix * a_position;
  // Pass the barycentric coords through so they get interpolated
  // across the triangle before reaching the fragment shader.
  v_barycentric = a_barycentric;
}
#version 300 es
precision mediump float;
in vec3 v_barycentric;
out vec4 out_color;
// This technique is based on Florian Bösch's article
float edge_factor() {
  // fwidth gives the per-pixel rate of change of the barycentric coords,
  // which keeps the line width roughly constant in screen space.
  vec3 d = fwidth(v_barycentric);
  vec3 smooth_dist = smoothstep(vec3(0.0), d * 5.5, v_barycentric);
  // The smallest component is the distance to the nearest edge:
  // 0.0 right on an edge, rising to 1.0 inside the face.
  return min(min(smooth_dist.x, smooth_dist.y), smooth_dist.z);
}
void main() {
  vec3 face = vec3(0.0);
  vec3 wire = vec3(1.0, 0.0, 1.0);
  vec3 highlight = vec3(1.0, 1.0, 1.0);
  float edge = edge_factor();
  if (edge < 0.5) {
    // Near an edge: blend from the white highlight into the wire color.
    out_color = vec4(mix(highlight, wire, edge * 5.0), 1);
  } else {
    // Away from the edge: fade the wire color toward the dark face color.
    out_color = vec4(mix(wire, face, edge), 1);
  }
}
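To finish the picture, here's a rough sketch of the draw call. None of these names come from the post, and the matrix is assumed to be whatever combined projection/view/model transform is computed elsewhere.
const drawPlane = (
  gl: WebGL2RenderingContext,
  program: WebGLProgram,
  matrix: Float32Array,
  vertexCount: number
) => {
  gl.useProgram(program);
  // Upload the transform used by the vertex shader's u_matrix uniform
  const matrixLocation = gl.getUniformLocation(program, "u_matrix");
  gl.uniformMatrix4fv(matrixLocation, false, matrix);
  // The whole plane was built as one continuous strip, so a single call draws it
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, vertexCount);
};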