Learn Creative Coding (#44) - Working with Textures in Shaders

Everything we've done in shaders so far has been procedural. Every color, every pattern, every shape -- generated from math. Noise functions, SDFs, fractal iterations, cosine palettes. No external data. The fragment shader receives pixel coordinates and time, and computes everything from scratch.

But shaders can also read from images. You load a photograph, a painting, a gradient, a noise texture, whatever you want -- and the shader can sample it at any position, getting back the RGBA color at that point. This is texture sampling, and it unlocks a completely different category of effects. Distortion, displacement, feedback, data-driven coloring, image processing, generative manipulation of photographs. The procedural tools we already have combine with image data in ways that get really wild really fast.

The WebGL setup for textures is a bit more involved than what we've been doing (passing a single float for time or a vec2 for resolution). You need to create a texture object, load an image into it, set filtering modes, and pass it to the shader as a sampler2D uniform. I'll walk through the JavaScript side first, then we'll spend the rest of the episode in GLSL doing interesting things with the texture data.

Setting up textures in WebGL

Here's the minimal JavaScript to load an image and pass it to a shader as a texture. This assumes you already have the WebGL boilerplate from episode 21 -- canvas, vertex shader (the fullscreen quad), fragment shader compilation, and the render loop with u_time and u_resolution uniforms.

// load image and create texture
const img = new Image();
img.crossOrigin = 'anonymous';
img.src = 'your-image.jpg';

img.onload = function() {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // flip Y so uv (0,0) samples the bottom-left of the image
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);

  // upload image to GPU
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
                gl.RGBA, gl.UNSIGNED_BYTE, img);

  // filtering: LINEAR for smooth, NEAREST for pixelated
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);

  // wrapping: CLAMP_TO_EDGE prevents repeating
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

  // bind texture to unit 0
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // tell the shader which texture unit to use
  const texLoc = gl.getUniformLocation(program, 'u_texture');
  gl.uniform1i(texLoc, 0);
};

The key parts: texImage2D uploads the pixel data from the image to GPU memory. The filtering parameters control what happens when you sample between pixels -- LINEAR interpolates smoothly (good for photos), NEAREST snaps to the closest pixel (good for pixel art or when you want crisp blocks). The wrapping parameters control what happens when UV coordinates go outside the 0-1 range -- CLAMP_TO_EDGE extends the edge pixels, REPEAT tiles the image. One WebGL1 gotcha: if the image dimensions aren't powers of two, you must use CLAMP_TO_EDGE wrapping and non-mipmap filtering -- which is exactly what we set here.

In the fragment shader, you declare the texture as a sampler2D uniform:

uniform sampler2D u_texture;

And sample it with texture2D():

vec4 texColor = texture2D(u_texture, uv);

The uv is a vec2 in the range 0.0 to 1.0. (0.0, 0.0) is the bottom-left corner of the image, (1.0, 1.0) is the top-right. The return value is a vec4 with the RGBA color at that position. That's the entire API for reading textures in GLSL.

Basic texture display

Let's start with the simplest possible thing -- just display the texture:

precision mediump float;
uniform vec2 u_resolution;
uniform sampler2D u_texture;

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;
  vec4 color = texture2D(u_texture, uv);
  gl_FragColor = color;
}

Three lines in main(). Map pixel to UV, sample texture, output. The image appears on screen. If the texture and the canvas have different aspect ratios, the image will be stretched. To fix that you'd need to pass the texture dimensions as a uniform and do the aspect correction math, but for learning purposes stretching is fine.
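
For reference, here's one way that correction could look -- a minimal GLSL sketch that crops "cover"-style so nothing stretches, assuming a u_texSize uniform (a name I'm making up here) that you'd set from JS with gl.uniform2f(texSizeLoc, img.width, img.height):

uniform vec2 u_texSize;  // image width/height, passed from JS (assumed uniform)

// crop-to-cover: scale UVs so the image fills the canvas without stretching
vec2 coverUV(vec2 uv) {
  float canvasAspect = u_resolution.x / u_resolution.y;
  float texAspect = u_texSize.x / u_texSize.y;
  vec2 scale = (canvasAspect < texAspect)
    ? vec2(canvasAspect / texAspect, 1.0)   // canvas narrower: crop the sides
    : vec2(1.0, texAspect / canvasAspect);  // canvas wider: crop top/bottom
  return (uv - 0.5) * scale + 0.5;
}

Sample with texture2D(u_texture, coverUV(uv)) instead of the raw uv and the image keeps its proportions.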

Now here's where it gets interesting. That uv value is just a vec2. We can manipulate it before sampling. Every manipulation changes WHERE we read from the texture, which changes WHAT we see. This is the foundation of all texture effects -- transform the UV, sample the transformed position.

UV distortion with noise

Take the noise hash we've been using since episode 35 and use it to offset UV coordinates before sampling:

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_texture;

float hash(vec2 p) {
  return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

float noise(vec2 p) {
  vec2 i = floor(p);
  vec2 f = fract(p);
  f = f * f * (3.0 - 2.0 * f);

  float a = hash(i);
  float b = hash(i + vec2(1.0, 0.0));
  float c = hash(i + vec2(0.0, 1.0));
  float d = hash(i + vec2(1.0, 1.0));

  return mix(mix(a, b, f.x), mix(c, d, f.x), f.y);
}

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // noise-based UV distortion
  float n1 = noise(uv * 5.0 + u_time * 0.3);
  float n2 = noise(uv * 5.0 + u_time * 0.3 + 100.0);

  vec2 distorted = uv + vec2(n1 - 0.5, n2 - 0.5) * 0.05;

  vec4 color = texture2D(u_texture, distorted);
  gl_FragColor = color;
}

The noise values offset the UV by a small amount (0.05 at most) in both X and Y. Because the noise is smooth and continuous, the distortion looks like heat haze or underwater refraction -- the image wobbles and ripples organically. The u_time term makes it animate, and the + 100.0 on the second noise call gives a different pattern for X and Y so the motion isn't uniform.

Crank up the distortion amount (* 0.05 to * 0.2 or higher) and the image dissolves into an abstract swirl. Keep it subtle (* 0.01) and it's barely perceptible -- just enough to make a still image feel alive. The sweet spot depends on what you're going for.

The noise frequency (uv * 5.0) controls the scale of the distortion. Low frequencies (2-3) create broad, sweeping warps. High frequencies (10-20) create tight, wrinkly distortion. Mix both for layered effects -- a slow broad warp plus a fast fine-detail shimmer:

float n1_lo = noise(uv * 2.0 + u_time * 0.1);
float n1_hi = noise(uv * 15.0 + u_time * 0.8);
float n1 = n1_lo * 0.7 + n1_hi * 0.3;

Displacement mapping

Displacement takes the idea further. Instead of using noise to offset UVs, use one image's brightness to offset another image's UVs. Load two textures -- one is the "scene" image, the other is the "displacement map" (often a grayscale image where brightness encodes the offset amount):

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_texture;    // scene image
uniform sampler2D u_dispMap;    // displacement map

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // sample displacement map (red drives X, green drives Y)
  vec4 disp = texture2D(u_dispMap, uv);

  // remap the 0-1 channels to -1..1 and scale
  float amount = 0.05;
  vec2 offset = vec2(
    disp.r * 2.0 - 1.0,
    disp.g * 2.0 - 1.0
  ) * amount;

  vec4 color = texture2D(u_texture, uv + offset);
  gl_FragColor = color;
}

The displacement map's red channel drives horizontal offset, green channel drives vertical offset. The * 2.0 - 1.0 converts from the 0-1 texture range to -1 to 1, so displacement can push in both directions. The amount controls the maximum pixel displacement.
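
On the JavaScript side, the second texture just goes on a different texture unit. A minimal sketch, assuming dispTexture was created and loaded the same way as the scene texture (the variable names are placeholders):

// scene texture on unit 0, displacement map on unit 1
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, sceneTexture);
gl.uniform1i(gl.getUniformLocation(program, 'u_texture'), 0);

gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_2D, dispTexture);
gl.uniform1i(gl.getUniformLocation(program, 'u_dispMap'), 1);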

If you don't have a second texture to use as a displacement map, you can use the scene image itself for self-displacement. Use the luminance of the image at each pixel to offset the sampling position:

vec4 orig = texture2D(u_texture, uv);
float luma = dot(orig.rgb, vec3(0.299, 0.587, 0.114));
vec2 displaced = uv + vec2(luma - 0.5) * 0.03;
vec4 color = texture2D(u_texture, displaced);

Self-displacement creates this recursive, swirling effect where bright areas pull pixels toward them and dark areas push them away. It looks like the image is melting toward its own highlights.

The pixel sorting effect

Kim Asendorf's pixel sorting became one of the most recognizable glitch art techniques. The algorithm sorts pixel columns or rows by brightness, creating those striking vertical or horizontal streaks. In a true implementation you'd sort actual pixel arrays, but in a shader we can approximate the effect by shifting UVs based on brightness comparison.

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_texture;

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // define a threshold for "bright" pixels
  float threshold = 0.5 + 0.3 * sin(u_time * 0.5);

  // sample current pixel brightness
  vec4 current = texture2D(u_texture, uv);
  float luma = dot(current.rgb, vec3(0.299, 0.587, 0.114));

  vec4 color = current;

  // if pixel is above threshold, shift it downward
  if (luma > threshold) {
    float shift = (luma - threshold) * 0.15;
    vec2 sortedUV = uv + vec2(0.0, -shift);
    color = texture2D(u_texture, sortedUV);
  }

  gl_FragColor = color;
}

This isn't a real sort (that would require comparing and swapping pixel positions across the entire column, which a fragment shader can't do in a single pass). But it creates a visual approximation -- bright pixels get shifted downward proportionally to their brightness, creating those characteristic vertical streaks. Animate the threshold with a sine wave and the sorting sweeps across the image, revealing and concealing different regions.

For horizontal sorting, change the shift direction:

vec2 sortedUV = uv + vec2(-shift, 0.0);

For diagonal streaks:

vec2 sortedUV = uv + normalize(vec2(1.0, -1.0)) * shift;

The effect is most dramatic on images with high contrast -- portraits with bright skin against dark backgrounds, cityscapes with bright windows, anything with clear brightness boundaries.

Texture feedback: the previous frame

One of the most powerful things you can do with textures is use the previous frame as input to the current frame. Render to a texture, then use that texture as input in the next frame. Each frame sees the result of all previous frames, creating accumulation, trails, and evolution.

The WebGL setup requires a framebuffer object (FBO) and two textures -- a "ping-pong" pair where you alternate reading from one and writing to the other:

// create two textures and FBOs for ping-pong
function createFBOPair(gl, width, height) {
  const textures = [];
  const fbos = [];

  for (let i = 0; i < 2; i++) {
    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA,
                  width, height, 0, gl.RGBA,
                  gl.UNSIGNED_BYTE, null);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    const fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER,
      gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

    textures.push(tex);
    fbos.push(fbo);
  }

  return { textures, fbos };
}

// in the render loop:
let frame = 0;
function render() {
  const read = frame % 2;
  const write = (frame + 1) % 2;

  // bind write FBO
  gl.bindFramebuffer(gl.FRAMEBUFFER, pair.fbos[write]);

  // bind read texture to unit 1
  gl.activeTexture(gl.TEXTURE1);
  gl.bindTexture(gl.TEXTURE_2D, pair.textures[read]);
  gl.uniform1i(prevFrameLoc, 1);

  // render
  gl.drawArrays(gl.TRIANGLES, 0, 6);

  // now display the result: draw to screen
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  // ... draw the write texture to screen ...

  frame++;
  requestAnimationFrame(render);
}
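
The elided display step can be as simple as a second pass-through program that samples a texture and draws it to the default framebuffer. A sketch, where displayProgram and displayTexLoc are hypothetical names for that program and its sampler uniform:

// draw the freshly-written texture to the screen
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
gl.useProgram(displayProgram);
gl.activeTexture(gl.TEXTURE0);
gl.bindTexture(gl.TEXTURE_2D, pair.textures[write]);
gl.uniform1i(displayTexLoc, 0);
gl.drawArrays(gl.TRIANGLES, 0, 6);
gl.useProgram(program);  // switch back to the feedback program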

That's the JavaScript plumbing. The fragment shader side is simpler -- you just sample u_prevFrame to get last frame's output:

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_prevFrame;

float hash(vec2 p) {
  return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // read previous frame
  vec4 prev = texture2D(u_prevFrame, uv);

  // add a new element (moving circle)
  vec2 center = vec2(
    0.5 + 0.3 * sin(u_time * 0.7),
    0.5 + 0.3 * cos(u_time * 0.5)
  );
  float d = length(uv - center);
  vec3 circle = vec3(0.0);
  if (d < 0.02) {
    circle = vec3(1.0, 0.5, 0.2);
  }

  // fade previous frame slightly (creates trails)
  vec3 faded = prev.rgb * 0.98;

  // composite: new element on top of faded trails
  vec3 color = max(circle, faded);

  gl_FragColor = vec4(color, 1.0);
}

A circle moves across the screen. Each frame, the previous frame is faded by multiplying by 0.98 (2% darker per frame). The circle paints on top of the faded previous frame. The result is a glowing trail that follows the circle's path, gradually fading away. The 0.98 factor controls trail length -- closer to 1.0 means longer trails, closer to 0.9 means rapid decay. At 60fps, 0.98^60 ≈ 0.30, so a trail drops to about 30% brightness after one second.

This is the same concept as the feedback loops we explored in episode 36, but now with actual textures. The difference is that in episode 36 we faked it by computing the scene function at an offset. Here, we're reading actual previous frame data. The true ping-pong approach is much more flexible and can accumulate effects over hundreds of frames.

Texture distortion feedback

Combine feedback with UV distortion and things get really interesting. Each frame, slightly warp the previous frame before fading it. The distortion accumulates over time, creating evolving organic patterns:

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_prevFrame;

float hash(vec2 p) {
  return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

float noise(vec2 p) {
  vec2 i = floor(p);
  vec2 f = fract(p);
  f = f * f * (3.0 - 2.0 * f);
  float a = hash(i);
  float b = hash(i + vec2(1.0, 0.0));
  float c = hash(i + vec2(0.0, 1.0));
  float d = hash(i + vec2(1.0, 1.0));
  return mix(mix(a, b, f.x), mix(c, d, f.x), f.y);
}

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // distort the UV before reading previous frame
  float n1 = noise(uv * 4.0 + u_time * 0.2);
  float n2 = noise(uv * 4.0 + u_time * 0.2 + 50.0);
  vec2 warpedUV = uv + vec2(n1 - 0.5, n2 - 0.5) * 0.003;

  // read distorted previous frame with fade
  vec3 prev = texture2D(u_prevFrame, warpedUV).rgb * 0.995;

  // inject new content: a bright spot that moves
  vec2 center = vec2(
    0.5 + 0.35 * sin(u_time * 0.4),
    0.5 + 0.35 * cos(u_time * 0.3)
  );
  float d = length(uv - center);
  float spot = smoothstep(0.03, 0.0, d);
  vec3 newColor = vec3(0.8, 0.4, 0.1) * spot;

  // add some color shifting on the feedback
  prev = prev.gbr * 0.999 + prev.rgb * 0.001;  // very subtle channel rotation

  vec3 color = max(newColor, prev);

  gl_FragColor = vec4(color, 1.0);
}

The distortion is tiny -- * 0.003 -- but it accumulates. After 100 frames, each pixel has been shifted by the accumulated noise offsets of all previous frames. The result is a swirling, evolving texture that looks like smoke or colored fluid. The bright spot injects energy into the system, and the distortion and color shifting spread it out into complex patterns.

The prev.gbr * 0.999 + prev.rgb * 0.001 line does a very subtle color channel rotation -- green moves toward the red channel, blue toward green, red toward blue. Over hundreds of frames, the colors shift through the spectrum. The effect is slow and organic, almost imperceptible frame-to-frame but clearly visible over seconds.

Texture as a palette lookup table

Here's a different use of textures entirely. Instead of displacing or distorting an image, use a 1D gradient image as a color lookup table. You compute some value (noise, distance, iteration count, whatever), and instead of converting that value to a color with math (like our cosine palette function), you use it as a coordinate to sample a gradient texture.

The gradient is a small image -- maybe 256x1 pixels -- where each pixel is a color in the palette. You sample it with texture2D(palette, vec2(t, 0.5)) where t goes from 0.0 to 1.0.

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_palette;  // a 256x1 gradient texture

float hash(vec2 p) {
  return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

float noise(vec2 p) {
  vec2 i = floor(p);
  vec2 f = fract(p);
  f = f * f * (3.0 - 2.0 * f);
  float a = hash(i);
  float b = hash(i + vec2(1.0, 0.0));
  float c = hash(i + vec2(0.0, 1.0));
  float d = hash(i + vec2(1.0, 1.0));
  return mix(mix(a, b, f.x), mix(c, d, f.x), f.y);
}

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // layered noise (like episode 35)
  float n = 0.0;
  n += noise(uv * 3.0 + u_time * 0.1) * 0.5;
  n += noise(uv * 6.0 - u_time * 0.15) * 0.25;
  n += noise(uv * 12.0 + u_time * 0.2) * 0.125;
  n += noise(uv * 24.0) * 0.0625;

  // use noise value to sample palette texture
  vec3 color = texture2D(u_palette, vec2(n, 0.5)).rgb;

  gl_FragColor = vec4(color, 1.0);
}

The advantage over cosine palettes: you can create any color sequence you want. A cosine palette is always smooth and periodic. A texture palette can have sharp transitions, flat regions, specific colors at specific positions. You can paint a gradient in an image editor, save it as a PNG, load it, and your shader uses those exact colors. Photographic color grading, exact brand colors, heat maps with specific breakpoints -- all easy with a palette texture.
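
You can also skip the image editor and build the palette in code -- draw a gradient on a tiny 2D canvas and upload that as the texture. A minimal sketch (the color stops are just an example):

// build a 256x1 gradient on a 2D canvas and upload it as a texture
const pc = document.createElement('canvas');
pc.width = 256;
pc.height = 1;
const pctx = pc.getContext('2d');
const grad = pctx.createLinearGradient(0, 0, 256, 0);
grad.addColorStop(0.0, '#0b1a2a');
grad.addColorStop(0.5, '#e2725b');
grad.addColorStop(1.0, '#ffe9c7');
pctx.fillStyle = grad;
pctx.fillRect(0, 0, 256, 1);

const paletteTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, paletteTex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, pc);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);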

You can also animate the palette lookup by offsetting the sample position:

vec3 color = texture2D(u_palette, vec2(fract(n + u_time * 0.05), 0.5)).rgb;

The fract() wraps the value around, so the colors cycle continuously through the palette. If the palette texture has REPEAT wrapping mode, you don't even need the fract() -- and a 256x1 texture qualifies, since WebGL1 only allows REPEAT on power-of-two dimensions.

Data textures: packing non-image data

Here's where textures get really powerful for creative coding. A texture doesn't have to contain image data. Each pixel holds four values (RGBA) -- stored as unsigned bytes they're 0-255 on the CPU side, and the shader reads them normalized to 0.0-1.0. You can pack arbitrary data into those values.

Particle positions, audio FFT data, simulation state, height maps -- anything that fits in a grid of numbers. The GPU reads it as "color" but you interpret it as data.

Example: pass audio frequency data as a 256x1 texture where each pixel's red channel contains the amplitude of that frequency bin. The shader reads it to create audio-reactive visuals:

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_audioData;  // 256x1, R = amplitude

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // read frequency amplitude at this x position
  float freq = texture2D(u_audioData, vec2(uv.x, 0.5)).r;

  // bar visualization: lit where the pixel sits below the amplitude
  float bar = step(uv.y, freq);

  // color by frequency
  vec3 color = vec3(
    0.2 + 0.8 * uv.x,
    0.5 * freq,
    1.0 - uv.x * 0.5
  ) * bar;

  gl_FragColor = vec4(color, 1.0);
}

The JavaScript side updates the texture every frame with new FFT data from the Web Audio API:

// in the render loop:
analyser.getByteFrequencyData(freqData);  // Uint8Array, 256 values
gl.bindTexture(gl.TEXTURE_2D, audioTexture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.LUMINANCE,
              256, 1, 0, gl.LUMINANCE,
              gl.UNSIGNED_BYTE, freqData);

gl.LUMINANCE stores a single channel per pixel instead of RGBA, so the 256-byte frequency array maps directly to 256 pixels. Each pixel's value shows up in the R, G, and B channels identically (that's what luminance means -- grayscale). One caveat for widths that aren't a multiple of 4: WebGL pads rows to 4-byte boundaries by default, so you'd set gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1) first; at 256 wide we're fine.

The same approach works for particle systems. Store particle positions in a texture: red = x position, green = y position, blue = x velocity, alpha = y velocity. An update pass (a fragment shader rendering into another data texture -- WebGL has no true compute shaders) advances these values each frame. Another shader reads the particle texture and renders the particles as points or circles. This is how GPU particle systems work -- the data lives on the GPU as textures, never leaving GPU memory. We touched on this concept in episode 11 with CPU particles, but GPU particles via data textures can handle millions of points at full framerate.
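
As a small taste of that, here's a sketch that packs 256 particle states into a 256x1 RGBA texture using bytes. Byte precision is far too coarse for a real simulation -- GPU particle systems typically use floating-point textures via the OES_texture_float extension -- but the packing idea is the same:

// pack 256 particle states into a 256x1 RGBA data texture
const count = 256;
const data = new Uint8Array(count * 4);
for (let i = 0; i < count; i++) {
  data[i * 4 + 0] = Math.random() * 255;               // x position (0-1 mapped to 0-255)
  data[i * 4 + 1] = Math.random() * 255;               // y position
  data[i * 4 + 2] = 128 + (Math.random() - 0.5) * 20;  // x velocity, centered on 128 = zero
  data[i * 4 + 3] = 128 + (Math.random() - 0.5) * 20;  // y velocity
}

const particleTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, particleTex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, count, 1, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, data);
// NEAREST: we want exact values per particle, no interpolation between them
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);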

UV animation: scrolling, rotating, zooming

Before we get to the creative exercise, let's cover the basic UV transformations you can apply to textures. These are simple but they come up constantly.

Scrolling:

vec2 scrollUV = uv + vec2(u_time * 0.1, 0.0);  // horizontal scroll
vec4 color = texture2D(u_texture, fract(scrollUV));

The fract() wraps the UV so the texture tiles seamlessly -- assuming the texture content actually tiles, otherwise you'll see a hard seam where it wraps. (With wrapping set to REPEAT you can skip the fract() entirely.)

Rotation:

vec2 centered = uv - 0.5;
float angle = u_time * 0.3;
float cs = cos(angle);
float sn = sin(angle);
vec2 rotated = vec2(
  centered.x * cs - centered.y * sn,
  centered.x * sn + centered.y * cs
);
vec4 color = texture2D(u_texture, rotated + 0.5);

Standard 2D rotation matrix applied to UV coordinates centered on 0.5. The - 0.5 and + 0.5 center the rotation on the middle of the texture instead of the corner.

Zoom pulsing:

float zoom = 1.0 + 0.2 * sin(u_time);
vec2 zoomed = (uv - 0.5) / zoom + 0.5;
vec4 color = texture2D(u_texture, zoomed);

Dividing by a zoom factor greater than 1.0 makes the texture appear larger (zoomed in). Less than 1.0 makes it smaller (zoomed out). The sine wave creates a breathing pulse effect.

All three can be combined:

// scroll, then rotate, then zoom
vec2 animated = uv + vec2(u_time * 0.05, u_time * 0.03);
animated = fract(animated);
vec2 centered = animated - 0.5;
float a = u_time * 0.2;
animated = vec2(centered.x * cos(a) - centered.y * sin(a),
                centered.x * sin(a) + centered.y * cos(a)) + 0.5;
float z = 1.0 + 0.1 * sin(u_time * 0.5);
animated = (animated - 0.5) / z + 0.5;
vec4 color = texture2D(u_texture, animated);

Order matters. Scroll-then-rotate gives a different result than rotate-then-scroll. Experiment with the sequence -- you'll develop intuition for which order produces which visual result.

Creative exercise: living photograph

Let's put it all together. Load a photo, apply noise-based distortion that evolves over time, add brightness-based displacement, throw in some pixel-sorting style streaks, and color-grade the result. Turn a static photograph into a breathing, shifting, living artwork.

precision mediump float;
uniform vec2 u_resolution;
uniform float u_time;
uniform sampler2D u_texture;

float hash(vec2 p) {
  return fract(sin(dot(p, vec2(127.1, 311.7))) * 43758.5453);
}

float noise(vec2 p) {
  vec2 i = floor(p);
  vec2 f = fract(p);
  f = f * f * (3.0 - 2.0 * f);
  float a = hash(i);
  float b = hash(i + vec2(1.0, 0.0));
  float c = hash(i + vec2(0.0, 1.0));
  float d = hash(i + vec2(1.0, 1.0));
  return mix(mix(a, b, f.x), mix(c, d, f.x), f.y);
}

void main() {
  vec2 uv = gl_FragCoord.xy / u_resolution;

  // 1. noise-based warp (two layers)
  float n1 = noise(uv * 3.0 + u_time * 0.15) * 0.5
           + noise(uv * 8.0 + u_time * 0.4) * 0.25;
  float n2 = noise(uv * 3.0 + u_time * 0.15 + 77.0) * 0.5
           + noise(uv * 8.0 + u_time * 0.4 + 77.0) * 0.25;

  vec2 warped = uv + vec2(n1 - 0.375, n2 - 0.375) * 0.03;

  // 2. self-displacement based on luminance
  vec4 probe = texture2D(u_texture, warped);
  float luma = dot(probe.rgb, vec3(0.299, 0.587, 0.114));
  warped += vec2(0.0, (luma - 0.5) * 0.01);

  // 3. pixel sort streaks on bright areas
  float threshold = 0.6 + 0.2 * sin(u_time * 0.3);
  if (luma > threshold) {
    float shift = (luma - threshold) * 0.08;
    warped.y -= shift;
  }

  // 4. sample the final warped position
  vec3 color = texture2D(u_texture, warped).rgb;

  // 5. chromatic aberration (subtle, from ep43)
  vec2 centered = uv - 0.5;
  float abr = 0.002 * length(centered);
  vec2 dir = normalize(centered + 0.0001);
  color.r = texture2D(u_texture, warped + dir * abr).r;
  color.b = texture2D(u_texture, warped - dir * abr).b;

  // 6. color grade: slightly warm, boosted contrast
  color *= 1.15;  // exposure
  color = (color - 0.5) * 1.2 + 0.5;  // contrast
  color *= vec3(1.05, 1.0, 0.92);  // warm tint

  // 7. vignette
  color *= smoothstep(0.75, 0.3, length(centered));

  // 8. film grain
  float grain = (hash(gl_FragCoord.xy + fract(u_time) * 200.0) - 0.5) * 0.06;
  color += grain;

  gl_FragColor = vec4(clamp(color, 0.0, 1.0), 1.0);
}

Eight layers of processing stacked on a single photograph. The noise warp makes the image breathe and shift. The self-displacement pushes bright areas slightly, adding organic movement. The pixel sort streaks create glitch-art bands across bright regions. Chromatic aberration adds lens character (building on what we did in episode 43). Color grading sets the mood. Vignette focuses attention. Film grain adds analog texture.

Each layer is independent -- you can remove any one and the others still work. Start with just the noise warp. Add displacement. Add the pixel sort. Layer by layer, the static photo transforms into something alive. The threshold animation on the pixel sort sweeps the effect across the image over time, revealing and hiding the streaks.

Try this with different photos. Portraits work great because the face provides natural high-contrast edges for the displacement and sorting to interact with. Landscapes get these dreamy, flowing distortions. Abstract textures (close-up wood grain, rusty metal, cracked paint) become genuinely surreal. The same shader produces completely different moods depending on the source image.

Where textures lead

We've covered the fundamentals: loading textures in WebGL, sampling them in GLSL, UV distortion, displacement, feedback loops, palette lookups, data textures, UV animation, and the creative combination of all of these. But there's more -- textures are the bridge between CPU data and GPU computation. Audio analysis, video input, simulation state, even the output of other shaders -- all of it flows through textures.

The next episode is a mini-project where we bring together everything from the shader arc. Post-processing from episode 43, textures from today, noise from episode 35, SDFs from episode 33, color palettes from episode 37 -- all in one piece. A generative shader artwork that you can call finished and share.

And then after that, we leave the fragment-shader-only world behind and look at compute shaders and GPU particle systems -- where data textures become the backbone of massive parallel simulations. The concept of "pack data into a texture, process it on the GPU, read it back as a texture" is the foundation of all GPU computing. What we did today with audio FFT data scales up to fluid simulations, physics engines, neural networks -- anything that benefits from massive parallelism.

But one step at a time. Today, go load a photo and make it weird :-)

Alright, what do we know now?

  • Textures in WebGL: gl.createTexture(), gl.texImage2D() to upload, gl.texParameteri() for filtering (LINEAR vs NEAREST) and wrapping (CLAMP_TO_EDGE vs REPEAT)
  • In GLSL: declare as uniform sampler2D u_texture, sample with texture2D(u_texture, uv). UV range 0-1, returns vec4 RGBA
  • UV distortion: offset UV coordinates with noise before sampling. Small offsets (0.01-0.05) for heat haze, larger (0.1-0.3) for abstract warping
  • Displacement mapping: use one image's brightness to offset another image's UV. Self-displacement uses the image's own luminance
  • Pixel sorting approximation: shift UVs proportional to brightness above a threshold. Animate the threshold for sweeping effects
  • Texture feedback: ping-pong between two framebuffers. Read previous frame, apply distortion, fade slightly, write new frame. Accumulates trails and patterns over hundreds of frames
  • Palette textures: a 256x1 gradient image as a color lookup table. Map any computed value (noise, distance, iteration count) to the gradient for custom coloring
  • Data textures: pack non-image data into texture pixels (audio FFT, particle positions, simulation state). The GPU reads them as colors but you interpret them as numbers
  • UV animation: scroll (uv + time), rotate (2D rotation matrix centered on 0.5), zoom (divide UV by zoom factor). Combine for complex texture movement
  • Post-processing from episode 43 (chromatic aberration, vignette, grain, color grading) stacks naturally on top of texture effects

Cheers! Thanks for reading.
