Interop and Video

Video textures and external resources

WebGPU doesn't exist in isolation. Real applications integrate with video playback, Canvas2D rendering, images from the network, and frames from cameras. The importExternalTexture API provides efficient access to video frames without CPU-side copies, while other interop patterns connect WebGPU to the broader web platform.

External Textures

An external texture represents a frame from a video element, a WebCodecs VideoFrame, or another external source. Unlike regular textures, external textures are transient: they're valid only for the current JavaScript task and must be re-imported for each frame.

const video = document.getElementById('video') as HTMLVideoElement;
 
function render() {
  // Import the current video frame as a texture
  const externalTexture = device.importExternalTexture({
    source: video,
  });
  
  // Use in this frame's rendering
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{
      binding: 0,
      resource: externalTexture,
    }],
  });
  
  // ... render ...
  
  // externalTexture is invalid after this task completes
  requestAnimationFrame(render);
}

Interactive: External texture lifecycle

External textures must be imported fresh each frame. They become invalid when the JavaScript task completes.

External textures avoid copying video frame data through JavaScript. The GPU can access the video frame directly from the decoder's memory, making video processing efficient enough for real-time effects.

Sampling External Textures

External textures require a special texture type in WGSL: texture_external instead of texture_2d:

@group(0) @binding(0) var videoTexture: texture_external;
@group(0) @binding(1) var videoSampler: sampler;
 
@fragment
fn main(@location(0) uv: vec2f) -> @location(0) vec4f {
    // Use textureSampleBaseClampToEdge for external textures
    let color = textureSampleBaseClampToEdge(videoTexture, videoSampler, uv);
    return color;
}

Note the function name: textureSampleBaseClampToEdge. External textures only support this sampling function—no mipmaps, no configurable address modes. The "base" refers to mip level 0, and edge clamping is always applied.
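The example earlier used pipeline.getBindGroupLayout(0), which derives the layout implicitly. If you create the layout yourself, the external texture needs the externalTexture entry kind rather than the usual texture entry. A minimal sketch of such a descriptor follows; the 0x2 constant mirrors GPUShaderStage.FRAGMENT so the object can be built outside a browser:

```typescript
// GPUShaderStage.FRAGMENT is the constant 0x2 in the WebGPU spec; using
// the raw value keeps this sketch runnable outside a browser.
const FRAGMENT = 0x2;

// Layout matching the WGSL bindings above: note the externalTexture
// entry kind in place of the usual texture entry.
const videoLayoutDescriptor = {
  entries: [
    { binding: 0, visibility: FRAGMENT, externalTexture: {} },
    { binding: 1, visibility: FRAGMENT, sampler: {} },
  ],
};

// In the browser:
// const layout = device.createBindGroupLayout(videoLayoutDescriptor);
```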

Interactive: Video as GPU texture

Each frame: importExternalTexture → shader applies effect → render to canvas. Processing runs entirely on the GPU.

Video Processing Examples

With video frames as textures, you can apply any shader effect in real-time:

Color correction:

@fragment
fn colorCorrect(@location(0) uv: vec2f) -> @location(0) vec4f {
    var color = textureSampleBaseClampToEdge(videoTexture, videoSampler, uv);
    
    // Increase saturation
    let gray = dot(color.rgb, vec3f(0.299, 0.587, 0.114));
    color.rgb = mix(vec3f(gray), color.rgb, 1.5);
    
    // Adjust contrast
    color.rgb = (color.rgb - 0.5) * 1.2 + 0.5;
    
    return color;
}

Edge detection:

@fragment
fn edgeDetect(@location(0) uv: vec2f) -> @location(0) vec4f {
    let texelSize = vec2f(1.0 / 1920.0, 1.0 / 1080.0); // assumes a 1080p source
    
    let c = textureSampleBaseClampToEdge(videoTexture, videoSampler, uv).rgb;
    let l = textureSampleBaseClampToEdge(videoTexture, videoSampler, uv - vec2f(texelSize.x, 0.0)).rgb;
    let r = textureSampleBaseClampToEdge(videoTexture, videoSampler, uv + vec2f(texelSize.x, 0.0)).rgb;
    let t = textureSampleBaseClampToEdge(videoTexture, videoSampler, uv - vec2f(0.0, texelSize.y)).rgb;
    let b = textureSampleBaseClampToEdge(videoTexture, videoSampler, uv + vec2f(0.0, texelSize.y)).rgb;
    
    let edge = abs(l - r) + abs(t - b);
    return vec4f(edge, 1.0);
}
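The texel size above is hardcoded for 1080p. In practice you'd compute it from the video's actual dimensions (videoWidth and videoHeight on the element) and upload it as a vec2f uniform. A small sketch of just that computation:

```typescript
// Texel size (1/width, 1/height) as a Float32Array, ready to write into
// a vec2f uniform with device.queue.writeBuffer.
function texelSize(width: number, height: number): Float32Array {
  return new Float32Array([1 / width, 1 / height]);
}

// In the browser:
// device.queue.writeBuffer(
//   uniformBuffer, 0,
//   texelSize(video.videoWidth, video.videoHeight)
// );
```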

Canvas2D Interop

You can use a canvas with a 2D context as a texture source via copyExternalImageToTexture. This enables mixing 2D and 3D rendering:

const canvas2d = document.createElement('canvas');
canvas2d.width = 512;
canvas2d.height = 512;
const ctx = canvas2d.getContext('2d')!;
 
// Draw with Canvas2D
ctx.fillStyle = 'red';
ctx.fillRect(0, 0, 256, 256);
ctx.fillStyle = 'blue';
ctx.fillRect(256, 0, 256, 256);
 
// Copy to GPU texture
const texture = device.createTexture({
  size: [512, 512],
  format: 'rgba8unorm',
  usage: GPUTextureUsage.TEXTURE_BINDING | 
         GPUTextureUsage.COPY_DST | 
         GPUTextureUsage.RENDER_ATTACHMENT,
});
 
device.queue.copyExternalImageToTexture(
  { source: canvas2d },
  { texture },
  [512, 512]
);
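Unlike importExternalTexture, this copy is a one-time snapshot: if you draw to the 2D canvas again, you must copy again. A common pattern is to track a dirty flag and re-upload only when the canvas actually changed. A sketch, using a minimal structural stand-in for GPUQueue so the helper is self-contained (in the browser you'd pass device.queue, the real canvas, and a GPUTexture):

```typescript
// Minimal structural type for the one GPUQueue method we use.
interface QueueLike {
  copyExternalImageToTexture(src: object, dst: object, size: number[]): void;
}

// Re-uploads the 2D canvas to the GPU texture only when marked dirty.
function makeCanvasUploader(
  queue: QueueLike,
  canvas: { width: number; height: number },
  texture: object
) {
  let dirty = true;
  return {
    markDirty() { dirty = true; },
    // Returns true if an upload actually happened this call.
    uploadIfDirty(): boolean {
      if (!dirty) return false;
      queue.copyExternalImageToTexture(
        { source: canvas },
        { texture },
        [canvas.width, canvas.height]
      );
      dirty = false;
      return true;
    },
  };
}
```

Call markDirty after each Canvas2D draw, then uploadIfDirty once per frame before rendering.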

Interactive: Canvas2D to WebGPU


Draw with Canvas2D, copy to GPU texture with copyExternalImageToTexture, apply shader effects.

This works with other image sources too: ImageBitmap, ImageData, OffscreenCanvas, HTMLImageElement, or even video elements and VideoFrames (though importExternalTexture is more efficient for video).

ImageBitmap for Async Loading

For images loaded from the network, use createImageBitmap for async decoding, then copy to a texture:

async function loadTexture(url: string): Promise<GPUTexture> {
  const response = await fetch(url);
  const blob = await response.blob();
  const bitmap = await createImageBitmap(blob, {
    colorSpaceConversion: 'none', // Preserve original colors
  });
  
  const texture = device.createTexture({
    size: [bitmap.width, bitmap.height],
    format: 'rgba8unorm',
    usage: GPUTextureUsage.TEXTURE_BINDING |
           GPUTextureUsage.COPY_DST |
           GPUTextureUsage.RENDER_ATTACHMENT,
  });
  
  device.queue.copyExternalImageToTexture(
    { source: bitmap, flipY: true }, // flip only if your UVs assume a bottom-left origin
    { texture },
    [bitmap.width, bitmap.height]
  );
  
  bitmap.close(); // Free the bitmap memory
  
  return texture;
}
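Note that loadTexture allocates a single mip level, so the texture will alias under heavy minification. If you need mipmaps, allocate the full chain when creating the texture and fill the smaller levels with a separate downsampling pass (not shown). The chain length is a simple computation:

```typescript
// Full mip chain length for a 2D texture:
// floor(log2(largest dimension)) + 1.
function mipLevelCount(width: number, height: number): number {
  return Math.floor(Math.log2(Math.max(width, height))) + 1;
}

// e.g. pass it to createTexture:
// device.createTexture({
//   size: [bitmap.width, bitmap.height],
//   mipLevelCount: mipLevelCount(bitmap.width, bitmap.height),
//   ...
// });
```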

The flipY option handles coordinate-system differences. A WebGPU texture's origin is at the top left with Y increasing downward, which already matches how most image formats store their data, so set flipY: true only when your UVs or asset pipeline assume a bottom-left origin, as in OpenGL.
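If you'd rather not flip at copy time, the equivalent fix can live wherever you generate or consume UVs: convert bottom-left-origin coordinates (the OpenGL convention) to WebGPU's top-left convention by inverting V. A trivial CPU-side sketch:

```typescript
// Convert a bottom-left-origin UV pair to WebGPU's top-left convention.
// U is unchanged; V is mirrored across the vertical center.
function flipV([u, v]: [number, number]): [number, number] {
  return [u, 1 - v];
}
```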

Video Frame API

For advanced video processing, the Web Codecs API provides VideoFrame objects that work directly with importExternalTexture:

const videoDecoder = new VideoDecoder({
  output: (frame) => {
    const externalTexture = device.importExternalTexture({
      source: frame,
    });
    
    // Process frame with WebGPU
    processFrame(externalTexture);
    
    frame.close(); // Important: release the frame
  },
  error: (e) => console.error('Decode error:', e),
});
 
// ... configure decoder and feed encoded data ...

This enables processing video without a <video> element—useful for custom video players, video editing applications, or processing video files directly.
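Two details matter when wiring the decoder up: it must be configured with a codec string before it accepts data, and WebCodecs timestamps are expressed in microseconds. The helper below converts seconds to microseconds; the commented lines sketch the browser-only calls, with a placeholder H.264 codec string and a hypothetical encodedBytes buffer (WebCodecs does not demux containers for you):

```typescript
// WebCodecs timestamps (VideoFrame.timestamp,
// EncodedVideoChunk.timestamp) are in microseconds.
function toMicros(seconds: number): number {
  return Math.round(seconds * 1_000_000);
}

// Browser-only sketch (codec string and encodedBytes are placeholders):
// videoDecoder.configure({
//   codec: 'avc1.42E01E',
//   codedWidth: 1280,
//   codedHeight: 720,
// });
// videoDecoder.decode(new EncodedVideoChunk({
//   type: 'key',
//   timestamp: toMicros(0),
//   data: encodedBytes, // demuxed bitstream data
// }));
```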

Interactive: importExternalTexture patterns


External textures use a special type and sampling function. They're valid only for one frame.

Performance Considerations

External textures are efficient because they avoid copies, but they have constraints:

Re-import every frame: The texture binding becomes invalid after the current task. You must call importExternalTexture again for each frame you want to render.

Can't read back: External textures can only be sampled in shaders. They can't serve as the source of a copy operation, so there's no way to copy their contents into a buffer directly; if you need the pixels, render the external texture into a regular texture first.

Format conversion happens automatically: Video is often in YUV format internally. The GPU converts to RGB during sampling, but this conversion has some cost.

Synchronization with video playback: The imported frame is whatever's current when you call importExternalTexture. If video playback and your render loop run at different rates, you might process the same frame multiple times or skip frames.

For frame-accurate video processing, use the Video Frame API, which gives explicit control over which frames you receive.
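Where supported, video.requestVideoFrameCallback fires once per presented frame and reports the frame's mediaTime, which makes duplicate detection straightforward even if you also render on requestAnimationFrame. The callback wiring below is browser-only and commented out; the duplicate-frame gate itself is plain logic:

```typescript
// Returns a predicate that is true only when mediaTime has advanced
// past the last value it saw, i.e. a genuinely new frame.
function makeFrameGate(): (mediaTime: number) => boolean {
  let last = -1;
  return (mediaTime) => {
    if (mediaTime === last) return false;
    last = mediaTime;
    return true;
  };
}

// Browser-only sketch:
// const isNew = makeFrameGate();
// video.requestVideoFrameCallback(function onFrame(_now, meta) {
//   if (isNew(meta.mediaTime)) {
//     const tex = device.importExternalTexture({ source: video });
//     // ... render with tex ...
//   }
//   video.requestVideoFrameCallback(onFrame);
// });
```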

Webcam Input

WebRTC's getUserMedia provides camera access. The resulting MediaStream can drive a video element, which then works with importExternalTexture:

const stream = await navigator.mediaDevices.getUserMedia({
  video: { width: 1280, height: 720 },
});
 
const video = document.createElement('video');
video.srcObject = stream;
video.autoplay = true;
await video.play();
 
// Now use video with importExternalTexture
function processCamera() {
  const frame = device.importExternalTexture({ source: video });
  // ... apply effects, render to canvas ...
  requestAnimationFrame(processCamera);
}

This enables real-time camera effects: background replacement, face filters, color grading—all running at GPU speeds.

Key Takeaways

  • importExternalTexture provides efficient GPU access to video frames without CPU copies
  • External textures are transient—re-import every frame
  • Use texture_external and textureSampleBaseClampToEdge in WGSL for external textures
  • copyExternalImageToTexture connects Canvas2D, ImageBitmap, and other sources to WebGPU
  • Web Codecs' VideoFrame API gives frame-level control for advanced video processing
  • Handle coordinate system differences with the flipY option when copying images