Loading Meshes

Parsing OBJ and index buffers

So far, every mesh we have rendered was hand-coded: vertex positions typed directly into arrays, triangles defined as literals. This works for simple geometry, but real 3D content comes from modeling tools like Blender, Maya, or procedural generators. To render these, we need to load mesh data from files.

This chapter covers the anatomy of mesh data, the OBJ file format, and efficient GPU upload strategies. By the end, you will have a complete pipeline from file to screen.

What Is a Mesh?

A mesh is a collection of vertices connected by edges to form faces. In the GPU world, we care about three things: where vertices are, what properties they have, and how they connect into triangles.

Positions define where each vertex sits in 3D space. Every mesh has positions.

Normals define which direction each vertex faces. Essential for lighting calculations. A vertex on the corner of a cube might have different normals depending on whether it belongs to the top face or the side face.

Texture coordinates (UVs) define how textures map onto the surface. They are called UVs because they typically use a 2D coordinate system with axes named U and V, leaving X, Y, Z for positions.

Indices define how vertices connect into triangles. Three indices per triangle, each pointing to a vertex in the position array.

The simplest mesh format sends three vertices per triangle. A cube with 12 triangles would need 36 vertices. But cubes have only 8 corners. With index buffers, we store 8 vertices and 36 indices—far less data.
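In code, these four pieces might be modeled like this (a sketch; the type names are illustrative, not from any particular library):

```typescript
// Illustrative in-memory mesh representation.
type Vec3 = [number, number, number];
type Vec2 = [number, number];

interface Vertex {
  position: Vec3; // where the vertex sits
  normal: Vec3;   // which way the surface faces here
  uv: Vec2;       // texture coordinates
}

interface Mesh {
  vertices: Vertex[];
  indices: number[]; // three entries per triangle
}

// A single-triangle mesh:
const triangle: Mesh = {
  vertices: [
    { position: [0, 0, 0], normal: [0, 0, 1], uv: [0, 0] },
    { position: [1, 0, 0], normal: [0, 0, 1], uv: [1, 0] },
    { position: [0, 1, 0], normal: [0, 0, 1], uv: [0, 1] },
  ],
  indices: [0, 1, 2],
};
```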

The OBJ Format

OBJ is a text-based mesh format from the 1980s. It has survived this long because it is human-readable and simple to parse. No compression, no binary encoding, just lines of text.

Here is a complete cube, with positions, normals, UVs, and the faces that tie them together:

# Simple cube mesh
# 8 vertices, 12 triangles
v -1.0 -1.0 1.0
v 1.0 -1.0 1.0
v 1.0 1.0 1.0
v -1.0 1.0 1.0
v -1.0 -1.0 -1.0
v 1.0 -1.0 -1.0
v 1.0 1.0 -1.0
v -1.0 1.0 -1.0
vn 0.0 0.0 1.0
vn 1.0 0.0 0.0
vn 0.0 0.0 -1.0
vn -1.0 0.0 0.0
vn 0.0 1.0 0.0
vn 0.0 -1.0 0.0
vt 0.0 0.0
vt 1.0 0.0
vt 1.0 1.0
vt 0.0 1.0
f 1/1/1 2/2/1 3/3/1
f 1/1/1 3/3/1 4/4/1
f 2/1/2 6/2/2 7/3/2
f 2/1/2 7/3/2 3/4/2
f 6/1/3 5/2/3 8/3/3
f 6/1/3 8/3/3 7/4/3
f 5/1/4 1/2/4 4/3/4
f 5/1/4 4/3/4 8/4/4
f 4/1/5 3/2/5 7/3/5
f 4/1/5 7/3/5 8/4/5
f 5/1/6 6/2/6 2/3/6
f 5/1/6 2/3/6 1/4/6


An OBJ file contains several line types:

Vertex positions start with v:

v 1.0 2.0 3.0

This defines a vertex at position (1, 2, 3). Coordinates are floats separated by spaces.

Vertex normals start with vn:

vn 0.0 1.0 0.0

This defines a normal pointing straight up. Normals should be unit length.

Texture coordinates start with vt:

vt 0.5 0.75

This defines UV coordinates at (0.5, 0.75). Values typically range from 0 to 1.

Faces start with f:

f 1 2 3
f 1/1/1 2/2/2 3/3/3

The first form references just position indices. The second form uses the pattern position/texcoord/normal. Indices are 1-based, not 0-based.
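Decoding a single face token might look like this (a sketch; the helper name is hypothetical, and the 1-based OBJ indices become 0-based here):

```typescript
// Decode one OBJ face token such as "1/1/1", "1//3", or "4".
// Missing fields (e.g. the texcoord in "1//3") become -1.
function parseFaceVertex(token: string): { pos: number; uv: number; norm: number } {
  const [p, t, n] = token.split('/');
  return {
    pos: parseInt(p, 10) - 1, // OBJ indices are 1-based
    uv: t ? parseInt(t, 10) - 1 : -1,
    norm: n ? parseInt(n, 10) - 1 : -1,
  };
}

// parseFaceVertex('3/2/1') → { pos: 2, uv: 1, norm: 0 }
```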

A minimal cube in OBJ format:

# A simple cube
v -1.0 -1.0  1.0
v  1.0 -1.0  1.0
v  1.0  1.0  1.0
v -1.0  1.0  1.0
v -1.0 -1.0 -1.0
v  1.0 -1.0 -1.0
v  1.0  1.0 -1.0
v -1.0  1.0 -1.0
 
f 1 2 3
f 1 3 4
f 2 6 7
f 2 7 3
f 6 5 8
f 6 8 7
f 5 1 4
f 5 4 8
f 4 3 7
f 4 7 8
f 5 6 2
f 5 2 1

Eight vertices, twelve faces. No normals or UVs in this minimal example—they would be computed or generated separately.

Parsing OBJ

Parsing OBJ is straightforward: read lines, switch on the prefix, accumulate data. Here is a TypeScript parser:

interface ObjData {
  positions: number[];
  normals: number[];
  uvs: number[];
  indices: number[];
}
 
function parseObj(text: string): ObjData {
  // Temporary storage for face data
  const tempPositions: number[][] = [];
  const tempNormals: number[][] = [];
  const tempUvs: number[][] = [];
  
  // Final vertex data (after face processing)
  const finalPositions: number[] = [];
  const finalNormals: number[] = [];
  const finalUvs: number[] = [];
  const indices: number[] = [];
  
  // Map from "pos/uv/norm" string to index
  const vertexMap = new Map<string, number>();
  
  const lines = text.split('\n');
  
  for (const line of lines) {
    const parts = line.trim().split(/\s+/);
    const type = parts[0];
    
    if (type === 'v') {
      tempPositions.push([
        parseFloat(parts[1]),
        parseFloat(parts[2]),
        parseFloat(parts[3]),
      ]);
    } else if (type === 'vn') {
      tempNormals.push([
        parseFloat(parts[1]),
        parseFloat(parts[2]),
        parseFloat(parts[3]),
      ]);
    } else if (type === 'vt') {
      tempUvs.push([
        parseFloat(parts[1]),
        parseFloat(parts[2]),
      ]);
    } else if (type === 'f') {
      // Handle faces with 3 or more vertices (triangulate)
      const faceIndices: number[] = [];
      
      for (let i = 1; i < parts.length; i++) {
        const vertexKey = parts[i];
        
        if (vertexMap.has(vertexKey)) {
          faceIndices.push(vertexMap.get(vertexKey)!);
        } else {
          // Note: rename to avoid shadowing the outer `indices` array
          const [posStr, uvStr, normStr] = vertexKey.split('/');
          const posIdx = parseInt(posStr, 10) - 1;
          const uvIdx = uvStr ? parseInt(uvStr, 10) - 1 : -1;
          const normIdx = normStr ? parseInt(normStr, 10) - 1 : -1;
          
          const newIndex = finalPositions.length / 3;
          
          finalPositions.push(...tempPositions[posIdx]);
          
          if (uvIdx >= 0 && tempUvs[uvIdx]) {
            finalUvs.push(...tempUvs[uvIdx]);
          } else {
            finalUvs.push(0, 0);
          }
          
          if (normIdx >= 0 && tempNormals[normIdx]) {
            finalNormals.push(...tempNormals[normIdx]);
          } else {
            finalNormals.push(0, 1, 0);
          }
          
          vertexMap.set(vertexKey, newIndex);
          faceIndices.push(newIndex);
        }
      }
      
      // Triangulate face (fan triangulation)
      for (let i = 1; i < faceIndices.length - 1; i++) {
        indices.push(faceIndices[0], faceIndices[i], faceIndices[i + 1]);
      }
    }
  }
  
  return {
    positions: finalPositions,
    normals: finalNormals,
    uvs: finalUvs,
    indices,
  };
}

This parser handles the complexity of OBJ's face format. A single OBJ vertex might reference different normals in different faces—think of a cube corner that belongs to three faces, each with a different normal. The parser creates unique GPU vertices for each combination.
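The deduplication at the heart of this can be sketched in isolation (the face-corner tokens here are illustrative):

```typescript
// Each distinct "pos/uv/norm" combination becomes one GPU vertex;
// repeats resolve to the index assigned on first sight.
const vertexMap = new Map<string, number>();
const corners = ['1/1/1', '2/2/1', '3/3/1', '1/1/1', '3/3/1', '4/4/1'];
const indices: number[] = [];

for (const key of corners) {
  if (!vertexMap.has(key)) {
    vertexMap.set(key, vertexMap.size); // next unused index
  }
  indices.push(vertexMap.get(key)!);
}

// Six face corners collapse to four unique vertices:
// indices → [0, 1, 2, 0, 2, 3]
```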

Index Buffers

Without index buffers, every triangle needs three unique vertices. A cube has 12 triangles, requiring 36 vertices. With index buffers, we store unique vertices once and reference them by index.

Consider a concrete example. Without an index buffer, a moderately detailed mesh might send 1,536 vertices, or 18.0 KB of position data. With an index buffer, the same mesh needs only 258 unique vertices plus 1,536 uint32 indices, 9.0 KB in total: 6× fewer vertices (an 83% reduction in vertex count) and 9.0 KB of memory saved.

Index buffers let you store each unique vertex once and reference it multiple times. The savings increase with mesh complexity.

The savings scale dramatically. A subdivided sphere with 10,000 triangles might have 30,000 vertices without indexing but only about 5,000 unique positions. Index buffers reduce memory, and they also improve performance: for indexed draws, the GPU's post-transform cache can reuse vertex shader results for vertices referenced by multiple triangles.
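Assuming float32 positions and uint32 indices, the arithmetic for that sphere is a quick back-of-the-envelope calculation:

```typescript
// Bytes needed for a mesh, with and without an index buffer
// (positions only: 3 floats * 4 bytes per vertex, 4 bytes per uint32 index).
function meshMemory(triangles: number, uniqueVertices: number) {
  const flat = triangles * 3 * 12;                         // 3 verts per tri, 12 bytes each
  const indexed = uniqueVertices * 12 + triangles * 3 * 4; // unique verts + indices
  return { flat, indexed };
}

// 10,000 triangles, ~5,000 unique vertices:
// flat = 360,000 bytes, indexed = 180,000 bytes, half the memory
```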

In WebGPU, you create index buffers like vertex buffers:

const indexBuffer = device.createBuffer({
  size: indices.byteLength,
  usage: GPUBufferUsage.INDEX | GPUBufferUsage.COPY_DST,
});
device.queue.writeBuffer(indexBuffer, 0, indices);

Then draw with indices:

renderPass.setIndexBuffer(indexBuffer, 'uint32');
renderPass.drawIndexed(indexCount);

The second argument to setIndexBuffer specifies the index type: 'uint16' when every index fits in 16 bits (values up to 65,535), 'uint32' for larger meshes.
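A small helper can pick the format and account for one WebGPU detail: writeBuffer sizes must be multiples of 4 bytes, so uint16 index data with an odd count needs padding. The helper's name and shape are my own:

```typescript
// Choose the smallest index format and the padded buffer size.
function indexBufferLayout(maxIndex: number, indexCount: number) {
  const format = maxIndex <= 0xffff ? 'uint16' : 'uint32';
  const bytesPerIndex = format === 'uint16' ? 2 : 4;
  // Round up to a multiple of 4 bytes, as writeBuffer requires.
  const byteLength = Math.ceil((indexCount * bytesPerIndex) / 4) * 4;
  return { format, byteLength };
}

// indexBufferLayout(7, 36) → { format: 'uint16', byteLength: 72 }
// indexBufferLayout(7, 3)  → { format: 'uint16', byteLength: 8 }
```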

Interleaved vs Separate Attributes

Vertex data can be arranged two ways: interleaved or separate.

Interleaved layout packs all attributes for each vertex together:

[pos0, norm0, uv0, pos1, norm1, uv1, pos2, norm2, uv2, ...]

Separate layout keeps each attribute in its own array:

positions: [pos0, pos1, pos2, ...]
normals:   [norm0, norm1, norm2, ...]
uvs:       [uv0, uv1, uv2, ...]

Interleaved is generally better for rendering: when the GPU fetches vertex data, it reads contiguous memory. All attributes for one vertex arrive together. Separate layouts require multiple memory reads per vertex.

However, separate layouts are easier to work with during data preparation. Many file formats store data separately. The OBJ parser above produces separate arrays.

Converting to interleaved:

function interleave(
  positions: number[],
  normals: number[],
  uvs: number[]
): Float32Array {
  const vertexCount = positions.length / 3;
  const stride = 3 + 3 + 2; // pos + norm + uv
  const buffer = new Float32Array(vertexCount * stride);
  
  for (let i = 0; i < vertexCount; i++) {
    const offset = i * stride;
    // Position
    buffer[offset + 0] = positions[i * 3 + 0];
    buffer[offset + 1] = positions[i * 3 + 1];
    buffer[offset + 2] = positions[i * 3 + 2];
    // Normal
    buffer[offset + 3] = normals[i * 3 + 0];
    buffer[offset + 4] = normals[i * 3 + 1];
    buffer[offset + 5] = normals[i * 3 + 2];
    // UV
    buffer[offset + 6] = uvs[i * 2 + 0];
    buffer[offset + 7] = uvs[i * 2 + 1];
  }
  
  return buffer;
}

Uploading Mesh Data

With interleaved data, upload to a single vertex buffer:

const vertexData = interleave(mesh.positions, mesh.normals, mesh.uvs);
 
const vertexBuffer = device.createBuffer({
  size: vertexData.byteLength,
  usage: GPUBufferUsage.VERTEX | GPUBufferUsage.COPY_DST,
});
device.queue.writeBuffer(vertexBuffer, 0, vertexData);

Configure the pipeline to read this layout:

const pipeline = device.createRenderPipeline({
  // ... other config ...
  vertex: {
    module: shaderModule,
    entryPoint: 'vertexMain',
    buffers: [{
      arrayStride: 32, // 8 floats * 4 bytes
      attributes: [
        { shaderLocation: 0, offset: 0,  format: 'float32x3' }, // position
        { shaderLocation: 1, offset: 12, format: 'float32x3' }, // normal
        { shaderLocation: 2, offset: 24, format: 'float32x2' }, // uv
      ],
    }],
  },
});

The vertex shader receives these as inputs:

struct VertexInput {
  @location(0) position: vec3f,
  @location(1) normal: vec3f,
  @location(2) uv: vec2f,
}
 
struct VertexOutput {
  @builtin(position) position: vec4f,
  @location(0) normal: vec3f,
  @location(1) uv: vec2f,
}
 
@vertex
fn vertexMain(input: VertexInput) -> VertexOutput {
  // Transform position into clip space, pass through normal and uv
}


The Complete Pipeline

Here is the full flow from OBJ text to rendered mesh:

async function loadMesh(url: string, device: GPUDevice) {
  // 1. Fetch the file
  const response = await fetch(url);
  const text = await response.text();
  
  // 2. Parse OBJ
  const mesh = parseObj(text);
  
  // 3. Interleave vertex data
  const vertexData = interleave(mesh.positions, mesh.normals, mesh.uvs);
  
  // 4. Create vertex buffer
  const vertexBuffer = device.createBuffer({
    size: vertexData.byteLength,
    usage: GPUBufferUsage.VERTEX | GPUBufferUsage.COPY_DST,
  });
  device.queue.writeBuffer(vertexBuffer, 0, vertexData);
  
  // 5. Create index buffer
  const indexData = new Uint32Array(mesh.indices);
  const indexBuffer = device.createBuffer({
    size: indexData.byteLength,
    usage: GPUBufferUsage.INDEX | GPUBufferUsage.COPY_DST,
  });
  device.queue.writeBuffer(indexBuffer, 0, indexData);
  
  return {
    vertexBuffer,
    indexBuffer,
    indexCount: mesh.indices.length,
  };
}

At draw time:

renderPass.setVertexBuffer(0, mesh.vertexBuffer);
renderPass.setIndexBuffer(mesh.indexBuffer, 'uint32');
renderPass.drawIndexed(mesh.indexCount);


Normal Generation

Many OBJ files lack normals. You can generate them from face geometry:

function generateNormals(positions: number[], indices: number[]): number[] {
  const normals = new Array(positions.length).fill(0);
  
  // Accumulate face normals at each vertex
  for (let i = 0; i < indices.length; i += 3) {
    const i0 = indices[i], i1 = indices[i + 1], i2 = indices[i + 2];
    
    const p0 = positions.slice(i0 * 3, i0 * 3 + 3);
    const p1 = positions.slice(i1 * 3, i1 * 3 + 3);
    const p2 = positions.slice(i2 * 3, i2 * 3 + 3);
    
    // Edge vectors
    const e1 = [p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]];
    const e2 = [p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]];
    
    // Cross product gives face normal
    const nx = e1[1] * e2[2] - e1[2] * e2[1];
    const ny = e1[2] * e2[0] - e1[0] * e2[2];
    const nz = e1[0] * e2[1] - e1[1] * e2[0];
    
    // Accumulate at each vertex
    for (const idx of [i0, i1, i2]) {
      normals[idx * 3 + 0] += nx;
      normals[idx * 3 + 1] += ny;
      normals[idx * 3 + 2] += nz;
    }
  }
  
  // Normalize
  for (let i = 0; i < normals.length; i += 3) {
    const len = Math.sqrt(
      normals[i] ** 2 + normals[i + 1] ** 2 + normals[i + 2] ** 2
    );
    if (len > 0) {
      normals[i] /= len;
      normals[i + 1] /= len;
      normals[i + 2] /= len;
    }
  }
  
  return normals;
}

This gives smooth normals by averaging the face normals at each vertex. For hard edges, you need separate vertices with different normals—which is exactly what the OBJ face format enables.
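The cross-product step can be checked in isolation. Here is a standalone sketch of the face-normal computation used above:

```typescript
// Unit face normal of a triangle via the cross product of two edges.
function faceNormal(p0: number[], p1: number[], p2: number[]): number[] {
  const e1 = [p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]];
  const e2 = [p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]];
  const n = [
    e1[1] * e2[2] - e1[2] * e2[1],
    e1[2] * e2[0] - e1[0] * e2[2],
    e1[0] * e2[1] - e1[1] * e2[0],
  ];
  const len = Math.hypot(n[0], n[1], n[2]);
  return len > 0 ? n.map((c) => c / len) : n;
}

// A counter-clockwise triangle in the XY plane faces +Z:
// faceNormal([0, 0, 0], [1, 0, 0], [0, 1, 0]) → [0, 0, 1]
```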

Key Takeaways

  • Meshes consist of positions, normals, UVs, and indices that define triangle connectivity
  • OBJ is a simple text format: v for positions, vn for normals, vt for UVs, f for faces
  • Index buffers let you reuse vertices, reducing memory and improving cache efficiency
  • Interleaved vertex data is more efficient for GPU access than separate attribute arrays
  • The upload pipeline: parse → interleave → create buffers → configure pipeline → draw indexed
  • Missing normals can be generated by accumulating and averaging face normals