Geometry Processing

Mesh Eigen Functions & Spectral Analysis

I kept running into the Laplacian everywhere in geometry processing and couldn't shake the feeling that I didn't really get it - like, why the Laplacian? And what does it even mean to have eigenvectors of one? This representation feels criminally underexplored (at least by me), and I think it has massive potential. So this is me building the intuition, one function at a time. 🐒

1 Why the Laplacian?

We have a 3D mesh - a human body, a teapot, a scanned face. It's a cloud of vertices connected by edges and triangles. Now suppose we want to study how some quantity (temperature, color, displacement) varies across the surface.

🧠 The core question
How different is the value at this point compared to its immediate surroundings?

On a smooth surface, the tool for this is the Laplacian operator ∇². Think about heat diffusion. If vertex i sits at 100° and all its neighbors sit at 20°, heat flows outward fast. If vertex i matches its neighbors exactly, nothing happens. The Laplacian captures exactly this intuition.

Wait, isn't the Laplacian a second derivative? Yes! And it turns out "second derivative" and "difference from neighbors" are the same thing in discrete form. We'll prove it in section 2.

On a mesh we don't have a smooth surface, we have discrete vertices and edges. So we need a discrete Laplacian that approximates the continuous operator on our graph structure.

· · ·

2 The discrete Laplacian on a mesh

Let f be a scalar function on the mesh - one number per vertex (temperature, height, color, whatever). The combinatorial Laplacian at vertex i is:

Lf(i) = Σj∈N(i) (f(j) − f(i)) Sum of differences between vertex i and each of its neighbors N(i)
💡 "But that's just a first-order difference, not a second derivative?"

Great catch. Let's actually prove it's a second derivative in 1D. This tripped me up for a long time, so I want to show the math step-by-step to make it crystal clear.

f′(i) = f(i+1) − f(i) (first forward difference)
f′(i−1) = f(i) − f(i−1) (first forward difference)
f″(i) = f′(i) − f′(i−1) (second backward difference)
= f(i+1) − 2f(i) + f(i−1)

Rearranged: [f(i+1) + f(i−1)] − 2f(i) = 2 × (avg. neighbors − f(i)) ✓

The second derivative literally is the neighbor-average difference.
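A two-line NumPy check of this identity (a quick sketch, using f(x) = x² sampled at integers so the true second derivative is 2 everywhere):

```python
import numpy as np

# f(x) = x² sampled at integers, so the true second derivative is 2 everywhere
f = np.array([0.0, 1.0, 4.0, 9.0, 16.0, 25.0])

# Second difference at interior points: f(i+1) − 2f(i) + f(i−1)
second_diff = f[2:] - 2 * f[1:-1] + f[:-2]

# Neighbor-average difference at the same points: 2 × (avg. neighbors − f(i))
neighbor_avg_diff = 2 * ((f[2:] + f[:-2]) / 2 - f[1:-1])

print(second_diff)        # [2. 2. 2. 2.]
print(neighbor_avg_diff)  # [2. 2. 2. 2.]
```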

As a matrix

Since f is a vector (N numbers, one per vertex), L can be written as an N×N matrix. The entries are beautifully simple:

Entry      Value        Why
L[i][i]    −degree(i)   Negative count of neighbors (diagonal)
L[i][j]    1            If i and j share an edge
L[i][j]    0            If i and j are not connected

For 3 vertices all connected to each other, L looks like:

−2   1   1
 1  −2   1
 1   1  −2
🔑 L is built purely from connectivity - no scalar function needed. It's a machine ready to act on any f you give it.
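Here's that 3-vertex example built in NumPy as a sanity check (a sketch using the same convention as the table: +1 per shared edge, −degree on the diagonal):

```python
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]   # triangle: every vertex touches the other two
n = 3

L = np.zeros((n, n))
for i, j in edges:
    L[i, j] += 1    # +1 off-diagonal per shared edge
    L[j, i] += 1
    L[i, i] -= 1    # each edge adds −1 to both endpoints' diagonals,
    L[j, j] -= 1    # accumulating to −degree

print(L)
# [[-2.  1.  1.]
#  [ 1. -2.  1.]
#  [ 1.  1. -2.]]

# L is a machine ready to act on any per-vertex function f
f = np.array([100.0, 20.0, 20.0])   # hot vertex 0, cool neighbors
print(L @ f)                        # [-160.   80.   80.] - heat flows out of vertex 0
```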
· · ·

3 Eigen functions: the mesh's natural patterns

Since L is a matrix, we can ask the classic eigenvalue question: are there special functions f where applying L just scales f without changing its pattern?

Lf = λf f is the Eigen function. λ is the eigenvalue. The mesh decides both.
🧠 What does this mean geometrically?
Applying L to f measures "how different f is from its neighbors." If Lf = λf, that difference is just f scaled - meaning f has a consistent, self-similar relationship with its neighborhood everywhere on the mesh.

A mesh with N vertices has exactly N Eigen functions f₁, f₂, …, fₙ. With our sign convention (−degree on the diagonal) every eigenvalue is ≤ 0, and the magnitude |λ| plays the role of frequency: sorted smoothest first, 0 = λ₁ ≥ λ₂ ≥ … ≥ λₙ. Each Eigen function is just a vector of N numbers - one value per vertex - describing a pattern across the surface.

🔥 Nobody chooses these patterns
The mesh shape itself dictates them. You just solve the math and they fall out. This is why they're powerful - they're intrinsic to the shape, not an arbitrary choice.
🎸 Guitar string analogy

A guitar string has natural vibration frequencies determined by its physical properties - length, tension, mass. You don't choose them. Pluck it and they emerge.

🔺 Mesh analogy

A mesh has natural Eigen function patterns determined by its connectivity. You don't choose them. Solve Lf = λf and they emerge.
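You can watch them emerge on the triangle Laplacian from section 2 (a sketch; note that with the −degree-diagonal convention the eigenvalues come out ≤ 0, and the smoothest mode sits at λ = 0):

```python
import numpy as np

L = np.array([[-2.,  1.,  1.],
              [ 1., -2.,  1.],
              [ 1.,  1., -2.]])

eigenvalues, eigenvectors = np.linalg.eigh(L)   # ascending order
print(eigenvalues)          # ≈ [-3, -3, 0]

# Check Lf = λf holds for every pair
for k in range(3):
    assert np.allclose(L @ eigenvectors[:, k], eigenvalues[k] * eigenvectors[:, k])

# The λ = 0 mode (last column) is the constant function - the smoothest
# possible pattern: every vertex equals its neighbors
constant_mode = eigenvectors[:, -1]
print(constant_mode)        # all entries equal up to sign, ≈ ±0.577
```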

Computing Eigen Functions in practice

Mesh Laplacians are sparse - each vertex connects to only ~6 neighbors out of thousands, so L has mostly zeros. Use sparse solvers:

Python
# Dense (small meshes). eigh sorts eigenvalues ascending, so with the
# −degree diagonal the smoothest modes (λ near 0) end up in the LAST columns
eigenvalues, eigenvectors = numpy.linalg.eigh(L)

# Sparse (large meshes - much faster, only compute the K smoothest).
# which='LA' = largest algebraic eigenvalues, i.e. the ones closest to 0
eigenvalues, eigenvectors = scipy.sparse.linalg.eigsh(L, k=50, which='LA')
Eigen vectors vs Eigen functions - same thing! An eigenvector of L is a vector of N numbers (one per vertex). An Eigen function is a function on the mesh (one value per vertex). They're literally identical objects - just different words depending on whether you're thinking algebraically or geometrically.
· · ·

4 Small |λ| vs large |λ| - the frequency story

The magnitude of the eigenvalue, |λ|, tells you the frequency of the Eigen function pattern. This follows directly from Lf = λf:

✅ Small |λ| (λ close to 0)
Lf(i) ≈ 0 → Σ(f(j) − f(i)) ≈ 0 → neighbors ≈ current vertex value
Smooth, slowly varying pattern across the mesh
🔥 Large |λ|
|Lf(i)| is large → neighbors differ a lot from vertex i
Spiky, rapidly oscillating pattern
f₁ (λ = 0): constant / flat · f₂ (small |λ|): one smooth wave · fₙ (large |λ|): rapid oscillation
🎵 Eigen functions are the Fourier frequencies of the mesh. Any function on the mesh = sum of Eigen functions, just like any signal = sum of sinusoids.

This is called spectral geometry - and it's why people talk about the "frequency domain" of a mesh. λ near 0 = low frequency = smooth. Large |λ| = high frequency = detail / noise.
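The "any function = sum of Eigen functions" claim is easy to verify numerically. A sketch on a tiny path graph (the graph and sample values are made up for illustration):

```python
import numpy as np

# Path graph 0—1—2—3—4, same convention as before: 1 per edge, −degree diagonal
n = 5
L = np.zeros((n, n))
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    L[i, j] = L[j, i] = 1
    L[i, i] -= 1
    L[j, j] -= 1

eigenvalues, modes = np.linalg.eigh(L)   # columns of `modes` are orthonormal

# Any per-vertex function decomposes exactly into the Eigen function basis
f = np.array([3.0, -1.0, 4.0, 1.0, -5.0])
coeffs = modes.T @ f                     # cₖ = fₖᵀ · f for every k at once
reconstructed = modes @ coeffs           # Σ cₖ fₖ

assert np.allclose(reconstructed, f)     # perfect reconstruction with all N modes
```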

· · ·

5 Extracting the low-frequency mesh

This is spectral mesh filtering - the 3D geometry equivalent of a low-pass filter in audio. Want to smooth a noisy mesh? Kill the high-frequency Eigen functions.

Any scalar function f on the mesh decomposes as:

f = c₁f₁ + c₂f₂ + … + cₙfₙ where cₖ = fₖᵀ · f (dot product = "how much of Eigen function k is in f?")

To keep only the first K smooth Eigen functions:

fsmooth = c₁f₁ + c₂f₂ + … + cKfK Low-pass filter - throw away everything above K

For a 3D mesh, apply this independently to x, y, and z coordinates:

1. Build L from mesh connectivity
   L[i][i] = −deg(i), L[i][j] = 1 if i and j are neighbors, 0 everywhere else.
2. Solve Lf = λf
   Get N Eigen functions f₁, f₂, …, fₙ, sorted smoothest first (eigenvalues closest to 0 first).
3. Project coordinates onto Eigen functions
   Decompose x, y, z separately into the Eigen function basis: cₖ = fₖᵀ · x (repeat for y, z).
4. Reconstruct with the first K Eigen functions only
   Keep the smooth patterns, discard the spiky noise: x_smooth = Σₖ₌₁ᴷ cₖfₖ (repeat for y, z).
5. Smoothed mesh ✓
   New vertex positions with high-frequency noise removed. K small → very smooth | K large → preserves more detail.
💡 The K tradeoff
K = 5: Very smooth - you'll lose fine details, but noise is gone.
K = 200: Gentle smoothing - details preserved, some noise remains.
K = N: Perfect reconstruction - nothing removed.
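The K tradeoff is easy to see on a toy example: a noisy sine on a cycle graph, low-pass filtered with K = 5 modes (a sketch; the cycle graph, noise level, and random seed are arbitrary choices for illustration):

```python
import numpy as np

# Cycle graph (a closed loop of n vertices), −degree diagonal, 1 per edge
n = 50
L = np.zeros((n, n))
for i in range(n):
    j = (i + 1) % n
    L[i, j] = L[j, i] = 1
    L[i, i] -= 1
    L[j, j] -= 1

eigenvalues, modes = np.linalg.eigh(L)
order = np.argsort(-eigenvalues)          # smoothest first: λ closest to 0
modes = modes[:, order]

# A smooth signal plus noise
t = np.linspace(0, 2 * np.pi, n, endpoint=False)
clean = np.sin(t)
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=n)

# Low-pass: project onto the K smoothest modes and reconstruct
K = 5
f_smooth = modes[:, :K] @ (modes[:, :K].T @ noisy)

print(np.linalg.norm(noisy - clean))     # error before filtering
print(np.linalg.norm(f_smooth - clean))  # smaller after filtering
```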
· · ·

6 Python implementation

Here's the full pipeline - clean, commented, and ready to drop into your project:

Python
import numpy as np
from scipy.linalg import eigh

# ── Step 1: Build Laplacian ─────────────────────────
def build_laplacian(n_vertices, edges):
    """
    edges: list of (i, j) vertex index pairs
    Returns L: (N, N) combinatorial Laplacian matrix
    """
    L = np.zeros((n_vertices, n_vertices))
    for i, j in edges:
        L[i, j] += 1       # neighbor gets +1
        L[j, i] += 1       # symmetric
        L[i, i] -= 1       # subtract degree on diagonal
        L[j, j] -= 1
    return L

# ── Step 2: Compute eigenfunctions ──────────────────
def compute_eigenfunctions(L):
    """
    Returns eigenvalues and eigenfunctions, smoothest mode first.
    Each column of the returned matrix is one eigenfunction.
    """
    eigenvalues, eigenfunctions = eigh(L)
    # eigh sorts ascending; with the −degree diagonal all eigenvalues
    # are ≤ 0, so reverse to put the smooth modes (λ near 0) first
    return eigenvalues[::-1], eigenfunctions[:, ::-1]

# ── Steps 3+4: Low-frequency mesh extraction ───────
def low_frequency_mesh(vertices, eigenfunctions, K):
    """
    vertices:       (N, 3) xyz positions
    eigenfunctions: (N, N) matrix - columns are eigenfunctions,
                    smoothest first
    K:              how many eigenfunctions to keep (low-pass)
    Returns smoothed (N, 3) vertex positions
    """
    # Keep only the K smoothest eigenfunctions
    F_K = eigenfunctions[:, :K]         # (N, K)

    # Project: cₖ = fₖᵀ · vertices
    coeffs = F_K.T @ vertices           # (K, 3)

    # Reconstruct from K eigenfunctions only
    vertices_smooth = F_K @ coeffs      # (N, 3)
    return vertices_smooth

# ── Usage ──────────────────────────────────────────
# edges    = [(0,1), (1,2), ...]   from your mesh
# vertices = np.array([...])       shape (N, 3)

L = build_laplacian(len(vertices), edges)
eigenvalues, eigenfunctions = compute_eigenfunctions(L)

# Strong smoothing - keep only 20 eigenfunctions
smooth = low_frequency_mesh(vertices, eigenfunctions, K=20)

# Gentle smoothing - keep 200
gentle = low_frequency_mesh(vertices, eigenfunctions, K=200)
ℹ️ Pro tip for large meshes
Don't compute all N Eigen functions if you only need K. Use scipy.sparse.linalg.eigsh(L, k=K, which='LA') - with the −degree diagonal the K smoothest modes are the eigenvalues closest to zero (the largest algebraically), and eigsh computes just those directly, which is dramatically faster for meshes with thousands of vertices.