When the Heart Speaks Math: AI-Controlled Coupled Oscillators in the Browser


Your heart doesn’t beat because something tells it to. It beats because it has to — it’s a self-sustaining oscillator, a nonlinear dynamical system that generates its own rhythm from the inside out. No central command, no external clock. Just physics.

That autonomy is beautiful. And it raises a question: if you wanted an AI to regulate a system like that — not replace it, but negotiate with it — how would you do it?

This post documents exactly that experiment. We built a browser-based simulation of two coupled Van der Pol oscillators (heart and respiration), trained a neural network to control the coupling between them in real time, and deployed the whole thing with Rust/WASM, PyTorch, ONNX, and Next.js. Here’s how it works.


The Van der Pol Oscillator as a Heartbeat Model

The Van der Pol equation is a classic model for self-sustaining biological oscillations:

ẍ − μ(1 − x²)ẋ + x = 0

The damping term −μ(1 − x²)ẋ is the key. When |x| < 1, it acts as negative damping — injecting energy into the system. When |x| > 1, it dissipates energy. The result is a stable limit cycle: trajectories spiral away from the origin at small amplitudes and fold back at large ones, settling onto a closed orbit regardless of initial conditions.

This is the local signature of a Hopf bifurcation — as μ crosses zero, the origin switches from a stable spiral to an unstable one, and the limit cycle is born. For the heart, μ > 0 is the natural state. The sinoatrial node is doing this continuously, without asking permission.
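That convergence onto one orbit is easy to check numerically. Here is a minimal sketch (plain Python with a small-step Euler integrator, not the project's Rust solver) that starts one trajectory near the origin and one far outside it, then measures the amplitude each settles to:

```python
# Toy integration of the Van der Pol equation (forward Euler, small step),
# not the project's Rust/RK4 solver. Two very different starting points
# should settle onto the same closed orbit.

def settle_amplitude(x0, y0, mu=1.5, dt=0.001, steps=60000):
    x, y = x0, y0
    amp = 0.0
    for i in range(steps):
        dx = y
        dy = mu * (1.0 - x * x) * y - x
        x, y = x + dt * dx, y + dt * dy
        if i > steps // 2:        # measure only after transients die out
            amp = max(amp, abs(x))
    return amp

inner = settle_amplitude(0.1, 0.0)   # starts near the unstable origin
outer = settle_amplitude(4.0, 0.0)   # starts far outside the cycle
```

Both runs settle to an amplitude near 2, the classic Van der Pol limit-cycle amplitude, regardless of where they start.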


Respiratory Sinus Arrhythmia: Two Oscillators, One Body

Here’s a physiological fact that surprises most people: your heart rate is not constant. It speeds up slightly when you inhale and slows down when you exhale. This is called respiratory sinus arrhythmia (RSA), and it’s a sign of a healthy autonomic nervous system.

RSA is the result of two oscillators — cardiac and respiratory — being loosely coupled. The vagus nerve mediates this coupling, and the autonomic nervous system modulates its strength in response to context (rest, stress, exercise).

We can model this with two coupled Van der Pol equations:

ẍ₁ − μ₁(1 − x₁²)ẋ₁ + ω₁²x₁ = α(x₂ − x₁)
ẍ₂ − μ₂(1 − x₂²)ẋ₂ + ω₂²x₂ = β(x₁ − x₂)

The parameters α and β are the coupling strengths — how much each oscillator “pulls” the other. When α and β are small, the two oscillators run nearly independently. When they’re large, the oscillators synchronize. The autonomic nervous system is constantly tuning these values. That’s the job we’re giving to the AI.
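Written as a first-order system (x for position, y for velocity), the model is a four-dimensional ODE. A sketch of the right-hand side, assuming diffusive position coupling and illustrative natural frequencies w1 and w2:

```python
def coupled_vdp_rhs(x1, y1, x2, y2, mu1, mu2, alpha, beta, w1=1.0, w2=0.8):
    # Heart oscillator, pulled toward the respiratory one by alpha
    dx1 = y1
    dy1 = mu1 * (1 - x1 * x1) * y1 - w1 * w1 * x1 + alpha * (x2 - x1)
    # Respiratory oscillator, pulled toward the cardiac one by beta
    dx2 = y2
    dy2 = mu2 * (1 - x2 * x2) * y2 - w2 * w2 * x2 + beta * (x1 - x2)
    return dx1, dy1, dx2, dy2
```

With alpha = beta = 0 the coupling terms vanish and this reduces to two independent Van der Pol oscillators.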


The Architecture

Rust/WASM          →  runs the coupled ODE in the browser at ~60fps
PyTorch            →  trains the coupling controller offline
ONNX               →  exports the model as a portable format
onnxruntime-web    →  runs inference directly in the browser
Next.js            →  frontend + build pipeline
Vercel             →  deployment (free tier)

The simulation loop looks like this every frame:

// 1. Step the ODE in WASM
const result = coupled_vdp_step(mu1, mu2, alpha, beta, x1, y1, x2, y2, dt);

// 2. Every 30 frames, run inference in the browser
const raw = [state.x1, state.y1, state.x2, state.y2, stress];
const scaled = raw.map((v, i) => (v - SCALER_MEAN[i]) / SCALER_SCALE[i]);
const tensor = new ort.Tensor('float32', Float32Array.from(scaled), [1, 5]);
const results = await ortSession.run({ state: tensor });
const [alpha, beta] = results.coupling.data;

// 3. Render waveforms and phase portrait to canvas

The WASM solver uses RK4 integration — a meaningful upgrade from the Euler method in the original Van der Pol demo. RK4 handles the stiff dynamics at high μ without the error accumulation that Euler suffers from.
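For reference, one RK4 step samples the derivative four times and blends the samples with 1-2-2-1 weights, giving fourth-order accuracy. A generic Python sketch (the project's solver is Rust, so this is purely illustrative):

```python
def rk4_step(f, state, dt):
    # f maps a state tuple to its time-derivative tuple (same length)
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + (dt / 6.0) * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Sanity check: integrating y' = y from 1 over [0, 1] should land on e
y = (1.0,)
for _ in range(10):
    y = rk4_step(lambda s: (s[0],), y, 0.1)
```

Even with a coarse dt = 0.1, the result is within a few millionths of e, which is the payoff of the fourth-order error cancellation.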


Training the Controller

The neural network is small: two hidden layers of 64 units with Tanh activations, and a Softplus output to ensure α, β > 0.

Input (5 features): x₁, y₁, x₂, y₂, stress level

Output (2 targets): α, β
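In NumPy terms, the forward pass looks like the following sketch. The weights are random placeholders rather than the trained ones; the point is the layer shapes and the fact that Softplus keeps both outputs strictly positive:

```python
import numpy as np

# Random stand-in weights with the post's layer shapes: 5 -> 64 -> 64 -> 2
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 64)) * 0.1, np.zeros(64)
W2, b2 = rng.normal(size=(64, 64)) * 0.1, np.zeros(64)
W3, b3 = rng.normal(size=(64, 2)) * 0.1, np.zeros(2)

def softplus(z):
    # Numerically stable log(1 + e^z)
    return np.maximum(z, 0) + np.log1p(np.exp(-np.abs(z)))

def controller(state5):
    h = np.tanh(state5 @ W1 + b1)
    h = np.tanh(h @ W2 + b2)
    return softplus(h @ W3 + b3)   # Softplus keeps alpha, beta > 0

out = controller(np.array([0.5, -0.2, 0.1, 0.3, 0.7]))  # -> [alpha, beta]
```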

Training data was generated by simulating the coupled system across a sweep of stress levels, using physiologically motivated parameter schedules:

for stress in np.linspace(0.0, 1.0, 6):
    mu1   = 1.5 + stress * 2.0   # heart more nonlinear under stress
    mu2   = 0.5 + stress * 0.5   # respiration speeds up
    alpha = 0.1 + stress * 0.4   # coupling strengthens
    beta  = 0.05 + stress * 0.2
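Turning that schedule into (state, target) pairs might look like this sketch: simulate each stress level past its transient, then record state snapshots labeled with the schedule's coupling values. The toy Euler integrator and identical natural frequencies here are simplifications, not the project's exact script:

```python
import numpy as np

def make_dataset(n_stress=6, snapshots=200, dt=0.01, warmup=2000):
    X, Y = [], []
    for stress in np.linspace(0.0, 1.0, n_stress):
        mu1, mu2 = 1.5 + stress * 2.0, 0.5 + stress * 0.5
        alpha, beta = 0.1 + stress * 0.4, 0.05 + stress * 0.2
        x1, y1, x2, y2 = 1.0, 0.0, 0.5, 0.0
        for i in range(warmup + snapshots):
            # toy forward-Euler step of the coupled system
            dy1 = mu1 * (1 - x1 * x1) * y1 - x1 + alpha * (x2 - x1)
            dy2 = mu2 * (1 - x2 * x2) * y2 - x2 + beta * (x1 - x2)
            x1, y1 = x1 + dt * y1, y1 + dt * dy1
            x2, y2 = x2 + dt * y2, y2 + dt * dy2
            if i >= warmup:        # record only post-transient states
                X.append([x1, y1, x2, y2, stress])
                Y.append([alpha, beta])
    return np.array(X, dtype=np.float32), np.array(Y, dtype=np.float32)

X, Y = make_dataset()   # 6 stress levels x 200 snapshots = 1200 rows
```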

After 200 epochs of Adam with MSE loss, validation loss settled at ~0.000001 — essentially perfect recovery of the coupling values from state observations. This makes sense: the mapping from stress to coupling is deterministic and smooth, so a small network can learn it exactly.


Exporting to ONNX

Once trained, the model is exported with one line:

torch.onnx.export(model, dummy_input, "coupling_controller.onnx",
    input_names=["state"], output_names=["coupling"],
    opset_version=18, dynamo=False)

The scaler parameters (mean and standard deviation per feature) are hardcoded from training — a simple but effective way to handle preprocessing without a Python runtime.
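Producing those hardcoded constants is a one-time step at export time: compute the per-feature mean and standard deviation of the training inputs and print them in a paste-ready form. A sketch with synthetic stand-in data:

```python
import numpy as np

# Stand-in for the real training-input matrix, shape (n_samples, 5)
X = np.random.default_rng(1).normal(size=(1000, 5)) * 2.0 + 1.0

SCALER_MEAN = X.mean(axis=0)
SCALER_SCALE = X.std(axis=0)

# Paste-ready constants for the frontend
print("const SCALER_MEAN  = [", ", ".join(f"{v:.6f}" for v in SCALER_MEAN), "];")
print("const SCALER_SCALE = [", ", ".join(f"{v:.6f}" for v in SCALER_SCALE), "];")

# Applying them reproduces standard scaling: zero mean, unit variance
Z = (X - SCALER_MEAN) / SCALER_SCALE
```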


A Deployment Problem Worth Documenting

The original plan was to serve inference server-side via a Next.js API route using onnxruntime-node. It worked perfectly in local development. Then we deployed to Vercel and every single API call returned a 500 error. The logs told the story immediately:

Error: libonnxruntime.so.1: cannot open shared object file: No such file or directory

onnxruntime-node relies on a native shared library that doesn’t exist in Vercel’s serverless Lambda environment. The containers are stripped-down — they don’t carry arbitrary native binaries. This is a known limitation but easy to miss if you’ve only tested locally.

The fix turned out to be an architectural improvement: move inference entirely to the browser using onnxruntime-web instead. The model is only 19KB, so loading it client-side is fast. The API route disappeared entirely. The stack got simpler.

// Load the ONNX session once, alongside the WASM module
ort.env.wasm.wasmPaths = '/';
ort.env.wasm.numThreads = 1;
const session = await ort.InferenceSession.create('/coupling_controller.onnx');

The lesson: onnxruntime-node is for Node.js servers you control (a VPS, a container, a dedicated backend). For serverless platforms like Vercel, onnxruntime-web is the right tool — and for a small model running at ~2 inferences per second, the browser handles it effortlessly.

This is the kind of thing you only learn by hitting the wall. Now you know.


What You Can Observe

The demo shows three panels: the cardiac waveform, the respiratory waveform, and a phase portrait of both oscillators simultaneously.

At rest (stress = 0), the two limit cycles in the phase portrait are slightly offset — the oscillators run at different natural frequencies with weak coupling. As you increase stress, the AI controller responds by raising α and β, and the orbits converge. The waveforms speed up. The phase portrait tightens.
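You can quantify that convergence. The sketch below (illustrative parameters, toy Euler integrator) counts completed cycles of each oscillator over a long run: with weak coupling the counts drift apart, while strong coupling pulls the oscillators toward a common rhythm:

```python
def cycle_counts(alpha, beta, mu1=1.5, mu2=0.5, w2=0.8,
                 dt=0.005, steps=100000):
    # Count upward zero crossings of each oscillator (completed cycles)
    x1, y1, x2, y2 = 1.0, 0.0, 0.5, 0.0
    n1 = n2 = 0
    for _ in range(steps):
        dy1 = mu1 * (1 - x1 * x1) * y1 - x1 + alpha * (x2 - x1)
        dy2 = mu2 * (1 - x2 * x2) * y2 - w2 * w2 * x2 + beta * (x1 - x2)
        nx1, ny1 = x1 + dt * y1, y1 + dt * dy1
        nx2, ny2 = x2 + dt * y2, y2 + dt * dy2
        if x1 < 0 <= nx1:
            n1 += 1
        if x2 < 0 <= nx2:
            n2 += 1
        x1, y1, x2, y2 = nx1, ny1, nx2, ny2
    return n1, n2

weak = cycle_counts(alpha=0.05, beta=0.02)   # near-independent rhythms
strong = cycle_counts(alpha=1.5, beta=0.8)   # pulled toward lockstep
```

The gap between the two cycle counts shrinks as the coupling grows, which is the numerical analogue of the phase portrait tightening.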

This is the math of what happens in your body when you go from sitting quietly to running for a bus.


What’s Next

This is a foundation, not a finished product; there are plenty of directions worth exploring from here.

The full source is available on GitHub.


Built with Rust, WebAssembly, PyTorch, ONNX, and Next.js. Both the simulation and the AI inference run entirely client-side in the browser.