When the Heart Speaks Math: AI-Controlled Coupled Oscillators in the Browser
Your heart doesn’t beat because something tells it to. It beats because it has to — it’s a self-sustaining oscillator, a nonlinear dynamical system that generates its own rhythm from the inside out. No central command, no external clock. Just physics.
That autonomy is beautiful. And it raises a question: if you wanted an AI to regulate a system like that — not replace it, but negotiate with it — how would you do it?
This post documents exactly that experiment. We built a browser-based simulation of two coupled Van der Pol oscillators (heart and respiration), trained a neural network to control the coupling between them in real time, and deployed the whole thing with Rust/WASM, PyTorch, ONNX, and Next.js. Here’s how it works.
The Van der Pol Oscillator as a Heartbeat Model
The Van der Pol equation is a classic model for self-sustaining biological oscillations:
$$\ddot{x} - \mu\,(1 - x^2)\,\dot{x} + x = 0$$

The parameter $\mu$ sets the strength of the nonlinear damping. When $|x| < 1$ the damping term is negative and pumps energy into the oscillation; when $|x| > 1$ it turns positive and drains energy out. The balance between the two is a stable limit cycle: perturb the system and it relaxes back to the same rhythm.
This is the local signature of a Hopf bifurcation — as $\mu$ crosses zero from below, the fixed point at the origin loses stability and a limit cycle is born. The larger $\mu$ gets, the more strongly nonlinear (and more spiky, relaxation-like) the oscillation becomes.
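To see the self-sustaining behavior concretely, here's a throwaway Python sketch (not the production Rust solver) that integrates the Van der Pol equation with RK4. For $\mu > 0$ the trajectory settles onto a limit cycle with amplitude near 2 regardless of where it starts; for $\mu < 0$ it decays to rest.

```python
import numpy as np

def vdp_rhs(state, mu):
    """Van der Pol in first-order form: x' = y, y' = mu*(1 - x^2)*y - x."""
    x, y = state
    return np.array([y, mu * (1 - x**2) * y - x])

def rk4_step(state, mu, dt):
    k1 = vdp_rhs(state, mu)
    k2 = vdp_rhs(state + 0.5 * dt * k1, mu)
    k3 = vdp_rhs(state + 0.5 * dt * k2, mu)
    k4 = vdp_rhs(state + dt * k3, mu)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def peak_amplitude(mu, t_end=50.0, dt=0.01):
    """Max |x| over the second half of the run, after transients die out."""
    state = np.array([0.1, 0.0])
    n = int(t_end / dt)
    amp = 0.0
    for i in range(n):
        state = rk4_step(state, mu, dt)
        if i > n // 2:
            amp = max(amp, abs(state[0]))
    return amp
```

With `mu=1.0` the amplitude converges to roughly 2; with `mu=-0.5` (linear-style positive damping) the motion dies away, which is exactly the Hopf picture.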
Respiratory Sinus Arrhythmia: Two Oscillators, One Body
Here’s a physiological fact that surprises most people: your heart rate is not constant. It speeds up slightly when you inhale and slows down when you exhale. This is called respiratory sinus arrhythmia (RSA), and it’s a sign of a healthy autonomic nervous system.
RSA is the result of two oscillators — cardiac and respiratory — being loosely coupled. The vagus nerve mediates this coupling, and the autonomic nervous system modulates its strength in response to context (rest, stress, exercise).
We can model this with two coupled Van der Pol equations:
$$\ddot{x}_1 - \mu_1\,(1 - x_1^2)\,\dot{x}_1 + x_1 = \alpha\,(x_2 - x_1)$$

$$\ddot{x}_2 - \mu_2\,(1 - x_2^2)\,\dot{x}_2 + x_2 = \beta\,(x_1 - x_2)$$

The parameters $\mu_1, \mu_2$ set the nonlinearity of the cardiac and respiratory oscillators, while $\alpha$ and $\beta$ set the coupling strength in each direction. In the body, the autonomic nervous system tunes that coupling; in this simulation, a neural network does.
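Here is a minimal Python sketch of the stepping logic in the first-order form the WASM solver works with (state $[x_1, y_1, x_2, y_2]$ with $y = \dot{x}$). The diffusive coupling terms $\alpha(x_2 - x_1)$ and $\beta(x_1 - x_2)$ are my assumption about the exact form, since the Rust source isn't reproduced here.

```python
import numpy as np

def coupled_vdp_rhs(s, mu1, mu2, alpha, beta):
    """Right-hand side of the coupled system; s = [x1, y1, x2, y2]."""
    x1, y1, x2, y2 = s
    return np.array([
        y1,
        mu1 * (1 - x1**2) * y1 - x1 + alpha * (x2 - x1),  # cardiac, driven by respiration
        y2,
        mu2 * (1 - x2**2) * y2 - x2 + beta * (x1 - x2),   # respiratory, driven by heart
    ])

def rk4_step(s, dt, *params):
    k1 = coupled_vdp_rhs(s, *params)
    k2 = coupled_vdp_rhs(s + 0.5 * dt * k1, *params)
    k3 = coupled_vdp_rhs(s + 0.5 * dt * k2, *params)
    k4 = coupled_vdp_rhs(s + dt * k3, *params)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Resting-state parameters (mu1=1.5, mu2=0.5, alpha=0.1, beta=0.05)
s = np.array([0.1, 0.0, 0.1, 0.0])
for _ in range(5000):
    s = rk4_step(s, 0.01, 1.5, 0.5, 0.1, 0.05)
```

Both oscillators stay bounded on their limit cycles, with the weak coupling gently nudging their phases — the simulated analogue of RSA.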
The Architecture
- Rust/WASM → runs the coupled ODE in the browser at ~60fps
- PyTorch → trains the coupling controller offline
- ONNX → exports the model as a portable format
- onnxruntime-web → runs inference directly in the browser
- Next.js → frontend + build pipeline
- Vercel → deployment (free tier)
The simulation loop looks like this every frame:
// 1. Step the ODE in WASM
const result = coupled_vdp_step(mu1, mu2, alpha, beta, x1, y1, x2, y2, dt);
// 2. Every 30 frames, run inference in the browser
const raw = [state.x1, state.y1, state.x2, state.y2, stress];
const scaled = raw.map((v, i) => (v - SCALER_MEAN[i]) / SCALER_SCALE[i]);
const tensor = new ort.Tensor('float32', Float32Array.from(scaled), [1, 5]);
const results = await ortSession.run({ state: tensor });
const [alpha, beta] = results.coupling.data;
// 3. Render waveforms and phase portrait to canvas
The WASM solver uses RK4 integration — a meaningful upgrade from the Euler method in the original Van der Pol demo. RK4 handles the stiff dynamics at high $\mu$ gracefully, where Euler at the same step size distorts the limit cycle's shape or goes unstable outright.
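A quick way to see the payoff, in a throwaway Python sketch (assumed parameters, not the WASM code): integrate a strongly nonlinear oscillator ($\mu = 4$) with both methods at the same step size and compare each against a finely stepped RK4 reference.

```python
import numpy as np

def rhs(s, mu):
    x, y = s
    return np.array([y, mu * (1 - x**2) * y - x])

def euler_step(s, mu, dt):
    return s + dt * rhs(s, mu)

def rk4_step(s, mu, dt):
    k1 = rhs(s, mu)
    k2 = rhs(s + 0.5 * dt * k1, mu)
    k3 = rhs(s + 0.5 * dt * k2, mu)
    k4 = rhs(s + dt * k3, mu)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def trajectory(step, mu, dt, n, substeps=1):
    """Record x at each coarse step, optionally substepping for a reference run."""
    s = np.array([2.0, 0.0])
    xs = [s[0]]
    for _ in range(n):
        for _ in range(substeps):
            s = step(s, mu, dt / substeps)
        xs.append(s[0])
    return np.array(xs)

mu, dt, n = 4.0, 0.05, 400                            # 20 time units at the coarse step
ref = trajectory(rk4_step, mu, dt, n, substeps=20)    # fine reference (dt = 0.0025)
err_euler = np.mean(np.abs(trajectory(euler_step, mu, dt, n) - ref))
err_rk4 = np.mean(np.abs(trajectory(rk4_step, mu, dt, n) - ref))
```

At this step size Euler accumulates visible phase error against the reference while RK4 stays close — the same trade-off the browser loop faces at 60fps.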
Training the Controller
The neural network is small: two hidden layers of 64 units with Tanh activations, and a Softplus output layer to ensure the predicted couplings are strictly positive (a negative $\alpha$ or $\beta$ has no physiological meaning here).

Input (5 features): $x_1, y_1, x_2, y_2$, plus the current stress level $s \in [0, 1]$.

Output (2 targets): the coupling strengths $\alpha$ and $\beta$.
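The architecture is simple enough to sketch as a plain numpy forward pass. The weights below are random stand-ins for the trained parameters; the point is the structure (5 → 64 → 64 → 2, Tanh hidden layers, Softplus head) and why the outputs can never go negative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(z):
    return np.log1p(np.exp(z))   # smooth and strictly positive for any input

# Random weights stand in for the trained parameters (illustration only)
W1, b1 = 0.3 * rng.normal(size=(64, 5)), np.zeros(64)
W2, b2 = 0.1 * rng.normal(size=(64, 64)), np.zeros(64)
W3, b3 = 0.1 * rng.normal(size=(2, 64)), np.zeros(2)

def controller(state5):
    """Map [x1, y1, x2, y2, stress] to positive couplings (alpha, beta)."""
    h = np.tanh(W1 @ state5 + b1)    # hidden layer 1 (Tanh)
    h = np.tanh(W2 @ h + b2)         # hidden layer 2 (Tanh)
    return softplus(W3 @ h + b3)     # Softplus head keeps alpha, beta > 0

out = controller(np.array([1.0, -0.5, 0.3, 0.2, 0.7]))
```

Whatever the hidden layers produce, Softplus maps it into $(0, \infty)$, so the controller can never push the simulation into unphysical negative coupling.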
Training data was generated by simulating the coupled system across a sweep of stress levels, using physiologically motivated parameter schedules:
for stress in np.linspace(0.0, 1.0, 6):
    mu1 = 1.5 + stress * 2.0     # heart more nonlinear under stress
    mu2 = 0.5 + stress * 0.5     # respiration speeds up
    alpha = 0.1 + stress * 0.4   # coupling strengthens
    beta = 0.05 + stress * 0.2
    # ... simulate the coupled system at these parameters and
    # record (state, (alpha, beta)) training pairs
After 200 epochs of Adam with MSE loss, validation loss settled at ~0.000001 — essentially perfect recovery of the coupling values from state observations. This makes sense: the mapping from stress to coupling is deterministic and smooth, so a small network can learn it exactly.
Exporting to ONNX
Once trained, the model is exported with one line:
torch.onnx.export(model, dummy_input, "coupling_controller.onnx",
input_names=["state"], output_names=["coupling"],
opset_version=18, dynamo=False)
The scaler parameters (mean and standard deviation per feature) are hardcoded from training — a simple but effective way to handle preprocessing without a Python runtime.
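One way to do that baking step — a sketch, not the project's actual export script — is to compute the per-feature mean and standard deviation in numpy and emit them as JS constants to paste into the frontend:

```python
import numpy as np

# Stand-in training matrix; in the real pipeline this is the (N, 5) state+stress data
X_train = np.random.default_rng(1).normal(size=(1000, 5))
mean, scale = X_train.mean(axis=0), X_train.std(axis=0)

def emit_js_constants(mean, scale):
    """Render the scaler parameters as JavaScript array literals."""
    fmt = lambda a: "[" + ", ".join(f"{v:.6f}" for v in a) + "]"
    return (f"const SCALER_MEAN = {fmt(mean)};\n"
            f"const SCALER_SCALE = {fmt(scale)};")

snippet = emit_js_constants(mean, scale)
print(snippet)
```

The browser then applies `(v - SCALER_MEAN[i]) / SCALER_SCALE[i]` before inference, exactly mirroring the training-time preprocessing without shipping any Python.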
A Deployment Problem Worth Documenting
The original plan was to serve inference server-side via a Next.js API route using onnxruntime-node. It worked perfectly in local development. Then we deployed to Vercel and every single API call returned a 500 error. The logs told the story immediately:
Error: libonnxruntime.so.1: cannot open shared object file: No such file or directory
onnxruntime-node relies on a native shared library that doesn’t exist in Vercel’s serverless Lambda environment. The containers are stripped-down — they don’t carry arbitrary native binaries. This is a known limitation but easy to miss if you’ve only tested locally.
The fix turned out to be an architectural improvement: move inference entirely to the browser using onnxruntime-web instead. The model is only 19KB, so loading it client-side is fast. The API route disappeared entirely. The stack got simpler.
// Load the ONNX session once, alongside the WASM module
ort.env.wasm.wasmPaths = '/';
ort.env.wasm.numThreads = 1;
const session = await ort.InferenceSession.create('/coupling_controller.onnx');
The lesson: onnxruntime-node is for Node.js servers you control (a VPS, a container, a dedicated backend). For serverless platforms like Vercel, onnxruntime-web is the right tool — and for a small model running at ~2 inferences per second, the browser handles it effortlessly.
This is the kind of thing you only learn by hitting the wall. Now you know.
What You Can Observe
The demo shows three panels: the cardiac waveform, the respiratory waveform, and a phase portrait of both oscillators simultaneously.
At rest ($s \approx 0$), both waveforms are smooth and nearly sinusoidal, the coupling is weak, and RSA appears as a gentle modulation of the cardiac rhythm. As stress rises toward $s = 1$, $\mu_1$ grows and the cardiac waveform sharpens into relaxation-style spikes, while the strengthened coupling pulls the two oscillators into visibly tighter coordination in the phase portrait.
This is the math of what happens in your body when you go from sitting quietly to running for a bus.
What’s Next
This is a foundation, not a finished product. A few directions worth exploring:
- Reinforcement learning controller — instead of supervised learning from a fixed stress schedule, train an RL agent that gets rewarded for maintaining RSA coherence and penalized for dangerous parameter regimes.
- Neural ODE formulation — embed the controller directly inside the differential equation as a learned forcing term, rather than a parameter-tuning wrapper.
- HRV analysis — compute heart rate variability metrics from the simulation output and display them alongside the waveforms. HRV is clinically meaningful and would make the demo more grounded.
- Noise injection — real biological oscillators have stochastic perturbations. Adding noise and watching the controller maintain coherence would be a more honest model.
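Of these, HRV is the easiest to prototype. RMSSD (root mean square of successive RR-interval differences) is one standard time-domain metric; here is a sketch, assuming RR intervals have already been extracted from the simulated cardiac peaks:

```python
import numpy as np

def rmssd(rr_ms):
    """RMSSD: root mean square of successive differences of RR intervals (ms)."""
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Synthetic RR series with an RSA-like oscillation: ~800 ms baseline
# modulated by a slower respiratory cycle
t = np.arange(60)
rr = 800.0 + 50.0 * np.sin(2 * np.pi * t / 15.0)
hrv = rmssd(rr)
```

A perfectly metronomic heart gives RMSSD of zero; a healthy, respiration-modulated rhythm gives a positive value — which is exactly the signal a dashboard panel could track as stress varies.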
The full source is available on GitHub.
Built with Rust, WebAssembly, PyTorch, ONNX, and Next.js. Both the simulation and the AI inference run entirely client-side in the browser.