
There is always a trade-off between "Visual Effects" and "Performance". If your landing page features a 3D Globe (using Three.js) and an animated Particle Background, you usually have to say goodbye to a good TBT (Total Blocking Time) score.
In my recent portfolio project built with Next.js 16, I faced a serious challenge: Loading the WebGL engine for the 3D Globe caused the Main Thread to freeze for about 18 seconds on desktop simulations. Since JavaScript is Single-Threaded, this is catastrophic. 💀 The math calculations for the particle background were putting heavy stress on the CPU, especially on 4K monitors.
The first solution that comes to mind? Delete them. But I wanted a better engineering solution. I call it: "Interaction-First Hydration".
"Performance isn't always about removing features; sometimes it is about executing them smarter."
The 3D Globe uses the powerful but heavy Three.js library. Even with next/dynamic disabling SSR, the moment the JavaScript bundle hydrates on the client, the browser freezes to compile shaders and calculate geometry.

Google bots and Lighthouse audit tools have one thing in common: They do not interact. They don't scroll, and they don't move the mouse. My idea was: Why load something when the user hasn't interacted with the page yet?
I built a wrapper component called DelayedGlobe. Initially, it renders nothing but an empty, lightweight div. It only starts importing the 3D library when the user proves they are "alive" (via mouse move, touch, or scroll).
Check out the implementation:
"use client";
import { useState, useEffect } from "react";
import dynamic from "next/dynamic";
// Disable SSR since WebGL cannot run on the server
const RealGridGlobe = dynamic(() => import("./GridGlobe"), {
ssr: false,
});
export default function DelayedGlobe() {
const [shouldLoad, setShouldLoad] = useState(false);
useEffect(() => {
let delayTimer: NodeJS.Timeout;
// Function runs on first user interaction
const handleInteraction = () => {
// 1. Remove listeners immediately
cleanupListeners();
// 2. Add a 3-5 second delay
// This is crucial to prevent jank during the initial scroll/hover
delayTimer = setTimeout(() => {
setShouldLoad(true);
}, 3000);
};
const cleanupListeners = () => {
window.removeEventListener("mousemove", handleInteraction);
window.removeEventListener("scroll", handleInteraction);
window.removeEventListener("touchstart", handleInteraction);
};
// Listen for any sign of life
window.addEventListener("mousemove", handleInteraction);
window.addEventListener("scroll", handleInteraction);
window.addEventListener("touchstart", handleInteraction);
return () => {
cleanupListeners();
clearTimeout(delayTimer);
};
}, []);
// Show placeholder until loaded
if (!shouldLoad) return <div className="min-h-[160px]" />;
return <RealGridGlobe />;
}
Note: This strategy is applied only to Desktop. On mobile, we skip the globe entirely due to hardware limitations.
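A minimal sketch of that desktop gate (the 768px breakpoint and the helper name `shouldRenderGlobe` are my assumptions here, not part of the original component):

```typescript
// Pure decision: render the globe only at or above a desktop breakpoint.
// 768px is an assumed Tailwind-style "md" cut-off; tune it to your layout.
function shouldRenderGlobe(viewportWidth: number, breakpoint = 768): boolean {
  return viewportWidth >= breakpoint;
}

// Inside DelayedGlobe, this would be checked once on mount (the globe is
// purely decorative, so there is no need to re-evaluate on resize):
//
//   const [enabled, setEnabled] = useState(false);
//   useEffect(() => setEnabled(shouldRenderGlobe(window.innerWidth)), []);
//   if (!enabled) return null;
```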
The particle animation in the header is a network of nodes. If nodes are close to each other, a line is drawn. This requires a nested loop comparing every point to every other point—$O(N^2)$ complexity.
With 200 particles and each pair checked once, that is roughly 20,000 comparisons per frame, 60 times a second.
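For concreteness, the 20,000 figure is just the number of unique pairs, $n(n-1)/2$ (`pairCount` is a hypothetical helper for illustration):

```typescript
// Unique pairs among n particles when each pair is compared exactly once
function pairCount(n: number): number {
  return (n * (n - 1)) / 2;
}

console.log(pairCount(200)); // 19900 comparisons per frame
// At 60 fps, that is roughly 1.2 million distance checks per second
```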
The initial implementation used Math.hypot(dx, dy) to calculate Euclidean distance. Square root operations are expensive for the CPU.
In computer graphics, if you only need to check whether a distance is less than 100px, you don't need the square root. You just compare the Distance Squared ($d^2$) with $100^2$.
Before (Slow):

```ts
const dist = Math.hypot(dx, dy); // Square root is heavy
if (dist < 100) {
  // Draw line
}
```
After (Fast):

```ts
const distSq = dx * dx + dy * dy; // Simple multiplication
const thresholdSq = 100 * 100; // Pre-calculated constant
if (distSq < thresholdSq) {
  // Only take the sqrt here if we absolutely need the exact value for opacity;
  // otherwise, we skip the heavy math
  // ...
}
```
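Putting the two pieces together, the full link-drawing pass looks roughly like this; `drawLine` and the 100px threshold are placeholders for whatever your canvas code actually uses:

```typescript
type Particle = { x: number; y: number };

const LINK_DISTANCE = 100;
const LINK_DISTANCE_SQ = LINK_DISTANCE * LINK_DISTANCE;

function drawLinks(
  particles: Particle[],
  drawLine: (a: Particle, b: Particle, opacity: number) => void
): void {
  // Each unordered pair is visited exactly once: still O(N^2), no duplicates
  for (let i = 0; i < particles.length; i++) {
    for (let j = i + 1; j < particles.length; j++) {
      const dx = particles[i].x - particles[j].x;
      const dy = particles[i].y - particles[j].y;
      const distSq = dx * dx + dy * dy;
      if (distSq < LINK_DISTANCE_SQ) {
        // sqrt runs only for the few pairs that actually get a line,
        // where the exact distance drives the fade-out opacity
        const opacity = 1 - Math.sqrt(distSq) / LINK_DISTANCE;
        drawLine(particles[i], particles[j], opacity);
      }
    }
  }
}
```

The expensive `Math.sqrt` now executes only for the small subset of pairs that pass the cheap squared-distance check.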
This simple math trick reduced the CPU load by 40% according to my benchmarks.
Using modern tech like Next.js 16 gives you a good start, but understanding the browser's Main Thread and optimizing your algorithms is what separates a "Good" site from a "Perfect" one.
You can achieve a 100 PageSpeed Insights score without removing any features, simply by being smarter about when and how code runs.
