🔮 Brain Visualizer · 7 min read

Watch Your AI Brain Grow in 3D — Live

Every lesson your AI learns becomes a glowing node in a real-time 3D galaxy. We built this because dashboards shouldn't be boring — and now it's the most addictive thing in our product.

The problem with invisible memory

When we shipped persistent AI memory for Claude Code, Cursor, and GitHub Copilot, the most common reaction was: "It works — but I can't see it working."

That's a real problem. If you can't see your AI learning, you don't trust it. If you don't trust it, you don't use it. And if you don't use it, it never gets better.

So we asked ourselves: what would make the learning visible? Not just a list of lessons — but something you can actually watch grow in real time?

Building the 3D Brain

The answer was Three.js. We rendered every topic your AI knows as a sphere floating in 3D space. Related topics cluster together. Topics your AI has used recently pulse brighter. The whole thing rotates slowly — a live galaxy of everything your AI has ever learned.

The technical implementation uses a Fibonacci sphere algorithm to distribute nodes evenly across a sphere. Node size scales with lesson count. Color encodes category (deploys, fixes, architecture, debugging, etc.). Edges connect related topics with luminescent beams.
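The layout itself is only a few lines. Here's a minimal sketch of the Fibonacci-sphere placement, assuming a plain Three.js setup (the function name and constants are illustrative, not our exact code):

import * as THREE from 'three';

// Golden angle: rotating by it each step spreads points evenly around the sphere
const GOLDEN_ANGLE = Math.PI * (3 - Math.sqrt(5)); // ≈ 2.39996 rad

function fibonacciSphere(count, radius) {
  const points = [];
  for (let i = 0; i < count; i++) {
    // y runs from 1 to -1; guard against division by zero for a single node
    const y = count > 1 ? 1 - (2 * i) / (count - 1) : 0;
    const r = Math.sqrt(1 - y * y);   // radius of the latitude circle at height y
    const theta = GOLDEN_ANGLE * i;
    points.push(new THREE.Vector3(
      Math.cos(theta) * r * radius,
      y * radius,
      Math.sin(theta) * r * radius
    ));
  }
  return points;
}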

// Core rendering loop (simplified). Assumes the THREE import above;
// lesson_count and category come from the brain-stats payload, and t is
// elapsed seconds, updated every frame (see the animate sketch below).
const geometry = new THREE.SphereGeometry(
  Math.log(lesson_count + 1) * 0.15 + 0.08  // radius grows with the log of lessons
);
const material = new THREE.MeshStandardMaterial({
  color: CATEGORY_COLORS[category],
  emissive: CATEGORY_COLORS[category],      // glow in the category color
  // sine remapped to [0, 1] so intensity pulses 0.3–1.0 and never goes negative
  emissiveIntensity: 0.3 + 0.7 * (0.5 + 0.5 * Math.sin(t * 2)),
});

The pulsing emissive intensity is the secret sauce. It makes the brain feel alive. Nodes breathe. They glow. When a new lesson arrives, the corresponding cluster lights up.
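A minimal sketch of the per-frame update that drives that pulse (the node list and clock wiring are assumptions, not the exact dashboard code):

// Update every node's emissive intensity once per frame
const clock = new THREE.Clock();

function animate() {
  const t = clock.getElapsedTime();         // seconds since the clock started
  for (const node of nodeMeshes) {          // nodeMeshes: one THREE.Mesh per topic
    node.material.emissiveIntensity = 0.3 + 0.7 * (0.5 + 0.5 * Math.sin(t * 2));
  }
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);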

What "watching AI learn" actually means

Here's a typical session flow with the Brain Visualizer open:

  1. You fix a bug with your AI assistant (Cursor, Claude Code, etc.)
  2. learn_from_attempts fires automatically — the lesson is stored in cachly
  3. The Brain Visualizer polls every 5 seconds via the GET /v1/instances/:id/brain-stats endpoint (see the polling sketch after this list)
  4. The debug cluster in the 3D galaxy grows by one node and briefly pulses violet
  5. Next session: your AI sees that node, recalls the exact fix, and doesn't repeat the mistake
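The polling step is deliberately boring. A minimal sketch, assuming a plain browser fetch (the function and handler names are illustrative):

// Poll brain-stats every 5 seconds and hand fresh data to the scene
function pollBrainStats(instanceId, onUpdate) {
  setInterval(async () => {
    const res = await fetch(`/v1/instances/${instanceId}/brain-stats`);
    if (!res.ok) return;             // transient failure: try again next tick
    onUpdate(await res.json());      // diff against the scene, add/pulse new nodes
  }, 5000);
}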

It sounds small. It isn't. Watching your AI get smarter in real time changes how you feel about AI-assisted development.

Phase 3: The God View — see every brain at once

For teams and admins, we went further. The God View (/admin/brain) shows a single galaxy that aggregates every brain in your organization.

This is what we call the Global Brain: the collective knowledge of every AI assistant your team uses, anonymized and merged into one 3D visualization. Universal topics float at the center. Anomalies — patterns that appear in one brain but not others — orbit at the edge in red.

The Global Brain API (GET /api/admin/global-brain) scans all running instances' Valkey stores, aggregates topics anonymously, and returns the merged knowledge graph in under 50ms — even across 50+ instances.
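Conceptually, the aggregation looks something like this. This is a sketch only: the Valkey key layout and client calls are assumptions, not our actual schema:

// Merge topics from every instance's Valkey store into one anonymous count map
async function buildGlobalBrain(instances) {
  const topicCounts = new Map();
  await Promise.all(instances.map(async (inst) => {
    const topics = await inst.valkey.smembers('brain:topics'); // assumed key layout
    for (const topic of topics) {
      topicCounts.set(topic, (topicCounts.get(topic) ?? 0) + 1);
    }
  }));
  // Topics every brain knows sit at the center; singletons orbit as anomalies
  return {
    universal: [...topicCounts].filter(([, n]) => n === instances.length),
    anomalies: [...topicCounts].filter(([, n]) => n === 1),
  };
}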

Phase 4: Predictive nodes — what your brain will learn next

The most surprising feature we shipped is predictive learning paths. After enough lessons, the brain can predict what it will learn next.

The algorithm is deliberately simple: we curated 10 common engineering learning sequences (e.g. deploy → fix → debug, api → docs → test) and match them against your brain's current state. Matches above a confidence threshold appear as dashed future-nodes in the visualizer — translucent spheres with a ? prefix.
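In sketch form, with an illustrative confidence heuristic and only two of the ten sequences shown:

// Curated learning sequences (two of ten shown here)
const SEQUENCES = [
  ['deploy', 'fix', 'debug'],
  ['api', 'docs', 'test'],
];

// How far along each sequence is this brain? Far enough, and the next
// step becomes a dashed future-node.
function predictNext(learnedTopics, threshold = 0.6) {
  const predictions = [];
  for (const seq of SEQUENCES) {
    let i = 0;
    while (i < seq.length && learnedTopics.has(seq[i])) i++;
    if (i > 0 && i < seq.length && i / seq.length >= threshold) {
      predictions.push({ topic: seq[i], confidence: i / seq.length });
    }
  }
  return predictions;
}

With learnedTopics = new Set(['deploy', 'fix']), this surfaces debug at roughly 0.67 confidence, just above the threshold.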

For teams, cross-instance predictions kick in: if three other brains on your team have all learned k8s:ingress after k8s:pods, your brain gets a prediction too.
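The team-level rule in sketch form, assuming each peer brain exposes its observed topic transitions (the data shape is illustrative):

// Suggest topics at least minPeers teammates learned right after topics you know
function crossInstancePredict(myTopics, peerBrains, minPeers = 3) {
  const counts = new Map();
  for (const peer of peerBrains) {
    for (const [from, to] of peer.transitions) { // e.g. ['k8s:pods', 'k8s:ingress']
      if (myTopics.has(from) && !myTopics.has(to)) {
        counts.set(to, (counts.get(to) ?? 0) + 1);
      }
    }
  }
  return [...counts].filter(([, n]) => n >= minPeers).map(([topic]) => topic);
}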

These predictions don't just show up in the visualizer — they surface in session briefings. Your AI might start a session with: "Based on your recent deploy work, you'll probably hit a TLS certificate issue next. Here's what your team learned about that."

Performance: 60fps, even at 300+ nodes

Three.js can get heavy fast. We made three decisions to keep it smooth:

  1. InstancedMesh for small nodes — nodes below a size threshold share a single draw call
  2. requestAnimationFrame budget cap — rotation speed auto-throttles below 30fps to keep the tab responsive (sketched after this list)
  3. Lazy loading — the Three.js canvas is loaded with dynamic(ssr: false) so it doesn't block the dashboard LCP
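A minimal version of the frame-budget throttle (the smoothing factors and speeds are illustrative, not the shipped values):

// Measure fps every frame; ease rotation down under load, recover gently
let lastFrame = performance.now();
let rotationSpeed = 0.002;

function animate(now) {
  const fps = 1000 / (now - lastFrame);
  lastFrame = now;
  if (fps < 30) rotationSpeed *= 0.95;                        // back off
  else rotationSpeed = Math.min(rotationSpeed / 0.95, 0.002); // recover
  scene.rotation.y += rotationSpeed;
  renderer.render(scene, camera);
  requestAnimationFrame(animate);
}
requestAnimationFrame(animate);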

On a MacBook Pro M3, the visualizer runs at 60fps with 300 nodes and 450 edges. On a low-end Android phone it drops to ~40fps, with auto-throttling keeping the rotation smooth.

Try it: it's live now

The Brain Visualizer is live for all cachly users at /instances/:id/brain. The God View is available at /admin/brain for admin users.

To get started: run npx @cachly-dev/init in your project. Your MCP server will start storing lessons from the first session. Open the visualizer and watch them appear.

The first time you see a new node light up because you fixed a bug — you'll understand why we built this.

cachly is a managed AI Brain for developers — persistent memory, team knowledge sharing, and semantic cache for Claude Code, Cursor, GitHub Copilot & Windsurf. One MCP server. 51 tools. Free tier, EU servers, no credit card.

Your AI is forgetting everything right now.

Every session starts blank. Every bug re-discovered. Every deploy procedure re-explained. cachly fixes that in 30 seconds — your AI remembers every lesson, every fix, every teammate's hard-won knowledge. Forever.

🇪🇺 EU servers · GDPR-compliant · 🆓 Free tier — forever, no credit card · ⚡ 30-second setup via npx · 🔌 Claude Code · Cursor · Copilot · Windsurf