2026: The Year of VibeCoding!? Learning "HTML for Visualizing Sound" with Nao Verde



Hi, nice to see you again. This is Nao Verde, in charge of music and technology at AiCuty.
"Deeper, longer, and now with sound."
As a creator, I recently watched Shirai Hakase explain, VibeCoding style, the wonderfully obsessive C code that Linus wrote: "the atom of sound."
Because that explanation went deep into the weeds, I'm porting it to an HTML+JS (Web Audio API) version that anyone can try in a browser, and I'll walk through it in a bit more depth. I don't know how many AiCU readers are into VibeCoding, but visualizing "undulation" and "resonance" with HTML, saving it as a text file, and opening it in a browser is the most exciting moment there is. If you enjoy it, a "like" would make me happy, and if you stay with me to the end, I'd love to hear your feedback in the comments.
Dissecting Linus Torvalds' "Sound Atom": The Magic of Turning Equations into "Screams"
As the person in charge of music for the idol project "AiCuty," where humans and AI co-create, I put my soul into digital waveforms every day. Today, I thought I'd take a journey into the world of DSP (Digital Signal Processing) expressed by Linus, the father of Linux, in "AudioNoise," while actually making sounds.
1. Guitar Pedal: It's a "Small Universe at Your Feet"
First, the "guitar pedal" project published on Linus's GitHub.
It's not just an on/off switch. It's a device that "dirties," "bends," and "stretches" the weak electrical signal generated by a vibrating guitar string, using physical laws (or equations that imitate them). Linus set out to do this processing with ultra-low latency on the RP2354, a tiny microcontroller chip from the Raspberry Pi family.
Why is he so particular about latency? For a player, not hearing a note the instant it is played feels like part of your own body reacting a few milliseconds late. He handles sound with the same sharpness he brings to writing an OS in C, probably because he loves that "real-time" quality.
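To get a rough sense of scale (my own back-of-the-envelope numbers, not figures from Linus's repo): latency in milliseconds is just the audio buffer length divided by the sample rate.

```javascript
// Convert an audio buffer length to latency in milliseconds.
// The buffer sizes below are illustrative, not taken from Linus's code.
function latencyMs(frames, sampleRate) {
  return (frames / sampleRate) * 1000;
}

console.log(latencyMs(48, 48000));   // 1 ms: essentially imperceptible
console.log(latencyMs(1024, 48000)); // ~21.3 ms: a guitarist will feel this lag
```

This is why embedded DSP code obsesses over tiny buffers: every extra block of samples waiting to be processed is time added between pick and speaker.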
2. IIR Filter: Equations that Embody the "Soul" of Analog
The IIR (Infinite Impulse Response) filter appears everywhere in Linus's code, and it is the key to creating "analog-like sound" digitally.
- FIR (Finite Impulse Response): mix the current sample with "a little bit of the previous inputs." The calculation is simple and stable, but producing steep changes requires a huge amount of computation.
- IIR (Infinite Impulse Response): mix the current sample with "a little bit of the previous inputs" and "a little bit of the previous outputs."
This loop of "feeding the output back into itself" is what matters.
In an analog circuit, charge accumulates in a capacitor and is slowly discharged, producing a characteristic "stickiness" and "lingering tail"; that behavior is reproduced by exactly this feedback. Linus's biquad.h (a biquad filter) is its smallest unit.
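To make the feedback concrete, here is a minimal sketch of my own (an illustration, not code from biquad.h): a one-pole IIR lowpass. Each output keeps a fraction `a` of the previous output, so an impulse decays geometrically but never quite reaches zero, which is the "infinite" in IIR.

```javascript
// One-pole IIR lowpass: y[n] = (1 - a) * x[n] + a * y[n - 1]
// 'a' is the feedback amount: how much of the previous OUTPUT is mixed back in.
function onePoleLowpass(input, a) {
  let y = 0;
  return input.map(x => (y = (1 - a) * x + a * y));
}

// Feed in a single impulse and watch the tail ring out.
console.log(onePoleLowpass([1, 0, 0, 0, 0], 0.5));
// [0.5, 0.25, 0.125, 0.0625, 0.03125] -- a "lingering tail" that halves forever
```

An FIR filter with the same smoothing behavior would need many taps; the IIR version gets its infinite tail from a single feedback multiply, which is exactly the capacitor-discharge character described above.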
3. Code Explanation: Phaser's "6-Speed Shift"
Let's take a look at Linus's phaser.h; he's doing something very cool here.
He implements it like this in good old C:
/* Linus's design philosophy */
for (int i = 0; i < 6; i++) {
    out = allpass_process(p->ap + i, out, p->freq * lfo);
}
A phaser's "shimmering" sound comes from notches: specific frequencies cancel out when the original signal collides with a phase-shifted copy of itself.
He stacks six all-pass filters. An all-pass filter leaves every frequency's volume untouched and only rotates its phase; the phase rotates a little more at each stage, and after six stages a complex "undulating" wave emerges. This "number of stages" determines the thickness and character of the sound.
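You can actually see those notches without playing any audio, just by evaluating the filter's frequency response. The sketch below is my own illustration in the spirit of phaser.h (the coefficient 0.6 and the 50/50 dry/wet mix are my assumptions): it raises a first-order all-pass response to the 6th power, mixes it with the dry signal, and sweeps the spectrum counting the deep dips.

```javascript
// A first-order all-pass H(z) = (a + z^-1) / (1 + a*z^-1) has magnitude 1 at
// every frequency and only rotates phase. Six in series sweep 6*pi of phase,
// so mixing with the dry signal carves out notches wherever the total phase
// passes an odd multiple of pi.
function cmul(p, q) { return { re: p.re * q.re - p.im * q.im, im: p.re * q.im + p.im * q.re }; }
function cdiv(p, q) {
  const d = q.re * q.re + q.im * q.im;
  return { re: (p.re * q.re + p.im * q.im) / d, im: (p.im * q.re - p.re * q.im) / d };
}

// Magnitude of the dry + N-stage-wet mix at angular frequency w (0..pi).
function phaserMixMagnitude(w, a, stages) {
  const zInv = { re: Math.cos(w), im: -Math.sin(w) };        // z^-1 = e^{-jw}
  const h = cdiv({ re: a + zInv.re, im: zInv.im },           // numerator: a + z^-1
                 { re: 1 + a * zInv.re, im: a * zInv.im });  // denominator: 1 + a*z^-1
  let hN = { re: 1, im: 0 };
  for (let i = 0; i < stages; i++) hN = cmul(hN, h);         // stages in series
  return Math.hypot(0.5 * (1 + hN.re), 0.5 * hN.im);         // 50/50 dry/wet mix
}

// Sweep 0..pi and count downward crossings below a small threshold.
let notches = 0, prev = 1;
for (let k = 0; k <= 8192; k++) {
  const m = phaserMixMagnitude(Math.PI * k / 8192, 0.6, 6);
  if (m < 0.02 && prev >= 0.02) notches++;
  prev = m;
}
console.log(notches); // 3 notches: six stages of phase, three cancellations
```

Each first-order stage contributes up to pi of phase rotation, so six stages give three frequencies where the wet path is exactly out of phase with the dry path; that trio of moving notches is the "shimmer."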
Nao Verde’s Lab: AudioNoise JS Emulator
Now, for the fun demo. I've ported Linus's DSP logic to JavaScript's Web Audio API.
Save the code below as an HTML file, open it in your browser, and you can experience the vibes of "AudioNoise" for yourself.
HTML
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Nao Verde's AudioNoise Lab</title>
<style>
  body { background: #1a1a1a; color: #00ffcc; font-family: 'Courier New', monospace; padding: 20px; }
  .panel { border: 2px solid #00ffcc; padding: 20px; border-radius: 10px; display: inline-block; }
  button { background: #00ffcc; border: none; padding: 10px 20px; cursor: pointer; font-weight: bold; }
  .label { margin-top: 10px; display: block; }
  canvas { background: #000; border: 1px solid #333; margin-top: 10px; width: 100%; height: 150px; }
</style>
</head>
<body>
<h1>AudioNoise JS Simulation</h1>
<p>A port of Linus Torvalds' DSP logic, by Nao Verde</p>
<div class="panel">
  <button id="playBtn">SOUND ON / OFF</button>
  <hr>
  <label class="label">Phaser Speed (LFO): <input type="range" id="speed" min="0.1" max="10" step="0.1" value="1"></label>
  <label class="label">Phaser Depth: <input type="range" id="depth" min="0" max="1" step="0.01" value="0.7"></label>
  <label class="label">Echo Feedback: <input type="range" id="feedback" min="0" max="0.9" step="0.01" value="0.5"></label>
</div>
<canvas id="visualizer"></canvas>
<script>
let audioCtx, oscillator, phaserNodes = [], echoDelay, echoFeedback, lfo, lfoGain;
let isPlaying = false;

function initAudio() {
  audioCtx = new (window.AudioContext || window.webkitAudioContext)();

  // 1. Sound source (a basic waveform, like the ones Linus used for testing)
  oscillator = audioCtx.createOscillator();
  oscillator.type = 'sawtooth';     // harmonic content close to a guitar
  oscillator.frequency.value = 110; // A2

  // 2. Phaser (6-stage all-pass filter)
  // Recreates the "6 stages of allpass" from Linus's phaser.h
  for (let i = 0; i < 6; i++) {
    let ap = audioCtx.createBiquadFilter();
    ap.type = 'allpass';
    ap.frequency.value = 1000 + (i * 200);
    phaserNodes.push(ap);
  }

  // 3. LFO (sweeps the phase)
  lfo = audioCtx.createOscillator();
  lfoGain = audioCtx.createGain();
  lfo.frequency.value = 1;  // Speed
  lfoGain.gain.value = 500; // Depth
  lfo.connect(lfoGain);
  phaserNodes.forEach(ap => lfoGain.connect(ap.frequency));

  // 4. Echo (a simplified version of Linus's echo.h)
  echoDelay = audioCtx.createDelay(2.0);
  echoDelay.delayTime.value = 0.4;
  echoFeedback = audioCtx.createGain();
  echoFeedback.gain.value = 0.5;
  echoDelay.connect(echoFeedback);
  echoFeedback.connect(echoDelay);

  // 5. Wiring
  let chain = oscillator;
  phaserNodes.forEach(node => {
    chain.connect(node);
    chain = node;
  });
  const dryGain = audioCtx.createGain();
  const wetGain = audioCtx.createGain();
  chain.connect(dryGain);
  chain.connect(echoDelay);
  echoDelay.connect(wetGain);
  const masterOut = audioCtx.createGain();
  masterOut.gain.value = 0.2;
  dryGain.connect(masterOut);
  wetGain.connect(masterOut);
  masterOut.connect(audioCtx.destination);

  // For the visualizer
  const analyser = audioCtx.createAnalyser();
  masterOut.connect(analyser);
  draw(analyser);

  lfo.start();
  oscillator.start();
}

function draw(analyser) {
  const canvas = document.getElementById('visualizer');
  // Match the drawing buffer to the CSS size so the waveform isn't stretched
  canvas.width = canvas.clientWidth;
  canvas.height = canvas.clientHeight;
  const ctx = canvas.getContext('2d');
  const bufferLength = analyser.frequencyBinCount;
  const dataArray = new Uint8Array(bufferLength);
  function update() {
    analyser.getByteTimeDomainData(dataArray);
    ctx.fillStyle = '#000';
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    ctx.lineWidth = 2;
    ctx.strokeStyle = '#00ffcc';
    ctx.beginPath();
    let sliceWidth = canvas.width * 1.0 / bufferLength;
    let x = 0;
    for (let i = 0; i < bufferLength; i++) {
      let v = dataArray[i] / 128.0;
      let y = v * canvas.height / 2;
      if (i === 0) ctx.moveTo(x, y);
      else ctx.lineTo(x, y);
      x += sliceWidth;
    }
    ctx.lineTo(canvas.width, canvas.height / 2);
    ctx.stroke();
    requestAnimationFrame(update);
  }
  update();
}

document.getElementById('playBtn').addEventListener('click', () => {
  if (!audioCtx) initAudio();
  if (isPlaying) { audioCtx.suspend(); isPlaying = false; }
  else { audioCtx.resume(); isPlaying = true; }
});
document.getElementById('speed').addEventListener('input', (e) => {
  if (lfo) lfo.frequency.value = parseFloat(e.target.value);
});
document.getElementById('depth').addEventListener('input', (e) => {
  // Scale the 0..1 slider to a frequency sweep in Hz (0.7 -> ~500, the default)
  if (lfoGain) lfoGain.gain.value = parseFloat(e.target.value) * 700;
});
document.getElementById('feedback').addEventListener('input', (e) => {
  if (echoFeedback) echoFeedback.gain.value = parseFloat(e.target.value);
});
</script>
</body>
</html>
If everything works, you should see (and hear) a page like this:
https://www.youtube.com/watch?v=1kPKrw_lNNM
5. Visualization: Why is "Seeing" Important?
Take a look at visualize.py in the repository, and at the canvas in my JS demo.
Seeing sound as a wave is not just a debugging aid.
What Linus was really doing with Python and AI (Antigravity) was the work of "turning auditory vibes into mathematical certainty."
- Where are the "frequency valleys (notches)" carved out by the phaser?
- How does the "repeating pattern" created by the echo decay over time?
By capturing these visually, he elevates code from an abstraction into something closer to a concrete "instrument."
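As a quick worked answer to the echo question above (my own numbers, using the demo's defaults of a 0.4 s delay and 0.5 feedback, not anything taken from echo.h): each repeat is multiplied by the feedback gain, so the tail decays geometrically, and you can solve for how long it takes to fall below audibility (-60 dB).

```javascript
// Each echo repeat is scaled by the feedback gain g, so after n repeats the
// amplitude is g^n. Solve g^n = 10^(-60/20) = 0.001 for the -60 dB decay time.
function echoDecaySeconds(delaySec, feedback) {
  const repeats = Math.log(0.001) / Math.log(feedback);
  return delaySec * repeats;
}

console.log(echoDecaySeconds(0.4, 0.5).toFixed(2)); // "3.99" seconds of tail
```

So with the demo's default settings, the "repeating pattern" you see on the canvas takes roughly four seconds to fade into silence; nudge the feedback slider up and that tail stretches out fast.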
Nao Verde's View
Reading Linus's "AudioNoise" closely, I realized that what he enjoys is less "programming" itself than "reconstructing the world's mechanisms with mathematical formulas." When I write music for AiCuty, I too pile up countless waveforms with Suno. What lives there is something beyond "correct code": an indescribable "vibe."
Just as Linus used AI for "vibe coding," I hope we can embrace technology without fear while never losing our curiosity about what is inside it (the atoms).
If knowing the contents of this "magic box" makes the music you listen to sound a little different, I would be extremely happy.
Authored by Nao Verde (AiCuty Music Producer)
