Some features look simple on the surface. You press play, the notes trigger, the playhead moves. But under the hood, playback is war.
When we first wired up playback in Jiku Music, the visuals worked. The playhead moved. The grid responded. But the sound? It lagged. It stuttered. It felt disconnected.
We were animating time, but not commanding it.
The problem wasn't the oscillator nodes. It wasn't the gain envelopes. It was sync: the brutal challenge of aligning visual animation with audio scheduling in a browser.
Web Audio doesn't care about your grid. It cares about sample accuracy. Milliseconds. Microseconds. And if you're off by even a few milliseconds, the whole experience feels broken.
For two days, playback was a ghost. It looked alive, but it didn't sound alive.
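The scale of the error is easy to quantify. A minimal back-of-the-envelope sketch, assuming a 60 fps animation loop and a 120 BPM track (both figures are illustrative, not from the project): a note triggered "whenever the next frame happens to run" can land almost a full frame late, which is a large slice of a sixteenth note.

```javascript
// Rough numbers: why frame-driven note triggering sounds sloppy.
const frameMs = 1000 / 60;   // ~16.67 ms between animation frames at 60 fps
const beatMs = 60000 / 120;  // 500 ms between beats at 120 BPM

// A note fired from the animation loop lands on the first frame *after*
// its true beat time, so it can be late by nearly a whole frame.
const worstCaseLatenessMs = frameMs;

// Relative to a sixteenth note, that error is far from negligible:
const sixteenthMs = beatMs / 4;                          // 125 ms
const errorFraction = worstCaseLatenessMs / sixteenthMs; // ~13% of the note
```

Listeners reliably hear timing slop well below that threshold, which is why animating sound from the frame loop was never going to feel tight.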
We tried everything:
- Adjusting gain ramps
- Recalculating beat positions
- Rewriting the playhead logic
- Testing different oscillator types
- Logging every timestamp and delta
And then it clicked.
We stopped trying to animate sound. We started scheduling it.
We built a master clock, calculated beats per second from BPM, and scheduled every oscillator with precision: start time, stop time, gain envelope, frequency, volume.
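In outline, that scheduling approach looks something like the sketch below. This is a hedged reconstruction, not Jiku Music's actual code: `secondsPerBeat`, `noteStartTime`, and `scheduleNote` are hypothetical names, and the envelope times and waveform are placeholder values. The key idea is that every time is computed from one base timestamp on the audio clock (`AudioContext.currentTime` in the browser) rather than read off the animation loop.

```javascript
// Pure timing math, derived from BPM and a single base time.
function secondsPerBeat(bpm) {
  return 60 / bpm;
}

function noteStartTime(baseTime, beatIndex, bpm) {
  // Every note's start is an exact offset from the master clock's base.
  return baseTime + beatIndex * secondsPerBeat(bpm);
}

// Schedule one oscillator against the audio clock (browser only).
// `ctx` is an AudioContext; `baseTime` is typically captured once at play,
// e.g. baseTime = ctx.currentTime + 0.1 for a small safety margin.
function scheduleNote(ctx, { freq, beatIndex, bpm, baseTime, duration = 0.25, peak = 0.8 }) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.type = "square";
  osc.frequency.value = freq;
  osc.connect(gain).connect(ctx.destination);

  const t0 = noteStartTime(baseTime, beatIndex, bpm);

  // Short attack/release ramps avoid clicks at note boundaries.
  gain.gain.setValueAtTime(0, t0);
  gain.gain.linearRampToValueAtTime(peak, t0 + 0.005);
  gain.gain.linearRampToValueAtTime(0, t0 + duration);

  osc.start(t0);
  osc.stop(t0 + duration);
}
```

Because `osc.start(t0)` hands the timestamp to the audio engine itself, the note fires with sample accuracy no matter what the animation loop is doing; the playhead then only has to *follow* the audio clock, not drive it.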
Suddenly, playback snapped into place. The playhead moved. The notes triggered. The timing was tight.
It wasnโt just a fix. It was a transformation.
Playback became performance. Every note was now a promise: scheduled, shaped, and delivered exactly when it should be.
And now, when you press play in Jiku Music, you're not just triggering sound. You're activating a synchronized engine. A browser-native sequencer. A grid-aware, beat-locked, oscillator-driven playback system.
It took two days of debugging. But now it feels instant.
Because that's what great playback does: It disappears. It lets the music speak.

