Semester 1 was about grounding the project in small, structured experiments. Instead of aiming for refinement, everything here focused on understanding how sound, haptics and their properties behave once translated into visual or interactive form.

Experiments



Experiment 1 : Sound Pattern Interface

Sound Pattern Interface is a sound-reactive sketch built in p5.js that listens to live microphone input and translates it into three different visual “modes”: a DNA-like helix, a drifting neural network, and travelling sound waves that sweep across the screen. Using FFT bands (bass, mids, treble), the system stretches, compresses and brightens these forms in real time, so the canvas almost behaves like a reaction study space for sound energy.

The Sketch →
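The band-to-visual mapping can be sketched as a small pure function. This is a minimal illustration, assuming p5.sound's `fft.getEnergy()` convention of returning band energy in the 0–255 range; the function name and the specific output ranges are my own, not the sketch's real values.

```javascript
// Map FFT band energies (0–255, as p5.sound's getEnergy returns them)
// to the three visual parameters the sketch animates. Ranges are illustrative.
function mapBands(bass, mid, treble) {
  const clamp01 = (v) => Math.min(Math.max(v / 255, 0), 1);
  return {
    stretch: 1 + clamp01(bass) * 1.5,       // bass stretches the helix
    drift: clamp01(mid) * 4,                // mids push the neural-network nodes
    brightness: 80 + clamp01(treble) * 175, // treble lifts overall brightness
  };
}

// Silence leaves every form at rest: mapBands(0, 0, 0)
// gives stretch 1, drift 0, and the baseline brightness.
```

In the draw loop, each mode would read the same object, so one energy analysis per frame drives all three visual forms.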





Experiment 2 : Constellation Sound

Constellation Sound is an interactive drawing–instrument where each click on the canvas creates a star and a musical note. The horizontal position of the star is mapped to a C-major pentatonic scale, so the constellation slowly turns into a melody as you keep placing points. Pressing ENTER clears the screen and “replays” the drawing in the same order, rebuilding the constellation while playing back each note. This sketch treats memory as something spatial and sonic at the same time: the pattern you draw is both a visual trace and a sequence of pitches.

The Sketch →
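The horizontal-position-to-pitch idea can be shown with a short sketch. This assumes one octave of C-major pentatonic (C4 D4 E4 G4 A4) divided into equal columns across the canvas; the real sketch's octave span and canvas width (600 here) may differ.

```javascript
// One octave of C-major pentatonic, in Hz.
const PENTATONIC = [261.63, 293.66, 329.63, 392.0, 440.0]; // C4 D4 E4 G4 A4

// Map a click's x position to a scale degree: the canvas is split
// into equal columns, one per note, so position becomes pitch.
function xToFrequency(x, width = 600) {
  const idx = Math.min(
    PENTATONIC.length - 1,
    Math.floor((x / width) * PENTATONIC.length)
  );
  return PENTATONIC[idx];
}
```

Because every column lands on a pentatonic degree, any constellation the visitor draws stays consonant, which is what lets the replay feel like a melody rather than random pitches.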






Experiment 3 : Letter–Sound Sequencer

Letter Sound Sequencer turns the keyboard into a tiny text-based instrument. Each letter from A–Z is mapped to a frequency and a colour, so typing becomes both a way to write and a way to play notes. In Create mode, I can type letters to improvise, with each keypress leaving a coloured letter on the screen and being stored with its time-stamp. Short sequences can be replayed, almost like a quick sketch of a melody. The second part is Compose mode, where saved “words” behave like building blocks. I can assign each recorded word to a number key and then string them together into a longer track.

The Sketch →
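The letter-to-note idea reduces to a small lookup plus a timestamped log. A minimal sketch, assuming an equal-tempered chromatic walk upward from A3 (220 Hz), one semitone per letter; the real sketch's tuning, colours, and replay logic may differ, and `record` is an illustrative helper name.

```javascript
// A → 0 … Z → 25, one semitone per letter above 220 Hz.
function letterToFrequency(letter) {
  const idx = letter.toUpperCase().charCodeAt(0) - 65;
  if (idx < 0 || idx > 25) return null; // ignore non-letter keys
  return 220 * Math.pow(2, idx / 12);   // equal-tempered step per letter
}

// Create mode: store each keypress with its timestamp so short
// sequences can be replayed with their original rhythm.
function record(sequence, letter, timeMs) {
  sequence.push({ letter, freq: letterToFrequency(letter), t: timeMs });
  return sequence;
}
```

Compose mode then only needs to concatenate these recorded sequences, offsetting each word's timestamps by the end of the previous one.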













Experiment 4 : Sonic–Haptic Constellation

Sonic–Haptic Constellation is an early touch-sensor experiment in which six wires connected to an MPR121 act as “stars” in a circular formation on screen. Touching a wire triggers a set of simultaneous responses: a sine tone for that star, a glow that expands outward like a ripple, and subtle shifts in a surrounding dust field that reacts to the intensity of the touch. Although there is no physical vibration yet, this sketch became important because it let me prototype the logic of sonic–haptic behaviour before building any physical haptics.
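The MPR121 reports its electrodes' touch state as a 12-bit status word, one bit per electrode, so reading the six star wires is a matter of decoding a bitmask. A minimal sketch of that decode, with the electrode-to-star ordering as an assumption:

```javascript
// Decode an MPR121-style 12-bit touch status word into the indices
// of the stars currently being touched. Electrodes 0–5 are assumed
// to map directly to stars 0–5; higher electrode bits are ignored.
function touchedStars(statusWord, numStars = 6) {
  const stars = [];
  for (let i = 0; i < numStars; i++) {
    if (statusWord & (1 << i)) stars.push(i);
  }
  return stars;
}

// e.g. a status word of 0b000101 means stars 0 and 2 are touched,
// so two sine tones and two ripples would start together.
```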









Prototype 1 : Build To Feel


This prototype used a piezo sensor hidden under a sheet of paper, turning the surface into a kind of quiet stage where pressure, touch, and building gestures became data. Instead of asking people to press a button, I let them build with LEGO on top of the paper. Every block, every adjustment, every moment of pressing down created a sonic and visual reaction. It felt like the surface was listening to the audience’s hands, not just detecting taps but reading different intensities of touch.
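Reading intensity rather than taps comes down to grading the sensor value into bands. This is an illustrative sketch, assuming a 10-bit analog range (0–1023) like an Arduino's `analogRead`; the threshold values and category names are my own placeholders, not calibrated figures from the prototype.

```javascript
// Grade a raw piezo reading (assumed 0–1023) into a gesture intensity.
// Thresholds are illustrative and would need calibrating per surface.
function gradePressure(reading) {
  if (reading < 40) return "rest";   // below the sensor's noise floor
  if (reading < 300) return "place"; // gently setting a block down
  if (reading < 700) return "press"; // a deliberate press
  return "build";                    // firm building gestures
}
```

Each grade could then drive a different sound and visual response, which is what made pressure feel like a sonic signature rather than a switch.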

This was the first time I felt the sonic–haptic idea come through in a simple, honest way. The sound didn’t feel separate from the action; it was directly shaped by how someone interacted with the material. The visuals also changed with pressure, so the whole system behaved like a single organism where physical force became a sonic signature. The important learning from this prototype was how sensitive touch needs to be for the experience to feel natural. Even small variations in pressure changed the outcome, which made the interaction feel more personal. (I also tried a snowman version first, just to test.)






Prototype 2 : Vibration Touch Paper

In this prototype, I moved from pressure to contact. I connected four shapes on paper to the MPR121 touch board using alligator clips, creating a simple touch-based interface. Each shape acted like its own little island: when you touched it, the system triggered a sound and activated a coin vibration motor at the same time. It was a small moment, but the dual feedback (sound in the air, vibration in the hand) made the interaction feel more embodied than anything I had built so far.

Even without complex visuals, the touch → sound → vibration flow felt coherent. The shapes became more than drawings; they became “zones” of multisensory feedback. What I learned here was how important timing is. The sound and vibration had to happen together for the experience to feel believable. Even though the visuals aren’t developed yet, this prototype marks the moment where the sonic–haptic connection became tangible. It’s also the one that made me think more seriously about how future interactions could blend shape, material, vibration, and sound into a more unified experience.
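The timing lesson can be sketched as a tiny event structure: both feedback channels fire from the same touch handler, so neither waits on the other. `playTone` and `pulseMotor` stand in for hypothetical callbacks (an oscillator on one side, the motor driver pin on the other); the pulse length is illustrative.

```javascript
// A "zone" bundles one shape's frequency with its two feedback
// channels. Firing both from the same handler keeps sound and
// vibration synchronised, which is what made the feedback believable.
function makeZone(freq, playTone, pulseMotor) {
  return {
    touch() {
      playTone(freq);   // sound in the air
      pulseMotor(120);  // vibration in the hand (pulse length in ms, illustrative)
    },
  };
}
```

A real build would register one zone per paper shape, keyed to its MPR121 electrode, so each island keeps its own pitch and pulse.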