RESONANT SPACE
Feb. 2025
Speculative Design
Video Link
Resonant Space explores how biometric signals—specifically heartbeat and pulse—can serve as the foundation for an immersive visual experience rooted in healing and embodiment.
By translating real-time physiological rhythms into generative particle motion, and allowing hand gestures to guide their flow, the system constructs a space of resonance between body and environment—one that listens and responds to the user.
Rather than relying on spectacle or passive immersion, Resonant Space encourages users to enter interaction through their own rhythms.
Through this, it fosters a sense of calm, presence, and agency—turning the body from a sensed object into an active co-creator within the environment.
PROBLEM:
In design, we often emphasize "interactivity" and "immersion," but do these approaches truly help people reconnect with their bodies and emotions?
- "Interactive systems" are essentially pre-scripted processes. Users are guided through a fixed experience rather than leading it.
- "Immersive works" create environments for people to enter, but not necessarily ones they can shape through their own rhythms.
In these systems, the user is often a passive subject—being sensed, being guided, being immersed—but not actively resonating.
1. Can we design visual and interactive systems that support a user’s autonomous entry into immersion, instead of pulling them into one?
2. Can we shift from experiences that are designed for the body to experiences driven by the body?
3. Can the user become an active co-creator, rather than someone merely being observed or processed?
RESEARCH:
Rhythmic entrainment refers to the phenomenon in which external rhythms synchronize with internal bodily rhythms, such as heartbeat or breathing, leading to states of calm, focus, or even meditation. This principle is widely used in music therapy, neurofeedback training, and somatic mindfulness practices.
Humans are naturally responsive to rhythm. When the rhythm of the environment resonates with the rhythm of the body, the nervous system tends to regulate itself, promoting a sense of balance and groundedness.
In this project, this principle is made tangible: the visual behavior of particles is driven directly by the user’s physiological data.
For example:
- Heartbeat (BPM) controls the pulsing or breathing-like motion of the particles;
- Pulse signal modulates brightness or subtle fluctuations in appearance.
This design creates a visual field that breathes with the user’s own rhythm. The environment becomes a mirror of the body’s state, enabling the user to enter a deeper state of self-awareness through sensory alignment.
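For illustration, here is a minimal Python sketch of such a mapping. The value ranges and helper names are assumptions made for the example, not the exact parameters of the TouchDesigner network:

def remap(value, in_lo, in_hi, out_lo, out_hi):
    # Linearly remap value from [in_lo, in_hi] to [out_lo, out_hi], clamped.
    t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

def map_biometrics(bpm, raw_pulse):
    # Heart rate sets the period of the particles' breathing-like pulsation:
    # one visual "breath" per beat.
    pulse_period_s = 60.0 / max(bpm, 1.0)
    # The raw pulse wave (a 10-bit ADC range is assumed here) modulates
    # brightness around a comfortable baseline.
    brightness = remap(raw_pulse, 400, 700, 0.4, 1.0)
    return pulse_period_s, brightness

# Example: a 72 BPM heartbeat and a mid-range pulse sample.
period, bright = map_biometrics(72, 560)   # ~0.83 s per pulse, ~0.72 brightness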
Healing is not only about being seen; it is also about feeling that one can effect change in one's environment. The sense of agency, of being able to act and be responded to, is fundamental to safety, presence, and emotional regulation.
To embody this, my project includes a gesture-based interaction module:
- The user’s finger movements are tracked in real time;
- The motion of the fingertips influences the behavior of the particles—creating vortexes, attracting or dispersing flow.
This serves two key functions:
- First, it brings the user’s attention back to their own body, focusing perception through the hand;
- Second, it establishes a responsive feedback loop—you move, and the environment responds.
This transforms the interaction into a co-regulated, participatory experience, allowing the user to enter immersion not by being absorbed, but by actively shaping the sensory field.
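To make the feedback loop concrete, here is a minimal sketch of a fingertip-driven force field. The softened falloff and the mode names are illustrative assumptions, standing in for the metaball and force setup described in the process section below:

def finger_force(px, py, fx, fy, mode="attract", strength=1.0):
    # Force exerted on a particle at (px, py) by a fingertip at (fx, fy).
    dx, dy = fx - px, fy - py
    falloff = strength / (dx * dx + dy * dy + 0.05)   # softened, avoids divide-by-zero
    if mode == "attract":                  # pull particles toward the hand
        return dx * falloff, dy * falloff
    if mode == "disperse":                 # push particles away from the hand
        return -dx * falloff, -dy * falloff
    return -dy * falloff, dx * falloff     # "vortex": swirl around the fingertip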
PROCESS:
1. Pulse Sensor with ESP32:
The sensor continuously captures the user’s heartbeat and pulse wave data, and transmits it to the computer via serial communication.
The code reads both the heart rate (BPM) and raw pulse signal, which are used to drive particle behavior in TouchDesigner, creating real-time visuals that resonate with the body’s internal rhythm.
CODE (Arduino IDE):
#define USE_ARDUINO_INTERRUPTS false   // poll the sensor manually instead of using timer interrupts
#include <PulseSensorPlayground.h>

const int PULSE_INPUT = 12;            // analog pin wired to the pulse sensor
const int THRESHOLD = 550;             // signal level that counts as a beat
const int DETECTION_INTERVAL = 1000;   // report values once per second (ms)

PulseSensorPlayground pulseSensor;
unsigned long lastDetectionTime = 0;

void setup() {
  Serial.begin(9600);
  pulseSensor.analogInput(PULSE_INPUT);
  pulseSensor.setThreshold(THRESHOLD);
  if (pulseSensor.begin()) {
    Serial.println("Pulse Sensor initialized successfully");
  }
}

void loop() {
  // In non-interrupt mode the library must be polled every pass through
  // loop() so it can process new samples and keep the BPM estimate current.
  pulseSensor.sawNewSample();

  if (millis() - lastDetectionTime >= DETECTION_INTERVAL) {
    lastDetectionTime = millis();
    int pulseValue = pulseSensor.getBeatsPerMinute();   // current heart rate (BPM)
    int rawSignal = analogRead(PULSE_INPUT);            // raw pulse waveform sample
    // Send "BPM,raw" as one line; TouchDesigner splits it on the comma.
    Serial.print(pulseValue);
    Serial.print(",");
    Serial.println(rawSignal);
  }
}
2. From Arduino to TouchDesigner:
This diagram shows how data from the ESP32 is received and parsed in TouchDesigner for real-time visual control.
The Serial DAT receives the incoming data stream, and a Python script in the Text DAT splits the values into heart rate (BPM) and raw pulse signal.
These values are assigned to two Constant CHOPs, which can then be used to drive particle behavior or other visual parameters.
The setup enables a continuous, synchronized connection between biometric input and generative visuals.
TEXT DAT - CODE:
def onReceive(dat, rowIndex, message, byteData):
    try:
        # Each serial line arrives as "BPM,raw"; split it on the comma.
        parts = message.strip().split(',')
        if len(parts) != 2:
            return
        bpm = float(parts[0])
        raw = float(parts[1])
        # Write both values into the Constant CHOP named 'value' so they
        # can drive particle parameters downstream.
        op('value').par.value0 = bpm
        op('value').par.value1 = raw
    except Exception as e:
        debug('Serial Parse Error:', e)
3. Hand Gesture Control Flow:
This diagram shows how real-time hand gesture data from MediaPipe is used to control particle behavior in TouchDesigner.
The middle_finger_tip coordinates are extracted and processed through Math CHOPs to adjust the center position and translation of a metaball and force field.
This allows users to directly influence particle motion using finger movements, creating a responsive and embodied interaction.
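For reference, a minimal standalone sketch of extracting the middle fingertip with MediaPipe is shown below. How the coordinates then reach TouchDesigner (for example, through a MediaPipe component or an OSC bridge) is omitted here, so treat the print step as a placeholder:

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            tip = results.multi_hand_landmarks[0].landmark[
                mp_hands.HandLandmark.MIDDLE_FINGER_TIP]
            # tip.x and tip.y are normalized to 0-1; these are the values
            # that position the metaball and force field.
            print(tip.x, tip.y)

cap.release()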
4. Biometric + Gesture-Driven Particle Control:
This diagram illustrates how heartbeat, pulse wave, and hand gesture data are mapped to control a real-time particle system in TouchDesigner.
- Pulse signal drives radial force intensity and particle lifespan.
- Heartbeat (BPM) influences particle color and behavior via math and noise modulation.
- Middle finger tip position (from MediaPipe) defines the directional force vector, allowing users to guide particle flow with hand motion.
Together, these inputs create a dynamic visual field where the environment responds fluidly to the user’s internal rhythms and gestures.
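An illustrative sketch of the combined mapping follows. The numeric ranges are assumptions, and the actual network performs these steps with Math and Noise CHOPs rather than a single function:

def remap(value, in_lo, in_hi, out_lo, out_hi):
    # Linearly remap value from [in_lo, in_hi] to [out_lo, out_hi], clamped.
    t = max(0.0, min(1.0, (value - in_lo) / (in_hi - in_lo)))
    return out_lo + t * (out_hi - out_lo)

def map_inputs(bpm, raw_pulse, tip_x, tip_y):
    # Pulse wave -> radial force intensity and particle lifespan.
    radial_force = remap(raw_pulse, 400, 700, 0.0, 2.0)
    lifespan_s = remap(raw_pulse, 400, 700, 1.0, 4.0)
    # Heart rate -> hue: slower rhythms drift toward cooler colors.
    hue = remap(bpm, 50, 120, 0.6, 0.0)
    # Fingertip position (normalized 0-1) -> directional force vector,
    # measured from the center of the frame.
    dir_x, dir_y = tip_x - 0.5, tip_y - 0.5
    return radial_force, lifespan_s, hue, (dir_x, dir_y)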