Module 3 Activity Research

Weekly Activity Template

Xinyu Lu


Project 3


Module 3

This module is an extension of the research and explorations from Project 1 and Project 2, continuing our focus on how real-time data can shape interactive visual experiences. While the previous projects used environmental inputs such as images and light, Project 3 shifts toward sensing the user’s physical state through a force sensor and a pulse sensor, creating a more personal and responsive system.
In this project, we built a prototype that combines Arduino and TouchDesigner to turn body-based data into visual feedback. When the user leans on the cushion, the force sensor detects pressure changes, and the pulse sensor on the wristband reads heart-rate values. These signals are sent to TouchDesigner, where they control different visual modes: fast, intense visuals appear when the user's heart rate is high, while slow, calm visuals appear when the user is relaxed.
Through this module, we explored sensor integration, data processing, and media output mapping. We learned how to connect multiple sensors, stabilize their readings, and translate live data into animated particle systems and geometric effects. The final outcome is an interactive relaxation device that supports self-awareness and emotional regulation, continuing the overall design goal established in earlier projects—using real-time data to help users feel more present, calm, and connected to their environment.
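The heart-rate-to-visual-mode mapping described above can be sketched in a few lines. This is an illustrative Python sketch, not our actual TouchDesigner network: the 85 BPM threshold and the hysteresis band are assumptions for the example, since the real cutoff was tuned inside TouchDesigner.

```python
# Illustrative sketch of mapping heart rate to a visual mode.
# The threshold and hysteresis values are assumptions for this
# example; the real cutoff was tuned in TouchDesigner.

CALM_THRESHOLD = 85   # BPM; below this we show calm visuals
HYSTERESIS = 5        # dead band so modes don't flicker

def choose_mode(bpm, current_mode="calm"):
    """Return 'calm' or 'intense' for a given BPM, with
    hysteresis so the visuals do not flip back and forth
    when the heart rate hovers near the threshold."""
    if current_mode == "calm" and bpm > CALM_THRESHOLD + HYSTERESIS:
        return "intense"
    if current_mode == "intense" and bpm < CALM_THRESHOLD - HYSTERESIS:
        return "calm"
    return current_mode
```

The hysteresis band matters in practice: a raw pulse reading sitting right at the threshold would otherwise make the visuals switch modes every few frames.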

Workshop 1

<b>Tools and props</b><br>These are the cardboard tools and props we made for our smart kitchen workshop. We created items like a recipe card, ingredients, and cooking tools to support the apple pie scenario. They helped us act out how the smart kitchen gives instructions to the user.

<b>Smart fridge</b><br>This cardboard box represents our smart fridge. When the user opens it, the fridge shows the apple pie recipe and expiry alerts. It is used to communicate suggestions and guide the next steps.

<b>Apple pie recipe</b><br>This card shows the apple pie recipe provided by the smart fridge. It lists simple ingredients like pie crust, sugar, and heavy cream. The recipe is the main instruction the user follows to start cooking.

<b>Oven preheat sign</b><br>This cardboard sign represents the oven starting to preheat. After the user agrees to make the pie, the oven receives the recipe and begins warming up. It shows how different smart devices can share information.

<b>Smart cleaning robot</b><br>This image shows the smart cleaning robot detecting and cleaning the wet spill. It identifies the mess automatically and responds without user instruction. It demonstrates how the smart kitchen supports the user during cooking.

Activity 1: My Research

<b>Sensor Placement Concept</b><br>In this sketch, we show how we place both sensors inside the prototype. We hide the force sensor under the fabric of the cushion, so the user can activate it naturally by touching or pressing the surface. We also place the heart-rate sensor inside the wristband, keeping the wires and components covered. This setup helps us create a clean and comfortable interaction while still collecting the data we need.

<b>Natural User Interaction</b><br>This sketch explains how the user interacts with our device in a natural way. When the user leans back on the cushion, their body weight activates the force sensor placed inside. The highlighted area shows where the pressure is detected. By using normal sitting behavior as the trigger, we make the interaction simple and effortless for the user.

<b>Visual Output and Calm Mode</b><br>This sketch shows how the system responds after receiving data from the sensors. When the user sits and leans on the cushion, the visual output changes on the projected screen. Here, we show the calm visual mode, which displays soft and slow shapes. This visual is designed to guide the user toward a more relaxed and steady state.

<b>Sensor connected on breadboard</b><br>We connected the sensor on the breadboard with Arduino to test if the wiring is correct. We checked the power, ground, and signal pins to make sure the sensor can send stable data to Arduino before linking it to TouchDesigner.

<b>Full setup with Arduino and TouchDesigner</b><br>We tested the full setup by connecting Arduino and the breadboard to the laptop while running TouchDesigner. We checked the real-time data flow and made sure both the force sensor and pulse sensor could send values to TouchDesigner at the same time.
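Before TouchDesigner can use the serial stream, each incoming line has to be parsed into separate force and pulse values. The sketch below is a minimal illustration assuming the Arduino prints comma-separated lines such as "512,78" (force reading, then pulse BPM); the exact line format is an assumption for this example.

```python
# Sketch of parsing one serial line from the Arduino.
# Assumes lines look like "512,78" (force, then pulse BPM);
# the actual print format on the board may differ.

def parse_sensor_line(line):
    """Return (force, bpm) from a 'force,bpm' line, or None
    if the line is malformed or only partially received."""
    parts = line.strip().split(",")
    if len(parts) != 2:
        return None
    try:
        force, bpm = int(parts[0]), int(parts[1])
    except ValueError:
        return None
    if not (0 <= force <= 1023):  # 10-bit analogRead range
        return None
    return force, bpm
```

Returning None for bad lines is important with live serial input, since opening the port mid-transmission often yields a truncated first line.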

Activity 2: My Research

<b>Error from material</b><br>We noticed an error on the node because it was connected to a material we did not need for our assignment. Since our project does not use any texture or material on this model, we removed the material connection. After deleting it, we could continue building the rest of the system without issues.

<b>Adjusting animation with Trim</b><br>We adjusted the animation using the Trim settings to control the start and end frames. This allowed us to create a smoother and more natural movement for the model. By setting a clear range, we made sure the animation plays only the part we need.

<b>Converting FBX into particles</b><br>We connected the FBX model to the particle system to turn the shape into particles. This step allowed the model’s form to be represented as many small points, which helps us create a more dynamic visual effect. It is the base for the particle animation.

<b>Customizing particle material</b><br>We worked on customizing the material for the particles inside the particlesGpu component. By selecting a custom material, we could control how each particle looks, such as adding a simple texture or adjusting color. This step helped us refine the overall visual style.

<b>Adding Motion to Noise</b><br>In this step, we added animation to the noise by using a time-based expression. By animating the noise, we gave movement to the particle system. This made the particles react in a more dynamic way and created a more expressive final visual.
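The time-based expression mentioned in the last caption is the kind of one-liner TouchDesigner accepts on a noise transform parameter (for example, driving an offset from elapsed time). The sketch below restates that idea as a plain Python function; the speed constant is an assumption for illustration, not the value from our network.

```python
# Sketch of the time-based offset used to animate the noise.
# In TouchDesigner this was a one-line parameter expression
# (elapsed time multiplied by a speed) on the noise transform;
# here the same idea is a plain function. NOISE_SPEED is an
# illustrative value, not the one from our project file.

NOISE_SPEED = 0.25  # offset units per second (assumed)

def noise_offset(seconds):
    """Offset to feed the noise transform at a given time,
    so the particle field drifts instead of standing still."""
    return seconds * NOISE_SPEED
```

Because the offset grows continuously with time, the noise field is sampled at a new position every frame, which is what makes the particles appear to flow.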

Additional Research or Workshops

<b>Preparing Materials</b><br>We prepared the soft cushion and the wrist band pieces for our prototype. These materials will hold the force sensor inside the cushion and the pulse sensor inside the wrist band. At this stage, we focused on arranging the parts before adding any electronics.

<b>Testing Wearing and Interaction</b><br>We tested how the wrist band fits on the user and how the cushion responds when someone leans on it. This helped us check the positions of the pulse sensor and the force sensor, making sure the user can interact with the prototype in a natural way.

<b>Assembling the Wrist Band</b><br>We decorated and assembled the wrist band by attaching felt and buttons with hot glue. This step allowed us to make the sensor wearable and comfortable, while also keeping the pulse sensor hidden under the fabric.

<b>Placing the Sensor Inside the Cushion</b><br>We placed the force sensor inside the cushion and guided the wire through the side opening. This step made the cushion functional while keeping the sensor invisible, so the interaction stays soft and natural.

<b>Full Prototype Connection Test</b><br>We connected the cushion and sensors to the Arduino to test the full prototype. This setup allowed us to read both the force and pulse data in real time, which will later be sent to TouchDesigner to create the responsive visuals.

Additional Research or Workshops

<b>Setting Up the Data Table in TouchDesigner</b><br>In this image, we are testing how TouchDesigner receives and organizes the sensor data. We use a select DAT and a table DAT to check if the values are read correctly. This step helps us confirm that the serial input from Arduino can be cleaned and prepared for later use in our visuals.

<b>Reading Sensor Data Through the Serial DAT</b><br>Here we read the real-time data coming from Arduino, including force sensor and pulse sensor values. We use a serial DAT and a select DAT to filter the data we need. This setup allows us to track the changing numbers clearly and send them into our visual system for interaction.

<b>Arduino Sensor Data Changes</b><br>In this GIF, we show how the force sensor and heart rate sensor send continuous data to Arduino. We can see the numbers updating in real time, which helps us understand how the user’s pressure and heartbeat change during interaction. This step is important because it confirms that both sensors work correctly and send stable data for our project.

<b>TouchDesigner Visual for High Heart Rate</b><br>This GIF presents the visual effect we designed in TouchDesigner for moments when the user’s heart rate becomes high. The movement is fast and intense, with strong contrast and sharp lines. We created this visual to reflect tension and excitement, so the system can communicate elevated emotional states through dynamic motion.

<b>TouchDesigner Visual for Calm Heart Rate</b><br>In this GIF, we show the calm visual mode in TouchDesigner. The shapes move slowly with smooth transitions, and the overall rhythm is soft. We designed this visual to represent a stable and relaxed heart rate, giving the user a gentle and peaceful experience that reflects their calmer physical state.
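The "cleaned and prepared" step mentioned above can be sketched as a simple moving average, since raw force and pulse readings jitter from frame to frame. In our network this smoothing lived in TouchDesigner operators rather than a script, and the window size here is an assumption for illustration.

```python
# Sketch of smoothing raw sensor readings with a moving
# average over the last N samples. In the actual project this
# happened inside TouchDesigner; the window size is assumed.

from collections import deque

class Smoother:
    def __init__(self, window=10):
        # deque with maxlen drops the oldest sample automatically
        self.samples = deque(maxlen=window)

    def add(self, value):
        """Add one raw reading and return the smoothed value."""
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)
```

A small window keeps the visuals responsive to real changes in pressure or heart rate, while still filtering out single-frame spikes.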

Project 2


Project 3 Final Prototype

Final Prototype – Video


Project 3 builds on the directions from Projects 1 and 2, exploring how real-time environmental data can shape visual output to help users relax. In this stage, we replaced the previous light sensor with a force sensor and a pulse sensor to create a more personal and responsive media experience.

The prototype uses an Arduino to read pressure changes from a force sensor in the cushion and heart-rate data from a wristband pulse sensor. When the user leans on the cushion or wears the wristband, the sensors send data to TouchDesigner, which then triggers different visuals. High heart rates activate fast and intense animations, while stable heart rates produce calm and gentle motions. This system can remind users to slow down or support calming and meditation.

Through this project, we learned how to connect multiple sensors and transform real-time data into interactive visuals. By experimenting with particle effects and mapping sensor values to visual behaviors, we created a system that turns physical input into meaningful feedback for relaxation.
