Module 2 Activity Research

Weekly Activity Template

Xinyu Lu


Project 2


Module 2

In this module, we explored how real-time light data can shape a calming media experience. We began by building simple circuits with Arduino and a light sensor, testing different wiring setups to solve issues such as unstable readings, incorrect LED responses, and problems with pin connections. Through repeated troubleshooting, we learned how the sensor behaves under different light conditions and how to send clean data into TouchDesigner. At the same time, we created concept sketches to visualize the system: a cushion-like structure that darkens when the user leans on it, a simple chair setup, and particle-based visuals projected on the wall.

Using these sketches as guidance, we developed a basic prototype with the sensor placed behind a soft surface and connected it to TouchDesigner. The visuals—slow particles and gentle gradients—responded smoothly to the sensor values and supported our goal of creating a relaxing interaction. Our in-class testing showed that the user naturally becomes the trigger by blocking light, although this represents an ideal condition that assumes stable circuit connections. We also discovered new questions, such as how the system behaves in brighter environments and how additional sensors could enrich the experience.
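The core mapping here (more light blocked by the user → stronger visual response) boils down to a simple normalization. A minimal Python sketch of that idea, where the `ambient` and `min_light` calibration numbers are illustrative assumptions rather than our measured values:

```python
def lean_intensity(light_value, ambient=800.0, min_light=100.0):
    """Map a raw light reading to a 0-1 'leaning' amount:
    full ambient light -> 0.0 (no interaction),
    fully covered sensor -> 1.0 (user leaning on the cushion).
    ambient and min_light are assumed calibration values that
    would need to be tuned for each room."""
    span = ambient - min_light
    amount = (ambient - light_value) / span
    return max(0.0, min(1.0, amount))  # clamp to the 0-1 range

print(lean_intensity(800))  # 0.0 (nobody leaning)
print(lean_intensity(100))  # 1.0 (cushion fully covered)
```

A value like this can then drive particle speed or gradient opacity directly, which is why a single light sensor is enough to make the user's body the trigger.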

Overall, this module allowed us to build a full workflow—from hardware testing to interaction mapping to visual output—and to establish a foundation for future development. The work helped us understand how environmental data can drive responsive visuals and how thoughtful interaction design can support stress relief in daily life.

In-Class Activity 1: Guerrilla Prototyping I

<b>Drawing and Measuring</b><br>We start by marking lines on the cardboard to plan the shape of our phone stand, and we measure each part carefully so every piece will fit together correctly.
<b>Cutting the Cardboard</b><br>We cut along the marked lines to create the main pieces of the structure, and we also prepare the inner parts that will allow the stand to fold and support the phone at an angle.
<b>Building the Base</b><br>We glue several layers of cardboard together to build a stronger base, which will support the phone’s weight and keep the whole stand steady.
<b>Assembling the Stand</b><br>We attach the angled piece to the base to form the main support, and the structure begins to show its final shape as it becomes able to hold light pressure.
<b>Final Model</b><br>We complete the stand and test it by placing a phone on top; the cardboard structure is strong enough to hold the device securely while offering a comfortable viewing angle.

In-Class Activity 2: Guerrilla Prototyping II

<b>Cushion Concept Sketch (Sensor Location)</b><br>We sketched the idea of placing the light sensor on the front area of the cushion. When the user leans on the cushion, the light becomes darker and the sensor changes its value. This drawing explains the basic interaction we want to create.
<b>User Interaction on Chair</b><br>We sketched how the cushion sits on a chair and how the user naturally leans on it. The darker area on the cushion shows the moment when the sensor is activated. This helps us describe how simple and calm the interaction feels.
<b>Visual Projection Concept</b><br>We illustrated how the visuals appear on the wall when the sensor receives input. The particles move softly to support a relaxing experience. This sketch shows the relationship between the user, the cushion, and the projected visuals.
<b>Testing the Circuit Behind the Fabric</b><br>We placed the Arduino and breadboard behind a piece of fabric to simulate the cushion surface. The setup allowed us to check how the sensor reacted when the light changed in front of the fabric. This was our first step in building the physical prototype.
<b>User Experiencing the Projection</b><br>We tested the visuals in a real space with a full-size projection. The user sat in front of the screen while the particle animation responded to the sensor values. This photo shows the relaxing atmosphere we aimed to create.

Activity 1: My Research

<b>Sensor Kit Wiring Confusion</b><br>We connected the light sensor to the breadboard and attached the wires to the Arduino Sensor Kit, but the circuit did not work as expected. We were not sure whether the pins on the kit matched the correct input on the Uno. This made it difficult for us to confirm if the sensor was responding properly.
<b>LED Not Lighting on the Kit</b><br>We tested the LED by connecting it directly to the Sensor Kit, but the LED did not turn on. At this point, we still assumed the kit could power the LED correctly. This made it hard for us to tell whether the problem was the LED, the wiring, or the kit itself.
<b>Testing All Functions on the Sensor Kit</b><br>We connected the Sensor Kit directly to the computer and uploaded sample code to test each built-in sensor. We checked the data from different modules to understand which parts were working correctly. This step helped us confirm the kit’s functions, but it did not solve the LED issue on our custom circuit.
<b>Sensor Data Appears but LED Still Not Working</b><br>We finally received clear numbers in the Serial Monitor, but the LED still did not respond to the sensor data. Even though the sensor showed different values, our output logic did not activate the LED. We realized that the circuit or pin mapping was still incorrect.
<b>LED Only Works After Adjusting Light Level</b><br>We managed to make the LED respond by adjusting the light level over the sensor. The LED only turned on when the sensor was covered, showing that the logic worked under specific conditions. This step confirmed that the circuit was correct, but the sensitivity still needed refinement.
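The behavior we ended up with, the LED turning on only when the sensor is covered, is a simple threshold test. A minimal Python sketch of that logic (the threshold of 300 is an assumed placeholder, not our measured reading; the real cutoff depends on the room's ambient light, which is why the sensitivity still needed refinement):

```python
def led_should_turn_on(light_value, threshold=300):
    """Return True when the photoresistor reading drops below the
    threshold, i.e. when the user's hand covers the sensor.
    threshold=300 is a placeholder: ambient light varies per room,
    so this value has to be calibrated on site."""
    return light_value < threshold

print(led_should_turn_on(120))  # True  (sensor covered -> LED on)
print(led_should_turn_on(850))  # False (open sensor -> LED off)
```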

Activity 2: My Research

<b>Particle System Setup</b><br>In this step, we created a particle system using the grid, sort, and particle operators. We arranged the nodes to build the basic movement and form of the particles. This gave us a starting point for creating dynamic visuals controlled by external data.
<b>Mouse Input Control Path</b><br>Here we used the mouse input to test interaction values inside TouchDesigner. We applied math and lag operators to smooth the movement and reduce sudden jumps. This helped us understand how input data can change the behavior of the visual elements.
<b>Final Particle Rendering Layout</b><br>In this stage, we organized the render network for the particle visuals. We added transformations and effects to improve the appearance and motion of the particles. This layout allowed us to preview how the full visual system would look when active.
<b>Geometry Rotation Adjustment</b><br>In this image, we adjusted the rotation values of a geometry component. We tested how extreme rotation changes affect the particle structure and its orientation. This helped us confirm which settings were stable and suitable for our visual output.
<b>Connecting Arduino Light Sensor to TouchDesigner</b><br>Here, we connected the Arduino light sensor to TouchDesigner through the serial DAT. We used the select and DAT-to-CHOP nodes to convert the incoming serial data into a usable channel. This allowed us to control visual elements directly with real light sensor values.
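The smoothing the lag operator gave us can be approximated by an exponential moving average. This is a rough Python sketch of the idea, not TouchDesigner's actual implementation (which is parameterized by lag time rather than a factor); `alpha` here is an assumed smoothing factor:

```python
def lag_filter(samples, alpha=0.2):
    """Exponential smoothing: each output moves only a fraction
    (alpha) of the way toward the new input, which removes the
    sudden jumps we saw in raw mouse and sensor values."""
    smoothed = []
    value = samples[0]
    for s in samples:
        value += alpha * (s - value)
        smoothed.append(value)
    return smoothed

# A sudden jump from 0 to 100 is eased in gradually
print([round(v, 1) for v in lag_filter([0, 100, 100, 100])])
# [0.0, 20.0, 36.0, 48.8]
```

A smaller `alpha` gives slower, calmer motion, which matches the relaxing animation style we were aiming for.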

Action Research: My Research Continued

<b>Testing Light Sensor Input on Arduino</b><br>We moved our hand above the light sensor to test how the values changed in Arduino. The Serial Monitor showed different numbers when the light level changed. This helped us confirm that the sensor could read brightness correctly.
<b>Checking Sensor Data Stability</b><br>We ran the same code again to observe if the sensor values were stable. The Serial Monitor showed constant changes when the light around the sensor shifted. This allowed us to understand the behavior of the sensor before connecting it to visuals.
<b>Setting Up Interaction Nodes in TouchDesigner</b><br>We tested how the values could influence nodes inside TouchDesigner. We adjusted basic nodes such as mouse input, math, and lag to build a smooth control path. This step helped us understand how interaction data affects movement and forces.
<b>Building Particle-Based Visuals</b><br>We arranged multiple operators to create particle visuals that respond to input. We tested different node combinations to achieve a soft and calm animation style. This process helped us refine how the visuals should look for a relaxing experience.
<b>Connecting Arduino Data to TouchDesigner</b><br>We connected the Arduino to TouchDesigner and received live sensor values through a serial DAT. The numbers changed when we covered the light sensor or changed the light intensity. This allowed us to link real-world light changes to responsive visuals.
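What the serial DAT and DAT-to-CHOP chain does for us, turning raw serial text into a clean channel value, can be sketched in Python. We assume here that the Arduino prints one 10-bit reading (0–1023) per line, which is the usual `analogRead` range:

```python
def parse_sensor_line(line, max_value=1023):
    """Turn one line of serial text (e.g. '512\r\n') into a
    normalized 0.0-1.0 brightness value. Malformed or partial
    lines (common right after opening a serial port) are skipped
    by returning None."""
    text = line.strip()
    if not text.isdigit():
        return None  # garbled line: ignore it
    raw = min(int(text), max_value)  # clamp out-of-range values
    return raw / max_value

print(parse_sensor_line("512\r\n"))  # ≈ 0.5
print(parse_sensor_line("garbage"))  # None
```

Normalizing to 0–1 at this stage keeps the TouchDesigner side simple: every downstream node can assume the same range regardless of the sensor hardware.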

Project 2


Project 2 Prototype

Project 2 continues our direction from Project 1 by exploring how real-time light changes can produce calming visual output. We created a cushion-like prototype with a light sensor connected to Arduino, allowing TouchDesigner to show soft particles and gentle gradients when the user blocks the light. This setup demonstrates how sensory input can support relaxation, although the experience represents an ideal condition assuming stable circuit connections. After testing, the interaction felt natural because the user’s body becomes the trigger, though we still wonder whether the visuals remain clear in brighter environments. Overall, the prototype successfully expresses our intention to reduce stress and forms a strong base for future work with additional sensors and richer visual layers.

▶ Watch Final Prototype Video

