Final Project Documentation (week 14): The ‘Robinson’ Maze

Concept:

Cover page for the game. I designed it in Photoshop, using an image from Pixabay, a free stock-image site with a copyright-free usage policy.

This game incorporates two key concepts. The first concept, which inspired the creation of this game, is the maze itself—an idea drawn from my midterm project. In the game, the user starts at one end of the maze’s wire frame and navigates through it without touching the wire, aiming to reach the endpoint before time runs out. The maze is uniquely designed to resemble the word “Robinson,” with the first letter forming an “R,” the last letter forming an “N,” and the middle section creatively looped to provide a more engaging and challenging experience for players.

The second concept is inspired by urban design. The maze is mounted on two wooden blocks designed to resemble buildings, giving the entire structure the appearance of a miniature cityscape. This combination of gameplay and aesthetic design enhances the overall experience by integrating storytelling with visually appealing architecture.

Project Interaction:

Do you remember the person I recruited for my user testing? Well, he’s back for another round of interacting with the program, this time after I addressed and resolved his misconceptions about how the game functions. Below is a video of Vladimir testing my game once again:

How Does the Implementation Work?

1. Description of Interaction Design

The interaction design focuses on creating a seamless and engaging user experience. The game begins with a welcome screen that introduces the user to the interface. It includes options to view instructions or start playing the game.

  • Instructions Screen: The instructions explain the rules of the game, such as the objective of navigating through the maze without touching the conductive walls and what happens when the wire is touched. A “Back” button is provided to return to the main menu.
  • Timer Selection: Players can choose their preferred play duration (30 seconds, 1 minute, or 2 minutes). After selecting a timer, the game transitions to a countdown preparation phase.
  • Game Play: During the game, the player must navigate the maze using the loop object without touching the maze walls. Touching the walls triggers a red glow effect and reduces the remaining time by 5 seconds. The player wins by reaching the endpoint (connected to the A3 pin) before the timer runs out. If the timer reaches zero before the endpoint is touched, the player loses.
  • Win/Lose Feedback: Winning triggers celebratory fireworks visuals, while losing displays visuals indicating failure. Both states return the user to the main menu after 5 seconds.
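
The screen flow described above can be condensed into a small state machine. The following is a minimal, hypothetical sketch in plain JavaScript; the real sketch uses p5.js globals and its own names, so the state names and the transition helper here are illustrative only:

```javascript
// Minimal state machine mirroring the screen flow described above.
// State and event names are illustrative, not taken from the actual sketch.
function createGame() {
  return { screen: "menu", timeLeft: 0 };
}

// Transition helper: applies one event and returns the resulting screen.
function transition(game, event) {
  switch (event.type) {
    case "showInstructions": game.screen = "instructions"; break;
    case "back":             game.screen = "menu"; break;
    case "selectTimer":      game.screen = "countdown"; game.timeLeft = event.seconds; break;
    case "countdownDone":    game.screen = "playing"; break;
    case "touch":            game.timeLeft = Math.max(0, game.timeLeft - 5); break; // 5 s penalty
    case "reachEnd":         game.screen = "win"; break;
    case "timeUp":           game.screen = "lose"; break;
    case "timeout":          game.screen = "menu"; break; // auto-return after win/lose
  }
  return game.screen;
}
```

Keeping all transitions in one place like this makes it easy to see that every screen eventually leads back to the menu.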

2. Description of Arduino Code

The Arduino code handles two primary functions:

  • Touch Detection (A0): When the loop object touches the maze walls, the Arduino sends a “Touch detected!” message to p5.js. It also briefly activates the buzzer to provide immediate audio feedback.
  • Win Detection (A3): When the loop object reaches the endpoint, the Arduino sends a “WIN” message to p5.js to indicate success.

The code reads digital input from the A0 and A3 pins and sends serial messages to the p5.js sketch, allowing the game to react accordingly. The following is the complete Arduino sketch I wrote myself (comments included) to bring the project to life in the Arduino IDE:

#define TOUCH_PIN A0
#define WIN_PIN A3
#define BUZZER_PIN 8

bool gamePlaying = false; // Flag to track if the game is currently active

void setup() {
  Serial.begin(9600);

  pinMode(TOUCH_PIN, INPUT);  // Touch detection
  pinMode(WIN_PIN, INPUT);    // Win detection
  pinMode(BUZZER_PIN, OUTPUT); // Buzzer feedback

  Serial.println("Setup Complete. Monitoring touch and win states...");
}

void loop() {
  // Check for serial messages from p5.js
  if (Serial.available() > 0) {
    String command = Serial.readStringUntil('\n');
    command.trim();

    if (command == "START") {
      gamePlaying = true; // Start the game
      Serial.println("Game started!");
    } else if (command == "STOP") {
      gamePlaying = false; // Stop the game
      Serial.println("Game stopped!");
    }
  }

  if (gamePlaying) {
    int touchValue = digitalRead(TOUCH_PIN); // Read A0
    int winValue = digitalRead(WIN_PIN);    // Read A3

    // Check if wires are touching
    if (touchValue == HIGH) {
      tone(BUZZER_PIN, 1000); // Play buzzer
      delay(100);
      noTone(BUZZER_PIN);
      Serial.println("Touch detected!"); // Notify touch
    }

    // Check if win condition is met
    if (winValue == HIGH) {
      Serial.println("WIN"); // Notify win
      delay(500);            // Avoid spamming
    }
  } else {
    noTone(BUZZER_PIN); // Ensure the buzzer is off when the game isn't active
  }

  delay(50); // Stability delay
}

3. Schematic

The circuit consists of:

  • A0: Connected to the loop object with a 10kΩ pull-down resistor.
  • A3: Connected to the maze endpoint for win detection.
  • D8: Connected to the buzzer for audio feedback.
  • 5V and GND: Power and ground connections.

The schematic visually details how the components are connected to the Arduino board.

The following schematic represents the connections for the Arduino circuit, including the objects used to complete the circuit for the overall functioning of the game. The schematic was made using TinkerCAD.

4. Description of p5.js Code

The p5.js code manages the visual interface, user interactions, and communication with the Arduino. Key functions include:

  • Serial Communication: Establishes a connection with the Arduino and processes messages received from it.
  • Visual Design: Displays custom backgrounds for each screen (e.g., welcome, instructions, timer selection, and game states). Buttons are styled and positioned for ease of use.
  • Game Logic: Handles the countdown timer, touch detection, win/lose conditions, and visual effects for the game (e.g., the red glow for touches, fireworks for wins).
  • Dynamic Transitions: Smoothly transitions between different game states and incorporates a 3-second preparation countdown to ensure the user is ready before gameplay begins.
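
Underlying the serial communication is newline-delimited framing: bytes can arrive in arbitrary chunks, so they must be buffered until a complete line is available. The following is a minimal, hypothetical sketch of that framing in plain JavaScript (the serial library used with p5.js normally handles this internally, so this is purely for illustration):

```javascript
// Accumulates serial chunks and emits complete newline-terminated messages.
// Illustrative only: the p5 serial setup delivers whole lines to readSerial itself.
function createLineBuffer(onLine) {
  let pending = "";
  return function receiveChunk(chunk) {
    pending += chunk;
    let idx;
    // Emit every complete line currently in the buffer.
    while ((idx = pending.indexOf("\n")) !== -1) {
      onLine(pending.slice(0, idx).trim());
      pending = pending.slice(idx + 1);
    }
  };
}
```

This is why both sides of the project terminate every message with `\n`: the newline is what marks a message as complete.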

The following is the embedded p5 sketch. If you would like to test the program, copy the Arduino code into the Arduino IDE and connect the Arduino Uno according to the schematic above.

For the full-screen link, click here: https://editor.p5js.org/takuthulani/full/H_F1sS4qo

5. Description of Communication Between Arduino and p5.js

The communication between Arduino and p5.js is established through serial data transfer. The Arduino sends the following messages based on the game events:

  • “Touch detected!”: Sent when the loop object touches the maze walls (A0 input). p5.js responds by reducing the timer and activating the red glow effect.
  • “WIN”: Sent when the loop object reaches the endpoint (A3 input). p5.js displays the “You Won” message and celebratory visuals.

Additionally, p5.js sends a message to the Arduino when the game begins and a stop message when the game ends.
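
The incoming half of this protocol can be reduced to a small dispatcher. This is a hedged sketch in plain JavaScript, not the project’s actual readSerial (which also manages the glow timer and the p5 game states); the `game` object and its `outbox` queue are illustrative stand-ins:

```javascript
// Maps newline-delimited messages from the Arduino to game events.
// A minimal sketch of the protocol described above; names are illustrative.
function handleSerialMessage(raw, game) {
  const msg = raw.trim();
  if (msg === "Touch detected!") {
    game.timeLeft = Math.max(0, game.timeLeft - 5); // 5-second penalty
    game.glow = true;                               // trigger the red glow effect
  } else if (msg === "WIN" && game.timeLeft > 0) {
    game.state = "win";
    game.outbox.push("STOP\n");                     // tell the Arduino to stop detecting
  }
  return game;
}
```

Note that “WIN” is only honored while time remains, mirroring the rule that reaching the endpoint after the timer expires still counts as a loss.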

Aspects I am proud of:

If I am being truly honest, I feel really proud of the overall functioning of the game. The game works exactly as I imagined it would, and this is something to be proud of since we know that, from concept to final project, many things can go wrong. One often needs to make exceptions for unforeseen challenges and find creative solutions. In this case, the project works as intended, and I am particularly proud of the fact that when you touch the wire, the small speaker activates, and the screen flashes red to indicate the touch.

The serial communication was one of the trickiest parts of this project. Strangely enough, my game stopped working when the IM showcase began, requiring a few minutes of troubleshooting to get it running again. Beyond that hiccup, I am especially proud of the feature where the timer decreases whenever the user touches the wire. This functionality involved many late nights of debugging, but the result was worth it.

Lastly, I am happy that the program does not react when anyone touches the wire with the conductive loop object while the game is not being played. This demonstrates that the code I wrote provides enough control and that the communication between p5.js and Arduino is solid.

Areas of Future Improvements:

I am someone who enjoys visually stimulating designs, and one thing I would like to improve is the overall visual look of the game. The cover page seems too simple, and I would add enhancements to make it more engaging. Additionally, I would add sounds to the interface buttons so that they produce feedback whenever someone clicks them. More instructions and better visual cues for the game would greatly enhance the user experience.

For the timer, I would make it more visually apparent that the player is running out of time and that they lose time whenever they touch the wire. One improvement could be adding a louder speaker, as the sound of the small speaker was often drowned out by the environmental noise during the IM showcase. Providing users with options to enable or disable background music would also enhance the experience.

Furthermore, the physical structure of the game could use a more polished look. While not many people commented on this during gameplay, a better structure would contribute to the overall mood of the game. Lastly, I would add more engaging animations to the “You Win” screen to make users feel a greater sense of accomplishment. Implementing a high-score leaderboard could encourage competitiveness among players, as I noticed many users enjoyed challenging one another. Additionally, a more dramatic loss message could motivate users to replay the game more often.


Credits:

I would like to take this opportunity to thank Nelson from the scene shop for his immense help in cutting the wood for my project. I would also like to thank the lab TAs for assisting me in finding items in the lab and helping me progress through this project by supplying resources.

A special thanks to Professor Michael Shilo, whom I met in the lab. He helped me find wires in the scene shop, and I am grateful for his suggestions. Lastly, I would like to thank Professor Michael Ang for guiding me through this project from the beginning to the end, as well as for all the knowledge I gained from him throughout the semester.

Disclaimer

I use Grammarly as a browser add-on in Chrome, which assists me in fixing grammar issues while writing this blog.

Final Project User Testing (Week 13)

User Testing

The user testing for this project was both enlightening and entertaining. The first test occurred just a few minutes before the Interactive Media (IM) showcase. Vladimir Sontea, the first person I invited to test the game, provided a unique perspective and valuable feedback. Below is the video recording of the session:

In the video, you can hear me laughing at how quickly Vladimir lost before even starting the game. What stood out wasn’t his loss but the realization of certain faults in my game design. This moment marked the beginning of identifying key areas for improvement.

Observations:

User Experience

  • The user quickly figured out the navigation system, including accessing the instructions page, selecting the desired timer, and starting the game. This indicates that the visual structure of the game is intuitive and easy to understand.
  • However, there was significant confusion about what happens when the wire is touched. Questions like “How many lives do I have?” and “What are the conditions for losing?” came up frequently.

Game Mechanics

  • The key mechanic—reducing the timer by 5 seconds when the wire is touched—was unclear to users. This detail was missing from the instructions, which many users tended to skip.
  • Initially, I left this aspect as part of the game’s “discoverability,” where players would learn the rules by playing. However, based on feedback, I made this mechanic more explicit to avoid unnecessary confusion. The following image shows the changes I made to ensure users understood the game instructions:

What Worked Well

  • The core mechanics, including the timer countdown and the win/loss conditions, functioned as intended.
  • The navigation system, with options for instructions and gameplay, was clear and easy to use.
  • The visual appeal of the game effectively drew users’ attention, and the inclusion of the “gold handle” added a premium feel to the design.

Areas for Improvement

  1. Instructions:
    • Many users skipped the instructions page, leading to confusion about certain game rules. Adding more visual and interactive cues to emphasize key mechanics (like the 5-second penalty for touching the wire) could help.
  2. Design:
    • While functional, the game’s visuals could be enhanced further to improve its overall polish and user experience.
  3. Preparation Countdown:
    • The 3-second countdown appeared as a glitch to users unfamiliar with its purpose. Better visual or auditory cues could make it clear that the countdown is intended to give players time to prepare.
  4. Safety Concerns:
    • Some users were hesitant to interact with the conductive handle, fearing they might get shocked. Adding a clear safety disclaimer in the instructions would address these concerns.


These insights, gathered during the first user test, informed improvements in the game’s mechanics, design, and user experience. Although these challenges existed, I strongly believe they did not take away from the overall experience of the game.

Final Project Proposal & Concept (from week 11 and 12)

Introduction (week 11)

Coming up with an idea for this project was a challenge. I struggled with creative block and external pressures, which delayed the process. After two weeks of reflection, I decided to revisit and expand on a concept from my midterm project.

In the midterm project, the idea was to guide the main character through a maze. The narrative was that the character (the player) journeys through the maze to finally meet their parents. However, the maze portion was not fully realized. For this project, I wanted to bring this maze concept to life using p5.js and Arduino.

This final project builds on that narrative and integrates digital and physical interaction, creating an engaging and immersive experience.

Game Design and Features (week 12)

Overview

The game combines p5.js visuals with Arduino-based physical interactivity. The player navigates a conductive wire maze using a loop object. The goal is to reach the end of the maze without touching the maze walls and within a set time limit. As a key design element, the maze design spells out the word “ROBINSON,” tying back to the narrative of the midterm project. More on this later.

Arduino Program

The following are the functionalities of the project:

Inputs:

  1. A0 (Touch Detection): Detects if the loop touches the maze wire.
    • Behavior: When a touch is detected, it sends a “Touch detected!” message to p5.js.
  2. A3 (Win Detection): Detects if the loop reaches the end of the maze.
    • Behavior: When contact is made, it sends “WIN” to p5.js.

Outputs:

  1. BUZZER_PIN (Buzzer): Plays a short tone when a touch is detected during gameplay.
    • Behavior: Activates only while the countdown timer is active.

The following code snippet shows how this is brought to life:

void loop() {
  int touchValue = digitalRead(TOUCH_PIN); // Check for touches
  int winValue = digitalRead(WIN_PIN);     // Check for win condition

  if (touchValue == HIGH) {
    tone(BUZZER_PIN, 1000); // Activate buzzer
    delay(100);
    noTone(BUZZER_PIN);
    Serial.println("Touch detected!"); // Notify p5.js
  }

  if (winValue == HIGH) {
    Serial.println("WIN"); // Notify p5.js of win
    delay(500);
  }

  delay(50); // Stability delay
}

p5.js Program

Inputs from Arduino:

  1. ‘Touch detected!’
    • Deducts 5 seconds from the timer.
    • Triggers a red glow effect on the screen edges.
  2. ‘WIN’ detection
    • Displays a “You Won!” message with a fireworks animation.

Outputs to Arduino:

  1. START: Sent when the game begins, activating Arduino’s detection logic.
  2. STOP: Sent when the game ends, deactivating Arduino outputs.

Key Features:

  • Dynamic Timer Countdown: Starts a countdown when a timer button is selected, with a 3-second preparation countdown before the game begins.
  • Touch Feedback: Deducts time and triggers a glow effect when the maze walls are touched.
  • Win and Lose States: Celebrates a win with fireworks or displays a loss message if time runs out.

The following are some key code snippets showing how this works:

function readSerial(data) {
  if (gameState === "playing") {
    const trimmedData = data.trim();

    if (trimmedData === "Touch detected!") {
      glowEffect = true;
      clearTimeout(glowTimer);
      glowTimer = setTimeout(() => (glowEffect = false), 100);
      timeLeft = Math.max(0, timeLeft - 5); // Deduct 5 seconds
    }

    if (trimmedData === "WIN" && timeLeft > 0) {
      clearInterval(timerInterval);
      writeSerial("STOP\n");
      gameState = "win"; // Transition to the win state
    }
  }
}

function startGame(time) {
  timeLeft = time;
  gameState = "countdown";
  let countdownTime = 3;

  let countdownInterval = setInterval(() => {
    countdownTime--;
    if (countdownTime <= 0) {
      clearInterval(countdownInterval);
      gameState = "playing";
      writeSerial("START\n");
      startTimer();
    }
  }, 1000);
}
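
The startTimer() function is referenced above but not shown. Below is a minimal sketch of what it might look like, assuming the timer simply decrements timeLeft once per second and triggers the lose state at zero; the tick logic is split into its own function so it can be reasoned about independently of setInterval (the actual implementation in the sketch may differ):

```javascript
let timeLeft = 0;
let gameState = "playing";
let timerInterval = null;

// One countdown step; returns the resulting game state.
function timerTick() {
  timeLeft = Math.max(0, timeLeft - 1);
  if (timeLeft === 0 && gameState === "playing") {
    gameState = "lose"; // time ran out before the endpoint was reached
    if (timerInterval) clearInterval(timerInterval);
  }
  return gameState;
}

// Hypothetical startTimer: drives timerTick once per second.
function startTimer() {
  timerInterval = setInterval(timerTick, 1000);
}
```

Because readSerial already deducts 5 seconds per touch, this tick only has to handle the regular one-second countdown and the lose transition.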

Execution Process (progress)

Physical Setup

  1. Maze Design: The maze spells out “ROBINSON,” aligning with the narrative. This design adds a storytelling element to the game while providing a challenging maze path.
    • Material Used: Conductive wire shaped into the letters of “ROBINSON.”
    • Support Structure: Cut wood panels to create a stable base for the maze.
    • Process: Shaped the conductive wire carefully to form the letters and attached it to the wood base for durability and aesthetics.

The following image shows an attempt to spell the word mentioned above using conductive wire:

The following images show the progress of this project through wires, woodcutting and soldering:

Soldering.
Wood cutting. Special thanks to Nelson for helping me immensely with cutting this wood into shape.
Shaping the wire in the scene shop. Special thanks to Nelson and Michael Shilo for assisting me with finding the conductive materials to use for this project.

Wiring Schematic

  • A0: Connected to the loop object and configured with a pull-down resistor for touch detection.
  • A3: Configured for detecting contact at the maze endpoint.
  • BUZZER_PIN: Connected to a larger external buzzer to provide audible feedback during gameplay.

The following is the schematic used for this project:

The following schematic represents the connections for the Arduino circuit, including the objects used to complete the circuit for the overall functioning of the game.

This concludes the progress made on the maze program. It was after completing this stage that I came up with the name for it: The Robinson Maze.

Week 11: In-Class Exercises

Final Video Demonstration can be found here: https://youtu.be/CTLXGrMEBxU

Cover image for this week’s production.
Exercise 1:

“make something that uses only one sensor on Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on Arduino is controlled by p5”

The following code was utilized for this particular exercise:

let sensorValue = 0; // To store the sensor value from Arduino

function setup() {
  createCanvas(640, 480);
  textSize(18);
}

function draw() {
  background(220);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);

    // Display the sensor value
    text('Sensor Value: ' + sensorValue, 20, 50);

    // Map the sensor value to the horizontal position of the ellipse
    let ellipseX = map(sensorValue, 0, 1023, 0, width);

    // Draw the ellipse in the middle of the canvas vertically
    fill(255, 0, 0);
    ellipse(ellipseX, height / 2, 50, 50);
  }
}

function keyPressed() {
  if (key == " ") {
    setUpSerial(); // Start the serial connection
  }
}

// This function is called by the web-serial library
function readSerial(data) {
  if (data != null) {
    let fromArduino = trim(data); // Trim any whitespace
    if (fromArduino !== "") {
      sensorValue = int(fromArduino); // Convert the sensor value to an integer
    }
  }
}

The following code was used in the Arduino IDE for this exercise:

// make something that uses only one sensor  on Arduino and makes the ellipse in p5 move on the horizontal axis,
// in the middle of the screen, and nothing on arduino is controlled by p5

int sensorPin = A0; // Single sensor connected to A0

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensorValue = analogRead(sensorPin); // Read sensor value
  Serial.println(sensorValue); // Send sensor value to p5.js
  delay(50); // Short delay for stability
}


Exercise 2:

“make something that controls the LED brightness from p5”

The following code was used to make this exercise come to fruition:

let brightness = 0; // Brightness value to send to Arduino
let slider; // Slider used to choose the brightness

function setup() {
  createCanvas(640, 480);
  textSize(18);

  // Create a slider to control brightness
  slider = createSlider(0, 255, 0);
  slider.position(20, 50);
}

function draw() {
  background(220);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);

    // Display brightness value
    text("Brightness: " + brightness, 20, 90);

    // Update brightness from the slider
    brightness = slider.value();
    
    // Send brightness to Arduino
    writeSerial(brightness + "\n");

  }
}

function keyPressed() {
  if (key == " ") {
    setUpSerial(); // Start the serial connection
  }
}

function readSerial(data) {
  if (data != null) {
    let fromArduino = trim(data); // Trim whitespace
    brightness = int(fromArduino); // Parse data into an integer
  }
}

The following Arduino code was used for this particular exercise:

//make something that controls the LED brightness from p5

int ledPin = 3;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (Serial.available()) {
    int brightness = Serial.parseInt();
    if (Serial.read() == '\n') {
      brightness = constrain(brightness, 0, 255);
      analogWrite(ledPin, brightness);
      Serial.println(brightness); // Send brightness to p5.js
    }
  }
}


Exercise 3:

The following code is an alteration of Professor Aaron Sherwood’s code, which was used for this exercise:

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;

let windSensorValue = 0; // Value from the wind sensor
let connectButton; // Button for connecting to the serial port

function setup() {
  createCanvas(640, 360);
  noFill();

  position = createVector(width / 2, 0); // Initial position of the ball
  velocity = createVector(0, 0); // Initial velocity
  acceleration = createVector(0, 0); // Initial acceleration
  gravity = createVector(0, 0.5 * mass); // Gravity force
  wind = createVector(0, 0); // Initial wind force

  // Create a button to initiate the serial connection
  connectButton = createButton("Connect to Serial");
  connectButton.position(10, 10);
  connectButton.mousePressed(setUpSerial); // Trigger serial connection on button press
}

function draw() {
  background(255);

  if (!serialActive) {
    text("Click 'Connect to Serial' to start", 20, 50);
    return; // Exit the draw loop until the serial connection is established
  }

  // Map wind sensor value to wind force (affects horizontal movement)
  wind.x = map(windSensorValue, 0, 1023, -1.5, 1.5); // Adjust force range as needed

  // Apply forces
  applyForce(wind); // Apply wind force
  applyForce(gravity); // Apply gravity force

  // Update velocity and position
  velocity.add(acceleration);
  velocity.mult(drag); // Apply drag (friction)
  position.add(velocity);
  acceleration.mult(0); // Reset acceleration

  // Ball bounce logic (vertical boundary)
  if (position.y > height - mass / 2) {
    position.y = height - mass / 2; // Place the ball on the ground
    velocity.y *= -0.9; // Reverse and dampen vertical velocity

    // Notify Arduino to toggle the LED when the ball touches the ground
    writeSerial("1\n"); // Send '1' to Arduino
  } else {
    // Ensure the LED is off when the ball is not touching the ground
    writeSerial("0\n"); // Send '0' to Arduino
  }

  // Draw the ball
  ellipse(position.x, position.y, mass, mass);
}

function applyForce(force) {
  // Newton's 2nd law: F = M * A -> A = F / M
  let f = p5.Vector.div(force, mass); // Scale force by mass
  acceleration.add(f); // Add force to acceleration
}

// Reset the ball to the top of the screen when the space key is pressed
function keyPressed() {
  if (key === " ") {
    position.set(width / 2, 0); // Reset position to top center
    velocity.set(0, 0); // Reset velocity to zero
    mass = random(15, 80); // Randomize mass
    gravity.set(0, 0.5 * mass); // Adjust gravity based on new mass
  }
}

// Serial communication: Read sensor value from Arduino
function readSerial(data) {
  if (data != null) {
    let trimmedData = trim(data);
    if (trimmedData !== "") {
      windSensorValue = int(trimmedData); // Read wind sensor value
    }
  }
}

The following code was used in the Arduino IDE to bring this to life:

//gravity wind example
int ledPin = 2;     // Pin connected to the LED
int windPin = A0;   // Analog pin for the potentiometer (A0)

void setup() {
  Serial.begin(9600);        // Start serial communication
  pinMode(ledPin, OUTPUT);   // Set the LED pin as an output
  digitalWrite(ledPin, LOW); // Turn the LED off initially
}

void loop() {
  // Read the analog value from the potentiometer
  int windValue = analogRead(windPin);

  // Send the wind value to p5.js over serial
  Serial.println(windValue);

  // Check if a signal is received from p5.js for the LED
  if (Serial.available()) {
    char command = Serial.read(); // Read the signal from p5.js
    if (command == '1') {
      digitalWrite(ledPin, HIGH); // Turn on the LED when the ball touches the ground
    } else if (command == '0') {
      digitalWrite(ledPin, LOW);  // Turn off the LED
    }
  }

  delay(5); // Small delay for stability
}


The following schematic, provided in class, was used for all 3 of the exercises with slight modifications:

Week 11: Design meets Disability

One of the main arguments I extracted from this week’s reading is the interplay between fashion and discretion in design, particularly in the context of disability. Whether a design should blend in or stand out is subjective and depends on the user. For instance, teeth implants and removable teeth were initially functional medical solutions meant to resemble natural teeth. Over time, however, these appliances have become fashion statements, with materials like gold being used to signify wealth or spirituality. This shift exemplifies how functional designs can appeal to broader audiences and evolve into tools of self-expression. Similarly, the example of the athlete in the reading, who embraced her prosthetic legs as a fashionable part of her identity, demonstrates how design choices can transcend functionality to reflect individuality. This underscores the idea that the line between utility and self-expression is fluid and often shaped by societal influences.

The reading also provokes thought about the ethics of design, particularly when it comes to medical appliances. While designers from unrelated fields might bring fresh perspectives, their lack of specialized knowledge can lead to unintended consequences. For example, the trend of hearing aids resembling earphones doesn’t address how excessive earphone use may itself lead to hearing loss, creating a harmful cycle. This highlights the risk of prioritizing aesthetics or profit over the users’ actual needs. These insights also apply to interactive design, reminding us that functionality and user experience must take precedence over superficial appeal. Thoughtful design must strike a balance, respecting the user’s needs and individuality while avoiding exploitation or unnecessary commercialization.

Week 10: The Arduino Piano (Takudzwa & Bismark)

The final product, for your convenience, is here: https://youtu.be/62UTvttGflo

Concept:

The motivation behind our project was to create a unique piano-like instrument using Arduino circuits. By utilizing two breadboards, we had a larger workspace, allowing for a more complex setup. We incorporated a potentiometer as a frequency controller—adjusting it changes the pitch of the sounds produced, making the instrument tunable. To enhance the experience, we added synchronized LED lights, creating a visual element that complements the sound. This combination of light and music adds a fun, interactive touch to the project. Here’s the project cover:

The tools used for this project were: a potentiometer, a piezo speaker, LEDs, 10kΩ and 330Ω resistors, push buttons, and jumper wires.

Execution:

The following was the schematic for our project, which served as the foundation that allowed us to successfully execute this project:

The following Arduino code snippet brought our project to life, controlling both sound and light to create an interactive musical experience:

// Pin assignments and note frequencies. NOTE: the values below are assumed
// examples added so the snippet compiles; match them to your actual wiring.
int buttonPins[] = {2, 4, 6, 7};          // Push buttons
int ledPins[]    = {3, 5, 9, 10};         // One LED per button
int notes[]      = {262, 294, 330, 349};  // C4, D4, E4, F4 (Hz)
int potPin   = A0;                        // Potentiometer (pitch control)
int piezoPin = 12;                        // Piezo speaker

void setup() {
  // Set button and LED pins as inputs and outputs
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT);       // Button pins as input
    pinMode(ledPins[i], OUTPUT);         // LED pins as output
  }
  pinMode(piezoPin, OUTPUT);             // Speaker pin as output
}

void loop() {
  int potValue = analogRead(potPin);                    // Read potentiometer value
  int pitchAdjust = map(potValue, 0, 1023, -100, 100);  // Map pot value to pitch adjustment range

  // Check each button for presses
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {         // If button is pressed
      int adjustedFreq = notes[i] + pitchAdjust;      // Adjust note frequency based on potentiometer
      tone(piezoPin, adjustedFreq);                   // Play the adjusted note
      digitalWrite(ledPins[i], HIGH);                 // Turn on the corresponding LED
      delay(200);                                     // Delay to avoid rapid flashing
      noTone(piezoPin);                               // Stop the sound
      digitalWrite(ledPins[i], LOW);                  // Turn off the LED
    }
  }
}
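
The pitch control above hinges on Arduino’s built-in map() function. Reimplementing its integer arithmetic in JavaScript makes the potentiometer-to-pitch conversion explicit (a sketch for illustration only; on the Arduino, map() is provided by the core library):

```javascript
// Arduino's map() rescales a value from one range to another using integer math.
// Here: a 10-bit pot reading (0..1023) becomes a pitch offset of -100..+100 Hz.
function arduinoMap(x, inMin, inMax, outMin, outMax) {
  // Math.trunc mirrors C's truncating integer division.
  return Math.trunc((x - inMin) * (outMax - outMin) / (inMax - inMin)) + outMin;
}

// A mid-range pot reading leaves the note frequency essentially unchanged.
const pitchAdjust = arduinoMap(512, 0, 1023, -100, 100); // → 0
```

So turning the potentiometer fully down or up shifts every note by up to 100 Hz below or above its base frequency.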

Finally, a video of the final product can be found here: https://youtu.be/62UTvttGflo

Reflection:

Although our project may seem simple, we encountered several challenges during its development. Initially, we accidentally placed the digital pins incorrectly, preventing the project from functioning as expected. After hours of troubleshooting, we sought help to identify the issue. This experience turned into a valuable teamwork activity, helping us grow as students and problem-solvers. I view challenges like these as opportunities to build skills I can apply to future projects, including my final one. To enhance this project further, I would improve its visual design and sound quality to make it more appealing to a wider audience. That’s all for now!

Week 10: Reading Response

Bret Victor – A Brief Rant on the Future of Interactive Design

My initial instinct upon reading Bret Victor’s article was to push back and think, “But you’re not really improving anything.” This reaction softened after reading his direct responses to these criticisms. Victor’s defensiveness, in some ways, protects his vision—arguably so. But even beyond that, the true point of his article lies in his challenge to our current conception of interactivity. He questions why we’ve limited ourselves to “single-finger” interaction, arguing that we’re barely scratching the surface of what interactive technology could become. I found myself agreeing, especially when he mentioned that if, 20 years down the line, all we had were glorified iPads, it would be a sign of stagnation. Now, over a decade since the article was written, we’ve indeed developed more advanced interfaces—like VR, AR, and even some early-stage holographic tech—but these technologies haven’t become mainstream, and they haven’t revolutionized interaction to the degree Victor imagined, which proves his point to an extent.

Reflecting on his perspective today, it’s clear he raises valid points. For the field of interactive design to truly evolve, we need critical voices like his, highlighting what’s lacking and pushing the boundaries of what’s possible. Yet, while I appreciate his vision, I also think it’s worth noting that the demand for fully immersive tech isn’t universal. Not everyone wants full-body interaction, and realistically, innovations often emerge only if there’s sufficient market interest. While technologies like VR and AR are groundbreaking, they remain largely inaccessible to many—especially those in marginalized or economically disadvantaged communities. In contrast, iPads and similar devices, while more limited, have found a place even in lower-income communities. Victor’s perspective is compelling and reminds us of the potential for interactive design, but it also underscores the need for accessibility and practical applications.

Thoughts on the Video:

The video accompanying Victor’s article showcases futuristic and visually stunning technologies, like transparent phones and computers. These concepts seem efficient, fast, and intuitive, presenting a vision of an accessible tech-forward society. But this vision quickly becomes complicated when you consider the societal implications. The choice to illustrate this world in a city like Johannesburg, for instance, inadvertently sidelines the broader realities of poverty and inequality. The technology depicted is only accessible to specific demographics, creating an illusion of widespread accessibility that falls short in practice. Can such tech really deliver on its promise of happiness if it only deepens capitalist divides?

Moreover, there’s an unsettling irony in the interactions depicted in the video. People engrossed in advanced technology appear disconnected and isolated, as though their devices detract from meaningful social interactions. This tension isn’t new; many feared that the rise of technology would eventually isolate us, even as it ostensibly connects us. The video seems to highlight this concern, making me reflect on whether such advancements genuinely enhance human happiness or merely serve to reinforce certain fears about a tech-saturated future.

Week 9: Day & Night Simulator

Video Link to final project: https://youtu.be/lS588oI_GPU

Concept

For this assignment, my concept was to simulate day and night states using two LEDs and a photosensor. The switch serves as the digital sensor, while the photosensor serves as the analog sensor; one LED turns on while the other turns off in each state. I was also inspired by this week’s readings, which discussed not being afraid to draw inspiration from what other artists have created. Tigoe’s articles encouraged me to gather inspiration from various sources on the internet, such as our GitHub page. From there, I was able to develop this day and night simulation prototype.

Design and Execution

The following schematic represents the foundation on which I built this day and night simulator:

Schematic for day and night simulation

After drawing the schematic, I carefully assembled the circuit to produce the desired effect. The following loop is the code that brought the entire idea to life:

void loop() {
  int lightLevel = analogRead(photoPin); // Read photosensor value
  bool isDay = lightLevel > lightThreshold; // Determine if it's daytime
  Serial.println(lightLevel); // Print light level for debugging

  // Read switch positions
  bool switch1 = !digitalRead(switchPin1); // True if position 1 is active
  bool switch2 = !digitalRead(switchPin2); // True if position 2 is active

  if (switch1) {
    // Position 1: Both LEDs off
    digitalWrite(dayLed, LOW);
    analogWrite(nightLed, 0);
  } else if (switch2) {
    // Position 2: Daytime/Nighttime mode based on light sensor
    if (isDay) {
      // Daytime behavior
      digitalWrite(dayLed, HIGH); // Daytime LED on
      analogWrite(nightLed, 0);   // Nighttime LED off
    } else {
      // Nighttime behavior
      digitalWrite(dayLed, LOW);       // Daytime LED off
      analogWrite(nightLed, lightLevel / 4); // Night LED brightness: scale 0-1023 reading to 0-255 PWM range
    }
  }

  delay(100); // Small delay to smooth transitions
}
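Since the branching in this loop can be tricky to follow, here is a small plain-Python model of the same decision logic. Note two assumptions: the `lightThreshold` value of 500 is a placeholder (the constant’s definition is not shown in the excerpt), and when neither switch position is active the Arduino code leaves the LEDs unchanged, which this model simply treats as “off”.

```python
def day_night_outputs(light_level, switch1, switch2, light_threshold=500):
    """Model of the Arduino loop's decision logic.

    light_level : analogRead result, 0-1023
    switch1/2   : True when that switch position is active
    Returns (day_led_on, night_led_pwm), with PWM in 0-255.
    """
    if switch1:
        # Position 1: both LEDs off
        return (False, 0)
    if switch2:
        # Position 2: sensor-driven mode
        if light_level > light_threshold:
            return (True, 0)              # Daytime: day LED on, night LED off
        return (False, light_level // 4)  # Nighttime: dim night LED (0-1023 -> 0-255)
    # Neither position active: modeled here as both off
    return (False, 0)
```

Walking a few sensor readings through this function is a quick way to sanity-check the wiring-independent part of the sketch before uploading it.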

The project works as intended, and here is a link demonstrating the final outcome: https://youtu.be/lS588oI_GPU

Final Thoughts and Reflection:

This project allowed me to apply previous knowledge while learning new concepts. I believe a great way to improve it would be to add multiple lights that react differently to the day and night states. The challenge there would be wire management, a problem I believe has a solution I have yet to encounter. I am, however, curious about how this could take shape. This exercise is a pivotal addition to my knowledge basket, as it will all feed into my final project. That’s all for now!

Week 9: Reading Response

Articles:

  • Physical Computing’s Greatest Hits (and misses)
  • Making Interactive Art: Set the Stage, Then Shut Up and Listen


In Tigoe’s two articles, he explores the ways in which artists can find inspiration from the work of others, offering both insights and reassurance about the process of building on existing ideas. A particularly compelling aspect of these articles is how Tigoe illustrates the concept of ‘copying’ among artists—not as mere replication but as transformation. He shows that while artists may draw from each other’s work, they often create something entirely new and unique, reshaping the borrowed elements into original expressions. This idea aligns with the well-known notion that artists often remix various sources, juxtaposing them to create fresh, unique works. In interactive art, this approach can ease the burden of starting from scratch, which can be overwhelming and anxiety-inducing. Without inspiration from existing works, artists may struggle to bring their ideas to life fully—unless they are among the rare few who don’t rely on external inspiration at all.

Tigoe discusses various interactive pieces that appear to share certain traits but yield vastly different experiences for the audience, such as ‘Video Mirrors’ and ‘Body and Hand Cursors’. One of my favorite examples is the gloves that produce music when tapped on a surface. This design is not only interactive but fun and highly engaging, embodying the playful spirit of interactive art.

One critical reflection I have is about Tigoe’s reference to the “Scooby Doo painting” example, where he highlights a common mistake among designers: confusing presence with attention. He states, “Presence is easy to sense… but it’s harder to tell whether someone’s paying attention.” I think that in cases where artwork detects a person’s presence and responds by moving, it naturally draws the viewer’s attention, fostering interaction. For instance, in a crowded space, artwork that moves autonomously could spark collective interest, showing how even unintended effects can enhance user experience.

This concept connects with Tigoe’s advice in the second article about avoiding interference with a finished product while users engage with it. I wholeheartedly agree with this perspective, and I believe it’s an essential practice to adopt in designing interactive experiences. Even if I don’t incorporate this approach all the time, it’s a valuable insight I’ll certainly keep in mind.

Week 8 Assignment: Head Switch

Concept:


Final results, for your convenience: https://youtu.be/6M-4nbYk2Is


Initially, the idea of using a switch that didn’t require hands felt challenging to execute. However, after some contemplation, the thought process shifted: if not manually, perhaps turning on the switch wirelessly would be ideal. My initial idea was to see if I could use my laptop to turn on the light with a clap. This, however, didn’t work for two main reasons: 1) it still required using my hands, and 2) the claps were too soft, as sound is typically best detected in a controlled setting. I then considered if I could control the light by turning my head left or right. Once this idea settled, the execution began.

Design and Execution:

The following schematic represents the electrical connection for the Arduino Uno board:

Schematic image painstakingly made using Photoshop.

The physical connections described by the schematic above are shown in the image below:

Connection Image for the head switch LED control

Finally, the magic that brought everything together was not only the Arduino code but also a Python script, with a bit of help from everyone’s favorite chatbot. The following code was used in the Arduino IDE:

const int ledPin = 13;  // Pin connected to the LED

void setup() {
  Serial.begin(9600);       // Initialize serial communication
  pinMode(ledPin, OUTPUT);  // Set the LED pin as output
}

void loop() {
  if (Serial.available() > 0) {  // Check if data is available on the serial port
    char command = Serial.read();  // Read the incoming byte

    if (command == '1') {
      digitalWrite(ledPin, HIGH);  // Turn LED on
    } else if (command == '0') {
      digitalWrite(ledPin, LOW);   // Turn LED off
    }
  }
}
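The Arduino side defines a one-byte serial protocol: b'1' turns the LED on, b'0' turns it off. That protocol can be exercised from any host program. Below is a minimal sketch of the host side; the `serial.Serial` usage requires the third-party pyserial package, and the port name `/dev/ttyUSB0` is an assumption that differs per machine, so the hardware-dependent part is left as a comment.

```python
def led_command(turn_on):
    """Translate a desired LED state into the single-byte protocol
    the Arduino sketch expects: b'1' = on, b'0' = off."""
    return b'1' if turn_on else b'0'

# Sending the command over USB serial would look roughly like this
# (requires pyserial; the port name is an assumption):
#
#   import serial
#   arduino = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
#   arduino.write(led_command(True))   # LED on
#   arduino.write(led_command(False))  # LED off
```

Keeping the protocol in a tiny helper like this makes it easy to swap the head tracker for any other trigger later.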

I then ran the Python code in my terminal, which activated the camera. Head tracking began, and from that point, turning my head to the left switched the light on, while turning it to the right switched it off. The following portion of the code made this possible:

while True:
    # Capture a frame from the camera
    ret, frame = cap.read()
    if not ret:
        break

    # Convert frame to RGB
    rgb_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

    # Process the frame with Mediapipe
    results = face_mesh.process(rgb_frame)

    # If a face is detected, analyze head direction
    if results.multi_face_landmarks:
        landmarks = results.multi_face_landmarks[0].landmark
        direction = calculate_turn_direction(landmarks)

        if direction < LEFT_THRESHOLD and not led_on:
            print("Head turned left - Turning LED on")
            arduino.write(b'1')  # Send signal to Arduino to turn LED on
            led_on = True

        elif direction > RIGHT_THRESHOLD and led_on:
            print("Head turned right - Turning LED off")
            arduino.write(b'0')  # Send signal to Arduino to turn LED off
            led_on = False

    # Display the frame (optional)
    cv2.imshow("Head Movement Detection", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
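The `led_on` flag acts as a simple latch: a serial command is only sent on the frame where the state actually changes, rather than on every frame the head stays turned. That edge-triggered logic can be isolated as a pure function. The threshold values below are placeholders, since the real `LEFT_THRESHOLD` and `RIGHT_THRESHOLD` constants are defined outside the excerpt.

```python
LEFT_THRESHOLD = -0.1   # assumed values; the actual constants
RIGHT_THRESHOLD = 0.1   # are not shown in the excerpt above

def update_led_state(direction, led_on):
    """Edge-triggered state machine from the camera loop.

    Returns (new_led_on, command) where command is the byte to send
    over serial, or None when no command should be sent this frame.
    """
    if direction < LEFT_THRESHOLD and not led_on:
        return True, b'1'    # head turned left: switch LED on
    if direction > RIGHT_THRESHOLD and led_on:
        return False, b'0'   # head turned right: switch LED off
    return led_on, None      # no state change: send nothing
```

Returning `None` for the no-change case is what keeps the serial line quiet between head turns, so the Arduino never sees a flood of duplicate commands.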


Final Project:

Have a look at how the final project turned out in this short YouTube video:

https://youtu.be/6M-4nbYk2Is

Here is the progression of what happens when the user turns their head left and right:

Progression of head turns

Final Thoughts & Reflection:

This homework exercise was genuinely fun. It pushed me to learn Arduino while thinking creatively about solving problems. Throughout the project, I kept considering how it might be integrated into my final project. So, instead of making this exercise long and complex, I approached it as a potential feature of that final project. That is where I envision improvements and a broader application of this single exercise. That’s all for now!