Week 13 User Testing and Final Project

Video:

Drive links: https://drive.google.com/file/d/1tj29Zt4eafPmq3sbn2XQxWIdDe19ptf9/view?usp=sharing
https://drive.google.com/file/d/1iaTtnn3k2h35bS9jtLQnl48PngWvzTUW/view?usp=sharing

User Testing Documentation for the Project
To evaluate the user experience of the game, the following steps were conducted:

Participants: Two users were asked to play the game without prior instructions.
Environment: Each participant was given access to the joystick and mouse, along with the visual display of the game.
Recording: Gameplay sessions were recorded, capturing both screen activity and user interactions with the joystick and mouse.
Feedback: After the session, participants were asked to share their thoughts on the experience, including points of confusion and enjoyment.

Observations from User Testing
Most users instinctively tried to use the joystick to control the player.
Mapping joystick movement to player control was understood quickly.
Dying when hitting a wall was unexpected for both players, but they quickly learned to avoid the walls and play more carefully.

The dual control option (mouse click and joystick button) for starting the game worked well.

Powerups:

Participants found the power-up visuals engaging and intuitive.
Some users initially struggled to understand the effects of power-ups (e.g., what happens when picking up a turtle or a lightning bolt).
Once they had collected a few power-ups, however, they understood the effect each one had.

Game Objectives:

The goal (reaching the endpoint) was clear to all participants.
Participants appreciated the timer and “Lowest Time” feature as a challenge metric.

What Worked Well
Joystick Integration: Smooth player movement with joystick controls was highly praised.
Visual Feedback: Power-up icons and heart-based life indicators were intuitive.
Engagement: Participants were motivated by the timer and the ability to beat their lowest time.
Obstacle Design: The maze structure was well-received for its balance of challenge and simplicity.

 Areas for Improvement:

Power-up Explanation:

Players were unclear about the effects of power-ups until they experienced them.
I think this part does not need changing, as it adds to the puzzle-like aspect of the game and makes further playthroughs more fun.

Collision Feedback:

When players collided with a wall or lost a life, the feedback was clear: they could hear the sound effect and see a heart disappear at the top of the screen.

 Lessons Learned
Need for Minimal Guidance: I like the challenge of playing the game for the first time without instructions; the lack of guidance encourages players to explore, which increases their intrigue in the game.

Engaging Visuals and Sounds: Participants valued intuitive design elements like heart indicators and unique power-up icons.

Changes Implemented Based on Feedback
The player speed was decreased slightly, as the high speed was leading to many accidental deaths. The volume of the death feedback was also increased to more clearly indicate what happens when a player consumes a death power-up or collides with a wall.

 

GAME:

Concept
The project is an interactive maze game that integrates an Arduino joystick controller to navigate a player through obstacles while collecting or avoiding power-ups. The objective is to reach the endpoint in the shortest possible time, with features like power-ups that alter gameplay dynamics (speed boosts, slowdowns, life deductions) and a life-tracking system with visual feedback.

  • Player Movement: Controlled via the Arduino joystick.
  • Game Start/Restart: Triggered by a joystick button press or mouse click.
  • Power-Ups: Randomly spawned collectibles that provide advantages or challenges.
  • Objective: Navigate the maze, avoid obstacles, and reach the goal in the least possible time.

 

The game is implemented using p5.js for rendering visuals and managing game logic, while Arduino provides the physical joystick interface. Serial communication bridges the joystick inputs with the browser-based game.

Design
Joystick Input:

X and Y axes: Control player movement.
Button press: Start or restart the game.

Visuals:

Player represented as a black circle.
Heart icons track lives.
Power-ups visually distinct (icon-based).

Feedback:

Life loss triggers sound effects and visual feedback.
Timer displays elapsed and lowest times.
Game-over and win screens provide clear prompts.

Arduino Code:

const int buttonPin = 7; // The pin connected to the joystick button
int buttonState = HIGH;  // Assume button is not pressed initially

void setup() {
    Serial.begin(9600);       // Start serial communication
    pinMode(buttonPin, INPUT_PULLUP); // Set the button pin as input with pull-up resistor
}

void loop() {
    int xPos = analogRead(A0); // Joystick X-axis
    int yPos = analogRead(A1); // Joystick Y-axis
    buttonState = digitalRead(buttonPin); // Read the button state

    // Map analog readings (0-1023) to a more usable range if needed
    int mappedX = map(xPos, 0, 1023, 0, 1000); // Normalize to 0-1000
    int mappedY = map(yPos, 0, 1023, 0, 1000); // Normalize to 0-1000

    // Send joystick values and button state as CSV (e.g., "500,750,1")
    Serial.print(mappedX);
    Serial.print(",");
    Serial.print(mappedY);
    Serial.print(",");
    Serial.println(buttonState);

    delay(50); // Adjust delay for data sending frequency
}

The circuit connects the joystick to the Arduino and includes connections for the button and power LEDs (to indicate remaining lives).

  • Joystick:
    • X-axis: A0
    • Y-axis: A1
    • Click (SW) connected to digital pin 7.
    • VCC and GND to power the joystick module.

The p5.js sketch renders the maze, player, and power-ups, while handling game logic and serial communication.

Key features:

  • Player Class: Handles movement, collision detection, and rendering.
  • Power-Up Class: Manages random spawning, effects, and rendering.
  • Obstacle Class: Generates the maze obstacles and handles how they are drawn.
  • Joystick Input Handling: Updates player movement based on Arduino input.
  • Game Loops: Includes logic for starting, restarting, and completing the game.

Code:

let player; //player variable
let obstacles = []; //list of obstacles
const OBSTACLE_THICKNESS = 18; //thickness of each rectangle
let rectImg, startImg; //maze pattern and start screen
let obstaclesG; // pre rendered obstacle course pattern for performance
let gameStarted = false; //game started flag
let gameEnded = false; //game ended flag
let startTime = 0; //start time 
let elapsedTime = 0; //time passed since start of level
let lowestTime = Infinity; //infinity so the first level completion leads to the new lowest time
let lives = 3; // player starts with 3 lives
let collisionCooldown = false; // Tracks if cooldown is active
let cooldownDuration = 1000; // Cooldown duration in milliseconds
let lastCollisionTime = 0; // Timestamp of the last collision
let heartImg;//live hearts img
let bgMusic;
let lifeLostSound;
let winSound;
let serial; //for arduino connection
let joystickX = 500; // default joystick X position
let joystickY = 500; // default joystick Y position
let powerUps = []; // Array to store power-ups
let powerUpSpawnInterval = 10000; // interval (ms) between spawning new power-ups
let lastPowerUpTime = 0; // time when the last power-up was spawned
let speedUpImg, slowDownImg, loseLifeImg;
let buttonPressed = false;





function preload() {
  rectImg = loadImage('pattern.png'); // Load obstacle pattern
  startImg = loadImage('start.png'); // Load start screen image
  heartImg = loadImage('heart.png');//  load heart image
  bgMusic = loadSound('background_music.mp3'); // background music
  lifeLostSound = loadSound('life_lost.wav');  // Sound for losing a life
  winSound = loadSound('win_sound.wav'); //sound for winning
  speedUpImg = loadImage('speed_up.png'); //icons for powerups
  slowDownImg = loadImage('slow_down.png');
  loseLifeImg = loadImage('lose_life.png');


}

function setup() {
  createCanvas(1450, 900);
  serial = new p5.SerialPort(); // Initialize SerialPort
  serial.open('/dev/tty.usbmodem1101'); //the code for the arduino device being opened
  serial.on('data', handleSerialData);
  player = new Player(30, 220, 15, 5); //maze starting coordinate for player

 //maze background
  obstaclesG = createGraphics(1450, 900);
  obstaclesG.background(220);

  // Add obstacles
  addObstacles(); //adds all the obstacles during setup

  // loops through the list and displays each one
  for (let obs of obstacles) {
    obs.showOnGraphics(obstaclesG);
  }
  bgMusic.loop(); // background music starts
}

function spawnPowerUp() {
    let x, y;
    let validPosition = false;

    while (!validPosition) {
        x = random(50, width - 50);
        y = random(50, height - 50);
        //a valid position for a powerup is such that it does not collide with any obstacles
        validPosition = !obstacles.some(obs =>
            collideRectCircle(obs.x, obs.y, obs.w, obs.h, x, y, 30)
        ) && !powerUps.some(pu => dist(pu.x, pu.y, x, y) < 60);
    }

    const types = ["speedUp", "slowDown", "loseLife"];
    const type = random(types); //one random type of powerup

    powerUps.push(new PowerUp(x, y, type)); //adds to powerup array
}


function handlePowerUps() {
  // Spawn a new power-up if the interval has passed
  if (millis() - lastPowerUpTime > powerUpSpawnInterval) {
    spawnPowerUp();
    lastPowerUpTime = millis(); // reset the spawn timer
  }

  // Display and check for player interaction with power-ups
  for (let i = powerUps.length - 1; i >= 0; i--) {
    const powerUp = powerUps[i];
    powerUp.display();
    if (powerUp.collidesWith(player)) {
      powerUp.applyEffect(); // Apply the effect of the power-up
      powerUps.splice(i, 1); // Remove the collected power-up
    }
  }
}

function draw() {
  if (!gameStarted) {
    background(220);
    image(startImg, 0, 0, width, height);
    noFill();
    stroke(0);

    // Start the game with joystick button or mouse click
    if (buttonPressed || (mouseIsPressed && mouseX > 525 && mouseX < 915 && mouseY > 250 && mouseY < 480)) {
      gameStarted = true;
      startTime = millis();
    }
  } else if (!gameEnded) {
    background(220);
    image(obstaclesG, 0, 0);

    player.update(obstacles); // Update player position
    handlePowerUps(); // Manage power-ups
    player.show(); // Display the player

    // Update and display elapsed time, hearts, etc.
    elapsedTime = millis() - startTime;
    serial.write(`L${lives}\n`);
    displayHearts();

    fill(0);
    textSize(22);
    textAlign(LEFT);
    text(`Time: ${(elapsedTime / 1000).toFixed(2)} seconds`, 350, 50);
    textAlign(RIGHT);
    text(
      `Lowest Time: ${lowestTime < Infinity ? (lowestTime / 1000).toFixed(2) : "N/A"}`,
      width - 205,
      50
    );

    if (dist(player.x, player.y, 1440, 674) < player.r) {
      endGame(); // Check if the player reaches the goal
    }
  } else if (gameEnded) {
    // Restart the game with joystick button or mouse click
    if (buttonPressed || mouseIsPressed) {
      restartGame();
    }
  }
}


function handleSerialData() {
    let data = serial.readLine().trim(); // Read and trim incoming data
    if (data.length > 0) {
        let values = data.split(","); // Split data by comma
        if (values.length === 3) {
            joystickX = Number(values[0]); // Update joystick X
            joystickY = Number(values[1]); // Update joystick Y
            buttonPressed = Number(values[2]) === 0; // Update button state (0 = pressed)
        }
    }
}


function displayHearts() { //display lives
  const heartSize = 40; // size of each heart
  const startX = 650; // x position for hearts
  const startY = 40; // y position for hearts
  for (let i = 0; i < lives; i++) { //only displays as many hearts as there are lives left
    image(heartImg, startX + i * (heartSize + 10), startY, heartSize, heartSize);
  }
}

function endGame() {
  gameEnded = true;
  noLoop(); // stop the draw loop
  winSound.play(); //if game ends
  serial.write("END\n");

  // check if the current elapsed time is a new record
  const isNewRecord = elapsedTime < lowestTime;
  if (isNewRecord) {
    lowestTime = elapsedTime; // update lowest time
    
  }

  // Display end screen
  background(220);
  fill(0);
  textSize(36);
  textAlign(CENTER, CENTER);
  text("Congratulations! You reached the goal!", width / 2, height / 2 - 100);
  textSize(24);
  text(`Time: ${(elapsedTime / 1000).toFixed(2)} seconds`, width / 2, height / 2 - 50);

  // Display "New Record!" message if applicable
  if (isNewRecord) {
    textSize(28);
    fill(255, 0, 0); // Red color for emphasis
    text("New Record!", width / 2, height / 2 - 150);
  }

  textSize(24);
  fill(0); // Reset text color
  text("Click anywhere to restart", width / 2, height / 2 + 50);
}


function mouseClicked() {
 
  if (!gameStarted) {
    // start the game if clicked in start button area
    if (mouseX > 525 && mouseX < 915 && mouseY > 250 && mouseY < 480) {
      gameStarted = true;
      startTime = millis();
    }
  } else if (gameEnded) {
    // Restart game
    restartGame();
  }
}
function checkJoystickClick() {
  if (buttonPressed) {
    if (!gameStarted) {
      gameStarted = true;
      startTime = millis();
    } else if (gameEnded) {
      restartGame();
    }
  }
}

function restartGame() {
  gameStarted = true;
  gameEnded = false;
  lives = 3;
  powerUps = []; // Clear all power-ups
  player = new Player(30, 220, 15, 5); // Reset player position
  startTime = millis(); // Reset start time
  loop();
  bgMusic.loop(); // Restart background music
}


function loseGame() {
  gameEnded = true; // End the game
  noLoop(); // Stop the draw loop
  bgMusic.stop();
  serial.write("END\n");

  // Display level lost message
  background(220);
  fill(0);
  textSize(36);
  textAlign(CENTER, CENTER);
  text("Level Lost!", width / 2, height / 2 - 100);
  textSize(24);
  text("You ran out of lives!", width / 2, height / 2 - 50);
  text("Click anywhere to restart", width / 2, height / 2 + 50);
}


function keyPressed() { //key controls
  let k = key.toLowerCase();
  if (k === 'w') player.moveUp(true);
  if (k === 'a') player.moveLeft(true);
  if (k === 's') player.moveDown(true);
  if (k === 'd') player.moveRight(true);
  if (k === 'f') fullscreen(!fullscreen());
}

function keyReleased() { //to stop movement once key is released
  let k = key.toLowerCase();
  if (k === 'w') player.moveUp(false);
  if (k === 'a') player.moveLeft(false);
  if (k === 's') player.moveDown(false);
  if (k === 'd') player.moveRight(false);
}

class Player { //player class
  constructor(x, y, r, speed) {
    this.x = x;
    this.y = y;
    this.r = r;
    this.speed = speed;

    this.movingUp = false;
    this.movingDown = false;
    this.movingLeft = false;
    this.movingRight = false;
  }

update(obsArray) { //update function
  let oldX = this.x;
  let oldY = this.y;

  //joystick-based movement
  if (joystickX < 400) this.x -= this.speed; // move left
  if (joystickX > 600) this.x += this.speed; // move right
  if (joystickY < 400) this.y -= this.speed; // move up
  if (joystickY > 600) this.y += this.speed; // move down

  // constrain to canvas
  this.x = constrain(this.x, this.r, width - this.r);
  this.y = constrain(this.y, this.r, height - this.r);

  //  restrict movement if colliding with obstacles
  if (this.collidesWithObstacles(obsArray)) {
    this.x = oldX; // revert to previous position x and y
    this.y = oldY;

    // Handle life deduction only if not in cooldown to prevent all lives being lost in quick succession
    if (!collisionCooldown) {
      lives--;
      lastCollisionTime = millis(); // record the time of this collision
      collisionCooldown = true; // activate cooldown
      lifeLostSound.play(); // play life lost sound

      if (lives <= 0) {
        loseGame(); // Call loseGame function if lives reach 0
      }
    }
  }

  // Check if cooldown period has elapsed
  if (collisionCooldown && millis() - lastCollisionTime > cooldownDuration) {
    collisionCooldown = false; // reset cooldown
  }
}


  show() { //display function
    fill(0);
    ellipse(this.x, this.y, this.r * 2);
  }

  collidesWithObstacles(obsArray) { //checks collisions in a loop
    for (let obs of obsArray) {
      if (this.collidesWithRect(obs.x, obs.y, obs.w, obs.h)) return true;
    }
    return false;
  }

  collidesWithRect(rx, ry, rw, rh) { //collision detection function checks if distance between player and wall is less than player radius which means a collision occurred
    let closestX = constrain(this.x, rx, rx + rw);
    let closestY = constrain(this.y, ry, ry + rh);
    let distX = this.x - closestX;
    let distY = this.y - closestY;
    return sqrt(distX ** 2 + distY ** 2) < this.r;
  }

  moveUp(state) {
    this.movingUp = state;
  }
  moveDown(state) {
    this.movingDown = state;
  }
  moveLeft(state) {
    this.movingLeft = state;
  }
  moveRight(state) {
    this.movingRight = state;
  }
}

class Obstacle { //obstacle class
  constructor(x, y, length, horizontal) {
    this.x = x;
    this.y = y;
    this.w = horizontal ? length : OBSTACLE_THICKNESS;
    this.h = horizontal ? OBSTACLE_THICKNESS : length;
  }

  showOnGraphics(pg) { //to show the obstacle pattern image repeatedly
    for (let xPos = this.x; xPos < this.x + this.w; xPos += rectImg.width) {
      for (let yPos = this.y; yPos < this.y + this.h; yPos += rectImg.height) {
        pg.image(
          rectImg,
          xPos,
          yPos,
          min(rectImg.width, this.x + this.w - xPos),
          min(rectImg.height, this.y + this.h - yPos)
        );
      }
    }
  }
}

class PowerUp {
    constructor(x, y, type) {
        this.x = x;
        this.y = y;
        this.type = type; // Type of power-up: 'speedUp', 'slowDown', 'loseLife'
        this.size = 30; // Size of the power-up image
    }

    display() {
        let imgToDisplay;
        if (this.type === "speedUp") imgToDisplay = speedUpImg;
        else if (this.type === "slowDown") imgToDisplay = slowDownImg;
        else if (this.type === "loseLife") imgToDisplay = loseLifeImg;

        image(imgToDisplay, this.x - this.size / 2, this.y - this.size / 2, this.size, this.size);
    }

    collidesWith(player) {
        return dist(this.x, this.y, player.x, player.y) < player.r + this.size / 2;
    }

    applyEffect() {
        if (this.type === "speedUp") player.speed += 2;
        else if (this.type === "slowDown") player.speed = max(player.speed - 1, 2);
        else if (this.type === "loseLife") {
            lives--;
            lifeLostSound.play();
            if (lives <= 0) loseGame();
        }
    }
}

function addObstacles() {
  // adding all obstacles so the collision can check all in an array
  
obstacles.push(new Obstacle(0, 0, 1500, true));
obstacles.push(new Obstacle(0, 0, 200, false));
obstacles.push(new Obstacle(0, 250, 600, false));
obstacles.push(new Obstacle(1432, 0, 660, false));
obstacles.push(new Obstacle(1432, 700, 200, false));
obstacles.push(new Obstacle(0, 882, 1500, true));
obstacles.push(new Obstacle(100, 0, 280, false));
obstacles.push(new Obstacle(0, 400, 200, true));
obstacles.push(new Obstacle(200, 90, 328, false));
obstacles.push(new Obstacle(300, 0, 500, false));
obstacles.push(new Obstacle(120, 500, 198, true));
obstacles.push(new Obstacle(0, 590, 220, true));
obstacles.push(new Obstacle(300, 595, 350, false));
obstacles.push(new Obstacle(100, 680, 200, true));
obstacles.push(new Obstacle(0, 770, 220, true));
obstacles.push(new Obstacle(318, 400, 250, true));
obstacles.push(new Obstacle(300, 592, 250, true));
obstacles.push(new Obstacle(420, 510, 85, false));
obstacles.push(new Obstacle(567, 400, 100, false));
obstacles.push(new Obstacle(420, 680, 100, false));
obstacles.push(new Obstacle(567, 750, 150, false));
obstacles.push(new Obstacle(420, 680, 400, true));
obstacles.push(new Obstacle(410, 90, 200, false));
obstacles.push(new Obstacle(410, 90, 110, true));
obstacles.push(new Obstacle(520, 90, 120, false));
obstacles.push(new Obstacle(410, 290, 350, true));
obstacles.push(new Obstacle(660, 90, 710, false));
obstacles.push(new Obstacle(660, 90, 100, true));
obstacles.push(new Obstacle(420, 680, 500, true));
obstacles.push(new Obstacle(410, 290, 315, true));
obstacles.push(new Obstacle(830, 0, 290, false));
obstacles.push(new Obstacle(760, 200, 70, true));
obstacles.push(new Obstacle(742, 200, 90, false));
obstacles.push(new Obstacle(950, 120, 480, false));
obstacles.push(new Obstacle(1050, 0, 200, false));
obstacles.push(new Obstacle(1150, 120, 200, false));
obstacles.push(new Obstacle(1250, 0, 200, false));
obstacles.push(new Obstacle(1350, 120, 200, false));
obstacles.push(new Obstacle(1058, 310, 310, true));
obstacles.push(new Obstacle(760, 390, 300, true));
obstacles.push(new Obstacle(660, 490, 200, true));
obstacles.push(new Obstacle(760, 582, 200, true));
obstacles.push(new Obstacle(920, 680, 130, false));
obstacles.push(new Obstacle(1040, 310, 650, false));
obstacles.push(new Obstacle(790, 760, 200, false));
obstacles.push(new Obstacle(1150, 400, 400, false));
obstacles.push(new Obstacle(1160, 560, 300, true));
obstacles.push(new Obstacle(1325, 440, 200, false));
obstacles.push(new Obstacle(1240, 325, 150, false));
obstacles.push(new Obstacle(1150, 800, 200, true));
obstacles.push(new Obstacle(1432, 850, 130, false));
obstacles.push(new Obstacle(1240, 720, 200, true));

}

What I’m Proud Of
Joystick Integration: Seamless control with physical inputs enhances immersion.
Dynamic Power-Ups: Randomized, interactive power-ups add a strategic layer.
Visual and Auditory Feedback: Engaging effects create a polished gaming experience.
Robust Collision System: Accurate handling of obstacles and player interaction.

Areas for Improvement:

  1. Tutorial/Instructions: Add an in-game tutorial to help new users understand power-ups and controls. This could be a simple practice maze containing every power-up and a wall to demonstrate collisions.
  2. Level Design: Introduce multiple maze levels with increasing difficulty.
  3. Enhanced Feedback: Add animations for power-up collection and collisions.

Conclusion:

I had a lot of fun working on this project; learning serial communication and integrating all the power-up logic was an especially rewarding experience. I think that with some polishing and more features, this could be a project I publish one day.

 

 

 

Progress on Final Project Week 12

Game Title: Maze Craze

Game Image:


Concept Overview: This project is an interactive maze game where players use a joystick connected to an Arduino Uno to navigate a ball through a maze displayed on a p5.js canvas. The game includes challenges like obstacles, a life system (represented by heart icons), and sound effects for feedback. The player wins by reaching a target point in the maze. The game tracks and displays the fastest completion time. The walls act as obstacles: the player loses a life whenever they touch one, which increases the difficulty and forces players to navigate the maze carefully.

 

Arduino Program Design
Inputs:

Joystick (HW 504):

VRX (X-axis): Controls horizontal movement of the ball.
VRY (Y-axis): Controls vertical movement of the ball.
SW (Button): Can be used to reset the game.
Serial Communication:

Sends the joystick X and Y axis values to the p5.js program as comma-separated data in the format x,y.

 

Arduino Logic:
Read X and Y values from the joystick.
Send the data via serial to p5.js in the format x,y.

Arduino Code:

void setup() {
  Serial.begin(9600); // Initialize serial communication at 9600 baud rate
}

void loop() {
  int xPos = analogRead(A0); // Joystick X-axis
  int yPos = analogRead(A1); // Joystick Y-axis

  // Map analog readings (0-1023) to a more usable range if needed
  int mappedX = map(xPos, 0, 1023, 0, 1000); // Normalize to 0-1000
  int mappedY = map(yPos, 0, 1023, 0, 1000); // Normalize to 0-1000

  // Send joystick values as CSV (e.g., "500,750")
  Serial.print(mappedX);
  Serial.print(",");
  Serial.println(mappedY);

  delay(50); // Adjust delay for data sending frequency
}

P5.js Logic
Game Start:
Display a start screen and wait for mouse click or button press to begin.
Joystick Integration:
Map joystick X and Y data to control the ball’s position on the canvas.
Collision Detection:
Check for collisions with obstacles and deduct a life upon collision.
Game End:
Display a victory or loss message based on game outcomes.

Code for handling serial data:

function handleSerialData() {
  let data = serial.readLine().trim(); // Read and trim data
  if (data.length > 0) {
    let values = data.split(",");
    if (values.length === 2) {
      joystickX = Number(values[0]);
      joystickY = Number(values[1]);
    }
  }
}
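
To show how that data could drive the ball in the sketch, here is a minimal movement function. It is an illustration only, assuming the same 0-1000 mapped range with the stick resting near 500; ballX, ballY, and ballSpeed are placeholder names rather than the final implementation.

let ballX = 100, ballY = 100; // placeholder starting position for the ball
const ballSpeed = 5;          // placeholder movement per frame

function moveBallWithJoystick() {
  // Treat readings near 500 as the neutral position (a simple dead zone)
  if (joystickX < 400) ballX -= ballSpeed; // pushed left
  if (joystickX > 600) ballX += ballSpeed; // pushed right
  if (joystickY < 400) ballY -= ballSpeed; // pushed up
  if (joystickY > 600) ballY += ballSpeed; // pushed down

  // Keep the ball inside the canvas
  ballX = constrain(ballX, 0, width);
  ballY = constrain(ballY, 0, height);
}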

 

 

Week 11 Reading Response

When Design Meets Disability

Reading Graham Pullin’s ‘Design Meets Disability’ made me rethink how we view assistive devices and how much design influences our perception of them. Pullin argues that assistive technology doesn’t have to be just functional—it can also be beautiful, creative, and reflective of individuality. This idea stood out to me because it flips the usual way we think about devices like hearing aids, wheelchairs, or prosthetics. Instead of being tools to hide or blend in, they can be seen as things that people can show off and be proud of, just like any other accessory or piece of technology.

One example Pullin mentions is hearing aids and how they’re often designed to be invisible. I never thought about how strange that is—why do we feel the need to hide something that helps people? What if hearing aids could be stylish, like jewelry, or customized to fit someone’s personality? It’s a simple shift in thinking, but it makes such a big difference. It reminds me of how glasses used to be seen as embarrassing, but now people wear bold frames to express their style. Why can’t assistive devices evolve in the same way? It’s not just about function; it’s about identity and empowerment.

This idea also connects to the bigger issue of how design often caters to an ‘average user’, which leaves a lot of people feeling excluded. Pullin’s focus on inclusive design challenges that by showing how products can be more adaptable and personal. It made me imagine what prosthetic limbs could look like if they were designed with personality in mind—like having different patterns, colors, or even glowing lights. A prosthetic arm could be just as much a fashion statement as a designer handbag or a cool pair of sneakers. This would help break down the stigma around disability by celebrating the creativity and individuality of the people using these devices.

Pullin also makes a really interesting point about beauty. He argues that beauty doesn’t have to mean perfection. Instead, it can come from things that are unique, unexpected, or even imperfect. This reminded me of the Japanese concept of wabi-sabi, which finds beauty in imperfection and the natural flow of life. If we applied that to assistive technology, we could design devices that are not only functional but also artistic and meaningful. For example, a wheelchair could have a sleek, futuristic look, or a prosthetic leg could be designed with intricate patterns that make it stand out in a good way. These designs could change how people think about disability, not as something to pity but as something to appreciate and admire.

In the end, Pullin’s book shows that design is never just about solving problems—it’s about making statements and shaping how people see the world. By bringing creativity into assistive technology, we can create a world that’s not only more inclusive but also more exciting and diverse. Design Meets Disability opened my eyes to how much potential there is in rethinking design and how even small changes can make a huge difference in people’s lives.

Week 10 Reading Response

The Future of Interaction

Reading this article really made me rethink how I interact with technology. I’ve always been fascinated by touchscreens and their simplicity, but I never stopped to consider how limiting they actually are. The author’s critique of “Pictures Under Glass” really hit me, especially when they described how flat and numb these interfaces feel. It’s true—I use my phone every day, but when I think about it, the experience of swiping and tapping feels so disconnected compared to how I interact with physical objects.

One part that really stood out to me was the comparison to tying shoelaces. It reminded me of when I was little and struggling to learn how to tie mine. My hands learned by feeling, adjusting, and figuring things out without needing to rely on my eyes all the time. That’s such a natural way for us to interact with the world, and it’s crazy to think how little that’s reflected in our technology today.

The section about making a sandwich was also a moment of realization for me. It’s such a simple, everyday task, but it involves so much coordination and subtle feedback from your hands—how the bread feels, the weight of the knife, the texture of the ingredients. None of that exists when I swipe through apps or scroll on a website. It made me wonder: why do we settle for technology that ignores so much of what our hands can do?

This article really inspired me to think differently about the future of technology. I agree with the author that we need to aim higher—to create interfaces that match the richness of our human abilities. Our hands are capable of so much more than sliding on glass, and it’s exciting to imagine what might be possible if we started designing for that.

Responses: A Brief Rant on the Future of Interaction Design

I found this follow-up just as thought-provoking as the original rant. The author’s unapologetic tone and refusal to offer a neatly packaged solution make the piece feel refreshingly honest. It’s clear that their main goal is to provoke thought and inspire research, not to dictate a specific path forward. I really appreciated the comparison to early Kodak cameras—it’s a great reminder that revolutionary tools can still be stepping stones, not destinations.

The critique of voice and gesture-based interfaces resonated with me too. I hadn’t really considered how dependent voice commands are on language, or how indirect and disconnected waving hands in the air can feel. The section on brain interfaces was particularly interesting. I’ve always thought of brain-computer connections as a futuristic dream, but the author flipped that idea on its head. Instead of bypassing our bodies, why not design technology that embraces them? The image of a future where we’re immobile, relying entirely on computers, was unsettling but eye-opening.

I love how the author frames this whole discussion as a choice. It feels empowering, like we’re all part of shaping what’s next. It’s made me more curious about haptics and dynamic materials—fields I didn’t even know existed before reading this. I’m left thinking about how we can create tools that actually respect the complexity and richness of human interaction.

 

Week 10 Project Echoes of Light

Concept
Our project, “Light and Distance Harmony,” emerged from a shared interest in using technology to create expressive, interactive experiences. Inspired by the way sound changes with distance, we aimed to build a musical instrument that would react naturally to light and proximity. By combining a photoresistor and distance sensor, we crafted an instrument that lets users shape sound through simple gestures, turning basic interactions into an engaging sound experience. This project was not only a creative exploration but also a chance for us to refine our Arduino skills together.

Materials Used
Arduino Uno R3
Photoresistor: Adjusts volume based on light levels.
Ultrasonic Distance Sensor (HC-SR04): Modifies pitch according to distance from an object.
Piezo Buzzer/Speaker: Outputs the sound with controlled pitch and volume.
LED: Provides an adjustable light source for the photoresistor.
Switch: Toggles the LED light on and off.
Resistors: For the photoresistor and LED setup.
Breadboard and Jumper Wires
Code
The code was designed to control volume and pitch through the analog and digital inputs from the photoresistor and ultrasonic sensor. The complete code, as documented in the previous sections, includes clear mappings and debugging lines for easy tracking.

// Define pins for the components
const int trigPin = 5; // Trigger pin for distance sensor
const int echoPin = 6; // Echo pin for distance sensor
const int speakerPin = 10; // Speaker PWM pin (must be a PWM pin for volume control)
const int ledPin = 2; // LED pin
const int switchPin = 3; // Switch pin
const int photoResistorPin = A0; // Photoresistor analog pin

// Variables for storing sensor values
int photoResistorValue = 0;
long duration;
int distance;

void setup() {
  Serial.begin(9600); // Initialize serial communication for debugging
  pinMode(trigPin, OUTPUT); // Set trigger pin as output
  pinMode(echoPin, INPUT); // Set echo pin as input
  pinMode(speakerPin, OUTPUT); // Set speaker pin as output (PWM)
  pinMode(ledPin, OUTPUT); // Set LED pin as output
  pinMode(switchPin, INPUT_PULLUP); // Set switch pin as input with pull-up resistor
}

void loop() {
  // Check if switch is pressed to toggle LED
  if (digitalRead(switchPin) == LOW) {
    digitalWrite(ledPin, HIGH); // Turn LED on
  } else {
    digitalWrite(ledPin, LOW); // Turn LED off
  }

  // Read photoresistor value to adjust volume
  photoResistorValue = analogRead(photoResistorPin);

  // Map photoresistor value to a range for volume control (0-255 for PWM)
  // Higher light level (LED on) -> lower photoresistor reading -> higher volume
  int volume = map(photoResistorValue, 1023, 0, 0, 255); // Adjust mapping for your setup

  // Measure distance using the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  duration = pulseIn(echoPin, HIGH);

  // Calculate distance in cm
  distance = duration * 0.034 / 2;

  // Set frequency based on distance in the range of 2-30 cm
  int frequency = 0;
  if (distance >= 2 && distance <= 30) {
    frequency = map(distance, 1, 100, 20000, 2000); // Closer = higher pitch, farther = lower pitch
    tone(speakerPin, frequency);
    analogWrite(speakerPin, volume); // Apply the volume based on photoresistor reading
  } else {
    noTone(speakerPin); // Silence the speaker if the distance is out of range
  }

  // Debugging output
  Serial.print("Photoresistor: ");
  Serial.print(photoResistorValue);
  Serial.print("\tVolume: ");
  Serial.print(volume);
  Serial.print("\tDistance: ");
  Serial.print(distance);
  Serial.print(" cm\tFrequency: ");
  Serial.println(frequency);

  delay(100); // Short delay for sensor readings
}

 

Video Demonstration
In our video demonstration, we showcase how the instrument responds to changes in light and proximity. We toggle the LED to adjust volume and move a hand closer or farther from the ultrasonic sensor to change pitch, demonstrating the instrument’s sensitivity and interactive potential.

Reflections
What Went Well: The project successfully combines multiple sensors to create a reactive sound device. The integration of volume and pitch control allows for intuitive, responsive sound modulation, achieving our goal of designing an engaging, interactive instrument.

Improvements: To improve this instrument, we would enhance the melody range, creating a more refined and versatile sound experience. This could involve using additional sensors or more sophisticated sound generation methods to provide a broader tonal range and a richer melody.

Week 9 Reading Response

Reading about classic physical computing projects, I was struck by how these interactive experiences allowed for creative personalization. I was greatly inspired by how simply notions of time, space, and the senses could be creatively challenged, and it is something I intend to pursue in my own physical computing projects going forward.

The theremin-like instruments idea resonated greatly with me. I’d love to explore creating music through hand gestures, adding subtle layers that shift tone and pitch, channeling emotions through motion. Similarly, the glove-based instruments brought to mind using gloves to play sounds with a touch or pinch, bridging the digital and physical in a poetic way. Hands are the way we interact most closely with the world, and having them be the bridge that transcends the physical and virtual is inherently poetic.

The video mirror section made me think about my interest in using computer vision thoughtfully in art. I’d love to create a mirror that captures shadows or movements subtly, making it feel like a shared, non-intrusive moment.

The yell-activated projects felt liberating, inspiring thoughts on using sound to create splashes of color, encouraging expressive, even chaotic, engagement with tech. The passion involved in a scream or a loud sound, uncaring for your environment and showing vulnerability, are all themes that I wish to explore further.

Reading about these timeless projects has me excited to experiment with ways to make them my own. Physical computing offers an amazing canvas, and I would love to take a deeper dive into these topics so that I can hopefully inspire someone too like I was inspired reading about these projects.

Tom Igoe’s article, “Making Interactive Art: Set the Stage, Then Shut Up and Listen,” emphasized that artists should let their interactive artwork speak for itself, rather than over-explaining it. According to Igoe, artists often script or interpret every part of their work, potentially limiting participants’ personal engagement. Instead, Igoe argues that interactive art should initiate a dialogue, allowing the audience to explore, interpret, and react in their own way. He compares this to a director’s approach with actors, guiding without dictating emotions, letting authentic responses emerge. For interactive art, setting up the experience thoughtfully, then stepping back, creates a collaborative process where the audience completes the piece through interaction. I realised that often in my own work, I do not allow space for that creativity, which hinders how interactive my pieces can be. If the observer is first faced with my viewpoint and then the art itself, they will only be able to see it through my eyes. This reading motivates me to learn to separate the art from the artist, in my own and others’ work, and to let my work flourish and have a voice of its own.

Week 9 Sunrise Project

Concept:

The idea for this project came from something I’ve always found fascinating: the transition from night to day, especially the way streetlights turn off as the new day begins and the sun begins to rise.
I’ve always been captivated by the natural beauty of a sunrise, how the sky gradually shifts colors as the day begins. The goal was to create something interactive where turning off a “streetlight” (represented by a blue LED) would trigger a sunrise sequence using an RGB LED. A photoresistor detects the change in light when the streetlight is turned off, which then starts the RGB sunrise sequence.

Demonstration:

Code:

const int blue_led_pin = 9;         // Blue light (streetlight)
const int switch_pin = 2;           // Switch pin
const int photo_resistor_pin = A0;  // Photoresistor pin
const int red_pin = 3;              // Red pin of RGB 
const int green_pin = 5;            // Green pin of RGB 
const int blue_pin = 6;             // Blue pin of RGB

// Variables
bool streetlight_on = true;         // bool to track if the streetlight is on
bool sunrise_started = false;       // bool to track if the sunrise sequence has started

void setup() {
  // Initialize pins
  pinMode(blue_led_pin, OUTPUT);
  pinMode(switch_pin, INPUT);

  pinMode(red_pin, OUTPUT);
  pinMode(green_pin, OUTPUT);
  pinMode(blue_pin, OUTPUT);

  // streetlight starts off as on
  digitalWrite(blue_led_pin, HIGH);
}

void loop() {
  // Check if the switch is pressed
  if (digitalRead(switch_pin) == HIGH && streetlight_on) {
    digitalWrite(blue_led_pin, LOW);   // Turn off streetlight
    streetlight_on = false;            // change the bool to reflect the streetlight being off
  }

  // Read light level from photoresistor
  int light_level = analogRead(photo_resistor_pin);

  // If light level drops below threshold, start sunrise sequence, can be if the blue light turns off or if the light is manually blocked
  if ( !sunrise_started && light_level < 500) { 
    sunrise_effect();
    sunrise_started = true;            // bool confirms that sunrise has started
  }
}

void sunrise_effect() {
  // Gradually change colors like a sunrise
  for (int i = 0; i <= 255; i++) {
    analogWrite(red_pin, i);           // Increase red brightness first
    delay(30);                         // Adjust delay for smooth transition

    if (i > 100) {
      analogWrite(green_pin, i - 100); // Add green for yellow/orange colors
    }

    if (i > 200) {
      analogWrite(blue_pin, i - 200);   // Add blue to reach full white, representing noon
    }

    delay(30);
  }
}

Schematics:

Workings:

The blue LED shines directly onto the photoresistor. When the switch is used to turn it off, the photoresistor reading drops, which triggers the sunrise sequence on the RGB LED. This demonstrates that the photoresistor, not the button press, triggers the sequence: even when the photoresistor is covered with a finger while the blue LED is still on, the reading drops and the sunrise sequence is initiated.

Future Considerations:

I had tremendous fun working on this project, especially learning how the RGB LED works. In the future I’d like to add more LEDs that trigger different sequences on the RGB LED, or perhaps a sound aspect as well, where a loud sound triggers the sequence.

Reading Response 6: Norman and Hamilton

Norman:
When I first opened the article titled “attractive things work better”, I was completely unaware that I would leave with such profound insights.
Norman’s premise—“attractive things work better”—initially appears straightforward, but as he dives deeper, it’s clear that he challenges the pure function-over-form mindset often seen in technology and design. Reading his analysis, I found myself reflecting on my own relationship with technology and my understanding of utility, aesthetics, and the human experience in design.

One example Norman uses to illustrate the power of emotional design is color screens on computers. He recalls initially perceiving color screens as unnecessary, a view that aligns with a utilitarian perspective focused on measurable, practical outcomes. However, he soon realized that, despite offering no overt advantage, color screens created a more engaging experience. I can see this reflected in my choice of smartphones, where choosing a sleeker, more aesthetically pleasing model is a priority even if it performs identically to a cheaper but less aesthetically appealing model. Though a basic model could perform many of the same tasks, I choose a high-end model because it simply feels right. While utilitarianism would label this decision inefficient, Norman’s work suggests that emotion has its own kind of utility.

An interesting case where Norman emphasizes utility over looks is his example of emergency doors, where design has to be immediately intuitive, especially in emergencies. It’s an example where utilitarianism, focused on maximum efficiency, clearly benefits the user. However, in low-stress situations, attractive designs with aesthetic details enhance the user’s experience. This reflects John Stuart Mill’s “higher pleasures” in qualitative utilitarian philosophy, which suggests that intellectual and emotional satisfaction are inseparable from true utility. Norman’s view implicitly critiques a rigid, form-over-function approach by suggesting that design must incorporate both utility and aesthetics to meet the full spectrum of human needs.

As a student, I see Norman’s work inspiring me to think differently about the intersection of technology, utility, and emotion.  Rather than dismissing emotional design as indulgent, Norman helps us see that “attractive things work better” not because of a superficial appeal to aesthetics but because they engage our emotions in ways that functionality alone cannot.

Hamilton:
Margaret Hamilton’s contributions to the Apollo Program and software engineering are monumental feats accomplished against all odds. Her ability to work under high pressure, to predict and plan for critical failures, and her creative thinking made the moon landing a resounding success. At the same time, it saddens me how little I had heard of Hamilton before this article; everyone always talks about Neil Armstrong or Buzz Aldrin and the famous lines they said, while diminishing the ever-important work of a brilliant woman.

Hamilton built in a priority task scheduling system and robust error handling, which came in handy when multiple alarms threatened to abort the Apollo 11 mission. She also coined the term software engineering, lending more legitimacy to a field that now permeates every major and minor avenue of the world. As a woman leading a crucial team in a male-dominated field, she broke significant barriers and paved the way for more gender diversity within STEM.
Her legacy extended beyond the Apollo guidance software; she continued working on error prevention and on the development of the Universal Systems Language (USL), showcasing a lifelong devotion to her field and love for computers.
I am truly inspired by the immense impact that Hamilton has had on the field, contributing to one of humanity’s greatest feats while also shaping how we think of software development today. Her story is a powerful reminder to push boundaries, think creatively and to plan rigorously for every outcome even in the face of insurmountable challenges.

 

 

Assignment 6 Unusual Switch

Concept: 

While thinking of unusual switches, the first thing that came to mind was how suitcase locks work. How setting some sort of combination and pressing a button would make that into a ‘password’ that you could repeatedly use to open the suitcase. Inspired and fascinated by the workings of such a simple concept I came up with the idea of mimicking a combination lock but instead using buttons on a breadboard. Although using our hands wasn’t technically a part of the assignment, I do believe that being able to set a combination using the code, and having the circuit only work when the correct combination is entered is a creative switch.
Code Highlight:

Approaching this problem, I had to think creatively about how the Arduino code would handle multiple switches. In the end, the best solution I arrived at was defining the lock as an ordered combination array of button indices, and then checking that each switch pressed matched the next index in that combination.

const int buttonPins[] = {2, 3, 4, 5}; // Digital pins connected to the buttons
const int ledPin = 13; // Digital pin connected to the LED
int combo[] = {0, 1, 2, 3}; // The correct sequence of button presses (in order)
int input[4]; // Array to store the user's button press sequence
int index = 0; // Keeps track of how many correct buttons have been pressed
bool isUnlocked = false; // Flag to indicate if the correct combination has been entered

void setup() {
  // Set up each button pin as an input
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT);
  }
  // Set up the LED pin as an output
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (!isUnlocked) { // Only check for button presses if the lock is not yet unlocked
    for (int i = 0; i < 4; i++) { // Iterate over each button
      if (digitalRead(buttonPins[i]) == HIGH) { // Check if the current button is pressed
        delay(50); // Small delay to ensure one press is not registered multiple times
        if (digitalRead(buttonPins[i]) == HIGH) { // Confirm the button is still pressed
          if (i == combo[index]) { // Check if this button is the next in the correct sequence
            input[index] = i; // Record the correct press in the input array
            index++; // Move to the next position in the sequence
            delay(500); // Delay to avoid multiple readings from a single press
          } else {
            index = 0; // Reset if a wrong button is pressed
            break; // Exit the loop to start checking from scratch
          }
        }
      }
    }

    if (index == 4) { // If all buttons have been pressed in the correct order
      digitalWrite(ledPin, HIGH); // Turn on the LED to indicate success
      isUnlocked = true; // Set flag to indicate that the lock has been unlocked
    }
  }
}

Demonstration:

The LED only lights up when the correct combination is pressed.
In this case the correct combination is yellow, red, green, blue, chosen for clarity of demonstration.

MidTerm Project: Going Through It

Inspiration:

I’ve always had a fear of snakes. Overcoming it seemed impossible until I started working on my game. Inspired by how snakes consume food whole, I created “Going Through It”, an obstacle course game where the course is designed in the shape of a snake. The player controls a small stick character trying to escape the snake as fast as possible. Adding my own unique twist, the player cannot directly control the jumping ability of the stick figure; instead, the stick behaves like a pogo stick, bouncing off every obstacle it collides with, while the player only controls its rotation using the keyboard.

Challenges Faced:

Developing “Going Through It” presented me with several challenges:

  • Collision Detection: One of the primary difficulties was implementing an effective collision detection system that could handle the stick’s rotation and interactions with obstacles at various angles. Ensuring that the stick responds correctly to collisions, including bouncing off surfaces at appropriate angles, required careful calculation and testing. This is one of the primary features of the game and needed to be as close to perfect as possible, although I am still sorting out minor issues with the collision detection mechanism.
  • Physics and Movement: Balancing the physics of gravity, friction, and rotational speed to create a challenging yet fun experience was another challenge. The stick couldn’t be too fast, the gravity couldn’t be too strong, and the rotation had to be responsive yet precise. Fixing these problems involved a significant amount of playtesting; a simplified sketch of the bounce-and-rotation idea follows this list.
  • User Interface and Feedback: Designing an intuitive user interface that provides clear feedback to players was an essential feature of the game for me. This included displaying elapsed time, providing instructions, and ensuring that game states (such as starting or ending the game) were communicated effectively. In the end I decided to go with a very minimal layout that fits the aesthetic of the game, but I believe it is still intuitive and that someone could understand how to play and win with minimal effort.
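
To give a sense of the kind of logic involved, here is a heavily simplified sketch of that bounce-and-rotation idea. It is an illustration only, not the actual game code: the real game reflects off obstacles at arbitrary angles, and names such as vx, vy, angle, gravity, and bounce are placeholders chosen for the example.

// Simplified pogo-stick physics: gravity pulls the stick down, a collision
// relaunches it along the direction the stick is pointing, and the arrow
// keys only control rotation, never the jump itself.
let x = 200, y = 100;        // stick position (placeholder values)
let vx = 2, vy = 0;          // velocity
let angle = 0;               // stick rotation, set by the player
const gravity = 0.3;         // downward acceleration per frame
const bounce = 0.9;          // energy kept after a bounce (friction-like loss)
const rotationSpeed = 0.05;  // how quickly the player can rotate the stick

function updateStick() {
  // Player input: rotation only
  if (keyIsDown(LEFT_ARROW)) angle -= rotationSpeed;
  if (keyIsDown(RIGHT_ARROW)) angle += rotationSpeed;

  // Physics step
  vy += gravity;
  x += vx;
  y += vy;

  // Bounce when the stick reaches the floor: relaunch along the stick's direction
  if (y > height) {
    y = height;
    const speed = sqrt(vx * vx + vy * vy) * bounce;
    vx = speed * sin(angle);
    vy = -speed * cos(angle);
  }
}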

Final Project:
Conclusions and Reflections:
Reflecting back on this project, developing this game has been both a creative and highly technical journey.
Looking ahead, I hope to improve many aspects of this project as it is an idea that I haven’t seen before. The following are some ideas I have for future improvements to this game.
Level Design: Expanding “Going Through It” with more levels featuring diverse obstacle layouts and increasing difficulty. Moving obstacles are also a feature I hope to tackle in the future.
Multiplayer Mode: Due to the speed-run nature of the game, a mode where players can compete in real time would greatly add to the immersion and entertainment value of the game.

Overall, this project has laid a strong foundation for further development. I am genuinely excited about the game I have created and I hope to keep working on it in the future. Plus, the amount of trigonometry implemented for the collisions has made me a better mathematician, which is always a welcome side effect.