Final Project

Concept & Goals

Inspiration: Give a plant a “voice” through digital animation and sound, fostering empathy and care.

The idea was to monitor ambient light and human touch on the plant, and to translate those signals into a friendly digital avatar’s mood, expressed through a color-changing lamp, a moving leaf, and background music.

    • Goals:

      1. Awareness of plant wellbeing via playful tech.

      2. Interaction through capacitive touch (DIY sensor) and light sensing.

      3. Empathy by giving the plant a way to “talk back.”

Video

Avatar

Setup

Hardware Overview

1. Capacitive Touch Sensor (DIY)

    • Pins: D4 (send) → 1 MΩ resistor → D2 (receive + foil/copper tape).

    • Library: Paul Badger’s CapacitiveSensor, installed for the Arduino sketch.

    • Assembly:

      1. Connect a 1 MΩ resistor between pin 4 and pin 2.

      2. Attach copper tape to the receiving leg (pin 2) and wrap gently around the plant’s leaves.

      3. In code: CapacitiveSensor capSensor(4, 2);

2. Photoresistor (Ambient Light)

    • Pins: LDR → +5 V; 10 kΩ → GND; junction → A0.

    • Function: Reads 0–1023, mapped to 0–255 to control lamp intensity.

3. Push-Button (Music Control)

    • Pins: Button COM → D7, NO → GND (using INPUT_PULLUP).

4. Mood LEDs

    • Pins:

      • Green LED1: D12 → 330 Ω → LED → GND

      • Red LED2: D13 → 330 Ω → LED → GND

    • Behavior:

      • Red LED on when touch > high threshold (indicates that the plant does not like the touch).

      • Green LED on when touch < low threshold (the plant is calm and likes the touch).

Arduino Code

#include <CapacitiveSensor.h>

// Capacitive sensor: send→ 4, receive→ 2
CapacitiveSensor capSensor(4, 2);

// Mood LEDs
const int ledPin   = 12;
const int ledPin2  = 13;

// Photoresistor
const int photoPin = A0;

// Push-button + button LED
const int buttonPin    = 7;  // COM→7, NO→GND
const int buttonLedPin = 8;  // music-state indicator LED (pin 8 assumed)

// Hysteresis thresholds
const long thresholdHigh = 40;
const long thresholdLow  = 20;

// Debounce
const unsigned long debounceDelay = 50;
unsigned long lastDebounceTime    = 0;
int           lastButtonReading   = HIGH;

// State trackers
bool musicOn = false;
bool led1On  = false;

void setup() {
  Serial.begin(9600);
  delay(100);

  // Let p5.js know we start OFF
  Serial.println("MUSIC_OFF");

  // Mood LEDs
  pinMode(ledPin,   OUTPUT);
  pinMode(ledPin2,  OUTPUT);

  // Capacitive sensor raw
  capSensor.set_CS_AutocaL_Millis(0);

  // Button LED off
  pinMode(buttonLedPin, OUTPUT);
  digitalWrite(buttonLedPin, LOW);

  // Push-button with pull-up
  pinMode(buttonPin, INPUT_PULLUP);
}

void loop() {
  // Button toggle (only prints MUSIC_ON / MUSIC_OFF) 
  int reading = digitalRead(buttonPin);
  if (reading != lastButtonReading) {
    lastDebounceTime = millis();
  }
  if (millis() - lastDebounceTime > debounceDelay) {
    static int buttonState = HIGH;
    if (reading != buttonState) {
      buttonState = reading;
      if (buttonState == LOW) {            // you pressed
        musicOn = !musicOn;
        Serial.println(musicOn ? "MUSIC_ON" : "MUSIC_OFF");
        digitalWrite(buttonLedPin, musicOn ? HIGH : LOW);
      }
    }
  }
  lastButtonReading = reading;

  // Capacitive
  long sensorValue = capSensor.capacitiveSensor(30);
  Serial.println(String("TOUCH:") + sensorValue);

  // Mood LED hysteresis 
  if (!led1On && sensorValue > thresholdHigh) {
    led1On = true;
  } else if (led1On && sensorValue < thresholdLow) {
    led1On = false;
  }
  digitalWrite(ledPin2, led1On ? HIGH : LOW);  // red: touch above the high threshold
  digitalWrite(ledPin,  led1On ? LOW  : HIGH); // green: calm, below the low threshold

  // Photoresistor
  int raw       = analogRead(photoPin);
  int mappedVal = map(raw, 0, 1023, 0, 255);
  Serial.println(String("LAMP:") + mappedVal);

  delay(50);
}

Serial messages:

    • MUSIC_ON / MUSIC_OFF (button)

    • TOUCH:<value> (capacitive)

    • LAMP:<0–255> (light)

P5js Code

let port;
const baudrate = 9600;
let connectButton;
let bgMusic;
let interactionStarted = false;
let isTouched = false;
let lampBrightness = 0;
let musicOn = false;        // tracks MUSIC_ON / MUSIC_OFF from the Arduino
let audioUnlocked = false;  // set once the browser AudioContext is resumed


let plankCount = 6;
let cam;
let myFont;
let waveAngle = 0;
let isWaving = false;
let waveStartTime = 0;



function preload() {
  myFont = loadFont('CalligraphyFLF.ttf');
  bgMusic = loadSound('musica.mp3');  // load  MP3
}
// Unlock browser audio on the first user gesture.
// Works for clicks, touches, and fullscreen events.
const unlockAudio = () => {
  if (!audioUnlocked) {
    getAudioContext().resume().then(() => {
      console.log('AudioContext unlocked');
      audioUnlocked = true;
      if (musicOn && bgMusic.isLoaded()) bgMusic.loop();
    });
  }
};
// Mouse/touch unlock
window.addEventListener('mousedown', unlockAudio);
window.addEventListener('touchstart', unlockAudio);
// Also unlock on fullscreen change
document.addEventListener('fullscreenchange', unlockAudio);

function setup() {
  createCanvas(windowWidth, windowHeight, WEBGL);

  connectButton = createButton("Connect to Arduino");
  connectButton.position(20, 20);
  connectButton.mousePressed(() => {
    port.open(baudrate);
  });

  port = createSerial();
  const used = usedSerialPorts();
  if (used.length > 0) {
    port.open(used[0], baudrate);
  } else {
    console.warn("No previously used serial ports found.");
  }

  setInterval(onSerialData, 50);   

  textFont(myFont);
  textSize(36);
  textAlign(CENTER, CENTER);

  cam = createCamera();
  const fov = PI / 3;
  const cameraZ = (height / 2.0) / tan(fov / 2.0);
  cam.setPosition(0, 0, cameraZ);
  cam.lookAt(0, 0, 0);
  frameRate(60);
}
function onSerialData() { // poll the port and parse newline-terminated messages
  if (!port || !port.opened()) return;
  while (port.available() > 0) {
    const raw = port.readUntil('\n');
    if (!raw) break;
    const line = raw.trim();
    console.log('Received:', line);

    if (line.startsWith('LAMP:')) {
      lampBrightness = int(line.split(':')[1]);
    } else if (line.startsWith('TOUCH:')) {
      const t = int(line.split(':')[1]);
      // ignore baseline zero readings
      if (t > 0) {
        isTouched = true;
        isWaving = true;
        waveStartTime = millis();
      }
    } else if (line === 'START') {
      interactionStarted = true;
    } else if (line === 'MUSIC_ON') {
      musicOn = true;
      if (bgMusic.isLoaded()) {
        console.log('MUSIC_ON → play');
        bgMusic.play();
      }
    } else if (line === 'MUSIC_OFF') {
      musicOn = false;
      if (bgMusic.isPlaying()) {
        console.log('MUSIC_OFF → stop');
        bgMusic.stop();
      }
    }
  }
}

//Draw
function draw() {
  background(210, 180, 140);
  ambientLight(10);
  pointLight(255, 200, 150, 200, -300, 300);
  pointLight(255, 150, 100, 400, -300, 300);

  drawWarmGradientBackground();
  drawWarmLamp();
  drawWoodDeck();
  drawWoodenBase();
  drawShadow();
  drawPot();
  drawBody();
  drawFace();
  drawPetiole();
  if (!interactionStarted) {
    drawInstructionFrame();
  } else {
    // once interaction starts, add a lamp-tinted point light whose
    // intensity follows the photoresistor value from the Arduino
    pointLight(lampBrightness, lampBrightness * 0.8, 100, 200, -300, 300);
  }

  if (isWaving) {
    waveAngle = sin((millis() - waveStartTime) / 2000) * 0.5;

    if (millis() - waveStartTime > 400) {
      isWaving = false;
      waveAngle = 0;
    }
  }

  push();
  rotateZ(waveAngle);
  drawLeaf(0, -140, 0, 1, 4);
  pop();

  push();
  rotateZ(-waveAngle);
  drawLeaf(0, -140, 0, -1, 4);
  pop();

}

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
  const fov = PI / 3;
  const cameraZ = (height / 2.0) / tan(fov / 2.0);
  cam.setPosition(0, 0, cameraZ);
}


function drawWarmLamp() {
  push();
  translate(250, -250, 0);

  // glow — modulate alpha by lampBrightness
  for (let r = 300; r > 50; r -= 20) {
    push();
    noStroke();
    // fade alpha between 0 and 150
    let a = map(lampBrightness, 0, 255, 20, 150);
    fill(255, 180, 90, a);
    ellipse(0, 10, r, r * 1.2);
    pop();
  }

  // stand
  fill(100, 70, 50);
  translate(0, 200, 0);
  cylinder(5, 400);

  // base
  translate(0, 200, 0);
  fill(80, 60, 40);
  ellipse(0, 0, 80, 16);

  // lampshade (inverted cone)
  push();
  translate(0, -400, 0);
  fill(225, 190, 150);
  cone(50, -100, 24);
  pop();

  pop();
}

function drawWoodDeck() {
  push();
    rotateX(HALF_PI);
    translate(0, -2, -170);
    // static plank color
    fill(160, 100, 60);
    stroke(100, 60, 40);
    const plankHeight = -50;
    for (let i = 0; i < plankCount; i++) {
      box(width, plankHeight, 10);
      translate(0, plankHeight, 0);
    }
  pop();
}

function drawWoodenBase() {
  push();
    rotateX(HALF_PI);
    translate(0, 150, -200);
    // static plank color
    fill(160, 100, 60);
    stroke(100, 60, 40);
    const baseCount = 8;
    const plankWidth = (width * 1.2) / baseCount;
    for (let i = 0; i < baseCount; i++) {
      push();
        translate(-width * 0.6 + i * plankWidth + plankWidth / 2, 24, 0);
        box(plankWidth, 400, 20);
      pop();
    }
  pop();
}


function drawShadow() {
  push();
  translate(0, 150, -10);
  rotateX(HALF_PI);
  noStroke();
  fill(80, 70, 60, 30);
  ellipse(0, 0, 190, 30);
  pop();
}

function drawPot() {
  push();
  translate(0, 100, 0);
  fill(170, 108, 57);
  stroke(120, 70, 40);
  strokeWeight(1);
  cylinder(60, 80);
  translate(0, -50, 0);
  fill(190, 120, 70);
  cylinder(80, 20);
  pop();
}

function drawBody() {
  push();
  translate(0, 20, 0);
  noStroke();
  fill(150, 255, 150);
  sphere(75);
  translate(0, -90, 0);
  sphere(70);

  // highlights
  fill(255, 255, 255, 90);
  translate(-30, -10, 50);
  sphere(10);

  pop();
}

function drawPetiole() {
  push();
  translate(0, -110, 20);  // start from top of head
  rotateX(radians(200));  // slight backward tilt
  fill(100, 200, 100);
  noStroke();
  cylinder(8, 100);       
  pop();
}


function drawLeaf(x, y, z, flip = 1, scaleFactor = 1) {
  push();
  translate(x, y, z);
  rotateZ(flip * QUARTER_PI); // tilt outwards
  rotateX(HALF_PI - QUARTER_PI * 0.5); // add a slight backward curve
  scale(flip * scaleFactor, scaleFactor); // flip + scale


  fill(100, 200, 100);
  stroke(60, 160, 80);
  strokeWeight(1);

  beginShape();
  vertex(0, 0);
  bezierVertex(-35, -90, -70, -10, 0, 0); // left curve
  endShape(CLOSE);

  // Center vein
  stroke(90, 150, 80);
  strokeWeight(2);
  line(0, 0, -40, -29);
  pop();
}

function drawFace() {
  push();
  translate(0, -100, 70);

  stroke(10);
  strokeWeight(2);
  noFill();
  arc(-20, 0, 20, 10, 0, PI);
  arc(20, 0, 20, 10, 0, PI);

  // blush
  noStroke();
  fill(255, 200, 200, 100);
  ellipse(-35, 10, 15, 8);
  ellipse(35, 10, 15, 8);

  // smile
  stroke(30);
  noFill();
  arc(0, 15, 30, 10, 0, PI);

  pop();
}
function drawWarmGradientBackground() {
  push();
  translate(0, 0, -500); // move far behind the scene
  noStroke();
  beginShape();
  fill(250, 230, 200); // top (warm cream)
  vertex(-width, -height);
  vertex(width, -height);
  fill(210, 170, 130); // bottom (warm brownish-orange)
  vertex(width, height);
  vertex(-width, height);
  endShape(CLOSE);
  pop();
}


function drawInstructionFrame() {
  push();
    // move to a spot on the back wall
    translate(-width * 0.25, -height * 0.25, -490);

    // outer frame (landscape)
    fill(120, 80, 40);
    box(430, 300, 10);

    // inner “paper” canvas inset
    push();
      translate(0, 0, 7);
      fill(255, 245, 220);
      box(390, 260, 2);
    pop();

    // text
    push();
      translate(0, 0, 12);
      fill(60, 40, 30);
      textSize(40);
      textAlign(CENTER, CENTER);
      text("Make the Plant Happy\n- Press to play Music\n- Control the Lighting\n- Pet the plant", 0, 0);
    pop();
  pop();
}

Functionality Flow

    1. Startup

      • Arduino sends MUSIC_OFF. p5.js opens port, waits for START (button press on avatar).

    2. Interaction

      • Touch: Plant touch → TOUCH:<value> → leaf animation.

      • Light: Ambient light changes → LAMP:<0–255> → lamp glow intensity.

      • Music: Physical push-button → MUSIC_ON/MUSIC_OFF → background music.

Challenges & Solutions

    • Serial Fragmentation: Split messages were mis-parsed → switched to single println("TOUCH:"+value) calls.

    • Sensor Hysteresis: Capacitive values vary → implemented high/low thresholds to avoid flicker.

    • Full-Screen Behavior: Music wouldn’t play in full screen → tied audio unlock to fullscreenchange event.

Progress Report

Current Progress

The p5.js environment has been successfully designed; I went from a 2D version to a 3D avatar rendered in WEBGL. The system also includes a custom font and a wooden background platform for visual warmth. A floating instruction frame appears at the beginning of the interaction, prompting users to “press to start.”

The Arduino hardware components (photoresistor, DIY capacitive touch sensor, LEDs, and buzzer) are currently being tested. I am actively working on matching sensor input to the avatar’s behavior (e.g., facial expression, sound).

Video 

What’s Being Tested

    • Touch Sensor + LEDs → Plant’s mood environment (happy, sad)

    • Touch Input → Start and Display instructions

    • Avatar Design → Body, leaf animation, emotional face drawn in p5.js

    • Instructions Interface → Initial user onboarding screen

Pending Tasks

    • Finalizing the integration of the Arduino circuit into the physical plant (soldering and arranging).

    • Smoothing the interaction between sensor readings and p5.js visual/audio feedback.

    • Conducting user tests to assess how people engage with the plant-avatar system.

Avatar Demo

Final Project Concept: Plant Whisperer – Interactive Plant Companion

Plant Whisperer is a physical-digital interaction system that lets a real plant express how it feels through a friendly digital avatar. Using Arduino Uno and p5.js, the system monitors the plant’s environment, specifically light exposure and human interaction, and translates this data into visual and auditory feedback.

I want to promote awareness of nature and care through playful, intuitive technology. It reimagines how we perceive and respond to non-verbal living things by giving the plant a way to “talk back.”

Sensors and Components

    • Photoresistor (Light Sensor): Detects ambient light around the plant.

    • Capacitive Touch Sensor – DIY (using the CapacitiveSensor library): Detects when the user interacts with the plant.

    • RGB LED: Shows the plant’s current emotional state in color.

    • Piezo Buzzer: Plays tones based on mood or user interaction.

Avatar Behavior (in p5.js)

The avatar is a stylized digital plant that changes facial expression, movement, background color, and plays ambient sounds based on the sensor input.

Inspiration

Mood Logic:

Sensor Input     Mood      Visual in p5.js                LED Color       Sound
Bright light     Happy     Smiling plant, upright         Green           Gentle chime
Dim light        Sleepy    Drooping plant, closed eyes    Blue            Soft drone
Very low light   Sad       Frowning plant, faded color    Purple          Slow tone
Button pressed   Excited   Eyes sparkle, leaf wiggle      Yellow flash    Upbeat trill
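
In p5.js terms, the table boils down to a small mood-selection function. The sketch below is only a hypothetical illustration of that logic: the threshold values and the names moodFromSensors and lightLevel are illustrative, not taken from the final code.

function moodFromSensors(lightLevel, buttonPressed) {
  if (buttonPressed)    return 'excited'; // eyes sparkle, yellow flash, upbeat trill
  if (lightLevel > 600) return 'happy';   // smiling and upright, green, gentle chime
  if (lightLevel > 200) return 'sleepy';  // drooping, blue, soft drone
  return 'sad';                           // frowning, purple, slow tone
}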

Significance of the Project

My goal with this project is to encourage mindfulness, care, and emotional engagement with the environment. By giving a non-verbal living organism a digital voice, the system fosters empathy and attention.

This project is about more than just monitoring a plant; it’s about interaction. By gently blurring the line between organic life and digital expression, Plant Whisperer invites users to slow down, observe, and connect with their environment through technology.

Reading response

Design Meets Disability is a new take on how design can play a much bigger role in the world of assistive technology. Instead of treating medical devices as something to hide, it encourages us to see them as an opportunity for creativity and personal expression. One of Pullin’s coolest examples is the leg splint designed by Charles and Ray Eames during World War II: originally meant to help injured soldiers, it later inspired their famous furniture. It’s a great reminder that functional design can also be beautiful and impactful beyond its original use.

Pullin makes a strong case for reimagining everyday medical devices, like hearing aids and prosthetic limbs, as something more than just tools. Think about glasses: they used to be seen purely as medical necessities, but now they’re a full-on fashion accessory. So why not give the same design attention to other devices? By making them customizable and stylish, users could feel more confident wearing them, and society might begin to see disability in a new, more positive light.

What’s especially great is Pullin’s emphasis on involving people with disabilities in the design process. He believes that assistive tech should reflect the user’s identity, not just meet a need. That mindset leads to more thoughtful, inclusive designs, and it helps everyone, not just those with disabilities. The reading is an inspiring call to designers to be more empathetic, open-minded, and creative in how they think about ability, aesthetics, and innovation.

Final Project Proposal

Plant Whisperer

I want to make an interactive system that allows a plant to “communicate” its environment and feelings through a digital avatar. Using an Arduino, the project reads light levels and touch interactions, and sends this data to a p5.js sketch. The p5.js interface visualizes the plant’s mood in real time using simple animations, color changes, and sound.

For example, if the plant is in low light, the avatar may appear tired or sleepy, and the RGB LED on the Arduino will turn a soft blue. If the user touches the plant (using a button or capacitive sensor), it responds with a cheerful animation and sound, and the LED may flash in a brighter color.

It’s intended to be playful, meditative, and emotionally engaging—bridging physical and digital experiences in a way that encourages users to care for their environment.

Serial Communication

Arduino & p5.js

[Genesis and Nikita]

Each exercise focuses on the communication between the Arduino’s physical components and visual output through serial communication.


Exercise 1: Moving an Ellipse with One Sensor

video

Arduino Code:

void setup(){
 Serial.begin(9600);
}
void loop(){
 int pot = analogRead(A0);
 // send mapped X (0–400) to p5
 int xPos = map(pot, 0, 1023, 0, 400);
 Serial.println(xPos);
 delay(30);
}

circuit:

Explanation:

This exercise uses a single analog sensor (potentiometer) connected to the Arduino to control the horizontal position of an ellipse in p5.js. The Arduino reads the potentiometer value and sends it over Serial to the p5.js sketch, which updates the x-position of the ellipse accordingly. The communication is one-way: Arduino → p5.js.
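
A minimal p5.js counterpart could look like the sketch below. This is an illustrative sketch rather than the exact code we ran: it assumes the same p5.webserial helpers (createSerial, usedSerialPorts, readUntil) used in the final-project code above, and the variable name xPos is mine.

let port;
let xPos = 200; // last x-position received from the Arduino

function setup() {
  createCanvas(400, 400);
  port = createSerial();
  // reopen a previously authorized port, if one exists
  const used = usedSerialPorts();
  if (used.length > 0) port.open(used[0], 9600);
}

function draw() {
  background(240);
  // each serial line is the mapped x-position (0–400)
  const line = port.readUntil('\n');
  if (line.length > 0) xPos = int(line.trim());
  ellipse(xPos, height / 2, 40, 40);
}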


Exercise 2: Controlling LED Brightness from p5.js

video

Arduino Code:

// Arduino: LED brightness via Serial.parseInt()
const int ledPin = 9; // PWM-capable pin


void setup() {
 pinMode(ledPin, OUTPUT);
 Serial.begin(9600); 
}


void loop() {
 if (Serial.available() > 0) {
   int brightness = Serial.parseInt(); // Read integer from serial
   brightness = constrain(brightness, 0, 255); // Clamp
   analogWrite(ledPin, brightness);
 }
}

circuit:

Explanation:

In this exercise, the communication is reversed. The p5.js sketch sends a brightness value (0–255) to Arduino via Serial, which adjusts the brightness of an LED connected to a PWM-capable pin (pin 9). This demonstrates real-time control from software (p5.js) to hardware (Arduino).
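
On the p5.js side, a sketch along the following lines would send the value (again assuming the p5.webserial API used elsewhere on this page; driving brightness from mouseX is just one possible input):

let port;
let lastSent = -1; // only write when the value changes

function setup() {
  createCanvas(400, 200);
  port = createSerial();
  const used = usedSerialPorts();
  if (used.length > 0) port.open(used[0], 9600);
}

function draw() {
  background(30);
  fill(255);
  const brightness = int(map(mouseX, 0, width, 0, 255, true));
  text('Brightness: ' + brightness, 20, 30);
  // the newline terminates the integer for Serial.parseInt() on the Arduino
  if (port.opened() && brightness !== lastSent) {
    port.write(brightness + '\n');
    lastSent = brightness;
  }
}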


Exercise 3: Gravity Wind with LED Bounce Indicator

video

Arduino Code:

/*
* Exercise 3 Arduino:
* - Read pot on A0, send sensorValue to p5.js
* - Listen for 'B' from p5.js → blink LED on pin 9
*/
const int sensorPin = A0;
const int ledPin    = 9;


void setup() {
 Serial.begin(9600);       
 pinMode(ledPin, OUTPUT);
}


void loop() {
 // Read and send sensor
 int sensorValue = analogRead(sensorPin);
 Serial.println(sensorValue);


 // Check for bounce command
 if (Serial.available() > 0) {
   char inChar = Serial.read();
   if (inChar == 'B') {
     digitalWrite(ledPin, HIGH);
     delay(100);
     digitalWrite(ledPin, LOW);
   }
 }


 delay(20);
}

circuit:

Explanation:

This sketch is a modified version of the classic p5.js Gravity + Wind example. An analog sensor (potentiometer) on Arduino controls the wind force in p5.js. Every time the ball hits the bottom (a “bounce”), p5.js sends a command (‘B’) back to Arduino via Serial, which briefly lights up an LED. This showcases a complete two-way communication system between Arduino and p5.js.
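
A stripped-down p5.js side for this exercise might look like the sketch below (again assuming the same p5.webserial helpers; the gravity and damping constants are illustrative):

let port;
let position, velocity, gravity, wind;

function setup() {
  createCanvas(640, 360);
  port = createSerial();
  const used = usedSerialPorts();
  if (used.length > 0) port.open(used[0], 9600);
  position = createVector(width / 2, 50);
  velocity = createVector(0, 0);
  gravity = createVector(0, 0.5);
  wind = createVector(0, 0);
}

function draw() {
  background(255);
  // the latest pot reading (0–1023) steers the horizontal wind force
  const line = port.readUntil('\n');
  if (line.length > 0) {
    wind.x = map(int(line.trim()), 0, 1023, -1, 1);
  }
  velocity.add(gravity);
  velocity.add(wind);
  position.add(velocity);
  ellipse(position.x, position.y, 48, 48);
  if (position.y >= height - 24) {       // ball hit the floor
    position.y = height - 24;
    velocity.y *= -0.9;                  // bounce with damping
    if (port.opened()) port.write('B');  // tell the Arduino to blink
  }
}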


Readings

The text “A Brief Rant on the Future of Interaction Design” takes a critical look at modern digital interfaces and points out how they often force users to adapt to strict, outdated design rules instead of the other way around. The author argues that interfaces should be more in tune with natural human thought processes, which could lead to more flexible and easier-to-use systems. For example, the text challenges the reliance on traditional metaphors in design that can limit how people interact with technology, suggesting that a rethinking of these strategies would better serve everyone.

In the responses, various designers and thinkers share their own views on what works and what doesn’t in today’s interaction design. Many contributors agree that sticking to rigid structures can suppress innovation and user engagement, while others offer practical examples from their work. One common point is that when interfaces are redesigned to be more intuitive, it often results in smoother and more productive user experiences, showing that a change in approach can have positive real-world benefits.

Overall, both readings encourage a move toward interaction design that feels more natural and accommodating to users. The discussion emphasizes the importance of creating technology that adapts to how people actually think and work, rather than forcing users to learn and conform to outdated digital patterns. This friendly call for change makes it clear that smarter design is not just a theoretical goal; it can lead to improvements in everyday technology that benefit us all.

Musical Instrument

Group members: Genesis & Nikita

Concept:

video

“WALL-E’s Violin” is inspired by the character WALL-E from the popular animated movie. The idea behind “WALL-E’s Violin” is to mimic WALL-E expressing emotion through music, specifically through a violin — a symbol of sadness and nostalgia. A cardboard model of WALL-E is built with an ultrasonic distance sensor acting as his eyes. A cardboard cutout of a violin and bow are placed in front of WALL-E to represent him holding and playing the instrument.

As the user moves the cardboard stick (the “bow”) left and right in front of WALL-E’s “eyes” (the ultrasonic sensor), the sensor detects the distance of the bow. Based on the distance readings, different predefined musical notes are played through a piezo buzzer. These notes are chosen for their slightly melancholic tones, capturing the emotional essence of WALL-E’s character.

To add a layer of animation and personality, a push-button is included. When pressed, it activates two servo motors attached to WALL-E’s sides, making his cardboard arms move as if he is emotionally waving or playing along. The mechanical hum of the servos blends into the soundscape, enhancing the feeling of a robot expressing music not just through tones, but with motion — like a heartfelt performance from a lonely machine.

How it Works:

  • Ultrasonic Sensor: Acts as WALL-E’s eyes. It continuously measures the distance between the robot and the cardboard bow moved in front of it.
  • Note Mapping: The Arduino interprets the measured distances and maps them to specific, predefined musical notes. These notes are chosen to sound melancholic and emotional, fitting WALL-E’s character.
  • Piezo Buzzer: Plays the corresponding musical notes as determined by the distance readings. As the user sways the bow, the pitch changes in real time, simulating a violin being played.
  • Servo Motors + Button: Two servo motors are connected to WALL-E’s arms. When the button is pressed, they animate his arms in a waving or playing motion. The sound of the servos adds a robotic texture to the performance, further humanizing the character and blending physical movement with audio expression.

Code:

#include <Servo.h>
#include <math.h>

// Pin Assignments

const int trigPin   = 12;   // Ultrasonic sensor trigger
const int echoPin   = 11;   // Ultrasonic sensor echo
const int buzzerPin = 8;    // Buzzer output
const int buttonPin = 2;    // Button input for servo tap
const int servoPin  = 9;    // First servo control pin (default starts at 0°)
const int servoPin2 = 7;    // Second servo control pin (default starts at 180°)

Servo myServo;    // First servo instance
Servo myServo2;   // Second servo instance

// Bittersweet Melody Settings

int melody[] = {
  294, 294, 370, 392, 440, 392, 370, 294
};
const int notesCount = sizeof(melody) / sizeof(melody[0]);


// Sensor to Note Mapping Parameters

const int sensorMin = 2;     // minimum distance (cm)
const int sensorMax = 30;    // max distance (cm)
int currentNoteIndex = 0;  

// Vibrato Parameters 

const float vibratoFrequency = 4.0;  
const int vibratoDepth = 2;          

// Timer for vibrato modulation

unsigned long noteStartTime = 0;

void setup() {
  // Initialize sensor pins
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  
 // Initialize output pins
  pinMode(buzzerPin, OUTPUT);
  pinMode(buttonPin, INPUT_PULLUP);  
  
  // Initialize the first servo and set it to 0°.
  myServo.attach(servoPin);
  myServo.write(0);
  
  // Initialize the second servo and set it to 180°.
  myServo2.attach(servoPin2);
  myServo2.write(180);
  
 // Initialize serial communication for debugging
  Serial.begin(9600);
  
// Get the initial note based on the sensor reading
  currentNoteIndex = getNoteIndex();
  noteStartTime = millis();
}

void loop() {
 // Get the new note index from the sensor
  int newIndex = getNoteIndex();
  
 // Only update the note if the sensor reading maps to a different index
  if (newIndex != currentNoteIndex) {
    currentNoteIndex = newIndex;
    noteStartTime = millis();
    Serial.print("New Note Index: ");
    Serial.println(currentNoteIndex);
  }
  
 // Apply vibrato to the current note.
  unsigned long elapsed = millis() - noteStartTime;
  float t = elapsed / 1000.0;  // Time in seconds for calculating the sine function
  int baseFreq = melody[currentNoteIndex];
  int modulatedFreq = baseFreq + int(vibratoDepth * sin(2.0 * PI * vibratoFrequency * t));
  
// Output the modulated tone
  tone(buzzerPin, modulatedFreq);
  
  // Check the button press to trigger both servos (movement in opposite directions).
  if (digitalRead(buttonPin) == LOW) {
    tapServos();
  }
  
  delay(10); 
}

// getNoteIndex() measures the sensor distance and maps it to a note index.
int getNoteIndex() {
  int distance = measureDistance();
// Map the distance (between sensorMin and sensorMax) to the range of indices of the array (0 to notesCount-1).
  int index = map(distance, sensorMin, sensorMax, 0, notesCount - 1);
  index = constrain(index, 0, notesCount - 1);
  return index;
}

// measureDistance() triggers the ultrasonic sensor and calculates the distance in cm.
int measureDistance() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2;
  return constrain(distance, sensorMin, sensorMax);
}

// tapServos() performs a tap action on both servos:
// - The first servo moves from 0° to 90° and returns to 0°.
// - The second servo moves from 180° to 90° and returns to 180°.
void tapServos() {
  myServo.write(90);
  myServo2.write(90);
  delay(200);
  myServo.write(0);
  myServo2.write(180);
  delay(200);
}

 

Circuit:

Future Improvements:

    • Add more notes or an entire scale for more musical complexity.
    • Integrate a servo motor to move the arm or head of WALL-E for animation.
    • Use a better speaker for higher-quality sound output.
    • Use the LED screen to type messages and add to the character.

 

Analog input & output

Concept

For this project, I tried to build a simple lighting system using an LDR (light sensor), a push-button, and two LEDs. One LED changes brightness depending on how bright the room is, and the other lights up to show when I’ve manually taken control.

I wanted to manually override the automatic light control with the press of a button—so if I want the light to stay at a fixed brightness no matter how bright it is outside, I just hit the button. Press it again, and the automatic behavior comes back.

I used TinkerCad for the circuit simulation.

Video

How It Works

    1. The LDR is connected to pin A0 and tells the Arduino how bright the environment is.

    2. Based on this reading, the Arduino maps the value to a number between 0 and 255 (for LED brightness).

    3. The LED on pin 9 gets brighter when it’s dark and dims when it’s bright—automatically.

    4. I also wired a button to pin 2. When I press it, the system switches to manual mode:

      • In this mode, the LED stays at medium brightness, no matter the light level.

      • An indicator LED on pin 13 lights up to let me know I’m in manual mode.

    5. Pressing the button again switches back to automatic mode.

Code

// Pin Definitions
const int ldrPin = A0;         // LDR sensor connected to A0
const int buttonPin = 2;       // Push-button connected to digital pin D2
const int pwmLedPin = 9;       // PWM LED for the ambient light effect
const int overrideLedPin = 13; // Digital LED for manual override indicator

// Variables
bool manualOverride = false;   // Tracks if the override mode is active
int lastButtonState = LOW;     // With external pull-down, default is LOW
unsigned long lastDebounceTime = 0;
const unsigned long debounceDelay = 50; // Debounce time in milliseconds

void setup() {
  pinMode(ldrPin, INPUT);
  pinMode(buttonPin, INPUT);
  pinMode(pwmLedPin, OUTPUT);
  pinMode(overrideLedPin, OUTPUT);
  
  // Start with manual override off, LED off
  digitalWrite(overrideLedPin, LOW);
  Serial.begin(9600);
}

void loop() {
  // Read the LDR Sensor
  int ldrValue = analogRead(ldrPin);
  
  // Map the LDR value to PWM brightness (0-255).
  // Darker environment (low ldrValue) yields a higher brightness.
  int pwmValue = map(ldrValue, 0, 1023, 255, 0);
  
  // Handle the Push-Button for Manual Override with Debouncing
  int reading = digitalRead(buttonPin);
  if (reading != lastButtonState) {
    lastDebounceTime = millis();
  }
  if ((millis() - lastDebounceTime) > debounceDelay) {
    //Unpressed = LOW, pressed = HIGH.
    if (reading == HIGH && lastButtonState == LOW) { // button press detected
      manualOverride = !manualOverride;
      // Update the indicator LED accordingly
      digitalWrite(overrideLedPin, manualOverride ? HIGH : LOW);
    }
  }
  lastButtonState = reading;
  
  // LED Behavior Based on Mode 
  if (manualOverride) {
    // In manual override mode, set LED to a fixed brightness.
    analogWrite(pwmLedPin, 128);
  } else {
    // Set brightness according to ambient light measured by the LDR.
    analogWrite(pwmLedPin, pwmValue);
  }
  
  // Debug output
  Serial.print("LDR Value: "); Serial.print(ldrValue);
  Serial.print(" | PWM Brightness: "); Serial.print(pwmValue);
  Serial.print(" | Manual Override: "); Serial.println(manualOverride ? "ON" : "OFF");
  
  delay(10);
}

Challenges

    • Balancing Automatic and Manual Modes:
      Getting the right balance between automatic brightness adjustments and a satisfying manual override was a fun challenge. I had to fine-tune the mapping of LDR readings to PWM values until the LED’s response felt right in different lighting conditions.

    • Debugging with Serial Monitor:
      Utilizing the Serial Monitor was incredibly useful. Every time something wasn’t working as expected, I added more Serial prints to understand what was happening.

Readings

Reading 1

Reading Tom Igoe’s blog post on the “greatest hits (and misses)” of physical computing really made me reflect on how approachable and endlessly creative this field can be. As someone who’s still growing my understanding of interactive technology, I found his message super reassuring, especially the idea that it’s okay to revisit classic project ideas like light-sensitive instruments or pressure-sensitive floors. Instead of feeling like I’m copying someone else’s work, Igoe made me realize that there’s value in reinterpreting those familiar ideas in new, personal ways. It reminded me that creativity isn’t always about inventing something completely new—it’s also about seeing old ideas through a different lens.

Igoe’s emphasis on how machines don’t understand intention, only action, was what caught my attention the most. It got me thinking about how important it is to really consider how people will interact with what I make. It’s not just about the cool factor of the tech; it’s about how people feel when they use it. Igoe’s point about designing systems that are intuitive and human-centered stuck with me. It encouraged me to start paying more attention to how people move, gesture, and react in everyday life, because that’s where meaningful interaction begins.

Reading 2

In “Making Interactive Art: Set the Stage, Then Shut Up and Listen,” Tom Igoe presents a thoughtful perspective on the role of the artist in interactive work. He emphasizes that the artist’s job is to create an environment that invites engagement, not to dictate how people should respond. This idea stood out because it shifts the focus from controlling the audience’s experience to trusting them to find their own way through the piece. Igoe’s comparison to theater directing, where the director sets the scene but lets the actors interpret it, captures this idea well. It reminded me that successful interaction often comes from subtle suggestions, not detailed instructions.

What I found especially valuable was Igoe’s insistence on listening, observing how people interact with a piece instead of trying to steer them. It’s a reminder that interactive art is a two-way conversation, and part of the creative process is stepping back and allowing space for surprise and interpretation. This approach encourages more open-ended, personal experiences for the audience.