Final Project Idea

Project Title:

Traffic Light Control Game


Overview:

The Traffic Light Control Game is an interactive simulation where a car on the p5.js screen reacts to traffic light LEDs controlled by an Arduino Uno. Players use keyboard arrow keys to control the car’s movement, adhering to basic traffic light rules:

  • Red LED: Stop the car.
  • Green LED: Move the car.
  • Yellow LED: Serves as a warning with no required interaction.

The game emphasizes real-time interaction between physical components and digital visuals, showcasing the integration of hardware and software.


Key Features:

  1. Traffic Light Simulation:
    • Red, yellow, and green LEDs simulate a real-world traffic light.
    • The lights change sequentially in predefined intervals.
  2. Interactive Car Movement:
    • Players use arrow keys to control the car:
      • Up Arrow: Move forward.
      • Down Arrow: Stop the car.
    • The car’s behavior must match the traffic light signals.
  3. Real-Time Feedback:
    • If the car moves during a red light or stops during a green light, a buzzer sounds to indicate a violation.
  4. Game Over:
    • If repeated violations occur, the game ends with a “Game Over” screen.

Objective:

The goal is to follow traffic light rules accurately and avoid violations. The game offers an educational yet engaging experience, simulating real-world traffic scenarios.


Technical Components:

Hardware:
  1. Arduino Uno:
    • Controls traffic light LEDs and buzzer.
  2. 3 LEDs:
    • Represent traffic lights (red, yellow, green).
  3. Buzzer:
    • Provides auditory feedback for rule violations.
  4. Resistors:
    • Limit current to protect the LEDs and buzzer.
  5. Breadboard and Wires:
    • Connect and organize the components.
Software:
  1. Arduino IDE:
    • Manages traffic light logic and sends the light states to p5.js via serial communication.
  2. p5.js:
    • Displays the car and road.
    • Handles player input and real-time car behavior based on the light states.

Implementation Plan:

1. Traffic Light Control:
  • The Arduino controls the sequence of LEDs:
    • Green for 5 seconds.
    • Yellow for 3 seconds.
    • Red for 5 seconds.
  • The current light state is sent to p5.js via serial communication (a rough Arduino sketch follows this plan).
2. Car Movement:
  • The p5.js canvas displays:
    • A road with a car.
    • The current traffic light state using on-screen indicators.
  • Arrow keys control the car’s position:
    • Up Arrow: Move forward.
    • Down Arrow: Stop.
3. Feedback System:
  • If the car moves during a red light or doesn’t move during a green light:
    • A buzzer sounds via Arduino.
    • Violations are logged, and after three violations, the game ends with a “Game Over” message.
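
To make this plan concrete, here is a minimal sketch of the Arduino side covering steps 1 and 3. The pin numbers, the one-character serial protocol ('G', 'Y', 'R' for the light state, and 'V' sent back from p5.js to flag a violation), and the use of tone() for the buzzer are illustrative assumptions rather than the final implementation:

// Traffic light sequencer (sketch only; pin numbers and serial protocol are assumed values)
const int greenPin  = 2;
const int yellowPin = 3;
const int redPin    = 4;
const int buzzerPin = 8;

// Light durations from the plan above, in milliseconds
const unsigned long durations[3] = {5000, 3000, 5000};  // green, yellow, red

int state = 0;                 // 0 = green, 1 = yellow, 2 = red
unsigned long lastChange = 0;  // Time of the last light change

void setup() {
  pinMode(greenPin, OUTPUT);
  pinMode(yellowPin, OUTPUT);
  pinMode(redPin, OUTPUT);
  pinMode(buzzerPin, OUTPUT);
  Serial.begin(9600);
  showState();
}

void loop() {
  // Advance the sequence once the current light's duration has elapsed
  if (millis() - lastChange >= durations[state]) {
    state = (state + 1) % 3;
    lastChange = millis();
    showState();
  }

  // If p5.js reports a violation (assumed 'V' byte), sound the buzzer briefly
  if (Serial.available() > 0 && Serial.read() == 'V') {
    tone(buzzerPin, 440, 300);  // 440 Hz beep for 300 ms
  }
}

// Light the LED for the current state and report it to p5.js as one character
void showState() {
  digitalWrite(greenPin,  state == 0 ? HIGH : LOW);
  digitalWrite(yellowPin, state == 1 ? HIGH : LOW);
  digitalWrite(redPin,    state == 2 ? HIGH : LOW);
  const char codes[3] = {'G', 'Y', 'R'};
  Serial.println(codes[state]);
}

On the p5.js side, the sketch would read these one-character lines (for example with the p5.serialport library or the Web Serial API), update the on-screen light indicator, and write 'V' back to the Arduino whenever the car's behavior contradicts the current light.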

Expected Outcome:

  • Players will interact with a dynamic simulation where their actions on the keyboard directly correspond to the car’s behavior.
  • The integration of physical LEDs and buzzer with digital visuals will create a seamless interactive experience.
  • The project demonstrates a clear understanding of hardware-software integration and real-time interaction design.

Extensions (STILL THINKING ABOUT IT):

  1. Scoring System:
    • Reward correct responses with points.
  2. Dynamic Difficulty:
    • Reduce light duration intervals as the game progresses.
  3. Enhanced Visuals:
    • Add animations for the car (e.g., smooth movement, brake effects).

 

Week 11 Reading Reflection

This week’s reading, Design Meets Disability, made me think differently about how we design for people with disabilities. Instead of just focusing on making tools that work, the reading talks about making them look good too. One idea that stood out to me was how assistive devices, like hearing aids, can be designed to match the user’s style. This turns them from something people might feel shy about into something they’re proud to wear.

I also liked the focus on working directly with the people who will use these designs. When users are involved, the tools are not only more useful but also feel more personal and meaningful. For example, the way glasses became a fashion statement over time shows how design can change how we see things, not just how they work.

This reading made me think about my own projects and how I can use similar ideas. I want to make designs that are simple and easy to use but still look creative and fun. I also want to involve users more, so the designs feel like they belong to them, not just something made for them.

In the end, this reading reminded me that design isn’t just about fixing problems—it’s about improving lives in ways that make people feel seen and valued. It’s a small change in thinking but one that can make a big difference.

Week 10: The Arduino Piano (Takudzwa & Bismark)

For your convenience, the final product is here: https://youtu.be/62UTvttGflo

Concept:

The motivation behind our project was to create a unique piano-like instrument using Arduino circuits. By utilizing two breadboards, we had a larger workspace, allowing for a more complex setup. We incorporated a potentiometer as a frequency controller—adjusting it changes the pitch of the sounds produced, making the instrument tunable. To enhance the experience, we added synchronized LED lights, creating a visual element that complements the sound. This combination of light and music adds a fun, interactive touch to the project. Here’s the project cover:

The tools used for this project were: a potentiometer, a piezo speaker, LEDs, 10kΩ and 330Ω resistors, push buttons, and jumper wires.

Execution:

The following was the schematic for our project, which served as the foundation that allowed us to successfully execute this project:

The following Arduino code snippet brought our project to life, controlling both sound and light to create an interactive musical experience:

// Pin assignments and base note frequencies (example values; adjust to match the actual wiring)
const int buttonPins[4] = {2, 3, 4, 5};         // Push buttons (with external 10k pull-down resistors)
const int ledPins[4]    = {8, 9, 10, 11};       // LEDs paired with each button
const int piezoPin      = 12;                   // Piezo speaker
const int potPin        = A0;                   // Potentiometer used as frequency controller
const int notes[4]      = {262, 294, 330, 349}; // C4, D4, E4, F4 in Hz

void setup() {
  // Set button and LED pins as inputs and outputs
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT);       // Button pins as input (external pull-downs)
    pinMode(ledPins[i], OUTPUT);         // LED pins as output
  }
  pinMode(piezoPin, OUTPUT);             // Speaker pin as output
}

void loop() {
  int potValue = analogRead(potPin);                    // Read potentiometer value
  int pitchAdjust = map(potValue, 0, 1023, -100, 100);  // Map pot value to pitch adjustment range

  // Check each button for presses
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) {         // If button is pressed
      int adjustedFreq = notes[i] + pitchAdjust;      // Adjust note frequency based on potentiometer
      tone(piezoPin, adjustedFreq);                   // Play the adjusted note
      digitalWrite(ledPins[i], HIGH);                 // Turn on the corresponding LED
      delay(200);                                     // Delay to avoid rapid flashing
      noTone(piezoPin);                               // Stop the sound
      digitalWrite(ledPins[i], LOW);                  // Turn off the LED
    }
  }
}

 

Finally, the complete project demo can be found here: https://youtu.be/62UTvttGflo

Reflection:

Although our project may seem simple, we encountered several challenges during its development. Initially, we accidentally placed the digital pins incorrectly, preventing the project from functioning as expected. After hours of troubleshooting, we sought help to identify the issue. This experience turned into a valuable teamwork activity, helping us grow as students and problem-solvers. I view challenges like these as opportunities to build skills I can apply to future projects, including my final one. To enhance this project further, I would improve its visual design and sound quality to make it more appealing to a wider audience. That’s all for now!

A Brief Rant on the Future of Interaction Design

In A Brief Rant on the Future of Interaction Design, Bret Victor argues that most interaction design today isn’t meaningful enough. He believes designers focus too much on making things look nice on screens rather than creating tools that help people think or solve real problems. This stood out to me because I agree that design should do more than just look good—it should make our lives easier or allow us to do more.

As someone studying computer science and interested in interactive media, I think Victor’s ideas are important. He makes me want to focus on designing tools that actually help users accomplish things rather than just looking nice. His views remind me that good design should be about creating real benefits for people, not just entertainment or convenience.

The responses to A Brief Rant on the Future of Interaction Design show different views. Some people agree with Victor and think design should be more useful, while others say his ideas are too difficult to make real. One response I read pointed out that many companies prefer simple screen designs because they make money more easily. This made me think about the challenges of aiming high in design while facing real-life limits, like budgets or technology.

These responses remind me that good design is a balance between what’s possible and what’s ideal. While Victor’s ideas are inspiring, they also show the need for practical solutions. Moving forward, I want to think more about how to push for meaningful design within real-world limits.

Week 9

Project: Dual-Control LED System with LDR Night Light and Button-Controlled LED

Concept

This project explores the integration of analog and digital controls using an LDR (Light Dependent Resistor) and a button to control two LEDs. The first LED functions as an automatic night light, turning on in low-light conditions as detected by the LDR. The second LED is controlled by a button, which allows the user to toggle it on and off. This setup demonstrates the use of both analog and digital inputs to create interactive light control.

Design and Execution

Components
– LDR: Senses ambient light and provides analog input for the night light.
– Button: Serves as a digital switch for the second LED.
– LED 1: Acts as a night light that responds to light levels.
– LED 2: Controlled by the button for manual on/off functionality.
– 220Ω Resistors: Limit current to protect the LEDs.

Schematic Diagram

The circuit is designed on a breadboard, with the LDR setup on one side and the button on the other. Power and ground rails connect both sides to the Arduino.

1. LDR Circuit: Creates a voltage divider with a 10kΩ resistor connected to analog pin A0 (a sample threshold calculation follows this list).
2. Button Circuit: Connected to digital pin 7 with an internal pull-up resistor enabled in the code.
3. LEDs: LED 1 is controlled by the LDR (analog), and LED 2 by the button (digital).
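
As a rough sanity check on the threshold used in the code below (assuming the LDR sits between 5 V and A0 with the 10 kΩ resistor from A0 to ground, which matches the code's behavior of reading low in darkness): the voltage at A0 is 5 V × 10k / (R_LDR + 10k), so a reading of 500 on the 0–1023 ADC scale corresponds to about 2.4 V, i.e. an LDR resistance of roughly 10 kΩ. Darkness raises the LDR's resistance, pulling the reading below the threshold and switching the night light on.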

Code

// Dual-Control LED System: LDR and Button

// Pin definitions
const int led1 = 8;            // LED for night light (LDR-based)
const int sensorPin = A0;      // LDR sensor pin
const int buttonPin = 7;       // Button pin for second LED control
const int led2 = 9;            // Second LED pin (button-controlled)

// Threshold for LDR to turn on the night light
const int threshold = 500;     // Adjust based on ambient light

// Variables
int sensorValue;               // LDR sensor reading
int buttonState;               // Button state

void setup() {
  pinMode(led1, OUTPUT);             // Set LED1 as output
  pinMode(buttonPin, INPUT_PULLUP);  // Set button as input with pull-up
  pinMode(led2, OUTPUT);             // Set LED2 as output
  Serial.begin(9600);                // Initialize serial communication
}

void loop() {
  // LDR-controlled LED (Night Light)
  sensorValue = analogRead(sensorPin);  // Read LDR value
  Serial.println(sensorValue);          // Print for debugging
  
  if(sensorValue < threshold) {         // If below threshold, turn on LED1
    digitalWrite(led1, HIGH);
  } else {                              // Otherwise, turn off LED1
    digitalWrite(led1, LOW);
  }

  // Button-controlled LED
  buttonState = digitalRead(buttonPin); // Read button state
  if (buttonState == LOW) {             // If button is pressed
    digitalWrite(led2, HIGH);           // Turn on LED2
  } else {
    digitalWrite(led2, LOW);            // Turn off LED2
  }
  delay(100); // Small delay for stability
}

 

Observations

– Night Light Sensitivity: The LDR-based LED responds to ambient light, offering a basic “automatic night light” functionality. The threshold value can be adjusted to suit different lighting conditions.
– Button Responsiveness: The button controls the second LED reliably, allowing manual toggling.

Video Demonstration

https://drive.google.com/file/d/1wugSSY6orAYq02qOQg9335lDeIQAwCms/view?usp=drive_link

Reflection and Future Improvements

This project demonstrates the integration of analog and digital inputs for LED control, offering insights into how sensors can drive interactive behavior. Future improvements could include adding adjustable sensitivity for the night light or introducing more complex patterns for the button-controlled LED.

This project enhanced my understanding of basic sensor interactions, providing a foundation for more advanced input/output controls in Arduino projects.

Week 9: Reading responses.

Reflection on “Physical Computing’s Greatest Hits and Misses”

This reading covers both the successes and failures in the field of physical computing. It shows that the best projects are designed to be simple and easy for people to use, while projects that try to be overly complex or focus too much on technology alone often fail. The reading highlights that successful projects blend new technology with a strong focus on what users actually need and find helpful. Reading this, I realized how important it is to keep users in mind when designing, focusing on how they will interact with the system rather than just on the technology itself. When I work on my own projects, this idea of user-centered design can help me make choices that make my designs feel more useful and intuitive. This reflection reminds me to aim for a balance between innovation and usability, focusing on what will make the design easy and enjoyable to use.

 

Reflection on “Making Interactive Art: Set the Stage, Then Shut Up and Listen”

This reading talks about how artists can create interactive art by giving the audience space to explore and interpret the work on their own. Instead of controlling every aspect of the experience, the artist should set up a framework and then let the audience interact with it however they want. This approach lets people bring their own ideas and feelings into the art, creating a more personal connection. I find this approach inspiring because it gives people the freedom to experience the art in their own way, rather than the artist guiding them to a specific idea or feeling. In my own work, I usually try to direct how users should interact, but this reading has me thinking about how allowing more freedom can create a deeper, more meaningful experience. By stepping back and letting users take control, I can create projects that are more engaging and leave room for different interpretations.

Week 8 Reading Response

Her Code Took Us to the Moon

Reading about Margaret Hamilton’s work on the Apollo mission taught me a lot about how careful and exact programming needed to be. The mission depended on software, so even a small mistake could lead to failure. This reading reminds me that when working on physical programming projects, I need to pay close attention to every detail. As we move into the hands-on part of the course, I’ll focus on testing my projects carefully, just like the Apollo team used simulations to catch problems early and make sure everything worked as it should.

Norman’s “Emotion & Design: Why Good Looks Matter”

In Norman’s reading, I found it interesting how he connects emotions and looks with how well things work. He says that when something is nice to look at, it makes people feel good, and that makes it easier to use. This idea makes me want to design things that aren’t just useful but also make people curious and excited to try them. Moving forward, I’ll try to add features that make people want to explore, while keeping things simple to use. Norman’s ideas showed me that good looks can improve the whole experience, helping people enjoy and connect with what they’re using.

 

Week 8 Assignment

Ultrasonic Distance-Based LED Switch

Concept: This project uses an HC-SR04 ultrasonic sensor and an Arduino to activate an LED when an object is within 10 cm. This distance-based switch can serve as a touchless light control.

Materials
– Arduino Uno
– HC-SR04 Ultrasonic Sensor
– LED
– 220Ω Resistor
– Breadboard and Jumper Wires

Process:
1. Setup: The ultrasonic sensor’s Trig and Echo pins connect to Digital Pins 12 and 13 on the Arduino, while the LED is connected to Digital Pin 2 with a resistor.
2. Code: The Arduino reads the distance from the sensor. If within 10 cm, it lights the LED; otherwise, it turns off.
3. Testing: The LED successfully turns on when an object is close, providing feedback on proximity.

CODE: 

const int echo = 13;     // Echo pin of the ultrasonic sensor
const int trig = 12;     // Trigger pin of the ultrasonic sensor
int LED = 2;             // LED pin

long duration = 0;       // Variable to store pulse duration (microseconds)
int distance = 0;        // Variable to store calculated distance (cm)

void setup() {
  pinMode(trig, OUTPUT);    // Set trig pin as output
  pinMode(echo, INPUT);     // Set echo pin as input
  pinMode(LED, OUTPUT);     // Set LED pin as output
  Serial.begin(9600);       // Initialize serial monitor for debugging
}

void loop() {
  // Send out a 10 microsecond pulse on the trig pin
  digitalWrite(trig, LOW);
  delayMicroseconds(2);
  digitalWrite(trig, HIGH);
  delayMicroseconds(10);
  digitalWrite(trig, LOW);

  // Read the echo pin and calculate distance
  duration = pulseIn(echo, HIGH);
  distance = (duration / 2) / 29.1;  // Convert duration to distance in cm

  // Turn on the LED if the distance is less than 10 cm
  if (distance < 10 && distance > 0) {  // Check distance within range
    digitalWrite(LED, HIGH);  // Turn on LED
  } else {
    digitalWrite(LED, LOW);   // Turn off LED
  }

  // Print distance to the serial monitor for debugging
  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.println(" cm");

  delay(500);  // Delay for stability
}
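
A note on the conversion above: sound travels at roughly 343 m/s, or about 0.0343 cm per microsecond, so each centimeter of travel takes about 29.1 µs. The measured echo time is halved because the pulse travels to the object and back, and the result is then divided by 29.1 to give the distance in centimeters.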

VIDEO DEMONSTRATION

Reflection: This project demonstrates basic sensor integration and is adaptable for touchless applications in home automation.

MIDTERM PROJECT: SUPERMAN SAVES

INTRODUCTION

For my midterm project, I decided to build upon an earlier project concept, evolving it into a full-fledged interactive game called “Superman Saves”. This project incorporates the concepts and techniques I’ve learned in class so far.

CONCEPT

The concept of the game is simple: Superman has to rescue a person from danger by navigating obstacles such as clouds and birds. The player uses arrow keys to control Superman’s movements, helping him avoid the obstacles while trying to rescue the person in time. As the player progresses through different levels, the game increases in difficulty by speeding up the obstacles, making it more challenging to achieve the objective. 

RESOURCES

For the background image, I used DALL·E to generate it. I got the sounds from freesound.org.

HIGHLIGHTS

One of the more challenging aspects of the project was implementing accurate collision detection between Superman and the moving obstacles (clouds and birds). The collision detection logic ensures that when Superman gets too close to an obstacle, he loses a life, and his position is reset. The difficulty lies in precisely calculating the distance between Superman and the obstacles, accounting for the different speeds and movements of the clouds and birds.

The code snippet below handles collision detection for both clouds and birds, resetting Superman’s position and decreasing his lives if a collision occurs:

function checkCollision() {
  // Check collisions with clouds
  if (dist(supermanX, supermanY, cloudX1, cloudY1) < 50 ||
      dist(supermanX, supermanY, cloudX2, cloudY2) < 50 ||
      dist(supermanX, supermanY, cloudX3, cloudY3) < 50) {
    supermanX = width / 2;
    supermanY = height - 100; // Reset Superman's position
    lives -= 1; // Lose a life
    return true;
  }

  // Check collisions with birds
  if (dist(supermanX, supermanY, birdX, birdY) < 50) {
    supermanX = width / 2;
    supermanY = height - 100; // Reset Superman's position
    lives -= 1; // Lose a life
    return true;
  }
  return false;
}

 

CHALLENGES AND IMPROVEMENTS

Creating the dynamic background and ensuring smooth movement was initially challenging. Managing multiple moving elements (clouds, birds, stars) required a balance between performance and visual appeal. Looking ahead, I plan to add more features such as power-ups for Superman, different types of obstacles, and possibly multiplayer options.

 

EMBEDDED CODE

LINK TO FULL SCREEN

WEEK 5 READING RESPONSE

Computer vision and human vision are very different in how they process information. While people can quickly understand what they see, computers need specific algorithms to detect motion, brightness changes, and object differences. We help computers “see” by adjusting lighting and using techniques like background subtraction and motion detection to improve tracking accuracy.

In interactive art, computer vision allows viewers to engage with the artwork in real-time. By tracking movements and gestures, it creates an immersive experience where the audience becomes an active participant, enhancing their interaction with the art.

However, this ability to track people also raises concerns about privacy, especially in public spaces. While it makes art more interactive and responsive, the same technology can be used for surveillance, which can feel invasive. Artists and technologists must strike a balance between creating innovative interactive art and respecting individual privacy, ensuring the technology is used responsibly and ethically.