Control LED by light – Week 9

I created a straightforward project demonstrating a simple light-activated switch using a photoresistor. A photoresistor, also known as a light-dependent resistor (LDR), changes its resistance based on the amount of light it receives. When it’s dark, the resistance is high; when it’s light, it is low.

Components:

  • Arduino Uno
  • Photoresistor
  • Resistor (10 kΩ)
  • LED
  • Resistor (330 Ω)
  • Jumper wires

Setup:

  • Connect one leg of the photoresistor to the 5V pin of the Arduino and the other leg to analog pin A0. From that same junction, connect the 10 kΩ resistor to GND, forming a voltage divider.
  • Connect the LED to digital pin 7 through the 330 Ω resistor, with its other leg to GND.
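
With this divider, the voltage at A0 is Vout = 5 V × 10 kΩ / (R_LDR + 10 kΩ). Assuming a typical photoresistor that swings from roughly 1 kΩ in bright light to 100 kΩ or more in darkness, the reading moves from about 4.5 V (bright) down to under 0.5 V (dark); that is the range the code below rescales to a 0–10 level.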

Upload the following code to your Arduino:

int photo = A0; // photoresistor voltage divider output
int led = 7;    // LED with its 330 Ω series resistor

void setup() {
  pinMode(led, OUTPUT);
}

void loop() {
  int lightRaw = analogRead(photo);          // 0 (dark) to 1023 (bright)
  int light = map(lightRaw, 0, 1023, 0, 10); // rescale to a 0-10 brightness level

  if (light < 6) {
    digitalWrite(led, HIGH); // dark enough: turn the LED on
  } else {
    digitalWrite(led, LOW);  // bright enough: turn the LED off
  }

  delay(100); // brief pause between readings
}

Future Developments

  • One potential future development for this project would be to create a more sophisticated light-activated switch that could be used in various applications. For example, the switch could turn on lights automatically when it gets dark or open and close doors or windows.
  • Another possibility would be integrating the switch into a more extensive system, such as a home automation system. This would allow users to control their lights and other devices with their smartphones or tablets.
  • Additionally, the switch could be made more user-friendly by adding features such as a dimmer switch or a timer. This would allow users to customize the switch’s behavior to suit their needs.
Overall, this project has many potential future developments. With creativity and ingenuity, the light-activated switch could be used in countless ways to make our lives easier and more convenient.

I had a lot of fun with this assignment! It was really cool to see how a simple idea could be turned into a daily application that helps us save electricity.

Margaret Hamilton’s journey – Week 8a

Reading about Margaret Hamilton’s journey in programming the Apollo mission and Don Norman’s thoughts on emotional design made me think deeply about how we blend innovation and design in technology. Hamilton was working on something as monumental as landing humans on the moon, and she was a pioneer both for women in tech and for the discipline of software engineering itself. It’s fascinating and a bit disheartening to see how, despite such trailblazers, we still grapple with gender equality in STEM. I wonder what more we must do to make the field more inclusive.

One passage that particularly resonated with me was Hamilton’s description of her work as “both a joy and a responsibility.” This encapsulates her passion and dedication to her work, recognizing its immense significance. It also highlights the pressure and burden she carried, knowing that the lives of astronauts depended on the code she and her team wrote. Even when others saw it as unnecessary, Hamilton’s insistence on adding error checks speaks volumes about foreseeing and mitigating human errors in tech. It’s a reminder of how vital it is to balance trust in technology with caution, especially as we lean more into AI and automation today.

On the flip side, Don Norman’s take on design brings a whole new layer to how we interact with technology. His story about the three teapots, each with a unique blend of aesthetics and functionality, shows that design is about more than how things look: it is also about how they make us feel and how well they work for us. It got me thinking about how we design our digital tools and interfaces. Are we considering how they feel to use, or just how they function?

Both pieces highlight the importance of looking beyond the surface, whether breaking down gender barriers in tech or creating designs that delight and serve. They make me believe that innovation is about more than the next big tech breakthrough; it is also about making technology more human, accessible, and enjoyable for everyone.

Midterm Project – Space

This project combines the thrill of techno music with the interactivity of hand gestures, providing players with a unique gaming experience. Using a webcam, players can control the spaceship’s movement with simple hand gestures, making the game both engaging and intuitive.

The game environment features a spaceship navigating through a field of objects. The objective is to avoid colliding with the objects while collecting points. The game uses the Handpose machine-learning model to detect hand gestures, allowing players to control the spaceship’s movement by opening or closing their hand. Additionally, players can make a fist to pause or resume the game, adding an extra layer of interaction.

One aspect of the project that I’m particularly proud of is the seamless integration of hand gestures for controlling the spaceship. By leveraging the hand pose model provided by the ml5.js library, I was able to accurately detect and interpret hand movements in real time, providing players with responsive and intuitive controls. Additionally, the dynamic gameplay, with objects spawning randomly and increasing in speed over time, keeps players engaged and challenged throughout the game.

function detectHands() {
  if (predictions.length > 0) {
    const landmarks = predictions[0].landmarks;
    // highlightHand(landmarks, 'green'); // Optional: Uncomment to see hand landmarks

    if (isClosedFist(landmarks)) {
      let currentTime = millis();
      if (currentTime - fistDetectedTime > fistToggleDelay) {
        isGamePaused = !isGamePaused;
        console.log(isGamePaused ? "Game Paused" : "Game Resumed");
        fistDetectedTime = currentTime;
      }
    } else if (isOpenHand(landmarks)) {
      let averageX = landmarks.reduce((acc, val) => acc + val[0], 0) / landmarks.length;
      if (averageX < width / 2) {
        spaceship.setDir(-1);
      } else {
        spaceship.setDir(1);
      }
    } else {
      spaceship.setDir(0);
    }
  }
}

// Check if the hand is open by measuring the spread between the fingertips
function isOpenHand(landmarks) {
  let minDist = Infinity;
  // Landmarks 4, 8, 12, 16, and 20 are the fingertip points in the Handpose model
  for (let i = 4; i <= 20; i += 4) {
    for (let j = i + 4; j <= 20; j += 4) {
      let dist = distanceBetweenPoints(landmarks[i], landmarks[j]);
      if (dist < minDist) {
        minDist = dist;
      }
    }
  }
  return minDist > 50; // even the closest pair of fingertips is spread apart -> open hand
}

Another key element that enhances the immersive experience of the Gesture-Controlled Game is the synchronization of the music with the background elements. By integrating dynamic sound effects and music, the game creates a cohesive audio-visual experience that engages players on multiple sensory levels.

function drawGameElements() {
    let level = amplitude.getLevel();   // current music volume, 0 to 1
    let size = map(level, 0, 1, 5, 20); // louder music -> larger points for more impact

    // Cycle through HSB colors 
    colorMode(HSB, 360, 100, 100, 100);
    colorPhase = (colorPhase + 1) % 360;

    for (let i = 0; i < 400; i++) {
        let x = noise(i * 0.1, frameCount * 0.01) * width;
        let y = noise((i + 100) * 0.1, frameCount * 0.01 + level * 5) * height;

        // Dynamic stroke color
        let hue = (colorPhase + i * 2) % 360;
        let alpha = map(level, 0, 1, 50, 100); // louder music -> more opaque points

        // Simulated glow effect
        for (let glowSize = size; glowSize > 0; glowSize -= 4) {
            let glowAlpha = map(glowSize, size, 0, 0, alpha); 
            stroke(hue, 80, 100, glowAlpha);
            strokeWeight(glowSize);
            point(x, y);
        }
    }

    colorMode(RGB, 255);
    spaceship.show();
    spaceship.move();
    handleParticles();
}

Improvements:

Optimization and Performance: Consider optimizing the game’s performance, especially around graphics and rendering. This means minimizing unnecessary calculations and draw calls, particularly inside hot loops like drawGameElements(); one possible direction is sketched below.
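
As a rough illustration (a sketch under assumptions, not the shipped implementation; bgDetail and adjustDetail() are names introduced here for the example), the number of background points could adapt to the measured frame rate:

let bgDetail = 400; // hypothetical: how many background points to draw

function adjustDetail() {
  // frameRate() reports p5.js's running average of frames per second
  if (frameRate() < 30 && bgDetail > 100) {
    bgDetail -= 10; // struggling: draw fewer points next frame
  } else if (frameRate() > 55 && bgDetail < 400) {
    bgDetail += 10; // headroom available: restore detail
  }
}

The loop in drawGameElements() would then iterate up to bgDetail instead of a fixed 400, with adjustDetail() called once per frame from draw().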

Game Mechanics Refinement: Assess and refine the game mechanics for a better gameplay experience. This could involve adjusting spaceship movement controls, particle spawning rates, or particle effects to enhance engagement and challenge.

Midterm Project Draft: Interactive Hand-Gesture Controlled Game

Project Concept
The core idea behind my midterm project is to develop an interactive game controlled entirely through hand gestures. The game will leverage a hand-tracking library to interpret the player’s hand movements as input commands. I aim to create an engaging and intuitive user experience that does not rely on traditional input devices like keyboards, mice, or game controllers.

Design & User Interaction
Players will interact with the game through various hand gestures. For instance, an open hand gesture will start the game, while making a fist and holding it for a brief moment will pause or resume the game. The game’s mechanics and objectives will be designed around these gestures, ensuring that the player’s physical movements are seamlessly translated into in-game actions.

To detect and interpret these gestures, I will use a hand-tracking library that provides real-time hand position and gesture recognition. The player’s hand movements will be captured through a webcam and processed to identify specific gestures. Based on the detected gestures, the game will execute corresponding actions, such as starting, pausing, or resuming gameplay.

Code Design
Gesture Detection Functions: I have implemented functions like detectOpenHandToStart() and detectHands() to detect specific hand gestures. These functions use the hand-tracking library’s predictions to analyze the hand’s position and orientation.

Hand Highlighting: The highlightHand() function visualizes the player’s hand position on the screen, enhancing user feedback and interaction.

Gesture Recognition Algorithms: Functions such as isOpenHand() and isClosedFist() distinguish between different hand gestures by analyzing the distances between hand landmarks. These algorithms are crucial for converting physical gestures into game commands.

let video;
let handpose;
let predictions = [];
let isGamePaused = false;
let fistDetectedTime = 0;
const fistToggleDelay = 2000;

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO); // start the webcam feed
  video.hide();                 // hide the raw <video> element; draw() renders it instead
  handpose = ml5.handpose(video, modelReady);
  handpose.on('predict', results => {
    predictions = results;      // keep only the most recent hand predictions
  });
}

function modelReady() {
  console.log("Handpose model ready!");
}

function draw() {
  background(255);
  image(video, 0, 0, width, height);
  detectHands();
}

function detectHands() {
  if (predictions.length > 0) {
    const landmarks = predictions[0].landmarks;
    highlightHand(landmarks, 'green');

    if (isClosedFist(landmarks)) {
      let currentTime = millis();
      if (currentTime - fistDetectedTime > fistToggleDelay) {
        isGamePaused = !isGamePaused;
        console.log(isGamePaused ? "Game Paused" : "Game Resumed");
        fistDetectedTime = currentTime;
      }
    }
  }
}

function highlightHand(landmarks, color) {
  fill(color);
  landmarks.forEach(point => {
    ellipse(point[0], point[1], 10, 10);
  });
}

function isOpenHand(landmarks) {
  let minDist = Infinity;
  // Fingertip landmarks are 4, 8, 12, 16, and 20
  for (let i = 4; i <= 20; i += 4) {
    for (let j = i + 4; j <= 20; j += 4) {
      let dist = distanceBetweenPoints(landmarks[i], landmarks[j]);
      if (dist < minDist) {
        minDist = dist;
      }
    }
  }
  return minDist > 50; // fingertips spread apart -> open hand
}

function isClosedFist(landmarks) {
  let maxDist = 0;
  // Compare neighboring fingertips; a fist keeps them all close together
  for (let i = 4; i < landmarks.length - 4; i += 4) {
    let dist = distanceBetweenPoints(landmarks[i], landmarks[i + 4]);
    if (dist > maxDist) {
      maxDist = dist;
    }
  }
  return maxDist < 40; // all neighboring fingertips within 40 px -> fist
}

function distanceBetweenPoints(point1, point2) {
  // Euclidean distance between two 3D landmarks [x, y, z]
  return Math.hypot(point2[0] - point1[0], point2[1] - point1[1], point2[2] - point1[2]);
}

Challenges & Risk Mitigation
The most challenging aspect of this project was developing reliable gesture recognition algorithms that can accurately interpret the player’s intentions from the hand’s position and movement. Misinterpretation of gestures could lead to a frustrating user experience.

To address this challenge, I focused on refining my gesture recognition algorithms (isOpenHand() and isClosedFist()) to improve their accuracy and robustness. I tested them with different hand sizes and lighting conditions to check their reliability across a range of scenarios. Additionally, I implemented visual feedback (via highlightHand()) to help players adjust their gestures for better recognition.
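
One concrete direction for handling different hand sizes (a sketch of the idea, not what the draft currently does; palmSize() and the 0.5 factor are illustrative) is to normalize fingertip distances by the palm size instead of using fixed pixel thresholds:

// Hypothetical size normalization: landmark 0 is the wrist and landmark 9 is
// the base of the middle finger in the Handpose model
function palmSize(landmarks) {
  return distanceBetweenPoints(landmarks[0], landmarks[9]);
}

function isOpenHandNormalized(landmarks) {
  let minDist = Infinity;
  for (let i = 4; i <= 20; i += 4) { // fingertips: 4, 8, 12, 16, 20
    for (let j = i + 4; j <= 20; j += 4) {
      minDist = Math.min(minDist, distanceBetweenPoints(landmarks[i], landmarks[j]));
    }
  }
  // Compare against a fraction of the palm size rather than a fixed pixel count
  return minDist > 0.5 * palmSize(landmarks);
}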

Next Steps
This project represents a significant step towards creating more natural and immersive gaming experiences. Going forward, I aim to explore new possibilities in game design and interaction by leveraging hand gestures as input.

Reading Reflection 5 – Pavly Halim

Reflecting on the insights gained from “Computer Vision for Artists and Designers: Pedagogic Tools and Techniques for Novice Programmers” offers a perspective on the intersection between technology, art, and education. This journey through the text has broadened my understanding of computer vision as a technical field and deepened my appreciation for its artistic and pedagogical applications. It’s fascinating to see how computer vision can bridge the tangible world and digital expression, enabling artists and designers to explore new dimensions of creativity. The examples of interactive installations and projects highlighted in the text, such as the LimboTime game or the innovative use of surveillance for artistic expression, showcase the power of computer vision to create engaging and thought-provoking experiences. These projects underscore the potential of computer vision to transform our interaction with the digital world, making technology an integral part of artistic exploration and expression.

Moreover, the text’s emphasis on making computer vision accessible to novice programmers through pedagogic tools and techniques is particularly inspiring. It demystifies a complex field, making it approachable for artists and designers who may not have a strong background in programming. This approach not only democratizes the field of computer vision but also encourages a more inclusive community of creators who can bring diverse perspectives to the intersection of art and technology. The discussion around optimizing physical conditions to enhance the performance of computer vision algorithms was enlightening, highlighting the intricate dance between the digital and physical realms. As I reflect on the reading, I am left with a profound excitement about the possibilities at the confluence of computer vision, art, and design. This text educates and inspires, pushing the boundaries of what we perceive as possible in digital art and design.

Reading Reflection 4 – Pavly Halim

Reflecting on Chapter One of “The Psychopathology of Everyday Things,” the text really made me think differently about the stuff we use every day. It’s like, we’ve all been there, struggling with something that’s supposed to be simple, like a door that doesn’t make it clear if you should push or pull. The author talks about how these design fails aren’t just annoying, but they actually show a more significant problem: designers sometimes focus more on making things look cool or high-tech instead of making them easy for us to use.

It made me question why design often misses the mark on user-friendliness. Is it because designers are trying too hard to impress other designers instead of thinking about regular people? Reading this has definitely changed how I look at things around me. Now, I catch myself wondering how something could be designed better to avoid confusion or frustration. It’s interesting to think about how much smoother our day could be if more things were designed with the user in mind. This chapter has sparked a lot of questions for me about what else in my daily life could be improved with better design.

Assignment 4 – Pavly Halim

The Concept:
My project is centered around the visualization of stock market data, specifically designed to help users visually track the fluctuations in stock prices over time. The idea was to create an interactive graph that could display a moving window of stock prices, allowing users to see trends and changes as they happen.

A Highlight of the Code:
One snippet of code that stands out to me for its simplicity and effectiveness is the drawGraph function. This function renders the graph on the screen, mapping the stock prices to visual elements that can be easily understood.

function drawGraph(startIndex) {
  const graphWidth = 700;
  const graphHeight = 300;
  const startY = height - graphHeight - 50;
  const endY = startY + graphHeight;
  const startX = (width - graphWidth) / 2;
  const endX = startX + graphWidth;

  //Calculate the subset of prices to display
  let displayedPrices = prices.slice(startIndex, Math.min(startIndex + windowSize, prices.length));
  const minValue = Math.min(...displayedPrices);
  const maxValue = Math.max(...displayedPrices);
  const latestPrice = displayedPrices[displayedPrices.length - 1]; // The last price in the displayed subset

  push();
  stroke(0);
  noFill();
  rect(startX - 10, startY - 10, graphWidth + 20, graphHeight + 20);
  pop();

  push();
  stroke(25, 118, 210);
  strokeWeight(2);
  noFill();
  beginShape();
  for (let i = 0; i < displayedPrices.length; i++) {
    let x = map(i, 0, displayedPrices.length - 1, startX, endX);
    let y = map(displayedPrices[i], minValue, maxValue, endY, startY);
    vertex(x, y);
  }
  endShape();
  pop();

  // ... (the current-price label is drawn here; see the full code below)
}

Embedded Sketch:

Showcasing the dynamic visualization of stock prices over time

Code:

let stockData;
let currentIndex = 0;
let windowSize = 15; //Data points to display
let prices = [];

function preload() {
  stockData = loadTable('stock_prices.csv', 'csv', 'header');
}

function setup() {
  createCanvas(800, 400);
  frameRate(1); // Redraw once per second so the window advances slowly
  loadPrices(); // Load data
}

function draw() {
  background(255);
  drawGraph(currentIndex);
  currentIndex++; // Move the window forward
  if (currentIndex > prices.length - windowSize) currentIndex = 0; // Loop back if we reach the end
}

function loadPrices() {
  // Extract the price column and store it in the prices array
  for (let r = 0; r < stockData.getRowCount(); r++) {
    let price = stockData.getNum(r, 'price');
    prices.push(price);
  }
}

function drawGraph(startIndex) {
  const graphWidth = 700;
  const graphHeight = 300;
  const startY = height - graphHeight - 50;
  const endY = startY + graphHeight;
  const startX = (width - graphWidth) / 2;
  const endX = startX + graphWidth;

  //Calculate the subset of prices to display
  let displayedPrices = prices.slice(startIndex, Math.min(startIndex + windowSize, prices.length));
  const minValue = Math.min(...displayedPrices);
  const maxValue = Math.max(...displayedPrices);
  const latestPrice = displayedPrices[displayedPrices.length - 1]; // The last price in the displayed subset

  push();
  stroke(0);
  noFill();
  rect(startX - 10, startY - 10, graphWidth + 20, graphHeight + 20);
  pop();

  push();
  stroke(25, 118, 210);
  strokeWeight(2);
  noFill();
  beginShape();
  for (let i = 0; i < displayedPrices.length; i++) {
    let x = map(i, 0, displayedPrices.length - 1, startX, endX);
    let y = map(displayedPrices[i], minValue, maxValue, endY, startY);
    vertex(x, y);
  }
  endShape();
  pop();

  push();
  textSize(17);
  fill(0);
  noStroke();
  textAlign(CENTER);
  text(`Current Price: $${latestPrice.toFixed(2)}`, width / 2, startY - 25);
  pop();
}

Reflection and Future Work:
This project has allowed me to refine my skills in processing and presenting data in an informative and aesthetically pleasing way. The feedback loop provided by visually tracking the effectiveness of the code in real time has been incredibly rewarding.

Moving forward, I see several avenues for improvement and expansion. For instance, integrating real-time data feeds to allow the graph to update with live market data would significantly enhance its utility. Additionally, incorporating user input to select different stocks or adjust the window size and time frame could make the tool more versatile and user-friendly.
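
As a first sketch of the user-input idea (assuming the existing windowSize, currentIndex, and prices globals; the key bindings and step size are arbitrary choices), p5.js’s keyPressed() callback could let viewers resize the window:

// Hypothetical controls: UP/DOWN arrows grow or shrink the visible window
function keyPressed() {
  if (keyCode === UP_ARROW && windowSize + 5 <= prices.length) {
    windowSize += 5; // show more data points at once
  } else if (keyCode === DOWN_ARROW && windowSize > 5) {
    windowSize -= 5; // zoom in on fewer points
  }
  // Keep the moving window inside the data after resizing
  currentIndex = min(currentIndex, max(0, prices.length - windowSize));
}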

Reading Reflection 3 – Pavly Halim

Reflecting on “The Art of Interactive Design,” it feels like embarking on a spirited journey through what makes technology engaging. The author presents a world where interactivity is not just the clicks and taps on our devices but a genuine conversation between humans and machines. Imagine sitting with a good friend over coffee, where the exchange of thoughts and ideas flows freely. This book suggests that our gadgets should offer a similar two-way street of communication, a concept that feels both revolutionary and glaringly obvious.

There’s a playful yet earnest tone in the author’s argument that nudges us to question the authenticity of our digital interactions. Are we merely following a script laid out by designers, or are we genuinely engaging in a dialogue? This perspective might tilt towards a bias for more profound, meaningful connections over superficial tech encounters, but it’s hard not to be swayed by such a compelling case. It leaves you pondering the true essence of interactivity and whether our current technology meets that mark or merely performs interactivity. It’s an invitation to dream up a world where every interaction with technology enriches our lives, making us not just users but participants in a digital dance of ideas.

Assignment 3 – Pavly Halim

Concept 

This sketch uses Perlin noise to simulate natural, fluid motion in a field of particles. Perlin noise produces a more organic flow than purely random movement because it ensures smooth transitions between values. The approach is commonly used in generative art and simulations to mimic natural phenomena like smoke, fire, and flowing water.

Code Highlight 

// Update the particle's position
move() {
  // Calculate a noise value for organic movement
  let noiseValue = noise(this.noiseOffset);
  this.pos.x += map(noiseValue, 0, 1, -1, 1) * this.rate;
  this.pos.y += map(noiseValue, 0, 1, -1, 1) * this.vertRate;
  
  // Wrap the particle around the edges to make it look like endless space
  if (this.pos.x > width) this.pos.x = 0;
  if (this.pos.x < 0) this.pos.x = width;
  if (this.pos.y > height) this.pos.y = 0;
  if (this.pos.y < 0) this.pos.y = height;
      
  // Increment the noise offset for the next frame
  this.noiseOffset += 0.01;
}

The move() function in the particle system utilizes Perlin noise to simulate organic and fluid motion, creating a natural, lifelike behavior in each particle. By calculating a smoothly varying noise value and mapping it to influence the particle’s velocity, the function ensures each particle moves in a unique, seamless manner that mimics natural phenomena. Additionally, the function incorporates edge wrapping to create a continuous, boundless effect on the canvas, enhancing the illusion of a natural, endless environment. This innovative use of Perlin noise, combined with the edge wrapping technique, sets the function apart by providing a sophisticated method to generate complex, visually appealing animations that evoke the randomness and elegance of the natural world.
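
To see why noise() produces smoother motion than random(), here is a minimal standalone comparison (independent of the particle code; it just logs a few consecutive samples of each):

// Minimal comparison: consecutive random() values jump around freely,
// while noise() sampled at nearby offsets drifts smoothly
function setup() {
  noCanvas();
  let offset = 0;
  for (let i = 0; i < 5; i++) {
    console.log('random:', random().toFixed(2), 'noise:', noise(offset).toFixed(2));
    offset += 0.01; // small step, so successive noise values stay close
  }
}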

Embedded Sketch

Code

let particles = [];

function setup() {
  createCanvas(600, 600);
  // Initialize particles and add them to the array
  for (let i = 0; i < 400; i++) {
    particles.push(new Particle());
  }
}

function draw() {
  background(0);
  // Loop through all particles to display and move them
  for (let particle of particles) {
    particle.display();
    particle.move();
  }
}

// Define the Particle class
class Particle {
  constructor() {
    // Set initial position to a random point within the canvas
    this.pos = createVector(random(width), random(height));
    this.size = random(0.5, 2);
    // Set a random transparency value
    this.alpha = random(150, 255);
    this.noiseOffset = random(0, 1000);
    this.rate = random(-1, 1);     // horizontal drift rate
    this.vertRate = random(-1, 1); // vertical drift rate
  }
  
  // Display the particle on the canvas
  display() {
    noStroke();
    fill(255, this.alpha);
    ellipse(this.pos.x, this.pos.y, this.size);
  }

  // Update the particle's position
  move() {
    // Calculate a noise value for organic movement
    let noiseValue = noise(this.noiseOffset);
    this.pos.x += map(noiseValue, 0, 1, -1, 1) * this.rate;
    this.pos.y += map(noiseValue, 0, 1, -1, 1) * this.vertRate;
    
    // Wrap the particle around the edges to make it look like endless space
    if (this.pos.x > width) this.pos.x = 0;
    if (this.pos.x < 0) this.pos.x = width;
    if (this.pos.y > height) this.pos.y = 0;
    if (this.pos.y < 0) this.pos.y = height;
        
    // Increment the noise offset for the next frame
    this.noiseOffset += 0.01;
  }
}

Reflection

Creating this particle system with p5.js, and especially using Perlin noise for fluid motion, transformed a basic animation into a mesmerizing display that closely mirrors the elegance of natural phenomena. The seamless movement and edge wrapping created an endless, captivating visual experience. This project pushed the boundaries of what I thought possible with code and deepened my appreciation for programming as a form of creative expression.

Future work

Looking ahead, I’m excited to explore further possibilities with this particle system, such as introducing interactions between particles to simulate forces like attraction and repulsion, adding color gradients for a more dynamic visual effect, and experimenting with different noise algorithms to vary the motion patterns.
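
As a starting point for the attraction idea (a sketch assuming the Particle class above; applyAttraction() and the strength constant are illustrative choices), each particle could be nudged toward the mouse from inside the draw() loop:

// Hypothetical attraction: pull particles gently toward the mouse pointer,
// with strength falling off with distance
function applyAttraction(particle) {
  let toMouse = createVector(mouseX - particle.pos.x, mouseY - particle.pos.y);
  let d = max(toMouse.mag(), 10); // clamp so nearby particles aren't flung away
  toMouse.setMag(50 / d);         // inverse-distance strength (tunable)
  particle.pos.add(toMouse);
}

Calling applyAttraction(particle) next to particle.move() in draw() would layer this pull on top of the noise-driven drift; repulsion is the same idea with the vector negated.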

Reading Reflection – Week 2

Casey Reas’s insights into the dance between randomness and order in art have really got me thinking. It’s like he’s saying, “Hey, the world is way more unpredictable than we like to admit, and that’s okay.” This whole spiel about embracing chaos instead of always trying to keep things neat and tidy in our art—it’s pretty revolutionary. Historically, we’ve been obsessed with making everything orderly, but Reas is challenging us to break free from that. He’s suggesting that maybe, just maybe, there’s beauty and authenticity in letting randomness take the wheel now and then.

And that’s got me pondering why we’re so hung up on order in the first place. Is it because it feels safer and more comfortable? Reas throws out this idea of creating your reality, which hits home. It’s like he’s saying we cling to order because it’s familiar, but what if we stepped out of our comfort zones? Imagine the wild, unpredictable art we could create if we were willing to get a little uncomfortable, to embrace the chaos instead of running from it. It’s a bit scary to think about but also thrilling. It makes me wonder what kind of art we’re missing out on by sticking to the same old patterns. Maybe it’s time to shake things up, to see where a little randomness can take us.