Final Project: Torqu3-y

Introduction

Building on ArtfulMotion, a project centered on translating gestures into visual art, I sought to elevate the interactive experience by integrating physical computing. This blog post outlines the genesis of the gesture-controlled robot concept, the details of its implementation, and the resulting user experiences.

Inspiration and Conceptualization

This project grew out of a desire to bring physicality into gesture-controlled art, a departure from a purely digital interface. I initially considered an “electronic buddy” or an “art robot,” but inspiration struck when I encountered a multidirectional moving robot in a video shared by Professor Riad. The challenge was to replicate this unique motion with standard tires; I also considered Bluetooth control but ultimately opted for a tethered connection.

Gesture Recognition

Gesture recognition is built on the handpose model from ml5.js, which identifies 21 keypoints on a hand. Because the model recognizes only one hand at a time, I divided the video feed into segments, each corresponding to a distinct direction of motion. The chosen gestures prioritize intuitive, natural interaction.
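To make this concrete, the recognition works like a vote: a direction is only selected when all 21 detected keypoints land in the same region of the video. The snippet below is a minimal sketch of that counting logic, not the exact project code; it assumes a region helper such as withinTopCenter(x, y) and the global value variable, both of which appear in the full sketch later in this post.

// Minimal sketch of the region-voting idea (not the exact project code).
// Assumes withinTopCenter(x, y) and the global `value` defined in the full sketch.
function regionVote(landmarks) {
  let count = 0;
  for (const keypoint of landmarks) {
    if (withinTopCenter(keypoint[0], keypoint[1])) {
      count += 1; // one vote per keypoint inside the region
    }
  }
  // commit to a direction only when all 21 keypoints agree
  if (count === landmarks.length) {
    value = 1; // region index later interpreted by the Arduino as a motion
  }
}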

Interaction Design

Users interact through hand movements captured by a webcam, which are translated into navigational commands for the robot. An onboard button toggles the robot’s state, turning it on or off. The current iteration lacks auditory feedback; future enhancements will explore the integration of sound cues. The robot’s unusual motion requires users to rely on intuition, which adds an element of engagement.

Technical Implementation

The interaction between the p5.js sketch and the Arduino board relies on tethered serial communication, facilitated by the p5.web-serial.js library. A single value is sent from p5 to the Arduino and mapped to a specific set of motions.
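Concretely, the protocol is just a newline-terminated integer written with the p5.web-serial.js writeSerial() helper. The snippet below is a minimal illustration of the p5 side; the value-to-motion mapping in the comments is inferred from the Arduino sketch further down and ultimately depends on how the motors are wired.

// Minimal sketch of the single-value protocol (assumes the p5.web-serial.js
// helper writeSerial() is loaded, as in the full sketch below).
// Inferred meaning of each value on the Arduino side:
//   0 - rotate one way        (hand in the left region)
//   1 - pseudo-forward        (hand in the top-center region)
//   2 - rotate the other way  (hand in the right region)
//   3 - stop                  (hand in the middle-center region)
//   4 - pseudo-backward       (hand in the bottom-center region)
function sendCommand(value) {
  // newline-terminated so the Arduino can frame it with Serial.parseInt()
  writeSerial(value + "\n");
}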

p5.js sketch:

preload():

function preload() {
  
  instructionPage = loadImage('instructions.png');
  
  for (let i = 1; i <= carNum; i++) {
    carArray.push(new Sprite(spriteWidth, spriteHeight, 160, 80));
    carArray[i - 1].x = 80 + i * 20;
    carArray[i - 1].y = 100 * i;
    carArray[i - 1].spriteSheet = 'spritesheet.png';
    carArray[i - 1].anis.offset.x = 5;
    carArray[i - 1].anis.frameDelay = 8;

    carArray[i - 1].addAnis({
      move: { row: 0, frames: 16 },
    });
    carArray[i - 1].changeAni('move');
    carArray[i - 1].layer = 2;
    carArray[i - 1].visible = false;
  }
}

The preload() function loads the instruction page image and initializes an array of car sprites.

setup():

function setup() {
  createCanvas(windowWidth, windowHeight);
  video = createCapture(VIDEO);
  video.size(width, height);

  handpose = ml5.handpose(video, modelReady);
  handpose.on("predict", results => {
    predictions = results;
  });

  video.hide();
}

The setup() function handles the initial configuration of the canvas, the video capture, and the handpose model. It sizes the canvas to the current window dimensions and initializes the video capture object and the handpose model. The modelReady callback is triggered once the model has loaded, signalling that the application is ready to detect hand poses.
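For completeness, the modelReady callback referenced here can be as simple as logging that the model has loaded; the version used in the preliminary test later in this post does exactly that.

function modelReady() {
  console.log("Model ready!");
}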

draw():

function draw() {
  background(255);

  flippedVideo = ml5.flipImage(video);

  // Calculate the aspect ratios for video and canvas
  videoAspectRatio = video.width / video.height;
  canvasAspectRatio = width / height;

  

  // Adjust video dimensions based on aspect ratios
  if (canvasAspectRatio > videoAspectRatio) {
    videoWidth = width;
    videoHeight = width / videoAspectRatio;
  } else {
    videoWidth = height * videoAspectRatio;
    videoHeight = height;
  }

  // Calculate video position
  video_x = (width - videoWidth) / 2;
  video_y = (height - videoHeight) / 2;

  if (currentPage == 1) {
    // display instructions page
    image(instructionPage, 0, 0, width, height);
  }
  else if (currentPage == 2) {
    // serial connection page
    if (!serialActive) {
      runSerialPage();
    } 

    else {
      // hides car animation
      for (let i = 0; i < carNum; i++) {
        carArray[i].visible = false;
      }

      // controlling page
      if (controlState) {
        runControllingPage();
      } 

      // device has been turned off
      else {
        runTorqueyOff();
      }
    }
  }
}

Within the draw() function, various elements contribute to the overall functionality of the sketch. The calculation of video and canvas aspect ratios ensures that the video feed maintains its proportions when displayed on the canvas. This responsiveness allows the application to adapt seamlessly to different window sizes, providing a consistent and visually appealing user interface.

The draw() function is also responsible for managing different pages within the application. It evaluates the currentPage variable, determining whether to display the instruction page or proceed to pages related to serial connection and hand gesture control. This page-switching behavior is facilitated by the mousePressed function, enabling users to navigate through the application intuitively.
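The mousePressed function itself is not shown above; a hypothetical sketch of the page-switching logic, assuming the same currentPage global used in draw(), could look like this.

// Hypothetical page-switching handler (not the exact project code):
// a click on the instruction page advances to the serial/control page.
function mousePressed() {
  if (currentPage == 1) {
    currentPage = 2;
  }
}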

readSerial(data):

function readSerial(data) {

  if (data != null) {
    ////////////////////////////////////
    //READ FROM ARDUINO HERE
    ////////////////////////////////////
    if (int(trim(data)) == maxVoltReading) {
      controlState = true;
    }
    else if (int(trim(data)) == minVoltReading){
      controlState = false;
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    
    let sendToArduino = value + "\n";
    writeSerial(sendToArduino);
    
    // reset value
    value = defaultState;
  }
}

The readSerial(data) function handles communication with an Arduino device. It interprets incoming data, updates the controlState based on voltage readings, and initiates a handshake with the Arduino. This interaction establishes a connection between the physical device and the digital application, enabling real-time responses to user inputs.

drawKeypoints():

function drawKeypoints() {
  for (let i = 0; i < predictions.length; i += 1) {
    const prediction = predictions[i];
    let area = [0, 0, 0, 0, 0];
    for (let j = 0; j < prediction.landmarks.length; j += 1) {
      const keypoint = prediction.landmarks[j];
      fill(0, 255, 0);
      noStroke();
      let x = map(keypoint[0], 0, 640, 0, videoWidth);
      let y = map(keypoint[1], 0, 480, 0, videoHeight);
      ellipse(x, y, 10, 10);
      
      // count number of trues
      // -- helps to detect the area the detected hand is in
      if (withinLeft(x, y)) {
        area[0] += 1;
      }
      if (withinTopCenter(x, y)) {
        area[1] += 1;
      }
      if (withinRight(x, y)) {
        area[2] += 1;
      }
      if (withinMiddleCenter(x, y)) {
        area[3] += 1;
      }
      if (withinBottomCenter(x, y)) {
        area[4] += 1;
      }
      // end of count
    }
    
    // print index
    for (let i = 0; i < area.length; i += 1) {
      if (area[i] == 21) {
        value = i;
      }
    }
  }
}

The drawKeypoints() function uses the handpose model’s predictions to visualize detected keypoints on the canvas. These keypoints correspond to landmarks on the hand, and their positions are mapped from video coordinates to canvas coordinates. By counting how many keypoints fall within each region, the function determines which region the hand occupies. This information is what the application uses to interpret gestures and trigger the corresponding motion.
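The within…() helpers used above are not shown in this version of the sketch; they follow the same pattern as the nine-region helpers in the preliminary test later in this post. As an illustration, one of the five-region helpers might look like the sketch below, assuming the left region spans the left third of the mapped video area (the same 0 to videoWidth, 0 to videoHeight coordinates produced in drawKeypoints()).

// Hypothetical sketch of one five-region helper (not the exact project code).
function withinLeft(x, y) {
  return x >= 0 && x < videoWidth / 3 &&
         y >= 0 && y < videoHeight;
}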

Robot Movement

Arduino schematic diagram:

The robot can move pseudo-forward, pseudo-backward, and rotate in either direction around its center. Achieving these movements involved a methodical trial-and-error process to align each gesture with its intended action.

Arduino sketch:

const int ain1Pin = 3;
const int ain2Pin = 4;
const int pwmAPin = 5;

const int bin1Pin = 8;
const int bin2Pin = 7;
const int pwmBPin = 6;

int buttonValue = 0;
int prevButtonValue = 0;
const int defaultState = -1;

const unsigned long eventInterval = 1000;
unsigned long previousTime = 0;

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  pinMode(LED_BUILTIN, OUTPUT);

  pinMode(ain1Pin, OUTPUT);
  pinMode(ain2Pin, OUTPUT);
  pinMode(pwmAPin, OUTPUT); // not needed really

  pinMode(bin1Pin, OUTPUT);
  pinMode(bin2Pin, OUTPUT);
  pinMode(pwmBPin, OUTPUT); // not needed really

  // TEST BEGIN
  // turn in one direction, full speed
  analogWrite(pwmAPin, 255);
  analogWrite(pwmBPin, 255);
  digitalWrite(ain1Pin, HIGH);
  digitalWrite(ain2Pin, LOW);
  digitalWrite(bin1Pin, HIGH);
  digitalWrite(bin2Pin, LOW);

  // stay here for a second
  delay(1000);

  // slow down
  int speed = 255;
  while (speed--) {
    analogWrite(pwmAPin, speed);
    analogWrite(pwmBPin, speed);
    delay(20);
  }

  // TEST END

  buttonValue = analogRead(A0);
  prevButtonValue = buttonValue;

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println(buttonValue); // send a starting message
    delay(50);            // wait 50 ms
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }

}

void loop() {

   /* Updates frequently */
  unsigned long currentTime = millis();
  /* This is the event */
  if (currentTime - previousTime >= eventInterval) {
    /* Event code */
    buttonValue = analogRead(A0);
    
   /* Update the timing for the next time around */
    previousTime = currentTime;
  }

  while (Serial.available()) {
    // sends state data to p5
    if (buttonValue != prevButtonValue) {
      prevButtonValue = buttonValue;
      Serial.println(buttonValue);
    }
    else {
      Serial.println(defaultState);
    }
    
    // led on while receiving data
    digitalWrite(LED_BUILTIN, HIGH); 

    // gets value from p5
    int value = Serial.parseInt();

    // drives the motors based on the value from p5 (only while the device is switched on)
    if (Serial.read() == '\n' && buttonValue == 1023) {
      
      if (value == 0) {
        // 0
        digitalWrite(ain1Pin, HIGH);
        digitalWrite(ain2Pin, LOW);
        digitalWrite(bin1Pin, LOW);
        digitalWrite(bin2Pin, HIGH);

        analogWrite(pwmAPin, 255);
        analogWrite(pwmBPin, 255);
        // 0

      }

      else if (value == 1) {
        // 1
        digitalWrite(ain1Pin, HIGH);
        digitalWrite(ain2Pin, LOW);
        digitalWrite(bin1Pin, HIGH);
        digitalWrite(bin2Pin, LOW);

        analogWrite(pwmAPin, 255);
        analogWrite(pwmBPin, 255);
        // 1
      }

      else if (value == 2){
        // 2
        digitalWrite(ain1Pin, LOW);
        digitalWrite(ain2Pin, HIGH);
        digitalWrite(bin1Pin, HIGH);
        digitalWrite(bin2Pin, LOW);

        analogWrite(pwmAPin, 255);
        analogWrite(pwmBPin, 255);
        // 2
      }

      else if (value == 3) {
        analogWrite(pwmAPin, 0);
        analogWrite(pwmBPin, 0);
      }

      else if (value == 4) {
        // 4
        digitalWrite(ain1Pin, LOW);
        digitalWrite(ain2Pin, HIGH);
        digitalWrite(bin1Pin, LOW);
        digitalWrite(bin2Pin, HIGH);

        analogWrite(pwmAPin, 255);
        analogWrite(pwmBPin, 255);
        // 4
      }

      else {
        analogWrite(pwmAPin, 0);
        analogWrite(pwmBPin, 0);
      }
    }
  }
  // led off at end of reading
  digitalWrite(LED_BUILTIN, LOW);
}

The setup() function initializes serial communication, configures pins, and performs an initial motor test. Additionally, it sends the initial buttonValue to the p5.js sketch for the handshake.

The loop() function checks whether the eventInterval has elapsed and updates the buttonValue accordingly. It handles incoming serial data from the p5.js sketch, sends state data back, and uses the built-in LED as a transfer indicator. Motor control logic is then applied based on the value received from the p5.js sketch, selecting between the different motor configurations.

User Experience

Users find the robot’s unconventional design intriguing and are often impressed by its mobility. The learning curve is brief, though occasional glitches caused by imperfect handpose detection can lead to some initial frustration.

User Testing

IM Showcase

The IM showcase went well overall. Despite a few technical hiccups during the presentation, the feedback from people who interacted with the project was positive. Some issues raised were ones I had anticipated from user testing, and I plan to address them in future versions of the project.

User Interaction 1:

User Interaction 2:

Aesthetics and Design

The robot’s casing is built mostly from cardboard, a choice that prioritized rapid prototyping given the time constraints. The material’s versatility sped up the prototyping process, and zip ties and glue keep the build sturdy, with easily replaceable parts mitigating potential damage.

Future Enhancements

Subsequent iterations of ArtfulMotion 2.0 aim to introduce gesture controls for variable speed, operational modes such as tracking, and more robust machine learning models beyond the limitations of handpose. Wireless control is the top priority, offering greater operational flexibility and potentially a structural redesign.

Reflection

The completion of this project within constrained timelines marks a journey characterized by swift prototyping, iterative testing, and redesign. Future endeavors shift focus towards refining wireless communication, structural enhancements, and the exploration of advanced machine learning models.

p5 rough sketch:


 

P5 Sketch

Final Project: Something ml5.js

Progress:

As previously mentioned in my final project concept, I intend to create an “electronic buddy” or an “art robot” (which sounds better than my previous term). Thus far, I have successfully established communication between p5 and Arduino. Specifically, I utilized the handpose model from the ml5 library to detect if the hand is within specific areas on the screen and transmit a corresponding index to the Arduino.

As this is only a preliminary test, I have two motors connected to the Arduino: they rotate in one direction if the hand is in the upper third of the screen, halt if the hand is in the middle portion, and rotate in the opposite direction if the hand is in the lower third. Currently, communication is unidirectional, with p5 sending data to the Arduino. I hope to achieve meaningful communication from the Arduino to p5 as I progress. I have included the p5 sketch, Arduino sketch, and a video of the test run for reference.

p5 sketch:

let handpose;
let value = -1;
let video;
let predictions = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);

  handpose = ml5.handpose(video, modelReady);

  // This sets up an event that fills the global variable "predictions"
  // with an array every time new hand poses are detected
  handpose.on("predict", results => {
    predictions = results;
  });

  // Hide the video element, and just show the canvas
  video.hide();
}

function modelReady() {
  console.log("Model ready!");
}

function draw() {
  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    image(video, 0, 0, width, height);

    // We can call both functions to draw all keypoints and the skeletons
    drawKeypoints();

    // draws sections on the screen
    for (let i = 0; i <= width / 2; i += (width / 2) / 3) {
      stroke(255, 0, 0);
      line(i , 0, i, height);
    }

    for (let i = 0; i <= height; i += height / 3) {
      stroke(255, 0, 0);
      line(0, i, width / 2, i);
    }

  }
  
}


function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

function readSerial(data) {

  if (data != null) {

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = value + "\n";
    writeSerial(sendToArduino);
  }
}

// A function to draw ellipses over the detected keypoints
function drawKeypoints() {
  for (let i = 0; i < predictions.length; i += 1) {
    const prediction = predictions[i];
    let area = [0, 0, 0, 0, 0, 0, 0, 0, 0];
    for (let j = 0; j < prediction.landmarks.length; j += 1) {
      const keypoint = prediction.landmarks[j];
      fill(0, 255, 0);
      noStroke();
      ellipse(keypoint[0], keypoint[1], 10, 10);
      
      // count number of trues
      // -- helps to detect the area the detected hand is in
      if (withinTopLeft(keypoint[0], keypoint[1])) {
        area[0] += 1;
      }
      if (withinTopCenter(keypoint[0], keypoint[1])) {
        area[1] += 1;
      }
      if (withinTopRight(keypoint[0], keypoint[1])) {
        area[2] += 1;
      }
      if (withinMiddleLeft(keypoint[0], keypoint[1])) {
        area[3] += 1;
      }
      if (withinMiddleCenter(keypoint[0], keypoint[1])) {
        area[4] += 1;
      }
      if (withinMiddleRight(keypoint[0], keypoint[1])) {
        area[5] += 1;
      }
      if (withinBottomLeft(keypoint[0], keypoint[1])) {
        area[6] += 1;
      }
      if (withinBottomCenter(keypoint[0], keypoint[1])) {
        area[7] += 1;
      }
      if (withinBottomRight(keypoint[0], keypoint[1])) {
        area[8] += 1;
      }
      // end of count
    }
    
    // print index
    for (let i = 0; i < area.length; i += 1) {
      if (area[i] == 21) {
        value = i;
      }
    }
  }
}

// returns true if a point is in a specific region
function withinTopLeft(x, y) {
  if (x >= 0 && x < (width / 2) / 3 && y >=0 && y < height / 3) {
    return true;
  }
  return false;
}

function withinTopCenter(x, y) {
  if (x > (width / 2) / 3  && x < 2 * (width / 2) / 3 && 
      y >=0 && y < height / 3) {
    return true;
  }
  return false;
}

function withinTopRight(x, y) {
  if (x > 2 * (width / 2) / 3 && x < (width / 2) && 
      y >=0 && y < height / 3) {
    return true;
  }
  return false;
}

function withinMiddleLeft(x, y) {
  if (x >= 0 && x < (width / 2) / 3 && y > height / 3 && y < 2 * height / 3) {
    return true;
  }
  return false;
}

function withinMiddleCenter(x, y) {
  if (x > (width / 2) / 3  && x < 2 * (width / 2) / 3 && 
      y > height / 3 && y < 2 * height / 3) {
    return true;
  }
  return false;
}

function withinMiddleRight(x, y) {
  if (x > 2 * (width / 2) / 3 && x < (width / 2) && 
      y > height / 3 && y < 2 * height / 3) {
    return true;
  }
  return false;
}

function withinBottomLeft(x, y) {
  if (x >= 0 && x < (width / 2) / 3 && y > 2 * height / 3 && y < height) {
    return true;
  }
  return false;
}

function withinBottomCenter(x, y) {
  if (x > (width / 2) / 3  && x < 2 * (width / 2) / 3 &&
      y > 2 * height / 3 && y < height) {
    return true;
  }
  return false;
}

function withinBottomRight(x, y) {
  if (x > 2 * (width / 2) / 3 && x < (width / 2) &&
      y > 2 * height / 3 && y < height) {
    return true;
  }
  return false;
}

Arduino sketch:

const int ain1Pin = 3;
const int ain2Pin = 4;
const int pwmAPin = 5;

const int bin1Pin = 8;
const int bin2Pin = 7;
const int pwmBPin = 6;


void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  pinMode(LED_BUILTIN, OUTPUT);

  pinMode(ain1Pin, OUTPUT);
  pinMode(ain2Pin, OUTPUT);
  pinMode(pwmAPin, OUTPUT); // not needed really

  pinMode(bin1Pin, OUTPUT);
  pinMode(bin2Pin, OUTPUT);
  pinMode(pwmBPin, OUTPUT); // not needed really

  // TEST BEGIN
  // turn in one direction, full speed
  analogWrite(pwmAPin, 255);
  analogWrite(pwmBPin, 255);
  digitalWrite(ain1Pin, HIGH);
  digitalWrite(ain2Pin, LOW);
  digitalWrite(bin1Pin, HIGH);
  digitalWrite(bin2Pin, LOW);

  // stay here for a second
  delay(1000);

  // slow down
  int speed = 255;
  while (speed--) {
    analogWrite(pwmAPin, speed);
    analogWrite(pwmBPin, speed);
    delay(20);
  }

  // TEST END

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }

}

void loop() {

  while (Serial.available()) {
    // sends dummy data to p5
    Serial.println("0,0");

    // led on while receiving data
    digitalWrite(LED_BUILTIN, HIGH); 

    // gets value from p5
    int value = Serial.parseInt();

    // drives the motors based on the received value
    if (Serial.read() == '\n') {
      
      if (value >= 0 && value <= 2) {
        digitalWrite(ain1Pin, HIGH);
        digitalWrite(ain2Pin, LOW);
        digitalWrite(bin1Pin, HIGH);
        digitalWrite(bin2Pin, LOW);

        analogWrite(pwmAPin, 255);
        analogWrite(pwmBPin, 255);

      }

      else if (value >= 3 && value <= 5) {
        analogWrite(pwmAPin, 0);
        analogWrite(pwmBPin, 0);
      }

      else {
        digitalWrite(ain1Pin, LOW);
        digitalWrite(ain2Pin, HIGH);
        digitalWrite(bin1Pin, LOW);
        digitalWrite(bin2Pin, HIGH);

        analogWrite(pwmAPin, 255);
        analogWrite(pwmBPin, 255);
      }
    }
  }
  // led off at end of reading
  digitalWrite(LED_BUILTIN, LOW);
}

Video:

Reading Response – Week 11

This week’s reading, titled “Design meets disability,” intrigued me with its exploration of the intersection between design and the needs of individuals with disabilities. Prior to this, I hadn’t given much thought to the design aspects of products catering to the disabled community. My initial perspective was primarily focused on functionality, deeming it sufficient. However, the author emphasizes that functionality alone may not be the sole consideration. The incorporation of aesthetic appeal in design plays a crucial role in enhancing the user’s self-image.

The author delves into the analysis of various items such as spectacles (now referred to as eyewear), prosthetics, and other adaptive “equipment” designed for individuals with different abilities. It made me reflect on the challenges of introducing fashion into this realm, especially considering the diversity of needs. Unlike spectacles, which serve users regardless of eyesight effectiveness, prosthetics and hearing aids pose unique challenges. How can someone without amputation use a prosthetic, or how can individuals without hearing impairment benefit from hearing aids?

The transformation of spectacles into trendy “eyewear” appears to have been a successful evolution, but I question whether similar success is feasible for other assistive devices tailored for individuals with special needs. The reading has prompted me to consider the intricacies and complexities involved in merging design and functionality, particularly in the context of products meant to aid those with disabilities.

Final Project Concept: Something ml5.js

For my final project, I plan to utilize the ml5.js library and Arduino to create either an “electronic buddy” or an “art-aiding” device. Concerning the electronic buddy, my idea involves employing the ml5 library to execute a face recognition machine learning model through the webcam of a laptop or an attached camera. The objective is to enable the “buddy” to navigate towards the user. This electronic companion would be capable of displaying messages using the display unit in the SparkFun Inventor’s Kit for Arduino Uno. Additionally, it could produce sounds, potentially incorporating recorded messages.

On the other hand, the concept of the art-aiding device shares some similarities with the electronic buddy. This mobile device would be equipped with servo motors, possibly two or three in number. Colored pencils or markers, depending on what works best, would be attached to these servo motors. The servo motors would be allowed to move at specific angles, enabling the attached pencils or markers to make contact with a canvas. The user would have control over the device’s movement direction and the servo motors, along with the attached pencils, using p5 and a machine learning model from ml5.js.

Week 11: In-class exercises

Exercise 1

Concept

The potentiometer is used to control the x-coordinate of the ellipse drawn in p5.

Code

Arduino:

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  pinMode(LED_BUILTIN, OUTPUT);

}

void loop() {

  // gets sensor reading
  int sensor = analogRead(A0);
  delay(5);

  // indicates data transfer
  digitalWrite(LED_BUILTIN, HIGH);

  // sends data to p5
  Serial.println(sensor);
  
  // indicates data transfer
  digitalWrite(LED_BUILTIN, LOW);
  

}

p5:

// variable to control x-coordinate
let circleX = 0;

function setup() {
  createCanvas(640, 480);
  textSize(18);
}

function draw() {
  // sets background
  background(255);
  stroke(0);

  // draws circle on canvas
  // -- circleX is mapped between 0 and the width
  circle(map(circleX, 0, 1023, 0, width), height / 2, 50);
  
  // checks if serial communication has been established
  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
  }
  
}

// sets up serial connection
function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////
  if (data != null) {
    circleX = int(trim(data));
  }
  
}

Video

 

Exercise 2

Concept

The brightness of an LED is controlled by the mouseX value from p5 mapped between 0 and 255.

extwo.jpg

exe22.jpg

Code

Arduino:

// led pin number
int ledPin = 5;

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  // We can't use the serial monitor since the serial connection is
  // used to communicate to p5js and only one application on the computer
  // can use a serial port at once.
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(ledPin, OUTPUT);

  // checks if led works correctly
  digitalWrite(ledPin, HIGH);
  delay(1000);
  digitalWrite(ledPin, LOW);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }

}

void loop() {


  // wait for data from p5 before doing something
  while (Serial.available()) {
    // sends dummy data to p5
    Serial.println("0,0");

    // led on while receiving data
    digitalWrite(LED_BUILTIN, HIGH); 

    // gets value from p5
    int value = Serial.parseInt();

    // changes brightness of the led
    if (Serial.read() == '\n') {
      analogWrite(ledPin, value);
    }
  }
  // led off at end of reading
  digitalWrite(LED_BUILTIN, LOW);
  
}

p5:

// variable to hold led brightness value
let value = 0;

function setup() {
  createCanvas(640, 480);
  textSize(18);
}

function draw() {
  background(0);
  stroke(255);
  fill(255);
  
  // checks for state of serial communication
  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
  }
}

// sets up serial communication
function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  if (data != null) {
    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////

    // mouseX is mapped to the 0-255 range before being transmitted
    value = int(map(mouseX, 0, width, 0, 255));
    let sendToArduino = value + "\n";
    writeSerial(sendToArduino);
  }
}

Video

 

Exercise 3

Concept

An LED lights up when the ellipse touches the ground. A potentiometer is used to control the wind variable.

exe3.jpg

ex33.jpg

Code

Arduino:

// LED pin value
int ledPin = 5;

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  // We can't use the serial monitor since the serial connection is
  // used to communicate to p5js and only one application on the computer
  // can use a serial port at once.
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(ledPin, OUTPUT);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }

}

void loop() {

  // wait for data from p5 before doing something
  while (Serial.available()) {

    // led on while receiving data
    digitalWrite(LED_BUILTIN, HIGH); 

    // gets value from p5
    int value = Serial.parseInt();

    // turns on or off the led depending on value from p5
    if (Serial.read() == '\n') {
      if (value == 1) {
        digitalWrite(ledPin, HIGH);
      }
      else {
        digitalWrite(ledPin, LOW);
      }
      
      // gets sensor value
      int sensor = analogRead(A0);
      delay(5);

      // sends sensor value to p5
      Serial.println(sensor);
    }
  }
  // indicates end of reading
  digitalWrite(LED_BUILTIN, LOW);
  
}

p5:

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let value = 0;
let drag = 1;
let mass = 50;

function setup() {
  createCanvas(640, 360);
  noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  wind = createVector(0,0);
}

function draw() {
  background(255);
  stroke(0);
  fill(0);
  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
    
    applyForce(wind);
    applyForce(gravity);
    velocity.add(acceleration);
    velocity.mult(drag);
    position.add(velocity);
    acceleration.mult(0);
    ellipse(position.x,position.y,mass,mass);
    
    if (position.y > height-mass/2) {
      velocity.y *= -0.9;  // A little dampening when hitting the bottom
      position.y = height-mass/2;

      // sets value to 1 to indicate touching the ground
      value = 1;
    }
    else {
      // sets value to 0 to indicate not touching the ground
      value = 0;
    }
    
  }
  
  
}

function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed(){

  if (key==UP_ARROW){
    mass=random(15,80);
    position.y=-mass;
    velocity.mult(0);
  }
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}


// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // splits the potentiometer range in two
    // -- wind blows to the right if the reading is 511 or above
    if (int(trim(data)) >= 511) {
      wind.x = 3;
    }
    // -- wind blows to the left otherwise
    else {
      wind.x = -3;
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = value + "\n";
    writeSerial(sendToArduino);
  }
}

Video

 

 

Reading Response – Week 10

I share Bret Victor’s perspective on the significance of hands and their tactile capabilities when interacting with tools and the world around us. However, I believe that the challenges he identified with the “Picture Under Glass” concept may not be as relevant today with the advancements in augmented reality, virtual reality, and haptic technologies. Researchers are actively developing virtual environments that strive for realism, and personally, I’ve encountered promising experiences in our AIMLab on campus. For instance, I felt a virtual spider crawling up my arm and interacted with a software loop, sensing its weight and texture through a haptic device. In my view, while Victor’s concerns were valid at the time, technological progress has since addressed many of these issues.

Rekas & Boamah-Powers: Ab3n

Concept

Back home, we usually play the trumpet, but when we came to the UAE, we couldn’t find any trumpets. So, we decided to create our own – an electronic trumpet. It works by using a light sensor for the blowing effect and regular buttons for playing the notes. The sound comes out from a speaker. Simple as that!

The circuit diagram looks a bit messy but it works 🙂

#include "pitches.h"

//set the pins for the button, buzzer, and photoresistor
int firstKeyPin = 13;
int secondKeyPin = 12;
int thirdKeyPin = 11;
int buzzerPin = 9;
int blow = A2;

// variables regulate when value is read
const long interval = 200;  
unsigned long previousMillis = 0;
int blowVal;


void setup() {

  Serial.begin(9600);
  //set the button pins as inputs
  pinMode(firstKeyPin, INPUT_PULLUP);
  pinMode(secondKeyPin, INPUT_PULLUP);
  pinMode(thirdKeyPin, INPUT_PULLUP);

  //set the buzzer pin as an output
  pinMode(buzzerPin, OUTPUT);

  // reads value on setup to avoid later error
  blowVal = analogRead(blow);
}

void loop() {

  // reads current time
  unsigned long currentMillis = millis();

  // checks if specified duration has passed
  if (currentMillis - previousMillis >= interval) {
    // updates time since value was read from sensor
    previousMillis = currentMillis;

    // reads value from sensor
    blowVal = analogRead(blow);
  }

  Serial.println(blowVal);

  // conditions to play specific notes
  if (blowVal <= 350) {
    if ((digitalRead(firstKeyPin) == HIGH) && (digitalRead(secondKeyPin) == HIGH) && (digitalRead(thirdKeyPin) == HIGH)) {
      tone(buzzerPin, NOTE_F3); 
    }

    if ((digitalRead(firstKeyPin) == LOW) && (digitalRead(secondKeyPin) == HIGH) && (digitalRead(thirdKeyPin) == LOW)) {
      tone(buzzerPin, NOTE_G3); 
    }

    if ((digitalRead(firstKeyPin) == LOW) && (digitalRead(secondKeyPin) == LOW) && (digitalRead(thirdKeyPin) == HIGH)) {
      tone(buzzerPin, NOTE_A3); 
    }

    if ((digitalRead(firstKeyPin) == LOW) && (digitalRead(secondKeyPin) == HIGH) && (digitalRead(thirdKeyPin) == HIGH)) {
      tone(buzzerPin, NOTE_AS3); 
    }

  }
  
  if (blowVal > 350) {
    if ((digitalRead(firstKeyPin) == HIGH) && (digitalRead(secondKeyPin) == HIGH) && (digitalRead(thirdKeyPin) == HIGH)) {
      tone(buzzerPin, NOTE_C4); 
    }

    if ((digitalRead(firstKeyPin) == LOW) && (digitalRead(secondKeyPin) == LOW) && (digitalRead(thirdKeyPin) == HIGH)) {
      tone(buzzerPin, NOTE_D4); 
    }

    if ((digitalRead(firstKeyPin) == HIGH) && (digitalRead(secondKeyPin) == LOW) && (digitalRead(thirdKeyPin) == HIGH)) {
      tone(buzzerPin, NOTE_E4); 
    }

    if ((digitalRead(firstKeyPin) == LOW) && (digitalRead(secondKeyPin) == HIGH) && (digitalRead(thirdKeyPin) == HIGH)) {
      tone(buzzerPin, NOTE_AS3); 
    }

    if ((digitalRead(firstKeyPin) == HIGH) && (digitalRead(secondKeyPin) == HIGH) && (digitalRead(thirdKeyPin) == LOW)) {
      tone(buzzerPin, NOTE_F4); 
    }

  }
}

Just Work

Concept

For this particular assignment, I didn’t aim for anything in particular. I had been having issues connecting some of the components of the Arduino kit and getting them to work, so I decided to focus strictly on getting things to work. I used three buttons, jumper wires, three LEDs, a potentiometer, and the Arduino Uno board. The value read from the potentiometer sets the delay time, and each button blinks a different LED. That’s basically what the setup does.

I’ve included the sketch below.

//set the pins for the button and leds
int firstKeyPin = 2;
int secondKeyPin = 3;
int thirdKeyPin = 4;

int led1 = 9;
int led2 = 10;
int led3 = 11;

void setup() {
  //set the button pins as inputs
  pinMode(firstKeyPin, INPUT_PULLUP);
  pinMode(secondKeyPin, INPUT_PULLUP);
  pinMode(thirdKeyPin, INPUT_PULLUP);

  // set leds for output
  pinMode(led1, OUTPUT);
  pinMode(led2, OUTPUT);
  pinMode(led3, OUTPUT);

}

void loop() {
  // read voltage from potentiometer
  int delay_time = analogRead(A0);

  if(digitalRead(firstKeyPin) == LOW){        //if the first key is pressed
    // turns on led1
    digitalWrite(led1, HIGH);
    // delays for the value read from the potentiometer
    delay(delay_time);
    // turns off led1
    digitalWrite(led1, LOW);
    // delays for the value read from the potentiometer
    delay(delay_time);
  }
  else if(digitalRead(secondKeyPin) == LOW){  //if the second key is pressed
    // turns on led2
    digitalWrite(led2, HIGH);
    // delays for the value read from the potentiometer
    delay(delay_time);
    // turns off led2
    digitalWrite(led2, LOW);
    // delays for the value read from the potentiometer
    delay(delay_time);
  }
  else if(digitalRead(thirdKeyPin) == LOW){   //if the third key is pressed
    // turns on led3
    digitalWrite(led3, HIGH);
    // delays for the value read from the potentiometer
    delay(delay_time);
    // turns off led3
    digitalWrite(led3, LOW);
    // delays for the value read from the potentiometer
    delay(delay_time);
  }
  else{
    // turns off all leds
    digitalWrite(led1, LOW);
    digitalWrite(led2, LOW);
    digitalWrite(led3, LOW);
  }
}

Ideas for Future Improvements

For future improvements, I hope it doesn’t take me as much time as it took me this time to get everything working properly.

I used the tinker kit circuit guide as a reference when I got stuck.

Reading Reflection – Week 9

In my initial perspective, I approached the interaction in physical computing from a strictly practical standpoint. However, the revelation brought about by the mechanical pixels and the surrounding fields of grass has broadened my understanding, highlighting the potential for creativity and aesthetics in these interactions. Previously, I considered the user’s input as the primary influence, shaping the device’s responses accordingly. Yet, the mechanical pixels and the grassy environment exhibit a distinct “will of technology.” Despite their simplicity, they engage with their surroundings not merely to fulfill a predetermined purpose dictated by humans, but to actively create an interaction with people. An illustrative example is the subtle quivering of the mechanical pixel “leaves” in an attempt to attract attention when no one is nearby. As individuals pass by, they collectively form a wave, accompanied by a tinkling sound, showcasing a dynamic and engaging interaction that transcends the conventional user-device relationship.

After delving into Tigoe’s “Making Interactive Art,” my perspective on interaction and interactivity has evolved. Initially, I conceived these concepts as a straightforward exchange where two entities respond to each other through inputs and outputs. However, Tigoe’s insights have led me to perceive interactivity as a more nuanced and profound experience. The excerpt from “Making Interactive Art” that resonated with me emphasizes the importance of letting the audience absorb the work through their senses, encouraging them to contemplate the meaning of each part, identify elements open to contact or control, and ultimately empowering them to interpret and respond in their own unique ways.

This perspective reframes interactivity as a process that places emphasis on the receiving end, highlighting the significance of allowing individuals to engage with artistic or interactive creations on a personal and subjective level. Rather than a uniform response to inputs, true interactivity, as suggested by Tigoe, enables a diverse range of interpretations and responses. In this light, I now see interaction not only as a means of communication but also as a pathway for individuals to feel a sense of uniqueness, care, and individuality within the realm of human experience.

Reading reflection – Week 8

The concept that attractive things work better raises an intriguing question about the relationship between aesthetics and functionality. It is indeed thought-provoking to consider whether we find objects appealing because we can subconsciously perceive their potential ease of use, and the subsequent satisfaction we derive from confirming our expectations. Just as when we examine a product and intuitively sense its ergonomic design, we appreciate its simplicity and effectiveness even before handling it. This initial attraction is comparable to our perceptions of beauty in people; an attractive exterior often becomes more compelling when accompanied by qualities like a warm smile, healthy teeth, a pleasant voice, and a likable personality.

On the other hand, the counterargument suggests that our thoughtfulness in assessing attractiveness beforehand may be limited. It’s true that not all functional products on the shelves are coveted in the manner described. Many utilitarian items do not receive the same level of initial attention or appeal as aesthetically pleasing ones. However, the counterargument acknowledges that functionality can enhance attractiveness afterward. For instance, a product that may not appear striking initially but offers exceptional functionality can become more attractive once its usefulness and positive attributes are realized. This is analogous to a person who may not be conventionally attractive but is incredibly enjoyable to be around, gradually altering our perception and making us appreciate their company in a different light.