Final Project User Testing

For user testing, I asked two of my coursemates to play the game and give me their feedback. We had differing opinions on several aspects of the game, but we also agreed on several improvements. Initially, the objective was to collect 10 treasures on both level 1 and level 2, but given the map size, 3 to 5 treasures seems to be enough; any more takes too long and might bore the players. Another piece of advice was to decrease the range of the radar, which actually surprised me, since I thought the range was too small. We also thought it would be a good idea to make the pulsing effect much slower and in line with the cooldown, so that users can tell when the radar is ready again. I found their feedback on this most useful, especially since I also had to explain certain instructions and game mechanics to them.
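To illustrate that last point, here is a minimal p5.js sketch of how the radar pulse could be tied to the cooldown timer; radarCooldown and lastPingTime are hypothetical names, not variables from the actual game:

let radarCooldown = 3000; // hypothetical cooldown length in ms
let lastPingTime = 0;     // when the radar was last fired

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(20);
  // progress goes from 0 (just pinged) to 1 (radar ready) over one cooldown
  let progress = constrain((millis() - lastPingTime) / radarCooldown, 0, 1);
  noFill();
  stroke(0, 255, 0, 255 * (1 - progress)); // pulse fades as the radar recharges
  circle(width / 2, height / 2, 300 * progress); // one slow pulse per cooldown cycle
  if (progress >= 1 && mouseIsPressed) {
    lastPingTime = millis(); // fire the radar again only once it has cooled down
  }
}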

IMG_4595

Final Project – ThumbWave Studio – Final Report – Dachi

Concept:

My final project mostly stays true to its initial goal, except that I combined both of my ideas. It is an interactive art project that lets you both visualize and create your own visual art. At its core are sine wave structures, which in simple terms serve as the fundamental graphical elements (rendered in 3D with p5.js's WEBGL mode) used to create dynamic, flowing patterns. The user is free to control them with various gestures and interactions, all of which feel natural to perform with our hands; for example, pinching your thumb and index finger together and moving your hand up and down. There is also a layer of Arduino integration that expands the functionality: an ultrasonic sensor dictates the zoom level of the canvas, and three buttons serve different functions. The red one toggles trail mode, which lets you create a dynamic painting; the yellow one stops the drawing so you can then save your work with the green button as a screenshot.

This simple concept can be expanded to bigger scales. I could imagine it being used in science or other classes to demonstrate different mathematical equations or models that students can interact with through simple gestures. They could adjust individual parameters and really see what's going on under the hood, and the gestures make doing so much more intuitive and fun. Moreover, they are free to experiment with trails, create abstract drawings of their own, and come up with creative ways to combine and fuse different shapes and patterns into beautiful renders.

Images:

Example sketches:

 

 

 

 

 

UI:

Building Arduino interface:

 

User Testing:

My friend Adi, who has already taken this class and has experience in Interactive Media, figured it out quite quickly. Before we even started, he told me he had worked with ML5 before and knew its strengths and weaknesses and what to watch out for. Despite this, he still made some mistakes initially, since he did not fully know how to perform the gestures. I showed him the instructions page and, after confirming some of his doubts about hand orientation with me, we started a second round of testing where I tried to explain things only after he had tried them on his own. He said it was pretty smooth and there were no major delays. One way to make this clearer for people who, unlike Adi, have never worked with ML5 before would be to record short GIFs or videos as visual demonstrations instead of pictures and text, so they can see what's happening in three dimensions, since the project itself is very much three-dimensional. It will be interesting to see how people react to it during the IM showcase, which I will cover in the respective section. Here is Adi's second run:

More on: https://drive.google.com/drive/folders/1kAu8gpN6yCG0EfSr5FmAnB6liQApwaBf?usp=sharing

Implementation:

 

Interaction Design:

The interaction design revolves around using hand gestures to control various parameters of the visual art. Each gesture is mapped to a specific parameter, allowing the user to intuitively manipulate the shapes and patterns. For example, pinching the thumb and index finger together and moving the hand up and down controls the X-axis rotation. Thumb and middle finger controls the number of shapes, thumb and ring finger controls shape detail, and thumb and pinkie changes the perceived rotation speed (rotateZAngle). Additionally, using two hands changes zScale, which visually reads as a larger movement across Z, a sort of pulsing; moving the two hands like an accordion adds to the fun and visual candy. Users can also use the physical box, with its three buttons and distance sensor, to change zoom, toggle trail mode, stop/continue drawing, and save a screenshot. The use of gestures makes the interaction more engaging and natural, enabling users to explore and experiment with the artwork in a hands-on manner.
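As a rough illustration of that mapping (not the actual project code, which appears in gestures.js below), the dispatch logic can be thought of as a single chain of pinch-distance checks; dispatchGesture and its return values are hypothetical, while the 20-pixel threshold matches the real sketch:

// Simplified gesture dispatch: each thumb + finger pinch under ~20 px
// selects one parameter, and the wrist position supplies its new value.
const PINCH_THRESHOLD = 20;

function dispatchGesture(thumb, index, middle, ring, pinkie, wrist) {
  if (dist(thumb.x, thumb.y, index.x, index.y) < PINCH_THRESHOLD) {
    return { param: 'rotateXAngle', drivenBy: wrist.y }; // tilt follows hand height
  } else if (dist(thumb.x, thumb.y, middle.x, middle.y) < PINCH_THRESHOLD) {
    return { param: 'numShapes', drivenBy: wrist.x };    // shape count follows hand x
  } else if (dist(thumb.x, thumb.y, ring.x, ring.y) < PINCH_THRESHOLD) {
    return { param: 'shapeDetail', drivenBy: wrist.x };
  } else if (dist(thumb.x, thumb.y, pinkie.x, pinkie.y) < PINCH_THRESHOLD) {
    return { param: 'rotateZAngle', drivenBy: wrist.x };
  }
  return null; // no recognized single-hand gesture this frame
}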

Arduino Code:

const int trigPin = 9;
const int echoPin = 10;
const int buttonPin = 2;
const int ledPin = 13;
const int trailButtonPin = 4;
const int trailLedPin = 7;
const int screenshotButtonPin = 12;
const int screenshotLedPin = 8;
const int adminButtonPin = 6;

long duration;
int distance;
int buttonState = 0;
int trailButtonState = 0;
int screenshotButtonState = 0;
int adminButtonState = 0;

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(buttonPin, INPUT);
  pinMode(ledPin, OUTPUT);
  pinMode(trailButtonPin, INPUT);
  pinMode(trailLedPin, OUTPUT);
  pinMode(screenshotButtonPin, INPUT);
  pinMode(screenshotLedPin, OUTPUT);
  pinMode(adminButtonPin, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Read distance from ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  distance = duration * 0.034 / 2; // convert echo time to distance in cm
  distance = constrain(distance, 1, 20); // constrain to the 1-20 cm range used for zoom

  // Read button states
  buttonState = digitalRead(buttonPin);
  trailButtonState = digitalRead(trailButtonPin);
  screenshotButtonState = digitalRead(screenshotButtonPin);
  adminButtonState = digitalRead(adminButtonPin);

  // Control LEDs based on button states
  digitalWrite(ledPin, buttonState);
  digitalWrite(trailLedPin, trailButtonState);
  digitalWrite(screenshotLedPin, screenshotButtonState);

  // Send data to p5.js
  Serial.print(distance);
  Serial.print(",");
  Serial.print(buttonState);
  Serial.print(",");
  Serial.print(trailButtonState);
  Serial.print(",");
  Serial.print(screenshotButtonState);
  Serial.print(",");
  Serial.println(adminButtonState);

  delay(100);
}

The Arduino code is responsible for reading data from the ultrasonic sensor and button states, and sending this information to the p5.js sketch. It continuously measures the distance using the ultrasonic sensor, which is then mapped to control the zoom level of the canvas. The code also reads the states of three buttons: a red button for toggling the trail mode, a yellow button for stopping the drawing, and a green button for taking a screenshot. The button states and distance data are sent to p5.js via serial communication.

Description of p5.js code:

The p5.js code handles the visual rendering and interaction of the artwork. It uses the ML5 library for hand pose detection, allowing the sketch to recognize and track hand gestures. The code maps each gesture to a specific parameter of the visual elements, such as X-axis rotation, number of shapes, shape detail, and Z-axis rotation. It also incorporates the data received from the Arduino, using the ultrasonic sensor readings to control the zoom level and the button states to toggle trail mode, stop drawing, and take screenshots. The code creates a dynamic and interactive experience by combining the hand gesture controls with the Arduino inputs.

I have three main sketch files. sketch.js is the main one and is responsible for drawing the artwork as well as declaring and initializing variables and creating the buttons, sliders, other helpers, and toggles. It binds the whole program together and keeps it running.

 

//Variables
let handPose; // ml5.js hand pose object
let video; 
let hands = []; // stores hand data from the pose detector
let rotateXAngle = 60;
let rotateZAngle = 60;
let numShapes = 50; // number of geometric shapes displayed
let shapeDetail = 360 / 60;  // detail level of each shape
let radialScale = 3;
let zScale = 50; 
let gestureActivated = false;
let zoom = 1; 
let trailsEnabled = false;
let drawingEnabled = true;
let osc;
let adminMode = false; //for debugging + additional controls
let instructionsVisible = false;

// show or hide the main UI buttons depending on whether the instructions page is open
function updateButtonVisibility() {
  let buttons = selectAll('button:not(#backButton)');
  for (let i = 0; i < buttons.length; i++) {
    buttons[i].style('display', instructionsVisible ? 'none' : 'inline-block');
  }
}

function createInstructionsButton() {
  let buttonContainer = createDiv("");
  buttonContainer.style("position", "absolute");
  buttonContainer.style("right", "229px");
  buttonContainer.style("top", "20px");

  let instructionsButton = createButton("Instructions");
  instructionsButton.parent(buttonContainer);
  instructionsButton.id("instructionsButton");
  instructionsButton.mousePressed(toggleInstructions);
  styleButton(instructionsButton);
}

function toggleInstructions() {
  instructionsVisible = !instructionsVisible;
  updateButtonVisibility();
  updateAdminElements();
}

function hideInstructions() {
  instructionsVisible = false;
  updateButtonVisibility();
  updateAdminElements();
  backButtonPressed = true;
}


function createBackButton() {
  let buttonContainer = createDiv("");
  buttonContainer.style("position", "absolute");
  buttonContainer.style("left", "20px");
  buttonContainer.style("top", "20px");

  let backButton = createButton("↑");
  backButton.parent(buttonContainer);
  backButton.id("backButton");
  backButton.mousePressed(hideInstructions);
  styleButton(backButton);
}

function toggleAdminMode() {
  adminMode = !adminMode;
  updateAdminElements();
}

function updateAdminElements() {
  let elements = selectAll('button:not(#adminButton):not(#fullscreenButton):not(#instructionsButton):not(#backButton), input[type="range"]');
  for (let i = 0; i < elements.length; i++) {
    elements[i].style('display', adminMode && !instructionsVisible ? 'inline-block' : 'none');
  }
}

function createAdminButton() {
  let buttonContainer = createDiv("");
  buttonContainer.style("position", "absolute");
  buttonContainer.style("right", "20px");
  buttonContainer.style("top", "20px");

  let adminButton = createButton("Admin");
  adminButton.parent(buttonContainer);
  adminButton.id("adminButton");
  adminButton.mousePressed(toggleAdminMode);
  styleButton(adminButton);
}

function createFullscreenButton() {
  let buttonContainer = createDiv("");
  buttonContainer.style("position", "absolute");
  buttonContainer.style("right", "110px");
  buttonContainer.style("top", "20px");

  let fullscreenButton = createButton("Fullscreen");
  fullscreenButton.parent(buttonContainer);
  fullscreenButton.id("fullscreenButton");
  fullscreenButton.mousePressed(toggleFullscreen);
  styleButton(fullscreenButton);
}

function styleButton(button) {
  button.style("background-color", "#4CAF50");
  button.style("border", "none");
  button.style("color", "white");
  button.style("padding", "10px 20px");
  button.style("text-align", "center");
  button.style("text-decoration", "none");
  button.style("display", "inline-block");
  button.style("font-size", "16px");
  button.style("border-radius", "4px");
  button.style("cursor", "pointer");
}

function createSliders() { //visible in admin mode
  let sliderContainer = createDiv("");
  sliderContainer.id("sliderContainer");
  sliderContainer.style("position", "absolute");
  sliderContainer.style("left", "20px");
  sliderContainer.style("top", "20px");
  sliderContainer.style("display", "flex");
  sliderContainer.style("flex-direction", "column");

  let rotateZSlider = createSlider(10, 180, rotateZAngle);
  rotateZSlider.parent(sliderContainer);
  rotateZSlider.style("width", "200px");
  rotateZSlider.input(() => updateRotateZAngle(rotateZSlider.value()));

  let numShapesSlider = createSlider(10, 100, numShapes, 1);
  numShapesSlider.parent(sliderContainer);
  numShapesSlider.style("width", "200px");
  numShapesSlider.input(() => updateNumShapes(numShapesSlider.value()));

  let shapeDetailSlider = createSlider(3, 60, 6, 1);
  shapeDetailSlider.parent(sliderContainer);
  shapeDetailSlider.style("width", "200px");
  shapeDetailSlider.input(() => updateShapeDetail(shapeDetailSlider.value()));

  let radialScaleSlider = createSlider(1, 10, radialScale, 0.1);
  radialScaleSlider.parent(sliderContainer);
  radialScaleSlider.style("width", "200px");
  radialScaleSlider.input(() => updateRadialScale(radialScaleSlider.value()));

  let zScaleSlider = createSlider(10, 100, zScale, 1);
  zScaleSlider.parent(sliderContainer);
  zScaleSlider.style("width", "200px");
  zScaleSlider.input(() => updateZScale(zScaleSlider.value()));

  let zoomSlider = createSlider(0.1, 2, zoom, 0.1);
  zoomSlider.parent(sliderContainer);
  zoomSlider.style("width", "200px");
  zoomSlider.input(() => updateZoom(zoomSlider.value()));
}

function toggleFullscreen() {
  if (!document.fullscreenElement) {
    if (document.documentElement.requestFullscreen) {
      document.documentElement.requestFullscreen();
    } else if (document.documentElement.webkitRequestFullscreen) { // Safari
      document.documentElement.webkitRequestFullscreen();
    } else if (document.documentElement.msRequestFullscreen) { // IE/Edge
      document.documentElement.msRequestFullscreen();
    }
  } else {
    if (document.exitFullscreen) {
      document.exitFullscreen();
    } else if (document.webkitExitFullscreen) { // Safari
      document.webkitExitFullscreen();
    } else if (document.msExitFullscreen) { // IE/Edge
      document.msExitFullscreen();
    }
  }
}
function windowResized() {
  if (document.fullscreenElement) {
    resizeCanvas(windowWidth, windowHeight);
    instructionsGraphics.resizeCanvas(windowWidth, windowHeight * 2); //for fullscreen

  } else {
    resizeCanvas(windowWidth, windowHeight);
    instructionsGraphics.resizeCanvas(windowWidth, windowHeight * 2);

  }
}

function updatePitch(value) { //sound
  let freq = map(value, 1, 200, 50, 400);
  oscZScale.freq(freq);
  envZScale.play();
}

function updateFrequency(value) {  //different shape sound, reversed
  let freq = map(value, 10, 180, 400, 50);
  oscRotateZAngle.freq(freq);

}
function preload() {
  handPose = ml5.handPose();
  img1 = loadImage('gestures.png');
  img2 = loadImage('box.png');
}


function createScreenshotButton() {
  let buttonContainer = createDiv("");
  buttonContainer.style("position", "absolute");
  buttonContainer.style("left", "20px");
  buttonContainer.style("top", "260px");

  let screenshotButton = createButton("Take Screenshot");
  screenshotButton.parent(buttonContainer);
  screenshotButton.mousePressed(takeScreenshot);
}

function takeScreenshot() {
  saveCanvas('screenshot', 'png');
}

function setup() {
  createFullscreenButton();
  createCanvas(windowWidth, windowHeight, WEBGL);
  video = createCapture(VIDEO);
  video.size(640, 480); // original capture resolution
  video.style('transform', 'scale(-1, 1)'); // mirror the video, useful if the feed is shown for testing
  video.hide(); // keep the raw video hidden; comment out when debugging hand tracking
  handPose.detectStart(video, gotHands);
  angleMode(DEGREES);
  createSliders();
  createTrailButton();
  createStopDrawingButton();
  createScreenshotButton();
  oscZScale = new p5.Oscillator('triangle');
  oscRotateZAngle = new p5.Oscillator('sawtooth'); //waveform sawtooth
  oscZScale.amp(0.1);
  oscRotateZAngle.amp(0.1);
  
  envZScale = new p5.Envelope(); // envelope used to control amplitude
  envZScale.setADSR(0.1, 0.2, 0.5, 0.5); // attack, decay, sustain, release times
  envZScale.setRange(0.2, 0); // fade to silence after the sustain phase

  oscZScale.start();
  oscZScale.amp(envZScale); // the envelope drives the triangle oscillator's amplitude

  // updateFrequency(rotateZAngle); //initial frequency
  updatePitch(zScale);
  createAdminButton();
  createBackButton();
  createInstructionsButton();
  updateButtonVisibility();
  updateAdminElements(); // make admin initially not visible
  instructionsGraphics = createGraphics(width, height * 2); //new graphics buffer for instructions
}
function keyPressed() { //spacebar to select serial port
  if (key === ' ') {
    setUpSerial();
  }
}

let backButtonPressed = false;

function draw() {
  if (!trailsEnabled || backButtonPressed) {
    background(30); //clear background
    backButtonPressed = false;
  }


  if (instructionsVisible) {
    // Clear the instructions graphics buffer
    instructionsGraphics.clear();
    instructionsGraphics.fill(0); //rectangle background black
    instructionsGraphics.noStroke();
    instructionsGraphics.rectMode(CORNER);
    instructionsGraphics.rect(0, 0, width, 2*height);
    instructionsGraphics.textAlign(LEFT, TOP); //align text
    
    // Adjust text size based on fullscreen mode
    let textSize = document.fullscreenElement ? 24 : 16;
    instructionsGraphics.textSize(textSize);

    instructionsGraphics.fill(255);

    // instruction paragraphs
    let paragraph1 = `Welcome to ThumbWave Studio, where you can explore interactive mathematical graphics with your hand gestures. Different hand gestures are assigned to different parameters. Holding your thumb and index finger together alters the tilt of the visuals, mimicking the effect of changing your perspective. Bringing your thumb and middle finger together adjusts the number of shapes on the display, allowing you to fill the screen with complexity or clear it for simplicity. Connecting your thumb to your ring finger modifies the intricacy of each shape, adding a layer of detail with further movements.`;
    let paragraph2 = `Keep in mind, while you hold these gestures you can change the value of the selected parameter by moving your hand up and down (for X-axis rotation) or side to side (for the rest). Finally, a touch between your thumb and pinkie will spin the shapes around the Z-axis, injecting motion into the scene. For a more dramatic effect, use both hands as if handling an accordion: moving your hands together and apart changes the scale and depth of the shapes on the screen and alters the pitch of the background sounds to match your movements, enhancing the sensory experience.`;

    // each paragraph with appropriate spacing
    let padding = 20;
    let topPadding = 80; // top padding before the first paragraph
    let maxTextWidth = width * 0.9;
    let lineSpacing = document.fullscreenElement ? 60 : 50;

    drawParagraph(instructionsGraphics, paragraph1, padding, topPadding + padding, maxTextWidth, lineSpacing);
    let paragraph1Height = calculateParagraphHeight(instructionsGraphics, paragraph1, maxTextWidth, lineSpacing);

    // first image after the first paragraph
    let img1Width = width * 0.8;
    let img1Height = img1.height * (img1Width / img1.width);
    let img1X = (width - img1Width) / 2;
    let img1Y = topPadding + padding + paragraph1Height + lineSpacing;
    instructionsGraphics.image(img1, img1X, img1Y, img1Width, img1Height);

    drawParagraph(instructionsGraphics, paragraph2, padding, img1Y + img1Height + lineSpacing, maxTextWidth, lineSpacing);
    let paragraph2Height = calculateParagraphHeight(instructionsGraphics, paragraph2, maxTextWidth, lineSpacing);

    // second image after the second paragraph
    let img2Width = width * 0.8;
    let img2Height = img2.height * (img2Width / img2.width);
    let img2X = (width - img2Width) / 2;
    let img2Y = img1Y + img1Height + lineSpacing + paragraph2Height + lineSpacing;
    instructionsGraphics.image(img2, img2X, img2Y, img2Width, img2Height);
    
    let scrollPosition;
    if (document.fullscreenElement) {
      scrollPosition = map(mouseY, 0, windowHeight, 0, instructionsGraphics.height - windowHeight);
    } else {
      scrollPosition = map(mouseY, 0, height, 0, instructionsGraphics.height - height);
    }
    image(instructionsGraphics, -width / 2, -height / 2, width, height, 0, scrollPosition, width, height);


    //  the back button
    select("#backButton").style("display", "inline-block");
  } else { //dynamic rendering of 3D geometric shapes
    if (drawingEnabled) {
      push();
      scale(zoom);
      rotateX(rotateXAngle);
      noFill();
      stroke(255);
      for (let i = 0; i < numShapes; i++) { //dynamic color assignment
        let r = map(sin(frameCount / 2), -1, 1, 100, 200);
        let g = map(i, 0, numShapes, 100, 200);
        let b = map(cos(frameCount), -1, 1, 200, 100);
        stroke(r, g, b);
        rotate(frameCount / rotateZAngle); //rotate shape around z axis
        beginShape();
        for (let j = 0; j < 360; j += shapeDetail) { // 3D coordinates for each vertex of the shape
          let rad = j * radialScale;
          let x = rad * cos(j);
          let y = rad * sin(j);
          let z = sin(frameCount * 2 + i * 5) * zScale;
          vertex(x, y, z);
        }
        endShape(CLOSE);
      }
      pop();
    }

    // Hide the back button
    select("#backButton").style("display", "none");
  }
}

// Helper function to draw a paragraph of text
function drawParagraph(graphics, text, x, y, maxWidth, lineSpacing) {
  let words = text.split(' ');
  let currentLine = '';
  let yPos = y;
  //split text in individual words
  for (let i = 0; i < words.length; i++) {
    let word = words[i];
    let testLine = currentLine + ' ' + word; //add to current line
    let testWidth = graphics.textWidth(testLine);

    if (testWidth > maxWidth && currentLine !== '') { //exceed max
      graphics.text(currentLine, x, yPos); //we draw currentline on graphics
      currentLine = word;
      yPos += lineSpacing;
    } else {
      currentLine = testLine; //word added to current line
    }
  }
  graphics.text(currentLine, x, yPos); // draw the last line
}


function calculateParagraphHeight(graphics, text, maxWidth, lineSpacing) {
  let words = text.split(' ');
  let currentLine = '';
  let height = 0;
  

  for (let i = 0; i < words.length; i++) {
    let word = words[i];
    let testLine = currentLine + ' ' + word;
    let testWidth = graphics.textWidth(testLine);

    if (testWidth > maxWidth && currentLine !== '') { 
      currentLine = word;
      height += lineSpacing; //increments height counter
    } else {
      currentLine = testLine;
    }
  }
  height += lineSpacing; // Add the last line's spacing
  return height;
}

function createStopDrawingButton() {
  let buttonContainer = createDiv("");
  buttonContainer.style("position", "absolute");
  buttonContainer.style("left", "20px");
  buttonContainer.style("top", "230px");

  let stopDrawingButton = createButton("Stop Drawing");
  stopDrawingButton.parent(buttonContainer);
  stopDrawingButton.mousePressed(toggleDrawing);
}
function toggleDrawing() {
  drawingEnabled = !drawingEnabled;
}

function createTrailButton() {
  let buttonContainer = createDiv("");
  buttonContainer.style("position", "absolute");
  buttonContainer.style("left", "20px");
  buttonContainer.style("top", "200px");

  let trailButton = createButton("Toggle Trails");
  trailButton.parent(buttonContainer);
  trailButton.mousePressed(toggleTrails);
}

function toggleTrails() {
  trailsEnabled = !trailsEnabled;
}

let rotateXAngleHistory = [];
let rotateXAngleHistorySize = 10;
let rotateXAngleSmoothingFactor = 0.2; //used for smoothing X with moving average


function updateRotateZAngle(value) {
  rotateZAngle = value;
  updateFrequency(value);
}

function updateNumShapes(value) {
  numShapes = value;
}

function updateShapeDetail(value) {
  shapeDetail = 360 / value;
}

function updateRadialScale(value) {
  radialScale = value;
}

function updateZScale(value) {
  zScale = value;
  updatePitch(value);
}

function updateZoom(value) {
  zoom = value;
}

The other file is gestures.js, which is the heart of the project. It handles the detection of gestures as described above. The general functions are well commented, and you are free to inspect them for the particular mechanisms and logic.

 

function gotHands(results) {
  hands = results; //store data
  if (hands.length === 2) { //two hand detection
    let leftWrist = hands[0].keypoints[0];
    let rightWrist = hands[1].keypoints[0];
    // get the wrist positions and calculate the distance between them
    let wristDistance = dist(leftWrist.x, leftWrist.y, rightWrist.x, rightWrist.y);

    let minDistance = 100;
    let maxDistance = 400;
    // constrain the wrist distance to a usable range
    let mappedDistance = constrain(wristDistance, minDistance, maxDistance);
    let zScaleNew = map(mappedDistance, minDistance, maxDistance, 1, 200); //we map it to z scale
    zScale = zScaleNew; //update global value
    updatePitch(zScaleNew);
    if (adminMode) {
      console.log("Two hands gesture - zScale:", zScale);
    }
  } else if (hands.length > 0) { //if at least one is detected
    if (adminMode) {
      console.log("Pinch gesture - rotateXAngle:", rotateXAngle);
      console.log("Middle-thumb gesture - numShapes:", numShapes);
      console.log("Ring-thumb gesture - shapeDetail:", shapeDetail);
      console.log("Pinkie-thumb gesture - rotateZAngle:", rotateZAngle);
    }
    // fingertips used for the various gestures
    let indexFingerTip = hands[0].keypoints[8];
    let thumbTip = hands[0].keypoints[4];
    let pinchDistance = dist(indexFingerTip.x, indexFingerTip.y, thumbTip.x, thumbTip.y);

    let middleFingerTip = hands[0].keypoints[12];
    let middleThumbDistance = dist(middleFingerTip.x, middleFingerTip.y, thumbTip.x, thumbTip.y);

    let ringFingerTip = hands[0].keypoints[16];
    let ringThumbDistance = dist(ringFingerTip.x, ringFingerTip.y, thumbTip.x, thumbTip.y);

    let pinkieFingerTip = hands[0].keypoints[20];
    let pinkieThumbDistance = dist(pinkieFingerTip.x, pinkieFingerTip.y, thumbTip.x, thumbTip.y);

    //thumb to index gesture
    if (pinchDistance < 20) {
      gestureActivated = true;
      let wristY = hands[0].keypoints[0].y;
      let centerY = video.height * 0.6;
      let range = video.height / 10;
      let mappedY = constrain(wristY, centerY - range, centerY + range);
      let rotateXAngleNew = map(mappedY, centerY - range, centerY + range, 200, 0);

      rotateXAngleHistory.push(rotateXAngleNew);
      //we maintain history of rotateX angles to smoothen transition
      if (rotateXAngleHistory.length > rotateXAngleHistorySize) {
        rotateXAngleHistory.shift();
      }

      let rotateXAngleAverage = rotateXAngleHistory.reduce((sum, value) => sum + value, 0) / rotateXAngleHistory.length;

      rotateXAngle = lerp(rotateXAngle, rotateXAngleAverage, rotateXAngleSmoothingFactor);

    } else if (middleThumbDistance < 20) { // thumb and middle finger gesture
      let wristX = hands[0].keypoints[0].x;
      let centerX = video.width * 0.5;
      let range = video.width / 8;
      let mappedX = constrain(wristX, centerX - range, centerX + range);
      let numShapesNew = round(map(mappedX, centerX - range, centerX + range, 100, 1));
      numShapes = numShapesNew;

    } else if (ringThumbDistance < 20) { // ring and thumb gesture
      let wristX = hands[0].keypoints[0].x;
      let centerX = video.width * 0.5;
      let range = video.width / 8;
      let mappedX = constrain(wristX, centerX - range, centerX + range);
      let shapeDetailNew = round(map(mappedX, centerX - range, centerX + range, 3, 60));
      shapeDetail = 360 / shapeDetailNew;
 
    } else if (pinkieThumbDistance < 20) { //pinkie thumb gesture
      let wristX = hands[0].keypoints[0].x;
      let centerX = video.width * 0.5;
      let range = video.width / 8;
      let mappedX = constrain(wristX, centerX - range, centerX + range);
      let rotateZAngleNew = round(map(mappedX, centerX - range, centerX + range, 10, 180));
      rotateZAngle = rotateZAngleNew;
      updateFrequency(rotateZAngleNew);

    } else {
      gestureActivated = false;
    }
  }
}

Last but not least is the file that handles serial communication with the Arduino. This code sets up and runs the serial communication between the Arduino and p5.js. It reads data from the Arduino, which includes the ultrasonic sensor distance and the button states, and updates the corresponding variables in the p5.js sketch. The setUpSerial function initializes the serial communication, while the runSerial function continuously reads data from the serial port. The readSerial function parses the received data and updates the p5.js sketch accordingly. The distance value from the ultrasonic sensor controls the zoom level of the canvas using a smoothing technique. The button states toggle various functionalities, such as enabling/disabling drawing, toggling trails, and taking screenshots. If adminMode is enabled, the code logs relevant information to the console for debugging purposes.

 

async function setUpSerial() {
  noLoop();
  ({ reader, writer } = await getPort());
  serialActive = true;
  runSerial();
  loop();
}

async function runSerial() {
  try {
    while (true) {
      if (serialActive) {
        const { value, done } = await reader.read();
        if (done) {
          reader.releaseLock();
          break;
        }
        readSerial(value);
      } else {
        break;
      }
    }
  } catch (e) {
    console.error(e);
  }
}

let zoomPrev = zoom;
let zoomSmoothingFactor = 0.1;
let zoomHistory = [];
let zoomHistorySize = 10;

let previousButtonState = 0;
let previousTrailButtonState = 0;
let previousScreenshotButtonState = 0;


let previousAdminButtonState = 0;



function readSerial(data) {
  let values = data.trim().split(",");
  if (values.length === 5) {
    let distance = parseInt(values[0]);
    let buttonState = parseInt(values[1]);
    let trailButtonState = parseInt(values[2]);
    let screenshotButtonState = parseInt(values[3]);
    let adminButtonState = parseInt(values[4]);

    if (!isNaN(distance) && !isNaN(buttonState) && !isNaN(trailButtonState) && !isNaN(screenshotButtonState) && !isNaN(adminButtonState)) {
      let zoomNew = map(distance, 1, 20, 4, 0.2);
      zoomHistory.push(zoomNew);
      if (zoomHistory.length > zoomHistorySize) {
        zoomHistory.shift();
      }
      let zoomAverage = zoomHistory.reduce((sum, value) => sum + value, 0) / zoomHistory.length;
      zoom = lerp(zoomPrev, zoomAverage, zoomSmoothingFactor);
      zoomPrev = zoom;
      if (adminMode) {
        console.log("Distance sensor - zoom:", zoom);
      }

      if (buttonState === 1 && previousButtonState === 0) {
        drawingEnabled = !drawingEnabled;
        if (adminMode) {
          console.log("Drawing state toggled:", drawingEnabled);
        }
      }
      previousButtonState = buttonState;

      if (trailButtonState === 1 && previousTrailButtonState === 0) {
        trailsEnabled = !trailsEnabled;
        if (adminMode) {
          console.log("Trails state toggled:", trailsEnabled);
        }
      }
      previousTrailButtonState = trailButtonState;

      if (screenshotButtonState === 1 && previousScreenshotButtonState === 0) {
        saveCanvas('screenshot', 'png');
        if (adminMode) {
          console.log("Screenshot taken");
        }
      }
      previousScreenshotButtonState = screenshotButtonState;
    }
  }
}

The communication between Arduino and p5.js:

As described above, the Arduino code sends the ultrasonic sensor distance and button states as a comma-separated string to p5.js. The p5.js code listens for the serial data and parses the received string to extract the distance and button states. This parsed data is then used to update the corresponding variables in the p5.js sketch, allowing the Arduino inputs to influence the visual output in real time. The seamless communication between Arduino and p5.js enables the integration of physical interactions with the digital artwork.
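For reference, one frame of that protocol looks like the line below (the values are made up); splitting on commas recovers the five fields in the same order the Arduino prints them:

// Example message from the Arduino: distance, stop/continue, trail, screenshot, admin
let sample = "12,0,1,0,0";
let [distance, buttonState, trailButtonState, screenshotButtonState, adminButtonState] =
  sample.trim().split(",").map(Number);
// distance = 12 (cm), trailButtonState = 1 (trail button held), all other buttons released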

Sketch Embed

Link for testing:

https://editor.p5js.org/dt2307/full/Gqx2rsti9L

What I am proud of:

I am proud of making the ML5 Handpose integration as smooth as it is. For the most part, the experience is seamless, with only a minor delay. I am glad the visualization works, and thanks to the flexible nature of this project, it could be expanded to other mathematical demonstrations.

I am also proud of the fact that I did not tie myself to just one idea. I experimented with both and found a way to combine them. Allowing users not just to test something but also to create something of their own is true interaction, and I feel this project accomplishes that by integrating the many hardware, software, and design principles we learned in class.

 

Resources I used:

For the sine wave visualizations, I followed Colorful Coding’s videos: https://www.youtube.com/@ColorfulCoding

For general knowledge about ML5, I used ML5 project website: https://ml5js.org

For the Handpose detection model, I used the following GitHub repository with next-gen ml5: https://github.com/ml5js/ml5-next-gen

For general principles, knowledge, and troubleshooting, I used the open web, the provided slides, and other available resources.

Challenges I faced and how I overcame them:

Throughout the development of this project, I encountered several challenges that tested my problem-solving skills and pushed me to think creatively. One of the main challenges was ensuring a smooth and responsive interaction between the hand gestures and the visual elements. Initially, the gestures felt sluggish and unreliable, leading to a frustrating user experience. To overcome this, I spent a considerable amount of time fine-tuning the gesture recognition and mapping, experimenting with different thresholds and smoothing techniques (I ended up using a moving-average method). Through trial and error, I managed to strike a balance between responsiveness and stability, resulting in a more intuitive and enjoyable interaction.
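Distilled out of the sketch, the moving-average idea looks roughly like this (a hedged sketch with a hypothetical smoothValue helper; the real code inlines the same steps for rotateXAngle and zoom):

// Keep the last N raw readings, average them, then ease toward that average.
const HISTORY_SIZE = 10;
const SMOOTHING = 0.2;
let history = [];

function smoothValue(current, raw) {
  history.push(raw);
  if (history.length > HISTORY_SIZE) history.shift(); // drop the oldest reading
  const average = history.reduce((sum, v) => sum + v, 0) / history.length;
  return lerp(current, average, SMOOTHING); // small step toward the average each frame
}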

Another challenge I faced was working with different graphics buffers. Sometimes drawings would get messed up and not display at all, display on top of each other, or display only partially. Again, it took lots of trial and error, but I eventually found what worked. There were several other minor bugs that users might not have noticed immediately, but I tried to polish them out so the whole experience remained coherent. The last aspect was improving the CSS and styling to make the presentation visually pleasing. The audio was also a bit confusing at first: I tried keeping constant audio, but that got annoying after a while. The dynamic audio, which is only audible during meaningful movements (e.g. stretching across Z), is much more satisfying.

I also took a few suggestions from friends and professors to improve the interface. For example, I added visuals to the instructions page to make it more user friendly. Additionally, I added a piece of plywood inside the box as a counterweight to keep the box from moving when users press the buttons. Cramming everything together in one box was perhaps the most nerve-wracking part, since I was using double-sided tape and only had one shot at making it work. I planned it in my head many, many times before I actually committed to cutting out the cardboard shapes and placing my components without disconnecting anything. The most annoying issue I faced was perhaps something I could not control at all: my USB hub, which never gives me issues otherwise, does not work properly with the Arduino. Sometimes it simply refused to work, and I had to use other hubs until it would miraculously start working again.

Areas for Future Improvement:

There are several directions this project could be extended in. Firstly, I would love to integrate more advanced machine learning models, or perhaps one day learn to write one myself, to make the whole experience even smoother. With more accurate detection, you could support even more nuanced gestures, which would further improve the user experience and make the whole process feel faster.

Additionally, I would like to expand the project with more mathematical models for its educational context. By making the design modular, I could let users pick their desired mathematical concept, or other science-based concepts from biology or chemistry, and have their own visualization running in as little time as possible. They could assign their own parameters and have more flexibility with all the movements. A more advanced model would also help with multi-person demonstrations, where several people engage with a single visualization, either by observing it or by producing their own art. Of course, polishing the current code, improving the casing for the Arduino, and adding more sensors are all viable avenues as well. It would be cool to add vibration motors so users could also feel the movement in real time, making the experience much more tactile and intuitive.

In the end, I am very happy with how my project turned out. Despite facing numerous challenges, I had a lot of fun overall and would love to come up with more creative projects like this in the future. I hope you liked it too!

 

Final Project Update and Progress – Marcos Hernández

Changes done to the project:

From the two suggested ideas, I decided to work on the first one, since it was the most realistic option. The original idea consisted of creating a platformer that the player would control with the traditional directional movement buttons, plus a photoresistor that would move the scenario according to how much light it received. However, since I have already made a platformer, and because of time constraints, I had to modify this idea.

The idea still has the directional movement and the photoresistor to interact with the game in different ways, but there are now significant changes:

  • It is now played from a top-down perspective instead of a traditional 2-D side view.
  • It will feature enemies in the form of ghosts.
  • Time will serve as a metric for progression.
  • It now has a flashlight mechanic that reveals the current position of enemies, at the cost of an initial spike in battery consumption and a faster battery drain while it is on. The remaining battery for this mechanic is displayed via LEDs on a scale of Green > Green > Yellow > Red; if the red LED dims, the player loses.
  • It has a “flashlight recharge station” which recharges the battery.
  • The enemies (ghosts) can be detected via sound from the piezo speaker according to their current location and distance; in other words, the piezo speaker changes its sound depending on how near or far they are.
  • Random outcomes in every match, due to the varying locations of the player, enemies, and battery charges.
  • The photoresistor now determines how fast the battery charges. I am still going to allow external factors to interact with it to give the player a certain degree of freedom: for example, the normal speed at which the battery charges at a station is 1x, but if a smartphone flashlight shines on the photoresistor, it can increase up to 4x (see the sketch right after this list).
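A minimal sketch of that recharge logic in p5.js terms (names like lightValue, battery, and onChargeStation are placeholders, not the actual game code):

let battery = 100;    // hypothetical battery level, 0-100
let lightValue = 300; // hypothetical photoresistor reading received over serial (0-1023)

function updateBattery(onChargeStation) {
  if (onChargeStation) {
    // Brighter light on the photoresistor charges faster, from 1x up to 4x.
    let chargeRate = map(lightValue, 0, 1023, 1, 4);
    battery = min(100, battery + 0.1 * chargeRate);
  } else {
    battery = max(0, battery - 0.05); // passive drain while exploring
  }
}

function batteryLedStage() {
  // Which of the four LEDs (Green > Green > Yellow > Red) should be lit.
  if (battery > 75) return "green-1";
  if (battery > 50) return "green-2";
  if (battery > 25) return "yellow";
  return "red"; // when this one dims, the player loses
}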

All of this has been implemented at the moment, but it still needs bug fixes, further improvements, and user testing.

Current progress of the game:

You can move with the ESDF keys, and you can use backspace to turn the lights ON or OFF. Keep in mind that because an Arduino is needed, the sound used to locate the enemies will not play, and the external control over how fast the battery recharges will not be available either.
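A minimal p5.js sketch of that control scheme (the player object and speed are hypothetical placeholders): ESDF behaves like WASD shifted one key to the right, and BACKSPACE toggles the flashlight.

let player = { x: 200, y: 200, speed: 3 }; // hypothetical player object
let lightsOn = false;

function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(0);
  // ESDF movement: E = up, D = down, S = left, F = right
  if (keyIsDown(69)) player.y -= player.speed; // E
  if (keyIsDown(68)) player.y += player.speed; // D
  if (keyIsDown(83)) player.x -= player.speed; // S
  if (keyIsDown(70)) player.x += player.speed; // F
  fill(255);
  circle(player.x, player.y, 20);
}

function keyPressed() {
  if (keyCode === BACKSPACE) {
    lightsOn = !lightsOn; // toggle the flashlight
  }
}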

Current Arduino Progress

My current Arduino looks something like this at the moment:

I still need to implement the design and the movement buttons, but since I was mostly worried about the game mechanics and logic first, the design is something I know will be straightforward to implement, although definitely still time-consuming.

 

Design meets Disability | Creative Response | Week 12

This week’s reading talks about how design is important, and sets trends, even in the medical field. The comparison between different approaches to design in the case of eyewear, prosthetics, hearing aids, etc. was very interesting to me. I wondered: is it just marketing that causes these differences, or is it something at a much deeper level? The heading “good design on any terms” compelled me to think about why this phrase was worth a thought. The example of Charles and Ray Eames making a leg splint that was ‘designed’ well illustrated this concept.

The discussion on fashion versus discretion was intriguing too. The author shows how these two are not necessarily mutually exclusive, as in the case of eyewear. He talks about a balance between simplicity and overly complex/colorful/designed things. One example he gave that stuck in my mind is that of AirPods, along with the quote “if I had more time, I would have written a shorter letter,” which means that just because something is simple doesn't mean a lot of thought was not required to make it. In fact, genius can be found in simplicity.

I hope to embrace these concepts in my final project and design something user-friendly and simple.

Reading Response 9 (Week 11)

“Design meets Disability” was an eye-opening exploration of the complex interplay between design, disability, and dignity. The notion that assistive devices can be both functional and stylish is a powerful one, breaking away from the traditional paradigm that prioritized hiding disabilities rather than celebrating them. As someone passionate about disability activism, it has been my personal mission to make society more accessible for people with disabilities. This reading thus prompted me to reconsider the traditional approaches to designing assistive technologies and encouraged me to explore more inclusive and culturally relevant design solutions. It also helped me ideate my final IM project, a sign language glove, particularly through its emphasis on simplicity and cognitive accessibility, reminding me of the importance of intuitive design in facilitating meaningful interactions and experiences for users with disabilities.

 

The discussion on the evolution of eyewear from medical necessity to fashion accessory was also intriguing, illustrating how incorporating fashion culture into design can lead to more positive perceptions of disability. I have used spectacles since the age of 5. I currently have high myopia (-11! yeah, I know…). But I have never been insecure about wearing glasses, nor have I ever considered laser eye surgery. Now that I look back, one of the reasons is that I enjoyed selecting new spectacles to wear every few years. I knew myopia was not something I could hide or change, so I coped with it by making it my style. Similarly, embracing fashion culture and aesthetics can transform the perception of other assistive devices, empowering individuals with disabilities to embrace their uniqueness rather than hide it. Overall, this reading reinforced my commitment to creating inclusive and empowering solutions that celebrate diversity and promote social inclusion.

Assignment 12: Code – In class exercises

Exercise 1

For this exercise, we used a photosensor to control the x position of an ellipse in p5. The more light the photosensor reads, the further right the ellipse’s position is.

Demo:

https://intro.nyuadim.com/wp-content/uploads/2024/04/Exercise-1-Video.mov

Codes:

p5 –

let circlePosition = 100;
function setup() {
  createCanvas(400, 400);
  textSize(18);
  fill(243,213,205);
  text("Press the space bar to connect port", width / 2, 60);
}
function draw() {
  background(176, 16, 50);
  textSize(18);
  fill(243,213,205);
  text("Press the space bar to connect port", 60, 60);
  ellipse(circlePosition, height / 2, 70, 70);
}
// Function to set up the serial connection
function keyPressed() {
  if (key == " ") {
    setUpSerial();     
  }
}
// Function to read the data from serial port
function readSerial(data) {
  // Check if data received is not null
  if (data != null) {
    // Mapping the data to 0-400 and sets it as X position of the circle
    circlePosition = map(int(data), 0, 1023, 0, 400);
  }
}

Arduino –

void setup() {
  Serial.begin(9600); // Begin serial communication 
}
void loop() {
  int sensor = analogRead(A0);
  delay(5);
  Serial.println(sensor);
 
}

Circuit:

Exercise 2

For this exercise, we created a gradient on p5 that goes from green to black along the y-axis. When the mouse is at the highest point of the canvas, meaning at the greenest point, the LED is at its brightest. The further the mouse moves down towards the black, the dimmer the LED gets.

Demo:

https://intro.nyuadim.com/wp-content/uploads/2024/04/IMG_2972.mov

Codes:

p5 –

function setup() {
  createCanvas(400, 400);
  textSize(20);
}

function draw() {
  // Gradient background from green to black
  setGradient(0, 0, width, height, color(0, 255, 0), color(0));



  if (!serialActive) {
    fill(255);
    text("Press space bar to select port", 60, 60);
  }

  // Change brightness of LED based on mouse position (top of canvas = brightest)
  let brightness = map(mouseY, 0, height, 255, 0);

  // Send the brightness value to Arduino
  if (serialActive) {
    let sendToArduino = brightness + "\n";
    writeSerial(sendToArduino);
  }
}

// Function to draw a gradient background
function setGradient(x, y, w, h, c1, c2) {
  noFill();
  for (let i = y; i <= y + h; i++) {
    let inter = map(i, y, y + h, 0, 1);
    let c = lerpColor(c1, c2, inter);
    stroke(c);
    line(x, i, x + w, i);
  }
}

// Function to begin serial connection
function keyPressed() {
  if (key == " ") {
    setUpSerial();
  }
}

function readSerial(data) {
  if (data != null) {
    serialActive = true;
  }
}

Arduino –

int ledPin = 9; 
int brightness = 0; 

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(ledPin, OUTPUT);
  digitalWrite(ledPin, LOW); // Starts with LED off
  while (!Serial) { // Wait for serial connection 
    delay(500);
  }
  Serial.println("Arduino initialized"); 
}

void loop() {
  if (Serial.available() > 0) {
    brightness = Serial.parseInt(); // Read brightness value
    analogWrite(ledPin, brightness);
    Serial.read(); // consume the trailing newline
  }
  digitalWrite(LED_BUILTIN, LOW); // LED off when there is no data
}

Circuit:

Exercise 3

For this exercise, we established bidirectional communication between the Arduino and the p5 sketch, enabling real-time interaction between physical input (potentiometer) and visual output (LED indication) synchronized with the simulation on p5.

The Arduino code reads data from a potentiometer connected to an analog pin and sends it to the p5.js sketch. It also receives position data from the p5.js sketch via serial communication and controls an LED connected to pin 9 accordingly. The setup() function initializes serial communication and sets pin modes for the LED, potentiometer, and built-in LED. It initiates a handshake with the p5.js sketch by sending a starting message until it receives data over the serial port.

Demo:

Codes:

p5 –

let velocity;
let gravity;
let position;
let acceleration;
let breeze;
let drag = 0.99;
let mass = 50;
let heightOfBall = 0;
function setup() {
  createCanvas(640, 360);
 
  noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  breeze = createVector(0,0); 
}
function draw() {
  background(215);
  fill(0);
  
  if (!serialActive) {
    text("Press space bar to connect Arduino", 50, 60);
  }
  else 
  {
  
  applyForce(breeze); 
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x,position.y,mass,mass);
    
  if (position.y > height-mass/2) {
      velocity.y *= -0.9;  
    
      position.y = height-mass/2;
    
    heightOfBall = 0;
    
    } 
    else {
      heightOfBall = 1;
    }
  }
}
function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}
function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }   
  else if (key=='b'){
    mass=random(15,80);
    position.y=-mass;
    velocity.mult(0);
  }
}
// this callback function
function readSerial(data) {
    ////////////////////////////////////
    //READ FROM ARDUINO HERE
    ////////////////////////////////////
  
     if (data != null) {
    // make sure there is actually a message
    
    let fromArduino = split(trim(data), ",");
    
       // if the right length, then proceed
    if (fromArduino.length == 1) {
//sensor value is the input from potentiometer
      let sensorVal = int(fromArduino[0]);
      
//potentiometer value ranges from 0 - 1023
//for values less than 400,wind blows to right
      if (sensorVal < 400){
        breeze.x=1
      }
//if value between 400 and 500, wind stops so ball stops
      else if(sensorVal >= 400 && sensorVal < 500){
        breeze.x = 0
      }
//if value greater than 500, wind blows to left
      else {
        breeze.x = -1
      }
          //////////////////////////////////
          //SEND TO ARDUINO HERE (handshake)
          //////////////////////////////////
    }
//height of ball sent to arduino to check if ball on floor or not
    let sendToArduino = heightOfBall  + "\n";
    writeSerial(sendToArduino);
  }
}

Arduino –

const int poten_pin = A5;
const int ledPin = 9;
void setup() {
 Serial.begin(9600); // Start serial communication at 9600 bps
 pinMode(LED_BUILTIN, OUTPUT);
 pinMode(ledPin, OUTPUT);
 pinMode(poten_pin, INPUT);
 // start the handshake
 while (Serial.available() <= 0) {
   digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
   Serial.println("0,0"); // send a starting message
   delay(300);            // wait 1/3 second
   digitalWrite(LED_BUILTIN, LOW);
   delay(50);
 }
}
void loop()
{
 // wait for data from p5 before doing something
   while (Serial.available())
   {
     digitalWrite(LED_BUILTIN, HIGH);
     digitalWrite(ledPin, LOW);
//read the position of ball from p5
     int position = Serial.parseInt();
  
     if (Serial.read() == '\n') {
       // Read potentiometer value
     int sensorValue = analogRead(poten_pin);
     //send value to p5
     Serial.println(sensorValue);
     }
//if ball is touching the ground i.e. height is zero, turn LED on
     if (position == 0)
     {
       digitalWrite(ledPin, HIGH);
     }
     else{
       digitalWrite(ledPin, LOW);
     }
   }
     digitalWrite(LED_BUILTIN, LOW);
   }

Circuit:

Assignment #12 – Code – ☆In-Class Exercises☆

Exercise 1

For this exercise, we used a photosensor to control the x position of an ellipse in p5. The more light the photosensor reads, the further right the ellipse’s position is.

Demo:

Exercise 1 Video

Codes:

p5 –

let circlePosition = 100;
function setup() {
  createCanvas(400, 400);
  textSize(18);
  fill(243,213,205);
  text("Press the space bar to connect port", width / 2, 60);
}
function draw() {
  background(176, 16, 50);
  textSize(18);
  fill(243,213,205);
  text("Press the space bar to connect port", 60, 60);
  ellipse(circlePosition, height / 2, 70, 70);
}
// Function to set up the serial connection
function keyPressed() {
  if (key == " ") {
    setUpSerial();     
  }
}
// Function to read the data from serial port
function readSerial(data) {
  // Check if data received is not null
  if (data != null) {
    // Mapping the data to 0-400 and sets it as X position of the circle
    circlePosition = map(int(data), 0, 1023, 0, 400);
  }
}

Arduino –

void setup() {
  Serial.begin(9600); // Begin serial communication 
}
void loop() {
  int sensor = analogRead(A0);
  delay(5);
  Serial.println(sensor);
 
}

Circuit:

Exercise 2

For this exercise, we created a gradient on p5 that goes from green to black along the y-axis. When the mouse is on the highest point of the canvas, meaning at the greenest point, the LED is the brightest. The further the mouse goes down towards the black, the darker it gets.

Demo:

IMG_2972

Codes:

p5 –

function setup() {
  createCanvas(400, 400);
  textSize(20);
}

function draw() {
  // Gradient background from green to black
  setGradient(0, 0, width, height, color(0, 255, 0), color(0));



  if (!serialActive) {
    fill(255);
    text("Press space bar to select port", 60, 60);
  } else {
  }

  // Change brightness of LED based on mouse position
  let brightness = map(mouseY, 0, width, 255, 0);

  // Send the brightness value to Arduino
  if (serialActive) {
    let sendToArduino = brightness + "\n";
    writeSerial(sendToArduino);
  }
}

// Function to draw a gradient background
function setGradient(x, y, w, h, c1, c2) {
  noFill();
  for (let i = y; i <= y + h; i++) {
    let inter = map(i, y, y + h, 0, 1);
    let c = lerpColor(c1, c2, inter);
    stroke(c);
    line(x, i, x + w, i);
  }
}

// Function to begin serial connection
function keyPressed() {
  if (key == " ") {
    setUpSerial();
  }
}

function readSerial(data) {
  if (data != null) {
    serialActive = true;
  }
}

Arduino –

int ledPin = 9; 
int brightness = 0; 

void setup() {
  Serial.begin(9600);
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(ledPin, OUTPUT);
  digitalWrite(ledPin, LOW); // Starts with LED off
  while (!Serial) { // Wait for serial connection 
    delay(500);
  }
  Serial.println("Arduino initialized"); 
}

void loop() {
  if (Serial.available() > 0) {
    brightness = Serial.parseInt(); // Read brightness value
    analogWrite(ledPin, brightness);
    Serial.read(); 
  }
  digitalWrite(LED_BUILTIN, LOW); // LED off when there is no data
}

Circuit:

 

Exercise 3

For this exercise, we established bidirectional communication between the Arduino and the p5 sketch, enabling real-time interaction between physical input (potentiometer) and visual output (LED indication) synchronized with the simulation on p5.

The Arduino code reads data from a potentiometer connected to analog pin and sends it to the p5.js sketch. It also receives position data from the p5.js sketch via serial communication and controls an LED connected to pin 9 accordingly. The setup() function initializes serial communication and sets pin modes for the LED, potentiometer, and built-in LED. It initiates a handshake with the p5.js sketch by sending a starting message until it receives data from the serial port.

Demo:

Codes:

p5 –

let velocity;
let gravity;
let position;
let acceleration;
let breeze;
let drag = 0.99;
let mass = 50;
let heightOfBall = 0;
function setup() {
  createCanvas(640, 360);
 
  noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  breeze = createVector(0,0); 
}
function draw() {
  background(215);
  fill(0);
  
  if (!serialActive) {
    text("Press space bar to connect Arduino", 50, 60);
  }
  else 
  {
  
  applyForce(breeze); 
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x,position.y,mass,mass);
    
  if (position.y > height-mass/2) {
      velocity.y *= -0.9;  
    
      position.y = height-mass/2;
    
    heightOfBall = 0;
    
    } 
    else {
      heightOfBall = 1;
    }
  }
}
function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}
function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }   
  else if (key=='b'){
    mass=random(15,80);
    position.y=-mass;
    velocity.mult(0);
  }
}
// This callback function is called by the web-serial library
// with each new line of data from the Arduino
function readSerial(data) {
  ////////////////////////////////////
  // READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    let fromArduino = split(trim(data), ",");

    // if the right length, then proceed
    if (fromArduino.length == 1) {
      // sensorVal is the input from the potentiometer (0 - 1023)
      let sensorVal = int(fromArduino[0]);

      if (sensorVal < 400) {
        breeze.x = 1; // wind blows to the right
      } else if (sensorVal >= 400 && sensorVal < 500) {
        breeze.x = 0; // wind stops, so the ball settles
      } else {
        breeze.x = -1; // wind blows to the left
      }
    }

    //////////////////////////////////
    // SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    // heightOfBall tells the Arduino whether the ball is on the floor (0) or not (1)
    let sendToArduino = heightOfBall + "\n";
    writeSerial(sendToArduino);
  }
}

Arduino –

const int poten_pin = A5;
const int ledPin = 9;
void setup() {
  Serial.begin(9600); // Start serial communication at 9600 bps
  pinMode(LED_BUILTIN, OUTPUT);
  pinMode(ledPin, OUTPUT);
  pinMode(poten_pin, INPUT);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}
void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH);
    digitalWrite(ledPin, LOW);

    // read the height flag of the ball from p5 (0 = on the floor, 1 = in the air)
    int position = Serial.parseInt();

    if (Serial.read() == '\n') {
      // read the potentiometer value and send it back to p5
      int sensorValue = analogRead(poten_pin);
      Serial.println(sensorValue);
    }

    // if the ball is touching the ground (height flag is zero), turn the LED on
    if (position == 0) {
      digitalWrite(ledPin, HIGH);
    } else {
      digitalWrite(ledPin, LOW);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}

Circuit:

Week 12: Design meets disability

“Design Meets Disability” by Graham Pullin provides a thought-provoking exploration of the potential symbiotic relationship between design and disability. Pullin challenges the traditional separation between assistive devices and mainstream design by questioning why certain products, such as eyeglasses, have undergone a transformation into fashionable items while others, like hearing aids or prosthetic limbs, remain largely utilitarian. This critique exposes a broader issue within design culture—the tendency to prioritize aesthetics and marketability over functionality and inclusivity.

However, while Pullin’s argument for integrating disability considerations into mainstream design is compelling, it also raises questions about the motivations behind such integration. Is the goal to genuinely enhance the lives of disabled individuals by providing them with more aesthetically pleasing and user-friendly products, or is it driven primarily by profit and market trends? Furthermore, there is a risk of superficiality in simply “fashionizing” assistive devices without addressing deeper systemic issues such as accessibility, affordability, and social stigma. Hence, while Pullin’s exploration of the intersection of design and disability is interesting, it also invites critical reflection on the broader societal implications of such integration.

Week 12 Assignment – Jana and Rashed

Exercise 1: 

P5 code:

let rVal = 0;
let alpha = 255;
let left = 0; // True (1) if mouse is being clicked on left side of screen
let right = 0; // True (1) if mouse is being clicked on right side of screen

function setup() {
  createCanvas(640, 480);
  textSize(18);
}

function draw() {
  background(255);
  fill(255, 0, 0);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
    // Print the current values
    text('rVal = ' + str(rVal), 20, 50);
    text('alpha = ' + str(alpha), 20, 70);
  }

  // clicking on the ellipse sets "right" to 1, which lights up the LED on the Arduino
  if (mouseIsPressed) {
    if (mouseX > rVal - 50 && mouseX < rVal + 50 && mouseY > height / 2 - 50 && mouseY < height / 2 + 50) {
      right = 1;
    }
  } else {
    right = 0;
  }

  // the potentiometer value (rVal) controls the horizontal position of the ellipse
  ellipse(rVal, height / 2, 50, 50);
}

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

// This function will be called by the web-serial library
// with each new line of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 2) {
      // only store values here
      // do everything with those values in the main draw loop
      
      // We take the string we get from Arduino and explicitly
      // convert it to a number by using int()
      // e.g. "103" becomes 103
      rVal = int(fromArduino[0]);
      alpha = int(fromArduino[1]);
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = left + "," + right + "\n";
    writeSerial(sendToArduino);
  }
}

We did not face any big issues in this part; we just added an ellipse and set its position to (rVal, height/2), so the potentiometer value coming from the Arduino moves it horizontally.
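One small refinement we noticed afterwards (not part of the submitted code): rVal comes straight from the potentiometer, so it ranges from 0 to 1023 while the canvas is only 640 pixels wide, which means the ellipse drifts off screen for high readings. A one-line tweak using map() would keep it visible:

// hypothetical tweak: remap the raw potentiometer reading (0 - 1023) to the canvas width
ellipse(map(rVal, 0, 1023, 0, width), height / 2, 50, 50);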

Video:

Exercise 2: 

P5 code:

let rVal = 0;
let alpha = 255;
let left = 0; // True (1) if mouse is being clicked on left side of screen
let right = 0; // True (1) if mouse is being clicked on right side of screen

function setup() {
  createCanvas(640, 480);
  textSize(18);
}

function draw() {
  // one value from Arduino controls the background's red channel
  background(map(rVal, 0, 1023, 0, 255), 255, 200);

  // the other value controls the text's transparency value
  fill(255, 0, 255, map(alpha, 0, 1023, 0, 255));

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
    // Print the current values
    text('rVal = ' + str(rVal), 20, 50);
    text('alpha = ' + str(alpha), 20, 70);
  }
}

function keyPressed() {
  if (key == " ") {
    setUpSerial(); // start the serial connection
  } else if (keyCode == DOWN_ARROW) {
    right = max(right - 20, 0);   // dim the LED in steps of 20
  } else if (keyCode == UP_ARROW) {
    right = min(right + 20, 250); // brighten the LED in steps of 20
  }
}

// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 2) {
      // only store values here
      // do everything with those values in the main draw loop
      
      // We take the string we get from Arduino and explicitly
      // convert it to a number by using int()
      // e.g. "103" becomes 103
      rVal = int(fromArduino[0]);
      alpha = int(fromArduino[1]);
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = left + "," + right + "\n";
    writeSerial(sendToArduino);
  }
}

Arduino code for Exercise 1 and 2:

int leftLedPin = 2;
int rightLedPin = 5;

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  // We can't use the serial monitor since the serial connection is
  // used to communicate to p5js and only one application on the computer
  // can use a serial port at once.
  pinMode(LED_BUILTIN, OUTPUT);

  // Outputs on these pins
  pinMode(leftLedPin, OUTPUT);
  pinMode(rightLedPin, OUTPUT);

  // Blink them so we can check the wiring
  digitalWrite(leftLedPin, HIGH);
  digitalWrite(rightLedPin, HIGH);
  delay(200);
  digitalWrite(leftLedPin, LOW);
  digitalWrite(rightLedPin, LOW);



  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int left = Serial.parseInt();
    int right = Serial.parseInt();
    if (Serial.read() == '\n') {
      digitalWrite(leftLedPin, left);
      digitalWrite(rightLedPin, right);
      int sensor = analogRead(A0);
      delay(5);
      int sensor2 = analogRead(A1);
      delay(5);
      Serial.print(sensor);
      Serial.print(',');
      Serial.println(sensor2);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}

Challenges:

We tried to get the LED to decrease its brightness every time the person clicked on the left side of the screen and to increase it whenever they clicked on the right side. The p5 code looked correct, but for some reason the brightness did not change, so the problem might be with the bulb, the wiring, or the Arduino itself. However, the LED does take around four presses on either side to fully turn off or on, so we would consider that a partial success.
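One possible culprit (a guess on our part, not something we verified): digitalWrite() only drives a pin fully HIGH or LOW, so passing an intermediate value like 120 will not dim the LED; analogWrite() on a PWM-capable pin (pin 5 is PWM on an Uno) is what turns the 0-250 value into an actual duty cycle. A minimal standalone test sketch, assuming the same wiring as above (LED on pin 5), that would tell us whether the LED and pin can dim at all:

// hypothetical standalone test of PWM dimming (pin 5, as in our wiring)
const int rightLedPin = 5;

void setup() {
  pinMode(rightLedPin, OUTPUT);
}

void loop() {
  // sweep brightness up and down to confirm the LED can actually dim
  for (int level = 0; level <= 250; level += 20) {
    analogWrite(rightLedPin, level);
    delay(100);
  }
  for (int level = 250; level >= 0; level -= 20) {
    analogWrite(rightLedPin, level);
    delay(100);
  }
}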

Video:

Exercise 3: 

P5 code:

let dragForce = 0.99;
let mass = 20;
let ledState = 0;
let velocity;
let gravity;
let position;
let acceleration;
let wind;
let force;
let bounced = false;


function setup() {
  createCanvas(640, 480);
  textSize(18);
  position = createVector(width / 2, 0);
  velocity = createVector(0, 0);
  acceleration = createVector(0, 0);
  gravity = createVector(0, 0.3 * mass);
  wind = createVector(0, 0);
}

function draw() {

  background(255);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);

    noStroke();
    // apply wind and gravity (a = F / m)
    force = p5.Vector.div(wind, mass);
    acceleration.add(force);
    force = p5.Vector.div(gravity, mass);
    acceleration.add(force);
    velocity.add(acceleration);
    velocity.mult(dragForce);
    position.add(velocity);
    acceleration.mult(0);

    ellipse(position.x, position.y, mass, mass);

    if (position.y > (height - mass / 2) - 30) {
      // bounce off the floor with some energy loss
      velocity.y *= -0.9;
      position.y = (height - mass / 2) - 30;
      ledState = 1; // tell the Arduino the ball is touching the floor

      // alternate the ball's colour on every bounce
      if (!bounced) {
        fill('blue');
        bounced = true;
      } else {
        fill('red');
        bounced = false;
      }
    } else {
      ledState = 0;
    }
  }

  // draw the floor
  rect(0, height - 30, width, 30);

}

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}
function readSerial(data) {
  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 1) {
      let windCurrent = int(fromArduino[0]);
      wind.x = map(windCurrent, 0, 1023, -1, 1);
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = ledState + "\n";
    writeSerial(sendToArduino);
  }
}

Arduino code:

int ledPin = 10;
int potPin = A0;

void setup() {
  Serial.begin(9600);

  pinMode(LED_BUILTIN, OUTPUT);

  // Outputs on these pins
  pinMode(ledPin, OUTPUT);

  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0"); // send a starting message
    delay(300);            // wait 1/3 second
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int right = Serial.parseInt();
    if (Serial.read() == '\n') {
      digitalWrite(ledPin, right);
      int potValue = analogRead(potPin);
      delay(5);
      Serial.println(potValue);
    }
  }
}

Challenges:

Everything went smoothly when writing the code. However, when we clicked the play button to run the sketch, it gave us the error: “setUpArduino is not defined”. We spent countless minutes searching for something that was not there. After looking at the professor's Week 12 example on serial connection, we realized that we had forgotten the tab that connects p5 to the Arduino. We added it and everything worked.
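For anyone hitting the same error: the functions setUpSerial() and writeSerial() and the serialActive flag used above all come from the class web-serial helper tab, so the sketch breaks the moment it references them if that tab is missing. A quick, purely hypothetical guard that fails with a clearer message:

// hypothetical guard: warn early if the web-serial helper tab was not added to the sketch
if (typeof setUpSerial !== "function") {
  console.error("Serial helper tab missing - add the class web-serial tab before running this sketch.");
}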

Video:

very very cute :3

 

Final Concept Proposal Confirmation

Final Concept:

I am still going to use ML5 and find a way to combine machine learning with Arduino to develop an interactive art project. I did some testing with canvas painting, and while it worked, I found interactive art manipulation to be much more fun and unique. For now I am experimenting with sine forms and manipulating them with hand pose detection. I hope to add new gestures and features to enhance the interactivity of the art. For the Arduino side of things I have not fully decided yet, but I was thinking of using a distance sensor as a tracker for engagement and the rest of the buttons to change the user interface, the colors, or perhaps navigate between different mathematical models.
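As a starting point for that Arduino layer, here is a minimal sketch of what the sensor side could look like. Everything in it is an assumption for illustration only: an HC-SR04-style ultrasonic sensor on pins 7 (trigger) and 8 (echo), three buttons on pins 2 to 4, and one comma-separated line sent to p5 per loop.

// hypothetical sensor sketch: distance + three buttons, sent as "distance,b1,b2,b3\n"
const int trigPin = 7;   // assumed wiring
const int echoPin = 8;
const int buttonPins[3] = {2, 3, 4};

void setup() {
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  for (int i = 0; i < 3; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP); // buttons wired to ground
  }
}

void loop() {
  // trigger a 10 microsecond pulse and time the echo
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH, 30000); // timeout so the loop never stalls
  long distanceCm = duration / 58;               // rough HC-SR04 conversion to centimeters

  Serial.print(distanceCm);
  for (int i = 0; i < 3; i++) {
    Serial.print(',');
    Serial.print(digitalRead(buttonPins[i]) == LOW ? 1 : 0); // 1 = pressed
  }
  Serial.println();
  delay(50);
}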