Week 12 Production Hyein Kim & Hayeon Jeong

Partner: Hayeon Jeong

EXERCISE 01: ARDUINO TO P5 COMMUNICATION

Make something that uses only one sensor on arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, and nothing on arduino is controlled by p5.

To complete this exercise, we built a simple circuit as illustrated below. We used the photocell to control the horizontal movement of the ball in p5. 

As illustrated in the Arduino code below, most of it is similar to the template provided in class. Our changes are mostly inside loop(): since the Arduino is sending information to p5, we adjusted the code so that data flows unidirectionally, with the value read on A0 from the photocell sent to p5.

ARDUINO CODE

int leftLedPin = 2;
int rightLedPin = 5;


void setup() {
 // Start serial communication so we can send data
 // over the USB connection to our p5js sketch
 Serial.begin(9600);


 // We'll use the builtin LED as a status output.
 // We can't use the serial monitor since the serial connection is
 // used to communicate to p5js and only one application on the computer
 // can use a serial port at once.
 pinMode(LED_BUILTIN, OUTPUT);


 // Outputs on these pins
 pinMode(leftLedPin, OUTPUT);
 pinMode(rightLedPin, OUTPUT);


 // Blink them so we can check the wiring
 digitalWrite(leftLedPin, HIGH);
 digitalWrite(rightLedPin, HIGH);
 delay(200);
 digitalWrite(leftLedPin, LOW);
 digitalWrite(rightLedPin, LOW);



 // start the handshake
 while (Serial.available() <= 0) {
   digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
   Serial.println("0,0"); // send a starting message
   delay(300);            // wait 1/3 second
   digitalWrite(LED_BUILTIN, LOW);
   delay(50);
 }
}


void loop() {
 // read the photocell and send its value to p5 on every pass
 int sensor = analogRead(A0);
 delay(5);
 Serial.println(sensor);
}

On p5, we drew an ellipse and initialized a variable ‘xposition’ that sets the ball’s horizontal position. Then, under ‘if (data != null) {}’, we set xposition = data, where data refers to the value collected and sent from the Arduino; this moves the ball along the x-axis.

P5.JS CODE

let rVal = 0;
let alpha = 255;
let xposition = 100;


function setup() {
  createCanvas(640, 480);
  textSize(18);
}


function draw() {
  // rVal is a leftover from the class template; it stays 0 here,
  // so the background color does not change
  background(map(rVal, 0, 1023, 0, 255), 255, 200);


  ellipse(xposition, height/2, 50, 50);
  
  // leftover from the template: alpha stays 255 in this exercise
  fill(255, 0, 255, map(alpha, 0, 1023, 0, 255));


  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
    // Print the current values
    text('rVal = ' + str(rVal), 20, 50);
    text('alpha = ' + str(alpha), 20, 70);
  }
}


function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}


// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    xposition = int(trim(data));
  }

  //////////////////////////////////
  //SEND TO ARDUINO HERE (handshake)
  //////////////////////////////////
  // Nothing on the Arduino is controlled by p5 in this exercise,
  // so we only send a newline to keep the handshake going.
  writeSerial("\n");
}
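One refinement worth noting: the photocell reading ranges from 0 to 1023, but the canvas is only 640 px wide, so high readings can push the ball off screen. A plain-JavaScript sketch of the rescaling that p5's map() would perform (mapRange here is our own stand-in, not part of p5):

```javascript
// Stand-in for p5's map(): linearly rescale value from [inMin, inMax] to [outMin, outMax].
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// A photocell reading of 1023 (full brightness) lands exactly at the right edge:
const xposition = mapRange(1023, 0, 1023, 0, 640);
```

Inside the sketch this would simply be `xposition = map(int(trim(data)), 0, 1023, 0, width);`.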

 

EXERCISE 02: P5 TO ARDUINO COMMUNICATION

Make something that controls the LED brightness from p5.

For this exercise, we utilized the position of mouseX from p5 to control the brightness of the LED on Arduino. The circuit we made looks like this: 

circuit 2

 

Here, Arduino receives information from p5 and changes the brightness of the LED. Under void loop(), we used the map function to convert the values to PWM range and used analogWrite to set the LED brightness according to the value received from p5. 

ARDUINO CODE

int LEDpin = 3;  // Ensure this pin supports PWM
void setup() {
 Serial.begin(9600);
 pinMode(LED_BUILTIN, OUTPUT);
 pinMode(LEDpin, OUTPUT);


 // Initial blink to confirm working setup
 digitalWrite(LEDpin, HIGH);
 delay(200);
 digitalWrite(LEDpin, LOW);


 // Wait for initial data before proceeding
 while (Serial.available() <= 0) {
   digitalWrite(LED_BUILTIN, HIGH);  // Blink LED to indicate waiting for connection
   Serial.println("0,0");  // Send a starting message
   delay(300);
   digitalWrite(LED_BUILTIN, LOW);
   delay(50);
 }
}


void loop() {
 if (Serial.available() > 0) {
   digitalWrite(LED_BUILTIN, HIGH);  // LED on while receiving data
   int bright = Serial.parseInt();
   bright = map(bright, 0, 640, 0, 255);  // Adjust received value to PWM range


   if (Serial.read() == '\n') {
     analogWrite(LEDpin, bright);  // Set LED brightness
   }
   digitalWrite(LED_BUILTIN, LOW);  // Turn off status LED after handling data
   Serial.println(0);
 }
}

In p5, we used the position of mouseX on the canvas to control the brightness of the LED on the Arduino. We initialized a variable LEDbright to 0 and updated it inside an if statement: when the mouse is inside the canvas (mouseX between 0 and width, mouseY between 0 and height), LEDbright is set to mouseX and sent to the Arduino, which sets the LED brightness accordingly; otherwise LEDbright stays 0. Simply put, the LED lights up brighter when the mouse is on the right side of the screen and dimmer when it is on the left.

P5.JS CODE

let LEDbright = 0;

function setup() {
  createCanvas(640, 480);
  textSize(18);
}

function draw() {
  
  background(205);

  // text color
  fill(0);

  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);

  }

  // LEDbright tracks mouseX while the cursor is inside the canvas;
  // moving the mouse right brightens the LED, moving it left dims it
  if (mouseX <= width && mouseX >= 0 && mouseY <= height && mouseY >= 0) {
      LEDbright = mouseX;
  } else {
    LEDbright = 0;
  }
}

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}

function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    
    // send the current LEDbright value (derived from mouseX) to control brightness
    let sendToArduino = LEDbright + "\n";
    writeSerial(sendToArduino);

}
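The Arduino side rescales the incoming 0–640 mouse position to the 0–255 PWM range. Since Arduino's map() does not clamp out-of-range input, a constrain() step is commonly paired with it; a plain-JavaScript sketch of the combined arithmetic (toPwm is our own illustrative helper, not part of either API):

```javascript
// Rescale a mouseX value (0-640) to the 8-bit PWM range (0-255), clamping stray values.
function toPwm(mouseX) {
  const clamped = Math.min(640, Math.max(0, mouseX)); // like Arduino's constrain()
  return Math.round((clamped / 640) * 255);           // like map(clamped, 0, 640, 0, 255)
}
```

On the Arduino this would read `bright = map(constrain(bright, 0, 640), 0, 640, 0, 255);`.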

EXERCISE 03: BI-DIRECTIONAL COMMUNICATION

Take the gravity wind example and make it so:

  • every time the ball bounces one led lights up and then turns off,
  • and you can control the wind from one analog sensor

To complete this assignment, we used a potentiometer as an input to control wind and an LED as an output controlled by the bouncing ball in p5.js.

P5.JS CODE

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;
let light = 0;

function setup() {
  createCanvas(640, 360);
  noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  wind = createVector(0,0);
}

function draw() {
  background(255);
  applyForce(wind);
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  fill('black')
  
  if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    ellipse(position.x, position.y, mass, mass);
    if (position.y > height - mass / 2) {
      velocity.y *= -0.9; // A little dampening when hitting the bottom
      position.y = height - mass / 2;
      light = 1;
    } else {
      light = 0;
    }
  }
}

function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}

function keyPressed() {
  if (key == " ") {
    // important to have in order to start the serial connection!!
    setUpSerial();
  }
}


function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 1) {
      // only store values here
      // do everything with those values in the main draw loop

      // We take the string we get from Arduino and explicitly
      // convert it to a number by using int()
      // e.g. "103" becomes 103
      let sensorValue = int(fromArduino[0]);
      wind.x = map(sensorValue, 0, 1023, -1, 1);
    }
  }

  //////////////////////////////////
  //SEND TO ARDUINO HERE (handshake)
  //////////////////////////////////
  let sendToArduino = light + "\n";
  writeSerial(sendToArduino);
}

ARDUINO CODE

int LEDpin = 2;
int wind = A0;

void setup() {
  // Start serial communication so we can send data
  // over the USB connection to our p5js sketch
  Serial.begin(9600);

  // We'll use the builtin LED as a status output.
  // We can't use the serial monitor since the serial connection is
  // used to communicate to p5js and only one application on the computer
  // can use a serial port at once.
  pinMode(LED_BUILTIN, OUTPUT);

  // Outputs on these pins
  pinMode(LEDpin, OUTPUT);
  pinMode(wind, INPUT);


  // start the handshake
  while (Serial.available() <= 0) {
    digitalWrite(LED_BUILTIN, HIGH); // on/blink while waiting for serial data
    Serial.println("0,0"); // send a starting message
    delay(300);
    digitalWrite(LED_BUILTIN, LOW);
    delay(50);
  }
}

void loop() {
  // wait for data from p5 before doing something
  while (Serial.available()) {
    digitalWrite(LED_BUILTIN, HIGH); // led on while receiving data

    int light = Serial.parseInt();
    if (Serial.read() == '\n') {
      digitalWrite(LEDpin, light);
      int sensor = analogRead(A0);
      delay(5);
      Serial.println(sensor);
    }
  }
  digitalWrite(LED_BUILTIN, LOW);
}

 

To briefly explain this code, every time the ball hits the ground, the variable ‘light’ is set to 1. If this is not the case, ‘light’ is set to 0.

if (!serialActive) {
    text("Press Space Bar to select Serial Port", 20, 30);
  } else {
    ellipse(position.x, position.y, mass, mass);
    if (position.y > height - mass / 2) {
      velocity.y *= -0.9; // A little dampening when hitting the bottom
      position.y = height - mass / 2;
      light = 1;
    } else {
      light = 0;
    }
  }
}

Then, p5.js sends the value of ‘light’ to the Arduino.

let sendToArduino = light + "\n";
    writeSerial(sendToArduino);

When the Arduino receives the light value from p5.js, the LED turns on if the value is 1 and turns off if it is 0.

int light = Serial.parseInt();
    if (Serial.read() == '\n') {
      digitalWrite(LEDpin, light);

To control the wind, p5.js splits each incoming line and, if it contains exactly one value, reads it as the potentiometer value. We then use the map function to convert the analog input from its 0 to 1023 range to a range of -1 to 1.

if (data != null) {
  // make sure there is actually a message
  // split the message
  let fromArduino = split(trim(data), ",");
  // if the right length, then proceed
  if (fromArduino.length == 1) {
    // only store values here
    // do everything with those values in the main draw loop

    // We take the string we get from Arduino and explicitly
    // convert it to a number by using int()
    // e.g. "103" becomes 103
    let sensorValue = int(fromArduino[0]);
    wind.x = map(sensorValue, 0, 1023, -1, 1);
  }
}
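The length check above guards against partial lines arriving over serial. A standalone sketch of the same parse-and-validate step, outside p5 (parseWind is a hypothetical helper, not from the class template):

```javascript
// Parse one serial line into a wind value in [-1, 1], or null if the line is malformed.
function parseWind(line) {
  const fields = line.trim().split(",");
  if (fields.length !== 1) return null;        // reject multi-field or partial lines
  const sensorValue = parseInt(fields[0], 10);
  if (Number.isNaN(sensorValue)) return null;  // reject non-numeric garbage
  return (sensorValue / 1023) * 2 - 1;         // same math as map(sensorValue, 0, 1023, -1, 1)
}
```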

Week 12: Design Meets Disability

One of the many things I appreciated about this week’s reading is how expansive and detailed its exploration of the relationship between design and disability is. Indeed, for a long time, designing for disability, as a discipline or practice, was brushed aside and accessibility was achieved merely by asking what augmentations need to be introduced to allow certain groups of people with disabilities the ability to use a platform or a product. The chapter begins by examining products built for those with disabilities and the role of fashion and artistic designers in normalizing and destigmatizing these products, effectively engulfing designs for those with disabilities into the wider culture of “mainstream” fashion. This latter inclusion affords people with disabilities both the luxury of choice and the long-awaited pleasure of finally being seen, both of which are things most people take for granted. We can see this sentiment reflected in the desire of athlete and model Aimee Mullins to find prosthetics that are “off-the-chart glamorous.” Incorporating artists in conversations about design for disability, which have been largely dominated by those with clinical and engineering backgrounds, is imperative in finding the sweet spot between functionality and aesthetics. I believe it is, however, important that those who are designing for disability are keen on involving the target user in their design process – the same way they would be if they were designing for any other target audience.

Interwoven within this discussion is an emphasis on language. Integrating such designs into fashion means integrating them into a larger system of cultural artefacts, one of which is language. I enjoyed the delineation of the linguistic evolution from “spectacles” to “eyewear”: “patients” become “wearers”, and the term “HearWear” is proposed in place of “hearing aids.” These are illustrations of how designers can actively contribute to changing cultural and societal outlooks, which is an essential prerequisite for actualizing inclusion.

One belief I held prior to reading this chapter was that designs had to be universally inclusive. After all, what is so difficult about ensuring that we have a multimodal interface that can accommodate all users? What I failed to realize is that additive complexity, as was pointed out by the article, would potentially create unintended inaccessibility. The idea of designing appliances and platforms that are effectively “flying submarines” is bound to leave us with an influx of products that are overly intricate, but with subpar performance in any singular task. It seems like what we need is a diversity of robust designs and products, for each type of user, that can perform their intended tasks optimally with elegant simplicity. We can only begin to reframe how we think of designing for disability – designing for inclusion – by inviting more than just engineers, whose aim is merely to achieve functionality and problem-solving, into the conversation.

Final Project Proposal

For my final project, I am creating a cooking game that engagingly explores Palestinian culture. The player prepares the dish “Mansaf” with the help of “Teta,” meaning grandma. The game will consist of several cooking tasks; the visuals and instructions will be in p5. I will use several sensors, such as a button, joystick, and motion sensor… to complete the cooking tasks. Through audio, visuals, and the hands-on game experience, I think the outcome will be very interesting. Of course, some challenges will involve p5 and Arduino communication, which is why I will address them early on.

Rama & Jihad Week 12

We connected the Arduino to the p5 sketch over the serial port. We did this by downloading a JavaScript app from the p5 serial library, which acts as an intermediary that allows p5 to access hardware via the serial port: available here. It has access to both the JavaScript browser context and the connected Arduino. The code below declares variables, handles serial communication events (connection, port listing, data reception, errors, and port opening and closing), and stores incoming data. To complete the connection we used the code below:

let serial;
let latestData = "waiting for data";
let ellipseX; // Variable to store the x-coordinate of the ellipse

function setup() {
  createCanvas(400, 400);

  serial = new p5.SerialPort();

  serial.list();
  serial.open("/dev/tty.usbmodem101");

  serial.on('connected', serverConnected);
  serial.on('list', gotList);
  serial.on('data', gotData);
  serial.on('error', gotError);
  serial.on('open', gotOpen);
  serial.on('close', gotClose);
}

function serverConnected() {
  print("Connected to Server");
}

function gotList(thelist) {
  print("List of Serial Ports:");
  for (let i = 0; i < thelist.length; i++) {
    print(i + " " + thelist[i]);
  }
}

function gotOpen() {
  print("Serial Port is Open");
}

function gotClose() {
  print("Serial Port is Closed");
  latestData = "Serial Port is Closed";
}

function gotError(theerror) {
  print(theerror);
}
function gotData() {
  let currentString = serial.readLine();
  currentString = trim(currentString); // trim() returns the trimmed string; it does not modify in place
  if (!currentString) return;          // ignore empty lines
  console.log(currentString);
  latestData = currentString;
}

This page explains it well.

Now to the assignment

Task 1: Use data to move ellipse horizontally.

The x-coordinate of the ellipse is specified by ellipseX, which is updated based on the sensor data in the gotData() function. This allows the ellipse to move horizontally across the canvas based on the sensor readings.

  // Update the x-coordinate of the ellipse based on the sensor data
  // (this line goes at the end of gotData, after latestData is set)
  ellipseX = map(parseInt(latestData), 0, 1023, 0, width);
}

function draw() {
  background(255);
  fill(0);
  // Draw the ellipse at the middle of the screen with dynamic x-coordinate
  ellipse(ellipseX, height/2, 50, 50);
}

Demo:

Task 2: Use p5 to change LED brightness.

let rVal = 0;
let alpha = 255;
let upArrow = 0;
let downArrow = 0;


function setup() {
  createCanvas(640, 480);
  textSize(18);

  // Replaced the space bar with an actual button for the serial connection
  const connectButton = createButton('Connect to Serial');
  connectButton.position(10, height + 30);
  connectButton.mousePressed(setUpSerial); // Call setUpSerial when the button is pressed
}

function draw() {
  background(map(rVal, 0, 1023, 0, 255), 255, 255);
  fill(255, 0, 255, map(alpha, 0, 1023, 0, 255));

  if (serialActive) {
    text("Connected", 20, 30);
    let message = upArrow + "," + downArrow + "\n";
    writeSerial(message);
  } else {
    text("Press 'Connect to Serial' button", 20, 30);
  }

  text('rVal = ' + str(rVal), 20, 50);
  text('alpha = ' + str(alpha), 20, 70);
}

// Decided on keyPressed/keyReleased as they are more straightforward

// Up Arrow Key turns the light on
function keyPressed() {
  if (keyCode === UP_ARROW) {
    upArrow = 1;
  } else if (keyCode === DOWN_ARROW) {
    downArrow = 1;
  }
}

// Releasing a key turns its flag back off
function keyReleased() {
  if (keyCode === UP_ARROW) {
    upArrow = 0;
  } else if (keyCode === DOWN_ARROW) {
    downArrow = 0;
  }
}


// This function will be called by the web-serial library
// with each new *line* of data. The serial library reads
// the data until the newline and then gives it to us through
// this callback function
function readSerial(data) {
  ////////////////////////////////////
  //READ FROM ARDUINO HERE
  ////////////////////////////////////

  if (data != null) {
    // make sure there is actually a message
    // split the message
    let fromArduino = split(trim(data), ",");
    // if the right length, then proceed
    if (fromArduino.length == 2) {
      // only store values here
      // do everything with those values in the main draw loop
     
      // We take the string we get from Arduino and explicitly
      // convert it to a number by using int()
      // e.g. "103" becomes 103
      rVal = int(fromArduino[0]);
      alpha = int(fromArduino[1]);
    }

    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = upArrow + "," + downArrow + "\n";
    writeSerial(sendToArduino);
  }
}

Demo:

Task 3:

Code Snippet:

if (!bounced) {
    bounced = true;  // Update bounce state
  } else {
    bounced = false; // Reset bounce state
  }
} else {
  ledState = 0;
}

Demo:

Week 12 Final Project Proposal

Calling all adventurers! Prepare to embark on a thrilling collaborative treasure hunt with this interactive game. You’ll navigate a top-down world in P5.js using joysticks on custom Arduino controllers.

The Arduino controller is the command center. One joystick translates your movement (up/down/left/right, forward/backward) into smooth exploration in P5.js. Another button activates the metal detector, triggering a response in the game world.

P5.js paints the picture. The user explores a beautifully crafted map, whether it’s a lush forest or a sandy desert. Your character, represented by a sprite or shape, moves based on your joystick actions. Scattered throughout the map are hidden treasures, initially invisible. Keep your eyes and ears peeled!

The metal detector serves as your trusty companion. A visual representation of the detector moves with your character, and it changes color, size, or plays a sound effect when you approach a hidden treasure. This is your hot zone cue! Get closer, and the treasure becomes visible on the map, ready for collection. Your success adds it to your inventory display.

The communication flow is straightforward. The Arduino continuously sends joystick movements to P5.js, and when you press the metal detector button, it sends a signal indicating activation. The character in P5.js moves according to the received movement data. In return, P5.js sends two types of data: the game state and a signal indicating whether a treasure is close. The game state will be displayed on the controller’s LCD, while LEDs on the controller will blink when a treasure signal is received.
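Since no wire format is fixed yet, here is one way the two p5-to-Arduino values could travel as a single comma-separated line (the field names and the encode/decode helpers are hypothetical, for illustration only):

```javascript
// Hypothetical encoder/decoder for the p5 -> Arduino message: "<gameState>,<treasureNear>\n".
function encodeMessage(gameState, treasureNear) {
  return gameState + "," + (treasureNear ? 1 : 0) + "\n";
}

function decodeMessage(line) {
  const [gameState, treasureNear] = line.trim().split(",");
  return { gameState, treasureNear: treasureNear === "1" };
}
```

On the Arduino side the same line could be read with Serial.readStringUntil('\n') and split on the comma.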

Furthermore, sound effects will be added for immersive exploration and treasure discoveries. I will also introduce a timer for a thrilling race against the clock, and consider different treasure types with varying rarities for a more dynamic hunt.

This single-player treasure hunt offers a compelling experience. You can tailor the world, difficulty, and treasure types to your preferences. With the core gameplay established, the possibilities are endless! So grab your controller, unleash your explorer spirit, and embark on a thrilling hunt for hidden treasures!

Controller Picture Demo:

Week 12 Reading Response

Design meets disability

As I stated in the previous reading response, one of the main priorities of design should be inclusivity. However, how we approach inclusivity also matters. The way eyeglasses have evolved from a medical tool into a fashion item clearly demonstrates the multiple forms inclusivity can take across social models. Ideally, a product’s design should not invite negative interpretations in any social model, but this is easier said than done. Any attempt to cover or camouflage the disability implies that it is something to be ashamed of, yet highlighting the disability can be equally counterproductive, as it risks reinforcing segregation.

So how exactly should we approach inclusivity? I believe the optimal way is to design the object so that it prioritizes functionality but also offers variety. As with glasses, regardless of the product in question, there should be different models or designs to choose from. This way, the consumer or user has the opportunity to select according to their liking. It also counters negative interpretations, since the reading of the design would then depend mainly on the user. Design, in this sense, should be flexible, so that it provides a sense of compatibility to the user, which I believe is the ultimate goal of design.

Final Project Proposal

Idea 1: Virtual Microbiology Lab

My first idea is to make a physically interactive version of my second assignment. So, the user would move a pen on top of a board. On the pen, there would be a button and a RGB LED mounted on opposite ends. The idea is to make the pen look like a pipette. If possible, I plan to have more than one pipette to represent different strains of bacteria. In that case, there might be one more sensor to identify which “pipette” is being picked up. This would link back to the p5 sketch to control the color of the bacteria.

Refurbished VWR Single Channel 100-1000ul | Maxim Pipette Service

The board itself would have rows of photoresistors mounted along the length and breadth, meant to detect the light from the LED. The information about the position (obtained from checking which photoresistors get the brightest measurement) will be used to map the bacterial colonies to the p5 sketch, which will then grow on the screen in a way very similar to my assignment.
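The position lookup described above is essentially an argmax over each row of photoresistor readings; a minimal sketch (brightestIndex is our own helper, and real readings would first need background-light calibration under the hood mentioned below):

```javascript
// Return the index of the photoresistor with the highest reading in one row.
function brightestIndex(readings) {
  let best = 0;
  for (let i = 1; i < readings.length; i++) {
    if (readings[i] > readings[best]) best = i;
  }
  return best;
}
```

Running this once per row and once per column yields the (x, y) grid cell of the pen's LED.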

There are many challenges I foresee. The primary one is getting my hands on that many photoresistors. I am also not sure whether photoresistors have the sensitivity required to distinguish light that is directly in line from light that is off at a small angle. I will also have to eliminate background light, although I have an idea for doing so: a hood under which the user will have to work. Finally, given the very limited time I have and my other commitments [mainly in the form of Capstone 🙁 ], it might be hard to implement.

Idea 2: Braille Translation

My second idea was to kind of make an English-to-Braille translator. On the p5 side, I plan to have the user type out a word. I have two ways of implementing this on the Arduino side:

(a) Using LEDs: In this case, the idea is to serve as a learning aid for Sighted people to learn Braille.

(b) Using Servos to raise pins: This is my preferred means of output, as it would be suitable for both Sighted and Blind people. It would be similar to an already existing device known as a refreshable Braille display.

This project was partially motivated by this project from a couple of years back, which used LEGO Mindstorms to make a Braille printer.

Shubham Banerjee, 13, works on his Lego robotics Braille printer.

While researching, I also found this handheld refreshable Braille display device made by MIT undergraduates, which serves as a pretty good indication of what the project might look like.

 

Final Project Proposal (Week 11)

I have three ideas for my final project, each emerging from my interdisciplinary interests.

Encouraging inclusivity with an interactive tool for learning the ASL alphabet — SARAH PAWLETT

The first idea I would like to propose is creating a Sign Language Glove, aimed at facilitating communication and improving accessibility for individuals who are deaf or hard of hearing. I shall limit it to fingerspelling using the alphabet for now. The glove will incorporate flex sensors on each finger to detect bending movements. Arduino will process this data and send the finger configurations to a p5.js sketch, which will interpret the gestures and recognize the corresponding letters of the alphabet.

The p5.js screen will display the recognized letters visually and read them aloud using text-to-speech. Additionally, users will have the option to type letters on a keyboard to display the corresponding sign on the screen. This interactive system enables individuals who use sign language to communicate effectively in both directions with non-signers.
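One way to turn the five flex-sensor readings into a letter is nearest-neighbour matching against recorded templates. A sketch under the assumption of one analog value per finger (matchLetter is our own helper, and the template values below are made up for illustration):

```javascript
// Match a 5-finger flex reading against stored templates by Euclidean distance.
function matchLetter(reading, templates) {
  let bestLetter = null;
  let bestDist = Infinity;
  for (const [letter, template] of Object.entries(templates)) {
    // Sum of squared per-finger differences, then square root
    const dist = Math.sqrt(
      template.reduce((sum, t, i) => sum + (t - reading[i]) ** 2, 0)
    );
    if (dist < bestDist) {
      bestDist = dist;
      bestLetter = letter;
    }
  }
  return bestLetter;
}
```

Templates would be captured in a short calibration phase, one recording per letter; letters that share finger positions (the ASL problem noted below) would collide under this scheme, which is exactly why a two-handed alphabet may help.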

North Star Teacher Resources Adhesive ASL Alphabet Desk Prompts, Pack of 36: Buy Online at Best Price in UAE - Amazon.ae

I initially thought of using American Sign Language (ASL), but the issue is a lot of the signs have the same finger positions, and it will be difficult to differentiate the signs. 

Indian sign language for numbers and alphabets. | Download Scientific Diagram

An alternative is using Indian Sign Language, which uses two hands and can overcome the above issue. However, this adds the complexity of checking ten finger configurations.

 

My second idea is a psychology experiment that uses p5.js for the visual presentation of stimuli and Arduino for participant response collection. I aim to design either a perception experiment, such as a visual search task, in which participants search for a target stimulus among distractors, or a cognition experiment, which may involve memory tasks, where participants memorize and recall sequences of stimuli, or face recognition tasks, where participants identify familiar faces. In these experiments, the p5.js sketch will display the visual stimuli, while Arduino buttons will serve as response inputs.

Visual search - Wikipedia

For example, in the visual search task, the p5.js screen will display each trial, and participants will use buttons connected to the Arduino board to indicate when they have found the target stimulus. The Arduino will record response times and accuracy.

At the end of the experiment session, participants will be able to view their performance metrics and compare them to group averages or previous trials. This setup allows for the seamless integration of psychological experimentation with interactive technology, facilitating data collection and analysis in a user-friendly manner.
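The performance metrics mentioned above reduce to simple aggregates over trials; a sketch, where the shape of each trial object (responseMs, correct) is an assumption of ours:

```javascript
// Summarize a list of trials: mean response time (ms) and fraction of correct responses.
function summarize(trials) {
  const meanRt =
    trials.reduce((sum, t) => sum + t.responseMs, 0) / trials.length;
  const accuracy =
    trials.filter((t) => t.correct).length / trials.length;
  return { meanRt, accuracy };
}
```

The same function applied to stored group data would give the group averages for comparison.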

 

For my third project idea, I propose creating an interactive system that generates music from art! The user will be able to draw on the p5.js canvas, creating their unique artwork. The system will then analyze this artwork pixel by pixel, extracting the RGB values of each pixel. These RGB values will be averaged to create a single value for each pixel, which will then be mapped to a musical note. Additionally, the system will detect sharp changes in color intensity between adjacent pixels, indicating transitions in the artwork. These transitions will determine the length of each note, with sharper changes resulting in shorter notes. The coordinates of each drawn point can influence the tempo or volume of the music, to make it sound better. Once the music composition is generated in p5.js, it will be sent to Arduino, where a piezo buzzer will play the music in real-time. This interactive system lets users create their own art and music. 
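The pixel-to-note and transition-to-duration mappings described above can be sketched as follows (the MIDI range 60 to 84, two octaves up from middle C, and the 100 to 500 ms duration range are assumptions for illustration):

```javascript
// Map one pixel's averaged RGB value (0-255) to a MIDI note number in [60, 84].
function pixelToNote(r, g, b) {
  const avg = (r + g + b) / 3;
  return 60 + Math.round((avg / 255) * 24);
}

// Sharper color changes between adjacent pixels -> shorter notes.
function noteDurationMs(prevAvg, currAvg) {
  const change = Math.abs(currAvg - prevAvg) / 255; // 0 (no change) .. 1 (max change)
  return Math.round(500 - 400 * change);            // 500 ms down to 100 ms
}
```

In p5.js the per-pixel averages would come from loadPixels() and the pixels array; the resulting (note, duration) pairs would then be streamed to the Arduino for the piezo buzzer.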

Week 11 Reading Response: A Brief Rant on the Future of Interaction Design

Both readings (basically a two-part reading) were a highly interesting look into future interaction systems, especially using the tactile capabilities of our fingertips to sense and manipulate objects and interaction systems.

Bret Victor’s examples on the range of motion and sensitivity of the human hands and fingers reminded me of a key fact from developmental psychology: the fact that the touch receptors of the fingers, lips, and tongues are the first senses to develop. This is why infants touch everything and put everything into their mouths; it is basically their way of seeing things.

The human fingertip in fact can resolve objects at a resolution of 0.4mm. That means, it can basically distinguish between two objects that are about half of a sharpened pencil tip apart.

Thus, with this level of capability of the human hands, one would be inclined to agree with Victor on the declaration that current systems of interaction are highly limited compared to the possibilities.

Other than that, many touchscreen technologies are unfortunately inaccessible for people with disabilities. Blind people, for example, require some kind of haptic or audio feedback from a touchscreen input that is usually never bundled with the hardware. In many cases, no sufficient option is provided in the default software, and special software needs to be downloaded… by first making one’s way through the device unaided. Older people and people with motor disabilities also often struggle with the finer inputs touchscreens require, again due to the lack of any haptic feedback.

Interaction systems in the future need to be designed for all. But first, we must break away from the status quo and boldly go where no man has gone before.

Week 11 Reading Response – Hands are our PAST

While I found the reading’s focus on human capabilities to be a very insightful perspective concerning interaction design, I found its focus on the use of hands to be both limiting and outdated.

In the past several decades, technologies have developed with the consideration of hands as human’s main ‘capability’. We type, swipe, scroll, hold and even move devices with our hands in order to generate a given output – this has become second nature to us.

However, I believe that the future of immersive and seamless human-centred design lies in moving beyond this. Other physical human inputs can be leveraged to improve interaction design, both in terms of immersion and ease of use.

An example of this being used to provide seamless experiences for users is the use of facial recognition to unlock smartphones. By taking advantage of the front camera’s natural position when a user picks up their device, designers have been able to eliminate the tedious action of typing in a passcode or scanning a thumbprint.

Conversely, full-body immersion has been utilised in game consoles such as the Xbox (via the Kinect sensor) and the Wii. In these cases, sensors revolutionised how players interact with games, effectively deconstructing the notion that playing video games is a lazy and inactive process. Despite the limited long-term success of these systems, their application of full-body immersion can serve as a reference for other interactive experiences such as installations and performances.