User Testing + Final Project

User Testing:

During user testing, the visuals didn’t clearly indicate what the project was for. The user mentioned that instructions on how to interact weren’t needed, but also that the purpose of the project wasn’t immediately clear.

 

For technical improvements I adjusted the interval averaging to 8 samples, which improved stability but slightly delayed responsiveness. I also tested different tempoMul ranges (originally 0.2–3) and settled on 0.5–2 to keep the music and visuals within a comfortable range.
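
To make the averaging concrete, here is a minimal sketch of the logic (the 8-sample window and the intervals array are from the project; names like onBeat and lastBeatTime are illustrative):

let intervals = [];    // rolling window of beat-to-beat intervals (ms)
let lastBeatTime = 0;  // time of the previous detected beat
let bpm = 0;

function onBeat() {
  const now = millis();
  if (lastBeatTime > 0) {
    intervals.push(now - lastBeatTime);
    if (intervals.length > 8) intervals.shift();  // keep only the last 8 samples
    const avg = intervals.reduce((a, b) => a + b, 0) / intervals.length;
    bpm = constrain(60000 / avg, 30, 180);        // ms per beat -> beats per minute
  }
  lastBeatTime = now;
}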

User Testing Video

For my final project, I developed an interactive audiovisual experience that transforms a user’s heart rate into dynamic music and visuals, creating a biofeedback-driven art piece. Built using p5.js, the project integrates Web Serial API to read pulse data from an Arduino-based heart rate sensor, generating a musical chord progression (Cmaj7-Am7-Fmaj7-G7) and WebGL visuals (swirling ellipses and a point field) that respond to the calculated BPM (beats per minute). Initially, I proposed a STEM-focused feature to educate users about heart rate, but I pivoted to make the music and visuals adjustable via mouse movements, allowing users to fine-tune the tempo and visual intensity interactively.

The heart rate sensor sends pulse data to p5.js, which calculates BPM to adjust the music’s tempo (chord changes, bass, kick drum) and visual animation speed. Mouse X position controls a tempo multiplier (tempoMul), scaling both music and visuals, while BPM directly influences animation speed and audio effects. The visuals were inspired by p5.js data visualization examples from class, particularly those using WebGL to create dynamic, responsive patterns. The project aims to create a meditative, immersive experience where users see and hear their heartbeat in real-time, with interactive control over the output.
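
As a rough sketch of that control mapping (the 0.5–2 range and the visualSpeed formula are from the project; the function and variable names are illustrative):

let tempoMul = 1;
let visualSpeed = 1;

function updateControls(bpm) {
  // Mouse X scales both music and visuals within the comfortable range
  tempoMul = map(mouseX, 0, width, 0.5, 2);
  // BPM drives animation speed, scaled by the same multiplier
  visualSpeed = (bpm / 60) * tempoMul;
}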

Hardware:
  • Heart Rate Sensor: I used a PulseSensor connected to an Arduino Uno, wired directly to analog pin A0 without a resistor to simplify the circuit. The sensor is not strapped down and can be freely interacted with.
  • Fabrication: I fit the Arduino cable and heart rate sensor through the cardboard SparkFun box. I avoided a finger strap due to wire fragility and inconsistent pressure, opting for loose finger placement instead.
  • Challenges: Direct wiring without a resistor may have increased noise in the pulse signal, requiring software filtering in the Arduino code. Loose finger contact sometimes caused erratic readings, so I adjusted the threshold and added a refractory period to stabilize detection.

The p5.js sketch reads serial data from the Arduino, calculates BPM, and updates music and visuals. Initially, I tried processing raw analog values in p5.js, but noise made it unreliable. After extensive debugging (around 10 hours), I modified the Arduino code to send pre-processed BPM estimates as integers (30–180 range), which streamlined p5.js logic. The mouse-driven tempoMul (mapped from mouse X) scales chord timing, note durations, and visual motion, replacing the STEM feature with an interactive control mechanism.

A significant challenge was balancing real-time BPM updates with smooth visualization. The visuals use the latest BPM, which can make animations appear jumpy if BPM changes rapidly. I averaged BPM over 8 intervals to smooth transitions, but this introduced a slight lag, requiring careful tuning. Serial communication also posed issues: the Web Serial API occasionally dropped connections, so I added robust error handling and a “Connect & Fullscreen” button for reconnection.
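
A simplified sketch of the serial handling described here, assuming a global bpm variable and a button wired to connectSerial() (the “Connect & Fullscreen” button and the 30–180 BPM range are from the project; the rest is an illustrative outline of the Web Serial API calls):

let port, reader;

async function connectSerial() {
  try {
    port = await navigator.serial.requestPort();      // user picks the Arduino
    await port.open({ baudRate: 9600 });
    fullscreen(true);                                 // "Connect & Fullscreen"
    readLoop();
  } catch (err) {
    console.error('Serial connection failed:', err);  // user can retry via the button
  }
}

async function readLoop() {
  const decoder = new TextDecoderStream();
  port.readable.pipeTo(decoder.writable);
  reader = decoder.readable.getReader();
  let buffer = '';
  try {
    while (true) {
      const { value, done } = await reader.read();
      if (done) break;
      buffer += value;
      const lines = buffer.split('\n');
      buffer = lines.pop();                           // keep any partial line
      for (const line of lines) {
        const n = parseInt(line, 10);
        if (n >= 30 && n <= 180) bpm = n;             // accept only plausible BPM values
      }
    }
  } catch (err) {
    console.error('Serial read error:', err);         // dropped connection: reconnect via the button
  }
}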

Arduino Code (Heart Rate Sensor):

#define PULSE_PIN A0
void setup() {
  Serial.begin(9600);
  pinMode(PULSE_PIN, INPUT);
}
void loop() {
  int pulseValue = analogRead(PULSE_PIN);
  if (pulseValue > 400 && pulseValue < 800) { // Basic threshold
    Serial.println(pulseValue);
  } else {
    Serial.println("0");
  }
  delay(10);
}
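
The BPM-estimating version described above looked roughly like this; the threshold-plus-refractory-period approach and the 30–180 integer output are from the project, while the specific constant values here are illustrative:

#define PULSE_PIN A0
const int THRESHOLD = 550;             // signal level that counts as a beat (tuned by hand)
const unsigned long REFRACTORY = 300;  // ms to ignore after a beat, filtering false positives

unsigned long lastBeat = 0;
int bpm = 0;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int pulseValue = analogRead(PULSE_PIN);
  unsigned long now = millis();

  if (pulseValue > THRESHOLD && now - lastBeat > REFRACTORY) {
    if (lastBeat > 0) {
      bpm = 60000 / (now - lastBeat);   // interval in ms -> beats per minute
      bpm = constrain(bpm, 30, 180);    // clamp to a plausible range
      Serial.println(bpm);              // send a pre-processed integer BPM
    }
    lastBeat = now;
  }
  delay(10);
}
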
Images

 

The seamless integration of heart rate data into both music and visuals is incredibly rewarding. Seeing the ellipses swirl faster and hearing the chords change in sync with my heartbeat feels like a direct connection between my body and the art. I’m also proud of overcoming the Arduino noise issues by implementing software filtering and averaging, which made the BPM calculation robust despite the direct wiring.

Challenges Faced:
  • Arduino Code: The biggest hurdle was getting reliable pulse detection without a resistor. The direct wiring caused noisy signals, so I spent hours tuning the THRESHOLD and REFRACTORY values in the Arduino code to filter out false positives.
  • BPM Calculation: Calculating BPM in p5.js required averaging intervals (intervals array) to smooth out fluctuations, but this introduced a trade-off between responsiveness and stability. I used a rolling average of 8 intervals, but rapid BPM changes still caused slight visual jumps.
  • Visualization Balance: The visuals update based on the latest BPM, which can make animations feel abrupt if the heart rate spikes. I tried interpolating BPM changes, but this slowed responsiveness, so I stuck with averaging to balance real-time accuracy and smooth motion.
  • p5.js Visualization: Adapting WebGL examples from class to respond to BPM was tricky. The math for scaling ellipse and point field motion (visualSpeed = (bpm / 60) * tempoMul) required experimentation to avoid jittery animations while staying synchronized with the music.
  • Serial Stability: The Web Serial API occasionally dropped connections, especially if the Arduino was disconnected mid-session. Robust error handling and the reconnect button mitigated this, but it required significant testing.
Possible Improvements:
  • Smoother BPM Transitions: Implement linear interpolation to gradually transition between BPM values, reducing visual jumps while maintaining real-time accuracy (a rough sketch of this and the color mapping follows this list).
  • Dynamic Color Mapping: Map BPM to the hue of the ellipses or points (e.g., blue for low BPM, red for high), enhancing the data visualization aspect.
  • Audio Feedback: Add a subtle pitch shift to the pad or bass based on BPM to make tempo changes more audible.
  • Sensor Stability: Introduce a clip-on sensor design to replace loose finger placement, improving contact consistency without fragile wires.
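
A rough sketch of the first two improvements (not yet implemented; displayBpm and the specific ranges are illustrative):

let displayBpm = 60;  // smoothed value used by the visuals

function draw() {
  // Ease toward the latest measured bpm instead of jumping to it
  displayBpm = lerp(displayBpm, bpm, 0.05);

  // Map low BPM to blue (~240) and high BPM to red (~0)
  colorMode(HSB, 360, 100, 100);
  const hue = map(displayBpm, 40, 160, 240, 0, true);
  fill(hue, 80, 100);
  // ...draw the ellipses / point field using displayBpm...
}
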
Reflection + Larger Picture:

This project explores the intersection of biofeedback, art, and interactivity, turning an invisible biological signal (heart rate) into a tangible audiovisual experience. It highlights the potential of wearable sensors to create personalized, immersive art that responds to the user’s physical state. The data visualization component, inspired by p5.js examples from class (e.g., particle systems and dynamic patterns), emphasizes how abstract data can be made expressive and engaging. Beyond art, the project has applications in mindfulness, where users can regulate their heart rate by observing its impact on music and visuals, fostering a deeper connection between the body and mind.

Final Project Progress and Design – Zayed Alsuwaidi

Commit to your Final Project Proposal, include the following explanations in your blog post:

    • Finalized concept for the project
    • Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5
    • Design and description of what P5 program will do and what it will send to and/or receive from Arduino
    • Start working on your overall project (document the progress)

The finalized concept for my project is essentially an experimental simulation that visualizes the specific heart-rate pattern of the person interacting with it, produces experimental music in coordination with that data representation in real time, and allows the user to interact with the simulation.

Serial connection: from Arduino to p5.js (one way).

Links for music (not mine); if time allows, I might create my own music, but this is the current set of ambient and experimental music:

https://cdnjs.cloudflare.com/ajax/libs/p5.js/1.4.2/p5.min.js

https://cdnjs.cloudflare.com/ajax/libs/tone/14.8.49/Tone.min.js

function mousePressed() {
  for (let btn of buttons) {
    if (mouseX > btn.x && mouseX < btn.x + btn.w && mouseY > btn.y && mouseY < btn.y + btn.h) {
      if (btn.action === "back") {
        currentSeqIndex = (currentSeqIndex - 1 + sequences.length) % sequences.length;
        sequence.stop();
        sequence = new Tone.Sequence((time, note) => {
          arpSynth.triggerAttackRelease(note, "16n", time);
          polySynth.triggerAttackRelease([note, note + "2"], "2n", time);
          shapes.push({
            x: random(width),
            y: random(height),
            size: random(50, 150),
            sides: floor(random(3, 8)),
            hue: (bpm + currentSeqIndex * 60) % 360,
            rot: 0,
            type: currentSeqIndex % 2 ? "polygon" : "circle"
          });
        }, sequences[currentSeqIndex], "4n");
        if (!buttons[1].paused) sequence.start(0);
      } else if (btn.action === "pause") {
        btn.paused = !btn.paused;
        btn.label = btn.paused ? "Play" : "Pause";
        if (btn.paused) {
          Tone.Transport.pause();
        } else {
          Tone.Transport.start();
          sequence.start(0);
        }
      } else if (btn.action === "forward") {
        currentSeqIndex = (currentSeqIndex + 1) % sequences.length;
        sequence.stop();
        sequence = new Tone.Sequence((time, note) => {
          arpSynth.triggerAttackRelease(note, "16n", time);
          polySynth.triggerAttackRelease([note, note + "2"], "2n", time);
          shapes.push({
            x: random(width),
            y: random(height),
            size: random(50, 150),
            sides: floor(random(3, 8)),
            hue: (bpm + currentSeqIndex * 60) % 360,
            rot: 0,
            type: currentSeqIndex % 2 ? "polygon" : "circle"
          });
        }, sequences[currentSeqIndex], "4n");
        if (!buttons[1].paused) sequence.start(0);
      }
    }
  }
}

To emphasize user interaction and fine-tune the functionality, the mousePressed function alters the algorithm by which the music is produced.

 

I faced several issues with this TypeError:

TypeError: Cannot read properties of undefined (reading ‘time’)

 

I am currently using a placeholder variable for the heart-rate input from the PulseSensor. The reason is that I need to solder the piece on the right (the metal wires) to create a connection so that I can connect the Arduino to the PulseSensor. I am not experienced with soldering, so I will ask for help to continue this stage.

Next Step: My next step is to solder the wires, start testing the sensor, and implement it into the project. From this, I will test which patterns I can identify to produce the required data visualization. This is a large part of the project, so at the current phase it is about 30% complete.

Here is my p5.js sketch so far, with a working, interactive algorithm to select the preferred ambient music, and functionality based on the heart rate (currently simulated with a dummy variable controlled by a slider).

For the Arduino Uno, I will use this code:

#include <PulseSensorPlayground.h>
#include <ArduinoJson.h>

const int PULSE_PIN = A0;
const int BTN1_PIN = 2;
const int BTN2_PIN = 3;
const int BTN3_PIN = 4;

PulseSensorPlayground pulseSensor;
StaticJsonDocument<200> doc;

void setup() {
  Serial.begin(9600);
  pulseSensor.analogInput(PULSE_PIN);
  pulseSensor.setThreshold(550);
  pulseSensor.begin();
  
  pinMode(BTN1_PIN, INPUT_PULLUP);
  pinMode(BTN2_PIN, INPUT_PULLUP);
  pinMode(BTN3_PIN, INPUT_PULLUP);
}

void loop() {
  int bpm = pulseSensor.getBeatsPerMinute();
  if (!pulseSensor.sawStartOfBeat()) {
    bpm = 0; // Reset if no beat detected
  }
  
  doc["bpm"] = bpm;
  doc["btn1"] = digitalRead(BTN1_PIN) == LOW ? 1 : 0;
  doc["btn2"] = digitalRead(BTN2_PIN) == LOW ? 1 : 0;
  doc["btn3"] = digitalRead(BTN3_PIN) == LOW ? 1 : 0;
  
  serializeJson(doc, Serial);
  Serial.println();
  
  delay(100); // Send data every 100ms
}

and I will test it and document my progress in debugging as well.
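
On the p5.js side, consuming each JSON line would look roughly like this (the field names match the Arduino sketch above; the helper functions for the three buttons are placeholders for whatever the final algorithm-switching logic ends up being):

function handleSerialLine(line) {
  try {
    const data = JSON.parse(line);      // e.g. {"bpm":72,"btn1":0,"btn2":1,"btn3":0}
    bpm = data.bpm;                     // 0 when no beat was detected
    if (data.btn1) previousSequence();  // left button: back (placeholder helper)
    if (data.btn2) togglePause();       // middle button: pause/continue (placeholder helper)
    if (data.btn3) nextSequence();      // right button: forward (placeholder helper)
  } catch (err) {
    // ignore partial or malformed lines
  }
}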

 

Design considerations of the physical presentation of the project:

I am still thinking through different forms of cases and designs, such as bracelets or medical tape, to make the connection between the sensor and the person interacting with the program.

The design requires technical consideration: the connection between the sensor and the radial artery will already be slightly weak (with a margin of error), so I need to document this as well and revisit the design after implementing the PulseSensor.

For the buttons, I am planning to mount them on some form of platform (something similar to the platform that the breadboard and Arduino are attached to).


Fullscreen for the best experience.

Week 11: Serial Communication (Reading + Final Concept)

Reading Response:

I resonate with the reading’s premise that disability design must evolve from mere practicality to an embrace of fashion and artistic expression. This shift not only empowers users but enables them to sculpt their identities—both in how they see themselves and how they are seen by others—through distinctive, personalized devices. Take eyewear as a poignant illustration of this concept: its triumph lies in the diversity of choices, such as an array of frame styles that resonate culturally, enabling individuals to exude confidence rather than embarrassment. In the same vein, Mullins’ prosthetic designs highlight how aesthetics can harmonize with personal flair, bolstering self-worth and enhancing social engagement, much like the way we choose our attire or adorn ourselves with jewelry.

To further this dialogue, I suggest harnessing innovative, interactive design tools like p5.js to create dynamic platforms where users can tailor assistive devices in real-time. By allowing them to select shapes, hues, and materials that echo their personal tastes and lifestyle choices, we align with the reading’s call for user autonomy. This transforms design into a participatory experience where individuals take an active role in shaping the aesthetics and functionality of their devices, akin to selecting outfits that express their unique style. Such tools have the potential to democratize the design process, making it accessible and inclusive while cultivating a culture that celebrates disability as a vibrant expression of individuality.

Moreover, this approach tackles the reading’s concerns about universal design by emphasizing personalized solutions. By incorporating sensor-driven inputs like gesture or voice controls, these platforms can cater to a broad spectrum of abilities, reflecting the user-friendly elegance reminiscent of the iPod interface. This not only fulfills the reading’s vision of design as an act of empowerment but also positions technology as a dynamic intersection of art, fashion, and disability, resulting in devices that are not only functional and beautiful but also deeply personal.

Final Project Preliminary Concept:

 

My concept combines generative music and art with Arduino through a new type of sensor (a heart rate sensor). This would connect to the radial artery on the wrist, and the user’s heart rate would be sent to the Arduino and then through a serial connection to p5.js in real time. p5.js will have a pre-defined set of musical notes and visual graphics that respond to the user’s heart rate; this is effectively a visual data representation of BPM.

 

The output, then, is the generative artwork (in sync with and contributing to the experimental generated music). The experience would last 2 minutes, with the user’s input changing the visuals and music.

I also want to incorporate 3 MIDI-style touch sensors to facilitate the person’s interaction with my project. To make them intuitive, I will place them side by side (left for back, middle for pausing/continuing, right to go forward), which will allow the user to cycle through different algorithms for how the musical notes and visual representations are produced.

 

Week 11: Serial Communication

Arduino and p5.js:

Zayed and Zein

Exercise 1: Moving an Ellipse with One Sensor

Arduino Code:

void setup(){
  Serial.begin(9600);
}

void loop(){
  int pot = analogRead(A0);
  // Less responsive: smaller output range
  int xPos = map(pot, 0, 1023, 100, 300); 
  Serial.println(xPos);
  delay(100);  // longer delay = slower updates
}

 

Challenges:

It was difficult to make the ball move gradually; this was an issue with the p5.js sketch, and we added a smoothing factor to make the movement feel more natural.
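
A minimal sketch of the kind of smoothing we mean, assuming the mapped sensor value is stored in targetX (variable names and the 0.1 factor are illustrative):

let targetX = 200;
let smoothedX = 200;

function draw() {
  background(220);
  smoothedX += (targetX - smoothedX) * 0.1;  // smoothing factor eases the ellipse toward the target
  ellipse(smoothedX, height / 2, 50, 50);
}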

Video:

Exercise 2:

Arduino Code:

// Arduino: LED brightness via Serial input
const int ledPin = 6; 
const unsigned long BAUD = 9600;

void setup() {
  Serial.begin(BAUD);
  while (!Serial) ;        // wait for Serial Monitor
  pinMode(ledPin, OUTPUT);
  Serial.println("LED Brightness Control");
  Serial.println("Send a number 0–255, then <Enter>:");
}

void loop() {
  // only proceed if we have a full line
  if (Serial.available()) {
    String line = Serial.readStringUntil('\n');
    line.trim();           // remove whitespace

    if (line.length() > 0) {
      int b = line.toInt();  
      b = constrain(b, 0, 255);

      analogWrite(ledPin, b);
      Serial.print("▶ Brightness set to ");
      Serial.println(b);
    }
  }
}

// Fade smoothly from the last written brightness to a target brightness.
// (Helper, e.g. fadeTo(b, 5, 10); default arguments are avoided because the
// Arduino IDE's auto-generated prototypes can clash with them.)
int currentBrightness = 0;  // track the last duty cycle written to the LED

void fadeTo(int target, int speed, int delayMs) {
  target = constrain(target, 0, 255);
  while (currentBrightness != target) {
    int step = (target > currentBrightness) ? speed : -speed;
    currentBrightness = constrain(currentBrightness + step, 0, 255);
    // do not overshoot the target on the last step
    if ((step > 0 && currentBrightness > target) ||
        (step < 0 && currentBrightness < target)) {
      currentBrightness = target;
    }
    analogWrite(ledPin, currentBrightness);
    delay(delayMs);
  }
}

 

Challenges:

For this exercise the main challenge was making the gradient slider animation look intuitive for its functionality, and we spent a lot of time debating whether a rainbow slider was cool or not. We decided that it was.

 

 

Exercise 3:

For this exercise we decided to build on the example provided and improve the physics of the wind as well as the animation of the ball itself. To keep track of the sensor values and ensure we were receiving consistent results, we included variables to track the potentiometer and the bounce (1 sets the LED high, 0 sets it low); the mass is just an added feature.

Challenges:

The issues we faced were plenty, including making the wind seem more realistic and making the interaction smoother. The first approach to solving this was to implement a debounce to discard bounce events that occurred too close together during the run. We also had to be particularly careful about how often p5.js asks for the potentiometer reading, balancing the frame rate of the visuals against the frequency of the potentiometer reads.
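
A sketch of the time-based debounce idea (the 150 ms window and the sendToArduino() helper are placeholders, not the exact values we used):

let lastBounceTime = 0;
const DEBOUNCE_MS = 150;

function onBounce() {
  const now = millis();
  if (now - lastBounceTime < DEBOUNCE_MS) return;  // too close to the last bounce: discard it
  lastBounceTime = now;
  sendToArduino('B1');  // protocol from the Arduino code below: 'B' then '1' turns the LED on
}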

Arduino Code:

const int LED_PIN = 9;    // LED pin
const int P_PIN = A0;     // Potentiometer pin

void setup() {
  Serial.begin(9600);
  pinMode(LED_PIN, OUTPUT);
  digitalWrite(LED_PIN, LOW); // Initial LED state

  // Handshake: Wait for p5.js to send a start signal
  while (Serial.available() <= 0) {
    Serial.println("INIT"); // Send "INIT" until p5.js responds
    delay(100);             // Avoid flooding
  }
}

void loop() {
  // Check for incoming serial data
  if (Serial.available() > 0) {
    char incoming = Serial.read(); // Read a single character
    if (incoming == 'B') {        // p5.js sends 'B' followed by 0 or 1
      while (Serial.available() <= 0); // Wait for the value
      int bounce = Serial.read() - '0'; // Convert char '0' or '1' to int
      digitalWrite(LED_PIN, bounce);    // Set LED (0 = LOW, 1 = HIGH)
    }
    else if (incoming == 'R') {       // p5.js requests potentiometer reading
      int potValue = analogRead(P_PIN); // Read potentiometer (0-1023)
      Serial.println(potValue);         // Send value
    }
    // Clear any remaining buffer (e.g., newlines)
    while (Serial.available() > 0) {
      Serial.read();
    }
  }
}

 

ONE VIDEO FOR ALL 3 EXERCISES IN RESPECTIVE ORDER

 

Week 10 Reading Response

Whilst reading A Brief Rant on the Future of Interaction Design, I found the essential framework controversial. Firstly, the assumption that humans require tactile feedback, and attributing it to the high number of nerve endings in our fingertips, only reinforces an argument from evolutionary outcomes. Yes, through evolution we do have the most nerve endings in the tips of our fingers, but that does not contradict a future of interaction design that removes the necessity for human touch. Unlike humans 5,000 years ago, we will not need more tactile feedback in modern society. I prefer completely removing human touch where possible. We should not have to turn a light switch on; that should be automated. We should not have to turn the AC on through an outdated retro interface either. Several things are much better without human touch. If we take the framework further, then we should rethink even current technology, not just future technology.

Whilst reading Responses: A Brief Rant on the Future of Interaction Design, I realized that the author largely dismissed criticism of the framework’s reasoning under the guise that it is just a rant. Perhaps the examples of solutions such as haptic holography and nanobot assemblies are conceptually useful, but not without further explanation of the framework. Overall I disliked this reading for the above reasons, but I found the response to provide some helpful perspective.

Week 10 Instrument (Z Z)

Concept:

We decided to use two digital switches (push switches) to play two different notes, with the LDR effectively acting as an analogue volume adjustment mechanism. The video demonstrates how feedback from the LDR changes the volume: if you pay attention when the light intensity on the LDR is decreased, there is a very faint sound.

Demo (Circuit):

Demo (Video):

Arduino Code:

// Define pins for the buttons and the speaker
int btnOnePin = 2;
int btnTwoPin = 3;
int speakerPin = 10;

void setup() {
  // Initialize both button pins as inputs with built-in pull-up resistors
  pinMode(btnOnePin, INPUT_PULLUP);
  pinMode(btnTwoPin, INPUT_PULLUP);

  // Configure the speaker pin as an output
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  // Check if the first button is pressed
  if (digitalRead(btnOnePin) == LOW) {
    tone(speakerPin, 262); // Play a tone at 262 Hz
  }
  // Check if the second button is pressed
  else if (digitalRead(btnTwoPin) == LOW) {
    tone(speakerPin, 530); // Play a tone at 530 Hz
  }
  // No button is pressed
  else {
    noTone(speakerPin); // Turn off the speaker
  }
}

Challenges:

The initial concept started out with a Light Dependent Resistor and the Piezo speaker/buzzer. We faced issues: the readings from the LDR did not behave as expected, there was an issue with the sound produced, and the change in the music produced was not adequate.

We also faced challenges with the programming, as the sound production was inconsistent. We fixed this by adjusting the mapping of the notes to produce more distinct frequencies for each push button (red vs. yellow), at 262 and 530 Hz respectively.

Done by: Zayed Alsuwaidi (za2256) and Zein Mukhanov (zm2199)

Week 9 Reading

Physical Computing’s Greatest Hits (and misses):

I am particularly interested in theremin-like instruments, gloves, and video mirrors. I believe that these three examples of physical computing, implemented together and in harmony, have the potential to change how we represent the lived physical world. To extend these concepts, I thought of methods of taking input from the environment: the sounds of birds, the wind, the temperature, the perceived luminosity of the sun (at the place of input), as well as small movements (either from the wind or from sound). By combining these inputs and processing them, an algorithm can generate certain sounds. This music then comes together with video mirroring and processing, where the composed music dictates a new perception of the scenery or place of the installation. I found value in the limitations explained in this reading, as the meaning attributed to the actions performed and the method of input is key to making a truly interactive and valuable physical computing experience. To address those limitations in the previous example, I would test how people received the experience and fine-tune it accordingly. Things along this line are an intuitive experience (ease of use), meaning derived from interaction, and the ability to interact for a sufficient amount of time while maintaining meaningful interaction.

 

 

Making Interactive Art: Set the Stage, Then Shut Up and Listen.

 

This reading builds on ‘Physical Computing’s Greatest Hits (and misses)’ by explaining the intricacies of limitations in the design of interactive art. The concept of intuition in interaction is therefore represented as the ability of a person to understand and find meaning in the interaction with the artwork. I found the organization and flow of this reading very helpful because it reiterated and solidified the concepts of listening (by both the artist and the audience) and the statement made by the artist. The statement (as I understand it) is not a standalone expression, but rather an interactive conversation that aims for a meaningful exchange of actions and expressions between the artist, their art, and the audience.

Week 9 Analogue + Digital

For this assignment I started by connecting one push switch to one LED, and I used that as the basis of my draft schematic and work. I then re-drew my schematic and included the photoresistor (LDR) sensor. Here I faced challenges in both the hardware and the software, specifically with the timing of the code. Initially, the light would stay on for too long, so I rechecked the timing in my Arduino code and adjusted it so the LED stays in the high (on) state for a shorter time.

 

const int ldrPin = A2;              // LDR connected to analog pin A2
const int buttonPin = 2;            // Push button connected to digital pin 2
const int ledAnalogPin = 10;        // PWM LED pin
const int ledDigitalPin = 11;       // On/off LED pin

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(ledAnalogPin, OUTPUT);
  pinMode(ledDigitalPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int ldrValue = analogRead(ldrPin);        
  int brightness = map(ldrValue, 0, 1023, 255, 0);  // Map the LDR reading to LED brightness (inverted mapping)

  analogWrite(ledAnalogPin, brightness);

  int buttonState = digitalRead(buttonPin);

  if (buttonState == HIGH) {
    digitalWrite(ledDigitalPin, HIGH);
  } else {
    digitalWrite(ledDigitalPin, LOW);
  }

  delay(100);
}

For the schematic I followed the Fritzing schematic format:

 

 

 

Video Demonstration:

https://drive.google.com/file/d/1kAYD6v6C86fbftmEfKM37j6jtumxzql1/view?usp=sharing

 

Some challenges I would still like to overcome are with the LED output driven by the photoresistor (LDR), as I wanted the output to be stronger. I will address these issues by adhering more closely to the schematic and by having a better methodology for implementing changes to the circuit (thinking through what would happen if I added a certain connection or made a specific change before making it).

 

Week 8 – Reading Response

The key aspects I noticed in Norman’s Emotion & Design: Attractive Things Work Better were the negative and positive valences, where affect has great implications for how humans interact with, and therefore how productively they use, everyday things (or, better described, any tool). I want to expand this concept and apply this same relationship between affect and behavior not only to items such as teapots or similar tools, but to the lived environment. I tend to agree with this framework, to the extent that I am fully confident you will produce different work and express different levels of creativity through a lived, metaphysical conversation in real time. To highlight this with an example, I prefer sitting in a natural setting (garden, desert, beach) with no seating or furniture when I try to solve an issue or want to focus on my creative side. After all, nature inspired half of the things we know today.

After reading McMillan’s Her Code Got Humans on the Moon – And Invented Software Itself, I connected the story and experience of Margaret Hamilton to that of Ada Lovelace. After all, they are both pioneers in their own respect. Lovelace developed the first known and documented computer program and worked directly with Charles Babbage, extending the influence of his work. Margaret reflects this pioneering, ingenious creativity in her highly technical software engineering work with punch cards and simulation. During the 1960s, this development process required a great deal of manual human intervention. I am inspired by her motives and ambition, and I wonder how many people she inspires today. Margaret’s work and achievements resonate with me today, and I believe she deserves even more credit, just like how Lovelace has a GPU architecture named after her.

Unusual Switch

For the Unusual Switch I decided to continue the concepts I was exploring within the first seven weeks of interactive and generative art using p5.js. I call this switch the “Handcuff” switch. A general rule of thumb tells you “gif > any other file format,” so here is my demonstration, in which I also try to re-enact how the person wearing the handcuffs would feel.

 

 

const int switchPin = 2;
const int ledPin = 13;
bool handcuffsLocked = false; // tracks if handcuffs are on or off

void setup() {
  pinMode(switchPin, INPUT_PULLUP);  // Internal pull-up resistor
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int switchState = digitalRead(switchPin);

  if (switchState == LOW && !handcuffsLocked) {
    handcuffsLocked = true; // lock the handcuffs
    digitalWrite(ledPin, HIGH);
    Serial.println("Handcuffs ON");
  }

  if (switchState == HIGH && handcuffsLocked) {
    handcuffsLocked = false; // unlock the handcuffs
    digitalWrite(ledPin, LOW);
    Serial.println("Handcuffs OFF");
  }

  delay(100);
}

I designed this basic code to detect and track when the handcuffs are locked or unlocked. Mainly, it is a conditional that keeps track of the state: on (locked) or off (unlocked).