Final Project Documentation

I initiated this project to emulate the card-scanning excitement of Yu-Gi-Oh! duel disks, in which tapping cards summons monsters and spells. Users present one or more RFID tags, each representing a cowboy, astronaut or alien, to an MFRC522 reader connected to an Arduino Uno. The system then opens a five-second selection window before launching one of three interactive mini-games in p5.js: Stampede, Shooter or Cookie Clicker. In Stampede you take the helm of a lone rider hurtling through a hazardous space canyon, dodging bouncing rocks and prickly cacti that can slow you down or shove you backwards, all while a herd of cosmic cows closes in on your tail. Shooter throws two players into a tense standoff: each pilot manoeuvres left and right, firing lasers at their opponent and scrambling behind shields to block incoming beams until one side breaks. Cookie Clicker is pure, frenzied fun: each participant pounds the mouse on a giant on-screen cookie for ten frantic seconds, racing to rack up the most clicks before time runs out. All visual feedback appears on a browser canvas, and audio loops accompany each game.

 

 

Components

The solution comprises four principal components:

  • RFID Input Module: An MFRC522 reader attached to an Arduino Uno captures four-byte UIDs from standard MIFARE tags.
  • Serial Bridge: The Arduino transmits single-character selection codes (‘6’, ‘7’ or ‘8’) at 9600 baud over USB and awaits simple score-report messages in return.
  • p5.js Front End: A browser sketch employs the WebSerial API to receive selection codes, manage global state and asset loading, display a five-second combo bar beneath each character portrait, and execute the three mini-game modules (a minimal sketch of the receive path follows this list).
  • Mechanical Enclosure: Laser-cut plywood panels, secured with metal L-brackets, form a cuboid housing; a precision slot allows the 16×2 LCD module to sit flush with the front panel.
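
To make the serial bridge and front end concrete, here is a minimal p5.js sketch of the WebSerial receive path. The names connectSerial and readLoop are illustrative rather than the exact ones in my sketch, and error handling is omitted; it simply forwards each selection code to the input dispatcher described later.

// Minimal WebSerial receive path (illustrative names, error handling omitted)
let port, reader;

async function connectSerial() {
  port = await navigator.serial.requestPort();   // user picks the Arduino's port
  await port.open({ baudRate: 9600 });           // must match Serial.begin(9600) on the Arduino
  reader = port.readable.getReader();
  readLoop();
}

async function readLoop() {
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    for (const ch of decoder.decode(value)) {
      if (ch === '6' || ch === '7' || ch === '8') {
        handleInput(ch);                         // hand the selection code to the active module
      }
    }
  }
}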

Hardware Integration

The MFRC522 reader’s SDA pin connects to Arduino digital pin D10 and its RST pin to D9, while the SPI lines (MOSI, MISO, SCK) share the hardware bus. In firmware, the reader is instantiated via “MFRC522 reader(SS_PIN, RST_PIN);” and a matchUID() routine compares incoming tags against the three predefined UID arrays.

Integrating a standard 16×2 parallel-interface LCD alongside the RFID module proved significantly more troublesome. As soon as “lcd.begin(16, 2)” was invoked in setup(), RFID reads ceased altogether. Forum guidance indicated that pin conflicts between the LCD’s control lines and the RC522’s SPI signals were the most likely culprit. A systematic pin audit revealed that the LCD’s Enable and Data-4 lines overlapped with the RFID’s SS and MISO pins. I resolved this by remapping the LCD to use digital pins D2–D5 for its data bus and D6–D7 for RS/Enable, updating both the wiring and the constructor call in the Arduino sketch.

P5.js Application and Mini-Games

The browser sketch orchestrates menu navigation, character selection and execution of three distinct game modules within a single programme.

A single “currentState” variable (0–3) governs menu, Stampede, Shooter and Cookie Clicker modes. A five-second “combo” timer begins upon the first tag read, with incremental progress bars drawn beneath each portrait to visualise the window. Once the timer elapses, the sketch evaluates the number of unique tags captured and transitions to the corresponding game state.
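
Sketched below is how that selection window can be driven; registerTag, drawComboBars and pickGame are stand-ins for the actual helper functions in my sketch.

// Illustrative selection-window logic
let currentState = 0;              // 0 = menu, 1 = Stampede, 2 = Shooter, 3 = Cookie Clicker
let comboStart = null;             // millis() at the first tag read
let tagsSeen = new Set();          // unique selection codes captured during the window

function registerTag(code) {
  if (comboStart === null) comboStart = millis();  // the first tag opens the five-second window
  tagsSeen.add(code);
}

function updateCombo() {
  if (comboStart === null) return;
  const elapsed = millis() - comboStart;
  drawComboBars(elapsed / 5000);                   // progress bars beneath the portraits
  if (elapsed >= 5000) {
    currentState = pickGame(tagsSeen.size);        // number of unique tags picks the game state
    comboStart = null;
    tagsSeen.clear();
  }
}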

Merging three standalone games into one sketch turned out to be quite the headache. Each mini-game had its own globals, things like score, stage and bespoke input handlers, which clashed as soon as I tried to switch states. To sort that out, I prefixed every variable with its game name (stampedeScore, sh_p1Score, cc_Players), wrapped them in module-specific functions and kept the global namespace clean.

The draw loop needed a rethink, too. Calling every game’s draw routine in sequence resulted in stray graphics popping up when they shouldn’t. I restructured draw() into a clear state machine: only the active module’s draw function runs each frame. That meant stripping out stray background() calls and rogue translate()s from the individual games so they couldn’t bleed into one another.
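
In outline, the restructured draw() looks something like this (the per-game draw functions are placeholders for the actual module code):

function draw() {
  background(20);                       // the only background() call; modules no longer clear the canvas
  switch (currentState) {
    case 0: drawMenu(); updateCombo(); break;
    case 1: drawStampede(); break;
    case 2: drawShooter(); break;
    case 3: drawCookieClicker(); break;
  }
}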

Finally, unifying input was tricky. I built a single handleInput function that maps RFID codes (‘6’, ‘7’, ‘8’) and key presses to abstract commands (move, shoot, click), then sends them to whichever module is active. A bit of debouncing logic keeps duplicate actions at bay, which is especially critical during that five-second combo window, so you always get predictable, responsive controls.
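
A simplified version of that dispatcher, including the debounce, might look like the following (the per-game command functions are illustrative):

// Unified input dispatch with simple debouncing
const DEBOUNCE_MS = 250;                // debounce window; the real value was tuned by feel
let lastInput = { code: null, time: 0 };

function handleInput(code) {
  const now = millis();
  if (code === lastInput.code && now - lastInput.time < DEBOUNCE_MS) return;  // ignore rapid repeats
  lastInput = { code: code, time: now };

  if (currentState === 0) {
    registerTag(code);                  // menu: tags feed the five-second combo window
  } else if (currentState === 1) {
    stampedeCommand(code);              // e.g. move
  } else if (currentState === 2) {
    shooterCommand(code);               // e.g. shoot / shield
  } else if (currentState === 3) {
    ccCommand(code);                    // e.g. click
  }
}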

The enclosure is constructed from laser-cut plywood panels, chosen for both their sustainability and structural rigidity, and finished internally with a white-gloss plastic backing to evoke a sleek, modern aesthetic. Metal L-brackets fasten each panel at right angles, avoiding bespoke fasteners and allowing for straightforward assembly or disassembly. A carefully dimensioned aperture in the front panel accommodates the 16×2 LCD module so that its face sits perfectly flush with the surrounding wood, maintaining clean lines.

Switching between the menu and the individual mini-games initially caused the sketch to freeze on several occasions. Timers from the previous module would keep running, arrays retained stale data and stray transformations lingered on the draw matrix. To address this, I introduced dedicated cleanup routines, resetStampede(), shCleanup() and ccCleanup(), that execute just before currentState changes. Each routine clears its game’s specific variables, halts any looping audio and calls resetMatrix() (alongside any required style resets) so that the next module starts with a pristine canvas.
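
As an example, the Stampede cleanup and the state-change wrapper look roughly like this; the variable and track names are placeholders for the real ones.

function resetStampede() {
  stampedeScore = 0;
  stampedeObstacles = [];                          // drop stale obstacle data
  if (stampedeMusic.isPlaying()) stampedeMusic.stop();
  resetMatrix();                                   // clear lingering transforms
  noTint();                                        // example of a style reset
}

function changeState(next) {
  if (currentState === 1) resetStampede();         // clean up whichever module is leaving
  else if (currentState === 2) shCleanup();
  else if (currentState === 3) ccCleanup();
  currentState = next;
}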

Audio behaviour also demanded careful attention. In early versions, rapidly switching from one game state to another led to multiple tracks playing at once or to music cutting out abruptly, leaving awkward silences. I resolved these issues by centralising all sound control within a single audio manager. Instead of scattering stop() and loop() calls throughout each game’s code, the manager intercepts state changes and victory conditions, fading out the current track and then initiating the next one in a controlled sequence. The result is seamless musical transitions that match the user’s actions without clipping or overlap.
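
The manager boils down to one function that every state change passes through; the track variables are assumed names and the half-second fade is just an example value.

// Centralised audio manager: only one track plays at a time
let activeTrack = null;

function playTrackFor(state) {
  const next = [menuMusic, stampedeMusic, shooterMusic, ccMusic][state];
  if (next === activeTrack) return;                // already playing the right track
  if (activeTrack && activeTrack.isPlaying()) {
    activeTrack.setVolume(0, 0.5);                 // fade the old track out over 0.5 s
    activeTrack.stop(0.5);                         // then stop it once the fade completes
  }
  activeTrack = next;
  activeTrack.setVolume(1, 0.5);                   // fade the new track in
  activeTrack.loop();
}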

The enclosure underwent its own process of refinement. My first plywood panels, cut on a temperamental laser cutter, frequently misaligned: the slot for the LCD would be too tight to insert the module or so loose that it rattled. After three iterative cuts, I tweaked the slot width, adjusted the alignment tabs and introduced a white-gloss plastic backing. This backing not only conceals the raw wood edges but also delivers a polished, Apple-inspired look. Ultimately, the panels now fit together snugly around the LCD and each other, creating a tool-free assembly that upholds the project’s premium aesthetic.

Future Plans

Looking ahead, the system lends itself readily to further enhancement through the addition of new mini-games. A puzzle challenge or a rhythm-based experience, for instance, could leverage the existing state framework; each new module would simply plug into the central logic, reusing the asset-loading and input-dispatch infrastructure already in place.

Beyond additional games, implementing networked multiplayer via WebSockets or a library such as socket.io would open the possibility of remote matches and real-time score sharing, transforming the project from a local-only tabletop experience into an online arena. Finally, adapting the interface for touch input would enable smooth operation on tablets and smartphones, extending the user base well beyond desktop browsers.

Conclusion

Working on this tabletop arcade prototype has been both challenging and immensely rewarding. I navigated everything from the quirks of RFID timing and serial communications to the intricacies of merging three distinct games into a single p5.js sketch, all while refining the plywood enclosure for a polished finish. Throughout the “Introduction to Interactive Media” course, I found each obstacle, whether in hardware, code or design, to be an opportunity to learn and to apply creative problem-solving. I thoroughly enjoyed the collaborative atmosphere and the chance to experiment across disciplines; I now leave the class not only with a functional prototype but with a genuine enthusiasm for future interactive projects.

 

Week 11: Serial Communication

1. Concept

There are 3 parts to the project:

(1) Light Dependent Resistor (LDR) readings are sent from the Arduino to p5.js. The ellipse in p5.js moves along the horizontal axis in the middle of the screen depending on the LDR readings; nothing on the Arduino is controlled by p5.js (a minimal p5.js sketch for this part appears after this list).
(2) The LED brightness is controlled from p5.js using the mouseX position: the further right the mouse, the brighter the LED.
(3) Taking the gravity wind example (https://editor.p5js.org/aaronsherwood/sketches/I7iQrNCul), every time the ball bounces an LED lights up and then turns off, and the wind is controlled by the potentiometer (analog sensor).
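
For part (1), the p5.js side reduces to mapping the incoming LDR value onto the x-axis. The sketch below assumes the reading arrives as a plain number (0–1023) per line of serial data; the serial plumbing itself follows the setup used in class.

let ldrValue = 0;                                  // latest LDR reading received from the Arduino

function setup() {
  createCanvas(640, 360);
}

function draw() {
  background(255);
  const x = map(ldrValue, 0, 1023, 0, width);      // the light level sets the ellipse's x position
  ellipse(x, height / 2, 50, 50);
}

// called by the serial-handling code whenever a full line arrives
function onSerialLine(line) {
  const v = parseInt(line.trim());
  if (!isNaN(v)) ldrValue = v;
}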

2. Highlights: Challenges and Overcoming Them

(1) The circle position was not changing in response to the brightness of the surroundings. We attempted to tackle this problem by checking the serial monitor to see whether the LDR readings were coming through. After confirming the LDR’s functionality, we closed the serial monitor, switched to p5.js and selected the right serial port. However, the circle position was still not changing. With help from the Professor, we streamlined our code down to only what was necessary. This worked!

 

(2) The LED was flickering and we did not know why. Alisa thought the delay(50) should come after analogWrite(ledPin, brightness) rather than before it. However, that did not solve the problem. Samuel suggested removing the delay(50) altogether, but that did not work either. We then decided to map the mouseX position (ranging from 0 to 600) to the range 0 to 255 in Arduino instead of p5.js. This worked!

 

(3) For the third assignment I worked alongside many different people: Alisa, Samuel, Haris (and even Prof Aya’s slides). I ran into trouble at every stage of the hardware side, from Safari stopping the code from interacting with the Arduino to serial issues with my laptop, but I was able to piece the work together until it finally ran. The coding aspect was simple overall, as the base code only had to be slightly amended to take inputs from the Arduino, and the LED had to complete an action based on the running programme.

3. Video

Part 1:

Part 2:

Part 3:

4. Reflection and Future Areas of Improvement

Our challenges emphasized the need to understand the code that allows communication between p5.js and Arduino. Without understanding the code, it is difficult to make the removals or changes necessary to achieve the results we are hoping for.

It would be great to find out why the LED did not flicker when the mapping was performed in Arduino rather than p5js. Is this related to PWM?

Reading Response: Design Meets Disability

The text prompted me to reflect deeply on the role of design in shaping not only products but also perceptions and identities, especially for those of us working in product design. The text challenges the traditional boundaries between medical necessity and mainstream desirability, and as a designer, I find both inspiration and discomfort in this tension.

One point that stands out is the way the Eameses’ leg splint for the U.S. Navy, originally a response to a medical brief, became a catalyst for innovations that transformed mainstream furniture design. The author’s admiration for the leg splint’s “organic form and geometric holes, the combination of subtle surfaces and crisp edges” resonates with me because it exemplifies how constraints can drive creativity. The evidence is clear: the technology and aesthetic language developed for a medical device directly influenced the iconic Eames furniture. This makes me think about how often, in design for disability, we excuse poor design because of the market or the context, rather than holding it to the same standards as mainstream products. It prompts me to question: why shouldn’t design for disability be as beautiful, as considered, and as desirable as anything else?

However, I am uneasy about the persistent bias towards discretion in assistive products. The text critiques the tradition of camouflaging medical devices (pink plastic hearing aids, for instance) in an attempt to make them invisible. The author argues that this approach can inadvertently reinforce the idea that disability is something to be hidden, rather than embraced. The evidence comes from the evolution of eyewear: glasses have transitioned from stigmatized medical appliances to fashion statements, even to the point where people without visual impairments wear them as accessories. This shift did not come from making glasses invisible, but from making them objects of desire. It makes me realise that, as a designer, I should challenge the default of discretion and instead explore how products can project positive identities.

The discussion of fashion’s role in design for disability is particularly provocative. The text points out that fashion and discretion are not true opposites, but there is a creative tension between them. Fashion’s embrace of diversity and its ability to make wearers feel good about themselves stands in stark contrast to the clinical, problem-solving culture that dominates medical design. The evidence is in the HearWear project, where inviting designers from outside the medical field led to radical new concepts for hearing aids: some playful, some overtly technological, and some drawing on jewellery and body adornment. This makes me reflect on my own practice: am I too quick to prioritise technical performance and problem-solving at the expense of self-expression and emotional value?

What I particularly like about the text is its insistence on keeping “the design in design for disability.” The author links the success of products like the iPod to a relentless focus on simplicity and attention to detail, arguing that these same sensibilities are often missing from assistive products because designers are not involved early enough in the process. The point is well made: design is not just about how something looks, but how it works and how it makes people feel. The evidence is in the contrast between the iPod’s iconic simplicity and the “flying submarine” syndrome of overburdened, over-complicated universal designs that try to be everything to everyone and end up pleasing no one. This reminds me that good design often means having the courage to say no to unnecessary features, and instead focusing on the essence of the product and the experience it creates.

Yet, I dislike the way the field of design for disability is still so often siloed and marginalised, both in practice and in perception. The text highlights how multidisciplinary teams developing prosthetics rarely include industrial or fashion designers, let alone sculptors or artists. The result is products that may function well but fail to resonate emotionally or culturally with their users. The evidence comes from the stories of Aimee Mullins and Hugh Herr, who both see their prostheses not just as tools but as extensions of their identity, sometimes even as sources of pride or advantage. This makes me think about the importance of diversity, not only among users but also among designers. We need to bring in more voices, more perspectives, and more creativity if we are to create products that are truly inclusive and empowering.

In conclusion, this text has challenged me to rethink my approach as a designer. It has made me more aware of the cultural and emotional dimensions of product design, especially in the context of disability. I am inspired to seek a healthier balance between problem-solving and playful exploration, between discretion and fashion, and between universality and simplicity. Most of all, I am reminded that design has the power to shape not only products but also identities and societies, and that this responsibility should never be taken lightly.

Preliminary Concept: Multiplayer Character Card Game System Using Arduino and p5.js

Finalised Concept

For my final project, I am designing an interactive multiplayer game system inspired by the iconic Yu-Gi-Oh! duel disk, but reimagined for creative, social play. Players use physical cards embedded with unique identifiers (such as RFID or NFC tags) to represent custom characters they create. These cards are scanned by an Arduino-powered duel disk, allowing each player to join the game with their personalized avatar and stats. The system supports multiple players competing in a series of mini-games or trivia challenges, with all real-time visuals and game logic handled by a p5.js interface on the computer.

This project merges tangible interaction (physical cards and duel disk) with digital feedback (customizable avatars, live scores, and dynamic mini-games), creating a seamless, engaging experience that emphasizes both individual expression and social play.

Arduino Program Design

Inputs:

  • Multiple RFID/NFC readers (or a shared reader for sequential scans), each detecting when a player places their card on the duel disk.

  • Optional: Buttons or touch sensors for additional in-game actions.

Outputs:

  • LEDs, buzzers, or vibration motors embedded in the disk to provide physical feedback (e.g., indicate successful scans, turns, or game outcomes).

Communication:

  • When a card is scanned, Arduino reads the card’s unique ID and sends it to the computer via serial communication.

  • Arduino can also receive commands from p5.js (e.g., to trigger LEDs or buzzers when a player wins a round).

Summary of Arduino’s Role:

  • Listen for card scans and input actions.

  • Transmit card IDs and sensor data to p5.js promptly.

  • Receive feedback commands from p5.js to control physical outputs.

p5.js Program Design

Responsibilities:

  • Listen for incoming serial data from the Arduino (card IDs, button presses).

  • Match each card ID to a player profile, loading their custom character (name, avatar, stats).

  • Allow players to customize their characters through a user-friendly interface.

  • Manage the game flow: let multiple players join, select mini-games, and track scores.

  • Display real-time game visuals, avatars, and results.

  • Send commands back to Arduino to trigger physical feedback (e.g., light up the winning player’s section).

Data Flow:

  • On card scan: p5.js receives the ID, loads or prompts for player customization, and adds the player to the game (a small lookup sketch follows this list).

  • During play: p5.js updates visuals and scores based on game logic and player actions.

  • On game events: p5.js sends output commands to Arduino for physical feedback.
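
One simple way the card-to-profile lookup could work on the p5.js side is sketched below; the UIDs, profile fields and addPlayerToGame helper are all placeholders.

// Hypothetical lookup from card UID to player profile
const profiles = {
  "04A1B2C3": { name: "Cowboy",    avatar: "cowboy.png",    score: 0 },
  "04D4E5F6": { name: "Astronaut", avatar: "astronaut.png", score: 0 },
};

function onCardScanned(uid) {
  let player = profiles[uid];
  if (!player) {
    player = { name: "New player", avatar: "default.png", score: 0 };
    profiles[uid] = player;                        // in the real UI, prompt for customization here
  }
  addPlayerToGame(player);                         // assumed game-flow helper
}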

Interaction Design

  • Joining the Game: Each player places their card on the duel disk. The Arduino detects the card, and p5.js loads their profile or prompts for customization.

  • Customizing Characters: Players can use the p5.js interface to personalize their avatars, choose stats, and save progress.

  • Starting a Game: Once all players have joined, they select a mini-game or trivia challenge.

  • Gameplay: Players compete head-to-head or in teams, with p5.js managing the game logic and displaying results. Physical feedback (lights, sounds) enhances the experience.

  • Winning and Progression: Scores and achievements are tracked per player, and leaderboards are displayed at the end of each round.

System Communication

  • Arduino → p5.js: Sends card IDs and sensor/button states.

  • p5.js → Arduino: Sends commands to trigger LEDs, buzzers, or other outputs based on in-game events.

Project Progress and Next Steps

  • Prototyping: I am currently prototyping the card scanning system with Arduino and testing serial communication with p5.js.

  • UI/UX: Early sketches for the p5.js interface focus on clear avatar displays, easy customization, and intuitive game flow.

  • Game Logic: I am developing the first mini-game (a trivia challenge) and designing the multiplayer logic to support dynamic player counts.

Why This Project?

This system blends physical and digital interaction in a way that is social, customizable, and fun. It encourages users to invest in their characters and compete or collaborate with others, making every session unique. The project leverages Arduino for timely, tangible sensing and feedback, while p5.js handles multimedia processing and engaging visual responses, fulfilling the assignment’s requirements for a responsive, multimedia interactive system.

Assignment 10: Make a musical instrument

This is my Melodic Button Machine. It uses three push buttons (digital sensors) and a potentiometer (analog sensor) to create a simple, playful musical instrument. Each button plays a different musical note, while the potentiometer allows the player to bend the pitch of the note in real time, much like a musician bending a guitar string or sliding a trombone.

Machine Shown in Class

Assignment Brief

The assignment challenged us to create a musical instrument using Arduino technology. The requirements were clear: incorporate at least one digital sensor (such as a switch or button) and at least one analog sensor (like a potentiometer, photoresistor, or distance sensor). The instrument should respond to user input in a way that is both interactive and expressive.

Conceptualisation

The idea for this project emerged from my fascination with the simplicity of early electronic instruments. I remembered a childhood toy keyboard that could produce a handful of notes, and how magical it felt to create music with just a few buttons. I wanted to recreate that sense of wonder, but with a modern DIY twist. I also wanted to explore how analog and digital sensors could work together to give the user expressive control over the sound.

Process

Component Selection: I started by gathering the essential components: an Arduino Uno, a breadboard, three push buttons, a potentiometer, a piezo buzzer, jumper wires, and a handful of resistors. The buttons would serve as the digital inputs for note selection, while the potentiometer would act as the analog input to modulate pitch.

Circuit Assembly: The buttons were wired to digital pins 2, 3, and 4 on the Arduino, with internal pull-up resistors enabled in the code. The potentiometer’s middle pin was connected to analog pin A0, with its outer pins going to 5V and GND. The piezo buzzer was connected to digital pin 8, ready to bring the project to life with sound.

Code Development: I wrote Arduino code that assigned each button a specific musical note: C, D, or E. The potentiometer’s value was mapped to a pitch modulation range, so turning it would raise or lower the note’s frequency. This allowed for playful experimentation and made the effect of the potentiometer obvious and satisfying. I tested the code, tweaking the modulation range to make sure the pitch bend was dramatic and easy to hear.

Testing and Tuning: Once everything was wired up, I played simple tunes like “Mary Had a Little Lamb” and “Hot Cross Buns” by pressing the buttons in sequence. The potentiometer added a fun twist, letting me add vibrato or slides to each note.

Challenges

Pitch Range Calibration:
Finding the right modulation range for the potentiometer was tricky. If the range was too wide, the notes sounded unnatural; too narrow, and the effect was barely noticeable. After some trial and error, I settled on a ±100 Hz range for a musical yet expressive pitch bend.

Wiring Confusion:
With multiple buttons and sensors, it was easy to mix up wires on the breadboard. I solved this by colour-coding my jumper wires and double-checking each connection before powering up.

Potential Improvements

More Notes:
Adding more buttons would allow for a wider range of songs and melodies. With just three notes, the instrument can play simple tunes, but five or more would open up new musical possibilities.

Polyphony:
Currently, only one note can be played at a time. With some code modifications and additional hardware, I could allow for chords or overlapping notes.

Alternative Sensors:
Swapping the potentiometer for a light sensor or distance sensor could make the instrument even more interactive.

Visual Feedback:
Adding LEDs that light up with each button press or change colour with the pitch would make the instrument more visually engaging.

Schematics

Source Code

const int button1Pin = 2;
const int button2Pin = 3;
const int button3Pin = 4;
const int potPin = A0;
const int buzzerPin = 8;

// Define base frequencies for three notes (C4, D4, E4)
const int noteC = 262;  // C4
const int noteD = 294;  // D4
const int noteE = 330;  // E4

void setup() {
  pinMode(button1Pin, INPUT_PULLUP);
  pinMode(button2Pin, INPUT_PULLUP);
  pinMode(button3Pin, INPUT_PULLUP);
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  int potValue = analogRead(potPin);
  // Map potentiometer to a modulation range
  int modulation = map(potValue, 0, 1023, -100, 100);

  if (digitalRead(button1Pin) == LOW) {
    tone(buzzerPin, noteC + modulation); // Button 1: C note modulated
  } else if (digitalRead(button2Pin) == LOW) {
    tone(buzzerPin, noteE + modulation); // Button 2: E note modulated
  } else if (digitalRead(button3Pin) == LOW) {
    tone(buzzerPin, noteD + modulation); // Button 3: D note modulated
  } else {
    noTone(buzzerPin);
  }
}

A Brief Rant on the Future of Interaction Design

Reading “A Brief Rant on the Future of Interaction Design” genuinely made me pause and reconsider the direction of our relationship with technology. The author’s central argument, that today’s touchscreens, while innovative, are ultimately constraining, really struck a chord with me. I often marvel at how intuitive my phone is, yet I can’t help but notice how it reduces all my interactions to tapping and swiping on a flat surface. The analogy to early black-and-white cameras is particularly effective; just as those cameras were revolutionary yet obviously incomplete, so too are our current devices. The reference to Matti Bergström’s research about the richness of tactile sensation in our fingertips is compelling evidence that we are sacrificing a significant aspect of our cognitive and sensory development for convenience.

However, I found myself questioning the author’s rather dismissive stance on voice interfaces and gesture controls. While I agree that voice commands can’t replace the hands-on nature of artistic or spatial work, I think the author underestimates their value. Personally, I find voice assistants incredibly useful for multitasking or quick information retrieval, tasks where speed and efficiency matter more than depth of interaction. The author’s point about deep understanding requiring active exploration is well taken, but I believe there’s space for a variety of input methods, each suited to different contexts.

The text also made me reflect on the broader consequences of how we use technology. The idea that we are “giving up on the body” by designing tools that encourage us to be sedentary is quite thought-provoking. I hadn’t previously considered how interface design could contribute to a less physically engaged lifestyle. This perspective encourages me to be more mindful of how I use technology, and to seek out more physically interactive experiences where possible.

In summary, this rant has prompted me to think more deeply about what I want from the future of interaction design. While I appreciate the accessibility and simplicity of modern devices, I agree that we shouldn’t settle for tools that limit our physical and intellectual potential. The text serves as a powerful reminder that technology should enhance our full range of human abilities, not just cater to convenience.

Assignment 9: Digital and Analogue Detection

This is my analogue and digital sensing machine. It uses an ultrasonic sensor and a push button: the sensor changes the brightness of one LED, while the button toggles another LED on and off.

Assignment Brief

  • Get information from at least one analogue sensor and at least one digital sensor
  • Use this information to control at least two LEDs, one in a digital fashion and the other in an analog fashion
  • Include a hand-drawn schematic in your documentation

Conceptualisation

The idea for this project was inspired by the desire to create an interactive system that responds to both user input and environmental conditions. I wanted to design a setup where users could actively engage with the system while also observing dynamic feedback. By using an ultrasonic sensor as the analog input, I aimed to create a setup where distance could influence the brightness of an LED, making it visually engaging. Additionally, I incorporated a pushbutton as the digital input to provide manual control over another LED, allowing for a tactile interaction. This combination of sensors and LEDs was chosen to demonstrate how analog and digital inputs can work together in a creative and functional way.

Process

  1. Component Selection: I gathered an Arduino board, an Ultrasonic Sensor (HC-SR04), LEDs, Resistors, Pushbutton Switch, and Jumper Wires

  2. Circuit Assembly: I connected the ultrasonic sensor to the Arduino, ensuring proper wiring for its VCC, GND, TRIG, and ECHO pins. I connected the pushbutton switch to one of the digital pins on the Arduino with an internal pull-up resistor. I wired two LEDs: one LED was connected to a PWM pin for analog brightness control; the other LED was connected to a digital pin for simple on/off functionality.

  3. Code Development: I wrote Arduino code that: Reads distance data from the ultrasonic sensor; maps the distance data to control the brightness of one LED using PWM; reads input from the pushbutton switch to toggle another LED on or off. The code also included debugging statements for monitoring sensor data via the Serial Monitor.

  4. Calibration: I tested and calibrated the ultrasonic sensor by experimenting with different distance thresholds. This involved adjusting the mapping range for brightness control and ensuring accurate detection of distances within 100 cm. For the pushbutton, I verified its functionality by toggling the digital LED on and off during testing.

Challenges

  1. Sensor Accuracy: The ultrasonic sensor occasionally gave inconsistent readings due to interference or non-flat surfaces. To address this, I ensured proper alignment of objects in front of the sensor during testing

  2. False Triggers: Early versions of the code caused unintended behavior due to incorrect wiring and delays in signal processing. I resolved this by carefully re-checking connections and optimizing delay times in the code

  3. Brightness Mapping: Mapping distance values (0–100 cm) to PWM brightness (0–255) required fine-tuning to achieve smooth transitions in LED brightness.

Potential Improvements

  1. Multiple Sensors: Adding more ultrasonic sensors could allow for multi-directional distance sensing, enabling more complex interactions.

  2. Enhanced Visual Feedback: Using RGB LEDs instead of single-color LEDs could provide more dynamic visual responses based on distance or button presses.

  3. Energy Efficiency: Exploring low-power modes and more efficient components could extend battery life for portable applications.

Schematic Diagram

Source Code

// Pin assignments
const int trigPin = 7;       // Trig pin for ultrasonic sensor
const int echoPin = 6;       // Echo pin for ultrasonic sensor
const int buttonPin = 2;     // Digital pin for pushbutton
const int ledAnalogPin = 9;  // PWM pin for analog-controlled LED
const int ledDigitalPin = 13; // Digital pin for on/off LED

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(buttonPin, INPUT_PULLUP); // Enable internal pull-up resistor
  pinMode(ledDigitalPin, OUTPUT);
  pinMode(ledAnalogPin, OUTPUT);

  Serial.begin(9600); // For debugging
}

void loop() {
  // Measure distance using ultrasonic sensor
  long duration, distance;
  
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  
  digitalWrite(trigPin, LOW);
  
  duration = pulseIn(echoPin, HIGH);
  
  // Calculate distance in centimeters
  distance = duration * 0.034 / 2;

  // Map distance (0-100 cm) to PWM range (0-255)
  int brightness = map(distance, 0, 100, 0, 255);
  brightness = constrain(brightness, 0, 255); // clamp readings beyond 100 cm so the LED value doesn't wrap around

  // Control brightness of analog-controlled LED
  analogWrite(ledAnalogPin, brightness);

  // Read pushbutton state
  int buttonState = digitalRead(buttonPin);

  // Toggle digital-controlled LED based on button press
  if (buttonState == LOW) { // Button pressed
    digitalWrite(ledDigitalPin, HIGH);
  } else {
    digitalWrite(ledDigitalPin, LOW);
  }

  // Debugging output
  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.print(" cm | Brightness: ");
  Serial.println(brightness);

  delay(100); // Small delay for stability
}

Physical Computing’s Greatest Hits (and misses)

The text explores recurring themes in physical computing projects and makes a strong case for embracing these ideas, even if they’ve been done before. I really appreciated the encouragement to view these themes as starting points for originality rather than dismissing them as unoriginal. It’s easy to feel discouraged when you think your idea isn’t new, but this perspective reframes those recurring concepts as opportunities to innovate and personalize. For example, the section on theremin-like instruments stood out to me. While the basic idea of waving your hand over a sensor to create music might seem simple, the challenge lies in adding meaningful gestures or context to make it more expressive and unique. That idea of taking something familiar and pushing it further really resonates with me.

One theme that caught my attention was the “gloves” section, particularly the drum glove. I love how it builds on a natural human behaviour, tapping rhythms with your fingers, and turns it into something interactive and fun. The text points out that gloves already come with a gestural language we understand, which makes them accessible while still offering room for creativity. I started imagining how I could expand on this concept, maybe by incorporating haptic feedback or connecting the gloves to other devices for collaborative performances. It’s exciting to think about how these simple ideas can evolve into something much bigger.

That said, not all the themes felt equally engaging to me. For instance, while “video mirrors” are visually appealing, the text acknowledges that they lack structured interaction and often devolve into simple hand-waving. I agree with this critique: while they’re beautiful, they don’t feel as immersive or meaningful as something like “body-as-cursor” projects, which involve more dynamic movement and interaction. It made me think about how important it is to balance aesthetics with functionality in interactive design.

Overall, this text inspired me to see recurring project ideas not as limitations but as creative challenges. It also got me thinking about how physical computing bridges the gap between art and technology in such playful and human ways. Moving forward, I want to approach my own projects with this mindset- taking familiar concepts and finding ways to make them personal, meaningful, and interactive. There’s so much potential in these themes, and I’m excited to explore where they could lead.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

This text really made me rethink the way I approach art and creativity. I love how it frames interactive art as a conversation between the artist and the audience, rather than a one-sided statement. The idea that the audience completes the work through their actions is so refreshing: it feels collaborative and alive. It reminds me of immersive installations like Yayoi Kusama’s “Infinity Rooms,” where the experience is deeply personal and shaped by how each person interacts with the space. I’m drawn to this idea of stepping back as an artist and letting others bring their own perspectives to the work.

At the same time, I struggle with the idea of completely “getting out of their way.” While I understand the importance of leaving room for interpretation, I think too little guidance can leave people feeling lost or disconnected. The text mentions giving hints or context, but it doesn’t really explain how much is enough. For example, if an interactive piece doesn’t make it clear what’s meant to be touched or explored, people might misinterpret it or feel unsure about how to engage. I think there needs to be a balance: enough structure to guide people without taking away their freedom to explore.

This text really got me reflecting on my own creative process. It made me think about how interactive art mirrors other forms of storytelling, like theatre or video games, where the audience or players shape the experience. I love the idea of creating something that invites others to bring their own interpretations, but I also want to make sure they feel welcomed into that process. It’s definitely something I’ll keep in mind as I work on my own projects: how to create that balance between structure and freedom so that my work feels open but still accessible.

Assignment 8: Unique Switches

This is my body-based switch, which turns on when the ultrasonic sensor detects a barrier within its range.

Assignment Brief

  • Create an unusual switch that doesn’t require the use of your hands
  • This should be a switch that opens and closes a circuit
  • You should use digitalRead in the Arduino to get the state of your switch and do something based on that information

Conceptualisation

The idea for this project emerged from my desire to create something interactive, something that responds to input and only works when someone actively chooses to keep using it. I considered a domino-effect game where an initial topple or the roll of a marble causes a chain reaction that connects the contact points. However, I decided to take advantage of the equipment provided instead and use the motion sensor. By choosing the motion sensor, I hoped to make the experience more interactive and sustainable, as people can enjoy a degree of interactivity without needing to reset the entire set-up every time it is used.

Process

  1. Component Selection: I gathered an Arduino board, an ultrasonic sensor (HC-SR04), LEDs, resistors, and jumper wires

  2. Circuit Assembly: I carefully wired the ultrasonic sensor to the Arduino, ensuring proper connections for power, ground, trigger, and echo pins. I then connected the LEDs to digital pins on the Arduino through a current-limiting resistor

  3. Code Development: I wrote Arduino code to control the ultrasonic sensor and LED. The code sends out ultrasonic pulses, measures the time for the echo to return, calculates the distance, and turns the LED on or off based on that distance

  4. Calibration: I experimented with different distance thresholds to determine the optimal range for triggering the LED. This involved repeated testing and code adjustments

Challenges

  1. Sensor Accuracy: The sensor is limited by the hardware it is made from. Signals may be deflected off angled surfaces and never return to the sensor, so it works reliably only against flat surfaces

  2. False Triggers: Early versions of the system would sometimes trigger in the wrong order due to mis-labelling and mis-wiring. I addressed this by adjusting the sensor’s sensitivity and implementing a minimum detection time to filter out momentary false positives

Potential Improvements

  1. Multiple Sensors: Incorporating additional ultrasonic sensors could create a more comprehensive detection field, allowing for directional awareness

  2. Variable LED Response: Instead of a simple on/off state, the LED brightness could vary based on the detected distance, creating a more nuanced interaction.

  3. Energy Efficiency: Exploring low-power modes and more efficient components could extend battery life for portable applications.

Source Code

const int echo = 13;
const int trig = 12;
const int ledPin = 8;   // LED driven by the distance reading; pin choice here is illustrative

int duration = 0;
int distance = 0;

void setup() 
{
  pinMode(trig, OUTPUT);
  pinMode(echo, INPUT);
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() 
{
  // send an ultrasonic trigger pulse
  digitalWrite(trig, HIGH);
  delayMicroseconds(1000);
  digitalWrite(trig, LOW);

  // time the echo and convert to centimetres (~28.5 microseconds per cm, each way)
  duration = pulseIn(echo, HIGH);
  distance = (duration / 2) / 28.5;
  Serial.println(distance);

  // treat a nearby object as the switch being "closed" (the 30 cm threshold is an assumed value)
  if (distance > 0 && distance < 30) {
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
  }
}
}