Assignment 10: Make a musical instrument

This is my Melodic Button Machine. It uses three push buttons (digital sensors) and a potentiometer (analog sensor) to create a simple, playful musical instrument. Each button plays a different musical note, while the potentiometer allows the player to bend the pitch of the note in real time- much like a musician bending a guitar string or sliding a trombone.

Machine Shown in Class

Assignment Brief

The assignment challenged us to create a musical instrument using Arduino technology. The requirements were clear: incorporate at least one digital sensor (such as a switch or button) and at least one analog sensor (like a potentiometer, photoresistor, or distance sensor). The instrument should respond to user input in a way that is both interactive and expressive.

Conceptualisation

The idea for this project emerged from my fascination with the simplicity of early electronic instruments. I remembered a childhood toy keyboard that could produce a handful of notes, and how magical it felt to create music with just a few buttons. I wanted to recreate that sense of wonder, but with a modern DIY twist. I also wanted to explore how analog and digital sensors could work together to give the user expressive control over the sound.

Process

Component Selection: I started by gathering the essential components: an Arduino Uno, a breadboard, three push buttons, a potentiometer, a piezo buzzer, jumper wires, and a handful of resistors. The buttons would serve as the digital inputs for note selection, while the potentiometer would act as the analog input to modulate pitch.

Circuit Assembly: The buttons were wired to digital pins 2, 3, and 4 on the Arduino, with internal pull-up resistors enabled in the code. The potentiometer’s middle pin was connected to analog pin A0, with its outer pins going to 5V and GND. The piezo buzzer was connected to digital pin 8, ready to bring the project to life with sound.

Code Development: I wrote Arduino code that assigned each button a specific musical note: C, D, or E. The potentiometer’s value was mapped to a pitch modulation range, so turning it would raise or lower the note’s frequency. This allowed for playful experimentation and made the effect of the potentiometer obvious and satisfying. I tested the code, tweaking the modulation range to make sure the pitch bend was dramatic and easy to hear.

Testing and Tuning: Once everything was wired up, I played simple tunes like “Mary Had a Little Lamb” and “Hot Cross Buns” by pressing the buttons in sequence. The potentiometer added a fun twist, letting me add vibrato or slides to each note.

Challenges

Pitch Range Calibration:
Finding the right modulation range for the potentiometer was tricky. If the range was too wide, the notes sounded unnatural; too narrow, and the effect was barely noticeable. After some trial and error, I settled on a ±100 Hz range for a musical yet expressive pitch bend.

Wiring Confusion:
With multiple buttons and sensors, it was easy to mix up wires on the breadboard. I solved this by colour-coding my jumper wires and double-checking each connection before powering up.

Potential Improvements

More Notes:
Adding more buttons would allow for a wider range of songs and melodies. With just three notes, the instrument can play simple tunes, but five or more would open up new musical possibilities.

Polyphony:
Currently, only one note can be played at a time. With some code modifications and additional hardware, I could allow for chords or overlapping notes.

Alternative Sensors:
Swapping the potentiometer for a light sensor or distance sensor could make the instrument even more interactive.

Visual Feedback:
Adding LEDs that light up with each button press or change colour with the pitch would make the instrument more visually engaging.

Schematics

Source Code

const int button1Pin = 2;
const int button2Pin = 3;
const int button3Pin = 4;
const int potPin = A0;
const int buzzerPin = 8;

// Define base frequencies for the three notes (C4, D4, E4)
const int noteC = 262;  // C4
const int noteD = 294;  // D4
const int noteE = 330;  // E4

void setup() {
  pinMode(button1Pin, INPUT_PULLUP);
  pinMode(button2Pin, INPUT_PULLUP);
  pinMode(button3Pin, INPUT_PULLUP);
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  int potValue = analogRead(potPin);
  // Map potentiometer to a modulation range
  int modulation = map(potValue, 0, 1023, -100, 100);

  if (digitalRead(button1Pin) == LOW) {
    tone(buzzerPin, noteC + modulation); // Button 1: C note, modulated
  } else if (digitalRead(button2Pin) == LOW) {
    tone(buzzerPin, noteD + modulation); // Button 2: D note, modulated
  } else if (digitalRead(button3Pin) == LOW) {
    tone(buzzerPin, noteE + modulation); // Button 3: E note, modulated
  } else {
    noTone(buzzerPin);
  }
}

A Brief Rant on the Future of Interaction Design

Reading “A Brief Rant on the Future of Interaction Design” genuinely made me pause and reconsider the direction of our relationship with technology. The author’s central argument- that today’s touchscreens, while innovative, are ultimately constraining- really struck a chord with me. I often marvel at how intuitive my phone is, yet I can’t help but notice how it reduces all my interactions to tapping and swiping on a flat surface. The analogy to early black-and-white cameras is particularly effective; just as those cameras were revolutionary yet obviously incomplete, so too are our current devices. The reference to Matti Bergström’s research about the richness of tactile sensation in our fingertips is compelling evidence that we are sacrificing a significant aspect of our cognitive and sensory development for convenience.

However, I found myself questioning the author’s rather dismissive stance on voice interfaces and gesture controls. While I agree that voice commands can’t replace the hands-on nature of artistic or spatial work, I think the author underestimates their value. Personally, I find voice assistants incredibly useful for multitasking or quick information retrieval- tasks where speed and efficiency matter more than depth of interaction. The author’s point about deep understanding requiring active exploration is well taken, but I believe there’s space for a variety of input methods, each suited to different contexts.

The text also made me reflect on the broader consequences of how we use technology. The idea that we are “giving up on the body” by designing tools that encourage us to be sedentary is quite thought-provoking. I hadn’t previously considered how interface design could contribute to a less physically engaged lifestyle. This perspective encourages me to be more mindful of how I use technology, and to seek out more physically interactive experiences where possible.

In summary, this rant has prompted me to think more deeply about what I want from the future of interaction design. While I appreciate the accessibility and simplicity of modern devices, I agree that we shouldn’t settle for tools that limit our physical and intellectual potential. The text serves as a powerful reminder that technology should enhance our full range of human abilities, not just cater to convenience.

Assignment 9: Digital and Analogue Detection

This is my analogue and digital sensing machine. It uses an ultrasonic sensor and a push button to change the brightness of LEDs and to toggle them on/off.

Assignment Brief

  • Get information from at least one analogue sensor and at least one digital sensor
  • Use this information to control at least two LEDs, one in a digital fashion and the other in an analog fashion
  • Include a hand-drawn schematic in your documentation

Conceptualisation

The idea for this project was inspired by the desire to create an interactive system that responds to both user input and environmental conditions. I wanted to design a setup where users could actively engage with the system while also observing dynamic feedback. By using an ultrasonic sensor as the analog input, I aimed to create a setup where distance could influence the brightness of an LED, making it visually engaging. Additionally, I incorporated a pushbutton as the digital input to provide manual control over another LED, allowing for a tactile interaction. This combination of sensors and LEDs was chosen to demonstrate how analog and digital inputs can work together in a creative and functional way.

Process

  1. Component Selection: I gathered an Arduino board, an Ultrasonic Sensor (HC-SR04), LEDs, Resistors, Pushbutton Switch, and Jumper Wires

  2. Circuit Assembly: I connected the ultrasonic sensor to the Arduino, ensuring proper wiring for its VCC, GND, TRIG, and ECHO pins. I connected the pushbutton switch to one of the digital pins on the Arduino with an internal pull-up resistor. I wired two LEDs: one LED was connected to a PWM pin for analog brightness control; the other LED was connected to a digital pin for simple on/off functionality.

  3. Code Development: I wrote Arduino code that reads distance data from the ultrasonic sensor, maps that distance to the brightness of one LED using PWM, and reads the pushbutton switch to toggle the other LED on or off. The code also included debugging statements for monitoring sensor data via the Serial Monitor.

  4. Calibration: I tested and calibrated the ultrasonic sensor by experimenting with different distance thresholds. This involved adjusting the mapping range for brightness control and ensuring accurate detection of distances within 100 cm. For the pushbutton, I verified its functionality by toggling the digital LED on and off during testing.

Challenges

  1. Sensor Accuracy: The ultrasonic sensor occasionally gave inconsistent readings due to interference or non-flat surfaces. To address this, I ensured proper alignment of objects in front of the sensor during testing

  2. False Triggers: Early versions of the code caused unintended behavior due to incorrect wiring and delays in signal processing. I resolved this by carefully re-checking connections and optimizing delay times in the code

  3. Brightness Mapping: Mapping distance values (0–100 cm) to PWM brightness (0–255) required fine-tuning to achieve smooth transitions in LED brightness.

Potential Improvements

  1. Multiple Sensors: Adding more ultrasonic sensors could allow for multi-directional distance sensing, enabling more complex interactions.

  2. Enhanced Visual Feedback: Using RGB LEDs instead of single-color LEDs could provide more dynamic visual responses based on distance or button presses.

  3. Energy Efficiency: Exploring low-power modes and more efficient components could extend battery life for portable applications.

Schematic Diagram

Source Code

// Pin assignments
const int trigPin = 7;       // Trig pin for ultrasonic sensor
const int echoPin = 6;       // Echo pin for ultrasonic sensor
const int buttonPin = 2;     // Digital pin for pushbutton
const int ledAnalogPin = 9;  // PWM pin for analog-controlled LED
const int ledDigitalPin = 13; // Digital pin for on/off LED

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  pinMode(buttonPin, INPUT_PULLUP); // Enable internal pull-up resistor
  pinMode(ledDigitalPin, OUTPUT);
  pinMode(ledAnalogPin, OUTPUT);

  Serial.begin(9600); // For debugging
}

void loop() {
  // Measure distance using ultrasonic sensor
  long duration, distance;
  
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  
  digitalWrite(trigPin, LOW);
  
  duration = pulseIn(echoPin, HIGH);
  
  // Calculate distance in centimeters
  distance = duration * 0.034 / 2;

  // Map distance (0-100 cm) to PWM range (0-255), clamping readings beyond 100 cm
  int brightness = constrain(map(distance, 0, 100, 0, 255), 0, 255);

  // Control brightness of analog-controlled LED
  analogWrite(ledAnalogPin, brightness);

  // Read pushbutton state
  int buttonState = digitalRead(buttonPin);

  // Toggle digital-controlled LED based on button press
  if (buttonState == LOW) { // Button pressed
    digitalWrite(ledDigitalPin, HIGH);
  } else {
    digitalWrite(ledDigitalPin, LOW);
  }

  // Debugging output
  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.print(" cm | Brightness: ");
  Serial.println(brightness);

  delay(100); // Small delay for stability
}

Physical Computing’s Greatest Hits (and misses)

The text explores recurring themes in physical computing projects and makes a strong case for embracing these ideas, even if they’ve been done before. I really appreciated the encouragement to view these themes as starting points for originality rather than dismissing them as unoriginal. It’s easy to feel discouraged when you think your idea isn’t new, but this perspective reframes those recurring concepts as opportunities to innovate and personalize. For example, the section on theremin-like instruments stood out to me. While the basic idea of waving your hand over a sensor to create music might seem simple, the challenge lies in adding meaningful gestures or context to make it more expressive and unique. That idea of taking something familiar and pushing it further really resonates with me.

One theme that caught my attention was the “gloves” section, particularly the drum glove. I love how it builds on a natural human behaviour- tapping rhythms with your fingers- and turns it into something interactive and fun. The text points out that gloves already come with a gestural language we understand, which makes them accessible while still offering room for creativity. I started imagining how I could expand on this concept, maybe by incorporating haptic feedback or connecting the gloves to other devices for collaborative performances. It’s exciting to think about how these simple ideas can evolve into something much bigger.

That said, not all the themes felt equally engaging to me. For instance, while “video mirrors” are visually appealing, the text acknowledges that they lack structured interaction and often devolve into simple hand-waving. I agree with this critique- while they’re beautiful, they don’t feel as immersive or meaningful compared to something like “body-as-cursor” projects, which involve more dynamic movement and interaction. It made me think about how important it is to balance aesthetics with functionality in interactive design.

Overall, this text inspired me to see recurring project ideas not as limitations but as creative challenges. It also got me thinking about how physical computing bridges the gap between art and technology in such playful and human ways. Moving forward, I want to approach my own projects with this mindset- taking familiar concepts and finding ways to make them personal, meaningful, and interactive. There’s so much potential in these themes, and I’m excited to explore where they could lead.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

This text really made me rethink the way I approach art and creativity. I love how it frames interactive art as a conversation between the artist and the audience, rather than a one-sided statement. The idea that the audience completes the work through their actions is so refreshing- it feels collaborative and alive. It reminds me of immersive installations like Yayoi Kusama’s “Infinity Rooms,” where the experience is deeply personal and shaped by how each person interacts with the space. I’m drawn to this idea of stepping back as an artist and letting others bring their own perspectives to the work.

At the same time, I struggle with the idea of completely “getting out of their way.” While I understand the importance of leaving room for interpretation, I think too little guidance can leave people feeling lost or disconnected. The text mentions giving hints or context, but it doesn’t really explain how much is enough. For example, if an interactive piece doesn’t make it clear what’s meant to be touched or explored, people might misinterpret it or feel unsure about how to engage. I think there needs to be a balance- enough structure to guide people without taking away their freedom to explore.

This text really got me reflecting on my own creative process. It made me think about how interactive art mirrors other forms of storytelling, like theater or video games, where the audience or players shape the experience. I love the idea of creating something that invites others to bring their own interpretations, but I also want to make sure they feel welcomed into that process. It’s definitely something I’ll keep in mind as I work on my own projects- how to create that balance between structure and freedom so that my work feels open but still accessible.

Assignment 8: Unique Switches

This is my body-based switch, which turns on when the ultrasonic sensor detects a barrier within its range.

Assignment Brief

  • Create an unusual switch that doesn’t require the use of your hands
  • This should be a switch that opens and closes a circuit
  • You should use digitalRead in the Arduino to get the state of your switch and do something based on that information

Conceptualisation

The idea for this project emerged from my desire to create something interactive- something that responds to input and only works while someone actively chooses to keep using it. I initially considered a domino-effect game, where an initial topple or rolling marble would set off a chain reaction that closed the contact points. Instead, I decided to take advantage of the equipment provided and use the ultrasonic sensor. By choosing the ultrasonic sensor, I hoped to make the experience more interactive and sustainable, since people can enjoy a degree of interactivity without needing to reset the entire set-up every time it is used.

Process

  1. Component Selection: I gathered an Arduino board, an ultrasonic sensor (HC-SR04), LEDs, resistors, and jumper wires

  2. Circuit Assembly: I carefully wired the ultrasonic sensor to the Arduino, ensuring proper connections for power, ground, trigger, and echo pins. I then connected the LEDs to digital pins on the Arduino through a current-limiting resistor

  3. Code Development: I wrote Arduino code to control the ultrasonic sensor and LED. The code sends out ultrasonic pulses, measures the time for the echo to return, calculates the distance, and turns the LED on or off based on that distance

  4. Calibration: I experimented with different distance thresholds to determine the optimal range for triggering the LED. This involved repeated testing and code adjustments

Challenges

  1. Sensor Accuracy: The sensor is limited by its hardware. Pulses can be deflected off angled or irregular surfaces and never return to the receiver, so it works reliably only against flat surfaces

  2. False Triggers: Early versions of the system would sometimes trigger incorrectly due to mis-labelled and mis-wired connections. I addressed this by adjusting the sensor’s sensitivity and implementing a minimum detection time to filter out momentary false positives

Potential Improvements

  1. Multiple Sensors: Incorporating additional ultrasonic sensors could create a more comprehensive detection field, allowing for directional awareness

  2. Variable LED Response: Instead of a simple on/off state, the LED brightness could vary based on the detected distance, creating a more nuanced interaction.

  3. Energy Efficiency: Exploring low-power modes and more efficient components could extend battery life for portable applications.

Source Code

const int echo = 13;
const int trig = 12;
const int ledPin = 8;        // LED driven by the switch (pin chosen here for illustration)
const int thresholdCm = 20;  // Distance below which the "switch" closes

long duration = 0;
long distance = 0;

void setup() 
{
  pinMode(trig, OUTPUT);
  pinMode(echo, INPUT);
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() 
{
  // Send a clean 10-microsecond trigger pulse (per the HC-SR04 datasheet)
  digitalWrite(trig, LOW);
  delayMicroseconds(2);
  digitalWrite(trig, HIGH);
  delayMicroseconds(10);
  digitalWrite(trig, LOW);

  // Echo time is a round trip; ~28.5 microseconds per centimetre each way
  duration = pulseIn(echo, HIGH);
  distance = (duration / 2) / 28.5;
  Serial.println(distance);

  // Turn the LED on when a barrier is detected within range
  if (distance > 0 && distance < thresholdCm) {
    digitalWrite(ledPin, HIGH);
  } else {
    digitalWrite(ledPin, LOW);
  }
}

Her Code Got Humans on the Moon

I think this is a fascinating exploration of Margaret Hamilton’s groundbreaking work in software engineering and it shows how crucial her contributions to NASA’s Apollo program were.

First and foremost, what stands out immediately is how Hamilton defied the gender expectations of her time, becoming a pioneer in a male-dominated field during the 1960s. She worked and flourished in an environment where she was often the only woman in the room- one she could hardly have been blamed for finding difficult. Hamilton’s story isn’t just about coding; it’s about breaking barriers and reshaping perceptions of women’s roles in science and technology. The image of Hamilton working late nights with her young daughter sleeping nearby in the office is particularly striking, illustrating the challenges and determination of working mothers in STEM fields.

One of the most interesting aspects of the piece is how it reveals the evolution of software’s importance in the space program. Initially overlooked in budgets and schedules, software quickly became central to Apollo’s success. Hamilton’s work not only helped land humans on the moon but also laid the foundation for modern software engineering practices. The anecdote about her daughter accidentally crashing the simulator, leading to Hamilton’s push for error-checking code, is a powerful example of how real-world experiences informed critical technological developments.

The article also effectively conveys the high-stakes nature of Hamilton’s work. The description of the Apollo 11 landing, where Hamilton’s software prioritisation system proved crucial, highlights how her foresight and technical arguments literally saved the mission. This incident underscores the vital role of robust, well-designed software in complex systems- a principle that remains relevant today.

Norman, “Emotion & Design: Attractive things work better”

I found this to be a very thought-provoking piece that everyone should read to understand the world that they experience. What stands out is Norman’s emphasis on the power of positive effect in design. The idea that positive emotions can make complex tasks feel easier is both intuitive and profound. It’s not just about making things look pretty; it’s about creating an emotional connection that enhances the user’s ability to interact with and utilise a product effectively. This principle is beautifully illustrated through examples like the colour displays in computers, which were once considered superfluous but later became essential for user experience.

I found Norman’s work to be groundbreaking in how it challenges the traditional separation between form and function. By arguing that attractive things actually work better, he’s pushing designers to think beyond mere aesthetics or pure functionality. His concept of the three levels of design- visceral, behavioural, and reflective- provides a comprehensive approach to understanding how users interact with products on multiple levels, from immediate visual appeal to long-term emotional connection. This framework helps designers create products that not only look good but also perform well and foster a lasting bond with users.

One of the most compelling aspects of Norman’s approach is how he integrates cognitive science with design principles. By explaining how our brains process information and how emotions affect our cognitive abilities, he provides a scientific basis for why attractive design matters. This interdisciplinary approach elevates the discussion beyond subjective taste to a more rigorous understanding of human-product interaction. The teapot examples he uses are particularly effective in illustrating the spectrum from purely aesthetic to purely functional design, and how the best designs find a sweet spot that satisfies both. These examples vividly demonstrate how form and function can be harmoniously combined.

The idea that attractive interfaces can make complex software feel more approachable is particularly relevant in our increasingly digital world. Norman’s work predates the smartphone era, yet his insights remain remarkably applicable today. His emphasis on creating products that are not just functional or beautiful, but emotionally resonant and truly user-centered, is a holistic view of user experience that is often overlooked but crucial for success.

This reading is an integration of cognitive science, emotion, and design principles, which cumulatively offer a unique perspective that underscores the importance of balancing aesthetics and usability. By recognising that attractive things indeed work better, designers can create products that are both beautiful and functional, ultimately enhancing user satisfaction and engagement.

Midterm Project

🎉Project Description

“Aura Farming” is an interactive game inspired by Dragon Ball Z, where players take control of Goku and help him transform into a Super Saiyan by rapidly pressing the spacebar. The game challenges players to increase Goku’s power level within a 30-second time limit, visually represented by the growth of his aura and transformation into his iconic Super Saiyan form. The game combines dynamic animations, sound effects, and scoring mechanics to create an engaging experience.

The gameplay revolves around pressing the spacebar to “power up” Goku, increasing his power level and triggering visual changes such as aura expansion and sprite animations. Once Goku’s power level exceeds 9000, he transforms into a Super Saiyan, with golden aura effects and looping animations representing his heightened state.

📋Assignment Brief

The assignment required creating an interactive artwork or game using p5.js while incorporating certain elements:

The game responds to user input via:

  • Pressing the spacebar to increase Goku’s power level.
  • Clicking the mouse to navigate between game states (e.g., start screen, instructions screen, and score page).
  • The score page allows restarting the game by clicking anywhere on the screen.

The aura visuals are created using shapes:

  • A combination of ellipses and vertex-based polygons.
  • The aura dynamically grows in size as Goku’s power level increases.

Multiple images (sprites) are used throughout the game:

  • Goku’s base form transformation is represented by 8 sprites, which cycle during animation.
  • Super Saiyan mode is represented by 6 sprites, which loop continuously while in that state.
  • Background images are used for different screens (start screen, instructions screen, gameplay, and score page).

The project includes both sound effects and background music:

  • Background music plays during different game states:
    • Start screen: Non-battle music.
    • Gameplay: Battle music.
    • Score page: Victory music.
  • Sound effects play when powering up (e.g., pressing the spacebar).

On-screen text is used for:

  • Instructions on the start screen and instructions screen.
  • Displaying Goku’s current power level during gameplay.
  • Showing the final score and high score on the score page.
  • Feedback such as “Time Left” during gameplay.

The game utilizes OOP principles through the Character class:

  • Encapsulates properties such as powerLevel, auraSize, and isSuperSaiyan.
  • Includes functions such as powerUp(), drawEnergyAura(), and transformToSuperSaiyan().
  • Handles animations by cycling through arrays of sprites based on game state.

After the experience is completed, there is a way to start a new session (without restarting the sketch)

  • The experience starts with a start screen that waits for user input (mouse click) to proceed to the instructions or gameplay screen.
  • After gameplay ends (30-second timer), a score page is displayed, showing the player’s final score and high score.
  • Players can restart the game from the score page without needing to reload or restart the sketch.

💻Process

Initial Setup and Planning

Initially, the project began with the idea of creating a simple interactive game inspired by Dragon Ball Z. The goal was to replicate Goku’s iconic transformation into Super Saiyan while providing players with an engaging challenge through rapid key presses.

  • Defining Core Mechanics: The primary mechanic involved pressing the spacebar to increase Goku’s power level, which would trigger visual changes (aura growth and sprite animations) and sound effects.
  • Outlining Game States: The game required distinct states: a start screen, instructions screen, gameplay screen, and score page. Each state would serve a specific purpose in the user experience.
  • Sketching Designs: Basic designs for Goku’s aura effects were sketched out to guide development.

Loading Assets

The next step was to load all necessary assets- images, sounds, and fonts- into the project using the preload() function. This was crucial to ensure smooth gameplay without runtime delays. 

  • Arrays of images were created for Goku’s base form transformation (8 frames) and Super Saiyan mode (6 frames). With a for loop, I was able to dynamically load these images into arrays powerUpFrames and superSaiyanFrames, reducing repetitive code.
for (let i = 1; i <= 8; i++) {
  powerUpFrames.push(loadImage(`/images/Base Goku- Sprite ${i}.png`));
}
for (let j = 1; j <= 6; j++) {
  superSaiyanFrames.push(loadImage(`/images/Super Saiyan Goku- Sprite ${j}.png`));
}
  • Four different backgrounds were loaded for the start screen, instructions screen, gameplay screen, and score page.
  • Background music tracks were loaded for each game state (non-battle music, battle music, victory music), along with sound effects for powering up.

Creating the Character Class

To encapsulate Goku’s properties and behaviors, I created the Character class. This class served as the foundation for managing animations, aura effects, and transformations. Key features of the class included:

  • powerLevel: Tracks Goku’s current power level.
  • auraSize: Controls the size of Goku’s aura.
  • isSuperSaiyan: Boolean flag indicating whether Goku is in Super Saiyan mode.
  • Arrays for animation frames (powerUpFrames and superSaiyanFrames) were used to manage sprite animations.
  • powerUp(): Increases Goku’s power level and aura size when the spacebar is pressed.
  • transformToSuperSaiyan(): Triggers Goku’s transformation into Super Saiyan once his power level exceeds 9000.
  • drawEnergyAura(): Dynamically draws Goku’s aura using shapes (ellipses and vertex-based polygons) with Perlin noise for organic movement.
  • reset(): Resets all properties to their initial values when restarting the game.
class Character {
  constructor(name, x, y) {
    // Name of the character (e.g., "Goku").
    this.name = name;
    // X and Y coordinates for positioning the character on the canvas.
    this.x = x;
    this.y = y;
    // Tracks the current power level of the character.
    this.powerLevel = 0;
    // Multiplier used to increase power level faster during Super Saiyan mode.
    this.powerLevelMultiplier = 1.0;
    // Controls the size of the aura surrounding the character.
    this.auraSize = 0;
    // Stores the current sprite image for the character (e.g., normal or Super Saiyan).
    this.sprite = null;
    // Scale factor used to resize sprites when drawing them on the canvas.
    this.scaleFactor = 0.5;
    // Number of points used to create the aura shape (polygon).
    this.auraPoints = 12;
    // Offset value for Perlin noise, used to create organic movement in the aura.
    this.auraNoiseOffset = random(1000);
    // Offset value for aura color animation (e.g., cycling between yellow and red).
    this.auraColorOffset = 50;
    // Boolean flag indicating whether the character is in Super Saiyan mode.
    this.isSuperSaiyan = false;
    // Delay between animation frames (controls animation speed).
    this.animFrameDelay = 1000;
  }

  // The display() method handles rendering the character and its associated visuals (sprite, aura, power level text).
  display() {
    // Draws Goku's energy aura using the drawEnergyAura() method.
    this.drawEnergyAura();

    // Determines which sprite to display based on Goku's state (normal, powering up, or Super Saiyan).
    let currentSprite;

    if (this.isSuperSaiyan) {
      // If Goku is in Super Saiyan mode, cycle through Super Saiyan animation frames.
      let index = floor(frameCount / 5) % superSaiyanFrames.length;
      currentSprite = superSaiyanFrames[index];
    } else if (this.powerLevel > 0) {
      // If Goku is powering up, cycle through base form transformation animation frames.
      let index = floor(frameCount / 5) % powerUpFrames.length;
      currentSprite = powerUpFrames[index];
    } else {
      // If no transformation is active, display Goku's normal sprite.
      currentSprite = normalSprite;
    }

    // Draws the selected sprite at Goku's position with scaling applied.
    let scaledWidth = currentSprite.width * this.scaleFactor;
    let scaledHeight = currentSprite.height * this.scaleFactor;
    image(currentSprite, this.x, this.y, scaledWidth, scaledHeight);

    // Displays Goku's current power level in a panel at the top of the screen.
    stroke(117, 117, 116);
    fill(149, 152, 173);
    rect(200, 70, 400, 150, 30);
    fill(255, 166, 0);
    rect(220, 80, 360, 130, 20);
    fill(255);
    textFont("Helvetica", 50);
    textAlign(CENTER);
    text("Power Level", 400, 120);
    text(`${this.powerLevel}`, 400, 180);
    noFill();
  }

  // The drawEnergyAura() method creates dynamic visuals for Goku's aura based on his power level and state.
  drawEnergyAura() {
    push(); // Saves the current drawing state.

    translate(this.x, this.y); // Moves the origin to Goku's position.

    if (this.isSuperSaiyan) {
      // If Goku is in Super Saiyan mode, use a golden color for his aura.
      fill(255, 215, 0, 150);
      stroke(255, 215, 0, 200);
    } else {
      // If Goku is in base form or powering up, use a yellow color for his aura.
      fill(255, 255, 0, 100);
      stroke(255, 255, 0, 150);
    }

    strokeWeight(2); // Sets the thickness of the aura outline.

    beginShape(); // Starts defining a custom shape for the aura.

    const baseRadius = this.auraSize * 0.8; // Base size of the aura shape.
    const spikeMagnitude = this.auraSize * 0.2; // Magnitude of spikes in the aura for a dynamic effect.

    // Loop through points to define the aura shape.
    for (let i = 0; i < this.auraPoints; i++) {
      const angle = map(i, 0, this.auraPoints, 0, TWO_PI); // Calculate angle for each point.
      const noiseVal = noise(
        this.auraNoiseOffset + i * 0.1 + frameCount * 0.05
      ); // Use Perlin noise for organic movement.
      const dynamicRadius = baseRadius + spikeMagnitude * noiseVal; // Adjust radius based on noise value.
      const variedAngle = angle + radians(map(noiseVal, 0, 1, -10, 10)); // Add randomness to angles.

      // Define each vertex of the aura shape.
      vertex(
        dynamicRadius * cos(variedAngle),
        dynamicRadius * sin(variedAngle)
      );
    }

    endShape(CLOSE); // Close the custom shape.

    // Update animation parameters for smooth movement.
    this.auraNoiseOffset += 0.02; // Increment noise offset to animate aura.
    this.auraColorOffset += 0.1; // Increment color offset for potential color cycling.

    pop(); // Restores the previous drawing state.
  }

  // The powerUp() method increases Goku's power level and aura size when called (e.g., pressing the spacebar).
  powerUp() {
    this.powerLevel += 100 * this.powerLevelMultiplier; // Increase power level based on multiplier.
    this.auraSize += 3; // Gradually increase the size of Goku's aura.
  }

  // The reset() method reverts Goku's properties to their initial values (used when restarting the game).
  reset() {
    this.powerLevel = 0; // Reset power level to zero.
    this.auraSize = 10; // Reset aura size to its default value.
    this.isSuperSaiyan = false; // Reset Super Saiyan mode to false.
    this.powerLevelMultiplier = 1.0; // Reset power level multiplier to its default value.
    this.sprite = normalSprite; // Set Goku's sprite back to his base form.
  }

  // The transformToSuperSaiyan() method triggers Goku's transformation into Super Saiyan mode.
  transformToSuperSaiyan() {
    this.isSuperSaiyan = true; // Set Super Saiyan mode to true.
    this.powerLevelMultiplier = 50; // Dramatically increase the power level multiplier (e.g., x50 boost).
    this.auraSize += 50; // Increase aura size significantly to reflect transformation.

    // Change Goku's sprite to his Super Saiyan form.
    this.sprite = superSaiyanSprite;
  }
}

Implementing Game States

The game required distinct states to guide players through the experience:

  • Start Screen: Displayed the game title (“Aura Farming”) along with buttons for starting the game or viewing instructions. Mouse clicks were used to navigate between screens.
  • Instructions Screen: Provided players with guidance on how to play the game (e.g., “Press SPACEBAR rapidly to power up”).
  • Gameplay Screen: Served as the main interactive phase where players pressed the spacebar to increase Goku’s power level within a 30-second timer.
  • Score Page: Displayed the player’s final score, high score, and an option to restart the game.

State management logic was implemented in the draw() function using boolean flags (startScreen, instructionScreen, gameScreen, scorePage) to control which screen was displayed.
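That flag-based dispatch can be sketched as a small helper that maps the flags to the screen to render. This is a simplified illustration: the helper name currentScreen and the string labels are hypothetical, and only the four flag names come from the sketch itself.

```javascript
// Simplified sketch of the flag-based state dispatch used in draw().
// The flag names (startScreen, instructionScreen, gameScreen, scorePage)
// match the sketch; currentScreen() is a hypothetical helper.
function currentScreen(flags) {
  if (flags.startScreen) return "start";
  if (flags.instructionScreen) return "instructions";
  if (flags.gameScreen) return "game";
  if (flags.scorePage) return "score";
  return "start"; // fall back to the start menu if no flag is set
}

// Inside draw(), the result would select which render helper runs, e.g.:
// switch (currentScreen(flags)) { case "start": drawStartScreen(); break; ... }
```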

Adding Animations

An array of 8 sprites (powerUpFrames) represented Goku powering up, and an array of 6 sprites (superSaiyanFrames) represented his Super Saiyan form.

The display() method cycled through these frames using frameCount:

let index = floor(frameCount / 5) % powerUpFrames.length;
currentSprite = powerUpFrames[index];

Integrating Sound Effects and Music

To enhance immersion, sound effects and background music were added using p5.js sound functions:

  • Background music tracks were assigned to each game state (e.g., non-battle music on start screen, battle music during gameplay).
  • A powering-up sound effect played whenever the spacebar was pressed. Conditional checks (isPlaying()) ensured sounds did not overlap or restart unnecessarily.
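The overlap guard can be sketched as a small wrapper around p5.sound's isPlaying()/play() pair. The function name playIfIdle is a hypothetical illustration; sound is any object exposing that interface.

```javascript
// Guard that only starts a sound when it is not already playing,
// mirroring the isPlaying() check described above. `sound` is any
// object exposing p5.sound's isPlaying()/play() methods.
function playIfIdle(sound) {
  if (!sound.isPlaying()) {
    sound.play();
    return true; // sound was started
  }
  return false; // already playing; do not restart it
}
```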

Implementing Scoring System

A scoring system was added to track the player’s performance:

    • The player’s current power level was displayed during gameplay.
    • After gameplay ended (30-second timer), a score page displayed:
      • Final score based on Goku’s highest power level achieved.
      • High score tracked across multiple plays in the same session using an array (previousScores).
let previousScores = [];
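Building on that array, the bookkeeping can be sketched as a pair of helpers; recordScore and highScore are hypothetical names for illustration, not functions from the game code.

```javascript
// Sketch of the high-score bookkeeping: push each session's final
// score into previousScores and derive the high score from the array.
let previousScores = [];

function recordScore(score) {
  previousScores.push(score);
}

function highScore() {
  // Math.max over the array; 0 when no games have been played yet.
  return previousScores.length ? Math.max(...previousScores) : 0;
}
```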


Restart Functionality

The final milestone involved adding functionality to restart the game without reloading or restarting the sketch:

  • Clicking anywhere on the score page reset all properties (e.g., power level, aura size) using the reset() method in the Character class.
  • State flags were updated to return players to the start screen.

🚩Challenges Faced

Asset Loading Issues: One of the initial challenges was loading images correctly due to incorrect file paths or naming conventions, which caused sprites to fail to appear in the game.
Solution: Carefully checked file paths and ensured consistency in naming conventions.

Animation Logic: Initially struggled with sprite animations cycling incorrectly or not looping smoothly during transformations.
Solution: Adjusted animation logic in the display() method of the Character class to cycle frames based on frameCount, ensuring smooth transitions.

Sound Overlapping: Sounds repeatedly played or overlapped when keys were pressed rapidly during gameplay.
Solution: Used conditional checks (isPlaying()) before playing sounds to prevent overlapping audio playback.

Game Restart Logic: Ensuring that all game states reset properly when restarting from the score page was challenging.
Solution: Added a reset() method in the Character class to revert all properties (e.g., power level, aura size) to their initial values.


function mouseClicked() {
  if (
    startScreen == true &&
    mouseX > 270 &&
    mouseX < 530 &&
    mouseY > 310 &&
    mouseY < 390
  ) {
    gameScreen = true;
    startScreen = false;
    gameStartTime = millis(); // Record the starting time of the game
  }

  if (
    startScreen == true &&
    mouseX > 170 &&
    mouseX < 630 &&
    mouseY > 510 &&
    mouseY < 590
  ) {
    instructionScreen = true;
    startScreen = false;
  }

  if (
    instructionScreen == true &&
    mouseX > 340 &&
    mouseX < 460 &&
    mouseY > 610 &&
    mouseY < 650
  ) {
    instructionScreen = false;
    startScreen = true;
  }

  if (scorePage == true) {
    // Restart the game:
    victoryMusic.stop();
    scorePage = false;
    startScreen = true; // Return to the start menu
    player.reset(); // Reset player's power level and aura size
  }
}


📶Key Takeaways and Future Directions

This project taught and solidified many of the skills established in class:

    • Improved proficiency in handling animations using arrays of images and frame-based logic.
    • Enhanced problem-solving skills through debugging asset loading issues and refining animation transitions.
    • Learned how to manage sound effects effectively in interactive applications.
    • Developed a better understanding of OOP principles by structuring code with classes and methods.

I particularly enjoyed designing Goku’s transformation sequence into Super Saiyan; it was technically challenging yet extremely fulfilling when I finally got it working.

If I were to revisit this project, I would consider adding more dynamic gameplay elements, such as obstacles or enemies that interact with Goku’s aura. Another thing I would do is introduce multiple playable characters, such as Vegeta, Piccolo, or other iconic figures from Dragon Ball Z. Players could select their character on the start screen, with each character having unique abilities and animations. For example, Vegeta might power up faster but have a smaller aura, while Piccolo could regenerate power when idle. To implement this, I would extend the Character class to include properties like characterName, spriteSet, and specialAbility. This feature would add replayability and variety to the game, encouraging players to explore different characters and strategies.
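One way to sketch that extension is a lightweight subclass-style character with per-fighter tuning. Everything below is a hypothetical illustration: the class name PlayableCharacter, the parameter names, and the Vegeta numbers are assumptions, not code from the finished game.

```javascript
// Hypothetical sketch of per-character tuning for multiple playable
// fighters. powerGain/auraGain are illustrative knobs standing in for
// the characterName/spriteSet/specialAbility properties described above.
class PlayableCharacter {
  constructor(characterName, powerGain, auraGain) {
    this.characterName = characterName;
    this.powerGain = powerGain; // power added per spacebar press
    this.auraGain = auraGain;   // aura growth per press
    this.powerLevel = 0;
    this.auraSize = 0;
  }

  powerUp() {
    this.powerLevel += this.powerGain;
    this.auraSize += this.auraGain;
  }
}

// e.g. Vegeta might power up faster but with a smaller aura:
const vegeta = new PlayableCharacter("Vegeta", 150, 2);
```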

To visually reflect Goku’s increasing power level, I would implement dynamic backgrounds that change as the player progresses. At low levels, the background could feature a calm landscape like Planet Namek. At mid-levels, the sky could darken with storm clouds forming, while at high levels, lightning strikes and debris flying around could create a chaotic environment. This feature could be implemented by layering multiple background images and transitioning between them using alpha blending or animations triggered by power level thresholds. Dynamic backgrounds would add excitement to gameplay and visually emphasise the intensity of Goku’s transformation.
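The alpha-blending idea above could be sketched as a function mapping power level to the opacity of the stormy layer drawn over the calm backdrop. The thresholds (3000 and 9000) are illustrative assumptions, not values from the game.

```javascript
// Sketch of threshold-based background blending: map the power level
// to an alpha (0-255) for the "storm" layer drawn over the calm one.
// The 3000/9000 thresholds are illustrative, not from the game.
function stormAlpha(powerLevel) {
  if (powerLevel <= 3000) return 0;   // calm landscape only
  if (powerLevel >= 9000) return 255; // full storm layer
  // Linear fade between the two thresholds.
  return Math.round(((powerLevel - 3000) / 6000) * 255);
}

// In draw(), this would feed p5's tint(), e.g.:
// image(calmBg, 0, 0);
// tint(255, stormAlpha(player.powerLevel));
// image(stormBg, 0, 0);
// noTint();
```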

✅Delivered Project

The final version of Aura Farming successfully delivers:

    • A complete interactive experience where players help Goku transform into Super Saiyan by rapidly pressing the spacebar.
    • Smooth animations for both base form transformation (8 frames) and Super Saiyan mode (6 frames).
    • Dynamic aura visuals that grow organically as Goku powers up.
    • Background music tailored to different game states (start screen, gameplay, score page) along with sound effects for powering up.
    • A scoring system that tracks high scores across multiple play sessions.
    • Fully functioning start screen, instructions screen, gameplay timer (30 seconds), and score page with restart functionality.

Assignment 5: Midterm Project Update

I developed “Dragon Ball Z: Power Level Training,” an engaging and nostalgic game that captures the essence of the iconic anime series. This interactive experience allows players to step into the shoes of a Dragon Ball Z warrior, focusing on the thrilling power-up sequences that made the show so memorable. Players start with a low power level and, through rapid clicking, increase their strength while watching their character’s energy aura grow. The game features familiar visual and audio elements from the series, including character sprites, power level displays, and the unmistakable sound of powering up. As players progress, they encounter milestones that pay homage to famous moments from the show, culminating in a final power-level goal that, when reached, declares the player a true warrior.

📋Assignment Brief

  • Make an interactive artwork or game using everything you have learned so far
  • Can have one or more users
  • At least one shape
  • At least one image
  • At least one sound
  • At least one on-screen text
  • Object Oriented Programming
  • The experience must start with a screen giving instructions and wait for user input (button / key / mouse / etc.) before starting
  • After the experience is completed, there must be a way to start a new session (without restarting the sketch)

💭Conceptualisation

The idea for “Dragon Ball Z: Power Level Training” was born from a deep appreciation for the iconic anime series and a desire to recreate its most thrilling moments in an interactive format. As a long-time fan of Dragon Ball Z, I’ve always been captivated by the intense power-up sequences that often served as turning points in epic battles. The image of characters like Goku, surrounded by a growing aura of energy as they pushed their limits, has become a defining element of the series.

This project concept emerged while rewatching classic Dragon Ball Z episodes, particularly those featuring transformations and power level increases. I was struck by how these moments, despite their simplicity, generated immense excitement and anticipation among viewers. I wanted to capture this essence and allow players to experience the rush of powering up firsthand. The idea evolved to focus on the visual and auditory aspects of powering up, combining the growing energy aura, rising power level numbers, and the distinctive sounds associated with these transformations.

By digitalizing this experience, I aimed to create an interactive homage to Dragon Ball Z that would resonate with fans and newcomers alike. The game’s design intentionally incorporates key visual elements from the series, such as the character sprites and power level displays, to evoke nostalgia while offering a fresh, interactive twist on the power-up concept. This project not only serves as a tribute to the series but also as an exploration of how iconic pop culture moments can be transformed into engaging interactive experiences.

💻Process

I practiced making classes for certain elements, as that is what I struggle with most. I created classes for Characters and the Auras around them. Through this, I solidified my ability with classes and now feel confident using them for even more features.

class Character {
  constructor(name, x, y) {
    this.name = name;
    this.x = x;
    this.y = y;
    this.powerLevel = 100; // Starting power level
    this.sprite = null; // Will hold the character's image
    this.aura = new Aura(this); // Create an aura for this character
    this.powerUpSound = null; // Will hold the power-up sound
  }

  // Load character sprite and power-up sound
  loadAssets(spritePath, soundPath) {
    // Load the sprite image
    loadImage(spritePath, img => {
      this.sprite = img;
    });
    // Load the power-up sound
    this.powerUpSound = loadSound(soundPath);
  }

  // Increase power level and grow aura
  powerUp() {
    this.powerLevel += 50;
    this.aura.grow();
    // Play power-up sound if loaded
    if (this.powerUpSound && this.powerUpSound.isLoaded()) {
      this.powerUpSound.play();
    }
  }

  // Display the character, aura, and power level
  display() {
    push(); // Save current drawing style
    this.aura.display(); // Display aura first (behind character)
    if (this.sprite) {
      imageMode(CENTER);
      image(this.sprite, this.x, this.y);
    }
    // Display character name and power level
    textAlign(CENTER);
    textSize(16);
    fill(255);
    text(`${this.name}: ${this.powerLevel}`, this.x, this.y + 60);
    pop(); // Restore previous drawing style
  }

  update() {
    // Add any character-specific update logic here
    // This could include animation updates, state changes, etc.
  }
}

class Aura {
  constructor(character) {
    this.character = character; // Reference to the character this aura belongs to
    this.baseSize = 100; // Initial size of the aura
    this.currentSize = this.baseSize;
    this.maxSize = 300; // Maximum size the aura can grow to
    this.color = color(255, 255, 0, 100); // Yellow, semi-transparent
    this.particles = []; // Array to hold aura particles
  }

  // Increase aura size and add particles
  grow() {
    this.currentSize = min(this.currentSize + 10, this.maxSize);
    this.addParticles();
  }

  // Add new particles to the aura
  addParticles() {
    for (let i = 0; i < 5; i++) {
      this.particles.push(new AuraParticle(this.character.x, this.character.y));
    }
  }

  // Display the aura and its particles
  display() {
    push(); // Save current drawing style
    noStroke();
    fill(this.color);
    // Draw main aura
    ellipse(this.character.x, this.character.y, this.currentSize, this.currentSize);
    
    // Update and display particles
    for (let i = this.particles.length - 1; i >= 0; i--) {
      this.particles[i].update();
      this.particles[i].display();
      // Remove dead particles
      if (this.particles[i].isDead()) {
        this.particles.splice(i, 1);
      }
    }
    pop(); // Restore previous drawing style
  }
}

I would like to clarify that I did use ChatGPT to help me understand classes further, and it guided me as I edited this code. However, the bulk of the work is mine.

🚩Predicted Challenges

One of the most intricate tasks will be implementing a particle system to create a dynamic, flowing energy aura around the character. This will require crafting a Particle class with properties like position, velocity, and lifespan, as well as methods for updating and displaying particles. Managing the creation and removal of particles based on the character’s power level will add another layer of complexity to this feature.
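The planned particle class could look something like the sketch below. The drift values and fade rate are illustrative assumptions; display() would use p5 drawing calls, so only the position/velocity/lifespan logic is shown as plain JavaScript (with Math.random() standing in for p5's random()).

```javascript
// Hedged sketch of the planned aura particle: position, velocity, and
// lifespan, with update()/isDead() matching how Aura prunes dead
// particles in its display() loop.
class AuraParticle {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    // Random sideways drift; Math.random() stands in for p5's random().
    this.vx = (Math.random() - 0.5) * 2;
    this.vy = -Math.random() * 2; // particles rise upward
    this.lifespan = 255;          // doubles as the draw alpha
  }

  update() {
    this.x += this.vx;
    this.y += this.vy;
    this.lifespan -= 5; // fade out over roughly 51 frames
  }

  isDead() {
    return this.lifespan <= 0;
  }

  // display() would use p5, e.g.:
  // fill(255, 255, 0, this.lifespan); ellipse(this.x, this.y, 4, 4);
}
```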

Customizing sounds for each character, particularly matching their iconic screams and power-up vocalizations, presents a unique challenge in this project. Dragon Ball Z is known for its distinctive character voices, and replicating this authenticity in the game will require careful sound editing and implementation. Finding high-quality audio clips that capture the essence of each character’s voice, while also ensuring they fit seamlessly into the game’s audio landscape, will be a time-consuming process.

The use of character sprites will be another difficult process, especially given that extracting character models from sprite sheets is a relatively new technique for me. Sprite sheets are efficient for storing multiple animation frames in a single image, but working with them requires a solid understanding of image slicing and animation timing. Learning how to properly extract individual frames, create smooth animations, and manage different character states (idle, powering up, transformed) will likely involve a steep learning curve. This process may involve trial and error, as well as research into best practices for sprite animation in p5.js.

📶Minimum Deliverables and Extras

Minimum:

  • Start screen with instructions and a start button
  • Main game screen with: Character sprite, Power level display, Energy aura (shape) around the character, Power-up button
  • Basic power-up mechanics (increase power level on button click)
  • Growing energy aura as power level increases
  • At least one sound effect (e.g., power-up sound)
  • Victory screen when final goal is reached
  • Option to restart the game after completion
  • Object-Oriented Programming implementation (Character, PowerUpButton, and EnergyAura classes)

Extras:

  • Multiple playable characters (e.g., Goku, Vegeta, Piccolo)
  • Animated character sprites that change with power level increases
  • Dynamic background that changes based on power level
  • More varied and engaging sound effects (e.g., different sounds for different power levels)
  • Power-up animations (e.g., lightning effects, screen shake)
  • Unlockable content (e.g., new characters, backgrounds) based on achievements
  • Adaptive music that intensifies as power level increases
  • Voice clips from the show playing at certain milestones
  • Mini-games or challenges to break up the clicking (e.g., timed button mashing, rhythm game)