Assignment Week 10: Interactive Musical Instrument

Inspiration

The WaveSynth project is inspired by the theremin, one of the first electronic instruments, invented in 1920 by Russian physicist Léon Theremin. Known for its eerie, vocal-like sound and its unique, touchless control, the theremin uses two antennas to detect the position of the player’s hands: one antenna controls pitch, and the other controls volume. By moving their hands through electromagnetic fields, players can create smooth, flowing sounds without touching the instrument. This expressive control has influenced generations of musicians and has become iconic in sci-fi, horror, and experimental music.

Concept

The WaveSynth is a gesture-controlled musical instrument that turns hand movements and environmental factors into dynamic sound. Designed to be both intuitive and expressive, the WaveSynth combines multiple inputs—an ultrasonic sensor, a temperature sensor, and a potentiometer—into a cohesive interface.

The ultrasonic sensor detects hand distance, adjusting either pitch or volume based on the player’s proximity. The potentiometer serves as a mode selector, allowing the user to switch between pitch and volume control, as well as access different sound effects like vibrato, pulse, and temperature modulation. The temperature sensor adds an additional layer of subtlety, with ambient temperature shifts introducing slight pitch modulations, making the instrument responsive to its surroundings.

List of the hardware components used in the WaveSynth project:

  • Arduino Uno
  • HC-SR04 Ultrasonic Sonar Sensor (for gesture-based distance measurement)
  • TMP36GZ Temperature Sensor (for ambient temperature-based modulation)
  • 10k Ohm Potentiometer (for mode and effect selection)
  • Piezo Speaker (for sound output)
  • Connecting Wires (for connections between components and the Arduino)
  • Breadboard (for prototyping and circuit connections)
  • 330 Ohm Resistor (for LED circuit)

Schematic Diagram:


Code:

// Pin definitions
const int potPin = A0;            // Analog pin for potentiometer
const int tempPin = A1;           // Analog pin for TMP36GZ temperature sensor
const int trigPin = 3;            // Digital pin for sonar trigger
const int echoPin = 4;            // Digital pin for sonar echo
const int speakerPin = 9;         // Digital pin for speaker
const int ledPin = 5;             // Digital pin for LED (PWM-enabled)

// Variables
int effectType = 0;               // Tracks which effect is active (0: none, 1: vibrato, 2: pulse, 3: temperature modulation)

void setup() {
  pinMode(speakerPin, OUTPUT);      // Speaker as output
  pinMode(trigPin, OUTPUT);         // Sonar trigger as output
  pinMode(echoPin, INPUT);          // Sonar echo as input
  pinMode(ledPin, OUTPUT);          // LED as output
  Serial.begin(9600);               // For debugging output
}

// Function to read distance from the sonar sensor
long readDistance() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Measure the pulse duration on the echo pin
  long duration = pulseIn(echoPin, HIGH);
  
  // Calculate distance in centimeters
  long distance = duration * 0.034 / 2;
  return distance;
}

// Function to read temperature from the TMP36GZ
float readTemperature() {
  int tempReading = analogRead(tempPin);             // Read analog value from TMP36
  float voltage = tempReading * (5.0 / 1023.0);      // Convert reading to voltage (0-5V)
  float temperatureC = (voltage - 0.5) * 100.0;      // Convert voltage to temperature in Celsius
  return temperatureC;
}

void loop() {
  // Potentiometer to control mode and effect
  int potValue = analogRead(potPin);                  // Read potentiometer (0-1023)
  bool pitchMode = potValue < 512;                    // Below midpoint is pitch mode, above is volume mode
  
  // Determine the effect based on the potentiometer value ranges
  if (potValue < 256) {
    effectType = 0;                                   // No effect
  } else if (potValue < 512) {
    effectType = 1;                                   // Vibrato
  } else if (potValue < 768) {
    effectType = 2;                                   // Pulse
  } else {
    effectType = 3;                                   // Temperature modulation
  }

  // Read sonar distance and map to a lower pitch range for soothing tones
  long distance = readDistance();                     // Distance in cm
  int baseToneValue = pitchMode ? map(distance, 5, 50, 150, 600) : 440;  // Map distance to pitch if in Pitch Mode
  baseToneValue = constrain(baseToneValue, 150, 600);                    // Keep out-of-range distances from producing extreme tones
  
  // Control LED brightness based on distance
  int ledBrightness = map(distance, 5, 50, 255, 0);   // Closer is brighter (5 cm = max brightness)
  ledBrightness = constrain(ledBrightness, 0, 255);   // Constrain within 0-255
  analogWrite(ledPin, ledBrightness);                 // Set LED brightness
  
  // Read temperature and map it to a gentle pitch effect
  float temperature = readTemperature();
  int tempEffect = map(temperature, 20, 35, 20, 80);  // Map temperature to subtle pitch modulation
  
  // Debug output to Serial Monitor
  Serial.print("Distance: ");
  Serial.print(distance);
  Serial.print(" cm, LED Brightness: ");
  Serial.print(ledBrightness);
  Serial.print(", Pot Value: ");
  Serial.print(potValue);
  Serial.print(", Effect Type: ");
  Serial.print(effectType);
  Serial.print(", Temperature: ");
  Serial.print(temperature);
  Serial.println(" C");

  // Play sound based on the selected effect type
  switch (effectType) {
    case 0: // No effect
      tone(speakerPin, baseToneValue); // Basic tone based on distance
      break;
    
    case 1: // Smooth Vibrato
      for (int i = 0; i < 20; i++) {
        int vibratoTone = baseToneValue + (sin(i * 0.3) * 10); // Soft vibrato effect with lower amplitude
        tone(speakerPin, vibratoTone, 50); // Short tone bursts for vibrato
        delay(20); // Slightly slower delay for soothing vibrato effect
      }
      break;
      
    case 2: // Gentle Pulse
      // Note: calling analogWrite() on the speaker pin while tone() is running
      // conflicts with the tone timer, so the pulse is made purely with tone/noTone timing.
      tone(speakerPin, baseToneValue);      // Play the base tone
      delay(100);                           // Pulse on-time
      noTone(speakerPin);                   // Turn the sound off briefly to create the pulse
      delay(100);                           // Wait before the next pulse
      break;
      
    case 3: { // Temperature Modulation
      int tempModulatedTone = baseToneValue + tempEffect;  // Adjust pitch slightly based on temperature
      tone(speakerPin, tempModulatedTone); // Continuous tone with slight modulation
      delay(200); // Keep the tone smooth
      break;
    }
  }
  
  delay(50); // Small delay for stability
}

Media:

Working Process:


  1. Initial Setup and Calibration:
    1. When powered on, the Arduino initializes all sensors and components, including the ultrasonic sensor, temperature sensor, potentiometer, and speaker.
    2. The potentiometer’s position is read to determine the initial mode (Pitch or Volume) and effect (Vibrato, Pulse, Temperature Modulation, or None). The instrument is ready to interpret the player’s gestures and environmental inputs to start producing sound.
  2. Gesture Detection and Distance Measurement:
    1. The player positions their hand near the ultrasonic sensor and moves it to change sound properties.
    2. The ultrasonic sensor measures the distance between the player’s hand and the sensor by sending out an ultrasonic pulse and timing how long it takes for the echo to bounce back (a more robust, timeout-protected version of this measurement is sketched after this list).
    3. The distance value is calculated and then mapped to control either pitch or volume based on the selected mode:
      1. Pitch Mode: The distance between the sensor and the player’s hand sets the pitch of the tone. In the current sketch, closer hand positions produce lower pitches and farther positions produce higher pitches (roughly 150–600 Hz).
      2. Volume Mode: In this mode, the distance is meant to control the loudness of the sound. Note that the current sketch plays a fixed 440 Hz tone in this mode; genuine volume control on a piezo would need extra circuitry or duty-cycle tricks.
  3. Sound Modification through Effects:
    1. The potentiometer serves as a selector for various sound effects that add dynamic layers to the base tone. Depending on the potentiometer’s position, the following effects are applied:
      1. No Effect (Basic Tone): The sound responds directly to the pitch or volume based on the hand distance with no additional modulation.
      2. Vibrato Effect: The instrument adds a wave-like oscillation to the pitch, producing a gentle, undulating sound. This effect is applied continuously, allowing the sound to vary smoothly.
      3. Pulse Effect: The sound output is pulsed, creating a rhythmic on-and-off pattern. This effect provides a percussive quality, ideal for rhythmic play.
      4. Temperature Modulation: Ambient temperature subtly adjusts the pitch, creating an atmospheric and evolving sound that changes with the surrounding environment. This effect responds more slowly, allowing the sound to naturally vary over time.
  4. Environmental Adaptation with Temperature Modulation:
    1. When Temperature Modulation is selected, the temperature sensor reads the ambient temperature. The Arduino then uses this temperature reading to modulate the pitch subtly.
    2. For example, warmer temperatures gradually increase the pitch, while cooler temperatures lower it. This effect is gradual and blends naturally with the other sound properties, adding a unique, ambient quality to the instrument’s sound.
  5. Real-Time Sound Output:
    1. The piezo speaker produces sound based on the interpreted data, transforming distance measurements, temperature readings, and selected effects into real-time audio.
    2. The speaker continuously updates its output to reflect the current settings and environmental conditions, providing an immediate response to hand movements and mode changes.
    3. As the player moves their hand closer or farther from the ultrasonic sensor, the sound changes instantly in pitch or volume. Additionally, adjustments to the potentiometer instantly modify the effect applied to the sound.
  6. Interactive Feedback Loop:
    1. The player continuously interacts with the WaveSynth by adjusting their hand position, changing the potentiometer setting, and experiencing the evolving sound.
    2. This interactive feedback loop allows the player to dynamically control and modify the instrument’s output, creating an immersive musical experience that feels responsive and alive.
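
One practical refinement of the measurement in step 2: pulseIn() with no timeout can block for up to a second when no echo returns, which briefly stalls the sound. Below is a hedged, drop-in alternative to the readDistance() function above, assuming the same pin setup; it caps the wait and falls back to the last good reading:

// Sketch of a timeout-protected distance read (not in the original build).
long lastGoodDistance = 25;                        // Fallback in cm (assumed mid-range value)

long readDistanceSafe() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);                     // Standard 10-microsecond trigger pulse
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  long duration = pulseIn(echoPin, HIGH, 30000UL); // Give up after 30 ms (~5 m round trip)
  if (duration == 0) {
    return lastGoodDistance;                       // Timeout: reuse the previous reading
  }
  lastGoodDistance = duration * 0.034 / 2;         // Speed of sound: ~0.034 cm per microsecond
  return lastGoodDistance;
}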


Future Improvements and Challenges

One of the primary challenges encountered was calibrating the sensors to respond smoothly and accurately to the user’s hand movements. Fine-tuning the pitch range and ensuring that the effects—such as vibrato and pulse—blended naturally with the sound output took several iterations to achieve a pleasing result.
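
One smoothing technique that helps with this kind of calibration is an exponential moving average, which blends each new reading into a running value so the pitch glides instead of jumping. The sketch below is illustrative only; the 0.2 smoothing factor is an assumption to tune by ear:

// Sketch of exponential smoothing for the mapped tone (illustrative only).
float smoothedTone = 440.0;                 // Arbitrary starting pitch

int smoothTone(int rawTone) {
  const float alpha = 0.2;                  // 0..1; smaller = smoother but slower response
  smoothedTone = alpha * rawTone + (1.0 - alpha) * smoothedTone;
  return (int)smoothedTone;
}

// Possible usage inside loop(), replacing the direct call:
//   tone(speakerPin, smoothTone(baseToneValue));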

The TMP36 temperature sensor also proved tough to work with on the Arduino board.

Looking ahead, integrating digital sound synthesis or MIDI compatibility would let users connect the WaveSynth to other musical devices or software, greatly expanding its versatility as a tool for music creation.
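
For the MIDI idea, a common approach is to emit standard serial MIDI messages at 31250 baud through a MIDI-out circuit on the TX pin. The sketch below only illustrates the message format; the note range and the use of an analog reading as a stand-in for the sonar distance are assumptions, not part of the current build:

// Hedged MIDI sketch: sends note-on/note-off pairs over serial MIDI.
void setup() {
  Serial.begin(31250);                 // Standard MIDI baud rate
}

void sendNote(byte pitch, byte velocity) {
  Serial.write(0x90);                  // Note-on status byte, channel 1
  Serial.write(pitch);                 // MIDI note number (0-127)
  Serial.write(velocity);              // Velocity 0 acts as a note-off
}

void loop() {
  // Hypothetical mapping: an analog reading stands in for the sonar distance.
  byte note = map(analogRead(A0), 0, 1023, 48, 72);  // Roughly C3 to C5
  sendNote(note, 100);                 // Note on
  delay(200);
  sendNote(note, 0);                   // Note off
  delay(50);
}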

Another possible enhancement could be the inclusion of LEDs or other visual feedback elements to indicate mode selection and provide dynamic light effects that correspond to the sound output. This would enhance the visual aspect of the instrument, making it even more engaging for live performances.

Reading Responses: Brief Rant

As a reader and technology enthusiast, I find Bret Victor’s “A Brief Rant on the Future of Interaction Design” to be a thought-provoking critique of current trends in human-computer interaction. Victor’s argument against “Pictures Under Glass” technology and his call for more tactile, three-dimensional interfaces resonate with my own experiences and frustrations with touchscreen devices. Victor’s vivid descriptions of how we use our hands to manipulate objects in the real world highlight the limitations of current touchscreen interfaces. I’ve often felt that something was missing when using my smartphone or tablet, and Victor’s examples of reading a book or drinking from a glass perfectly capture that sense of disconnection. The richness of tactile feedback we get from physical objects is indeed absent from our flat, glassy screens.

However, I believe Victor’s critique, while insightful, doesn’t fully acknowledge the benefits of touchscreen simplicity and accessibility. In my experience, touchscreens have made technology more approachable for a wider range of users, including children and the elderly. The ease and intuitiveness of swiping and tapping have democratized access to digital tools in ways that more complex interfaces might not. That said, I agree with Victor’s call for more ambitious visions in interaction design. His example of Alan Kay envisioning the iPad decades before its creation is inspiring and reminds us of the power of long-term, visionary thinking.

As someone who uses technology daily, I’m excited by the possibility of interfaces that better utilize our hands’ capabilities and even our entire bodies. Victor’s argument extends beyond just hands to encompass our whole bodies, noting that we have “300 joints” and “600 muscles.”

This resonates with my own experience of how naturally we use our whole bodies when interacting with the physical world. I’ve often felt constrained by the limited range of motion required to use current devices, and the idea of more holistic, full-body interfaces is intriguing. While I appreciate Victor’s vision, I also recognize the practical challenges of implementing more tactile and three-dimensional interfaces. Issues of cost, durability, and scalability would need to be addressed. Additionally, I believe the future of interaction design will likely involve a combination of approaches, including enhanced haptic feedback, hybrid interfaces that combine touchscreens with physical controls, and multimodal interaction incorporating touch, voice, and gesture.

Assignment: GripSense

Concept:

GripSense is designed to empower recovery and build strength through simple, intuitive grip exercises. Inspired by the resilience required in rehabilitation, it is built for individuals recovering from hand injuries, surgeries, or conditions like arthritis that limit hand mobility and grip strength. The idea is to create a device that not only helps measure improvement over time but also motivates users to engage in strength exercises by giving real-time feedback and adapting to different levels of force.

Work Procedure:

A squishy ball is embedded with a force sensor, which is connected to 5V and to analog pin A0 on the Arduino through a 1kΩ resistor to ground. This creates a voltage divider, enabling the Arduino to detect varying pressure levels. For feedback, two LEDs are connected: a Light Touch LED on pin 8 with a 330Ω resistor for gentle squeezes, and a Strong Force LED on pin 10 with a 330Ω resistor for firmer grips. Each LED lights up based on the detected force level, providing immediate, intuitive feedback on grip strength.
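
To make the divider behavior concrete, here is a minimal sketch, separate from the project code below, that converts the A0 reading into a voltage and an estimated sensor resistance. The 1kΩ value comes from the wiring just described; the serial printout is illustrative:

const int forcePin = A0;                   // Divider node: sensor above, resistor below
const float R_FIXED = 1000.0;              // 1 kOhm resistor from A0 to ground

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(forcePin);          // 0-1023 across 0-5 V
  float vOut = raw * (5.0 / 1023.0);       // Voltage at the divider node
  if (vOut > 0.01) {                       // Avoid dividing by zero when idle
    // Divider equation: vOut = 5 * R_FIXED / (R_FIXED + R_sensor)
    float rSensor = R_FIXED * (5.0 - vOut) / vOut;  // Ohms; drops as the grip tightens
    Serial.println(rSensor);
  }
  delay(100);
}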

The Arduino reads these values and provides real-time feedback through LEDs:

  • Light Touch LED (Blue): Lights up for a gentle squeeze, confirming the sensor is working.
  • Strong Force LED (Yellow): Lights up for a firmer, stronger grip.

Code:

const int forceSensorPin = A0;  // Pin connected to the force sensor
const int lightTouchLED = 8;    // Pin for the first LED (light touch)
const int strongForceLED = 10;  // Pin for the second LED (strong force)

int forceValue = 0;             // Variable to store the sensor reading
int lightTouchThreshold = 60;   // Threshold for light touch
int strongForceThreshold = 300; // Threshold for strong force

void setup() {
  Serial.begin(9600);               // Initialize serial communication at 9600 bps
  pinMode(lightTouchLED, OUTPUT);   // Set LED pins as outputs
  pinMode(strongForceLED, OUTPUT);
}

void loop() {
  // Read the analog value from the force sensor
  forceValue = analogRead(forceSensorPin);

  // Print the value to the Serial Monitor
  Serial.print("Force Sensor Value: ");
  Serial.println(forceValue);

  // Check for strong force first to ensure it takes priority over light touch
  if (forceValue >= strongForceThreshold) {
    digitalWrite(lightTouchLED, LOW);    // Turn off the first LED
    digitalWrite(strongForceLED, HIGH);  // Turn on the second LED
  }
  // Check for light touch
  else if (forceValue >= lightTouchThreshold && forceValue < strongForceThreshold) {
    digitalWrite(lightTouchLED, HIGH);   // Turn on the first LED
    digitalWrite(strongForceLED, LOW);   // Ensure the second LED is off
  }
  // No touch detected
  else {
    digitalWrite(lightTouchLED, LOW);    // Turn off both LEDs
    digitalWrite(strongForceLED, LOW);
  }

  delay(100);  // Small delay for stability
}

Schematic Diagram:

Figure: Schematic Diagram of GripSense

Media:

Reflection and Future Improvements:

In the future, I plan to enhance GripSense with several key upgrades:

  1. Secure Sensor Integration: I’ll embed the force sensor inside the squish ball or design a custom 3D-printed casing to hold both the ball and sensor. This will make the setup more professional, durable, and comfortable for users.
  2. Add OLED Display for Instant Feedback: I want to incorporate an OLED screen that will show live grip strength data, allowing users to see their strength level and progress instantly with each squeeze.
  3. Implement Multi-Level LED Indicators: To make feedback more visual, I’ll add a series of LEDs that light up at different levels of grip strength. For example, green will represent light grip, yellow for medium, and red for strong, giving users an intuitive, color-coded response.
  4. Data Logging for Progress Tracking: I aim to add an SD card module to log grip strength data over time, stored in CSV format (a minimal logging sketch follows this list). This feature will enable users to track their improvement and analyze their progress with tangible data.
  5. Enable Bluetooth or Wi-Fi Connectivity: I plan to incorporate wireless connectivity so that users can view their grip data on a smartphone app. This will allow them to track progress visually with graphs, set goals, and receive real-time feedback, making the recovery journey more engaging and motivating.
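
As a starting point for item 4, here is a minimal logging sketch using the standard Arduino SD library. The chip-select pin, file name, and sampling rate are assumptions for illustration, not part of the current build:

#include <SD.h>

const int forceSensorPin = A0;
const int chipSelectPin = 4;     // Assumed CS pin for the SD module (hypothetical wiring)

void setup() {
  Serial.begin(9600);
  if (!SD.begin(chipSelectPin)) {
    Serial.println("SD init failed");
    while (true);                // Halt if no card is found
  }
}

void loop() {
  int forceValue = analogRead(forceSensorPin);
  File logFile = SD.open("grip.csv", FILE_WRITE);  // Opens in append mode
  if (logFile) {
    logFile.print(millis());     // Timestamp in milliseconds
    logFile.print(',');
    logFile.println(forceValue); // One CSV row per sample
    logFile.close();
  }
  delay(500);                    // Two samples per second
}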

These upgrades will help make GripSense a comprehensive, interactive tool that supports and inspires users through their rehabilitation process.

Reading Reflection: Week 09

Reading Tigoe’s blog post, “Physical Computing’s Greatest Hits (and Misses),” I feel both inspired and challenged by the themes he describes. Physical computing, as he explains, is filled with opportunities for creativity and reinvention. Some themes—like theremin-like instruments or LED displays—show up in physical computing classes over and over again, but Tigoe reminds us that even if something’s been done a hundred times before, there’s always room to put your own spin on it. This resonates with me, especially when I catch myself thinking, “Is this idea too overdone?” It’s encouraging to realize that originality can come from how we interpret and reimagine these classic projects.

Then there’s the question, How can I make my work intuitive and open-ended at the same time? It’s tricky to strike the balance between guiding users and letting them explore freely. For example, in my “soccer bot” project, I wanted the bot to recognize and interact with a soccer ball by “kicking” it. But the challenge wasn’t just in the technical aspects of recognizing the ball and calibrating the bot’s stability; it was about making the interaction feel intuitive and natural, like something anyone could understand right away. I realized that if the bot’s movements felt intentional and even a little lifelike, users could engage with it without needing instructions, adding an unexpected layer of playfulness to the project.

Another question Tigoe’s post brings up is, Can technology actually communicate emotions or sensations? This is something he addresses when he talks about projects like “remote hugs” and “meditation helpers.” I agree with him—replicating emotions or calm states is complex, maybe even impossible in some cases. But I wonder if there are ways technology could enhance or evoke those states instead of replicating them. Maybe it’s not about simulating emotion but creating something that encourages users to bring their own feelings into the experience. This could be a reminder that emotional connections in interactive art come from the users, not just from the tech or design itself.

Second Reading:

The idea that interactive art should leave room for the audience to interpret, experience, and even shape it is powerful—and surprisingly challenging. As artists, we’re often conditioned to pour ourselves into our work as a complete expression of what we want to say. But interactive art pushes us to let go of that control, to see ourselves less as “sole authors” and more as facilitators of an experience that isn’t ours alone.

One thing that really resonates with me is Tigoe’s assertion that our role, especially in interactive art, is not to dictate meaning or guide the viewer’s every move, but to create a space where people can explore. I think this goes against the grain of traditional art, where the artist’s interpretation and intent are often front and center. For me, that’s what’s exciting about interactive art: it’s unpredictable. You’re creating an experience without fully knowing where it will go, and you have to be okay with that ambiguity. There’s something freeing about stepping back and letting the audience complete the work in their way, with their own reactions, ideas, and personal connections.

In my own projects, I’ve noticed that the most memorable moments come not from following my own scripts but from watching how others interpret and transform what I’ve made. I remember one piece where I built a simple interactive environment with lights and motion sensors to respond to presence and movement. I’d imagined that people would move slowly through the space, savoring the changing light patterns. Instead, they ran, laughed, and turned it into an almost game-like experience. It wasn’t what I envisioned, but it was better in many ways—more alive, more spontaneous. This unpredictability is what Tigoe captures when he says to “listen” to the reactions and responses. Observing others engage with your work like this gives you insights you couldn’t have planned for, and it makes the artwork feel truly collaborative.

Assignment Week 08: Unusual Switch ft. Aloe Vera

Concept:
The idea for creating an unusual switch with my arm actually struck me while I was at the gym. As I was practicing with dumbbells, I thought, “Why not use the motion of my arm to activate a switch?” When I was doing bicep curls — you know, the exercise where you curl the dumbbells up toward your shoulders — it clicked. I needed a conductor that could connect the wires and pass electricity when my arm moved upward. I initially thought of using aluminum foil, but that felt too common. I wanted to do something different, so I decided to go with aloe vera gel instead. It seemed like a more unique choice, and I was curious to see how well it would conduct electricity.

Hardware Used:

  • Arduino
  • LED
  • 330 ohm resistor
  • Jumper wires
  • Breadboard
  • Aloe vera gel (as a conductor)
  • Aluminum foil (optional; tested for wrapping but not used in the final build)
  • Glue (to attach the aloe vera slices to a decorative plant that can be wrapped around my arm)

Process:

  1. Prepare the Aloe Vera Gel: I first applied some aloe vera gel to the inside of my elbow, creating a path for electricity when my arm bends during the bicep curl motion. This would serve as a conductor, allowing current to flow on contact. However, the gel alone did not work, so I wrapped a decorative plant around my arm with the aloe vera slices glued onto it.
  2. Set Up the LED Circuit:
    • I placed the LED on the breadboard, with the shorter leg (cathode) on one row and the longer leg (anode) on another.
    • I connected a 330 ohm resistor to the same row as the shorter leg of the LED. The other end of the resistor was connected to one of the Arduino’s GND pins.
    • Then, I took a jumper wire and connected the row with the longer leg of the LED (anode) to digital pin 13 on the Arduino.
  3. Integrate the Aloe Vera Gel as the Switch:
    • I connected a jumper wire from the Arduino’s 5V pin to one piece of aloe vera.
    • Another jumper wire ran from a second piece of aloe vera to pin 2 on the Arduino (set up as an input pin).
    • I placed that second piece on the outer part of my elbow, so bending my arm brought the two pieces into contact and completed the circuit.
  4. Coding the Arduino:
    • In the code, I used digitalRead() to check if there was a connection between the two pieces of aloe vera (when the gel completed the circuit during the bicep curl).
    • If the circuit was closed, the LED would turn on. When I relaxed my arm, breaking the connection, the LED would turn off (a minimal version of this sketch appears after this list).
  5. Testing: I tried different amounts of aloe vera gel and even experimented with wrapping aluminum foil around the gel for better conductivity. Eventually, I found a sweet spot where the gel was conductive enough to switch the LED on and off based on my arm movement.
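
Here is a minimal version of the sketch described in step 4. It assumes the wiring above plus a pull-down resistor from pin 2 to GND so the input reads LOW when the aloe contact is open; note that the pull-down is an assumption and is not in the hardware list above:

const int switchPin = 2;              // Reads the aloe vera contact
const int ledPin = 13;                // LED anode on pin 13; cathode to GND via 330 ohm

void setup() {
  pinMode(switchPin, INPUT);          // External pull-down to GND assumed
  pinMode(ledPin, OUTPUT);
}

void loop() {
  // Bending the arm lets current flow 5V -> aloe -> pin 2, pulling the input HIGH.
  if (digitalRead(switchPin) == HIGH) {
    digitalWrite(ledPin, HIGH);       // Circuit closed: LED on
  } else {
    digitalWrite(ledPin, LOW);        // Arm relaxed: LED off
  }
  delay(20);                          // Light debounce
}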

Pictures From The Process:

The circuit:
This is the circuit with aloe vera slices and the decor plant on my arm.


Unfortunately, the decor plant did not work properly, so I used a black rope (which does not look great) to tie the aloe vera directly to my arm. Here is the final one!


Video of the switch:

Reflection and Future Implementation:

  1. I need to find a way to attach the aloe vera properly to my arm; gluing it to a decor plant did not work and instead gave me a burning sensation around my elbow.
  2. Overall, I found it amusing to work with aloe vera and electronics. It was fun!


Reading Response: Week 08

I began my reading with the piece about Margaret Hamilton, feeling a strong curiosity to learn more about her. It resonated deeply when she mentioned being “one of the guys,” as it reminded me of how, in many cultures, there is a stereotype that only men excel in mathematics, coding, and engineering. When a woman shows talent in these fields, it is often seen as “unusual.” Her story was truly inspiring, highlighting not only her achievements but also the consistency and persistence that made her a role model in engineering. Even though it’s not the first time I’ve encountered gender disparity in STEM, her approach to tackling “imposter syndrome”—which I believe many women in STEM have faced—was a particularly powerful lesson. It’s a reminder to everyone, regardless of gender, to stay focused on their goals and remain committed to what they want to achieve. Her story also brought to mind the movie *Hidden Figures*, which is a great watch for anyone interested in the experiences of women in STEM.

For the second reading, I’ve been brainstorming ideas related to cognitive science and human adaptability. This reading showed how design, when informed by cognitive science, can significantly impact our experiences. For instance, when I visit stores like Miniso, I often find myself drawn to items that are aesthetically pleasing and easy to use. Of course, the definition of “aesthetic” may vary from person to person; for me, it means a sleek, minimal design with soothing colors and user-friendly features. While aesthetic preferences differ, there must be some fundamental principles that apply to everyone. In this context, it’s important to explore the concept of affect and design to understand how we shape our environments and how these designs impact our daily lives. Striking a balance between beauty and usability is indeed the key factor in innovating effective designs and products.

Midterm Project

Concept and Inspiration:

I wanted to share the inspiration behind my game. It all started when I delved into the captivating world of folklore, specifically the stories of Sindbad the Sailor and Behula from Bangladeshi culture. Sindbad’s adventurous spirit, sailing through the vast oceans, immediately drew me in; I mean, who doesn’t love a good tale of adventure on the high seas?

Then there’s Behula, a fierce and determined woman who braves the ocean’s challenges to bring back her husband. Her journey is filled with trials and tribulations, showcasing strength, love, and resilience. These stories are so rich and deep, and I wanted to weave that essence into my game.

As I started to brainstorm, I thought, “Why not create a surfer game set against this backdrop?” The ocean is such a dynamic environment, and surfing adds an exciting twist. I wanted to capture the thrill of riding the waves while subtly nodding to these legendary tales.

Of course, I realized that not everyone might connect with the heavier themes of folklore, especially younger audiences. So, I decided to give the game a fun, cartoonish vibe. This way, it can appeal to all ages while still honoring those timeless stories. It’s all about adventure, overcoming challenges, and having a great time on the waves!

Code Explanation:

In my game, I’ve focused on creating an engaging surfing experience, and the wave mechanics play a crucial role in bringing that to life. There are some key elements I would like to mention, especially how I generate the smooth, dynamic waves that define the gameplay.

1. Wave Creation

One of the standout features of my game is how I create the waves. I use a combination of beginShape() and endShape() to draw the wave’s outline:

beginShape();
let xOffset = waveXOffset;
for (let x = 0; x <= width; x += 10) {
    let y = map(noise(xOffset, yOffset), 0, 1, height / 2 - waveAmplitude, height / 2 + waveAmplitude);
    vertex(x, y);
    xOffset += waveFrequency;
}
endShape(CLOSE);

  • Creating Vertex Points: Inside this loop, I utilize the vertex() function to establish a series of points that define the wave’s shape. By iterating across the entire width of the canvas, I can create a flowing wave profile that enhances the surfing experience.
  • Using Perlin Noise: The magic happens with the noise() function. I chose Perlin noise because it generates smooth, natural variations, making the waves look more realistic. Unlike random values that can create jarring changes, Perlin noise ensures that the wave transitions are fluid, which adds to the game’s aesthetic appeal.
  • Mapping Values: I then use the map() function to rescale the noise output, allowing me to set the wave height within specific bounds. By centering the waves around the middle of the canvas (height / 2), I ensure they oscillate up and down, making the gameplay more visually engaging.

2. Player Interaction with Waves

The way the waves interact with the swimmer is equally important. I calculate the wave’s height and adjust the swimmer’s position accordingly:

if (isJumping) {
    verticalVelocity += gravity;
    swimmerY += verticalVelocity;
    if (swimmerY >= waveY - swimmerHeight / 2) {
        swimmerY = waveY - swimmerHeight / 2;
        isJumping = false;
        verticalVelocity = 0;
    }
} else {
    swimmerY = waveY - swimmerHeight / 2;
}
  • Jump Mechanics: I implement a jumping mechanic where the swimmer’s vertical position changes based on gravity and jump forces. When the swimmer is in the air, I adjust their position based on the wave height, allowing them to ride the wave realistically.
  • Wave Height Adjustment: By ensuring the swimmer’s position is linked to the wave’s current height, I can create a seamless experience where the player feels like they are truly surfing on the waves.

3. Window Resizing and Obstacles

  • The windowResized() function allows the canvas to resize dynamically if the window size changes, maintaining the game’s responsiveness.
  • Instantiation of Obstacles: Every time spawnObstacle() is called, a new instance of the Obstacle class is created and added to the obstacles array. This ensures that there are multiple obstacles on the screen for players to avoid, making the gameplay more challenging.
  • Continuous Challenge: By calling this function regularly within the main game loop (in the playGame function), I ensure that obstacles keep appearing as the player progresses. This continuous spawning simulates a moving ocean environment, where new challenges arise as players navigate the waves.

P5.js Sketch:

Features and Game Mechanics:

  • The game kicks off with an engaging start screen featuring the title “Surfer Game”, inviting players to click to begin their surfing adventure.
  • Players control a lively surfer character using intuitive keyboard commands, specifically the spacebar to jump over incoming obstacles like sharks, enhancing the gameplay’s interactivity.
  • As players navigate through the waves, they encounter a dynamic wave system that adjusts in amplitude and frequency, creating a realistic surfing experience. This effect is achieved through Perlin noise, giving the waves a smooth, natural movement.
  • Collectible coins are scattered throughout the waves, adding an exciting layer of challenge. When the player collides with a coin, it disappears and contributes to the score, which is displayed on the screen.
  • The game includes a health mechanic: if the surfer collides with an obstacle, such as a shark, the game transitions to a “Game Over” state, prompting players to either restart or return to the main menu.
  • Upon reaching specific milestones, such as collecting a set number of coins, players are rewarded with a “Level Up” screen that highlights their achievements and encourages them to continue.
  • Players can easily restart the game after it ends by clicking anywhere on the screen, offering a seamless transition between attempts.

Additional Audio-Visual Elements:

  • Sound effects enhance the immersive experience, with cheerful tunes playing when coins are collected and exciting sound bites when the surfer jumps over obstacles.
  • Background music plays continuously throughout the game, creating an engaging atmosphere. Players can enjoy a unique soundtrack that fits the theme of the game, enriching their adventure.
  • The graphics have a cartoonish style, making the game appealing to all age groups while still paying homage to the folklore inspirations behind it.

Reflection and Challenges

When I first set out to create a game, my vision was to develop something based on neuroscience, exploring complex concepts in an engaging way. However, as I delved deeper into that idea, I realized it was more challenging than I anticipated. I then pivoted to creating a 2048 game, but that also didn’t quite hit the mark for me. I felt it lacked the excitement needed to captivate players of all ages.

This experience taught me a valuable lesson: sometimes, taking a step back can provide clarity. Rather than getting bogged down in intricate designs, I opted for a simpler concept that would appeal to a wider audience. Thus, I decided to create an ocean-themed surfing game, inspired by folklore like Sindbad the Sailor and the tale of Behula from Bangladeshi culture.

I see a lot of potential for upgrading this game. Adding a storyline could further engage young players and introduce them to fascinating narratives from folklore. Additionally, I plan to enhance the wave mechanics by incorporating realistic gravity effects, making the surfing experience more immersive. These improvements could really elevate the gameplay and provide an enjoyable adventure for everyone.

Week 05: Midterm Initial Project

Concept:

I’ve always found the brain fascinating, especially how neurons fire and communicate. It’s like this intricate dance of connections that sparks thoughts and actions. Interestingly, voices also make our neurons fire. So, I thought, why not combine that with a game I love? I decided to merge the concept of neuron firing with the mechanics of the classic 2048 game.

Instead of numbers merging, it’s all about neurons connecting and lighting up, almost like a visual representation of a neural network in action. It’s exciting to see how each move can mimic real-life brain activity. The main goal is to keep merging neurons to reach the ultimate neuron—represented as a “Super Neuron”—while continuously creating connections to maintain an active and vibrant neural network. Players can aim for high scores, challenging themselves to beat their previous records or compete with others.


Game Design: 

The main goal of the game is to keep building neuron connections. Among all the neuron-like particles, there are target particles (super neurons) shown as neuron sparks. The player needs to click on a glowing target particle to create a connection between the sparking neuron and the neurons near it. The neurons merge, and more connections keep building as part of the simulation. If the player makes five connections, they pass the level.

Code I’m Proud of:

Still, the code is at an early stage and requires many modifications to finish the game. The first thing I liked is how the neurons look, which I created using a particle function.

// Start simulation with neurons
function startSimulation() {
  background(0);
  orbitControl();  // Allow mouse control to rotate

  // Analyze the audio amplitude and spectrum
  let spectrum = fft.analyze();
  let level = amp.getLevel();

  // Set lighting
  ambientLight(20, 20, 30);
  pointLight(255, 255, 255, 0, 0, 300);

  // Slowly rotate the scene
  rotateY(frameCount * 0.001);
  rotateX(frameCount * 0.0015);

  // Draw neurons (with sparkle effect)
  for (let i = 0; i < particles.length; i++) {
    let p = particles[i];
    p.move(level);
    p.display();
    
    // Draw lines between nearby particles (neurons)
    for (let j = i + 1; j < particles.length; j++) {
      let d = dist(p.pos.x, p.pos.y, p.pos.z, particles[j].pos.x, particles[j].pos.y, particles[j].pos.z);
      if (d < 120) {
        strokeWeight(map(d, 0, 120, 4, 0.1));
        stroke(150, 150, 255);
        line(p.pos.x, p.pos.y, p.pos.z, particles[j].pos.x, particles[j].pos.y, particles[j].pos.z);
      }
    }
  }

  // Draw the target particle (glowing neuron)
  targetParticle.move();
  targetParticle.display();
  
  // Handle interactions with the target particle
  if (mouseIsPressed) {
    let d = dist(targetParticle.pos.x, targetParticle.pos.y, targetParticle.pos.z, mouseX - width / 2, mouseY - height / 2, 0);
    if (d < 50) {
      fireNeurons();
      connectToNearestNeuron();  // Connect the glowing neuron to the nearest neuron
      score++;  // Increase score when a neuron is clicked
    }
  }
}

I also tried to make the particles change positions according to the audio amplitude.

fft = new p5.FFT();
amp = new p5.Amplitude();

// Analyze the audio amplitude and spectrum
let spectrum = fft.analyze();
let level = amp.getLevel();

I tried to implement another cool feature that creates dynamic connections between neurons when they come near each other. This mimics real neural networks and adds a layer of interaction, though the code might still have some flaws.

// Draw lines between nearby particles (neurons)
for (let j = i + 1; j < particles.length; j++) {
  let d = dist(p.pos.x, p.pos.y, p.pos.z, particles[j].pos.x, particles[j].pos.y, particles[j].pos.z);
  if (d < 120) {
    strokeWeight(map(d, 0, 120, 4, 0.1));
    stroke(150, 150, 255);
    line(p.pos.x, p.pos.y, p.pos.z, particles[j].pos.x, particles[j].pos.y, particles[j].pos.z);
  }
}

P5.js Sketch:

Future Works:

  • Need to find a proper background that keeps the game’s name visible to players.
  • Need to add descriptions to the Read section and fix the settings on the Settings page.
  • Regarding the game design, the game mechanics still need work, as the game flow is not implemented yet.
  • Need to monitor performance and use techniques like frame-rate management.
  • Need to work on the sound and create a sound experience that reacts dynamically to what’s happening in the game. For example, as neurons connect, the music could change in tempo or complexity.

Reading Reflection: Week 05

I read about Myron Krueger’s “Videoplace” in another article, and I think Krueger’s philosophical approach to building an interaction between art and computer science was a pioneering idea in the 1960s that made him a first-generation virtual reality and augmented reality researcher. His examples of creating art on the hands of his friend and the visual aesthetics reminded me of the times when I used to draw just lines and patterns in the “Paint” software on a Windows computer; sometimes it would come out pretty aesthetic, but he took this simple idea and generated something out of the box. As I read more about VIDEOPLACE, I felt it was more advanced than the technologies of its time and surely created a path for today’s AR and VR tech. This paper works as a stepping stone for students interested in computer vision and artificial intelligence.

I was also very curious to learn more about Messa di Voce, created by Golan Levin in collaboration with Zachary Lieberman. This project expanded my sense of what can be done in the field of Human-Computer Interaction. Previously, image or pattern recognition would come to mind when talking about HCI, but now I see how our voice can be an excellent element for interacting with audiovisual installations. I agree that algorithms play a key role in this field, and I wonder whether we all need to keep rebuilding new algorithms to explore more opportunities in HCI. The idea of building machine vision techniques “from first principles” resonates with my desire to understand how these systems work under the hood. While many environments provide off-the-shelf solutions, the challenge of implementing vision algorithms from scratch seems like a rewarding process that deepens one’s understanding of the mechanics of vision systems.

After reading this paper, I want to explore how vision-based systems can interact with audio, visuals, and physical devices to create immersive, interactive experiences. Tools like Myron, Rozin’s TrackThemColors, and especially Max/MSP/Jitter with CV.Jit seem like powerful platforms for experimenting with art and performance which I haven’t used till now.

Reading Reflection: Week 04

From the readings, when I came across the issues with door handling, it immediately reminded me of the doors at NYUAD, especially in D2. They’re supposed to be semi-automatic with sensors, but I often find myself having to pull them with all my strength just to get them to open. It’s incredibly frustrating, especially when I’m carrying something heavy or have my hands full. Struggling with these doors can be really annoying.

What really drives me crazy is that we live in a world that talks about inclusivity and accessibility, yet my own university isn’t fully disability-friendly. I can’t understand how a person in a wheelchair can access the dining halls when the doors require someone else to pull them open. Drawing from Don Norman’s ideas, I really connect with his emphasis on human-centered design to create better usability. He points out that “Engineers are trained to think logically,” which is something I’ve noticed over the years. However, I believe there’s been a shift in recent times.

As someone aspiring to be an engineer, I resonate with this statement. We often focus more on mathematical modeling rather than on accessibility and the psychological aspects of human behavior. Many engineering projects—like billboards and airport dashboards—do help by displaying information clearly, but they often overlook people with dyslexia. For example, using fonts like Sansa can make reading difficult for them.

Norman also talks about cognitive load, which refers to the mental effort required to use a device. In our fast-paced world, having a high cognitive load can be overwhelming. Take, for instance, a multifunctional printer with tons of buttons and features. If I have to remember multiple steps just to print a document, it can be exhausting. A better design would simplify the process, reducing the steps and making controls more intuitive. This aligns perfectly with Norman’s argument that good design should minimize cognitive load, letting users focus on their tasks instead of trying to figure out how to use the device.

Overall, this reading has been the most enlightening for me in the past four weeks.