Week 13 – User Testing

When I gave people a chance to try my game, they were generally able to figure it out easily. From my experience during the midterm project, I learned that providing clear instructions is very important, and I applied that lesson here. At first, players were surprised to see apples used as controllers, but I am happy with this decision because it made the game more engaging and encouraged curiosity. The instructions, which explained that the red apple moves the character to the right and the green apple moves her to the left, made the mapping between the controls and the game actions easy to understand.

One small point of confusion was that I wrote “touch the apple,” but in practice, players needed to press or squish the apple for the character to move. This is something I can clarify in the instructions for future users. The part of the game that worked well was moving the character from one side to the other, as it felt natural and responsive. Another improvement suggested during user testing was related to catching items: originally, the character caught food near her legs at the very bottom of the screen, but after the suggestion, I moved the catch area so it aligns better with the character’s position. This made the game feel more intuitive and enjoyable.

The part of the experience that required more explanation was how the apples work as physical controllers. While users understood the concept after reading the instructions, I could improve this by adding a visual cue or short tutorial in the game showing how to press the apples to move the character. Overall, clear instructions, intuitive controls, and small adjustments based on user feedback made the game easy to understand and engaging, but highlighting the physical interaction more clearly could improve the first-time experience.

Please see below the user testing video:

User Testing

Week 12 – Final Project Proposal

Concept & Progress

For my final project, I chose to design an interactive healthy-choice game inspired by foods and habits familiar in Kazakh culture. By combining a digital game with physical inputs using Makey Makey, I want to create an experience that is playful, educational, and rooted in cultural references that feel personal and recognizable. The game features a girl character standing at the bottom of the p5.js canvas. The main mechanic of the game revolves around catching and avoiding falling items that drop from the top of the screen. I have already designed the playing page; you can see it below.

The falling items are divided into healthy and unhealthy options that commonly appear in everyday life in Kazakhstan. Healthy items such as apples, dates, milk, and water increase the player’s score, while unhealthy items like burgers and Coca-Cola must be avoided. Each healthy item gives a specific number of points: apples give +20, dates +10, milk +20, and water +10. This scoring system encourages players not just to move and react quickly, but also to distinguish between foods visually. The game is structured in two phases, which alternate after a set time. In Phase 1, apples, dates, and burgers fall. In Phase 2, milk, water, and Coca-Cola appear. When the timer runs out, the round ends, and p5.js communicates with Arduino to display GAME OVER on an LCD module attached to the Arduino. Players can restart the game to continue the experience. Physical interaction happens through real apples connected with Makey Makey, which transforms the apples into left and right movement controls. Touching one apple makes the character move left, and touching the other moves her right. This adds a tangible element to the game and ties the controller design back to the cultural theme of the project.
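The scoring and phase-alternation logic described above can be sketched in plain JavaScript. This is only a sketch: the point values and item lists come from the description, while PHASE_LENGTH is an assumed placeholder, not the project’s real timing.

```javascript
// Points per healthy item, as described: apples +20, dates +10, milk +20, water +10.
const POINTS = { apple: 20, date: 10, milk: 20, water: 10 };

// Items spawned in each phase; burgers and Coca-Cola are the hazards to avoid.
const PHASES = [
  ["apple", "date", "burger"], // Phase 1
  ["milk", "water", "cola"],   // Phase 2
];

const PHASE_LENGTH = 15000; // ms per phase (assumed value)

// Which phase is active at a given elapsed time, alternating indefinitely.
function currentPhase(elapsedMs) {
  return Math.floor(elapsedMs / PHASE_LENGTH) % PHASES.length;
}

// Score change when the character catches an item: healthy items add points,
// anything else scores nothing here (life loss is handled separately).
function scoreFor(item) {
  return POINTS[item] ?? 0;
}
```

For example, `currentPhase(16000)` falls in the second phase under the assumed 15-second phase length.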

The p5.js program handles all visuals, item generation, collision detection, scoring, phase transitions, and timer logic. It also sends a serial signal to Arduino at the end of the game so that the LCD screen displays “GAME OVER.” Arduino, meanwhile, receives movement inputs from Makey Makey and translates the end-of-game signal into LCD output. I have started working on the basic structure of the p5.js code, including the falling item class, the character movement, the timer, and the two-phase logic. I am also working on the sprite sheet for the girl character so she matches the visual style of the game.
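The collision detection mentioned above can be sketched as a simple circle-overlap test. This is a hedged sketch, not the project’s actual code: the object shapes, field names, and radii are hypothetical.

```javascript
// Circle-overlap collision between a falling item and the character's catch
// area. Each object is assumed to have a center (x, y) and radius r.
function isCaught(item, catcher) {
  const dx = item.x - catcher.x;
  const dy = item.y - catcher.y;
  const rr = item.r + catcher.r;
  // Overlap if the centers are closer than the sum of the radii
  // (compared squared, to avoid a square root).
  return dx * dx + dy * dy <= rr * rr;
}
```

In a p5 sketch this check would run in draw() for each falling item, adding points or ending the round on a hit.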

To summarize the inputs and outputs:

Arduino → p5.js

  • Makey Makey “key” inputs from apples (Left / Right movement)
    (interpreted in p5 as arrow keys or assigned keys)

p5.js → Arduino

  • Message: “Game Over” sent via serial
  • Arduino displays GAME OVER on the LCD screen

Preliminary concept for Final Project

For my final project, I will design an interactive healthy-choice game using both Arduino and p5.js. The game features a Kazakh girl character positioned at the bottom of the screen while different items fall from the top. The healthy items (apples, dates, milk, and water) are foods and drinks that people commonly consume in Kazakhstan, making the game culturally relevant and grounded in everyday life. These are contrasted with less healthy options such as burgers and Coca-Cola.

The game progresses in two alternating phases. In the first phase, apples, dates, and burgers fall from the sky. The player must collect apples and dates while avoiding burgers. After a set period, the game switches to a second phase, where milk, water, and Coca-Cola fall. In this phase, the player should collect milk and water while avoiding Coca-Cola. Each healthy item increases the score: apples give 20 points, dates 10, milk 20, and water 10. The player starts with three lives and loses one whenever they collide with an unhealthy item. When all lives are lost, the game ends, and p5 displays the final score.
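The three-lives rule described above can be sketched as a tiny state update. This is only a sketch under the description; the field names are hypothetical, not the project’s real variables.

```javascript
// Player starts with three lives; colliding with an unhealthy item costs one.
// When lives reach zero, the game ends and p5 would display the final score.
const STARTING_LIVES = 3;

function loseLife(state) {
  const lives = state.lives - 1;
  return { ...state, lives, gameOver: lives <= 0 };
}
```

Starting from `{ lives: STARTING_LIVES, gameOver: false }`, three unhealthy collisions set `gameOver` to true.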

Arduino provides the physical interaction through three buttons: one for start/restart and two for moving left and right. Arduino continuously senses button presses and sends serial messages to p5.js, which updates movement, animations, and game states.

Week 11 – Serial Communication

Exercise 1: Make something that uses only one sensor on the Arduino and makes the ellipse in p5 move on the horizontal axis, in the middle of the screen, with nothing on the Arduino controlled by p5.

Concept: When a person covers the light sensor, the reduced light makes the ball move horizontally in p5. The player controls the ball simply by changing how much light reaches the sensor.

P5 Code:

let address = 0; // latest sensor reading received from the Arduino (0–1023)

function setup() {
  createCanvas(600, 600);
  noFill();
}

function draw() {
  background("purple");
  stroke("white");

  // Convert the incoming sensor reading (0–1023) into a horizontal screen position
  ellipse(map(address, 0, 1023, 0, width), height / 2, 100, 100);

  if (!serialActive) {
    // Show a connection screen while serial communication hasn’t started yet
    background("rgb(70,9,70)");
    stroke("white");
    textSize(50);
    text("Press Space Bar to select Serial Port", 20, 30, width - 30, 200);
  }
}

function keyPressed() {
  // When the space bar is pressed, begin the setup process for the serial port
  if (key == " ") setUpSerial();
}

function readSerial(data) {
  // If valid data arrives from the Arduino, save it for use in draw()
  if (data != null) {
    address = int(data);
  }
}

Arduino Code:

int sensorPin = A0; // light sensor on analog pin A0

void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensorValue = analogRead(sensorPin);
  Serial.println(sensorValue);
  delay(5);
}

GitHub link

Setup and Schematic:


Video Demonstration

Exercise 2: Make something that controls the LED brightness from p5

Concept: As the player moves across the trackpad, the sketch’s fill brightens from black to white, and the LED gets brighter along with it.

// Holds the brightness value we will send to the Arduino
let brightness = 0;

// Stores any data received back from Arduino (not used, but required)
let latestData = "";

function setup() {
  // Create the canvas where visual feedback will appear
  createCanvas(600, 400);
  noStroke();
}

function draw() {
  // Clear the screen each frame with a black background
  background(0);

  // Convert trackpad/mouse X position (0 → width) into brightness (0 → 255)
  brightness = int(map(mouseX, 0, width, 0, 255));

  // Draw a rectangle whose fill intensity matches the brightness value
  fill(brightness);
  rect(0, 0, width, height);

  // If a serial port is active, send the brightness value to the Arduino
  if (serialActive) {
    writeSerial(brightness + "\n"); // "\n" ensures Arduino reads full numbers
  }

  // If serial is NOT open, show instructions to the user
  if (!serialActive) {
    background("purple");
    fill("white");
    textSize(28);
    text("Press SPACE to choose Serial Port", 20, 40);
  }
}

function keyPressed() {
  // Press SPACE to open the Web Serial port selection dialog
  if (key === " ") {
    setUpSerial();
  }
}

// This function is REQUIRED by p5.webserial
// It receives data sent from Arduino (even if unused)
function readSerial(data) {
  if (data) latestData = data;
}

// Sends data to Arduino IF the writer is available
function writeSerial(value) {
  if (writer) {
    writer.write(value);
  }
}

Arduino:

int ledPin = 9;
int brightness = 0;

void setup() {
  Serial.begin(9600);
  pinMode(ledPin, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    brightness = Serial.parseInt();
    brightness = constrain(brightness, 0, 255);
  }

  analogWrite(ledPin, brightness);
}

GitHub link

Setup and Schematic:


Video Demonstration

Exercise 3: Take the gravity wind example and make it so that every time the ball bounces, one LED lights up and then turns off, and you can control the wind from one analog sensor.

Concept: Each time the ball bounced, the red LED lit up. Using the potentiometer, the player controlled the wind, making the ball move from one side to the other.

P5 Code:

let velocity;
let gravity;
let position;
let acceleration;
let wind;
let drag = 0.99;
let mass = 50;
let on = 0;

function setup() {
  createCanvas(640, 360);
  //noFill();
  position = createVector(width/2, 0);
  velocity = createVector(0,0);
  acceleration = createVector(0,0);
  gravity = createVector(0, 0.5*mass);
  wind = createVector(0,0);
}

function draw() {
  background(255);
  
  if (!serialActive) {
    text("Click on the Screen to select Serial Port", 20, 30);
  } else {
    text("Connected", 20, 30);
  }
  
  applyForce(wind);
  applyForce(gravity);
  velocity.add(acceleration);
  velocity.mult(drag);
  position.add(velocity);
  acceleration.mult(0);
  ellipse(position.x,position.y,mass,mass);
  if (position.y > height-mass/2) {
      velocity.y *= -0.9;  // A little dampening when hitting the bottom
      position.y = height-mass/2;
  }
  // turn on the LED only while the ball is touching the ground (i.e. on a bounce)
  if (position.y >= height - mass / 2) {
    on = 1;
  } else {
    on = 0;
  }
}
function applyForce(force){
  // Newton's 2nd law: F = M * A
  // or A = F / M
  let f = p5.Vector.div(force, mass);
  acceleration.add(f);
}
function keyPressed(){
  if (key==' '){
    mass=random(15, 80);
    position.y=-mass;
    velocity.mult(0);
  }
}
function mousePressed() {
    setUpSerial();
}
function readSerial(data) {
  if (data != null) {
    // make sure there is actually a message
    wind.x = int(data); // convert the incoming string into a number
    //////////////////////////////////
    //SEND TO ARDUINO HERE (handshake)
    //////////////////////////////////
    let sendToArduino = on + "\n";
    writeSerial(sendToArduino);
  }
}

Arduino Code:

const int LED = 9;
const int POT = A0;

void setup() {
  Serial.begin(9600);
  pinMode(LED, OUTPUT);

  // Test the LED
  digitalWrite(LED, HIGH);
  delay(500);
  digitalWrite(LED, LOW);
}

void loop() {
  int p_value = analogRead(POT);           // read from the potentiometer
  int move = map(p_value, 0, 1023, -1, 2); // map the value to -1, 0, and 1
  Serial.println(move);

  if (Serial.available() > 0) {
    // read from p5.js
    int touch = Serial.parseInt();
    // set the LED command
    if (touch == 1) {
      digitalWrite(LED, HIGH);
    } else {
      digitalWrite(LED, LOW);
    }
  }
}

GitHub file

Setup and Schematic:


Video Demonstration

Reflection: Across all three projects, I learned how different sensors can shape the experience through serial communication and p5. Working with light, the trackpad, and the potentiometer showed me how physical input can smoothly translate into visual changes on the screen. In the future, I would improve the responsiveness and make the serial connection more stable so the interactions feel smoother and more reliable.

 

Week 11 – Reading Reflection

Design Meets Disability:

As I was reading, what stayed with me most was how deeply design choices are entangled with social attitudes, often more than we admit. The author’s critique of “camouflage” design exposed how much stigma gets quietly built into objects that are supposed to help people. It made me reconsider how often “neutrality” in design actually reinforces harmful norms by trying to erase difference instead of valuing it.

I was also struck by the idea that design for disability shouldn’t be separated from regular, mainstream design. The examples of the Eames splint and Aimee Mullins’ prosthetics shifted my understanding of what inclusivity can look like. Instead of treating disability as a constraint to minimize, the reading frames it as a space for experimentation, a site where new forms, aesthetics, and cultural meanings can emerge. That idea felt surprisingly liberating, because it challenges designers to imagine disability not as a deviation, but as part of the full spectrum of human experience.

Also, the discussion of universal design made me question my own assumptions about what “inclusive” even means. Sometimes we equate inclusivity with adding features for everyone, but the reading suggests that true thoughtfulness might come from editing, refining, and listening to individual needs. It left me with a sense that designing responsibly requires both humility and boldness: the humility to recognize one’s blind spots, and the boldness to challenge conventions that limit how people see themselves.

Week 10 – Reading Reflection

A Brief Rant on the Future of Interaction Design:

When I was reading Bret Victor’s “A Brief Rant on the Future of Interaction Design” I thought about how I perceive technology and design. I realized how easily I accept “innovations” like touchscreens as the peak of progress, even though, as Victor argues, they often limit our potential rather than expand it. His critique of “pictures under glass” especially resonated with me, because I use my phone and laptop every day, but I rarely think about how numb those interactions actually are. There is no real feeling, no texture, no sense of connection between my hands and what I’m creating.

I think this reading challenged me to imagine interfaces that feel alive, that respond to our touch and movement in meaningful ways. Victor’s idea that tools should “amplify human capabilities” made me wonder whether I am designing for convenience or for human expression. I started thinking about how interaction could involve more of the body, maybe through gestures, pressure, or sound, so that users could experience technology in a fuller, more emotional way. I also liked Victor’s reminder that “the future is a choice.” It gave me a sense of agency and responsibility as a future designer. Instead of waiting for big tech companies to define how we interact, I can be part of shaping alternatives that are more tactile, intuitive, and human-centered. This reading did not just critique existing designs; it inspired me to dream bigger and to treat design as a way of expanding what people can truly feel and do.

A follow-up article

These responses challenged the way I think about technology and its relationship to the human body. His insistence that our current interfaces are “flat and glassy” made me realize how limited most digital experiences truly are. I started questioning how often I accept these limitations without noticing them. The idea that our tools should adapt to us, not the other way around, feels both radical and necessary. What I found most striking was his defense of our hands and bodies as essential to understanding and creativity. It made me see touch not as something trivial, but as a form of intelligence. The thought that we could lose part of that richness by constantly interacting with lifeless screens feels unsettling.

As someone studying Interactive Media, I see this as a call to design technologies that reconnect people with the physical world. Instead of chasing the newest gadget, I want to think about how digital experiences could feel more alive, how they could move, resist, or respond in ways that make us aware of our own presence. The reflections did not just critique modern design, they opened a space for imagining interaction as something deeply human, sensory, and expressive.

Week 10 – Musical instrument

Group members: Aizhan and Darmen
Concept:

As we began planning how to present our musical instrument and the sound produced by the buzzer while using both digital and analog sensors, we decided to use a button as our digital sensor and a distance-measuring sensor as our analog sensor. The main concept is that the distance sensor detects how far an object or hand is, and based on that distance, it produces different notes through the buzzer. When the button is pressed, the system pauses for 300 milliseconds and stops reading distance values, effectively muting the instrument. Pressing the button again reactivates the sensor, allowing the instrument to continue playing notes according to the object’s position. This simple toggle makes it easy to control when the instrument is active, giving users time to experiment with the sound and movement.

Arduino Setup and Demonstration: 

Schematic Diagram: 


Setup:


Video Demonstration

Coding:

Arduino file on Github

// inside loop(): choose a note based on the measured distance
if (sensorActive) {
    int distance = getDistance();
    Serial.println(distance);
    if (1 < distance && distance < 5) {
      tone(BUZZER, NOTE_C4);
    } else if (5 < distance && distance < 10) {
      tone(BUZZER, NOTE_D4);
    } else if (10 < distance && distance < 15) {
      tone(BUZZER, NOTE_E4);
    } else if (15 < distance && distance < 20) {
      tone(BUZZER, NOTE_F4);
    } else if (20 < distance && distance < 25) {
      tone(BUZZER, NOTE_G4);
    } else if (25 < distance && distance < 30) {
      tone(BUZZER, NOTE_A4);
    } else if (30 < distance && distance < 35) {
      tone(BUZZER, NOTE_B4);
    } else {
      noTone(BUZZER);
    }
  }
}

float getDistance() {
  float echoTime;                   //variable to store the time it takes for a ping to bounce off an object
  float calculatedDistance;         //variable to store the distance calculated from the echo time
  //send out an ultrasonic pulse that's 10 microseconds long
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  echoTime = pulseIn(ECHO_PIN, HIGH);      //pulsein command to see how long it takes for pulse to bounce back to sensor
  calculatedDistance = echoTime / 55.2;    //calculate  distance of object that reflected the pulse in cm 
  return calculatedDistance;
}

The highlight of the code is the part where the distance-measuring sensor interacts with the buzzer to produce different musical notes based on how close or far an object is. We were excited to experiment with the distance-measuring sensor and explore how it could be programmed to produce sound through the buzzer. To better understand how to work with the notes, we referred to Arduino tutorials.

In the code above, the sensor continuously measures the distance of an object and converts the time it takes for the sound wave to bounce back into centimeters. Depending on the measured distance, the buzzer plays different musical notes, creating a simple melody that changes as the object moves closer or farther away.
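For reference, the time-to-distance conversion can be sketched from first principles: the pulse travels to the object and back, so the one-way distance is half the round trip at the speed of sound (roughly 343 m/s, i.e. 0.0343 cm per microsecond). That gives an equivalent divisor of about 58.3; the 55.2 used in our Arduino sketch is a slightly tuned constant. A hedged JavaScript sketch of the textbook form:

```javascript
// Convert an ultrasonic echo time (in microseconds) to a one-way distance
// in cm, halving the round-trip time and assuming sound at ~0.0343 cm/µs.
function echoToCm(echoMicros) {
  const SPEED_CM_PER_US = 0.0343; // speed of sound, assumed standard value
  return (echoMicros / 2) * SPEED_CM_PER_US;
}
```

For example, an echo of roughly 583 µs corresponds to about 10 cm.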

Reflection:

We enjoyed experimenting with the distance-measuring sensor as it taught us how precise sensor readings can be transformed into meaningful outputs like sound, and combining it with the button control helped us manage the instrument’s activation smoothly. For future improvements, we would like to expand the range of notes to create more complex melodies and add LED lights that change color with the pitch to make the instrument more visually engaging. We could also experiment with different sensors, such as touch or motion sensors, to add more layers of interactivity. Finally, refining the accuracy and response speed of the sensor would make the sound transitions smoother and enhance the overall experience.

Week 9 – Reading Reflection

Physical Computing’s Greatest Hits (and Misses)

Reading Physical Computing’s Greatest Hits (and Misses) really made me reflect on the role of human presence and movement in interactive design. The article explores recurring motifs in physical computing, such as theremin-style sensors, body-as-cursor interfaces, tilty tables, and mechanical pixels, and evaluates which ideas succeed and which fall short. What struck me most was the idea that interaction alone is not meaningful unless it is framed with intention or context. I found this insight particularly relevant because gestures, motion, and bodily engagement only carry meaning when integrated into a space or narrative. The article also emphasizes that even commonly used ideas can be made fresh through variation and creativity.

The discussion of emotionally responsive projects, like “remote hugs”, also inspired me to think about the potential of physical computing to create connection and presence. It made me consider designing experiences where participants’ actions are not only triggers for a response but also carriers of meaning, emotion, or narrative. I found myself imagining interactive installations or performance spaces where movement, gesture, and proximity could communicate emotion or tell a story, giving participants a sense of agency and contribution. Overall, the article reinforced the importance of centering human input and intention over technical complexity. It motivated me to experiment more boldly with interactive media, blending technology, space, and human engagement in ways that feel purposeful, immersive, and emotionally resonant.

Making Interactive Art: Set the Stage, Then Shut Up and Listen

When I was reading Making Interactive Art: Set the Stage, Then Shut Up and Listen, I noticed the perspective of the audience or participant as a co-creator, where the creator’s role is to design opportunities for exploration and discovery. The article encouraged setting the stage, providing affordances, and then stepping back to let participants engage on their own terms. This concept resonated deeply with me because I often feel the need to over-explain or control how people interact with my work, whether in interactive media projects, installations, or themed environments. Learning to trust participants’ curiosity and creativity is both challenging and exciting, and it made me rethink how I approach design: sometimes the most compelling experiences arise when the creator resists guiding every step and instead observes how others explore, interpret, and respond.

I also liked the idea of “listening” to participant interaction. Observing how people engage, adapt, or even misuse an interactive installation can reveal insights the creator never intended, and these discoveries can guide future iterations. This connects to my interests in performance and immersive storytelling because, in both cases, the audience’s reactions shape the experience. It also made me reflect on how I design spaces and experiences outside of class projects, including themed parties or interactive setups, where I can experiment with encouraging participation rather than prescribing behavior. The article inspired me to embrace unpredictability, co-creation, and emergent experiences, reminding me that interaction is not just about technology or novelty, it is about creating a dynamic relationship between the participant, the space, and the narrative. Now, I want to apply this mindset to my projects, designing experiences where participants’ actions carry weight and meaning, and where discovery becomes a central part of engagement.

Week 9 – Analog input & output

Concept:

For my project, I wanted to create a system that could control multiple LEDs in both a digital and an analog fashion, and I drew inspiration from traffic lights. Instead of just two LEDs, I decided to use three LEDs to mimic a simplified traffic light sequence. When the push button is pressed, the system cycles through the lights in order: red → green → yellow, which demonstrates digital control, as each LED turns fully on or off with each button press.

For the analog aspect, I incorporated a potentiometer connected to an analog input pin on the Arduino. By adjusting the potentiometer, users can control the brightness of the LEDs, allowing for smooth, variable light intensity. Together, the push button and potentiometer create an interactive system: pressing the button cycles through the traffic light colors, while turning the potentiometer adjusts their brightness. This combination not only makes the project functional and educational but also connects to the familiar real-world concept of traffic lights.

Arduino Setup and Demonstration: 

Setup

Arduino video

Hand-drawn schematic

Coding:

Arduino file on GitHub

// Pins
const int POTENTIOMETER_PIN = A0; // Analog input pin for potentiometer
const int BUTTON_PIN = 2; // Digital input pin for push button
const int RED = 5;    // Digital output pin for the red LED
const int GREEN = 6;  // Digital output pin for the green LED
const int YELLOW = 9; // Digital output pin for the yellow LED

// Variables
int potentiometerValue = 0; // Potentiometer value
bool buttonState = false; // Button state
int colorIndex = 0; // Index of current color

void setup() {
  pinMode(POTENTIOMETER_PIN, INPUT);
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(RED, OUTPUT);
  pinMode(GREEN, OUTPUT);
  pinMode(YELLOW, OUTPUT);
}

void loop() {
  potentiometerValue = analogRead(POTENTIOMETER_PIN);
  
  // Map potentiometer value to LED brightness
  int brightness = map(potentiometerValue, 0, 1023, 0, 255);
  
  // Set the LED color based on the current color index
  switch (colorIndex) {
    case 0:  // Red
      analogWrite(RED, brightness);
      analogWrite(GREEN, 0);
      analogWrite(YELLOW, 0);
      break;
    case 1:  // Green
      analogWrite(RED, 0);
      analogWrite(GREEN, brightness);
      analogWrite(YELLOW, 0);
      break;
    case 2:  // Yellow
      analogWrite(RED, 0);
      analogWrite(GREEN, 0);
      analogWrite(YELLOW, brightness);
      break;
  }

  // Check button state
  if (digitalRead(BUTTON_PIN) == LOW) {
    if (!buttonState) {
      // Toggle color index
      colorIndex = (colorIndex + 1) % 3;
      buttonState = true;
      delay(200);
    }
  } else {
    buttonState = false;
  }
}

For this code, I created a circuit where I can control three LEDs (red, yellow, and green) using a potentiometer and a button. The potentiometer adjusts the brightness of the active LED, while the button lets me switch between the LEDs. Each LED is connected to a separate pin on the Arduino (5, 6, and 9), and the potentiometer is connected to the analog input A0. By pressing the button, I can cycle through the colors, and by turning the potentiometer, I can smoothly change how bright each LED glows.

Reflection:

Through this project, I learned how digital and analog inputs can work together to create an interactive system. It was rewarding to see how the push button and potentiometer complemented each other, one controlling the LED sequence and the other adjusting brightness. This helped me better understand how sensors and user inputs can bring a project to life.
For future improvements, I would refine the button debouncing to make its response smoother and more reliable, and keep using PWM-capable pins consistently for better brightness control. In the future, I could also expand the project by adding automatic timing to simulate real traffic-light behavior or incorporating sensors to make it more dynamic and realistic.

Week 8 – Reading Reflection

Her Code Got Humans on the Moon:

Margaret Hamilton’s story impressed me with how much dedication and focus one person can bring to their work. She did not see programming as a small or temporary job; she treated it as something worth building a whole discipline around. At a time when few people even understood what software was, she worked with a sense of seriousness and care that helped define the entire field. I found it admirable that she led a team in such a demanding environment, while also constantly pushing for precision and reliability in every line of code. It must have taken even more determination to do this in a male-dominated field, where her ideas were often questioned or overlooked. Yet she proved that talent and persistence can speak louder than bias.

The moment that stayed with me most was when people told her, “That would never happen.” She had warned about a potential error that others dismissed as impossible, but it did happen during the Apollo 11 mission. Because of her preparation, the software knew how to handle it, and the astronauts landed safely. This part made me reflect on how important it is to think beyond what seems likely or convenient. Her ability to imagine every possible mistake shows not only intelligence but humility, the awareness that humans and systems can fail, and that good work anticipates that. Hamilton showed that real achievement does not come from recognition, but from persistence and attention to detail. Even when others doubted her, she stayed focused on what she believed was right. That quiet confidence and responsibility are qualities I hope to develop in my own work.

Emotions & Design: Attractive things work better

This reading made me think deeply about how design affects both our emotions and our behavior. One of the main ideas is that attractive things work better because they make people feel happy and confident. The author explains that when we feel good, our minds become more open and creative, but when we are stressed, we tend to think narrowly and make mistakes. I found this especially interesting because it shows that emotions are not separate from thinking, they actually shape how we use and understand things. Another important idea the author discusses is that good design balances beauty and function. This made me reflect on how I interact with everyday objects. For example, I prefer using items that are both practical and attractive whether it’s a tea set, a notebook, or even my phone interface. When something looks nice, I automatically treat it with more care and feel more motivated to use it.

I also strongly connected with the author’s point about context and mood. He writes, “Design matters, but which design is preferable depends upon the occasion, the context, and above all, upon my mood.” This reminded me of how Kazakh families choose different tea sets depending on the situation. When it is just family, they use the simplest and fastest set. But when guests come, they always bring out the most beautiful one to show respect and hospitality. Another part that stood out to me was how the author connects pleasure with usability. He suggests that when something looks good, we are more tolerant of small problems. I realized this is true for me too, I do not mind if a pretty cup is a bit heavier or a stylish app takes a second longer to load, because its beauty gives me a pleasant feeling. I even change my tea sets every season, one for winter, spring, summer, and fall, because I enjoy drinking from something that matches the season’s atmosphere. The same goes for digital things: an attractive design makes me feel happier and more productive.