Final Project Documentation

1. Concept

My final project helps users, especially beginners, learn how to play the piano and how to jam in the blues style. I created a mini-piano consisting of 2 octaves plus one key (spanning notes C4 to C6), which is a welcoming size for beginners. A visualization of the piano is displayed in the p5js sketch, which helps the player see an animated pressed key and hear the corresponding audio for that pressed key.

Product with Animation Display and Headphones

The piano is color-coded by note, so that note “C” is white, “D” is orange, “E” is red, “F” is blue and so on. This was a deliberate choice because seeing different colours on the piano can help users familiarize themselves with the positions of the keys over time. Additionally, I used this presentation slide deck with instructions to play the color-coded notes in order (example in Fig. 1). Thus, as users see the color-coded notes in the presentation and try to follow them, they can more quickly and easily match each one to the note they should play on the physical piano.

Fig. 1. Instructions to Play, In Left-Right Order, the Color-Coded Notes

2. Pictures / Videos of Project Interaction

 

3. Implementation

The interaction design can be described as follows: users listen to an E-minor backing track and respond by pressing the labelled force sensitive resistors (white keys) and labelled push buttons (black keys), which triggers the corresponding pressed-key animation. They also hear the note being played through p5js.

For hardware, I deliberately used an Arduino MEGA for its sufficient provision of analog pins. I used analog pins A0 to A14 for the force sensitive resistors, and digital pins 2–6 and 8–12 for the push buttons. The schematic is attached.

Schematic

For software, both Arduino software and p5js are used, with a one-way communication from Arduino to p5js. My Arduino sketch is linked here and my p5js sketch is linked here.

Notably, in addition to defining the fsrPins and buttonPins arrays, the Arduino code has string arrays “whiteKeyNames” and “blackKeyNames,” as well as boolean arrays “whiteKeyPressedState” and “blackKeyPressedState,” which store the last sent state (true if pressed, false if released). The setup() function initializes the arrays. In the loop() function, white keys are processed with hysteresis, checking:

  1. IF the FSR reading is above the press threshold and the key was not pressed before, currentActualStateIsPressed becomes TRUE (a new press)
  2. ELSE IF the key was pressed before and the reading is still above the release threshold, currentActualStateIsPressed stays TRUE (still held)
  3. ELSE (the reading is at or below the release threshold), currentActualStateIsPressed becomes FALSE (released)
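The white-key hysteresis above can be sketched as a small state function. This is an illustrative sketch in JavaScript, not the exact Arduino code: the function name and threshold values here are assumptions for the example.

```javascript
// Hysteresis for one FSR key: press only above PRESS_THRESHOLD,
// release only once the reading falls to RELEASE_THRESHOLD or below.
// Threshold values are illustrative, not the ones in the real sketch.
const PRESS_THRESHOLD = 300;
const RELEASE_THRESHOLD = 150;

function updateKeyState(reading, wasPressed) {
  if (!wasPressed && reading > PRESS_THRESHOLD) {
    return true; // crossed the press threshold: newly pressed
  }
  if (wasPressed && reading > RELEASE_THRESHOLD) {
    return true; // still above the release threshold: stays pressed
  }
  return false; // at or below the release threshold: released
}
```

The gap between the two thresholds is the point of hysteresis: a reading hovering near one fixed threshold would otherwise toggle the key rapidly between pressed and released.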

For stability, the black keys are processed with debouncing: mechanical button contacts “bounce” (rapidly make and break contact) for a few milliseconds when pressed or released, so a state change is only accepted after the reading has stayed stable for a short delay.
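The debounce idea can be sketched as follows. Again this is an illustrative JavaScript sketch, not the Arduino code itself; the 50 ms window and the names are assumed for the example.

```javascript
// Debounce for one push button: a raw state change is only accepted
// after it has persisted unchanged for DEBOUNCE_MS milliseconds.
// The 50 ms window and all names are illustrative assumptions.
const DEBOUNCE_MS = 50;

function makeDebouncer() {
  let stableState = false;  // last accepted (debounced) state
  let lastRawState = false; // most recent raw reading
  let lastChangeTime = 0;   // time when the raw reading last changed

  return function update(rawState, nowMs) {
    if (rawState !== lastRawState) {
      lastRawState = rawState;
      lastChangeTime = nowMs; // raw input changed: restart the timer
    }
    if (nowMs - lastChangeTime >= DEBOUNCE_MS) {
      stableState = rawState; // stable long enough: accept the new state
    }
    return stableState;
  };
}
```

On the Arduino side the same effect is achieved by comparing millis() timestamps rather than blocking with delay(), so the loop keeps scanning the other keys.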

Initially, I was inspired by this p5js template, which uses WEBGL to create a 3D animation. In my case, orbitControl() is enabled, allowing rotation and zooming in and out. Notably, arrays of audio files (whiteKeyAudioFiles and blackKeyAudioFiles) as well as arrays of piano key objects (whiteKeyObjects and blackKeyObjects) keep my code concise and easier to manage. For organization purposes, the “WhiteKey” class is stored in WhiteKey.js and the “BlackKey” class in BlackKey.js.

Using the template, I learned how to generate the animation: set an initial white key x-position (initial_wk_x = -375), set a white key spacing (white_key_spacing = 50), since white keys are spaced evenly on a real piano, and set dedicated black key x-positions [-350, -300, -200, -150, -100, 0, 50, 150, 200, 250]. Since I had more keys than the original template, I had to edit the black key x-positions and the initial white key position.
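The layout above can be computed directly, using the values just described (the key count follows from 2 octaves plus one key: 15 white keys and 10 black keys):

```javascript
// White keys are evenly spaced from an initial x-position;
// black keys sit at dedicated positions (values from the sketch).
const initial_wk_x = -375;
const white_key_spacing = 50;
const NUM_WHITE_KEYS = 15; // 2 octaves + 1 key = 15 white keys

const whiteKeyX = [];
for (let i = 0; i < NUM_WHITE_KEYS; i++) {
  whiteKeyX.push(initial_wk_x + i * white_key_spacing);
}

// Black keys are not evenly spaced, so their positions are listed explicitly.
const blackKeyX = [-350, -300, -200, -150, -100, 0, 50, 150, 200, 250];
```

This places the white keys from x = -375 to x = 325, with the black keys interleaved at their listed positions.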

In the communication from Arduino to p5, a string is sent line by line containing the note name, followed by “=”, followed by the note state (0 or 1, depending on whether the key is considered pressed). activeSerialNoteStates is an important variable that stores the latest state (0 or 1) received from Arduino for each note name. Based on the current state of a particular key in activeSerialNoteStates, handlePressEvent() and display() are called for that key.

function readSerial(data) {
  if (data != null) {
    let parts = data.split("=");
    if (parts.length === 2) {
      let noteName = parts[0].trim();
      let noteState = parseInt(parts[1].trim()); // expected to be 0 or 1

      if (!isNaN(noteState)) {
        // Store the latest state sent by Arduino for this note
        activeSerialNoteStates[noteName] = noteState;
        // console.log("Received: " + noteName + ", State: " + noteState); // For debugging
      }
    }
  }
}
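In draw(), the stored states then drive the key objects. The following is a simplified sketch of that per-frame dispatch; the key objects here are stand-in stubs for the real WhiteKey/BlackKey classes, and the helper name updateKeys() is an assumption for illustration.

```javascript
// Simplified sketch of the per-frame dispatch: for each key, look up its
// latest serial state and trigger the press handler on a 0 -> 1 transition.
// The key objects are stubs standing in for the real WhiteKey/BlackKey.
const activeSerialNoteStates = {}; // e.g. { "C4": 1, "D4": 0, ... }

function updateKeys(keyObjects) {
  for (const key of keyObjects) {
    const state = activeSerialNoteStates[key.noteName] || 0;
    if (state === 1 && !key.isPressed) {
      key.handlePressEvent(); // newly pressed: play sound, start animation
    }
    key.isPressed = state === 1;
    key.display(); // draw the key in its current (pressed/released) state
  }
}
```

Tracking isPressed per key means a held key fires handlePressEvent() only once, on the transition, while display() still runs every frame.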

4. Great Aspects of the Project

In terms of design, I think the color-coded strips are important and helpful indicators that guide the user on where to press given an instruction:

Design Framework – Color-Coded Strips Pasted on Cardboard

Crucially, an important challenge I faced in the project was attaching wires to the FSRs in a way that would not damage them. Initially, I damaged two FSRs in my attempts to wire them. Over time, however, I became more familiar with the wire wrapping tool. Combined with soldering, taping, and attaching the FSRs to the circuit, the entire process was time-consuming, taking perhaps about 8 hours. Taping is essential to prevent exposed conductors from different wires from touching each other and making connections that should not be made.

Adding FSRs to the Circuit and Attaching Them to the Breadboard Using the Wire Wrapping Tool

In terms of organization, the wires have been arranged neatly using a cable organizer:

Wires Arranged Neatly Using Cable Organizer

As for code, the preload() function ensured important audio, including the backing track and the piano key sounds, was loaded before the animation started, so it could be played on time. Moreover, a very critical challenge I faced was slow sound feedback after a key press. To resolve it, I considered factors that could be the issue, such as the animation being a heavy load. Following an AI recommendation, I tried to reduce the load by increasing lerp(…, …, 0.2) to lerp(…, …, 1), even at the cost of a less smooth animation. However, this only reduced the feedback delay very slightly; it was still noticeable. Thinking further, I realized I could have Arduino check the FSR readings against the press threshold and send only the note state (0 or 1) to p5, instead of having p5 do the checks. I tried to revise the code manually at first, but it did not work, so I used AI to help me. After moving the checks to Arduino, the feedback delay issue was resolved! Through this experience, I learned how Arduino truly specializes in input/output (at least compared to p5js), really reinforcing the Professor’s words in class.

In terms of audio choice, I chose recorded sounds of keys pressed on an old piano, whose volume decays over time, instead of digital MIDI sounds, to mimic a real piano experience more closely.

5. Future Improvement

In the future, the project could be expanded by weaving both visual arts and music into an interactive art gallery with rooms, each containing a visual scene with background music. Unlike a traditional art gallery, this gallery comes with an “escape room” challenge: each art room is a phase of the challenge, with which users must interact through musical jamming to reach the next phase (and proceed to the next art room). Once all phases are passed, the user successfully completes the escape room challenge! In this way, users must interact with every artwork in the gallery if they are to pass the challenge.

User decisions can be enabled by analog/digital sensors. For example, users could control an avatar using joysticks to move up, right, left, or down within the escape room setting (e.g. a big treehouse). The user would use clues from the visual artwork and the musical sounds heard from p5js to figure out a solution that involves jamming in an accepted way to pass the phase and reach the next one. The jamming could be through a “piano” made with force sensitive resistors (FSRs), each connected to a buzzer. Connecting each FSR to its own buzzer is crucial, as it enables multiple musical notes to be heard at the same time.

One of my hopes I had for this project is to help music be more beginner friendly – to simplify musical complexity through visual pairing. Novice players often struggle with abstract musical concepts like chords and rhythm patterns. To address this, each puzzle should pair auditory elements with direct visual analogs that guide interactions without requiring prior musical knowledge.

    • Color-coded notes mapped to specific areas in the artwork [1]
    • Animated rhythm indicators synced to musical phrases [2]
    • Geometric shapes representing chord structures [3]

This approach aligns with research showing that visual scaffolding improves music theory comprehension by 42% compared to audio-only instruction [4].

There could be a three-stage learning progression across three phases:

    • Pattern Recognition (Phase 1)
    • Rhythm Matching (Phase 2)
    • Emotional Association (Phase 3)

IMPORTANT: There should always be an option (e.g. a button) to restart the solution for each phase.

Further details can be found here.

Blending an interactive art gallery, an escape room, musical learning, and physical computing could have a lot of potential for an engaging and memorable experience – and perhaps even more research on it for musical learning!

Finalized concept for the project


Design and description of what your Arduino program will do with each input and output and what it will send to and/or receive from P5

  • Inputs:
    • Force Sensitive Resistors (FSRs) – the Arduino program derives each white-key note state from the FSR reading compared against the press threshold. If the reading meets the conditions, the state is sent to p5.
    • Push buttons – the Arduino program derives each black-key note state from whether the button reads as pressed.
  • No outputs

Design and description of what P5 program will do and what it will send to and/or receive from Arduino

The p5 program receives, line by line, a note name and its state (0 or 1) from Arduino, stores the latest state for each note name in activeSerialNoteStates, and calls handlePressEvent() and display() for the corresponding key. Since the communication is one-way, nothing is sent back to Arduino.

Progress

Tomorrow and this Sunday, I could try to work on building the circuit.

References:

  • Blues scale, E blues scale: https://www.youtube.com/watch?v=CjJwxtahGtw
  • Major vs Minor Blues scales: https://www.youtube.com/watch?v=WWEchKvZwdE
  • Pentatonic scales and blues scales: https://www.youtube.com/watch?v=Vj-BOmKgdE4

Preliminary Concept for Final Project

For my Intro to IM Final Project, I wanted to do something that helps me feel so “free” in the sense of creativity: visual arts, together with something I get so excited to experience: music. I’d like to weave both visual arts and music into an interactive art gallery with rooms containing a visual scene with background music. Unlike a traditional art gallery, this art gallery comes with an “escape room” challenge: each art room is a phase of an “escape room” challenge for which users must interact with through musical jamming to reach the next phase (or proceed to the next art room). Once all phases are passed, the user successfully completes the escape room challenge! In this way, users should interact with every artwork in the art gallery if they are to pass this “escape room” challenge.

User decisions can be enabled by analog/digital sensors. For example, users can control an avatar using joysticks to move up, right, left, or down within the escape room setting (eg. a big treehouse). The user should use clues involving the visual artwork and the musical sounds heard from p5js to figure out a solution that involves jamming in a corrected/accepted way to pass the phase and reach the next phase. The jamming could be through a “piano” made with force sensitive resistors (FSR), each connected to a buzzer. The connection of each FSR to a buzzer is crucial as it enables multiple musical notes to be heard at the same time.

The above is the minimum viable product. I was also inspired by the generative text data project from week 4 which involves creating a poem from a corpus of words. If time allows, an additional feature could be incorporated: after the escape room challenge is completed, the user can try to create poem artwork by jamming. Based on the user’s choices, each word of the poem on an artwork are selected and displayed.

Week 11 – Serial Communication

1. Concept

There are 3 parts to the project:

(1) Light Dependent Resistor (LDR) readings are sent from Arduino to p5js. The ellipse in p5js moves along the horizontal axis in the middle of the screen depending on the LDR readings. Nothing on the Arduino is controlled by p5js.
(2) Control the LED brightness from p5js using the mouseX position: the further right the mouse position, the higher the LED brightness.
(3) Taking the gravity wind example (https://editor.p5js.org/aaronsherwood/sketches/I7iQrNCul), every time the ball bounces one LED lights up and then turns off, and you can control the wind with the potentiometer (analog sensor).

2. Highlights: Challenges and Overcoming Them

(1) The circle position was not changing in response to the brightness of the surroundings. We tackled this by checking the serial monitor to see whether LDR readings were being read. After confirming the LDR’s functionality, we closed the serial monitor, proceeded to p5js, and selected the right serial port. However, the circle position was still not changing. With help from the Professor, we streamlined our code down to just what seemed necessary. This worked!

 

(2) The LED was flickering, and we did not know why. Alisa thought that the delay(50) should come after, rather than before, analogWrite(ledPin, brightness). However, that did not solve the problem. Samuel thought to remove the delay(50); it still did not work. We then decided to map the mouseX position (ranging from 0 to 600) to the range 0 to 255 in Arduino instead of p5js. This worked!
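The mapping itself, whichever side it runs on, is a linear range conversion equivalent to Arduino’s integer map(). Sketched in JavaScript for illustration (the function name mapRange is an assumption):

```javascript
// Linear range mapping, equivalent to Arduino's integer map():
// mouseX in [0, 600] -> PWM brightness in [0, 255].
function mapRange(x, inMin, inMax, outMin, outMax) {
  return Math.floor((x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin);
}

const brightness = mapRange(600, 0, 600, 0, 255); // rightmost mouse -> full brightness
```

Doing this on the Arduino side means p5 can send the raw mouseX value and Arduino converts it to a valid analogWrite() range.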

 

(3) For the 3rd part, I worked alongside many different individuals: Alisa, Samuel, Haris (and even Prof Aya’s slides). I had trouble at every stage of the hardware aspect, from Safari stopping the code from interacting with the Arduino to serial issues with the laptop. I pieced together work until I finally had it working in the end. The coding aspect was simple overall, as the base code only had to be minimally amended to take inputs from the Arduino, and the LED had to complete an action based on the running programme.

3. Video

Part 1:

Part 2:

Part 3:

4. Reflection and Future Areas of Improvement

Our challenges emphasized the need to understand the code that allows communication between p5js and Arduino. Without understanding the code, it is difficult to make the right removals or changes necessary to achieve the results we are hoping for.

It would be great to find out why the LED did not flicker when the mapping was performed in Arduino rather than p5js. Is this related to PWM?

Week 11 – Reading Response

My first takeaway is the importance of design for disability support. Prior to the reading, I was not aware that glasses were treated as medical prescriptions not long ago (in the 1960s and 1970s), that they carried the stigma of the medical model (they were considered to “cause social humiliation”), and that there was a transition toward eyeglasses becoming stylish and fashionable. The irony that striking fashion frames are less stigmatizing than pink plastic glasses strikes me. At first I thought that what is more visible (such as a striking fashion frame) can be more positively received than what is intended to be inconspicuous (pink plastic glasses). But that thought is not fully correct: glasses’ acceptability does not come directly from their degree of visibility, for boldness can be to the distaste of certain consumers. Rather than simply making things more “visible,” I see that the author brings forward the importance of designers in the making of disability support devices. In the design of prosthetic limbs, design engineers are not typically demanded in the multidisciplinary teams of engineers, clinicians, etc. Yet I see so much potential in the hearing aid Ross Lovegrove designed for the RNID, called The Beauty of Inner Space. Since design can project a negative connotation (such as invisibility projecting a sense of shame, of something to be hidden) or a positive one, we need designers so that people with disabilities can feel more comfortable. With more comfort, it is easier to “wear” than to “use” (words I also noted with interest), as the former implies more closeness to oneself than the latter.

My second takeaway is the importance of design in general, not just for disability. Steve Jobs’s words that design is not just “what it looks like and feels like” but “how it works” suggest that design is essential and significant, and should not be downplayed! This makes sense given the example of the iPod shuffle: the design choice of offering only two options lets the user play tracks either in the order they were downloaded or in a random order. Two good principles I wish to remember for future design are: one, to consider and pursue great cohesion between the parts and the whole; two, to create more accessible designs, using simplicity to achieve this.

Week 10 – Reading Response

Through the reading “A Brief Rant on the Future of Interaction Design,” I was convinced that the Future of Interaction should not be a single finger and should have good feedback, and I think big innovations need to raise the usefulness of current things. It is problematic that most Future Interaction Concepts completely ignore parts of the body we rely on. If Pictures Under Glass had been invented before the iPad or touch-screen phones existed, then perhaps it would have been considered a good advancement (one still needing further improvement in interaction). Pictures Under Glass looks similar to the current phone or iPad, and I was envisioning a different way of seeing and interacting with that content in the future, like content you can swipe and change up in the air (is this a good thing or not?). While watching the video “Microsoft – Productivity Future Vision,” I felt I was looking for something useful, something more. For example, translation using what you see through your glasses looks very useful. If you can follow recipes by reading what you need up in the air, this could be useful because you just need to look up from the food. If things look futuristic simply for the sake of looking futuristic, is that really progress, or moving backward?

However, I do disagree with one definition used for a tool (“A tool addresses human needs by amplifying human capabilities”), as I think a tool can address human needs in another way, such as by opening up a new possibility for humans rather than amplifying existing capabilities. Another definition of a ‘tool’ strikes me well: “That is, a tool converts what we can do into what we want to do. A great tool is designed to fit both sides.” I had never thought of it that way, and I certainly agree with this one.

As for the second reading, I was reminded of my experience with virtual reality, when I tried to touch objects I saw in the air but felt no feedback. I was still amazed by what I experienced, but imagine two people learning fencing in a virtual world without feeling the weight of the weapon and its impact on the opponent’s weapon; I really think that would not work… Also, while virtual reality is cool, like the author I have a problem with a future where people can and will spend their lives completely immobile, spending all their time on computers that are not part of the physical environment. That is unhealthy, and inventions should be used to help people and encourage them to take good actions.

Week 10 – Group Musical Instrument

1. Concept and Process of Ideation

At first, we wanted to create the instruments we liked: for Samuel, a drum set, and for Alisa, a violin. However, we encountered large challenges that would take too much time or complexity within this homework timeframe, so we changed our idea. The next idea was two parts of one piano: Samuel’s instrument would play the lower notes, as you would on a piano with your left hand, paired with Alisa’s instrument playing the higher notes, as you would with your right hand.

However, this idea was revised again because Samuel’s instrument has fast feedback, while Alisa’s takes longer to register the orientation and play the correct note accordingly. So Samuel’s instrument plays the higher notes (more soprano), which sound much more crisp and clear to the listener, while Alisa’s instrument is the accompaniment, used to play each lower note (more bass) for a longer amount of time in general.

Samuel’s musical instrument is for the “left hand of a piano”, with musical notes C, D, E, F, G, A playable using force sensors (analog sensors). In coming up with this idea, we considered which sensors would be most similar to a piano, registering presses. Push buttons and force sensors were our options, but we went for the force sensors (for Alisa, because she hadn’t tried using the force sensors yet!).

Although Alisa’s musical instrument was considered earlier as for the “right hand of a piano,” it really does look like much more of a music box so it is now a “music box.” In the process of coming up with the music box, we thought of taking the last project with the gyroscope and using a button press to play the sound of musical notes, with the notes differing based on the orientation of the gyroscope. Samuel thought of using the arcade LED button (digital sensor) we found as we looked around the components in the IM Lab, and I thought that if the LED button could light up when being pressed, that would be great feedback for the user.

2. Highlights

One crucial challenge in creating the “music box” was that we wanted to use an Arcade LED Button but did not have access to the manual. Alisa researched online and thought the LED button needed one positive (+) and GND (−) connection for the LED, as well as one positive (+) and GND (−) for the button itself. She tried this based on the colours of the cables (2 red and 2 black) she found already soldered. However, upon further research, Alisa found that these cable colours are misleading…

Alisa inspected the component and found ‘BAOLIAN’ on the component as possibly the brand name. She did further research online and found two useful sources:

  • ezbutton library resource- https://arduinogetstarted.com/tutorials/arduino-button-library
  • https://forum.arduino.cc/t/using-arcade-buttons-solved/677694

One of these sources includes a diagram explaining the connections, shown below. Notice the crown; it was also printed on the component. Given that it was circled, the crown facing upwards must be important. I connected it this way, edited the code from the forum, and the LED button lit up when pressed!

Arcade Button Connections

I used the gyroscope of the MPU-6050 for my last project, but the MPU-6050 contains both a gyroscope and an accelerometer. While the gyroscope is smooth (since its rates are integrated), it has the problem of accumulating error (drift) over time. On the other hand, accelerometer readings fluctuate a lot, but they do not drift over the long term. Therefore, to balance accuracy and smoothness, I needed to combine gyroscope readings with accelerometer readings through a Kalman filter. I used this playlist to help me: the accelerometer calibration must be done manually as in video 14, and video 15 shows how gyroscope and accelerometer readings can be combined.
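The underlying idea of this sensor fusion can be illustrated with a complementary filter, a simpler stand-in for the Kalman filter actually used here: integrate the gyro rate for smoothness, then pull the estimate toward the accelerometer angle to cancel drift. The blending weight ALPHA = 0.98 below is an assumed, typical value, not one from the project.

```javascript
// Complementary filter sketch (NOT the Kalman filter used in the project):
// blend the smooth-but-drifting gyro integration with the
// noisy-but-drift-free accelerometer angle.
const ALPHA = 0.98; // assumed blending weight; typical values are 0.95-0.99

function complementaryFilter(prevAngle, gyroRateDegPerSec, accelAngleDeg, dtSec) {
  const gyroAngle = prevAngle + gyroRateDegPerSec * dtSec; // smooth, drifts
  return ALPHA * gyroAngle + (1 - ALPHA) * accelAngleDeg;  // correction toward accel
}
```

Called every loop iteration, the small (1 − ALPHA) correction continuously nudges the angle back toward the accelerometer’s drift-free reference; a Kalman filter does the same job with a statistically derived, time-varying weight instead of a fixed one.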

To have the music box play different notes depending on orientation, the code needed if-conditions checking whether the filtered roll and yaw angles, computed from the combined gyroscope and accelerometer readings, fall within certain ranges.
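Those range checks can be sketched like this. The angle boundaries and note choices below are hypothetical, purely for illustration, not the ranges used in the actual music box.

```javascript
// Map a filtered roll angle (degrees) to a note via range checks.
// All ranges and note names here are hypothetical, for illustration only.
function noteForRoll(rollDeg) {
  if (rollDeg < -30) return "C4";
  if (rollDeg < -10) return "D4";
  if (rollDeg < 10)  return "E4";
  if (rollDeg < 30)  return "F4";
  return "G4";
}
```

With the button press as the trigger, the instrument then plays whichever note the current orientation selects.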

In making the FSR keyboard, one challenge we encountered was faulty resistors that produced readings even when not pressed; we had to test each resistor separately, which was time-consuming but informative.

Another challenge was that the buzzer could only play one frequency at a time, since tone() is non-blocking but generates only a single tone at once, so playing a chord, as was the original plan, became a challenge. We therefore settled on single notes, unless we had a buzzer for each FSR.

3. Video

4. Reflection and ideas for future work or improvements

One key limitation we faced was the Arduino UNO’s constraint of only six analog input pins for analogRead(), which restricted us to mapping just six musical notes for the FSR keyboard. Ideally, we would have included a seventh note to complete a full octave. In future iterations, we could use an Arduino MEGA, which has many more analog inputs.

Additionally, a valuable improvement for the music box would be providing real-time feedback to the user indicating which note is currently being played. This could be achieved by incorporating an OLED display, an array of LEDs corresponding to each note, or even serial monitor outputs during development. These enhancements would improve usability and allow for a more intuitive and engaging musical experience.

Week 9 – Digital + Analog Sensors

1. Concept

I was inspired by the gloves from the creative reading “Physical Computing’s Greatest Hits (and misses)” to go for a project with specific responses to specific movements. Excited by the freedom of flight, I wanted to incorporate flight. I had two flight-related parts with me: a gyroscope (measuring rotation rates) and a barometer (measuring pressure, from which altitude is derived), which are, by the definitions of the IM technical readings, analog (although they use I2C, which converts the analog signals to digital). I incorporated a slide switch as the digital sensor.

2. Highlights

I attached some popsicle sticks to the breadboard to express more clearly the idea of an airplane or jet. I used red LEDs specifically as indicators for warning.

In order to figure out how to wire up and code, I researched online and found the following resources valuable in the process of building my circuit and code:

  • https://www.instructables.com/How-to-Connect-MPU6050-to-Arduino-UNO/
  • https://www.electronicwings.com/arduino/mpu6050-interfacing-with-arduino-uno
  • https://components101.com/sensors/gy-bmp280-module
  • https://www.instructables.com/Slide-Switch-With-Arduino-Uno-R3/
  • https://projecthub.arduino.cc/SurtrTech/bmp280-measure-temperature-pressure-and-altitude-6002cd

The technical and creative work with code was in combining codes for the different components together, adjusting any pin numbers, and adding conditions for the LEDs to turn on.

I encountered major obstacles in both hardware and software:

  • I mistakenly thought the BMP-280 was a BME-280, so I downloaded the wrong library (for the BME-280). Reading this website made me realize that my component looked similar, and the lettering printed on the sensor gave me a strong inclination that it was a BMP-280.
  • Initially, I connected the SCL and SDA of the pressure sensor to analog pins A4 and A5. However, readings from the pressure sensor were not appearing. I replaced wires, thinking they were faulty, but that didn’t work. I then connected the wires to the dedicated SCL and SDA pins on the Arduino UNO, but probably because I was using the wrong library (BME-280), the pressure sensor readings still did not appear.
  • My circuit was wired in the wrong order. The SCL and SDA of the gyroscope were correctly chained to the SCL and SDA of the pressure sensor, but the pressure sensor’s SCL and SDA should then have been wired to the Arduino UNO, rather than the gyroscope’s SCL and SDA being wired directly to the Arduino UNO.

3. Video

4. Reflection and ideas for future work or improvements

Overall, I am glad I learned so much on the technical side. Improvements include incorporating a buzzer that mimics aircraft warning sounds to alert the pilot, and incorporating a radio link with other “airplanes” so that when one airplane enters another’s vicinity, the other airplane’s warning light is triggered.

Week 9 Reading Response

I completely understand the view brought up in the reading “Physical Computing’s Greatest Hits (and misses)”: so often I think to myself, “I don’t want to do that, it’s already done,” and give up on an idea because I think it’s not original. However, my horizons broadened when I realised through the reading that even recurring themes have a lot of room for originality, if I add a notable variation or twist to them. As the reading discusses specific examples of physical interaction ideas, I think it is great that it contains not only a description of each idea and its positive aspects, but also its limitations. For instance, with Theremin-like instruments, moving a hand over a sensor can carry little meaning, but the idea can be developed through a twist involving a physical form and context for the sensors that affords a meaningful gesture. I see gloves as a form that affords much more meaning, because the way our fingers bend, and which fingers bend, can produce so many variations and convey meaning: whether someone is stressed or relaxed, the way to play an instrument, etc. Another limitation that stood out to me was with the Scooby-Doo paintings, where designers of this type of project commonly confuse presence with attention (as I personally have). Someone’s presence does not necessarily mean that person is paying attention. Hence, I made a mental note to watch for this in any similar future projects, where I could integrate detection of eyes and faces, for example.

The next reading, “Making Interactive Art: Set the Stage, Then Shut Up and Listen,” brought to my attention a problem I still need to work on. So often, artists make artworks and then describe their work and their interpretations. Personally, if I were the audience of such an artwork, it would feel more difficult to interpret the work differently because I had been directed in how to think. I believe the audience will enjoy a work more when they get the opportunity to take it in through their senses: to think about what each part means, which parts afford contact or control, and which parts don’t. In letting them interpret the parts and decide how to respond, rather than prescribing their response, they could be more engaged and discover more. My question is: what is the balance between describing the environment and letting the audience discover the artwork?

Reading Reflection – Week#8

The first reading, “Attractive things work better,” changed one of my most important beliefs about the importance of beauty, and emphasized the importance of design for me. I don’t think previous readings had fully convinced me that beauty was as important as utility. It was through this reading that I came to think beauty can improve mood, helping people relax into positive affect, which may be exactly the affect needed at the time, since it increases “the tolerance for minor difficulties and blockages.” I also felt the importance of design through a very plausible example: a person in anxiety, in flight mode, running from danger as urgently as possible, confronting a door that won’t open when pushed. People may react by kicking and pushing harder, but this doesn’t solve the problem. If they were more relaxed, they might have the slightly different thought to pull the door instead. This example shows how design can help save lives. Thus, a key takeaway for me is that the principles of human-centered design are especially important in stressful situations. The implication is that designs intended for stressful situations must match the needs of their users, making actions salient and easy to apply.

What stood out to me from the second reading, “Her Code Got Humans on the Moon,” were: first, the value of code in allowing humans to go to the moon, save lives, and much more; second, the significance of not ignoring a danger just because it seems improbable; third, the importance of an error detection and handling process. In particular, it was striking that when the Apollo software realized it didn’t have enough room to do everything it was doing, it went through its error detection process and simply focused on the highest-priority tasks. This is something I want to apply to my own work as well.