Chroma Cassette: A Multi-Speed Song Machine with Interactive Lights – Tengis & Dachi

The Chroma Cassette is a fun and interactive project that plays pre-loaded songs and allows users to control the playback speed using a distance sensor. The name “Chroma” refers to the project’s ability to switch the color of LEDs based on the song being played, while “Cassette” reflects the inspiration behind the variable playback speed control, similar to the fast-forwarding feature of cassette tapes.

  • Hardware Components:
    • Arduino Uno microcontroller
    • Piezo Speaker for playback
    • Distance sensor (ultrasonic sensor) to detect distance
    • Button for manual song switching
    • LEDs (Red, Green, Blue) for colorful song indication
    • Jumper wires for connecting components
    • Breadboard
  • Software (Code):
    • An array named songNames stores the titles of the pre-loaded songs (Game of Thrones, Imperial March, Pirates of the Caribbean, Silent Night).
    • Each song melody is defined as an array representing musical notes and their corresponding durations.
    • A function named playSong iterates through each note in the current song and plays it for its duration. The function also calculates a speedFactor from the distance measured by the sensor, which scales the note duration and thereby changes the playback speed: greater distances slow the song down, while shorter distances speed it up, much like fast-forwarding a cassette.
    • The setRGBColor function assigns specific colors to the LEDs based on the current song being played, adding a visual element to the project (a sketch of one possible implementation follows this list).
    • An interrupt service routine is triggered when the button is pressed, and a flag named switchSong is set to true, indicating the need to switch to the next song in the playlist.
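
The full code for setRGBColor is not reproduced in this post, so below is a minimal sketch of how such a song-to-color mapping could look. The pin numbers and color choices here are our assumptions, not the original code; the function signature matches the setRGBColor(currentSong, note) call in the playSong snippet further down.

// Hypothetical sketch: one base color per song, driven over PWM.
// RED_PIN, GREEN_PIN, BLUE_PIN are assumed names and numbers.
const int RED_PIN = 9;
const int GREEN_PIN = 10;
const int BLUE_PIN = 11;

void setRGBColor(int song, int note) {
  // The note value could further modulate brightness; here only the song selects the color.
  switch (song) {
    case 0:  analogWrite(RED_PIN, 255); analogWrite(GREEN_PIN, 0);   analogWrite(BLUE_PIN, 0);   break; // e.g. red
    case 1:  analogWrite(RED_PIN, 0);   analogWrite(GREEN_PIN, 255); analogWrite(BLUE_PIN, 0);   break; // e.g. green
    case 2:  analogWrite(RED_PIN, 0);   analogWrite(GREEN_PIN, 0);   analogWrite(BLUE_PIN, 255); break; // e.g. blue
    default: analogWrite(RED_PIN, 255); analogWrite(GREEN_PIN, 255); analogWrite(BLUE_PIN, 255); break; // white
  }
}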

The Chroma Cassette project initially embarked on a path paved with frustration. Our first goal was to directly control the volume of the pre-loaded songs. Countless hours were spent crafting code, only to run into a hard limitation: directly manipulating volume on the Arduino platform proved to be an insurmountable hurdle. This limitation stems from the architecture of the Arduino kit itself, which lacks dedicated hardware for fine-grained volume control.

We brainstormed alternative approaches to a dynamic audio experience, eventually agreeing on varying the playback speed of the music. This approach, however, presented its own set of challenges, and it took plenty of trial and error, adjusting code and testing countless iterations. This phase, though time-consuming, ultimately yielded a solution that met our standards.

The foundation of the Chroma Cassette lies in its pre-loaded song library. Each song, be it the epic theme from Game of Thrones or the whimsical melody of Pirates of the Caribbean, was chosen to complement the project’s functionality. Once the song selection was finalized, we embarked on a critical step: adjusting the speedFactor. This variable acts as the heart of the speed control mechanism. Meticulous adjustments were made to the speedFactor for each song, ensuring that even at faster playback speeds the music retained its integrity and remained pleasant to the ears.

The distance sensor served as the conductor in this symphony of sound and speed. It was calibrated to operate within a specific range, from 1 centimeter to 30 centimeters, which ensured a smooth and responsive adjustment curve. As the distance between the sensor and a hand increased, the playback would gradually slow down; moving closer sped the music back up, mimicking the fast-forwarding of a cassette tape.
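
The getDistance helper is not shown in the snippet below, so here is a minimal sketch of how a typical HC-SR04 read, clamped to that 1–30 cm window, could look. TRIG_PIN and ECHO_PIN are assumed names; pinMode(TRIG_PIN, OUTPUT) and pinMode(ECHO_PIN, INPUT) are assumed to be set in setup().

const int TRIG_PIN = 6; // assumed pin assignments
const int ECHO_PIN = 7;

float getDistance() {
  // Send a 10 microsecond trigger pulse.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Echo time (microseconds) to distance (cm): sound covers ~58 us per cm round trip.
  long duration = pulseIn(ECHO_PIN, HIGH, 30000); // time out after 30 ms
  float distance = duration / 58.0;

  // Clamp to the calibrated 1-30 cm control range.
  return constrain(distance, 1.0, 30.0);
}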

To enhance user interaction, we incorporated additional functionalities. Pressing the designated button would seamlessly switch between songs in the playlist. To provide a visual cue for song changes, an LED was integrated into the system. Whenever the user switched songs, the LED would illuminate briefly, acknowledging the user’s input.
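
The write-up above mentions an interrupt service routine that sets the switchSong flag; a minimal sketch of that pattern might look like the following. The pin number and debounce window are assumptions, and the main loop is assumed to handle the actual song change and the brief LED cue.

const int BUTTON_PIN = 2; // must be an interrupt-capable pin on the Uno (2 or 3)
volatile bool switchSong = false;
volatile unsigned long lastPress = 0;

void onButtonPress() {
  // Simple debounce: ignore presses within 200 ms of the last one.
  unsigned long now = millis();
  if (now - lastPress > 200) {
    switchSong = true; // main loop advances the song index and blinks the cue LED
    lastPress = now;
  }
}

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(BUTTON_PIN), onButtonPress, FALLING);
}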

The RGB LED added a captivating layer of visual flair to the project. This versatile LED, capable of displaying a spectrum of colors, was linked to the songIndex variable. As the user cycled through the song playlist, the RGB LED would change color, reflecting the currently playing song. This color association wasn’t random – it drew inspiration from the source material of each song. For instance, the vibrant hues of green, purple, yellow, and orange adorned the LED when playing the Harry Potter theme, a subtle nod to the four Hogwarts houses.

Faced with an initial hurdle, we pivoted our approach and ultimately delivered a unique and engaging audio experience. The project seamlessly blends pre-loaded songs, dynamic speed control based on sensor input, intuitive user interaction, and a captivating visual element through the RGB LED. 

The Chroma Cassette might be an interesting blend of sound and light, but there’s always room to make it even better. For instance, enhancing the audio quality, especially at faster playback speeds, could be a priority; techniques like digital signal processing libraries on the Arduino might help reduce pitch shifting and distortion. Imagine users being able to upload their own soundtracks! This could be achieved by incorporating an SD card or a Bluetooth module, significantly expanding the song library and personalizing the experience for each user. The distance sensor integration could also be taken a step further: a more advanced sensor with a wider range would give users finer control over the playback speed across a larger distance, making the experience more intuitive. Another exciting possibility is an interactive light show, in which the RGB LED reacts to the music’s rhythm and melody, creating a dynamic visual spectacle that complements the audio and adds a whole new dimension to the user experience.

Picture: 

Video:

Code Snippet:

// Function to play a song
void playSong(int *melody, int melodyLength) {
  // Check if the speaker is turned off
  if (digitalRead(SPEAKER_SWITCH_PIN) == LOW) {
    Serial.println("Speaker is turned off.");
    return; // Exit the function if the speaker is turned off
  }

  // Iterate through each note in the melody (stored as note, duration pairs)
  for (int noteIndex = 0; noteIndex < melodyLength; noteIndex += 2) {
    float distance = getDistance(); // Update distance with each note

    // Adjust speedFactor based on the song and distance
    float speedFactor;
    if (currentSong == 1) {
      // Slower scaling for Imperial March: slows down the max speed
      speedFactor = 1.1 + (distance / 30.0);
    } else if (currentSong == 2) {
      speedFactor = 0.6 + (distance / 30.0);
    } else if (currentSong == 3) {
      speedFactor = 0.4 + (distance / 30.0);
    } else {
      speedFactor = 1.2 + (distance / 30.0);
    }

    // Calculate the note duration based on the speed factor
    int noteDuration = (int)(1000 / melody[noteIndex + 1] * speedFactor);

    // Check if the song should be switched or the speaker is turned off
    if (switchSong || digitalRead(SPEAKER_SWITCH_PIN) == LOW) {
      noTone(SPEAKER_PIN); // Stop tone when switching off
      break;
    }

    // Play the note
    tone(SPEAKER_PIN, melody[noteIndex], noteDuration);
    setRGBColor(currentSong, melody[noteIndex]);
    delay(noteDuration * 1.30); // Pause slightly longer than the note to separate tones
    noTone(SPEAKER_PIN);

    Serial.print(songNames[currentSong]);
    Serial.print(": Playing note: ");
    Serial.print(melody[noteIndex]);
    Serial.print(" at duration: ");
    Serial.println(noteDuration);
    Serial.print("Distance: ");
    Serial.print(distance);
    Serial.println(" cm");
  }
}

Schematics:

Additional link (serial output):

https://drive.google.com/file/d/1Hl3iAj1yXwIkOQnEi8e_lYcrHDTGKorh/view?usp=sharing

Assignment #11 – Midterm Idea

For the final project, I am thinking of making a camera on wheels. I was inspired by the Panther dolly, which consists of a track and a wheeled mount on which the camera sits. Essentially, someone pushes the dolly when a scene requires the camera to steadily follow the character along a line.

While there is usually a camera operator working on the dolly, I thought to myself, why not make a moving camera that follows the actor without needing someone to push it on the track?

That way, it can even make cellphone cinema easier!

The Arduino part would consist of a button to start and stop recording, an LED that indicates whether it is recording, and a piezo buzzer that sounds when the button is first clicked to record. I would also need to incorporate the wheels and a way to control them, perhaps with a remote?
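
As a rough starting point, the Arduino side could be as small as the sketch below: a button toggles a recording flag, the LED mirrors the flag, and the buzzer beeps when recording starts. All pin numbers here are placeholders, not a finished design.

const int BUTTON_PIN = 2;
const int LED_PIN = 8;
const int BUZZER_PIN = 9;

bool recording = false;
int lastButtonState = HIGH;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  pinMode(LED_PIN, OUTPUT);
  pinMode(BUZZER_PIN, OUTPUT);
}

void loop() {
  int buttonState = digitalRead(BUTTON_PIN);
  // Detect a new press (HIGH -> LOW with the internal pull-up).
  if (lastButtonState == HIGH && buttonState == LOW) {
    recording = !recording;
    digitalWrite(LED_PIN, recording ? HIGH : LOW); // LED shows recording state
    if (recording) {
      tone(BUZZER_PIN, 880, 150); // short beep when recording starts
    }
    delay(50); // crude debounce
  }
  lastButtonState = buttonState;
}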

The p5 part would essentially consist of a video feed, and maybe a way to save the videos somewhere.

Now, I feel like this may be a bit too ambitious, especially since I have struggled more with Arduino than with p5. If I feel like I won’t be able to do it, I might tweak some elements. But I would be really interested in creating something like this!

Final Project Proposal Dachi

Concept

For my Final Project, I don’t yet have a concrete idea that I am committed to, but I know the general area I want to work in, as well as some potential projects.

I want to explore ML5.js, a machine-learning library designed for the web. In short, it uses the graphical power of the browser to run machine-learning computations. It works well with p5.js, which is why I want to use it. It is a beginner-friendly open-source library that provides a high-level interface to TensorFlow.js. As someone with no machine-learning experience, it will be very interesting to explore it in depth. It ships with pre-trained models for image classification and recognition; for example, the HandPose model can detect hands and let you trigger actions with them.

The initial concept is to create a canvas where you can draw with your hands in p5.js with the help of HandPose. I could use Arduino to add extra functionality: for example, embedding sensors in a cardboard panel that acts as a brush-settings controller, with each sensor changing a different brush property. A cutout for the ultrasonic sensor could change the brush size as you move your finger closer or farther. Ideally, I would like to make the interface very seamless.
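
On the Arduino side, that controller could simply stream mapped sensor readings over serial for the p5.js sketch to pick up. Here is a sketch of the idea, with assumed pins, an assumed 2–20 cm finger range, and a single ultrasonic sensor standing in for the full panel:

const int TRIG_PIN = 6; // assumed wiring
const int ECHO_PIN = 7;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Measure finger distance with the ultrasonic sensor.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);
  float distance = duration / 58.0; // cm

  // Map 2-20 cm to a brush size of 1-50 px and send it to p5.js.
  int brushSize = constrain(map((int)distance, 2, 20, 1, 50), 1, 50);
  Serial.println(brushSize);
  delay(50);
}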

I could expand this idea so that users create generative art by varying different mathematical properties with their hands; it does not have to be limited to just drawing.

Challenges

I would face many challenges with this approach, as I have no prior experience. Moreover, I would have to combine the machine-learning aspect with Arduino to create a seamless experience. My plan for overcoming these challenges is to start learning as early as possible: going through the library, learning the basics, and finalizing a plan I can stick with and dedicate much more time to. It should be challenging, yet achievable in the timeframe we are given. I am optimistic that this project will not only teach me lots of new things but will also be quite fun!

Week 11 – Response

The author raises valid concerns regarding the current trajectory of interface design, particularly the over-reliance on “Pictures Under Glass” – flat, touch-based interfaces that neglect the incredible capabilities of human hands. I wholeheartedly agree that this approach is limiting and fails to tap into the full potential of human-computer interaction.

The article beautifully highlights the richness of tactile experience and the intricate ways our hands manipulate and interact with the world. Reducing interaction to mere sliding gestures on a flat surface ignores this wealth of human capability and expression.

Instead of simply extrapolating current trends, the author urges us to envision a future where interfaces are dynamic, engaging our senses of touch and manipulation in ways that are intuitive and expressive. This vision resonates deeply and calls for a shift in focus towards technologies that leverage the full potential of human hands.

The article rightly emphasizes the limitations of purely visual interfaces. Haptic feedback technology, which recreates the sense of touch, holds immense potential in enriching user experience. Imagine feeling the texture of fabrics while online shopping, or experiencing the resistance of virtual objects in a design program.

The article challenges the dominance of flat screens and encourages exploration of 3D interfaces. Technologies like volumetric displays and mid-air haptics could enable us to interact with digital content in a more natural and intuitive manner, mimicking real-world manipulation of objects.

When I play video games, I prefer using a controller over a mouse and keyboard. This is for many reasons, but I especially enjoy the haptic feedback I get while playing. It adds an extra dimension, an extra sense, that is lost on a mouse and keyboard. I also appreciate the quality of the haptics on the Nintendo Switch, and how integral they are to many games, which just makes them more fun.

While the current state of research in these areas might be nascent, the author’s call for ambitious, long-term vision is crucial. We need researchers, designers, and engineers to be inspired by the possibilities beyond “Pictures Under Glass” and work towards interfaces that truly empower and enhance human capabilities.

Assignment #11 – Reading Response – Manipulate, Move, Feel No More

Our bodies are made to manipulate, to move, to feel. I mean, the author states that too. When these «technologies» of the future are introduced, they not only hinder our bodies’ abilities, but also replace them with much more harmful ways of being.
First, to manipulate. In a way, we still manipulate these technologies and those to come in the future. We turn them on, we scroll, we tap… Perhaps. But how much agency do we actually have over what these technologies present to us? Particularly in the age of media, data privacy (or the lack thereof), and consumption, these devices may not only become biased but also use our own information against us. A hotel key card such as the one in the video, combined with all of one’s other passes and documents, can easily lay the ground for infringement of privacy. But it’s not as if this is not already present in some way. Apple Wallet, for example, can keep all your cards and passes in one place. Although this digital wallet may be efficient, how safe do we know it is? How do we know that we are not giving it control over us, instead of it being the other way around?
Simultaneously, this digitization of everything limits our movement. We become lazy. When I was traveling back to Abu Dhabi from Paris this January, I was surprised to find out at the airport that check-in now happened through a machine. Clerks were only available if an issue arose. And well, many of the people checking in were facing issues, and there were only two people assisting. So it seems that technology now and in the future, under the pretense of efficiency, is just a way to lift work off of people that have a job to do – without even being efficient! Even the other day, I went to Mamsha, and found out that you don’t get a parking ticket anymore. The camera at the entrance reads your plate number, which you then give to the restaurant so they can validate your «ticket». It’s all so lazy, isn’t it? And even though these two examples may sound very banal, it applies to bigger things.
I think, at the end of the day, the issue is that quickness is prioritized over efficiency. Things are being transformed without actually taking into account how that will impact user capability AND behavior. They say, don’t fix what’s not broken. But not only do they «fix» what’s not broken, they also render the experience much harder than before.

Dachi Reading Response – Week 11

The first article, “A Brief Rant on the Future of Interaction Design,” argues that our current touchscreen interfaces are missing out on the rich tactile feedback and 3D manipulation that our hands are capable of. The author thinks future interfaces should engage our bodies more fully, like how we interact with the physical world.

The second article is a follow-up where the author addresses some common responses to his original rant. He clarifies that he’s not against touchscreens entirely, but rather sees them as a stepping stone to even more immersive and embodied forms of interaction. He also touches on why he’s skeptical of voice commands and gesture controls as complete replacements for physical interfaces.

Putting the two articles together, it seems like the core idea is that our technology should evolve to better adapt to our human capabilities, especially when it comes to how we use our hands to explore and manipulate objects. The author is calling for a future where digital interactions feel just as tangible and expressive as their real-world counterparts.

I actually agree with this vision, even if the exact path to get there is still unclear. I think there’s a lot of room for interfaces to become more tactile and responsive to our natural ways of interacting. At the same time, I wouldn’t want to completely abandon the benefits of touchscreens, like customization. In the real world, there are many examples where people prefer tactility. For example, several companies have tried to introduce laptops with flat glass keyboards. This would undoubtedly make laptops thinner and add the option of a second screen, but the majority of users, myself included, would hate the typing experience, because there is something much more satisfying about physical keystrokes. (This is the main reason people get into expensive mechanical keyboards; I might be one of those people.)

On the other hand, despite BlackBerry’s popularity in the past, keyboards on phones have become a thing of the past, as the benefits are simply not worth the major compromises: bulkiness, less space for other components, visuals, and so on. Plus, people treat phones mainly as texting machines; when it comes to serious typing, big screens are where the real work happens.

The articles also raised some good points about the limitations of relying too heavily on voice commands or free-air gestures. While those modalities can be useful in certain contexts, they lack the physical feedback and precision of direct manipulation. So it seems like the ideal interface would offer a mix of input methods that complement each other and cater to different situations.

As someone who spends a lot of time interacting with screens, it’s an exciting prospect to imagine a future where the digital world is combined with a world of haptics to deliver an immersive experience, something akin to Ready Player One.

Week 11 – Final Project Proposal

Concept:

This project involves building a radar system using an Arduino, an ultrasonic sensor, and a joystick or buttons for control. The sensor data will be sent to p5.js, which will visualize the readings on a radar-like display, allowing you to detect objects in the surrounding environment.

Components:

  • Arduino: Controls the ultrasonic sensor and reads input from the joystick/buttons.
  • Ultrasonic Sensor: Emits ultrasonic pulses and measures the time it takes for the echo to return, determining the distance to objects.
  • Joystick/Buttons: Provides input for controlling the servo motor that rotates the ultrasonic sensor.
  • Servo Motor: Rotates the ultrasonic sensor to scan the environment.
  • Computer with p5.js: Receives data from the Arduino and generates the radar visualization.

Implementation:

  1. Hardware Setup:
    • Connect the ultrasonic sensor to the Arduino.
    • Connect the servo motor to the Arduino.
    • Connect the joystick/buttons to the Arduino.
  2. Arduino Code:
    • Initialize the sensor, servo, and joystick/buttons.
    • In the loop function:
      • Read the joystick/button values to determine the desired rotation angle for the servo.
      • Rotate the servo to the specified angle.
      • Trigger the ultrasonic sensor and measure the distance to the nearest object.
      • Send the distance and angle data to the computer via serial communication (a sketch of this loop appears after the list below).
      • Ensure wires connecting the sensor cannot get tangled.
  3. p5.js Sketch:
    • Establish serial communication with the Arduino.
    • Receive distance and angle data from the Arduino.
    • Create a radar-like display:
      • Draw a circular background representing the scanning area.
      • Convert the distance and angle data to Cartesian coordinates (x, y) on the display.
      • Draw points or shapes at the calculated coordinates to represent the detected objects.
      • Implement features like:
        • Different colors or sizes for objects based on distance.
        • Trail effect to visualize the movement of objects.
        • Numerical distance display.
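
A minimal sketch of the Arduino side of step 2 might look like the following. The pin assignments and the "angle,distance" serial format are assumptions; on the p5.js side, each pair would be converted to screen coordinates with x = d·cos(angle), y = d·sin(angle), as in step 3.

#include <Servo.h>

const int TRIG_PIN = 6; // assumed wiring
const int ECHO_PIN = 7;
const int SERVO_PIN = 9;
const int JOYSTICK_X_PIN = A0;

Servo radarServo;

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  radarServo.attach(SERVO_PIN);
}

void loop() {
  // Joystick position sets the servo angle (0-180 degrees).
  int angle = map(analogRead(JOYSTICK_X_PIN), 0, 1023, 0, 180);
  radarServo.write(angle);
  delay(30); // let the servo settle

  // Ultrasonic ping: echo time in microseconds to centimeters.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);
  float distance = duration / 58.0;

  // One "angle,distance" line per reading for the p5.js sketch to parse.
  Serial.print(angle);
  Serial.print(",");
  Serial.println(distance);
}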

Possible Additional Features:

  • Multiple Sensors: Use multiple ultrasonic sensors for wider coverage.
  • Sound Effects: Play beeps or tones that vary in pitch or frequency based on the distance to objects.
  • Object Tracking: Implement an algorithm to track the movement of objects over time.

Challenges and Considerations:

  • Sensor Accuracy and Noise: Ultrasonic sensors can be affected by environmental factors and may require calibration.
  • Visualization Design: Create a clear and intuitive radar display that effectively represents the sensor data.

Week 11 – Theremin-like instrument

This project provided valuable insights into the potential of technology in musical expression and exploration. Despite its seemingly simple design, utilizing two push buttons for sound generation and an ultrasound sensor for frequency modulation, the project unveiled a range of creative possibilities and highlighted areas for further development.

The incorporation of the ultrasound sensor was particularly intriguing. By translating physical distance into audible frequencies, the sensor effectively transformed space into a controllable musical parameter. This interaction proved to be a captivating demonstration of how technology can facilitate new forms of musical interaction and expression. The concept invites further exploration, prompting questions about the potential for incorporating additional sensors to create a multi-dimensional instrument responsive to a wider range of environmental stimuli.

// Check if distance is within range for playing sound
if (distance < 200 && distance > 2) {
  // Map distance to frequency
  int frequency = map(distance, 2, 200, 200, 2000);
  tone(buzzerPin, frequency); // Play sound at calculated frequency
} else {
  noTone(buzzerPin); // Stop sound if distance is out of range
}

While the project successfully demonstrated the feasibility of generating sound through Arduino, the limitations of pre-programmed sounds became evident. The lack of nuance and complexity inherent in such sounds presents a challenge for creating truly expressive and dynamic musical experiences. This observation underscores the need for further exploration into sound generation techniques, potentially involving machine learning or other advanced algorithms, to expand the sonic palette and introduce more organic and evolving soundscapes.

Week 11 Reading | A Century’s Solution on the Future of Interaction Design

Victor ‘ranted’ in his post to remind us how our current definition of interactivity is limited to glassy touch screens that do not use our sense of touch to its full potential. He argues that texture, the ‘feel’ of things, is missing from these slidy screens.

Although he does not include solutions and was writing the article purely to raise awareness, I believe he asked important questions. One idea that I am fond of is to think of these designs as tools that react to our hands.

Tony Stark in his lab. Disney Marvel.

Hiro presenting nanobots. Disney.

Imagine this: a cross between the microbots from Disney’s Big Hero 6 and the holograms used by Tony Stark in his research lab. The hologram could sketch schematics in thin air, which would then magically materialize out of the microbots. It would become an interactive object that can be touched, felt, scrubbed, and molded, limited only by the user’s imagination.

Any sufficiently advanced technology is indistinguishable from magic. – Arthur C. Clarke

Although this kind of technology may still be centuries away, with untold rounds of research, prototyping, and testing before it can be commercially mass-produced, perhaps we can take a point or two from Victor’s argument to steer current developments toward an alternative shape and set of features.