Bret Victor’s “A Brief Rant on the Future of Interaction Design” critiques the dominance of touchscreen technology. His critique resonates deeply with those of us concerned about the narrow trajectory of technological innovation, where tactile and kinesthetic interaction is marginalized in favor of visually dominated interfaces. Victor’s call for broader sensory involvement in technological interfaces is both a plea for innovation and an argument rooted in natural human interaction with the world. Evidence from various fields supports his viewpoint, including educational psychology, which suggests that multi-sensory learning environments enhance understanding and retention (Wolfe and Nevills). This aligns with Victor’s advocacy for interfaces that engage more of our bodily senses, not less.
Victor’s reflections and the subsequent responses to his original piece stimulate a broader conversation about the potential biases in technology design. His critique may seem biased to those who champion digital minimalism and current devices’ sleek, streamlined aesthetics. However, it raises an essential question about whom technology is truly serving. Has reading his arguments changed my beliefs? Absolutely. It’s led me to reconsider the role of physicality in digital interaction and consider the untapped possibilities of interfaces that could mimic more complex human behaviors and interactions. This reflection opens up questions about the potential for future technologies: How far can we push the boundaries of interaction design to make digital experiences more immersive and intuitive without sacrificing functionality? How can designers balance the need for advanced functionality with intuitive physical interactions?
Citation:
Wolfe, P., & Nevills, P. (2004). Building the reading brain, PreK-3. Corwin Press.
The concept of this project was to create a mini drum pad, or the closest equivalent we could build with the hardware available to us. The device uses buttons to trigger different buzzer sounds, mimicking the functionality of a traditional drum pad. Each button corresponds to a different sound, and the frequency of these sounds is adjustable via a potentiometer, allowing the user to modify the pitch of the tones.
Code:
// Define pin assignments for buttons and buzzers
const int buttonPin1 = 2;
const int buttonPin2 = 3;
const int buttonPin3 = 4;
// Coded with the Aid of ChatGPT
const int buttonPin4 = 5; // Playback button
// Coded with the Aid of ChatGPT
const int buzzerPin1 = 8;
const int buzzerPin2 = 9;
const int buzzerPin3 = 10;
const int potPin = A0; // Potentiometer connected to A0 for frequency control
// Variables to manage button states and debounce timing
int buttonState1 = 0;
int lastButtonState1 = 0;
int buttonState2 = 0;
int lastButtonState2 = 0;
int buttonState3 = 0;
int lastButtonState3 = 0;
int buttonState4 = 0;
int lastButtonState4 = 0;
unsigned long lastDebounceTime1 = 0;
unsigned long lastDebounceTime2 = 0;
unsigned long lastDebounceTime3 = 0;
unsigned long lastDebounceTime4 = 0;
unsigned long debounceDelay = 50; // Debounce delay in milliseconds
// Struct to hold buzzer activation data including the pin and frequency
struct BuzzerAction {
  int buzzerPin;
  int frequency;
};
// Coded with the Aid of ChatGPT
BuzzerAction record[100]; // Array to store each buzzer activation
int recordIndex = 0; // Index for recording array
//Coded with the Aid of ChatGPT
void setup() {
  // Initialize all button and buzzer pins
  pinMode(buttonPin1, INPUT);
  pinMode(buttonPin2, INPUT);
  pinMode(buttonPin3, INPUT);
  // Coded with the Aid of ChatGPT
  pinMode(buttonPin4, INPUT);
  // Coded with the Aid of ChatGPT
  pinMode(buzzerPin1, OUTPUT);
  pinMode(buzzerPin2, OUTPUT);
  pinMode(buzzerPin3, OUTPUT);
  pinMode(potPin, INPUT); // Set up the potentiometer pin as an input
}
void loop() {
  // Read the current state of the buttons
  int reading1 = digitalRead(buttonPin1);
  int reading2 = digitalRead(buttonPin2);
  int reading3 = digitalRead(buttonPin3);
  // Coded with the Aid of ChatGPT
  int reading4 = digitalRead(buttonPin4);
  // Coded with the Aid of ChatGPT
  int potValue = analogRead(potPin); // Read the potentiometer value
  int frequency = map(potValue, 0, 1023, 200, 2000); // Map the potentiometer value to the frequency range

  // Handle button 1 press and recording
  debounceAndRecord(reading1, &lastButtonState1, &buttonState1, &lastDebounceTime1, buzzerPin1, frequency);
  // Handle button 2 press and recording
  debounceAndRecord(reading2, &lastButtonState2, &buttonState2, &lastDebounceTime2, buzzerPin2, frequency);
  // Handle button 3 press and recording
  debounceAndRecord(reading3, &lastButtonState3, &buttonState3, &lastDebounceTime3, buzzerPin3, frequency);

  // Handle button 4 for playback
  if (reading4 != lastButtonState4) {
    lastDebounceTime4 = millis();
  }
  if ((millis() - lastDebounceTime4) > debounceDelay) {
    if (reading4 != buttonState4) {
      buttonState4 = reading4;
      if (buttonState4 == HIGH) {
        for (int i = 0; i < recordIndex; i++) {
          // Play each recorded buzzer action at the frequency recorded
          tone(record[i].buzzerPin, record[i].frequency, 200);
          delay(250); // Short delay between buzzer actions for clarity
        }
        recordIndex = 0; // Reset the record index after playback
      }
    }
  }

  // Update last button states for the next loop iteration
  lastButtonState1 = reading1;
  lastButtonState2 = reading2;
  lastButtonState3 = reading3;
  lastButtonState4 = reading4;
}
// Coded with the Aid of ChatGPT
// Function to handle button debouncing and recording buzzer actions
void debounceAndRecord(int reading, int *lastButtonState, int *buttonState, unsigned long *lastDebounceTime, int buzzerPin, int frequency) {
  if (reading != *lastButtonState) {
    *lastDebounceTime = millis(); // Reset the debounce timer
  }
  if ((millis() - *lastDebounceTime) > debounceDelay) {
    if (reading != *buttonState) {
      *buttonState = reading; // Update the button state
      if (*buttonState == HIGH && recordIndex < sizeof(record) / sizeof(record[0])) {
        record[recordIndex++] = {buzzerPin, frequency}; // Record the buzzer activation
        tone(buzzerPin, frequency, 200); // Play the buzzer at the recorded frequency
      }
    }
  }
  *lastButtonState = reading; // Update the last button state for debouncing
  // Coded with the Aid of ChatGPT
}
Hardware Configuration: The system is designed with four button inputs and three buzzer outputs. Additionally, a potentiometer is used to control the frequency of the buzzer sounds.
Button Functionality: Buttons 1 to 3 are connected to buzzers and are responsible for triggering sounds with variable frequencies determined by the potentiometer. Button 4 is designated for playback. It plays back a sequence of sounds that have been recorded based on earlier interactions with buttons 1 to 3.
Frequency Control: The frequency of the sounds is dynamically adjusted using a potentiometer. The analog value from the potentiometer is mapped to a specified frequency range (200 Hz to 2000 Hz), which determines how the buzzers sound.
Debouncing: To ensure reliable button press detection without noise interference, the code implements debouncing logic. This involves measuring the time since the last button state change and updating the state only if this interval exceeds a predefined threshold (50 milliseconds).
Recording and Playback (Aided by ChatGPT)
Recording: When a button (1 to 3) is pressed, the action (which buzzer is activated and at what frequency) is recorded in an array. This includes storing both the pin of the buzzer and the frequency at which it was activated.
Playback: When button 4 is pressed, the system iterates over the recorded actions and plays them sequentially. Each action triggers the corresponding buzzer to sound at the recorded frequency for a short duration.
Loop and Functions:
The main loop continuously checks the state of each button and the potentiometer, updating the frequency accordingly. A helper function, debounceAndRecord, is used to handle the logic for both debouncing and recording the buzzer actions associated with each button press.
Video of Project:
Reflection and ideas for future work or improvements:
Integrating a small display screen would significantly improve the project’s functionality. The screen would provide real-time visual feedback on button presses and frequency outputs, allow users to scroll through and select different sounds or presets, and serve as a simple interface for programming the device directly.
The potential for further development and refinement holds exciting prospects. The integration of a display screen and the addition of more customizable buttons are immediate steps that will enhance the device’s usability and versatility. Further innovations could include wireless connectivity for easy integration with other music production software or the addition of sensors to enable gesture-based controls, which would offer an even more dynamic and immersive user experience.
Several key insights stand out after reflecting on what this project taught us. First, the practical challenges of hardware interfacing showed us the importance of robust design and careful planning. We also saw the need for effective wire management and a physical housing to improve the device’s durability and aesthetics.
Looking Ahead:
Overall, this project resulted in a functional and entertaining product and served as a significant learning experience, underscoring the importance of patience, precision, and creativity in making it happen. These lessons will guide further improvements and innovations in our future projects.
As I was playing around with playing tones using the Arduino last week, I couldn’t help but think that the combination of potentiometers and buttons left over from my last project, alongside the buzzers, resembled a DJ set. This is what Rama and I ended up going with for this week’s assignment. The set is built using two potentiometers, a digital switch, a piezo buzzer, a servo motor, and two LEDs (for visual feedback). When connected to power, the set produces a melody that the player (DJ) can manipulate using the first potentiometer (which controls the pitch) and the second potentiometer (which controls the tempo). The red LED blinks in sync with the tempo set by the second potentiometer. The yellow LED’s brightness is proportional to the pitch (the higher the pitch, the higher the brightness). The servo motor resembles the turntable on an actual DJ set. You can activate it by pressing the digital switch. Upon activation, the servo motor rotates at a speed matching the beat’s tempo.
Implementation
Effectively, the DJ set plays a sequence of notes in succession, stored in a global 2D array, whose pitch class and tempo are controlled by the potentiometers. By switching between tempos and pitch classes, the player generates different tunes and overall musical moods. The player can also switch on the servo motor, whose position incrementally increases at the rate set by the chosen tempo. We utilize two Arduinos: one controls the sound manipulation between different tempos and pitches using the potentiometers (as well as the LEDs), and the other controls the servo motor via a switch. The sender program on the first Arduino sends the position of the servo motor, at time intervals corresponding to the current tempo, over the Inter-Integrated Circuit (I2C) protocol to the second Arduino. The receiver program on the second Arduino receives the position and updates the location of the servo, conditional on the button state being on. Since the sender program also needed to synchronize sound with the blinking of the LEDs, it was essential to extrapolate the concept of blinking without delay to the playing of tones by the buzzer.
Code Snippets
The sender Arduino program sets up a connection using the Wire library in the setup().
void setup() {
  pinMode(led0, OUTPUT);
  pinMode(led1, OUTPUT);
  Wire.begin(); // set up I2C communication
  Serial.begin(9600);
}
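For completeness, the snippets in this post reference several globals that aren’t shown. Here is a minimal sketch of what those declarations might look like; the exact pins, note frequencies, and durations below are assumptions, not our actual values:

#include <Wire.h>

const int potPin1 = A0; // pitch potentiometer (assumed pin)
const int potPin2 = A1; // tempo potentiometer (assumed pin)
const int led0 = 5;     // yellow LED (assumed pin)
const int led1 = 6;     // red LED (assumed pin)

// four pitch classes x five-note sequences (placeholder frequencies)
int notes[4][5] = {
  {262, 294, 330, 349, 392},
  {330, 349, 392, 440, 494},
  {392, 440, 494, 523, 587},
  {523, 587, 659, 698, 784}
};
int durations[] = {400, 300, 200, 100}; // ms per note, indexed by tempo

unsigned long prevMillis = 0; // time the last note was played
int led1State = LOW;          // blink state of the red LED
int note = 0;                 // index into the current note sequence
int position = 0;             // servo position sent to the receiver
int servoDir = 1;             // servo sweep direction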
The sender Arduino program reads the two potentiometer values, using the first to control the pitch (by mapping the value to the range 0-3, corresponding to the sequences in the global notes array) and the second to control the tempo, i.e., the rate at which notes are played. The first value indexes the 2D array, notes[][], which stores the sequences of notes in different pitches; it also sets the brightness of the yellow LED. The second value indexes the durations[] array, which specifies the rate at which notes are played. The notes in notes[potVal1] are played in succession, with the next note in the sequence playing once the specified duration has passed. The state of the red LED is updated to blink at the rate of the current melody. Finally, the position of the servo motor is updated and sent to the second Arduino.
void loop() {
  // get potentiometer values
  int potVal1 = map(analogRead(potPin1), 0, 1023, 0, 3); // control pitch
  int potVal2 = map(analogRead(potPin2), 0, 1023, 0, 3); // control tempo

  // map the first potentiometer's value to the range [30,255] to control the
  // brightness of the yellow LED - higher pitch -> higher brightness
  int led0Val = map(constrain(analogRead(potPin1), 30, 1023), 0, 1023, 30, 255);
  analogWrite(led0, led0Val); // write the mapped potentiometer value to the LED pin

  // get the duration (based on tempo) from the second potentiometer
  int duration = durations[potVal2];

  // check if enough time has elapsed since the last tone
  if (millis() - prevMillis >= duration * 1.60) {
    prevMillis = millis();  // update previous time
    led1State = !led1State; // update blinking state for the red LED
    digitalWrite(led1, led1State);
    // play the tone selected by the first potentiometer, for the duration
    // set by the second potentiometer
    tone(4, notes[potVal1][note], duration);
    note++; // cycle through the sequence
    if (note >= 5) {
      note = 0; // reset note index
    }
    // direction goes forward if position is at 0
    if (position <= 0) {
      servoDir = 1;
    }
    // direction goes backward if position is at 160
    else if (position >= 160) {
      servoDir = -1;
    }
    position = (position + servoDir * 20) % 180;
    Wire.beginTransmission(8); // begin transmission to receiver
    Wire.write(position);      // send over the position of the servo motor
    Wire.endTransmission();    // stop transmitting
    Wire.requestFrom(8, 6);    // request 6 bytes from receiver device #8
    delay(1);
  }
}
The receiver Arduino program receives the position from the sender and updates the position if the switch state is set to 1 (the switch has been pressed to turn on the motor). Since the transmission occurs at a rate equivalent to the rate of the music, the motor will move according to the current tempo of the beat.
#include <Servo.h>
#include <Wire.h>
Servo servo; // initialize servo
int x = 0; // variable storing data transmission
int position = 0; // position of the servo motor
const int switchPin = 7; // switch pin
int buttonState = 0; // current button state
int prevButtonState = 0; // previous button state
bool servoMove = false; // keeps track of the switch state
void setup() {
  Serial.begin(9600);
  pinMode(switchPin, INPUT);
  servo.attach(9);
  Wire.begin(8);
  Wire.onReceive(receiveEvent); // register handler triggered on receipt of data
  Wire.onRequest(requestEvent); // register handler for data requests
}

void receiveEvent(int bytes) {
  x = Wire.read();
  // validate received data
  if (x >= 0 && x <= 180) {
    position = x; // x directly maps to servo position
  }
}

void requestEvent() {
  Wire.write("Hello ");
}
void loop() {
  buttonState = digitalRead(switchPin);
  // toggle the switch state and use it to determine if the motor should move
  if (buttonState == HIGH && prevButtonState == LOW) {
    servoMove = !servoMove;
  }
  // smoothly move the servo towards the desired position
  if (servoMove) {
    if (position != servo.read()) {
      if (position > servo.read()) {
        servo.write(servo.read() + 1);
      } else {
        servo.write(servo.read() - 1);
      }
      delay(1);
    }
  }
  prevButtonState = buttonState;
}
Circuit Schematic
Here’s a schematic of our circuitry:
Demo
Reflections and Challenges
One of the things we struggled with, largely due to both of us lacking a background in musical composition, was choosing a sequence of tones that generates a coherent melody. With a combination of trial and error and some research, we found a suitable sequence. We also faced the challenge of one of our LEDs not turning on when we wired the servo motor into the same circuit. Instead of adding a second power source for the servo motor, we opted to utilize I2C since we had an additional Arduino, which proved to be a useful exercise. Overall, we were happy with the final product, but I think it would be nice to extend this project further and give the player a little more creative control, as the current setup is quite restrictive in terms of the final melodies the player can generate. For instance, we could add more buzzers, each producing a different tone and controlled by switches, so users could make their own tunes from scratch over a set beat (something like a MIDI controller?).
The Digital Piano with Distance-Sensing Percussions is an innovative musical instrument that blends traditional piano elements with modern sensor technology to create a unique and interactive musical experience. This project utilizes an array of digital push buttons connected to an Arduino board to simulate a piano keyboard, where each button triggers a distinct musical note. In addition to the conventional keyboard setup, the instrument incorporates an ultrasonic distance sensor, which introduces a dynamic layer of percussion sounds. These sounds vary depending on the distance of the player’s hand. Furthermore, a potentiometer is integrated to alter the pitch of the notes dynamically, offering musicians the ability to manipulate the sound palette expressively.
Images
Components Used
Arduino Uno
Breadboard (x2)
Jumper Wires
Piezo Buzzer (x2)
Push Buttons (x8)
Potentiometer
10k ohm resistors (x8)
Ultrasonic Sensor
Circuit Setup
Power Connections
Arduino 5V to Breadboard positive rail
Arduino GND to Breadboard negative rail
Piezo Buzzers
Piezo Buzzer 1:
Positive connection to Arduino digital pin 12
Negative connection to Breadboard negative rail
Piezo Buzzer 2:
Positive connection to Arduino digital pin 13
Negative connection to Breadboard negative rail
Push Buttons
One side of each button connected to the Breadboard positive rail
The other side of each button is connected through a 10k ohm resistor to the Breadboard negative rail and also connected to Arduino digital pins 2 through 9.
Potentiometer
One outer pin is connected to the Breadboard positive rail.
Another outer pin is connected to the Breadboard negative rail.
Middle pin connected to Arduino analog pin A0.
Ultrasonic Sensor
VCC pin is connected to the Breadboard positive rail.
GND pin is connected to the Breadboard negative rail.
TRIG pin is connected to Arduino digital pin 10.
ECHO pin is connected to Arduino digital pin 11.
Video
Code
int buzzerPin = 12;
int buzzer2 = 13;
int potPin = A0;
int keys[] = {2, 3, 4, 5, 6, 7, 8, 9};
// Frequencies for notes (C4 to C5)
int notes[] = {262, 294, 330, 349, 392, 440, 494, 523};
int trigPin = 10;
int echoPin = 11;
int bassDrum = 200; // percussion tone frequencies in Hz
int snare = 250;
int hiHat = 300;
void setup() {
  pinMode(buzzerPin, OUTPUT);
  pinMode(buzzer2, OUTPUT);
  for (int i = 0; i < 8; i++) {
    pinMode(keys[i], INPUT); // set up the eight key buttons
  }
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);
}
void loop() {
  int potValue = analogRead(potPin);
  int volume = map(potValue, 0, 1023, 0, 255); // map the potentiometer value to a volume range (currently unused)

  // Measure distance using the ultrasonic sensor
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2; // convert echo time to distance in cm
  Serial.print(distance);
  Serial.println(" cm");

  bool isAnyButtonPressed = false;
  for (int i = 0; i < 8; i++) {
    int modifiedNote = map(potValue, 0, 1023, notes[i] / 2, notes[i] * 2);
    if (digitalRead(keys[i]) == HIGH) {
      tone(buzzerPin, modifiedNote, 100);
      isAnyButtonPressed = true;
      break; // stop the loop once a pressed button is found
    }
  }
  if (!isAnyButtonPressed) {
    noTone(buzzerPin);
  }

  if (distance < 10) {
    tone(buzzer2, bassDrum, 100);
  } else if (distance >= 10 && distance < 20) {
    tone(buzzer2, snare, 100);
  } else if (distance >= 20 && distance < 30) {
    tone(buzzer2, hiHat, 100);
  } else {
    noTone(buzzer2);
  }
  delay(100);
}
In the loop, the program first reads the potentiometer value and uses it to modify the frequency of the piano notes. Depending on the button pressed, it plays the corresponding note at a modified frequency; for example, with the potentiometer near its midpoint (a reading of about 512), a C4 key (262 Hz) is mapped into the 131-524 Hz range and plays at roughly 327 Hz. If no buttons are pressed, it stops any ongoing tone. Depending on the distance detected, it chooses a percussion sound to play, simulating a drum kit with different sounds for different ranges.
This week’s reading advocates for a more holistic approach to interaction design, one that fully engages human capabilities. The author criticizes current interfaces (i.e., touchscreens) for ignoring the full range of human hand functions, particularly our ability to feel textures and manipulate objects. Victor’s criticism of using just one finger made me wonder: if we can accomplish so much with just one finger, how many more opportunities might arise if we used our entire body? However, it also makes me think there may be some bias at play; because the author formerly worked in interaction design, I wonder if his strong opinions are a result of that experience.
Although I agree with the author’s view that we can accomplish more with our hands, what about people who aren’t able to use them? “Pictures Under Glass” is far more accessible than many earlier interaction designs, since it relies on touch and drag, perhaps one of the simplest human movements.
Furthermore, although more natural behaviors like grabbing and throwing are being incorporated into virtual reality systems, these systems are still mostly dependent on visual inputs and are unable to provide true haptic feedback due to limitations in existing technology. This makes me wonder: in what innovative ways might technology simulate these experiences without reverting to traditional physical forms?
For this week’s assignment, we made a simple musical instrument using the Arduino. It is more similar to DJ equipment than to a traditional musical instrument: slowing the music down or speeding it up, switching between different speeds quickly to add dynamics, and, of course, flashy lights, which will make or break the party. As part of the assignment, we used a button switch as the digital input and a potentiometer as the analog input. The Arduino plays a melody through a buzzer, and 4 LEDs light up in correspondence with the melody. The button lets you start and stop the melody, while the potentiometer controls the playback speed of the melody.
The map() function is a hidden gem in this code. It’s a versatile function that remaps a range of values from one scale to another. In our case, the potentiometer reading (analog input) ranges from 0 to 1023. We use map() to translate these values to a range of speeds between 1.3 and 6.0. This allows the potentiometer to control the tempo of the melody playback.
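Since map() works on integers, getting a fractional speed factor takes one extra step. Here’s a minimal sketch of the idea (the pin name and exact range are assumptions):

const int POT_PIN = A0; // assumed analog pin for the potentiometer

// map() returns a long, so map the 0-1023 reading to 130-600 and divide
// by 100.0 to obtain a fractional speed factor between 1.3 and 6.0
float readSpeedFactor() {
  return map(analogRead(POT_PIN), 0, 1023, 130, 600) / 100.0;
}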
The loop() function continuously checks the button state. If the button is pressed (digitalRead(DIGITAL_PIN) == LOW), the isPlaying flag is set to false, effectively stopping the melody. Additionally, the LED lighting is turned off (digitalWrite(RED_PIN, LOW), etc.). When the button is not pressed (digitalRead(DIGITAL_PIN) == HIGH), the isPlaying flag is set to true, and the playMelodyWithLEDs() function runs.
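A sketch of what that loop logic might look like; the pin constants and the playMelodyWithLEDs() helper (shown after the next paragraph) are assumptions based on the description, not our exact code:

const int DIGITAL_PIN = 2; // assumed button pin
const int RED_PIN = 8;     // assumed pin for the red LED (others omitted)
bool isPlaying = false;

void loop() {
  if (digitalRead(DIGITAL_PIN) == LOW) { // button pressed: stop playback
    isPlaying = false;
    digitalWrite(RED_PIN, LOW); // turn the LED lighting off
  } else {                               // button released: play
    isPlaying = true;
    playMelodyWithLEDs();
  }
}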
As we were integrating the LEDs with the melody, we struggled a bit. Initially, the LED light-up function was separate, with its own delay() for each LED to match the melody. It did not work as expected. We realized that if, after playing a note, we moved to a separate LED function and delayed there as well, the melody became very slow because of the two separate delays. So we removed the LED function and used the same delay() for both the notes and the LEDs.
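The resulting structure, with one shared delay() driving both the notes and the LEDs, might look roughly like this (the melody data and pin mapping are again assumptions):

int melody[] = {262, 294, 330, 349}; // placeholder note frequencies
int ledPins[] = {8, 9, 10, 11};      // assumed LED pins (red is 8)
const int BUZZER_PIN = 12;           // assumed buzzer pin

void playMelodyWithLEDs() {
  float speed = readSpeedFactor(); // tempo factor from the potentiometer
  for (int i = 0; i < 4; i++) {
    tone(BUZZER_PIN, melody[i]);
    digitalWrite(ledPins[i], HIGH); // light the LED matching this note
    delay(500 / speed);             // one shared delay for note and LED
    noTone(BUZZER_PIN);
    digitalWrite(ledPins[i], LOW);
  }
}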
In the text “A Brief Rant on the Future of Interaction Design,” the author argues for a thoughtful approach to tool design, emphasizing that a tool’s primary function should be to serve human needs and enhance our innate abilities. The essence of his argument is that tools should not be valued merely for their aesthetics, but for their ability to improve our efficiency and effectiveness. He expresses a certain dissatisfaction with current technology, particularly critiquing devices he describes as “Pictures Under Glass.” He suggests that such technologies diminish the sensory richness and interactive potential of our hands, more specifically our fingertips. He is worried that focusing too much on making interfaces look good might make us forget how important it is to actually touch and interact with the things around us. The author urges us to think about all the ways we can interact with tools, suggesting we build interfaces that involve more of our senses and movements. He believes that just using one finger on a touchscreen doesn’t make full use of our ability to move and feel.
In conclusion, the author calls for a future where technology development is inspired by the full range of human abilities. Instead of adapting to the constraints of existing technology, he challenges designers to envision tools that integrate seamlessly with human capacity for manipulation and sensory experience, fostering a more intuitive and enriching interaction with our tools.
As for his follow-up article, the author replies to some critiques of his previous piece on the future of interaction design, in which he ranted about current tech not being up to par. He clarifies that he wasn’t out to offer a solution but to spark deeper research into better solutions that really fit human capabilities. He’s not against current tech like iPhones or iPads, but he worries that if we don’t push for more, we’ll get stuck with something only a bit better than what we have. He’s not keen on just adding physical keyboards or styluses to our devices, because that doesn’t really tap into the full experience of interacting with the world around us. And when it comes to voice commands, he thinks they’re fine for simple questions or tasks, but for more complex things like exploring ideas or creating art, they fall short. The author isn’t thrilled with tech that’s all about swiping on screens either. It might be easy, but it’s not making the most of what our bodies can do, and it could even hold back kids’ development because it doesn’t challenge them enough. He thinks we’re meant to interact with our world in more dynamic ways, and our future tech should reflect that.
Victor urges designers to defy convention and rethink how we engage with technology through his criticism.
Victor’s initial outburst functions as a wake-up call, stressing how crucial quick feedback loops and user-friendly interfaces are to creating meaningful user experiences. He highlights how technology can empower people and foster creativity, and he pushes for a move towards direct manipulation and user-centric design.
There is a wide range of viewpoints from the design world in the replies to Victor’s questions. While some share his views and support his vision for the direction of interface design, others present opposing arguments and voice legitimate doubts about the viability and practicality of his suggestions.
In my opinion, Victor makes strong arguments. I really like how he emphasizes user empowerment and the transformative power of interactive technologies. His appeal for creativity is in line with my own conviction that technology has the ability to improve human experiences and encourage meaningful interaction.
I do, however, recognize the difficulties of turning creative concepts into workable solutions. Victor makes some insightful and thought-provoking suggestions, but putting them into practice in real-world settings necessitates giving considerable thought to user requirements, technology limitations, and wider societal ramifications.
Music possesses a remarkable capacity to bridge cultural and linguistic divides and unite individuals from diverse backgrounds. Motivated by the notion of promoting interactive experiences and democratizing music creation, we set out to build a one-of-a-kind musical instrument out of buttons and Arduino. Our intention was to enable people to use sound as a creative medium for self-expression, irrespective of their experience with music. We chose the piano as our main inspiration because we could recreate the chords using buttons.
💡 Process:
Using Arduino Uno we wired up buttons to serve as interactive triggers for different musical notes. Each button was assigned a specific pitch or sound, allowing users to create melodies by pressing combinations of buttons. We leveraged our programming skills to code the logic behind the instrument, ensuring seamless functionality and an intuitive user interface.
🚀 Difficulties:
Although our journey was exciting and innovative, there were certain difficulties. A major challenge we encountered was guaranteeing the reliability and consistency of button pushes. We had to use careful calibration and filtering techniques to overcome problems like electrical noise and switch bounce.
There were additional difficulties in creating an intuitive user interface. To ensure that users could easily grasp how to interact with the instrument while still having the freedom to explore and experiment with various musical compositions, we had to find a balance between simplicity and utility.
Our Arduino illustration:
The code:
const int speakerPin = 9; // Speaker connected to pin 9
int buttonPins[] = {2, 3, 4, 5}; // Button pins for C, D, E, F
int notes[] = {262, 294, 330, 349}; // Frequencies for C4, D4, E4, F4

void setup() {
  // Set up each button pin as an input with pull-up resistors
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
  // Set the speaker pin as an output
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  // Check each button and play the corresponding note
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == LOW) { // Check if button is pressed
      tone(speakerPin, notes[i]); // Play the corresponding note
      delay(200); // A short delay to help debounce the button
      while (digitalRead(buttonPins[i]) == LOW); // Wait for the button to be released
      noTone(speakerPin); // Stop playing the note
    }
  }
}