Week 10: A Brief Rant on the Future of Interaction Design

As I was reading A Brief Rant on the Future of Interaction Design, what really struck me was how much we’ve just accepted the idea that “the future” means shiny screens everywhere. Bret Victor makes a strong point that even though technology looks cooler and sleeker, the way we interact with it hasn’t fundamentally changed — it’s still just tapping and swiping on glass. It’s kind of depressing when you think about it, because the excitement around “new technology” mostly ignores the fact that humans are physical, three-dimensional beings. We have hands that are capable of so much subtlety, but all we do is poke at flat rectangles. Victor’s frustration feels justified — it’s like we’ve totally surrendered to convenience at the cost of creativity and human potential.

At the same time, I found myself wondering: is it really fair to expect interaction design to be radically different when so much of our world (work, entertainment, communication) has moved into the digital space? Maybe part of the reason we keep using screens is because they’re the simplest way to deal with abstract information. But still, Victor’s examples of more tactile, nuanced designs made me realize we’re probably limiting ourselves by not even trying to imagine alternatives. It’s like we’re stuck optimizing what already exists instead of exploring what could be fundamentally new. After reading this, I feel like a good interaction designer shouldn’t just make apps easier to use, but should rethink what “using” even means.

Week 10: Musical Instrument

Concept:

Video:

“WALL-E’s Violin” is inspired by the character WALL-E from the popular animated movie. The idea behind “WALL-E’s Violin” is to mimic WALL-E expressing emotion through music, specifically through a violin — a symbol of sadness and nostalgia. A cardboard model of WALL-E is built with an ultrasonic distance sensor acting as his eyes. A cardboard cutout of a violin and bow are placed in front of WALL-E to represent him holding and playing the instrument. 

As the user moves the cardboard stick (the “bow”) left and right in front of WALL-E’s “eyes” (the ultrasonic sensor), the sensor detects the distance of the bow. Based on the distance readings, different predefined musical notes are played through a piezo buzzer. These notes are chosen for their slightly melancholic tones, capturing the emotional essence of WALL-E’s character.

To add a layer of animation and personality, a push-button is included. When pressed, it activates two servo motors attached to WALL-E’s sides, making his cardboard arms move as if he is emotionally waving or playing along. The mechanical hum of the servos blends into the soundscape, enhancing the feeling of a robot expressing music not just through tones, but with motion — like a heartfelt performance from a lonely machine.

How it Works:

  • Ultrasonic Sensor: Acts as WALL-E’s eyes. It continuously measures the distance between the robot and the cardboard bow moved in front of it.
  • Note Mapping: The Arduino interprets the measured distances and maps them to specific, predefined musical notes. These notes are chosen to sound melancholic and emotional, fitting WALL-E’s character.
  • Piezo Buzzer: Plays the corresponding musical notes as determined by the distance readings. As the user sways the bow, the pitch changes in real time, simulating a violin being played.
  • Servo Motors + Button: Two servo motors are connected to WALL-E’s arms. When the button is pressed, they animate his arms in a waving or playing motion. The sound of the servos adds a robotic texture to the performance, further humanizing the character and blending physical movement with audio expression.

Code:

#include <Servo.h>
#include <math.h>

// Pin Assignments

const int trigPin   = 12;   // Ultrasonic sensor trigger
const int echoPin   = 11;   // Ultrasonic sensor echo
const int buzzerPin = 8;    // Buzzer output
const int buttonPin = 2;    // Button input for servo tap
const int servoPin  = 9;    // First servo control pin (default starts at 0°)
const int servoPin2 = 7;    // Second servo control pin (default starts at 180°)

Servo myServo;    // First servo instance
Servo myServo2;   // Second servo instance

// Bittersweet Melody Settings

int melody[] = {
  294, 294, 370, 392, 440, 392, 370, 294
};
const int notesCount = sizeof(melody) / sizeof(melody[0]);


// Sensor to Note Mapping Parameters

const int sensorMin = 2;     // minimum distance (cm)
const int sensorMax = 30;    // max distance (cm)
int currentNoteIndex = 0;  

// Vibrato Parameters 

const float vibratoFrequency = 4.0;  
const int vibratoDepth = 2;          

// Timer for vibrato modulation

unsigned long noteStartTime = 0;

void setup() {
  // Initialize sensor pins
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);

  // Initialize output pins
  pinMode(buzzerPin, OUTPUT);
  pinMode(buttonPin, INPUT_PULLUP);

  // Initialize the first servo and set it to 0°
  myServo.attach(servoPin);
  myServo.write(0);

  // Initialize the second servo and set it to 180°
  myServo2.attach(servoPin2);
  myServo2.write(180);

  // Initialize serial communication for debugging
  Serial.begin(9600);

  // Get the initial note based on the sensor reading
  currentNoteIndex = getNoteIndex();
  noteStartTime = millis();
}

void loop() {
  // Get the new note index from the sensor
  int newIndex = getNoteIndex();

  // Only update the note if the sensor reading maps to a different index
  if (newIndex != currentNoteIndex) {
    currentNoteIndex = newIndex;
    noteStartTime = millis();
    Serial.print("New Note Index: ");
    Serial.println(currentNoteIndex);
  }

  // Apply vibrato to the current note
  unsigned long elapsed = millis() - noteStartTime;
  float t = elapsed / 1000.0;  // Time in seconds for the sine function
  int baseFreq = melody[currentNoteIndex];
  int modulatedFreq = baseFreq + int(vibratoDepth * sin(2.0 * PI * vibratoFrequency * t));

  // Output the modulated tone
  tone(buzzerPin, modulatedFreq);

  // Check the button press to trigger both servos (movement in opposite directions)
  if (digitalRead(buttonPin) == LOW) {
    tapServos();
  }

  delay(10);
}

// getNoteIndex() measures the sensor distance and maps it to a note index.
int getNoteIndex() {
  int distance = measureDistance();
  // Map the distance (between sensorMin and sensorMax) to the range of array indices (0 to notesCount-1)
  int index = map(distance, sensorMin, sensorMax, 0, notesCount - 1);
  index = constrain(index, 0, notesCount - 1);
  return index;
}

// measureDistance() triggers the ultrasonic sensor and calculates the distance in cm.
int measureDistance() {
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);
  
  long duration = pulseIn(echoPin, HIGH);
  int distance = duration * 0.034 / 2;
  return constrain(distance, sensorMin, sensorMax);
}

// tapServos() performs a tap action on both servos:
// - The first servo moves from 0° to 90° and returns to 0°.
// - The second servo moves from 180° to 90° and returns to 180°.
void tapServos() {
  myServo.write(90);
  myServo2.write(90);
  delay(200);
  myServo.write(0);
  myServo2.write(180);
  delay(200);
}

 

Circuit:

Future Improvements:

  • Add more notes or an entire scale for more musical complexity.
  • Add a servo motor to move WALL-E’s head for additional animation.
  • Use a better speaker for higher-quality sound output.
  • Use an LCD screen to display messages and add to the character.
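The first idea above needs only a longer melody[] array, since getNoteIndex() already spreads the 2–30 cm sensor range across however many notes the array holds. A minimal sketch of that extension (the added frequencies are illustrative, roughly a D natural minor scale, and not from the original project):

```cpp
// Illustrative extension of melody[] to a full eight-note scale.
// Frequencies approximate D natural minor, D4 through D5; the original
// project used a shorter eight-note bittersweet phrase.
int melody[] = {
  294, 330, 349, 392, 440, 466, 523, 587
};
const int notesCount = sizeof(melody) / sizeof(melody[0]);

// Because map(distance, sensorMin, sensorMax, 0, notesCount - 1) in
// getNoteIndex() derives its range from notesCount, no other code changes.
```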

Reading Response 7 – Future of Interaction Design (Week 10)

Reading 1: A Brief Rant on the Future of Interaction Design
Bret Victor’s “A Brief Rant on the Future of Interaction Design” challenges the prevailing notion that touchscreens represent the pinnacle of user interface design. He critiques the “Pictures Under Glass” paradigm, highlighting how it neglects the tactile and manipulative capabilities of human hands. Victor emphasizes that our hands are not just for pointing and tapping but are essential tools for feeling and manipulating our environment. He argues that current interfaces fail to leverage these capabilities, leading to a diminished user experience. This perspective prompts a reevaluation of how we design technology, suggesting that future interfaces should engage more deeply with our physical senses to create more intuitive and effective tools.

Reading 2: Follow-Up
In the follow-up responses to his rant, Victor addresses common criticisms and clarifies his intentions. He acknowledges that his piece was meant to highlight a problem rather than provide a solution, aiming to inspire further research into more tactile and dynamic interfaces. Victor compares the current state of technology to early black-and-white photography—revolutionary at the time but lacking in certain dimensions. He encourages exploration into areas like deformable materials and haptic holography, emphasizing the need for interfaces that can be seen, felt, and manipulated. I feel like this response reinforces the idea that while current technologies have their merits, there is significant room for innovation that more fully engages our human capabilities.

Week 10 – Musical Instrument

Concept:

We decided to use two digital switches (push buttons) to play two different notes, with the LDR effectively acting as an analogue volume control. The video demonstrates how feedback from the LDR changes the volume: if you watch closely, when the light intensity reaching the LDR decreases, only a very faint sound is produced.

Demo (Circuit):

Demo (Video):

Arduino Code:

// Define pins for the buttons and the speaker
int btnOnePin = 2;
int btnTwoPin = 3;
int speakerPin = 10;

void setup() {
  // Initialize both button pins as inputs with built-in pull-up resistors
  pinMode(btnOnePin, INPUT_PULLUP);
  pinMode(btnTwoPin, INPUT_PULLUP);
  
  // Configure the speaker pin as an output
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  // Check if the first button is pressed
  if (digitalRead(btnOnePin) == LOW) {
    tone(speakerPin, 262);  // Play a tone at 262 Hz
  }
  // Check if the second button is pressed
  else if (digitalRead(btnTwoPin) == LOW) {
    tone(speakerPin, 530);  // Play a tone at 530 Hz
  }
  // No button is pressed
  else {
    noTone(speakerPin);     // Turn off the speaker
  }
}

Challenges:

The initial concept started out with a light-dependent resistor (LDR) and the piezo speaker/buzzer. We faced issues because the readings from the LDR did not behave as expected, there was a problem with the sound produced, and the resulting change in the music was not pronounced enough.
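One technique that often helps with jumpy LDR readings is a simple running average. This is only a sketch of the idea, not the code we used; on the Arduino each new sample would come from analogRead(A0), shown here as a plain function so the logic stands on its own:

```cpp
// Running-average filter over the last WINDOW readings. Feeding it raw
// analogRead() values smooths out flicker before mapping to volume or pitch.
const int WINDOW = 5;
int samples[WINDOW] = {0};
int sampleIndex = 0;

int smoothedReading(int newReading) {
  samples[sampleIndex] = newReading;          // overwrite the oldest sample
  sampleIndex = (sampleIndex + 1) % WINDOW;   // advance the ring-buffer index
  long sum = 0;
  for (int i = 0; i < WINDOW; i++) sum += samples[i];
  return (int)(sum / WINDOW);                 // average over the window
}
```

A single noisy spike is then diluted across the whole window instead of jerking the output around.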

We also faced challenges with the programming, as the noise production was inconsistent. We fixed this by adjusting the mapping of the notes to produce more distinct frequencies for each push button (red and yellow), at 262 Hz and 530 Hz respectively.

Done by: Zayed Alsuwaidi (za2256) and Zein Mukhanov (zm2199)

Week 10 – Musical Instrument – Mary had a Little Piano

Groupmates : Liya and Shamma

CONCEPT and working :

For our group assignment, we built a simple digital musical instrument using an Arduino board. Our project was inspired by the children’s song Mary Had a Little Lamb, and we recreated the melody using four push buttons for four of its notes: C, D, E, and G. Each button acts as a digital sensor, triggering a specific musical note when pressed. The tones are played through a mini speaker connected to the Arduino, allowing the tune to be heard clearly. This created a very basic piano-style interface, with each button mapped to one of the notes in the song.

In addition to the buttons, we used a potentiometer as our analog sensor. This allowed us to control the frequency of the sound in real time: as the knob is turned, the pitch of the notes changes, giving the player the ability to customize how the melody sounds. We mapped the potentiometer to a 500–1000 Hz offset that is added to each note’s base frequency. It made the experience more interactive and demonstrated how analog inputs can add expressive control to digital systems. We also added labels to the buttons to make the music easier to play.

videos of testing :

(Sorry in advance, I’m not a singer, I just needed to show the song)

https://youtube.com/shorts/6OhDN7k7KAc?si=WBNaPPqKeTkQCIui

https://youtube.com/shorts/vH0wLT3W5Jk?si=3K7I34cpMZWwC9Ly

The code :

const int buttonPins[4] = {3, 5, 8, 9}; // buttons 

//frequency for each button 
int frequencies[4] = {262, 293, 330, 392}; //C, D , E , G notes
int potValue = 0; //to store potentiometer value

void setup() {
  //initialising button pins
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }
  
  //for debugging
  Serial.begin(9600);
  
  //speaker pin for output
  pinMode(12, OUTPUT);
}

void loop() {
  //read the potentiometer value
  potValue = analogRead(A0);
  
  //map the potentiometer value to a frequency range 500-1000
  int adjustedFrequency = map(potValue, 0, 1023, 500, 1000);
  
  //for button and the corresponding note
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == LOW) { // Button is pressed (LOW because of INPUT_PULLUP)
      tone(12, frequencies[i] + adjustedFrequency);
      Serial.print("Button ");
      Serial.print(i+1);
      Serial.print(" pressed. Frequency: ");
      Serial.println(frequencies[i] + adjustedFrequency); //serial monitor 
      delay(200);
    }
  }
  
  //stop the tone when no buttons are being pressed
  if (digitalRead(buttonPins[0]) == HIGH && digitalRead(buttonPins[1]) == HIGH &&
      digitalRead(buttonPins[2]) == HIGH && digitalRead(buttonPins[3]) == HIGH) {
    noTone(12);
  }
}

PROBLEMS / IMPROVEMENTS :

As for problems or challenges, we didn’t really have any specific problems with the circuit other than a few loose wires, which were fixed after debugging and checking again. Something we understood from working together is that having two different perspectives helps a lot in solving problems and finding ideas.

We see a lot of potential for expanding this project in the future. One idea is to add a distance sensor to control volume based on hand proximity, making it even more dynamic. Another would be adding LEDs that light up with each button press to provide a visual cue for the notes being played. We’re also considering increasing the number of buttons to allow more complex songs, and possibly adding a recording function so users can capture and replay their melodies.

It was a fun and educational project that helped us better understand the relationship between hardware inputs and interactive sound output. It was exciting to bring a classic tune to life through code, sensors, and a mini speaker!

Week 10-Sound, Servo motor, Mapping

For our group assignment, we built a simple digital musical instrument using an Arduino board. Our project was inspired by the children’s song Mary Had a Little Lamb, and we recreated the melody using four push buttons for four of its notes: C, D, E, and G. Each button acts as a digital sensor, triggering a specific musical note when pressed. The tones are played through a mini speaker connected to the Arduino, allowing the tune to be heard clearly. This created a very basic piano-style interface, with each button mapped to one of the notes in the song.

In addition to the buttons, we used a potentiometer as our analog sensor. This allowed us to control the frequency of the sound in real time: as the knob is turned, the pitch of the notes changes, giving the player the ability to customize how the melody sounds. We mapped the potentiometer to a 500–1000 Hz offset that is added to each note’s base frequency. It made the experience more interactive and demonstrated how analog inputs can add expressive control to digital systems. We also added labels to the buttons to make the music easier to play.

As for problems or challenges, we didn’t really have any specific problems with the circuit other than a few loose wires, which were fixed after debugging and checking again. Something we understood from working together is that having two different perspectives helps a lot in solving problems and finding ideas.

We see a lot of potential for expanding this project in the future. One idea is to add a distance sensor to control volume based on hand proximity, making it even more dynamic. Another would be adding LEDs that light up with each button press to provide a visual cue for the notes being played. We’re also considering increasing the number of buttons to allow more complex songs, and possibly adding a recording function so users can capture and replay their melodies.

It was a fun and educational project that helped us better understand the relationship between hardware inputs and interactive sound output. It was exciting to bring a classic tune to life through code, sensors, and a mini speaker!

The code:

const int buttonPins[4] = {3, 5, 8, 9}; // buttons

//frequency for each button
int frequencies[4] = {262, 293, 330, 392}; //C, D , E , G notes
int potValue = 0; //to store potentiometer value

void setup() {
  //initialising button pins
  for (int i = 0; i < 4; i++) {
    pinMode(buttonPins[i], INPUT_PULLUP);
  }

  //for debugging
  Serial.begin(9600);

  //speaker pin for output
  pinMode(12, OUTPUT);
}

void loop() {
  //read the potentiometer value
  potValue = analogRead(A0);

  //map the potentiometer value to a frequency range 500-1000
  int adjustedFrequency = map(potValue, 0, 1023, 500, 1000);

  //for each button and the corresponding note
  for (int i = 0; i < 4; i++) {
    if (digitalRead(buttonPins[i]) == LOW) { // Button is pressed (LOW because of INPUT_PULLUP)
      tone(12, frequencies[i] + adjustedFrequency);
      Serial.print("Button ");
      Serial.print(i + 1);
      Serial.print(" pressed. Frequency: ");
      Serial.println(frequencies[i] + adjustedFrequency); //serial monitor
      delay(200);
    }
  }

  //stop the tone when no buttons are being pressed
  if (digitalRead(buttonPins[0]) == HIGH && digitalRead(buttonPins[1]) == HIGH &&
      digitalRead(buttonPins[2]) == HIGH && digitalRead(buttonPins[3]) == HIGH) {
    noTone(12);
  }
}

schematic diagram:

the video:

https://drive.google.com/drive/u/0/folders/1Kk2lkQgoAyybXSYWVmY2Dog9uQVX_DMq

 

the video of the potentiometer usage:

https://drive.google.com/drive/u/0/folders/1Kk2lkQgoAyybXSYWVmY2Dog9uQVX_DMq

 

Week 10 – Reading Response

While reading A Brief Rant on the Future of Interaction Design, I didn’t really expect it to be this focused on hands. But the author makes a pretty solid point: for all our futuristic tech, we’re still weirdly okay with staring and tapping at flat glass rectangles all day. It’s kind of wild how little we think about how limited that is compared to what our hands are actually capable of.

What really stuck with me was the idea that “Pictures Under Glass” is basically a transition phase. We’re so used to touchscreens that we’ve stopped questioning whether they’re actually good. The essay really challenges that, and it’s kinda refreshing. It reminded me of how good tools should feel natural, like when you’re making a sandwich and your hands just know what to do without you even thinking about it. You don’t get that with a tablet or phone. You’re just poking at digital buttons.

I also liked how the author ties this all back to choice. Like, we’re not just being dragged into a touchscreen future, we’re choosing it, whether we realize it or not. That part made me think about design more like storytelling or world-building. We’re shaping how people live and interact, and that’s a big deal.

Overall, it was a fun read with a strong message to stop settling for tech that dulls our senses. Our bodies are way more capable than what current interfaces allow, and maybe the real futuristic thing would be tech that actually lets us do more and not less.

—————————————————————-

Reading the follow-up article helped me better understand where the author was coming from. It’s not just a complaint about modern tech, it’s a call to action. He’s not saying the iPad or touchscreen interfaces are bad. In fact, he even says they are revolutionary. The point is that we shouldn’t stop there. Just because something works now doesn’t mean it can’t be better.

What really stood out to me was the idea that technology doesn’t just evolve on its own. People make choices on what to build, what to fund, what to imagine. If we don’t invest in more expressive, physical ways to interact with computers, we’re basically choosing a future where everything stays flat and glassy.

I also liked how the author handled the voice interface discussion. Voice has its place, commands, quick info, etc. but when it comes to creating or understanding complex things, it falls short. You can’t sculpt a statue or sketch a design with just your voice. Some things require our hands, our bodies, our sense of space.

That quote from the neuroscientist about touch and brain development was super eye-opening. It made me think about how much we take our sense of touch for granted, especially when using digital devices. In the end, the response made the original rant feel less like a critique and more like an invitation to dream a little bigger about what our future tech could be.

Week 10 : Musical Instrument

For this week’s assignment, we were tasked with using Arduino to create a musical instrument. Working in pairs, Areeba and I decided to create a piano based on the tone() function we had explored earlier in class. In our project, we wanted each button switch to correspond to a different note from a piano, so that it could be “played”.

We were asked to use both an analogue and a digital component for the assignment; while the digital component was simple enough with the buttons, we decided to use a potentiometer as our analogue component, using it to control the pitch of the notes produced by each button.

The components we used were:

  • Arduino Uno
  • Breadboards
  • Jumper cables
  • 10k Ohm resistors
  • Push buttons
  • 5V speaker

Here is an image of our project, and the schematic:

Our code:

#include "pitches.h"

const int potPin = A0;          // Potentiometer on A0
const int buzzerPin = 8;        // Speaker on D8
const int buttonPins[] = {2, 3, 4, 5, 6, 7, 9, 10}; // C4-C5 buttons
const int baseNotes[] = {NOTE_C4, NOTE_D4, NOTE_E4, NOTE_F4, NOTE_G4, NOTE_A4, NOTE_B4, NOTE_C5};

void setup() {
  for (int i = 0; i < 8; i++) {
    pinMode(buttonPins[i], INPUT); 
  }
  pinMode(buzzerPin, OUTPUT);
}

void loop() {
  static int potValues[5] = {0};
  static int potIndex = 0;
  potValues[potIndex] = analogRead(potPin);
  potIndex = (potIndex + 1) % 5;
  int octaveShift = map(
    (potValues[0] + potValues[1] + potValues[2] + potValues[3] + potValues[4]) / 5,
    0, 1023, -2, 2
  );

  // Check buttons 
  bool notePlaying = false;
  for (int i = 0; i < 8; i++) {
    if (digitalRead(buttonPins[i]) == HIGH) { // HIGH when pressed (5V)
      int shiftedNote = baseNotes[i] * pow(2, octaveShift);
      tone(buzzerPin, constrain(shiftedNote, 31, 4000)); 
      notePlaying = true;
      break;
    }
  }

  if (!notePlaying) {
    noTone(buzzerPin);
  }
  delay(10);
}

Videos of our project:

Playing Happy Birthday

Adjusting the Potentiometer

A challenge we faced was definitely our lack of musical knowledge. Neither Kashish nor I are musicians, and as such, we had to do a lot of research to understand the terminology and theory, such as the notes, and how to adjust the octave using the potentiometer. We then also had to figure out how to reflect these findings within our code.
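One thing our research clarified: shifting a note by one octave simply doubles (or halves) its frequency, which is why the loop multiplies each base note by pow(2, octaveShift). A quick illustration of that relationship (frequencies rounded to integers, as in our note table):

```cpp
#include <cmath>

// Shifting by n octaves multiplies the frequency by 2^n
// (n may be negative to shift down).
int shiftOctave(int baseFreq, int octaveShift) {
  return (int)(baseFreq * pow(2.0, octaveShift));
}
```

So NOTE_C4 at 262 Hz becomes 524 Hz one octave up (close to C5’s 523 Hz; the small discrepancy comes from the integer rounding in the note table) and 131 Hz one octave down.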

Overall though, we had a great time making this project, and then executing our idea.

Week 10: Instrument

Concept

Our musical instrument is based on a combination of a mechanical and a digital noise machine, controlled with a surprisingly satisfying potentiometer that lets us modify the beats per minute (BPM), and a button that changes the octave of our instrument. We had fun with the relationship between the buzzer and the mechanical servo that produces the other noise: the two are inversely proportional! In other words, as the BPM of the buzzer increases, the servo slows down, and vice versa.

Figuring out the right code structure was a bit tricky at first, as we initially made the mistake of nesting both the servo trigger and the buzzer tone in the same beat conditional. This meant we fundamentally could not separate the motor from the buzzer tones. To resolve this, we ended up using two triggers, one for the buzzer and one for the motor, which run independently.

The code below shows how we resolved this problem. Once we implemented the fix, we were able to celebrate with our somewhat perplexing instrument, which asks the user to complement the buzzer’s various beeping with the servo’s mechanical movement.

// --- 3. Perform Actions based on Timers ---

// --- Action Group 1: Metronome Click (Buzzer) ---
if (currentMillis - previousBeatMillis >= beatInterval) {
  previousBeatMillis = currentMillis; // Store the time of this beat

  // Play the click sound
  tone(BUZZER_PIN, currentFreq, CLICK_DUR);

  // Print beat info
  Serial.print("Beat! BPM: ");
  Serial.print(currentBPM);
  Serial.print(" | Freq: ");
  Serial.print(currentFreq);
  Serial.print(" | Beat Interval: ");
  Serial.print(beatInterval);
  Serial.println("ms");
}

// --- Action Group 2: Servo Movement Trigger ---
if (currentMillis - previousServoMillis >= servoInterval) {
  previousServoMillis = currentMillis; // Store the time of this servo trigger

  // Determine Target Angle
  int targetAngle;
  if (servoMovingToEnd) {
    targetAngle = SERVO_POS_END;
  } else {
    targetAngle = SERVO_POS_START;
  }

  // Tell the servo to move (NON-BLOCKING)
  myServo.write(targetAngle);

  // Print servo info
  Serial.print("Servo Trigger! Target: ");
  Serial.print(targetAngle);
  Serial.print("deg | Servo Interval: ");
  Serial.print(servoInterval);
  Serial.println("ms");

  // Update state for the next servo trigger
  servoMovingToEnd = !servoMovingToEnd; // Flip the direction for next time
}
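The snippet above shows only the two timer blocks; the inverse relationship itself comes from how beatInterval and servoInterval are derived from the potentiometer. The sketch below is a reconstruction of that derivation, with assumed ranges (60–240 BPM, 200–2000 ms) rather than our exact values: the beat interval shrinks as BPM rises, while the servo interval grows.

```cpp
// Beat interval: one metronome click every 60000/BPM milliseconds,
// so higher BPM -> shorter interval -> faster clicks.
long beatIntervalMs(int bpm) {
  return 60000L / bpm;
}

// Servo interval: mapped the *other* way, so higher BPM -> longer
// interval -> slower servo. The 60-240 BPM and 200-2000 ms ranges
// here are illustrative, not the project's exact numbers.
long servoIntervalMs(int bpm) {
  return 200L + (long)(bpm - 60) * (2000L - 200L) / (240 - 60);
}
```

On the Arduino, bpm itself would come from something like map(analogRead(potPin), 0, 1023, 60, 240).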

 

Reading reflection-week 10

Reading A Brief Rant on the Future of Interaction Design honestly made me pause and look around at all the tech I use every day. The author’s frustration really hit home, especially the part about how everything has become a flat screen with no real feedback. It reminded me of how awkward it feels to use a touchscreen on a stove or in a car just to do something simple, like turn down the heat or change the music. Those actions used to be so quick and intuitive. I think the author has a point: we’ve lost touch with how things feel. It’s like we’re designing for appearances now, not for actual human comfort. I didn’t feel like he was biased in a bad way; he just seemed passionate, like someone who really cares about how design affects everyday life.

What stuck with me most was how much we’ve adapted to poor interaction design without even noticing. I catch myself getting used to frustrating interfaces all the time, and this reading made me realize we shouldn’t have to settle for that. It made me wonder: what would technology look like if we designed it based on how our bodies naturally move and react? That question stayed with me after I finished reading. It also raised another thought—how do we balance innovation with simplicity? Just because something’s high-tech doesn’t mean it’s better. I think future designers, myself included, need to remember that. This reading didn’t completely change my beliefs, but it definitely sharpened them.