What stood out to me was the idea that tools should be designed around what humans are actually capable of doing, not just what technology can do. When he said, “if a tool isn’t designed to be used by a person, it can’t be a very good tool,” it made me think about how often we accept interfaces that don’t fully match how we naturally interact with the world.
For example, with touchscreens, most interactions are reduced to tapping or sliding, which feels limited compared to how expressive our hands actually are.
This connects to what we’ve been doing in class with circuits and Arduino, where interaction feels more physical and responsive. When building circuits, we’re not just coding something to appear on a screen; we’re creating systems where human actions, like touching foil or changes in light, directly affect outputs like LEDs. In my project, I used inputs and conditions to map real-world interactions to responses, which felt more aligned with how we naturally engage with objects.

This reflects the idea of amplifying human capabilities, because the system responds to touch and environmental changes rather than limiting interaction to a flat surface. It also made me realize that even simple projects like ours explore more meaningful interaction than typical touchscreen interfaces, since they involve feedback, physical input, and a closer connection between the user and the technology.