Bret Victor argues that hands do two things, feel and manipulate, and that most screen-first products ignore both. At the counter I judge texture, resistance, and weight; I adjust heat by feel; I correct errors through immediate tactile feedback. On the screen I scroll and tap with one finger, converting rich physical cues into a flat sequence of steps; accuracy falls, and attention shifts from the food to the interface.
Fitness tracking shows a similar pattern. A watch counts reps and time, yet it cannot teach grip pressure, bar path, stance, or breath. Effective coaching speaks through the body; the right cue is a change in force or timing, not another chart. A better tool would offer variable resistance and haptic prompts: small vibrations for tempo, pressure feedback for grip, and state you can feel without looking.
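To make the tempo cue concrete, here is a minimal sketch assuming a browser context with the Web Vibration API (`navigator.vibrate`). The pulse length, tempo values, and function names are illustrative assumptions, not any real product's API.

```typescript
// Haptic tempo cue: one short pulse at each phase boundary of a rep.
// Pulse length and tempo values below are illustrative assumptions.
const PULSE_MS = 100; // length of each vibration cue

// Build a vibrate() pattern for one rep: a pulse at each phase start,
// then silence for the remainder of that phase.
function tempoPattern(phaseSeconds: number[]): number[] {
  const pattern: number[] = [];
  for (const phase of phaseSeconds) {
    pattern.push(PULSE_MS);                               // vibrate: phase starts now
    pattern.push(Math.max(0, phase * 1000 - PULSE_MS));   // silence until the next phase
  }
  return pattern;
}

// Cue `reps` repetitions of a tempo, e.g. 3 s down, 1 s pause, 2 s up.
function cueReps(phaseSeconds: number[], reps: number): void {
  const repMs = phaseSeconds.reduce((a, b) => a + b, 0) * 1000;
  const pattern = tempoPattern(phaseSeconds);
  for (let i = 0; i < reps; i++) {
    setTimeout(() => navigator.vibrate(pattern), i * repMs);
  }
}

cueReps([3, 1, 2], 8); // eight reps at a 3-1-2 tempo, felt rather than watched
```

The cue is a change in timing delivered to the body, exactly the kind of feedback a chart cannot give: the lifter never looks at the screen mid-set.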
Even productivity tools illustrate what is lost in the translation. Physical sticky notes on a whiteboard build spatial memory: clusters are recalled by location and reach, and the body encodes the arrangement. Dragging cards on a screen removes that proprioception; scanning columns replaces simple recall by place. Tangible controllers and deformable surfaces could restore some of that embodied structure, carrying information in texture and force, not only in pixels.
To improve this, I propose we treat touch as information, not just input. Design for affordances that speak through force, texture, and spatial arrangement. If a tool mediates physical tasks or spatial understanding, add haptic and tangible feedback before adding another visual layer.
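As one small illustration of haptic-before-visual design, the sketch below (again assuming the Web Vibration API) gives each state of a kitchen timer a distinct rhythm, so its status can be checked by feel alone. The states and patterns are invented for this example.

```typescript
// Haptic-first status: each state gets a distinct rhythm you can
// recognize without looking. States and patterns are illustrative.
type KitchenState = "simmering" | "boiling" | "done";

const STATE_PATTERNS: Record<KitchenState, number[]> = {
  simmering: [60, 1000, 60],            // two soft taps: all is well
  boiling:   [150, 150, 150, 150, 150], // rapid triple buzz: attend soon
  done:      [500],                     // one long buzz: act now
};

// Announce a state change through the body before (or instead of) the screen.
function feelState(state: KitchenState): void {
  navigator.vibrate(STATE_PATTERNS[state]);
}

feelState("boiling"); // recognizable by rhythm alone, eyes stay on the food
```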