Both readings (effectively a two-part piece) offered a fascinating look at future interaction systems, particularly how the tactile capabilities of our fingertips let us sense and manipulate the objects and interfaces around us.
Bret Victor’s examples of the range of motion and sensitivity of the human hands and fingers reminded me of a key fact from developmental psychology: touch is the first sense to develop, beginning with the receptors of the fingers, lips, and tongue. This is why infants touch everything and put everything in their mouths; it is essentially their way of seeing.
In fact, the human fingertip can resolve details at a resolution of about 0.4 mm; that is, it can distinguish two points separated by roughly half the width of a freshly sharpened pencil tip.
Given this level of capability in the human hand, it is easy to agree with Victor’s declaration that current interaction systems are severely limited compared to what is possible.
Beyond that, many touchscreen technologies are unfortunately inaccessible to people with disabilities. Blind users, for example, require some kind of haptic or audio feedback from a touchscreen, and this is rarely bundled with the hardware. In many cases the default software provides no adequate option, and special software must be downloaded… by first navigating the device unaided. Older people and people with motor impairments also often struggle with the finer inputs that touchscreens demand, again because of the lack of haptic feedback.
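To make that gap concrete, here is a minimal TypeScript sketch of the kind of feedback that arguably should ship by default: it pairs touch targets with a vibration pulse and a spoken label using two standard browser APIs (the Vibration API and the Web Speech API). The `data-label` attribute and the `giveFeedback` helper are hypothetical names for illustration, and support for both APIs varies by device, so each call is feature-detected first.

```typescript
// Sketch: attach haptic and audio feedback to touch targets using
// the browser's Vibration API and Web Speech API.
function giveFeedback(label: string): void {
  // Short vibration pulse (50 ms) where the hardware supports it.
  if ("vibrate" in navigator) {
    navigator.vibrate(50);
  }
  // Spoken confirmation of the control that was touched.
  if ("speechSynthesis" in window) {
    window.speechSynthesis.speak(new SpeechSynthesisUtterance(label));
  }
}

// Wire the feedback to every element marked with a (hypothetical)
// data-label attribute naming the control.
document.querySelectorAll<HTMLElement>("[data-label]").forEach((el) => {
  el.addEventListener("touchstart", () => {
    giveFeedback(el.dataset.label ?? "button");
  });
});
```

Even a simple layer like this, enabled system-wide rather than left to individual apps, would go some way toward the feedback that blind and motor-impaired users currently have to seek out on their own.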
Interaction systems of the future must be designed for all. But first, we must break away from the status quo and boldly go where no man has gone before.