I like how the author points out that no matter how much knowledge machine operators accumulate over the years, and how well-versed they become in their field, they still have to keep the construction and/or programming simple. It all comes down to precision and accuracy, or in other words perfection in the process, which humans are not naturally suited to. Yet it is fascinating that, given this fundamental mismatch between machines and humans, humans have still managed to produce very advanced technology that demands a high level of precision and accuracy, and to minimize errors as far as possible, largely through experimentation and testing. That said, added complexity translates into greater difficulty of use and more frustration: today's high-tech machines and gadgets demand a great deal of knowledge from their users, and not every kind of knowledge is easy to acquire.
Another of the author's points that resonates with me concerns the relationship between affordance and signifier. This relationship is not as clear-cut as one might assume; as the author points out, "some affordances are perceivable, others are not" and "perceived affordances often act as signifiers, but they can be ambiguous" (19). Even though signifiers matter more than affordances, I have seen some terrible signifiers that do not fulfil their duty at all, leaving users to figure out the affordances entirely on their own and, as a result, grow frustrated. The more high-tech and complex a machine is, the more effort its makers should put into curating its signifiers so that users can perceive its affordances more effectively.