05. Plans and Situated Actions. Suchman, Lucy

Lucy Suchman’s book Plans and Situated Actions is, in her own words, “A critique of the dominant assumptions regarding human action and communication which underlie recent research in machine intelligence.”

Also referred to as “The problem of human-machine communication”, this book debunked the philosophy prevailing in artificial intelligence at the time it was written, in 1987: the belief that people act by first making a plan and then executing it. Humans don’t make plans the way a computer does, specifying every intricate detail needed to carry out a process (like making a journey to work); instead, much of that detail is tacit, built into us. Suchman pointed out several flawed assumptions behind the planning model. In particular, she notes that we never fully specify a plan, because to do so would involve an excruciating level of detail.

Suchman contrasts this sense of embedded detail with how people were trying to program robots at the time. 

She uses the example of a robot designed to “navigate autonomously through a series of rooms”: the robot would first observe the rooms, plot a course through them, and then follow that course. If obstacles were moved after it had plotted its course, the robot could not take the change into account.

As humans, we take for granted our ability to continually evolve our plans in response to our situation, but computers illustrate how difficult such situation awareness is to describe. 

She points out that plans, rather than being a blueprint of action, make more sense as a resource for action. The idea is that we make plans before entering a situation, and we draw upon those plans while in the situation, but if circumstances change, we obviously do not continue blindly following the plan.
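This contrast between a plan as a blueprint and a plan as a resource can be sketched in a toy grid world (my own illustration, not Suchman’s; the grid, the planner, and all names are invented for the example). The “blueprint” robot plots a course once and follows it even after an obstacle moves into its path; the “resource” robot replans from its current position at every step, so the moved obstacle is simply routed around.

```python
from collections import deque

SIZE = 5  # a tiny 5x5 "series of rooms"

def plan(start, goal, walls):
    """Breadth-first search; returns a shortest path as a list of cells."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        x, y = path[-1]
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < SIZE and 0 <= nxt[1] < SIZE
                    and nxt not in walls and nxt not in seen):
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable

start, goal, walls = (0, 0), (4, 4), set()

# "Blueprint" robot: observe once, plot a course, then follow it blindly.
fixed_plan = plan(start, goal, walls)
walls.add(fixed_plan[2])  # an obstacle moves into the plotted course
collides = any(cell in walls for cell in fixed_plan)  # the stale plan now fails

# "Resource" robot: the plan guides action, but each step replans
# from the current position, taking the changed situation into account.
pos, trace = start, [start]
while pos != goal:
    pos = plan(pos, goal, walls)[1]  # take one step of a fresh plan
    trace.append(pos)
```

In the sketch, `collides` ends up true for the blueprint robot, while the replanning robot’s `trace` reaches the goal without touching the moved obstacle.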

Instructions serve as a resource for describing what was done not only because they guide the course of action, but also because they filter out of the retrospective account of the action, or treat as “noise,” everything that was actually done that the instructions fail to mention. (p. 102)

She also discusses the difficulty of understanding a conversation between two people. What is said aloud is only the smallest part of the conversation: the listener must actively construct meaning from what the speaker is saying, building a mental model from cues in the conversation and repairing it as the conversation unfolds. Computers do not have the same capability for mental model repair.

Great ideas and constructs: something I take for granted and never really considered. I often listen to someone talking and think to myself, “what are they talking about?” Good conversation experiences, I think, are quite rare. We don’t engage at that level; we just want an “alright” response and not much else. It is very hard to find people we can properly speak with. Clever work.