To behave in a complex system without accounting for chaos is to exhibit a stubborn and persistent stupidity.
Driving a car means using a complex system. There are elements dependent on certain operations, in a certain order, with room for variables. Although it becomes automatic after a while, as beginning drivers we are most aware of the complexity. Most elements (visual acuity, steering, pedal control, audio input, managing turns, noting signage) must be executed properly for the system to function at all, much less at optimal levels. When elements shift without warning (unpredictability), there can be dangerous results. A fan belt breaks, a child falls while crossing the street, a car at the intersection runs the stop sign.
The drivers in my part of the world operate as if they control the complex system, as if there are no chaotic elements. They don’t stop at stop signs if they don’t see anyone around, they cut in front and behind people in crosswalks, they weave in and out of traffic on the freeway. These drivers know they are in a complex system, and they think they control it, as if they were playing a traffic video game and will simply lose points if their car explodes. The high frequency of accidents here is the result of drivers assuming a lack of chaos, behaving as if all conditions (the placement and speed of other vehicles, for example) will remain the same.
Chaos also accounts for a great deal of irony. The fire starts in the closet where you stored the emergency supplies. Your alarm is set to wake you an hour early, but the battery dies. I don’t trim the tree outside my window to keep the shade, and a big wind sends branches flying off. Good intentions, and their accompanying predictions, are thwarted for reasons which are explainable but not controllable.
It was quite clear in this week’s material, particularly the excellent slides accompanying Seth Bullock’s video, that some chaos is simply the result of emergent behavior arising from “the uncoordinated actions of lower-level entities”. These actions, like those of people participating in a collective, are unintentional. I can crush the butterfly in my hands because I don’t want its wings to move, but it won’t be because I’m trying to prevent a storm in Indonesia. I realize we participate without knowing it, and our intentions don’t always play out. I can pan a book at Amazon hoping to reduce sales, but inadvertently increase sales as people want to read something so awful.
Intentionality, however, is at least an effort to prevent stupidity by acknowledging the existence of chaos. If I purposefully drive slowly in a school zone, and allow children to reach the other side of the street, the chance of danger caused by chaos (or by me) may be slightly reduced. There will always be variables beyond my control, so in that sense intentionality provides an affective influence: I fear hitting the child more than I do being late to work, so I will experience more comfort by behaving this way. Same thing with storing water in a couple of different locations, or checking the alarm clock battery. We balance between the knowable and the unknowable, creating a Cynefin-based narrative to prepare for the unexpected (Kurtz and Snowden, p. 480) instead of assuming basic cause-and-effect.
It’s a less stupid way to do things.
Hi Lisa!
Excellent post as usual Lisa. Point taken.
Thanks for pointing out Seth Bullock’s video.
See you around. Maru
Comment by Maru — October 18, 2008 @ 2:37 pm
Hi Lisa,
Along the same lines, this week futurist Jamais Cascio wrote about elements of resiliency in the face of the predictable and unpredictable. While he uses earthquake survival as a specific example, his ideas are generalizable, and overlap with your observations as well as with features of connectivism… and might even apply to learning in a connective world:-)
Comment by C. Tschofen — October 18, 2008 @ 3:11 pm
When it’s applied to new things we might do, Sproull and Kiesler (1991) refer to the known and unknown sequelae in terms of first- and second-level effects.
Changes we make to improve efficiency often have offsetting consequences: more efficiency in one place can cause less in another, and can trigger deviation-amplifying changes in the system, where the consequences are more profound and harder to measure than immediate changes in efficiency.
I am interested in intentionality because we so often miss these second-level effects.
Thanks for your post, Lisa. I will now go chasing up some of the links you mention, as I have fallen behind in my readings and your learning has piqued my interest. ailsa
Comment by ailsa — October 26, 2008 @ 7:41 pm