A few days ago, I received an email advertisement for a software platform that offers predictive capabilities for urban planning, environmental analysis, and related fields. The email took me back to work I did shortly after graduate school, running scenarios for a regional visioning plan. Using the latest fancy software, we developed a handful of alternative regional visions, each incorporating a different set of assumptions about land use patterns and transportation investments.
By twisting a digital knob, we could expand development endlessly into the hinterland and watch the region’s traffic soar and its stormwater overwhelm existing capacity. Likewise, we could dial the knob the other way, emphasizing infill and protection of rural areas, and see a smaller increase in traffic and a reduced need for costly infrastructure. Our client used this information to steer the plan toward its desired outcomes.

What struck me most during those days was that our predictions were based on demographic data projected for the year 2040, a full 30 years into the future. I remember feeling deeply unsettled about that. How can anyone have a clue what to expect in 30 years? While I believe that kind of information can be useful in the right context, I have grown more and more convinced that these exercises often do more harm than good.
The Power of Prediction
We humans are highly susceptible to subconscious biases that emerge from what Daniel Kahneman calls System 1 thinking. This is the part of our mind that is governed by intuition and emotion. Jonathan Haidt describes this as the elephant upon which our logical, rational self rides.
One of the insights from Kahneman’s work is a cognitive fallacy he calls “the illusion of validity”: so-called experts in many fields often perform worse at prediction than random chance. One explanation is that experts are less likely to question the System 1 impulses that invariably shape the “rational” assumptions undergirding any predictive exercise. An expert, having a broad base of knowledge in the field, becomes overly confident in their understanding of the factors at play and fails to appreciate the complexity of the systems at hand.
Of course, there are branches of knowledge where we can predict future events with great accuracy. My family and I were thrilled by the recent solar eclipse, an event scientists can predict accurately hundreds or even thousands of years into the future. The key question is whether the subject acts more like a cat or a washing machine, as Nassim Taleb describes it. That is, are we predicting the mechanical behavior of a closed system with simple interactions between parts (the washing machine), or the behavior of an organism with highly complex feedback loops and nonlinear responses to external forces (the cat)?
Cities are cats, not washing machines. A mechanistic approach to city problems thus exposes us to two dangers. The first is that we employ the wrong solutions, or, perhaps more accurately, solutions to the wrong question: what Kahneman describes as “replacing a difficult question with an easier one.” The second is that our intervention creates new interactions in the system that may leave things generally worse off.
A Better Way
In light of these dangers, how are we to respond in the face of complexity? There are two solid strategies we can rely on. The first is elaborated quite nicely by Jarrett Walker in this video. I highly suggest you watch the whole presentation, but to summarize: our predictions become more useful when they are focused on universal constants. To borrow his example, we can accurately model the expanded reach of a redesigned transit network by counting the additional jobs and other important destinations that can be reached under the proposed changes. These are mathematical facts based on the frequency of service, the speed of service, and the geographic location of stops. What we can’t know is how that additional freedom translates into changes in ridership, which is subject to complex social and economic factors.
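To make that distinction concrete, here is a minimal sketch of the kind of calculation Walker is describing: given travel times between stops, count the jobs reachable from an origin within a fixed time budget, before and after a network redesign. Everything here is invented for illustration (the stop names, travel times, and job counts are hypothetical), but the computation itself is exactly the sort of mathematical fact that doesn’t depend on predicting human behavior.

```python
from heapq import heappush, heappop

def reachable_jobs(graph, jobs, origin, budget_min):
    """Sum the jobs at every stop reachable from `origin` within
    `budget_min` minutes, using Dijkstra's algorithm over travel times."""
    best = {origin: 0.0}          # shortest known time to each stop
    heap = [(0.0, origin)]
    while heap:
        t, stop = heappop(heap)
        if t > best.get(stop, float("inf")):
            continue              # stale entry; a faster route was found
        for nxt, minutes in graph.get(stop, []):
            nt = t + minutes
            if nt <= budget_min and nt < best.get(nxt, float("inf")):
                best[nxt] = nt
                heappush(heap, (nt, nxt))
    return sum(jobs.get(stop, 0) for stop in best)

# Hypothetical network: stop -> [(neighbor, wait + in-vehicle minutes)]
old_net = {"A": [("B", 20)], "B": [("C", 30)]}
new_net = {"A": [("B", 10), ("C", 30)], "B": [("C", 15)]}
jobs = {"A": 1000, "B": 4000, "C": 9000}

print(reachable_jobs(old_net, jobs, "A", 45))  # 5000: C is out of reach
print(reachable_jobs(new_net, jobs, "A", 45))  # 14000: all stops reachable
```

The change in reachable jobs (5,000 to 14,000 in this toy case) is a geometric consequence of the schedule and the map. How many of those newly reachable jobs translate into actual riders is the part no model of this kind can tell you.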
Our other solution to the problem of prediction lies in probing uncertainty with small bets, as has been written about many times on this site. In complex systems like cities, we can’t really know the full effect of our actions in advance. By making incremental changes and carefully observing the results, we gain insights about the complex interactions that drive the places we live in. Incremental changes minimize our downside risk when something goes wrong. If something works, keep doing it and reap the rewards; if it doesn’t, cut your losses and move on to the next experiment.
Strong Citizens know that large top-down initiatives based on rosy predictions for the future make us more fragile. They are careful to avoid replacing difficult and useful questions with easy but unhelpful ones. By emphasizing small bets and incremental changes, Strong Citizens learn how to build Strong Towns.