

Relinquishing Control: Reactions to Emergence.


March 31st, 2011, essays

Paper written for the Fragile Conference. This is a rough draft, but comments are appreciated. PDF version.

You cannot reduce the city to merely the sum of its parts – its nature escapes the deterministic world we, until recently, thought we lived in, and manifests itself in the realm of the unpredictable, where, as the saying goes, the flap of a butterfly’s wing here can cause a thunderstorm half a globe away. This new way of seeing the world – in which the interactions of complex systems yield results of such extreme richness and fragility that even the most minute decimal of a number can propagate changes of vast and striking nature – is not as chaotic and random as it seems at first glance. Though essentially unpredictable, this world is anything but devoid of pattern. Until a few years ago, we perceived nature in a deterministic way – that is, we thought that we were smart enough to fully comprehend and elaborate its rules in formulas that would ultimately hold the whole truth and be capable of describing it in all its richness of detail and dynamism.
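
This sensitivity to initial conditions is easy to see at work in a minimal sketch. The logistic map, a textbook chaotic system, stands in here for any non-linear process (it is only an illustration, not a model of anything urban): two trajectories whose starting values differ only in the tenth decimal place end up completely unrelated after a few dozen iterations.

```python
# Minimal sketch: sensitivity to initial conditions in the logistic map,
# x_{n+1} = r * x_n * (1 - x_n), used here as a stand-in for any chaotic system.
def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-10)   # a "minute decimal" of difference

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}:  {a[n]:.6f}  vs  {b[n]:.6f}")
```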

This approach proved to be a folly – formulas got so complicated that you needed several pages to write them down[1]. The quest for a Theory of Everything is essentially flawed. Systemic theory seemed to be heading towards a dead end, as Gödel elegantly proved in 1931. In short, Gödel’s first incompleteness theorem states that for any sufficiently powerful, consistent formal system there will always be true statements which are unprovable within that system. This led us to look for other mechanisms to describe the complex behavior of systems more efficiently. Chaos theory was born, and alongside it evolved different concepts that would help describe all the newfound detail and richness we discovered once humanity dropped its deterministic glasses. By far the most comprehensive and (from a mathematical point of view) elusive of these is the concept of emergence. In a systems-theoretical context, this term describes the unexpected[2] appearance of novel features of a system, features which either enrich the respective system or transform it into a new one. It is also necessary to introduce the notion of self-organization, which has become a crucial tool for comprehending today’s complex world. In short, self-organization is the process by which structures or patterns appear without a central authority or external element imposing them through planning or specific design. Moreover, it has been argued that emergence is the rule rather than the exception (Stewart & Cohen, 1995).

All these changes coming from the world of the sciences have prompted parallel developments in the philosophical world. Space is no longer perceived as flat and linear but as a striated medium, folding upon itself and demarcated by gradients (or intensities) of values, with no “deterministic” boundaries. Manuel DeLanda argues for a perception of reality as a flow of matter-energy animated from within by self-organization processes (DeLanda, 2000). Cities thus become mineral exoskeletons which crystallize out of this flow, while themselves performing operations of stratification, destratification and restratification on the flows that traverse them. Henri Lefebvre describes this organic aesthetic of the flows of life in a city, its continuous restlessness and unpredictability, as rhythms with a “maritime” quality: ”There on the square, there is something maritime about the rhythms. Currents traverse the masses. Streams break off, which bring or take away new participants. Some of them go towards the jaws of the monster, which gobbles them down in order quite quickly to throw them back up. The tide invades the immense square, then withdraws: flux and reflux.” (Lefebvre, 2004). In the study of emergence, emphasis is put on the two ways it operates or manifests itself: how complex systems emerge from simple rules[3] or, conversely, how complex systems produce simple patterns. Through the study of such phenomena and their workings we can see that large-scale structures emerge from the actions of individual decision makers (DeLanda, 2000).
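
How much complexity a handful of simple rules can generate is easy to demonstrate with an elementary cellular automaton – chosen here purely as an illustrative sketch, not as a model of any urban process. Each cell consults only itself and its two neighbors, yet the pattern that unfolds is far richer than the three-cell rule that produces it.

```python
# Minimal sketch: an elementary cellular automaton (Rule 110). Each cell updates
# by looking only at itself and its two immediate neighbors, yet the resulting
# global pattern is far richer than the rule that produces it.
RULE = 110
WIDTH, STEPS = 64, 32

def step(cells):
    """Apply the rule once to a row of cells (wrapping around at the edges)."""
    new = []
    for i in range(len(cells)):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % len(cells)]
        index = (left << 2) | (center << 1) | right   # 3-bit neighborhood code
        new.append((RULE >> index) & 1)
    return new

row = [0] * WIDTH
row[WIDTH // 2] = 1            # a single live cell as the seed
for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```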

We can now state that cities – indeed, we can extrapolate to the built environment as a whole without going amiss – can be seen as an emergent system, the result of a crystallization process provoked and sustained by humanity (which, in turn, evolved from similar processes). It gains a legitimacy of its own, not to the point of being a conscious entity, but as a phenomenon which presents all the qualities of a natural organism. Its independent animation and continuous reproduction[4] are qualities which compel us to accept its independence from planning and design endeavors, and our consequently limited operational power over it. This statement challenges a tradition deeply rooted in the psyche of humankind, which has preferred to identify itself and, even more so, its acts of building as acts opposing nature – countering natural chaos with the order of creation[5]. In my opinion, one critical aspect of the radical shift now taking place in architectural theory and practice alike – one reaction to emergence – is accepting the legitimacy of the city as a living and evolving process which does not oppose nature, but is inherently part of it.

These theoretical advances have been directly supported and translated into urbanism through the use of new digital tools which have emerged and are now becoming accessible to the practice. We can now simulate highly complex phenomena to remarkable degrees of accuracy, generating self-organizing patterns that serve as a basis for understanding the formational processes of urban conglomerations. Parametric design, coupled with algorithmic design, provides the main tools with which a new urbanism is now created. Why this drive in this direction? One way of looking at it is through the relationship between science and architecture/urbanism. Antoine Picon argues that both share the ambition to interpret and transform the world (Picon, 2003). Going further, he states that both science and architecture/urbanism are the primary educators of our vision of the world – that is, through their transformation and interpretation of the world they change the observer as well, an observer who in turn goes on to create new architecture and urbanism. We can therefore state that, since the way we perceive the world has changed through scientific advancement, architecture and urbanism are now struggling to keep up with those changes and to re-create a miniature version of the cosmos as we understand it[6].

Through the integration of emergent behavior and other self-organizing processes into the planning process, we are at the point of eliminating the gap between vernacular architecture – the natural growth of unplanned settlements – and planned, top-down design. This hitherto clear distinction between two types of design, one arising from the sum of individual actions of each decision maker involved in the process and the other from a single central author, has been at the root of our interpretation of the built environment. The previous statement rests on the supposition that our handling of emergence is reaching such precision that what we design with the help of computers and the algorithms they run becomes indistinguishable from the reality it is meant to describe, or simulate. In this sense, we might be tempted to announce boldly that we have created an abstract machine – that we have finally deduced the diagram describing the structure-generating processes of cities or, indeed, of the built environment. Reality becomes a diagram, and the diagram becomes reality.

Science has been advancing along its reductionist path and has unraveled more and more of the rules that make our world tick. Increasingly complex phenomena are now well approximated by new algorithms. This use of scientific images in urbanism and architecture is not new, as we have shown above. Yet this meeting of science and urbanism is a treacherous one – Picon notes that it is productive only when there are similarities between the realities on which the two operate (Picon, 2003). This leads us to what might prove to be a rather polemical point of this paper: the sudden onset and assimilation of numerous nature-derived algorithms in urbanism[7] must be looked upon with a certain degree of caution, since most of these adoptions are based on superficial similarities which only rhetorically trade on qualities associated with the phenomena from which the rule was borrowed. There are numerous examples where the organic pattern-generating properties of different algorithms are used to generate whole “ecological” masterplans which, if built, would prove to have the same inflexibility and suffer from the same disadvantages as other, non-algorithmically generated plans. There is a simple reason for this, already hinted at above as a forewarning: the reality in which the respective algorithm operates in nature is completely different from the reality on which a master plan operates[8]. This leads to a rather dangerous situation, similar to that of the Modern Movement. To elaborate: even though derived from parametric or algorithmic techniques infused with a variety of rules borrowed from science or biology[9], the resultant designs, when looked at with a clear mind, present us with a static, fixed image which is imposed as a design. For all the talk of emergence and of anticipating the natural growth of the city, these designs, when enforced, collapse the space of possibility to a single outcome. Moreover, do they not contradict themselves – do they not have one central author, whilst natural growth stems from the sum of actions performed by agents, each oblivious of the whole? Furthermore, we would argue against appropriating algorithms derived from previously studied patterns in nature and applying them to urban-scale projects without good cause. If the built environment is an inherent part of nature, as we have argued, we should not forcibly adapt to it laws and rules which are not originally its own; instead, we should devise the algorithms which make cities tick with the same scientific rigor used in devising the physical rules of the universe.

One might argue that this is not true: there is no clear “master planner” role anymore, at least not in the classical sense. There is no longer an author, since that role has been transferred to each agent in the simulation. But we must not forget that simulations are still a design tool. To support this statement, we must refer back to the beginning of this paper, where we summarily sketched chaos theory and the fragility of the world it proposes, in relation to our inherent limitations regarding measurement. We must be conscious of the fact that the act of measurement influences the quantity measured and changes reality minutely, but critically[10]. In the context of chaos and the non-linear world we so suddenly find ourselves in, even the most minute and insignificant decimal of a parameter can propagate into huge changes in the respective system, with completely different outcomes. There is thus a transcendental barrier between simulation and reality. Simulations, as a means of objectifying the design process, are still design tools, owing to our inherent incapacity to measure every detail. Predicted results will always be designs, regardless of the precision with which the algorithms describe the system[11] or the accuracy of the measurements used as input. As Neil Leach notes, “the complexity of material computation[12] within the city far exceeds anything that we might be able to model as yet through computation” (Leach, 2009).

Faced with these facts, there is a need to emphasize the theoretical and practical gaps within the computational world. As Arie Graafland notes, “The computational universe turns dangerous when it stops being an useful heuristic device and transforms itself into an ideology that privileges information over everything else” (Graafland, 2010). We can argue that this obsession with algorithmic approaches to urbanism has led to a planning process dominated by information, or data. An algorithm left by itself is completely inert in the absence of input. In the case of urban design, the most meaningful input is hard, even impossible, to quantify or rationalize in terms of abstracted datascapes. In turn, this has led to the primacy of other input sources which, though more easily quantified, are less relevant to the quality of the urban space they help shape through the algorithmic design process. These planning methodologies thus risk, most of the time, producing “disembodied data” (Graafland, 2010) – information restratified in such a manner that it becomes meaningless from an urban point of view[13]. Quite often, subjective authorial biases – which are inherent to any design process, as we previously stated – are completely forgotten, masked by a seemingly objective algorithm which supposedly does not suffer from the weaknesses of humanity. Yet one can argue that this is the same mistake made some time ago by the Modern Movement: both fail to take into account the fact that sensorial capacities and precision limitations have always been a driving force of urbanism and architecture. Relinquishing control is not yet[14] possible: we have yet to reach the point in computational power and algorithmic finesse where we will be able to hand consciousness over to the city itself.

A different question now arises with respect to the role of the designer, for his presence is still required both theoretically and practically. How will we take our cues and act? We can now state that urbanists and architects are part of the critical class of decision makers whose actions lead to the self-organization of the structures making up the built environment. In our current world there is little that escapes the rigorous scrutiny and planning of authorities, and this trend seems to be increasing – leaving less and less to unplanned development. Parametric design techniques have indeed helped, by allowing an increasing number of complexities to be handled by the designer, which in turn makes for better designs if only because we are no longer constrained by a limited geometrical vocabulary. Sola-Morales criticized the systemic, structuralist approach for renewing itself only superficially by shifting its technical jargon (Sola-Morales, 2008), and for its implicit alienation, through over-complexification, from the true realities and sensibilities of the city – with the subsequent failure to reach the sublime, a comprehensive, holistic understanding of the city such as that described in Lefebvre’s book on rhythmanalysis (Lefebvre, 2004). We believe that emergence, self-organization and their related theoretical constructs are the means through which the structuralist approach reaches its moment of the sublime, empowering it with an animating force which extends beyond its previous limitations. Regarding the role of the designer, what matters most is that we reach a level of understanding at which being part of the whole and retaining our capacity for independent action can coexist and allow us to continue to design.

To sum up, the main goal of this paper was to put forward a few self-critical questions regarding emergence and its implications in both theory and practice. I would like to emphasize that urban planning, as a science, has never been in a better position to accept and act upon the inherent fractal fragilities of the city or, by extension, the urban environment. The tools now available can become the conduit through which theoretical concepts – whose very goal is to explain and describe, beyond impersonal formulas, the delicate nature of a city – are extended into reality. I have tried to sound a cautionary note regarding direct applications of this new-found digital power and its inherent limitations, lest we end up repeating the mistakes of the past. While accepting the built environment as an inherent part of nature, we must deduce the parameters and rules proper to it and governing its existence if we are to properly design and create the spaces needed for the harmonious growth of society.

Bibliography:

Picon, A. (2003). Architecture, Science, Technology and the Virtual Realm. In Architectural Sciences (pp. 292-313). Princeton: Princeton Press.

DeLanda, M. (2000). A Thousand Years of Nonlinear History. London: Zone Books.

Graafland, A. (2010). From Embodiment in Urban Thinking to Disembodied Data. The Disappearance of Affect. Delft: TU Delft.

Leach, N. (2009, July/August). Swarm Urbanism. AD: Digital Cities, pp. 56-63.

Lefebvre, H. (2004). Rhythmanalysis: Space, Time and Everyday Life. London: Continuum.

Sola-Morales, M. d. (2008). A Matter of Things. Rotterdam: NAi Publishers.

Stewart, I., & Cohen, J. (1995). The Collapse of Chaos: Discovering Simplicity in a Complex World. London: Penguin Books.

Notes:


[1] Ian Stewart and Jack Cohen perform a thought experiment suggesting that, were someone to devise the formula governing the movement of all material bodies in the universe, the amount of paper needed to write it down would exceed the amount of matter found in the universe. (Stewart & Cohen, 1995)

[2] “Unexpected”, in a deterministic context, should be read as “unpredictable”.

[3] To give an example of this behavior in nature, we need look no further than a school of fish: a coherent collective behavior stems from the actions of each member of the group, without any central intelligence imposing a pattern (see the sketch below).
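
A rough sketch of this kind of behavior, in the spirit of the Vicsek model of collective motion (chosen here only because it is compact, not because fish literally follow it): each agent merely adopts the average heading of its neighbors, plus a little noise, and a coherent, school-like drift appears without any central rule-giver. All parameter values are arbitrary.

```python
# Rough sketch of school-like behavior, in the spirit of the Vicsek model:
# every agent aligns with the average heading of its neighbors (plus noise),
# and a collective direction emerges with no central intelligence.
import math
import random

N, L, R, SPEED, NOISE, STEPS = 100, 10.0, 1.0, 0.05, 0.3, 200

pos = [(random.uniform(0, L), random.uniform(0, L)) for _ in range(N)]
ang = [random.uniform(-math.pi, math.pi) for _ in range(N)]

def alignment(angles):
    """Length of the mean heading vector: 0 = disorder, 1 = perfect alignment."""
    vx = sum(math.cos(a) for a in angles) / len(angles)
    vy = sum(math.sin(a) for a in angles) / len(angles)
    return math.hypot(vx, vy)

for t in range(STEPS):
    new_ang = []
    for i in range(N):
        sx = sy = 0.0
        for j in range(N):   # average heading of neighbors within radius R
            dx = abs(pos[i][0] - pos[j][0]); dx = min(dx, L - dx)
            dy = abs(pos[i][1] - pos[j][1]); dy = min(dy, L - dy)
            if dx * dx + dy * dy <= R * R:
                sx += math.cos(ang[j]); sy += math.sin(ang[j])
        new_ang.append(math.atan2(sy, sx) + random.uniform(-NOISE, NOISE))
    ang = new_ang
    pos = [((x + SPEED * math.cos(a)) % L, (y + SPEED * math.sin(a)) % L)
           for (x, y), a in zip(pos, ang)]
    if t % 50 == 0:
        print(f"step {t:3d}: alignment = {alignment(ang):.2f}")
```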

[4] For example, cities grow, and they also sprout new cities when local limits (in terms of geography, resources, etc.) are reached.

[5] For example, one can look at the Etruscan rituals performed when founding a new city, rituals which were later maintained by the Romans. The main goal of this act was that of removing the respective plot of land destined to become a new settlement from the chaotic and violent influence of natural elements, and nature itself, and placing it under the order and protection of benevolent deities. The space was ordered through a rectangular grid dominated by the North-South and East-West axes.

[6] Probably the most literal translation of this is found in Charles Jencks’s late landscape works. His book The Architecture of the Jumping Universe (London: Academy Editions, 1995) best describes the relationship between aesthetics and the way we understand and perceive the world around us.

[7] A few examples which come to mind are cellular automata, the infamous Voronoi diagram, swarm simulations, diffusion-limited aggregation, etc., and their subsequent variations and materializations (a sketch of the last of these follows below).
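
To make the last item concrete, here is a deliberately crude sketch of diffusion-limited aggregation: random walkers wander over a grid and freeze the moment they touch the growing cluster, producing a branching, coral-like structure out of one trivially simple rule. The grid size and number of walkers are arbitrary.

```python
# Crude sketch of diffusion-limited aggregation (DLA): random walkers stick to
# a growing cluster on first contact, yielding an organic, branching pattern.
import random

SIZE = 61                         # odd, so there is a well-defined center cell
WALKERS = 200
grid = [[False] * SIZE for _ in range(SIZE)]
grid[SIZE // 2][SIZE // 2] = True             # seed particle at the center

def touches_cluster(x, y):
    """True if any of the four orthogonal neighbors is already frozen."""
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < SIZE and 0 <= ny < SIZE and grid[ny][nx]:
            return True
    return False

for _ in range(WALKERS):
    # Release a walker on a random edge cell and let it diffuse until it sticks.
    x, y = random.choice([(random.randrange(SIZE), 0),
                          (random.randrange(SIZE), SIZE - 1),
                          (0, random.randrange(SIZE)),
                          (SIZE - 1, random.randrange(SIZE))])
    while not touches_cluster(x, y):
        dx, dy = random.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
        x = min(max(x + dx, 0), SIZE - 1)
        y = min(max(y + dy, 0), SIZE - 1)
    grid[y][x] = True                          # the walker joins the aggregate

print("\n".join("".join("#" if c else "." for c in row) for row in grid))
```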

[8] Take, for example, the Voronoi diagram: it appears in nature at a wide range of scales (from the microscopic to the macroscopic), yet it fails to show up at architectural scales altogether and only seldom at urban scales (where it appears more out of the unconscious desire to enforce patterns than out of reality).

[9] Here we can talk about swarm behavior and collective intelligence, basic manifestations of emergence witnessed in the behavior of ant colonies and other similar hive-centered creatures.

[10] This explanation is best left to a quantum physicist, but I will give it a try as well. In layman’s terms, the most trivial act of measurement requires bouncing light off an object – yet precisely this affects and pollutes the measurement. It is thus easy to understand that we cannot accurately measure both the speed and the position of an object, because by measuring one we influence the other. This holds true at all scales, though the effects, while not irrelevant, are imperceptible at scales larger than an atom.

[11] Here we must mention the fact that the laws of nature are merely the best approximations of observed behavior we have so far. They are by no means universal truths (Stewart & Cohen, 1995).

[12] Material computation is computation performed by matter itself. One straightforward example is Gaudí’s famous hanging-chain experiments, in which he employed a physical system to compute the optimal shape of catenary arches.
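
A crude digital counterpart of that experiment (a sketch only, not a reconstruction of Gaudí’s method): a chain of point masses joined by fixed-length links is relaxed under gravity with a simple position-based solver; the settled curve approximates a catenary which, turned upside down, gives the line of thrust of an arch. All the numbers below are arbitrary.

```python
# Crude digital counterpart of a hanging-chain experiment: point masses joined
# by fixed-length links are relaxed under gravity; the settled shape
# approximates a catenary (inverted, the ideal line of thrust for an arch).
N = 21                  # number of nodes; nodes 0 and N-1 are pinned supports
SPAN = 10.0             # horizontal distance between the two supports
LINK = 0.6              # rest length of each link (total chain longer than SPAN)
GRAVITY = 0.005         # downward displacement applied per relaxation step
STEPS, PASSES = 2000, 10

# Start with the chain stretched straight between its supports.
xs = [SPAN * i / (N - 1) for i in range(N)]
ys = [0.0] * N

for _ in range(STEPS):
    for i in range(1, N - 1):          # gravity pulls every free node down
        ys[i] -= GRAVITY
    for _ in range(PASSES):            # restore each link to its rest length
        for i in range(N - 1):
            dx, dy = xs[i + 1] - xs[i], ys[i + 1] - ys[i]
            dist = (dx * dx + dy * dy) ** 0.5
            if dist == 0.0:
                continue
            corr = 0.5 * (dist - LINK) / dist
            if i != 0:                 # left node moves unless it is a support
                xs[i] += dx * corr
                ys[i] += dy * corr
            if i + 1 != N - 1:         # right node moves unless it is a support
                xs[i + 1] -= dx * corr
                ys[i + 1] -= dy * corr

for x, y in zip(xs, ys):               # print the settled (sagging) shape
    print(f"{x:6.2f} {y:7.3f}")
```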

[13] There is a rather blunt saying amongst those familiar with the ways of programming and scripting which describes the situation quite well: “Garbage in, garbage out.” More to the point, whatever the qualities of the algorithm used, its proper functioning depends on it receiving the right kind of input. An interesting variation of the aforementioned saying, just as relevant, is “Garbage in, gospel out” – a reference to the way incorrect results from over-trusted algorithms often find themselves advocated as truth.

[14] We have yet to achieve what Ray Kurzweil describes in his book The Age of Spiritual Machines as the Singularity.

Comments:

Dimitrie –

This is superb. It is refreshing to hear such well-grounded theoretical discussion from someone who is also technically proficient in the tools themselves. There is a huge chasm in architecture and design today between those who theorize the use of new technology and those who actually apply it; we need more examples like this one that can bridge that gap.

Andrew,
Thanks a lot for your words. It really encourages me to see that there still is some self-criticism left around. Though quite a power user (or manipulator?) of computational architecture, I felt the gap you’re mentioning and I’m trying to fill it in somehow – I hope that my criticism doesn’t discredit but goes on to push further developments in the right direction. In the end, it’s a sadistic move. Again, thanks for your comment – it’s a motivation booster :)

A comprehensive critique of treacherous computational applications and borrowed metaphors in contemporary architectural practice, formulated in such a structured and delightfully reasonable way – just stunning and enormously inspiring!

It makes me happy to see your constructive digestion of the experiences and driving discussions of the previous months. Never stop questioning the bible!

I especially enjoyed your remarks on similar tendencies in the modernist movement, as well as on the opposing forces of human creation and nature. Thoughts about Adorno’s and Benjamin’s notions of “Natur und Gesellschaft” (nature and society) pop up instantly. Consequently, if our relation to the environment has indeed altered that much, how do we use these insights for creating and transforming the world we live in? Does our recently gained command of broad computational applications and tooled simulations expose a potential new mode of architectural intervention? From regulation to provision? Some projects by R(&)Sie(n) tackle exactly the dilemma stated in your paper regarding the design result as a paradoxically fixed/static image created by a central author called the designer. However, even though there are overlaps, art (to which I fully count François Roche) differs from architecture in responsibility and manifestation.

Thanks for sharing your thoughts and reminding us that we should think about the tools we’re using in order to avoid being unwittingly controlled by them.

Holla Patrick,
I guess I already thanked you in person for your insightful offline review – as well as for all the other formal and informal opportunities – our discussions have been a critical drive towards the articulation of the above paper.
The pleasure’s all mine!

I’m not sure how the change in our relationship with the natural environment will affect the practice – this holistic approach to nature and the built environment is as yet unexplored (methinks) yet promising – we might go and build avatar-style worlds, or we might fork nature off towards a human-designed nature, or we might blend in or merge with it to the point that there is no longer a visible distinction – bypassing aesthetics and all. Ah… ramblings..

Hey Dimitri,

Interesting and well formulated thoughts. Good to hear a critical voice from within the computational design scene.

Have fun in Brussels, I can’t make it, but have suggested your presentation to my students. If you plan on coming to Ghent for the second day of the conference, be sure to stop by at the mmlab.

regards,

corneel

A very well put together argument.

I would like to bring up one point that I think would be interesting to discuss. Starting with the idea that algorithms are neutral (meaning neither implicitly good nor evil), this leads to your note of garbage in/garbage out. But then what gives direction to this computational trajectory that architecture/urbanism is going down? I assume that we all want to create a “better” world. If it does not come from the algorithm, then it must come from us. So it may be dangerous to cede too much control.

matei, thanks for your comment.
well, in the next piece i’ll publish i’m arguing that any kind of algorithmic simulation is an act of design, due both to the inherent limitations of any systemic approach (as expressed by Gödel’s incompleteness theorem) and to our measurement abilities (as deduced from quantum uncertainty). Basically what i’m trying to say is that one way or the other, algorithms are still design tools, i.e. they push in a certain direction.

As for the drive to create a better world, this is true, but not quite. The driving forces behind the “evolution” of the built environment are rarely motivated by pure goodness…

regarding control, you can also, in a way, code un-control which results in variation – but this variation has to be well understood and doctored in order to actually appeal to developers, users, etc…

i’ll stop rambling now.

Constantin Cozma – July 17th, 2011 at 12:40 am

Hello Dimitrie! I have read your text and I must say I am also a big fan of Manuel de Landa – his book A Thousand Years of Nonlinear History really opens our eyes to the inner structures of the world around us.
Now I wanted to express some personal thoughts about parametricism.

It is clear that this tool will revolutionize architecture in one way or another, but the difference will lie in the way it is going to be used. Is it a tool, or is it more than that? I mean, will it create, or has it already created, a particular style? The kind that Patrik Schumacher talks about? Or would it be better if the results of a parametric process were not recognized or catalogued as part of a particular style or architectural language?

I am very sure that we are still in the infancy of this new technique and it’s very easy to become more enthusiastic about the result of the process than about the process itself.

See for example Zaha Hadid’s works, most of which are fluid and continuous shapes. She always argues that her works are deeply rooted in the context and shaped by the fluid flows of interests, but look around in nature: you don’t see only fluids. You also see minerals, and plants of all shapes and colors, etc. My point is that the whole impact which these kinds of generated forms have in an urban context must be carefully analyzed, and the psychological aspect is crucial.

I think there is a tendency now to assume that we should create a kind of living, organic, self-creating architecture. It is easy to develop a kind of fetish for these types of objects/things.

In his book The System of Objects, Jean Baudrillard offers a very interesting critique of a similar phenomenon which has happened in product design. Basically he observed that humans tend to create objects which, by being more and more useful, more and more helpful and close to our needs, dispossess us of our capabilities – they somehow enslave us.

I am wondering if this kind of phenomenon is also happening in the architectural field. I mean, in our search for the close-to-nature objects we create, don’t we get separated from nature deep inside? Do we want to live in an intelligent architectural being that will protect us and adapt to our needs, or do we still want to be able to adapt and protect ourselves?

Maybe I am going too far with these questions, but philosophical thoughts are always welcome.
