The city cannot be reduced to merely the sum of its parts – its nature escapes the deterministic world we, until recently, thought we lived in, and manifests itself in the realm of the unpredictable, where it is said that the flap of a butterfly’s wing here can cause a thunderstorm half a globe away. This new way of seeing the world – in which the interactions of complex systems yield results of such extreme richness and fragility that even the most minute decimal of a number can propagate vast and striking changes – is not as chaotic and random as it seems at first glance. Though essentially unpredictable, this world is anything but devoid of pattern. Until recently, we perceived nature in a deterministic way – that is, we thought ourselves smart enough to fully comprehend its rules and elaborate them in formulas that would ultimately hold the whole truth and describe nature in its full richness of detail and dynamism.
This approach proved to be a folly – formulas grew so complicated that several pages were needed to write them down. The quest for a Theory of Everything is essentially flawed. Systemic theory seemed to be pushing towards a dead end, as Gödel elegantly proved in 1931. In short, Gödel’s first incompleteness theorem states that for any consistent formal system rich enough to express arithmetic, there will always be true statements about the system which are unprovable within that system. This led us to look for other mechanisms to more efficiently describe the complex behavior of systems. Chaos theory was born, and alongside it evolved different concepts which would help describe all the new-found detail and richness revealed once humanity dropped its deterministic glasses. By far the most comprehensive and elusive (from a mathematical point of view) is the concept of emergence. In a systems-theoretical context, this term describes the unexpected appearance of novel features of a system, features which either enrich the respective system or transform it into a new one. It is also necessary to introduce the notion of self-organization, which has become a crucial tool for comprehending today’s complex world. In short, self-organization is the process by which structures or patterns appear without a central authority or external element imposing them through planning or specific design. Furthermore, it has been argued that emergence is the rule rather than the exception (Stewart & Cohen, 1995).
All these changes coming from the world of science have prompted parallel developments in the philosophical world. Space is no longer perceived as flat and linear but as a striated medium, folding upon itself and demarcated by gradients (or intensities) of values, with no “deterministic” boundaries. Manuel DeLanda argues for a perception of reality as a flow of matter-energy animated from within by self-organizing processes (DeLanda, 2000). Cities thus become mineral exoskeletons which crystallize out of this flow, while themselves performing operations of stratification, destratification and restratification on the flows that traverse them. Henri Lefebvre describes this organic aesthetic of the flows of life in a city, its continuous restlessness and unpredictability, as rhythms with a “maritime” quality: “There on the square, there is something maritime about the rhythms. Currents traverse the masses. Streams break off, which bring or take away new participants. Some of them go towards the jaws of the monster, which gobbles them down in order quite quickly to throw them back up. The tide invades the immense square, then withdraws: flux and reflux.” (Lefebvre, 2004). In the study of emergence, emphasis is put on the two ways it operates or manifests: how complex systems emerge from simple rules or, conversely, how complex systems produce simple patterns. Through the study of such phenomena and their workings we can deduce that large-scale structures emerge from the actions of individual decision makers (DeLanda, 2000).
We can now state that cities – and, without going amiss, we can extrapolate to the whole of the built environment – can be seen as an emergent system, the result of a crystallization process provoked and sustained by humanity (which, in turn, evolved from similar processes). The city gains legitimacy in itself, not to the point of being a conscious entity, but as a phenomenon which presents all the qualities of a natural organism. Its independent animation and continuous reproduction compel us to accept its independence from planning and design endeavors and our subsequently limited operational power over it. This statement challenges a tradition deeply rooted in the psyche of humankind, which preferred to identify itself and, even more so, its acts of building as acts opposing nature – opposing natural chaos with the order of creation. In my opinion, one critical aspect of the radical shift now taking place in architectural theory and practice alike, or one reaction to emergence, is accepting the legitimacy of the city as a living and evolving process which does not oppose nature, but is inherently part of it.
These theoretical advances have been directly supported and translated into urbanism through new digital tools which have emerged and are now becoming accessible to practice. We can now simulate highly complex phenomena to amazing degrees of accuracy, generating self-organizing patterns that act as a basis for understanding the formational processes of urban conglomerations. Parametric design, coupled with algorithmic design, provides the main tools with which a new urbanism is now created. Why this drive in this direction? One way of looking at it is through the relationship between science and architecture/urbanism. Antoine Picon argues that both share the ambition to interpret and transform the world (Picon, 2003). Going further, he states that science and architecture/urbanism are the primary educators of our vision of the world – that is, through their transformation and interpretation of the world they also change the observer, who in turn goes on to create new architecture and urbanism. We can thus state that, since the way we perceive the world has changed through scientific advancement, architecture and urbanism are now struggling to keep up with those changes and to re-create a miniature version of the cosmos as we understand it.
Through the integration and use of emergent behavior and other self-organizing processes in the planning process, we are at the point of eliminating the gap between vernacular architecture – the natural growth of unplanned settlements – and planned, top-down designs. This hitherto clear distinction between two types of design, one pertaining to the sum of individual actions of each decision maker involved in the process and the other to one central author, has been at the root of our interpretation of the built environment. The previous statement rests on the supposition that our use and handling of emergence is reaching such precision that what we design with the help of computers and the algorithms they run becomes indistinguishable from the reality it is meant to describe, or simulate. In this sense, we might be tempted to boldly announce that we have created an abstract machine, or finally deduced the diagram describing the structure-generating processes of cities or, even more, of the built environment. Reality becomes a diagram, and the diagram becomes reality.
Science has advanced along its reductionist path and has unraveled more and more of the rules which make our world tick. More and more complex phenomena are now well approximated by new algorithms. This use of scientific images in urbanism and architecture is not new, as we have previously shown. Yet this meeting of science and urbanism is a treacherous one – Picon notes that it is productive only when there are similarities between the realities on which both operate (Picon, 2003). This leads us to what might prove a rather polemical point of this paper: the sudden onset and assimilation of numerous nature-derived algorithms in urbanism must be regarded with a certain degree of caution, since most of these adoptions are based on superficial similarities which merely speculate rhetorically on qualities associated with the phenomena from which the assimilated rule derives. There are numerous examples where the organic pattern-generating properties of different algorithms are used to generate whole “ecological” masterplans which, if built, would prove to have the same inflexibility and suffer from the same disadvantages as other, non-algorithmically generated plans. The reason is simple, and was mentioned at the beginning of this paragraph as a forewarning: the reality in which the respective algorithm works in nature is completely different from the reality on which a master plan operates. This leads to a quite dangerous situation, similar to that of the Modern Movement. To elaborate: we have to acknowledge that, even though derived from parametric or algorithmic techniques infused with a variety of rules borrowed from science or biology, the resultant designs, when confronted with a clear mind, present us with a static, fixed image which is imposed as a design.
For all the talk of emergence and of anticipating the natural growth of the city, these designs, when enforced, collapse the space of possibility to one single outcome. Moreover, do they not contradict themselves – do they not have one central author, whilst natural growth stems from the sum of actions performed by each agent, each oblivious of the whole? Furthermore, we would argue against the subversion of algorithms derived from previously studied patterns in nature and their application to urban-scale projects without good cause. If the built environment is an inherent part of nature, as we have shown previously, we should not try to forcibly adapt laws and rules which are not originally part of it, but instead devise the algorithms which make cities tick with the same scientific rigor used in devising the physical rules of the universe.
One might argue that this is not true: there is no clear “master planner” role anymore, at least not in the classical sense. There is no longer an author, since that role has been transferred to each agent in the simulation. But we must not forget that simulations are still a design tool. To support this statement, we must refer to the beginning of this paper, where we summarily sketched chaos theory and the fragility of the world it proposes in relation to our inherent limitations regarding measurement. We must be conscious of the fact that the act of measurement influences the measurement itself and changes reality minutely, but critically. In the context of chaos and the non-linear world we so suddenly find ourselves in, even the most minute and insignificant decimal of a parameter can propagate huge critical changes throughout the respective system, with completely different outcomes. There is thus a transcendental barrier between simulation and reality. Simulations, as a means of objectifying the design process, remain design tools because of our inherent incapacity to measure all detail. The results of predictions will always be designs, regardless of the precision with which the algorithms describe the system or the accuracy of the measurements used as input. As Neil Leach notes, “the complexity of material computation within the city far exceeds anything that we might be able to model as yet through computation” (Leach, 2009).
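This sensitivity of non-linear systems to the "most minute decimal" can be illustrated with a short sketch of the logistic map, a textbook chaotic system; the starting value and the size of the perturbation below are arbitrary choices for illustration only.

```python
# The logistic map x_{n+1} = r*x_n*(1 - x_n) at r = 4 is a canonical
# chaotic system: two trajectories a hair's breadth apart (here 1e-10)
# stay together briefly, then diverge.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.4)
b = logistic_trajectory(0.4 + 1e-10)
print(abs(a[5] - b[5]))    # still vanishingly small
print(abs(a[50] - b[50]))  # the initial difference has been amplified enormously
```

No matter how finely the initial condition is measured, any residual error is eventually magnified until the simulated trajectory tells us nothing about the real one.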
Faced with these facts, there is a need to emphasize the theoretical and practical gaps within the computational world. As Arie Graafland notes, “The computational universe turns dangerous when it stops being a useful heuristic device and transforms itself into an ideology that privileges information over everything else” (Graafland, 2010). We can argue that this obsession with algorithmic approaches to urbanism has led to a planning process dominated by information, or data. An algorithm left by itself is completely inert in the absence of input. In the case of urban design, the most meaningful input is hard, even impossible, to quantify or rationalize in terms of abstracted datascapes. In turn, this has led to the primacy of other input sources which, though more easily quantified, are less relevant to the quality of the urban space they help shape through the algorithmic design process. These planning methodologies thus risk producing, most of the time, “disembodied data” (Graafland, 2010), or information which is restratified in such a manner that it becomes meaningless from an urban point of view. Quite often, subjective authorial biases – which are inherent to any design process, as we previously stated – are completely forgotten, masked by a seemingly objective algorithm which does not suffer from the weaknesses of humanity. Yet one can argue that this is the same mistake the Modern Movement made some time ago. Both fail to take into account the fact that sensorial capacities and precision limitations have always been a driving force of urbanism and architecture. Relinquishing control is not yet possible: we have yet to reach the point in computational power and algorithmic finesse where we will be able to hand consciousness over to the city itself.
A different question now arises with respect to the role of the designer, whose presence is still required both theoretically and practically. How are we to take our cues and act? We can now state that urbanists and architects belong to the critical class of decision makers whose actions lead to the self-organization of the structures making up the built environment. In our current world, little escapes the rigorous scrutiny and planning of authorities, and this trend seems to be increasing – leaving less and less to unplanned directions. Parametric design techniques have indeed helped by allowing an increasing number of complexities to be handled by the designer, which makes for better designs if only because we are no longer constrained by a limited geometrical vocabulary. Sola-Morales criticized the systemic, structuralist approach for its superficial renewal through shifts in technical jargon (Sola-Morales, 2008) and its implicit alienation from true realities and sensibilities through over-complexification, with the subsequent failure to reach the sublime, or a comprehensive holistic understanding of the city as described in Lefebvre’s book on rhythmanalysis (Lefebvre, 2004). We believe that emergence, self-organization and their related theoretical constructs are the means through which the structuralist approach reaches its moment of the sublime, empowering it with an animating force that extends beyond its previous limitations. Regarding the role of the designer, what matters most is that we reach a level of understanding where our mental negotiation between being part of the whole and retaining our capability of independent action allows us to continue to design.
To sum up, the main goal of this paper was to put forward a few self-critical questions regarding emergence and its implications in both theory and practice. I would like to emphasize that urban planning, as a science, has never been in a better position to accept and act upon the inherent fractal fragilities of the city or, by extension, the urban environment. The tools now available can become the conduit through which theoretical concepts – whose very goal is to explain and describe, beyond impersonal formulas, the delicate nature of a city – extend into reality. I have tried to strike a cautionary tone regarding direct applications of this new-found digital power and its inherent limitations, lest we end up repeating the mistakes of the past. While accepting the built environment as an inherent part of nature, we must deduce its own proprietary parameters and the rules governing its existence if we are to properly design and create the spaces needed for a harmonious growth of society.
Picon, A. (2003). Architecture, Science, Technology and The Virtual Realm. In Architectural Sciences (pp. 292-313). Princeton: Princeton Press.
DeLanda, M. (2000). A Thousand Years of Nonlinear History. London: Zone Books.
Graafland, A. (2010). From Embodiment in Urban Thinking to Disembodied Data. In The Disappearance of Affect. Delft: TU Delft.
Leach, N. (2009, July/August). Swarm Urbanism. AD: Digital Cities, pp. 56-63.
Lefebvre, H. (2004). Rhythmanalysis: Space, Time and Everyday Life. London: Continuum.
Sola-Morales, M. d. (2008). A Matter of Things. Rotterdam: NAi Publishers.
Stewart, I., & Cohen, J. (1995). The Collapse of Chaos: Discovering Simplicity in a Complex World. London: Penguin Books.
Ian Stewart and Jack Cohen perform a thought experiment showing that, were someone to devise the formula governing the movement of all material bodies in the whole universe, the amount of paper needed to write it down would exceed the amount of matter found in the universe (Stewart & Cohen, 1995).
 “Unexpected”, in a deterministic context, should be read as “unpredictable”.
To give an example of this behavior in nature, one need look no further than a school of fish: coherent group behavior stems from the actions of each member, without a central intelligence imposing patterns.
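Such leaderless coordination can be sketched in a few lines, in the spirit of Vicsek-style flocking models; every parameter here (agent count, neighbourhood radius, speed, step count) is an illustrative assumption, not a calibrated model.

```python
import math
import random

# Minimal alignment sketch: each agent repeatedly adopts the average heading
# of its neighbours. There is no leader and no global plan, yet the group's
# headings align; the "order" value climbs from near 0 towards 1.

random.seed(1)
N, RADIUS, SPEED, STEPS = 40, 0.4, 0.03, 60
pos = [(random.random(), random.random()) for _ in range(N)]
ang = [random.uniform(-math.pi, math.pi) for _ in range(N)]

def order(angles):
    # 1.0 means all agents share one heading; near 0 means random headings
    return math.hypot(sum(map(math.cos, angles)),
                      sum(map(math.sin, angles))) / len(angles)

start = order(ang)
for _ in range(STEPS):
    new_ang = []
    for xi, yi in pos:
        sx = sy = 0.0
        for (xj, yj), aj in zip(pos, ang):
            if (xi - xj) ** 2 + (yi - yj) ** 2 < RADIUS ** 2:  # neighbour (incl. self)
                sx += math.cos(aj)
                sy += math.sin(aj)
        new_ang.append(math.atan2(sy, sx))
    ang = new_ang
    pos = [((x + SPEED * math.cos(a)) % 1.0, (y + SPEED * math.sin(a)) % 1.0)
           for (x, y), a in zip(pos, ang)]

print(round(start, 2), round(order(ang), 2))  # alignment rises markedly
```

The interesting point is what is absent: no agent knows the global state, yet a global pattern emerges from purely local interactions.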
For example, cities grow, and they also sprout new cities when local limits (in terms of geography, resources, etc.) are met.
For example, one can look at the Etruscan rituals performed when founding a new city, rituals which were later maintained by the Romans. The main goal of this act was to remove the plot of land destined to become a new settlement from the chaotic and violent influence of the natural elements, and of nature itself, and to place it under the order and protection of benevolent deities. The space was ordered through a rectangular grid dominated by the North-South and East-West axes.
Probably the most literal translation of this is found in Charles Jencks’s late landscaping works. His book The Jumping Universe (London: Academy Editions, 1995) best describes the relationship between aesthetics and the way we understand and perceive the world around us.
A few examples which come to mind are cellular automata, the infamous Voronoi diagram, swarm simulations, diffusion-limited aggregation, etc., and their subsequent variations and materializations.
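As a minimal illustration of how little "design" such systems need, here is a sketch of an elementary cellular automaton; the grid width and number of steps are arbitrary.

```python
# Elementary cellular automaton (Wolfram's Rule 30): an 8-entry lookup table
# over three-cell neighbourhoods is the entire rule set, yet the pattern that
# unfolds from a single live cell is famously intricate.

RULE = 30  # the rule number doubles as its own lookup table, bit by bit

def step(cells):
    n = len(cells)
    return [(RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 31
row[15] = 1  # a single live cell
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Changing `RULE` to any other value from 0 to 255 yields a different automaton, from the trivially static to the chaotic.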
Take, for example, the Voronoi diagram: it appears in nature at a wide range of scales (from the microscopic to the macroscopic), yet it is entirely absent at architectural scales and appears only seldom at urban scales (where its presence owes more to an unconscious desire to enforce patterns than to reality).
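Part of the diagram's seductiveness is how trivially it is produced; a brute-force sketch (the seed positions and grid dimensions below are arbitrary illustrations):

```python
# Brute-force Voronoi partition: every grid point is labelled with the index
# of its nearest seed, carving the plane into the familiar cells.

def voronoi_labels(seeds, width, height):
    def nearest(x, y):
        return min(range(len(seeds)),
                   key=lambda i: (seeds[i][0] - x) ** 2 + (seeds[i][1] - y) ** 2)
    return [[nearest(x, y) for x in range(width)] for y in range(height)]

seeds = [(2, 2), (12, 3), (7, 9)]
grid = voronoi_labels(seeds, 16, 12)
for line in grid:
    print("".join("abc"[label] for label in line))
```

Nothing in this construction knows anything about programmes, circulation or climate – which is precisely why its organic look is no guarantee of urban performance.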
 Here we can talk about swarm behavior and collective intelligence, basic manifestations of emergence witnessed in the behavior of ant colonies and other similar hive-centered creatures.
This explanation is best left to a quantum physicist, but I will give it a try as well. In layman’s terms, the most trivial act of measurement requires bouncing light off an object – yet precisely this affects and pollutes the measurement. It is thus easy to understand that we cannot accurately measure both the speed and the position of an object, because by measuring one we influence the other. This holds true at all scales, though at scales larger than the atomic the effects are invisible, but not irrelevant.
Here we must mention that the laws of nature are merely the best approximations of observed behavior we have so far. They are by no means universal truths (Stewart & Cohen, 1995).
Material computation is computation performed by matter itself. One straightforward example is Gaudí’s famous hanging-chain experiments, by which he employed a physical system to compute the optimal shape of catenary arches.
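A rough digital analogue of that physical computation can be sketched as a chain of linked nodes relaxed under gravity; the node count, gravity step and iteration counts are illustrative assumptions, and this is a toy, not a structural tool.

```python
import math

# Position-based relaxation of a hanging chain: free nodes are pulled down by
# gravity, then link lengths are repeatedly projected back towards LINK while
# the two end nodes stay pinned. The settled curve approximates a catenary,
# which, inverted, yields a compression-only arch.

N, LINK, GRAVITY = 21, 1.0, 0.05
pts = [[i * 0.9, 0.0] for i in range(N)]  # 20 unit links over an 18-unit span

for _ in range(4000):
    for p in pts[1:-1]:              # gravity pulls every free node down
        p[1] -= GRAVITY
    for _ in range(10):              # enforce link lengths; ends stay pinned
        for i in range(N - 1):
            dx = pts[i + 1][0] - pts[i][0]
            dy = pts[i + 1][1] - pts[i][1]
            d = math.hypot(dx, dy)
            corr = (d - LINK) / d / 2
            if i != 0:
                pts[i][0] += dx * corr
                pts[i][1] += dy * corr
            if i + 1 != N - 1:
                pts[i + 1][0] -= dx * corr
                pts[i + 1][1] -= dy * corr

print(round(min(p[1] for p in pts), 2))  # sag of the settled chain
```

Gaudí's chains solved this "program" instantly and in parallel; the digital version must grind through thousands of iterations to approximate the same equilibrium, which is the asymmetry Leach's remark points at.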
There is a rather blunt saying amongst those familiar with programming and scripting which describes the situation quite well: “Garbage in, garbage out.” To the point: whatever the qualities of the algorithm used, its proper functioning depends on receiving the right kind of input. A variation of this saying, just as relevant here, goes: “Garbage in, Gospel out.” It refers to the way incorrect results from over-trusted algorithms often end up advocated as truth.
We have yet to achieve what Ray Kurzweil describes in his book The Age of Spiritual Machines as the singularity.