



April 4th, 2012, essays

Framing Optimization in Contemporary Architecture[1]

Keywords: optimization, architecture, computational techniques

I am indebted to the generous friends (Patrick, Matas, Gustavo) who shared their thoughts on the initial, rather thoughtless rant that formed the basis of the following piece, as well as to Henriette Bier, who guided and steered its manufacture. Thanks!

Optimization, and the processes which inherently give birth to it, have become a central topic in both the discourse and practice of computational architecture. Despite this, there have been few attempts to probe and test this concept (or technique) by studying its theoretical implications. The final purpose of this paper is therefore to test optimization against several critical insights stemming from recent advancements in systems theory. Laying out a strategie fatale scenario will hopefully expose some crucial insights with clear repercussions for the way we design, or justify our designs. In order to do so, we must first explore the concept of optimization in a broader historical context, try to identify it in the standard, pre-digital architectural design process, and then clearly expose its digital manifestations, along with the limitations and advantages they imply.

Optimization is anything but a new concept[2]. Nature, as well as humankind, has always strived to do things better, which usually means increased performance of the resulting object[3] coupled with less effort spent doing the thing itself. It is probably the reason our monkey ancestors picked up the first branch and transformed it into a weapon. More recent developments, such as the industrial revolution, are manifestations of the same drive. Mechanical tools replaced human workers due to their increased efficiency and reliability. Henry Ford’s invention of the assembly line was such a clear improvement of the manufacturing process that it ushered in a new age in human society. In the non-anthropic world[4], optimization is an intrinsic quality of nature. For example, the trajectory of a river is optimized in such a way that it follows the lines of least resistance through its geographical context. Trees branch and grow in a way that maximizes their exposure to sunlight while maintaining structural integrity against wind. The phenomenon we know as evolution is essentially a process of optimization against the testing ground of the environment. From this standpoint, the built environment is the result of an optimization process of human society, which needed to improve its chances of survival against the elements and its predators (which, more often than not, include itself).

Now that we’ve succinctly considered the ubiquitous nature of optimization and its intrinsic role in both natural and artificial processes, we can narrow down our investigation and elaborate on its presence in architecture. Our focus lies in the direction of conscious architecture, not vernacular architecture, in which optimization is more akin to the collective, unconscious forces of nature. In the pre-computational period of architecture, optimizing a design was achieved simply by iteratively adjusting it with the aim of making it better. One can argue that one of a designer’s fundamental instincts is to find the best-fitting solution for a given assignment – both to ensure the project’s viability and to distinguish it from other possible competing designs. Thus, every time we sketch a possible solution for a plan, we try to improve on the previous variant – sometimes with success, sometimes without. The mechanism behind such actions is quantifiable only to a certain degree – it is a heuristic process loosely defined as architectural intuition. Nevertheless, in the wake of computational techniques, this heuristic nature of the optimization process is changing towards a more precise, algorithmic[5] approach.

As we have argued above, optimization is not a new concept – or desire – inside the design disciplines. We can even argue that the act of designing is inseparable from the act of optimization[6]. Nevertheless, its current manifestation as a process in computational architecture is new. We shall now focus on how optimization manifests itself in digital practice, and try to answer the first critical question of this essay: whether optimization is theoretically capable of replacing architectural intuition[7].

Digital techniques have exposed ever more parameters to the rational speed of the computer – at the expense of traditional, heuristic methods – thus making different optimization models possible as well as increasing the accuracy of existing ones by several orders of magnitude. For example, before the advent of environmental analysis software, solar optimization was the result of the architect’s intuition, experience and education. Now, using computational tools, the detail and accuracy of such an analysis allow for more precise architectural decisions to be taken in the context of a given design assignment. New optimization models are built on highly specialized algorithms repurposed from science to solve architectural problems. For example, swarm intelligence algorithms are used to negotiate complicated functional distributions over an environmental and contextual setting which can now be described with a much higher level of accuracy.

In computational design, optimization is sometimes used as a magic trick through which one can improve a given design. Whether applied to localized, well-defined issues – such as the energetic concerns of a building – or to larger, loosely defined aspects of a project – such as functional placement – optimization takes a centre role in most theoretical discourses and is often used as a rhetorical device for the given project. Alongside the obvious advantages of computational optimization, which include speed and accuracy, theoretical discourse has rarely acknowledged the disadvantages of such an approach. Some of them pertain to the digital medium itself rather than to optimization procedures exclusively – indeed, they can be extended into a criticism of computational techniques in general. Others do not result from the nature of the digital approach, and have more to do with the rationale, or way of reasoning, one adopts when embarking on an optimization quest.

As a first step in our endeavour to bring these issues to light, we shall attempt to define optimization in a computational setting. Thus, we “postulate” that optimization represents the improvement of a design by finding the optimal values for the parameters which describe it. This simplified definition can be broken down into three parts. First, there is the act of improvement, or betterment, of a given design. Second, there is the action of finding that respective configuration. Last, there are the parameters that describe the design – the values whose interplay and modulation define it. In the following paragraphs we shall consider all three aspects, not in the order mentioned here, but in an incremental order dictated by the complexity of the problems they pose.

The least problematic part is the act of finding: how one searches through the n-dimensional space described by the parameters of the design is a question which, in computational terms, has specific, well-defined algorithmic answers. What was before exclusively the domain of the heuristic process we dubbed architectural intuition has now been supplemented with different computational methods which navigate the complex space of possible solutions defined by the design’s parameters. Depending on the complexity of the problem at hand, there are two main approaches, with two different theoretical implications resulting from their guaranteed outcomes. In the case of simple, mathematically determinate problems – like finding the minimum thickness of a column, or calculating a catenary arch – the deterministic algorithms employed guarantee that the final solution, if found, is a global optimum[8]. Much more interesting are the algorithms which tackle problems that cannot be formulated in mathematically determinate terms, mostly due to the complexity of the possible solution space. The most common are evolutionary algorithms (EA) and swarm-intelligence-based procedures (PSO). The former generate solutions using techniques inspired by natural evolution, such as mutation, selection and inheritance, while the latter optimize a problem by considering possible solutions as particles which move through the search space of the problem, each guided by its own best-known position and that of the swarm[9]. The interesting fact is that this second class of optimization algorithms is heuristic, i.e. not mathematically determinate, and thus incapable of providing a global optimum or a guarantee that the solution found is the best one. Indeed, if that were possible, it would mean that the problem they were set to solve is in fact mathematically determinate, and their use therefore unjustified.
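To make the heuristic character of such procedures concrete, here is a minimal particle swarm optimization sketch in Python. All names, coefficients and the toy fitness function are illustrative assumptions, not taken from any tool discussed in the essay; each particle’s velocity is pulled towards its own best-known position and the swarm’s best-known one, and nothing in the procedure guarantees a global optimum.

```python
import random

def pso(fitness, dim, n_particles=30, iterations=200,
        bounds=(-5.0, 5.0), w=0.7, c1=1.5, c2=1.5):
    """Minimize `fitness` over a dim-dimensional box via particle swarm."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best-known position
    pbest_val = [fitness(p) for p in pos]
    g = pbest_val.index(min(pbest_val))
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm's best-known position

    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + pull towards personal best + pull towards swarm best
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = fitness(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# A toy "design" problem: find the point closest to the origin.
best, best_val = pso(lambda p: sum(x * x for x in p), dim=2)
```

Because the search is stochastic, two runs will generally return slightly different solutions of comparable quality – a small-scale version of the multiplicity of equivalent local optima that heuristic optimization produces.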
What is of interest is that optimization problems in architecture are usually too complex to be formulated in a deterministic way, therefore the use of EAs and PSOs is quite widespread. Due to the inherently heuristic nature of the optimization process and its results when applied to architectural problems, we must concede that there is no absolute optimum, no single best solution. Rather, what we are looking at is a collection of local optima which are equivalent in terms of performance. The singular, independent architectural object is thus refuted – the pretence of authorial uniqueness evaporates when confronted with the multiplicity of equivalent solutions resulting from a computationally rationalized heuristic optimization process (as opposed to the semi-conscious architectural intuition).

The next part of our definition of optimization which we shall analyze is that regarding the parameters describing the design. Through the careful definition of a set of parameters one can, ideally, fully encompass a design in all its aspects. Nevertheless, the extent to which architecture can be rationalized, or quantified, into a parametric process which is computable (or understood by the computer) is highly debatable. Geometrically, technique has advanced to such a point that we can safely assume there are few limitations left to conquer, and most of those amount to technical innovations which have nothing to do with architecture. Algorithmically, we are now able to describe shapes in three dimensions without any limitations. Nevertheless, the qualities stemming from the geometrical manifestation of the architectural object – qualities like circulation pathways, positive and negative space relationships, the interplay between exterior and interior, or other, more subjective ones – are not directly parametrizable due to their more abstract or instinctive nature. This can lead to many misdirected optimization attempts: if you do not use the right parameters to describe your system, you cannot possibly argue that you are optimizing it meaningfully. It would be like carefully tinkering with the design of an underground metro station for the “optimal” insolation values, or optimizing the circulation routes in an apartment in order to minimize the housewife’s daily routine: the pantry needs to be next to the kitchen, but that is common sense, not optimization[10]. To sum up, the design of the system which is going to be optimized is crucial. Optimizing a system which is essentially flawed – or not described by the parameters which are crucial to it – will still amount to a solution which performs as badly as the original starting design, even though it is “optimized”.
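A hypothetical toy sketch can make this pitfall tangible: suppose the true quality of a “design” depends on two parameters, but the optimization model exposes only one of them. The functions, names and numbers below are invented purely for illustration.

```python
# Hypothetical illustration: "optimizing" a design over the wrong parameter set.
# The true quality depends on both daylight and circulation, but the model
# exposes only daylight - so the optimizer happily ignores circulation.

def true_quality(daylight, circulation):
    # best at daylight = 0.8 AND circulation = 0.5 (higher is better)
    return -(daylight - 0.8) ** 2 - (circulation - 0.5) ** 2

def modelled_quality(daylight):
    # the flawed system description: circulation is simply absent
    return -(daylight - 0.8) ** 2

# Brute-force "optimization" over the modelled parameter only.
best_daylight = max((d / 100 for d in range(101)), key=modelled_quality)

# The result is perfectly "optimized" on the model, yet the neglected
# circulation parameter leaves the true quality as poor as before.
print(true_quality(best_daylight, circulation=0.0))
```

However thorough the search, the flaw sits in the system description itself, not in the optimizer.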

Finally, the last part of our rather simplified definition of optimization which we shall analyze is the actual goal of the optimization, or what the system is optimized towards. We have purposely left this part for last because from it we hope to bring to light a more abstract line of thought, coming from systems theory, which can provide a clear theoretical direction for the formulation (and interpretation) of optimization.

In other domains, such as computer science, optimization goals are usually clear and easily described. For example, an optimized search algorithm is one that runs faster than its predecessor or returns more relevant results; a better compression standard allows for smaller file sizes; a better pathfinding algorithm is more accurate or faster in its execution. Architecture has more difficult goals to set for optimization procedures. There is a rather limited set of clear, straightforward goals which can be easily quantified, modelled and subsequently optimized towards. An incomplete list would include energy efficiency, reduced energy consumption, better energy production, structural performance, less material usage and various other economic constraints. Alongside these quantifiable qualities there are numerous unquantifiable characteristics of an architectural project – such as spatial qualities, or beauty – which do not lend themselves easily to parameterization. Many such goals play a double role in the articulation of architecture and the urban environment. For example, we can look at the overall connectivity of a street network or of an urban setting. At first sight, we would think that maximizing connectivity is a viable optimization goal for improving the design of a new urban development. By looking at real-life examples we get a different picture: less accessible places provide shelter and quiet from otherwise busy, traffic-intense surroundings and play a critical role in any design. This is a quantifiable example of architecture and urban design employing the contrast of a certain performance criterion so as to evolve a successful design which can accommodate the needs of its users.
Optimizing towards a single, fixed goal is reminiscent of Modernism’s partially failed project, because it negates the variation in the needs of the users, effectively collapsing the complexity of human society into one “global optimum”.
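The connectivity example can be quantified with a short sketch: an average shortest-path measure (plain breadth-first search) over two invented five-node street graphs. No real urban data is used; the graphs and the metric are assumptions made for illustration only.

```python
from collections import deque

def avg_shortest_path(graph):
    """Mean shortest-path length over all node pairs of an unweighted street graph."""
    total, pairs = 0, 0
    for start in graph:
        dist = {start: 0}
        queue = deque([start])
        while queue:                      # breadth-first search from `start`
            node = queue.popleft()
            for nb in graph[node]:
                if nb not in dist:
                    dist[nb] = dist[node] + 1
                    queue.append(nb)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

# Two toy street networks over the same five intersections:
grid = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}   # has a cul-de-sac (node 4)
dense = {0: [1, 2, 3, 4], 1: [0, 2, 3, 4], 2: [0, 1, 3, 4],
         3: [0, 1, 2, 4], 4: [0, 1, 2, 3]}                       # fully connected

# prints True: the fully connected net "wins" on this single metric...
print(avg_shortest_path(dense) < avg_shortest_path(grid))
# ...yet node 4 in `grid` is exactly the sheltered, less accessible
# place whose value the single-goal metric cannot see.
```

Maximizing the metric alone would erase the quiet cul-de-sac, which is the point of the argument above: the goal itself carries a double role that a single number collapses.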

The duality of performance criteria in architecture, and the importance of their variation, needs to be acknowledged and used. Global optima are characteristic of determinate, static systems – in other words, of inanimate, mechanical systems. This brings us to the question of how we perceive the built environment and one of its main shapers, architecture. In Manuel DeLanda’s vision, the built environment is a mineral exoskeleton sustaining and enabling human society. On a small timescale, and at a superficial glance, the built environment is a static, inert lattice – nevertheless, on a larger timescale its dynamism is evident. Furthermore, this dynamism can be extended to any kind of system: from the universe itself to the crystalline structure of clay. Extrapolating, reality can be seen as a flow of matter-energy in time, which continuously shapes itself and the different structures which crystallize in and out of it. For any kind of flow to exist there needs to be an imbalance – a difference of potential in a gradient field that sparks an exchange of energy. Optimization, in this context, becomes the vector which guides and gives direction to this flow of matter-energy, which becomes a succession of local optima, each better than the previous. The notion of a global optimum no longer exists, for it would mean the cessation of the flow, or the death of reality[11].

This theory of the “flow” of matter-energy, which DeLanda used to metaphorically describe reality, now has an emerging counterpart in science. Coined constructal theory, it sees the act of design in nature as a physics phenomenon which unites all animate and inanimate systems. Its main law states that for a finite-sized system to persist in time (to live), it must evolve in such a way that it provides easier access to the currents imposed on it. In the case of a river, the imposed currents consist of water; in the case of an urban setting they are composed of traffic (pedestrian and motorized) and the infrastructural requirements of the context – electricity, water, gas[12]. In the case of a localized architectural object, the flows which guide its design are of a nature more difficult to rationalize: functional flows and aesthetic trends are added to those of energy, utilities and people. Functional flows are of a very minute intricacy and complexity, as they are directly interlinked with social cycles and behaviours, which are in turn a delicate balance between inaction and movement. Our attempt to theoretically frame optimization across the varied gradient of scales of the built environment (from the macro, infrastructural level to the micro, apartment-sized level) needs to take into account the temporal characteristics, or manifestations, of each level of abstraction.

Optimization is not an action; rather, it is a never-ending process which defines and shapes the flow of matter-energy we call reality. Any act of design, either conscious (anthropic) or unconscious (natural processes), is an act of optimization, to the extent that the two are inseparable from each other. From this point of view, optimization is a generative process which continuously informs itself and takes into account the environment of the object. Nature did not design a lion by starting with a blank page – it continuously “optimized-designed” an archetypal organism towards a certain context, the result being what we now call “lion”. Furthermore, a lion will not visibly optimize itself during its lifespan. Design processes give birth to locally[13] optimized instances of the same objectile[14], and these processes themselves evolve recognisably in time. Computational techniques of optimization are making possible the temporal compression of several stages of evolution into a few electronic seconds – yet this same act of temporal compression does not break, or jump-start, the normal evolutionary cycle, which continues unabated once the respective instance of the process is materialized. This is so because the context around the respective object continues to evolve at its own, non-digitally enhanced rate.

To conclude, optimization is a keyword which needs to be used with caution. As a process in itself it identifies with design, and vice versa: any act of design is inseparable from an act of optimization, be it aesthetic, functional, economic, or judged by any other performance criterion. Computational techniques of optimization do not, by virtue of the theoretical concept itself and of the algorithms employed, provide an absolute optimal solution to a problem – there is no such thing as a global optimum. Instead, what we get – or, even more importantly, what we should aim for – is a collection of local optima: solutions which are the best, yet not unique. Here, speculating in the broader realm of non-standard and interactive architecture, architectures can be designed that are flexible enough to encompass all of the local optima in one singular object. Otherwise, architectural objects, however “optimized” they are, remain just fragile, semi-static instances of the process from which they emerge.

Bibliography:

Bejan, Adrian, and Gilbert W. Merkx, eds. Constructal Theory of Social Dynamics. New York: Springer, 2007.

Bonabeau, Eric, Marco Dorigo, and Guy Theraulaz. Swarm Intelligence. From Natural to Artificial Systems. New York: Oxford University Press, 1999.

DeLanda, Manuel. A Thousand Years of Nonlinear History. London: Zone Books, 2000.

—. Philosophy and Simulation: The Emergence of Synthetic Reason. London: Continuum, 2011.

Kwinter, Sanford. Far from Equilibrium: Essays on Technology and Design Culture. Barcelona: ACTAR, 2008.

Leach, Neil. “Swarm Urbanism.” AD: Digital Cities, July/August 2009.

Steadman, Philip. The Evolution of Designs. Cambridge: Cambridge University Press, 1979.

Stefanescu, Dimitrie Andrei. “Algorithmic Abuse.” Edited by Joseph Scherer. PLAT Journal (Fall 2011): 72-76.

Stewart, Ian, and Jack Cohen. The Collapse of Chaos. Discovering Simplicity in a Complex World. London: Penguin Books, 1995.


[1] The original variant of the title was Framing Optimization in Computational Architecture.

[2] Nevertheless, it did recently gain a lot of attention in the context of the environmental crisis we are too slowly beginning to address.

[3] Object or system, process – we are not limiting the observation to spatially or temporally finite static artefacts.

[4] Some do argue that there is no distinction between the Natural and the Anthropic; yet for the sake of argument we shall temporarily uphold this Modern dichotomy.

[5] The exact antonym of heuristic is, frustratingly enough for the author, non-heuristic. The term algorithmic was used here in place of rationalized or quantified.

[6] This will be justified later on in more depth, since it has a huge bearing on the discussion of optimization as a generative process.

[7] A question which can easily boil down to a clichéd debate on the rationality of an “architectural singularity”, yet which we avoid by situating ourselves in the realm of the possible, and not in that of speculation based on trends in computational power which, for that matter, are now faltering.

[8] The technical expression is that the algorithm converges over time towards one solution.

[9] Particle swarm optimization (PSO), as the technique is known, harnesses the power of swarm intelligence: decentralized, self-organizing systems tend towards finding optimal solutions to problems with only limited knowledge of their surroundings.

[10] The Bauhaus actually did perform studies on this very matter. The example is intentionally set out of context in order to provide some humour; nevertheless, the original study is not meaningless at all.

[11] The definition of equilibrium is a condition of a system in which all competing influences are balanced. If human society were to reach “equilibrium”, it would stop evolving altogether, since the reason for change would be gone. Yet what pushes reality forward are precisely the local changes which achieve equilibrium in one place while provoking imbalance in others.

[12] This enumeration is omitting obvious distribution networks of food and commodities, informational flows, monetary flows, etc.

[13] Locally stands for both spatial limitations as well as temporal limitations of context.

[14] An objectile is a collection of instances resulting from the maximum possible variation of the same process.


Comments:

Excellent article Andrei.

Very well thought out and very well argued. There is a lot of confusion about optimization in design. A poll in the LinkedIn group on generative design indicated that the majority of designers believe that you can design through optimization. You refute this with such clarity. Well done.

Hey Sivam,

Thanks for the kind words! It’s not that I’m refuting it completely – I’m just showing (or trying to) that one is the same as the other: hopelessly intertwined, or just about the same thing. To design is to optimize, and vice versa – in the relatively materialistic framework I’m weaving around the matter.
