There are many manifestations of “digital” architecture which have yet to be filtered and to coagulate into a coherent and articulate movement. For example, Patrik Schumacher’s proposal of “parametricism” as a global style (Schumacher 2011), while provoking a much needed debate and dialogue, has been met with skepticism by the practitioners and thinkers of the trade. “Blob architecture” is already classed as dated, mainly due to its highly vulnerable aesthetics; “non-standard architecture”, with its fascination for digitally exuberant ornamentation, has stopped innovating and revels in its geometrical experimentation (Kwinter 2008). Digital architecture is still, from this point of view, mutating, and its offspring are highly volatile. It is because of this transient nature that, instead of interpreting the exact architectural results of contemporary digital architecture, this essay takes a different route towards enlarging the understanding of the current computational paradigm and its effects on the discipline. We propose to analyze the new tools employed by computational architecture and their bi-directional relationship with the designer. Whereas the designer’s needs inform the tools, the tools also inform the designer and facilitate a certain type of expression. This auto-catalytic (DeLanda 2000) loop of creativity is not yet fully understood in the case of digital design tools, and where it will settle, if at all, remains to be determined. Furthermore, shifting our discourse from a relatively risky projection of values onto the different digital aesthetics and forms towards one based on the more determinate technical aspects of the computational instruments allows for the exploration of an overlooked aspect of architecture: the tools, the creative power they allow for, and their projective meaning in the psyche of the designer.
Going beyond formal or geometrical freedoms – which easily turn into limitations – computational tools allow for the projection of a completely new (or a re-iteration of the old) set of fetishes onto the design process and its final output. Architecture has always reveled in its tools and the relative complexity of their nature when confronted with the uninitiated. When referring to tools we are not aiming at the abstract concepts (such as space, volume, transparency) which an educated architect might employ in his creative process, but at the exact technological implements which help him articulate that vocabulary. As Steven Pinker notes, you cannot think what you cannot express: mental abstraction is impossible to detach from a mode of expression (Pinker 1995), or, in the case of architecture, materialization. Alongside theoretical constructs, the most basic tools of the architect are (or were, until the digital revolution) the drawing, the ruler, the triangle and the compass.
Architecture’s adoption of the digital medium has changed this. Initially just virtual replicas of the standard drawing board, design tools have now begun to fully exploit the virtues of the digital medium and distance themselves from traditional approaches. We could argue that we are now witnessing the beginning of thinking and creation from inside the digital paradigm itself. Clarifying what we understand by (or what deserves the name of) computational tools becomes crucial. At first glance, there are the different software packages which make possible the formal exuberance characteristic of the majority of digital architecture’s products, yet the underlying algorithmic processes which drive them are the key players in the construction of the digital designer’s psyche. Initially obscured by software engineers, these are steadily becoming more transparent and accessible to the architect, transforming his way of thinking and – due to the increasing computer literacy of practitioners – transforming the architectural way of thinking as well. Computational tools are not the software packages, like Maya or Rhinoceros, but the algorithms which they allow the designer to employ in order to create architecture.
Algorithms are, at their origin, inert logical structures. At a superficial glance, this allows for little in the way of projecting any values beyond the objective goals for which the specific construct has been created. How can one relate, for example, the sharp rationality of an algorithm which optimizes the sorting of values within a database to any kind of design-related values? The example serves to bring forward optimization, a keyword that sees much use in computational architecture and is hence the vessel of a strong digital fetish. Optimization, as a concept per se, is anything but new. Nature, as well as humankind, has always tried to do things better, increasing the performance of the resulting object or process while decreasing the effort consumed in the action itself. The industrial revolution is a telling example of this concept: mechanical tools replaced human workers due to their increased efficiency and reliability as well as reduced maintenance costs. Generalizing, the phenomenon we know as evolution is essentially a process of optimization set against the testing ground of the environment.
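To make the concept concrete, consider a minimal sketch of such an optimization process – a simple hill-climbing routine that blindly perturbs a design parameter and keeps only improvements. Both the cost function (a panel width penalized for deviating from an ideal dimension) and all parameters below are invented purely for illustration, not drawn from any particular design system:

```python
import random

def optimize(cost, candidate, step=0.1, iterations=1000, seed=42):
    """Minimal hill-climbing: repeatedly perturb the candidate and
    keep the perturbation only if it lowers the cost."""
    rng = random.Random(seed)
    best = candidate
    best_cost = cost(best)
    for _ in range(iterations):
        trial = best + rng.uniform(-step, step)
        trial_cost = cost(trial)
        if trial_cost < best_cost:  # accept only improvements
            best, best_cost = trial, trial_cost
    return best, best_cost

# Hypothetical objective: a panel width (in metres) whose cost
# is lowest at 2.5 – a stand-in for any measurable design goal.
panel_cost = lambda w: (w - 2.5) ** 2
width, cost = optimize(panel_cost, candidate=0.0)
```

The loop is, in essence, a mechanized caricature of the iterative adjustment described above: it converges on the stated goal, but knows nothing beyond the cost function it is given.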
In the pre-computational period of architecture, optimization stood for the iterative adjustment of a design by the designer. One of the designer’s fundamental instincts was to find the best-fitting solution for a given assignment, in order to ensure his project’s viability as well as to distinguish it from competing designs. The mechanism behind these actions does not lend itself easily to rationalization – it is a heuristic process loosely defined as architectural intuition, and it is highly dependent on the author. It is usually the result of a lifetime of accumulated experience on top of a general education received at the beginning of one’s career. More importantly, it is essentially what bears the architect’s mark, the signature of his authorship. Nevertheless, optimization in computational architecture now manifests itself in a new, surprising way: instead of being an instrument for enforcing authorship, it becomes a vessel for the exact opposite: the dissolution of the designer’s responsibility.
Digital techniques have exposed ever more rationalization-prone parameters of the design process to the speed of the computer, thus making different optimization models possible. This steady progress towards digitalization happens at the expense of the “traditional” heuristic process described previously. Architectural intuition should never be replaced by computationally driven optimization, given the limitations of the medium as well as the nature of architecture – yet exactly the opposite trend can be observed unfolding. Optimization algorithms and procedures are often used as rhetorical devices around an architectural project in order to provide “objective” arguments for the justification of a certain solution, essentially masking the authorial bias and removing the author. This is one of the key fetishes in the digital architect’s psyche: dissolving his responsibility into an often obscure mass of unbiased algorithms.
This can be seen as a projection of the surrounding contemporary world onto the digital screen of architecture: a society governed by strict rules and complex systems which process every aspect of our lives with cold and ruthless logic. Moreover, it may also be an escapist technique for avoiding the stress which comes with responsibility. On the other hand, more often than not, the apparent removal of authorial bias amounts to nothing more than a shortcut towards justifying an inner preference for a specific formal – or functional – solution through invented claims of objectivity. Here we need to recall that algorithms are nothing more than inert logical structures. Optimization processes are no different: in the absence of input, they remain static. Furthermore, if the input they are given is flawed, the result will also be flawed. In the context of architecture, designers employing optimization algorithms which transcend their localized functional role of solving determinate problems (thus becoming actual design instruments themselves) do not eliminate authorial bias or the responsibility which comes with design.
Another aspect of computational architecture with a great impact on the psyche of the designer stems from the fact that the output of the design process is no longer a singular object, but rather an objectile: a set, or a family, of objects defined by all the possible variations of the parameters describing the respective design process. Computational techniques make this variation easy to visualize. In the past, such exploration was limited by the friction of the analog medium (pen and paper) in which architecture expressed itself. The virtual realm has effectively eliminated this barrier and thus paved the way for a comprehensive exploration of the objectile. The architect is no longer an author of objects, which lose their identity inside their family, but an author of processes. This shift in the nature of the authorial role, from creator of objects to creator of objectiles, has prompted a huge change in attitude: no longer constrained to singularities, the designer revels in the power to generate multiplicities. The reality of construction – feasibility, material constraints, etc. – sometimes acts as an ulterior selection criterion which collapses the objectile into an object.
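The notion of the objectile can be sketched computationally: a single parametric “design process”, enumerated over its parameter space, yields the whole family, and a selection criterion collapses it back towards a single object. The tower function, its parameters and the feasibility rule below are all hypothetical, chosen only to illustrate the mechanism:

```python
from itertools import product

def tower(height, twist, taper):
    """A hypothetical parametric design process: one parameter
    set yields one member of the objectile."""
    return {"height": height, "twist": twist, "taper": taper}

# The objectile: the family of all designs over the sampled parameter space.
heights = [60, 90, 120]   # metres
twists = [0, 15, 30]      # degrees
tapers = [0.8, 1.0]       # dimensionless ratio
objectile = [tower(h, tw, ta) for h, tw, ta in product(heights, twists, tapers)]

# Collapsing the objectile towards an object: an invented feasibility
# constraint acts as the ulterior selection criterion.
feasible = [t for t in objectile if t["height"] * t["taper"] <= 100]
```

The designer here authors the `tower` process and the sampling of its parameters; the individual towers are merely members of the resulting family until a constraint singles one out.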
The power of algorithmic exploration of multiplicities does not stop here. Not only can computational tools easily facilitate the blind variation of parameters, but they can also inform it with the same mechanisms that nature uses to create its complex structures: the evolution of species, the formation of geological structures and, we might argue, even the articulation of the built environment. The design process is no longer a blind search controlled only by the whims of the designer and, sometimes, by constraints pertaining to the reality of the construction site. Harnessing the power of emergence through carefully crafted algorithms, the designer is no longer an author of processes, but a semi-deity who creates a system, similar in mind and body to those of nature, from which his object of attention unfolds. Computational tools allow designers to no longer design a tree, but to design the process which gives birth to trees. Obviously, touching even briefly – because it is no more than briefly that our current technological skills and equipment allow us to – this immense and beautiful power is thrilling, yet dangerous. As Neil Leach remarked, “in every architect there is a fascist” (Leach 1999). Projecting onto himself the qualities of a designer-deity, with the subsequent control this puts into his hands, what started off as an innocent fetish risks ending in a race to control every aspect of the built environment through the systems of computational architecture that mimic natural processes.
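The shift from designing the tree to designing the process which gives birth to trees can be illustrated with a minimal evolutionary loop: variation, selection, inheritance. The fitness function below (a “branching angle” scored against an arbitrary optimum) and every numeric parameter are invented for the sake of the sketch; no real generative design system is this simple:

```python
import random

def evolve(fitness, population, generations=100, mutation=1.0, seed=1):
    """Minimal evolutionary loop: the fitter half survives each
    generation and mutated copies of the survivors refill the pool."""
    rng = random.Random(seed)
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: len(population) // 2]
        offspring = [g + rng.gauss(0, mutation) for g in survivors]
        population = survivors + offspring
    return max(population, key=fitness)

# Hypothetical fitness: a branching angle (degrees) scoring
# highest at an arbitrary optimum of 35.
rng = random.Random(0)
population = [rng.uniform(0.0, 90.0) for _ in range(20)]
best = evolve(lambda angle: -abs(angle - 35.0), population)
```

The designer's authorship resides entirely in the fitness function and the rules of variation; the specific “tree” that emerges is a product of the system, not of his hand – which is precisely where the dissolution of responsibility discussed earlier can creep in.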
This fetish of absolute control has its obvious downsides, common to any “absolutist” movement in architecture. Evidently, trying to encompass all of the complexity that architecture presents us with in a rationalized system, however intricate, leads to simplifying conclusions and assumptions, which, as the Modernist movement showed us, lead to failure. In architecture, the multitude of the objectile has to collapse into a singular object upon realization, and this brings back upon it all the limitations which are otherwise not present in the objectile itself. Nevertheless, there is another, easily overlooked aspect of the same trend: the desire of the designer-deity (or of the tools he employs) to remove the Modern dichotomy between Nature and Culture and to integrate, through whatever computational means possible, with the former. Acknowledging this would open up a whole new page in the relationship between the built environment and the natural environment, one of inclusion and affiliation rather than conflict, as it is currently presented by the environmental discourse.
The lack of a discipline-wide understanding of computational tools allows for the easy obfuscation of the processes involved. There is a significant theoretical base supporting digital architecture and its hidden fetishes which tries to advocate for the usability and feasibility of its products. Yet, more often than not, these texts serve as rhetorical devices which confuse rather than enlighten the uneducated reader, thus gaining his trust through mystification. As in every specialized domain, technical jargon is unavoidable, yet, as Wittgenstein put it, “everything that can be said at all can be said clearly”. Supported by strong histrionic imagery, a culture of obscurantism is slowly developing in and around computational architecture. Thus we reach another subconscious fetish of the contemporary designer: that of becoming akin to the secretive master mason of the Middle Ages and thus the exclusive master of even more power. Unconsciously (or sometimes consciously) guarding his secrets with a plethora of algorithmic tools and jargon whose results cannot easily be reproduced, this shift to a pre-Albertian way of working augments the change briefly mentioned above of returning to a pre-modern (or, according to Bruno Latour, non-modern) way of thinking (Latour 1993).
Computational tools have a very solid extension into reality through digital fabrication techniques. These tools aptly bridge the virtual world with the material, real world, effectively blurring the borders between the two. What you see on your computer screen can be transformed into a solid object which you can touch, smell and feel in less than a day. Though still limited in scale, these tools serve as credible leverage for the digital designer, who is quick to state that, through their means, reality becomes a diagram and the diagram becomes reality. Yet this statement fails to acknowledge the transcendent limitations of any diagrammatic, or systemic, approach – which, due to our bounded predictive capabilities stemming from incomplete logic, as well as our imperfect grasp of precision, will always leave computational architecture one step behind the creation of a Deleuzian abstract machine pertaining to itself (Stefanescu 2011).
The purpose of this essay was to draw attention to the potentially disruptive nature of digital tools, which manifests itself at an unconscious level through projections onto the designer’s psyche, as well as at a resultant conscious, tangible level through the specific outputs of computationally-enhanced design. A world where an authorially-detached (and hence irresponsible) designer-deity projects disembodied constructs (Graafland 2010) straight from the virtual realm into concrete shapes and buildings is one that would tax both society and nature dearly, as well as the credibility of the domain itself. The computational paradigm shift presents architects with a change of tools: it is now up to the designers themselves to project meaningful values onto them and shape them towards architecturally sustainable uses, and not let themselves fall victim to the tumultuous nature of the digital medium.
The current transient nature of digital architecture is inevitable given its age and the huge amount of ongoing technological change, which the discipline needs to reflect.
Furthermore, these software packages all have different backgrounds, related to architecture to a greater or lesser extent. This is evident in the way they allow the designer to modulate and shape their algorithms (and hence in the software-specific architectural products which become evident to the trained eye), yet the processes which we analyze recur throughout all aspects of computational architecture.
In programming jargon, this phrase translates as “garbage in, garbage out”. A thought-provoking mutation of this phrase is “garbage in, prophecy out”: sometimes the belief in the algorithm is so strong that, when faced with erroneous input and subsequently false output, the false results are interpreted as being true.
DeLanda, Manuel. A Thousand Years of Nonlinear History. London: Zone Books, 2000.
Graafland, Arie. From Embodiment in Urban Thinking to Disembodied Data. The Disappearance of Affect. Delft: TU Delft, 2010.
Kwinter, Sanford. Far from Equilibrium: Essays on Technology and Design Culture. Barcelona: ACTAR, 2008.
Latour, Bruno. We Have Never Been Modern. Cambridge: Harvard University Press, 1993.
Leach, Neil. The Anaesthetics of Architecture. Cambridge: MIT Press, 1999.
Pinker, Steven. The Language Instinct: How the Mind Creates Language. London: HarperCollins, 1995.
Schumacher, Patrik. The Autopoiesis of Architecture: A New Framework for Architecture. London: Wiley, 2011.
Stefanescu, Dimitrie Andrei. “Algorithmic Abuse.” Edited by Joseph Scherer. PLAT (Rice School of Architecture), no. 1.5 (2011): 72-76.