

Algorithmic Abuse


December 9th, 2011, essays

PLAT 1.5 has just been released, and with it my critical essay entitled Algorithmic Abuse, which I’m sharing below. It’s a short attempt to draw attention to several gaps in both the theory and the practice of computational architecture.

Please support the journal by purchasing a copy!

A recurring concern among the practitioners and promoters of what we shall refer to generally as “computational architecture” is the oft-mentioned (but rarely justified) “crisis of complexity” in which the world and its architects apparently find themselves. This condition manifests itself as an information overload, which is seen as the natural consequence of an ever-larger pool of numbers and decimals vying for our attention.[1] In response to this crisis, computational architecture is in a rush to dictate a paradigm shift by promoting the assimilation and implementation of concepts and theories emerging from science and philosophy – which, in combination, are intended to help us to navigate the confusing world described by chaos theory.

Naturally, given the epistemological and ontological framework in which the architectural discourse defines its crisis, “information” and its attendant verb “compute” become the most critical terms of the design process. Information, rationalized as pure numeric data, becomes the driving morphological force not only for the natural world, but for architecture.

Information is processed through the act of computing. Though it is most often used in the context of digital media, “computation” can denote any kind of process or algorithm. Here we must distinguish between two types of computation: material and digital. Material computation refers to the processes that manifest nature’s way of shaping the world around us. These processes primarily drive toward the structural optimization of matter. For example, we can look at the way a soap bubble negotiates (“computes”) the complex relationship between air pressure, gravity, and surface tension to find a shape of minimal energy which balances these parameters. Digital computation, on the other hand, concerns the comparatively rudimentary simulation of such processes within silicon chips. Despite the relative simplicity of the latter method, recent technological advancements have greatly increased our ability to simulate – and thereby explore – different natural processes. More and more complex behaviors and phenomena can be digitally approximated using new algorithms, allowing science to advance in its quest to discover the rules that make our world tick.
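To make the distinction concrete, here is a minimal sketch of digital computation approximating a material process – my own illustration, not part of the essay. A virtual soap film is stretched across a fixed square boundary and relaxed toward a minimal-energy shape; under a small-slope assumption the film satisfies Laplace’s equation, which the loop below solves by simple iterative averaging.

```python
# Minimal sketch (illustration only): a "digital soap film" on a fixed
# boundary, relaxed toward a minimal-energy surface. Under the
# small-slope assumption the minimal surface satisfies Laplace's
# equation, solved here by Jacobi-style iterative averaging.
import numpy as np

n = 50                                 # grid resolution
z = np.zeros((n, n))                   # film heights; side edges fixed at 0
x = np.linspace(0.0, 2.0 * np.pi, n)
z[0, :] = np.sin(x)                    # wavy top edge (fixed boundary)
z[-1, :] = -np.sin(x)                  # wavy bottom edge (fixed boundary)

for _ in range(5000):                  # relax until the interior settles
    # each interior point moves toward the average of its four
    # neighbours -- a crude stand-in for how surface tension balances
    # forces at every point of a real film
    z[1:-1, 1:-1] = 0.25 * (z[:-2, 1:-1] + z[2:, 1:-1] +
                            z[1:-1, :-2] + z[1:-1, 2:])

print("height range of relaxed film:", z.min(), z.max())
```

Even this toy example shows the gap: the silicon version captures one linearized force balance, while the real film negotiates all of its physics at once.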

It is this trend, however, which brings into question the relationship between the built environment and science. Architecture’s use of scientific images is not new, but, as Antoine Picon notes, this meeting is productive only when there are similarities between the realities upon which both operate.[2] Contrary to Picon’s conditions for productivity, current relations between science and architecture are often based on superficial similarities or metaphors, necessitating a skeptical review.

As an example, we only have to consider the (in)famous Voronoi algorithm. Though it appears in nature at a variety of scales,[3] it makes few (if any) natural appearances at the architectural scale. Critically, “scale” concerns not only (or even primarily) physical dimensions, but also the forces that define the organization of matter at a given threshold. There is a huge difference between the electrostatic-, pressure-, and tension-based factors that operate at the microscopic scale – where the effects of gravity are almost negligible – and the way these forces operate at the scale of the architectural design process.[4] Yet the Voronoi algorithm is often advertised as a generator of organic, natural, efficient designs – qualities that, needless to say, do not automatically follow: a Voronoi-patterned facade is not necessarily more environmentally friendly than one using prefabricated elements.
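To see just how cheap the pattern is to produce – and therefore how little its presence certifies – here is a minimal sketch, assuming SciPy is available (my own illustration, not the essay’s):

```python
# Minimal sketch: generating a Voronoi diagram from random seed points.
# Nothing in this computation knows about gravity, climate, structure,
# or construction -- the "organic" look comes for free.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
seeds = rng.random((20, 2))            # 20 random seed points in the unit square
vor = Voronoi(seeds)                   # partition the plane into cells of
                                       # points nearest to each seed

print("number of cells:", len(vor.point_region))
first_cell = [v for v in vor.regions[vor.point_region[0]] if v != -1]
print("vertices of the first cell:\n", vor.vertices[first_cell])
```

A few lines of code yield the pattern; none of the advertised naturalness, efficiency, or performance is anywhere in them.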

This analogical relationship to natural phenomena – this mimicry – is also problematic because of its failure to acknowledge the legitimacy of the built environment as an inherent part of nature.[5] Architectural products are already part of the natural world as a manifestation of material computation. This acknowledgement eliminates modernity’s two distinct ontological zones – the human, cultural regime and the non-human, natural regime – and paves the way for an understanding of architecture as a hybrid system in which social forces and natural mechanisms blend. We don’t need to abuse digital computation in order to “fake” nature – doing so only suppresses the ecological conflict between the built and the natural. By replacing the dialectical relationship between the built environment and nature with one of inclusion, we discover a better framework for tackling the complexity and subtlety of the environmental issues confronting architecture.

Furthermore, there is a significant theoretical base supporting these experiments in computational architecture, one that attempts to legitimize the resulting architecture as meaningful and usable. Yet, more often than not, these jargon-cluttered texts are little more than rhetoric carefully disguised as unbiased theory. Links to abstract datasets are used to justify claims about the legitimacy and performance of a given design. The results, however, amount to nothing more than disembodied data[6] – unrigorous translations of information into geometric form via arbitrary rules and subjective algorithms. The speculation that the diagram becomes reality and that reality becomes a diagram, while valid from a philosophical standpoint, is limited in practice – and certainly in architecture – by the precision of our measurements and our bounded predictive capabilities.[7] In short, computational architecture is far from being able to devise a Deleuzian abstract machine pertaining to itself.
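As a hypothetical illustration of how much authorship hides inside such translations, consider one invented dataset mapped to facade panel depths by two equally defensible rules – every name and number below is made up for the example:

```python
# Hypothetical illustration: the same (invented) data, two arbitrary
# mapping rules, two completely different "data-driven" geometries.
# The authorial choice lives in the mapping, not in the numbers.
import numpy as np

solar_gain = np.array([0.2, 0.5, 0.9, 0.4, 0.7])   # invented sample values

depth_rule_a = 0.1 + 0.4 * solar_gain              # rule A: linear scaling
depth_rule_b = 0.5 / (1.0 + 4.0 * solar_gain)      # rule B: inverse response

print("rule A panel depths:", np.round(depth_rule_a, 2))
print("rule B panel depths:", np.round(depth_rule_b, 2))
```

Both designers can claim a facade “driven by the data”; the resulting forms nonetheless differ completely.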

To sum up, I would like to advocate for a more considered assimilation of computational tools. Rushing to repeat history and promote radical shifts carries a high chance of failure and improper application, and architecture is a realm in which mistakes are difficult and painful to fix. Speculation about the raw power of generating a novel formal language should be regarded with caution. The desire to make the new scientific and philosophical paradigm legible through a metaphorical translation of its ideas into architectonic expression is problematically and uncannily reminiscent of postmodernism’s failed formal project. Arie Graafland has argued that “The computational universe turns dangerous when it stops being an [sic] useful heuristic device and transforms itself into an ideology that privileges information over everything else.”[8] Coupled with the natural limitations of any systematic approach, this warning denies computational architecture the convenient “unbiased” arguments often employed to justify its design process. The apparent objectivity of computational techniques cannot mask the subjective, authorial aspects of the design process, still less erase the social, political, and cultural responsibility of the designer. Whatever methods and tools we use, we still play a critical role in the decision-making process out of which the built environment emerges.


Bibliography:

1. DeLanda, Manuel. 2000. A Thousand Years of Nonlinear History. London: Zone Books.

2. Stewart, Ian and Jack Cohen. 1995. The Collapse of Chaos: Discovering Simplicity in a Complex World. London: Penguin Books.

3. Picon, Antoine. 2003. “Architecture, Science, Technology and The Virtual Realm.” In Architectural Sciences, 292-313. Princeton: Princeton Press.

4. Graafland, Arie. 2010. “From Embodiment in Urban Thinking to Disembodied Data.” In The Disappearance of Affect. Delft: TU Delft.

5. Latour, Bruno. 1993. We Have Never Been Modern. Hertfordshire: Harvester Wheatsheaf.


[1] Regarding information overload, Cory Doctorow proposes an interesting corollary of it in a blog post for The Guardian (22 February 2011): redundancy. He also argues that meaningful and important information will eventually surface into the mainstream through new social media mechanics.

[2] Picon 2003, 294.

[3] From the way cells are organized and shaped, to the veins on a dragonfly’s wing, to the scales on a crocodile’s skin, to the way matter is distributed in the universe, the principles of the Voronoi diagram fundamentally shape matter at completely different and surprising scales.

[4] For a more in-depth description of this phenomenon, see Dimitrie Ștefănescu, “f* Voronoi.” Last modified October 28, 2010. http://improved.ro/blog/2010/10/f-voronoi.

[5] This is probably the biggest change in thinking we, as architects, have to assimilate. Ever since the Roman rituals for founding a city by removing a plot of land from the chaotic influence of nature, the artificiality of the built environment has been conceived as existing in opposition to the natural world. The critique of this dichotomy is an important element of contemporary architectural discourse, and deserves a longer discussion than can be provided here. An extended and incisive treatment of the subject can be found in DeLanda, A Thousand Years of Nonlinear History.

[6] Graafland 2010, 42.

[7] We will never be able to measure to the last significant digit – therefore, whatever the accuracy of our predictions, they will always carry a seed of error. Coupling this with Kurt Gödel’s incompleteness theorem – which states that any sufficiently expressive formal system contains true statements that cannot be proven using the rules of that same system – we can clearly see the inherent limitations of any systemic or structuralist approach. For example, for all the advances of technology and science, weather predictions more than five days in advance have had roughly the same accuracy rate since the 1950s.
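A small numerical illustration of this bounded predictability – my own, using the textbook chaotic logistic map rather than any architectural model: two trajectories whose initial conditions differ only in the tenth decimal place disagree completely within a few dozen steps, so any unmeasured digit eventually dominates the forecast.

```python
# Illustration only: sensitivity to initial conditions in the chaotic
# logistic map. The two "measurements" differ in the 10th decimal,
# yet the two forecasts soon share nothing.
x_a, x_b = 0.4, 0.4 + 1e-10

for step in range(60):
    x_a = 4.0 * x_a * (1.0 - x_a)      # logistic map at r = 4 (chaotic)
    x_b = 4.0 * x_b * (1.0 - x_b)

print("after 60 steps:", x_a, x_b)     # no longer agree in any digit
```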

[8] Graafland 2010, 46.


8 Comments:

Dimitrie,

A strong and well thought out piece as usual. I would like to suggest that the abuse of the algorithm has now become a larger problem than just the superficial link between the image of nature and its replication in architecture. I think we are at a point where computational architecture has reached a crisis. Enough work has been done to see the possibilities, but we do not yet know why. Until we bridge that gap and find an argument that can push this type of work out of the experimentation phase and into a more relevant socio-political discourse, everything we do is in danger of being not only an abuse, but also irrelevant.

I am a big fan of your work, which I find well thought out and better grounded than most. And I think that it will be this type of thinking beyond the simple tools and into the meaning of what we do that is needed to get to the next level. Keep it up.

Matei

Matei,

Thanks for your kind words. I really like your transition from abuse to irrelevancy – it’s true and it’s even more worrisome from my point of view. Irrelevancy leads to disinterest and subsequently to the issue society at large is experiencing today with the digital world in general: program or be programmed (Douglas Rushkoff).

Computational architecture has indeed reached a crisis of its own making – creating projects that just feed the insatiable (formal) appetite of the digital tools, rather than the real needs of the job, is one factor that has led us to this point, as well as to the irrelevancy you mentioned.

I’ll keep my blabber short today. Thanks for your comment!

Hi Dimitrie,

This is a nice critical piece. Not enough of this coming out of the Hyperbody crowd.
I agree with you and Matei that computational architecture is in a crisis of irrelevancy. As someone who doesn’t use algorithms for design (though I wish I knew how), whenever I see a project based on parametric design I simply wonder: why? What’s the point? I can see a lot of relevant, useful and constructive applications of parametric tools to create better architecture, but these applications are almost never seriously explored, and parametric-based projects usually end up being merely formal explorations or proposals for radical interventions.

However, I wanted to add that this is not just an issue of computational architecture, but of contemporary architectural practice/education as a whole. I am afraid that our whole profession is quickly sliding into irrelevancy for the same reasons: we don’t know why we design anymore. The most publicized and coveted projects (in all styles, scales and programs) are either “artistic” explorations of form or proposals for brutally radical interventions, both of which ignore the complexities of real issues and the social responsibilities of the architect. After all, a design for simple, soft, sensitive and incremental improvement doesn’t look so good on competition posters.

This lack of a “raison d’etre” is less obvious in the more traditional-tooled architecture because, unlike computational architecture, it has already “proved” itself to be worthy. In a sense we are now abusing the trust built over generations of architects. Computational architecture is a new tool that has yet to be thoroughly tested in practice and prove itself to be useful. So people (architects and non-architects) rightfully ask: why? And an answer that implies “For its own sake” is simply not good enough for either computational or traditional-tooled design.

A thoughtful post, at last, when I was wondering why unquestioning exultations and can-do-ism seem de rigueur.

In 2010, in “Building Research & Information”, Michael J. Ostwald exposed in “Ethics and the auto-generative design process” what algorithmic design actually is: a formal aesthetic strategy to occupy the hilltops of the architectonic landscape, from which to control and mortar the peer group’s terrain. And that’s putting it mildly.

Earlier, in 2008, in his piece “Applied Curiosity” for MoMA’s “Design and the Elastic Mind” exhibition, Hugh Aldersey-Williams offered: “The double helix of DNA itself has become an enduring motif expressive of the machinery of life in art, design, illustration, and figurative speech. […] Such work sometimes seems so at odds with other trends in design aesthetics that one wonders whether it would ever have taken this form but for the gloss of scientific validity. […] Using science for inspiration is all well and good, but caution is necessary if larger claims are made for it. […] Critic Charles Jencks is thus misled when he answers his own question, ‘Why should one look to the new sciences for a lead?’ with these words: ‘Partly because they are leading in a better direction – towards a more creative world view than that of Modernism – and partly because they are true.’ Both of these justifications seek to endow design that is inspired by science with a superior moral authority. But garden ironwork such as Jencks himself has created inspired by ‘quantum waves’ has no higher morality or deeper meaning than a cornstalk fence. Designs with randomized elements chosen on the basis of DNA sequences – a recent fashion in architecture schools – have no closer connection to life as a result. These phenomena are as good a basis for a stylistic idea as any, but no better.”

Algorithmic abuse – the situation is quite like in H.C. Andersen’s fairy tale: the emperor is naked.

Andreas,

Thanks for your most insightful comment – that’s an understatement. Glad to hear that there are other voices agreeing with this!

To play the devil’s advocate a little, I could take Jencks’s argument and try to rework it more convincingly: in a sense, architecture (authored architecture, mind you) has always consciously tried to reflect, mirror or contain the Weltanschauung – the “world understanding” – of its time. Perhaps this happens because of unconscious societal drives, like the need to make legible what we don’t really understand – to make it comfortable enough for our brains to cope with.

But from this to the “narcissists stricken with computer love” (Kwinter, quoted from memory) who abuse the whole setting not out of willful intent, but as a result of the general lack of understanding of their peers – clients or teachers – is quite a long step in a direction which I’m glad to hear we both find wrong. Hopefully I’ll find a place to work all this out in a decent PhD thesis and enforce it as mandatory reading for all 1st years (in every architect there’s a fascist, even in me it seems) everywhere.

Kapo – I wanted to reply to your comment a long time ago, but never got around to it until now.

You’re absolutely right that the relevant applications of computation in architecture are not really explored. My hunch is that this is so because many of them are boring, or at least without a direct formal manifestation! Where I believe computation works best is roughly from the small urban scale up to full-blown urbanism – large scale, large datasets, inherent error tolerances of the system itself which can offset the inaccuracies of the inputs, objective and (at least in part) measurable criteria/parameters, etc. Unfortunately, none of these things necessitates the zaha/schumacher-esque aesthetic of masterplans, which contradicts the whole “bottom-up” logic by enforcing strict geometric manifestations – essentially a modernism with another hat.

In order to maintain some relevance – or, better said, to find it again – we need to let go of huge parts of our ego and actually use computational tools in the way we are currently lying that we’re using them: as utensils that finally allow us to design open-ended processes, and sell them as such – not as pseudo-organic-looking frozen modernism.

There’s another trend I want to (quickly) mention: at a small scale, computational tools are helping to contain, in an expressive way (or at least a way resulting from current societal/technological developments, nevertheless still formal/geometrical), the current complexity of our time – increased or not. I have nothing against this, but it should be acknowledged as such and not presented as pseudo-scientific mumbo-jumbo.

Thanks a lot for your contribution to the dialogue – I always get excited talking about these things, and I really enjoy any excuse to do so…
