
Tuesday, March 31, 2009
"The computer with its smart software is killing the designer's authorship"! Sounds like a sentence from a cheap futuristic movie. But on the other hand-all sense of the discussion is laid around this nowadays phenomenon. It seems that today's designer (or architect) plays role of a "guider of the evolution of form"not a generator or composer. Controlling form's development ("designer does not specify every single point of the curve, but only a few key weights which deform the cure in certain ways"M.de Landa) is his initial duty. It can be partially true. But it doesn't frees us as a designers from having artistic flair, from understanding the nature and potential of the material. The computer is just a mean of the achievement of the purpose.
I was also struck by these words: "The genes guide but do not command the final form. In other words, the genes do not contain a blueprint of the final form" (M. de Landa). A balance would be the ideal solution: a hand and a pencil, together with the computer and its smart software, should project someone's mind into space.
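To picture what "a few key weights which deform the curve" might mean in practice, here is a minimal sketch of my own (not de Landa's) of a rational Bézier curve in Python: the control points stay fixed, and the designer edits only the weights, each of which pulls the curve toward its point.

```python
# A minimal sketch (not from the reading) of editing weights instead of points:
# a rational Bezier curve whose shape is steered by per-point weights.
from math import comb

def rational_bezier(control_points, weights, t):
    """Evaluate a rational Bezier curve at parameter t in [0, 1]."""
    n = len(control_points) - 1
    num_x = num_y = den = 0.0
    for i, ((x, y), w) in enumerate(zip(control_points, weights)):
        b = comb(n, i) * t ** i * (1 - t) ** (n - i)   # Bernstein basis function
        num_x += b * w * x
        num_y += b * w * y
        den += b * w
    return num_x / den, num_y / den

points = [(0, 0), (1, 2), (3, 2), (4, 0)]              # fixed control points
for w1 in (1.0, 5.0):                                  # raising one weight "pulls" the curve
    curve = [rational_bezier(points, [1, w1, 1, 1], t / 10) for t in range(11)]
    print(f"weight={w1}: curve midpoint ~ {curve[5]}")
```

The designer never touches the sampled points of the curve itself, only the handful of weights, which is roughly the shift in authorship the quote describes.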
De Landa
De Landa writes that to realize the potential of the genetic algorithm in design, designers should no longer specify forms but boundaries and constraints. To allow the evolution of more complex and interesting forms, sets of primary codes/rules must be generated. Designers set constraints but must not foresee any particular result; indeed, the "evolutionary result [is] to be truly surprising". When the algorithm ceases to produce surprises, it is no different from a simple CAD program that produces rigid polygons. It is the different intensities that drive the evolutionary process and thus the complexity of forms.
The challenge to designers, therefore, is to know which intensities to employ and to what extent. To achieve this, designers need to acquire an understanding of the complexity of material. They must become, to a certain degree, craftsmen whose craft is often overlooked.
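Trying to make this concrete for myself, here is a toy genetic algorithm in Python, purely my own sketch and not anything from the essay: the designer supplies only bounds, a fitness rule, and a mutation "intensity"; the bay heights, target area, and rate below are hypothetical placeholders, and the forms themselves come out of the breeding.

```python
# A toy genetic algorithm (my own sketch, not DeLanda's): the designer supplies
# only bounds, a fitness rule, and a mutation "intensity" -- never the form itself.
import random

random.seed(1)
BOUNDS = (2.0, 12.0)          # designer's constraint: allowable bay heights (m), hypothetical
TARGET_AREA = 60.0            # designer's rule: total section area to aim for, hypothetical
MUTATION_RATE = 0.3           # the "intensity" that keeps the results surprising

def fitness(genome):
    # Reward hitting the target area; also reward variation so flat, predictable
    # profiles (the "rigid polygons" of a plain CAD tool) score poorly.
    area = sum(genome)
    variety = max(genome) - min(genome)
    return -abs(area - TARGET_AREA) + variety

def mutate(genome):
    return [min(BOUNDS[1], max(BOUNDS[0], g + random.gauss(0, 2.0)))
            if random.random() < MUTATION_RATE else g
            for g in genome]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.uniform(*BOUNDS) for _ in range(8)] for _ in range(30)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                              # selection
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(20)]            # breeding
best = max(population, key=fitness)
print([round(g, 1) for g in best], round(fitness(best), 2))
```

Turning MUTATION_RATE down toward zero makes the outcome predictable again, which is exactly the point about the algorithm ceasing to surprise.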
Monday, March 30, 2009
Delanda Readings
DeLanda
A little confused about the role the architect is playing
Regarding the evolution of the virtual structure: when the number of changes or generations is large enough, the result will be quite unexpected and far different from the original generation. Surely that will bring up some amazing designs, but what role does the architect play in this process? Just a critic who gives comments? Or a final decision-maker who decides what should be kept and what thrown away? If so, what is the difference between an architect and a passer-by?
In my view, the architect should be the god of this process instead of just the final decision-maker, which means the architect should be involved in the whole evolutionary process rather than only glancing at the result. The architect should design the first generation with the required functions, defining the correct topological relations among the several parts of the design; the architect also needs to set up the rules of "natural selection", so that every generation is directed to follow the restrictions. There must be other things an architect needs to consider, but so far they are not so clear in my head...
Wednesday, March 25, 2009
Delanda
Delanda’s essays investigate the genetic algorithm: its place in architecture and its use in the future. Genetic algorithms are now a tool that can generate form by the mating of different forms. The richest algorithms breed the most surprising results; if the results were predictable, the need for this tool would no longer exist. That said, there still needs to be some level of predictability to generate real buildings. Structural elements still need to function in the same ways, carrying the loads and stresses of the building, for if columns evolved into decorative elements the structural soundness of the building would be compromised. Therefore, before the breeding of forms begins, a “body plan” must be set: much as mammals share an abstract vertebrate body plan, buildings would have to have their own “body plan” that defines the essence of a building. The designer decides what that plan is comprised of, and in doing so becomes the designer of the egg rather than the designer of the building. Once the egg is decided upon, the genetic algorithm takes over and starts generating offspring; as generations pass, new building forms will evolve.
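Here is one way I can imagine the "body plan" working in code, strictly as a sketch of my own: the topology (a column grid carrying a slab) never mutates, only the proportions do, and any offspring that breaks a crude stand-in structural rule is discarded before it can breed.

```python
# A sketch of the "body plan" idea (my reading, not a method from the essay):
# the topology is fixed -- columns on a grid carrying a slab -- while proportions
# mutate freely, and offspring violating the structural rule are discarded.
import random

random.seed(2)
BODY_PLAN = {"bays_x": 4, "bays_y": 3}                            # fixed topology: never mutated
COLUMNS = (BODY_PLAN["bays_x"] + 1) * (BODY_PLAN["bays_y"] + 1)   # constant for every offspring

def random_genome():
    # Only proportions evolve: bay span (m) and column diameter (m).
    return {"span": random.uniform(3.0, 12.0), "diameter": random.uniform(0.1, 1.0)}

def structurally_sound(genome):
    # Crude stand-in for a load check (a made-up rule of thumb, not real
    # engineering): wider spans demand thicker columns.
    return genome["diameter"] >= genome["span"] / 20.0

offspring = [random_genome() for _ in range(200)]            # one bred generation
viable = [g for g in offspring if structurally_sound(g)]     # the body plan culls the rest
print(f"{len(viable)} of {len(offspring)} offspring keep all {COLUMNS} columns structural")
widest = max(viable, key=lambda g: g["span"])
print({k: round(v, 2) for k, v in widest.items()})
```

The columns can get thicker or thinner, taller or shorter, but they can never evolve into decoration, because the body plan filters those variants out.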
My question, then, is this: in the end, will algorithms be able to produce more efficient buildings than architects can now produce? And if so, will aesthetic design in the future matter less than “aesthetic fitness”?
Tuesday, March 24, 2009
DeLanda Readings
Intensive vs Extensive
If we divide a volume of matter into two equal halves, we end up with two volumes, each half the extent of the original. Intensive properties on the other hand are properties such as temperature or pressure, which cannot be so divided.
- Manuel DeLanda, Intensive Science and Virtual Philosophy
An extensive property is one you can measure, such as length, area, and volume; extensive difference is therefore quantitative difference. If you divide a perfect cube into two equal solids, the two halves together contain the same volume as the original. An intensive property, on the other hand, produces a difference in kind when the material reaches a threshold: water turns into ice as the temperature decreases, for example. This allows us to investigate self-organized, bottom-up approaches. In a bottom-up approach, designers have control at the local scale, but the global scale is the result of interactions among local behaviors. The important thing is that the global behavior is more than the sum of its parts; a difference in quality makes new behavior emerge, such as flowing water turning into tidal waves, or spider-web strands forming a self-supporting network. Designers seem to use algorithmic methods to investigate this emergent behavior. However, algorithms, whether scripting, animation, or parametric, are structured by nested binary choices: if A do C, if B do D, if E do G, and so on. I still wonder whether these binary choices can simulate the complexity of nature. Maybe they can; as Stephen Wolfram argues, a simple rule can create a complex system.
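Those nested yes/no choices are literally how an elementary cellular automaton works, so as a small test of my own (not from the reading) here is Wolfram's Rule 30 in Python: a fixed table of eight binary cases applied to three neighbours at a time, which still produces a pattern that looks chaotic at the global scale.

```python
# The nested binary choices mentioned above are exactly how an elementary
# cellular automaton works. This runs Wolfram's Rule 30: each cell looks at
# three neighbours, a lookup over eight yes/no cases decides its next state,
# yet the global pattern quickly stops looking predictable.
RULE = 30
WIDTH, STEPS = 63, 30

def step(cells, rule):
    out = []
    for i in range(len(cells)):
        left, centre, right = cells[i - 1], cells[i], cells[(i + 1) % len(cells)]
        pattern = (left << 2) | (centre << 1) | right      # one of 8 binary cases
        out.append((rule >> pattern) & 1)                  # look up the rule's answer
    return out

row = [0] * WIDTH
row[WIDTH // 2] = 1                                        # a single live cell as the seed
for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = step(row, RULE)
```

The rule itself never grows past that one-line lookup; only the printed pattern does, which is the local-simple, global-complex point in miniature.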
Genetic Algorithm
from: Deleuze and the Use of the Genetic Algorithm in Architecture, Manuel de Landa
In this short reading, DeLanda argues for fundamental changes in architectural design. He credits Deleuze's philosophical work and expands on the subject of the “genesis of form”. His example of drawing a round column as a three-step procedure (a kind of mixture of Euclidean geometry and Aristotle's categories) seems to mark the point where recursion and aggregation of that technique become relevant, effectively creating the evolution of a “larger reproductive community”, a population with a variable genetic code. The reading feels very relevant and contemporary; while reading I kept thinking of software like Grasshopper or Houdini, and of attaching the idea of genetic mutation to simple step procedures that can be deployed in a computer environment using chunks of information embedded into an element. The notion of the “body plan” and the idea of tapping into non-architectural resources and fields are interesting and necessary if we are to avoid the traditional, decayed preference for selecting forms on aesthetic geometry alone. In the end we see the final product through so many different lenses and scales of criticism that the grounds for recognizing digitally mutated forms can be small in themselves; I don't think we have come close to true digital reproduction or resemblance. In themselves these experiments hold small bits of the building blocks and techniques needed to unravel new foundations. Personal style and selection do not run parallel with generating a process through a defined sequence of code; that is true but still open to question. We also begin to see overlaps with the earlier readings of Wolfram and with Rocker's research. These are more or less my general comments on this reading; “The Case of Modeling Software” reintroduces some of the same concepts as I browse through it, and I will comment on it in a follow-up post.
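As a sketch of how I imagine that column procedure living in a computer environment (illustrative only; the step names and numbers are mine, not DeLanda's), the recipe can be stored as data attached to the element and "mutated" by jittering its parameters, much as one might wire it up in Grasshopper or Houdini.

```python
# A hedged sketch of reading the column-drawing procedure as a genome: each step
# is a chunk of data attached to the element, and "mutation" perturbs those
# chunks. The step names and numbers are illustrative, not DeLanda's.
import random, copy

random.seed(3)

column_recipe = [                      # the step procedure stored as data
    ("circle", {"radius": 0.4}),
    ("extrude", {"height": 6.0}),
    ("taper", {"top_scale": 0.8}),
]

def mutate(recipe, intensity=0.15):
    """Return a copy of the recipe with every numeric parameter jittered."""
    child = copy.deepcopy(recipe)
    for _, params in child:
        for key in params:
            params[key] *= 1 + random.uniform(-intensity, intensity)
    return child

# A small "reproductive community" of variant columns descended from one recipe.
population = [mutate(column_recipe) for _ in range(5)]
for variant in population:
    print([(op, {k: round(v, 2) for k, v in p.items()}) for op, p in variant])
```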
Thursday, March 19, 2009
Rocker_Complexity_Project
The image: Zaha Hadid. It is a digital-media study of the Thames Gateway as an urban field; the project would be located in London, United Kingdom. This specific project reflects a similar discussion: a rule set serves as an origin that guides development, and alterations of that development lead to multiple possibilities and patterns of structure. The team investigates four main building typologies throughout the urban area, leading to a series of evolutions of these standard typologies that are placed on the site and tested. The fusion of these typologies creates new possible structures. To see the animation: http://www.youtube.com/watch?v=IksIyui84wE#
Tuesday, March 17, 2009
Rocker/ Complexity / Image

Tuesday, March 10, 2009
Wolfram's idea of complexity is easier to define than Deleuze's. For Wolfram, complexity arises from a simple code repeated many times; in this way, while the local scale is simple, the same code at the global scale becomes complex. Deleuze's smooth and striated models are inherently complex, and related to each other in many more ways than just local/global scale.
Interior of the interactive water pavilion by NOX
Complexity, cellular automata
Rocker writes that the “extraction of algorithmic process is an act of high level abstraction”, an act producing visual complexity (2D or 3D diagrams). The unpredictable and the unknown generated through such algorithmic processes override experience and perception, two very distinct categories. In this sense, cellular automata are a visual form of simple messages and language abstracted into complex behavior and patterns, though only within the medium.
I suppose a difference between Wolfram and Deleuze would be the method through which spatial complexity is identified. Wolfram, for instance, uses a procedural sequence, beginning with a few simple rule sets, so the procedural motion is more like a thrust forward, toward the expansion of visual complexity; Deleuze, on the other hand, seems to draw a line between two distinct spaces and jump back and forth to form an inherent relation between them. Both operate in a dynamic environment.
UNStudio project: master plan and train station
Deleuze/Wolfram Complexity, Rocker Essay and Digital Architect Image
Rocker’s studio uses cellular automata as a generative tool for architecture. However, her use of cellular automata places emphasis on the output diagram as an architectural form. In the last class, we discussed that Rocker’s studio used the algorithm as a way of creating the diagrams, but that diagrams are only one way of representing the computation. I was wondering how a computation could be represented if its outcomes vary. Is the problem with Rocker’s approach that she is looking at only one outcome and not a variety of outcomes?
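One possible answer I can think of (a sketch of mine, not Rocker's method) is to represent the computation by a family of runs rather than a single diagram: the same rule started from different random seeds, with the outcomes summarized, for instance as final cell density.

```python
# My sketch, not Rocker's method: run the same elementary-automaton rule from
# many random starting conditions and report the spread of outcomes, so the
# computation is represented by a family of results rather than one image.
import random

def run_automaton(rule, width=64, steps=40, seed=None):
    rng = random.Random(seed)
    row = [rng.randint(0, 1) for _ in range(width)]        # a different seed per run
    for _ in range(steps):
        row = [(rule >> ((row[i - 1] << 2) | (row[i] << 1) | row[(i + 1) % width])) & 1
               for i in range(width)]
    return sum(row) / width                                # summarize: final density of live cells

densities = [run_automaton(rule=110, seed=s) for s in range(10)]
print("final densities across 10 runs:", [round(d, 2) for d in densities])
```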
The following is an image of a model designed by Greg Lynn for the Kleiburg Block.


