Tuesday, March 31, 2009

2009-03-31 On Board


Manuel de Landa reading

"The computer with its smart software is killing the designer's authorship"! Sounds like a sentence from a cheap futuristic movie. But on the other hand-all sense of the discussion is laid around this nowadays phenomenon. It seems that today's designer (or architect) plays role of a "guider of the evolution of form"not a generator or composer. Controlling form's development ("designer does not specify every single point of the curve, but only a few key weights which deform the cure in certain ways"M.de Landa) is his initial duty. It can be partially true. But it doesn't frees us as a designers from having artistic flair, from understanding the nature and potential of the material. The computer is just a mean of the achievement of the purpose.
I was also impressed by these words: "The genes guide but do not command the final form. In other words, the genes do not contain a blueprint of the final form" (M. de Landa). A balance would be the perfect solution. A hand and a pencil, together with the computer and its smart software, should be a projection of someone's mind into space.

De Landa

I suppose we are all somewhat fascinated by the "egg designer" analogy.

De Landa writes that to realize the potential of the genetic algorithm in design, designers should no longer specify forms but boundaries and constraints. To allow for the evolution of more complex and interesting forms, sets of primary codes/rules must be generated. Designers set constraints but must not foresee any particular result; indeed, the evolutionary results are "to be truly surprising". When the algorithm ceases to produce surprises, it is no different from a simple CAD program that produces rigid polygons. It is the different intensities that drive the evolutionary process and thus the complexity of forms.
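A minimal sketch of what that shift might look like in code (my own hypothetical example, not De Landa's): the designer authors only a scoring function - a constraint plus a reward - and never draws the form itself. The genome here is just a handful of curve weights, echoing De Landa's "key weights which deform the curve".

```python
import random

# Hypothetical sketch: the designer scores properties of a form
# instead of specifying the form. Genome = a few curve weights.

def fitness(weights):
    # Constraint: total "material" is fixed; violations are penalized.
    material = sum(abs(w) for w in weights)
    penalty = abs(material - 10.0)
    # Reward variation between neighboring weights (a crude stand-in
    # for the intensity differences that drive formal complexity).
    variety = sum(abs(a - b) for a, b in zip(weights, weights[1:]))
    return variety - penalty

# The designer never picks the winning weights -- the search does.
population = [[random.uniform(-3, 3) for _ in range(8)] for _ in range(200)]
best = max(population, key=fitness)
print(best, fitness(best))
```

The point of the sketch is that nothing in the code describes a shape; only the criteria for judging shapes are authored, so the result can still surprise.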

The challenge for designers, therefore, is to know which of these intensities to employ, and to what extent. To achieve this, designers need to acquire an understanding of the complexity of materials. They are to become, to a certain degree, craftsmen whose craft is often overlooked.

Monday, March 30, 2009

Delanda Readings

Delanda’s chief interest is in the complexity of matter and the nature of material. Formal manipulation is purely conceptual, while the problems facing architects include numerous additional factors such as material constraints, structural forces, etc. Delanda sees enormous potential in CAD software not only to develop complex forms but also to ground them in real-life situations, thereby providing a more holistic approach to design. He also challenges the traditional role of materiality in architecture, specifically through the idea of the genetic algorithm. Just as DNA can be restructured to create a variety of organisms, new heterogeneous materials can be restructured and applied universally in any situation. These ideas have the potential to alter the way buildings are conceived and constructed, as well as the role of the architect in the design process.

DeLanda

Both of DeLanda's articles were helpful to me in relating Deleuze to our ways of thinking about form and architecture. I was as interested in the "egg designer" idea as Matt was, but I was also fascinated by the way DeLanda talked about materials. Rather than talk about the inherent properties of a material, he approached it from a craftsman's point of view, in which materials always vary based on their origins and components. It started me thinking about what it would mean to be a craftsman today - how does a know-how of materials influence an approach to design? Particularly a digital approach?

In the bottom-up approach that Tsakairi talked about, materials are obviously very important and influential in design. That got me questioning how materiality could relate to genetic algorithms and to designing in a virtual environment - where materiality can assemble itself. DeLanda talked about the benefits of bone-like materials, where tension and compression stresses are dealt with in different ways rather than homogeneously. But if the elements that deal with tension and compression are themselves homogeneously and mechanically produced, then how are we still representing the possible material variation?

A little confused about the role the architect is playing

Regarding the process of virtual evolution mentioned in the article, "the space of possible designs that the algorithm searches needs to be sufficiently rich for the evolutionary results to be truly surprising" is the first point that enlightened me, but it also confuses me a little.
Concerning the evolution of the virtual structure: when the number of changes or generations is large enough, the result will be quite unexpected and far different from the original generation. Surely that will bring up some amazing designs after the process, but what role is the architect playing in this process? Just a critic who gives comments? Or a final decision-maker who decides what should be kept and what thrown away? If so, what is the difference between an architect and a passer-by?
In my view, the architect should be a god of this process rather than just the final decision-maker, which means the architect should be involved in the whole evolutionary process rather than merely glancing at the result. The architect should design the first generation with the required functions, which would define the correct topological relations among the several parts of the design; the architect also needs to set up the rules of "natural selection", so that every generation can be directed to follow the restrictions. There must be something else an architect needs to consider, but so far it is not clear in my head...

Wednesday, March 25, 2009

Delanda

Delanda’s essays investigate the genetic algorithm: its place in architecture and its use in the future. Genetic algorithms are now a tool that can generate form by the mating of different forms. The richest algorithms breed the most surprising results; if the results were predictable, the need for this tool would no longer exist. That said, there still needs to be some level of predictability to generate real buildings. Structural elements still need to function in the same ways, holding the loads and stresses of the building, for if columns evolved into decorative elements, the structural soundness of the building would be compromised. Therefore, before the breeding of forms begins, a "body plan" must be set: much as mammals share an abstract vertebrate plan, buildings would have to have their own "body plan" that defines the essence of a building. The designer would decide what that plan comprises, and in doing so becomes the designer of the egg rather than the designer of the building. Once the egg is decided upon, the genetic algorithm takes over and starts generating offspring; as generations pass, new building forms will evolve.
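A toy genetic algorithm can make the "egg designer" role concrete. Everything below is a hypothetical sketch, not De Landa's example: the body plan fixes the topology (a row of structural bays that can never mutate away), while breeding and mutation vary only the proportions.

```python
import random

# Toy GA sketch (hypothetical). The "body plan" fixes the topology:
# every design has exactly 6 bays of columns. Evolution only varies
# bay widths, so structure can never mutate into decoration.
N_BAYS, POP, GENERATIONS = 6, 100, 50

def random_design():
    return [random.uniform(2.0, 8.0) for _ in range(N_BAYS)]  # bay widths (m)

def fitness(design):
    # Hard constraint from the body plan: total span must be ~30 m.
    if abs(sum(design) - 30.0) > 1.0:
        return float("-inf")
    # Reward rhythmic variation between adjacent bays.
    return sum(abs(a - b) for a, b in zip(design, design[1:]))

def breed(a, b):
    cut = random.randrange(1, N_BAYS)   # crossover ("mating" of two forms)
    child = a[:cut] + b[cut:]
    i = random.randrange(N_BAYS)        # point mutation
    child[i] += random.gauss(0, 0.5)
    return child

pop = [random_design() for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP // 2]
    pop = survivors + [breed(random.choice(survivors), random.choice(survivors))
                       for _ in range(POP - len(survivors))]
print(max(pop, key=fitness))
```

The designer here authored the egg (six bays, a 30 m span, a taste for rhythm) and nothing else; which particular proportions emerge is left to the evolution.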

My question, then, is: in the end, will algorithms be able to produce more efficient buildings than architects can now produce? And if so, will aesthetic design in the future be of less importance compared to "aesthetic fitness"?

Tuesday, March 24, 2009

DeLanda Readings

DeLanda's quest to make a case for modeling software finds its roots in Deleuzian theory. In his text, DeLanda describes Deleuze's attempt to change the dominant philosophy of the genesis of form. However, what struck me as extremely helpful in seeing the transition from 'the world of obedient rigid polygons' to our current design philosophies was DeLanda's explanation and analysis of embryological development. In embryological development, an egg is initially very simple. Through phase transitions, the relatively simple egg becomes more and more complex. As DeLanda elegantly phrased it, 'the genes guide but do not command the final form.' This same process must be undertaken when using algorithmic tools. One begins with a relatively simple set of instructions [this brings to mind the transition from the simple A B set that we discussed in class to the complex patterns that would eventually form], and by following those instructions a complexity based on intensive properties begins to form. This seemingly banal analogy between the embryological process and architectural development was an eye-opener for me. I guess DeLanda was right when he stated that 'they [architects and engineers] will have to become egg designers.'
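The "simple A B set" from class sounds like a two-symbol Lindenmayer system; here is a minimal sketch (my reconstruction of the class example, not DeLanda's text) of how such an instruction set unfolds:

```python
# Minimal L-system sketch: two symbols, two rewrite rules.
# Like the egg, the rules "guide but do not command the final form" --
# the complex string emerges from repeated local rewriting.
rules = {"A": "AB", "B": "A"}

def grow(axiom, steps):
    for _ in range(steps):
        axiom = "".join(rules[symbol] for symbol in axiom)
    return axiom

for n in range(7):
    print(n, grow("A", n))
# A, AB, ABA, ABAAB, ABAABABA, ... (lengths follow the Fibonacci numbers)
```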

Intensive vs Extensive


If we divide a volume of matter into two equal halves, we end up with two volumes, each half the extent of the original. Intensive properties on the other hand are properties such as temperature or pressure, which cannot be so divided.

- Manuel DeLanda, Intensive Science and Virtual Philosophy

An extensive quality is one you can measure, such as length, area, and volume; extensive difference is therefore quantitative difference. Say you divide a perfect cube into two equally subdivided solids: the two solids contain equal volumes. An intensive quality, on the other hand, appears when a material reaches a threshold that creates a difference in kind; for example, water turns into ice as the temperature decreases. This allows us to investigate self-organized, bottom-up approaches. In a bottom-up approach, designers have control at the local scale, but the global scale is the result of the interactions of local behaviors. The important thing is that the global behavior is more than the sum of the parts: differences in quality allow new behavior to emerge, such as flowing water turning into tidal waves, or spider-web strings creating a self-supporting web network. Designers seem to use algorithmic methods to investigate this emergent behavior. However, algorithms, whether scripting, animation, or parametrics, are structured by nested binary choices: if A do C, if B do D, if E do G, and so on. I still wonder whether these binary choices can simulate the complexity of nature. Maybe they can. As Stephen Wolfram notes, a simple rule can create a complex system.
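Those nested binary choices can be written out directly. A small sketch of one of Wolfram's elementary cellular automata (Rule 30): each cell's next state is a pure binary decision on its three-cell neighborhood, yet the global pattern is chaotic.

```python
# Elementary cellular automaton, Rule 30: the next state of a cell is
# a binary choice on the (left, self, right) triple -- eight if-then
# clauses in total -- yet the global pattern is famously complex.
RULE = 30
WIDTH, STEPS = 63, 30

row = [0] * WIDTH
row[WIDTH // 2] = 1  # single live cell in the middle

for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    # Encode the neighborhood as a 3-bit index and read off the rule bit.
    row = [(RULE >> (row[(i - 1) % WIDTH] * 4 + row[i] * 2 + row[(i + 1) % WIDTH])) & 1
           for i in range(WIDTH)]
```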

Genetic Algorithm

from: Deleuze and the Use of the Genetic Algorithm in Architecture, Manuel de Landa

In this short reading, Delanda argues for fundamental changes in architectural design. He gives credit to Deleuze's philosophical work and expands on the subject of the "genesis of form". The example he gives of drawing a round column as a three-step procedure (a kind of mixture of Euclidean geometry and Aristotle's categories) seems to mark the beginning of the relevance of recursion and aggregation: such a form, or rather that specific technique, can effectively create the evolution of a "larger reproductive community" - a population with a variable genetic code. The reading feels very relevant and contemporary; while reading, I kept thinking of software like Grasshopper or Houdini, and attaching the meaning of genetic mutation to simple step procedures that can be deployed in a computer environment using chunks of information embedded in an element. The notion of a "body plan" and the idea of hacking into non-architectural resources and fields are interesting and necessary to consider in order to avoid the traditional, decayed trend of selecting geometry by aesthetic preference. In the end we see the final product through so many different lenses and scales of criticism that the grounds for seeing digitally mutated forms can seem diminutive in themselves, so I don't think we have come close to true digital reproduction or anything similar; in themselves these forms hold small bits of the building blocks and techniques needed to unravel new foundations. Personal style and selection do not run parallel with generating a process through a defined sequence of code - that is true, but still under question. Subsequently we also begin to see overlap with the earlier readings of Wolfram and Rocker's research. These are more or less my general comments on this reading; "The Case of Modeling Software" reintroduces some of the same concepts as I browse through it - comments in a follow-up blog post.

Thursday, March 19, 2009

Rocker_Complexity_Project

Studio Rocker emphasizes exploring the possibilities of architecture by manipulating the idea of code: how this entity moves from its original state and configuration to multiple possible expressions. The code is not seen as an external factor that limits the performance of architecture; it is used as a mediator that transforms the basis of the discipline by being materialized or executed. This is far from our traditional view of codes as restrictions, opening instead onto a variety of possible performances and results (such as structure and surface) that are controlled and originated by a set of rules and organizations determining the after-effect of a configuration. I was personally interested in the idea of patterns and repetition that this computational process provides: one original state of configuration transcending, through repetition, into a more complex and recoded effect. Similar to von Neumann's UCC (1940s), which revealed the idea of copying and construction producing unstable patterns, in Rocker's Recursions (2004) it becomes evident that the automaton is revealed in the understanding of rules, in the repetition of these operations, and in the effect caused by relating one to the other, contributing to a new generation. It is interesting that the author compares these lines to generations, because they behave exactly like generations: an organization transformed by another. The result: a variety of patterns. It comes down to a modification of codes and rules that stimulates a new understanding of architecture through computational mediums. The digital software breaks the boundary of the screen and provides not only a theoretical understanding of its function but also the multiple physical possibilities it can produce: the architecture.
Deleuze’s complexity relates two different kinds of systems or spaces that are not completely independent of each other: the smooth and the striated. The relationship is not restricted or controlled; it does not reveal an evident origin or a clear sequence of development. Wolfram’s complexity is more choreographed, and its development through time is controlled by a rule set that changes while maintaining a reference to its previous state. It is more of an addition or repetition of the original rule set that builds into a more complicated system, different from Deleuze’s complexity, which is less systematic. I agree with most of the comments that these two conceptions (Deleuze's and Wolfram's) are similar to each other in creating a constant relationship with changing events (rule sets or spaces); in addition, they are both progressive and developmental. Apart from one being less constrained and manipulated than the other, they each evolve toward a different use of space. Both generate a complexity that emerges from their common and constant relationships through time, carrying information from one stage to the next, information that becomes edited or altered independent of its consistency and origin.
The image: Zaha Hadid. It is digital media, a study of the Thames Gateway as an urban field; the project would be located in London, United Kingdom. This specific project reveals a similar discussion by having a rule set as an origin that is then developed, with alterations of that development leading to multiple possibilities and patterns of structure. They investigate four main building typologies throughout the urban area, leading to a series of evolutions of these standard typologies that are placed on the site and tested. The fusion of these typologies creates new possible structures.
To see the animation: http://www.youtube.com/watch?v=IksIyui84wE#

Tuesday, March 17, 2009

Rocker/ Complexity / Image

Studio Rocker uses cellular automata as a tool. It generates a code, and they dictate some form of representation upon it. For me, Melissa says it well when she defines the designer's role as merely a chooser of representation.
---

When Pawel talks about Deleuze's complexity as being relational and Wolfram's as being sequential, he gets me thinking about what that means in terms of time. Matt talked about this too, saying that both systems relate to time. To me it initially seems as if Wolfram's complexity is of a progressively increasing nature, while Deleuze's complexity already exists completely - it is just waiting to be discovered. But then that gets me thinking about the setup of both theories.

Wolfram's complexity emerges from simple rules - and so does Deleuze's.  [smooth = this, striated = this, they interact thusly].  For Deleuze, the complexity emerges from how the two ideas interact, and how they relate.

Although Wolfram presents his diagrams of cellular automata in a linear and progressive format - what would happen if we related them differently? We talked in class about the Rocker studio having issues with modes of representation. What if these rules related to each other in a different way - radially, or as numbers, to give random examples? The type of representation and relation determines in part the level of complexity. (If you only allowed the rules to progress two steps on a three-square-wide grid, the diagram would be a lot less complex than an 18-step, 50-square-wide grid - see the sketch below.)
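One crude way to test that intuition (a hypothetical experiment of mine, not from the readings): run the same rule at both framings and count distinct rows as a rough proxy for visual complexity.

```python
# Same rule, two framings: a 3-wide, 2-step grid vs. a 50-wide, 18-step
# grid. Counting distinct rows is a crude proxy for visual complexity.
def run(rule, width, steps):
    row = [0] * width
    row[width // 2] = 1
    rows = []
    for _ in range(steps):
        rows.append(tuple(row))
        row = [(rule >> (row[(i - 1) % width] * 4 + row[i] * 2 + row[(i + 1) % width])) & 1
               for i in range(width)]
    return rows

for width, steps in [(3, 2), (50, 18)]:
    history = run(110, width, steps)
    print(f"{width:>2} cells x {steps:>2} steps -> {len(set(history))} distinct rows")
```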

So it seems that both Wolfram and Deleuze present complexity in relational and representational terms. Starting with a simple definition (rules for squares, or two ideologically different spaces), complexity develops in relating the two spaces, or in relating how the rules operate in space. The rules in and of themselves have no complexity, just as the definition of a smooth or striated space has inherently no complexity.
---

The image is of the Hydrogen House in Austria by Michael McInturf, Greg Lynn, and Martin Treberspurg.

Tuesday, March 10, 2009

studio rocker experiments with cellular automata to create three dimensional diagrams, which is just another way of expressing the code. there is no more complexity in the architectural solution than there was in the code – it is exactly the same information in a different form. she acknowledges this deficiency though, when saying 'any code's expression is thus always just one of an infinite set of possible realisations'. the role of the designer, for her studio, is to simply decide on the method of representation of the code.

wolfram's idea of complexity is easier to define than that of deleuze. for wolfram, complexity arises from a simple code, repeated many times. in this way, while the local scale is simple, the same code at the global scale becomes complex. deleuze's smooth and striated models are inherently complex, and related to each other in many more ways than just local/global scale.

interior of the interactive water pavilion by NOX

Complexity, cellular automata


Rocker writes that the "extraction of algorithmic process is an act of high level abstraction", an act producing visual complexity (2D or 3D diagrams). The unpredictable and the unknown generated through such an algorithmic process override experience and perception, two very distinct categories. In this sense, cellular automata are a visual form of simple messages and language abstracted into complex behavior and patterns - within the medium only.

I suppose a difference between Wolfram and Deleuze would be the method through which spatial complexity can be identified. Wolfram, for instance, uses a procedural sequence, beginning with a few simple rule sets, so the procedural motion is more like a thrust forward, toward the expansion of visual complexity; Deleuze, on the other hand, seems to draw a line between two distinct spaces and jump back and forth, forming an inherent relation between them. Both operate in a dynamic environment.

UNStudio Project of Master plan & Train Station,

Bologna, Italy, 2007


Deleuze/Wolfram Complexity, Rocker Essay and Digital Architect Image

Wolfram’s reading indicates that complexity can be based on a simple rule set: by following the defined rules, the outcome can become visually complex. In this system, complexity is based on a sequence of 'if then' statements. Deleuze's theory, however, suggests an inherent complexity constructed of two different yet intertwined entities: the smooth and the striated. There is no if-then statement here, because the smooth and the striated act in relation to each other while remaining their own entities. One thing the two readings have in common is that their systems exist within time; they are not static. In Wolfram's text, time is displayed in the process of analyzing the previous step according to the logic of the system and then applying it to create the outcome. In the Deleuze reading, time is a property present in the smooth, because intensities and forces cannot exist without the element of time.
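That sequence of 'if then' statements can be made literal. A sketch writing one elementary rule (Rule 90, chosen arbitrarily here) as an explicit lookup table - the eight clauses below are the system's entire logic:

```python
# A cellular automaton rule written as literal if-then clauses:
# each (left, self, right) pattern maps to the cell's next state.
RULE_90 = {
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 0, (1, 0, 0): 1,
    (0, 1, 1): 1, (0, 1, 0): 0, (0, 0, 1): 1, (0, 0, 0): 0,
}

def step(row):
    return [RULE_90[(row[i - 1], row[i], row[(i + 1) % len(row)])]
            for i in range(len(row))]

row = [0] * 41
row[20] = 1
for _ in range(20):
    print("".join("#" if c else "." for c in row))
    row = step(row)  # Rule 90 draws a Sierpinski triangle over time
```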

Rocker’s studio uses cellular automata as a generative tool for architecture. However, her use of cellular automata places the emphasis on the output diagram as an architectural form. In the last class, we discussed that Rocker's studio used the algorithm as a way of creating the diagrams, but diagrams are only one way of representing the computation. I was wondering how computation could be represented if its outcomes vary. Is the problem with Rocker's approach that she is only looking at one outcome and not a variety of outcomes?

The following is an image of a model designed by Greg Lynn for the Kleiburg Block.


Tuesday, March 3, 2009

Relation Between Categories and Prior Analytics

Aristotle’s Categories consists of methods of describing (categorizing) human perception and understanding. Topics include substance, quantity, relativity, quality, action/affection, opposition, contradiction, priority, simultaneity, movement, and possession. Predicate logic seems to exist in the Categories, specifically in the substance category, to describe the relationship between the individual, man, and animal. It struck me as similar to the syllogism discussed in last week's class: humans are mortal, Socrates is a human, therefore Socrates is mortal. Prior Analytics builds upon these ideas, developing further modes of logic and reasoning as well as methods for establishing and refuting propositions and investigating problems.