The Art of the Prototypical
by Marc Fornes
Published in: AD Parametricism 2.0 - Rethinking Architecture’s Agenda for the 21st Century
Editor: H. Castle, Guest-edited by Patrik Schumacher, AD Profile #240, March/April 2016
The work of Marc Fornes / THEVERYMANY may be described as ‘prototypical architectures’, in which ‘prototypical’ – an adjective – extends a logical sequence that begins with the sample (a test of a precise element within a unit), builds to the prototype (a unit or a relationship between units) and from there up to the mock-up (a number of units set up but not completely assembled).
The premise of each project is based on precisely defined architectural concerns, such as structure, enclosure, porosity etc. Through a process of empirical and serial experiments in both computational descriptive geometry and material systems, development grows from the scale of a unit, to a system of units, to an entire project, where every aspect of its nature is fully tested at 1:1 scale – including, most importantly, the pleasure of its spatial experience – with the potential for further scalability.
This article outlines the driving parameters behind THEVERYMANY’s prototypical architectures, including the unique terminologies that have matured in tandem with the development process itself.
Explicit and Encoded Protocols
One of the initial premises within the work of THEVERYMANY relates to the specificity of the process: all morphologies result from explicit protocols – finite series of unambiguous steps, hierarchically organised into a linear sequence and translated, through the shortest possible notation, into an operational algorithm.
Creating a design process through applied logic in hierarchical steps is not unique to computer science (it can remain entirely analogue); it becomes computational only once that logic is explicitly encoded to be interpreted by computers. THEVERYMANY’s protocols are explicitly written within a text file, articulated in a computational syntax (Python), call upon a vocabulary of methods from external libraries (RhinoCommon) and are finally executed within a software environment (Rhino3D).
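By way of illustration only, a minimal sketch of what such a protocol can look like is given below. It is written in plain Python rather than against RhinoCommon inside Rhino3D, and the helix and its parameters are hypothetical, not drawn from THEVERYMANY’s own code; what it shares with the protocols described here is its character: a finite, numerically parametrised sequence of unambiguous steps that yields the same morphology on every run.

# A minimal, hypothetical sketch of an explicit protocol: a finite series of
# unambiguous, numerically parametrised steps producing the same result on every run.
import math

def helix_points(turns=3, points_per_turn=36, radius=1.0, pitch=0.25):
    # Step 1: derive the total number of samples from the numerical parameters.
    count = turns * points_per_turn
    points = []
    for i in range(count + 1):
        # Step 2: map the index to an angle (no randomness involved).
        angle = 2.0 * math.pi * i / points_per_turn
        # Step 3: evaluate the explicit coordinates of each point.
        x = radius * math.cos(angle)
        y = radius * math.sin(angle)
        z = pitch * angle / (2.0 * math.pi)
        points.append((x, y, z))
    return points

pts = helix_points()
print(len(pts), "points; first:", pts[0], "last:", pts[-1])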
Protocol of Precise Indetermination
Such protocols are defined and driven through numerically controlled parameters, and are therefore precise. There is no such thing as a computational ‘maybe’! Above all, the protocol is considered precise because it calls upon randomness as little as possible: one wants to be able to run the same code twice and get the same result each time, particularly when implementing or debugging a specific piece of code or geometrical problem, especially a special or uncommon case.
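As a purely hypothetical illustration of this preference for repeatability: where a pseudo-random perturbation is used at all, fixing its seed keeps the protocol reproducible, so that the same special or uncommon case can be re-run and debugged identically.

# Hypothetical illustration only: when a pseudo-random perturbation is used,
# a fixed seed keeps the protocol reproducible from one run to the next.
import random

def jittered_grid(nx=10, ny=10, jitter=0.1, seed=42):
    random.seed(seed)  # same seed, same 'random' offsets, every run
    points = []
    for i in range(nx):
        for j in range(ny):
            dx = random.uniform(-jitter, jitter)
            dy = random.uniform(-jitter, jitter)
            points.append((i + dx, j + dy))
    return points

assert jittered_grid() == jittered_grid()  # two runs, identical result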
While the operational logic requires precision in order to be implementable, especially when fabrication is involved, there is also a desire, from a design standpoint, to leave room for an element of surprise for the purpose of exploration and invention. Yet even if the protocol is the sum of very deterministic steps (assuming the author wrote every line of code and understands each method’s logic and limitations – no black boxes), it is still often necessary to execute the code in order to visualise its result. Due to the number of lines, steps and conditions (‘if … then …’ statements), it is impossible for the author to anticipate the result exactly, and writing computational protocols of design therefore constantly includes a factor of indetermination: one likes a moment in the result, yet it is not obvious at first sight to its author what has triggered it – the happy mistake, to be understood, controlled, designed upon and finally implemented. The amount of surprise or distortion between the anticipated and actual results could be defined as ‘inertia of the protocol’ or ‘resonance within the system’ – results even higher than expected – but is not to be confused with the field of Emergence Theory, which would require a much more exhaustive and sophisticated set of criteria.
From Form Finding to Structurant Morphologies
Based on a history of empirical serial testing focused on the translation of digital geometries into physical reassemblies, the work of THEVERYMANY has been forced to address degrees of failure – from the most dramatic, such as total collapse, to invisible logistical issues detrimental to the possibility of scalability. Being at first exclusively empirical, this sharpened an understanding of structure and focused interest on performant self-supporting structures, such as hyper-thin shell structures. At the level of the low-funded installation, the integration of these different concerns into a single system is also cost-efficient through its reduction of complexity (number of elements, assembly types etc).
Intensive Curvature vs Extensive Curvature
The performance of hyper-thin self-supported structures is achieved through extensive curvature – a principle based on maximising the overall double curvature of a surface or volume in order to take advantage of its structural capacities. Yet double curvature in itself is not enough. While the work of Frei Otto demonstrates that the structural model of the soap bubble is far more performant than that of a box, such a model is relatively less performant if scaled to the size of a building. Scale matters: what is perceived as doubly curved at an architectural scale can often be approximated as a compound of straight lines or planes, and therefore has to be compensated for by either material thickness or, in the case of active tension, heavy masts able to sustain the tension forces, such as in Otto’s Olympic Stadium in Munich (1972).
Intensive curvature aims to maximise double curvature everywhere (as in the extensive case) while constraining the maximum radii of curvature. Morphologies based on intensive curvature tend to curl in all directions (in order to maximise the differentiation of radii, but also the direction of curvature) and to compound ‘closed-profile’ elements, such as thick lattice networks (which have a tight radius in at least one direction and are therefore extremely structurally performant).
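Under purely illustrative assumptions – a polyline sampled across the surface and a hypothetical maximum allowable radius – the constraint can be sketched as follows, estimating the local radius of curvature as the circumradius of three consecutive points and flagging the spans that are too flat.

# Hypothetical sketch: estimate the local radius of curvature as the
# circumradius of three consecutive points along a sampled polyline and
# flag the spans that are flatter than a maximum allowable radius.
import math

def circumradius(p0, p1, p2):
    a = math.dist(p0, p1)
    b = math.dist(p1, p2)
    c = math.dist(p2, p0)
    s = (a + b + c) / 2.0
    area_sq = s * (s - a) * (s - b) * (s - c)  # Heron's formula
    if area_sq <= 1e-12:
        return float("inf")  # collinear points: no measurable curvature
    return (a * b * c) / (4.0 * math.sqrt(area_sq))

def too_flat(polyline, max_radius=2.0):
    # indices whose local curvature is weaker than the 'intensive' constraint
    return [i for i in range(1, len(polyline) - 1)
            if circumradius(polyline[i - 1], polyline[i], polyline[i + 1]) > max_radius]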
Base Mesh vs Basic Mesh
Morphologies such as those within Frei Otto’s work can be digitally simulated from a ‘simpler’ planar mesh with a series of anchors locked in place (vertices, curves etc) and the application of forces. Even though it acts as an abstract elastic fabric during the simulation process, such a starting mesh can exist in planar form without topological issues such as overlaps or self-intersections.
The work of THEVERYMANY is based on a two-step process, placing further emphasis on the creation of the initial base mesh topology as well as on the relaxation process itself. Meshes are vertices (coordinates), edges (relationships) and faces (representation), with directions, and can potentially represent endless types of complex and non-linear morphologies: compounds of open and closed forms, non-manifolds, branchings or recombinations, etc. As such, they most often cannot exist in planar form and must instead be built, for example, through multiple additive (union) or subtractive (difference) Boolean operations in three-dimensional space.
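A minimal sketch of the second step, the relaxation itself, is given below under simplifying assumptions (a bare vertex/edge list and plain Laplacian averaging); it is not the studio’s solver, but it shows the principle of anchors locked in place while every free vertex is pulled towards the average of its neighbours at each loop.

# A minimal sketch of relaxation with locked anchors, assuming a bare
# vertex/edge representation; not the studio's actual solver.
def relax(vertices, edges, anchors, iterations=100, step=0.5):
    # vertices: list of (x, y, z); edges: list of (i, j) index pairs;
    # anchors: set of vertex indices locked in place.
    verts = [list(v) for v in vertices]
    neighbours = {i: [] for i in range(len(verts))}  # the 'relationships' of the mesh
    for i, j in edges:
        neighbours[i].append(j)
        neighbours[j].append(i)
    for _ in range(iterations):
        moved = []
        for i, v in enumerate(verts):
            if i in anchors or not neighbours[i]:
                moved.append(v)  # anchors stay exactly where they are
                continue
            # pull each free vertex towards the average of its neighbours
            avg = [sum(verts[n][k] for n in neighbours[i]) / len(neighbours[i])
                   for k in range(3)]
            moved.append([v[k] + step * (avg[k] - v[k]) for k in range(3)])
        verts = moved
    return verts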
A prototypical structure such as THEVERYMANY’s Labrys Frisae (2011) – a 10-by-10-metre structure, 6 metres tall and built from aluminium sheets less than a millimetre thick – could nonetheless support its dead load as well as the live loads of multiple people climbing to its top; this has been empirically tested with up to three people freely and simultaneously ascending it.
THEVERYMANY
The work of THEVERYMANY has been exploring the physical production of structural morphologies through the development of custom protocols of tessellation (the description of the surface/mesh through simple elements, from triangles/quads to irregular polygons). The issue demonstrated through serial physical prototyping of such systems is that they rely on singularity: each face (a triangle, quad or polygon of n edges) is materialised as a panel, making the total number of elements and unique parts potentially endless. While this situation may be great for the design of patterns (directionality, intensities etc), it presents a nightmare to physically reassemble. The creation of parts becomes, on the one hand, simpler because they are not curved (and therefore do not require moulding, carving or printing), yet complexity re-emerges through logistics: naming conventions, production, double-checks, an increased risk of error and long reassembly times.
From The Very Many to The Very Least
For protocols addressing such logistical issues of reassembly, the task becomes one of turning the very many into the very least number of parts. THEVERYMANY’s initial direction was based on a principle of recombination: tessellation of surfaces/meshes according to selected criteria (for example, smaller elements at the tightest curvature) recombined into larger sets – such as stripes – rather than accepting the sum of singularities as a material system.
The very first example – invented for THEVERYMANY’s n|Strip project (2010) – was based on a linear recombination of singular panels into chains, or striped morphologies, where the sum of singular planar parts is potentially developable as well (provided no issues such as self-intersections exist) and can therefore be transformed into a material system. This dramatically reduced the number of parts while, as a by-product, dramatically increasing structural performance through the redundancy of connections to multiple neighbours.
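Hypothetically, and over an abstract face-adjacency graph rather than the studio’s actual mesh data, the principle of recombination can be sketched as a simple chaining of adjacent panels, where each stripe grows until it reaches a maximum length or runs out of unused neighbours.

# A hypothetical sketch of linear recombination: chaining adjacent panels
# into stripes instead of keeping every face as a singular part.
def recombine_into_stripes(faces, adjacency, max_length=12):
    # faces: list of face ids; adjacency: dict of face id -> list of neighbour ids.
    unused = set(faces)
    stripes = []
    while unused:
        stripe = [unused.pop()]  # any remaining panel becomes a seed
        while len(stripe) < max_length:
            # extend the chain with an unused neighbour of the last panel
            nxt = next((n for n in adjacency.get(stripe[-1], []) if n in unused), None)
            if nxt is None:
                break
            unused.discard(nxt)
            stripe.append(nxt)
        stripes.append(stripe)
    return stripes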
From Descriptive Geometry to Search Protocols
The issue with material systems of linear stripes created through recombination is that while such protocols allow local choices for best-fit behaviour (according to, for instance, curvature) within each stripe, there is no overall knowledge of the entire system.
The introduction of parallel computing and ‘multi-agent-based systems’ allows agents – or encapsulated sets of rules – to be aware of one another at each loop and to exchange feedback. This means picking seeds, for instance at the edges, and running them like ants across the morphology, each leaving a trail. When a trail becomes too long, the agent dies and its path becomes a material stripe.
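A hypothetical sketch of this reading – again over an abstract face-adjacency graph, and far simpler than the studio’s own protocols – advances every agent one step per loop, lets agents read one another’s trails as feedback, and kills an agent once its trail exceeds a maximum length.

# A hypothetical sketch of the multi-agent reading: every agent advances one
# step per loop, reads the others' trails as feedback, and dies once its
# trail is too long; each trail then becomes a material stripe.
def run_agents(adjacency, seeds, max_trail=12):
    trails = [[s] for s in seeds]
    alive = [True] * len(seeds)
    claimed = set(seeds)  # faces already taken by any agent (the feedback)
    while any(alive):
        for a, trail in enumerate(trails):
            if not alive[a]:
                continue
            options = [n for n in adjacency.get(trail[-1], []) if n not in claimed]
            if not options or len(trail) >= max_trail:
                alive[a] = False  # the agent dies
                continue
            nxt = options[0]  # local decision at the current face
            claimed.add(nxt)
            trail.append(nxt)
    return trails  # trails, ready to be converted into stripes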
The invention of such a local reading to describe a mesh and define a linear material system allows non-mathematicians and non-computer scientists to bypass the primitive laws of traditional descriptive geometry and replace them with a numerous ‘population’ of agents crawling across the morphology. From there, the author can decide, through trial and error, the best path of progress according to the reading of local conditions.
Competitive Rule Sets and Schizophrenic Behaviours
However, descriptive systems based on search can often not rely on one single set of rules. Due to the nature and complexity required by structurant morphologies based on intensive curvature, a rule that solves a problem for one local condition often triggers new problems elsewhere. Such a protocol of description requires competitive rule sets in order to find the best set of rules, and the best-fit parameters, to solve the maximum number of conditions overall through local decision-making. The behaviours of stripes observed under such rules are nervous, fighting one another, and are therefore referred to as ‘schizophrenic’.
Protocols – such as those for THEVERYMANY’s y/Struc/Surf project (2011) for the Centre Pompidou in Paris – feed their agents with the local parameters in reverse order of best fit; the solution that first passes the test isn’t necessarily the best, but rather the first acceptable one, producing a best-average solution.
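Reduced to its bare logic, and with hypothetical names, such a competitive rule set might be sketched as an ordered list of candidate rules tried against each local condition, where the first candidate to pass an acceptance test is applied rather than the absolute best.

# A hypothetical sketch of a competitive rule set: ordered candidate rules
# are tried against the local condition, and the first one whose result
# passes the acceptance test is applied; not the best, only the first
# acceptable, which averages out to a best fit overall.
def apply_first_acceptable(local_condition, rules, is_acceptable):
    # rules: ordered list of functions taking the local condition and
    # returning a candidate solution; is_acceptable: validity test.
    for rule in rules:
        candidate = rule(local_condition)
        if is_acceptable(candidate, local_condition):
            return candidate
    return None  # unresolved here; the condition is left for another pass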
From Non-Linear Morphologies to Sets of Linear Descriptions
The trails of such agents, once converted to geometries (with attributes such as relative width, thickness, technicalities and other detailing), can be digitally cut as linear stripes within sheet material. Typical fitness criteria for production are length and shape. If too long, stripes can’t fit on standard sheets of material or a specific machine bed. If too curly, stripes won’t nest or layer well for packaging; yet if too straight or too similar in shape, they become harder to differentiate during physical reassembly. Also, stripes that are too long do not work well at areas of high curvature differentiation, such as recombinatory or splitting nodes; yet if too short, the process is back at square one (singularities), with too many parts.
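A hypothetical sketch of such a production check – with threshold values invented purely for illustration – reduces these criteria to a stripe’s total length and its ‘curl’, the ratio of that length to the straight distance between its ends.

# A hypothetical fitness check for a stripe, reduced to the two criteria
# named above: total length and 'curl' (length over straight end-to-end
# distance). The threshold values are invented for illustration.
import math

def stripe_fits(points, max_length=2.4, max_curl=4.0, min_curl=1.05):
    length = sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))
    chord = math.dist(points[0], points[-1])
    curl = length / chord if chord > 1e-9 else float("inf")
    if length > max_length:
        return False  # too long: will not fit on a standard sheet or machine bed
    if curl > max_curl:
        return False  # too curly: will not nest or layer well for packaging
    if curl < min_curl:
        return False  # too straight: hard to tell apart during reassembly
    return True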
Coloration vs Colour(s)
Colours are obviously highly subjective, subject to trends and fashions. Choosing a colour, and moreover standing by it for years, can take its toll on an architect. Computation and procedural protocols of tessellation have opened up new paradigms: each physical part can be assigned a single colour as an attribute, so the sum of the parts can precisely approximate gradients (rather than the fuzziness of earlier airbrush solutions).
Coloration defines the procedural art of applying multiple colours across sets of parts. For example, gradients can be parametrised as smooth (depending on the number of parts), stepped with precise amplitude (the contrast from one part to the next), linear (in the blend from one colour to another), non-linear (with local intensities), with two or multiple colours, as zebras against a single constant colour, or mixed with other gradients through precise modulo alternation. The possibilities are endless. The effect of such coloration protocols can become extremely intricate, and therefore potentially less subject to initial prejudice about specific single colours, since the complex logics established have first to be analysed and understood (at both global and local scales) before one can even get one’s mind around them.
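As one hypothetical sketch among the endless possible coloration protocols: each part, indexed along a stripe, receives a single colour sampled from a two-colour gradient, optionally stepped into discrete bands and alternated by modulo with a constant zebra colour.

# A hypothetical coloration protocol: each part, indexed along a stripe,
# receives a single colour sampled from a two-colour gradient, optionally
# stepped into discrete bands and alternated by modulo with a zebra colour.
def colour_parts(count, colour_a, colour_b, steps=None, zebra=None, modulo=2):
    # colour_a / colour_b: (r, g, b) tuples; steps: number of discrete bands
    # (None keeps the gradient smooth); zebra: constant colour inserted
    # every 'modulo' parts, if given.
    colours = []
    for i in range(count):
        t = i / (count - 1) if count > 1 else 0.0
        if steps and steps > 1:  # stepped gradient with a precise amplitude
            t = round(t * (steps - 1)) / (steps - 1)
        blend = tuple(a + t * (b - a) for a, b in zip(colour_a, colour_b))
        if zebra is not None and i % modulo == 0:
            colours.append(zebra)  # modulo alternation with a constant colour
        else:
            colours.append(blend)
    return colours

print(colour_parts(6, (255, 0, 0), (0, 0, 255), steps=3))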
Prototypical Architectures vs Architecture
From a research standpoint, the work of THEVERYMANY has focused on the invention of an algorithmic description of mesh geometry via planar stripes and their physical reassembly into self-supported, doubly curved surfaces without the need for expensive moulds or temporary scaffolding. The results are fully immersive experiences to visit, engage with, play in and lose oneself in. Even though these structures are often temporary interior ‘installations’ funded through art, there is a focused motivation to become permanent, to grow in scale, and to be exposed to more elements, live loads, multiple programmes and very different cultures and contexts. The aim is not to be known exclusively as prototypical architectures among the expert audience of a specialised field, but rather to operate fundamentally as architecture.
Text © 2016 John Wiley & Sons Ltd. Images: © MARC FORNES/THEVERYMANY