The End of the Digital
In 1992 Peter Eisenman published two now-famous articles announcing the coming of a new age of electronics in design.1 The technical logic of electronics, he argued, is symbolized by the telefax, the old age of mechanical technologies by the photograph. William Mitchell had just written a seminal book explaining that digital photography (then still in its infancy) may look like traditional photography, but it is technically and ontologically quite unlike it. Traditional photographs are indexical traces of the originals they represent; digital photographs are sequences of numbers that can be edited (or recalculated) at any time.2 This often unpredictable variability, inherent in all electronic media, runs counter to centuries-old standards of mechanically mediated perception and may become a tool to further “dislocate vision.” It was here that Eisenman first mentioned Deleuze’s theory of the fold.3 The rest of the story is known: Deleuze’s “fold” became—and in many ways remains—a trope of digital design.
In the course of the last twenty years, the digital turn has destabilized far more than vision. And in their latest avatar, digital technologies are giving rise to nothing less than a new way of making and using objects. Ubiquitous—and for that reason often undetected—in the digital realm, this new digital way of making is best observed in some marginal media applications where it is overtly exploited. Wikipedia, a familiar example, may read like a traditional encyclopedia in print, but its entries can change at any time, as any reader of Wikipedia can edit almost all of its pages or write new ones. The logic of the system posits that, as errors are emended by further contributors, the amount of usable information will accrue over time. Wikipedia entries hence become statistically reliable only when they draw from a very large, and theoretically infinite, number of contributors, and users must beware that each page may be more or less seriously damaged or corrupted by new interventions, unpredictably and at any point in time. The statistical paradox of the process is that, by relying on an abstract wisdom of crowds that can only be attained at infinity, the system is designed to be always wrong, to some extent, and yet almost always usable.4
The case of Wikipedia may be trivial and anecdotal, but the technical and cultural logics at work here are not. For the same logics of variability, interactivity, and open-ended aggregation of content are inherent in all digital technologies, and they affect all that is designed, controlled, or managed using digital tools—which is to say, almost all of our technical environment. Indeed, signs of Wikipedia’s functioning ricketiness already show—to different degrees—in most of today’s digital media objects, from texts and images to software and interfaces. Open-source software is built on the same participatory premises, but even mainstream, proprietary software now often follows similar strategies of endless and/or collaborative versioning. This is why the looks and functions of our own files—documents we used to think we owned—may change suddenly and incongruously, as the software we use (but increasingly do not own) is regularly updated, in some cases without our knowledge or consent. The same applies not only to cell phone operating systems, aircraft fly-by-wire software, or automobile controls, but also to utility grids, medical technologies and services, and nuclear reactor control systems. We live in a Beta world: Since all digital, networked appliances can be fixed at any time, most of them are released when they are far from perfect. In fact, none of these systems will ever be perfect. The very notion of perfection is meaningless in an open-ended environment. It is replaced by redundancy and endless fine-tuning: If one way to make the system work fails, try another one—or try again tomorrow.
This logic is often hidden or dissimulated on the assumption that many today may find it disturbing; yet it is not unprecedented. Quite to the contrary: For centuries trial and error was the end users’ only way to negotiate with an uncertain technical environment. Before the rise of print, scribes could, and often would, edit the manuscripts they copied.5 As a result, most circulating copies of the same text were different. The idea that an “author” may own or have rights over copies of his writing only came with identical reproduction in print and with Renaissance humanism. Today most physical objects—from soft drinks to cities—are designed before they are made, but before the humanist invention of design, most physical objects were made by craftsmen, conceived and developed on the fly in conversation with other communities of makers and users. And before the Industrial Revolution, approximation, not precision, ruled over every aspect of life. Leibniz and Newton may well have founded a new science of numbers, but they never had to catch a train at 6:25.
It may appear counterintuitive that the iron law of digital notation (at its base, a binary system consisting only of zeros and ones, with nothing in between) should spawn a new culture of technical ricketiness, yet anyone with even a cursory exposure to open-source software can attest to it. Unlike the organic approximation of hand-making, which was insular, accidental, and specific, today’s digital ricketiness is networked, systemic, and machine-controlled, thus adding a new artisanal dimension to McLuhan’s global village of electronics. But both preindustrial craftsmanship and digital crowdsourcing derive from and build on similar principles of community, collaboration, and often anonymity.
Architects have so far been quick to dismiss the participatory use of digital tools. Unlike software, buildings cannot be so easily “updated” after they are built. And design by committee, both in the socialist and in the corporate version, has a long negative reputation among architects. Yet, if buildings are not variable media, design notations are. Architectural design is pure information, a media object like texts, images, or music. And just like all other digital media, it can be produced collectively and interactively. And, indeed, it often is. Building Information Modeling software, the most significant Web 2.0 implement for architecture, was developed by the construction industry, and was originally devised for project management, not for design. Yet, as some critics and practitioners have pointed out, the use of BIM—particularly in the form of coordinated decision-making known as Integrated Project Delivery—implies a drastic redefinition of architectural liabilities and, ultimately, of architectural authorship among designers, contractors, and clients.6
BIM may be a tamed and conservative approach to participatory design. Collaboration in a BIM-based project is limited to invited technical participants, and interaction is based on consensus, not on open-ended and aleatoric accrual, as in the Wikipedia model. Architects may decry Integrated Project Delivery as a limitation to their design prerogatives and BIM as a ploy devised by the construction industry. But taken in its full, unmitigated import, the new digital way of making by aggregation is even more remote from, if not alien to, the design culture we have inherited from early modern humanism and from industrial modernity. As Alberti famously claimed on the eve of the Renaissance, design is conceived in the mind, expressed through drawings and models, then executed without change. Alberti’s notions of design and authorship countered the collaborative way of making that prevailed at the end of the Middle Ages, much as they are countered by the collaborative mode of use of today’s digital tools. Because of the way it works, digital making can more easily compound the thinking of many minds and evolve through the making of many hands. Just like the often disjointed and coarse textual patchworks that are the hallmark of Wiki writing, most things made by aggregatory versioning are unlikely to look smooth, polished, or elegantly finished. In the modern and Albertian tradition, architects liked to think that they were makers of form and that they were in control. In the new digital way of making, no one is in charge of the product, although someone may try to be in charge of the process—as the curator, supervisor, manager, or censor of the efforts of others.
Indeed, this would be the natural mode of use of contemporary parametric design systems—or at least the mode of use that most fully exploits the technology on which parametric systems are built. Yet it appears that most architects today ignore or stifle the participatory potential inherent in parametric tools, and they tweak them to second their authorial ambitions—to author their own designs, in the purest Albertian tradition. Many architects pride themselves on using open-sourced design tools, but few or none on authoring open-ended designs—architectural notations that others could fine-tune or modify at will. Likewise, architectural theory has so far been remarkably silent on the topic of digitally driven, collaborative interaction—both in the negotiated, BIM-like version and in the curated, parametric one—and thus has missed the most important recent development in digital culture and technology.7 Clearly, the digital turn has taken a direction that architects may be unlikely or unwilling to follow this time around.
For good reason. The current stage in the development of digital design technologies promises to be more disruptive than all previous ones. Beginning in the early 1990s, digital form-making and nonstandard seriality (the digital mass production of variations) have enriched the expressive repertoire of architectural design, and favored new formal and tectonic solutions that would have been inconceivable—or rather, unbuildable—without digital tools. But this revolution against the technical tenets of industrial modernity and the visual canons of architectural Modernism occurred within the mold of the Albertian, humanistic, and modern authorial tradition. The present participatory turn does not, since it threatens the very notion of architectural design by notation, and with that the raison d’être of the architectural profession. Architects would now need to take a bolder step backward in order to grasp the bigger picture, understand what is at stake, reset, refresh, and restart. Instead, it may be easier to proclaim that the digital turn has come to an end—and some are doing just that.8 That would mean that after pioneering and prodding digital change for almost twenty years, architects could now be left behind. We would be in good company: From shorthand typists and travel agents to managers of record companies and publishers of encyclopedias in print, the list of professions made obsolete by digital technologies is already long. The digital turn would then continue without us, most likely to our detriment.
1 Peter Eisenman, “Visions Unfolding: Architecture in the Age of Electronic Media,” Domus 734 (1992), 17-24; republished in Architectural Design 62 (1992), xvi-xviii; in The Invisible in Architecture, ed. Ole Bouman and Roemer van Toorn (London: Academy Editions, 1994), 144-49, and elsewhere; and “The Affects of Singularity,” Architectural Design 62 (1992), 42-45. Both essays republished in Eisenman, Written into the Void: Selected Writings 1990-2004 (New Haven, CT: Yale University Press, 2007), 19-24, 34-41.
2 William J. Mitchell, The Reconfigured Eye: Visual Truth in the Post-Photographic Era (Cambridge, MA: MIT Press, 1992).
3 Eisenman, “Visions Unfolding,” xvii.
4 On the Wikipedia “style of many hands,” see Carpo, “Digital Style,” Log 23 (2011), 41-52, and “The Bubble and the Blob,” Lotus International 138 (2009), 19-27, with further references and bibliography.
5 Deliberate textual “variances” were frequent in the copying of modern works in the vernacular; classical authors and theological sources were transcribed more faithfully. See Carpo, The Alphabet and the Algorithm (Cambridge, MA: MIT Press, 2011), 139-40, with further references and bibliography.
6 See Peggy Deamer and Phillip G. Bernstein, eds., Building (in) The Future: Recasting Labor in Architecture (New York: Princeton Architectural Press, 2010), in particular Bernstein’s essay “Models for Practice: Past, Present, Future,” 191-98; see also Bernstein, “A Way Forward? Integrated Project Delivery,” Harvard Design Magazine 32 (2010), 74-77.
7 The unlimited open-endedness of “aggregatory” digital making may be theoretically incompatible with the dual process of design and building, since even the most free-floating design notations must at some point be frozen in time in order to be physically built. Both in BIM and in parametric software, the scope of digital interactivity is in fact curtailed and adjusted to suit the specific requirements of building: in the case of BIM, by envisaging some digitally mediated mode of “design by leadership”; in the case of parametric software, by inherently splitting architectural authorship into separate layers of primary and secondary agency. But in a striking exception to this general trend, digital designers, while so jealously defending their authorial prerogatives against the intrusions of other social actors, are often happy to relinquish their control over form-making to the random mutations of supposedly self-organizing technical systems: the morphogenetic metaphor has been an important component of digital design theory from the very beginning and is responsible for a vitalistic, irrationalist, and sometimes covertly phenomenological and mystical approach to digital making, which remains strong to this day, particularly among a younger generation of new digital craftsmen. On the stylistic and design implications of these different approaches, see Carpo, “Digital Style” (on the “style of many hands”); “The Craftsman and the Curator,” Perspecta 44 (2011), 86-91; and The Alphabet and the Algorithm, 123-29 (on BIM collaboration and parametric interactivity).
8 For example, in a recent issue of AD, the design of new materials (including, more esoterically, the design of living materials suitable for building) is explicitly presented as the new technological frontier of architecture: See Neil Spiller and Rachel Armstrong, eds., “Profile 210: Protocell Architecture,” special issue, Architectural Design 81 (2011), 17. On the apparent tiredness of digital design theory at present, see Carpo, “The End of the Digital Era: The End of the Beginning and the End of the Project,” Le Visiteur 17 (November 2011), 181-84.