Limn: Utopian Hacks

Compiled from: Utopian Hacks - Limn, Author: Götz Bachmann

In a laboratory in Oakland, a group of elite and unorthodox engineers is attempting to reimagine what computers can do and what they should do. It is in this laboratory, located in Silicon Valley (or near it, depending on how you draw its boundaries), that my ethnography is based. The team gathers around an engineer named Bret Victor and is part of the Human Advancement Research Community (HARC) at YC Research, an industry-funded lab dedicated to open, foundational research. "Hacker" is not a term these engineers use much for themselves. As for many other engineers, a "hack" is at best a word for experimental work (as in "it's just a hack") or for using a technology for purposes other than those originally intended; it can also be a pejorative, referring to the results of amateurish, low-quality development. So when the engineers I study describe their work, "hacker" is not among the key terms they would choose. I want to argue, however, that some of their practices resemble hacking, albeit in a different domain. This article asks: how do these engineers hack the very imagination of what technology is and what it could be?

I make this argument by analyzing their practice, which, for lack of a better term, I call "radical engineering." Radical engineers fundamentally challenge existing concepts of technology (here meaning digital media): its basic characteristics, its purposes, and its possible futures. Their radicalism should not be confused with political radicalism, with "disruptive" radicalism, or with the radicalism of particular engineering outcomes. Rather, it places them outside the more obvious, self-evident, time-tested, or desirable fields of engineering. Their stance is so heretical that they often no longer call themselves "engineers" at all. But no other word quite replaces it. They may reach for terms like "artist" or "designer in the tradition of Horst Rittel," but both are unstable and prone to misunderstanding. After all, these individuals were trained in disciplines such as electrical engineering, mechanical engineering, computer science, or mathematics, and their work often requires solving highly complex technical problems.

Bret Victor's team is trying to establish a new medium. Achieving this is less a matter of a sudden flash of inspiration than of a permanent, stubborn process of working beyond what is currently imaginable. The lab takes existing technologies, such as projectors, cameras, lasers, whiteboards, computers, and Go pieces, and reconfigures them with new or historical ideas about programming paradigms, system design, and information design, along with a range of assumptions and visions about cognition, communication, social interaction, politics, and media. The team is building a series of operating systems for a spatial dynamic medium, each built on the lessons of constructing the previous one, with each construction taking about two years. The current operating system is named "Realtalk," and its predecessor was called "Hypercard in The World" (both names pay homage to historical, unorthodox programming environments: Smalltalk from the 1970s and HyperCard from the 1980s). Developing such operating systems involves writing and rewriting code and statements, a lot of conversation and even more moments of collective silence, spells of iteration and adjustment, the digestion of films, books, and numerous technical papers, and the building of dozens, in fact hundreds, of hardware and software prototypes.

Prototypes are everywhere in the lab, with new ones added every week. One month, a visitor could point a laser at a book in the library, and a projector would project the book's contents onto the wall beside her. A few weeks later, you would see people jumping around on the floor playing a game called "laser socks," trying to shine lasers onto each other's white socks. A few months after that, a table turned into a pinball machine made of projected light, with cat videos trailing each rectangle drawn on paper. Currently, the group is experimenting with "small languages" in spatial media: domain-specific programming languages based on paper, pens, scissors, Go pieces, or string, all with dynamic properties and thus able to directly steer computation or to visualize complexity. The point of all these prototypes is not dazzling technical complexity. Quite the opposite: their purpose is simplicity and simplification. As a rule of thumb, the fewer lines of code involved, and the simpler those lines, the more successful the prototype is considered to be.

Illustration by David Hellman (draft), imagining Dynamic Land, the next iteration of dynamic spatial media in 2017

Interesting as these prototypes are, they remain "working artifacts," forming "traps" for potential futures with their "illusions of self-movement." In Bret Victor's research group, the work of prototypes is to capture and showcase the potential attributes of a new, spatial, dynamic medium. One of its ideal attributes is simplicity, and the prototypes that demonstrate this attribute tend to be selected as successful. Every two years or so, the whole process produces a new operating system, which then allows for the construction of a whole new generation of prototypes, which typically (though not always) build on the capabilities of their respective current operating system while already exploring the potential of the next generation. The overall goal is to create a fundamental breakthrough, equivalent to the technological leap of the 1960s and early 1970s, when the fourfold introduction of microprocessors, personal computers, graphical user interfaces, and the internet fundamentally transformed computing by turning computers into a medium. Turning computing into a medium in the 1960s and 1970s meant confronting technology with technology: new computational capabilities were used to create a medium that conformed less to what people at the time thought computers "were" and more to what a dynamic version of paper might look like. In the work of Bret Victor's research group, this form of working against computation becomes radical.

The guardian of this endeavor, whether in spirit or in person, is Alan Kay, one of the most famous radical engineers and a key contributor to those breakthroughs in computing in the 1960s and '70s that Bret Victor's team is trying to match today. Consider Alan Kay's trajectory. In the 1960s, he began working in the newly established computer science department at the University of Utah, writing what is considered one of the boldest doctoral theses of all time, a wild technical dream of a new kind of computing. The thesis opens with a desperate cry from another radical engineer, Charles Babbage: "I wish to God these calculations had been executed by steam." After 250 pages of reflections on "reactive engines," it culminates in a fictional manual for a "Flex Machine": the first iteration of a series of ideas that would later peak in Kay's vision of the "Dynabook" (1972). While working on this thesis, Kay became one of the young members of a research community funded by the Pentagon's Advanced Research Projects Agency (ARPA) through its Information Processing Techniques Office (IPTO), which was then taking its first steps toward ARPANET. In the early 1970s, after a postdoc with John McCarthy at Stanford, Kay joined Bob Taylor's new research lab at Xerox PARC, where engineering legends like Lampson, Thacker, Metcalfe, and many others were building the ALTO system, the first system to network independent machines with advanced graphical capabilities.

Once the first iteration of the ALTO/Ethernet system (it is crucial to understand it as a system rather than as an independent computer) was up and running, it gave Alan Kay a powerful playground. Kay returned to some of his work from the 1960s, in which he had analyzed SIMULA (a then-obscure Norwegian programming language), and, together with Dan Ingalls and Adele Goldberg, developed a hybrid of programming language, operating system, and children's toy: Smalltalk. The first iteration of Smalltalk was an object-oriented experiment that aimed to model all programming from scratch on a distributed message-passing system; later versions abandoned this, and after initial success Smalltalk eventually lost the battle for dominance in object-oriented programming to languages like C++ and Java. In the mid-1970s, however, the ALTO/Ethernet/Smalltalk system became a hotbed of ideas about graphical user interfaces (GUIs) and many applications that are now commonplace. Kay and his "Learning Research Group" can thus be seen both as a lost holy grail of computing, destroyed by the computing models that capitalism cast into hardware and software, and as one of the key genealogical origins of the computing that later emerged. It is this dual significance that makes this work so unique and interesting to this day.
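Smalltalk's core idea, that all computation happens by objects responding to messages (including messages they do not recognize, via the doesNotUnderstand: hook), can be sketched in a few lines of Python. This is an illustrative analogy, not Smalltalk itself; the class names and messages here are invented for the example:

```python
# A toy sketch of Smalltalk-style message passing in Python.
# All names (MessageReceiver, Counter, "increment", ...) are invented.

class MessageReceiver:
    """Base object: all behavior is invoked by sending named messages."""

    def send(self, message, *args):
        # Look up a handler for the message by name; dispatch dynamically.
        handler = getattr(self, "on_" + message, None)
        if handler is None:
            return self.does_not_understand(message, args)
        return handler(*args)

    def does_not_understand(self, message, args):
        # Analogous to Smalltalk's doesNotUnderstand:, which lets an
        # object intercept messages it has no method for.
        return f"{type(self).__name__} does not understand '{message}'"


class Counter(MessageReceiver):
    def __init__(self):
        self.value = 0

    def on_increment(self, by=1):
        self.value += by
        return self.value

    def on_value(self):
        return self.value


c = Counter()
c.send("increment")             # state changes only via messages
c.send("increment", 5)
print(c.send("value"))          # 6
print(c.send("reset"))          # Counter does not understand 'reset'
```

The point of the sketch is the uniformity Kay was after: there is no operation on an object other than sending it a message, and even failure to understand a message is itself handled by a message.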

A whiteboard in Bret Victor's lab filled with Alan Kay's papers

Alan Kay's contribution to the history of computing resulted from a thorough disruption of the computing paradigms and imaginations of his time. Kay took up the unorthodox programming techniques pioneered by SIMULA, new visualization techniques developed by the Sutherlands, McCarthy's desire for "private computing," Wes Clark's "lonely machines," the augmentation experiments of Doug Engelbart's group, new ideas about distributed networks, and more. These technologies were not yet common in the emerging fields of software engineering and programming, but they had begun to circulate in the elite engineering circles in which Alan Kay worked. Kay combined them with the pedagogical, psychological, and mathematical ideas of Maria Montessori, Seymour Papert, and Jerome Bruner, and further charged them with Marshall McLuhan's then-fashionable media theory. Kay also understood early on the implications of what Carver Mead called "Moore's Law": an exponential line of ever smaller, faster, and cheaper forms of computation, triggered by mass-produced integrated circuits and sustained by positive feedback between technological development and the creation of new markets. Alan Kay recombined all of these ideas, desires, technologies, and opportunities. The result was a major contribution to a new and emerging socio-technical imagination, in many ways the imagination of the computer as digital medium that we have today. Alan Kay's work can therefore serve as a benchmark for radical engineering, one that allows us to critique the current stagnation and potential decline in the quality of most of our imaginations of technology.

But is it really that easy? Is radical engineering merely the outcome of a bit of remixing? Clearly it is a more complex process. One of the most compelling descriptions of this process comes from another legendary radical engineer, the aforementioned Doug Engelbart. In 1962, a few years before Alan Kay began his career, Engelbart devised a project for his own research group at the Stanford Research Institute, funded by the U.S. Air Force, aimed at redesigning "H-LAM/T": the "Human using Language, Artifacts, Methodology, in which he is Trained." This H-LAM/T was always already half machine, which allowed it to be enrolled in a continuous process of "augmenting human intellect." Engelbart believed the latter could be achieved through "bootstrapping." In Silicon Valley, this term can mean many things, from booting up systems to financing startups, but in the context of Engelbart's work, "bootstrapping" is "…an interesting [recursive] assignment of developing tools and techniques to make it more effective at carrying out its assignment. Its tangible product is a developing augmentation system to provide increased capability for developing and studying augmentation systems." Like Moore's Law, this is a dream of exponential progress arising from nonlinear, self-propelling feedback. Could anything be more Californian?

To ensure that Engelbart and English's description reads as more than a cybernetic daydream, we need to remind ourselves that they are not talking only about technological artifacts. Simply building prototypes out of prototypes is not a wise move in radical engineering: once put to use, prototypes tend to collapse, so a toolkit made of prototypes is of little use for developing further prototypes. "Bootstrapping" as a process can therefore only work if we assume it is part of a larger process, in which "tools and techniques" develop over long stretches of time together with social structures and local knowledge. The process is recursive in a way similar to the "recursive publics" that Chris Kelty describes in free software communities: in both cases, developers create the socio-technical infrastructures through which they communicate and collaborate, which then spread into other areas of life. Kelty shows that such recursive effects are not simply the magical result of self-reinforcing positive feedback. Recursive processes rest on politics, resources, qualified personnel, care, and guidance. In short, they must be constantly produced.

Bootstrapping can thus take different scopes and directions. Ambitious as Engelbart and English's project sounded, at least in the 1960s they still believed that bootstrapping within a single research group would achieve the desired effect. Alan Kay's Learning Research Group expanded this setting in the 1970s through pedagogy and McLuhan's media theory: by bringing in children, they aimed at recursive effects beyond the laboratory, with the long-term goal of drawing the whole world into a bootstrapping-like process. Bret Victor and his research group approach bootstrapping like a multi-layered onion. Who should be part of it, and when, is the subject of intense internal discussion. Once the group launches "Dynamic Land," it will enter the next phase. Meanwhile, bootstrapping takes many forms. Prototypes take part in the bootstrapping process as pointers, feelers, probes, improvised iterations, scaffolding, operating systems, blockers, performances, imagined test cases, demos, and so on. The larger bootstrapping process, in fact, encompasses a multitude of prototyping techniques. In the lab, together they produce the feeling of sitting inside a brain. The lab as a whole, its walls, tables, whiteboards, roof, machines, and the people who live inside it, serves as the first demonstration of the alternative medium.

A detail from the HARC lab: the top image shows Alan Kay in white jeans; the bottom image is Engelbart's 1962 paper, posted by Bret Victor on a wall in San Francisco's Mission District

Iteratively building the series of operating systems may require a significant amount of engineering in the traditional sense: writing kernels in C, for example, or process hosts in Haskell. The overall effort, however, is decidedly not technology-driven. In the future spatial medium, computation should recede. Computation will play the role of infrastructure: just as books need light but do not mimic the logic of light, the medium can draw on the computational possibilities provided by the underlying operating systems when necessary, but it should not be driven by them. Instead, the dynamic spatial medium should be driven by the characteristics of the medium itself, which should in turn drive the technology. Those characteristics await discovery through the bootstrapping process. In the group's words, both the medium and the way they produce it come "from the future." This future is not given but depends on the medium the group is imagining, and therefore on the properties of the medium that the group is exploring, selecting, and practicing. On the one hand, technology gives rise to a new medium, which is imagined as shaping the future; on the other hand, the future is imagined as shaping the new medium, which in turn should drive the technology.

While most of the group's work consists of making things, thinking is part of the work, too. The latter lets the engineers understand what the prototype work reveals; it also gives the lab direction, inspires its endeavors, and plays a part in securing funding. So far, the process has produced a series of interrelated and constantly evolving ideas and goals. One cluster, for example, seeks new ways to represent and understand complex systems. A second aims at the knowledge to be gained by removing the limitations of contemporary media (such as screens, which produce forms of knowledge that are hard to grasp, like systems made of millions of lines of code typed into screens and then stared at). A third cluster explores new forms of representing time, while a fourth explores ways to incorporate physical properties more fully into spatial media systems. All these clusters bleed into one another, with goals and hypotheses moving up and down the "ladder of abstraction." A larger goal, seemingly echoing the media-theoretical thinking of Nietzsche, McLuhan, or Kittler with engineering means, is to make possible new ideas that have so far remained "unthinkable" because of the inadequacies of contemporary media. Enhanced embodied forms of cognition and better ways of generating ideas together might heal the loneliness and suffering that are often part of deep thinking. As one internal email put it, all of this together might "prevent the world from splitting apart."

One way to understand what is happening here is to frame all of this as another form of "hacking." When you hack, you can be said to hack things apart or to hack things together. Hacking apart can be seen as a practice that grows out of refusing to accept prior black-boxing. Transposed into radical engineering, hacking apart means refusing to accept the black boxes of current technological paradigms, such as screen-based computing, or of ready-made futures like the "smart city," the "smart home," or the "Internet of Things." Instead, you open such black boxes and dissect them: assumptions about what counts as technological success, and assumptions about future technological progress that align with certain versions of social order, often combined with unsavory business opportunities. The black boxes likely also contain ideas about distinct kinds of engineers, programmers, designers, managers, and other roles. Having taken all of this apart, you might inspect the elements, throw many of them away, warp others, add some from elsewhere, and grow a few yourself. You would study different, often historical, technological paradigms, as well as other ideas about what is technically possible (and when), different ideas about social order, the good life, and the problems worth solving, other books worth reading, different uses of the power of media, and different views on which kinds of people, professional or otherwise, should be responsible for all of this. If you are lucky, you have the conditions and capabilities to do all of this in a long, nonlinear process, also known as bootstrapping, in which you go through many iterations of hacking apart and hacking together, creating fundamentally different ideas of what technology should and could do, helping to shape those ideas through a range of means and practices, and demonstrating to yourself and others that some utopias may not be so far-fetched after all.
This is what radical engineers do.

While they have gone to considerable lengths to escape the fantasies of technological solutionism, they have not abandoned the engineering approach of addressing problems by building things; they have developed what one might call "radical media solutionism," even if their attitude toward the latter is ambivalent. To avoid misunderstanding: neither I nor the engineers I study believe that a true future can be cobbled together by a group of engineers in Palo Alto or Oakland alone. But I do believe that radical engineers like Engelbart, Kay, or Victor's research group, from their specific, highly privileged positions, add something crucial to the complex play of forces that drives us toward the future. My ongoing fieldwork has made me curious about what is being produced here, and many visitors to the lab agree that what first "arrives" is indeed stunning and hard to believe. If we take the group's self-understanding seriously, their technologies are, much like hacks, merely temporary, makeshift stand-ins for something bigger that may one day come. Radical engineers would also be the first to point out that such makeshift solutions, if development stops and premature concretization sets in, can become hacks in the pejorative sense. In their telling, this is exactly what happened 40 years ago, when prototypes left the lab too early and entered the worlds of Apple, IBM, and Microsoft, where a multitude of poor decisions led to the people now staring at smartphones.

In such stories, radical engineers may indulge in a retrospective "what could have been," mixed with traces of distinction from "normal" engineers. And even if they distance themselves from Silicon Valley's startup culture, their insulation from the "Californian ideology" may not always be 100 percent tight. In fact, they may supply the Silicon Valley mainstream with much-needed heresies. These radical engineers are, however, potential allies in breaking the liberal, authoritarian, and powerless fantasies that Silicon Valley so often offers us, be it the "bullshit internet" or the "garbage internet." The conceptual poverty of most of the futures currently on offer in Silicon Valley can certainly be made visible from the perspective of critical theory, social movements, or political-economic analysis. But if we measure Silicon Valley against the utopias of radical engineering, its intellectual timidity also becomes apparent, a timidity that is only masked by the disruption it causes.

Alan Kay in a Japanese manga created by Mari Yamazaki

About the Author: Götz Bachmann is a professor of digital cultures at the Institute for the Culture and Aesthetics of Digital Media at Leuphana University in Germany and the convener of its bachelor's program in Digital Media. He is currently also a visiting scholar at Stanford University. As an ethnographer, he has conducted fieldwork among warehouse workers, salespeople, and cashiers in Germany, as well as among Nico Chuu in Japan. He has also written for the German children's comic series KNAX.
