continent. maps a topology of unstable confluences and ranges across new thinking, traversing interstices and alternate directions in culture, theory, biopolitics and art.
Issue 8.1-2 / 2019: 1-4

Letter from the Editors

Jamie Allen, Anthony Enns

Any sufficiently advanced technology is indistinguishable from magic. 

 — Arthur C. Clarke

 

Truth is a matter of the imagination. The soundest fact may fail or prevail in the style of its telling: like that singular organic jewel of our seas, which grows brighter as one woman wears it and, worn by another, dulls and goes to dust. Facts are no more solid, coherent, round, and real than pearls are. 

 —Ursula K. Le Guin

 

These past few years, the fairly ancient concept we call “truth” has been bandied about the place quite a bit. Our social trust barometers, for a long time calibrated with “politician” at one end and “scientist” at the other, have been thrust into stormy weather. People like Donald Trump and Richard Dawkins have buried the needle into extremes of rhetorical squall, political uproar and techno-scientific demand, operationalising belief and fact in excessive ways — destructive of both self and others. The rest of us, muddling through this other ancient concept we call “modern life”, try to poise ourselves somewhere in between because, in practice, things are never entirely subject to whim, spurious input or personal opinion, but neither are they always impermeably empirical, tested and proven. Sometimes, we check our references.

        Living in conditions of technological saturation requires that we negotiate, constantly, the encroachment and/or acceptance of new social trusts that are enabled, subverted and possibly even perverted by technologies. Jacques Ellul famously argued that modern technologies convey “the feeling of the sacred,” as they are “always joined to mystery and magic,” and he diagnosed this quasi-religious faith as an expression of the human “power instinct,” as technologies effectively transform average citizens into “heroes, geniuses, or archangels.” Anthony Giddens similarly argued that our trust in technology tends to increase the more the complexity of technological systems surpasses our understanding, as “faith is sustained in the workings of knowledge of which the lay person is largely ignorant.” However, he also emphasized that our inability to understand these systems can lead to resistance, as “ignorance always provides grounds for scepticism or at least caution.” In other words, users tend to trust technologies that grant them a sense of empowerment, and this trust often resembles a kind of religious conviction when users are dependent on technological systems that they do not fully understand. This condition can inspire utopian fantasies, in which technologies are imagined as the solution to all of our problems (and potentially even our salvation), yet the inability to understand can also inspire dystopian fantasies, in which technologies are imagined as the cause of our problems (and potentially even our extinction).

        As a result, our notions of “trust” and “truth” are increasingly mediated, modelled, reputed, and digitised. Our flawed and all-too-human approximate and atmospheric barometrics are being interwoven with the churning of symbols and representational interfaces, making the trust that we bestow upon technological systems themselves all the more essential to investigate. How do we know when something is working ‘as it should’? What does it mean for a technical system to be ‘true’ in its operation? Can we grant machines the responsibility to be trustworthy and to take care (of us) ‘as they should’? What, then, would it mean to be in a “post-truth” condition, politically, socially, economically, and as a central concern of this double issue, dear reader, technologically? What responsibilities underlie the projection of certitude and veracity — part of the ‘magical’ enchantment of the technological — as we grant authority to technoscientific distance more than we do to personal proximity, to one another? Technologies, as they advance, may also be getting better at emotional expediency, and better at lying.

At continent., our conversations during these times have been ever-concerned with the composition of publics that internet publishing, and our acting as an experimental publishing collective, can effect. What truths, co-composed, do we make with our friends? We deliberate often on the responsibility we have to translate the messages of authors, speakers, institutions and other friends, and the means and interests we have to influence and modulate, poeticise and figurate the supposed ‘raw data’ of communications, ideas, information, interviews, and authorship. How far, how close and how tangentially should our experiments in publishing and media-making stray, and what can these media (online and otherwise) allow us to say about how ‘truth’, such as it is today, is understood?

 

Lie Machine at the LEAP (Lab for Emerging Arts and Performance) Berlin, as part of the exhibition "Obsessive Sensing" in 2014. 


Through a number of research trajectories, including this continent. issue itself and an online group started in 2014, this issue's co-editor Jamie Allen has trepidatiously instigated reflection on and archaeologies of those technologies that we suppose to be truth-telling devices — which turns out, in some sense, to be all of them. He is grateful for and to the Media Archeology Lab, Counterpath publishers, Dateline Gallery Denver, Isaac Linder, LEAP space Berlin, Moritz Greiner-Peter, Jussi Parikka's Archaeologies of Media and Technology (AMT) research group, and the Critical Media Lab Basel — all of whom have supported and made public this research in various ways, including the Functional Failures, Operative Fakes and Tenuous Techniques sessions in Switzerland. This work focused principally on voice- and polygraphy-based ‘lie detection’ techniques, as well as therapeutic and testing devices like the theratest machine, pseudo-religious objects like the Scientology E-Meter, and other simple machines variously known as ‘mood quantifiers’ and ‘love detectors’. Allen’s The Lie Machine (2014) is a recreation of an early instrument for processing the voice with Voice Stress Analysis algorithms (a presentation for the Gaîté Lyrique programme “Merveilleux Scientifique”, curated by Marie Lechner and Fleur Hopkins, is upcoming in 2020). The algorithm gained notoriety recently in the U.S. trial of George Zimmerman on a charge of second-degree murder in the death of Trayvon Martin. Zimmerman passed a Computer Voice Stress Analysis (CVSA) test and was subsequently cleared of all charges.

The familiar way that such devices play on our expectations of technical machines to read the world objectively, at times joyous and playful and at other times dangerous and maligned, evolved into a growing and expansive, pervasive and interpretable idea of “Apocryphal Technologies”. It is an idea that hopes to provoke discussions and interventions into what it is, precisely, that we expect from technologies, and what these technologies can, should or do deliver by way of truth. “Apocrypha” are works of unknown authorship or of doubtful origin, a term derived from the Medieval Latin adjective apocryphus, meaning “secret, or non-canonical”, and ultimately from the Greek adjective ἀπόκρυφος (apokryphos), “obscure”, from a verb meaning “to hide away”. Biblical apocrypha are those texts excluded from the canon, oftentimes submitted by authors with particular intentions or provocations in mind — kings or landowners attempting to have their territories declared sacred, for example, or people attempting to prove or obtain sainthood.

By extension and extrapolation, “Apocryphal Technologies” has become a bottom-up, peripatetic research project into methods of creating technologies and technological imaginaries for things that do not “work” despite being widely held and in circulation as “functional”—technologies not just of dubious authenticity but also those having spurious or false content and hidden or suspect motives. While the term “technological imaginary” is often used to describe how technologies are invested with utopian aspirations that prevent users from sensing legitimate disappointments, frustrations, or malfunctions, “apocryphal technologies” refers to those technological imaginaries that are inherently suspicious or fraudulent. It also refers more specifically to technologies that are actually designed, built, and implemented (as opposed to the purely theoretical or speculative) and that are openly endorsed by their designers, promoters, and users (as opposed to the obviously impossible or fictional). The apocryphal nature of these technologies becomes particularly evident when they are used in the service of cryptic, murky offerings, such as truth verification, bodily enhancement, cognitive amplification, or religious rituals, yet in a sense all technologies contain at least some element of apocrypha, as they always promise functions or benefits that exceed their limitations in the here and now. An awareness of these limitations raises fundamental questions about the reliability of technology by exposing the (inadvertent or intentional) misrepresentations and the (naïve or willful) misinterpretations that surround technological developments. It also challenges teleological narratives of technological progress by exposing the social, political, and economic forces that determine whether a particular technology obtains and retains a sense of legitimacy and authenticity within a given context.

This special issue attempts to examine these forces by gathering together contributions, constellations, and networks of media that describe, highlight, and challenge the beliefs inspired by technological innovations and failures. It is in light of the numerous and diverse submissions we commissioned and received that this has become a double issue, mimicking, appropriately perhaps, the real/imaginary dichotomies that this theme provokes. We have gathered together a wide range of contributions — scholarly works, as well as media and artistic contributions — that resonate with or criticize some variation on the concept of “apocryphal technologies,” giving new views on this topic and extending prevailing discourses, practices, and histories of media, art, and technology to explore how media-technical fields intermingle with the realms of philosophy, psychology, religion, and myth. The gatherers, Anthony Enns and Jamie Allen, were aided by the continent. editorial friendship group — Nina Jäger, Matt Bernico, Isaac Linder, Mayssa Fattouh, Maximilian Thoman, Bernhard Garnicnig, Paul Boshears.

What we hope you will trust, dear reader, is that we intend these works to provoke us all to “question the authority”, as a panoply of skateboard stickers cautioned us in the 1990s. Most particularly, we mean a questioning of the continuously expanding orthodoxy of proliferating technological know-how, applications and rhetoric, while keeping a keen eye on how trust, veracity and intention remain important and require sophisticated, honed literacies. As both Ursula and Arthur call out at the opening of this letter, fact and technology are realms riveted by emotion and magic, but this does less to reduce the importance of seeking and aligning with ‘truth’ than it does to elevate the importance of imaginations — technological and otherwise.

 

Yours,

Jamie Allen & Anthony Enns

Co-Editors

continent.