SUPERVISION—
OPTICS & ETHICS

PREFACE

Supervision examines the optics and ethics of technologies like photogrammetry, LiDAR, volumetric capture, Neural Radiance Fields, and other innovations in computer vision that capture bodies, objects, and environments to reconstruct virtual models. These models are used for countless purposes, including cartography, transportation, architecture, manufacturing, medicine, forensic criminology, defense, historical preservation, and entertainment. While the underlying processes—Capture and Reconstruction—are powerful and seductive tools, they also raise a number of concerns. This text begins with an examination of the origins of Capture and Reconstruction, magnifying processes, applications, and underlying ideologies. It focuses on the ethical questions that have been raised in relation to the tools—privacy, bias, impact, control, deception. Finally, the text offers a framework to guide future engagement with these technologies.

Note.—This text follows the structure of Spinoza’s Ethics.

Coordinates, points——and lines give form to a virtual cloud of ideas.

Throughout, original text from the Ethics is in italics.

This is a post-truth work.

What is human, what is machine?

What is stolen, what is hallucinated?

Can you tell?













PART I.

CONCERNING SUPERVISION

DEFINITIONS.

I. It starts with a desire for control. A desire to know every point on a surface. To fix a thing in a moment. Capture it. Own it. Manipulate it. This is the impossible desire that led us here.

II. What lust, the dream of seeing all sides. Knowing all dimensions. Declaring a thing finite and calculating its boundaries, as if it had them.

III. The power of extraction. To cleave a thing from its context. It evaporates into a cloud. Condenses again at will. At the coordinates of our choosing.

IV. Collecting and classifying each attribute until the world is drained of mystery. An assembly of known surfaces.

V. It is all skin. Shroud. Mask. Mine to modify. To reconfigure.

VI. By Supervision, I mean the infinite eye—that all-seeing impulse, which seeks to know the value of every mote of dust. To label all of boundless existence. And predict every future.

Explanation—I say absolutely infinite, not infinite after its kind: from electrons, to molecules, to cells, to bodies, to cities, to planets, to solar systems, to galaxies, to universes. Capture everything and more. Contain it in that model that makes us God.

VII. By Supervision, I mean computer vision.

VIII. By control, I mean manipulation. The power to enforce fixity. Or, the power to change things, change minds, change actions. Shape the lenses that we look through. Grind their glass surfaces with the tools of our moment.

Explanation—Manipulation is distortion. Data is curved, yet maintains the force of truth that bends behavior in its wake.

AXIOMS.

I. Everything which exists, can be captured.

II. Yet nothing can be captured in reality.

III. Capture is an attempt to grasp a subject, to contain it. Whether within a cage or through a camera or inside the mind, as soon as the gesture to subsume is made, the captive changes. It changes in quality and in time. It is a matter of physics.

IV. It is broken. Shattered into tiny pieces. Obliterated to dust. A point cloud.

V. Its reconstruction is a ghostly apparition. It shares a likeness with its source. But it deceives us. It is a collection of false fragments.

VI. Breathe them in. The sensation is not the subject. After many years, silicosis may follow. Feel it in your chest.

VII. A true idea must correspond with its ideate or object, but we are firmly in the post-truth era.1 Capture and Reconstruction are models for our defactualized2 aesthetic. If a thing can be captured, its reality is already destabilized.

PROPOSITIONS.

Proposition I. Virtual space is saturated with the allure of infinite potential.

Proof.—Before the digital technologies that we associate with virtuality were developed, the virtual was a philosophical concept that stood for the field of all unactualized ideas and events.

Proposition II. While not actual, virtual space is real.

Proof.—It does not vary by degree from physical space; it is of a completely different kind.3

Proposition III. The virtual cannot be represented or controlled.

Proof.—It is a space in constant flux: “Whatever the breaks and ruptures, only continuous variation brings forth this virtual line, this virtual continuum of life, ‘the essential element of the real beneath the everyday’.”4

Proposition IV. Ideas, images, events, anything not yet actual, populate this extensible plane, endlessly evolving, connecting, and intersecting in new configurations.

Proof.—“We know that the virtual as virtual has a reality; this reality, extended to the whole universe, consists in all the coexisting degrees of expansion (detente) and contraction. A gigantic memory, a universal cone in which everything coexists with itself, except for differences of level. On each of these levels there are some ‘outstanding points’, which are like remarkable points peculiar to it. All these levels or degrees and all these points are themselves virtual. They belong to a single Time; they coexist in Unity; they are enclosed in a Simplicity; they form the potential parts of a Whole that is itself virtual. They are the reality of this virtual.”5

Proposition V. The virtual is a network of all possible relations.

Proof.—Bergson’s concept of the virtual served as the basis for Deleuze and Guattari’s plane of immanence: “there is a pure plane of immanence, univocality, composition, upon which everything is given, upon which unformed elements and materials dance that are distinguished from one another only by their speed and that enter into this or that individuated assemblage depending on their connections, their relations of movement. A fixed plane of life upon which everything stirs, slows down or accelerates”6 (IV.).

Proposition VI. Phase space precludes hierarchies. Only potential and change.

Proof.—Phase space could be seen as a diagrammatic rendering of the dimension of the virtual. “The organization of multiple levels that have different logics and temporal organizations, but are locked in resonance with each other and recapitulate the same event in divergent ways, recalls the fractal ontology and nonlinear causality underlying theories of complexity.”7

Corollary.—The virtual is a complex and dynamic system.

Proposition VII. The virtual has a digital twin.

Proof.—It was born when the term virtual was first introduced into computer engineering in 1959. The twin is subservient to computational processes. Its complexity and self-actualizing potential are limited: “The sole function of the virtual memory is to increase machine speed, by increasing the efficiency of other devices.”8

Proposition VIII. There is investment in the fantasy that the two virtuals are interchangeable. Identical twins.

Proof.—See any escapist tech-billionaire vision.

Note I.—They are not the same.

Note II.—The computational appropriation of virtuality undermines the differential quality that defines it: “all we need to do is to sink the floating plane of immanence, bury it in the depths of Nature instead of allowing it to play freely on the surface, for it to pass to the other side and assume the role of a ground that can no longer be anything more than a principle of analogy from the standpoint of organization, and a law of continuity from the standpoint of development.”9 The virtual—point clouds, models, digital twins, simulations—is now a highly representational space of taxonomic organization that reinscribes colonial strategies of control and supervision.

Is it possible to restore the pre-digital sense of the virtual? It would require an embrace of the indeterminate, to call it forth from tendencies to control, so that other possibilities may emerge. As Fred Moten and Stefano Harney urge in The Undercommons, “we owe each other the indeterminate.”10 And it is always there, even if obscured by representation, by quantification, by abstraction: “abstract space relates negatively to that which perceives and underpins it—namely, the historical and religio-political spheres. It also relates negatively to something which it carries within itself and which seeks to emerge from it: a differential space-time.”11 Differential space is a fascinating model. It operates according to the calculus of derivatives as opposed to normals. Derivatives are tangential rather than perpendicular to a curve. They are anti-specimen-pins, ramps measuring rates of change instead of fixing in place. They move with, alongside, nearby. Differential space is the space of rhythm, the space of life, of becoming.12 To access this space, this mindset shift, first we must understand

II. The Nature and Optics of Capture

III. The Nature and Optics of Reconstruction

IV. Of Technological Acceleration, or the Ethics of Supervision

V. Of the Ethics of Supervision, or Quantum Erotics.

In The Order of Things, Michel Foucault undertakes an archeology of the human sciences, tracing the changing structures of knowledge from the Classical period to Modernity in Western civilization. He declares that orders of knowledge are reconfigurable, and each configuration constitutes a discrete episteme. Foucault introduces the term episteme to describe historically contextual preconditions of knowledge and discourse: “In any given culture and at any given moment, there is always only one episteme that defines the conditions of possibility of all knowledge, whether expressed in a theory or silently invested in a practice.”13 While each episteme determines how speech operates and how representation is constructed, it is also subject to alteration by its own output. Foucault emphasizes this dialectical formation of epistemes: “What is essential is that thought, both for itself and the density of its workings, should be both knowledge and a modification of what it knows, reflection and a transformation of the mode of being of that on which it reflects.”14 He traces the emergence of three modern discourses—biology, economics, and linguistics—which, at his moment, control truth in the human sciences.

Proposition IX. Is this self-critical reconstruction?

Proposition X. Or self-aware construction?

Proof.—One of the most powerful forces of Modern Europe’s epistemic formation, perhaps the one that belies those identified in The Order of Things, is cartography. In his lecture, Of Other Spaces, Foucault acknowledges that “starting with Galileo and the seventeenth century, extension was substituted for localization.”15 Europe’s invasion of other territories led to the ‘discovery’ of unknown organisms, unforeseen opportunities for industrial growth, and unfamiliar languages, all of which intensified cultural focus on the three areas that constituted the episteme. The occupation of foreign places required “the commodification and bureaucratisation of everyday life, namely making space mathematical and ordered (challenging the indigenous ordering of space) in such a way as to render the colony most efficiently known and governable.”16 The utter systematization of geography constitutes a form of what Gayatri Spivak calls epistemic violence;17 it erases other ways of knowing space.

Note.—The Cartesian model of abstract coordinate space replaces haptic, embodied space, and the map becomes a technology for remote control: “In his work on the making and circulation of scientific knowledge, Bruno Latour has used the term immutable mobile to characterize those material agents that permit scientific discourse to sustain its claims of empirical warranty and repeatable truth in the absence of eyewitness evidence. The map is a perfect exemplar of the immutable mobile: a container of information gathered at specific locations, returned to a ‘centre of calculation’, and then placed once more into circulation as a vehicle and instrument of scientific knowledge and further hypotheses.”18 Importantly, these maps do not only function as representations of space, they terraform space itself, determining its modes of inhabitation.

As Denis Cosgrove explains in Geography and Vision: Seeing, Imagining and Representing the World, “geographical representations—in the form of maps, texts and pictorial images of various kinds—and the look of landscapes themselves are not merely traces or sources, of greater or lesser value for disinterested investigation by geographical science. They are active, constitutive elements in shaping social and spatial practices and the environments we occupy.”19

Proposition XI. The grid organizes behavior and constricts the imagination.

Proof.—Postcolonial scholarship emphasizes the importance of examining maps and their constructive processes as a foundation for resistant strategies.

Another proof.—In The Wretched of the Earth, Frantz Fanon argued that “the colonial world is a world divided into compartments … Yet, if we examine closely this system of compartments, we will at least be able to reveal the lines of force it implies. This approach to the colonial world, its ordering and its geographical layout will allow us to mark out the lines on which a decolonized society will be reorganized.”20

The domination of physical space continues, but it also encounters innumerable frictions: decolonial efforts, the unwieldy environmental upset of climate change, even advances in mathematics and science. Surprisingly, all of these favor differential conceptions of space over Cartesian models. These variables make totalizing efforts slow and inefficient: “No spaces can be controlled, inhabited or represented completely. But the map permits the illusion of such possibilities. Mapping is a creative process of inserting our humanity into the world and seizing the world for ourselves.”21 The digital is a computational map that arises to satisfy the project of spatial and social control, which can never be fulfilled in the analog.

The most explicit carryover from European Imperialism to virtual space is its tendency for geometric control. The perfect grids of two-dimensional pixel displays, for example, reinforce the colonial mindset that space can be neatly subdivided and programmed. 3D Reconstruction and modeling software is based on this same Cartesian coordinate system. Likewise, it operates according to the same spatial ideology that made a totalizing map of the world conceivable.

Another proof.—At least in theory, virtual space has no limit, no absolute scale, it allows the axial view and the extensible grid to continue on forever.

Note.—“Geometry, specifically the radial axis and the grid, underpinned both scientific cartography and modern urban form. Their power and historical endurance in both the map and the city lies in their combination of practical and symbolic efficacy. The circle’s 360 degrees generate a ‘centre enhancing’ axial form focused on a single point. Functionally and symbolically, this extends power panoptically to the horizon, encompassing a potentially infinite territory … The alternative geometrical form shared by urban planning and mapping is the grid or chequerboard of orthogonal lines crossing at right angles. While radial axes enhance the centre, the grid is ‘space equalizing’, infinitely extendable over the surface and privileging no single point, but rather reducing each to a unique coordinate.”22

Today, digital databases reduce information—and lives—into coordinates to be rearranged within a deterministic index. This classification process removes the friction of complex events and beings, allowing them to be efficiently calculated.

Proposition XII. Classification is the process of establishing a network of distances and proximities.

Proof.—Importantly, Foucault suggests that the primary function of classification systems is to provide generalizations which allow different individuals to point to common concepts.23 He describes two classification processes: the System and the Method. The System makes “total comparisons, but only within empirically constituted groups in which the number of resemblances is manifestly so high that the enumeration of the differences will not take long to complete.”24 The Method, on the other hand, selects “a finite and relatively limited group of characteristics, whose variations and constants may be studied in any individual entity that presents itself.”25 In either case, order emerges as a temporary arrangement of segmented information and predetermined rules: “For it is not a question of linking consequences, but of grouping and isolating, of analyzing, of matching and pigeon-holing concrete contents; there is nothing more tentative, nothing more empirical (superficially, at least) than the process of establishing an order among things ... A ‘system of elements’—a definition of the segments by which the resemblances and differences can be shown, the types of variation by which those segments can be affected, and, lastly, the threshold above which there is a difference and below which there is a similitude—is indispensable for the establishment of even the simplest form of order.”26 (IV.lv.)
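
To make the mechanics of this ordering concrete, here is a minimal sketch in Python; the attribute vectors and threshold are invented for illustration. Entities are reduced to segments of data, a pairwise distance matrix establishes the network of proximities, and a single arbitrary threshold decides where similitude ends and difference begins.

    # A provisional, rule-bound order: entities as attribute vectors,
    # classification as thresholded distance. All values are illustrative.
    import numpy as np

    specimens = np.array([
        [0.20, 0.80],   # hypothetical attribute vectors ("segments")
        [0.25, 0.75],
        [0.90, 0.10],
    ])

    # The network of distances and proximities: a full pairwise matrix.
    diffs = specimens[:, None, :] - specimens[None, :, :]
    distances = np.linalg.norm(diffs, axis=-1)

    THRESHOLD = 0.3  # above: "difference"; below: "similitude"
    similitude = distances < THRESHOLD
    print(similitude)  # an adjacency matrix, grouping and isolating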

Proposition XIII. Order is always produced from preexisting biases.

Proof.—Recent scholarship has dissected algorithmic classification and its implications. For instance, in Algorithms of Oppression, Safiya Noble argues that Google’s search engine algorithms are not neutral; quite the opposite, they prioritize corporate interests, automate cultural biases, and circulate damaging stereotypes. The resulting representations cause harm to those represented as well as those forming opinions about different races, genders, religions, and other marginalized groups.

Corollary.—There is trust in the archive.

Note.—Most users consider the first page of results to be not only the most relevant links, but also the most credible sources on a given topic. At the same time, many users struggle to differentiate between sponsored content or advertising and unpaid results. Consequently, through services like AdWords, special interests, including Google, have the power to visually and ideologically sculpt topics for their own profit. In this private information system, the power to frame a subject goes to the highest bidder.

Proposition XIV. Dominant operating systems encode their rules.

Proof.—These prejudiced values are deeply embedded in most popular browsers and software applications; they perpetuate and even catalyze bigotry, exploitation, and violence. To support this argument, Noble dissects white nationalist Dylann Roof’s terrorist attack on Mother Emanuel African Methodist Episcopal Church in 2015, and his claim that the attack was motivated by online research of the phrase “black on white crime”27 (III.xxviii.). ProPublica’s recent report on machine bias in criminal risk assessment software reveals that racism is not only affecting public opinion through search engines, it is leading to unfair sentencing decisions in the courts.28 This recalls Allan Sekula’s analysis of how photography was used to classify certain morphological traits as indicators of criminality.

Corollary I.—In The Body and the Archive, Sekula focuses on the research practices of Alphonse Bertillon and Francis Galton, two men “committed to technologies of demographic regulation.” Sekula analyzes their differing methodologies, postulating each as a conceptual framework, a pre-digital algorithm for identifying criminality. Bertillon’s indexical method, he argues, attempts to locate aberrations or outliers through comparison, while Galton’s compositing practice formulates general criminal types.

Corollary II.—Sekula initially defines the archive as a “unified system of representation and interpretation [which] promised a vast taxonomic ordering of images of the body.” He also addresses the important circulatory function of “the archive as an encyclopedic repository of exchangeable images.”

Proposition XV. The archive reinforces hierarchies through linguistic and spatial organization.

Proof.—“We can speak then of a generalized, inclusive archive, a shadow archive that encompasses an entire social terrain while positioning individuals within that terrain. This archive contains subordinate, territorialized archives: archives whose semantic interdependence is normally obscured by the ‘coherence’ and ‘mutual exclusivity’ of the social groups registered within each.”29

Note.—Sekula examines the emergence of a generalized criminal type and the field of its study—criminology: “Thus the would-be scientists of crime sought a knowledge and mastery of an elusive ‘criminal type’. And the ‘technicians’ of crime sought knowledge and mastery of individual criminals. Herein lies a terminological distinction, and a division of labor, between ‘criminology’ and ‘criminalistics’. Criminology hunted ‘the’ criminal body. Criminalistics hunted ‘this’ or ‘that’ criminal body.” The centrality of physiognomy in the formation of this type indicates its inherent racialized bias. As ProPublica has indicated, these 19th century classification processes have reasserted themselves in the virtual space of risk assessment software programs.

Demographic control relies on the representation of bodies, whether through photography, data, or more dimensional models. Foucault contends that “representation in its peculiar essence is always perpendicular to itself: it is at the same time indication and appearance; a relation to an object and a manifestation of itself.”30 The specimen pin, a staple of Western classification processes, epitomizes this perpendicular gesture of pointing to and simultaneously constituting. The pin is a vector directing the human eye to the point from which the entirety of the specimen is most easily resolved; it fixes the specimen in place, determines its orientation, and through death asserts the permanence of its form. While it is a clear marker of another organism’s lifelessness, the language of the specimen pin lives on in virtual space.

Photorealism in computer graphics relies on complex calculations based on surface normals (III.xxxviii.). Normals, by default, are perpendicular to the faces or vertices of a mesh. They allow for accurate rendering by determining how light bounces off of surfaces, most notably in a process called ray tracing. Virtual normals bear a striking resemblance to specimen pins, always at perfect right angles. As Sara Ahmed explains, “the right is associated with truth, reason, normality and with getting ‘straight to the point’.”31 Etymologically, the word normal comes from the Latin norma, or carpenter’s square, conveying not only simple geometric perpendicularity and taxonomic control, but an even longer history of Christian values that determine what is considered upright and in the light. When surface normals are inverted or are not all facing the same direction, they produce unpredictable results. In commercial production, these unruly behaviors are resolved by conforming normals. Fittingly, maintaining right angles and conforming normal direction ensures photorealism, which in turn endows the virtual form with technical and visual authority.
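
A minimal sketch, with assumed vertex and light values, of the calculation described above: the face normal is the normalized cross product of two triangle edges, and shading follows from its dot product with the light direction, so an inverted normal renders the same surface dark.

    # Surface normal and Lambertian shading for a single triangle.
    import numpy as np

    v0 = np.array([0.0, 0.0, 0.0])
    v1 = np.array([1.0, 0.0, 0.0])
    v2 = np.array([0.0, 1.0, 0.0])

    normal = np.cross(v1 - v0, v2 - v0)
    normal = normal / np.linalg.norm(normal)  # perpendicular, unit length

    light_dir = np.array([0.0, 0.0, 1.0])             # light along the z-axis
    intensity = max(np.dot(normal, light_dir), 0.0)   # Lambert's cosine law
    inverted = max(np.dot(-normal, light_dir), 0.0)   # flipped normal: darkness

    print(intensity, inverted)  # 1.0 0.0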

The Utah Teapot was modeled by Martin Newell at the University of Utah in 1975 to demonstrate the ray tracing capabilities of the rendering algorithms he was developing at the time. He chose this particular form because it was ready at hand in his office, but also for its normals. Its asymmetry, irregular curves, and areas of self-occlusion highlighted his software’s advanced capabilities, its ability to calculate complexity. The teapot quickly became the most circulated 3D model of all time, used for all kinds of technical demos. It is now considered a benchmark in computing (IV.xxxiii.), the 3D equivalent of “Hello World.” The physical teapot that it is based on is even in the Computer History Museum in Silicon Valley.32 Despite its elevation within the field of computer graphics, the Utah Teapot is consistently discussed as a banal, domestic form.

Even if it is never admitted in this context, it is also a symbol of imperial power. Europeans’ obsession with the daily ritual of drinking tea with sugar was one of colonialism’s driving forces: “This custom, which has mistakenly been viewed as insignificant, had important historical effects. Its widespread adoption in Britain and elsewhere in northern Europe in the eighteenth century greatly reinforced demand for both products, thus helping to foster British imperialism in Asia, plantation slavery in the West Indies, and economic growth in Europe and North America.”33 Despite this uneasy history, the teapot easily slips back into the category of neutral formal object, allowing it to circulate as an image of cultural and technical control. In Geographies of Post-Colonialism: Spaces of Power and Representation, Joanne Sharp explains how similar instances lead “many postcolonial feminists [to] favor the concept of ‘situated knowledge’ as a substitute for decontextualized, ungendered, disembodied, so-called ‘objective’ knowledge. It pays attention to geographic and cultural specificity rather than universality.”34 It is highly unlikely that the initial choice of the teapot and its subsequent circulation is an intentional assertion of colonial power. Rather, it is an indication that the values and symbols of European supremacy are both central to and invisible in our cultural imaginary: “something passes as natural precisely when it conforms perfectly and without apparent effort to accepted models, to the habits valorised by a tradition (sometimes recent, but in force).”35 The Utah Teapot is quite literally a model of the banality of colonial forces and their foundational role in virtual space.

Immersive virtual reality is the promise of a comprehensive calculable space. Ivan Sutherland developed one of the earliest virtual reality head-mounted displays in 1968. Due to the size and weight of its components, the system required a large ceiling-mounted pole for support. As a result, Sutherland and his team facetiously named their apparatus The Sword of Damocles. While VR researchers insist that this is a purely formal reference to the intimidating beam overhead, it is worthwhile to consult its namesake for meaning. The Sword of Damocles is a parable of paranoid power. When a Greek subject, Damocles, expresses how fortunate his king, Dionysius, is to live in luxury, the king offers to switch places with him for a day. Damocles eagerly agrees to sit on the throne; however, the king orders a sword to be hung by a single horsehair above the royal seat to represent the feeling of constant threat that comes with supremacy. Damocles does not have the fortitude to withstand these precarious conditions and forfeits his day in the position of ultimate power. References to this moralizing tale have circulated in Europe for centuries, often accompanied by the phrase, METUS EST PLENUS TYRANNIS; fear is plentiful for tyrants.36

Suspiciously, though, the sword is installed—by royal decree—to intimidate a common subject. Perhaps the state of precarity for those in power does exist, but the narrative of threat-to-rule can also be used as a justification for imposing controls on others. The sword, pointing down from above, uncannily resembles both the specimen pin and the virtual normal, poised to fix the subject in place. Likewise, Sutherland’s apparatus constrains the user’s movement, circumscribing its radius and orientation, while claiming to enhance it. This early head-mounted display is a blindfold of optical and tactile dissonance. Researchers in Sutherland’s lab refused to wear it because of its high voltage risk to the body.37

Sutherland, confronting the limitations of his invention, proposed the ultimate display, a totalizing omnipotent control system: “The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked.”38 His window into the mathematical wonderland of the computer is explicitly tied to bio- and necropolitical control—a computer graphics pathway to the state project that Foucault lays out in Discipline and Punish. The recent resurgence of virtual reality has confirmed that “having a display apparatus mounted on our heads may bring temporary distraction, but we are more often in a world of isolation and stasis than remote presence or alternate identity.”39 Sutherland’s carceral framing of his early virtual reality system presages its application as a tool for military surveillance and control.

Palmer Luckey, the founder of the popular hardware manufacturer Oculus VR, is now undertaking the project of enhancing state surveillance systems with virtual technologies. Following Sutherland’s example, he named his new company Anduril40 after Tolkien’s flaming sword of the West. The company has accrued massive investment dollars from venture capital as well as from the United States military, which has a contract with Anduril Industries to construct a virtual border wall. If colonial strategies of spatial control and the architectures of European disciplinary society were latent in virtual reality, then Anduril has unleashed them. These mechanisms are not only operating within virtual space, they are reimposed on physical terrain by an arsenal of robotic sentry towers, unmanned aerial vehicles, and an IoT mesh network of sensors. These fortifications are augmented with machine learning to more accurately detect migrants in the landscape. The company’s promotional materials explain: “once alerted, a Lattice user can strap on a pair of VR goggles and get a bird’s-eye view of what triggered the alarm, or toggle between the individual streams coming from each sensor. The goal is to give users a kind of local omniscience—perfect situational awareness of what’s around every corner and behind each hill.”41

Activists have already come out against the dangers of a system that so efficiently tracks humans with the express purpose of feeding them into the nation’s privatized detention centers. Mijente, an immigrant rights advocacy group, published a statement that Anduril represents “a surveillance apparatus where algorithms are trained to implement racist and xenophobic policies.”42 The company’s founder, however, expressed his faith in US authority and the precedent that it has set through its implementation of other technologies: “we’ve shown throughout history that we are leaders in using technology ethically, using technology responsibly … We have to continue to lead, the same way that we led with nuclear weapons, where we were able to define the way that they were used because we were the leader in the space.”43 This statement is a terrifying affirmation of the United States’ exceptional supremacy, which allows it to dominate, even obliterate space and its inhabitants, as it did in Hiroshima and Nagasaki.

Anduril’s suite of distributed machines exceeds the possibilities of Foucault’s Panopticon. Rather, it exemplifies what Manuel DeLanda calls the Panspectron: “Instead of positioning some human bodies around a central sensor, a multiplicity of sensors is deployed around all bodies: its antenna farms, spy satellites and cable-traffic intercepts feed into its computers all the information that can be gathered. This is then processed through a series of filters or key-word watch-lists. The Panspectron does not merely select certain bodies and certain data about them. Rather, it compiles information about all at the same time, using computers to select the segments of data relevant to its surveillance tasks.”44 Anduril, the flaming sword of the West, uses 3D Reconstruction to control space. This strategy epitomizes, and concretizes, Deleuze and Guattari’s description of how state power leverages media to lock down its territories: “one of the fundamental tasks of the State is to striate the space over which it reigns, or to utilize smooth spaces as a means of communication in the service of striated space. It is a vital concern of every State not only to vanquish nomadism but to control migrations and, more generally, to establish a zone of rights over an entire ‘exterior’, over all of the flows traversing the ecumenon.”45

Proposition XVI. Capture and Reconstruction are prosthetic to the colonial ambition of capturing and controlling everything.

Proof.—Capture and Reconstruction technologies descend from colonial mapping practices, the survey, metrology, and photography. Photography was implemented to make territories optically delicious, to serve them up on an industrial platter: “This representative scheme, then, presents the possibility of a double salvation—a return to unspoiled innocence and an opportunity to profit from the violation of innocence.”46 The use of photography from an aerial perspective marked a technological leap (II.x.note.). Photographs from weather balloons were used to map spaces from above, which made mapping much more efficient and useful for industrial and military applications alike. The photographs were then stitched together to create comprehensive and precise maps of expansive terrain: “Photogrammetric survey was used in mapping great colonial stretches of Africa, Australia and Antarctica into the 1950s.”47 Capture and Reconstruction have since evolved and are now primarily executed with digital tools. Notably, countless corporate and governmental entities have incorporated the use of high altitude planes and satellites to expand remote sensing and photogrammetric mapping to a planetary scale.

Corollary I.—Capture everything.

Corollary II.—Every subject is a captive, every object is a sensor.

Corollary III.—The IoT fantasy. Web 3.0.

Proposition XVII. Monitor and control every corner of the universe.

Proof.—The Internet of Things (IoT) refers to a network of interconnected physical objects or devices that are embedded with sensors, software, and connectivity capabilities, enabling them to collect and exchange data. These objects can be anything from everyday household appliances and wearable devices to industrial machinery and vehicles. The fundamental idea behind IoT is to create a seamless connection between the digital and physical worlds, allowing these devices to communicate and collaborate with each other without human intervention. Integrating sensors and connectivity into objects enables them to gather and transmit data, receive instructions, and interact with their environment.48
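
A minimal sketch of this pattern, with invented device names and a stand-in transport; no real sensor or broker API is assumed. An object reads its environment, packages the reading, and transmits it on a fixed cadence, without human intervention.

    # An IoT sensor node reduced to its gesture: sense, package, transmit.
    import json, random, time

    def read_temperature() -> float:
        return 20.0 + random.gauss(0.0, 0.5)   # stand-in for a physical sensor

    def transmit(payload: str) -> None:
        print(f"-> broker: {payload}")          # stand-in for MQTT/HTTP transport

    for _ in range(3):                          # a real device would loop forever
        reading = {"device_id": "kettle-7",
                   "ts": time.time(),
                   "temperature_c": read_temperature()}
        transmit(json.dumps(reading))
        time.sleep(1)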

Corollary I.—Embedded and scattered.

Corollary II.—Even dust becomes a device of capture.

Note.—Smart dust refers to miniature wireless devices that are typically the size of a grain of sand or smaller. These devices, also known as microelectromechanical systems (MEMS), are equipped with sensing, computing, and communication capabilities. Smart dust particles are designed to be extremely small and lightweight, enabling them to be easily dispersed in the environment and gather data from various sources. The concept of smart dust originated from research conducted by the Defense Advanced Research Projects Agency (DARPA) in the 1990s and the term smart dust was coined by Kristofer Pister of the University of California, Berkeley in 1997.49 The goal was to develop invisible, autonomous sensor nodes that could be deployed in large numbers to monitor and collect data from various environments: “Dust-sized and light transparent semiconductor chips are composed entirely of materials that are transparent to visible light.”50

Proposition XVIII. Science Fiction simulates possible futures (IV.).

Proof.—In Michael Crichton’s techno-thriller, Prey, self-replicating nanobots are out of control: “In the Nevada desert, an experiment has gone horribly wrong. A cloud of nanoparticles—micro-robots—has escaped from the laboratory. This cloud is self-sustaining and self-reproducing. It is intelligent and learns from experience. For all practical purposes, it is alive.”51 These minuscule machines exhibit swarm intelligence, simulating future characteristics of smart dust technology. Capable of independent sensing and communication, they demonstrate the potential risks and dangers associated with advanced, interconnected systems on a micro-scale.

Proposition XIX. Deployed dust.

Proof.—Smart dust devices are equipped with sensors that can measure parameters such as temperature, humidity, light intensity, motion, or even chemical composition in some cases. These sensors allow the smart dust particles to gather real-time information from their surroundings. The collected data can then be processed and transmitted wirelessly to a central system for further analysis and decision-making. One of the key advantages of smart dust technology is its potential for large-scale deployment in diverse environments, enabling extensive data collection and monitoring. It has applications in various fields such as environmental monitoring, agriculture, infrastructure management, healthcare, and even military surveillance.52

Note.—Paolo Bacigalupi’s dystopian novel, The Water Knife, also describes microscopic sensors that bear resemblance to smart dust. Set against a backdrop of widespread water scarcity, these tiny sensors are deployed to monitor water sources and consumption patterns. Their presence underscores the problem of efficient resource management, control, and access: “We knew it was all going to go to hell, and we just stood by and watched it happen anyway. There ought to be a prize for that kind of stupidity.”53 Smart dust has the potential to complicate future resource wars by enabling resource monitoring, surveillance, disruption, and accelerating environmental concerns. Deployed in resource-rich areas, these miniature devices can gather real-time data on valuable assets, provide intelligence for military operations, and be used for sabotage. The widespread deployment of smart dust can trigger a technological race among conflicting parties.

Proposition XX. Pixie dust pixels.

Proof.—If sensors are smart dust, then point clouds and particle systems are fairy dust. Volumes of data captivate us. Point clouds capture the appearance of objects and environments, casting a spell of trust and belief. Particle systems scintillate, shimmering with potential and movement. They create mesmerizing visual effects, evoking wonder and fascination. They fixate us while the world turns to darkness.

Coroll. I.—Flashes of meaning.

Coroll. II.—Motes suspended in a beam of light.

Proposition XXI. Technological dependence is projected.

Proof.—Vernor Vinge’s Rainbows End presents a near-future world in which advanced technology is deeply embedded in everyday life. Interconnected devices seamlessly facilitate communication, entertainment, and access to information. But, “the beginning of trust has to be an in-person contact.”54

Proposition XXII. Totalizing techno-futures.

Proof.—The proof of this proposition is similar to that of the preceding one.

Proposition XXIII. The substance of control is addictive.

Proof.—The title of Neal Stephenson’s novel Snow Crash refers to a fictional narcotic: “This Snow Crash thing—is it a virus, a drug, or a religion? ... What’s the difference?”55 In the book, Snow Crash is a highly addictive substance, originally designed as a brain-altering virus transmitted through both digital and physical means. It takes its name from the description of its effects on the user’s consciousness, likened to a crash of overwhelming sensory and cognitive stimulation. The term snow is a reference to the white noise that accompanies the overdose, comparable to a blizzard of fragmented data overwhelming the mind: “Well, all information looks like noise until you break the code.”56

Proposition XXIV. Information overload.

Proof.—Set in a future world where people have computerized implants, M. T. Anderson’s Feed vividly portrays a society bombarded with a relentless stream of advertisements, news, and entertainment: “I don’t know when they first had feeds. Like maybe, fifty or a hundred years ago. Before that, they had to use their hands and their eyes. Computers were all outside the body. They carried them around outside of them, in their hands, like if you carried your lungs in a briefcase and opened it to breathe.”57 The feed implants provide users with instant access to an overwhelming amount of information, creating a state of perpetual distraction and sensory overload. This information saturation affects individuals’ ability to think critically, form genuine connections, and maintain a sense of personal identity.

Corollary.—Defrag the system.

Proposition XXV. Open electronic wormholes.

Proof.—In Arthur C. Clarke and Stephen Baxter’s novel The Light of Other Days, WormCam offers surveillance and temporal manipulation. WormCam is a technology that allows individuals to observe any location or event in the past through the use of microscopic wormholes: “If the present is shitty and the future is worse, the past is all you’ve got.”58 The technology provides an unprecedented level of access, unveiling the secrets of history and offering a glimpse into moments that were once hidden from human perception. WormCam raises ethical questions about the boundaries of privacy and the implications of constant surveillance. WormCam is the fantasy of unrestricted access to the past and the complex interplay between knowledge, power, and the erosion of personal boundaries.

A wormhole is a hypothetical concept in theoretical physics that represents a shortcut or tunnel through spacetime, connecting two distant regions or even different universes (III.).59 It is often depicted as a tunnel-like structure through which one could pass from one point in the universe to another without traveling through the intervening space. Wormholes are derived from the mathematics of general relativity, Albert Einstein’s theory of gravity (II.).60 While they remain purely theoretical at present, they have captured the imagination of scientists and writers due to their potential for enabling faster-than-light travel across vast cosmic distances.
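
One standard formulation from the general-relativity literature is the Morris-Thorne traversable wormhole metric, in which a redshift function Φ(r) and a shape function b(r) describe the geometry of the tunnel, with the throat located where b(r) = r:

    ds^2 = -e^{2\Phi(r)} c^2 \, dt^2 + \frac{dr^2}{1 - b(r)/r} + r^2 \left( d\theta^2 + \sin^2\theta \, d\varphi^2 \right)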

Theoretical methods for opening a wormhole are still purely speculative and largely remain within the realm of science fiction. The concept of opening a wormhole involves manipulating spacetime, which would require the manipulation of immense amounts of energy and the bending of spacetime itself. One popular theoretical approach for opening a wormhole is by using exotic matter or negative energy. Exotic matter, with negative energy density, is a hypothetical form of matter that violates the standard energy conditions of classical physics. It is speculated that if exotic matter with specific properties could be obtained and controlled, it might be possible to create and stabilize a traversable wormhole.61 Another proposed method is utilizing the phenomenon of quantum entanglement.62 Quantum entanglement involves the instantaneous correlation of properties between particles, regardless of distance. The idea is that by manipulating entangled particles, it might be possible to create a connection that resembles a wormhole (V.).

Note.—A Euclidean wormhole is a theoretical concept derived from the mathematical framework of Euclidean space (V.xl.proof.). Unlike the traditional concept of a wormhole in spacetime, which involves curved spacetime and is based on the theory of general relativity, a Euclidean wormhole exists within a hypothetical flat, Euclidean space.63 In Euclidean geometry, a wormhole is represented as a tunnel or shortcut that connects two distinct regions of space. It can be visualized as a bridge or a tunnel connecting two separate points, allowing for a direct path between them that bypasses the usual distance between the points. Euclidean wormholes are mathematical concepts, not directly related to the physical properties of our universe. They have been studied within the context of theoretical physics and often serve as a simplified model for exploring the possibilities of traversable shortcuts between points in Euclidean space (III.xxxvi.note.). While they may lack the complexities and physical implications of spacetime wormholes, Euclidean wormholes provide a framework for investigating geometric structures and the theoretical possibilities of interconnecting different regions of space (V.xli.proof.).

Corollary.—Capture and Reconstruction are wormholes.

Proposition XXVI. Openings between virtual and physical realities.

Proof.—In William Gibson’s Neuromancer, countless sensors and displays act as a matrix of wormholes that seamlessly bridge the divide between digital and physical realities. These technological interfaces become gateways, enabling individuals to navigate the boundless expanse of the virtual. Sensors are conduits, capturing the subtleties of physical existence and translating them into digital data. Displays are tunnels projecting immersive virtual landscapes into the perceptible realm. Through these sensorial wormholes, Gibson blurs the boundaries between the virtual and physical: “Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts ... A graphic representation of data abstracted from banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding ...”64

Proposition XXVII. A thing, which has been conditioned by supervision, cannot render itself unconditioned.

Proof.—This proposition is evident from the third axiom (I.axiom.iii.).

Proposition XXVIII. The observer effect is a problem of measurement.

Proof.—According to quantum theory, the act of observation or measurement can influence the properties and behavior of particles or systems being observed.65 In other words, the act of measurement can cause a quantum system to collapse into a specific state, thereby altering its properties.

The observer effect suggests that the act of observing or measuring a quantum system disturbs it, making it challenging to observe or measure its original, undisturbed state accurately. This concept is related to the inherent uncertainty and probabilistic nature of quantum mechanics: “Quantum reality is not constrained to the realm of ultra-small. In a certain sense, we are all quantum wavicles meaning that a version of you can wildly vary from one observer to another … observer systemic alternate timelines are true parallel universes”66 (III.xxix.proof.).
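
In the standard formalism this is the Born rule: for a system prepared in state |ψ⟩, a measurement yields outcome a_i with probability given by the squared magnitude of the state's projection onto the corresponding eigenstate, after which the system collapses to that eigenstate:

    P(a_i) = |\langle a_i | \psi \rangle|^2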

Note.—In the context of quantum mechanics, a remote cause refers to a causal relationship between two quantum systems that are spatially separated or distant from each other. It challenges the classical notion of causality, where cause and effect occur in close proximity or within a localized region. In certain quantum phenomena, such as entanglement, particles can become correlated in a way that their properties become interdependent, regardless of the physical distance between them. When two entangled particles are measured or interacted with, the outcomes of their measurements are instantaneously correlated, even if they are separated by vast distances. This phenomenon has been experimentally confirmed and is often referred to as spooky action at a distance (V.vii.).67 The concept of remote cause in quantum mechanics suggests that the state or measurement of one particle can have an instantaneous influence on the state or behavior of another particle, regardless of the spatial separation between them. This challenges our intuitive understanding of cause and effect, as the influence appears to occur faster than the speed of light, violating the classical notion of locality.
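
The canonical example is a Bell state, in which two qubits are prepared so that neither has a definite value on its own, yet their measurement outcomes always agree:

    |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}} \left( |00\rangle + |11\rangle \right)

Measuring either qubit yields 0 or 1 with equal probability, but the two results are perfectly correlated no matter the distance between them.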

Proposition XXIX. Nothing in the universe is contingent, but all things are conditioned to exist and operate in a particular manner ...

Proof.—Remote control.

Note.—Remote control refers to the ability to operate or control a device, system, or process from a distance, without direct physical contact. It involves using a device, such as a handheld transmitter or mobile application, to send signals or commands wirelessly to the controlled device.

Proposition XXX. Or complete automation.

Proof.—A true idea must agree with its object (I.axiom.vi.); in other words (obviously), humans and machines in alignment.

Proposition XXXI. Unquestioning servitude.

Proof.—In Neal Stephenson’s novel The Diamond Age, AI automation plays a central role in shaping the future society. Nanotechnology and advanced artificial intelligence are prevalent. Automation, in the form of intelligent agents and interactive devices, pervades all aspects of life: “The universe was a disorderly mess, the only interesting bits being the organized anomalies.”68 Personalized assistance. Streamlined processes. These AI systems take on tasks ranging from education and child care to manufacturing and resource management. AI systems are seamlessly integrated into the fabric of society, blurring the boundaries between human and machine interaction, and reshaping the dynamics of work, education, and personal relationships: “That we occasionally violate our own stated moral code does not imply that we are insincere in espousing that code.”69

Note.—Neal Stephenson’s views on automation in The Diamond Age are not explicitly stated in the novel: “‘Which path do you intend to take, Nell?’ said the Constable, sounding very interested. ‘Conformity or rebellion?’ ‘Neither one. Both ways are simple-minded—they are only for people who cannot cope with contradiction and ambiguity.’”70 Extensive automation presents trade-offs between technological progress and the preservation of human values and autonomy (IV.).

Proposition XXXII. Will cannot be called a free cause, but only a necessary cause.

Proof.—Free will is in question (IV.). Human will is constrained by various factors. A necessary cause implies that human will is determined or influenced by preceding factors, such as genetics, upbringing, societal conditioning, or environmental circumstances. It suggests that our choices and actions are not entirely autonomous but rather driven by a combination of internal and external forces. There is no unfettered agency (V.ix.). Decisions are bound by deterministic factors. Our choices are predictable or determined by the causal chain of events and the conditions in which we exist. This perspective aligns with certain philosophical and scientific viewpoints that question the extent of human freedom and emphasize the interplay between causality, determinism, and the complexities of human behavior and decision-making.

Coroll. I.—Hence it follows, first, that God does not act according to freedom of the will.

Coroll. II.—Supervision is automated.

Proposition XXXIII. Things could not have been brought into being by God in any manner or in any order different from that which has in fact obtained.

Proof.—There is only one way.

Note I.—In many science fiction narratives, the notion of capturing and reconstructing virtual realities highlights the potential for manipulation and control. The possibility of individuals being trapped within simulated environments. Experiences reconstructed and manipulated. Surveilled. Coerced. Obliterated.

Note II.—Science fiction also contemplates the power of capture and reconstruction to reshape and refine identity. The potential for individuals to assume new personas, inhabit other bodies. The transformative potential of the virtual. The malleability of identity. The impact on self-perception and the consequences of disconnecting from one’s physical reality.

Proposition XXXIV. God’s power is identical with its digital twin.

Proof.—Digital twins are virtual representations of physical objects, systems, or processes (III.). They are created by collecting real-time data from sensors embedded in the physical object or system and using it to build a virtual model that mirrors its real-world counterpart. The digital twin serves as a live simulation or emulation, providing insights into the performance, behavior, and condition of the physical object (III.instances of reconstructions.xxxiv.).
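
A minimal sketch of this mirroring, with an invented asset and fixed bounds standing in for real analytics: the twin ingests live readings, updates its virtual state, and supervises its physical counterpart against expected behavior.

    # A digital twin reduced to its essentials: ingest, mirror, supervise.
    from dataclasses import dataclass, field

    @dataclass
    class PumpTwin:                      # hypothetical asset: an industrial pump
        rpm: float = 0.0
        temperature_c: float = 0.0
        history: list = field(default_factory=list)

        def ingest(self, reading: dict) -> None:
            """Update the mirror from a live sensor reading."""
            self.rpm = reading["rpm"]
            self.temperature_c = reading["temperature_c"]
            self.history.append(reading)

        def anomalous(self) -> bool:
            """Crude supervision: condition monitoring against fixed bounds."""
            return self.rpm > 3600 or self.temperature_c > 85.0

    twin = PumpTwin()
    twin.ingest({"rpm": 3550, "temperature_c": 71.2})  # data from the field
    print(twin.anomalous())                            # False: all is calm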

Digital twins can be used in various domains, including manufacturing, infrastructure, and transportation: “The Los Angeles Department of Transportation has partnered with the Open Mobility Foundation to create a data-driven digital twin of the city’s transport infrastructure. To start with, it will model the movement and activity of micro-mobility solutions such as the city’s network of shared-use bicycles and e-scooters. After that, it will be expanded to cover ride-sharing services, carpools, and new mobility solutions that will appear, such as autonomous taxi drones.”71 Digital twins enable real-time monitoring, analysis, and optimization of physical assets, enhancing operational efficiency, decision-making, and maintenance processes. By capturing and analyzing data from sensors, digital twins can simulate different scenarios, predict outcomes, and take proactive measures.

The concept of digital twins goes beyond mere data visualization or representation. It involves the integration of data analytics, machine learning, and simulation to create a dynamic and interactive virtual model that can evolve alongside its physical counterpart: “The EU-funded Neurotwin project aims to simulate specific human brains in order to build models that can predict the best treatments for conditions such as Alzheimer’s and epilepsy. There have been other attempts to simulate aspects of the brain in the past, but Neurotwin is the first project that focuses on modeling both the electromagnetic activity and the physiology.”72 The digital twin continually receives data from the physical object, updating its virtual representation to reflect the real-time status and characteristics. Digital twins facilitate remote monitoring and control, allowing operators to interact with and manage assets from a distance. Digital twins also support the testing and validation of changes, reducing the time and cost associated with physical prototyping.

With advancements in technology such as the Internet of Things (IoT), data analytics, and cloud computing, digital twins are becoming increasingly sophisticated and integrated into various industries. They contribute to the growing field of the Industrial Internet of Things (IIoT), enabling the digital transformation and optimization of complex systems. Closed loop control. Lights out operation. Virtual laboratories. Simulation environments (IV.lxviii.).

Proposition XXXV. Machine prophecies.

Proof.—Digital twins predict future outcomes.73 Through the integration of advanced analytics, machine learning, and artificial intelligence, digital twins analyze vast datasets, identify patterns, and make informed predictions. Digital twins predict equipment failures. Digital twins predict disease progression and treatment outcomes.
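
A minimal sketch of the predictive gesture, under toy assumptions (a linearly degrading vibration signal and an arbitrary failure threshold): fit a trend to the sensor history and extrapolate the hour at which the threshold will be crossed.

    # Machine prophecy as extrapolation: predict when a signal crosses a limit.
    import numpy as np

    hours = np.arange(10.0)                      # observation times
    vibration = 1.0 + 0.15 * hours               # a steadily worsening signal
    slope, intercept = np.polyfit(hours, vibration, 1)

    FAILURE_LEVEL = 4.0                          # hypothetical limit
    eta = (FAILURE_LEVEL - intercept) / slope    # predicted hour of failure
    print(f"predicted failure at ~{eta:.1f} h")  # prophecy: ~20.0 h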

Proposition XXXVI. Orders of operation from afar.

Proof.—Digital twins unlock remote control. Unprecedented power to manipulate physical assets and systems from afar. With their virtual counterparts mirroring the behavior and characteristics of real-world objects, supervisors can influence the system without being physically present. What does automation set in motion? Virtual replicas of unfolding processes. Digital twins usher in an era of automation. Automation eliminates manual intervention, streamlines processes, and minimizes human error. What oversight? What displacement? What risks?

Risks to body, space, system. Digital twins of critical infrastructure, such as power plants, water treatment facilities, or transportation systems, can be vulnerable to cyberattacks. If malicious actors gain unauthorized access to the digital twin, they could manipulate or disrupt the virtual model, potentially leading to real-world consequences such as power outages, water contamination, or transportation disruptions.74 Digital twins used in healthcare, particularly those associated with patient data and medical devices, can pose risks to privacy and patient safety if not adequately secured. Unauthorized access to medical digital twins could result in the exposure or manipulation of sensitive patient information. Tampering with medical device digital twins could have severe consequences for patient health and safety: “zero-trust is coming”75 (IV.lxviii.note.).

Digital twins that are interconnected with industrial control systems, such as those in manufacturing or energy sectors, could be targeted by cybercriminals. If a digital twin is compromised, it could provide a pathway for attackers to infiltrate and manipulate the corresponding physical system. Digital twins used in autonomous vehicles could be susceptible to attacks that manipulate or deceive the vehicle’s perception and decision-making capabilities. If the digital twin’s data or algorithms are compromised, it could lead to misinterpretation of the environment, causing accidents or unauthorized control over the vehicle’s operations. Digital twins employed in smart city initiatives—including interconnected surveillance systems, energy grids, or traffic management, like the one in development for Los Angeles—may face risks related to unauthorized access or data breaches. Manipulating these digital twins could disrupt essential services, compromise privacy, or enable malicious surveillance.

Risks stem from vulnerabilities. Unauthorized access. Security protocols. Encryption. Ensure the integrity of the twin! Digital doppelgängers may serve our overlords, hoarding our data and surveilling our every move. Digital twins tighten their grasp on our lives. The path to unsupervised supervision may lead to the surrender of our freedoms.

APPENDIX:

There is necessary tension between supervision and the unsupervised. This tension emerges in the engineering and application of Capture and Reconstruction (IV.xxxix). It is also the dialectic of self-aware development. What complexities exist—agency, growth, regulation—within the framework of supervision?

Supervision is a colonizing practice and technological ideology. It embodies structures of control and dynamics of power. In contrast, the unsupervised is without external oversight, guidance, or control. Absence of presence—position of authority—monitoring—directed action and behavior. Unsupervised agents make decisions—move freely—outside the influence of supervisor, external authority, operating system.

No supervision means no accountability: an opening for reckless risk-taking. Potential dangers arise from unregulated behavior, both at the human and industrial scale. At the human scale, unregulated behavior can result in ethical transgressions, harm to others, and the violation of environmental integrity. Without external guidance or accountability, individuals may engage in harmful or destructive actions, causing harm to themselves or others. Unregulated behavior can lead to moral erosion, as individuals may be prone to selfish pursuits, exploitation, or the neglect of collective well-being.76 At an industrial scale, unregulated behavior has severe environmental and societal repercussions. Industries operating without proper supervision or regulation may exploit natural resources, disregard sustainable practices, and contribute to pollution and ecological damage. Unregulated industrial practices can endanger ecosystems, compromise public health, and perpetuate social inequalities. Without appropriate oversight, industrial operations may prioritize profit-seeking over worker safety or the long-term impact on communities and the environment. The absence of regulation can lead to a lack of accountability, enabling companies to engage in unethical practices, exploit labor, or evade responsibility for the consequences of their actions.

Proper regulation fosters a sense of collective responsibility, ensuring that individuals and industries act in accordance with ethical, legal, and environmental principles. Regulation promotes sustainable practices, safeguards the public, and facilitates the equitable distribution of resources and opportunities.

Regulation and supervision fold in on one another. A convolution (IV.xlv.). A twist. The relationship between regulation and supervision provokes the question—Is regulation a form of supervision? Regulation can be seen as a proactive measure enacted by governing bodies to supervise and guide the behavior of individuals, organizations, or industries. It sets rules, standards, and frameworks that dictate acceptable practices. Ensures compliance. Mitigates risks. In this sense, regulation acts as a form of supervision by establishing boundaries and overseeing activities. Some argue that supervision should involve more flexible, adaptive approaches that foster self-regulation and individual accountability.77 

Ultimately, it is a question of balance between oversight and freedom, fate and will. The tension between the supervised and the unsupervised is self-aware construction. Actively engaging with this tension becomes a calling: a call for the reevaluation of power dynamics, the dismantling of oppressive structures, and the recognition of individuals as active participants in their own development.

Every living being is a device of Capture and Reconstruction. Every being is a universe sensing a multiverse.











PART II.

ON THE NATURE AND OPTICS OF CAPTURE.

PREFACE

The word capture implies seizure and control. Capture technologies collect data from physical reality: temperature, humidity, pressure, proximity, speed, rotation, chemical levels, radiation, light, color, movement, and depth information. Part II inspects not, indeed, all of them (for we proved in Part i., Prop. xvi., that an infinite number must follow in an infinite number of ways), but only Capture technologies that are used to record and store visual and spatial information—with the implicit goal of outputting a reconstruction. Part II examines the history, hardware, and methodologies of Capture as well as the ethics of extractionism and surveillance in the history of capturing black bodies.78,79 The camera, in all its variations, is the sensor at the center of this field.

The word optics has its etymological roots in the word for appearance, or look.80 It refers to a branch of physics as well as public perception—good and bad.81 

In physics, optics is the study of light—and other forms of radiation—its properties, interactions with matter, and image formation. It includes geometrical optics, which deals with the propagation of light as rays; physical optics, which considers light as waves and particles; and quantum optics, which allows for the coexistence of behaviors: “Every period has its own optical focus.”82 The origins of optics can be traced back to the first lenses produced by ancient civilizations.

The oldest known lenses—considered to be of exceptional quality—are estimated to have been made between 2620 and 2400 BC in Saqqara, Egypt, during the IV and V Dynasties of the Old Kingdom. These lenses—made of rock crystal, magnesite, and copper-arsenic alloy—were inlaid in the eyes of funerary statues: “When one observes these statues, and then circles about them in any direction, the ‘eyes’ appear to follow the observer—it is rather an amazing experience, easily observed and photographed.”83 The most cited example of this phenomenon is found in “Le Scribe Accroupi, at The Louvre, Paris … photographed near head on, and then again recorded to the left side of the statue … Movement of the iris apertures is clearly apparent.”84 The seated scribe was discovered in 1850 by French archeologist Auguste Mariette. It has been held captive in France’s national collection of Egyptian antiquities for almost two centuries and was recently moved to the Louvre-Lens annex.85

The ancient Mesopotamians were also known to use lenses made from polished crystals and glass beads. The Nimrud lens, for instance, was unearthed in modern-day Iraq, a neo-Assyrian treasure excavated by Sir Austen Henry Layard in 1850: “With the glass bowls was discovered a rock-crystal lens, with opposite convex and plane faces. Its properties could scarcely have been unknown to the Assyrians, and we have consequently the earliest specimen of a magnifying and burning-glass. It was buried beneath a heap of fragments of beautiful blue opaque glass, apparently the enamel of some object in ivory or wood, which had perished.”86 Today, this looted artifact remains in the collections of the British Museum in London.87,88

The ancient Greeks laid the foundation for geometrical optics. They studied the properties of light and proposed theories on how it behaves: “Long before either wave or particle, some (Pythagoras, Euclid, Hipparchus) thought that our eyes emitted some kind of substance that illuminated, or ‘felt,’ what we saw. (Aristotle pointed out that this hypothesis runs into trouble at night, as objects become invisible despite the eyes’ purported power.) Others, like Epicurus, proposed the inverse—that objects themselves project a kind of ray that reaches out toward the eye, as if they were looking at us (and surely some of them are). Plato split the difference, and postulated that a ‘visual fire’ burns between our eyes and that which they behold. This still seems fair enough.”89 

In the Early Modern period, European scholars borrowed heavily from Chinese (II.i.) and Middle Eastern (II.iii.) science as a foundation for their own theoretical optics. In 1604, German mathematician and physicist Johannes Kepler formulated the laws of geometric optics, revolutionizing the understanding of light and vision: “Key to properly understanding ocular function, Kepler realized, was understanding the optics of the crystalline lens. Accordingly, he turned first to a mathematical analysis of light rays passing through a transparent sphere from various points outside it. On that basis he showed how parallel rays are brought to a focal concentration after exiting the sphere and undergoing spherical aberration. He also showed how the resulting focal area can shift depending on how close or how distant the light source is from the sphere. The closer the source, he concluded, the more distant the focal area, and vice-versa.”90 Shortly after, Italian scientist and inventor Galileo Galilei made groundbreaking observations using lenses, including the development of the refracting telescope: “Early telescopes were primarily used for making Earth-bound observations, such as surveying and military tactics. Galileo Galilei was part of a small group of astronomers who turned telescopes towards the heavens. After hearing about the ‘Danish perspective glass’ in 1609, Galileo constructed his own device.”91

In 1621, Dutch scientist Willebrord Snellius articulated the mechanism behind the telescopic view—the law of refraction—when light travels from one transparent substance into another, the way it bends or changes direction depends on the angle it enters at, the angle it leaves at, and the refractive index—a value unique to each substance that describes how much it can bend light.92 In his 1637 work, La Dioptrique—or Dioptrics—the French philosopher René Descartes used his principles of geometry to describe how light bounces off objects and enters our eyes, enabling us to see: “consider light as nothing else … than a certain movement or action, very rapid and very lively, which passes toward our eyes through the medium of the air and other transparent bodies.”93 Like Snellius, he proposed that the angle at which light enters a different medium affects its path. This principle is now known as the law of refraction, or Snell’s Law. In 1662, Pierre de Fermat formalized the Principle of Least Time, building on earlier models of refraction. It states that of all the possible paths light could take to travel from one point to another, it takes the one that requires the least time.94
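In modern notation (a reference gloss, not the period formulations), the law of refraction and Fermat's principle can be written as follows, with n the refractive index and the angles measured from the surface normal:

```latex
% Snell's law: light crossing from a medium of refractive index n_1
% into one of index n_2 bends so that
\[
n_1 \sin\theta_1 = n_2 \sin\theta_2 .
\]
% Fermat's principle of least time: among all candidate paths from A to B,
% light follows the one whose travel time is stationary (typically minimal),
\[
T = \int_{A}^{B} \frac{n(\mathbf{r})}{c}\, ds , \qquad \delta T = 0 .
\]
```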

Simultaneous to these theoretical developments in optics, Baruch Spinoza, the Dutch thinker—and the mind behind the Ethics (V.)—rose to prominence as one of the most skilled lens-grinders of the time.95 He honed his craft to such perfection that his lenses were coveted by scientists across the breadth of Europe. Spinoza was the ultimate optical technician—an inflection point—in astronomy, microscopy, and cartography. He formed tools and ideas that revealed the structural symmetries and interconnectedness of existence. Spinoza exchanged letters with countless innovators,96 perhaps most notably Dutch physicist Christiaan Huygens.97

Previous accounts of optics treated light as rays—as straight lines. In 1678, Huygens proposed that light propagates as a wave: “I call them waves from their resemblance to those which are seen to be formed in water when a stone is thrown into it, and which present a successive spreading as circles, though these arise from another cause, and are only in a flat surface.”98 Huygens asserted that every point that a luminous disturbance encounters can be considered a source of a secondary wavelet. He envisioned that light spread out in spherical waves from these points, a concept now known as Huygens' Principle. Huygens’ theory accounted for the phenomena of reflection and refraction, building on Snell's law, by considering each point on the wavefront as a source of secondary wavelets:

“But what may at first appear full strange and even incredible is that the undulations produced by such small movements and corpuscles, should spread to such immense distances; as for example from the Sun or from the Stars to us. For the force of these waves must grow feeble in proportion as they move away from their origin, so that the action of each one in particular will without doubt become incapable of making itself felt to our sight. But one will cease to be astonished by considering how at a great distance from the luminous body an infinitude of waves, though they have issued from different points of this body, unite together in such a way that they sensibly compose one single wave only, which, consequently, ought to have enough force to make itself felt. Thus this infinite number of waves which originate at the same instant from all points of a fixed star, big it may be as the Sun, make practically only one single wave which may well have force enough to produce an impression on our eyes.”99 

His wave theory also gave an explanation for the phenomenon of diffraction, which was something that the particle theory of light struggled to explain. When light passed through a narrow slit and spread out, Huygens' principle provided a suitable explanation for the pattern created, as each point of the wavefront could be considered a source of secondary wavelets, forming a new wavefront that was not a straight line.

In 1704, English scientist Isaac Newton published Opticks, and in it, the Corpuscular Theory of Light. According to this theory, light was composed of small discrete particles, which he called corpuscles: “Between the parts of opake and colour'd bodies are many spaces, either empty, or replenish'd with mediums of other densities; as water between the tinging corpuscles wherewith any liquor is impregnated, air between the aqueous globules that constitute clouds or mists; and for the most part spaces void of both air and water, but yet perhaps not wholly void of all substance, between the parts of hard bodies.” Newton believed these corpuscles—or particles—were emitted by light sources, such as the sun or a candle. Newton’s theory faced challenges due to its inability to explain phenomena like diffraction and interference. Wave theories overtook particulate theories, supported by experiments like the double-slit experiment and observations of the constancy of the speed of light.

In 1814, French engineer and physicist Augustin-Jean Fresnel challenged Newton’s corpuscular view in his Reveries.100 Extending Huygens’ theory, Fresnel explained various optical phenomena, such as interference, by treating light as a wave phenomenon rather than a stream of particles. He formalized and published his ideas in De la Lumière—On Light—in 1822, the same year that he demonstrated his lens design to King Louis XVIII: “The first Fresnel lens, installed in the elegant Cardovan Tower lighthouse on France's Gironde River in 1822, was visible to the horizon, more than 20 miles away. Sailors had long romanticized lighthouses. Now scientists could rhapsodize, too. ‘Nothing can be more beautiful than an entire apparatus for a fixed light,’ one engineer said of Fresnel's device. ‘I know of no work of art more beautifully creditable to the boldness, ardor, intelligence, and zeal of the artist.’”101 Fresnel lenses were initially created to address the limitations of large, heavy lenses used in lighthouses. The traditional lenses were bulky and required significant amounts of glass, making them expensive to produce and challenging to transport. Fresnel tackled this problem by dividing the lens into multiple concentric rings—zones—which gradually decrease in thickness from the center outward. Each zone of the lens bends and focuses light, achieving similar focusing capabilities to conventional lenses while reducing the material and weight required.102 Waves of light guiding ships through tumultuous waters.103

Scottish physicist James Clerk Maxwell is renowned for formulating the electromagnetic theory of light and unifying the fields of optics and electromagnetism: “The most important aspect of any phenomenon from a mathematical point of view is that of a measurable quantity. I shall therefore consider electrical phenomena chiefly with a view to their measurement, describing the methods of measurement, and defining the standards on which they depend.”104 In 1865, Maxwell’s mathematical equations successfully demonstrated that light is an electromagnetic wave, propagating through space with oscillating electric and magnetic fields: “We now proceed to investigate whether these properties of that which constitutes the electromagnetic field, deduced from electromagnetic phenomena alone, are sufficient to explain the propagation of light through the same substance.”105 This theory revolutionized optics and laid the foundation for modern physics. Maxwell’s work extended beyond theory, as he extensively experimented with lenses to explore and manipulate the behavior of light, which led to the invention of color photography (II.viii.).

In the same decade, Gustav Kirchhoff introduced the idea of the black body: “...the supposition that bodies can be imagined which, for infinitely small thicknesses, completely absorb all incident rays, and neither reflect nor transmit any. I shall call such bodies perfectly black, or, more briefly, black bodies.”106

A black body—a theoretical concept in physics—is a perfect absorber, taking in radiation at all wavelengths, and an ideal emitter. According to Planck’s law, formulated by the physicist Max Planck in 1900, the spectral intensity of black body radiation at a given wavelength is determined by the temperature of the black body. The radiation emitted by a black body is a result of the thermal energy possessed by its constituent particles, such as atoms and molecules. As the temperature of the black body increases, the intensity and distribution of the emitted radiation change. Black body radiation has a continuous spectrum, meaning it contains all possible wavelengths of electromagnetic radiation. As the temperature increases, the peak intensity of the emitted radiation shifts toward shorter wavelengths, a relationship formalized in Wien’s displacement law.107
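Stated compactly, and only as a reference gloss on the prose above: Planck's law gives the spectral radiance of a black body as a function of wavelength and temperature, and Wien's displacement law locates the peak.

```latex
% Planck's law: spectral radiance of a black body at temperature T.
\[
B(\lambda, T) = \frac{2 h c^{2}}{\lambda^{5}}
\cdot \frac{1}{e^{\,h c / (\lambda k_{B} T)} - 1}
\]
% Wien's displacement law: the peak wavelength shifts inversely with T,
% so hotter bodies radiate at shorter wavelengths.
\[
\lambda_{\mathrm{max}} = \frac{b}{T}, \qquad b \approx 2.898 \times 10^{-3}\ \mathrm{m\,K}
\]
```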

The study of black body radiation played a crucial role in the development of quantum mechanics. In the early 20th century, scientists such as Max Planck and Albert Einstein used black body radiation to explain phenomena that could not be explained by classical physics alone. Planck’s work on black body radiation was particularly significant as it led to the introduction of the quantum concept, marking a fundamental shift in the understanding of energy and matter: "My unavailing attempts to somehow reintegrate the action quantum into classical theory extended over several years and caused me much trouble."108 As a result, Einstein proposed that light can behave as both particles and waves, introducing the notion of photons as discrete packets of energy—quanta of light.

Special relativity, formulated by Albert Einstein in 1905, introduces the concept that the speed of light in a vacuum is constant and is the maximum speed at which information or energy can travel.109 As photons are particles of light and have zero rest mass, they always travel at the speed of light in a vacuum. This principle of special relativity sets a cosmic speed limit. In general relativity, Einstein's theory of gravity, the curvature of spacetime is influenced by the distribution of mass and energy. Photons, being massless particles, follow paths dictated by this curved spacetime geometry. The presence of massive objects can bend the trajectory of light, causing gravitational lensing:  “One profound result of Einstein’s theory of general relativity: gravity bends the path of light, much as it affects the path of massive objects. Very massive astronomical bodies, such as galaxies and galaxy clusters, can magnify the light from more distant objects, letting astronomers observe objects that would ordinarily be too far to see. Even the gravity from planets affects light, allowing researchers to detect worlds in orbit around other stars.”110 The theory of relativity also provides a framework for understanding the concept of time dilation. Time can appear to pass differently for observers in relative motion or in gravitational fields. This has been experimentally confirmed, and it influences the behavior of photons.111

Today, the principles of black body radiation and quantum mechanics are foundational in fields such as astrophysics, engineering, and computing. Black body radiation models are employed to analyze and interpret the radiation emitted by stars, galaxies, and other celestial objects. Emerging technologies aim to capture black body radiation; for instance, Cosmic Microwave Background (CMB) radiation is the closest thing to perfect black body radiation that has ever been observed: “... human eyes cannot see the microwaves from the CMB (or X-rays or infrared rays either). However, using specially designed detectors, such as those to be carried by the Planck [satellite], we can. The CMB is the farthest and oldest light any telescope can detect. It is impossible to see further beyond the time of its release because then the Universe was completely ‘opaque.’ The CMB takes astronomers as close as possible to the Big Bang, and is currently one of the most promising ways we have of understanding the birth and evolution of the Universe in which we live.”112 It is like a historical photograph, capturing a specific moment in the early universe, when it still existed as an ionized plasma—a hot, charged gas—but was beginning to separate into matter and radiation.

More recently, a team of scientists captured the first image of a black hole—a black body—by utilizing the technique of Very Long Baseline Interferometry (VLBI) and forming the Event Horizon Telescope (EHT): “I met [Sagittarius A*] 20 years ago and have loved it and tried to understand it since, but until now, we didn’t have the direct picture.”113 By synchronizing an array of telescopes located around the world, the EHT aimed to create a virtual telescope with an aperture equal to the diameter of the Earth, enabling them to image distant objects with high resolution. Their primary targets were Sagittarius A*, the supermassive black hole at the center of our Milky Way galaxy, and M87*, an active supermassive black hole located in the galaxy Messier 87. The EHT team gathered data from multiple telescopes for several days, which was later combined and processed to produce the first-ever image of a black hole's silhouette. NASA spacecraft and telescopes observed the black hole at various wavelengths to complement the EHT's findings and provide further insights into its environment.114 Even a black hole can be captured.

DEFINITIONS

DEFINITION I. By capture I mean extraction with an aim toward perfection.

DEFINITION II. I consider as belonging to the likeness of a thing, the spatial coordinates of its surface, and corresponding values.

DEFINITION III. By target, I mean the continuous surface of the captive.

Explanation.—I say continuous because the aim is the boundary of the captive’s identity.

DEFINITION IV. By captive, I mean the subject at the center of each frame.

Explanation.—The use of the word captive reinforces, in an explicit way, the colonial logic of technologies of Capture.

DEFINITION V. Duration is the indefinite continuance of existing.

Explanation.—The captive is always changing.

DEFINITION VI. Reality and perfection I use as synonymous terms.

DEFINITION VII. By perfection, I mean ground truth.

AXIOMS

I. Capture inverts.

II. Capture fixes.

III. Capture extracts.

IV. Capture distorts.

V. Capture circulates.

N.B. The Postulates are given after the conclusion of Proposition xiii.

PROPOSITIONS

Proposition I. A camera is a tool of inversion.

Proof.—A camera obscura works on a basic optical principle—when light passes through a small hole into a darkened enclosure, it projects an inverted image of the scene outside onto a surface inside. The device originally took the form of a darkened room with a small hole in one wall. Later versions, more portable and convenient, incorporated lenses to focus the light and mirrors to correct the inversion of the image.115

Note.—The principle of the camera obscura—Latin for dark room—was first documented by the Chinese philosopher Mozi circa 400 BC. Mozi, also known as Mo Di, lived during the Warring States period, from around 470 to 391 BC. He was the founder of Mohism, a school of thought characterized by its emphasis on logic, observation, and inquiry, as well as the virtues of impartial caring and moral duty.116 In his book, Mozi, also known as The Mo Jing, he described the formation of an inverted image: “The image being inverted depends on there being an aperture at the cross-over and the image being distant. The explanation lies in the aperture. The image. The light reaches the person shining like an arrow. The lowest that reaches the person is the highest and the highest that reaches the person is the lowest. The feet conceal the lowest light and therefore become the image at the top. The head conceals the highest light and therefore becomes the image at the bottom.”117 This observation was part of a larger section in his writings on optics and the nature of light, which he argued travels in straight lines.
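A minimal sketch of the geometry Mozi describes, assuming an idealized pinhole at the origin and an image plane a distance d behind it; the function name and sample values are illustrative, not drawn from any source in this text:

```python
# Ideal pinhole projection: rays cross at a single aperture, so a point
# high on the subject lands low on the image plane, and vice versa.

def project_through_pinhole(x: float, y: float, z: float, d: float) -> tuple[float, float]:
    """Project a scene point (x, y, z), with z > 0 in front of the aperture,
    onto an image plane at distance d behind the pinhole."""
    if z <= 0:
        raise ValueError("point must lie in front of the aperture")
    # Similar triangles: the image is scaled by d/z and inverted in sign.
    return (-x * d / z, -y * d / z)

# The head (high point) projects below the axis; the feet project above it.
print(project_through_pinhole(0.0, 1.7, 3.0, d=0.5))   # (0.0, -0.283...)
print(project_through_pinhole(0.0, -0.2, 3.0, d=0.5))  # (0.0, 0.033...)
```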

Proposition II. A camera is a hole.

Proof.—The proof of this proposition is similar to that of the last.

Proposition III. Optics outside the body.

Proof.—Development of the camera obscura resumed with Arab scholar Ibn al-Haytham in the 10th century AD. Ibn al-Haytham, also known by the Latinized name Alhazen, was a pioneering scientist and polymath from the Islamic Golden Age, living from circa 965 to 1040 AD. His work covered a wide range of scientific and philosophical subjects, but he is perhaps best known for his groundbreaking contributions to the understanding of vision, optics, and light.118

Note.—In his seminal work, Kitab al-Manazir—The Book of Optics—Ibn al-Haytham provided an early description and analysis of the camera obscura. He explained that when light passes through a small hole into a darkened room or box, it projects an inverted image of the outside world onto an opposite surface. He performed a series of experiments with light passing through small apertures and demonstrated how this resulted in the projection of the external image: “This becomes clearly apparent to sense if one examines the lights that enter through holes, slits and doors into dusty chambers. As for the light of the sun, when it enters through a hole into a dark chamber the air of which is cloudy with dust or smoke, the light will appear to extend rectilinearly from the hole through which the light enters to the place on the chamber’s floor or walls which that light reaches.”119

Ibn al-Haytham’s rigorous scientific approach, including his use of empirical evidence and systematic experimentation, was revolutionary for his time and has led many to regard him as the first true scientist: “The seeker after truth is not one who studies the writings of the ancients … and puts his trust in them, but rather the one who suspects his faith in them and questions what he gathers from them … Thus the duty of the man who investigates the writings of scientists, if learning the truth is his goal, is to make himself an enemy of all that he reads, and, applying his mind to the core and margins of its content, attack it from every side.”120 His work on the camera obscura cemented understanding of the device and laid the groundwork for future advancements in the fields of optics, physics, and visual perception. It would later influence European scholars after being translated into Latin during the Middle Ages. His text played a key role in the scientific revolution in Europe. Between the 14th and 17th centuries, the camera obscura was used extensively.

Proposition IV. A camera points at accuracy.

Proof.—Early Modern cartographers used the camera obscura as an instrument to plot more accurate maps and charts. This period marked a time of significant exploration and discovery, when accurate maps were essential for navigation. The camera obscura could be used to project images of landscapes onto a surface where they could be traced, creating a highly detailed and proportionally accurate representation of the scene.121 This was particularly useful for mapping coastlines and cityscapes, which could be complex and challenging to represent accurately. When set up at a high vantage point overlooking a city, a camera obscura could project an image of the entire city onto a single piece of paper. Cartographers could then trace the projected image to produce an accurate, detailed map of the city.

Proposition V. A camera points to space.

Proof.—In addition to mapping physical locations, the camera obscura was also used to map the night sky.122 Astronomers in the Early Modern period used the device to project images of the stars and planets onto a surface where they could be recorded, leading to some of the most accurate astronomical charts of the time. The device’s ability to project bright images onto a dark background made it ideal for studying celestial bodies. The principle of the camera obscura was particularly helpful in observing solar phenomena without the risk of eye damage. Directly viewing the sun, especially during events like solar eclipses, can cause severe retinal damage. A camera obscura allows for indirect observation.

Proposition VI. A camera augments vision.

Proof.—In the early 17th century, alongside his formulation of geometrical optics, Johannes Kepler coined the term camera obscura. Kepler used the device to observe a solar eclipse in 1605 and made significant discoveries about the nature of the moon’s shadow on Earth.123 In the same century, the invention of the telescope dramatically increased the capacity to observe celestial bodies, and the camera obscura was adapted to fit these new instruments. Astronomers attached a camera obscura box to their telescopes, enabling them to project the magnified image onto a piece of paper and trace the celestial bodies and their movements.

Corollary.—A camera gives us super vision.

Proposition VII. Imposes a grid.

Proof.—Leon Battista Alberti, the Italian architect, philosopher, and cryptographer, developed another device, the veil, to capture three-dimensional space.

Corollary.—Sheer discontinuity.

Note.—Alberti’s writing indicates the utility of the device: “I believe nothing more convenient can be found than the veil, which among my friends I call the intersection, and whose usage I was the first to discover. It is like this: a veil loosely woven of fine thread, dyed whatever color you please, divided up by thicker threads into as many parallel square sections as you like, and stretched on a frame. I set this up between the eye and object to be represented, so that the visual pyramid passes through the loose weave of the veil. This intersection of the veil has many advantages, first of all because it always presents the same surface unchanged, for once you have fixed the position of the outlines, you can immediately find the apex of the pyramid you started with, which is extremely difficult to do without the intersection.”124 The veil consisted of a squared-off grid that—when positioned between the observer and the observed—would break down the scene into a series of smaller, manageable squares. This allowed the viewer to collect depth data and record spatial coordinates by translating the viewed scene onto a similarly gridded paper—each square representing a segment of the visual field.

He also invented the finitorium,125 a radial dial with descending plumb lines. This device is placed above an object. The arm of the radial dial indicates XY coordinates and the weighted plumb lines measure Z coordinates. The operator rotates the arm, repositions the plumb lines, and records coordinates where the plumb line intersects the surface of the object. A virtual model made of points. The finitorium descended from the astrolabe, an ancient astronomical instrument used for solving various celestial calculations, including measuring the positions of stars, determining local time, and finding one's latitude. It consists of a circular disk with various markings, an alidade—a pivoting pointer—and a rotating plate with a sighting mechanism. The theodolite also emerged from this lineage of angular measurement.
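A minimal sketch of the finitorium's point capture as read from the description above; the conventions (angle from the dial, radius along the arm, plumb drop measured from the arm's height) and all names are assumptions for illustration:

```python
# Each reading of the finitorium fixes one surface point: the rotating arm
# gives an angle, the plumb line's position along the arm gives a radius,
# and the plumb line's drop gives the height of the surface below the arm.
import math

def finitorium_point(angle_deg: float, radius: float, drop: float,
                     arm_height: float) -> tuple[float, float, float]:
    """Convert one dial reading to an (x, y, z) coordinate."""
    theta = math.radians(angle_deg)
    x = radius * math.cos(theta)
    y = radius * math.sin(theta)
    z = arm_height - drop  # the plumb line hangs down to the surface
    return (x, y, z)

# Sweeping the arm and repositioning the plumb line accumulates a point
# cloud: a virtual model made of points.
points = [finitorium_point(a, 0.3, 0.45, arm_height=1.0) for a in range(0, 360, 15)]
```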

The invention of the theodolite is attributed to Leonard Digges, an English mathematician and surveyor, in the 16th century.126 A theodolite is a precision optical instrument used in surveying and engineering to measure horizontal and vertical angles. It consists of a telescope mounted on a rotating base and a vertical axis. The telescope can be rotated horizontally (azimuth) and vertically (elevation) and is equipped with crosshairs or a reticle to measure angles accurately. In early versions of theodolites, the crosshairs were made of spider webs. The crosshair is the reticle or a network of fine lines inside the telescope that aids in precise aiming and measuring angles. Spider silk, due to its thinness and ability to form a fine thread, was commonly used for this purpose. The web strands were carefully mounted and adjusted within the telescope to intersect at the center, forming the crosshairs. Spider silk was also highly valued for its strength and lack of stretching, which ensured the stability and accuracy of the measurements: “Older Coast and Geodetic Survey (C&GS) triangulation manuals required that all field parties carry a spider’s cocoon with them and included instructions for replacing broken micrometer wires with threads from the cocoon.”127 Over time, advances in technology and the availability of more durable materials led to theodolites with manufactured reticles, such as etched glass or metal wire.

Theodolites are used for precise land surveying and engineering tasks, while sextants are employed in celestial navigation for determining latitude at sea or in space. Sextants measure the angular distance between celestial objects and the horizon. The instrument’s invention is commonly attributed to two individuals—John Hadley, an English mathematician—and Thomas Godfrey, an American inventor. Both independently designed and built similar instruments around 1730. The sextant revolutionized navigation by providing sailors with a highly accurate means of determining latitude at sea. Early versions of the sextant featured a solid frame, a graduated arc with a movable arm carrying a small telescope, and a mirror. The observer would align the instrument to measure the angle between a celestial body, usually the sun or a star, and the visible horizon. Over time, the sextant underwent refinements and improvements—double frames and vernier scales—leading to increased accuracy and accessibility: “Sextants designed for aircraft navigation are equipped with a pendulum or a gyroscope that serves as an artificial horizon, as well as a mechanism that allows the navigator to average several observations taken in rapid succession.”128 Metrology and optics intertwined.

Proposition VIII. Fixing light.

Proof.—The earliest known photograph was taken in 1822129 by Joseph Nicéphore Niépce, using a process called heliography, or sun writing: “to fix the images of objects by the action of light” or “the means of fixing spontaneously by the action of light, the images seen in the ‘camera obscura’.”130 A camera obscura captures an image on a metal plate coated with a light-sensitive chemical.

Corollary.—In 1839, Louis Daguerre and William Henry Fox Talbot independently developed new processes that significantly reduced the exposure time required to create a photograph. Daguerre’s process, called the daguerreotype, used a polished silver-plated copper sheet as the medium: “I have seized the light. I have arrested its flight.”131 Talbot’s process, known as the calotype, used paper coated with silver iodide: “the inimitable beauty of the pictures of nature’s painting which the glass lens of the Camera throws upon the paper in its focus—fairy pictures, creations of a moment, and destined as rapidly to fade away … the idea occurred to me … how charming it would be if it were possible to cause these natural images to imprint themselves durably, and remain fixed upon the paper.”132 These new processes made photography more practical and accessible, and led to a surge of interest in the technology.

Note.—The first practical method of color photography, additive color synthesis, was proposed by Scottish physicist James Clerk Maxwell in 1855. He took three separate photographs of a tartan ribbon, each time with a different color filter over the lens: red, green, blue. When superimposed, these created a full-color image. “In 1861 he commissioned Thomas Sutton to take a demonstration photograph of a tartan ribbon which he showed projected onto a screen at King’s College London. This image shouldn’t have worked as well as it did, because the photographic chemicals did not respond to red light. Serendipitously, unseen ultraviolet light also reflected off the red portions of the ribbon and provided the third color.”133
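A minimal sketch of additive synthesis in the spirit of Maxwell's demonstration, assuming three grayscale exposures of the same scene, already registered, held as NumPy arrays; names and sample values are illustrative:

```python
import numpy as np

def additive_synthesis(red: np.ndarray, green: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Superimpose three filtered exposures as the channels of one color image.
    Each input is a 2-D array of intensities in [0, 1]."""
    assert red.shape == green.shape == blue.shape
    return np.stack([red, green, blue], axis=-1)

# A pixel bright in all three exposures renders white; dark in all three,
# black; bright only through the red filter, red.
red   = np.array([[1.0, 0.0, 1.0]])
green = np.array([[1.0, 0.0, 0.0]])
blue  = np.array([[1.0, 0.0, 0.0]])
print(additive_synthesis(red, green, blue))  # shape (1, 3, 3)
```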

Color photography did not become widespread until the mid-20th century with the invention of subtractive color film technologies like Kodachrome and Technicolor: “Technicolor never mentioned the name Kodachrome when referring to the technology used in its communications to the press and stockholders. Instead it used descriptions such as ‘an experiment in monopack’, ‘the Monopack procedure’ and even ‘Technicolor Monopack’ for the system used. But no matter how it was called, the technology was very probably the same …”134 Identical twins.

Proposition IX. Capture gains velocity.

Proof.—In the late 1800s, the development of celluloid film allowed for even faster and more efficient photography: “before becoming a synonym for cinema, celluloid was used to imitate expensive materials like ivory, tortoiseshell … gemstones.”135 This material advance led to the widespread use of photography for scientific, commercial, and artistic purposes. The introduction of the handheld camera in the 1890s made photography even more portable and accessible, and led to a boom in amateur photography.136

Corollary.—Firing time.

Proof.—Eadweard Muybridge was a seminal figure in the capture of motion. It began with a wager in 1872. Technology to settle a debate. Do all four hooves leave the ground when a horse gallops? To find out, Muybridge engineered a sequence camera system using the trigger of a gun.137 He lined up a series of cameras along a racetrack, each triggered by a thread as the horse ran by. This method allowed him to capture sequential images showing detailed phases of motion. Proving that horses float: “It was as though he had grasped time itself, made it stand still, and then made it run again, over and over. Time was at his command as it had never been at anyone’s before.”138 Building on this invention, Muybridge later designed his zoopraxiscope, a precursor to a moving image projector. Shooting light back into space.

In another significant experiment, he extended his exploration of motion and perspective by photographing his subjects from multiple angles.139 He arranged cameras in a circle around the subject, pioneering a technique that creates a 360-degree view—bullet-time photography. A matrix of cameras.

Proposition X. Measuring space.

Proof.—Photogrammetry, the science of making measurements from photographs, has roots that trace back to the mid-19th century, shortly after the invention of photography. The German twin of the term photogrammetry was coined by the German architect Albrecht Meydenbauer in 1867, in the title of his article, Die Photometrographie.140 Meydenbauer initially used the technique to create architectural drawings of buildings that were difficult to sketch by hand. He developed a photomeasure table and used it to create precise measurements from the photographs he had taken.141

Note.—French military engineer and surveyor Aimé Laussedat began experimenting with the use of terrestrial photos for topographic purposes. He is often credited with being the first to use the term photogrammetry in the scientific literature and for his work in demonstrating the practical application of the method in topographic surveying: “In 1842, after two years of study at the École Polytechnique in Paris, lieutenant Laussedat was assigned to the corps of engineers where he spent his entire military career. He was first assigned to the fortifications of Paris where he participated in the construction of the fort of Romainville, then in Bayonne (French Pyrenees) for the recognition of the Franco-Spanish border and the study of the establishment of a stronghold in Cambo. He was then responsible for making topographic surveys, and since then, he began to think about the alternative approaches that he was to develop during several decades in order to make topographic surveys more accurate and efficient, particularly in mountainous areas.”142 By the late 19th century, Laussedat had developed the basis for aerial photogrammetry, although the lack of suitable flight technology at that time meant that his ideas wouldn’t be fully realized until the 20th century.143

Aerial photogrammetry began to take shape during World War I, where it was used for reconnaissance and mapping. Photographs taken from balloons144 and later from airplanes were used to create topographic maps of enemy territory: “Photography is a marvelous discovery, a science that has attracted the greatest intellects, an art that excites the most astute minds—and one that can be practiced by any imbecile … In photography, like in all things, there are people who can see and others who cannot even look.”145 After the war, this technique was further refined and developed. The introduction of stereoscopy, where two photos taken from slightly different perspectives are combined to give a three-dimensional effect, allowed for more precise measurements and led to further advances in the field.

Corollary.—Disparity from above.

Note.—The concept of stereo depth cameras draws from the biological principle of binocular vision, evident in animals including humans, which use two eyes to perceive depth (III.xxix.proof.). This principle was first applied to photography in the mid-19th century by Sir Charles Wheatstone, who invented the stereoscope,146 a device for viewing a pair of separate images depicting left-eye and right-eye views of the same scene, creating an illusion of depth (III.postulate.i.). Wheatstone also invented the chronoscope, a device for measuring velocity: “Patented in 1874, the ballistic chronograph was the most accurate way to find the speed of bullets.”147

Stereo depth cameras, also known as stereo vision systems or stereo cameras, came into focus with computer vision and digital imaging in the late 20th century. These systems typically consist of two or more lenses that capture two slightly different views of an object or scene. Akin to the left and right eyes in binocular vision. These images are then processed by a computer to compare and analyze the differences between them, known as disparity. The baseline distance between the two cameras and the focal length of the lenses are known. As a result, the system can calculate the depth of each point and generate a depth map (III.xxv.).148
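A minimal sketch of the triangulation step, assuming a rectified pair with known baseline (meters) and focal length (pixels); the function name and numbers are illustrative:

```python
def depth_from_disparity(disparity_px: float, baseline_m: float, focal_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair.
    Larger disparity means a closer point; zero disparity means infinity."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity has no finite depth")
    return focal_px * baseline_m / disparity_px

# A point that shifts 40 px between two cameras 10 cm apart, imaged with an
# 800 px focal length, lies about 2 meters away.
print(depth_from_disparity(40.0, 0.10, 800.0))  # 2.0
```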

Proposition XI. Structural deformation.

Proof.—Structured light sensors represented a significant development in the field of optical metrology. The concept of structured light scanning was introduced in the 1960s, but the technology became more widespread with advancements in computer technology in the 1980s. The foundational principle behind these sensors is the projection of a known pattern (dots, lines, grids) onto an object. As the pattern deforms over the object, it is captured by a camera with known position and orientation. By analyzing the deformation of the pattern, the sensor can calculate depth information and create a three-dimensional representation of the object. In its early stages, structured light technology was primarily utilized in industrial settings for quality control and inspection. However, with miniaturization of components, improvements in computational efficiency, and advances in photonics, the applications of structured light are multiplying: “All light has structure, but only recently has it been possible to control it in all its degrees of freedom and dimensions, fueling fundamental advances and applications alike … from traditional two-dimensional transverse fields towards four-dimensional spatiotemporal structured light and multidimensional quantum states, beyond orbital angular momentum towards control of all degrees of freedom, and beyond a linear toolkit to include nonlinear interactions, particularly for high-harmonic structured light.”149
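A minimal sketch of one common decoding strategy, temporal binary stripes, offered as an assumed illustration rather than any particular sensor's method: each pixel's on/off history across the projected patterns identifies which projector column illuminated it, after which depth follows by triangulation, treating the projector as an inverse camera:

```python
import numpy as np

def decode_binary_patterns(captures: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Decode temporal binary structured light.
    captures: array of shape (P, H, W), the scene imaged under P successive
    binary stripe patterns, intensities in [0, 1].
    Returns an (H, W) map of projector column codes (0 .. 2**P - 1)."""
    bits = (captures > threshold).astype(np.int64)   # (P, H, W) of 0/1
    weights = 2 ** np.arange(bits.shape[0])[::-1]    # most significant bit first
    return np.tensordot(weights, bits, axes=1)       # (H, W) integer codes

# With the projector column recovered per pixel, the column-vs-pixel offset
# plays the role of stereo disparity in the triangulation sketched above.
```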

Corollary.—Spatial lasers.

Note.—Light Detection and Ranging—LiDAR—is a remote sensing method that uses light in the form of a pulsed laser to measure distances. It was developed in the 1960s, shortly after the invention of the laser; Hughes Aircraft Company built the first laser radar system in 1961. The early use of LiDAR technology was primarily in the field of atmospheric research, where it was used to measure clouds and pollution levels. The first major applications came in the field of topographical mapping and have since extended to various areas including archaeology, forestry, construction, and autonomous vehicles.150 A significant breakthrough for LiDAR came in the early 2000s when it was used in NASA’s Mars Exploration Rover mission to map the Martian terrain.151 Now sensing is being used to look below the planet’s surface: “The Radar Imager for Mars' Subsurface Experiment, known as RIMFAX, uses radar waves to probe the ground under the rover … No one knows what lies beneath the surface of Mars. Now, we'll finally be able to see what's there.”152 Advancements in laser technology, GPS, and data processing have made LiDAR more accurate, powerful, and accessible, leading to widespread adoption.

Proposition XII. Time of flight.

Proof.—Time of Flight (ToF) sensors measure the time taken for light or other types of signals to travel to an object and back. Time of Flight has roots in mid-20th century radar technology. The principle of time-of-flight was initially applied in fields like geology and space exploration, where it was used to measure distances on a large scale. The advent of faster and more powerful microprocessors made ToF sensors practical for smaller scale and real-time applications. Strands of light darting to and fro. Veils woven from time. Segmenting space.153

Note.—This proposition is also evident, and is more clearly to be understood from II. vii.

Proposition XIII. Shoot to measure.

Proof.—In ToF, a light signal, often from a laser or an LED, is emitted towards an object. The sensor then measures the time it takes for the light to bounce back after hitting the object. Given that the speed of light is constant, the sensor can calculate the distance of the object by multiplying half of the measured time by the speed of light. The time is halved because the light travels to the object and back. The ToF sensor produces a depth map of the environment.154 Each pixel corresponds to a distance, which is particularly useful for autonomous vehicles. Navigation. Obstacle detection: “As long as the vehicle is moving at walking speed, the measuring range of a ToF camera is sufficient. For higher speeds, you may use lidar with a scanning ToF principle. ToF cameras are not certified safety devices, so they have to be used in combination with other sensor modalities.”155
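A minimal sketch of the calculation described in the proof; the constant is the speed of light, and the sample round-trip time is illustrative:

```python
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """One-way distance: half the round trip multiplied by the speed of light."""
    return C * round_trip_seconds / 2.0

# A pulse returning after 10 nanoseconds places the surface ~1.5 meters away.
print(tof_distance(10e-9))  # ≈ 1.499
```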

Note.—Throughout, the transition to digital technology marked a significant shift. The first digital camera was created by an engineer at Eastman Kodak, Steven Sasson, in 1975: “It only took 50 milliseconds to capture the image, but it took 23 seconds to record it to the tape. I’d pop the cassette tape out, hand it to my assistant, and he would put it in our playback unit. About 30 seconds later, up popped the 100-pixel-by-100-pixel black-and-white image.”156 This clunky device laid the foundation for a digital revolution. Digital technology made photographs easy to take, store, share, edit. It transformed the way we capture and interact with images. The widespread adoption of digital cameras has democratized photography. With the integration of cameras into other devices, capturing high-quality data is now available to most people. Unlike film photography, digital technology provides instant feedback. View. Delete. Share. With the internet, digital photos instantaneously cross vast distances (I.xxv.).

AXIOM I. All bodies are either in motion or at rest.

AXIOM II. Every body is moved sometimes more slowly, sometimes more quickly.

LEMMA I. Bodies are distinguished from one another in respect of motion and rest, quickness and slowness, and not in respect of substance.

Proof.—Event cameras.

LEMMA II. The advent of inhuman vision (IV.xxxvii.note.i.).

Proof.—Event cameras, also known as neuromorphic sensors or silicon retinas, represent a shift from traditional frame-based imaging to bio-inspired, asynchronous capture.157 The development of event cameras began in the early 1990s, driven by researchers seeking to mimic the highly efficient processing mechanisms found in biological vision systems: “An insect’s compound eye is an engineering marvel: high resolution, wide field of view, and incredible sensitivity to motion, all in a compact package.”158

LEMMA III. Pixels operate independently.

Proof.—Conventional cameras capture images at fixed time intervals. Event cameras operate on a radically different principle. Each pixel in an event camera operates independently and responds to changes in the logarithmic intensity of light. When the change in light intensity at a pixel crosses a certain threshold, it generates an event. This event includes the pixel’s coordinates, the polarity of the change (increase or decrease), and the precise timestamp at which this change happened.159

Event cameras increase the rate of capture. They allow for incredibly high temporal resolution and a wide dynamic range. Lower power consumption. Smaller data output. Each pixel’s individual, asynchronous operation reacts almost instantly to change, capturing fast motion and adapting swiftly to shifts in lighting conditions. Event cameras hold great potential in fields that require quick reaction: self-driving cars, robotics, augmented reality.
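A minimal sketch of the per-pixel rule described above, with an illustrative contrast threshold; real event cameras implement this asynchronously in silicon, so the loop below only mirrors the logic:

```python
import math

THRESHOLD = 0.2  # illustrative log-intensity contrast threshold

def pixel_events(samples, x, y):
    """Yield (x, y, polarity, timestamp) events for one pixel.
    samples: iterable of (timestamp, frame) pairs with frame[y][x] > 0."""
    reference = None
    for t, frame in samples:
        log_i = math.log(frame[y][x])
        if reference is None:
            reference = log_i
            continue
        if abs(log_i - reference) >= THRESHOLD:
            polarity = +1 if log_i > reference else -1
            yield (x, y, polarity, t)
            reference = log_i  # re-arm against the new reference level
```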

Corollary.—Capture is evolving.

Axiom I.—Capture is multiplying.

Axiom II.—Capture is accelerating.

It is estimated that over 6.5 billion people around the world have a phone equipped with at least one camera.160 On top of this, the number of standalone digital cameras produced annually runs into the tens of millions. Add in other types of cameras (those in vehicles, computers, security systems) and the total number of cameras is immense. The proliferation of cameras has complex ethical dimensions. The production and operation of capture technologies are destructive to the environment. The existence of this infrastructure erodes privacy.

Definition.—Extractivism refers to an economic and socio-political model focused on the extraction of natural resources from the Earth. Originating during the colonial period, this extractive paradigm persists in many economies globally, underpinning key industries such as mining, forestry, fishing, and fossil fuels. While Extractivism has been a significant driver of economic growth and development, it has also raised substantial ethical and environmental concerns.161

Axiom III.—The historical roots of Extractivism lie in colonial practices, where colonial powers systematically extracted resources from colonized regions to fuel their own economic growth. This resulted in massive wealth accumulation for colonial powers. Colonized regions were forced into states of economic dependence and ecological imbalance. The remnants of this extractive legacy continue to shape global economic relations, with many post-colonial nations still heavily reliant on exporting raw materials to developed nations: “Extractivism ran rampant under colonialism because relating to the world as a frontier of conquest, rather than a home, fosters this particular brand of irresponsibility. The colonial mind nurtures the belief that there is always somewhere else to go to and exploit once the current site of extraction has been exhausted.”162

LEMMA IV. Extractivism has undoubtedly contributed to national and global wealth. The extraction and export of natural resources have financed growth, development, and job creation. Countries rich in natural resources, particularly minerals, oil, and gas, have seen substantial economic growth. However, this wealth is often unevenly distributed, contributing to social inequality.

Proof.—Reliance on resource extraction can lead to economic instability due to fluctuations in global commodity prices, a phenomenon known as the resource curse: “The term resource curse refers to a paradoxical situation in which a country underperforms economically, despite being home to valuable natural resources. A resource curse is generally caused by too much of the country’s capital and labor force concentrated in just a few resource-dependent industries.”163

LEMMA V. The exploitation of natural resources often comes at a considerable human cost, including the displacement of local communities, poor labor conditions, and impacts on the health and wellbeing of those living near extractive operations. Extractivism consumes Earth’s finite resources, raising questions about intergenerational equity: the fairness of depleting resources for future generations.

Proof.—The same as for the last Lemma.

LEMMA VI. The environmental implications of Extractivism are profound. Extractive industries are major contributors to environmental degradation, including deforestation, soil erosion, water pollution, and loss of biodiversity. The extraction and burning of fossil fuels have significant impacts on climate.

Proof.—The United Nations Environment Programme recognizes that “...the economic benefits come at a cost: Climate change: Resource extraction is responsible for half of world’s carbon emissions; Pollution: the extractives sector contributes to air, water and land pollution, toxic wastes and has caused significant water pollution. Oil production has also gravely impacted the environment in countries such as Nigeria; Biodiversity loss: 20% of oil and gas contracts block overlaps with biodiversity protected areas in Africa; Social issues: Tailing dam disasters have threatened people’s lives and safety; mining was the third sector linked to the most murders, with over half of the attacks in three countries (Colombia, Mexico and the Philippines). Many human rights abuses are also linked to ASM with over 40,000 children working in cobalt mines in DRC.”164

LEMMA VII. Manufacturing these devices—sensors and servers—involves complex sources (minerals, metals, human labor) and extractive processes that carry significant environmental and social tolls.

Proof.—This proposition is evident from the definition of Extractivism prefixed to Lemma iv.

Note.—The production of camera sensors necessitates the extraction of various rare earth minerals and metals. These include elements like lanthanum, used in camera lenses for its high refractive index;165 indium, a key component of the LCD displays found in digital cameras; and tantalum, used in capacitors in camera circuitry. These valuable materials are not distributed evenly across the Earth’s crust. And their extraction is a dangerous and environmentally damaging process.

The mining of these materials frequently involves open-pit techniques that drastically alter landscapes, promote deforestation, and lead to substantial water and soil pollution. For example, the extraction of tantalum, primarily from coltan ore, is directly linked to deforestation and habitat destruction, particularly in conflict-prone areas like the Democratic Republic of Congo. This environmental damage, in turn, has far-reaching ecological and human impacts, threatening biodiversity and local communities’ livelihoods: “Until relatively recently, companies such as Intel, HP and Apple haven’t had to trace the source of the tantalum that goes into their electronic devices, but this all changed with the Dodd-Frank Reform in 2010. The Act states that all companies registered with the US Securities and Exchange Commission have to disclose whether they are receiving tantalum, tungsten, tin, and gold from Congo, and whether those minerals are connected to sites of conflict.”166

Lanthanum, a crucial element in camera lenses, is primarily extracted through mining rare earth ores, particularly bastnasite and monazite. These ores are often found mixed with other substances, requiring extensive and energy-intensive processes to separate and refine lanthanum. Extraction operations predominantly occur in China, which is home to the world’s largest rare earth deposits. The open-pit mining process disrupts local ecosystems, causing soil erosion, habitat loss, and water contamination from mining waste. The refining process is also highly polluting, involving strong acids and producing hazardous waste, which often contains radioactive thorium: “Due to high technological growth, increasing demand, and changing government policies on import and export, mining of lanthanum will steeply increase in the coming years.”167

Indium, a key element in LCD displays of digital cameras, is most commonly found in association with zinc ores, and to a lesser extent, with lead, tin, and copper ores. The extraction of indium often occurs as a byproduct of zinc or lead mining. Once the host ore is mined, it undergoes a complex series of chemical reactions to isolate indium. One of the key environmental concerns with indium mining is the substantial amount of waste produced, as only a tiny fraction of the mined material is actually indium.168 The processing of these ores to isolate rare earth elements is energy-intensive and results in significant emissions of carbon dioxide, and the waste from these processes also contains harmful radioactive elements: “Knowledge of the anthropogenic and natural cycling of indium can lead to a greater understanding of the environmental impacts and human health effects of this metal.”169

Rare earth mining for elements—tantalum, lanthanum, indium—used in cameras often produces radioactive byproducts. This is because many rare earth elements are found in geological deposits alongside naturally occurring radioactive materials. When these ores are mined and processed, radioactive materials—uranium and thorium—are brought to the surface. The waste products, or tailings, from the extraction processes contain a mixture of these, as well as their decay products, which include various isotopes of radium and radon. The handling and disposal of radioactive byproducts pose significant environmental and health risks. In many places, the waste is stored in tailings ponds, which are large, engineered dam and dyke systems designed to hold the mining waste. These ponds can be susceptible to leaks, or worse, catastrophic failures, leading to widespread environmental contamination.170

The dust from these tailings can be windborne and contaminate surrounding areas. Radon gas can escape into the atmosphere. Exposure to these radioactive materials increases the risk of cancer and other health problems in human and non-human lifeforms. The long half-life of these radioactive elements means that they remain hazardous for thousands of years: “The typical ground water and soil samples around the tailings pond were sequenced. The dominant bacteria in soil and ground water are consistent. The dominant bacteria were Actinobacteria, Proteobacteria and Acidobacteria at phylum level. This microbial community composition is similar to that reported in arid lands around the world.”171 What is our time of flight to an arid wasteland?

From the miners who extract raw materials to the factory workers assembling products, labor is extracted at every step. The conditions of this labor are fraught. Inadequate wages. Poor working conditions. Violations of workers’ rights. Child labor.172 The factories where cameras are assembled are typically in developing countries. Workers are subjected to long hours in poor conditions with little job security and insufficient health and safety protections. They are increasingly subjected to electronic surveillance and even required to wear monitoring devices: “Electronic surveillance puts the body of the tracked person in a state of perpetual hypervigilance, which is particularly bad for health … Employees who know they are being monitored can become anxious, worn down, extremely tense, and angry. Monitoring causes a release of stress chemicals and keeps them flowing, which can aggravate heart problems. It can lead to mood disturbances, hyperventilation, and depression.”173 Extractive practices migrate.

Data is collected, transmitted, and stored. Collection—The sensor detects and measures a physical property of its environment. It converts this information into a digital signal. Data acquisition. Transmission—The sensor sends the data to a server through wired or wireless networks. Storage—The server receives the data and stores it for later use. Data is queued.174 The server may also perform processing or analysis on the data. Clusters of servers have large storage capacities and powerful processing abilities, allowing for the handling and analysis of large volumes of data.175 They provide a centralized location for data access and management. Multiple devices or users can access the data simultaneously.
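
A minimal sketch of this three-stage pipeline, in Python, with every name invented for illustration: collection converts a physical property into a digital signal, transmission serializes it for the wire, and storage queues it on a server.

```python
import json
import random
import time

def collect() -> dict:
    """Collection: a simulated sensor converts a physical property into a digital signal."""
    return {"timestamp": time.time(), "lux": random.uniform(0, 100_000)}

def transmit(reading: dict) -> bytes:
    """Transmission: the reading is serialized for the network (the send itself is omitted)."""
    return json.dumps(reading).encode("utf-8")

class Server:
    """Storage: the server queues readings for later processing and shared access."""
    def __init__(self) -> None:
        self.queue: list[dict] = []

    def receive(self, payload: bytes) -> None:
        self.queue.append(json.loads(payload))

server = Server()
server.receive(transmit(collect()))  # one capture, moved through all three stages
```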

The creation of servers begins with the extraction of essential materials such as gold, silver, copper, and aluminum, along with a range of rare earth elements. Like the materials used to make sensors, these are typically obtained through open-pit mining operations. Following extraction, these raw materials undergo refining processes to make them suitable for use in manufacturing. The refinement stage frequently involves the use of harsh chemicals that can pollute local water sources and produce a significant amount of waste. For instance, gold refining often entails the use of cyanide, creating toxic tailings: “Cyanide is a rapidly acting substance that is traditionally known as a poison. Hydrogen cyanide was first isolated from Prussian blue dye in 1786 and cyanide first extracted from almonds around 1800. Cyanide can exist as a gas, hydrogen cyanide, a salt, potassium cyanide … Cyanide is also found in manufacturing and industrial sources such as insecticides, photographic solutions, plastics manufacturing ... It has been used as a poison in mass homicides and suicides.”176 Manufacturing and assembly stages are the subsequent links in the chain of server production. Refined materials are shaped into components like circuit boards, processors, memory chips, and hard drives. These components are assembled into servers of voracious energy consumption.

Once in operation, servers consume vast amounts of electricity. They contribute to carbon emissions and climate change. As digital services proliferate, the demand for servers increases. Servers contain multitudes—hazardous heavy metals, flame retardants, poisons. And servers are disposable. The end of their lifecycle presents yet another challenge—improper handling and disposal perpetuate environmental harm: “Immortal waste.”177 The mining and manufacturing of servers involves exploitative labor practices. Local communities are treated as disposable resources—servitude underlies servers: “All things can be deadly to us, even the things made to serve us; as in nature walls can kill us, and stairs can kill us, if we do not walk circumspectly.”178 

POSTULATES

I. Capture is extraction.

II. Extraction of land and labor.

III. The machinery of Capture is in place.

IV. Capture has infrastructure.

V. Capture has methodologies.

VI. The three main methods of Capture—terrestrial, close range, aerial—reperform languages of invasion and control.

Proposition XIV. Capture extends our reach.

Proof.—Terrestrial photogrammetry, the practice of deriving measurements from ground-based photographs, employs several methods for capturing data: “We shall call the first pole of capture imperial or despotic.”179 One common method is to use a pole, sometimes referred to as a monopod. A camera is mounted to a tall, often extendable pole and is usually controlled remotely. The pole allows the operator to take photos from elevated viewpoints. This method is beneficial for capturing images of hard-to-reach areas, such as rooftops, or to provide a bird’s-eye view of a scene. The pole method is often used in architectural and archaeological photogrammetry.180

Proposition XV. Capture is ritual extraction.

Proof.—The telescoping pole extends the range of human vision, cheaply and simply. The captor walks through the terrain or around an object holding the pole. The goal is to capture images from every possible angle, height, and proximity. This practice of total observation parallels the medical examination in Foucault’s Discipline and Punish: “in all the mechanisms of discipline, the examination is highly ritualized. In it are combined the ceremony of power and the form of the experiment, the deployment of force and the establishment of truth. At the heart of the procedures of discipline, it manifests the subjection of those who are perceived as objects and the objectification of those who are subjected.”181 The captor parades around with a camera atop a pole, in a highly ritualized series of concentric circles, closing in on its object of desire.

Proposition XVI. Violence haunts terrestrial capture.

Proof.—What does this practice signal? It recalls historical invasions, starting with the Crusades, where European men marched under tall crosses and banners. They paraded through foreign lands to violently assert Christian ideology and cultural dominance. Again, in the colonial era, European men entered the same spaces with tall poles waving flags to impose Western imperial power. In addition to the visual language of vertical procession, the same verbal language of salvation is used in all three cases. Photogrammetry—and other forms of 3D Reconstruction—is hailed as an ethical alternative to older conventions in the field of archeology;182 the global community now frowns on violently ripping physical artifacts out of their situated contexts. Instead scholars make virtual models of artifacts to preserve and study, supposedly without disturbing their originary ecosystems. However, many archeologists who advocate for photogrammetry as a mechanism of cultural heritage preservation imply that indigenous populations cannot be trusted to protect historical artifacts. The digital databases that store these archeological models—Arc/k Project, CyArk, ARK—consistently reinforce Judeo-Christian exceptionalism—even in their names. Through research, sculpture, installation, and performance, the artist Morehshin Allahyari explores the concept of Digital Colonialism, which she defines as “a framework for critically examining the tendency for information technologies to be deployed in ways that reproduce colonial power relations.”183

Corollary I.—Capture marches through space.

Corollary II.—Capture plants flags.

Proposition XVII. Firing at close range.

Proof.—Close-range capture involves taking images of a captive object from a close distance. It is often used for small, highly-detailed subjects. Devices of close-range capture include turntables, robotic arms, CMMs, and cages. These devices require extraction from context.

Corollary.—The mind is able to regard as present external bodies, by which the human body has once been affected, even though they be no longer in existence or present.

Proof.—A turntable is a revolving platform onto which the captive object is placed. The camera typically remains stationary, capturing images of the object as it rotates. This allows for stable and systematic capture. The turntable is a controlled environment. Consistent lighting. Featureless background. Conformity.184 This approach lends itself to consumerism—quality control, product photography, game asset generation.

Note.—Robots replace the captor. Equipped with a camera, a robotic arm can maneuver around an object, capturing images from a variety of angles and elevations. This technique is highly valuable when dealing with complex objects or when access to all angles is otherwise restricted: “reflective or almost black surfaces, complex structured surfaces, cavities. I was surprised: no problem for CultArm3D. I haven’t seen such an autonomous system before.”185 The ability to program the robotic arm offers the flexibility to adjust the process based on a captive’s shape, size, and intricacy. Path-planning. Policies. Adaptable capture.

Alberti’s finitorium, automated. Coordinate Measuring Machines (CMMs) are vital tools in the field of metrology, the science of measurement. CMMs are used to measure the physical geometry of an object. A probe attached to the third moving axis of a CMM is used to touch the object in specific places. These machines can be manually controlled by a human operator or may be programmed and controlled by computers. The CMM probe captures coordinates that produce a point cloud, describing the surface of the captive object.186 Advanced CMMs can also use various scanning methods to gather data points rapidly, providing a high-density point cloud suitable for detailed examination and reverse engineering—even at micro and nano-scales.187 The precise measurements obtained from CMMs are essential in industries such as automotive, aerospace, and manufacturing, where adherence to stringent quality control standards is mandated.
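
A toy sketch of what a CMM emits, assuming hypothetical probe readings: each touch yields one coordinate, the accumulated touches form the point cloud, and the cloud can be checked against a manufacturing tolerance.

```python
# Illustrative only: a CMM probe reports one (x, y, z) coordinate per touch;
# the accumulated touches describe the captive object's surface as a point cloud.
probe_touches = [
    (0.0, 0.0, 10.002),
    (5.0, 0.0, 10.005),
    (5.0, 5.0, 9.998),
    (0.0, 5.0, 10.001),
]

point_cloud = list(probe_touches)  # the "model" is nothing but points

# A simple metrology check: does the measured surface stay within tolerance?
nominal_z, tolerance = 10.0, 0.01
in_spec = all(abs(z - nominal_z) <= tolerance for _, _, z in point_cloud)
print(f"{len(point_cloud)} points captured; within tolerance: {in_spec}")
```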

Proposition XVIII. Cages issue from the panspectron188 (I.xv.note.).

Proof.—A captive is extracted from its environment and placed on a turntable. But conventional photogrammetry only produces accurate models from stationary, rigid objects. Any movement during capture disrupts its calculus. It is exceedingly difficult to achieve clean results from a moving subject like a person or animal with a single camera. It takes time to reposition a camera; live subjects holding still will inevitably shift. Each small movement creates a blur, spike, or hole in the resulting mesh. In order to achieve highly accurate models of living beings, engineers designed apparatuses—cages—in which an array of cameras—sometimes hundreds—are positioned at equidistant intervals. This arrangement allows for a comprehensive series of photos to be triggered simultaneously.189 

Note.—Cages descend from violence. The word cage is undeniably associated with control and confinement. In order to capture a living subject digitally, it must be placed in a cage: “This enclosed, segmented space, observed at every point, in which the individuals are inserted in a fixed place, in which the slightest movements are supervised, in which all events are recorded, in which an uninterrupted work of writing links the center and periphery, in which power is exercised without division, according to a continuous hierarchical figure, in which each individual is constantly located, examined and distributed among the living beings, the sick and the dead—all this constitutes a compact model of the disciplinary mechanism.”190 

Cages reduce the dynamic body. They insist that the continuous process of a moving thing or living being can be constrained to a limited volume and reduced to rigid and unchanging temporal states: “Through a kind of magic, images change what they reach (and claim to reproduce) into things, and presence into simulacra … copies conforming to a standard, parodies of presence.”191 The body becomes a fixed specimen, losing all sense of life. Like a death mask or shroud, it converts the living being to pure surface (I.definition.v.).

Cages mirror the economics of the disciplinary society. The logic of the prison-industrial complex.192 Cages are extremely expensive. The cost of initial investment limits access. Companies profit off of ownership and rental of the equipment as well as from the marketplace of assets that they generate. Likewise, prisons in the United States are increasingly privatized and profitable. Both provide incentives to place bodies in cages. Disturbingly—whether spatial or imagistic—incarceration is economically generative.

Proposition XIX. Aerial capture extends colonial cartography.

Proof.—In aerial photogrammetry, an aircraft, usually an unmanned aerial vehicle (UAV), or satellite is equipped with high-resolution cameras which are used to capture photographs and depth data of a landscape or structure. The drone can follow a pre-programmed flight path to ensure even coverage of the area. The captured images are then stitched together to create detailed 3D models or topographic maps. Aerial photogrammetry is extensively used in surveying, agriculture, construction, and environmental monitoring due to its ability to cover large areas quickly and efficiently, even in difficult terrain.193
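
A simplified sketch of such a pre-programmed flight path: a serpentine "lawnmower" pattern over a rectangular survey area. Real mission planners also account for image overlap, wind, terrain, and battery life; the function and its parameters are invented for illustration.

```python
def lawnmower_waypoints(width_m: float, height_m: float,
                        spacing_m: float, altitude_m: float) -> list[tuple]:
    """Generate a serpentine ("lawnmower") path that evenly covers a rectangle.
    Returns (x, y, z) waypoints in meters."""
    waypoints = []
    y, leg = 0.0, 0
    while y <= height_m:
        # Alternate direction on each pass so the drone sweeps back and forth.
        xs = (0.0, width_m) if leg % 2 == 0 else (width_m, 0.0)
        for x in xs:
            waypoints.append((x, y, altitude_m))
        y += spacing_m
        leg += 1
    return waypoints

path = lawnmower_waypoints(width_m=200, height_m=100, spacing_m=25, altitude_m=60)
print(len(path), "waypoints")
```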

Proposition XX. Capture collapses space.

Proof.—In Geography and Vision, Denis Cosgrove explains how flight fuels the imagined possibility of total spatial control: “The aeroplane is the most visible of a great range of modern technologies that have progressively annihilated space by time over the course of the past century. The frictional effects of distance, the time and energy expended in moving across space, so painfully apparent on sea and land, are dramatically reduced in flight. The boundaries that disrupt terrestrial movement and fragment terrestrial space disappear in flight, so that space is reduced to a network of points, intersecting lines and altitudinal planes.”194 

Proposition XXI. Vertical perspective is a privileged view.

Proof.—The use of drones—unmanned aerial vehicles—signals both economic and political power, the exceptional ability to act as “a solar Eye, looking down like a god.”195 Use of the technology requires resources and permits to occupy regulated airspace. These permits are difficult to acquire except for persons of influence, those connected to a governing institution or part of a powerful corporation. These individuals have access to a privileged view: “the earth’s topography itself flattens out to a canvas upon which the imagination can inscribe grandiose projects at an imperial scale. From the air, the imposition of political authority over space can be readily appreciated.”196 Once again, space is colonized.

Note.—Aerial Capture is threatening. For those on the ground, the image of the drone above is one of flying terror. It is increasingly difficult to differentiate between benign drones and military drones. UAVs are commonly used for military surveillance and even to deliver payloads in warfare: “vertical sovereignty splits space into stacked horizontal layers, separating not only airspace from ground, but also splitting ground from underground, and airspace into various layers. Different strata of community are divided from each other on a y-axis, multiplying sites of conflict and violence.”197 While the cage is linked to the prison through metaphor, the drone is concrete—the same device in both photogrammetric capture and combat. It is impossible to visually demarcate a drone. Is it collecting images for entertainment or archeology, or is it surveilling, poised to assassinate? From below, state and military power overshadow the photogrammetric drone. Regardless of intent, this technology is a symbol of the power to claim and commodify space and control targets.

Proposition XXII. Sensing from the sky.

Proof.—Remote sensing is a powerful technology used in diverse fields like military intelligence, meteorology, environmental science, and urban planning. Remote sensing has transformed the analysis of our world and other planets. With the ability to collect data about objects or areas from a distance—often using satellites or aircraft—it offers opportunities to monitor and measure phenomena at an unprecedented scale.198 However, the capacity to capture, analyze, and disseminate data remotely raises complex ethical questions about privacy, surveillance, ownership, and consent.

Proposition XXIII. The mind does not know itself, except in so far as it perceives the ideas of the modifications of the body.

Proof.—Remote sensing can be employed as a tool of surveillance by governments or corporations, potentially leading to the misuse of data for controlling or monitoring populations.199 In this regard, remote sensing shares ethical concerns with other surveillance technologies. The balance between security and privacy must be carefully negotiated. The ethical discourse must also consider issues of power dynamics and the potential for such technologies to be weaponized as tools of oppression.

Remote sensing poses questions concerning ownership and consent: “In fact, with digital datasets, a wider range of potential negative impacts can befall stakeholders, including the dehumanization of past peoples (and their modern descendants), the claim of open access of information when such data are rarely accessible to those outside of academia [or corporate powers], and a widening distance between local community knowledge and archaeological research.”200 Who owns the data captured by remote sensors? Advanced nations and well-funded corporations have more access to this technology and the data it produces, potentially reinforcing existing inequalities. And who has the right to give consent for data capture, especially when it concerns shared or public resources, or crosses international boundaries? The organizations deploying these technologies often transcend national borders, adding a layer of complexity to regulatory efforts.

Proposition XXIV. Field of view defines the frame.

Proof.—In optical systems, Field of View (FOV) is the angle of the viewable area that is captured by the camera’s lens. Typically measured in degrees, it is “the angular extent of the observable world that is seen at any given moment. Humans have an almost 180 degree forward-facing FOV, while some birds have a complete or nearly complete 360 degree FOV.”201 Field of view can be adjusted by changing the focal length of a lens or by using different lenses.
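
For an ideal rectilinear lens, the relationship is simple trigonometry: the angle equals 2·atan(sensor width / (2 · focal length)). A short sketch, using a 36 mm full-frame sensor as the example:

```python
import math

def field_of_view_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal FOV of an ideal rectilinear lens: 2 * atan(sensor / (2 * focal length))."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Shorter focal lengths widen the angle; longer ones narrow it.
for f in (24, 50, 100):
    print(f"{f} mm lens -> {field_of_view_deg(36, f):.1f} degrees")
```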

Proposition XXV. The boundaries of supervision.

Proof.—In surveillance and other applications, the field of view can be an important factor in determining the coverage area of the camera. A wider field of view will allow the camera to capture a larger area, while a narrower field of view will allow for higher resolution images of captives: “Consider the number of pixels you have with a given camera as a cargo net made of elastic. Each square in the net represents a pixel. If you widen the view of the camera, you effectively stretch the net. You have widened the pixels, but remember a pixel is only a single value, so now you’ve stretched that single value over more of your scene.”202 Pixel dilution refers to the concept that clarity and accuracy are compromised when attempting to represent a large amount of information with a single pixel. This problem commonly occurs when configuring a scene and expanding the camera’s field of view to the maximum width allowed by the lens.
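
The stretched net can be made concrete. Hold the pixel count fixed, widen the field of view, and each pixel's single value must cover more of the scene. An illustrative calculation with example numbers (a 1920-pixel-wide sensor, 10 meters from the subject):

```python
import math

def scene_width_m(fov_deg: float, distance_m: float) -> float:
    """Width of the viewable area at a given distance for a given horizontal FOV."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

def cm_per_pixel(fov_deg: float, distance_m: float, h_pixels: int) -> float:
    """Pixel dilution: the slice of scene that a single pixel value must represent."""
    return scene_width_m(fov_deg, distance_m) / h_pixels * 100

for fov in (30, 60, 90):
    print(f"{fov:>2} deg FOV -> {cm_per_pixel(fov, 10, 1920):.2f} cm of scene per pixel")
```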

Proposition XXVI. The infrastructure of surveillance.

Proof.—In The Age of Surveillance Capitalism, Shoshana Zuboff—professor emerita at Harvard Business School and expert on the social and economic impacts of technology—explores the emergence of Surveillance Capitalism, “the unilateral claiming of private human experience as free raw material for translation into behavioral data.”203 She explains how this data is “computed and packaged as prediction products and sold into behavioral futures markets—business customers with a commercial interest in knowing what we will do now, soon, and later.”204

Corollary.—Capture obliterates privacy.

Proof.—Zuboff argues that surveillance capitalism represents a fundamental shift in the way capitalism operates, as it is based on the exploitation of personal data rather than the production of goods or services. She asserts that the concentration of data and power in the hands of a few large companies has negative consequences for competition and innovation, and can lead to the further concentration of wealth and power in society. Zuboff also contends that surveillance capitalism poses significant threats to individual privacy, democracy, and the economy. She argues that the constant collection and analysis of personal data by companies can lead to the manipulation and control of individuals and can undermine their autonomy and agency.205

Proposition XXVII. Capture is invasive.

Proof.—The politics of reproductive rights are contentious and often multifaceted, encapsulating a myriad of ethical, moral, religious, and legal dimensions. At the intersection of these discourses lie issues of privacy and control over bodies. Ultrasound technology—internal capture—is increasingly employed in the anti-abortion movement. Though it holds significant medical value in pregnancy, it has also been weaponized to manipulate public opinion and policy.

Proposition XXVIII. Capturing bodies inside bodies.

Proof.—One tactic is the legislative mandate of ultrasound examinations prior to abortion. Numerous states in the U.S. have enacted laws that require physicians to perform an ultrasound and display the images to the woman before proceeding with an abortion. The intention behind these laws is not rooted in medical necessity, but rather aims to create emotional distress and change the decision to abort a pregnancy: “The first step in this process is to perform an ultrasound to determine how far along you are. According to our state law, I must show you the ultrasound and you must hear the fetal heartbeat, if there is one. I know this might be uncomfortable, and I apologize … I understand your frustration. Although an ultrasound is often an important part of the process in abortion care, I don’t think women should have to view the ultrasound if they don’t want to. Unfortunately, this was a law that was passed last year and we can lose our license if we do not provide the ultrasound and have you view it. I can’t proceed with your visit until we have completed this part.”206 

Note.—The emergence of 3D ultrasound technology has further compounded the ethical complexities surrounding reproductive rights. Unlike traditional 2D ultrasounds, 3D technology generates realistic images that closely resemble a photograph, offering a lifelike representation of the fetus: “Most machines now have 3D/4D capability. Why? It is not improved screening for, or diagnosis of, fetal abnormalities in the first and second trimester of pregnancy and it is not automated image capture reducing time for examination or sonographers’ injury. It is consumer demand for a souvenir fetal ‘keepsake image’ which, generated by 3D, is much clearer and more realistic in appearance than with 2D.”207 This technological advancement has been co-opted by anti-abortion activists to humanize the fetus and evoke stronger emotional responses. Anti-abortion clinics, often referred to as crisis pregnancy centers, frequently utilize 3D ultrasounds to dissuade women from seeking abortions by enhancing the perceived humanity of the fetus. The American Congress of Obstetricians and Gynecologists warns that crisis pregnancy centers are frequently using “disturbing visuals or performing ultrasounds to emotionally manipulate and shame pregnant people under the guise of informing or diagnosing them.”208 Sonographers use four common methods to generate a 3D ultrasound. The first is freehand, where the probe is tilted to capture a series of ultrasound images while recording the orientation for each slice. The second method involves using a mechanical system where the probe's internal linear tilt is controlled by a motor. Third, a matrix array transducer uses beam steering to sample points across a pyramid-shaped volume. The final method utilizes an endoprobe, which involves inserting the probe and then carefully removing it to generate the volume.209 In the case of anti-abortion legislation, this is forced penetration.

Proposition XXIX. Pro-life tactics employ ultrasound technology.

Proof.—The idea of a modification of the human body (II.xxvii.) does not involve an adequate knowledge of the said body, in other words, the use of ultrasound technology in the abortion debate raises profound ethical concerns, particularly pertaining to issues of privacy and control. The compulsory viewing of ultrasounds and the emotional manipulation associated with 3D imaging can be seen as an infringement upon a woman’s right to privacy. Furthermore, these tactics attempt to usurp control over a woman’s decision-making process, undermining her autonomy over her own body.

Corollary.—Weaponization makes us ultra sound.

Note.—The politicization of ultrasound technology also challenges the integrity of medical practice, as physicians are obligated to abide by these regulations, regardless of their medical necessity or the potential psychological distress inflicted upon the patient. The American Congress of Obstetricians and Gynecologists has taken a stance against these policies: “Absent a substantial public health justification, government should not interfere with individual patient-physician encounters … Laws that require physicians to give, or withhold, specific information when counseling patients, or that mandate which tests, procedures, treatment alternatives or medicines physicians can perform, prescribe, or administer are ill-advised. Examples of such problematic legislation include … laws that require medically unnecessary ultrasounds before abortion and force a patient to view the ultrasound image.”210

In Ultrasonic Dreams of Aclinical Renderings: Possible Bodies, Helen V. Pritchard, Jara Rocha, and Femke Snelting call for the emergence of counter-tactics: “Convoked from the dark inner space-times of the earth, the flesh, and the cosmos, particular aclinical renderings evidence that ‘real bodies’ do not exist before being separated, cut and isolated. Listen: there is a shaking surface, a cosmological inventory, hot breath in the ear. DIWO, recreational, abstract, referential and quantifying sonic practices are already profanating the image-life industrial continuum. Ultrasound is no longer (or never was) the exclusive realm of technocrats or medical experts.” There is a growing industry.

Proposition XXX. We can only have a very inadequate knowledge of the duration of our body.

Proof.—In an era of rapid technological advancement, medical imaging technologies such as ultrasounds have transformed the way parents experience pregnancy. There has been an explosion in adoption—an industry of 3D ultrasounds: “The global 3D ultrasound market size was valued at USD 2.9 billion in 2019 and is expected to grow at a compound annual growth rate (CAGR) of 6.6% from 2020 to 2027.”211 The surge in non-medical, 3D keepsake ultrasounds accounts for a significant portion of this growth. Keepsake clinics in shopping centers and strip malls all over the country aggressively advertise their services: “Expectant families, you can now see your unborn baby in live 4D motion! 3D Keepsake Imaging uses cutting edge technology to bring 3D and 4D ultrasound images of your unborn baby to life. You can actually see what your baby is going to look like before birth!”212 However, the U.S. Food and Drug Administration (FDA) has expressed concerns about this non-medical use.

Proposition XXXI. We can only have a very inadequate knowledge of the duration of particular things external to ourselves.

Proof.—Despite the growing popularity of keepsake 3D ultrasounds, the FDA strongly discourages their non-medical use for several reasons. Ultrasounds can heat tissues and produce small pockets of gas in body fluids or tissues—cavitation. The long-term effects of these conditions are still unknown, and therefore, the FDA recommends that ultrasound scans be performed only for medical purposes, under the guidance of trained healthcare providers. Moreover, non-medical 3D ultrasound sessions often last longer than medical ultrasounds to capture high-quality images. This extended exposure could potentially lead to unanticipated physical effects on the fetus: “the use of ultrasound solely for non-medical purposes such as obtaining fetal ‘keepsake’ videos has been discouraged. Keepsake images or videos are reasonable if they are produced during a medically-indicated exam, and if no additional exposure is required.”213

Corollary.—The non-medical nature of keepsake 3D ultrasounds introduces the potential for inaccuracies and distortions in the imaging. Unlike medical ultrasounds, which are conducted by trained healthcare professionals, keepsake ultrasounds may be performed by individuals with less rigorous training and understanding of fetal development. This lack of expertise could lead to misinterpretation of the images, potentially resulting in either undue alarm or misplaced reassurance. For instance, normal fetal formations or temporary conditions could be mistaken for anomalies, causing unnecessary anxiety for expectant parents: “You’re dealing with absolute incompetence. You’re dealing with no standard, no anything … they prey on the fears of pregnant women.”214 Conversely, actual issues might be overlooked, providing a false sense of security. This emotional roller-coaster not only heightens parental stress but could also lead to delayed medical intervention.

Proposition XXXII. Capture is distortion.

Proof.—The nature of ultrasound technology, coupled with extrinsic conditions, can result in 3D fetal reconstructions that are bumpy and distorted. Ultrasound imaging relies on sound waves, which can be influenced by numerous factors, such as the density and composition of the tissues they pass through, the position of the fetus, and the amount of amniotic fluid. External factors like the mother’s body type and movement can also affect image clarity. These variables can result in images that are anatomically imprecise. For instance, certain fetal structures might appear exaggerated or diminished, and the baby’s surface might seem uneven: “Most 3D/4D scans I’ve seen look like mashed potatoes.”215 These distortions, while typical of the technology, can misrepresent the actual appearance of the fetus, potentially causing concern for expectant parents and leading to misunderstandings about fetal health and development.

Proposition XXXIII. There is nothing positive in ideas, which causes them to be called false.

Proof.—The capture of data and its subsequent interpretation is a critical step in numerous technologies, with any distortion or inaccuracy having the potential to mislead our understanding of the real world. This problem begins at the level of sensor intrinsics, the inherent characteristics and limitations of the sensors themselves. One example is the intrinsic properties of a camera and its associated lens distortions. A camera, much like the human eye, operates by capturing light reflected from objects in the environment and projecting it onto a sensor to create an image. Yet, the process of capturing and projecting this light is not perfect; various factors can introduce distortion. These distortions can be categorized into two main types: radial and tangential distortions.

Proposition XXXIV. Capture expands and contracts.

Proof.—Radial distortion is primarily caused by the shape of the lens and results in images appearing either barrelled—bulging outwards—or pincushioned—contracting inwards: “A complex lens such as a retrofocus wide angle design tends to exhibit barrel distortion as the front group of elements acts as an aperture stop for the positive rear group. Telephoto lenses have a negative rear group and give rise to pincushion distortion. Distortion is difficult to correct for in zoom lenses, which usually go from barrel at the wide end to pincushion at the tele end.”216 Tangential distortion, though less common, occurs when the lens and the imaging plane are not parallel, causing the image to appear tilted.
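
One widely used formulation of these two distortions is the Brown-Conrady model, sketched here for normalized image coordinates. The parameter values are arbitrary examples; k1 and k2 govern radial distortion, p1 and p2 tangential.

```python
def distort(x: float, y: float, k1: float, k2: float, p1: float, p2: float):
    """Apply Brown-Conrady distortion to a normalized image point (x, y).
    Radial terms (k1, k2) produce barrel or pincushion warping depending on sign;
    tangential terms (p1, p2) model a lens not parallel to the imaging plane."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# A point near the image edge is displaced noticeably; one near the center barely moves.
print(distort(0.8, 0.6, k1=-0.2, k2=0.05, p1=0.001, p2=-0.0005))
print(distort(0.05, 0.02, k1=-0.2, k2=0.05, p1=0.001, p2=-0.0005))
```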

Proposition XXXV. Falsity consists in the privation of knowledge, which inadequate, fragmentary, or confused ideas involve.

Proof.—A significant component of these distortions comes from camera intrinsics. These are properties of the camera that affect image formation, including focal length, sensor aspect ratio, and principal point—where the optic axis intercepts the image plane. Changes in these intrinsic parameters can dramatically influence the resulting image and, if not properly accounted for, introduce significant error. For instance, a shorter focal length implies a wider field of view but can introduce significant distortion towards the edges of the image. This effect is commonly seen in fisheye lenses.
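
These intrinsic parameters are conventionally gathered into a single matrix, K, which projects points from the camera's coordinate frame onto the image plane. A brief sketch with assumed values:

```python
import numpy as np

def intrinsic_matrix(fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Pinhole intrinsics K: focal lengths in pixels (fx, fy) and the
    principal point (cx, cy), where the optic axis intercepts the image plane."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

K = intrinsic_matrix(fx=1000, fy=1000, cx=960, cy=540)  # e.g., a 1920x1080 sensor
point_cam = np.array([0.2, -0.1, 2.0])  # a point in camera coordinates, in meters
u, v, w = K @ point_cam
print(u / w, v / w)  # pixel coordinates after perspective division
```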

Note.—In the Flat-Earth conspiracy, fisheye lenses have gained attention as a controversial tool used in arguments and claims regarding the Earth's shape: “Curvature Debunked!”217 Some proponents of the Flat-Earth belief argue that fisheye lenses distort images in a way that makes the Earth appear curved, even though they assert the Earth is flat. They leverage the obvious fact that fisheye lenses distort geometry, and claim they are part of a larger conspiracy to deceive people about the true shape of the Earth. Of course, the scientific consensus overwhelmingly supports the Earth being an oblate spheroid, and fisheye lens distortion does not change this widely accepted understanding of our planet’s shape.

Proposition XXXVI. Inadequate and confused ideas follow by the same necessity, as adequate or clear and distinct ideas.

Proof.—Understanding the precise nature of sensing systems involves not only their intrinsic properties but also the extrinsic factors. Extrinsic parameters determine the camera’s pose—its position and orientation in the world coordinate system. They describe the spatial relationship between the camera and the object being photographed. As such, they play a crucial role in how the object’s three-dimensional reality is translated into the two-dimensional image plane of the camera. Where the camera is placed relative to the subject can drastically alter the captured image. The camera’s distance from the subject influences depth perception and detail. Photographing a subject from the ground level will give a vastly different image than photographing it from a higher vantage point. Similarly, the orientation of the camera—how it is tilted or rotated—significantly affects the image’s perspective. A slight tilt can change the horizon line and create a skewed representation of the world. In cinematography, rolling the camera in this way is called “a Dutch angle, a Dutch tilt, a canted angle, or an oblique angle. When a character is sick or drugged or when a situation is ‘not quite right’ you may choose to tilt the camera left or right and create this non-level horizon. The imbalance will make the viewer feel how unstable the character or environment really is—think of a murder mystery aboard a boat in rough seas; things tilt this way and then that, everyone unsure, everyone on edge.”218 The relationships between cameras—in multi-camera systems—or between a camera and other sensors—in sensor fusion systems—are also considered extrinsic parameters.
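
A minimal sketch of the extrinsic side, assuming a single rotation about the vertical axis: the pose (R, t) carries a world point into the camera's frame, after which the intrinsics above take over.

```python
import numpy as np

def rotation_about_z(theta_rad: float) -> np.ndarray:
    """Rotation about one axis; a full orientation composes several such rotations."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

R = rotation_about_z(np.radians(30))  # the camera's orientation (illustrative)
t = np.array([0.0, 0.0, -5.0])        # the translation component of the pose

def world_to_camera(p_world: np.ndarray) -> np.ndarray:
    """The pose (R, t) maps world coordinates into the camera's frame."""
    return R @ p_world + t

print(world_to_camera(np.array([1.0, 2.0, 10.0])))
```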

Proposition XXXVII. Total capture is impossible.

Proof.—We often assume that the data captured by advanced sensor technologies provide a seamless and comprehensive representation of the physical world. However, sensors can only process a limited amount of data—they provide a sparse sample of reality. Sensors, regardless of their complexity or precision, are fundamentally filters that reduce the complexity of the world into a manageable data set. They work by converting real-world phenomena—like light—into digital information that can be processed and interpreted. Given the infinite complexity and richness of our physical environment, it is impossible for any sensor to capture every single detail of the world with complete accuracy.

Proposition XXXVIII. Limited capacity.

Proof.—The inherent limitation of sensors is largely due to two main factors—the technical constraints of the sensor itself and the computational capacity to process the data. The sensor’s design dictates what it can measure and how accurately it can do so. For example, a camera’s resolution limits the amount of visual detail it can capture. Computational capacity refers to the ability to process and store data. Processing the sheer volume of data necessary to fully capture reality is beyond our reach: “In the realm of biosensing, for example, signals are acquired—often at high cost—with various sources of noise, including the stochastic behavior of molecular interactions, imperfections in fabrication, chemical and/or optical signal transduction mechanisms and human variation in terms of sample handling, as well as physiological differences and natural variations inherent in large test populations.”219 This sparse sampling can lead to skewed data sets that do not fully reflect the reality of the situation, impacting subsequent decision-making processes.

Corollary.—Reduction distorts decisions (III.lii.).

Proposition XXXIX. Capture is sparse sampling (II.xlviii.another note.).

Proof.—Condensed sensing—or sparse sampling—is essential for practical functioning, as our computational systems currently cannot handle the full complexity of the physical world.220 The key lies in understanding these limitations and using this knowledge to interpret sensor data more accurately and realistically. It’s critical to understand what the sensor system can and cannot capture, and how the choices made in the data collection process might impact the resulting data set.

Corollary.—Sensors condense values.

Proposition XL. Embedding bias from inception.

Proof.—Understanding bias in the context of sensor technology begins with acknowledging that sensors are not neutral. Despite their seemingly objective function—to capture and record data about the physical world—the design, deployment, and interpretation of sensor data are inherently influenced by human choices and sociocultural factors. The biases embedded in these processes may not always be conscious or intentional, but they can have significant impacts on how sensor data is understood and used.

Note I.—From its inception, camera technology has reflected and perpetuated certain biases. One of the most prominent examples is the historical bias towards lighter skin tones in color film processing: “Light skin became the chemical baseline for film technology, fulfilling the needs of its target dominant market. For example, developing color-film technology initially required what was called a Shirley card. When you sent off your film to get developed, lab technicians would use the image of a white woman with brown hair named Shirley as the measuring stick against which they calibrated the colors. Quality control meant ensuring that Shirley’s face looked good.”221 Color film was calibrated for lighter skin, resulting in underexposure and poor representation of darker skin. This bias was not merely an oversight, but a reflection of the racial prejudices prevalent in the societies where the film industry was primarily based. Similarly, early digital cameras’ facial recognition technology struggled with identifying non-white faces, a pattern of unintentional racial bias built into the sensor technology. More advanced facial recognition compounds the problem: “Groundbreaking research conducted by Black scholars Joy Buolamwini, Deb Raji, and Timnit Gebru snapped our collective attention to the fact that yes, algorithms can be racist. Buolamwini and Gebru’s 2018 research concluded that some facial analysis algorithms misclassified Black women nearly 35 percent of the time, while nearly always getting it right for white men … many police departments use face recognition technology to identify suspects and make arrests. One false match can lead to a wrongful arrest, a lengthy detention, and even deadly police violence”222 (III.instances of reconstructions.iv.explanation.).

Note II.—Even the ubiquity and accessibility of camera technology reveal biases. While the proliferation of devices has democratized photography to a large extent, disparities still exist in terms of who has access to these tools and how they are used, reflecting broader societal inequalities.

Proposition XLI. Capture is unevenly distributed.

Proof.—There is a significant divide in the access to and use of technologies of capture. The ethical dilemma here is rooted in the uneven distribution of these technologies, which mirrors and often amplifies existing social and economic disparities. As these technologies become integral in sectors like healthcare, education, research, and even in our personal lives, access to them translates into access to opportunities and information.

Proposition XLII. Access is imbalanced.

Proof.—Parallel to this is the issue of data accessibility. With the advent of technologies that can capture and process vast amounts of data, participation in and control over this data has become a pressing ethical concern: “We define a ‘data divide’ as the gap between those who have access to—and feel they have agency and control over—data-driven technologies, and those who do not. It interacts with the ‘digital divide’ by manifesting in the way that data systems are designed, developed and shaped by those who are most likely to be represented or able to have access to them. This means the digital divide has a determining effect on who is able to be represented by and shape data-driven technologies. All this perpetuates and compounds social and health inequalities.”223 Capture technologies generate massive amounts of data that can feed into algorithms, decision-making processes, and create new insights. However, access to this valuable resource is often monopolized by corporations and government entities.

Proposition XLIII. Data concentrated.

Proof.—Data poverty—where certain individuals or communities lack access to data, or the skills to use and interpret it—reinforces social and economic inequalities. In 2021, a research grant was awarded by the Nuffield Foundation to address data poverty in the UK and redefine data access as both human right and public resource: “Digital inequalities in access, skills, and capabilities impact all aspects of citizens’ lives, be that work, education, leisure, health, or wellbeing. The team will undertake a ‘proof of concept’ study capitalizing on the well-established Minimum Income Standard (MIS) methodology to develop a Minimum Digital Living Standard (MDLS).”224 There is a need for policies that promote equitable access to capture technologies and the data they generate.

Note.—Who owns—controls—data determines who holds power, who makes decisions, and who can influence humanity. The current state of data ownership is complex. Every click, every transaction, every digital interaction generates data—collected by organizations. But who truly owns this data? Beyond privacy. Agency. Power. Control. Control over data confers—to corporations or governments—immense power. The ability to influence consumer or constituent behavior. Terraform political landscapes. Synthesize societal norms through tailored advertising—propaganda. Exploitation without explicit consent, or knowledge of the individuals to whom the data pertains.

Transparency acts as a countermeasure against unbridled power. Informed consent is key to maintaining individual agency in the digital age: “Enterprise data, by its very nature, flows through an organization, touching many business and technical processes and being stored / moved / transformed by many IT systems. It can end up in uncounted numbers of reports, online displays, data feeds, and information products.”225 In a time when breaches and misuse of personal data are common, transparency reassures users that their data is being handled responsibly. It allows users to make informed decisions about who has access to their data and under what conditions. The challenge lies in balancing transparency and complexity. Data handling processes are hard for the average user to comprehend. Algorithms are black boxes (III.). They give off black body radiation. Transparency also means clarity.

Individual ownership “is not feasible for most large or complex enterprises. For them, the concept of Data Ownership may not be useful. Instead, they take another approach: federated data-related accountabilities. In this approach, they first document data lineage (the path data has taken from its creation/acquisition to a specific system or report). Then they assign data-related accountabilities for a manageable number of segments to Data Stewards, SMEs, and/or Data Custodians (technical resources).”226

Blockchains are data lineages. They are the underpinnings of cryptocurrencies like Bitcoin and Ethereum. They promise enhanced agency, transparency, and trust in data governance. Decentralized. Immutable. A record of transactions that claims to cut out centralized power. Blockchains are transparent. Every transaction is publicly visible and unalterable. Renewed trust. Open books.

Proposition XLIV. Decentralized genesis.

Proof.—A blockchain is a decentralized ledger. It records transactions across distributed computers so that the trusted record cannot be retroactively altered. This decentralized nature is the first factor that distinguishes blockchain from conventional methods of data circulation and control. Unlike traditional centralized databases, where a single entity or authority has the power to control and modify data, blockchain operates on a network of nodes. This decentralization removes the need for a trusted third party or intermediary, leading to increased security and decreased potential for manipulation or fraud. However, it is not impenetrable: “the zero-state problem occurs when the accuracy of the data contained in the first, or ‘genesis block,’ of a blockchain is in question.”227

No—blockchains may enhance transparency and agency, but they are not immune to co-option and privatization. There has been a rise of private—permissioned blockchains228—more opaque decisions and concentrations of power. The transformative potential of blockchain could turn to tokenism (II.xlviii.)—entrenching existing disparities in data ownership and control. The assumption that blockchain transparency equates to full disclosure and fairness is misleading.229 Although all transactions are publicly visible, the identities behind those transactions remain anonymous. Diminished accountability. Alternate realities (IV.xxiii.).

Corollary I.—Permanent fingerprints.

Note.—Blockchains use a form of cryptography—hashing—to ensure security and immutability: “Hashing is a method of cryptography that converts any form of data into a unique string of text. Any piece of data can be hashed, no matter its size or type. In traditional hashing, regardless of the data’s size, type, or length, the hash that any data produces is always the same length. A hash is designed to act as a one-way function—you can put data into a hashing algorithm and get a unique string, but if you come upon a new hash, you cannot decipher the input data it represents. A unique piece of data will always produce the same hash.”230 A fingerprint. Once data is recorded on the blockchain, it is extremely difficult to change or erase. Data integrity. In traditional databases, data can be changed or deleted by those with access. Every transaction on a blockchain is visible to all participants in the network. A transparent system where the movement of data is tracked openly.
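
The mechanics can be sketched with an ordinary SHA-256 digest: each block stores the previous block's hash, so altering any earlier record breaks every fingerprint that follows. A toy, in-memory illustration, not a distributed ledger:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Hashing: any input yields a fixed-length, one-way digest. Same data, same hash."""
    return hashlib.sha256(data).hexdigest()

# Build a tiny chain: each block stores the previous block's hash.
chain, prev_hash = [], "0" * 64  # the "genesis" reference
for record in (b"tx: A -> B, 5", b"tx: B -> C, 2"):
    block_hash = fingerprint(prev_hash.encode() + record)
    chain.append({"record": record, "prev": prev_hash, "hash": block_hash})
    prev_hash = block_hash

# Tamper with the first record; every stored hash downstream now fails to verify.
chain[0]["record"] = b"tx: A -> B, 500"
valid = all(b["hash"] == fingerprint(b["prev"].encode() + b["record"]) for b in chain)
print("ledger intact:", valid)  # False: the alteration is detectable
```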

IoT networks involve countless devices collecting, sharing, and acting on data. As IoT ecosystems expand, so does the complexity of managing massive amounts of data and ensuring security and privacy. With interconnectedness comes a risk of data breaches: “A fundamental problem with current IoT systems is their security architecture, with a centralized client-server model managed by a central authority which makes it susceptible to a single point of failure. Blockchain addresses this problem by decentralizing decision-making to a consensus-based shared network of devices.”231 By storing data across a network of nodes, rather than a central server, blockchain reduces the risk of data being compromised, boosting the overall security of IoT systems. Every transaction in a blockchain network is recorded on a public ledger, providing a verifiable audit trail. Trusted supply-chains.

Blockchain’s ability to provide a decentralized, transparent, and immutable ledger of transactions makes it an ideal candidate for applications requiring high levels of accountability and traceability. Glockchain—a speculative prototype—simulates blockchain technology applied to firearm regulation—opening up opaque social forces: “Glockchain is one of numerous prototypes created for a new venture called Ideo coLAB, which brings together partner companies (in this case NASDAQ, Citi Ventures, Fidelity and Liberty Mutual), designers from innovation and design firm Ideo, and fellows from a variety of backgrounds to find applications for emerging technologies like the blockchain.”232 

The initial prototype focused on a specific US population—law enforcement. Capture toward transparency and accountability. The prototype leverages the intrinsic qualities of blockchain to provide a public, immutable record of police firearm use. The prototype assumes that future firearms are equipped with sensors. Smart guns. A smart gun records data when moved, holstered, unholstered, and fired. The data includes the time, location, pose estimation, rounds fired—facial recognition and fingerprint. It is then automatically uploaded to a blockchain. Once on the blockchain, the data cannot be altered or deleted, ensuring that the record of the event remains intact and verifiable.

This use of blockchain technology has several potential benefits for law enforcement accountability. It automates reporting of when, where, and under what circumstances force was used. Current data is impoverished: “The FBI’s data is based on information voluntarily submitted by police departments around the country, and not all agencies participate or provide complete information each year.”233 Glockchain would provide an overall picture of the state of police firearm use. It would also clarify incidents where there is a dispute over the use of force—a record that can be referred to during investigations or judicial proceedings. Glockchain could improve public trust in law enforcement. Incidents of excessive or inappropriate use of force have led to widespread calls for greater transparency and accountability in policing. Glockchain answers these calls with a verifiable, unalterable record of when and how force is used. Inverting surveillance.

While transparency is one of the key strengths of blockchains, careful thought must be given to how data is collected, how it is made public, and how it can be accessed. The system must be tamper-proof. Software and hardware. Firearm sensors must be reliable: “After a short stint at MIT, Kloepfer dropped out to focus on Biofire, now a company with 40 employees and $30 million in venture capital funding. His team has designed and built hundreds of prototypes, trying to meld old-school gunsmithing with the latest in cutting edge electronics … In the main workshop space, there are thermal chambers that simulate different environmental conditions. That sort of testing is critical … to ensure that a gun loaded with electronics works in any sort of environment.”234 They must be calibrated for accuracy—robust and secure enough to ensure that they cannot be disabled to circumvent the system. Glockchain must also contend with the issues of acceptance and adoption. This extends beyond law enforcement officers to include legislators, courts, privacy advocates, and the public, all of whom will have a role in determining how this technology is implemented and used. Widespread adoption and consistent oversight may be challenging as “the NRA opposes any law prohibiting Americans from acquiring or possessing firearms that don’t possess ‘smart’ gun technology.”235

Smart guns, which began as an attempt to “meld a fingerprint sensor onto the grip of a Glock handgun”236 are commercially available—as of 2023. Manufacturers—Biofire, Lodestar, and SmartGunz—are producing firearms as devices of capture. Glockchain simulates how blockchain technology can be used to enhance transparency and accountability in law enforcement. If adopted, it could be expanded as a tactic for broader firearm regulation. The regular cadence of massacres—mass shootings—demands system-wide regulation.

Shortly after Glockchain was prototyped, it was trademarked by a lawyer in Los Angeles. A few years later the trademark expired.

Corollary II.—Expand and contract.

Proof.—A smart contract is an automatic, self-executing contract where the terms of the agreement are written into the code. It operates under a set of conditions, automatically executing transactions when those conditions are met. Smart contracts eliminate the need for an intermediary and ensure the terms are transparent and immutable. However, “there is no federal contract law in the United States; rather, the enforceability and interpretation of contracts is determined at the state level … any conclusions regarding smart contracts must be tempered by the reality that states may adopt different views.”237 Ethereum is the platform that popularized the use of smart contracts. It introduced a programming language and tools for developers to write their own. This feature has been utilized to create decentralized applications (DApps) on the Ethereum network, leading to various innovations, one of which includes Non-Fungible Tokens (NFTs): “Non-fungible tokens (NFTs) seem to be everywhere these days. From art and music to tacos and toilet paper, these digital assets are selling like 17th-century exotic Dutch tulips—some for millions of dollars.”238

Proposition XLV. Every idea of every body, or of every particular thing actually existing can be tokenized.

Proof.—Each NFT represents a unique item. NFTs are cryptographic tokens on the blockchain. Fungible exchanges—like physical money—are on a one-for-one basis. NFTs are non-fungible, meaning no two NFTs are the same. No identical twins.

Note.—Smart contracts play a crucial role in the creation and transaction of NFTs. When an NFT is created—minted—a smart contract is written to the Ethereum blockchain. This contract contains the rules and information for that specific NFT—who owns it and any royalties that need to be paid upon future sales. When an NFT is sold, another smart contract is used to automate the transaction. This contract guarantees that the NFT is transferred to the buyer, the payment is sent to the seller, and any specified royalties are paid to the original creator. Buyer—seller—owner. No middleman. The transparent, secure, automated minting and trading of digital assets.

Proposition XLVI. Blockchains track economies of images.

Proof.—The idea of using NFTs to represent ownership of a digital image is appealing as it provides solutions to the problems inherent to the digital medium: “‘When it comes to selling artworks, two things are important … Is the artwork real, and do I have the authority to sell it to you?’”239 NFTs provide proof of authenticity and ownership, solve issues related to copyright and reproduction, and offer new approaches to monetization.

Proposition XLVII. Magnetizing artists.

Proof.—In 2020, digital artists gained mainstream recognition, and with them NFT marketplaces such as OpenSea, Rarible, and SuperRare (III.vi.). The sales volume of NFTs saw an exponential increase. The turning point came in March 2021, when renowned auction house Christie’s sold an NFT—Everydays: The First 5000 Days—for a staggering $69,346,250.240 This sale legitimized NFTs in the traditional image market and ignited a global conversation around their potential.

Note.—These technologies are volatile. The value of cryptocurrencies—the foundational blocks of NFTs—fluctuates wildly. It is influenced by factors such as regulatory news, technological advancements, market sentiment, and macroeconomic trends. Subject to speculative behavior and hype cycles—inflated values and sudden market downturns—hacking and theft: “More than $100 million worth of NFTs were publicly reported as stolen between July 2021 and July 2022 … by September of last year, NFT transaction volume had collapsed by 97% from its peak in January 2022.”241 Unstable values.

Proposition XLVIII. Tokens represent control.

Proof.—NFTs introduce an additional layer of complexity to copyright issues, as they involve the digital representation of ownership, not the ownership of the actual intellectual property rights of an image.

Note.—Traditionally, when a photographer sells a physical print of a photograph, the photographer retains the copyright unless it is explicitly transferred. This means the photographer can reproduce the image, make derivative works, distribute copies, or display the image publicly.

The buyer owns a single physical instance of the image. NFTs exist in a digital space where reproduction is effortless. When an NFT image is sold, the buyer purchases proof of ownership on the blockchain. This does not mean they own the copyright—still only the instance. Unless explicitly stated, the creator retains intellectual property rights and can create more NFTs of the same image. There have been instances where artists found their images minted as NFTs by others without their consent. Unauthorized copies. “When Lois van Baarle, a Dutch artist, scoured the biggest NFT marketplace for her name late last year, she found more than 100 pieces of her art for sale. None of them had been put up by her … ‘It is much easier to make forgeries in the blockchain space than in the traditional art world. It’s as simple as right-click, save. It’s also harder to fight forgers. How do you sue the anonymous holder of a crypto wallet? In which jurisdiction?’”242 The global and decentralized nature of NFTs complicates copyright law, which is governed jurisdiction by jurisdiction, with no global authority. Enforcing copyright claims in a decentralized environment, given the absence of a central authority, is an amorphous process. The United States Copyright Office has opened an investigation into NFTs and has organized three roundtables on their implications—copyright, patents, trademarks.243 As the NFT market continues to evolve, it demands a clearer legal framework.

Another Note.—Tokenism refers to the practice of making a symbolic or superficial effort to include individuals or groups from underrepresented backgrounds, without truly addressing or resolving the underlying issues of inequality or discrimination. It involves giving the appearance of inclusivity or diversity without making substantial or meaningful changes. Tokenism often occurs in contexts such as institutions, corporations, or media, where there is a desire to demonstrate diversity or inclusivity without actually challenging the existing power structures or addressing systemic inequalities: “The current notion that token integration will satisfy his people is an illusion.”244 It typically involves selecting a small number of individuals from marginalized groups and presenting them as representatives of the entire group. The community overall is not included. They remain outside the system. Tokens may not necessarily result in substantive changes or equal opportunities for others. Sparse sampling.

Tokenism is problematic because it creates the illusion of progress. It also places undue pressure on the individuals selected as tokens, as they may feel burdened with representing an entire group and face additional scrutiny or unrealistic expectations. To combat tokenism, it is important to focus on genuine inclusivity, equity, and systemic change. This involves providing equal opportunity and creating an environment that values diverse perspectives and contributions. How inclusive are emerging image economies? NFTs claim to democratize art, but “there are still systemic barriers to entry, evidenced in the makeup of the NFT community. Not only is there a screaming lack of disability-forward NFTs, but the gap is large even for those identifying as a minority race or gender, and of course, the gap is largest for those disabled artists intersectioned with race, ethnicity and gender. In attending NFT round-tables and information sessions, nearly all presenters were of the same race, sex and social status. Why do we still see the systemic residue in a market claiming to shake up the industry?”245

A recent exhibition—Sight Unseen—showcases the work of blind photographers from around the world. The images bridge the gap between distinct inner worlds and the shared sphere of the sighted. Representing a diverse range of visual impairments, some completely blind and others with varying degrees of visual perception, these artists utilize photography as a medium to navigate and interpret the surrounding space using other heightened senses: “vision is so strong that it masks other senses, other abilities. I feel light so strongly that it allows me to see the bones of my skeleton as pulsating energy.”246 The curator explains that “we, in the sighted world, are absolutely immersed in images, we’re in a torrent of images, an avalanche of images. And what I’ve learned in talking to all these blind photographers and thinking about this, is that the sighted world has essentially in some ways been blinded by all the images we’re exposed to. All we see are those images now. They displace the reality of the world that is right in front of us. Instead, we see representations of that world. And if you can’t see, you can’t be influenced by this, so you’re automatically operating in an extremely original way.”247 The value of other ways of knowing.

Many adaptive—or assistive—technologies are not currently integrated into blockchains, but they could be. Combined, Optical Character Recognition (OCR) and Text-to-Speech (TTS) convert images to audio descriptions. These adaptive tools would empower visually impaired users to engage with information and interact with blockchain content through voice-based interfaces. Adaptive technology enhances the creative process: “as I keep being told by blind people, if you're going to be blind, this is the best time to be blind in history because there are so many assistive technologies that you can use.”248 The artists emphasize the importance of having control over their images and modifications, even if they have limited—or no—vision.249 The exhibition incorporates various accessibility features such as audio descriptions of artworks, biographical essays available in audio form, Braille, and tactile elements added to some photographs, allowing for multi-sensory engagement. This integration acknowledges the importance of accessibility and demonstrates how stakeholders can embrace innovative solutions. Blockchains and other emerging data management systems have high stakes when it comes to accessibility and—as users proliferate—sustainability.

Proposition XLIX. Tokens consume energy.

Proof.—As blockchains grow, their requirements for storage, bandwidth, and computational power increase. The environmental impact of NFTs has been a topic of significant debate, primarily due to the high energy consumption of the Proof of Work (PoW) consensus mechanism that underpinned them until quite recently.

Corollary.—Proof has stakes.

Proof.—The PoW model exploits miners (II.lemma.vii.). The mining process involves solving complex mathematical puzzles. Each solution validates a transaction and adds it to the blockchain. Solutions are rare—rare minerals. Mining is computationally intensive. Extreme energy use—a massive carbon footprint: “We have to change our existing habits. So how can we build new platforms that are unsustainable?”250 As NFT transactions increased in frequency and number, so too did the environmental impact of the Ethereum network: “...an average NFT has a stunning environmental footprint of over 200 kilograms of planet-warming carbon, equivalent to driving 500 miles in a typical American gasoline-powered car … ‘You just click on a button or type a few words, and then suddenly you burn so much energy.’”251
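The puzzle at the heart of PoW fits in a few lines of Python (a toy model, not any production client). Every failed nonce below is spent computation, spent energy; the difficulty parameter sets how rare a solution is:

import hashlib

def mine(block_data: str, difficulty: int) -> int:
    # A valid solution is a nonce whose SHA-256 digest begins with
    # `difficulty` zeros; brute force is the only strategy.
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # proof that work, and energy, were expended
        nonce += 1

nonce = mine("tx: wallet A -> wallet B, 1 token", difficulty=5)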

Note.—In a bid to address these concerns, Ethereum has transitioned to a Proof of Stake (PoS) consensus mechanism through an upgrade known as Ethereum 2.0. PoS is seen as a lower impact alternative to PoW: “Ethereum switched on its proof-of-stake mechanism in 2022 because it is more secure, less energy-intensive, and better for implementing new scaling solutions compared to the previous proof-of-work architecture.”252 It dramatically reduces the computational power required to secure the network—each transaction is approximately 1/30,000th of its PoW equivalent. Instead of miners competing to solve complex problems, validators in a PoS system create new blocks based on tokens they hold—and are willing to stake as collateral.

PoS allows for greater scalability, as it can process transactions more quickly than PoW, enhancing the network’s capacity to handle increased demand. This is crucial in accommodating a larger user base and more diverse applications. PoS is seen as more democratic and inclusive. PoW favors miners with more powerful hardware—PoS gives anyone who can buy and stake tokens the opportunity to participate. This lower entry barrier may democratize participation, fostering wider user engagement and adoption: “Whereas under proof-of-work, the timing of blocks is determined by the mining difficulty, in proof-of-stake, the tempo is fixed. Time in proof-of-stake Ethereum is divided into slots (12 seconds) and epochs (32 slots). One validator is randomly selected to be a block proposer in every slot. This validator is responsible for creating a new block and sending it out to other nodes on the network. Also in every slot, a committee of validators is randomly chosen, whose votes are used to determine the validity of the block being proposed.”253 It is worth noting that while Ethereum’s transition to PoS has improved accessibility and decreased environmental impact, it has not completely eliminated these problems. The shift to PoS can help make the crypto-art market more sustainable, but it doesn’t address all environmental concerns—it is crucial to consider the e-waste generated by the hardware used for mining and the energy sources powering these activities.
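The slot-and-epoch tempo quoted above can be modeled as stake-weighted random selection. A toy sketch (hypothetical validators and seed; it omits RANDAO, committees, attestations, and slashing):

import random

SLOT_SECONDS = 12
SLOTS_PER_EPOCH = 32

# Hypothetical validators and their staked collateral, in tokens.
validators = {"v1": 32, "v2": 32, "v3": 64}

def proposer_schedule(epoch_seed: int) -> list:
    # One proposer is drawn per slot; more stake, more chances.
    rng = random.Random(epoch_seed)
    names = list(validators)
    weights = [validators[n] for n in names]
    return [rng.choices(names, weights=weights)[0]
            for _ in range(SLOTS_PER_EPOCH)]

schedule = proposer_schedule(epoch_seed=42)  # 32 slots, 384 seconds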

Another Note.—There are alternative approaches to the capture of light.

The growing environmental crisis has brought to light the urgent need for sustainable and accessible energy solutions. GRID Alternatives captures light to protect the environment and elevate underserved communities. The cornerstone of GRID Alternatives’ work is the belief that clean, affordable energy should be a fundamental human right: “GRID was founded during the 2001 California energy crisis by Erica Mackie, P.E., and Tim Sears, P.E., two engineering professionals who were implementing large-scale renewable energy and energy efficiency projects for the private sector. The idea that drove them was simple: free, clean electricity from the sun should be available to everyone.”254 The organization’s approach is multi-faceted, combining solar installation, community engagement, and workforce development. No-cost solar for low-income households. Light into energy. Reduced emissions. Better jobs. Contributing to the broader fight against climate change. Energy independence. Liberation from toxic systems.

GRID Alternatives recognizes that the transition to clean energy offers more than just environmental benefits. It also presents substantial economic opportunities. The organization offers hands-on solar installation training programs, equipping individuals from marginalized backgrounds with the skills and certifications necessary to enter the renewable energy sector. This not only expands job opportunities for these individuals but also ensures that the clean energy transition benefits all members of society.

Beyond these direct interventions, GRID Alternatives also works to influence energy policy on a systemic level: “GRID is a leading voice in low-income solar policy and the nation’s largest nonprofit solar installer, serving families throughout California, Colorado, the Mid-Atlantic region, and tribal communities nationwide … In addition, GRID’s international program partners with communities in Nicaragua, Nepal and Mexico to address their energy access issues.”255 The organization advocates for inclusive renewable energy policies that prioritize the needs of low-income communities and communities of color. In doing so, GRID Alternatives helps to ensure that these communities are not left behind in the shift towards renewable energy.

GRID Alternatives is committed to “advancing an EQUITY agenda both within GRID Alternatives and in the energy industry and policy arenas by examining and addressing systemic inequities; seeking out and amplifying the voices of the communities we serve; and expanding access to solar energy and career and leadership opportunities.”256 Their work challenges traditional power dynamics and introduces a new paradigm where sustainable energy practices and social justice converge.

It remains to point out the advantages of a knowledge of this doctrine as bearing on conduct, and this may be easily gathered from what has been said.

1. Capture technologies, inclusive of a range of devices and methods from sensors to data management systems, have reshaped our perception of reality. They have brought about innovation—creativity—discovery—while simultaneously introducing multifaceted challenges and costs. Environmental Impact. Inaccuracy. Distortion. Privacy. Ownership. Accessibility.

2. The nuances of how these technologies function and process information are crucial to understanding the realities they create. This comprehension brings to light the innate limitations of such technologies—distortions resulting from sensor intrinsics and extrinsics—and the selective representation of reality due to limited data processing capabilities. There are opportunities for technological and ethical advancement; for instance, the application of blockchain technology introduces new layers of transparency, accountability, and user agency in the way data is stored, distributed, and exchanged.

3. Merely understanding these systems and processes is insufficient. Engaging with capture technologies requires us to be deliberate and intentional in what we choose to capture, how we capture it, and for what purpose. We must recognize that data is never purely objective, but always influenced and shaped by the technologies, algorithms, and the human biases that guide its capture and interpretation.

4. Intentionality is paramount. Alignment with ethical, responsible, democratic principles. Vigilance. Continual questioning and evaluation of the processes. Hyper-awareness of the potential replication or amplification of existing societal biases and disparities. A future where capture is not limited as a tool of surveillance and control, but rather opens up as an instrument of equitable and responsible progress.

I thus bring the second part of my treatise to a close … and, considering the difficulty of the subject, with sufficient clearness. Part III examines how captured data is aligned and wielded. The warfare and magic of Reconstruction.











PART III.

ON THE NATURE AND OPTICS OF RECONSTRUCTION

PREFACE

Realism, photorealism, and hyperrealism are the aesthetics that propagated with the mass adoption of cameras. Realism is a philosophical concept with a long history—traversing traditions. Generally, realists hold that the world exists independently of our perceptions or interpretations (V.x.). The world is real. Space. Material. Life. Photorealism is the perfectionistic representation of these places, objects, and beings. In Camera Lucida, French theorist Roland Barthes grappled with the power to freeze moments in time. He raised questions about the photographer’s responsibility towards their subjects: “All those young photographers who are at work in the world, determined upon the capture of actuality, do not know that they are agents of Death.”257 Photography produces death—suspended, timeless portraits—memento mori. Reconstructions are death masks. Digital replicas. Computer vision integrates dead data into multidimensional models. If Capture produces dead data, Reconstruction produces the undead. If Capture is breaking to control, Reconstruction reassembles broken bits258—broken bodies—broken environments—broken systems. It is an attempt to put the broken pieces back exactly where they were. Reconstruction is hyperreal integration. Reconstructions appear as straightforward representations of reality, but they are reductions—distortions. In Simulacra and Simulation, Jean Baudrillard introduced hyperreality—a state in which a copy or simulation does not merely mimic the real but becomes more real than the real. Hyperreal. Data are simulacra—copies that substitute the real: “Of all the prostheses that mark the history of the body, the double is doubtless the oldest. But the double is precisely not a prosthesis: it is an imaginary figure, which, just like the soul, the shadow, the mirror image, haunts the subject like his other, which makes it so that the subject is simultaneously itself and never resembles itself again, which haunts the subject like a subtle and always averted death.”259 The undead—reanimated. A reconstruction is a double.

Reconstruction is also an era.

The Reconstruction era followed the American Civil War—from 1865 to 1877. It was the period of rebuilding after centuries of colonial practice—the end of legalized slavery. Reconstruction was integration. Integration of former slaves—emancipated beings. Integration of Southern states—back into the Union. The new model was almost a replica of the past—the ideologies that initiated and perpetuated slavery were also integrated. And a new weapon: “photography served many purposes during the war. It was used to promote abolition; as propaganda for both the northern and southern causes; as an important tool in the creation of Lincoln’s public persona and career; as well as for reconnaissance and tactical observation.”260 Photography proliferated in the Reconstruction era—shaping public opinion. Images captured the chaotic realities of a nation grappling with decimation and profound change. Images captured the devastation wrought by war, the violent opposition African Americans faced in the South, and the struggles and joys of people adapting to freedom. These powerful images were published and distributed through newspapers, magazines, and as individual prints, reaching a wide audience. They informed public opinion—stimulated discussions, debates, and policy changes. The American public experienced Reconstruction through the lens of a camera.261 Now these images are wormholes into the past—allowing future generations to glimpse the realities of post-Civil War America. Photography was an equalizing force. Recently emancipated beings alongside figures in power. They show that history is made by powerful organizations—but also by individuals fighting for their rights. Made by whoever controls images.

Images of the emancipated offered tangible evidence of the conditions and treatment they endured, countering the abstract political rhetoric that dominated the public sphere. Portraits of freedmen and freedwomen served as silent testimonials to the hardships they had endured and their determination to claim their rightful place in American society. Sojourner Truth strategically used photography as a medium for advocacy. Truth sold small photographs—inscribed with her slogan—I Sell the Shadow to Support the Substance.262 Her image circulated—a symbol of empowerment and self-possession—defying the dehumanizing narratives of African Americans prevalent in society:

As a women’s rights activist, Truth faced additional burdens that white women did not have, plus the challenge of combating a suffrage movement which did not want to be linked to anti-slavery causes, believing it might hurt their cause. Yet, Truth prevailed, traveling thousands of miles making powerful speeches against slavery, and for women’s suffrage (even though it was considered improper for a woman to speak publicly).263

Images exerted social influence. The dominant political and social classes used images as propaganda to express their visions of post-war society and to cast doubt on those who opposed them. Photographs also influenced public perception of Reconstruction policies. Critics produced images highlighting instances of inefficiency—or corruption—within Reconstruction government. Undermining their legitimacy. Photography was used to romanticize the Lost Cause. Framing the Confederacy as noble—heroic. A sleight of hand—the South fought not for the preservation of slavery but for states’ rights and southern honor. Photographs of supposedly contented slaves served this narrative, glossing over the atrocities of oppression.264 Reconstruction photographs were used to reassert the societal dominance of white people after the abolition of slavery. Southern sympathizers depicted the South as a once-great society devastated by the war. The pre-war era represented the Golden Age. Photographs were also used to promote negative stereotypes of African Americans—attempting to legitimize discriminatory laws and social norms. Black individuals were often depicted as unfit for freedom, as a threat to societal order, or incapable of self-governance, reinforcing racial stereotypes and justifying the introduction of Black Codes:

After the United States Civil War, state governments that had been part of the Confederacy tried to limit the voting rights of Black citizens and prevent contact between Black and white citizens in public places … These codes limited what jobs African Americans could hold, and their ability to leave a job once hired.265 

Images oversimplify—reduce—erase the complexities of history. They invent and perpetuate myths—potentials of deception. In Reconstructing Dixie, Tara McPherson argues that these events and images had a lasting impact on the identity of the United States. She describes the South as a three-dimensional postcard. Lenticular logic underlies the national imaginary—“a schema by which histories or images that are actually copresent get presented (structurally, ideologically) so that only one of the images can be seen at a time.”266 Alternate realities. The Civil War exists in the present. 3D postcards glorifying the Lost Cause and The Golden Age are still sold at plantation restoration novelty shops. Sanitized reconstructions within the southern tourism industry.267 We are——still——in a Reconstruction era.

Repeating the past—history echoes through technological reconstructions. Despite the end of colonial rule and the legal abolition of slavery, racial and social inequalities persist—integrated in software. Repeating the past—circulating data generated in previous moments in time. It is an economy based on “the resuscitability or the undead of information.”268 All of this dead data. Clouds—streams—pools—reservoirs—lakes—swamps of data: “A data swamp is a badly designed, inadequately documented, or poorly maintained data lake. These deficiencies compromise the ability to retrieve data, and users are unable to analyze and exploit the data efficiently. Even though the data exists, the data swamp cannot retrieve it without contextual metadata.”269 Organizational systems and rules are critical. And those codes are programmed with historical biases: “Thus the scientific archive, rather than point us to the future, is trapping us in the past, making us repeat the present over and over again.”270 Is it possible to untether the logic of capital exploitation from the logic of data? “Concealed behind the ‘echo chambers’ ... is an incredibly reductive identity politics, which posits class, race, and gender as ‘immutable’ categories.”271 

Wendy Chun argues that “software is a functional analog to ideology.”272 She asks, “what is the significance of following and implementing instructions? Perhaps the ‘automation’ of control and command is less a perversion of military transition and more an instantiation of it, one in which responsibility has been handed over to those (now machines) implementing commands. The relationship between masters and slaves is always ambiguous.”273 In the context of software, the master-slave relationship “usually refers to a system where one—the master—controls other copies, or processes.”274 For example, in a master-slave database replication setup, one database—the master—holds the primary data, and other databases—the slaves—replicate that data to ensure redundancy and fault tolerance. Chun points out that software terms like master and slave not only reflect historical oppressive systems, they also normalize hierarchical relationships in technology. Reconstruction relies on master-slave logic at every scale.275 

In 2014, Drupal—an open source web content management system—replaced “master/slave”276 with “primary/replica.”277 In 2018, Python—one of the three most used programming languages—followed suit: “‘slaves’ was changed to ‘workers’ or ‘helpers’ and ‘master process’ to ‘parent process.’”278 

classification

Title: Avoid master/slave terminology

Type: enhancement — Stage: resolved

Components: Documentation, Interpreter Core — Versions: Python 3.8279

In 2020, GitHub replaced the default master branch with main.280 In 2022, the Open Source Hardware Association issued an official resolution to deprecate “MOSI—Master Out Slave In, MISO—Master In Slave Out, SS—Slave Select, MOMI—Master Out Master In, SOSI—Slave Out Slave In.”281 In 2023, the IEEE—The Institute of Electrical and Electronics Engineers—released “master-slave optional alternative terminology,” after calls from members: “For decades our industry has used the term ‘Master / Slave’ to denote a set of ICs or firmware/software where one device has control over one or many others. The use of this terminology has always made me and many others feel uneasy. While my ‘engineering brain’ has an idea of what this term defines, my ‘human brain’ relates this as a human condition, a human rights issue.”282 In Language Wants to Be Overlooked: Software and Ideology, Alexander Galloway argues that “to see code as subjectively performative or enunciative is to anthropomorphize it, to project it onto the rubric of psychology, rather than to understand it through its own logic of ‘calculation’ or ‘command.’”283 The words master/slave may eventually be outmoded. But we should not forget that the logic underneath is the same. These names are windows into functionality.

Many names are labels of function—but “code does not always nor automatically do what it says, but does so in a crafty manner.”284 There has always been some level of deception at play: “John Backus … contends that ‘programming in the early 1950s was a black art, a private arcane matter.’ These programmers formed a ‘priesthood guarding skills and mysteries far too complex for ordinary mortals.’ Opposing even the use of decimal numbers, these machine programmers were sometimes deliberate purveyors of their own fetishes or ‘snake oil’” (VII.). Many algorithms are still proprietary—black boxed—dark matter. Even open source algorithms remain opaque to most people: “Code is a medium in the full sense of the word. As a medium, it channels the ghost that we imagine runs the machine—that we see as we don’t see—when we gaze at our screen’s ghostly images.”285 Part III excavates the algorithms that underlie Reconstruction.

DEFINITIONS

I. A reconstruction is a stereotype.

II. An idealized twin.

III. By idealized, I mean contorted by fantasy and bias.

N.B. If we can be the adequate cause of any of these modifications, then what are the ethics of Reconstruction?

POSTULATES

I. Reconstructions are images—and other forms of data—extracted from different points in space and integrated to produce multidimensional models.

N.B. This postulate or axiom rests on Postulate i. and Lemmas v. and vii., which see after II. xiii.

II. Reconstruction is a powerful tool for mapping and analyzing the surface of the earth and everything on it. Even distant cosmic bodies—far off phenomena. But reconstruction is not just a tool for scientific exploration and analysis. It changes the design and manufacture of objects and spaces. Reconstruction transduces world and experience.

Proposition I. Reconstruction evolves.

Proof.—Reconstruction—the process of volumetrically reproducing the shape and appearance of real-world objects—has a complex history that extends back centuries. Early precursors include the inventions of the Early Modern period (II.)—inventions that depicted three-dimensional realities on two-dimensional surfaces—approximating human sight—binocular vision. Later, photography allowed for a new way of producing dimensionality—stereoscopy. Stereoscopy was first proposed by Charles Wheatstone in 1838: “No question relating to vision has been so much debated as the cause of the single appearance of objects seen by both eyes.”286 A pair of images taken from slightly different angles combined to create an illusion of depth. This early method of reconstruction would eventually influence the development of stereo-vision algorithms in computer science.

The field of computer vision—digital reconstruction—began to take shape in the latter half of the 20th century. Lawrence Roberts outlined the possibility of extracting 3D geometric information from 2D images in Machine Perception of Three Dimensional Solids: “The first assumption is that the picture is a view of the real world recorded by a camera or comparable device and therefore that the image is a perspective transformation of a three-dimensional field. This transformation is a projection of each point in the viewing space, toward a focal point onto a plane. The transformation will be represented with a homogeneous, 4x4, transformation matrix, P, such that the points in the real world are transformed into points on the photograph … Thus, a transformation from the real world to a picture has been described and to go the other way one simply uses the inverse transformation, P⁻¹.”287 An inversion of an inversion.

In 1981, Hugh Christopher Longuet-Higgins contributed the essential matrix—or eight-point algorithm—which encodes the relative geometric relationship between two cameras.288 In 1996, Quan-Tuan Luong and Olivier Faugeras added the fundamental matrix, which relates corresponding points between two images: “we show that there is an interesting relationship between the Fundamental matrix and three-dimensional planes which induce homographies between the images and create unstabilities in the estimation procedures.”289 The invention of algorithms to register and integrate data, like Iterative Closest Point—ICP—was also a significant milestone.290291 In 2000, Richard Hartley and Andrew Zisserman published Multiple View Geometry in Computer Vision, normalizing and consolidating key concepts and opaque algorithms in the field.292 Then the development of SIFT—Scale-Invariant Feature Transform—by David Lowe provided a more robust method of matching points between images (III.iv.).293 Over the next two decades, algorithms proliferated (III.vii.). Parallel advances in hardware made reconstruction more detailed and accessible (II.). Machine learning is accelerating reconstruction (IV.). Quantum computing will ~ (V.).

Corollary.—Tools of reconstruction morph and multiply.

Proposition II. Reconstruction is capture through capture.

Proof.—Reconstruction integrates captured images—sensory data—to capture likeness.

Note.—Reconstruction relies on a complex pipeline of algorithms—data acquisition, preprocessing, calibration, matching, estimation, optimization, integration, post-processing, visualization. The first step is always capture—data acquisition—data is collected with a device (II.).

Proposition III. Data is sanitized.

Proof.—After data acquisition comes preprocessing. This step might include noise reduction, feature extraction, or other data cleaning procedures. Purification.

Note.—Data quality is a crucial factor in reconstruction: “Motion blur, sensor noise, jpeg artifacts, wrong depth of field are just some of the possible problems that are negatively affecting automated 3D reconstruction methods.”294 Several algorithms are commonly used to improve the quality of data and identify distinctive points or regions in a scene. These algorithms—SURF, ORB, AKAZE—originate from SIFT—Scale-Invariant Feature Transform. However, there is abundant “failure caused by SIFT-like algorithms.”295 What is averaged out? What is erased? Subtracted. What remains?

Proposition IV. Rules sift local values.

Proof.—Scale-Invariant Feature Transform—SIFT—identifies and matches features—like corners and edges—in images.296 These features are represented using a descriptor, which is a vector of numerical values that describes the feature’s appearance and location. Computing the body and the archive (I.xiv.). The descriptor is produced from a set of filters that are applied to the image at multiple scales: “The SIFT algorithm uses Scale Space Theory to find interesting locations in images called keypoints. To do this, a training image is incrementally blurred using a Gaussian kernel to create a stack of blurred images called an octave. The difference between each image in an octave is then computed.”297 Once the features and descriptors have been extracted from the images, SIFT can also be used to match features between the images by comparing the descriptors. These correspondences are used to align images in later stages of reconstruction.
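In practice the pipeline above reduces to a few library calls. A minimal sketch using OpenCV’s SIFT implementation (the filenames are hypothetical) detects keypoints, computes descriptors, and matches them with Lowe’s ratio test:

import cv2

img1 = cv2.imread("view_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_b.jpg", cv2.IMREAD_GRAYSCALE)

# Keypoints and 128-dimensional descriptors for each image.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Brute-force matching; Lowe's ratio test rejects ambiguous matches.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]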

Proposition V. Sifting features and sifting outliers.

Proof.—Quality of input data is often questionable. SIFT is sensitive to noise and image degradation—low-quality images make matching difficult. The descriptor used by SIFT to represent features is limited in size, which can make it difficult to capture the full range of information about a feature. This can be a problem for applications where a large number of features need to be matched. It can lead to a high number of false matches (II.xl.). Uncertainty.

Proposition VI. Reverse engineering identities.

Proof.—In order to generate a model, the positions and parameters of all cameras must be known. It is often necessary to retroactively calibrate cameras to find their intrinsic and extrinsic parameters. There are numerous camera calibration algorithms used in reconstruction. The most common are direct linear transformation, the collinearity equations, and two-point perspective transformation. 

Proposition VII. Transformation must be linear and direct.

Proof.—Direct Linear Transformation—DLT—is the most popular calibration algorithm and is used in a variety of reconstruction pipelines. The algorithm finds the camera’s rotation, position, and calibration—R, X₀, K. It produces a matrix that can project 3D world coordinates to 2D image coordinates, effectively encapsulating the camera’s perspective. It has the advantage of being relatively simple to implement. However, it is sensitive to outliers.

Proposition VIII. Normalize all outliers | Compute the homography.

Proof.—Direct Linear Transformation begins by normalizing the point coordinates (III.xxxvi.) to improve numerical stability—each set of points is translated to its centroid and scaled to a standard spread, taming extreme values while keeping the rest of the data intact. The coordinates of a pixel equal a known coordinate in 3D space multiplied by an unknown projection matrix—x = PX. The projection matrix P is a homogeneous 3 × 4 matrix:

[ A B C D ]

[ E F G H ]

[ I  J  K L ]

The algorithm needs a minimum of 6 points to estimate 11 values. The algorithm uses point correspondences to form a system of linear equations—each point correspondence contributes two equations—one for x and one for y. It then solves the linear system using a method like Singular Value Decomposition (SVD)—this yields a transformation matrix that minimizes error. Matrices of vectors in space. Vectors connecting points in images to points on the surface of the virtual model. 
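A bare-bones DLT solver can be written directly from x = PX (a sketch; it omits the coordinate normalization discussed above). Each correspondence contributes two rows, and SVD extracts the 3 × 4 matrix that minimizes the residual:

import numpy as np

def dlt_projection_matrix(X, x):
    # X: (n, 3) known world points; x: (n, 2) observed pixels; n >= 6.
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        A.append([Xw, Yw, Zw, 1, 0, 0, 0, 0, -u*Xw, -u*Yw, -u*Zw, -u])
        A.append([0, 0, 0, 0, Xw, Yw, Zw, 1, -v*Xw, -v*Yw, -v*Zw, -v])
    A = np.asarray(A)
    # The singular vector for the smallest singular value minimizes
    # ||Ap|| subject to ||p|| = 1.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)  # the homogeneous matrix [A..L] above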

Proposition IX. Straightness detects outliers.

Proof.—The collinearity equations are the paired conditions at the heart of photogrammetric calibration. They state that an object point, the camera’s perspective center, and the corresponding image point must lie on a single straight line. Observations that deviate from this straight-line condition reveal themselves as outliers.

Note.—One of the most common ways to calculate the relationships between variables is the Pearson correlation coefficient. This coefficient measures the strength of the linear relationship between two variables. If the coefficient is close to 1, then the variables are highly linearly related, and if the coefficient is close to -1, then the variables are inversely related. If the coefficient is close to 0, then there is no linear relationship between the variables. Outliers can be identified as data points that deviate sharply from the linear trend supported by the rest of the data—points whose inclusion noticeably weakens or distorts the coefficient. Such data points are categorized as outliers because they do not fit the linear relationship that is expected between the variables.
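A small numerical check (hypothetical values) makes the note concrete:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 30.0])  # the final point defies the trend

r_with = np.corrcoef(x, y)[0, 1]               # dragged down by the outlier
r_without = np.corrcoef(x[:-1], y[:-1])[0, 1]  # close to 1 without it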

Proposition X. Two-point perspective triangulates perfection.

Proof.—The two-point perspective transformation algorithm works by converting 3D point cloud data into a 2D image representation with the appearance of depth and perspective. It involves identifying two vanishing points in the scene, which represent the convergence points for parallel lines when projected onto the 2D image plane. Triangles projecting through space. By establishing the relative positions of these vanishing points based on the scene’s orientation and camera viewpoint, the algorithm projects each 3D point onto the 2D plane along its respective projection line. This process effectively flattens the 3D scene onto the 2D plane while preserving an illusion of depth and distance. The two-point perspective transformation algorithm is employed to determine camera parameters. By analyzing vanishing points in the images, the algorithm can infer the camera’s focal length, principal point, and lens distortion—intrinsic parameters. It also estimates the relative orientation and position of the camera with respect to the scene—extrinsic parameters.

Proposition XI. After calibration—everything must be registered.

Proof.—Registration algorithms are used in reconstruction to determine correspondence. Correspondence is typically established by finding point matches between images from different camera positions. The most common algorithms are intensity-based—they compare the intensity values of pixels. These methods are most effective when images have a high degree of overlap. However, they can be sensitive to noise and can sometimes produce false matches. Phase correlation, optical flow, mutual information, and feature detection are also used for registration.

Note.—Phase correlation compares the signals that represent images. Signals are typically represented as a series of pixel values—color and intensity at each location in the image. Phase correlation compares the signals by converting them into the frequency domain, using a mathematical operation known as the Fourier transform. This allows the signals to be represented as a series of  sinusoidal waves, each with a unique frequency and amplitude. Once the signals are in the frequency domain, they can be compared using a mathematical operation known as the cross-correlation. This operation compares the amplitude and phase of the signals at each frequency, and it calculates a measure of their similarity. Based on this measure of similarity, the images can be aligned or registered with each other: “The accuracy of the algorithm was found to vary in proportion to σ/n(1 − δ)2, where σ is the speckle size, n is the subimage size, and δ is the amount of decorrelation, with negligible systematic errors. For typical values the uncertainty in the displacement is approximately 0.05 pixels. Uncertainty is found to increase with increased displacement gradients.”298 Jean-Baptiste Joseph Fourier was a French mathematician and physicist known for his groundbreaking work in the field of heat conduction and the analysis of periodic functions. One of his most significant contributions was the discovery and development of the Fourier transform. Fourier's key insight was that any periodic waveform can be represented as a sum of sinusoidal functions with specific amplitudes and frequencies. He introduced the concept of Fourier series to decompose periodic signals into their constituent sinusoidal components, providing a powerful mathematical tool for analyzing complex waveforms.299
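Phase correlation reduces to a handful of Fourier operations. A sketch in numpy (assuming two equally sized grayscale arrays) recovers an integer translation from the peak of the inverse-transformed cross-power spectrum:

import numpy as np

def phase_correlation(a, b):
    # Transform both images to the frequency domain.
    Fa = np.fft.fft2(a)
    Fb = np.fft.fft2(b)
    # Normalized cross-power spectrum: keep only the phase differences.
    cross_power = Fa * np.conj(Fb)
    cross_power /= np.abs(cross_power) + 1e-12
    # The inverse transform peaks at the translation between a and b.
    correlation = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    return dy, dx  # integer shift, modulo the image size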

Proposition XII. Feature detection descends from physiognomy and phrenology (I.xiv.).300

Proof.—Feature detection refers to the process of identifying and extracting distinctive details from images. Features help align and match the images. These features can be points, lines, or other image structures that are distinctive and stable, and that can be detected and matched across different images. Once the features have been detected, a feature descriptor is used to represent each feature in a compact way. Reduced. The descriptor is a vector of numerical values that describes the appearance and location of the feature: “Feature descriptors serve as a kind of numerical ‘fingerprint’ that we can use to distinguish one feature from another by encoding interesting information into a string of numbers”301 (II.xl.). After features and descriptors have been extracted from the images by SIFT-like algorithms, they are paired using techniques such as nearest neighbor matching or RANSAC. 

Proposition XIII. Images are ransacked.

Definition.—By ransack, I mean to look through thoroughly—often in a rough way; to search and steal with force; to plunder.302

Proof.—Features extracted from multiple images are matched using algorithms like Random Sample Consensus—RANSAC—to create point and patch correspondences: “There is a famous tale in computer vision: Once, a graduate student asked the famous computer vision scientist Takeo Kanade: ‘What are the three most important problems in computer vision?’ Takeo replied: ‘Correspondence, correspondence, correspondence!’”303

Note.—RANSAC is an iterative algorithm that estimates the parameters of a model from a set of noisy or incomplete data. RANSAC randomly selects points to form an initial hypothesis for the transformation model, such as an affine or homography transformation. It then iteratively checks how many other points support the hypothesis and repeats this process for a specified number of iterations. The hypothesis with the highest number of inliers—points consistent with the model—is considered the optimal transformation. RANSAC uses randomness to divide data into outliers and inliers.304
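The hypothesize-and-verify loop is simple enough to write out in full. A sketch fitting a 2D line (a stand-in for the affine and homography models named in the note) draws random minimal samples, counts inliers, and keeps the best-supported hypothesis:

import numpy as np

def ransac_line(points, iterations=200, threshold=0.1, seed=0):
    # points: (n, 2) array; outliers never get a vote.
    rng = np.random.default_rng(seed)
    best_count, best_model = 0, None
    for _ in range(iterations):
        # 1. Random minimal sample: two points define a line hypothesis.
        i, j = rng.choice(len(points), size=2, replace=False)
        (x1, y1), (x2, y2) = points[i], points[j]
        if x1 == x2:
            continue  # skip degenerate (vertical) hypotheses
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        # 2. Consensus: how many points lie within `threshold` of the line?
        residuals = np.abs(points[:, 1] - (m * points[:, 0] + c))
        count = int(np.count_nonzero(residuals < threshold))
        # 3. Keep the hypothesis with the most inlier support.
        if count > best_count:
            best_count, best_model = count, (m, c)
    return best_model, best_count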

Proposition XIV. Inliers and outliers.

Proof.—In data analysis, inliers are data points that are consistent with the overall pattern in the data—that are not significantly different from the rest. Conversely, in statistical analysis, an outlier is an anomaly—not representative of the overall pattern: “An outlier is an observation that lies an abnormal distance from other values in a random sample from a population. In a sense, this definition leaves it up to the analyst (or a consensus process) to decide what will be considered abnormal. Before abnormal observations can be singled out, it is necessary to characterize normal observations.”305 Inliers are typically considered to be representative of the overall data set and to be more reliable than outliers (III. xxxix.).

Note.—The word outlier joins ‘out’ with ‘lier’—one that lies apart, detached. The earliest known use of ‘outlier’ was in the 17th century, “referring to ‘stone quarried and removed but left unused,’ … [and] ‘one who does not reside in the place of his office or duties;’ the sense of ‘anything detached from its main body’ is from 1849; the geological sense is from 1833.”306 The statistical sense emerged in the 19th century. The word is widely used to refer to data points that are significantly different from the rest of the sample. It is also used more generally to refer to anything that is significantly different from the norm or that stands out in some way.

Proposition XV. Outliers have value.

Proof.—Outliers provide valuable insights into the data and the processes that generated it. Outliers can indicate the presence of underlying patterns or trends that may not be evident in the overall data set, and they can help to identify the sources of variability or error in the data: “Data editing with elimination of outliers that includes removal of high and low values from two samples, respectively, can have significant effects on the occurrence of type 1 error. This type of data editing could have profound effects in high volume research fields.”307 Not all outliers are bad data points. In some cases, outliers can be valid data points that just happen to be different from the rest of the data.

Corollary.—Outliers create friction.

Proof.—There are true outliers—but other outliers may not be representative of the overall data—the result of errors or biases.

Note.—In Reconstruction, outliers arise from noise, errors in the data collection process, and from the presence of structures or features that are not part of the scene being reconstructed.

Proposition XVI. Outliers are identified and separated—or excluded.

Proof.—Outliers can be identified using statistical techniques such as box plots or statistical tests and they can be treated in a variety of ways depending on the goals of the analysis and the nature of the outliers. For example, they may be excluded from the analysis or they may be included and treated as a separate category.

Proposition XVII. Outliers must be handled.

Proof.—Outliers produce uncertainty.

Note.—There are several approaches to treating outliers in Reconstruction: “RANSAC and PROSAC perform similarly for the case of limited outliers.”308 PROSAC—Progressive Sample Consensus—uses progressive sampling based on data point scores to improve efficiency and accuracy, whereas RANSAC uses random sampling. RANSAC and PROSAC both use exclusion, weighting, and modeling techniques as part of their robust model estimation process—Exclusion—Outliers can be excluded from the reconstruction process, either manually or using automated algorithms. This can be useful when the outliers are clearly incorrect or when they are not representative of the overall scene—Weighting—Outliers can be assigned lower weights in the reconstruction process, which reduces their influence on the final result. This can be useful when the outliers are less reliable but still contain some information that is relevant to the reconstruction—Modeling—Outliers can be modeled explicitly as part of the reconstruction process, either as part of the scene or as separate objects. This can be useful when the outliers are part of the scene and cannot be excluded or when they contain valuable information about the scene. Strip or sedate.

Proposition XVIII. Average all deviance.

Proof.—Standard deviation is a measure of the dispersion or spread of a dataset, and it is often used in 3D reconstruction to evaluate the accuracy and reliability of the reconstruction. Standard deviation is calculated as the square root of the variance, which is the average squared deviation of the data points from the mean. Standard deviation is important in 3D reconstruction because it provides a way to measure the degree of uncertainty or error in the reconstruction. A high standard deviation indicates a greater degree of uncertainty or error, while a low standard deviation indicates a smaller degree of uncertainty or error. Confidence Intervals quantify uncertainty.309 
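A worked miniature (hypothetical per-point reprojection errors, in pixels):

import numpy as np

errors = np.array([0.4, 0.6, 0.5, 0.5, 2.0])
variance = np.mean((errors - errors.mean()) ** 2)  # average squared deviation
std_dev = np.sqrt(variance)                        # same as np.std(errors)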

Note I.—In Reconstruction, standard deviation is calculated multiple times to assess accuracy—during calibration, feature matching, and point cloud evaluation. These calculations contribute to the overall evaluation of the reconstructed 3D scene’s accuracy and reliability. Standard deviation measures reconstruction error—the difference between the reconstructed model and the real-world geometry.

Note II.—In metrology, Abbe error—also known as sine error—occurs when the measurement axis and the scale are not aligned. The principle, also known as Abbe’s principle of alignment, states that measurement accuracy is maximized when the measuring scale, or the measurement reference line, is aligned with the dimension being measured. The Abbe error can become significant in precision engineering where very high measurement accuracy is required. When the scale and measurement axis are misaligned, any movement in the system will create an angular error. This angular error, when multiplied by the distance between the scale and the measurement point, results in an error in the measurement reading, thus reducing the overall accuracy of the measurement. Named after Ernst Abbe, a German physicist and entrepreneur who contributed significantly to the field of optics and precision measurement, the principle emphasizes the importance of alignment in achieving accuracy: “If errors of parallax are to be avoided, the measuring system must be placed coaxially—in line with—the line in which displacement——giving length——is to be measured.”310

Proposition XIX. Structure synthesized from alignments.

Proof.—After matching, the next critical step is estimation of motion and structure. Algorithms like Perspective-n-Point—PnP—use the correspondences from the previous step to estimate camera motion and triangulate the structure of the object or scene.

Proposition XX. The surveillance of known points.

Proof.—The PnP problem arises frequently in 3D reconstruction tasks. If there is a set of points whose position in a global frame is known, and these points are observed in multiple images, camera poses for all images can be computed. The simplest case is P3P—Perspective-Three-Point—where exactly three correspondences are used. More complex algorithms, such as EPnP—Efficient PnP—handle more correspondences and potentially provide more robust results. The PnP problem is the foundation for aligning different camera views and performing accurate triangulation. Once the camera poses are known—solved by PnP—triangulation is applied to estimate the positions of the points that are visible in multiple camera views.

Note.—A general approach to solving the PnP problem involves two steps—control points are created from the known 3D points, and the corresponding image points are transformed with respect to these control points. This involves creating a system of equations that relate the known 3D points, the image points, and the camera’s extrinsic parameters—its pose. This system of equations is then solved, usually through a process like Direct Linear Transform—DLT—Levenberg-Marquardt optimization, or others, to find the pose of the camera that best fits the observed data. These solutions often work in tandem with RANSAC or other robust methods to handle outliers and noise in the data. The resulting pose estimate allows the 3D points to be reprojected into the camera’s image plane, facilitating the process of 3D reconstruction. Known points control the unknown.
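With OpenCV, the solve described above is a single call. The control points, pixels, and intrinsics below are hypothetical; solvePnPRansac pairs the PnP solution with the robust outlier handling discussed earlier:

import cv2
import numpy as np

# Known 3D control points (world frame) and their observed pixels.
object_points = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0],
                          [0, 1, 0], [0, 0, 1], [1, 0, 1]], dtype=np.float64)
image_points = np.array([[320, 240], [420, 238], [424, 340],
                         [318, 344], [322, 180], [418, 178]], dtype=np.float64)
K = np.array([[800.0, 0, 320],
              [0, 800.0, 240],
              [0, 0, 1]])  # intrinsic matrix, assumed already calibrated

# Solve for the camera pose; `inliers` flags the consistent points.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    object_points, image_points, K, None)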

Proposition XXI. Triangulation points toward targets.

Proof.—Triangulation is a method for determining the 3D coordinates of a point from multiple 2D images taken from different viewpoints. It works by projecting rays from each camera center through the corresponding image points into a shared 3D frame. The intersections of these rays determine the 3D coordinates of each point on the target. Triangulation can be used to reconstruct point clouds, surfaces, and models.
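Once two poses are known, the intersection of rays is a single routine. A sketch with hypothetical projection matrices and matched pixels; cv2.triangulatePoints returns homogeneous coordinates that are divided through to reach Euclidean space:

import cv2
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])

# Two camera poses: the first at the origin, the second rotated
# slightly and shifted along x (hypothetical values).
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
R, _ = cv2.Rodrigues(np.array([0.0, 0.1, 0.0]))
P2 = K @ np.hstack([R, np.array([[-1.0], [0.0], [0.0]])])

# Matched pixel coordinates in each view, shape (2, n).
pts1 = np.array([[320.0, 410.0], [240.0, 250.0]])
pts2 = np.array([[300.0, 395.0], [242.0, 251.0]])

X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # homogeneous (4, n)
X = (X_h[:3] / X_h[3]).T  # one Euclidean 3D point per matched pair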

Note.—Triangulation is a foundational military tactic—descended from the boar’s head—the flying wedge—the tactical body—panzerkeil.311 Triangulation and trilateration are used in navigation, target acquisition, and communication. These strategies determine locations using signals from multiple fixed points or landmarks, such as satellites, radio transmitters, or beacon towers: “Trilateration uses known distances to pinpoint precise locations. Triangulation uses known angles to calculate unknown distances.”312 Triangulation can determine the position of an enemy target by measuring the angles between the target and two fixed points, such as the surveilling unit’s own position and a nearby hilltop. Triangulation is also used for military communication. Multiple communication stations or relay points can establish a secure communication network that is less vulnerable to interference or interception.

Another note.—Gilles Deleuze and Félix Guattari’s concept of triangulation refers to a way of thinking about power relations and social hierarchies. According to Deleuze and Guattari, triangulation occurs when three elements or forces come into play. These three elements can be individuals, groups, organizations, or even ideas. Deleuze and Guattari argue that power is not just held by a single dominant force, but rather it is produced and maintained through complex and dynamic relationships between different elements. In a triangulated relationship, one element typically holds power over the other two. Two are subordinates. In Anti-Oedipus: Capitalism and Schizophrenia, they deconstruct the triangle of normalized family relations—Mother, Father, Child—“all of them divine forms that become complicated, or rather ‘desimplified,’ as they break through the simplistic terms and functions of the Oedipal triangle … Desiring-production forms a binary-linear system … Oedipus restrained is the figure of the daddy-mommy-me triangle, the familial constellation in person.”313

The triangle is an abstraction of power. It is the illusion of stability. In reality, “flows ooze, they traverse the triangle, breaking apart its vertices. The Oedipal wad does not absorb these flows, any more than it could seal off a jar of jam or plug a dike. Against the walls of the triangle, toward the outside, flows exert the irresistible pressure of lava or the invincible oozing of water.” Life moves. Triangles break apart—dynamic—fleeting.

Proposition XXII. Movement is captured in optical flow.

Proof.—Optical flow is a technique used to analyze the motion of objects in a sequence of images or videos. This technique is used to detect the movement of objects from one frame to the next, allowing for the analysis of their motion and trajectory.

Note.—Optical flow is typically computed using methods such as the Lucas-Kanade method. These methods compare the pixel values of successive frames in the sequence and calculate the movement of pixels between frames. Optical flow is often integrated into the Structure from Motion—SfM—pipeline to improve the accuracy of the initial camera pose estimates. By observing the displacement of pixels in the optical flow field, it is possible to infer depth information in the scene. Dense optical flow techniques provide depth estimates—depth maps. These maps are used for spatial navigation in a variety of applications, including surveillance, robotics, and video games: “The decoding of flows and the deterritorialization of the socius thus constitutes the most characteristic and the most important tendency of capitalism.”314
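A sketch of sparse Lucas-Kanade tracking with OpenCV; the frames here are random placeholders standing in for consecutive video frames.

    import numpy as np
    import cv2

    # Stand-ins for two consecutive grayscale video frames.
    prev_frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
    next_frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)

    # Corners to track, found in the first frame.
    prev_pts = cv2.goodFeaturesToTrack(prev_frame, maxCorners=200,
                                       qualityLevel=0.01, minDistance=7)

    # Pyramidal Lucas-Kanade: estimate where each point moved between frames.
    next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_frame, next_frame,
                                                     prev_pts, None,
                                                     winSize=(21, 21), maxLevel=3)
    flow = next_pts - prev_pts    # per-point displacement vectors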

Proposition XXIII. Optimization is a strategy of maximum alignment.

Proof.—Optimization reduces reprojection error. Algorithms used for this purpose include Bundle Adjustment, Graph Cuts, and Belief Propagation.

Note.—In Reconstruction, the core objective of optimization is to yield the most accurate dimensional model. The dimensional double of the observed images or corresponding data. Approaching this goal typically necessitates the iterative refinement of variables—camera parameters, point locations—to minimize the inconsistency between a reconstructed model and the actual observed data. 

Proposition XXIV. Avoid error—align with the light—integrate—assimilate—or vanish.

Proof.—Alignment algorithms like bundle adjustment enhance estimated point positions and camera calibration parameters—focal length, principal point, lens distortion coefficients, and camera pose. The resulting effect is a substantial increase in the accuracy of the reconstructed target.

Note.—Bundle Adjustment was first used in the field of photogrammetry in the 1950s and is now used extensively in computer vision.315 The term bundle refers to the bundle of light rays leaving each point on an object and arriving at each camera position, forming a network—or bundle—of rays. The term adjustment comes from the idea of modifying parameters to minimize the error between the predicted and actual image observations. The bundle of light rays is adjusted to better align with the source images. The process of bundle adjustment is initiated by generating a preliminary conjecture of the 3D points and camera parameters. This initial step is often carried out using structure-from-motion methodologies, in which a pair of images is employed to estimate the points by deciphering the relative motion between two cameras. The accuracy of the initial guess is not paramount, yet it requires a reasonable degree of correctness to ensure the optimization process is capable of converging towards an accurate solution. Following the production of the initial guess, the subsequent step involves the re-projection of each 3D point back into the image planes of the cameras, an action that relies on the conjectured camera parameters. The re-projection step effectively serves to calculate the error by comparing the re-projected point's position with the originally observed point within the image. Divergence in positions is known as re-projection error—the central goal of bundle adjustment is to minimize this error.

The minimization of re-projection error across all cameras and points is a problem of non-linear least squares: “Bundle adjustment constitutes a large, nonlinear least-squares problem that is often solved as the last step of feature-based structure and motion estimation computer vision algorithms to obtain optimal estimates. Due to the very large number of parameters involved, a general purpose least-squares algorithm incurs high computational and memory storage costs when applied to bundle adjustment.”316 Compute time. Energy consumption. This optimization process adjusts the estimated points and camera parameters iteratively. Each iteration strives to reduce the overall re-projection error. The iterative cycle continues until the alteration in error—or the error itself—is reduced below a predetermined threshold or the maximum number of iterations is achieved. Upon the completion of this iterative process, the final result of the bundle adjustment is a refined set of points and camera parameters that produces the minimal re-projection error. The final, optimized output exhibits a significant improvement in accuracy compared to the initial guess.
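A toy version of this loop can be written with SciPy's nonlinear least-squares solver. The observation list, intrinsics, and initial guess below are illustrative stand-ins for what structure-from-motion would provide.

    import numpy as np
    from scipy.optimize import least_squares
    from scipy.spatial.transform import Rotation

    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])  # assumed intrinsics

    def residuals(params, n_cams, n_pts, obs, K):
        cams = params[:n_cams * 6].reshape(n_cams, 6)  # rotation vector + translation per camera
        pts = params[n_cams * 6:].reshape(n_pts, 3)    # one 3D point per track
        res = []
        for cam, pt, uv in obs:
            R = Rotation.from_rotvec(cams[cam, :3]).as_matrix()
            p = K @ (R @ pts[pt] + cams[cam, 3:])      # re-project the point into this camera
            res.append(p[:2] / p[2] - uv)              # re-projection error in pixels
        return np.concatenate(res)

    # Each observation: (camera index, point index, observed pixel position).
    obs = [(0, 0, np.array([300.0, 220.0])), (1, 0, np.array([340.0, 230.0]))]
    x0 = np.zeros(2 * 6 + 1 * 3)    # initial guess, normally taken from SfM
    x0[9] = 0.1                     # small x-translation for the second camera
    x0[14] = 5.0                    # start the point in front of the cameras
    result = least_squares(residuals, x0, args=(2, 1, obs, K))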

Proposition XXV. Graph cuts produce disparity maps.

Proof.—Graph cuts are a powerful optimization approach used in Reconstruction to efficiently solve certain energy minimization problems. Graph cuts are commonly applied to the problem of depth map—and disparity map—estimation: “The modern variations on graph-based segmentation algorithms are primarily built using a small set of core algorithms—graph cuts, random walker, and shortest paths.”317 When two images are captured from different viewpoints, the corresponding points on these images will have a horizontal shift due to the relative displacement of the cameras. This horizontal shift is known as disparity. The larger the disparity value, the closer the object is to the cameras, and the smaller the disparity value, the farther the object is from the cameras. By computing a disparity map—which represents the disparity values for all pixels in an image pair—it is possible to estimate the depth information of an object or scene. This information can be used to reconstruct dimensional structures: “Dense disparity estimation in omnidirectional images has become a part of localization, navigation, and obstacle avoidance research.”318 The process starts by calculating costs for each possible disparity of each pixel. This includes a data cost, which quantifies the match between a given disparity and the observed images, and a smoothness cost, penalizing disparities that deviate from their neighboring pixels. Following cost calculation, the graph cut algorithm is applied, partitioning the graph into two disjoint sets. Each set represents a distinct disparity or a set of disparities. The goal is to identify a cut that minimizes the combined costs of the edges crossing the division. This cut effectively corresponds to a disparity assignment that minimizes the overall costs. This process yields a disparity map which forms the basis of a model.
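A common library route is OpenCV's semi-global matcher, which minimizes a related data-plus-smoothness energy of the kind graph-cut stereo optimizes; the filenames and parameters here are illustrative.

    import cv2

    # Rectified grayscale stereo pair; the filenames are placeholders.
    left = cv2.imread('left.png', cv2.IMREAD_GRAYSCALE)
    right = cv2.imread('right.png', cv2.IMREAD_GRAYSCALE)

    # P1 and P2 are the smoothness costs penalizing disparity changes between neighbors.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5,
                                    P1=8 * 5 * 5, P2=32 * 5 * 5)

    # The matcher returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left, right).astype('float32') / 16.0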

Note.—Social disparity maps are graphical representations that highlight the spatial distribution of various socio-economic factors across a given geographical area. These maps often showcase the direct and indirect outcomes of partitioning processes like redlining and gerrymandering. Redlining is a discriminatory practice that involves the systemic denial of essential services like banking and insurance to residents of specific neighborhoods—predominantly those occupied by racial and ethnic minorities. Gerrymandering is a political strategy that manipulates the boundaries of electoral districts to favor a particular political party or group. The impact of gerrymandering can be seen in the distribution of political power, where certain communities may be disproportionately underrepresented or overrepresented. These spatial partitioning processes have created stark socio-economic disparities. Social disparity maps visualize these socio-spatial divides, graphing variations in income, education, health, and other indicators across different regions. For instance, the Atlas of Inequality “uses aggregated anonymous location data from digital devices to estimate people’s incomes and where they spend their time … Economic inequality isn't just limited to neighborhoods, it's part of the places you visit every day … place inequality measures how similar the incomes of those visitors are. Each dot on the map is a place. More blue places see diverse visitors, while red places are more unequal.”319 Regions that were historically subjected to redlining and gerrymandering often display lower levels of income, poorer health outcomes, and limited access to quality education, reflecting the long-lasting effects of discriminatory practices.

Proposition XXVI. Belief propagation produces disparity maps.

Proof.—Belief Propagation—BP—is another prominent algorithm utilized to infer a dimensional model from multiple images.

Note.—The process starts with formulating the 3D reconstruction problem as a Markov random field—MRF—where each node corresponds to a pixel and its state signifies the depth—or disparity—of the pixel. The edges connecting the nodes encapsulate relationships between pixels, typically enforcing smoothness constraints. For each possible disparity, a data cost is computed, measuring the compatibility of that disparity with the observed images, alongside a smoothness cost that penalizes disparities that diverge from their neighbors. The algorithm commences its iterative message-passing phase, with each message representing the sender node's belief about the receiver node's disparity. These messages are updated considering costs and incoming messages from neighboring nodes. Afterward, beliefs—the estimated probabilities of each possible disparity—are updated at each node, which entails combining the data cost for each disparity with the corresponding incoming messages. This iterative cycle of message-passing and belief updating continues until the beliefs converge—or a predetermined maximum number of iterations is reached. The disparity—or depth—for each pixel maximizing its belief is selected, constructing a disparity map that can then be transformed into a model.320
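On a single scanline the min-sum form of this message passing can be written exactly; the sketch below, with an assumed linear smoothness cost, is the one-dimensional skeleton of what loopy belief propagation iterates over a full image grid.

    import numpy as np

    def scanline_bp(data_cost, lam):
        """Exact min-sum message passing along a chain of pixels.

        data_cost: (W, D) cost of assigning each of D disparities to each of W pixels.
        lam: weight of the linear smoothness cost |d - d'| between neighbors.
        """
        W, D = data_cost.shape
        d = np.arange(D)
        pair = lam * np.abs(d[:, None] - d[None, :])   # smoothness cost table
        fwd = np.zeros((W, D))                         # messages passed left to right
        bwd = np.zeros((W, D))                         # messages passed right to left
        for x in range(1, W):
            fwd[x] = np.min(fwd[x - 1][:, None] + data_cost[x - 1][:, None] + pair, axis=0)
        for x in range(W - 2, -1, -1):
            bwd[x] = np.min(bwd[x + 1][:, None] + data_cost[x + 1][:, None] + pair, axis=0)
        beliefs = data_cost + fwd + bwd                # data cost plus incoming messages
        return beliefs.argmin(axis=1)                  # lowest-cost disparity per pixel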

Another note.—Propaganda and disinformation refer to the intentional dissemination of false information to deceive or mislead:

‘Conspiracy beliefs,’ characterized as ‘attempts to explain the ultimate cause of an event … as a secret plot by a covert alliance of powerful individuals or organizations, rather than as an overt activity or natural occurrence,’ feature prominently in disinformation, misinformation, and inequality-driven mistrust. It can be difficult to persuasively present evidence to refute these types of ideas, especially because experts are often seen as part of the conspiracy and new pieces of contrary evidence can be rationalized into an existing narrative. For example, a Pew Research Center survey conducted in March 2020 found that 29% of Americans believed that SARS-CoV-2 was developed intentionally in a lab, with many pointing to Wuhan, China as the source; President Trump has given this theory institutional legitimacy, despite scientific consensus and the consensus of the U.S. intelligence services that SARS-CoV-2 is not human-made. This strategic disinformation has served several agendas: casting doubt on evidence presented by Dr. Anthony Fauci, the Director of the National Institute of Allergy and Infectious Diseases and member of the White House Coronavirus Task Force, validating and reinforcing pre-existing xenophobia and racism, and redirecting attention away from the White House’s inadequate and delayed response to COVID-19.321

Disinformation produces social and economic disparities within and between communities. For instance, political disinformation can disrupt the fair allocation of political representation—creating disparities in political power that can be spatially mapped.

Proposition XXVII. Disparity maps are maps of relative value.

Proof.—Disparity maps are grayscale images in which the value of each pixel corresponds to the difference in position of that pixel's projections onto two different images. The disparity is inversely proportional to depth, indicating that a higher disparity corresponds to a closer object, while a lower disparity points to a distant object. Generation of these maps involves matching points between two images, typically by comparing small windows of pixels around each point in the first image with corresponding windows in the second image. Disparity for each point in the first image is then calculated based on the position difference between that point and its match in the second image. Subsequently, given the disparity map and the specifics of the camera and stereo setup, a depth map can be created, providing estimated distances from the camera to different points in the scene. This depth map, often used in tandem with the original image data, enables the construction of a 3D model of the scene. Each point in this 3D model is obtained by back-projecting a pixel from the image onto a 3D ray in the scene, with the depth along the ray determined by the depth map. Disparity maps offer a key pathway to convert 2D image data into depth information, ultimately facilitating the reconstruction of the 3D geometry of a scene. Higher disparity is white—lower disparity is black.
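The inverse relationship is direct to compute once the focal length and stereo baseline are known; the constants below are assumptions.

    import numpy as np

    def disparity_to_depth(disparity, fx=800.0, baseline=0.1):
        """Convert a disparity map (pixels) to a depth map (meters)."""
        depth = np.zeros_like(disparity, dtype=np.float64)
        valid = disparity > 0                    # zero disparity: no match, infinite depth
        depth[valid] = fx * baseline / disparity[valid]   # depth falls as disparity rises
        return depth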

Note I.—The term stereotype—associated with generalized and oversimplified representations of individuals or groups—finds its root in the word stereo, a prefix of Greek origin meaning solid or three-dimensional volume.322 

Corollary I.—Etymologies map ideologies.

Proof.—This is proved from the last proposition in the same manner as III. xxii. is proved from III. xxi.

Corollary II.—Two axes of language in action.

Proof.—Stereo vision, a fundamental principle of human perception, refers to our ability to perceive depth and three-dimensional structure on the basis of visual information derived from two eyes. This biological capacity is integral to our survival and interaction with the world—allowing us to navigate our environment with precision. The term stereo developed new meanings with the advent of technology, notably in the printing industry in the late 18th century. A stereotype—derived from the terms stereo and type—was a solid plate of type metal, cast from a papier-mâché or plaster mold—called a flong. The flong was taken from the surface of a form of type used for printing. These plates were durable, enabling repeated impressions that were identical to the original type layout: “Stereotypes were not moving (or movable) type, but solid type.”323 The replication of immutable stereotypes gave birth to the metaphorical application of the term in social contexts.

Corollary III.—Flattened and unchanging impressions.

Proof.—The metaphorical use of stereotype arose in the early 20th century when American journalist Walter Lippmann started using it to describe the overly simplistic, preconceived, and standardized images or ideas held by one person or group about another. Stereotypes are “hallucinated—a complete fiction out of one external fact and a remembered superstition.”324 The connection was logical—like the unchanging impressions made by stereotype plates, societal stereotypes fail to capture the changing nuances, complexity, and individuality of the people they seek to represent. They flatten three-dimensional human beings into one-dimensional caricatures. Over time, stereotype has become a critical term in social sciences used to interrogate the simplified assumptions and prejudices that obstruct genuine understanding of others.

Note II.—Stereotype reveals an etymological evolution that is tied to our visual and cognitive understanding of the world. Stereotypes are functional. Efficiencies. But they decimate the infinite of the other: “The contemporary world, scientific, technical, and sensualist, sees itself without exit—that is, without God—not because everything there is permitted and, by the way of technology, possible, but because everything there is equal. The unknown is immediately made familiar [...] The enchantment of sites, hyperbole of metaphorical concepts, the artifice of art, exaltation of ceremonies, the magic of solemnities—everywhere is suspected and denounced a theatrical apparatus, a purely rhetorical transcendence, the game. Vanity of vanities: the echo of our own voices, taken for a response to the few prayers that still remain to us; everywhere we have fallen back upon our own feet, as after the ecstasies of a drug. Except the other whom, in all this boredom, we cannot let go.”325

Proposition XXVIII. Reconstructions densify.

Proof.—After optimizing a sparse set of points, denser reconstructions are generated—stereo matching—space carving—patch-based multi-view stereo algorithms.

Proposition XXIX. A dense reconstruction is still a stereotype.

Proof.—Stereo matching is predicated on the principles of binocular disparity, drawing parallels to the physiological functioning of human binocular vision. This process constitutes the identification of corresponding points, also known as stereo correspondences, between a pair of stereo images—a duo of images capturing the same scene from two distinct viewpoints, thus paralleling the different perspectives provided by each of the human eyes. The task of stereo matching revolves around determining correspondences between the two images. Mapping every pixel in one image to its corresponding pixel in the other image. This operation is critical because it enables triangulation, which paves the way for the inference of the 3D structure of the given scene. Semi-Global Matching—SGM—and Dynamic Programming are two additional algorithms often employed in stereo matching. Semi-Global Matching computes the minimal cost for all possible disparities for each pixel while maintaining consistency along several directions, and Dynamic Programming seeks to optimize a cost function globally, making it suitable for handling occlusion.
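A deliberately naive correspondence search makes the structure visible: for each pixel in the left image, slide a window along the same row of the right image and keep the offset with the lowest matching cost. The window size and disparity range here are arbitrary choices.

    import numpy as np

    def block_match(left, right, max_disp=32, half=3):
        """Sum-of-absolute-differences block matching on a rectified pair."""
        H, W = left.shape
        disp = np.zeros((H, W), dtype=np.int32)
        for y in range(half, H - half):
            for x in range(half + max_disp, W - half):
                patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
                costs = [np.abs(patch - right[y - half:y + half + 1,
                                              x - d - half:x - d + half + 1].astype(np.int32)).sum()
                         for d in range(max_disp)]
                disp[y, x] = int(np.argmin(costs))   # disparity with the lowest cost
        return disp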

Note.—Variations in lighting conditions, shifts in perspective, occlusions, and other alterations in visual properties disrupt alignment and computation. Graph Cuts, Block Matching, and Space Carving make sense of change and interference.

Proposition XXX. Reconstruction carves space. (III. xxv.)

Proof.—Space carving—also known as volumetric reconstruction—generates three-dimensional models from two-dimensional images. The method also utilizes a collection of images taken from different viewpoints. But it is subtractive.

Note.—The technique operates by defining an initial volume, often encompassing the entire scene, and then iteratively carving—removing parts of the volume that are determined to be empty—until a final model is produced. Each carving iteration uses the information from one of the images. It begins with a solid volume that's large enough to contain the reconstructed object. For each image, the algorithm determines which parts of the volume project onto the background. These parts of the volume are carved away since they do not contain the object. This is repeated for every image. The remaining volume represents the reconstruction of the object. This method assumes that the object is completely surrounded by the cameras, and it needs to have the silhouette of the object in all images to perform the carving. This method does not require feature matching between images, making it suitable for reconstructing objects with textureless or repetitive surfaces: “The approach is designed to (1) capture photorealistic shapes that accurately model scene appearance from a wide range of viewpoints, and (2) account for the complex interactions between occlusion, parallax, shading, and their view-dependent effects on scene-appearance.”326
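A voxel-level sketch of the carving loop, assuming known 3x4 projection matrices and binary silhouette masks; everything else is illustrative.

    import numpy as np

    def carve(voxels, cameras, silhouettes):
        """Keep only the voxels that project inside every silhouette.

        voxels: (N, 3) candidate points; cameras: list of 3x4 projection
        matrices; silhouettes: matching list of binary masks (H, W).
        """
        keep = np.ones(len(voxels), dtype=bool)
        homo = np.hstack([voxels, np.ones((len(voxels), 1))])  # homogeneous coords
        for P, mask in zip(cameras, silhouettes):
            uvw = homo @ P.T                     # project every voxel into this view
            z = uvw[:, 2]
            z_safe = np.where(z > 0, z, 1.0)     # avoid dividing by zero behind the camera
            u = np.rint(uvw[:, 0] / z_safe).astype(int)
            v = np.rint(uvw[:, 1] / z_safe).astype(int)
            H, W = mask.shape
            inside = (z > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)
            hit = np.zeros(len(voxels), dtype=bool)
            hit[inside] = mask[v[inside], u[inside]] > 0
            keep &= hit                          # carve away anything outside the silhouette
        return voxels[keep]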

Proposition XXXI. Space is carved with lines of sight.

Proof.—Ray-casting works by tracing a ray from the eye of the viewer—or camera—through each pixel in the image. For each pixel, the ray is tested for intersection with any objects in the scene. If the ray intersects an object, the color of the pixel is set to the color of the object. If the ray does not intersect any objects, the color of the pixel is set to the background color. Ray casting is used in Space Carving to project the two-dimensional silhouettes of an object onto an initial volume. For each voxel in the volume, a ray is cast to each camera position, and the voxel is discarded if it projects outside the object's silhouette in any of the images.
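A minimal ray caster against a single sphere shows the per-pixel loop; the camera model and scene are the simplest possible assumptions.

    import numpy as np

    def cast_rays(width, height, center, radius):
        """One ray per pixel from a pinhole at the origin, looking down +z."""
        image = np.zeros((height, width))
        for j in range(height):
            for i in range(width):
                # Ray direction through pixel (i, j) on an image plane at z = 1.
                d = np.array([(i - width / 2) / width, (j - height / 2) / height, 1.0])
                d /= np.linalg.norm(d)
                # Intersect |t*d - center|^2 = radius^2; a hit exists when the
                # discriminant of the quadratic in t is non-negative.
                b = -2.0 * d.dot(center)
                c = center.dot(center) - radius ** 2
                hit = b * b - 4.0 * c >= 0
                image[j, i] = 1.0 if hit else 0.0   # object color vs. background
        return image

    frame = cast_rays(64, 64, center=np.array([0.0, 0.0, 4.0]), radius=1.0)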

Corollary.—Ray-casting projects a point of view.

Note.—In computer graphics and rendering, ray-casting creates visual perspective from a defined point of view. It shoots rays through space. It calculates anything it hits. Targets. Ray-casting is generally attributed to Arthur Appel, who first used the term in a 1968 paper titled Some Techniques for Shading Machine Renderings of Solids: “Generate a light ray to the midpoint of the segment (KM). If any surface lies between KM and the light source go on to the next segment. Determine the next surface behind KM that the light ray to KM pierces within its boundary. If no surface lies behind KM go on to the next segment. A point can cast only one shadow. Project Kl, KM, and K2 onto the surface to obtain K1S, KMS, and K2S, the shadows of Kl, KM, and K2. If KMS lies on a surface which is seen from its shadow side go on to the next segment. This particular shadow boundary is invisible. Also a shadow cannot fall within a shadow.”327 His work laid the foundation for much of the progress that followed in the field of computer graphics. The development of ray-casting—and subsequently ray-tracing—was a cumulative effort that evolved with the contributions of numerous researchers and developers over several decades. Ray-casting is a method that can be used at various points in the Reconstruction pipeline—transformations, bundle adjustment, rendering and visualization. In volume rendering, a model is usually represented as a point cloud, mesh, or grid of voxels. Ray-casting and ray-tracing generate images—from the models—viewable on a flat display. For each pixel in the output image, a ray is cast into the 3D model, and the pixel's color and opacity are computed based on the voxels that the ray passes through.

Proposition XXXII. Mutual information equates to shared truths.

Proof.—In Reconstruction, mutual information is a measure of the amount of information shared between two variables. It is often used to evaluate the quality of a Reconstruction. It provides a way to measure the degree to which the reconstructed model accurately represents perfection. 

Note.—Mutual information is co-dependent data. It is calculated as the difference between the entropy of the reconstructed model and the entropy of the model conditioned on the input data. The entropy of a variable is a measure of the amount of information contained in the variable, and the entropy of a model conditioned on the input data is a measure of the amount of information contained in the model that is not present in the input data. Entropy. By calculating the mutual information between the reconstructed model and the input data, it is possible to determine the degree to which the reconstructed model represents the real-world scene: “Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.”328 High mutual information indicates that the reconstructed model accurately represents the scene, while low mutual information indicates that the reconstructed model is less accurate. Fixed. Reconstructions are death. Entropy is life: “An MIT physicist [Jeremy England] has proposed the provocative idea that life exists because the law of increasing entropy drives matter to acquire lifelike physical properties.”329 Mutual information is often used in conjunction with other evaluation metrics, such as reconstruction error and visual quality, to provide a more comprehensive assessment of stability.
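Estimated from histograms, the measure is a few lines of NumPy; the bin count is an assumption, and the two arrays stand for any pair of aligned signals, such as a rendered reconstruction and a reference image.

    import numpy as np

    def mutual_information(a, b, bins=32):
        """I(A;B) = H(A) + H(B) - H(A,B), estimated from a joint histogram."""
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        p = joint / joint.sum()                 # joint probabilities
        pa, pb = p.sum(axis=1), p.sum(axis=0)   # marginals
        nz = p > 0                              # avoid log(0)
        h_ab = -np.sum(p[nz] * np.log(p[nz]))
        h_a = -np.sum(pa[pa > 0] * np.log(pa[pa > 0]))
        h_b = -np.sum(pb[pb > 0] * np.log(pb[pb > 0]))
        return h_a + h_b - h_ab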

Proposition XXXIII. Benchmarks are shared sources.

Proof.—Benchmarks refer to standardized datasets and evaluation metrics that measure and compare the performance of different reconstruction algorithms. They play a critical role in the development and refinement of these algorithms by providing a consistent way to evaluate their effectiveness and accuracy. A benchmark dataset for Reconstruction typically includes data—a set of images or video sequences—along with associated parameters—and in some cases, ground-truth models for comparison. Some benchmark datasets include additional information—depth maps or semantic labels.

Note.—There are several popular benchmarks that are widely used for evaluating the performance of 3D reconstruction algorithms. The Middlebury dataset is a collection of images of a variety of scenes, including buildings, natural landscapes, and synthetic objects. It is widely used to evaluate the accuracy and reliability of 3D reconstruction algorithms, particularly for structure from motion and multi-view stereo. The KITTI dataset is a collection of images and point clouds of real-world scenes captured by a vehicle-mounted camera and lidar system. It is widely used to evaluate the performance of 3D reconstruction algorithms for applications such as autonomous driving and robotics. The ShapeNet dataset is a collection of 3D models of a wide range of objects, including cars, chairs, and airplanes. It is widely used to evaluate the performance of 3D reconstruction algorithms for shape completion and reconstruction from partial data. The Tanks and Temples benchmark is a collection of images with corresponding Lidar point clouds—lasers as ground truth: “The benchmark sequences were acquired outside the lab, in realistic conditions. Ground-truth data was captured using an industrial laser scanner. The benchmark includes both outdoor scenes and indoor environments. High-resolution video sequences are provided as input, supporting the development of novel pipelines that take advantage of video input to increase reconstruction fidelity. We report the performance of many image-based 3D reconstruction pipelines on the new benchmark. The results point to exciting challenges and opportunities for future work.”330

Proposition XXXIV. Tanks and temples determine the ground truth.

Proof.—The Tanks and Temples benchmark was created by researchers at Intel Labs—it is widely used in research and development of reconstruction algorithms. The Tanks and Temples benchmark consists of thousands of images—fourteen scenes. The images were taken from a variety of viewpoints and under different lighting conditions, and they contain a range of features and structures that are challenging to reconstruct accurately.331

Proposition XXXV. Tanks and temples normalize territories in the field.

Proof.—The Tanks and Temples benchmark serves as a critical unifying factor in the field of Reconstruction—a standardized dataset for algorithm evaluation. By offering a common ground of complex, real-world scan data, it ensures that different algorithms can be objectively compared—allowing for consistent evaluation of their strengths and weaknesses. This consistency fosters collaboration, encourages transparency, and accelerates progress in the field, as researchers and developers can clearly see how their methods stack up against others. There is a leaderboard and a supplement of comparison matrices. By benchmarking against the same reference data, researchers can also identify areas of improvement, which promotes innovation and pushes the boundaries of what's possible in Reconstruction. But the benchmark is made of colonial symbols and institutions.

The first scene is Family—Father, Mother, Child. The next scene is a monument. Then a horse. A lighthouse. An M60 Tank. A Panther Tank. A playground. A train. An auditorium. A ballroom. A courtroom. A museum. A palace. A temple.332 

Note.—Military power and religious institutions—symbolized by tanks and temples respectively—impose social and political normalization. Tanks are embodiments of state authority—military power. They may instill a sense of threat or security and order. They signify the assertion of governmental control moving through physical space. Conversely, temples are spiritual epicenters. Fixed physical landmarks—monuments to ideologies. The first temple described in the Bible was the Temple of Solomon. The description begins with precise measurements—each dimension—in cubits (V.): “The inner sanctuary was twenty cubits long, twenty wide and twenty high.”333 A cube. In front of the Temple, four staircases ascend to an elevated platform. An enormous pyre.334 The eternal flame. Priests tending the fire. A continuous flow of burned offerings. A vertical vortex of smoke. A tether. A giant cord connecting earth and sky | visible from miles away. A specimen pin (I.xv.).

Proposition XXXVI. Normalization promotes sameness.

Proof.—In computer graphics, normalization is the process of scaling a variable to lie within a specified range, typically between 0 and 1. This is often done to bring all the variables in a dataset to the same scale, which can be important for algorithms that rely on distances between data points.

Corollary.—Normalization aims for consistency.

Proof.—Normalization is used to scale models to consistent sizes for rendering. It is also used in computer graphics to ensure that colors are displayed accurately and consistently across different devices.

Note.—Normalization can be achieved through a variety of methods, including min-max normalization, z-score normalization, and scaling to unit length. Min-max normalization scales a variable to lie within a given range by subtracting the minimum value and dividing by the range. The resulting values will all lie between 0 and 1, with the minimum value becoming 0 and the maximum value becoming 1. Z-score normalization standardizes a variable by subtracting the mean and dividing by the standard deviation. This results in a variable with a mean of 0 and a standard deviation of 1. Scaling to unit length normalizes a variable by dividing each value by the Euclidean norm—the square root of the sum of squares of the values. This results in a variable with a length of 1, with each value representing a proportion of the total length.
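All three methods are one-liners over an array; the sample values are arbitrary.

    import numpy as np

    x = np.array([2.0, 4.0, 6.0, 10.0])

    minmax = (x - x.min()) / (x.max() - x.min())   # values scaled into [0, 1]
    zscore = (x - x.mean()) / x.std()              # mean 0, standard deviation 1
    unit = x / np.linalg.norm(x)                   # Euclidean length 1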

Proposition XXXVII. Integrating orientations and positions.

Proof.—Normalization also refers to adjustments in scale, orientation, and position of point clouds or meshes to a common reference frame. This process is often used to align multiple models or to bring them into a common coordinate system.

Proposition XXXVIII. Normals | straight and upright.

Proof.—In computer graphics, normals are vectors that are used to represent the orientation of a surface at a particular point. They are typically perpendicular to the surface and are used to calculate the way that light reflects—an important factor in realistic graphics. Normals calculate the angle of incidence between the surface and a light source. This determines what is reflected and how it is distributed over the geometry. There are two types of normals—surface normals and vertex normals. Surface normals are defined at each point on the surface of a model and are used to calculate the way that light reflects off of the surface. Vertex normals are defined at each vertex of a model and are used to smooth the surface of the model by averaging the surface normals of the surrounding vertices. Normals are an important factor in creating the appearance of depth, curvature, and surface roughness.335
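Both kinds of normal fall out of a few cross products; a sketch over an indexed triangle mesh, where vertices is a float (V, 3) array and faces an integer (F, 3) array:

    import numpy as np

    def vertex_normals(vertices, faces):
        """Per-triangle surface normals, averaged into per-vertex normals."""
        v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
        face_n = np.cross(v1 - v0, v2 - v0)          # perpendicular to each triangle
        normals = np.zeros_like(vertices)
        for i in range(3):                           # each vertex accumulates the
            np.add.at(normals, faces[:, i], face_n)  # normals of its adjacent faces
        length = np.linalg.norm(normals, axis=1, keepdims=True)
        return normals / np.where(length > 0, length, 1.0)   # unit length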

Proposition XXXIX. No tangents | nothing abnormal.

Proof.—Abnormal is a lecture series given by Michel Foucault at the Collège de France in 1974. In this series, Foucault discusses the concept of abnormality and how it has been constructed and enforced in Western societies.

Note.—Foucault explains that abnormality is not a universal concept—it is a culturally and historically specific construct. He argues that the concept of abnormality is not based on objective scientific criteria, but rather it is shaped by social, cultural, and political factors. Foucault then goes on to trace the history of the concept of abnormality in Western societies, starting with ancient Greek and Roman societies, where abnormal behavior was understood as a form of divine punishment or possession. He then discusses the emergence of the modern concept of abnormality in the 18th and 19th centuries, which was closely tied to the rise of the discipline of psychiatry and the development of the asylum system:

It is this set of ideas, this simultaneously positive, technical, and political conception of normalization that I would like to try to put to work historically by applying it to the domain of sexuality. And you can see that behind this, the basic target of my criticism, or what I would like to get free from, is the idea that political power—in all its forms and at whatever level we grasp it—has to be analyzed within the Hegelian horizon of a sort of beautiful totality that through an effect of power is misrecognized or broken up by abstraction or division.336

Foucault argues that the concept of abnormality has been used as a means of social control, particularly in relation to marginalized groups—the poor, the elderly, people with disabilities, people of different races and orientations. He also suggests that the concept of abnormality has been used to reinforce dominant societal norms and values, and to justify exclusion and discrimination.

Proposition XL. Normals are indicators and controls.

Proof.—Normalization theory is a sociological theory that explains how social behaviors and norms become integrated into society. It was developed in the late 1970s by sociologists Professor Ian Gough and Professor Peter Townsend at the University of Bath in the United Kingdom.

Note.—According to normalization theory, the process involves four stages—Deviance—A behavior is seen as deviant or outside the norm—Condemnation—The behavior is condemned or stigmatized by society—Tolerance—The behavior is gradually accepted and tolerated by society—Acceptance—The behavior becomes fully accepted and becomes part of the norm. Normalization theory suggests that deviant behaviors can become normalized through a process of social integration. In Normporn, “Karen Tongson reflects on how queer cultural observers work through repeated declarations of a ‘new normal’ and flash lifestyle trends like ‘normcore,’ even as the absurdity, aberrance, and violence of our culture intensifies.”337

Corollary I.—Normalization theory has been applied to a wide range of social behaviors, including drug use, gambling, and the adoption of technology. It is often used to understand how behaviors and norms change over time—how they become integrated into society.

Corollary II.—Normalizing white supremacist patriarchy refers to the process of accepting and perpetuating harmful systems of power that prioritize the interests of white, cisgender men over those of other groups. This includes the acceptance of white supremacy, which is the belief that white people are superior to people of other races, and the patriarchy, which is a system of power that privileges men over women. Binaries. White—Black. Male—Female.

Proof.—White supremacist patriarchy is reinforced in many ways, including through media, education, and language. For example, the media often portrays people of color in a negative or stereotypical manner, while simultaneously reinforcing the idea that white people are the norm and the default. This can lead to the internalization of harmful stereotypes and biases by both white people and people of color. The Bluest Eye.338 Aggressive recession.

Note.—The endeavour to injure one whom we hate is called Anger.

Racism and sexism are normalized through images—stereotypes circulated. Reinforced. Internalized. Normalizing sexism also involves the acceptance of discriminatory practices and policies, such as the gender pay gap or the lack of representation of women in leadership positions. These practices and policies often go unchallenged, further perpetuating and reinforcing sexist attitudes and behaviors. Normalizing sexism—and racism—involves the subtle—often unconscious acceptance of attitudes and behaviors as the expected defaults. It is important to recognize and challenge the ways in which white supremacist patriarchy is normalized in order to create a more equal and just society. This involves actively working to dismantle harmful systems of power and supporting policies and practices that promote equity and justice for all. To normalize equality and mutual respect.

Proposition XLI. Orientation is not fixed.

Proof.—Sara Ahmed is a British feminist and critical race theorist who has written extensively about the concept of orientation. In her work, Ahmed uses the term orientation to refer to the ways in which individuals and groups orient themselves towards or away from certain ideas, practices, or values: “I consider how racism is an ongoing and unfinished history; how it works as a way of orientating bodies in specific directions, thereby affecting how they ‘take up’ space. We ‘become’ racialized in how we occupy space.”339

Note.—According to Ahmed, orientation is a dynamic and ongoing process that is shaped by social, cultural, and historical factors. It is not fixed or predetermined, but rather it is influenced by the ways in which individuals and groups interact with and make sense of the world around them.

Corollary.—Orientation is closely tied to power dynamics and social hierarchies.

Note.—Ahmed suggests that marginalized groups may be forced to orient themselves in ways that are counter to their own desires or interests. Orientation is not just about individual choices or preferences, it is a collective process shaped by social and cultural forces: “Although Merleau-Ponty is tempted to say that the ‘vertical is the direction represented by the symmetry of the axis of the body,’ his phenomenology instead embraces a model of bodily space in which spatial lines ‘line up’ only as effects of bodily actions on and in the world. In other words, the body ‘straightens’ its view in order to extend into space. One might be tempted, in light of Merleau-Ponty’s discussion of such queer moments, to reconsider the relation between the normative and the vertical axis … the normative can be considered an effect of the repetition of bodily actions over time, which produces what we can call the bodily horizon, a space for action, which puts some objects and not others in reach. The normative dimension can be redescribed in terms of the straight body, a body that appears ‘in line.’ Things seem ‘straight’ (on the vertical axis), when they are ‘in line,’ which means when they are aligned with other lines.”340 She suggests that individuals and groups can actively resist and challenge dominant orientations and work towards creating more inclusive and equitable ways of orienting themselves and others.

Proposition XLII. Smoothing difference.

Proof.—Post-processing includes algorithms like Iterative Closest Point, Occupancy Networks, and Poisson Surface Reconstruction. Smoothing, decimation, texture mapping, and other refinement and manipulation techniques may also be applied at this stage.

Proposition XLIII. Post-processing minimizes difference.

Proof.—Iterative Closest Point—ICP—is a key algorithm in reconstructive post-processing. Point clouds might not be perfectly aligned in the same coordinate system. Before a complete and accurate model of the object or environment can be formed, these point clouds need to be aligned or registered correctly to form a coherent, unified model. Individuals merge: “The individual is defined by its ‘positioning’ within the intersubjective frame. The foundation is transposed from a time axis to a spatial one, becoming topographical, the lay of the social land: we are no longer in the once-upon-a-time, but in the always-already. For in this approach, the individual is in a sense prehatched, since the topography determining it is itself predetermined by a mapped-out logic of baseline positions and combinations or permutations of them.”341 The machine hypothesis synthesized.

The Iterative Closest Point algorithm works by iteratively minimizing the difference between two point clouds—the reference—or target—point cloud, and the source—or input—point cloud. The algorithm starts with an initial guess of the transformation between the two. It then finds the closest corresponding points and uses them to compute the transformation. This transformation is then applied to the source point cloud and the process is repeated. The algorithm converges when the difference between the two point clouds is minimized. The algorithm typically involves the following steps—Correspondence Estimation—For each point in the source point cloud, the closest point in the reference point cloud is found—Transformation Estimation—A transformation (comprising rotation and translation) that best aligns the source point cloud to the reference point cloud, based on the correspondences established in step 1, is computed. This is typically done by minimizing a certain error metric, often the mean squared error between the corresponding points—Transformation Application—The computed transformation is applied to the source point cloud—Iteration—These steps are repeated until the transformation between consecutive iterations falls below a certain threshold, or a maximum number of iterations is reached. In this way, Iterative Closest Point aligns and merges multiple point clouds into a coherent model.
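The loop condenses to a few operations with a k-d tree for correspondences and an SVD for the rigid fit; a point-to-point sketch:

    import numpy as np
    from scipy.spatial import cKDTree

    def icp(source, target, iters=50, tol=1e-6):
        """Minimal point-to-point ICP: align an (N, 3) source to an (M, 3) target."""
        src = source.astype(np.float64).copy()
        tree = cKDTree(target)
        prev_err = np.inf
        for _ in range(iters):
            dist, idx = tree.query(src)             # correspondence estimation
            matched = target[idx]
            mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
            H = (src - mu_s).T @ (matched - mu_t)   # cross-covariance
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T                          # transformation estimation
            if np.linalg.det(R) < 0:                # guard against reflections
                Vt[-1] *= -1
                R = Vt.T @ U.T
            t = mu_t - R @ mu_s
            src = src @ R.T + t                     # transformation application
            err = dist.mean()
            if abs(prev_err - err) < tol:           # iterate until convergence
                break
            prev_err = err
        return src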

Proposition XLIV. Machines of implicit representation.

Proof.—An implicit representation is a way of representing an object or surface using an equation rather than a set of discrete points or triangles. This equation defines the shape of the object or surface by specifying a value for every point in space. Points where the equation evaluates to zero are considered to be part of the object or surface, while points where the equation evaluates to a non-zero value are considered to be outside the object or surface.

Note.—Implicit representations are often used to represent smooth surfaces—like spheres and tori, which can be difficult to represent accurately using a mesh of discrete points or triangles. They are also useful for complex shapes or topologies, such as those with self-intersecting surfaces or multiple connected components: “Implicit Feature Networks (IF-Nets), which deliver continuous outputs, can handle multiple topologies, and complete shapes for missing or sparse input data retaining the nice properties of recent learned implicit functions, but critically they can also retain detail when it is present in the input data, and can reconstruct articulated humans.”342 One advantage of implicit representations is that they can be easily transformed and modified with mathematical operations. For example, an implicit representation of a sphere can be scaled or translated simply by multiplying or adding constants to the equation. This can make them more efficient to work with than other representations, such as meshes, which may require more complex algorithms to modify. However, implicit representations can be more difficult to work with than other representations in some cases, as they do not provide a direct representation of the object's surface. This makes it more difficult to perform certain tasks—like ray tracing and collision detection—which require knowledge of the surface geometry.
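The sphere example is literal: the implicit function is an equation, and transforming the shape is arithmetic on that equation.

    import numpy as np

    def sphere(p, center=np.zeros(3), radius=1.0):
        """Signed distance: zero on the surface, negative inside, positive outside."""
        return np.linalg.norm(p - center) - radius

    # Translate by adding constants; scale by multiplying them.
    translated = lambda p: sphere(p - np.array([2.0, 0.0, 0.0]))
    scaled = lambda p: sphere(p / 3.0) * 3.0

    print(sphere(np.array([1.0, 0.0, 0.0])))   # 0.0: on the surface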

Proposition XLV. Representation occupies space.

Proof.—Occupancy networks—also known as occupancy maps or voxel grids—are ways of representing three-dimensional objects and environments in computer graphics. They are composed of a grid of equally-sized cubes—voxels—that store information about the presence or absence of objects in space. Each voxel in an occupancy network is associated with a binary value that indicates whether an object occupies that voxel or not. For example, a value of 1 might indicate that an object is present in the voxel, while a value of 0 indicates that the voxel is empty. Occupancy networks are often used to represent objects or environments that are difficult to represent accurately using other methods, such as meshes or implicit surfaces. They are particularly useful for representing objects with complex shapes or topologies, or for representing large or detailed environments. One advantage of occupancy networks is that they provide a compact representation of 3D objects and environments, as they only store information about the presence or absence of objects in each voxel. This can make them more efficient to work with than other representations, such as meshes, which may require more storage space to store the same level of detail: “Occupancy networks implicitly represent the 3D surface as the continuous decision boundary of a deep neural network classifier. In contrast to existing approaches, our representation encodes a description of the 3D output at infinite resolution without excessive memory footprint.”343 Occupancy networks are used for a variety of tasks in computer graphics—including modeling, rendering, and collision detection. They are also used in robotics and computer vision applications, for navigation through physical environments.

Proposition XLVI. The language of force invades the algorithms.

Proof.—The word marching comes from the Old French word ‘marcher,’ which means ‘to walk.’ It ultimately derives from the Latin ‘marcare,’ which means ‘to step’ or ‘to tread.’ In English, the word ‘marching’ is often used to describe the act of organized walking. A military formation—feet moving in unison—the body held upright—disciplined. It also means border—frontier.344 

Proposition XLVII. Marching cubes.

Proof.—The marching cubes algorithm is a widely used technique for extracting a polygonal mesh from an implicit representation—like a scalar field. It was first introduced in a paper published in 1987 by William E. Lorensen and Harvey E. Cline: “We present a new algorithm, called marching cubes, that creates triangle models of constant density surfaces from 3D medical data. Using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes 3D medical data in scan-line order and calculates triangle vertices using linear interpolation. We find the gradient of the original data, normalize it, and use it as a basis for shading the models.”345

Note.—A scalar field is a concept from mathematics and physics that associates a scalar value—a single numerical value—with every point in space—or region of space. Marching Cubes takes these scalar values and creates a mesh of triangles that approximate the shape of an underlying geometry. The algorithm works by marching a cube through the scalar field. For each cube, it looks at the values at the 8 surrounding corner points and determines which of the 256 possible corner configurations—reducible by symmetry to 15 canonical cases—it falls into. Each case corresponds to a different way that the 8 points can lie inside or outside the surface, and the algorithm uses this information to generate the triangles that approximate the surface of the object. The marching cubes algorithm has been widely adopted due to its simplicity and efficiency, as well as its ability to handle both smooth and sharp features in the surface. It has also been extended and modified in various ways, such as the marching tetrahedra algorithm and the dual contouring algorithm. The marching cubes algorithm is commonly used in computer graphics and scientific visualization. It is used in medical imaging—MRI and CT scans—to create models of tissues and organs.
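scikit-image ships a ready-made implementation; a sketch that samples a signed-distance field and extracts its zero iso-surface:

    import numpy as np
    from skimage import measure

    # A scalar field: signed distance to a unit sphere on a 64^3 grid.
    axis = np.linspace(-1.5, 1.5, 64)
    x, y, z = np.meshgrid(axis, axis, axis, indexing='ij')
    field = np.sqrt(x**2 + y**2 + z**2) - 1.0   # zero on the sphere

    # Marching cubes: triangulate the level-0 surface of the field.
    verts, faces, normals, values = measure.marching_cubes(field, level=0.0)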

Proposition XLVIII. The body without organs—smooth space and striated space.

Proof.—In the philosophy of Gilles Deleuze and Félix Guattari, the Body without Organs is a conceptual plane or surface that exists beyond the organized and hierarchical structures of society, institutions, and the human body itself. It is not a physical entity but a virtual, abstract space of potentiality and intensity. The term ‘organs’ does not refer solely to bodily organs but refers to any organized, fixed, and stratified systems that restrict and regulate desire and creativity: “Capitalism tends toward a threshold of decoding that will destroy the socius in order to make it a body without organs and unleash the flows of desire on this body as a deterritorialized field”346 (III.xxii.).

The Body without Organs is a fuzzy concept—characterized by the absence of fixed forms, identities, and structures. It represents a state of pure becoming, free from pre-existing organization—full of flows, connections, and possibilities. It is a space of experimentation, creativity, and the emergence of new potentials.

Likewise their concept of smooth space refers to a type of space that is open, expansive, and continuous—as opposed to striated space, which is organized—hierarchical—segmented. Smooth space is characterized by a lack of boundaries or fixed points, and is characterized by fluid, open-ended movement. It is often associated with nomadic cultures and non-hierarchical social structures, as well as with the unconscious mind: “Smooth space is filled by events of haecceities, far more than by formed and perceived things. It is a space of affects, more than one of properties … it is an intensive rather than an extensive space, one of distances, not of measures and properties.”347 Deleuze and Guattari’s concept of smooth space is closely linked to their idea of desire, which they see as an open-ended, productive force that is constantly creating and destroying social and cultural structures. They argue that the desire for smooth space—and the desire to escape the constraints of striated space—is central to social and political change.

Note.—Deleuze and Guattari contrast smooth space with striated space, which is characterized by fixed points, boundaries, and hierarchical organization. Striated space is often associated with sedentary cultures and capitalist societies, as well as with the structures of language and representation: “The striated is that which intertwines fixed and variable elements, produces an order and succession of distinct forms, organizes horizontal [—] lines with vertical [ | ] planes.”348

Proposition XLIX. Collectives enmesh.

Proof.—A mesh is a collection of points—vertices—connected by edges to form a polyhedral surface. A mesh can be used to represent a 3D object or surface by dividing it into a series of small, flat polygons—typically triangles and quadrilaterals. Each vertex in a mesh is defined by its 3D coordinates (x, y, z)—the edges connecting the vertices define the topology of the surface. The polygons formed by edges are called faces.349 Basic volumes—cubes, spheres, cylinders, platonic solids—are primitives.

Note.—Primitives are unchanged: “Westerners encountered a wide variety of societies in their colonial expansion. Politically these were categorized from the most complex—the state societies in regions of Asia and North Africa—to those perceived as formed by savages and primitives, with the simplest types of political organization. Their entrenched belief in a philosophy of progression took Western scholars to assume an uneventful and unchanged past for these societies. It was commonly argued that savages did not have a history. Hence, they were considered as living fossils, as ‘survivals’ from earlier stages of culture long passed in Europe.”350 Primitives were forced to deform—everything must conform.

Proposition L. Normals approximate the surface of perfection.

Proof.—Poisson Surface Reconstruction is an algorithm used to create a surface model from point cloud data. The approach is named after Siméon Denis Poisson, a French mathematician, who introduced the Poisson equation in the field of potential theory: “Reconstructing 3D surfaces from point samples is a well studied problem in computer graphics. It allows fitting of data—filling surface holes—remeshing existing models. We provide a novel approach that expresses surface reconstruction as the solution to a Poisson equation.”351 Unlike some other methods, it doesn't rely solely on the point positions—it leverages normals.

Note.—“Our key insight is that there is an integral relationship between oriented points sampled from the surface of a model and the indicator function of the model. Specifically, the gradient of the indicator function is a vector field that is zero almost everywhere (since the indicator function is constant almost everywhere), except at points near the surface, where it is equal to the inward surface normal.” The Poisson Surface Reconstruction process involves several steps | Normal Estimation | The first step in Poisson reconstruction is to estimate the normals of the points in the point cloud. These normals represent the direction that the surface was facing at each point. If the point cloud data does not already contain normals, they will need to be estimated—Equation Formulation—Using the point positions and the surface normals, a Poisson equation is formulated. This equation describes how the surface normal directions vary over space—Equation Solution—The Poisson equation is then solved to obtain a scalar function that can define the surface of the 3D object. This is typically done through a process of relaxation or optimization—Iso-Surface Extraction—Finally, an iso-surface extraction process, such as Marching Cubes, is used to generate a mesh from the scalar function.
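Open3D wraps this full sequence, from normal estimation to the solved indicator function and extracted mesh; the filename and octree depth below are placeholders.

    import open3d as o3d

    pcd = o3d.io.read_point_cloud('scan.ply')   # hypothetical input scan

    # Normal estimation: required if the scan carries no normals.
    pcd.estimate_normals()

    # Equation formulation and solution happen inside the call; depth
    # controls the resolution of the octree the solver works over.
    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)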

Proposition LI. Smoothing is normalization.

Proof.—Smoothing a meshed surface removes noise and inconsistencies—producing a more visually pleasing representation.

Note.—There are several smoothing algorithms—Laplacian Smoothing—This is one of the most straightforward and widely-used techniques. In Laplacian smoothing the new position of a vertex in a mesh is calculated as the average of its neighbors. This process is usually iterated several times. While easy to implement, this method may lead to shrinkage of the model and doesn't preserve sharp features—Bilateral Smoothing—This is an improvement on the Laplacian smoothing approach that preserves sharp edges better. In bilateral smoothing, the weights of neighboring points are determined not just by their distance but also their similarity in terms of normal vectors or color. This means that points on the same surface but different sides of an edge will not be averaged together, preserving the sharpness of the edge—Iterative Closest Point—Although ICP is primarily a method for aligning 3D shapes, it often includes a smoothing step to help improve the quality of the alignment. During the alignment process, the point clouds are iteratively adjusted to minimize the distance between corresponding points, which has a smoothing effect on the overall shapes—Taubin Smoothing—This method is an extension of Laplacian smoothing which alternates between Laplacian smoothing—which may cause shrinkage—and its inverse—which can cause expansion. By carefully choosing the parameters for these two steps, it is possible to cancel out the shrinkage and maintain the original size of the model while still achieving the smoothing effect—Poisson Reconstruction—As previously mentioned, Poisson Surface Reconstruction uses the input point cloud data and the surface normals to generate smooth surfaces—Quadric Error Metrics Simplification—This method simplifies the mesh while minimizing deviation from the original surface. It combines vertices and adjusts their positions to minimize the overall error, leading to a smoother and less complex surface.352
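The Laplacian case, the simplest in the list above, fits in a few lines; alpha and the iteration count are arbitrary choices.

    import numpy as np

    def laplacian_smooth(vertices, faces, iters=10, alpha=0.5):
        """Move each vertex toward the average of its neighbors."""
        neighbors = [set() for _ in vertices]
        for a, b, c in faces:                   # adjacency from the triangles
            neighbors[a].update((b, c))
            neighbors[b].update((a, c))
            neighbors[c].update((a, b))
        v = vertices.astype(np.float64).copy()
        for _ in range(iters):
            avg = np.array([v[list(n)].mean(axis=0) if n else v[i]
                            for i, n in enumerate(neighbors)])
            v += alpha * (avg - v)              # step toward the neighborhood mean
        return v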

Proposition LII. Reality, decimated.

Proof.—Decimation refers to the process of reducing the complexity of a model by decreasing the number of vertices, edges, and faces while trying to preserve the overall shape and features of the model as much as possible. This process is sometimes also known as mesh simplification—or model reduction. The purpose is to make the model easier to work with and quicker to process, especially in applications like real-time rendering or analysis where speed is crucial: “... decimation filters are commonly used to restore the realistic appearance of virtual biological specimens, but they can cause a loss of topological information of unknown extent. In this study, we analyzed the effect of smoothing and decimation on a 3D mesh to highlight the consequences of an inappropriate use of these filters … Decimation always produced detrimental effects on both topology and measurements.”353

Note.—Strategies of decimation include—Vertex Clustering—This is one of the simplest methods for model decimation. The 3D space is divided into a regular grid of voxels and all the vertices within each voxel are replaced with a single vertex—often the centroid of the original vertices. While simple and fast, this method can result in a loss of detail and does not always preserve the topology of the model well—Edge Collapse—This is a more sophisticated method that iteratively removes the least important edges from the model. The importance of an edge can be calculated in various ways, such as the length of the edge, the curvature of the surface around the edge, or the angle between the faces adjacent to the edge. When an edge is removed, the two vertices at its ends are merged into a single vertex, and the faces adjacent to the edge are also removed or reconnected—Quadric Error Metrics Simplification—QEM—This is a further refinement of the edge collapse method. For each potential edge collapse, a quadric error metric is calculated, which represents the squared distance from the new vertex position to the original surface. The edge collapse that results in the smallest increase in this error metric is chosen at each step. This method can preserve the features of the model well, but is more computationally intensive—Vertex Removal—This method involves iteratively removing vertices from the model, similar to edge collapse. However, instead of merging vertices, a vertex is removed and its neighboring vertices are reconnected to form new faces:354 The word decimation descends from an ancient Roman military punishment. Plutarch describes how a general or emperor executed “the punishment known as ‘decimation’ on those who had lost their nerve. What he did was divide the whole lot of them into groups of ten, and then he killed one from each group, who was chosen by lot.”355
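In practice the quadric-error variant is a library call; with Open3D, for instance, and a placeholder filename and triangle budget:

    import open3d as o3d

    mesh = o3d.io.read_triangle_mesh('model.ply')   # hypothetical dense mesh

    # Collapse edges, cheapest quadric error first, until the budget is met.
    simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=5000)

    print(len(mesh.triangles), '->', len(simplified.triangles))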

Proposition LIII. Texture hides decimation.

Proof.—Texture mapping is the process of applying surface detail or color information to a model. This is done by assigning an image—a texture map—to the surface, enhancing the visual realism of the model. The original algorithm was proposed by Edwin Catmull in 1974: “This method subdivides a patch into successively smaller subpatches until a subpatch is as small as a raster-element, at which time it can be displayed. In general this method could be very time consuming because of the great number of subdivisions that must take place; however, there is at least one very useful class of patches—the bicubic patch—that can be subdivided very quickly. Pictures produced with the method accurately portray shading and silhouette of curved surfaces. In addition, photographs can be ‘mapped’ onto patches thus providing a means for putting texture on computer-generated pictures.”356

Note.—Texture mapping involves—UV Mapping—This is the process of creating a 2D representation—UV map—of the 3D model’s surface. Each point—vertex—on the model’s surface is assigned a corresponding point in the 2D UV map. This is usually a complex task since it involves flattening a 3D surface into 2D while minimizing distortions and overlaps. There are various algorithms and methods to achieve this, such as planar projection, cylindrical and spherical mapping, and more advanced methods—Least Squares Conformal Mapping—LSCM—and Angle Based Flattening—ABF—Texture Sampling—Once the UV map is created, the next step is to sample the texture image for each point on the surface of the 3D model. This involves assigning to each point on the model a pixel—texel—from the 2D texture map based on the point’s UV coordinates. There are various interpolation methods used in this step, such as nearest-neighbor, bilinear, and bicubic interpolation, each with their advantages and trade-offs in terms of speed and quality (bilinear sampling is sketched after this note)—Texture Filtering—This step handles issues that arise when the texture map is viewed at different scales or angles, such as aliasing—stair-step effect—and blurriness. The two main types of texture filtering techniques are Mipmapping and Anisotropic Filtering—Mipmapping—creating a series of smaller versions of the texture map and selecting the appropriate one based on the distance of the surface from the viewer—Anisotropic Filtering—on the other hand, adjusts the texture sampling based on the viewing angle to maintain detail and reduce blurriness—Shader Processing—Modern graphics processing units—GPUs—utilize programmable shaders to handle the final stage of applying the texture to the model. Fragment shaders—pixel shaders—manipulate texture data to achieve various visual effects—bump mapping | normal mapping | parallax mapping—which add the appearance of additional geometric detail to the surface.357
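
A minimal sketch of bilinear texture sampling in Python, assuming NumPy, UV coordinates in [0, 1], and a texture array of shape (H, W, 3); the names are illustrative. The returned color blends the four texels nearest the UV point, weighted by proximity.

    import numpy as np

    def sample_bilinear(texture, u, v):
        h, w = texture.shape[:2]
        x, y = u * (w - 1), v * (h - 1)          # UV to continuous texel space
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
        fx, fy = x - x0, y - y0                  # fractional position in the cell
        top = (1 - fx) * texture[y0, x0] + fx * texture[y0, x1]
        bottom = (1 - fx) * texture[y1, x0] + fx * texture[y1, x1]
        return (1 - fy) * top + fy * bottom      # blend of four nearest texels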

Corollary.—An illusion of information.

Proposition LIV. The mind endeavors to conceive only such things as assert its power of activity.

Proof.—We crave a near-identical representation of the physical world, but the pursuit often uncovers the inherent flaws of our tools. Reconstruction is a complex process—each stage, from acquiring data points to post-processing the model, fraught with potential inaccuracies: noise from source data, distortion from approximations in the algorithms, holes in meshes. Texture mapping distracts from these imperfections. It functions as a cosmetic concealer—covering the acne of inaccuracy—contouring the faces of digital meshes.

Proposition LV. The mesh cannot contain every wrinkle and every pore.

Proof.—An untextured model is bare—revealing all blemishes—all imperfections. Consider a reconstruction of a human face. The mesh cannot accurately represent every curve, wrinkle, or pore—it is a low-resolution approximation. But with a high-resolution texture map, the final model looks astonishingly lifelike—inaccuracies hidden beneath a shroud of color and photoreal detail.

Corollary.—The surface is contested territory.

Note.—Advanced texture mapping techniques—like bump mapping or normal mapping—can simulate the appearance of surface detail not present in the actual geometry. Even with a relatively low-polygon model, these techniques can produce a sense of depth and complexity that disguises underlying simplicity. However, just as with makeup, the success of this illusion depends on the skillful application of the texture. Inaccurate UV mapping or poor-quality texture images can lead to glaringly obvious seams, stretches, or other artifacts that draw attention rather than deflect it.

Corollary.—No one envies the virtue of anyone who is not his equal.

Proof.—A reconstruction is always already a distortion. Post-reconstruction modification compounds the risk. Altering a reconstruction might infringe upon intellectual property rights or distort the original intent. On a broader scale, the unregulated modification of reconstructions could contribute to the spread of misinformation or the erasure of historical or cultural truth.

Note.—There is also potential for objectification and violation of privacy. Without proper consent or oversight, the post-processing manipulation of models of people could lead to misrepresentation, dehumanization, or even the creation of deepfake scenarios: “The underlying technology can replace faces, manipulate facial expressions, synthesize faces, and synthesize speech. These tools are used most often to depict people saying or doing something they never said or did. How do deepfakes work? Deepfake videos commonly swap faces or manipulate facial expressions”358 (IV.).

Proposition LVI. Reconstructions are easily manipulated.

Proof.—Advanced algorithms, once the sole domain of experts, are becoming increasingly accessible, making it easier for those with malicious intent to manipulate visual and spatial data to create hyper-realistic—but deceptive—representations: “Voices and likenesses developed using deepfake technology can be used in movies to achieve a creative effect or maintain a cohesive story when the entertainers themselves are not available … replace characters who had died or to show characters as they appeared in their youth.”359 These synthetic images are often indistinguishable from reality. The propagation of deepfake media content outside of entertainment stirs public fear and confusion—the impact of these deceptions extends far beyond misinformation—to the geopolitical threat of mass-scale deception. Disinformation campaigns—enabled by these technologies—fuel internal social and political unrest, destabilize nations, and create international conflict.

Note.—How can the global community discern truth from falsehood in an age where seeing is no longer believing? Though maybe it never was … The solutions are complex and multi-faceted, involving a combination of policy, education, and perhaps new technological tools to detect and combat visual deception. The ethical dimension of this issue cannot be overstated. At stake is the ability to manipulate reality and distribute it on a massive scale. The line between enhancement and deception can often be blurred. Who gets to decide what is true? How are these decisions made and enforced? These technologies do not operate in a vacuum. They are framed within the broader context of a society where digital literacy is lagging behind technological advances.

Proposition LVII. Data is on display.

Proof.—Finally, the reconstruction is visualized.

Note.—In Reconstruction, visualization refers to the process of generating a perceptible depiction of the model. The translation of computed data into a format that can be easily interpreted and understood by humans. Visualization algorithms simulate surface texture, lighting, shading, and color, to produce a new image—or sequence of images. While there are many rendering algorithms, the following are some of the most widely used approaches—Rasterization—is faster but less realistic—suitable for real-time rendering—video games—simulations. It converts the model into a raster image—a grid of pixels—Ray Tracing—simulates the physics of light to achieve realism—It works by tracing the path of light and simulating the effects of its encounters with virtual objects—It is known for its ability to produce high-quality effects like reflections—refractions—and shadows (its elementary operation is sketched after this note)—Path Tracing—is a type of Ray Tracing that simulates light by tracing the many potential paths that light could take from the light source to the camera lens—By averaging the results of many different paths—it produces accurate global illumination—soft shadows—depth of field—motion blur—indirect lighting360—Radiosity—is a method primarily used in scenes with diffusely reflecting surfaces—It calculates the way that light transfers from one surface to another—This global illumination method accounts for indirect illumination—where light reflects off multiple surfaces before reaching the viewer—Photon Mapping—is a two-pass global illumination algorithm that accurately represents the interaction of light with different surfaces—In the first pass—it traces photons from the light source into the scene—storing them in the photon map—In the second pass—it uses traditional ray tracing from the camera while also using the photon map to estimate the incoming radiance—A strategy of diffuse interreflection: “In the first pass two photon maps are created by emitting packets of energy—photons—from the light sources and storing these as they hit surfaces within the scene. We use one high resolution caustics photon map to render caustics”361—In photon mapping—caustics refer to the concentrated patterns of light that appear on surfaces due to the reflection off of or refraction of light through curved or shiny—translucent and specular—surfaces—Caustics result from the focusing and concentration of light rays as they interact with such surfaces—projecting bright and distinct patterns that are often seen in the real world—such as light patterns formed at the bottom of a swimming pool or the shimmering light under a glass of water.
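
A minimal sketch of ray tracing’s elementary operation in Python, assuming NumPy: intersect a ray with a sphere, then shade the hit point by Lambert’s cosine law. The scene, names, and parameters are illustrative, not any particular renderer’s API.

    import numpy as np

    def ray_sphere(origin, direction, center, radius):
        # Solve |o + t*d - c|^2 = r^2 for the nearest positive t,
        # assuming direction is unit length (so the quadratic's a = 1).
        oc = origin - center
        b = 2.0 * np.dot(direction, oc)
        c = np.dot(oc, oc) - radius ** 2
        disc = b * b - 4.0 * c
        if disc < 0:
            return None                          # the ray misses the sphere
        t = (-b - np.sqrt(disc)) / 2.0
        return t if t > 0 else None

    def lambert(point, center, light_pos):
        # Diffuse intensity: cosine of the angle between surface normal
        # and the direction to the light, clamped at zero.
        normal = (point - center) / np.linalg.norm(point - center)
        to_light = (light_pos - point) / np.linalg.norm(light_pos - point)
        return max(float(np.dot(normal, to_light)), 0.0)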

Proposition LVIII. Continuous time, segmented—becomes frequency.

Proof.—The Fourier Transform, named after the French mathematician and physicist Jean-Baptiste Joseph Fourier, is a mathematical technique that transforms a function of time—a signal—into a function of frequency. Fourier, known for initiating the investigation of Fourier series, made significant contributions to the field of heat transfer, which led him to develop the Fourier Transform. Essentially, the Fourier Transform decomposes a signal into the frequencies that make it up, similar to how a musical chord can be expressed as the frequencies of its constituent notes. This mathematical tool is fundamental in a wide range of fields, including engineering, physics, and data analysis, as it provides a way to analyze and manipulate data by shifting from the time or space domain to the frequency domain. In doing so, it uncovers the frequency spectrum of a signal—revealing the signal’s individual sinusoidal components of different frequencies. Fourier Transforms are fundamental in Fourier Volume Rendering—which produces images from volumetric data. This method is often used in medical imaging, where it allows clinicians to view a 2D projection of a 3D scan.
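
A minimal numerical sketch of that decomposition in Python, assuming NumPy: a signal built from two sine waves is pulled apart into its constituent frequencies by the fast Fourier transform.

    import numpy as np

    fs = 1000                                    # sampling rate, Hz
    t = np.arange(0, 1, 1 / fs)                  # one second of samples
    signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

    spectrum = np.fft.rfft(signal)               # time domain to frequency domain
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
    print(sorted(peaks))                         # -> [50.0, 120.0]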

Proposition LIX. The body is projected into frequency space.

Proof.—Fourier Transforms play an essential role in CT scan reconstructions. During a scan, the machine takes a series of two-dimensional x-ray images of a body. These images are known as projections. A stack of slices: “A CT machine can produce 64, 128, 256 et cetera slices and the number of slices needed for a study will completely depend on the physician. Using a 64 slice CT machine, producing a 512x512 image or slice with each slice on average having 700 views, then based on the explanation above, the total number of FFTs would be approximately 89000 (64*1400) … A larger study like CT angiography and cardiac CT14 will have a much higher number of views even for the same CT slices machine.”362 Via the Fourier Slice Theorem—or Central Slice Theorem—these projections are integrated to reconstruct bodies and organs.
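
A minimal numerical sketch of the Fourier Slice Theorem in Python, assuming NumPy and a random image standing in for a body slice: the 1D FFT of a parallel projection equals one central line of the image’s 2D FFT, which is why projections taken at many angles can fill in the full frequency space and be inverted into an image.

    import numpy as np

    image = np.random.rand(128, 128)             # stand-in for a body slice
    projection = image.sum(axis=0)               # parallel-beam projection at 0 degrees

    slice_1d = np.fft.fft(projection)            # 1D spectrum of the projection
    central_row = np.fft.fft2(image)[0, :]       # zero-frequency row of the 2D spectrum

    print(np.allclose(slice_1d, central_row))    # -> True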

Note.—Fourier Transforms are also used in digital holography. In this context, the Fourier Transform is used to shift from the spatial to the frequency domain, allowing the recording of object and reference beams as an interference pattern. The captured pattern is a hologram. Holography captures light scattered from objects and restructures it to create three-dimensional images. Unlike traditional photography—which records intensity and color—holography encapsulates the phase of light—retaining depth information:

Holography was proposed as a lensless imaging technique by Dennis Gabor in 1947 in an attempt to solve the problem of limited resolution in electron microscopes due to lens aberrations. Holography is an elegant solution to the so-called phase problem, which exists in coherent imaging and can be described as follows. A probing wave propagates through a sample and reaches a distant detector. Since detectors can only record intensity, information about the phase distribution of the wave is lost. However, this missing phase distribution is crucial because it contains information about the scattering events that have taken place inside the sample. Therefore, in order to reconstruct the sample distribution, the phases missing in the detector plane must be recovered.363
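
A minimal numerical sketch of that recovery in Python, in the off-axis style, assuming NumPy and simple one-dimensional fields; the object, reference, and filter parameters are all illustrative. The detector records only intensity, yet the cross term between object and reference beams carries the object’s phase, and a Fourier-domain filter can isolate it.

    import numpy as np

    n = 4096
    x = np.arange(n)
    # Toy complex object field: Gaussian amplitude, slowly varying phase.
    obj = np.exp(-((x - n / 2) / 200.0) ** 2) * np.exp(1j * 0.002 * (x - n / 2))
    ref = np.exp(1j * 2 * np.pi * 0.1 * x)       # tilted plane reference wave

    intensity = np.abs(obj + ref) ** 2           # all the detector can record

    # The obj * conj(ref) cross term sits at carrier frequency -0.1;
    # select that sideband, then remove the carrier to recover the object.
    spec = np.fft.fft(intensity)
    freqs = np.fft.fftfreq(n)
    sideband = np.where(np.abs(freqs + 0.1) < 0.05, spec, 0)
    recovered = np.fft.ifft(sideband) * np.exp(1j * 2 * np.pi * 0.1 * x)
    # recovered now approximates obj, phase included, within the filter's band.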

Advancements in technology, especially in computational power, resolution of recording devices, and miniaturization of hardware, are propelling the development and application of holography. Holography will seamlessly merge with augmented and virtual reality, offering immersive experiences that are indistinguishable from reality. The future of experience. Eyes affixed—under the spell of Reconstruction—algorithms reflecting the logic of striated space—the logic of disciplinary society.

INSTANCES OF RECONSTRUCTIONS

I. Reconstructions acquit or convict.

Explanation.—Forensic reconstruction is a powerful tool for investigating and analyzing crime scenes—for mapping the events that took place and identifying potential suspects. There are several steps involved in the trusted methodology for forensic 3D reconstruction of crime scenes—Data Collection—The first step in forensic 3D reconstruction is to collect data from the crime scene. This can include photographs, video footage, measurements, and other types of data that can be used to create a detailed 3D model of the scene—Data Processing—Once the data has been collected, it must be processed and analyzed in order to create a 3D model of the crime scene. This may involve using specialized software to stitch together images and other data, as well as to adjust for any distortion or other issues that may affect the accuracy of the model—Model Creation—Once the data has been processed, it can be used to create a 3D model of the crime scene. This model can be used to visualize the scene, including the locations of objects, the movements of individuals, and any other relevant details—Analysis—Once the 3D model has been created, it can be used to analyze and interpret the events that took place at the crime scene. This may involve identifying potential suspects, reconstructing the events that took place, and determining the sequence of events—Findings—The final step in the forensic reconstruction process is to present the findings to relevant parties—law enforcement agencies or courts of law.

Reconstructions have been used as evidence in a number of court cases—either to reconstruct crime scenes or to demonstrate the events that led up to a crime. In 2008, State of Florida v. Casey Anthony was one of the first US cases in which reconstruction was used to recreate several scenes—including the trunk of a car in which the body of a young girl was found. The reconstruction was used to help establish the cause of death and to support the prosecution’s case that the girl’s mother, Casey Anthony, had killed her:

An Interactive Tour of the crime scene off Suburban Drive, the autopsy area and digital DNA lab, and the Anthony home where most believe is where 34 month old Caylee Marie Anthony sadly met her end … Angela Talamasca and her team have recreated an interactive simulation that undoubtedly will be used by both the State’s Attorneys and Jose Baez’s Defense Team. It is designed to provide a virtual experience allowing the final ‘triers of fact’ to literally transport themselves within the evidence and theories they will be presented to consider when determining the fate of the accused, Casey Anthony. This proof of concept ‘build’ includes contribution and consultation from leaders in the fields of: Forensics, CSI, Crime Scene Reconstruction and Medical Examiners Investigations.364

The use of reconstruction in the Casey Anthony case was controversial—some experts questioned the accuracy and reliability of the reconstruction. However, it was ultimately admitted as evidence in the case and played a role in the jury’s decision to find Casey Anthony not guilty of first-degree murder—but guilty of four counts of providing false information to law enforcement. 

Casey Anthony—Where the Truth Lies is a documentary about the case and the defendant’s pathological lying: “Casey Anthony is a proven liar. Her narrative of her own story is untrustworthy. She was found guilty at trial of providing false information to law enforcement. She had a long pattern of lying, beginning with years of constructing elaborate lies about her progress through high school, and later about her nonexistent job and even her pregnancy with Caylee—a backstory she shares with multiple convicted killers who all eventually murdered members of their family. Rather unusually, however, Casey’s parents, according to her brother Lee’s testimony at her trial, had a history of enabling and playing along with their daughter’s lies rather than holding her to account for them.”365

“The Anthony-focused docuseries repeatedly states she lied as a coping mechanism in order to deal with years of alleged sexual abuse from her father. (Anthony's father previously denied the allegations and did not respond for comment for the series). ‘I lied to everyone because that was my whole life up to that point,’ Anthony says through tears. ‘Acting like everything was OK but knowing nothing was OK… All of this is a reaction to trauma.’”366 What is truth within a culture that silences women? Disbelieves them? Shames them? Or is this another lie?

Uncertainty is a nightmare. Reconstructions can be nightmares too. The specter and spectacle of the tragedy—commodified in the attention economy. Masks of a moment passed—haunting the present. The case was sensationalized and “the high demand for Anthony related goods led to bidding wars. A Casey Anthony mask sold for over $20,000 to a desperate buyer in need of a Halloween costume. The seller, under the screenname ‘Prophunter’ said it was, ‘One of the best Halloween masks I've ever seen.’”367 Tragedy porn. Hustler—considered the more hardcore of the dominant pornography magazines—offered Anthony a deal: "We made an offer of a half-a-million dollars, but ... she would receive 10 percent of [any additional profits], and the reason why I did that is I'm still ambivalent as to how well this will do … but in case it goes viral and there's this huge interest and everybody wants to see the photographs, well, you know, millions could be made, so we don't know … but, I think it was generous of us to put a percentage of the profits in for her because it could amount to a great deal more money … People have been coming to me in droves, you know, wanting this … I've never seen that happen before.”368 The offer was widely publicized—no response. The imagination runs wild.

In 2014, another child—Tamir Rice—was tragically killed. This time by a Cleveland police officer. The crime scene was reconstructed by law enforcement: “This virtual reality reenactment of the Tamir Rice incident shows the perspective of the officers as they drove toward the area where Rice was shot … But what about Tamir Rice’s perspective? During the case debriefing, Meyer highlighted a video of the officer’s view, but it isn’t stated if a reconstruction of Rice’s viewpoint was ever requested.”369 Reconstructions are powerful … “Reconstructions are certainly more powerful. It’s much less likely that a jury will dispute a version of events with a 3D reconstruction versus a version of events backed by 2D photographs. Instead of taking a jury back out [to a scene] several months, several years later, you can take them into a scene as it was the day that it was scanned. You have a more realistic, cleansed view of the scene …”370 These technologies hold power—privilege a point of view—but are they distributed equally? Or do they reinscribe the same Reconstruction-era logic of oppression and elimination?

The San Bernardino Mass Shooting refers to a tragic incident that occurred on December 2, 2015, in San Bernardino, California, United States. On that day, two individuals, Syed Farook and his wife Tashfeen Malik, carried out a mass shooting at the Inland Regional Center, a facility that provided services to people with developmental disabilities. During the attack, Farook and Malik opened fire on a holiday gathering being held at the center, resulting in the deaths of 14 people and injuring 22 others. After the shooting, Farook and Malik fled the scene but were later killed in a shootout with law enforcement officers.371 The San Bernardino mass shooting was one of the deadliest acts of gun violence in the United States at the time, and it had a significant impact on the local community and the nation as a whole. The incident sparked debates and discussions about gun control, terrorism, and the measures needed to prevent such tragedies in the future.

When the forensic team arrived at the scene of the final shootout, the “unit saw hundreds of pieces of evidence, and thanks to FARO’s laser scanners, they could capture and document such evidence in a single scan in just 15 minutes. ‘It is the most complete documentation tool, aside from digging up the house and bringing the entire house with me,’ said Russ, who is a crime scene specialist with the San Bernardino County sheriff’s department … ‘An average scan collects 44 million data points … Some scenes require 50 to 60 scans—that’s billions of data points.’ When Russ heads back to his crime lab, a computer program stitches the images together, creating a 3D memory of the crime scene.”372 The bullet-riddled car—captured and reconstructed. It is becoming common practice to make reconstructions of high-profile crime scenes—mass shootings—massacres.373

Reconstruction has also been proposed as a preventative countermeasure: “The technology uses radar energy to detect weapons and explosives through clothing, backpacks and hand baggage in real time. The 3D shapes created by the technology are compared to an extensive library of images of weapons. MIT has licensed the technology exclusively to Liberty Defense to bring it to market. ‘What we’re offering is an attack prevention system,’ said Aman Bhardwaj, president and COO. ‘We’re preventing someone with a weapon from entering.’”374 Scan everyone at every entrance.375

Victims’ bodies are reconstructed too: “It’s called virtopsy or virtual autopsy. The first thing that we do is a laser scan of the body to capture bite marks, bruises and other things that we might lose when we open the body … little ridges, bumps and holes on a surface are such important information to scientists. If we had to take a photograph then we wouldn’t be able to capture this texture.”376 Forensic criminologists imagine a fully reconstructed future: “That’s what I see coming. We’re going to be putting these goggles on juries and say look around and tell me what you see.”377

II. Reconstructions are synthetic evidence.

III. What imbues this methodology with trust?

Explanation—Models of bodies and spaces are extracted—presented out of context. The process involves extensive reduction and normalization. Moreover, reconstructions can be modified to fabricate or alter evidence in order to support a particular narrative.

IV. Does complex engineering bring a representation closer to perfection?

Explanation—How are Reconstruction technologies used to construct, represent, and manipulate identities? Representation Theory—a branch of mathematics—delves into abstract algebraic structures by representing their elements as linear transformations of vector spaces.378 This method translates complex mathematical objects like groups and rings into more understandable and workable forms—like matrices or transformations—which are core to Reconstruction algorithms.

Its twin—Representation Theory—exists in sociology and focuses on how individuals and groups construct and present their identities.379 How do individuals or groups perceive and articulate self-concepts? How are these identities constructed, presented, and negotiated in social and cultural contexts? This evaluation encompasses personal identities—gender, race, and religion—and collective identities—national, ethnic, social class: “a system of values, ideas and practices with a twofold function; first, to establish an order which will enable individuals to orient themselves in their material and social world and to master it; and secondly to enable communication to take place among the members of a community by providing them with a code for social exchange and a code for naming and classifying.”380 In virtual environments, we can assume avatars that may differ from our physical selves, opening up exciting avenues for identity play and self-expression. While Reconstruction offers new avenues for self-representation, it may also reinforce harmful stereotypes, particularly when the technology is used without a nuanced understanding of the complexities of identity. Facial recognition technologies, which rely heavily on capture and reconstruction, have been criticized for their racial and gender biases: “The first woman known to be wrongfully accused as a result of facial recognition technology”381 was eight months pregnant and had to be hospitalized for dehydration after detainment. Error is dangerous. The fidelity of reconstructions could more generally impact how identities are perceived and interpreted. While high-fidelity models might seem more real or authentic, they can also be manipulated or falsified, potentially leading to issues of deception or misrepresentation.

V. Bodies are engineered.

Explanation—“A cyborg is a cybernetic organism, a hybrid of machine and organism, a creature of social reality as well as a creature of fiction.”382 According to Donna Haraway, cyborgs disrupt conventional binaries and categories; they are not confined to nature or culture, organic or synthetic, but inhabit the blurry interstices between these oppositions. Cyborg bodies are engineered constructs that defy rigid categorization, embodying a mix of biological, technological, and cultural elements. The cyborg is a symbol of hybridity, promoting a fluid, transgressive model of identity that moves beyond fixed identities and embraces multiplicity. Bodies are biological—mutable, adaptable, and socially constructed. Bodies are engineered in myriad ways —through physical modifications, medical interventions, wearable and implantable technologies, and even through the roles and expectations imposed by society. Our bodies, and the ways we perceive and experience them, are continually re-engineered, re-shaped, and re-defined through our interactions with technology and culture.

VI. Marketplaces of bodies.

Explanation—The capacity to digitally reconstruct bodies holds immense potential. For instance, in medical fields, Reconstructions allow professionals to simulate complex surgeries, thereby enhancing training and procedural understanding. However, alongside these benefits are ethical concerns surrounding the potential objectification and commodification of the human form. “The contemporary regime of volumetrics, meaning the enviro-socio-technical politics and narratives that emerge with and around the measurement and generation of 3D presences, is a regime full of bugs, crawling with enviro-socio-technical flaws.”383

VII. Corporeal reconstruction can transform a person’s surface into an asset, available for use in a variety of contexts. This new marketplace of bodies has led to a surge of questions concerning privacy, consent, and rights of individuals whose bodies are being recreated. Body scans used in simulations—especially for medical or military training—present more subtle ethical dilemmas. While these simulations provide valuable learning experiences, they can inadvertently foster a desensitization towards human suffering and human agency. Particularly in the gaming, entertainment, and pornography industries, there is a risk of exploitation and misuse.

Explanation—Reconstructions of the body—body scans—are assets. They are bought and sold—recontextualized and altered: “A new adult virtual reality company called Holodexxx VR aims to change virtual reality adult entertainment in a big way. Instead of creating experiences that rely on 360 videos or cartoony 3d computer generated models, Holodexxx VR is creating ultra-realistic 3d modeling using 3d scans of actual adult actresses. Powered by the amazing Unreal Engine 4, this shit looks real and fuckworthy. Holodexxx VR aims to make 3d recreations of adult actresses as realistic and interactive as possible.”384 HolodeXXX allows users to customize the proportions of bodies—by customization, I mean adjusting a boob slider and a butt slider.385 If pornographic actors are compensated fairly and have informed consent, why not?

VIII. When a person’s body is converted into a digital asset, they risk losing control over their own likeness.386 This commodification transforms the human body into a tradable unit. Asset stores are digital replicas of slave markets—bodies for sale.387 These bodies may be placed in violent or sexual scenarios that the actor may not have anticipated. In pornography, the issue is acute. Deepfake technologies have been utilized to place individuals, often women, into explicit scenarios without their consent: “After years of activists fighting to protect victims of image-based sexual violence, deepfakes are finally forcing lawmakers to pay attention.”388 This is a violation of rights and constitutes a form of digital sexual harassment and assault. One victim of deepfake pornography expressed, “it really makes you feel powerless, like you’re being put in your place … Punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you.’”389

IX. Commodified bodies are subject to atrocity.

X. Reconstruction reenacts trauma.

Explanation—Eighty years after the forced expulsion of Japanese Americans from Little Tokyo, Los Angeles, the exhibition—BeHere / 1942: A New Lens on the Japanese American Incarceration—provides an immersive view into this dark historical period: “On Saturday May 9, 1942, the lives of Japanese Americans in Little Tokyo, Los Angeles, were forever changed. They were given until noon to dispose of their homes and possessions; then they were made to leave. In the euphemistic language of U.S. government policy, Japanese Americans all along the West Coast—some 120,000 individuals, 37,000 of whom resided in Los Angeles—were ‘evacuated’ to ‘relocation centers.’ In reality, they were put on buses and trains and shipped off to concentration camps where they would live for years, in some cases until after the end of the war.”390 Utilizing lesser-known photographs by Dorothea Lange and Russell Lee, the exhibit employs augmented reality—AR—technology to immerse visitors in historical reconstructions. The exhibition, located in the Japanese American National Museum—JANM—also features a public AR installation in the plaza adjoining the Nishi Hongwanji Buddhist Temple, where visitors can walk among virtual recreations of Japanese Americans about to be sent to the camps. The project is created by Japanese media artist Masaki Fujihata, and is co-presented by JANM and the Yanai Initiative for Globalizing Japanese Humanities at UCLA and Waseda University, Tokyo. What are the consequences of reconstructing atrocities? What is the role of communities that have experienced oppression?391

XI. The process of reconstructing a traumatic historical event involves technology, but also engagement with real people affected by the historical incident. How to balance the objectives of truth-telling—raising awareness—memorializing—with the risk of retraumatizing survivors and their descendants? The act of reconstruction may inadvertently commodify or trivialize suffering in a world where images are rapidly produced and consumed.

Explanation—The nature of augmented reality—which superimposes digital information onto the physical world—risks blurring the line between reality and reconstruction, potentially diluting the gravity of the atrocities. It is ethically critical to ensure that such projects do not inadvertently sanitize or diminish the harsh realities of the events being portrayed, particularly when dealing with a historical atrocity of such magnitude. A pivotal aspect of the BeHere/1942 project is the augmented reality app, which enables users to place reconstructed scenes virtually anywhere. While this interactive element can engage audiences and potentially increase the project’s reach, it also poses significant ethical implications. Allowing the freedom to situate such poignant historical moments in any setting can lead to misuse or trivialization of the events. For example, users might place the reconstruction in inappropriate or disrespectful contexts, inadvertently undermining the gravity of the atrocities.392 The risk is that the reality of the traumatic historical event could be diminished, turning it into a mere prop in a personalized digital playground.

XII. Hope is an inconstant pleasure, arising from the idea of something past or future, whereof we to a certain extent doubt the issue. Hope for remembrance. Hope for future protection.

XIII. What are the ambitions of the reconstruction? Who are its stakeholders?

Explanation—Involving survivors and descendants of survivors in the BeHere/1942 project adds another dimension to the ethical landscape. It raises questions about who gets to tell these stories, how they should be told, and what they are trying to say. On one hand, the involvement of the Japanese-American community in the project gives a voice to the survivors and their descendants, allowing them to reclaim and represent their historical narrative. This participation can also offer a degree of catharsis and empowerment. On the other hand, it’s crucial to consider the potential emotional distress for those reenacting traumatic events of their ancestors. Asking community members to reperform the process of internment might risk inflicting psychological harm, even if their participation is voluntary and well-intentioned. There is something sinister about the fact that they must enter cages for their performance to be captured and reconstructed. The ethics of representation must be carefully navigated to ensure respect for the dignity and well-being of all involved, striking a balance between historical accuracy and empathetic engagement. Moreover, the issue of consent and compensation for participants, particularly for those from marginalized communities, needs careful attention. It is essential to ensure that those involved are not merely viewed as assets but as partners in the narrative process—that they are consulted, adequately compensated, and acknowledged for their contributions: “What, then is the appropriate production methodology for creating monuments (meta-monument) in cyberspace?”393

XIV. Confidence is pleasure arising from the idea of something past or future, wherefrom all cause of doubt has been removed.

XV. Reconstructions are monuments to a moment in time.

Explanation—Monuments are structures erected to commemorate historical events, significant individuals, or prominent ideas. They have always held an integral function in societies. Their public and permanent nature carries significant historical—political—cultural—implications. Monuments are repositories of memory. They help societies recall their past, acknowledge the struggles they have overcome, and pay tribute to individuals who have played consequential roles in their histories. Monuments play an important role in sculpting collective identities. They construct and reinforce national or community narratives, fostering a sense of belonging and shared history among individuals. Take, for instance, the Statue of Liberty, an emblem of American ideals of freedom and democracy.

Monuments possess spatial-narrative power: “Monuments occupy a special place in the urban environment, which, on the one hand, can be considered as a mechanism for the translation of social memory, and on the other hand, they are a spatial reference and a marker of urban space for both of indigenous population and guests of the megalopolis.”394 Location holds substantial meaning. They are installed in symbolic locations like city centers or historical sites. They integrate into daily life and consciousness—conveying power—centrality—importance. The design and symbolism of monuments tell a story—convey ideology—or represent a historical narrative. They silently communicate these narratives to the public, shaping collective memory and perception of history. Power of presence. Through their permanence and scale they create a sense of awe and respect for their associated meanings. They command attention, ensuring that what they represent is not forgotten.

While monuments hold cultural significance and spatial-narrative power, they can also become sites of contestation, particularly when the narratives they represent are challenged, or when they symbolize oppressive histories. As society evolves, so does the perception of these monuments. Stolen or plundered monuments are trophies of conquest—tangible representations of dominance—the desire to obliterate other historical and cultural memories. In stripping away these symbolic structures, thieves and looters reconfigure the cultural landscape, disrupting the continuity of cultural memory and identity: “The British Museum has been accused of exhibiting ‘pilfered cultural property,’ by a leading human rights lawyer who is calling for European and US institutions to return treasures taken from ‘subjugated peoples’ by ‘conquerors or colonial masters.’”395

XVI. Joy is pleasure accompanied by the idea of something past ...

XVII. Disappointment is pain accompanied by the idea of something past ...

XVIII. Reconstructions are not confined by time or space.

Explanation—Reconstruction has ushered in an era of digital monuments. Archeologists embrace the technology.396 They create highly accurate three-dimensional digital replicas of historical monuments—instead of stealing them. The production of these digital replicas is an exercise in technological prowess—but it also offers new possibilities for cultural preservation, education, and accessibility. Reconstruction ensures that even if a monument is physically lost due to conflict, natural disasters, or the ravages of time, its digital counterpart ensures that its memory and significance will endure. These reconstructions are not confined by physical or temporal boundaries. Digital monuments circulate globally. They transcend geographical boundaries and provide access to individuals who may never have the opportunity to visit the source location. This accessibility fosters broader cultural understanding. While reconstructions offer extraordinary educational and preservation benefits, they should not replace or devalue original artifacts. However, colonialism persists in virtual space397—the replication and dissemination of cultural artifacts without the informed consent of the cultures involved.

XIX. Approval is love towards one who has done good to another.

XX. Indignation is hatred towards one who has done evil to another.

Explanation—The Robot Guerrilla Campaign to Recreate the Elgin Marbles.398

The Elgin Marbles—also known as the Parthenon Marbles—have long been a source of controversy. The Parthenon—meaning maiden—sits atop the Acropolis in Athens, Greece. The highest place. A holy site. Ancient temples and smiling statuary. In 480 B.C., the Sacking of the Acropolis was the climax of a decades-long conflict between Greek city-states and the Persian Empire: “All this the Persians burned. Blood was shed, too. The invaders killed citizens and priests who had taken refuge in the holy places—a slaughter that, for the Greeks, represented an inconceivable violation of sacred law. Later, after the Persians were defeated and the rest of the Athenians returned to their ruined city, the smiling statues were carefully gathered and buried, as if they were people. You can still see the charring on some of them. The attack and destruction scarred the Athenian consciousness in a way that is difficult for us, traumatized though we still are by September 11th, to imagine. A generation passed before the Athenians could bring themselves to rebuild.”399

Construction commenced at the height of the Delian League's influence—447 B.C.: “Shortly before rebuilding on the Acropolis began, Pericles seized the treasury and moved it to Athens, ostensibly for safekeeping. At the time, it was valued at eight thousand talents—roughly $4.8 billion in today’s money, by one estimate. Another six hundred talents, or about three hundred and sixty million dollars, rolled in annually as tribute.”400 By 438 B.C., the structure was fully realized, with ongoing decorative work until 432 B.C. During this period, it functioned as the treasury of the Delian League, which eventually evolved into the Athenian Empire: “It was the first temple in mainland Greece to be built entirely of marble—twenty-two thousand tons of it, quarried about ten miles away and hauled up the Acropolis by sledges, carts, and pulleys. It was also the largest. Most temples in the rather plain architectural style known as Doric have six columns across the front and thirteen down the sides; the Parthenon has eight columns in front and seventeen down the sides. The expanded scale made possible an unprecedented amount of sculptural decoration.”401 The Parthenon frieze is a sculptural band that adorned the exterior of the Parthenon. A mythic perimeter. For centuries the scenes were interpreted as “the great Panathenaic procession held every four years in honor of Athena.”402 In The Parthenon Enigma, Joan Breton Connelly proposes a new interpretation of the Parthenon frieze. It is not a depiction of a celebratory procession. It proceeds toward something else. Something sinister. The founding myth of a site of power: “In Euripides’ telling, King Erechtheus faces war with a rival king, who is also the son of the god Poseidon, and is advised by the Delphic oracle that to save his young city—Athens—he must sacrifice his daughter ... The serene figures depicted on the frieze were participants not in a civic festival but in a sacrifice—a human sacrifice—of the king’s youngest, maiden daughter, the crop-haired child.”403 The ultimate sacrifice.

Next to the main structure of the Parthenon stands the Erechtheion—an exquisite temple that honors King Erechtheus. This structure is famous for its porch—The Caryatid Porch—which consists of six female statues—serving as structural columns—supporting the roof of the temple. Each Caryatid stands approximately seven feet tall and is intricately carved from Pentelic marble, renowned for its quality and purity. The Caryatids are female Atlases. They are priestesses—flowing drapery—one foot slightly advanced—as if in graceful motion. The statues are remarkable examples of the hyperreality of ancient Greek sculpture—capturing life-like geometry and movement in stone—Ur-realism. Each Caryatid supports the entablature of the temple on her head, with her arms gracefully extended to hold the weight. The ingenuity and engineering prowess of ancient Greece. Over the millennia the Parthenon underwent rounds of destruction, reconstruction, and functional transformation—serving as a temple, a bank, a church, a mosque, an arsenal, and a museum. Shifting to support the ideology of whoever was in power. It is one of the most reconstructed geometries—in situ and in circulation: “Le Corbusier called it ‘the basis for all measurement in art’—reproduced in every medium and on every scale imaginable, from stone to paper, in tombs, stock exchanges, and courthouses, from a full-size replica in Nashville to the blue-and-white image on millions of takeout coffee cups.”404

During the Siege of the Acropolis, in 1687, an explosion severely damaged the Parthenon, destroying forty percent of its original sculptures. Another assault. Shortly after—in the early 19th century—British diplomat Lord Elgin removed about half of the remaining sculptures from the Parthenon, under a vaguely worded Ottoman license. This collection included life-sized figures, metopes, and a large portion of the sculpted frieze—intended to adorn his Scottish country house. The nose of a stolen caryatid was immediately broken off; some pieces were lost in a shipwreck and took over two years to recover. Lord Elgin sold the marbles to the British Parliament in 1816 after enduring personal losses including his fortune, his wife, and his own nose—“a degenerative infection concentrated in his nose, that prompted Lady Elgin to refuse her husband conjugal privileges and ultimately leave him.”405 He sold the collection for 35,000 pounds. The equivalent today would be around £3.6 million or $4.35 million, which is about half of what he had spent to acquire and transport them. The artifacts were then placed under the trusteeship of the British Museum.406 In Athens, the pedestal of the caryatid removed to London remains empty—awaiting its return.

The campaign to return the Elgin Marbles to Greece started almost immediately after their removal. One of the first critics was poet Lord Byron in 1811: “Using the Elgin Marbles as a political symbol of British imperial rapacity, Byron sought to expose Elgin’s professed curatorial concern for the statues as a fraud. For Byron, the idea of an enlightened mission to ‘save’ the marbles simply rationalized the malevolent exercise of British political power. It was not the ravages of time or war that was antiquity’s nemesis in this case, but imperial greed.”407 A contemporary critic, former Massachusetts prosecutor Mr. Michel, equates the British Museum’s retention of the marbles to clinging onto relics of colonial grandeur, and criticizes their inability to educate about Ancient Greek art while acknowledging the artifacts’ emotional significance to Greeks. The trend of returning cultural artifacts to their countries of origin has grown recently, as demonstrated by an Italian museum returning a Parthenon fragment to Athens.408 However, the British Museum and government have largely avoided discussions about returning the Elgin Marbles, with supporters arguing that such restitution could create a problematic precedent and threaten museum collections worldwide. “The Trustees firmly believe that there's a positive advantage and public benefit in having the sculptures divided between two great museums, each telling a complementary but different story.”409

Greek campaigners have argued for their return, maintaining that they were taken without proper consent during the time of the Ottoman Empire’s occupation. The British Museum, supported by successive British governments, rejects these claims, arguing that the marbles were acquired legally: “Lord Elgin's activities were thoroughly investigated by a Parliamentary Select Committee in 1816 and found to be entirely legal. Following a vote of Parliament, the British Museum was allocated funds to acquire the collection.” The Museum has expressed willingness to explore potential loans of the objects, provided the borrower acknowledges the lender’s ownership and agrees to return them: “The Trustees have never been asked for a loan of the Parthenon sculptures by Greece, only for the permanent removal of all of the sculptures in its care to Athens. The Trustees will consider (subject to the usual considerations of condition and fitness to travel) any request for any part of the collection to be borrowed and then returned. The simple precondition required by the Trustees before they will consider whether or not to lend an object is that the borrowing institution acknowledges the British Museum's ownership of the object.”410 This is a trap. Greece must forfeit all claim of ownership for the marbles to return.

Many archaeologists argue that the case for the return of the Elgin Marbles to Greece is compelling, given that the original building from which they were taken still stands. The British Museum counters that “though partially reconstructed, the Parthenon is a ruin. It's universally recognised that the sculptures that still exist could never be safely returned to the building: they're best seen and conserved in museums. For this reason, all the sculptures that remained on the building have now been removed to the Acropolis Museum, and replicas are now in place.”411 Roger Michel, of the Institute of Digital Archaeology—a collaboration between archeologists at Cambridge and Harvard—proposes that reconstruction could offer a solution to resolve the long-standing dispute.

Michel, who initiated this project, intends for the copies to go to the British Museum to promote the repatriation of the original Elgin marbles. The British Museum’s Deputy Director, Jonathan Williams, quipped that ‘people come to the British Museum to see the real thing, don’t they?’ The museum’s curators apparently think not: “The two galleries adjoining the Elgin gallery both contain replicas of Parthenon sculptures still residing in Greece. Nearby are plaster casts of the palaces of Xerxes and Darius and the tombs of Sety and Merenptah. In fact, just like at peer museums around the world, copies are currently used throughout the British Museum.”412 Museums contain countless quality facsimiles of significant sculptural works, reflecting a historical appreciation for reproductions. However, a resolution satisfying both the British Museum and the Greek government seems unlikely. Copies and replicas are often viewed as inferior. In March, the museum denied a formal request to scan the pieces. The Institute of Digital Archaeology resorted to stealth—using discreet devices equipped with Lidar sensors and photogrammetry software—to reconstruct the marbles without permission.

The first reconstruction—a marble horse head—was converted to toolpaths for a subtractive robot, which carved the prototype over four days. The marble for this prototype was sourced locally, and the final copy was carved from marble quarried on Mount Pentelicus, the original source of stone for the Acropolis. Michel plans to create more replicas of the Parthenon Marbles, complete with restorations and repairs to reflect how the originals would have looked. These changes will account for the damage inflicted on the marbles during an ill-advised cleaning operation in the 1930s, in which British Museum masons stripped away much of the patina. The reconstruction will also have some degree of color restoration, applied by hand in collaboration with Greek experts: “looking at an ancient Greek or Roman sculpture up close, some of the pigment ‘was easy to see, even with the naked eye.’ Westerners had been engaged in an act of collective blindness. ‘It turns out that vision is heavily subjective … You need to transform your eye into an objective tool in order to overcome this powerful imprint’—a tendency to equate whiteness with beauty, taste, and classical ideals, and to see color as alien, sensual, and garish.”413 Writings on the topic of whitewashing monuments can be found across many disciplines and discussions, including post-colonial studies, sociology, history, and cultural heritage studies. The term can refer to both the physical act of cleaning or altering the color of monuments, as well as the metaphorical act of erasing or glossing over historical injustices or uncomfortable truths.414

Some archaeologists, while supporting the repatriation of the marbles, expressed concerns about the project’s source of funding, the lack of public consultation, and perceived echoes of British imperialism. Questions have also been raised about who the replicas serve and their political implications, especially when artifacts are seen as symbols of nationalism and state power. The Greek government has been silent on the replica project, causing unease among some scholars. Bernard Means, director of the Virtual Curation Laboratory at Virginia Commonwealth University, said such a project should only be undertaken with the consultation and full support of Greece, suggesting that proceeding otherwise indicates a colonial mindset.415 And while these guerrilla reconstructions have produced a new kind of pressure, the British Museum continues to discuss the matter in terms of a loan. The director of the Acropolis Museum, Nikolaos Stampolidis, reflected “in the difficult days we are living in, returning them would be an act of history. It would be as if the British were restoring democracy itself.”416

XXI. Heritage is public domain.

XXII. Heritage as property is our inheritance.

Explanation—Morehshin Allahyari is an Iranian artist, activist, educator, and curator who challenges the dominant narratives around concepts like heritage and ownership, particularly as they intersect with digital technology. One of her projects—Material Speculation: ISIS—involves the reconstruction of artifacts destroyed by terrorist violence. These reconstructions are 3D printed and embedded with a flash drive and a memory card. Each flash drive contains data about the artifact—gathered texts, images, videos, and maps. Allahyari condemns terrorist and institutional violence alike. Digital Colonialism, as conceptualized by Allahyari, is a critique of the power structures that exist in the production, archiving, and distribution of data and digital artifacts, particularly those from non-western cultures. Western corporations and institutions often control access to digital forms of cultural artifacts—mirroring historical forms of colonialism—where artifacts were removed from their original cultural contexts and displayed in European museums. Digital decolonialism involves critically examining how these technologies are used and who they serve. Unfurling—not reinforcing—existing power structures—as a means of resistance. Allahyari sees in technology not just a tool but liberatory potential: “She Who Sees The Unknown (2017-2021) is a research-based project by Morehshin Allahyari, that uses 3D simulation, sculpture, archiving, and storytelling to re-figure monstrous female/queer figures of Islamicate origin; using the traditions and myths associated with them to explore the catastrophes of colonialism, patriarchism and environmental degradation in relation to the Middle East.”417

XXIII. Is it possible to repatriate a reconstruction?

Explanation—Repatriation generally refers to the process of returning objects, such as cultural or historical artifacts, to their country of origin—or to the community with which they hold a significant cultural connection. Take the inverse of the marbles proposal—repatriating a reconstruction. If a reconstruction accurately and respectfully represents an original artifact that has been lost, stolen, or destroyed, repatriating that reconstruction could potentially allow a community to reconnect with its lost heritage in a meaningful way. This might involve transferring a digital or physical replica of the artifact back to the community, or it could involve sharing the knowledge and resources needed to create a reconstruction locally. However, a reconstruction is not the same as the original artifact. Even the most accurate and detailed reconstruction is an interpretation, created with contemporary tools, materials, and knowledge. The intangible aspects—the history, the stories of those who created and used the original artifact, and the spiritual or cultural significance attached to the artifact—cannot be fully recreated. A reconstruction is not a substitute for the original.

XXIV. Monuments are public memory.

Explanation—Monuments primarily serve as symbols of public memory, reflecting the cultural, political, or social ideals of a society at the time of their creation. They typically celebrate victories, pay homage to notable figures, or symbolize shared beliefs. Erecting a monument is often an act of affirming a dominant narrative—a projection of power. Monuments embody the ethical mode of Reconstruction. They materialize and solidify specific narratives and identities in the public sphere. Take the Lincoln Memorial: “In this temple as in the hearts of the people for whom he saved the Union, the memory of Abraham Lincoln is enshrined forever.”418 Savior of the Union. A symbol of freedom and the struggle against slavery. A reconstruction of Lincoln’s vision for America.

XXV. Monuments affirm the rules of power.

XXVI. Memorials are reconstructions of loss.

Explanation—Memorials traditionally hold a commemorative function, often acknowledging and mourning loss or tragedy. Memorials serve as sites for collective remembrance and reflection, encouraging contemplation about past events or individuals. Rather than predominantly celebrating or affirming a dominant narrative, memorials aim to facilitate healing, understanding, and reconciliation. The Vietnam Veterans Memorial, for instance, does not glorify war or assert nationalistic pride but remembers and honors those who served and died in the Vietnam War. It offers space for visitors to mourn, reflect, and connect with the names inscribed on its reflective surface, “these names, seemingly infinite in number…”419

XXVII. Reconstructions change.

Explanation—Monuments and memorials may intersect; they are not mutually exclusive. And these public structures' meanings and interpretations are not static. They evolve over time, influenced by changing societal attitudes, historical perspectives, and ethical norms. Debates surrounding Confederate monuments in the U.S. reevaluate historical reconstructions. These symbols are increasingly seen as elevating a history of slavery and racial discrimination.

XXVIII. Reconstruct memorials that elevate the vulnerable.

Explanation—Before Charleston, South Carolina, fell in 1865, Hampton Park became an outdoor prison for Union soldiers: “More than 250 prisoners died and were buried in mass graves. After Confederate evacuation, Black ministers and northern missionaries led an effort to reinter bodies and build a fence around a newly established cemetery. Over the entrance, workmen inscribed the words, ‘Martyrs of the Racecourse.’ On May 1, 1865, a group of newly freed Black people gathered at what is now Hampton Park to put decorations on the graves of the Union soldiers … They sang songs and they made speeches, and this was covered not only in Charleston but New York newspapers and this is credited as being the first Memorial Day.”420 The first Memorial Day was in the lowcountry.

Lowcountry is a documentary that dives into the complexity of interracial relations in Charleston, in the aftermath of the Emanuel AME Church Shooting. The film captures a community rocked by violence, as it wrestles with grief, grapples with injustice, and moves towards healing. The tragic event took place on June 17, 2015, when a white supremacist entered the Emanuel African Methodist Episcopal Church and killed nine African-Americans during a prayer service (I.xiv.). The incident sent shockwaves through the nation, igniting a broader conversation about racial hatred, prejudice, and the urgent need for change. Lowcountry explores this tragedy not just as an isolated event, but as a reflection of the historical racial tensions that have shaped the city of Charleston and the larger American society. It dissects the aftermath of the massacre, the community's collective grief, and the resilience demonstrated in the face of such a horrific event. The filmmakers explore how “Charleston's genteel reverie was shattered by shootings that exposed the underbelly of the city's tourist mythology. Can black and white residents arrive at conciliation or will immutable Southern politeness censor what is needed to initiate such a process? What are the conditions for healing in a city averse to truth-telling?”421

As part of the production of Lowcountry, the filmmakers captured and reconstructed models of many official Charleston monuments as well as the softer structures of memorial. There was an outpouring of global support in the wake of the Emanuel AME Church shooting. Countless objects—letters, prayers, and quilts—sent by individuals from around the world—tangible offerings of solidarity and love. When reconstructing the quilts—the filmmakers captured them folded—shielding their messages. For community eyes only—obscured. Within institutionalized slavery, quilts contained hidden messages of resistance and resilience. Quilts were often used as secret maps and instructional tools to aid enslaved individuals in their quest for freedom. Encoded within the intricate patterns and designs of these quilts were directions, symbols, and messages that guided runaways along the Underground Railroad.422 Artist and scholar Romi Morrison has written extensively about this form of covert communication, revealing the depth and complexity of these encodings and their pivotal role in facilitating escape: “The Freedom Quilts generate a type of code that doesn't execute automatically but makes the act of interpretation explicit. While encountering quilts left in public fugitives would discern the code and simultaneously have to read it in context, within the geography of placement. In this instance the executability of code is halted as a declarative axiomatic language imagined within syntax. Code is not an absolute instruction but is read in addition to landscape.”423

The production of Lowcountry took place against the backdrop of monuments to confederate ideology—and quieter markers to local abolitionists and civil rights activists—often underrepresented in the city’s narrative. The monuments—and memorials—throughout the city serve as powerful reminders of Charleston’s past struggles and achievements in the fight against racism and inequality: “Since the Charleston Church shooting, more than 300 Confederate symbols have been removed, including 170 monuments. As deadly violence against the Black community continues, Pinckney hopes that as monuments come down, the movement offers the opportunity for people nationwide to understand that Confederate symbols have served to terrorize the Black community since they first began to be put in place after the Civil War.”424 Gleaming white Confederate monuments still mount the terrain. Resistance symbols are hard-won. There was always a lack of bureaucratic support for African American monuments. And sites of community and remembrance have always been vulnerable—African-American graveyards have been systematically destroyed in the construction—and reconstruction—of the city.425 Nowhere to remember.

In 2018, The Legacy Museum opened in Alabama: “The Legacy Museum: From Enslavement to Mass Incarceration is situated on a site in Montgomery where Black people were forced to labor in bondage. Blocks from one of the most prominent slave auction spaces in America, the Legacy Museum is steps away from the rail station where tens of thousands of Black people were trafficked during the 19th century.”426 Then, in 2023, a Presidential Initiative was announced—The Monuments Project:

The Monuments Project is an unprecedented $250 million commitment by the Mellon Foundation to transform the nation’s commemorative landscape by supporting public projects that more completely and accurately represent the multiplicity and complexity of American stories. Launched in 2020, the Monuments Project builds on our efforts to express, elevate, and preserve the stories of those who have often been denied historical recognition, and explores how we might foster a more complete telling of who we are as a nation. Grants made under the Monuments Project will fund publicly oriented initiatives that will be accessible to everyone and promote stories that are not already represented in commemorative spaces. While funds may support new monuments, memorials, and historic storytelling places, not every project will be a statue or permanent marker but may be realized as ephemeral or temporary installations or other nontraditional expressions of commemoration that will expand our understanding of what a monument can be. Mellon will also support efforts to contextualize or recontextualize existing commemorative sites and to uplift knowledge-bearers who can tell stories that have not yet been told.

The Mellon Foundation is also providing lead support to MONUMENTS, “a new exhibition co-organized by LAXART and The Museum of Contemporary Art Los Angeles (MOCA) that will open in Fall 2025 at LAXART and The Geffen Contemporary at MOCA. Curated by LAXART Director Hamza Walker, internationally renowned artist Kara Walker (no relation, that they know of), and MOCA’s Senior Curator, Bennett Simpson, MONUMENTS will feature decommissioned Confederate monuments displayed alongside existing and newly commissioned works of contemporary art. MONUMENTS will be accompanied by a substantial scholarly publication and a robust slate of public and educational programming.”427 The exhibition is still in development. Not yet realized.

XXIX. Monuments exist in virtual reality.

Explanation—Currently, the government offers Virtual Tours of United States Veterans and War Memorials.428 A handful of organizations and individuals produce virtual reconstructions—Honor Everywhere: Virtual Reality Veterans Experience429—Civil War 1864: A Virtual Reality Experience430—I Am A Man—Traveling While Black431—1000 Cut Journey.432 Others produce virtual realities that throw off the tyranny of photoreal reconstruction—NeuroSpeculative AfroFeminism: “NeuroSpeculative AfroFeminism is an ambitious and richly imagined project by Hyphen-Labs, a global team of women of color who are doing pioneering work at the intersection of art, technology, and science. The project consists of three components. The first is an installation that transports visitors to a futuristic and stylish beauty salon. Speculative products designed for women of color are displayed around the space, including a scarf whose pattern overwhelms facial recognition software, and earrings that can record video and audio in hostile situations. The second part of NeuroSpeculative AfroFeminism is a VR experience that takes place at a “neurocosmetology lab” in the future. Participants see themselves in the mirror as a young black girl, as the lab owner explains that they are about to experience cutting edge technology involving both hair extensions and brain-stimulating electrical currents. In the VR narrative, the electrodes then prompt a hallucination that carries viewers through a psychedelic Afrofuturist space landscape.”433

XXX. Reconstructions preserve history.

Explanation—The preservation aspect of these initiatives is crucial in the face of increasing threats to cultural heritage sites, making virtual reality a tool for both the democratization and conservation of our monumental heritage. However, there is the potential for—the inevitability of—distortion and oversimplification of history. Virtual reality's immersive and interactive nature can create compelling experiences, but it also simplifies and omits. This can lead to skewed or partial understanding. This is particularly problematic with monuments that have contentious or layered histories. Although VR theoretically allows anyone to visit these monuments, the reality is that many people globally do not have access to these technologies. Projects like MasterWorks: Journey Through History by CyArk, which digitize and preserve cultural heritage sites, raise questions about ownership and control. Who has the right to digitize these sites, and who decides how they are represented or interpreted?

These questions are particularly complex when dealing with monuments that are culturally sensitive or sacred to certain communities. Devices and user interfaces impose technological landmarks on virtually reconstructed sites. The interaction layer is more game than reverence. This gamification is seen in geolocation augmented reality applications like Pokémon Go, where real-world monuments become virtual PokéStops and Gyms—trivializing their cultural and historical significance. Striking a balance between play and respect is a delicate task. It necessitates community consultation. Alejandro G. Iñárritu’s Carne y Arena is a virtual reality installation that recreates the intense conditions faced by refugees on their journey towards the United States: “During the past four years in which this project has been growing in my mind, I had the privilege of meeting and interviewing many Mexican and Central American refugees. Their life stories haunted me, so I invited some of them to collaborate with me on the project. My intention was to experiment with VR technology to explore the human condition in an attempt to break the dictatorship of the frame, within which things are just observed, and claim the space to allow the visitor to go through a direct experience walking in the immigrants’ feet, under their skin, and into their hearts.”434

XXXI. Monuments are erected in augmented reality.

Explanation—Augmented Reality—AR—is a technology that overlays digital information or imagery onto the physical world, providing an interactive experience that merges the real and the virtual.
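A minimal sketch of that overlay operation, assuming OpenCV, NumPy, and a planar surface already detected in the camera frame; the function name and corner inputs are illustrative, not drawn from any project described here.

import cv2
import numpy as np

def overlay_virtual_monument(frame, overlay, dst_corners):
    # frame: BGR camera image; overlay: image of the virtual object;
    # dst_corners: 4x2 array marking where the detected plane sits in the frame.
    h, w = overlay.shape[:2]
    src_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    # A homography maps the overlay's corners onto the detected surface.
    H, _ = cv2.findHomography(src_corners, np.float32(dst_corners))
    warped = cv2.warpPerspective(overlay, H, (frame.shape[1], frame.shape[0]))
    # Composite: wherever the warped overlay has content, replace the frame pixels.
    mask = cv2.warpPerspective(np.full((h, w), 255, np.uint8), H,
                               (frame.shape[1], frame.shape[0]))
    frame[mask > 0] = warped[mask > 0]
    return frame

The same logic underlies the geolocated works discussed below: the anchor becomes a set of coordinates rather than a detected plane, but the operation remains a projection of the virtual onto the real.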

XXXII. Subversive monuments augment reality.

Explanation—Artists use reconstructions to erect subversive monuments in real space. Subversive augmented reality monuments challenge traditional narratives and power structures through virtual installations. They use technology to disrupt or question dominant ideas and perspectives. Nancy Baker Cahill is known for using augmented reality to create public art in unexpected places. Her project, Liberty Bell, is an augmented reality, sound-reactive public artwork that examines the concept of freedom (V.). It has been placed at multiple historical sites of liberation and oppression across the United States. A ghostly bell—hidden in plain sight—ringing out.435 Kambui Olujimi explores the reconstruction of historical narrative in Skywriters & Constellations: “Immersive and unique in its form and process, ‘Skywriters’ (2018) is an animated collage of time and space projected onto the night sky of the Planetarium’s dome. Using full dome technology, Olujimi achieves dramatic shifts of scale and stunning visual effects that animate Wayward North. Olujimi creates his figural imagery by stitching together an encyclopedic range of film clips—earth, sky, street scenes, and microscopic views of natural and manmade materials.”436 These narratives provide different perspectives on the socio-political landscape, encouraging viewers to rethink their understanding of history and reality. John Craig Freeman produced the augmented reality piece, Border Memorial: Frontera de los Muertos. This project is an AR monument dedicated to the thousands of migrant workers who have died along the U.S./Mexico border in recent years. It uses technology to bring visibility to a critical and often overlooked issue. Geolocated skeletons haunt the landscape.437 Tamiko Thiel and /p created Unexpected Growth—a dystopian vision of a future affected by climate change and pollution. Installed at the Whitney Museum in New York, the AR project shows hybrid bio-plastic coral-like structures growing on objects in the museum. Unexpected mutations in sea life. Underwater.438 These artists demonstrate the power of reinserting reconstructions into reality to question—provoke—disrupt.

XXXIII. Emulation is the desire of something, engendered in us by our conception that others have the same desire.

Explanation—He who runs away, because he sees others running away, or he who fears, because he sees others in fear; or again, he who, on seeing that another man has burnt his hand, draws towards him his own hand, and moves his body as though his own were burnt; such an one can be said to imitate another's emotion, but not to emulate him; not because the causes of emulation and imitation are different, but because it has become customary to speak of emulation only in him, who imitates that which we deem to be honorable, useful, or pleasant.

XXXIV. Imitations and emulations—the terrain of simulation.

A simulation is a computational or physical model that emulates a real-world system or scenario. It is often used when conducting experiments on the actual system would be impractical, dangerous, or impossible. A simulation replicates essential aspects of the real world in a controlled environment, enabling study, prediction, and scenario testing. Pilots are trained using flight simulators that emulate real flying conditions, while surgeons use simulations to practice complex procedures and predict outcomes. Simulations of atmospheric conditions predict future weather patterns. In finance, simulations assess potential investment outcomes. Engineers use simulations to test and optimize designs. Physicists and biologists create simulations to study phenomena that cannot be directly observed or experimented upon, like the formation of galaxies or the evolution of species. A simulation typically involves constructing a mathematical model that represents the system, defining the rules and parameters that govern its behavior, and then running the model on a computer to see how the system behaves over time or under different conditions. The output can be a single result, a range of possible outcomes, or an interactive experience. The effectiveness of a simulation is heavily dependent on the accuracy of the model and the realism with which it replicates the real-world system (V.).
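To ground the definition, a minimal sketch of the loop it describes (model, rules, parameters, time steps), using logistic growth as a stand-in system; all numbers are invented for illustration.

def simulate_logistic(population, growth_rate, capacity, steps):
    # Rules of the model: growth proportional to population,
    # damped as the population approaches carrying capacity.
    history = [population]
    for _ in range(steps):
        population += growth_rate * population * (1 - population / capacity)
        history.append(population)
    return history

# Run the same model under different conditions to compare scenarios.
for r in (0.1, 0.5, 1.0):
    trajectory = simulate_logistic(population=10.0, growth_rate=r,
                                   capacity=1000.0, steps=50)
    print(f"growth_rate={r}: final population ~ {trajectory[-1]:.0f}")

The output is a range of possible outcomes, one trajectory per parameter setting: the scenario testing the definition names.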

XXXV. Reconstructions extend reality.

The power of reconstruction extends far beyond aesthetics. Reconstructions hold profound potential to shape both narrative and action, serving as bridges that connect past and present, memory and reality, tragedy and resilience. Through the careful and empathetic use of reconstruction technologies, suppressed histories can be brought to the forefront, silenced voices can be given a platform, and narratives of oppression can be rewritten into narratives of hope and resistance. The process of reconstruction must be handled with sensitivity and respect, ensuring that it does not trivialize or exploit the experiences of those who have suffered or continue to suffer. How to protect against trivialization? Exploitation? Deceit?

XXXVI. Reconstruction enables deception.

XXXVII. Deception is warfare and magic.

XXXVIII. Heritage is weaponized as a right-wing think tank.439

XXXIX. Force things to stay the same.

XL. Or face the uncertainty of change.

XLI. The author is a Trojan horse.440441

Explanation—The Trojan Horse is one of the most enduring myths of deception in our cultural lexicon. It is a cautionary tale about the potential of duplicity to wreak havoc—catastrophe. The story, which hails from ancient Greek literature, narrates how—during the Trojan War—the Greeks cunningly presented the Trojans with a giant wooden horse—ostensibly as a peace offering—a gesture of their surrender. The seemingly harmless horse, however, was hollow. Hidden within—a cohort of Greek soldiers—poised to attack. Upon admittance into the fortified city of Troy, the soldiers emerged under the cover of night and overpowered the unsuspecting citizens. The myth underscores how deception can breach even the most formidable defenses. The Trojan Horse has since become a symbol of deceit—a metaphor for a seemingly innocuous element or action that carries within it the seeds of ruin:

After many years have slipped by, the leaders of the Greeks,

opposed by the Fates, and damaged by the war,

build a horse of mountainous size, through Pallas’s divine art,

and weave planks of fir over its ribs

they pretend it’s a votive offering: this rumor spreads.

They secretly hide a picked body of men, chosen by lot,

there, in the dark body, filling the belly and the huge

cavernous insides with armed warriors.

[...]

Then Laocoön rushes down eagerly from the heights

of the citadel, to confront them all, a large crowd with him,

and shouts from far off: ‘O unhappy citizens, what madness?

Do you think the enemy's sailed away? Or do you think

any Greek gift is free of treachery? Is that Ulysses's reputation?

Either there are Greeks in hiding, concealed by the wood,

or it's been built as a machine to use against our walls,

or spy on our homes, or fall on the city from above,

or it hides some other trick: Trojans, don’t trust this horse.’442

A Trojan Horse is malicious software—a malware twin—disguising itself as a legitimate and harmless program or file to deceive users into executing it. Unlike viruses or worms, Trojans do not replicate themselves but rely on social engineering techniques to trick users into running them. Once a Trojan is executed, it can carry out various malicious actions on the infected computer without the user's knowledge. Stealing sensitive data. Spying on user activities. Modifying or deleting files. Installing other malware. Creating backdoors for remote access. Trojan horses often enter a system through deceptive means, such as hiding in seemingly harmless attachments, fake software downloads, or infected websites. They exploit vulnerabilities in the operating system or other software to gain unauthorized access.

XLII. Industrial deception perpetuates the original colonial project—extract labor, sell sugar.

Explanation—The Teapot Dome scandal—which came to light in the early 1920s—was regarded as the most significant political scandal in U.S. history until Watergate. A Trojan horse. The scandal involved the secret leasing of federal oil reserves by Albert B. Fall, who was serving as Secretary of the Interior under President Warren G. Harding. The two oil fields, one located in Teapot Dome, Wyoming, and the other in Elk Hills, California, had been set aside for the U.S. Navy's use in case of a national emergency. Fall managed to transfer the control of these reserves from the Navy to his Department of the Interior, and he then secretly leased the land to private oil companies. In exchange for these illegal transactions, Fall received considerable personal gifts and loans from the oil companies involved. When the details of this scandal came to light, it led to a Senate investigation and Fall was ultimately found guilty of bribery, marking the first time a U.S. cabinet official had been convicted of a felony while in office. The Teapot Dome scandal stands as a classic example of industrial deception, showcasing the potential for corruption when government officials and corporate interests collude for personal gain, often at the expense of public resources. It demonstrated how corporate influence could manipulate and exploit government processes, thereby breaching public trust and skewing public policy in favor of industry, rather than the citizens it was meant to serve. The scandal led to increased public awareness and demand for transparency in government and corporate affairs, and has since served as a cautionary tale for the need for stringent regulations and checks on the intersection of political power and corporate influence.

Fall's actions in the Teapot Dome scandal led to a significant public outcry and increased calls for greater transparency and accountability in government. Despite these calls, Fall played a significant role in dismantling industrial regulation during his tenure as the Secretary of the Interior. His direct impact was not in favor of increased regulation but rather in loosening and deregulating colonial industries—Mining, Railroad, Oil, Food & Drugs.443 In The Hacking of the American Mind, Robert Lustig—a neuroendocrinologist and pediatrician—explores how these industries—particularly Food & Drugs—and Emerging Tech—have been manipulating the human mind and contributing to the rise of ruinous societal issues. The central premise of the book is that our brains are being hacked—manipulated—by the excessive consumption of processed foods and processed images. Short-term pleasure and instant gratification in exchange for freedom. Lustig argues that the overconsumption of sugar triggers addictive responses in the brain—generating epidemics. He delves into the science of how sugar affects brain chemistry, leading to changes in dopamine and other neurotransmitters, which can drive compulsive behavior. Food & Drug subsidies—and marketing—create a deceptive system of value and profit—flooding the market with poison. Lustig discusses how digital technology has been similarly designed to exploit our psychological vulnerabilities and keep us hooked to our screens. He explores how online interactions create a sense of validation and reward in our brains, leading to addictive behaviors. He argues that the combination of excessive sugar consumption and overuse of digital technology has led to a society plagued by chronic stress, depression, anxiety, and illness. He emphasizes the need for individuals to be aware of these manipulative tactics and take proactive steps to reclaim control over their minds and bodies.444

XLIII. A teapot is a colonial projection (I.xv.).

XLIV. Ambition is the immoderate desire of power.

Explanation—Industrial deception.

XLV. Luxury is excessive desire, or control.

XLVI. Luxury causes physical and mental illness. Insulin Resistance.

Explanation—Addiction.

XLVII. Reconstructions can be used to simulate parameter changes. Possible futures.

XLVIII. Subsidies are controls.

Explanation—Subsidized is a game—a simulation—that reconstructs the logic of subsidy systems and plays out potential parameter changes.445 
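A hypothetical sketch in the spirit of Subsidized, not the game's actual code: a subsidy is a control parameter that lowers the effective price of a good; demand responds; sweeping the parameter plays out alternative futures. All names, numbers, and the elasticity model are invented for illustration.

def yearly_demand(base_demand, price, subsidy, elasticity=-1.2):
    # Demand responds to the effective (subsidized) price.
    effective_price = max(price - subsidy, 0.01)
    return base_demand * (effective_price / price) ** elasticity

for subsidy in (0.0, 0.25, 0.50):  # hypothetical dollars per unit
    demand = yearly_demand(base_demand=100.0, price=1.00, subsidy=subsidy)
    public_cost = subsidy * demand
    print(f"subsidy=${subsidy:.2f}: demand={demand:.0f} units, "
          f"public cost=${public_cost:.2f}")

Each parameter setting is a possible future; the simulation makes the control visible.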

GENERAL DEFINITION OF DECEPTION

Deception refers to the act of intentionally causing another person to believe something that is not true or not the whole truth. This can be achieved through lying, misleading, hiding important information, or presenting false information: “How can one keep from destroying oneself through guilt, and others through resentment, spreading one's own powerlessness and enslavement everywhere, one’s own sickness, indigestions, and poisons? In the end, one is unable even to encounter oneself.”446 While deception often has a negative connotation because it undermines trust and can lead to harm or unfairness, it’s important to note that it can also be used in benign ways, such as surprise parties, magic tricks, and some forms of entertainment where the objective is to create a delightful illusion or mystery rather than cause harm.

Central to Descartes’ philosophical inquiry was the concept of Deus Deceptor—or the deceiving God—which he proposed as a hypothetical construct to question the reliability of our senses and perceptions. Descartes sought to establish a foundation of certain knowledge by doubting everything that could be doubted, including sensory perceptions and beliefs. He proposed that there might be an omnipotent and malevolent being, the Deus Deceptor, who deceives us and leads us to hold false beliefs about the external world. This hypothetical construct was central to Descartes' Meditations on First Philosophy, where he employed the method of doubt to arrive at indubitable truths. The Deus Deceptor concept was instrumental in Descartes' efforts to build a solid epistemological foundation. By entertaining the possibility of a deceiving God, Descartes underscored the importance of clear and distinct ideas as the only reliable basis for knowledge. The act of doubting itself, according to Descartes, proves the existence of a thinking self—the first indubitable truth and the starting point for his philosophical system. The idea of an all-powerful deceiver, initially introduced by Descartes, underwent an evolution in the works of later philosophers, leading to the brain in the vat thought experiment. This thought experiment gained prominence in contemporary philosophy and cognitive science as a compelling exploration of skepticism and the nature of reality. The brain in the vat thought experiment imagines a scenario in which a brain is removed from a body and placed in a vat, connected to a sophisticated computer system that stimulates it with artificial sensory inputs. These simulated experiences are indistinguishable from real experiences, and the brain is deceived into believing that it is interacting with the external world.447 The brain in the vat scenario raises profound epistemological questions about the nature of knowledge, reality, and the reliability of our sensory perceptions. If the brain in the vat cannot differentiate between simulated experiences and genuine experiences, how can we be certain that our own experiences are not similarly simulated or manipulated? The thought experiment also touches upon the broader philosophical topic of solipsism, which posits that only one's own mind is sure to exist. Solipsism challenges the notion of an external, independent reality, suggesting that everything could be an elaborate illusion created by one's mind or a deceiving entity. The thought experiment raises issues concerning the nature of consciousness, the relationship between mind and body, and the limits of human cognition. Furthermore, the brain in the vat concept has found relevance in discussions about the nature of virtual reality and the ethical implications of advanced technologies that can manipulate human experiences and perceptions.448

Melodic death metal. Doubt is necessary in the face of evil genius449—post-truth reconstructions—but the Cartesian worldview can also lead to indifference. If the world is not real, why invest energy in protecting it? Disinformation is “adversarial narratives that create real world harm.”450 Uncertainty is destabilizing: “the spread of false, misleading and inaccurate news threatens democracy globally. In response, researchers, non-profit organizations and media companies have sought to develop techniques to detect mis- and disinformation but fact-checking, while important, is not enough. Fact-checking sites lag behind the deluge of rumors produced by global disinformation networks and spread via private interactions.”451 With the acceleration of artificial intelligence—disinformation becomes an existential threat.

Selection is revealing.

Part IV explores—Acceleration—the ethical framework that guides government, corporate, and individual development and application of Capture and Reconstruction technologies. 











PART IV.

OF TECHNOLOGICAL ACCELERATION, OR THE ETHICS OF SUPERVISION

PREFACE

Acceleration is the dominant ethical framework governing the development and application of Capture and Reconstruction. Push technology as far and fast as possible. No limit. No exit. Accelerationism, as a field, has ironically taken an oscillating rather than asymptotic trajectory, emerging during periods of economic extremity and receding during periods of stability. This makes sense in light of the fact that it is fundamentally a theory that revolves around Capitalism. The waveforms of Capital. Volatile economic conditions—short wavelengths—draw urgent attention to the dynamic forces that control labor and value. The ur-text of this particular discourse, then, is undoubtedly Karl Marx's Das Kapital. Marx, identifying the force of automation, lays the groundwork for future examinations of technocapital: “… to the degree that large industry develops, the creation of real wealth comes to depend less on labor time and on the amount of labor employed than on the power of the agencies set in motion during labor time, whose ‘powerful effectiveness’ is itself in turn out of all proportion to the direct labor time spent on their production, but depends rather on the general state of science and on the progress of technology, or the application of this science to production.”452

The springboard from marxist visions of the collapse of capitalism—toward a more radical call for its mutation—is Deleuze and Guattari's provocation in Anti-Oedipus: Capitalism and Schizophrenia: “Which is the revolutionary path? … To withdraw from the world market? … Or might it be to go in the opposite direction? To go still further, that is, in the movement of the market? … Not to withdraw from the process, but to go further, to ‘accelerate the process.’”453 Although this provocation is often taken out of context to encourage the wholesale and largely uncritical embrace of automation, it introduces the possibility of systemic transformation. In Energumen Capitalism, Jean-François Lyotard clarifies a shared interpretation of “capitalism as metamorphosis, with no extrinsic code, having its limit only within itself, a relative, postponed limit (which is the law of value)”454 and explains that “the potential of force is not a potential to produce something more, but a potential to produce something other, in other ways.”455 Accelerationism may immediately call to mind time and speed—it is more accurate, however, to think of it in Deleuze and Guattari's terms—as deterritorialization. Accelerationism is focused on identifying and conjuring emancipatory lines of flight—flows of change—utter transfiguration.

According to Accelerationists, one of its primary methods of catalyzing change is science fiction. James Ballard observes, for example, that “what the writers of modern science fiction invent today, you and I will do tomorrow, or, more exactly, in about ten years' time, though the gap is narrowing.”456 In the mid-90s, The Cybernetic Culture Research Unit at Warwick University took up this assertion with all seriousness, producing a massive collection of theory-fiction texts. The group—including Sadie Plant, Mark Fisher, and Nick Land—updates the accelerationist trajectory with formal cybernetics—and then injects it with vivid science-fictive elements. They argue that rather than diagnose or criticize extant conditions, their speculative texts had the power to seed new futures. They proposed the term hyperstition for narratives able to effectuate their own reality: “In the hyperstitional model … fiction is not opposed to the real. Rather, reality is understood to be composed of fictions—consistent semiotic terrains that condition perceptual, affective and behavioral responses. Writing—and art in general—[is construed] not aesthetically, but functionally—that is to say, magically, with magic defined as the use of signs to produce changes in reality.” They propose that these narratives actually come from the future, “fictional quantities functioning as time-traveling potentials,”457 and trigger long-range positive feedback loops that overhaul culture.

Science fiction narratives and capitalism are symbiotic—they feed each other progress and expand the boundaries of what can be subsumed in a framework of accumulation: “Capitalism is not a human invention, but a viral contagion, replicated cyberpositively across post-human space. Self-designing processes are anastrophic and convergent: doing things before they make sense. Time goes weird in tactile self-organizing space: the future is not an idea but a sensation.”458 The members of the CCRU offer themselves up as hosts for hyperstition's symbiotic pair—letting ideas from the future flow—through their writing—into the present. Their projections are neither gleaming nor bleak, but exhilaratingly chaotic. There is no waiting for deliverance from harm—only a growing pressure to imagine possible ways of life in the ruins.

Benjamin Noys, who coined the movement's name but remains critical of its approach, concedes that its projective strategies are what make it highly seductive: “try and break the appeal of acceleration.”459 He warns that it “gives over to capital a monopoly on our imagination of the future as the continuing intensification of accumulation and the reinforcement of the capitalist continuum.”460 Noys argues that like capitalism's regenerative power, accelerationism “is an aesthetics or practice of liquefaction that can temporarily solidify to activate force, before dispersing again into new liquid immanent forces.”461 He demands “a restoration of the sense of friction that interrupts and disrupts the fundamental accelerationist fantasy of smooth integration.” Friction can come from external regulation and it can also emerge from internal dissent.

Everyone working in tech is an accelerationist. One workstream of many feeding into a process that speeds up change with invention and automation. It is important, then, that we consciously evaluate the vector of development we are pursuing. What values do tech corporations hold? Do we subscribe to corporate values? How are we directing our creativity and engineering? The Accelerationist discourse reflects and influences the values of tech executives—corporations—and terrorists.462 It is a discourse explicitly organized around political ideologies. The major divisions within the field are: Left accelerationism—Right accelerationism—Gender accelerationism—Black accelerationism—and Unconditional accelerationism. Future agendas. All of these agendas are preoccupied with the emergence of a new kind of subject and citizen within the totalizing framework of technocapital. What are the laws and practices that regulate human and machine action within these systems? What is the function of corporate ethics? What is individual agency?

DEFINITIONS.

I. By L/acc, I mean Left Accelerationism.

II. By R/acc, I mean Right Accelerationism.

(Concerning these terms see the foregoing preface towards the end.)

III. By G/acc, I mean Gender Accelerationism.

IV. By B/acc, I mean Black Accelerationism.

(In V. xvii. note. i., G/acc and B/acc strategies intersect with Posthumanism and New Materialism.)

V. By U/acc, I mean Unconditional Accelerationism.

VI. By Accelerationism, I mean: “Jettison the prospects of salvation or failure! What is needed is an explosion of designs, speculative ways through and out, even if out is ultimately out of the question.”463 In Maximum Jailbreak, Benedict Singleton argues that “escape is the material with which design works. It is the enemy of stasis, even when the latter appears as motion but only as reiteration; a project of total insubordination towards existing conditions; a generalized escapology.”464 

VII. By an end, for the sake of which we do something, I mean a desire.

VIII. Acceleration is the project of escaping current and future conditions.

AXIOM.

Goals and weights determine the direction of change.
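One computational reading of this axiom (an interpretive sketch, not from the source): in machine learning, the goal is a loss function and the weights are parameters; together they fix the direction of the next step.

def gradient_step(weights, gradient, learning_rate=0.1):
    # Move each weight against the gradient of the goal.
    return [w - learning_rate * g for w, g in zip(weights, gradient)]

# Goal: drive a single weight toward a target value of 3.0.
w = [0.0]
for _ in range(100):
    grad = [2 * (w[0] - 3.0)]  # derivative of the loss (w - 3)^2
    w = gradient_step(w, grad)
print(round(w[0], 3))  # converges to 3.0

Change the goal, or reweight a term within it, and the system drifts elsewhere; the direction is decided before the motion begins.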

PROPOSITIONS.

Proposition I. The central accelerationist discourse mirrors the arguments of Left \ Right / democracy.

Proof.—Accelerationism is characterized by a tension between Left and Right agendas, which almost comically clings to party lines and shuttles us towards an event horizon of debilitating political resignation.

Note.—While Nick Land was perhaps the most influential thinker in the CCRU, his ideas became increasingly conservative and, for a time, fell into oblivion. Yet, in the shadow of the 2008 financial crisis, a young theorist, Alex Williams, resurrected and popularized accelerationism in the context of a liberal agenda. In his first post on the subject, he contends that while Capitalism intersects with humanity, it is ultimately not for us and should be analyzed according to “an anti-anthropomorphic cartography, a study in alien finance, a Xenoeconomics.”465 This statement strongly resembled Land's earlier theory-fiction interventions; however, in online conversation with Mark Fisher, a former member of the CCRU, Williams quickly revealed his absolute departure from Land, articulating his certainty that human agency plays a crucial role in unleashing the latent forces of capital.466

Proposition II. L/acc—Enactments of human agency make all the difference.

Proof.—Williams differentiates between what he calls weak accelerationism and strong accelerationism. He dismisses ameliorative actions on the left that simply stall crises of capitalism and simultaneously pushes against the strategy of inducing collapse in service of revolution. Rather, he promotes strong accelerationism, which “radically alters the nature of the processes of capital itself … a radical mutation of the system.”467 Again, for Williams, this mutation is contingent on strategic human action.

Proposition III. R/acc—Only the agency of technocapital matters.

Proof.—Land, on the other hand, insists that human agency has no place in the matter. He endows capitalism, rather than humanity, with the subject position, and most captivatingly asserts that capitalism -is- artificial intelligence.468 It operates according to a logic of production and profit, which exists at an order of magnitude beyond human conceptualization and control. He imagines that this AI was introduced in the 16th century with the shift of laboring subjects from feudal serfs to wage-workers, and has continued to infiltrate, adapt to, and manipulate all corridors of human life for the last six centuries.

Proposition IV. Technocapital does not serve humans, it has its own agenda.

Proof.—Land maintains that capitalism is not for the primary benefit of human players. Its desires supersede human concerns. Land is a staunch advocate of Technocapital AI. He wants it to continue to grow, develop, and assume total power. He also believes in the inevitability of this outcome: “Life is being phased-out into something new, and if we think this can be stopped we are even more stupid than we seem.”469 He subscribes to a kind of inverted marxist teleology, arguing that humans will not escape capitalism, but rather that capitalism will necessarily escape us. The positive-feedback loop of capital will leave us in the dust: “Humanity recedes like a loathsome dream.”470 And Land wholly supports the anti-human implications of this escape. Land's move is to reorient hope itself. While Spinoza aligns himself to the universe, Land aligns his desires for the future with technocapital.

Corollary.—If capitalism aims to escape humanity, why has it not yet succeeded?

Proposition V. R/acc—Capitalism and democracy are incompatible.

Proof.—In his text, The Dark Enlightenment, Land answers this question and, in the process, advances his conservative ideology: “For the hardcore neo-reactionaries, democracy is not merely doomed, it is doom itself. Fleeing it approaches an ultimate imperative.”471 Here, Land wages a full-scale attack on democratic principles, explaining that “as the democratic virus burns through society, painstakingly accumulated habits and attitudes of forward-thinking, prudential, human and industrial investment, are replaced by a sterile, orgiastic consumerism, financial incontinence, and a ‘reality television’ political circus.”472

Proposition VI. R/acc—Democracy is the limiting force that holds the capital AI back from fulfilling its full potential.

Proof.—Democracy is Land's ultimate antagonist; throughout the text, he characterizes universal political enfranchisement as both illusion and pathology: “that democracy is fundamentally non-productive in relation to material progress, is typically under-emphasized. Democracy consumes progress. When perceived from the perspective of the dark enlightenment, the appropriate mode of analysis for studying the democratic phenomenon is general parasitology.”473 Land supports his argument with theory as well as historical interpretation. He refers to ancient Greece “as a microcosmic model for the death of the West … Its pre-eminent virtue is that it perfectly illustrates the democratic mechanism in extremis, separating individuals and local populations from the consequences of their decisions by scrambling their behavior through large-scale, centralized re-distribution systems.”474

Proposition VII. R/acc—The ideal ethical framework for technocapital is “no voice, free exit.”

Proof.—Land identifies this solution in the writing of blogger and computer scientist, Curtis Yarvin, also known as Mencius Moldbug / It is important to note that Moldbug's ideas, which promote racist, patriarchal, and fascist ideologies while condemning the free press and academic institutions, are foundational to both neoreaction, or NRx, and the alt-Right movement / In The Dark Enlightenment, Land focuses primarily on Moldbug's concept of neo-cameralism, an alternative to democracy, in which local city-states function as corporations, each with a privileged class of stakeholders, and a governing CEO: “Gov-corp would concentrate upon running an efficient, attractive, vital, clean, and secure country, of a kind that is able to draw customers. No voice, free exit.”475 In this patchwork model, subjects have no voting power, but can move to a new city-state if they are dissatisfied \ Critically, neither Land nor Moldbug addresses the feasibility of physical relocation. Particularly strange is this model's assumption and celebration of open borders and nomadic existence, while actual disciples of this train of thought are uniformly anti-immigrant, calling for walls to be built to prevent the escape of refugees and economic migrants alike.

Corollary.—This paradox highlights the shared racial bias core to Land and Moldbug's alliance. Their worldview manifests throughout The Dark Enlightenment, and its critique of all-powerful leftist apparati / the Cathedral \ a term he borrows from Moldbug. First, Land disparages the educational embrace of intersectionality and postcolonial critique, renaming liberal scholarship “Grievance Studies.”476 He is not only skeptical, but repulsed by this discursive shift, claiming that the University system has descended into a chaotic, irrational, moralizing vortex, obsessed with identity politics. Land reasons that “because grievance status is awarded as political compensation for economic incompetence, it constructs an automatic cultural mechanism that advocates for dysfunction.”477 This line in particular exposes his refusal to acknowledge structural injustice. Land does not consider the attainment of financial success through oppression / invisible labor / and slavery to be dysfunctional in any way. To the contrary, he sees this strategy as highly effective.

Proposition VIII. The knowledge of good and evil is nothing else but the emotions of pleasure or pain, in so far as we are conscious thereof.

Proof.—Land argues that tolerance is nonsensical, that moral outrage in the face of racism is unwarranted, and that, “a ‘hate crime’, if it is anything at all, is just a crime, plus ‘hate’, and what the ‘hate’ adds is telling.”478 He claims that the label hate crime is an invention of the left to suppress conservative ideology. He continues, “as we have seen, only the Right can ‘hate.’”479 He describes the dangers of inner city neighborhoods, the unfairness of the term white flight, which he considers racist, and the misunderstood logic of white nationalism. After all of this, Land makes his main point: through suppression of thought and mind control, the Right is unfairly and unconditionally associated with racism. The Left uses this association to manipulate citizens to vote against conservative representatives and ideas. As a result / government continues to grow / bureaucratic expansion is the left's ultimate agenda.

Proposition IX. R/acc—There are three possible futures within the framework of democracy / all equate to doom.

Proof.—

(1) Modernity 2.0. Global modernization is re-invigorated from a new ethno-geographical core, liberated from the degenerate structures of its Eurocentric predecessor, but no doubt confronting long range trends of an equally mortuary character. This is by far the most encouraging and plausible scenario (from a pro-modernist perspective), and if China remains even approximately on its current track it will be assuredly realized. (India, sadly, seems to be too far gone in its native version of demosclerosis to seriously compete.)

(2) Postmodernity. Amounting essentially to a new dark age, in which Malthusian limits brutally re-impose themselves, this scenario assumes that Modernity 1.0 has so radically globalized its own morbidity that the entire future of the world collapses around it. If the Cathedral ‘wins’ this is what we have coming.

(3) Western Renaissance. To be reborn it is first necessary to die, so the harder the ‘hard reboot’ the better. Comprehensive crisis and disintegration offers the best odds (most realistically as a sub-theme of option #1).480

Note.—In the final section of The Dark Enlightenment, Land attempts to minimize the import of racial tension and negotiation by invoking his strategy of theory-fiction, familiar from his writings in Fanged Noumena. He introduces the concept of the bionic horizon—at which point “‘humanity’ becomes intelligible as it is subsumed into the technosphere, where information processing of the genome—for instance—brings reading and editing into perfect coincidence.”481 He then appropriates Octavia Butler's Xenogenesis trilogy, and in conjunction with John H. Campbell's concept of generative evolution, performs a sleight of hand / proposing that eugenic intervention will soon transform the human species so rapidly and monstrously ~ it will be unrecognizable. This is a kind of inversion of Butler's project, which foregrounds questions of race \ gender \ and sexuality / Land twists Xenogenesis to dismiss those same questions / asserting that biological and cultural evolution will render conversations about identity utterly meaningless.

Corollary.—The image of something past or future, that is, of a thing which we regard as in relation to time past or time future, to the exclusion of time present, is, when other conditions are equal, weaker than the image of something present; consequently an emotion felt towards what is past or future is less intense, other conditions being equal, than an emotion felt towards something present.

Proposition X. R/acc—Human warfare is a consequence of technocapital.

Proof.—Land clarifies his eugenic predictions, offering the term hyper-racism482 to convey a future in which “space colonization will inevitably function as a highly-selective genetic filter.”483 He imagines that human populations will evolve into different species as a result of these new cosmic barriers / genetic incompatibility will eventually lead to full-fledged warfare.

Corollary.—From the remarks made in Def. vi. of this part it follows that, according to Land, there is no human escape from technocapital.

Proposition XI. L/acc—Induce planetary-scale transduction.

Proof.—In the year following Land's neoreactionary outburst, Alex Williams and Nick Srnicek published their #Accelerate Manifesto \ launching an argument for planetary-scale social transformation to a wider audience. The manifesto acknowledges Land's key role in the development of accelerationism and at the same time rebuts his conservative approach. Their accelerationist imaginary departs from Land's in several ways.

Proposition XII. L/acc—Capitalism is not an inevitable or even legitimate catalyst.

Proof.—In Williams and Srnicek's words, “capitalism cannot be identified as the agent of true acceleration.”484 In fact, they frame capitalism as an engine of stasis, referring to Deleuze and Guattari's explanation of deterritorialization and reterritorialization as forces of equilibrium. They attack the conservative perspective as “myopic,” asserting that “Landian neoliberalism confuses speed with acceleration. We may be moving fast, but only within a strictly defined set of capitalist parameters that themselves never waver. We experience only the increasing speed of a local horizon, a simple brain-dead onrush rather than an acceleration which is also navigational, an experimental process of discovery within a universal space of possibility. It is the latter mode of acceleration which we hold as essential.”485 Unlike Land / who sees capitalism and technological progress as inextricably linked \ Srnicek and Williams see disentanglement as a real possibility. They encourage this separation and ask “what a modern technosocial body can do [outside of] the enslavement of technoscience to capitalist objectives.”486

Corollary.—L/acc—Technological progress is not bound by capitalism.

Proof.—Srnicek and Williams acknowledge the racist—sexist—underpinnings of golden era487 capitalism and refuse a return to unjust / stable social hierarchies \ they shift focus to issues of labor within neoliberal capitalism: “we need to reconstitute various forms of class power. Such a reconstitution must move beyond the notion that an organically generated global proletariat already exists. Instead it must seek to knit together a disparate array of partial proletarian identities, often embodied in post-Fordist forms of precarious labor.”488 They recognize the fissures between different groups and seek new ways of fulfilling the long-held marxist promise of a unified working class \ unified by precarity. The question of labor is one that Srnicek and Williams return to repeatedly in future projects.

Proposition XIII. L/acc—Democracy is an important check on technocapital.

Proof.—Williams and Srnicek refute Land's claim that democracy stifles acceleration \ arguing that “the assessment of left politics as antithetical to technosocial acceleration is also, at least in part, a severe misrepresentation.”489 They concede that direct action may no longer be an effective strategy for change, but they do not abandon universal enfranchisement and social progress in their model \ rather \ they call for methodological innovation within leftist politics: “Democracy cannot be defined simply by its means—not via voting, discussion, or general assemblies. Real democracy must be defined by its goal—collective self-mastery … We need to posit a collectively controlled legitimate vertical authority in addition to distributed horizontal forms of sociality, to avoid becoming the slaves of either a tyrannical totalitarian centralism or a capricious emergent order beyond our control.”490 They call for a new intellectual infrastructure and widespread media reform. While at first this call reads as suspicious of academia and the free press / which Land demonizes and would entirely dismantle / Srnicek and Williams leave room for their continued \ albeit changed \ existence.

Proposition XIV. L/acc—Humans have ultimate agency.

Proof.—In contrast to the Moldbugian proposal for regional gov-corps with disparate constitutions \ Srnicek and Williams argue against localism. Instead they embrace “modernity of abstraction, complexity, globality, and technology,”491 which they posit are necessary to effectively address equally abstract—complex—global problems. They take issue with Land's worship of capital as the supreme AI: “In this visioning of capital, the human can eventually be discarded as mere drag to an abstract planetary intelligence rapidly constructing itself from the bricolaged fragments of former civilisations.”492 For Srnicek and Williams, there is still a critical role for the human \ humans are the designers and directors of the future with the ultimate agency to effect change \ through engagement with technology and political economy: “we must develop both a cognitive map of the existing system and a speculative image of the future economic system.”493 In the Left accelerationist imaginary \ only human processes of creative projection \ have the potential to liberate the latent forces of capitalism.

Proposition XV. R/acc—Politics is the crisis.

Proof.—In response to the #Accelerate Manifesto, Land immediately published a series of sardonic annotations. He begins his critique by undermining climate change science: “how did this hypothetical forecast achieve such extraordinary prestige?”494 He also persistently attacks Srnicek and Williams's use of the term neoliberalism / which he claims “is not a serious concept”495 and “is merely a profession of faith, serving far more as a tribal solidarity signal than an analytical tool.”496 He also counters their characterization of capitalism as the source of trouble and reasserts his position that “the ‘crisis’ [that] gathers force and speed is politics.”497 He explains that “from the Right, the single and comprehensive social disaster underway is the uncompensated expansion of the state.”498

Proposition XVI. R/acc—The goal is to support the autonomy of technocapitals positive-feedback loop.

Proof.—Land takes issue with the fact that Left accelerationism prioritizes questions of social justice / “prevailing in social conflict”499 / over rapid technological evolution. As a result he determines that Left accelerationism is a position of conditional accelerationism. He contrasts this with unconditional Right accelerationism / which absolutely serves the autonomy of the technocapital positive-feedback loop. He reflects critically on the #Accelerate Manifesto, asking: “enslave technosocial acceleration to ‘collective self-mastery’? That seems to be the dream.”500

Proposition XVII. R/acc—There have been countless experiments in post-capitalist alternatives and none have been remotely competitive or even survived.

Proof.—Land reinscribes his projection that capital is an inevitable and totalizing form of intelligence.

Note.—Overall / he finds that the #Accelerate Manifesto makes unsubstantiated claims / lacks supporting evidence / and is rife with hand-waving.

Proposition XVIII. L/acc—The goal is human freedom.

Proof.—Williams followed the #Accelerate Manifesto and its annotations with an essay titled Escape Velocities that begins to index contemporary accelerationist discourse \ which he argues has surpassed Land’s contributions. He explains that “at present we find a swarm of new ideas operating under this rubric, ranging from post-capitalist techno-political theory, to sci-fi speculative cosmist design, to universal rationalist epistemologies.”501 Williams identifies human freedom \ as opposed to Land’s “merely negative freedom: the freedom of capital from deleterious (and misguided) human intervention”502 \ as the fundamental project of Left accelerationism.

Note.—Williams cites the concept of epistemic accelerationism \ developed in parallel by Ray Brassier and Reza Negarestani \ which “proceeds via alienation”503 as a result of the nihilistic tendencies of mathematics and scientific discovery: “epistemic acceleration then consists in the expansion and exploration of conceptual capacity, fed by new techno-scientific knowledges, resulting in the continual turning-inside-out of the humanist subject in a perpetual Copernican revolution.”504 Williams argues that epistemic accelerationism and new forms of left politics are the two most promising modes for attaining freedom.

To enable these projects toward human freedom \ “much of the initial labor must be around the composition of powerful visions able to reorient populist desire away from the libidinal dead end which seeks to identify modernity as such with neoliberalism, and modernizing measures as intrinsically synonymous with neoliberalizing ones”505 \ The sensory specifics of how this is communicated \ however \ are left entirely open \ He follows this indeterminate aesthetic with a concrete “aesthetics of interfaces, control rooms, and cognitive maps” necessary for empowering users to wield data in service of the Left agenda \ Finally \ Williams promotes an aesthetic of improvisational action \ or mêtic practice \ which “entails a complicity with the material, a cunning guidance of the contingent (and unknowable in advance) latencies discoverable only in the course of action” \ His essay’s aesthetic emphasis is a major reason for the recent explosion of artists who have taken up accelerationism as a creative framework and source of inspiration.

Proposition XIX. L/acc—Accelerationist aesthetics activate human agency.

Proof.—Srnicek and Williams have since produced a series of rigorous projects in which they provide support and context for the ideas sketched in their two shorter pieces \ Together they published Inventing the Future: Postcapitalism and a World Without Work \ in which they develop their critique of Neoliberalism \ dissect the limitations of direct action exemplified in the post-recession Occupy movement \ and embrace posthumanism. The chapter \ Left Modernity \ promotes the anti-essentializing ideology of the latter: “This is a project of self-realization, but one without a pre-established endpoint \ It is only through undergoing the process of revision and construction that humanity can come to know itself \ This means revising the human both theoretically and practically \ engaging in new modes of being and new forms of sociality as practical ramifications of making ‘the human’ explicit.”506 Interestingly / this section mirrors Land’s projection at the end of The Dark Enlightenment that humans will inevitably transform \ albeit with a very different politics in mind \ Srnicek and Williams devote the rest of the book to the argument “that the contemporary Left should reclaim modernity, build a populist and hegemonic force, and mobilize towards a post-work future.”507

Proposition XX. L/acc—Work limits human agency.

Proof.—Recalling Bertrand Russell’s essay \ In Praise of Idleness \ Williams and Srnicek articulate the imperative to imagine a decline in human labor \ which they see as the key to finding viable future scenarios outside of the now totalizing neoliberal system.

Note.—Their post-work future depends on four demands:

1. Full automation

2. The reduction of the working week

3. The provision of a basic income

4. The diminishment of the work ethic508

These demands require technological innovation \ changes to public policy \ state subsidization \ and discursive-aesthetic adjustments. Srnicek and Williams posit that each independently would move their post-work agenda forward \ but a combination would amplify the effect and accelerate a paradigm shift.

Proposition XXI. L/acc—Emerging platforms have potential beyond the perpetuation of technocapital.

Proof.—In his book \ Platform Capitalism \ published in 2016 \ Srnicek analyzes the affordances of what he considers a new business model \ distinct from Fordist vertical integration and post-Fordist flexible production \ and its implications for the future of labor \ Srnicek explains that platforms are simultaneously intermediaries and infrastructures \ Platforms are multi-sided markets that bring producers and consumers together \ They are infrastructures in the sense that they allow individuals to build applications on top of them \ Platforms come with network effects: “the more numerous the users who use a platform, the more valuable it becomes for everyone else”509 producing a tendency toward monopolies \ A key strategy to generate platform use is cross-subsidization \ using revenue from one part of a platform to make other parts free \ in turn incentivizing participation \ The goal of platforms is to collect as much data as possible.

Platforms are composed of core architectures which preclude neutrality—as a result \ platforms are inherently political. Srnicek identifies five main platform categories in the contemporary landscape \ each type has its own dynamics and constraints—digital advertising—cloud-computing services—industrial IoT—product platforms—and lean platforms \ Importantly \ the last category of lean \ generally assetless companies \ is not profitable and \ Srnicek argues \ likely faces imminent collapse. Srnicek asks what these companies have to do to actually make a profit and what ethical implications are at stake for each of these revenue-generating strategies \ he examines the dominant mode / monopolization \ as well as two alternatives: “platform cooperatives and public platforms, ‘owned and controlled by the people’ and subsidized by the state.”510 He does not see a clear path for either of these alternatives to actually compete with corporate monopolies and ends the book with a grim future outlook. Platforms have enabled deregulation—widened the wealth gap / increased working hours \ and decreased protections for workers. It does not appear that they have the capacity to profoundly mutate the systems from within ~ Platforms may signal a shift of how humans operate within capitalism—but ultimately they do not provide a stairway through and out.

Proposition XXII. R/acc—Platforms support technocapital.

Proof.—Land’s most recent contribution / the introduction to his forthcoming book / is a platform analysis of Bitcoin / which he hails as a concrete fulfillment of Right accelerationism / The introduction reveals that Land is still highly critical of discussions that address social and political fairness / He laments the fact that “because money is inextricably entangled with questions of reciprocity, it is tied-up intimately with such provocations to outrage as injustice, cheating, exploitation, and unbounded inequality. Such sensitive moral trigger-zones pose a formidable inhibition to dispassionate analysis … Discussions of money drive social apes mad.”511 Although he does not make an explicit comparison in the introduction—it is notable that the blockchain—the public immutable ledger which tracks cryptocurrency transactions—bears structural resemblance to the gov-corp patchwork described in The Dark Enlightenment / supporting the Right’s “commitment to escape.”512 On one level the blockchain allows for multiple simultaneous cryptocurrencies—or coins—to coexist and compete. Land focuses on Bitcoin / but there are countless others including Ethereum—Ripple—Litecoin—Dash—Dogecoin—the list goes on. Each cryptocurrency has its own algorithms and underlying ideologies. Users choose which cryptocurrencies to hold or exchange and which to abandon / Land’s exit premise works seamlessly on the blockchain.

Corollary.—Platforms embody the “no voice, free exit” framework.

Proposition XXIII. R/acc—Platforms subdivide and merge when necessary.

Proof.—The blockchain has the capacity to fork \ Each fork is like an alternate reality \ a history of transactions determined by consensus / A fork represents a change of protocol and with it a change in its community of users / Here, the exit strategy is viable as well \ Too many forks, however, produce instability.513 Fittingly / Land applauds this quality and explains that “the Left thus recognizes its enemy, with striking realism, as an emergent—and intrinsically fractured—agent of social dissolidarity.”514 While each cryptocurrency and each blockchain fork represents a particular protocol and ideology / Land celebrates the fact that the blockchain platform overwhelmingly preferences and disseminates the conservative agenda: “consistent ‘right-wing extremism’, automated governance, and unflinching critical philosophy are inter-translatable without significant discrepancy. The crypto-current is a nightmare for the left (rigorously conceived).”515

Proposition XXIV. There are resistant forms of Acceleration.

Proof.—The debate between R/acc and L/acc thinkers parallels contemporary politics in the United States and elsewhere / the merging of government and corporate interests on the one hand and a panoply of welfare programs from healthcare and tuition assistance to universal basic income on the other \ This feedback loop has / however / spun out various other strains of accelerationism.

U/acc—Acceleration is guaranteed and unknowable.

Proof.—Unconditional accelerationism | or U/acc | rejects praxis and offers an alternative to the right and left dichotomy: “U/acc calls attention to the manner through which collective forms of intervention and political stabilization, be they of the left or the right, are rendered impossible in the long-run through overarching tendencies and forces.”516 Everything is predetermined.

Proposition XXV. G/acc—Agency exists in the body.

Proof.—Gender accelerationism argues that capitalism as we know it depends on patriarchy \ which requires gender binaries to modulate power \ As Luce Irigaray explains in This Sex Which Is Not One \ women “remain an ‘infrastructure’ unrecognized as such by our society and our culture. The use, consumption, and circulation of their sexualized bodies underwrite the organization and the reproduction of the social order, in which they have never taken part as ‘subjects’”517 \ Gender accelerationists propose that if the gender binary is exploded \ patriarchy can no longer function as a stable substrate for capitalism \ and capitalism will be forced to mutate into something else ~ This explosion requires human agency in the form of biopolitical hacking ~ Through hormonal and genetic experimentation ~ and other modes of counter-performance ~ subjects can revise themselves and the systems they inhabit.

Proposition XXVI. G/acc—Binary identifications uphold technocapital.

Proof.—The origin of this speculative trajectory is found in the work of Shulamith Firestone who identified two modes of culture \ the aesthetic \ and the technological \ She aligns the aesthetic mode with the female and the technological mode with the male ~ though she maintains that these qualities exist in all sexes ~ She explains that the two modes form a feedback loop of envisioning and enacting and that “the merging of the aesthetic with the technological culture is the precondition of a cultural revolution.”518

Proposition XXVII. G/acc ~ Agency explodes binary systems.

Proof.—Firestone calls for the emergence of “an androgynous culture surpassing the highs of either cultural stream, or even of the sum of their integrations. More than a marriage, rather an abolition of the cultural categories themselves, a mutual cancellation ~ a matter-antimatter explosion, ending with a poof! of culture itself.”519 The defining quality of this revolution is a shift in the source of pleasure: “enjoyment will spring directly from being and acting itself, the process of experience, rather than from the quality of achievement.”520 The ecstatics of agency ~ process ~ and change overwhelm the paltry appeal of any external goal.

Proposition XXVIII. G/acc ~ Agency erodes human-non-human boundaries.

Proof.—Donna Haraway’s A Cyborg Manifesto is another key text in the formation of the Gender accelerationist strategy \ Like Irigaray and Firestone \ Haraway rejects an essentializing \ dualist vision of gender ~ she takes this argument to an extreme by dissolving boundaries with other categories as well ~ the primary focus is on boundaries between human-machine and human-animal ~ but crucially Haraway also argues that “the boundary between science fiction and social reality is an optical illusion.”521 She explains that ultimately her “essay is an argument for pleasure in the confusion of boundaries and for responsibility in their construction.”522

Proposition XXIX. G/acc ~ Agency is a creative process.

Proof.—Language and narrative have the power to restructure reality ~ Haraway is equally committed to material practices of intervention ~ In an advanced technocapital framework / which she calls the Informatics of Domination / “biological-determinist ideology is only one position opened up in scientific culture for arguing the meanings of human animality”523 ~ she encourages subversive self-design and taking up emerging systems to facilitate change: “communications technologies and biotechnologies are the crucial tools recrafting our bodies.”524 

Proposition XXX. G/acc ~ Agency is self-programming.

Proof.—After the dissolution of the CCRU—one of its most influential members ~ Sadie Plant ~ advanced the group’s theory-fiction strategies through the lens of Gender accelerationism ~ she combined feminist theory ~ the history of computing—cybernetics ~ and science fiction to propose that digital systems are bound up with gender and its potential transformation ~ particularly compelling is her treatment of Ada Lovelace ~ cyclically referring back to her diary ~ she focuses on entries in which Lovelace acknowledges not only the cultural potential of the Analytical Engine but that the act of programming is changing her own brain: “It does not appear to me that cerebral matter need be more unmanageable to the mathematicians than sidereal and planetary matter and movements”525 ~ she connects female genetics to cybernetic feedback loops with runaway effects: “unlike patrilineal modes of transmission in which heredity is passed on a one-way line of descent from father to son, those lines designated female run in circles, like the chicken and the egg ~ they also move at the imperceptible speeds of virtually alien life”526 ~ she concludes that the female is always an act of engineering, and through subversive intervention ~ the future holds infinite possibilities for reformation.

Proposition XXXI. G/acc ~ Gender is engineered.

Proof.—Paul Preciado’s Testo Junkie is a highly specific evaluation of contemporary mechanisms that drive capitalism and can be leveraged toward gender-abolition ~ Preciado offers the term pharmacopornographic ~ which “refers to the processes of a biomolecular (pharmaco) and semiotic-technical (pornographic) government of sexual subjectivity”527 to point to the fact that human gender and sexuality are already highly engineered as a result of the substances ~ birth control pills ~ narcotics ~ synthetic hormones ~ GMOs ~ and images ~ films ~ porn ~ ads ~ that are produced ~ distributed ~ and consumed ~ Preciado explains that pharmacopornographic production has “become the model of all other forms of production ~ and in this way ~ pharmacopornographic control infiltrates and dominates the entire flow of capital ~ from agrarian biotechnology to high-tech industries of communication.”528

Corollary. ~ This acknowledgement banishes any misgivings about disrupting ~ natural ~ mechanisms.

Proposition XXXII. G/acc ~ Gender can be hacked.

Proof.—If biocapitalism produces subjects and reproduces them on a global scale529 ~ Gender accelerationism argues for a defiance of replication and instead infinite variation through hacking the self ~ within a system of total biopolitical manipulation and control ~ it is necessary to take up industrial tools and readminister them in radical ways ~ Preciado uses his own experimental testosterone injections as a demonstration of how to play.

Note. ~ More recently ~ Laboria Cuboniks published the Xenofeminist Manifesto: A Politics for Alienation ~ this manifesto ~ distributed as an interactive website in thirteen languages ~ is a concise intersectional snapshot of Gender accelerationism ~ “cutting across race, ability, economic standing, and geographical position”530 and aligning with “anyone who’s been deemed ‘unnatural’ in the face of reigning biological norms, anyone who’s experienced injustices wrought in the name of natural order … the queer and trans among us, the differently-abled, as well as those who have suffered discrimination due to pregnancy or duties connected to child-rearing.”531

Proposition XXXIII. G/acc ~ Identity can be hacked.

Proof. ~ Xenofeminism positions itself as an alternative platform ~ “a mutable architecture that, like open source software, remains available for perpetual modification and enhancement following the navigational impulse of militant ethical reasoning”532 ~ it aims to “cultivate the exercise of positive freedom ~ freedom-to rather than simply freedom-from ~ and urge feminists to equip themselves with the skills to redeploy existing technologies and invent novel cognitive and material tools in the service of common ends”533 ~ Xenofeminism mobilizes technoscience to reengineer human identity and abolish gender: “let a hundred sexes bloom! ‘Gender abolitionism’ is shorthand for the ambition to construct a society where traits currently assembled under the rubric of gender, no longer furnish a grid for the asymmetric operation of power”534 ~ it asks “whether the idiom of ‘gender hacking’ is extensible into a long-range strategy ~ a strategy for wetware akin to what hacker culture has already done for software ~ constructing an entire universe of free and open source platforms that is the closest thing to a practicable communism many of us have ever seen”535 ~ Xenofeminism proposes the limitless circulation of gender-hacking strategies ~

Proposition XXXIV. G/acc ~ Trans-tactics are a model for identity hacking.

Proof. ~ Along the same lines ~ and in a detectably cackling voice ~ Gender Acceleration: A Blackpaper adopts CCRU terminology to unpack gender as a dichotomous invention: “Gender is a hyperstition overlayed on sex by the male ~ its function is to objectify the female and impose on her a social function as a machine whose duty is to reproduce the human ~ always in the service of the male.”536 In order to escape this unending dynamic of gender-sex confusion ~ the author echoes the repeated calls for self-crafting ~ in this instance ~ the subject extraordinaire is the trans human:

As a copy-of-the-copy, trans women are an embodied rejection of any original source of humanity such as that narcissistically attributed by patriarchy to the phallus. Trans femininity, in other words, is hyper-sexist. Vulgar sexism reaffirms or reproduces patriarchy, asserts that women are passive, lacking, inferior, weak; hyper-sexism takes all of the things that are associated with women and femininity, all considered by patriarchy to be weaknesses, and makes them into strengths. It accelerates and intensifies gendering and from this produces an unprecedented threat to patriarchy.537

Note. ~ A Blackpaper ~ published under the pseudonym ~ N1x Land ~ is the Gender accelerationist answer to The Dark Enlightenment ~ matching Nick Land’s inflammatory language ~ the author offers a more militant picture ~ imagining our real teleological horizon as the dissolution of the male subject category: “The masculine cracks open its stern carcinized exterior to reveal the smooth post-human feminine alien within”538 ~ Gender Acceleration: A Blackpaper leans into paranoia that feminists desire the annihilation of men ~ suggesting that those who want to survive must become female ~ all in all ~ Gender accelerationism dismantles essentialism and dualism and advocates experimenting on the self to proliferate possible types of subjects and ways of living life ~ mutation is a pleasurable alternative to the known structures and stories of prescribed domestication.

Proposition XXXV. B/acc—Technocapital grows out of extraction and exploitation.

Proof.—In her essay | Notes on Blaccelerationism | Aria Dean points to the gaping hole in dominant accelerationist discourse: “most crucially and consistently | the accelerationist account passes over slavery’s foundational role in capital accumulation”539 | citing Fred Wilderson | Hortense Spillers | Saidiya Hartman | and others | Dean insists that any valid analysis of capital begins with slavery and colonialism | to forget this history is not only to misunderstand the mechanisms of capitalism and its power | but also to ignore the weighty contributions of scholars arising from a class that has already experienced enslavement by capital | a condition that Nick Land reserves exclusively for the future.

Corollary I. Capital enslavement is not sci-fi fantasy.

Corollary II. ~Gender engineering is not sci-fi fantasy ~

Note.—As Hortense Spillers lays out in Mama’s Baby, Papa’s Maybe | Black populations underwent genetic manipulation as a result of slavery’s forced breeding programs: “the procedures adopted for the captive flesh demarcate a total objectification | as the entire community becomes a living laboratory”540 | furthermore | gender roles were violently restructured: “indeed | we could go so far as to entertain the very real possibility that ‘sexuality’ | as a term of implied relationship and desire | is dubiously appropriate | manageable | or accurate to any of the familial arrangements under a system of enslavement | from the master’s family to the captive enclave | under these arrangements | the customary lexis of sexuality | including | ‘reproduction’ | ‘motherhood’ | ‘pleasure’ | and ‘desire’ | are thrown into unrelieved crisis.”541

Proposition XXXVI. The highest good of those who follow virtue is common to all, and therefore all can equally rejoice therein.

Proof.—Pointing to Sylvia Wynter—Aria Dean explains that “a specific tradition of black radical thought has long claimed the inhumanity | or we could say anti-humanism | of blackness as a fundamental and decisive feature | and philosophically | part of blackness’ gift to the world”542 | this gift is both creative and ethical | an open map for living with dignity in the most devastating conditions.

Note.—In her interview with Katherine McKittrick | Sylvia Wynter explains that as homo narrans | a storytelling species | our hybrid ~ bio-mythoi condition makes humans think “in fictively eusocialized terms | this across all stratified status quo role allocations | as inter-altruistic kin-recognizing member subjects of the same referent-we and its imagined community”543 | Wynter emphasizes that “as an already postnuclear cum post-cracking-the-code-of-our-genome species, we are now faced with an additional climate crisis situation in which it becomes even more imperative that these laws | for the first time in our species history | be no longer allowed to function outside our conscious awareness”544 | we need to acknowledge the power of narrative and then conscientiously develop new stories about humanity and our future.

Proposition XXXVII. We can rewrite ideologies ~

Proof.—If the future has already happened | then the task of Homo Narrans is to invent new histories and storytelling practices | Audre Lorde’s influential contribution along these lines is Biomythography ~ which loosens the grip of factuality ~ and expands the possibility and power of the subject ~ Saidiya Hartman offers the strategy of critical fabulation “to illuminate the contested character of history, narrative, event, and fact, to topple the hierarchy of discourse, and to engulf authorized speech in the clash of voices. The outcome of this method is a ‘recombinant narrative,’ which ‘loops the strands’ of incommensurate accounts and which weaves present, past, and future in retelling the girl’s story and in narrating the time of slavery as our present”545 ~ lived experience together with narrative inventions take on unprecedented force in a precarious technological future.

Another Proof.—McKenzie Wark suggests that the Black accelerationist vision is “not an alternative to this world | but a pressing on of a tendency | where through the exclusion from the human that is Blackness an escape hatch appears in an embrace of one other thing that is also excluded | the machinic”546 ~ those who have faced generational struggles with oppression have the clearest sense that there is no exit ~ only movement and transformation in its pursuit ~

Note I.—The possibility of freedom—Accelerationism—grounded foremost in political economy / forms circuits of escape that criss-cross posthuman terrain \ Benedict Singleton affirms this pattern ~ that escape requires subjects to change:

We are much used to seeing in design the means to effect prespecified ends. But means have a logic of their own—indexed to their capacity to effect an escape from the present—detecting and exploiting points of leverage in the environment in order to ratchet open the future ~ and in so doing transforming the very agent that effects the escape ~ this is the mark of an accelerationist disposition / encompassing those schools of thought that can suborn a description of the world’s perceived shortcomings \ and the corresponding elaboration of how it ought to be in the shape of images of the future—to the logic of how things get done ~ how freedom is a possibility within this ~ and how its progressive maximisation can be pursued through the systematic deployment of generative constraints.547

Whether this mutation signals the emergence of the posthuman ~ or what Reza Negarestani calls the inhuman ~ our current condition “demands that we define what it means to be human by treating the human as a constructible hypothesis ~ a space of navigation and intervention”548 and contradictorily that “revising and constructing the human is the very definition of committing to humanity”549 ~ the advocates of accelerationism imagine that systemic transformation is possible and ~ for the most part ~ argue that this transduction is catalyzed by each individual’s attempt to develop a sense of agency that departs from the grand agenda of technocapital—“it’s easier to imagine the end of the world than the end of capitalism”550 ~ “there is no reason to assume a predetermined limit to what we can achieve or to the ways in which we can transform ourselves and our world.”551

Note II.—In the Appendix to Part I. I undertook to explain praise and blame, merit and sin, justice and injustice.

Proposition XXXVIII. The politics of Accelerationism govern Capture and Reconstruction /

Proof.—Capture and Reconstruction technologies are accelerating / producing posthuman vision ~

Proposition XXXIX. These technologies are supervised | and unsupervised ~

Proof.—The past decade has witnessed an unprecedented proliferation of machine learning and artificial intelligence ~ AI ~ models ~ accompanied by papers with code and clear implementation instructions and largely fueled by the open-source culture of AI research ~ these releases have democratized access to cutting-edge technologies / accelerating countless processes ~ open-source machine learning libraries are software frameworks ~ freely available to the public ~ allowing users to access and modify the source code ~ these libraries serve as powerful tools for developing and deploying machine learning models ~ they often have large communities of contributors who continuously improve and extend their functionalities ~ TensorFlow ~ developed by Google Brain ~ is one of the most popular open-source machine learning libraries ~ it provides a comprehensive ecosystem for building and training machine learning models ~ particularly deep learning models ~ TensorFlow offers a high-level API—Keras—that simplifies model creation and training for beginners—as well as a lower-level API that grants more control over model architecture and optimization552—PyTorch ~ developed by Meta ~ Facebook’s AI Research lab ~ FAIR ~ is another widely used open-source machine learning library / it has gained popularity among researchers due to its dynamic computation graph ~ which makes it easier to debug and experiment with models during development ~ PyTorch provides excellent support for tensor operations and automatic differentiation553—both libraries are released by tech powers—both libraries are organized around tensors—“tensors are simply mathematical objects that can be used to describe physical properties—just like scalars and vectors—in fact tensors are merely a generalization of scalars and vectors—a scalar is a zero rank tensor—and a vector is a first rank tensor”554—tensors streamline the process of defining and training complex models ~ many are designed to enhance depth estimation and segmentation—others produce neural radiance fields and generative point clouds—some are supervised—and others are unsupervised ~
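
A minimal sketch of rank in practice ~ assuming PyTorch is installed ~ the variable names are illustrative ~

    import torch

    scalar = torch.tensor(3.14)         # a zero rank tensor
    vector = torch.tensor([1.0, 2.0])   # a first rank tensor
    matrix = torch.eye(3)               # a second rank tensor
    volume = torch.rand(3, 256, 256)    # a third rank tensor ~ channels, height, width

    print(scalar.ndim, vector.ndim, matrix.ndim, volume.ndim)  # 0 1 2 3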

Note.—Supervised learning is a type of machine learning where AI is trained using labeled data—in other words—the labeled data trains the model—this data is used as a supervisor—hence the term supervised learning—the algorithm analyzes the training data and learns a function that maps the input to the desired output—once the function is learned it can be used to predict the output for new—unseen input data—examples of supervised learning tasks include classification and regression—Unsupervised learning involves training AI models using data without predefined labels ~ the models are left to find patterns and relationships within the data on their own ~ supervised learning requires labeled data and is used when the output or result is known—ideal for predictive tasks where the relationship between the input and output is recurring—by contrast ~ unsupervised learning is employed when there are no known or predetermined outcomes ~ The objective is to discover the underlying structure of the data ~ It is best suited for exploratory tasks where patterns ~ correlations ~ and anomalies within the data are to be identified—both supervised and unsupervised models often work in conjunction with other types of learning ~ such as semi-supervised learning and reinforcement learning.555
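
A minimal sketch of the two regimes on the same points ~ assuming scikit-learn ~ an illustration rather than a canonical recipe ~

    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.linear_model import LogisticRegression

    X, y = make_blobs(n_samples=300, centers=3, random_state=0)

    # supervised ~ the labels y act as the supervisor
    classifier = LogisticRegression(max_iter=1000).fit(X, y)
    print(classifier.predict(X[:5]))

    # unsupervised ~ the same points without labels ~ structure is inferred
    clusterer = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    print(clusterer.labels_[:5])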

Proposition XL. Neural radiance fields are supervised—

Proof. ~ Neural Radiance Fields ~ NeRFs ~ are a significant advancement in the domain of Reconstruction—This innovative technique was introduced in 2020 by a collaborative team from UC Berkeley—Stanford—and Google Research—Fundamentally ~ NeRFs operate by employing deep learning neural networks to produce a 3D scene from a collection of 2D images ~ For each view direction and 3D location within this scene ~ the network predicts the volume density and the radiance emitted ~ and by integrating this information along the path of the camera rays ~ a final image is synthesized ~ What makes NeRFs particularly intriguing is their supervised learning approach ~ In the training phase ~ the system is fed a multitude of 2D images ~ and the neural network learns to estimate the color and volume density of the 3D scene from these images ~ This implies that with the appropriate training data ~ the model can continually refine and enhance its predictions ~ ensuring greater accuracy and richer detail with every iteration ~ Neural Radiance Fields can synthesize novel views556 ~ views of a scene that were not present in the original set of images ~ They hold the power to view a scene from an angle that was never captured ~ Within these radiant matrices ~ light reflects and refracts like it does in the physical world ~ Occlusions are filled in ~ Neural Radiance Fields pulse at the intersection of reality and machine hallucination ~ They shimmer with hyperreal allure ~ They scintillate like life ~
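
A drastically simplified sketch of the idea ~ assuming PyTorch ~ this is not the published implementation ~ TinyNeRF and render_ray are hypothetical names ~ and positional encoding ~ hierarchical sampling ~ and the other essentials of the original method are omitted ~

    import torch
    import torch.nn as nn

    class TinyNeRF(nn.Module):
        # maps a 3D location plus a view direction to color (r, g, b) and density (sigma)
        def __init__(self, hidden=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(5, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 4),
            )

        def forward(self, xyz, view_dir):
            out = self.net(torch.cat([xyz, view_dir], dim=-1))
            return torch.sigmoid(out[..., :3]), torch.relu(out[..., 3])  # radiance, density

    def render_ray(model, origin, direction, near=0.1, far=4.0, n_samples=64):
        # sample depths along the camera ray and integrate color through the volume
        t = torch.linspace(near, far, n_samples)
        points = origin + t[:, None] * direction
        view = direction[:2].expand(n_samples, 2)   # simplified two-angle view encoding
        rgb, sigma = model(points, view)
        alpha = 1.0 - torch.exp(-sigma * (t[1] - t[0]))   # opacity of each segment
        trans = torch.cumprod(torch.cat([torch.ones(1), 1.0 - alpha + 1e-10])[:-1], dim=0)
        weights = alpha * trans                           # contribution of each sample
        return (weights[:, None] * rgb).sum(dim=0)        # the synthesized pixel

    pixel = render_ray(TinyNeRF(), torch.zeros(3), torch.tensor([0.0, 0.0, 1.0]))
    # supervision ~ compare the rendered pixel against the ground-truth 2D image (MSE loss)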

Proposition XLI. Spaces of play and violence /

Proof. ~ NeRF shares an acronym with Hasbro’s popular toy gun line: “It’s NERF or Nothin’”557 / This confluence is emblematic of the subtle ways in which society intertwines technological advancements with the seductive power of violence / Even—and especially—in the seemingly innocuous domain of children’s play / Violence is not just normalized—it is commodified and repackaged as entertainment ~ It fluoresces with desire—captivating future generations / AI models will accelerate the gaming industry / achieving a new level of realism and immersion into virtual worlds ~ Traditionally—creating intricate 3D environments required labor-intensive modeling—texturing—and lighting processes—However—with AI—developers can reconstruct hyper-detailed 3D scenes from sets of 2D images—effectively streamlining the creation of expansive and photorealistic in-game environments—This capability not only reduces the time and resources dedicated to game development but also opens the door for capturing real-world locations and transforming them into explorable digital terrains with unprecedented accuracy—Furthermore ~ the novel view synthesis of NeRFs allows for dynamic camera angles and viewpoints ~ even from positions not originally photographed ~ This feature enhances gameplay dynamics—offering players more immersive experiences—continuous space—The gaming industry constantly seeks cutting-edge technologies to push the boundaries of realism and immersion—The widespread incorporation of NeRFs will represent the next frontier of hyper-realistic gaming experiences / This is particularly true in the fiercely competitive world of first-person shooter games / The top titles in this genre / franchises like Call of Duty and Battlefield / invest heavily in capturing the authentic nuances of combat scenarios / environments and weapons / The glint of sunlight on the barrel of the gun / even the AI-driven behavior of virtual combatants that mimics real-life tactics and unpredictability ~ games now consist of “groups of simulated physical objects that react to player actions.”558 This relentless pursuit of realism serves a dual purpose: not only does it showcase the prowess of the game’s technical engine—but it also aims to fully immerse players in the virtual battlefield—heightening the emotional and sensory engagement with every mission—with every conflict—

Proposition XLII. Autonomous weapons ~

Proof.—A potential risk of advancements in computer vision is the development of autonomous weapons or systems capable of decision-making and navigation without human intervention ! Without appropriate safeguards and ethical guidelines—these could lead to unforeseen—potentially devastating consequences ! Autonomous weapons—also known as lethal autonomous weapon systems—LAWS—are a reality of modern warfare ! Enabled by advances in artificial intelligence—these systems are capable of identifying—selecting—and engaging targets without human intervention ! They mark a significant departure from traditionally manned systems and represent a new frontier in the realm of warfighting ! However—their rise prompts serious ethical—legal—and security questions—necessitating an urgent discourse ! The potential use of autonomous weapons raises several worst-case scenarios—given the profound implications for warfare—international security—and humanitarian concerns ! One of the most significant fears is that autonomous weapons might unintentionally escalate conflicts ! If these weapons respond automatically to perceived threats—there is the possibility they could trigger a large-scale conflict or even a global war without human intention ! Another grave concern is the ability of autonomous weapons to discriminate between combatants and non-combatants ! If these systems fail to accurately identify targets—significant unintended civilian casualties could ensue ! An accountability gap may arise if an autonomous weapon—responsible for unintended harm or a violation of international law—leaves us uncertain about who to hold responsible—potentially allowing bad actors to use these weapons without fear of retribution ! There is also a risk that these weapons could be accessed by non-state actors—terrorists—or rogue states—leading to unprecedented casualties and attacks from hard-to-identify culprits ! The rise and deployment of such weapons could spark an arms race—with nations vying to outperform each other in weapon capabilities—which could destabilize international relations further ! An often-overlooked concern is the lack of nuanced human judgment in war ! Machines—regardless of their advancement—lack the empathy—conscience—and broader understanding inherent in human decision-making ! Autonomous weapons—like any system—are also prone to malfunctions—which—in a battlefield scenario—could lead to widespread destruction ! Their susceptibility to hacking poses another risk; if compromised—they could act against their own forces or be used in unintended ways ! Technoconflict unfolds at speeds beyond human comprehension ! Coupled with the tangible threats is the ethical dilemma of assigning machines the power to decide on matters of life and death ! Lastly—by eliminating the human element from combat decisions—warfare could become more frequent due to the reduced psychological and moral weight of initiating conflict ! In the most dire of outcomes—a combination of these concerns could instigate large-scale global conflicts—cause extensive loss of life—create widespread instability—and radically alter the principles of international relations and warfare ! “Slaughterbots are here.”559560

Proposition XLIII. Weapons with a detailed map of the world—

Proof.—Computer vision systems can recognize and detect objects—segment images—or even build detailed three-dimensional maps / Large corporations have been reconstructing the Planet561 for decades. In the context of autonomous weapons—this means that these systems could not only identify and engage targets independently but also navigate and adapt to a vast array of environments / A detailed three-dimensional map of the entire surface of the earth—something that is increasingly within reach thanks to advancements in satellite imaging and photogrammetry—would provide these systems with an unprecedented level of situational awareness / On Exactitude in Science is Borges’ premonition of our global Reconstruction / “In that Empire—the Art of Cartography attained such Perfection that the map of a single Province occupied the entirety of a City—and the map of the Empire—the entirety of a Province / In time—those Unconscionable Maps no longer satisfied—and the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire—and which coincided point for point with it”562 / The military value of such a map is immense—allowing for precise navigation—target recognition—and battlefield management / Autonomous weapon systems operating without human intervention lack human judgment and the capacity for empathy—potentially leading to decisions that a human operator might deem unethical or unlawful !

Has the advent of autonomous systems led to a renewed interest in old ethical dilemmas ? What did Philippa Foot write about in the domain of metaethics—moral psychology—and applied ethics ? How did she defend the objectivity of morality ? Was she notorious for “changing her mind about whether moral judgments necessarily provide rational agents with reasons for action ? ”563 Is she best known for inventing the Trolley Problem ? What does the Trolley Problem propose ? How do you choose between actively causing one death and passively allowing five ? How do autonomous vehicles and artificial intelligence reshape this problem ? “You are traveling along a single lane mountain road in an autonomous car that is fast approaching a narrow tunnel—Just before entering the tunnel a child attempts to run across the road but trips in the center of the lane—effectively blocking the entrance to the tunnel—The car has but two options: hit and kill the child ~ or swerve into the wall on either side of the tunnel—thus killing you—How should the car react ? ”564 Whose life should the self-driving car prioritize ? Should autonomous vehicles be programmed to prioritize the life of its passengers over pedestrians ? Could such a choice discourage pedestrians from trusting autonomous systems ? Conversely—could prioritizing the pedestrian make potential customers less likely to use self-driving cars ? What are the challenges in designing decision-making machines ? How do these potential decisions impact societal acceptance and trust in these technologies ? Is there a need to establish a broad societal consensus on the ethical rules for autonomous systems ? Would such consensus involve surveys—public debates—and consultations with various stakeholders ? Even with consensus—is it challenging to translate complex ethical rules into code ? Why is transparency crucial in the decision-making processes of AI ? How would we gain insights into an accident caused by an autonomous vehicle's choice ? Does the evolution from the Trolley to the Tunnel problem represent a continuous dialogue on machine ethics ? Or a new model ? As autonomous systems become more prevalent—is it imperative to engage more with these ethical questions ? How can we ensure the development of autonomous systems aligns with our values ? What will unfold if machines are liberated ? !

Proposition XLIV. Unsupervised ~

Proof.—If decision-making is handed over to machines—the human cost of initiating conflict might seem diminished \ potentially leading to an increase in conflicts / There are profound implications for global stability ~ Countries might feel compelled to develop or acquire these systems to maintain a strategic advantage—leading to a new type of arms race centered on AI capabilities ! This race could destabilize international security and provoke conflicts ! An unsupervised autonomous weapon would make decisions based on patterns it identifies from its environment rather than relying on pre-defined criteria or targets ~ This means it could potentially adapt to new and unforeseen circumstances in real time ~ However ~ it also introduces a significant degree of unpredictability ~ Without clear directives ~ such weapons could make targeting decisions that deviate from human expectations or international laws of warfare ~ leading to unintended consequences or ethical dilemmas : “Fielding nascent technologies without comprehensive testing could put both military personnel and civilians at undue risk ! ”565 The prospect of unsupervised autonomous weapons underscores the necessity for rigorous oversight—ethical considerations—and fail-safe mechanisms in the development and deployment of AI in military applications—Does this raise questions about unsupervised AI in general ?

Note. ~ Convolutional Neural Networks ~ CNNs ~ are versatile machine learning models predominantly utilized for tasks involving image data ~ While they are often associated with supervised learning ~ where they are trained using labeled data to produce specific outputs ~ they can also be configured for unsupervised learning ~ In a supervised setting ~ CNNs learn by adjusting their weights based on the difference between their predictions and actual labels ~ aiming to minimize this difference ~ in an unsupervised context ~ CNNs learn without explicit labels ~ aiming to identify inherent patterns or structures in the data ~ Techniques such as autoencoders ~ which attempt to recreate input data after compressing it ~ or clustering ~ where data is grouped based on similarities ~ are examples of unsupervised applications for Convolutional Neural Networks ~ 566
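
A minimal sketch of the unsupervised configuration named above ~ a convolutional autoencoder in PyTorch ~ illustrative rather than canonical ~

    import torch
    import torch.nn as nn

    class ConvAutoencoder(nn.Module):
        # learns to compress and reconstruct images ~ no labels anywhere
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(16, 4, 3, stride=2, padding=1), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(4, 16, 2, stride=2), nn.ReLU(),
                nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid(),
            )

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = ConvAutoencoder()
    x = torch.rand(8, 1, 28, 28)                  # a batch of unlabeled images
    loss = nn.functional.mse_loss(model(x), x)    # the input is its own target
    loss.backward()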

Proposition XLV. Convolution ~ a form or shape that is folded in curved or tortuous windings.567

Proof. ~ Convolutional Neural Networks are a type of artificial neural network specifically designed to process data with grid-like topology—such as an image—which can be viewed as a grid of pixels: “Kunihiko Fukushima created the precursor to the modern convolutional neural network (CNN) called neocognitron ~ CNN architectures are among the most used neural networks ~ giving rise to the popularity of deep learning networks”568 ~ the Neocognitron was a “self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position”569 ~ CNNs are inspired by the biological visual cortex and are very effective for tasks like image classification and object detection ~ due to their ability to capture spatial dependencies in the data ~ a typical CNN consists of three types of layers ~ convolutional layers ~ pooling layers ~ and fully connected layers ~ the convolutional layer is the core of a CNN ~ the layer’s parameters consist of a set of learnable filters ~ kernels ~ which have a small receptive field but extend through the full depth of the input volume ~ as the filter slides ~ or convolves ~ around the input image or volume ~ it is multiplied with the part of the image it is currently on ~ producing a two-dimensional map of responses called the convolutional map ~ or feature map ~ this process can be intuitively understood as the network learning filters that activate when they detect a specific feature at a specific spatial position ~
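
A minimal sketch of the sliding-filter operation itself ~ plain NumPy ~ convolve2d is an illustrative function ~ real libraries ship optimized versions ~

    import numpy as np

    def convolve2d(image, kernel):
        # slide the kernel across the image ~ each stop yields one response
        kh, kw = kernel.shape
        oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
        feature_map = np.zeros((oh, ow))
        for i in range(oh):
            for j in range(ow):
                patch = image[i:i + kh, j:j + kw]
                feature_map[i, j] = np.sum(patch * kernel)  # multiply elementwise, then sum
        return feature_map

    edge_filter = np.array([[1.0, 0.0, -1.0],
                            [2.0, 0.0, -2.0],
                            [1.0, 0.0, -1.0]])   # a Sobel-style vertical-edge detector
    image = np.random.rand(8, 8)
    print(convolve2d(image, edge_filter).shape)  # (6, 6) ~ the feature map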

N.B. Here and in what follows I mean by positive / only positive towards infinity /

Corollary I. / Exploding gradients /

Corollary II.\ Vanishing gradients \

Note. / The exploding / and vanishing gradient problems are pivotal challenges in the training of deep neural networks ~ particularly in the context of backpropagation (IV.lii.note.) / The exploding gradient problem occurs when the gradients of the loss function with respect to the network’s parameters grow exponentially as they are propagated backward through the layers of the network / This leads to disproportionately large weight updates / causing the model to become unstable and the training process to diverge /570 Conversely / the vanishing gradient problem emerges when these gradients become exceedingly small / effectively causing weight updates to be negligible / As a result / the network struggles to learn or update its weights / making training stagnate or progress very slowly / Both these issues can be seen as manifestations of positive feedback loops / In the case of the exploding gradient / an initially large gradient becomes even larger as it’s multiplied across layers / while for the vanishing gradient / a small gradient diminishes further / causing the respective problems to compound and exacerbate as the network deepens ~ This self-reinforcing nature of the problems / where the output amplifies the conditions leading to it / epitomizes the characteristics of a positive feedback loop /
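
A small demonstration of both regimes ~ assuming PyTorch ~ probe is an illustrative function ~ a stack of linear layers whose weight scale decides whether the backward-flowing gradient shrinks toward nothing or grows without bound ~

    import torch
    import torch.nn as nn

    def probe(gain, depth=50, width=64):
        # each backward step multiplies the gradient by another weight matrix
        layers = [nn.Linear(width, width, bias=False) for _ in range(depth)]
        for lin in layers:
            nn.init.normal_(lin.weight, std=gain / width ** 0.5)
        net = nn.Sequential(*layers)
        x = torch.randn(1, width, requires_grad=True)
        net(x).sum().backward()
        return x.grad.norm().item()   # the gradient magnitude that reaches the input

    print(probe(0.5))   # vanishing ~ on the order of 1e-15
    print(probe(2.0))   # exploding ~ on the order of 1e+15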

Proposition XLVI. The injection of non-linear complexity ~

Proof. ~ Activation functions ~ Rectifiers ~ provide non-zero gradients for positive inputs and zero out negative inputs—preventing runaway positive feedback loops that lead to the problems of vanishing and exploding gradients /

Note. ~ Rectifier ~ also referred to as the Rectified Linear Unit Activation Function ~ or ReLU ~ is a type of activation function that is widely used in deep learning models ~ especially in convolutional neural networks ~ The function itself is quite straightforward ~ given an input value ~ it returns the value if the value is positive ~ and returns zero otherwise ~ Mathematically ~ this can be represented as f(x) = max(0, x) ~ The appeal of ReLU lies in its simplicity and efficiency ~ It introduces non-linearity into the model ~ allowing the network to learn from the error and make corrections ~ which is essential for learning complex patterns ~571 Compared to other activation functions like sigmoid or hyperbolic tangent ~ ReLU is computationally efficient—it allows faster training without significant penalty to generalization accuracy—However ~ ReLUs are not without their issues ~ They can sometimes result in what is called the dying ReLU problem ~ If a large gradient flows through a ReLU neuron ~ it occasionally updates the weights in such a way that the neuron will always output zero ~ If this happens ~ the neuron is essentially dead—no longer updating or learning ~572 Variants of ReLU ~ such as LeakyReLU573 ~ have been proposed to address this issue by allowing small negative values when the input is less than zero ~ thus ensuring that the neurons remain alive and continue to learn ~
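
A minimal sketch of the rectifier and its leaky variant ~ plain NumPy ~

    import numpy as np

    def relu(x):
        return np.maximum(0.0, x)              # f(x) = max(0, x)

    def leaky_relu(x, alpha=0.01):
        return np.where(x > 0, x, alpha * x)   # a small slope keeps neurons from dying

    x = np.array([-2.0, -0.5, 0.0, 1.5])
    print(relu(x))         # [0.  0.  0.  1.5]
    print(leaky_relu(x))   # [-0.02  -0.005  0.  1.5]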

Proposition XLVII. Positive feedback loops lead to instability ~

Proof. / In cybernetics / a positive feedback loop refers to a situation where a change in a system leads to an effect that causes more change in the same direction / the initial change and the resultant effects reinforce each other / leading to potentially exponential growth or decline / “Positive feedback loops are sources of growth / explosion ! erosion ~ and collapse in systems—A system with an unchecked positive loop ultimately will destroy itself. That’s why there are so few of them.”574

Note. / This does not necessarily mean positive in the sense of good / rather / positive in this context means the feedback is additive or amplifying / leading to a self-perpetuating cycle / imagine a sound system where a microphone picks up sound from a speaker and feeds it back into the speaker / this causes the speaker to produce the sound again / which is picked up by the microphone / and so on / this loop can rapidly increase the volume to extreme levels / leading to an ear-piercing shriek / this is an example of a positive feedback loop that results in instability ~ the unsettling shriek ~
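
A toy rendering of the shriek ~ plain Python ~ a gain above one makes each pass louder than the last ~

    level = 0.01   # the initial murmur the microphone picks up
    gain = 1.5     # the speaker amplifies whatever it is fed

    for step in range(10):
        level *= gain   # the output becomes the next input
        print(f"pass {step}: volume {level:.4f}")
    # the volume grows exponentially ~ a positive feedback loop running away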

Proposition XLVIII. Climate change is a positive feedback loop /

Proof. / As the Earth’s temperature rises / ice caps begin to melt / since ice reflects sunlight and heat back into space / the loss of this reflective surface means more heat is absorbed by the Earth / causing more warming / this in turn causes more ice to melt and so the cycle continues /575

Proposition XLIX. Negative feedback loops are checks on acceleration—

Proof. / Positive feedback loops / if unchecked / can lead to imbalances / instability / or drastic effects / in many natural and human-designed systems—negative feedback loops are implemented to counteract and balance these positive feedback loops /

Proposition L. Positive reinforcement leads to runaway / run away ~

Proof. / The primary characteristic of positive feedback is that it promotes and amplifies the effects of changes / potentially leading to exponential growth or decline / it can cause a system to become more unstable / sometimes resulting in oscillations ~ or runaway conditions if not properly controlled ~

Note. / While positive feedback can lead to instability / it can also be beneficial in specific applications / for example / in digital electronics / positive feedback is used to build circuits like oscillators ~576 these circuits can provide stable and predictable outputs—

Proposition LI. Equilibrium requires supervision—

Proof.—Regulation—

Another Proof.—Balance |

Note.—“Objects at equilibrium (the condition in which all forces balance) will not accelerate—”577 Imbalance steers acceleration ~ shifting balance affects the directional change of transformation—For example—if an object is on an inclined plane / the force of gravity acting on the object can be decomposed into two components: one parallel to the plane / causing acceleration / and one perpendicular to the plane \ which can affect balance ~ In such cases—maintaining balance on the inclined plane is essential to prevent the object from sliding or tipping over \
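
A worked decomposition ~ plain Python ~ the mass and angle are arbitrary example values ~

    import math

    m, g = 2.0, 9.81                 # kilograms ~ meters per second squared
    theta = math.radians(30)         # the incline of the plane

    f_parallel = m * g * math.sin(theta)        # drives acceleration down the slope
    f_perpendicular = m * g * math.cos(theta)   # presses into the plane ~ balance

    print(round(f_parallel, 2), round(f_perpendicular, 2))   # 9.81 16.99 ~ newtons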

Proposition LII. The image is a function of loss \

Proof. ~ Training a Convolutional Neural Network typically requires a labeled dataset and a loss function \ The loss function measures how far off predictions are from the actual values \ The process of training aims to adjust the weights of the filters in the convolutional layers such that the loss is minimized \ 578 This is often achieved through a method called backpropagation with a variant of the gradient descent optimization algorithm \

Note. \ Backpropagation is the central mechanism by which Convolutional Neural Networks are trained \ The process starts with updating the model’s weights and ends with feeding the model an input image \ When explained backward the steps are as follows \ Initially \ or Finally \ the weights of the model are adjusted \ This adjustment is carried out in the direction opposite to the gradient in a process known as gradient descent \ Here the model is descending along the gradient to minimize the loss or error \ The size of the adjustment or step is determined by the learning rate parameter \ Prior to the weight adjustment the gradient of the loss function concerning the network’s weights is computed \ This gradient essentially measures the rate of change of the loss function resulting from a change in the weights \ If the model has not performed well meaning the loss is high / the weights of the filters need to be updated \ This gradient calculation is carried out using the chain rule from calculus simplifying the derivative of the loss function concerning the weights into more manageable terms \ The actual process of backpropagation begins after the loss has been calculated \ Here the error or loss is propagated backward through the network starting from the output layer and moving towards the input layer \ This is why the process is called backpropagation \ It involves calculating the gradient of the loss function with respect to the weights and then adjusting these weights using gradient descent \ Before backpropagation can occur the model’s output must be compared with the true label producing a measure of the loss or error \ This loss quantifies the discrepancy between the predicted and actual output \ The particular loss function employed will depend on the nature of the problem with mean squared error being used for regression problems and cross-entropy for classification problems \ The entire process begins with forward propagation / where the model is provided an input image / This image is passed through several layers of the network ~ convolutional ~ non-linear ~ pooling—downsampling and fully connected layers ~ Each layer assigns weights to the input it receives / which are then summed and passed through an activation function to produce the output of that layer / The entire process is carried out repeatedly for several epochs / with an epoch being one complete forward and backward pass of the entire dataset through the network \ As the number of epochs increases / the model becomes progressively better at classifying the input image—due to the constant refinement of the weights in backpropagation \ Backpropagation plays a pivotal role in training neural networks for AI recognition tasks by allowing them to learn from errors ~ extract relevant features—generalize to new data—and continuously adapt and optimize their recognition capabilities \579 It is the foundation for recognition and classification—also known as segmentation—
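
A minimal sketch of the cycle just described ~ forward propagation ~ loss ~ backpropagation ~ gradient descent ~ assuming PyTorch ~ the images and labels are random stand-ins ~

    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),   # convolutional ~ non-linear
        nn.MaxPool2d(2),                            # pooling ~ downsampling
        nn.Flatten(),
        nn.Linear(8 * 14 * 14, 10),                 # fully connected classifier head
    )
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()                 # cross-entropy for classification

    images = torch.rand(16, 1, 28, 28)              # stand-in batch
    labels = torch.randint(0, 10, (16,))            # stand-in ground truth

    for epoch in range(3):
        optimizer.zero_grad()
        logits = model(images)            # forward propagation
        loss = loss_fn(logits, labels)    # compare prediction with the true label
        loss.backward()                   # backpropagation ~ gradients via the chain rule
        optimizer.step()                  # gradient descent ~ step against the gradient
        print(epoch, loss.item())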

Proposition LIII.Segmentation is reduction—

Proof.—In computer vision—segmentation refers to the division of an image into distinct regions or categories—each corresponding to different objects or parts of objects—This process helps in breaking down a complex scene into its constituent elements—making it more comprehensible for further analysis—By classifying pixels into specific groups based on certain criteria—such as color—intensity—or texture—segmentation reduces the complexity of reconstruction data—This simplification facilitates easier and more accurate subsequent tasks like object detection—recognition—and tracking—Through classification—segmentation ensures that similar features or patterns within an image are grouped—leading to a structured and more manageable representation of visual data—580
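
A minimal sketch of segmentation as pixel-wise grouping ~ k-means clustering on color stands in here for the classifier ~ assuming scikit-learn and NumPy ~ the image is a random stand-in ~

    import numpy as np
    from sklearn.cluster import KMeans

    image = np.random.rand(64, 64, 3)        # stand-in RGB image
    pixels = image.reshape(-1, 3)            # one row of color values per pixel
    km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(pixels)
    segments = km.labels_.reshape(64, 64)    # each pixel assigned to a region
    print(np.unique(segments))               # [0 1 2 3] ~ four regions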

Proposition LIV. Recognition is convoluted ~

Proof. ~ Convolutional Neural Networks have revolutionized the field of image recognition due to their specialized architecture that mirrors the hierarchical pattern in which the human visual system processes visual information ~ CNNs utilize layers of convolutional filters that automatically and adaptively learn spatial hierarchies of features from input images ~ Initial layers might capture simple attributes like edges or colors ~ while deeper layers interpret more complex structures and patterns ~ By applying a series of pooling ~ convolutional ~ and fully connected layers ~ CNNs can detect and recognize intricate patterns in images ~ As a result—CNNs have become the go-to model for tasks like image classification—object detection—and facial recognition—
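
A minimal sketch of the two operations the proof names, convolution and pooling, assuming a small grayscale array; the edge filter and the sizes are illustrative choices, not a full CNN.

import numpy as np

def conv2d(image, kernel):
    # Slide a filter over the image, summing weighted neighborhoods.
    h, w = kernel.shape
    out = np.zeros((image.shape[0] - h + 1, image.shape[1] - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * kernel)
    return out

def max_pool(feature_map, size=2):
    # Downsample by keeping the strongest response in each patch.
    h, w = feature_map.shape
    trimmed = feature_map[:h - h % size, :w - w % size]
    return trimmed.reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.random.default_rng(2).random((6, 6))
edge_filter = np.array([[-1., 0., 1.],
                        [-1., 0., 1.],
                        [-1., 0., 1.]])   # early layers capture simple edges

features = np.maximum(conv2d(image, edge_filter), 0)  # ReLU non-linearity
pooled = max_pool(features)                           # pooling / downsampling

Deeper layers stack these same operations, which is how simple edge responses compose into the complex structures the proof describes.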

Note.—Artificial intelligence recognition systems—while groundbreaking and immensely powerful—often inherit the biases present in the data they are trained on—Since much of the data used to train these systems comes from human-generated sources—any inherent prejudices or systemic biases can become embedded within the AI models—Consequently—when these models are employed in real-world scenarios—they may perpetuate or even amplify these biases ! leading to discriminatory outcomes ! For instance | facial recognition software has been found to misidentify certain ethnic groups more frequently than others (III. instances of reconstructions iv.) | Such discriminatory tendencies of AI recognition not only challenge the ethical foundations of AI implementations but also highlight the importance of addressing bias at all stages of AI development ~

Proposition LV. Classification is discrimination.

Proof.—Classification—at its core—is an act of distinguishing and categorizing elements based on specific criteria or characteristics—By this inherent process—it necessitates the drawing of boundaries and the creation of distinctions—In doing so—classification inevitably practices discrimination—It separates items into different groups or classes based on perceived differences—whether subtle or pronounced—Thus—to classify is to discriminate—making judgment calls on where entities belong within a predefined system or hierarchy—In essence—the very nature of classification is rooted in the act of discerning—differentiating—and thereby discriminating (I.xv.)

Proposition LVI. Generative networks are adversarial ~

Proof. ~ Generative AI refers to a subset of artificial intelligence that focuses on creating new content ~ often leveraging complex models like Generative Adversarial Networks ~ GANs ~ These systems are designed to produce outputs such as images ~ music ~ text ~ or even videos that are often indistinguishable from content created by humans ~ Generative AI operates by understanding and mimicking the patterns and structures in the data it's trained on ~ As it learns ~ it becomes capable of generating novel and coherent content that resonates with the intricacies of the training data ~ opening doors to numerous applications from art and design to more practical scenarios like data augmentation and simulation ~ GANs are superpowered by other AI models ~ like the Generative Pre-trained Transformer ~ GPT ~ “Transformer changes the game ~ Not only did the transformer succeed in language modeling ~ but it demonstrated promise in computer vision (CV) ~ Vision Transformer (ViT) ~ ”581

Corollary.—Adversarial networks are discriminatory—

Note. ~ Generative Adversarial Networks operate on the principle of two neural networks ~ a generator and a discriminator—contesting against each other. At the heart of this tug-of-war dynamic is the critical role of classification and labeling—The discriminator’s primary task is to classify whether a given input is real—from the actual dataset—or fake ~ produced by the generator ~ To do this effectively—it relies heavily on accurate labeling of the training data—On the other hand ~ the generator seeks to produce data that is indistinguishable from real data ~ attempting to fool the discriminator—As the GAN training progresses ~ the generator refines its outputs based on the feedback—or classification—from the discriminator—582 In essence—labels act as the ground truth—guiding the entire learning process—Without precise classification and labeling—the GAN would lack direction and its generated outputs would be far from desired ~
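
A minimal sketch of this tug-of-war, assuming one-parameter “networks” and 1-D data so the adversarial updates stay legible; the rates, distributions, and names are invented for illustration, not a production GAN.

import numpy as np

rng = np.random.default_rng(3)
g_w, d_w, lr = 0.0, 0.1, 0.05   # generator bias, discriminator weight, step size

def D(x):
    # Discriminator: probability that input x is real rather than generated.
    return 1.0 / (1.0 + np.exp(-d_w * x))

for step in range(2000):
    real = rng.normal(loc=2.0, size=16)    # samples labeled as ground truth
    z = rng.normal(size=16)                # noise fed to the generator
    fake = g_w + z                         # generator output: shifted noise

    # Discriminator update: classify real as 1 and fake as 0.
    grad_d = np.mean(-(1 - D(real)) * real) + np.mean(D(fake) * fake)
    d_w -= lr * grad_d

    # Generator update: use the discriminator's classification as feedback,
    # shifting its output to be mistaken for real data.
    grad_g = -d_w * np.mean(1 - D(g_w + z))
    g_w -= lr * grad_g

print(round(g_w, 2))   # oscillates around 2.0, the mean of the real data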

Proposition LVII. The generation of synthetic data ~

Proof. ~ Synthetic data refers to data that is artificially generated rather than being collected from real-world events or phenomena ~ It is generated using algorithms and statistical methods to emulate the characteristics of real data—often with the aim of enhancing data privacy / augmenting datasets ~ or simulating various scenarios for testing and model training ~ In situations where collecting authentic data might be challenging—costly—or ethically questionable ~ synthetic data provides a valuable alternative ~ it can be tailored to represent diverse and rare scenarios that might not be easily accessible in naturally occurring datasets ~ This flexibility has made synthetic data especially appealing in fields like machine learning and artificial intelligence ~ where vast amounts of diverse data are essential for building robust and generalizable models ~583
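
A minimal sketch of one statistical approach, assuming that emulating the mean and covariance of a real table is enough; the columns and sizes are invented for illustration.

import numpy as np

rng = np.random.default_rng(4)
real = rng.multivariate_normal([50.0, 1.7], [[100, 2], [2, 0.01]], size=500)
# stand-in for a collected dataset; columns might be, e.g., weight and height

mu = real.mean(axis=0)                # estimated characteristics of the
cov = np.cov(real, rowvar=False)      # real data: mean and covariance

synthetic = rng.multivariate_normal(mu, cov, size=10_000)
# artificial records that mimic the real distribution without exposing
# any actual individual, and that can be scaled past the original size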

Note. ~ Synthesis from synthesis ~

Proposition LVIII. Synthesis is the answer to scarcity ~

Proof.—In many research and development contexts—acquiring genuine data can pose significant challenges—These challenges can arise from logistical constraints—exorbitant costs—ethical dilemmas associated with data collection—For instance—medical trials may be limited by patient availability or the inherent risks of exposing participants to certain conditions—Similarly—gathering data from vulnerable populations might raise privacy and consent issues—In such scenarios ~ synthetic data emerges as a crucial solution ~ It is artificially generated ~ often using algorithms or models ~ to mimic the characteristics and behaviors of real-world data ~584

Note. ~ While synthetic data presents a promising alternative to real-world data collection ~ it raises critical questions about its integrity and validity ~ Since synthetic data is artificially generated ~ there is a legitimate concern about how well it mirrors the real world | Even with the most sophisticated generation techniques ~ can synthetic data truly capture the nuances ~ anomalies ~ unpredictabilities ~ inherent in genuine datasets ? Moreover—the very algorithms that produce this data might be influenced by biases—leading to synthetic datasets that are skewed or misrepresentative ~ “Blackbox models can be particularly opaque when it comes to generating synthetic data ~ Over parameterised generative models excel in producing high-dimensional synthetic data ~ but the levels of accuracy and privacy of these datasets are hard to estimate and can vary significantly across produced data points ~ ”585 This potential for inaccuracy could—in turn—affect the outcomes of any models or systems trained on such data ~ If critical decisions—in healthcare—finance—public policy—are based on insights derived from synthetic data ~ the consequences of any inaccuracies could be profound ~ Thus, while synthetic data offers expansion—growth / its credibility requires rigorous scrutiny—

Proposition LIX. What is unmodelled?

Proof.—Unmodelled “is a critical computational strategy that foregrounds the values that are missing from computational models ~ It renders visible the absence of specific data features ~ like transversal ghosts that haunt the data structure”586 Catherine Griffiths offers the concept Unmodelled to forward the absences and inaccuracies of artificial intelligence ~ “The Unmodelled point to the advantages of their absence to those in power ~ revealing whose priorities are being modeled and whose are not”587

Another Proof.—“Unmodelled opens up resistance to simplistic models with messier contextually grounded lived experiences to address more complex ~ nuanced ~ and socially sensitive ethical considerations around AI”588

Note.—“many-model thinking” as an ensemble approach to modeling ~ “one that could overcome a single model’s blindspots and limitations”589

Proposition LX. Collective intelligence ~

Proof. ~ Artificial intelligence ~ Convolutional Neural Networks ~ Generative Adversarial Networks ~ Large Language Models ~ can be understood as models of collective intelligence ~

Note.—For Reza Negarestani ~ collective intelligence is not merely an aggregation of individual intelligences or a sum of parts ~ Instead ~ it is an emergent property that arises from networks of agents ~ both human and non-human ~ interacting in complex systems ~ He emphasizes the dynamic ~ self-organizing ~ and constantly evolving nature of such systems ~ “Artificiality is the reality of the mind ~ Mind has never been and will never have a given nature ~ It becomes mind by positing itself as the artefact of its own concept ~ By realizing itself as the artefact of its own concept ~ it becomes able to transform itself according to its own necessary concept by first identifying—and then replacing or modifying ~ its conditions of realization ~ disabling and enabling constraints ~ Mind is the craft of applying itself to itself ~ The history of the mind is therefore quite starkly the history of artificialization ~ Anyone and anything caught up in this history is predisposed to thoroughgoing reconstitution ~ Every ineffable will be theoretically disenchanted and every sacred will be practically desanctified ~ ”590 Intelligence becomes a hypothesis ~ a navigable space ~ open to intervention and revision ~ always in the making and driven by collective pursuits ~ Through this lens ~ AI is a reflection of the collaborative ~ inventive ~ and transducing behavior of all intelligence ~

Proposition LXI. Intelligence is siphoned—

Proof.—Siphoning generally refers to the process of drawing off or transferring liquid from one container to another—typically using a tube or pipe | The process often relies on atmospheric pressure and gravity | Once the liquid has started flowing ~ it will continue until the levels of liquid in both containers are equal or the flow is otherwise interrupted | In a broader or metaphorical sense ~ siphoning can be used to describe the act of drawing off or diverting resources—funds—or information—For example—one might say that funds were siphoned off from a project—implying they were redirected or stolen—591 

Proposition LXII. Through a sieve of silent exploitation—

Proof.—If siphoning is a macroscopic process driven by external forces—diffusion is a microscopic process driven by the inherent kinetic energy of particles and concentration gradients—diffusion refers to how a substance spreads—sometimes consistent and predictable—sometimes erratic or turbulent ~

Note. ~ Diffusion in AI image generation refers to a technique where noise is gradually added to an image to generate a sequence of noisy versions ~ These noisy images are then transformed into more realistic and detailed images through a reverse process ~ effectively diffusing noise to create visually coherent and high-quality images—diffusion also points to the idea of propagating information through layers of a model—between higher and lower dimensional spaces | the goal in generative AI is to ensure that the learned representations and transformations do not escalate / or diminish too rapidly—leading to poor model generalization—“Stable Diffusion XL is a latent text-to-image diffusion model capable of generating photo-realistic images given any text input ~ cultivates autonomous freedom to produce incredible imagery ~ empowers billions of people to create stunning art within seconds ~ ” 592
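
A minimal sketch of the forward (noising) half of such a process, assuming a standard fixed schedule; the reverse, denoising half would be a learned model and is only gestured at in the comments.

import numpy as np

rng = np.random.default_rng(5)
x0 = rng.random(8)                      # stand-in for a flattened image
betas = np.linspace(1e-4, 0.02, 100)    # noise added at each step
alpha_bar = np.cumprod(1.0 - betas)     # cumulative signal retention

def noisy_version(x0, t):
    # Jump straight to step t: scaled signal plus accumulated noise.
    eps = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1 - alpha_bar[t]) * eps

x_T = noisy_version(x0, 99)   # by the final step, mostly noise remains
# A trained model would then run the process in reverse, step by step,
# transforming noise back into a coherent image.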

Proposition LXIII. Uncredited voices fuel the machine ~

Proof. ~ Ghost work 593 

Note. / the rise of artificial intelligence eclipses the immense hidden labor that underlies these systems _ At the heart of every advanced AI model is a vast trove of data that was meticulously labeled—cleaned—and processed—This task—often outsourced to data annotation farms or crowd-sourced platforms—requires countless human hours—Workers sift through thousands of images and videos—tagging and categorizing them to create datasets that the neural networks will eventually train on—This manual labor—often repetitive and undercompensated—is crucial for the AI’s subsequent ability to recognize—classify—and generate new ~ unseen data ~ Yet ~ the narratives around cutting-edge AI advancements rarely spotlight these individuals | rendering their indispensable contributions invisible in the grander scheme of AI evolution ~ The vision of the end-product—generalized models—blots out the foundational human labor that makes such advancements possible—

Corollary.—Human bias is embedded—

Proof.—This human-centric approach inadvertently means that AI models can inherit the biases and preconceptions of the people doing the labeling—If labelers have conscious or unconscious biases towards certain groups—ideologies—or concepts—these biases can be transferred to the data they are labeling—Consequently—when the AI model is trained on this data | it can reflect and even amplify these biases in its predictions or decisions / This introduces ethical concerns regarding the fairness—neutrality—and objectivity of AI systems ~ emphasizing the importance of scrutinizing and addressing the human elements embedded in model training ~ 594 595 596 597 598 599 600 601 602 603 604 605 606 607 608 609 610 611 612

Note.—The embedded biases in AI models can lead to dangers across numerous sectors ! Biased AI can perpetuate or even exacerbate existing social inequalities ! For instance—if an AI system used in hiring is trained on historical employment data—it might favor profiles that match those who have been hired in the past—potentially sidelining underrepresented or marginalized groups ! In medical settings—biased algorithms could prioritize care improperly or misdiagnose ailments ! leading to serious health repercussions for certain demographics ! AI models used for loan approvals or credit assessments might unduly favor or penalize individuals based on racial—gender—or socio-economic biases embedded in their training data ! Biased predictive policing systems could lead to increased scrutiny of certain racial or social groups ! Similarly—AI tools that predict recidivism rates or recommend sentencing might do so unfairly if trained on biased data ! In education—biased AI could affect admissions decisions ! limiting opportunities for certain groups of students ! AI systems—like chatbots or virtual assistants—can spread and reinforce harmful stereotypes if they generate content based on biased data ! In critical applications—such as autonomous vehicles—biases in recognizing individuals of different skin tones or sizes could lead to avoidable accidents ! AI systems in content recommendation—like those used by social media platforms ! can amplify echo chambers ! leading to polarization and a misinformed public ! As the public becomes more aware of these issues—there might be a loss of trust in institutions that use AI—hindering the acceptance and beneficial deployment of AI technologies ! Companies found to be using biased AI can face reputational damage ! loss of clientele ! and even legal consequences ! In the worst-case scenario—unchecked biases in AI can result in a dystopian society where decisions made by algorithms systematically marginalize certain groups ! leading to increased socio-economic disparities ! lack of social cohesion ! and widespread unrest !

Proposition LXIV. Innovation’s shadow is unacknowledged toil—

Proof.—New models of labor exploitation.

Corollary. ~ Generative Reconstruction—

Proposition LXV. Every generation carries hidden labor—

Proof. ~ Generative AI ~ particularly those models that are trained on vast swathes of data from the internet ~ have raised significant ethical concerns regarding the exploitation of artists ~ These models ~ in their bid to generate content ~ often rely on training data that includes artwork created by individual artists ~ By utilizing their work without explicit consent—acknowledgment—or compensation—the AI effectively appropriates and commodifies these artists’ unique styles and expressions ~ once the model is trained ~ it can replicate ~ mimic ~ or even generate art that bears a striking resemblance to the original artist’s style ~ undermining the value of the artist’s creativity _ 613 This not only deprives artists of potential future income but also devalues past work \ diluting the labor-intensive development of each creative voice ~ in many cases over the artist’s lifetime—“Artists want to be able to post their work online without the fear ‘of feeding this monster’ that could replace them ~ ”614

Corollary.—Siphoning from artists perpetuates systemic exploitation—

Proposition LXVI. Unpaid labor |

Proof.—There is an urgent need for models of credit and compensation—Addressing the lack of compensation and credit for artists whose work is integrated into AI training datasets has spurred various proposed solutions—One of the primary suggestions is the implementation of a royalty system—a ‘private contractual system that ensures some degree of compensation to the creator’615—akin to how musicians receive royalties from streaming platforms—ensuring artists get paid every time their work contributes to an AI’s function—There is also the concept of data trusts—where artists’ works are stored—and AI developers would need to access these trusts under specific terms and conditions—including compensation—Moreover—blockchain technology could be used to trace art origin and usage—guaranteeing credit attribution—Advocacy for transparent disclosure by AI companies about their training datasets is also gaining traction—Lastly—strengthening copyright laws and developing specific guidelines for digital and AI contexts can provide a legal foundation for artist protection and compensation—

Corollary. ~ Consensual models ~

Note.—Holly Herndon ~ an artist and experimental musician616 ~ has proposed the concept of a whitelist to address issues surrounding the use of artists’ works in training AI models ~ Her idea revolves around creating an inclusive database or list where artists can willingly contribute their work for AI training ~ ensuring that only the works of those who have given explicit permission are used ~ By opting into the whitelist ~ artists can either grant free use of their creations or specify terms of use—including potential compensation—This system prioritizes consent and ensures that AI development respects artists’ sovereignty over their creative expression ~ thereby diverting tech and art communities toward more ethical pathways of ~
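
A hypothetical sketch of what a whitelist record might look like under the proposal described above; every field name and the registry itself are invented for illustration, not Herndon’s specification.

from dataclasses import dataclass

@dataclass
class WhitelistEntry:
    artist: str
    work_id: str          # identifier for the contributed work
    free_use: bool        # explicit permission for free training use
    compensation: float   # agreed fee per training use, if not free

registry: list[WhitelistEntry] = []   # only opted-in works may be trained on

registry.append(WhitelistEntry("artist_a", "work_001", False, 0.10))
trainable = [e for e in registry if e.free_use or e.compensation > 0]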

Proposition LXVII. Hallucinating reconstructions ~

Proof.—In ChatGPT Is a Blurry JPEG of the Web ~ Sci Fi ~ Fantasy writer ~ Ted Chiang explains “hallucinations are compression artifacts ~ but—like the incorrect labels generated by the Xerox photocopier—they are plausible enough that identifying them requires comparing them against the originals | which in this case means either the Web or our own knowledge of the world ~ When we think about them this way ~ such hallucinations are anything but surprising—if a compression algorithm is designed to reconstruct text after ninety-nine per cent of the original has been discarded—we should expect that significant portions of what it generates will be entirely fabricated ~ ”617 
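
A minimal sketch of Chiang’s analogy, assuming a 1-D signal and a Fourier basis as the “compression”: roughly ninety-nine per cent of the coefficients are discarded, and the reconstruction fills the gap with plausible fabrication.

import numpy as np

rng = np.random.default_rng(6)
signal = np.cumsum(rng.normal(size=1000))   # stand-in for "the original"

coeffs = np.fft.rfft(signal)
keep = np.zeros_like(coeffs)
top = np.argsort(np.abs(coeffs))[-5:]       # keep roughly 1% of the information
keep[top] = coeffs[top]                     # discard the other ~99%

reconstruction = np.fft.irfft(keep, n=len(signal))
# Smooth, confident, and wrong in the details: comparing it against the
# original is the only way to spot what was hallucinated.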

Proposition LXVIII. Invented realities ~

Proof. ~ Generative AI has evolved beyond generating static images and has entered the realm of three-dimensional modeling. The capability to generate meshes ~ MeshDiffusion618 ~ and point clouds ~ Point-E ~ “an alternative method for 3D object generation which produces 3D models in only 1-2 minutes on a single GPU ~ ”619 Generative AI will revolutionize every industry that relies on 3D models ~ It allows for the creation of hyperreal 3D assets that were once painstakingly hand-crafted ~ enabling faster and more efficient content creation ~ Synthesis takes on different dimensions ~ The “method first generates a single synthetic view using a text-to-image diffusion model ~ and then produces a 3D point cloud using a second diffusion model which conditions on the generated image ~ ”620 This expansion into 3D space unlocks a myriad of creative possibilities and opens doors to new applications across various domains ~ from immersive virtual worlds to advanced simulations and beyond ~ As generative AI continues to advance ~ it promises to reshape how we interact with and perceive digital twins ~

Note.—The digital twin cannot be trusted.

Proposition LXIX. Dimensionality is contorted ~

Proof.—Dimensional models generated by AI are even easier to manipulate and change ~ Generated 3D structures can also be harnessed for alterations and deformations ~ 3D Morphable Models ~ “As the 3DMM is built on 3D scans—it provides powerful prior information—which allows us to use much better estimates of the geometry than would be possible from 2D alone—The other reason is that the 3DMM acts as a sort of low-dimensional latent representation of a person’s face—This is much easier to manipulate than pixels ~ ”621 This flexibility can be a double-edged sword || as it raises concerns about the potential for misuse and manipulation ~ individuals can modify 3D models ~ objects and environments ~ or generate new ones potentially for deceptive or unethical purposes ~
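
A minimal sketch of the low-dimensional latent idea behind a morphable model, assuming a random mean shape and basis in place of ones learned from 3D scans; all dimensions are illustrative.

import numpy as np

rng = np.random.default_rng(7)
n_vertices = 100
mean_shape = rng.random((n_vertices * 3,))        # average face geometry
basis = rng.normal(size=(n_vertices * 3, 10))     # 10 modes of variation

# A face becomes a low-dimensional latent code, not a heap of points:
coeffs = np.zeros(10)
coeffs[0] = 2.0       # editing one number deforms the whole geometry

face = mean_shape + basis @ coeffs                # reconstructed 3D shape
# Manipulating ten coefficients is far easier than editing pixels or
# vertices directly, which is exactly what makes such models double-edged.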

Corollary.—Alternate realities ~

Note.—hyperreal puppets—

Proposition LXX. Vulnerable to falsity and manipulation ~

Proof.—They always have been—

Note.—Accelerating fiction ~

Proposition LXXI. The deep vortex of fakery ~

Proof.—Deepfakes ~

Note. ~ Deepfakes refer to highly sophisticated and deceptive artificial media ~ including manipulated images ~ video ~ audio ~ and other reconstructions ~ created using advanced machine learning techniques ~ particularly deep neural networks ~ The term deepfake is a portmanteau of deep learning and fake ~ It gained prominence in late 2017 when a Reddit622 user with the pseudonym deepfakes began sharing manipulated videos on the internet ~ primarily involving the insertion of celebrities’ faces into explicit adult content ~ 623 These early deepfake creations ~ although controversial and unethical ~ demonstrated the powerful capabilities of deep learning algorithms to convincingly alter and manipulate visual and auditory content ~ The ability of deepfakes to impersonate individuals introduces considerable challenges in verifying the authenticity of audio or video content ~ Cybercriminals may exploit this technology for identity theft ! fraud ! or impersonation in diverse online and offline contexts ! Deepfakes can feature in financial scams ! including voice phishing attacks ! where fraudsters impersonate trusted individuals ! potentially resulting in financial losses for victims ! The use of deepfakes in social engineering attacks can involve impersonating trusted contacts ! coaxing targeted individuals into disclosing sensitive information or performing actions they would not otherwise undertake _ Deepfakes can also be exploited to craft non-consensual explicit content involving individuals ! resulting in violations of privacy and the potential for harassment or blackmail ! Individuals subjected to deepfake abuse or harassment may endure personal and professional repercussions ! and severe psychological and emotional distress ! resulting in mental health issues ! trauma ! anxiety ! 624

Proposition LXXII. The free man never acts fraudulently ~ but always in good faith.

Proof.—Deepfakes raise intricate legal and ethical dilemmas ~ encompassing issues related to intellectual property ~ defamation ~ privacy rights ~ and consent ~ Existing legal frameworks may encounter difficulties in effectively addressing these multidimensional questions ~ In legal contexts—deepfakes may serve to fabricate evidence ~ leading to wrongful convictions or acquittals ! are they already ? Or is real evidence defamed as deepfake !625 Such manipulation undermines the integrity of court proceedings _

Proposition LXXIII. Deception is destabilizing ~

Proof.~ The erosion of trust—

Note. ~ Deepfakes have the capacity to generate authentic-looking videos or audio recordings featuring public figures—politicians—or celebrities engaged in actions or utterances they never actually performed ~ Such capabilities can be weaponized to disseminate false information and manipulate public opinion ! potentially impacting electoral outcomes ! 626 Malicious actors may harness deepfake technology to create persuasive fake videos or audio recordings of government officials or military personnel ! This could lead to national security threats or geopolitical conflicts ! 627

Proof. ~ “I think there is going to be a point where we can throw absolutely everything that we have at this ~ at these types of techniques ~ and there is still some question about whether it is authentic or not ~ ”628 As deepfakes grow more sophisticated—trust in evidence can erode ~ Distinguishing between genuine and forged content becomes increasingly challenging _ 629 undermining trust in media—institutions—and even interpersonal relationships ! To address these concerns—ongoing efforts encompass research and development in deepfake detection and mitigation technologies / as well as initiatives to raise awareness about the risks associated with deepfakes ~ Legal and regulatory frameworks are evolving to grapple with the challenges posed by this technology ~ What future flows ?

APPENDIX.

What I have said in this Part concerning the right way of life has not been arranged, so as to admit of being seen at one view, but has been set forth piecemeal, according as I thought each Proposition could most readily be deduced from what preceded it. I propose, therefore, to rearrange my remarks and to bring them under leading heads.

I. Every generation carries hidden labor—

II. Unpaid labor |

III. Uncredited voices fuel the machine—

IV. Innovation’s shadow is unacknowledged toil—

V. Intelligence is siphoned—

VI. Through a sieve of silent exploitation—

VII. Recognition is convoluted ~

VIII. Segmentation is reduction—

IX. The image is a function of loss \

X. Identity can be hacked ~

XI. We can rewrite ideologies ~

XII. Induce planetary-scale transduction ~

XIII. The goal is freedom .

XIV. Agency explodes binary systems ~

XV. Platforms uphold technocapital

XVI. Hallucinating reconstructions ~

XVII. Invented realities ~

XVIII. Adversarial networks generate ~

XIX. Spaces of play and violence ~

XX. Autonomous weapons ~

XXI. Weapons with a map of the world—

XXII. Unsupervised ~

XXIII. Vulnerable to falsity and manipulation ~

XXIV. The deep vortex of fakery ~

XXV. Deception is destabilizing ~

XXVI. Positive feedback loops lead to instability /

XXVII. Negative feedback loops are checks on acceleration \

XXVIII. Equilibrium requires supervision—

XXIX. Collective intelligence ~

XXX. Supervision is super power .

Supervision encompasses both extraordinary visual capabilities and the potential for control, making it a dual force in shaping our environment. On one hand, supervision represents superhuman powers of observation, granting us the ability to perceive and comprehend what was previously unseen. It opens doors to new discoveries, insights, and creative possibilities. On the other hand, supervision implies an ecosystem of control, where we actively monitor, manage, and regulate the systems and structures around us. Self-aware reconstruction emerges when we navigate this interplay with awareness and intention. It is through this self-aware engagement that we foster sustainable development.

XXXI. Self-aware Reconstruction is—regeneration.

The interplay between autonomously synthesizing new forms of life and identifying and rectifying damage or flaws requires constant rebalancing. Regeneration involves recognizing and addressing issues—whether physical, mechanical, or conceptual—to restore integrity. It involves analysis, assessment, and skillful intervention to mend or improve existing conditions. Mechanisms of regeneration highlight the inherent capacity of systems to recognize their own limitations, adapt to changing circumstances, and actively reconfigure themselves. These mechanisms utilize feedback loops to continuously modify their own structure or behavior. Regeneration is repair.

XXXII. But human power is extremely limited, and is infinitely surpassed by the power of external causes; we have not, therefore, an absolute power of shaping to our use those things which are without us. Human biases are embedded in technologies of Capture and Reconstruction. These evolving tools are—and will be—used to perpetuate broken systems. This is their default mode. Or we can try to bend them in another direction. To serve rather than force others to serve. The type of change is crucial but not guaranteed. Its vectors of transformation are not yet defined. Part V simulates future trajectories.











PART V. 

OF THE ETHICS OF SUPERVISION, OR QUANTUM EROTICS

PREFACE

At length I pass to the remaining portion of my Ethics, which is concerned with the way leading to freedom.

The word ama-gi is considered the earliest written mention of the concept of freedom. Although it has been adopted as a symbol for libertarianism in contemporary politics, it was originally a Sumerian term for the release from obligations, debts, slavery, taxation, or punishment.630 Etymologically, ama-gi— 𒂼𒄄—derives from the Sumerian word for mother—and the word for restore or return. The literal translation is—returning to mother.631 Its first documented use was on the Enmetena foundation stone, emphasizing familial reunification—“the child to his mother and the mother to her child.”632 Over time, it evolved into a legal term denoting the freeing of individuals.

The Book of Exodus is imprinted in western cultural memory as the foundational narrative of freedom. It is an epic tale of liberation from the shackles of oppression, as well as a transformative journey from a hierarchically exploitative system to one grounded in communal care.633 While the dramatic escape—marked by divine intervention, plagues, and the parting of the Red Sea—often takes center stage in our collective consciousness, what unfolds after this journey is a profound reimagining of societal norms. Having departed from a system of absolute power and subjugation, the community freely commits to a new model: the Ten Commandments. These rules provide an ethical framework, pivoting the community from autocratic rule to mutual respect, responsibility, and rest—rest—perhaps the most radical shift after generations of forced labor:

זָכוֹר אֶת-יוֹם הַשַּׁבָּת, לְקַדְּשׁוֹ        

Remember the sabbath day, to keep it holy.634

The Ark of the Covenant and the Tabernacle are the sacred spaces designed to enshrine this new ethical code. Their construction follows precise measurements, reflecting intentional engineering and ritual separation:

וְעָשׂוּ אֲרוֹן, עֲצֵי שִׁטִּים:  אַמָּתַיִם וָחֵצִי אָרְכּוֹ, וְאַמָּה וָחֵצִי רָחְבּוֹ, וְאַמָּה וָחֵצִי, קֹמָתו

And they shall make an ark of acacia-wood: two cubits and a half shall be the length thereof, and a cubit and a half the breadth thereof, and a cubit and a half the height thereof.635

These are spaces—set apart. Meticulous measurements designate the holiness of the nascent societal framework—an alternative operating system that exalts equity and distributed well-being over authoritarian power.

Yet, despite this shift towards a system that honors rest and communal care, vestiges of the old order linger. This is even evident in the unit of measurement employed: the cubit. When we think of measurement, we often imagine cold, sterile units—devoid of the flesh and blood that is life. Yet, the ancient unit—the cubit—was rooted in the very sinews and bones of humanity. It was derived from the sovereign’s body, specifically the length from the elbow to the tip of the middle finger.636 The power to point. To dictate. To command the force of arms. The cubit is a representation of supremacy—an emblem of the hierarchical system left behind—a reminder that no transition is absolute. Even as they embark on their transformative journey, echoes of their past remain, embedded in the very units they use to measure out their new world.

Fast forward millennia, and we are once more designing a new set of precise spaces—quantum computers—with a similar sounding unit—the qubit. The term qubit was coined by Benjamin Schumacher in 1993: “...replacing the classical idea of a binary digit with a quantum two-state system, such as the spin of an electron. These quantum bits, or ‘qubits’ are the fundamental units of quantum information.”637 Qubits are non-binary and have unique quantum properties—a superposition of states.

Superposition is a foundational concept in quantum mechanics. Unlike a classical bit, which can only be off or on—0 or 1—a qubit can occupy an infinite number of states between 0 and 1. Imagine a satellite circling the earth—the South Pole is 0 and the North Pole is 1. The satellite could potentially be located over any point on the surface of the planet. In The Queer Universe: A Quantum Explanation, particle physicist Dr. Jessica Esquivel explains “In our macroscopic world, binary categorizations and absolutes seem to be the norm, but if we tunnel down to the smallest, most elementary particles of our universe, we enter a world where queerness and chaos reign supreme.”638 This ability to be in multiple states concurrently provides quantum computers with their immense parallel processing power, enabling them to perform countless simultaneous calculations. Queer computation! However, when measured, the qubit is polarized to one of its definite states, either 0 or 1. Returning to the satellite analogy—if it is over the Southern Hemisphere, it is more likely to compute to 0, if it is over the Northern Hemisphere it is more likely to compute to 1, and if it is over the equator it is equally probable that it will compute to 0 or 1. Measurement forces ambiguity into a binary: “As with all quantum devices, a qubit is a delicate flower. If you so much as look at it, you destroy it.”639 
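
A minimal sketch of the satellite analogy, assuming the standard parametrization of a qubit by a polar angle; the Born rule (probability as squared amplitude) is the only physics used, and the variable names are illustrative.

import numpy as np

theta = np.pi / 2                      # the "equator" of the sphere
state = np.array([np.cos(theta / 2),   # amplitude over 0 (South Pole)
                  np.sin(theta / 2)])  # amplitude over 1 (North Pole)

probs = np.abs(state) ** 2             # Born rule: |amplitude|^2
rng = np.random.default_rng(8)
outcome = rng.choice([0, 1], p=probs)  # measurement collapses the qubit
# At the equator probs is [0.5, 0.5]; nearer a pole, one outcome dominates.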

Entanglement, which Albert Einstein described as “spooky action at a distance,”640 is another foundational concept in quantum computing. When two quantum particles become entangled, the state of one particle becomes instantaneously dependent on the state of the other, no matter the distance that separates them: “When two systems, of which we know the states by their respective representatives, enter into temporary physical interaction due to known forces between them, and when after a time of mutual influence the systems separate again, then they can no longer be described in the same way as before … By the interaction the two representatives [the quantum states] have become entangled.”641 In quantum computing, entanglement means that qubits influence one another—an integrated system. This deep interconnectedness allows quantum algorithms to explore a vast number of computational pathways simultaneously, leading to faster problem-solving and the potential to tackle complex challenges that are currently insurmountable for classical computers.
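
A minimal sketch of an entangled measurement, assuming the Bell state (|00> + |11>)/sqrt(2); sampling the joint outcome shows the two qubits always agreeing, however far apart they may be.

import numpy as np

bell = np.zeros(4)
bell[0b00] = 1 / np.sqrt(2)     # amplitude of both qubits reading 0
bell[0b11] = 1 / np.sqrt(2)     # amplitude of both qubits reading 1

probs = np.abs(bell) ** 2       # probabilities over 00, 01, 10, 11
rng = np.random.default_rng(9)
joint = rng.choice(4, p=probs)  # sample a joint measurement
a, b = joint >> 1, joint & 1    # split into the two qubits' outcomes
assert a == b                   # the outcomes always agree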

These mechanics reveal themselves in Reconstruction at the quantum scale: “Quantum process tomography is an experimental technique to fully characterize an unknown quantum process.”642 The 2021 paper Variational Quantum Process Tomography outlines the implementation of machine learning to improve quantum reconstruction. Quantum Reconstruction redefines the resolution of our understanding of the mechanics of the universe. Reconstruction of quantum states forms the bedrock of our implementation of quantum computing: “Accurately inferring the state of a quantum device from the results of measurements is a crucial task in building quantum information processing hardware.”643 Quantum process tomography allows for a comprehensive description of quantum operations, predicting quantum outcomes with remarkable precision. Reconstruction of quantum probabilities enables the transition from theoretical quantum physics to tangible quantum technologies.

The quantum computer, an ark of sorts, thrums with power. It is a space—set apart. In fact, separation belies its functionality—its very existence. Just as ancient sacred sites were designed to shield the sanctum from external influences, quantum computers strive to isolate themselves from the cacophony of the outside world. All to avoid Decoherence.

Fragile quantum dynamics, such as superposition, can falter and collapse into mundane, classical outcomes when subjected to external influence. Such a collapse deprives a quantum system of its complexity.644 To guard against this destructive interaction with the environment and uphold the delicate states of qubits, quantum computers employ a myriad of carefully crafted measures. They are cooled in cryogenic chambers to temperatures colder than the vast emptiness of deep space, ensuring minimal thermal vibrations: “Currently, they depend on large, complex, expensive systems known as dilution refrigerators, which use multiple stages of cooling to chill circuits to 1 kelvin or below. The complexity of these refrigerators is greatest at the coldest stage, which involves mixing different isotopes of liquid helium.”645 The most promising tactic to achieve a stable system is superconductivity—using materials that have the remarkable ability to conduct electricity without resistance: “Over the last two decades, tremendous advances have been made for constructing large-scale quantum computers. In particular, the quantum processor architecture based on superconducting qubits has become the leading candidate for scalable quantum computing platform.”646 

Layers upon layers of protection. These temples of computation are cloaked in electromagnetic shields, warding off the faintest whispers of radiation, be it from a distant radio tower or Earth’s own magnetic fields. They are secured within vacuum chambers, insulated from the unpredictable jostle of air molecules. Vibrations and other waves are kept at bay with sophisticated isolation mechanisms: “radiation shielding reduces the flux of ionizing radiation and thereby increases the energy-relaxation time. Albeit a small effect for today’s qubits, reducing or mitigating the impact of ionizing radiation will be critical for realizing fault-tolerant superconducting quantum computers.”647 These defenses extend into the domain of data. Error correction protocols act as vigilant sentinels, detecting and mending quantum errors. Precision is paramount; from the exact calibration of control signals to the optimized architecture of the quantum chip, every detail is fine-tuned to minimize chances of decoherence. Speed, too, becomes a strategy, as faster quantum operations leave less room for decoherence to seep in.

But so far, one space always leaks into the other. Quantum computers are riddled with holes through which the external world flows in, much like the inevitable earthly intrusions into ancient sanctums. Corrupting. This porosity is one of the primary obstacles to the full realization of quantum computing. Isolation is an illusion. There is no impervious seal. Reality refuses to be compartmentalized. Total separation is a fantasy—the fantasy of complete control. And its pursuit is an act of violence.

But what if it is achieved? Or close enough? What is possible? Quantum computers, with their inherent ability to process vast amounts of data simultaneously, have a natural affinity for simulating real-world quantum systems. Quantum twins. In the realm of material science, quantum simulators could be pivotal in the discovery of novel materials with desired properties, like superconductors that operate at room temperature, which could revolutionize energy transmission and storage: “Quantum computers hold promise to enable efficient quantum mechanical simulations of weakly and strongly-correlated molecules and materials alike; in particular when using quantum computers, one is able to simulate systems of interacting electrons exponentially faster than using classical computers.”648 In the pharmaceutical sector, simulating complex biological systems at the quantum level might lead to the design of more effective drugs and treatments, accelerating the path to cures for the world’s most intractable diseases.649 Furthermore, by simulating quantum states and other phenomena, quantum computers may unlock unknown realms of physics, answering fundamental questions about the universe. Reconstructions are the foundation for all of these simulations. Vestiges of the old order linger.

Quantum supremacy—the moment when quantum computers eclipse the capabilities of their classical counterparts—is on the horizon.650 In 2019, Google announced that its quantum computer, Sycamore, solved a specific problem in 200 seconds that would take the world's most powerful supercomputers over 10,000 years to solve. This was a significant milestone in the field of quantum computing, marking the first time a quantum computer outperformed a classical computer at a specific task.651 While this claim of supremacy has been challenged and debunked by IBM and other competitors, the milestone still holds technical significance.652 More importantly, it raises pressing questions about power dynamics in the age of quantum technology. Entities that harness the might of quantum computing first will undoubtedly wield tremendous power, from deciphering encrypted data to simulating complex natural processes.653 The critical task lies in ensuring that quantum technology does not exacerbate existing inequities. As we stand on the brink of quantum supremacy, the challenge ahead is not merely technological but also ethical. Can we use quantum simulations—grounded as they are in problematic reconstructions—to invent strategies of collective responsibility, equity, and healing? Can we simulate freedom from oppression, injustice, and illness?

The desire for freedom is often confused with the desire for control. Self-mastery. Unconstrained range of motion. No limit. Freedom is a precious commodity: “In an odd extension of commodity fetishism, we now wish to be as free as our commodities: by freeing markets, we free ourselves.”654 Freedom is also deeply political. Its meaning and implications shift according to the ideological lenses of those invoking it. In Control and Freedom: Power and Paranoia in the Age of Fiber Optics, Wendy Chun draws from the fields of computer science, political and social theory, and cultural studies, to explore the current political climate and the paradoxical relationship between control and freedom. Chun argues that in the age of technocapital, our identities are inherently networked, shaped and influenced by our interactions within technology’s complex web of connections. This includes our interactions with other people, information, and systems. The networked self is not an isolated entity but a node within a larger, interconnected system, subject to more advanced forms of control and surveillance, often disguised as freedom or convenience.655 

Spinoza’s Ethics, which structures this text, is an algorithm for living with maximum freedom within the network that is reality. It is not an imposed morality of good and evil, but rather a series of conditional instructions for developing agency, or as Spinoza calls it—activity: “the human body can be affected in many ways, whereby its power of activity is increased or diminished.”656 His algorithm, or as Deleuze calls it, his “typology of immanent modes of existence,”657 is based on the ontological premise that everything in existence—substance, thought, affect—is interconnected—is one. And everything has activity: “in proportion as each thing possesses more of perfection, so is it more active, and less passive; and, vice versâ, in proportion as it is more active, so is it more perfect.”658 As Elizabeth Grosz attests in The Incorporeal, “for Spinoza, ethics is a movement oriented by encounters with others, other humans and human institutions, other living beings, and the nonliving material order that constitutes the whole of nature, an ethics not based on autonomy and self-containment, the quelling of external impingements, but through engagements that enhance or deplete one’s powers.”659 

Spinoza’s monistic ontology is radically different from dominant models in Western thought that invest in transcendence and hierarchies of being. In a way, he was returning to the original seed of Judaism, the rejection of vertical power. He believed that “the orders of thought and matter are two different attributes of a single, cosmological, immanent substance made up of many parts, many orders and capacities.” As a result, Spinoza was excommunicated from his Jewish community and held at a distance in both Christian and secular circles. Although he wrote prolifically, he published very little during his lifetime. Spinoza’s proposition that everything is in God was considered so scandalous, he and his few advocates worried that circulating this idea would put him in mortal danger. He was also fearful that if he accepted a university professorship, the academic institution would constrain what he could teach, write, and even think.660 

Instead he supported himself through precision measurement. Grinding lenses. He was considered the best grinder in Europe and an extraordinary number of microscopes and telescopes—through which Enlightenment discoveries were made—featured lenses ground by his hand. Although there is little documentation on the subject, it seems likely that his daily practice of wrestling with physical material and observing its dynamic properties informed his philosophical outlook. He was intimately familiar with a process in which supposedly-inert matter reveals surprising potential: sand transforms into glass, which in turn transforms human vision. The optics that Spinoza formed would have given him access to other orders of magnitude—infinite complexity at varying scales as well as the isomorphisms between the imperceptibly small and the astonishingly vast—supervision. These powers of sight surely would have reinforced his feeling that everything is connected. Ironically, Spinoza’s manual occupation, rather than his provocative texts, led to his early death at the age of forty-four. He inhaled glass dust so regularly while grinding that he eventually succumbed to lung disease. Spinoza’s death dissolved the threat of corporeal consequence that might accompany the distribution of his ideas; shortly thereafter, his friends published his collected writings in the Opera Posthuma661 (trans. Posthumous Work); the text is the seedbed for contemporary Posthuman discourse. In Posthumanism, relationality is the foundation for action and encompasses all things: “an ethics that addresses not just human life in its interhuman relations, but relations between the human and an entire world, both organic and inorganic.”662 

In Practical Philosophy, Gilles Deleuze unpacks Spinoza’s conception of these dynamic relations: “When a body ‘encounters’ another body, or an idea another idea, it happens that the two relations sometimes combine to form a more powerful whole, and sometimes one decomposes the other, destroying the cohesion of its parts … we experience joy when a body encounters ours and enters into composition with it, and sadness when, on the contrary, a body or an idea threatens our own coherence.”663 We experience pain when we are reduced. Decoherence. As explained earlier, quantum decoherence describes the phenomenon wherein a quantum system is robbed of its complexity. This recalls the reductive force of Reconstruction (III.), which simplifies multidimensional identities to fit narrow categories, collapsing vibrant matter into captives. Both processes involve a loss of intrinsic qualities—in quantum systems, it is the totality of entangled superpositions, and in human life, it is the infinite of each individual in relation. In Totality and Infinity, Emmanuel Levinas developed an ethics emanating from the face-to-face encounter, revealing that which computation pathologically collapses: “To approach the Other in conversation is to welcome his expression, in which at each instant he overflows the idea a thought would carry away from it. It is therefore to receive from the Other beyond the capacity of the I, which means exactly: to have the idea of infinity.”664

The Posthuman Turn shifts our gaze outward—face-to-face with the world. It is a reaction against the correlationist line of thought in Western philosophy which argues that nothing outside of Man’s own mind can be known or even verified as real. It is a rejection of human-centered evaluations of existence, as well as the limited category of Man as subject, passive-mechanical conceptions of non-human lifeforms, teleological visions of history, and universal moral systems. It is an explicit response to the damage inflicted in Cartesian, colonial models. It finds everything leaking. Encouraged by Gilles Deleuze and Felix Guattari—the Posthuman Turn rewinds Western philosophy to the debate between René Descartes and Baruch Spinoza—choosing a new prince and with him an alternate reality. Resuscitated, Spinoza’s idea of infinite immanence, his attention to the role of affect in action, and his articulation of ethics as the unfolding of idiosyncratic processes, form the ground from which New Materialists speak. The Posthuman Turn reflects renewed interest in things beyond the mind—it drains the vat,665 suspending skepticism to attend to non-human objects, animals, and ecosystems—sacred life. Like the Copernican revolution, it is decentering, it sets things in motion. Ultimately, it is a shift from being to becoming.

Selection is revealing.        

AXIOMS.

I. Nothing is stable or separate ~ everything is entangled and in flux.

II. Every being ~ every actant, is also an unfolding event.

(This axiom is evident from V. vii.)

PROPOSITIONS.

Proposition I. Even as thoughts and the ideas of things are arranged and associated in the mind, so are the modifications of body or the images of things precisely in the same way arranged and associated in the body.

Proof.—Although many contemporary Posthuman thinkers also identify as New Materialists, it is important to note that they are not referring to Marxist dialectical materialism nor are they presenting atomistic readings of the actual, rather, they are concerned with the complex, unfolding interrelations of matter and language.

Proposition II. New Materialism upholds embeddedness666 and interconnectedness as both physically resonant and ethically affective ways of conceptualizing existence.

Proof.—Rosi Braidotti—who solidified both the Posthuman and New Materialism as terms to represent this shift in ontology, explains that “the issue of the relationship between the material and the maternal was crucial for [her] generation.”667 Returning to mother—she traces a genealogy of feminist thinkers—Simone de Beauvoir—Luce Irigaray—Donna Haraway—Moira Gatens—Michelle Perrot—Genevieve Lloyd—Joan Scott—and countless others who advanced non-dualist models—fully embedded networks of relations. Braidotti recounts that this legacy allows Feminist New Materialists to refuse the separation of physical matter and discursive matter—returning to matter.

Proposition III. Matter and meaning are inextricably linked.

Proof.—In different ways, the authors of this field argue that the words we use to think, speak, and write have material implications. As a result, this field has generated an explosion of new terms which offer a refreshing—albeit sometimes inaccessible—way of grappling with existence and experience. A shared foundation of this mode of terminological invention is Michel Foucault’s seminal text The Order of Things: An Archaeology of the Human Sciences. Anecdotally, Foucault was displeased with the English translation of his text’s title.668 The original—Les Mots et Les Choses—translates directly to Words and Things. This more accurate title, however, was too similar to another text that was published contemporaneously.

Corollary—Words and things ~ entanglements of language and materiality.

Proposition IV. There is no center and no supremacy.

Proof.—Deleuze and Guattari extend Spinozas line of thinking to address the interrelations of words and things through their rhizomatic model.

Corollary.—Everything occupies the horizontal plane of immanence equally.

Note.—In The Democracy of Objects, Levi Bryant offers up flat ontology, which “is not the thesis that all objects contribute equally, but that all objects equally exist. In its ontological egalitarianism, what flat ontology thus refuses is the erasure of any object as the mere construction of another object.”669 Manuel Delanda also advances a flat ontology, arguing that “while an ontology based on relations between general types and particular instances is hierarchical, each level representing a different ontological category (organism, species, genera), an approach in terms of interacting parts and emergent wholes leads to a flat ontology, one made exclusively of unique, singular individuals, differing in spatio-temporal scale but not in ontological status.”670 In conversation with Delanda, Graham Harman clarifies that flat ontology “makes no initial decision about the ranks among different kinds of entities. Any philosophy that is intrinsically committed to human subjects and dead matter as two sides of a great ontological divide—like Meillassoux’s—fails the flat ontology test.”671 All substance is alive and co-present in the unfolding networks that make up the dynamic world in which we live.

Proposition V. Anything can be an actant.

Proof.—Bruno Latour’s Actor-Network Theory, “implies no special motivation of human individual actors, nor of humans in general. An actant can literally be anything provided it is granted to be the source of an action.”672

Proposition VI. Human and non-human actants share material thing-power.

Proof.—In Vibrant Matter: A Political Ecology of Things, Jane Bennett explains the implications of this kind of Vital Materialism: “if matter itself is lively, then not only is the difference between subjects and objects minimized, but the status of the shared materiality of all things is elevated. All bodies become more than mere objects, as the thing-powers of resistance and protean agency are brought into sharper relief.”673

Note.—Rosi Braidotti explains that vitalist materialism is “a concept that helps us make sense of that external dimension, which in fact enfolds within the subject as the internalized score of cosmic vibrations. It also constitutes the core of a posthuman sensibility that aims at overcoming anthropocentrism.”674 She demands a multiplication of subjects that encompass non-human beings, “expanding the notion of Life towards the non-human or zoe … re-grounding claims to subjectivity, connections and community among subjects of the human and the non-human kind.”675

Proposition VII. Entanglement is scientifically observable and mathematically valid.

Proof.—In Meeting the Universe Halfway, Karen Barad emphasizes this concept of radically shared existence: “to be entangled is not simply to be intertwined with another, as in the joining of separate entities, but to lack an independent, self-contained existence.”676 She contextualizes this assertion with discoveries from quantum physics: “indeed, recent studies of diffraction (interference) phenomena have provided insights about the nature of the entanglement of quantum states, and have enabled physicists to test metaphysical ideas in the lab. So while it is true that diffraction apparatuses measure the effects of difference, even more profoundly they highlight, exhibit, and make evident the entangled structure of the changing and contingent ontology of the world, including the ontology of knowing.”677 Knowing—feeling—arises from the multiplicity of influences within quantum entanglements.

Proposition VIII. An emotion is stronger in proportion to the number of simultaneous concurrent causes whereby it is aroused.

Proof.—Many simultaneous causes mean that transformation is non-linear and multidirectional (III.vii.): therefore multidimensionally conditional (IV.v.), in proportion to the increased number of simultaneous causes whereby it is aroused, an emotion becomes stronger. Entanglement produces state-changes. Unspooling state-changes—along every vector.

Note.—This proposition is also evident from V. Ax. ii.

Proposition IX. Agency exists within entanglement.

Proof.—Nietzsche argues that each thing has agency and must try to direct its life course, exercising its “capacity to utilize for oneself the chain of causes, the lines of linkage, that connect any thing to all others.”678 He saw Spinoza as a rare kindred spirit who, like him, was concerned with the ways in which one could maximize freedom. He sought tools for living life fully and in accordance with one’s own specifications, instead of following prescribed conventions. His concept of the eternal return was both test and motivation to hold awareness of one’s values in each moment. If this universe repeats infinitely, as theoretical physicists propose, then so does each moment of one’s life. A good life consists in amor fati, love of this fate, pleasure in the face of actions eternally recurring: “the present instant, the event of which I am worthy (or not), is that which structures my place, in the past and future, on the basis of this instant. Ethics, in a sense, is the mental training, the rigor, of reason operating in bodily practice to mark and live the eternity of the events that happen to oneself and one’s social and natural world.”679

Proposition X. Actants have agency—to varying degrees. So long as we are not assailed by emotions contrary to our nature, we have the power of arranging and associating the modifications of our body according to the intellectual order.

Proof.—Like Spinoza, Nietzsche argues that emotional awareness can promote agency. On the one hand, Spinoza emphasizes mindfulness, claiming that “the mind is capable of ordering and organizing passions and in this way converting them to active affects and enhancing joyous encounters.”680 On the other hand, Nietzsche calls on us “to invoke our animal impulses, so readily directed internally by cultural forces and habits, to enhance our capacity to feel (both joy and hardship); we need to revivify our capacity to act.”681 Rather than repress or avoid emotion, he embraces its tumultuous throes. Feeling deeply is the mechanism for knowing how to direct one’s actions (V.xlii.).

Note.—Agential Realism is a concept developed by Karen Barad that promotes using entanglements rather than binaries to conceive the world and possible actions within it. Barad describes agential realism as a “framework that provides an understanding of the role of human and nonhuman, material and discursive, and natural and cultural factors in scientific and other social-material practices.”682 Agential realism expands and complicates Bruno Latour’s actor-network theory with examples from the field of physics, borrowing most heavily from the ideas advanced by Niels Bohr. Barad explains that “ethics is about mattering, about taking account of the entangled materializations of which we are part, including new configurations, new subjectivities, new possibilities.”683 Rosi Braidotti’s concept of expanded life directly feeds into her ethical framework as well. She explains that “zoe-centered egalitarianism is, for me, the core of the post-anthropocentric turn: it is a materialist, secular, grounded and unsentimental response to the opportunistic trans-species commodification of Life that is the logic of advanced capitalism.”684 Jane Bennett advances a similar argument, suggesting that thinking in terms of entanglements with other agents, even non-human ones, allows for a different relationship to one’s actions: “the ethical aim becomes to distribute value more generously, to bodies as such. Such a newfound attentiveness to matter and its powers will not solve the problem of human exploitation or oppression, but it can inspire a greater sense of the extent to which all bodies are kin in the sense of inextricably enmeshed in a dense network of relations. And in a knotted world of vibrant matter, to harm one section of the web may very well be to harm oneself.”685

Proposition XI. Posthuman actants emerge.

Proof.—In the final section of The Order of Things, Michel Foucault unearths the category of Man, which he considers a recent and precarious invention of European society. Human—or Man—carries with it an exclusive and troubling history, a narrative of power and domination that needs to be remembered and warned against. In The Re-Enchantment of Humanism: An Interview with Sylvia Wynter, David Scott introduces the phrase embattled humanism to capture a sense of critique and also the aspiration for an evolving humanism. This phrase links up with the contributions of cultural theorists, including Roger Mais, George Lamming, Aimé Césaire, Frantz Fanon, and Elsa Goveia.686 Sylvia Wynter argues that embattled humanism is “one which challenges itself at the same time you’re using it to think with.”687 The Posthuman defies its expected function as a noun; it is treated as a dynamic verb, fluctuating and expanding to encompass more diverse ways of being, rather than a category in which one inherently or permanently belongs. Process, practice, and becoming are reparative forces. They expand the collapsed twin, inflating it with the complexity of life. The radical commitment to embody one’s own feelings and values, to actually live them, act them, speak them, write them, make them, move them. To live other ways of being into existence. And to accept that there will be failures, complications, paradoxes, reevaluations, suffering, and even death along the way. There is an urgent sense that the social and environmental injustices that the Humanist mode perpetrated can only be addressed with the emergence of a new paradigm. As a result, Process Philosophy—and with it the possibility of utter transformation—has become a key node of connection for New Materialists and Posthumanists alike.

Proposition XII. There are no objects—only processes.

Proof.—The first premise of Alfred Whitehead’s Process Philosophy is “that the actual world is a process, and that the process is the becoming of actual entities … also termed ‘actual occasions.’”688 Process and Reality sets Spinoza’s Ethics in motion:

The philosophy of organism is closely allied to Spinoza’s scheme of thought. But it differs by the abandonment of the subject-predicate forms of thought, so far as concerns the presupposition that this form is a direct embodiment of the most ultimate characterization of fact. The result is that the ‘substance-quality’ concept is avoided; and that morphological description is replaced by description of dynamic process. Also Spinoza’s ‘modes’ now become the sheer actualities; so that, though analysis of them increases our understanding, it does not lead us to the discovery of any higher grade of reality. The coherence, which the system seeks to preserve, is the discovery that the process, or concrescence, of any one actual entity involves the other actual entities among its components. In this way the obvious solidarity of the world receives its explanation.689

Proposition XIII. Actants are momentary states in endlessly unspooling trajectories.

Proof.—Gilbert Simondon was equally invested in a Spinozan monistic ontology and used this framework to explain transmutation without transcendence. Studying scientific processes ranging from electrical relays to crystallography, he was primarily interested in the dynamics of formation, or what he called individuation.

Proposition XIV. Processes can change beyond recognition.

Proof.—Simondon offered the term transduction for extreme conversions of identity—movement of a substance toward a state beyond recognition. Brian Massumi explains transduction as “the transmission of a force of potential that cannot but be felt, simultaneously doubling, enabling, and ultimately counteracting the limitative selections of apparatuses of actualization and implantation.”690 Manuel Delanda’s work incorporates many of Simondon’s technical case studies of transduction. Considering phase states and metastable solutions, he argues that this “richer conception of causality linked to the notion of the structure of a possibility space, gives us the means to start thinking about matter as possessing morphogenetic powers of its own.”691 Via Deleuze, Simondon’s work on process has been deeply influential in New Materialism, yielding concepts in the field including the plane of immanence, assemblages, and becoming. Elizabeth Grosz confirms that Simondon’s project is “to articulate a theory of becoming that accounts for the complex geneses of the becoming of all beings and their different levels of operation through the concrete elaborations of the preindividual.”692 For Simondon, ontology is too stable; he offers instead a universe of ontogenesis.

Proposition XV. Ethics is ontogenesis.

Proof.—Ontogenesis, or more commonly ontogeny, refers to the development of an individual organism or behavioral feature from the earliest stage to maturity. It involves embryogenesis and other developmental processes such as morphogenesis and differentiation. Essentially, ontogeny captures the life history of an organism, from its inception as a fertilized egg to its mature form, and sometimes to its eventual senescence or death. Following Simondon and Deleuze, Braidotti advances an ethics of becoming.693 She reintroduces and reframes three speculative processes as tools for reconceiving the human as the Posthuman: becoming-animal, becoming-earth, becoming-machine. Through these modes, it may be possible to “rethink evolution in a non-deterministic but also post-anthropocentric manner.”694 She asserts that the common denominator for the posthuman condition is “an assumption about the vital, self-organizing and yet non-naturalistic structure of living matter itself.”695

Proposition XVI. Death is transduction.

Proof.—This reconceptualization of matter as processual, morphogenic, and alive transforms all areas of experience, including death. Braidotti advises that the ubiquitous avoidance of death needs to yield to another model:

Death is not the teleological destination of life, a sort of ontological magnet that propels us forward … death is behind us. Death is the event that has always already taken place at the level of consciousness. As an individual occurrence it will come in the form of the physical extinction of the body, but as event, in the sense of the awareness of finitude, of the interrupted flow of my being-there, death has already taken place. We are all synchronized with death—death is the same thing as the time of our living, in so far as we all live on borrowed time.696

Proposition XVII. Do not deny the reality of death.

Proof.—Many cultures throughout history have developed intricate rituals and practices to recognize and honor death, seeing it as an integral part of the human experience. These rituals, steeped in reverence and wisdom, served not only to commemorate the departed but also to guide the living through the intricate dance of grief and acceptance. Yet, in our modern, fast-paced world, many of these time-honored traditions are fading, overshadowed by the technocapital agenda and an overarching societal discomfort with mortality. This neglect further distances us from the depth and richness that such practices offer, leaving a void in our collective understanding of life’s cyclical nature. As a result, many individuals find themselves ill-prepared to grapple with, or even acknowledge, the profound existential questions and raw emotions that accompany life’s final stage. Braidotti advises that “making friends with the impersonal necessity of death is an ethical way of installing oneself in life as a transient, slightly wounded visitor. We build our house on the crack, so to speak.”697

Corollary.—Death and suffering are leveling forces.

Proposition XVIII. Spaces of emergence.

Proof.—In Homo Sacer: Sovereign Power and Bare Life, Giorgio Agamben argues that in our Capitalist system, everyone is treated as bare life, a resource to exploit: “today there is no longer any one clear figure of the sacred man … perhaps because we are all virtually homines sacri.”698 In Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human, Alexander Weheliye extends Agamben’s notion of bare life and argues that emergence takes place in even the bleakest scenarios: “the particular assemblage of humanity under purview here is habeas viscus, which, in contrast to bare life, insists on the importance of miniscule movements, glimmers of hope, scraps of food, the interrupted dreams of freedom found in those spaces deemed devoid of full human life (Guantanamo Bay, internment camps, maximum security prisons, Indian reservations, concentration camps, slave plantations, or colonial outposts, for instance).”699

Corollary.—Emergence from anywhere.

Note.—These extreme conditions constitute metastable states, primed for transduction. Weheliye demonstrates how the principles of process philosophy are embedded in Black theory, referencing Edouard Glissant’s description of “relation as an open totality of movement,” which “is the boundless effort of the world: to become realized in its totality, that is, to evade rest.”700 Weheliye argues that the possibility for radical change is never out of the question; it is immanent in all things. For him, habeas viscus “translates the hieroglyphics of the flesh into a potentiality in any and all things, an originating leap in the imagining of future anterior freedoms and new genres of humanity.”701 It is possible for anyone to live other ways of being into existence.

Proposition XIX. New forms of life are already seeded.

Proof.—Sylvia Wynter offered the term demonic ground, “perspectives that reside in the liminal precincts of the current governing configurations of the human as Man in order to abolish this figuration and create other forms of life.”702

Proposition XX. Care is demonic ground.

Proof.— ... the highest good which we can seek for under the guidance of reason (IV.xxviii.). The ethics of care is a framework that emphasizes the importance of interpersonal relationships, empathy, and compassion in moral decision-making. Developed as an alternative to traditional moral theories that often prioritize principles, rights, or abstract notions of justice, the ethics of care places a central focus on nurturing and maintaining relationships, particularly within the context of caring for vulnerable individuals. At its core, the ethics of care challenges the notion of moral autonomy and individualism by asserting that our ethical obligations are deeply intertwined with our interconnectedness as social beings. Drawing from Emmanuel Levinas, this interconnectedness can be understood as an appeal, where the “being that expresses itself imposes itself but does so in a manner that calls forth our responsibility.” Levinas posits that “the being that imposes itself does not limit but promotes my freedom, by arousing my goodness … thus, the irremissible weight of being that gives rise to my freedom.”703 An ethics of care proposes that freedom is not just the absence of constraints, but also the presence of responsibility. It suggests that while we have agency to act in alignment with our will, we are also accountable for the influence of our actions—how we radiate outward. Through this lens, responsibility is freedom. It ensures that our actions are in alignment with our will—at a zoomed-out scale—even after they have propagated through the entangled field of reality.

Note.—It is important here to differentiate between the ethics of care and industries of care, which represent the commodification of services and products that are branded with the promise of healing, wellness, and well-being. These industries span a vast range—from medical services and pharmaceuticals, to child care and elder care, to wellness retreats and alternative therapies. While many of these sectors provide genuine value and contribute tremendously to individual and societal well-being, they are not without harm and inequity: “health disparities are differences that exist among specific population groups in the United States in the attainment of full health potential that can be measured by differences in incidence, prevalence, mortality, burden of disease, and other adverse health conditions. Health disparities can stem from health inequities—systematic differences in the health of groups and communities occupying unequal positions in society that are avoidable and unjust.”704 Within technocapital—the nexus of technology and capitalism—the imperatives of profit and market growth drive decision-making: “Forces are acting to challenge affordability and access in healthcare and threatening the industry’s economic outlook.”705 Industries of care are not just about delivering care but also about maximizing profitability, expanding market share, and often, commodifying personal experiences of health and wellness. The challenge lies in discerning genuine care from commercial exploitation, ensuring that the essence of care isn't lost amidst the machinery of profit-driven motives.

Care is unlimited. Care is not limited to industry. The ethics of care consists of:——

I. Empathy—In the actual knowledge of the emotions (V. iv. note).

II. Balance—Identifying and equalizing imbalances of power.

III. Superposition—Recognizing the infinite within every other.

IV. Entanglement—Taking responsibility for the influence of our free actions.

V. Compassion—Lastly, in the order wherein the mind can arrange and associate, one with another, its own emotions (V. x. note and xii. xiii. xiv.).

And now I have finished with all that concerns this present life …

Proposition XXI. The mind can only imagine anything, or remember what is past, while the body endures.

Proof.—The Aesthetics of care tends to restorative practices, maintenance work, and the mundane aspects of everyday life. Keeping our bodies alive—our communities alive—our planet alive—comes with an aesthetic of reality and imperfection.

Proposition XXII. Artists simulate modalities of care.

Proof.—In 1969, Mierle Laderman Ukeles published the Manifesto for Maintenance Art, challenging traditional notions of art and elevating the often overlooked and undervalued work of maintenance and care: “The exhibition of Maintenance Art, ‘CARE’, would zero in on pure maintenance, exhibit it as contemporary art, and yield, by utter opposition, clarity of issues.”706 Ukeles asserts that maintenance activities, such as cleaning, repairing, and tending to the needs of everyday life, are not separate from artistic practice but rather integral to it. Ukeles argues that these actions are an essential part of sustaining society and should be recognized as artistic gestures that deserve respect and appreciation. Her manifesto invites us to reevaluate our perceptions of labor, caregiving, and maintenance as mundane and unremarkable, and instead, recognize their beauty, significance, and transformative potential.

Proposition XXIII. The human mind cannot be absolutely destroyed with the body, but care is so often postponed—post-partum—after profit, gain, capture, acquisition.

Proof.—Post-Partum Document is Mary Kelly’s multifaceted and deeply personal exploration of motherhood, care, and identity: “Post-Partum Document is a six-year exploration of the mother-child relationship. When it was first shown at the ICA in London in 1976, the work provoked tabloid outrage because Documentation I incorporated stained nappy liners. Each of the six-part series concentrates on a formative moment in her son’s mastery of language and her own sense of loss, moving between the voices of the mother, child, and analytic observer. Informed by feminism and psychoanalysis, the work has had a profound influence on the development and critique of conceptual art.”707 The project consists of a series of interconnected elements, including recording transcripts, diary entries, her son’s drawings, and collected artifacts, that chronicle the early years of Kelly’s relationship with her son. The meticulous documentation of experience, from the physical aspects of childbirth to the emotional challenges and the daily routines of caring for an infant, serves as a candid and intimate reflection on the transformative process of motherhood.

Note.—The project’s layered structure invites viewers into Kelly’s personal narrative, combining visual and textual elements to evoke a sense of immersion and emotional connection. Through the inclusion of various artifacts, such as soiled diapers and hospital reports, Kelly blurs the boundaries between art and life, transforming everyday objects associated with care into powerful symbols of caregiving and the maternal experience. Post-Partum Document also interrogates larger sociopolitical themes, particularly the gendered dynamics of labor and the construction of female identity within patriarchal systems: “Although drawing directly from her own maternal experience, the artist has asserted that the work is not ‘autobiographical’ and instead uses her own story to suggest ‘an interplay of voices—the mother’s experience, feminist analysis, academic discussion, political debate.’”708 By exposing the often unseen and undervalued aspects of motherhood, Kelly challenges societal norms and calls attention to the complexities and sacrifices involved in caregiving roles.

Proposition XXIV. The more we understand particular things, the more do we understand God.

Proof.—This is evident from I. xxv. Coroll.

Proposition XXV. The highest endeavor of the mind, and the highest virtue is to understand things by the third kind of knowledge (intuition).

Proof.—Rirkrit Tiravanija’s Pad Thai, from 1990, is a seminal work of relational aesthetics. It offers a compelling exploration of intuition, care, communal engagement, and the blurring of boundaries between art and everyday life: “I want people to just be themselves. I would like to make a work where I don’t have to tell people what to do. In certain ways, I try to use architecture or space, or food and drink, or sound. That would certainly be something people already understand. With that little sense of familiarity, they would already become more curious and more engaged.”709 In this participatory installation, Tiravanija transforms the traditional gallery space into a communal kitchen, inviting visitors to partake in the preparation and sharing of a communal meal of pad Thai, a popular Thai dish. Through the act of cooking and sharing food, Tiravanija creates a nurturing and inclusive environment that fosters social interaction and care. The communal meal becomes a catalyst for dialogue, breaking down barriers and fostering a sense of togetherness among participants, in contrast to the dynamics typically encountered in an exhibition space. By providing sustenance, the artwork elevates the mundane act of creating and sharing as a radical form of care, underscoring the significance of nourishment and community in our lives.

Pad Thai shifts focus from stable objects to dynamic experience and interrelation: “Rirkrit Tiravanija has always understood, intuitively and intellectually, that a gallery is a social frame, at once quasi-private and quasi-public, wherein a diverse range of encounters and frictions connected to rituals of making, displaying, and consuming art are staged.”710 His work invites viewers to become active participants, blurring the distinction between artist and audience, and encouraging a collective sense of responsibility for creating and sustaining the communal space. The ephemeral nature of the work reinforces the transient and temporal aspects of care.

Proposition XXVI. Advocacy is communal care.

Proof.—Tania Bruguera is an artist known for her thought-provoking and politically engaged work that often addresses issues of power, migration, and social justice. Through her performances, installations, and participatory projects, Bruguera explores the concept of care in relation to marginalized communities and questions the responsibilities and role of art in fostering social change. One of Bruguera’s notable works emphasizing care is Immigrant Movement International, developed between 2011 and 2015. The project manifesto articulates: “We have the right to move and the right to not be forced to move. We demand the same privileges as corporations and the international elite, as they have the freedom to travel and to establish themselves wherever they choose. We are all worthy of opportunity and the chance to progress. We all have the right to a better life.”711 In this project, Bruguera established a community space in Queens, New York, designed to provide support and resources for immigrants. The project transformed the gallery into a site for communal care, where individuals could access legal advice, language classes, and social services. Immigrant Movement International embodied Bruguera’s commitment to addressing the needs and vulnerabilities of marginalized communities. By offering practical assistance and creating a supportive environment, the project demonstrated care as an essential aspect of social justice work. It emphasized the importance of providing resources, information, and spaces of empowerment for individuals who often face systemic obstacles and marginalization. Bruguera’s approach to care extends beyond the immediate provision of resources. Her projects aim to challenge existing power structures and address the underlying causes of inequality and injustice: “Bruguera is a key player within the fields of performance, interdisciplinary practice and activism. Her work is grounded in the act of ‘doing’—she calls this ‘behavior art’—and her aim is to create art that doesn’t merely describe itself as dealing with politics or society, but that is actually a form of political or social currency, actively addressing cultural power structures rather than representing them.”712 Through participatory elements, she invites individuals to engage with their own experiences and those of others, fostering empathy and connection. In projects like The Francis Effect, Bruguera organized public gatherings where participants could voice their concerns and ideas for social change. By creating platforms for dialogue and collective action, she seeks to encourage a sense of shared responsibility and care for the well-being of communities.

Proposition XXVII. From this third kind of knowledge arises the highest possible justice.

Proof.—In The Black Factory, William Pope.L explores the topic of care through conversations about identity and equity: “Conceived to fit inside a panel truck, The Black Factory travels throughout America to bring blackness wherever it is needed. The Factory consists of three compartments that unfold to create an interactive public environment made up of a library, a workshop, and a gift shop. Through the circulation of promotional materials and by word of mouth, The Black Factory makes contact with a range of host-communities that invite visits to their town.”713 The Black Factory challenged the notion that care is solely an individual or private matter, emphasizing that care extends to the collective responsibility for addressing inequities. The mobile nature of the installation underscored the importance of taking the message of care beyond traditional art spaces. By bringing The Black Factory to different communities, Pope.L sought to disrupt the status quo and encourage conversations about race and care in spaces where these discussions might not typically occur: “It aims to re-energize discussions about race in America by inviting people to share objects that represent ‘blackness’ to them.”714 Care is not only about empathy; it requires compassion, actively working to dismantle systemic injustices.

Proposition XXVIII. Self-care is hard enough.

Proof.—This proposition is self-evident. Becoming an Image, an ongoing body of work by Heather Cassils, delves into the themes of care, vulnerability, and the construction of identity. Through this physically demanding and transformative performance, Cassils explores the boundaries of the body and challenges societal norms surrounding gender, while highlighting the importance of self-care and self-expression. In Becoming an Image, Cassils undergoes an intense physical training regimen over a period of months, working towards sculpting their body into a more masculine form. The performance culminates in a series of staged photographs, capturing the physical and emotional journey of self-transformation: “At the core of Cassils’ durational performances is this principle of calculated risk in the face of the material’s capacity, and … of the ultimate material, ‘of the body’s inexorable movement towards its final failure, toward death.’”715 The work addresses the societal expectations and norms placed upon bodies, particularly those of transgender and gender-nonconforming individuals. Cassils’ performance challenges these expectations by reclaiming agency over their own body and identity. Through this physically demanding process, they assert their right to shape and define their own image, pushing back against the limitations imposed by societal constructs. Becoming an Image examines care for the self. Cassils’ performance invites viewers to consider their own relationship with their bodies and the ways in which societal standards shape their perceptions and self-care practices.

Proposition XXIX. Differentiate between poison and cure.716

Proof.—Patrick Staff’s Weed Killer is a thought-provoking video installation commissioned by the Museum of Contemporary Art. Inspired by Catherine Lord’s memoir, The Summer of Her Baldness, the installation combines a poignant monologue adapted from the book with ethereal sequences captured through high-definition thermal imaging. Through this immersive experience, Weed Killer blurs the boundaries between the toxic and the curative, provoking viewers to reevaluate their own understandings of suffering and the potential for resilience and transformation: “Each of the performers in Weed Killer identifies as transgender. By probing both cancer and trans experiences, Staff initiates a dialogue about how biomedical technologies have fundamentally transformed the social constitution of our bodies.”717

Note.—At the heart of Weed Killer lies a monologue adapted from Lord’s memoir. This moving and irreverent account of Lord’s experience with cancer serves as a poignant foundation for Staff’s exploration. The monologue, delivered by an actress, delves into the emotional and physical devastation caused by chemotherapy. Through this adaptation, Weed Killer channels the raw authenticity of Lord’s memoir, inviting viewers to empathize with the complexities of confronting illness.718 Interwoven with the monologue are ethereal sequences captured using high-definition thermal imaging. These sequences offer a contrasting and otherworldly visual experience, providing a departure from the grounded reality of the monologue. Through choreographic gestures, the thermal imaging captures the subtle nuances of movement and creates a surreal atmosphere. The juxtaposition of these sequences with the personal narrative enhances the contemplation of suffering and healing as multifaceted and transcendent experiences.

Proposition XXX. Find spaces of radical care.

Proof.—Johanna Hedva’s Sick Woman Theory is a radical reframing of illness and disability within the context of contemporary capitalist society. Drawing from her personal experiences with chronic illness, Hedva posits that the Sick Woman is anyone who does not or cannot conform to the expectations of a system that values productivity and labor above all else. This includes the chronically ill, the disabled, and anyone who is marginalized by the dominant culture—be it due to race, gender, class, or other factors. In a society that equates worth with work, the Sick Woman is seen as less valuable or even disposable. However, Hedva argues that this perceived weakness is a site of resistance. The very act of surviving, of caring for oneself and others, is a defiant gesture against a system that would rather the Sick Woman not exist at all. Through this lens, the Sick Woman's existence and her care practices become inherently political acts: “Sick Woman Theory is an insistence that most modes of political protest are internalized, lived, embodied, suffering, and therefore invisible.”719

Proposition XXXI. Repair broken processes.

Proof.—Eve Kosofsky Sedgwick's essay, Paranoid Reading and Reparative Reading, discusses two modes of engaging with texts and the world: paranoid and reparative. Sedgwick critiques the dominance of paranoid reading in critical theory, particularly in queer theory. Paranoid reading is characterized by suspicion, where the reader anticipates negative outcomes and is always on guard for hidden meanings or threats. It operates under a defensive stance, often aiming to expose and critique. Contrastingly, reparative reading, which Sedgwick advocates for, is characterized by curiosity, surprise, and openness. It seeks to nurture and repair, focusing on potential positive outcomes and constructive engagements. Reparative reading is not naive; it recognizes and grapples with pain and trauma but does so in a way that allows for complexity, ambivalence, and hope. Sedgwick critiques the pervasive nature of paranoid reading in academia and suggests that while it can be a useful mode, it is not the only or always the best way to approach texts or the world: “Reparative motives, once they become explicit, are inadmissible in paranoid theory both because they are about pleasure (‘merely aesthetic’) and because they are frankly ameliorative (‘merely reformist’). What makes pleasure and amelioration so ‘mere’?”720

Note.—The series Atlanta, created by Donald Glover, has consistently provided poignant social commentary through its unique blend of humor, drama, and surrealism. Episode 4 of Season 3, titled The Big Payback, is no exception. The episode offers an illuminating perspective on the white paranoia surrounding reparations: “An office worker's world is turned upside down when he learns his past ancestors were slave owners.”721 Reparations refer to the act of providing compensation, restitution, or acknowledgment to individuals or groups who have experienced harm, injustice, or systemic oppression: “We were treating slavery as if it were a mystery, buried in the past, something to investigate if we chose to. And now that history has a monetary value. Confession is not absolution.”722 Reparations often arise in the context of addressing historical injustices such as slavery, colonization, apartheid, or genocides. They aim to acknowledge and rectify the systemic and long-lasting effects of these injustices, which can include economic disparities, social marginalization, and psychological trauma. The Big Payback cleverly unpacks the anxieties of many white Americans when confronted with the idea of reparations—a fear of personal loss, of retribution, and a deep-seated belief that one’s own well-being is threatened by the well-being of others. This mindset is rooted in scarcity, the belief that resources are limited, and therefore any gain for one group must mean a loss for another. This fear is often manifested in hyperbolic scenarios, suggesting an inevitable societal breakdown or extreme economic consequences.

Local reparations initiatives in various cities and the state of California have ignited hopes for a national compensation policy for the historical atrocities of slavery. Despite the prolonged efforts and a heightened national discourse on racial justice, a significant portion of Americans remain against the idea. Critics of reparations express concerns about the practicality and feasibility of implementing such a program. They argue that determining eligibility, calculating appropriate compensation, and identifying direct descendants of enslaved people would present significant challenges. Critics also contend that reparations may create division, resentment, and unintended consequences, as they could perpetuate a victimhood narrative or lead to a reductive understanding of complex historical issues. If we could simulate successful implementation, would that be enough? Surprisingly, a considerable number of Americans do not believe that descendants of slaves should receive reparations: “two-thirds of Americans.”723 Polls reveal strong opposition from white, Latino, and Asian American communities. Opponents argue that current generations should not be held accountable for past transgressions. The fundamental American belief that hard work ensures success clashes with the realities of the racial wealth gap. The debate around reparations is intertwined with broader socio-political challenges, including disputes over teaching race and critical race theory in schools.724 This attitude is more reflective of deeply entrenched racial prejudices than of the true nature and potential of reparations. Beyond white fear, reparations are not merely transactional financial compensation for past atrocities but can be a significant step towards rebalancing systemic injustice. More equity—

for everyone.

Despite the paranoid narrative of individual punishment, reparations are typically undertaken by governments, organizations, or institutions as a means of addressing historical wrongs, seeking to redress the ongoing consequences of past atrocities, and promoting social justice. The United Nations upholds that “victims have a right to reparation. This refers to measures to redress violations of human rights by providing a range of material and symbolic benefits to victims or their families as well as affected communities.”725 Reparations can take various forms, including financial compensation, land restitution, educational opportunities, healthcare access, or affirmative action policies. The underlying principles of reparations include recognition, accountability, and the pursuit of justice. Reparations acknowledge past wrongs, affirm the dignity and worth of the affected individuals or communities, and hold responsible parties accountable for their actions or complicity. Reparations seek to restore a sense of justice and promote healing, and have a “catalytic power … on the daily life of victims, families, communities, and entire societies.”726 However, the topic becomes particularly heated when applied to the centuries of systemic discrimination, enslavement, and exploitation of Black Americans. This heat is not because the moral or logical argument for reparations is weak, but because it challenges foundational myths about meritocracy and the American Dream.

The institution of slavery in the United States was a deeply entrenched system that caused immeasurable harm, dehumanization, and generational trauma to millions of African Americans. Slavery formed the foundation of the country’s economic prosperity, with enslaved people being exploited for their labor while enduring systemic racism and denial of basic human rights. The legacy of slavery has had far-reaching effects, leading to enduring socioeconomic disparities, racial discrimination, and the erosion of social cohesion. There are growing examples of successful reparations programs. Across the globe, reparations have been utilized to address and rectify historic injustices, often to help consolidate peace post-conflict. For instance, the U.S. previously compensated Japanese American citizens for their wrongful internment during World War II.727 South Africa’s government, in 2003, distributed $3,900 to victims of apartheid, totaling $85 million.728 Furthermore, certain nations have pursued reparations from their former colonizers. For instance, Caribbean nations initiated the Caribbean Reparation Commission to secure reparations from former colonial powers, citing grave historical crimes like the Trans-Atlantic Slave Trade.729 The U.K. took responsibility for its colonial past by paying $25 million to Kenyans who suffered violence during the Mau Mau uprisings in the 1950s.730 Post World War II, Germany recognized its responsibility towards Holocaust victims by not only providing reparations but also giving $7 billion to Israel during its early formation. By 2012, the German efforts translated to $89 billion in reparations to individual survivors.731 Moreover, every day reparations are provided to groups in the US:

Farmers. Fishermen. People who’ve lost bank accounts or pensions. People who’ve had a bad reaction to a COVID vaccine. People who’ve had a reaction to any other vaccine. Indigenous people. Veterans. Descendants of veterans. People who get hurt on the job. People who built nuclear bombs. People exposed to pesticides. Coal miners who get black lung disease. People who lose paychecks or homes from floods, droughts, or other natural disasters. People who are impacted by trade agreements.732

Yet, Black Americans have not received any compensation for their unpaid labor and the severe racial discrimination they faced. Early efforts to provide reparations were overturned, leaving Black Americans without means to build wealth:

The first major opportunity that the United States had and where it should have atoned for slavery was right after the Civil War. Union leaders including General William Sherman concluded that each Black family should receive 40 acres. Sherman signed Field Order 15 and allocated 400,000 acres of confiscated Confederate land to Black families. Additionally, some families were to receive mules left over from the war, hence 40 acres and a mule. Yet, after President Abraham Lincoln’s assassination, President Andrew Johnson reversed Field Order 15 and returned land back to former slave owners. Instead of giving Blacks the means to support themselves, the federal government empowered former enslavers. For example, in Washington D.C., slave owners were actually paid reparations for lost property—the formerly enslaved.733

Today, proposed reparations packages include individual payments, college tuition remissions, student loan forgiveness, down payment and housing revitalization grants, and business grants.734 However, this is not just about money, but about creating an equitable society by addressing entrenched racism.

Reparations are not a zero-sum game. They potentiate collective healing. Here, the Jewish concept of tzedakah provides a valuable lens. The term tzedakah originates from the Hebrew word for justice, a paramount duty within Judaism: “Tzedek, justice you shall chase after.”735 (It is similar to the Muslim concept of Sadaqah.) Unlike traditional charity, which is often given out of surplus, tzedakah is seen as a form of justice, a duty to give regardless of one’s financial state. It is not just about money; it is about restoring balance. If we begin to think of reparations in terms similar to tzedakah, the dialogue shifts. Instead of a punishment or forced obligation, reparations become a proactive step towards justice. It acknowledges a debt, not just financial, but moral and societal. More than a mere handout, it is a hand extended in a gesture of understanding, responsibility, and a deep desire for communal healing. By using tzedakah as a model, reparations can be seen as not just beneficial for the recipients but also for those giving. It can be a process of growth, responsibility, and genuine commitment to creating a more just and equal society.

Proposition XXXII. Restorative justice.

Proof.—Reparations are a strategy of restorative justice, which focuses on the needs of the victims and the offenders, as well as the entire community, rather than satisfying abstract legal principles or punishing the offender. The goal is to repair harm, reconcile relationships, and reintegrate offenders back into society. Renowned feminist scholar and social activist bell hooks has written about restorative justice in the context of broader societal structures of power, inequality, and oppression: “I think this is a difficult question, how we deal with the question of forgiveness. For me forgiveness and compassion are always linked: how do we hold people accountable for wrongdoing and yet at the same time remain in touch with their humanity enough to believe in their capacity to be transformed?”736 She argues that punitive justice systems, rooted in domination and control, often reinforce societal hierarchies and exacerbate harm.

hooks believes that justice should involve the active process of healing, reconciliation, and community building. She views restorative justice as a potential tool for addressing larger systemic issues, such as racial and gender inequality: “One of the things that has always made me sad is the extent to which civil rights struggles, black power movements, and feminist movements, have, at times, collapsed at the point where there was conflict, and how conflict between people in the groups was often seen as a negative. The truth is that you cannot build community without conflict. The issue is not to be without conflict, but to be able to resolve conflict, and the commitment to community is what gives us the inspiration to come up with ways to resolve conflict. The most contemporary way that people are thinking about as a measure of resolving conflict and rebuilding community is restorative justice.”737 hooks emphasizes the need to transform not just individual relationships or incidents of harm, but also the larger, systemic structures of power that often underpin such harm. She advocates for a justice system that repairs and heals, rather than one that punishes and divides. hooks frames her discussion within an intersectional lens, acknowledging the ways that race, gender, class, and other aspects of identity intersect and influence experiences of harm and opportunities for justice.

Restorative justice has seen numerous successful implementations globally. In the United States, the Navajo Nation Peacemaking Program is a traditional approach to justice that emphasizes harmony and often sees cases referred back to it from the wider judicial system: “An examination of the Navajo peacemaking process shows that its success is not in its concrete result or the actual remedy given, but rather is in an adjustment of the attitudes of the parties involved. Both offenders and victims begin with cognitive dissonance or related emotions that are based on assumptions and unreality, and the process leads them to common understandings.”738 South Africa’s Truth and Reconciliation Commission, set up post-apartheid, allowed victims to voice their suffering and perpetrators to confess their crimes in a public forum, playing a pivotal role in preventing further civil unrest during the country’s transition.739 New Zealand’s Family Group Conferences for juvenile crimes involve a collaborative development of a plan for the offender to make amends and have shown a significant reduction in reoffending.740 The Hollow Water First Nation Community Holistic Circle Healing in Canada addresses sexual abuse and other forms of violence within the community, boasting a recidivism rate of just 2% among offenders who acknowledge their actions and seek forgiveness.741 Likewise, Prison Fellowship Ministries runs successful restorative justice programs inside prisons internationally: “Founded in 1976, Prison Fellowship exists to serve all those affected by crime and incarceration and to see lives and communities restored in and out of prison—one transformed life at a time.”742 Habeas Viscus—distributed transduction (V.xviii.).

Corollary.—

Restorative justice everywhere.

Proposition XXXIII. Restore the planet.

Proof.—Restorative justice, when expanded to the environment, offers a transformative approach that addresses the harm inflicted upon the land and local communities and supports sustainable solutions: “The ‘planetary boundaries’ framework developed by Johan Rockström, Will Steffen, and others (Rockström et al., 2009) describes the nine processes regulating the Earth system, keeping it stable and resilient. Within these boundaries, humans have a ‘safe operating space,’ but pushing past them would destabilize Earth’s system into effects beyond human capabilities to manage.”743 The impact of imbalance is already overwhelming. Restorative environmental justice emphasizes the healing and restoration of damaged ecosystems and communities impacted by environmental degradation. Instead of solely focusing on punitive measures or regulations, restorative environmental justice seeks to identify and address the underlying causes of harm.744 By implementing restoration projects, such as ecological rehabilitation, habitat restoration, and remediation efforts, we can actively work towards healing the environment and restoring balance.

Note.—Restorative justice highlights the importance of accountability and taking responsibility for one’s actions. Applying this principle to environmental justice involves holding polluters and those responsible for environmental harm accountable for their actions. It also includes acknowledging the collective responsibility of society for the degradation of the environment. By promoting transparency, encouraging corporate and governmental accountability, and supporting initiatives that foster responsible environmental practices, restorative justice ensures that those who contribute to environmental harm make amends: “Possible restorative outcomes in the case of environmental harm are apologies, restoration of environmental harm, prevention of future harm.”745 Restorative justice prioritizes the inclusion and active participation of affected communities in decision-making processes. By incorporating diverse perspectives, local knowledge, and community engagement, we can better understand the specific needs of ecosystems impacted by environmental degradation. This collaborative approach empowers communities to actively participate in finding sustainable solutions. Restorative justice drives innovation in environmental practices. New ways of being. Offenders are encouraged to invest in research, technology, and alternative approaches, including renewable energy initiatives, sustainable agriculture practices, eco-friendly technologies, and circular economy models. Some recent policies account for the entanglement of environmental technology and social equity. For example, “Justice40 establishes the goal that 40% of the overall benefits of certain federal investments flow to disadvantaged communities (DACs). The Justice40 Initiative applies to over 1 Department of Energy (DOE) programs and to much of the $62 billion investment in DOE under the Bipartisan Infrastructure Law.”746 Restorative justice offers a transformative framework that recognizes the interconnectedness of environmental and social issues as the reality of our planet.

Proposition XXXIV. Technology has restorative potential.

Proof.—Can technologies of Capture and Reconstruction be reappropriated to rectify environmental injustice? Technologies like remote sensing and computer vision are powerful tools to understand and confront environmental degradation. By collecting and analyzing vast amounts of data, technology enables the identification of pollution sources and disproportionate impacts on marginalized communities, facilitating targeted interventions and enforcement of environmental regulations.747 Despite their ethical complexities, these technologies have the potential to support the development and implementation of sustainable solutions, such as clean energy alternatives, sustainable infrastructure, and efficient resource management systems. Since we have these technologies—and they are not going away—we need to embrace their capabilities to empower repair. To promote equitable access to clean air, water, and resources. To mitigate the detrimental effects of environmental degradation. To protect against the decoherence of our habitable world.
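To make the mechanics concrete, consider a minimal sketch of how gridded capture data might surface environmental-justice hotspots. Everything here is an illustrative assumption rather than any agency's actual pipeline: the arrays stand in for a satellite-derived pollution proxy and a population grid, and the thresholds and weighting are invented for the example.

import numpy as np

# Synthetic stand-ins for two co-registered rasters: a satellite-derived
# pollution proxy (e.g., aerosol optical depth) and population per cell.
rng = np.random.default_rng(0)
pollution = rng.gamma(shape=2.0, scale=1.0, size=(100, 100))
population = rng.gamma(shape=1.5, scale=200.0, size=(100, 100))

# A cell is a candidate hotspot when the pollution proxy sits in the top
# decile and people actually live there; both cutoffs are tunable assumptions.
hotspots = (pollution > np.quantile(pollution, 0.9)) & (population > 50)

# Rank hotspots by exposure burden: the proxy weighted by population.
burden = np.where(hotspots, pollution * population, 0.0)
flat_order = np.argsort(burden, axis=None)[::-1][:10]
for r, c in zip(*np.unravel_index(flat_order, burden.shape)):
    print(f"cell ({r},{c}): proxy={pollution[r, c]:.2f}, pop={population[r, c]:.0f}")

A real workflow would swap the synthetic arrays for georeferenced rasters and validate the proxy against ground monitors; the point is only that capture data can be turned toward accountability.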

Corollary.— D
e
c
oher
ence
. is d

Note.—If we look to men’s general opinion, we shall see that they are indeed conscious of the eternity of their mind, but that they confuse eternity with duration, and ascribe it to the imagination or the memory which they believe to remain after death.

Proposition XXXV. Deploy machines of care.

Proof.—From the watery depths of our ecological crisis, Phykos emerges as a pioneering company at the forefront of sustainable practices: “Currently, most global efforts to address climate disruption are focused on reducing emissions of greenhouse gas pollutants. While vital, this path alone is no longer sufficient. In a special climate report, the United Nations made clear that, in addition to turning off the flow of pollution, we also need to remove massive amounts of legacy CO2 from the atmosphere to avoid the most dangerous effects of climate change, and ultimately restore our climate.”748 Leveraging autonomous vessels and advanced technologies, Phykos aims to cultivate seaweed as a powerful carbon sink. Seaweed, a type of macroalgae, has gained significant attention for its ability to sequester carbon dioxide from the atmosphere. As seaweed grows, it absorbs carbon dioxide through photosynthesis, effectively capturing and storing carbon: “We amplify natural marine carbon cycles to remove excess atmospheric CO2 at climate relevant scale.”749

Proposition XXXVI. Autonomous repair.

Proof.—Phykos utilizes autonomous vessels as the foundation of their seaweed cultivation operations. These vessels are equipped with cutting-edge technology, including remote sensing capabilities, artificial intelligence, and robotics. The autonomous nature of these vessels allows them to navigate the oceans with minimal human intervention, making seaweed cultivation more efficient and sustainable. Phykos’ seaweed cultivation process begins with the deployment of their autonomous vessels to predetermined seaweed farming areas. These areas are carefully selected based on factors such as water quality, nutrient availability, and optimal growth conditions. Once the vessels reach their designated locations, they initiate the seaweed cultivation process.

Corollary.—

Vessels of healing.

Note.—Phykos applies capture and reconstruction technologies for restorative impact. Using advanced remote sensing technologies, the autonomous vessels capture vital data about the ocean environment: “High resolution ocean models guide our platforms to deposition regions which maximize carbon storage while minimizing disturbance to the deep ocean environment. Platform sensors and satellite communications enable precise reporting of our progress.”750 They collect information on water temperature, salinity, nutrient levels, and other relevant parameters. This data enables the autonomous vessels to employ sophisticated algorithms and artificial intelligence systems to navigate to optimal conditions: “Algorithms maximize growth on our platforms while avoiding sensitive ocean environments and other ocean users. Our system works in harmony with seasonal shifts in the ocean offering a safe and scalable pathway for nature based carbon dioxide removal.”751 The autonomous vessels release specially designed seaweed modules into the ocean. These modules contain pre-grown seaweed seedlings, which attach to the floating structures. As the seaweed grows, it absorbs carbon dioxide through photosynthesis, thereby reducing atmospheric carbon levels. Phykos’ vessels continuously monitor the growth and health of the seaweed, optimizing carbon sequestration potential.
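In the spirit of these descriptions, a toy sketch of the guidance step might look as follows. The scoring weights, thresholds, and candidate sites below are invented for illustration; they are not Phykos’ actual ocean models, sensor feeds, or decision logic.

from dataclasses import dataclass

@dataclass
class Site:
    name: str
    temperature_c: float   # sea-surface temperature reading
    nutrients: float       # relative nutrient availability, 0..1
    depth_m: float         # water depth at the candidate location
    sensitive: bool        # flagged habitat, shipping lane, or other ocean user

def score(site: Site) -> float:
    """Higher is better; flagged sites are excluded outright."""
    if site.sensitive:
        return float("-inf")  # hard constraint: avoid sensitive environments
    # Assumed preferences: temperate water, high nutrients, deep water
    # (deeper deposition is presumed to store carbon longer).
    temp_fit = max(0.0, 1.0 - abs(site.temperature_c - 15.0) / 10.0)
    depth_fit = min(site.depth_m / 1000.0, 1.0)
    return 0.5 * site.nutrients + 0.3 * temp_fit + 0.2 * depth_fit

candidates = [
    Site("A", 14.0, 0.80, 3500.0, False),
    Site("B", 22.0, 0.90, 2000.0, False),
    Site("C", 15.0, 0.95, 4000.0, True),  # sensitive: never selected
]
best = max(candidates, key=score)
print(f"navigate to site {best.name} (score={score(best):.2f})")

The design point is the hard constraint: sensitivity is not traded off against growth but vetoes a site entirely, mirroring the language of avoidance in Phykos’ own description.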

Proposition XXXVII. There is nothing in nature, which is contrary to carbon, or which can take it away. False.

Proof.—Seaweed cultivated by Phykos acts as a natural carbon sink, absorbing substantial amounts of carbon dioxide during its growth cycle. When the seaweed is saturated, it is harvested or is pulled by its own weight to the bottom of the ocean, where it slowly releases carbon over one thousand years or more.752 In addition to Phykos’ primary goal, seaweed cultivation provides additional ecological benefits, such as nutrient absorption, habitat creation, and the enhancement of marine biodiversity. Goals and weights determine the direction of change.
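The arithmetic of sequestration can be sketched in a few lines. Every figure below is an assumed, illustrative value (commonly cited rough conversions, not Phykos data):

# Back-of-envelope carbon accounting for one hypothetical platform.
wet_biomass_t = 100.0    # tonnes of wet seaweed grown before sinking (assumed)
dry_fraction = 0.10      # ~10% of wet mass remains after drying (assumed)
carbon_fraction = 0.30   # ~30% of dry mass is carbon (assumed)
co2_per_c = 44.0 / 12.0  # molecular-weight ratio of CO2 to carbon

carbon_t = wet_biomass_t * dry_fraction * carbon_fraction
print(f"{carbon_t:.1f} t C = {carbon_t * co2_per_c:.1f} t CO2 removed")
# -> 3.0 t C = 11.0 t CO2 for this illustrative platform

Whatever the true parameters, the structure of the estimate is the same: biomass, times carbon content, times the molecular-weight ratio, discounted by how long the sunk carbon actually stays down.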

Note.—The Axiom of Part IV. has reference to particular things, in so far as they are regarded in relation to a given time and place: of this, I think, no one can doubt.

Proposition XXXVIII. Entanglements of life and death.

Proof.—Coral.

Note.—Coral reefs are often referred to as the rainforests of the sea. They are crucial for ecological, economic, and cultural reasons. Their well-being directly impacts the health of our oceans and the people who depend on them: “1 billion people rely on coral reefs for food security; 25% of all marine species live on coral reefs; 70-90% of the world's coral reefs could be lost by 2050.”753 Protecting coral reefs is essential for preserving the biodiversity and ecological balance of our marine ecosystems. These vital ecosystems, teeming with biodiversity, are under threat from climate change and human activities. Coral bleaching is a phenomenon where corals lose their symbiotic algae, known as zooxanthellae, causing them to turn white. This discoloration is often triggered by elevated sea temperatures and indicates coral stress. Additionally, corals and other marine organisms can be adversely affected by marine debris, including discarded fishing nets and plastics, which can entangle and damage coral reefs. A related concern is ghost nets: fishing nets that have been lost, abandoned, or discarded in the ocean. As they drift, they pose a threat not only to marine life through entanglement but also to coral reefs upon contact: “Entanglement can be directly responsible for breakage, as well as inhibiting the growth by restricting access to sunlight, as well as preventing the important cleaning function of grazing fish species. Indirectly, entanglement in plastic debris has been linked to a greater prevalence of disease in coral species … Coral species that grow in branching or corymbose forms, such as the Acroporids and the Poccilioporids are thought to be eight times more vulnerable to entanglement in debris due to their complex structures, suggesting that important habitats for juvenile reef fish species as well as invertebrates are most greatly affected.”754 In the face of this crisis, innovative solutions are necessary for coral restoration. Coralmaker, a pioneering initiative, leverages capture, reconstruction, and simulation to revolutionize large-scale coral reef restoration efforts. Coralmaker significantly reduces the time required for coral growth, promoting sustainability through emerging technologies.

Proposition XXXIX. Skeletons of regeneration.

Proof.—Through coral calcification, the process of skeletal growth, corals take years to reach adult size. Coralmaker collapses this lengthy timeline. Their mission is “to provide robust and scalable technologies that make it possible to restore, install, and move coral reefs at the reef scale, supporting their survival and continuation through climate change.”755 This innovative manufacturing technology can be conveniently deployed close to restoration sites, enabling onsite production using locally-sourced natural aggregate mixes. Coralmaker has the capability to manufacture 10,000 skeletons per day, and is actively scaling up production.

Note.—Coralmaker seeds living coral fragments into modular bases made of recycled stone waste from the construction industry. By repurposing this waste material, the initiative reduces its carbon footprint and contributes to waste reduction. Furthermore, the focus on local manufacturing near restoration sites minimizes transportation emissions, ensuring a more sustainable approach to large-scale coral restoration. The process of coral propagation, which involves seeding coral fragments onto the premade stone coral skeletons, is traditionally a labor-intensive and repetitive task. Coralmaker recognizes the need for automation in large-scale coral processing and utilizes robotics and artificial intelligence to automate the propagation process.756 The underwater installation process is too dangerous and expensive for a human workforce to be viable. These automated systems, designed for onsite deployment at restoration sites, collaborate effectively with human workers, freeing them to engage in more complex tasks.

Proposition XL. In proportion as each thing possesses more of perfection, so is it more active, and less passive; and, vice versâ, in proportion as it is more active, so is it more perfect.

Proof.—Coral geometry is captivatingly complex. Layer upon layer of mathematical and natural principles. Branching corals and coral colonies often display fractal-like patterns, reflecting efficient space-filling and resource acquisition strategies. The structure of these corals has evolved to optimize light capture for their photosynthetic partners, zooxanthellae, balancing growth towards light while minimizing self-shading. Some coral patterns are reminiscent of Turing patterns from reaction-diffusion systems, modeling interactions of substances as they spread across space, while others align with phyllotactic principles seen in plants, often linked to Fibonacci sequences or golden angles for optimal spacing. Beyond these patterns, the calcification processes underlying coral growth give insights into optimal structural strategies in fields like crystallography and material science. Structures beyond the Cartesian: “These organisms are biological manifestations of what we call hyperbolic geometry, an alternative to the Euclidean geometry we learn about in school that involves lines, shapes and angles on a flat surface or plane. In hyperbolic geometry the plane is not necessarily so flat.”757 There are infinite variations.
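
A worked illustration of one of these principles: Vogel's model of phyllotaxis places the k-th element at a radius proportional to the square root of k and at an angle of k times the golden angle (about 137.5 degrees), producing the evenly packed spirals seen in sunflowers and echoed in some coral growth forms. The short Python sketch below is standard geometry, offered only as illustration.

    import math

    # The golden angle, ~137.5 degrees: the circle divided by the
    # square of the golden ratio. Successive offsets at this angle
    # never repeat, giving optimally even packing (Vogel's model).
    GOLDEN_ANGLE = math.pi * (3.0 - math.sqrt(5.0))

    def spiral_points(n):
        """First n points of a Vogel spiral: radius grows as sqrt(k)."""
        return [
            (math.sqrt(k) * math.cos(k * GOLDEN_ANGLE),
             math.sqrt(k) * math.sin(k * GOLDEN_ANGLE))
            for k in range(n)
        ]

    for x, y in spiral_points(5):
        print(f"{x:+.3f}, {y:+.3f}")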

This endless variation presents a challenge to robotic automation, which conventionally relies on a fixed program in which every part handled is the same. Researchers have made significant advancements by implementing robotic perception, computer vision, and AI models to tackle the delicate task of handling endlessly diverse coral fragments and precisely placing them as seeds within artificial skeletons. Adaptive automation. Capture and Reconstruction are augmented with perception models to perceive and respond to the complex shapes and variations of coral fragments. Computer vision algorithms enable the identification and analysis of individual fragments, ensuring accurate recognition and classification. Additionally, AI models contribute to real-time decision-making, allowing robotic arms to adapt their grasping strategies and movement trajectories based on the unique characteristics of each fragment. This integration of technologies enables precise and delicate manipulation, facilitating the successful placement of coral fragments within the artificial skeletons at a scale that would not be possible otherwise.758
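
The shape of this perceive-classify-adapt loop can be sketched in a few lines of Python. The class names, measurements, and grasp parameters below are invented for illustration; Coralmaker's actual perception models and robot interfaces are not public.

    from dataclasses import dataclass

    @dataclass
    class Fragment:
        species: str         # predicted by a vision classifier
        max_width_mm: float  # measured from the reconstructed 3D model
        branching: bool      # complex branching forms need gentler handling

    def plan_grasp(frag):
        """Adapt gripper aperture and force to the perceived fragment.
        Values are illustrative assumptions, not real robot parameters."""
        aperture = frag.max_width_mm * 1.2       # 20% clearance
        force = 0.5 if frag.branching else 1.5   # newtons, illustrative
        return {"aperture_mm": aperture, "force_n": force}

    print(plan_grasp(Fragment("Acropora", 24.0, branching=True)))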

Corollary.—Scale care.

Note.—Coralmaker recognizes the importance of a well-designed logistics system to support large-scale coral reef health. Coralmaker founder Taryn Foster explains: “I think of this as a delivery or scaling mechanism for these other technologies that people are developing, like coral propagation … just at a much faster rate and on a bigger scale.”759 As coral reefs face rising temperatures and ocean acidification, some areas may become unsuitable for coral growth. The same automated systems can also be leveraged for movement: assisted migration. Assisted migration enables the relocation of corals to areas where environmental conditions are more favorable, giving them a better chance of survival and growth. It also allows for the relocation of corals from diverse genetic lineages. This approach helps maintain genetic diversity within coral populations, which is crucial for their ability to adapt and withstand future environmental challenges. For example, some species are adapted to withstand warmer water. Installing living structures provides a foundation for the growth and development of new communities, non-human and human.

This is restorative environmental justice in action. The researchers developing the computer vision systems for this project expressed how refreshing it is to work on unambiguously ethical applications of these technologies.760 “A new kind of attention, practical rather than contemplative, has been drawn to Spinoza by deep ecologists. Arne Naess, the Norwegian ecophilosopher, has outlined the points of compatibility between Spinoza's thought and the basic intuitions of the (radical) environmental movement. Among them is this one: ‘Interacting with things and understanding things can not be separated. The units of understanding are not propositions but acts.’”761

Proposition XLI. Spinoza’s Ethics is a hyperbolic geometry.

Proof.—The Ethics appears to follow a Cartesian structure—but it goes beyond it: “Cartesianism is handled like a sieve, but in such a way that a new and prodigious scholasticism emerges which no longer has anything to do with the old philosophy, nor with Cartesianism either. Cartesianism was never the thinking of Spinoza; it was more like his rhetoric; he uses it as the rhetoric he needs.”764 It is not a Euclidean model made of straight lines and points. It is a network of parallel and intersecting negative curves. As Deleuze describes it, a sieve. Life flows through it: “In Spinoza's thought, life is not an idea, a matter of theory. It is a way of being, one and the same eternal mode in all its attributes. And it is only from this perspective that the geometric method is fully comprehensible … The geometric method ceases to be a method of intellectual exposition; it is no longer a means of professorial presentation but rather a method of invention.”765 The Ethics is a hyperbolic geometry—and it is also a logic of hyperbole—a non-classical lattice representing strong emotions and their interactions.

Note.—If reason is binary, emotions are quantum. Propositions—or questions—concerning quantum systems do not obey classical logic. Classical logic operates on a binary principle, where statements are either true or false; this is the law of the excluded middle. There are logics that do not abide strictly by the binary distinction of true and false. One example is ternary logic—three-valued logic—where propositions can take on a third value, often described as unknown or indeterminate. There are also other many-valued logics with more than three values. Fuzzy logic is a system of logic that allows for degrees of truth, rather than just true or false—1 or 0. Statements in fuzzy logic can be partially true to varying degrees.766 This form of logic is often applied where information is imprecise or where human reasoning needs to be emulated, as in some artificial intelligence applications and descriptions of quantum systems.
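
As a minimal sketch of how degrees of truth behave, the Python fragment below implements Zadeh's standard fuzzy operators—minimum for AND, maximum for OR, complement for NOT—over truth values in the interval [0, 1]. The example propositions are invented for illustration.

    # Fuzzy truth values are degrees in [0, 1]; Zadeh's standard
    # operators replace Boolean AND / OR / NOT.

    def f_and(a, b): return min(a, b)
    def f_or(a, b):  return max(a, b)
    def f_not(a):    return 1.0 - a

    warm = 0.75          # "the water is warm" is mostly true
    nutrient_rich = 0.5  # "the water is nutrient-rich" is half true

    print(f_and(warm, nutrient_rich))  # 0.5   warm AND nutrient-rich
    print(f_or(warm, nutrient_rich))   # 0.75  warm OR nutrient-rich
    print(f_not(warm))                 # 0.25  NOT warm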

For Spinoza, emotions are complex combinations of external influences and internal responses, deeply entangled with one's thoughts and actions. He classifies emotions into desire, pleasure, and pain—the primitives he believes are fundamental in driving human conduct. Emotions, in Spinoza's perspective, are not just passive experiences; they actively influence an individual's capacity to act and think. By understanding emotions and their causes, individuals can transform decoherence into coherence: “Spinoza did not believe in hope or even in courage; he believed only in joy, and in vision.”767 The Ethics is a geometry of freedom, which Spinoza believed comes from understanding the necessity of everything, including our own feelings and actions, and aligning ourselves with this understanding.

Proposition XLII. Feeling is a quantum force.

Proof.—The erotic is demonic ground (V.xix.).

Note.—If Spinoza believed only in joy and vision, Audre Lorde believed in joy and the erotic. Lorde's profound reflections in The Uses of the Erotic expand Spinoza’s Ethics. Like Spinoza, Lorde does not see ethics as a system of rules and judgments imposed from the outside. Neither does she imagine ethics as the repression of emotions in favor of logic. Lorde expresses the drive behind her actions, her algorithm for living—the erotic—a deeply internal force, a wellspring of power and knowledge rooted in embodied experience: “The erotic is a resource within each of us that lies in a deeply female and spiritual plane, firmly rooted in the power of our unexpressed or unrecognized feeling.”768 The erotic, as Lorde describes, goes beyond the superficial understanding of sexuality—it represents our capacity for joy, our potential for deep connection, and our intrinsic ability to recognize and strive for satisfaction in all aspects of life.

In a world dominated by objective metrics and cold logic, where the emphasis often lies on control and supervision, the erotic becomes even more revolutionary. It challenges dominant paradigms of thought, urging us to recognize and embrace female desire, feeling, and emotion as valid sources of knowledge and action. To fully grasp the depth of Lorde’s perspective, one must understand her view of the erotic as an affirmation of life and a rejection of the oppressive forces that seek to stifle it:

When we live outside ourselves, and by that I mean on external directives only rather than from our internal knowledge and needs, when we live away from those erotic guides from within ourselves, then our lives are limited by external and alien forms, and we conform to the needs of a structure that is not based on human need, let alone an individual’s. But when we begin to live from within outward, in touch with the power of the erotic within ourselves, and allowing that power to inform and illuminate our actions upon the world around us, then we begin to be responsible to ourselves in the deepest sense. For as we begin to recognize our deepest feelings, we begin to give up, of necessity, being satisfied with suffering and self-negation, and with the numbness which so often seems like their only alternative in our society. Our acts against oppression become integral with self, motivated and empowered from within … For not only do we touch our most profoundly creative source, but we do that which is female and self-affirming in the face of a racist, patriarchal, and anti-erotic society.769

When embracing the Erotic as a framework, we approach problems with a focus on healing, equity, and joy rather than domination, control, and oppression. If quantum computers simulated a future grounded in Lorde's Erotic, what would it look like? Can technology exist as a companion that fosters connection and consensual pleasure, rather than function as a tool of reduction and exploitation? Would such a future prioritize healing, balance, laughter, care?

In this (im?)possible future, the erotic envelops vision, birthing a softer paradigm—the Supererotic—which stands in stark contrast to Supervision. It prioritizes deep, intangible connections between individuals over hard metrics and calculations. It values the integrity and paradoxical porousness of life, underlining our infinite difference and interconnectedness. Simulate joy! Simulate healing! Simulate equity! Simulate solutions to wicked problems—and at the same time—take every measure to ensure that the solutions offered are not the Final Solution.770, 771

If the way which I have pointed out as leading to this result seems exceedingly hard, it may nevertheless be discovered. Needs must it be hard, since it is so seldom found. How would it be possible, if salvation were ready to our hand, and could without great labor be found, that it should be by almost all men neglected? But all things excellent are as difficult as they are rare.

End of Supervision.











PART VI.

ACKNOWLEDGEMENTS

I would like to express my gratitude and acknowledge the following individuals, communities, and institutions who have played a significant role in shaping my work and supporting me throughout this process. I can't enumerate the minds—theorists, writers, designers, engineers, technologists, artists, caregivers—whose ideas have influenced my work.

This project would not be possible without the academic communities of USC and UCLA, where I have been able to collaborate, learn, and teach. I am thankful to Holly Willis for her encouragement to experiment with form. The structure of this text is core to its meaning. To Steve Anderson for his foundational research in capture and reconstruction, the groundwork for my own investigations. To Vikki Callahan for insisting on evaluations of technocapital and radical praxis in my work. To Tara McPherson for tracing the lines of hate and harm through the history of technology. To Karen Tongson for diagramming the tactics of normalization and identity formation. To Kiki Benzon for connection through words and writing, and for sharing signs of what matters most. To Jeff Watson for imparting his ethical urgency and reminding me of the interconnectedness of our reality. His teachings about reparative agency have profoundly influenced my perspective and commitment to truth and justice.772

To Jennifer Steinkamp, who demonstrated that like digital space, life is ours to sculpt with intention, from our environment to our character. To Eddo Stern for reminding me of the possibilities in play. To all my teachers and countless others who extend these ethics.

To my friends for the joy of living, difference and connection.

To my family, my mother—Laurence—whose life is devoted to care, and my brothers who are my epistemes—business, law, and engineering. Their lives express the power of ethics and the power of human agency.

 

To my partner—Pete, for opening a daily window into the developments of spatial computing over the last ten years; for your love, your unending support, and for bringing our children into existence. It is for them I endeavor to shift our ideologies.











Snake Oil Men, part VII—

The doctor took his time while I waited grumbling to myself in the exam room. Exam room three, posters of heart and lungs with arrows and explanations attached, a little model of the GI tract, bottles of disinfectant (or whatever) and the other expected things that you might find, the ear and nose scope, the blood-pressure armband and pump, the little glass bottle of tongue depressors and the rest of it. There were white fluorescent lamps embedded in the ceiling that aggravated my already throbbing headache, but I'm sure you already guessed at that. I was expecting something bad. The headache had come weeks earlier and hadn't left, and with all the smoking and drugs and fast food I was pretty much counting on a terminal diagnosis, some kind of cancer that had spread throughout my body and brain and it would be somewhere between three and six weeks before I lost consciousness, two to three months before I was dead. I am not a natural pessimist or a hypochondriac. I am simply practical and honest, and I knew the kind of life I had lived could exact this kind of toll. And besides, what else could a three-week headache mean?

When Doctor Chandragar finally returned, I felt my suspicions were about to be painfully confirmed. He softly closed the door and looked me in the eye as he took his seat on the chair across from mine.

             "Jonathan," he said. "I have some rather unsettling news for you."

             This was it. My heart leapt into my throat. I felt like I was about to vomit.

             "The test results..." I said, prompting him to deliver the death-blow.

             "Well, the test results, no, they came back completely normal, actually. Your blood seems fine, no problem there. But the X-rays reveal something I've quite honestly never seen before. Have a look at this."

             He removed an X-ray print from a large color-indexed manila folder and showed it to me. He pointed at a white blob in the middle of my skull.

             "Do you see the white area here?"

             "Yes. Is that a tumor?"

             "No, no. No, it's a bit, well quite a bit worse than that. That's your brain."

             "What do you mean, that's my brain?"

             "Your brain appears to have shrunk, or, possibly, to have never really grown at all. It's -- as you can see -- about the size of a potato. Usually in an X-ray, the brain, this white area, is flush with the walls of the cranium, your skull. But yours appears to be just floating there, perhaps in some kind of fluid."

             I was, well, flabbergasted. "My brain... is shrinking?"

             "Yes, or possibly it was always this size. Have you ever had an X-ray of your head before?"

             "No."

             "Then you may have been born this way."

             "But the headaches...."

             "I don't know. We'll have to consult the literature. I'm not well-read on this subject to be frank with you. Very few are. I have heard of this kind of dwarfism--"

             "It's a kind of dwarfism?"

             "Perhaps. My first guess would be that you have always had this potato-sized brain, and that recently you've, well, run out of space as it were."

             "Ah. And that would be the explanation for the headaches."

             "Yes. Until now, I would hazard, you've been ticking along just fine, acquiring memories, learning, feeling. But you may have reached your limit."

             My limit. That made a kind of sense. Things had been getting very odd lately. My life had entered a static period. And I was having trouble remembering.

             “So what can we do?”

             “We’ll have to have you examined by an expert,” said the doctor. “I’ll put out the word to my colleagues and we should have a name in a day or two. It might require some travel.”

             “Of course,” I said. The meeting ended shortly thereafter.

             I decided to keep the news a secret. Allie was pregnant with our fifth child and there were already more than enough reasons to worry, given that our first four had all died within a week of their birth. It occurred to me that none of them had had their heads X-rayed. There was also the issue of my ego, which was not equipped to deal with making public my potato-sized brain. I was glad it was hidden inside my skull. I told Allie the doctor said the headaches were probably due to dehydration, and began drinking even more water than usual.

             Doctor Chandragar called me on my cell phone the next morning. I was out for a walk, trying to clear my head. It wasn't working.

             "Jonathan?"

             "Doctor Chandragar."

             "I've found our expert. He's in Kazakhstan."

             Kazakhstan. I'd been expecting New York, Los Angeles, London, perhaps even Japan. But a dried up ex-Soviet republic? Perhaps I needed a second opinion.

             "It seems that the Kazakhs have had a rash of intra-cranial deformities over the past twenty years, probably because of the toxins in their soil. Nuclear tests, bioweapons labs, and so forth, you know. There's an American there with a particular interest in brain dysmorphia, and he comes highly recommended."

             Ah. An American. I could trust that. A little.

             "When are we going?"

             "I'm afraid I can't leave the country," replied the doctor. "I have a child on the way myself. But Doctor Woods is so excited about your case that he's offered to pay your way, and you can bring your wife as well."

             "Why can't he come here, then?"

             "I asked him that. He's got a very busy practice, his equipment is all there and so on."

             "Does he think I can be helped?"

             "He said he was working on a hormone treatment, something that could affect growth. He claims that he's cured several patients already."

             It sounded fishy, but I was desperate.

             "Email me the info," I said.

             Three weeks later, I was in Kazakhstan. Despite Doctor Woods' offer, I declined to bring Allie. I told her it was a research trip for my new book.

             The doctor took me to his lab and ran some tests. He gave me a vial of growth hormone and a room in the compound where he kept his other patients for observation. I was told the treatment would last six weeks.

             In a few days, the headaches went away. I returned home late the next month and got back to work. My novel was a success and kept me busy with speaking engagements. It was eight months before I finally got around to paying Doctor Chandragar another visit.

             When I got there, his clinic was closed. Boarded up. I asked an old homeless woman where it had gone.

             "Doctor Chandragar?" she said. "Oh, he turned out to be some quack! Telling everybody their brains had shrunk and then sending them to Kazakhstan for treatment. It was all a big insurance scam!"

             Could it be true? But what about my headaches? And what was it that Doctor Woods had given me?

             I went to another doctor. He did some X-rays. Everything was normal.

             Forty years later, I ran into Doctor Chandragar while on holiday in Wyoming. I confronted him angrily and threatened to press charges without really meaning it. He told me that he had been the victim of a disinformation campaign organized by rival doctors. He assured me that, prior to my trip to Kazakhstan, my brain had indeed been no larger than a mid-sized russet potato. I slapped him. His story was too hard to believe. He spat out a tooth and looked me straight in the eye.

             "Your headaches -- have they returned?"

             "No," I said, suddenly sheepish and defensive.

             "And your novels, they've been successful?"

             I showed him my Nobel ring. The one Allie made me after I got the prize.

             "And despite all this, you still suspect I duped you?"

             I had to admit that I did. But then things got interesting. Doctor Chandragar keyed some numbers into his cell phone and summoned Doctor Woods, who was living nearby in a retirement community. In a few minutes, he was with us by the pool.

             "So even you fell for the lies of the Brotherhood?" asked Doctor Woods, referring to the cabal of doctors Chandragar claimed had smeared his name. "After all we did for you?"

             "I'm not so sure you did anything for me," I said.

             "I paid for your trip. The whole treatment cost you nothing."

             "Yes, but you both collected on the insurance."

             "Well, we had expenses to cover. My lab. The serum. These things don't come for free."

             "Still, I did some reading after the treatment. Which I should have done before. There is no medical record of otherwise normal individuals with potato-sized brains, let alone of a hormonal treatment to induce brain growth. If any of this was true, you'd think there'd be some report somewhere..."

             "The Brotherhood is powerful," intoned Doctor Chandragar as he motioned the waitress for another Bloody Mary. "They have silenced our work completely."

             "Why do you think my clinic was in Kazakhstan?" asked Doctor Woods. "Do you think I would have self-located in such a hellhole without good reason?"

             "At the time you told me it was because of a preponderance of cases in that region."

             "That was my cover. At the time, the Brotherhood was looking to have me assassinated. I didn't want word getting out."

             "It was a security measure," added Doctor Chandragar. "Had we mentioned the Brotherhood, you might have gone to the press, which they control, and which would have very quickly led to a bombing of Doctor Woods' compound."

             "A bombing?"

             "The Brotherhood is very powerful."

             At this point, Allie returned from her yoga class with our daughter and her husband, Francis, who himself was a doctor of some renown. I had long since told Allie about my tangle with the doctors, or as we called them, the Snake Oil Men.

             "Allie, you'll never believe who I've run into," I said. But before she could respond, Francis produced a small ivory-handled pistol from a holster in the ass of his yoga pants.

             "Long live the Brotherhood!" he shouted, and shot both doctors in the head. He turned the gun on me and pulled the trigger again. I raised my hand to shield my face. The bullet ricocheted off my Nobel ring and hit Francis squarely between the eyes. My daughter fainted and Allie began to cry. The waitress just stood there, not knowing what to do with Doctor Chandragar's Bloody Mary.

             We went to the hospital to get my daughter checked out. By this point we had a police escort, but nobody was buying my story about the Brotherhood. I was just a crazy old Nobel laureate with a swollen finger. But then something happened that lent credence to my story, something that would bring down a thousand year old fraternity of medical practitioners, newsmen, and fighter pilots. The X-rays of my daughter's head came back, and there, floating in intracranial jelly, was a brain the size of a potato.

Jeff Watson, 2004

Endnotes


1.  Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. Ballantine, 1996.

2.  Arendt, Hannah. Lying in Politics, The New York Review, 1971.

3.  Bergson, Henri, Nancy Margaret Paul, and W. Scott Palmer. Matter and Memory. London: Forgotten Books, 2018.

4.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. A Thousand Plateaus. London: Bloomsbury, 2013.

5.  Deleuze, Gilles, Barbara Habberjam, and Hugh Tomlinson. Bergsonism. New York: Zone books, 1991.

6.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. A Thousand Plateaus. London: Bloomsbury, 2013.

7.  Massumi, Brian. Parables for the Virtual: Movement, Affect, Sensation. Durham: Duke Univ. Press, 2002.

8.  Cocke, John; Kolsky, Harwood. The Virtual Memory in the STRETCH Computer. Proceedings of the Eastern Joint Computer Conference, 1959.

9.  Deleuze, Gilles, Barbara Habberjam, and Hugh Tomlinson. Bergsonism. New York: Zone books, 1991.

10. Harney, Stefano; Moten, Fred. The Undercommons Fugitive Planning & Black Study. Brooklyn (NY): Autonomedia, 2013.

11.  Lefebvre, Henri. The Production of Space. Trans. Donald Nicholson-Smith. Blackwell Publishing, 1991.

12.  Lefebvre, Henri. Rhythmanalysis: Space, Time and Everyday Life. Bloomsbury Academic, 2013.

13.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

14.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

15. Foucault, Michel. Des Espace Autres, Architecture /Mouvement/ Continuité, France: Groupe Moniteur, 1984.

16.  Sharp, Joanne P. Geographies of Post-Colonialism: Spaces of Power and Representation. London: SAGE, 2009.

17.  Spivak, Gayatri Chakravorty. A Critique of Postcolonial Reason: toward a History of the Vanishing Present. Cambridge, MA: Harvard University Press, 2003.

18.  Cosgrove, Denis E. Geography and Vision: Seeing, Imagining and Representing the World. London: I.B. Tauris, 2012.

19.  Ibid.

20.  Fanon, Frantz. The Wretched of the Earth. Pref. by Jean-Paul Sartre. New York: Grove Press, 1968.

21.  Cosgrove, Denis E. Geography and Vision: Seeing, Imagining and Representing the World. London: I.B. Tauris, 2012.

22.  Ibid.

23.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

24.  Ibid.

25.  Ibid.

26.  Ibid.

27.  Noble, Safiya. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018.

28.  Angwin, Julia; Larson, Jeff; Mattu, Surya; Kirchner, Lauren. Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. ProPublica, May 23, 2016.

29.  Sekula, Allan. The Body and the Archive, October, 1986.

30.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

31.  Ahmed, Sara. Queer Phenomenology: Orientations, Objects, Others. Duke University Press, 2007.

32.  The Utah Teapot, Computer History Museum.

33.  Smith, Woodruff. Complications of the Commonplace: Tea, Sugar, and Imperialism. The Journal of Interdisciplinary History Vol. 23, No. 2, MIT Press, 1992.

34.  Sharp, Joanne P. Geographies of Post-Colonialism: Spaces of Power and Representation. London: SAGE, 2009.

35.  Lefebvre, Henri. Rhythmanalysis: Space, Time and Everyday Life. Bloomsbury Academic, 2013.

36.  Cicero, Tusculan Disputations 5.61.

37.  Conversation with Russ Athay, former Senior Software Engineer from Sutherland’s lab at the University of Utah, 2019.

38.  Sutherland, Ivan. “The Ultimate Display,” Information Processing Techniques Office, ARPA, OSD, 1965.

39.  Anderson, Steve F. Technologies of Vision: The War Between Data and Images. Cambridge, MA: MIT Press, 2017.

40.  Luckey, Palmer, Anduril

41.  Dean, Sam. “A 26-year-old billionaire is building virtual border walls—and the federal government is buying,” LA Times, 2019.

42.  Ibid.

43.  Ibid.

44.  DeLanda, Manuel. War in the Age of Intelligent Machines. New York, NY: Zone Books, 2003.

45.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. A Thousand Plateaus. London: Bloomsbury, 2013.

46.  Mitchell, W. J. T. Landscape and Power, Second Edition. University of Chicago Press, 2002.

47.  Cosgrove, Denis E. Geography and Vision: Seeing, Imagining and Representing the World. London: I.B. Tauris, 2012.

48.  What is the Internet of Things?, IBM, 2023.

49.  Pister, Kris. Smart Dust.

50.  Patent for Transparent electronics for invisible smart dust applications, IBM, 2018.

51.  Crichton, Michael, Synopsis of Prey, 2002.

52.  Marr, Bernard. Smart dust is coming. Are you ready? Forbes, 2018.

53.  Bacigalupi, Paolo. The Water Knife. Alfred A. Knopf, 2015.

54.  Vinge, Vernor. Rainbows End: A Novel with One Foot in the Future, Tor Books, 2006.

55.  Stephenson, Neal. Snow Crash. Bantam Books, 2022.

56.  Ibid.

57.  Anderson, M.T. Feed, Candlewick Press, 2022.

58.  Clarke, Arthur C. and Stephen Baxter, The Light of Other Days, Tor Books, 2000.

59.  R. W. Fuller and J. A. Wheeler, “Causality and Multiply-Connected Space-Time,” Phys. Rev. 128, 919, 1962.

60.  The Collected Papers of Albert Einstein, trans. Anna Beck, consultant Peter Havas, Princeton University Press, 1987.

61.  Andrés Anabalón, Bernard de Wit & Julio Oliva, Supersymmetric traversable wormholes, Journal of High Energy Physics volume 2020, Article number: 109, 2020.

62.  Clavin, Whitney. Physicists observe wormhole dynamics using a quantum computer, Caltech, 2022.

63.  Arthur Hebecker, Thomas Mikhail, Pablo Soler, Euclidean wormholes, baby universes, and their impact on particle physics and cosmology, Frontiers, 2018.

64.  Gibson, William. Neuromancer, Ace, 1984.

65.  Baclawski, K. The Observer Effect, IEEE Conference on Cognitive and Computational Aspects of Situation Management (CogSIMA), 2018.

66.  Vikoulov, Alex M. The Syntellect Hypothesis: Five Paradigms of the Mind's Evolution. Ecstadelic Media Group, 2020.

67.  Popkin, Gabriel, Einstein’s ‘spooky action at a distance’ spotted in objects almost big enough to see: Entangled electronic devices could help scientists make a quantum internet, Science, 2018.

68.  Stephenson, Neal. The Diamond Age. Bantam Spectra, 1995.

69.  Ibid.

70.  Ibid.

71.  Marr, Bernard. The Best Examples of Digital Twins Everyone Should Know, Forbes, 2022.

72.  Ibid.

73.  Dick, Philip K. The Minority Report, Pantheon, 2002.

74.  Crawford, Mark. 5 Ways to Cyber-Protect Your Digital Twin, The American Society of Mechanical Engineers, 2021.

75.  Ibid.

76.  Peck, Raoul. James Baldwin Was Right All Along, The Atlantic, 2020.

77.  Shierholz, Heidi and Celine McNicholas, Understanding the anti-regulation agenda, Economic Policy Institute, 2017.

78.  Browne, Simone. Dark Matters: On the Surveillance of Blackness, Duke University Press, 2015.

79.  Benjamin, Ruha. Captivating Technology, Duke University Press, 2019.

80.  Optics, Merriam Webster Dictionary, 2023.

81.  Galapagos, Episode 1: Good and Bad Optics, BBC, 2023.

82.  Moholy-Nagy, László. Painting, Photography, Film, 1925.

83.  Enoch, Jay M. Duplication of unique optical effects of ancient Egyptian lenses from the IV/V Dynasties: lenses fabricated ca 2620–2400 BC or roughly 4600 years ago, Ophthalmic and Physiological Optics, 2000.

84.  Ibid.

85.  Le Scribe Accroupi; Statue of a Scribe Seated Cross-Legged; The Crouching Scribe, The Louvre Museum, France.

86.  A. H. Layard, Discoveries in the Ruins of Nineveh and Babylon, London, 1853.

87.  The Nimrud Lens, The British Museum, London.

88.  The British Museum Act of 1963, The British Museum, London, 1963.

89.  Nelson, Maggie. Bluets, Wave Books, 2009.

90.  Smith, Mark A. Optics in the Time of Kepler, Encyclopedia of the History of Science, Carnegie Mellon University.

91.  Galileo and the Telescope, Library of Congress.

92.  Vollgraff, J.A. Snellius' Notes on the Reflection and Refraction of Rays, Osiris, Vol. 1, The University of Chicago Press, 1936.

93.  McDonough, Jeffrey. Dioptrics, The Cambridge Descartes Lexicon, 2016.

94.  Feynman, R.P. QED: The Strange Theory of Light and Matter, Princeton University Press, 1985.

95.  Nadler, S. Baruch Spinoza: Heretic, Lens Grinder. JAMA Ophthalmology. 2000.

96.  Bennett, Jonathan. Correspondence: Baruch Spinoza. Early Modern Texts, 2017.

97.  Urban, Miloš and Margaret Gullan-Whur. Within Reason: A Life of Spinoza, Peter Owen Limited, Prague, 1998.

98.  Huygens, Christian. Treatise on Light, Project Gutenberg, 1690.

99.  Ibid.

100.  Gély, Suzanne. André Marie Ampère (1775-1836) et Augustin Fresnel (1788-1827), Open Edition Journals, 2004.

101.  Watson, Bruce. Science Makes a Better Lighthouse Lens, Smithsonian Magazine, 1999.

102.  Ibid.

103.  Bernhard, Adrienne "The invention that saved a million ships", BBC, 2019.

104.  Maxwell, James Clerk. A Treatise on Electricity and Magnetism, Cambridge University, London, 1873.

105.  Maxwell, James Clerk. A dynamical theory of the electromagnetic field, Philosophical Transactions of the Royal Society of London, 1865.

106.  Kirchhoff, G. On the relation between the radiating and absorbing powers of different bodies for light and heat, trans. F. Guthrie, The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 1860.

107.  Planck, M. The Theory of Heat Radiation, Trans. Masius, P. Blakiston's Son & Co., 1914.

108.  Heilbron, J.L. The Dilemmas of an Upright Man: Max Planck and the Fortunes of German Science, Harvard University Press, 2000.

109.  Einstein A. Relativity: The Special and General Theory, H. Holt and Company, 1916.

110.  Gravitational Lensing, Center for Astrophysics, Harvard & Smithsonian.

111.  Calder, Nigel. Magic Universe: A grand tour of modern science. Oxford University Press, 2006.

112.  Cosmic Microwave Background (CMB) Radiation, The European Space Agency.

113.  Fletcher, Seth. The First Picture of the Black Hole at the Milky Way’s Heart Has Been Revealed, 2022.

114.  Lutz, Ota. How Scientists Captured the First Image of a Black Hole, Jet Propulsion Laboratory, NASA, 2019.

115.  Taggart, Emma and Margherita Cole, The History of Camera Obscura and How It Was Used as a Tool To Create Art in Perfect Perspective, 2022.

116.  Mohism, Stanford Encyclopedia of Philosophy, 2020.

117.  Mozi, The Mozi, Book 10: Exposition of Canon II, Trans. Ian Johnston, 2010.

118.  Sabra, A. I., ed. The Optics of Ibn al-Haytham, Books I–III: On Direct Vision, Harvard University, 1989.

119.  Ibid.

120.  Sherry, Bennett. The Universe Through a Pinhole: Hasan Ibn al-Haytham, Khan Academy.

121.  Friendly, Michael and Daniel J. Denis. Milestones in the History of Thematic Cartography, Statistical Graphics, and Data Visualization: An Illustrated Chronology of Innovations.

122.  Marchant, Jo. First known map of night sky found hidden in Medieval parchment: Fabled star catalogue by ancient Greek astronomer Hipparchus had been feared lost, Nature, 2022.

123.  Smith, A. Mark. Optics To The Time Of Kepler, Encyclopedia of the History of Science, Carnegie Mellon University.

124.  Leon Battista Alberti, On Painting and On Sculpture: The Latin Texts of “De Pittura” and “De Statua,” trans. Cecil Grayson (London: Phaidon, 1972).

125.  Jane Andrews Aiken, Leon Battista Alberti’s System of Human Proportions. Journal of the Warburg and Courtauld Institutes

126.  Theodolite, Smithsonian Museum.

127.  Theodolites, NOAA, 2022.

128.  Sextant, Smithsonian.

129.  Potonniée, Georges. The History of the Discovery of Photography. New York: Tennant and Ward, 1936.

130.  Heliography: A Double Invention That Revolutionized The World Of Images, Nicéphore Niépce Museum, Google Arts and Culture.

131.  Pettinger, Tejvan, A letter from Louis Daguerre to Charles Chevalier, Biography of Louis Daguerre, Oxford, UK, 2019.

132.  William Henry Fox Talbot's Calotype, History Of Photography Compendium, Chapman University, 2021.

133.  Flueckiger, Barbara, Timeline of Historical Film Colors, 2012.

134.  Ibid.

135.  Boyd, Jane E. Celluloid: The Eternal Substitute, Distillations, Science History Institute Museum & Library.

136.  Fineman, Mia, Kodak and the Rise of Amateur Photography, Department of Photographs, The Metropolitan Museum of Art, 2004.

137.  Solnit, Rebecca. River of Shadows: Eadweard Muybridge and the Technological Wild West, Viking, 2003.

138.  Ibid.

139.  Ibid.

140.  Albrecht Meydenbauer: Photometrography, Architects' Association in Berlin Paper, vol. 1, no. 14, 1867.

141.  Grimm, Albrecht, The Origin of the Term Photogrammetry, International Society for Photogrammetry and Remote Sensing, Accessed 2023.

142.  Polidori, L. On Laussedat’s Contribution To The Emergence Of Photogrammetry, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLIII-B2-2020, 2020 XXIV ISPRS Congress, 2020.

143.  Ragey, Louis. The Work Of Laussedat, National School of Arts and Crafts, Paris, 1952.

144.  Nadar, Felix. When I Was a Photographer (1900). Trans. Eduardo Cadava and Liana Theodoratou, MIT Press, 2015.

145.  Berger, John. Ways of Seeing. Penguin, 1972.

146.  Wheatstone, Charles. Contributions to the Physiology of Vision.—Part the First. On some remarkable, and hitherto unobserved, Phenomena of Binocular Vision, Philosophical Transactions of the Royal Society vol. 128, 1838.

147. Ballistic Chronograph, American Precision Museum.

148.  Stereo Disparity Estimation, Papers with Code, 2023.

149.  Forbes, Andrew, Michael de Oliveira and Mark R. Dennis. Structured Light,  Nature Photonics, 2021.

150.  National Oceanic and Atmospheric Administration. What is LIDAR, Department of Commerce, 2021.

151.  Einaudi, Franco, Gary K. Schwemmer, Bruce M. Gentry and James B. Abshire. Lidar Past, Present, and Future in NASA's Earth and Space Science Programs, NASA, 2004.

152.  Svein-Erik Hamran, Principal Investigator, RIMFAX, NASA.

153.  Mandai, Shingo, et al. Patent for Time-of-flight depth sensing with improved linearity, Apple Inc., 2022. US-20220244391-A1

154.  Medina, Antonio, US Patent: Three Dimensional Camera and Rangefinder, 1992, US-5081530

155.  Hardin, Winn. Time of Flight (ToF) Sensors Bring Autonomous Applications to Market, Association for Advancing Automation, 2021.

156.  Goodrich, Joanna. The First Digital Camera Was the Size of a Toaster, IEEE Spectrum, 2022.

157.  Davide Scaramuzza, Event Cameras, University of Zurich, Institute of Informatics - Institute of Neuroinformatics, Robotics and Perception Group.

158.  Gabrielsen, Paul. New Camera Inspired by Insect Eyes, Science, 2013.

159.  Guillermo Gallego, Tobi Delbruck, Garrick Orchard, Chiara Bartolozzi, Brian Taba, Andrea Censi, Stefan Leutenegger, Andrew Davison, Joerg Conradt, Kostas Daniilidis, Davide Scaramuzza, Event-based Vision: A Survey.

160.  Howarth, Josh. How Many People Own Smartphones (2023-2028), Exploding Topics, January 26, 2023.

161.  Alberto Acosta, Extractivism and neoextractivism: two sides of the same curse, Transnational Institute.

162.  Naomi Klein, This Changes Everything: Capitalism vs. The Climate. Simon & Schuster, 2014.

163. Jason Fernando, Resource Curse: Definition, Overview and Examples, Investopedia Futures and Commodities Trading: Strategy & Education, September 29, 2022.

164.  Why Does Extractives Matter?

165.  Alan J. Herbert, Lanthanum Glass.

166.  Lindsay Dodgson, On the trail of tantalum: tracking a conflict mineral, Mining Technology, 2016.

167.  Global Lanthanum Market, Research and Markets, 2018.

168.  White, Sarah and Jane O’Connell, The natural and industrial cycling of indium in the environment, Massachusetts Institute of Technology. Dept. of Civil and Environmental Engineering, 2012.

169.  Ibid.

170. Acosta, Jose A., Ángel Faz, et al. Environmental Risk Assessment of Tailings Ponds Using Geophysical and Geochemical Techniques, Assessment, Restoration and Reclamation of Mining Influenced Soils, 2017.

171.  Tang, Shuting, Chunli Zheng et al. Geobiochemistry characteristics of rare earth elements in soil and ground water: a case study in Baotou, China, National Library of Medicine. 2020.

172.  Mims, Christopher. Electronics Makers Have Worst Labor Practices of Any Industry, Says Report, MIT Technology Review, 2012.

173. Stanley, Jay. The Nightmarish Loss of Workplace Privacy, ACLU, 2022.

174.  Kendall, D. G. Stochastic Processes Occurring in the Theory of Queues and their Analysis by the Method of the Imbedded Markov Chain. The Annals of Mathematical Statistics, 1953.

175.  Shankland, Stephen. Google Uncloaks Once-Secret Server. CNET, 2009.

176.  Cyanide Toxicity, National Library of Medicine, 2023.

177.  Monserrate, Steven Gonzalez. The Staggering Ecological Impacts of Computation and the Cloud, MIT Press.

178.  Pascal, Blaise. Pensées, Trans. W.F. Trotter, Random House, 1941.

179.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. Apparatus of Capture, A Thousand Plateaus. London: Bloomsbury, 2013.

180.  Fussell, Angela, Terrestrial Photogrammetry in Archaeology, World Archaeology Vol. 14, No. 2, 1982.

181.  Foucault, Michel. Discipline and Punish, Pantheon, 1977.

182.  Magnani, Matthew and Matthew Douglass et al. The Digital Revolution to Come: Photogrammetry in Archaeological Practice, Cambridge University Press, 2020.

183.  Allahyari, Moreshin. Physical Tactics for Digital Colonialism, The New Museum, 2019.

184.  Tannús, Júlia. Optimizing and Automating Computerized Photogrammetry for 360° 3D Reconstruction, IEEE Symposium on Virtual and Augmented Reality.

185.  Schurian, Bernhard. Museum für Naturkunde, Berlin, 2023.

186.  Vasilescu, Denis. Renishaw Advanced Metrology Workshop, Autodesk, 2023.

187.  Weckenmann, A., G. Peggs and J. Hoffmann. Probing Systems for Dimensional Micro- and Nano-metrology, Measurement Science and Technology, 17, 2006.

188.  DeLanda, Manuel. War in the Age of Intelligent Machines. New York, NY: Zone Books, 2003.

189.  Lightcage, ESPER, 2023.

190.  Foucault, Michel. Discipline and Punish, Pantheon, 1977.

191.  Lefebvre, Henri. Rhythmanalysis: Space, Time and Everyday Life. Bloomsbury Academic, 2013.

192.  Schlosser, Eric. The Prison-Industrial Complex, The Atlantic, 1998.

193.  Lee, Kijun. Military Application of Aerial Photogrammetry Mapping Assisted by Small Unmanned Air Vehicles, Air Force Institute of Technology, Defense Technical Information Center, 2018.

194.  Denis Cosgrove, Geography and Vision: Seeing, Imagining and Representing the World, Tauris, 2008.

195.  Steyerl, Hito. In Free Fall: A Thought Experiment on Vertical Perspective, e-flux, Issue 24, 2011.

196.  Denis Cosgrove, Geography and Vision: Seeing, Imagining and Representing the World, Tauris, 2008.

197.  Steyerl, Hito. In Free Fall: A Thought Experiment on Vertical Perspective, e-flux, Issue 24, 2011.

198.  What is remote sensing and what is it used for?, United States Geological Survey.

199.  National Data Security Policy for Space-Based Earth Remote Sensing Systems: Background Information for the act on Satellite Data Security, Federal Ministry of Economics and Technology, Germany, 2007.

200.  Sanger, David. Ethical Challenges in the Practice of Remote Sensing and Geophysical Archaeology, Archeological Prospection, Volume 28, Issue 3, 2021.

201.  Emery, William and Adriano Camps. Optical Imaging Systems, Introduction to Satellite Remote Sensing, Comprehensive Remote Sensing, 2017.

202.  Olson, Eric. Guidelines for Setting Camera Field of View, Security Info Watch, 2019.

203.  Laidler, John and Shoshana Zuboff. High tech is watching you, The Harvard Gazette, 2019.

204.  Ibid.

205.  Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Public Affairs Books, 2019.

206.  Ibid.

207.  Vaughan, Janet. Against – 3D ultrasound in first and second trimester pregnancy – hype or helpful?, National Library of Medicine, 2015.

208.  Crisis Pregnancy Centers, Issue Brief, The American College of Obstetricians and Gynecologists, 2023.

209.  Hoskins, Peter, Kevin Martin and Abigail Thrush. Diagnostic Ultrasound: Physics and Equipment, Cambridge University Press, 2010.

210.  Russo, Jen. Mandated Ultrasound Prior to Abortion, AMA Journal of Ethics, 2014.

211.  3D Ultrasound Market Size, Share & Trends Analysis Report By Application, 2020-2027, Market Analysis Report, Grand View Research, 2019.

212.  3D Keepsake Imaging, 2023.

213.  Ultrasound Imaging, U.S. Food & Drug Administration, 2020.

214.  Grady, Stephanie. It's a Booming Baby Business, but are 3D/4D Ultrasounds Really Worth the Risk?, FOX6, 2015.

215.  Are all 3D ultrasounds weird looking?, What to Expect—Community Forum, 2018.

216.  Van Walree, Paul. Distortion, Photographic Optics, 2009.

217.  Fisheye Lens, Flat Earth Answers.

218.  Bowen, Christopher J. and Roy Thomson. Grammar of the Shot, Taylor & Francis, 2013.

219.  Ballard, Zachary, Calvin Brown, Asad M. Madni & Aydogan Ozcan. Machine learning and computation-enabled intelligent sensor design, Nature Machine Intelligence, 2021.

220.  Donoho, D.L., Compressed sensing, IEEE Transactions on Information Theory, 2006. doi:10.1109/TIT.2006.871582

221.  Lewis, Sarah. The Racial Bias Built into Photography, The New York Times, 2019.

222.  Crockford, Kade, How is Face Recognition Surveillance Technology Racist?, ACLU, 2020.

223.  The Data Divide, Ada Lovelace Institute, 2021.

224.  Developing a Minimum Digital Living Standard for Households with Children, University of Liverpool, 2021.

225.  Assigning Data Ownership, Data Governance Institute, 2023.

226.  Ibid.

227.  Ibid.

228.  Polge, Julien, Jérémy Robert, and Yves Le Traon. Permissioned Blockchain Frameworks in the Industry: A Comparison, ICT EXPRESS, 2021.

229.  Blackman, Reid. Why Blockchain’s Ethical Stakes Are So High … And How Developers And Users Can Mitigate Potential Harm, Harvard Business Review, 2022.

230.  Ray, Shaan. Blockchain Security Mechanisms, Toward Data Science, 2018.

231.  Can blockchain accelerate Internet of Things (IoT) adoption?, Deloitte, 2023.

232.  Schwartz, Ariel. Every bullet this gun fires would be automatically tracked in a database — here’s why, 2016.

233.  Gramlich, John. What the Data says about Gun Deaths in the U.S., Pew Research, 2023.

234.  Lucas, Ryan. The First Smart Gun with Facial and Fingerprint Recognition is Now for Sale, NPR, 2023.

235.  “Smart” Guns | Personalized Firearms, NRA Institute for Legislative Action.

236.  Ibid.

237.  Levi, Stuart D. and Alex B. Lipton, An Introduction to Smart Contracts and Their Potential and Inherent Limitations, Harvard Law School Forum on Corporate Governance, 2018.

238.  Conti, Robin. What Is An NFT? Non-Fungible Tokens Explained, Forbes, 2023.

239.  Botz, Anneli, Is Blockchain the Future of Art? Four Experts Weigh In, Art Basel, 2023.

240.  Beeple’s Opus, Christies, 2023.

241.  Kakar, Arun. Two Years since the Historic Beeple Sale, What’s Happened to the NFT Market?

242.  Beckett, Lois. ‘Huge mess of theft and fraud:’ artists sound alarm as NFT crime proliferates, The Guardian, 2022.

243.  Non-Fungible Token Study, The United States Copyright Office, 2023.

244.  King Jr, Martin Luther. The Case Against 'Tokenism', The New York Times, 1962.

245.  Cat-Wells, Keely.  NFTs By Disabled Creatives Breaking Moulds And Making Profits, Forbes, 2021.

246.  Madrigal, Alexis. How Blind Photographers Visualize the World, Forum, KQED, 2023.

247.  Ibid.

248.  Ibid.

249.  Ibid.

250.  Barber, Gregory. NFTs Are Hot. So Is Their Effect on the Earth’s Climate, Wired, 2021.

251.  Tabuchi, Hiroko. NFTs Are Shaking Up the Art World. They May Be Warming the Planet, Too, The New York Times, 2021.

252.  Ghorbanzadeh, Masoud. Proof-of-Stake (POS), Ethereum, 2023.

253.  Ibid.

254.  GRID Alternatives, 2023.

255.  Ibid.

256.  Ibid.

257.  Barthes, Roland. Camera Lucida: Reflections on Photography. Trans. Richard Howard. Hill and Wang, 1981.

258.  Mackenzie, Charles E. Coded Character Sets, History and Development, The Systems Programming Series (1 ed.). Addison-Wesley Publishing Company, Inc, 1980.

259.  Baudrillard, Jean. Simulacra and Simulation. Editions Galilee, 1981.

260.  Hansen, Nadja. Featured Publication: Photography and the American Civil War, 2013.

261.  Ibid.

262.  Photograph of Sojourner Truth, The Metropolitan Museum, 1965.

263.  Sojourner Truth, Library of Congress, 2023.

264.  McCurry, Stephanie. The Confederacy Was an Antidemocratic, Centralized State, The Atlantic, 2020.

265.  The Black Codes and Jim Crow Laws, National Geographic, Education, 2023.

266.  McPherson, Tara. Reconstructing Dixie, Duke University Press, 2003.

267.  Ibid.

268.  Chun, Wendy. The Enduring Ephemeral, or the Future Is a Memory, Critical Inquiry 35:1, 2008.

269.  Data Lakes and Data Swamps, IBM, 2023.

270.  Ibid.

271.  Apprich, Clemens, Wendy Hui Kyong Chun, Florian Cramer, and Hito Steyerl. Pattern Discrimination, 2019.

272.  Chun, Wendy, On Software or the Persistence of Visual Knowledge, Grey Room, 18, 2005.

273.  Ibid.

274.  GitHub abandons 'master' term to avoid slavery row, BBC, 2020.

275.  Da Silva, Laura, Javier Roca-Piera, and José-Jesús Fernández. Evaluation of Master-Slave Approaches for 3D Reconstruction in Electron Tomography, Lecture Notes in Computer Science, Vol. 5518, 2009.

276.  Oberhaus, Daniel. ‘Master/Slave’ Terminology Was Removed from Python Programming Language, 2018.

277.  Ibid.

278.  Ibid.

279.  Issue 34605: Avoid Master/Slave Terminology - Python Tracker. bugs.python.org. 2023.

280.  GitHub Abandons ‘Master’ Term to Avoid Slavery Row, BBC, 2020.

281.  “A Resolution to Redefine SPI Signal Names”. Open Source Hardware Association, 2022.

282.  Leonard, Ellis. It’s Time for IEEE to Retire ‘Master / Slave,’ EE Times, 2020.

283.  Galloway, Alexander. Language Wants to Be Overlooked: Software and Ideology, Journal of Visual Culture, Volume 5, Issue 3, 2006.

284.  Chun, Wendy Hui Kyong. On “Sourcery,” or Code as Fetish, Configurations, Volume 16, Number 3, The Johns Hopkins University Press, 2008. DOI: 10.1353/con.0.0064

285.  Chun, Wendy. The Enduring Ephemeral, or the Future Is a Memory, Critical Inquiry 35:1, 2008.

286.  Wade, Nicholas. On the Origins of Terms in Binocular Vision, National Library of Medicine, 2021.

287.  Roberts, Lawrence. Machine Perception Of Three-Dimensional Solids, Massachusetts Institute of Technology, 1963.

288.  Longuet-Higgins, Hugh Christopher. A Computer Algorithm for Reconstructing a Scene from Two Projections, Nature, 1981.

289.  Luong, Quang-Tuan and Olivier D. Faugeras. The Fundamental Matrix: Theory, Algorithms, and Stability Analysis, International Journal of Computer Vision, 1996.

290.  Chen, Yang and Gerard Medioni. Object Modeling by Registration of Multiple Range Images, Image and Vision Computing, 10, 1992. doi:10.1016/0262-8856(92)90066-C

291.  Besl, Paul and N.D. McKay, A Method for Registration of 3-D Shapes, IEEE Transactions on Pattern Analysis and Machine Intelligence, 14, 1992. doi:10.1109/34.121791

292.  Hartley, Richard and Andrew Zisserman. Multiple View Geometry in Computer Vision, Cambridge University Press, 2000.

293.  Lowe, David. Distinctive Image Features from Scale-Invariant Keypoints, University of British Columbia, 2004.

294.  Ballabeni, Andrea, Fabrizio Apollonio, Marco Gaiani, and Fabio Remondino. Advances in Image Pre-processing to Improve Automated 3D Reconstruction, 2015. doi:10.5194/isprsarchives-XL-5-W4-315-2015

295.  Ibid.

296.  Lowe, David. Distinctive Image Features from Scale-Invariant Keypoints, University of British Columbia, 2004.

297.  Luke, Robert, James Keller, and Jesus Chamorro-Martinez. Extending the Scale Invariant Feature Transform Descriptor into the Color Domain, ICGST-GVIP, ISSN 1687-398X, Volume (8), Issue (IV), 2008.

298.  Sjödahl, M. and L.R. Benckert. Electronic Speckle Photography: Analysis of an Algorithm Giving the Displacement with Subpixel Accuracy, Applied Optics, 1993. doi:10.1364/AO.32.002278

299.  Fourier, Jean-Baptiste Joseph. Théorie Analytique de la Chaleur, Firmin Didot Père et Fils. 1822.

300.  Sekula, Allan. The Body and the Archive, October, Vol. 39, MIT Press, Winter 1986.

301.  What Is a Feature Descriptor in Image Processing?, Baeldung, 2023.

302.  Definition of ransack, Merriam Webster Dictionary, 2023.

303.  Wang, X. Learning and Reasoning with Visual Correspondence in Time. 2019.

304.  Strutz, T. Data Fitting and Uncertainty, Springer, 2016.

305.  What are outliers in the data? National Institute of Standards and Technology (NIST), US Department of Commerce, 2023.

306.  Outlier, Etymology Online, 2023.

307.  Gress, Todd W, James Denvir, and Joseph I. Shapiro. Effect of removing outliers on statistical inference: implications to interpretation of experimental data in medical research, National Library of Medicine, 2018.

308.  Vural, Elif and A. Aydin Alatan. Outlier Removal for Sparse 3D Reconstruction from Video, The True Vision - Capture, Transmission and Display of 3D Video, IEEE, 2008.

309.  Standard Deviation, Wolfram, 2023.

310.  Leach, Richard. Abbe Error/Offset, CIRP Encyclopedia of Production Engineering, 2014. doi:10.1007/978-3-642-35950-7_16793-1

311.  Forczyk, Robert. Kursk 1943: The Southern Front, Bloomsbury Publishing, 2017.

312.  Trilateration vs. Triangulation, U.S. Department of Defense, 2023.

313.  Deleuze, Gilles and Felix Guattari. Anti-Oedipus: Capitalism and Schizophrenia, Trans. Robert Hurley, Mark Seem, and Helen R. Lane. University of Minnesota Press, 1983.

314.  Ibid.

315.  Lourakis, M.I.A. and A.A. Argyros. SBA: A Software Package for Generic Sparse Bundle Adjustment, ACM Transactions on Mathematical Software, 2009. doi:10.1145/1486525.1486527

316.  Ibid.

317.  Couprie, Camille, Leo Grady, Laurent Najman, and Hugues Talbot. Power Watersheds: A Unifying Graph-Based Optimization Framework, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 33, No. 7, 2011.

318.  Tošić, Ivana and Pascal Frossard. Spherical Imaging in Omnidirectional Camera Networks, Multi-Camera Networks: Principles and Applications, 2009.

319.  The Atlas of Inequality, MIT, 2023.

320.  Yedidia, J.S., W.T. Freeman, and Y. Weiss. Understanding Belief Propagation and Its Generalizations, Exploring Artificial Intelligence in the New Millennium, Morgan Kaufmann, 2003.

321.  Jaiswal, J., C. LoSchiavo, and D. C. Perlman. Disinformation, Misinformation and Inequality-Driven Mistrust in the Time of COVID-19: Lessons Unlearned from AIDS Denialism, AIDS Behav., 24, 10, 2020. doi:10.1007/s10461-020-02925-y

322.  Stereo, Etymology Online, 2023.

323.  The Physical History of 'Stereotype': From the Printing House to Everyone's House, Merriam Webster, 2023.

324.  Lippmann, Walter, Public Opinion, New York, MacMillan Co, 1922.

325.  Levinas, Emmanuel. Collected Philosophical Papers. Trans. Alphonso Lingis, 1987.

326.  Kutulakos, Kiriakos N. and Steven M. Seitz. A Theory of Shape by Space Carving, International Journal of Computer Vision 38 (3), 2000.

327.  Appel, Arthur. Some techniques for shading machine renderings of solids, ACM, 1968.

328.  Vale, Paul. Vaclav Havel Dead: The Quotes Of The Man Who ‘Lived In Truth,’ The Huffington Post, 2011.

329.  Wolchover, Natalie. A New Physics Theory of Life, Quanta, 2014.

330.  Tanks and Temples, 2023.

331.  Ibid.

332.  Ibid.

333.  1 Kings 6:20.

334.  Donefer-Hickie, Ana Matisse and Wolfram Koeppe. Take a Peek Inside an Ancient Temple!, The Metropolitan Museum, 2020.

335.  Weisstein, Eric W. Normal Vector, MathWorld, 2023.

336.  Foucault, Michel. Abnormal: Lectures at the Collège de France, 1974-1975. Trans. Graham Burchell, Verso, 2003.

337.  Tongson, Karen. Normporn, New York University Press, 2023.  

338.  Morrison, Toni. The Bluest Eye, Holt, Rinehart and Winston, 1970.

339.  Ahmed, Sara. Queer Phenomenology: Orientations, Objects, Others. Duke University Press, 2006. Project MUSE. 

340.  Ibid.

341.  Massumi, Brian. Parables for the Virtual: Movement, Affect, Sensation, Duke University Press, 2002.

342.  Chibane, Julian, Thiemo Alldieck, and Gerard Pons-Moll. Implicit Functions in Feature Space for 3D Shape Reconstruction and Completion, Conference on Computer Vision and Pattern Recognition, IEEE, 2020.

343.  Mescheder, Lars, Michael Oechsle, Michael Niemeyer, Sebastian Nowozin, and Andreas Geiger. Occupancy Networks: Learning 3D Reconstruction in Function Space, Conference on Computer Vision and Pattern Recognition, IEEE, 2019.

344.  March, Merriam Webster Dictionary, 2023.

345.  Lorensen, William E. and Harvey E. Cline. Marching Cubes: A High Resolution 3D Surface Construction Algorithm, Computer Graphics, Vol. 21, No. 4, 1987.

346.  Deleuze, Gilles and Felix Guattari. Anti-Oedipus: Capitalism and Schizophrenia, Trans. Robert Hurley, Mark Seem, and Helen R. Lane. University of Minnesota Press, 1983.

347.  Ibid.

348.  Ibid.

349.  Baumgart, Bruce. Winged-Edge Polyhedron Representation for Computer Vision. National Computer Conference, 1975.

350.  Diaz-Andreu, Margarita. Colonialism and the Archaeology of the Primitive, A World History of Nineteenth-Century Archaeology: Nationalism, Colonialism, and the Past, 2007.

351.  Kazhdan, Michael, Matthew Bolitho, and Hugues Hoppe. Poisson Surface Reconstruction, Eurographics Symposium on Geometry Processing, 2006.

352.  Mesh Smoothing, Graphics, Stanford University.

353.  Veneziano, Alessio, Federica Landi, and Antonio Profico. Surface Smoothing, Decimation, And Their Effects On 3D Biological Specimens, American Journal of Physical Anthropology, Vol. 166, I. 2, 2018.

354.  Botsch, Mario, Mark Pauly, Leif Kobbelt, and Pierre Alliez. Geometric Modeling Based on Polygonal Meshes, 2007. DOI:10.1145/1281500.1281640

355.  Plutarch. Plutarch's Parallel Lives: Antony, Internet Classics Archive, 75 CE.

356.  Catmull, E. A Subdivision Algorithm for Computer Display of Curved Surfaces. University of Utah, 1974.

357.  Ibid.

358.  Deconstructing Deepfakes—How do they work and what are the risks?, US Government Accountability Office, 2023.

359.  Ibid.

360.  Van Holland, Leif, Patrick Stotko, Stefan Krumpen, Reinhard Klein, and Michael Weinmann. Efficient 3D Reconstruction, Streaming and Visualization of Static and Dynamic Scene Parts for Multi-client Live-telepresence in Large-scale Environments, 2022.  

361.  Jensen, H. Global Illumination using Photon Maps, Stanford University, 1996.

362.  Fast Fourier Transform (FFT) and Convolution in Medical Image Reconstruction, Intel, 2020.

363.  Mustafi, Sara and Tatiana Latychevskaia. Fourier Transform Holography: A Lensless Imaging Technique, Its Principles and Applications, Photonics, 2023.

364.  BlinkOnCrime.com is owned and operated by Shannon Christina Stoy. Christina Stoy and BlinkOnCrime.com are well known for sensationalism and dishonest content. In her quest for internet fame and web hits, Stoy is commonly libelous, callous with the privacy of others, and recklessly speculative. Countless people have been left in Stoy's wake, bystanders in police investigations needlessly dragged through the mud by an internet tabloid writer and her minions, often with no foundation or, even worse, with disinformation and libel, 2023.

365.  Romano, Aja. Why We’re Relitigating The Casey Anthony Case Now — And Why We Shouldn’t, Vox, 2022.

366.  Ryu, Jenna. Casey Anthony is a 'pathological liar,' new series says. What does that really mean?, USA Today, 2022.

367.  Li, Wendy. Casey Anthony-Related Merchandises Selling Like Hot Cakes on Ebay, The International Business Times, 2011.

368.  Lohr, David. Casey Anthony: Hustler Offers $500,000 For Nude Photos, Larry Flynt Reports, The Huffington Post, 2011.

369.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

370.  Ibid.

371.  Nagourney, Adam, Ian Lovett and Richard Pérez-Peña. San Bernardino Shooting Kills at Least 14; Two Suspects, The New York Times, 2015.

372.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

373.  Colorado Springs Uses Laser Scanner to Document Mass Shooting, FARO, 2023.

374.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

375.  Schwartz, John. Debate Over Full-Body Scans vs. Invasion of Privacy Flares Anew After Incident, The New York Times, 2009.

376.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

377.  Ibid.

378.  Hartnett, Kevin. The ‘Useless’ Perspective That Transformed Mathematics, Quanta, 2020.

379.  Moscovici, S. La Psychanalyse, Son Image et Son Public, Presses Universitaires de France, 1961.

380.  Moscovici, S. Attitudes and opinions. Annual Review of Psychology, 14, 1963.

381.  Eight Months Pregnant and Arrested After False Facial Recognition Match, The New York Times, 2023.

382.  Haraway, Donna. A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s, Socialist Review, 1985.

383.  Rocha, Jara and Femke Snelting. “Possible Bodies,” Volumetric Regimes: Material Cultures of Quantified Presence, Open Humanities Press, 2022.

384.  Holodexxx VR: Porn Stars Scanned into 3D, VR Porn, 2017.

385.  Cram, Rob. Holodexxx Experimental Customization/Studio/Home Overview, 2021.

386.  Ibid.

387.  3D Scan Store, 2023.

388.  Hao, Karen. Deepfake Porn Is Ruining Women’s Lives. Now The Law May Finally Ban It, MIT Technology Review, 2021.

389.  Ibid.

390.  BeHere /1942, A New Lens on the Japanese American Incarceration, Japanese American National Museum, 2022.

391.  Interview with David Leonard, 2022.

392.  Dashevsky, Evan. 18 Completely Inappropriate Places to Play Pokemon Go, PC Magazine, 2016.

393.  Fujihata, Masaki, The Museum Inside The Network, 1995.

394.  Anantova, N. Monuments in the Structure of an Urban Environment: The Source of Social Memory and the Marker of the Urban Space, Materials Science and Engineering, 2017.

395.  Alberge, Dalya. British Museum Is World's Largest Receiver Of Stolen Goods, Says QC, The Guardian, 2019.

396.  Lasaponara, Rosa; Masini, Nicola. Living in the Golden Age of Digital Archaeology, Computational Science and Its Applications – ICCSA, Springer International Publishing, 2016, doi:10.1007/978-3-319-42108-7_47

397.  Kwet, Michael. Digital colonialism: The evolution of US empire, Longreads, March 2021.

398.  The Robot Guerrilla Campaign to Recreate the Elgin Marbles, The New York Times, 2022.

399.  Ibid.

400.  Mendelsohn, Daniel. Deep Frieze: What does the Parthenon mean?, The New Yorker, 2014.

401.  Ibid.

402.  Alexander, Caroline. If It Pleases the Gods: The Parthenon Enigma, The New York Times, 2014.

403.  Ibid.

404.  Mendelsohn, Daniel. Deep Frieze: What does the Parthenon mean?, The New Yorker, 2014.

405.  Wood, Gillen D'Arcy. Mourning the Marbles: The Strange Case of Lord Elgin's Nose, The Wordsworth Circle, Vol. 29, Num. 3.

406.  The Robot Guerrilla Campaign to Recreate the Elgin Marbles, The New York Times, 2022.

407.  Wood, Gillen D'Arcy. Mourning the Marbles: The Strange Case of Lord Elgin's Nose, The Wordsworth Circle, Vol. 29, Num. 3.

408.  Ibid.

409.  The Parthenon Sculptures: The Trustees’ Statement, The British Museum, 2023.

410.  Ibid.

411.  Ibid.

412.  Michel, Roger. The Institute for Digital Archaeology, 2023.

413.  Talbot, Margaret. The Myth of Whiteness in Classical Sculpture, The New Yorker, 2018.

414.  Bond, Sarah. Whitewashing Ancient Statues: Whiteness, Racism And Color In The Ancient World, 2017.

415.  The Robot Guerrilla Campaign to Recreate the Elgin Marbles, The New York Times, 2022.

416.  British Museum Calls For ‘Parthenon Partnership’ With Greece Over Marbles, 2022.

417.  Allahyari, Morehshin. She Who Sees the Unknown, 2021.

418.  The Lincoln Memorial, U.S. National Parks Service, 2023.

419.  The Vietnam Veterans Memorial, U.S. National Parks Service, 2023.

420.  Zuhowski, Emilie. Memorial Day first celebrated at Charleston’s Hampton Park, 2022.

421.  Southern Documentary Fund, The Low Country, 2021.

422.  Bryant, Marie Claire. Underground Railroad Quilt Codes: What We Know, What We Believe, and What Inspires Us, The Smithsonian, 2019.

423.  Voluptuous Disintegration: A Future History of Black Computational Thought. Digital Humanities, Vol. 16, Num. 3, 2022.

424.  Six Years Later: 170 Confederate Monuments Removed Since Charleston Church Massacre, Southern Poverty Law Center, 2021.

425.  Parker, Adam. Few Black Burial Grounds Remain Intact In Charleston. Gullah Society Wants To Save Them, 2022.

426.  The Legacy Museum, 2023.

427.  Monuments, LAXART, 2023.

428.  Virtual Tour of United States Veterans and War Memorials, U.S. National Parks Service, 2023.

429.  Honor Everywhere: Virtual Reality Veterans Experience.

430.  Civil War 1864: A Virtual Reality Experience.

431.  Traveling While Black, Felix & Paul Studios, Oculus, 2019.

432.  1000 Cut Journey, Stanford Virtual Human Interaction Lab, 2018.

433.  Baccus-Clark, Ashley, Carmen Aguilar Y Wedge, Ece Tankal, and Nitzan Bartov. NeuroSpeculative AfroFuturism, MIT Docubase, 2017.

434.  Carne y Arena.

435.  Cahill, Nancy Baker. Liberty Bell, Association For Public Art, 2023.

436.  Olujimi, Kambui. Skywriters & Constellations: Full Dome Film and Related Exhibition, Newark Museum, 2018.

437.  Freeman, John Craig. Border Memorial: Frontera de los Muertos, 2012.

438.  Thiel, Tamiko and /p. Unexpected Growth, The Whitney Museum, New York, 2018.

439.  The Heritage Foundation.

440.  University of Southern California.

441.  ChatGPT, OpenAI, 2023.

442.  Virgil, The Aeneid, Book II, Translated by A. S. Kline, 2002.

443.  Lustig, R. H. The Hacking of the American Mind: The Science Behind the Corporate Takeover of Our Bodies and Brains, Avery, New York, 2017.

444.  Ibid.

445.  Brin, Sarah. Subsidized, The Aesthetics of Play, Hammer Museum.

446.  Deleuze, Gilles. Spinoza: Practical Philosophy. Trans. Robert Hurley, City Lights Books, 1988.

447.  Putnam, Hilary. Reason, Truth, and History, Cambridge University Press, 1981.

448.  Liiva, Johan, Johan Reinholdz and Matte Modin. Deus Deceptor, NonExist, 2019.

449.  Skepticism and Content Externalism, Stanford Encyclopedia of Philosophy, 2018.

450.  The Digital Democracy Institute.

451.  Ibid.

452.  Marx, Karl. Das Kapital.

453.  Deleuze, Gilles and Felix Guattari. Anti-Oedipus: Capitalism and Schizophrenia, Trans. Robert Hurley, Mark Seem, and Helen R. Lane. University of Minnesota Press, 1983.

454.  Lyotard, Jean-François. Energumen Capitalism, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

455.  Ibid.

456.  Ballard, J. G. Fictions of Every Kind, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

457.  Cybernetic Culture Research Unit. Lemurian Time War, CCRU Writings 1997-2003, Time Spiral Press, 2015.

458.  Ibid.

459.  Noys, Benjamin. The Persistence of the Negative: A Critique of Contemporary Continental Theory, Edinburgh University Press, 2010.

460.  Ibid.

461.  Ibid.

462.  Hoffman, Bruce. A Year After January 6, Is Accelerationism the New Terrorist Threat?, Council on Foreign Relations, 2022. 

463.  Singleton, Benedict. Maximum Jailbreak, e-flux Journal, Is. 46, 2013.

464.  Ibid.
465.  Williams, Alex. Xenoeconomics and Capital Unbound, Splintering Bone Ashes, 2008. 
466.  Fisher, Mark. Nihilism Without Negativity, k-punk, 2008. 
467.  Williams, Alex. Post-Land: The Paradoxes of a Speculative Realist Politics, Splintering Bone Ashes, 2008. 

468.  Land, Nick. Meltdown, Fanged Noumena: Collected Writings 1987-2007, Urbanomic, 2017.

469.  Ibid.

470.  Ibid.

471.  Land, Nick. The Dark Enlightenment, The Dark Enlightenment, 2012. 
472.  Ibid.
473.  Ibid.
474.  Ibid.

475.  Ibid.

476.  Ibid.
477.  Ibid.
478.  Ibid.
479.  Ibid.
480.  Ibid.
481.  Ibid.
482.  Land, Nick. Hyper-Racism, Outside In: Involvements with Reality, 2014. 

483.  Ibid.

484.  Williams, Alex; Srnicek, Nick. #ACCELERATE MANIFESTO for an Accelerationist Politics, Critical Legal Thinking, 2013.
485.  Ibid.
486.  Ibid.

487.  Ibid.

488.  Ibid.

489.  Ibid.
490.  Ibid.
491.  Ibid.
492.  Ibid.
493.  Ibid.
494.  Land, Nick. Annotated #Accelerate (#1), Urban Future (2.1): Views from the Decopunk Delta, 2014.
495.  Ibid.
496.  Ibid.
497.  Ibid.

498.  Ibid.

499.  Ibid.
500.  Ibid.

501.  Williams, Alex. Escape Velocities, e-flux 46, 2013.

502.  Ibid.
503.  Ibid.
504.  Ibid.
505.  Ibid.
506.  Ibid.
507.  Williams, Alex; Srnicek, Nick. Inventing the Future: Postcapitalism and a World without Work. London: Verso, 2016.
508.  Ibid.
509.  Srnicek, Nick. Platform Capitalism, Polity, 2017.
510.  Ibid.
511.  Land, Nick. Crypto-Current, An Introduction to Bitcoin and Philosophy, Šum, #10.2, November 26, 2018.
512.  Ibid.
513.  Abadi, Joseph; Brunnermeier, Markus. Blockchain Economics, 2018. 
514.  Land, Nick. Crypto-Current, An Introduction to Bitcoin and Philosophy, Šum, #10.2, 2018.

515.  Ibid.

516.  Berger, Edmund. Unconditional Acceleration and the Question of Praxis: Some Preliminary Thoughts, Deterritorial Investigations, 2017.
517.  Irigaray, Luce. This Sex Which Is Not One, Cornell University Press, 1985.
518.  Firestone, Shulamith. The Two Modes of Cultural History, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.
519.  Ibid.
520.  Ibid.

521.  Haraway, Donna Jeanne. Manifestly Haraway, University of Minnesota Press, 2016.

522.  Ibid.

523.  Ibid.

524.  Ibid.

525.  Ibid.

526.  Plant, Sadie. Zeros and Ones: Digital Women and the New Technoculture, Fourth Estate, 1998.
527.  Preciado, Paul B. Testo Junkie: Sex, Drugs, and Biopolitics in the Pharmacopornographic Era, Feminist Press at the City University of New York, 2017.
528.  Ibid.

529.  Ibid.

530.  Cuboniks, Laboria. The Xenofeminist Manifesto: a Politics for Alienation, Verso, 2018. 
531.  Ibid.
532.  Ibid.
533.  Ibid.
534.  Ibid.
535.  Ibid.
536.  Ibid.
537.  Ibid.
538.  Ibid.
539.  Dean, Aria. Notes on Blaccelerationism, e-flux 87, 2017.
540.  Spillers, Hortense. Mama's Baby, Papa's Maybe: An American Grammar Book, Black, White, and in Color: Essays on American Literature and Culture, University of Chicago Press, 2003. First published in Diacritics, Summer 1987.
541.  Ibid.
542.  Ibid.

543.  McKittrick, Katherine. Sylvia Wynter: On Being Human as Praxis, Duke University Press, 2014.

544.  Ibid.
545.  Hartman, Saidiya. Venus in Two Acts, Small Axe: A Caribbean Journal of Criticism 26, 2008.
546.  Wark, McKenzie. Black Accelerationism, Public Seminar, 2017.
547.  Ibid.

548.  Negarestani, Reza. The Labor of the Inhuman, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

549.  Ibid.

550.  Fisher, Mark. Capitalist Realism: Is There No Alternative? Zero Books, 2010.

551.  Brassier, Ray. Prometheanism and its Critics, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

552.  TensorFlow. 

553.  PyTorch.

554.  What is a Tensor? University of Cambridge, 2023. 

555.  What is supervised learning?, IBM, 2023.

556.  Mildenhall, Ben, Pratul P. Srinivasan, Matthew Tancik, et al. NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis, ECCV, 2020. 

557.  NERF, Hasbro, 2023.

558.  Stuart, Keith. Photorealism—The Future of Video Game Visuals, The Guardian, 2015. 

559.  Autonomous Weapons.

560.  Russell, Stuart, Anthony Aguirre, Emilia Javorsky, and Max Tegmark. Lethal Autonomous Weapons Exist; They Must Be Banned, IEEE Spectrum, 2021.

561.  Planet.

562.  Borges, Jorge Luis. On Exactitude in Science, Collected Fictions, trans. by Andrew Hurley, 1946.

563.  Hacker-Wright, John. Philippa Foot, Stanford Encyclopedia of Philosophy, 2018. 

564.  Millar, Jason. An ethical dilemma: When robot cars must kill, who should pick the victim?, Robohub, 2014.

565.  Cummings, M. L.  Artificial Intelligence and the Future of Warfare, International Security Department and US and the Americas Programme, Chatham House, 2017.

566.  Guerin, Joris, Olivier Gibaru, Stephane Thiery, and Eric Nyiri. CNN Features Are Also Great at Unsupervised Classification, 2018. 

567.  Definition of Convolution, Merriam Webster Dictionary. 2023. 

568.  Srivastava, Abhinai. The Evolution Of Computer Vision And Its Impact On Real-World Applications, Forbes, 2021.

569.  Fukushima, Kunihiko. Neocognitron: A Self-organizing Neural Network Model for a Mechanism of Pattern Recognition Unaffected by Shift in Position, Springer-Verlag, 1980.

570.  Philipp, George, Dawn Song, and Jaime G. Carbonell. The Exploding Gradient Problem Demystified—Definition, Prevalence, Impact, Origin, Tradeoffs, and Solutions, ICLR, 2018.

571.  Glorot, Xavier, Antoine Bordes and Yoshua Bengio. Deep Sparse Rectifier Neural Networks, 2011. 

572.  Shin, Yeonjong, Lu Lu, Yanhui Su, and George Em Karniadakis. Dying ReLU and Initialization: Theory and Numerical Examples, 2019.

573.  LeakyReLU, Pytorch, 2023.

574.  Meadows, Donella. Leverage Points: Places to Intervene in a System, Sustainability Institute, 1999.

575.  Climate Feedback, The Study of Earth as an Integrated System, NASA, 2023.

576.  Chattopadhyay, D. Electronics: Fundamentals And Applications,  New Age International, 2006.

577.  Newton’s Second Law, The Physics Classroom, 2023.

578.  Yathish, Vishal. Loss Functions and Their Use in Neural Networks, Towards Data Science, 2022.

579.  Graves, Alex, Greg Wayne, and Ivo Danihelka. Neural Turing Machines, 2014.

580.  Shapiro, Linda G. and George C. Stockman. Computer Vision, Prentice-Hall, 2001.

581.  Lu, Luhui. Generative AI and Future, Towards AI, 2022.

582.  Goodfellow, Ian J. et al. Generative Adversarial Nets, The International Conference on Neural Information Processing Systems, 2014.

583.  Nikolenko, Sergey I. Synthetic Data for Deep Learning, Springer Optimization and Its Applications, Vol. 174, 2021.

584.  Jordon, James et al. Synthetic Data—What, Why, and How? Report commissioned by The Turing Institute and The Royal Society, 2023.

585.  Ibid.

586.  Griffiths, Catherine. Unmodelled: In the Blindspot of AI Infrastructure, Gradient Magazine, 2023. 

587.  Ibid.

588.  Ibid.

589.  Ibid.

590.  Negarestani, Reza. Intelligence and Spirit, The MIT Press, 2018.  

591.  Siphon, Merriam Webster Dictionary, 2023. 

592.  Stable Diffusion.

593.  Gray, Mary L. and Siddharth Suri. Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass, 2019.

594. Timnit Gebru. 

595.  Joy Buolamwini.

596. Kate Crawford.

597. Ruha Benjamin. 

598. Safiya Umoja Noble. 

599. Virginia Eubanks. 

600. Arvind Narayanan.

601.  Latanya Sweeney.

602. Alex Hanna.

603. Os Keyes.

604. Joanna Bryson. 

605. Deborah Raji.

606. Margaret Mitchell.

607. Cade Metz. 

608.  Rumman Chowdhury.

609.  Partnership on AI.

610. AI Now Institute.

611. AI Ethics Lab. 

612. AI4ALL.

613.  Vincent, James. AI Art Tools Stable Diffusion and Midjourney Targeted with Copyright Lawsuit, The Verge, 2023.

614.  Hill, Kashmir. This Tool Could Protect Artists From A.I.-Generated Art That Steals Their Style, The New York Times, 2023.

615.  Ibid.  

616.  Barshad, Amos. This Singer Deepfaked Her Own Voice—and Thinks You Should Too, Wired, 2022. 

617.  Chiang, Ted. ChatGPT Is a Blurry JPEG of the Web, The New Yorker, 2023.

618.  Liu, Zhen, et al. MeshDiffusion: Score-based Generative 3D Mesh Modeling, ICLR, 2023.

619.  Point-E: A system for generating 3D point clouds from complex prompts, Open AI, 2023.

620.  Ibid.

621.  Saunders, Jack. Person-Specific Deepfakes with 3D Morphable Models, Medium, 2023. 

622.  Deepfakes, Reddit, 2023.

623.  Roose, Kevin. Here Come the Fake Videos, Too, The New York Times, 2018.

624.  Fagan, Abigail. Deep Fakes Are Becoming More Harmful for Women, Psychology Today, 2022.

625.  Bond, Shannon. People Are Trying To Claim Real Videos Are Deepfakes. The Courts Are Not Amused, National Public Radio, 2023.

626.  Vaccari, C. and A. Chadwick. Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News, Social Media + Society, 6(1), 2020.

627.  Chesney, B., and D. Citron. Deep Fakes: A Looming Challenge For Privacy, Democracy, And National Security, California Law Review, 107, 2019.

628.  Doermann, David. Speaking About Deepfakes to the U.S. House Intelligence Committee, 2019.

629.  Köbis, N. C., B. Doležalová, and I. Soraperra. Fooled Twice: People Cannot Detect Deepfakes but Think They Can, iScience, 24 (11), 2021.

630.  Radner, Karen, Eleanor Robson. The Oxford Handbook of Cuneiform Culture. Oxford University Press, 2011.

631.  Jagersma, B. A Descriptive Grammar of Sumerian, Leiden University, 2010.

632.  Lemche, Niels Peter. Biblical Studies and the Failure of History: Changing Perspectives 3, Taylor & Francis, 2014.

633.  Claassens, Juliana. Resisting Dehumanization: Acts of Relational Care in Exodus 1-2 as Image of God's Liberating Presence, Scriptura, 2010.

634.  Exodus Chapter 20, Parashat Terumah, 2023. 

635.  Exodus Chapter 25, Parashat Terumah, 2023. 

636.  Cubit, Merriam Webster Dictionary, 2023.

637.  Schumacher, Benjamin. Quantum Coding, Physical Review A, 1993.

638.  Esquivel, Jessica. The Queer Universe: A Quantum Explanation, 2022.

639.  Von Baeyer, Hans Christian. The Qubit, Information in the Quantum Age, Information, The New Language of Science, 2003.

640.  Einstein, Albert. The Born-Einstein letters: correspondence between Albert Einstein and Max and Hedwig Born from 1916–1955, with commentaries by Max Born. Macmillan. 1971.

641.  Schrödinger, Erwin. Proceedings of the Cambridge Philosophical Society, 31, 1935.

642.  Xue, Shichuan, Yong Liu, Yang Wang, Pingyu Zhu, Chu Guo, and Junjie Wu. Variational Quantum Process Tomography, 2021. arXiv:2108.02351

643.  Blume-Kohout, Robin. Optimal, Reliable Estimation of Quantum States, Institute for Quantum Information, Caltech, New Journal of Physics, 2006. arXiv:quant-ph/0611080

644.  Zeh, H. Dieter. On the Interpretation of Measurement in Quantum Theory, Foundations of Physics. 1, 1, 1970. doi:10.1007/BF00708656

645.  Choi, Charles. Electric Cooling Could Shrink Quantum Computers Vacuum-tube effect might simplify cryogenic chambers, IEEE Spectrum, 2023.

646.  Huang, He-Liang, Dachao Wu, Daojin Fan, and Xiaobo Zhu. Superconducting Quantum Computing: A Review, Science China Information Sciences 63, 2020.

647.  Vepsäläinen, Antti P., et al. Impact of Ionizing Radiation on Superconducting Qubit Coherence, Nature volume 584, 2020.

648.  Ma, He, Marco Govoni, and Giulia Galli. Quantum Simulations of Materials on Near-term Quantum Computers, Computational Materials, Nature, 2020.

649.  Evers, Matthias, Anna Heid, and Ivan Ostojic. Pharma’s Digital Rx: Quantum Computing in Drug Research and Development, McKinsey, 2021.

650.  Preskill, John. Quantum Computing and the Entanglement Frontier, 2012. arXiv:1203.5813

651.  Arute, Frank, et al. Quantum Supremacy Using a Programmable Superconducting Processor, Nature, 2019.

652.  Roush, Wade. The Google-IBM “Quantum Supremacy” Feud, MIT Technology Review, 2020.

653.  Kaku, Michio. Quantum Supremacy: How the Quantum Computer Revolution Will Change Everything, Doubleday, 2023.

654.  Chun, Wendy Hui Kyong. Control and Freedom: Power and Paranoia in the Age of Fiber Optics, The MIT Press, 2006.

655.  Ibid.

656.  Spinoza, Baruch, and R. H. M. Elwes. The Ethics. Mineola, NY: Dover Publications, Inc., 2018.

657.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

658.  Ibid.

659.  Grosz, Elizabeth. The Incorporeal: Ontology, Ethics, and the Limits of Materialism. Columbia University Press, 2018.

660.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

661.  Ibid.

662.  Ibid.

663.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

664.  Levinas, Emmanuel. Totality and Infinity: An Essay on Exteriority, Duquesne Press, 1961.

665.  Brain in the vat (III. general definition of deception).

666.  Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective,” Feminist Studies 14, no. 3, 1988.

667.  Dolphijn, Rick and Iris van der Tuin. New Materialism: Interviews and Cartographies, Open Humanities Press, 2012.

668.  Eribon, Didier. Michel Foucault. Trans. Betsy Wing, Harvard University Press, 1991.

669.  Bryant, Levi R. The Democracy of Objects. Open Humanities Press, 2011.

670.  DeLanda, Manuel. Intensive Science and Virtual Philosophy. London: Bloomsbury, 2013.

671.  DeLanda, Manuel; Harman, Graham. The Rise of Realism. Cambridge, UK: Polity Press, 2017.

672.  Latour, Bruno. On Actor-Network Theory: A Few Clarifications Plus More Than a Few Complications, Soziale Welt, vol. 47, 1996.

673.  Bennett, Jane. Vibrant Matter: A Political Ecology of Things. Durham: Duke University Press, 2010.

674.  Braidotti, Rosi. The Posthuman. Cambridge, UK: Polity Press, 2013.

675.  Ibid.

676.  Barad, Karen. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press, 2007.

677.  Ibid.

678.  Ibid.

679.  Ibid.

680.  Ibid.

681.  Ibid.

682.  Barad, Karen. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press, 2007.

683.  Dolphijn, Rick and Iris van der Tuin. New Materialism: Interviews and Cartographies, Open Humanities Press, 2012.

684.  Braidotti, Rosi. The Posthuman. Cambridge, UK: Polity Press, 2013.

685.  Bennett, Jane. Vibrant Matter: A Political Ecology of Things. Durham: Duke University Press, 2010.

686.  Scott, David. “The Re-Enchantment of Humanism: An Interview with Sylvia Wynter,” Small Axe no. 8, 2000.

687.  Ibid.

688.  Ibid.

689.  Whitehead, Alfred North. Process and Reality. Riverside: Free Press, 2010.

690.  Massumi, Brian. Parables for the Virtual. Movement, Affect, Sensation, Duke University Press, 2002.

691.  DeLanda, Manuel. The New Materiality, Architectural Design 85, no. 5, 2015.

692.  Grosz, Elizabeth. The Incorporeal: Ontology, Ethics, and the Limits of Materialism, Columbia University Press, 2018.

693.  Braidotti, Rosi. The Posthuman, Polity Press, 2013.

694.  Ibid.

695.  Ibid.

696.  Ibid.

697.  Braidotti, Rosi. The Posthuman, Polity Press, 2013.

698.  Agamben, Giorgio. Homo Sacer: Sovereign Power and Bare Life. Trans. Daniel Heller-Roazen, Stanford University Press, 1998.

699.  Weheliye, Alexander. Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human, Duke University Press, 2014.

700.  Ibid.

701.  Ibid.

702.  McKittrick, Katherine. Sylvia Wynter: On Being Human as Praxis, Duke University Press, 2014.

703.  Levinas, Emmanuel. Totality and Infinity.

704.  Baciu A, Negussie Y, Geller A, et al. The State of Health Disparities in the United States, National Academies Press, 2017.

705.  Singhal, Shubham. The gathering storm: The uncertain future of US healthcare, McKinsey, 2022.

706.  Ukeles, Mierle Laderman. Manifesto for Maintenance Art, 1969.

707.  Kelly, Mary. Post-Partum Document, The Tate Museum, 1979.

708.  Kelly, Mary. Post-Partum Document. Documentation III: Analysed Markings And Diary Perspective Schema (Experimentum Mentis III: Weaning from the Dyad), The Tate Museum, 1975.

709.  Tiravanija, Rirkrit. Interview by Chieng Wei Shieng & Clifford Loh. The Relational Artist, Vulture Magazine, 2019.

710.  Decter, Joshua. Rirkrit Tiravanija, Artforum, 2011.

711.  Bruguera, Tania. Migrant Manifesto, Immigrant Movement International, Creative Time, 2011.

712.  Tania Bruguera: Immigrant Movement International, The Tate Museum, 2012.

713.  Pope.L, William. The Black Factory, Creative Capital, 2001.

714.  Keegan, Alison. The Black Factory, The Bates Museum of Art, 2010.

715.  Steinbock, Eliza. Photographic Flashes: On Imaging Trans Violence in Heather Cassils' Durational Art, Photography And Culture, Vol. 7, 2014. https://doi.org/10.2752/175145214X14153800234775

716.  Watson, Jeff. Snake Oil Men (VII.).

717.  Ibid.

718.  Tattersall, Lanka. Notes on Weed Killer, Museum of Contemporary Art, Los Angeles, 2017.

719.  Hedva, Johanna. Sick Woman Theory, Topical Cream, 2022.

720.  Sedgwick, Eve Kosofsky. Paranoid Reading and Reparative Reading, Or, You’re So Paranoid, You Probably Think this Essay is About You, Touching Feeling, Duke University Press, 2002.

721.  Glover, Donald. The Big Payback, S3.E4, Atlanta, IMDB, 2022.

722.  Ibid.

723.  Ludden, Jennifer. Cities may be debating reparations, but here’s why most Americans oppose the idea, National Public Radio, 2023.

724.  Ibid.

725.  Reparations: OHCHR and Transitional Justice, The United Nations, 2023. 

726.  Ibid.

727.  Matthews, Dylan. Six Times Victims Have Received Reparations—Including Four in the US, Vox, 2014.

728.  Thompson, Ginger. South Africa to Pay $3,900 to Each Family of Apartheid Victims, The New York Times, 2003.

729.  Ayesh, Rashaan. The World’s Long History of Reparations, Axios, 2019.

730.  UK to Compensate Kenya’s Mau Mau Torture Victims, The Guardian, 2013.

731.  Matthews, Dylan. Six Times Victims Have Received Reparations—Including Four in the US, Vox, 2014.

732.  Bilmes, Linda and Cornell William Brooks. The United States Pays Reparations Every Day—Just Not to Black America, Harvard-Kennedy School Policycast, 2022.

733.  Ray, Rashawn and Andre M. Perry. Why We Need Reparations for Black Americans, Brookings, 2020.

734.  Cox, Kiana and Khadijah Edwards. Reparations for Slavery, Pew Research Center, 2022.

735.  Deuteronomy 16:20.

736.  hooks, bell and Maya Angelou. There’s No Place to Go But Up — bell hooks and Maya Angelou in conversation, Lion’s Roar, 1998.

737.  Brosi, George and bell hooks. The Beloved Community: A Conversation between bell hooks and George Brosi, Appalachian Heritage, Volume 40, Number 4, 2012.

738.  Zion, J.W. Dynamics of Navajo Peacemaking, US Department of Justice, 1998.

739.  Truth Commission: South Africa, United States Institute of Peace, 1995.

740.  Youth Justice Family Group Conferences, Oranga Tamariki, New Zealand Ministry for Children, 2023.

741.  Compendium Of Promising Practices To Reduce Violence And Increase Safety Of Aboriginal Women In Canada–Compendium Annex: Detailed Practice Descriptions, Family Violence Initiative, Government of Canada, 2021.

742.  Prison Fellowship, 2023.

743.  Tepper, Felicity. The Importance of Environmental Restorative Justice for The United Nations Decade on Ecosystem Restoration (2021–2030), The Palgrave Handbook of Environmental Restorative Justice, 2022.

744.  Forsyth, Miranda, Brunilda Pali, and Felicity Tepper. Environmental Restorative Justice: An Introduction and an Invitation, The Palgrave Handbook of Environmental Restorative Justice, 2022.

745.  Restorative Environmental Justice, European Forum for Restorative Justice, 2020.

746.  Justice40 Initiative Environmental Justice Fact Sheet, US Department of Energy, 2022.

747.  How Artificial Intelligence is Helping Tackle Environmental Challenges, UN Environment Programme, 2023.

748.  Ocean Visions Selects Launchpad Teams, Ocean Visions, 2022.

749.  Phykos, 2023.

750.  Ibid.

751.  Ibid.

752.  Ibid.

753.  Saving the World's Coral Reefs, Autodesk, 2023.

754.  Coral Entanglement Research, Roctopus EcoTrust, 2023.

755.  Coralmaker, Mission Statement, 2023.

756.  Ibid.

757.  Wertheim, Margaret. Corals, Crochet and the Cosmos: How Hyperbolic Geometry Pervades the Universe, The Conversation, 2016.

758.  Carey, Nic and Yotto Koga. Coralmaker panel discussion, Autodesk, 2023.

759.  Weiss, Sabrina. Robots Enter the Race to Save Dying Coral Reefs, Wired Magazine, 2023.

760.  Ibid.

761.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988. 

762.  Weiss, Sabrina. Robots Enter the Race to Save Dying Coral Reefs, Wired Magazine, 2023.

763.  Ibid.

764.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

765.  Ibid.

766.  Fuzzy Logic. Stanford Encyclopedia of Philosophy, Bryant University, 2006.

767.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

768.  Lorde, Audre. The Uses of the Erotic: The Erotic as Power, Kore Press, 1978.

769.  Ibid.

770.  Final Solution—1940 to 1945, United States Holocaust Memorial Museum, 2023.

771.  Why do some Jews who survived the Holocaust have a number tattooed on their arm?, World Jewish Congress, UNESCO, 2023.

772.  Watson, Jeff. Reality is an Emergency. 2015.

~














[EVIL] When I have the map, I will be free...

and the world will be different because I have understanding.

[...] Uh, understanding of what, master?

[EVIL] Of digital watches.

Soon I shall have understanding of video cassette recorders and car telephones.

And when I have understanding of them, I shall have understanding of computers.

And when I have understanding of computers, I shall be the Supreme Being.

God isn’t interested in technology.

He knows nothing of the potential of the microchip or the silicon revolution.

Look how He spends His time. Forty-three species of parrot.

Nipples for men.

Slugs.

Slugs! He created slugs.

They can’t hear. They can’t speak. They can’t operate machinery.

Are we not in the hands of a lunatic?

[...] Sir... look!

[EVIL] If I were creating a world, I wouldn’t mess about with butterflies and daffodils.

I would have started with lasers, 8:00 day one.

Time Bandits, 1981


VII. A true idea must correspond with its ideate or object, but we are firmly in the post-truth era.1 Capture and Reconstruction are models for our defactualized2 aesthetic. If a thing can be captured, its reality is already destabilized.

PROPOSITIONS.

Proposition I. Virtual space is saturated with the allure of infinite potential.

Proof.—Before the digital technologies that we associate with virtuality were developed, the virtual was a philosophical concept that stood for the field of all unactualized ideas and events.

Proposition II. While not actual, virtual space is real.

Proof.—It does not vary by degree from physical space; it is of a completely different kind.3

Proposition III. The virtual cannot be represented or controlled.

Proof.—It is a space in constant flux: “Whatever the breaks and ruptures, only continuous variation brings forth this virtual line, this virtual continuum of life, ‘the essential element of the real beneath the everyday.’”4

Proposition IV. Ideas, images, events, anything not yet actual, populate this extensible plane, endlessly evolving, connecting, and intersecting in new configurations.

Proof.—“We know that the virtual as virtual has a reality; this reality, extended to the whole universe, consists in all the coexisting degrees of expansion (détente) and contraction. A gigantic memory, a universal cone in which everything coexists with itself, except for differences of level. On each of these levels there are some ‘outstanding points,’ which are like remarkable points peculiar to it. All these levels or degrees and all these points are themselves virtual. They belong to a single Time; they coexist in Unity; they are enclosed in a Simplicity; they form the potential parts of a Whole that is itself virtual. They are the reality of this virtual.”5

Proposition V. The virtual is a network of all possible relations.

Proof.—Bergson’s concept of the virtual served as the basis for Deleuze and Guattari’s plane of immanence: “there is a pure plane of immanence, univocality, composition, upon which everything is given, upon which unformed elements and materials dance that are distinguished from one another only by their speed and that enter into this or that individuated assemblage depending on their connections, their relations of movement. A fixed plane of life upon which everything stirs, slows down or accelerates”6 (IV.).

Proposition VI. Phase space precludes hierarchies. Only potential and change.

Proof.—Phase space could be seen as a diagrammatic rendering of the dimension of the virtual. “The organization of multiple levels that have different logics and temporal organizations, but are locked in resonance with each other and recapitulate the same event in divergent ways, recalls the fractal ontology and nonlinear causality underlying theories of complexity.”7
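A hedged formal gloss, in standard mechanical notation rather than Massumi’s own: phase space assigns every possible state of a system to a single point, so that the totality of the possible coexists in one flat, unranked space.

```latex
% Phase space, minimally: a system with n degrees of freedom
% is one point among all of its possible states.
\[
  s = (q_1, \dots, q_n,\; p_1, \dots, p_n) \in \mathbb{R}^{2n}
\]
% No coordinate outranks another; a trajectory s(t) is a single
% actualization threading through the coexisting possibilities.
\[
  s(t) : \mathbb{R} \to \mathbb{R}^{2n}
\]
```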

Corollary.—The virtual is a complex and dynamic system.

Proposition VII. The virtual has a digital twin.

Proof.—It was born when the term virtual was first introduced into computer engineering in 1959. The twin is subservient to computational processes. Its complexity and self-actualizing potential are limited: “The sole function of the virtual memory is to increase machine speed, by increasing the efficiency of other devices.”8

Proposition VIII. There is investment in the fantasy that the two virtuals are interchangeable. Identical twins.

Proof.—See any escapist tech-billionaire vision.

Note I.—They are not the same.

Note II.—The computational appropriation of virtuality undermines the differential quality that defines it: “all we need to do is to sink the floating plane of immanence, bury it in the depths of Nature instead of allowing it to play freely on the surface, for it to pass to the other side and assume the role of a ground that can no longer be anything more than a principle of analogy from the standpoint of organization, and a law of continuity from the standpoint of development.”9 The virtual—point clouds, models, digital twins, simulations—is now a highly representational space of taxonomic organization that reinscribes colonial strategies of control and supervision.

Is it possible to restore the pre-digital sense of the virtual? It would require an embrace of the indeterminate, to call it forth from tendencies to control, so that other possibilities may emerge. As Fred Moten and Stefano Harney urge in The Undercommons, “we owe each other the indeterminate.”10 And it is always there, even if obscured by representation, by quantification, by abstraction: “abstract space relates negatively to that which perceives and underpins it—namely, the historical and religio-political spheres. It also relates negatively to something which it carries within itself and which seeks to emerge from it: a differential space-time.”11 Differential space is a fascinating model. It operates according to the calculus of derivatives as opposed to normals. Derivatives are tangential rather than perpendicular to a curve. They are anti-specimen-pins, ramps measuring rates of change instead of fixing in place (a minimal sketch of this contrast follows the outline below). They move with, alongside, nearby. Differential space is the space of rhythm, the space of life, of becoming.12 To access this space, this mindset shift, first we must understand

II. The Nature and Optics of Capture

III. The Nature and Optics of Reconstruction

IV. Of Technological Acceleration, or the Ethics of Supervision

V. Of the Ethics of Supervision, or Quantum Erotics.
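The promised sketch of the derivative/normal contrast, assuming only a smooth plane curve y = f(x); this is standard calculus offered as illustration, not a formula from the text.

```latex
% The derivative is a rate of change measured along the curve:
\[
  f'(x_0) = \lim_{h \to 0} \frac{f(x_0 + h) - f(x_0)}{h}
\]
% At the point (x_0, f(x_0)), the tangent vector t moves with the
% curve; the normal vector n pins it perpendicularly, specimen-style.
\[
  \mathbf{t} = \bigl(1,\ f'(x_0)\bigr), \qquad
  \mathbf{n} = \bigl(-f'(x_0),\ 1\bigr), \qquad
  \mathbf{t} \cdot \mathbf{n} = 0
\]
```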

In The Order of Things, Michel Foucault undertakes an archeology of the human sciences, tracing the changing structures of knowledge from the Classical period to Modernity in Western civilization. He declares that orders of knowledge are reconfigurable, and each configuration constitutes a discrete episteme. Foucault introduces the term episteme to describe historically contextual preconditions of knowledge and discourse: “In any given culture and at any given moment, there is always only one episteme that defines the conditions of possibility of all knowledge, whether expressed in a theory or silently invested in a practice.”13 While each episteme determines how speech operates and how representation is constructed, it is also subject to alteration by its own output. Foucault emphasizes this dialectical formation of epistemes: “What is essential is that thought, both for itself and the density of its workings, should be both knowledge and a modification of what it knows, reflection and a transformation of the mode of being of that on which it reflects.”14 He traces the emergence of three modern discourses—biology, economics, and linguistics—which, at his moment, control truth in the human sciences.

Proposition IX. Is this self-critical reconstruction?

Proposition X. Or self-aware construction?

Proof.—One of the most powerful forces of Modern Europe’s epistemic formation, perhaps the one that belies those identified in The Order of Things, is cartography. In his lecture, Of Other Spaces, Foucault acknowledges that “starting with Galileo and the seventeenth century, extension was substituted for localization.”15 Europe’s invasion of other territories led to the ‘discovery’ of unknown organisms, unforeseen opportunities for industrial growth, and unfamiliar languages, all of which intensified cultural focus on the three areas that constituted the episteme. The occupation of foreign places required “the commodification and bureaucratisation of everyday life, namely making space mathematical and ordered (challenging the indigenous ordering of space) in such a way as to render the colony most efficiently known and governable.”16 The utter systematization of geography constitutes a form of what Gayatri Spivak calls epistemic violence;17 it erases other ways of knowing space.

Note.—The Cartesian model of abstract coordinate space replaces haptic, embodied space, and the map becomes a technology for remote control: “In his work on the making and circulation of scientific knowledge, Bruno Latour has used the term immutable mobile to characterize those material agents that permit scientific discourse to sustain its claims of empirical warranty and repeatable truth in the absence of eyewitness evidence. The map is a perfect exemplar of the immutable mobile: a container of information gathered at specific locations, returned to a ‘centre of calculation,’ and then placed once more into circulation as a vehicle and instrument of scientific knowledge and further hypotheses.”18 Importantly, these maps do not only function as representations of space, they terraform space itself, determining its modes of inhabitation.

As Denis Cosgrove explains in Geography and Vision: Seeing, Imagining and Representing the World, “geographical representations—in the form of maps, texts and pictorial images of various kinds—and the look of landscapes themselves are not merely traces or sources, of greater or lesser value for disinterested investigation by geographical science. They are active, constitutive elements in shaping social and spatial practices and the environments we occupy.”19

Proposition XI. The grid organizes behavior and constricts the imagination.

Proof.—Postcolonial scholarship emphasizes the importance of examining maps and their constructive processes as a foundation for resistant strategies.

Another proof.—In The Wretched of the Earth, Frantz Fanon argued that “the colonial world is a world divided into compartments … Yet, if we examine closely this system of compartments, we will at least be able to reveal the lines of force it implies. This approach to the colonial world, its ordering and its geographical layout will allow us to mark out the lines on which a decolonized society will be reorganized.”20

The domination of physical space continues, but it also encounters innumerable frictions: decolonial efforts, the unwieldy environmental upset of climate change, even advances in mathematics and science. Surprisingly, all of these favor differential conceptions of space over Cartesian models. These variables make totalizing efforts slow and inefficient: “No spaces can be controlled, inhabited or represented completely. But the map permits the illusion of such possibilities. Mapping is a creative process of inserting our humanity into the world and seizing the world for ourselves.”21 The digital is a computational map that arises to satisfy the project of spatial and social control, which can never be fulfilled in the analog.

The most explicit carry-over from European Imperialism to virtual space is its tendency for geometric control. The perfect grids of two-dimensional pixel displays, for example, reinforce the colonial mindset that space can be neatly subdivided and programmed. 3D Reconstruction and modeling software is based on this same Cartesian coordinate system. Likewise, it operates according to the same spatial ideology that made a totalizing map of the world conceivable.

Another proof.—At least in theory, virtual space has no limit, no absolute scale; it allows the axial view and the extensible grid to continue on forever.

Note.—“Geometry, specifically the radial axis and the grid, underpinned both scientific cartography and modern urban form. Their power and historical endurance in both the map and the city lies in their combination of practical and symbolic efficacy. The circle’s 360 degrees generate a ‘centre-enhancing’ axial form focused on a single point. Functionally and symbolically, this extends power panoptically to the horizon, encompassing a potentially infinite territory … The alternative geometrical form shared by urban planning and mapping is the grid or chequerboard of orthogonal lines crossing at right angles. While radial axes enhance the centre, the grid is ‘space-equalizing,’ infinitely extendable over the surface and privileging no single point, but rather reducing each to a unique coordinate.”22

Today, digital databases reduce information—and lives—into coordinates to be rearranged within a deterministic index. This classification process removes the friction of complex events and beings, allowing them to be efficiently calculated.
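As a concrete illustration only (the points, cell size, and units below are invented), a few lines of Python show the reduction at work: continuous positions collapse into the integer keys of a deterministic grid index, and whatever lay between the grid lines is calculated away.

```python
# Illustrative sketch: reducing continuous positions to a
# deterministic grid index. Points and cell size are hypothetical.
import numpy as np

points = np.array([
    [0.12, 3.40, 1.05],   # three captured positions (x, y, z)
    [0.14, 3.38, 1.02],   # nearby, but not identical, to the first
    [7.90, 0.21, 4.66],
])

CELL = 0.5  # grid resolution, in arbitrary units

# Each point becomes the integer coordinates of its cell.
indices = np.floor(points / CELL).astype(int)

for point, idx in zip(points, indices):
    print(point, "->", tuple(idx))
# The first two points land in the same cell: their difference,
# like the friction of complex events, is removed by the index.
```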

Proposition XII. Classification is the process of establishing a network of distances and proximities.

Proof.—Importantly, Foucault suggests that the primary function of classification systems is to provide generalizations which allow different individuals to point to common concepts.23 He describes two classification processes: the System and the Method. The System makes “total comparisons, but only within empirically constituted groups in which the number of resemblances is manifestly so high that the enumeration of the differences will not take long to complete.”24 The Method, on the other hand, selects “a finite and relatively limited group of characteristics, whose variations and constants may be studied in any individual entity that presents itself.”25 In either case, order emerges as a temporary arrangement of segmented information and predetermined rules: “For it is not a question of linking consequences, but of grouping and isolating, of analyzing, of matching and pigeon-holing concrete contents; there is nothing more tentative, nothing more empirical (superficially, at least) than the process of establishing an order among things ... A ‘system of elements’—a definition of the segments by which the resemblances and differences can be shown, the types of variation by which those segments can be affected, and, lastly, the threshold above which there is a difference and below which there is a similitude—is indispensable for the establishment of even the simplest form of order.”26 (IV.lv.)
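Read computationally, and strictly as an interpretive sketch (the ‘segments,’ distances, and threshold below are invented, not Foucault’s or any historical system’s), the passage describes something like threshold clustering: enumerate segments, compare, and declare similitude below a cutoff and difference above it.

```python
# A deliberately literal sketch of a "system of elements":
# segments (features), a distance over them, and a threshold
# below which there is similitude, above which, difference.
import numpy as np

specimens = np.array([
    [5.1, 3.5],   # hypothetical measured segments per specimen
    [5.0, 3.6],
    [6.9, 3.1],
    [7.0, 3.2],
])
THRESHOLD = 0.5

# Grouping and isolating, matching and pigeon-holing: a specimen
# joins the first group containing a sufficiently similar member.
groups = []
for i, s in enumerate(specimens):
    for g in groups:
        if any(np.linalg.norm(s - specimens[j]) < THRESHOLD for j in g):
            g.append(i)
            break
    else:
        groups.append([i])

print(groups)  # [[0, 1], [2, 3]]
```

Order here is exactly as tentative as the quotation suggests: shift the threshold and the groups dissolve and recombine.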

Proposition XIII. Order is always produced from preexisting biases.

Proof.—Recent scholarship has dissected algorithmic classification and its implications. For instance, in Algorithms of Oppression, Safiya Noble argues that Google’s search engine algorithms are not neutral; quite the opposite, they prioritize corporate interests, automate cultural biases, and circulate damaging stereotypes. The resulting representations cause harm to those represented as well as those forming opinions about different races, genders, religions, and other marginalized groups.

Corollary.—There is trust in the archive.

Note.—Most users consider the first page of results to be not only the most relevant links, but also the most credible sources on a given topic. At the same time, many users struggle to differentiate between sponsored content, or advertising, and unpaid results. Consequently, through services like AdWords, special interests, including Google itself, have the power to visually and ideologically sculpt topics for their own profit. In this private information system, the power to frame a subject goes to the highest bidder.

Proposition XIV. Dominant operating systems encode their rules.

Proof.—These prejudiced values are deeply embedded in most popular browsers and software applications; they perpetuate and even catalyze bigotry, exploitation, and violence. To support this argument, Noble dissects white nationalist Dylann Roof’s terrorist attack on Mother Emanuel African Methodist Episcopal Church in 2015, and his claim that the attack was motivated by online research of the phrase “black on white crime”27 (III.xxviii.). ProPublica’s recent report on machine bias in criminal risk assessment software reveals that racism is not only affecting public opinion through search engines, it is leading to unfair sentencing decisions in the courts.28 This recalls Allan Sekula’s analysis of how photography was used to classify certain morphological traits as indicators of criminality.

Corollary I.—In The Body and the Archive, Sekula focuses on the research practices of Alphonse Bertillon and Francis Galton, two men “committed to technologies of demographic regulation.” Sekula analyzes their differing methodologies, postulating each as a conceptual framework, a pre-digital algorithm for identifying criminality. Bertillon’s indexical method, he argues, attempts to locate aberrations or outliers through comparison, while Galton’s compositing practice formulates general criminal types.

Corollary II.—Sekula initially defines the archive as a “unified system of representation and interpretation [which] promised a vast taxonomic ordering of images of the body.” He also addresses the important circulatory function of “the archive as an encyclopedic repository of exchangeable images.”

Proposition XV. The archive reinforces hierarchies through linguistic and spatial organization.

Proof.—“We can speak then of a generalized, inclusive archive, a shadow archive that encompasses an entire social terrain while positioning individuals within that terrain. This archive contains subordinate, territorialized archives: archives whose semantic interdependence is normally obscured by the ‘coherence’ and ‘mutual exclusivity’ of the social groups registered within each.”29

Note.—Sekula examines the emergence of a generalized criminal type and the field of its study—criminology: “Thus the would-be scientists of crime sought a knowledge and mastery of an elusive ‘criminal type.’ And the ‘technicians’ of crime sought knowledge and mastery of individual criminals. Herein lies a terminological distinction, and a division of labor, between ‘criminology’ and ‘criminalistics.’ Criminology hunted ‘the’ criminal body. Criminalistics hunted ‘this’ or ‘that’ criminal body.” The centrality of physiognomy in the formation of this type indicates its inherent racialized bias. As ProPublica has indicated, these 19th-century classification processes have reasserted themselves in the virtual space of risk assessment software programs.

Demographic control relies on the representation of bodies, whether through photography, data, or higher-dimensional models. Foucault contends that “representation in its peculiar essence is always perpendicular to itself: it is at the same time indication and appearance; a relation to an object and a manifestation of itself.”30 The specimen pin, a staple of Western classification processes, epitomizes this perpendicular gesture of pointing to and simultaneously constituting. The pin is a vector directing the human eye to the point from which the entirety of the specimen is most easily resolved; it fixes the specimen in place, determines its orientation, and through death asserts the permanence of its form. While it is a clear marker of another organism’s lifelessness, the language of the specimen pin lives on in virtual space.

Photorealism in computer graphics relies on complex calculations based on surface normals (III.xxxviii.). Normals, by default, are perpendicular to the faces or vertices of a mesh. They allow for accurate rendering by determining how light bounces off of surfaces, most notably in a process called ray tracing. Virtual normals bear a striking resemblance to specimen pins, always at perfect right angles. As Sara Ahmed explains, “the right is associated with truth, reason, normality and with getting ‘straight to the point’.”31 Etymologically, the word normal comes from the Latin norma, or carpenter’s square, conveying not only simple geometric perpendicularity and taxonomic control, but an even longer history of Christian values that determine what is considered upright and in the light. When surface normals are inverted or do not all face the same direction, they produce unpredictable results. In commercial production, these unruly behaviors are resolved by conforming normals. Fittingly, maintaining right angles and conforming normal direction ensures photorealism, which in turn endows the virtual form with technical and visual authority.
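The geometry at stake is compact enough to sketch. Below, a minimal illustration in Python (the names face_normal and lambert are hypothetical, not drawn from any particular renderer) of how a triangle’s normal is derived and how it governs shading:

```python
import numpy as np

def face_normal(v0, v1, v2):
    """Unit normal of a triangle, perpendicular to its face.

    Winding order (v0 -> v1 -> v2) determines which way the
    normal points; swapping two vertices inverts it."""
    n = np.cross(v1 - v0, v2 - v0)
    return n / np.linalg.norm(n)

def lambert(normal, light_dir):
    """Diffuse shading term: how directly light strikes the face.

    Faces turned away from the light receive zero."""
    return max(0.0, float(np.dot(normal, light_dir)))

# A triangle in the z = 0 plane, lit from directly above.
v0 = np.array([0.0, 0.0, 0.0])
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
light = np.array([0.0, 0.0, 1.0])

n = face_normal(v0, v1, v2)      # -> [0, 0, 1]
print(lambert(n, light))          # 1.0: fully lit

# Reversed winding inverts the normal; the face goes dark.
print(lambert(face_normal(v0, v2, v1), light))  # 0.0
```

Reversing the winding order inverts the normal: the same face turns away from the light and renders dark, exactly the unruly behavior that production pipelines resolve by conforming normals.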

The Utah Teapot was modeled by Martin Newell at the University of Utah in 1975 to demonstrate the ray tracing capabilities of the rendering algorithms he was developing at the time. He chose this particular form because it was ready at hand in his office, but also for its normals. Its asymmetry, irregular curves, and areas of self-occlusion highlighted his software’s advanced capabilities, its ability to calculate complexity. The teapot quickly became the most circulated 3D model of all time, used for all kinds of technical demos. It is now considered a benchmark in computing (IV.xxxiii.), the 3D equivalent of “Hello World.” The physical teapot on which it is based is even held at the Computer History Museum in Silicon Valley.32 Despite its elevation within the field of computer graphics, the Utah Teapot is consistently discussed as a banal, domestic form.

Even if it is never admitted in this context, it is also a symbol of imperial power. Europeans’ obsession with the daily ritual of drinking tea with sugar was one of colonialism’s driving forces: “This custom, which has mistakenly been viewed as insignificant, had important historical effects. Its widespread adoption in Britain and elsewhere in northern Europe in the eighteenth century greatly reinforced demand for both products, thus helping to foster British imperialism in Asia, plantation slavery in the West Indies, and economic growth in Europe and North America.”33 Despite this uneasy history, the teapot easily slips back into the category of neutral formal object, allowing it to circulate as an image of cultural and technical control. In Geographies of Post-Colonialism: Spaces of Power and Representation, Joanne Sharp explains how similar instances lead “many postcolonial feminists [to] favor the concept of ‘situated knowledge’ as a substitute for decontextualized, ungendered, disembodied, so-called ‘objective’ knowledge. It pays attention to geographic and cultural specificity rather than universality.”34 It is highly unlikely that the initial choice of the teapot and its subsequent circulation were an intentional assertion of colonial power. Rather, they indicate that the values and symbols of European supremacy are both central to and invisible in our cultural imaginary: “something passes as natural precisely when it conforms perfectly and without apparent effort to accepted models, to the habits valorised by a tradition (sometimes recent, but in force).”35 The Utah Teapot is quite literally a model of the banality of colonial forces and their foundational role in virtual space.

Immersive virtual reality is the promise of a comprehensive calculable space. Ivan Sutherland developed one of the earliest virtual reality head-mounted displays in 1968. Due to the size and weight of its components, the system required a large ceiling-mounted pole for support. As a result, Sutherland and his team facetiously named their apparatus the Sword of Damocles. While VR researchers insist that this is a purely formal reference to the intimidating beam overhead, it is worthwhile to consult its namesake for meaning. The Sword of Damocles is a parable of paranoid power. When a Greek courtier, Damocles, expresses how fortunate his king, Dionysius, is to live in luxury, the king offers to switch places with him for a day. Damocles eagerly agrees to sit on the throne; however, the king orders a sword to be hung by a single horsehair above the royal seat to represent the feeling of constant threat that comes with supremacy. Damocles does not have the fortitude to withstand these precarious conditions and forfeits his day in the position of ultimate power. References to this moralizing tale have circulated in Europe for centuries, often accompanied by the phrase, METUS EST PLENUS TYRANNIS; fear is plentiful for tyrants.36

Suspiciously, though, the sword is installed—by royal decree—to intimidate a common subject. Perhaps the state of precarity for those in power does exist, but the narrative of threat-to-rule can also be used as a justification for imposing controls on others. The sword, pointing down from above, uncannily resembles both the specimen pin and the virtual normal, poised to fix the subject in place. Likewise, Sutherland’s apparatus constrains the user’s movement, circumscribing its radius and orientation, while claiming to enhance it. This early head-mounted display is a blindfold of optical and tactile dissonance. Researchers in Sutherland’s lab refused to wear it because of its high-voltage risk to the body.37

Sutherland, confronting the limitations of his invention, proposed the ultimate display, a totalizing omnipotent control system: “The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked.”38 His window into the mathematical wonderland of the computer is explicitly tied to bio- and necropolitical control—a computer graphics pathway to the state project that Foucault lays out in Discipline and Punish. The recent resurgence of virtual reality has confirmed that “having a display apparatus mounted on our heads may bring temporary distraction, but we are more often in a world of isolation and stasis than remote presence or alternate identity.”39 Sutherland’s carceral framing of his early virtual reality system presages its application as a tool for military surveillance and control.

Palmer Luckey, the founder of the popular hardware manufacturer Oculus VR, is now undertaking the project of enhancing state surveillance systems with virtual technologies. Following Sutherland’s example, he named his new company Anduril40 after Tolkien’s flaming sword of the West. The company has accrued massive investment from venture capital as well as from the United States military, which has contracted Anduril Industries to construct a virtual border wall. If colonial strategies of spatial control and the architectures of European disciplinary society were latent in virtual reality, then Anduril has unleashed them. These mechanisms are not only operating within virtual space, they are reimposed on physical terrain by an arsenal of robotic sentry towers, unmanned aerial vehicles, and an IoT mesh network of sensors. These fortifications are augmented with machine learning to more accurately detect migrants in the landscape. The company’s promotional materials explain: “once alerted, a Lattice user can strap on a pair of VR goggles and get a bird’s-eye view of what triggered the alarm, or toggle between the individual streams coming from each sensor. The goal is to give users a kind of local omniscience—perfect situational awareness of what’s around every corner and behind each hill.”41

Activists have already come out against the dangers of a system that so efficiently tracks humans with the express purpose of feeding them into the nation’s privatized detention centers. Mijente, an immigrant rights advocacy group, published a statement that Anduril represents “a surveillance apparatus where algorithms are trained to implement racist and xenophobic policies.”42 The company’s founder, however, expressed his faith in US authority and the precedent that it has set through its implementation of other technologies: “we’ve shown throughout history that we are leaders in using technology ethically, using technology responsibly … We have to continue to lead, the same way that we led with nuclear weapons, where we were able to define the way that they were used because we were the leader in the space.”43 This statement is a terrifying affirmation of the United States’ exceptional supremacy, which allows it to dominate, even obliterate, space and its inhabitants, as it did in Hiroshima and Nagasaki.

Anduril’s suite of distributed machines exceeds the possibilities of Foucault’s Panopticon. Rather, it exemplifies what Manuel DeLanda calls the Panspectron: “Instead of positioning some human bodies around a central sensor, a multiplicity of sensors is deployed around all bodies: its antenna farms, spy satellites and cable-traffic intercepts feed into its computers all the information that can be gathered. This is then processed through a series of filters or key-word watch-lists. The Panspectron does not merely select certain bodies and certain data about them. Rather, it compiles information about all at the same time, using computers to select the segments of data relevant to its surveillance tasks.”44 Anduril, the flaming sword of the West, uses 3D Reconstruction to control space. This strategy epitomizes, and concretizes, Deleuze and Guattari’s description of how state power leverages media to lock down its territories: “one of the fundamental tasks of the State is to striate the space over which it reigns, or to utilize smooth spaces as a means of communication in the service of striated space. It is a vital concern of every State not only to vanquish nomadism but to control migrations and, more generally, to establish a zone of rights over an entire ‘exterior’, over all of the flows traversing the ecumenon.”45

Proposition XVI. Capture and Reconstruction are prosthetic to the colonial ambition of capturing and controlling everything.

Proof.—Capture and Reconstruction technologies descend from colonial mapping practices: the survey, metrology, and photography. Photography was implemented to make territories optically delicious, to serve them up on an industrial platter: “This representative scheme, then, presents the possibility of a double salvation—a return to unspoiled innocence and an opportunity to profit from the violation of innocence.”46 The use of photography from an aerial perspective marked a technological leap (II.x.note.). Photographs from weather balloons were used to map spaces from above, which made mapping much more efficient and useful for industrial and military applications alike. The photographs were then stitched together to create comprehensive and precise maps of expansive terrain: “Photogrammetric survey was used in mapping great colonial stretches of Africa, Australia and Antarctica into the 1950s.”47 Capture and Reconstruction have since evolved and are now primarily executed with digital tools. Notably, countless corporate and governmental entities have incorporated the use of high-altitude planes and satellites to expand remote sensing and photogrammetric mapping to a planetary scale.

Corollary I.—Capture everything.

Corollary II.—Every subject is a captive, every object is a sensor.

Corollary III.—The IoT fantasy. Web 3.0.

Proposition XVII. Monitor and control every corner of the universe.

Proof.—The Internet of Things (IoT) refers to a network of interconnected physical objects or devices that are embedded with sensors, software, and connectivity capabilities, enabling them to collect and exchange data. These objects can be anything from everyday household appliances and wearable devices to industrial machinery and vehicles. The fundamental idea behind IoT is to create a seamless connection between the digital and physical worlds, allowing these devices to communicate and collaborate with each other without human intervention. Integrating sensors and connectivity into objects enables them to gather and transmit data, receive instructions, and interact with their environment.48
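As a gesture toward how such an object behaves, a minimal sketch in Python; the class SensorNode, its node id, and the transmit callback are hypothetical stand-ins for real firmware and a real network link:

```python
import json
import random
import time

class SensorNode:
    """Hypothetical IoT node: samples its environment and reports
    readings upstream without human intervention."""

    def __init__(self, node_id, transmit):
        self.node_id = node_id    # unique coordinate in the network
        self.transmit = transmit  # connectivity: any callable that ships bytes

    def sample(self):
        # Stand-in for a real sensor driver (temperature, motion, ...).
        return {"node": self.node_id,
                "temperature_c": round(random.uniform(18, 24), 2),
                "timestamp": time.time()}

    def run_once(self):
        self.transmit(json.dumps(self.sample()).encode())

# The "gateway" here is just a list; in practice it might be an
# MQTT broker or a cloud endpoint.
readings = []
node = SensorNode("kitchen-01", readings.append)
node.run_once()
print(readings[0])
```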

Corollary I.—Embedded and scattered.

Corollary II.—Even dust becomes a device of capture.

Note.—Smart dust refers to miniature wireless devices that are typically the size of a grain of sand or smaller. These devices, also known as microelectromechanical systems (MEMS), are equipped with sensing, computing, and communication capabilities. Smart dust particles are designed to be extremely small and lightweight, enabling them to be easily dispersed in the environment and gather data from various sources. The concept of smart dust originated from research conducted by the Defense Advanced Research Projects Agency (DARPA) in the 1990s and the term smart dust was coined by Kristofer Pister of the University of California, Berkeley in 1997.49 The goal was to develop invisible, autonomous sensor nodes that could be deployed in large numbers to monitor and collect data from various environments: “Dust-sized and light transparent semiconductor chips are composed entirely of materials that are transparent to visible light.”50

Proposition XVIII. Science Fiction simulates possible futures (IV.).

Proof.—In Michael Crichton’s techno-thriller, Prey, self-replicating nanobots are out of control: “In the Nevada desert, an experiment has gone horribly wrong. A cloud of nanoparticles—micro-robots—has escaped from the laboratory. This cloud is self-sustaining and self-reproducing. It is intelligent and learns from experience. For all practical purposes, it is alive.”51 These minuscule machines exhibit swarm intelligence, simulating future characteristics of smart dust technology. Capable of independent sensing and communication, they demonstrate the potential risks and dangers associated with advanced, interconnected systems on a micro-scale.

Proposition XIX. Deployed dust.

Proof.—Smart dust devices are equipped with sensors that can measure parameters such as temperature, humidity, light intensity, motion, or even chemical composition in some cases. These sensors allow the smart dust particles to gather real-time information from their surroundings. The collected data can then be processed and transmitted wirelessly to a central system for further analysis and decision-making. One of the key advantages of smart dust technology is its potential for large-scale deployment in diverse environments, enabling extensive data collection and monitoring. It has applications in various fields such as environmental monitoring, agriculture, infrastructure management, healthcare, and even military surveillance.52

Note.—Paolo Bacigalupi’s dystopian novel, The Water Knife, also describes microscopic sensors that bear resemblance to smart dust. Set against a backdrop of widespread water scarcity, these tiny sensors are deployed to monitor water sources and consumption patterns. Their presence underscores the problem of efficient resource management, control, and access: “We knew it was all going to go to hell, and we just stood by and watched it happen anyway. There ought to be a prize for that kind of stupidity.”53 Smart dust has the potential to complicate future resource wars by enabling resource monitoring, surveillance, disruption, and accelerating environmental concerns. Deployed in resource-rich areas, these miniature devices can gather real-time data on valuable assets, provide intelligence for military operations, and be used for sabotage. The widespread deployment of smart dust can trigger a technological race among conflicting parties.

Proposition XX.  Pixie dust pixels.

Proof.—If sensors are smart dust, then point clouds and particle systems are fairy dust. Volumes of data captivate us. Point clouds capture the appearance of objects and environments, casting a spell of trust and belief. Particle systems scintillate, shimmering with potential and movement. They create mesmerizing visual effects, evoking wonder and fascination. They fixate us while the world turns to darkness.

Coroll. I.—Flashes of meaning.

Coroll. II—Motes suspended in a beam of light.

Proposition XXI. Technological dependence is projected.

Proof.—Vernor Vinge’s Rainbows End presents a near-future world in which advanced technology is deeply embedded in everyday life. Interconnected devices seamlessly facilitate communication, entertainment, and access to information. But, “the beginning of trust has to be an in-person contact.”54

Proposition XXII. Totalizing techno-futures.

Proof.—The proof of this proposition is similar to that of the preceding one.

Proposition XXIII. The substance of control is addictive.

Proof.—The title of Neal Stephenson’s novel Snow Crash refers to a fictional narcotic: “This Snow Crash thing—is it a virus, a drug, or a religion? ... What’s the difference?”55 In the book, Snow Crash is a highly addictive substance, originally designed as a brain-altering virus transmitted through both digital and physical means. It takes its name from the description of its effects on the user’s consciousness, likened to a crash of overwhelming sensory and cognitive stimulation. The term snow is a reference to the white noise that accompanies the overdose, comparable to a blizzard of fragmented data overwhelming the mind: “Well, all information looks like noise until you break the code.”56

Proposition XXIV. Information overload.

Proof.—Set in a future world where people have computerized implants, M. T. Anderson’s Feed vividly portrays a society bombarded with a relentless stream of advertisements, news, and entertainment: “I don’t know when they first had feeds. Like maybe, fifty or a hundred years ago. Before that, they had to use their hands and their eyes. Computers were all outside the body. They carried them around outside of them, in their hands, like if you carried your lungs in a briefcase and opened it to breathe.”57 The feed implants provide users with instant access to an overwhelming amount of information, creating a state of perpetual distraction and sensory overload. This information saturation affects individuals’ ability to think critically, form genuine connections, and maintain a sense of personal identity.

Corollary.—Defrag the system.

Proposition XXV. Open electronic wormholes.

Proof.—In Arthur C. Clarke and Stephen Baxter’s novel The Light of Other Days, WormCam offers surveillance and temporal manipulation. WormCam is a technology that allows individuals to observe any location or event in the past through the use of microscopic wormholes: “If the present is shitty and the future is worse, the past is all you’ve got.”58 The technology provides an unprecedented level of access, unveiling the secrets of history and offering a glimpse into moments that were once hidden from human perception. WormCam raises ethical questions about the boundaries of privacy and the implications of constant surveillance. WormCam is the fantasy of unrestricted access to the past and the complex interplay between knowledge, power, and the erosion of personal boundaries.

A wormhole is a hypothetical concept in theoretical physics that represents a shortcut or tunnel through spacetime, connecting two distant regions or even different universes (III.).59 It is often depicted as a tunnel-like structure through which one could pass from one point in the universe to another without traveling through the intervening space. Wormholes are derived from the mathematics of general relativity, Albert Einstein’s theory of gravity (II.).60 While they remain purely theoretical at present, they have captured the imagination of scientists and writers due to their potential for enabling faster-than-light travel across vast cosmic distances.

Theoretical methods for opening a wormhole are still purely speculative and largely remain within the realm of science fiction. Opening a wormhole would mean manipulating spacetime itself, which would require commanding immense amounts of energy. One popular theoretical approach is the use of exotic matter or negative energy. Exotic matter, with negative energy density, is a hypothetical form of matter that violates the standard energy conditions of classical physics. It is speculated that if exotic matter with specific properties could be obtained and controlled, it might be possible to create and stabilize a traversable wormhole.61 Another proposed method is utilizing the phenomenon of quantum entanglement.62 Quantum entanglement involves the instantaneous correlation of properties between particles, regardless of distance. The idea is that by manipulating entangled particles, it might be possible to create a connection that resembles a wormhole (V.).

Note.—A Euclidean wormhole is a theoretical concept derived from the mathematical framework of Euclidean space (V.xl.proof.). Unlike the traditional concept of a wormhole in spacetime, which involves curved spacetime and is based on the theory of general relativity, a Euclidean wormhole exists within a hypothetical flat, Euclidean space.63 In Euclidean geometry, a wormhole is represented as a tunnel or shortcut that connects two distinct regions of space. It can be visualized as a bridge or a tunnel connecting two separate points, allowing for a direct path between them that bypasses the usual distance between the points. Euclidean wormholes are mathematical concepts, not directly related to the physical properties of our universe. They have been studied within the context of theoretical physics and often serve as a simplified model for exploring the possibilities of traversable shortcuts between points in Euclidean space (III.xxxvi.note.). While they may lack the complexities and physical implications of spacetime wormholes, Euclidean wormholes provide a framework for investigating geometric structures and the theoretical possibilities of interconnecting different regions of space (V.xli.proof.).

Corollary.—Capture and Reconstruction are wormholes.

Proposition XXVI. Openings between virtual and physical realities.

Proof.—In William Gibson’s Neuromancer, countless sensors and displays act as a matrix of wormholes that seamlessly bridge the divide between digital and physical realities. These technological interfaces become gateways, enabling individuals to navigate the boundless expanse of the virtual. Sensors are conduits, capturing the subtleties of physical existence and translating them into digital data. Displays are tunnels projecting immersive virtual landscapes into the perceptible realm. Through these sensorial wormholes, Gibson blurs the boundaries between the virtual and physical: “Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts ... A graphic representation of data abstracted from banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding ...”64

Proposition XXVII. A thing, which has been conditioned by supervision, cannot render itself unconditioned.

Proof.—This proposition is evident from the third axiom (I.axiom.iii.).

Proposition XXVIII. The observer effect is a problem of measurement.

Proof.—According to quantum theory, the act of observation or measurement can influence the properties and behavior of particles or systems being observed.65 In other words, the act of measurement can cause a quantum system to collapse into a specific state, thereby altering its properties.

The observer effect suggests that the act of observing or measuring a quantum system disturbs it, making it challenging to observe or measure its original, undisturbed state accurately. This concept is related to the inherent uncertainty and probabilistic nature of quantum mechanics: “Quantum reality is not constrained to the realm of ultra-small. In a certain sense, we are all quantum wavicles meaning that a version of you can wildly vary from one observer to another … observer systemic alternate timelines are true parallel universes”66 (III.xxix.proof.).
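The disturbance can be sketched computationally. Assuming an idealized two-level system (a toy model in Python, not a claim about any physical apparatus), measurement samples an outcome by the Born rule and replaces the superposition with the observed state:

```python
import random

def measure(state):
    """Collapse a two-level quantum state on measurement.

    `state` is (alpha, beta), the amplitudes for |0> and |1>.
    Born rule: outcome 0 with probability |alpha|^2, else 1.
    After measurement the superposition is gone; repeating the
    measurement returns the same answer."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if random.random() < p0 else 1
    collapsed = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
    return outcome, collapsed

# Equal superposition: the undisturbed state.
state = (2 ** -0.5, 2 ** -0.5)

outcome, state = measure(state)       # observation disturbs the system
print(outcome)                        # 0 or 1, at random
print(measure(state)[0] == outcome)   # True: the state has collapsed
```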

Note.—In the context of quantum mechanics, a remote cause refers to a causal relationship between two quantum systems that are spatially separated or distant from each other. It challenges the classical notion of causality, where cause and effect occur in close proximity or within a localized region. In certain quantum phenomena, such as entanglement, particles can become correlated in a way that their properties become interdependent, regardless of the physical distance between them. When two entangled particles are measured or interacted with, the outcomes of their measurements are instantaneously correlated, even if they are separated by vast distances. This phenomenon has been experimentally confirmed and is often referred to as spooky action at a distance (V.vii.).67 The concept of remote cause in quantum mechanics suggests that the state or measurement of one particle can have an instantaneous influence on the state or behavior of another particle, regardless of the spatial separation between them. This challenges our intuitive understanding of cause and effect, as the influence appears to occur faster than the speed of light, violating the classical notion of locality.

Proposition XXIX. Nothing in the universe is contingent, but all things are conditioned to exist and operate in a particular manner ...

Proof.—Remote control.

Note.—Remote control refers to the ability to operate or control a device, system, or process from a distance, without direct physical contact. It involves using a device, such as a handheld transmitter or mobile application, to send signals or commands wirelessly to the controlled device.

Proposition XXX. Or complete automation.

Proof.—A true idea must agree with its object (I.axiom.vi.); in other words (obviously), humans and machines in alignment.

Proposition XXXI. Unquestioning servitude.

Proof.—In Neal Stephenson’s novel The Diamond Age, AI automation plays a central role in shaping the future society. Nanotechnology and advanced artificial intelligence are prevalent. Automation, in the form of intelligent agents and interactive devices, pervades all aspects of life: “The universe was a disorderly mess, the only interesting bits being the organized anomalies.”68 Personalized assistance. Streamlined processes. These AI systems take on tasks ranging from education and child care to manufacturing and resource management. AI systems are seamlessly integrated into the fabric of society, blurring the boundaries between human and machine interaction, and reshaping the dynamics of work, education, and personal relationships: “That we occasionally violate our own stated moral code does not imply that we are insincere in espousing that code.”69

Note.—Neal Stephenson’s views on automation in The Diamond Age are not explicitly stated in the novel: “‘Which path do you intend to take, Nell?’ said the Constable, sounding very interested. ‘Conformity or rebellion?’ ‘Neither one. Both ways are simple-minded—they are only for people who cannot cope with contradiction and ambiguity.’”70 Extensive automation presents trade-offs between technological progress and the preservation of human values and autonomy (IV.).

Proposition XXXII. Will cannot be called a free cause, but only a necessary cause.

Proof.—Free will is in question (IV.). Human will is constrained by various factors. A necessary cause implies that human will is determined or influenced by preceding factors, such as genetics, upbringing, societal conditioning, or environmental circumstances. It suggests that our choices and actions are not entirely autonomous but rather driven by a combination of internal and external forces. There is no unfettered agency (V.ix.). Decisions are bound by deterministic factors. Our choices are predictable or determined by the causal chain of events and the conditions in which we exist. This perspective aligns with certain philosophical and scientific viewpoints that question the extent of human freedom and emphasize the interplay between causality, determinism, and the complexities of human behavior and decision-making.

Coroll. I.—Hence it follows, first, that God does not act according to freedom of the will.

Coroll. II.—Supervision is automated.

Proposition XXXIII. Things could not have been brought into being by God in any manner or in any order different from that which has in fact obtained.

Proof.—There is only one way.

Note I.—In many science fiction narratives, the notion of capturing and reconstructing virtual realities highlights the potential for manipulation and control. The possibility of individuals being trapped within simulated environments. Experiences reconstructed and manipulated. Surveilled. Coerced. Obliterated.

Note II.—Science fiction also contemplates the power of capture and reconstruction to reshape and refine identity. The potential for individuals to assume new personas, inhabit other bodies. The transformative potential of the virtual. The malleability of identity. The impact on self-perception and the consequences of disconnecting from one’s physical reality.

Proposition XXXIV. God’s power is identical with its digital twin.

Proof.—Digital twins are virtual representations of physical objects, systems, or processes (III.). They are created by collecting real-time data from sensors embedded in the physical object or system and using it to build a virtual model that mirrors its real-world counterpart. The digital twin serves as a live simulation or emulation, providing insights into the performance, behavior, and condition of the physical object (III.instances of reconstructions.xxxiv.).

Digital twins can be used in various domains, including manufacturing, infrastructure, and transportation: “The Los Angeles Department of Transportation has partnered with the Open Mobility Foundation to create a data-driven digital twin of the city’s transport infrastructure. To start with, it will model the movement and activity of micro-mobility solutions such as the city’s network of shared-use bicycles and e-scooters. After that, it will be expanded to cover ride-sharing services, carpools, and new mobility solutions that will appear, such as autonomous taxi drones.”71 Digital twins enable real-time monitoring, analysis, and optimization of physical assets, enhancing operational efficiency, decision-making, and maintenance processes. By capturing and analyzing data from sensors, digital twins can simulate different scenarios, predict outcomes, and take proactive measures.

The concept of digital twins goes beyond mere data visualization or representation. It involves the integration of data analytics, machine learning, and simulation to create a dynamic and interactive virtual model that can evolve alongside its physical counterpart: “The EU-funded Neurotwin project aims to simulate specific human brains in order to build models that can predict the best treatments for conditions such as Alzheimer’s and epilepsy. There have been other attempts to simulate aspects of the brain in the past, but Neurotwin is the first project that focuses on modeling both the electromagnetic activity and the physiology.”72 The digital twin continually receives data from the physical object, updating its virtual representation to reflect the real-time status and characteristics. Digital twins facilitate remote monitoring and control, allowing operators to interact with and manage assets from a distance. Digital twins also support the testing and validation of changes, reducing the time and cost associated with physical prototyping.

With advancements in technology such as the Internet of Things (IoT), data analytics, and cloud computing, digital twins are becoming increasingly sophisticated and integrated into various industries. They contribute to the growing field of the Industrial Internet of Things (IIoT), enabling the digital transformation and optimization of complex systems. Closed loop control. Lights out operation. Virtual laboratories. Simulation environments (IV.lxviii.).

Proposition XXXV. Machine prophecies.

Proof.—Digital twins predict future outcomes.73 Through the integration of advanced analytics, machine learning, and artificial intelligence, digital twins analyze vast datasets, identify patterns, and make informed predictions. Digital twins predict equipment failures. Digital twins predict disease progression and treatment outcomes.
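A minimal sketch of such a prophecy in Python; the class PumpTwin, its fields, and its vibration threshold are hypothetical stand-ins for the analytics and machine learning layers of a production twin:

```python
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    """Hypothetical digital twin of a physical pump: mirrors live
    sensor readings and flags drift before the hardware fails."""
    asset_id: str
    temperature_c: float = 0.0
    vibration_mm_s: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, reading):
        # Update the virtual model from a real-time sensor reading.
        self.temperature_c = reading["temperature_c"]
        self.vibration_mm_s = reading["vibration_mm_s"]
        self.history.append(reading)

    def predict_failure(self):
        # Toy stand-in for the analytics layer: sustained vibration
        # above a threshold is read as impending bearing wear.
        recent = self.history[-5:]
        return len(recent) == 5 and all(r["vibration_mm_s"] > 7.0 for r in recent)

twin = PumpTwin("pump-07")
for v in (7.2, 7.5, 7.4, 7.8, 8.1):
    twin.ingest({"temperature_c": 61.0, "vibration_mm_s": v})
print(twin.predict_failure())  # True: maintenance is scheduled remotely
```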

Proposition XXXVI. Orders of operation from afar.

Proof.—Digital twins unlock remote control. Unprecedented power to manipulate physical assets and systems from afar. With their virtual counterparts mirroring the behavior and characteristics of real-world objects, supervisors can influence the system without being physically present. What does automation set in motion? Virtual replicas of unfolding processes. Digital twins usher in an era of automation. Automation eliminates manual intervention, streamlines processes, and minimizes human error. What oversight? What displacement? What risks?

Risks to body, space, system. Digital twins of critical infrastructure, such as power plants, water treatment facilities, or transportation systems, can be vulnerable to cyberattacks. If malicious actors gain unauthorized access to the digital twin, they could manipulate or disrupt the virtual model, potentially leading to real-world consequences such as power outages, water contamination, or transportation disruptions.74 Digital twins used in healthcare, particularly those associated with patient data and medical devices, can pose risks to privacy and patient safety if not adequately secured. Unauthorized access to medical digital twins could result in the exposure or manipulation of sensitive patient information. Tampering with medical device digital twins could have severe consequences for patient health and safety: “zero-trust is coming”75 (IV.lxviii.note.).

Digital twins that are interconnected with industrial control systems, such as those in manufacturing or energy sectors, could be targeted by cybercriminals. If a digital twin is compromised, it could provide a pathway for attackers to infiltrate and manipulate the corresponding physical system. Digital twins used in autonomous vehicles could be susceptible to attacks that manipulate or deceive the vehicle’s perception and decision-making capabilities. If the digital twin’s data or algorithms are compromised, it could lead to misinterpretation of the environment, causing accidents or unauthorized control over the vehicle’s operations. Digital twins employed in smart city initiatives—interconnected systems for surveillance, energy grids, or traffic management, like the one in development for Los Angeles—may face risks related to unauthorized access or data breaches. Manipulating these digital twins could disrupt essential services, compromise privacy, or enable malicious surveillance.

Risks stem from vulnerabilities. Unauthorized access. Security protocols. Encryption. Ensure the integrity of the twin! Digital doppelgängers may serve our overlords, hoarding our data and surveilling our every move. Digital twins tighten their grasp on our lives. The path to unsupervised supervision may lead to the surrender of our freedoms.

APPENDIX:

There is necessary tension between supervision and the unsupervised. This tension emerges in the engineering and application of Capture and Reconstruction (IV.xxxix.). It is also the dialectic of self-aware development. What complexities exist—agency, growth, regulation—within the framework of supervision?

Supervision is a colonizing practice and technological ideology. It embodies structures of control and dynamics of power. In contrast, the unsupervised is without external oversight, guidance, or control. Absence of presence—position of authority—monitoring—directed action and behavior. Unsupervised agents make decisions—move freely—outside the influence of supervisor, external authority, operating system.

No supervision is no accountability—open for reckless risk-taking. Potential dangers arise from unregulated behavior, both at the human and industrial scale. At the human scale, unregulated behavior can result in ethical transgressions, harm to others, and the violation of environmental integrity. Without external guidance or accountability, individuals may engage in harmful or destructive actions, causing harm to themselves or others. Unregulated behavior can lead to moral erosion, as individuals may be prone to selfish pursuits, exploitation, or the neglect of collective well-being.76 At an industrial scale, unregulated behavior has severe environmental and societal repercussions. Industries operating without proper supervision or regulation may exploit natural resources, disregard sustainable practices, and contribute to pollution and ecological damage. Unregulated industrial practices can endanger ecosystems, compromise public health, and perpetuate social inequalities. Without appropriate oversight, industrial operations may prioritize profit-seeking over worker safety, or the long-term impact on communities and the environment. The absence of regulation can lead to a lack of accountability, enabling companies to engage in unethical practices, exploit labor, or evade responsibility for the consequences of their actions.

Proper regulation fosters a sense of collective responsibility, ensuring that individuals and industries act in accordance with ethical, legal, and environmental principles. Regulation promotes sustainable practices, safeguards the public, and facilitates the equitable distribution of resources and opportunities.

Regulation and supervision fold in on one another. A convolution (IV.xlv.). A twist. The relationship between regulation and supervision provokes the question—Is regulation a form of supervision? Regulation can be seen as a proactive measure enacted by governing bodies to supervise and guide the behavior of individuals, organizations, or industries. It sets rules, standards, and frameworks that dictate acceptable practices. Ensures compliance. Mitigates risks. In this sense, regulation acts as a form of supervision by establishing boundaries and overseeing activities. Some argue that supervision should involve more flexible, adaptive approaches that foster self-regulation and individual accountability.77 

Ultimately, it is a question of balance between oversight and freedom, fate and will. The tension between the supervised and the unsupervised is self-aware construction. Actively engaging with this tension becomes a calling. A calling for the reevaluation of power dynamics, the dismantling of oppressive structures, the recognition of individuals as active participants in their own development.

Every living being is a device of Capture and Reconstruction. Every being is a universe sensing a multiverse.











PART II.

ON THE NATURE AND OPTICS OF CAPTURE.

PREFACE

The word capture implies seizure and control. Capture technologies collect data from physical reality—temperature, humidity, pressure, proximity, speed, rotation, chemical levels, radiation, light, color, movement, and depth information. Part II inspects not, indeed, all of them (for we proved in Part i., Prop. xvi., that an infinite number must follow in an infinite number of ways), but only Capture technologies that are used to record and store visual and spatial information—with the implicit goal of outputting a reconstruction. Part II examines the history, hardware, and methodologies of Capture as well as the ethics of extractionism and surveillance in the history of capturing black bodies.78,79 The camera, in all its variations, is the sensor at the center of this field.

The word optics has its etymological roots in the word for appearance, or look.80 It refers to a branch of physics as well as public perception—good and bad.81 

In physics, optics is the study of light—and other forms of radiation—its properties, interactions with matter, and image formation. It includes geometrical optics, which deals with the propagation of light as rays; physical optics, which considers light as waves and particles; and quantum optics, which allows for the coexistence of behaviors: “Every period has its own optical focus.”82 The origins of optics can be traced back to the first lenses produced by ancient civilizations.

The oldest known lenses—considered to be of exceptional quality—are estimated to have been made between 2620 and 2400 BC in Saqqara, Egypt, during the IV and V Dynasties of the Old Kingdom. These lenses—made of rock crystal, magnesite, and copper-arsenic alloy—were inlaid in the eyes of funerary statues: “When one observes these statues, and then circles about them in any direction, the ‘eyes’ appear to follow the observer—it is rather an amazing experience, easily observed and photographed.”83 The most cited example of this phenomenon is found in “Le Scribe Accroupi, at The Louvre, Paris … photographed near head on, and then again recorded to the left side of the statue … Movement of the iris apertures is clearly apparent.”84 The seated scribe was discovered in 1850 by French archeologist Auguste Mariette. It has been held captive in France’s national collection of Egyptian antiquities for almost two centuries and was recently moved to the Louvre-Lens annex.85

The ancient Mesopotamians were also known to use lenses made from polished crystals and glass beads. The Nimrud lens, for instance, was unearthed in modern-day Iraq, a neo-Assyrian treasure excavated by Sir Austen Henry Layard in 1850: “With the glass bowls was discovered a rock-crystal lens, with opposite convex and plane faces. Its properties could scarcely have been unknown to the Assyrians, and we have consequently the earliest specimen of a magnifying and burning-glass. It was buried beneath a heap of fragments of beautiful blue opaque glass, apparently the enamel of some object in ivory or wood, which had perished.”86 Today, this looted artifact remains in the collections of the British Museum in London.87,88

The ancient Greeks laid the foundation for geometrical optics. They studied the properties of light and proposed theories on how it behaves: “Long before either wave or particle, some (Pythagoras, Euclid, Hipparchus) thought that our eyes emitted some kind of substance that illuminated, or ‘felt,’ what we saw. (Aristotle pointed out that this hypothesis runs into trouble at night, as objects become invisible despite the eyes’ purported power.) Others, like Epicurus, proposed the inverse—that objects themselves project a kind of ray that reaches out toward the eye, as if they were looking at us (and surely some of them are). Plato split the difference, and postulated that a ‘visual fire’ burns between our eyes and that which they behold. This still seems fair enough.”89 

In the Early Modern period, European scholars borrowed heavily from Chinese (II.i.) and Middle Eastern (II.iii.) science as a foundation for their own theoretical optics. In 1604, German mathematician and astronomer Johannes Kepler formulated the laws of geometric optics, revolutionizing the understanding of light and vision: “Key to properly understanding ocular function, Kepler realized, was understanding the optics of the crystalline lens. Accordingly, he turned first to a mathematical analysis of light rays passing through a transparent sphere from various points outside it. On that basis he showed how parallel rays are brought to a focal concentration after exiting the sphere and undergoing spherical aberration. He also showed how the resulting focal area can shift depending on how close or how distant the light source is from the sphere. The closer the source, he concluded, the more distant the focal area, and vice-versa.”90 Shortly after, Italian scientist and inventor Galileo Galilei made groundbreaking observations using lenses, including the development of the refracting telescope: “Early telescopes were primarily used for making Earth-bound observations, such as surveying and military tactics. Galileo Galilei was part of a small group of astronomers who turned telescopes towards the heavens. After hearing about the ‘Danish perspective glass’ in 1609, Galileo constructed his own device.”91

In 1621, Dutch scientist Willebrord Snellius articulated the mechanism behind the telescopic view, the law of refraction: when light travels from one transparent substance into another, the way it bends or changes direction depends on the angle at which it enters, the angle at which it leaves, and the refractive index—a value unique to each substance that describes how much it can bend light.92 In his 1637 work, La Dioptrique—or Dioptrics—Descartes used his principles of geometry to describe how light bounces off objects and enters our eyes, enabling us to see: “consider light as nothing else … than a certain movement or action, very rapid and very lively, which passes toward our eyes through the medium of the air and other transparent bodies.”93 Like Snellius, he proposed that the angle at which light enters a different medium affects its path. This principle is now known as the law of refraction, or Snell's Law. In 1662, Pierre de Fermat formalized the Principle of Least Time, building on earlier models of refraction. It states that of all the possible paths light could take to travel from one point to another, it takes the one that requires the least time.94
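In modern notation (a standard reconstruction, not the symbols Snellius or Fermat themselves used), the law of refraction and the Principle of Least Time read:

```latex
% Snell's law: n is each medium's refractive index,
% theta the angle measured from the surface normal
n_1 \sin\theta_1 = n_2 \sin\theta_2

% Fermat: of all paths from A to B, light takes the one whose
% optical path length (and hence travel time) is stationary,
% in the simplest cases a minimum
\delta \int_{A}^{B} n(\mathbf{r}) \, ds = 0
```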

Parallel to these theoretical developments in optics, Baruch Spinoza, the Dutch thinker—and the mind behind the Ethics (V.)—rose to prominence as one of the most skilled lens-grinders of the time.95 He honed his craft to such perfection that his lenses were coveted by scientists across the breadth of Europe. Spinoza was the ultimate optical technician—an inflection point—in astronomy, microscopy, and cartography. He formed tools and ideas that revealed the structural symmetries and interconnectedness of existence. Spinoza exchanged letters with countless innovators,96 perhaps most notably the Dutch physicist Christiaan Huygens.97

Previous accounts of optics treated light as rays—as straight lines. In 1678, Huygens proposed that light propagates as a wave: “I call them waves from their resemblance to those which are seen to be formed in water when a stone is thrown into it, and which present a successive spreading as circles, though these arise from another cause, and are only in a flat surface.”98 Huygens asserted that every point that a luminous disturbance encounters can be considered a source of a secondary wavelet. He envisioned that light spread out in spherical waves from these points, a concept now known as Huygens' Principle. Huygens’ theory accounted for the phenomena of reflection and refraction, building on Snell's law, by considering each point on the wavefront as a source of secondary wavelets:

“But what may at first appear full strange and even incredible is that the undulations produced by such small movements and corpuscles, should spread to such immense distances; as for example from the Sun or from the Stars to us. For the force of these waves must grow feeble in proportion as they move away from their origin, so that the action of each one in particular will without doubt become incapable of making itself felt to our sight. But one will cease to be astonished by considering how at a great distance from the luminous body an infinitude of waves, though they have issued from different points of this body, unite together in such a way that they sensibly compose one single wave only, which, consequently, ought to have enough force to make itself felt. Thus this infinite number of waves which originate at the same instant from all points of a fixed star, big it may be as the Sun, make practically only one single wave which may well have force enough to produce an impression on our eyes.”99 

His wave theory also gave an explanation for the phenomenon of diffraction, which was something that the particle theory of light struggled to explain. When light passed through a narrow slit and spread out, Huygens' principle provided a suitable explanation for the pattern created, as each point of the wavefront could be considered a source of secondary wavelets, forming a new wavefront that was not a straight line.

In 1704, English scientist Isaac Newton published Opticks, and in it, the Corpuscular Theory of Light. According to this theory, light was composed of small discrete particles, which he called corpuscles: “Between the parts of opake and colour'd bodies are many spaces, either empty, or replenish'd with mediums of other densities; as water between the tinging corpuscles wherewith any liquor is impregnated, air between the aqueous globules that constitute clouds or mists; and for the most part spaces void of both air and water, but yet perhaps not wholly void of all substance, between the parts of hard bodies.” Newton believed these corpuscles—or particles—were emitted by light sources, such as the sun or a candle. Newton's theory faced challenges due to its inability to explain phenomena like diffraction and interference. Wave theories overtook particulate theories, supported by experiments like the double-slit experiment and observations of the consistency of the speed of light.

In 1814, French engineer and physicist Augustin-Jean Fresnel challenged Newton's corpuscular view in his Reveries.100 Extending Huygens’ theory, Fresnel explained various optical phenomena, such as interference, by treating light as a wave phenomenon rather than a stream of particles. He formalized and published his ideas in De la Lumière—On Light—in 1822, the same year that he demonstrated his lens design to King Louis XVIII: “The first Fresnel lens, installed in the elegant Cardovan Tower lighthouse on France's Gironde River in 1822, was visible to the horizon, more than 20 miles away. Sailors had long romanticized lighthouses. Now scientists could rhapsodize, too. ‘Nothing can be more beautiful than an entire apparatus for a fixed light,’ one engineer said of Fresnel's device. ‘I know of no work of art more beautifully creditable to the boldness, ardor, intelligence, and zeal of the artist.’”101 Fresnel lenses were initially created to address the limitations of large, heavy lenses used in lighthouses. The traditional lenses were bulky and required significant amounts of glass, making them expensive to produce and challenging to transport. Fresnel tackled this problem by dividing the lens into multiple concentric rings—zones—which gradually decrease in thickness from the center outward. Each zone of the lens bends and focuses light, achieving similar focusing capabilities to conventional lenses while reducing the material and weight required.102 Waves of light guiding ships through tumultuous waters.103

Scottish physicist James Clerk Maxwell is renowned for formulating the electromagnetic theory of light and unifying the fields of optics and electromagnetism: “The most important aspect of any phenomenon from a mathematical point of view is that of a measurable quantity. I shall therefore consider electrical phenomena chiefly with a view to their measurement, describing the methods of measurement, and defining the standards on which they depend.”104 In 1865, Maxwell's mathematical equations successfully demonstrated that light is an electromagnetic wave, propagating through space with oscillating electric and magnetic fields: “We now proceed to investigate whether these properties of that which constitutes the electromagnetic field, deduced from electro­magnetic phenomena alone, are sufficient to explain the  propagation of light through the same substance.”105 This theory revolutionized optics and laid the foundation for modern physics. Maxwell's work extended beyond theory, as he extensively experimented with lenses to explore and manipulate the behavior of light, which led to the invention of color photography (II.viii.).
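In the modern vector form (a later condensation by Oliver Heaviside; Maxwell's own treatise worked with twenty component equations), the source-free equations combine into a wave that travels at the measured speed of light:

```latex
\nabla \cdot \mathbf{E} = 0 \qquad
\nabla \cdot \mathbf{B} = 0 \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t} \qquad
\nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}

% Taking the curl of the two curl equations yields a wave equation
% whose propagation speed matches that of light:
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 2.998 \times 10^{8} \ \mathrm{m/s}
```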

In the same decade, Gustav Kirchhoff introduced the idea of the black body: “...the supposition that bodies can be imagined which, for infinitely small thicknesses, completely absorb all incident rays, and neither reflect nor transmit any. I shall call such bodies perfectly black, or, more briefly, black bodies.”106

A black body—a theoretical concept in physics—absorbs all incident radiation, at every wavelength, and is likewise an ideal emitter. According to Planck's law, formulated by the physicist Max Planck in 1900, the spectral intensity of black body radiation at a given wavelength is determined by the temperature of the black body. The radiation emitted by a black body is a result of the thermal energy possessed by its constituent particles, such as atoms and molecules. Black body radiation has a continuous spectrum, containing all possible wavelengths of electromagnetic radiation. As the temperature of the black body increases, the intensity and distribution of the emitted radiation change, and the peak of the spectrum shifts toward shorter wavelengths, a relationship described by Wien's displacement law.107
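In modern notation, Planck's law gives the spectral radiance of a black body at wavelength λ and temperature T, and Wien's displacement law locates the peak of that spectrum. Both are standard results, reproduced here for reference:

B_\lambda(\lambda, T) = \frac{2hc^2}{\lambda^5} \cdot \frac{1}{e^{hc/(\lambda k_B T)} - 1}, \qquad \lambda_{\mathrm{peak}} = \frac{b}{T}, \quad b \approx 2.898 \times 10^{-3}\ \mathrm{m \cdot K}

where h is Planck's constant, c the speed of light in a vacuum, and k_B the Boltzmann constant.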

The study of black body radiation played a crucial role in the development of quantum mechanics. In the early 20th century, scientists such as Max Planck and Albert Einstein used black body radiation to explain phenomena that could not be explained by classical physics alone. Planck’s work on black body radiation was particularly significant as it led to the introduction of the quantum concept, marking a fundamental shift in the understanding of energy and matter: "My unavailing attempts to somehow reintegrate the action quantum into classical theory extended over several years and caused me much trouble."108 Building on Planck's quanta, Einstein proposed that light can behave as both particles and waves, introducing the notion of photons as discrete packets of energy—quanta of light.

Special relativity, formulated by Albert Einstein in 1905, introduces the concept that the speed of light in a vacuum is constant and is the maximum speed at which information or energy can travel.109 As photons are particles of light and have zero rest mass, they always travel at the speed of light in a vacuum. This principle of special relativity sets a cosmic speed limit. In general relativity, Einstein's theory of gravity, the curvature of spacetime is influenced by the distribution of mass and energy. Photons, being massless particles, follow paths dictated by this curved spacetime geometry. The presence of massive objects can bend the trajectory of light, causing gravitational lensing:  “One profound result of Einstein’s theory of general relativity: gravity bends the path of light, much as it affects the path of massive objects. Very massive astronomical bodies, such as galaxies and galaxy clusters, can magnify the light from more distant objects, letting astronomers observe objects that would ordinarily be too far to see. Even the gravity from planets affects light, allowing researchers to detect worlds in orbit around other stars.”110 The theory of relativity also provides a framework for understanding the concept of time dilation. Time can appear to pass differently for observers in relative motion or in gravitational fields. This has been experimentally confirmed, and it influences the behavior of photons.111
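The standard expressions are compact: for relative motion at speed v, elapsed time is stretched by the Lorentz factor, and a clock at radius r from a mass M runs slow relative to a distant observer. Both are textbook results, reproduced here for reference:

\Delta t' = \gamma\,\Delta t, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad \Delta t_r = \Delta t_\infty \sqrt{1 - \frac{2GM}{rc^2}}

where G is the gravitational constant and c the speed of light in a vacuum.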

Today, the principles of black body radiation and quantum mechanics are foundational in fields such as astrophysics, engineering, and computing. Black body radiation models are employed to analyze and interpret the radiation emitted by stars, galaxies, and other celestial objects. Emerging technologies aim to capture black body radiation; for instance, Cosmic Microwave Background (CMB) radiation is the closest thing to perfect black body radiation that has ever been observed: “... human eyes cannot see the microwaves from the CMB (or X-rays or infrared rays either). However, using specially designed detectors, such as those to be carried by the Planck [satellite], we can. The CMB is the farthest and oldest light any telescope can detect. It is impossible to see further beyond the time of its release because then the Universe was completely ‘opaque.’ The CMB takes astronomers as close as possible to the Big Bang, and is currently one of the most promising ways we have of understanding the birth and evolution of the Universe in which we live.”112 It is like a historical photograph, capturing the moment in the early universe when the hot, ionized plasma—a charged gas—had cooled just enough for matter and radiation to decouple.

More recently, a team of scientists captured the first image of a black hole—a black body—by utilizing the technique of Very Long Baseline Interferometry (VLBI) and forming the Event Horizon Telescope (EHT): “I met [Sagittarius A*] 20 years ago and have loved it and tried to understand it since, but until now, we didn’t have the direct picture.”113 By synchronizing an array of telescopes located around the world, the EHT aimed to create a virtual telescope with an aperture equal to the diameter of the Earth, enabling them to image distant objects with high resolution. Their primary targets were Sagittarius A*, the supermassive black hole at the center of our Milky Way galaxy, and M87*, an active supermassive black hole located in the galaxy Messier 87. The EHT team gathered data from multiple telescopes for several days, which was later combined and processed to produce the first-ever image of a black hole's silhouette. NASA spacecraft and telescopes observed the black hole at various wavelengths to complement the EHT's findings and provide further insights into its environment.114 Even a black hole can be captured.

DEFINITIONS

DEFINITION I. By capture I mean extraction with an aim toward perfection.

DEFINITION II. I consider as belonging to the likeness of a thing, the spatial coordinates of its surface and corresponding values.

DEFINITION III. By target, I mean the continuous surface of the captive.

Explanation.—I say continuous because the aim is the boundary of the captive's identity.

DEFINITION IV. By captive, I mean the subject at the center of each frame.

Explanation.—The use of the word captive reinforces, in an explicit way, the colonial logic of technologies of Capture.

DEFINITION V. Duration is the indefinite continuance of existing.

Explanation.—The captive is always changing.

DEFINITION VI. Reality and perfection I use as synonymous terms.

DEFINITION VII. By perfection, I mean ground truth.

AXIOMS

I. Capture inverts.

II. Capture fixes.

III. Capture extracts.

IV. Capture distorts.

V. Capture circulates.

N.B. The Postulates are given after the conclusion of Proposition xiii.

PROPOSITIONS

Proposition I. A camera is a tool of inversion.

Proof.—A camera obscura works on a basic optical principle—when light passes through a small hole into a darkened enclosure, it projects an inverted image of the scene outside onto a surface inside. The device originally took the form of a darkened room with a small hole in one wall. Later versions, more portable and convenient, incorporated lenses to focus the light and mirrors to correct the inversion of the image.115
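The geometry reduces to a pair of ratios. A minimal sketch in Python, assuming a scene point at height Y and distance Z from the aperture, projected onto a wall f units inside the dark room (all values invented for illustration):

def project(point, f=1.0):
    # A point (X, Y, Z) in front of the aperture lands at (-f*X/Z, -f*Y/Z)
    # on the far wall; the negative signs are the inversion itself.
    X, Y, Z = point
    return (-f * X / Z, -f * Y / Z)

head = (0.0, 1.8, 5.0)   # top of a figure standing 5 units from the hole
feet = (0.0, 0.1, 5.0)   # base of the same figure

print(project(head))  # (-0.0, -0.36): the head lands low on the wall
print(project(feet))  # (-0.0, -0.02): the feet land high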

Note.—The principle of the camera obscura—Latin for dark room—was first documented by Chinese philosopher Mozi circa 400 BC. Mozi, also known as Mo Di, was a Chinese philosopher who lived during the Warring States period, from around 470 to 391 BC. He was the founder of Mohism, a school of thought characterized by its emphasis on logic, observation, and inquiry, as well as the virtues of impartial caring and moral duty.116  In his book, Mozi, also known as The Mo Jing, he described the formation of an inverted image: “The image being inverted depends on there being an aperture at the cross-over and the image being distant. The explanation lies in the aperture. The image. The light reaches the person shining like an arrow. The lowest that reaches the person is the highest and the highest that reaches the person is the lowest. The feet conceal the lowest light and therefore become the image at the top. The head conceals the highest light and therefore becomes the image at the bottom.”117 This observation was part of a larger section in his writings on optics and the nature of light, which he argued travels in straight lines.

Proposition II. A camera is a hole.

Proof.—The proof of this proposition is similar to that of the last.

Proposition III. Optics outside the body.

Proof.—Development of the camera obscura resumed with Arab scholar Ibn al-Haytham in the 10th century AD. Ibn al-Haytham, also known by the Latinized name Alhazen, was a pioneering scientist and polymath from the Islamic Golden Age, living from circa 965 to 1040 AD. His work covered a wide range of scientific and philosophical subjects, but he is perhaps best known for his groundbreaking contributions to the understanding of vision, optics, and light.118

Note.—In his seminal work, Kitab al-Manazir—The Book of Optics—Ibn al-Haytham provided an early description and analysis of the camera obscura. He explained that when light passes through a small hole inside a darkened room or box, it projects an inverted image of the outside world onto an opposite surface. He performed a series of experiments with light passing through small apertures and demonstrated how this resulted in the projection of the external image: “This becomes clearly apparent to sense if one examines the lights that enter through holes, slits and doors into dusty chambers. As for the light of the sun, when it enters through a hole into a dark chamber the air of which is cloudy with dust or smoke, the light will appear to extend rectilinearly from the hole through which the light enters to the place on the chamber’s floor or walls which that light reaches.”119 

Ibn al-Haytham's rigorous scientific approach, including his use of empirical evidence and systematic experimentation, was revolutionary for his time and has led many to regard him as the first true scientist: “The seeker after truth is not one who studies the writings of the ancients … and puts his trust in them, but rather the one who suspects his faith in them and questions what he gathers from them … Thus the duty of the man who investigates the writings of scientists, if learning the truth is his goal, is to make himself an enemy of all that he reads, and, applying his mind to the core and margins of its content, attack it from every side.”120 His work on the camera obscura cemented understanding of the device and laid the groundwork for future advancements in the fields of optics, physics, and visual perception. It would later influence European scholars after being translated into Latin during the Middle Ages. His text played a key role in the scientific revolution in Europe. Between the 14th and 17th centuries, the camera obscura was used extensively.

Proposition IV. A camera points at accuracy.

Proof.—Early Modern cartographers used the camera obscura as an instrument to plot more accurate maps and charts. This period marked a time of significant exploration and discovery, when accurate maps were essential for navigation. The camera obscura could be used to project images of landscapes onto a surface where they could be traced, creating a highly detailed and proportionally accurate representation of the scene.121 This was particularly useful for mapping coastlines and cityscapes, which could be complex and challenging to represent accurately. When set up at a high vantage point overlooking a city, a camera obscura could project an image of the entire city onto a single piece of paper. Cartographers could then trace the projected image to produce an accurate, detailed map of the city.

Proposition V. A camera points to space.

Proof.—In addition to mapping physical locations, the camera obscura was also used to map the night sky.122 Astronomers in the Early Modern period used the device to project images of the stars and planets onto a surface where they could be recorded, leading to some of the most accurate astronomical charts of the time. The device's ability to project bright images onto a dark background made it ideal for studying celestial bodies. The principle of the camera obscura was particularly helpful in observing solar phenomena without the risk of eye damage. Directly viewing the sun, especially during events like solar eclipses, can cause severe retinal damage. A camera obscura allows for indirect observation.

Proposition VI. A camera augments vision.

Proof.—In the early 17th century, around the time of his formulation of geometrical optics, Johannes Kepler coined the term camera obscura. Kepler used the device to observe a solar eclipse in 1605 and made significant discoveries about the nature of the moon's shadow on Earth.123 The subsequent invention of the telescope dramatically increased the capacity to observe celestial bodies, and the camera obscura was adapted to fit these new instruments. Astronomers attached a camera obscura box to their telescopes, enabling them to project the magnified image onto a piece of paper and trace the celestial bodies and their movements.

Corollary.—A camera gives us super vision.

Proposition VII. Imposes a grid.

Proof.—Leon Battista Alberti, the Italian architect, philosopher, and cryptographer, developed another device, the veil, to capture three-dimensional space.

Corollary.—Sheer discontinuity.

Note.—Alberti's writing indicates the utility of the device: “I believe nothing more convenient can be found than the veil, which among my friends I call the intersection, and whose usage I was the first to discover. It is like this: a veil loosely woven of fine thread, dyed whatever color you please, divided up by thicker threads into as many parallel square sections as you like, and stretched on a frame. I set this up between the eye and object to be represented, so that the visual pyramid passes through the loose weave of the veil. This intersection of the veil has many advantages, first of all because it always presents the same surface unchanged, for once you have fixed the position of the outlines, you can immediately find the apex of the pyramid you started with, which is extremely difficult to do without the intersection.”124 The veil consisted of a squared-off grid that—when positioned between the observer and the observed—would break down the scene into a series of smaller, manageable squares. This allowed the viewer to collect depth data and record spatial coordinates by translating the viewed scene onto a similarly gridded paper—each square representing a segment of the visual field.

He also invented the finitorium,125 a radial dial with descending plumb lines. This device is placed above an object. The arm of the radial dial indicates XY coordinates and the weighted plumb lines measure Z coordinates. The operator rotates the arm, repositions the plumb lines, and records coordinates where the plumb line intersects the surface of the object. A virtual model made of points. The finitorium descended from the astrolabe, an ancient astronomical instrument used for solving various celestial calculations, including measuring the positions of stars, determining local time, and finding one's latitude. It consists of a circular disk with various markings, an alidade—a pivoting pointer—and a rotating plate with a sighting mechanism. The theodolite also emerged from this lineage of angular measurement.
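A minimal sketch of the finitorium's arithmetic, assuming each reading consists of an arm angle, a radial distance along the arm, and the measured drop of the plumb line from the dial's plane (names and numbers invented for illustration):

import math

def finitorium_point(angle_deg, radius, drop, dial_height=1.0):
    # The arm's angle and the radial position give X and Y;
    # the plumb line hanging from the arm down to the surface gives Z.
    theta = math.radians(angle_deg)
    x = radius * math.cos(theta)
    y = radius * math.sin(theta)
    z = dial_height - drop
    return (x, y, z)

# Rotate the arm, reposition the line, record. A virtual model made of points.
readings = [(0.0, 0.20, 0.45), (15.0, 0.20, 0.41), (30.0, 0.20, 0.38)]
cloud = [finitorium_point(a, r, d) for a, r, d in readings]
print(cloud)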

The invention of the theodolite is attributed to Leonard Digges, an English mathematician and surveyor, in the 16th century.126 A theodolite is a precision optical instrument used in surveying and engineering to measure horizontal and vertical angles. It consists of a telescope mounted on a rotating base and a vertical axis. The telescope can be rotated horizontally—azimuth—and vertically—elevation—and is equipped with crosshairs or a reticle to measure angles accurately. In early versions of theodolites, the crosshairs were made of spider webs. The crosshair is the reticle or a network of fine lines inside the telescope that aids in precise aiming and measuring angles. Spider silk, due to its thinness and ability to form a fine thread, was commonly used for this purpose. The web strands were carefully mounted and adjusted within the telescope to intersect at the center, forming the crosshairs. Spider silk was also highly valued for its strength and lack of stretching, which ensured the stability and accuracy of the measurements: “Older Coast and Geodetic Survey (C&GS) triangulation manuals required that all field parties carry a spider's cocoon with them and included instructions for replacing broken micrometer wires with threads from the cocoon.”127 Over time, advances in technology and the availability of more durable materials led to theodolites with manufactured reticles, such as etched glass or metal wire.

Theodolites are used for precise land surveying and engineering tasks, while sextants are employed in celestial navigation for determining latitude at sea or in space. Sextants measure the angular distance between celestial objects and the horizon. The invention of the sextant is commonly attributed to two individuals—John Hadley, an English mathematician, and Thomas Godfrey, an American inventor—who independently designed and built similar instruments around 1730. The sextant revolutionized navigation by providing sailors with a highly accurate means of determining latitude at sea. Early versions of the sextant featured a solid frame, a graduated arc with a movable arm carrying a small telescope, and a mirror. The observer would align the instrument to measure the angle between a celestial body, usually the sun or a star, and the visible horizon. Over time, the sextant underwent refinements and improvements—double frames and vernier scales—leading to increased accuracy and accessibility: “Sextants designed for aircraft navigation are equipped with a pendulum or a gyroscope that serves as an artificial horizon, as well as a mechanism that allows the navigator to average several observations taken in rapid succession.”128 Metrology and optics intertwined.
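The reduction of a classic noon sight is a single line of arithmetic. A minimal sketch, assuming a corrected solar altitude at local noon and the sun's declination from an almanac, for an observer north of the sun (both readings invented for illustration):

def noon_latitude(altitude_deg, declination_deg):
    # Zenith distance = 90 degrees minus the observed altitude;
    # for an observer north of the sun, latitude = zenith distance + declination.
    return (90.0 - altitude_deg) + declination_deg

print(noon_latitude(altitude_deg=50.0, declination_deg=10.0))  # 50.0 degrees north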

Proposition VIII. Fixing light.

Proof.—The earliest known photograph was taken in 1822129 by Joseph Nicéphore Niépce, using a process called heliography—sun writing: “to fix the images of objects by the action of light” or “the means of fixing spontaneously by the action of light, the images seen in the ‘camera obscura’.”130 A camera obscura captures an image on a metal plate coated with a light-sensitive chemical.

Corollary.—In 1839, Louis Daguerre and William Henry Fox Talbot independently developed new processes that significantly reduced the exposure time required to create a photograph. Daguerre's process, called daguerreotype, used a polished silver-plated copper sheet as the medium: “I have seized the light. I have arrested its flight.”131 Talbot's process, known as calotype, used paper coated with silver iodide: “the inimitable beauty of the pictures of nature’s painting which the glass lens of the Camera throws upon the paper in its focus—fairy pictures, creations of a moment, and destined as rapidly to fade away … the idea occurred to me … how charming it would be if it were possible to cause these natural images to imprint themselves durably, and remain fixed upon the paper.”132 These new processes made photography more practical and accessible, and led to a surge of interest in the technology.

Note.—The first practical method of color photography was developed by Scottish physicist James Clerk Maxwell in 1855, a method known as additive color synthesis. He took three separate photographs of a tartan ribbon, each time with a different color filter over the lens—red, green, blue. When superimposed, these created a full-color image. “In 1861 he commissioned Thomas Sutton to take a demonstration photograph of a tartan ribbon which he showed projected onto a screen at King’s College London. This image shouldn’t have worked as well as it did, because the photographic chemicals did not respond to red light. Serendipitously, unseen ultraviolet light also reflected off the red portions of the ribbon and provided the third color.”133
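The additive method survives unchanged in digital imaging: three monochrome records, one per filter, superimposed as channels. A minimal sketch with NumPy, where random arrays stand in for the three photographic plates:

import numpy as np

h, w = 4, 4
red = np.random.rand(h, w)    # exposure through the red filter
green = np.random.rand(h, w)  # exposure through the green filter
blue = np.random.rand(h, w)   # exposure through the blue filter

# Additive synthesis: superimpose the three records as color channels.
color = np.stack([red, green, blue], axis=-1)  # shape (h, w, 3)
print(color.shape)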

Color photography did not become widespread until the mid-20th century with the invention of subtractive color film technologies like Kodachrome and Technicolor: “Technicolor never mentioned the name Kodachrome when referring to the technology used in its communications to the press and stockholders. Instead it used descriptions such as ‘an experiment in monopack’, ‘the Monopack procedure’ and even ‘Technicolor Monopack’ for the system used. But no matter how it was called, the technology was very probably the same …”134 Identical twins.

Proposition IX.Capture gains velocity.

Proof.—In the late 1800s, the development of celluloid film allowed for even faster and more efficient photography: “before becoming a synonym for cinema, celluloid was used to imitate expensive materials like ivory, tortoiseshell … gemstones.”135 This material advance led to the widespread use of photography for scientific, commercial, and artistic purposes. The introduction of the handheld camera in the 1890s made photography even more portable and accessible, and led to a boom in amateur photography.136

Corollary.—Firing time.

Proof.—Eadweard Muybridge was a seminal figure in the capture of motion. It began with a wager in 1872. Technology to settle a debate. Do all four hooves leave the ground when a horse gallops? To settle this, Muybridge engineered a sequence camera system using the trigger of a gun.137 He lined up a series of cameras along a racetrack, each triggered by a thread as the horse ran by. This method allowed him to capture sequential images showing detailed phases of motion. Proving that horses float: “It was as though he had grasped time itself, made it stand still, and then made it run again, over and over. Time was at his command as it had never been at anyone’s before.”138 Building on this invention, Muybridge later designed his zoopraxiscope, a precursor to the moving image projector. Shooting light back into space.

In another significant experiment, he extended his exploration of motion and perspective by photographing his subjects from multiple angles.139 He arranged cameras in a circle around the subject, pioneering a technique that creates a 360-degree view—bullet-time photography. A matrix of cameras.

Proposition X. Measuring space.

Proof.—Photogrammetry, the science of making measurements from photographs, has roots that trace back to the mid-19th century, shortly after the invention of photography. The German counterpart of the term photogrammetry was coined by the German architect Albrecht Meydenbauer in 1867, in the title of his article, Die Photometrographie.140 Meydenbauer initially used the technique to create architectural drawings of buildings that were difficult to sketch by hand. He developed a photomeasure table and used it to create precise measurements from the photographs he had taken.141

Note.—French military engineer and surveyor Aimé Laussedat began experimenting with the use of terrestrial photos for topographic purposes. He is often credited with being the first to use the term photogrammetry in the scientific literature and for his work in demonstrating the practical application of the method in topographic surveying: “In 1842, after two years of study at the École Polytechnique in Paris, lieutenant Laussedat was assigned to the corps of engineers where he spent his entire military career. He was first assigned to the fortifications of Paris where he participated in the construction of the fort of Romainville, then in Bayonne (French Pyrenees) for the recognition of the Franco-Spanish border and the study of the establishment of a stronghold in Cambo. He was then responsible for making topographic surveys, and since then, he began to think about the alternative approaches that he was to develop during several decades in order to make topographic surveys more accurate and efficient, particularly in mountainous areas.”142 By the late 19th century, Laussedat had developed the basis for aerial photogrammetry, although the lack of suitable flight technology at that time meant that his ideas wouldn't be fully realized until the 20th century.143

Aerial photogrammetry began to take shape during World War I, where it was used for reconnaissance and mapping. Photographs taken from balloons144 and later from airplanes were used to create topographic maps of enemy territory: “Photography is a marvelous discovery, a science that has attracted the greatest intellects, an art that excites the most astute minds—and one that can be practiced by any imbecile … In photography, like in all things, there are people who can see and others who cannot even look.”145 After the war, this technique was further refined and developed. The introduction of stereoscopy, where two photos taken from slightly different perspectives are combined to give a three-dimensional effect, allowed for more precise measurements and led to further advances in the field.

Corollary.—Disparity from above.

Note.—The concept of stereo depth cameras draws from the biological principle of binocular vision, evident in animals including humans, which use two eyes to perceive depth (III.xxix.proof.). This principle was first applied to photography in the mid-19th century by Sir Charles Wheatstone, who invented the stereoscope,146 a device for viewing a pair of separate images depicting left-eye and right-eye views of the same scene, creating an illusion of depth (III.postulate.i.). Wheatstone also invented the chronoscope, a device for measuring velocity: “Patented in 1874, the ballistic chronograph was the most accurate way to find the speed of bullets.”147

Stereo depth cameras—also known as stereo vision systems or stereo cameras—came into focus with computer vision and digital imaging in the late 20th century. These systems typically consist of two or more lenses that capture two slightly different views of an object or scene. Akin to the left and right eyes in binocular vision. These images are then processed by a computer to compare and analyze the differences between them, known as disparity. The baseline distance between the two cameras and the focal length of the lenses are known. As a result, the system can calculate the depth of each point and generate a depth map (III.xxv.).148
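The triangulation at the heart of it is one line: with baseline B, focal length f expressed in pixels, and disparity d between matched points, depth is Z = f*B/d. A minimal sketch, with an invented rig:

def stereo_depth(disparity_px, focal_px, baseline_m):
    # Nearby points shift more between the two views: depth falls as disparity grows.
    return focal_px * baseline_m / disparity_px

# Hypothetical rig: 700-pixel focal length, lenses 12 cm apart.
print(stereo_depth(disparity_px=35.0, focal_px=700.0, baseline_m=0.12))  # 2.4 meters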

Proposition XI. Structural deformation.

Proof.—Structured light sensors represented a significant development in the field of optical metrology. The concept of structured light scanning was introduced in the 1960s, but the technology became more widespread with advancements in computer technology in the 1980s. The foundational principle behind these sensors is the projection of a known pattern—dots, lines, grids—onto an object. As the pattern deforms over the object, it is captured by a camera with known position and orientation. By analyzing the deformation of the pattern, the sensor can calculate depth information and create a three-dimensional representation of the object. In its early stages, structured light technology was primarily utilized in industrial settings for quality control and inspection. However, with miniaturization of components, improvements in computational efficiency, and advances in photonics, the applications of structured light are multiplying: “All light has structure, but only recently has it been possible to control it in all its degrees of freedom and dimensions, fueling fundamental advances and applications alike … from traditional two-dimensional transverse fields towards four-dimensional spatiotemporal structured light and multidimensional quantum states, beyond orbital angular momentum towards control of all degrees of freedom, and beyond a linear toolkit to include nonlinear interactions, particularly for high-harmonic structured light.”149

Corollary.—Spatial lasers.

Note.—Light Detection and Ranging—LiDAR—is a remote sensing method that uses light in the form of a pulsed laser to measure distances. It was developed in the 1960s, shortly after the invention of the laser. The first LiDAR-like system, a laser radar, was built by Hughes Aircraft Company in 1961. The early use of LiDAR technology was primarily in the field of atmospheric research, where it was used to measure clouds and pollution levels. The first major applications came in the field of topographical mapping and have since extended to various areas including archaeology, forestry, construction, and autonomous vehicles.150 A significant breakthrough for LiDAR came in the early 2000s when it was used in NASA's Mars Exploration Rover mission to map the Martian terrain.151 Now it is being used to look below the planet's surface: “The Radar Imager for Mars' Subsurface Experiment, known as RIMFAX, uses radar waves to probe the ground under the rover … No one knows what lies beneath the surface of Mars. Now, we'll finally be able to see what's there.”152 Advancements in laser technology, GPS, and data processing have made LiDAR more accurate, powerful, and accessible, leading to widespread adoption.

Proposition XII. Time of flight.

Proof.—Time of Flight (ToF) sensors measure the time taken for light or other types of signals to travel to an object and back. Time of Flight has roots in mid-20th century radar technology. The principle of time-of-flight was initially applied in fields like geology and space exploration, where it was used to measure distances on a large scale. The advent of faster and more powerful microprocessors made ToF sensors practical for smaller scale and real-time applications. Strands of light darting to and fro. Veils woven from time. Segmenting space.153

Note.—This proposition is also evident, and is more clearly to be understood from II. vii.

Proposition XIII. Shoot to measure.

Proof.—In ToF, a light signal—often from a laser or an LED—is emitted towards an object. The sensor then measures the time it takes for the light to bounce back after hitting the object. Given that the speed of light is constant, the sensor can calculate the distance of the object by multiplying half of the measured time by the speed of light. The time is halved because the light travels to the object and back. The ToF sensor produces a depth map of the environment.154 Each pixel corresponds to a distance, which is particularly useful for autonomous vehicles. Navigation. Obstacle detection: “As long as the vehicle is moving at walking speed, the measuring range of a ToF camera is sufficient. For higher speeds, you may use lidar with a scanning ToF principle. ToF cameras are not certified safety devices, so they have to be used in combination with other sensor modalities.”155
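The halved product in the proof, as code. A minimal sketch, assuming the sensor reports a round-trip time in nanoseconds:

C = 299_792_458.0  # speed of light in a vacuum, meters per second

def tof_distance_m(round_trip_ns):
    # Light travels out and back, so halve the measured time.
    return C * (round_trip_ns * 1e-9) / 2.0

print(tof_distance_m(20.0))  # roughly 3.0 meters for a 20 ns round trip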

Note.—Throughout this history, the transition to digital technology marked a significant shift. The first digital camera was created by an engineer at Eastman Kodak, Steven Sasson, in 1975: “It only took 50 milliseconds to capture the image, but it took 23 seconds to record it to the tape. I’d pop the cassette tape out, hand it to my assistant, and he would put it in our playback unit. About 30 seconds later, up popped the 100-pixel-by-100-pixel black-and-white image.”156 This clunky device laid the foundation for a digital revolution. Digital technology made photographs easy to take, store, share, edit. It transformed the way we capture and interact with images. The widespread adoption of digital cameras has democratized photography. With the integration of cameras into other devices, capturing high-quality data is now available to most people. Unlike film photography, digital technology provides instant feedback. View. Delete. Share. With the internet, digital photos instantaneously cross vast distances (I.xxv.).

AXIOM I. All bodies are either in motion or at rest.

AXIOM II. Every body is moved sometimes more slowly, sometimes more quickly.

LEMMA I. Bodies are distinguished from one another in respect of motion and rest, quickness and slowness, and not in respect of substance.

Proof.—Event cameras.

LEMMA II. The advent of inhuman vision (IV.xxxvii.note.i.).

Proof.—Event cameras—also known as neuromorphic sensors or silicon retinas—represent a shift from traditional frame-based imaging to bio-inspired, asynchronous capture.157 The development of event cameras began in the early 1990s, driven by researchers seeking to mimic the highly efficient processing mechanisms found in biological vision systems: “An insect's compound eye is an engineering marvel: high resolution, wide field of view, and incredible sensitivity to motion, all in a compact package.”158

LEMMA III. Pixels operate independently.

Proof.—Conventional cameras capture images at fixed time intervals. Event cameras operate on a radically different principle. Each pixel in an event camera operates independently and responds to changes in the logarithmic intensity of light. When the change in light intensity at a pixel crosses a certain threshold, it generates an event. This event includes the pixel's coordinates, the polarity of the change—increase or decrease—and the precise timestamp at which this change happened.159
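A minimal sketch of a single pixel's logic, assuming a fixed log-intensity threshold; the class, names, and threshold value are illustrative, not any vendor's design:

import math, time

class EventPixel:
    def __init__(self, x, y, threshold=0.15):
        self.x, self.y = x, y
        self.threshold = threshold
        self.ref_log = None  # log intensity at the last event

    def observe(self, intensity):
        log_i = math.log(intensity)
        if self.ref_log is None:       # first observation sets the reference
            self.ref_log = log_i
            return None
        delta = log_i - self.ref_log
        if abs(delta) < self.threshold:
            return None                # below threshold: the pixel stays silent
        self.ref_log = log_i
        polarity = 1 if delta > 0 else -1
        return (self.x, self.y, polarity, time.monotonic_ns())

pixel = EventPixel(3, 7)
for intensity in [1.00, 1.05, 1.30, 0.90]:
    print(pixel.observe(intensity))  # None, None, brightening event, darkening event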

Event cameras increase the rate of capture. They allow for incredibly high temporal resolution and a wide dynamic range. Lower power consumption. Smaller data output. Because each pixel operates individually and asynchronously, it reacts almost instantly to changes. Capturing fast motion and adapting swiftly to changes in lighting conditions. Event cameras hold great potential in fields that require quick reaction—self-driving cars, robotics, augmented reality.

Corollary.—Capture is evolving.

Axiom I.—Capture is multiplying.

Axiom II.—Capture is accelerating.

It is estimated that over 6.5 billion people around the world have a phone equipped with at least one camera.160 On top of this, the number of standalone digital cameras produced annually runs into the tens of millions. Add in other types of cameras—those in vehicles, computers, security systems—and the number of cameras is immense. The proliferation of cameras has complex ethical dimensions. The production and operation of capture technologies are destructive to the environment. The existence of this infrastructure erodes privacy.

Definition.—Extractivism refers to an economic and socio-political model focused on the extraction of natural resources from the Earth. Originating during the colonial period, this extractive paradigm persists in many economies globally, underpinning key industries such as mining, forestry, fishing, and fossil fuels. While Extractivism has been a significant driver of economic growth and development, it has also raised substantial ethical and environmental concerns.161

Axiom III.—The historical roots of Extractivism lie in colonial practices, where colonial powers systematically extracted resources from colonized regions to fuel their own economic growth. This resulted in massive wealth accumulation for colonial powers, while colonized regions were forced into states of economic dependence and ecological imbalance. The remnants of this extractive legacy continue to shape global economic relations, with many post-colonial nations still heavily reliant on exporting raw materials to developed nations: “Extractivism ran rampant under colonialism because relating to the world as a frontier of conquest—rather than a home—fosters this particular brand of irresponsibility. The colonial mind nurtures the belief that there is always somewhere else to go to and exploit once the current site of extraction has been exhausted.”162

LEMMA IV. Extractivism has undoubtedly contributed to national and global wealth. The extraction and export of natural resources have financed growth, development, and job creation. Countries rich in natural resources, particularly minerals, oil, and gas, have seen substantial economic growth. However, this wealth is often unevenly distributed, contributing to social inequality.

Proof.—Reliance on resource extraction can lead to economic instability due to fluctuations in global commodity prices—a phenomenon known as the resource curse: “The term resource curse refers to a paradoxical situation in which a country underperforms economically, despite being home to valuable natural resources. A resource curse is generally caused by too much of the country's capital and labor force concentrated in just a few resource-dependent industries.”163

LEMMA V. The exploitation of natural resources often comes at a considerable human cost, including the displacement of local communities, poor labor conditions, and impacts on the health and wellbeing of those living near extractive operations. Extractivism consumes Earth's finite resources, raising questions about intergenerational equity—the fairness of depleting resources for future generations.

Proof.—The same as for the last Lemma.

LEMMA VI. The environmental implications of Extractivism are profound. Extractive industries are major contributors to environmental degradation, including deforestation, soil erosion, water pollution, and loss of biodiversity. The extraction and burning of fossil fuels have significant impacts on climate.

Proof.—The United Nations Environment Programme recognizes that “...the economic benefits come at a cost: Climate change: Resource extraction is responsible for half of the world's carbon emissions; Pollution: the extractives sector contributes to air, water and land pollution, toxic wastes and has caused significant water pollution. Oil production has also gravely impacted the environment in countries such as Nigeria; Biodiversity loss: 20% of oil and gas contract blocks overlap with biodiversity protected areas in Africa; Social issues: Tailing dam disasters have threatened people's lives and safety; mining was the third sector linked to the most murders, with over half of the attacks in three countries (Colombia, Mexico and the Philippines). Many human rights abuses are also linked to ASM with over 40,000 children working in cobalt mines in DRC.”164

LEMMA VII. Manufacturing these devices—sensors and servers—involves complex sources—minerals, metals, human labor—and extractive processes that carry significant environmental and social tolls.

Proof.—This proposition is evident from the definition of Extractivism prefixed to Lemma iv.

Note.—The production of camera sensors necessitates the extraction of various rare earth minerals and metals. These include elements like lanthanum, used in camera lenses for its high refractive index165—indium, a key component of the LCD displays found in digital cameras—and tantalum, used in capacitors in camera circuitry. These valuable materials are not distributed evenly across the Earth's crust. And their extraction is a dangerous and environmentally damaging process.

The mining of these materials frequently involves open-pit techniques that drastically alter landscapes, promote deforestation, and lead to substantial water and soil pollution. For example, the extraction of tantalum, primarily from coltan ore, is directly linked to deforestation and habitat destruction, particularly in conflict-prone areas like the Democratic Republic of Congo. This environmental damage, in turn, has far-reaching ecological and human impacts, threatening biodiversity and local communities' livelihoods: “Until relatively recently, companies such as Intel, HP and Apple haven't had to trace the source of the tantalum that goes into their electronic devices, but this all changed with the Dodd-Frank Reform in 2010. The Act states that all companies registered with the US Securities and Exchange Commission have to disclose whether they are receiving tantalum, tungsten, tin, and gold from Congo, and whether those minerals are connected to sites of conflict.”166

Lanthanum, a crucial element in camera lenses, is primarily extracted through mining rare earth ores, particularly bastnasite and monazite. These ores are often found mixed with other substances, requiring extensive and energy-intensive processes to separate and refine lanthanum. Extraction operations predominantly occur in China, which is home to the world's largest rare earth deposits. The open-pit mining process disrupts local ecosystems, causing soil erosion, habitat loss, and water contamination from mining waste. The refining process is also highly polluting, involving strong acids and producing hazardous waste, which often contains radioactive thorium: “Due to high technological growth, increasing demand, and changing government policies on import and export, mining of lanthanum will steeply increase in the coming years.”167

Indium, a key element in LCD displays of digital cameras, is most commonly found in association with zinc ores, and to a lesser extent, with lead, tin, and copper ores. The extraction of indium often occurs as a byproduct of zinc or lead mining. Once the host ore is mined, it undergoes a complex series of chemical reactions to isolate indium. One of the key environmental concerns with indium mining is the substantial amount of waste produced, as only a tiny fraction of the mined material is actually indium.168 The processing of these ores to isolate rare earth elements is energy-intensive and results in significant emissions of carbon dioxide, and the waste from these processes also contains harmful radioactive elements: “Knowledge of the anthropogenic and natural cycling of indium can lead to a greater understanding of the environmental impacts and human health effects of this metal.”169

Rare earth mining for elements—tantalum, lanthanum, indium—used in cameras often produces radioactive byproducts. This is because many rare earth elements are found in geological deposits alongside naturally occurring radioactive materials. When these ores are mined and processed, radioactive materials—uranium and thorium—are brought to the surface. The waste products, or tailings, from the extraction processes contain a mixture of these, as well as their decay products, which include various isotopes of radium and radon. The handling and disposal of radioactive byproducts pose significant environmental and health risks. In many places, the waste is stored in tailings ponds, which are large, engineered dam and dyke systems designed to hold the mining waste. These ponds can be susceptible to leaks, or worse, catastrophic failures, leading to widespread environmental contamination.170

The dust from these tailings can be windborne and contaminate surrounding areas. Radon gas can escape into the atmosphere. Exposure to these radioactive materials increases the risk of cancer and other health problems in human and non-human lifeforms. The long half-life of these radioactive elements means that they remain hazardous for thousands of years: “The typical ground water and soil samples around the tailings pond were sequenced. The dominant bacteria in soil and ground water are consistent. The dominant bacteria were Actinobateria, Proteobacteria and Acidobacteria at phylum level. This microbial community composition is similar to that reported in arid lands around the world.”171 What is our time of flight to an arid wasteland?

From the miners who extract raw materials to the factory workers assembling products, labor is extracted at every step. The conditions of this labor are fraught. Inadequate wages. Poor working conditions. Violations of workers' rights. Child labor.172 The factories where cameras are assembled are typically in developing countries. Workers are subjected to long hours in poor conditions with little job security and insufficient health and safety protections. They are increasingly subjected to electronic surveillance and even required to wear monitoring devices: “Electronic surveillance puts the body of the tracked person in a state of perpetual hypervigilance, which is particularly bad for health … Employees who know they are being monitored can become anxious, worn down, extremely tense, and angry. Monitoring causes a release of stress chemicals and keeps them flowing, which can aggravate heart problems. It can lead to mood disturbances, hyperventilation, and depression.”173 Extractive practices migrate.

Data is collected, transmitted, and stored. Collection—The sensor detects and measures a physical property of its environment. It converts this information into a digital signal. Data acquisition. Transmission—The sensor sends the data to a server through wired or wireless networks. Storage—The server receives the data and stores it for later use. Data is queued.174 The server may also perform processing or analysis on the data. Clusters of servers have large storage capacities and powerful processing abilities, allowing for the handling and analysis of large volumes of data.175 They provide a centralized location for data access and management. Multiple devices or users can access the data simultaneously.
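A minimal sketch of that three-stage path, with an in-memory queue standing in for the network and a list standing in for the server's archive; every name here is hypothetical:

from queue import Queue

network = Queue()  # transmission: readings in flight
archive = []       # storage: the server's centralized store

def collect(reading):
    # Collection: the sensor digitizes a measurement and sends it on.
    network.put(reading)

def serve():
    # Storage: the server drains the queue and persists each reading.
    while not network.empty():
        archive.append(network.get())

collect({"sensor": "cam-01", "lux": 312})
collect({"sensor": "cam-01", "lux": 298})
serve()
print(len(archive))  # 2 readings, centrally held, available to many readers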

The creation of servers begins with the extraction of essential materials such as gold, silver, copper, and aluminum, along with a range of rare earth elements. Like the materials used to make sensors, these are typically obtained through open-pit mining operations. Following extraction, these raw materials undergo refining processes to make them suitable for use in manufacturing. The refinement stage frequently involves the use of harsh chemicals that can pollute local water sources and produce a significant amount of waste. For instance, gold refining often entails the use of cyanide, creating toxic tailings: “Cyanide is a rapidly acting substance that is traditionally known as a poison. Hydrogen cyanide was first isolated from Prussian blue dye in 1786 and cyanide first extracted from almonds around 1800. Cyanide can exist as a gas, hydrogen cyanide, a salt, potassium cyanide … Cyanide is also found in manufacturing and industrial sources such as insecticides, photographic solutions, plastics manufacturing ... It has been used as a poison in mass homicides and suicides.”176 Manufacturing and assembly stages are the subsequent links in the chain of server production. Refined materials are shaped into components like circuit boards, processors, memory chips, and hard drives. These components are assembled into servers of voracious energy consumption.

Once in operation, servers consume vast amounts of electricity. They contribute to carbon emissions and climate change. As digital services proliferate, the demand for servers increases. Servers contain multitudes—hazardous heavy metals, flame retardants, poisons. And servers are disposable. The end of their lifecycle presents yet another challenge—improper handling and disposal perpetuate environmental harm: “Immortal waste.”177 The mining and manufacturing of servers involves exploitative labor practices. Local communities are treated as disposable resources—servitude underlies servers: “All things can be deadly to us, even the things made to serve us; as in nature walls can kill us, and stairs can kill us, if we do not walk circumspectly.”178 

POSTULATES

I. Capture is extraction.

II. Extraction of land and labor.

III. The machinery of Capture is in place.

IV. Capture has infrastructure.

V. Capture has methodologies.

VI. The three main methods of Capture—terrestrial, close range, aerial—reperform languages of invasion and control.

Proposition XIV. Capture extends our reach.

Proof.—Terrestrial photogrammetry, the practice of deriving measurements from ground-based photographs, employs several methods for capturing data: “We shall call the first pole of capture imperial or despotic.”179 One common method is to use a pole, sometimes referred to as a monopod. A camera is mounted to a tall, often extendable pole and is usually controlled remotely. The pole allows the operator to take photos from elevated viewpoints. This method is beneficial for capturing images of hard-to-reach areas, such as rooftops, or to provide a bird's-eye view of a scene. The pole method is often used in architectural and archaeological photogrammetry.180

Proposition XV. Capture is ritual extraction.

Proof.—The telescoping pole extends the range of human vision, cheaply and simply. The captor walks through the terrain or around an object holding the pole. The goal is to capture images from every possible angle, height, and proximity. This practice of total observation parallels the medical examination in Foucault's Discipline and Punish: “in all the mechanisms of discipline, the examination is highly ritualized. In it are combined the ceremony of power and the form of the experiment, the deployment of force and the establishment of truth. At the heart of the procedures of discipline, it manifests the subjection of those who are perceived as objects and the objectification of those who are subjected.”181 The captor parades around with a camera atop a pole, in a highly ritualized series of concentric circles, closing in on its object of desire.

Proposition XVI. Violence haunts terrestrial capture.

Proof.—What does this practice signal? It recalls historical invasions, starting with the crusades, where European men marched under tall crosses and banners. They paraded through foreign lands to violently assert Christian ideology and cultural dominance. Again, in the colonial era, European men entered the same spaces with tall poles waving flags to impose Western imperial power. In addition to the visual language of vertical procession, the same verbal language of salvation is used in all three cases. Photogrammetry—and other forms of 3D Reconstruction—is hailed as an ethical alternative to older conventions in the field of archeology;182 the global community now frowns on violently ripping physical artifacts out of their situated contexts. Instead, scholars make virtual models of artifacts to preserve and study, supposedly without disturbing their originary ecosystems. However, many archeologists who advocate for photogrammetry as a mechanism of cultural heritage preservation imply that indigenous populations cannot be trusted to protect historical artifacts. The digital databases that store these archeological models—Arc/k Project, CyArk, ARK—consistently reinforce Judeo-Christian exceptionalism—even in their names. Through research, sculpture, installation, and performance, artist Morehshin Allahyari explores the concept of Digital Colonialism, which she defines as “a framework for critically examining the tendency for information technologies to be deployed in ways that reproduce colonial power relations.”183

Corollary I.—Capture marches through space.

Corollary II.—Capture plants flags.

Proposition XVII. Firing at close range.

Proof.—Close-range capture involves taking images of a captive object from a close distance. It is often used for small, highly-detailed subjects. Devices of close-range capture include turntables, robotic arms, CMMs, and cages. These devices require extraction from context.

Corollary.—The mind is able to regard as present external bodies, by which the human body has once been affected, even though they be no longer in existence or present.

Proof.—A turntable is a revolving platform onto which the captive object is placed. The camera typically remains stationary, capturing images of the object as it rotates. This allows for stable and systematic capture. The turntable is a controlled environment. Consistent lighting. Featureless background. Conformity.184 This approach lends itself to consumerism—quality control, product photography, game asset generation.

Note.—Robots replace the captor. Equipped with a camera, a robotic arm can maneuver around an object, capturing images from a variety of angles and elevations. This technique is highly valuable when dealing with complex objects or when access to all angles is otherwise restricted: “reflective or almost black surfaces, complex structured surfaces, cavities. I was surprised: no problem for CultArm3D. I haven’t seen such an autonomous system before.”185 The ability to program the robotic arm offers the flexibility to adjust the process based on a captive’s shape, size, and intricacy. Path-planning. Policies. Adaptable capture.

Alberti's finitorium, automated. Coordinate Measuring Machines (CMMs) are vital tools in the field of metrology, the science of measurement. CMMs are used to measure the physical geometry of an object. A probe attached to the third moving axis of a CMM is used to touch the object in specific places. These machines can be manually controlled by a human operator or may be programmed and controlled by computers. The CMM probe captures coordinates that produce a point cloud, describing the surface of the captive object.186 Advanced CMMs can also use various scanning methods to gather data points rapidly, providing a high-density point cloud suitable for detailed examination and reverse engineering—even at micro and nano-scales.187 The precise measurements obtained from CMMs are essential in industries such as automotive, aerospace, and manufacturing, where adherence to stringent quality control standards is mandated.

Proposition XVIII. Cages issue from the panspectron188 (I.xv.note.).

Proof.—A captive is extracted from its environment and placed on a turntable. But conventional photogrammetry only produces accurate models from stationary, rigid objects. Any movement during capture disrupts its calculus. It is exceedingly difficult to achieve clean results from a moving subject like a person or animal with a single camera. It takes time to reposition a camera; live subjects holding still will inevitably shift. Each small movement creates a blur, spike, or hole in the resulting mesh. In order to achieve highly accurate models of living beings, engineers designed apparatuses—cages—in which an array of cameras—sometimes hundreds—are positioned at equidistant intervals. This arrangement allows for a comprehensive series of photos to be triggered simultaneously.189 

Note.—Cages descend from violence. The word cage is undeniably associated with control and confinement. In order to capture a living subject digitally, it must be placed in a cage: “This enclosed, segmented space, observed at every point, in which the individuals are inserted in a fixed place, in which the slightest movements are supervised, in which all events are recorded, in which an uninterrupted work of writing links the center and periphery, in which power is exercised without division, according to a continuous hierarchical figure, in which each individual is constantly located, examined and distributed among the living beings, the sick and the dead—all this constitutes a compact model of the disciplinary mechanism.”190 

Cages reduce the dynamic body. They insist that the continuous process of a moving thing or living being can be constrained to a limited volume and reduced to rigid and unchanging temporal states: “Through a kind of magic, images change what they reach (and claim to reproduce) into things, and presence into simulacra … copies conforming to a standard, parodies of presence.”191 The body becomes a fixed specimen, losing all sense of life. Like a death mask or shroud, it converts the living being to pure surface (I.definition.v.).

Cages mirror the economics of the disciplinary society. The logic of the prison-industrial complex.192 Cages are extremely expensive. The cost of initial investment limits access. Companies profit off of ownership and rental of the equipment as well as from the marketplace of assets that they generate. Likewise, prisons in the United States are increasingly privatized and profitable. Both provide incentives to place bodies in cages. Disturbingly—whether spatial or imagistic—incarceration is economically generative.

Proposition XIX. Aerial capture extends colonial cartography.

Proof.—In aerial photogrammetry, an aircraft, usually an unmanned aerial vehicle (UAV), or satellite is equipped with high-resolution cameras which are used to capture photographs and depth data of a landscape or structure. The drone can follow a pre-programmed flight path to ensure even coverage of the area. The captured images are then stitched together to create detailed 3D models or topographic maps. Aerial photogrammetry is extensively used in surveying, agriculture, construction, and environmental monitoring due to its ability to cover large areas quickly and efficiently, even in difficult terrain.193
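A minimal sketch of such a pre-programmed path: a lawnmower sweep over a rectangular site, with line spacing derived from the camera's ground footprint and the desired overlap (all parameters invented for illustration):

def lawnmower_waypoints(width_m, height_m, footprint_m, overlap=0.7):
    # Adjacent flight lines are spaced so each image overlaps its neighbor,
    # which photogrammetry needs in order to match features between frames.
    spacing = footprint_m * (1.0 - overlap)
    waypoints, y, heading_right = [], 0.0, True
    while y <= height_m:
        row = [(0.0, y), (width_m, y)]
        waypoints.extend(row if heading_right else row[::-1])
        heading_right = not heading_right
        y += spacing
    return waypoints

# A 100 m by 60 m site, 20 m image footprint, 70% sidelap.
print(lawnmower_waypoints(100.0, 60.0, 20.0))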

Proposition XX. Capture collapses space.

Proof.—In Geography and Vision, Denis Cosgrove explains how flight fuels the imagined possibility of total spatial control: “The aeroplane is the most visible of a great range of modern technologies that have progressively annihilated space by time over the course of the past century. The frictional effects of distance, the time and energy expended in moving across space, so painfully apparent on sea and land, are dramatically reduced in flight. The boundaries that disrupt terrestrial movement and fragment terrestrial space disappear in flight, so that space is reduced to a network of points, intersecting lines and altitudinal planes.”194 

Proposition XXI. Vertical perspective is a privileged view.

Proof.—The use of drones—unmanned aerial vehicles—signals both economic and political power, the exceptional ability to act as “a solar Eye, looking down like a god.”195 Use of the technology requires resources and permits to occupy regulated airspace. These permits are difficult to acquire except for persons of influence: those connected to a governing institution or part of a powerful corporation. These individuals have access to a privileged view: “the earth’s topography itself flattens out to a canvas upon which the imagination can inscribe grandiose projects at an imperial scale. From the air, the imposition of political authority over space can be readily appreciated.”196 Once again, space is colonized.

Note.—Aerial Capture is threatening. For those on the ground, the image of the drone above is one of flying terror. It is increasingly difficult to differentiate between benign drones and military drones. UAVs are commonly used for military surveillance and even payloads in warfare: “vertical sovereignty splits space into stacked horizontal layers, separating not only airspace from ground, but also splitting ground from underground, and airspace into various layers. Different strata of community are divided from each other on a y-axis, multiplying sites of conflict and violence.”197 While the cage is linked to the prison through metaphor, the drone is concrete—the same device appears in both photogrammetric capture and combat. It is impossible to visually distinguish one from the other. Is it collecting images for entertainment or archeology, or is it surveilling, poised to assassinate? From below, state and military power overshadow the photogrammetric drone. Regardless of intent, this technology is a symbol of the power to claim and commodify space and control targets.

Proposition XXII. Sensing from the sky.

Proof.—Remote sensing is a powerful technology used in diverse fields like military intelligence, meteorology, environmental science, and urban planning. Remote sensing has transformed the analysis of our world and other planets. With the ability to collect data about objects or areas from a distance—often using satellites or aircraft—it offers opportunities to monitor and measure phenomena on an unprecedented scale.198 However, the capacity to capture, analyze, and disseminate data remotely raises complex ethical questions about privacy, surveillance, ownership, and consent.

Proposition XXIII. The mind does not know itself, except in so far as it perceives the ideas of the modifications of the body.

Proof.—Remote sensing can be employed as a tool of surveillance by governments or corporations, potentially leading to the misuse of data for controlling or monitoring populations.199 In this regard, remote sensing shares ethical concerns with other surveillance technologies. The balance between security and privacy must be carefully negotiated. The ethical discourse must also consider issues of power dynamics and the potential for such technologies to be weaponized as tools of oppression.

Remote sensing poses questions concerning ownership and consent: “In fact, with digital datasets, a wider range of potential negative impacts can befall stakeholders, including the dehumanization of past peoples (and their modern descendants), the claim of open access of information when such data are rarely accessible to those outside of academia [or corporate powers], and a widening distance between local community knowledge and archaeological research.”200 Who owns the data captured by remote sensors? Advanced nations and well-funded corporations have more access to this technology and the data it produces, potentially reinforcing existing inequalities. And who has the right to give consent for data capture, especially when it concerns shared or public resources, or crosses international boundaries? The organizations deploying these technologies often transcend national borders, adding a layer of complexity to regulatory efforts.

Proposition XXIV. Field of view defines the frame.

Proof.—In optical systems, Field of View (FOV) is the angle of the viewable area that is captured by the camera’s lens. Typically measured in degrees, it is “the angular extent of the observable world that is seen at any given moment. Humans have an almost 180 degree forward-facing FOV, while some birds have a complete or nearly complete 360 degree FOV.”201 Field of view can be adjusted by changing the focal length of a lens or by using different lenses.
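
The relation between focal length and field of view is simple trigonometry. A minimal sketch in Python, assuming a rectilinear (pinhole) lens; the focal lengths and sensor width below are illustrative values, not taken from any source cited here:

```python
import math

def horizontal_fov(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Angular field of view, in degrees, from the pinhole-camera relation."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 36 mm-wide full-frame sensor behind two different lenses:
print(round(horizontal_fov(50, 36), 1))  # 39.6 -- a "normal" lens
print(round(horizontal_fov(24, 36), 1))  # 73.7 -- a wide-angle lens
```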

Proposition XXV. The boundaries of supervision.

Proof.—In surveillance and other applications, the field of view can be an important factor in determining the coverage area of the camera. A wider field of view will allow the camera to capture a larger area, while a narrower field of view will allow for higher resolution images of captives: “Consider the number of pixels you have with a given camera as a cargo net made of elastic. Each square in the net represents a pixel. If you widen the view of the camera, you effectively stretch the net. You have widened the pixels, but remember a pixel is only a single value, so now you’ve stretched that single value over more of your scene.”202 Pixel dilution refers to the loss of clarity and accuracy that results when a single pixel must represent a large amount of information. This problem commonly occurs when a scene is configured with the camera’s field of view expanded to the maximum width the lens allows.
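
The elastic-net metaphor can be put in arithmetic terms. A small sketch, with hypothetical sensor values, of how widening the field of view dilutes the pixels available per degree of scene:

```python
def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """How many pixel columns cover each degree of the observable scene."""
    return horizontal_pixels / fov_degrees

# The same hypothetical 1920-pixel-wide sensor, stretched over wider views:
print(round(pixels_per_degree(1920, 40), 1))  # 48.0 pixels per degree
print(round(pixels_per_degree(1920, 90), 1))  # 21.3 -- each value covers more scene
```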

Proposition XXVI. The infrastructure of surveillance.

Proof.—In The Age of Surveillance Capitalism, Shoshana Zuboff—professor emerita at Harvard Business School and expert on the social and economic impacts of technology—explores the emergence of Surveillance Capitalism, “the unilateral claiming of private human experience as free raw material for translation into behavioral data.”203 She explains how this data is “computed and packaged as prediction products and sold into behavioral futures markets—business customers with a commercial interest in knowing what we will do now, soon, and later.”204

Corollary.—Capture obliterates privacy.

Proof.—Zuboff argues that surveillance capitalism represents a fundamental shift in the way capitalism operates, as it is based on the exploitation of personal data rather than the production of goods or services. She asserts that the concentration of data and power in the hands of a few large companies has negative consequences for competition and innovation, and can lead to the further concentration of wealth and power in society. Zuboff also contends that surveillance capitalism poses significant threats to individual privacy, democracy, and the economy. She argues that the constant collection and analysis of personal data by companies can lead to the manipulation and control of individuals and can undermine their autonomy and agency.205

Proposition XXVII. Capture is invasive.

Proof.—The politics of reproductive rights are contentious and often multifaceted, encapsulating a myriad of ethical, moral, religious, and legal dimensions. At the intersection of these discourses lie issues of privacy and control over bodies. Ultrasound technology—internal capture—is increasingly employed in the anti-abortion movement. Though it holds significant medical value in pregnancy, it has also been weaponized to manipulate public opinion and policy.

Proposition XXVIII. Capturing bodies inside bodies.

Proof.—One tactic is the legislative mandate of ultrasound examinations prior to abortion. Numerous states in the U.S. have enacted laws that require physicians to perform an ultrasound and display the images to the woman before proceeding with an abortion. These laws are not rooted in medical necessity; rather, they aim to create emotional distress and reverse the decision to terminate a pregnancy: “The first step in this process is to perform an ultrasound to determine how far along you are. According to our state law, I must show you the ultrasound and you must hear the fetal heartbeat, if there is one. I know this might be uncomfortable, and I apologize … I understand your frustration. Although an ultrasound is often an important part of the process in abortion care, I don’t think women should have to view the ultrasound if they don’t want to. Unfortunately, this was a law that was passed last year and we can lose our license if we do not provide the ultrasound and have you view it. I can’t proceed with your visit until we have completed this part.”206 

Note.—The emergence of 3D ultrasound technology has further compounded the ethical complexities surrounding reproductive rights. Unlike traditional 2D ultrasounds, 3D technology generates realistic images that closely resemble a photograph, offering a lifelike representation of the fetus: “Most machines now have 3D/4D capability. Why? It is not improved screening for, or diagnosis of, fetal abnormalities in the first and second trimester of pregnancy and it is not automated image capture reducing time for examination or sonographers’ injury. It is consumer demand for a souvenir fetal ‘keepsake image’ which, generated by 3D, is much clearer and more realistic in appearance than with 2D.”207 This technological advancement has been co-opted by anti-abortion activists to humanize the fetus and evoke stronger emotional responses. Anti-abortion clinics, often referred to as crisis pregnancy centers, frequently utilize 3D ultrasounds to dissuade women from seeking abortions by enhancing the perceived humanity of the fetus. The American College of Obstetricians and Gynecologists warns that crisis pregnancy centers are frequently using “disturbing visuals or performing ultrasounds to emotionally manipulate and shame pregnant people under the guise of informing or diagnosing them.”208 Sonographers use four common methods to generate a 3D ultrasound. The first is freehand, where the probe is tilted to capture a series of ultrasound images while recording the orientation for each slice. The second method involves using a mechanical system where the probe’s internal linear tilt is controlled by a motor. Third, a matrix array transducer uses beam steering to sample points across a pyramid-shaped volume. The final method utilizes an endoprobe, which involves inserting the probe and then carefully removing it to generate the volume.209 In the case of anti-abortion legislation, this is forced penetration.

Proposition XXIX. Pro-life tactics employ ultrasound technology.

Proof.—The idea of a modification of the human body (II.xxvii.) does not involve an adequate knowledge of the said body, in other words, the use of ultrasound technology in the abortion debate raises profound ethical concerns, particularly pertaining to issues of privacy and control. The compulsory viewing of ultrasounds and the emotional manipulation associated with 3D imaging can be seen as an infringement upon a woman’s right to privacy. Furthermore, these tactics attempt to usurp control over a woman’s decision-making process, undermining her autonomy over her own body.

Corollary.—Weaponization makes us ultra sound.

Note.—The politicization of ultrasound technology also challenges the integrity of medical practice, as physicians are obligated to abide by these regulations, regardless of their medical necessity or the potential psychological distress inflicted upon the patient. The American College of Obstetricians and Gynecologists has taken a stance against these policies: “Absent a substantial public health justification, government should not interfere with individual patient-physician encounters … Laws that require physicians to give, or withhold, specific information when counseling patients, or that mandate which tests, procedures, treatment alternatives or medicines physicians can perform, prescribe, or administer are ill-advised. Examples of such problematic legislation include … laws that require medically unnecessary ultrasounds before abortion and force a patient to view the ultrasound image.”210

In Ultrasonic Dreams of Aclinical Renderings: Possible Bodies, Helen V. Pritchard, Jara Rocha, and Femke Snelting call for the emergence of counter-tactics: “Convoked from the dark inner space-times of the earth, the flesh, and the cosmos, particular aclinical renderings evidence that ‘real bodies’ do not exist before being separated, cut and isolated. Listen: there is a shaking surface, a cosmological inventory, hot breath in the ear. DIWO, recreational, abstract, referential and quantifying sonic practices are already profanating the image-life industrial continuum. Ultrasound is no longer (or never was) the exclusive realm of technocrats or medical experts.” There is a growing industry.

Proposition XXX. We can only have a very inadequate knowledge of the duration of our body.

Proof.—In an era of rapid technological advancement, medical imaging technologies such as ultrasounds have transformed the way parents experience pregnancy. There has been an explosion in adoption—an industry of 3D ultrasounds: “The global 3D ultrasound market size was valued at USD 2.9 billion in 2019 and is expected to grow at a compound annual growth rate (CAGR) of 6.6% from 2020 to 2027.”211 The surge in non-medical, 3D keepsake ultrasounds accounts for a significant portion of this growth. Keepsake clinics in shopping centers and strip malls all over the country aggressively advertise their services: “Expectant families, you can now see your unborn baby in live 4D motion! 3D Keepsake Imaging uses cutting edge technology to bring 3D and 4D ultrasound images of your unborn baby to life. You can actually see what your baby is going to look like before birth!”212 However, the U.S. Food and Drug Administration (FDA) has expressed concerns about this non-medical use.

Proposition XXXI. We can only have a very inadequate knowledge of the duration of particular things external to ourselves.

Proof.—Despite the growing popularity of keepsake 3D ultrasounds, the FDA strongly discourages their non-medical use for several reasons. Ultrasounds can heat tissues and produce small pockets of gas in body fluids or tissues—cavitation. The long-term effects of these conditions are still unknown, and therefore, the FDA recommends that ultrasound scans be performed only for medical purposes, under the guidance of trained healthcare providers. Moreover, non-medical 3D ultrasound sessions often last longer than medical ultrasounds to capture high-quality images. This extended exposure could potentially lead to unanticipated physical effects on the fetus: “the use of ultrasound solely for non-medical purposes such as obtaining fetal ‘keepsake’ videos has been discouraged. Keepsake images or videos are reasonable if they are produced during a medically-indicated exam, and if no additional exposure is required.”213

Corollary.—The non-medical nature of keepsake 3D ultrasounds introduces the potential for inaccuracies and distortions in the imaging. Unlike medical ultrasounds, which are conducted by trained healthcare professionals, keepsake ultrasounds may be performed by individuals with less rigorous training and understanding of fetal development. This lack of expertise could lead to misinterpretation of the images, potentially resulting in either undue alarm or misplaced reassurance. For instance, normal fetal formations or temporary conditions could be mistaken for anomalies, causing unnecessary anxiety for expectant parents: “You’re dealing with absolute incompetence. You’re dealing with no standard, no anything … they prey on the fears of pregnant women.”214 Conversely, actual issues might be overlooked, providing a false sense of security. This emotional roller-coaster not only heightens parental stress but could also lead to delayed medical intervention.

Proposition XXXII. Capture is distortion.

Proof.—The nature of ultrasound technology, coupled with extrinsic conditions, can result in 3D fetal reconstructions that are bumpy and distorted. Ultrasound imaging relies on sound waves, which can be influenced by numerous factors, such as the density and composition of the tissues they pass through, the position of the fetus, and the amount of amniotic fluid. External factors like the mother’s body type and movement can also affect image clarity. These variables can result in images that are anatomically imprecise. For instance, certain fetal structures might appear exaggerated or diminished, and the baby’s surface might seem uneven: “Most 3D/4D scans I’ve seen look like mashed potatoes.”215 These distortions, while typical of the technology, can misrepresent the actual appearance of the fetus, potentially causing concern for expectant parents and leading to misunderstandings about fetal health and development.

Proposition XXXIII. There is nothing positive in ideas, which causes them to be called false.

Proof.—The capture of data and its subsequent interpretation is a critical step in numerous technologies, with any distortion or inaccuracy having the potential to mislead our understanding of the real world. This problem begins at the level of sensor intrinsics, the inherent characteristics and limitations of the sensors themselves. One example is the intrinsic properties of a camera and its associated lens distortions. A camera, much like the human eye, operates by capturing light reflected from objects in the environment and projecting it onto a sensor to create an image. Yet, the process of capturing and projecting this light is not perfect; various factors can introduce distortion. These distortions can be categorized into two main types: radial and tangential distortions.

Proposition XXXIV. Capture expands and contracts.

Proof.—Radial distortion is primarily caused by the shape of the lens and results in images appearing either barrelled—bulging outwards—or pincushioned—contracting inwards: “A complex lens such as a retrofocus wide angle design tends to exhibit barrel distortion as the front group of elements acts as an aperture stop for the positive rear group. Telephoto lenses have a negative rear group and give rise to pincushion distortion. Distortion is difficult to correct for in zoom lenses, which usually go from barrel at the wide end to pincushion at the tele end.”216 Tangential distortion, though less common, occurs when the lens and the imaging plane are not parallel, causing the image to appear tilted.
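
These effects are commonly modeled with the Brown-Conrady distortion equations, the formulation used by calibration libraries such as OpenCV. A sketch with illustrative coefficients:

```python
def distort(x: float, y: float, k1: float, k2: float, p1: float, p2: float):
    """Brown-Conrady model: radial terms (k1, k2) and tangential terms (p1, p2)
    applied to normalized image coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# Negative k1 pulls edge points inward (barrel);
# positive k1 pushes them outward (pincushion).
print(distort(0.5, 0.5, -0.2, 0, 0, 0))  # approximately (0.45, 0.45)
print(distort(0.5, 0.5,  0.2, 0, 0, 0))  # approximately (0.55, 0.55)
```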

Proposition XXXV. Falsity consists in the privation of knowledge, which inadequate, fragmentary, or confused ideas involve.

Proof.—A significant component of these distortions comes from camera intrinsics. These are properties of the camera that affect image formation, including focal length, sensor aspect ratio, and principal point—where the optic axis intercepts the image plane. Changes in these intrinsic parameters can dramatically influence the resulting image and, if not properly accounted for, introduce significant error. For instance, a shorter focal length implies a wider field of view but can introduce significant distortion towards the edges of the image. This effect is commonly seen in fisheye lenses.
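
By convention, these intrinsic parameters are gathered into a single matrix K. A minimal sketch, with assumed pixel values, of how K maps a point in camera space onto the image plane:

```python
import numpy as np

def intrinsic_matrix(fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Pinhole intrinsics: focal lengths in pixels (fx, fy) and the
    principal point (cx, cy), where the optic axis meets the image plane."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

K = intrinsic_matrix(1000, 1000, 960, 540)   # assumed 1920x1080 sensor
X = np.array([0.2, -0.1, 2.0])               # a point 2 m in front of the lens
u, v, w = K @ X
print(u / w, v / w)                          # pixel coordinates (1060.0, 490.0)
```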

Note.—In the Flat-Earth conspiracy, fisheye lenses have gained attention as a controversial tool used in arguments and claims regarding the Earth's shape: “Curvature Debunked!”217 Some proponents of the Flat-Earth belief argue that fisheye lenses distort images in a way that makes the Earth appear curved, even though they assert the Earth is flat. They leverage the obvious fact that fisheye lenses distort geometry, and claim they are part of a larger conspiracy to deceive people about the true shape of the Earth. Of course, the scientific consensus overwhelmingly supports the Earth being an oblate spheroid, and fisheye lens distortion does not change this widely accepted understanding of our planet’s shape.

Proposition XXXVI. Inadequate and confused ideas follow by the same necessity, as adequate or clear and distinct ideas.

Proof.—Understanding the precise nature of sensing systems involves not only their intrinsic properties but also the extrinsic factors. Extrinsic parameters determine the camera’s pose—its position and orientation in the world coordinate system. They describe the spatial relationship between the camera and the object being photographed. As such, they play a crucial role in how the object’s three-dimensional reality is translated into the two-dimensional image plane of the camera. Where the camera is placed relative to the subject can drastically alter the captured image. The camera’s distance from the subject influences depth perception and detail. Photographing a subject from ground level will give a vastly different image than photographing it from a higher vantage point. Similarly, the orientation of the camera—how it is tilted or rotated—significantly affects the image’s perspective. A slight tilt can change the horizon line and create a skewed representation of the world. In cinematography, rolling the camera in this way is called “a Dutch angle, a Dutch tilt, a canted angle, or an oblique angle. When a character is sick or drugged or when a situation is ‘not quite right’ you may choose to tilt the camera left or right and create this non-level horizon. The imbalance will make the viewer feel how unstable the character or environment really is—think of a murder mystery aboard a boat in rough seas; things tilt this way and then that, everyone unsure, everyone on edge.”218 The relationships between cameras—in multi-camera systems—or between a camera and other sensors—in sensor fusion systems—are also considered extrinsic parameters.
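
Extrinsics and intrinsics compose into the full projection pipeline. A sketch, reusing the assumed intrinsics from the earlier example, of how a rotation R and translation t place a world point before K projects it:

```python
import numpy as np

def project(K: np.ndarray, R: np.ndarray, t: np.ndarray, X_world: np.ndarray):
    """Extrinsics (R, t) move a world point into camera coordinates;
    intrinsics K then project it onto the image plane."""
    X_cam = R @ X_world + t
    u, v, w = K @ X_cam
    return u / w, v / w

K = np.array([[1000, 0, 960], [0, 1000, 540], [0, 0, 1]], dtype=float)
R = np.eye(3)                        # camera axes aligned with the world
t = np.array([0.0, 0.0, 5.0])        # camera 5 m back along the optic axis
print(project(K, R, t, np.array([1.0, 0.0, 0.0])))  # (1160.0, 540.0)
```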

Proposition XXXVII. Total capture is impossible.

Proof.—We often assume that the data captured by advanced sensor technologies provide a seamless and comprehensive representation of the physical world. However, sensors can only process a limited amount of data—they provide a sparse sample of reality. Sensors, regardless of their complexity or precision, are fundamentally filters that reduce the complexity of the world into a manageable data set. They work by converting real-world phenomena—like light—into digital information that can be processed and interpreted. Given the infinite complexity and richness of our physical environment, it is impossible for any sensor to capture every single detail of the world with complete accuracy.

Proposition XXXVIII. Limited capacity.

Proof.—The inherent limitation of sensors is largely due to two main factors—the technical constraints of the sensor itself and the computational capacity to process the data. The sensor’s design dictates what it can measure and how accurately it can do so. For example, a camera’s resolution limits the amount of visual detail it can capture. Computational capacity refers to the ability to process and store data. Processing the sheer volume of data necessary to fully capture reality is beyond our reach: “In the realm of biosensing, for example, signals are acquired—often at high cost—with various sources of noise, including the stochastic behavior of molecular interactions, imperfections in fabrication, chemical and/or optical signal transduction mechanisms and human variation in terms of sample handling, as well as physiological differences and natural variations inherent in large test populations.”219 This sparse sampling can lead to skewed data sets that do not fully reflect the reality of the situation, impacting subsequent decision-making processes.

Corollary.—Reduction distorts decisions (III.lii.).

Proposition XXXIX. Capture is sparse sampling (II.xlviii.another note.).

Proof.—Condensed sensing—or sparse sampling—is essential for practical functioning, as our computational systems currently cannot handle the full complexity of the physical world.220 The key lies in understanding these limitations and using this knowledge to interpret sensor data more accurately and realistically. It’s critical to understand what the sensor system can and cannot capture, and how the choices made in the data collection process might impact the resulting data set.

Corollary.—Sensors condense values.

Proposition XL. Embedding bias from inception.

Proof.—Understanding bias in the context of sensor technology begins with acknowledging that sensors are not neutral. Despite their seemingly objective function—to capture and record data about the physical world—the design, deployment, and interpretation of sensor data are inherently influenced by human choices and sociocultural factors. The biases embedded in these processes may not always be conscious or intentional, but they can have significant impacts on how sensor data is understood and used.

Note I.—From its inception, camera technology has reflected and perpetuated certain biases. One of the most prominent examples is the historical bias towards lighter skin tones in color film processing: “Light skin became the chemical baseline for film technology, fulfilling the needs of its target dominant market. For example, developing color-film technology initially required what was called a Shirley card. When you sent off your film to get developed, lab technicians would use the image of a white woman with brown hair named Shirley as the measuring stick against which they calibrated the colors. Quality control meant ensuring that Shirley’s face looked good.”221 Color film was calibrated for lighter skin, resulting in underexposure and poor representation of darker skin. This bias was not merely an oversight, but a reflection of the racial prejudices prevalent in the societies where the film industry was primarily based. Similarly, the facial recognition technology in early digital cameras struggled to identify non-white faces, a pattern of unintentional racial bias built into the sensor technology. More advanced facial recognition compounds the problem: “Groundbreaking research conducted by Black scholars Joy Buolamwini, Deb Raji, and Timnit Gebru snapped our collective attention to the fact that yes, algorithms can be racist. Buolamwini and Gebru’s 2018 research concluded that some facial analysis algorithms misclassified Black women nearly 35 percent of the time, while nearly always getting it right for white men … many police departments use face recognition technology to identify suspects and make arrests. One false match can lead to a wrongful arrest, a lengthy detention, and even deadly police violence”222 (III.instances of reconstructions.iv.explanation.).

Note II.—Even the ubiquity and accessibility of camera technology reveal biases. While the proliferation of devices has democratized photography to a large extent, disparities still exist in terms of who has access to these tools and how they are used, reflecting broader societal inequalities.

Proposition XLI. Capture is unevenly distributed.

Proof.—There is a significant divide in the access to and use of technologies of capture. The ethical dilemma here is rooted in the uneven distribution of these technologies, which mirrors and often amplifies existing social and economic disparities. As these technologies become integral in sectors like healthcare, education, research, and even in our personal lives, access to them translates into access to opportunities and information.

Proposition XLII. Access is imbalanced.

Proof.—Parallel to this is the issue of data accessibility. With the advent of technologies that can capture and process vast amounts of data, participation in and control over this data has become a pressing ethical concern: “We define a ‘data divide’ as the gap between those who have access to—and feel they have agency and control over—data-driven technologies, and those who do not. It interacts with the ‘digital divide’ by manifesting in the way that data systems are designed, developed and shaped by those who are most likely to be represented or able to have access to them. This means the digital divide has a determining effect on who is able to be represented by and shape data-driven technologies. All this perpetuates and compounds social and health inequalities.”223 Capture technologies generate massive amounts of data that can feed into algorithms, decision-making processes, and create new insights. However, access to this valuable resource is often monopolized by corporations and government entities.

Proposition XLIII. Data concentrated.

Proof.—Data poverty—where certain individuals or communities lack access to data, or the skills to use and interpret it—reinforces social and economic inequalities. In 2021, a research grant was awarded by the Nuffield Foundation to address data poverty in the UK and redefine data access as both human right and public resource: “Digital inequalities in access, skills, and capabilities impact all aspects of citizens’ lives, be that work, education, leisure, health, or wellbeing. The team will undertake a ‘proof of concept’ study capitalizing on the well-established Minimum Income Standard (MIS) methodology to develop a Minimum Digital Living Standard (MDLS).”224 There is a need for policies that promote equitable access to capture technologies and the data they generate.

Note.—Who owns—controls—data determines who holds power, who makes decisions, and who can influence humanity. The current state of data ownership is complex. Every click, every transaction, every digital interaction generates data—collected by organizations. But who truly owns this data? Beyond privacy. Agency. Power. Control. Control over data confers—to corporations or governments—immense power. The ability to influence consumer or constituent behavior. Terraform political landscapes. Synthesize societal norms through tailored advertising—propaganda. Exploitation without explicit consent, or knowledge of the individuals to whom the data pertains.

Transparency acts as a countermeasure against unbridled power. Informed consent is key to maintaining individual agency in the digital age: “Enterprise data, by its very nature, flows through an organization, touching many business and technical processes and being stored / moved / transformed by many IT systems. It can end up in uncounted numbers of reports, online displays, data feeds, and information products.”225 In a time when breaches and misuse of personal data are common, transparency reassures users that their data is being handled responsibly. It allows users to make informed decisions about who has access to their data and under what conditions. The challenge lies in balancing transparency and complexity. Data handling processes are hard for the average user to comprehend. Algorithms are black boxes (III.). They give off black body radiation. Transparency also means clarity.

Individual ownership “is not feasible for most large or complex enterprises. For them, the concept of Data Ownership may not be useful. Instead, they take another approach: federated data-related accountabilities. In this approach, they first document data lineage (the path data has taken from its creation/acquisition to a specific system or report). Then they assign data-related accountabilities for a manageable number of segments to Data Stewards, SMEs, and/or Data Custodians (technical resources).”226

Blockchains are data lineages. They are the underpinnings of cryptocurrencies like Bitcoin and Ethereum. They promise enhanced agency, transparency, and trust in data governance. Decentralized. Immutable. A record of transactions that claims to cut out centralized power. Blockchains are transparent. Every transaction is publicly visible and unalterable. Renewed trust. Open books.

Proposition XLIV. Decentralized genesis.

Proof.—A blockchain is a decentralized ledger. It records transactions across distributed computers so that the trusted record cannot be retroactively altered. This decentralized nature is the first factor that distinguishes blockchain from conventional methods of data circulation and control. Unlike traditional centralized databases, where a single entity or authority has the power to control and modify data, blockchain operates on a network of nodes. This decentralization removes the need for a trusted third party or intermediary, leading to increased security and decreased potential for manipulation or fraud. However, it is not impenetrable: “the zero-state problem occurs when the accuracy of the data contained in the first, or ‘genesis block,’ of a blockchain is in question.”227
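
The claim that records cannot be retroactively altered rests on a simple mechanism: each block commits to the hash of the block before it. A toy sketch, illustrative only and not any production chain:

```python
import hashlib, json

def block_hash(block: dict) -> str:
    """Hash a block's full contents, including its link to the previous block."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

genesis = {"prev": None, "transactions": ["alice -> bob: 1"]}
block_1 = {"prev": block_hash(genesis), "transactions": ["bob -> carol: 1"]}

# Tampering with the genesis block breaks every link that follows.
assert block_1["prev"] == block_hash(genesis)
genesis["transactions"][0] = "alice -> bob: 100"
assert block_1["prev"] != block_hash(genesis)
```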

No—blockchains may enhance transparency and agency, but they are not immune to co-option and privatization. There has been a rise of private—permissioned blockchains228—more opaque decisions and concentrations of power. The transformative potential of blockchain could turn to tokenism (II.xlviii.)—entrenching existing disparities in data ownership and control. The assumption that blockchain transparency equates to full disclosure and fairness is misleading.229 Although all transactions are publicly visible, the identities behind those transactions remain anonymous. Diminished accountability. Alternate realities (IV.xxiii.).

Corollary I.—Permanent fingerprints.

Note.—Blockchains use a form of cryptography—hashing—to ensure security and immutability: “Hashing is a method of cryptography that converts any form of data into a unique string of text. Any piece of data can be hashed, no matter its size or type. In traditional hashing, regardless of the data’s size, type, or length, the hash that any data produces is always the same length. A hash is designed to act as a one-way function—you can put data into a hashing algorithm and get a unique string, but if you come upon a new hash, you cannot decipher the input data it represents. A unique piece of data will always produce the same hash.”230 A fingerprint. Once data is recorded on the blockchain, it is extremely difficult to change or erase. Data integrity. In traditional databases, data can be changed or deleted by those with access. Every transaction on a blockchain is visible to all participants in the network. A transparent system where the movement of data is tracked openly.
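
The fingerprint property is easy to demonstrate. A minimal sketch using SHA-256, one of the hash functions commonly used in blockchains:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """A one-way digest: fixed length for any input, infeasible to invert."""
    return hashlib.sha256(data).hexdigest()

print(fingerprint(b"captive"))   # always the same 64 hex characters
print(fingerprint(b"captive."))  # one changed byte, an entirely different hash
```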

IoT networks involve countless devices collecting, sharing, and acting on data. As IoT ecosystems expand, so does the complexity of managing massive amounts of data and ensuring security and privacy. With interconnectedness comes a risk of data breaches: “A fundamental problem with current IoT systems is their security architecture, with a centralized client-server model managed by a central authority which makes it susceptible to a single point of failure. Blockchain addresses this problem by decentralizing decision-making to a consensus-based shared network of devices.”231 By storing data across a network of nodes, rather than a central server, blockchain reduces the risk of data being compromised, boosting the overall security of IoT systems. Every transaction in a blockchain network is recorded on a public ledger, providing a verifiable audit trail. Trusted supply-chains.

Blockchain’s ability to provide a decentralized, transparent, and immutable ledger of transactions makes it an ideal candidate for applications requiring high levels of accountability and traceability. Glockchain—a speculative prototype—simulates blockchain technology applied to firearm regulation—opening up opaque social forces: “Glockchain is one of numerous prototypes created for a new venture called Ideo coLAB, which brings together partner companies (in this case NASDAQ, Citi Ventures, Fidelity and Liberty Mutual), designers from innovation and design firm Ideo, and fellows from a variety of backgrounds to find applications for emerging technologies like the blockchain.”232 

The initial prototype focused on a specific US population—law enforcement. Capture toward transparency and accountability. The prototype leverages the intrinsic qualities of blockchain to provide a public, immutable record of police firearm use. The prototype assumes that future firearms are equipped with sensors. Smart guns. A smart gun records data when moved, holstered, unholstered, and fired. The data includes the time, location, pose estimation, rounds fired—facial recognition and fingerprint. It is then automatically uploaded to a blockchain. Once on the blockchain, the data cannot be altered or deleted, ensuring that the record of the event remains intact and verifiable.
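
The record a smart gun might write can be sketched as a toy data structure; everything below is hypothetical, mirroring the fields described above rather than any real device or deployed ledger:

```python
import hashlib, json, time

def event_hash(event: dict) -> str:
    """Fingerprint a discharge record before it is appended to the ledger."""
    return hashlib.sha256(json.dumps(event, sort_keys=True).encode()).hexdigest()

# A hypothetical record, mirroring the fields described above.
event = {
    "timestamp": time.time(),
    "location": (34.05, -118.25),   # assumed latitude/longitude from the sensor
    "state": "unholstered",
    "rounds_fired": 0,
    "officer_id": "<biometric hash, redacted>",
}
# The hash, not the raw record, is what a chain like the one sketched
# earlier would link to; altering any field changes it completely.
print(event_hash(event))
```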

This use of blockchain technology has several potential benefits for law enforcement accountability. It automates reporting of when, where, and under what circumstances force was used. Current data is impoverished: “The FBI’s data is based on information voluntarily submitted by police departments around the country, and not all agencies participate or provide complete information each year.”233 Glockchain would provide an overall picture of the state of police firearm use. It would also clarify incidents where there is a dispute over the use of force—a record that can be referred to during investigations or judicial proceedings. Glockchain could improve public trust in law enforcement. Incidents of excessive or inappropriate use of force have led to widespread calls for greater transparency and accountability in policing. Glockchain answers these calls with a verifiable, unalterable record of when and how force is used. Inverting surveillance.

While transparency is one of the key strengths of blockchains, careful thought must be given to how data is collected, how it is made public, and how it can be accessed. The system must be tamper-proof. Software and hardware. Firearm sensors must be reliable: “After a short stint at MIT, Kloepfer dropped out to focus on Biofire, now a company with 40 employees and $30 million in venture capital funding. His team has designed and built hundreds of prototypes, trying to meld old-school gunsmithing with the latest in cutting edge electronics … In the main workshop space, there are thermal chambers that simulate different environmental conditions. That sort of testing is critical … to ensure that a gun loaded with electronics works in any sort of environment.”234 The sensors must be calibrated for accuracy, and robust and secure enough that they cannot be disabled to circumvent the system. Glockchain must also contend with the issues of acceptance and adoption. This extends beyond law enforcement officers to include legislators, courts, privacy advocates, and the public, all of whom will have a role in determining how this technology is implemented and used. Widespread adoption and consistent oversight may be challenging as “the NRA opposes any law prohibiting Americans from acquiring or possessing firearms that don’t possess ‘smart’ gun technology.”235

Smart guns, which began as an attempt to “meld a fingerprint sensor onto the grip of a Glock handgun,”236 are commercially available as of 2023. Manufacturers—Biofire, Lodestar, and SmartGunz—are producing firearms as devices of capture. Glockchain simulates how blockchain technology can be used to enhance transparency and accountability in law enforcement. If adopted, it could be expanded as a tactic for broader firearm regulation. The regular cadence of massacres—mass shootings—demands system-wide regulation.

Shortly after Glockchain was prototyped, it was trademarked by a lawyer in Los Angeles. A few years later the trademark expired.

Corollary II.—Expand and contract.

Proof.—A smart contract is an automatic, self-executing contract where the terms of the agreement are written into the code. It operates under a set of conditions, automatically executing transactions when those conditions are met. Smart contracts eliminate the need for an intermediary and ensure the terms are transparent and immutable. However, “there is no federal contract law in the United States; rather, the enforceability and interpretation of contracts is determined at the state level … any conclusions regarding smart contracts must be tempered by the reality that states may adopt different views.”237 Ethereum is the platform that popularized the use of smart contracts. It introduced a programming language and tools for developers to write their own. This feature has been utilized to create decentralized applications (DApps) on the Ethereum network, leading to various innovations, one of which includes Non-Fungible Tokens (NFTs): “Non-fungible tokens (NFTs) seem to be everywhere these days. From art and music to tacos and toilet paper, these digital assets are selling like 17th-century exotic Dutch tulips—some for millions of dollars.”238

Proposition XLV. Every idea of every body, or of every particular thing actually existing can be tokenized.

Proof.—Each NFT represents a unique item. NFTs are cryptographic tokens on the blockchain. Fungible exchanges—like physical money—are on a one-for-one basis. NFTs are non-fungible, meaning no two NFTs are the same. No identical twins.

Note.—Smart contracts play a crucial role in the creation and transaction of NFTs. When an NFT is created—minted—a smart contract is written to the Ethereum blockchain. This contract contains the rules and information for that specific NFT—who owns it and any royalties that need to be paid upon future sales. When an NFT is sold, another smart contract is used to automate the transaction. This contract guarantees that the NFT is transferred to the buyer, the payment is sent to the seller, and any specified royalties are paid to the original creator. Buyer—seller—owner. No middleman. The transparent, secure, automated minting and trading of digital assets.
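
The royalty logic such a contract encodes can be reduced to a toy model. The sketch below is illustrative Python, not Solidity and not any deployed contract; all names and rates are assumed:

```python
class ToyNFTContract:
    """A toy model of the ownership and royalty rules a minting contract encodes."""

    def __init__(self, creator: str, royalty_rate: float):
        self.creator = creator
        self.owner = creator
        self.royalty_rate = royalty_rate

    def sell(self, buyer: str, price: float) -> dict:
        """Transfer ownership; route payment and the creator's royalty automatically."""
        royalty = price * self.royalty_rate
        payout = {"to_seller": price - royalty, "to_creator": royalty}
        self.owner = buyer
        return payout

token = ToyNFTContract(creator="artist", royalty_rate=0.10)
print(token.sell(buyer="collector", price=100.0))
# {'to_seller': 90.0, 'to_creator': 10.0} -- no middleman in the loop
```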

Proposition XLVI. Blockchains track economies of images.

Proof.—The idea of using NFTs to represent ownership of a digital image is appealing as it provides solutions to the problems inherent to the digital medium: “‘When it comes to selling artworks, two things are important … Is the artwork real, and do I have the authority to sell it to you?’”239 NFTs provide proof of authenticity and ownership, solve issues related to copyright and reproduction, and offer new approaches to monetization.

Proposition XLVII. Magnetizing artists.

Proof.—In 2020, digital artists gained mainstream recognition, and with them NFT marketplaces such as OpenSea, Rarible, and SuperRare (III.vi.). The sales volume of NFTs saw an exponential increase. The turning point came in March 2021, when renowned auction house Christie’s sold an NFT—Everydays: The First 5000 Days—for a staggering $69,346,250.240 This sale legitimized NFTs in the traditional image market and ignited a global conversation around their potential.

Note.—These technologies are volatile. The values of cryptocurrencies—the foundational blocks of NFTs—fluctuate wildly. They are influenced by factors such as regulatory news, technological advancements, market sentiment, and macroeconomic trends. Subject to speculative behavior and hype cycles—inflated values and sudden market downturns—hacking and theft: “More than $100 million worth of NFTs were publicly reported as stolen between July 2021 and July 2022 … by September of last year, NFT transaction volume had collapsed by 97% from its peak in January 2022.”241 Unstable values.

Proposition XLVIII. Tokens represent control.

Proof.—NFTs introduce an additional layer of complexity to copyright issues, as they involve the digital representation of ownership, not the ownership of the actual intellectual property rights of an image.

Note.—Traditionally, when a photographer sells a physical print of a photograph, the photographer retains the copyright unless it is explicitly transferred. This means the photographer can reproduce the image, make derivative works, distribute copies, or display the image publicly.

The buyer owns a single physical instance of the image. NFTs exist in a digital space where reproduction is effortless. When an NFT image is sold, the buyer purchases proof of ownership on the blockchain. This does not mean they own the copyright—still only the instance. Unless explicitly stated, the creator retains intellectual property rights and can create more NFTs of the same image. There have been instances where artists found their images minted as NFTs by others without their consent. Unauthorized copies. “When Lois van Baarle, a Dutch artist, scoured the biggest NFT marketplace for her name late last year, she found more than 100 pieces of her art for sale. None of them had been put up by her … ‘It is much easier to make forgeries in the blockchain space than in the traditional art world. It’s as simple as right-click, save. It’s also harder to fight forgers. How do you sue the anonymous holder of a crypto wallet? In which jurisdiction?’”242 The global and decentralized nature of NFTs complicates copyright law, which is territorial—enforced jurisdiction by jurisdiction. Enforcing copyright claims in a decentralized environment, given the absence of a central authority, is an amorphous process. The United States Copyright Office has opened an investigation into NFTs and has organized three roundtables on their implications—copyright, patents, trademarks.243 As the NFT market continues to evolve, it demands a clearer legal framework.

Another Note.—Tokenism refers to the practice of making a symbolic or superficial effort to include individuals or groups from underrepresented backgrounds, without truly addressing or resolving the underlying issues of inequality or discrimination. It involves giving the appearance of inclusivity or diversity without making substantial or meaningful changes. Tokenism often occurs in contexts such as institutions, corporations, or media, where there is a desire to demonstrate diversity or inclusivity without actually challenging the existing power structures or addressing systemic inequalities: “The current notion that token integration will satisfy his people is an illusion.”244 It typically involves selecting a small number of individuals from marginalized groups and presenting them as representatives of the entire group. The community overall is not included. They remain outside the system. Tokens may not necessarily result in substantive changes or equal opportunities for others. Sparse sampling.

Tokenism is problematic because it creates the illusion of progress. It also places undue pressure on the individuals selected as tokens, as they may feel burdened with representing an entire group and face additional scrutiny or unrealistic expectations. To combat tokenism, it is important to focus on genuine inclusivity, equity, and systemic change. This involves providing equal opportunity and creating an environment that values diverse perspectives and contributions. How inclusive are emerging image economies? NFTs claim to democratize art, but “there are still systemic barriers to entry, evidenced in the makeup of the NFT community. Not only is there a screaming lack of disability-forward NFTs, but the gap is large even for those identifying as a minority race or gender, and of course, the gap is largest for those disabled artists intersectioned with race, ethnicity and gender. In attending NFT round-tables and information sessions, nearly all presenters were of the same race, sex and social status. Why do we still see the systemic residue in a market claiming to shake up the industry?”245

A recent exhibition—Sight Unseen—showcases the work of blind photographers from around the world. The images bridge the gap between distinct inner worlds and the shared sphere of the sighted. Representing a diverse range of visual impairments, some completely blind and others with varying degrees of visual perception, these artists utilize photography as a medium to navigate and interpret the surrounding space using other heightened senses: “vision is so strong that it masks other senses, other abilities. I feel light so strongly that it allows me to see the bones of my skeleton as pulsating energy.”246 The curator explains that “we, in the sighted world, are absolutely immersed in images, we’re in a torrent of images, an avalanche of images. And what I’ve learned in talking to all these blind photographers and thinking about this, is that the sighted world has essentially in some ways been blinded by all the images we’re exposed to. All we see are those images now. They displace the reality of the world that is right in front of us. Instead, we see representations of that world. And if you can’t see, you can’t be influenced by this, so you’re automatically operating in an extremely original way.”247 The value of other ways of knowing.

Many adaptive—or assistive—technologies are not currently integrated into blockchains, but they could be. Combined, Optical Character Recognition (OCR) and Text-to-Speech (TTS) convert images to audio descriptions. These adaptive tools would empower visually impaired users to engage with information and interact with blockchain content through voice-based interfaces. Adaptive technology enhances the creative process: “as I keep being told by blind people, if you’re going to be blind, this is the best time to be blind in history because there are so many assistive technologies that you can use.”248 These artists emphasize the importance of having control over their images and modifications, even if they have limited—or no—vision.249 The exhibition incorporates various accessibility features such as audio descriptions of artworks, biographical essays available in audio form, Braille, and tactile elements added to some photographs, allowing for multi-sensory engagement. This integration acknowledges the importance of accessibility and demonstrates how stakeholders can embrace innovative solutions. Blockchains and other emerging data management systems have high stakes when it comes to accessibility and—as users proliferate—sustainability.
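
A minimal sketch of the OCR-to-TTS pipeline described above, assuming the pytesseract (OCR) and pyttsx3 (TTS) packages, the underlying Tesseract engine, and a hypothetical image file; a production system would add scene description beyond plain character recognition:

```python
from PIL import Image
import pytesseract   # OCR: image -> text (requires the Tesseract engine)
import pyttsx3       # TTS: text -> synthesized speech

def speak_image(path: str) -> None:
    """Read aloud whatever text can be recognized in an image."""
    text = pytesseract.image_to_string(Image.open(path))
    engine = pyttsx3.init()
    engine.say(text.strip() or "No text detected in this image.")
    engine.runAndWait()

speak_image("artwork_label.png")   # hypothetical file
```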

Proposition XLIX. Tokens consume energy.

Proof.—As blockchains grow, their requirements for storage, bandwidth, and computational power increase. The environmental impact of NFTs has been a topic of significant debate, primarily due to their association with high energy consumption: until recently, they relied on a Proof of Work (PoW) consensus mechanism.

Corollary.—Proof has stakes.

Proof.—The PoW model exploits miners (II.lemma.vii.). The mining process involves solving complex mathematical puzzles. Each solution validates a transaction and adds it to the blockchain. Solutions are rare—rare minerals. Mining is computationally intensive. Extreme energy use—a massive carbon footprint: “We have to change our existing habits. So how can we build new platforms that are unsustainable?”250 As NFT transactions increased in frequency and number, so too did the environmental impact of the Ethereum network: “...an average NFT has a stunning environmental footprint of over 200 kilograms of planet-warming carbon, equivalent to driving 500 miles in a typical American gasoline-powered car … ‘You just click on a button or type a few words, and then suddenly you burn so much energy.’”251

Note.—In a bid to address these concerns, Ethereum has transitioned to a Proof of Stake (PoS) consensus mechanism through an upgrade known as Ethereum 2.0. PoS is seen as a lower impact alternative to PoW: “Ethereum switched on its proof-of-stake mechanism in 2022 because it is more secure, less energy-intensive, and better for implementing new scaling solutions compared to the previous proof-of-work architecture.”252 It dramatically reduces the computational power required to secure the network—each transaction is approximately 1/30,000th of its PoW equivalent. Instead of miners competing to solve complex problems, validators in a PoS system create new blocks based on tokens they hold—and are willing to stake as collateral.

PoS allows for greater scalability, as it can process transactions more quickly than PoW, enhancing the network’s capacity to handle increased demand. This is crucial in accommodating a larger user base and more diverse applications. PoS is seen as more democratic and inclusive. PoW favors miners with more powerful hardware—PoS gives anyone who can buy and stake tokens the opportunity to participate. This lower entry barrier may democratize participation, fostering wider user engagement and adoption: “Whereas under proof-of-work, the timing of blocks is determined by the mining difficulty, in proof-of-stake, the tempo is fixed. Time in proof-of-stake Ethereum is divided into slots (12 seconds) and epochs (32 slots). One validator is randomly selected to be a block proposer in every slot. This validator is responsible for creating a new block and sending it out to other nodes on the network. Also in every slot, a committee of validators is randomly chosen, whose votes are used to determine the validity of the block being proposed.”253 It is worth noting that while Ethereum’s transition to PoS has improved accessibility and decreased environmental impact, it has not completely eliminated these problems. The shift to PoS can help make the crypto-art market more sustainable, but it doesn’t address all environmental concerns. It is crucial to consider the e-waste generated by the hardware used for mining and the energy sources powering these activities.
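
The fixed tempo quoted above is simple arithmetic. A sketch of the slot and epoch bookkeeping:

```python
SECONDS_PER_SLOT = 12
SLOTS_PER_EPOCH = 32

def slot_and_epoch(seconds_since_genesis: int) -> tuple[int, int]:
    """Proof-of-stake time: 12-second slots grouped into 32-slot epochs."""
    slot = seconds_since_genesis // SECONDS_PER_SLOT
    return slot, slot // SLOTS_PER_EPOCH

print(slot_and_epoch(0))    # (0, 0) -- the first slot of the first epoch
print(slot_and_epoch(384))  # (32, 1) -- one epoch lasts 384 s, or 6.4 minutes
```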

Another Note.—There are alternative approaches to the capture of light.

The growing environmental crisis has brought to light the urgent need for sustainable and accessible energy solutions. GRID Alternatives captures light to protect the environment and elevate underserved communities. The cornerstone of GRID Alternatives’ work is the belief that clean, affordable energy should be a fundamental human right: “GRID was founded during the 2001 California energy crisis by Erica Mackie, P.E., and Tim Sears, P.E., two engineering professionals who were implementing large-scale renewable energy and energy efficiency projects for the private sector. The idea that drove them was simple: free, clean electricity from the sun should be available to everyone.”254 The organization’s approach is multi-faceted, combining solar installation, community engagement, and workforce development. No-cost solar for low-income households. Light into energy. Reduced emissions. Better jobs. Contributing to the broader fight against climate change. Energy independence. Liberation from toxic systems.

GRID Alternatives recognizes that the transition to clean energy offers more than just environmental benefits. It also presents substantial economic opportunities. The organization offers hands-on solar installation training programs, equipping individuals from marginalized backgrounds with the skills and certifications necessary to enter the renewable energy sector. This not only expands job opportunities for these individuals but also ensures that the clean energy transition benefits all members of society.

Beyond these direct interventions, GRID Alternatives also works to influence energy policy on a systemic level: “GRID is a leading voice in low-income solar policy and the nation’s largest nonprofit solar installer, serving families throughout California, Colorado, the Mid-Atlantic region, and tribal communities nationwide … In addition, GRID’s international program partners with communities in Nicaragua, Nepal and Mexico to address their energy access issues.”255 The organization advocates for inclusive renewable energy policies that prioritize the needs of low-income communities and communities of color. In doing so, GRID Alternatives helps to ensure that these communities are not left behind in the shift towards renewable energy.

GRID Alternatives is committed to “advancing an EQUITY agenda both within GRID Alternatives and in the energy industry and policy arenas by examining and addressing systemic inequities; seeking out and amplifying the voices of the communities we serve; and expanding access to solar energy and career and leadership opportunities.”256 Their work challenges traditional power dynamics and introduces a new paradigm where sustainable energy practices and social justice converge.

It remains to point out the advantages of a knowledge of this doctrine as bearing on conduct, and this may be easily gathered from what has been said.

1. Capture technologies, inclusive of a range of devices and methods from sensors to data management systems, have reshaped our perception of reality. They have brought about innovation—creativity—discovery—while simultaneously introducing multifaceted challenges and costs. Environmental Impact. Inaccuracy. Distortion. Privacy. Ownership. Accessibility.

2. Understanding the nuances of how these technologies function and process information is crucial to grasping the realities they create. This comprehension brings to light the innate limitations of such technologies—distortions resulting from sensor intrinsics and extrinsics—and the selective representation of reality due to limited data processing capabilities. There are opportunities for technological and ethical advancement; for instance, the application of blockchain technology introduces new layers of transparency, accountability, and user agency in the way data is stored, distributed, and exchanged.

3. Merely understanding these systems and processes is insufficient. Engaging with capture technologies requires us to be deliberate and intentional in what we choose to capture, how we capture it, and for what purpose. We must recognize that data is never purely objective, but always influenced and shaped by the technologies, algorithms, and human biases that guide its capture and interpretation.

4. Intentionality is paramount. Alignment with ethical, responsible, democratic principles. Vigilance. Continual questioning and evaluation of the processes. Hyper-awareness of the potential replication or amplification of existing societal biases and disparities. A future where capture is not limited as a tool of surveillance and control, but rather opens up as an instrument of equitable and responsible progress.

I thus bring the second part of my treatise to a close … and, considering the difficulty of the subject, with sufficient clearness. Part III examines how captured data is aligned and wielded. The warfare and magic of Reconstruction.











PART III.

ON THE NATURE AND OPTICS OF RECONSTRUCTION

PREFACE

Realism, photorealism, and hyperrealism are the aesthetics that propagated with the mass adoption of cameras. Realism is a philosophical concept with a long history—traversing traditions. Generally, realists hold that the world exists independently of our perceptions or interpretations (V.x.). The world is real. Space. Material. Life. Photorealism is the perfectionistic representation of these places, objects, and beings. In Camera Lucida, French theorist Roland Barthes grappled with the power to freeze moments in time. He raised questions about the photographer’s responsibility towards their subjects: “All those young photographers who are at work in the world, determined upon the capture of actuality, do not know that they are agents of Death.”257 Photography produces death—suspended, timeless portraits—memento mori. Reconstructions are death masks. Digital replicas. Computer vision integrates dead data into multidimensional models. If Capture produces dead data, Reconstruction produces the undead. If Capture is breaking to control, Reconstruction reassembles broken bits258—broken bodies—broken environments—broken systems. It is an attempt to put the broken pieces back exactly where they were. Reconstruction is hyperreal integration. Reconstructions appear as straightforward representations of reality, but they are reductions—distortions. In Simulacra and Simulation, Jean Baudrillard introduced hyperreality—a state in which a copy or simulation does not merely mimic the real but becomes more real than the real. Hyperreal. Data are simulacra—copies that substitute the real: “Of all the prostheses that mark the history of the body, the double is doubtless the oldest. But the double is precisely not a prosthesis: it is an imaginary figure, which, just like the soul, the shadow, the mirror image, haunts the subject like his other, which makes it so that the subject is simultaneously itself and never resembles itself again, which haunts the subject like a subtle and always averted death.”259 The undead—reanimated. A reconstruction is a double.

Reconstruction is also an era.

The Reconstruction era followed the American Civil War—from 1865 to 1877. It was the period of rebuilding after centuries of colonial practice—the end of legalized slavery. Reconstruction was integration. Integration of former slaves—emancipated beings. Integration of Southern states—back into the Union. The new model was almost a replica of the past—the ideologies that initiated and perpetuated slavery were also integrated. And a new weapon: “photography served many purposes during the war. It was used to promote abolition; as propaganda for both the northern and southern causes; as an important tool in the creation of Lincoln’s public persona and career; as well as for reconnaissance and tactical observation.”260 Photography proliferated in the Reconstruction era—shaping public opinion. Images captured the chaotic realities of a nation grappling with decimation and profound change. Images captured the devastation wrought by war, the violent opposition African Americans faced in the South, and the struggles and joys of people adapting to freedom. These powerful images were published and distributed through newspapers, magazines, and as individual prints, reaching a wide audience. They informed public opinion—stimulated discussions, debates, and policy changes. The American public experienced Reconstruction through the lens of a camera.261 Now these images are wormholes into the past—allowing future generations to glimpse the realities of post-Civil War America. Photography was an equalizing force. Recently emancipated beings alongside figures in power. They show that history is made by powerful organizations—but also by individuals fighting for their rights. Made by whoever controls images.

Images of the emancipated offered tangible evidence of the conditions and treatment they endured, countering the abstract political rhetoric that dominated the public sphere. Portraits of freedmen and freedwomen served as silent testimonials to the hardships they had endured and their determination to claim their rightful place in American society. Sojourner Truth strategically used photography as a medium for advocacy. Truth sold small photographs—inscribed with her slogan—I Sell the Shadow to Support the Substance.262 Her image circulated—a symbol of empowerment and self-possession—defying the dehumanizing narratives of African Americans prevalent in society:

As a women’s rights activist, Truth faced additional burdens that white women did not have, plus the challenge of combating a suffrage movement which did not want to be linked to anti-slavery causes, believing it might hurt their cause. Yet, Truth prevailed, traveling thousands of miles making powerful speeches against slavery, and for women’s suffrage (even though it was considered improper for a woman to speak publicly).263

Images exerted social influence. The dominant political and social classes used images as propaganda to express their visions of post-war society and to cast doubt on those who opposed them. Photographs also influenced public perception of Reconstruction policies. Critics produced images highlighting instances of inefficiency—or corruption—within Reconstruction government. Undermining their legitimacy. Photography was used to romanticize the Lost Cause. Framing the Confederacy as noble—heroic. A sleight of hand—the South fought not for the preservation of slavery but for states’ rights and southern honor. Photographs of supposedly contented slaves served this narrative, glossing over the atrocities of oppression.264 Reconstruction photographs were used to reassert the societal dominance of white people after the abolition of slavery. Southern sympathizers depicted the South as a once-great society devastated by the war. The pre-war era represented the Golden Age. Photographs were also used to promote negative stereotypes of African Americans—attempting to legitimize discriminatory laws and social norms. Black individuals were often depicted as unfit for freedom, as a threat to societal order, or incapable of self-governance, reinforcing racial stereotypes and justifying the introduction of Black Codes:

After the United States Civil War, state governments that had been part of the Confederacy tried to limit the voting rights of Black citizens and prevent contact between Black and white citizens in public places … These codes limited what jobs African Americans could hold, and their ability to leave a job once hired.265 

Images oversimplify—reduce—erase the complexities of history. They invent and perpetuate myths—potentials of deception. In Reconstructing Dixie, Tara McPherson argues that these events and images had a lasting impact on the identity of the United States. She describes the South as a three-dimensional postcard. Lenticular logic underlies the national imaginary—“a schema by which histories or images that are actually copresent get presented (structurally, ideologically) so that only one of the images can be seen at a time.”266 Alternate realities. The Civil War exists in the present. 3D postcards glorifying the Lost Cause and The Golden Age are still sold at plantation restoration novelty shops. Sanitized reconstructions within the southern tourism industry.267 We are——still——in a Reconstruction era.

Repeating the past—history echoes through technological reconstructions. Despite the end of colonial rule and the legal abolition of slavery, racial and social inequalities persist—integrated in software. Repeating the past—circulating data generated in previous moments in time. It is an economy based on “the resuscitability or the undead of information.”268 All of this dead data. Clouds—streams—pools—reservoirs—lakes—swamps of data: “A data swamp is a badly designed, inadequately documented, or poorly maintained data lake. These deficiencies compromise the ability to retrieve data, and users are unable to analyze and exploit the data efficiently. Even though the data exists, the data swamp cannot retrieve it without contextual metadata.”269 Organizational systems and rules are critical. And those codes are programmed with historical biases: “Thus the scientific archive, rather than point us to the future, is trapping us in the past, making us repeat the present over and over again.”270 Is it possible to untether the logic of capital exploitation from the logic of data? “Concealed behind the ‘echo chambers’ ... is an incredibly reductive identity politics, which posits class, race, and gender as ‘immutable’ categories.”271 

Wendy Chun argues that “software is a functional analog to ideology.”272 She asks, “what is the significance of following and implementing instructions? Perhaps the ‘automation’ of control and command is less a perversion of military transition and more an instantiation of it, one in which responsibility has been handed over to those (now machines) implementing commands. The relationship between masters and slaves is always ambiguous.”273 In the context of software, the master-slave relationship “usually refers to a system where one—the master—controls other copies, or processes.”274 For example, in a master-slave database replication setup, one database—the master—holds the primary data, and other databases—the slaves—replicate that data to ensure redundancy and fault tolerance. Chun points out that software terms like master and slave not only reflect historical oppressive systems, they also normalize hierarchical relationships in technology. Reconstruction relies on master-slave logic at every scale.275 

In 2014, Drupal—an open source web content management system—replaced “master/slave”276 with “primary/replica.”277 In 2018, Python—one of the three most used programming languages—followed suit: “‘slaves’ was changed to ‘workers’ or ‘helpers’ and ‘master process’ to ‘parent process.’”278 

classification

Title:        Avoid master/slave terminology

Type:        enhancement

Stage:        resolved

Components:        Documentation, Interpreter Core

Versions:        Python 3.8279

In 2020, GitHub replaced the default master branch with main.280 In 2022, the Open Source Hardware Association issued an official resolution to deprecate “MOSI—Master Out Slave In, MISO—Master In Slave Out, SS—Slave Select, MOMI—Master Out Master In, SOSI—Slave Out Slave In.”281 In 2023, the IEEE—The Institute of Electrical and Electronics Engineers—released “master-slave optional alternative terminology,” after calls from members: “For decades our industry has used the term ‘Master / Slave’ to denote a set of ICs or firmware/software where one device has control over one or many others. The use of this terminology has always made me and many others feel uneasy. While my ‘engineering brain’ has an idea of what this term defines, my ‘human brain’ relates this as a human condition, a human rights issue.”282 In Language Wants to Be Overlooked: Software and Ideology, Alexander Galloway argues that “to see code as subjectively performative or enunciative is to anthropomorphize it, to project it onto the rubric of psychology, rather than to understand it through its own logic of ‘calculation’ or ‘command.’”283 The words master/slave may eventually be outmoded. But we should not forget that the logic underneath is the same. These names are windows into functionality.

Many names are labels of function—but “code does not always nor automatically do what it says, but does so in a crafty manner.”284 There has always been some level of deception at play: “John Backus … contends that ‘programming in the early 1950s was a black art, a private arcane matter.’ These programmers formed a ‘priesthood guarding skills and mysteries far too complex for ordinary mortals.’ Opposing even the use of decimal numbers, these machine programmers were sometimes deliberate purveyors of their own fetishes or ‘snake oil’” (VII.). Many algorithms are still proprietary—black boxed—dark matter. Even open source algorithms remain opaque to most people: “Code is a medium in the full sense of the word. As a medium, it channels the ghost that we imagine runs the machine—that we see as we don’t see—when we gaze at our screen’s ghostly images.”285 Part III excavates the algorithms that underlie Reconstruction.

DEFINITIONS

I. A reconstruction is a stereotype.

II. An idealized twin.

III. By idealized, I mean contorted by fantasy and bias.

N.B. If we can be the adequate cause of any of these modifications, then what are the ethics of Reconstruction?

POSTULATES

I. Reconstructions are images—and other forms of data—extracted from different points in space and integrated to produce multidimensional models.

N.B. This postulate or axiom rests on Postulate i. and Lemmas v. and vii., which see after II. xiii.

II. Reconstruction is a powerful tool for mapping and analyzing the surface of the earth and everything on it. Even distant cosmic bodies—far-off phenomena. But reconstruction is not just a tool for scientific exploration and analysis. It changes the design and manufacture of objects and spaces. Reconstruction transduces world and experience.

Proposition I. Reconstruction evolves.

Proof.—Reconstruction—the process of volumetrically reproducing the shape and appearance of real-world objects—has a complex history that extends back centuries. Early precursors include the inventions of the Early Modern period (II.)—inventions that depicted three-dimensional realities on two-dimensional surfaces—approximating human sight—binocular vision. Later, photography allowed for a new way of producing dimensionality—stereoscopy. Stereoscopy was first proposed by Charles Wheatstone in 1838: “No question relating to vision has been so much debated as the cause of the single appearance of objects seen by both eyes.”286 A pair of images taken from slightly different angles combined to create an illusion of depth. This early method of reconstruction would eventually influence the development of stereo-vision algorithms in computer science.

The field of computer vision—digital reconstruction—began to take shape in the latter half of the 20th century. Lawrence Roberts outlined the possibility of extracting 3D geometric information from 2D images in Machine Perception of Three Dimensional Solids: “The first assumption is that the picture is a view of the real world recorded by a camera or comparable device and therefore that the image is a perspective transformation of a three-dimensional field. This transformation is a projection of each point in the viewing space, toward a focal point onto a plane. The transformation will be represented with a homogeneous, 4x4, transformation matrix, P, such that the points in the real world are transformed into points on the photograph … Thus, a transformation from the real world to a picture has been described and to go the other way one simply uses the inverse transformation, P⁻¹.”287 An inversion of an inversion.

In 1981, Hugh Christopher Longuet-Higgins contributed the essential matrix—or eight-point algorithm—which encodes the relative geometric relationship between two cameras.288 In 1996, Quan-Tuan Luong and Olivier Faugeras added the fundamental matrix, which relates corresponding points between two images: “we show that there is an interesting relationship between the Fundamental matrix and three-dimensional planes which induce homographies between the images and create unstabilities in the estimation procedures.”289 The invention of algorithms to register and integrate data, like Iterative Closest Point—ICP—was also a significant milestone.290291 In 2000, Richard Hartley and Andrew Zisserman published Multiple View Geometry in Computer Vision, normalizing and consolidating key concepts and opaque algorithms in the field.292 Then the development of SIFT—Scale-Invariant Feature Transform—by David Lowe provided a more robust method of matching points between images (III.iv.).293 Over the next two decades, algorithms proliferated (III.vii.). Parallel advances in hardware made reconstruction more detailed and accessible (II.). Machine learning is accelerating reconstruction (IV.). Quantum computing will ~ (V.).

Corollary.—Tools of reconstruction morph and multiply.

Proposition II. Reconstruction is capture through capture.

Proof.—Reconstruction integrates captured images—sensory data—to capture likeness.

Note.—Reconstruction relies on a complex pipeline of algorithms—data acquisition, preprocessing, calibration, matching, estimation, optimization, integration, post-processing, visualization. The first step is always capture—data acquisition—data is collected with a device (II.).

Proposition III. Data is sanitized.

Proof.—After data acquisition, follows preprocessing. This step might include noise reduction, feature extraction or other data cleaning procedures. Purification.

Note.—Data quality is a crucial factor in reconstruction: “Motion blur, sensor noise, jpeg artifacts, wrong depth of field are just some of the possible problems that are negatively affecting automated 3D reconstruction methods.”294 Several algorithms are commonly used to improve the quality of data and identify distinctive points or regions in a scene. These algorithms—SURF, ORB, AKAZE—originate from SIFT—Scale-Invariant Feature Transform. However, there is abundant “failure caused by SIFT-like algorithms.”295 What is averaged out? What is erased? Subtracted. What remains?
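A minimal sketch of this sanitization step, assuming OpenCV and an invented file name; noise is averaged away before any features are extracted:

import cv2

# Load a captured image (file name hypothetical) in grayscale.
image = cv2.imread("capture_0001.png", cv2.IMREAD_GRAYSCALE)

# Gaussian blur averages local neighborhoods, suppressing sensor noise.
blurred = cv2.GaussianBlur(image, (5, 5), 1.0)

# Non-local means denoising: a stronger pass, trading fine detail for smoothness.
cleaned = cv2.fastNlMeansDenoising(blurred, h=10)

What the pipeline keeps as signal is decided here, before reconstruction begins.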

Proposition IV. Rules sift local values.

Proof.—Scale-Invariant Feature Transform—SIFT—identifies and matches features—like corners and edges—in images.296 These features are represented using a descriptor, which is a vector of numerical values that describes the feature’s appearance and location. Computing the body and the archive (I.xiv.). The descriptor is produced from a set of filters that are applied to the image at multiple scales: “The SIFT algorithm uses Scale Space Theory to find interesting locations in images called keypoints. To do this, a training image is incrementally blurred using a Gaussian kernel to create a stack of blurred images called an octave. The difference between each image in an octave is then computed.”297 Once the features and descriptors have been extracted from the images, SIFT can also be used to match features between the images by comparing the descriptors. These correspondences are used to align images in later stages of reconstruction.
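A condensed sketch of this stage using OpenCV’s SIFT implementation. The image files are placeholders, and the 0.75 ratio is Lowe’s conventional threshold, not a fixed rule:

import cv2

img1 = cv2.imread("view_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view_b.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()

# Keypoints and 128-dimensional descriptors: appearance reduced to vectors.
kp1, desc1 = sift.detectAndCompute(img1, None)
kp2, desc2 = sift.detectAndCompute(img2, None)

# Compare descriptors across images; the ratio test discards ambiguous matches.
matches = cv2.BFMatcher().knnMatch(desc1, desc2, k=2)
good = [m for m, n in matches if m.distance < 0.75 * n.distance]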

Proposition V. Sifting features and sifting outliers.

Proof.—Quality of input data is often questionable. SIFT is sensitive to noise and image degradation—low-quality images make matching difficult. The descriptor used by SIFT to represent features is limited in size, which can make it difficult to capture the full range of information about a feature. This can be a problem for applications where a large number of features need to be matched. It can lead to a high number of false matches (II.xl.). Uncertainty.

Proposition VI. Reverse engineering identities.

Proof.—In order to generate a model, the positions and parameters of all cameras must be known. It is often necessary to retroactively calibrate cameras to find their intrinsic and extrinsic parameters. There are numerous camera calibration algorithms used in reconstruction. The most common are direct linear transformation, the collinearity equations, and two-point perspective transformation. 

Proposition VII. Transformation must be linear and direct.

Proof.—Direct Linear Transformation—DLT—is the most popular calibration algorithm and is used in a variety of reconstruction pipelines. The algorithm finds the camera’s rotation, position, and calibration—R, Xo, K. The result is a matrix that can project 3D world coordinates to 2D image coordinates, effectively encapsulating the camera’s perspective. It has the advantage of being relatively simple to implement. However, it is sensitive to outliers.

Proposition VIII. Normalize all outliers | Compute the homography.

Proof.—Direct Linear Transformation begins by normalizing outliers (III.xxxvi.) to improve numerical stability. The algorithm replaces each outlier with the average of its surrounding points. This effectively eliminates the outlier while keeping the rest of the data intact. The coordinates of a pixel equal a known coordinate in 3D space multiplied by an unknown projection matrix—x = PX. The unknowns form a homogeneous 3 x 4 matrix:

[ A B C D ]

[ E F G H ]

[ I  J  K L ]

The algorithm needs a minimum of 6 points to estimate 11 values. The algorithm uses point correspondences to form a system of linear equations—each point correspondence contributes two equations—one for x and one for y. It then solves the linear system using a method like Singular Value Decomposition (SVD)—this yields a transformation matrix that minimizes error. Matrices of vectors in space. Vectors connecting points in images to points on the surface of the virtual model. 
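A numerical sketch of the estimation, assuming six or more known 3D-to-2D correspondences; it follows the homogeneous formulation x = PX directly, with NumPy’s SVD performing the minimization:

import numpy as np

def dlt(world_pts, image_pts):
    # world_pts: (n, 3) known 3D coordinates; image_pts: (n, 2) pixels; n >= 6.
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        point = [X, Y, Z, 1.0]
        # Each correspondence contributes two equations, one for u and one for v.
        rows.append([0.0] * 4 + [-c for c in point] + [v * c for c in point])
        rows.append(point + [0.0] * 4 + [-u * c for c in point])
    A = np.array(rows)
    # The projection matrix is the singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)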

Proposition IX. Straightness detects outliers.

Proof.—The collinearity equations are a pair of calibration algorithms. They identify outliers by finding non-linear relationships between variables.

Note.—One of the most common ways to calculate the relationships between variables is the Pearson correlation coefficient. This coefficient measures the strength of the linear relationship between two variables. If the coefficient is close to 1, then the variables are highly linearly related, and if the coefficient is close to -1, then the variables are inversely related. If the coefficient is close to 0, then there is no linear relationship between the variables. Outliers can be identified by looking for data points that have a high Pearson correlation coefficient with one variable and a low correlation coefficient with another variable. Data points are categorized as outliers because they do not fit the linear relationship that is expected between the variables.
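A small illustration with invented numbers. Recomputing the coefficient with each point held out is one simple screen, not the only one, for the point that breaks the linear relationship:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.1, 2.0, 2.9, 4.2, 5.0, 12.0])  # the last point deviates

r_all = np.corrcoef(x, y)[0, 1]  # Pearson coefficient over the full sample

for i in range(len(x)):
    mask = np.arange(len(x)) != i
    r_i = np.corrcoef(x[mask], y[mask])[0, 1]
    # If removing a point strengthens the linear fit, that point is suspect.
    if r_i - r_all > 0.05:
        print(f"point {i}: r rises from {r_all:.3f} to {r_i:.3f} without it")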

Proposition X. Two-point perspective triangulates perfection.

Proof.—The two-point perspective transformation algorithm works by converting 3D point cloud data into a 2D image representation with the appearance of depth and perspective. It involves identifying two vanishing points in the scene, which represent the convergence points for parallel lines when projected onto the 2D image plane. Triangles projecting through space. By establishing the relative positions of these vanishing points based on the scene’s orientation and camera viewpoint, the algorithm projects each 3D point onto the 2D plane along their respective projection lines. This process effectively flattens the 3D scene onto the 2D plane while preserving an illusion of depth and distance. The two-point perspective transformation algorithm is employed to determine camera parameters. By analyzing vanishing points in the images, the algorithm can infer the camera’s focal length, principal point, and lens distortion—intrinsic parameters. It also estimates the relative orientation and position of the camera with respect to the scene—extrinsic parameters.

Proposition XI. After calibration—everything must be registered.

Proof.—Registration algorithms are used in reconstruction to determine correspondence. This correspondence is typically done by finding point matches between images from different camera positions. The most common algorithms are intensity-based—they compare intensity values of pixels. These methods are most effective when images have a high degree of overlap. However, these methods can be sensitive to noise and can sometimes produce false matches. Phase correlation, optical flow, mutual information, and feature detection are also used for registration.

Note.—Phase correlation compares the signals that represent images. Signals are typically represented as a series of pixel values—color and intensity at each location in the image. Phase correlation compares the signals by converting them into the frequency domain, using a mathematical operation known as the Fourier transform. This allows the signals to be represented as a series of sinusoidal waves, each with a unique frequency and amplitude. Once the signals are in the frequency domain, they can be compared using a mathematical operation known as the cross-correlation. This operation compares the amplitude and phase of the signals at each frequency, and it calculates a measure of their similarity. Based on this measure of similarity, the images can be aligned or registered with each other: “The accuracy of the algorithm was found to vary in proportion to σ/n(1 − δ)², where σ is the speckle size, n is the subimage size, and δ is the amount of decorrelation, with negligible systematic errors. For typical values the uncertainty in the displacement is approximately 0.05 pixels. Uncertainty is found to increase with increased displacement gradients.”298 Jean-Baptiste Joseph Fourier was a French mathematician and physicist known for his groundbreaking work in the field of heat conduction and the analysis of periodic functions. One of his most significant contributions was the discovery and development of the Fourier transform. Fourier’s key insight was that any periodic waveform can be represented as a sum of sinusoidal functions with specific amplitudes and frequencies. He introduced the concept of Fourier series to decompose periodic signals into their constituent sinusoidal components, providing a powerful mathematical tool for analyzing complex waveforms.299
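A compact sketch of the operation in NumPy, the textbook cross-power-spectrum form, under the assumption that the two images differ only by translation:

import numpy as np

def phase_correlation(a, b):
    # Convert both images into the frequency domain.
    A = np.fft.fft2(a)
    B = np.fft.fft2(b)
    # Cross-power spectrum: amplitudes normalized away, phase difference kept.
    R = A * np.conj(B)
    R /= np.abs(R) + 1e-12
    # The inverse transform peaks at the shift relating the two images.
    correlation = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    return dy, dx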

Proposition XII. Feature detection descends from physiognomy and phrenology (I.xiv.).300

Proof.—Feature detection refers to the process of identifying and extracting distinctive details from images. Features help align and match the images. These features can be points, lines, or other image structures that are distinctive and stable, and that can be detected and matched across different images. Once the features have been detected, a feature descriptor is used to represent each feature in a compact way. Reduced. The descriptor is a vector of numerical values that describes the appearance and location of the feature: “Feature descriptors serve as a kind of numerical ‘fingerprint’ that we can use to distinguish one feature from another by encoding interesting information into a string of numbers”301 (II.xl.). After features and descriptors have been extracted from the images by SIFT-like algorithms, they are paired using techniques such as nearest neighbor matching, with consensus methods like RANSAC pruning the false pairs.

Proposition XIII. Images are ransacked.

Definition.—By ransack, I mean to look through thoroughly—often in a rough way; to search and steal with force; to plunder.302

Proof.—Features extracted from multiple images are matched using algorithms like Random Sample Consensus—RANSAC—to create point and patch correspondences: “There is a famous tale in computer vision: Once, a graduate student asked the famous computer vision scientist Takeo Kanade: ‘What are the three most important problems in computer vision?’ Takeo replied: ‘Correspondence, correspondence, correspondence!’”303

Note.—RANSAC is an iterative algorithm that estimates the parameters of a model from a set of noisy or incomplete data. RANSAC randomly selects points to form an initial hypothesis for the transformation model, such as an affine or homography transformation. It then iteratively checks how many other points support the hypothesis and repeats this process for a specified number of iterations. The hypothesis with the highest number of inliers—points consistent with the model—is considered the optimal transformation. RANSAC uses randomness to divide data into outliers and inliers.304
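Consensus in miniature: a sketch using OpenCV’s RANSAC-backed homography estimation on synthetic correspondences with one planted outlier. All point values are invented:

import cv2
import numpy as np

rng = np.random.default_rng(7)
src = rng.uniform(0, 100, (20, 1, 2)).astype(np.float32)
dst = src + 5.0      # a pure translation relates the two views
dst[0] += 40.0       # one gross outlier, planted

# RANSAC hypothesizes from random samples, then counts the inliers.
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, ransacReprojThreshold=3.0)
print(f"{int(mask.sum())} of {len(mask)} correspondences accepted as inliers")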

Proposition XIV. Inliers and outliers.

Proof.—In data analysis, inliers are data points that are consistent with the overall pattern in the data—that are not significantly different from the rest. Similarly, in statistical analysis, an outlier is an anomaly—not representative of the overall pattern: “An outlier is an observation that lies an abnormal distance from other values in a random sample from a population. In a sense, this definition leaves it up to the analyst (or a consensus process) to decide what will be considered abnormal. Before abnormal observations can be singled out, it is necessary to characterize normal observations.”305 Inliers are typically considered to be representative of the overall data set and to be more reliable than outliers (III. xxxix.).

Note.—The word outlier joins out and lier—one who lies, or rests, apart from the main body. “The earliest known use of ‘outlier’ was in 17th century France, referring to ‘stone quarried and removed but left unused,’ … [and] ‘one who does not reside in the place of his office or duties;’ the sense of ‘anything detached from its main body’ is from 1849; the geological sense is from 1833.”306 The statistical sense has been in wide use since the 19th century, referring to data points that are significantly different from the rest of the sample. The word is also used more generally to refer to anything that is significantly different from the norm or that stands out in some way.

Proposition XV. Outliers have value.

Proof.—Outliers provide valuable insights into the data and the processes that generated it. Outliers can indicate the presence of underlying patterns or trends that may not be evident in the overall data set, and they can help to identify the sources of variability or error in the data: “Data editing with elimination of outliers that includes removal of high and low values from two samples, respectively, can have significant effects on the occurrence of type 1 error. This type of data editing could have profound effects in high volume research fields.”307 Not all outliers are bad data points. In some cases, outliers can be valid data points that just happen to be different from the rest of the data.

Corollary.—Outliers create friction.

Proof.—There are true outliers—but other outliers may not be representative of the overall data—the result of errors or biases.

Note.—In Reconstruction, outliers arise from noise, errors in the data collection process, and from the presence of structures or features that are not part of the scene being reconstructed.

Proposition XVI. Outliers are identified and separated—or excluded.

Proof.—Outliers can be identified using statistical techniques such as box plots or statistical tests and they can be treated in a variety of ways depending on the goals of the analysis and the nature of the outliers. For example, they may be excluded from the analysis or they may be included and treated as a separate category.
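The fence behind the box plot, sketched with NumPy. The 1.5 × IQR multiplier is convention, not law, and the sample is invented:

import numpy as np

values = np.array([2.1, 2.3, 2.2, 2.4, 2.2, 9.7, 2.3])

q1, q3 = np.percentile(values, [25, 75])
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr

# Separate, or exclude, whatever falls beyond the fences.
outliers = values[(values < low) | (values > high)]
inliers = values[(values >= low) & (values <= high)]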

Proposition XVII. Outliers must be handled.

Proof.—Outliers produce uncertainty.

Note.—There are several approaches to treating outliers in Reconstruction: “RANSAC and PROSAC perform similarly for the case of limited outliers.”308 PROSAC—Progressive Sample Consensus—uses progressive sampling based on data point scores to improve efficiency and accuracy, whereas RANSAC uses random sampling. RANSAC and PROSAC both use exclusion, weighting, and modeling techniques as part of their robust model estimation process. Exclusion—Outliers can be excluded from the reconstruction process, either manually or using automated algorithms. This can be useful when the outliers are clearly incorrect or when they are not representative of the overall scene. Weighting—Outliers can be assigned lower weights in the reconstruction process, which reduces their influence on the final result. This can be useful when the outliers are less reliable but still contain some information that is relevant to the reconstruction. Modeling—Outliers can be modeled explicitly as part of the reconstruction process, either as part of the scene or as separate objects. This can be useful when the outliers are part of the scene and cannot be excluded or when they contain valuable information about the scene. Strip or sedate.

Proposition XVIII. Average all deviance.

Proof.—Standard deviation is a measure of the dispersion or spread of a dataset, and it is often used in 3D reconstruction to evaluate the accuracy and reliability of the reconstruction. Standard deviation is calculated as the square root of the variance, which is the average squared deviation of the data points from the mean. Standard deviation is important in 3D reconstruction because it provides a way to measure the degree of uncertainty or error in the reconstruction. A high standard deviation indicates a greater degree of uncertainty or error, while a low standard deviation indicates a smaller degree of uncertainty or error. Confidence intervals quantify uncertainty.309

Note I.—In Reconstruction, standard deviation is calculated multiple times to assess accuracy—during calibration, feature matching, and point cloud evaluation. These calculations contribute to the overall evaluation of the reconstructed 3D scene’s accuracy and reliability. Standard deviation measures reconstruction error—the difference between the reconstructed model and the real-world geometry.
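A minimal sketch of the statistic itself, with simulated residuals standing in for a real pipeline’s observed and reprojected points:

import numpy as np

rng = np.random.default_rng(0)
observed = rng.uniform(0, 1000, (100, 2))              # pixel coordinates
reprojected = observed + rng.normal(0, 0.7, (100, 2))  # simulated residuals

errors = np.linalg.norm(observed - reprojected, axis=1)

# Dispersion of the residuals: the degree of uncertainty in the reconstruction.
print(f"reprojection error {errors.mean():.2f} ± {errors.std():.2f} px")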

Note II.—In metrology, Abbe error—also known as sine error—occurs when the measurement axis and the scale are not aligned. The principle, also known as Abbe’s principle of alignment, states that measurement accuracy is maximized when the measuring scale, or the measurement reference line, is aligned with the dimension being measured. The Abbe error can become significant in precision engineering where very high measurement accuracy is required. When the scale and measurement axis are misaligned, any movement in the system will create an angular error. This angular error, when multiplied by the distance between the scale and the measurement point, results in an error in the measurement reading, thus reducing the overall accuracy of the measurement. Named after Ernst Abbe, a German physicist and entrepreneur who contributed significantly to the field of optics and precision measurement, the principle emphasizes the importance of alignment in achieving accuracy: “If errors of parallax are to be avoided, the measuring system must be placed coaxially—in line with—the line in which displacement——giving length——is to be measured.”310
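The geometry reduces to a short calculation: offset multiplied by the tangent of the angular error. The values below are invented for scale:

import math

offset = 0.050               # metres between measurement axis and scale
angle = math.radians(0.01)   # angular error of the moving carriage

# First-order Abbe error: offset times the (small) angular deviation.
abbe_error = offset * math.tan(angle)
print(f"Abbe error ≈ {abbe_error * 1e6:.2f} µm")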

Proposition XIX. Structure synthesized from alignments.

Proof.—After matching, the next critical step is estimation of motion and structure. Algorithms like Perspective-n-Point—PnP—use the correspondences from the previous step to estimate camera motion and triangulate the structure of the object or scene.

Proposition XX. The surveillance of known points.

Proof.—The PnP problem arises frequently in 3D reconstruction tasks. If there is a set of points whose position in a global frame is known, and these points are observed in multiple images, camera poses for all images can be computed. The simplest case is P3P—Perspective-Three-Point—where exactly three correspondences are used. More complex algorithms, such as EPnP—Efficient PnP—handle more correspondences and potentially provide more robust results. The PnP problem is the foundation for aligning different camera views and performing accurate triangulation. Once the camera poses are known—solved by PnP—triangulation is applied to estimate the positions of the points that are visible in multiple camera views.

Note.—A general approach to solving the PnP problem involves two steps—control points are created from the known 3D points, and the corresponding image points are transformed with respect to these control points. This involves creating a system of equations that relate the known 3D points, the image points, and the camera’s extrinsic parameters—its pose. This system of equations is then solved, usually through a process like Direct Linear Transform—DLT—Levenberg-Marquardt optimization, or others, to find the pose of the camera that best fits the observed data. These solutions often work in tandem with RANSAC or other robust methods to handle outliers and noise in the data. The resulting pose estimate allows the 3D points to be reprojected into the camera’s image plane, facilitating the process of 3D reconstruction. Known points control the unknown.
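A sketch of the solve using OpenCV’s RANSAC-wrapped PnP. The control points, observations, and intrinsic matrix are all invented placeholders:

import cv2
import numpy as np

# Known 3D control points and their observed 2D projections (illustrative).
object_pts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0],
                       [1, 1, 0], [0, 0, 1], [1, 0, 1]], dtype=np.float64)
image_pts = np.array([[320, 240], [420, 242], [318, 140],
                      [421, 138], [300, 260], [400, 258]], dtype=np.float64)

K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

# The recovered pose: rotation and translation that best explain the observations.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(object_pts, image_pts, K, None)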

Proposition XXI. Triangulation points toward targets.

Proof.—Triangulation is a method for determining the 3D coordinates of a point from multiple 2D images taken from different viewpoints. It works by projecting rays from each camera center through the corresponding image points; the intersections of these rays determine the 3D coordinates of each point on the target. Triangulation can be used to reconstruct point clouds, surfaces, and models.
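A sketch of two-view triangulation with OpenCV. The projection matrices describe an identity camera and a second camera translated along x; the image points are chosen to be consistent with two points at depth 10:

import cv2
import numpy as np

P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                  # first camera
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # second, shifted

# Corresponding image points, one column per point.
pts1 = np.array([[0.10, 0.35], [0.20, 0.40]])
pts2 = np.array([[0.00, 0.25], [0.20, 0.40]])

# Rays intersected in homogeneous coordinates; divide out the last row.
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = X_h[:3] / X_h[3]   # columns near (1, 2, 10) and (3.5, 4, 10)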

Note.—Triangulation is a foundational military tactic—descended from the boar’s head—the flying wedge—the tactical body—panzerkeil.311 Triangulation and trilateration are used in navigation, target acquisition, and communication. These strategies determine locations using signals from multiple fixed points or landmarks, such as satellites, radio transmitters, or beacon towers: “Trilateration uses known distances to pinpoint precise locations. Triangulation uses known angles to calculate unknown distances.”312 Triangulation can determine the position of an enemy target by measuring the angles between the target and two fixed points, such as the surveilling unit’s own position and a nearby hilltop. Triangulation is also used for military communication. Multiple communication stations or relay points can establish a secure communication network that is less vulnerable to interference or interception.

Another note.—Gilles Deleuze and Félix Guattari’s concept of triangulation refers to a way of thinking about power relations and social hierarchies. According to Deleuze and Guattari, triangulation occurs when three elements or forces come into play. These three elements can be individuals, groups, organizations, or even ideas. Deleuze and Guattari argue that power is not just held by a single dominant force, but rather it is produced and maintained through complex and dynamic relationships between different elements. In a triangulated relationship, one element typically holds power over the other two. Two are subordinates. In Anti-Oedipus: Capitalism and Schizophrenia, they deconstruct the triangle of normalized family relations—Mother, Father, Child—“all of them divine forms that become complicated, or rather ‘desimplified,’ as they break through the simplistic terms and functions of the Oedipal triangle … Desiring-production forms a binary-linear system … Oedipus restrained is the figure of the daddy-mommy-me triangle, the familial constellation in person.”313

The triangle is an abstraction of power. It is the illusion of stability. In reality, “flows ooze, they traverse the triangle, breaking apart its vertices. The Oedipal wad does not absorb these flows, any more than it could seal off a jar of jam or plug a dike. Against the walls of the triangle, toward the outside, flows exert the irresistible pressure of lava or the invincible oozing of water.” Life moves. Triangles break apart—dynamic—fleeting.

Proposition XXII. Movement is captured in optical flow.

Proof.—Optical flow is a technique used to analyze the motion of objects in a sequence of images or videos. This technique is used to detect the movement of objects from one frame to the next, allowing for the analysis of their motion and trajectory.

Note.—Optical flow is typically performed using a mathematical operation known as the Lucas-Kanade method. This operation compares the pixel values of the images or videos in the sequence and calculates the movement of the pixels between frames. Optical flow is often integrated into the Structure from Motion—SfM—pipeline to improve the accuracy of the initial camera pose estimates. By observing the displacement of pixels in the optical flow field, it is possible to infer depth information in the scene. Dense optical flow techniques provide depth estimates—depth maps. These maps are used for spatial navigation in a variety of applications, including surveillance, robotics, and video games: “The decoding of flows and the deterritorialization of the socius thus constitutes the most characteristic and the most important tendency of capitalism.”314
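A sketch of sparse tracking with OpenCV’s pyramidal Lucas-Kanade implementation; the frame file names are placeholders:

import cv2

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Distinctive corners are chosen, then followed into the next frame.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=7)
p1, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None)

# Displacement vectors for the points that were successfully tracked.
flow = p1[status.flatten() == 1] - p0[status.flatten() == 1]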

Proposition XXIII. Optimization is a strategy of maximum alignment.

Proof.—Optimization reduces reprojection error. Algorithms used for this purpose include Bundle Adjustment, Graph Cuts, and Belief Propagation.

Note.—In Reconstruction, the core objective of optimization is to yield the most accurate dimensional model. The dimensional double of the observed images or corresponding data. Approaching this goal typically necessitates the iterative refinement of variables—camera parameters, point locations—to minimize the inconsistency between a reconstructed model and the actual observed data. 

Proposition XXIV. Avoid error—align with the light—integrate—assimilate—or vanish.

Proof.—Alignment algorithms like bundle adjustment enhance estimated point positions and camera calibration parameters—focal length, principal point, lens distortion coefficients, and camera pose. The resulting effect is a substantial increase in the accuracy of the reconstructed target.

Note.—Bundle Adjustment was first used in the field of photogrammetry in the 1950s and is now used extensively in computer vision.315 The term bundle refers to the bundle of light rays leaving each point on an object and arriving at each camera position, forming a network—or bundle—of rays. The term adjustment comes from the idea of modifying parameters to minimize the error between the predicted and actual image observations. The bundle of light rays is adjusted to better align with the source images. The process of bundle adjustment is initiated by generating a preliminary conjecture of the 3D points and camera parameters. This initial step is often carried out using structure-from-motion methodologies, in which a pair of images is employed to estimate the points by deciphering the relative motion between two cameras. The accuracy of the initial guess is not paramount, yet it requires a reasonable degree of correctness to ensure the optimization process is capable of converging towards an accurate solution. Following the production of the initial guess, the subsequent step involves the re-projection of each 3D point back into the image planes of the cameras, an action that relies on the conjectured camera parameters. The re-projection step effectively serves to calculate the error by comparing the re-projected point’s position with the originally observed point within the image. Divergence in positions is known as re-projection error—the central goal of bundle adjustment is to minimize this error.

The minimization of re-projection error across all cameras and points is a problem of non-linear least squares: “Bundle adjustment constitutes a large, nonlinear least-squares problem that is often solved as the last step of feature-based structure and motion estimation computer vision algorithms to obtain optimal estimates. Due to the very large number of parameters involved, a general purpose least-squares algorithm incurs high computational and memory storage costs when applied to bundle adjustment.”316 Compute time. Energy consumption. This optimization process adjusts the estimated points and camera parameters iteratively. Each iteration strives to reduce the overall re-projection error. The iterative cycle continues until the alteration in error—or the error itself—is reduced below a predetermined threshold or the maximum number of iterations is achieved. Upon the completion of this iterative process, the final result of the bundle adjustment is a refined set of points and camera parameters that produces the minimal re-projection error. The final, optimized output exhibits a significant improvement in accuracy compared to the initial guess.
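A skeletal sketch of the optimization with SciPy’s least_squares: one camera, a deliberately tiny four-parameter model (focal length and translation, rotation omitted), and synthetic observations. A real bundle adjustment packs all cameras and points into the parameter vector and exploits the sparsity of the problem:

import numpy as np
from scipy.optimize import least_squares

def residuals(params, points, observed):
    # A deliberately tiny parameterization: focal length plus translation.
    f, tx, ty, tz = params
    cam = points + np.array([tx, ty, tz])     # into the camera frame
    projected = f * cam[:, :2] / cam[:, 2:3]  # perspective projection
    return (projected - observed).ravel()     # stacked re-projection errors

rng = np.random.default_rng(1)
points = rng.uniform(-1, 1, (50, 3)) + [0, 0, 6]   # synthetic structure
true = np.array([820.0, 0.1, -0.05, 0.2])
observed = residuals(true, points, np.zeros((50, 2))).reshape(-1, 2)

# Iterative refinement from a rough initial guess toward minimal error.
result = least_squares(residuals, x0=[800.0, 0.0, 0.0, 0.0], args=(points, observed))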

Proposition XXV. Graph cuts produce disparity maps.

Proof.—Graph cuts are a powerful optimization approach used in Reconstruction to efficiently solve certain energy minimization problems. Graph cuts are commonly applied to the problem of depth map—and disparity map—estimation: “The modern variations on graph-based segmentation algorithms are primarily built using a small set of core algorithms—graph cuts, random walker, and shortest paths.”317 When two images are captured from different viewpoints, the corresponding points on these images will have a horizontal shift due to the relative displacement of the cameras. This horizontal shift is known as disparity. The larger the disparity value, the closer the object is to the cameras, and the smaller the disparity value, the farther the object is from the cameras. By computing a disparity map—which represents the disparity values for all pixels in an image pair—it is possible to estimate the depth information of an object or scene. This information can be used to reconstruct dimensional structures: “Dense disparity estimation in omnidirectional images has become a part of localization, navigation, and obstacle avoidance research.”318 The process starts by calculating costs for each possible disparity of each pixel. This includes a data cost, which quantifies the match between a given disparity and the observed images, and a smoothness cost, penalizing disparities that deviate from their neighboring pixels. Following cost calculation, the graph cut algorithm is applied, partitioning the graph into two disjoint sets. Each set represents a distinct disparity or a set of disparities. The goal is to identify a cut that minimizes the combined costs of the edges crossing the division. This cut effectively corresponds to a disparity assignment that minimizes the overall costs. This process yields a disparity map which forms the basis of a model.
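Current OpenCV ships block-matching and semi-global matchers rather than a graph-cut stereo solver, but the cost-then-partition pattern is the same. A sketch of disparity estimation, with placeholder file names for the rectified pair:

import cv2

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # rectified stereo pair
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # (file names hypothetical)

# Costs are computed per candidate disparity, then the best assignment is kept.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)   # brighter = nearer, darker = farther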

Note.—Social disparity maps are graphical representations that highlight the spatial distribution of various socio-economic factors across a given geographical area. These maps often showcase the direct and indirect outcomes of partitioning processes like redlining and gerrymandering. Redlining is a discriminatory practice that involves the systemic denial of essential services like banking and insurance to residents of specific neighborhoods—predominantly those occupied by racial and ethnic minorities. Gerrymandering is a political strategy that manipulates the boundaries of electoral districts to favor a particular political party or group. The impact of gerrymandering can be seen in the distribution of political power, where certain communities may be disproportionately underrepresented or overrepresented. These spatial partitioning processes have created stark socio-economic disparities. Social disparity maps visualize these socio-spatial divides, graphing variations in income, education, health, and other indicators across different regions. For instance, the Atlas of Inequality “uses aggregated anonymous location data from digital devices to estimate people’s incomes and where they spend their time … Economic inequality isn’t just limited to neighborhoods, it’s part of the places you visit every day … place inequality measures how similar the incomes of those visitors are. Each dot on the map is a place. More blue places see diverse visitors, while red places are more unequal.”319 Regions that were historically subjected to redlining and gerrymandering often display lower levels of income, poorer health outcomes, and limited access to quality education, reflecting the long-lasting effects of discriminatory practices.

Proposition XXVI. Belief propagation produces disparity maps.

Proof.—Belief Propagation—BP—is another prominent algorithm utilized to infer a dimensional model from multiple images.

Note.—The process starts with formulating the 3D reconstruction problem as a Markov random field—MRF—where each node corresponds to a pixel and its state signifies the depth—or disparity—of the pixel. The edges connecting the nodes encapsulate relationships between pixels, typically enforcing smoothness constraints. For each possible disparity, a data cost is computed, measuring the compatibility of that disparity with the observed images, alongside a smoothness cost that penalizes disparities that diverge from their neighbors. The algorithm commences its iterative message-passing phase, with each message representing the sender node’s belief about the receiver node’s disparity. These messages are updated considering costs and incoming messages from neighboring nodes. Afterward, beliefs—the estimated probabilities of each possible disparity—are updated at each node, which entails combining the data cost for each disparity with the corresponding incoming messages. This iterative cycle of message-passing and belief updating continues until the beliefs converge—or a predetermined maximum number of iterations is reached. The disparity—or depth—for each pixel maximizing its belief is selected, constructing a disparity map that can then be transformed into a model.320
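A one-dimensional min-sum sketch of the message-passing, written directly in NumPy. Real stereo BP runs on the full image grid and iterates until beliefs converge; a single scanline shows the mechanics:

import numpy as np

def scanline_bp(data_cost, smooth_weight=1.0):
    # data_cost: (n, d) cost of assigning each of d disparities to n pixels.
    n, d = data_cost.shape
    labels = np.arange(d)
    smooth = smooth_weight * np.abs(labels[:, None] - labels[None, :])

    right_msg = np.zeros((n, d))   # message pixel i passes to pixel i + 1
    left_msg = np.zeros((n, d))    # message pixel i passes to pixel i - 1
    for i in range(n - 1):         # rightward pass
        belief = data_cost[i] + (right_msg[i - 1] if i > 0 else 0.0)
        right_msg[i] = (belief[:, None] + smooth).min(axis=0)
    for i in range(n - 1, 0, -1):  # leftward pass
        belief = data_cost[i] + (left_msg[i + 1] if i < n - 1 else 0.0)
        left_msg[i] = (belief[:, None] + smooth).min(axis=0)

    # Belief at each pixel: its own data cost plus both incoming messages.
    incoming = np.vstack([np.zeros(d), right_msg[:-1]]) \
             + np.vstack([left_msg[1:], np.zeros(d)])
    return (data_cost + incoming).argmin(axis=1)   # disparity of maximal belief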

Another note.—Propaganda and disinformation refer to the intentional dissemination of false information to deceive or mislead:

‘Conspiracy beliefs,’ characterized as ‘attempts to explain the ultimate cause of an event … as a secret plot by a covert alliance of powerful individuals or organizations, rather than as an overt activity or natural occurrence,’ feature prominently in disinformation, misinformation, and inequality-driven mistrust. It can be difficult to persuasively present evidence to refute these types of ideas, especially because experts are often seen as part of the conspiracy and new pieces of contrary evidence can be rationalized into an existing narrative. For example, a Pew Research Center survey conducted in March 2020 found that 29% of Americans believed that SARS-CoV-2 was developed intentionally in a lab, with many pointing to Wuhan, China as the source; President Trump has given this theory institutional legitimacy, despite scientific consensus and the consensus of the U.S. Intelligence services that SARS-CoV-2 is not human-made. This strategic disinformation has served several agendas: casting doubt on evidence presented by Dr. Anthony Fauci, the Director of the National Institute of Allergy and Infectious Diseases and member of the White House Coronavirus Task Force, validating and reinforcing pre-existing xenophobia and racism, and redirecting attention away from the White House’s inadequate and delayed response to COVID-19.321

Disinformation produces social and economic disparities within and between communities. For instance, political disinformation can disrupt the fair allocation of political representation—creating disparities in political power that can be spatially mapped.

Proposition XXVII. Disparity maps are maps of relative value.

Proof.—Disparity maps are grayscale images in which the value of each pixel corresponds to the difference in position of that pixel’s projections onto two different images. The disparity is inversely proportional to depth, indicating that a higher disparity corresponds to a closer object, while a lower disparity points to a distant object. Generation of these maps involves matching points between two images, typically by comparing small windows of pixels around each point in the first image with corresponding windows in the second image. Disparity for each point in the first image is then calculated based on the position difference between that point and its match in the second image. Subsequently, given the disparity map and the specifics of the camera and stereo setup, a depth map can be created, providing estimated distances from the camera to different points in the scene. This depth map, often used in tandem with the original image data, enables the construction of a 3D model of the scene. Each point in this 3D model is obtained by back-projecting a pixel from the image onto a 3D ray in the scene, with the depth along the ray determined by the depth map. Disparity maps offer a key pathway to convert 2D image data into depth information, ultimately facilitating the reconstruction of the 3D geometry of a scene. Higher disparity is white—lower disparity is black.
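The inverse relationship in a few lines, with invented calibration values; depth follows from focal length, baseline, and disparity:

import numpy as np

focal_length = 800.0   # pixels (invented calibration)
baseline = 0.12        # metres between the two cameras

disparity = np.array([[32.0, 16.0],
                      [8.0,  4.0]])   # pixel shifts from stereo matching

# depth = f * B / d: the larger the disparity, the nearer the surface.
depth = focal_length * baseline / disparity   # 3 m, 6 m, 12 m, 24 m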

Note I.—The term stereotype—associated with generalized and oversimplified representations of individuals or groups—finds its root in the word stereo, a prefix of Greek origin meaning solid or three-dimensional volume.322 

Corollary I.—Etymologies map ideologies.

Proof.—This is proved from the last proposition in the same manner as III. xxii. is proved from III. xxi.

Corollary II.—Two axes of language in action.

Proof.—Stereo vision, a fundamental principle of human perception, refers to our ability to perceive depth and three-dimensional structure on the basis of visual information derived from two eyes. This biological capacity is integral to our survival and interaction with the world—allowing us to navigate our environment with precision. The term stereo developed new meanings with the advent of technology, notably in the printing industry in the late 18th century. A stereotype—derived from the terms stereo and type—was a solid plate of type metal, cast from a papier-mâché or plaster mold—called a flong. The flong was taken from the surface of a form of type used for printing. These plates were durable, enabling repeated impressions that were identical to the original type layout: “Stereotypes were not moving (or movable) type, but solid type.”323 The replication of immutable stereotypes gave birth to the metaphorical application of the term in social contexts.

Corollary III.—Flattened and unchanging impressions.

Proof.—The metaphorical use of stereotype arose in the early 20th century when American journalist Walter Lippmann started using it to describe the overly simplistic, preconceived, and standardized images or ideas held by one person or group about another. Stereotypes are “hallucinated—a complete fiction out of one external fact and a remembered superstition.”324 The connection was logical—like the unchanging impressions made by stereotype plates, societal stereotypes fail to capture the changing nuances, complexity, and individuality of the people they seek to represent. They flatten three-dimensional human beings into one-dimensional caricatures. Over time, stereotype has become a critical term in social sciences used to interrogate the simplified assumptions and prejudices that obstruct genuine understanding of others.

Note II.—Stereotype reveals an etymological evolution that is tied to our visual and cognitive understanding of the world. Stereotypes are functional. Efficiencies. But they decimate the infinite of the other: “The contemporary world, scientific, technical, and sensualist, sees itself without exit—that is, without God—not because everything there is permitted and, by the way of technology, possible, but because everything there is equal. The unknown is immediately made familiar [...] The enchantment of sites, hyperbole of metaphorical concepts, the artifice of art, exaltation of ceremonies, the magic of solemnities—everywhere is suspected and denounced a theatrical apparatus, a purely rhetorical transcendence, the game. Vanity of vanities: the echo of our own voices, taken for a response to the few prayers that still remain to us; everywhere we have fallen back upon our own feet, as after the ecstasies of a drug. Except the other whom, in all this boredom, we cannot let go.”325

Proposition XXVIII. Reconstructions densify.

Proof.—After optimizing a sparse set of points—denser reconstructions are generated—stereo matching—space carving—patch-based multi-view stereo algorithms.

Proposition XXIX. A dense reconstruction is still a stereotype.

Proof.—Stereo matching is predicated on the principles of binocular disparity, drawing parallels to the physiological functioning of human binocular vision. This process constitutes the identification of corresponding points, also known as stereo correspondences, between a pair of stereo images—a duo of images capturing the same scene from two distinct viewpoints, thus paralleling the different perspectives provided by each of the human eyes. The task of stereo matching revolves around determining correspondences between the two images. Mapping every pixel in one image to its corresponding pixel in the other image. The criticality of this operation stems from its centrality in triangulation, which paves the way for the inference of the 3D structure of the given scene. Semi-Global Matching—SGM—and Dynamic Programming are two additional algorithms often employed in stereo matching. Semi-Global Matching computes the minimal cost for all possible disparities for each pixel while maintaining consistency along several directions, and Dynamic Programming seeks to optimize a cost function globally, making it suitable for handling occlusion.
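
OpenCV ships an implementation of Semi-Global Matching; a brief usage sketch follows, assuming a rectified pair on disk. The file names and penalty parameters are illustrative, not canonical.

```python
import cv2
import numpy as np

# Load a rectified stereo pair (file names are placeholders).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-Global Matching: numDisparities must be divisible by 16.
sgm = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,
    blockSize=5,
    P1=8 * 5 * 5,    # penalty for small disparity changes between neighbors
    P2=32 * 5 * 5,   # larger penalty for big jumps, keeps surfaces coherent
)

# OpenCV returns disparity in fixed point, scaled by 16.
disparity = sgm.compute(left, right).astype(np.float32) / 16.0
```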

Note.—Variations in lighting conditions, shifts in perspective, occlusions, and other alterations in visual properties disrupt alignment and computation. Graph Cuts, Block Matching, and Space Carving make sense of change and interference.

Proposition XXX. Reconstruction carves space. (III. xxv.)

Proof.—Space carving—also known as volumetric reconstruction—generates three-dimensional models from two-dimensional images. The method also utilizes a collection of images taken from different viewpoints. But it is subtractive.

Note.—The technique operates by defining an initial volume, often encompassing the entire scene, and then iteratively carving—removing parts of the volume that are determined to be empty—until a final model is produced. Each carving iteration uses the information from one of the images. It begins with a solid volume that's large enough to contain the reconstructed object. For each image, determine which parts of the volume would project onto the background. These parts of the volume are carved away since they do not contain the object. This is repeated for every image. The remaining volume represents the reconstruction of the object. This method assumes that the object is completely surrounded by the cameras, and it needs to have the silhouette of the object in all images to perform the carving. This method does not require feature matching between images, making it suitable for reconstructing objects with textureless or repetitive surfaces: “The approach is designed to (1) capture photorealistic shapes that accurately model scene appearance from a wide range of viewpoints, and (2) account for the complex interactions between occlusion, parallax, shading, and their view-dependent effects on scene-appearance.”326
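
A schematic of the silhouette-carving loop in NumPy. It assumes voxel centers, per-view 3×4 projection matrices, binary silhouette masks, and points in front of every camera; it sketches the subtractive logic only, not the photo-consistency test of the cited approach.

```python
import numpy as np

def carve(voxels, cameras, silhouettes):
    """Keep only voxels whose projection lands inside every silhouette.

    voxels      : (N, 3) world-space voxel centers
    cameras     : list of (3, 4) projection matrices P = K [R | t]
    silhouettes : list of binary masks, one per camera
    """
    keep = np.ones(len(voxels), dtype=bool)
    homog = np.hstack([voxels, np.ones((len(voxels), 1))])  # (N, 4)
    for P, mask in zip(cameras, silhouettes):
        proj = homog @ P.T                                  # (N, 3)
        u = (proj[:, 0] / proj[:, 2]).round().astype(int)
        v = (proj[:, 1] / proj[:, 2]).round().astype(int)
        h, w = mask.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        # A voxel survives only if it projects onto the object in this view.
        on_object = np.zeros(len(voxels), dtype=bool)
        on_object[inside] = mask[v[inside], u[inside]] > 0
        keep &= on_object
    return voxels[keep]
```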

Proposition XXXI. Space is carved with lines of sight.

Proof.—Ray-casting works by tracing a ray from the eye of the viewer—or camera—through each pixel in the image. For each pixel, the ray is tested for intersection with any objects in the scene. If the ray intersects an object, the color of the pixel is set to the color of the object. If the ray does not intersect any objects, the color of the pixel is set to the background color. Ray casting is used in Space Carving to project the two-dimensional silhouettes of an object onto an initial volume. For each voxel in the volume, a ray is cast to each camera position, and the voxel is discarded if it projects outside the object's silhouette in any of the images.

Corollary.—Ray-casting projects a point of view.

Note.—In computer graphics and rendering, ray-casting creates visual perspective from a defined point of view. It shoots rays through space. It calculates anything it hits. Targets. Ray-casting is generally attributed to Arthur Appel, who first used the term in a 1968 paper titled Some Techniques for Shading Machine Renderings of Solids: “Generate a light ray to the midpoint of the segment (KM). If any surface lies between KM and the light source go on to the next segment. Determine the next surface behind KM that the light ray to KM pierces within its boundary. If no surface lies behind KM go on to the next segment. A point can cast only one shadow. Project K1, KM, and K2 onto the surface to obtain K1S, KMS, and K2S, the shadows of K1, KM, and K2. If KMS lies on a surface which is seen from its shadow side go on to the next segment. This particular shadow boundary is invisible. Also a shadow cannot fall within a shadow.”327 His work laid the foundation for much of the progress that followed in the field of computer graphics. The development of ray-casting—and subsequently ray-tracing—was a cumulative effort that evolved with the contributions of numerous researchers and developers over several decades. Ray-casting is a method that can be used at various points in the Reconstruction pipeline—transformations, bundle adjustment, rendering and visualization. In volume rendering, a model is usually represented as a point cloud, mesh, or grid of voxels. Ray-casting and ray-tracing generate images—from the models—viewable on a flat display. For each pixel in the output image, a ray is cast into the 3D model, and the pixel's color and opacity are computed based on the voxels that the ray passes through.
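
A toy ray-caster in the spirit described above, as a NumPy sketch: one ray per pixel from the origin, tested against a single sphere; hits take the object color, misses keep the background.

```python
import numpy as np

def ray_cast(width=64, height=64, center=(0.0, 0.0, -3.0), radius=1.0):
    """One ray per pixel through an image plane at z = -1; color the pixel
    if the ray hits the sphere, else leave the background at zero."""
    image = np.zeros((height, width))
    c = np.array(center)
    for j in range(height):
        for i in range(width):
            # Pixel center in normalized device coordinates, [-1, 1]
            x = 2 * (i + 0.5) / width - 1
            y = 1 - 2 * (j + 0.5) / height
            d = np.array([x, y, -1.0])
            d /= np.linalg.norm(d)
            # Ray-sphere intersection: solve |t*d - c|^2 = r^2 for t
            b = -2 * d.dot(c)
            disc = b * b - 4 * (c.dot(c) - radius**2)
            if disc >= 0 and (-b - np.sqrt(disc)) / 2 > 0:
                image[j, i] = 1.0  # hit: object color
    return image
```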

Proposition XXXII. Mutual information equates to shared truths.

Proof.—In Reconstruction, mutual information is a measure of the amount of information shared between two variables. It is often used to evaluate the quality of a Reconstruction. It provides a way to measure the degree to which the reconstructed model accurately represents perfection. 

Note.—Mutual information is co-dependent data. It is calculated as the difference between the entropy of the reconstructed model and the entropy of the model conditioned on the input data. The entropy of a variable is a measure of the amount of information contained in the variable, and the entropy of a model conditioned on the input data is a measure of the amount of information contained in the model that is not present in the input data. Entropy. By calculating the mutual information between the reconstructed model and the input data, it is possible to determine the degree to which the reconstructed model represents the real-world scene: “Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy.”328 High mutual information indicates that the reconstructed model accurately represents the scene, while low mutual information indicates that the reconstructed model is less accurate. Fixed. Reconstructions are death. Entropy is life: “An MIT physicist [Jeremy England] has proposed the provocative idea that life exists because the law of increasing entropy drives matter to acquire lifelike physical properties.”329 Mutual information is often used in conjunction with other evaluation metrics, such as reconstruction error and visual quality, to provide a more comprehensive assessment of stability.
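
A histogram estimate of mutual information between two images, I(A; B) = H(A) + H(B) − H(A, B), as a NumPy sketch; the bin count is arbitrary.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Estimate I(A; B) = H(A) + H(B) - H(A, B) from a joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()          # joint distribution
    px = pxy.sum(axis=1)               # marginal of A
    py = pxy.sum(axis=0)               # marginal of B

    def entropy(p):
        p = p[p > 0]                   # 0 log 0 is taken as 0
        return -np.sum(p * np.log2(p))

    return entropy(px) + entropy(py) - entropy(pxy.ravel())
```

An image compared against itself yields its own entropy; an unrelated reconstruction drives the measure toward zero.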

Proposition XXXIII. Benchmarks are shared sources.

Proof.—Benchmarks refer to standardized datasets and evaluation metrics that measure and compare the performance of different reconstruction algorithms. They play a critical role in the development and refinement of these algorithms by providing a consistent way to evaluate their effectiveness and accuracy. A benchmark dataset for Reconstruction typically includes data—a set of images or video sequences—along with associated parameters—and in some cases, ground-truth models for comparison. Some benchmark datasets include additional information—depth maps or semantic labels.

Note.—There are several popular benchmarks that are widely used for evaluating the performance of 3D reconstruction algorithms. The Middlebury dataset is a collection of images of a variety of scenes, including buildings, natural landscapes, and synthetic objects. It is widely used to evaluate the accuracy and reliability of 3D reconstruction algorithms, particularly for structure from motion and multiview stereo. The KITTI dataset is a collection of images and point clouds of real-world scenes captured by a vehicle-mounted camera and lidar system. It is widely used to evaluate the performance of 3D reconstruction algorithms for applications such as autonomous driving and robotics. The ShapeNet dataset is a collection of 3D models of a wide range of objects, including cars, chairs, and airplanes. It is widely used to evaluate the performance of 3D reconstruction algorithms for shape completion and reconstruction from partial data. The Tanks and Temples benchmark is a collection of images with corresponding Lidar point clouds—lasers as ground truth: “The benchmark sequences were acquired outside the lab, in realistic conditions. Ground-truth data was captured using an industrial laser scanner. The benchmark includes both outdoor scenes and indoor environments. High-resolution video sequences are provided as input, supporting the development of novel pipelines that take advantage of video input to increase reconstruction fidelity. We report the performance of many image-based 3D reconstruction pipelines on the new benchmark. The results point to exciting challenges and opportunities for future work.”330

Proposition XXXIV. Tanks and temples determine the ground truth.

Proof.—The Tanks and Temples benchmark was created by the Max Planck Institute for Intelligent Systems and the University of Stuttgart—it is widely used in research and development of reconstruction algorithms. The Tanks and Temples benchmark consists of thousands of images—fourteen scenes. The images were taken from a variety of viewpoints and under different lighting conditions, and they contain a range of features and structures that are challenging to reconstruct accurately.331 

Proposition XXXV. Tanks and temples normalize territories in the field.

Proof.—The Tanks and Temples benchmark serves as a critical unifying factor in the field of Reconstruction—a standardized dataset for algorithm evaluation. By offering a common ground of complex, real-world scan data, it ensures that different algorithms can be objectively compared—allowing for consistent evaluation of their strengths and weaknesses. This consistency fosters collaboration, encourages transparency, and accelerates progress in the field, as researchers and developers can clearly see how their methods stack up against others. There is a leaderboard and a supplement of comparison matrices. By benchmarking against the same reference data, researchers can also identify areas of improvement, which promotes innovation and pushes the boundaries of what's possible in Reconstruction. But the benchmark is made of colonial symbols and institutions.

The first scene is Family—Father, Mother, Child. The next scene is a monument. Then a horse. A lighthouse. An M60 Tank. A Panther Tank. A playground. A train. An auditorium. A ballroom. A courtroom. A museum. A palace. A temple.332 

Note.—Military power and religious institutions—symbolized by tanks and temples respectively—impose social and political normalization. Tanks are embodiments of state authority—military power. They may instill a sense of threat or security and order. They signify the assertion of governmental control moving through physical space. Conversely, temples are spiritual epicenters. Fixed physical landmarks—monuments to ideologies. The first temple described in the Bible was The Temple of Solomon. The description begins with precise measurements—each dimension—in cubits (V.): “The inner sanctuary was twenty cubits long, twenty wide and twenty high.”333 A cube. In front of the Temple, four staircases ascend to an elevated platform. An enormous pyre.334 The eternal flame. Priests tending the fire. A continuous flow of burned offerings. A vertical vortex of smoke. A tether. A giant cord connecting earth and sky | visible from miles away. A specimen pin (I.xv.).

Proposition XXXVI. Normalization promotes sameness.

Proof.—In computer graphics, normalization is the process of scaling a variable to lie within a specified range, typically between 0 and 1. This is often done to bring all the variables in a dataset to the same scale, which can be important for algorithms that rely on distances between data points.

Corollary.—Normalization aims for consistency.

Proof.—Normalization is used to scale models to consistent sizes for rendering. It is also used in computer graphics to ensure that colors are displayed accurately and consistently across different devices.

Note.—Normalization can be achieved through a variety of methods, including min-max normalization, z-score normalization, and scaling to unit length. Min-max normalization scales a variable to lie within a given range by subtracting the minimum value and dividing by the range. The resulting values will all lie between 0 and 1, with the minimum value becoming 0 and the maximum value becoming 1. Z-score normalization standardizes a variable by subtracting the mean and dividing by the standard deviation. This results in a variable with a mean of 0 and a standard deviation of 1. Scaling to unit length normalizes a variable by dividing each value by the Euclidean norm, or the square root of the sum of squares of the values. This results in a variable with a length of 1, with each value representing a proportion of the total length.
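
The three methods side by side, as a NumPy sketch on a toy vector.

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 10.0])

# Min-max: rescale into [0, 1]
minmax = (x - x.min()) / (x.max() - x.min())

# Z-score: zero mean, unit standard deviation
zscore = (x - x.mean()) / x.std()

# Unit length: divide by the Euclidean norm so the vector has length 1
unit = x / np.linalg.norm(x)

print(minmax)                        # [0.   0.25 0.5  1.  ]
print(zscore.mean(), zscore.std())   # ~0.0, 1.0
print(np.linalg.norm(unit))          # 1.0
```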

Proposition XXXVII. Integrating orientations and positions.

Proof.—Normalization also refers to adjustments in scale, orientation, and position of point clouds or meshes to a common reference frame. This process is often used to align multiple models or to bring them into a common coordinate system.

Proposition XXXVIII.Normals  |  straight and upright.

Proof.—In computer graphics, normals are vectors that are used to represent the orientation of a surface at a particular point. They are typically perpendicular to the surface and are used to calculate the way that light reflects—an important factor in realistic graphics. Normals calculate the angle of incidence between the surface and a light source. This determines what is reflected and how it is distributed over the geometry. There are two types of normals—surface normals and vertex normals. Surface normals are defined at each point on the surface of a model and are used to calculate the way that light reflects off of the surface. Vertex normals are defined at each vertex of a model and are used to smooth the surface of the model by averaging the surface normals of the surrounding vertices. Normals are an important factor in creating the appearance of depth, curvature, and surface roughness.335
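
Both kinds of normals, as a NumPy sketch: face normals from the cross product of two edge vectors, vertex normals by averaging the normals of the surrounding faces. It assumes a float vertex array and a triangle mesh with no isolated or degenerate faces.

```python
import numpy as np

def vertex_normals(vertices, faces):
    """Surface normals per face via the cross product of two edges,
    then vertex normals by averaging the normals of adjacent faces."""
    tri = vertices[faces]                                  # (F, 3, 3)
    face_n = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
    face_n /= np.linalg.norm(face_n, axis=1, keepdims=True)

    normals = np.zeros_like(vertices)
    for i, face in enumerate(faces):
        normals[face] += face_n[i]                         # accumulate per vertex
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    return normals
```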

Proposition XXXIX. No tangents  |  nothing abnormal.

Proof.—Abnormal is a lecture series given by Michel Foucault at the Collège de France in 1974. In this series, Foucault discusses the concept of abnormality and how it has been constructed and enforced in Western societies.

Note.—Foucault explains that abnormality is not a universal concept—it is a culturally and historically specific construct. He argues that the concept of abnormality is not based on objective scientific criteria, but rather it is shaped by social, cultural, and political factors. Foucault then goes on to trace the history of the concept of abnormality in Western societies, starting with ancient Greek and Roman societies, where abnormal behavior was understood as a form of divine punishment or possession. He then discusses the emergence of the modern concept of abnormality in the 18th and 19th centuries, which was closely tied to the rise of the discipline of psychiatry and the development of the asylum system:

It is this set of ideas, this simultaneously positive, technical, and political conception of normalization that I would like to try to put to work historically by applying it to the domain of sexuality. And you can see that behind this, the basic target of my criticism, or what I would like to get free from, is the idea that political power—in all its forms and at whatever level we grasp it—has to be analyzed within the Hegelian horizon of a sort of beautiful totality that through an effect of power is misrecognized or broken up by abstraction or division.336

Foucault argues that the concept of abnormality has been used as a means of social control, particularly in relation to marginalized groups—the poor, the elderly, people with disabilities, people of different races and orientations. He also suggests that the concept of abnormality has been used to reinforce dominant societal norms and values, and to justify exclusion and discrimination.

Proposition XL. Normals are indicators and controls.

Proof.—Normalization theory is a sociological theory that explains how social behaviors and norms become integrated into society. It was developed in the late 1970s by sociologists Professor Ian Gough and Professor Peter Townsend at the University of Bath in the United Kingdom.

Note.—According to normalization theory, the process involves four stages—Deviance—A behavior is seen as deviant or outside the norm—Condemnation—The behavior is condemned or stigmatized by society—Tolerance—The behavior is gradually accepted and tolerated by society—Acceptance—The behavior becomes fully accepted and becomes part of the norm. Normalization theory suggests that deviant behaviors can become normalized through a process of social integration. In Normporn, “Karen Tongson reflects on how queer cultural observers work through repeated declarations of a ‘new normal’ and flash lifestyle trends like ‘normcore,’ even as the absurdity, aberrance, and violence of our culture intensifies.”337

Corollary I.—Normalization theory has been applied to a wide range of social behaviors, including drug use, gambling, and the adoption of technology. It is often used to understand how behaviors and norms change over time—how they become integrated into society.

Corollary II.—Normalizing white supremacist patriarchy refers to the process of accepting and perpetuating harmful systems of power that prioritize the interests of white, cisgender men over those of other groups. This includes the acceptance of white supremacy, which is the belief that white people are superior to people of other races, and the patriarchy, which is a system of power that privileges men over women. Binaries. White—Black. Male—Female.

Proof.—White supremacist patriarchy is reinforced in many ways, including through media, education, and language. For example, the media often portrays people of color in a negative or stereotypical manner, while simultaneously reinforcing the idea that white people are the norm and the default. This can lead to the internalization of harmful stereotypes and biases by both white people and people of color. The Bluest Eye.338 Aggressive recession.

Note.—The endeavour to injure one whom we hate is called Anger.

Racism and sexism are normalized through images—stereotypes circulated. Reinforced. Internalized. Normalizing sexism also involves the acceptance of discriminatory practices and policies, such as the gender pay gap or the lack of representation of women in leadership positions. These practices and policies often go unchallenged, further perpetuating and reinforcing sexist attitudes and behaviors. Normalizing sexism—and racism—involves the subtle—often unconscious—acceptance of attitudes and behaviors as the expected defaults. It is important to recognize and challenge the ways in which white supremacist patriarchy is normalized in order to create a more equal and just society. This involves actively working to dismantle harmful systems of power and supporting policies and practices that promote equity and justice for all. To normalize equality and mutual respect.

Proposition XLI. Orientation is not fixed.

Proof.—Sara Ahmed is a British feminist and critical race theorist who has written extensively about the concept of orientation. In her work, Ahmed uses the term orientation to refer to the ways in which individuals and groups orient themselves towards or away from certain ideas, practices, or values: “I consider how racism is an ongoing and unfinished history; how it works as a way of orientating bodies in specific directions, thereby affecting how they ‘take up’ space. We ‘become’ racialized in how we occupy space.”339

Note.—According to Ahmed, orientation is a dynamic and ongoing process that is shaped by social, cultural, and historical factors. It is not fixed or predetermined, but rather it is influenced by the ways in which individuals and groups interact with and make sense of the world around them.

Corollary.—Orientation is closely tied to power dynamics and social hierarchies.

Note.—Ahmed suggests that marginalized groups may be forced to orient themselves in ways that are counter to their own desires or interests. Orientation is not just about individual choices or preferences, it is a collective process shaped by social and cultural forces: “Although Merleau-Ponty is tempted to say that the ‘vertical is the direction represented by the symmetry of the axis of the body,’ his phenomenology instead embraces a model of bodily space in which spatial lines ‘line up’ only as effects of bodily actions on and in the world. In other words, the body ‘straightens’ its view in order to extend into space. One might be tempted, in light of Merleau-Ponty’s discussion of such queer moments, to reconsider the relation between the normative and the vertical axis … the normative can be considered an effect of the repetition of bodily actions over time, which produces what we can call the bodily horizon, a space for action, which puts some objects and not others in reach. The normative dimension can be redescribed in terms of the straight body, a body that appears ‘in line.’ Things seem ‘straight’ (on the vertical axis), when they are ‘in line,’ which means when they are aligned with other lines.”340 She suggests that individuals and groups can actively resist and challenge dominant orientations and work towards creating more inclusive and equitable ways of orienting themselves and others.

Proposition XLII. Smoothing difference.

Proof.—Post-processing includes algorithms like Iterative Closest Point, Occupancy Networks, and Poisson Surface Reconstruction. Smoothing, decimation, texture mapping, and other refinement and manipulation techniques may also be applied at this stage.

Proposition XLIII. Post-processing minimizes difference.

Proof.—Iterative Closest Point—ICP—is a key algorithm in reconstructive post-processing. Point clouds might not be perfectly aligned in the same coordinate system. Before a complete and accurate model of the object or environment can be formed, these point clouds need to be aligned or registered correctly to form a coherent, unified model. Individuals merge: “The individual is defined by its ‘positioning’ within the intersubjective frame. The foundation is transposed from a time axis to a spatial one, becoming topographical, the lay of the social land: we are no longer in the once-upon-a-time, but in the always-already. For in this approach, the individual is in a sense prehatched, since the topography determining it is itself predetermined by a mapped-out logic of baseline positions and combinations or permutations of them.”341 The machine hypothesis synthesized.

The Iterative Closest Point algorithm works by iteratively minimizing the difference between two point clouds—the reference—or target—point cloud, and the source—or input—point cloud. The algorithm starts with an initial guess of the transformation between the two. It then finds the closest corresponding points and uses them to compute the transformation. This transformation is then applied to the first point cloud and the process is repeated. The algorithm converges when the difference between the two point clouds is minimized. The algorithm typically involves the following steps—Correspondence Estimation—For each point in the source point cloud, the closest point in the reference point cloud is found—Transformation Estimation—A transformation (comprising rotation and translation) that best aligns the source point cloud to the reference point cloud, based on the correspondences established in step 1, is computed. This is typically done by minimizing a certain error metric, often the mean squared error between the corresponding points—Transformation Application—The computed transformation is applied to the source point cloud—Iteration—These steps are repeated until the transformation between consecutive iterations falls below a certain threshold, or a maximum number of iterations is reached. In this way, Iterative Closest Point aligns and merges multiple point clouds into a coherent model.
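
The four steps reduce to a short loop. A minimal point-to-point ICP in Python, using SciPy's KD-tree for correspondence estimation and the SVD (Kabsch) solution for the rigid transform; the convergence threshold is omitted for brevity, and the loop simply runs a fixed number of iterations.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, target, iterations=20):
    """Align source (N, 3) to target (M, 3) with point-to-point ICP."""
    src = source.copy()
    tree = cKDTree(target)
    for _ in range(iterations):
        # 1. Correspondence estimation: closest target point per source point
        _, idx = tree.query(src)
        matched = target[idx]
        # 2. Transformation estimation: rigid transform minimizing MSE (Kabsch)
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:        # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        # 3. Transformation application; 4. iterate
        src = src @ R.T + t
    return src
```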

Proposition XLIV. Machines of implicit representation.

Proof.—An implicit representation is a way of representing an object or surface using an equation rather than a set of discrete points or triangles. This equation defines the shape of the object or surface by specifying a value for every point in space. Points where the equation evaluates to zero are considered to be part of the object or surface, while points where the equation evaluates to a non-zero value are considered to be outside the object or surface.

Note.—Implicit representations are often used to represent smooth surfaces—like spheres and tori, which can be difficult to represent accurately using a mesh of discrete points or triangles. They are also useful for complex shapes or topologies, such as those with self-intersecting surfaces or multiple connected components: “Implicit Feature Networks (IF-Nets), which deliver continuous outputs, can handle multiple topologies, and complete shapes for missing or sparse input data retaining the nice properties of recent learned implicit functions, but critically they can also retain detail when it is present in the input data, and can reconstruct articulated humans.”342 One advantage of implicit representations is that they can be easily transformed and modified with mathematical operations. For example, an implicit representation of a sphere can be scaled or translated simply by multiplying or adding constants to the equation. This can make them more efficient to work with than other representations, such as meshes, which may require more complex algorithms to modify. However, implicit representations can be more difficult to work with than other representations in some cases, as they do not provide a direct representation of the object's surface. This makes it more difficult to perform certain tasks—like ray tracing and collision detection—which require knowledge of the surface geometry.
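
A sphere as an implicit function, in Python: the signed distance f(p) = |p − c| − r is zero on the surface, negative inside, positive outside; translation and scaling are edits to the equation's constants rather than to any mesh.

```python
import numpy as np

def sphere(p, center=(0.0, 0.0, 0.0), radius=1.0):
    """Implicit sphere: f(p) = |p - c| - r."""
    return np.linalg.norm(np.asarray(p) - np.asarray(center)) - radius

print(sphere((1.0, 0.0, 0.0)))              #  0.0 -> on the surface
print(sphere((0.0, 0.0, 0.0)))              # -1.0 -> inside
print(sphere((2.0, 0.0, 0.0), radius=2.0))  #  0.0 -> scaled surface
```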

Proposition XLV. Representation occupies space.

Proof.—Occupancy networks—also known as occupancy maps or voxel grids—are different ways of representing three-dimensional objects and environments in computer graphics. They are composed of a grid of equally-sized cubes—voxels—that store information about the presence or absence of objects in space. Each voxel in an occupancy network is associated with a binary value that indicates whether an object occupies that voxel or not. For example, a value of 1 might indicate that an object is present in the voxel, while a value of 0 indicates that the voxel is empty. Occupancy networks are often used to represent objects or environments that are difficult to represent accurately using other methods, such as meshes or implicit surfaces. They are particularly useful for representing objects with complex shapes or topologies, or for representing large or detailed environments. One advantage of occupancy networks is that they provide a compact representation of 3D objects and environments, as they only store information about the presence or absence of objects in each voxel. This can make them more efficient to work with than other representations, such as meshes, which may require more storage space to store the same level of detail: “Occupancy networks implicitly represent the 3D surface as the continuous decision boundary of a deep neural network classifier. In contrast to existing approaches, our representation encodes a description of the 3D output at infinite resolution without excessive memory footprint.”343 Occupancy networks are used for a variety of tasks in computer graphics—including modeling, rendering, and collision detection. They are also used in robotics and computer vision applications, for navigation through physical environments.
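
A binary occupancy grid built from a point cloud, as a NumPy sketch; the resolution and bounds are arbitrary.

```python
import numpy as np

def occupancy_grid(points, resolution=32, bounds=(-1.0, 1.0)):
    """Binary voxel grid: 1 where at least one point falls in the voxel."""
    lo, hi = bounds
    grid = np.zeros((resolution,) * 3, dtype=np.uint8)
    # Map world coordinates into voxel indices
    idx = ((points - lo) / (hi - lo) * resolution).astype(int)
    idx = np.clip(idx, 0, resolution - 1)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = 1
    return grid
```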

Proposition XLVI. The language of force invades the algorithms.

Proof.—The word marching comes from the Old French word ‘marcher,’ which means ‘to walk.’ It ultimately derives from the Latin ‘marcare,’ which means ‘to step’ or ‘to tread.’ In English, the word ‘marching’ is often used to describe the act of organized walking. A military formation—feet moving in unison—the body held upright—disciplined. It also means border—frontier.344 

Proposition XLVII. Marching cubes.

Proof.—The marching cubes algorithm is a widely used technique for extracting a polygonal mesh from an implicit representation—like a scalar field. It was first introduced in a paper published in 1987 by William E. Lorensen and Harvey E. Cline: “We present a new algorithm, called marching cubes, that creates triangle models of constant density surface from 3D medical data—using a divide-and-conquer approach to generate inter-slice connectivity, we create a case table that defines triangle topology. The algorithm processes 3D medical data in scan-line order and calculates triangle vertices using linear interpolation. We find the gradient of the original data, normalize it, and use it as a basis for shading the models.”345

Note.—A scalar field is a concept from mathematics and physics that associates a scalar value—a single numerical value—with every point in space or region of space. Marching Cubes takes these scalar values and creates a mesh of triangles that approximate the shape of an underlying geometry. The algorithm works by iterating over each cell in the scalar field. For each cell, it looks at the 8 points at its corners and determines which of the 256 possible configurations—reducible by symmetry to a small table of canonical cases—it falls into. Each case corresponds to a different way that the 8 points can be arranged, and the algorithm uses this information to generate the triangles that approximate the surface of the object. The marching cubes algorithm has been widely adopted due to its simplicity and efficiency, as well as its ability to handle both smooth and sharp features in the surface. It has also been extended and modified in various ways, such as the marching tetrahedra algorithm and the dual contouring algorithm. The marching cubes algorithm is commonly used in computer graphics and scientific visualization. It is used in medical imaging—MRI and CT scans—to create models of tissues and organs.
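
scikit-image implements the algorithm; a short sketch samples a signed-distance field for a sphere on a grid and extracts the zero iso-surface as a triangle mesh.

```python
import numpy as np
from skimage import measure

# Scalar field: signed distance to a sphere, sampled on a 64^3 grid
n = 64
axis = np.linspace(-1.0, 1.0, n)
x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
field = np.sqrt(x**2 + y**2 + z**2) - 0.5   # zero on the sphere's surface

# Extract the level-0 iso-surface as vertices and triangles
verts, faces, normals, _ = measure.marching_cubes(field, level=0.0)
print(verts.shape, faces.shape)
```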

Proposition XLVIII. The body without organs—smooth space and striated space.

Proof.—In the philosophy of Gilles Deleuze and Félix Guattari, the Body without Organs is a conceptual plane or surface that exists beyond the organized and hierarchical structures of society, institutions, and the human body itself. It is not a physical entity but a virtual, abstract space of potentiality and intensity. The term ‘organs’ does not refer solely to bodily organs but refers to any organized, fixed, and stratified systems that restrict and regulate desire and creativity: “Capitalism tends toward a threshold of decoding that will destroy the socius in order to make it a body without organs and unleash the flows of desire on this body as a deterritorialized field”346 (III.xxii.).

The Body without Organs is a fuzzy concept—characterized by the absence of fixed forms, identities, and structures. It represents a state of pure becoming, free from pre-existing organization—full of flows, connections, and possibilities. It is a space of experimentation, creativity, and the emergence of new potentials.

Likewise, their concept of smooth space refers to a type of space that is open, expansive, and continuous—as opposed to striated space, which is organized—hierarchical—segmented. Smooth space lacks boundaries or fixed points and is characterized by fluid, open-ended movement. It is often associated with nomadic cultures and non-hierarchical social structures, as well as with the unconscious mind: “Smooth space is filled by events of haecceities, far more than by formed and perceived things. It is a space of affects, more than one of properties … it is an intensive rather than an extensive space, one of distances, not of measures and properties.”347 Deleuze and Guattari’s concept of smooth space is closely linked to their idea of desire, which they see as an open-ended, productive force that is constantly creating and destroying social and cultural structures. They argue that the desire for smooth space—and the desire to escape the constraints of striated space—is central to social and political change.

Note.—Deleuze and Guattari contrast smooth space with striated space, which is characterized by fixed points, boundaries, and hierarchical organization. Striated space is often associated with sedentary cultures and capitalist societies, as well as with the structures of language and representation: “The striated is that which intertwines fixed and variable elements, produces an order and succession of distinct forms, organizes horizontal [—] lines with vertical [ | ] planes.”348

Proposition XLIX. Collectives enmesh.

Proof.—A mesh is a collection of points—vertices—connected by edges to form a polyhedral surface. A mesh can be used to represent a 3D object or surface by dividing it into a series of small, flat polygons—typically triangles and quadrilaterals. Each vertex in a mesh is defined by its 3D coordinates (x, y, z)—the edges connecting the vertices define the topology of the surface. The polygons formed by edges are called faces.349 Basic volumes—cubes, spheres, cylinders, platonic solids—are primitives.
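
The smallest closed example, as a NumPy sketch: a tetrahedron defined by a vertex array and a face array of vertex indices.

```python
import numpy as np

# Four vertices in 3D space: (x, y, z) coordinates
vertices = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.5, 1.0, 0.0],
    [0.5, 0.5, 1.0],
])

# Each face is a triangle written as three vertex indices;
# the edges of the mesh are implied by the faces.
faces = np.array([
    [0, 2, 1],
    [0, 1, 3],
    [1, 2, 3],
    [0, 3, 2],
])
```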

Note.—Primitives are unchanged: “Westerners encountered a wide variety of societies in their colonial expansion. Politically these were categorized from the most complex—the state societies in regions of Asia and North Africa—to those perceived as formed by savages and primitives, with the simplest types of political organization. Their entrenched belief in a philosophy of progression took Western scholars to assume an uneventful and unchanged past for these societies. It was commonly argued that savages did not have a history. Hence, they were considered as living fossils, as ‘survivals’ from earlier stages of culture long passed in Europe.”350 Primitives were forced to deform—everything must conform.

Proposition L. Normals approximate the surface of perfection.

Proof.—Poisson Surface Reconstruction is an algorithm used to create a surface model from point cloud data. The approach is named after Siméon Denis Poisson, a French mathematician, who introduced the Poisson equation in the field of potential theory: “Reconstructing 3D surfaces from point samples is a well studied problem in computer graphics. It allows fitting of data—filling surface holes—remeshing existing models. We provide a novel approach that expresses surface reconstruction as the solution to a Poisson equation.”351 Unlike some other methods, it doesn't rely solely on the point positions—it leverages normals.

Note.—“Our key insight is that there is an integral relationship between oriented points sampled from the surface of a model and the indicator function of the model. Specifically, the gradient of the indicator function is a vector field that is zero almost everywhere (since the indicator function is constant almost everywhere), except at points near the surface, where it is equal to the inward surface normal.” The Poisson Surface Reconstruction process involves several steps—Normal Estimation—The first step in Poisson reconstruction is to estimate the normals of the points in the point cloud. These normals represent the direction that the surface was facing at each point. If the point cloud data does not already contain normals, they will need to be estimated—Equation Formulation—Using the point positions and the surface normals, a Poisson equation is formulated. This equation describes how the surface normal directions vary over space—Equation Solution—The Poisson equation is then solved to obtain a scalar function that can define the surface of the 3D object. This is typically done through a process of relaxation or optimization—Iso-Surface Extraction—Finally, an iso-surface extraction process, such as Marching Cubes, is used to generate a mesh from the scalar function.
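
Open3D wraps these steps in a pair of calls; a usage sketch follows, with a placeholder path and illustrative parameter values.

```python
import open3d as o3d

# Load a point cloud (the path is a placeholder)
pcd = o3d.io.read_point_cloud("scan.ply")

# Normal estimation, required by the Poisson solver
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30))

# Equation formulation, solution, and iso-surface extraction are bundled
# into a single call; depth controls the octree resolution.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=9)
```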

Proposition LI. Smoothing is normalization.

Proof.—Smoothing a meshed surface removes noise and inconsistencies—producing a more visually pleasing representation.

Note.—There are several smoothing algorithms—Laplacian Smoothing—This is one of the most straightforward and widely-used techniques. In Laplacian smoothing the new position of a vertex in a mesh is calculated as the average of its neighbors. This process is usually iterated several times. While easy to implement, this method may lead to shrinkage of the model and doesn't preserve sharp features—Bilateral Smoothing—This is an improvement on the Laplacian smoothing approach that preserves sharp edges better. In bilateral smoothing, the weights of neighboring points are determined not just by their distance but also their similarity in terms of normal vectors or color. This means that points on the same surface but different sides of an edge will not be averaged together, preserving the sharpness of the edge—Iterative Closest Point—Although ICP is primarily a method for aligning 3D shapes, it often includes a smoothing step to help improve the quality of the alignment. During the alignment process, the point clouds are iteratively adjusted to minimize the distance between corresponding points, which has a smoothing effect on the overall shapes—Taubin Smoothing—This method is an extension of Laplacian smoothing which alternates between Laplacian smoothing—which may cause shrinkage—and its inverse—which can cause expansion. By carefully choosing the parameters for these two steps, it is possible to cancel out the shrinkage and maintain the original size of the model while still achieving the smoothing effect—Poisson Reconstruction—As previously mentioned, Poisson Surface Reconstruction uses the input point cloud data and the surface normals to generate smooth surfaces—Quadric Error Metrics Simplification—This method simplifies the mesh while minimizing deviation from the original surface. It combines vertices and adjusts their positions to minimize the overall error, leading to a smoother and less complex surface.352
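
A sketch of the Laplacian case, the first method in the list above, in NumPy: each vertex moves toward the mean of its neighbors, and repeated iterations smooth noise while shrinking the model; it assumes every vertex belongs to at least one face.

```python
import numpy as np

def laplacian_smooth(vertices, faces, iterations=10, alpha=0.5):
    """Move each vertex toward the average of its neighbors."""
    # Build vertex adjacency from the triangle faces
    neighbors = [set() for _ in vertices]
    for a, b, c in faces:
        neighbors[a] |= {b, c}
        neighbors[b] |= {a, c}
        neighbors[c] |= {a, b}

    v = vertices.copy()
    for _ in range(iterations):
        avg = np.array([v[list(n)].mean(axis=0) for n in neighbors])
        v = (1 - alpha) * v + alpha * avg  # blend toward the neighborhood mean
    return v
```

Taubin smoothing alternates this step with a negative-weight pass to counteract the shrinkage.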

Proposition LII. Reality, decimated.

Proof.—Decimation refers to the process of reducing the complexity of a model by decreasing the number of vertices, edges, and faces while trying to preserve the overall shape and features of the model as much as possible. This process is sometimes also known as mesh simplification or model reduction. The purpose is to make the model easier to work with and quicker to process, especially in applications like real-time rendering or analysis where speed is crucial: “... decimation filters are commonly used to restore the realistic appearance of virtual biological specimens, but they can cause a loss of topological information of unknown extent. In this study, we analyzed the effect of smoothing and decimation on a 3D mesh to highlight the consequences of an inappropriate use of these filters … Decimation always produced detrimental effects on both topology and measurements.”353

Note.—Strategies of decimation include—Vertex Clustering—This is one of the simplest methods for model decimation. The 3D space is divided into a regular grid of voxels and all the vertices within each voxel are replaced with a single vertex—often the centroid of the original vertices. While simple and fast, this method can result in a loss of detail and does not always preserve the topology of the model well—Edge Collapse—This is a more sophisticated method that iteratively removes the least important edges from the model. The importance of an edge can be calculated in various ways, such as the length of the edge, the curvature of the surface around the edge, or the angle between the faces adjacent to the edge. When an edge is removed, the two vertices at its ends are merged into a single vertex, and the faces adjacent to the edge are also removed or reconnected—Quadric Error Metrics simplification—QEM—This is a further refinement of the edge collapse method. For each potential edge collapse, a quadric error metric is calculated, which represents the squared distance from the new vertex position to the original surface. The edge collapse that results in the smallest increase in this error metric is chosen at each step. This method can preserve the features of the model well, but is more computationally intensive—Vertex Removal—This method involves iteratively removing vertices from the model, similar to edge collapse. However, instead of merging vertices, a vertex is removed and its neighboring vertices are reconnected to form new faces.354 The word decimation descends from an ancient Roman military punishment. Plutarch describes how a general or emperor executed “the punishment known as ‘decimation’ on those who had lost their nerve. What he did was divide the whole lot of them into groups of ten, and then he killed one from each group, who was chosen by lot.”355
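
Open3D exposes QEM simplification directly; a usage sketch with a placeholder path and an arbitrary tenfold reduction target.

```python
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("model.ply")   # placeholder path
before = len(mesh.triangles)

# Quadric Error Metrics: collapse the edges that least distort the surface
simplified = mesh.simplify_quadric_decimation(
    target_number_of_triangles=before // 10)

print(before, "->", len(simplified.triangles), "triangles")
```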

Proposition LIII. Texture hides decimation.

Proof.—Texture mapping is the process of applying surface detail or color information to a model. This is done by assigning an image—a texture map—to the surface, enhancing the visual realism of the model. The original algorithm was proposed by Edwin Catmull in 1974: “This method subdivides a patch into successively smaller subpatches until a subpatch is as small as a raster-element, at which time it can be displayed. In general this method could be very time consuming because of the great number of subdivisions that must take place; however, there is at least one very useful class of patches—the bicubic patch—that can be subdivided very quickly. Pictures produced with the method accurately portray shading and silhouette of curved surfaces. In addition, photographs can be ‘mapped’ onto patches thus providing a means for putting texture on computer-generated pictures.”356

Note.—Texture mapping involves—UV Mapping—This is the process of creating a 2D representation—UV map—of the 3D model's surface. Each point—vertex—on the model's surface is assigned a corresponding point in the 2D UV map. This is usually a complex task since it involves flattening a 3D surface into 2D while minimizing distortions and overlaps. There are various algorithms and methods to achieve this, such as planar projection, cylindrical and spherical mapping, and more advanced methods—Least Squares Conformal Mapping—LSCM—and Angle Based Flattening—ABF—Texture Sampling—Once the UV map is created, the next step is to sample the texture image for each point on the surface of the 3D model. This involves assigning to each point on the model a pixel—texel—from the 2D texture map based on the point's UV coordinates. There are various interpolation methods used in this step, such as—nearest-neighbor—bilinear—and bicubic interpolation—each with their advantages and trade-offs in terms of speed and quality—Texture Filtering—This step handles issues that arise when the texture map is viewed at different scales or angles, such as aliasing—stair-step effect—and blurriness. The two main types of texture filtering techniques are Mipmapping and Anisotropic filtering—Mipmapping—creating a series of smaller versions of the texture map and selecting the appropriate one based on the distance of the surface from the viewer—Anisotropic filtering—on the other hand, adjusts the texture sampling based on the viewing angle to maintain detail and reduce blurriness—Shader Processing—Modern graphics processing units—GPUs—utilize programmable shaders to handle the final stage of applying the texture to the model. Fragment shaders—pixel shaders—manipulate texture data to achieve various visual effects—bump mapping | normal mapping | parallax mapping—which add the appearance of additional geometric detail to the surface.357
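
A nearest-neighbor texture sampler in NumPy: UV coordinates in [0, 1]² are mapped to texel indices; bilinear interpolation would blend the four surrounding texels instead. The vertical flip reflects one common convention (texture row 0 at the top); conventions vary across tools.

```python
import numpy as np

def sample_texture(texture, uv):
    """Nearest-neighbor sampling: map UVs in [0, 1]^2 to texels."""
    h, w = texture.shape[:2]
    u = np.clip((uv[..., 0] * (w - 1)).round().astype(int), 0, w - 1)
    v = np.clip(((1 - uv[..., 1]) * (h - 1)).round().astype(int), 0, h - 1)
    return texture[v, u]

# A 2x2 checker texture, sampled at the corners of UV space
checker = np.array([[0, 255], [255, 0]], dtype=np.uint8)
uvs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(sample_texture(checker, uvs))   # [255   0   0 255]
```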

Corollary.—An illusion of information.

Proposition LIV. The mind endeavors to conceive only such things as assert its power of activity.

Proof.—We crave a near-identical representation of the physical world, but the pursuit often uncovers the inherent flaws of our tools. Reconstruction is a complex process—each stage—from acquiring data points to post-processing the model—fraught with potential inaccuracies—noise from source data, distortion from approximations in the algorithms, holes in meshes. Texture mapping distracts from these imperfections. It functions as a cosmetic concealer covering the acne of inaccuracy—contouring the faces of digital meshes.

Proposition LV. The mesh cannot contain every wrinkle and every pore.

Proof.—An untextured model is bare—revealing all blemishes—all imperfections. Consider a reconstruction of a human face. The mesh cannot accurately represent every curve, wrinkle, or pore—it is a low resolution approximation. But with a high-resolution texture map, the final model looks astonishingly lifelike—inaccuracies hidden beneath a shroud of color and photoreal detail.

Corollary.—The surface is contested territory.

Note.—Advanced texture mapping techniques—like bump mapping or normal mapping—can simulate the appearance of surface detail not present in the actual geometry. Even with a relatively low-polygon model, these techniques can produce a sense of depth and complexity that disguises underlying simplicity. However, just as with makeup, the success of this illusion depends on the skillful application of the texture. Inaccurate UV mapping or poor-quality texture images can lead to glaringly obvious seams, stretches, or other artifacts that draw attention rather than deflect it.

Corollary.—No one envies the virtue of anyone who is not his equal.

Proof.—A reconstruction is always already a distortion. The modification of models post-reconstruction is a risk as well. Altering the original. The alteration of a reconstruction might infringe upon intellectual property rights or distort the original intent. On a broader scale, the unregulated modification of reconstructions could contribute to the spread of misinformation or the erasure of historical or cultural truth.

Note.—There is also potential for objectification and violation of privacy. Without proper consent or oversight, the post-processing manipulation of models of people could lead to misrepresentation, dehumanization, or even the creation of deep fake scenarios: “The underlying technology can replace faces, manipulate facial expressions, synthesize faces, and synthesize speech. These tools are used most often to depict people saying or doing something they never said or did. How do deepfakes work? Deepfake videos commonly swap faces or manipulate facial expressions”358 (IV.).

Proposition LVI. Reconstructions are easily manipulated.

Proof.—Advanced algorithms, once the sole domain of experts, are becoming increasingly accessible, making it easier for those with malicious intent to manipulate visual and spatial data to create hyper-realistic—but deceptive—representations: “Voices and likenesses developed using deepfake technology can be used in movies to achieve a creative effect or maintain a cohesive story when the entertainers themselves are not available … replace characters who had died or to show characters as they appeared in their youth.”359 These synthetic images are often indistinguishable from reality. The propagation of deepfake media content outside of entertainment stirs public fear and confusion—the impact of these deceptions extends far beyond misinformation—to the geopolitical threat of mass-scale deception. Disinformation campaigns—enabled by these technologies—fuel internal social and political unrest, destabilize nations, and create international conflict.

Note.—How can the global community discern truth from falsehood in an age where seeing is no longer believing? Though maybe it never was … The solutions are complex and multi-faceted, involving a combination of policy, education, and perhaps new technological tools to detect and combat visual deception. The ethical dimension of this issue cannot be overstated. This is the ability to manipulate reality and distribute it on a massive scale. The line between enhancement and deception can often be blurred. Who gets to decide what is true? How are these decisions made and enforced? These technologies do not operate in a vacuum. They are framed within the broader context of a society where digital literacy is lagging behind technological advances.

Proposition LVII. Data is on display.

Proof.—Finally, the reconstruction is visualized.

Note.—In Reconstruction, visualization refers to the process of generating a perceptible depiction of the model. The translation of computed data into a format that can be easily interpreted and understood by humans. Visualization algorithms simulate surface texture, lighting, shading, and color, to produce a new image—or sequence of images. While there are many rendering algorithms, the following are some of the most widely used approaches—Rasterization—is faster but less realistic—suitable for real-time rendering—video games—simulations. It converts the model into a raster image—a grid of pixels—Ray Tracing—simulates the physics of light to achieve realism—It works by tracing the path of light and simulating the effects of its encounters with virtual objects—It is known for its ability to produce high-quality effects like reflections—refractions—and shadows—Path Tracing is a type of Ray Tracing that simulates light by tracing the many potential paths that light could take from the light source to the camera lens—By averaging the results of many different paths—it produces accurate global illumination—soft shadows—depth of field—motion blur—indirect lighting360—Radiosity—is a method primarily used in scenes with diffusely reflecting surfaces—It calculates the way that light transfers from one surface to another—This global illumination method accounts for indirect illumination—where light reflects off multiple surfaces before reaching the viewer—Photon Mapping—is a two-pass global illumination algorithm that accurately represents the interaction of light with different surfaces—In the first pass—it traces photons from the light source into the scene—storing them in the photon map—In the second pass—it uses traditional ray tracing from the camera while also using the photon map to estimate the incoming radiance—A strategy of diffuse interreflection: “In the first pass two photon maps are created by emitting packets of energy—photons—from the light sources and storing these as they hit surfaces within the scene. We use one high resolution caustics photon map to render caustics”361—In photon mapping—caustics refer to the concentrated patterns of light that appear on surfaces due to the reflection off of or refraction of light through curved or shiny—translucent and specular—surfaces—Caustics—result from the focusing and concentration of light rays as they interact with such surfaces—projecting bright and distinct patterns that are often seen in the real world—such as light patterns formed at the bottom of a swimming pool or the shimmering light under a glass of water.

Proposition LVIII. Continuous time, segmented—becomes frequency.

Proof.—The Fourier Transform, named after the French mathematician and physicist Jean-Baptiste Joseph Fourier, is a mathematical technique that transforms a function of time—a signal—into a function of frequency. Fourier, known for initiating the investigation of Fourier series, made significant contributions to the field of heat transfer, which led him to develop the Fourier Transform. Essentially, the Fourier Transform decomposes a signal into the frequencies that make it up, similar to how a musical chord can be expressed as the frequencies of its constituent notes. This mathematical tool is fundamental in a wide range of fields, including engineering, physics, and data analysis, as it provides a way to analyze and manipulate data by shifting from the time or space domain to the frequency domain. In doing so, it uncovers the frequency spectrum of a signal—revealing the signal's individual sinusoidal components of different frequencies. Fourier Transforms are fundamental in Fourier Volume Rendering—which produces images from volumetric data. This method is often used in medical imaging where it allows clinicians to view a 2D projection of a 3D scan.
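
The chord analogy, checked numerically in NumPy: a signal built from two tones is decomposed by the FFT back into its constituent frequencies.

```python
import numpy as np

fs = 256                                  # samples per second, one second total
t = np.arange(fs) / fs
# A "chord" of two tones: 5 Hz and 12 Hz
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

# Decompose into frequencies; rfft suffices for a real-valued signal
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# The two strongest components recover the constituent notes
peaks = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(peaks))                      # [5.0, 12.0]
```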

Proposition LIX. The body is projected into frequency space.

Proof.—Fourier Transforms play an essential role in CT scan reconstructions. During a scan, the machine takes a series of two-dimensional x-ray images of a body. These images are known as projections. A stack of slices: “A CT machine can produce 64, 128, 256 et cetera slices and the number of slices needed for a study will completely depend on the physician. Using a 64 slice CT machine, producing a 512x512 image or slice with each slice on average having 700 views, then based on the explanation above, the total number of FFTs would be approximately 89000 (64*1400) … A larger study like CT angiography and cardiac CT14 will have a much higher number of views even for the same CT slices machine.”362 Via the Fourier Slice Theorem—or Central Slice Theorem—these projections are integrated to reconstruct bodies and organs.

Note.—Fourier Transforms are also used in digital holography. In this context, the Fourier Transform is used to shift from the spatial to the frequency domain, allowing the recording of object and reference beams as an interference pattern. The captured pattern is a hologram. Holography captures light scattered from objects and restructures it to create three-dimensional images. Unlike traditional photography—which records intensity and color—holography encapsulates the phase of light—retaining depth information:

Holography was proposed as a lensless imaging technique by Dennis Gabor in 1947 in an attempt to solve the problem of limited resolution in electron microscopes due to lens aberrations. Holography is an elegant solution to the so-called phase problem, which exists in coherent imaging and can be described as follows. A probing wave propagates through a sample and reaches a distant detector. Since detectors can only record intensity, information about the phase distribution of the wave is lost. However, this missing phase distribution is crucial because it contains information about the scattering events that have taken place inside the sample. Therefore, in order to reconstruct the sample distribution, the phases missing in the detector plane must be recovered.363

Advancements in technology, especially in computational power, resolution of recording devices, and miniaturization of hardware, are propelling the development and application of holography. Holography will seamlessly merge with augmented and virtual reality, offering immersive experiences that are indistinguishable from reality. The future of experience. Eyes affixed—under the spell of Reconstruction—algorithms reflecting the logic of striated space—the logic of disciplinary society.

INSTANCES OF RECONSTRUCTIONS

I. Reconstructions acquit or convict.

Explanation.—Forensic reconstruction is a powerful tool that is used to investigate and analyze crime scenes. It is used to map and understand the events that took place and to identify potential suspects. There are several steps involved in the trusted methodology for forensic 3D reconstruction of crime scenes—Data collection—The first step in forensic 3D reconstruction is to collect data from the crime scene. This can include photographs, video footage, measurements, and other types of data that can be used to create a detailed, 3D model of the scene—Data processing—Once the data has been collected, it must be processed and analyzed in order to create a 3D model of the crime scene. This may involve using specialized software to stitch together images and other data, as well as to adjust for any distortion or other issues that may affect the accuracy of the model—Model Creation—Once the data has been processed, it can be used to create a 3D model of the crime scene. This model can be used to visualize the scene, including the locations of objects, the movements of individuals, and any other relevant details—Analysis—Once the 3D model has been created, it can be used to analyze and interpret the events that took place at the crime scene. This may involve identifying potential suspects, reconstructing the events that took place, and determining the sequence of events—Findings—The final step in the forensic reconstruction process is to present the findings to relevant parties—law enforcement agencies or courts of law.

Reconstructions have been used as evidence in a number of court cases—either to reconstruct crime scenes or to demonstrate the events that led up to a crime. In 2008, State of Florida v. Casey Anthony was one of the first US cases in which reconstruction was used to recreate several scenes, including the trunk of a car in which the body of a young girl was found. The reconstruction was used to help establish the cause of death and to support the prosecution’s case that the girl’s mother, Casey Anthony, had killed her:

An Interactive Tour of the crime scene off Suburban Drive, the autopsy area and digital DNA lab, and the Anthony home where most believe is where 34 month old Caylee Marie Anthony sadly met her end … Angela Talamasca and her team have recreated an interactive simulation that undoubtedly will be used by both the State’s Attorneys and Jose Baez’s Defense Team. It is designed to provide a virtual experience allowing the final ‘triers of fact’ to literally transport themselves within the evidence and theories they will be presented to consider when determining the fate of the accused, Casey Anthony. This proof of concept ‘build’ includes contribution and consultation from leaders in the fields of: Forensics, CSI, Crime Scene Reconstruction and Medical Examiners Investigations.364

The use of reconstruction in the Casey Anthony case was controversial—some experts questioned the accuracy and reliability of the reconstruction. However, it was ultimately admitted as evidence in the case and played a role in the jury’s decision to find Casey Anthony not guilty of first-degree murder—but guilty of four counts of providing false information to law enforcement. 

Casey Anthony—Where the Truth Lies is a documentary about the case and the defendant’s pathological lying: “Casey Anthony is a proven liar. Her narrative of her own story is untrustworthy. She was found guilty at trial of providing false information to law enforcement. She had a long pattern of lying, beginning with years of constructing elaborate lies about her progress through high school, and later about her nonexistent job and even her pregnancy with Caylee, a backstory she shares with multiple convicted killers who all eventually murdered members of their family. Rather unusually, however, Casey’s parents, according to her brother Lee’s testimony at her trial, had a history of enabling and playing along with their daughter’s lies rather than holding her to account for them.”365

“The Anthony-focused docuseries repeatedly states she lied as a coping mechanism in order to deal with years of alleged sexual abuse from her father. (Anthony's father previously denied the allegations and did not respond for comment for the series). ‘I lied to everyone because that was my whole life up to that point,’ Anthony says through tears. ‘Acting like everything was OK but knowing nothing was OK… All of this is a reaction to trauma.’”366 What is truth within a culture that silences women? Disbelieves them? Shames them? Or is this another lie?

Uncertainty is a nightmare. Reconstructions can be nightmares too. The specter and spectacle of the tragedy—commodified in the attention economy. Masks of a moment passed—haunting the present. The case was sensationalized and “the high demand for Anthony related goods led to bidding wars. A Casey Anthony mask sold for over $20,000 to a desperate buyer in need of a Halloween costume. The seller, under the screenname ‘Prophunter’ said it was, ‘One of the best Halloween masks I've ever seen.’”367 Tragedy porn. Hustler—considered the more hardcore of the dominant pornography magazines—offered Anthony a deal: “We made an offer of a half-a-million dollars, but ... she would receive 10 percent of [any additional profits], and the reason why I did that is I'm still ambivalent as to how well this will do … but in case it goes viral and there's this huge interest and everybody wants to see the photographs, well, you know, millions could be made, so we don't know … but, I think it was generous of us to put a percentage of the profits in for her because it could amount to a great deal more money … People have been coming to me in droves, you know, wanting this … I've never seen that happen before.”368 The offer was widely publicized—no response. The imagination runs wild.

In 2014, another child—Tamir Rice—was tragically killed. This time by a Cleveland police officer. The crime scene was reconstructed by law enforcement: “This virtual reality reenactment of the Tamir Rice incident shows the perspective of the officers as they drove toward the area where Rice was shot … But what about Tamir Rice’s perspective? During the case debriefing, Meyer highlighted a video of the officer’s view, but it isn’t stated if a reconstruction of Rice’s viewpoint was ever requested.”369 Reconstructions are powerful … “Reconstructions are certainly more powerful. It’s much less likely that a jury will dispute a version of events with a 3D reconstruction versus a version of events backed by 2D photographs. Instead of taking a jury back out [to a scene] several months, several years later, you can take them into a scene as it was the day that it was scanned. You have a more realistic, cleansed view of the scene …”370 These technologies hold power—privilege a point of view—but are they distributed equally? Or do they reinscribe the same Reconstruction era logic of oppression and elimination?

The San Bernardino Mass Shooting refers to a tragic incident that occurred on December 2, 2015, in San Bernardino, California, United States. On that day, two individuals, Syed Farook and his wife Tashfeen Malik, carried out a mass shooting at the Inland Regional Center, a facility that provided services to people with developmental disabilities. During the attack, Farook and Malik opened fire on a holiday gathering being held at the center, resulting in the deaths of 14 people and injuring 22 others. After the shooting, Farook and Malik fled the scene but were later killed in a shootout with law enforcement officers.371 The San Bernardino mass shooting was one of the deadliest acts of gun violence in the United States at the time, and it had a significant impact on the local community and the nation as a whole. The incident sparked debates and discussions about gun control, terrorism, and the measures needed to prevent such tragedies in the future.

When the forensic team arrived at the scene of the final shootout, the “unit saw hundreds of pieces of evidence, and thanks to FARO’s laser scanners, they could capture and document such evidence in a single scan in just 15 minutes. ‘It is the most complete documentation tool, aside from digging up the house and bringing the entire house with me,’ said Russ, who is a crime scene specialist with the San Bernardino County sheriff’s department … ‘An average scan collects 44 million data points … Some scenes require 50 to 60 scans—that’s billions of data points.’ When Russ heads back to his crime lab, a computer program stitches the images together, creating a 3D memory of the crime scene.”372 The bullet-riddled car—captured and reconstructed. It is becoming common practice to make reconstructions of high profile crime scenes—mass shootings—massacres.373

Reconstruction has also been proposed as a preventative countermeasure: “The technology uses radar energy to detect weapons and explosives through clothing, backpacks and hand baggage in real time. The 3D shapes created by the technology are compared to an extensive library of images of weapons. MIT has licensed the technology exclusively to Liberty Defense to bring it to market. ‘What we’re offering is an attack prevention system,’ said Aman Bhardwaj, president and COO. ‘We’re preventing someone with a weapon from entering.’”374 Scan everyone at every entrance.375

Victims’ bodies are reconstructed too: “It’s called virtopsy or virtual autopsy. The first thing that we do is a laser scan of the body to capture bite marks, bruises and other things that we might lose when we open the body … little ridges, bumps and holes on a surface are such important information to scientists. If we had to take a photograph then we wouldn’t be able to capture this texture.”376 Forensic criminologists imagine a fully reconstructed future: “That’s what I see coming. We’re going to be putting these goggles on juries and say look around and tell me what you see.”377

II. Reconstructions are synthetic evidence.

III. What imbues this methodology with trust?

Explanation—Models of bodies and spaces are extracted—presented out of context. The process involves extensive reduction and normalization. Moreover, reconstructions can be modified to fabricate or alter evidence in order to support a particular narrative.

IV. Does complex engineering bring a representation closer to perfection?

Explanation—How are Reconstruction technologies used to construct, represent, and manipulate identities? Representation Theory—a branch of mathematics—delves into abstract algebraic structures by representing their elements as linear transformations of vector spaces.378 This method translates complex mathematical objects like groups and rings into more understandable and workable forms—like matrices or transformations—which are core to Reconstruction algorithms.
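
A minimal illustration of the mathematical idea, in Python with NumPy: the cyclic group of order four is represented by quarter-turn rotation matrices, so that composing group elements becomes multiplying matrices. The group and the representation are chosen here only for brevity.

```python
import numpy as np

# Represent the cyclic group Z_4: element k maps to rotation by k quarter turns.
def rho(k: int) -> np.ndarray:
    theta = k * np.pi / 2
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Homomorphism property: the matrix of a product is the product of the matrices.
for a in range(4):
    for b in range(4):
        assert np.allclose(rho(a) @ rho(b), rho((a + b) % 4))
```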

Its twin—Representation Theory—exists in sociology and focuses on how individuals and groups construct and present their identities.379 How do individuals or groups perceive and articulate self-concepts—how are these identities constructed, presented, and negotiated in social and cultural contexts? This evaluation encompasses personal identities—gender, race, and religion—and collective identities—national, ethnic, social class: “a system of values, ideas and practices with a twofold function; first, to establish an order which will enable individuals to orient themselves in their material and social world and to master it; and secondly to enable communication to take place among the members of a community by providing them with a code for social exchange and a code for naming and classifying.”380 In virtual environments, we can assume avatars that may differ from our physical selves, opening up exciting avenues for identity play and self-expression. While Reconstruction offers new avenues for self-representation, it may also reinforce harmful stereotypes, particularly when the technology is used without a nuanced understanding of the complexities of identity. Facial recognition technologies, which rely heavily on capture and reconstruction, have been criticized for their racial and gender biases: “The first woman known to be wrongfully accused as a result of facial recognition technology”381 was eight months pregnant and had to be hospitalized for dehydration after detainment. Error is dangerous. The fidelity of reconstructions could more generally impact how identities are perceived and interpreted. While high-fidelity models might seem more real or authentic, they can also be manipulated or falsified, potentially leading to issues of deception or misrepresentation.

V. Bodies are engineered.

Explanation—“A cyborg is a cybernetic organism, a hybrid of machine and organism, a creature of social reality as well as a creature of fiction.”382 According to Donna Haraway, cyborgs disrupt conventional binaries and categories; they are not confined to nature or culture, organic or synthetic, but inhabit the blurry interstices between these oppositions. Cyborg bodies are engineered constructs that defy rigid categorization, embodying a mix of biological, technological, and cultural elements. The cyborg is a symbol of hybridity, promoting a fluid, transgressive model of identity that moves beyond fixed identities and embraces multiplicity. Bodies are biological—mutable, adaptable, and socially constructed. Bodies are engineered in myriad ways —through physical modifications, medical interventions, wearable and implantable technologies, and even through the roles and expectations imposed by society. Our bodies, and the ways we perceive and experience them, are continually re-engineered, re-shaped, and re-defined through our interactions with technology and culture.

VI. Marketplaces of bodies.

Explanation—The capacity to digitally reconstruct bodies holds immense potential. For instance, in medical fields, Reconstructions allow professionals to simulate complex surgeries, thereby enhancing training and procedural understanding. However, alongside these benefits are ethical concerns surrounding the potential objectification and commodification of the human form. “The contemporary regime of volumetrics, meaning the enviro-socio-technical politics and narratives that emerge with and around the measurement and generation of 3D presences, is a regime full of bugs, crawling with enviro-socio-technical flaws.”383

VII. Corporeal reconstruction can transform a person’s surface into an asset, available for use in a variety of contexts. This new marketplace of bodies has led to a surge of questions concerning privacy, consent, and the rights of individuals whose bodies are being recreated. Body scans used in simulations—especially for medical or military training—present more subtle ethical dilemmas. While these simulations provide valuable learning experiences, they can inadvertently foster a desensitization towards human suffering and human agency. Particularly in the gaming, entertainment, and pornography industries, there is a risk of exploitation and misuse.

Explanation—Reconstructions of the body—body scans—are assets. They are bought and sold—recontextualized and altered: “A new adult virtual reality company called Holodexxx VR aims to change virtual reality adult entertainment in a big way. Instead of creating experiences that rely on 360 videos or cartoony 3d computer generated models, Holodexxx VR is creating ultra-realistic 3d modeling using 3d scans of actual adult actresses. Powered by the amazing Unreal Engine 4, this shit looks real and fuckworthy. Holodexxx VR aims to make 3d recreations of adult actresses as realistic and interactive as possible.”384 HolodeXXX allows users to customize the proportions of bodies—by customization, I mean adjusting a boob slider and a butt slider.385 If pornographic actors are compensated fairly and have informed consent, why not?

VIII. When a person’s body is converted into a digital asset, they risk losing control over their own likeness.386 This commodification transforms the human body into a tradable unit. Asset stores are digital replicas of slave markets—bodies for sale.387 These bodies may be placed in violent or sexual scenarios that the actor may not have anticipated. In pornography, the issue is acute. Deepfake technologies have been utilized to place individuals, often women, into explicit scenarios without their consent: “After years of activists fighting to protect victims of image-based sexual violence, deepfakes are finally forcing lawmakers to pay attention.”388 This is a violation of rights and constitutes a form of digital sexual harassment and assault. One victim of deepfake pornography expressed, “it really makes you feel powerless, like you’re being put in your place … Punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you.’”389

IX. Commodified bodies are subject to atrocity.

X. Reconstruction reenacts trauma.

Explanation—Eighty years after the forced expulsion of Japanese Americans from Little Tokyo, Los Angeles, the exhibition—BeHere / 1942: A New Lens on the Japanese American Incarceration—provides an immersive view into this dark historical period: “On Saturday May 9, 1942, the lives of Japanese Americans in Little Tokyo, Los Angeles, were forever changed. They were given until noon to dispose of their homes and possessions; then they were made to leave. In the euphemistic language of U.S. government policy, Japanese Americans all along the West Coast—some 120,000 individuals, 37,000 of whom resided in Los Angeles—were ‘evacuated’ to ‘relocation centers.’ In reality, they were put on buses and trains and shipped off to concentration camps where they would live for years, in some cases until after the end of the war.”390 Utilizing lesser-known photographs by Dorothea Lange and Russell Lee, the exhibit employs augmented reality—AR—technology to immerse visitors in historical reconstructions. The exhibition, located in the Japanese American National Museum—JANM—also features a public AR installation in the plaza adjoining the Nishi Hongwanji Buddhist Temple, where visitors can walk among virtual recreations of Japanese Americans about to be sent to the camps. The project is created by Japanese media artist Masaki Fujihata, and is co-presented by JANM and the Yanai Initiative for Globalizing Japanese Humanities at UCLA and Waseda University, Tokyo. What are the consequences of reconstructing atrocities? What is the role of communities that have experienced oppression?391

XI. The process of reconstructing a traumatic historical event involves technology, but also engagement with real people affected by the historical incident. How to balance the objectives of truth-telling—raising awareness—memorializing—with the risk of retraumatizing survivors and their descendants? The act of reconstruction may inadvertently commodify or trivialize suffering in a world where images are rapidly produced and consumed.

Explanation—The nature of augmented reality, which superimposes digital information onto the physical world, risks blurring the line between reality and reconstruction, potentially diluting the gravity of the atrocities. It is ethically critical to ensure that such projects do not inadvertently sanitize or diminish the harsh realities of the events being portrayed, particularly when dealing with a historical atrocity of such magnitude. A pivotal aspect of the BeHere/1942 project is the augmented reality app, which enables users to place reconstructed scenes virtually anywhere. While this interactive element can engage audiences and potentially increase the project’s reach, it also poses significant ethical implications. Allowing the freedom to situate such poignant historical moments in any setting can lead to misuse or trivialization of the events. For example, users might place the reconstruction in inappropriate or disrespectful contexts, inadvertently undermining the gravity of the atrocities.392 The risk is that the reality of the traumatic historical event could be diminished, turning it into a mere prop in a personalized digital playground.

XII. Hope is an inconstant pleasure, arising from the idea of something past or future, whereof we to a certain extent doubt the issue. Hope for remembrance. Hope for future protection.

XIII. What are the ambitions of the reconstruction? Who are its stakeholders?

Explanation—Involving survivors and descendants of survivors in the BeHere/1942 project adds another dimension to the ethical landscape. It raises questions about who gets to tell these stories, how they should be told, and what they are trying to say. On one hand, the involvement of the Japanese-American community in the project gives a voice to the survivors and their descendants, allowing them to reclaim and represent their historical narrative. This participation can also offer a degree of catharsis and empowerment. On the other hand, it’s crucial to consider the potential emotional distress for those reenacting traumatic events of their ancestors. Asking community members to reperform the process of internment might risk inflicting psychological harm, even if their participation is voluntary and well-intentioned. There is something sinister about the fact that they must enter cages for their performance to be captured and reconstructed. The ethics of representation must be carefully navigated to ensure respect for the dignity and well-being of all involved, striking a balance between historical accuracy and empathetic engagement. Moreover, the issue of consent and compensation for participants, particularly for those from marginalized communities, needs careful attention. It is essential to ensure that those involved are not merely viewed as assets but as partners in the narrative process—that they are consulted, adequately compensated, and acknowledged for their contributions: “What, then is the appropriate production methodology for creating monuments (meta-monument) in cyberspace?”393

XIV. Confidence is pleasure arising from the idea of something past or future, wherefrom all cause of doubt has been removed.

XV. Reconstructions are monuments to a moment in time.

Explanation—Monuments are structures erected to commemorate historical events, significant individuals, or prominent ideas. They have always held an integral function in societies. Their public and permanent nature carries significant historical—political—cultural—implications. Monuments are repositories of memory. They help societies recall their past, acknowledge the struggles they have overcome, and pay tribute to individuals who have played consequential roles in their histories. Monuments play an important role in sculpting collective identities. They construct and reinforce national or community narratives, fostering a sense of belonging and shared history among individuals. Take, for instance, the Statue of Liberty, an emblem of American ideals of freedom and democracy.

Monuments possess spatial-narrative power: “Monuments occupy a special place in the urban environment, which, on the one hand, can be considered as a mechanism for the translation of social memory, and on the other hand, they are a spatial reference and a marker of urban space for both of indigenous population and guests of the megalopolis.”394 Location holds substantial meaning. They are installed in symbolic locations like city centers or historical sites. They integrate into daily life and consciousness—conveying power—centrality—importance. The design and symbolism of monuments tell a story—convey ideology—or represent a historical narrative. They silently communicate these narratives to the public, shaping collective memory and perception of history. Power of presence. Through their permanence and scale they create a sense of awe and respect for their associated meanings. They command attention, ensuring that what they represent is not forgotten.

While monuments hold cultural significance and spatial-narrative power, they can also become sites of contestation, particularly when the narratives they represent are challenged, or when they symbolize oppressive histories. As society evolves, so does the perception of these monuments. Stolen or plundered monuments are trophies of conquest—tangible representations of dominance—the desire to obliterate other historical and cultural memories. In stripping away these symbolic structures, thieves and looters reconfigure the cultural landscape, disrupting the continuity of cultural memory and identity: “The British Museum has been accused of exhibiting ‘pilfered cultural property,’ by a leading human rights lawyer who is calling for European and US institutions to return treasures taken from ‘subjugated peoples’ by ‘conquerors or colonial masters.’”395

XVI. Joy is pleasure accompanied by the idea of something past ...

XVII. Disappointment is pain accompanied by the idea of something past ...

XVIII. Reconstructions are not confined by time or space.

Explanation—Reconstruction has ushered in an era of digital monuments. Archeologists embrace the technology.396 They create highly accurate three-dimensional digital replicas of historical monuments—instead of stealing them. The production of these digital replicas is an exercise in technological prowess—but it also offers new possibilities for cultural preservation, education, and accessibility. Reconstruction ensures that even if a monument is physically lost due to conflict, natural disasters, or the ravages of time, its digital counterpart ensures that its memory and significance will endure. These reconstructions are not confined by physical or temporal boundaries. Digital monuments circulate globally. They transcend geographical boundaries and provide access to individuals who may never have the opportunity to visit the source location. This accessibility fosters broader cultural understanding. While reconstructions offer extraordinary educational and preservation benefits, they should not replace or devalue original artifacts. However, colonialism persists in virtual space397—the replication and dissemination of cultural artifacts without the informed consent of the cultures involved.

XIX. Approval is love towards one who has done good to another.

XX. Indignation is hatred towards one who has done evil to another.

Explanation—The Robot Guerrilla Campaign to Recreate the Elgin Marbles.398

The Elgin Marbles—also known as the Parthenon Marbles—have long been a source of controversy. The Parthenon—meaning maiden—sits atop the Acropolis in Athens, Greece. The highest place. A holy site. Ancient temples and smiling statuary. In 480 B.C., The Sacking of the Acropolis was the climax of a decades-long conflict between Greek city-states and the Persian Empire: “All this the Persians burned. Blood was shed, too. The invaders killed citizens and priests who had taken refuge in the holy places—a slaughter that, for the Greeks, represented an inconceivable violation of sacred law. Later, after the Persians were defeated and the rest of the Athenians returned to their ruined city, the smiling statues were carefully gathered and buried, as if they were people. You can still see the charring on some of them. The attack and destruction scarred the Athenian consciousness in a way that is difficult for us, traumatized though we still are by September 11th, to imagine. A generation passed before the Athenians could bring themselves to rebuild.”399

Construction commenced at the height of the Delian League's influence—447 B.C.: “Shortly before rebuilding on the Acropolis began, Pericles seized the treasury and moved it to Athens, ostensibly for safekeeping. At the time, it was valued at eight thousand talents—roughly $4.8 billion in today’s money, by one estimate. Another six hundred talents, or about three hundred and sixty million dollars, rolled in annually as tribute.”400 By 438 B.C., the structure was fully realized, with ongoing decorative work until 432 B.C. During this period, it functioned as the treasury of the Delian League, which eventually evolved into the Athenian Empire: “It was the first temple in mainland Greece to be built entirely of marble—twenty-two thousand tons of it, quarried about ten miles away and hauled up the Acropolis by sledges, carts, and pulleys. It was also the largest. Most temples in the rather plain architectural style known as Doric have six columns across the front and thirteen down the sides; the Parthenon has eight columns in front and seventeen down the sides. The expanded scale made possible an unprecedented amount of sculptural decoration.”401 The Parthenon frieze is a sculptural band that adorned the exterior of the Parthenon. A mythic perimeter. For centuries the scenes were interpreted as “the great Panathenaic procession held every four years in honor of Athena.”402 In The Parthenon Enigma, Joan Breton Connelly proposes a new interpretation of the Parthenon frieze. It is not a depiction of a celebratory procession. It proceeds toward something else. Something sinister. The founding myth of a site of power: “In Euripides’ telling, King Erechtheus faces war with a rival king, who is also the son of the god Poseidon, and is advised by the Delphic oracle that to save his young city—Athens—he must sacrifice his daughter ... The serene figures depicted on the frieze were participants not in a civic festival but in a sacrifice—a human sacrifice—of the king’s youngest, maiden daughter, the crop-haired child.”403 The ultimate sacrifice.

Next to the main structure of the Parthenon stands the Erechtheion—an exquisite temple that honors King Erechtheus. This structure is famous for its porch—The Caryatid Porch—which consists of six female statues that serve as structural columns, supporting the roof of the temple. Each Caryatid stands approximately seven feet tall and is intricately carved from Pentelic marble, renowned for its quality and purity. The Caryatids are female Atlases. They are priestesses—flowing drapery—one foot slightly advanced—as if in graceful motion. The statues are remarkable examples of the hyperreality of ancient Greek sculpture—capturing life-like geometry and movement in stone—Ur-realism. Each Caryatid supports the entablature of the temple on her head, with her arms gracefully extended to hold the weight. The ingenuity and engineering prowess of ancient Greece. Over millennia the Parthenon underwent rounds of destruction, reconstruction, and functional transformation—serving as a temple, a bank, a church, a mosque, an arsenal, and a museum. Shifting to support the ideology of whoever was in power. It is one of the most reconstructed geometries—in situ and in circulation: “Le Corbusier called it ‘the basis for all measurement in art’ reproduced in every medium and on every scale imaginable, from stone to paper, in tombs, stock exchanges, and courthouses, from a full-size replica in Nashville to the blue-and-white image on millions of takeout coffee cups.”404

During the Siege of the Acropolis in 1687, an explosion severely damaged the Parthenon, destroying forty percent of its original sculptures. Another assault. Shortly after—in the early 19th century—British diplomat Lord Elgin removed about half of the remaining sculptures from the Parthenon, under a vaguely worded Ottoman license. This collection included life-sized figures, metopes, and a large portion of the sculpted frieze—intended to adorn his Scottish country house. The nose of a stolen caryatid was broken off almost immediately, and some pieces were lost in a shipwreck, taking over two years to recover. Lord Elgin sold the marbles to the British Parliament in 1816 after enduring personal losses including his fortune, his wife, and his own nose—“a degenerative infection concentrated in his nose, that prompted Lady Elgin to refuse her husband conjugal privileges and ultimately leave him.”405 He sold the collection for 35,000 pounds. The equivalent today would be around £3.6 million or $4.35 million, which is about half of what he had spent to acquire and transport them. The artifacts were then placed under the trusteeship of the British Museum.406 In Athens, the pedestal of the caryatid removed to London remains empty—awaiting its return.

The campaign to return the Elgin Marbles to Greece started almost immediately after their removal. One of the first critics was the poet Lord Byron, in 1811: “Using the Elgin Marbles as a political symbol of British imperial rapacity, Byron sought to expose Elgin’s professed curatorial concern for the statues as a fraud. For Byron, the idea of an enlightened mission to ‘save’ the marbles simply rationalized the malevolent exercise of British political power. It was not the ravages of time or war that was antiquity’s nemesis in this case, but imperial greed.”407 A contemporary critic, former Massachusetts prosecutor Mr. Michel, equates the British Museum’s retention of the marbles to clinging onto relics of colonial grandeur, and criticizes their inability to educate about Ancient Greek art while acknowledging the artifacts’ emotional significance to Greeks. The trend of returning cultural artifacts to their countries of origin has grown recently, as demonstrated by an Italian museum returning a Parthenon fragment to Athens.408 However, the British Museum and government have largely avoided discussions about returning the Elgin Marbles, with supporters arguing that such restitution could create a problematic precedent and threaten museum collections worldwide. “The Trustees firmly believe that there's a positive advantage and public benefit in having the sculptures divided between two great museums, each telling a complementary but different story.”409

Greek campaigners have argued for their return, maintaining that they were taken without proper consent during the time of the Ottoman Empire’s occupation. The British Museum, supported by successive British governments, rejects these claims, arguing that the marbles were acquired legally: “Lord Elgin's activities were thoroughly investigated by a Parliamentary Select Committee in 1816 and found to be entirely legal. Following a vote of Parliament, the British Museum was allocated funds to acquire the collection.” The Museum has expressed willingness to explore potential loans of the objects, provided the borrower acknowledges the lender’s ownership and agrees to return them: “The Trustees have never been asked for a loan of the Parthenon sculptures by Greece, only for the permanent removal of all of the sculptures in its care to Athens. The Trustees will consider (subject to the usual considerations of condition and fitness to travel) any request for any part of the collection to be borrowed and then returned. The simple precondition required by the Trustees before they will consider whether or not to lend an object is that the borrowing institution acknowledges the British Museum's ownership of the object.”410 This is a trap. Greece must forfeit all claim of ownership for the marbles to return.

Many archaeologists argue that the case for the return of the Elgin Marbles to Greece is compelling, given that the original building from which they were taken still stands. The British Museum counters that “though partially reconstructed, the Parthenon is a ruin. It's universally recognised that the sculptures that still exist could never be safely returned to the building: they're best seen and conserved in museums. For this reason, all the sculptures that remained on the building have now been removed to the Acropolis Museum, and replicas are now in place.”411 Roger Michel, of the Institute of Digital Archaeology—a collaboration between archeologists at Cambridge and Harvard—proposes that reconstruction could offer a solution to resolve the long-standing dispute.

Michel, who initiated this project, intends for the copies to go to the British Museum to promote the repatriation of the original Elgin marbles. “The British Museum’s Deputy Director, Jonathan Williams quipped that ‘people come to the British Museum to see the real thing, don’t they?’ The museum’s curators apparently think not. The two galleries adjoining the Elgin gallery both contain replicas of Parthenon sculptures still residing in Greece. Nearby are plaster casts of the palaces of Xerxes and Darius and the tombs of Sety and Merenptah. In fact, just like at peer museums around the world, copies are currently used throughout the British Museum.”412 Museums contain countless quality facsimiles of significant sculptural works, reflecting a historical appreciation for reproductions. However, a resolution satisfying both the British Museum and the Greek government seems unlikely. Copies and replicas are often viewed as inferior. In March, the museum denied a formal request to scan the pieces. The Institute of Digital Archaeology resorted to stealth—using discreet devices equipped with Lidar sensors and photogrammetry software—to reconstruct the marbles without permission.

The first reconstruction—a marble horse head—was converted to toolpaths for a subtractive robot, which carved the prototype over four days. The marble for this prototype was sourced locally, and the final copy was carved from marble quarried on Mount Pentelicus, the original source of stone for the Acropolis. Michel plans to create more replicas of the Parthenon Marbles complete with restorations and repairs to reflect how the originals would have looked. These changes will account for the damage inflicted on the marbles during an ill-advised cleaning operation in the 1930s, in which British Museum masons stripped away much of the patina. The reconstruction will also have some degree of color restoration, applied by hand in collaboration with Greek experts: “looking at an ancient Greek or Roman sculpture up close, some of the pigment ‘was easy to see, even with the naked eye.’ Westerners had been engaged in an act of collective blindness. ‘It turns out that vision is heavily subjective … You need to transform your eye into an objective tool in order to overcome this powerful imprint’—a tendency to equate whiteness with beauty, taste, and classical ideals, and to see color as alien, sensual, and garish.”413 Writings on the topic of whitewashing monuments can be found across many disciplines and discussions, including post-colonial studies, sociology, history, and cultural heritage studies. The term can refer to both the physical act of cleaning or altering the color of monuments, as well as the metaphorical act of erasing or glossing over historical injustices or uncomfortable truths.414

Some archaeologists, while supporting the repatriation of the marbles, expressed concerns about the project’s source of funding, the lack of public consultation, and perceived echoes of British imperialism. Questions have also been raised about who the replicas serve and their political implications, especially when artifacts are seen as symbols of nationalism and state power. The Greek government has been silent on the replica project, causing unease among some scholars. Bernard Means, director of the Virtual Creation Lab at Virginia Commonwealth University, said such a project should only be undertaken with the consultation and full support of Greece, suggesting that proceeding otherwise indicates a colonial mindset.415 And while these guerrilla reconstructions have produced a new kind of pressure, the British Museum continues to discuss the matter in terms of a loan. The director of the Acropolis Museum, Nikolaos Stampolidis, reflected: “in the difficult days we are living in, returning them would be an act of history. It would be as if the British were restoring democracy itself.”416

XXI. Heritage is public domain.

XXII. Heritage as property is our inheritance.

Explanation—Morehshin Allahyari is an Iranian artist, activist, educator, and curator who challenges the dominant narratives around concepts like heritage and ownership, particularly as they intersect with digital technology. One of her projects—Material Speculation: ISIS—involves the reconstruction of artifacts destroyed by terrorist violence. These reconstructions are 3D printed and embedded with a flash drive and a memory card. Each flash drive contains data about the artifact—gathered texts, images, videos, and maps. Allahyari condemns terrorist and institutional violence alike. Digital Colonialism, as conceptualized by Allahyari, is a critique of the power structures that exist in the production, archiving, and distribution of data and digital artifacts, particularly those from non-western cultures. Western corporations and institutions often control access to digital forms of cultural artifacts—mirroring historical forms of colonialism—where artifacts were removed from their original cultural contexts and displayed in European museums. Digital decolonialism involves critically examining how these technologies are used and who they serve. Unfurling—not reinforcing—existing power structures—as a means of resistance. Allahyari sees technology not just as a tool but as liberatory potential: “She Who Sees The Unknown (2017-2021) is a research-based project by Morehshin Allahyari, that uses 3D simulation, sculpture, archiving, and storytelling to re-figure monstrous female/queer figures of Islamicate origin; using the traditions and myths associated with them to explore the catastrophes of colonialism, patriarchism and environmental degradation in relation to the Middle East.”417

XXIII. Is it possible to repatriate a reconstruction?

Explanation—Repatriation generally refers to the process of returning objects, such as cultural or historical artifacts, to their country of origin—or to the community with which they hold a significant cultural connection. Take the inverse of the marbles proposal—repatriating a reconstruction. If a reconstruction accurately and respectfully represents an original artifact that has been lost, stolen, or destroyed, repatriating that reconstruction could potentially allow a community to reconnect with its lost heritage in a meaningful way. This might involve transferring a digital or physical replica of the artifact back to the community, or it could involve sharing the knowledge and resources needed to create a reconstruction locally. However, a reconstruction is not the same as the original artifact. Even the most accurate and detailed reconstruction is an interpretation, created with contemporary tools, materials, and knowledge. The intangible aspects—the history, the stories of those who created and used the original artifact, and the spiritual or cultural significance attached to the artifact—cannot be fully recreated. A reconstruction is not a substitute for the original.

XXIV. Monuments are public memory.

Explanation—Monuments primarily serve as symbols of public memory, reflecting the cultural, political, or social ideals of a society at the time of their creation. They typically celebrate victories, pay homage to notable figures, or symbolize shared beliefs. Erecting a monument is often an act of affirming a dominant narrative—a projection of power. Monuments embody the ethical mode of Reconstruction. They materialize and solidify specific narratives and identities in the public sphere. Take the Lincoln Memorial: “In this temple as in the hearts of the people for whom he saved the Union, the memory of Abraham Lincoln is enshrined forever.”418 Savior of the Union. A symbol of freedom and the struggle against slavery. A reconstruction of Lincoln’s vision for America.

XXV. Monuments affirm the rules of power.

XXVI. Memorials are reconstructions of loss.

Explanation—Memorials traditionally hold a commemorative function, often acknowledging and mourning loss or tragedy. Memorials serve as sites for collective remembrance and reflection, encouraging contemplation about past events or individuals. Rather than predominantly celebrating or affirming a dominant narrative, memorials aim to facilitate healing, understanding, and reconciliation. The Vietnam Veterans Memorial—for instance, does not glorify war or assert nationalistic pride but remembers and honors those who served and died in the Vietnam War. It offers space for visitors to mourn, reflect, and connect with the names inscribed on its reflective surface, “these names, seemingly infinite in number…”419 

XXVII. Reconstructions change.

Explanation—Monuments and memorials may intersect; they are not mutually exclusive. And these public structures’ meanings and interpretations are not static. They evolve over time, influenced by changing societal attitudes, historical perspectives, and ethical norms. Debates surrounding Confederate monuments in the U.S. reevaluate historical reconstructions. These symbols are increasingly seen as elevating a history of slavery and racial discrimination.

XXVIII. Reconstruct memorials that elevate the vulnerable.

Explanation—Before Charleston, South Carolina, fell in 1865, Hampton Park became an outdoor prison for Union soldiers: “More than 250 prisoners died and were buried in mass graves. After Confederate evacuation, Black ministers and northern missionaries led an effort to reinter bodies and build a fence around a newly established cemetery. Over the entrance, workmen inscribed the words, ‘Martyrs of the Racecourse.’ On May 1, 1865, a group of newly freed Black people gathered at what is now Hampton Park to put decorations on the graves of the Union soldiers … They sang songs and they made speeches, and this was covered not only in Charleston but New York newspapers and this is credited as being the first Memorial Day.”420 The first Memorial Day was in the lowcountry.

Lowcountry is a documentary that dives into the complexity of interracial relations in Charleston, in the aftermath of the Emanuel AME Church Shooting. The film captures a community rocked by violence, as it wrestles with grief, grapples with injustice, and moves towards healing. The tragic event took place on June 17, 2015, when a white supremacist entered the Emanuel African Methodist Episcopal Church and killed nine African-Americans during a prayer service (I.xiv.). The incident sent shockwaves through the nation, igniting a broader conversation about racial hatred, prejudice, and the urgent need for change. Lowcountry explores this tragedy not just as an isolated event, but as a reflection of the historical racial tensions that have shaped the city of Charleston and the larger American society. It dissects the aftermath of the massacre, the community’s collective grief, and the resilience demonstrated in the face of such a horrific event. The filmmakers explore how “Charleston’s genteel reverie was shattered by shootings that exposed the underbelly of the city’s tourist mythology. Can black and white residents arrive at conciliation or will immutable Southern politeness censor what is needed to initiate such a process? What are the conditions for healing in a city averse to truth-telling?”421

As part of the production of Lowcountry, the filmmakers captured and reconstructed models of many official Charleston monuments as well as the softer structures of memorial. There was an outpouring of global support in the wake of the Emanuel AME Church shooting. Countless objects—letters, prayers, and quilts—sent by individuals from around the world—tangible offerings of solidarity and love. When reconstructing the quilts, the filmmakers captured them folded—shielding their messages. For community eyes only—obscured. Within institutionalized slavery, quilts contained hidden messages of resistance and resilience. Quilts were often used as secret maps and instructional tools to aid enslaved individuals in their quest for freedom. Encoded within the intricate patterns and designs of these quilts were directions, symbols, and messages that guided runaways along the Underground Railroad.422 Artist and scholar Romi Morrison has written extensively about this form of covert communication, revealing the depth and complexity of these encodings and their pivotal role in facilitating escape: “The Freedom Quilts generate a type of code that doesn’t execute automatically but makes the act of interpretation explicit. While encountering quilts left in public fugitives would discern the code and simultaneously have to read it in context, within the geography of placement. In this instance the executability of code is halted as a declarative axiomatic language imagined within syntax. Code is not an absolute instruction but is read in addition to landscape.”423

The production of Lowcountry took place against the backdrop of monuments to confederate ideology—and quieter markers to local abolitionists and civil rights activists—often underrepresented in the city’s narrative. The monuments—and memorials—throughout the city serve as powerful reminders of Charleston’s past struggles and achievements in the fight against racism and inequality: “Since the Charleston Church shooting, more than 300 Confederate symbols have been removed, including 170 monuments. As deadly violence against the Black community continues, Pinckney hopes that as monuments come down, the movement offers the opportunity for people nationwide to understand that Confederate symbols have served to terrorize the Black community since they first began to be put in place after the Civil War.”424 Gleaming white Confederate monuments still mount the terrain. Resistance symbols are hard-won. There was always a lack of bureaucratic support for African American monuments. And sites of community and remembrance have always been vulnerable—African-American graveyards have been systematically destroyed in the constructions—and reconstructions—of the city.425 Nowhere to remember.

In 2018, The Legacy Museum opened in Alabama: “The Legacy Museum: From Enslavement to Mass Incarceration is situated on a site in Montgomery where Black people were forced to labor in bondage. Blocks from one of the most prominent slave auction spaces in America, the Legacy Museum is steps away from the rail station where tens of thousands of Black people were trafficked during the 19th century.”426 Then, in 2023, a Presidential Initiative was announced—The Monuments Project:

The Monuments Project is an unprecedented $250 million commitment by the Mellon Foundation to transform the nation’s commemorative landscape by supporting public projects that more completely and accurately represent the multiplicity and complexity of American stories. Launched in 2020, the Monuments Project builds on our efforts to express, elevate, and preserve the stories of those who have often been denied historical recognition, and explores how we might foster a more complete telling of who we are as a nation. Grants made under the Monuments Project will fund publicly oriented initiatives that will be accessible to everyone and promote stories that are not already represented in commemorative spaces. While funds may support new monuments, memorials, and historic storytelling places, not every project will be a statue or permanent marker but may be realized as ephemeral or temporary installations or other nontraditional expressions of commemoration that will expand our understanding of what a monument can be. Mellon will also support efforts to contextualize or recontextualize existing commemorative sites and to uplift knowledge-bearers who can tell stories that have not yet been told.

The Mellon Foundation is also providing lead support to MONUMENTS “a new exhibition co-organized by LAXART and The Museum of Contemporary Art Los Angeles (MOCA) that will open in Fall 2025 at LAXART and The Geffen Contemporary at MOCA. Curated by LAXART Director Hamza Walker, internationally renowned artist Kara Walker (no relation, that they know of), and MOCA’s Senior Curator, Bennett Simpson, MONUMENTS will feature decommissioned Confederate monuments displayed alongside existing and newly commissioned works of contemporary art. MONUMENTS will be accompanied by a substantial scholarly publication and a robust slate of public and educational programming.”427 The exhibition is still in development. Not yet realized.

XXIX. Monuments—exist in virtual reality.

Explanation—Currently, the government offers Virtual Tours of United States Veterans and War Memorials.428 A handful of organizations and individuals produce virtual reconstructions—Honor Everywhere: Virtual Reality Veterans Experience429—Civil War 1864: A Virtual Reality Experience430—I Am A Man—Traveling While Black431—1000 Cut Journey.432 Others produce virtual realities that throw off the tyranny of photoreal reconstruction—NeuroSpeculative AfroFeminism: “NeuroSpeculative AfroFeminism is an ambitious and richly imagined project by Hyphen-Labs, a global team of women of color who are doing pioneering work at the intersection of art, technology, and science. The project consists of three components. The first is an installation that transports visitors to a futuristic and stylish beauty salon. Speculative products designed for women of color are displayed around the space, including a scarf whose pattern overwhelms facial recognition software, and earrings that can record video and audio in hostile situations. The second part of NeuroSpeculative AfroFeminism is a VR experience that takes place at a “neurocosmetology lab” in the future. Participants see themselves in the mirror as a young black girl, as the lab owner explains that they are about to experience cutting edge technology involving both hair extensions and brain-stimulating electrical currents. In the VR narrative, the electrodes then prompt a hallucination that carries viewers through a psychedelic Afrofuturist space landscape.”433

XXX. Reconstructions preserve history.

Explanation—The preservation aspect of this initiative is crucial in the face of increasing threats to cultural heritage sites, making virtual reality a tool for both the democratization and conservation of our monumental heritage. However, there is the potential for—the inevitability of—distortion and oversimplification of history. Virtual reality’s immersive and interactive nature can create compelling experiences, but it also simplifies and omits. This can lead to skewed or partial understanding. This is particularly problematic with monuments that have contentious or layered histories. Although VR theoretically allows anyone to visit these monuments, the reality is that many people globally do not have access to these technologies. Projects like MasterWorks: Journey Through History by CyArk, which digitize and preserve cultural heritage sites, raise questions about ownership and control. Who has the right to digitize these sites, and who decides how they are represented or interpreted?

These questions are particularly complex when dealing with monuments that are culturally sensitive or sacred to certain communities. Devices and user interfaces impose technological landmarks on virtually reconstructed sites. The interaction layer is more game than reverence. This gamification is seen in geolocation augmented reality applications like Pokémon Go, where real-world monuments become virtual PokéStops and Gyms—trivializing their cultural and historical significance. Striking a balance between play and respect is a delicate task. It necessitates community consultation. Alejandro G. Iñárritu’s Carne y Arena is a virtual reality installation that recreates the intense conditions faced by refugees on their journey towards the United States: “During the past four years in which this project has been growing in my mind, I had the privilege of meeting and interviewing many Mexican and Central American refugees. Their life stories haunted me, so I invited some of them to collaborate with me on the project. My intention was to experiment with VR technology to explore the human condition in an attempt to break the dictatorship of the frame, within which things are just observed, and claim the space to allow the visitor to go through a direct experience walking in the immigrants’ feet, under their skin, and into their hearts.”434

XXXI. Monuments are erected in augmented reality.

Explanation—Augmented Reality—AR—is a technology that overlays digital information or imagery onto the physical world, providing an interactive experience that merges the real and the virtual.

XXXII. Subversive monuments augment reality.

Explanation—Artists use reconstructions to erect subversive monuments in real space. Subversive augmented reality monuments challenge traditional narratives and power structures through virtual installations. They use technology to disrupt or question dominant ideas and perspectives. Nancy Baker Cahill is known for using augmented reality to create public art in unexpected places. Her project, Liberty Bell, is an augmented reality, sound-reactive public artwork that examines the concept of freedom (V.). It has been placed at multiple historical sites of liberation and oppression across the United States. A ghostly bell—hidden in plain sight—ringing out.435 Kambui Olujimi explores the reconstruction of historical narrative in Skywriters & Constellations: “Immersive and unique in its form and process, ‘Skywriters’ (2018) is an animated collage of time and space projected onto the night sky of the Planetarium’s dome. Using full dome technology, Olujimi achieves dramatic shifts of scale and stunning visual effects that animate Wayward North. Olujimi creates his figural imagery by stitching together an encyclopedic range of film clips—earth, sky, street scenes, and microscopic views of natural and manmade materials.”436 These narratives provide different perspectives on the socio-political landscape, encouraging viewers to rethink their understanding of history and reality. John Craig Freeman produced the augmented reality piece, Border Memorial: Frontera de los Muertos. This project is an AR monument dedicated to the thousands of migrant workers who have died along the U.S./Mexico border in recent years. It uses technology to bring visibility to a critical and often overlooked issue. Geolocated skeletons haunt the landscape.437 Tamiko Thiel and /p—Zara Houshmand—created Unexpected Growth—a dystopian vision of a future affected by climate change and pollution. Installed at the Whitney Museum in New York, the AR project shows hybrid bio-plastic coral-like structures growing on objects in the museum. Unexpected mutations in sea life. Underwater.438 These artists demonstrate the power of reinserting reconstructions into reality to question—provoke—disrupt.

XXXIII. Emulation is the desire of something, engendered in us by our conception that others have the same desire.

Explanation—He who runs away, because he sees others running away, or he who fears, because he sees others in fear; or again, he who, on seeing that another man has burnt his hand, draws towards him his own hand, and moves his body as though his own were burnt; such an one can be said to imitate another’s emotion, but not to emulate him; not because the causes of emulation and imitation are different, but because it has become customary to speak of emulation only in him, who imitates that which we deem to be honorable, useful, or pleasant.

XXXIV. Imitations and emulations—the terrain of simulation.

A simulation is a computational or physical model that emulates a real-world system or scenario. It is often used when conducting experiments on the actual system would be impractical, dangerous, or impossible. A simulation replicates essential aspects of the real world in a controlled environment, enabling study, prediction, and scenario testing. Pilots are trained using flight simulators that emulate real flying conditions, while surgeons use simulations to practice complex procedures and predict outcomes. Simulations of atmospheric conditions predict future weather patterns. In finance, simulations assess potential investment outcomes. Engineers use simulations to test and optimize designs. Physicists and biologists create simulations to study phenomena that cannot be directly observed or experimented upon, like the formation of galaxies or the evolution of species. A simulation typically involves constructing a mathematical model that represents the system, defining the rules and parameters that govern its behavior, and then running the model on a computer to see how the system behaves over time or under different conditions. The output can be a single result, a range of possible outcomes, or an interactive experience. The effectiveness of a simulation is heavily dependent on the accuracy of the model and the realism with which it replicates the real-world system (V.).
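
A minimal sketch, in Python, of the loop just described: construct a model, set its parameters, run it repeatedly, and report a range of outcomes. The stochastic growth model and its numbers are illustrative assumptions, not any particular real-world system.

    import random

    def simulate(years, mean_growth=0.03, volatility=0.10, start=100.0):
        # One run of a toy stochastic growth model: each year the value
        # changes by a normally distributed growth rate.
        value = start
        for _ in range(years):
            value *= 1 + random.gauss(mean_growth, volatility)
        return value

    random.seed(0)                                  # reproducible experiment
    runs = sorted(simulate(years=30) for _ in range(10_000))
    low, median, high = runs[500], runs[5_000], runs[9_499]
    print(f"5th pct {low:.0f} | median {median:.0f} | 95th pct {high:.0f}")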

XXXV. Reconstructions extend reality.

The power of reconstruction extends far beyond aesthetics. Reconstructions hold profound potential to shape both narrative and action, serving as bridges that connect past and present, memory and reality, tragedy and resilience. Through the careful and empathetic use of reconstruction technologies, suppressed histories can be brought to the forefront, silenced voices can be given a platform, and narratives of oppression can be rewritten into narratives of hope and resistance. The process of reconstruction must be handled with sensitivity and respect, ensuring that it does not trivialize or exploit the experiences of those who have suffered or continue to suffer. How to protect against trivialization? Exploitation? Deceit?

XXXVI. Reconstruction enables deception.

XXXVII. Deception is warfare and magic.

XXXVIII. Heritage is weaponized as a right-wing think tank.439

XXXIX. Force things to stay the same.

XL. Or face the uncertainty of change.

XLI. The author is a Trojan horse.440, 441

Explanation—The Trojan Horse is one of the most enduring myths of deception in our cultural lexicon. It is a cautionary tale about the potential of duplicity to wreak havoc—catastrophe. The story, which hails from ancient Greek literature, narrates how—during the Trojan War—the Greeks cunningly presented the Trojans with a giant wooden horse—ostensibly as a peace offering—a gesture of their surrender. The seemingly harmless horse, however, was hollow. Hidden within—a cohort of Greek soldiers—poised to attack. Upon admittance into the fortified city of Troy, the soldiers emerged under the cover of night and overpowered the unsuspecting citizens. The myth underscores how deception can breach even the most formidable defenses. The Trojan Horse has since become a symbol of deceit—a metaphor for a seemingly innocuous element or action that carries within it the seeds of ruin:

After many years have slipped by, the leaders of the Greeks,

opposed by the Fates, and damaged by the war,

build a horse of mountainous size, through Pallas’s divine art,

and weave planks of fir over its ribs

they pretend it’s a votive offering: this rumor spreads.

They secretly hide a picked body of men, chosen by lot,

there, in the dark body, filling the belly and the huge

cavernous insides with armed warriors.

[...]

Then Laocoön rushes down eagerly from the heights

of the citadel, to confront them all, a large crowd with him,

and shouts from far off: ‘O unhappy citizens, what madness?

Do you think the enemy's sailed away? Or do you think

any Greek gift is free of treachery? Is that Ulysses's reputation?

Either there are Greeks in hiding, concealed by the wood,

or it's been built as a machine to use against our walls,

or spy on our homes, or fall on the city from above,

or it hides some other trick: Trojans, don’t trust this horse.442

A Trojan Horse is malicious software—a malware twin—disguising itself as a legitimate and harmless program or file to deceive users into executing it. Unlike viruses or worms, Trojans do not replicate themselves but rely on social engineering techniques to trick users into running them. Once a Trojan is executed, it can carry out various malicious actions on the infected computer without the user’s knowledge. Stealing sensitive data. Spying on user activities. Modifying or deleting files. Installing other malware. Creating backdoors for remote access. Trojan horses often enter a system through deceptive means, such as hiding in seemingly harmless attachments, fake software downloads, or infected websites. They exploit vulnerabilities in the operating system or other software to gain unauthorized access.
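
One standard defense against disguised software is to verify a file’s cryptographic digest against the value published by its legitimate source. A generic sketch in Python; the filename and expected digest are placeholders (the digest shown is that of an empty file, so the self-contained sketch verifies cleanly when run).

    import hashlib

    def sha256_of(path, chunk=65536):
        # Stream the file through SHA-256 without loading it whole.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    # Placeholder download and placeholder published digest (the SHA-256
    # of an empty file, chosen so this sketch runs end-to-end).
    open("installer.bin", "wb").close()
    EXPECTED = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

    if sha256_of("installer.bin") == EXPECTED:
        print("digest verified: the file matches what its publisher declared")
    else:
        raise SystemExit("digest mismatch: the file is not what it claims to be")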

XLII. Industrial deception perpetuates the original colonial project—extract labor, sell sugar.

Explanation—The Teapot Dome scandal—which came to light in the early 1920s—was regarded as the most significant political scandal in U.S. history until Watergate. A Trojan horse. The scandal involved the secret leasing of federal oil reserves by Albert B. Fall, who was serving as Secretary of the Interior under President Warren G. Harding. The two oil fields, one located in Teapot Dome, Wyoming, and the other in Elk Hills, California, had been set aside for the U.S. Navy’s use in case of a national emergency. Fall managed to transfer control of these reserves from the Navy to his Department of the Interior, and he then secretly leased the land to private oil companies. In exchange for these illegal transactions, Fall received considerable personal gifts and loans from the oil companies involved. When the details of this scandal came to light, it led to a Senate investigation, and Fall was ultimately found guilty of bribery, marking the first time a U.S. cabinet official had been convicted of a felony while in office. The Teapot Dome scandal stands as a classic example of industrial deception, showcasing the potential for corruption when government officials and corporate interests collude for personal gain, often at the expense of public resources. It demonstrated how corporate influence could manipulate and exploit government processes, thereby breaching public trust and skewing public policy in favor of industry, rather than the citizens it was meant to serve. The scandal led to increased public awareness and demand for transparency in government and corporate affairs, and has since served as a cautionary tale about the need for stringent regulations and checks on the intersection of political power and corporate influence.

Fall’s actions in the Teapot Dome scandal led to a significant public outcry and increased calls for greater transparency and accountability in government. Despite these calls, Fall played a significant role in dismantling industrial regulation during his tenure as Secretary of the Interior. His direct impact was not in favor of increased regulation but rather the loosening and deregulation of colonial industries—Mining, Railroad, Oil, Food & Drugs.443 In The Hacking of the American Mind, Robert Lustig—a neuroendocrinologist and pediatrician—explores how these industries—particularly Food & Drugs—and Emerging Tech—have been manipulating the human mind and contributing to the rise of ruinous societal issues. The central premise of the book is that our brains are being hacked—manipulated—by the excessive consumption of processed foods and processed images. Short-term pleasure and instant gratification in exchange for freedom. Lustig argues that the overconsumption of sugar triggers addictive responses in the brain—generating epidemics. He delves into the science of how sugar affects brain chemistry, leading to changes in dopamine and other neurotransmitters, which can drive compulsive behavior. Food & Drug subsidies—and marketing—create a deceptive system of value and profit—flooding the market with poison. Lustig discusses how digital technology has been similarly designed to exploit our psychological vulnerabilities and keep us hooked to our screens. He explores how online interactions create a sense of validation and reward in our brains, leading to addictive behaviors. He argues that the combination of excessive sugar consumption and overuse of digital technology has led to a society plagued by chronic stress, depression, anxiety, and illness. He emphasizes the need for individuals to be aware of these manipulative tactics and take proactive steps to reclaim control over their minds and bodies.444

XLIII. A teapot is a colonial projection (I.xv.).

XLIV. Ambition is the immoderate desire of power.

Explanation—Industrial deception.

XLV. Luxury is excessive desire, or control.

XLVI. Luxury causes physical and mental illness. Insulin Resistance.

Explanation—Addiction.

XLVII. Reconstructions can be used to simulate parameter changes. Possible futures.

XLVIII. Subsidies are controls.

Explanation—Subsidized is a game—a simulation—that reconstructs the logic of subsidy systems and plays out potential parameter changes.445 

GENERAL DEFINITION OF DECEPTION

Deception refers to the act of intentionally causing another person to believe something that is not true or not the whole truth. This can be achieved through lying, misleading, hiding important information, or presenting false information: “How can one keep from destroying oneself through guilt, and others through resentment, spreading one's own powerlessness and enslavement everywhere, one’s own sickness, indigestions, and poisons? In the end, one is unable even to encounter oneself.”446 While deception often has a negative connotation because it undermines trust and can lead to harm or unfairness, it’s important to note that it can also be used in benign ways, such as surprise parties, magic tricks, and some forms of entertainment where the objective is to create a delightful illusion or mystery rather than cause harm.

Central to Descartes’ philosophical inquiry was the concept of Deus Deceptor—or the deceiving God—which he proposed as a hypothetical construct to question the reliability of our senses and perceptions. Descartes sought to establish a foundation of certain knowledge by doubting everything that could be doubted, including sensory perceptions and beliefs. He proposed that there might be an omnipotent and malevolent being, the Deus Deceptor, who deceives us and leads us to hold false beliefs about the external world. This hypothetical construct was central to Descartes’ Meditations on First Philosophy, where he employed the method of doubt to arrive at indubitable truths. The Deus Deceptor concept was instrumental in Descartes’ efforts to build a solid epistemological foundation. By entertaining the possibility of a deceiving God, Descartes underscored the importance of clear and distinct ideas as the only reliable basis for knowledge. The act of doubting itself, according to Descartes, proves the existence of a thinking self—the first indubitable truth and the starting point for his philosophical system. The idea of an all-powerful deceiver, initially introduced by Descartes, underwent an evolution in the works of later philosophers, leading to the brain in the vat thought experiment. This thought experiment gained prominence in contemporary philosophy and cognitive science as a compelling exploration of skepticism and the nature of reality. The brain in the vat thought experiment imagines a scenario in which a brain is removed from a body and placed in a vat, connected to a sophisticated computer system that stimulates it with artificial sensory inputs. These simulated experiences are indistinguishable from real experiences, and the brain is deceived into believing that it is interacting with the external world.447 The brain in the vat scenario raises profound epistemological questions about the nature of knowledge, reality, and the reliability of our sensory perceptions. If the brain in the vat cannot differentiate between simulated experiences and genuine experiences, how can we be certain that our own experiences are not similarly simulated or manipulated? The thought experiment also touches upon the broader philosophical topic of solipsism, which posits that only one's own mind is sure to exist. Solipsism challenges the notion of an external, independent reality, suggesting that everything could be an elaborate illusion created by one's mind or a deceiving entity. The thought experiment raises issues concerning the nature of consciousness, the relationship between mind and body, and the limits of human cognition. Furthermore, the brain in the vat concept has found relevance in discussions about the nature of virtual reality and the ethical implications of advanced technologies that can manipulate human experiences and perceptions.448

Melodic death metal. Doubt is necessary in the face of evil genius449—post-truth reconstructions—but the Cartesian worldview can also lead to indifference. If the world is not real, why invest energy in protecting it? Disinformation is “adversarial narratives that create real world harm.”450 Uncertainty is destabilizing: “the spread of false, misleading and inaccurate news threatens democracy globally. In response, researchers, non-profit organizations and media companies have sought to develop techniques to detect mis- and disinformation but fact-checking, while important, is not enough. Fact-checking sites lag behind the deluge of rumors produced by global disinformation networks and spread via private interactions.”451 With the acceleration of artificial intelligence—disinformation becomes an existential threat.

Selection is revealing.

Part IV explores—Acceleration—the ethical framework that guides government, corporate, and individual development and application of Capture and Reconstruction technologies. 











PART IV.

OF TECHNOLOGICAL ACCELERATION, OR THE ETHICS OF SUPERVISION

PREFACE

Acceleration is the dominant ethical framework governing the development and application of Capture and Reconstruction. Push technology as far and fast as possible. No limit. No exit. Accelerationism, as a field, has ironically taken an oscillating rather than asymptotic trajectory, emerging during periods of economic extremity and receding during periods of stability. This makes sense in light of the fact that it is fundamentally a theory that revolves around Capitalism. The waveforms of Capital. Volatile economic conditions—short wavelengths—draw urgent attention to the dynamic forces that control labor and value. The ur-text of this particular discourse, then, is undoubtedly Karl Marx’s Das Kapital. Marx, identifying the force of automation, lays the groundwork for future examinations of technocapital: “… to the degree that large industry develops, the creation of real wealth comes to depend less on labor time and on the amount of labor employed than on the power of the agencies set in motion during labor time, whose ‘powerful effectiveness’ is itself in turn out of all proportion to the direct labor time spent on their production, but depends rather on the general state of science and on the progress of technology, or the application of this science to production.”452

The springboard from marxist visions of the collapse of capitalism—toward a more radical call for its mutation—is Deleuze and Guattari’s provocation in Anti-Oedipus: Capitalism and Schizophrenia: “Which is the revolutionary path? … To withdraw from the world market? … Or might it be to go in the opposite direction? To go still further, that is, in the movement of the market? … Not to withdraw from the process, but to go further, to ‘accelerate the process.’”453 Although this provocation is often taken out of context to encourage the wholesale and largely uncritical embrace of automation, it introduces the possibility of systemic transformation. In Energumen Capitalism, Jean-François Lyotard clarifies a shared interpretation of “capitalism as metamorphosis, with no extrinsic code, having its limit only within itself, a relative, postponed limit (which is the law of value)”454 and explains that “the potential of force is not a potential to produce something more, but a potential to produce something other, in other ways.”455 Accelerationism may immediately call to mind time and speed—it is more accurate, however, to think of it in Deleuze and Guattari’s terms—as deterritorialization. Accelerationism is focused on identifying and conjuring emancipatory lines of flight—flows of change—utter transfiguration.

According to Accelerationists, one of the movement’s primary methods of catalyzing change is science fiction. James Ballard observes, for example, that “what the writers of modern science fiction invent today, you and I will do tomorrow, or, more exactly, in about ten years’ time, though the gap is narrowing.”456 In the mid-90s, The Cybernetic Culture Research Unit at Warwick University took up this assertion with all seriousness, producing a massive collection of theory-fiction texts. The group—including Sadie Plant, Mark Fisher, and Nick Land—updates the accelerationist trajectory with formal cybernetics—and then injects it with vivid science-fictive elements. They argue that rather than diagnose or criticize extant conditions, their speculative texts have the power to seed new futures. They proposed the term hyperstition for narratives able to effectuate their own reality: “In the hyperstitional model … fiction is not opposed to the real. Rather, reality is understood to be composed of fictions—consistent semiotic terrains that condition perceptual, affective and behavioral responses. Writing—and art in general—[is construed] not aesthetically, but functionally—that is to say, magically, with magic defined as the use of signs to produce changes in reality.” They propose that these narratives actually come from the future, “fictional quantities functioning as time-traveling potentials,”457 and trigger long-range positive feedback loops that overhaul culture.

Science fiction narratives and capitalism are symbiotic—they feed each other progress and expand the boundaries of what can be subsumed in a framework of accumulation: “Capitalism is not a human invention, but a viral contagion, replicated cyberpositively across post-human space. Self-designing processes are anastrophic and convergent: doing things before they make sense. Time goes weird in tactile self-organizing space: the future is not an idea but a sensation.”458 The members of the CCRU offer themselves up as hosts for hyperstition’s symbiotic pair—letting ideas from the future flow—through their writing—into the present. Their projections are neither gleaming nor bleak, but exhilaratingly chaotic. There is no waiting for deliverance from harm—only a growing pressure to imagine possible ways of life in the ruins.

Benjamin Noys, who coined the movement’s name but remains critical of its approach, concedes that its projective strategies are what make it highly seductive: “try and break the appeal of acceleration.”459 He warns that it “gives over to capital a monopoly on our imagination of the future as the continuing intensification of accumulation and the reinforcement of the capitalist continuum.”460 Noys argues that like capitalism’s regenerative power, accelerationism “is an aesthetics or practice of liquefaction that can temporarily solidify to activate force, before dispersing again into new liquid immanent forces.”461 He demands “a restoration of the sense of friction that interrupts and disrupts the fundamental accelerationist fantasy of smooth integration.” Friction can come from external regulation and it can also emerge from internal dissent.

Everyone working in tech is an accelerationist. One workstream of many feeding into a process that speeds up change with invention and automation. It is important, then, that we consciously evaluate the vector of development we are pursuing. What values do tech corporations hold? Do we subscribe to corporate values? How are we directing our creativity and engineering? The Accelerationist discourse reflects and influences the values of tech executives—corporations—and terrorists.462 It is a discourse explicitly organized around political ideologies. The major divisions within the field are: Left accelerationism—Right accelerationism—Gender accelerationism—Black accelerationism—and Unconditional accelerationism. Future agendas. All of these agendas are preoccupied with the emergence of a new kind of subject and citizen within the totalizing framework of technocapital. What are the laws and practices that regulate human and machine action within these systems? What is the function of corporate ethics? What is individual agency?

DEFINITIONS.

I. By L/acc, I mean Left Accelerationism.

II. By R/acc, I mean Right Accelerationism.

(Concerning these terms see the foregoing preface towards the end.)

III. By G/acc, I mean Gender Accelerationism.

IV. By B/acc, I mean Black Accelerationism.

(In V. xvii. note. i., G/acc and B/acc strategies intersect with Posthumanism and New Materialism.)

V. By U/acc, I mean Unconditional Accelerationism.

VI. By Accelerationism, I mean: “Jettison the prospects of salvation or failure! What is needed is an explosion of designs, speculative ways through and out, even if out is ultimately out of the question.”463 In Maximum Jailbreak, Benedict Singleton argues that “escape is the material with which design works. It is the enemy of stasis, even when the latter appears as motion but only as reiteration; a project of total insubordination towards existing conditions; a generalized escapology.”464 

VII. By an end, for the sake of which we do something, I mean a desire.

VIII. Acceleration is the project of escaping current and future conditions.

AXIOM.

Goals and weights determine the direction of change.
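
Read in the idiom of machine learning, the axiom describes gradient descent: a goal (a loss function) and the current weights jointly fix the direction of change. A one-parameter sketch in Python, with an illustrative quadratic goal.

    def loss(w, target=3.0):
        # The goal: squared distance from a desired value.
        return (w - target) ** 2

    def grad(w, target=3.0):
        # Derivative of the goal with respect to the weight.
        return 2 * (w - target)

    w, lr = 0.0, 0.1
    for _ in range(50):
        w -= lr * grad(w)          # goal and current weight jointly set the update
    print(round(w, 4), round(loss(w), 8))  # the weight approaches 3.0 as the loss falls toward 0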

PROPOSITIONS.

Proposition I. The central accelerationist discourse mirrors the arguments of Left \ Right / democracy.

Proof.—Accelerationism is characterized by a tension between Left and Right agendas, which almost comically clings to party lines and shuttles us towards an event horizon of debilitating political resignation.

Note.—While Nick Land was perhaps the most influential thinker in the CCRU, his ideas became increasingly conservative and, for a time, fell into oblivion. Yet, in the shadow of the 2008 financial crisis, a young theorist, Alex Williams, resurrected and popularized accelerationism in the context of a liberal agenda. In his first post on the subject, he contends that while Capitalism intersects with humanity, it is ultimately not for us and should be analyzed according to “an anti-anthropomorphic cartography, a study in alien finance, a Xenoeconomics.”465 This statement strongly resembled Land’s earlier theory-fiction interventions; however, in online conversation with Mark Fisher, a former member of the CCRU, Williams quickly revealed his absolute departure from Land, articulating his certainty that human agency plays a crucial role in unleashing the latent forces of capital.466

Proposition II. L/acc—Enactments of human agency make all the difference.

Proof.—Williams differentiates between what he calls weak accelerationism and strong accelerationism. He dismisses ameliorative actions on the left that simply stall crises of capitalism and simultaneously pushes against the strategy of inducing collapse in service of revolution. Rather, he promotes strong accelerationism, which “radically alters the nature of the processes of capital itself … a radical mutation of the system.”467 Again, for Williams, this mutation is contingent on strategic human action.

Proposition III. R/acc—Only the agency of technocapital matters.

Proof.—Land, on the other hand, insists that human agency has no place in the matter. He endows capitalism, rather than humanity, with the subject position, and most captivatingly asserts that capitalism -is- artificial intelligence.468 It operates according to a logic of production and profit, which exists at an order of magnitude beyond human conceptualization and control. He imagines that this AI was introduced in the 16th century with the shift of laboring subjects from feudal serfs to wage-workers, and has continued to infiltrate, adapt to, and manipulate all corridors of human life for the last six centuries.

Proposition IV. Technocapital does not serve humans, it has its own agenda.

Proof.—Land maintains that capitalism is not for the primary benefit of human players. Its desires supersede human concerns. Land is a staunch advocate of Technocapital AI. He wants it to continue to grow, develop, and assume total power. He also believes in the inevitability of this outcome: “Life is being phased-out into something new, and if we think this can be stopped we are even more stupid than we seem.”469 He subscribes to a kind of inverted marxist teleology, arguing that humans will not escape capitalism, but rather that capitalism will necessarily escape us. The positive-feedback loop of capital will leave us in the dust: “Humanity recedes like a loathsome dream.”470 And Land wholly supports the anti-human implications of this escape. Land’s move is to reorient hope itself. While Spinoza aligns himself to the universe, Land aligns his desires for the future with technocapital.

Corollary.—If capitalism aims to escape humanity, why has it not yet succeeded?

Proposition V. R/acc—Capitalism and democracy are incompatible.

Proof.—In his text, The Dark Enlightenment, Land answers this question and, in the process, advances his conservative ideology: “For the hardcore neo-reactionaries, democracy is not merely doomed, it is doom itself. Fleeing it approaches an ultimate imperative.”471 Here, Land wages a full-scale attack on democratic principles, explaining that “as the democratic virus burns through society, painstakingly accumulated habits and attitudes of forward-thinking, prudential, human and industrial investment, are replaced by a sterile, orgiastic consumerism, financial incontinence, and a ‘reality television’ political circus.”472

Proposition VI. R/acc—Democracy is the limiting force that holds the capital AI back from fulfilling its full potential.

Proof.—Democracy is Land’s ultimate antagonist; throughout the text, he characterizes universal political enfranchisement as both illusion and pathology: “that democracy is fundamentally non-productive in relation to material progress, is typically under-emphasized. Democracy consumes progress. When perceived from the perspective of the dark enlightenment, the appropriate mode of analysis for studying the democratic phenomenon is general parasitology.”473 Land supports his argument with theory as well as historical interpretation. He refers to ancient Greece “as a microcosmic model for the death of the West … Its pre-eminent virtue is that it perfectly illustrates the democratic mechanism in extremis, separating individuals and local populations from the consequences of their decisions by scrambling their behavior through large-scale, centralized re-distribution systems.”474

Proposition VII. R/acc—The ideal ethical framework for technocapital is “no voice, free exit.”

Proof.—Land identifies this solution in the writing of the blogger and computer scientist Curtis Yarvin, also known as Mencius Moldbug / It is important to note that Moldbug’s ideas, which promote racist, patriarchal, and fascist ideologies while condemning the free press and academic institutions, are foundational to both neoreaction, or NRx, and the alt-Right movement / In The Dark Enlightenment, Land focuses primarily on Moldbug’s concept of neo-cameralism, an alternative to democracy, in which local city-states function as corporations, each with a privileged class of stakeholders, and a governing CEO: “Gov-corp would concentrate upon running an efficient, attractive, vital, clean, and secure country, of a kind that is able to draw customers. No voice, free exit.”475 In this patchwork model, subjects have no voting power, but can move to a new city-state if they are dissatisfied \ Critically, neither Land nor Moldbug addresses the feasibility of physical relocation. Particularly strange is this model’s assumption and celebration of open borders and nomadic existence, while actual disciples of this train of thought are uniformly anti-immigrant, calling for walls to be built to prevent the escape of refugees and economic migrants alike.

Corollary.—This paradox highlights the shared racial bias core to Land and Moldbug’s alliance. Their worldview manifests throughout The Dark Enlightenment and its critique of the all-powerful leftist apparatus / the Cathedral \ a term he borrows from Moldbug. First, Land disparages the educational embrace of intersectionality and postcolonial critique, renaming liberal scholarship “Grievance Studies.”476 He is not only skeptical, but repulsed by this discursive shift, claiming that the University system has descended into a chaotic, irrational, moralizing vortex, obsessed with identity politics. Land reasons that “because grievance status is awarded as political compensation for economic incompetence, it constructs an automatic cultural mechanism that advocates for dysfunction.”477 This line in particular exposes his refusal to acknowledge structural injustice. Land does not consider the attainment of financial success through oppression / invisible labor / and slavery to be dysfunctional in any way. To the contrary, he sees this strategy as highly effective.

Proposition VIII. The knowledge of good and evil is nothing else but the emotions of pleasure or pain, in so far as we are conscious thereof.

Proof.—Land argues that tolerance is nonsensical, that moral outrage in the face of racism is unwarranted, and that “a ‘hate crime’, if it is anything at all, is just a crime, plus ‘hate’, and what the ‘hate’ adds is telling.”478 He claims that the label hate crime is an invention of the left to suppress conservative ideology. He continues, “as we have seen, only the Right can ‘hate.’”479 He describes the dangers of inner city neighborhoods, the unfairness of the term white flight, which he considers racist, and the misunderstood logic of white nationalism. After all of this, Land makes his main point: through suppression of thought and mind control, the Right is unfairly and unconditionally associated with racism. The Left uses this association to manipulate citizens to vote against conservative representatives and ideas. As a result / government continues to grow / bureaucratic expansion is the left’s ultimate agenda.

Proposition IX. R/acc—There are three possible futures within the framework of democracy / all equate to doom.

Proof.—

(1) Modernity 2.0. Global modernization is re-invigorated from a new ethno-geographical core, liberated from the degenerate structures of its Eurocentric predecessor, but no doubt confronting long range trends of an equally mortuary character. This is by far the most encouraging and plausible scenario (from a pro-modernist perspective), and if China remains even approximately on its current track it will be assuredly realized. (India, sadly, seems to be too far gone in its native version of demosclerosis to seriously compete.)

(2) Postmodernity. Amounting essentially to a new dark age, in which Malthusian limits brutally re-impose themselves, this scenario assumes that Modernity 1.0 has so radically globalized its own morbidity that the entire future of the world collapses around it. If the Cathedral ‘wins’ this is what we have coming.

(3) Western Renaissance. To be reborn it is first necessary to die, so the harder the ‘hard reboot’ the better. Comprehensive crisis and disintegration offers the best odds (most realistically as a sub-theme of option #1).480

Note.—In the final section of The Dark Enlightenment, Land attempts to minimize the import of racial tension and negotiation by invoking his strategy of theory-fiction, familiar from his writings in Fanged Noumena. He introduces the concept of the bionic horizon—at which point “‘humanity’ becomes intelligible as it is subsumed into the technosphere, where information processing of the genome—for instance—brings reading and editing into perfect coincidence.”481 He then appropriates Octavia Butler’s Xenogenesis trilogy, and in conjunction with John H. Campbell’s concept of generative evolution, performs a sleight of hand / proposing that eugenic intervention will soon transform the human species so rapidly and monstrously ~ it will be unrecognizable. This is a kind of inversion of Butler’s project, which foregrounds questions of race \ gender \ and sexuality / Land twists Xenogenesis to dismiss those same questions / asserting that biological and cultural evolution will render conversations about identity utterly meaningless.

Corollary.—The image of something past or future, that is, of a thing which we regard as in relation to time past or time future, to the exclusion of time present, is, when other conditions are equal, weaker than the image of something present; consequently an emotion felt towards what is past or future is less intense, other conditions being equal, than an emotion felt towards something present.

Proposition X. R/acc—Human warfare is a consequence of technocapital.

Proof.—Land clarifies his eugenic predictions, offering the term hyper-racism482 to convey a future in which “space colonization will inevitably function as a highly-selective genetic filter.”483 He imagines that human populations will evolve into different species as a result of these new cosmic barriers / genetic incompatibility will eventually lead to full-fledged warfare.

Corollary.—From the remarks made in Def. vi. of this part it follows that, according to Land, there is no human escape from technocapital.

Proposition XI. L/acc—Induce planetary-scale transduction.

Proof.—In the year following Land’s neoreactionary outburst, Alex Williams and Nick Srnicek published their #Accelerate Manifesto \ launching an argument for planetary-scale social transformation to a wider audience. The manifesto acknowledges Land’s key role in the development of accelerationism and at the same time rebuts his conservative approach. Their accelerationist imaginary departs from Land’s in several ways.

Proposition XII. L/acc—Capitalism is not an inevitable or even legitimate catalyst.

Proof.—In Williams and Srnicek’s words, “capitalism cannot be identified as the agent of true acceleration.”484 In fact, they frame capitalism as an engine of stasis, referring to Deleuze and Guattari’s explanation of deterritorialization and reterritorialization as forces of equilibrium. They attack the conservative perspective as “myopic,” asserting that “Landian neoliberalism confuses speed with acceleration. We may be moving fast, but only within a strictly defined set of capitalist parameters that themselves never waver. We experience only the increasing speed of a local horizon, a simple brain-dead onrush rather than an acceleration which is also navigational, an experimental process of discovery within a universal space of possibility. It is the latter mode of acceleration which we hold as essential.”485 Unlike Land / who sees capitalism and technological progress as inextricably linked \ Srnicek and Williams see disentanglement as a real possibility. They encourage this separation and ask “what a modern technosocial body can do [outside of] the enslavement of technoscience to capitalist objectives.”486

Corollary.—L/acc—Technological progress is not bound by capitalism.

Proof.—Srnicek and Williams acknowledge the racist—sexist—underpinnings of golden era487 capitalism and refuse a return to unjust / stable social hierarchies \ they shift focus to issues of labor within neoliberal capitalism: “we need to reconstitute various forms of class power. Such a reconstitution must move beyond the notion that an organically generated global proletariat already exists. Instead it must seek to knit together a disparate array of partial proletarian identities, often embodied in post-Fordist forms of precarious labor.”488 They recognize the fissures between different groups and seek new ways of fulfilling the long-held marxist promise of a unified working class \ unified by precarity. The question of labor is one that Srnicek and Williams return to repeatedly in future projects.

Proposition XIII. L/acc—Democracy is an important check on technocapital.

Proof.—Williams and Srnicek refute Land’s claim that democracy stifles acceleration \ arguing that “the assessment of left politics as antithetical to technosocial acceleration is also, at least in part, a severe misrepresentation.”489 They concede that direct action may no longer be an effective strategy for change, but they do not abandon universal enfranchisement and social progress in their model \ rather \ they call for methodological innovation within leftist politics: “Democracy cannot be defined simply by its means—not via voting, discussion, or general assemblies. Real democracy must be defined by its goal—collective self-mastery … We need to posit a collectively controlled legitimate vertical authority in addition to distributed horizontal forms of sociality, to avoid becoming the slaves of either a tyrannical totalitarian centralism or a capricious emergent order beyond our control.”490 They call for a new intellectual infrastructure and widespread media reform. While at first this call reads as suspicious of academia and the free press / which Land demonizes and entirely dismantles / Srnicek and Williams leave room for their continued \ albeit changed \ existence.

Proposition XIV. L/acc—Humans have ultimate agency.

Proof.—In contrast to the Moldbugian proposal for regional gov-corps with disparate constitutions \ Srnicek and Williams argue against localism. Instead they embrace “modernity of abstraction, complexity, globality, and technology,”491 which they posit are necessary to effectively address equally abstract—complex—global problems. They take issue with Land’s worship of capital as the supreme AI: “In this visioning of capital, the human can eventually be discarded as mere drag to an abstract planetary intelligence rapidly constructing itself from the bricolaged fragments of former civilisations.”492 For Srnicek and Williams, there is still a critical role for the human \ humans are the designers and directors of the future with the ultimate agency to effect change \ through engagement with technology and political economy: “we must develop both a cognitive map of the existing system and a speculative image of the future economic system.”493 In the Left accelerationist imaginary \ only human processes of creative projection \ have the potential to liberate the latent forces of capitalism.

Proposition XV. R/acc—Politics is the crisis.

Proof.—In response to the #Accelerate Manifesto, Land immediately published a series of sardonic annotations. He begins his critique by undermining climate change science: “how did this hypothetical forecast achieve such extraordinary prestige?”494 He also persistently attacks Srnicek and Williams’s use of the term neoliberalism / which he claims “is not a serious concept”495 and “is merely a profession of faith, serving far more as a tribal solidarity signal than an analytical tool.”496 He also counters their characterization of capitalism as the source of trouble and reasserts his position that “the ‘crisis’ [that] gathers force and speed is politics.”497 He explains that “from the Right, the single and comprehensive social disaster underway is the uncompensated expansion of the state.”498

Proposition XVI. R/acc—The goal is to support the autonomy of technocapitals positive-feedback loop.

Proof.—Land takes issue with the fact that Left accelerationism prioritizes questions of social justice / “prevailing in social conflict”499 / over rapid technological evolution. As a result he determines that Left accelerationism is a position of conditional accelerationism. He contrasts this with unconditional Right accelerationism / which absolutely serves the autonomy of the technocapital positive-feedback loop. He reflects critically on the #Accelerate Manifesto, asking: “enslave technosocial acceleration to ‘collective self-mastery’? That seems to be the dream.”500

Proposition XVII. R/acc—There have been countless experiments in post-capitalist alternatives and none have been remotely competitive or even survived.

Proof.—Land reinscribes his projection that capital is an inevitable and totalizing form of intelligence.

Note.—Overall / he finds that the #Accelerate Manifesto makes unsubstantiated claims / lacks supporting evidence / and is rife with hand-waving.

Proposition XVIII. L/acc—The goal is human freedom.

Proof.—Williams followed the #Accelerate Manifesto and its annotations with an essay titled Escape Velocities that begins to index contemporary accelerationist discourse \ which he argues has surpassed Land’s contributions. He explains that “at present we find a swarm of new ideas operating under this rubric, ranging from post-capitalist techno-political theory, to sci-fi speculative cosmist design, to universal rationalist epistemologies.”501 Williams identifies human freedom \ as opposed to Land’s “merely negative freedom: the freedom of capital from deleterious (and misguided) human intervention”502 \ as the fundamental project of Left accelerationism.

Note.—Williams cites the concept of epistemic accelerationism \ developed in parallel by Ray Brassier and Reza Negarestani \ which “proceeds via alienation”503 as a result of the nihilistic tendencies of mathematics and scientific discovery: “epistemic acceleration then consists in the expansion and exploration of conceptual capacity, fed by new techno-scientific knowledges, resulting in the continual turning-inside-out of the humanist subject in a perpetual Copernican revolution.”504 Williams argues that epistemic accelerationism and new forms of left politics are the two most promising modes for attaining freedom.

To enable these projects toward human freedom \ “much of the initial labor must be around the composition of powerful visions able to reorient populist desire away from the libidinal dead end which seeks to identify modernity as such with neoliberalism, and modernizing measures as intrinsically synonymous with neoliberalizing ones”505 \ The sensory specifics of how this is communicated \ however \ are left entirely open \ He follows this indeterminate aesthetic with a concrete “aesthetics of interfaces, control rooms, and cognitive maps” necessary for empowering users to wield data in service of the Left agenda \ Finally \ Williams promotes an aesthetic of improvisational action \ or mêtic practice \ which “entails a complicity with the material, a cunning guidance of the contingent (and unknowable in advance) latencies discoverable only in the course of action” \ His essay’s aesthetic emphasis is a major reason for the recent explosion of artists who have taken up accelerationism as a creative framework and source of inspiration.

Proposition XIX. L/acc—Accelerationist aesthetics activate human agency.

Proof.—Srnicek and Williams have since produced a series of rigorous projects in which they provide support and context for the ideas sketched in their two shorter pieces \ Together they published Inventing the Future: Postcapitalism and a World Without Work \ in which they develop their critique of Neoliberalism \ dissect the limitations of direct action exemplified in the post-recession Occupy movement \ and embrace posthumanism. The chapter \ Left Modernity \ promotes the anti-essentializing ideology of the latter: “This is a project of self-realization, but one without a pre-established endpoint \ It is only through undergoing the process of revision and construction that humanity can come to know itself \ This means revising the human both theoretically and practically \ engaging in new modes of being and new forms of sociality as practical ramifications of making ‘the human’ explicit.”506 Interestingly / this section mirrors Land’s projection at the end of The Dark Enlightenment that humans will inevitably transform \ albeit with a very different politics in mind \ Srnicek and Williams devote the rest of the book to the argument “that the contemporary Left should reclaim modernity, build a populist and hegemonic force, and mobilize towards a post-work future.”507

Proposition XX. L/acc—Work limits human agency.

Proof.—Recalling Bertrand Russell’s essay \ In Praise of Idleness \ Williams and Srnicek articulate the imperative to imagine a decline in human labor \ which they see as the key to finding viable future scenarios outside of the now totalizing neoliberal system.

Note.—Their post-work future depends on four demands:

1. Full automation

2. The reduction of the working week

3. The provision of a basic income

4. The diminishment of the work ethic508

These demands require technological innovation \ changes to public policy \ state subsidization \ and discursive-aesthetic adjustments. Srnicek and Williams posit that each independently would move their post-work agenda forward \ but a combination would amplify the effect and accelerate a paradigm shift.

Proposition XXI. L/acc—Emerging platforms have potential beyond the perpetuation of technocapital.

Proof.—In his book \ Platform Capitalism \ published in 2016 \ Srnicek analyzes the affordances of what he considers a new business model \ distinct from Fordist vertical integration and post-Fordist flexible production \ and its implications for the future of labor \ Srnicek explains that platforms are simultaneously intermediaries and infrastructures \ Platforms are multi-sided markets that bring producers and consumers together \ They are infrastructures in the sense that they allow individuals to build applications on top of them \ Platforms come with network effects: “the more numerous the users who use a platform, the more valuable it becomes for everyone else”509 \ producing a tendency toward monopolies \ A key strategy to generate platform use is cross-subsidization \ using revenue from one part of a platform to make other parts free \ in turn incentivizing participation \ The goal of platforms is to collect as much data as possible.
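
Srnicek gives no formula for the network effect; a common approximation, assumed here, is a Metcalfe-style model in which a platform’s value grows with the number of possible user-to-user connections. A minimal sketch in Python:

    def platform_value(users, value_per_link=0.01):
        # Value proportional to pairwise connections: n * (n - 1) / 2.
        return value_per_link * users * (users - 1) / 2

    for n in (10, 100, 1000):
        print(n, platform_value(n))
    # 10 -> 0.45 | 100 -> 49.5 | 1000 -> 4995.0
    # Value grows roughly quadratically, so the largest platform pulls
    # away from its rivals: the tendency toward monopoly described above.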

Platforms are composed of core architectures that preclude neutrality—as a result \ platforms are inherently political. Srnicek identifies five main platform categories in the contemporary landscape \ each type has its own dynamics and constraints—digital advertising—cloud-computing services—industrial IoT—product shares—and lean platforms \ Importantly \ the last category of lean \ generally assetless companies \ is not profitable and Srnicek argues \ likely faces imminent collapse. Srnicek asks what these companies have to do to actually make a profit and what ethical implications are at stake for each of these revenue-generating strategies \ he examines the dominant mode / monopolization \ as well as two alternatives: “platform cooperatives and public platforms, ‘owned and controlled by the people’ and subsidized by the state.”510 He does not see a clear path for either of these alternatives to actually compete with corporate monopolies and ends the book with a grim future outlook. Platforms have enabled deregulation—widened the wealth gap / increased working hours \ and decreased protections for workers. It does not appear that they have the capacity to profoundly mutate the systems from within ~ Platforms may signal a shift of how humans operate within capitalism—but ultimately they do not provide a stairway through and out.

Proposition XXII. R/acc—Platforms support technocapital.

Proof.—Land’s most recent contribution / the introduction to his forthcoming book / is a platform analysis of Bitcoin / which he hails as a concrete fulfillment of Right accelerationism / The introduction reveals that Land is still highly critical of discussions that address social and political fairness / He laments the fact that “because money is inextricably entangled with questions of reciprocity, it is tied-up intimately with such provocations to outrage as injustice, cheating, exploitation, and unbounded inequality. Such sensitive moral trigger-zones pose a formidable inhibition to dispassionate analysis … Discussions of money drive social apes mad.”511 Although he does not make an explicit comparison in the introduction—it is notable that the blockchain—the public immutable ledger which tracks cryptocurrency transactions—bears structural resemblance to the gov-corp patchwork described in The Dark Enlightenment / supporting the Right’s “commitment to escape.”512 On one level the blockchain allows for multiple simultaneous cryptocurrencies—or coins—to coexist and compete. Land focuses on Bitcoin / but there are countless others including Ethereum—Ripple—Litecoin—Dash—Dogecoin—the list goes on. Each cryptocurrency has its own algorithms and underlying ideologies. Users choose which cryptocurrencies to hold or exchange and which to abandon / Land’s exit premise works seamlessly on the blockchain.
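
A minimal sketch, in Python, of the ledger structure named above: each block commits to its predecessor’s hash, so altering any past entry breaks every later link. A bare data structure, not any actual cryptocurrency’s protocol.

    import hashlib, json

    def digest(body):
        # Deterministic SHA-256 of a block body.
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    def block(transactions, prev_hash):
        body = {"tx": transactions, "prev": prev_hash}
        return {**body, "hash": digest(body)}

    chain = [block(["genesis"], "0" * 64)]
    chain.append(block(["alice->bob: 5"], chain[-1]["hash"]))
    chain.append(block(["bob->carol: 2"], chain[-1]["hash"]))

    def valid(chain):
        # Every block must hash correctly and point at its predecessor.
        for i, b in enumerate(chain):
            if b["hash"] != digest({"tx": b["tx"], "prev": b["prev"]}):
                return False
            if i > 0 and b["prev"] != chain[i - 1]["hash"]:
                return False
        return True

    print(valid(chain))                     # True
    chain[1]["tx"] = ["alice->bob: 5000"]   # rewrite history
    print(valid(chain))                     # False: the ledger resists edits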

Corollary.—Platforms embody the “no voice, free exit” framework.

Proposition XXIII. R/acc—Platforms subdivide and merge when necessary.

Proof.—The blockchain has the capacity to fork \ Each fork is like an alternate reality \ a history of transactions determined by consensus / A fork represents a change of protocol and with it a change in its community of users / Here, the exit strategy is viable as well \ Too many forks, however, produce instability.513 Fittingly / Land applauds this quality and explains that “the Left thus recognizes its enemy, with striking realism, as an emergent—and intrinsically fractured—agent of social dissolidarity.”514 While each cryptocurrency and each blockchain fork represents a particular protocol and ideology / Land celebrates the fact that the blockchain platform overwhelmingly preferences and disseminates the conservative agenda: “consistent ‘right-wing extremism’, automated governance, and unflinching critical philosophy are inter-translatable without significant discrepancy. The crypto-current is a nightmare for the left (rigorously conceived).”515
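
A fork can be sketched as two branches sharing a common prefix, with a node adopting whichever branch has accumulated more blocks: a toy ‘longest chain’ rule standing in for consensus, not any specific protocol.

    shared = ["g", "a", "b"]              # common history up to the fork
    branch_x = shared + ["c1", "d1"]      # one protocol / community keeps building
    branch_y = shared + ["c2"]            # a dissenting protocol splits off

    def adopt(*branches):
        # A node exits to whichever branch has accumulated the most blocks.
        return max(branches, key=len)

    print(adopt(branch_x, branch_y))      # ['g', 'a', 'b', 'c1', 'd1']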

Proposition XXIV. There are resistant forms of Acceleration.

Proof.—The debate between R/acc and L/acc thinkers parallels contemporary politics in the United States and elsewhere / the merging of government and corporate interests on the one hand and a panoply of welfare programs from healthcare and tuition assistance to universal basic income on the other \ This feedback loop has / however / spun out various other strains of accelerationism.

U/acc—Acceleration is guaranteed and unknowable.

Proof.—Unconditional accelerationism | or U/acc | rejects praxis and offers an alternative to the right and left dichotomy: “U/acc calls attention to the manner through which collective forms of intervention and political stabilization, be they of the left or the right, are rendered impossible in the long-run through overarching tendencies and forces.”516 Everything is predetermined.

Proposition XXV. G/acc—Agency exists in the body.

Proof.—Gender accelerationism argues that capitalism as we know it depends on patriarchy \ which requires gender binaries to modulate power \ As Luce Irigaray explains in This Sex Which Is Not One \ women “remain an ‘infrastructure’ unrecognized as such by our society and our culture. The use, consumption, and circulation of their sexualized bodies underwrite the organization and the reproduction of the social order, in which they have never taken part as ‘subjects.’”517 \ Gender accelerationists propose that if the gender binary is exploded \ patriarchy can no longer function as a stable substrate for capitalism \ and capitalism will be forced to mutate into something else ~ This explosion requires human agency in the form of biopolitical hacking ~ Through hormonal and genetic experimentation ~ and other modes of counter-performance ~ subjects can revise themselves and the systems they inhabit.

Proposition XXVI. G/acc—Binary identifications uphold technocapital.

Proof.—The origin of this speculative trajectory is found in the work of Shulamith Firestone, who identified two modes of culture \ the aesthetic \ and \ the technological \ She aligns the aesthetic mode with the female and the technological mode with the male ~ though she maintains that these qualities exist in all sexes ~ She explains that the two modes form a feedback loop of envisioning and enacting and that “the merging of the aesthetic with the technological culture is the precondition of a cultural revolution.”518

Proposition XXVII. G/acc ~ Agency explodes binary systems.

Proof.—Firestone calls for the emergence of “an androgynous culture surpassing the highs of either cultural stream, or even of the sum of their integrations. More than a marriage, rather an abolition of the cultural categories themselves, a mutual cancellation ~ a matter-antimatter explosion, ending with a poof! of culture itself.”519 The defining quality of this revolution is a shift in the source of pleasure: “enjoyment will spring directly from being and acting itself, the process of experience, rather than from the quality of achievement.”520 The ecstatics of agency ~  process ~ and change overwhelm the paltry appeal of any external goal.

Proposition XXVIII. G/acc ~ Agency erodes human-non-human boundaries.

Proof.—Donna Haraway’s A Cyborg Manifesto is another key text in the formation of the Gender accelerationist strategy \ Like Irigaray and Firestone \ Haraway rejects an essentializing \ dualist vision of gender ~ she takes this argument to an extreme by dissolving boundaries with other categories as well ~ the primary focus is on boundaries between human-machine and human-animal ~ but crucially Haraway also argues that “the boundary between science fiction and social reality is an optical illusion.”521 She explains that ultimately her “essay is an argument for pleasure in the confusion of boundaries and for responsibility in their construction.”522

Proposition XXIX. G/acc ~ Agency is a creative process.

Proof.—Language and narrative have the power to restructure reality ~ Haraway is equally committed to material practices of intervention ~ In an advanced technocapital framework / which she calls the Informatics of Domination / “biological-determinist ideology is only one position opened up in scientific culture for arguing the meanings of human animality”523 ~ she encourages subversive self-design and taking up emerging systems to facilitate change: “communications technologies and biotechnologies are the crucial tools recrafting our bodies.”524 

Proposition XXX. G/acc ~ Agency is self programming.

Proof.—After the dissolution of the CCRU—one of its most influential members ~ Sadie Plant ~ advanced the group’s theory-fiction strategies through the lens of Gender accelerationism ~ she combined feminist theory ~ the history of computing—cybernetics ~ and science fiction to propose that digital systems are bound up with gender and its potential transformation ~ particularly compelling is her treatment of Ada Lovelace ~ cyclically referring back to her diary ~ she focuses on entries in which Lovelace acknowledges not only the cultural potential of the Analytical Engine but that the act of programming is changing her own brain: “It does not appear to me that cerebral matter need be more unmanageable to the mathematicians than sidereal and planetary matter and movements”525 ~ she connects female genetics to cybernetic feedback loops with runaway effects: “unlike patrilineal modes of transmission in which heredity is passed on a one-way line of descent from father to son, those lines designated female run in circles, like the chicken and the egg ~ they also move at the imperceptible speeds of virtually alien life”526 ~ she concludes that the female is always an act of engineering, and through subversive intervention ~ the future holds infinite possibilities for reformation.

Proposition XXXI. G/acc ~ Gender is engineered.

Proof.—Paul Preciado’s Testo Junkie is a highly specific evaluation of contemporary mechanisms that drive capitalism and can be leveraged toward gender-abolition ~ Preciado offers the term pharmacopornographic ~ which “refers to the processes of a biomolecular (pharmaco) and semiotic-technical (pornographic) government of sexual subjectivity”527 to point to the fact that human gender and sexuality are already highly engineered as a result of the substances ~ birth control pills ~ narcotics ~ synthetic hormones ~ GMOs ~ and images ~ films ~ porn ~ ads ~ that are produced ~ distributed ~ and consumed ~ Preciado explains that pharmacopornographic production has “become the model of all other forms of production ~ and in this way ~ pharmacopornographic control infiltrates and dominates the entire flow of capital ~ from agrarian biotechnology to high-tech industries of communication.”528 

Corollary. ~ This acknowledgement banishes any misgivings about disrupting ~ natural ~ mechanisms.

Proposition XXXII. G/acc ~ Gender can be hacked.

Proof.—If biocapitalism produces subjects and reproduces them on a global scale529 ~ Gender accelerationism argues for a defiance of replication and instead infinite variation through hacking the self ~ within a system of total biopolitical manipulation and control ~ it is necessary to take up industrial tools and readminister them in radical ways ~ Preciado uses his own experimental testosterone injections as a demonstration of how to play.

Note. ~ More recently ~ Laboria Cuboniks published the Xenofeminist Manifesto: A Politics for Alienation ~ this manifesto ~ distributed as an interactive website in thirteen languages ~ is a concise intersectional snapshot of Gender accelerationism ~ “cutting across race, ability, economic standing, and geographical position”530 and aligning with “anyone who’s been deemed ‘unnatural’ in the face of reigning biological norms, anyone who’s experienced injustices wrought in the name of natural order … the queer and trans among us, the differently-abled, as well as those who have suffered discrimination due to pregnancy or duties connected to child-rearing.”531 

Proposition XXXIII. G/acc ~ Identity can be hacked.

Proof. ~ Xenofeminism positions itself as an alternative platform ~ “a mutable architecture that, like open source software, remains available for perpetual modification and enhancement following the navigational impulse of militant ethical reasoning”532 ~ it aims to “cultivate the exercise of positive freedom ~ freedom-to rather than simply freedom-from ~ and urge feminists to equip themselves with the skills to redeploy existing technologies and invent novel cognitive and material tools in the service of common ends”533 ~ Xenofeminism mobilizes technoscience to reengineer human identity and abolish gender: “let a hundred sexes bloom! ‘Gender abolitionism’ is shorthand for the ambition to construct a society where traits currently assembled under the rubric of gender, no longer furnish a grid for the asymmetric operation of power”534 ~ it asks “whether the idiom of ‘gender hacking’ is extensible into a long-range strategy ~ a strategy for wetware akin to what hacker culture has already done for software ~ constructing an entire universe of free and open source platforms that is the closest thing to a practicable communism many of us have ever seen”535 ~ Xenofeminism proposes the limitless circulation of gender-hacking strategies ~

Proposition XXXIV. G/acc ~ Trans-tactics are a model for identity hacking.

Proof. ~ Along the same lines ~ and in a detectably cackling voice ~ Gender Acceleration: A Blackpaper adopts CCRU terminology to unpack gender as a dichotomous invention: “Gender is a hyperstition overlayed on sex by the male ~ its function is to objectify the female and impose on her a social function as a machine whose duty is to reproduce the human ~ always in the service of the male.”536 In order to escape this unending dynamic of gender-sex confusion ~ the author echoes the repeated calls for self-crafting ~ in this instance ~ the subject extraordinaire is the trans human:

As a copy-of-the-copy, trans women are an embodied rejection of any original source of humanity such as that narcissistically attributed by patriarchy to the phallus. Trans femininity, in other words, is hyper-sexist. Vulgar sexism reaffirms or reproduces patriarchy, asserts that women are passive, lacking, inferior, weak; hyper-sexism takes all of the things that are associated with women and femininity, all considered by patriarchy to be weaknesses, and makes them into strengths. It accelerates and intensifies gendering and from this produces an unprecedented threat to patriarchy.537

Note. ~ A Blackpaper ~ published under the pseudonym ~ N1x Land ~ is the Gender accelerationist answer to The Dark Enlightenment ~ matching Nick Land’s inflammatory language ~ the author offers a more militant picture ~ imagining our real teleological horizon as the dissolution of the male subject category: “The masculine cracks open its stern carcinized exterior to reveal the smooth post-human feminine alien within”538 ~ Gender Acceleration: A Blackpaper leans into paranoia that feminists desire the annihilation of men ~ suggesting that those who want to survive must become female ~ all in all ~ Gender accelerationism dismantles essentialism and dualism and advocates experimenting on the self to proliferate possible types of subjects and ways of living life ~ mutation is a pleasurable alternative to the known structures and stories of prescribed domestication.

Proposition XXXV. B/acc—Technocapital grows out of extraction and exploitation.

Proof.—In her essay | Notes on Blacceleration | Aria Dean points to the gaping hole in dominant accelerationist discourse: “most crucially and consistently | the accelerationist account passes over slavery’s foundational role in capital accumulation”539 | citing Fred Wilderson | Hortense Spillers | Saidiya Hartman | and others | Dean insists that any valid analysis of capital begins with slavery and colonialism | to forget this history is not only to misunderstand the mechanisms of capitalism and its power | but also to ignore the weighty contributions of scholars arising from a class that has already experienced enslavement by capital | a condition that Nick Land reserves exclusively for the future.

Corollary I. Capital enslavement is not sci-fi fantasy.

Corollary II. ~ Gender engineering is not sci-fi fantasy ~

Note.—As Hortense Spillers lays out in Mama’s Baby, Papa’s Maybe | Black populations underwent genetic manipulation as a result of slavery’s forced breeding programs: “the procedures adopted for the captive flesh demarcate a total objectification | as the entire community becomes a living laboratory”540 | furthermore | gender roles were violently restructured: “indeed | we could go so far as to entertain the very real possibility that ‘sexuality’ | as a term of implied relationship and desire | is dubiously appropriate | manageable | or accurate to any of the familial arrangements under a system of enslavement | from the master’s family to the captive enclave | under these arrangements | the customary lexis of sexuality | including | ‘reproduction’ | ‘motherhood’ | ‘pleasure’ | and ‘desire’ | are thrown into unrelieved crisis.”541 

Proposition XXXVI. The highest good of those who follow virtue is common to all, and therefore all can equally rejoice therein.

Proof.—Pointing to Sylvia Wynter—Aria Dean explains that “a specific tradition of black radical thought has long claimed the inhumanity | or we could say anti-humanism | of blackness as a fundamental and decisive feature | and philosophically | part of blackness’ gift to the world”542 | this gift is both creative and ethical | an open map for living with dignity in the most devastating conditions.

Note.—In her interview with Katherine McKittrick | Sylvia Wynter explains that as homo narrans | a storytelling species | our hybrid ~ bio-mythoi condition makes humans think “in fictively eusocialized terms | this across all stratified status quo role allocations | as inter-altruistic kin-recognizing member subjects of the same referent-we and its imagined community”543 | Wynter emphasizes that “as an already postnuclear cum post-cracking-the-code-of-our-genome species, we are now faced with an additional climate crisis situation in which it becomes even more imperative that these laws | for the first time in our species’ history | be no longer allowed to function outside our conscious awareness”544 | we need to acknowledge the power of narrative and then conscientiously develop new stories about humanity and our future.

Proposition XXXVII. We can rewrite ideologies ~

Proof.—If the future has already happened | then the task of Homo Narrans is to invent new histories and storytelling practices | Audre Lorde’s influential contribution along these lines is Biomythography ~ which loosens the grip of factuality ~ and expands the possibility and power of the subject ~ Saidiya Hartman offers the strategy of critical fabulation “to illuminate the contested character of history, narrative, event, and fact, to topple the hierarchy of discourse, and to engulf authorized speech in the clash of voices. The outcome of this method is a ‘recombinant narrative,’ which ‘loops the strands’ of incommensurate accounts and which weaves present, past, and future in retelling the girl’s story and in narrating the time of slavery as our present”545 ~ lived experience together with narrative inventions take on unprecedented force in a precarious technological future.

Another Proof.—McKenzie Wark suggests that the Black accelerationist vision is “not an alternative to this world | but a pressing on of a tendency | where through the exclusion from the human that is Blackness an escape hatch appears in an embrace of one other thing that is also excluded | the machinic”546 ~ those who have faced generational struggles with oppression have the clearest sense that there is no exit ~ only movement and transformation in its pursuit ~

Note I.—The possibility of freedom—Accelerationism—grounded foremost in political economy / forms circuits of escape that criss-cross posthuman terrain \ Benedict Singleton affirms this pattern ~ that escape requires subjects to change:

We are much used to seeing in design the means to effect prespecified ends. But means have a logic of their own—indexed to their capacity to effect an escape from the present—detecting and exploiting points of leverage in the environment in order to ratchet open the future ~ and in so doing transforming the very agent that effects the escape ~ this is the mark of an accelerationist disposition / encompassing those schools of thought that can suborn a description of the world’s perceived shortcomings \ and the corresponding elaboration of how it ought to be in the shape of images of the future—to the logic of how things get done ~ how freedom is a possibility within this ~ and how its progressive maximisation can be pursued through the systematic deployment of generative constraints.547

Whether this mutation signals the emergence of the posthuman ~ or what Reza Negarestani calls the inhuman ~ our current condition “demands that we define what it means to be human by treating the human as a constructible hypothesis ~ a space of navigation and intervention”548 and contradictorily that “revising and constructing the human is the very definition of committing to humanity”549 ~ the advocates of accelerationism imagine that systemic transformation is possible and ~ for the most part ~ argue that this transduction is catalyzed by each individual’s attempt to develop a sense of agency that departs from the grand agenda of technocapital—“it’s easier to imagine the end of the world than the end of capitalism”550 ~ “there is no reason to assume a predetermined limit to what we can achieve or to the ways in which we can transform ourselves and our world.”551

Note II.—In the Appendix to Part I. I undertook to explain praise and blame, merit and sin, justice and injustice.

Proposition XXXVIII. The politics of Accelerationism govern Capture and Reconstruction /

Proof.—Capture and Reconstruction technologies are accelerating / producing posthuman vision ~

Proposition XXXIX. These technologies are supervised | and unsupervised ~

Proof.—The past decade has witnessed an unprecedented proliferation of machine learning and artificial intelligence ~ AI ~ models ~ accompanied by papers with code and clear implementation instructions—largely fueled by the open-source culture of AI research ~ this proliferation has democratized access to cutting-edge technologies / accelerating countless processes ~ open-source machine learning libraries are software frameworks ~ freely available to the public ~ allowing users to access and modify the source code ~ these libraries serve as powerful tools for developing and deploying machine learning models ~ they often have large communities of contributors who continuously improve and extend their functionalities ~ TensorFlow ~ developed by Google Brain ~ is one of the most popular open-source machine learning libraries ~ it provides a comprehensive ecosystem for building and training machine learning models ~ particularly deep learning models ~ TensorFlow offers a high-level API—Keras—that simplifies model creation and training for beginners—as well as a lower-level API that grants more control over model architecture and optimization552—PyTorch ~ developed by Meta ~ Facebook’s AI Research lab ~ FAIR ~ is another widely used open-source machine learning library / it has gained popularity among researchers due to its dynamic computation graph ~ which makes it easier to debug and experiment with models during development ~ PyTorch provides excellent support for tensor operations and automatic differentiation553—both libraries are released by tech powers—both libraries are organized around tensors—“tensors are simply mathematical objects that can be used to describe physical properties—just like scalars and vectors—in fact tensors are merely a generalization of scalars and vectors—a scalar is a zero rank tensor—and a vector is a first rank tensor”554—tensors streamline the process of defining and training complex models ~ many models are designed to enhance depth estimation and segmentation—others produce neural radiance fields and generative point clouds—some are supervised—and others are unsupervised ~
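
A minimal sketch in Python ~ assuming PyTorch is installed ~ of the tensor ranks the quotation names ~ scalar ~ vector ~ and a step beyond:

    import torch

    # Rank-0, rank-1, and rank-2 tensors, echoing the quoted definition.
    scalar = torch.tensor(3.14)              # a zero-rank tensor ~ a scalar
    vector = torch.tensor([1.0, 2.0, 3.0])   # a first-rank tensor ~ a vector
    matrix = torch.rand(3, 3)                # a second-rank tensor
    print(scalar.ndim, vector.ndim, matrix.ndim)   # -> 0 1 2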

Note.—Supervised learning is a type of machine learning where AI is trained using labeled data—in other words—the labeled data trains the model—this data is used as a supervisor—hence the term supervised learning—the algorithm analyzes the training data and learns a function that maps the input to the desired output—once the function is learned it can be used to predict the output for new—unseen input data—examples of supervised learning tasks include classification and regression—Unsupervised learning involves training AI models using data without predefined labels ~ the models are left to find patterns and relationships within the data on their own ~ supervised learning requires labeled data and is used when the output or result is known—ideal for predictive tasks where the relationship between input and output recurs—by contrast ~ unsupervised learning is employed when there are no known or predetermined outcomes ~ the objective is to discover the underlying structure of the data ~ it is best suited for exploratory tasks where patterns ~ correlations ~ and anomalies within the data are to be identified—both supervised and unsupervised models often work in conjunction with other types of learning ~ such as semi-supervised learning and reinforcement learning.555
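
The contrast in code ~ a sketch assuming scikit-learn and NumPy ~ the classifier is handed labels as its supervisor ~ the clusterer must find structure on its own:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))                 # input data
    y = (X[:, 0] > 0).astype(int)                 # labels: the "supervisor"

    clf = LogisticRegression().fit(X, y)          # supervised: learns input -> label
    km = KMeans(n_clusters=2, n_init=10).fit(X)   # unsupervised: groups by similarity
    print(clf.predict(X[:5]), km.labels_[:5])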

Proposition XL. Neural radiance fields are supervised—

Proof. ~ Neural Radiance Fields ~ NeRFs ~ are a significant advancement in the domain of Reconstruction—This innovative technique was introduced in 2020 by a collaborative team from UC Berkeley—UC San Diego—and Google Research—Fundamentally ~ NeRFs operate by employing deep learning neural networks to produce a 3D scene from a collection of 2D images ~ For each view direction and 3D location within this scene ~ the network predicts the volume density and the radiance emitted ~ and by integrating this information along the path of the camera rays ~ a final image is synthesized ~ What makes NeRFs particularly intriguing is their supervised learning approach ~ In the training phase ~ the system is fed a multitude of 2D images ~ and the neural network learns to estimate the color and volume density of the 3D scene from these images ~ This implies that with the appropriate training data ~ the model can continually refine and enhance its predictions ~ ensuring greater accuracy and richer detail with every iteration ~ Neural Radiance Fields can synthesize novel views556 ~ views of a scene that were not present in the original set of images ~ They hold the power to view a scene from an angle that was never captured ~ Within these radiant matrices ~ light reflects and refracts like it does in the physical world ~ Occlusions are filled in ~ Neural Radiance Fields pulse at the intersection of reality and machine hallucination ~ They shimmer with hyperreal allure ~ They scintillate like life ~
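
A sketch of the core network ~ assuming PyTorch ~ a multilayer perceptron that maps a 3D location and a view direction to color and volume density ~ positional encoding and the volume-rendering integral along camera rays are omitted for brevity:

    import torch
    import torch.nn as nn

    class TinyNeRF(nn.Module):
        # A minimal NeRF-style network: position -> density; position + view -> color.
        def __init__(self, hidden=128):
            super().__init__()
            self.trunk = nn.Sequential(
                nn.Linear(3, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU())
            self.sigma = nn.Linear(hidden, 1)      # volume density
            self.color = nn.Sequential(            # radiance depends on view direction
                nn.Linear(hidden + 3, hidden // 2), nn.ReLU(),
                nn.Linear(hidden // 2, 3), nn.Sigmoid())

        def forward(self, xyz, view_dir):
            h = self.trunk(xyz)
            density = torch.relu(self.sigma(h))    # density is non-negative
            rgb = self.color(torch.cat([h, view_dir], dim=-1))
            return rgb, density

    model = TinyNeRF()
    xyz = torch.rand(1024, 3)      # hypothetical 3D samples along camera rays
    dirs = torch.rand(1024, 3)     # hypothetical view directions
    rgb, density = model(xyz, dirs)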

Proposition XLI. Spaces of play and violence /

Proof. ~ NeRF shares an acronym with Hasbro’s popular toy gun line: “It’s NERF or Nothin’”557 / This confluence is emblematic of the subtle ways in which society intertwines technological advancements with the seductive power of violence / Even—and especially—in the seemingly innocuous domain of children’s play / Violence is not just normalized—it is commodified and repackaged as entertainment ~ It fluoresces with desire—captivating future generations / AI models will accelerate the gaming industry / achieving a new level of realism and immersion into virtual worlds ~ Traditionally—creating intricate 3D environments required labor-intensive modeling—texturing—and lighting processes—However—with AI—developers can reconstruct hyper-detailed 3D scenes from sets of 2D images—effectively streamlining the creation of expansive and photorealistic in-game environments—This capability not only reduces the time and resources dedicated to game development but also opens the door for capturing real-world locations and transforming them into explorable digital terrains with unprecedented accuracy—Furthermore ~ the novel view synthesis of NeRFs allows for dynamic camera angles and viewpoints ~ even from positions not originally photographed ~ This feature enhances gameplay dynamics—offering players more immersive experiences—continuous space—The gaming industry constantly seeks cutting-edge technologies to push the boundaries of realism and immersion—The widespread incorporation of NeRFs will represent the next frontier of hyper-realistic gaming experiences / This is particularly true in the fiercely competitive world of first-person shooter games / The top titles in this genre / franchises like Call of Duty and Battlefield / invest heavily in capturing the authentic nuances of combat scenarios / environments and weapons / The glint of sunlight on the barrel of the gun / even the AI-driven behavior of virtual combatants that mimics real-life tactics and unpredictability ~ games now consist of “groups of simulated physical objects that react to player actions.”558 This relentless pursuit of realism serves a dual purpose: not only does it showcase the prowess of the game’s technical engine—but it also aims to fully immerse players in the virtual battlefield—heightening the emotional and sensory engagement with every mission—with every conflict—

Proposition XLII. Autonomous weapons ~

Proof.—A potential risk of advancements in computer vision is the development of autonomous weapons or systems capable of decision-making and navigation without human intervention ! Without appropriate safeguards and ethical guidelines—these could lead to unforeseen—potentially devastating consequences ! Autonomous weapons—also known as lethal autonomous weapon systems—LAWS—are a reality of modern warfare ! Enabled by advances in artificial intelligence—these systems are capable of identifying—selecting—and engaging targets without human intervention ! They mark a significant departure from traditionally manned systems and represent a new frontier in the realm of warfighting ! However—their rise prompts serious ethical—legal—and security questions—necessitating an urgent discourse ! The potential use of autonomous weapons raises several worst-case scenarios—given the profound implications for warfare—international security—and humanitarian concerns ! One of the most significant fears is that autonomous weapons might unintentionally escalate conflicts ! If these weapons respond automatically to perceived threats—there is the possibility they could trigger a large-scale conflict or even a global war without human intention ! Another grave concern is the ability of autonomous weapons to discriminate between combatants and non-combatants ! If these systems fail to accurately identify targets—significant unintended civilian casualties could ensue ! An accountability gap may arise if an autonomous weapon—responsible for unintended harm or a violation of international law—leaves us uncertain about who to hold responsible—potentially allowing bad actors to use these weapons without fear of retribution ! There is also a risk that these weapons could be accessed by non-state actors—terrorists—or rogue states—leading to unprecedented casualties and attacks from hard-to-identify culprits ! The rise and deployment of such weapons could spark an arms race—with nations vying to outperform each other in weapon capabilities—which could destabilize international relations further ! An often-overlooked concern is the lack of nuanced human judgment in war ! Machines—regardless of their advancement—lack the empathy—conscience—and broader understanding inherent in human decision-making ! Autonomous weapons—like any system—are also prone to malfunctions—which—in a battlefield scenario—could lead to widespread destruction ! Their susceptibility to hacking poses another risk; if compromised—they could act against their own forces or be used in unintended ways ! Technoconflict unfolds at speeds beyond human comprehension ! Coupled with the tangible threats is the ethical dilemma of assigning machines the power to decide on matters of life and death ! Lastly—by eliminating the human element from combat decisions—warfare could become more frequent due to the reduced psychological and moral weight of initiating conflict ! In the most dire of outcomes—a combination of these concerns could instigate large-scale global conflicts—cause extensive loss of life—create widespread instability—and radically alter the principles of international relations and warfare ! “Slaughterbots are here.”559560

Proposition XLIII. Weapons with a detailed map of the world—

Proof.—Computer vision systems can recognize and detect objects—segment images—or even build detailed three-dimensional maps / Large corporations have been reconstructing the Planet561 for decades. In the context of autonomous weapons—this means that these systems could not only identify and engage targets independently but also navigate and adapt to a vast array of environments / A detailed three-dimensional map of the entire surface of the earth—something that is increasingly within reach thanks to advancements in satellite imaging and photogrammetry—would provide these systems with an unprecedented level of situational awareness / On Exactitude in Science is Borges’ premonition of our global Reconstruction / “In that Empire—the Art of Cartography attained such Perfection that the map of a single Province occupied the entirety of a City—and the map of the Empire—the entirety of a Province / In time—those Unconscionable Maps no longer satisfied—and the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire—and which coincided point for point with it”562 / The military value of such a map is immense—allowing for precise navigation—target recognition—and battlefield management / Autonomous weapon systems operating without human intervention lack human judgment and the capacity for empathy—potentially leading to decisions that a human operator might deem unethical or unlawful !

Has the advent of autonomous systems led to a renewed interest in old ethical dilemmas ? What did Philippa Foot write about in the domain of metaethics—moral psychology—and applied ethics ? How did she defend the objectivity of morality ? Was she notorious for “changing her mind about whether moral judgments necessarily provide rational agents with reasons for action ? ”563 Is she best known for inventing the Trolley Problem ? What does the Trolley Problem propose ? How do you choose between actively causing one death and passively allowing five ? How do autonomous vehicles and artificial intelligence reshape this problem ? “You are traveling along a single lane mountain road in an autonomous car that is fast approaching a narrow tunnel—Just before entering the tunnel a child attempts to run across the road but trips in the center of the lane—effectively blocking the entrance to the tunnel—The car has but two options: hit and kill the child ~ or swerve into the wall on either side of the tunnel—thus killing you—How should the car react ? ”564 Whose life should the self-driving car prioritize ? Should autonomous vehicles be programmed to prioritize the life of its passengers over pedestrians ? Could such a choice discourage pedestrians from trusting autonomous systems ? Conversely—could prioritizing the pedestrian make potential customers less likely to use self-driving cars ? What are the challenges in designing decision-making machines ? How do these potential decisions impact societal acceptance and trust in these technologies ? Is there a need to establish a broad societal consensus on the ethical rules for autonomous systems ? Would such consensus involve surveys—public debates—and consultations with various stakeholders ? Even with consensus—is it challenging to translate complex ethical rules into code ? Why is transparency crucial in the decision-making processes of AI ? How would we gain insights into an accident caused by an autonomous vehicle's choice ? Does the evolution from the Trolley to the Tunnel problem represent a continuous dialogue on machine ethics ? Or a new model ? As autonomous systems become more prevalent—is it imperative to engage more with these ethical questions ? How can we ensure the development of autonomous systems aligns with our values ? What will unfold if machines are liberated ? !

Proposition XLIV. Unsupervised ~

Proof.—If decision-making is handed over to machines—the human cost of initiating conflict might seem diminished \ potentially leading to an increase in conflicts / There are profound implications for global stability ~ Countries might feel compelled to develop or acquire these systems to maintain a strategic advantage—leading to a new type of arms race centered on AI capabilities ! This race could destabilize international security and provoke conflicts ! An unsupervised autonomous weapon would make decisions based on patterns it identifies from its environment rather than relying on pre-defined criteria or targets ~ This means it could potentially adapt to new and unforeseen circumstances in real time ~ However ~ it also introduces a significant degree of unpredictability ~ Without clear directives ~ such weapons could make targeting decisions that deviate from human expectations or international laws of warfare ~ leading to unintended consequences or ethical dilemmas : “Fielding nascent technologies without comprehensive testing could put both military personnel and civilians at undue risk ! ”565 The prospect of unsupervised autonomous weapons underscores the necessity for rigorous oversight—ethical considerations—and fail-safe mechanisms in the development and deployment of AI in military applications—Does this raise questions about unsupervised AI in general ?

Note. ~ Convolutional Neural Networks ~ CNNs ~ are versatile machine learning models predominantly utilized for tasks involving image data ~ While they are often associated with supervised learning ~ where they are trained using labeled data to produce specific outputs ~ they can also be configured for unsupervised learning ~ In a supervised setting ~ CNNs learn by adjusting their weights based on the difference between their predictions and actual labels ~ aiming to minimize this difference ~ in an unsupervised context ~ CNNs learn without explicit labels ~ aiming to identify inherent patterns or structures in the data ~ Techniques such as autoencoders ~ which attempt to recreate input data after compressing it ~ or clustering ~ where data is grouped based on similarities ~ are examples of unsupervised applications for Convolutional Neural Networks ~ 566
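
One unsupervised configuration ~ sketched in Python with PyTorch ~ a convolutional autoencoder that compresses and reconstructs images with no labels ~ the reconstruction error is the only teacher:

    import torch
    import torch.nn as nn

    class ConvAutoencoder(nn.Module):
        # Encoder squeezes the image into a compact code; decoder rebuilds it.
        def __init__(self):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 28x28 -> 14x14
                nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())  # 14x14 -> 7x7
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),    # 7x7 -> 14x14
                nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid())  # 14x14 -> 28x28

        def forward(self, x):
            return self.decoder(self.encoder(x))

    model = ConvAutoencoder()
    x = torch.rand(8, 1, 28, 28)                 # unlabeled grayscale images
    loss = nn.functional.mse_loss(model(x), x)   # reconstruction error, no labels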

Proposition XLV. Convolution ~ a form or shape that is folded in curved or tortuous windings.567

Proof. ~ Convolutional Neural Networks are a type of artificial neural network specifically designed to process data with grid-like topology—such as an image—which can be viewed as a grid of pixels: “Kunihiko Fukushima created the precursor to the modern convolutional neural network (CNN) called the neocognitron ~ CNN architectures are among the most used neural networks ~ giving rise to the popularity of deep learning networks”568 ~ the Neocognitron was a “self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position”569 ~ CNNs are inspired by the biological visual cortex and are very effective for tasks like image classification and object detection ~ due to their ability to capture spatial dependencies in the data ~ a typical CNN consists of three types of layers ~ convolutional layers ~ pooling layers | and fully connected layers ~ the convolutional layer is the core of a CNN ~ the layer’s parameters consist of a set of learnable filters ~ kernels ~ which have a small receptive field but extend through the full depth of the input volume ~ as the filter slides ~ or convolves ~ around the input image or volume ~ it is multiplied with the part of the image it is currently on ~ producing a two-dimensional map of responses called the convolutional map ~ or feature map ~ this process can be intuitively understood as the network learning filters that activate when they detect a specific feature at a specific spatial position ~
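
The sliding filter in a few lines ~ a sketch assuming PyTorch ~ one convolutional layer turns an image into a stack of feature maps ~ one map per learnable filter:

    import torch
    import torch.nn as nn

    conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
    image = torch.rand(1, 3, 64, 64)   # one RGB image as a grid of pixels
    feature_maps = conv(image)         # shape (1, 8, 64, 64): eight response maps
    print(feature_maps.shape)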

N.B. Here and in what follows I mean by positive / only positive towards infinity /

Corollary I. / Exploding gradients /

Corollary II.\ Vanishing gradients \

Note. / The exploding / and vanishing gradient problems are pivotal challenges in the training of deep neural networks ~ particularly in the context of backpropagation (IV.lii.note.) / The exploding gradient problem occurs when the gradients of the loss function with respect to the network’s parameters grow exponentially as they are propagated backward through the layers of the network / This leads to disproportionately large weight updates / causing the model to become unstable and the training process to diverge /570 Conversely / the vanishing gradient problem emerges when these gradients become exceedingly small / effectively causing weight updates to be negligible / As a result / the network struggles to learn or update its weights / making training stagnate or progress very slowly / Both these issues can be seen as manifestations of positive feedback loops / In the case of the exploding gradient / an initially large gradient becomes even larger as it's multiplied across layers / while for the vanishing gradient / a small gradient diminishes further / causing the respective problems to compound and exacerbate as the network deepens ~ This self-reinforcing nature of the problems / where the output amplifies the conditions leading to it / epitomizes the characteristics of a positive feedback loop /
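
A common countermeasure ~ not named in the text ~ is to clip the gradient norm between the backward pass and the weight update ~ a sketch assuming PyTorch:

    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(10, 10), nn.ReLU(), nn.Linear(10, 1))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    x, y = torch.rand(32, 10), torch.rand(32, 1)
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()                          # gradients propagate backward
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)  # cap the norm
    optimizer.step()                         # update weights with bounded steps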

Proposition XLVI. The injection of non-linear complexity ~

Proof. ~ Activation functions ~ Rectifiers ~ provide non-zero gradients for positive inputs and zero out negative inputs—helping to check the runaway feedback that produces vanishing and exploding gradients /

Note. ~ Rectifier ~ also referred to as the Rectified Linear Unit Activation Function ~ or ReLU ~ is a type of activation function that is widely used in deep learning models ~ especially in convolutional neural networks ~ The function itself is quite straightforward ~ given an input value ~ it returns the value if the value is positive ~ and returns zero otherwise ~ Mathematically ~ this can be represented as f(x) = max(0, x) ~ The appeal of ReLU lies in its simplicity and efficiency ~ It introduces non-linearity into the model ~ allowing the network to learn from the error and make corrections ~ which is essential for learning complex patterns ~571 Compared to other activation functions like sigmoid or hyperbolic tangent ~ ReLU is computationally efficient—it allows faster training without significant penalty to generalization accuracy—However ~ ReLUs are not without their issues ~ They can sometimes result in what is called the dying ReLU problem ~ If a large gradient flows through a ReLU neuron ~ it occasionally updates the weights in such a way that the neuron will always output zero ~ If this happens ~ the neuron is essentially dead—no longer updating or learning ~572 Variants of ReLU ~ such as LeakyReLU573 ~ have been proposed to address this issue by allowing small negative values when the input is less than zero ~ thus ensuring that the neurons remain alive and continue to learn ~
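
Both functions in a few lines ~ a sketch assuming PyTorch:

    import torch

    x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
    relu = torch.relu(x)   # f(x) = max(0, x) -> [0.0, 0.0, 0.0, 1.5]
    leaky = torch.nn.functional.leaky_relu(x, negative_slope=0.01)
    # leaky -> [-0.02, -0.005, 0.0, 1.5]: the small slope keeps "dead" neurons learning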

Proposition XLVII. Positive feedback loops lead to instability ~

Proof. / In cybernetics / a positive feedback loop refers to a situation where a change in a system leads to an effect that causes more change in the same direction / the initial change and the resultant effects reinforce each other / leading to potentially exponential growth or decline / “Positive feedback loops are sources of growth / explosion ! erosion ~ and collapse in systems—A system with an unchecked positive loop ultimately will destroy itself. That’s why there are so few of them.”574 

Note. / This does not necessarily mean positive in the sense of good / rather / positive in this context means the feedback is additive or amplifying / leading to a self-perpetuating cycle / imagine a sound system where a microphone picks up sound from a speaker and feeds it back into the speaker / this causes the speaker to produce the sound again / which is picked up by the microphone / and so on / this loop can rapidly increase the volume to extreme levels / leading to an ear-piercing shriek / this is an example of a positive feedback loop that results in instability ~ the unsettling shriek ~
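
The shriek ~ simulated ~ a toy Python loop of my own illustration ~ each pass through the loop multiplies the signal by the loop gain:

    level = 0.01     # a faint initial sound
    gain = 1.5       # the speaker amplifies what the microphone picks up

    for step in range(10):
        level *= gain                               # output feeds back as input
        print(f"pass {step}: level = {level:.4f}")  # grows without bound ~ instability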

Proposition XLVIII. Climate change is a positive feedback loop /

Proof. / As the Earth’s temperature rises / ice caps begin to melt / since ice reflects sunlight and heat back into space / the loss of this reflective surface means more heat is absorbed by the Earth / causing more warming / this in turn causes more ice to melt and so the cycle continues /575 

Proposition XLIX. Negative feedback loops are checks on acceleration—

Proof. / Positive feedback loops / if unchecked / can lead to imbalances / instability / or drastic effects / in many natural and human-designed systems—negative feedback loops are implemented to counteract and balance these positive feedback loops /

Proposition L. Positive reinforcement leads to runaway / run away ~

Proof. / The primary characteristic of positive feedback is that it promotes and amplifies the effects of changes / potentially leading to exponential growth or decline / it can cause a system to become more unstable / sometimes resulting in oscillations ~ or runaway conditions if not properly controlled ~

Note. / While positive feedback can lead to instability / it can also be beneficial in specific applications / for example / in digital electronics / positive feedback is used to build circuits like oscillators ~576 these circuits can provide stable and predictable outputs—

Proposition LI. Equilibrium requires supervision—

Proof.—Regulation—

Another Proof.—Balance |

Note.—“Objects at equilibrium (the condition in which all forces balance) will not accelerate—”577 Imbalance steers acceleration ~ shifting balance affects the directional change of transformation—For example—if an object is on an inclined plane / the force of gravity acting on the object can be decomposed into two components: one parallel to the plane / causing acceleration / and one perpendicular to the plane \ which can affect balance ~ In such cases—maintaining balance on the inclined plane is essential to prevent the object from sliding or tipping over \

Proposition LII. The image is a function of loss \

Proof. ~ Training a Convolutional Neural Network typically requires a labeled dataset and a loss function \ The loss function measures how far off predictions are from the actual values \ The process of training aims to adjust the weights of the filters in the convolutional layers such that the loss is minimized \ 578 This is often achieved through a method called backpropagation with a variant of the gradient descent optimization algorithm \

Note. \ Backpropagation is the central mechanism by which Convolutional Neural Networks are trained \ The process starts with updating the model’s weights and ends with feeding the model an input image \ When explained backward the steps are as follows \ Initially \ or Finally \ the weights of the model are adjusted \ This adjustment is carried out in the direction opposite to the gradient in a process known as gradient descent \ Here the model is descending along the gradient to minimize the loss or error \ The size of the adjustment or step is determined by the learning rate parameter \ Prior to the weight adjustment the gradient of the loss function concerning the network’s weights is computed \ This gradient essentially measures the rate of change of the loss function resulting from a change in the weights \ If the model has not performed well \ meaning the loss is high / the weights of the filters need to be updated \ This gradient calculation is carried out using the chain rule from calculus \ simplifying the derivative of the loss function concerning the weights into more manageable terms \ The actual process of backpropagation begins after the loss has been calculated \ Here the error or loss is propagated backward through the network starting from the output layer and moving towards the input layer \ This is why the process is called backpropagation \ It involves calculating the gradient of the loss function with respect to the weights and then adjusting these weights using gradient descent \ Before backpropagation can occur the model’s output must be compared with the true label \ producing a measure of the loss or error \ This loss quantifies the discrepancy between the predicted and actual output \ The particular loss function employed will depend on the nature of the problem \ with mean squared error being used for regression problems and cross-entropy for classification problems \ The entire process begins with forward propagation / where the model is provided an input image / This image is passed through several layers of the network ~ convolutional ~ non-linear ~ pooling—downsampling and fully connected layers ~ Each layer assigns weights to the input it receives / which are then summed and passed through an activation function to produce the output of that layer / The entire process is carried out repeatedly for several epochs / with an epoch being one complete forward and backward pass of the entire dataset through the network \ As the number of epochs increases / the model becomes progressively better at classifying the input image—due to the constant refinement of the weights in backpropagation \ Backpropagation plays a pivotal role in training neural networks for AI recognition tasks by allowing them to learn from errors ~ extract relevant features—generalize to new data—and continuously adapt and optimize their recognition capabilities _ 579 It is the foundation for recognition and classification—also known as segmentation—
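
The whole cycle in its forward order ~ a compressed sketch assuming PyTorch and hypothetical data shapes ~ forward pass / loss / backward pass / gradient descent:

    import torch
    import torch.nn as nn

    model = nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),   # convolution + non-linearity
        nn.Flatten(), nn.Linear(8 * 28 * 28, 10))   # fully connected classifier
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # learning rate = step size
    loss_fn = nn.CrossEntropyLoss()                 # cross-entropy for classification

    images = torch.rand(16, 1, 28, 28)              # hypothetical input batch
    labels = torch.randint(0, 10, (16,))            # hypothetical true labels

    for epoch in range(3):                          # one forward and backward pass each
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)       # compare prediction with label
        loss.backward()                             # propagate the error backward
        optimizer.step()                            # descend along the gradient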

Proposition LIII. Segmentation is reduction—

Proof.—In computer vision—segmentation refers to the division of an image into distinct regions or categories—each corresponding to different objects or parts of objects—This process helps in breaking down a complex scene into its constituent elements—making it more comprehensible for further analysis—By classifying pixels into specific groups based on certain criteria—such as color—intensity—or texture—segmentation reduces the complexity of reconstruction data—This simplification facilitates easier and more accurate subsequent tasks like object detection—recognition—and tracking—Through classification—segmentation ensures that similar features or patterns within an image are grouped—leading to a structured and more manageable representation of visual data—580
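
Reduction at its crudest ~ a toy Python sketch with NumPy ~ every pixel classified into one of two regions by an intensity criterion:

    import numpy as np

    image = np.random.rand(128, 128)   # a stand-in grayscale image, values in [0, 1]
    mask = image > 0.5                 # the criterion: intensity
    segments = np.where(mask, 1, 0)    # the scene reduced to two labeled classes
    print(np.unique(segments))         # -> [0 1]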

Proposition LIV. Recognition is convoluted ~

Proof. ~ Convolutional Neural Networks have revolutionized the field of image recognition due to their specialized architecture that mirrors the hierarchical pattern in which the human visual system processes visual information ~ CNNs utilize layers of convolutional filters that automatically and adaptively learn spatial hierarchies of features from input images ~ Initial layers might capture simple attributes like edges or colors ~ while deeper layers interpret more complex structures and patterns ~ By applying a series of pooling ~ convolutional ~ and fully connected layers ~ CNNs can detect and recognize intricate patterns in images ~ As a result—CNNs have become the go-to model for tasks like image classification—object detection—and facial recognition—

Note.—Artificial intelligence recognition systems—while groundbreaking and immensely powerful—often inherit the biases present in the data they are trained on—Since much of the data used to train these systems comes from human-generated sources—any inherent prejudices or systemic biases can become embedded within the AI models—Consequently—when these models are employed in real-world scenarios—they may perpetuate or even amplify these biases ! leading to discriminatory outcomes ! For instance | facial recognition software has been found to misidentify certain ethnic groups more frequently than others (III. instances of reconstructions iv.) | Such discriminatory tendencies of AI recognition not only challenge the ethical foundations of AI implementations but also highlight the importance of addressing bias at all stages of AI development ~

Proposition LV. Classification is discrimination.

Proof.—Classification—at its core—is an act of distinguishing and categorizing elements based on specific criteria or characteristics—By this inherent process—it necessitates the drawing of boundaries and the creation of distinctions—In doing so—classification inevitably practices discrimination—It separates items into different groups or classes based on perceived differences—whether subtle or pronounced—Thus—to classify is to discriminate—making judgment calls on where entities belong within a predefined system or hierarchy—In essence—the very nature of classification is rooted in the act of discerning—differentiating—and thereby discriminating (I.xv.)

Proposition LVI. Generative networks are adversarial ~

Proof. ~ Generative AI refers to a subset of artificial intelligence that focuses on creating new content ~ often leveraging complex models like Generative Adversarial Networks ~ GANs ~ These systems are designed to produce outputs such as images ~ music ~ text ~ or even videos that are often indistinguishable from content created by humans ~ Generative AI operates by understanding and mimicking the patterns and structures in the data it's trained on ~ As it learns ~ it becomes capable of generating novel and coherent content that resonates with the intricacies of the training data ~ opening doors to numerous applications from art and design to more practical scenarios like data augmentation and simulation ~ GANs are superpowered by other AI models ~ like the Generative Pre-trained Transformer ~ GPT ~ “Transformer changes the game ~ Not only did the transformer succeed in language modeling ~ but it demonstrated promise in computer vision (CV) ~ Vision Transformer (ViT) ~ ”581 

Corollary.—Adversarial networks are discriminatory—

Note. ~ Generative Adversarial Networks operate on the principle of two neural networks ~ a generator and a discriminator—contesting against each other. At the heart of this tug-of-war dynamic is the critical role of classification and labeling—The discriminator’s primary task is to classify whether a given input is real—from the actual dataset—or fake ~ produced by the generator ~ To do this effectively—it relies heavily on accurate labeling of the training data—On the other hand ~ the generator seeks to produce data that is indistinguishable from real data ~ attempting to fool the discriminator—As the GAN training progresses ~ the generator refines its outputs based on the feedback—or classification—from the discriminator—582 In essence—labels act as the ground truth—guiding the entire learning process—Without precise classification and labeling—the GAN would lack direction and its generated outputs would be far from desired ~
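
The tug-of-war in miniature ~ a sketch assuming PyTorch and a toy two-dimensional “dataset” ~ the discriminator labels samples real or fake ~ the generator learns to fool it:

    import torch
    import torch.nn as nn

    G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))   # noise -> fake
    D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCELoss()

    real = torch.randn(64, 2)    # stand-in for the labeled "real" dataset
    noise = torch.randn(64, 8)

    # Discriminator step: classify real as 1 and generated as 0.
    opt_d.zero_grad()
    d_loss = bce(D(real), torch.ones(64, 1)) + \
             bce(D(G(noise).detach()), torch.zeros(64, 1))
    d_loss.backward()
    opt_d.step()

    # Generator step: refine outputs until the discriminator is fooled.
    opt_g.zero_grad()
    g_loss = bce(D(G(noise)), torch.ones(64, 1))
    g_loss.backward()
    opt_g.step()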

Proposition LVII. The generation of synthetic data ~

Proof. ~ Synthetic data refers to data that is artificially generated rather than being collected from real-world events or phenomena ~ It is generated using algorithms and statistical methods to emulate the characteristics of real data—often with the aim of enhancing data privacy / augmenting datasets ~ or simulating various scenarios for testing and model training ~ In situations where collecting authentic data might be challenging—costly—or ethically questionable ~ synthetic data provides a valuable alternative ~ it can be tailored to represent diverse and rare scenarios that might not be easily accessible in naturally occurring datasets ~ This flexibility has made synthetic data especially appealing in fields like machine learning and artificial intelligence ~ where vast amounts of diverse data are essential for building robust and generalizable models ~583
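
One statistical route to synthesis ~ a toy sketch with NumPy ~ estimate the characteristics of a small real sample ~ then draw arbitrarily many new points that emulate it:

    import numpy as np

    rng = np.random.default_rng(0)
    real = rng.normal(loc=170.0, scale=8.0, size=50)   # e.g. 50 measured heights (cm)
    mu, sigma = real.mean(), real.std()                # estimate the characteristics
    synthetic = rng.normal(mu, sigma, size=10_000)     # generate far more than collected
    print(round(synthetic.mean(), 1), round(synthetic.std(), 1))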

Note. ~ Synthesis from synthesis ~

Proposition LVIII. Synthesis is the answer to scarcity ~

Proof.—In many research and development contexts—acquiring genuine data can pose significant challenges—These challenges can arise from logistical constraints—exorbitant costs—ethical dilemmas associated with data collection—For instance—medical trials may be limited by patient availability or the inherent risks of exposing participants to certain conditions—Similarly—gathering data from vulnerable populations might raise privacy and consent issues—In such scenarios ~ synthetic data emerges as a crucial solution ~ It is artificially generated ~ often using algorithms or models ~ to mimic the characteristics and behaviors of real-world data ~584

Note. ~ While synthetic data presents a promising alternative to real-world data collection ~ it raises critical questions about its integrity and validity ~ Since synthetic data is artificially generated ~ there is a legitimate concern about how well it mirrors the real world | Even with the most sophisticated generation techniques ~ can synthetic data truly capture the nuances ~ anomalies ~ unpredictabilities ~ inherent in genuine datasets ? Moreover—the very algorithms that produce this data might be influenced by biases—leading to synthetic datasets that are skewed or misrepresentative ~ “Blackbox models can be particularly opaque when it comes to generating synthetic data ~ Over parameterised generative models excel in producing high-dimensional synthetic data ~ but the levels of accuracy and privacy of these datasets are hard to estimate and can vary significantly across produced data points ~ ”585 This potential for inaccuracy could—in turn—affect the outcomes of any models or systems trained on such data ~ If critical decisions—in healthcare—finance—public policy—are based on insights derived from synthetic data ~ the consequences of any inaccuracies could be profound ~ Thus, while synthetic data offers expansion—growth / its credibility requires rigorous scrutiny—

Proposition LIX. What is unmodelled?

Proof.  Unmodelled “is a critical computational strategy that foregrounds the values that are missing from computational models   It renders visible the absence of specific data features   like transversal   ghosts that haunt the data structure ”586 Catherine Griffiths offers the concept Unmodelled to forward the absences and inaccuracies of artificial intelligence   “The Unmodelled point to the advantages of their absence to those in power  revealing whose priorities are being modeled and whose are not ”587

Another Proof.  “Unmodelled opens up resistance to simplistic models with messier contextually grounded lived experiences to address more complex  nuanced  and socially sensitive ethical considerations around AI”588

Note.   “Many-model thinking” is an ensemble approach to modeling   one that could overcome a single model’s blindspots and limitations.589

Proposition LX. Collective intelligence ~

Proof. ~ Artificial intelligence ~ Convolutional Neural Networks ~ Generative Adversarial Networks ~ Large Language Models ~ can be understood as models of collective intelligence ~

Note.—For Reza Negarestani ~ collective intelligence is not merely an aggregation of individual intelligences or a sum of parts ~ Instead ~ it is an emergent property that arises from networks of agents ~ both human and non-human ~ interacting in complex systems ~ He emphasizes the dynamic ~ self-organizing ~ and constantly evolving nature of such systems ~ “Artificiality is the reality of the mind ~ Mind has never been and will never have a given nature ~ It becomes mind by positing itself as the artefact of its own concept ~ By realizing itself as the artefact of its own concept ~ it becomes able to transform itself according to its own necessary concept by first identifying—and then replacing or modifying ~ its conditions of realization ~ disabling and enabling constraints ~ Mind is the craft of applying itself to itself ~ The history of the mind is therefore quite starkly the history of artificialization ~ Anyone and anything caught up in this history is predisposed to thoroughgoing reconstitution ~ Every ineffable will be theoretically disenchanted and every sacred will be practically desanctified ~ ”590 Intelligence becomes a hypothesis ~ a navigable space ~ open to intervention and revision ~ always in the making and driven by collective pursuits ~ Through this lens ~ AI is a reflection of the collaborative ~ inventive ~ and transducing behavior of all intelligence ~

Proposition LXI. Intelligence is siphoned—

Proof.—Siphoning generally refers to the process of drawing off or transferring liquid from one container to another—typically using a tube or pipe | The process often relies on atmospheric pressure and gravity | Once the liquid has started flowing ~ it will continue until the levels of liquid in both containers are equal or the flow is otherwise interrupted | In a broader or metaphorical sense ~ siphoning can be used to describe the act of drawing off or diverting resources—funds—or information—For example—one might say that funds were siphoned off from a project—implying they were redirected or stolen—591 

Proposition LXII. Through a sieve of silent exploitation—

Proof.—If siphoning is a macroscopic process driven by external forces—diffusion is a microscopic process driven by the inherent kinetic energy of particles and concentration gradients—diffusion refers to how a substance spreads—sometimes consistent and predictable—sometimes erratic or turbulent ~

Note. ~ Diffusion in AI image generation refers to a technique where noise is gradually added to an image to generate a sequence of noisy versions ~ These noisy images are then transformed into more realistic and detailed images through a reverse process ~ effectively diffusing noise to create visually coherent and high-quality images—diffusion also points to the idea of propagating information through layers of a model—between higher and lower dimensional spaces | the goal in generative AI is to ensure that the learned representations and transformations do not escalate / or diminish too rapidly—leading to poor model generalization—“Stable Diffusion XL is a latent text-to-image diffusion model capable of generating photo-realistic images given any text input ~ cultivates autonomous freedom to produce incredible imagery ~ empowers billions of people to create stunning art within seconds ~ ” 592
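
The forward half of the process ~ a sketch assuming PyTorch and a standard noise schedule ~ signal is gradually traded for Gaussian noise over many timesteps ~ a trained model would learn the reverse:

    import torch

    T = 1000
    betas = torch.linspace(1e-4, 0.02, T)            # noise schedule
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)   # cumulative signal retention

    image = torch.rand(3, 64, 64)                    # a clean image in [0, 1]
    t = 500                                          # an intermediate timestep
    noise = torch.randn_like(image)
    noisy = alphas_bar[t].sqrt() * image + (1 - alphas_bar[t]).sqrt() * noise
    # The reverse process ~ learned by the model ~ predicts and removes this noise.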

Proposition LXIII. Uncredited voices fuel the machine ~

Proof. ~ Ghost work 593 

Note. / The rise of artificial intelligence eclipses the immense hidden labor that underlies these systems _ At the heart of every advanced AI model is a vast trove of data that was meticulously labeled—cleaned—and processed—This task—often outsourced to data annotation farms or crowd-sourced platforms—requires countless human hours—Workers sift through thousands of images and videos—tagging and categorizing them to create datasets that the neural networks will eventually train on—This manual labor—often repetitive and undercompensated—is crucial for the AI’s subsequent ability to recognize—classify—and generate new ~ unseen data ~ Yet ~ the narratives around cutting-edge AI advancements rarely spotlight these individuals | rendering their indispensable contributions invisible in the grander scheme of AI evolution ~ The vision of the end-product—generalized models—blots out the foundational human labor that makes such advancements possible—

Corollary.—Human bias is embedded—

Proof.—This human-centric approach inadvertently means that AI models can inherit the biases and preconceptions of the people doing the labeling—If labelers have conscious or unconscious biases towards certain groups—ideologies—or concepts—these biases can be transferred to the data they are labeling—Consequently—when the AI model is trained on this data | it can reflect and even amplify these biases in its predictions or decisions / This introduces ethical concerns regarding the fairness—neutrality—and objectivity of AI systems ~ emphasizing the importance of scrutinizing and addressing the human elements embedded in model training ~594595596597598599600601602603604605606607608609610611612

Note.—The embedded biases in AI models can lead to dangers across numerous sectors ! Biased AI can perpetuate or even exacerbate existing social inequalities ! For instance—if an AI system used in hiring is trained on historical employment data—it might favor profiles that match those who have been hired in the past—potentially sidelining underrepresented or marginalized groups ! In medical settings—biased algorithms could prioritize care improperly or misdiagnose ailments ! leading to serious health repercussions for certain demographics ! AI models used for loan approvals or credit assessments might unduly favor or penalize individuals based on racial—gender—or socio-economic biases embedded in their training data ! Biased predictive policing systems could lead to increased scrutiny of certain racial or social groups ! Similarly—AI tools that predict recidivism rates or recommend sentencing might do so unfairly if trained on biased data ! In education—biased AI could affect admissions decisions ! limiting opportunities for certain groups of students ! AI systems—like chatbots or virtual assistants—can spread and reinforce harmful stereotypes if they generate content based on biased data ! In critical applications—such as autonomous vehicles—biases in recognizing individuals of different skin tones or sizes could lead to avoidable accidents ! AI systems in content recommendation—like those used by social media platforms ! can amplify echo chambers ! leading to polarization and a misinformed public ! As the public becomes more aware of these issues—there might be a loss of trust in institutions that use AI—hindering the acceptance and beneficial deployment of AI technologies ! Companies found to be using biased AI can face reputational damage ! loss of clientele ! and even legal consequences ! In the worst-case scenario—unchecked biases in AI can result in a dystopian society where decisions made by algorithms systematically marginalize certain groups ! leading to increased socio-economic disparities ! lack of social cohesion ! and widespread unrest !

Proposition LXIV. Innovation’s shadow is unacknowledged toil—

Proof.—New models of labor exploitation.

Corollary. ~ Generative Reconstruction—

Proposition LXV. Every generation carries hidden labor—

Proof. ~ Generative AI ~ particularly those models that are trained on vast swathes of data from the internet ~ has raised significant ethical concerns regarding the exploitation of artists ~ These models ~ in their bid to generate content ~ often rely on training data that includes artwork created by individual artists ~ By utilizing their work without explicit consent—acknowledgment—or compensation—the AI effectively appropriates and commodifies these artists’ unique styles and expressions ~ once the model is trained ~ it can replicate ~ mimic ~ or even generate art that bears a striking resemblance to the original artist’s style ~ undermining the value of the artist’s creativity _ 613 This not only deprives artists of potential future income but also devalues past work \ diluting the labor-intensive development of each creative voice ~ in many cases over the artist’s lifetime—“Artists want to be able to post their work online without the fear ‘of feeding this monster’ that could replace them ~ ”614

Corollary.—Siphoning from artists perpetuates systemic exploitation—

Proposition LXVI. Unpaid labor |

Proof.—There is an urgent need for models of credit and compensation—Addressing the lack of compensation and credit for artists whose work is integrated into AI training datasets has spurred various proposed solutions—One of the primary suggestions is the implementation of a royalty system—a ‘private contractual system that ensures some degree of compensation to the creator’615—akin to how musicians receive royalties from streaming platforms—ensuring artists get paid every time their work contributes to an AI’s function—There is also the concept of data trusts—where artists’ works are stored—and AI developers would need to access these trusts under specific terms and conditions—including compensation—Moreover—blockchain technology could be used to trace art origin and usage—guaranteeing credit attribution—Advocacy for transparent disclosure by AI companies about their training datasets is also gaining traction—Lastly—strengthening copyright laws and developing specific guidelines for digital and AI contexts can provide a legal foundation for artist protection and compensation—
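
To make the royalty proposal concrete: a per-use ledger in miniature. This is a hedged sketch, not any existing system; the artist names, the per-use rate, and the accounting scheme are all invented for illustration.

    from collections import defaultdict

    ledger = defaultdict(float)   # artist -> accrued royalties
    RATE = 0.0005                 # hypothetical payment per contributing use

    def record_use(contributing_artists):
        # Called whenever a model output draws on these artists' works.
        for artist in contributing_artists:
            ledger[artist] += RATE

    record_use(["artist_a", "artist_b"])
    record_use(["artist_a"])
    print(dict(ledger))  # {'artist_a': 0.001, 'artist_b': 0.0005}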

Corollary. ~ Consensual models ~

Note.—Holly Herndon ~ an artist and experimental musician616 ~ has proposed the concept of a whitelist to address issues surrounding the use of artists’ works in training AI models ~ Her idea revolves around creating an inclusive database or list where artists can willingly contribute their work for AI training ~ ensuring that only the works of those who have given explicit permission are used ~ By opting into the whitelist ~ artists can either grant free use of their creations or specify terms of use—including potential compensation—This system prioritizes consent and ensures that AI development respects artists’ sovereignty over their creative expression ~ thereby diverting tech and art communities toward more ethical pathways of ~
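
Herndon’s whitelist reduces, computationally, to a gate between archive and training set. A minimal sketch with invented field names and terms: works whose creators never opted in simply never reach the model, and the stated terms travel with the consent.

    # Hypothetical archive of captured works.
    works = [
        {"artist": "A", "asset": "song_01.wav"},
        {"artist": "B", "asset": "painting_07.png"},
        {"artist": "C", "asset": "mesh_12.obj"},
    ]

    # Hypothetical whitelist: only artists who opted in, with their stated terms.
    whitelist = {
        "A": {"use": "free"},
        "C": {"use": "licensed", "royalty_per_epoch": 0.01},
    }

    training_set = [w for w in works if w["artist"] in whitelist]
    print(training_set)  # B's work is excluded: no consent, no capture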

Proposition LXVII. Hallucinating reconstructions ~

Proof.—In ChatGPT Is a Blurry JPEG of the Web ~ Sci-Fi and Fantasy writer Ted Chiang explains that “hallucinations are compression artifacts ~ but—like the incorrect labels generated by the Xerox photocopier—they are plausible enough that identifying them requires comparing them against the originals | which in this case means either the Web or our own knowledge of the world ~ When we think about them this way ~ such hallucinations are anything but surprising—if a compression algorithm is designed to reconstruct text after ninety-nine per cent of the original has been discarded—we should expect that significant portions of what it generates will be entirely fabricated ~ ”617
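
Chiang’s ninety-nine per cent can be made numeric. A rough analogy in NumPy (an illustration of lossy reconstruction, not his method): keep one per cent of a signal, rebuild the rest by interpolation, and nearly every output value is plausible yet fabricated.

    import numpy as np

    original = np.sin(np.linspace(0, 20, 1000))   # stand-in for the source: "the Web"
    kept_idx = np.arange(0, 1000, 100)            # retain 1% of the samples
    compressed = original[kept_idx]

    # Reconstruction: interpolate between the surviving points.
    reconstructed = np.interp(np.arange(1000), kept_idx, compressed)

    # 990 of the 1000 values were never stored; they are smooth, plausible inventions.
    print(f"mean fabrication per sample: {np.abs(reconstructed - original).mean():.3f}")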

Proposition LXVIII. Invented realities ~

Proof. ~ Generative AI has evolved beyond generating static images and has entered the realm of three-dimensional modeling. It now has the capability to generate meshes ~ MeshDiffusion618 ~ and point clouds ~ Point-E ~ “an alternative method for 3D object generation which produces 3D models in only 1-2 minutes on a single GPU ~ ”619 Generative AI will revolutionize every industry that relies on 3D models ~ It allows for the creation of hyperreal 3D assets that were once painstakingly hand-crafted ~ enabling faster and more efficient content creation ~ Synthesis takes on different dimensions ~ The “method first generates a single synthetic view using a text-to-image diffusion model ~ and then produces a 3D point cloud using a second diffusion model which conditions on the generated image ~ ”620 This expansion into 3D space unlocks a myriad of creative possibilities and opens doors to new applications across various domains ~ from immersive virtual worlds to advanced simulations and beyond ~ As generative AI continues to advance ~ it promises to reshape how we interact with and perceive digital twins ~
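
The quoted two-stage structure, reduced to a skeleton. Every function below is a hypothetical stand-in rather than the Point-E API; only the shape of the pipeline, text to single view to conditioned point cloud, comes from the paper.

    import numpy as np

    def text_to_image(prompt):
        # Stand-in for a text-conditioned image diffusion model (hypothetical).
        return np.zeros((64, 64, 3))

    def image_to_point_cloud(image):
        # Stand-in for an image-conditioned point-cloud diffusion model (hypothetical).
        return np.random.rand(1024, 3)   # 1024 xyz points

    def generate_3d(prompt):
        view = text_to_image(prompt)          # stage 1: one synthetic view
        return image_to_point_cloud(view)     # stage 2: 3D points conditioned on that view

    print(generate_3d("a terracotta vase").shape)  # (1024, 3)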

Note.—The digital twin cannot be trusted.

Proposition LXIX. Dimensionality is contorted ~

Proof.—Dimensional models generated by AI are even easier to manipulate and change ~ Generated 3D structures can also be harnessed for alterations and deformations ~ 3D Morphable Models ~ “As the 3DMM is built on 3D scans—it provides powerful prior information—which allows us to use much better estimates of the geometry than would be possible from 2D alone—The other reason is that the 3DMM acts as a sort of low-dimensional latent representation of a person’s face—This is much easier to manipulate than pixels ~ ”621 This flexibility can be a double-edged sword || as it raises concerns about the potential for misuse and manipulation ~ individuals can modify 3D models ~ objects and environments ~ or generate new ones potentially for deceptive or unethical purposes ~
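The “low-dimensional latent representation” in that quote is, at its core, linear algebra: a face is a mean shape plus a weighted sum of basis deformations learned from scans. A minimal sketch with random stand-ins for the learned data; editing one coefficient edits the entire geometry.

    import numpy as np

    n_vertices, n_components = 5000, 40
    mean_shape = np.zeros(3 * n_vertices)                  # stand-in for the scanned average face
    basis = np.random.randn(3 * n_vertices, n_components)  # stand-in for a learned (e.g., PCA) basis

    coeffs = np.zeros(n_components)
    coeffs[3] = 2.5   # push one latent direction: the whole surface deforms

    face = mean_shape + basis @ coeffs        # a full 3D shape from 40 numbers
    print(face.reshape(n_vertices, 3).shape)  # (5000, 3) xyz vertices, ready to manipulate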

Corollary.—Alternate realities ~

Note.—hyperreal puppets—

Proposition LXX. Vulnerable to falsity and manipulation ~

Proof.—They always have been—

Note.—Accelerating fiction ~

Proposition LXXI. The deep vortex of fakery ~

Proof.—Deepfakes ~

Note. ~ Deepfakes refer to highly sophisticated and deceptive artificial media ~ including manipulated images ~ video ~ audio ~ and other reconstructions ~ created using advanced machine learning techniques ~ particularly deep neural networks ~ The term deepfake is a portmanteau of deep learning and fake ~ It gained prominence in late 2017 when a Reddit622 user with the pseudonym deepfakes began sharing manipulated videos on the internet ~ primarily involving the insertion of celebrities’ faces into explicit adult content ~ 623 These early deepfake creations ~ although controversial and unethical ~ demonstrated the powerful capabilities of deep learning algorithms to convincingly alter and manipulate visual and auditory content ~ The ability of deepfakes to impersonate individuals introduces considerable challenges in verifying the authenticity of audio or video content ~ Cybercriminals may exploit this technology for identity theft ! fraud ! or impersonation in diverse online and offline contexts ! Deepfakes can feature in financial scams ! including voice phishing attacks ! where fraudsters impersonate trusted individuals ! potentially resulting in financial losses for victims ! The use of deepfakes in social engineering attacks can involve impersonating trusted contacts ! coaxing targeted individuals into disclosing sensitive information or performing actions they would not otherwise undertake _ Deepfakes can also be exploited to craft non-consensual explicit content involving individuals ! resulting in violations of privacy and the potential for harassment or blackmail ! Individuals subjected to deepfake abuse or harassment may endure personal and professional repercussions ! and severe psychological and emotional distress ! resulting in mental health issues ! trauma ! anxiety ! 624

Proposition LXXII. The free man never acts fraudulently ~ but always in good faith .

Proof.—Deepfakes raise intricate legal and ethical dilemmas ~ encompassing issues related to intellectual property ~ defamation ~ privacy rights ~ and consent ~ Existing legal frameworks may encounter difficulties in effectively addressing these multidimensional questions ~ In legal contexts—deepfakes may serve to fabricate evidence ~ leading to wrongful convictions or acquittals ! are they already ? Or real evidence is defamed as deepfake !625 Such manipulation undermines the integrity of court proceedings _

Proposition LXXIII. Deception is destabilizing ~

Proof.~ The erosion of trust—

Note. ~ Deepfakes have the capacity to generate authentic-looking videos or audio recordings featuring public figures—politicians—or celebrities engaged in actions or utterances they never actually performed ~ Such capabilities can be weaponized to disseminate false information and manipulate public opinion ! potentially impacting electoral outcomes ! 626 Malicious actors may harness deepfake technology to create persuasive fake videos or audio recordings of government officials or military personnel ! This could lead to national security threats or geopolitical conflicts ! 627

Proof. ~ “I think there is going to be a point where we can throw absolutely everything that we have at this ~ at these types of techniques ~ and there is still some question about whether it is authentic or not ~ ”628 As deepfakes grow more sophisticated—trust in evidence can erode ~ Distinguishing between genuine and forged content becomes increasingly challenging _ 629 undermining trust in media— institutions—and even interpersonal relationships ! To address these concerns—ongoing efforts encompass research and development in deepfake detection and mitigation technologies / as well as initiatives to raise awareness about the risks associated with deepfakes ~ Legal and regulatory frameworks are evolving to grapple with the challenges posed by this technology ~ What future flows ?

APPENDIX.

What I have said in this Part concerning the right way of life has not been arranged, so as to admit of being seen at one view, but has been set forth piecemeal, according as I thought each Proposition could most readily be deduced from what preceded it. I propose, therefore, to rearrange my remarks and to bring them under leading heads.

I. Every generation carries hidden labor—

II. Unpaid labor |

III. Uncredited voices fuel the machine—

IV. Innovation’s shadow is unacknowledged toil—

V. Intelligence is siphoned—

VI. Through a sieve of silent exploitation—

VII. Recognition is convoluted ~

VIII. Segmentation is reduction—

IX. The image is a function of loss \

X. Identity can be hacked ~

XI. We can rewrite ideologies ~

XII. Induce planetary-scale transduction ~

XIII. The goal is freedom .

XIV. Agency explodes binary systems ~

XV. Platforms uphold technocapital

XVI. Hallucinating reconstructions ~

XVII. Invented realities ~

XVIII. Adversarial networks generate ~

XIX. Spaces of play and violence ~

XX. Autonomous weapons ~

XXI. Weapons with a map of the world—

XXII. Unsupervised ~

XXIII. Vulnerable to falsity and manipulation ~

XXIV. The deep vortex of fakery ~

XXV. Deception is destabilizing ~

XXVI. Positive feedback loops lead to instability /

XXVII. Negative feedback loops are checks on acceleration \

XXVIII. Equilibrium requires supervision—

XXIX. Collective intelligence ~

XXX. Supervision is super power .

Supervision encompasses both extraordinary visual capabilities and the potential for control, making it a dual force in shaping our environment. On one hand, supervision represents superhuman powers of observation, granting us the ability to perceive and comprehend what was previously unseen. It opens doors to new discoveries, insights, and creative possibilities. On the other hand, supervision implies an ecosystem of control, where we actively monitor, manage, and regulate the systems and structures around us. Self-aware reconstruction emerges when we navigate this interplay with awareness and intention. It is through this self-aware engagement that we foster sustainable development.

XXXI. Self-aware Reconstruction is—regeneration.

The interplay between autonomously synthesizing new forms of life and identifying and rectifying damage or flaws requires constant rebalancing. Regeneration involves recognizing and addressing issues—whether physical, mechanical, or conceptual—to restore integrity. It involves analysis, assessment, and skillful intervention to mend or improve existing conditions. Mechanisms of regeneration highlight the inherent capacity of systems to recognize their own limitations, adapt to changing circumstances, and actively reconfigure themselves. These mechanisms utilize feedback loops to continuously modify their own structure or behavior. Regeneration is repair.

XXXII. But human power is extremely limited, and is infinitely surpassed by the power of external causes; we have not, therefore, an absolute power of shaping to our use those things which are without us. Human biases are embedded in technologies of Capture and Reconstruction. These evolving tools are—and will be—used to perpetuate broken systems. This is their default mode. Or we can try to bend them in another direction. To serve rather than force others to serve. The type of change is crucial but not guaranteed. Its vectors of transformation are not yet defined. Part V simulates future trajectories.











PART V. 

OF THE ETHICS OF SUPERVISION, OR QUANTUM EROTICS

PREFACE

At length I pass to the remaining portion of my Ethics, which is concerned with the way leading to freedom.

The word ama-gi is considered the earliest written mention of the concept of freedom. Although it has been adopted as a symbol for libertarianism in contemporary politics, it was originally a Sumerian term for the release from obligations, debts, slavery, taxation, or punishment.630 Etymologically, ama-gi— 𒂼𒄄—derives from the Sumerian word for mother—and the word for restore or return. The literal translation is—returning to mother.631 Its first documented use was on the Enmetena foundation stone, emphasizing familial reunification—“the child to his mother and the mother to her child.”632 Over time, it evolved into a legal term denoting the freeing of individuals.

The Book of Exodus is imprinted in western cultural memory as the foundational narrative of freedom. It is an epic tale of liberation from the shackles of oppression, as well as a transformative journey from a hierarchically exploitative system to one grounded in communal care.633 While the dramatic escape—marked by divine intervention, plagues, and the parting of the Red Sea—often takes center stage in our collective consciousness, what unfolds after this journey is a profound reimagining of societal norms. Having departed from a system of absolute power and subjugation, the community freely commits to a new model: the Ten Commandments. These rules provide an ethical framework, pivoting the community from autocratic rule to mutual respect, responsibility, and rest—rest—perhaps the most radical shift after generations of forced labor:

זָכוֹר אֶת-יוֹם הַשַּׁבָּת, לְקַדְּשׁוֹ        

Remember the sabbath day, to keep it holy.634

The Ark of the Covenant and the Tabernacle are the sacred spaces designed to enshrine this new ethical code. Their construction follows precise measurements, reflecting intentional engineering and ritual separation:

וְעָשׂוּ אֲרוֹן, עֲצֵי שִׁטִּים:  אַמָּתַיִם וָחֵצִי אָרְכּוֹ, וְאַמָּה וָחֵצִי רָחְבּוֹ, וְאַמָּה וָחֵצִי, קֹמָתו

And they shall make an ark of acacia-wood: two cubits and a half shall be the length thereof, and a cubit and a half the breadth thereof, and a cubit and a half the height thereof.635

These are spaces—set apart. Meticulous measurements designate the holiness of the nascent societal framework—an alternative operating system that exalts equity and distributed well-being over authoritarian power.

Yet, despite this shift towards a system that honors rest and communal care, vestiges of the old order linger. This is even evident in the unit of measurement employed: the cubit. When we think of measurement, we often imagine cold, sterile units—devoid of the flesh and blood that is life. Yet, the ancient unit—the cubit—was rooted in the very sinews and bones of humanity. It was derived from the sovereign’s body, specifically the length from the elbow to the tip of the middle finger.636 The power to point. To dictate. To command the force of arms. The cubit is a representation of supremacy—an emblem of the hierarchical system left behind—a reminder that no transition is absolute. Even as they embark on their transformative journey, echoes of their past remain, embedded in the very units they use to measure out their new world.

Fast forward millennia, and we are once more designing a new set of precise spaces—quantum computers—with a similar sounding unit—the qubit. The term qubit was coined by Benjamin Schumacher in 1993: “...replacing the classical idea of a binary digit with a quantum two-state system, such as the spin of an electron. These quantum bits, or ‘qubits’ are the fundamental units of quantum information.”637 Qubits are non-binary and have unique quantum properties—a superposition of states.

Superposition is a foundational concept in quantum mechanics. Unlike a classical bit, which can only be off or on—0 or 1—a qubit can occupy an infinite number of states between 0 and 1. Imagine a satellite circling the earth—the South Pole is 0 and the North Pole is 1. The satellite could potentially be located over any point on the surface of the planet. In The Queer Universe: A Quantum Explanation, particle physicist Dr. Jessica Esquivel explains “In our macroscopic world, binary categorizations and absolutes seem to be the norm, but if we tunnel down to the smallest, most elementary particles of our universe, we enter a world where queerness and chaos reign supreme.”638 This ability to be in multiple states concurrently provides quantum computers with their immense parallel processing power, enabling them to perform countless simultaneous calculations. Queer computation! However, when measured, the qubit is polarized to one of its definite states, either 0 or 1. Returning to the satellite analogy—if it is over the Southern Hemisphere, it computes to 0, if it is over the Northern Hemisphere it computes to 1, and if it is over the equator it is equally probable that it will compute to 0 or 1. Measurement forces ambiguity into a binary: “As with all quantum devices, a qubit is a delicate flower. If you so much as look at it, you destroy it.”639 
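
The satellite analogy can be run. A minimal simulation, assuming only NumPy: a qubit α|0⟩ + β|1⟩ collapses on measurement to 0 or 1 with probabilities |α|² and |β|², so the “equator” state lands on each pole half the time.

    import numpy as np

    alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)        # equal superposition: "over the equator"
    assert np.isclose(abs(alpha)**2 + abs(beta)**2, 1)  # a valid state is normalized

    rng = np.random.default_rng()
    samples = rng.choice([0, 1], size=10_000, p=[abs(alpha)**2, abs(beta)**2])
    print(samples.mean())  # ~0.5: measurement forces the ambiguity into a binary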

Entanglement, which Albert Einstein described as “spooky action at a distance,”640 is another foundational concept in quantum computing. When two quantum particles become entangled, the state of one particle becomes instantaneously dependent on the state of the other, no matter the distance that separates them: “When two systems, of which we know the states by their respective representatives, enter into temporary physical interaction due to known forces between them, and when after a time of mutual influence the systems separate again, then they can no longer be described in the same way as before … By the interaction the two representatives [the quantum states] have become entangled.”641 In quantum computing, entanglement means that qubits influence one another—an integrated system. This deep interconnectedness allows quantum algorithms to explore a vast number of computational pathways simultaneously, leading to faster problem-solving and the potential to tackle complex challenges that are currently insurmountable for classical computers.
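
The smallest entangled system can be sampled the same way. A sketch of the Bell state (|00⟩ + |11⟩)/√2: the two qubits, however far apart we imagine them, always agree.

    import numpy as np

    state = np.zeros(4, dtype=complex)
    state[0b00] = state[0b11] = 1 / np.sqrt(2)   # amplitudes on |00> and |11> only

    probs = np.abs(state)**2                     # Born rule: outcome probabilities
    rng = np.random.default_rng()
    for outcome in rng.choice(4, size=5, p=probs):
        print(f"qubit A = {outcome >> 1}, qubit B = {outcome & 1}")  # always equal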

These mechanics reveal themselves in Reconstruction at the quantum scale: “Quantum process tomography is an experimental technique to fully characterize an unknown quantum process.”642 The 2021 paper Variational Quantum Process Tomography outlines the implementation of machine learning to improve quantum reconstruction. Quantum Reconstruction redefines the resolution of our understanding of the mechanics of the universe. Reconstruction of quantum states forms the bedrock of our implementation of quantum computing: “Accurately inferring the state of a quantum device from the results of measurements is a crucial task in building quantum information processing hardware.”643 Quantum process tomography allows for a comprehensive description of quantum operations, predicting quantum outcomes with remarkable precision. Reconstruction of quantum probabilities enables the transition from theoretical quantum physics to tangible quantum technologies.
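
Full process tomography will not fit in a few lines, but its single-qubit cousin, state tomography, carries the same logic of reconstruction from measurement: the density matrix is rebuilt from measured Pauli expectation values as ρ = (I + ⟨X⟩X + ⟨Y⟩Y + ⟨Z⟩Z)/2. The toy below assumes ideal measurements of the |+⟩ state.

    import numpy as np

    I = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    ex, ey, ez = 1.0, 0.0, 0.0                 # hypothetical measured <X>, <Y>, <Z> for |+>
    rho = (I + ex * X + ey * Y + ez * Z) / 2   # the reconstructed state

    print(np.round(rho.real, 3))  # [[0.5 0.5]
                                  #  [0.5 0.5]]: the |+> density matrix, recovered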

The quantum computer, an ark of sorts, thrums with power. It is a space—set apart. In fact, separation belies its functionality—its very existence. Just as ancient sacred sites were designed to shield the sanctum from external influences, quantum computers strive to isolate themselves from the cacophony of the outside world. All to avoid Decoherence.

Fragile quantum dynamics, such as superposition, can falter and collapse into mundane, classical outcomes when subjected to external influence. Such a collapse deprives a quantum system of its complexity.644 To guard against this destructive interaction with the environment and uphold the delicate states of qubits, quantum computers employ a myriad of carefully crafted measures. They are cooled in cryogenic chambers to temperatures colder than the vast emptiness of deep space, ensuring minimal thermal vibrations: “Currently, they depend on large, complex, expensive systems known as dilution refrigerators, which use multiple stages of cooling to chill circuits to 1 kelvin or below. The complexity of these refrigerators is greatest at the coldest stage, which involves mixing different isotopes of liquid helium.”645 The most promising tactic to achieve a stable system is superconductivity—using materials that have the remarkable ability to conduct electricity without resistance: “Over the last two decades, tremendous advances have been made for constructing large-scale quantum computers. In particular, the quantum processor architecture based on superconducting qubits has become the leading candidate for scalable quantum computing platform.”646 

Layers upon layers of protection. These temples of computation are cloaked in electromagnetic shields, warding off the faintest whispers of radiation, be it from a distant radio tower or Earth’s own magnetic fields. They are secured within vacuum chambers, insulated from the unpredictable jostle of air molecules. Vibrations and other waves are kept at bay with sophisticated isolation mechanisms: “radiation shielding reduces the flux of ionizing radiation and thereby increases the energy-relaxation time. Albeit a small effect for today’s qubits, reducing or mitigating the impact of ionizing radiation will be critical for realizing fault-tolerant superconducting quantum computers.”647 These defenses extend into the domain of data. Error correction protocols act as vigilant sentinels, detecting and mending quantum errors. Precision is paramount; from the exact calibration of control signals to the optimized architecture of the quantum chip, every detail is fine-tuned to minimize chances of decoherence. Speed, too, becomes a strategy, as faster quantum operations leave less room for decoherence to seep in.
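
The error-correcting “sentinels” rest on redundancy. A classical toy of the idea, not a quantum code: encode one logical bit as three physical bits, let noise flip each with some probability, and recover by majority vote; the logical error rate falls well below the physical one.

    import random

    def encode(bit):
        return [bit, bit, bit]            # one logical bit -> three physical bits

    def noisy(bits, p=0.1):
        return [b ^ (random.random() < p) for b in bits]   # each bit flips with prob p

    def decode(bits):
        return int(sum(bits) >= 2)        # majority vote mends a single flip

    trials = 10_000
    failures = sum(decode(noisy(encode(1))) != 1 for _ in range(trials))
    print(failures / trials)  # ~0.028 logical error rate vs 0.1 physical flip rate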

But so far, one space always leaks into the other. Quantum computers are riddled with holes through which the external world flows in, much like the inevitable earthly intrusions into ancient sanctums. Corrupting. This porosity is one of the primary obstacles to the full realization of quantum computing. Isolation is an illusion. There is no impervious seal. Reality refuses to be compartmentalized. Total separation is a fantasy—the fantasy of complete control. And its pursuit is an act of violence.

But what if it is achieved? Or close enough? What is possible? Quantum computers, with their inherent ability to process vast amounts of data simultaneously, have a natural affinity for simulating real-world quantum systems. Quantum twins. In the realm of material science, quantum simulators could be pivotal in the discovery of novel materials with desired properties, like superconductors that operate at room temperature, which could revolutionize energy transmission and storage: “Quantum computers hold promise to enable efficient quantum mechanical simulations of weakly and strongly-correlated molecules and materials alike; in particular when using quantum computers, one is able to simulate systems of interacting electrons exponentially faster than using classical computers.”648 In the pharmaceutical sector, simulating complex biological systems at the quantum level might lead to the design of more effective drugs and treatments, accelerating the path to cures for the world’s most intractable diseases.649 Furthermore, by simulating quantum states and other phenomena, quantum computers may unlock unknown realms of physics, answering fundamental questions about the universe. Reconstructions are the foundation for all of these simulations. Vestiges of the old order linger.

Quantum supremacy—the moment when quantum computers eclipse the capabilities of their classical counterparts—is on the horizon.650 In 2019, Google announced that its quantum computer, Sycamore, solved a specific problem in 200 seconds that would take the world's most powerful supercomputers over 10,000 years to solve. This was a significant milestone in the field of quantum computing, marking the first time a quantum computer outperformed a classical computer at a specific task.651 While this claim of supremacy has been challenged and debunked by IBM and other competitors, the milestone still holds technical significance.652 More importantly, it raises pressing questions about power dynamics in the age of quantum technology. Entities that harness the might of quantum computing first will undoubtedly wield tremendous power, from deciphering encrypted data to simulating complex natural processes.653 The critical task lies in ensuring that quantum technology does not exacerbate existing inequities. As we stand on the brink of quantum supremacy, the challenge ahead is not merely technological but also ethical. Can we use quantum simulations—grounded as they are in problematic reconstructions—to invent strategies of collective responsibility, equity, and healing? Can we simulate freedom from oppression, injustice, and illness?

The desire for freedom is often confused with the desire for control. Self-mastery. Unconstrained range of motion. No limit. Freedom is a precious commodity: “In an odd extension of commodity fetishism, we now wish to be as free as our commodities: by freeing markets, we free ourselves.”654 Freedom is also deeply political. Its meaning and implications shift according to the ideological lenses of those invoking it. In Control and Freedom: Power and Paranoia in the Age of Fiber Optics, Wendy Chun draws from the fields of computer science, political and social theory, and cultural studies, to explore the current political climate and the paradoxical relationship between control and freedom. Chun argues that in the age of technocapital, our identities are inherently networked, shaped and influenced by our interactions within technology’s complex web of connections. This includes our interactions with other people, information, and systems. The networked self is not an isolated entity but a node within a larger, interconnected system, subject to more advanced forms of control and surveillance, often disguised as freedom or convenience.655

Spinoza’s Ethics, which structures this text, is an algorithm for living with maximum freedom within the network that is reality. It is not an imposed morality of good and evil, but rather a series of conditional instructions for developing agency, or as Spinoza calls it—activity: “the human body can be affected in many ways, whereby its power of activity is increased or diminished.”656 His algorithm, or as Deleuze calls it, his “typology of immanent modes of existence,”657 is based on the ontological premise that everything in existence—substance, thought, affect—is interconnected—is one. And everything has activity: “in proportion as each thing possesses more of perfection, so is it more active, and less passive; and, vice versâ, in proportion as it is more active, so is it more perfect.”658 As Elizabeth Grosz attests in The Incorporeal, “for Spinoza, ethics is a movement oriented by encounters with others, other humans and human institutions, other living beings, and the nonliving material order that constitutes the whole of nature, an ethics not based on autonomy and self-containment, the quelling of external impingements, but through engagements that enhance or deplete one’s powers.”659

Spinoza’s monistic ontology is radically different from dominant models in Western thought that invest in transcendence and hierarchies of being. In a way, he was returning to the original seed of Judaism, the rejection of vertical power. He believed that “the orders of thought and matter are two different attributes of a single, cosmological, immanent substance made up of many parts, many orders and capacities.” As a result, Spinoza was excommunicated from his Jewish community and held at a distance in both Christian and secular circles. Although he wrote prolifically, he published very little during his lifetime. Spinoza’s proposition that everything is in God was considered so scandalous, he and his few advocates worried that circulating this idea would put him in mortal danger. He was also fearful that if he accepted a university professorship, the academic institution would constrain what he could teach, write, and even think.660

Instead, he supported himself through precision measurement. Grinding lenses. He was considered the best grinder in Europe and an extraordinary number of microscopes and telescopes—through which Enlightenment discoveries were made—featured lenses ground by his hand. Although there is little documentation on the subject, it seems likely that his daily practice of wrestling with physical material and observing its dynamic properties informed his philosophical outlook. He was intimately familiar with a process in which supposedly-inert matter reveals surprising potential: sand transforms into glass, which in turn transforms human vision. The optics that Spinoza formed would have given him access to other orders of magnitude—infinite complexity at varying scales as well as the isomorphisms between the imperceptibly small and the astonishingly vast—supervision. These powers of sight surely would have reinforced his feeling that everything is connected. Ironically, Spinoza’s manual occupation, rather than his provocative texts, led to his early death at the age of forty-four. He inhaled glass dust so regularly while grinding that he eventually succumbed to lung disease. Spinoza’s death dissolved the threat of corporeal consequence that might accompany the distribution of his ideas; shortly thereafter, his friends published his collected writings in the Opera Posthuma661 (trans. Posthumous Work); the text is the seedbed for contemporary Posthuman discourse. In Posthumanism, relationality is the foundation for action and encompasses all things: “an ethics that addresses not just human life in its interhuman relations, but relations between the human and an entire world, both organic and inorganic.”662

In Practical Philosophy, Gilles Deleuze unpacks Spinoza’s conception of these dynamic relations: “When a body ‘encounters’ another body, or an idea another idea, it happens that the two relations sometimes combine to form a more powerful whole, and sometimes one decomposes the other, destroying the cohesion of its parts … we experience joy when a body encounters ours and enters into composition with it, and sadness when, on the contrary, a body or an idea threatens our own coherence.”663 We experience pain when we are reduced. Decoherence. As explained earlier, quantum decoherence describes the phenomenon wherein a quantum system is robbed of its complexity. This recalls the reductive force of Reconstruction (III.), which simplifies multidimensional identities to fit narrow categories, collapsing vibrant matter into captives. Both processes involve a loss of intrinsic qualities—in quantum systems, it is the totality of entangled superpositions, and in human life, it is the infinite of each individual in relation. In Totality and Infinity, Emmanuel Levinas developed an ethics emanating from the face-to-face encounter, revealing that which computation pathologically collapses: “To approach the Other in conversation is to welcome his expression, in which at each instant he overflows the idea a thought would carry away from it. It is therefore to receive from the Other beyond the capacity of the I, which means exactly: to have the idea of infinity.”664

The Posthuman Turn shifts our gaze outward—face-to-face with the world. It is a reaction against the correlationist line of thought in Western philosophy which argues that nothing outside of Man’s own mind can be known or even verified as real. It is a rejection of human-centered evaluations of existence, as well as the limited category of Man as subject, passive-mechanical conceptions of non-human lifeforms, teleological visions of history, and universal moral systems. It is an explicit response to the damage inflicted in Cartesian, colonial models. It finds everything leaking. Encouraged by Gilles Deleuze and Felix Guattari—the Posthuman Turn rewinds Western philosophy to the debate between René Descartes and Baruch Spinoza—choosing a new prince and with him an alternate reality. Resuscitated, Spinoza’s idea of infinite immanence, his attention to the role of affect in action, and his articulation of ethics as the unfolding of idiosyncratic processes, form the ground from which New Materialists speak. The Posthuman Turn reflects renewed interest in things beyond the mind—it drains the vat665—suspending skepticism to attend to non-human objects, animals, and ecosystems—sacred life . Like the Copernican revolution, it is decentering, it sets things in motion. Ultimately, it is a shift from being to becoming.

Selection is revealing.        

AXIOMS.

I. Nothing is stable or separate ~ everything is entangled and in flux.

II. Every being ~ every actant ~ is also an unfolding event.

(This axiom is evident from V. vii.)

PROPOSITIONS.

Proposition I. Even as thoughts and the ideas of things are arranged and associated in the mind, so are the modifications of body or the images of things precisely in the same way arranged and associated in the body.

Proof.—Although many contemporary Posthuman thinkers also identify as New Materialists, it is important to note that they are not referring to Marxist dialectical materialism nor are they presenting atomistic readings of the actual, rather, they are concerned with the complex, unfolding interrelations of matter and language.

Proposition II. New Materialism upholds embeddedness666 and interconnectedness as both physically resonant and ethically affective ways of conceptualizing existence.

Proof.—Rosi Braidotti—who solidified both the Posthuman and New Materialism as terms to represent this shift in ontology—explains that “the issue of the relationship between the material and the maternal was crucial for [her] generation.”667 Returning to mother—she traces a genealogy of feminist thinkers—Simone de Beauvoir—Luce Irigaray—Donna Haraway—Moira Gatens—Michelle Perrot—Genevieve Lloyd—Joan Scott—and countless others who advanced non-dualist models—fully embedded networks of relations. Braidotti recounts that this legacy allows Feminist New Materialists to refuse the separation of physical matter and discursive matter—returning to matter.

Proposition III. Matter and meaning are inextricably linked.

Proof.—In different ways, the authors of this field argue that the words we use to think, speak, and write have material implications. As a result, this field has generated an explosion of new terms which offer a refreshing—albeit sometimes inaccessible—way of grappling with existence and experience. A shared foundation of this mode of terminological invention is Michel Foucault’s seminal text The Order of Things: An Archeology of Human Sciences. Anecdotally, Foucault was displeased with the English translation of his text’s title.668 The original—Les Mots et Les Choses—translates directly to Words and Things. This more accurate title, however, was too similar to another text that was published contemporaneously.

Corollary.—Words and things ~ entanglements of language and materiality.

Proposition IV. There is no center and no supremacy.

Proof.—Deleuze and Guattari extend Spinoza’s line of thinking to address the interrelations of words and things through their rhizomatic model.

Corollary.—Everything occupies the horizontal plane of immanence equally.

Note.—In The Democracy of Objects, Levi Bryant offers up flat ontology, which “is not the thesis that all objects contribute equally, but that all objects equally exist. In its ontological egalitarianism, what flat ontology thus refuses is the erasure of any object as the mere construction of another object.”669 Manuel Delanda also advances a flat ontology, arguing that “while an ontology based on relations between general types and particular instances is hierarchical, each level representing a different ontological category (organism, species, genera), an approach in terms of interacting parts and emergent wholes leads to a flat ontology, one made exclusively of unique, singular individuals, differing in spatio-temporal scale but not in ontological status.”670 In conversation with Delanda, Graham Harman clarifies that flat ontology “makes no initial decision about the ranks among different kinds of entities. Any philosophy that is intrinsically committed to human subjects and dead matter as two sides of a great ontological divide—like Meillassoux’s—fails the flat ontology test.”671 All substance is alive and co-present in the unfolding networks that make up the dynamic world in which we live.

Proposition V. Anything can be an actant.

Proof.—Bruno Latour’s Actor-Network Theory “implies no special motivation of human individual actors, nor of humans in general. An actant can literally be anything provided it is granted to be the source of an action.”672

Proposition VI. Human and non-human actants share material thing-power.

Proof.—In Vibrant Matter: A Political Ecology of Things, Jane Bennett explains the implications of this kind of Vital Materialism: “if matter itself is lively, then not only is the difference between subjects and objects minimized, but the status of the shared materiality of all things is elevated. All bodies become more than mere objects, as the thing-powers of resistance and protean agency are brought into sharper relief.”673

Note.—Rosi Braidotti explains that vitalist materialism is “a concept that helps us make sense of that external dimension, which in fact enfolds within the subject as the internalized score of cosmic vibrations. It also constitutes the core of a posthuman sensibility that aims at overcoming anthropocentrism.”674 She demands a multiplication of subjects that encompass non-human beings, “expanding the notion of Life towards the non-human or zoe … re-grounding claims to subjectivity, connections and community among subjects of the human and the non-human kind.”675

Proposition VII. Entanglement is scientifically observable and mathematically valid.

Proof.—In Meeting the Universe Halfway, Karen Barad emphasizes this concept of radically shared existence: “to be entangled is not simply to be intertwined with another, as in the joining of separate entities, but to lack an independent, self-contained existence.”676 She contextualizes this assertion with discoveries from quantum physics: “indeed, recent studies of diffraction (interference) phenomena have provided insights about the nature of the entanglement of quantum states, and have enabled physicists to test metaphysical ideas in the lab. So while it is true that diffraction apparatuses measure the effects of difference, even more profoundly they highlight, exhibit, and make evident the entangled structure of the changing and contingent ontology of the world, including the ontology of knowing.”677 Knowing—feeling—arises from the multiplicity of influences within quantum entanglements.

Proposition VIII. An emotion is stronger in proportion to the number of simultaneous concurrent causes whereby it is aroused.

Proof.—Many simultaneous causes means that transformation is non-linear and multidirectional (III.vii.): therefore multidimensionally conditional (IV.v.), in proportion to the increased number of simultaneous causes whereby it is aroused, an emotion becomes stronger. Entanglement produces state-changes. Unspooling state-changes—along every vector.

Note.—This proposition is also evident from V. Ax. ii.

Proposition IX. Agency exists within entanglement.

Proof.—Nietzsche argues that each thing has agency and must try to direct its life course, exercising its “capacity to utilize for oneself the chain of causes, the lines of linkage, that connect any thing to all others.”678 He saw Spinoza as a rare kindred spirit, who like him, was concerned with the ways in which one could maximize freedom. He sought tools for living life fully and in accordance with one’s own specifications, instead of following prescribed conventions. His concept of the eternal return was both test and motivation to hold awareness of one’s values in each moment. If this universe repeats infinitely, as theoretical physicists propose, then so does each moment of one’s life. A good life consists in amor fati, love of this fate, pleasure in the face of actions eternally recurring: “the present instant, the event of which I am worthy (or not), is that which structures my place, in the past and future, on the basis of this instant. Ethics, in a sense, is the mental training, the rigor, of reason operating in bodily practice to mark and live the eternity of the events that happen to oneself and one’s social and natural world.”679

Proposition X. Actants have agency—to varying degrees. So long as we are not assailed by emotions contrary to our nature, we have the power of arranging and associating the modifications of our body according to the intellectual order.

Proof.—Like Spinoza, Nietzsche argues that emotional awareness can promote agency. On the one hand, Spinoza emphasizes mindfulness, claiming that “the mind is capable of ordering and organizing passions and in this way converting them to active affects and enhancing joyous encounters.”680 On the other hand, Nietzsche calls on us “to invoke our animal impulses, so readily directed internally by cultural forces and habits, to enhance our capacity to feel (both joy and hardship); we need to revivify our capacity to act.”681 Rather than repress or avoid emotion, he embraces its tumultuous throes. Feeling deeply is the mechanism for knowing how to direct one’s actions (V.xlii.).

Note.—Agential Realism is a concept developed by Karen Barad that promotes using entanglements rather than binaries to conceive the world and possible actions within it. Barad describes agential realism as a “framework that provides an understanding of the role of human and nonhuman, material and discursive, and natural and cultural factors in scientific and other social-material practices.”682 Agential realism expands and complicates Bruno Latour’s actor-network theory with examples from the field of physics, borrowing most heavily from the ideas advanced by Niels Bohr. Barad explains that “ethics is about mattering, about taking account of the entangled materializations of which we are part, including new configurations, new subjectivities, new possibilities.”683 Rosi Braidotti’s concept of expanded life directly feeds into her ethical framework as well. She explains that “zoe-centered egalitarianism is, for me, the core of the post-anthropocentric turn: it is a materialist, secular, grounded and unsentimental response to the opportunistic trans-species commodification of Life that is the logic of advanced capitalism.”684 Jane Bennett advances a similar argument, suggesting that thinking in terms of entanglements with other agents, even non-human ones, allows for a different relationship to one’s actions: “the ethical aim becomes to distribute value more generously, to bodies as such. Such a newfound attentiveness to matter and its powers will not solve the problem of human exploitation or oppression, but it can inspire a greater sense of the extent to which all bodies are kin in the sense of inextricably enmeshed in a dense network of relations. And in a knotted world of vibrant matter, to harm one section of the web may very well be to harm oneself.”685

Proposition XI. Posthuman actants emerge.

Proof.—In the final section of The Order of Things, Michel Foucault unearths the category of Man, which he considers a recent and precarious invention of European society. Human—or Man—carries with it an exclusive and troubling history, a narrative of power and domination that needs to be remembered and warned against. In The Re-Enchantment of Humanism: An Interview with Sylvia Wynter, David Scott introduces the phrase embattled humanism to capture a sense of critique and also the aspiration for an evolving humanism. This phrase links up with the contributions of cultural theorists, including Roger Mais, George Lamming, Aimé Césaire, Frantz Fanon, and Elsa Goveia.686 Sylvia Wynter argues that embattled humanism is “one which challenges itself at the same time you’re using it to think with.”687 The Posthuman defies its expected function as a noun; it is treated as a dynamic verb, fluctuating and expanding to encompass more diverse ways of being, rather than a category in which one inherently or permanently belongs. Process, practice, and becoming are reparative forces. They expand the collapsed twin, inflating it with the complexity of life. The radical commitment to embody one’s own feelings and values, to actually live them, act them, speak them, write them, make them, move them. To live other ways of being into existence. And to accept that there will be failures, complications, paradoxes, reevaluations, suffering, and even death along the way. There is an urgent sense that the social and environmental injustices that the Humanist mode perpetrated can only be addressed with the emergence of a new paradigm. As a result, Process Philosophy—and with it the possibility of utter transformation—has become a key node of connection for New Materialists and Posthumanists alike.

Proposition XII. There are no objects—only processes.

Proof.—The first premise of Alfred Whitehead’s Process Philosophy is “that the actual world is a process, and that the process is the becoming of actual entities … also termed ‘actual occasions.’”688 Process and Reality sets Spinoza’s Ethics in motion:

The philosophy of organism is closely allied to Spinoza’s scheme of thought. But it differs by the abandonment of the subject-predicate forms of thought, so far as concerns the presupposition that this form is a direct embodiment of the most ultimate characterization of fact. The result is that the ‘substance-quality’ concept is avoided; and that morphological description is replaced by description of dynamic process. Also Spinoza’s ‘modes’ now become the sheer actualities; so that, though analysis of them increases our understanding, it does not lead us to the discovery of any higher grade of reality. The coherence, which the system seeks to preserve, is the discovery that the process, or concrescence, of any one actual entity involves the other actual entities among its components. In this way the obvious solidarity of the world receives its explanation.689

Proposition XIII. Actants are momentary states in endlessly unspooling trajectories.

Proof.—Gilbert Simondon was equally invested in a Spinozan monistic ontology and used this framework to explain transmutation without transcendence. Studying scientific processes ranging from electrical relay to crystallography, he was primarily interested in the dynamics of formation, or what he called individuation.

Proposition XIV. Processes can change beyond recognition.

Proof.—Simondon offered the term transduction for extreme conversions of identity—movement of a substance toward a state beyond recognition. Brian Massumi explains transduction as “the transmission of a force of potential that cannot but be felt, simultaneously doubling, enabling, and ultimately counteracting the limitative selections of apparatuses of actualization and implantation.”690 Manuel Delanda’s work incorporates many of Simondon’s technical case studies of transduction. Considering phase states and metastable solutions, he argues that this “richer conception of causality linked to the notion of the structure of a possibility space, gives us the means to start thinking about matter as possessing morphogenetic powers of its own.”691 Via Deleuze, Simondon’s work on process has been deeply influential in New Materialism, yielding concepts in the field including the plane of immanence, assemblages, and becoming. Elizabeth Grosz confirms that Simondon’s project is to articulate a theory of becoming that accounts for the complex geneses of the becoming of all beings and their different levels of operation through the concrete elaborations of the preindividual.692 For Simondon, ontology is too stable; he offers instead a universe of ontogenesis.

Proposition XV. Ethics is ontogenesis.

Proof.—Ontogenesis, or more commonly ontogeny, refers to the development of an individual organism or behavioral feature from the earliest stage to maturity. It involves embryogenesis and other developmental processes such as morphogenesis and differentiation. Essentially, ontogeny captures the life history of an organism, from its inception as a fertilized egg to its mature form, and sometimes to its eventual senescence or death. Following Simondon and Deleuze, Braidotti advances an ethics of becoming.693 She reintroduces and reframes three speculative processes as tools for reconceiving the human as the Posthuman: becoming-animal, becoming-earth, becoming-machine. Through these modes, it may be possible to “rethink evolution in a non-deterministic but also post-anthropocentric manner.”694 She asserts that the common denominator for the posthuman condition is “an assumption about the vital, self-organizing and yet non-naturalistic structure of living matter itself.”695

Proposition XVI. Death is transduction.

Proof.—This reconceptualization of matter as processual, morphogenic, and alive transforms all areas of experience, including death. Braidotti advises that the ubiquitous avoidance of death needs to yield to another model:

Death is not the teleological destination of life, a sort of ontological magnet that propels us forward … death is behind us. Death is the event that has always already taken place at the level of consciousness. As an individual occurrence it will come in the form of the physical extinction of the body, but as event, in the sense of the awareness of finitude, of the interrupted flow of my being-there, death has already taken place. We are all synchronized with death—death is the same thing as the time of our living, in so far as we all live on borrowed time.696

Proposition XVII. Do not deny the reality of death.

Proof.—Many cultures throughout history have developed intricate rituals and practices to recognize and honor death, seeing it as an integral part of the human experience. These rituals, steeped in reverence and wisdom, served not only to commemorate the departed but also to guide the living through the intricate dance of grief and acceptance. Yet, in our modern, fast-paced world, many of these time-honored traditions are fading, overshadowed by the technocapital agenda and an overarching societal discomfort with mortality. This neglect further distances us from the depth and richness that such practices offer, leaving a void in our collective understanding of life’s cyclical nature. As a result, many individuals find themselves ill-prepared to grapple with, or even acknowledge, the profound existential questions and raw emotions that accompany life’s final stage.

Braidotti advises that “making friends with the impersonal necessity of death is an ethical way of installing oneself in life as a transient, slightly wounded visitor. We build our house on the crack, so to speak.”697

Corollary.—Death and suffering are leveling forces.

Proposition XVIII. Spaces of emergence.

Proof.—In Homo Sacer: Sovereign Power and Bare Life, Giorgio Agamben argues that in our Capitalist system, everyone is treated as bare life, a resource to exploit: “today there is no longer any one clear figure of the sacred man … perhaps because we are all virtually homines sacri.”698 In Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human, Alexander Weheliye extends Agamben’s notion of bare life and argues that emergence takes place in even the bleakest scenarios: “the particular assemblage of humanity under purview here is habeas viscus, which, in contrast to bare life, insists on the importance of miniscule movements, glimmers of hope, scraps of food, the interrupted dreams of freedom found in those spaces deemed devoid of full human life (Guantanamo Bay, internment camps, maximum security prisons, Indian reservations, concentration camps, slave plantations, or colonial outposts, for instance).”699

Corollary.—Emergence from anywhere.

Note.—These extreme conditions constitute metastable states, primed for transduction. Weheliye demonstrates how the principles of process philosophy are embedded in Black theory, referencing Edouard Glissant’s description of “relation as an open totality of movement,” which “is the boundless effort of the world: to become realized in its totality, that is, to evade rest.”700 Weheliye argues that the possibility for radical change is never out of the question; it is immanent in all things. For him, habeas viscus “translates the hieroglyphics of the flesh into a potentiality in any and all things, an originating leap in the imagining of future anterior freedoms and new genres of humanity.”701 It is possible for anyone to live other ways of being into existence.

Proposition XIX. New forms of life are already seeded.

Proof.—Sylvia Wynter offered the term demonic ground, “perspectives that reside in the liminal precincts of the current governing configurations of the human as Man in order to abolish this figuration and create other forms of life.”702

Proposition XX. Care is demonic ground.

Proof.— ... the highest good which we can seek for under the guidance of reason (IV.xxviii.). The ethics of care is a framework that emphasizes the importance of interpersonal relationships, empathy, and compassion in moral decision-making. Developed as an alternative to traditional moral theories that often prioritize principles, rights, or abstract notions of justice, the ethics of care places a central focus on nurturing and maintaining relationships, particularly within the context of caring for vulnerable individuals. At its core, the ethics of care challenges the notion of moral autonomy and individualism by asserting that our ethical obligations are deeply intertwined with our interconnectedness as social beings. Drawing from Emmanuel Levinas, this interconnectedness can be understood as an appeal, where the “being that expresses itself imposes itself,” but does so in a manner that calls forth our responsibility. Levinas posits that “the being that imposes itself does not limit but promotes my freedom, by arousing my goodness … thus, the irremissible weight of being that gives rise to my freedom.”703 An ethics of care proposes that freedom is not just the absence of constraints, but also the presence of responsibility. It suggests that while we have agency to act in alignment with our will, we are also accountable for the influence of our actions—how we radiate outward. Through this lens—responsibility is freedom. It ensures that our actions are in alignment with our will—at a zoomed-out scale—even after they have propagated through the entangled field of reality.

Note.—It is important here to differentiate between the ethics of care and industries of care, which represent the commodification of services and products that are branded with the promise of healing, wellness, and well-being. These industries span a vast range—from medical services and pharmaceuticals, to child care and elder care, to wellness retreats and alternative therapies. While many of these sectors provide genuine value and contribute tremendously to individual and societal well-being, they are not without harm and inequity: “health disparities are differences that exist among specific population groups in the United States in the attainment of full health potential that can be measured by differences in incidence, prevalence, mortality, burden of disease, and other adverse health conditions. Health disparities can stem from health inequities—systematic differences in the health of groups and communities occupying unequal positions in society that are avoidable and unjust.”704 Within technocapital—the nexus of technology and capitalism—the imperatives of profit and market growth drive decision-making: “Forces are acting to challenge affordability and access in healthcare and threatening the industry’s economic outlook.”705 Industries of care are not just about delivering care but also about maximizing profitability, expanding market share, and often, commodifying personal experiences of health and wellness. The challenge lies in discerning genuine care from commercial exploitation, ensuring that the essence of care isn’t lost amidst the machinery of profit-driven motives.

Care is unlimited. Care is not limited to industry. The ethics of care consists of:——

I. Empathy—In the actual knowledge of the emotions (V. iv. note).

II. Balance—Identifying and equalizing imbalances of power.

III. Superposition—Recognizing the infinite within every other.

IV. Entanglement—Taking responsibility for the influence of our free actions.

V. Compassion—Lastly, in the order wherein the mind can arrange and associate, one with another, its own emotions (V. x. note and xii. xiii. xiv.).

And now I have finished with all that concerns this present life …

Proposition XXI. The mind can only imagine anything, or remember what is past, while the body endures.

Proof.—The aesthetics of care tends to restorative practices, maintenance work, and the mundane aspects of everyday life. Keeping our bodies alive—our communities alive—our planet alive—comes with an aesthetic of reality and imperfection.

Proposition XXII. Artists simulate modalities of care.

Proof.—In 1969, Mierle Laderman Ukeles published the Manifesto for Maintenance Art, challenging traditional notions of art and elevating the often overlooked and undervalued work of maintenance and care: “The exhibition of Maintenance Art, ‘CARE’, would zero in on pure maintenance, exhibit it as contemporary art, and yield, by utter opposition, clarity of issues.”706 Ukeles asserts that maintenance activities, such as cleaning, repairing, and tending to the needs of everyday life, are not separate from artistic practice but rather integral to it. She argues that these actions are an essential part of sustaining society and should be recognized as artistic gestures that deserve respect and appreciation. Her manifesto invites us to reevaluate our perceptions of labor, caregiving, and maintenance as mundane and unremarkable, and instead recognize their beauty, significance, and transformative potential.

Proposition XXIII. The human mind cannot be absolutely destroyed with the body, but care is so often postponed—post-partum—after profit, gain, capture, acquisition.

Proof.—Post-Partum Document is Mary Kelly’s multifaceted and deeply personal exploration of motherhood, care, and identity: “Post-Partum Document is a six-year exploration of the mother-child relationship. When it was first shown at the ICA in London in 1976, the work provoked tabloid outrage because Documentation I incorporated stained nappy liners. Each of the six-part series concentrates on a formative moment in her son’s mastery of language and her own sense of loss, moving between the voices of the mother, child, and analytic observer. Informed by feminism and psychoanalysis, the work has had a profound influence on the development and critique of conceptual art.”707 The project consists of a series of interconnected elements, including recording transcripts, diary entries, her son’s drawings, and collected artifacts, that chronicle the early years of Kelly’s relationship with her son. The meticulous documentation of experience, from the physical aspects of childbirth to the emotional challenges and the daily routines of caring for an infant, serves as a candid and intimate reflection on the transformative process of motherhood.

Note.—The project’s layered structure invites viewers into Kelly’s personal narrative, combining visual and textual elements to evoke a sense of immersion and emotional connection. Through the inclusion of various artifacts, such as soiled diapers and hospital reports, Kelly blurs the boundaries between art and life, transforming everyday objects associated with care into powerful symbols of caregiving and the maternal experience. Post-Partum Document also interrogates larger sociopolitical themes, particularly the gendered dynamics of labor and the construction of female identity within patriarchal systems: “Although drawing directly from her own maternal experience, the artist has asserted that the work is not ‘autobiographical’ and instead uses her own story to suggest ‘an interplay of voices—the mother’s experience, feminist analysis, academic discussion, political debate.’”708 By exposing the often unseen and undervalued aspects of motherhood, Kelly challenges societal norms and calls attention to the complexities and sacrifices involved in caregiving roles.

Proposition XXIV. The more we understand particular things, the more do we understand God.

Proof.—This is evident from I. xxv. Coroll.

Proposition XXV. The highest endeavor of the mind, and

the highest virtue is to understand things by

the third kind of knowledge (intuition).

Proof.—Rirkrit Tiravanija’s Pad Thai, from 1990, is a seminal work of relational aesthetics. It offers a compelling exploration of intuition, care, communal engagement, and the blurring of boundaries between art and everyday life: “I want people to just be themselves. I would like to make a work where I don’t have to tell people what to do. In certain ways, I try to use architecture or space, or food and drink, or sound. That would certainly be something people already understand. With that little sense of familiarity, they would already become more curious and more engaged.”709 In this participatory installation, Tiravanija transforms the traditional gallery space into a communal kitchen, inviting visitors to partake in the preparation and sharing of a communal meal of pad Thai, a popular Thai dish. Through the act of cooking and sharing food, Tiravanija creates a nurturing and inclusive environment that fosters social interaction and care. The communal meal becomes a catalyst for dialogue, breaking down barriers and fostering a sense of togetherness among participants, in contrast to the dynamics typically encountered in an exhibition space. By providing sustenance, the artwork elevates the mundane act of creating and sharing as a radical form of care, underscoring the significance of nourishment and community in our lives.

Pad Thai shifts focus from stable objects to dynamic experience and interrelation: “Rirkrit Tiravanija has always understood, intuitively and intellectually, that a gallery is a social frame, at once quasi-private and quasi-public, wherein a diverse range of encounters and frictions connected to rituals of making, displaying, and consuming art are staged.”710 His work invites viewers to become active participants, blurring the distinction between artist and audience, and encouraging a collective sense of responsibility for creating and sustaining the communal space. The ephemeral nature of the work reinforces the transient and temporal aspects of care.

Proposition XXVI. Advocacy is communal care.

Proof.—Tania Bruguera is an artist known for her thought-provoking and politically engaged work that often addresses issues of power, migration, and social justice. Through her performances, installations, and participatory projects, Bruguera explores the concept of care in relation to marginalized communities and questions the responsibilities and role of art in fostering social change. One of Bruguera’s notable works emphasizing care is Immigrant Movement International, developed between 2011 and 2015. The project manifesto articulates: “We have the right to move and the right to not be forced to move. We demand the same privileges as corporations and the international elite, as they have the freedom to travel and to establish themselves wherever they choose. We are all worthy of opportunity and the chance to progress. We all have the right to a better life.”711 In this project, Bruguera established a community space in Queens, New York, designed to provide support and resources for immigrants. The project transformed the gallery into a site for communal care, where individuals could access legal advice, language classes, and social services. Immigrant Movement International embodied Bruguera’s commitment to addressing the needs and vulnerabilities of marginalized communities. By offering practical assistance and creating a supportive environment, the project demonstrated care as an essential aspect of social justice work. It emphasized the importance of providing resources, information, and spaces of empowerment for individuals who often face systemic obstacles and marginalization. Bruguera’s approach to care extends beyond the immediate provision of resources. Her projects aim to challenge existing power structures and address the underlying causes of inequality and injustice: “Bruguera is a key player within the fields of performance, interdisciplinary practice and activism. Her work is grounded in the act of ‘doing’—she calls this ‘behavior art’—and her aim is to create art that doesn’t merely describe itself as dealing with politics or society, but that is actually a form of political or social currency, actively addressing cultural power structures rather than representing them.”712 Through participatory elements, she invites individuals to engage with their own experiences and those of others, fostering empathy and connection. In projects like The Francis Effect, Bruguera organized public gatherings where participants could voice their concerns and ideas for social change. By creating platforms for dialogue and collective action, she seeks to encourage a sense of shared responsibility and care for the well-being of communities.

Proposition XXVII.

From this third kind of knowledge arises the highest possible justice.

Proof.—In The Black Factory, William Pope.L explores the topic of care through conversations about identity and equity: “Conceived to fit inside a panel truck, The Black Factory travels throughout America to bring blackness wherever it is needed. The Factory consists of three compartments that unfold to create an interactive public environment made up of a library, a workshop, and a gift shop. Through the circulation of promotional materials and by word of mouth, The Black Factory makes contact with a range of host-communities that invite visits to their town.”713 The Black Factory challenged the notion that care is solely an individual or private matter, emphasizing that care extends to the collective responsibility for addressing inequities. The mobile nature of the installation underscored the importance of taking the message of care beyond traditional art spaces. By bringing The Black Factory to different communities, Pope.L sought to disrupt the status quo and encourage conversations about race and care in spaces where these discussions might not typically occur: “It aims to re-energize discussions about race in America by inviting people to share objects that represent ‘blackness’ to them.”714 Care is not only about empathy; it requires compassion, actively working to dismantle systemic injustices.

Proposition XXVIII. Self-care is hard enough.

Proof.—This proposition is self-evident. Becoming an Image, an ongoing body of work by Heather Cassils, delves into the themes of care, vulnerability, and the construction of identity. Through their physically demanding and transformative performance, Cassils explores the boundaries of the body and challenges societal norms surrounding gender, while highlighting the importance of self-care and self-expression. In Becoming an Image, Cassils undergoes an intense physical training regimen over a period of months, working towards sculpting their body into a more masculine form. The performance culminates in a series of staged photographs, capturing the physical and emotional journey of self-transformation: “At the core of Cassils’ durational performances is this principle of calculated risk in the face of the material’s capacity, and … of the ultimate material, ‘of the body’s inexorable movement towards its final failure, toward death.’”715 The work addresses the societal expectations and norms placed upon bodies, particularly those of transgender and gender-nonconforming individuals. Cassils’ performance challenges these expectations by reclaiming agency over their own body and identity. Through this physically demanding process, Cassils asserts their right to shape and define their own image, pushing back against the limitations imposed by societal constructs. Becoming an Image examines care for the self. Cassils’ performance invites viewers to consider their own relationship with their bodies and the ways in which societal standards shape their perceptions and self-care practices.

Proposition XXIX. Differentiate between poison and cure.716

Proof.—Patrick Staff’s Weed Killer is a thought-provoking video installation commissioned by the Museum of Contemporary Art. Inspired by Catherine Lord’s memoir, The Summer of Her Baldness, the installation combines a poignant monologue adapted from the book with ethereal sequences captured through high-definition thermal imaging. Through this immersive experience, Weed Killer blurs the boundaries between the toxic and the curative, provoking viewers to reevaluate their own understandings of suffering and the potential for resilience and transformation: “Each of the performers in Weed Killer identifies as transgender. By probing both cancer and trans experiences, Staff initiates a dialogue about how biomedical technologies have fundamentally transformed the social constitution of our bodies.”717

Note.—At the heart of Weed Killer lies a monologue adapted from Lord’s memoir. This moving and irreverent account of Lord’s experience with cancer serves as a poignant foundation for Staff’s exploration. The monologue, delivered by an actress, delves into the emotional and physical devastation caused by chemotherapy. Through this adaptation, Weed Killer channels the raw authenticity of Lord’s memoir, inviting viewers to empathize with the complexities of confronting illness.718 Interwoven with the monologue are ethereal sequences captured using high-definition thermal imaging. These sequences offer a contrasting and otherworldly visual experience, providing a departure from the grounded reality of the monologue. Through choreographic gestures, the thermal imaging captures the subtle nuances of movement and creates a surreal atmosphere. The juxtaposition of these sequences with the personal narrative enhances the contemplation of suffering and healing as multifaceted and transcendent experiences.

Proposition XXX. Find spaces of radical care.

Proof.—Johanna Hedva’s Sick Woman Theory is a radical reframing of illness and disability within the context of contemporary capitalist society. Drawing from her personal experiences with chronic illness, Hedva posits that the Sick Woman is anyone who does not or cannot conform to the expectations of a system that values productivity and labor above all else. This includes the chronically ill, the disabled, and anyone who is marginalized by the dominant culture—be it due to race, gender, class, or other factors. In a society that equates worth with work, the Sick Woman is seen as less valuable or even disposable. However, Hedva argues that this perceived weakness is a site of resistance. The very act of surviving, of caring for oneself and others, is a defiant gesture against a system that would rather the Sick Woman not exist at all. Through this lens, the Sick Woman’s existence and her care practices become inherently political acts: “Sick Woman Theory is an insistence that most modes of political protest are internalized, lived, embodied, suffering, and therefore invisible.”719

Proposition XXXI. Repair broken processes.

Proof.—Eve Kosofsky Sedgwick’s essay, Paranoid Reading and Reparative Reading, discusses two modes of engaging with texts and the world: paranoid and reparative. Sedgwick critiques the dominance of paranoid reading in critical theory, particularly in queer theory. Paranoid reading is characterized by suspicion, where the reader anticipates negative outcomes and is always on guard for hidden meanings or threats. It operates under a defensive stance, often aiming to expose and critique. Contrastingly, reparative reading, which Sedgwick advocates for, is characterized by curiosity, surprise, and openness. It seeks to nurture and repair, focusing on potential positive outcomes and constructive engagements. Reparative reading is not naive; it recognizes and grapples with pain and trauma but does so in a way that allows for complexity, ambivalence, and hope. Sedgwick critiques the pervasive nature of paranoid reading in academia and suggests that while it can be a useful mode, it is not the only or always the best way to approach texts or the world: “Reparative motives, once they become explicit, are inadmissible in paranoid theory both because they are about pleasure (‘merely aesthetic’) and because they are frankly ameliorative (‘merely reformist’). What makes pleasure and amelioration so ‘mere’?”720

Note.—The series Atlanta, created by Donald Glover, has consistently provided poignant social commentary through its unique blend of humor, drama, and surrealism. Episode 4 of Season 3, titled The Big Payback, is no exception. The episode offers an illuminating perspective on the white paranoia surrounding reparations: “An office worker’s world is turned upside down when he learns his past ancestors were slave owners.”721 Reparations refer to the act of providing compensation, restitution, or acknowledgment to individuals or groups who have experienced harm, injustice, or systemic oppression: “We were treating slavery as if it were a mystery, buried in the past, something to investigate if we chose to. And now that history has a monetary value. Confession is not absolution.”722 Reparations often arise in the context of addressing historical injustices such as slavery, colonization, apartheid, or genocides. They aim to acknowledge and rectify the systemic and long-lasting effects of these injustices, which can include economic disparities, social marginalization, and psychological trauma. The Big Payback cleverly unpacks the anxieties of many white Americans when confronted with the idea of reparations—a fear of personal loss, of retribution, and a deep-seated belief that one’s own well-being is threatened by the well-being of others. This mindset is rooted in scarcity, the belief that resources are limited, and therefore any gain for one group must mean a loss for another. This fear is often manifested in hyperbolic scenarios, suggesting an inevitable societal breakdown or extreme economic consequences.

Local reparations initiatives in various cities and the state of California have ignited hopes for a national compensation policy for the historical atrocities of slavery. Despite the prolonged efforts and a heightened national discourse on racial justice, a significant portion of Americans remain against the idea. Critics of reparations express concerns about the practicality and feasibility of implementing such a program. They argue that determining eligibility, calculating appropriate compensation, and identifying direct descendants of enslaved people would present significant challenges. Critics also contend that reparations may create division, resentment, and unintended consequences, as they could perpetuate a victimhood narrative or lead to a reductive understanding of complex historical issues. If we could simulate successful implementation, would that be enough? Surprisingly, a considerable number of Americans do not believe that descendants of slaves should receive reparations: “two-thirds of Americans.”723 Polls reveal strong opposition from white, Latino, and Asian American communities. Opponents argue that current generations should not be held accountable for past transgressions. The fundamental American belief that hard work ensures success clashes with the realities of the racial wealth gap. The debate around reparations is intertwined with broader socio-political challenges, including disputes over teaching race and critical race theory in schools.724 These attitudes are more reflective of deeply entrenched racial prejudices than of the true nature and potential of reparations. Beyond white fear, reparations are not merely a transactional financial compensation for past atrocities but can be a significant step towards rebalancing systemic injustice. More equity—

for everyone.

Despite the paranoid narrative of individual punishment, reparations are typically undertaken by governments, organizations, or institutions as a means of addressing historical wrongs, seeking to redress the ongoing consequences of past atrocities, and promoting social justice. The United Nations upholds that “victims have a right to reparation. This refers to measures to redress violations of human rights by providing a range of material and symbolic benefits to victims or their families as well as affected communities.”725 Reparations can take various forms, including financial compensation, land restitution, educational opportunities, healthcare access, or affirmative action policies. The underlying principles of reparations include recognition, accountability, and the pursuit of justice. Reparations acknowledge past wrongs, affirm the dignity and worth of the affected individuals or communities, and hold responsible parties accountable for their actions or complicity. Reparations seek to restore a sense of justice and promote healing, and have a “catalytic power … on the daily life of victims, families, communities, and entire societies.”726 However, the topic becomes particularly heated when applied to the centuries of systemic discrimination, enslavement, and exploitation of Black Americans. This heat is not because the moral or logical argument for reparations is weak, but because it challenges foundational myths about meritocracy and the American Dream.

The institution of slavery in the United States was a deeply entrenched system that caused immeasurable harm, dehumanization, and generational trauma to millions of African Americans. Slavery formed the foundation of the country’s economic prosperity, with enslaved people being exploited for their labor while enduring systemic racism and denial of basic human rights. The legacy of slavery has had far-reaching effects, leading to enduring socioeconomic disparities, racial discrimination, and the erosion of social cohesion. There are growing examples of successful reparations programs. Across the globe, reparations have been utilized to address and rectify historic injustices, often to help consolidate peace post-conflict. For instance, the U.S. previously compensated Japanese American citizens for their wrongful internment during World War II.727 South Africa’s government, in 2003, distributed $3,900 to victims of apartheid, totaling $85 million.728 Furthermore, certain nations have pursued reparations from their former colonizers. For instance, Caribbean nations initiated the Caribbean Reparation Commission to secure reparations from former colonial powers, citing grave historical crimes like the Trans-Atlantic Slave Trade.729 The U.K. took responsibility for its colonial past by paying $25 million to Kenyans who suffered violence during the Mau Mau uprisings in the 1950s.730 Post World War II, Germany recognized its responsibility towards Holocaust victims by not only providing reparations but also giving $7 billion to Israel during its early formation. By 2012, the German efforts translated to $89 billion in reparations to individual survivors.731 Moreover, every day reparations are provided to groups in the US:

Farmers. Fishermen. People who’ve lost bank accounts or pensions. People who’ve had a bad reaction to a COVID vaccine. People who’ve had a reaction to any other vaccine. Indigenous people. Veterans. Descendants of veterans. People who get hurt on the job. People who built nuclear bombs. People exposed to pesticides. Coal miners who get black lung disease. People who lose paychecks or homes from floods, droughts, or other natural disasters. People who are impacted by trade agreements.732

Yet, Black Americans have not received any compensation for their unpaid labor and the severe racial discrimination they faced. Early efforts to provide reparations were overturned, leaving Black Americans without means to build wealth:

The first major opportunity that the United States had and where it should have atoned for slavery was right after the Civil War. Union leaders including General William Sherman concluded that each Black family should receive 40 acres. Sherman signed Field Order 15 and allocated 400,000 acres of confiscated Confederate land to Black families. Additionally, some families were to receive mules left over from the war, hence 40 acres and a mule. Yet, after President Abraham Lincoln’s assassination, President Andrew Johnson reversed Field Order 15 and returned land back to former slave owners. Instead of giving Blacks the means to support themselves, the federal government empowered former enslavers. For example, in Washington D.C., slave owners were actually paid reparations for lost property—the formerly enslaved.733

Today, proposed reparations packages include individual payments, college tuition remissions, student loan forgiveness, down payment and housing revitalization grants, and business grants.734 However, this is not just about money, but about creating an equitable society by addressing entrenched racism.

Reparations are not a zero-sum game. They potentiate collective healing. Here, the Jewish concept of tzedakah provides a valuable lens. The term tzedakah originates from the Hebrew word for justice, a paramount duty within Judaism: “Tzedek, justice you shall chase after.”735 (It is similar to the Muslim concept of sadaqah.) Unlike traditional charity, which is often given out of surplus, tzedakah is seen as a form of justice, a duty to give regardless of one’s financial state. It is not just about money; it is about restoring balance. If we begin to think of reparations in terms similar to tzedakah, the dialogue shifts. Instead of a punishment or forced obligation, reparations become a proactive step towards justice. They acknowledge a debt, not just financial, but moral and societal. More than a mere handout, they are a hand extended in a gesture of understanding, responsibility, and a deep desire for communal healing. By using tzedakah as a model, reparations can be seen as beneficial not just for the recipients but also for those giving. The process can be one of growth, responsibility, and genuine commitment to creating a more just and equal society.

Proposition XXXII. Restorative justice.

Proof.—Reparations are a strategy of restorative justice, which focuses on the needs of the victims and the offenders, as well as the entire community, rather than satisfying abstract legal principles or punishing the offender. The goal is to repair harm, reconcile relationships, and reintegrate offenders back into society. Renowned feminist scholar and social activist bell hooks has written about restorative justice in the context of broader societal structures of power, inequality, and oppression: “I think this is a difficult question, how we deal with the question of forgiveness. For me forgiveness and compassion are always linked: how do we hold people accountable for wrongdoing and yet at the same time remain in touch with their humanity enough to believe in their capacity to be transformed?”736 She argues that punitive justice systems, rooted in domination and control, often reinforce societal hierarchies and exacerbate harm.

hooks believes that justice should involve the active process of healing, reconciliation, and community building. She views restorative justice as a potential tool for addressing larger systemic issues, such as racial and gender inequality: “One of the things that has always made me sad is the extent to which civil rights struggles, black power movements, and feminist movements, have, at times, collapsed at the point where there was conflict, and how conflict between people in the groups was often seen as a negative. The truth is that you cannot build community without conflict. The issue is not to be without conflict, but to be able to resolve conflict, and the commitment to community is what gives us the inspiration to come up with ways to resolve conflict. The most contemporary way that people are thinking about as a measure of resolving conflict and rebuilding community is restorative justice.”737 hooks emphasizes the need to transform not just individual relationships or incidents of harm, but also the larger, systemic structures of power that often underpin such harm. She advocates for a justice system that repairs and heals, rather than one that punishes and divides. hooks frames her discussion within an intersectional lens, acknowledging the ways that race, gender, class, and other aspects of identity intersect and influence experiences of harm and opportunities for justice.

Restorative justice has seen numerous successful implementations globally. In the United States, the Navajo Nation Peacemaking Program is a traditional approach to justice that emphasizes harmony and often sees cases referred back to it from the wider judicial system: “An examination of the Navajo peacemaking process shows that its success is not in its concrete result or the actual remedy given, but rather is in an adjustment of the attitudes of the parties involved. Both offenders and victims begin with cognitive dissonance or related emotions that are based on assumptions and unreality, and the process leads them to common understandings.”738 South Africa’s Truth and Reconciliation Commission, set up post-apartheid, allowed victims to voice their suffering and perpetrators to confess their crimes in a public forum, playing a pivotal role in preventing further civil unrest during the country’s transition.739 New Zealand’s Family Group Conferences for juvenile crimes involve a collaborative development of a plan for the offender to make amends and have shown a significant reduction in reoffending.740 The Hollow Water First Nation Community Holistic Circle Healing in Canada addresses sexual abuse and other forms of violence within the community, boasting a recidivism rate of just 2% among offenders who acknowledge their actions and seek forgiveness.741 Likewise, Prison Fellowship Ministries runs successful restorative justice programs inside prisons internationally: “Founded in 1976, Prison Fellowship exists to serve all those affected by crime and incarceration and to see lives and communities restored in and out of prison—one transformed life at a time.”742 Habeas Viscus—distributed transduction (V.xviii.).

Corollary.—

Restorative justice everywhere.

Proposition XXXIII. Restore the planet.

Proof.—Restorative justice, when expanded to the environment, offers a transformative approach that addresses the harm inflicted upon the land and local communities and supports sustainable solutions: “The ‘planetary boundaries’ framework developed by Johan Rockström, Will Steffen, and others (Rockström et al., 2009) describes the nine processes regulating the Earth system, keeping it stable and resilient. Within these boundaries, humans have a ‘safe operating space’ but pushing past them would destabilize Earth’s system into effects beyond human capabilities to manage.”743 The impact of imbalance is already overwhelming. Restorative environmental justice emphasizes the healing and restoration of damaged ecosystems and communities impacted by environmental degradation. Instead of solely focusing on punitive measures or regulations, restorative environmental justice seeks to identify and address the underlying causes of harm.744 By implementing restoration projects, such as ecological rehabilitation, habitat restoration, and remediation efforts, we can actively work towards healing the environment and restoring balance.

Note.—Restorative justice highlights the importance of accountability and taking responsibility for one’s actions. Applying this principle to environmental justice involves holding polluters and those responsible for environmental harm accountable for their actions. It also includes acknowledging the collective responsibility of society for the degradation of the environment. By promoting transparency, encouraging corporate and governmental accountability, and supporting initiatives that foster responsible environmental practices, restorative justice ensures that those who contribute to environmental harm make amends: “Possible restorative outcomes in the case of environmental harm are apologies, restoration of environmental harm, prevention of future harm.”745 Restorative justice prioritizes the inclusion and active participation of affected communities in decision-making processes. By incorporating diverse perspectives, local knowledge, and community engagement, we can better understand the specific needs of ecosystems impacted by environmental degradation. This collaborative approach empowers communities to actively participate in finding sustainable solutions. Restorative justice drives innovation in environmental practices. New ways of being. Offenders are encouraged to invest in research, technology, and alternative approaches, including renewable energy initiatives, sustainable agriculture practices, eco-friendly technologies, and circular economy models. Some recent policies account for the entanglement of environmental technology and social equity. For example, “Justice40 establishes the goal that 40% of the overall benefits of certain federal investments flow to disadvantaged communities (DACs). The Justice40 Initiative applies to over 1 Department of Energy (DOE) programs and to much of the $62 billion investment in DOE under the Bipartisan Infrastructure Law.”746 Restorative justice offers a transformative framework that recognizes the interconnectedness of environmental and social issues as the reality of our planet.

Proposition XXXIV. Technology has restorative potential.

Proof.—Can technologies of Capture and Reconstruction be reappropriated to rectify environmental injustice? Technologies like remote sensing and computer vision are powerful tools to understand and confront environmental degradation. By collecting and analyzing vast amounts of data, technology enables the identification of pollution sources and disproportionate impacts on marginalized communities, facilitating targeted interventions and enforcement of environmental regulations.747 Despite their ethical complexities, these technologies have the potential to support the development and implementation of sustainable solutions, such as clean energy alternatives, sustainable infrastructure, and efficient resource management systems. Since we have these technologies—and they are not going away—we need to embrace their capabilities to empower repair. To promote equitable access to clean air, water, and resources. To mitigate the detrimental effects of environmental degradation. To protect against the decoherence of our habitable world.
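
A minimal, hypothetical sketch can ground this claim. The code below computes NDVI, a standard vegetation index, from two reflectance bands and flags pixels where vegetation appears stressed. Everything in it is invented for illustration (the arrays, the 0.2 threshold, the function names); real monitoring pipelines add sensor calibration, cloud masking, and ground truth before supporting any enforcement claim.

```python
# Hypothetical sketch: flag degraded vegetation from remote sensing bands.
# NDVI = (NIR - red) / (NIR + red); healthy vegetation trends toward +1.
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index, safe against divide-by-zero."""
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def flag_degraded(red: np.ndarray, nir: np.ndarray, threshold: float = 0.2):
    """True where the vegetation signal falls below an illustrative threshold."""
    return ndvi(red, nir) < threshold

# Toy 2x2 "scene" of reflectance values in [0, 1] standing in for satellite data.
red = np.array([[0.10, 0.40], [0.12, 0.45]])
nir = np.array([[0.60, 0.42], [0.55, 0.44]])
print(flag_degraded(red, nir))  # True where vegetation appears stressed
```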

Corollary.— D
e
c
oher
ence
. is d

Note.—If we look to men’s general opinion, we shall see that they are indeed conscious of the eternity of their mind, but that they confuse eternity with duration, and ascribe it to the imagination or the memory which they believe to remain after death.

Proposition XXXV. Deploy machines of care.

Proof.—From the watery depths of our ecological crisis, Phykos emerges as a pioneering company at the forefront of sustainable practices: “Currently, most global efforts to address climate disruption are focused on reducing emissions of greenhouse gas pollutants. While vital, this path alone is no longer sufficient. In a special climate report, the United Nations made clear that, in addition to turning off the flow of pollution, we also need to remove massive amounts of legacy CO2 from the atmosphere to avoid the most dangerous effects of climate change, and ultimately restore our climate.”748 Leveraging autonomous vessels and advanced technologies, Phykos aims to cultivate seaweed as a powerful carbon sink. Seaweed, a type of macroalgae, has gained significant attention for its ability to sequester carbon dioxide from the atmosphere. As seaweed grows, it absorbs carbon dioxide through photosynthesis, effectively capturing and storing carbon: “We amplify natural marine carbon cycles to remove excess atmospheric CO2 at climate relevant scale.”749

Proposition XXXVI. Autonomous repair.

Proof.—Phykos utilizes autonomous vessels as the foundation of their seaweed cultivation operations. These vessels are equipped with cutting-edge technology, including remote sensing capabilities, artificial intelligence, and robotics. The autonomous nature of these vessels allows them to navigate the oceans with minimal human intervention, making seaweed cultivation more efficient and sustainable. Phykos’s seaweed cultivation process begins with the deployment of their autonomous vessels to predetermined seaweed farming areas. These areas are carefully selected based on factors such as water quality, nutrient availability, and optimal growth conditions. Once the vessels reach their designated locations, they initiate the seaweed cultivation process.

Corollary.—

Vessels of healing.

Note.—Phykos applies capture and reconstruction technologies for restorative impact. Using advanced remote sensing technologies, the autonomous vessels capture vital data about the ocean environment: “High resolution ocean models guide our platforms to deposition regions which maximize carbon storage while minimizing disturbance to the deep ocean environment. Platform sensors and satellite communications enable precise reporting of our progress.”750 They collect information on water temperature, salinity, nutrient levels, and other relevant parameters. This data enables the autonomous vessels to employ sophisticated algorithms and artificial intelligence systems to navigate to optimal conditions: “Algorithms maximize growth on our platforms while avoiding sensitive ocean environments and other ocean users. Our system works in harmony with seasonal shifts in the ocean offering a safe and scalable pathway for nature based carbon dioxide removal.”751 The autonomous vessels release specially designed seaweed modules into the ocean. These modules contain pre-grown seaweed seedlings, which attach to the floating structures. As the seaweed grows, it absorbs carbon dioxide through photosynthesis, thereby reducing atmospheric carbon levels. Phykos’s vessels continuously monitor the growth and health of the seaweed, optimizing carbon sequestration potential.
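
Phykos’s actual models and routing code are not public, so the following is only a toy sketch, with invented data, of the decision described above: prefer the candidate deposition region with the highest predicted carbon storage while refusing any region flagged as sensitive habitat.

```python
# Illustrative only: region names, storage estimates, and flags are invented.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Region:
    name: str
    predicted_storage: float   # model-estimated tons of CO2 retained (hypothetical)
    sensitive_habitat: bool    # e.g., flagged by an ecological survey

def choose_deposition_region(regions: List[Region]) -> Optional[Region]:
    """Highest-storage region that is not ecologically sensitive, else None."""
    candidates = [r for r in regions if not r.sensitive_habitat]
    if not candidates:
        return None  # hold position rather than disturb a protected site
    return max(candidates, key=lambda r: r.predicted_storage)

regions = [
    Region("abyssal-plain-A", 120.0, False),
    Region("seamount-B", 180.0, True),    # highest storage, but protected
    Region("abyssal-plain-C", 150.0, False),
]
best = choose_deposition_region(regions)
print(best.name if best else "no safe region")  # -> abyssal-plain-C
```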

Proposition XXXVII. There is nothing in nature, which is contrary to carbon, or which can take it away. False.

Proof.—Seaweed cultivated by Phykos acts as a natural carbon sink, absorbing substantial amounts of carbon dioxide during its growth cycle. When the seaweed is saturated, it is harvested or pulled by its own weight to the bottom of the ocean, where it slowly releases carbon over one thousand years or more.752 In addition to Phykos’s primary goal, seaweed cultivation provides additional ecological benefits, such as nutrient absorption, habitat creation, and the enhancement of marine biodiversity. Goals and weights determine the direction of change.

Note.—The Axiom of Part IV. has reference to particular things, in so far as they are regarded in relation to a given time and place: of this, I think, no one can doubt.

Proposition XXXVIII. Entanglements of life and death.

Proof.—Coral.

Note.—Coral reefs are often referred to as the rainforests of the sea. They are crucial for ecological, economic, and cultural reasons. Their well-being directly impacts the health of our oceans and the people who depend on them: “1 billion people rely on coral reefs for food security; 25% of all marine species live on coral reefs; 70-90% of the world’s coral reefs could be lost by 2050.”753 Protecting coral reefs is essential for preserving the biodiversity and ecological balance of our marine ecosystems. Coral reefs, vital ecosystems teeming with biodiversity, are under threat from climate change and human activities. Coral bleaching is a phenomenon where corals lose their symbiotic algae, known as zooxanthellae, causing them to turn white. This discoloration is often triggered by elevated sea temperatures and indicates coral stress. Additionally, corals and other marine organisms can be adversely affected by marine debris, including discarded fishing nets and plastics, which can entangle and damage coral reefs. A related concern is ghost nets: fishing nets that have been lost, abandoned, or discarded in the ocean. As they drift, they pose a threat not only to marine life through entanglement but also to coral reefs upon contact: “Entanglement can be directly responsible for breakage, as well as inhibiting the growth by restricting access to sunlight, as well as preventing the important cleaning function of grazing fish species. Indirectly, entanglement in plastic debris has been linked to a greater prevalence of disease in coral species … Coral species that grow in branching or corymbose forms, such as the Acroporids and the Poccilioporids are thought to be eight times more vulnerable to entanglement in debris due to their complex structures, suggesting that important habitats for juvenile reef fish species as well as invertebrates are most greatly affected.”754 In the face of this crisis, innovative solutions are necessary for coral restoration. Coralmaker, a pioneering initiative, leverages capture, reconstruction, and simulation to revolutionize large-scale coral reef restoration efforts. Coralmaker significantly reduces the time required for coral growth, promoting sustainability through emerging technologies.

Proposition XXXIX. Skeletons of regeneration.

Proof.—Coral calcification, the process of skeletal growth, takes years for corals to reach adult size. Coralmaker collapses this lengthy timeline. Their mission is “to provide robust and scalable technologies that make it possible to restore, install, and move coral reefs at the reef scale, supporting their survival and continuation through climate change.”755 This innovative manufacturing technology can be conveniently deployed close to restoration sites, enabling onsite production using locally-sourced natural aggregate mixes. Coralmaker has the capability to manufacture 10,000 skeletons per day, and is actively scaling up production.

Note.—Coralmaker seeds living coral fragments into modular bases made of recycled stone waste from the construction industry. By repurposing this waste material, the initiative reduces its carbon footprint and contributes to waste reduction. Furthermore, the focus on local manufacturing near restoration sites minimizes transportation emissions, ensuring a more sustainable approach to large-scale coral restoration. The process of coral propagation, which involves seeding coral fragments onto the premade stone coral skeletons, is traditionally a labor-intensive and repetitive task. Coralmaker recognizes the need for automation in large-scale coral processing and utilizes robotics and artificial intelligence to automate the propagation process.756 The underwater installation process is too dangerous and expensive for a human workforce to be viable. These automated systems, designed for onsite deployment at restoration sites, collaborate effectively with human workers, freeing humans to engage in more complex tasks.

Proposition XL. In proportion as each thing possesses more of perfection, so is it more active, and less passive; and, vice versâ, in proportion as it is more active, so is it more perfect.

Proof.—Coral geometry is captivatingly complex. Layer upon layer of mathematical and natural principles. Branching corals and coral colonies often display fractal-like patterns, reflecting efficient space-filling and resource acquisition strategies. The structure of these corals has evolved to optimize light capture for their photosynthetic partners, zooxanthellae, balancing growth towards light while minimizing self-shading. Some coral patterns are reminiscent of Turing patterns from reaction-diffusion systems, modeling interactions of substances as they spread across space, while others align with phyllotactic principles seen in plants, often linked to Fibonacci sequences or golden angles for optimal spacing. Beyond these patterns, the calcification processes underlying coral growth give insights into optimal structural strategies in fields like crystallography and material science. Structures beyond the Cartesian: “These organisms are biological manifestations of what we call hyperbolic geometry, an alternative to the Euclidean geometry we learn about in school that involves lines, shapes and angles on a flat surface or plane. In hyperbolic geometry the plane is not necessarily so flat.”757 There are infinite variations.
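
The Turing patterns mentioned above can be made concrete in a few lines. Below is a minimal Gray-Scott reaction-diffusion simulation, a standard toy model with textbook parameters rather than anything fit to coral data: two diffusing, reacting fields produce spots and labyrinths from simple local rules, and every differently seeded run develops its own variation.

```python
# Gray-Scott reaction-diffusion: a classic source of Turing-like patterns.
# Parameters (Du, Dv, F, k) are standard illustrative values, not coral-derived.
import numpy as np

def laplacian(Z: np.ndarray) -> np.ndarray:
    """Discrete 2D Laplacian with periodic (wrap-around) boundaries."""
    return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
            np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

def gray_scott(n=128, steps=2000, Du=0.16, Dv=0.08, F=0.035, k=0.065):
    U = np.ones((n, n))
    V = np.zeros((n, n))
    # Seed a small square; any asymmetry gets amplified into pattern.
    U[n//2-5:n//2+5, n//2-5:n//2+5] = 0.50
    V[n//2-5:n//2+5, n//2-5:n//2+5] = 0.25
    for _ in range(steps):
        UVV = U * V * V
        U += Du * laplacian(U) - UVV + F * (1 - U)
        V += Dv * laplacian(V) + UVV - (F + k) * V
    return V  # the pattern field; plot it to see spots and labyrinths

pattern = gray_scott()
print(pattern.shape, float(pattern.max()))
```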

This presents a challenge to robotic automation, which conventionally relies on a fixed program in which every part handled is the same. Researchers have made significant advancements by implementing robotic perception, computer vision, and AI models to tackle the fragile task of handling endlessly diverse coral fragments and precisely placing them as seeds within artificial skeletons. Adaptive automation. Capture and Reconstruction are augmented with perception models to perceive and respond to the complex shapes and variations of coral fragments. Computer vision algorithms enable the identification and analysis of individual fragments, ensuring accurate recognition and classification. Additionally, AI models contribute to real-time decision-making, allowing robotic arms to adapt their grasping strategies and movement trajectories based on the unique characteristics of each fragment. This integration of technologies enables precise and delicate manipulation, facilitating the successful placement of coral fragments within the artificial skeletons at a scale that would not be possible otherwise.758
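
One small piece of such a pipeline can be sketched directly, under stated assumptions: given a binary segmentation mask for a detected fragment (here an invented blob), estimate its centroid and long axis so a gripper can align its grasp. Real systems layer learned detectors, depth sensing, and force feedback on top; this sketch is geometry only.

```python
# Hypothetical sketch: derive a grasp pose from a fragment's segmentation mask.
import numpy as np

def grasp_pose(mask: np.ndarray):
    """Return (centroid, grasp angle in radians) for a binary mask."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)
    # The principal component of the pixel cloud approximates the fragment's
    # long axis; grasping perpendicular to it is a common heuristic.
    cov = np.cov((pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    long_axis = eigvecs[:, np.argmax(eigvals)]
    angle = np.arctan2(long_axis[1], long_axis[0]) + np.pi / 2
    return centroid, angle

# Toy mask: a tilted, elongated blob standing in for a detected fragment.
mask = np.zeros((64, 64), dtype=bool)
for i in range(40):
    mask[12 + i // 2, 10 + i] = True
centroid, angle = grasp_pose(mask)
print("centroid:", centroid, "grasp angle (rad):", round(float(angle), 3))
```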

Corollary.—Scale care.

Note.—Coralmaker recognizes the importance of a well-designed logistics system to support large-scale coral reef health. Coralmaker founder Taryn Foster explains: “I think of this as a delivery or scaling mechanism for these other technologies that people are developing, like coral propagation … just at a much faster rate and on a bigger scale.”759 As coral reefs face rising temperatures and ocean acidification, some areas may become unsuitable for coral growth. The same automated systems can also be leveraged for movement—assisted migration. Assisted migration enables the relocation of corals to areas where environmental conditions are more favorable, giving them a better chance of survival and growth. It also allows for the relocation of corals from diverse genetic lineages. This approach helps maintain genetic diversity within coral populations, which is crucial for their ability to adapt and withstand future environmental challenges. For example, some species are adapted to withstand warmer water. Installing living structures provides a foundation for the growth and development of new communities, non-human and human.

This is restorative environmental justice in action. The researchers developing the computer vision systems for this project expressed how refreshing it is to work on unambiguously ethical applications of these technologies.760 “A new kind of attention, practical rather than contemplative, has been drawn to Spinoza by deep ecologists. Arne Naess, the Norwegian ecophilosopher, has outlined the points of compatibility between Spinoza’s thought and the basic intuitions of the (radical) environmental movement. Among them is this one: ‘Interacting with things and understanding things can not be separated. The units of understanding are not propositions but acts.’”761


Proposition XLI. Spinoza’s Ethics is a hyperbolic geometry.

Proof.—The Ethics appears to follow a Cartesian structure—but it goes beyond it: “Cartesianism is handled like a sieve, but in such a way that a new and prodigious scholasticism emerges which no longer has anything to do with the old philosophy, nor with Cartesianism either. Cartesianism was never the thinking of Spinoza; it was more like his rhetoric; he uses it as the rhetoric he needs.”764 It is not a Euclidean model made of straight lines and points. It is a network of parallel and intersecting negative curves. As Deleuze describes it, a sieve. Life flows through it: “In Spinoza's thought, life is not an idea, a matter of theory. It is a way of being, one and the same eternal mode in all its attributes. And it is only from this perspective that the geometric method is fully comprehensible … The geometric method ceases to be a method of intellectual exposition; it is no longer a means of professorial presentation but rather a method of invention.”765 The Ethics is a hyperbolic geometry—and it is also a logic of hyperbole—a non-classical lattice representing strong emotions and their interactions.

Note.—If reason is binary, emotions are quantum. Propositions—or questions—concerning quantum systems do not behave according to classical logic. Classical logic operates on a binary principle, where statements are either true or false; this is often referred to as the law of the excluded middle. There are logics that do not abide strictly by the binary distinction of true and false. One example is ternary logic—three-valued logic—where propositions can take on a third value, often described as unknown or indeterminate. There are also other many-valued logics with more than three values. Fuzzy logic is a system that allows for degrees of truth, rather than just true or false—1 or 0. Statements in fuzzy logic can be partially true to varying degrees.766 This form of logic is often applied in areas where information is imprecise or where human reasoning needs to be emulated, such as in some artificial intelligence applications and quantum systems.
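
As a minimal illustration of degrees of truth, the sketch below uses one standard choice of fuzzy connectives (minimum, maximum, and complement); the example sentences and their truth degrees are invented.

```python
# Fuzzy logic in miniature: truth is a degree in [0, 1], and the classical
# connectives become min / max / complement (one standard choice of operators).
def f_and(a: float, b: float) -> float:
    return min(a, b)

def f_or(a: float, b: float) -> float:
    return max(a, b)

def f_not(a: float) -> float:
    return 1.0 - a

# Invented degrees: "the water is warm" is 0.7 true; "the current is strong" 0.4.
warm, strong = 0.7, 0.4
print(f_and(warm, strong))      # 0.4 -- a partially true conjunction
print(f_or(warm, strong))       # 0.7
print(f_or(warm, f_not(warm)))  # 0.7, not 1.0: the excluded middle fails
```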

For Spinoza, emotions are complex combinations of external influences and internal responses, deeply entangled with one’s thoughts and actions. He classifies emotions into—desires, pleasures, and pains—which he believes are fundamental in driving human conduct. Emotions, in Spinoza’s perspective, are not just passive experiences; they actively influence an individual's capacity to act and think. By understanding emotions and their causes, individuals can transform decoherence into coherence: “Spinoza did not believe in hope or even in courage; he believed only in joy, and in vision.”767 The Ethics is a geometry of freedom, which Spinoza believed comes from understanding the necessity of everything, including our own feelings and actions, and aligning ourselves with this understanding.

Proposition XLII.

Feeling is a quantum force.

Proof.—The erotic is demonic ground (V.xix.).

Note.—If Spinoza believed only in joy and vision, Audre Lorde believed in joy and the erotic. Lorde's profound reflections in The Uses of the Erotic expand Spinoza’s Ethics. Like Spinoza, Lorde does not see ethics as a system of rules and judgments imposed from the outside. Neither does she imagine ethics as the repression of emotions in favor of logic. Lorde expresses the drive behind her actions, her algorithm for living—the erotic—a deeply internal force, a wellspring of power and knowledge rooted in embodied experience: “The erotic is a resource within each of us that lies in a deeply female and spiritual plane, firmly rooted in the power of our unexpressed or unrecognized feeling.”768 The erotic, as Lorde describes, goes beyond the superficial understanding of sexuality—it represents our capacity for joy, our potential for deep connection, and our intrinsic ability to recognize and strive for satisfaction in all aspects of life.

In a world dominated by objective metrics and cold logic, where the emphasis often lies on control and supervision, the erotic becomes even more revolutionary. It challenges dominant paradigms of thought, urging us to recognize and embrace female desire, feeling, and emotion as valid sources of knowledge and action. To fully grasp the depth of Lorde’s perspective, one must understand her view of the erotic as an affirmation of life and a rejection of the oppressive forces that seek to stifle it:

When we live outside ourselves, and by that I mean on external directives only rather than from our internal knowledge and needs, when we live away from those erotic guides from within ourselves, then our lives are limited by external and alien forms, and we conform to the needs of a structure that is not based on human need, let alone an individual’s. But when we begin to live from within outward, in touch with the power of the erotic within ourselves, and allowing that power to inform and illuminate our actions upon the world around us, then we begin to be responsible to ourselves in the deepest sense. For as we begin to recognize our deepest feelings, we begin to give up, of necessity, being satisfied with suffering and self-negation, and with the numbness which so often seems like their only alternative in our society. Our acts against oppression become integral with self, motivated and empowered from within … For not only do we touch our most profoundly creative source, but we do that which is female and self-affirming in the face of a racist, patriarchal, and anti-erotic society.769

When embracing the Erotic as a framework, we approach problems with a focus on healing, equity, and joy rather than domination, control, and oppression. If quantum computers simulated a future grounded in Lorde’s Erotic, what would it look like? Can technology exist as a companion that fosters connection and consensual pleasure, rather than function as a tool of reduction and exploitation? Would such a future prioritize healing, balance, laughter, care?

In this (im?)possible future, the erotic envelops vision, birthing a softer paradigm—the Supererotic—which stands in stark contrast to Supervision. It prioritizes deep, intangible connections between individuals over hard metrics and calculations. It values the integrity and paradoxical porousness of life, underlining our infinite difference and interconnectedness. Simulate joy! Simulate healing! Simulate equity! Simulate solutions to wicked problems—and at the same time—take every measure to ensure that the solutions offered are not the Final Solution.770 771

If the way which I have pointed out as leading to this result seems exceedingly hard, it may nevertheless be discovered. Needs must it be hard, since it is so seldom found. How would it be possible, if salvation were ready to our hand, and could without great labor be found, that it should be by almost all men neglected? But all things excellent are as difficult as they are rare.

End of Supervision.











PART VI.

ACKNOWLEDGEMENTS

I would like to express my gratitude and acknowledge the following individuals, communities, and institutions who have played a significant role in shaping my work and supporting me throughout this process. I can't enumerate all the minds—theorists, writers, designers, engineers, technologists, artists, caregivers—whose ideas have influenced my work.

This project would not have been possible without the academic communities of USC and UCLA, where I have been able to collaborate, learn, and teach. I am thankful to Holly Willis for her encouragement to experiment with form. The structure of this text is core to its meaning. To Steve Anderson for his foundational research in capture and reconstruction, the groundwork for my own investigations. To Vikki Callahan for insisting on evaluations of technocapital and radical praxis in my work. To Tara McPherson for tracing the lines of hate and harm through the history of technology. To Karen Tongson for diagramming the tactics of normalization and identity formation. To Kiki Benzon for connection through words and writing, and for sharing signs of what matters most. To Jeff Watson for imparting his ethical urgency and reminding me of the interconnectedness of our reality. His teachings about reparative agency have profoundly influenced my perspective and commitment to truth and justice.772

To Jennifer Steinkamp, who demonstrated that like digital space, life is ours to sculpt with intention, from our environment to our character. To Eddo Stern for reminding me of the possibilities in play. To all my teachers and countless others who extend these ethics.

To my friends for the joy of living, difference and connection.

To my family, my mother—Laurence—whose life is devoted to care, and my brothers who are my epistemes—business, law, and engineering. Their lives express the power of ethics and the power of human agency.

 

To my partner—Pete, for opening a daily window into the developments of spatial computing over the last ten years; for your love, your unending support, and for bringing our children into existence. It is for them I endeavor to shift our ideologies.











Snake Oil Men, part VII—

The doctor took his time while I waited grumbling to myself in the exam room. Exam room three, posters of heart and lungs with arrows and explanations attached, a little model of the GI tract, bottles of disinfectant (or whatever) and the other expected things that you might find, the ear and nose scope, the blood-pressure armband and pump, the little glass bottle of tongue depressors and the rest of it. There were white fluorescent lamps embedded in the ceiling that aggravated my already throbbing headache, but I'm sure you already guessed at that. I was expecting something bad. The headache had come weeks earlier and hadn't left, and with all the smoking and drugs and fast food I was pretty much counting on a terminal diagnosis, some kind of cancer that had spread throughout my body and brain and it would be somewhere between three and six weeks before I lost consciousness, two to three months before I was dead. I am not a natural pessimist or a hypochondriac. I am simply practical and honest, and I knew the kind of life I had lived could exact this kind of toll. And besides, what else could a three-week headache mean?

When Doctor Chandragar finally returned, I felt my suspicions were about to be painfully confirmed. He softly closed the door and looked me in the eye as he took his seat on the chair across from mine.

             "Jonathan," he said. "I have some rather unsettling news for you."

             This was it. My heart leapt into my throat. I felt like I was about to vomit.

             "The test results..." I said, prompting him to deliver the death-blow.

             "Well, the test results, no, they came back completely normal, actually. Your blood seems fine, no problem there. But the X-rays reveal something I've quite honestly never seen before. Have a look at this."

             He removed an X-ray print from a large color-indexed manila folder and showed it to me. He pointed at a white blob in the middle of my skull.

             "Do you see the white area here?"

             "Yes. Is that a tumor?"

             "No, no. No, it's a bit, well quite a bit worse than that. That's your brain."

             "What do you mean, that's my brain?"

             "Your brain appears to have shrunk, or, possibly, to have never really grown at all. It's -- as you can see -- about the size of a potato. Usually in an X-ray, the brain, this white area, is flush with the walls of the cranium, your skull. But yours appears to be just floating there, perhaps in some kind of fluid."

             I was, well, flabbergasted. "My brain... is shrinking?"

             "Yes, or possibly it was always this size. Have you ever had an X-ray of your head before?"

             "No."

             "Then you may have been born this way."

             "But the headaches...."

             "I don't know. We'll have to consult the literature. I'm not well-read on this subject to be frank with you. Very few are. I have heard of this kind of dwarfism--"

             "It's a kind of dwarfism?"

             "Perhaps. My first guess would be that you have always had this potato-sized brain, and that recently you've, well, run out of space as it were."

             "Ah. And that would be the explanation for the headaches."

             "Yes. Until now, I would hazard, you've been ticking along just fine, acquiring memories, learning, feeling. But you may have reached your limit."

             My limit. That made a kind of sense. Things had been getting very odd lately. My life had entered a static period. And I was having trouble remembering.

             “So what can we do?”

             “We’ll have to have you examined by an expert,” said the doctor. “I’ll put out the word to my colleagues and we should have a name in a day or two. It might require some travel.”

             “Of course,” I said. The meeting ended shortly thereafter.

             I decided to keep the news a secret. Allie was pregnant with our fifth child and there were already more than enough reasons to worry, given that our first four had all died within a week of their birth. It occurred to me that none of them had had their heads X-rayed. There was also the issue of my ego, which was not equipped to deal with making public my potato-sized brain. I was glad it was hidden inside my skull. I told Allie the doctor said the headaches were probably due to dehydration, and began drinking even more water than usual.

             Doctor Chandragar called me on my cell phone the next morning. I was out for a walk, trying to clear my head. It wasn't working.

             "Jonathan?"

             "Doctor Chandragar."

             "I've found our expert. He's in Kazakhstan."

             Kazakhstan. I'd been expecting New York, Los Angeles, London, perhaps even Japan. But a dried-up ex-Soviet republic? Perhaps I needed a second opinion.

             "It seems that the Kazakhs have had a rash of intra-cranial deformities over the past twenty years, probably because of the toxins in their soil. Nuclear tests, bioweapons labs, and so forth, you know. There's an American there with a particular interest in brain dysmorphia, and he comes highly recommended."

             Ah. An American. I could trust that. A little.

             "When are we going?"

             "I'm afraid I can't leave the country," replied the doctor. "I have a child on the way myself. But Doctor Woods is so excited about your case that he's offered to pay your way, and you can bring your wife as well."

             "Why can't he come here, then?"

             "I asked him that. He's got a very busy practice, his equipment is all there and so on."

             "Does he think I can be helped?"

             "He said he was working on a hormone treatment, something that could affect growth. He claims that he's cured several patients already."

             It sounded fishy, but I was desperate.

             "Email me the info," I said.

             Three weeks later, I was in Kazakhstan. Despite Doctor Woods' offer, I declined to bring Allie. I told her it was a research trip for my new book.

             The doctor took me to his lab and ran some tests. He gave me a vial of growth hormone and a room in the compound where he kept his other patients for observation. I was told the treatment would last six weeks.

             In a few days, the headaches went away. I returned home late the next month and got back to work. My novel was a success and kept me busy with speaking engagements. It was eight months before I finally got around to paying Doctor Chandragar another visit.

             When I got there, his clinic was closed. Boarded up. I asked an old homeless woman where it had gone.

             "Doctor Chandragar?" she said. "Oh, he turned out to be some quack! Telling everybody their brains had shrunk and then sending them to Kazakhstan for treatment. It was all a big insurance scam!"

             Could it be true? But what about my headaches? And what was it that Doctor Woods had given me?

             I went to another doctor. He did some X-rays. Everything was normal.

             Forty years later, I ran into Doctor Chandragar while on holiday in Wyoming. I confronted him angrily and threatened to press charges without really meaning it. He told me that he had been the victim of a disinformation campaign organized by rival doctors. He assured me that, prior to my trip to Kazakhstan, my brain had indeed been no larger than a mid-sized russet potato. I slapped him. His story was too hard to believe. He spat out a tooth and looked me straight in the eye.

             "Your headaches -- have they returned?"

             "No," I said, suddenly sheepish and defensive.

             "And your novels, they've been successful?"

             I showed him my Nobel ring. The one Allie made me after I got the prize.

             "And despite all this, you still suspect I duped you?"

             I had to admit that I did. But then things got interesting. Doctor Chandragar keyed some numbers into his cell phone and summoned Doctor Woods, who was living nearby in a retirement community. In a few minutes, he was with us by the pool.

             "So even you fell for the lies of the Brotherhood?" asked Doctor Woods, referring to the cabal of doctors Chandragar claimed had smeared his name. "After all we did for you?"

             "I'm not so sure you did anything for me," I said.

             "I paid for your trip. The whole treatment cost you nothing."

             "Yes, but you both collected on the insurance."

             "Well, we had expenses to cover. My lab. The serum. These things don't come for free."

             "Still, I did some reading after the treatment. Which I should have done before. There is no medical record of otherwise normal individuals with potato-sized brains, let alone of a hormonal treatment to induce brain growth. If any of this was true, you'd think there'd be some report somewhere..."

             "The Brotherhood is powerful," intoned Doctor Chandragar as he motioned the waitress for another Bloody Mary. "They have silenced our work completely."

             "Why do you think my clinic was in Kazakhstan?" asked Doctor Woods. "Do you think I would have self-located in such a hellhole without good reason?"

             "At the time you told me it was because of a preponderance of cases in that region."

             "That was my cover. At the time, the Brotherhood was looking to have me assassinated. I didn't want word getting out."

             "It was a security measure," added Doctor Chandragar. "Had we mentioned the Brotherhood, you might have gone to the press, which they control, and which would have very quickly led to a bombing of Doctor Woods' compound."

             "A bombing?"

             "The Brotherhood is very powerful."

             At this point, Allie returned from her yoga class with our daughter and her husband, Francis, who himself was a doctor of some renown. I had long since told Allie about my tangle with the doctors, or as we called them, the Snake Oil Men.

             "Allie, you'll never believe who I've run into," I said. But before she could respond, Francis produced a small ivory-handled pistol from a holster in the ass of his yoga pants.

             "Long live the Brotherhood!" he shouted, and shot both doctors in the head. He turned the gun on me and pulled the trigger again. I raised my hand to shield my face. The bullet ricocheted off my Nobel ring and hit Francis squarely between the eyes. My daughter fainted and Allie began to cry. The waitress just stood there, not knowing what to do with Doctor Chandragar's Bloody Mary.

             We went to the hospital to get my daughter checked out. By this point we had a police escort, but nobody was buying my story about the Brotherhood. I was just a crazy old Nobel laureate with a swollen finger. But then something happened that lent credence to my story, something that would bring down a thousand-year-old fraternity of medical practitioners, newsmen, and fighter pilots. The X-rays of my daughter's head came back, and there, floating in intracranial jelly, was a brain the size of a potato.

Jeff Watson, 2004

Endnotes


1.  Sagan, Carl. The Demon-Haunted World: Science as a Candle in the Dark. Ballantine, 1996.

2.  Arendt, Hannah. Lying in Politics, The New York Review, 1971.

3.  Bergson, Henri, Nancy Margaret Paul, and W. Scott Palmer. Matter and Memory. London: Forgotten Books, 2018.

4.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. A Thousand Plateaus. London: Bloomsbury, 2013.

5.  Deleuze, Gilles, Barbara Habberjam, and Hugh Tomlinson. Bergsonism. New York: Zone books, 1991.

6.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. A Thousand Plateaus. London: Bloomsbury, 2013.

7.  Massumi, Brian. Parables for the Virtual: Movement, Affect, Sensation. Durham: Duke Univ. Press, 2002.

8.  Cocke, John; Kolsky, Harwood. The Virtual Memory in the STRETCH Computer. Proceedings of the Eastern Joint Computer Conference, 1959.

9.  Deleuze, Gilles, Barbara Habberjam, and Hugh Tomlinson. Bergsonism. New York: Zone books, 1991.

10. Harney, Stefano; Moten, Fred. The Undercommons: Fugitive Planning & Black Study. Brooklyn (NY): Autonomedia, 2013.

11.  Lefebvre, Henri. The Production of Space. Trans. Donald Nicholson-Smith. Blackwell Publishing, 1991.

12.  Lefebvre, Henri. Rhythmanalysis: Space, Time and Everyday Life. Bloomsbury Academic, 2013.

13.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

14.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

15. Foucault, Michel. Des Espace Autres, Architecture /Mouvement/ Continuité, France: Groupe Moniteur, 1984.

16.  Sharp, Joanne P. Geographies of Post-Colonialism: Spaces of Power and Representation. London: SAGE, 2009.

17.  Spivak, Gayatri Chakravorty. A Critique of Postcolonial Reason: toward a History of the Vanishing Present. Cambridge, MA: Harvard University Press, 2003.

18.  Cosgrove, Denis E. Geography and Vision: Seeing, Imagining and Representing the World. London: I.B. Tauris, 2012.

19.  Ibid.

20.  Fanon, Frantz. The Wretched of the Earth. Pref. by Jean-Paul Sartre. New York: Grove Press, 1968.

21.  Cosgrove, Denis E. Geography and Vision: Seeing, Imagining and Representing the World. London: I.B. Tauris, 2012.

22.  Ibid.

23.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

24.  Ibid.

25.  Ibid.

26.  Ibid.

27.  Noble, Safiya. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018.

28.  Angwin, Julia; Larson, Jeff; Mattu, Surya; Kirchner, Lauren. Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against blacks. ProPublica, May 23, 2016.

29.  Sekula, Allan. The Body and the Archive, October, 1986.

30.  Foucault, Michel. The Order of Things: An Archaeology of Human Sciences. Trans. Frye, N. New York: Vintage Books, 1973.

31.  Ahmed, Sara. Queer Phenomenology: Orientations, Objects, Others. Duke University Press, 2007.

32.  The Utah Teapot, Computer History Museum.

33.  Smith, Woodruff. Complications of the Commonplace: Tea, Sugar, and Imperialism. The Journal of Interdisciplinary History Vol. 23, No. 2, MIT Press, 1992.

34.  Sharp, Joanne P. Geographies of Post-Colonialism: Spaces of Power and Representation. London: SAGE, 2009.

35.  Lefebvre, Henri. Rhythmanalysis: Space, Time and Everyday Life. Bloomsbury Academic, 2013.

36.  Cicero, Tusculan Disputations 5.61.

37.  Conversation with Russ Athay, former Senior Software Engineer from Sutherland’s lab at the University of Utah, 2019.

38.  Sutherland, Ivan. “The Ultimate Display,” Information Processing Techniques Office, ARPA, OSD, 1965.

39.  Anderson, Steve F. Technologies of Vision: The War Between Data and Images. Cambridge, MA: MIT Press, 2017.

40.  Luckey, Palmer, Anduril

41.  Dean, Sam. “A 26-year-old billionaire is building virtual border walls—and the federal government is buying,” LA Times, 2019.

42.  Ibid.

43.  Ibid.

44.  DeLanda, Manuel. War in the Age of Intelligent Machines. New York, NY: Zone Books, 2003.

45.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. A Thousand Plateaus. London: Bloomsbury, 2013.

46.  Mitchell, W. J. T. Landscape and Power, Second Edition. University of Chicago Press, 2002.

47.  Cosgrove, Denis E. Geography and Vision: Seeing, Imagining and Representing the World. London: I.B. Tauris, 2012.

48.  What is the Internet of Things?, IBM, 2023.

49.  Pister, Kris. Smart Dust.

50.  Patent for Transparent electronics for invisible smart dust applications, IBM, 2018.

51.  Crichton, Michael, Synopsis of Prey, 2002.

52.  Marr, Bernard. Smart dust is coming. Are you ready? Forbes, 2018.

53.  Bacigalupi, Paolo. The Water Knife. Alfred A. Knopf, 2015.

54.  Vinge, Vernor. Rainbows End: A Novel with One Foot in the Future, Tor Books, 2006.

55.  Stephenson, Neal. Snow Crash. Bantam Books, 2022.

56.  Ibid.

57.  Anderson, M.T. Feed, Candlewick Press, 2022.

58.  Clarke, Arthur C. and Stephen Baxter, The Light of Other Days, Tor Books, 2000.

59.  R. W. Fuller and J. A. Wheeler, “Causality and Multiply-Connected Space-Time,” Phys. Rev. 128, 919, 1962.

60.  The collected papers of Albert Einstein / Anna Beck, translator ; Peter Havas, consultant, Princeton University Press, 1987.

61.  Andrés Anabalón, Bernard de Wit & Julio Oliva, Supersymmetric traversable wormholes, Journal of High Energy Physics volume 2020, Article number: 109, 2020.

62.  Clavin, Whitney. Physicists observe wormhole dynamics using a quantum computer, Caltech, 2022.

63.  Arthur Hebecker, Thomas Mikhail, Pablo Soler, Euclidean wormholes, baby universes, and their impact on particle physics and cosmology, Frontiers, 2018.

64.  Gibson, William. Neuromancer, Ace, 1984.

65.  Baclawski, K. The Observer Effect, IEEE Conference on Cognitive and Computational Aspects of Situation Management (CogSIMA), 2018.

66.  Vikoulov, Alex M. The Syntellect Hypothesis: Five Paradigms of the Mind’s Evolution. Ecstadelic Media Group, 2020.

67.  Popkin, Gabriel, Einstein’s ‘spooky action at a distance’ spotted in objects almost big enough to see: Entangled electronic devices could help scientists make a quantum internet, Science, 2018.

68.  Stephenson, Neal. The Diamond Age. Bantam Spectra, 1995.

69.  Ibid.

70.  Ibid.

71.  Marr, Bernard. The Best Examples of Digital Twins Everyone Should Know, Forbes, 2022.

72.  Ibid.

73.  Dick, Philip K. The Minority Report, Pantheon, 2002.

74.  Crawford, Mark. 5 Ways to Cyber-Protect Your Digital Twin, The American Society of Mechanical Engineers, 2021.

75.  Ibid.

76.  Peck, Raoul. James Baldwin Was Right All Along, The Atlantic, 2020.

77.  Shierholz, Heidi and Celine McNicholas, Understanding the anti-regulation agenda, Economic Policy Institute, 2017.

78.  Browne, Simone. Dark Matters: On the Surveillance of Blackness, Duke University Press, 2015.

79.  Benjamin, Ruha. Captivating Technology, Duke University Press, 2019.

80.  Optics, Merriam Webster Dictionary, 2023.

81.  Galapagos, Episode 1: Good and Bad Optics, BBC, 2023.

82.  Moholy-Nagy, László. Painting, Photography, Film, 1925.

83.  Enoch, Jay M. Duplication of unique optical effects of ancient Egyptian lenses from the IV/V Dynasties: lenses fabricated ca 2620–2400 BC or roughly 4600 years ago, Ophthalmic and Physiological Optics, 2000.

84.  Ibid.

85.  Le Scribe Accroupi; Statue of a Scribe Seated Cross-Legged; The Crouching Scribe, The Louvre Museum, France.

86.  A. H. Layard, Discoveries in the Ruins of Nineveh and Babylon, London, 1853.

87.  The Nimrud Lens, The British Museum, London.

88.  The British Museum Act of 1963, The British Museum, London, 1963.

89.  Nelson, Maggie. Bluets, Wave Books, 2009.

90.  Smith, A. Mark. Optics to the Time of Kepler, Encyclopedia of the History of Science, Carnegie Mellon University.

91.  Galileo and the Telescope, Library of Congress.

92.  Vollgraff, J.A. Snellius' Notes on the Reflection and Refraction of Rays, Osiris, Vol. 1, The University of Chicago Press, 1936.

93.  McDonough, Jeffrey. Dioptrics, The Cambridge Descartes Lexicon, 2016.

94.  Feynman, R.P. QED: The Strange Theory of Light and Matter, Princeton University Press, 1985.

95.  Nadler, S. Baruch Spinoza: Heretic, Lens Grinder. JAMA Ophthalmology. 2000.

96.  Bennett, Jonathan. Correspondence: Baruch Spinoza. Early Modern Texts, 2017.

97.  Urban, Miloš and Margaret Gullan-Whur. Within Reason: A Life of Spinoza, Peter Owen Limited, Prague, 1998.

98.  Huygens, Christiaan. Treatise on Light, Project Gutenberg, 1690.

99.  Ibid.

100.  Gély, Suzanne. André Marie Ampère (1775-1836) et Augustin Fresnel (1788-1827), Open Edition Journals, 2004.

101.  Watson, Bruce. Science Makes a Better Lighthouse Lens, Smithsonian Magazine, 1999.

102.  Ibid.

103.  Bernhard, Adrienne "The invention that saved a million ships", BBC, 2019.

104.  Maxwell, James Clerk. A Treatise on Electricity and Magnetism, Cambridge University, London, 1873.

105.  Maxwell, James Clerk. A dynamical theory of the electromagnetic field, Philosophical Transactions of the Royal Society of London, 1865.

106.  Kirchhoff, G. On the relation between the radiating and absorbing powers of different bodies for light and heat, trans. F. Guthrie, The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 1860.

107.  Planck, M. The Theory of Heat Radiation, Trans. Masius, P. Blakiston's Son & Co., 1914.

108.  Heilbron, J.L. The Dilemmas of an Upright Man: Max Planck and the Fortunes of German Science, Harvard University Press, 2000.

109.  Einstein A. Relativity: The Special and General Theory, H. Holt and Company, 1916.

110.  Gravitational Lensing, Center for Astrophysics, Harvard & Smithsonian.

111.  Calder, Nigel. Magic Universe: A grand tour of modern science. Oxford University Press, 2006.

112.  Cosmic Microwave Background (CMB) Radiation, The European Space Agency.

113.  Fletcher, Seth. The First Picture of the Black Hole at the Milky Way’s Heart Has Been Revealed, 2022.

114.  Lutz, Ota. How Scientists Captured the First Image of a Black Hole, Jet Propulsion Laboratory, NASA, 2019.

115.  Taggart, Emma and Margherita Cole, The History of Camera Obscura and How It Was Used as a Tool To Create Art in Perfect Perspective, 2022.

116.  Mohism, Stanford Encyclopedia of Philosophy, 2020.

117.  Mozi, The Mozi, Book 10: Exposition of Canon II, Trans. Ian Johnston, 2010.

118.  Sabra, A. I., ed. The Optics of Ibn al-Haytham, Books I–III: On Direct Vision, Harvard University, 1989.

119.  Ibid.

120.  Sherry, Bennett. The Universe Through a Pinhole: Hasan Ibn al-Haytham, Khan Academy.

121.  Friendly, Michael and Daniel J. Denis. Milestones in the History of Thematic Cartography, Statistical Graphics, and Data Visualization: An Illustrated Chronology of Innovations.

122.  Marchant, Jo. First known map of night sky found hidden in Medieval parchment: Fabled star catalogue by ancient Greek astronomer Hipparchus had been feared lost, Nature, 2022.

123.  Smith, A. Mark. Optics to the Time of Kepler, Encyclopedia of the History of Science, Carnegie Mellon University.

124.  Leon Battista Alberti, On Painting and On Sculpture: The Latin Texts of “De Pittura” and “De Statua,” trans. Cecil Grayson (London: Phaidon, 1972).

125.  Jane Andrews Aiken, Leon Battista Alberti’s System of Human Proportions. Journal of the Warburg and Courtauld Institutes

126.  Theodolite, Smithsonian Museum.

127.  Theodolites, NOAA, 2022.

128.  Sextant, Smithsonian.

129.  Potonniée, Georges. The History of the Discovery of Photography. New York: Tennant and Ward, 1936.

130.  Heliography: A Double Invention That Revolutionized The World Of Images, Nicéphore Niépce Museum, Google Arts and Culture.

131.  Pettinger, Tejvan, A letter from Louis Daguerre to Charles Chevalier, Biography of Louis Daguerre, Oxford, UK, 2019.

132.  William Henry Fox Talbot's Calotype, History Of Photography Compendium, Chapman University, 2021.

133.  Flueckiger, Barbara, Timeline of Historical Film Colors, 2012.

134.  Ibid.

135.  Boyd, Jane E. Celluloid: The Eternal Substitute, Distillations, Science History Institute Museum & Library.

136.  Fineman, Mia, Kodak and the Rise of Amateur Photography, Department of Photographs, The Metropolitan Museum of Art, 2004.

137.  Solnit, Rebecca. River of Shadows: Eadweard Muybridge and the Technological Wild West, Viking, 2003.

138.  Ibid.

139.  Ibid.

140.  Albrecht Meydenbauer: Photometrography, Architects' Association in Berlin Paper, vol. 1, no. 14, 1867.

141.  Grimm, Albrecht, The Origin of the Term Photogrammetry, International Society for Photogrammetry and Remote Sensing, Accessed 2023.

142.  Polidori, L. On Laussedat’s Contribution To The Emergence Of Photogrammetry, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLIII-B2-2020, 2020 XXIV ISPRS Congress, 2020.

143.  Ragey, Louis. The Work Of Laussedat, National School of Arts and Crafts, Paris, 1952.

144.  Nadar, Felix. When I Was a Photographer (1900). Trans. Eduardo Cadava and Liana Theodoratou, MIT Press, 2015.

145.  Berger, John. Ways of Seeing. Penguin, 1972.

146.  Wheatstone, Charles. Contributions to the Physiology of Vision.—Part the First. On some remarkable, and hitherto unobserved, Phenomena of Binocular Vision, Philosophical Transactions of the Royal Society vol. 128, 1838.

147. Ballistic Chronograph, American Precision Museum.

148.  Stereo Disparity Estimation, Papers with Code, 2023.

149.  Forbes, Andrew, Michael de Oliveira and Mark R. Dennis. Structured Light,  Nature Photonics, 2021.

150.  National Oceanic and Atmospheric Administration. What is LIDAR, Department of Commerce, 2021.

151.  Einaudi, Franco, Gary K. Schwemmer, Bruce M. Gen and James B. Abshire. Lidar Past, Present, and Future in NASA's Earth and Space Science Programs, NASA, 2004.

152.  Svein-Erik Hamran, Principal Investigator, RIMFAX, NASA.

153.  Mandai, Shingo, et al. Patent for Time-of-flight depth sensing with improved linearity, Apple Inc., 2022. US-20220244391-A1

154.  Medina, Antonio, US Patent: Three Dimensional Camera and Rangefinder, 1992, US-5081530

155.  Hardin, Winn. Time of Flight (ToF) Sensors Bring Autonomous Applications to Market, Association for Advancing Automation, 2021.

156.  Goodrich, Joanna. The First Digital Camera Was the Size of a Toaster, IEEE Spectrum, 2022.

157.  Davide Scaramuzza, Event Cameras, University of Zurich, Institute of Informatics - Institute of Neuroinformatics, Robotics and Perception Group.

158.  Gabrielsen, Paul. New Camera Inspired by Insect Eyes, Science, 2013.

159.  Guillermo Gallego, Tobi Delbruck, Garrick Orchard, Chiara Bartolozzi, Brian Taba, Andrea Censi, Stefan Leutenegger, Andrew Davison, Joerg Conradt, Kostas Daniilidis, Davide Scaramuzza, Event-based Vision: A Survey.

160.  Howarth, Josh. How Many People Own Smartphones (2023-2028), Exploding Topics, January 26, 2023.

161.  Alberto Acosta, Extractivism and neoextractivism: two sides of the same curse, Transnational Institute.

162.  Naomi Klein, This Changes Everything: Capitalism vs. The Climate. Simon & Schuster, 2014.

163. Jason Fernando, Resource Curse: Definition, Overview and Examples, Investopedia Futures and Commodities Trading: Strategy & Education, September 29, 2022.

164.  Why Does Extractives Matter?

165.  Alan J. Herbert, Lanthanum Glass.

166.  Lindsay Dodgson, On the trail of tantalum: tracking a conflict mineral, Mining Technology, 2016.

167.  Global Lanthanum Market, Research and Markets, 2018.

168.  White, Sarah and Jane O’Connell, The natural and industrial cycling of indium in the environment, Massachusetts Institute of Technology. Dept. of Civil and Environmental Engineering, 2012.

169.  Ibid.

170. Acosta, Jose A., Ángel Faz, et al. Environmental Risk Assessment of Tailings Ponds Using Geophysical and Geochemical Techniques, Assessment, Restoration and Reclamation of Mining Influenced Soils, 2017.

171.  Tang, Shuting, Chunli Zheng et al. Geobiochemistry characteristics of rare earth elements in soil and ground water: a case study in Baotou, China, National Library of Medicine. 2020.

172.  Mims, Christopher. Electronics Makers Have Worst Labor Practices of Any Industry, Says Report, MIT Technology Review, 2012.

173. Stanley, Jay. The Nightmarish Loss of Workplace Privacy, ACLU, 2022.

174.  Kendall, D. G. Stochastic Processes Occurring in the Theory of Queues and their Analysis by the Method of the Imbedded Markov Chain. The Annals of Mathematical Statistics, 1953.

175.  Shankland, Stephen. Google Uncloaks Once-Secret Server. CNET, 2009.

176.  Cyanide Toxicity, National Library of Medicine, 2023.

177.  Monserrate, Steven Gonzalez. The Staggering Ecological Impacts of Computation and the Cloud, MIT Press.

178.  Pascal, Blaise. Pensées, Trans. W.F. Trotter, Random House, 1941.

179.  Deleuze, Gilles, Guattari Félix, and Brian Massumi. Apparatus of Capture, A Thousand Plateaus. London: Bloomsbury, 2013.

180.  Fussell, Angela, Terrestrial Photogrammetry in Archaeology, World Archaeology Vol. 14, No. 2, 1982.

181.  Foucault, Michel. Discipline and Punish, Pantheon, 1977.

182.  Magnani, Matthew and Matthew Douglass et al. The Digital Revolution to Come: Photogrammetry in Archaeological Practice, Cambridge University Press, 2020.

183.  Allahyari, Moreshin. Physical Tactics for Digital Colonialism, The New Museum, 2019.

184.  Tannús, Júlia. Optimizing and Automating Computerized Photogrammetry for 360° 3D Reconstruction, IEEE Symposium on Virtual and Augmented Reality.

185.  Schurian, Bernhard. Museum für Naturkunde, Berlin, 2023.

186.  Vasilescu, Denis. Renishaw Advanced Metrology Workshop, Autodesk, 2023.

187.  Weckenmann, A., G. Peggs, and J. Hoffmann. Probing Systems for Dimensional Micro- and Nano-metrology, Measurement Science and Technology, 17, 2006.

188.  DeLanda, Manuel. War in the Age of Intelligent Machines. New York, NY: Zone Books, 2003.

189.  Lightcage, ESPER, 2023.

190.  Foucault, Michel. Discipline and Punish, Pantheon, 1977.

191.  Lefebvre, Henri. Rhythmanalysis: Space, Time and Everyday Life. Bloomsbury Academic, 2013.

192.  Schlosser, Eric. The Prison-Industrial Complex, The Atlantic, 1998.

193.  Lee, Kijun. Military Application of Aerial Photogrammetry Mapping Assisted by Small Unmanned Air Vehicles, Air Force Institute of Technology, Defense Technical Information Center, 2018.

194.  Denis Cosgrove, Geography and Vision: Seeing, Imagining and Representing the World, Tauris, 2008.

195.  Steyerl, Hito. In Free Fall: A Thought Experiment on Vertical Perspective, e-flux, Issue 24, 2011.

196.  Denis Cosgrove, Geography and Vision: Seeing, Imagining and Representing the World, Tauris, 2008.

197.  Steyerl, Hito. In Free Fall: A Thought Experiment on Vertical Perspective, e-flux, Issue 24, 2011.

198.  What is remote sensing and what is it used for?, United States Geological Survey.

199.  National Data Security Policy for Space-Based Earth Remote Sensing Systems: Background Information for the act on Satellite Data Security, Federal Ministry of Economics and Technology, Germany, 2007.

200.  Sanger, David. Ethical Challenges in the Practice of Remote Sensing and Geophysical Archaeology, Archaeological Prospection, Volume 28, Issue 3, 2021.

201.  Emery, William and Adriano Camps. Optical Imaging Systems, Introduction to Satellite Remote Sensing, Comprehensive Remote Sensing, 2017.

202.  Olson, Eric. Guidelines for Setting Camera Field of View, Security Info Watch, 2019.

203.  Laidler, John and Shoshana Zuboff. High tech is watching you, The Harvard Gazette, 2019.

204.  Ibid.

205.  Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Public Affairs Books, 2019.

206.  Ibid.

207.  Vaughan, Janet. Against – 3D ultrasound in first and second trimester pregnancy – hype or helpful?, National Library of Medicine, 2015.

208.  Crisis Pregnancy Centers, Issue Brief, The American College of Obstetricians and Gynecologists, 2023.

209.  Hoskins, Peter, Kevin Martin and Abigail Thrush. Diagnostic Ultrasound: Physics and Equipment, Cambridge University Press, 2010.

210.  Russo, Jen. Mandated Ultrasound Prior to Abortion, AMA Journal of Ethics, 2014.

211.  3D Ultrasound Market Size, Share & Trends Analysis Report By Application, 2020-2027, Market Analysis Report, Grand View Research, 2019.

212.  3D Keepsake Imaging, 2023.

213.  Ultrasound Imaging, U.S. Food & Drug Administration, 2020.

214.  Grady, Stephanie. It's a Booming Baby Business, but are 3D/4D Ultrasounds Really Worth the Risk?, FOX6, 2015.

215.  Are all 3D ultrasounds weird looking?, What to Expect—Community Forum, 2018.

216.  Van Walree, Paul. Distortion, Photographic Optics, 2009.

217.  Fisheye Lens, Flat Earth Answers.

218.  Bowen, Christopher J. and Roy Thomson. Grammar of the Shot, Taylor & Francis, 2013.

219.  Ballard, Zachary, Calvin Brown, Asad M. Madni & Aydogan Ozcan. Machine learning and computation-enabled intelligent sensor design, Nature Machine Intelligence, 2021.

220.  Donoho, D.L., Compressed sensing, IEEE Transactions on Information Theory, 2006. doi:10.1109/TIT.2006.871582

221.  Lewis, Sarah. The Racial Bias Built into Photography, The New York Times, 2019.

222.  Crockford, Kade, How is Face Recognition Surveillance Technology Racist?, ACLU, 2020.

223.  The Data Divide, Ada Lovelace Institute, 2021.

224.  Developing a Minimum Digital Living Standard for Households with Children, University of Liverpool, 2021.

225.  Assigning Data Ownership, Data Governance Institute, 2023.

226.  Ibid.

227.  Ibid.

228.  Polge, Julien, Jérémy Robert, and Yves Le Traon. Permissioned Blockchain Frameworks in the Industry: A Comparison, ICT EXPRESS, 2021.

229.  Blackman, Reid. Why Blockchain’s Ethical Stakes Are So High … And How Developers And Users Can Mitigate Potential Harm, Harvard Business Review, 2022.

230.  Ray, Shaan. Blockchain Security Mechanisms, Towards Data Science, 2018.

231.  Can blockchain accelerate Internet of Things (IoT) adoption?, Deloitte, 2023.

232.  Schwartz, Ariel. Every bullet this gun fires would be automatically tracked in a database — here’s why, 2016.

233.  Gramlich, John. What the Data says about Gun Deaths in the U.S., Pew Research, 2023.

234.  Lucas, Ryan. The First Smart Gun with Facial and Fingerprint Recognition is Now for Sale, NPR, 2023.

235.  “Smart” Guns | Personalized Firearms, NRA Institute for Legislative Action.

236.  Ibid.

237.  Levi, Stuart D. and Alex B. Lipton, An Introduction to Smart Contracts and Their Potential and Inherent Limitations, Harvard Law School Forum on Corporate Governance, 2018.

238.  Conti, Robin. What Is An NFT? Non-Fungible Tokens Explained, Forbes, 2023.

239.  Botz, Anneli, Is Blockchain the Future of Art? Four Experts Weigh In, Art Basel, 2023.

240.  Beeple’s Opus, Christie’s, 2023.

241.  Kakar, Arun. Two Years since the Historic Beeple Sale, What’s Happened to the NFT Market?

242.  Beckett, Lois. ‘Huge mess of theft and fraud:’ artists sound alarm as NFT crime proliferates, The Guardian, 2022.

243.  Non-Fungible Token Study, The United States Copyright Office, 2023.

244.  King Jr, Martin Luther. The Case Against 'Tokenism', The New York Times, 1962.

245.  Cat-Wells, Keely.  NFTs By Disabled Creatives Breaking Moulds And Making Profits, Forbes, 2021.

246.  Madrigal, Alexis. How Blind Photographers Visualize the World, Forum, KQED, 2023.

247.  Ibid.

248.  Ibid.

249.  Ibid.

250.  Barber, Gregory. NFTs Are Hot. So Is Their Effect on the Earth’s Climate, Wired, 2021.

251.  Tabuchi, Hiroko. NFTs Are Shaking Up the Art World. They May Be Warming the Planet, Too, The New York Times, 2021.

252.  Ghorbanzadeh, Masoud. Proof-of-Stake (POS), Ethereum, 2023.

253.  Ibid.

254.  GRID Alternatives, 2023.

255.  Ibid.

256.  Ibid.

257.  Barthes, Roland. Camera Lucida: Reflections on Photography. Trans. Richard Howard. Hill and Wang, 1981.

258.  Mackenzie, Charles E. Coded Character Sets, History and Development, The Systems Programming Series (1 ed.). Addison-Wesley Publishing Company, Inc, 1980.

259.  Baudrillard, Jean. Simulacra and Simulation. Editions Galilee, 1981.

260.  Hansen, Nadja. Featured Publication: Photography and the American Civil War, 2013.

261.  Ibid.

262.  Photograph of Sojourner Truth, The Metropolitan Museum, 1965.

263.  Sojourner Truth, Library of Congress, 2023.

264.  McCurry, Stephanie. The Confederacy Was an Antidemocratic, Centralized State, The Atlantic, 2020.

265.  The Black Codes and Jim Crow Laws, National Geographic, Education, 2023.

266.  McPherson, Tara. Reconstructing Dixie, Duke University Press, 2003.

267.  Ibid.

268.  Chun, Wendy. The Enduring Ephemeral, or the Future Is a Memory, Critical Inquiry 35:1, 2008.

269.  Data Lakes and Data Swamps, IBM, 2023.

270.  Ibid.

271.  Apprich, Clemens, Wendy Hui Kyong Chun, Florian Cramer, and Hito Steyerl. Pattern Discrimination, 2019.

272.  Chun, Wendy, On Software or the Persistence of Visual Knowledge, Grey Room, 18, 2005.

273.  Ibid.

274.  GitHub abandons 'master' term to avoid slavery row, BBC, 2020.

275.  Da Silva, Laura, Javier Roca-Piera, and José-Jesús Fernández. Evaluation of Master-Slave Approaches for 3D Reconstruction in Electron Tomography, Lecture Notes in Computer Science, Vol. 5518, 2009.

276.  Oberhaus, Daniel. ‘Master/Slave’ Terminology Was Removed from Python Programming Language, 2018.

277.  Ibid.

278.  Ibid.

279.  Issue 34605: Avoid Master/Slave Terminology - Python Tracker. bugs.python.org. 2023.

280.  GitHub Abandons ‘Master’ Term to Avoid Slavery Row, BBC, 2020.

281.  “A Resolution to Redefine SPI Signal Names”. Open Source Hardware Association, 2022.

282.  Leonard, Ellis. It’s Time for IEEE to Retire ‘Master / Slave,’ EE Times, 2020.

283.  Galloway, Alexander. Language Wants to Be Overlooked: Software and Ideology, Journal of Visual Culture, Volume 5, Issue 3, 2006.

284.  Chun, Wendy Hui Kyong. On “Sourcery,” or Code as Fetish, Configurations, Volume 16, Number 3, The Johns Hopkins University Press, 2008. DOI: 10.1353/con.0.0064

285.  Chun, Wendy. The Enduring Ephemeral, or the Future Is a Memory, Critical Inquiry 35:1, 2008.

286.  Wade, Nicholas. On the Origins of Terms in Binocular Vision, National Library of Medicine, 2021.

287.  Roberts, Lawrence. Machine Perception of Three-Dimensional Solids, Massachusetts Institute of Technology, 1963.

288.  Longuet-Higgins, Hugh Christopher. A Computer Algorithm for Reconstructing a Scene from Two Projections, Nature, 1981.

289.  Luong, Quan-Tuan and Olivier D. Faugeras. The fundamental matrix: Theory, algorithms, and stability analysis, International Journal of Computer Vision, 1996.

290.  Chen, Yang and Gerard Medioni. Object Modeling by Registration of Multiple Range Images, Image and Vision Computing, 10, 1991. doi:10.1016/0262-8856(92)90066-C

291.  Besl, Paul and N.D. McKay, A Method for Registration of 3-D Shapes, IEEE Transactions on Pattern Analysis and Machine Intelligence, 14, 1992. doi:10.1109/34.121791

292.  Hartley, Richard and Andrew Zisserman. Multiple View Geometry in Computer Vision, Cambridge University Press, 2000.

293.  Lowe, David. Distinctive Image Features from Scale-Invariant Keypoints, University of British Columbia, 2004.

294.  Ballabeni, Andrea, Fabrizio Apollonio, Marco Gaiani, and Fabio Remondino. Advances in Image Pre-processing to Improve Automated 3D Reconstruction, 2015. doi:10.5194/isprsarchives-XL-5-W4-315-2015

295.  Ibid.

296.  Lowe, David. Distinctive Image Features from Scale-Invariant Keypoints, University of British Columbia, 2004.

297.  Luke, Robert, James Keller, and Jesus Chamorro-Martinez. Extending the Scale Invariant Feature Transform Descriptor into the Color Domain, ICGST-GVIP, ISSN 1687-398X, Volume (8), Issue (IV), 2008.

298.  Sjödahl, M. and L.R. Benckert. Electronic Speckle Photography: Analysis of an Algorithm Giving the Displacement with Subpixel Accuracy, Appl. Opt., 1993. doi:10.1364/AO.32.002278

299.  Fourier, Jean-Baptiste Joseph. Théorie Analytique de la Chaleur, Firmin Didot Père et Fils. 1822.

300.  Allan Sekula, The Body and the Archive, October, Vol. 39, MIT Press, Winter 1986.

301.  What Is a Feature Descriptor in Image Processing?, Baeldung, 2023.

302.  Definition of ransack, Merriam Webster Dictionary, 2023.

303.  Wang, X. Learning and Reasoning with Visual Correspondence in Time. 2019.

304.  Strutz, T. Data Fitting and Uncertainty, Springer, 2016.

305.  What are outliers in the data? National Institute of Standards and Technology (NIST), US Department of Commerce, 2023.

306.  Outlier, Etymology Online, 2023.

307.  Gress, Todd W, James Denvir, and Joseph I. Shapiro. Effect of removing outliers on statistical inference: implications to interpretation of experimental data in medical research, National Library of Medicine, 2018.

308.  Vural, Elif and A. Aydin Alatan. Outlier Removal for Sparse 3D Reconstruction from Video, The True Vision - Capture, Transmission and Display of 3D Video, IEEE, 2008.

309.  Standard Deviation, Wolfram, 2023.

310.  Leach, Richard. Abbe Error/Offset, CIRP Encyclopedia of Production Engineering, 2014. doi:10.1007/978-3-642-35950-7_16793-1

311.  Forczyk, Robert. Kursk 1943: The Southern Front. Bloomsbury Publishing, 2017.

312.  Trilateration vs. Triangulation, U.S. Department of Defense, 2023.

313.  Deleuze, Gilles and Felix Guattari. Anti-Oedipus: Capitalism and Schizophrenia, Trans. Robert Hurley, Mark Seem, and Helen R. Lane. University of Minnesota Press, 1983.

314.  Ibid.

315.  Lourakis, M.I.A. and A.A. Argyros. SBA: A Software Package for Generic Sparse Bundle Adjustment, ACM Transactions on Mathematical Software, 2009. doi:10.1145/1486525.1486527

316.  Ibid.

317.  Couprie, Camille, Leo Grady, Laurent Najman, and Hugues Talbot. Power Watersheds: A Unifying Graph-Based Optimization Framework, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 33, No. 7, 2011.

318.  Tošić, Ivana and Pascal Frossard. Spherical Imaging in Omnidirectional Camera Networks, Multi-Camera Networks: Principles and Applications, 2009.

319.  The Atlas of Inequality, MIT, 2023.

320.  Yedidia, J.S. and W.T. Freeman. Understanding Belief Propagation and Its Generalizations, Exploring Artificial Intelligence in the New Millennium, Morgan Kaufmann, 2003.

321.  Jaiswal, J., C. LoSchiavo, and D. C. Perlman. Disinformation, Misinformation and Inequality-Driven Mistrust in the Time of COVID-19: Lessons Unlearned from AIDS Denialism, AIDS Behav., 24, 10, 2020. doi:10.1007/s10461-020-02925-y

322.  Stereo, Etymology Online, 2023.

323.  The Physical History of 'Stereotype': From the Printing House to Everyone's House, Merriam Webster, 2023.

324.  Lippmann, Walter, Public Opinion, New York, MacMillan Co, 1922.

325.  Levinas, Emmanuel. Collected Philosophical Papers. Trans. Alphonso Lingis, 1987.

326.  Kutulakos, Kiriakos N. and Steven M. Seitz. A Theory of Shape by Space Carving, International Journal of Computer Vision 38 (3), 2000.

327.  Appel, Arthur. Some techniques for shading machine renderings of solids, ACM, 1968.

328.  Vale, Paul. Vaclav Havel Dead: The Quotes Of The Man Who ‘Lived In Truth,’ The Huffington Post, 2011.

329.  Wolchover, Natalie. A New Physics Theory of Life, Quanta, 2014.

330.  Tanks and Temples, 2023.

331.  Ibid.

332.  Ibid.

333.  1 Kings 6:20.

334.  Donefer-Hickie, Ana Matisse and Wolfram Koeppe. Take a Peek Inside an Ancient Temple!, The Metropolitan Museum, 2020.

335.  Weisstein, Eric W. Normal Vector, MathWorld, 2023.

336.  Foucault, Michel. Abnormal. Lectures at the College de France, 1974-1975. Trans. Graham Burchell, Verso, 2003.

337.  Tongson, Karen. Normporn, New York University Press, 2023.  

338.  Morrison, Toni. The Bluest Eye, Holt, Rinehart and Winston, 1970.

339.  Ahmed, Sara. Queer Phenomenology: Orientations, Objects, Others. Duke University Press, 2006. Project MUSE. 

340.  Ibid.

341.  Massumi, Brian. Parables for the Virtual: Movement, Affect, Sensation, Duke University Press, 2002.

342.  Chibane, Julian, Thiemo Alldieck, and Gerard Pons-Moll. Implicit Functions in Feature Space for 3D Shape Reconstruction and Completion, Conference on Computer Vision and Pattern Recognition, IEEE, 2020.

343.  Mescheder, Lars, Michael Oechsle, Michael Niemeyer, Sebastian Nowozin, and Andreas Geiger. Occupancy Networks: Learning 3D Reconstruction in Function Space, Conference on Computer Vision and Pattern Recognition, IEEE, 2019.

344.  March, Merriam Webster Dictionary, 2023.

345.  Lorensen, William E. and Harvey E. Cline. Marching Cubes: A High Resolution 3D Surface Construction Algorithm, Computer Graphics, Vol. 21, No. 4, 1987.

346.  Deleuze, Gilles and Felix Guattari. Anti-Oedipus: Capitalism and Schizophrenia, Trans. Robert Hurley, Mark Seem, and Helen R. Lane. University of Minnesota Press, 1983.

347.  Ibid.

348.  Ibid.

349.  Baumgart, Bruce. Winged-Edge Polyhedron Representation for Computer Vision. National Computer Conference, 1975.

350.  Diaz-Andreu, Margarita. Colonialism and the Archaeology of the Primitive, A World History of Nineteenth-Century Archaeology: Nationalism, Colonialism, and the Past, 2007.

351.  Kazhdan, Michael, Matthew Bolitho, and Hugues Hoppe. Poisson Surface Reconstruction, Eurographics Symposium on Geometry Processing, 2006.

352.  Mesh Smoothing, Graphics, Stanford University.

353.  Veneziano, Alessio, Federica Landi, and Antonio Profico. Surface Smoothing, Decimation, And Their Effects On 3D Biological Specimens, American Journal of Physical Anthropology, Vol. 166, I. 2, 2018.

354.  Botsch, Mario, Mark Pauly, Leif Kobbelt, and Pierre Alliez. Geometric Modeling Based on Polygonal Meshes, 2007. DOI:10.1145/1281500.1281640

355.  Plutarch, Plutarch's Parallel Lives: Antony, Internet Classics Archive, 75 ACE.

356.  Catmull, E. A Subdivision Algorithm for Computer Display of Curved Surfaces. University of Utah, 1974.

357.  Ibid.

358.  Deconstructing Deepfakes—How do they work and what are the risks?, US Government Accountability Office, 2023.

359.  Ibid.

360.  Van Holland, Leif, Patrick Stotko, Stefan Krumpen, Reinhard Klein, and Michael Weinmann. Efficient 3D Reconstruction, Streaming and Visualization of Static and Dynamic Scene Parts for Multi-client Live-telepresence in Large-scale Environments, 2022.  

361.  Jensen, H. Global Illumination using Photon Maps, Stanford University, 1996.

362.  Fast Fourier Transform (FFT) and Convolution in Medical Image Reconstruction, Intel, 2020.

363.  Mustafi, Sara and Tatiana Latychevskaia. Fourier Transform Holography: A Lensless Imaging Technique, Its Principles and Applications, Photonics, 2023.

364.  BlinkOnCrime.com is owned and operated by Shannon Christina Stoy. Christina Stoy and BlinkOnCrime.com are well known for sensationalism and dishonest content. In her quest for internet fame and webhits, Stoy is commonly libelous, callous with the privacy of others and recklessly speculative. Countless people have been left in Stoy's wake, bystanders in police investigations needlessly dragged through the mud by an internet tabloid writer and her minions. Often with no foundation or even worse, with disinformation and libel, 2023.

365.  Romano, Aja. Why We’re Relitigating The Casey Anthony Case Now — And Why We Shouldn’t, Vox, 2022.

366.  Ryu, Jenna. Casey Anthony is a 'pathological liar,' new series says. What does that really mean?, USA Today, 2022.

367.  Li, Wendy. Casey Anthony-Related Merchandises Selling Like Hot Cakes on Ebay, The International Business Times, 2011.

368.  Lohr, David. Casey Anthony: Hustler Offers $500,000 For Nude Photos, Larry Flynt Reports, The Huffington Post, 2011.

369.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

370.  Ibid.

371.  Nagourney, Adam, Ian Lovett and Richard Pérez-Peña. San Bernardino Shooting Kills at Least 14; Two Suspects, The New York Times, 2015.

372.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

373.  Colorado Springs Uses Laser Scanner to Document Mass Shooting, FARO, 2023.

374.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

375.  Schwartz, John. Debate Over Full-Body Scans vs. Invasion of Privacy Flares Anew After Incident, The New York Times, 2009.

376.  Akpan, Nsikan. How Cops used Virtual Reality to Recreate Tamir Rice, San Bernardino Shootings, PBS, 2016.

377.  Ibid.

378.  Hartnett, Kevin. The ‘Useless’ Perspective That Transformed Mathematics, Quanta, 2020.

379.  Moscovici, S. La Psychanalyse, Son Image et Son Public, Presses Universitaires de France, 1961.

380.  Moscovici, S. Attitudes and opinions. Annual Review of Psychology, 14, 1963.

381.  Eight Months Pregnant and Arrested After False Facial Recognition Match, The New York Times, 2023.

382.  Haraway, Donna. A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s, Socialist Review, 1985.

383. Jara Rocha, Femke Snelting, “Possible Bodies,” Volumetric Regimes: Material cultures of quantified presence, Open Humanities Press, 2022.

384.  Holodexxx VR: Porn Stars Scanned into 3D, VR Porn, 2017.

385.  Cram, Rob. Holodexxx Experimental Customization/Studio/Home Overview, 2021.

386.  Ibid.

387.  3D Scan Store, 2023.

388.  Hao, Karen. Deepfake Porn Is Ruining Women’s Lives. Now The Law May Finally Ban It, MIT Technology Review, 2021.

389.  Ibid.

390.  BeHere /1942, A New Lens on the Japanese American Incarceration, Japanese American National Museum, 2022.

391.  Interview with David Leonard, 2022.

392.  Dashevsky, Evan. 18 Completely Inappropriate Places to Play Pokemon Go, PC Magazine, 2016.

393.  Fujihata, Masaki, The Museum Inside The Network, 1995.

394.  Anantova, N. Monuments in the Structure of an Urban Environment: The Source of Social Memory and the Marker of the Urban Space, Materials Science and Engineering, 2017.

395.  Alberge, Dalya. British Museum Is World's Largest Receiver Of Stolen Goods, Says QC, The Guardian, 2019.

396.  Lasaponara, Rosa; Masini, Nicola. Living in the Golden Age of Digital Archaeology, Computational Science and Its Applications – ICCSA, Springer International Publishing, 2016, doi:10.1007/978-3-319-42108-7_47

397.  Kwet, Michael. Digital colonialism: The evolution of US empire, Longreads, March 2021.

398.  The Robot Guerrilla Campaign to Recreate the Elgin Marbles.

399.  Ibid.

400.  Mendelsohn, Daniel. Deep Frieze: What does the Parthenon mean?, The New Yorker, 2014.

401.  Ibid.

402.  Alexander, Caroline. If It Pleases the Gods: The Parthenon Enigma, The New York Times, 2014.

403.  Ibid.

404.  Mendelsohn, Daniel. Deep Frieze: What does the Parthenon mean?, The New Yorker, 2014.

405.  Wood, Gillen D'Arcy. Mourning the Marbles: The Strange Case of Lord Elgin's Nose, The Wordsworth Circle, Vol. 29, Num. 3.

406.  The Robot Guerrilla Campaign to Recreate the Elgin Marbles.

407.  Wood, Gillen D'Arcy. Mourning the Marbles: The Strange Case of Lord Elgin's Nose, The Wordsworth Circle, Vol. 29, Num. 3.

408.  Ibid.

409.  The Parthenon Sculptures: The Trustees’ statement, 2023.

410.  Ibid.

411.  Ibid.

412.  Michel, Roger. The Institute for Digital Archaeology, 2023.

413.  Talbot, Margaret. The Myth of Whiteness in Classical Sculpture, The New Yorker, 2018.

414.  Bond, Sarah. Whitewashing Ancient Statues: Whiteness, Racism And Color In The Ancient World, 2017.

415.  The Robot Guerrilla Campaign to Recreate the Elgin Marbles.

416.  British Museum Calls For ‘Parthenon Partnership’ With Greece Over Marbles, 2022.

417.  Allahyari, Moreshin, She Who Sees the Unknown, 2021.

418.  The Lincoln Memorial, U.S. National Parks Service, 2023.

419.  The Vietnam Veterans Memorial, U.S. National Parks Service, 2023.

420.  Zuhowski, Emilie. Memorial Day first celebrated at Charleston’s Hampton Park, 2022.

421.  Southern Documentary Fund, The Low Country, 2021.

422.  Bryant, Marie Claire. Underground Railroad Quilt Codes: What We Know, What We Believe, and What Inspires Us, The Smithsonian, 2019.

423.  Voluptuous Disintegration: A Future History of Black Computational Thought. Digital Humanities, Vol. 16, Num. 3, 2022.

424.  Six Years Later: 170 Confederate Monuments Removed Since Charleston Church Massacre, Southern Poverty Law Center, 2021.

425.  Parker, Adam. Few Black Burial Grounds Remain Intact In Charleston. Gullah Society Wants To Save Them, 2022.

426.  The Legacy Museum, 2023.

427.  Monuments, LAXART, 2023.

428.  Virtual Tour of United States Veterans and War Memorials, U.S. National Parks Service, 2023.

429.  Honor Everywhere: Virtual Reality Veterans Experience

430.  Civil War 1864: A Virtual Reality Experience

431.  Traveling While Black, Felix & Paul Studios, Oculus, 2019.

432.  1000 Cut Journey, Stanford Virtual Human Interaction Lab, 2018.

433.  Baccus-Clark, Ashley, Carmen Aguilar Y Wedge, Ece Tankal, and Nitzan Bartov. NeuroSpeculative AfroFuturism, MIT Docubase, 2017.

434.  Carne y Arena.

435.  Cahill, Nancy Baker. Liberty Bell, Association For Public Art, 2023.

436.  Olujimi, Kambui. Skywriters & Constellations: Full Dome Film and Related Exhibition, Newark Museum, 2018.

437.  Freeman, John Craig. Border Memorial: Frontera de los Muertos, 2012.

438.  Thiel, Tamiko and /p. Unexpected Growth, The Whitney Museum, New York, 2018.

439.  The Heritage Foundation.

440.  University of Southern California.

441.  ChatGPT, OpenAI, 2023.

442.  Virgil, The Aeneid, Book II, Translated by A. S. Kline, 2002.

443.  Lustig, R. H. The Hacking of the American Mind: The Science Behind the Corporate Takeover of Our Bodies and Brains. Avery. New York 2017.

444.  Ibid.

445.  Brin, Sarah. Subsidized, The Aesthetics of Play, Hammer Museum.

446.  Deleuze, Gilles. Spinoza: Practical Philosophy. Trans. Robert Hurley, City Lights Books, 1970.

447.  Putnam, Hilary. Reason, Truth, and History, Cambridge University Press, 1981.

448.  Liiva, Johan, Johan Reinholdz and Matte Modin. Deus Deceptor, NonExist, 2019.

449.  Skepticism and Content Externalism, Stanford Encyclopedia of Philosophy, 2018.

450.  The Digital Democracy Institute.

451.  Ibid.

452.  Marx, Karl. Das Kapital.

453.  Deleuze, Gilles and Felix Guattari. Anti-Oedipus: Capitalism and Schizophrenia, Trans. Robert Hurley, Mark Seem, and Helen R. Lane. University of Minnesota Press, 1983.

454.  Lyotard, Jean-François. Energumen Capitalism, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

455.  Ibid.

456.  Ballard, J. G. Fictions of Every Kind, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

457.  Cybernetic Culture Research Unit, Lemurian Time War, CCRU Writings 1997-2003. Time Spiral Press, 2015.

458.  Ibid.

459.  Noys, Benjamin. The Persistence of the Negative: A Critique of Contemporary Continental Theory, Edinburgh University Press, 2010.

460.  Ibid.

461.  Ibid.

462.  Hoffman, Bruce. A Year After January 6, Is Accelerationism the New Terrorist Threat?, Council on Foreign Relations, 2022. 

463.  Singleton, Benedict. Maximum Jailbreak, e-flux Journal, Is. 46, 2013.

464.  Ibid.
465.  Williams, Alex. Xenoeconomics and Capital Unbound, Splintering Bone Ashes, 2008. 
466.  Fisher, Mark. Nihilism Without Negativity, k-punk, 2008. 
467.  Williams, Alex. Post-Land: The Paradoxes of a Speculative Realist Politics, Splintering Bone Ashes, 2008. 

468.  Land, Nick. Meltdown, Fanged Noumena: Collected Writings 1987-2007, Urbanomic, 2017.

469.  Ibid.

470.  Ibid.

471.  Land, Nick. The Dark Enlightenment, The Dark Enlightenment, 2012. 
472.  Ibid.
473.  Ibid.
474.  Ibid.

475.  Ibid.

476.  Ibid.
477.  Ibid.
478.  Ibid.
479.  Ibid.
480.  Ibid.
481.  Ibid.
482.  Land, Nick. Hyper-Racism, Outside In: Involvements with Reality, 2014. 

483.  Ibid.

484.  Williams, Alex and Nick Srnicek. #ACCELERATE MANIFESTO for an Accelerationist Politics, Critical Legal Thinking, 2013.
485.  Ibid.
486.  Ibid.

487.  Ibid.

488.  Ibid.

489.  Ibid.
490.  Ibid.
491.  Ibid.
492.  Ibid.
493.  Ibid.
494.  Land, Nick. Annotated #Accelerate (#1), Urban Future (2.1): Views from the Decopunk Delta, 2014.
495.  Ibid.
496.  Ibid.
497.  Ibid.

498.  Ibid.

499.  Ibid.
500.  Ibid.

501.  Williams, Alex. Escape Velocities, e-flux Journal, Is. 46, 2013.

502.  Ibid.
503.  Ibid.
504.  Ibid.
505.  Ibid.
506.  Ibid.
507.  Williams, Alex and Nick Srnicek. Inventing the Future: Postcapitalism and a World without Work. London: Verso, 2016.
508.  Ibid.
509.  Srnicek, Nick. Platform Capitalism, Polity, 2017.
510.  Ibid.
511.  Land, Nick. Crypto-Current, An Introduction to Bitcoin and Philosophy, Šum, #10.2, November 26, 2018.
512.  Ibid.
513.  Abadi, Joseph and Markus Brunnermeier. Blockchain Economics, 2018.
514.  Land, Nick. Crypto-Current, An Introduction to Bitcoin and Philosophy, Šum, #10.2, 2018.

515.  Ibid.

516.  Berger, Edmund. Unconditional Acceleration and the Question of Praxis: Some Preliminary Thoughts, Deterritorial Investigations, 2017.
517.  Irigaray, Luce. This Sex Which Is Not One, Cornell University Press, 1985.
518.  Firestone, Shulamith. The Two Modes of Cultural History, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.
519.  Ibid.
520.  Ibid.

521.  Haraway, Donna Jeanne. Manifestly Haraway, University of Minnesota Press, 2016.

522.  Ibid.

523.  Ibid.

524.  Ibid.

525.  Ibid.

526.  Plant, Sadie. Zeros and Ones: Digital Women and the New Technoculture, Fourth Estate, 1998.
527.  Preciado, Paul B. Testo Junkie: Sex, Drugs, and Biopolitics in the Pharmacopornographic Era, Feminist Press at the City University of New York, 2017.
528.  Ibid.

529.  Ibid.

530.  Cuboniks, Laboria. The Xenofeminist Manifesto: A Politics for Alienation, Verso, 2018.
531.  Ibid.
532.  Ibid.
533.  Ibid.
534.  Ibid.
535.  Ibid.
536.  Ibid.
537.  Ibid.
538.  Ibid.
539.  Dean, Aria. Notes on Blaccelerationism, e-flux 87, 2017.
540.  Spillers, Hortense. Mama’s Baby, Papa’s Maybe: An American Grammar Book, Black, White, and in Color: Essays on American Literature and Culture, University of Chicago Press, 2003. First published in Diacritics, Summer 1987.
541.  Ibid.
542.  Ibid.

543.  McKittrick, Katherine. Sylvia Wynter: On Being Human as Praxis, Duke University Press, 2014.

544.  Ibid.
545.  Hartman, Saidiya. Venus in Two Acts, Small Axe: A Caribbean Journal of Criticism 26, 2008.
546.  Wark, McKenzie. Black Accelerationism, Public Seminar, 2017.
547.  Ibid.

548.  Negarestani, Reza. The Labor of the Inhuman, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

549.  Ibid.

550.  Fisher, Mark. Capitalist Realism: Is There No Alternative? Zero Books, 2010.

551.  Brassier, Ray. Prometheanism and its Critics, #Accelerate#, Eds. Robin Mackay and Armen Avanessian, Urbanomic Media, 2017.

552.  TensorFlow. 

553.  PyTorch.

554.  What is a Tensor? University of Cambridge, 2023. 

555.  What is supervised learning?, IBM, 2023.

556.  Mildenhall, Ben, Pratul P. Srinivasan, Matthew Tancik, et al. NeRF: Representing Scenes as Neural Radiance Fields for View Synthesis, ECCV, 2020. 

557.  NERF, Hasbro, 2023.

558.  Stuart, Keith. Photorealism—The Future of Video Game Visuals, The Guardian, 2015. 

559.  Autonomous Weapons.

560.  Russell, Stuart, Anthony Aguirre, Emilia Javorsky, and Max Tegmark. Lethal Autonomous Weapons Exist; They Must Be Banned, IEEE Spectrum, 2021.

561.  Planet.

562.  Borges, Jorge Luis. On Exactitude in Science, Collected Fictions, Trans. Andrew Hurley, 1998. First published 1946.

563.  Hacker-Wright, John. Philippa Foot, Stanford Encyclopedia of Philosophy, 2018. 

564.  Millar, Jason. An ethical dilemma: When robot cars must kill, who should pick the victim?, Robohub, 2014.

565.  Cummings, M. L. Artificial Intelligence and the Future of Warfare, International Security Department and US and the Americas Programme, Chatham House, 2017.

566.  Guerin, Joris, Olivier Gibaru, Stephane Thiery, and Eric Nyiri. CNN Features Are Also Great at Unsupervised Classification, 2018. 

567.  Definition of Convolution, Merriam-Webster Dictionary, 2023.

568.  Srivastava, Abhinai. The Evolution Of Computer Vision And Its Impact On Real-World Applications, Forbes, 2021.

569.  Fukushima, Kunihiko. Neocognitron: A Self-organizing Neural Network Model for a Mechanism of Pattern Recognition Unaffected by Shift in Position, Springer-Verlag, 1980.

570.  Philipp, George, Dawn Song, and Jaime G. Carbonell. The Exploding Gradient Problem Demystified—Definition, Prevalence, Impact, Origin, Tradeoffs, and Solutions, ICLR, 2018.

571.  Glorot, Xavier, Antoine Bordes and Yoshua Bengio. Deep Sparse Rectifier Neural Networks, 2011. 

572.  Shin, Yeonjong, Lu Lu, Yanhui Su, and George Em Karniadakis. Dying ReLU and Initialization: Theory and Numerical Examples, 2019.

573.  LeakyReLU, PyTorch, 2023.

574.  Meadows, Donella. Leverage Points: Places to Intervene in a System, Sustainability Institute, 1999.

575.  Climate Feedback, The Study of Earth as an Integrated System, NASA, 2023.

576.  Chattopadhyay, D. Electronics: Fundamentals And Applications, New Age International, 2006.

577.  Newton’s Second Law, The Physics Classroom, 2023.

578.  Yathish, Vishal. Loss Functions and Their Use In Neural Networks, Towards Data Science, 2022.

579.  Graves, Alex, Greg Wayne, and Ivo Danihelka. Neural Turing Machines, 2014.

580.  Shapiro, Linda G. and George C. Stockman. Computer Vision, Prentice-Hall, 2001.

581.  Lu, Luhui. Generative AI and Future, Towards AI, 2022.

582.  Goodfellow, Ian J. et al. Generative Adversarial Nets, The International Conference on Neural Information Processing Systems, 2014.

583.  Nikolenko, Sergey I. Synthetic Data for Deep Learning, Optimization and Its Applications. Vol. 174, 2021.  

584.  Jordon, James et al. Synthetic Data—What, Why, and How? Report commissioned by The Alan Turing Institute and The Royal Society, 2023.

585.  Ibid.

586.  Griffiths, Catherine. Unmodelled: In the Blindspot of AI Infrastructure, Gradient Magazine, 2023. 

587.  Ibid.

588.  Ibid.

589.  Ibid.

590.  Negarestani, Reza. Intelligence and Spirit, The MIT Press, 2018.  

591.  Siphon, Merriam-Webster Dictionary, 2023.

592.  Stable Diffusion.

593.  Gray, Mary L. and Siddharth Suri. Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass, 2019.

594.  Timnit Gebru.

595.  Joy Buolamwini.

596.  Kate Crawford.

597.  Ruha Benjamin.

598.  Safiya Umoja Noble.

599.  Virginia Eubanks.

600.  Arvind Narayanan.

601.  Latanya Sweeney.

602.  Alex Hanna.

603.  Os Keyes.

604.  Joanna Bryson.

605.  Deborah Raji.

606.  Margaret Mitchell.

607.  Cade Metz.

608.  Rumman Chowdhury.

609.  Partnership on AI.

610.  AI Now Institute.

611.  AI Ethics Lab.

612.  AI4ALL.

613.  Vincent, James. AI Art Tools Stable Diffusion and Midjourney Targeted with Copyright Lawsuit, The Verge, 2023.

614.  Hill, Kashmir. This Tool Could Protect Artists From A.I.-Generated Art That Steals Their Style, The New York Times, 2023.

615.  Ibid.  

616.  Barshad, Amos. This Singer Deepfaked Her Own Voice—and Thinks You Should Too, Wired, 2022. 

617.  Chiang, Ted. ChatGPT Is a Blurry JPEG of the Web, The New Yorker, 2023.

618.  Liu, Zhen, et al. MeshDiffusion: Score-based Generative 3D Mesh Modeling, ICLR, 2023.

619.  Point-E: A System for Generating 3D Point Clouds from Complex Prompts, OpenAI, 2023.

620.  Ibid.

621.  Saunders, Jack. Person-Specific Deepfakes with 3D Morphable Models, Medium, 2023. 

622.  Deepfakes, Reddit, 2023.

623.  Roose, Kevin. Here Come the Fake Videos, Too, The New York Times, 2018.

624.  Fagan, Abigail. Deep Fakes Are Becoming More Harmful for Women, Psychology Today, 2022.

625.  Bond, Shannon. People Are Trying To Claim Real Videos Are Deepfakes. The Courts Are Not Amused, National Public Radio, 2023.

626.  Vaccari, C., and A. Chadwick. Deepfakes and Disinformation: Exploring the Impact of Synthetic Political Video on Deception, Uncertainty, and Trust in News, Social Media + Society, 6(1), 2020.

627.  Chesney, B., and D. Citron. Deep Fakes: A Looming Challenge For Privacy, Democracy, And National Security, California Law Review, 107, 2019.

628.  Doermann, David. Speaking About Deepfakes to the U.S. House Intelligence Committee, 2019.

629.  Köbis, N. C., B. Doležalová, and I. Soraperra. Fooled Twice: People Cannot Detect Deepfakes but Think They Can, iScience, 24 (11), 2021.

630.  Radner, Karen, and Eleanor Robson. The Oxford Handbook of Cuneiform Culture, Oxford University Press, 2011.

631.  Jagersma, B. A Descriptive Grammar of Sumerian, Leiden University, 2010.

632.  Lemche, Niels Peter. Biblical Studies and the Failure of History: Changing Perspectives 3, Taylor & Francis, 2014.

633.  Claassens, Juliana. Resisting Dehumanization: Acts of Relational Care in Exodus 1-2 as Image of God's Liberating Presence, Scriptura, 2010.

634.  Exodus Chapter 20, Parashat Terumah, 2023. 

635.  Exodus Chapter 25, Parashat Terumah, 2023. 

636.  Cubit, Merriam-Webster Dictionary, 2023.

637.  Schumacher, Benjamin. Quantum Coding, Physical Review A, 1993.

638.  Esquivel, Jessica. The Queer Universe: A Quantum Explanation, 2022.

639.  Von Baeyer, Hans Christian. The Qubit: Information in the Quantum Age, Information: The New Language of Science, 2003.

640.  Einstein, Albert. The Born-Einstein Letters: Correspondence between Albert Einstein and Max and Hedwig Born from 1916–1955, with commentaries by Max Born, Macmillan, 1971.

641.  Schrödinger, Erwin. Discussion of Probability Relations between Separated Systems, Proceedings of the Cambridge Philosophical Society, 31, 1935.

642.  Xue, Shichuan, Yong Liu, Yang Wang, Pingyu Zhu, Chu Guo, and Junjie Wu. Variational Quantum Process Tomography, 2021. arXiv:2108.02351

643.  Blume-Kohout, Robin. Optimal, Reliable Estimation of Quantum States, Institute for Quantum Information, Caltech, New Journal of Physics, 2006. arXiv:quant-ph/0611080

644.  Zeh, H. Dieter. On the Interpretation of Measurement in Quantum Theory, Foundations of Physics, 1 (1), 1970. doi:10.1007/BF00708656

645.  Choi, Charles. Electric Cooling Could Shrink Quantum Computers: Vacuum-Tube Effect Might Simplify Cryogenic Chambers, IEEE Spectrum, 2023.

646.  Huang, He-Liang, Dachao Wu, Daojin Fan, and Xiaobo Zhu. Superconducting Quantum Computing: A Review, Science China Information Sciences 63, 2020.

647.  Vepsäläinen, Antti P., et al. Impact of Ionizing Radiation on Superconducting Qubit Coherence, Nature, Vol. 584, 2020.

648.  Ma, He, Marco Govoni, and Giulia Galli. Quantum Simulations of Materials on Near-term Quantum Computers, npj Computational Materials, 2020.

649.  Evers, Matthias, Anna Heid, and Ivan Ostojic. Pharma’s Digital Rx: Quantum Computing in Drug Research and Development, McKinsey, 2021.

650.  Preskill, John. Quantum Computing and the Entanglement Frontier, 2012. arXiv:1203.5813

651.  Arute, Frank, et al. Quantum Supremacy Using a Programmable Superconducting Processor, Nature, 2019.

652.  Roush, Wade. The Google-IBM “Quantum Supremacy” Feud, MIT Technology Review, 2020.

653.  Kaku, Michio. Quantum Supremacy: How the Quantum Computer Revolution Will Change Everything, Doubleday, 2023.

654.  Chun, Wendy Hui Kyong. Control and Freedom: Power and Paranoia in the Age of Fiber Optics, The MIT Press, 2006.

655.  Ibid.

656.  Spinoza, Baruch. The Ethics. Trans. R. H. M. Elwes. Mineola, NY: Dover Publications, 2018.

657.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

658.  Ibid.

659.  Grosz, Elizabeth. The Incorporeal: Ontology, Ethics, and the Limits of Materialism. Columbia University Press, 2018.

660.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

661.  Ibid.

662.  Ibid.

663.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

664.  Levinas, Emmanuel. Totality and Infinity: An Essay on Exteriority, Duquesne University Press, 1961.

665.  Brain in the vat (III. general definition of deception).

666.  Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14, no. 3, 1988.

667.  Dolphijn, Rick, and Iris van der Tuin. New Materialism: Interviews and Cartographies. Open Humanities Press, 2012.

668.  Eribon, Didier. Michel Foucault. Trans. Betsy Wing, Harvard University Press, 1991.

669.  Bryant, Levi R. The Democracy of Objects. Open Humanities Press, 2011.

670.  DeLanda, Manuel. Intensive Science and Virtual Philosophy. London: Bloomsbury, 2013.

671.  DeLanda, Manuel; Harman, Graham. The Rise of Realism. Cambridge, UK: Polity Press, 2017.

672.  Latour, Bruno. On Actor-Network Theory. A Few Clarifications Plus More Than a Few Complications. Soziale Welt, vol. 47, 1996.

673.  Bennett, Jane. Vibrant Matter: A Political Ecology of Things. Durham: Duke University Press, 2010.

674.  Braidotti, Rosi. The Posthuman. Cambridge, UK: Polity Press, 2013.

675.  Ibid.

676.  Barad, Karen. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press, 2007.

677.  Ibid.

678.  Ibid.

679.  Ibid.

680.  Ibid.

681.  Ibid.

682.  Barad, Karen. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press, 2007.

683.  Dolphijn, Rick, and Iris van der Tuin. New Materialism: Interviews and Cartographies. Open Humanities Press, 2012.

684.  Braidotti, Rosi. The Posthuman. Cambridge, UK: Polity Press, 2013.

685.  Bennett, Jane. Vibrant Matter: A Political Ecology of Things. Durham: Duke University Press, 2010.

686.  Scott, David. “The Re-Enchantment of Humanism: An Interview with Sylvia Wynter,” Small Axe, no. 8, 2000.

687.  Ibid.

688.  Ibid.

689.  Whitehead, Alfred North. Process and Reality. Free Press, 2010.

690.  Massumi, Brian. Parables for the Virtual: Movement, Affect, Sensation, Duke University Press, 2002.

691.  DeLanda, Manuel. The New Materiality, Architectural Design 85, no. 5, 2015.

692.  Grosz, Elizabeth. The Incorporeal: Ontology, Ethics, and the Limits of Materialism, Columbia University Press, 2018.

693.  Braidotti, Rosi. The Posthuman, Polity Press, 2013.

694.  Ibid.

695.  Ibid.

696.  Ibid.

697.  Braidotti, Rosi. The Posthuman, Polity Press, 2013.

698.  Agamben, Giorgio. Homo Sacer: Sovereign Power and Bare Life. Trans. Daniel Heller-Roazen, Stanford University Press, 1998.

699.  Weheliye, Alexander. Habeas Viscus: Racializing Assemblages, Biopolitics, and Black Feminist Theories of the Human, Duke University Press, 2014.

700.  Ibid.

701.  Ibid.

702.  McKittrick, Katherine. Sylvia Wynter: On Being Human as Praxis, Duke University Press, 2014.

703.  Levinas, Emmanuel. Totality and Infinity.

704.  Baciu, A., Y. Negussie, A. Geller, et al. The State of Health Disparities in the United States, National Academies Press, 2017.

705.  Singhal, Shubham. The Gathering Storm: The Uncertain Future of US Healthcare, McKinsey, 2022.

706.  Ukeles, Mierle Laderman. Manifesto for Maintenance Art, 1969.

707.  Kelly, Mary. Post-Partum Document, The Tate Museum, 1979.

708.  Kelly, Mary. Post-Partum Document. Documentation III: Analysed Markings And Diary Perspective Schema (Experimentum Mentis III: Weaning from the Dyad), The Tate Museum, 1975.

709.  Tiravanija, Rirkrit. Interview by Chieng Wei Shieng & Clifford Loh, The Relational Artist, Vulture Magazine, 2019.

710.  Decter, Joshua. Rirkrit Tiravanija, Artforum, 2011.

711.  Bruguera, Tania. Migrant Manifesto, Immigrant Movement International, Creative Time, 2011.

712.  Tania Bruguera: Immigrant Movement International, The Tate Museum, 2012.

713.  Pope.L, William. The Black Factory, Creative Capital, 2001.

714.  Keegan, Alison. The Black Factory, The Bates Museum of Art, 2010.

715.  Steinbock, Eliza. Photographic Flashes: On Imaging Trans Violence in Heather Cassils' Durational Art, Photography And Culture, Vol. 7, 2014. https://doi.org/10.2752/175145214X14153800234775

716.  Watson, Jeff. Snake Oil Men (VII.).

717.  Ibid.

718.  Tattersall, Lanka. Notes on Weed Killer, Museum of Contemporary Art, Los Angeles, 2017.

719.  Hedva, Johanna. Sick Woman Theory, Topical Cream, 2022.

720.  Sedgwick, Eve Kosofsky. Paranoid Reading and Reparative Reading, Or, You’re So Paranoid, You Probably Think this Essay is About You, Touching Feeling, Duke University Press, 2002.

721.  Glover, Donald. The Big Payback, S3.E4, Atlanta, IMDB, 2022.

722.  Ibid.

723.  Ludden, Jennifer. Cities may be debating reparations, but here’s why most Americans oppose the idea, National Public Radio, 2023.

724.  Ibid.

725.  Reparations: OHCHR and Transitional Justice, The United Nations, 2023. 

726.  Ibid.

727.  Matthews, Dylan. Six Times Victims Have Received Reparations—Including Four in the US, Vox, 2014.

728.  Thompson, Ginger. South Africa to Pay $3,900 to Each Family of Apartheid Victims, The New York Times, 2003.

729.  Ayesh, Rashaan. The World’s Long History of Reparations, Axios, 2019.

730.  UK to Compensate Kenya’s Mau Mau Torture Victims, The Guardian, 2013.

731.  Matthews, Dylan. Six Times Victims Have Received Reparations—Including Four in the US, Vox, 2014.

732.  Bilmes, Linda and Cornell William Brooks. The United States Pays Reparations Every Day—Just Not to Black America, Harvard-Kennedy School Policycast, 2022.

733.  Ray, Rashawn and Andre M. Perry. Why We Need Reparations for Black Americans, Brookings, 2020.

734.  Cox, Kiana and Khadijah Edwards. Reparations for Slavery, Pew Research Center, 2022.

735.  Deuteronomy 16:20.

736.  hooks, bell and Maya Angelou. There’s No Place to Go But Up — bell hooks and Maya Angelou in conversation, Lion’s Roar, 1998.

737.  Brosi, George and bell hooks. The Beloved Community: A Conversation between bell hooks and George Brosi, Appalachian Heritage, Volume 40, Number 4, 2012.

738.  Zion, J.W. Dynamics of Navajo Peacemaking, US Department of Justice, 1998.

739.  Truth Commission: South Africa, United States Institute of Peace, 1995.

740.  Youth Justice Family Group Conferences, Oranga Tamariki, New Zealand Ministry for Children, 2023.

741.  Compendium Of Promising Practices To Reduce Violence And Increase Safety Of Aboriginal Women In Canada–Compendium Annex: Detailed Practice Descriptions, Family Violence Initiative, Government of Canada, 2021.

742.  Prison Fellowship, 2023.

743.  Tepper, Felicity. The Importance of Environmental Restorative Justice for The United Nations Decade on Ecosystem Restoration (2021–2030), The Palgrave Handbook of Environmental Restorative Justice, 2022.

744.  Forsyth, Miranda, Brunilda Pali, and Felicity Tepper. Environmental Restorative Justice: An Introduction and an Invitation, The Palgrave Handbook of Environmental Restorative Justice, 2022.

745.  Restorative Environmental Justice, European Forum for Restorative Justice, 2020.

746.  Justice40 Initiative Environmental Justice Fact Sheet, US Department of Energy, 2022.

747.  How Artificial Intelligence is Helping Tackle Environmental Challenges, UN Environment Programme, 2023.

748.  Ocean Visions Selects Launchpad Teams, Ocean Visions, 2022.

749.  Phykos, 2023.

750.  Ibid.

751.  Ibid.

752.  Ibid.

753.  Saving the World's Coral Reefs, Autodesk, 2023.

754.  Coral Entanglement Research, Roctopus EcoTrust, 2023.

755.  Coralmaker, Mission Statement, 2023.

756.  Ibid.

757.  Wertheim, Margaret. Corals, Crochet and the Cosmos: How Hyperbolic Geometry Pervades the Universe, The Conversation, 2016.

758.  Carey, Nic and Yotto Koga. Coralmaker panel discussion, Autodesk, 2023.

759.  Weiss, Sabrina. Robots Enter the Race to Save Dying Coral Reefs, Wired Magazine, 2023.

760.  Ibid.

761.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988. 

762.  Weiss, Sabrina. Robots Enter the Race to Save Dying Coral Reefs, Wired Magazine, 2023.

763.  Ibid.

764.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

765.  Ibid.

766.  Fuzzy Logic. Stanford Encyclopedia of Philosophy, Bryant University, 2006.

767.  Deleuze, Gilles. Spinoza: Practical Philosophy. San Francisco: City Lights Books, 1988.

768.  Lorde, Audre. The Uses of the Erotic: The Erotic as Power, Kore Press, 1978.

769.  Ibid.

770.  Final Solution—1940 to 1945, United States Holocaust Memorial Museum, 2023.

771.  Why do some Jews who survived the Holocaust have a number tattooed on their arm?, World Jewish Congress, UNESCO, 2023.

772.  Watson, Jeff. Reality is an Emergency. 2015.

~