The Grammars of AI: Towards a Structuralist and Transcendental Hermeneutics of Digital Technologies
Related papers
The Becoming of AI: A Critical Perspective on the Contingent Formation of AI
This chapter offers a critical perspective on the contingent formation of artificial intelligence as a key sociotechnical institution in contemporary societies. It shows how the development of AI is not merely a product of functional technological development and improvement but depends just as much on economic, political, and discursive drivers. It builds on work from STS and critical algorithm studies showing that technological developments are always contingent on, and result from, transformations along multiple scientific trajectories as well as interactions among multiple actors and discourses. For our conceptual understanding of AI and its epistemology, this is a consequential perspective. It directs attention to different issues: away from detecting impact and bias ex post, and towards a perspective that centers on how AI is coming into being as a powerful sociotechnical entity. We illustrate this process in three key domains: technological research, media discourse, an...
2024
In this paper, I aim to assess whether postphenomenology's ontological framework is suitable for making sense of the most recent technoscientific developments, with special reference to the case of AI-based technologies. First, I will argue that we may feel diminished by those technologies seemingly replicating our higher-order cognitive processes only insofar as we regard technology as playing no role in the constitution of our core features. Second, I will highlight the epistemological tension underlying the account of this dynamic submitted by postphenomenology. On the one hand, postphenomenology's general framework prompts us to conceive of humans and technologies as mutually constituting one another. On the other hand, the postphenomenological analyses of particular human-technology relations, which Peter-Paul Verbeek calls cyborg relations and hybrid intentionality, seem to postulate the existence of something exclusively human that technology would only subsequently mediate. Third, I will conclude by proposing that postphenomenology could incorporate into its ontology insights coming from other approaches to the study of technology, which I label as human constitutive technicity in the wake of Peter Sloterdijk's and Bernard Stiegler's philosophies. By doing so, I believe postphenomenology could better account for how developments in AI prompt and possibly even force us to revise our self-representation. From this viewpoint, I will advocate for a constitutive role of technology in shaping the human lifeform not only in the phenomenological-existential sense of articulating our relation to the world but also in the onto-anthropological sense of influencing our evolution.
When Machines Talk: A Brief Analysis of Some Relations between Technology and Language
2020
This essay for the inaugural issue of Technology and Language builds on sustained discussions of the relation between the (philosophy of) technology and the (philosophy of) language, for example in the suggestion that there are "technology games" in analogy to "language games" as forms of life. In light of recent technological developments, this essay takes another step by distinguishing three types of interaction between language and technology, considering technology as a language author, a language user, and a shaper of a form of life. This reflects back on what technology itself is and does. Technology is deeply integrated in, and interwoven with, our human world and our human thinking, which is always also a world permeated with, and enabled by, language.
Subjectivity
Immersed in the networks of artificial intelligences that are constantly learning from each other, the subject today is being configured by the automated architecture of a computational sovereignty (Bratton 2015). All levels of decision-making are harnessed in given sets of probabilities where the individuality of the subject is broken into endlessly divisible digits. These are specifically reassembled at checkpoints (Deleuze in Negotiations: 1972-1990, Columbia University Press, New York, 1995), in ever-growing actions of predictive data (Cheney-Lippold in We Are Data: Algorithms and the Making of Our Digital Selves, NYU Press, New York, 2017), where consciousness is replaced by mindless computations (Daston in "The rule of rules", lecture, Wissenschaftskolleg Berlin, November 21st, 2010). As a result of the automation of cognition, the subject has thus become ultimately deprived of the transcendental tool of reason. This article discusses the consequences of this crisis of conscious cognition at the hands of machines by asking whether the servomechanic model of technology can be overturned to expose the alien subject of artificial intelligence as a mode of thinking originating at, but also beyond, the transcendental schema of the self-determining subject. As much as the socio-affective qualities of the user have become the primary sources of capital abstraction, value, quantification and governmental control, so has technology, as the means of abstraction, itself changed in nature. This article will suggest that the cybernetic network of communication has not only absorbed physical and cognitive labour into its circuits of reproduction, but is, more importantly, learning from human culture, through the data analysis of behaviours, the contextual use of content and the sourcing of knowledge. The theorisation of machine learning as involving a process of thinking will be taken here as a fundamental inspiration to argue that the expansion of an
Phenomenology and Digital Knowledge, 2022
This article investigates the possibility of questioning the difference between artificial and human intelligence by assuming that the latter can incorporate artificial, external components just as artificial intelligence can simulate human responses, and by exploring human embodiment in its technically and digitally augmented dimension. The idea that digital processes do not merely imply a detachment from the body, a dematerialization or disembodiment, is supported by many researchers, starting with those who, back in the 1980s, reacted to cyberpunk narratives and their tendency to posit a new mind-body dualism. Yet here I would like to frame this thesis not within the post-human context but in a phenomenological perspective, and in doing so I will employ specific conceptual tools. I will in particular (1) rely on Katherine Hayles' distinction between incorporating and inscribing practices; (2) refer to Maturana and Varela's notion of structural coupling; and (3) analyze algorithmic thinking and its temporal structure.
The Technological Haunt in Artificial Intelligence: a lexicon (in notes)
ARTIFICIAL MUSIC, 2022
In this chapter, I develop my term "the technological haunt" as an indicator of Indigenous people's haunting online. The chapter is written in notes, as a lexicon. In this way, I am attempting to write in an anti-colonial way that reintegrates Indigenous epistemologies at the forefront of theory with regard to settler colonialism offline and online. Published by Spector Books in Artificial Music (2022), with HKW Berlin.
The work of Ludwig Wittgenstein is seldom used by philosophers of technology, let alone in a systematic way, and in general there has been little discussion about the role of language in relation to technology. Conversely, Wittgenstein scholars have paid little attention to technology in the work of Wittgenstein. In this paper we read the Philosophical Investigations and On Certainty in order to explore the relation between language use and technology use, and take some significant steps towards constructing a framework for a Wittgensteinian philosophy of technology. This framework takes on board, and is in line with, insights from postphenomenological and hermeneutic approaches, but moves beyond those approaches by benefiting from Wittgenstein's insights into the use of tools, technique, and performance, and by offering a transcendental interpretation of games, forms of life, and grammar. Focusing on Wittgenstein's philosophy of language in the Investigations, we first discuss the relation between language use and technology use, understood as tool use, by drawing on his analogy between language and tools. This suggests a more general theory of technology use, understood as performance. Then we turn to his epistemology and argue that Wittgenstein's understanding of language use can be embedded within a more general theory about technology use understood as tool use and technique, since language-in-use is always already a skilled and embodied technological practice. Finally, we propose a transcendental interpretation of games, forms of life, and grammar, which also gives us a transcendental way of looking at technique, technological practice, and performance. With this analysis and interpretation, further supported by comments on robotics and music, we contribute to using and
Experience and abstraction: the arts and the logic of machines
DOAJ (Directory of Open Access Journals), 2008
This paper is concerned with the nature of traditions of Arts practice with respect to computational practices and related value systems. At root, it concerns the relationship between the specificities of embodied materiality and the aspirations to universality inherent in symbolic abstraction. This tension structures the contemporary academy, where embodied arts practices interface with traditions of logical, numerical and textual abstraction in the humanities and the sciences. The hardware/software binarism itself, and all that it entails, is nothing if not an implementation of Cartesian dualism. Inasmuch as these technologies reify that worldview, these values permeate their very fabric. Social and cultural practices, modes of production and consumption, inasmuch as they are situated and embodied, proclaim validities of specificity, situation and embodiment contrary to this order. Due to the economic and rhetorical force of the computer, the academic and popular discourses related to it are persuasive. Where computational technologies are engaged by social and cultural practices, there exists an implicit but fundamental theoretical crisis. An artist engaging such technologies in the realization of a work invites the very real possibility that the technology, like the Trojan Horse, introduces values inimical to the basic qualities for which the artist strives. The very process of engaging the technology quite possibly undermines the qualities the work strives for. This situation demands the development of a 'critical technical practice' (Agre). This paper seeks to elaborate on this basic thesis. It is written from the perspective not of the antagonistic Luddite but of a dedicated practitioner with twenty-five years' experience in the design and development of custom electronic and digital artworks. Note and Disclaimer: This paper, inevitably, focuses on issues which arise as a result of the peculiarities of Western cultural and technical history, and reflects discourses conducted in the English language. As discussed, some of the forces influencing those historical flows relate to the traditions of Western philosophy, itself strongly influenced by Christian doctrine. The question of what form automated computation might have taken had it arisen in a culture with a different religious and philosophical history is a fascinating one. Likewise, the way such a culture might negotiate the relation between technology and culture might be very different from that which has occurred in the West, and might offer important and useful qualities.
Towards A*cognitive Architecture: A cybernetic note beyond - or the self-informing Machinery
Psychopathologies of Cognitive Capitalism Part 2: The Cognitive Turn, Warren Neidich (ed.), 2014
"We are writing the year 2014. Information increasingly becomes a desire, the necessity in the form of communication. Desire is an extension of the brain while communication seemingly combines heuristic operations in design development, and reaches beyond thermodynamics. More than 65 years ago, Norbert Wiener’s “Cybernetics - the Control and Communication in the Animal and the Machine” (Wiener, 1948) is published, the Macy Conferences then titled Circular Causal and Feedback Mechanisms in Biological and Social Systems are in their 3rd year and following von Neumann’s findings on the Ergodic Theorem, the Cellular Automaton are on its way.(1) In the meantime Baby, the Manchester Small-Scale Experimental Machine and the world’s first stored-program computer, tested for the so-called Williams Tube (a lightweight storage device) runs its first program. The Universal Turing Machine becomes the continuing driver for computation.(2) Within the context of The Psychopathologies of Cognitive Capitalism - Volume II, the subject matter of this paper relates to accessing knowledge and tools for observing and designing an interconnected para-metric world; a cybernetic world characterized by multidimensional behavioural structures informed through using the digital as interface. It provides an extension to the existing emergent construct of The Psychopathology of Cognitive Capitalism in general and the interdisciplinary Cognitive Capitalism Project in particular. The insight given into cognitive capitalism and design strategies are foundations formed in the 20th century and aims at understanding and formulating of what the pathology of The Psychopathology of Cognitive Capitalism may possibly be when looking through the lens of cybernetics beyond. It offers an investigation to the understanding and the form of knowledge through communication, a recursive re-invention and re-understanding of how we think, how we decide and how we actually design and behave. Not just as architects in the design process of a building or a city, but as human beings in the design process of the everyday. It is a compilation, a collection of thoughts and findings, hovering in a paradigm between architecture, cybernetics, system theory, technology and the state of being. Wittgenstein’s Tractatus Logico Philosophicus, and the subject of language or syntax, the question of what is reality, may be kept as a filtering veil through which the forthcoming can be received. This paper discusses cultural architectural theory in conjunction with technical, political and economical possibilities for tangible manifestations in the context of interdisciplinary cognitive work and authorship. In particular, in the context of open source software, virtual and unknown design teams, whose common ground is based on common interest and knowledge, a spatial-temporal structural coupling, which, at times, can be in a parallel fashion. The field of cybernetics and architecture within computational design, researches facts and theories developed in the last half of the 20th century, and their relevance to computational thinking and digital making, shaping the built and unbuilt networked environment in the near and distant future. In contrast to the digital, referring to a particular technology of executing calculations, the computational relates to a way of thinking and making reflecting complexity and non-linearity. 
It is about process rather than input/output, informed by issues such as emergence, algorithms, structure, material behaviour, data and society, in most cases using computers as interfaces and calculating machines and CNC (computer numerical control) for digital fabrication such as 3D printing, laser cutting or robotic fabrication. The underlying principles are anything but reduced to architecture; instead, they are methods that can be applied to a variety of disciplines and are relevant as an application of, and proof for, cognitive capitalism. The model suggested here pushes the boundaries of the contemporary understanding of architecture and reconstructs reality: from the field of function at the scale of buildings, cities and smaller prototypes, towards understanding the liminal space between the built environment, the materials used, the perception of the user, regulation by the designer, client and budget, and, most importantly, decision-making processes, the choice of software during the design process, the data used and the cognitive capital that informs all of the above. So the questions are: 'Who is the designer?' and 'Who owns the copyright, or rather the cognitive copyright?' Architecture as building, as physical form, is the proof of concept of this interplay, and is most likely a result of thinking and internalizing dwelling that has been outlived by the construct of architecture as organism, evolving and informing itself through filtering, observation and self-observation: "I am the observed link between myself and observing myself" (von Foerster, 1981).(3) This paper is structured in a series of parts, which at times overlap and merge. "Defining the Matter" aims to clarify the key terms used and to establish possible relationships between them. These terms, which derive from biology, computer science, architecture or mathematics, have so far been defined neither cross-disciplinarily nor within architecture. They do, however, affect how the forthcoming text is understood. Since, in an era of rapid technological and theoretical shifts, terms and expressions are constantly revisited and receive numerous varying definitions, it is necessary to delineate their complex functions. This provides a theoretical basis for the present text and simultaneously sets the topic within the framework of the philosophy of computation. The main inquiry focuses on the agenda "It's not alone, it's synthetic – the bits are calculating," provides the reader with a description of A*cognitivist Architecture in a cyber-biological framework, and embeds its construct within the mind/body phenomenon immanent in social environments. The narrative Reyner Banham Loves Los Angeles bridges computational thinking, the body, perception through our senses and the brain in a material world. This part introduces the concept of Wechselwirkung offered by biological cognition and the cognitive Internet, with its unknown number of reflectors in an existing non-linear para-space.(4) Here the cybernetic note will be woven into a genealogical string of thoughts. The paper concludes with an exploration for and of an interconnected para-metric world by engaging largely with immaterial, neurotectural design strategies and the question of collective authorship, relating back to the synthetic that becomes natural, cognitive and bio-semiotic.
Ethnographic Studies, 2019
Algorithms are becoming interwoven with increasingly many aspects of our affairs. That process of interweaving has brought with it a language laden with anthropomorphic descriptions of the technologies involved, which variously hint at 'human-esque' or 'conscious-like' activity occurring within or behind their operations. Indeed, the term 'Artificial Intelligence' (AI) seems to refer to a quality that is thought to be largely human; namely, intelligence. However, while anthropomorphic descriptions may be useful or harmless, when taken at face value they generate a false picture of algorithms as well as of our own thinking and reasoning practices by treating them as analogues of one another rather than as distinct. Focusing on the algorithm, and what it is misleadingly said to be and to be like, in this article we outline three 'perspicuous representations' (Wittgenstein 1953: §122) of AI in specific contexts. Drawing on Wes Sharrock's ethnomethodological and Wittgensteinian work, our aim is to demonstrate that by attending to the particular, occasioned and locally accountable, not to say highly specified, usages of language that accompany the 'New AI' in particular, we can avoid being haunted by the new task-performing ghosts currently being discursively conjured up in our algorithmic machines.