Talk based on a piece I wrote called ‘The Sharpest Point of Sensation is Pointless’ for an album by Eric Frye called ‘Small Differences in Sensation’.

the sharpest point of sensation is pointless


Formalizing Analogical Transitions within a Multi-Level Dynamic Mechanistic Framework


Talk given at ‘Cold Bodies, Warm Machines’ Conference, Düsseldorf, 09-11/09/16.

Firstly, what is the Dynamic Mechanistic Framework?[1] Simply put, it is the way in which we can understand the properties of a system by disassembling it and seeing how its parts work either individually or in combination with other parts. It is a mechanistic framework because it acknowledges that the concept of mechanism has been, and still is, a key idea of modern science, by which the characteristics of a system are explained according to the causal structure of its parts. Though the classical mechanistic framework has been critically overcome by the recognition of the ubiquity of non-linearity in dynamic systems theory, the concept of mechanism as a structure-based explanatory framework for the functioning of a system may be retained if it is suitably dynamicized. The framework is multi-level because it is neither ‘ruthlessly reductionist’ nor rabidly anti-reductionist – it recognizes that the functional characteristics of a system at one level of analysis may be emergent properties that are not well described by the lower-level dynamics that support them, without claiming that these properties are irreducible to the behaviour of its structural components (this is weak as opposed to strong emergence). The framework is dynamic because it progresses by several stages of decomposition and localization of functional properties in structural components – if a property cannot be directly localized in a structural component then it may be hypothesized as a property of the interaction of two or more components, and if this hypothesis fails to hold a further level of complexity can be adduced, up to the point where a property can be assumed to emerge at the global level of the whole system.

Another important consideration that necessitates a multi-level analysis is that not only is order dependent on structure, but randomness and unpredictability are relative to the theoretical framework or computational set-up (architecture, processing power, input stream). A system or an input is unpredictable only in relation to the theoretical framework that imposes its form of prediction. To understand something means to impute to it some kind of characteristic behaviour under certain circumstances, and this entails prediction, abstraction, and generalization. If we want to act on the world we cannot renounce prediction or abstraction, and we cannot thereby avoid the possibility of some random perturbation to our system. However, we may construct systems that are more or less robust or resilient to perturbations, more or less capable of the dynamic revision and reconstruction of theoretical frameworks. Crucially, different theoretical frameworks are necessary for each level of analysis, and the dynamics of randomness, unpredictability, and robustness or resilience to perturbation are distinct at each level[2] – for example, physical, biological and social systems may be distinguished along precisely these lines, as may the descriptive capabilities of different architectures in the computational hierarchy, from finite automata through to universal Turing machines.[3] Moreover, the multi-level dynamic mechanistic framework provides an engineering perspective by which the functional property of intelligence itself is decomposed, and the resulting explanatory hypotheses enable traction not only on the problem of engineering artificially intelligent agents, but also on re-engineering the human.[4]

Why formalization? Formalization may be understood as the explicitation of implicit theoretical frameworks or models, and making these models explicit allows them to be analysed and transformed. Implicit rule-governed practices will always be in excess of any explicit formalization of them (this was amply demonstrated by the later Wittgenstein’s investigation of language games),[5] but formalization may act as a point of leverage, giving externalized traction on the ideological sedimentation of implicitly structured practices (a point well made by Joshua Epstein with regard to the efficacy and politics of computational modeling).[6] In this sense formalization may be understood as a kind of decomposition of the implicit structure of a theoretical framework, allowing us to grasp its functional characteristics and change it. Moreover, formalization of a model makes possible its implementation in another substrate, such as an artificially intelligent agent.

Why do we talk about analogy here? Firstly, we can consider the functionalist analogy, which understood the brain as a computer, as the initial stage in a dynamic process of decomposition and localization of the functional properties of cognition (a process that moves from the first-wave functionalism of Putnam and others, based on symbol manipulation, to the dynamic model of neofunctionalism based on connectionism put forward by the likes of Bechtel). Already with Turing it was recognized that if the brain in a very general sense is something like a computer, it can be specifically differentiated from the kind of computational processes occurring on a discrete state machine at the lower levels of the computational hierarchy: while the latter is functionally context-free and iterates without variation, the former is context-sensitive or, as Turing had it, ‘prone to exponential drift’, which we may understand in today’s parlance as ‘sensitive to initial conditions’, that is, non-linear.[7] This breakdown in the analogy does not definitively destroy its usefulness – in fact an analogy that maps directly is hardly fruitful – the usefulness of analogies depends on whether any testable consequences can be deduced from them.[8] In this case the analogy has drawn out the difference between computational processes occurring in discrete state machines and those occurring in biological brains. In fact, modeling itself may be understood at the most general level as an analogical process based on abstraction that ignores certain details while highlighting how other features overlap.

Secondly, analogical transitions between different contexts are not just a natural aspect of human intelligence but a key cognitive operation of any intelligent agent, along with induction, recursion, and the management of behaviour through the apprehension of contextual frames, discursive markers, and conceptual contents.

As Arzi-Gonczarowski notes, Kant understood that analogy is not based on an imperfect similarity of two things, but on a perfect similarity of relations between two quite dissimilar things[9] – and moreover understood these kinds of supramodal transits in terms of categories. It is the extraction of these perfect similarities that underlies the flexibility and generativity of higher intelligence (whether carbon-based, silicon-based or a hybrid). Gentner, for instance, models the use of analogy in various subprocesses of learning and reasoning, such as memory retrieval, mapping and structural alignment between different perceptions or environmental contexts, as well as evaluation, abstraction, re-representation and adaptation.[10]

Analogy is key to the development of AI for the same reason that it underlies human reasoning. Furthermore, in interacting with human agents, artificially intelligent agents will need to analogize in order to respond to human behaviour. At the very general level we can distinguish two broad forms of analogical transition – one based on recognition of already given similarities through a diagnostic process of interpretative analysis, and another that is capable of creatively discovering or inventing new analogical transitions through a generative process of synthesis.

In order to understand the centrality of analogical transitions to the general problem of intelligence it is necessary to acknowledge the way in which cognition is embodied, embedded, enacted and extended. It is impossible to separate perception, conception or action from the environment that supports it and to which it is directed. The fact that interaction with an environment plays an essential role in intelligence was recognized in Turing’s early writings, and developed more recently by others, such as Clark and Hutchins. The latter proposes the concept of a cognitive supraindividual including both the environment and the intelligent agent acting within it, which can be fruitfully aligned with the insights of second-order cybernetics.[11]

This generatively extensible supraindividual context-sensitivity can be applied not only to the intelligent agent but also to its perceptions, conceptions, and actions, which can take on an infinite number of different senses, since there is no limit on the number of contexts in which their meaning is transformed. However, this relativity, plasticity, and genericity should not collapse into absolute relativity or sweeping skepticism regarding epistemic limitations since some critical invariable aspect of meaning is held by individual perceptions, conceptions or actions and this must be shared across contexts in order for analogical transitions to make sense (in fact, we should rather think of higher intelligence as meta-contextual for this reason).

Grasping this invariance may present problems in natural language due to the tendency toward an infinite regress of definitions, and definitions of definitions, each of which will have context-sensitive connotations as well as invariants (this is the problem of the excess of practical reason to formalization we talked about earlier). However, contemporary mathematics has fortunately enabled a different approach.

Category theoretic morphisms give formal tools to rigorously describe structure preserving paths (that is, invariants) between different perceptions, conceptions or actions within a single interpretative context, or dually, for a single perception, conception or action across different interpretative contexts.

When an analogy is modeled by a morphism, then the monotonicity of that morphism captures the invariant core or ‘uniformity’ of the analogy, while the non-monotonicity provides a detailed description about the points where the analogy breaks down.[12] Such break-downs or slippages in the analogical transition do not invalidate it but rather require the acknowledgment of non-monotonic variability, and the construction of a non-rigid abstract schema that generalizes over these differences, avoiding determination only where necessary.
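To make the idea concrete, here is a minimal sketch (my own toy construction, not Arzi-Gonczarowski’s actual category-theoretic formalism): an analogy is modelled as a mapping between the constituents of two interpretative contexts; the relations the mapping carries across intact constitute the invariant core, while the relations it fails to carry across locate the points where the analogy breaks down and a non-rigid abstract schema would be needed.

```python
# A toy analogy between two "interpretative contexts", each given as a set of
# (relation, subject, object) triples. Illustrative sketch only, not
# Arzi-Gonczarowski's actual category-theoretic formalism.

solar_system = {
    ("attracts", "sun", "planet"),
    ("more_massive_than", "sun", "planet"),
    ("hotter_than", "sun", "planet"),
}
atom = {
    ("attracts", "nucleus", "electron"),
    ("more_massive_than", "nucleus", "electron"),
    # no ("hotter_than", ...) triple: the analogy slips at this point
}

# The morphism: a structure-preserving map between constituents.
mapping = {"sun": "nucleus", "planet": "electron"}

def transported(triple):
    rel, a, b = triple
    return (rel, mapping.get(a, a), mapping.get(b, b))

# Relations carried across intact form the invariant core ('monotone' part);
# relations that fail to carry across mark the breakdown of the analogy.
preserved = {t for t in solar_system if transported(t) in atom}
broken = {t for t in solar_system if transported(t) not in atom}

print("invariant core:", preserved)
print("breaks down at:", broken)
```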

The generation of perceptually, conceptually or actionally incisive cognitive images of the environment, as well as the generation of highly structured analogies, are both based on awareness of lawlike patterns or structure preserving transformations between perceptual, conceptual or actional constituents. The ability to construct an intelligent cognitive image of the environment is thus linked to the capacity to critically analyse and creatively synthesise insightful analogies, since both require the exploitation of lawlike patterns or structure preserving transformations.

Since neural nets can trawl vast archives and compile massive data sets, they are able to discern and catalogue the relationships between a far greater number of patterns than any single human is capable of. Moreover, because they operate according to formalized criteria that are both inspectable and reprogrammable, they may explicitly reveal patterns that are obscured for humans by their ideological sedimentation in implicitly structured practices. This is not to say that neural nets or algorithms are immune to ideological bias – on the contrary, prejudice is often embedded in computational systems; this is frequently opaque to analysis due to the complexity of the system or to proprietary rights over its management, and it may exacerbate the expression or enaction of bias in computer-assisted human behaviour. However, formal tools can give us traction on this problem, not only allowing for the emancipatory amelioration of the computational system, but also facilitating the acknowledgement and transformation of predispositions in human behaviour. For example, a neural net designed by a team at Google was programmed to trawl Google News texts looking for patterns of conceptual associations, or word embeddings, that could be represented by a simple vector algebra in a multi-dimensional vector space.[13] These word embeddings, for example ‘Paris is to France as Tokyo is to Japan’, capture structure preserving transformations or analogical transitions in the formal language of vector space mathematics. However, since the neural net discovered these analogical transitions by consuming a vast diet of professional journalism, it inadvertently picked up the latent gender bias that pervaded the input data as a whole. Even though gender bias was not noticeable within each single input article, at least to the journalists and editors responsible, the aggregate output displayed a marked prejudice, so that the neural net would make analogical transitions such as ‘man is to computer programmer as woman is to homemaker’. Because this bias was captured and represented by formal mathematical tools it was not only legible in the warped geometry of the neural net’s vector space but also transformable by applying a distortion of inverse proportions, a process the researchers called ‘hard de-biasing’. However, judging the shape of the warp, or the appropriateness of the analogical transitions that populated it, could not itself be formalized and automated in the present system, since that would require some kind of artificial general intelligence capable of making normative judgments regarding the appropriateness of associations made on the basis of gender difference. As a result this task was outsourced to a democratic vote by a small group of Amazon’s Mechanical Turk workers. The irony that there is a marked gender asymmetry in the makeup of the Mechanical Turk workforce should not be lost here (hard de-biasing relies on structurally embedded employment asymmetries). Nevertheless, it should not detract from the remarkable power of the neural net and its formalized mathematical tools to apprehend and transform the latent ideological bias sedimented in journalistic practice.
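As a rough illustration of how such analogical transitions and their correction can be expressed in vector-space terms, here is a sketch using invented toy vectors (the real system used word2vec embeddings trained on Google News, and the projection step below is only a simplified gloss on the ‘hard de-biasing’ idea of Bolukbasi et al., not their exact procedure):

```python
import numpy as np

# Toy word vectors standing in for pretrained embeddings; these numbers are
# invented purely for illustration.
vec = {
    "paris":      np.array([0.9, 0.1, 0.0]),
    "france":     np.array([0.9, 0.1, 0.5]),
    "tokyo":      np.array([0.1, 0.9, 0.0]),
    "japan":      np.array([0.1, 0.9, 0.5]),
    "man":        np.array([0.5, 0.0, 0.2]),
    "woman":      np.array([0.0, 0.5, 0.2]),
    "programmer": np.array([0.45, 0.05, 0.8]),  # deliberately skewed toward "man"
}

def cos(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def nearest(v, exclude=()):
    # closest stored word by cosine similarity
    sims = {w: cos(v, u) for w, u in vec.items() if w not in exclude}
    return max(sims, key=sims.get)

# Analogical transition as vector arithmetic: paris : france :: tokyo : ?
print(nearest(vec["france"] - vec["paris"] + vec["tokyo"],
              exclude={"paris", "france", "tokyo"}))  # 'japan' with these toy numbers

# Simplified gloss on 'hard de-biasing': project out the component of a
# (supposedly gender-neutral) word along the man-woman direction.
g = vec["man"] - vec["woman"]
g = g / np.linalg.norm(g)
debiased = vec["programmer"] - (vec["programmer"] @ g) * g

print(cos(vec["programmer"], vec["man"]), cos(vec["programmer"], vec["woman"]))  # skewed
print(cos(debiased, vec["man"]), cos(debiased, vec["woman"]))                    # now equal
```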

 

[1] The methodology and its explanation are derived from Bechtel, W. (2010) Discovering Complexity: Decomposition and Localization as Strategies in Scientific Research. MIT Press.

[2] See: Calude, C. & Longo, G. (2015) Classical, Quantum and Biological Randomness as Relative Unpredictability. Natural Computing, Springer; Longo, G. & Montévil, M. (2012) Randomness Increases Order in Biological Evolution. Frontiers in Physiology, n. 3, 39; Bravi, B. & Longo, G. (2015) The Unconventionality of Nature: Biology, from Noise to Functional Randomness. Invited Lecture, Unconventional Computation and Natural Computation Conference (UCNC), Auckland (NZ), 31/8 – 4/9/2015, proceedings to appear in Springer LNCS, Eds. Calude, C.S. & Dinneen, M.J. pp. 3-34.

[3] Crutchfield, J.P. (1994) The Calculi of Emergence: Computation, Dynamics, and Induction. Physica D, Proceedings of the Oji International Seminar Complex Systems – from Complex Dynamics to Artificial Reality. Elsevier North-Holland, Inc. New York, NY, USA.

[4] Negarestani, R. (2014) The Labor of the Inhuman, in Eds. Mackay, R. & Avanessian, A. #Accelerate: The Accelerationist Reader. Urbanomic.

[5] Wittgenstein, L. (2009) Philosophical Investigations. Wiley-Blackwell.
Cf. Brandom, R. (1994) Making it Explicit: Reasoning, Representing and Discursive Commitment. Harvard University Press. pp. 13-30.

[6] Epstein, J. M. (2007) Generative Social Science: Studies in Agent-Based Computational Modeling. Princeton University Press.

[7] Longo, G. (2008) Laplace, Turing and the “imitation game” impossible geometry: randomness, determinism and programs in Turing’s test, in eds. Epstein, R., Roberts, G., & Beber, G. Parsing the Turing Test. pp. 377-413, Springer.

[8] Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[9] Caygill, H. (1995) A Kant Dictionary, Blackwell Publishers, Great Britain. p.66 quoted in Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[10] Gentner, D. (1998) Analogy, in: A Companion to Cognitive Science. Blackwell, chapter II(1), pp. 107–113. quoted in Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[11] Hutchins, E. (1995) Cognition in the Wild. MIT Press, Cambridge, MA.

[12] This is demonstrated in Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[13] Bolukbasi, T., Chang, K-W., Zou, J., Saligrama, V. & Kalai, A. (2016) Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. arXiv:1607.06520.

 

Viva Presentation

My interest in noise began from a dissatisfaction with available accounts in the humanities, particularly within sonic culture literature, where noise is mostly presented either in terms of the irreducibility of subjective experience to any form of measure or explanation, or as unpredictable material vibrations that disrupt control and undermine forms of domination – in both cases noise is opposed to rationality, which is characterized as compartmentalizing, idealist, and bound to a static ontology. This characterization fundamentally misunderstands reason, missing the complex dialectic between realism, materialism, and idealism. Moreover, it is based on a fetishization of chance, randomness, and perturbation that is endemic in many social contexts, appearing in a number of different guises and with various practical and theoretical ramifications – for example, I show certain philosophical and scientific expressions of this fetishization, and also examine its expression in economics and in music. So, my aim was firstly to counter these mischaracterisations by giving a more positive account of the scientific conception of noise. To do this I first needed to explain noise in terms of various measures of randomness and unpredictability. This enables a demystification of noise because randomness is scale- and language-relative but not therefore relativist, irreducibly subjective, or beyond scientific description. Understanding randomness as a scale-relative phenomenon means it can no longer be put into a simple opposition with order or rationality. A crucial claim of my thesis is that noise is not a given that can be treated everywhere the same – it is rather a context-specific phenomenon that demands a multi-scale analysis in terms of the different levels of dynamics operating in complex hierarchically nested systems. My intention was to deepen the understanding of the concept of noise, to ramify its pathways, by giving it a multi-level and multi-contextual description, ranging over its philosophical, scientific and artistic implications, and exploring its development in the fields of economics and music.

 

My argument is that the problematic conception of noise put forward in the humanities stems from the opposition between the classical scientific image – based on mechanistic determinism (reversible) – and the manifest experience of freedom (perception, conception, and action). This is why I turn to Sellars, and his call for a stereoscopic fusion of the scientific and manifest images. Various strands of thought in the twentieth century (philosophical, scientific and artistic) sought to overcome this opposition by explaining the manifest experience of freedom in terms of self-organising systems (irreversible). Following René Thom I argue that this results in a botched dialectic, where the distinction between thought and thing is unjustifiably collapsed, and freedom comes to be understood as material vibrations beyond any measure of the intellect (while Thom levels this accusation at theorists like Atlan, Prigogine and Serres, I extend it to Deleuze). This problem is compounded in humanities accounts of noise, which generally have a poor understanding of the scientific issues, in particular entropy. Thermodynamic and information-theoretic conceptions are often confused, and the distinctions between the dynamics of physical and biological systems, as well as between information processing systems and rational thought, are missed or misunderstood – this is why I turn to Longo’s explanation of these distinctions, and the development of the concept of anti-entropy in contrast with negentropy and entropy.

I argue that moving beyond this botched dialectic requires navigating between two equally problematic ways in which the opposition between thought and thing has been overcome: on the one side a reductionist claim (epitomized by eliminative materialism) that the manifest experience of freedom is an illusion and that all talk of persons and their beliefs will be made obsolete by neuroscientific explanations, and on the other side a fusion of thought and thing based on the unpredictability of self-organising systems and the irreducibility of experience to any form of measure. This is why I find it necessary to turn to Sellars’ inferentialist account of rational thought, which argues for a fully naturalized explanation of thought but understands freedom by returning to the Kantian conception of normative self-determination. Importantly, unlike Kant, Sellars does not take this to stem from or rest on the self-sufficiency of subjective experience; it is naturalistically evolved at the level of collective forms of representation and communication. Sellars builds his naturalistic account of the development of intelligence by referring to the pattern-governed activity of sentient lifeforms, and arguing that humans are not merely pattern-governed but also rule-obeying. Normative freedom consists in the ability to alter behaviour according to reasons; Sellars refers to this as the irreducibility of the ought to the is. This is an important distinction that is often collapsed in recent work in the humanities inspired by Deleuze, which rejects the representational model of thought based on the propositional form of judgment and replaces it with bottom-up processes of self-organising systems. Both Deleuze and Sellars offer unconventional analyses of Hume and Kant, and these bear important consequences for the philosophical articulation of the concept of noise and its relation to randomness, freedom, causality, and reason. I explore their different interpretations, systematically rejecting the Deleuzian emphasis on immediacy and intuition whilst maintaining and rearticulating his important insights within a Sellarsian framework.

After addressing the question of noise at the philosophical level I then turn to an exploration of the concept of noise in two more practical contexts: economics and music. I explain in the introduction that this choice is not arbitrary, since both are highly mathematical, both draw on or are influenced by probability theory and physics, both bear interesting relationships to materialism and idealism, to the naturalistic and the normative, to chance, unpredictability and noise.

The rise of modern finance, and the onset of global capitalist trade networks, is predicated on the mathematics of probability theory. Classical and neoclassical economics are an ideology based on a substantialist conception of value, equilibrium dynamics of the market, and the natural tendency towards the benefit of all agents. They import concepts from physics – where trajectories are predictable, reversible, and tend towards equilibrium – and apply them to the socio-economic domain, where asymmetries and irreversible path-dependent processes are pervasive. The problem, I argue, again stems not so much from the brutal abstractions of economic theory as from a faulty articulation of the naturalistic and the normative. Following the recent work of several different authors I argue that value is derivative of the pricing process, and that every economic transaction, as well as the constitution of the liquidity function of money, is a normative gesture that cannot be justified by pointing to any nature. Neoclassical economics is still the orthodox account, however, and by naturalizing competition within the framework of marginal utility theory it has shaped neoliberal hegemony and influenced global economic policy decisions over a prolonged period. Certain heterodox economic theories have been enjoying renewed attention recently as a result, in particular ecological economics, which has recognized non-equilibrium dynamics, information asymmetries, pervasive feedback effects, and so on. In doing so they move beyond the ideological assumptions of orthodox theories. However, I draw on Longo’s work with Felin, Koppl and Kauffman to argue that the concept of landscape that ecological economics draws on is illegitimately taken from physics and cannot be rigorously applied either to biological ecologies or to the econosphere, where trajectories are unpredictable, possibilities cannot be totalized, and so a phase space cannot be constructed. Again following Longo’s work I argue that the idea of robustness to noise is not pertinent to biological and economic systems, since they are constantly undergoing symmetry-breaking transitions and are thus better described by the concept of resilience. What is needed therefore is a synoptic economic theory that is fully naturalized but that accounts for the specificity of socio-economic dynamics in contrast with physics, and that preserves the fundamentally normative character of the pricing process.

Lastly, I turn to an analysis of noise in the sonic sense, and its use in music. I begin by outlining a phenomenological account of auditory experience drawing on Husserl and Schaeffer. This establishes the difference between physical soundwaves and auditory experience, which includes various anamorphoses as well as the global integration of auditory information into a window of simultaneity constituting a specious present marked by retentions and protentions. I examine bistable and multistable auditory phenomena as revealing the active subpersonal processes of signal disambiguation and image construction underlying the apparent seamlessness of auditory scene analysis. This can be understood as the Bayesian calibration of a decision criterion on a probability distribution elicited by a received signal (a toy sketch of this kind of criterion-setting follows at the end of this paragraph). I then review some examples of music that explore this sweet spot of perceptual illusion, such as spectral music and post-techno. I argue that music is more than just ‘organised sound’ and can be better understood as the diagrammatic or gestural constitution of a counterfactual space and time allowing for the transformation of behaviour within this new possibility space. I then delve further into the neuroscientific account of auditory experience, drawing on Metzinger’s self-model theory of subjectivity and his multi-level description of the constraints of conscious experience. I then explore the use of noise in contemporary music, discussing its relation to art theory and to genre, and arguing for the necessity of a multi-level account that includes semantics and syntax as well as the more traditional focus on rhythm, melody, harmony and so on. Finally I show how tools from contemporary mathematics – category theory and topos theory – can deepen the analysis of music, allowing for an understanding of musical time in terms of symmetry breaking, as well as accounting for the gestural constitution of this time in the self-referential structure of music.
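A minimal sketch of the Bayesian criterion-setting mentioned above (the Gaussian likelihoods, the priors and all the numbers are illustrative assumptions, not a model of auditory scene analysis itself):

```python
import numpy as np
from scipy.stats import norm

# Two perceptual hypotheses competing to explain a received signal.
prior_a, prior_b = 0.7, 0.3
like_a = norm(loc=0.0, scale=1.0)   # evidence distribution under hypothesis A
like_b = norm(loc=1.5, scale=1.0)   # evidence distribution under hypothesis B

def posterior_b(x):
    # Bayes' rule: p(B | x)
    pa = prior_a * like_a.pdf(x)
    pb = prior_b * like_b.pdf(x)
    return pb / (pa + pb)

# The decision criterion is the evidence value at which the posterior tips past
# 0.5; signals near this point are bistable, and shifting the priors
# recalibrates where the tipping point falls.
xs = np.linspace(-3.0, 5.0, 8001)
post = posterior_b(xs)
criterion = xs[np.argmin(np.abs(post - 0.5))]
print(criterion)
```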

AntiNatural

AntiNatural Talk

The concept of nature is on the one hand a mainstay of diverse forms of conservative ideology where the natural is predominantly opposed to the artificial, the idealistic, and the deviant or perverse.

On the other hand, there is an attempt to overcome the oppositions entrenched in this conservative perspective by expanding the concept of nature so that it includes technology, fantastic ideas, and unconventional behaviour.

The conservative position is overly deterministic; it justifies normative prescriptions concerning what ought to be the case by pointing to naturalistic descriptions of what is the case.

The expanded concept of nature is overly indeterministic – its emphasis is on the indeterminate potential of becoming, but by extending the concept of nature to all behaviour the normative capacity for judgment loses its purchase. By fetishizing natural processes of self-organisation over representation and the propositional articulation of reason it remains complicit with the neoliberal logic of late capitalism.

An adequate conceptualization of nature must navigate between these positions.

In contrast with the conservative position, nature is neither a given nor a justification. The market does not naturally tend towards the formation of fair prices. Income inequalities and traditional gender roles are not human nature. Neither the ecological nor the social can be seen as a natural order; they are not fragile equilibria that must be guarded against technical and social transformations. They are both artificial constructions that have never been anything but far from equilibrium.

In contrast with the expanded conception of nature, we must make a distinction between the different levels of information processing dynamics, giving a naturalistic description that accounts for the normative-linguistic capacity for the top-down control of behaviour according to propositionally articulated reasons.

To be human is to enter into a game of artificial self-construction at the level of the social – this is an ongoing process of alienation from nature, a progressive deracination from the local exigencies that constrain thought and behaviour. Nature is no reason, and reason is not natural even if it is part of our nature. As Negarestani says, reason is inhuman and ‘Inhumanism is the labor of rational agency on the human’; this is the elaboration of what it means to be human. Nature reflexively grasps its anti-nature and thus transforms itself. Freedom is not given but is performed or produced in this labor of reflexive transformation.

In order to grasp the true nature of this freedom it is necessary to avoid two tendencies to misconceptualise the relationship between reason and causality; on the one hand ruthless reductionist accounts that aim to eliminate the inherited illusions of folk psychology, on the other hand emergentist accounts that argue for the irreducibility of thought to causal processes. The former deny the normative-linguistic force of reason, the latter deny the causal-naturalistic explanation of reason.

To give a mathematical description of a process is to naturalise it – to explain it according to natural laws. When Galileo mathematised the supralunary realm he naturalized the heavens.

To claim that something is not amenable to explanation according to natural laws is anti-scientific mystification.

For Bergson, and for Deleuze, the lived experience of duration is natural, in a vitalistic expanded conception of nature, but cannot be mathematized so is not amenable to explanation according to natural laws.

This is a mystification of experience. On the contrary, the lived experience of duration can be naturalized according to a neurophenomenological description of the global architecture of consciousness.

What cannot be axiomatised, and what is in that sense anti-natural, is reason. The normative-linguistic capacity of thinking can be explained in terms of the causal structure of neuronal activity, but a description of the neuronal activation patterns in the brain at any one state is no indication of what its subsequent state will be.

If we give a description of an inanimate object like a rock, specifying its position and velocity, we can calculate with accuracy where it will be at some point in the future.

Being here entails that it will be there.
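A standard worked example of such an entailing law, assuming for illustration a constant acceleration (free fall):

```latex
% Constant-acceleration kinematics: the present state (x_0, v_0) entails the
% future position for every t.
\[
  x(t) = x_0 + v_0\, t + \tfrac{1}{2} a\, t^2
\]
% e.g. with x_0 = 0, v_0 = 3 m/s, a = -9.8 m/s^2:  x(1 s) = 3 - 4.9 = -1.9 m.
```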

But there are no entailing laws for predicting the trajectory of biological organisms or neural assemblies.

However, this does not mean that the freedom of thought is just the unpredictable randomness of neuronal activation patterns.

The freedom of rational subjectivity, its logical irreducibility to any naturalized description, is its capacity to acknowledge, construct and revise rules, and to perceive, think and act according to these commitments.

Reason is fully naturalistic, in the sense that it is amenable to scientific explanation in terms of its causal structure and its functional properties, whilst also requiring a further level of description that must be addressed at the normative level of commitments and entitlements.

The definition of freedom has been bound up with the philosophical problem of necessity and chance, determinacy and indeterminacy, and this has caused a great deal of confusion.

Continental theory in particular is to blame for promoting a ‘botched dialectic’ that makes ‘self-organising’ randomness and perturbations below the threshold of measurement the wellspring of freedom and creativity, against the rational description of systems in terms of mechanistic determinism.

Countering this ‘pseudo-libertarian imposture of chaos’ does not mean returning to a dualistic conception in which material processes are reduced to the linear causal regime of particle impacts and opposed to some form of spontaneous unconstrained freedom. Rather, it demands a reconceptualization of the relation between reason and randomness that resists the temptation to hypostatise chance.

This argument follows René Thom’s criticism of the glorification of chance in the form of random fluctuations and perturbations in the diverse philosophies of Monod, Prigogine, Atlan, and Serres. I think Thom’s critique can be extended to the very different ways in which randomness, self-organising systems and noise have been misconceptualised and fetishised in philosophies such as Deleuze’s, in political economy, and in the theory and practice of music (which I don’t have time to go into).

Thom’s argument follows from a negative definition of randomness as what exceeds simulation or formal description. He explains that the capacity for simulation or formal description is relative to a certain scale of observation, and that this is particularly true for the analysis of complex hierarchically organised systems such as ourselves.

It could be argued that Thom has a merely epistemological understanding of randomness, and cannot thereby think its ontological scope. However, this would be mistaken; his argument is that any talk of randomness presupposes the definition of a frame of reference, or context, and a language or means of representation; ‘any discussion on the theme of “chance vs. determinism” must begin with an examination of the languages and formalisms which permit making a phenomenon the object of a realm of knowledge.’ This approach is corroborated by James Crutchfield’s ‘computational mechanics’, which also argues that any measure of disorder is relative to the descriptive tools employed, and the specification of this language is defined by what the model is intended to observe.
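A toy illustration of this relativity of disorder to descriptive tools (my own sketch, in the spirit of the claim rather than Crutchfield’s actual epsilon-machine reconstruction): the same sequence looks maximally random to a model that reads it symbol by symbol, and perfectly ordered to one that can see its period-two structure.

```python
from collections import Counter
from math import log2

# The same sequence scored by two different descriptive models.
seq = "01" * 50   # a perfectly periodic string of 0s and 1s

def entropy_per_symbol(counts, block_len=1):
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values()) / block_len

# Model 1: single symbols assumed independent -> looks maximally random (1 bit/symbol).
h1 = entropy_per_symbol(Counter(seq))

# Model 2: non-overlapping blocks of length 2 -> the period-two structure is
# visible and the sequence looks fully ordered (0 bits/symbol).
h2 = entropy_per_symbol(Counter(seq[i:i + 2] for i in range(0, len(seq), 2)), block_len=2)

print(h1, h2)  # 1.0 and (-)0.0 with this input
```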

Thom begins with an epistemological definition of randomness and draws an ontological conclusion from this; he affirms the ultimate describability of nature in principle (i.e. the non-existence of fundamental limits to reason), and thereby denies the hypostatisation of chance: ‘To assert that “chance exists” is therefore to take the ontological position which consists in affirming that there are natural phenomena which we shall never be able to describe, therefore never understand.’ Thom’s negative ontological claim might be rephrased as the positive assertion that for any context-specific or scale-relative appearance of randomness, there are no a priori limitations to its description or scientific understanding at another scale. One might argue then that randomness exists (has an objective ontological status), but only as an effect of information processing dynamics and multi-scale complexity.

To summarise we are not free because of the indeterminacy of nature or because of a lack of constraints, but because the complex hierarchically nested structure of constraints in dynamic systems such as ourselves enables us to control our behaviour according to rules and make choices according to reasons. As techno-scientific knowledge progresses more and more complex phenomena will yield to a naturalized description, finally leading to a fully objectified account of experience.

Having a naturalized description of something makes various control opportunities available that were hitherto unimaginable. The more that consciousness is given a natural description, the more we can gain traction on the parochial limitations of biological cognition and transcend them. This is the infinite goal of anti-nature: lean forward and activate the revisionary-constructive engineering loop.

New Abstract

This thesis aims to elaborate the theoretical and practical significance of the concept of noise with regard to current debates concerning realism, materialism, and rationality. The scientific conception of noise follows from the developments of thermodynamics, information theory, cybernetics, and dynamic systems theory; hence its qualification as irreversible. It is argued that this conceptualization of noise is entangled in several polemics that cross the arts and sciences, and that it is crucial to an understanding of their contemporary condition. In particular, there are ruthless reductionist accounts that aim to eliminate the inherited illusions of folk psychology on the one hand, and emergentist accounts that argue for the irreducibility of thought to causal processes on the other hand. The former deny the normative-linguistic force of reason, the latter deny the causal-naturalistic explanation of reason. In contrast with either tendency this thesis contributes to the theoretical articulation of noise by arguing for the necessity of an inferentialist account of reason that is fully naturalistic, in the sense that it is amenable to scientific explanation in terms of its causal structure and its functional properties, whilst also maintaining a normative conception of freedom that must be addressed at the level of commitments and entitlements in the space of reason. It draws on complexity theory to give a multi-level account of noise, and argues that randomness is an intrinsic functional component at all levels of complex dynamic systems, including higher cognition and reason. After surveying the scientific and philosophical context, the practical understanding of noise in terms of probability theory is elaborated through a history of its development in the field of economics, where the idealization of randomness has had its most pernicious effects. Moving from the suppression of noise in economics to its glorification in aesthetics, the experience of noise in the sonic sense is first given a naturalistic neuro-phenomenological explanation. Finally, the theoretical tools developed over the course of the inquiry are applied to the use of noise in music. The rational explanation of randomness, and the active manipulation of probability that this enables, is opposed to the political and aesthetic tendencies to fetishize indeterminacy. This multi-level account of constrained randomness demystifies noise, showing it to be an intrinsic and functionally necessary condition of reason and consequently of freedom.

Irreversible Noise: The Veneration of Chaos and the Rationalization of Indeterminacy

This site is really less of a blog than a repository for previous work I’ve done, much of which I no longer endorse. Basically it’s a dumping ground where expired ideas can decay in public, unleashing their toxins on the unsuspecting visitor. It’s about time I updated the compost pile by adding a relatively fresh heap of conceptual detritus. So, here’s a summary of what my thesis is supposed to be doing:

 

Irreversible Noise: The Veneration of Chaos and the Rationalization of Indeterminacy

The premise of this thesis is that it is the ‘strategic ambiguity’ of the concept of noise that allows for its exemplary traversal across artistic, scientific, economic and political fields. In contrast to many other accounts that mobilize the concept against the prevailing order, thereby endorsing an ontological indeterminism that risks falling into negative theology (the veneration of chaos), it attempts to show how the transdisciplinary articulation of noise has allowed for a progressive generation of constraints and rationalization of uncertainty. This horizontal extension in the topological space of the concept (the topos of noise) is deepened by a stratigraphic cut through its multiple dimensions of verticality, revealing the complex hierarchical dynamics of its structural constitution.

 

Considering philosophy to be a metadiscursive practice where the concept finds its full elaboration, it examines how metaphysical, ontological and epistemological claims expand the conceptualization of noise through tracing and reconfiguring the ramified pathways of its topos. It then proceeds to describe the physical, chemical and biological generation of noise in terms of thermodynamic, morphodynamic, and teleodynamic processes that are necessary preconditions for the emergence of complex thought. The prosthetic construction of a space of reason is then analyzed in terms of its socio-technical development, deploying the multi-scale extension of the concept of noise to render the relation between techné and episteme explicit with regard to indeterminacy and randomness. This is further explored according to a functionalist and pragmaticist account of reason as a revisable structure of commitments.

 

Noise plays a central role in the contemporary rearticulation of reason since it crucially draws on abductive as well as deductive and inductive logical inferences due to their error-tolerant capacity for revision and the ampliative expansion of knowledge through hypothesis generation. This inferential aspect of noise is pursued through an investigation of computation and algorithmic incompressibility, and in terms of heterodox economic accounts that draw on thermodynamic, evolutionary and ecological science. Phenomenological descriptions of sensory interference and neurophilosophical explanations of the stochastic randomness of brain functions demonstrate the necessity of multi-scale noise for higher order processes. Finally, the escalating tendency towards the inclusion of complex sound in twentieth century music is shown to be irreducible to its analysis in either aesthetic or conceptual terms alone, where a full appreciation must take account of the dynamic depth of noise.

 

 

 

Glass Bead in New York

http://glass-bead.org

http://www.e-flux.com/program/reza-negarestani-and-guerino-mazzola-presented-by-glass-bead/

 


Charles Sanders Peirce, Labyrinth. From Charles Sanders Peirce papers, MS Am 1632 (1537). Houghton Library, Harvard University, Cambridge, Mass.

Reza Negarestani and
Guerino Mazzola, presented
by Glass Bead

Reza Negarestani: “What Philosophy
Does to the Mind”
Tuesday, April 22, 2014, 7–9pm

Guerino Mazzola: “Melting Glass Beads—The Multiverse Game of Strings and Gestures”
Friday, April 25, 2014, 7–9pm

e-flux
311 East Broadway
New York, NY 10002

T 212 619 3356

www.e-flux.com