Rough Transcript of Ray Brassier’s Talk – “Pricing Time: Remarks on Malik’s Ontology of Finance”

it makes a radical claim about politics in the condition of capitalization as subordinated to the logic of financialisation

two reasons for confronting financialisation – two lessons of the 2008 crash: derivatives markets present a systemic risk to national and world economies, and the relative size of these markets presents a systemic risk to geopolitical and economic security (the sums now being traded are equal to if not greater than national economies)

four considerations (motivating factors for arguments)
1. finance power – what is finance power as distinct from modern state sovereignty? since finance here is a euphemism for a systemic, market-led dynamic of capital accumulation…what is required is a power theory of finance that must take its lead from the operational complexity of financial markets. he develops a non-standard general theory of price and of the political economy of finance. financialisation is now a political power.
2. futurity. the reorganization of the relation to the future via price in general, not just within the circumscribed arena of derivatives markets but across the entire social order. derivatives are seen to systemically operationalize an unprecedented modality of the wager that is intrinsic to the standard notion of betting but is theoretically and practically unavailable upon the basis of that standard notion. what is going on in derivatives markets is a kind of wagering on the future that cannot be understood in terms of the standard notion of a bet.
3. left accelerationism must abandon its admittedly ambivalent attachment to a Marxian labour-based determination of capitalism and political economy, because these are not the prerequisites of capital power in general but tendentious misapprehensions of it. the proposition advanced here is that capitalism and political economy need to be understood according to the most advanced theoretical tools available today, and this means finance in general and derivatives in particular, not Marxism.
4. critique of what he calls neo-rationalism. ‘the extirpation of social norms by capital power (a normativity that does not entail the destruction of social order but the chronic reinstitutionalisation of a risk order)’ – this is what financial speculation does according to Malik – ‘casts significant doubt upon the political and theoretical adequacy of a neo-rationalist programme to the ambitions of LA. if neorationalism contends that subjective norms can be progressively transformed by the pragmatic universalism of self-revising rational norms, that contention supposes both the authority of reason not only over conceptual thought but also over social norms, and also the revisability of social norms… capital-power, though certainly not directed by theoretical reason, revises social norms to the point at which [they] lose efficacy altogether; authority of any kind is not a prevalent power-modality in the risk-order; and risk itself proscribes any tendential organization or universalist determination…other than greater capitalization.’ the claim being made here (and the critique of LA) is that the idea that collective self-organization can challenge capital power is precisely what is undermined by the risk order intrinsic to the logic of financialisation. financialisation entails an operationalization of risk, which is to say a wager on the future, and this modality of wager/invocation of risk subverts any voluntaristic attempt to determine the future, or to collectively organise in order to construct the future on the basis of an actually existing state of affairs. this is the radical political contention of the text. existing social norms do not provide an adequate basis for reorganizing our relationship to the future; we have to understand the financialized operationalization of risk in order to be able to effectuate the future, and this means that rationality is not a political resource. risk/gambling supplants organization and planning. critique of the latent voluntarism of LA.

Malik uses four resources – first, Nitzan and Bichler’s Capital as Power, which is a challenge to Marxian accounts of capital but also to the dominance of neoliberal understandings of political economy
secondly, the logic of derivatives as Derridean différance
thirdly, work of Esposito on time binding
and finally Ayache on derivatives markets as an inscription of contingency

so Malik’s argument draws on all four sources, critically on Derrida, Esposito and Ayache but more or less uncritically on Nitzan and Bichler

any systematic critique of N+B would undermine all the subsequent steps of Malik’s argument

N+B propose that capital is directly power because it is neither a material entity nor a productive process but rather the very ability of absentee owners to control, shape and structure society more broadly
for N+B capitalism is not a mode of production that has certain political consequences, that inaugurates a certain ordering of power – capitalization is directly an effectuation of power
the main conflict in capital is between those accumulating power
there’s no class struggle
the antagonism is not between social classes, it’s between capitalists themselves
or as Malik puts it, ‘Capitalists do not seek to accumulate capital nor (as liberal business dogma has it) to maximize profits, but rather to ‘beat the average’ represented by the normal rate of return. The rate is set not just by the standard instruments, but also by the …N+B’s shorthand for accumulation by intracapitalist rivalry is differential accumulation’
the two primary aspects of differential accumulation are price and sabotage
price tells us how much a capitalist will pay now to receive a flow of money later. price is merely the unit with which capitalism is ordered. and capitalization is the pattern of that order.
‘bonds, shares, etc. are simply different incarnations of the same thing: they are all income-generating entities’
this has three consequences:
firstly, prices are uniform across space and time, providing universal transhistorical equivalents – in this view prices are independent of the value of a currency within a specific economy, so prices can be transplanted or converted across economies and currencies. secondly, price is a direct index of differential accumulation, and hence of social power; it structures the dynamic reordering of power – in other words, price fixing is power ordering. thirdly, price is the medium of a transformative power rationality, whose specific historical organization is the result of intra-capitalist conflict. intra-capitalist conflict is all about price fixing.
sabotage (this is the second aspect) locally entails diminishing competing firms’ capital accumulation relative to one’s own, and globally entails limiting the average overall rate of growth of profit, in order to secure differentially greater accumulation against the average rate. as a systemic condition, differential sabotage manifests itself in diverse social arrangements including unemployment, inflation, wage restraints, social fragility, education policy, immigration regulation, etc. Capitalists do not accumulate capital by seeking to maximize profit by maximizing productivity; rather, differential accumulation requires compromising productivity as such. Business is then not just unproductive, but moreover necessarily counterproductive, as are capitalist societies overall and in general. this means that finance necessarily promulgates sabotage in general, meaning it is an inherently counterproductive power.

the positive determination of price qua abstract financial magnitude is that it directly indexes capital’s power ordering and reordering. through price fixing, social institutions and the organization of society can be ordered and reordered. capital power is exerted through pricing. pricing and sabotage are coeval and mutually determining, directly constituting the order of power across society at every scale, as a necessarily integrated political economy. this demonstration of differential accumulation via price-setting strategies makes it explicit that administering prices according to mark-ups already embodies the power to incapacitate the social order. if price setting advances differential accumulation via both accumulation and the concentration of social power, then prices set the market. Administered prices make explicit that price is the medium of capital accumulation qua power-ordering.

This is what power is in capitalism. The power to price is the power to create and destroy social order independently of social constraints. markets and pricing predicated on finance enable social reshaping and reformatting ‘in innumerable ways’ that ‘no other ruling class has ever been able’ to undertake. finance is the condition and means of capital-power, and capital ‘is finance and only finance’
this means that capitalization is a species of financialisation. as the structural condition of capitalization, finance logically precedes it, or inversely economic practice is a restricted theoretical and practical rendition of capitalization, and capitalism is only a particular order of financialisation, meaning that it is not the only possible one.
this is crucial for Malik’s argument, because his claim is that a postcapitalist society can’t be a post-financial society, so that postcapitalism will simply be an alternative order of financialisation. that’s the basic claim. but this entails a distinction between financialisation as a kind of structural a priori or condition, and financial markets as a social-historical instantiation of this logic of financialisation. Malik distinguishes the a priori financiality of capital-power as a systemic condition for capitalization, and finance markets as practical institutional operating mechanisms and facts of capital accumulation. So what we have is a familiar philosophical distinction between financialisation (a priori) and financial markets (empirical instantiation). the argument relies on articulating two dimensions of finance – a priori condition and historical fact – without directly identifying them, and as we’ll see his argument depends upon distinguishing financialisation and markets, but it’s not clear how viable this distinction is.
what is a derivatives contract? the key distinction is between the forward or delivery price, which is the price agreed in the contract, and the spot price of the underlying at maturity, which is the actual market price of the asset when the contract expires.

the derivatives contract is made in view of the likely difference between these two prices, yielding a profit or loss for one of the parties
the derivatives contract is a contingent claim in a double sense: firstly, it depends upon an eventuality independent from and external to the contracted price, namely the price of the underlying asset; secondly, in the prevalent sense in which contingency is understood in the derivatives markets, the eventuality upon which the payout depends may or may not be occasioned, meaning that the contract will lead to a gain for one party and a loss for the other without certainty as to which will be the gaining or losing party

Gains or losses are made dependent on whether the price agreed in the contract, the delivery price, is higher or lower than the market price of the underlying (spot price) at maturity.
the payout of a derivative is determined solely by the terms set out in the contract, this means that the historical material or qualitative particularity of the underlying is irrelevant
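To fix the terminology in a toy form: the payoff described above depends on nothing but two numbers, the delivery price agreed in the contract and the spot price of the underlying at maturity. A minimal illustrative sketch (not drawn from Malik’s text; the function name and figures are hypothetical):

```python
# Minimal sketch of the payoff logic described above, for a simple forward
# contract: the gain or loss depends only on the difference between the
# spot price of the underlying at maturity and the delivery price agreed
# in the contract. Names and figures are illustrative placeholders.

def forward_payoff(spot_at_maturity: float, delivery_price: float) -> float:
    """Payoff to the party obliged to buy (the 'long' side) at maturity."""
    return spot_at_maturity - delivery_price

# The long party gains if the underlying matured above the agreed price,
# and loses by the same magnitude otherwise; the short party's payoff is
# the exact negative. Nothing about the underlying's qualitative
# particularity enters the calculation - only the two prices.
long_gain = forward_payoff(spot_at_maturity=105.0, delivery_price=100.0)   # +5.0
short_gain = -long_gain                                                     # -5.0
```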
the standard way of describing the relation between price and pricing is exogenous – a traditionally conceived wager
on this account derivatives are nothing but a wager on a price differential over time
(predicated on the non-knowledge of the future)
on this account the uncertainty between the contracted price and the price of the underlying at maturity is epistemic – a best bet placed on your ignorance of the future
Malik’s move is to convert this epistemic uncertainty into an ontological indeterminacy
this is where he brings in Derrida and the notion of temporization
the derivatives contract constitutes a price differential between the delivery price and the price of the underlying at maturity
the contract defers the trade of the underlying in order to institute the price differential, and conversely the price differential specified by the forward contract is simultaneously a deferral of the exchange of the underlying asset, or put schematically the forward contract defers exchange to constitute a price differential for an underlying asset between times or across markets just as its positing of that differential defers the immediate vending of the asset
a deferred differential or a differentiating deferral
the crucial conceptual problem here is that the difference between the contracted price and the price of the underlying at maturity is not an actually determined magnitude (not on Malik’s account, he argues against those who claim that it can be)
nor is it the difference between two actual differences
the future difference is constitutive of the present difference just as the present difference is constitutive of the future difference
the differentiation at issue is therefore not the determination of the determinable difference between two actually determined differences but rather the derivatives contract is the determinate inscription of an indetermination
derivatives constitute price differentials precisely according to this differential logic of temporization, which is no less their operation, the delivery…
what’s interesting about derivatives in Malik’s account is that it’s not an exchange
temporization becomes a condition for speculation
so what is going on in derivatives trading is speculative accumulation via temporization
you make money without exchanging anything
derivatives on Malik’s account are hauntological because they inscribe the indetermination of what he considers to be an unpresentable future, rather than the indeterminability of the unpresentable past
this is where he parts company with Derrida – the unpresentable past in Derrida is a trace of the other, source of ethical responsibility
for Malik, inscribing the indetermination of the future is a political act, whereas for Derrida registering the indeterminability of the past is an ethical responsibility
‘derivatives are not pre-ontological, rather their ontology consists of a binding and enforceable contract that is constituted by statute’
the derivatives contract binds time – it’s a temporizing operation which exerts capital power
the logic of capitalization, as an ordering and disordering of a social system, is concentrated in the binding of time in the derivatives contract
the institution of derivatives is the constitution of price differentiation
this is where he follows Ayache in arguing against the Black-Scholes-Merton standard account of pricing
the BSM account treats price movement as a Wiener process, where the uncertainty of the actual movement of a price in the future is rendered as a probability, a bounded calculated anticipation
on BSM account ‘the anticipation of price movement, the measure of changing market forces is both memoryless and, given its unpredictability, futureless’
it is only the present probability of what the future might be…
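A toy illustration of the picture being criticized here: under BSM-type assumptions the underlying follows geometric Brownian motion driven by a Wiener process, so the ‘anticipation of price movement’ is just a probability distribution conditioned on the present state. The parameter values below are arbitrary placeholders, not anything from the talk:

```python
# Toy illustration of the modelling assumption under criticism: in the
# Black-Scholes-Merton picture the underlying's price follows geometric
# Brownian motion driven by a Wiener process, so the future is rendered
# as a probability distribution conditioned only on the present state
# (memoryless). Parameter values are arbitrary placeholders.
import numpy as np

def simulate_gbm(s0=100.0, mu=0.05, sigma=0.2, horizon=1.0, steps=252, paths=10000, seed=0):
    rng = np.random.default_rng(seed)
    dt = horizon / steps
    # Each increment depends only on the current value, not on the path so far.
    increments = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal((paths, steps))
    return s0 * np.exp(np.cumsum(increments, axis=1))

terminal = simulate_gbm()[:, -1]
# The "anticipation of price movement" is just this present distribution:
print(f"mean terminal price: {terminal.mean():.2f}, "
      f"P(price > 120 at maturity) ~ {(terminal > 120).mean():.3f}")
```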
here what is being criticized in the BSM account is the attempt to model the movement of pricing in terms of probability theory
in probability theory there is a segmentation and compartmentalization of the modes of time, such that past, present and future are mutually exclusive states in a linear series
so one can calculate the probability of a future outcome based on information about the actual state of the system
in Malik’s argument, probability only operates if you can decompose wagering into a series of discrete steps, e.g. the coin toss – each determinate outcome is independent of the history of previous flips. we know the long-run frequency converges on 50:50, but we cannot know the outcome of any finite series
so time is segmented into flips/series of states that are undetermined by the previous states, and only in this way can you calculate probabilities
you must also be able to discretely identify the possible outcomes
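A minimal sketch of the conditions just listed, assuming only the standard fair-coin example: frequencies converge in the long run, while each discrete, history-independent flip remains undetermined.

```python
# Sketch of the conditions just listed: probability gets a grip only when
# the wager decomposes into discrete, history-independent steps with
# exhaustively identifiable outcomes. For a fair coin the long-run
# frequency converges on 0.5, but no finite series is thereby determined.
import random

random.seed(42)
for n in (10, 1000, 100000):
    flips = [random.random() < 0.5 for _ in range(n)]   # each flip ignores all previous flips
    print(n, sum(flips) / n)                            # frequency of heads drifts toward 0.5 as n grows
```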
standard financial praxis – account of their own operations as both logically predicated on, and operationally constituting, the volatility and …
the contrast is between a constative and a performative relation of pricing to price
in the BSM account you can represent the possible outcomes of the wager because the outcome of the wager is independent of the praxis of the wager
an exogenous relation between price and pricing, because the practice of pricing in no way determines the prices
what determines market movements are forces working independently of the derivatives contract
what’s peculiar about derivatives trading, drawing on Esposito, is that the infra-wager constitutes that which is wagered upon
the act of wagering constitutes what it wagers upon – a performative relation between pricing and price, as opposed to a merely constative one
this is why prices can no longer be probabilistically mapped or represented
the volatility of markets is endogenous as opposed to exogenous to pricing
the future stipulated by the derivatives contract is unpredictable because it is produced by the very same…
in short, derivatives markets constitute prices
here we need to distinguish between present future and future present
derivatives mobilize a distinction between present future, our current anticipation of the future, and future present, that becomes actual in the future
what is traded on derivatives is not the future, given the then-unknown price of the underlying at maturity, but the present risk of that price against the delivery price
derivatives pricing constantly refers to the present way of seeing the future and not the unknowable future that will come about
that is, derivative pricing makes explicit in the present the relation to an inactual and necessarily uncertain future present
this means that time is system-specific
the difference between present future and future present is always internal to a system, and here it’s a social system
the maintenance within the present of past and future presents depends entirely on the structure, organization, and capacities of any given system
this is why derivatives pricing, and its volatility, are in short constructions of time
finally, Malik distinguishes between the extra-wager and the infra-wager
the extra-wager is the standard wager – a gamble on the basis of necessarily limited knowledge of an inactual future occurrence
we can attempt to mitigate the risk incurred by gambling using probability – prediction – if you can find the likelihood of an outcome you diminish the risk
and Malik’s claim is that in derivatives trading – the infra-wager – the uncertainty itself is what is wagered upon
what is being wagered is the absolute indeterminability of the future state
this is why risk is ontological on Malik’s account
and this is why in the infra-wager, any instance of derivatives pricing is a wager placed not just in an indefinite betting process but also on it. Derivatives market pricing is thereby akin to odds lengthening or shortening on a bet, according to what other betters place. What is priced by derivatives markets then is the pricing process itself. Unlike the extra-wager, derivatives are an infra-wager for which the terms of the relation are not externally determined conditionalities, but only parametric constraints.
what is being wagered in derivatives is wagering itself, you’re betting on a bet, you’re not betting on some outcome that is external to the act of betting
acute involution – pricing of the price process itself, and nothing that is independent of that process enters into it
the orthodox, BSM account is an attempt to represent pricing, as if pricing was an activity that was connected to or interacting with processes in the world
the reason why pricing is the inscription of an absolute contingency, or an indeterminability as such, is that it is a practice which doesn’t reflect or represent any data or information from outside the practice of pricing
In addition to the contingency of abstraction – the universal fungibility of the underlying of the forward contract – the derivative is also contingent in that it posits a speculative, as yet unknown eventuality. that eventuality does not preexist the contract but is fabricated by it. so you’re producing the outcome that you’re betting on. the contract constitutes its inactuality and unknownness; in other words, the unknown that is being wagered in the contract is not independent of the institution of the contract
the two different outcomes are branches of reality only one of which will be actualized at the maturation of the contract
it follows that the derivatives contract is always a contingent claim
but the contingency identified by Ayache is one that the derivative constitutes and inaugurates, and is designated as its thetic contingency – thetic because it is instituted by the wager, by the institution of the contract. what this third contingency of the derivative constitutes is its deracination not with regard to the underlying, from which the derivative was already deracinated by the contingency of abstraction, but rather the deracination of price itself in the pricing process. It posits that the world is what it is in reality – a pricing fact – except that it could have been different. only one of the outcomes is actualized; the other remains inactual.
there’s an appeal to the distinction between virtuality and possibility. possibility is always abstracted from virtuality. In Bergson’s account possibility is always retrojected as a state of affairs that could have been actualized but in fact was not. This involves thinking of time as decomposable into discrete instants, time as being a movement of actualization, the actualization of a possible state of affairs that is somehow adjoined to the actual state of affairs, in other words what could be is determinable on the basis of what is. And the Bergsonian claim is that futurity or virtuality is precisely never what could be if what could be is understood as a possibility inscribed in actuality.
So, in Ayache’s account of contingency, the actualized inactuality is not the realization of a possibility – this is an infra-wager, you’re not wagering on the basis of your ignorance of what could be, not what is unknown but unknowable/indeterminable
if the future is understood as a reservoir of possibility it is determinable – you can predict what could be on the basis of what is
but on this account what might be or may be is unforeseeable or unpredictable because nothing that we know about what is provides a reliable index to what may be
The representation of the world – its discrete segmentation into substances and attributes and combinations of objects in states of affairs – it is this whole ontology of representation that is being challenged by Ayache’s account of contingency
the infra-wager, as exemplified by derivatives trading, is a practice of time-binding that has the power to generate order and disorder without any prevision
this is why it’s a challenge to any model of political agency which is predictive, which supposes it’s possible to construct the future from the present – the future is non-constructible on this account, it can only be wagered, but the wager is what will constitute the future.
and financialisation is a practice that allows you to constitute the future without having to rely on anything you know about the present – this is its radical political valence
some critical remarks:
I think there’s a problem with the attempted link between differentiation and temporization and systemic time binding
Derrida’s argument is a critique of certain phenomenological orientations, of the Husserlian notion of the living present
Derrida’s claim is that in Husserlian phenomenology what is not yet or what has been are somehow intrinsic to what is, or to the experience of the present, because what is cannot be understood in terms of a substantial actuality – the living present is never punctual because it is constituted by the anticipation of the not-yet and the retention of what has been, and this anticipation and retention are relations, intentional acts on Husserl’s account, which make absence constitutive of the present. but Derrida wants to radicalize this relation between presence and absence by arguing that what is present can only be present according to a certain mode of absence (a deconstruction of the opposition). Every anticipation and retention presuppose the unpresentable. The standard move is to say that Derrida’s critical remarks are already there in Husserl or Heidegger’s account – the subversion of presence already understood by phenomenology.
There’s a problem about transplanting the logic of temporization to a social system, because here it seems that the relationship between the present future and the future present can’t be straightforwardly adjudicated by appealing to something like self-consciousness. In other words, the difference is between what we currently anticipate (present future) and what will be independent of our current anticipation (future present) – there’s a problem with applying this to social systems because current anticipations are also conditioned by a set of representations which unfold in time. There’s a distinction between represented time, the time experienced by a system, and the time of representing, the time in which the system represents its relation to the world and experiences the difference between present future and future present. This is a straightforward philosophical distinction: the representation of succession is not equivalent to the succession of representations – the former is conditioned by the latter, which may not itself be represented.
Surely, if we’re talking about something as complicated as a social system then the difference between present future and future present is itself conditioned by the difference between the present of representation (in which the system constitutes its self-representation) and the future that will condition its present representation. Any system capable of differentiating between present future and future present is embedded in a time order which conditions the time order experienced by the system. There’s a time that is internal to the system and a time that is external that conditions the generation of time within the system. This means there’s something peculiar about using the distinction between present future and future present, and even the cross-contamination between them, to yield this description of the indeterminability of…
This then calls into question the distinction between financialisation as structural a priori and financial markets as historically conditioned social practices. In other words, it’s one thing to say that the practice of pricing doesn’t mirror or reflect price movements in a reality which is somehow independent of the market, but the performativity of pricing, and the fact that you can no longer segment the institution of the contract or price, surely means that what is the determining factor…there is a determinate factor operative in pricing – why isn’t it straightforwardly empirical? Why isn’t it the trader’s contingent psychological state? You need the Derridean register to enforce this distinction between the unpresentable a priori and the contingent historical formation of this wagering on the unpresentable, but that’s only if you’re sure that this account of wagering actually effects time binding.

If the binding of time in the derivatives contract is conditioned by factors that are unavailable to the practitioners of pricing, then in what way can wagering be taken to be constitutive of time binding at a systemic level? Once we have this difference between the representation of time and the time of representing, then everything that’s going on at the level of wagering can be conditioned by factors which may be unavailable to the participants in the wager, but which are perfectly identifiable to a sufficiently fine-grained empirical analysis. So I think the transcendentalisation of the infra-wager is dubious. So, we’ve got pricing as a peculiar practice – the pricing of pricing – which seems to be impervious to any external social determination, and this then becomes the transcendentally constitutive moment – it effects the operational binding for the entire system independently of the empirical conditions of the market traders. Once one questions the transcendentalisation of the Derridean account it seems that the performativity of pricing is purely empirical and not transcendental. The endogeneity and volatility of pricing, the fact that pricing depends on what the traders happen to believe – there’s no rationale for transcendentalising that volatility. The risk inscribed in wagering is empirical and not transcendental – it seems dubious to ontologise it.

A more general remark on this ontologisation of risk: what’s decisive about the claim is that temporal indeterminacy, or the indetermination of the future, is ontological and not merely epistemic, so therefore no rational action or projection can realize the future – in other words we can’t plan or predict. This seems a contentious generalization of an experiential account – it’s explicit in Ayache, with references to figures like Bergson – the whole critique of representation is that there’s nothing but the dynamic movement through which the future realizes itself, which is unrepresentable and unforeseeable on the basis of any information we have, any of those characteristics of the world that we can represent; it is based on the primacy of lived experience over the relationship to objectivity, so it’s undialectical in that sense. And this is why the peculiar consequence, if we read Ayache, is that trading becomes this creative act. Trading becomes an inscription of absolute contingency, it becomes a creative act.
Malik doesn’t talk about creativity, he talks about time binding; it’s as if the synthesis of time depends on this inscription of radical indeterminability. But this is peculiarly subjectivistic: this whole account, which is supposed to be radically anti-humanist, has curiously subjectivistic premises. If one has a more dialectical understanding of the relation between subject and object, and refuses to absolutise immediacy, whether phenomenologically or vitalistically, then it’s not true to say that the future is radically indeterminable. We can successfully predict outcomes…the border between stability and instability can always be temporarily circumscribed by a sufficiently sophisticated understanding of dynamic processes, and I think that here the reason why the BSM account, the attempt to construct a probabilistic model of pricing, is unsatisfactory is not because pricing inscribes some radical indeterminability of the future but simply because everything that traders are doing is an extremely complex nesting of entirely arbitrary inclinations. Because how they will wager is determined by their previous wager and the anticipation of their next wager, this is a series of…a determination of one moment by the next, which can be characterized as purely empirical and doesn’t need to be extrapolated into this speculative register.
The fact that they rely on hunches or guesses, and that these can only be sociologically described and not probabilistically represented, is just a trivial epistemic fact and not something that should be inflated into an ontologisation of risk. Notoriously, it’s not the traders themselves that are incurring these risks, it’s always…the empirical consequences, the gains or losses, will always be incurred by people outside of the market. Malik’s account is an attempt to reappropriate the ontologisation of risk for an emancipatory politics as opposed to a reactionary Darwinian liberalism. Because the claim that the future is radically unforeseeable, that we can’t plan or project or organise, is a familiar claim of neoliberal ideologues – there’s nothing radical about that claim – this whole argument relies on a rejection of voluntarism and rationalism as wholly inadequate to the task of effecting an emancipatory transformation, a reordering of society, on the basis of the well-known catastrophic consequences of revolutionary politics in the twentieth century. But it seems insufficient: there’s no principled argument as to why we can’t successfully reorganize society by trying to construct future institutions on the basis of revising current institutions. Malik is rejecting what he calls this rationalistic account because, if transcendental constitution is social institution, it means that the way in which we understand ourselves and try to organise and reason about means and ends is a reflection of a series of social institutions which are themselves historically determined…
What drops out of this account is that financialisation turns out to be determining in the last instance for socialization, but surely financialisation presupposes a social practice…
That financialisation has this power to create and destroy social order is a consequence of familiar social relations, which have to do with property, and if we understand this I don’t see the need to advert to financialisation in order to achieve this reordering. The key presumption is that a reordering predicated on a program – a programmatic reordering – will simply be totalitarian: that’s the unstated premise here. We know that programmatic reorderings have had totalitarian consequences, but unless…it seems bizarre to make totalitarianism the inevitable consequence of voluntarism, and that’s a standard neoliberal trope – if you will to change the world in any way you’re gonna start shipping people off to gulags. That’s just ideology.

Formalizing Analogical Transitions within a Multi-Level Dynamic Mechanistic Framework


Talk given at ‘Cold Bodies, Warm Machines’ Conference, Düsseldorf, 09-11/09/16.

Firstly, what is the Dynamic Mechanistic Framework?[1] Simply put, it is the way in which we can understand the properties of a system by disassembling it and seeing how its parts work either individually or in combination with other parts. It is a mechanistic framework because it acknowledges that the concept of mechanism has been, and still is, a key idea of modern science by which the characteristics of a system are explained according to the causal structure of its parts. Though the classical mechanistic framework has been critically overcome by the recognition of the ubiquity of non-linearity in dynamic systems theory, the concept of mechanism as a structure-based explanatory framework for the functioning of a system may be retained if it is suitably dynamicized. This mechanistic framework is multi-level and dynamic because it is neither ‘ruthlessly reductionist’ nor rabidly anti-reductionist – it recognizes that the functional characteristics of a system at one level of analysis may be emergent properties that are not well described by the lower level dynamics that support them, without claiming that these properties are irreducible to the behaviour of its structural components (this is called weak as opposed to strong emergence). The framework itself is dynamic since it progresses by several stages of decomposition and localization of functional properties in structural components – if a property cannot be directly localized in a structural component then it may be hypothesized as a property of the interaction of two or more components, and if this hypothesis fails to hold a further level of complexity can be adduced, up to the point where a property can be assumed to emerge at the global level of the whole system.

Another important consideration that necessitates a multi-level analysis is that not only is order dependent on structure but randomness and unpredictability are relative to the theoretical framework or computational set-up (architecture, processing power, input stream). A system or an input is unpredictable only in relation to the theoretical framework that imposes its form of prediction. To understand something means to impute to it some kind of characteristic behaviour under certain circumstances, and this entails prediction, abstraction, and generalization. If we want to act on the world we cannot renounce prediction or abstraction, and we cannot thereby avoid the possibility of some random perturbation to our system. However, we may construct systems that are more or less robust or resilient to perturbations, more or less capable of the dynamic revision and reconstruction of theoretical frameworks. Crucially, different theoretical frameworks are necessary for each level of analysis, and the dynamics of randomness, unpredictability, and robustness or resilience to perturbation are distinct at each level[2] – for example physics, biology and social systems may be distinguished along precisely these lines, as well as classifying the descriptive capabilities of different architectures in the computational hierarchy from finite automata through to universal Turing machines.[3] Moreover, the multi-level dynamic mechanistic framework provides an engineering perspective by which the functional property of intelligence itself is decomposed and the resulting explanatory hypotheses enable traction not only on the problem of engineering artificially intelligent agents, but also on re-engineering the human.[4]
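A toy sketch of this relativity of unpredictability (the generator and the two ‘frameworks’ below are illustrative assumptions, not drawn from the cited papers): the same deterministic sequence is effectively unpredictable to a predictor that only tracks symbol frequencies, but fully predictable to one that models the generating rule.

```python
# Toy illustration of unpredictability being relative to the predictive
# set-up: a linear congruential generator produces a sequence that looks
# patternless to a predictor tracking only symbol frequencies, yet is
# perfectly predictable to a predictor that models the generating rule.
def lcg_bits(seed=1, n=2000, a=1103515245, c=12345, m=2**31):
    x, out = seed, []
    for _ in range(n):
        x = (a * x + c) % m
        out.append((x >> 16) & 1)   # take one mid bit per step
    return out

bits = lcg_bits()

# Framework 1: memoryless frequency model - the best guess is the majority bit.
majority = max(set(bits), key=bits.count)
freq_accuracy = sum(b == majority for b in bits) / len(bits)   # hovers around 0.5

# Framework 2: a model of the mechanism itself - prediction is exact by construction.
mechanism_accuracy = 1.0

print(f"frequency model: ~{freq_accuracy:.2f} accuracy; mechanistic model: {mechanism_accuracy:.2f}")
```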

Why formalization? Formalization may be understood as the explicitation of implicit theoretical frameworks or models, and making these models explicit allows for them to be analysed and transformed. Implicit rule-governed practices will always be in excess of any explicit formalization of them (this was amply demonstrated by the later Wittgenstein’s investigation of language games),[5] however, formalization may act as a point of leverage giving externalized traction on the ideological sedimentation of implicitly structured practices (a point well made by Joshua Epstein with regard to the efficacy and politics of computational modeling).[6] In this sense formalization may be understood as a kind of decomposition of the implicit structure of a theoretical framework, allowing us to grasp its functional characteristics and change it. Moreover, formalization of a model makes possible its implementation in another substrate, such as an artificially intelligent agent.

Why do we talk about analogy here? Firstly, we can consider the functionalist analogy, which understood the brain as a computer, as the initial stage in a dynamic process of decomposition and localization of the functional properties of cognition (a process that moves from the first-wave functionalism of Putnam etc based on symbol manipulation to the dynamic model of neofunctionalism based on connectionism put forward by the likes of Bechtel). Already with Turing it was recognized that if the brain in a very general sense is something like a computer it can be specifically differentiated from the kind of computational processes occurring on a discrete state machine at the lower levels of the computational hierarchy since while the latter is functionally context-free and iterates without variation, the former is context-sensitive or further, as Turing had it ‘prone to exponential drift’, which we may understand in today’s parlance as ‘sensitive to initial conditions’ or non-linear.[7] This breakdown in the analogy does not definitively destroy its usefulness – in fact an analogy that directly maps is hardly fruitful – the usefulness of analogies depends on whether any testable consequences can be deduced from them.[8] In this case the analogy has drawn out the difference between computational processes occurring in discrete state machines and those occurring in biological brains. In fact, modeling itself may be understood at the most general level as an analogical process based on abstraction that ignores certain details while highlighting how other features overlap.
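As a toy illustration of the contrast just drawn (a standard textbook example, not Turing’s own), iterating the logistic map from two nearby initial conditions exhibits the ‘exponential drift’ or sensitivity to initial conditions in question, whereas a discrete state machine re-run on the same input simply repeats itself:

```python
# A standard toy example of what 'sensitive to initial conditions' means
# in the sense used above: iterating the logistic map from two nearby
# starting points, the trajectories drift apart rapidly, whereas a
# discrete state machine re-run on the same input iterates without variation.
def logistic_orbit(x, r=4.0, steps=50):
    orbit = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        orbit.append(x)
    return orbit

a = logistic_orbit(0.300000)
b = logistic_orbit(0.300001)            # differs only in the sixth decimal place
for step in (0, 10, 25, 50):
    print(step, abs(a[step] - b[step]))  # divergence grows quickly with the step count
```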

Secondly, analogical transitions between different contexts are not just a natural aspect of human intelligence but a key cognitive operation of any intelligent agent, along with induction, recursion, and the management of behaviour through the apprehension of contextual frames, discursive markers, and conceptual contents.

As Arzi-Gonczarowski notes, Kant understood that analogy isn’t based on an imperfect similarity of two things, but a perfect similarity of relations between two quite dissimilar things[9] – and moreover understood these kind of supramodal transits in terms of categories. It is extracting these perfect similarities that underlies the flexibility and generativity of higher intelligence (whether carbon-based, silicon-based or a hybrid). Gentner, for instance, models the use of analogy in various subprocesses of learning and reasoning, such as memory retrieval, mapping and structural alignment between different perceptions or environmental contexts, as well as evaluation, abstraction, re-representation and adaptation.[10]

Analogy is key to the development of AI for the same reason that it underlies human reasoning. Furthermore, in interacting with human agents, artificially intelligent agents will need to analogize in order to respond to human behaviour. At the very general level we can distinguish two broad forms of analogical transition – one based on recognition of already given similarities through a diagnostic process of interpretative analysis, and another that is capable of creatively discovering or inventing new analogical transitions through a generative process of synthesis.

In order to understand the centrality of analogical transitions to the general problem of intelligence it is necessary to acknowledge the way in which cognition is embodied, embedded, enacted and extended. It is impossible to separate perception, conception or action from the environment that supports it and to which it is directed. The fact that interaction with an environment plays an essential role in intelligence was recognized in Turing’s early writings, and developed more recently by others, such as Clark and Hutchins. The latter proposes the concept of a cognitive supraindividual including both the environment and the intelligent agent acting within it, which can be fruitfully aligned with the insights of second-order cybernetics.[11]

This generatively extensible supraindividual context-sensitivity can be applied not only to the intelligent agent but also to its perceptions, conceptions, and actions, which can take on an infinite number of different senses, since there is no limit on the number of contexts in which their meaning is transformed. However, this relativity, plasticity, and genericity should not collapse into absolute relativity or sweeping skepticism regarding epistemic limitations since some critical invariable aspect of meaning is held by individual perceptions, conceptions or actions and this must be shared across contexts in order for analogical transitions to make sense (in fact, we should rather think of higher intelligence as meta-contextual for this reason).

Grasping this invariance may present problems in natural language due to the tendency toward an infinite regress of definitions, and definitions of definitions, each of which will have context-sensitive connotations as well as invariants (this is the problem of the excess of practical reason to formalization we talked about earlier). However, contemporary mathematics has fortunately enabled a different approach.

Category theoretic morphisms give formal tools to rigorously describe structure preserving paths (that is, invariants) between different perceptions, conceptions or actions within a single interpretative context, or dually, for a single perception, conception or action across different interpretative contexts.

When an analogy is modeled by a morphism, then the monotonicity of that morphism captures the invariant core or ‘uniformity’ of the analogy, while the non-monotonicity provides a detailed description about the points where the analogy breaks down.[12] Such break-downs or slippages in the analogical transition do not invalidate it but rather require the acknowledgment of non-monotonic variability, and the construction of a non-rigid abstract schema that generalizes over these differences, avoiding determination only where necessary.
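A minimal sketch of this idea (a toy construction, not Arzi-Gonczarowski’s actual category-theoretic formalism): the analogical transition is modelled as a mapping between two hypothetical contexts, with the relations it preserves as its invariant core and the relations it breaks as the points of slippage requiring a non-rigid schema.

```python
# A minimal sketch (not Arzi-Gonczarowski's actual construction) of the idea
# that an analogy can be modelled as a mapping between contexts whose
# preserved relations form its invariant core and whose broken relations
# mark where it slips. The contexts and relations here are hypothetical toys.
solar = {("sun", "attracts", "planet"), ("planet", "orbits", "sun"),
         ("sun", "is_hotter_than", "planet")}
atom  = {("nucleus", "attracts", "electron"), ("electron", "orbits", "nucleus")}

mapping = {"sun": "nucleus", "planet": "electron"}   # the analogical transition

def transport(relations, mapping):
    """Carry each relation across the mapping, leaving unmapped terms unchanged."""
    return {(mapping.get(a, a), rel, mapping.get(b, b)) for a, rel, b in relations}

image = transport(solar, mapping)
invariant_core = image & atom   # relations the analogy preserves (its 'monotone' part)
slippage = image - atom         # relations where the analogy breaks down ('non-monotone' part)
print("preserved:", invariant_core)
print("breaks down:", slippage)
```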

The generation of perceptually, conceptually or actionally incisive cognitive images of the environment, as well as the generation of highly structured analogies, are both based on awareness of lawlike patterns or structure preserving transformations between perceptual, conceptual or actional constituents. The ability to construct an intelligent cognitive image of the environment is then linked to the capacity to critically analyse and creatively synthesise insightful analogies, since both require the exploitation of lawlike patterns or structure preserving transformations.

Since neural nets can trawl vast archives and compile massive data sets, they are able to discern and catalogue the relationship between a far greater number of patterns than any single human is capable of. Moreover, because they operate according to formalized criteria that are both inspectable and reprogrammable, they may explicitly reveal patterns that are obscured for humans by their ideological sedimentation in implicitly structured practices. This is not to say that neural nets or algorithms are immune to ideological bias – on the contrary, prejudice is often embedded in computational systems; often this is opaque to analysis due to the complexity of the system or to proprietary rights over its management, and moreover this may exacerbate the expression or enaction of bias in computer-assisted human behaviour. However, formal tools can give us traction on this problem, not only allowing for the emancipatory amelioration of the computational system, but also facilitating the acknowledgement and transformation of predispositions in human behaviour. For example, a neural net designed by a team at Google was programmed to trawl Google News texts looking for patterns of conceptual associations or word embeddings that could be represented by a simple vector algebra in a multi-dimensional vector space.[13] These word embeddings, for example ‘Paris is to France as Tokyo is to Japan’, capture structure preserving transformations or analogical transitions in the formal language of vector space mathematics. However, since the neural net discovered these analogical transitions by consuming a vast diet of professional journalism, it inadvertently picked up the latent gender bias that pervaded the input data as a whole. Even though gender bias was not noticeable within each single input article, at least to the journalists and editors responsible, the aggregate output displayed a marked prejudice, so that the neural net would make analogical transitions such as ‘man is to computer programmer as woman is to homemaker’. Because this bias was captured and represented by formal mathematical tools it was not only legible in the warped geometry of the neural net’s vector space but also transformable by applying a distortion of inverse proportions, a process the Google team called ‘hard de-biasing’. However, judging the shape of the warp, or the appropriateness of the analogical transitions that populated it, could not itself be formalized and automated in the present system since it would require some kind of artificial general intelligence capable of making normative judgments regarding the appropriateness of associations made on the basis of gender difference. As a result this task was outsourced to democratic vote by a small group of workers on Amazon’s Mechanical Turk. The irony that there is a marked gender asymmetry in the makeup of the Mechanical Turk workforce should not be lost here (hard de-biasing relies on structurally embedded employment asymmetries). Nevertheless, it should not detract from the remarkable power of the neural net and its formalized mathematical tools to apprehend and transform the latent ideological bias sedimented in journalistic practice.
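A toy sketch of the two operations just described, using made-up three-dimensional vectors rather than the actual Google News embeddings: analogy expressed as vector arithmetic, and ‘hard de-biasing’ as the projection of a gender direction out of a word vector, in the spirit of Bolukbasi et al. (2016).

```python
# Toy illustration of (1) an analogical transition expressed as vector
# arithmetic and (2) a 'hard de-biasing' step that projects a gender
# direction out of a word vector. The three-dimensional vectors below are
# invented for illustration, not taken from the actual embeddings.
import numpy as np

vec = {
    "paris":  np.array([0.9, 0.1, 0.0]), "france": np.array([0.8, 0.0, 0.0]),
    "tokyo":  np.array([0.1, 0.9, 0.0]), "japan":  np.array([0.0, 0.8, 0.0]),
    "he":     np.array([0.0, 0.0, 0.5]), "she":    np.array([0.0, 0.0, -0.5]),
    "programmer": np.array([0.4, 0.4, 0.3]),   # carries a spurious gender component
}

# (1) Analogy as vector arithmetic: paris - france + japan should land near tokyo.
query = vec["paris"] - vec["france"] + vec["japan"]
nearest = max((w for w in vec if w not in ("paris", "france", "japan")),
              key=lambda w: np.dot(query, vec[w]) / (np.linalg.norm(query) * np.linalg.norm(vec[w])))
print("paris : france :: ? : japan ->", nearest)

# (2) Hard de-biasing: remove the component along the he-she direction.
gender_dir = vec["he"] - vec["she"]
gender_dir = gender_dir / np.linalg.norm(gender_dir)
debiased = vec["programmer"] - np.dot(vec["programmer"], gender_dir) * gender_dir
print("gender component before/after:",
      np.dot(vec["programmer"], gender_dir), np.dot(debiased, gender_dir))
```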

 

[1] The methodology and its explanation are derived from Bechtel, W. (2010) Discovering Complexity: Decomposition and Localization as Strategies in Scientific Research. MIT Press.

[2] See: Calude, C. & Longo, G. (2015) Classical, Quantum and Biological Randomness as Relative Unpredictability. Natural Computing, Springer.; Longo, G. & Montévil, M. (2012) Randomness Increases Order in Biological Evolution. Frontiers in Physiology, n. 3, 39.; Bravi, B. & Longo, G. (2015) The Unconventionality of Nature: Biology, from Noise to Functional Randomness. Invited Lecture, Unconventional Computation and Natural Computation Conference (UCNC), Auckland (NZ), 31/8 – 4/9/2015, proceedings to appear in Springer LNCS, Eds. Calude, C.S. & Dinneen M.J. pp. 3-34.

[3] Crutchfield, J.P. (1994) The Calculi of Emergence: Computation, Dynamics, and Induction. Physica D, Proceedings of the Oji International Seminar Complex Systems – from Complex Dynamics to Artificial Reality. Elsevier North-Holland, Inc. New York, NY, USA.

[4] Negarestani, R. (2014) The Labor of the Inhuman, in Eds. Mackay, R. & Avanessian, A. #Accelerate: The Accelerationist Reader. Urbanomic.

[5] Wittgenstein, L. (2009) Philosophical Investigations. Wiley-Blackwell.
Cf. Brandom, R. (1994) Making it Explicit: Reasoning, Representing and Discursive Commitment. Harvard University Press. pp.13-30

[6] Epstein, J. M. (2007) Generative Social Science: Studies in Agent-Based Computational Modeling. Princeton University Press.

[7] Longo, G. (2008) Laplace, Turing and the “imitation game” impossible geometry: randomness, determinism and programs in Turing’s test, in eds. Epstein, R., Roberts, G., & Beber, G. Parsing the Turing Test. pp. 377-413, Springer.

[8] Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[9] Caygill, H. (1995) A Kant Dictionary, Blackwell Publishers, Great Britain. p.66 quoted in Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[10] Gentner, D. (1998) Analogy, in: A Companion to Cognitive Science. Blackwell, chapter II(1), pp. 107–113. quoted in Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[11] Hutchins, E. (1995) Cognition in the Wild. MIT Press, Cambridge, MA.

[12] This is demonstrated in Arzi-Gonczarowski, Z. (1999) Perceive this as that – Analogies, artificial perception, and category theory. Annals of Mathematics and Artificial Intelligence, 26, 215–252.

[13] Bolukbasi, T., Chang, K-W., Zou, J., Saligrama, V. & Kalai, A. (2016) Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. arXiv:1607.06520.

 

Viva Presentation

My interest in noise began from a dissatisfaction with available accounts in the humanities, particularly within sonic culture literature, where noise is mostly presented either in terms of the irreducibility of subjective experience to any form of measure or explanation, or as unpredictable material vibrations that disrupt control and undermine forms of domination – in both cases noise is opposed to rationality, which is characterized as compartmentalizing, idealist, and bound to a static ontology. This characterization fundamentally misunderstands reason, missing the complex dialectic between realism, materialism, and idealism. Moreover it is based on a fetishization of chance, randomness, and perturbation, that is endemic in many social contexts, appearing in a number of different guises and with various practical and theoretical ramifications – for example I show certain philosophical and scientific expressions of this fetishization, and also examine its expression in economics and in music. So, my aim was firstly to counter these mischaracterisations by giving a more positive account of the scientific conception of noise. To do this I firstly needed to explain noise in terms of various measures of randomness and unpredictability. This enables a demystification of noise because randomness is scale and language relative but not therefore relativist, irreducibly subjective, or beyond scientific description. Understanding randomness as a scale relative phenomenon means it can no longer be put into a simple opposition with order or rationality. A crucial claim of my thesis is that noise is not a given that can be treated everywhere the same – it is rather a context specific phenomenon that demands a multi-scale analysis in terms of the different levels of dynamics operating in complex hierarchically nested systems. My intention was to deepen the understanding of the concept of noise, to ramify its pathways, by giving it a multi-level and multi-contextual description, ranging over its philosophical, scientific and artistic implications, and exploring its development in the fields of economics and music.

 

My argument is that the problematic conception of noise put forward in the humanities stems from the opposition between the classical scientific image – based on mechanistic determinism (reversible) – and the manifest experience of freedom (perception, conception, and action). This is why I turn to Sellars, and his call for a stereoscopic fusion of the scientific and manifest images. Various strands of thought in the twentieth century (philosophical, scientific and artistic) sought to overcome this opposition by explaining the manifest experience of freedom in terms of self-organising systems (irreversible). Following René Thom I argue that this results in a botched dialectic, where the distinction between thought and thing is unjustifiably collapsed, and freedom comes to be understood as material vibrations beyond any measure of the intellect (while Thom levels this accusation at theorists like Atlan, Prigogine and Serres, I extend it to Deleuze). This problem is compounded in humanities accounts of noise, which generally have a poor understanding of the scientific issues, in particular entropy. Thermodynamic and information theoretic conceptions are often confused, and the distinctions between the dynamics of physical and biological systems, as well as between information processing systems and rational thought, are missed or misunderstood – this is why I turn to Longo’s explanation of these distinctions, and the development of the concept of anti-entropy in contrast with negentropy and entropy.

I argue that moving beyond this botched dialectic requires navigating between two equally problematic ways in which the opposition between thought and thing has been overcome: on the one side a reductionist claim (epitomized by eliminative materialism) that the manifest experience of freedom is an illusion and that all talk of persons and their beliefs will be made obsolete by neuroscientific explanations, and on the other side a fusion of thought and thing based on the unpredictability of self-organising systems and the irreducibility of experience to any form of measure. This is why I find it necessary to turn to Sellars’ inferentialist account of rational thought, which argues for a fully naturalized explanation of thought but understands freedom by returning to the Kantian conception of normative self-determination. Importantly, unlike for Kant, for Sellars this does not stem from or rest on the self-sufficiency of subjective experience, but is naturalistically evolved at the level of collective forms of representation and communication. Sellars builds his naturalistic account of the development of intelligence by referring to the pattern-governed activity of sentient lifeforms, and arguing that humans are not merely pattern-governed but also rule-obeying. Normative freedom consists in the ability to alter behaviour according to reasons; Sellars refers to this as the irreducibility of the ought to the is. This is an important distinction that is often collapsed in recent work in the humanities, inspired by Deleuze, which rejects the representational model of thought based on the propositional form of judgment, and replaces it with bottom-up processes of self-organising systems. Both Deleuze and Sellars offer unconventional analyses of Hume and Kant, and these bear important consequences for the philosophical articulation of the concept of noise and its relation to randomness, freedom, causality, and reason. I explore their different interpretations, systematically rejecting the Deleuzian emphasis on immediacy and intuition whilst maintaining and rearticulating his important insights within a Sellarsian framework.

After addressing the question of noise at the philosophical level I then turn to an exploration of the concept of noise in two more practical contexts: economics and music. I explain in the introduction that this choice is not arbitrary: both are highly mathematical, both draw on or are influenced by probability theory and physics, and both bear interesting relationships to materialism and idealism, to the naturalistic and the normative, and to chance, unpredictability and noise.

The rise of modern finance, and the onset of global capitalist trade networks, is predicated on the mathematics of probability theory. Classical and neoclassical economics are ideologies based on a substantialist conception of value, the equilibrium dynamics of the market, and a natural tendency towards the benefit of all agents. They import concepts from physics – where trajectories are predictable, reversible, and tend towards equilibrium – and apply them to the socio-economic domain, where asymmetries and irreversible, path-dependent processes are pervasive. More than the brutal abstractions of economic theory, the problem again, I argue, stems from a faulty articulation of the naturalistic and the normative. Following the recent work of several different authors I argue that value is derivative of the pricing process, and that every economic transaction, as well as the constitution of the liquidity function of money, is a normative gesture that cannot be justified by pointing to any nature. Neoclassical economics is still the orthodox account, however, and in naturalizing competition within the framework of marginal utility theory it has shaped neoliberal hegemony and influenced global economic policy decisions over a prolonged period. Certain heterodox economic theories have been enjoying renewed attention recently as a result, in particular ecological economics, which has recognized non-equilibrium dynamics, information asymmetries, pervasive feedback effects, and so on. In doing so they move beyond the ideological assumptions of orthodox theories. However, I draw on Longo’s work with Felin, Koppl and Kauffman to argue that the concept of landscape that ecological economics draws on is illegitimately taken from physics and cannot be rigorously applied either to biological ecologies or to the econosphere, where trajectories are unpredictable, possibilities cannot be totalized, and so a phase space cannot be constructed. Again following Longo’s work I argue that the idea of robustness to noise is not pertinent to biological and economic systems, since they are constantly undergoing symmetry-breaking transitions and are thus better described by the concept of resilience. What is needed, therefore, is a synoptic economic theory that is fully naturalized but that accounts for the specificity of socio-economic dynamics in contrast with physics, and that preserves the fundamentally normative character of the pricing process.
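To make the contrast concrete, here is a minimal sketch of a path-dependent process – a Pólya urn, a standard toy model used here purely for illustration, not a model drawn from the thesis – in which early chance events are amplified rather than averaged away, so that different histories settle at different long-run shares:

```python
import random

def polya_urn(steps=10_000, seed=None):
    """Pólya urn: draw a ball at random, put it back together with one more
    of the same colour. Early fluctuations are reinforced rather than damped,
    so each run converges to a different long-run share -- a toy picture of
    irreversible, path-dependent dynamics, in contrast with equilibrium
    models in which the starting point is forgotten."""
    rng = random.Random(seed)
    red, blue = 1, 1
    for _ in range(steps):
        if rng.random() < red / (red + blue):
            red += 1
        else:
            blue += 1
    return red / (red + blue)

# Five histories, five different 'equilibria': the outcome depends
# irreversibly on the order of the early draws.
print([round(polya_urn(seed=s), 3) for s in range(5)])
```

The point of the toy is only that history matters: there is no unique equilibrium towards which the process tends independently of its path, which is precisely what the equilibrium schemes imported from physics assume.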

Lastly, I turn to an analysis of noise in the sonic sense, and its use in music. I begin by outlining a phenomenological account of auditory experience drawing on Husserl and Schaeffer. This establishes the difference between physical soundwaves and auditory experience, which includes various anamorphoses as well as the global integration of auditory information into a window of simultaneity constituting a specious present marked by retentions and protentions. I examine bistable and multistable auditory phenomena as revealing the active subpersonal processes of signal disambiguation and image construction underlying the apparent seamlessness of auditory scene analysis. This can be understood as the Bayesian calibration of a decision criterion on a probability distribution elicited by a received signal. I then review some examples of music that explore this sweet spot of perceptual illusion, such as spectral music and post-techno. I argue that music is more than just ‘organised sound’ and can be better understood as the diagrammatic or gestural constitution of a counterfactual space and time allowing for the transformation of behaviour within this new possibility space. I then delve further into the neuroscientific account of auditory experience, drawing on Metzinger’s self-model theory of subjectivity and his multi-level description of the constraints of conscious experience. I then explore the use of noise in contemporary music, discussing its relation to art theory and to genre, and arguing for the necessity of a multi-level account that includes semantics and syntax as well as the more traditional focus on rhythm, melody, harmony and so on. Finally I show how tools from contemporary mathematics – category theory and topos theory – can deepen the analysis of music, allowing for an understanding of musical time in terms of symmetry breaking, as well as accounting for the gestural constitution of this time in the self-referential structure of music.
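The ‘Bayesian calibration of a decision criterion’ mentioned above can be illustrated with a textbook signal-detection sketch (my own toy example under assumed Gaussian sources; the helper name posterior_signal is hypothetical and not from the thesis): a received value is perceptually ambiguous to the degree that its posterior hovers around the criterion.

```python
import math

def posterior_signal(x, mu_noise=0.0, mu_signal=1.0, sigma=1.0, prior_signal=0.5):
    """Posterior probability that a received value x came from the 'signal'
    source rather than the 'noise' source, assuming two Gaussian sources.
    A purely illustrative signal-detection sketch, not the thesis's model."""
    def gauss(x, mu):
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
    joint_signal = gauss(x, mu_signal) * prior_signal
    joint_noise = gauss(x, mu_noise) * (1 - prior_signal)
    return joint_signal / (joint_signal + joint_noise)

# With a fixed criterion (say, report 'signal' when the posterior exceeds 0.5),
# values near the midpoint of the two distributions are the ambiguous,
# bistable cases; shifting the prior recalibrates the criterion.
for x in (-0.5, 0.5, 1.5):
    print(x, round(posterior_signal(x), 3))
```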

AntiNatural

AntiNatural Talk

The concept of nature is on the one hand a mainstay of diverse forms of conservative ideology where the natural is predominantly opposed to the artificial, the idealistic, and the deviant or perverse.

On the other hand, there is an attempt to overcome the oppositions entrenched in this conservative perspective by expanding the concept of nature so that it includes technology, fantastic ideas, and unconventional behaviour.

The conservative position is overly deterministic; it justifies normative prescriptions concerning what ought to be the case by pointing to naturalistic descriptions of what is the case.

The expanded concept of nature is overly indeterministic – its emphasis is on the indeterminate potential of becoming, but by extending the concept of nature to all behaviour the normative capacity for judgment loses its purchase. By fetishizing natural processes of self-organisation over representation and the propositional articulation of reason it remains complicit with the neoliberal logic of late capitalism.

An adequate conceptualization of nature must navigate between these positions.

In contrast with the conservative position, nature is neither a given nor a justification. The market does not naturally tend towards the formation of fair prices. Income inequalities and traditional gender roles are not human nature. Neither the ecological nor the social can be seen as a natural order; they are not fragile equilibria that must be guarded against technical and social transformations. They are both artificial constructions that have never been anything but far from equilibrium.

In contrast with the expanded conception of nature, we must distinguish between a naturalistic description of the different levels of information-processing dynamics and the normative-linguistic capacity for the top-down control of behaviour according to propositionally articulated reasons.

To be human is to enter into a game of artificial self-construction at the level of the social – this is an ongoing process of alienation from nature, a progressive deracination from the local exigencies that constrain thought and behaviour. Nature is no reason, and reason is not natural even if it is part of our nature. As Negarestani says, reason is inhuman and ‘Inhumanism is the labor of rational agency on the human’; this is the elaboration of what it means to be human. Nature reflexively grasps its anti-nature and thus transforms itself. Freedom is not given but is performed or produced in this labor of reflexive transformation.

In order to grasp the true nature of this freedom it is necessary to avoid two tendencies to misconceptualise the relationship between reason and causality; on the one hand ruthless reductionist accounts that aim to eliminate the inherited illusions of folk psychology, on the other hand emergentist accounts that argue for the irreducibility of thought to causal processes. The former deny the normative-linguistic force of reason, the latter deny the causal-naturalistic explanation of reason.

To give a mathematical description of a process is to naturalise it – to explain it according to natural laws. When Galileo mathematised the supralunary realm he naturalized the heavens.

To claim that something is not amenable to explanation according to natural laws is anti-scientific mystification.

For Bergson, and for Deleuze, the lived experience of duration is natural, in a vitalistic expanded conception of nature, but it cannot be mathematized and so is not amenable to explanation according to natural laws.

This is a mystification of experience. On the contrary, the lived experience of duration can be naturalized according to a neurophenomenological description of the global architecture of consciousness.

What cannot be axiomatised, and what is in that sense anti-natural, is reason. The normative-linguistic capacity of thinking can be explained in terms of the causal structure of neuronal activity, but a description of the brain’s neuronal activation patterns in any one state is no indication of what its subsequent state will be.

If we give a description of an inanimate object like a rock, specifying its position and velocity, we can calculate with accuracy where it will be at some point in the future.

Being here entails that it will be there.
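A trivial worked case, with illustrative numbers of my own: for uniform motion the present state entails the future position.

```latex
x(t) = x_0 + v\,t, \qquad x_0 = 0\ \mathrm{m},\; v = 2\ \mathrm{m\,s^{-1}} \;\Rightarrow\; x(10\ \mathrm{s}) = 20\ \mathrm{m}.
```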

But there are no entailing laws for predicting the trajectory of biological organisms or neural assemblies.

However, this does not mean that the freedom of thought is just the unpredictable randomness of neuronal activation patterns.

The freedom of rational subjectivity, its logical irreducibility to any naturalized description, is its capacity to acknowledge, construct and revise rules, and to perceive, think and act according to these commitments.

Reason is fully naturalistic, in the sense that it is amenable to scientific explanation in terms of its causal structure and its functional properties, whilst also requiring a further level of description that must be addressed at the normative level of commitments and entitlements.

The definition of freedom has been bound up with the philosophical problem of necessity and chance, determinacy and indeterminacy, and this has caused a great deal of confusion.

Continental theory is in particular to blame for promoting a ‘botched dialectic’ that makes ‘self-organising’ randomness and perturbations below the threshold of measurement the wellspring of freedom and creativity against the rational description of systems in terms of mechanistic determinism.

Countering this ‘pseudo-libertarian imposture of chaos’ does not mean returning to a dualistic conception in which material processes are reduced to the linear causal regime of particle impacts and opposed to some form of spontaneous unconstrained freedom. Rather, it demands a reconceptualization of the relation between reason and randomness that resists the temptation to hypostatise chance.

This argument follows René Thom’s criticism of the glorification of chance in the form of random fluctuations and perturbations in the diverse philosophies of Monod, Prigogine, Atlan, and Serres. I think Thom’s critique can be extended to the very different ways in which randomness, self-organising systems and noise have been misconceptualised and fetishised in philosophies such as Deleuze’s, in political economy, and in the theory and practice of music (which I don’t have time to go into).

Thom’s argument follows from a negative definition of randomness, as what exceeds simulation or formal description. He explains that the capacity for simulation or formal description is relative to a certain scale of observation, and that this is particularly true for the analysis of complex hierarchically organised systems such as ourselves.

It could be argued that Thom has a merely epistemological understanding of randomness, and cannot thereby think its ontological scope. However, this would be mistaken; his argument is that any talk of randomness presupposes the definition of a frame of reference, or context, and a language or means of representation; ‘any discussion on the theme of “chance vs. determinism” must begin with an examination of the languages and formalisms which permit making a phenomenon the object of a realm of knowledge.’ This approach is corroborated by James Crutchfield’s ‘computational mechanics’, which also argues that any measure of disorder is relative to the descriptive tools employed, and the specification of this language is defined by what the model is intended to observe.
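A crude way to see that the measured disorder depends on the descriptive frame (my own toy calculation, not Thom’s apparatus or Crutchfield’s formalism): estimate the empirical entropy per symbol of the same sequence under descriptions of different block lengths.

```python
from collections import Counter
from math import log2

def block_entropy_rate(seq, block):
    """Empirical Shannon entropy per symbol when the sequence is described
    in (overlapping) blocks of the given length."""
    blocks = [seq[i:i + block] for i in range(len(seq) - block + 1)]
    counts = Counter(blocks)
    total = sum(counts.values())
    h = -sum((c / total) * log2(c / total) for c in counts.values())
    return h / block

# The alternating string looks maximally 'disordered' to a description that
# only sees single symbols (1 bit/symbol), but longer-block descriptions
# expose its regularity and the per-symbol entropy falls towards zero:
# the randomness attributed to the sequence is relative to the language
# chosen to describe it.
seq = "01" * 500
for b in (1, 2, 4):
    print(b, round(block_entropy_rate(seq, b), 3))
```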

Thom begins with an epistemological definition of randomness and draws an ontological conclusion from this; he affirms the ultimate describability of nature in principle (i.e. the non-existence of fundamental limits to reason), and thereby denies the hypostatisation of chance: ‘To assert that “chance exists” is therefore to take the ontological position which consists in affirming that there are natural phenomena which we shall never be able to describe, therefore never understand.’ Thom’s negative ontological claim might be rephrased as the positive assertion that for any context-specific or scale-relative appearance of randomness, there are no a priori limitations to its description or scientific understanding at another scale. One might argue then that randomness exists (has an objective ontological status), but only as an effect of information processing dynamics and multi-scale complexity.

To summarise, we are not free because of the indeterminacy of nature or because of a lack of constraints, but because the complex hierarchically nested structure of constraints in dynamic systems such as ourselves enables us to control our behaviour according to rules and to make choices according to reasons. As techno-scientific knowledge progresses, more and more complex phenomena will yield to a naturalized description, finally leading to a fully objectified account of experience.

Having a naturalized description of something makes various control opportunities available that were hitherto unimaginable. The more that consciousness is given a natural description, the more we can gain traction on the parochial limitations of biological cognition and transcend them. This is the infinite goal of anti-nature: lean forwards and activate the revisionary-constructive engineering loop.

New Abstract

This thesis aims to elaborate the theoretical and practical significance of the concept of noise with regard to current debates concerning realism, materialism, and rationality. The scientific conception of noise follows from the developments of thermodynamics, information theory, cybernetics, and dynamic systems theory; hence its qualification as irreversible. It is argued that this conceptualization of noise is entangled in several polemics that cross the arts and sciences, and that it is crucial to an understanding of their contemporary condition. In particular, there are ruthless reductionist accounts that aim to eliminate the inherited illusions of folk psychology on the one hand, and emergentist accounts that argue for the irreducibility of thought to causal processes on the other hand. The former deny the normative-linguistic force of reason, the latter deny the causal-naturalistic explanation of reason. In contrast with either tendency this thesis contributes to the theoretical articulation of noise by arguing for the necessity of an inferentialist account of reason that is fully naturalistic, in the sense that it is amenable to scientific explanation in terms of its causal structure and its functional properties, whilst also maintaining a normative conception of freedom that must be addressed at the level of commitments and entitlements in the space of reason. It draws on complexity theory to give a multi-level account of noise, and argues that randomness is an intrinsic functional component at all levels of complex dynamic systems, including higher cognition and reason. After surveying the scientific and philosophical context, the practical understanding of noise in terms of probability theory is elaborated through a history of its development in the field of economics, where the idealization of randomness has had its most pernicious effects. Moving from the suppression of noise in economics to its glorification in aesthetics, the experience of noise in the sonic sense is first given a naturalistic neuro-phenomenological explanation. Finally, the theoretical tools developed over the course of the inquiry are applied to the use of noise in music. The rational explanation of randomness, and the active manipulation of probability that this enables, is opposed to the political and aesthetic tendencies to fetishize indeterminacy. This multi-level account of constrained randomness demystifies noise, showing it to be an intrinsic and functionally necessary condition of reason and consequently of freedom.

Irreversible Noise: The Veneration of Chaos and the Rationalization of Indeterminacy

This site is really less of a blog than a repository for previous work I’ve done, much of which I no longer endorse. Basically it’s a dumping ground where expired ideas can decay in public, unleashing their toxins on the unsuspecting visitor. It’s about time I updated the compost pile by adding a relatively fresh heap of conceptual detritus. So, here’s a summary of what my thesis is supposed to be doing:


Irreversible Noise: The Veneration of Chaos and the Rationalization of Indeterminacy

The premise of this thesis is that it is the ‘strategic ambiguity’ of the concept of noise that allows for its exemplary traversal across artistic, scientific, economic and political fields. In contrast to many other accounts that mobilize the concept against the prevailing order, thereby endorsing an ontological indeterminism that risks falling into negative theology (veneration of chaos), it attempts to show how the concept’s transdisciplinary articulation has allowed for a progressive generation of constraints and rationalization of uncertainty. This horizontal extension in the topological space of the concept (topos of noise) is deepened by a stratigraphic cut through its multiple dimensions of verticality, revealing the complex hierarchical dynamics of its structural constitution.


Considering philosophy to be a metadiscursive practice in which the concept finds its full elaboration, the thesis examines how metaphysical, ontological and epistemological claims expand the conceptualization of noise by tracing and reconfiguring the ramified pathways of its topos. It then proceeds to describe the physical, chemical and biological generation of noise in terms of thermodynamic, morphodynamic, and teleodynamic processes that are necessary preconditions for the emergence of complex thought. The prosthetic construction of a space of reason is then analyzed in terms of its socio-technical development, deploying the multi-scale extension of the concept of noise to render the relation between techné and episteme explicit with regard to indeterminacy and randomness. This is further explored according to a functionalist and pragmaticist account of reason as a revisable structure of commitments.


Noise plays a central role in the contemporary rearticulation of reason, since reason crucially draws on abductive as well as deductive and inductive inferences, whose error-tolerant capacity for revision allows the ampliative expansion of knowledge through hypothesis generation. This inferential aspect of noise is pursued through an investigation of computation and algorithmic incompressibility, and in terms of heterodox economic accounts that draw on thermodynamic, evolutionary and ecological science. Phenomenological descriptions of sensory interference and neurophilosophical explanations of the stochastic randomness of brain functions demonstrate the necessity of multi-scale noise for higher-order processes. Finally, the escalating tendency towards the inclusion of complex sound in twentieth-century music is shown to be irreducible to analysis in either aesthetic or conceptual terms alone; a full appreciation must take account of the dynamic depth of noise.
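As a rough indication of what ‘algorithmic incompressibility’ amounts to in practice (a sketch of my own, and only a proxy: Kolmogorov complexity is uncomputable, so an off-the-shelf compressor stands in for it), patterned data compresses drastically while pseudo-random data hardly compresses at all:

```python
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size divided by original size -- a rough, computable
    stand-in for algorithmic (in)compressibility."""
    return len(zlib.compress(data)) / len(data)

# Illustration only: a highly patterned byte string is very compressible,
# whereas pseudo-random bytes are practically incompressible (ratio near 1).
patterned = b"abc" * 10_000
random.seed(0)
noisy = bytes(random.randrange(256) for _ in range(30_000))
print(round(compression_ratio(patterned), 4), round(compression_ratio(noisy), 4))
```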