Modelling Worlds: The Politics of Simulation

A guest post from Nathan Coombs who is an incoming Leverhulme Early Career Research Fellow at the University of Edinburgh. He edits the Journal of Critical Globalisation Studies, and is the author of the forthcoming book, Politics of the Event: From Marxism to Contemporary French Theory (Edinburgh University Press, 2015). His current research interests are in financial algorithms and financial regulation. He can be contacted at n.coombs (at) fastmail.co.uk


 


Over the last decade, scholars have become increasingly interested in what we do when we make use of models and simulations. An emerging consensus – often legitimated through reference to Bruno Latour’s Actor-Network Theory – is that mathematical models and computer simulations are not passive tools but rather a material force in their own right. Agents may employ such technologies to achieve pre-determined ends, but the technologies themselves have an efficacy that exceeds their users’ intentions, setting in place path-dependencies that circumscribe the range of political and economic possibility.

This concern with the politics of technology cuts across multiple disciplines including Sociology, Communication Studies, International Relations, International Political Economy, and Management Studies. However, the Social Studies of Finance (SSF) has perhaps gone furthest in exploring the practical implications of modelling and simulation technologies. Applying Austinian and Barnesian notions of performativity, researchers in this field have sought to grasp the way in which economic models shape markets, and to dig into the mathematical and technical details that underpin this process.

Donald MacKenzie’s book An Engine, Not a Camera (2008) is exemplary of this approach, and a common point of reference for scholars in SSF and all the aforementioned disciplines. In his analysis of the development and uptake of the Black-Scholes option-pricing model in the 1970s, MacKenzie aims to show how the model’s employment of the efficient market hypothesis – where stock prices are considered to accurately reflect their risk – led to a period in which the pricing of options came to reflect the prices predicted by the model. The point of MacKenzie’s analysis is not to endorse the neoclassical economic assumptions codified in the model. Rather, it is to point out how models serve to socially facilitate evaluation practices in the face of complexity, uncertainty, and epistemological opacity. On this basis, a model can also contribute to financial instability when it is both widely employed and based on assumptions that are confounded by ‘real world’ contingencies.
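For readers who have not encountered it, the formula at the centre of MacKenzie’s story is itself quite compact. The following is a minimal Python sketch of the standard Black-Scholes price for a European call option; the function and parameter names are my own, and this is an illustration of the textbook formula rather than of any particular trading implementation.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(spot, strike, expiry, rate, vol):
    """Black-Scholes price of a European call option.

    spot:   current price of the underlying stock
    strike: exercise price of the option
    expiry: time to expiry, in years
    rate:   risk-free interest rate (annualised)
    vol:    volatility of the underlying (annualised)
    """
    d1 = (log(spot / strike) + (rate + 0.5 * vol ** 2) * expiry) / (vol * sqrt(expiry))
    d2 = d1 - vol * sqrt(expiry)
    # Expected payoff of exercising, discounted to the present
    return spot * norm_cdf(d1) - strike * exp(-rate * expiry) * norm_cdf(d2)

# A stock at 100, an at-the-money one-year option, 5% rates, 20% volatility
price = black_scholes_call(spot=100, strike=100, expiry=1.0, rate=0.05, vol=0.2)
```

What matters for MacKenzie’s argument is less the mathematics than the sociology: once traders priced options with sheets derived from this formula, observed market prices converged on the model’s outputs, until the 1987 crash confounded its assumptions.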

Continue reading

The Best Things In Life Are Free?: Open Access Publishing and Academic Precarity

The fifth post this week on open access and its impact on IR (amongst other social sciences) from previous guest poster Nathan Coombs (follow the blue underlines for the first, second, third, fourth and sixth posts). Nathan is completing a PhD in politics and philosophy in the Department of Politics and International Relations, Royal Holloway, University of London. He is co-founder and co-editor of the transdisciplinary, open-access journal, the Journal of Critical Globalisation Studies. He has a book forthcoming in 2013: The British Ideology. Images by Pablo.


When my colleagues and I established the open-access Journal of Critical Globalisation Studies in 2009, open-access publishing meant to us placing an academic journal online that would be free for both our contributors and our readers. We took inspiration from open-access journals in critical philosophy such as Parrhesia and Cosmos and History, the efforts of the Open Humanities Press, and the Australian book publisher Re.Press, who make PDFs of their releases available online simultaneously with their distribution to bookstores.

Since this time, however, the term open-access seems to have become increasingly polyvalent. As discussed in contributions to this series of reflections by Pablo, Colin Wight and David Mainwaring, open-access publishing is now endorsed by government and publishers. Yet the price of this move into the mainstream has unfortunately been a watering down of the term. In the ‘gold’ open-access publishing scheme proposed by the Finch report, for instance, universal access to academic publications is secured, but only by preserving the existing journal subscription system and by introducing Article Processing Charges (APCs) for authors.

Whether these pseudo open-access schemes will prove to be unstable transitional forms or lasting models, only time will tell. In any event, for my contribution I want to focus on open-access in its fully fledged form: ‘full open-access’ we will call it. The model of full open-access, as operated by the JCGS, does not permit any persistent role for the private (profit-motivated) sector within academic journal publishing. Full open-access journals are housed on independent or university-affiliated websites, freely available to everyone in the world with an internet connection, and provide a free anonymous peer-review service for contributors.

Let us imagine a world where academic journal publishing turned over completely to this approach. Journal subscription fees would be swept away. Academics would take control over their publishing arrangements. The profits of corporate publishers would dwindle to zero. An enticing scenario for anyone exasperated with the status quo.

As with all things that sound too good to be true, though, caution is required. Continue reading

Who’s Afraid Of Economic Models?

A guest post by Nathan Coombs, a doctoral student in Politics and International Relations at Royal Holloway. Nathan’s work focuses on the relationship between metaphysics and political ideology. Nathan is the author of ‘The Political Theology of Red Toryism’, published in the Journal of Political Ideologies, 16(1), February 2011, as well as a number of other papers. He is also an editor of the Journal of Critical Globalisation Studies, an open-access peer-reviewed academic journal which should be a stimulus to us all. Images by Pablo.


If there is a point of unity for strong (non-Keynesian) critics of neoclassical economics, it is their shared rejection of modelling. This is not to say such authors shun all use of abstraction, idealisation, and quantification in favour of purely qualitative, empirical efforts at explanation. Rather, modelling is held out as a practice whereby mathematical attempts to grasp economic laws become unhinged from reality; where abstraction begets abstraction for its own sake. For example, in his famous methodological treatise, Economics and Reality, Tony Lawson firmly demarcates the form of abstraction he recommends for economics from the practice of modelling, stressing that: “it seems vital that I indicate why the procedure to which I refer does not at all reduce to the activities of the ‘modelling’ project in question.” (Lawson 1997, 227)

For different reasons converging to the same end, advocates of the most active strand of Marxian economics, working from the Temporal Single System Interpretation (TSSI), are equally averse to modelling, associating it with simultaneist, equilibrium interpretations of Marx’s labour theory of value that diverge from the correct representation of the theory. In Andrew Kliman’s recent book, Reclaiming Marx’s “Capital”, the word ‘model’ occurs only negatively, in association with what he argues are flawed abstractions of the theory, from Okishio’s theorem through much twentieth-century political economy (Kliman 2007, 44, 48, 66, 101, 176). The idea that Marx’s Capital might itself be considered a theoretical model of the economy is out of the question.

What explains this resistance to modelling among critics of the status quo in economics?

Continue reading