Toward a New Concept of Genocide: A Reply

Our symposium on Benjamin Meiches’s The Politics of Annihilation: A Genealogy of Genocide (University of Minnesota Press, 2019) concludes with the author’s response to the participants. You can find all the previous entries listed here.


As I read each of the pieces in this symposium, I felt a sense of deep gratitude. While scholars regularly discuss issues with one another, it is truly rare that our research becomes the subject of such serious, thorough engagement. Each of the contributors to the symposium made insightful comments, showcased their critical acumen, and read The Politics of Annihilation with agonistic respect. Each commentary gave me new insight into the work. Indeed, a friend of mine in Disability Studies maintains that you only know what a book is about after you finish writing it. To the contrary, I think you only know what a book is about after you hear what it has done (or not done) for others. In that sense, these contributions have given me some of the first insights into what this text is actually about. So let me begin by extending a heartfelt thanks to Jelena, Alex, Jessica, and Myriam for their time, generous feedback and consideration. I also wish to thank Antoine Bousquet for both suggesting and organizing the symposium.

Jelena’s piece calls attention to the problem of linguistic policing and the danger of focusing on language rather than actual violence. She describes this as an international phenomenon by pointing to the ongoing debate in the United States about whether the Trump administration’s detention facilities are ‘concentration camps’ and to the classification of Srebrenica as the only ‘act of genocide’ in the context of the ICTY. Each case, Jelena contends, creates a distinct problem. On the one hand, the politicization of ‘concentration camps’ involves “gatekeeping of the use of certain historical terms and the prohibition of analogizing from past to today that is freezing political action.” Entrenched debate over terminology saps energy that could be used to dismantle these institutions of confinement and violence. On the other hand, language is important since it lays the groundwork for other types of denial and disavowal. At worst, historical designations may become the touchstone that legitimates contemporary political violence. The difficulty then is that focusing too much on language obscures material conditions, but, at the same time, ignoring discursive power risks the derealization of violence.


The Politics of Annihilation: A Symposium

The Disorder of Things is delighted to be hosting over the coming week a symposium for Benjamin Meiches’s important new book The Politics of Annihilation: A Genealogy of Genocide (University of Minnesota Press, 2019). Following Benjamin’s introductory post below, we will have a rich set of interventions from Jelena Subotic, Alexander Barder, Jessica Auchter, and Myriam Fotou before a final rejoinder from the author. All the entries in this series will be collated here. Previous symposia are also available.

Benjamin Meiches is Assistant Professor of Security Studies and Conflict Resolution at the University of Washington Tacoma. In addition to his new monograph, he has contributed a variety of articles to International Political Sociology, Security Dialogue, Critical Studies on Security, and Review of International Studies.

 


“New conceptions require new terms” – Raphaël Lemkin (Axis Rule in Occupied Europe)

“To affirm is not to bear, carry, or harness oneself to that which exists, but on the contrary to unburden, unharness, and set free that which lives.” – Gilles Deleuze (Nietzsche and Philosophy)

Raphaël Lemkin coined the neologism ‘genocide’ in 1944 in a publication called Axis Rule in Occupied Europe, a voluminous study that documented legal and policy changes in Europe under Nazi rule. Little did Lemkin know that, less than a century later, this term would become one of the most charged in contemporary politics. Indeed, within a generation, an explosion took place that transformed the concept of genocide from little more than a scholarly heuristic buried in the midst of a legal tome into the most symbolically vexing and affectively potent form of rhetoric in global politics.

Although barely seven and a half decades separate the genesis of the concept of genocide from today, a great drift took place during this period. Consider, for example, a popular and critical use of genocide discourse today. N. K. Jemisin, author of the brilliant science fiction series The Broken Earth, uses the character of Nassun to speculate about the meaning of genocide. In the text, Nassun is a member of a hunted group called ‘orogenes’ who suffer murder, enslavement, and torture over millennia. Through Nassun’s voice, Jemisin addresses the problem of genocide. Specifically, Nassun states: “But breathing doesn’t always mean living, and maybe…maybe genocide doesn’t always leave bodies.” In some sense, Nassun (or Jemisin) is correct to view this statement as a new (and important) image of genocide because, today, the dominant images of genocide focus primarily on the act of mass killing based on ethnic, religious, national or racial identity. The irony is that Jemisin’s (or Nassun’s) image of genocide, the genocide that may or may not leave bodies, resonates far more powerfully with the vision of genocide articulated by Lemkin and many of his interlocutors at the inception of this discourse.


Pyrrhic Victories: The Endgames of Accelerationist Efficacy

The fourth commentary, and fifth post, on Nick Srnicek and Alex Williams’ Inventing the Future, delivered by Aggie Hirst and Tom Houseman. Aggie is a Lecturer in International Politics at City University London. She works on issues relating to violence and international theory/philosophy, including war and wargaming, US foreign policy, Derrida, Nietzsche, and post-foundational ethics/politics. Tom is a Lecturer in International Politics at the University of Manchester, focusing on capitalism, development, and ideology. He is variously interested in (in no particular order) the politics of epistemology, apocalypticism, Adorno, international development, and concepts of science.


In a climate of successive defeats, missed opportunities and the consolidation (and even exacerbation) of unequal and exploitative social relations, there are few acts more thankless than turning the weapons of iconoclasm against those already waging a struggle against insurmountable odds. Inventing the Future seeks to rescue the Left from what its authors term ‘folk politics’: a commitment to horizontal, local, consensual and prefigurative forms of political action, which the authors claim result ultimately in impotence and irrelevance, aimlessness and lack of focus. In condemning a host of the post-68 Left’s most dearly held praxiological and ethical commitments, Srnicek and Williams wilfully risk aggravating and alienating those they seek to influence.

There will be many readers who will find their prescriptions – the revival of universalism, the aspiration to hegemony, the mobilisation of state power – outdated, odious and even obscene. And for good reason: the attack on ‘folk politics’ doesn’t end after the critique that opens the book. Instead, the sheer audacity of the authors’ wager – essentially that our only hope of defeating the Godzilla of neoliberal capitalism is the creation of an equally powerful Mechagodzilla capable of supplanting the former’s hegemony with its own – performs an ongoing rejection of a parochialism and modesty they see as having corrupted Leftist activism and academia. Like all iconoclasm, such a move is necessarily scandalous in response to the perceived sanctity of that at which it takes aim.[1]

It is precisely this scandalous character of both the book and its precursor, the ‘Manifesto for an Accelerationist Politics’ (MAP), which goes some way to accounting for the attention the authors have generated across the Left. The book’s stated goals are both vast in scope and highly controversial, yet its tone is one of consistent and calm self-assuredness. The magnitude of the risks associated with the project – the casualties of automation (both human and environmental), the tyrannies of engineering consent, the violences of assuming the task of constructing people’s very identities, to point to just a few – would suffice to make most recoil in dread. The authors’ composed confidence in the face of such potential horror makes reading and responding to the seductions of such a book a complex and disorientating task.


What We Talked About At ISA: Cognitive Assemblages


What follows is the text of my presentation for a roundtable discussion on the use of assemblage thinking for International Relations at ISA in early April.


In this short presentation I want to try and demonstrate some of the qualities assemblage thinking brings with it, and I’ll attempt to do so by showing how it can develop the notion of epistemic communities. First, and most importantly, what I will call ‘cognitive assemblages’ builds on epistemic communities by emphasising the material means to produce, record, and distribute knowledge. I’ll focus on this aspect and try to show what this means for understanding knowledge production in world politics. From there, since this is a roundtable, I’ll try and raise some open questions that I think assemblage thinking highlights about the nature of agency. Third and finally, I want to raise another open question about how to develop assemblage theory and ask whether it remains parasitic on other discourses.

Throughout this, I’ll follow recent work on the concept and take ‘epistemic communities’ to mean more than simply a group of scientists.[1] Instead the term invokes any group that seeks to construct and transmit knowledge, and to influence politics (though not necessarily policy) via their expertise in knowledge. The value of this move is that it recognises the necessity of constructing knowledge in all areas of international politics – this process of producing knowledge isn’t limited solely to highly technical areas, but is instead utterly ubiquitous.

1 / Materiality

Constructivism has, of course, emphasised this more general process as well, highlighting the ways in which identities, norms, interests, and knowledge are a matter of psychological ideas and social forces. In Emanuel Adler’s exemplary words, knowledge for IR “means not only information that people carry in their heads, but also, and primarily, the intersubjective background or context of expectations, dispositions, and language that gives meaning to material reality”.[2] Knowledge here is both mental, inside the head, and social, distributed via communication. The problem with this formulation is that decades of research in science and technology studies, and in cognitive science, have shown it to be a partial view of the nature of knowledge. Instead, knowledge is comprised of a heterogeneous set of materials, only a small portion of which are in fact identifiably ‘social’ or ‘in our heads’. It’s precisely this heterogeneity – and more specifically, the materiality of knowledge – that assemblage thinking focuses our attention on.

Knowledge is inseparable from measuring instruments, from data collection tools, from computer models and physical models, from archives, from databases and from all the material means we use to communicate research findings. In a rather persuasive article, Bruno Latour argues that what separates pre-scientific minds from scientific minds isn’t anything to do with a change inside of our heads.[3] There was no sudden advance in brainpower that made 17th century humans more scientific than 15th century humans, and as philosophy of science has shown, there’s no clear scientific method that we simply started to follow. Instead, Latour argues the shift was in the production and circulation of various new technologies which enabled our rather limited cognitive abilities to become more regimented and to see at a glance a much wider array of facts and theories. The printing press is the most obvious example here, but also the production of rationalised geometrical perspectives and new means of circulating knowledge – all of this contributed to the processes of standardisation, comparison, and categorisation that are essential to the scientific project. Therefore, what changed from the pre-scientific to the scientific was the materiality of knowledge, not our minds. And it’s assemblage thinking which focuses our attention on this aspect, emphasising that any social formation is always a collection of material and immaterial elements.

In this sense, questions about the divide between the material and the ideational can be recognised as false problems. The ideational is always material, and the constructivist is also a materialist.


Damage, Unincorporated*, Part Two: War Studies in the Shadow of the Information Bomb

I’m thinking about something much more important than bombs.
I am thinking about computers.

John von Neumann, 1946 (via The Scientific Way of Warfare)

Modern war has become too complex to be entrusted to the intuition of even our most trusted commander. Only our giant brains can calculate all the possibilities.

John Kemeny, 1961 (ditto)

‘Extreme science’ – the science which runs the incalculable risk of the disappearance of all science. As the tragic phenomenon of a knowledge which has suddenly become cybernetic, this techno-science becomes, then, as mass techno-culture, the agent not, as in the past, of the acceleration of history, but of the dizzying whirl of the acceleration of reality – and that to the detriment of all verisimilitude.

Paul Virilio, The Information Bomb (1998)

Non-Consensual Hallucinations

A recent spate of cyber-attacks, and the civilian-military responses to them, have pushed questions of collective violence, technological complexity and the very relation between war and peace into a more mainstream arena. Alongside diagnoses of the political impact of Web 2.0, the analysis of contemporary technoscience and its militarised uses seems less like neophiliac marginalia than an urgently-required research program. As previously indicated in Part One of this review, a number of recent works have broached this subject, and in the process have addressed themselves to the very relation between bios and technos, sometimes with the implication that the latter is on the verge of overwhelming the former. Skynet gone live!

Critical engagement with the boundaries and possibilities of Network-Centric Warfare (NCW) thus opens a range of complex problems relating to the co-constitution of war and society, the place of ethics in military analysis (and military practice) and the adequacy of standard categories of social science to world-changing inventions. To expect answers to such broad questions is perhaps to overburden with expectation. Yet it is interesting to find that both Guha and (Antoine) Bousquet, who are most concerned with the radical newness of contemporary war, implicitly operate within a rather traditional understanding of its boundaries. For both, ‘war’ means the restricted arena of battlespace, and in particular that battlespace as viewed by the soldiers and generals of the United States of America.

James Der Derian is intrigued by many of the same questions, but his view is more expansive, and his diagnosis of the connection between NCW and international politics generally more comprehensive.

Damage, Unincorporated*, Part One: The Chaoplexity of Collective Violence

The below mirrors closely a review essay I recently completed for the Journal of Intervention and Statebuilding, which should appear at some point in the not-too-distant future. The books under discussion are Reimagining War in the 21st Century: From Clausewitz to Network-Centric Warfare by Manabrata Guha (London and New York: Routledge, 2011); The Scientific Way of Warfare: Order and Chaos on the Battlefields of Modernity by Antoine Bousquet (London: Hurst and Co., 2009); and Virtuous War: Mapping the Military-Industrial-Media-Entertainment Network (2nd Edition) by James Der Derian (London and New York: Routledge, 2009). Part two will follow shortly (lookie here).


I am the last in line that started with who?
With John von Neumann
If it’s the end of time so be it
But hey, it was Truman
Who set me free
I am half man
I’m almost like you
But you’ll be god-damned when I’m through
It’s a new day
So open the bay
And set this free

Black Francis, ‘Half Man’ (2008)

War is different now. On this Manabrata Guha, (our very own) Antoine Bousquet and James Der Derian agree. And their parallel accounts of the impact of technology on war – or more precisely, on the purportedly distinct Western way of war – share some other features. As is to be expected, each engages with traditions of thinking about violence and humanity’s remaking of the natural. Clausewitz looms over all three works, which could be said to share an investment in the tension derived from him between war as a kind of friction and war as a kind of instrument. All three also address a looser set of everyday ideas about (post)modern war, whether in the disconnection of bombers from their targets or the science fiction resonances found in near-instant communication, virtual reality targeting and cyborg warriors.

The question concerning technology – to put it in Martin Heidegger’s formulation, one which concerns all three authors to similar degrees – has gained considerable ground in International Relations and cognate disciplines over the last few decades. In large part driven by Der Derian’s early work on post-structuralism and speed, theoretical inquiry into the nature and effects of technological progress has more recently been reinforced by considerable ‘real world’ relevance: in the explosion of social networking and its attendant ‘revolutions’, the increasing deployment of unmanned drones by the US military in Afghanistan and Pakistan, and the general discourse of post-Cold War security threats from non-state actors in the form of cyber-attacks, miniaturised weapons systems or black market dirty bombs. As the impact of technology apparently spreads and metastasises, scholarly attention is turning to the sociological and ethical dimensions of digitised networks at war.

So what has the information bomb done to the modalities of collective violence?
