Judging the Past: Can We Really Afford Not To?

University of Edinburgh historian Donald Bloxham has provided much food for thought in his recent article for the March edition of BBC History Magazine, entitled ‘Why History Must Take a Stance’. In it, he challenges the dogmatic insistence on neutrality that pervades the historical profession. Instead of feigning an unattainable neutrality, he argues, historians should take ownership of the judgements they make and the moral ‘prompts’ that they provide to their readers. Proclaiming neutrality is misleading, and possibly dangerous. I am inclined to agree.

Whilst neutrality is an honourable and necessary ambition for any historian, it is an ideal, and it is folly to suppose otherwise. No morally conscious human being can honestly claim to provide a totally neutral account of British imperialism, for instance. We tell a story in the way that we want to tell it, and there are a plethora of ways of telling that story, all of which have moral implications in the present. Language, as Bloxham observes, is a key factor. Can a historian who writes about the ‘exploitation’ and ‘subjugation’ of millions of human beings as a result of the Atlantic slave trade truly claim that they are providing a ‘neutral’ impression to their reader? These words carry weight, and rightly so. To talk about the past in totally neutral terms is not only impossible, but also heartless. The stories of the people whose lives were torn apart by past injustices deserve to be told, not only out of respect or disengaged interest but because they bear lessons that exert a tangible and morally didactic hold over us in the present.

The Lady of Justice statue outside the Old Bailey. (Credit: Into the Blue)

That is not to say that historical writing should take the form of a moral invective, lambasting the behaviour of dead people whom we can no longer hold to account. Nor is it to argue that historical relativism is not a vitally important and foundational principle of the profession. What I am proposing, however, is that when Richard J. Evans claims, in his otherwise brilliant ‘In Defence of History’, that we should refute E.H. Carr’s argument – that the human cost of collectivisation in the USSR was a necessary evil – in the ‘historian’s way’, by undermining its ‘historical validity’, he seems to be suggesting that we are not doing so with a moral purpose in mind. Indeed, suggesting that the costs outweighed the benefits is itself a moral judgement, for is it not judging the value of people’s lives? Whilst Evans claims that it is the reader who must infer this conclusion, not the historian, his economic argument (that collectivisation was no more successful than the policies that preceded it) is surely intended to ‘prompt’ it.

Evans, like most people, clearly opposes the morality of Carr’s argument, and his way of communicating this is in the (highly effective) ‘historian’s way’. But his purpose nonetheless is to influence the opinion of his readers, not simply to fulfil the role of historical automaton, providing those readers with every fact under the sun. The process of omission and admission is one that, try as we may to temper it, will always involve some degree of value judgement about which facts matter for the purpose of our argument and which do not. Such a value judgement will inevitably, at times, operate on a moral criterion.

This debate may, as is often the case with those that take historiography as their subject, appear somewhat academic. In a world in which our history does so much to define the identities of (and relations between) ethnic, social, cultural and political groups, however, it is anything but. What we can call the ‘neutrality complex’ runs the risk of imbuing the historical profession and its practitioners with a sense of intellectual superiority, forgetting the political consequences of its output. One can find little fault in Bloxham’s assertion that certain histories carry less moral weight, and are therefore more conducive to neutral assessment, but writing on subjects with as much emotional resonance as the history of slavery, the Holocaust or Mao’s Great Famine cannot but be judgemental in nature.

‘Neutrality’ can be a mask for the covert projection of nefarious ideologies and interpretations. Presenting something simply as ‘fact’ is irresponsible and shows great ignorance of the moral dispositions that influence what we write and how we write it. There is space and need for some degree, however tentative, of self-acknowledged judgement in historical writing. We owe it to our audience to declare our judgement and to justify it. The crimes of imperialism, genocide and slavery are universally evil. The historian has a concern and a duty to show their audience why those who claim otherwise, who hyperinflate relativism and claim neutrality, are guilty of both intellectual hubris and moral cowardice.

Samuel Lake, History in Politics Writer

The Environment Has No Ideology: Debating Which System Works Best is Inherently Flawed

It is often assumed that we in the ‘West’ are the arbiters of environmental policy, that we simply ‘care more’ than the rest of the world. ‘China’, for many, evokes images of flat-pack cities and rapid industrialisation synonymous with the stain left by humanity on the natural world. It is lazily viewed as an outlying hindrance to the global goal of sustainable development, whilst we remain wilfully ignorant of our own shortcomings, both past and present. Instead of viewing Chinese environmental negligence as unique, a judgement made within the lingering paradigm of the ‘capitalist good/communist bad’ dichotomy, I argue that a more bipartisan assessment of the root cause of environmental degradation may be in order. Our planet, after all, cares little for politics.

Many of China’s environmental failures have historically been attributed to the communist policies of the ruling party, particularly under Mao, whose ‘ren ding sheng tian’, or ‘man must conquer nature’, slogan has been presented by the historian Judith Shapiro as evidence of the Communist Party’s desire to dominate the natural world, even at the expense of its own people and environment. Of course, there is merit to this argument – the collectivisation of land and the Great Leap Forward’s unattainable targets wreaked havoc on the land and contributed in no small part to what Frank Dikötter has termed ‘Mao’s Great Famine’, which is estimated to have killed up to 45 million people between 1958 and 1962. It can be easy, therefore, for us to assume that this environmental exploitation is peculiar to China’s communist system of government.

A factory in China by the Yangtze River, 2008. (Credit: Wikimedia Commons)

Without excusing the undoubtedly detrimental and inhumane policies of Mao’s government, we should view the environmental impact of the Chinese state’s rapid development in a more contextual manner. After all, did not the rampant capitalism of the Industrial Revolution in the United Kingdom lead to the explosion of soot-filled cities like Manchester, Liverpool and Birmingham, all of which were centres of heightened industrial activity that harmed both their human populations and the surrounding environment? London’s death rate rose 40% during a period of smog in December 1873, and similarly, we can look to the Great Smog of 1952, which the Met Office claims killed at least 4,000 people, possibly many more.

Industrial potteries in North Staffordshire during the nineteenth century. (Credit: StokeonTrent Live)

Geographically closer to China, the Japanese state has also shown in recent years that pointing to ideology might be mistaken. The post-war Japanese growth-first and laissez-faire mentality left the likes of Chisso Corporation in Minamata to their own devices, and the results were devastating. From 1956 through to the 1970s, first cats, then human residents of Minamata began coming down with a mysterious illness, one that caused ataxia and paralysis in its victims. It would transpire that what came to be known as ‘Minamata disease’ was the result of Chisso’s chemical plant releasing methylmercury into the town’s bay. This was absorbed by algae and passed up the food chain through the fish that local residents (both human and feline) were regularly consuming. The government’s silence was deafening: despite the cause being known since 1959, change only came after it was forced by non-capitalist union pressure in the 1970s. If this seems like a problem confined to the past, one need only cast their mind back to the Fukushima disaster in 2011, ultimately the result of the irresponsible decision to pursue a nuclear energy policy on the disaster-prone Pacific Ring of Fire.

This article does not wish to make the case for either the capitalist or communist system’s superiority in environmental affairs. Rather, it should be clear that the common thread running through all of these disasters – from the Great Smog to the Great Famine and Fukushima – is that a policy emphasising economic growth as the paramount standard of success is a dangerous one that will inevitably lead to environmental destruction. The style and severity of that destruction may be influenced by ideology, but if we are to live in harmony with our environment, we must be willing to abandon the ideals of gain (collective or individual) and competition that have placed us in our current quandary, whatever the tint of our political stripes.

Samuel Lake, History in Politics Writer

Is It Time For An Elected Head of State?

Democracy and equality under the law have increasingly come to be seen as the gold standard for structuring societies ever since the Enlightenment. It may therefore appear odd to some that the United Kingdom, the ‘mother of parliamentary democracy’, is still reigned over by a monarchy. Stranger still is that despite the drastic decline in the number of monarchies worldwide since the start of the 20th century, the British monarchy continues to sit at the heart of a proudly democratic nation and to enjoy significant popular support amongst the general public. Perhaps this will change with the passing of our current and longest-serving monarch, Queen Elizabeth II; perhaps the royal family will lose its purpose; or perhaps it will continue to hold steadfast as it has done in the face of major social transformations. But while there may be calls for the monarchy to be replaced by an elected head of state, we should ask ourselves what the monarchy both means to us and offers us, domestically and internationally, before we rush to any conclusions.

Queen Elizabeth II and Prince Philip. (Credit: AY COLLINS – WPA POOL/GETTY IMAGES)

While certainly debatable, I would contend that in its history, structures, and character, Britain is fundamentally a conservative nation. Not conservative in the sense that it strictly aligns with today’s Conservative party, but more in the sense of Burke or Oakeshott; we sacrifice democratic purity on the altar of an electoral system that is more inclined to produce stable and commanding governments; we still retain a strong support for the principle of national sovereignty in a world of increasing interdependence and cooperation; we take pride in our institutions, such as parliamentary democracy and our common law; and as evidenced by our addiction to tea, we value tradition. So is it really surprising that monarchy, the oldest form of government in the United Kingdom, still not only exists but enjoys significant public support? 

According to its website, the monarchy is intended as a symbol of national identity, unity, pride and duty, and serves to provide a sense of stability and continuity across the lifespan of the nation. Its whole existence is rooted in the conservative disposition towards traditions, historical continuity, and the notion of collective wisdom across the ages that should not be readily discarded by those in the present. The monarchy is also politically impartial, and so able to provide that sense of unity, as it is a symbol that should cut across factional lines. Finally, the royal family is not necessarily an obstacle to democracy anymore; we have a constitutional monarchy, whereby the politicians make the decisions without arbitrary sovereign rule. The Sovereign’s role is not to undemocratically dictate legislation; it is to embody the spirit of the nation and exemplify a life of service and duty to country.

Conversely, many may say with good reason that the monarchy is outdated, elitist, and a spanner in the works for democracy. Indeed, monarchies are increasingly becoming a thing of the past, and in today’s world it may seem out of place to see a family of people living a life of unbounded riches and privileges simply by birthright. This is a view that is becoming increasingly popular among younger Britons. Additionally, one might contend that the monarchy has lost its magic; it no longer inspires the same awe and reverence it once did, and is unable to invoke the sense of service and duty to country that it once could.

Prince Harry and Meghan Markle being interviewed by Oprah Winfrey. (Credit: Marie Claire)

While support for the British monarchy appears to be holding steady, even in the wake of the latest saga with Harry and Meghan, I believe that the monarchy is on thin ice. The age of deference has long since passed, and in an era of materialism and rationality, the ethereal touch of monarchy has arguably lost its draw. Perhaps this is a good thing, or perhaps now more than ever we need a symbol of unity and duty to do our best by our neighbour and country. What is worth pointing out, though, is that Queen Elizabeth II, our longest-serving monarch, has led the country dutifully throughout her life, and it is worth considering deeply whether the alternative (President Boris Johnson?) is really a better option.

Leo Cullis, History in Politics Writer

The Future Unlocked? 

What a strange year. April might seem like an even stranger time to reflect, one month after the anniversary of the first coronavirus lockdown, but it also seems apt as the easing of lockdown starts to open up our futures. With pubs starting to open, vaccines being delivered, and students officially allowed back to university, there is light at the end of the pandemic tunnel.

Yet, while we’ve been locked up in our houses, a few things have happened. For one, History in Politics has done two terms as a university society – but you probably don’t care much about that. More significant are the huge events seen through the prism of a new post-pandemic world. Britain has finally properly left the EU, Boris Johnson lost his most infamous advisor, thousands marched for BLM, and thousands have protested policing in the wake of Sarah Everard’s death: ‘why are you protecting statues of racists over actual women?’, one sign read.

During the pandemic, Britain has been reflecting. We might look back upon our relationship with Europe. We might look at the history of race relations in the UK, or our colonial legacy. In fact, with books such as Sathnam Sanghera’s Empireland: How Imperialism Has Shaped Modern Britain released in January, it is clear that many have been reflecting on such themes. In doing so, it is hoped that, by having a clear idea of where we’ve come from, we might have a better idea of what we’re meant to do in the future.

Luckily for me, although perhaps less so for my career prospects, I’ve had the privilege of studying such history. I’ve spent a lifetime learning about the British Empire, race relations, civil rights, and Britain’s relationship with Europe (although, aged 21, a lifetime is quite a melodramatic way of putting it). I have even had time to study the Tudors, which many complain took the place of ‘more relevant’ history. Despite all this history, I am still to get the magic key to predicting our future – perhaps that will come tomorrow, or once I’m back in a Durham pub.

Ironically, such historical reflections can be found throughout history. When Edward Colston’s statue was raised in Bristol in 1895, for instance, it was already over a century and a half after his death. Those who toppled his statue over a hundred years later certainly wouldn’t think that the Victorian reflections or remembrance of Colston was a positive one, although some might suggest it was representative of the future for Victorians: a future of racial inequality.

The plinth of the now removed statue of Edward Colston, Bristol, England. (Credit: James Beck for The New York Times)

One thing which we cannot change, regardless of how we might reflect upon it, is what has passed. This might sound obvious, but it is important to hold in mind such ‘objective truths’. They’re the reason people look back, hoping the past truths will unlock future truths. It is in search of the ‘truth’ that we talk, read, and reflect on our past – from empire to race. Last summer, as statues were ripped up and the media exploded into debate, I asked how we might have that conversation in a civil manner. Yet, the ‘culture war’ has continued – regardless of these ‘truths’. 

Perhaps it is less talking and more listening which needs to happen. Over lockdown I had the pleasure of listening to Natalie Mears (associate professor in early modern British history) discuss some of these topics. Finally, I could put Tudor history to some use, and the comparisons with our present ‘culture war’ were stark. From powerful political advisors (it is the 500th anniversary of William Cecil’s birth, and a year since ‘that’ trip to Barnard Castle) to our relationship with Europe, some things seemingly haven’t changed. As we reflect upon the past year (and a bit) of COVID-19, one lesson from Elizabethan England sticks out the strongest: that reflections and memories of the past have always been political. At least that is one door to the future which is unlocked. The future is undoubtedly political.

Join Durham University’s History in Politics Society for their term’s theme of ‘Reflections’ and find series two of Dead Current, the History in Politics Podcast, on Spotify. The first episode of series two is President Emily Glynn and Events Manager Ed Selwyn Sharpe’s interview with Natalie Mears.

Ed Selwyn Sharpe

General Secretary Putin: The Use of History by Russia’s Regime

Putin’s propaganda machine was laid bare for all to see last month with the return of Russian dissident Alexey Navalny after recovering from nerve agent poisoning. The Kremlin initially refrained from commenting on the activist; however, as his video detailing Putin’s Black Sea palace was released and protests in support of him erupted across Russia, he was quickly depicted as a Western puppet, playing on old fears of Russia’s Cold War rivals. He was later charged with slandering a veteran of the Second World War, more commonly known in Russia as the Great Patriotic War. The defence against Nazi invasion occupies a venerated place in the minds of Russians, dating back to Stalinist propaganda. To slander a veteran would be to slander Russia’s sacrifices in the war as a whole. It is therefore unsurprising that Putin deployed this particular tactic against his adversary.

Vladimir Putin addressing Russian citizens on the State Television channels, Moscow, Russia, March 2020. (Credit: Alexei Druzhinin, Sputnik, Kremlin Pool Photo via AP)

Throughout Putin’s time in power, he has often deployed a nostalgic form of Russian history to construct a narrative which strengthens his grip on power and maintains his popularity among the Russian public. He attempts to mirror the conditions of Russia in times when it was a global power, mostly through imitating the Soviet Union and the Russian Empire. Through this, he stokes a Russian nationalism embarrassed at its nation losing its former ‘glory’ and desperate to regain it. A poll in 2018 showed that 66% of Russians regretted the collapse of the Soviet Union, and a poll in 2020 showed that 75% of Russians believed the Soviet era was the greatest time in their country’s history. Rather than facing the bleak reality of decline, Putin’s rule seemingly offers a restoration of former glory and a return to better days. By masquerading Russia as a global power, Putin attempts to distract the public from ongoing economic decline.

This public nostalgia therefore brings the USSR to the forefront of Putin’s propaganda campaign. Russian history textbooks were remade to display the USSR in a positive and idealised light, and the Soviet national anthem was reinstated. The Patriotic War was especially capitalised on: the 2020 constitutional changes went so far as to ban ‘belittling’ of Russia’s feats in the Second World War and to ‘protect the historical truth’ of the war, essentially outlawing any narratives of the war contrary to the state-sanctioned line.

Russia’s foreign policy also reflects a desire to return to the days of the USSR. Involvement in Syria harks back to Soviet interference in Afghanistan. The annexation of Crimea and occupation of parts of Georgia portray Russia as reclaiming former lands lost since the dissolution of the USSR. The poisoning of dissidents in the UK and apparent interference in US elections are reminiscent of the days of espionage against the West.

Perhaps the most important utilisation of the USSR’s legacy is in rhetoric surrounding the West: the Other, the ever-looming threat during the Cold War. Putin plays on old Cold War mentalities by constantly depicting his adversaries as either Western or in league with the West. A dichotomy is therefore created where Putin is the defender of Russian values and stability against his opponents, who are dangerous Western puppets. Navalny, rather than being an anti-corruption activist, is a Western pawn bent on destabilising Russia. Any criticism of Putin’s regime is quickly deemed Western propaganda, calling into question the goodwill of the critics.

In all of these instances, Putin is essentially creating an image of himself which capitalises on nationalistic feelings surrounding Russian history and uses this to target his opponents and perpetuate his rule. Putin cannot be criticised, for doing so would be to criticise the Great Patriotic War, the USSR, and stability itself. 

Recently, Putin’s rule has appeared shakier, especially after his dismal coronavirus response. Moreover, this year will mark 30 years since the USSR fell. A new generation, with no memory of Russia as a global superpower, is less susceptible to Putin’s use of history: the Soviet national anthem brings up no memories, nor does linking Navalny with the West diminish their support of him. A simple look at the make-up of the Navalny campaign tells you all you need to know. Navalny himself is positioned as an opposite to Putin’s authoritarianism; he engages with his audience primarily on social media, where protests and campaigning are also organised. News reports show the protestors as young and eager for change. History is an effective tool for such authoritarians, but only insofar as there is any real connection to that history in the present. As the old Soviet generation starts to dwindle in number, that connection is lost, and the propaganda of those authoritarians loses its appeal.

Jonas Balkus, History in Politics Contributor

Book Review: J. S. Mill’s ‘On Liberty’

John Stuart Mill’s ‘On Liberty’ is a classic statement of liberal values and an iconic text in the arena of moral and political thought. Published in 1859, it was originally conceived as a short essay, which Mill and his wife, Harriet Taylor, fleshed out into a statement of the liberal values and morality that still provide much of the basis for political structures today. In essence, it seeks to address the question of how far the state or society as a whole should go in controlling individual beliefs and actions, and its answer is a resounding defence of individuality.

Title page of the first edition of On Liberty (1859). (Credit: Public Domain)

Mill opens his account with a historical assessment of the ancient struggle between liberty and authority, suggesting an evolving relationship between ruler and ruled whereby people came to believe that rulers no longer needed to be independent powers opposed to their interests, thus giving rise to notions of democracy. But, whilst government tyranny is a concern for Mill, ‘On Liberty’ focuses more on the dangers of democratic and social coercion and the hindrance they pose to the individual; perhaps an unsurprising view in the context of Victorian social conservatism. ‘On Liberty’ sees Mill warn against a ‘tyranny of the majority’, and it is with this in mind that he sets out the individual freedoms and protections that ground liberal values to this day.

‘On Liberty’ focuses on four key freedoms: freedom of thought, speech, action, and association, all of which would challenge the Victorian orthodoxy of custom and restraint in the social and political sphere. 

Freedom of thought, by which Mill means ‘absolute freedom of opinion and sentiment on all subjects’, is a staple in the genre of classical liberalism; a rejection of group-think and the elevation of individual thought over social customs. Mill’s conception of freedom of speech is arguably more profound and more contentious. His defence of free speech extends right up to the point at which speech becomes incitement to violence. He sees value in speech no matter how potentially hateful or self-evidently incorrect, for such speech is necessary to reinforce the strength of our convictions and stop our beliefs and values from becoming mere platitudes. One might perceive this opinion as lying at the crux of today’s disagreements over the limits of free speech.

Mill’s conceptualisation of freedom of action divides action into two categories, self-regarding and other-regarding, and sees only limitations on the latter as permissible. In essence, one should be free to act in any way they please, unless in doing so they directly harm somebody else; a classical liberal statement if there ever was one. Finally, freedom of association: the freedom to unite with any person so long as the purpose does not involve harm.

“Mankind are greater gainers by suffering each other to live as seems good to themselves, than by compelling each to live as seems good to the rest”.

J. S. Mill, ‘On Liberty’

‘On Liberty’ is evidently a defence of individualism and individual freedoms, but it represents a major departure from previous liberal thinkers. Mill’s support for liberty is rooted in his utilitarianism. Whereas liberal thinkers such as John Locke see liberty as a valuable end in itself, and man as endowed with natural rights by way of existing, Mill’s individual liberties merely serve a purpose, that purpose being utility. In short, ‘in proportion to the development of his individuality, each person becomes more valuable to himself, and is therefore capable of being more valuable to others’. He has come under severe criticism for this, with many doubting his liberal credentials, but as he states in ‘On Liberty’, without firm grounding, ‘there is only too great a tendency in the best beliefs and practices to degenerate into the mechanical’.

Mill’s ‘On Liberty’ is not liberty merely for liberty’s sake, but rather liberty with a purpose, and its robust defence of individual freedoms, which still provides the framework for liberal thought today, makes ‘On Liberty’ one of politics’ greatest hits.

Leo Cullis, History in Politics Writer

The Cost of Casual Scepticism to Human Rights

Modern aversion to human rights protection in the United Kingdom can be seen on the surface of British politics, from Theresa May’s indignation that she was unable to deport an illegal immigrant because he had a pet cat, to Nigel Farage’s demand that we “scrap the EU Human Rights Act”. The modern rallying cry for human rights sceptics often takes the form that we, the British, have no need for coffee-drinking, baguette-eating Europeans to tell us how to do human rights. The problem, though, is that rights protection is not that simple, and never has been, so why do we let human rights be disparaged by these tabloid-level soundbites and mantras?

The European Court of Human Rights in Strasbourg. (Credit: Europe’s Human Rights Watchdog)

When bashing human rights, politicians are most commonly referring to the European Convention on Human Rights (ECHR), which was created by the Council of Europe and which, for clarity’s sake, has nothing to do with the European Union – incidentally making Farage’s claims about the “EU Human Rights Act” farcical. The ECHR established the European Court of Human Rights (ECtHR), frequently referred to as ‘Strasbourg’ after its location. These elements of European influence running through our human rights make them an easy target for politicians wanting to exploit the rife Euroscepticism that exists in modern Britain. This is why, perhaps unsurprisingly, key features of the ECHR’s history are left out by its critics.

The ECHR has its origins in the ruins of Europe following the Second World War and the Holocaust, when it was decided that action must be taken to prevent future human rights abuses across the continent. Winston Churchill and the British delegation played a key role in the drafting of the Convention and were ardent supporters of it. Since then, the Convention has evolved according to the changing nature of Europe and has helped in reducing discrimination and exposing rights abuses.

Looking then at the face of modern Europe, the significant role of the ECHR may seem unnecessary and can even seem an inconvenience to public protection and government policy. However, the Convention should not be perceived as existing only to prevent grave breaches of human rights, but also to remedy all forms of discrimination, such as the right of LGBT couples to adopt or of people of all faiths to have the same job opportunities. These breaches still occur across Europe, with significant examples appearing in Poland and Hungary, where radical anti-LGBT policies are being enacted, such as ‘LGBT-free zones’.

This is one reason why it is more crucial than ever that other States party to the Convention continue to believe in and abide by it. If disagreements with Strasbourg start to appear over issues such as prisoner voting rights, it waters down the impact of the ECtHR’s rulings and raises the threat that judgments given by the Court on clear breaches (such as those appearing in Eastern Europe) are swept aside and ignored by the violating State. Arguing and fighting for proper rights protection is not done only for yourself or your group, but for the wider community.

This is why the rhetoric used in modern politics about seemingly humorous issues, like the failed deportation of a man because of his cat, and other false remarks which attempt to sabotage the mechanisms for rights protection, is dangerous. It is dangerous not only for the prevention of relatively benign issues that may appear in nations like the UK, but also for those who rely on the Convention to prevent fundamental breaches of human rights – breaches which appear ever more likely to occur, and which recall the events that led to the creation of the ECHR.

Aidan Taylor, History in Politics Contributor

Mary I: The First Queen of England and the Legacy of Female Leadership

Whilst the first Queen of England’s reign is largely overshadowed in favour of the other Tudors, namely her father Henry VIII and half-sister Elizabeth I, the historiographical interpretations of Mary I illuminate interesting notions associated with female leadership. Mary I was the first woman to ascend to the throne of England, as the succession of Empress Matilda in the twelfth century never materialised due to the eruption of civil war. Only four centuries later do we witness the succession of a female monarch in England, and this was not without issues, as there had been earlier attempts to bar her from inheritance. Her gender, as well as her supposed illegitimacy, provided the grounds for such attempts, and her younger half-brother was placed above her when she was eventually restored to the line of succession.

Mary I’s reign is largely interpreted as one of hysteria and irrationality. Such a notion of hysteria is particularly of note, as the term has long been associated with the assumption that women are unable to control their emotions and are thus unfit to rule. Shakespeare’s recurrent depictions of the madwoman within his works have fostered this link between femininity and hysteria, particularly illustrated in Hamlet through Ophelia. Literary representations have certainly exacerbated the historical issues of female rule.

Portrait of Mary I, Antonis Mor, 1554. (Credit: Public Domain)

Historians are critical both of Mary’s personal life, due to her failure to conceive, and of her broader policies, which saw the intense persecution of Protestant sympathisers and led to her title of ‘Bloody Mary’. The issue of motherhood in politics is still prevalent in our times, as seen by the scrutiny Theresa May faced within the media and from other members of the Conservative Party due to her choice not to have children. With regard to Marian politics, the stabilisation of economic policy during her reign is often underplayed and, when acknowledged, is widely credited to her male councillors or to the reforms laid down by the Duke of Northumberland before her accession to the throne. Mary is thus painted as weak, feeble and ineffective. The first Queen of England is largely portrayed as conforming to the gendered anxieties that the elite ruling class had regarding the notion of female monarchy. Women were deemed far more emotionally charged than their male counterparts and, as a result, unable to conduct rational governance.

In the 21st century, women still face significant opposition in reaching the highest positions of political power. Britain’s only popularly elected female Prime Minister, Margaret Thatcher, is often represented as contradicting feminist values. The US has never elected a woman to its highest office, the presidency, and notably Hillary Clinton’s gender played a significant role in right-wing opposition to her election. Such ideas regarding female hysteria and heightened emotion have arguably laid a basis for the preference for male leadership and have correlated with deeming women unfit for positions of political power. The qualities associated with leadership are still often presented through characteristics which men are more likely to be socialised to have, therefore perpetuating the idea that women are not suited to power.

What we can draw from the accounts of Mary I’s governance and her later treatment in historical research is that female leadership is not often deemed a suitable option, nor do women find easy pathways into politics. These ideals surrounding female inferiority have historical precedent in sixteenth-century England, as illustrated by the analysis of early modern Queenship. These notions have not been undermined or significantly challenged by the twenty-first century. Mary I’s legacy illustrates how she has been utilised to highlight the perceived barriers to effective female governance. Arguably, this has set a precedent for the limited role that women play in politics within our modern era.

Ellie Brosnan

Seeing like Cassandra: a New Role for Literature in Political Risk Analysis?

Political conflicts and situations of crisis in a multitude of forms continue to mark our present. Indeed, early crisis prevention is a question so pertinent to our times that it has prompted researchers at the University of Tübingen in Germany to explore an unusual method of conflict prediction: studying the fictional literature of specific crisis-prone regions to examine whether it is possible to identify potential future threats through literary texts. The project is entitled “Projekt Cassandra”, alluding to the Greek mythological figure of Cassandra, who was famously able to predict the future, although cursed so that nobody would believe her prophecies.

The study led by Tübingen appears to have demonstrated potential as an alternative method of strategic analysis and is being partially funded by the German Ministry of Defence. “Projekt Cassandra” has so far investigated three centres of conflict for model analyses, notably the Serbo-Kosovan conflict (1998-1999), the Nigerian terror epidemic caused by the group Boko Haram, and the tensions in Algeria preceding the election in 2019. While the study is still underway, it poses an essential question: can literature truly function as a tool for the early detection of crises?

Tens of thousands of Algerians protesting Algeria’s presidential vote, Algiers, December 12, 2019. (Credit: Ramzi Boudina, via Reuters)

Conflict and crisis prediction are an essential part of the field of risk analysis. With regard to the hypothesis of “Projekt Cassandra”, we can draw parallels with the idea that “history repeats itself”. While the extent to which the past really “repeats” itself is debatable, this argument rests on the assumption that history provides us with patterns which allow for a certain degree of “prediction”. A similar concept could be applied in the case of literature through the ages, where the identification of patterns relating to past crises could help identify potential future conflicts.

In considering concrete examples, the picture becomes more complex. By association, we might immediately consider literature as actively imagining the future, such as the famous 1984 by George Orwell. This dystopian classic from 1949 is set in a distorted future of censorship and surveillance, which could be interpreted as a prediction of certain elements that we now find in our present. However, it must be noted that science fiction and dystopian literature often provide less a prediction of the future than a reflection of their respective historical circumstances. 1984, for instance, stems from the context of the beginning of the Cold War, in which surveillance and espionage were considered primary threats. This, however, does not mean that there is a necessary correlation between the content of a novel like 1984 and the realities of the future.

The ‘Projekt Cassandra’ logo. (Credit: Project Cassandra, via Twitter)

So how can early crisis detection through literature function? Predicting crises is largely about the identification of patterns. This is why it also matters what type of crisis is being investigated. For instance, a health crisis such as the Covid-19 pandemic can hardly be said to have been predicted by Camus merely because his novel The Plague (1947) describes a situation that is eerily similar to current events.

We do, however, find a poignant example favouring this hypothesis relating to military conflict in British literature. Starting in the 1870s, literature imagining a new European war, variously between Britain and France or Britain and Germany, was increasingly popularised, so much so that “invasion literature” became an entirely new genre in British literature. The most famous example is the novel The Invasion of 1910 (1906), which propagated Germanophobia in Britain through its imagined German invasion of the country. This can, in hindsight, be considered a near-prescient account of elements of the First World War. Invasion literature even influenced politics: popularised concerns about a new European conflict accelerated the arms race of the 1900s, which ultimately played a role in the eruption of the First World War.

Literature, it must be reiterated, is not an accurate tool of strategic prediction. It does, however, frequently capture underlying social currents that can later become problematic, such as ethnic tensions or larger societal unrest. These currents are often subtle, yet can function as inspiration for authors. Whether these societal undercurrents can effectively alert us to the potential for future crises and conflict, only Cassandra knows.

Cristina Coellen

Who Won the Good Friday Agreement?

The Good Friday Agreement was signed in 1998, heralding a new era of peace after the decades of violence that characterised the Troubles. But who benefitted the most from the signing of the Agreement, and is the answer different today than it was twenty years ago?

For unionists, represented most prominently by the Democratic Unionist Party (DUP) and the Ulster Unionist Party (UUP), the Good Friday Agreement ultimately symbolised the enshrining of the status quo in law: Northern Ireland remained a part of the UK. In addition, the Republic of Ireland renounced articles two and three of its Constitution, which laid claim to the entire island of Ireland. It may seem, then, that unionism was the victor in 1998, but elements of the Good Friday Agreement have been responsible for tectonic shifts in the period since, arguably exposing it, ultimately, as a victory for nationalism.

While Irish republicans in the form of the Provisional IRA were required to put down their weapons and suspend the violent struggle for a united Ireland, there is a compelling argument that the Good Friday Agreement laid the platform for the growth of the movement and perhaps even the fulfilment of the goal of Irish unity. For one, it mandated power-sharing between the two sides of the Northern Irish divide: both unionists and nationalists must share the leadership of government. Since 1998, this has acted to legitimise the nationalist cause, rooting it as a political movement rather than an armed struggle. Sinn Féin, the leading nationalist party in the North, have moved from the leadership of Gerry Adams and Martin McGuinness to new, younger leaders Mary Lou McDonald and Michelle O’Neill. Irish unity propaganda now emphasises the economic sense of a united Ireland, the need to counter the worst effects of Brexit on Northern Ireland and a sensible debate about the constitutional nature of a new country, rather than the adversarial anti-unionist simplicity of the Troubles. Here, nationalism did gain significantly from the Good Friday Agreement because it allowed this transition from the Armalite to the ballot box in return for legitimacy as a movement.

British Prime Minister Tony Blair (left) and Irish Prime Minister Bertie Ahern (right) signing the Good Friday Agreement. (Credit: PA, via BBC)

Most prominent for nationalists, however, is the fact that the Good Friday Agreement spells out the route to a united Ireland, explicitly stating that a ‘majority’ in a referendum would mandate Northern Ireland’s exit from the UK. While unclear on whether majorities would be required both in the Republic and Northern Ireland, as was sought for the Agreement itself in 1998, as well as what criteria would have to be met in order to hold a vote, this gives Irish nationalism a legal and binding route to its ultimate goal of Irish unity, arguably the most prominent victory of the peace process.

Since 1998, Northern Ireland has seen a huge amount of political tension, governmental gridlock and occasional outbreaks of violence, most recently witnessed in the killing of journalist Lyra McKee in 2019. However, the Good Friday Agreement has served crucially to preserve peace on a scale unimaginable in the most intense years of the Troubles. If Irish nationalism achieves its goal of uniting the island, it will come about through a democratic referendum, not through violence. The very existence of the Good Friday Agreement, particularly its survival for over 20 years, is testament to the deep will for peace across the communities of Northern Ireland, forged in decades of conflict; it is this desire being fulfilled, even as the parties squabble in Stormont and the political status of Northern Ireland remains in the balance, that continues to make the biggest difference in the daily lives of both unionists and nationalists.

Joe Rossiter, History in Politics Writer