Peace Lines, Borders, and Brexit: Northern Ireland’s Dilemma

The Shankill Road is one of the main roads leading through Belfast and home to the city’s predominantly Protestant, Loyalist community – citizens in favour of Northern Ireland remaining under British control. On the other side is the Falls Road, home to the Republican Catholic community, who favour a united Ireland free of British control.

Whilst fundamentally opposed, what unites these communities is their segregation: 25-foot-high peace walls divide them, monitored by police. Some walls have gates that block passage between the areas at night in an attempt to reduce inter-communal violence.

This border is one of many: between the Republic and Northern Ireland, and now, as a result of the Brexit border crisis, between mainland Britain and Northern Ireland.

The Partition of Ireland in 1921 divided the island into two jurisdictions, each with a devolved government under British sovereignty, in the hope that this would eventually lead to reunification. But nationalists in the south had already defied British rule by declaring an independent Irish Republic, and the guerrilla campaign of the Irish War of Independence followed. Its outcome was the Anglo-Irish Treaty, which established the Irish Free State, the forerunner of today’s Republic of Ireland, as a self-governing state outside the United Kingdom.

The peace that followed proved fragile. Partition entrenched discrimination and division in Northern Ireland between Protestants and Catholics, with the latter oppressed. In response to prejudice in housing and employment, as well as the under-representation of Catholic towns in the electoral system, the Northern Ireland Civil Rights Association led a campaign for equality. It was met with opposition and violence, most notably in the Battle of the Bogside, which escalated into the thirty-year conflict known as the Troubles.

Whilst most of this violence ceased with the 1998 Good Friday Agreement, the borders and peace walls remain emblematic of a conflict still unresolved.

The 2016 Brexit referendum exposed many of these complexities. Many British citizens had little understanding of what the European Union was when they cast their votes. For Northern Irish citizens, the vote complicated their relationships with both the Republic and Westminster.

Self-imposed apartheid has characterised the communities of Belfast, Derry, Portadown and Lurgan. The decision not to mingle with those who are different has contributed to growing tensions, visible in the Orange Order marches of 12 July, which celebrate the Protestant William of Orange’s seventeenth-century invasion and the subsequent oppression of Catholics, and in the 2021 Northern Ireland riots. These riots were incited by the border and goods crisis that followed Brexit, in which a fifth of businesses surveyed said that suppliers were ‘unwilling to engage with the new requirements of shipping’, and in some cases businesses in Great Britain have stopped supplying Northern Ireland altogether.

Loyal Orange Order March, Edinburgh. (Credit: Des Mooney, via Flickr)

Unionist or Nationalist self-identification was the most important determinant of referendum choice on Brexit. For many, voting Leave became part of a British identity, much as, in England, voting Leave became a matter of ‘taking back control’. For Unionists in Northern Ireland this was felt strongly: even Unionists who wished to stay in the EU voted Leave, largely out of principle. The country’s religious and political divide is mirrored in its party politics, embodied by the nationalist Sinn Féin and the Protestant Democratic Unionist Party.

This is further complicated by the fact that Sinn Féin MPs do not take up their seats in Westminster, since doing so would lend legitimacy to an institution they do not recognise. But it also meant that the party could neither argue against Britain’s withdrawal from the EU nor fight for it to remain.

In a history marked by separation, the UK’s departure from the EU was just another chapter in a long story. The communities either side of the peace gates at Lanark Way are divided not only by religion and ethno-nationalist belief but also by the split between Remain and Leave voters. These are some of the most deprived areas in the whole of Northern Ireland: deprivation and poverty are what unite them, but their perceived solutions mark the difference and create conflict.

Northern Ireland in fact has the lowest poverty rate of any UK region. Its unemployment rate is also low; it is in educational attainment, health and disability that the country falls short. More than two-thirds of students on the Shankill, the Protestant area, and the Falls Road, the Catholic area, perform below Belfast’s average. The poverty in these areas is most likely aggravated by the Unionist and Republican division: balancing the demands of the integrated education system has left many falling behind in their exams. Brexit has only worsened this.

For a generation that was promised peace in 1998, the situation, 23 years on, remains one of conflict.

For Sinn Féin and Republican opinion more broadly, Northern Ireland needed to stay in the EU as an important step towards an eventual united Ireland, especially as it was EU integration that brought an end to British soldiers patrolling Northern Ireland.

In contrast, for British Unionists, UK sovereignty is paramount: the Unionist working class was the group most likely to vote for Brexit, with the Democratic Unionist Party encouraging its supporters to vote Leave.

Because of Brexit, the border between Northern Ireland and the Republic could harden under EU law, which requires an EU country to maintain a hard border with a non-EU country. At the same time, a new border has emerged between Northern Ireland and the rest of the UK: communities once separated only by water are now divided by Brexit, a complication many were not prepared for.

Northern Ireland has not known true peace in its entire existence. Whilst most of its citizens still favour remaining part of the UK, Brexit and heightening tensions could change that. Ireland could become unified sooner than many realise.

Aoifke Madeleine, Summer Writer

Is ‘Ever-Closer Union’ The Right Path for the EU’s Survival?

Depending on whether you supported the UK leaving or remaining in the European Union, you might presume that the EU is either an undemocratic mess destined to fail, or an international organisation bound to grow and strengthen in a world where cooperation is key. The problem is that, in the long run, it really is impossible to tell which possibility will prevail.

On the one hand, this last decade has seen a rise in nationalistic sentiment and a resurgent hunger for the principles of sovereign independence; Brexit, the election of Donald Trump accompanied by the slogan ‘America First’, and Orbán’s rule in Hungary serve as just a handful of examples of such a sentiment. On the other hand, it may seem impossible that any nation could fully address its challenges alone in an age of unprecedented interdependence and interconnectedness (and the pandemic speaks for itself here).

Since its beginning, the EU has been guided by the latter view; states must share resources, work collaboratively under formal rules, and pool their sovereignty in order to survive and prosper in a globalised world. Indeed, the European Coal and Steel Community, which evolved into what we know today as the European Union, was formally established in 1951 with the aim of regional integration in order to avoid war between France and Germany following the horrific conflict of World War II. Underlying all this was a simple perspective: without internationally agreed rules and standards, states would inevitably compete and conflict, and so overarching structures were necessary to prevent this. 

Flags at the European Union headquarters in Brussels. (Credit: Wiktor Dabkowski, action press, via Flickr)

This may appear surprising, as in recent years we’ve often heard from Leave campaigners that the EU was originally a mere free trade bloc which morphed into a political union over time. However, the language of ‘political union’ and ‘ever-closer union’ has been in the treaties right from the start, and those ideas have increasingly manifested themselves. Illustrating this, just recently the German Foreign Minister went so far as to call on the EU to abolish the veto power of individual member states when it comes to foreign policy. It seems, therefore, that the EU is set to continue on its pathway towards ever-closer union and increased integration between its member-states. But is that the right path for the union to follow?

Despite major challenges – namely the eurozone crisis, the migration crisis, and even Brexit – the EU has succeeded in its slow march towards integration and expansion, and public support across the region has held steady. What’s more, continued access to the world’s largest single market is a great benefit of EU membership, especially in the aftermath of the pandemic and its economic effects. The EU also remains a key player in global governance, attending and influencing the G7, giving its member-states a collective power that they would otherwise lack as independent nations.

However, increased integration and ever-closer union are not guaranteed to succeed. There is a possibility that continued allegiance to those principles could prove to be the Achilles’ heel of the EU. Vaccine access and roll-out across the EU during the pandemic highlighted the union’s weakness in dealing with crises as a large collective, resulting in major dissatisfaction with its leaders, as well as reduced public confidence in the vaccines themselves.

Furthermore, the UK’s future success, or lack thereof, as a post-Brexit independent nation will play an important role in shaping perceptions about the benefits of an ever-closer union. If the UK is seen to succeed as a nation unbound by a supranational authority in areas of trade, security, and global leadership, then the integrationist approach of the EU will be put under the spotlight.

Crucially, the sense of Europeanism among the population will likely play the key role in determining just how much further EU integration can go whilst still succeeding. If there is a strong enough European identity, as there is now, then further integration is likely to succeed. However, as we witnessed with Brexit, the electorates of Europe will not sit quietly if they feel that their national identity is being sacrificed on the altar of ever-closer union. For now it seems as though the current path is working, and public support is holding steady. In the long term, however, the future of the EU is impossible to predict.

Leo Cullis, History in Politics Writer

Recycling Political Establishments?

The announcement made by Abdelaziz Bouteflika in 2019 proclaiming his candidacy for a fifth presidential term ignited a wave of furious Algerians opposed to the monotonous and stagnant regime under his rule. In the decades since Algeria’s independence in 1962, the neo-patrimonial and authoritarian character of its political system, epitomised by Bouteflika’s rule, eroded the country’s social contract and cost its rulers their legitimacy. Algerians took to the streets in peaceful protest against Bouteflika’s bid, and the pressure of the Hirak movement, alongside the military, led to his resignation, restoring a sense of hope and new beginnings for the Algerian people. Bouteflika’s resignation exposed the profound fractures within the Algerian establishment, but it also left political actors uncertain about how to proceed in a post-Bouteflika era.

The goals of the Hirak came to a rancorous end as the country’s military leadership rebuffed any additional concessions, overlooking all calls for an essential transition period. Algeria’s political establishment instead marshalled propaganda and authoritarianism to force through the presidential elections of December 2019, resulting in the presidency of former Prime Minister Abdelmadjid Tebboune. Algeria has continued to struggle politically over the past two years: the Hirak movement has gradually lost momentum, and political stability remains elusive as a vicious cycle of tainted political actors continually suspends urgently needed political and economic reforms. The Covid-19 pandemic has heightened the country’s economic and political struggles, pushing it towards a multifaceted crisis; one would not be surprised to see the character of Algeria during the Arab Spring brought back to life in the coming years as the people’s needs are dismissed by its political elite.

Painted portrait of Abdelaziz Bouteflika. (Credit: Abode of Chaos, via Flickr)

The Hirak has become irrevocably divided as its constituent groups no longer share consistent socio-political aspirations, most notably in the split between the reformist camp and those demanding more radical change. The movement’s internal weaknesses have prevented it from establishing a clear agenda of what exactly it seeks to achieve. Dialogue within the Hirak is a necessary channel to any form of success, yet it is overdue; unless Algeria faces an existential threat that pushes the system to engage collectively, there seems little prospect of progress in its political and economic position.

Despite the Hirak not having achieved its major goals, the opposition movement has sparked a genuine desire and need for political and social progress; this may take years to attain, however, and time is not on Algeria’s side given its serious economic and political challenges. The abandonment of Algeria by the international community has further complicated matters since 2019. Algeria habitually favours the status quo and may very well reject any interference in its internal affairs. Nevertheless, the international community could afford the country a course of internal dialogue, or aid the Hirak in its organisation by encouraging greater civil and political freedoms. Algeria may not be a priority for the Biden-Harris administration; nonetheless, hand in hand with its recently reinforced relations with European governments, the United States has real potential to revive a collective effort towards a transition period for Algeria. The snap election of 12 June has so far instigated no meaningful change, with the majority of the population boycotting it as the military remains in control. Although it would be precarious to call for radical and instant change, it is necessary that Algeria gradually works towards reforms beneficial to both the opposition and the system.

Lydia Benaicha, History in Politics Contributor

The Politics of the Past: How Divergent Interpretations of History Shape East Asian Diplomatic Relations in the Present

David Cameron’s refusal to remove his poppy for his 2010 visit to China was revealing of a stark contrast in the significance granted to history in politics between himself (and the British political establishment as a whole) and his hosts. Whilst history has often played the role of a footnote to contemporary politics in the UK – as reflected by the severe lack of meaningful authority being granted to historians in any government department barring the Foreign Office, and even then only recently – it is central to the national self-portrayal of the Chinese nation. The ‘Century of Humiliation’ narrative that plays such a pivotal role in the story of the nation, as painted by the Chinese Communist Party, is one that the West would do well to take more notice of. Meanwhile, in Japan and Korea, the legacy of the Japanese colonial project looms large in contemporary relations. Perhaps as the ‘victors’ of modern history it is easy to relegate the past to that which went before. In Asia, where the nineteenth and twentieth centuries were ones of humiliation and soul-searching, it is impossible to simply sequester the past – it is intricately bound to the politics of the present.

China’s relations with the West underwent a radical shift in the Great Divergence of the nineteenth century, as European powers and the United States came to dominate the globalising world order. The reversal in fortunes suffered by the Qing Empire and, later, the modern Chinese state, has served to inform Chinese foreign policy and education ever since. Chairman Mao linked the Japanese imperialism of the early twentieth century to the Opium Wars of the nineteenth, and the same wars were used to justify Communist China’s ‘reaction’ against their Western oppressors. The Chinese national imagining has therefore come to be defined in opposition to, and in competition with, a West that remains stained by its past, a point of nuance that David Cameron failed so visibly to grasp in 2010, and one that continues to underlie the diplomatic fallacy that we are able to negotiate any sort of equal standing with the Chinese government. A competitive national consciousness has been fostered that means that ‘the West’ will always be cast as the natural point of comparison for China’s past failures and current successes, leaving them and the likes of the UK at polar ends of a dichotomy that western governments, until very recently, have failed to fully grasp.

A Nationalist officer guarding women prisoners likely to be comfort women used by the Communists, 1948.
(Credit: Jack Birns, The LIFE Picture Collection, Getty Images)

Elsewhere in East Asia, the memory of the Japanese military’s ‘comfort women’, who were drawn from across the Empire over the course of the Second World War and forced into what can only be described as sexual slavery, retains a pervasive political potency. The majority of these women were Korean and, though estimates vary, they seem to have numbered at least in the tens of thousands, and possibly in the hundreds of thousands. Indeed, such a range in estimates is itself a product of the topic’s controversial nature in the context of the two countries’ poor diplomatic relations in recent years. The plight of the comfort women and the allocation of responsibility for the crimes against them has come to represent a clearly drawn battle line between the two countries: Japanese nationalists, the recently departed Shinzo Abe amongst them, seek to play down the extent of official sanction for such atrocities, whilst Koreans pursue justice not only for the victims, but for the Korean nation as a whole. For the nations’ relations to reach some level of normality, both governments must look to find a compromise between what are currently polarised memories of the Japanese Empire. Forgetting those years is a luxury that only the oppressors may take, yet it is clear that in Korea too a way must be found for the nation to move on from the scars of its past.

Both of these cases demonstrate the historical dimension of diplomacy in the East Asian political sphere. A history of ruptures, clean breaks and colonial exploitation has bred national imaginings in which the traumas of the past play a central role. This significance is one that can be easily underestimated by those of us in the West for whom history has taken on an almost trivial status, as a backdrop to the present. Cameron underestimated it and it appears that our current leaders are also misunderstanding the inescapable threat posed by a Chinese leadership that places itself firmly in the context of historical competition with Western ‘imperialists’. Such cultural ignorance not only offends those whose culture is being ignored, but also hamstrings those guilty of that ignorance. Without a clear understanding of the other side’s thinking, diplomatic blunders like the poppy controversy are not likely to go anywhere anytime soon.

Samuel Lake, History in Politics Writer

Judging the Past: Can We Really Afford Not To?

University of Edinburgh historian Donald Bloxham has provided much food for thought in his recent article for the March edition of BBC History Magazine, entitled ‘Why History Must Take a Stance’. In it, he challenges the dogmatic insistence on neutrality that pervades the historical profession. Instead of feigning an unattainable neutrality, he argues, historians should take ownership of the judgements they make and the moral ‘prompts’ that they provide to their readers. Proclaiming neutrality is misleading, and possibly dangerous. I am inclined to agree.

Whilst neutrality is an honourable and necessary ambition for any historian, it is an ideal, and it is folly to suppose otherwise. No morally conscious human being can honestly claim to provide a totally neutral account of British imperialism, for instance. We tell a story in the way that we want to tell it, and there is a plethora of ways to tell that story, all of which have moral implications in the present. Language, as Bloxham observes, is a key factor. Can a historian who writes about the ‘exploitation’ and ‘subjugation’ of millions of human beings as a result of the Atlantic slave trade truly claim that they are providing a ‘neutral’ impression to their reader? These words carry weight, and rightly so. To talk about the past in totally neutral terms is not only impossible, but also heartless. The stories of the people whose lives were torn apart by past injustices deserve to be told, not only out of respect or disengaged interest but because they bear lessons that exert a tangible and morally didactic hold over us in the present.

The Lady of Justice statue outside the Old Bailey. (Credit: Into the Blue)

That is not to say that historical writing should take the form of a moral invective, lambasting the behaviour of dead people whom we can no longer hold to account. Nor is it to argue that historical relativism is not a vitally important and foundational principle of the profession. What I am proposing, however, is that when Richard J. Evans claims, in his otherwise brilliant ‘In Defence of History’, that we should refute E.H. Carr’s argument – that the human cost of collectivisation in the USSR was a necessary evil – in the ‘historian’s way’, by undermining its ‘historical validity’, he seems to be suggesting that we are not doing so with a moral purpose in mind. Indeed, suggesting that the costs outweighed the benefits is itself a moral judgement, for is it not judging the value of people’s lives? Whilst Evans claims that it is the reader who must infer this conclusion, not the historian, his economic argument (that collectivisation was no more successful than the policies that preceded it) is surely intended to ‘prompt’ it.

Evans, like most people, clearly opposes the morality of Carr’s argument, and his way of communicating this is in the (highly effective) ‘historian’s way’. But his purpose nonetheless is to influence the opinion of his readers, not simply to fulfil the role of historical automaton, providing those readers with every fact under the sun. The process of omission and admission is one that, try as we may to temper it, will always involve some degree of value judgement about which facts matter for the purpose of our argument and which do not. Such a value judgement will inevitably, at times, operate on a moral criterion.

This debate may, as is often the case with those that take historiography as their subject, appear somewhat academic. In a world in which our history does so much to define the identities of (and relations between) ethnic, social, cultural and political groups, however, it is anything but. What we can call the ‘neutrality complex’ runs the risk of imbuing the historical profession and its practitioners with a sense of intellectual superiority, forgetting the political consequences of its output. One can find little fault in Bloxham’s assertion that certain histories carry less moral weight, and are therefore more conducive to neutral assessment, but subjects with as much emotional resonance as the history of slavery, the Holocaust or Mao’s Great Famine cannot but be judgemental in nature. 

‘Neutrality’ can be a mask for the covert projection of nefarious ideologies and interpretations. Presenting something simply as ‘fact’ is irresponsible and shows great ignorance of the moral dispositions that influence what we write and how we write it. There is space and need for some degree, however tentative, of self-acknowledged judgement in historical writing. We owe it to our audience to declare our judgement and to justify it. The crimes of imperialism, genocide and slavery are universally evil. The historian has a concern and a duty to show their audience why those that claim otherwise, who hyperinflate relativism and claim neutrality, are guilty both of intellectual hubris and moral cowardice.

Samuel Lake, History in Politics Writer

The Environment Has No Ideology: Debating Which System Works Best is Inherently Flawed

It is often assumed that we in the ‘West’ are the arbiters of environmental policy, that we simply ‘care more’ than the rest of the world. ‘China’, for many, evokes images of flat-pack cities and rapid industrialisation synonymous with the stain left by humanity on the natural world. It is lazily viewed as an outlying hindrance to the global goal of sustainable development, whilst we remain wilfully ignorant of our own shortcomings, both past and present. Instead of viewing Chinese environmental negligence as unique, I argue, within the lingering paradigm of the ‘capitalist good/communist bad’ dichotomy, that a more bipartisan assessment of the root cause of environmental degradation may be in order. Our planet, after all, cares little for politics.

Many of China’s environmental failures have historically been attributed to the communist policies of the ruling party, particularly under Mao, whose ‘ren ding sheng tian’, or ‘man must conquer nature’ slogan has been presented by the historian Judith Shapiro as evidence of the Communist Party’s desire to dominate the natural world, even at the expense of its own people and environment. Of course, there is merit to this argument – the collectivisation of land and the Great Leap Forward’s unattainable targets wreaked havoc on the land and contributed in no small part to what Frank Dikötter has termed ‘Mao’s Great Famine’, which is estimated to have killed up to 45 million people between 1958 and 1962. It can be easy, therefore, for us to assume that this environmental exploitation is one peculiar to China’s communist system of government.

A factory in China by the Yangtze River, 2008. (Credit: Wikimedia Commons)

Without excusing the undoubtedly detrimental and inhumane policies of Mao’s government, we should view the environmental impact of the Chinese state’s rapid development in a more contextual manner. After all, did not the rampant capitalism of the Industrial Revolution in the United Kingdom lead to the explosion of soot-filled cities like Manchester, Liverpool and Birmingham, all of them centres of heightened industrial activity that harmed both their human populations and the surrounding environment? London’s death rate rose 40% during a period of smog in December 1873, and similarly we can look to the Great Smog of 1952, which the Met Office claims killed at least 4,000 people, possibly many more.

Industrial potteries in North Staffordshire during the nineteenth century. (Credit: StokeonTrent Live)

Geographically closer to China, the Japanese state has also shown in recent decades that pointing to ideology might be mistaken. The post-war Japanese growth-first, laissez-faire mentality left the likes of the Chisso Corporation in Minamata to their own devices, and the results were devastating. From 1956 through to the 1970s, first cats, then human residents of Minamata began coming down with a mysterious illness, one that caused ataxia and paralysis in its victims. It would transpire that what came to be known as ‘Minamata disease’ was the result of Chisso’s chemical plant releasing methylmercury into the town’s bay. This was absorbed by algae and passed up the food chain through the fish that local residents (both human and feline) were regularly consuming. The government’s inaction was deafening: despite the cause being known since 1959, change came only when it was forced by non-capitalist union pressure in the 1970s. If this seems like a problem confined to the past, one need only cast their mind back to the Fukushima disaster in 2011, ultimately the result of the irresponsible decision to pursue a nuclear energy policy on the disaster-prone Pacific Ring of Fire.

This article does not wish to make the case for either the capitalist or the communist system’s superiority in environmental affairs. Rather, it should be clear that the common thread running through all of these disasters – from the Great Smog to the Great Famine and Fukushima – is a policy that treats economic growth as the paramount standard of success, a dangerous approach that will inevitably lead to environmental destruction. The style and severity of that destruction may be influenced by ideology, but if we are to live in harmony with our environment, we must be willing to abandon the ideals of gain (collective or individual) and competition that have placed us in our current quandary, whatever the tint of our political stripes.

Samuel Lake, History in Politics Writer

Is It Time For An Elected Head of State?

Democracy and equality under the law have increasingly come to be seen as the gold standard for structuring societies ever since the Enlightenment. It may therefore appear odd to some that the United Kingdom, the ‘mother of parliamentary democracy’, is still reigned over by a monarchy. Stranger still, despite the drastic decline in the number of monarchies worldwide since the start of the 20th century, the British monarchy continues to sit at the heart of a proudly democratic nation and continues to enjoy significant popular support amongst the general public. Perhaps this will change with the passing of our current and longest-serving monarch, Queen Elizabeth II; perhaps the royal family will lose its purpose; or perhaps it will continue to hold steadfast as it has done in the face of major social transformations. But while there may be calls for the monarchy to be replaced by an elected head of state, we should ask ourselves what the monarchy both means to us and offers us, domestically and internationally, before we rush to any conclusions.

Queen Elizabeth II and Prince Philip. (Credit: AY COLLINS – WPA POOL/GETTY IMAGES)

While certainly debatable, I would contend that in its history, structures, and character, Britain is fundamentally a conservative nation. Not conservative in the sense that it strictly aligns with today’s Conservative party, but more in the sense of Burke or Oakeshott; we sacrifice democratic purity on the altar of an electoral system that is more inclined to produce stable and commanding governments; we still retain a strong support for the principle of national sovereignty in a world of increasing interdependence and cooperation; we take pride in our institutions, such as parliamentary democracy and our common law; and as evidenced by our addiction to tea, we value tradition. So is it really surprising that monarchy, the oldest form of government in the United Kingdom, still not only exists but enjoys significant public support? 

The monarchy is intended as a symbol of national identity, unity, pride and duty, and serves to provide a sense of stability and continuity across the lifespan of the nation (according to its own website). Its whole existence is rooted in the conservative disposition towards tradition, historical continuity, and the notion of collective wisdom accumulated across the ages that should not be readily discarded by those in the present. The monarchy is also politically impartial, and so able to provide that sense of unity as a symbol that cuts across factional lines. Finally, the royal family is not necessarily an obstacle to democracy anymore; we have a constitutional monarchy, whereby politicians make the decisions without arbitrary sovereign rule. The Sovereign’s role is not to dictate legislation undemocratically; it is to embody the spirit of the nation and exemplify a life of service and duty to country.

Conversely, many may say with good reason that the monarchy is outdated, elitist, and a spanner in the works of democracy. Indeed, monarchies are increasingly becoming a thing of the past, and in today’s world it may seem out of place to see a family living a life of unbounded riches and privilege simply by birthright. This is a view that is becoming increasingly popular among younger Britons. Additionally, one might contend that the monarchy has lost its magic; it no longer inspires the awe and reverence it once did, and is unable to invoke the sense of service and duty to country that it once could.

Prince Harry and Meghan Markle being interviewed by Oprah Winfrey. (Credit: Marie Claire)

While support for the British monarchy appears to be holding steady, even in the wake of the latest saga with Harry and Meghan, I believe that the monarchy is on thin ice. The age of deference has long since passed, and in an era of materialism and rationality, the ethereal touch of monarchy has arguably lost its draw. Perhaps this is a good thing, or perhaps now more than ever we need a symbol of unity and duty to do our best by our neighbour and country. What is worth pointing out though is that Queen Elizabeth II, our longest serving monarch, has led the country dutifully throughout her life, and it is worth considering deeply whether the alternative (President Boris Johnson?) is really a better option.

Leo Cullis, History in Politics Writer

Who Won the Good Friday Agreement?

The Good Friday Agreement was signed in 1998, heralding a new era of peace after the decades of violence that characterised the Troubles. But who benefitted the most from the signing of the Agreement, and is the answer different today than it was twenty years ago?

For unionists, represented most prominently by the Democratic Unionist Party (DUP) and the Ulster Unionist Party (UUP), the Good Friday Agreement ultimately symbolised the enshrining of the status quo in law: Northern Ireland remained a part of the UK. In addition, the Republic of Ireland renounced articles two and three of its Constitution, which laid claim to the entire island of Ireland. It may seem, then, that unionism was the victor in 1998, but elements of the Good Friday Agreement have been responsible for tectonic shifts in the period since, arguably exposing it, ultimately, as a victory for nationalism.

While Irish republicans in the form of the Provisional IRA were required to put down their weapons and suspend the violent struggle for a united Ireland, there is a compelling argument that the Good Friday Agreement laid the platform for the growth of the movement and perhaps even the fulfilment of the goal of Irish unity. For one, it mandated power-sharing between the two sides of the Northern Irish divide: both unionists and nationalists must share the leadership of government. Since 1998, this has acted to legitimise the nationalist cause, rooting it as a political movement rather than an armed struggle. Sinn Féin, the leading nationalist party in the North, has moved from the leadership of Gerry Adams and Martin McGuinness to new, younger leaders Mary Lou McDonald and Michelle O’Neill. Irish unity propaganda now emphasises the economic sense of a united Ireland, the need to counter the worst effects of Brexit on Northern Ireland and a sensible debate about the constitutional nature of a new country, rather than the adversarial anti-unionist simplicity of the Troubles. Here, nationalism did gain significantly from the Good Friday Agreement because it allowed this transition from the Armalite to the ballot box in return for legitimacy as a movement.

British Prime Minister Tony Blair (left) and Irish Prime Minister Bertie Ahern (right) signing the Good Friday Agreement. (Credit: PA, via BBC)

Most prominent for nationalists, however, is the fact that the Good Friday Agreement spells out the route to a united Ireland, explicitly stating that a ‘majority’ in a referendum would mandate Northern Ireland’s exit from the UK. While unclear on whether majorities would be required in both the Republic and Northern Ireland, as was sought for the Agreement itself in 1998, or on what criteria would have to be met in order to hold a vote, this gives Irish nationalism a legal and binding route to its ultimate goal of Irish unity, arguably the most prominent victory of the peace process.

Since 1998, Northern Ireland has seen a huge amount of political tension, governmental gridlock and occasional outbreaks of violence, most recently witnessed in the killing of journalist Lyra McKee in 2019. However, the Good Friday Agreement has served crucially to preserve peace on a scale unimaginable in the most intense years of the Troubles. If Irish nationalism achieves its goal of uniting the island, it will come about through a democratic referendum, not through violence. The very existence of the Good Friday Agreement, particularly its survival for over 20 years, is testament to the deep will for peace across the communities of Northern Ireland, forged in decades of conflict; it is this desire being fulfilled, even as the parties squabble in Stormont and the political status of Northern Ireland remains in the balance, that continues to make the biggest difference in the daily lives of both unionists and nationalists.

Joe Rossiter, History in Politics Writer

Margaret Thatcher: A Feminist Icon?

Thatcher’s lasting impact on twenty-first-century feminism is widely debated. Whilst her actions have inspired generations of ambitious young women in all professions, Thatcher was undoubtedly not a feminist. In fact, she actively disliked and directly discouraged feminist movements. Margaret Thatcher thus works as an apt example that a successful woman does not always mean a direct step forward for the women’s equality movement. Whilst Thatcher was Britain’s first female Prime Minister, it never occurred to her that she was a woman prime minister; she was, quite simply, a woman skilled and successful enough to work her way to the top.

Throughout Thatcher’s eleven-year premiership, she promoted only one woman to her cabinet; she felt men were best for the role. When questioned about this, Thatcher remarked that no other women were experienced or capable enough to rise through the ranks in the same way that she had. Similarly, whilst Thatcher demonstrated that women were now able to reach the top of the UK Government, she certainly did not attempt to make things easier for the women who might follow; she pulled the ladder straight up after herself. Thatcher claimed that she ‘did not owe anything to women’s liberation’. This was reflected in her policy provisions: she ignored almost all fundamental women’s issues. Despite hopeful attempts to raise female concerns with her, childcare provision, positive action and equal pay were not addressed during her years as Prime Minister. One can therefore conclude that Thatcher’s loyalty was almost exclusively to the Conservative Party, and that her vision focused on saving the country, not women.

The May 1989 Cabinet. (Credit: UPPA)

Thatcher resented being defined by her gender, but she served naturally as a role model and continues to do so (despite her policies) simply because she was a woman. Feminists, and women generally, in the media singled her out for that alone. In this way, examples matter, just as Obama’s presidency matters for young generations of the BAME community. Perhaps Thatcher’s refusal to acknowledge the glass ceiling is precisely what allowed her to smash it so fearlessly: if she couldn’t see it, no one could point it out to her. Amber Rudd has claimed that she is a direct beneficiary of Thatcher’s valiance, as her work debunked the idea that only men could survive, let alone prosper, in political life. Men were undoubtedly perturbed by Thatcher’s female presence and conviction; Jon Snow has gone as far as describing interviewing her as ‘unnerving’. These predispositions, held by both men and women, were fundamentally and forcefully shifted by Thatcher.

Yet is symbolism alone enough to praise Thatcher as a feminist icon? It is of course clear that being a symbolic role model is incomparable to actively and tirelessly campaigning to seek change for women. The influence she had on the feminist movement was not a result of her own actions or policies. Rather, it is a result of her being the first woman to make such exceptional progress, as a total outsider: not only becoming Prime Minister, but going on to win two successive elections convincingly.

Amelia Crick

Women in Terrorism: An Invisible Threat?

In 1849 the world met its first female doctor, Elizabeth Blackwell. Years later, in 1903, Marie Curie became the first woman to win a Nobel Prize, for her outstanding contributions to physics. How, with many more remarkable achievements behind women, does society continue to hold such limited expectations of them? Why does the concept of a female terrorist seem so improbable to the vast majority of the Western world?

While this may appear a perverse logic, almost rendering terrorism a positive milestone for women, that is certainly not the intention. Instead, I hope to enlighten the reader to the gendered dimensions of terrorism, and to highlight the escalating need to perceive women as potentially equal vessels of terror.

The BBC series Bodyguard follows Police Sergeant David Budd’s protection of Home Secretary Julia Montague in a fast-paced drama. The plot twist in the season finale centres on a Muslim woman, who is revealed as the architect and bomb-maker behind the attack. Although some have found this portrayal troublesome, displaying Islamophobic overtones, Anjli Mohindra, the actress, explains that the role was actually “empowering”. Regardless of these perceptions, it is clear that the ‘surprise’ element manifests itself in the female gender. This sentiment persists outside the media too, highlighting the potential threat posed by gendered assumptions.

Anjli Mohindra playing terrorist Nadia in BBC One’s Bodyguard. (Credit: BBC)

There is an undeniable and widespread assumption that terrorists are always male. While this assumption could be ascribed to the smaller numbers of women involved in terrorism, it is more likely attributable to embedded gender stereotypes. Such stereotypes, perceiving women as maternal and nurturing but also helpless and passive, are irreconcilable with the image of an individual knowingly committing acts that cause death and disruption. In 2015, when women such as Shamima Begum and Kadiza Sultana left East London for the Islamic State, they were depicted in the media as archetypal ‘jihadi bride[s]’: meek, manipulated and denied any agency in their decision. Yet an accurate representation of women in terrorism needs to transcend the constraints of traditional gender constructs. Although we may be aware of female stereotypes, why do they continue to permeate our understanding of women in terrorism when we claim to be an equal society?

The reality of women in terror is quite contrary to the aforementioned stereotypes. In January 2002, Wafa Idris became the first female Palestinian suicide bomber. Since then, women have carried out over 50% of successful suicide bombings in the conflicts in Turkey, Sri Lanka and Chechnya. In more recent years, the Global Extremism Monitor recorded 100 distinct suicide attacks conducted by female militants in 2017, constituting 11% of the total incidents that year. Moreover, Boko Haram’s female members have been so effective in their role as suicide bombers that women now comprise close to two-thirds of the group’s suicide attackers.

It is perhaps the dominance of prevailing stereotypes about women that enables them to be so successful in their attacks, presenting terrorist organisations with a strategic advantage. This is illustrated by the astonishing figures showing that female suicide attacks are, on average, more lethal than those conducted by men. According to one study, attacks carried out by women claimed an average of 8.4 victims, compared to 5.3 for attacks carried out by men. Weaponising the female body is proving successful because society continues to assume that women lack credibility as terrorist actors. Needless to say, remaining shackled to entrenched gender preconceptions will undoubtedly continue to place society at risk of unanticipated terror attacks by women.

Emily Glynn, History in Politics President