Recycling Political Establishments?

Abdelaziz Bouteflika's announcement in 2019 that he would stand for a fifth presidential term ignited a wave of furious Algerians opposed to the monotonous and stagnant regime under his rule. In a political system shaped by the legacy of independence in 1962, Bouteflika's neo-patrimonial and authoritarian rule disrupted the country's social contract and cost its rulers their legitimacy. Algerian protestors peacefully took to the streets against Bouteflika's bid, and the pressure exerted by the Hirak movement, alongside the military, led to his resignation, restoring a sense of hope and new beginnings for the Algerian people. Bouteflika's resignation exposed the profound fractures within the Algerian political establishment, but it also left political actors uncertain about how to proceed in a post-Bouteflika era.

The goals of the Hirak met a rancorous end as the country's military leadership rebuffed any additional concessions, ignoring all calls for an essential transition period. Algeria's political establishment instead marshalled propaganda and authoritarianism to force through presidential elections in December 2019, resulting in the presidency of former Prime Minister Abdelmadjid Tebboune. Algeria has continued to struggle politically over the past two years: the Hirak movement has gradually lost momentum, and political stability remains elusive as a vicious cycle of tainted political actors continually suspends urgently needed political and economic reforms. The Covid-19 pandemic has heightened the country's economic and political struggles, pushing it towards a multifaceted crisis; one would not be surprised to see the spirit of Algeria during the Arab Spring revived in the coming years as the people's needs are dismissed by the political elite.

Painted portrait of Abdelaziz Bouteflika. (Credit: Abode of Chaos, via Flickr)

The Hirak has become irrevocably divided, as its groups no longer share consistent socio-political aspirations, most notably in the split surrounding the new-reformist camp. The movement's internal weaknesses have meant that it has failed to establish a clear agenda for what exactly it seeks to achieve. Dialogue within the Hirak is a necessary channel to any form of success, yet it is overdue; unless Algeria faces an existential threat that pushes the system to engage collectively, there seems little prospect of progress in its political and economic standing.

Although the Hirak has not achieved its major goals, the opposition movement has sparked a genuine desire and need for political and social progress; this, however, may take years to attain, and time is not on Algeria's side given its serious economic and political challenges. The international community's abandonment of Algeria since 2019 has further complicated matters. Algeria habitually favours the status quo and may well reject any interference in its internal affairs; however, the international community could encourage a course of internal dialogue or aid the Hirak with its organisation by pressing for greater civil and political freedoms. Algeria may not be a priority for the Biden-Harris administration; nonetheless, hand in hand with its recently reinforced relations with European governments, the United States has greater potential to revive a collective effort towards a transition period for Algeria. The June 12th snap election has not instigated any meaningful change so far, with the majority of the population boycotting the vote as the military remains in control. Although it would be precarious to call for radical and instant change, it is necessary that Algeria gradually works towards reforms that benefit both the opposition and the system.

Lydia Benaicha, History in Politics Contributor

The Politics of the Past: How Divergent Interpretations of History Shape East Asian Diplomatic Relations in the Present

David Cameron’s refusal to remove his poppy for his 2010 visit to China was revealing of a stark contrast in the significance granted to history in politics between himself (and the British political establishment as a whole) and his hosts. Whilst history has often played the role of a footnote to contemporary politics in the UK – as reflected by the severe lack of meaningful authority being granted to historians in any government department barring the Foreign Office, and even then only recently – it is central to the national self-portrayal of the Chinese nation. The ‘Century of Humiliation’ narrative that plays such a pivotal role in the story of the nation, as painted by the Chinese Communist Party, is one that the West would do well to take more notice of. Meanwhile, in Japan and Korea, the legacy of the Japanese colonial project looms large in contemporary relations. Perhaps as the ‘victors’ of modern history it is easy to relegate the past to that which went before. In Asia, where the nineteenth and twentieth centuries were ones of humiliation and soul-searching, it is impossible to simply sequester the past – it is intricately bound to the politics of the present.

China’s relations with the West underwent a radical shift in the Great Divergence of the nineteenth century, as European powers and the United States came to dominate the globalising world order. The reversal in fortunes suffered by the Qing Empire and, later, the modern Chinese state, has served to inform Chinese foreign policy and education ever since. Chairman Mao linked the Japanese imperialism of the early twentieth century to the Opium Wars of the nineteenth, and the same wars were used to justify Communist China’s ‘reaction’ against their Western oppressors. The Chinese national imagining has therefore come to be defined in opposition to, and in competition with, a West that remains stained by its past, a point of nuance that David Cameron failed so visibly to grasp in 2010, and one that continues to underlie the diplomatic fallacy that we are able to negotiate any sort of equal standing with the Chinese government. A competitive national consciousness has been fostered that means that ‘the West’ will always be cast as the natural point of comparison for China’s past failures and current successes, leaving them and the likes of the UK at polar ends of a dichotomy that western governments, until very recently, have failed to fully grasp.

A Nationalist officer guarding women prisoners likely to be comfort women used by the Communists, 1948.
(Credit: Jack Birns, The LIFE Picture Collection, Getty Images)

Elsewhere in East Asia, the memory of the Japanese military’s ‘comfort women’, who were drawn from across the Empire through the course of the Second World War and forced into what can only be described as sexual slavery, retains a pervasive political potency. The majority of these women were Korean and though estimates vary, they seem to have numbered in at least the tens, possibly hundreds of thousands. Indeed, such a range in estimates comes as a result of the topic’s controversial nature in the context of the countries’ poor diplomatic relations in recent years. The plight of the comfort women and the allocation of responsibility for the crimes against them has come to represent a clearly drawn battle line between the two countries – Japanese nationalists, the recently departed Shinzo Abe amongst them, seeking to play down the extent of official sanction for such atrocities, whilst Koreans pursue justice not only for the victims, but for the Korean nation as a whole. In order for the nations’ relations to reach some level of normality, the governments of both must look to find a compromise between what are currently polarised memories of the Japanese Empire. Forgetting those years is a luxury that only the oppressors may take, yet it is clear that in Korea too a way must be found for the nation to move on from the scars of their past.

Both of these cases demonstrate the historical dimension of diplomacy in the East Asian political sphere. A history of ruptures, clean breaks and colonial exploitation has bred national imaginings in which the traumas of the past play a central role. This significance is one that can be easily underestimated by those of us in the West for whom history has taken on an almost trivial status, as a backdrop to the present. Cameron underestimated it and it appears that our current leaders are also misunderstanding the inescapable threat posed by a Chinese leadership that places itself firmly in the context of historical competition with Western ‘imperialists’. Such cultural ignorance not only offends those whose culture is being ignored, but also hamstrings those guilty of that ignorance. Without a clear understanding of the other side’s thinking, diplomatic blunders like the poppy controversy are not likely to go anywhere anytime soon.

Samuel Lake, History in Politics Writer

Judging the Past: Can We Really Afford Not To?

University of Edinburgh historian Donald Bloxham has provided much food for thought in his recent article for the March edition of BBC History Magazine, entitled ‘Why History Must Take a Stance’. In it, he challenges the dogmatic insistence on neutrality that pervades the historical profession. Instead of feigning an unattainable neutrality, he argues, historians should take ownership of the judgements they make and the moral ‘prompts’ that they provide to their readers. Proclaiming neutrality is misleading, and possibly dangerous. I am inclined to agree.

Whilst neutrality is an honourable and necessary ambition for any historian, it is an ideal, and it is folly to suppose otherwise. No morally conscious human being can honestly claim to provide a totally neutral account of British imperialism, for instance. We tell a story in the way that we want to tell it, and there are a plethora of ways of telling that story, all of which have moral implications in the present. Language, as Bloxham observes, is a key factor. Can a historian who writes about the 'exploitation' and 'subjugation' of millions of human beings as a result of the Atlantic slave trade truly claim that they are providing a 'neutral' impression to their reader? These words carry weight, and rightly so. To talk about the past in totally neutral terms is not only impossible, but also heartless. The stories of the people whose lives were torn apart by past injustices deserve to be told, not only out of respect or disengaged interest but because they bear lessons that exert a tangible and morally didactic hold over us in the present.

The Lady of Justice statue outside the Old Bailey. (Credit: Into the Blue)

That is not to say that historical writing should take the form of a moral invective, lambasting the behaviour of dead people whom we can no longer hold to account. Nor is it to argue that historical relativism is not a vitally important and foundational principle of the profession. What I am proposing, however, is that when Richard J. Evans claims, in his otherwise brilliant ‘In Defence of History’, that we should refute E.H. Carr’s argument – that the human cost of collectivisation in the USSR was a necessary evil – in the ‘historian’s way’, by undermining its ‘historical validity’, he seems to be suggesting that we are not doing so with a moral purpose in mind. Indeed, suggesting that the costs outweighed the benefits is itself a moral judgement, for is it not judging the value of people’s lives? Whilst Evans claims that it is the reader who must infer this conclusion, not the historian, his economic argument (that collectivisation was no more successful than the policies that preceded it) is surely intended to ‘prompt’ it.

Evans, like most people, clearly opposes the morality of Carr’s argument, and his way of communicating this is in the (highly effective) ‘historian’s way’. But his purpose nonetheless is to influence the opinion of his readers, not simply to fulfil the role of historical automaton, providing those readers with every fact under the sun. The process of omission and admission is one that, try as we may to temper it, will always involve some degree of value judgement about which facts matter for the purpose of our argument and which do not. Such a value judgement will inevitably, at times, operate on a moral criterion.

This debate may, as is often the case with those that take historiography as their subject, appear somewhat academic. In a world in which our history does so much to define the identities of (and relations between) ethnic, social, cultural and political groups, however, it is anything but. What we can call the 'neutrality complex' runs the risk of imbuing the historical profession and its practitioners with a sense of intellectual superiority, forgetting the political consequences of its output. One can find little fault in Bloxham's assertion that certain histories carry less moral weight, and are therefore more conducive to neutral assessment, but histories of subjects with as much emotional resonance as slavery, the Holocaust or Mao's Great Famine cannot but be judgemental in nature.

‘Neutrality’ can be a mask for the covert projection of nefarious ideologies and interpretations. Presenting something simply as ‘fact’ is irresponsible and shows great ignorance of the moral dispositions that influence what we write and how we write it. There is space and need for some degree, however tentative, of self-acknowledged judgement in historical writing. We owe it to our audience to declare our judgement and to justify it. The crimes of imperialism, genocide and slavery are universally evil. The historian has a concern and a duty to show their audience why those that claim otherwise, who hyperinflate relativism and claim neutrality, are guilty both of intellectual hubris and moral cowardice.

Samuel Lake, History in Politics Writer

The Environment Has No Ideology: Debating Which System Works Best is Inherently Flawed

It is often assumed that we in the 'West' are the arbiters of environmental policy, that we simply 'care more' than the rest of the world. 'China', for many, evokes images of flat-pack cities and rapid industrialisation synonymous with the stain left by humanity on the natural world. It is lazily viewed as an outlying hindrance to the global goal of sustainable development, whilst we remain wilfully ignorant of our own shortcomings, both past and present. Instead of viewing Chinese environmental negligence as unique within the lingering paradigm of the 'capitalist good/communist bad' dichotomy, I argue that a more bipartisan assessment of the root cause of environmental degradation may be in order. Our planet, after all, cares little for politics.

Many of China's environmental failures have historically been attributed to the communist policies of the ruling party, particularly under Mao, whose 'ren ding shen jian', or 'man must conquer nature' slogan has been presented by the historian Judith Shapiro as evidence of the Communist Party's desire to dominate the natural world, even at the expense of its own people and environment. Of course, there is merit to this argument – the collectivisation of land and the Great Leap Forward's unattainable targets wreaked havoc on the land and contributed in no small part to what Frank Dikötter has termed 'Mao's Great Famine', which is estimated to have killed up to 45 million people between 1958 and 1962. It can be easy, therefore, for us to assume that this environmental exploitation is one peculiar to China's communist system of government.

A factory in China by the Yangtze River, 2008. (Credit: Wikimedia Commons)

Without excusing the undoubtedly detrimental and inhumane policies of Mao's government, we should view the environmental impact of the Chinese state's rapid development in a more contextual manner. After all, did not the rampant capitalism of the Industrial Revolution in the United Kingdom lead to the explosion of soot-filled cities like Manchester, Liverpool and Birmingham, all of them centres of heightened industrial activity that harmed both their human populations and the surrounding environment? London's death rate rose 40% during a period of smog in December 1873, and similarly, we can look to the Great Smog of 1952, which the Met Office claims killed at least 4,000 people, possibly many more.

Industrial potteries in North Staffordshire during the nineteenth century. (Credit: StokeonTrent Live)

Geographically closer to China, the Japanese state has also shown in recent years that pointing to ideology might be mistaken. The post-war Japanese growth-first and laissez-faire mentality left the likes of the Chisso Corporation in Minamata to their own devices, and the results were devastating. From 1956 through to the 1970s, first cats, then human residents of Minamata began coming down with a mysterious illness, one that caused ataxia and paralysis in its victims. It would transpire that what came to be known as 'Minamata disease' was the result of Chisso's chemical plant releasing methylmercury into the town's bay. This was absorbed by algae and passed up the food chain through the fish that local residents (both human and feline) were regularly consuming. Government inaction persisted, despite the cause being known since 1959, and change only came when it was forced by non-capitalist union pressure in the 1970s. If this seems like a problem confined to the past, one need only cast one's mind back to the Fukushima disaster in 2011, ultimately the result of the irresponsible decision to pursue a nuclear energy policy on the disaster-prone Pacific Ring of Fire.

This article does not wish to make the case for either the capitalist or the communist system's superiority in environmental affairs. Rather, it should be clear that the common thread running through all of these disasters – from the Great Smog to the Great Famine and Fukushima – is that a policy emphasising economic growth as the paramount standard of success is a dangerous one that will inevitably lead to environmental destruction. The style and severity of that destruction may be influenced by ideology, but if we are to live in harmony with our environment, we must be willing to abandon the ideals of gain (collective or individual) and competition that have placed us in our current quandary, whatever the tint of our political stripes.

Samuel Lake, History in Politics Writer

Is It Time For An Elected Head of State?

Democracy and equality under the law have increasingly come to be seen as the gold standard for structuring societies ever since the Enlightenment. It may therefore appear odd to some that the United Kingdom, the 'mother of parliamentary democracy', is still reigned over by a monarchy. Stranger still is that, despite the drastic decline in the number of monarchies worldwide since the start of the 20th century, the British monarchy continues to sit at the heart of a proudly democratic nation and continues to enjoy significant popular support among the general public. Perhaps this will change with the passing of our current and longest-serving monarch, Queen Elizabeth II; perhaps the royal family will lose its purpose; or perhaps it will continue to hold steadfast as it has done in the face of major social transformations. But while there may be calls for the monarchy to be replaced by an elected head of state, we should ask ourselves what the monarchy both means to us and offers us, domestically and internationally, before we rush to any conclusions.

Queen Elizabeth II and Prince Philip. (Credit: AY COLLINS – WPA POOL/GETTY IMAGES)

While certainly debatable, I would contend that in its history, structures, and character, Britain is fundamentally a conservative nation. Not conservative in the sense that it strictly aligns with today’s Conservative party, but more in the sense of Burke or Oakeshott; we sacrifice democratic purity on the altar of an electoral system that is more inclined to produce stable and commanding governments; we still retain a strong support for the principle of national sovereignty in a world of increasing interdependence and cooperation; we take pride in our institutions, such as parliamentary democracy and our common law; and as evidenced by our addiction to tea, we value tradition. So is it really surprising that monarchy, the oldest form of government in the United Kingdom, still not only exists but enjoys significant public support? 

The monarchy is intended as a symbol of national identity, unity, pride and duty, and serves (according to its website) to provide a sense of stability and continuity across the lifespan of the nation. Its whole existence is rooted in the conservative disposition towards tradition, historical continuity, and the notion of a collective wisdom accumulated across the ages that should not be readily discarded by those in the present. The monarchy is also politically impartial, and so able to provide that sense of unity as a symbol that cuts across factional lines. Finally, the royal family is no longer necessarily an obstacle to democracy; we have a constitutional monarchy, whereby politicians make the decisions without arbitrary sovereign rule. The Sovereign's role is not to dictate legislation undemocratically, but to embody the spirit of the nation and exemplify a life of service and duty to country.

Conversely, many may say with good reason that the monarchy is outdated, elitist, and a spanner in the works for democracy. Indeed, monarchies are increasingly becoming a thing of the past, and in today's world it may seem out of place to see a family living a life of unbounded riches and privileges simply by birthright. This is a view that is becoming increasingly popular among younger Britons. Additionally, one might contend that the monarchy has lost its magic; it no longer inspires the same awe and reverence it once did, and is unable to invoke the sense of service and duty to country that it once could.

Prince Harry and Meghan Markle being interviewed by Oprah Winfrey. (Credit: Marie Claire)

While support for the British monarchy appears to be holding steady, even in the wake of the latest saga with Harry and Meghan, I believe that the monarchy is on thin ice. The age of deference has long since passed, and in an era of materialism and rationality, the ethereal touch of monarchy has arguably lost its draw. Perhaps this is a good thing, or perhaps now more than ever we need a symbol of unity and duty to do our best by our neighbour and country. What is worth pointing out though is that Queen Elizabeth II, our longest serving monarch, has led the country dutifully throughout her life, and it is worth considering deeply whether the alternative (President Boris Johnson?) is really a better option.

Leo Cullis, History in Politics Writer

Who Won the Good Friday Agreement?

The Good Friday Agreement was signed in 1998, heralding a new era of peace after the decades of violence that characterised the Troubles. But who benefitted the most from the signing of the Agreement, and is the answer different today than it was twenty years ago?

For unionists, represented most prominently by the Democratic Unionist Party (DUP) and the Ulster Unionist Party (UUP), the Good Friday Agreement ultimately symbolised the enshrining of the status quo in law: Northern Ireland remained a part of the UK. In addition, the Republic of Ireland renounced articles two and three of its Constitution, which laid claim to the entire island of Ireland. It may seem, then, that unionism was the victor in 1998, but elements of the Good Friday Agreement have been responsible for tectonic shifts in the period since, arguably exposing it, ultimately, as a victory for nationalism.

While Irish republicans in the form of the Provisional IRA were required to put down their weapons and suspend the violent struggle for a united Ireland, there is a compelling argument that the Good Friday Agreement laid the platform for the growth of the movement and perhaps even the fulfilment of the goal of Irish unity. For one, it mandated power-sharing between the two sides of the Northern Irish divide: both unionists and nationalists must share the leadership of government. Since 1998, this has acted to legitimise the nationalist cause, rooting it as a political movement rather than an armed struggle. Sinn Féin, the leading nationalist party in the North, have moved from the leadership of Gerry Adams and Martin McGuinness to new, younger leaders Mary Lou McDonald and Michelle O'Neill. Irish unity propaganda now emphasises the economic sense of a united Ireland, the need to counter the worst effects of Brexit on Northern Ireland and a sensible debate about the constitutional nature of a new country, rather than the adversarial anti-unionist simplicity of the Troubles. Here, nationalism did gain significantly from the Good Friday Agreement because it allowed this transition from the Armalite to the ballot box in return for legitimacy as a movement.

British Prime Minister Tony Blair (left) and Irish Prime Minister Bertie Ahern (right) signing the Good Friday Agreement. (Credit: PA, via BBC)

Most prominent for nationalists, however, is the fact that the Good Friday Agreement spells out the route to a united Ireland, explicitly stating that a 'majority' in a referendum would mandate Northern Ireland's exit from the UK. While unclear on whether majorities would be required both in the Republic and in Northern Ireland, as was sought for the Agreement itself in 1998, or on what criteria would have to be met in order to hold a vote, this gives Irish nationalism a legal and binding route to its ultimate goal of Irish unity, arguably the most prominent victory of the peace process.

Since 1998, Northern Ireland has seen a huge amount of political tension, governmental gridlock and occasional outbreaks of violence, most recently witnessed in the killing of journalist Lyra McKee in 2019. However, the Good Friday Agreement has served crucially to preserve peace on a scale unimaginable in the most intense years of the Troubles. If Irish nationalism achieves its goal of uniting the island, it will come about through a democratic referendum, not through violence. The very existence of the Good Friday Agreement, particularly its survival for over 20 years, is testament to the deep will for peace across the communities of Northern Ireland, forged in decades of conflict; it is this desire being fulfilled, even as the parties squabble in Stormont and the political status of Northern Ireland remains in the balance, that continues to make the biggest difference in the daily lives of both unionists and nationalists.

Joe Rossiter, History in Politics Writer

Margaret Thatcher: A Feminist Icon?

Thatcher’s lasting impact on twenty-first century feminism is widely debated. Whilst her actions have inspired future generations of ambitious young women in all professions, Thatcher was undoubtedly not a feminist. In fact, she actively disliked and directly discouraged feminist movements. Thus, Margaret Thatcher works as an apt example that a successful woman does not always mean a direct step forward for the women’s equality movement. Instead, whilst Thatcher was our first British female Prime Minister, it never occurred to her that she was a woman prime minister; she was, quite simply, a woman that was skilled and successful enough to work her way to the top. 

Throughout Thatcher's eleven-year premiership, she promoted only one woman to her cabinet; she felt men were best for the role. When questioned about this, Thatcher remarked that no other women were experienced or capable enough to rise through the ranks in the same way that she had. Similarly, whilst Thatcher demonstrated that women were now able to get to the top of the UK Government, she certainly did not attempt to make it easier for other women to follow; she pulled the ladder straight up after herself. Thatcher claimed that she 'did not owe anything to women's liberation', and this was reflected in her policies: she ignored almost all fundamental women's issues. Despite hopeful attempts to raise such concerns with her, childcare provision, positive action and equal pay were not addressed during her years as Prime Minister. One can therefore conclude that Thatcher's loyalty was almost exclusively to the Conservative Party, and that her vision focused on saving the country, not women.

The May 1989 Cabinet. (Credit: UPPA)

Thatcher resented being defined by her gender, but she served naturally as a role model and continues to do so (despite her policies) simply because she was a woman. She was singled out by feminists (and women generally) in the media simply for being a woman. In this way, examples matter, just as Obama's presidency matters for young generations of the BAME community. Perhaps Thatcher's refusal to acknowledge the glass ceiling is precisely what allowed her to smash it so fearlessly: if she couldn't see it, no one could point it out to her! Amber Rudd has claimed that she is a direct beneficiary of Thatcher's valiance, as her work debunked the idea that only men could survive, let alone prosper, in political life. Men were undoubtedly perturbed by Thatcher's female presence and conviction; Jon Snow has gone as far as describing interviewing her as 'unnerving'. These predispositions, held by both men and women, were fundamentally and forcefully shifted by Thatcher.

Yet is symbolism alone enough to praise Thatcher as a feminist icon? It is, of course, clear that being a symbolic role model is not comparable to the work of those who have actively and tirelessly campaigned for change for women. The influence she had on the feminist movement was not a result of her own actions or policies. Rather, it is a result of her being the first woman, a total outsider, to make such exceptional progress: not only becoming our Prime Minister, but going on to win two successive elections convincingly.

Amelia Crick

Women in Terrorism: An Invisible Threat?

In 1849 the world met its first female doctor, Elizabeth Blackwell. Years later in 1903, the first woman to win a Nobel Prize, Marie Curie, did so for her outstanding contributions to Physics. How, with many more remarkable achievements behind women, does society continue to hold limited expectations of them? Why does the concept of a female terrorist seem so improbable to the vast majority of the Western world? 

While this logic may appear perverse, almost rendering terrorism a positive milestone for women, that is certainly not the intention. Instead, I hope to enlighten the reader to the gendered dimensions of terrorism, and to highlight the escalating need to perceive women as potentially equal vessels of terror.

The BBC series Bodyguard focuses on Police Sergeant David Budd's protection of Home Secretary Julia Montague in a fast-paced drama. The plot twist in the season finale centres on a Muslim woman, who is revealed as the architect and bomb-maker behind the attack. Although some have found this portrayal troublesome, displaying Islamophobic overtones, the actress Anjli Mohindra explains that the role was actually "empowering". Regardless of these perceptions, it is clear that the 'surprise' element lies in the bomber's gender. This sentiment persists outside the media too, highlighting the potential threat posed by such limited expectations of women.

Anjli Mohindra playing terrorist Nadia in BBC One’s Bodyguard. (Credit: BBC)

There is an undeniable and widespread assumption that terrorists are always male. While this assumption could be ascribed to the smaller numbers of women involved in terrorism, it is more likely attributable to embedded gender stereotypes. Such stereotypes, perceiving women as maternal and nurturing, but also helpless and passive, are irreconcilable with the image of an individual knowingly committing acts that cause death and disruption. In 2015, when women such as Shamima Begum and Kadiza Sultana left East London for the Islamic State, they were depicted in the media as archetypal 'jihadi bride[s]': meek, manipulated and denied any agency in their decision. Yet an accurate representation of women in terrorism needs to transcend the constraints of traditional gender constructs. Although we may be aware of these stereotypes, why do they continue to permeate our understanding of women in terrorism when we claim to be an equal society?

The reality of women in terror is quite contrary to the aforementioned stereotype. In January 2002, Wafa Idris became the first female Palestinian suicide bomber. Since then, women have carried out over 50% of successful suicide bombings in the conflicts in Turkey, Sri Lanka and Chechnya. More recently, the Global Extremism Monitor recorded 100 distinct suicide attacks conducted by female militants in 2017, constituting 11% of the total incidents that year. Moreover, Boko Haram's female members have been so effective in their role as suicide bombers that women now comprise close to two-thirds of the group's suicide attackers.

It is perhaps the dominance of prevailing stereotypes about women that enables them to be so successful in their attacks, presenting terrorist organisations with a strategic advantage. This is illustrated by the striking figures showing female suicide attacks to be more lethal, on average, than those conducted by men. According to one study, attacks carried out by women had an average of 8.4 victims, compared to 5.3 for those carried out by men. Weaponising the female body is proving successful because society continues to assume that women lack credibility as terrorist actors. Needless to say, remaining shackled to entrenched gender preconceptions will undoubtedly continue to place society at risk of unanticipated terror attacks by women.

Emily Glynn, History in Politics President

Four “Non-feminist” Feminists in British history

When we think of feminism, we think of women holding strongly coloured flags of green, white and gold, or green, white and purple, in historical photos. We think of women and girls who spoke on the news demanding equal opportunities, better provision of pregnancy and abortion advice, and the liberation of women in third-world countries. We think of Malala speaking at international conferences, Jessica Chastain acting in Zero Dark Thirty and Facebook executive Sheryl Sandberg on the cover of Time Magazine. They are all strong women who openly declare themselves feminists and are frequently and publicly involved in feminist organisations and activities.

Yet we often forget to include some women as feminists, either because they do not carry those clear 'feminist features', or simply because they did not consider, or even refused to consider, themselves feminists. They are like the Celia Foote character in The Help (interestingly also played by Jessica Chastain): not feminists in the conventional way of the Skeeter character (played by Emma Stone), and often doubting themselves, yet going against the mainstream in their thoughts and actions. They are undoubtedly feminists, and they proved to be great thinkers and writers of their times.

Hildegard of Bingen

Unknown to many, feminism has roots in religious contexts in which women found the opportunity to express their thoughts as freely as men. For centuries, especially in Europe, families sent their 'unmarriageable' daughters away to convents. Some women found the time and quiet a great opportunity to think, read and write. One of them is our first 'non-feminist' feminist, Hildegard of Bingen.

Saint Hildegard. (Credit: Franz Waldhäusl, via imageBROKER – http://www.agefotostock.com.)

Hildegard was born in the 11th century and became a nun as a teenager (the exact age of her enclosure is subject to debate). She later became the abbess of a small Rhineland convent. People do not normally consider her a feminist, as she lacked most of the elements now associated with feminism, yet through her actions she was certainly a pioneer in proving that women could do exactly what men could do. Hildegard was considered by many in Europe to be the founder of scientific natural history in Germany. She is also one of the few known chant composers to have written both the music and the words. Hildegard also produced two volumes of material on natural medicine and cures and three great volumes of visionary theology, all widely celebrated, which led to her recognition as a Doctor of the Church, one of the highest titles given by the Catholic Church to saints recognised as having made a significant contribution to theology or doctrine through their research, study, or writing.

Not only was she deeply involved in what was conventionally thought of as men's work – music, scholarship, medicine and religious theory – she also frequently wrote letters to popes, emperors, abbots and abbesses expressing her critical views on a wide range of topics. These significant political, social and economic figures often approached her for advice. She even invented a language called the Lingua ignota ("unknown language").

Despite her apparently feminist way of doing things, another thing that often dissociates her from feminism is her frequent self-doubt. Often doubtful about her 'unfeminine' activities, she was always insecure about being an uneducated woman and was not confident in her ability to write. She once wrote to Bernard of Clairvaux, one of the leading churchmen of the time, asking whether she should continue with her writing and composing, even though by that time she was already widely known and honoured as an incredible writer and musician.

But still, Hildegard was one of the few women in medieval history who wrote so freely and critically on everything, and she was viewed as an impressive writer, musician and religious leader on the basis of her achievements rather than her gender. That made her a feminist.

Margaret Cavendish 

Margaret Cavendish was born in the 17th century into a family of well-established Royalist landowners, and later married the Duke of Newcastle. She was one of the few fortunate women of her time whose husband encouraged her to pursue her literary ambitions.

At first, she wrote on topics mostly associated with women's internal struggles, ranging from worries about their families to their common fear and sorrow over their children's sickness and death, though she did not herself experience what she wrote about. Well received by women, her writings were found to be very moving and unexpectedly understanding of the harsh realities faced by many women at that time.

Margaret Cavendish, the Duchess of Newcastle. (Credit: Kean Collection, via Getty Images)

Later, Cavendish began writing philosophical verse. Though her work was widely recognised, like Hildegard she often doubted her capacity and her duty as a woman when writing. A modern biographer once remarked that Cavendish felt torn between 'the (feminine and Christian) virtue of modesty' and her ambitions. On the one hand, she was very serious and confident about her work; on the other, she often depreciated it with defensive and apologetic justifications.

From Cavendish's self-deprecation, it might seem that she lacked the confidence of a feminist to denounce the conventional status of women and to declare loudly that her work should be equally recognised. But she wrote like a feminist in the sense that she brought women's issues – then thought a topic unworthy of writing, of no political, social or any other importance – to public attention. She brought the dark side, the internal perspective of women's struggles in the household, into the open. She also spoke out against the hostility towards any woman regarded as outspoken or ambitious, which in her time was deemed madness and dangerous. Her writings also encouraged later women to write and urged them to unite rather than always being jealously critical of other women's achievements. That made her a feminist.

Dorothy Osborne 

Like Cavendish, Dorothy Osborne was born in the 17th century into a family of Royalists. She is comparatively less well known than the other three 'non-feminist' feminists here, as she did not produce writings or theories as significant as those of Cavendish or Hildegard. She could even be seen by some as an anti-feminist, since she was one of the critics who heavily denounced the work of Cavendish, a more obvious feminist, as 'extravagant' and 'ridiculous'.

Funnily enough, what puts her on this list is her criticism of other people's work (including Cavendish's) in letters exchanged with her husband. She read widely, often criticised other people's work heavily, and exchanged her thoughts in letters with her fiancé, later husband, Sir William Temple. These 'witty, progressive and socially illuminating' letters were later published and became volumes of evidence that Dorothy was a 'lively, observant, articulate woman'. Even Virginia Woolf later remarked that Dorothy, with her great literary flair, would have been a novelist in another time.

An additional point that makes her a true feminist is her stand against arranged marriage and conventional family expectations. During the 17th century, marriages were usually business arrangements, especially for a rich family like Dorothy's. Being in love with Sir William Temple, whom her family refused for financial reasons, she protested by remaining single and turned down multiple suitors put forward by her family. At last, her struggle was rewarded and she married Temple. But her feminist acts did not stop there: later references show that she was actively involved in her husband's diplomatic career and in matters of state, quite contrary to how an ordinary wife was expected to behave in the 17th century.

Dorothy never said that she was a feminist, or intended to act like one. But her thoughts, words and actions clearly showed that she lived a feminist life as a free-willed, critical woman. That made her a feminist.

Mary Shelley 

Mary Shelley, who famously wrote Frankenstein, wrote several of the greatest Gothic novels of all time and is considered a pioneer of science fiction.

Mary Shelley was born in 1797. Her father was the political philosopher William Godwin and her mother was none other than one of the first public feminist activists in British history, Mary Wollstonecraft. Under the influence of two great parents, Mary Shelley was encouraged to read, learn and write, and her father gave her an informal but nonetheless rich education.

She later fell in love with one of her father's followers, the Romantic poet and philosopher Percy Bysshe Shelley, who was already married. After the death of Percy's first wife, Mary and Percy married. They were certainly a talented and happy couple, but Mary's luck then seemed to run out. Three of her four children died prematurely and Percy later drowned while sailing. In the last decade of her life, she was constantly sick.

Despite her miserable life, she was able to produce great novels such as Valperga, The Last Man and The Fortunes of Perkin Warbeck. Perhaps because of her own sad story, her novels were strikingly dark, offering no hope for their characters. Mary's novels, especially her most famous, Frankenstein, did not have any strong female characters; ironically, most of the female characters died. From a conventional perspective, her work is not feminist at all. But what she explored was the struggles of women in an age in which society was driven by reason, science and patriarchy. One commentator once said that '[t]he death of female characters in the novel is alone to raise enough feminist eyebrows to question how science and development is essentially a masculine enterprise and subjugates women.' She was radical in thought and critical of society's norms. Alongside writing, she devoted herself to raising and educating her only surviving child, Percy Florence Shelley, who later famously supported amateur performers and charity performances.

What made Mary Shelley a legendary woman, in addition to her great writing, was her strength in turning misery into energy, striving as an author and a mother despite the hardships of her life. That made her a feminist.

Chan Stephanie Sheena

Does the Electoral College Serve the Democratic Process?

"It's got to go," asserted Democratic presidential candidate Pete Buttigieg, when speaking of the electoral college in 2019 – reflecting a growing opposition to the constitutional process, which has only been heightened by the chaotic events of the past weeks. Rather than simply reiterating the same, prosaic arguments for the institution's removal – the potential subversion of the popular vote, the overwhelming significance of battleground states, the futility of voting for a third party, and so forth – this piece will consider the historical mentalities with which the electoral college was created in an effort to convey the ludicrous obsolescence of the institution in a twenty-first century democracy.

Joe Biden and Kamala Harris preparing to deliver remarks about the U.S. economy in Delaware, 16 November 2020. (Credit: CNN)

In its essence, the system of electors stems from the patrician belief that the population lacked the intellectual capacity necessary for participation in a popular vote – Elbridge Gerry informing the Constitutional Convention that "the people are uninformed, and would be misled by a few designing men." Over the past two hundred years, the United States has moved away from the early modern principles encouraging indirect systems of voting: for instance, the seventeenth amendment normalised the direct election of senators in 1913. It has also seen the electors themselves transition from the noble statesmen of the Framers' vision to the staunch party loyalists that they so greatly feared. In fact, the very institution of modern political parties had no place in the Framers' original conception, with Alexander Hamilton articulating a customary opposition to the "tempestuous waves of sedition and party rage." This optimistic visualisation of a factionless union soon proved incompatible with the realities of electioneering and required the introduction of the twelfth amendment in 1803, a response to the factious elections of 1796 and 1800. Yet, while early pragmatism was exercised over the issue of the presidential ticket, the electoral college remains entirely unreformed at a time when two behemothic parties spend billions of dollars to manipulate its outcome in each presidential election cycle.

The Constitutional Convention was, in part, characterised by a need for compromise, and it is these compromises, rooted in the specific political concerns of 1787, that continue to shape the system for electing the nation's president. With the struggle between the smaller and larger states causing, in the words of James Madison, "more embarrassment, and a greater alarm for the issue of the Convention than all the rest put together," the electoral college presented a means of placating the smaller states by increasing their proportional influence in presidential elections. While it may have been necessary to appease the smaller states in 1787, the since-unmodified system still ensures that voters in states with smaller populations and lower turnout rates, such as Oklahoma, hold greater electoral influence than those in states with larger populations and higher rates of turnout, such as Florida. Yet it was the need for compromise over a more contentious issue – the future of American slavery – that further compelled the introduction of the electoral college. Madison recognised that "suffrage was much more diffusive in the Northern than the Southern States" and that "the substitution of electors obviated this difficulty." The indirect system of election, combined with a clause that counted three of every five slaves towards the state population, thus granted the slaveholding section of the new republic much greater representation in the election of the president than an alternative, popular vote would have permitted. At a time when the United States' relationship with its slaveholding past has become the subject of sustained re-evaluation, its means of electing the executive remains steeped in the legacy of American slavery.

It takes only a brief examination, such as this, to reveal the stark contrasts between the historical mentalities with which the electoral college was established and the realities of a modern, democratic state. Further attempts to reform the institution will no doubt continue to come and go, as they have over the past two hundred years. However, when compared with the environment in which it was proposed, it is clear that the unreformed electoral college is no longer fit for purpose and must, eventually, give way to a system in which the president is elected by a popular vote.

Ed Warren