Mary I: The First Queen of England and the Legacy of Female Leadership

Whilst the reign of the first Queen of England is largely overshadowed by those of the other Tudors, namely her father Henry VIII and half-sister Elizabeth I, historiographical interpretations of Mary I illuminate interesting notions associated with female leadership. Mary I was the first woman to ascend the throne of England, as the succession of Empress Matilda in the twelfth century never materialised owing to the eruption of civil war. Only four centuries later do we witness the succession of a female monarch in England, and this was not without issues, as there had been earlier attempts to bar Mary from the inheritance. Her gender, as well as her supposed illegitimacy, provided the grounds for such attempts: her younger half-brother was placed above her even when she was eventually restored to the line of succession.

Mary I’s reign is largely interpreted as one of hysteria and irrationality. Such a notion of hysteria is particularly noteworthy, as the term has long been associated with the assumption that women are unable to control their emotions and are thus unfit to rule. Shakespeare’s recurring depictions of the madwoman, illustrated most famously in Hamlet through Ophelia, fostered this link between femininity and hysteria. Literary representations have certainly exacerbated the historical issues of female rule.


Portrait of Mary I, Antonis Mor, 1554. (Credit: Public Domain)

Historians are critical both of Mary’s personal life, due to her failure to conceive, and of her broader policies, which saw the intense persecution of Protestant sympathisers and earned her the title of ‘Bloody Mary’. The issue of motherhood in politics is still prevalent in our times, as seen in the scrutiny Theresa May faced within the media and from other members of the Conservative Party over her choice not to have children. With regard to Marian politics, the stabilisation of economic policy during her reign is often underplayed and, when acknowledged, is widely credited to her male councillors or to the reforms laid by the Duke of Northumberland before her accession to the throne. Mary is thus painted as weak, feeble and ineffective. The first Queen of England is largely portrayed as conforming to the gendered anxieties that the elite ruling class held regarding the notion of female monarchy. Women were deemed far more emotionally charged than their male counterparts and, as a result, unable to conduct rational governance.

In the 21st century, women still face significant opposition in reaching the highest positions of political power. Britain’s first female Prime Minister, Margaret Thatcher, is often represented as contradicting feminist values. The US has never elected a woman to the highest position of President, and Hillary Clinton’s gender notably played a significant role in right-wing opposition to her election. Such ideas regarding female hysteria and heightened emotion have arguably laid a basis for the preference for male leadership and have been correlated with deeming women unfit for positions of political power. The qualities associated with leadership are still often presented through characteristics which men are more likely to be socialised to have, thereby perpetuating the idea that women are not suited to power.

What we can draw from the accounts of Mary I’s governance and her later treatment in historical research is that female leadership is not often deemed a suitable option, nor do women find easy pathways into politics. These ideals surrounding female inferiority have historical precedent in sixteenth-century England, as illustrated by the analysis of early modern queenship, and they had not been undermined or significantly challenged by the twenty-first century. Mary I’s legacy illustrates how she has been used to highlight the perceived barriers to effective female governance. Arguably, this has set a precedent for the limited role that women play in politics in our modern era.

Ellie Brosnan

Who Won the Good Friday Agreement?

The Good Friday Agreement was signed in 1998, heralding a new era of peace after the decades of violence that characterised the Troubles. But who benefitted the most from the signing of the Agreement, and is the answer different today than it was twenty years ago?

For unionists, represented most prominently by the Democratic Unionist Party (DUP) and the Ulster Unionist Party (UUP), the Good Friday Agreement ultimately symbolised the enshrining of the status quo in law: Northern Ireland remained a part of the UK. In addition, the Republic of Ireland renounced articles two and three of its Constitution, which laid claim to the entire island of Ireland. It may seem, then, that unionism was the victor in 1998, but elements of the Good Friday Agreement have been responsible for tectonic shifts in the period since, arguably exposing it, ultimately, as a victory for nationalism.

While Irish republicans in the form of the Provisional IRA were required to put down their weapons and suspend the violent struggle for a united Ireland, there is a compelling argument that the Good Friday Agreement laid the platform for the growth of the movement and perhaps even the fulfilment of the goal of Irish unity. For one, it mandated power-sharing between the two sides of the Northern Irish divide: both unionists and nationalists must share the leadership of government. Since 1998, this has acted to legitimise the nationalist cause, rooting it as a political movement rather than an armed struggle. Sinn Féin, the leading nationalist party in the North, have moved from the leadership of Gerry Adams and Martin McGuinness to new, younger leaders Mary Lou McDonald and Michelle O’Neill. Irish unity propaganda now emphasises the economic sense of a united Ireland, the need to counter the worst effects of Brexit on Northern Ireland and a sensible debate about the constitutional nature of a new country, rather than the adversarial anti-unionist simplicity of the Troubles. Here, nationalism did gain significantly from the Good Friday Agreement because it allowed this transition from the Armalite to the ballot box in return for legitimacy as a movement.

British Prime Minister Tony Blair (left) and Irish Prime Minister Bertie Ahern (right) signing the Good Friday Agreement. (Credit: PA, via BBC)

Most prominently for nationalists, however, is the fact that the Good Friday Agreement spells out the route to a united Ireland, explicitly stating that a ‘majority’ in a referendum would mandate Northern Ireland’s exit from the UK. While unclear on whether majorities would be required both in the Republic and Northern Ireland, as was sought for the Agreement itself in 1998, as well as what criteria would have to be met in order to hold a vote, this gives Irish nationalism a legal and binding route to its ultimate goal of Irish unity, arguably the most prominent victory of the peace process.

Since 1998, Northern Ireland has seen a huge amount of political tension, governmental gridlock and occasional outbreaks of violence, most recently witnessed in the killing of journalist Lyra McKee in 2019. However, the Good Friday Agreement has served crucially to preserve peace on a scale unimaginable in the most intense years of the Troubles. If Irish nationalism achieves its goal of uniting the island, it will come about through a democratic referendum, not through violence. The very existence of the Good Friday Agreement, particularly its survival for over 20 years, is testament to the deep will for peace across the communities of Northern Ireland, forged in decades of conflict; it is this desire being fulfilled, even as the parties squabble in Stormont and the political status of Northern Ireland remains in the balance, that continues to make the biggest difference in the daily lives of both unionists and nationalists.

Joe Rossiter, History in Politics Writer

History as a Tool of Fascist Revolution

The past is a powerful weapon, one that in the wrong hands has the potential to tear asunder the present. Its utility spans the political spectrum, and propagandists have long recognised its appeal. The most effective appropriation of the past as a tool of persuasion has undoubtedly been central to the exclusionary policies of fascist regimes, most apparent in Mussolini’s Italy and, perhaps lesser known, in China under the leadership of Chiang Kai-shek’s Guomindang. Whilst communists sought to destroy the past, fascists chose to worship their own national version of it.

Chiang Kai-shek in 1943. (Credit: Public Domain)

Looking first to the use of history as the binding glue of the revolutionary Chinese republic, a peculiar relationship with the past that emphasises both rupture and continuity becomes apparent. The May 4th Movement, which reached its climax in 1919, had stressed the importance of a break with the Confucian past, embodied by the ‘backwards’ Qing Dynasty, as the only means of competing with the ‘modern’ West. 

As the movement split into communist and nationalist camps throughout the 1920s, the Guomindang (a nationalist party) increasingly came to cast themselves as the defenders of a Chinese, Confucian culture against the ravages of the Red Menace encroaching from the USSR by means of Mao Zedong’s CCP, who were at this time seen as a peripheral, almost foreign force. The Guomindang thus found themselves promulgating a policy of revolutionary conservatism, what would come to be known as ‘Confucian fascism’. They took the legacy of the Confucian social order and bound it to a fascist future; as Chiang himself put it in 1933, ‘as members of the revolutionary party we must dedicate ourselves sincerely to the preservation of the traditional virtues and the traditional spirits.’ The restoration of the ancient past was the goal of the revolutionary present, a means by which the new Chinese ‘nation’ might define itself against the world. History was front and centre of the nationalist ideology. Out with the old and in with the older.

Indeed, the invocation of antiquity was not unique to Confucian fascism. The blind admiration of days long forgotten is one of the key features that the Chinese regime shared with the better-known fascist movements sweeping through Europe during the 1920s and 30s. The Nazis claimed descent from the Holy Roman Empire or the ‘First Reich’, dissolved in 1806, and even constructed a somewhat less palpable link to the Vikings (see the SS Viking Division). Neither the Chinese nor the Germans, however, could compare in their reverence for the past with the imperial illusion incubated by Benito Mussolini throughout his reign.

Il Duce sought to construct a ‘vast, orderly, powerful’ Rome, as it had been under the Emperor Augustus. He built the Via dei Fori Imperiali, which led through the ancient monuments of Roman power and civilisation, and along which his 1938 parade welcoming Hitler was to proceed. Furthermore, he reintroduced the Roman salute that we now recognise as the quintessential declaration of fascist loyalties and explicitly pursued a restoration of the Roman empire in his failed invasions of North Africa. Drawing on more recent history, his Blackshirts were modelled on the Redshirts of the father of Italian unification, Giuseppe Garibaldi. Like Chiang, he sought an exclusively Italian culture. His promise was a return to the splendour and majesty of Rome’s glory days – an end to the division that had plagued the peninsula for centuries, and a remedy to the humiliations of the Great War.

These appeals to the past, like those of Adolf Hitler and the Chinese nationalists, exploited a population facing crisis: in Germany, the 1929 financial crash obliterated the economy; China had fractured following the 1911 fall of the Qing dynasty and the subsequent ‘warlord’ years. In Italy, it was the end of the First World War and the ‘Red Years’ of leftist agitation that granted Mussolini his opportunity.

In times of turmoil, when the present seems under threat, people often look to the idealized past. This tendency leaves them vulnerable to the forces willing to seize upon it. Amidst the crises of our time, we would do well to bear that in mind.

Samuel Lake, History in Politics Writer

The Painted Word – Political Allegory in Early Modern Royal Portraiture

Portraits of sovereigns were always conceived with a political function in mind. Monarchs used their official portraits to cultivate an image of majesty, prestige, and royal authority, a key component in the broader construction of an inherently politicised royal public image. Whilst there is a discourse within the existing art-historical scholarship that seeks to depoliticise royal portraiture and downgrade the importance of symbolism, it is fruitless to extricate the paintings of early modern sovereigns from their clear political intentions. Close inspection of contemporary art indicates a distinct propensity for allegory, which served as a central way in which an image of Renaissance princely magnificence was promoted. 

‘The Rainbow Portrait’, 1600-1602. (Credit: Wikimedia Commons)

The famous portrait of Elizabeth I, The Rainbow Portrait, has received considerable attention for its religious significance; however, little emphasis has been placed on its impact as a highly politicised piece of art. The most potent political thread of this painting is the portrayal of Elizabeth as ageless, when by 1600 she was nearly seventy years old. At the turn of the seventeenth century, England’s monarch was nearing the end of her life without a legitimate heir to succeed her. Through this lens, the depiction of Elizabeth as wearing the ‘Mask of Youth’ appears undeniably political. Contributing to the fiction of an eternal present, this portrait was a product of a period which linked the physical body of the monarch to the health of the state, and it served to alleviate the political anxieties surrounding the succession and a potential return to turmoil akin to the Wars of the Roses. Looking closely at the painting, there is also a distinct sense of political iconography. Some scholars have read the painting as an exemplification of Elizabeth’s sexual power, a credible concept which recognises the inextricable connection in the early modern period between a female monarch’s sexuality and her political authority. The transparent rainbow she grasps in her hand, whilst iconographically establishing a connection with the divine and implicitly presenting Elizabeth’s legitimacy as a deific leader, also reflects her sexual power. The phallic symbolism applied to the rainbow consolidates her supremacy over the masculine, establishing a dominance that was essential for a female ruler in the early modern period. Through this specific painting, it is evident that royal portraiture was undeniably politicised as a visual representation of strength and control.

However, although the symbolic content of royal portraiture was the central means of constructing a political image, it is also important to consider the role that portraits played as princely currency and as an integral component of political discourse in the diplomatic relations within the European princely community. Portraits were exchanged to ameliorate political relationships, as well as to serve as symbols of recognition of other rulers within the royal fraternity. Whilst this role was undoubtedly significant, its longevity is eclipsed by the allegory and symbolic weight which timelessly pervades the paintings of sovereigns, such as the famous Rainbow Portrait of Elizabeth I.

Maximus McCabe-Abel, History in Politics Vice President

Margaret Thatcher: A Feminist Icon?

Thatcher’s lasting impact on twenty-first century feminism is widely debated. Whilst her actions have inspired future generations of ambitious young women in all professions, Thatcher was undoubtedly not a feminist. In fact, she actively disliked and directly discouraged feminist movements. Thus, Margaret Thatcher serves as an apt example that a successful woman does not always represent a direct step forward for the women’s equality movement. Instead, whilst Thatcher was Britain’s first female Prime Minister, it never occurred to her that she was a woman prime minister; she was, quite simply, a woman who was skilled and successful enough to work her way to the top.

Throughout her eleven-year premiership, Thatcher promoted only one woman to her Cabinet; she felt men were best for the role. When questioned about this, she remarked that no other women were experienced or capable enough to rise through the ranks in the same way that she had. Similarly, whilst Thatcher demonstrated that women were now able to get to the top of the UK Government, she certainly did not attempt to make things easier for the women who followed; she pulled the ladder straight up after herself. Thatcher claimed that she ‘did not owe anything to women’s liberation’, and this was reflected in her policy provisions: she ignored almost all fundamental women’s issues. Despite attempts to raise such concerns with her, childcare provision, positive action and equal pay were not addressed during her years as Prime Minister. One can therefore conclude that Thatcher’s loyalty was almost exclusively to the Conservative Party, and that her vision focused on saving the country, not women.

The May 1989 Cabinet. (Credit: UPPA)

Thatcher resented being defined by her gender, but she worked naturally as a role model, and continues to do so despite her policies, simply because she was a woman. Feminists (and women generally) in the media singled her out for that reason alone. In this way, examples matter, in the same way that Obama’s presidency matters for young generations of the BAME community. Perhaps Thatcher’s refusal to acknowledge the glass ceiling is precisely what allowed her to smash it so fearlessly: if she couldn’t see it, no one could point it out to her. Amber Rudd has claimed that she is a direct beneficiary of Thatcher’s valiance, as her work debunked the idea that only men could survive, let alone prosper, in political life. Men were undoubtedly perturbed by Thatcher’s female presence and conviction; Jon Snow has gone as far as describing interviewing her as ‘unnerving’. These predispositions, held by both men and women, were fundamentally and forcefully shifted by Thatcher.

Yet is symbolism alone enough to hail Thatcher as a feminist icon? It is of course clear that being a symbolic role model is not comparable to the work of those who have actively and tirelessly campaigned for change for women. The influence she had on the feminist movement was not a result of her own actions or policies. Rather, it stems from her being the first woman who made the exceptional progress, as a total outsider, not only to become Prime Minister, but to go on to win two further elections convincingly.

Amelia Crick

Does the Electoral College Serve the Democratic Process?

“It’s got to go,” asserted Democratic presidential candidate Pete Buttigieg, when speaking of the electoral college in 2019 – reflecting a growing opposition to the constitutional process, which has been only heightened by the chaotic events of the past weeks. Rather than simply reiterating the same prosaic arguments for the institution’s removal – the potential subversion of the popular vote, the overwhelming significance of battleground states, the futility of voting for a third party, and so forth – this piece will consider the historical mentalities with which the electoral college was created in an effort to convey the ludicrous obsolescence of the institution in a twenty-first century democracy.

Joe Biden and Kamala Harris preparing to deliver remarks about the U.S. economy in Delaware, 16 November 2020. (Credit: CNN)

In its essence, the system of electors stems from the patrician belief that the population lacked the intellectual capacity necessary for participation in a popular vote – Elbridge Gerry informing the Constitutional Convention, “the people are uninformed, and would be misled by a few designing men.” Over the past two hundred years, the United States has moved away from the early modern principles encouraging indirect systems of voting: for instance, the seventeenth amendment established the direct election of senators in 1913. It has also seen the electors themselves transition from the noble statesmen of the Framers’ vision to the staunch party loyalists that they so greatly feared. In fact, the very institutions of modern political parties had no place in the Framers’ original conception, with Alexander Hamilton articulating a customary opposition to the “tempestuous waves of sedition and party rage.” This optimistic visualisation of a factionless union soon proved incompatible with the realities of electioneering and required the introduction of the twelfth amendment in 1803, a response to the factious elections of 1796 and 1800. Yet, while early pragmatism was exercised over the issue of the presidential ticket, the electoral college remains entirely unreformed at a time when two behemothic parties spend billions of dollars to manipulate its outcome in each presidential election cycle.

The Constitutional Convention was, in part, characterised by a need for compromise and it is these compromises, rooted in the specific political concerns of 1787, that continue to shape the system for electing the nation’s president. With the struggle between the smaller and larger states causing, in the words of James Madison, “more embarrassment, and a greater alarm for the issue of the Convention than all the rest put together,” the electoral college presented a means of placating the smaller states by increasing their proportional influence in presidential elections. While it may have been necessary to appease the smaller states in 1787, the since unmodified system still ensures voters in states with smaller populations and lower turnout rates, such as Oklahoma, hold greater electoral influence than those in states with larger populations and higher rates of turnout, such as Florida. Yet it was the need for compromise over a more contentious issue – the future of American slavery – that did still more to compel the introduction of the electoral college. Madison recognised that “suffrage was much more diffusive in the Northern than the Southern States” and that “the substitution of electors obviated this difficulty.” The indirect system of election, combined with a clause that counted three of every five slaves towards the state population, thus granted the slaveholding section of the new republic much greater representation in the election of the president than an alternative, popular vote would have permitted. At a time when the United States’ relationship with its slaveholding past has become the subject of sustained re-evaluation, its means of electing the executive remains steeped in the legacy of American slavery.

It takes only a brief examination, such as this, to reveal the stark contrasts between the historical mentalities with which the electoral college was established and the realities of a modern, democratic state. Further attempts to reform the institution will no doubt continue to come and go, as they have over the past two hundred years. However, when compared with the environment in which it was proposed, it is clear that the unreformed electoral college is no longer fit for purpose and must, eventually, give way to a system in which the president is elected by a popular vote.

Ed Warren

Debates Take On a Different Meaning in the “Worst Year Ever”

The Trump-Biden debates are wrapped up, and for the “Worst Year Ever” they didn’t disappoint. The first debate was widely condemned as the “Worst Debate Ever”. Both candidates talked over each other, and it was near-impossible to understand them. Biden faced calls to boycott the other debates. Trump made this decision for him, falling ill with COVID-19.

President Donald Trump and Democratic candidate Joe Biden take the stage in their final debate of the election campaign, Nashville, Tennessee. (Credit: Reuters)

Anyone who saw even brief highlights of the first debate could be forgiven for giving up on the whole institution of debates. But this would be extremely unwise. Yes, Trump interrupted Joe Biden a staggering 128 times. And admittedly, Joe Biden did reply by telling him to “shut up” and calling him a “clown”. Yet this wasn’t the breakdown of the debate as an institution. Rather, it was an additional insight into who the two candidates are, and how they will act in the face of the adversity that Presidents experience on a daily basis.

The problem with the way we view debates is that we anticipate 90 minutes of detailed and virtuous policy discussion. There is no clearer example of this fantasy than the West Wing episode, in which the two candidates running for president have a high-minded and theoretical exchange of views on what it means to stand as a Republican or a Democrat. In reality, presidential debates have little to do with policy. Most voters are unswayed by the arguments of the candidates; they may have little trust in them, or have made up their minds previously. The one area where debates really count is character.

The focus on character may be why the UK has lacked similar pre-election debates, and why attempts here have enjoyed less success. The presidency is a position uniquely judged by the character of its occupant, and in the build-up to 2020 President Trump’s character – depending on who you ask – has been viewed as his biggest strength or weakness. This really gets to the crux of what debates are, and what they have always been – a blank slate.

The debate is one of the few foreseeable major events in a campaign. But that is all that can be foreseen: the event. Most voters are aware of it, and around 80 million will watch it, but the candidates are under no obligation to make it a debate on the state of America. Like most other political realities, the in-depth policy debate was an unwritten rule, held up by the ‘honour system’, and President Trump lacks this honour.

Using debates for non-policy advantages is as old as the institution itself. In the first ever presidential debate in 1960, Nixon faced off against Kennedy. Nixon turned up looking sickly and sweaty, whilst JFK was the epitome of suave New England style. Accordingly, whilst radio listeners thought Nixon had performed better, TV viewers agreed that Kennedy had won the debate. The echoes of 1960 were clear in Mr Trump’s first performance, in which he waved his hands, stood firm, interrupted, and generally tried to give the impression that he was in control of events on the stage. Yet Mr Biden was not immune from these gimmicks either – he would flash a smile whenever the president made an outrageous claim, as if to say, “look at this clown – does he have what it takes to fill the office?”

Vice President Richard Nixon and Senator John F. Kennedy in their final presidential debate, 21 October 1960. (Credit: AP Photo)

The pressure of these debates is intense. Each given candidate will have three or four separate strategies they’re trying to pursue, and they have to juggle all of them whilst simultaneously readjusting their approach depending on which hits are landing. In the first debate, Mr Trump was balancing trying to present Joe Biden as senile, racist, and yet also a radical socialist. The president struggled with these conflicting narratives, especially as he hoped that constantly interrupting Mr Biden would force the former vice president into a memorable gaffe. Ultimately, it was Mr Trump’s inability to change his approach in the debate that cost him more than any of his policy errors, and formed the main narrative of the debate in its aftermath.

But there was ultimately something more sinister going on. Donald Trump’s biggest election worry is high turnout – Republicans usually vote reliably, but Democrats are much more vote-shy. This is doubly true of young people. Accordingly, the president may have been playing a deeper game during the first debate, one which he executed outstandingly. President Trump saw an opportunity to portray the debate as an irrelevant contest between two old white men – not dissimilar to how young Americans view the election already. Mr Trump’s constant interruptions made the debate unbearable to watch, but he ultimately wanted that. He may not have done well with the few undecided voters left in the campaign. He will care little. The bigger constituency was voters undecided between voting for Biden or staying home. The first debate looked exactly like two old men bickering, and for Trump that’s as close to a debate win as he can get.

Seth Weisz

What Will Happen Now Ruth Bader Ginsburg Is Dead?

You cannot understand the confirmation process of Amy Coney Barrett without understanding that of Robert Bork. Nominated by Ronald Reagan in 1987, Bork was a polarising figure, known for his disdain for the supposed liberal activism of the court. Ted Kennedy of Massachusetts, deeming Bork to be too radical for the court, turned away from the bipartisan tradition of assessing a nominee’s qualifications rather than values. The Judiciary Committee hearings featured hostile questioning, and Bork was ultimately rejected by 58-42 in a Democratic-majority Senate. The events produced the term “borked,” referring to the vigorous questioning of the legal philosophy and political views of judges in an effort to derail their nomination. The legacy of Bork lives on today.

The death of Supreme Court Justice Ruth Bader Ginsburg (RBG) has triggered a high-stakes nomination process just weeks before the election. The Supreme Court is the highest level of the judicial branch in the US, with Justices nominated by the President and voted on by the Senate. The process usually takes a few months, with nominees being interviewed privately by senators, and then publicly by the Senate Judiciary Committee, before being forwarded by the committee to be voted on in the Senate. 

Ruth Bader Ginsburg, 2014. (Credit: Ruven Afanador)

However, Barack Obama’s final year in office altered the traditional conception of nominating Supreme Court Justices. With the death of Justice Scalia in 2016, Obama, in alignment with the Constitution, nominated Merrick Garland to fill the seat. Yet, in what political scientists Steven Levitsky and Daniel Ziblatt deemed “an extraordinary instance of norm breaking,” the Republican-controlled Senate refused hearings. Senate majority leader Mitch McConnell argued that in an election year the Senate should wait until a new President has been elected, thus giving “the people” a say in the nomination process.

His position proved polarising. The practice of the Senate blocking a specific nominee (as in the case of Bork) would usually be fairly uncontroversial, even happening to George Washington in 1795. The issue was McConnell preventing an elected President from filling the seat at all, something that had never happened in post-Reconstruction US politics.

Yet the death of RBG has shown this precedent to be short-lived. Despite a Court seat opening up even closer to the election, the vast majority of Republicans have accepted McConnell’s present claim that his own precedent doesn’t apply in an election year if the same party holds both the Senate and Presidency. Thus, President Trump’s nominee, Amy Coney Barrett, looks set to be confirmed.

It’s unknown how polarising her confirmation will be. The hearings of Clarence Thomas in 1991 were dominated by the questioning of Anita Hill over her allegations of sexual harassment against the then-nominee, with Thomas then accusing the Democrat-led hearing of being a “high-tech lynching for uppity blacks who in any way deign to think for themselves.” The 2018 Kavanaugh hearings echoed this process, with the then-nominee accused of attempted rape in a widely-viewed public hearing. Although the Barrett hearings are unlikely to prove as sinister, it’s likely the Republicans will accuse the Democrats of finding any means possible to block a conservative justice, as was seen in the Thomas and Kavanaugh hearings.

Barrett is set to be ‘borked’. Her views have been well-documented over her career, and, most notably, Republican Senators seem confident she’ll vote to overturn Roe v. Wade, the 1973 ruling that protected a woman’s liberty to have an abortion without excessive government restriction. The Committee hearings process will likely rally each party’s base going into the election, but the long-term implications for civil rights and the legitimacy of the Court have yet to be determined.

Sam Lazenby


Bibliography

The Economist. “Courting trouble: The knife fight over Ruth Bader Ginsburg’s replacement.” (26 Sep 2020) https://www.economist.com/united-states/2020/09/26/the-knife-fight-over-ruth-bader-ginsburgs-replacement

The Economist. “What does Amy Coney Barrett think?” (26 Sep 2020) https://www.economist.com/united-states/2020/09/26/what-does-amy-coney-barrett-think

Levitsky, S. and Ziblatt, D. How Democracies Die. Great Britain: Penguin (2019).

Liptak, A. “Barrett’s Record: A Conservative Who Would Push the Supreme Court to the Right,” New York Times (26 Sep 2020). https://www.nytimes.com/2020/09/26/us/amy-coney-barrett-views-abortion-health-care.html

Pruitt, S. “How Robert Bork’s Failed Nomination Led to a Changed Supreme Court,” History (28 Oct 2018). https://www.history.com/news/robert-bork-ronald-reagan-supreme-court-nominations

Siddiqui, S. “Kavanaugh hearing recalls Clarence Thomas case,” The Guardian, (27 Sep 2018). https://www.theguardian.com/us-news/2018/sep/27/brett-kavanaugh-clarence-thomas-anita-hill-hearings

Victor, D. “How a Supreme Court Justice Is (Usually) Appointed,” The New York Times, (26 Sep 2020). https://docs.google.com/document/d/1880187lYZ4z9gXjkVeNDsSsN8F0ZdRK1MIrua4CQmIk/edit

Debate: Monarchy, a Relic or Required?

Monarchy and its Political Pomp and Circumstance

The Glorious Revolution of 1688 implemented the constitutional monarchy of the UK that we know today, effectively limiting the political role of the Crown to mere pomp and circumstance. Yet, to this day, certain superfluous political liberties have remained. In practice, the sovereign still gives weekly counsel to the Prime Minister. In practice, the sovereign opens Parliament with their speech, albeit one drafted by the government. In practice, the sovereign must approve all legislation before it can become an act of parliament, although the last bill to be refused in such a manner was vetoed in 1708. While the British political constitution has moved on considerably from its absolute-monarchical days, the monarch’s political role still retains an archaic air, where substance falls short of ceremony. The lack of majority dissent over this archaism can only be explained by the increasing celebrity of the monarchy, caused by the tabloid-frenzied consumption of their every move, from wedding dress to baby name. This infatuation with the winners of a ‘genetic lottery’ completely overlooks the fact that these political liberties are available to be used and abused. Even if they choose not to do so, that does not change the fact that they still exist.

This is only the tip of the iceberg, as, ceremonial politics aside, the monarchy can also be utilised by the party in power when wanting to inspire confidence in their abilities. This was evident in the Queen’s recent coronavirus address, in which she spoke of the need for solidarity, harking back to the Second World War idea of ‘everyone doing their bit’ and quoting Vera Lynn’s song, ‘We’ll Meet Again’. For a more worrying influence we must look back only to August 2019, when Boris Johnson used the Queen’s ability to prorogue parliament to prevent lawmakers from thwarting his Brexit plans. Though the Crown officially adopts an air of impartiality towards partisan politics, it seems the monarchy is still a political tool to be manipulated on a whim. Surely the best way to ensure sovereign impartiality is for the Crown to remain aloof from the political world altogether. This demands reform, but the monarchy need not be abolished to take its fingers out of the political pie.

When also considering the royal finances, it seems there is certainly no harm in taking this next step either. With £82.2 million paid by taxpayers in 2019 to form the Sovereign Grant – not including security or ceremonial costs – is it really necessary to keep funding this archaic institution? Popular responses say yes, pointing to tourism revenues of £550 million, and ambassador-generated trade of £150 million. Yet the latter number barely makes a dent in the sum of UK exports (£543 billion), and as for tourism revenue, the abolition of the monarchy would not stop tourists from frequenting destinations such as Windsor Castle and Buckingham Palace. The question we, the public, should be asking is whether the monarchy is still relevant. The royal family can still exist in celebrity status and tabloid sensationalism without pulling on the drawstrings of the public purse and without being used as a political tool. The political role of the monarchy should be a thing of the past, celebrated and remembered perhaps, but fit for the vault of history.

Melanie Perrin

The current British royal family on Buckingham Palace’s Balcony. (Credit: Chris Jackson, via Getty Images.)

A Defence of the Monarchy

‘Monarchy’: a word that recalls the riches and privileges of fairy-tale princes and princesses, but one that also connotes the existential crisis faced by many kingdoms. The twentieth century saw a deadly trend of monarchies coming to an end: most famously, the tragic demise of the Romanovs. However, new monarchies were forged that have remained to this day, such as Bhutan’s Wangchucks, whose popularity in Thailand has even led to a sharp increase in Thai tourism to Bhutan.

Monarchies carry more influence than is recognised in modern society. In Britain, the House of Windsor encourages support for charitable causes. Prince Harry, the Duke of Sussex, has been outspoken about the importance of mental health services, describing his participation in counselling and advocating open discussion concerning mental health. Alongside the Duke and Duchess of Cambridge, Prince Harry founded ‘Heads Together’, a campaign created to increase the visibility of mental health conditions. Drawing heavily on their royal status, the Cambridges and Sussexes promoted ‘Heads Together’ through royal visits, social media presence and tailored events. It was highly successful, with the foundation announcing it had assisted “millions” in talking more about mental health. The British monarchy is still deeply entrenched within our society and culture, engaging with topical issues and promoting causes that they believe in. The Windsors have become more personal than rulers of the past, and still engage with politics, albeit in different ways. Commentary on social issues is another valid way of engaging with the political constitution.

Neutrality is the most important characteristic of today’s monarchy, with the royal veto having been abandoned for over 300 years. The monarch is now idealised as a leader that the public can stand behind, regardless of the political climate. Prime Ministers cannot command the support, nor the majority, that the monarchy can. According to YouGov in 2018, 69% of people support the monarchy, with 21% opposing and 11% stating no preference. No Prime Minister has ever achieved such a high public majority. Theresa May was the second most popular Conservative leader ever, and still only commanded a positive opinion of 30%. In a turbulent modern society, the British monarchy has been a source of constancy.

In a politically chaotic decade, Britain has seen three Prime Ministers in three years under Conservative Party leadership, which has been deeply divisive. However, the popularity of the monarchy has been proven time and time again. The wedding of the Cambridges drew 60 million viewers (averaging 22 million across the whole coverage), and sales of the royal issue of Hello! magazine rose by 25%. Globally, 29 million watched the wedding of Prince Harry to Meghan Markle. Furthermore, the British monarchy unites 2.4 billion people across five continents under the Commonwealth.

The world has not relinquished its grasp on monarchy, and British society certainly has not. It has been steadfast for centuries and, whether or not it is universally accepted, monarchy occupies a key place in the politics, culture and society of modern Britain. It does not seem as if the world is ready for the monarchy to become a historic concept.

Lorna Cosgrave

Do the US Presidential Candidates Meet the ‘American Dream’?

‘We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.’

Declaration of Independence, 1776

The American Dream. It is a belief professed in America’s culture, literature, advertising and schools: the idea that one can truly do anything or be anything as an American citizen. Rooted in Thomas Jefferson’s words in the Declaration of Independence of 1776, it is flaunted as part of America’s proud past. While in theory it is hopeful and fair, politics has fallen short of what it means to provide equal rights for everyone. It is now used to gain public support, rather than to deliver on its promises. Anyone can profess the American Dream, even if they are racist, homophobic, or believe injecting bleach will cure the 2020 coronavirus pandemic.

On the 3rd of November 2020, all eyes will be on the USA as Americans cast their votes in the 59th Presidential election. The effects will have a profound impact, not only on domestic issues but politics across the globe. The two main candidates, the Republican Donald Trump and the Democrat Joe Biden, have conflicting views on immigration, healthcare, racism, the climate crisis and COVID-19, just to name a few. Wildly differing perspectives add fuel to the political fire, and over the next couple of months each candidate will do everything they possibly can to take up residence in the White House. Perhaps the only two things they have in common are their fight for votes and their use of the American Dream narrative to do just that. 

Trump is appealing to the white working-class American, promising his supporters that he’ll ‘make America great again, again.’ He pulls on the heartstrings of the individual, professing that each of his voters can achieve personal excellence and financial gain. Though Trump professes this in his speeches, he denies marginalised groups the opportunity to achieve this great dream. Just one of the many examples is his attitude towards the Black Lives Matter movement. He has repeatedly denied the existence of institutionalised racism, suggesting that violent protests are far more of an issue than the devastating reasons people are protesting. He plays on fears and drives division, sustaining in each individual the idea that they can achieve, even though most never actually have the opportunity to do so. In other words, he uses the nationalist American Dream to win votes and fails to deliver.

Biden is appealing to those who believe in equality for all. In his speech at the Democratic convention he said that unlike the Republican party, ‘united we can, and will overcome this season of darkness in America.’ His specifics are much clearer. He wants to tackle climate change, racial injustice, the current global pandemic and the economic depression, in a way that Trump has been unable to do. His unifying speech brought together many well-respected leaders to explain why everyone should have the same opportunities in education, healthcare, careers and more. Biden has called this election ‘the battle for the soul of this nation,’ – a suggestion that this election will highlight what Americans hope their future will look like. 


Joe Biden speaking at the 2019 Iowa Federation of Labor Convention in Altoona, Iowa. (Credit: Gage Skidmore)

Whichever party wins, these two campaigns make it clear that the ‘American Dream’ is fundamentally flawed. It promises two paradoxical ideas: absolute equality for all and individual excellence. It is near impossible to have both. The question stands: as one of the biggest global powers, which would you rather American politics reflect, fundamental equal rights for all or power for the select individual? All will be revealed come November.

Issie Stewart