Industries of exploitation: The ‘Radium Girls’ and the state of industrial working conditions today

The effects of radium exposure. (Credit: Flickr)

In the early 1900s, radium became a prolific consumer product in the US. Used in a variety of products, including cosmetics and tonics, it was most famously worked into dials and clock faces. Yet despite its far-ranging consumption, radium had only been discovered in 1898, and knowledge of the element was limited. Safety guidelines, hence, were scant. From 1917 onwards, hundreds of young working-class women took jobs in factories painting clock faces and dials with radium. Here, these women were encouraged to practise ‘lip-pointing’: sharpening the points of their brushes with their lips. Assured by their employers that radium was safe, many applied it to their hair and skin to achieve a luminescent effect, and they soon became known as ‘the ghost girls’.

As a result of their continued exposure to and ingestion of radium, many of these women grew ill. The first fatality among them was Amelia Maggia, an employee of the Radium Luminous Materials Corporation (later the US Radium Corporation), who died in 1922. Maggia experienced toothache and ulcers that soon spread and necessitated the removal of her lower jaw. Many other dial painters experienced similar symptoms, but the corporation continued to deny the possible role of radium in their deaths. Even after it commissioned a study that proved a link between the fatalities and radium, it continued to deny the dangers of the substance. After numerous deaths, ‘lip-pointing’ was abolished in 1927, but workers continued to handle radium with little protection. Women employed as dial painters continued to suffer radium-related illnesses, such as cancer, into the 1940s and 1950s. It wasn’t until 1979 and 1980 that the final radium factories were closed.

The case of the ‘Radium Girls’ (as they have come to be known) did improve employee welfare across the US: in 1938, Catherine Wolfe Donohue’s case against the Radium Dial Company was one of the first instances of a company being held responsible for the safety and welfare of its employees. Yet despite this case, and numerous others brought by workers, improvements to working conditions in factories have progressed rather slowly across the decades.

After all, the similarities between the ‘Radium Girls’ and modern garment factory workers cannot be overlooked. Payment by piece, relaxed safety regulations and exposure to harmful substances tie these groups together. As a parable in tragedy, the story of the ‘Radium Girls’ illuminates the need to improve the lives of factory workers today.

The company Shein, for example, when investigated by the Canadian Broadcasting Corporation, was found to have been selling clothing that contained elevated levels of lead, PFAS and phthalates, all of which can pose health concerns with high or prolonged exposure. Workers, therefore, were likely exposed to these substances in high quantities. This is on top of the fact that the Swiss advocacy group Public Eye found that workers producing for Shein were often working 75-hour weeks in factories with inadequate fire safety measures.

International action, of course, is needed to ensure that the welfare of these workers is protected and that they are paid fairly. Countries importing this clothing could enact change by enforcing legislation against exploitative working practices and imposing quality control on imports. The exploitation of the ‘Radium Girls’ and of present-day textile workers was, and is, enabled by governments’ refusal to legislate and to enforce responsibility on large corporations. Remembering tragedies such as that of the ‘Radium Girls’ demonstrates the necessity of swift action to protect the welfare and rights of exploited workers.

Amy Raven, Summer Writer

Could the North of England Become an Independent Country?

Much of the attention surrounding devolution and independence within the United Kingdom focuses on Scotland: the 2014 referendum, combined with growing support for the Scottish National Party, has entrenched a resentment towards England for its disregard of the other countries within the Union. Northern Ireland and Wales have growing independence movements, with Cornwall adding to this, and, surprisingly for some, so does the north of England.

For many, the physical divide between north and south is hard to discern. Some say it is just above Watford, others exclude counties such as Lancashire and Lincolnshire in favour of Tyne and Wear and Yorkshire, whilst others believe the Midlands are in the region. 

The recently formed Northern Independence Party, a leading voice for northern independence, defines the north as the ancient region of Northumbria, and this seems the clearest definition of the ‘north’. The party gained 50,000 members in less than a year and even stood a candidate in the Hartlepool by-election. But what is it that unites these counties under this umbrella term?

Historians argue that two key periods have helped shape the northern identity: the Industrial Revolution and the economic struggle of the 1980s, creating an identity remarkably separate from the rest of England. 

The north of England has been oppressed since 1069, when William the Conqueror’s ‘Harrying of the North’ brutally suppressed northern rebellions against his rule and systematically destroyed northern towns. The region experienced famine, and much of the area was left deserted.

More rebellions followed throughout the next 1,000 years, and to this day socio-economic disparities are still evident. Mortality rates are 15% higher in the north than in the south, and education and transport standards differ markedly from those in the south.

The coronavirus pandemic is one of the many pressures adding to the calls for independence. Manchester seemed to be the centre of this, with Andy Burnham voicing anger at the government over the lack of funding given to the city. This lack of COVID-19 relief, in comparison to southern cities, added to the growing concerns over the survival of northern industries after Manchester was placed under Tier 3 coronavirus restrictions against the advice of scientific advisors.

Gordon Brown, the former Prime Minister, remarked that in response to the coronavirus pandemic the psychological glue holding the country together had come unstuck: ‘Not only do you have the Scottish minister but you’ve got regional ministers saying they are not consulted or listened to… you’ve got no mechanism or forum for co-ordinating the regions and nations’. Not only is the Scottish-English divide strong, but the north-south divide is growing stronger, and it’s not just due to decades of Tory rule but perhaps to the political system itself. Could independence or devolution be the answer?

Under Tony Blair, arguments and motions for devolution were brought forward, and devolved assemblies were created in Scotland and Wales. However, the 2004 referendum on northern devolution failed, with 78% voting against it.

Northumberland-born Alex Niven, author of ‘New Model Island: How to Build a Radical Culture Beyond the Idea of England’, argues that it will take a radical left-wing government to bring about devolution for the north. Rather than a completely autonomous and independent north, Niven argues for constitutional changes to challenge an ‘imperialistic’ Britain: devolution is the answer, not total independence. A centralised approach to policy and politics in Westminster, he argues, only damages local communities and creates greater distrust in the government.

However, the Northern Independence Party’s manifesto states that it is London that ‘gobbles up’ industry. The party declares that London-based journalists ‘pick out the worst of us’ by perpetuating stereotypes of the north that add to its general oppression. With independence, it claims, the north would form alliances with an independent Scotland.

Independence poses many questions. Is Derby included? What would be the capital? York was the historic capital of the north and, as Yorkshire is the biggest county in Northumbria, could it be York? Or would the economic powerhouses of Manchester, Leeds and Liverpool become the capital due to their resources? If so, would a feeling of resentment grow in certain regions, with perhaps a divide in opinion much like the north-south divide already prevalent? There is also structural inequality to consider: if the capital does become one of these powerhouses, will employment be focused there rather than across the region?

Bigger questions still include economic stability and the presence of an army. Decades of structural inequality, born of Westminster’s neglect, have led to the wish for an independent north. But this structural inequality creates a complex issue: with little real employment available already, how would jobs be created in a new country?

I believe that, for now, devolution is the answer to the crises in the north. Whilst the 2004 referendum did not show much support for this, in recent years, and especially with the coronavirus pandemic and the rise of the Northern Independence Party, the time is ripe. Could independence happen? Possibly, with the aid of Scotland. But for a truly free north to come to fruition, devolution must first aid the cause of true independence. It has been a long time coming, and whilst under Tory Westminster rule, the north will continue to perish.

Aoifke Madeleine, History in Politics Summer Writer

The Forgotten Medici

The Medici were one of the most infamous families in Italian and Renaissance history: a family of bankers who rose to rule Florence. They became patrons of the arts, backing the likes of Da Vinci and Galileo, and produced four popes and two queens of France.

However, there is one member of the family who has been erased from history and has only recently been more widely spoken about in historical and media circles.

Alessandro de’ Medici was the only recognised illegitimate son of Lorenzo II de’ Medici. His mother is believed to have been Simonetta da Collevecchio, a servant in the Medici household, whom multiple historians, such as Christopher Hibbert and John Brackett, believe to have been of African descent.

For the most part, he was disliked less for his skin colour than for his mother’s status as a freed slave. He was seen as ‘false royalty’ throughout his life due to his mother’s low birth.

Others nicknamed him Il Moro, ‘the Moor’, due to his dark skin and curly hair.

His half-sister Catherine, Lorenzo’s only legitimate child, would go on to become queen consort to Henry II of France. Their father died in 1519.

The two were raised under the guidance of the Medici pope Leo X (until his death in 1521) and cardinal Giulio de’ Medici (later made Pope Clement VII). In 1522, Alessandro was given the title Duke of Penne by his uncle. Clement apparently favoured Alessandro, often taking his side in disputes with his cousin Ippolito. This favour also fuelled rumours that Clement himself, rather than Lorenzo, was Alessandro’s father.

According to one historian, Alessandro was morose, passionate and could be cruel. His manners were marked by ‘vulgarity and abruptness’, something unexpected in a man of his class and upbringing. This attitude translated into his political life, making him many enemies.

Political Life

After the siege of Florence ended in 1530, Alessandro was made Duke of Florence under a 1531 agreement between the Pope and Charles V, the Holy Roman Emperor. He was made hereditary duke in 1532, ending the Florentine republic and making him the first Medici to rule Florence as a monarch, beginning a dynasty that would last just over 200 years.

Alessandro was married to Charles V’s illegitimate daughter, Margherita. His noble birth, as a direct descendant of Lorenzo de’ Medici ‘the Magnificent’, was an attractive feature and helped establish him as a genuine noble.

Descriptions of his rule vary. Positively, he was seen as a champion of the poor and helpless. He was also, like many in his family before him, a patron of the arts, commissioning pieces from notable artists of the time such as Pontormo. Duke Alessandro also ruled with the advice of elected councils.

Florence’s vocal exile community judged his rule as harsh, depraved, and incompetent. In 1535, the exiles asked his cousin Ippolito to meet with Charles V to denounce Alessandro’s rule. They described him as a tyrant and accused him of every crime imaginable, but Charles ignored these accusations, particularly after hearing from one of Alessandro’s advisors, who told a more favourable story of his rule.

Ippolito then died in questionable circumstances, which some believe Alessandro arranged. This helped convince some of his contemporaries that he was a tyrant.

Alessandro was assassinated in 1537 by his distant cousin Lorenzino de’ Medici, in an attempt to bring back the Florentine Republic. Power passed to Cosimo I de’ Medici of the junior line of the family, marking the end of the senior line and its rule in Florence.

(Credit: Opacity, via Flickr)

Afterlife

Images of Alessandro vary amongst his contemporaries and historians.

No one was more determined to establish Alessandro’s worthiness as a leader than his successor Cosimo I, who went on to rule Florence successfully. Cosimo assumed responsibility for raising Alessandro’s two illegitimate children and avenged his death by having Lorenzino assassinated.

For his contemporaries, as previously mentioned, his blackness was not why they hated him. To them, he was an arrogant tyrant, a murderer and, above all, a Medici. His race was perhaps the least of their objections to him.

His image as a tyrant, however, did prevail over time, presenting him as the prince who began Florence’s ruin.

Historians, trying to take a more impartial view, have argued back and forth as to what sort of man he really was, with some concluding that he was a much better ruler than his detractors claimed, pointing to his kindness to the poor and helpless.

Until recently, he was mostly ignored in historical circles and in mainstream representations of the Medici family and of Renaissance Italy. This is odd, considering his short and extraordinary life seems the kind of story one should tell in a period drama: from his womanising to his rule as the first prince of Florence and the last of the original Medici line.

The story of Alessandro de’ Medici is part of a wider conversation around the erasure of Afro-Europeans from history books and the role they played in shaping Europe’s political history.

It is important for historical integrity and diversity to tell such stories and recognise the impact men and women like Alessandro had.

Michaela Makusha, History in Politics Writer

How the UK Shaped Hong Kong’s Unique Democratic Sensitivity

Many look at Hong Kong’s politics now and wonder how Hong Kong got into such a mess. As some may know, in addition to being a shopping and cuisine paradise, Hong Kong has a special political and legal status. Alongside Macau, Hong Kong is run under the principle of ‘one country, two systems’. In other words, though Hong Kong is a part of socialist China, it operates under a capitalist system. This was a compromise agreed between the British colonial government and China, taking effect when British control of Hong Kong ended in 1997. (Whether the Chinese government is maintaining the principle well is not the question to be discussed here.) Instead, this article will explore the British colonial government’s impact in shaping Hong Kong people’s unique democratic sensitivity, which has certainly contributed to the recent clashes between the Hong Kong government and its people.

The British colonial government’s role in shaping Hong Kong people’s democratic sensitivity can first be seen in its localisation policies. In 1967, a very serious riot broke out across Hong Kong. This was a wake-up call to the British colonial government that it had to change its way of administration by catering better to local people’s needs. It thus began to implement a series of socio-economic policies, such as providing affordable housing and free, compulsory education. With a better living environment, Hong Kong people were able to spend more on learning instead of merely focusing on escaping poverty, and generations of improvement in education produced a population with a very high level of education. As a result, more locals were capable and eligible to work in the government. There was rapid localisation of governmental personnel, including an increase of over 50% in the number of Hong Kong civil servants from 1980 to 1990, and a growing number of Hong Kong Administrative Officers. Similarly, more Hong Kong people were promoted to senior and even top governmental positions: Anson Chan Fang On Sang became the first Chinese Chief Secretary and Donald Tsang Yam Kuen the first Chinese Financial Secretary in the 1990s. Over the years leading to the transfer of Hong Kong, more Chinese ‘secretaries’ emerged, and more Hong Kong people learned, and were trained in, the British democratic way of governance.

Colonial government forces facing pro-Beijing protestors, 1967. (Credit: Hong Kong Free Press)

As the transfer approached, the British colonial government made an even more significant attempt at ‘localisation’: increasing Hong Kong people’s democratic sensitivity. Towards the second half of the twentieth century, China had emerged as a stronger nation running under a socialist system, and the British colonial government feared that Hong Kong would become a socialist city under the CCP. As a result, in the 1990s, it greatly localised the government by promoting more locals into the administration. It hoped that, by doing so, these Hong Kong people would already be trained to manage their government in a democratic way when the transfer happened. The fact that capable Hong Kong people already occupied government positions also meant that there would not be many vacancies when the British colonial government was ‘out’ in 1997: the Chinese government would not, the thought went, send its own personnel (trained and experienced under a socialist system) to manage the government.

The localisation measures were effective in realising the British colonial government’s democratic intentions. In the early years of the twenty-first century, for example, many Hong Kong people trained under the British democratic system still occupied most government positions. They pushed for further democratic reform after the transfer to ensure democratic education for the new generations. The creation of the secondary school subject ‘Liberal Studies’, which educated youngsters on ‘one country, two systems’ and one’s political rights, is a clear illustration of these efforts. These efforts in turn trained a generation of millennials who had lived with and known democracy their whole lives. These youngsters know clearly what their political rights are and are willing to participate in defending them or pushing for democratic reforms. Educated in liberal ideas, they are also capable of critically challenging government actions. Thus, it is not hard to understand why these democratically sensitive generations felt threatened and protested when, in recent years, more pro-China politicians took up government positions and more pro-China policies were implemented.

Hong Kong people’s unique democratic sensitivity can also be traced to another policy of the British colonial government: the creation of representative governance. The Chinese government’s autocratic rule during the Cultural Revolution really ‘freaked out’ the British colonial government, which became determined to build Hong Kong a steadfast democratic foundation by increasing the electoral elements in Hong Kong’s political structure. In the Legislative Council, the first indirect election in 1985 marked the start of a gradual change, and was soon followed by the first direct election for 18 seats by the method of ‘one person, one vote’ in 1991 and the abolition of all official seats in 1995 under Governor Chris Patten. At last, the president of the Legislative Council was no longer the Governor but was elected from among the Council’s elected members. In the District Councils, the first direct election was held in 1982 and all official seats were abolished in 1985; all appointed seats were abolished in 1994 and the voting age was lowered from 21 to 18. More people were eligible to participate in voicing their opinions by being able to vote for politicians who represented their views. In the Urban and Regional Councils, too, there were gradual elections and the abolition of appointed seats. More people could vote, and more were eligible to stand in elections.

As Hong Kong moved into the twenty-first century, these elections were already in place. The current generations are used to having their say and participating in politics by voting for and choosing their representatives. At the same time, more young people choose to participate in community affairs by standing in District Council elections, which are open to all voters aged 18 and over. Others choose to become Legislative Councillors to have their opinions on the future development of Hong Kong valued. Thus, it is not hard to understand why youngsters are willing to protest, even resorting to radical action, in the face of the narrowing of electoral choices and rights in recent years.

Chan Stephanie Sheena

Home Ownership: is it the key in Britain’s inequality crisis?

Housing reform is necessary. That, as a statement, is perhaps one of the few things undisputed between the major parties in Westminster. Not enough affordable homes are accessible, leaving many within a rent trap, never quite managing to make it onto the housing ladder. Yet under the surface issue of getting the next generation onto the housing ladder lies an issue of greater concern. If left unchecked, the imbalance within the housing market, coupled with long-time economic woe for lenders, could be Britain’s next ticking time bomb and have disastrous socio-political consequences.

The catalyst of this brewing socio-political crisis is economic. Following the 2008 financial crash, economic growth in the UK has struggled to get back onto its feet. Productivity remains stagnant and lags behind other G7 nations. Attempting to stimulate growth, the central bank has, throughout the decade, maintained low interest rates to persuade people to spend rather than save. The rationale is straightforward enough: more gets produced if money is being spent, not sitting in current accounts accruing interest. So far, however, the strategy has not borne fruit.

A house left derelict in Brixton, London. Mercury Press & Media (Credit: https://www.mirror.co.uk/news/uk-news/crumbling-home-abandoned-30-years-22352062)

There is, however, a further, longer-term issue with the central bank’s policy on interest rates which is of greater concern. Ever since the UK government brought in inflation targeting in 1992, an economic policy whereby the central bank targets a certain inflation rate, interest rates have trended consistently downwards. Whilst this is great for home-owners with mortgages, it makes it increasingly difficult for young people looking to save to build up the finances required for the deposit on a house.

What does this have to do with an impending socio-political catastrophe? Simply put, right now there is no way for young people to reach the housing ladder; they are perpetually stuck within the renting market. This, by itself, is not damaging to the socio-political situation within the UK. Other developed countries across the continent have renting cultures and, if anything, dismantling the constructed expectation to own a house would likely be an improvement to British culture.

However, it is not certain that the impending economic change will achieve such a culture shift, and the consequences of retaining a house owning culture within a renting society are worrying.

Even ignoring, momentarily, the short-term correction of the market that would occur when demand for houses is outstripped by supply as a largely property-owning baby-boomer generation passes away, houses will remain unattainably expensive for the generations that follow. The most worrying concern, instead, is the impact this could have on already expanding inequality within the UK.

This is because the gap between those who have and those who have not will become unbridgeable. With no ability to save for that deposit, those with parents who already own houses may find that inheritance is the only way onto the property ladder. Owning a house will become a sign of significant family wealth, concentrating privilege in an ever-decreasing minority, whilst those stuck with the financial pressures of paying rent fall further and further behind.

And, as history points out repeatedly, such unsustainable inequality leads to increased political extremism and conflict. It would be naïve of us to assume that we would not be subjected to the same increase in political extremism that defined the French Revolution, the formation of the Soviet Union, and 1930s Germany. Indeed, in the nineteenth century there was a fear that Britain would follow a similar path, as events such as the Peterloo Massacre and the Hyde Park Riots, where protestors demonstrated against the inequality of Victorian Britain, threatened to become the UK’s own storming of the Bastille.

Disraeli and Gladstone can be largely applauded for quelling such extremist politics with their reformative platforms. So what, then, do we do about our modern conundrum? Were we to copy from the economist’s textbook, there would be one of two options: change the government’s fiscal policy or change its monetary policy. The government could go down the monetary route, scrapping inflation targeting and artificially raising interest rates, or it could go down the fiscal route of greater spending, through the building of more affordable homes.

Yet, whilst the causes of this crisis may be down to our economic history, we must take a more socio-political approach to preventing it.

In contrast to the Victorian reformists Disraeli and Gladstone, who both passed Reform Acts expanding suffrage in reaction to unrest, political reform today should be prioritised in preparation for, rather than reaction to, this crisis. Focus should be placed in two directions. Firstly, upon making central government contracts more transparent and meritocratic, the result being greater political pressure for a larger percentage of housing developments to be affordable. Secondly, the remit of local governments should be altered, increasing their control over the awarding of contracts whilst simultaneously completing the shift towards housing associations owning social housing. Such a move would give councils and metropolitan areas greater control over how much housing is built, whilst protecting them from the unaffordable and uneconomical costs which come with development.

The housing crisis, although economically caused, must rely on contemporary political solutions to prevent socio-political catastrophe. We cannot be lulled into either thinking the UK is immune from the multiple historical examples of political extremism that accompany greater inequality, or that implementing the same economic policies repeatedly will provide a different result. Without expansive political reforms to housing, transferring power towards local governments, and increasing transparency in government, the UK will only step closer towards socio-political disaster.

Matthew Lambert, Summer Writer

South Africa and Apartheid’s Enduring Legacy

Apartheid, literally meaning ‘apartness’ or ‘separateness’ in Afrikaans, refers to the policy of enforced racial segregation that defines the history of modern South Africa. Spanning from 1948 to 1994, when the National Party was in power and put into practice the culture of ‘baasskap’, or white supremacy, the national programme of apartheid forced black and white citizens apart for nearly fifty years. The first law, the Prohibition of Mixed Marriages Act of 1949, served as the forerunner for later legislation which sought to prevent interracial relationships and remove the political rights of black citizens. All public facilities, including hospitals and public transport, were segregated; beyond this, the effects of apartheid split up families and displaced them from their homes.

A sign enforcing racial segregation in a bayside area, South Africa, 1970s. (Credit: Keystone via Getty Images)

However, whilst the political doctrine of apartheid and its segregationist ideology ended in 1994, culminating in the election of Nelson Mandela as President of South Africa, its socio-economic legacy extends into the present day. The apartheid economy was tailored to appeal to, and overwhelmingly benefit, white citizens, and in a nation of significant inequality the after-effects of enforced segregation still pervade twenty-first-century discourse. This economic legacy of apartheid remains palpable within modern South Africa, which continues to be shaped by the segregationist policies of the late twentieth century. Today, black citizens remain disadvantaged, compared to their white counterparts, in the national economy and the opportunities afforded to them. As the Economic Freedom Fighters, a South African left-wing political party, emphasised in 2013, ‘political freedom without economic emancipation is meaningless.’ Statistical evidence supports the party’s observation: in 2011, 54% of black South Africans, compared to less than one per cent of white citizens, lived in poverty, attesting to the wider culture of division which had served as the central bastion of political authority.

Even in the realm of education – particularly pertinent given the notable involvement of students within the anti-apartheid movement – the effect of segregation is demonstrable in the twenty-first century. Under the National Party, white schools received ten times the funding of black schools, meaning that historical inequalities have become so deeply embedded in the framework of South Africa’s education system that they are perpetuated nearly thirty years after the dissolution of apartheid. From 2015 to 2019, school funding in the province of KwaZulu-Natal, one of the lowest-income communities in the nation, fell by a further 15%. What this evidence highlights is that whilst the official dogma of segregation is no longer woven directly into the fabric of the nation, the ghost of apartheid remains a ubiquitous element of life in South Africa, carving out an enduring and reprehensible modern socio-economic legacy.

Maximus McCabe-Abel, President, History in Politics

Hong Kong’s National Security Law: Power Not To The People

You might have heard of the unrest in Hong Kong last year, stemming from the Government’s attempt to introduce an extradition agreement with Mainland China and culminating in a full-blown humanitarian crisis with the enactment of the National Security Law (NSL). Why was the extradition agreement met with such vigorous opposition? The proposed Bill would have made both foreign nationals residing in Hong Kong and local criminal suspects extraditable to mainland China, which has a substantially different criminal justice system and a history of breaching fundamental human rights, including arbitrary detention, unfair trials and torture. The only requirement was that ‘prima facie’ evidence, which carries a significantly low standard of proof, be provided to the Chief Executive and the courts. Following escalating public clashes between the Government, police and citizens, with protests seeing over a million people in attendance and over 10,000 people arrested, the Bill was shelved. But by that time, the damage was done. The Bill had exacerbated the deep fears of local citizens and expats in Hong Kong, who saw it as an early sign of China’s descent upon the nation and the dark future to come.

Several demands arose from the locals: the formal withdrawal of the Bill; the release and exoneration of those arrested during the protests; the establishment of an independent commission of inquiry into police behaviour; universal suffrage for Legislative Council and Chief Executive elections, in addition to the resignation of Chief Executive Carrie Lam; and lastly the retraction of the characterisation of the protests as ‘riots’. Somewhat unsurprisingly, only the first demand was met, which the Hong Kong people saw as highly unsatisfactory, and protests continued with increasing intensity. All this culminated in the Standing Committee of the National People’s Congress enacting the NSL, which opened a bigger can of worms.

Protesters marching at the “Stop Police Violence, Defend Press Freedom” silent march called after media professionals were insulted by police officers when covering protests against the extradition law to China. (Credit: Ivan Abreu, SOPA Images, Sipa via AP Images.)

Under the Sino-British Joint Declaration of 1984, Britain agreed to hand back control of Hong Kong, ceded following the First and Second Opium Wars, to China in 1997, on the condition that ‘one country, two systems’ and the freedoms of speech, assembly and religious belief, amongst others, would continue to be enjoyed by Hong Kong until 2047. The NSL contained intentionally vague provisions allowing ‘secession, subversion, terrorism and collusion with foreign forces’ to be punished by a maximum sentence of life in prison. Having already been used to charge more than 50 individuals, this has naturally given rise to a sense of deep unease in both the domestic and international spheres. As the legislation would also allow cases to be tried in Mainland China under its legal system, there is a real risk of criminal suspects being deprived of fundamental human rights, such as being held incommunicado in undisclosed locations for up to six months before being formally arrested or released. Whilst the UK has similar national security laws, in that suspected terrorists can be detained without charge for up to 28 days, these individuals are nevertheless allowed legal representation after a maximum period of 48 hours upon arriving at the police station, and, compared to Mainland China, the UK is subject to more intense public and legal scrutiny whenever human rights are undermined. The legislation’s effect is essentially a complete curtailing of free speech, press and political dissent in Hong Kong. Critics worldwide have argued that this directly contravenes the Joint Declaration’s condition of ‘one country, two systems’, with the NSL also applicable to crimes committed abroad, to non-permanent residents and to people outside of Hong Kong. This means that the reach of the law is far and extensive, essentially subjecting even foreign nationals to its authority.

Whilst the realistic probability of Western countries extraditing their citizens for crimes committed against the Communist Party is relatively slim, the law has already caused a growing reluctance amongst foreign investors to conduct business in Hong Kong for fear of being subject to the extensive powers of the NSL. After the emergence of Covid-19 and the consequent increasing criticism of the Communist Party, it will be a matter of great importance for there to be checks and controls on Mainland China’s ever-increasing influence. If it is left unchecked, one can only hope to stay out of the line of sight of the Chinese Government, and that is something I concern myself with, as this article has the potential to be considered ‘subversion’ under the draconian National Security Law.

May Lam, History in Politics Contributor

The Future Unlocked? 

What a strange year. April might seem like an even stranger time to reflect, one month after the anniversary of the first coronavirus lockdown, but it also seems astute as the easing of lockdown starts to open up our futures. With pubs starting to open, vaccines being delivered, and students officially allowed back to university, there is light at the end of the pandemic tunnel.

Yet, while we’ve been locked up in our houses, a few things have happened. For one, History in Politics has done two terms as a university society – but you probably don’t care much about that. More significant are the huge events seen through the prism of a new post-pandemic world. Britain has finally, properly, left the EU; Boris Johnson lost his most infamous advisor; thousands marched for BLM; and thousands have protested policing in the wake of Sarah Everard’s death: ‘why are you protecting statues of racists over actual women?’, one sign read.

During the pandemic, Britain has been reflecting. We might look back upon our relationship with Europe. We might look at the history of race-relations in the UK, or our colonial legacy. In fact, with books such as Empireland: How Imperialism Has Shaped Modern Britain being released in January by Sathnam Sanghera, it is clear that many have been reflecting on such themes. In doing so, it is hoped that, by having a clear idea of where we’ve come from, we might have a better idea of what we’re meant to do in the future.

Luckily for me, although perhaps less so for my career prospects, I’ve had the privilege of studying such history. I’ve spent a lifetime learning about the British Empire, race-relations, civil rights, and Britain’s relationship with Europe (although, aged 21, a lifetime is quite a melodramatic way of putting it). I have even had time to study the Tudors, which many complain took the place of ‘more relevant’ history. Despite all this history I am still to get the magic key to predicting our future – perhaps that will come tomorrow, or once I’m back in a Durham pub. 

Ironically, such historical reflections can be found throughout history. When Edward Colston’s statue was raised in Bristol in 1895, for instance, it was already over a century and a half after his death. Those who toppled his statue over a hundred years later certainly wouldn’t think that the Victorian remembrance of Colston was a positive one, although some might suggest it was representative of the future for Victorians: a future of racial inequality.

The plinth of the now removed statue of Edward Colston, Bristol, England. (Credit: James Beck for The New York Times)

One thing which we cannot change, regardless of how we might reflect upon it, is what has passed. This might sound obvious, but it is important to hold in mind such ‘objective truths’. They’re the reason people look back, hoping the past truths will unlock future truths. It is in search of the ‘truth’ that we talk, read, and reflect on our past – from empire to race. Last summer, as statues were ripped up and the media exploded into debate, I asked how we might have that conversation in a civil manner. Yet, the ‘culture war’ has continued – regardless of these ‘truths’. 

Perhaps it is less talking and more listening which needs to happen. Over lockdown I had the pleasure of listening to Natalie Mears (associate professor in early modern British history) discuss some of these topics. Finally, I could put Tudor history to some use, and the comparisons with our present ‘culture war’ were stark. From powerful political advisors (it is the 500th anniversary of William Cecil’s birth, and a year since ‘that’ trip to Barnard Castle) to our relationship with Europe, some things seemingly haven’t changed. As we reflect upon the past year (and a bit) of COVID-19, one lesson from Elizabethan England sticks out the strongest: reflections and memories of the past have always been political. At least that is one door to the future which is unlocked. The future is undoubtedly political.

Join Durham University’s History in Politics Society for their term’s theme of ‘Reflections’ and find series two of Dead Current, the History in Politics Podcast, on Spotify. The first episode of series two is President Emily Glynn and Events Manager Ed Selwyn Sharpe’s interview with Natalie Mears.

Ed Selwyn Sharpe

Diamonds, Best Friend or Mortal Enemy?

Diamonds symbolise love, wealth, and commitment to both the purchaser and the recipient; after all, they are known to be a woman’s best friend. Yet the process of retrieving such a valuable commodity remains a battleground for those who work in the diamond mines. Alongside diamond production come worker exploitation, violence, and civil wars, proving that beauty is, in fact, pain.

The tale of the present-day diamond market began on the African continent, in South Africa to be precise. The Democratic Republic of the Congo ranks fourth in the world for diamond production, with 12 million carats produced in 2020, and Africa dominates the top-ten rankings, with seven of its 54 countries among the world’s largest diamond producers.

Congolese workers searching for rough diamonds in mines in the south west region of Kasai in the Democratic Republic of Congo, August 9, 2015. (Credit: Lynsey Addario, Getty Images Reportage for Time Magazine)

The diamond trade contributes approximately $8.5 billion per year to Africa, and Nelson Mandela once stated that the industry is “vital to the Southern African economy”. The wages of diamond miners, however, do not reflect the value of this work and its contribution to the financial expansion of African countries. An estimated one million diamond diggers in Africa earn less than a dollar a day, an unlivable wage below the extreme poverty line. Despite the significant revenues from the diamond industry, both through taxation and profit-sharing arrangements, governments often fail to re-invest these funds in local communities. The government in Angola receives about $150 million per year in diamond revenues, yet conditions near major diamond mining projects are appalling: public schools, water supply systems and health clinics are near non-existent. Many African countries are still healing from the impact of colonisation and are dealing with corruption, incompetence and weak political systems. It therefore comes as no surprise that governments fail to invest their diamond revenues productively.

As well as being excessively underpaid and overworked, miners endure exceptionally hazardous conditions, often lacking safety equipment and adequate tools for their role. Injuries, sometimes fatal, are an everyday possibility in the life of a miner, and the risk of landslides, mine collapses and a variety of other accidents is a constant fear. Diamond mining also contributes to public health problems, since the sex trade flourishes in many diamond mining towns, leading to the spread of sexually transmitted diseases.

Children are considered an easy source of cheap labour, and so they tend to be regularly employed in the diamond mining industry. One survey of diamond miners in the Lunda Norte province of Angola found that 46% of miners were between the ages of 5 and 16. Life as a diamond miner is full of hardship, and this appalling way of living is only harsher for young children, who are more prone to injuries and accidents. Since most of these children do not attend school, they tend to be pigeonholed into this way of life throughout adulthood, robbing them of their childhoods and bright futures.

African countries such as Sierra Leone, Liberia, Angola and Côte d’Ivoire have endured ferocious civil conflicts fuelled by diamonds. Diamonds that instigate such conflicts are often called “blood” diamonds, as they intensify civil wars by financing militaries and rebel militias. Control over diamond-rich territories causes rival groups to fight, resulting in bloodshed, loss of life and disturbing human rights abuses.

Whilst purchasing diamonds from a conflict-free country such as Canada can buy you a clean conscience, you must not forget the miners being violated every day for the benefit of others but never themselves. Just as we have the opportunity to choose fair trade foods that benefit their producers, consumers of one of the most valuable products one may ever own should not be left in the dark about the strenuous digging miners do behind the stage of glamour and wealth. A true fair trade certification process must be put in place through which miners are adequately rewarded for their dedication and commitment to such a relentless industry, especially in countries still processing the generational trauma caused by dominating nations.

Lydia Benaicha, History in Politics Contributor

The Cost of Casual Scepticism to Human Rights

Modern aversion to human rights protection in the United Kingdom can be seen on the surface of British politics, from Theresa May’s indignation that she was unable to deport an illegal immigrant because he had a pet cat, to Nigel Farage’s demand that we “scrap the EU Human Rights Act”. The modern rallying cry for human rights sceptics often takes the form that we, the British, have no need for coffee-drinking, baguette-eating Europeans to tell us how to do human rights. The problem, though, is that rights protection is not that simple, and never has been, so why do we let human rights be disparaged by these tabloid-level soundbites and mantras?

The European Court of Human Rights in Strasbourg. (Credit: Europe’s Human Rights Watchdog)

When bashing human rights, politicians are most commonly referring to the European Convention on Human Rights (ECHR), created by the Council of Europe, which, for clarity’s sake, has nothing to do with the European Union – incidentally making Farage’s claims about the “EU Human Rights Act” farcical. The ECHR established the European Court of Human Rights (ECtHR), frequently referred to as ‘Strasbourg’ after its location. These elements of European influence running through our human rights make them an easy target for politicians wanting to exploit the Euroscepticism rife in modern Britain. This is why, perhaps unsurprisingly, key features of the ECHR’s history are left out by its critics.

The ECHR finds its origins in the ruins of Europe following World War Two and the Holocaust, when it was decided that action must be taken to prevent future human rights abuses across the continent. Winston Churchill, and the British delegation, played a key role in the drafting of the Convention and were ardent supporters of it. Since then, the Convention has evolved according to the changing nature of Europe and has helped to reduce discrimination and expose rights abuses.

Looking at the face of modern Europe, the significant role of the ECHR may seem unnecessary, even an inconvenience to public protection and government policy. However, the Convention should not be perceived as existing only to prevent grave breaches of human rights, but also to remedy all forms of discrimination, such as securing the right of LGBT couples to adopt or of people of all faiths to have the same job opportunities. Such breaches still occur across Europe, with significant examples appearing in Poland and Hungary, where radical anti-LGBT policies, such as ‘LGBT free zones’, are being enacted.

This is one reason why it is more crucial than ever that other states party to the Convention continue to believe in and abide by it. If disagreements with Strasbourg start to appear over issues such as prisoner voting rights, this waters down the impact of the ECtHR’s rulings and risks judgments given by the Court against clear breaches (such as those appearing in Eastern Europe) being swept aside and ignored by the violating state. Arguing and fighting for proper rights protection is done not only for yourself, or your group, but for the wider community.

This is why the rhetoric used in modern politics about seemingly humorous issues, like the failed deportation of a man because of his cat, and other false remarks which attempt to sabotage the mechanisms for rights protection, are dangerous. Dangerous not only for the prevention of the relatively benign issues that may appear in nations like the UK, but also for those who rely on the Convention to prevent fundamental breaches of human rights – breaches which occur on a daily basis and which recall the events that led to the creation of the ECHR.

Aidan Taylor, History in Politics Contributor