Deepfakes: faces created by AI now look more real than genuine photos

This article is republished from The Conversation under a Creative Commons licence

by
Manos Tsakiris, Royal Holloway University of London

Even if you think you are good at analysing faces, research shows many people cannot reliably distinguish between photos of real faces and images that have been computer-generated. This is particularly problematic now that computer systems can create realistic-looking photos of people who don’t exist.

Recently, for example, a fake LinkedIn profile with a computer-generated profile picture made the news because it successfully connected with US officials and other influential individuals on the networking platform. Counter-intelligence experts even say that spies routinely create phantom profiles with such pictures to home in on foreign targets over social media.

These deepfakes are becoming widespread in everyday culture, which means people should be more aware of how they’re being used in marketing, advertising and social media. The images are also being used for malicious purposes, such as political propaganda, espionage and information warfare.


Listen and read with “The Conversation”


I discovered the other day that The Conversation website has expanded its offering to include a NOA (News Over Audio) section, which presents the news both as reading and as listening: an extremely useful exercise for anyone studying English.

As always, I recommend using these articles first of all as pure listening exercises, without looking at the text, trying to pick up as much information as possible, even over repeated listens. The next step is to listen again with the text in front of you, trying to identify and study the passages that caused the greatest difficulty. Finally, it can be useful to do one last listen without the text, checking whether you can more or less follow the whole article from beginning to end.

The page with the articles published so far can be found HERE


What the world can learn about equality from the Nordic model

This article is republished from The Conversation under a Creative Commons licence

by
Geoffrey M Hodgson, University of Hertfordshire

Rising inequality is one of the biggest social and economic issues of our time. It is linked to poorer economic growth and fosters social discontent and unrest. So, given that the five Nordic countries – Denmark, Finland, Iceland, Norway and Sweden – are some of the world’s most equal on a number of measures, it makes sense to look to them for lessons in how to build a more equal society.

The Nordic countries are all social-democratic countries with mixed economies. They are not socialist in the classical sense – they are driven by financial markets rather than by central plans, although the state does play a strategic role in the economy. They have systems of law that protect personal and corporate property and help to enforce contracts. They are democracies with checks, balances and countervailing powers.

Nordic countries show that major egalitarian reforms and substantial welfare states are possible within prosperous capitalist countries that are highly engaged in global markets. But their success undermines the view that the ideal capitalist economy is one where markets are unrestrained. They also suggest that humane and equal outcomes are possible within capitalism, while full-blooded socialism has always, in practice, led to disaster.

The Nordic countries are among the most equal in terms of distribution of income. Using the Gini coefficient measure of income inequality (where 1 represents complete inequality and 0 represents complete equality), OECD data gives the US a score of 0.39 and the UK a slightly more equal score of 0.35 – both above the OECD average of 0.31. The five Nordic countries, meanwhile, range from 0.25 (Iceland – the most equal) to 0.28 (Sweden).
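To make the scale of these scores concrete, here is a minimal Python sketch – our illustration, not the OECD’s actual methodology – of the standard pairwise-difference formula for the Gini coefficient of a list of incomes:

```python
def gini(incomes):
    """Gini coefficient via the mean absolute difference between all pairs.

    Returns 0.0 for complete equality, approaching 1.0 for complete
    inequality - the same convention as the OECD figures quoted above.
    """
    n = len(incomes)
    mean = sum(incomes) / n
    total_abs_diff = sum(abs(x - y) for x in incomes for y in incomes)
    return total_abs_diff / (2 * n * n * mean)

# A perfectly equal society scores 0; concentrating income raises the score.
print(gini([10, 10, 10, 10]))  # 0.0
print(gini([1, 2, 3, 94]))     # 0.7
```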

The relative standing of the Nordic countries in terms of their distributions of wealth is not so egalitarian, however. Data show that Sweden has higher wealth inequality than France, Germany, Japan and the UK, but lower wealth inequality than the US. Norway is more equal, with wealth inequality higher than Japan’s but lower than that of France, Germany, the UK and the US.

Nonetheless, the Nordic countries score very highly in terms of major welfare and development indicators. Norway and Denmark rank first and fifth in the United Nations Human Development Index. Denmark, Finland, Norway and Sweden have been among the six least corrupt countries in the world, according to the corruption perceptions index produced by Transparency International. By the same measure, the UK ranks 10th, Iceland 14th and the US 18th.

The four largest Nordic countries have taken up the top four positions in global indices of press freedom. Iceland, Norway and Finland took the top three positions in a global index of gender equality, with Sweden in fifth place, Denmark in 14th place and the US in 49th.

Suicide rates in Denmark and Norway are lower than the world average. In Denmark, Iceland and Norway the suicide rates are lower than in the US, France and Japan. The suicide rate in Sweden is about the same as in the US, but in Finland it is higher. Norway was ranked as the happiest country in the world in 2017, followed immediately by Denmark and Iceland. By the same happiness index, Finland ranks sixth, Sweden tenth and the US 15th.

In terms of economic output (GDP) per capita, Norway is 3% above the US, while Iceland, Denmark, Sweden and Finland are respectively 11%, 14%, 14% and 25% below the US. This is a mixed, but still impressive, performance. Every Nordic country’s per capita GDP is higher than that of the UK, France and Japan.

Special conditions?

Clearly, the Nordic countries have achieved very high levels of welfare and wellbeing, alongside levels of economic output that compare well with other highly developed countries. These outcomes result from relatively high levels of social solidarity and taxation, alongside a political and economic system that preserves enterprise, economic autonomy and aspiration.

Yet the Nordic countries are small and more ethnically and culturally homogeneous than most developed countries. These special conditions have facilitated high levels of nationwide trust and cooperation – and consequently a willingness to pay higher-than-average levels of tax.

As a result, Nordic policies and institutions cannot be easily exported to other countries. Large developed countries, such as the US, UK, France and Germany, are more diverse in terms of cultures and ethnicities. Exporting the Nordic model would create major challenges of assimilation, integration, trust-enhancement, consensus-building and institution-formation. Nonetheless, it is still important to learn from it and to experiment.

Despite a prevailing global ideology in favour of markets, privatisation and macro-economic austerity, there is considerable enduring variety among capitalist countries. Furthermore, some countries continue to perform much better than others on indicators of welfare and economic equality. We can learn from the Nordic mixed economies, with their strong welfare provision that does not diminish the role of business. They show a way forward that is different from both statist socialism and unrestrained markets.


How multinationals continue to avoid paying hundreds of billions of dollars in tax – new research

This article is republished from The Conversation under a Creative Commons licence

by
Miroslav Palanský, Charles University

Tax havens have become a defining feature of the global financial system. Multinational companies can use various schemes to avoid paying taxes in countries where they make vast revenues. In new research, my colleague Petr Janský and I estimate that around US$420 billion in corporate profits is shifted out of 79 countries every year.

This equates to about US$125 billion in lost tax revenue for these countries. As a result, their state services are either underfunded or must be funded by other, often lower-income taxpayers. It contributes to rising inequality both within countries and across the world.

Given the nature of the issue, it is intrinsically difficult to detect tax avoidance or evasion. To get round this, we use data on foreign direct investment (FDI) collected by the International Monetary Fund to examine whether companies owned from tax havens report lower profits in high-tax countries compared to other companies.

We found that countries with a higher share of FDI from tax havens report profits that are systematically and significantly lower, suggesting these profits have been shifted to tax havens before being reported in high-tax countries. The strength of this relationship enables us to estimate how much more profit would be reported in each country if companies owned from tax havens reported similar profits to other companies.

We found that lower-income countries on average lose at least as much as developed countries (relative to the size of their economies). At the same time, they are less able to implement effective tools to reduce the amount of profit shifted out of their countries.

Three channels of profit shifting

There are three main channels that multinationals can use to shift profits out of high-tax countries: debt shifting, registering intangible assets such as copyright or trademarks in tax havens, and a technique known as “strategic transfer pricing”.

To see how these channels work, imagine that a multinational is composed of two companies, one located in a high-tax jurisdiction like Australia (company A) and one located in a low-tax jurisdiction like Bermuda (company B). Company B is a holding company and fully owns company A.

While both companies should pay tax on the profit they make in their respective countries, one of the three channels is used to shift profits from the high-tax country (Australia in our case, with a corporate income tax rate of 30%) to the low-tax country (Bermuda, with a corporate income tax rate of 0%). For every dollar shifted in this way, the multinational avoids paying 30 cents of tax.

Debt shifting is when company A borrows money from company B (although it does not need to) and pays interest on this loan to company B. The interest payments are a cost to company A and are tax-deductible in Australia. So they effectively reduce the profit that company A reports in Australia, while increasing the profit reported in Bermuda.

In the second channel, the multinational transfers its intangible assets (such as trademarks or copyright) to company B, and company A then pays royalties to company B to use these assets. Royalties are a cost to company A and artificially lower its profit, increasing the less-taxed profit of company B.

Strategic transfer pricing, the third channel, can be used when company A trades with company B. To set prices for their trade, most countries currently use what’s called the “arm’s length principle”. This means that prices should be set the same as they would be if two non-associated entities traded with each other.

But, in practice, it is often difficult to determine the arm’s length price and there is considerable space for multinationals to set the price in a way that minimises their overall tax liabilities. Imagine company A manufactures jeans and sells them to company B, which then sells them in shops. If the cost of manufacturing a pair of jeans is US$80 and company A would be willing to sell them to unrelated company C for US$100, they would make US$20 in profit and pay US$6 in tax (at 30%) in Australia.

But if company A sells the jeans to its subsidiary company B for just US$81, it only makes US$1 in profit and so pays US$0.30 in tax in Australia. Company B then sells the jeans to unrelated company C for US$100, making US$19 in profit but paying no tax, since there is no corporate income tax in Bermuda. Using this scheme, the multinational avoids paying US$5.70 in tax in Australia for every pair of jeans sold.
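The arithmetic of the jeans example is simple enough to check mechanically. A minimal Python sketch, using only the figures given above:

```python
# Figures from the jeans example: cost, arm's-length price, and tax rates.
COST = 80.0          # manufacturing cost per pair (company A, Australia)
ARMS_LENGTH = 100.0  # price an unrelated buyer (company C) would pay
AU_TAX = 0.30        # Australian corporate income tax rate

def australian_tax(sale_price):
    """Tax company A pays in Australia on one pair sold at sale_price."""
    profit = sale_price - COST
    return profit * AU_TAX

# Arm's-length pricing: US$20 profit taxed at 30% -> US$6.00 per pair.
honest = australian_tax(ARMS_LENGTH)

# Transfer mispricing: sell to subsidiary B at US$81, leaving US$1 of
# taxable profit in Australia; B's US$19 profit is untaxed in Bermuda.
shifted = australian_tax(81.0)

print(f"Tax at arm's length:  ${honest:.2f}")            # $6.00
print(f"Tax after shifting:   ${shifted:.2f}")           # $0.30
print(f"Tax avoided per pair: ${honest - shifted:.2f}")  # $5.70
```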

How to stop it

The root of the problem is the way international corporate income is taxed. The current system is based on an approach devised almost a century ago, when large multinationals as we know them today did not exist. Today, individual entities that make up a multinational run separate accounts as if they were independent companies. But the multinational optimises its tax liabilities as a whole.

Instead, we should switch to what’s called a unitary model of taxation. The idea is to tax the profit where the economic activity which generates it actually takes place – not where profits are reported. The multinational would report on its overall global profit and also on its activity in each country in which it operates. The governments of these countries would then be allowed to tax the multinational according to the activity in their country.

In practice, defining what exactly constitutes “economic activity which generates profit” is the tricky bit. For a multinational that manufactures phones, for example, it is not clear what part of its profit is generated by, say, the managers in California, designers in Texas, programmers in Munich, an assembly factory in China, a Singapore-based logistics company that ships the phone to Paris, the retail store in Paris that sells the phone, or the French consumer.

Different proposals for unitary taxation schemes define this tax base in various ways. The five factors most often taken into account are: location of headquarters, sales, payroll, employee headcount and assets. Different proposals give different weight to these factors.
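As an illustration only – the factor weights and country shares below are invented, use four of the five factors, and do not correspond to any real proposal – a unitary scheme might apportion a multinational’s global profit like this:

```python
# Hypothetical formulary apportionment: weights and shares are invented
# for illustration and do not reflect any actual unitary tax proposal.
WEIGHTS = {"sales": 0.4, "payroll": 0.2, "headcount": 0.2, "assets": 0.2}

def apportion(global_profit, factors_by_country):
    """Split global profit by a weighted average of each country's share
    of the multinational's sales, payroll, headcount and assets."""
    result = {}
    for country, factors in factors_by_country.items():
        share = sum(WEIGHTS[f] * factors[f] for f in WEIGHTS)
        result[country] = global_profit * share
    return result

# Two-country toy example: each value is that country's share of the
# group total for the factor (shares across countries sum to 1).
profit = apportion(1_000_000, {
    "Australia": {"sales": 0.9, "payroll": 0.8, "headcount": 0.8, "assets": 0.7},
    "Bermuda":   {"sales": 0.1, "payroll": 0.2, "headcount": 0.2, "assets": 0.3},
})
print(profit)  # Australia is allocated 82% of the taxable base, Bermuda 18%
```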

Ultimately, introducing unitary taxation would require a global consensus on the formula used to apportion profits. And, admittedly, this would be difficult to do. As the OECD says: “It present[s] enormous political and administrative complexity and require[s] a level of international cooperation that is unrealistic to expect in the field of international taxation.”

But, seeing as the current system costs governments around the world some US$125 billion annually, is global cooperation really more expensive than that?


Stop calling the coronavirus pandemic a ‘war’

This article is republished from The Conversation under a Creative Commons licence

by
Alexandre Christoyannopoulos, Loughborough University

In speeches, commentaries and conversations about the coronavirus pandemic, we keep hearing war-like metaphors being deployed. It happens explicitly (“we are at war”, “blitz spirit”, “war cabinet”) and implicitly (“threat”, “invisible enemy”, “frontline”, “duty”).

This, after all, helps project a readily understandable interpretation of the extraordinary reality facing us. It helps convey a sense of exceptional mobilisation and offers decision-makers an opportunity to rise up as heroic commanders.

It is also true that the language of biomedicine and epidemiology is already heavily militarised. We “battle” a virus, and our body has “defence” mechanisms against the pathogens that “invade” it.

But the coronavirus crisis is an international, pan-human challenge. It certainly requires exceptional collective mobilisation, but no real weapons, no intentional killing of fellow human beings, and no casting of people as dehumanised others. Militarised language is unnecessary.

Explaining and encouraging community resilience and togetherness in the face of adversity by evoking images of war conjures up distorted myths and narratives of heroic past national glory and military campaigns. This might function as a cognitive shortcut to evoke collective effort, but the narrow narratives it reproduces are open to exploitation by opportunistic politicians.

We could just as easily favour analysing the evolving situation in calmer scientific and medical terms. You don’t need ideas about war to tell a story of the human race naturally coming together when faced with a common danger.

Indeed, one striking phenomenon has been the huge proliferation of organic networks of mutual aid. From street-level up, and often with the help of social media, a huge number of people have been organising solidarity networks to help each other – and especially the most vulnerable.

People have come together and organised within neighbourhoods, cities and regions – but also across nations – to help each other without needing to call it a “war” or military “duty”. The language of mutual aid and solidarity works just as well.

Ideological appropriations

Anyone interested in political theory and ideologies must be watching all this with some intellectual curiosity. Different perspectives come with different assumptions about human nature, the role of the state compared to other institutions, and so on.

War is the business of the state par excellence. Some argue it was war-making that actually made the modern state. Framing the response to COVID-19 in military language will reinforce such statist thinking – and, with it, the state and its power.

It is of course true that, given the political architecture in place as the crisis hit, states do hold much organisational capacity and power. They have a crucial role to play in tackling the current emergency. But other political entities matter too, from spontaneous bottom-up networks and municipalities to regional organisations and the World Health Organization. Military metaphors, however, either conceal their contributions or co-opt them by describing their efforts in military terms.

One could just as much pitch the crisis as being about medicine, health workers and human communities across the globe. One could analyse events around particular socio-economic classes, such as supermarket workers, delivery workers and essential equipment manufacturers, in every country affected by the virus. Looking at socio-economic classes across borders could also set up more searching discussions about homelessness, refugee camps, working conditions and universal healthcare.

An analysis based on class or social justice is just as appropriate as one revolving around military metaphors. But instead of reinforcing statist and military thinking, it would explain the crisis in anarchist, Marxist, feminist, or liberal internationalist terms, for example.

Normalising war

Language matters. It helps frame particular stories, interpretations and conversations while at the same time closing off alternative perspectives. It reinforces particular theories about how the world works, and sidelines others.

Framing political issues in the language of war both illustrates the prevalence of militarised thinking and further enables it. The more we use military language, the more we normalise the mobilisation of the military and the more we entrench military hierarchies. When the next international crisis arrives, rather than examining the deeper structural problems that caused it, we will jump again to heroic narratives of national militarised mobilisation.

Who benefits from this? Politicians can project an image of decisive generals protecting their lot. Agents of state coercion can project themselves as dutiful and robust but popular administrators of the public will. They can then mobilise this (typically masculine) brand for their own political agenda later on. If you are Trump, perhaps you can even whip up some anti-Chinese patriotism.

What is missed is the opportunity to develop a more nuanced understanding of human capabilities, one not restricted to national boundaries. Yet this international solidarity, and these pan-human capabilities, might be exactly what we need to tackle other problems of international scale, such as the climate crisis.

Even when a crisis of global proportions gives rise to organic expressions of mutual aid, our imagination has grown so restricted that we find ourselves framing the challenge in statist and national terms. Instead of seeing the whole of humanity rising to the challenge together, and observing the multi-layered outpouring of mutual aid, we encase it all in military language.

But that does not capture the full story. The human race will come out of COVID wiser if it does not frame its response in narrow military language.


What is populism – and why is it so hard to define?

This article is republished from The Conversation under a Creative Commons licence

by
Andy Knott, University of Brighton

We live in a moment in which the word “populism” is never far from the lips of politicians (although oh so rarely from those of the populist politicians themselves). We hear the word repeated over and over but, once we try to get a handle on what it actually means, confusion abounds. There are a few good reasons for this difficulty but, at the same time, the burgeoning academic community writing on populism has increasingly forged a consensus around at least the core features of the concept.

The first reason for the conceptual confusion is that words don’t neatly map onto their referents. There is a struggle over the meaning of key political terms and the predominant use of populism in politics and the media is derogatory. Established politicians and journalists dismiss populism as an aberrant infant intruding into and disrupting political normality.

Because populists don’t understand politics, according to this establishment view, the populist intrusion will be temporary. Voters will inevitably return to their senses and see through the seductive but hollow musings of this infantile intruder. This is why the signifier “populism” tends to be used by establishment figures – such as former British prime minister Tony Blair and former deputy prime minister Nick Clegg. And what they intend to signify by that word is that the public should reject populism. They are the anti-populists but, again, you don’t tend to hear those accused of being populist – Nigel Farage or Donald Trump, for instance – labelling themselves as such.

Invoking Blair and Clegg brings us to the second reason for populism’s conceptual confusion. Historically, populism has not been a permanent political phenomenon. It comes in waves. It disappears and reappears, usually coinciding with crisis (whether real or declared). What matters is that the people have to feel that crisis, have to recognise that the crisis designated by the interloping populist performer is upon us. And this time the crisis is also a crisis of the worldview that the likes of Blair and Clegg brought into being. When in power, Blair regularly likened the version of globalisation New Labour fostered to a force of nature. As sure as night follows day, globalisation was upon us, and the only valid response was to find a way to work within this unstoppable force.

Nationalism began to rise in Europe several decades back. It came in response to the establishment, consolidation and growth of the EU, and the decline of the continent encapsulated by decolonisation and the end of empires. Initially it was a trickle, but it grew inexorably throughout this century. Populists began to rail against postnational institutions such as the EU and UN and against international treaties that attempt to bind all nations (relating to climate change and other environmental factors). Globalisation no longer seems quite as inevitable as Blair claimed.

Rejecting the ‘elites’

In this shift from Blair’s globalisation to the reassertion of nationalism, something happened to the people. This is one of the most heavily contested concepts in politics, but under the calm of Blair’s rule, the people were viewed as one – both rulers and ruled got along with one another. Blair was declared the “man of the people” and he thought his popularity resulted from his being “a normal guy”. This is not how populists treat the people. For populists, the seamless harmony between the people and their rulers no longer holds. The people have been betrayed. A gulf has opened up between the people and the elites. Instead of unity, they have entered a conflictual relationship.

And it is this understanding of populism – the people pitched against elites – that has now become widespread among the academic community. But this is a somewhat limited or minimal presentation of what populism is, and once academics start expanding on it, they quickly start to disagree.

The most contentious issue is whether populism is an ideology, as Cas Mudde, the most quoted commentator on contemporary populism, claims. This would align populism with other political ideologies, such as liberalism, socialism and conservatism.

Yet liberalism has core identifiable features – the centrality of the individual (and not the people), human rights, the separation (and limitation) of powers. Populism does not have these.

The political scientist Benjamin Moffitt suggests populism is better understood as a style. It’s a manner or practice of doing politics. You identify (or declare) a crisis, invoke the people against elites, and so on. And because it is more of a style of politics than an ideology with content, there are several variants of it, most notably of the left and right. Syriza in Greece and Podemos in Spain are perhaps the most obvious left variants emerging in the aftermath of 2008 – although both Corbynism (far more than Jeremy Corbyn himself) and Bernie Sanders share certain affinities.

It is the right, however, especially in Europe and now the US under Trump, that is very much in the ascendancy. The right has proved highly effective at mobilising the national people against not only “the swamp” in Washington or Brussels, but also against those these elites are deemed to represent and protect: migrants primarily, but also other minority interests.

This is the final complicating factor about populism: alongside the people and the elites, there is a third group against which populists will direct their ire – migrants usually for the right; financial elites for the left. The success of right populists mobilising against the dual combination of Brussels elites and migrants (or minorities) explains why Viktor Orban is in power in Hungary and Matteo Salvini in Italy, and why European politics continues to be profoundly influenced by Farage, Marine Le Pen, Geert Wilders – and plenty more besides.


The dark side of plant-based food – it’s more about money than you may think

This article is republished from The Conversation under a Creative Commons licence

by
Martin Cohen, University of Hertfordshire and Frédéric Leroy, Vrije Universiteit Brussel

If you were to believe newspapers and dietary advice leaflets, you’d probably think that doctors and nutritionists are the people guiding us through the thicket of what to believe when it comes to food. But food trends are far more political – and economically motivated – than they seem.

From ancient Rome, where Cura Annonae – the provision of bread to the citizens – was the central measure of good government, to 18th-century Britain, where the economist Adam Smith identified a link between wages and the price of corn, food has been at the centre of the economy. Politicians have long had their eye on food policy as a way to shape society.

That’s why tariffs and other trade restrictions on imported food and grain were enforced in Britain between 1815 and 1846. These “corn laws” enhanced the profits and political power of the landowners, at the cost of raising food prices and hampering growth in other economic sectors.

Over in Ireland, the ease of growing the recently imported potato plant led to most people living off a narrow and repetitive diet of homegrown potato with a dash of milk. When potato blight arrived, a million people starved to death, even as the country continued to produce large amounts of food – for export to England.

Such episodes well illustrate that food policy has often been a fight between the interests of the rich and the poor. No wonder Marx declared that food lay at the heart of all political structures and warned of an alliance of industry and capital intent on both controlling and distorting food production.

Vegan wars

Many of today’s food debates can also be usefully reinterpreted when seen as part of a wider economic picture. For example, recent years have seen the co-option of the vegetarian movement into a political programme that can have the effect of perversely disadvantaging small-scale, traditional farming in favour of large-scale industrial farming.

This is part of a wider trend away from small and mid-size producers towards industrial-scale farming and a global food market in which food is manufactured from cheap ingredients bought in a global bulk commodities market that is subject to fierce competition. Consider the launch of a whole new range of laboratory-created “fake meats” (fake dairy, fake eggs) in the US and Europe, often celebrated for aiding the rise of the vegan movement. Such trends entrench the shift of political power away from traditional farms and local markets towards biotech companies and multinationals.

Estimates for the global vegan food market now expect it to grow by nearly 10% a year, reaching around US$24.3 billion by 2026. Figures like this have encouraged the giants of the agricultural industry to step in, having realised that the “plant-based” lifestyle generates large profit margins, adding value to cheap raw materials (such as protein extracts, starches and oils) through ultra-processing. Unilever is particularly active, offering nearly 700 vegan products in Europe.

Researchers at the US thinktank RethinkX predict that “we are on the cusp of the fastest, deepest, most consequential disruption” of agriculture in history. They say that by 2030, the entire US dairy and cattle industry will have collapsed, as “precision fermentation” – producing animal proteins more efficiently via microbes – “disrupts food production as we know it”.

Westerners might think that this is a price worth paying. But elsewhere it’s a different story. While there is much to be said for rebalancing western diets away from meat and towards fresh fruits and vegetables, in India and much of Africa animal-source foods are an indispensable part of maintaining health and obtaining food security, particularly for women and children and the 800 million poor who subsist on starchy foods.

To meet the 2050 challenges for quality protein and some of the most problematic micronutrients worldwide, animal-source foods remain fundamental. But livestock also plays a critical role in reducing poverty, increasing gender equity and improving livelihoods. Animal husbandry cannot be taken out of the equation in many parts of the world, where plant agriculture relies on livestock for manure, traction and waste recycling – that is, if the land allows sustainable crop growth in the first place. Traditional livestock gets people through difficult seasons, prevents malnutrition in impoverished communities and provides economic security.

Follow the money

Often, those championing vegan diets in the west are unaware of such nuances. In April 2019, for example, the Canadian conservation scientist Brent Loken addressed India’s Food Standards Authority on behalf of EAT-Lancet’s “Great Food Transformation” campaign, describing India as “a great example” because “a lot of the protein sources come from plants”. Yet such talk in India is far from uncontroversial.

The country ranks 102nd out of 117 qualifying countries on the Global Hunger Index, and only 10% of its infants aged 6–23 months are adequately fed. While the World Health Organization recommends animal-source foods as sources of high-quality nutrients for infants, food policy there spearheads an aggressive new Hindu nationalism that has led to many of India’s minority communities being treated as outsiders. Even eggs in school meals have become politicised. Here, calls to consume fewer animal products are part of a deeply vexed political context.

Likewise, in Africa, food wars are seen in sharp relief as industrial scale farming by transnationals for crops and vegetables takes fertile land away from mixed family farms (including cattle and dairy), and exacerbates social inequality.

The result is that today, private interest and political prejudices often hide behind the grandest talk of “ethical” diets and planetary sustainability even as the consequences may be nutritional deficiencies, biodiversity-destroying monocultures and the erosion of food sovereignty.

For all the warm talk, global food policy is really an alliance of industry and capital intent on both controlling and distorting food production. We should recall Marx’s warnings against allowing the interests of corporations and private profit to decide what we should eat.


History of the two-day weekend offers lessons for today’s calls for a four-day week

This article is republished from The Conversation under a Creative Commons licence

by
Brad Beaven, University of Portsmouth

The idea of reducing the working week from an average of five days to four is gaining traction around the world. Businesses and politicians have been considering a switch to fewer but more productive working hours. But the idea has also been derided.

As a historian of leisure, it strikes me that there are a number of parallels between debates today and those that took place in the 19th century when the weekend as we now know it was first introduced. Having Saturdays as well as Sundays off work is actually a relatively modern phenomenon.

Throughout the 19th century, government legislation reduced working hours in factories and prescribed regular breaks. But the weekend did not simply arise from government legislation – it was shaped by a combination of campaigns: some led by half-day holiday movements, others by trade unions, commercial leisure companies and employers themselves. The formation of the weekend in Britain was a piecemeal and uneven affair that had to overcome unofficial popular traditions that punctuated the working week during the 19th century.

‘Saint Monday’

For much of the 19th century, for example, skilled artisan workers adopted their own work rhythms as they often hired workshop space and were responsible for producing items for their buyer on a weekly basis. This gave rise to the practice of “Saint Monday”. While Saint Monday mimicked the religious Saint Day holidays, it was in fact an entirely secular practice, instigated by workers to provide an extended break in the working week.

They worked intensively from Tuesday onwards to finish products by Saturday night, enjoyed Sunday as a legitimate holiday, and then took Monday off as well to recover from the weekend’s excesses. By the mid-19th century, Saint Monday was a popular institution in British society – so much so that commercial leisure venues like music halls, theatres and singing saloons staged events on this unofficial holiday.

Workers in the early factory system also adopted the tradition of Saint Monday, despite manufacturers consistently opposing the practice, as it hurt productivity. But workers had a religious devotion to the unofficial holiday, which made it difficult for masters to break the habit. It continued to thrive into the 1870s and 1880s.

Nonetheless, religious bodies and trade unions were keen to instil a more formal holiday in the working week. Religious bodies argued that a break on Saturday would improve working class “mental and moral culture”. For example, in 1862 Reverend George Heaviside captured the optimistic tone of many religious leaders when, writing in the Coventry Herald newspaper, he claimed a weekend would allow for a refreshed workforce and greater attendance at church on Sundays.

Trade unions, meanwhile, wanted to secure a more formalised break in the working week that did not rely on custom. Indeed, the creation of the weekend is still cited as a proud achievement in trade union history.

In 1842 a campaign group called the Early Closing Association was formed. It lobbied government to keep Saturday afternoon free for worker leisure in return for a full day’s work on Monday. The association established branches in key manufacturing towns and its membership was drawn from local civic elites, manufacturers and the clergy. Employers were encouraged to establish half-day Saturdays as the Early Closing Association argued it would foster a sober and industrious workforce.

Trade unions and workers’ temperance groups also saw the half-day Saturday as a vehicle to advance working-class respectability. It was hoped workers would shun drunkenness and brutal sports like cock fighting, which had traditionally been associated with Saint Monday.

For these campaigners, Saturday afternoon was singled out as the day on which the working classes could enjoy “rational recreation”, a form of leisure designed to draw the worker from the public house and into elevating and educational pursuits. For example, in Birmingham during the 1850s, the association wrote in the Daily News newspaper that Saturday afternoons would benefit men and women who could:

Take a trip into the country, or those who take delight in gardening, or any other pursuit which requires daylight, could usefully employ their half Saturday, instead of working on the Sabbath; or they could employ their time in mental or physical improvements.

Business opportunity

Across the country a burgeoning leisure industry saw the new half-day Saturday as a business opportunity. Train operators embraced the idea, charging reduced fares for day-trippers to the countryside on Saturday afternoons. With increasing numbers of employers adopting the half-day Saturday, theatres and music halls also switched their star entertainment from a Monday to Saturday afternoon.

Perhaps the most influential leisure activity to help forge the modern week was the decision to stage football matches on Saturday afternoon. The “Football Craze”, as it was called, took off in the 1890s, just as the new working week was beginning to take shape. So Saturday afternoons became a very attractive holiday for workers, offering cheap excursions and exciting new forms of leisure.

The adoption of the modern weekend was neither swift nor uniform as, ultimately, the decision for a factory to adopt the half-day Saturday rested with the manufacturer. Campaigns for an established weekend had begun in the 1840s, but the practice did not gain widespread adoption for another 50 years.

By the end of the 19th century, there was an irresistible pull towards marking out Saturday afternoon and Sunday as the weekend. While they had their different reasons, employers, religious groups, commercial leisure and workers all came to see Saturday afternoon as an advantageous break in the working week.

This laid the groundwork for the full 48-hour weekend as we now know it – although this was only established in the 1930s. Once again, it was embraced by employers who found that the full Saturday and Sunday break reduced absenteeism and improved efficiency.
