MONTHLY BLOG 54, POST-ELECTION SPECIAL: ON LOSING? 1

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

Losing is not fun. That is, losing in a cause that means a lot to you, both intellectually and emotionally. In fact, it’s grim. Coping with defeat feels a bit like coping with a death. And, since there are deaths and deaths, losing in a cause that means a lot feels very like coping with the unexpected death of someone close. So the first and unavoidable response is to grieve, rather grimly.

This BLOG arises from my personal feelings about Labour’s loss in Battersea at the General Election on 7 May 2015. It’s a short report, because grieving makes me feel distinctly brisk. For the record, we had a lively campaign. But we lost with a perceptible swing from Labour to the Conservatives, alongside a (predicted) heavy fall in Liberal Democrat votes, and a weak but not negligible showing for the Greens and UKIP.2

By the way, since the election I have attended a Wandsworth civic event where I bumped into Jane Ellison, Battersea’s Tory MP who was duly returned, and congratulated her. She was gracious (not hard when victorious but better than gloating). Jane Ellison also commented on the amount of venomous personal abuse that she had encountered via Twitter during the campaign. I was disgusted to hear that. I’d already heard of some personal abuse in the Twittersphere being directed at Will Martindale, the Labour candidate; and afterwards I learned that he too had received many unpleasant personal accusations, mostly made anonymously. That doubly depressing news made me feel even grimmer. It’s a tough baptism for all people in the public spotlight and especially tough on the electoral losers, who don’t have the consolation of victory at the polls. This half-hidden dimension to the electoral struggle made me appreciate yet again the dangers of anonymity and the civic advantages in open declarations of political allegiance, as was the norm in the pre-1872 practice of open voting.3

Anyway, after a public loss, the next task is to get on with all the business which follows. Clearing out the campaign headquarters comes high on the list. It’s a bit macabre but best done quickly. Undelivered leaflets seem especially sad, registering obsolete hopes. But it’s essential to keep an archive for our own record.

There’s also an amazing quantity of personal junk which accumulates in places where hundreds of strangers pop in to help. Lost umbrellas and bags are relatively explicable. But who leaves a vanity case with hundreds of lipsticks? Who leaves a sack of old clothing? Did some canvasser arrive fully dressed and leave starkers?

Perhaps it’s symbolic of rebirth after trauma? Certainly that would fit with the third stage, since, after grimly grieving and clearing the decks, it’s time for a major rethink. I now feel more brisk than grim. We’ve already had some meetings and more are planned, to assess what happened and where we should go next. Happily, large numbers have continued to attend. The feelings of outrage at the inequalities and unfairness of life in Battersea, which motivated many of Labour’s canvassers, have not gone away. And why would they? The extremes of wealth and poverty, side by side, remain stark. One of the Wandsworth foodbanks, run by dedicated volunteers in St Mark’s Church on Battersea Rise, helps a regular stream of desperate guests,4 while on the Thames riverfront a series of soulless ‘ghost towns’ of empty flats,5 held by absentee investors and potential money-launderers,6 mocks the concept of community.

I have my own suggestion (in which I am not alone): I think that the Labour Party needs to update its language and its name. As part of that, I also favour a re-alliance of the progressive Left: Labour, plus Liberal Democrats, plus Greens. As it happens, the votes of all these parties would not have ousted the Conservative candidate from the Battersea seat. But the (re)birth of a progressive, redistributive, co-operative, green and libertarian centre-left is still the best long-term answer, for British political and cultural life as a whole. It has been much discussed over the years. But now’s the time, fellow Britons: political leaders and grass-roots alike. We should follow the pithy message from the Swedish-American Labour activist Joe Hill: ‘Don’t mourn; but organise’7 … with a new popular front.
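As a quick check on that claim, here is a minimal arithmetic sketch in Python, using the constituency figures given in note 2 below (the variable names are mine, for illustration only):

    # Battersea, General Election 2015: figures from note 2 below
    votes = {
        'Conservative': 26730,
        'Labour': 18792,
        'LibDem': 2241,
        'Green': 1682,
        'UKIP': 1586,
    }

    # Combined centre-left tally: Labour + Liberal Democrat + Green
    progressive = votes['Labour'] + votes['LibDem'] + votes['Green']
    print(progressive)                          # 22715
    print(progressive < votes['Conservative'])  # True: still 4,015 votes short

    # Conventional two-party swing, from the share changes since 2010
    swing_to_con = (5.0 - 1.7) / 2              # Con up 5.0%, Lab up 1.7%
    print(swing_to_con)                         # 1.65 (% swing, Labour to Conservative)

So even a full re-alliance of those votes would not, on these figures, have taken the seat; the argument is for the long term, not for this result.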

Swedish-American Labour activist Joe Hill: don’t mourn but organise (song 1930).

1 With comradely sympathy for Will Martindale (Battersea Labour candidate), Sean Lawless (BLP organiser) and the hundreds of Labour campaigners in the constituency.

2 Battersea’s votes, in a turnout of 67.1%, went to: Jane ELLISON (Conservative) 26,730 – up 5.0% from 2010; Will MARTINDALE (Labour) 18,792 – up 1.7%; Luke TAYLOR (LibDem) 2,241 – down 10.3%; Joe STUART (Green) 1,682; and Christopher HOWE (UKIP) 1,586.

3 But there were good reasons for adopting the secret ballot: see PJC, ‘What’s Wrong with the Old Practice of Open Voting? Standing Up to be Counted’ BLOG no 53 (May 2015).

4 Project seeded by the Trussell Trust: see www.wandsworth.foodbank.org.uk.

5 See Vauxhall Society (July 2013), www.vauxhallcivicsociety.org.uk/2013/07/, for the viewpoint of Peter Rees, City of London chief planning officer, that under current redevelopment plans Vauxhall will be getting a ‘ghost town’ which would need no more than a ‘single-decker bus once an hour’, not the projected Northern Line [tube] Extension.

6 Zoe Dare Hall, ‘Prime London and the Threat of Money Laundering’, 3 June 2015, in The London Magazine: reported in www.harrodsestates.com/news/361/prime-london.

7 Joel Hagglund, known as Joe Hill (1879-1915), hymned in ‘I Dreamed I Saw Joe Hill Last Night’ (lyrics Alfred Hayes c.1930; music Earl Robinson 1936), a song especially loved by my father Tony Corfield, a lifelong activist in trade-unionism and adult education.


MONTHLY BLOG 53, ELECTION SPECIAL: WHAT’S WRONG WITH THE OLD PRACTICE OF OPEN VOTING, STANDING UP TO BE COUNTED? 1

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

Vote early! Generations of democratic activists have campaigned over centuries to give the franchise to all adult citizens. (Yes, and that right should extend to all citizens who are in prison too).2  Vote early and be proud to vote!

So, if we are full of civic pride or even just wearily acquiescent, why don’t we vote openly? Stand up to be counted? That is, after all, how the voting process was first done. In most parliamentary elections in pre-democratic England (remembering that not all seats were regularly contested), the returning officer would simply call for a show of hands. If there was a clear winner, the result would be declared instantly. But in cases of doubt or disagreement a head-by-head count was ordered. It was known as a ‘poll’. Each elector in turn approached the polling booth, identified his qualifications for voting, and called his vote aloud.3

William Hogarth’s Oxfordshire Election (1754) satirised the votes of the halt, the sick and the lame. Nonetheless, he shows the process of open voting in action, with officials checking the voters’ credentials, lawyers arguing, and candidates (at the back of the booth) whiling away the time, as voters declare their qualifications and call out their votes.

Open voting was the ‘manly’ thing to do, both literally and morally. Not only was the franchise, for many centuries, restricted to men;4 but polling was properly viewed as an exercise of constitutional virility. The electoral franchise was something special. It was a trust, which should be exercised accountably. Hence an Englishman should be proud to cast his vote openly, argued the liberal philosopher John Stuart Mill in 1861.5 He should cast his vote for the general good, rather than his personal interest. In other words, the elector was acting as a public citizen, before the eyes of the world – and, upon important occasions, his neighbours did come to hear the verdict being delivered. Furthermore, in many cases the Poll Books were published afterwards, so generating a historical record not only for contemporaries to peruse, and for canvassers to use at the following election, but also for later historians to study individual-level voting (something impossible under today’s secret ballot).

Especially in the populous urban constituencies, some of the most protracted elections became carnival-like events.6  Crowds of voters and non-voters gathered at the open polling booths to cheer, heckle or boo the rival candidates. They sported election ribbons or cockades; and drank at the nearby hostelries. Since polling was sometimes extended over several days, running tallies of the state of the poll were posted daily, thus encouraging further efforts from the canvassers and the rival crowds of supporters. Sometimes, indeed, the partisanship got out of hand. There were election scuffles, affrays and even (rarely) riots. But generally, the crowds were good-humoured, peaceable and even playful. In a City of Westminster parliamentary by-election in 1819, for example, the hustings oratory from the candidate George Lamb was rendered inaudible by incessant Baaing from the onlookers. It was amusing for everyone but the candidate, though he did at least win.7

Performing one’s electoral duty openly was a practice that was widely known in constitutionalist systems around the world. Open voting continued in Britain until 1872; in some American states until 1898; in Denmark until 1900; in Prussia until 1918; and, remarkably, in Hungary until 1938.

Not only did the voter declare his stance publicly but the onlookers were simultaneously entitled to query his right to participate. Then the polling clerks, who sat at the hustings to record each vote, would check in the parish rate books (or other appropriate records, depending on the local variant of the franchise) before the vote was cast.8 In the event of a subsequent challenge, moreover, the process was subject to vote-by-vote scrutiny. One elector at a parliamentary by-election in Westminster in 1734 was accused by several witnesses of being a foreigner. He was said to have a Dutch accent, a Dutch coat, and to smoke his pipe ‘like a Dutchman’. Hence ‘it is the common repute of the neighbourhood that he is a Dutchman’.9 In fact, the suspect, named Peter Harris, was a chandler living in Wardour Street and he outfaced his critics. The neighbours’ suspicions were not upheld and the vote remained valid. Nonetheless, public opinion had had a chance to intervene. Scrutiny of the electoral process remains crucial, now as then.

Illustration 2: British satirical cartoon of Mynheer Van Funk, a Dutch Skipper (1730).
Was this what Peter Harris, of Wardour Street, Westminster, looked like?

Well then, why has open voting in parliamentary elections disappeared everywhere? There are good reasons. But there is also some loss as well as gain in the change. Now people can make a parade of their commitment (say) to some fashionable cause and yet, sneakily, vote against it in the polling booth. Talk about having one’s cake and eating it. That two-ways-facing factor explains why sometimes prior opinion polls or even immediate exit polls can give erroneous predictions of the actual result.

Overwhelmingly, however, the secret ballot was introduced to allow individual voters to withstand external pressures, which might otherwise encourage them to vote publicly against their true inner convictions. In agricultural constituencies, tenants might be unduly influenced by the great local landlord. In single-industry towns, industrial workers might be unduly influenced by the big local employer. In service and retail towns, shopkeepers and professionals might be unduly influenced by the desire not to offend rich clients and customers. And everywhere, voters might be unduly influenced by the power of majority opinion, especially if loudly expressed by crowds pressing around the polling booth.

For those reasons, the right to privacy in voting was one of the six core demands made in the 1830s by Britain’s mass democratic movement known as Chartism.10 In fact, it was the first plank of their programme to be implemented. The Ballot Act was enacted in 1872, long before all adult males – let alone all adult females – had the vote. It was passed just before the death in 1873 of John Stuart Mill, who had tried to convince his fellow reformers to retain the system of open voting. (By the way, five points of the six-point Chartist programme have today been achieved, although the Chartist demand for annual parliaments remains unmet and is not much called for these days).

Does the actual voting process really matter? Secrecy allows people to get away with things that they might not wish to acknowledge publicly. They can vote frivolously and disclaim responsibility. Would the Monster Raving Loony Party get as many votes as it does (admittedly, not many) under a system of open voting? But I suppose that such votes are really the equivalent of spoilt ballot papers.

In general, then, there are good arguments, on John Stuart Millian grounds, for favouring public accountability wherever possible. MPs in Parliament have their votes recorded publicly – and rightly so. Indeed, in that context, it was good to learn recently that a last-minute bid by the outgoing Coalition Government of 2010-15 to switch the electoral rules for choosing the next Speaker from open voting to secret ballot was defeated, by a majority of votes from Labour plus 23 Conservative rebels and 10 Liberal Democrats. One unintentionally droll moment came when the MP moving the motion for change, the departing Conservative MP William Hague, defended the innovation as something ‘which the public wanted’.11

Electoral processes, however, rarely concern electors as much as they should. Overall, there is a good case for using the secret ballot in all mass elections, to avoid external pressures upon the voters. There is also a reasonable case for secrecy when individuals are voting, in small groups, clubs, or societies, to elect named individuals to specific offices. Otherwise, it might be hard (say) not to vote for a friend who is not really up to the job. (But MPs choosing the Speaker are voting as representatives of their constituencies, to whom their votes should be accountable). In addition, the long-term secrecy of jury deliberations and votes is another example that is amply justified, in order to free jurors from intimidation or subsequent retribution.

But, in all circumstances, conscientious electors should always cast their votes in a manner that they would be prepared to defend, were their decision known publicly. And, in all circumstances, the precise totals of votes cast in secret ballots should be revealed. The custom in some small societies or groups, to announce merely that X or Y is elected but to refrain from reporting the number of votes cast, is open to serious abuse. Proper scrutiny of the voting process and the outcome is the democratic essence, along with fair electoral rules.

In Britain, as elsewhere, there is still scope for further improvements to the workings of the system. The lack of thoroughness in getting entitled citizens onto the voting register is the first scandal, which should be tackled even before the related question of electoral redistricting to produce much greater equality in the size of constituencies. It’s also essential to trust the Boundaries Commission which regularly redraws constituency boundaries (one of the six demands of the Chartists) to do so without political interference and gerrymandering. There are also continuing arguments about the rights and wrongs of the first-past-the-post system as compared with various forms of Alternative Voting.

Yet we are on a democratic pathway… Hence, even if parliamentary elections are no longer occasions for carnival crowds to attend as collective witnesses at the hustings, let’s value our roles individually. The days of open voting showed that there’s enjoyment to be found in civic participation.

Thomas Rowlandson’s Westminster Election (published 1808), showing the polling booths in front of St Paul’s Covent Garden – and the carnivalesque crowds, coming either to vote or to witness.

1 With warm thanks to Edmund Green for sharing his research, and to Tony Belton, Helen Berry, Arthur Burns, Amanda Goodrich, Charles Harvey, Tim Hitchcock, Joanna Innes, and all participants at research seminars at London and Newcastle Universities for good debates.

2 On this, see A. Belton, BLOG entitled ‘Prisoners and the Right to Vote’, (2012), tonybelton.wordpress.com/2012/12/04/prisoners-and-the-right-to-vote/.

3 See J. Elklit, ‘Open Voting’, in R. Rose (ed.), International Encyclopaedia of Elections (2000), pp. 191-3; and outcomes of open voting in metropolitan London, 1700-1850, in www.londonelectoralhistory.com, incl. esp. section 2.1.1.

4 In Britain, adult women aged over 30 first got the vote for parliamentary elections in 1918; but women aged between 21 and 30 (the so-called ‘flappers’) not until 1928.

5 J.S. Mill, Considerations on Representative Government (1861), ed. C.V. Shields (New York, 1958), pp. 154-71.

6 See F. O’Gorman, Voters, Patrons and Parties: The Unreformed Electorate of Hanoverian England, 1734-1832 (Oxford, 1989).

7 British Library, Broughton Papers, Add. MS 56,540, fo. 55. Lamb then lost the seat at the next general election in 1820.

8 Before the 1832 Reform Act, there was no standardised electoral register; and many variant franchises, especially in the parliamentary boroughs.

9 Report of 1734 Westminster Scrutiny in British Library, Lansdowne MS 509a, fos. 286-7.

10 For a good overview, consult M. Chase, Chartism: A New History (Manchester, 2007).

11 BBC News, 26 March 2015: www.bbc.co.uk/news/uk-politics-32061097.


MONTHLY BLOG 38, WHY IS THE LANGUAGE OF ‘RACE’ HOLDING ON SO LONG WHEN IT’S BASED ON A PSEUDO-SCIENCE?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

Of course, most people who continue to use the language of ‘race’ believe that it has a genuine meaning – and a meaning, moreover, that resonates for them. It’s not just an abstract thing but a personal way of viewing the world. I’ve talked to lots of people about giving up ‘race’ and many respond with puzzlement. The terminology seems to reflect nothing more than the way things are.

But actually, it doesn’t. It’s based upon a pseudo-science that was once genuinely believed but has long since been shown to be erroneous by geneticists. So why is this language still used by people who would not dream of insisting that the earth is flat, or that the moon is made of blue cheese?

Part of the reason is no doubt the power of tradition and continuity – a force of history that is often under-appreciated.1 It’s still possible to hear references to people having the ‘bump of locality’, meaning that they have a strong topographical/spatial awareness and can find their way around easily. The phrase sounds somehow plausible. Yet it’s derived from the now-abandoned study of phrenology. This approach, first advanced in 1796 by the German physician F.J. Gall, sought to analyse people’s characteristics via the contours of the cranium.2 It fitted with the ‘lookism’ of our species. We habitually scrutinise one another to detect moods, intentions, characters. So it may have seemed reasonable to measure skulls for the study of character.

Phrenologist’s view of the human skull: point no. 31 marks the bump of locality, just over the right eyebrow.

Yet, despite confident Victorian publications explaining The Science of Phrenology3 and advice manuals on How to Read Heads,4 these theories turned out to be no more than a pseudo-science. The critics were right after all. Robust tracts like Anti-Phrenology: Or a Chapter on Humbug won the day.5 Nevertheless, some key phrenological phrases linger on. My own partner in life has an exceptionally strong sense of topographical orientation. So sometimes I joke about his ‘bump of locality’, even though there’s no protrusion on his right forehead. It’s just a linguistic remnant of vanished views.

That pattern may apply similarly in the language of race, which is partly based upon a simple ‘lookism’. People who look like us are assumed to be part of ‘our tribe’. Those who do not look like us seem to be ‘a race apart’ (except that they are not). The survival of the phrasing is thus partly a matter of inertia.

Another element may also spring, paradoxically, from opponents of ‘racial’ divisions. They are properly dedicated to ‘anti-racism’. Yet they don’t oppose the core language itself. That’s no doubt because they want to confront prejudices directly. They accept that humans are divided into separate races but insist that all races should be treated equally. It seems logical therefore that the opponent of a ‘racist’ should be an ‘anti-racist’. Statistics of separate racial groups are collected in order to ensure that there is no discrimination.

Yet one sign of the difficulty in all official surveys remains the utter lack of consistency as to how many ‘races’ there are. Early estimates by would-be experts on racial classification historically ranged from a simplistic two (‘black’ and ‘white’) to a complex 63.6 Census and other listings these days usually invent a hybrid range of categories. Some are based upon ideas of race or skin colour; others upon nationality; or a combination. And there are often lurking elements of ‘lookism’ within such categories (‘black British’), dividing people by skin colour, even within the separate ‘races’.7

So people like me who say simply that ‘race’ doesn’t exist (i.e. that we are all one human race) can seem evasive, or outright annoying. We are charged with missing the realities of discrimination and failing to provide answers.

Nevertheless, I think that trying to combat a serious error by perpetrating the same error (even if in reverse) is not the right way forward. The answer to pseudo-racism is not ‘anti-racism’ but ‘one-racism’. It’s ok to collect statistics about nationality or world-regional origins or any combination of such descriptors, but without the heading of ‘racial’ classification and the use of phrases that invoke or imply separate races.

Public venues in societies that historically operated a ‘colour bar’ used the brown paper bag test for quick decisions, admitting people with skins lighter than the bag and rejecting the rest. As a means of classifying people, it’s as ‘lookist’ as phrenology but with even fewer claims to being ‘scientific’. Copyright © Jessica C (Nov. 2013)

What’s in a word? And the answer is always: plenty. ‘Race’ is a short, flexible and easy term to use. It also lends itself to quickly comprehensible compounds like ‘racist’ or ‘anti-racist’. Phrases derived from ethnicity (national identity) sound much more foreign in English. And an invented term like ‘anti-ethnicism’ seems abstruse and lacking instant punch.

All the same, it’s time to find or to create some up-to-date phrases to allow for the fact that racism is a pseudo-science that lost its scientific rationale a long time ago. ‘One-racism’? ‘Humanism’? It’s more powerful to oppose discrimination in the name of reality, instead of perpetrating the wrong belief that we are fundamentally divided. The spectrum of human skin colours under the sun is beautiful, nothing more.

1 On this, see esp. PJC website BLOG/1 ‘Why is the Formidable Power of Continuity so often Overlooked?’ (Nov. 2010).

2 See T.M. Parssinen, ‘Popular Science and Society: The Phrenology Movement in Early Victorian Britain’, Journal of Social History, 8 (1974), pp. 1-20.

3 J.C. Lyons, The Science of Phrenology (London, 1846).

4 J. Coates, How to Read Heads: Or Practical Lessons on the Application of Phrenology to the Reading of Character (London, 1891).

5 J. Byrne, Anti-Phrenology: Or a Chapter on Humbug (Washington, 1841).

6 P.J. Corfield, Time and the Shape of History (London, 2007), pp. 40-1.

7 The image comes from Jessica C’s thoughtful website, ‘Colorism: A Battle that Needs to End’ (12 Nov. 2013): www.allculturesque.com/colorism-a-battle-that-needs-to-end.


MONTHLY BLOG 37, HOW DO PEOPLE RESPOND TO ELIMINATING THE LANGUAGE OF ‘RACE’?

 If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

 Having proposed eliminating from our thoughts and vocabulary the concept of ‘race’ (and I’m not alone in making that suggestion), how do people respond?

Indifference: we are all stardust. Many people these days shrug. They say that the word ‘race’ is disappearing anyway, and what does it matter?

Indeed, a friend with children who are conventionally described as ‘mixed race’ tells me that these young people are not worried by their origins and call themselves, semi-jokingly, ‘mixed ray’. It makes them sound like elfin creatures from the sun and stars – rather endearing really. Moreover, such a claim resonates with the fact that many astro-biologists today confirm that all humans (among all organic and inorganic matter on earth) are ultimately made from trace elements from space – or, put more romantically, from stardust.1

So from a cosmic point of view, there’s no point in worrying over minor surface differences within one species on a minor planet, circulating around a minor sun, which itself lies in but one quite ordinary galaxy within a myriad of galaxies.

Ethnic pride: On the other hand, we do live specifically here, on earth. And we are a ‘lookist’ species. So others give more complex responses, dropping ‘race’ for some purposes but keeping it for others. Given changing social attitudes, the general terminology seems to be disappearing imperceptibly from daily vocabulary. As I mentioned before, describing people as ‘yellow’ and ‘brown’ has gone. Probably ‘white’ will follow next, especially as lots of so-called ‘whites’ have fairly dusky skins.

‘Black’, however, will probably be the slowest to go. Here there are good as well as negative reasons. Numerous people from Africa and from the world-wide African diaspora have proudly reclaimed the terminology, not in shame but in positive affirmation.

Battersea’s first ‘black’ Mayor, John Archer (Mayor 1913/14), was a pioneer in that regard. I mentioned him in my previous BLOG (no 36). Archer was a Briton, with Irish and West Indian ancestry.2 He is always described as ‘black’ and he himself embraced black consciousness-raising. Yet he always stressed his debt to his Irish mother as well as to his Barbadian father.

In 1918 Archer became the first President of the African Progress Union. In that capacity, he attended meetings of the Pan-African Congress, which promoted African decolonisation and development. The political agenda of activists who set up these bodies was purposive. And they went well beyond the imagery of negritude by using a world-regional nomenclature.

Interestingly, therefore, the Pan-African Congress was attended by men and women of many skin colours. Look at the old photograph (1921) of the delegates from Britain, continental Europe, Africa and the USA (see Illus 1). Possibly the dapper man, slightly to the left of centre in the front row, holding a portfolio, is John Archer himself.

Illus 1: Pan-African Congress delegates in Brussels (1921)

Today, ‘black pride’, which has had a good cultural run in both Britain and the USA, seems to be following, interestingly, in Archer’s footsteps. Not by ignoring differences but by celebrating them – in world-regional rather than skin-colourist terms. Such labels also have the merit of flexibility, since they can be combined to allow for multiple ancestries.

Just to repeat the obvious: skin colour is often deceptive. Genetic surveys reveal high levels of ancestral mixing. As the American academic Henry Louis Gates has recently reminded us in The Observer,3 many Americans with dark skins (35% of all African American men) have European as well as African ancestry. And the same is true, on a lesser scale, in reverse. At least 5% of ‘white’ male Americans have African ancestry, according to their DNA.

Significantly, people with mixed ethnicities often complain at being forced to choose one or the other (or having choice foisted upon them), when they would prefer, like the ‘Cablinasian’ Tiger Woods, to celebrate plurality. Pride in ancestry will thus outlast and out-invent erroneous theories of separate ‘races’.

Just cognisance of genetic and historic legacies: There is a further point, however, which should not be ignored by those (like me) who generally advocate ‘children of stardust’ universalism. For some social/political reasons, as well as for other medical purposes, it is important to understand people’s backgrounds.

Thus ethnic classifications can help to check against institutionalised prejudice. And they also provide important information in terms of genetic inheritance. To take one well-known example, sickle-cell anaemia (drepanocytosis) is a condition that can be inherited by humans whose ancestors lived in tropical and sub-tropical regions where malaria is or was common.4 It is obviously helpful, therefore, to establish people’s genetic backgrounds as accurately as possible.

All medical and social/political requirements for classification, however, call for just classification systems. One reader of my previous BLOG responded that it didn’t really matter, since if ‘race’ was dropped another system would be found instead. But that would constitute progress. The theory of different human races turned out to be erroneous. Instead, we should enquire about ethnic (national) identity and/or world-regional origins within one common species. Plus we should not use a hybrid mix of definitions, partly by ethnicities and partly by skin colour (as in ‘black Britons’).

Lastly, all serious systems of enquiry should ask about plurality: we have two parents, who may or may not share common backgrounds. That’s the point: men and women from any world-region can breed together successfully, since we are all one species.

1 S. Kwok, Stardust: The Cosmic Seeds of Life (Heidelberg, 2013).

2 For John Richard Archer (1869-1932), see biog. by P. Fryer in Oxford Dictionary of National Biography: on-line; and entry on Archer in D. Dabydeen, J. Gilmore and C. Jones (eds), The Oxford Companion to Black British History (Oxford, 2007), p. 33.

3 The Observer (5 Jan. 2014): New Review, p. 20.

4 M. Tapper, In the Blood: Sickle Cell Anaemia and the Politics of Race (Philadelphia, 1999).


MONTHLY BLOG 36, TALKING OF LANGUAGE, IT’S TIME TO UPDATE THE LANGUAGE OF RACE

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Names matter. Geneticists have long told us that all humans form part of one big human race.1  Indeed, we share biological characteristics not only with one another but also with a surprising number of other species. Nature is versatile in its ability to try many elegant variations within the common building blocks of life. As a result, the old view, that there were many separate human species, which were incapable of inter-marrying and inter-breeding, has gone. So we should not still talk or think in terms of there being different races of humans. It’s simply not true. Continuing to talk that way is like talking of the flat earth or insisting that the moon is made of blue cheese.

It therefore follows that we should not describe individuals as being of ‘mixed race’. The phrase is not only scientifically erroneous but positively misleading. It is a hangover from older ideas. Some early global explorers were impressed by our common humanity. Others, in good faith, saw different races. But the latter group proved to be wrong. Hence the logic is clear. Since there are no separate races, individuals cannot be mixtures of separate races. We are all one people. All ultimately in the human diaspora ‘out of Africa’.

Out of Africa diagram © Tom Moore

Nonetheless, there are different heritages and variegated group experiences within one common human history. How can we talk of those? One possible way is to refer to different ‘peoples’ within one species. But that terminology easily becomes confusing. Another old vocabulary talked of different ‘tribes’. Yet that too is unhelpful. If ‘peoples’ seem too nebulous and vague, then ‘tribes’ seem too small and sectarian. And in neither case is it easy to talk about compound heritages, whether from ‘mixed tribes’ or ‘mixed peoples’.

In fact, a number of Victorian social anthropologists spent a lot of time trying without success to classify the world’s ‘races’. But no criteria worked systematically, not only because people are intermixed today but also because we have a long history of intermixture. So there was no consensus about the number of different ‘races’. Various criteria were proposed, including skin colour, hair texture, average heights, cranial (skull) formation, nose-shapes, and testable intelligence. But these all yielded different and inconsistent answers.3

Estimates of the number of different human ‘races’ can be found from as low as two (‘black’ and ‘white’) to as high as 63. Such a range of guesstimates indicates not just that the task was hard but that it was impossible. For example, the many subtle variations in the handsome spectrum of human skin colours, from lily to ebony, make drawing up hard-and-fast divisions based upon colour a subjective and fallible exercise.

Interestingly, most of the proposed criteria for racial identification were solely external and ‘lookist’. But it’s hardly a secret that external appearance is no automatic guide to parentage. In countries where there were colour bars, plenty of people who were classified as black were able to ‘pass’ for white and vice versa.4  There are many permutations of looks and skin colour, even amongst very close family. Look around your own.

Or consider some public examples. A current member of the British cabinet, the Conservative politician Iain Duncan Smith, has a Japanese great-grandmother.5 But you would not guess that he is one-eighth Japanese at a quick glance. Or take a different case: the twin British girls born to Kylee Hodgson and Remi Horder in 2005 have contrasting skin and eye colours, one being dark-skinned and brown-eyed, the other having light colouring and blue eyes.6 Their parents view them proudly as a genetic gift. But a stranger would not know that the girls are sisters – let alone twins – simply by looking at them.

Does it matter? Not at all, for any human who accepts humanity as we are. It only matters for those who mind about such things, usually with hostile intent towards one or other of the attributed ‘racial’ categories. Indeed, some cultures do still maintain elaborate hierarchies of public status, tending to view those with light skin as ‘higher’ than those with darker hues.7  Such attitudes are, however, historic legacies of cultural classification that are not related to innate human qualities. For that reason, plenty of people reject a colourist world-view. The long history of caste fluidity and inter-caste marriage indicates that old cultural assumptions can be overcome – or shed entirely.

At the same time, we do need to acknowledge variety in ancestry and ethnicity. There are some medical conditions that are associated with particular genetic clusters. So some form of reference is needed. In my view, the ‘lookist’ language of skin colour, though still widely used, is historically on the way out as a means of classification. It is too crude and, currently, too socially sensitive. We don’t now refer to ‘yellows’, ‘browns’ or ‘coloured’. And, in my view, references to ‘white’ and ‘black’ will also go the way of history.

That prediction relates especially to how we name others. Some may want to retain the badge of colour as a proud form of self-identification, especially when it’s done to challenge old prejudices. But such labels may still be misleading. Particularly in the USA, where mobility and inter-marriage are rife, many dark-skinned people turn out to have very diverse parentage, with ancestors who don’t look like them but are still ancestors. Read Neil Henry’s account of A Black Man’s Search for his White Family: the upwardly mobile ‘black’ professional traced his socially declining ‘white trash’ cousins. But when they met, after the initial surprise on all sides, it was just normal.8

What then remains? The obvious forms of recognising difference relate to what we call ‘ethnicity’, pertaining to the many different human nations. That form of identification covers both biological and cultural affinities. So ‘ethnicity’ is not just a grand term for race. Instead, it’s an alternative way of recognising the effects of history and geography, by acknowledging the different cultures and traditions around the globe.

All human babies in their first year babble in the phonemes of all the thousands of human languages.9  Yet each child is brought up to speak predominantly in but one – or perhaps two – of those tongues.10 It’s a good example of difference within a common ability.

Babies babble in the phonemes of all the world’s languages: baby silhouette © victor-magz.com (2013)

‘Ethnicity’ provides a neutral way of referring to variety within unity. It uses nationhood or world-region to provide a social label. Thus the ‘Japanese’ are those bred in Japan and who share the Japanese cultural identity – whatever their skin colour. Similarly, all the ‘Scots’ who will vote in the forthcoming referendum on the future of Scotland are those on the current Scottish electoral register, wherever they were born. Close neighbours, like my first cousin who self-identifies as ‘Scottish’ but lives in the north of England, will not.

The great advantage of using national or regional labels is that they can be doubled, to acknowledge diversity of heritage. Thus John Archer, known as London’s first ‘black’ Mayor (Battersea: 1913-14) can be more properly described as a Briton, born in Liverpool, with Barbadian Irish ancestry. That pays due respect to both his parents. The Americans, as a ‘people’ with a long history of immigration, are paving the way in this usage, helping individuals to acknowledge their adherence to America but also a different parental heritage: African American, Irish American, Hungarian American, and so forth.

But admittedly, there is one large complication when people have many ethnicities to acknowledge. John Archer, after all, was Barbadian Irish British. His wife was West Indian Canadian. But such convolutions can easily become cumbersome. What would their children be? Here the golfer Tiger Woods has found a witty answer. He’s pioneered the adjective ‘Cablinasian’ to name his Caucasian, Black, American Indian and Asian heritage. That should (even if it hasn’t yet) stop people trying to define him as ‘black’.

Lastly, what to do when recent politics still governs the language of social description? It’s only recently that South Africa shed its tripartite classification of ‘white’, ‘black’ and ‘Cape coloured’ (difficult as it was to implement at the multiple margins). Now perhaps one might distinguish between people of Dutch South African descent or English South African heritage. But it would then be logical to talk about African South Africans; or, for mixed ancestries, (say) Dutch African South African. It’s all too much. How about following the people of Brazil, with their mixed heritage from indigenous Americans, Portuguese, Africans, and Asians? Their National Research by Household Sample (2008) classifies people partly by self-assigned colour and partly by family origin by world-region.11 For all other purposes, however, they are ethnic ‘Brazilians’.

I guess that’s what Mandela would have wished to see happening among the next generations of South Africans. Down with skin-deepishness. Long live world-regional identities – plus their mixing.

1 L.L. and F. Cavalli-Sforza, The Great Human Diasporas: The History of Diversity and Evolution (New York, 1995): all humans should read this book.

2 See Carl Zimmer, ‘Genes are Us. And Them’, National Geographic (July 2013).

3 For an attempted scientific methodology, see R.B. Dixon, The Racial History of Man (New York, 1923), pp. 8-45, 475-523. See also, for context, E. Barkan, The Retreat of Scientific Racism: Changing Concepts of Race in Britain and the United States between the World Wars (Cambridge, 1992).

4 The difficulty of classifying individuals objectively into clearly separate and unmixed ‘races’ has vitiated various past attempts at classifying racial intelligence – quite apart from the problem of finding tests that factor out the effects of different nurture and social/biological environment.

5 M. Tempest, ‘Duncan Smith’s Secret Samurai Past’, The Guardian, 3 Sept. 2001: see
www.theguardian.com/politics/2001.

6 See report by Paul Harris and Lucy Laing, Daily Mail, 30 March 2012: www.dailymail.co.uk/news/article-2123050/Look-The-black-white-twins-turn-seven.

7 On pigmentary hierarchies, which are found in some but not all cultures, see D. Gabriel, Layers of Blackness: Colourism in the African Diaspora (London, 2007); E.N. Glenn (ed.), Shades of Difference: Why Skin Color Matters (Stanford, Calif., 2009); S.B. Verma, ‘Obsession with Light Skin: Shedding Some Light upon the Use of Skin Lightening Products in India’, International Journal of Dermatology, 49 (2010), pp. 464ff.

8 N. Henry, Pearl’s Secret: A Black Man’s Search for his White Family (Berkeley, Calif., 2001).

9 D. Crystal (ed.), The Cambridge Encyclopedia of Language (Cambridge, 1994), pp. 236-7.

10 Most children are monolingual, but bilingualism is not uncommon, where the parents have different languages, or where the wider society operates with more than one official language. It’s much rarer to be polyglot: see e.g. Xiao-Lei Wang, Growing Up with Three Languages: Birth to Eleven (Bristol, 2008).

11 Brazil’s National Research by Household Sample (2008) reported that 48.43% of the Brazilian population, when surveyed, described themselves as ‘white’; 43.80% as ‘brown’ (multi-ethnic); 6.84% as ‘black’; 0.58% as ‘Amerindian’ (officially known as ‘Indigenous’); while 0.07% (about 130,000 individuals) did not declare any additional identity. A year earlier, Brazil’s National Indian Foundation also reported the existence of at least 67 different ‘uncontacted’ tribes. See en.wikipedia.org/wiki/Brazil.


MONTHLY BLOG 35, DONS AND STUDENT-CUSTOMERS? OR THE COMMUNITY OF LEARNERS?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Names matter. Identifying things – people – events – in realistic terminology means that they are being fully understood and taken seriously. Conversely, it’s warping to the mind and eventually corrosive of good thought to be constantly urged to give lip-service to the ‘wrong’ terms. People who live under dictatorial systems of would-be thought-control often testify to the ‘dead’ feeling that results from public censorship, especially when it is internalised as self-censorship.

By the way, I wrote that paragraph before remembering that this sentiment dovetailed with something I’d read about Confucius. A quick Google-check confirmed my half-memory. Confucius long ago specified that: ‘The beginning of wisdom is to call things by their proper names’. It’s a great dictum. It doesn’t claim too much. Naming is only the ‘beginning of wisdom’, not the entirety. And there is often scope for debating what is or should be the ‘proper’ name. Nonetheless, Confucius not only highlights the good effect of clear vision, accurately acknowledged to others, but equally implies the malign effects of the reverse. The beginning of madness is to delude oneself and others about the true state of affairs.

Which brings me to my current question: are University students ‘customers’? If so, an interesting implication follows. If ‘the customer is always right’, as the business world asserts but does not always uphold, should not all students get top marks for having completed an assignment or an exam paper? Or, at the very least, not get bad marks?

Interestingly, now that student payments for tuition are very much up-front and personal in the form of fees (which are funded as repayable loans), the standard of degrees is gradually rising. Indeed, grade inflation has become noticeable ever since Britain’s Universities began to be expanded into a mass system. A survey undertaken in 2003 found that the third-class degree has been in steady decline since 1960 and was nearing extinction by 2000.1 And a decade on, the lower second (2.2) in some subjects is following the same trajectory. Better teaching, better study skills, and/or improved exam preparation may account for some of this development. But rising expectations on the part of students – and increasing reputational ambitions on the part of the Universities – also exert subtle pressures upon examiners to be generous.

Nonetheless, even allowing for a changing framework of inputs and outputs, a degree cannot properly be ‘bought’. Students within any given University course are learners, not customers. Their own input is an essential part of the process. They can gain a better degree not by more money but by better effort, well directed, and by better ability, suitably honed.

People learn massively from teachers, but also much from private study, and much too from their fellow-learners (who offer both positive and negative exemplars). Hence the tutors, the individual student, and other students all contribute to each individual’s result.2

A classic phrase for this integrated learning process was ‘the community of scholars’. That phrase now sounds quaint and possibly rather boring. Popularly, scholarship is assumed to be quintessentially dull and pedantic, with the added detriment of causing its devotees to ‘scorn delights and live laborious days,’ in Milton’s killing phrase.3  In fact, of course, learning isn’t dull. Milton, himself a very learned man, knew so too. Nonetheless, ‘the community of scholars’ doesn’t cut the twenty-first century terminological mustard.

But ‘learning’ has a better vibe. It commands ‘light’. People may lust for it, without losing their dignity. And it implies a continually interactive process. So it’s good for students to think of themselves as part of a community of learners. Compared with their pupils, the dons are generally older, sometimes wiser, always much better informed about the curriculum, much more experienced in teaching, and ideally seasoned by their own research efforts. But the academics too are learners, if more advanced along the pathway. They are sharing the experience and their expertise with the students. Advances in knowledge can come from any individual at any level, often emerging from debates and questions, no matter how naive. So it’s not mere pretension that causes many academics to thank in their scholarly prefaces not only their fellow researchers but also their students.

Equally, it’s good for the hard-pressed dons to think of themselves as part of an intellectual community that extends to the students. That concept reasserts an essential solidarity. It also serves to reaffirm the core commitment of the University to the inter-linked aims of teaching and research. Otherwise, the students, who are integral to the process, are seemingly in danger of getting overlooked while the dons are increasingly tugged between the rival pressures of specialist research in the age of Research Assessment, and of managerial business-speak in the age of the University-plc.4

Lastly, reference to ‘community’ need not be too starry-eyed. Ideals may not always work perfectly in practice. ‘Community’ is a warm, comforting word. It’s always assumed to be a ‘good thing’. Politicians, when seeking to commend a policy such as mental health care, refer to locating it in ‘the community’ as though that concept can resolve all the problems. (As is now well proven, it can’t). And history readily demonstrates that not all congregations of people form a genuine community. Social cohesion needs more than just a good name.

That’s why it’s good to think of Universities as containing communities of learners, in order to encourage everyone to provide the best conditions for that basic truth to flourish at its best. That’s far from an easy task in a mass higher-education system. It runs counter to attempts at viewing students as individual consumers. But it’s more realistic as to how teaching actually works well. And calling things by their proper names makes a proper start.

William Hogarth’s satirical Scholars at a Lecture (1736) offers a wry reminder to tutors not to be boring and to students to pay attention.

1 ‘Third Class Degree Dying Out’, Times Higher Education, 5 Sept. 2003: www.timeshighereducation.co.uk/178955/article: consulted 4 Nov. 2013.

2 That’s one reason why performance-related-pay (PRP) for teachers, based upon examination results for their taught courses, remains a very blunt tool for rewarding teaching achievements. Furthermore, all calculations for PRP (to work even approximately justly) need to take account of the base-line from which the students began, to measure the educational ‘value-added’. Without that proviso, teachers (if incentivised purely by monetary reward) should logically clamour to teach only the best, brightest, and most committed students, who will deliver the ‘best’ results.

3 John Milton, Lycidas: A Lament for a Friend, Drowned in his Passage from Chester on the Irish Seas (1637), lines 70-72: ‘Fame is the spur that the clear spirit doth raise/ (That last Infirmity of Noble mind)/ To scorn delights and live laborious days.’

4 On the marketisation of higher education, see film entitled Universities Plc? Enterprise in Higher Education, made by film students at the University of Warwick (2013): www2.warwick.ac.uk/fac


MONTHLY BLOG 33, CONTRACTING OUT SERVICES IS KILLING REPRESENTATIVE DEMOCRACY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

‘Contracting out’ is a policy mantra especially of financial/services capitalism (as opposed to industrial capitalism or landowner capitalism), which has been gaining greater support year by year. As an ideal, it was succinctly formulated by Nicholas Ridley (1929-93), who held various ministerial posts under Margaret Thatcher. Theoretically, he hated government expenditure of all kinds: ‘I was against all but the most minimal use of the taxpayer’s purse’.1

For Ridley – himself from a titled family with business interests in ship-owning – the ideal form of local democracy would be one in which the Councillors met no more than once yearly. At the annual meeting, they should set the rate and agree the fees for contracting out municipal services. Then they could all go home. His was an extreme version of what is known in political theory as a preference for the minimal ‘night-watchman state’.2

C17 print of night-watchman and dog.

No mention from Ridley of Town Hall debates as providing a sounding-board for local opinion. No mention of community identity and pride in collective institutions. No mention of a proper scope for in-house services. No mention of elected control of key tasks, including regulatory and quasi-judicial functions. No mention even of scrutinising the contracted-out services. No mention therefore of accountability.

Above all, no mention from Ridley of what Edmund Burke called the ‘little platoons’3 (‘local platoons’ would have been better, as their sizes are variable) that bridge between private individuals and the central state. Hence no mention of representative democracy at a local level. This was aristocratic disdain worthy of Marie Antoinette before the French Revolution. Moreover, without representative politics at all levels of society, then popular democracy will, when provoked, burst through into direct action. Often, though not invariably, in an uncoordinated and violent manner.

France, in fact, provides an excellent historical example of the eventual follies of contracting out. The absolute monarchs before 1789 presided over a weak central bureaucracy. As a result, one of the key functions of the state, the collection of taxes, was ‘farmed out’, in the jargon of the eighteenth century. The Ferme Générale undertook the humdrum tasks of administration, absorbing the risks of fluctuating returns, while assuring the monarchy of a regular income. And, to be sure, this system survived for many years. Nonetheless, the French monarchy faced chronic financial problems by the later eighteenth century. And the great political problem was that all the tax profits went to the Tax Farmers, while popular hatred at high payments and harsh collection methods remained directed at the kings.4

In twenty-first-century Britain, something of the same situation is developing. The state still has to provide basic services; and remains the guarantor of last resort, if and when private service firms fail. Thus the faults of the system are still the government’s faults, while the profits go to private companies. The other long-term costs are borne by the general public, left to face cut-to-the-bone services, provided by poorly-paid and demoralised casual labour. No-one is popular in such a system. But the secretive and unaccountable world of the private providers, sheltered by commercial ‘secrecy’, saves them for a while from the wrath to come.

One notorious example is known to everyone. It occurred in July 2012, just before the start of the Olympic Games. The private firm G4S promised but failed to deliver security. The contract was worth £284 million. Two weeks before the opening ceremony, the same role was transferred to the publicly-funded army. It did the task well, to tremendous applause. G4S forfeited £88 million for its failure on this part of the contract.5 Yet, despite this ‘humiliating shambles’ in the words of its chief executive, who resigned just over six months later with a huge payoff,6 the firm remains a major player in the world of security services.

The British army on security patrol at the London Olympics August 2012 – replacing the failed private security firm G4S.

So G4S today advertises itself as ‘the world’s leading international security solutions group, which specialises in secure outsourcing in countries and sectors where security and safety risks are considered a strategic threat’.7 No mention of regular overview and scrutiny, because there is none. It’s another of those businesses which are considered (wrongly, in practice) as ‘too big to fail’. The point of scrutiny comes only after an embarrassing failure or at the renewal of the contract, when nervous governments, having invested their prestige and money in privatisation programmes, don’t care or dare to rethink their strategy. In August 2013, G4S is being investigated by the Ministry of Justice for alleged over-charging on electronic ‘tagging’ schemes for offenders.8 Yet, alas, this costly imbroglio is unlikely to halt the firm’s commercial advance for long.

Overall, there is a huge shadow world of out-sourced businesses. They include firms like Serco, Capita, Interserve, Sodexo, and the Compass Group. As the journalist John Harris comments: ‘their names seem anonymously stylised, in keeping with the sense that they seemed both omni-present, and barely known’.9 Their non-executive directors often serve on the board of more than one firm at a time, linking them in an emergent international contractocracy. Collectively, they constitute a powerful vested interest.

Where will it end? The current system is killing representative democracy. Elected ministers and councillors find themselves in charge of dwindling bureaucracies. So much the better, cry some. But quis custodiet? The current system is not properly accountable. It is especially dangerous when private firms are taking over regulatory functions, which need the guarantee of impartiality. (More on that point in a later BLOG). Successful states need efficient bureaucracies that are meritocratic, impartial, non-corrupt, flexible, and answerable regularly (and not just at contract-awarding intervals) to political scrutiny. The boundaries between what should be state-provided and what should be commercially-provided are always open to political debate. But, given that the state often funds and ultimately guarantees many functions, its interest in what is going on in its name cannot be abrogated.

The outcome will not be the same as in the French Revolution, because history does not repeat itself exactly. Indeed, the trend nowadays is towards contracting-out rather than the reverse. Yet nothing is fixed in stone. Wearing my long-term hat, I prophesy that eventually many of the profit-seeking ‘Service Farmers’ will have to go, rejected by democratic citizens, just as the ‘Tax Farmers’ went before them.

1 Patrick Cosgrave, ‘Obituary: Lord Ridley of Liddesdale’, Independent, 6 March 1993.

2 Another term for this minimal-government philosophy is ‘Minarchism’ or limited government libertarianism, often associated with free-marketry. Minarchism should be distinguished from anarchism or no-government, which has different ideological roots.

3 ‘To be attached to the subdivision, to love the little platoon we belong to in society, is the first principle (the germ as it were) of public affections. It is the first link in the series by which we proceed towards a love to our country and to mankind’: E. Burke, Reflections on the Revolution in France (1790), ed. C.C. O’Brien (Harmondsworth, 1969), p. 135.

4 E.N. White, ‘From Privatised to Government-Administered Tax-Collection: Tax Farming in Eighteenth-Century France’, Economic History Review, 57 (2004), pp. 636-63.

5 Reported in Event, 14 Feb. 2013.

6 Daily Mail, 21 May 2013, from Mail-online: www.dailymail.co.uk, viewed 9 Aug. 2013.

7 See ‘Who we are’ in website www.g4s.com.

8 Daily Telegraph, 6 August 2013, from Telegraph-online: www.telegraph.co.uk, viewed on 9 Aug. 2013.

9 John Harris on Serco, ‘The Biggest Company you’ve never heard of’, Guardian, 30 July 2013: supplement, pp. 6-9.


MONTHLY BLOG 30, BUT PEOPLE OFTEN ASK: HISTORY IS REALLY POLITICS, ISN’T IT? SO WHY SHOULDN’T POLITICIANS HAVE THEIR SAY ABOUT WHAT’S TAUGHT IN SCHOOLS?

 If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Two fascinating questions. My response to the first is: No – History is bigger than any specific branch of knowledge. It covers everything that humans have done, which includes much besides Politics. Needless to say, such a subject lends itself to healthy arguments, including debates about ideologically-freighted religious and political issues.

But it would be dangerous if the study of History were to be forced into a strait-jacket by the adherents of particular viewpoints, buttressed by the power of the state. (See my April 2013 BLOG.) By the way, the first question can also be interpreted differently, to ask whether all knowledge is really political. I return to that subtly different issue below.*

Meanwhile, in response to the second question: I agree that politicians could do with saying and knowing more about History. Indeed, there’s always more to learn. History is an open-ended subject, and all the better for it. Because it deals with humans in ever-unfolding Time, there is always more basic data to incorporate. And perspectives upon the past can gain significant new dimensions when reconsidered in the light of changing circumstances.

Yet the case for an improved public understanding of History is completely different from arguing that each incoming Education Secretary should re-write the Schools’ History syllabus. Politicians are elected to represent their constituents and to take legislative and executive decisions on their behalf – a noble calling. In democracies, they are also charged to preserve freedom of speech. Hence space for public and peaceful dissent is supposed to be safeguarded, whether the protesters be many or few.

The principled reason for opposing attempts at political control of the History syllabus is based upon the need for pluralism in democratic societies. No one ‘side’ or other should exercise control. There is a practical reason too. Large political parties are always, whether visibly or otherwise, based upon coalitions of people and ideas. They do not have one ‘standard’ view of the past. In effect, to hand control to one senior politician means endorsing one particular strand within one political party: a sort of internal warfare, waged not only against the wider culture but also against the wider reaches of his or her own political movement.

When I first began teaching, I encountered a disapproving professor of markedly conservative views. When I told him that the subject for my next class was Oliver Cromwell, he expressed double discontent. He didn’t like either my gender or my politics. He thought it deplorable that a young female member of the Labour party, and an elected councillor to boot, should be indoctrinating impressionable students with the ‘Labour line on Cromwell’. I was staggered. And laughed immoderately. Actually, I should have rebuked him but his view of the Labour movement was so awry that it didn’t seem worth pursuing. Not only do the comrades constantly disagree (at that point I was deep within the 1971 Housing Finance Act disputes) but too many Labour activists show a distressing lack of interest in History.

Moreover, Oliver Cromwell is hard to assimilate into a simplistic narrative of Labour populism. On the one hand, he was the ‘goodie’ who led the soldiers of the New Model Army against an oppressive king. On the other hand, he was the ‘baddie’ who suppressed the embryonic democrats known as the Levellers and whose record in Ireland was deeply controversial. Conservative history, incidentally, has the reverse problem. Cromwell was damned by the royalists as a Regicide – but simultaneously admired as a successful leader who consolidated British control in Ireland, expanded the overseas empire, and generally stood up to foreign powers.1

Interestingly, the statue of Oliver Cromwell, prominently sited in Westminster outside the Houses of Parliament, was proposed in 1895 by a Liberal prime minister (Lord Rosebery), unveiled in 1899 under a Conservative administration, and renovated in 2008 by a Labour government, despite a serious proposal in 2004 from a Labour backbencher (Tony Banks) that the statue be destroyed. As it stands, it highlights Cromwell the warrior, rather than (say) Cromwell the Puritan or Cromwell the man who brought domestic order after civil war. And, at his feet, there is a vigilant lion, whose British symbolism is hard to miss.2

Statue of Oliver Cromwell, with lion at his feet, outside the Houses of Parliament
Or take the much more recent case of Margaret Thatcher’s reputation. That is now beginning its long transition from political immediacy into the slow ruminations of History. Officially, the Conservative line is one of high approval, even, in some quarters, of untrammelled adulation. On the other hand, she was toppled in 1990 not by the opposition party but by her own Tory cabinet, in a famous act of ‘matricide’. There is a barely concealed Conservative strand that rejects Thatcher outright. Her policies are charged with destroying the social cohesion that ‘true’ conservatism is supposed to nurture; and with strengthening the centralised state, which ‘true’ conservatism is supposed to resist.3 Labour’s responses are also variable, all the way from moral outrage to political admiration.

Either way, a straightforward narrative that Thatcher ‘saved’ Britain looks questionable in 2013, when the national economy remains obstinately ‘unsaved’. It may be that, in the long term, she will feature more prominently in the narrative of Britain’s conflicted relationship with Europe. Or, indeed, as a Janus-figure within the slow story of the political emergence of women. Emmeline Pankhurst (below L) would have disagreed with Thatcher’s policies but would have cheered her arrival in Downing Street. Thatcher, meanwhile, was never enthusiastic about the suffragettes but never doubted that a woman could lead.4

Emmeline Pankhurst orating in Trafalgar Square (L); statue of Margaret Thatcher in the House of Commons (R)
Such meditations are a constituent part of the historians’ debates, as instant journalism moves into long-term analysis, and as partisan heat subsides into cooler judgment. All schoolchildren should know the history of their country and how to discuss its meanings. They should not, however, be pressurised into accepting one particular set of conclusions.

I often meet people who tell me that, in their school History classes, they were taught something doctrinaire – only to discover years later that there were reasonable alternatives to discuss. To that, my reply is always: well, bad luck, you weren’t well taught; but congratulations on discovering that there is a debate and deciding for yourself.

Even in the relatively technical social-scientific areas of History (such as demography) there are always arguments. And even more so in political, social, cultural, and intellectual history. But the arguments are never along simple party-political lines, because, as argued above, democratic political parties don’t have agreed ‘lines’ about the entirety of the past, let alone about the complexities of the present and recent-past.

Lastly,* how about broadening the opening question? Is all knowledge, including the study of History, really ‘political’ – not in the party-political sense, but as the expression of an engaged worldview? Again, the answer is No. That extended definition of ‘political’ stretches the term, which usefully refers to government and civics, too far.

Human knowledge, which does stem from, reflect and inform human worldviews, is hard gained not from dogma but from research and debate, followed by more research and debate. It’s human, not just political. It’s shared down the generations. And between cultures. That’s why it’s vital that knowledge acquisition be not dictated by any temporary power-holders, of any political-ideological or religious hue.

1 Christopher Hill has a good chapter on Cromwell’s Janus-faced reputation over time, in God’s Englishman: Oliver Cromwell and the English Revolution (1970), pp. 251-76.

2 Statue of Cromwell (1599-1658), erected outside Parliament in 1899 at the tercentenary of his birth: see www.flickr.com, kev747’s photostream, photo taken Dec. 2007.

3 Contrast the favourable but not uncritical account by C. Moore, Margaret Thatcher, the Authorised Biography, Vol. 1: Not for Turning (2013) with tough critiques from Christopher Hitchens and Karl Naylor: see www.Karl-Naylor.blogspot.co.uk, entry for 23 April 2013.

4 Illustrations (L) photo of Emmeline Pankhurst (1858-1928), suffragette leader, orating in Trafalgar Square; (R) statue of Margaret Thatcher (1925-2013), Britain’s first woman prime minister (1979-90), orating in the Commons: see www.parliament.uk.


MONTHLY BLOG 29, SHOULD EACH SECRETARY OF STATE FOR EDUCATION REWRITE THE UK SCHOOLS HISTORY SYLLABUS?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

The answer is unequivocally No. (Obvious, really, but still worth saying?)

History as a subject is far, far too important to become a political football. It teaches about conflict as well as compromise; but that’s not the same as being turned into a source of conflict in its own right. Direct intervention by individual politicians in framing the History syllabus is actively dangerous.


Rival supporters of King and Parliament in the 1640s civil wars, berating their opponents as ‘Roundhead curs’ and ‘Cavalier dogs’: the civil wars should certainly appear in the Schools History syllabus but they don’t provide a model for how the syllabus should be devised.

There are several different issues at stake. For a start, many people believe that the Schools curriculum, or prescriptive framework, currently allots too little time to the study of History. There should be more classes per week. And the subject should be compulsory to the age of sixteen.1  Those changes would in themselves greatly enhance children’s historical knowledge, reducing their recourse to a mixture of prevalent myths and cheerful ignorance.

A second issue relates to the balance of topics within the current History syllabus, which specifies the course contents. I personally do favour some constructive changes. There is a good case for greater attention to long-term narrative frameworks,2  alongside high-quality in-depth studies.

But the point here is: who should actually write the detailed syllabus? Not individual historians and, above all, not individual politicians. However well-intentioned such power-brokers may be, writing the Schools History syllabus should be ultra vires: beyond their legal and political competence.

The need for wide consultation would seem obvious; and such a process was indeed launched. However, things have just moved into new territory. It is reported that the Education Secretary has unilaterally aborted the public discussions. Instead, the final version of the Schools History syllabus, revealed on 7 February 2013, bears little relation to previous drafts and discussions.3 It has appeared out of the (political) blue.

Either the current Education Secretary acted alone, or perhaps he relied on unnamed advisers working behind the scenes. Where is the accountability in this mode of procedure? Even some initial supporters of syllabus revision have expressed their dismay and alarm.

Imagine what Conservative MPs would have said in 2002 if David Blunkett (to take the best known of Blair’s over-many Education Ministers) had not only inserted the teaching of Civics into the Schools curriculum as a separate subject;4 but had written the Civics syllabus as well. Or if Blunkett had chosen to rewrite the History syllabus at the same time?

Or imagine what Edmund Burke, the apostle of moderate Toryism, would have said. This eighteenth-century politician-cum-political theorist, who was reportedly identified in 2008 as ‘the greatest conservative ever’ by the current Education Secretary,5 was happy to accept the positive role of the state. Yet he consistently warned of the dangers of high-handed executive power. The authority of central government should not be untrammelled. It should not be used to smash through policies in an arbitrary manner. Instead Burke specifically praised the art of compromise or – a better word – of mutuality:

All government, indeed every human benefit and enjoyment, every virtue, and every prudent act, is founded on compromise and barter.6

An arbitrary determination of the Schools History syllabus further seems to imply that the subject not only can but ought to be moulded by political fiat. Such an approach puts knowledge itself onto a slippery slope. ‘Fixing’ subjects by political will (plus the backing of the state) leads to intellectual atrophy.

To take a notoriously extreme example, Soviet biology was frozen for at least two generations by Stalin’s doctrinaire endorsement of Lysenko’s environmental genetics.7 A dramatic rise in agrarian productivity was promised, without the need for fertilisers (or more scientific research). Stalin was delighted. Down with the egg-heads and their slow research. Lysenko’s critics were dismissed or imprisoned. But Lysenkoism did not work. And, after unduly long delays, his pseudo-science was finally discredited.

A rare photo of Stalin (back R) gazing approvingly at Trofim Lysenko (1898-1976)
speaking from the rostrum in the Kremlin, 1935

In this case, the Education Secretary is seeking to improve schoolchildren’s minds rather than to improve crop yields. But declaring the ‘right’ answer from the centre is no way to achieve enlightenment. Without the support of the ‘little platoons’ (to borrow another key phrase from Burke), the proposed changes may well prove counter-productive in the class-room. Many teachers, who have to teach the syllabus, are already alienated. And, given that History as a subject lends itself to debate and disagreement, pupils will often learn different lessons from those intended.

Intellectual interests in an Education Secretary are admirable. The anti-intellectualism of numerous past ministers (including too many Labour ones) has been horribly depressing. But intellectual confidence can tip too far into arrogance. Another quotation to that effect is often web-attributed to Edmund Burke, though it in fact comes from Albert Einstein. He warned that powerful people should wisely appreciate the limits of their power:

Whoever undertakes to set himself up as a judge of Truth and Knowledge is shipwrecked by the laughter of the gods.8

1 That viewpoint was supported in my monthly BLOG no.23 ‘Why do Politicians Undervalue History in Schools’ (Oct. 2012): see www.penelopejcorfield.co.uk.

2 I proposed a long-span course on ‘The Peopling of Britain’ in History Today, 62/11 (Nov. 2012), pp. 52-3.

3 See D. Cannadine, ‘Making History: Opportunities Missed in Reforming the National Curriculum’, Times Literary Supplement, 15 March 2013, pp. 14-15; plus further responses and a link to the original proposals in www.historyworks.tv

4 For the relationships of History and Civics, see my monthly BLOG no.24 ‘History as the Staple of a Civic Education’, www.penelopejcorfield.co.uk.

5 Michael Gove speech to 2008 Conservative Party Annual Conference, as reported in en.wikipedia.org/wiki/Michael_Gove, consulted 3 April 2013.

6 Quotation from Edmund Burke (1729-97), Second Speech on Conciliation with America (1775). For further context, see D. O’Keeffe, Edmund Burke (2010); I. Kramnick, The Rage of Edmund Burke: Portrait of an Ambivalent Conservative (New York, 1977); and F. O’Gorman (ed.), British Conservatism: Conservative Thought from Burke to Thatcher (1986).

7 Z. Medvedev, The Rise and Fall of T.D. Lysenko (New York, 1969).

8 Albert Einstein (1879-1955), in Essays Presented to Leo Baeck on the Occasion of his Eightieth Birthday (1954), p. 26. The quotation is sometimes (but wrongly) web-attributed to Edmund Burke’s critique of Jacobin arrogance in his Preface to Brissot’s Address to his Constituents (1794).


MONTHLY BLOG 25, CHAMPIONING THE STUDY OF HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2012)

How do we champion (not merely defend) the study of History in schools and Universities, against those who wrongly claim that the subject is not commercially ‘useful’?

Here are three recommendations. Firstly, we should stress the obvious: that a knowledge of history and an interconnected view of past and present (cause and consequence) is essential to the well-functioning not only of every individual but also of every society. The subject roots people successfully in time and place. Individuals with lost memories become shadowy, needing help and compassion. Communities with broken memories, for example through forced uprooting, exhibit plentiful signs of trauma, often handed down through successive generations. Civics as well as economics thus demands that people have a strong sense of a sustained past. That entails learning about the history of their own and other societies, in order to gain an understanding of the human condition. All knowledge comes from the past and remains essential in the present. Nothing could be more ‘useful’ than history, viewed broadly.

The second recommendation links with the first. We should define the subject as the study not of the ‘dead past’ but of ‘living history’.

In fact, there’s a good case for either usage. Historians often like to stress the many differences between past and present. That’s because studying the contrasts sets a good challenge – and also because an awareness of ‘otherness’ alerts students not simply to project today’s attitudes and assumptions backwards in time. The quotation of choice for the ‘difference’ protagonists comes from an elegiac novel, which looked back at England in 1900 from the vantage point of a saddened older man in the 1940s. Entitled The Go-Between by L.P. Hartley (1953), it began with the following words: ‘The past is a foreign country: they do things differently there.’

It’s an evocative turn of phrase that has inspired book titles.1 It’s also widely quoted, often in the variant form of ‘the past is another country’. These phrases draw their potency from the fact that other places can indeed be different – sometimes very much so. It is also true that numerous historic cultures are not just different but have physically vanished, leaving imperfect traces in the contemporary world. ‘Ancient Ur of the Chaldees is covered by the sands of southern Iraq. … And the site of the once-great Alexandrian port of Herakleion lies four miles off-shore, under the blue seas of the Mediterranean’.2

On the other hand, while some elements of history are ‘lost’, past cultures are not necessarily inaccessible to later study. Just as travellers can make an effort to understand foreign countries, so historians and archaeologists have found many ingenious ways to analyse the ‘dead past’.

There are common attributes of humanity that can be found everywhere. We all share a living human history.3 Ancient cultures may have vanished but plenty of their ideas, mathematics, traditions, religions, and languages survive and evolve. Anyone who divides a minute into sixty seconds, an hour into sixty minutes, and a circle into 360 degrees, is paying an unacknowledged tribute to the mathematics of ancient Babylon.4

So there is an alternative quotation of choice for those who stress the connectivity of past and present. It too comes from a novelist, this time from the American Deep South, who was preoccupied by the legacies of history. William Faulkner’s Requiem for a Nun (1951) made famous his dictum that:
The past is never dead. It’s not even past.

No doubt there are circumstances when such sentiments are dangerous. There are times when historic grievances have to be overcome. But, before reconciliation, it’s best to acknowledge the reality of such legacies, rather than dismissing them. As it happens, that was the argument of Barack Obama when giving a resonant speech in 2008 about America’s festering ethnic divisions.5

Historians rightly observe that history contains intertwined elements of life and death. But when campaigning for the subject, it’s best to highlight the elements that survive through time. That is not romanticising history, since hatreds and conflicts are among the legacies from the past. It’s just a good method for convincing the doubters. Since we are all part of living history, for good and ill, we all need to study the subject in all its complexity.

Thirdly and finally: historians must make common cause with champions of other subjects. Obvious allies come from the Arts and Humanities. But we should appeal especially to the Pure Sciences. They too fail to meet the test of immediate economic ‘usefulness’. There is no instant value in a new mathematical equation; no immediate gain from the study of String Theory in physics. (Indeed, some physicists argue that this entire field is turning into a blind alley.)6 But the pure sciences need scope for creativity and theoretical innovation. Some new ideas have become ‘useful’ (or dangerous) only many years after the initial intellectual breakthrough. Others have as yet no direct application. And some may never have any.

Humans, however, are capable of thinking long. It is one of our leading characteristics. So we must not be bullied into judging the value of subjects to study solely or even chiefly in terms of short-term criteria. The Pure Sciences, alongside the Arts and Humanities, must combat this blinkered approach. There are multiple values in a rounded education, combining the theoretical and the practical. In the case of History, the blend must include knowledge as well as skills. In the sciences, it must include the theoretical as well as the applied. One without the other will fail. And that in the long-term is not remotely useful. In fact, it’s positively dangerous. History confirms the long-term usefulness of the sciences. Let the scientists repay the compliment by joining those who reject crude utilitarianism – hence in turn championing the study of History.

1 Notably by David Lowenthal, The Past is a Foreign Country (Cambridge, 1985).

2 Quoting from an essay by myself, entitled ‘Cities in Time’, in Peter Clark (ed.), Oxford Handbook on Cities in World History (Oxford, forthcoming May 2013).

3 See Ivar Lissner, The Living Past (1957), transl. from the German So Habt Ihr Gelebt (literally, ‘Thus You Lived’); and my personal response in PJC Discussion-Point Nov. 2011.

4 For the social and intellectual context of Babylonian mathematics, see Eleanor Robson, Mathematics in Ancient Iraq: A Social History (Princeton, 2008).

5 For Barack Obama’s speech ‘A More Perfect Union’, delivered at Philadelphia, PA, 18 March 2008: see video on www.youtube.com.

6 See references to the usefulness or otherwise of pure maths in PJC Blog Oct. 2012.
