
MONTHLY BLOG 122, PROPOSED ROOTS PROJECT FOR TEENAGERS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2021)

Line Drawing of Tree & Roots: © Vector Illustrations (2020)

It’s important for individuals to know about their personal roots. Humans all live in Time-Space (also known as the Space-Time continuum).1 And knowing a bit about personal family roots helps to locate people in their own individual spot in history and geography.

So this short essay speculates about a possible School Roots Project for children in their mid-teens. (Perhaps in a Civics class; or as part of a contemporary History course.) The aim is not in any way to encourage family bragging, whether for ‘lofty’ aristocratic lineage or for ‘authentic’ proletarian roots. Instead, the value is chiefly for the individuals concerned, to know more about themselves – and to have the chance to talk seriously about their roots with parents/ grandparents/ influential family members/ and/or any others who played a significant role in their upbringings.

Clearly teachers need to organise all such Roots Projects with great sensitivity. Not all families are happy ones. Not all older relatives will be at ease talking about the past with people of a younger generation. And thoughtful arrangements have to be made for students who are adopted, who may know little or nothing about their biological background – but who share the same human need to be socially well rooted in Time-Space. Indeed, it can well be argued that those whose position is, outwardly at least, relatively unsettled have the greatest need for this exercise in rooting, both with their adoptive families and/or with their biological families, if they can be traced.2

The more that individuals know about their personal background, the more secure they feel – the more they understand their connections with others – the better their sense of self-esteem – and the more they feel in control of their own lives. Rootedness is a prime indicator of emotional health and happiness. And the more that people are secure in their own skin, the better they can relate to others.3 They can simultaneously see their own role as part of a wider human history, set in unfolding Time which links the generations.

What then should a Roots Project for teenagers entail? The details are best left to be specified by teachers who know the relevant age-group. There’s no magic formula. Just a desire to get children talking to their parents/ grandparents/ or any other significant figures in their upbringing. At infant school level, there are many good storybooks about families; and there are projects which invite children to ask grandparents (say) simple questions, such as ‘What sort of toys did you have as a child?’ For teenagers, the discussion can be more probing – but may be hampered by years of not talking about personal matters. Therefore Projects should start modestly: asking children which adults influenced them as they grew; and then asking the youngsters to think of questions to ask the grown-ups in their lives.

Students should also be briefed on asking for family help with their Roots Projects. It must be stressed that all information will be used exclusively by the students. These talks will not be ‘on the record’ – here contrasting with what can happen to taped interviews as the result of formal Oral History exercises.4 Instead, the Roots Projects are intended as launch-pads for informal chats, enabling the students to write a short account of one or more significant adults who influenced their upbringing.

Afterwards, the class can be invited to share their experiences of the process. Some families will already be talkers. Others not. In every case, there is always more to be learned. Did the students find it easy or difficult to get the adults to talk? If difficult, why was that? Was it that they themselves were embarrassed? Or the parents shy? Did the talking exercise make things any easier? Did they learn anything surprising? What might they ask next time that they have a family chat? To stress again, this is not a competitive exercise in bragging about comparative social backgrounds. Instead, it is an exercise in Rooting – taking specific steps in what may become a longer series of family discussions.

Generally, it’s very common for people to exclaim, at the demise of a parent, grandparent or any other significant relative or carer: ‘I wish I’d asked them more about themselves, when they were alive to tell me’. Death locks the doors to personal memories of a shared past. Rooting Projects help to open the conversations while all the protagonists are alive to relate their own histories.

ENDNOTES:

1 Whether the chosen terminology is Time-Space or Space-Time, the proposition is the same: that Time and Space are integrally yoked. For further discussion, see P.J. Corfield, Time and the Shape of History (2007), pp. 15, 9-11, 17-18, 218, 220, 248-52; and PJC current research-in-progress.

2 See e.g. J. Rees, Life Story Books for Adopted Children: A Family-Friendly Approach (2009); J. Waterman and others, Adoption-Specific Therapy: A Guide to Helping Adopted Children and their Families Thrive (Washington DC, 2018); A. James, The Science of Parenting Adopted Children: A Brain-Based, Trauma-Informed Approach to Cultivating Your Child’s Social, Emotional and Moral Development (2019).

3 R. Coleman, ‘Why We Need Family History Now More than Ever’, FamilySearch, 26 Sept. 2017: https://www.familysearch.org/blog/en/family-history-2/

4 Oral History, professionally undertaken, provides a wonderful set of original resources for historical studies: among a huge literature, see e.g. A. Zusman, Story Bridges: A Guide for Conducting Intergenerational Oral History Projects (2016); F-A. Montoya and B. Allen, Practising Oral History to Connect University to Community (2018). These Schools Rooting Projects can be regarded as early stepping stones in the same process of tapping into the powers of the human memory – and sharing them with others.


MONTHLY BLOG 40, HISTORICAL REPUTATIONS THROUGH TIME

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

What does it take to get a long-surviving reputation? The answer, rather obviously, is somehow to get access to a means of endurance through time. To hitch a lift with history.

People in sports and the performing arts, before the advent of electronic storage/ replay media, have an intrinsic problem. Their prowess is known at the time but is notoriously difficult to recapture later. The French actor Sarah Bernhardt (1844-1923), playing Hamlet on stage when she was well into her 70s and sporting an artificial limb after a leg amputation, remains an inspiration for all public performers, whatever their field.1  Yet performance glamour, even in legend, still fades fast.

Bernhardt in the 1880s as a romantic Hamlet

What helps to keep a reputation well burnished is an organisation that outlasts an individual. A memorable preacher like John Wesley, the founder of Methodism, impressed many different audiences, as he spoke at open-air and private meetings across eighteenth-century Britain. Admirers said that his gaze seemed to pick out each person individually. Having heard Wesley in 1739, one John Nelson, who later became a fellow Methodist preacher, recorded that effect: ‘I thought his whole discourse was aimed at me’.2

Yet there were plenty of celebrated preachers in Georgian Britain. What made Wesley’s reputation survive was not only his assiduous self-chronicling, via his journals and letters, but also the new religious organisation that he founded. Of course, the Methodist church was dedicated to spreading his ideas and methods for saving Christian souls, not to the enshrining of the founder’s own reputation. It did, however, forward Wesley’s legacy into successive generations, albeit with various changes over time. Indeed, for true longevity, a religious movement (or a political cause, come to that) has to have permanent values that outlast its own era but equally a capacity for adaptation.

There are some interesting examples of small, often millenarian, cults which survive clandestinely for centuries. England’s Muggletonians, named after the London tailor Lodovicke Muggleton, were a case in point. Originating during the mid-seventeenth-century civil wars, the small Protestant sect never recruited publicly and never grew to any size. But the sect lasted in secrecy from 1652 to 1979 – a staggering trajectory. It seems that the clue was the shared excitement of cultish secrecy and a sense of special salvation, in the expectation of the imminent end of the world. Muggleton himself was unimportant. And finally the movement’s secret magic failed to remain transmissible.3

In fact, the longer that causes survive, the greater the scope for the imprint of very many different personalities, different social demands, different institutional roles, and diverse, often conflicting, interpretations of the core theology. Throughout these processes, the original founders tend quickly to become ideal-types of mythic status, rather than actual individuals. It is their beliefs and symbolism, rather than their personalities, that live.

As well as beliefs and organisation, another reputation-preserver is the achievement of impressive deeds, whether for good or ill. Notorious and famous people alike often become national or communal myths, adapted by later generations to fit later circumstances. Picking through controversies about the roles of such outstanding figures is part of the work of historians, seeking to offer not anodyne but judicious verdicts on those ‘world-historical individuals’ (to use Hegel’s phrase) whose actions crystallise great historical moments or forces. They embody elements of history larger than themselves.

Hegel himself had witnessed one such giant personality, in the form of the Emperor Napoleon. It was on the eve of the battle of Jena (1806), at which the previously feared Prussian army was routed by the French. The small figure of Napoleon rode past Hegel, who wrote: ‘It is indeed a wonderful sensation to see such an individual, who, concentrated here at a single point, astride a horse, reaches out over the world and masters it’.4

(L) The academic philosopher G.W.F. Hegel (1770-1831) and (R) the man of action, Emperor Napoleon (1769-1821), both present at Jena in October 1806.

The means by which Napoleon’s posthumous reputation has survived are interesting in themselves. He did not found a long-lasting dynasty, so neither family piety nor institutionalised authority could help. He was, of course, deposed and exiled, dividing French opinion both then and later. Nonetheless, Napoleon left numerous enduring things, such as codes of law; systems of measurement; structures of government; and many physical monuments. One such was Paris’s Jena Bridge, built to celebrate the victorious battle.5

Monuments, if sufficiently durable, can certainly long outlast individuals. They effortlessly bear diachronic witness to fame. Yet, at the same time, monuments can crumble or be destroyed. Or, even if surviving, they can outlast the entire culture that built them. Today a visitor to Egypt may admire the pyramids, without knowing the names of the pharaohs they commemorated, let alone anything specific about them. Shelley caught that aspect of vanished grandeur well, in his poem to the ruined statue of Ozymandias: the quondam ‘king of kings’, lost and unknown in the desert sands.6

So lastly what about words? They can outlast individuals and even cultures, provided that they are kept in a transmissible format. Even lost languages can be later deciphered, although experts have not yet cracked the ancient codes from Harappa in the Punjab.7 Words, especially in printed or nowadays digital format, have immense potential for endurance. Not only are they open to reinterpretation over time but also, via their messages, later generations can commune mentally with earlier ones.

In Jena, the passing Napoleon (then aged 37) was unaware of the watching academic (then aged 36), who was formulating his ideas about revolutionary historical changes through conflict. Yet, through the endurance of his later publications, Hegel, who was unknown in 1806, has now become the second notable personage who was present at the scene. Indeed, via his influence upon Karl Marx, it could even be argued that the German philosopher has become the historically more important figure of those two individuals in Jena on 13 October 1806. On the other hand, Marx’s impact, having been immensely significant in the twentieth century, is also fast fading.

Who from the nineteenth century will be the most famous in another century’s time? Napoleon? Hegel? Marx? (Shelley’s Ozymandias?) Time not only ravages but provides the supreme test.

1  R. Gottlieb, Sarah: The Life of Sarah Bernhardt (New Haven, 2010).

2 R.P. Heitzenrater, ‘John Wesley’s Principles and Practice of Preaching’, Methodist History, 37 (1999), p. 106. See also R. Hattersley, A Brand from the Burning: The Life of John Wesley (London, 2002).

3  W. Lamont, Last Witnesses: The Muggletonian History, 1652-1979 (Aldershot, 2006); C. Hill, B. Reay and W. Lamont, The World of the Muggletonians (London, 1983); E.P. Thompson, Witness against the Beast: William Blake and the Moral Law (Cambridge, 1993).

4 G.W.F. Hegel to F.I. Niethammer, 13 Oct. 1806, in C. Butler (ed.), The Letters: Georg Wilhelm Friedrich Hegel, 1770-1831 (Bloomington, 1984); also transcribed in www.Marxists.org, 2005.

5 See http://napoleon-monuments.eu/Napoleon1er.

6  P.B. Shelley (1792-1822), Ozymandias (1818).

7 For debates over the language or communication system in the ancient Indus Valley culture, see: http://en.wikipedia.org/


MONTHLY BLOG 39, STUDYING THE LONG AND THE SHORT OF HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

A growing number of historians, myself included, want students to study long-term narratives as well as in-depth courses.1 More on (say) the peopling of Britain since Celtic times alongside (say) life in Roman Britain or (say) medicine in Victorian times or (say) the ordinary soldier’s experiences in the trenches of World War I. We do in-depth courses very well. But long-term studies are also vital to provide frameworks.2

Put into more abstract terms, we need more diachronic (long-term) analysis, alongside synchronic (short-term) immersion. These approaches, furthermore, do not have to be diametrically opposed. Courses, like books, can do both.

That was my aim in an undergraduate programme, devised at Royal Holloway, London University.3  It studied the long and the short of one specific period. The choice fell upon early nineteenth-century British history, because it’s well documented and relatively near in time. In that way, the diachronic aftermath is not too lengthy for students to assess within a finite course of study.

Integral to the course requirements were two long essays, both on the same topic X. There were no restrictions, other than analytical feasibility. X could be a real person; a fictional or semi-fictionalised person (like Dick Turpin);4 an event; a place; or anything that lends itself to both synchronic and diachronic analysis. Students chose their own, with advice as required. One essay of the pair then focused upon X’s reputation in his/her/its own day; the other upon X’s long-term impact/reputation in subsequent years.

There was also an examination attached to the course. One section of the paper contained traditional exam questions; the second just one compulsory question on the chosen topic X. Setting that proved a good challenge for the tutor, thinking of ways to compare and contrast short- and long-term reputations. And of course, the compulsory question could not allow a simple regurgitation of the coursework essays; and it had to be equally answerable by all candidates.

Most students decided to examine famous individuals, both worthies and unworthies: Beau Brummell; Mad Jack Mytton; Queen Caroline; Charles Dickens; Sir Robert Peel; Earl Grey; the Duke of Wellington; Harriette Wilson; Lord Byron; Mary Shelley; Ada Lovelace; Charles Darwin; Harriet Martineau; Robert Stephenson; Michael Faraday; Augustus Pugin; Elizabeth Gaskell; Thomas Arnold; Mary Seacole; to name only a few. Leading politicians and literary figures tended to be the first choices. A recent book shows what can be done in the case of the risen (and rising still further) star of Jane Austen.5 In addition, a minority preferred big events, such as the Battle of Waterloo; or the Great Exhibition. None in fact chose a place or building; but it could be done, provided the focus is kept sharp (the Palace of Westminster, not ‘London’).

Studying contemporary reputations encouraged a focus upon newspaper reports, pamphlets, letters, public commemorations, and so forth. In general, students assumed that synchronic reputation would be comparatively easy to research. Yet they were often surprised to find that initial responses to X were confused. It takes time for reputations to become fixed. In particular, where the personage X had a long life, there might well be significant fluctuations during his or her lifetime. The radical John Thelwall, for example, was notorious in 1794, when on trial for high treason, yet largely forgotten at his death in 1834.6

By contrast, students often began by feeling fussed and unhappy about studying X’s diachronic reputation. There were no immediate textbooks to offer guidance. Nonetheless, they often found that studying long-term changes was good fun, because it was more off-the-wall. The web is particularly helpful, as wikipedia often lists references to X in film(s), TV, literature, song(s) and popular culture. Of course, all wiki-leads need to be double-checked. There are plenty of errors and omissions out there.

Nonetheless, for someone wishing to study the long-term reputation of (say) Beau Brummell (1778-1840), wikipedia offers extensive leads, providing many references to Brummell in art, literature, song, film, and sundry stylistic products making use of his name, as well as a bibliography.7

Beau Brummell (1778-1840) from L to R: as seen in his own day; as subject of enquiry for Virginia Woolf (1882-1941); and as portrayed by Stewart Granger in Curtis Bernhardt’s film (1954).

Plus it is crucial to go beyond wikipedia. For example, a search for relevant publications would reveal an unlisted offering. In 1925, Virginia Woolf, no less, published a short tract on Beau Brummell.8 The student is thus challenged to explore what the Bloomsbury intellectual found of interest in the Regency Dandy. Of course, the tutor/ examiner also has to do some basic checks, to ensure that candidates don’t miss the obvious. On the other hand, surprise finds, unanticipated by all parties, proved part of the intellectual fun.

Lastly, the exercise encourages reflections upon posthumous reputations. People in the performing arts and sports, politicians, journalists, celebrities, military men, and notorious criminals are strong candidates for contemporary fame followed by subsequent oblivion, unless rescued by some special factor. The minor horse-thief Dick Turpin, for example, was catapulted from conflicted memory in the eighteenth century into a dashing highwayman by the novel Rookwood (1834). That fictional boost gave his romantic myth another 100 years before it began to fade again.

Conversely, a tiny minority can go from obscurity in their lifetime to later global renown. But it depends crucially upon their achievements being transmissible to successive generations. The artist and poet William Blake (1757-1827) is a rare and cherished example. Students working on the long-and-the-short of the early nineteenth century were challenged to find another contemporary with such a dramatic posthumous trajectory. They couldn’t.

But they and I enjoyed the quest and discovery of unlikely reactions, like Virginia Woolf dallying with Beau Brummell. It provided a new way of thinking about the long-term – not just in terms of grand trends (‘progress’; ‘economic stages’) but by way of cultural borrowings and transmutations between generations. When and why? There are trends but no infallible rules.

1 ‘Teaching History’s Big Pictures: Including Continuity as well as Change’, Teaching History: Journal of the Historical Association, 136 (Sept. 2009), pp. 53-9; and PJC website Pdf/3.

2 My own answers in P.J. Corfield, Time and the Shape of History (2007).

3 RH History Course HS2246: From Rakes to Respectability? Conflict and Consensus in Britain 1815-51 (content now modified).

4 Well shown by J. Sharpe, Dick Turpin: The Myth of the Highwayman (London, 2004).

5 C. Harman, Jane’s Fame: How Jane Austen Conquered the World (Edinburgh, 2009).

6 Two PJC essays on John Thelwall (1764-1834) are available in PJC website, Pdf/14 and Pdf/22.

7 See http://en.wikipedia.org/wiki/Beau_Brummell.

8 See V. Woolf, Beau Brummell (1925; reissued by Folcroft Library, 1972); and http://www.dandyism.net/woolfs-beau-brummell/.


MONTHLY BLOG 38, WHY IS THE LANGUAGE OF ‘RACE’ HOLDING ON SO LONG WHEN IT’S BASED ON A PSEUDO-SCIENCE?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

Of course, most people who continue to use the language of ‘race’ believe that it has a genuine meaning – and a meaning, moreover, that resonates for them. It’s not just an abstract thing but a personal way of viewing the world. I’ve talked to lots of people about giving up ‘race’ and many respond with puzzlement. The terminology seems to reflect nothing more than the way things are.

But actually, it doesn’t. It’s based upon a pseudo-science that was once genuinely believed but has long since been shown to be erroneous by geneticists. So why is this language still used by people who would not dream of insisting that the earth is flat, or the moon made of blue cheese?

Part of the reason is no doubt the power of tradition and continuity – a force of history that is often under-appreciated.1 It’s still possible to hear references to people having the ‘bump of locality’, meaning that they have a strong topographical/spatial awareness and can find their way around easily. The phrase sounds somehow plausible. Yet it’s derived from the now-abandoned study of phrenology. This approach, first advanced in 1796 by the German physician F.J. Gall, sought to analyse people’s characteristics via the contours of the cranium.2  It fitted with the ‘lookism’ of our species. We habitually scrutinise one another to detect moods, intentions, characters. So it may have seemed reasonable to measure skulls for the study of character.

Phrenologist’s view of the human skull: point no. 31 marks the bump of locality, just over the right eyebrow.

Yet, despite confident Victorian publications explaining The Science of Phrenology3 and advice manuals on How to Read Heads,4 these theories turned out to be no more than a pseudo-science. The critics were right after all. Robust tracts like Anti-Phrenology: Or a Chapter on Humbug won the day.5 Nevertheless, some key phrenological phrases linger on. My own partner in life has an exceptionally strong sense of topographical orientation. So sometimes I joke about his ‘bump of locality’, even though there’s no protrusion on his right forehead. It’s just a linguistic remnant of vanished views.

That pattern may apply similarly in the language of race, which is partly based upon a simple ‘lookism’. People who look like us are assumed to be part of ‘our tribe’. Those who do not look like us seem to be ‘a race apart’ (except that they are not). The survival of the phrasing is thus partly a matter of inertia.

Another element may also spring, paradoxically, from opponents of ‘racial’ divisions. They are properly dedicated to ‘anti-racism’. Yet they don’t oppose the core language itself. That’s no doubt because they want to confront prejudices directly. They accept that humans are divided into separate races but insist that all races should be treated equally. It seems logical therefore that the opponent of a ‘racist’ should be an ‘anti-racist’. Statistics of separate racial groups are collected in order to ensure that there is no discrimination.

Yet one sign of the difficulty in all official surveys remains the utter lack of consistency as to how many ‘races’ there are. Early estimates by would-be experts on racial classification historically ranged from a simplistic two (‘black’ and ‘white’) to a complex 63.6 Census and other listings these days usually invent a hybrid range of categories. Some are based upon ideas of race or skin colour; others upon nationality; or a combination of the two. And there are often lurking elements of ‘lookism’ within such categories (‘black British’), dividing people by skin colour, even within the separate ‘races’.7

So people like me who say simply that ‘race’ doesn’t exist (i.e. that we are all one human race) can seem evasive, or outright annoying. We are charged with missing the realities of discrimination and failing to provide answers.

Nevertheless, I think that trying to combat a serious error by perpetrating the same error (even if in reverse) is not the right way forward. The answer to pseudo-racism is not ‘anti-racism’ but ‘one-racism’. It’s ok to collect statistics about nationality or world-regional origins or any combination of such descriptors, but without the heading of ‘racial’ classification and the use of phrases that invoke or imply separate races.

Public venues in societies that historically operated a ‘colour bar’ used the brown paper bag test for quick decisions, admitting people with skins lighter than the bag and rejecting the rest. As a means of classifying people, it’s as ‘lookist’ as phrenology but with even fewer claims to being ‘scientific’. Copyright © Jessica C (Nov. 2013)

What’s in a word? And the answer is always: plenty. ‘Race’ is a short, flexible and easy term to use. It also lends itself to quickly comprehensible compounds like ‘racist’ or ‘anti-racist’. Phrases derived from ethnicity (national identity) sound much more foreign in English. And an invented term like ‘anti-ethnicism’ seems abstruse and lacking instant punch.

All the same, it’s time to find or to create some up-to-date phrases to allow for the fact that racism is a pseudo-science that lost its scientific rationale a long time ago. ‘One-racism’? ‘Humanism’? It’s more powerful to oppose discrimination in the name of reality, instead of perpetrating the wrong belief that we are fundamentally divided. The spectrum of human skin colours under the sun is beautiful, nothing more.

1 On this, see esp. PJC website BLOG/1 ‘Why is the Formidable Power of Continuity so often Overlooked?’ (Nov. 2010).

2 See T.M. Parssinen, ‘Popular Science and Society: The Phrenology Movement in Early Victorian Britain’, Journal of Social History, 8 (1974), pp. 1-20.

3 J.C. Lyons, The Science of Phrenology (London, 1846).

4 J. Coates, How to Read Heads: Or Practical Lessons on the Application of Phrenology to the Reading of Character (London, 1891).

5 J. Byrne, Anti-Phrenology: Or a Chapter on Humbug (Washington, 1841).

6 P.J. Corfield, Time and the Shape of History (London, 2007), pp. 40-1.

7 The image comes from Jessica C’s thoughtful website, ‘Colorism: A Battle that Needs to End’ (12 Nov. 2013): www.allculturesque.com/colorism-a-battle-that-needs-to-end.


MONTHLY BLOG 37, HOW DO PEOPLE RESPOND TO ELIMINATING THE LANGUAGE OF ‘RACE’?

 If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

 Having proposed eliminating from our thoughts and vocabulary the concept of ‘race’ (and I’m not alone in making that suggestion), how do people respond?

Indifference: we are all stardust. Many people these days shrug. They say that the word ‘race’ is disappearing anyway, and what does it matter?

Indeed, a friend with children who are conventionally described as ‘mixed race’ tells me that these young people are not worried by their origins and call themselves, semi-jokingly, ‘mixed ray’. It makes them sound like elfin creatures from the sun and stars – rather endearing really. Moreover, such a claim resonates with the fact that many astro-biologists today confirm that all humans (among all organic and inorganic matter on earth) are ultimately made from trace elements from space – or, put more romantically, from stardust.1

So from a cosmic point of view, there’s no point in worrying over minor surface differences within one species on a minor planet, circling a minor sun, which itself lies in but one quite ordinary galaxy within a myriad of galaxies.

Ethnic pride: On the other hand, we do live specifically here, on earth. And we are a ‘lookist’ species. So others give more complex responses, dropping ‘race’ for some purposes but keeping it for others. Given changing social attitudes, the general terminology seems to be disappearing imperceptibly from daily vocabulary. As I mentioned before, describing people as ‘yellow’ and ‘brown’ has gone. Probably ‘white’ will follow next, especially as lots of so-called ‘whites’ have fairly dusky skins.

‘Black’, however, will probably be the slowest to go. Here there are positive as well as negative reasons. Numerous people from Africa and from the world-wide African diaspora have proudly reclaimed the terminology, not in shame but in positive affirmation.

Battersea’s first ‘black’ Mayor, John Archer (Mayor 1913/14), was a pioneer in that regard. I mentioned him in my previous BLOG (no 35). Archer was a Briton, with Irish and West Indian ancestry. He is always described as ‘black’ and he himself embraced black consciousness-raising. Yet he always stressed his debt to his Irish mother as well as to his Barbadian father.2

In 1918 Archer became the first President of the African Progress Union. In that capacity, he attended meetings of the Pan-African Congress, which promoted African decolonisation and development. The political agenda of activists who set up these bodies was purposive. And they went well beyond the imagery of negritude by using a world-regional nomenclature.

Interestingly, therefore, the Pan-African Congress was attended by men and women of many skin colours. Look at the old photograph (1921) of the delegates from Britain, continental Europe, Africa and the USA (see Illus 1). Possibly the dapper man, slightly to the left of centre in the front row, holding a portfolio, is John Archer himself.

Illus 1: Pan-African Congress delegates in Brussels (1921)

Today, ‘black pride’, which has had a good cultural run in both Britain and the USA, seems to be following, interestingly, in Archer’s footsteps. Not by ignoring differences but by celebrating them – in world-regional rather than skin-colourist terms. Such labels also have the merit of flexibility, since they can be combined to allow for multiple ancestries.

Just to repeat the obvious: skin colour is often deceptive. Genetic surveys reveal high levels of ancestral mixing. As American academic Henry Louis Gates has recently reminded us in The Observer, many Americans with dark skins (35% of all African American men) have European as well as African ancestry. And the same is true, on a lesser scale, in reverse. At least 5% of ‘white’ male Americans have African ancestry, according to their DNA.3

Significantly, people with mixed ethnicities often complain at being forced to choose one or the other (or having choice foisted upon them), when they would prefer, like the ‘Cablinasian’ Tiger Woods, to celebrate plurality. Pride in ancestry will thus outlast and out-invent erroneous theories of separate ‘races’.

Just cognisance of genetic and historic legacies: There is a further point, however, which should not be ignored by those (like me) who generally advocate ‘children of stardust’ universalism. For some social/political reasons, as well as for other medical purposes, it is important to understand people’s backgrounds.

Thus ethnic classifications can help to check against institutionalised prejudice. And they also provide important information in terms of genetic inheritance. To take one well-known example, sickle-cell anaemia (drepanocytosis) is a condition that can be inherited by humans whose ancestors lived in tropical and sub-tropical regions where malaria is or was common.4 It is obviously helpful, therefore, to establish people’s genetic backgrounds as accurately as possible.

All medical and social/political requirements for classification, however, call for just classification systems. One reader of my previous BLOG responded that it didn’t really matter, since if ‘race’ was dropped another system would be found instead. But that would constitute progress. The theory of different human races turned out to be erroneous. Instead, we should enquire about ethnic (national) identity and/or world-regional origins within one common species. Plus we should not use a hybrid mix of definitions, partly by ethnicities and partly by skin colour (as in ‘black Britons’).

Lastly, all serious systems of enquiry should ask about plurality: we have two parents, who may or may not share common backgrounds. That’s the point: men and women from any world-region can breed together successfully, since we are all one species.

1 S. Kwok, Stardust: The Cosmic Seeds of Life (Heidelberg, 2013).

2 For John Richard Archer (1869-1932), see biog. by P. Fryer in Oxford Dictionary of National Biography: on-line; and entry on Archer in D. Dabydeen, J. Gilmore and C. Jones (eds), The Oxford Companion to Black British History (Oxford, 2007), p. 33.

3 The Observer (5 Jan. 2014): New Review, p. 20.

4 M. Tapper, In the Blood: Sickle Cell Anaemia and the Politics of Race (Philadelphia, 1999).


MONTHLY BLOG 35, DONS AND STUDENT-CUSTOMERS? OR THE COMMUNITY OF LEARNERS?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Names matter. Identifying  things – people – events – in realistic terminology means that they are being fully understood and taken seriously. Conversely, it’s warping to the mind and eventually corrosive of good thought to be constantly urged to give lip-service to the ‘wrong’ terms. People who live under dictatorial systems of would-be thought-control often testify to the ‘dead’ feeling that results from public censorship, especially when it is internalised as self-censorship.

By the way, I wrote that paragraph before remembering that this sentiment dovetailed with something I’d read about Confucius. A quick Google-check confirmed my half-memory.  Confucius long ago specified that: ‘The beginning of wisdom is to call things by their proper names’. It’s a great dictum. It doesn’t claim too much. Naming is only the ‘beginning of wisdom’, not the entirety. And there is often scope for debating what is or should be the ‘proper’ name. Nonetheless, Confucius not only highlights the good effect of clear vision, accurately acknowledged to others, but equally implies the malign effects of the reverse. The beginning of madness is to delude oneself and others about the true state of affairs.

Which brings me to my current question: are University students ‘customers’? If so, an interesting implication follows. If ‘the customer is always right’, as the business world asserts but does not always uphold, should not all students get top marks for having completed an assignment or an exam paper? Or, at the very least, not get bad marks?

Interestingly, now that student payments for tuition are very much up-front and personal in the form of fees (which are funded as repayable loans), the standard of degrees is gradually rising. Indeed, grade inflation has become noticeable ever since Britain’s Universities began to be expanded into a mass system. A survey undertaken in 2003 found that the third-class degree has been in steady decline since 1960 and was nearing extinction by 2000.1 And a decade on, the lower second (2.2) in some subjects is following the same trajectory. Better teaching, better study skills, and/or improved exam preparation may account for some of this development. But rising expectations on the part of students – and increasing reputational ambitions on the part of the Universities – also exert subtle pressures upon examiners to be generous.

Nonetheless, even allowing for a changing framework of inputs and outputs, a degree cannot properly be ‘bought’. Students within any given University course are learners, not customers. Their own input is an essential part of the process. They can gain a better degree not by more money but by better effort, well directed, and by better ability, suitably honed.

People learn massively from teachers, but also much from private study, and much too from their fellow-learners (who offer both positive and negative exemplars). Hence the tutors, the individual student, and other students all contribute to each individual’s result.2

A classic phrase for this integrated learning process was ‘the community of scholars’. That phrase now sounds quaint and possibly rather boring. Popularly, scholarship is assumed to be quintessentially dull and pedantic, with the added detriment of causing its devotees to ‘scorn delights and live laborious days,’ in Milton’s killing phrase.3  In fact, of course, learning isn’t dull. Milton, himself a very learned man, knew so too. Nonetheless, ‘the community of scholars’ doesn’t cut the twenty-first century terminological mustard.

But ‘learning’ has a better vibe. It commands ‘light’. People may lust for it, without losing their dignity. And it implies a continually interactive process. So it’s good for students to think of themselves as part of a community of learners. Compared with their pupils, the dons are generally older, sometimes wiser, always much better informed about the curriculum, much more experienced in teaching, and ideally seasoned by their own research efforts. But the academics too are learners, if more advanced along the pathway. They are sharing the experience and their expertise with the students. Advances in knowledge can come from any individual at any level, often emerging from debates and questions, no matter how naive. So it’s not mere pretension that causes many academics to thank in their scholarly prefaces not only their fellow researchers but also their students.

Equally, it’s good for the hard-pressed dons to think of themselves as part of an intellectual community that extends to the students. That concept reasserts an essential solidarity. It also serves to reaffirm the core commitment of the University to the inter-linked aims of teaching and research. Otherwise, the students, who are integral to the process, are seemingly in danger of getting overlooked while the dons are increasingly tugged between the rival pressures of specialist research in the age of Research Assessment, and of managerial business-speak in the age of the University-plc.4

Lastly, reference to ‘community’ need not be too starry-eyed. Ideals may not always work perfectly in practice. ‘Community’ is a warm, comforting word. It’s always assumed to be a ‘good thing’. Politicians, when seeking to commend a policy such as mental health care, refer to locating it in ‘the community’ as though that concept can resolve all the problems. (As is now well proven, it can’t). And history readily demonstrates that not all congregations of people form a genuine community. Social cohesion needs more than just a good name.

That’s why it’s good to think of Universities as containing communities of learners, in order to encourage everyone to provide the best conditions for that basic truth to flourish at its best. That’s far from an easy task in a mass higher-education system. It runs counter to attempts at viewing students as individual consumers. But it’s more realistic as to how teaching actually works well. And calling things by their proper names makes a proper start.

William Hogarth’s satirical Scholars at a Lecture (1736) offers a wry reminder to tutors not to be boring and to students to pay attention.

1 ‘Third Class Degree Dying Out’, Times Higher Education, 5 Sept. 2003: www.timeshighereducation.co.uk/178955/article: consulted 4 Nov. 2013.

2 That’s one reason why performance-related-pay (PRP) for teachers, based upon examination results for their taught courses, remains a very blunt tool for rewarding teaching achievements. Furthermore, all calculations for PRP (to work even approximately justly) need to take account of the base-line from which the students began, to measure the educational ‘value-added’. Without that proviso, teachers (if incentivised purely by monetary reward) should logically clamour to teach only the best, brightest, and most committed students, who will deliver the ‘best’ results.

3 John Milton, Lycidas: A Lament for a Friend, Drowned in his Passage from Chester on the Irish Seas (1637), lines 70-72: ‘Fame is the spur that the clear spirit doth raise/ (That last Infirmity of Noble mind)/ To scorn delights and live laborious days.’

4 On the marketisation of higher education, see film entitled Universities Plc? Enterprise in Higher Education, made by film students at the University of Warwick (2013): www2.warwick.ac.uk/fac
