MONTHLY BLOG 96, WHAT’S WRONG WITH PREHISTORY?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Arthur’s Stone, Herefordshire, dating from c.3000 BCE: photo © Tony Belton, 2016

What’s wrong with ‘prehistory’? Absolutely nothing but the name. People refer to ancient monuments as ‘prehistoric’ and everyone knows roughly what is meant. The illustration (above) shows an ancient burial tomb, known as Arthur’s Stone, dating from 3000 BCE, which I visited in Herefordshire on a summer day in 2016. It did and does indeed look truly venerable. So loose terms such as ‘prehistoric’ are passable enough if used casually.

But ‘prehistory’ as a scholarly term in application to a prolonged period of human history? Seriously misleading. It implies that the long aeons of foundational human history, before the advent of literacy, somehow occurred in a separate ante-chamber to the ‘real’ deal.

The acquiring of skills in reading and writing (which occurred in different parts of the world at different times) was in fact part of a lengthy process of human adaptation and invention. Before literacy, key developments included: the adoption of clothing; the taming of fire; the invention of tools; the refinement of tools and weapons with handles; the invention of the wheel; the arrival of speech; the advent of decorative arts; the formulation of burial rituals; the domestication of animals; the development of a calendrical consciousness; the capacity to cope with population fluctuations including survival during the Ice Age; the start of permanent settlements and farming; and the cumulative mental and cultural preparation for the invention of reading and writing. Some list! The pace of change was often slow; but the changes were absolutely foundational to human history.1

In practice, of course, the skilled and ingenious experts, who study pre-literate societies, do not consider their subject to be anything other than fully and deeply historical. They use ‘prehistory’ because it is a known term of art. (Often, indeed, they may start their lectures and books with a jovial disclaimer that such terminology should not be taken literally). The idea of ‘prehistory’ was crystallised by Victorian historians, who were developing a deep reverence for the importance of written sources for writing ‘real’ history. But the differences in prime source material, although methodologically significant, are not fundamental enough to deprive the foundational early years of the full status of history. And, in fact, these days historians of all periods study a range of sources. They are not just stuck in archives, reading documents – important as those are. If relevant to their theme, historians may examine buildings, art, artefacts, materials, bones, refuse, carbon datings, statistical extrapolations, and/or genetic evidence (etc etc), just as do archaeologists and ‘prehistorians’.

Moreover, conventional references to ‘prehistory’ have now been blind-sided by the recent return to diachronic (through-time) studies of what is known as Big History. This approach to the past takes as its remit either the whole of the cosmos or at least the whole lifespan of Planet Earth.2 It draws upon insights from cosmologists and astro-physicists, as well as from geologists and biologists. After all, a lot of history had indeed happened before the first humans began to walk. So what are the millennia before the advent of homo sapiens to be entitled? Pre-prehistory? Surely not. All these eras form part of what is sometimes known as ‘deep history’: a long time ago but still historical.

So why has the misleading term ‘prehistory’ survived for so long? One major reason lies in the force of inertia – or institutional continuity, to give it a kinder name. ‘Prehistory’ has prevailed as an academic terminology for over a century. It appears in the names of academic departments, research institutions, learned societies, job descriptions, teaching courses, examination papers, academic journals, books, blogs, conferences, publishers’ preferences for book titles, and popular usages – let alone in scholars’ self-definitions. Little wonder that renaming is not a simple matter. Nonetheless, subjects are continuously being updated – so why not a further step now?

I was prompted to write on this question when three congenial colleagues asked me, a couple of years ago, to contribute to a volume on Time & History in Prehistory (now available, with publication date 2019).3 I was keen to respond but hostile to the last word in their book title. My answer took the form of arguing that this specialist section of historical studies needs a new and better name. I am grateful for the editors’ forbearance in accepting my contribution. It contributes to debates elsewhere within the volume, since criticising the terminology of ‘prehistory’ is not new.4

Apart from the lack of logic in apparently excluding the foundational experiences of the human species from ‘real’ history, my own further objection is that the division inhibits diachronic analysis of the long term. A surviving relic from ‘prehistoric’ times, like Arthur’s Stone, has a long and intriguing history which still continues. At some stage long before the thirteenth century CE, the modest monument, high on a ridge between the Wye and Golden Valleys, became associated in popular legend with the feats of King Arthur. (Did he win a battle there, rumour speculated, or slay a giant?) That invented linkage is in itself a fascinating example of the spread of the Arthurian legend.4

The site later witnessed some real-life dramas. In the fifteenth century, a knight was killed there in a fatal duel. And in September 1645 the embattled Charles I dined at the Stone with his royalist troops. Perhaps he intended the occasion as a symbolic gesture, although it did not confer upon him sufficient pseudo-Arthurian lustre to defeat Cromwell and the Roundheads.

For the villagers in nearby Dorstone and Bredwardine, Arthur’s Stone at some stage (chronology uncertain) became a venue for popular festivities, with dancing and ‘high jinks’ every midsummer. This long-standing tradition continued until well into Victorian times. As a sober counter-balance, too, the local Baptists in the nineteenth and twentieth centuries organised an ecumenical religious service there each June/July. Living witnesses remember these as occasions of fervent al fresco hymn-singing. Implicitly, they were acknowledging the Stone’s sacral nature, whilst simultaneously purging its pagan associations.

When visiting the Stone myself in 2016, I met by chance a local resident, named Ionwen Williams. In a stroke of research serendipity, we got chatting and she zestfully recounted her memories, as a child before World War II, of joining her schoolfellows to sing hymns at the site each midsummer. This experience and many later visits confirmed for her the special nature of the place. I did not for a moment doubt her memories; but, as a prudent historian, thought it helpful to cross-check – and found them corroborated.

It is abundantly clear that, throughout its five thousand years of existence, Arthur’s Stone has had multiple meanings for the witnessing generations. At one sad stage in the late nineteenth century, it was pillaged by builders taking stones for new constructions. But local objections put a stop to that; and it is now guarded by English Heritage. It is utterly historic, not separately ‘prehistoric’: and the same point applies to all long-surviving monuments, many of which are much bigger and more famous than Arthur’s Stone. Furthermore, deep continuities apply to many other aspects of human history – and not just to physical monuments. For example, there are many claims and counter-claims about the foundations of human behaviour which merit debate, without compartmentalising the eras of pre-literacy from those of post-literacy.

Lastly, what alternative nomenclature might apply? Having in the first draft of my essay rebuked the specialists known as ‘prehistorians’ for not changing their name, I was challenged by the editors to review other options. Obviously it’s not for one individual to decide. It was, however, a good challenge. In many ways, these early millennia might be termed ‘foundational’ in human history. That, after all, is what they were. On the other hand, ‘foundational history’ sounds like a first-year introduction course. Worthy but not very evocative. My essay reviews various options and plumps for ‘primeval’ history. That term not only sounds ancient but signals primacy: in human history, these years came first.5 The contributions within the volume as a whole are questioning and challenging throughout, as they analyse different aspects of Time and, yes, ‘History’. It is a pleasure to join these essays in thinking long.6

1 For an enticing introduction (apart from one word in its subtitle), see C. Gamble, Timewalkers: The Prehistory of Global Colonisation (Sutton: Stroud, 1993).

2 For an introduction, see D.G. Christian, Maps of Time: An Introduction to Big History (U. of California Press: Berkeley, 2004).

3 S. Souvatzi, A. Baysal and E.L. Baysal (eds), Time and History in Prehistory (Routledge: Abingdon, 2019).

4 N.J. Lacy (ed.), The New Arthurian Encyclopaedia (Garland: New York, 1991).

5 P.J. Corfield, ‘Primevalism: Saluting a Renamed Prehistory’, in Souvatzi, Baysal and Baysal (eds), Time and History, pp. 265-82. My own interest in ‘long ago’ was sparked when, as a teenager, I read a study by Ivar Lissner, entitled The Living Past (Cape: London, 1957): for which see P.J. Corfield, ‘An Unknown Book Which Influenced Me’, BLOG no.14 (Nov. 2011).

6 On this theme, see J. Guldi and D. Armitage, The History Manifesto (Cambridge University Press: Cambridge, 2014); P.J. Corfield, ‘What on Earth is the “Temporal Turn” and Why is it Happening Now?’, BLOG no.49 (Jan. 2015); and idem, ‘Thinking Long: Studying History’, BLOG no.94 (Oct. 2018), all BLOGs available on www.penelopejcorfield.com/monthly-blogs.


MONTHLY BLOG 95, ‘WHAT IS THE GREATEST SIN IN THE WORLD?’ CHRISTOPHER HILL AND THE SPIRIT OF EQUALITY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Text of short talk given by PJC to introduce the First Christopher Hill Memorial Lecture (given by Prof. Justin Champion) at Newark National Civil War Centre, on Saturday 3 November 2018.

Christopher Hill was not only a remarkable historian – he was also a remarkable person.1 All his life, he believed, simply and staunchly, in human equality. But he didn’t wear his beliefs on his sleeve. At first meeting, you would have found him a very reserved, very solid citizen. And that’s because he was very reserved – and he was solid in the best sense of that term. He was of medium height, so did not tower over the crowd. But he held himself very erect; had a notably sturdy, broad-shouldered Yorkshire frame; and was very fit, cycling and walking everywhere. And in particular, Christopher Hill had a noble head, with a high forehead, quizzical eyebrows, and dark hair which rose almost vertically – giving him, especially in his later years, the look of a wise owl.

Christopher Hill (L) in his thirties and (R) in his seventies

By the way, he was not a flashy dresser. The Hill family motto was ‘No fuss’. And, if you compare the two portraits of him in his 30s and his 70s, you could be forgiven for thinking that he was wearing the same grey twill jacket in both. (He wasn’t; but he certainly stuck to the same style all his life).

Yet even while Christopher Hill was reserved and dignified, he was also a benign figure. He had no side. He did not pull rank. He did not demand star treatment. He was courteous to all – and always interested in what others had to say. That was a key point. As Master of Balliol, Hill gave famous parties, at which dons and students mingled; and he was often at the centre of a witty crowd. But just as much, he might be found in a corner of the room discussing the problems of the world with a shy unknown.

As I’ve already said, Christopher Hill believed absolutely in the spirit of equality. But he did know that it was a hard thing to achieve – and that was why he loved the radicals in the English civil wars of the mid-seventeenth century. They were outsiders who sought new ways of organising politics and religion. Indeed, they struggled not only to define equality – but to live it. And, although there was sometimes a comic side to their actions, he admired their efforts.

When I refer to unintentionally comic aspects, I am thinking of those Ranters, from the radical and distinctly inchoate religious group, who jumped up in church and threw off their clothes as a sign. The sign was that they were all God’s children, equal in a state of nature. Not surprisingly, such behaviour attracted a lot of criticism – and satirists had good fun at their expense.

Well, Christopher Hill was far too dignified to go around throwing off his clothes. But he grew up believing in a radical form of Methodism, which stressed that ‘we are all one in the eyes of the Lord’. As I’ve said, his egalitarianism came from within. But he was clearly influenced by his Methodist upbringing. His parents were kindly people, who lived simply and modestly (neither too richly nor too poorly). They didn’t drink, didn’t smoke, didn’t swear and didn’t make whoopee. Twice and sometimes even three times on Sundays, they rode their bikes for several miles to and from York’s Central Methodist Chapel; and then discussed the sermon over lunch.

In his mid-teens, Hill was particularly inspired by a radical Methodist preacher. He was named T.S. Gregory and he urged a passionate spiritual egalitarianism. Years later, Hill reproduced for me Gregory’s dramatic pulpit style. He almost threw himself across the lectern and spoke with great emphasis: ‘Go out into the streets – and look into the eyes of every fellow sinner, even the poorest beggar or the most abandoned prostitute; [today he would add look under the hoods of the druggies and youth gangs]; look into these outcast faces and in every individual you will see elements of the divine.’ The York Methodists, from respectable middle class backgrounds, were nonplussed. But Hill was deeply stirred. For him, Gregory voiced a true Protestantism – which Hill defined as wine in contrast with what he saw as the vinegar and negativism of later Puritanism.

The influence of Gregory was, however, not enough to prevent Hill in his late teens from losing his religious faith. My mother, Christopher’s younger sister, was very pleased at this news, as she welcomed his reinforcement. She herself had never believed in God, even though she too went regularly to chapel. But their parents were sincerely grieved. On one occasion, there was a dreadful family scene, when Christopher, on vacation from Oxford University, took his younger sister to the York theatre. Neither he nor my mother could later remember the show. But they both vividly recalled their parents’ horror: going to the theatre – abode of the devil! Not that the senior Hills shouted or rowed. That was not their way. But they conveyed their consternation in total silence … which was difficult for them all to overcome.

As he lost his faith, Hill converted to a secular philosophy, which had some elements of a religion to it. That was Marxism. Accordingly, he joined the British Communist Party. And he never wavered in his commitment to a broad-based humanist Marxism, even when he resigned from the CP in 1956. Hill was not at all interested in the ceremonies and ritual of religion. The attraction of Marxism for him was its overall philosophy. He was convinced that the revolutionary unfolding of history would eventually remove injustices in this world and usher in true equality. Hill sought what we would call a ‘holistic vision’. But the mover of change was now History rather than God.

On those grounds, Hill for many years supported Russian communism as the lead force in the unfolding of History. In 1956, however, the Soviet invasion of Hungary heightened a fierce internal debate within the British Communist Party. Hill and a number of his fellow Marxist historians struggled to democratise the CP. But they lost and most of them thereupon resigned.

This outcome was a major blow to Hill. Twice he had committed to a unifying faith and twice he found its worldly embodiment unworthy. Soviet Communism had turned from intellectual inspiration into a system based upon gulags, torture and terror. Hill never regretted his support for Soviet Russia during the Second World War; but he did later admit that, afterwards, he had supported Stalinism for too long. The mid-1950s was an unhappy time for him both politically and personally. But, publicly, he did not wail or beat his breast. Again, that was not the Hill way.

He did not move across the political spectrum, as some former communists did, to espouse right-wing causes. Nor did he become disillusioned or bitter. Nor indeed, did he drop everything to go and join a commune. Instead, Hill concentrated even more upon his teaching and writing. He did actually join the Labour Party. Yet, as you can imagine, his heart was not really in it.

It was through his historical writings, therefore, that Hill ultimately explored the dilemmas of how humans could live together in a spirit of equality. The seventeenth-century conflicts were for him seminal. Hill did not seek to warp history to fit his views. He could not make the radicals win, when they didn’t. But he celebrated their struggles. For Hill, the seventeenth-century religious arguments were not arid but were evidence of the sincere quest to read God’s message. He had once tried to do that himself. And the seventeenth-century political contests were equally vivid for him, as he too had been part of an organised movement which had struggled to embody the momentum of history.

As I say, twice his confidence in the worldly formulations of his cause failed. Yet his belief in egalitarianism did not. Personally, he became happy in his second marriage; and he immersed himself in his work as a historian. From being a scholar who wrote little, he became super-productive. Books and essays poured from his pen. Among those he studied was the one seventeenth-century radical who appealed to him above all others: Gerrard Winstanley, the Digger, who founded an agrarian commune in the Surrey hills. And the passage in Winstanley’s Law of Freedom (1652) that Hill loved best was dramatic in the best T.S. Gregory style. ‘What is the greatest sin in the world?’ demanded Winstanley. And he answered emphatically that it is for rich people to hoard gold and silver, while poor people suffer from hunger and want.

What Hill would say today, about the ever-widening inequalities across the world, is not hard to guess. But he would also say: don’t lose faith in the spirit of equality. It is a basic tenet of human life. And all who believe in fair dos for all, as part of true freedom, should strive to find our own best way, individually and/or collectively, to do our best for our fellow humans and to advance Hill’s Good Old Cause.

1 For documentation, see P.J. Corfield, ‘“We are all One in the Eyes of the Lord”: Christopher Hill and the Historical Meanings of Radical Religion’, History Workshop Journal, 58 (2004), pp. 110-27. Now posted on PJC personal website as Pdf5; and further web-posted essays PJC Pdf47-50, all on www.penelopejcorfield.co.uk


MONTHLY BLOG 94, THINKING LONG – STUDYING HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

History is a subject that deals in ‘thinking long’. The human capacity to think beyond the immediate instant is one of our species’ most defining characteristics. Of course, we live in every passing moment. But we also cast our minds, retrospectively and prospectively, along the thought-lines of Time, as we mull over the past and try to anticipate the future. It’s called ‘thinking long’.

Studying History (indicating the field of study with a capital H) is one key way to cultivate this capacity. Broadly speaking, historians focus upon the effects of unfolding Time. In detail, they usually specialise upon some special historical period or theme. Yet everything is potentially open to their investigations.

Sometimes indeed the name of ‘History’ is invoked as if it constitutes an all-seeing recording angel. So a controversial individual in the public eye, fearing that his or her reputation is under a cloud, may proudly assert that ‘History will be my judge’. Quite a few have made such claims. They express a blend of defiance and optimism. Google ‘History will justify me’ and a range of politicians, starting with Fidel Castro in 1953, come into view. However, there’s no guarantee that the long-term verdicts will be kinder than any short-term criticisms.

True, there are individuals whose reputations have risen dramatically over the centuries. The poet, painter and engraver William Blake (1757-1827), virtually unknown in his own lifetime, is a pre-eminent example. Yet the process can happen in reverse. So there are plenty of people, much praised at the start of their careers, whose reputations have subsequently nose-dived and continue that way. For example, some recent British Prime Ministers may fall into that category. Only Time (and the disputatious historians) will tell.

Fig. 1 William Blake’s Recording Angel has about him a faint air of an impish magician as he points to the last judgment. If this task were given to historians, there would be a panel of them, arguing amongst themselves.

In general, needless to say, those studying the subject of History do not define their tasks in such lofty or angelic terms. Their discipline is distinctly terrestrial and Time-bound. It is prone to continual revision and also to protracted debates, which may be renewed across generations. There’s no guarantee of unanimity. One old academic anecdote imagines the departmental head answering the phone with the majestic words: ‘History speaking’.1 These days, however, callers are likely to get no more than a tinny recorded message from a harassed administrator. And academic historians in the UK today are themselves being harried not to announce god-like verdicts but to publish quickly, in order to produce the required number of ‘units of output’ (in the assessors’ unlovely jargon) in a required span of time.

Nonetheless, because the remit of History is potentially so vast, practitioners and students have unlimited choices. As already noted, anything that has happened within unfolding Time is potentially grist to the mill. The subject resembles an exploding galaxy – or, rather, like the cosmos, the sum of many exploding galaxies.

Tempted by that analogy, some practitioners of Big History (a long-span approach to History which means what it says) do take the entire universe as their remit, while others stick merely to the history of Planet Earth.2 Either way, such grand approaches are undeniably exciting. They require historians to incorporate perspectives from a dazzling range of other disciplines (like astro-physics) which also study the fate of the cosmos. Thus Big History is one approach to the subject which very consciously encourages people to ‘think long’. Its analysis needs careful treatment to avoid being too sweeping and too schematic chronologically, as the millennia rush past. But, in conjunction with shorter in-depth studies, Big History gives advanced students a definite sense of temporal sweep.

Meanwhile, it’s also possible to produce longitudinal studies that cover one impersonal theme, without having to embrace everything. Thus there are stimulating general histories of the weather,3 as well as more detailed histories of weather forecasting, and/or of changing human attitudes to weather. Another overarching strand studies the history of all the different branches of knowledge that have been devised by humans. One of my favourites in this genre is entitled: From Five Fingers to Infinity.4 It’s a probing history of mathematics. Expert practitioners in this field usually stress that their subject is entirely ahistorical. Nonetheless, the fascinating evolution of mathematics throughout the human past to become one globally-adopted (non-verbal) language of communication should, in my view, be a theme to be incorporated into all advanced courses. Such a move would encourage debates over past changes and potential future developments too.

Overall, however, the great majority of historians and their courses in History take a closer focus than the entire span of unfolding Time. And it’s right that the subject should combine in-depth studies alongside longitudinal surveys. The conjunction of the two provides a mixture of perspectives that help to render intelligible the human past. Does that latter phrase suffice as a summary definition?5 Most historians would claim to study the human past rather than the entire cosmos.

Yet actually that common phrase does need further refinement. Some aspects of the human past – the evolving human body, for example, or human genetics – are delegated for study by specialist biologists, anatomists, geneticists, and so forth. So it’s clearer to say that most historians focus primarily upon the past of human societies in the round (ie. including everything from politics to religion, from war to economics, from illness to health, etc etc). And that suffices as a definition, provided that insights from adjacent disciplines are freely incorporated into their accounts, wherever relevant. For example, big cross-generational studies by geneticists are throwing dramatic new light upon the history of human migration around the globe and also of intermarriage within the complex range of human species and the so-called separate ‘races’ within them.6 Their evidence amply demonstrates the power of longitudinal studies for unlocking both historical and current trends.

The upshot is that the subject of History can cover everything within the cosmos; that it usually concentrates upon the past of human societies, viewed in the round; and that it encourages the essential human capacity for thinking long. For that reason, it’s a study for everyone. And since all people themselves constitute living histories, they all have a head-start in thinking through Time.7

1 I’ve heard this story recounted of a formidable female Head of History at the former Bedford College, London University; and the joke is also associated with Professor Welch, the unimpressive senior historian in Kingsley Amis’s Lucky Jim: A Novel (1953), although upon a quick rereading today I can’t find the exact reference.

2 For details, see the website of the Big History’s international learned society (founded 2010): www.ibhanet.org. My own study of Time and the Shape of History (2007) is another example of Big History, which, however, proceeds not chronologically but thematically.

3 E.g. E. Durschmied, The Weather Factor: How Nature has Changed History (2000); L. Lee, Blame It on the Rain: How the Weather has Changed History (New York, 2009).

4 F.J. Swetz (ed.), From Five Fingers to Infinity: A Journey through the History of Mathematics (Chicago, 1994).

5 For meditations on this theme, see variously E.H. Carr, What is History? (Cambridge 1961; and many later edns); M. Bloch, The Historian’s Craft (in French, 1949; in English transl. 1953); B. Southgate, Why Bother with History? Ancient, Modern and Postmodern Motivations (Harlow, 2000); J. Tosh (ed.), Historians on History: An Anthology (2000; 2017); J. Black and D.M. MacRaild, Studying History (Basingstoke, 2007); H.P.R. Finberg (ed.), Approaches to History: A Symposium (2016).

6 See esp. L.L. Cavalli-Sforza and F. Cavalli-Sforza, The Great Human Diasporas: The History of Diversity and Evolution, transl. by S. Thomas (Reading, Mass., 1995); D. Reich, Who We Are and How We Got Here: Ancient DNA and the New Science of the Human Past (Oxford, 2018).

7 P.J. Corfield, ‘All People are Living Histories: Which is why History Matters’. A conversation-piece for those who ask: Why Study History? (2008) in London University’s Institute of Historical Research Project, Making History: The Discipline in Perspective www.history.ac.uk/makinghistory/resources/articles/why_history_matters.html; and also available on www.penelopejcorfield.co.uk/Pdf1.


MONTHLY BLOG 93, HOW TO STUDY HISTORIANS: HISTORIOLOGY, NOT HISTORIOGRAPHY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Historian at work:
Scribble, Scribble, Scribble
– with acknowledgement to Shutterstock 557773132

‘Always scribble, scribble, scribble! Eh, Mr Gibbon?’ This kindly put-down, delivered by the Duke of Gloucester to Edward Gibbon in 1781, has become the classic response of a lackadaisical onlooker, who had just been presented with a new volume of Decline and Fall by its industrious author. And Gibbon, historian-scribbler par excellence, has had the last laugh. His works are still in print. And the noble Duke, the younger brother of George III, is today unknown, except for this exchange.

His remark may stand proxy for the bafflement which is often the public response to the hard work behind the historian’s scribbles. Readers primarily study History to learn about the immense stock of past human experience. But it’s always wise to check the sources behind any given interpretation. In these days when the public is rightly being re-alerted to the risk of fake news (NOT a recent invention), people should be similarly aware of the dangers of unduly biased histories as well as fake documentation on-line and fake information on social media.

With such thoughts in mind, the historian E.H. Carr, a canny expert on Soviet Russia, offered famously brisk advice: ‘Study the historian before you begin to study the facts’.1 In practice, however, such a leisurely two-step procedure is not really feasible. (Quite apart from the challenges in demarcating ‘facts’ from interpretations). History readers are generally not greatly interested in the lives of historians, which are rarely (if ever) as exciting as the History which they study.

In practice, therefore, the public tends to rely upon book reviewers to highlight particularly notable points in an individual historian’s approach – and upon book publishers to vet the general standard. (And, yes: there is a rigorous process of assessment behind the scenes). At degree level, however, History students need to know about the formation of their discipline and how to apply best practice. Thus every advanced thesis or dissertation is expected to start with a critical review of the main debates surrounding the chosen subject, with measured reflections upon the viewpoints of all the leading protagonists.

So how can students best be trained in this art? It’s often done via old-hat courses labelled Historiography. These courses introduce famous historians in roughly chronological order, replete with details of who wrote what when, and with what basic approach. There are some helpful overview guides.2 Yet fellow historians tend to find such studies far more interesting as a genre than do students. Instead, undergraduates often complain that old-style Historiography courses are boring, hard to assimilate, and unclear in their overall pedagogic message.

Moreover, today the biographical/historiographical approach has been rendered impracticable by the twentieth-century burgeoning of professional History. Once, students could be frogmarched through Gibbon, Macaulay, Lord Acton, and, with a nod to internationalism, Leopold von Ranke. With academic expansion, however, the terms of trade have altered. Globally, there are thousands of practising historians. Students are habitually given reading lists of up to 20 books and articles for each separate essay which they are required to write. Clearly, they cannot give equal attention to every author. Nor should they try.

Academics in Britain today are regularly assessed, in a national regime of utilitarian scrutiny which verges on the oppressive. There is less scope for individual idiosyncrasy, let alone real eccentricity. Thus, while there are significant interpretational differences, the major variations are between schools of thought.

Hence courses on Historiography should mutate into parallel courses on Historiology. (The name’s abstruse but the practice is not). Such courses introduce the rich matrix of concepts and approaches which coalesce and jostle together to create the discipline of History as practised today. As a result, students are alerted to the different schools of thought, emerging trends of scholarship, and great debates within and about the subject.3

Individual historians may still appear in the narrative, to exemplify relevant trends. For example, any assessment of the Marxist contribution to British history-writing will include the role of E.P. Thompson (1924-90), author of The Making of the English Working Class (1st pub. 1963; and still in print). Yet he was no orthodox follower of Karl Marx. (Indeed, Thompson in his later days sometimes called himself a post-Marxist). Instead, his approach was infused by the practice of empathy, as derived from thinkers like Wilhelm Dilthey (1833-1911) and adopted in the new discipline of anthropology.4 Hence E.P. Thompson appears in Historiology courses under more than one heading. He is also an exemplar of the impact of cultural anthropology upon historical studies. In other words, his own ‘making’ was complex – and students are invited to assess how Thompson fused two different intellectual traditions into his version of cultural Marxism.5

A good foundational course in Historiology should thus provide a broad overview of the growth and diversity of the discipline. Its organisation should be thematic, not biographical. Relevant topics include: (1) the pioneering of source citation and footnoting; (2) the nineteenth-century development of professional research standards and the move into the archives; (3) the contribution of Whig-liberal views of progress; (4) countervailing theories of decline and fall; (5) the impact of Lewis Namier and the first iteration of structuralism; (6) the input from Marxism; (7) the role of ‘empathy’ and input from cultural anthropology; (8) the impact of feminism(s); (9) the focus upon ‘identity’, whether social, sexual, ethnic, imperial, colonial, post-colonial, religious, or any other; (10) structuralism and its refinement into Foucauldian poststructuralism; (11) the postmodernist challenge, peaking in the 1990s, and the historians’ answers to the same; and (12) the current quest for re-synthesis: from micro-history to Big History, big data, global history, and public history. (With other specialist themes to be added into related courses tailored for sub-specialisms such as art history, economic history, and so forth).

It’s crucial, meanwhile, that the teaching of historical skills and methodologies is fully incorporated into Historiology. Theories and praxis are best understood and taught together. There has been much recent pressure, chiefly coming from outside the discipline, to teach ‘Skills’ separately. It looks suitably utilitarian in brochures. But it makes for poor teaching. Courses that jump from one skill to another – today, empathy; next week, databases; the week after, using archives – are very hard for students to assimilate. To repeat my words from 2010: ‘People cannot learn properly from skills taught in a vacuum. At best they have a half-knowledge of what to do – and at worst they have forgotten – which means that later they have to learn the same skills all over again.’6

Lastly, the name of ‘Historiology’ needs a user-friendly makeover. If nothing else emerges, call it simply History’s ‘Core’ or ‘Foundation’ course. Ideally, however, it needs a ‘big’ compendious name. It takes ‘Big-History-Skills-Concepts’ all taught together to illuminate the eclectic operational framework of today’s ever-busy and ever-argumentative historians.

ENDNOTES:

1 E.H. Carr, What is History? (1961; in second edn. 1964), p. 23.

2 See e.g. C. Parker, The English Historical Tradition since 1850 (1990).

3 Four exemplary studies are reviewed in P.J. Corfield, ‘How Historiology Defines History’ (2008), in PJC website www.penelopejcorfield.co.uk/Pdf4.

4 I.N. Bulhof, Wilhelm Dilthey: A Hermeneutic Approach to the Study of History and Culture (1980), esp. pp. 1-23.

5 See B.D. Palmer, The Making of E.P. Thompson: Marxism, Humanism and History (1981); H.J. Kaye, The British Marxist Historians: An Introductory Analysis (1984), esp. pp. 167-220; P.J. Corfield, ‘E.P. Thompson: An Appreciation’, New Left Review, no 201 (Sept/Oct 1993), pp. 10-17, repr. in PJC website www.penelopejcorfield.co.uk/Pdf45; and C. Efstathiou, E.P. Thompson: A Twentieth-Century Romantic (2015).

6 PJC, ‘What should a New Government do about the Skills Agenda in Education Policy?’ (BLOG/1, Oct. 2010), in PJC, https://www.penelopejcorfield.com/monthly-blogs/.


MONTHLY BLOG 92, HISTORIANS AT WORK THROUGH TIME

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Historians, who study the past, don’t undertake this exercise from some vantage point outside Time. They, like everyone else, live within an unfolding temporality. That’s very fundamental. Thus it’s axiomatic that historians, like their subjects of study, are all equally Time-bound.1

Nor do historians undertake the study of the past in one single moment in time. Postmodernist critics of historical studies sometimes write as though historical sources are culled once only from an archive and then adopted uncritically. The implied research process is one of plucking choice flowers and then pressing them into a scrap-book to some pre-set design.

On such grounds, critics of the discipline highlight the potential flaws in all historical studies. Sources from the past are biased, fallible and scrappy. Historians in their retrospective analysis are also biased, fallible and sometimes scrappy. And historical writings are literary creations only just short of pure fiction.2

Historians should welcome this dose of scepticism – always a useful corrective. Yet they entirely reject the proposition that trying to understand bygone eras is either impossible or worthless. Rebuttals to postmodernist scepticism have been expressed theoretically;3 and also directly, via pertinent case studies which cut through the myths and ‘fake news’ which often surround controversial events in history.4

When at work, historians should never take their myriad of source materials literally and uncritically. Evidence is constantly sought, interrogated, checked, cross-checked, compared and contrasted, as required for each particular research theme. The net is thrown widely or narrowly, again depending upon the subject. Everything is a potential source, from archival documents to art, architecture, artefacts and through the gamut to witness statements and zoological exhibits. Visual materials can be incorporated either as primary sources in their own right, or as supporting documentation. Information may be mapped and/or tabulated and/or statistically interrogated. Digitised records allow the easy selection of specific cases and/or the not-so-easy processing of mass data.
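
By way of illustration (a hypothetical sketch, not drawn from the original essay: the records, field names and query below are all invented), a few lines of Python suggest what ‘the easy selection of specific cases’ from digitised records looks like in practice:

```python
# Hypothetical digitised archival records -- every name, date and
# occupation here is invented purely for illustration.
records = [
    {"name": "J. Smith", "parish": "Dorstone",    "year": 1786, "occupation": "weaver"},
    {"name": "M. Jones", "parish": "Bredwardine", "year": 1791, "occupation": "baker"},
    {"name": "A. Brown", "parish": "Dorstone",    "year": 1812, "occupation": "weaver"},
]

# The 'easy selection of specific cases': one line retrieves every
# eighteenth-century weaver, a query that once meant days in the archives.
weavers_c18 = [r for r in records if r["occupation"] == "weaver" and r["year"] < 1800]
print(weavers_c18)

# The 'not-so-easy processing of mass data' begins when such lists run to
# millions of entries and demand cleaning, record-linkage and statistical care.
```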

As a result, researching and writing history is a slow through-Time process – sometimes tediously so. It takes at least four years, from a standing start, to produce a big specialist, ground-breaking study of 100,000 words on a previously un-studied (or under-studied) historical topic. The exercise demands a high-level synthesis of many diverse sources, running to hundreds or even thousands. Hence the methodology is characteristically much more than a ‘reading’ of one or two key texts – although, depending upon the theme, at times a close reading of a few core documents (as in the history of political ideas) is essential too.

Mulling over meanings is an important part of the process too. History as a discipline encourages a constant thinking and rethinking, with sustained creative and intellectual input. It requires knowledge of the state of the discipline – and a close familiarity with earlier work in the chosen field of study. Best practice therefore enjoins writing, planning and revising as the project unfolds. For historical studies, ‘writing through’ is integral, rather than waiting until all the hard research graft is done and then ‘writing up’.5

The whole process is arduous and exciting, in almost equal measure. It’s constantly subject to debate and criticism from peer groups at seminars and conferences. And, crucially too, historians are invited to specify not only their own methodologies but also their own biases/assumptions/framework thoughts. This latter exercise is known as ‘self-reflexivity’. It’s often completed at the end of a project, although it’s then inserted near the start of the resultant book or essay. And that’s because writing serves to crystallise and refine (or sometimes to reject) the broad preliminary ideas, which are continually tested by the evidence.

One classic example of seriously through-Time writing comes from the historian Edward Gibbon. The first volume of his Decline & Fall of the Roman Empire appeared in February 1776. The sixth and final one followed in 1788. According to his autobiographical account, the gestation of his study dated from 1764. He was then sitting in the Forum at Rome, listening to Catholic monks singing vespers on Capitol Hill. The conjunction of ancient ruins and later religious commitments prompted his core theme, which controversially deplored the role of Christianity in the ending of Rome’s great empire. Hence the ‘present’ moments in which Gibbon researched, cogitated and wrote stretched over more than 20 years. When he penned the last words of the last volume, he recorded a sensation of joy. But then he was melancholic that his massive project was done.6 (Its fame and the consequent controversies live on today; and form part of the history of history).

1 For this basic point, see PJC, ‘People Sometimes Say “We Don’t Learn from the Past” – and Why that Statement is Completely Absurd’, BLOG/91 (July 2018), to which this BLOG/92 is a companion-piece.

2 See e.g. K. Jenkins, ReThinking History (1991); idem (ed.), The Postmodern History Reader (1997); C.G. Brown, Postmodernism for Historians (Harlow, 2005); A. Munslow, The Future of History (Basingstoke, 2010).

3 J. Appleby, L. Hunt and M. Jacob, Telling the Truth about History (New York, 1994); R. Evans, In Defence of History (1997); J. Tosh (ed.), Historians on History (Harlow, 2000); A. Brundage, Going to the Sources: A Guide to Historical Research and Writing (Hoboken, NJ., 2017).

4 H. Shudo, The Nanking Massacre: Fact versus Fiction – A Historian’s Quest for the Truth, transl. S. Shuppan (Tokyo, 2005); Vera Schwarcz, Bridge across Broken Time: Chinese and Jewish Cultural Memory (New Haven, 1998).

5 PJC, ‘Writing Through a Big Research Project, not Writing Up’, BLOG/60 (Dec. 2015); PJC, ‘How I Write as a Historian’, BLOG/88 (April 2018).

6 R. Porter, Gibbon: Making History (1989); D.P. Womersley, Gibbon and the ‘Watchmen of the Holy City’: The Historian and his Reputation, 1776-1815 (Oxford, 2002).


MONTHLY BLOG 91, PEOPLE SOMETIMES SAY: ‘WE DON’T LEARN FROM THE PAST’ AND WHY THAT STATEMENT IS COMPLETELY ABSURD

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

People sometimes say, dogmatically but absurdly: ‘We don’t learn from the Past’. Oh really? So what do humans learn from, then? We don’t learn from the Future, which has yet to unfold. We do learn in and from the Present. Yet every moment of ‘Now’ constitutes an infinitesimal micro-instant of an unfolding process. The Present is an unstable time-period, which is constantly morphing, nano-second by nano-second, into the Past. Humans don’t have time, in that split-second of ‘Now’, to comprehend and assimilate everything. As a result, we have, unavoidably, to learn from what has gone before: our own and others’ experiences, which are summed up as everything before ‘Now’: the Past.

It’s worth reprising the status of those temporal categories. The Future, which has not yet unfolded, is not known or knowable in its entirety. That’s a definitional quality which springs from the unidirectional nature of Time. It does not mean that the Future is either entirely unknown or entirely unknowable. As an impending temporal state, it may beckon, suggest, portend. Humans are enabled to have considerable information and expectations about many significant aspects of the Future. For example, it’s clear from past experience that all living creatures will, sooner or later, die in their current corporeal form. We additionally know that tomorrow will come after today, because that is how we habitually define diurnal progression within unilinear Time. We also confidently expect that in the future two plus two will continue to equal four; and that all the corroborated laws of physics will still apply.

And we undertake calculations, based upon past data, which provide the basis for Future predictions or estimates. For example, actuarial tables, showing age-related life expectancy, indicate group probabilities, though not absolute certainties. Or, to take a different example, we know, from expert observation and calculation, that Halley’s Comet is forecast to return into sight from Earth in mid-2061. Many, though not all, people alive today will be able to tell whether that astronomical prediction turns out to be correct or not. And there’s every likelihood that it will be.
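
To make the arithmetic behind such a forecast concrete, here is a minimal sketch in Python. It is not from the original text, and its naive average-interval method is an illustrative assumption: real orbital predictions must also model planetary perturbations. The input dates, however, are the recorded perihelion passages of 1835, 1910 and 1986.

```python
# Recorded perihelion passages of Halley's Comet, as approximate
# decimal years (Nov. 1835, Apr. 1910, Feb. 1986).
past_perihelia = [1835.9, 1910.3, 1986.1]

# Naive extrapolation, for illustration only: average the intervals
# between successive passages and project one interval forward.
intervals = [b - a for a, b in zip(past_perihelia, past_perihelia[1:])]
mean_period = sum(intervals) / len(intervals)

next_return = past_perihelia[-1] + mean_period
print(f"Mean period: {mean_period:.1f} years")        # Mean period: 75.1 years
print(f"Predicted next return: c.{next_return:.0f}")  # Predicted next return: c.2061
```

Past observations alone, in other words, yield a serviceable estimate of a Future event.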

Commemorating a successful prediction,
in the light of past experience:
a special token struck in South America in 2010 to celebrate
the predicted return to view from Planet Earth
of Halley’s Comet,
whose periodicity was first calculated by Edmond Halley (1656-1742)

Yet all this (and much more) useful information about the Future is, entirely unsurprisingly, drawn from past experience, observations and calculations. As a result, humans can use the Past to illuminate and to plan for the Future, without being able to foretell it with anything like total precision.

So how about learning from the Present? It’s live, immediate, encircling, inescapably ‘real’. We all learn in our own present times – and sometimes illumination may come in a flash of understanding. One example, as Biblically recounted, is the conversion of St Paul, who in his unregenerate days was named Saul: ‘And as he journeyed, he came near Damascus: and suddenly there shined round about him a light from heaven. And he fell to the earth, and heard a voice saying unto him, “Saul, Saul, why persecutest thou me?”’1 His eyes were temporarily blinded; but spiritually he was enlightened. Before then, Saul was one of the Christians’ chief persecutors, ‘breathing out threatenings and slaughter’.2 Perhaps a psychologist might suggest that his intense hostility concealed some unexpressed fascination with Christianity. Nonetheless, there was no apparent preparation, so the ‘Damascene conversion’ which turned Saul into St Paul remains the classic expression of an instant change of heart. But then he had to rethink and grow into his new role, working with those he had been attempting to expunge.

A secular case of sudden illumination appears in the fiction of Jane Austen. In Emma (1815), the protagonist, a socially confident would-be match-maker, has remained in ignorance of her own heart. She encourages her young and humble protégé, Harriet Smith, to fancy herself in love. They enjoy the prospect of romance. Then Emma suddenly learns precisely who is the object of Harriet’s affections. The result is wonderfully described.3 Emma sits in silence for several moments, in a fixed attitude, contemplating the unpleasant news:

Why was it so much worse that Harriet should be in love with Mr Knightley, than with Frank Churchill? Why was the evil so dreadfully increased by Harriet’s having some hope of a return? It darted through her, with the speed of an arrow, that Mr Knightley must marry no one but herself!

I remember first reading this novel, as a teenager, when I was as surprised as Emma at this development. Since then, I’ve reread the story many times; and I can now see the prior clues which Austen scatters through the story to alert more worldly-wise readers that George Knightley and Emma Woodhouse are a socially and personally compatible couple, acting in concert long before they both (separately) realise their true feelings. It’s a well-drawn example of people learning from the past whilst ‘wising up’ in a single moment. Emma then undertakes some mortifying retrospection as she gauges her own past errors and blindness. But she is capable of learning from experience. She does; and so, rather more artlessly, does Harriet. It’s a comedy of trial-and-error as the path to wisdom.

As those examples suggest, the relationship of learning with Time is in fact a very interesting and complex one. Humans learn in their own present moments. Yet the process of learning and education as a whole has to be a through-Time endeavour. A flash of illumination needs to be mentally consolidated and ‘owned’. Otherwise it is just one of those bright ideas which can come and as quickly go. Effective learning thus entails making oneself familiar with a subject by repetition, cogitation, debating, and lots of practice. Such through-Time application applies whether people are learning physical or intellectual skills or both. The role of perspiration, as well as inspiration, is the stuff of many mottoes: ‘practice makes perfect’; ‘if at first you don’t succeed, try and try again’; ‘stick at it’; ‘never stop learning’; ‘trudge another mile’; ‘learn from experience’.

Indeed, the entire corpus of knowledge and experience that humans have assembled over many generations is far too huge to be assimilated in an instant. (It’s actually too huge for any one individual to master. So we have to specialise and share).

So that brings the discussion back to the Past. It stretches back through Time and onwards until ‘Now’. Of course, we learn from it. Needless to say, it doesn’t follow that people always agree on messages from former times, or act wisely in the light of such information. Hence when people say: ‘We don’t learn from the Past’, they probably mean that it does not deliver one guiding message, on which everyone agrees. And that’s right. It doesn’t and there isn’t.

One further pertinent point: there are rumbling arguments around the question – is the Past alive or dead? (With a hostile implication in the sub-text that nothing can really be learned from a dead and vanished Past.) But that’s not a helpful binary. In other words, it’s a silly question. Some elements of the past have conclusively gone, while many others persist through time.4 To take just a few examples, the human genome was not invented this morning; human languages have evolved over countless generations; and the laws of physics apply throughout.

Above all, therefore, the integral meshing between Past and Present means that we, individual humans, have also come from the Past. It’s in us as well as, metaphorically speaking, behind us. Thinking of Time as running along a pathway or flowing like a river is a common human conception of temporality. Other alternatives might envisage the Past as ‘above’, ‘below’, ‘in front’, ‘behind’, or ‘nowhere specific’. The metaphor doesn’t really matter as long as we realise that it pervades everything, including ourselves.

1 Holy Bible, Acts 9: 3-4.

2 Ibid, 9:1.

3 J. Austen, Emma: A Novel (1815), ed. R. Blythe (Harmondsworth, 1969), p. 398.

4 P.J. Corfield, ‘Is the Past Dead or Alive? And the Snares of Such Binary Questions’, BLOG/62 (Feb. 2016).


MONTHLY BLOG 90, CELEBRATING HUMAN DIVERSITY AMIDST HUMAN UNITY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Tree of Life
How do we combat racism, which does exist, without endorsing the idea of separate human ‘races’, which don’t exist? All humans share one big world-wide family-tree. Maybe squabbling, maybe prejudiced, maybe many things, lots of good as well as bad – but all sisters and brothers under the skin.1 So let’s celebrate human diversity amidst human unity.

Three thoughts. Firstly, it’s very right and proper for any group who are wrongly discriminated against to protest in full human dignity. It’s not only a duty which people owe to themselves. But they owe it to their children, whose entire upbringings can be blighted by a baffled sense that they are unappreciated in the wider world, without any fault of their own. A sense of inner worth is a vital gift to give to every child. ‘Of all our infirmities, the most savage is to despise our being’ (Montaigne).

Campaigns like ‘Black Pride UK’2 and ‘Black Lives Matter’3 are honourable and deserve support from everyone, though, as within all cultural/political popular movements, there are valid debates over tactics and strategy. The general point is not to disparage others but to affirm the dignity and importance of the lives of all descendants of the African diaspora. In particular, a celebration of human pride is intended not only to hearten the young but to alert authority figures in general and the police in particular. Since anthropologists tell us that all branches of humanity come ultimately ‘out of Africa’, these are campaigns that everyone can value.

Secondly: We also need cultural space to celebrate people of mixed heritage, with diverse ethnic and national backgrounds. Having written last month on the under-acknowledgement of this very common feature of human history, I was initially surprised at the number of people who hastened to tell me about their own mixed families. Yet I shouldn’t have been. Huge numbers of people, from all round the world, have mixed parentage. And as travel and migration spread, that experience is likely to become ever more common.

Among my own family, I already have an Indian/English niece, whose partner is a Catalan/Irishman. Two of my step-nieces are Japanese/English; two others are one-quarter Danish. Two first cousins are Italian/English. Another first cousin is Scottish/English (and supports Scottish nationalism). Another branch of second cousins are French-speaking, of English/French descent. And my partner has recently been told by a relative, who is investigating their south London family tree, that they have an Indian great-grandmother, who met and married their great-grandfather when he was on military service in India.

Similarly, among my close ‘English’ friends, it turns out that one has a Chinese father (whom she has never met). Someone else has both Portuguese and Spanish ancestors, whose families she meets regularly. Other friends have close family links which are (separately and variously) Algerian, American including indigenous American, Argentine, Australian, Brazilian, Canadian, Colombian, Czech, Dutch, Egyptian, Filipino, French, German, Iranian, Irish, Israeli, Italian, Jamaican, New Zealand, Nigerian, Pakistani, Polish, Portuguese, Roma (gypsy), Romanian, Russian, Serbian, South African, Spanish, Swedish, Taiwanese, Thai and Turkish.

Continuing the diversity, one of my close friends among my former students, who herself studies how people travelling in the past met and reacted to ‘different’ peoples, has Bajan/Scottish family roots.

In the wider world, an American woman of mixed parentage has just married into Britain’s royal family, which has German/Danish/Greek/English ancestry. The US President before the current incumbent has Kenyan/American roots. The current US incumbent has Scottish/German/American roots and declared in 2008, when visiting his mother’s birthplace in the Outer Hebrides, that he ‘feels Scottish’.4 And a relatively recent leader of the British Conservative Party, Iain Duncan Smith MP, is one-eighth Japanese: his maternal great-grandmother was a Japanese lady living in Beijing when she met and married his Irish great-grandfather.5

Some of these mixed family ancestries are apparent to the eye – but many, equally, are not. Either way, it’s manifestly open to all people of mixed heritage to celebrate all their family lines; and to refuse repeated attempts on official forms to compartmentalise them into one so-called ‘race’ or another.

Collectively, all peoples of mixed heritage (including not least the English with their historically hybrid Celtic, Viking, Anglo-Saxon, Norman-French, Huguenot, and Irish roots) represent the outcome of historical population mobility. Humans are a globe-trotting species, and people from different tribes or ‘folk groups’ intermarry. It seems too that many of the separate species of very early humankind also interbred. Hence some but not all branches of homo sapiens have small traces of Neanderthal DNA, following encounters dating from at least 200,000 years ago.6 Diversity within unity is the norm.

So thirdly and finally: It’s overdue to accept the teachings of world religions, biological science, and philosophical humanism, which proclaim that all humans are sisters and brothers under the skin. In particular, it’s even more overdue to reject socially-invented pigment-hierarchies which claim that some shades of skin are ‘better’ and more socially desirable than others.

By the way, sometimes people ask me why I write on these matters. I have fair skin and hair (though others among my siblings don’t). And I am relatively socially privileged, though I do have the handicap of being a woman. (That last comment is meant ironically). Such questions, however, miss the point. They wrongly imply that combating racism is an exclusive task for people with dark skins. But no, it’s a matter for everyone. Indeed, it weakens campaigns for ‘Black Pride’, if others are not listening and responding.

Humans are one species which contains diversity. Our skin hues are beautifully variegated shades of the ancestral brown.7 What’s needed is not so much ‘colour blindness’ as ‘colour rejoicing’.

1 See P.J. Corfield, Talking of Language, It’s Time to Update the Language of Race (BLOG/36, Dec. 2013); PJC, How do People Respond to Eliminating the Language of ‘Race’? (BLOG/37, Jan. 2014); PJC, Why is the Language of ‘Race’ Holding On for So Long, when it’s Based on a Pseudo-Science? (BLOG/38, Feb. 2014); and PJC, As the Language of ‘Race’ Disappears, Where does that Leave the Assault on Racism? (BLOG/89, May 2018).

2 Founded 2005: see https://ukblackpride.org.uk.

3 Black Lives Matter is an international chapter-based campaign movement, founded in July 2013. See: https://blacklivesmatter.com.

4 Reported in The Guardian, 9 June 2008.

5 https://www.theguardian.com/politics/2001/sep/03/conservatives.uk

6 Research by Sergi Castellano and others, reported in Nature: International Weekly Journal of Science (May 2018), https://www.nature.com/news.

7 N.G. Jablonski, Skin: A Natural History (Berkeley, Calif., 2006) and idem, Living Colour: The Biological and Social Meaning of Skin Colour (Berkeley, Calif., 2012).


MONTHLY BLOG 89, AS THE LANGUAGE OF ‘RACE’ DISAPPEARS, WHERE DOES THAT LEAVE THE ASSAULT UPON RACISM?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

 
Hands around the Globe:
© WikiClipArt 2018

Many people, including myself, have declared that the language of ‘race’ should become obsolete.1 (Indeed, that is slowly happening). Talk of separate human ‘races’ is misleading terminology, since all humans belong to one species: homo sapiens. It’s also unscientific: geneticists have repeatedly shown that all people share the same deep biological inheritance and genome.2

Putting people into arbitrary ‘racial’ categories is also unjust to the many people with multiple ethnic heritages.3 And the terminology is confusing even for those who still believe in it, since there has never been agreement about fundamental questions, such as how many ‘races’ there are.

So this BLOG asks what happens next, as the old terminology slowly disappears. For certain purposes societies need to acknowledge the range of diversity (alongside the common features) within the human species. Yet it is evident that the world has not yet agreed upon satisfactory alternative terminologies.

The first general answer is that language innovation will find a way. It’s not just for one individual to prescribe, but for usages to adapt incrementally. These days, references are usually made in terms of cultural ethnicity (or folk allegiance)4 and/or in terminologies derived from world-regional locations, or a mix of the same (as in African American).

Language innovation also needs to acknowledge the very many people around the world who have multiple inheritances. It’s not satisfactory to refer to ‘mixed race’ or ‘multi-racial backgrounds’. Those phrases smuggle the scientifically meaningless but culturally divisive concept of ‘race’ back into the picture. World regional terms have the advantage here, in that they can easily be doubled up to indicate multiple roots. However, mixings over many generations can make for cumbersome and overloaded terminologies. Often, new collective terms emerge over time. So the ancestrally hybrid Celtic/Viking/Anglo-Saxon/Norman-French population of England after 1066 eventually became ‘English’, and continues to adapt to later generations of population turnover.

One technical change that’s certainly needed is the updating of language on official forms, such as census returns. People are often still invited to self-classify into a separate ‘race’ by ticking a box. When scrutinised closely, such forms often use a very unsystematic mix of classifications, sometimes by ethnicity and sometimes by skin colour. People of multiple heritages usually have to make do with ticking ‘Other’. But sometimes they don’t even get that option. And people who reject the classification of humans into bogus ‘races’ don’t have anywhere to express their dissent.

Another key question is what happens to concepts like ‘racism’ and ‘racist’, if ‘race’ is dropped from the lexicon? Does that move let people who embrace racism off the hook?

To that pertinent question, the answer is: No. People who discriminate against other ethnic groups still need to be opposed just as firmly. But not by using their language. Rejecting the reality of ‘race’ strengthens criticism of racist prejudices. Such attitudes are not only humanly obnoxious but they are based upon non-sense: a combination of myths, pseudo-science, and a not very well disguised form of self-interest. Racists are the equivalent of flat-earthers, denying reality for their own tribalistic benefit.

En route, here’s a small point in the general scheme of things but a relevant one in this context: the United Nations should keep its International Day for the Elimination of ‘Racial’ Discrimination. It’s scheduled annually on 21 March – the anniversary of the Sharpeville killings of protestors against the infamous South African Pass Laws.5 Yet it needs a better name. Or at least ‘Racial’ in its title should be put into quotation marks, as I’ve just done. Otherwise, its subtext seems to affirm that there are separate ‘races’, when there aren’t. Indeed, one of the practical problems of implementing the South African Pass Laws sprang from the complexities of historic interminglings: many individual classifications into the stipulated camps of ‘White’ or ‘Native’ [black African] or ‘Coloured’ proved to be highly contentious.6

Following that, it’s also worth asking whether rejecting the concept of ‘race’ might imply that people shouldn’t take an interest in their own genetic and cultural/ethnic backgrounds. Here the answer is equally: No. But this time, the effect is positive. Rejecting ‘race’ liberates people from trying to fit their personal histories into false categories, which don’t exist.

Instead, individuals can investigate the ethnic identities of all their family branches with pride. Rejecting separate ‘races’ improves the potential for personal and cultural understanding of our pluralistic humanity. That’s particularly important for people from multiple heritages. Those historic legacies all merit attention, without any false rankings of one group being intrinsically ‘above’ another group of fellow humans. It’s culturally and psychologically important for people to know about their roots. (And in some cases it’s medically relevant too). Yet that exercise should be done in a democratic spirit. Pride in roots is not racist but a due acknowledgement of authentic pluralism.

In many countries, these themes are lived daily. For example, in the great ethnic melting pot of Brazil, there are rival pressures. On the one hand, there are subtle decodings of status and hierarchy by reference to an unacknowledged pigmentocracy, based upon skin colour. Lighter-skinned people tend to be in positions of power, although not in all walks of life. On the other hand, there is great pride in the country’s multicultural legacies. Hence there is a notable social impulse to ‘be cordial’ (in a favoured phrase) by not drawing attention to outward differences (say) in appearance and skin colour.7 Visitors report on a society where people seem admirably comfortable in their own bodies. In short, the collective dynamic may be evolving beyond older fixations upon ‘race’.

Nonetheless, Brazil’s current policies of affirmative action, to help disadvantaged groups, are running into major difficulties in classifying ethnic affiliations. Specifically, the ‘Race Tribunals’, appointed to undertake this delicate task for appointments to government posts, are struggling with the instability of ‘racial’ boundaries.8 Hence the policy, undertaken with good intentions, has already become controversial.

It may well be that in future the challenges to inequality, in Brazil as elsewhere, will turn to focus instead upon class. And ‘class’, whilst also a socio-cultural-economic concept with its own definitional fuzziness, does not purport to be pre-ordained by human biology. Achieving a full and fair democracy is no easy task; but it will be boosted by finding fresh terms for ethnic diversities within a common humanity – and fresh ways of both assessing and rectifying social disadvantage.

Lastly, the best egalitarian rejection of racism is not to urge that: ‘all “races” should be treated equally’. Such a declaration falls back into the trap of racist pseudo-science. The best statement is straightforward: ‘We are all one human race’. That’s seriously the best starting point from which to combat discrimination.

1 See P.J. Corfield, Talking of Language, It’s Time to Update the Language of Race (BLOG/36, Dec. 2013); idem, How do People Respond to Eliminating the Language of ‘Race’? (BLOG/37, Jan. 2014); and idem, Why is the Language of ‘Race’ Holding On for So Long, when it’s Based on a Pseudo-Science? (BLOG/38, Feb. 2014).

2 See L.L. and F. Cavalli-Sforza, The Great Human Diasporas: The History of Diversity and Evolution, transl. S. Thomas (Reading, Mass., 1992); and M. Gannon, ‘Race is a Social Construct, Scientists Argue’, Scientific American (5 Feb. 2016), with the strap line: ‘Racial categories are weak proxies for genetic diversity and need to be phased out’.

3 See M.P.P. Root’s Bill of Rights for People of Mixed Heritage (1993), www.drmariaroot.com/doc/BillOfRights.pdf: which includes the declaration: ‘I have the right to have loyalties and identification with more than one group of people’.

4 Ethnicity is defined as the state of belonging to a distinctive group with a shared cultural and/or national tradition. Shared religion, language and genetic markers may also contribute. The classification is not a precise one.

5 www.un.org/en/events/racialdiscriminationday.

6 D. Posel, ‘Race as Common Sense: Racial Classification in Twentieth-Century South Africa’, African Studies Review, 44 (2001), pp. 87-113.

7 J. Roth-Gordon, Race and the Brazilian Body: Blackness, Whiteness, and Everyday Language in Rio de Janeiro (2016).

8 C. de Oliveira, ‘Brazil’s New Problem with Blackness’ (2017), in Foreign Policy Dispatch: http://foreignpolicy.com/2017/04/05/brazils-new-problem-with-blackness-affirmative-action/


MONTHLY BLOG 88, HOW I WRITE AS A HISTORIAN

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Invited by Buff-Coat to comment on how I compose works of history, I found that the answer fell into nine headings, written as reminders to myself:1

  1. Learn to enjoy writing: writing is a craft skill, which can be improved with regular practice.2 Learn to enjoy it.3 Bored authors write bored prose. Think carefully about your intended readership, redrafting as you go. Then ask a trusted and stringent critic for a frank assessment. Adjust in the light of critical review – or, if not accepting the critique, clarify/strengthen your original case.4
  2. Have something to say: essential to have a basic message, conferring a vital spark of originality for every assignment.5 Otherwise, don’t bother. But the full interlocking details of the message will emerge only in the course of writing. So it’s ok to begin with working titles for books/chapters/essays/sections and then to finalise them about three-quarters of the way through the writing process.
  3. Start with mind-mapping: cudgel brains and think laterally to provide a visual overview of all possible aspects of the topic, including themes, debates and sources. This is a good moment for surprise, new thoughts. From that, generate a linear plan, whilst keeping the mind-map to hand as a reference point. And it’s fine, often essential, to adapt the linear plan as writing evolves. As part of the starting process, identify key terms, to be defined at relevant points in the text.6

Idea of a Mind-Map
© Network Clipart (2018)

  4. Blend discussion of secondary literature seamlessly into the analysis: beginners are rightly trained to start with a discrete historiographical survey but, with experience, it’s good to blend exposition into the analysis as it unfolds. Keep readers aware throughout that historians don’t operate in a vacuum but debate constantly with fellow historians in their own and previous generations. It’s a process not just of ‘dialogue’ but of complex ‘plurilogue’.7
  5. Interpret primary sources with respect and accuracy: evaluate the strengths and weaknesses of primary sources from the past; be prepared to interpret them but only while treating them with the utmost respect and accuracy. Falsifying data, misquoting sources, or hiding unfavourable evidence are supreme academic sins. Historians are accustomed to writing within the constraints of the evidence.8 That’s their essential discipline. Hence the claim by postmodernist theorists that historians can invent (or uninvent) the past just as they please is not justified. Indeed, if history (the past) was simply ‘what historians write’, there’d be no way of evaluating whether one historian’s arguments are historically more convincing than another’s. And there’d be no means of rebutting (say) Holocaust denial.9 The challenging task of evaluating, interpreting and knitting together many different forms of evidence from the past, in the light of evolving debates, is the essence of the historian’s practice.10
  6. Expound your case with light and shade: counteract the risk of monotony by incorporating variety. It can take the form of illustrations; anecdotes; even jokes. Vary choice of words and phrases.11 Vary sentence lengths. Don’t provide typical academic prose, full of lengthy sentences, stuffed with meandering sub-clauses, all written in densely Latinate terminology. But don’t go to the other extreme of all rat-a-tat sub-Hemingway terse Anglo-Saxon texts either. Variety keeps readers interested and gives momentum to an unfolding analysis.
  7. Know the arguments against your own: advocacy works best not by caricaturing opposite views but by understanding them, in order to refute them successfully. All courtroom lawyers and politicians are well advised to follow this rule too. But there’s no need to focus exclusively on all-out attack against rival views. That way, your work risks becoming dated as the debates change.
  8. Relate the big arguments to your general philosophy of history:12 don’t know what that is? Time to decide.13 If not your lifetime verdict, then at least an interim assessment. Clarify as the analysis unfolds. But again ensure that the general philosophy is shown as informing the unfolding arguments/evidence. It’s not an excuse for suddenly inserting a pre-conceived view.
  9. Know how to end:14 draw threads together and end with a snappy dictum.15

ENDNOTES:

1 This BLOG is the annotated text of a brief report, first posted on 15/03/2018 on: http://keith-perspective.blogspot.co.uk/2018/03/how-i-write-as-historian-by-penelope-j.html, with warm thanks to Keith Livesey, alias Buff-Coat, for the invitation.

2 See P.J. Corfield, Coping with Writer’s Block (BLOG/34, Oct. 2013), on website: https://www.penelopejcorfield.com/monthly-blogs/. All other PJC BLOGS cited in the following endnotes can be consulted via this website.

3 Two different historians who influenced me had very distinctive messages and writing styles: see P.J. Corfield, Two Historians who Influenced Me (BLOG/15, Dec. 2011).

4 P.J. Corfield, Responding to Anonymous Academic Assessments (BLOG/81, Sept. 2017). It followed idem, Writing Anonymous Academic Assessments (BLOG/80, Aug. 2017).

5 History is such a vital subject for all humans that it’s hard not to find something to say. See P.J. Corfield, All People are Living Histories, which is Why History Matters. A Conversation Piece for Those who Ask: Why Study History? (2007), available on the Making History website of London University’s Institute of Historical Research: www.history.ac.uk/makinghistory/resources/articles/why_history_matters; and also on PJC personal website: www.penelopejcorfield.co.uk: Essays on What is History? Pdf/1.

6 That advice includes avoiding terms still widely used by others, like racial divisions between humans. They are misleading and based on pseudo-science. See P.J. Corfield, Talking of Language, It’s Time to Update the Language of Race (BLOG/36, Dec. 2013); idem, How do People Respond to Eliminating the Language of ‘Race’? (BLOG/37, Jan. 2014); and idem, Why is the Language of ‘Race’ Holding On for So Long, when it’s Based on a Pseudo-Science? (BLOG/38, Feb. 2014).

7 P.J. Corfield, Does the Study of History ‘Progress’ and How does Plurilogue Help? (BLOG/61, Jan. 2016).

8 P.J. Corfield, What’s So Great about Historical Evidence? (BLOG/66, June 2016); idem, What Next? Interrogating Historical Evidence (BLOG/67, July 2016).

9 For further discussion, see P.J. Corfield, ‘Time and the Historians in the Age of Relativity’, in A.C.T. Geppert and T. Kössler (eds), Obsession der Gegenwart: Zeit im 20. Jahrhundert; transl. as Obsession with the Here-and-Now: Concepts of Time in the Twentieth Century, in series Geschichte und Gesellschaft: Sonderheft, 25 (Göttingen: Vandenhoeck & Ruprecht, 2015), pp. 71-91. Also posted on PJC website: www.penelopejcorfield.co.uk: Essays on What is History? Pdf/38.

10 On the need to differentiate between facts and pseudo-facts, see P.J. Corfield, Facts and Factoids in History (BLOG/52, April 2015).

11 And at times, new words are needed: see P.J. Corfield, Inventing Words (BLOG/84, Dec. 2017); and idem, Working with Words (BLOG/85, Jan. 2018).

12 My own account of historical trialectics is available in P.J. Corfield, Time and the Shape of History (Yale University Press, 2007). It’s also expounded theme by theme in idem, Why is the Formidable Power of Continuity So Often Overlooked? (BLOG/2, Nov. 2010); idem, On the Subtle Power of Gradualism (BLOG/4, Jan. 2011); and idem, Reconsidering Revolutions (BLOG/6, March 2011). And further discussed in idem, ‘Teaching History’s Big Pictures, Including Continuity as well as Change’, Teaching History: Journal of the Historical Association, no. 136 (2009), posted on PJC personal website: www.penelopejcorfield.co.uk: Essays on What is History? Pdf/3.

13 The time to decide for yourself might not correspond with interest from others. Never mind! Stick to your guns. See also P.J. Corfield, Writing into Silence about Time (BLOG/73, Jan. 2017); idem, Why Can’t we Think about Space without Time? (BLOG/74, Feb. 2017); idem, Humans as Time-Specific Stardust (BLOG/75, March 2017); and idem, Humans as Collective Time-Travellers (BLOG/76, April 2017).

14 It’s much easier to advise and/or to supervise others: see P.J. Corfield, Supervising a Big Research Project to End Well and On Time: Three Framework Rules (BLOG/59, Nov. 2015); idem, Writing Through a Big Research Project: Not Writing Up (BLOG/60, Dec. 2015).

15 On my own travails, see P.J. Corfield, Completing a Big Project (BLOG/86, Feb. 2018); and idem, Burned Boats (BLOG/87, March 2018).


MONTHLY BLOG 87, BURNED BOATS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)


Firework flames: © Clipart (2018)

What to confess this month, having burned boats last month, about my intention to finish a big never-ending writing project? First message: yes, it’s good to announce THE BOOK END, even if it still remains tantalisingly-nigh-but-not-yet-quite achieved. Burning one’s boats in public concentrates the mind and attention. Words flow from the keyboard. Deadlines hammer in the head. One feels intensely alive.

At the same time, all of life’s hazards and impediments take this declaration as a signal to attack. There is a serious leak in the bathroom, dripping water onto the stairs below. Urgent action is imperative. A car tyre goes flat at the wrong moment. Long-lost friends come round to call for a long, chatty visit. A close relative falls ill and needs attention. Other work commitments, entered into gaily months ago, suddenly become imminent. The email in-box, of course, overflows with multiple messages, which need sorting, to check that most are safe to ignore. But some are urgent requests from former students needing academic references for jobs which they seriously might get: such exercises of advocacy-at-a-distance need time and careful thought. All these intrusions from the rest of life are entirely predictable, but become major distractions when competing with THE BOOK END deadline.

Cyril Connolly (1903-74) has met with a lot of flak for writing that: ‘there is no more sombre enemy of good art than the pram in the hallway’.1 He is accused of being not only anti-baby but also misogynistic – implying that the little woman should either not have tempted the creative man to have sex in the first place – or, the worst having happened, should at least take the pram/baby out for a long bracing walk, leaving the creative genius alone, so that he can agonise over his failure to write in complete silence.

Yet Connolly wasn’t really blaming others. Instead, he was probing his own painful sense of failure. He instanced other damaging factors which may also inhibit creativity. Those embrace drink, apathy, boredom, getting sidetracked into journalism – and coping with the burden of expectation, after early ‘promise’. There’s good scope for debate as to which of those experiences is the most destructive. These days, a later Connolly would have to add: getting bogged down by emails and social media. So a bit of sympathy is in order. We may all have our own ‘enemies’, whether internally within ourselves or externally in the pram-in-hallway-equivalent or even both.

Lastly, declaring THE BOOK END of a big project teaches another significant lesson. Finishing is not as simple as dotting the final full-stop of the final sentence. As my partner Tony Belton is fond of saying: ‘It isn’t ended until it’s ended’. He learned that when setting up computer schemes in the 1970s. People would constantly say: ‘It’s just a fortnight away from completion’. But each fortnight would turn into another fortnight. There’s a confession of that syndrome in the story of the first iteration of the on-line fashion-retail business Boo.com, whose bankruptcy in 2000 was a scandalous part of the collapse of the dotcom bubble. Ernst Malmsten and his colleagues kept promising their backers that the innovative on-line system would be activated ‘within weeks’. But the weeks kept going by. Too many different people were inputting and changing the operating system, which was getting further from completion, not closer. Too late, realisation dawned. ‘It was a mass delusion. We either hadn’t seen, or had simply closed our eyes to, all the warning signs’.2 Boo-Hoo indeed.

Finishing a big writing project is a different exercise, under one-person control. Yet many last touches are still required: last re-reads; last edits; last checks to footnotes, illustrations, and bibliography; last inputs from the publisher’s readers; last decision about the final snappy dictum. So announcing THE BOOK END helps to speed things onwards. But it isn’t ended until it’s ended.

1 C. Connolly, Enemies of Promise (1938).

2 E. Malmsten and others, Boo-Hoo: $135 Million, 18 Months … A Dot.Com Story from Concept to Catastrophe (Arrow Books, 2002), p. 233.
