MONTHLY BLOG 103, WHO KNOWS THESE HISTORY GRADUATES BEFORE THE CAMERAS AND MIKES IN TODAY’S MASS MEDIA?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Image © Shutterstock 178056255

Responding to the often-asked question, ‘What do History graduates do?’, I usually reply, truthfully, that they gain employment in an immense range of occupations. But this time I’ve decided to name a popular field and to cite some high-profile cases, to give specificity to my answer. The context is the labour-intensive world of the mass media. It is no surprise to find that numerous History graduates find jobs in TV and radio. They are familiar with a big subject of universal interest – the human past – which contains something for all audiences. They are simultaneously trained to digest large amounts of disparate information and ideas, before welding them into a show of coherence. And they have specialist expertise in ‘thinking long’. That hallmark perspective buffers them against undue deference to the latest fads or fashions – and indeed against the slings and arrows of both fame and adversity.

In practice, most History graduates in the mass media start and remain behind the scenes. They flourish as managers, programme commissioners, and producers, generally far from the fickle bright lights of public fame. Collectively, they help to steer the evolution of a fast-changing industry, which wields great cultural clout.1

There’s no single route into such careers, just as there’s no ‘standard’ career pattern once there. It’s a highly competitive world. And often, in terms of personpower, a rather traditionalist one. Hence there are current efforts by UK regulators to encourage wider diversity in ethnic and gender recruitment.2 Much depends upon personal initiative, perseverance, and a willingness to start at comparatively lowly levels, generally behind the scenes. It often helps as well to have some hands-on experience – whether in student or community journalism; in film or video; or in creative applications of new social media. But already-know-it-all recruits are not as welcome as those ready and willing to learn on the job.

Generally, there’s a huge surplus of would-be recruits over the number of jobs available. It’s not uncommon for History students (and no doubt many others) to dream, rather hazily, of doing something visibly ‘big’ on TV or radio. However, front-line media jobs in the public eye are much more difficult than they might seem. They require a temperament that is at once super-alert, good-humoured, sensitive to others, and quick to respond to immediate issues – and yet is simultaneously cool under fire, not easily sidetracked, not easily hoodwinked, and implacably immune from displays of personal pique and ego-grandstanding. Not an everyday combination.

It’s also essential for media stars to have a thick skin to cope with criticism. The immediacy of TV and radio creates the illusion that individual broadcasters are personally ‘known’ to the public, who therefore feel free to commend/challenge/complain with unbuttoned intensity.

Those impressive History graduates who appear regularly before the cameras and mikes are therefore a distinctly rare breed.3 (The discussion here refers to media presenters in regular employment, not to the small number of academic stars who script and present programmes while retaining full-time academic jobs – they constitute a different sort of rare breed.)

Celebrated exemplars among History graduates include the TV news journalists and media personalities Kirsty Wark (b.1955) and Laura Kuenssberg (b.1976), who are both graduates of Edinburgh University. Both have had public accolades – Wark was elected a Fellow of the Royal Society of Edinburgh in 2017 – and both face much criticism. Kuenssberg in particular, as the BBC’s first woman political editor, is walking warily but effectively through the Gothic-melodrama-cum-Greek-tragedy-cum-high-farce known as Brexit.

In a different sector of the media world, the polymathic TV and radio presenter, actor, film critic and chat-show host Jonathan Ross (b.1960) is another History graduate. He began his media career young, as a child in a TV advertisement for a breakfast cereal. (His mother, an actor, put him forward for the role.) Then, having studied Modern European History at London University’s School of Slavonic & East European Studies, Ross worked as a TV programme researcher behind the scenes, before eventually fronting the shows. Among his varied output, he’s written a book entitled Why Do I Say These Things? (2008). The title of this stream of reminiscences highlights the tensions involved in being a ‘media personality’. On the one hand, there’s the need to keep stoking the fires of fame; but, on the other, there’s an ever-present risk of going too far and alienating public opinion.

Similar tensions accompany the careers of two further History graduates, who are famed as sports journalists. The strain of never making a public slip must be enormous. John Inverdale (b.1957), a Southampton History graduate, and Nicky Campbell (b.1961), ditto from Aberdeen, have to cope not only with the immediacy of the sporting moment but also with the passion of the fans. Over the years, Inverdale racked up a number of gaffes. Some were unfortunate. None fatal. Nonetheless, readers of the Daily Telegraph in August 2016 were asked rhetorically, and obviously inaccurately: ‘Why Does Everyone Hate John Inverdale?’4 That sort of over-the-top response indicates the pressures of life in the public eye.

Alongside his media career, meanwhile, Nicky Campbell used his research skills to study the story of his own adoption. His book Blue-Eyed Son (2011)5 sensitively traced his extended family roots among both Protestant and Catholic communities in Ireland. His position as a patron of the British Association for Adoption and Fostering welds this personal experience into a public role.

The final exemplar cited here is one of the most notable pioneers among women TV broadcasters. Baroness Joan Bakewell (b.1933) has had what she describes as a ‘rackety’ career. She studied first Economics and then History at Cambridge. After that, she experienced periods of considerable TV fame followed by the complete reverse, in her ‘wilderness years’.6 Yet her media skills, her stubborn persistence, and her resistance to being publicly patronised for her good looks in the 1960s, have given Bakewell media longevity. She is not afraid of voicing her views, for example in 2008 criticising the absence of older women on British TV. In her own maturity, she can now enjoy media profiles such as that in 2019 which explains: ‘Why We Love Joan Bakewell’.7 No doubt, she takes the commendations with the same pinch of salt as she took being written off in her ‘wilderness years’.

Bakewell is also known as an author and for her commitment to civic engagement. In 2011 she was elevated to the House of Lords as a Labour peer. And in 2014 she became President of Birkbeck College, London. In that capacity, she stresses the value – indeed the necessity – of studying History. Her public lecture on the importance of the subject urged, in timely fashion: ‘The spirit of enquiring, of evidence-based analysis, is demanding to be heard.’8

What do these History graduates in front of the cameras and mikes have in common? Their multifarious roles as journalists, presenters and cultural lodestars indicate that there’s no straightforward pathway to media success. These multi-skilled individuals work hard for their fame and fortunes, concealing the slog behind an outer show of relaxed affability. They’ve also learned to live with the relentless public eagerness to enquire into every aspect of their lives, from health to salaries, and then to criticise the same. Yet it may be speculated that their early immersion in the study of History has stood them in good stead. As already noted, they are trained in ‘thinking long’. And they are using that great art to ‘play things long’ in career terms as well. Multi-skilled History graduates work in a remarkable variety of fields. And, among them, some striking stars appear regularly in every household across the country, courtesy of today’s mass media.

ENDNOTES:

1 O. Bennett, A History of the Mass Media (1987); P.J. Fourie (ed.), Media Studies, Vol. 1: Media History, Media and Society (2nd edn., Cape Town, 2007); G. Rodman, Mass Media in a Changing World: History, Industry, Controversy (New York, 2008).

2 See Ofcom Report on Diversity and Equal Opportunities in Television (2018): https://www.ofcom.org.uk/__data/assets/pdf_file/0021/121683/diversity-in-TV-2018-report.PDF

3 Information from diverse sources, including esp. the invaluable survey by D. Nicholls, The Employment of History Graduates: A Report for the Higher Education Academy … (2005): https://www.heacademy.ac.uk/system/files/resources/employment_of_history_students_0.pdf; and the short summary by D. Nicholls, ‘Famous History Graduates’, History Today, 52/8 (2002), pp. 49-51.

4 See https://www.telegraph.co.uk/olympics/2016/08/15/why-does-everyone-hate-john-inverdale?

5 N. Campbell, Blue-Eyed Son: The Story of an Adoption (2011).

6 J. Bakewell, interviewed by S. Moss, in The Guardian, 4 April 2010: https://www.theguardian.com/lifeandstyle/2010/apr/04/joan-bakewell-harold-pinter-crumpet

7 https://www.bbc.co.uk/programmes/articles/1xZlS9nh3fxNMPm5h3DZjhs/why-we-love-joan-bakewell.

8 J. Bakewell, ‘Why History Matters: The Eric Hobsbawm Lecture’ (2014): http://joanbakewell.com/history.html.

MONTHLY BLOG 102, ARE YOU AN OPTIMIST? HOW WELL DO YOU KNOW YOUR OWN TEMPERAMENT?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

The Cheshire Cat, famed for its indestructible grin …
from Lewis Carroll’s Alice’s Adventures in Wonderland,
as depicted by John Tenniel for the book’s classic 1865 edition.
© image in public domain

Are you an optimist? This question is one of my favourite opening gambits when launching into longish conversations with strangers. It’s a pleasant enquiry. It’s open-ended. It implies personal interest but it’s not overly intrusive. In response, people can talk about whatever they wish. They don’t have to reveal any secrets. Often, they talk about their health or work or families. In rare cases, frank individuals confide details of their hopes or fears for their love-life. And, increasingly these days, people take the question as an invitation to hold forth about politics, Brexit, and the state of the nation/world.

I’m also fond of asking questions that can go ‘round the table’, as it were. Those need to be open questions which don’t require a great deal of specialist information to answer. Getting a response from everyone, going round the group, is a great way of fostering a collective dynamic. (I enjoy this process not only in an educational context but socially too.) However, I have learned from experience that asking ‘Are you an optimist?’ really works best in one-to-one conversations. In groups, the cultural pressure to be up-beat in public militates against frank answers.1 Most people will claim, even if evasively, to be cheery – whilst allowing one or two individuals to seize the chance to play the dissident roles of ‘grumpy old men/women’. Their responses quickly lead everyone into debating ‘country going to the dogs’, Brexit, and the state of the nation/world.

However, such arguments have an increasingly stereotypical quality these days, which the question ‘Are you an optimist?’ is designed to avoid. So it works best in one-to-one encounters, when there’s time to steer away from the perennial Brexit and to explore new terrain. By the way, when asking others to make whatever limited confidences they wish, it’s important to reciprocate. I have no desire to recount my life-story; but I do have some self-reflective comments about my own attitudes, which I am willing to share. Often, the question prompts an absorbing discussion, even with a newly-met stranger. It certainly is more probing than the standard gambit reportedly used by the Queen: ‘Have you come far?’ Or the academic’s predictable: ‘What’s your research field?’

Talking about optimism also encourages a quest for further definitions. What exactly is meant by the term? It covers a range of permutations, from the mildly hopeful ‘Well, something will turn up’ to an unshakable Panglossian faith that ‘all is for the best in the best of possible worlds’.2 And then people seek further clarification: optimistic over what sort of timespan – one year? five years? a lifetime? And with reference to what – oneself? one’s profession? one’s country? It’s very common these days for educationalists across the spectrum to be deeply pessimistic about the state of the education system. By contrast, true believers who have just discovered a great good cause tend to be highly optimistic in the early days of their faith, although over time their hopes of rapid success may become muted as they encounter obstacles and opposition (for example, to feminism or to environmentalism).

Generally, however, optimists tend to skate over the complexities. Their glasses are rose-tinted. Their glass is half full, not half empty. They see the potential in everything. And they believe, if not quite in universal ‘Progress’, at least in the positive chances of progressive betterment.3 And, as they wait in hope for things to develop favourably (even if events don’t always oblige), optimists claim to get more enjoyment out of life than do neutral observers. Milton long ago praised such feelings in L’Allegro, his hymn to mirth, jollity, dancing, nut-brown ale, good fellowship and everything that unchains ‘the hidden soul of harmony’.4

Meanwhile, lurking within every discussion about optimism is the countervailing stance of pessimism. Milton was there too. ‘Hence, vain, deluding joyes …’, he urges in Il Penseroso, his rival hymn to meditative gloom: ‘Hail divinest Melancholy …’ Pessimism in turn embraces many possibilities. Options may range through mild scepticism to world-weary disillusionment to acidic negativism to despairing self-harm.

Many pessimists, however, don’t actually accept that self-description. They prefer to call themselves ‘realists’. Whilst optimists can often be disappointed when their high hopes don’t come true, pessimists can always claim not to be surprised at any outcome, short of ecstatic and universal bliss (which is undeniably rare). It’s true that waiting for disaster to strike can seem depressing. Yet serious pessimists positively enjoy their misery. And they certainly believe that they see life more clearly than do the blinkered optimists.

At its simplest, the optimist/pessimist dichotomy can be interpreted as a function of individual psychology and basic personality traits.5 However, it’s as well to recall that changing circumstances are also liable to affect people’s template attitudes. It’s hard to remain cheerful at all times when suffering from acute pain over a long period of time. And it’s difficult to remain perennially optimistic when suffering from a relentless torrent of externally-inflicted major disasters which are entirely beyond one’s own control. So the optimist/pessimist dichotomy is by no means a rigid one. People may be pessimistic about the state of their profession (for example), whilst remaining personally optimistic about (say) their life and loves.

Crucially, too, mental states are not dictated purely by emotions and personal psychology. Considered reason plays a significant role too. The greatest expression of that truth came from Antonio Gramsci (1893-1937), the Italian Marxist who died in a Fascist prison in Rome under Mussolini. While incarcerated, he continued with stoic fortitude to analyse the state of politics and the prospects for radical change.6 What was needed, he concluded, was ‘pessimism of the intellect, optimism of the will’. That formula powerfully summarised the conscious yoking of reason and emotion. Gramsci’s dictum can be applied to many causes, not just his own. Equally, it can be inverted by those who have optimistic intellects but suffer from a pessimistic sapping of the will. Moreover, it can be reshuffled to allow room also for super-pessimists of both intellect and will, as well as for super-optimists whose smile may outlast reality.

The Cheshire Cat faded
until nothing was left but the smile …

The significant factor, in all these permutations, is that reason is reinstated into people’s responses to their lives and times. Intellectual attitudes draw upon many sources, rational and emotional alike. For all analysts of the human condition, it’s as well to be aware of one’s own evolving template. A reflex optimism, for example, may lead one astray, unless tempered by rational cogitation and debate with others. I write as a perennial optimist who tries to make analytical adjustments to offset my biases. This process is based upon what I’ve learned from experience – and from many ad hoc conversations with others. So readers, should we be sitting together with a good chance of open-ended discussion, I’m liable to ask my favourite question: are you an optimist?

ENDNOTES:

1 For a polemic against mindless good cheer, see B. Ehrenreich, Bright-Sided: How the Relentless Promotion of Positive Thinking has Undermined America (New York, 2009), publ. in the UK as Smile or Die: How Positive Thinking Fooled America and the World (2009). See also S. Burnett, The Happiness Agenda: A Modern Obsession (New York, 2012).

2 Referencing Dr Pangloss in Voltaire’s satirical Candide: ou l’optimisme (Paris, 1759), immediately transl. into Eng. as Candide: Or, the Optimist.

3 See e.g. discussions in K.H.M. Creal, The Idea of Progress: The Origins of Modern Optimism (Toronto, 1970); W. Laqueur, Optimism in Politics: Reflections on Contemporary History (2017).

4 Compare J. Milton, L’Allegro with Il Penseroso (both written 1631; 1st publ. 1645), in J. Milton, The Poetical Works (Oxford, 1900), pp. 20-8.

5 There is a massive literature on these themes. See e.g. E. Fox, Rainy Brain, Sunny Brain: The New Science of Optimism and Pessimism (2012); P.B. Warr, The Psychology of Happiness (2019); W.C. Compton, Positive Psychology: The Science of Happiness and Flourishing (Los Angeles, 2019); plus countless manuals of self-help.

6 From A. Gramsci, Selections from the Prison Notebooks (1971). See also context in P.D. Thomas, The Gramscian Moment: Philosophy, Hegemony and Marxism (Leiden/Boston, 2009); A. Davidson, Antonio Gramsci: Towards an Intellectual Biography (1977; 2016); L. Kolakowski, Main Currents of Marxism, Vol. 3: The Breakdown (1971); N. Greaves, Gramsci’s Marxism: Reclaiming a Philosophy of History and Politics (Leicester, 2009).

MONTHLY BLOG 101, ARE YOU A LUMPER OR SPLITTER? HOW WELL DO YOU KNOW YOUR OWN CAST OF MIND?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

The terminology, derived from Charles Darwin,1 is hardly elegant. Yet it highlights rival polarities in the intellectual cast of mind. ‘Lumpers’ seek to assemble fragments of knowledge into one big picture, while ‘splitters’ see instead complication upon complication. An earlier permutation of that dichotomy was popularised by Isaiah Berlin. In The Hedgehog and the Fox (1953), he distinguished between brainy foxes, who know many things, and intellectual hedgehogs, who apparently know one big thing.2

Fox from © Clipart 2019; Hedgehog from © GetDrawings.com (2019)

These animalian embodiments of modes of thought are derived from a fragmentary dictum of the classical Greek poet Archilochus; and they remain more fanciful than convincing. It’s not self-evident that a hedgehog’s mentality is really so overwhelmingly single-minded.3 Nor is it clear that the reverse syndrome applies particularly to foxes, which have a reputation for craft and guile.4 To make his point with reference to human thinkers, Berlin instanced the Russian novelist Leo Tolstoy as a classic ‘hedgehog’. Really? The small and prickly hedgehog hardly seems a good proxy for a grandly sweeping thinker like Tolstoy.

Those objections to Berlin’s categories, incidentally, are good examples of hostile ‘splitting’. They quibble and contradict. Sweeping generalisations are rejected. Such objections recall a dictum in a Poul Anderson sci-fi novella, when one character states gravely: ‘I have yet to see any problem, which, when you looked at it in the right way, did not become still more complicated’.5

Arguments between aggregators/generalisers and disaggregators/sceptics, which occur in many subjects, have been particularly high-profile among historians. The lumping/splitting dichotomy was recycled in 1975 by the American J.H. Hexter.6 He accused the Marxist Christopher Hill not only of ‘lumping’ but, even worse, of deploying historical evidence selectively, to bolster a partisan interpretation. Hill replied relatively tersely.7 He rejected the charge that he did not play fair with the sources. But he proudly accepted that, through his research, he sought to find and explain meanings in history. The polarities of lumping/splitting were plain for all to see.

Historical ‘lumpers’ argue that all analysis depends upon some degree of sorting/processing/generalising, applied to disparate information. Merely itemising date after date, or fact after fact ad infinitum, would not tell anyone anything. On those dreadful occasions when lecturers do actually proceed by listing minute details one by one (for example, going through events year by year), the audience’s frustration very quickly becomes apparent.

So ‘lumpers’ like big broad interpretations. And they tend to write big bold studies, with clear long-term trends. Karl Marx’s brief panoramic survey of world history, compressed into nine pages of The Communist Manifesto, was a classic piece of ‘lumping’.8 In the twentieth century, the British Marxist historian E.P. Thompson was another ‘lumper’ who sought the big picture, although he could be a combative ‘splitter’ about the faults of others.9

‘Splitters’ conversely point out that, if there were big broad-brush interpretations that were reliably apparent, they would have been discovered and accepted by now. However, the continual debates between historians in every generation indicate that grand generalisations are continually being attacked. The progression of the subject relies upon a healthy dose of disaggregation alongside aggregation. ‘Splitters’ therefore produce accounts of rich detail, complications, diversities, propounding singular rather than universal meanings, and stressing contingency over grand trends.

Sometimes critics of historical generalisations are too angry and acerbic. They can thus appear too negative and destructive. Yet one of the twentieth century’s most impressive historical ‘splitters’ was socially a witty and genial man. Intellectually, however, F.J. ‘Jack’ Fisher was widely feared for his razor-sharp and trenchant demolitions of any given historical analysis. Indeed, his super-critical cast of mind had the effect of limiting his own written output to a handful of brilliant interpretative essays rather than a ‘big book’.10 (Fisher was my research supervisor. His most caustic remark to me came after reading a draft chapter: ‘There is nothing wrong with this, other than a female desire to tell all and an Oxbridge desire to tell it chronologically.’ Ouch! Fisher was not anti-woman, although he was critical of Oxbridge, where I’d taken my first degree. But he used this formulation to grab my attention – and it certainly did.)

Among research historians today, the temperamental/intellectual cast of mind often inclines them to ‘splitting’, partly because there are many simplistic generalisations about history in public circulation which call out for contradiction or complication. Of course, the precise distribution around the norm remains unknown. These days, I would guesstimate that the profession divides into roughly 45% ‘lumpers’, seeking big grand overviews, and 55% ‘splitters’, stressing detail, diversity, and contingency. The classification, however, depends partly on the occasion and type of output, since single-person expositions on TV and radio encourage generalisations, while round-tables and panels thrive on disagreement, where splitters can come into their own.

Moreover, there are not only personal variations, depending upon circumstance, but also major oscillations in intellectual fashions within the discipline. In the later twentieth century, for example, there was a growing, though not universal, suspicion of so-called Grand Narratives (big through-time interpretations).11 The high tide of the sceptical trend known as ‘revisionism’ challenged many old generalisations and easy assumptions. Revisionists did not constitute one single school of thought. Many did favour conservative interpretations of history, but, as remains apparent today, there was and is more than one form of conservatism. That said, revisionists were generally agreed in rejecting both left-wing Marxist conflict models of revolutionary change via class struggles and liberal Whiggish linear models of evolving Progress via spreading education, constitutional rights and so forth.12

Yet the alignments were never simple (a splitterish comment from myself). Thus J.H. Hexter was a ‘splitter’ when confronting Marxists like Hill. But he was a ‘lumper’ when propounding his own Whig view of history as a process of evolving Freedom. So Hexter’s later strictures on revisionism were as fierce as was his earlier critique of Hill.13

Ideally, most research historians probably seek a judicious balance between ‘lumping’ and ‘splitting’. There is scope both for generalisations and for qualifications. After all, there is diversity within the human experience and within the cosmos. Yet there are also common themes, deep patterns, and detectable trends.

Ultimately, however, the dichotomous choice between either ‘lumping’ or ‘splitting’ is a completely false option, when pursued to its limits. Human thought, in all the disciplines, depends upon a continuous process of building/qualifying/pulling down/rebuilding/requalifying/ and so on, endlessly. With both detailed qualifications and with generalisations. An analysis built upon And+And+And+And+And would become too airy and generalised to have realistic meaning. Just as a formulation based upon But+But+But+But+But would keep negating its own negations. So, yes. Individually, it’s worth thinking about one’s own cast of mind and intellectual inclinations. (I personally enjoy both lumping and splitting, including criticising various outworn terminologies for historical periodisation).14 Furthermore, self-knowledge allows personal scope to make auto-adjustments, if deemed desirable. And then, better still, to weld the best features of ‘lumping’ and ‘splitting’ into original thought. And+But+And+Eureka.

ENDNOTES:

1 Charles Darwin in a letter dated August 1857: ‘It is good to have hair-splitters and lumpers’: see Darwin Correspondence Letter 2130 in https://www.darwinproject.ac.uk/.

2 I. Berlin, The Hedgehog and the Fox: An Essay on Tolstoy’s View of History (1953).

3 For hedgehogs, now an endangered species, see S. Coulthard, The Hedgehog Handbook (2018). If the species were to have one big message for humans today, it would no doubt be: ‘Stop destroying our habitat and support the Hedgehog Preservation Society’.

4 M. Berman, Fox Tales and Folklore (2002).

5 From P. Anderson, Call Me Joe (1957).

6 J.H. Hexter, ‘The Burden of Proof: The Historical Method of Christopher Hill’, Times Literary Supplement, 25 Oct. 1975, repr. in J.H. Hexter, On Historians: Reappraisals of Some of the Makers of Modern History (1979), pp. 227-51.

7 For Hill’s rebuttal, see The Times Literary Supplement, 7 Nov. 1975, p. 1333.

8 K. Marx and F. Engels, The Manifesto of the Communist Party (1848), Section I: ‘Bourgeois and Proletarians’, in D. McLellan (ed.), Karl Marx: Selected Writings (Oxford, 1977), pp. 222-31.

9 Among many overviews, see e.g. C. Efstathiou, E.P. Thompson: A Twentieth-Century Romantic (2015); P.J. Corfield, E.P. Thompson, Historian: An Appreciation (1993; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf45.

10 See P.J. Corfield, F.J. Fisher (1908-88) and the Dialectic of Economic History (1990; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf46.

11 See esp. J-F. Lyotard, The Postmodern Condition: A Report on Knowledge (Paris, 1979; in Eng. transl. 1984), p. 7, which detected ‘an incredulity toward meta-narratives’; and further discussions in G.K. Browning, Lyotard and the End of Grand Narratives (Cardiff, 2000); and A. Munslow, Narrative and History (2018). Earlier, Lawrence Stone, a classic historian ‘lumper’, had detected a return to narrative styles of exposition: see L. Stone, ‘The Revival of Narrative: Reflections on a New Old History’, Past & Present, 85 (1979), pp. 3-24. But in this essay Stone was detecting a decline in social-scientific styles of History-writing – not a return to old-style Grand Narratives.

12 Revisionism is sufficiently variegated to have avoided summary within one big study. But different debates are surveyed in L. Labedz (ed.), Revisionism: Essays on the History of Marxist Ideas (1962); J.M. Maddox, Hiroshima in History: The Myths of Revisionism (1974; 2011); L. Brenner, The Iron Wall: Zionist Revisionism from Jabotinsky to Shamir (1984); E. Longley, The Living Stream: Literature and Revisionism in Ireland (Newcastle upon Tyne, 1994); and M. Haynes and J. Wolfreys (eds), History and Revolution: Refuting Revisionism (2007).

13 J.H. Hexter (1910-96) founded in 1986 the Center for the History of Freedom at Washington University, USA, where he was Professor of the History of Freedom, and launched The Making of Modern Freedom series. For his views on revisionism, see J.H. Hexter, ‘Historiographical Perspectives: The Early Stuarts and Parliaments – Old Hat and the Nouvelle Vague’, Parliamentary History, 1 (1982), pp. 181-215; and analysis in W.H. Dray, ‘J.H. Hexter, Neo-Whiggism and Early Stuart Historiography’, History & Theory, 26 (1987), pp. 133-49.

14 See e.g. P.J. Corfield, ‘Primevalism: Saluting a Renamed Prehistory’, in A. Baysal, E.L. Baysal and S. Souvatzi (eds), Time and History in Prehistory (2019), pp. 265-82; and P.J. Corfield, ‘POST-Medievalism/ Modernity/ Postmodernity?’ Rethinking History, 14 (2010), pp. 379-404; also on http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf20.

MONTHLY BLOG 99, WHY BOTHER TO STUDY THE RULEBOOK?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Joining a public committee of any kind? Before getting enmeshed in the details, I recommend studying the rulebook. Why on earth? Such advice seems arcane, indeed positively nerdy. But I have a good reason for this recommendation. Framework rules are the hallmark of a constitutionalist culture.

Fig.1 The handsome front cover of the first edition of Robert’s Rules of Order (1876): these model rules, based upon the practices of the US Congress, remain widely adopted across the USA, their updating being undertaken by the Robert’s Rules Association, most recently in 2011.

Once, many years ago, I was nominated by the London education authority – then in the form of the Inner London Education Authority or ILEA – onto a charitable trust in Battersea, where I live. I accepted, not with wild enthusiasm, but from a sense of civic duty. The Trust was tiny and then did not have much money. It was rumoured that a former treasurer in the 1930s had absconded with all the spare cash. But anyway in the early 1970s the Trust was pottering along and did not seem likely to be controversial.

My experience as a Trustee was, however, both depressing and frustrating. The Trust was then named Sir Walter St. John’s Trust; and it exists today in an updated and expanded guise as the Sir Walter St. John’s Educational Charity (www.swsjcharity.org.uk). It was founded in 1700 by Battersea’s local Lord of the Manor, after whom it is named. In the 1970s, the Trust didn’t do much business at all. The only recurrent item on the agenda was the question of what to do about a Victorian memorial window which lacked a home. The fate of the Bogle Smith Window (as it was known) had its faintly comic side. Surely somewhere could be found to locate it, within one or other of the two local state-sector grammar schools, for which the Trust was ground landowner? But soon the humour of wasting hours of debate on a homeless window palled.

I also found it irksome to be treated throughout with deep suspicion and resentment by most of my fellow Trustees. They were Old Boys from the two schools in question: Battersea Grammar School and Sir Walter St. John School. All the Trust business was conducted with outward calm. There were no rows between the large majority of Old Boys and the two women appointed by the ILEA. My fellow ILEA-nominee hardly ever attended; and said nothing, when she did. Yet we were treated with an unforgiving hostility, which I found surprising and annoying. A degree of misogyny was not unusual; yet often the stereotypical ‘good old boys’ were personally rather charming to women (‘the ladies, God bless ’em’) even while deploring their intrusion into public business.

But no, these Old Boys were not charming, or even affable. And their hostile attitude was not caused purely by misogyny. It was politics. They hated the Labour-run ILEA and therefore the two ILEA appointees on the Trust. It was a foretaste of arguments to come. By the late 1970s, the Conservatives in London, led by Councillors in Wandsworth (which includes Battersea) were gunning for the ILEA. And in 1990 it was indeed abolished by the Thatcher government.

More than that, the Old Boys on the Trust were ready to fight to prevent their beloved grammar schools from going comprehensive. (And in the event both schools later left the public sector to avoid that ‘fate’). So the Old Boys’ passion for their cause was understandable and, from their point of view, righteous. However, there was no good reason to translate ideological differences into such persistently rude and snubbing behaviour.

Here’s where the rulebook came into play. I was so irked by their attitude – and especially by the behaviour of the Trust’s Chair – that I resolved to nominate an alternative person for his position at the next Annual General Meeting. I wouldn’t have the votes to win; but I could publicly record my disapprobation. The months passed. More than a year passed. I requested to know the date of the Annual General Meeting. To a man, the Old Boys assured me that they never held such things, with something of a lofty laugh and sneer at my naivety. In reply, I argued firmly that all properly constituted civic bodies had to hold such events. They scoffed. ‘Well, please may I see the Trust’s standing orders?’ I requested, in order to check. In united confidence, the Old Boys told me that they had none and needed none. We had reached an impasse.

At this point, the veteran committee clerk, who mainly took no notice of the detailed discussions, began to look a bit anxious. He was evidently stung by the assertion that the Trust operated under no rules. After some wrangling, it was agreed that the clerk should investigate. Had I but known it, I should have cheered, or even jeered, at that moment. Because I never saw any of the Old Boys again.

Several weeks after this meeting, I received through the post a copy of the Trust’s Standing Orders. They looked as though they had been typed in the late nineteenth century on an ancient typewriter. Nonetheless, the first point was crystal clear: all members of the Trust should be given a copy of the standing orders upon appointment. I was instantly cheered. But there was more, much more. Of course, there had to be an Annual General Meeting, when the Chair and officers were to be elected. And, prior to that, all members of the Trust had to be validly appointed, via an array of different constitutional mechanisms.

An accompanying letter informed me that the only two members of the Trust who were correctly appointed were the two ILEA nominees. I had more than won my point. It turned out that over the years the Old Boys had devised a system of co-options for membership among friends, which was constitutionally invalid. They were operating as an ad hoc private club, not as a public body. Their positions were automatically terminated; and they never reappeared.

In due course, the vacancies were filled by the various nominating bodies; and the Trust resumed its very minimal amount of business. Later, into the 1980s, the Trust did have some key decisions to make, about the future of the two schools. I heard that its sessions became quite heated politically. That news was not surprising to me, as I already knew how high feelings could run on such issues. These days, the Trust does have funds, from the eventual sale of the schools, and is now an active educational charity.

Personally, I declined to be renominated, once my first term of service on the Trust was done. I had wasted too much time on fruitless and unpleasant meetings. However, I did learn about the importance of the rulebook. Not that I believe in rigid adherence to rules and regulations. Often, there’s an excellent case for flexibility. But the flexibility should operate around a set of framework rules which are generally agreed and upheld between all parties.

Rulebooks are to be found everywhere in public life in constitutionalist societies. Parliaments have their own. Army regiments too. So do professional societies, church associations, trade unions, school boards, and public businesses. And many private clubs and organisations find them equally useful as well. Without a set of agreed conventions for the conduct of business and the constitution of authority, there’s no way of stopping arbitrary decisions – and arbitrary systems can eventually slide into dictatorships.

As it happens, the Old Boys on the Sir Walter St. John Trust were behaving only improperly, not evilly. I always regretted the fact that they simply disappeared from the meetings. They should at least have been thanked for their care for the Bogle Smith Window. And I would have enjoyed the chance to say, mildly but explicitly: ‘I told you so!’

Goodness knows what happened to these men in later years. I guess that they continued to meet as a group of friends, with a great new theme for huffing and puffing at the awfulness of modern womanhood, especially the Labour-voting ones. If they did pause to think, they might have realised that, had they been personally more pleasant to the intruders into their group, then there would have been no immediate challenge to their position. I certainly had no idea that my request to see the standing orders would lead to such an outcome.

Needless to say, the course of history does not hinge upon this story. I personally, however, learned three lasting lessons. Check to see what civic tasks involve before accepting them. Remain personally affable to all with whom you have public dealings, even if you disagree politically. And if you do join a civic organisation, always study the relevant rulebook. ‘I tried to tell them so!’ all those years ago – and I’m doing it now in writing. Moreover, the last of those three points is highly relevant today, when the US President and US Congress are locking horns over the interpretation of the US constitutional rulebook. May the rule of law prevail – and no prizes for guessing which side I think best supports that!

MONTHLY BLOG 98, HOW SHOULD YOU APPROACH THE PhD VIVA?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Asked by a friend about my extensive experience
of helping candidates through PhD vivas,
I’ve distilled my advice as follows:

Anticipation
Participation
Progression

1: Anticipation

I won’t call this preparation, since everything that you have researched, debated and written about during the entire research period has been preparation for the thesis and viva. But it’s worth undertaking a thoughtful process of anticipation. After a break from the research, return to the thesis and reread it. Then prepare a short statement about your thesis aims and conclusions.

Examiners often invite candidates to start the proceedings with such a succinct statement. If they don’t in your case, then keep it up your sleeve. It’s bound to be useful at a later point in the discussions.

As you reread the thesis, note (as judiciously as you can) the good points within your thesis – and also consider where criticisms and challenges might be made. Some authors love everything that they have written; others detest their own prose. Try to keep a balance.

Having noted areas for criticism and challenge, then think carefully and be ready with answers. It’s not invariably true that authors are their own best critics. Nonetheless, they can often tell where the shoe pinches. Your supervisor will also help with this process.

In the British academic system, the viva is a serious hurdle. So don’t assemble your friends and family to wait outside the examination room. Whatever the outcome, you will need some time for quiet reflection immediately afterwards. It’s important to absorb the prior debate, alongside the examiners’ verdict. And either then or not long afterwards, you need a quick debriefing with your supervisor; and a timetable for corrections and revisions (if any). However, it’s fine to keep friends and family on hold for a celebration later in the day. By the way, in some other academic systems, e.g. in France, the critical vetting takes place beforehand, and the viva is a public confirmation of success. That’s a different process, and it is conducted very differently.

Either way, the viva is a big, big hurdle. Anticipate with care and relish.

2: Participation

Once in the appointed examination venue, treat the viva as a high-powered research consultation. You are coming to talk with fellow scholars, so don’t be obsequious or deferential. On the other hand, it is your work that is under the spotlight, so don’t display either too much swagger (off-putting) or fear (disappointing).

These days, vivas are approached by all parties in a thoroughly professional way. They are intense affairs; and candidates often don’t remember much detail afterwards. So if you have the option of inviting in your supervisor (not all Universities allow this), then do so. S/he does not intervene at all – often sitting at the back of the room – but can keep useful notes on the discussion.

After a short opening statement from the candidate (depending on the decision of the examiners), a prolonged and detailed discussion ensues. It covers points both small and large, in something of a barrage. The candidate’s task is to assess the examiners’ input and take an instant decision. If the points raised are crucial to your core message, then you must hold your ground, courteously but firmly, and defend your position. The examiners are testing you. If, on the other hand, the criticisms are well made and are not absolutely central, then it’s fine to give way graciously and promise to amend either in the revised thesis or in a subsequent publication.

Every moment requires a quick assessment and a suitable response. You are on the spot throughout, which is why vivas are commonly experienced as both exciting and tiring.

Either at the very start (less common these days) or at the very end (becoming the usual practice), the examiners give you their verdict. As the discussion unfolds, do not try to second-guess the examiners’ intentions. Some will be stony-faced. Some will nod and smile continually. But their facial expressions may not reflect their private thoughts. Furthermore, the examiners have not been asked whether they like you; or even whether they agree with your argument and conclusions. Their task in a History viva is to assess whether you have made an original contribution to historical knowledge, which is well argued, well substantiated, and presented to a publishable standard. No more, and no less.

Your task therefore is not to study the examiners but to concentrate upon fielding their comments/questions and to keep the ball in play (essential advice for all interviews, incidentally).

The options for final assessments by the examiners vary, depending upon the specific regulations of each University. The main categories, however, are pretty standardised, as follows:

  • Pass, with no changes required. (Excellent.)
  • Pass, with minor corrections. (Good. The most common result. Make changes swiftly, exactly as required.)
  • Reference back, with considerable corrections required. (Initially a disappointing verdict; but, viewed in the right light, it gives a chance for revisions to make the required improvements and to head off criticisms before the thesis becomes public.)
  • Offer to award degree at lower academic level: usually M.Phil. rather than PhD. (Certainly disappointing. Candidate may be given chance to decide whether to accept this award or not. If accepting, then be pleased to have gained a good research qualification, even if not at the level initially desired. If deciding against acceptance, then, depending upon University regulations, it may be possible to resubmit after major improvements. In which case, give it a serious go. But check very carefully before deciding.)
  • Fail outright, without chance of resubmitting. (This outcome should not happen, as internal Departmental or Faculty review mechanisms should have halted the candidacy before getting to the viva. In the rare event of outright failure, the candidate, in consultation with the supervisor, should reassess and consider what alternative outcomes, including publications, can be made of the research material.)

Whatever the verdict, accept it with good grace. The outcome may well require talking things over with your supervisor, after the meeting. In extremis, you may even wish to challenge the verdict on procedural grounds. But that can’t be done during the meeting.

By the way, challenges to PhD vivas are very rare; and rarely successful, unless a University has seriously failed to follow its own procedures. These days, all examinations are done carefully, by the book. Much of the solemnity of a viva thus comes from its finality. It is the ‘live’ encapsulation of everything that you have worked for during your long years of research.

3: Progression

Passing the viva is a real rite de passage. You are no longer a research apprentice but have submitted your master-work. Once your thesis is passed, perhaps after revisions, you have joined the community of accredited scholars. After all, a doctorate is a known qualification which is sincerely admired by academics world-wide as well as generally respected by the wider public.

Clio, the Muse of History, in a Victorian print.

The examiners will give you a full report, which you should discuss with your supervisor. If s/he has been in attendance, s/he will also have notes and suggestions for you. The examiners may also have made specific suggestions for publication, though they are not required to do so.

Once having passed the viva, take a deep breath and enjoy the moment to the full. You have produced an original contribution to historical knowledge. That’s the definitional criterion of a History doctorate. It will be consulted by many specialists over the years.

Yet there is one further step, which is mightily to be encouraged. The viva is not an ending but a moment of progression. After your many years of work, you should draw from your doctorate to achieve at least one publication. The step into print will give you an additional and well-deserved public badge of scholarly honour. It allows you to reach a wider readership. And it may launch you into further publications, once you have broken your duck.

So … there we are. You’ve undertaken a long, long haul. You’ve experienced an intellectual adventure as well as episodes of boredom, uncertainty, and angst. Passing the viva, after finally completing and if necessary correcting a doctoral thesis, is a great, unrepeatable moment. Bravo!

MONTHLY BLOG 97, WHY IS THE REMARKABLE CHARLOTTE DESPARD NOT BETTER KNOWN?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Fig.1 Charlotte Despard speaking at an anti-fascist rally, Trafalgar Square, 12 June 1933:
photograph by James Jarché, Daily Herald Archive.

Charlotte Despard (1844-1939) was a remarkable – even amazing – woman. Don’t just take my word for it. Listen to Mahatma Gandhi (1869-1948). Visiting London in 1909, he met all the leading suffragettes. The one who impressed him most was Charlotte Despard. She is ‘a wonderful person’, he recorded. ‘I had long talks with her and admire her greatly’.1 They both affirmed their faith in the non-violent strategy of political protest by civil disobedience. Despard called it ‘spiritual resistance’.

What’s more, non-violent protest has become one of the twentieth century’s greatest contributions to potent mass campaigning – without resorting to counter-productive violence. Associated with this strategy, the names of Henry Thoreau, Mahatma Gandhi and Martin Luther King, all controversial in their day, have become canonised.2 Yet Charlotte Despard, who was also controversial in her day, has been substantially dropped from the historical record.

Not entirely so. On 14 December 2018 Battersea Labour unveiled a blue plaque in her honour, exactly one hundred years after the date when she stood as the Labour Party candidate in North Battersea in the 1918 general election. She was one of the feminist pioneers, when no more than sixteen women stood. But Despard lost heavily to the Liberal candidate, even though industrial North Battersea was then emerging as a Labour stronghold.3

And one major reason for her loss helps to explain her disappearance from mainstream historical memory. Despard was a pacifist, who opposed the First World War and campaigned against conscription. Many patriotic voters in Battersea disagreed with this stance. In the immediate aftermath of war, emotions of relief and pride triumphed. Some months later, Labour swept the board in the 1919 Battersea municipal elections; but without Charlotte Despard on the slate.

Leading pacifists are not necessarily all neglected by history.4 But the key point was that Charlotte Despard campaigned for many varied causes during her long life and, at every stage, weakened her links with previous supporters. Her radical trajectory made complete sense to her. She sought to befriend lame dogs and to champion outsiders. Yet as an independent spirit – and seemingly a psychological loner – she walked her own pathway.

Despard was by birth an upper-crust lady of impeccable Anglo-Irish ancestry, with high-ranking military connections. For 40 years, she lived quietly, achieving a happy marriage and a career as a minor novelist. Yet, after being widowed at the age of 40, she had an extraordinary mid- and late-life flowering. She moved to Battersea’s Nine Elms, living among the poorest of the poor. And she then became a life-long radical campaigner. By the end of her career, she was penniless, having given all her funds to her chosen causes.

A convinced suffragette, Despard joined the Women’s Social and Political Union and was twice imprisoned for her public protests. In 1907, however, she was one of the leading figures to challenge the authoritarian leadership style of Christabel Pankhurst. Despard resigned and founded the rival Women’s Freedom League. This smaller group opposed the use of violence. Instead, its members took symbolic action, like unfurling banners in parliament. They also advocated passive resistance, like non-payment of taxation and non-cooperation with the census. (I recently discovered, thanks to the research of a family member, that my great-grandmother was a would-be WFL supporter. So the 1911 census enumerator duly noted that Mrs Matilda Corfield, living in Sheffield, had given information only ‘under Protest (she wants the vote)’.5 This particular example of resistance was very muffled and inconsequential. Nevertheless, it indicated how unknown women across the country tried to respond to WFL advice. It was one way of slowly changing the climate of public opinion.)

However, the energetic Charlotte Despard did not confine her efforts solely to the cause of female suffrage. Her life in Battersea radicalised her politically and she became a socialist. She was not good at detailed committee work. Her forte was activism. Indefatigably, she organised a local welfare system. She funded health centres for mothers and babies, exchange points for cots and equipment, youth clubs, and halls for local meetings. And the front room of her small premises in Nine Elms was made available to the public as a free reading room, stocked with books and newspapers. It was a one-woman exercise in practical philanthropy. What’s more, her 1918 election manifesto called for a minimum wage – something not achieved until 1998.

Among the Battersea workers, the tall, wiry, and invariably dignified Charlotte Despard cut an impressive figure. A lifelong vegetarian, she was always active and energetic. And she believed in the symbolic importance of dress. Thus she habitually wore sandals (or boots in winter) under long, flowing robes, a lace shawl, and a mantilla-like head-dress. The result was a timeless style, unconcerned with passing fashions. She looked like a secular sister of mercy.

Fig.2 Charlotte Despard in the poor tenements of Battersea’s Nine Elms, where she lived from 1890 to the early 1920s, instituting and funding local welfare services. Her visitors commented adversely on the notorious ‘Battersea smell’ of combined industrial effluent and smoke from innumerable coal fires; but Despard reportedly took no notice.

For a number of years, Despard worked closely with the newly founded Battersea Labour Party (1908- ), strengthening its global connections. She attended various international congresses; and she backed the Indian communist Shapurji Saklatvala as the Labour-endorsed candidate in Battersea North at the general election in 1922. (He won, receiving over 11,000 votes.) Yet, as already noted, the Battersea electorate in 1918 had rebuffed her own campaign.

Then at a relatively loose end, Despard moved to Dublin in the early 1920s. She had already rejected her Irish Ascendancy background by converting to Catholicism. There she actively embraced the cause of Irish nationalism and republicanism. She became a close supporter of Maud Gonne, the charismatic exponent of Irish cultural and political independence. By the later 1920s, however, Despard was unhappy with the conservatism of Irish politics. In 1927 she was classed as a dangerous subversive by the Free State, for opposing the Anglo-Irish Treaty settlement. She eventually moved to Belfast and changed tack politically to endorse Soviet communism. She toured Russia and became secretary of the British Friends of the Soviet Union (FSU), which was affiliated to the International Organisation of the same name.

During this variegated trajectory, Despard in turn shocked middle-class suffragettes who disliked her socialism. She then offended Battersea workers who rejected her pacifism. She next infuriated English Protestants who hated her Irish nationalism. And she finally outraged Irish Catholics (and many Protestants as well) who opposed her support for Russian communism. In 1933, indeed, her Dublin house was torched and looted by an angry crowd of Irish anti-communists.6

In fact, Despard always had her personal supporters, as well as plentiful opponents. But she did not have one consistent following. She wrote no autobiography; no memorable tract of political theory. And she had no close family supporters to tend her memory. She remained on good terms with her younger brother throughout her life. But he was Sir John French, a leading military commander in the British Army and from 1918 onwards Lord Lieutenant of Ireland. The siblings disagreed politically on everything – although both shared the capacity to communicate on easy terms with people from many different backgrounds. To the Despards, ‘Aunt Lottie’ was thus an eccentric oddity. To other respectable family friends, she was ‘a witch’, and a dangerous one at that.7

These factors combined to isolate Despard and to push her, after her death, into historical limbo. There are very few public monuments or memorials to her indomitable career. In north London, a pleasant pub on the Archway Road is named after her, on land once owned by her husband, Colonel Despard. On Battersea’s Doddington Estate, there is an avenue named after her, commemorating her welfare work in the area. And now there is the blue plaque outside the headquarters of Battersea Labour at 177 Lavender Hill, SW11. These memorials are fine but hardly enough.

Fig.3 Blue plaque to Charlotte Despard, outside 177 Lavender Hill, London SW11 5TE: installed 14 December 2018, on the precise centenary of her standing for parliament in 1918, as one of only 16 women pioneers to do so.

Why should she be remembered? The answer is not that everyone would have agreed (then or later) with all of Charlotte Despard’s political calls. As this account has shown, she was always controversial and, on Russia, self-deceived into thinking it much more of a workers’ paradise than it was (as were many, though not all, left-leaning intellectuals in the West). Nonetheless, she is a remarkable figure in the history of public feminism. She not only held views but campaigned for them, combining practical on-the-ground organisation, calls for symbolic non-violent protest and ‘spiritual resistance’, and public oratory. And she did so for nigh on fifty years, campaigning well into advanced old age.

Indomitability, peaceful but forceful, was her signature style. She quoted Shelley on the need for Love, Hope, and Endurance. When she was in her mid-sixties, she addressed a mass rally in Trafalgar Square (of course, then without a microphone). Her speeches were reportedly allusive and wide-ranging, seeking to convey inspiration and urgency. One onlooker remembered that her ‘thin, fragile body seemed to vibrate with a prophecy’.8

Appropriately for a radical campaigner, Charlotte Despard’s last major public appearance was on 12 June 1933, when she spoke passionately at a mass anti-fascist rally in Trafalgar Square. At that time, she was aged 89. It was still unusual then for women to speak out boldly in public. They often faced jeers and taunts for doing so. But the photographs of her public appearances show her as unflinching, even when she was the only woman amidst crowds of men. Above all, that feminist feat of speaking at a mass rally at the age of 89 makes a good case for placing a statue on Trafalgar Square’s vacant fourth plinth, showing Despard in full oratorical flow. After all, she really was there. And, if not on that particular spot, then somewhere relevant in Battersea. Charlotte Despard, born 175 years ago and campaigning up until the start of the Second World War, was a remarkable phenomenon. Her civic and feminist commitment deserves public commemoration – and in a symbolic style worthy of the woman.

Figs 4 + 5: Photos showing Despard, speaking in Trafalgar Square, without a microphone:
(L) dated 1910 when she was 66, and (R) dated 1933 when she was aged 89.
Her stance and demeanour are identically rapt, justifying one listener’s appreciative remark:
‘Mrs Despard – she always gets a crowd’.

1 Quoted in M. Mulvihill, Charlotte Despard: A Biography (1989), p. 86. See also A. Linklater, An Unhusbanded Life: Charlotte Despard, Suffragette, Socialist and Sinn Feiner (1980); and, for Battersea context, P.J. Corfield in Battersea Matters (Autumn 2016), p. 11; and PJC with Mike Marchant, DVD: Red Battersea: One Hundred Years of Labour, 1908-2008 (2008).

2 A. Roberts and T. Garton Ash (eds), Civil Resistance and Power Politics: The Experience of Non-Violent Action from Gandhi to the Present (Oxford, 2009); R.L. Holmes and B.L. Gan (eds), Nonviolence in Theory and Practice (Long Grove, Illinois, 2012).

3 1918 general election result for North Battersea: Richard Morris, Liberal (11,231 = 66.6% of all voting); Charlotte Despard, Labour (5,634 = 33.4%). Turnout = 43.7%.

4 P. Brock and N. Young, Pacifism in the Twentieth Century (New York, 1999).

5 With thanks to research undertaken by Annette Aseriau.

6 Mulvihill, Charlotte Despard, p. 180.

7 Ibid., pp. 46-7, 78-9.

8 Account by Christopher St John, in Mulvihill, Charlotte Despard, p. 77.


MONTHLY BLOG 96, WHAT’S WRONG WITH PREHISTORY?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Arthur’s Stone, Herefordshire, dating from c.3000 BCE: photo © Tony Belton, 2016

What’s wrong with ‘prehistory’? Absolutely nothing but the name. People refer to ancient monuments as ‘prehistoric’ and everyone knows roughly what is meant. The illustration (above) shows an ancient burial tomb, known as Arthur’s Stone, dating from 3000 BCE, which I visited in Herefordshire on a summer day in 2016. It did and does indeed look truly venerable. So loose terms such as ‘prehistoric’ are passable enough if used casually.

But ‘prehistory’ as a scholarly term, applied to a prolonged period of human history? Seriously misleading. It implies that the long aeons of foundational human history, before the advent of literacy, somehow occurred in a separate ante-chamber to the ‘real’ deal.

The acquiring of skills in reading and writing (which occurred in different parts of the world at different times) was in fact part of a lengthy process of human adaptation and invention. Before literacy, key developments included: the adoption of clothing; the taming of fire; the invention of tools; the refinement of tools and weapons with handles; the invention of the wheel; the arrival of speech; the advent of decorative arts; the formulation of burial rituals; the domestication of animals; the development of a calendrical consciousness; the capacity to cope with population fluctuations including survival during the Ice Age; the start of permanent settlements and farming; and the cumulative mental and cultural preparation for the invention of reading and writing. Some list! The pace of change was often slow; but the changes were absolutely foundational to human history.1

In practice, of course, the skilled and ingenious experts who study pre-literate societies do not consider their subject to be anything other than fully and deeply historical. They use ‘prehistory’ because it is a known term of art. (Often, indeed, they may start their lectures and books with a jovial disclaimer that such terminology should not be taken literally). The idea of ‘prehistory’ was crystallised by Victorian historians, who were developing a deep reverence for the importance of written sources for writing ‘real’ history. But the differences in prime source material, although methodologically significant, are not fundamental enough to deprive the foundational early years of the full status of history. And, in fact, these days historians of all periods study a range of sources. They are not just stuck in archives, reading documents – important as those are. If relevant to their theme, historians may examine buildings, art, artefacts, materials, bones, refuse, carbon datings, statistical extrapolations, and/or genetic evidence (etc etc), just as do archaeologists and ‘prehistorians’.

Moreover, conventional references to ‘prehistory’ have now been blind-sided by the recent return to diachronic (through-time) studies of what is known as Big History. This approach to the past takes as its remit either the whole of the cosmos or at least the whole lifespan of Planet Earth.2 It draws upon insights from cosmologists and astro-physicists, as well as from geologists and biologists. After all, a lot of history had indeed happened before the first humans began to walk. So what are the millennia before the advent of Homo sapiens to be called? Pre-prehistory? Surely not. All these eras form part of what is sometimes known as ‘deep history’: a long time ago but still historical.

So why has the misleading term ‘prehistory’ survived for so long? One major reason lies in the force of inertia – or institutional continuity, to give it a kinder name. ‘Prehistory’ has prevailed as an academic term for over a century. It appears in the names of academic departments, research institutions, learned societies, job descriptions, teaching courses, examination papers, academic journals, books, blogs, conferences, publishers’ preferences for book titles, and popular usages – let alone in scholars’ self-definitions. Little wonder that renaming is not a simple matter. Nonetheless, subjects are continuously being updated – so why not a further step now?

I was prompted to write on this question when three congenial colleagues asked me, a couple of years ago, to contribute to a volume on Time & History in Prehistory (now available, with publication date 2019).3 I was keen to respond but hostile to the last word in their book title. My answer took the form of arguing that this specialist section of historical studies needs a new and better name. I am grateful for the editors’ forbearance in accepting my contribution. It engages with debates elsewhere within the volume, since criticising the terminology of ‘prehistory’ is not new.

Apart from the lack of logic in apparently excluding the foundational experiences of the human species from ‘real’ history, my own further objection is that the division inhibits diachronic analysis of the long term. A surviving relic from ‘prehistoric’ times, like Arthur’s Stone, has a long and intriguing history which still continues. At some stage long before the thirteenth century CE, the modest monument, high on a ridge between the Wye and Golden Valleys, became associated in popular legend with the feats of King Arthur. (Did he win a battle there, rumour speculated, or slay a giant?) That invented linkage is in itself a fascinating example of the spread of the Arthurian legend.4

The site later witnessed some real-life dramas. In the fifteenth century, a knight was killed there in a duel. And in September 1645 the embattled Charles I dined at the Stone with his royalist troops. Perhaps he intended the occasion as a symbolic gesture, although it did not confer upon him sufficient pseudo-Arthurian lustre to defeat Cromwell and the Roundheads.

For the villagers in nearby Dorstone and Bredwardine, Arthur’s Stone at some stage (chronology uncertain) became a venue for popular festivities, with dancing and ‘high jinks’ every midsummer. This long-standing tradition continued until well into Victorian times. As a sober counter-balance, too, the local Baptists in the nineteenth and twentieth centuries organised an ecumenical religious service there each June/July. Living witnesses remember these as occasions of fervent al fresco hymn-singing. Implicitly, they were acknowledging the Stone’s sacral nature, whilst simultaneously purging its pagan associations.

When visiting the Stone myself in 2016, I met by chance a local resident, named Ionwen Williams. In a stroke of research serendipity, we got chatting and she zestfully recounted her memories, as a child before World War II, of joining her schoolfellows to sing hymns at the site each midsummer. This experience and many later visits confirmed for her the special nature of the place. I did not for a moment doubt her memories; but, as a prudent historian, thought it helpful to cross-check – and found them corroborated.

It is abundantly clear that, throughout its five thousand years of existence, Arthur’s Stone has had multiple meanings for the witnessing generations. At one sad stage in the late nineteenth century, it was pillaged by builders taking stones for new constructions. But local objections put a stop to that; and it is now guarded by English Heritage. It is utterly historic, not separately ‘prehistoric’: and the same point applies to all long-surviving monuments, many of which are much bigger and more famous than Arthur’s Stone. Furthermore, deep continuities apply to many other aspects of human history – and not just to physical monuments. For example, there are many claims and counter-claims about the foundations of human behaviour which merit debate, without compartmentalising the eras of pre-literacy from those of post-literacy.

Lastly, what alternative nomenclature might apply? Having in the first draft of my essay rebuked the specialists known as ‘prehistorians’ for not changing their name, I was challenged by the editors to review other options. Obviously it’s not for one individual to decide. It was, however, a good challenge. In many ways, these early millennia might be termed ‘foundational’ in human history. That, after all, is what they were. On the other hand, ‘foundational history’ sounds like a first-year introduction course. Worthy but not very evocative. My essay reviews various options and plumps for ‘primeval’ history. That term not only sounds ancient but signals primacy: in human history, these years came first.5 The contributions within the volume as a whole are questioning and challenging throughout, as they analyse different aspects of Time and, yes, ‘History’. It is a pleasure to join these essays in thinking long.6

1 For an enticing introduction (apart from one word in its subtitle), see C. Gamble, Timewalkers: The Prehistory of Global Colonisation (Sutton: Stroud, 1993).

2 For an introduction, see D.G. Christian, Maps of Time: An Introduction to Big History (U. of California Press: Berkeley, 2004).

3 S. Souvatzi, A. Baysal and E.L. Baysal (eds), Time and History in Prehistory (Routledge: Abingdon, 2019).

4 N.J. Lacy (ed.), The New Arthurian Encyclopaedia (Garland: New York, 1991).

5 P.J. Corfield, ‘Primevalism: Saluting a Renamed Prehistory’, in Souvatzi, Baysal and Baysal (eds), Time and History, pp. 265-82. My own interest in ‘long ago’ was sparked when, as a teenager, I read a study by Ivar Lissner, entitled The Living Past (Cape: London, 1957): for which see P.J. Corfield, ‘An Unknown Book Which Influenced Me’, BLOG no.14 (Nov. 2011).

6 On this theme, see J. Guldi and D. Armitage, The History Manifesto (Cambridge University Press: Cambridge, 2014); P.J. Corfield, ‘What on Earth is the “Temporal Turn” and Why is it Happening Now?’ BLOG no.49 (Jan. 2015); and idem, ‘Thinking Long: Studying History’, BLOG no.94 (Oct. 2018), all BLOGs available on www.penelopejcorfield.com/monthly-blogs.


MONTHLY BLOG 95, ‘WHAT IS THE GREATEST SIN IN THE WORLD?’ CHRISTOPHER HILL AND THE SPIRIT OF EQUALITY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Text of a short talk given by PJC to introduce the First Christopher Hill Memorial Lecture (given by Prof. Justin Champion) at the National Civil War Centre, Newark, on Saturday 3 November 2018.

Christopher Hill was not only a remarkable historian – he was also a remarkable person.1 All his life, he believed, simply and staunchly, in human equality. But he didn’t parade his beliefs. At first meeting, you would have found him a very reserved, very solid citizen. And that’s because he was very reserved – and he was solid in the best sense of that term. He was of medium height, so did not tower over the crowd. But he held himself very erect; had a notably sturdy, broad-shouldered Yorkshire frame; and was very fit, cycling and walking everywhere. And in particular, Christopher Hill had a noble head, with a high forehead, quizzical eyebrows, and dark hair which rose almost vertically – giving him, especially in his later years, the look of a wise owl.

Christopher Hill (L) in his thirties and (R) in his seventies

By the way, he was not a flashy dresser. The Hill family motto was ‘No fuss’. And, if you compare the two portraits of him in his 30s and his 70s, you could be forgiven for thinking that he was wearing the same grey twill jacket in both. (He wasn’t; but he certainly stuck to the same style all his life).

Yet even while Christopher Hill was reserved and dignified, he was also a benign figure. He had no side. He did not pull rank. He did not demand star treatment. He was courteous to all – and always interested in what others had to say. That was a key point. As Master of Balliol, Hill gave famous parties, at which dons and students mingled; and he was often at the centre of a witty crowd. But just as much, he might be found in a corner of the room discussing the problems of the world with a shy unknown.

As I’ve already said, Christopher Hill believed absolutely in the spirit of equality. But he did know that it was a hard thing to achieve – and that was why he loved the radicals in the English civil wars of the mid-seventeenth century. They were outsiders who sought new ways of organising politics and religion. Indeed, they struggled not only to define equality – but to live it. And, although there was sometimes a comic side to their actions, he admired their efforts.

When I refer to unintentionally comic aspects, I am thinking of those Ranters, members of that radical and distinctly inchoate religious group, who jumped up in church and threw off their clothes as a sign. The sign was that they were all God’s children, equal in a state of nature. Not surprisingly, such behaviour attracted a lot of criticism – and satirists had good fun at their expense.

Well, Christopher Hill was far too dignified to go around throwing off his clothes. But he grew up believing in a radical form of Methodism, which stressed that ‘we are all one in the eyes of the Lord’. As I’ve said, his egalitarianism came from within. But he was clearly influenced by his Methodist upbringing. His parents were kindly people, who lived simply and modestly (neither too richly nor too poorly). They didn’t drink, didn’t smoke, didn’t swear and didn’t make whoopee. Twice and sometimes even three times on Sundays, they rode their bikes for several miles to and from York’s Central Methodist Chapel; and then discussed the sermon over lunch.

In his mid-teens, Hill was particularly inspired by a radical Methodist preacher named T.S. Gregory, who urged a passionate spiritual egalitarianism. Years later, Hill reproduced for me Gregory’s dramatic pulpit style. He almost threw himself across the lectern and spoke with great emphasis: ‘Go out into the streets – and look into the eyes of every fellow sinner, even the poorest beggar or the most abandoned prostitute; [today he would add look under the hoods of the druggies and youth gangs]; look into these outcast faces and in every individual you will see elements of the divine.’ The York Methodists, from respectable middle-class backgrounds, were nonplussed. But Hill was deeply stirred. For him, Gregory voiced a true Protestantism – which Hill defined as wine in contrast with what he saw as the vinegar and negativism of later Puritanism.

The influence of Gregory was, however, not enough to prevent Hill in his late teens from losing his religious faith. My mother, Christopher’s younger sister, was very pleased at this news, as she welcomed his reinforcement. She herself had never believed in God, even though she too went regularly to chapel. But their parents were sincerely grieved. On one occasion, there was a dreadful family scene, when Christopher, on vacation from Oxford University, took his younger sister to the York theatre. Neither he nor my mother could later remember the show. But they both vividly recalled their parents’ horror: going to the theatre – abode of the devil! Not that the senior Hills shouted or rowed. That was not their way. But they conveyed their consternation in total silence … which was difficult for them all to overcome.

As he lost his faith, Hill converted to a secular philosophy, which had some elements of a religion to it. That was Marxism. Accordingly, he joined the British Communist Party. And he never wavered in his commitment to a broad-based humanist Marxism, even when he resigned from the CP in 1956. Hill was not at all interested in the ceremonies and ritual of religion. The attraction of Marxism for him was its overall philosophy. He was convinced that the revolutionary unfolding of history would eventually remove injustices in this world and usher in true equality. Hill sought what we would call a ‘holistic vision’. But the mover of change was now History rather than God.

On those grounds, Hill for many years supported Russian communism as the lead force in the unfolding of History. In 1956, however, the Soviet invasion of Hungary heightened a fierce internal debate within the British Communist Party. Hill and a number of his fellow Marxist historians struggled to democratise the CP. But they lost and most of them thereupon resigned.

This outcome was a major blow to Hill. Twice he had committed to a unifying faith and twice he found its worldly embodiment unworthy. Soviet Communism had turned from intellectual inspiration into a system based upon gulags, torture and terror. Hill never regretted his support for Soviet Russia during the Second World War; but he did later admit that, afterwards, he had supported Stalinism for too long. The mid-1950s was an unhappy time for him both politically and personally. But, publicly, he did not wail or beat his breast. Again, that was not the Hill way.

He did not move across the political spectrum, as some former communists did, to espouse right-wing causes. Nor did he become disillusioned or bitter. Nor indeed, did he drop everything to go and join a commune. Instead, Hill concentrated even more upon his teaching and writing. He did actually join the Labour Party. Yet, as you can imagine, his heart was not really in it.

It was through his historical writings, therefore, that Hill ultimately explored the dilemmas of how humans could live together in a spirit of equality. The seventeenth-century conflicts were for him seminal. Hill did not seek to warp history to fit his views. He could not make the radicals win, when they didn’t. But he celebrated their struggles. For Hill, the seventeenth-century religious arguments were not arid but were evidence of the sincere quest to read God’s message. He had once tried to do that himself. And the seventeenth-century political contests were equally vivid for him, as he too had been part of an organised movement which had struggled to embody the momentum of history.

As I say, twice his confidence in the worldly formulations of his cause failed. Yet his belief in egalitarianism did not. Personally, he became happy in his second marriage; and he immersed himself in his work as a historian. From being a scholar who wrote little, he became super-productive. Books and essays poured from his pen. Among those he studied was the one seventeenth-century radical who appealed to him above all others: Gerrard Winstanley, the Digger, who founded an agrarian commune in the Surrey hills. And the passage in Winstanley’s Law of Freedom (1652) that Hill loved best was dramatic in true T.S. Gregory style. ‘What is the greatest sin in the world?’ demanded Winstanley. And he answered emphatically that it is for rich people to hoard gold and silver, while poor people suffer from hunger and want.

What Hill would say today, faced with the ever-widening inequalities across the world, is not hard to guess. But he would also say: don’t lose faith in the spirit of equality. It is a basic tenet of human life. And all of us who believe in fair dos for all, as part of true freedom, should strive, individually and/or collectively, to do our best for our fellow humans and to advance Hill’s Good Old Cause.

1 For documentation, see P.J. Corfield, ‘“We are all One in the Eyes of the Lord”: Christopher Hill and the Historical Meanings of Radical Religion’, History Workshop Journal, 58 (2004), pp. 110-27. Now posted on PJC personal website as Pdf5; and further web-posted essays PJC Pdf47-50, all on www.penelopejcorfield.co.uk


MONTHLY BLOG 94, THINKING LONG – STUDYING HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

History is a subject that deals in ‘thinking long’. The human capacity to think beyond the immediate instant is one of our species’ most defining characteristics. Of course, we live in every passing moment. But we also cast our minds, retrospectively and prospectively, along the thought-lines of Time, as we mull over the past and try to anticipate the future. It’s called ‘thinking long’.

Studying History (indicating the field of study with a capital H) is one key way to cultivate this capacity. Broadly speaking, historians focus upon the effects of unfolding Time. In detail, they usually specialise in a particular historical period or theme. Yet everything is potentially open to their investigations.

Sometimes indeed the name of ‘History’ is invoked as if it constitutes an all-seeing recording angel. So a controversial individual in the public eye, fearing that his or her reputation is under a cloud, may proudly assert that ‘History will be my judge’. Quite a few have made such claims. They express a blend of defiance and optimism. Google ‘History will justify me’ and a range of politicians, starting with Fidel Castro in 1953, come into view. However, there’s no guarantee that the long-term verdicts will be kinder than any short-term criticisms.

True, there are individuals whose reputations have risen dramatically over the centuries. The poet, painter and engraver William Blake (1757-1827), virtually unknown in his own lifetime, is a pre-eminent example. Yet the process can happen in reverse. So there are plenty of people, much praised at the start of their careers, whose reputations have subsequently nose-dived and continue that way. For example, some recent British Prime Ministers may fall into that category. Only Time (and the disputatious historians) will tell.

Fig. 1 William Blake’s Recording Angel has about him a faint air of an impish magician as he points to the last judgment. If this task were given to historians, there would be a panel of them, arguing amongst themselves.

In general, needless to say, those studying the subject of History do not define their tasks in such lofty or angelic terms. Their discipline is distinctly terrestrial and Time-bound. It is prone to continual revision and also to protracted debates, which may be renewed across generations. There’s no guarantee of unanimity. One old academic anecdote imagines the departmental head answering the phone with the majestic words: ‘History speaking’.1 These days, however, callers are likely to get no more than a tinny recorded message from a harassed administrator. And academic historians in the UK today are themselves being harried not to announce god-like verdicts but to publish quickly, in order to produce the required number of ‘units of output’ (in the assessors’ unlovely jargon) in a required span of time.

Nonetheless, because the remit of History is potentially so vast, practitioners and students have unlimited choices. As already noted, anything that has happened within unfolding Time is potentially grist to the mill. The subject resembles an exploding galaxy – or, rather, like the cosmos, the sum of many exploding galaxies.

Tempted by that analogy, some practitioners of Big History (a long-span approach to History which means what it says) do take the entire universe as their remit, while others stick merely to the history of Planet Earth.2 Either way, such grand approaches are undeniably exciting. They require historians to incorporate perspectives from a dazzling range of other disciplines (like astro-physics) which also study the fate of the cosmos. Thus Big History is one approach to the subject which very consciously encourages people to ‘think long’. Its analysis needs careful treatment to avoid being too sweeping and too schematic chronologically, as the millennia rush past. But, in conjunction with shorter in-depth studies, Big History gives advanced students a definite sense of temporal sweep.

Meanwhile, it’s also possible to produce longitudinal studies that cover one impersonal theme, without having to embrace everything. Thus there are stimulating general histories of the weather,3 as well as more detailed histories of weather forecasting, and/or of changing human attitudes to weather. Another overarching strand studies the history of all the different branches of knowledge that have been devised by humans. One of my favourites in this genre is entitled From Five Fingers to Infinity.4 It’s a probing history of mathematics. Expert practitioners in this field usually stress that their subject is entirely ahistorical. Nonetheless, the fascinating evolution of mathematics throughout the human past to become one globally-adopted (non-verbal) language of communication should, in my view, be incorporated into all advanced courses. Such a move would encourage debates over past changes and potential future developments too.

Overall, however, the great majority of historians and their courses in History take a closer focus than the entire span of unfolding Time. And it’s right that the subject should combine in-depth studies alongside longitudinal surveys. The conjunction of the two provides a mixture of perspectives that help to render intelligible the human past. Does that latter phrase suffice as a summary definition?5 Most historians would claim to study the human past rather than the entire cosmos.

Yet actually that common phrase does need further refinement. Some aspects of the human past – the evolving human body, for example, or human genetics – are delegated for study by specialist biologists, anatomists, geneticists, and so forth. So it’s clearer to say that most historians focus primarily upon the past of human societies in the round (i.e. including everything from politics to religion, from war to economics, from illness to health, etc etc). And that suffices as a definition, provided that insights from adjacent disciplines are freely incorporated into their accounts, wherever relevant. For example, big cross-generational studies by geneticists are throwing dramatic new light upon the history of human migration around the globe and also of intermarriage within the complex range of human species and the so-called separate ‘races’ within them.6 Their evidence amply demonstrates the power of longitudinal studies for unlocking both historical and current trends.

The upshot is that the subject of History can cover everything within the cosmos; that it usually concentrates upon the past of human societies, viewed in the round; and that it encourages the essential human capacity for thinking long. For that reason, it’s a study for everyone. And since all people themselves constitute living histories, they all have a head-start in thinking through Time.7

1 I’ve heard this story recounted of a formidable female Head of History at the former Bedford College, London University; and the joke is also associated with Professor Welch, the unimpressive senior historian in Kingsley Amis’s Lucky Jim: A Novel (1953), although upon a quick rereading today I can’t find the exact reference.

2 For details, see the website of Big History’s international learned society (founded 2010): www.ibhanet.org. My own study of Time and the Shape of History (2007) is another example of Big History, which, however, proceeds not chronologically but thematically.

3 E.g. E. Durschmied, The Weather Factor: How Nature has Changed History (2000); L. Lee, Blame It on the Rain: How the Weather has Changed History (New York, 2009).

4 F.J. Swetz (ed.), From Five Fingers to Infinity: A Journey through the History of Mathematics (Chicago, 1994).

5 For meditations on this theme, see variously E.H. Carr, What is History? (Cambridge 1961; and many later edns); M. Bloch, The Historian’s Craft (in French, 1949; in English transl. 1953); B. Southgate, Why Bother with History? Ancient, Modern and Postmodern Motivations (Harlow, 2000); J. Tosh (ed.), Historians on History: An Anthology (2000; 2017); J. Black and D.M. MacRaild, Studying History (Basingstoke, 2007); H.P.R. Finberg (ed.), Approaches to History: A Symposium (2016).

6 See esp. L.L. Cavalli-Sforza and F. Cavalli-Sforza, The Great Human Diasporas: The History of Diversity and Evolution, transl. by S. Thomas (Reading, Mass., 1995); D. Reich, Who We Are and How We Got Here: Ancient DNA and the New Science of the Human Past (Oxford, 2018).

7 P.J. Corfield, ‘All People are Living Histories: Which is why History Matters’. A conversation-piece for those who ask: Why Study History? (2008) in London University’s Institute of Historical Research Project, Making History: The Discipline in Perspective www.history.ac.uk/makinghistory/resources/articles/why_history_matters.html; and also available on www.penelopejcorfield.co.uk/ Pdf1.


MONTHLY BLOG 92, HISTORIANS AT WORK THROUGH TIME

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)
Historians, who study the past, don’t undertake this exercise from some vantage point outside Time. They, like everyone else, live within an unfolding temporality. That’s very fundamental. Thus it’s axiomatic that historians, like their subjects of study, are all equally Time-bound.1

Nor do historians undertake the study of the past in one single moment in time. Postmodernist critics of historical studies sometimes write as though historical sources are culled once only from an archive and then adopted uncritically. The implied research process is one of plucking choice flowers and then pressing them into a scrap-book to some pre-set design.

On such grounds, critics of the discipline highlight the potential flaws in all historical studies. Sources from the past are biased, fallible and scrappy. Historians in their retrospective analysis are also biased, fallible and sometimes scrappy. And historical writings are literary creations only just short of pure fiction.2

Historians should welcome this dose of scepticism – always a useful corrective. Yet they entirely reject the proposition that trying to understand bygone eras is either impossible or worthless. Rebuttals to postmodernist scepticism have been expressed theoretically;3 and also directly, via pertinent case studies which cut through the myths and ‘fake news’ which often surround controversial events in history.4

When at work, historians should never take their myriad source materials literally and uncritically. Evidence is constantly sought, interrogated, checked, cross-checked, compared and contrasted, as required for each particular research theme. The net is thrown widely or narrowly, again depending upon the subject. Everything is a potential source, from archival documents to art, architecture, artefacts and through the gamut to witness statements and zoological exhibits. Visual materials can be incorporated either as primary sources in their own right, or as supporting documentation. Information may be mapped and/or tabulated and/or statistically interrogated. Digitised records allow the easy selection of specific cases and/or the not-so-easy processing of mass data.

As a result, researching and writing history is a slow through-Time process – sometimes tediously so. It takes at least four years, from a standing start, to produce a big specialist, ground-breaking study of 100,000 words on a previously un-studied (or under-studied) historical topic. The exercise demands a high-level synthesis of many diverse sources, running to hundreds or even thousands. Hence the methodology is characteristically much more than a ‘reading’ of one or two key texts – although, depending upon the theme, at times a close reading of a few core documents (as in the history of political ideas) is essential too.

Mulling over meanings is an important part of the process too. History as a discipline encourages a constant thinking and rethinking, with sustained creative and intellectual input. It requires knowledge of the state of the discipline – and a close familiarity with earlier work in the chosen field of study. Best practice therefore enjoins writing, planning and revising as the project unfolds. For historical studies, ‘writing through’ is integral, rather than waiting until all the hard research graft is done and then ‘writing up’.5

The whole process is arduous and exciting, in almost equal measure. It’s constantly subject to debate and criticism from peer groups at seminars and conferences. And, crucially too, historians are invited to specify not only their own methodologies but also their own biases/assumptions/framework thoughts. This latter exercise is known as ‘self-reflexivity’. It’s often completed at the end of a project, although it’s then inserted near the start of the resultant book or essay. And that’s because writing serves to crystallise and refine (or sometimes to reject) the broad preliminary ideas, which are continually tested by the evidence.

One classic example of seriously through-Time writing comes from the celebrated historian Edward Gibbon. The first volume of his Decline & Fall of the Roman Empire appeared in February 1776. The sixth and final one followed in 1788. According to his autobiographical account, the gestation of his study dated from 1764. He was then sitting in the Forum at Rome, listening to Catholic monks singing vespers on Capitol Hill. The conjunction of ancient ruins and later religious commitments prompted his core theme, which controversially deplored the role of Christianity in the ending of Rome’s great empire. Hence the ‘present’ moments in which Gibbon researched, cogitated and wrote stretched over more than 20 years. When he penned the last words of the last volume, he recorded a sensation of joy. But then he was melancholic that his massive project was done.6 (Its fame and the consequent controversies continue today, forming part of the history of history).

1 For this basic point, see PJC, ‘People Sometimes Say “We Don’t Learn from the Past” – and Why that Statement is Completely Absurd’, BLOG/91 (July 2018), to which this BLOG/92 is a companion-piece.

2 See e.g. K. Jenkins, ReThinking History (1991); idem (ed.), The Postmodern History Reader (1997); C.G. Brown, Postmodernism for Historians (Harlow, 2005); A. Munslow, The Future of History (Basingstoke, 2010).

3 J. Appleby, L. Hunt and M. Jacob, Telling the Truth about History (New York, 1994); R. Evans, In Defence of History (1997); J. Tosh (ed.), Historians on History (Harlow, 2000); A. Brundage, Going to the Sources: A Guide to Historical Research and Writing (Hoboken, NJ., 2017).

4 H. Shudo, The Nanking Massacre: Fact versus Fiction – A Historian’s Quest for the Truth, transl. S. Shuppan (Tokyo, 2005); Vera Schwarcz, Bridge across Broken Time: Chinese and Jewish Cultural Memory (New Haven, 1998).

5 PJC, ‘Writing Through a Big Research Project, not Writing Up’, BLOG/60 (Dec.2015); PJC, ‘How I Write as a Historian’, BLOG/88 (April 2018).

6 R. Porter, Gibbon: Making History (1989); D.P. Womersley, Gibbon and the ‘Watchmen of the Holy City’: The Historian and his Reputation, 1776-1815 (Oxford, 2002).
