MONTHLY BLOG 116, THE LONG EIGHTEENTH CENTURY’S MOST AMAZING LADY RECLUSE

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2020)

Image of Lady Hester Stanhope
(1776-1839)
Garbed as an Oriental Magus

More than matching the fame of the most notable male recluses of eighteenth-century Britain was the renown of the amazing Lady Hester Stanhope.1 She not only cut herself off from her aristocratic family background to live remotely but did so, at first in grand style and then as a recluse, in the Lebanon.

Her story indicates that there were some remarkable options open to independent-minded women with independent fortunes but no family attachments. In fact, there was quite a substantial amount of female solitude in the eighteenth century.2 The caricature view, which asserts that every woman was under the domestic tutelage of either a husband or a father, was just that – a caricature.

There were plenty of female-headed households listed in contemporary urban enumerations; and a number of these were formed by widows living alone. Many lived in the growing spas and resorts, where low-cost lodgings were plentiful. Some would have other family members living with them; but the poorest were completely alone. In Jane Austen’s Persuasion (1817), the protagonist Anne Elliot meets in Bath an old school-friend, the widowed Mrs Smith. She is impecunious and disabled. Her lodgings consist of two small rooms; and she is ‘unable even to afford herself the comfort of a servant’. Nonetheless, solitary living was not the same as being a recluse. Local gossip networks helped to counter isolation, as Jane Austen well understood. Hence, although socially remote from Bath’s smart visitors, Mrs Smith gets all the up-to-date news ‘through the short cut of a laundress and a waiter’.3 As a result, Anne Elliot is surprised to discover how much information about herself and her family is already known to her old friend.

Lady Hester Stanhope was utterly different. Lively, charming, and wealthy, she was the daughter of the 3rd Earl Stanhope and, in her late twenties (1803-6), acted as political hostess for her uncle, Prime Minister William Pitt the Younger. Her social elevation and experience of life at the heart of government gave her immense self-confidence. But Pitt’s death in 1806 left her looking for a role.

In retrospect, Stanhope’s subsequent adventures indicate something of the social plight – or, put more positively, the challenges – facing talented and spirited upper-class women who did not wish (or manage) to marry or to go into business. There were plenty of female commercial entrepreneurs, usually of ‘middling’ social origins.4 And there was a positive ‘femocracy’ of high-born women who pulled the political strings behind the scenes.5 But these were generally married ladies, hosting salons and gatherings for their particular party affiliation, under the ‘shelter’ of their husband’s rank and wealth. The options were much more limited for a single aristocratic female, albeit one with a modest state pension granted after the death of Pitt (at his request). In 1810 Stanhope began to travel extensively in the Middle East; and she never returned to Britain. Initially, she had a sizeable entourage with her; and she attracted the attention of crowds as she toured. By the end of her life, however, she was running out of money and had become a complete recluse.

During her long self-exile, she did a number of remarkable things. Firstly, she adopted her own version of male oriental dress. She sported a velvet robe, embroidered trousers, soft slippers, a swathed turban, and no veil. So attired, she caused a sensation on her travels. In 1813, crowds gathered to see the ‘Queen of the Desert’ as she rode triumphantly on horseback into the remote and beautiful city of Palmyra, having crossed the territory of potentially hostile Bedouins. That moment was, for her, one of intense joy. Her garb and demeanour signalled that she had cut herself off from her previous life; and, even more pointedly, that she rejected any submissive female role, whether in the occident or orient. She was visibly her own person. Indeed, she was a grand personage, meeting local power brokers and Ottoman officials as a potentate in her own right.

A second notable initiative happened in 1815. Stanhope, at the age of 39, broke new ground in terms of female self-employment – literally – when she tried some pioneering archaeology.6 She won permission from the Turkish authorities to excavate the ancient port of Ashkelon, north of Gaza. There were disputes, both then and later, about the outcomes of this search for fabled treasure. But Stanhope’s method of basing dirt-archaeology upon documentary evidence from medieval manuscripts showed that she was not attempting a random smash-and-grab raid. Either way, it was not an adventure that she ever repeated.

Instead, it was a moment of religious revelation which constituted Stanhope’s third claim to fame – and which governed her behaviour for the rest of her life. At some stage c.1815 she was told by Christian soothsayers that she would become the bride of the Messiah, whose return to Earth was imminently due. Nothing could be more aptly dramatic. Stanhope accepted her destiny; and settled down to wait. She found two noble and distinctive horses, which were carefully tended for years, awaiting the moment when the returned Messiah and his bride would ride forth to judge the world at the Second Coming.

Excited prophecies of the End of the World can be found in any era,7 and were particularly rampant in Europe in the febrile aftermath of the French Revolution and the prolonged Napoleonic wars. At different times, individuals have claimed to be the returned Messiah – or to be closely connected with such a figure – or to know the exact date of the Second Coming. In the Christian tradition, it is rare for women to claim divinity or near-divinity on their own account. However, in 1814 Joanna Southcott, aged 64, announced that she was pregnant with the new Messiah and, briefly, attracted a large following, until she died of a stomach tumour, without producing the miraculous child. During her lifetime, she had instituted her own church, with a male minister to officiate at the services. And the Southcottian movement has survived as a small sect, with numerous twists and turns in its fortunes, into the twenty-first century.8

By contrast, Lady Hester Stanhope’s vision remained an individual destiny. Visitors approached her in her Lebanese retreat, impressed by her magus-like reputation. But Stanhope did not attempt to found a church or a supporting movement. Instead, she settled in to wait patiently. Such a response is not uncommon when a divine revelation is not immediately realised. True believers keep faith. It is the timing, not the vision, which is inaccurate. So the answer is to wait, which is what Stanhope indomitably did. Living first in one disused monastery and then in another, she retreated eventually to a conical hill-top site with panoramic views at Joun, eight miles (13 km) inland from Sidon. There she lived as the de facto local magnate. She was accepted within the religious mix of Muslim, Christian and Druze communities that has long characterised the Lebanon; and she tried to protect the Druze from persecution on grounds of their distinctive blend of Islam, gnosticism and neo-platonism. Doctrinal rigidity was very far from her personal mindset.

Only with time did Stanhope become a real recluse. By the mid-1830s, her original English companions had either died or returned home. Her funds ran low and she was besieged by creditors. The servants, allegedly, began to steal her possessions. Lady Hester Stanhope received her last few visitors after dark, refusing to let them see more than her face and hands. Reportedly, she suffered from acute depression. The Messiah did not come. Yet there was a sort of glory in her faithfulness. Her life’s trajectory was utterly distinctive, not one that could be emulated by others. Buoyed by sufficient funds, she made an independent life in an initially strange country, far from the political salons of early nineteenth-century London. And she persisted, even when impecunious. Stanhope died in her sleep aged 63, still awaiting her destiny – and having made her own legend.

ENDNOTES

1 There are many biographies: see e.g. K. Ellis, Star of the Morning: The Extraordinary Life of Lady Hester Stanhope (2008); and a pioneering survey by C.L.W. Powlett, The Life and Letters of Lady Hester Stanhope (1897).

2 B. Hill, Women Alone: Spinsters in England, 1660-1850 (2001).

3 J. Austen, Persuasion (1817/18; in Harmondsworth, 1980 edn), pp. 165-7, 200.

4 N. Phillips, Women in Business, 1700-1850 (Woodbridge, 2006); H. Barker, Family and Business during the Industrial Revolution (Oxford, 2017).

5 E. Chalus, Elite Women in British Political Life, c.1754-90 (Oxford, 2005).

6 For a sympathetic account, see https://womeninarchaeology.com/2016/05/05/lady-hester-lucy-stanhope-the-first-modern-excavator-of-the-holy-land/.

7 J.M. Court, Approaching the Apocalypse: A Short History of Christian Millenarianism (2008); C. Wessinger (ed.), The Oxford Handbook of Millennialism (Oxford, 2011).

8 J.K. Hopkins, A Woman to Deliver her People: Joanna Southcott and English Millenarianism in an Era of Revolution (Austin, Texas, 1982); J.D.M. Derrett, Prophecy in the Cotswolds, 1803-1947 (Shipston-on-Stour, 1994); P.J. Corfield, Power and the Professions in Britain, 1700-1850 (1995; 2000), pp. 106-8, 124, 139; J. Shaw, Octavia, Daughter of God: The Story of a Female Messiah and her Followers (2012).


MONTHLY BLOG 115, THE EIGHTEENTH-CENTURY INVENTION OF TWO NEW SOLITARY OCCUPATIONS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2020)

1793 Statue of Ancient British Druid,
Croome Park, Worcestershire

Not only did a few famous eighteenth-century recluses choose solitude (see BLOG no 114, June 2020) but others found that isolation went with the job. Two new occupations called for people with self-contained personalities, who were willing to live and work ‘far from the madding crowd’s ignoble strife’, to cite the evocative words of Thomas Gray (1751).1

Firstly, there were the lighthouse-keepers, employed to tend the dramatic new structures being constructed around the British coast to keep shipping safe.2 These men, and the few women in the business, had to be punctilious in sticking to the timetables of the job, and able to keep themselves busy with daily repairs and maintenance. Their lamps needed constant attention, to keep the lenses clean and the wicks trimmed. Some keepers brought their families with them – indeed, some jobs were handed on from father to son. But their work and lifestyles were unavoidably isolated.

Lighthouse-keepers, who were expected to keep their eyes open for shipping in distress at sea, sometimes found themselves in danger too. The world’s first sea-girt Eddystone Lighthouse (1698-1703), which was situated on perilous rocks off the Cornish coast, did not last long. Wood-built, it was destroyed in the notorious ‘great storm’ of 1703. Also killed in the cataclysm were Henry Winstanley, its architect, who was checking for repairs – and three lighthouse-keepers. The imperative need for their warning beacon was such that three successor structures have since been built on or near the original site. A second Eddystone (1708-55) burned down in 1755, killing its 94-year-old lighthouse-keeper. He was looking up at the blaze, open-mouthed, when he fatally swallowed a globule of molten lead, dying several days later. (The third Eddystone lasted from 1756 to 1877, when it was replaced by the current Victorian edifice, located very close to the original site and in service from 1878 to the present.)

Notwithstanding the isolation and occasional dangers, the lighthouse-keepers stuck devotedly to their roles. They knew that their beaming lamps conveyed messages of hope and support for all seafarers. And the keepers formed part of a coastal watch-guard, which included customs officials and lifeboat crews.

Things were quite otherwise in the case of an entirely new eighteenth-century post for solitary workers. A few wealthy landowners with a taste for re-envisaging the simple life built hermitages in their rolling parklands. And they hoped to employ real individuals to inhabit these properties in a suitably druidic lifestyle. The ideal hermit was a man with an imposing presence, long hair, and a beard. He should have a taste for solitude but equally be willing to remain on view as a living statue.3 But suitable candidates were hard to find. The hermits generally had no tasks other than being – and no close colleagues, being neither part of the estate workforce nor part of the employing household. They were intended to be truly lonely, in order to live the role.

In the mid-1740s one resident hermit was established in a specially built hermitage at the aristocrat Charles Hamilton’s lavishly landscaped Painshill Park, near Cobham, Surrey. However, the new recluse lasted only three weeks in the job before absconding. His contract was thereupon cancelled.

But, in other cases, the hermit was asked to play a particular role. At Sir Richard Hill’s Hawkstone Estate, near Market Drayton in Shropshire, visitors in 1784 could ring a special bell and gain admittance to the grotto. There sat a venerable hermit, in front of a table bearing a skull, an hour-glass, and an open book. Conversation was allowed, in which the sage would participate with graceful melancholy.

Elsewhere, however, employers expected hermits to remain silent. One landowner advertised for a recluse who was prepared to take a vow of silence for seven years – and, in the meantime, not to wash – and to let his hair and nails grow unchecked. There was, however, no rush of applicants. Before long, the fashion for employing humans as estate ornaments collapsed.

Already by the mid-eighteenth century, some landowners were experimenting with the use of model or dummy hermits. These were cheaper and much more tractable than living people. One mechanical hermit at Samuel Hellier’s estate at The Wodehouse, near Wombourne in Staffordshire, was reported in the 1770s as being moved (by a hidden servant) in a lifelike manner to delight visitors. Such contrivances showed how landowners tried to entertain the touring guests, who frequently called to view estates and the public rooms of grand houses. An ancient hermit gave an estate the patina of antiquity.

Druid statues, meanwhile, offered an equally visible but managerially easier option. They too alluded to ancient British mythologies; and signalled an intended link with the deep past.4 Thus two powerful druidic figures were installed in 1776 to flank the main entrance to the Palladian Penicuik House in Midlothian, Scotland. And in 1793 the owners of Croome Park, near Croome D’Abitot, in south Worcestershire, joined the fashion. Their finely brooding statue of a druid (shown above in a 2013 photo) was carved in the new and fashionable Coade stone, which lent itself to expressive designs.5

Figures in a landscape were a means of attracting human attention. Only a few landowners had the space and funds for large statues. But miniaturised versions began to become popular in Britain from the 1840s onwards, with the importation from Germany of specially manufactured garden gnomes.6 In 1847 Sir Charles Isham imported a batch of these terracotta figures to adorn his garden at Lamport Hall, at Lamport, Northamptonshire. Today one gnome, named Lumpy, still survives on display. He is only an indirect descendant of the eighteenth-century hermits. But the fashion for statues and gnomes shows how people continue to add human images to complement a garden design – long after the real-life human hermits disappeared. To recap: the lighthouse-keepers accepted their solitude, as it was embraced for a good cause, applauded by all. But a lonely life as someone else’s invented hermit did not prove at all appealing.

ENDNOTES

1 T. Gray, Elegy Written in a Country Churchyard (1751).

2 T. Nancollas, Seashaken Houses: A Lighthouse History, from Eddystone to Fastnet (2018).

3 See the invaluable study by G. Campbell, The Hermit in the Garden: From Imperial Rome to Ornamental Gnome (Oxford, 2014).

4 R. Hutton, Blood and Mistletoe: The History of the Druids in Britain (2009).

5 A. Kelly, Mrs Coade’s Stone (Upton-upon-Severn, 1990).

6 T. Way, Garden Gnomes: A History (2009).


MONTHLY BLOG 114, SELF-ISOLATION EIGHTEENTH-CENTURY STYLE

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2020)

Fig.1 Engraving (1808) of Lord Rokeby (1713-1800),
a famous eighteenth-century self-isolator,
who looked like a wise old wizard
but whose actual message was obscure.

It’s not original to note that humans are a highly social species. But it’s only now becoming generally appreciated just how damaging a period of prolonged and enforced isolation from others can be. Basically, it’s bad news for both physical and mental health.1 Of course, some individuals do embrace silence and seek solitude. Maybe for spiritual reasons. Yet such conscious choices, which can be revoked at any time, are very different from enforced solitude, not of an individual’s seeking.

The eighteenth century in Britain provided two quirky individuals who famously created their own isolated lifestyles, cushioned by their private incomes. So what can be learned from their stories? No great revelation of enlightenment emerges. Instead, the two men have been slotted into the history of zany English eccentricity.2 They certainly both fitted into that category plausibly enough. Yet do their lifestyles convey some further message for humanity in the early summer of today’s special virus-avoiding Lockdown?

One of these isolates was the well-connected Matthew Robinson, 2nd Baron Rokeby (1713-1800). He was a landowner, with legal training and literary interests. In his thirties (1747-61), he became MP for Canterbury. There was nothing to suggest his impending eccentricity. Yet at a certain point he developed a passion for daily immersion in water for hours on end. At first, he walked from his country estate near Hythe (Kent), on the edge of Romney Marsh, to swim in the sea, bathing for hours until he was exhausted and had to be rescued. Then he constructed a private pool in a glass-house attached to his country mansion, which he refused to heat. Again he stayed for hours in the water, refusing company. He got nourishment chiefly from an infusion of beef tea; refused to see doctors; and claimed that he could best worship naturally, in the water and under the stars. Occasional visitors were treated to readings of his lengthy poems.

When Rokeby (rarely) appeared in public, he was taken for a foreigner, on account of his flowing locks and massive beard. Anecdotes circulated about his lifestyle; and prints were engraved (as shown above), to illustrate his hirsute appearance. His younger sister, the highly sociable literary lady and bluestocking Mrs Montagu, wrote sardonically that her brother had become a modern Diogenes: ‘he flies the life of London, and leads a life of such privacy and seriousness as looks to the beholder like wisdom’.3 Ouch. Evidently his nearest and dearest were not impressed. His two younger sisters remained busy and productive: Elizabeth Montagu (1718-1800), later dubbed ‘Queen of the Blues’, and Sarah Scott (1723-95), the novelist and translator, whose Millennium Hall (1763) envisaged a harmonious community of women without men.4

For his part, Rokeby wrote and said nothing memorable, despite looking ever more like a wizard in his later years. He did not do anything to foster swimming or sea-bathing. His eccentric pastime remained a purely private matter, which ended only with his peaceful death in bed, unmarried and childless. His estate and the barony passed to a cousin.

What did it all mean? Rokeby’s lifestyle suggests a personal quest for ecological simplicity, before there was an ecological movement for him to join. He does not seem to have been personally unhappy; or, at any rate, did not announce any disquiet. Yet his story seems at the very least to have been one of unrealised talents, particularly when contrasted with his siblings.

A second case of self-isolation was that of John Tallis (1675-1755). As reported in the Gentleman’s Magazine, he stayed in bed for the last 30 years of his life, swathed in coverings and with a peg on his nose, in a darkened, draught-proof room in a country inn at Burcot (Worcestershire).5 He saw no-one but a few occasional visitors, impelled by curiosity – and his servants, who replaced his bed annually.

Insofar as he justified his strange lifestyle choice, Tallis claimed, to general bemusement, that his morbid fear of fresh air was triggered by an old beldame’s curse. Evidently, he had sufficient funds to pay for his lodging and minimal keep. And no family intervened to try to change his mind. Throughout, Tallis declined to seek medical or even spiritual help for what seemed to be a prolonged and debilitating physical and/or psychological malady.

By the end of his life, he was becoming classed among the ranks of great British eccentrics. His sad tale probably provided the inspiration for William Wordsworth’s later ballad Goody Blake and Harry Gill: A True Story (1798). That jingling poem recounted a malediction directed at a wealthy but hard-hearted farmer, who had no compassion for a poor old woman gleaning in his hedgerow.6 His penalty for an icy heart was then to lie abed, forever chilled:

Oh!  what’s the matter?  what’s the matter?
What is’t that ails young Harry Gill?
That evermore his teeth they chatter,
Chatter, chatter, chatter still.
Of waistcoats Harry has no lack,
Good duffle grey, and flannel fine;
He has a blanket on his back,
And coats enough to smother nine.

Wordsworth’s imaginative evocation was much more vivid than anything communicated by Tallis, who gave no further explanation of his condition. The poet’s moral was that a flinty heart brought its own penalty. Property-owners should not begrudge the poor who gleaned in the fields and hedgerows, Wordsworth concluded pointedly.7

Tallis’s own inert self-isolation baffled everyone during his lifetime. Such a fatalistic belief in a personal curse already seemed like a relic of a bygone age, if that was indeed his motivation. It may simply have been an excuse for doing what he wanted, although his 30-year bed-rest did not seem very enjoyable. Certainly no witnesses to Tallis’s fate made any move to get him exorcised or the notional curse removed.

However, thanks to the transmuting power of poetry, this eccentric case of self-isolation prompted Wordsworth’s appeal for liberal warm-heartedness. ‘A-bed or up, by night or day;/ His teeth they chatter, chatter still,/ Now think, ye farmers all, I pray,/ Of Goody Blake and Harry Gill’. It’s always open to self-isolates to explain themselves to the wider world. But, if they don’t, then others will have a stab at doing so for them. After all, the moral is that isolates are not actually alone. The human community is watching, trying to detect a message.

ENDNOTES:

1 K.T. Rowe (ed.), Social Isolation, Participation and Impact upon Mental Health (New York, 2015); R. Fiorella, R. Morese and S. Palermo, Social Isolation: An Interdisciplinary View (2020).

2 J. Timbs, English Eccentrics and Eccentricities (1875); E. Sitwell, The English Eccentrics (1933); D. Long, English Country House Eccentrics (Stroud, 2012); S.D. Tucker, Great British Eccentrics (Stroud, 2015).

3 https://en.wikipedia.org/wiki/Matthew_Robinson,_2nd_Baron_Rokeby.

4 J. Busse, Mrs Montagu, Queen of the Blues (1928); S.H. Myers, The Bluestocking Circle: Women, Friendship and the Life of the Mind in Eighteenth-Century England (Oxford, 1990).

5 Gentleman’s Magazine (March 1753), p. 123.

6 J.A. Sharpe, A Fiery and Furious People: A History of Violence in England (2016), pp. 251-2.

7 W. Wordsworth, Poetical Works, ed. T. Hutchinson (1920), pp. 536-7.


MONTHLY BLOG 113, LIGHT FROM THE LAMP OF EXPERIENCE

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2020)

Fig.1, A hand-held eighteenth-century lantern, its lighted candle providing an immediate pool of light.

‘The Lamp of Experience’ is a marvellous phrase. A lantern throws light. It does not insist dogmatically but instead conveys sufficient illumination for good judgment. ‘Experience’ is also a vital component of the phrase. It implies not just a list of facts from history but also the capacity to cogitate about past events and to learn from them. Moreover, experience can be gleaned not just from each individual’s personal life but from the collective experiences of humanity as a whole.

During the current pandemic, for example, people can learn instructive lessons from comparable past global disasters. Factual histories provide suggestive evidence of what was done, what was not done, and what could have been done better.1 And imaginative literature allows people to share the range of subjective emotions and reactions which may be triggered by great and unexpected disasters.2 It allows for a sort of mental rehearsal. Needless to say, imaginative fiction is not written primarily for utilitarian purposes. And far from all happenings that can be conjectured will actually transpire. (Time travel provides a pertinent example.) Nonetheless, imaginative literature, even when imagining things that are technically impossible, contributes to the stock of human creativity. And thoughts and dreams, as much as deeds and misdeeds, all form part of the human experience.

There is additionally a pleasant irony in on-line references to ‘the Lamp of Experience’. Various web-lists of famous quotations attribute the dictum to Edward Gibbon (1737-94), Britain’s nonpareil historian. The full statement runs as follows: ‘I have but one lamp by which my feet are guided, and that is the lamp of experience. I know of no way of judging the future but by the past’. But that formulation does not accord with Gibbon’s impersonally magisterial and often ironic style. The words are spikier, and more personalised.

In fact, their true author is also credited on the web; and maybe with time the accurate citations will crowd out the error. True, the general observation does not lose its force by being misattributed. Yet credit should go where credit is due. The reference was first made in a celebrated speech by a Virginian planter-turned-lawyer, named Patrick Henry (1736-99).3 He was an exact contemporary of Gibbon. But they differed in their politics. Henry was an American critic of British rule. In 1765, he used his knowledge of legal precedents to argue that the Westminster government’s attempt at imposing the unpopular Stamp Tax upon the American colonists was unconstitutional.4

Lawyers, like historians, were accustomed to weighing and pondering evidence before making judgments. In this case, Henry was using the ‘lamp’ of past experience for radical purposes. His arguments, while rejected by Britain, were popular in the American colonies; and in 1776 Henry became the first Governor of Virginia post-Independence. Manifestly, his appeal to experience had not produced universal agreement. As already noted, studying history provides options, not a universal blueprint for what is to be done.

Fig.2 Engraved portrait of the intent figure of Patrick Henry (1736-99), his eye-glasses pushed up onto his lawyer’s wig: a Virginia planter who turned to law and politics, Patrick Henry served as first and also sixth post-colonial Governor of the State of Virginia.

What, then, is the appeal and power of the past? The truth is that Henry’s dictum, while evocative, does not go nearly far enough. Experience/history provides much, much more than a pool of light. It provides the entire bedrock of existence. Everything comes from the past. Everyone learns from the past. The cosmos, global biology, languages, thought-systems, the stock of knowledge, diseases, human existence …  arrive in the present from the past.5

All that is because Time is unidirectional. Humans live in the present but have to rely upon the collective databank of past human experience. That great resource is not just a lamp, sending out a single beam. Instead, collective experience provides the entire context and content of surviving successfully in Time. All humans, as living histories, are part of the process, and contribute their personal quota. The better, fuller and more accurate is that collective knowledge, the better the long-term prospects for the species.

Humans in history are restless problem creators. Yet they are also impressive problem solvers. It’s time, not just for renewed human escape from an obvious viral danger, but equally for urgent collective action to halt, and where possible to reverse, the accelerating environmental degradation, which is damaging the global climate and global biodiversity – let alone the global habitat of humans.

Now needed – not just a Lamp but a mental Sunburst, drawing upon experience and transmuting into sustained action. Stirring times! What comes from the past will have a mighty effect on the future. And decisions taken in the present contribute crucially too.
ENDNOTES

1 See e.g. M. Honigsbaum, A History of the Great Influenza Pandemics: Death, Panic and Hysteria, 1830-1920 (2013; ppbk 2020).

2 D. Defoe, A Journal of the Plague Year (1722; and many later edns); A. Camus, La Peste (Paris, 1947), in Eng. transl. by S. Gilbert as The Plague (1960).

3 P. Henry, ‘Speech at 2nd Virginia Convention, 23 March 1775’, in L. Copeland and L.W. Lamm (eds), The World’s Great Speeches (New York, 1999), pp. 232-3; T.S. Kidd, Patrick Henry: First among Patriots (New York, 2011).

4 P.D.G. Thomas, British Politics and the Stamp Act Crisis: The First Phase of the American Revolution, 1763-9 (Oxford, 1975); E.S. and H.M. Morgan, The Stamp Act Crisis: Prologue to Revolution (1974; 1995).

5 P.J. Corfield, ‘All People are Living Histories’ (2007), available on PJC website www.penelopejcorfield.co.uk/essaysonwhatishistory/pdf1


MONTHLY BLOG 103, WHO KNOWS THESE HISTORY GRADUATES BEFORE THE CAMERAS AND MIKES IN TODAY’S MASS MEDIA?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Image © Shutterstock 178056255

Responding to the often-asked question, ‘What do History graduates do?’, I usually reply, truthfully, that they gain employment in an immense range of occupations. But this time I’ve decided to name a popular field and to cite some high-profile cases, to give specificity to my answer. The context is the labour-intensive world of the mass media. It is no surprise to find that numerous History graduates find jobs in TV and radio. They are familiar with a big subject of universal interest – the human past – which contains something for all audiences. They are simultaneously trained to digest large amounts of disparate information and ideas, before welding them into a show of coherence. And they have specialist expertise in ‘thinking long’. That hallmark perspective buffers them against undue deference to the latest fads or fashions – and indeed buffers them against the slings and arrows of both fame and adversity.

In practice, most History graduates in the mass media start and remain behind-the-scenes. They flourish as managers, programme commissioners, and producers, generally far from the fickle bright lights of public fame. Collectively, they help to steer the evolution of a fast-changing industry, which wields great cultural clout.1

There’s no one single route into such careers, just as there’s no one ‘standard’ career pattern once there. It’s a highly competitive world. And often, in terms of personpower, a rather traditionalist one. Hence there are current efforts by UK regulators to encourage a wider diversity in terms of ethnic and gender recruiting.2 Much depends upon personal initiative, perseverance, and a willingness to start at comparatively lowly levels, generally behind the scenes. It often helps as well to have some hands-on experience – whether in student or community journalism; in film or video; or in creative applications of new social media. But already-know-it-all recruits are not as welcome as those ready and willing to learn on the job.

Generally, there’s a huge surplus of would-be recruits over the number of jobs available. It’s not uncommon for History students (and no doubt many others) to dream, rather hazily, of doing something visibly ‘big’ on TV or radio. However, front-line media jobs in the public eye are much more difficult than they might seem. They require a temperament that is at once super-alert, good-humoured, sensitive to others, and quick to respond to immediate issues – and yet is simultaneously cool under fire, not easily sidetracked, not easily hoodwinked, and implacably immune from displays of personal pique and ego-grandstanding. Not an everyday combination.

It’s also essential for media stars to have a thick skin to cope with criticism. The immediacy of TV and radio creates the illusion that individual broadcasters are personally ‘known’ to the public, who therefore feel free to commend/challenge/complain with unbuttoned intensity.

Those impressive History graduates who appear regularly before the cameras and mikes are therefore a distinctly rare breed.3 (The discussion here refers to media presenters in regular employment, not to the small number of academic stars who script and present programmes while retaining full-time academic jobs – who constitute a different sort of rare breed).

Celebrated exemplars among History graduates include the TV news journalists and media personalities Kirsty Wark (b.1955) and Laura Kuenssberg (b.1976), who are both graduates of Edinburgh University. Both have had public accolades – Wark was elected a Fellow of the Royal Society of Edinburgh in 2017 – and both face much criticism. Kuenssberg in particular, as the BBC’s first woman political editor, is walking her way warily but effectively through the Gothic-melodrama-cum-Greek-tragedy-cum-high-farce known as Brexit.

In a different sector of the media world, the polymathic TV and radio presenter, actor, film critic and chat-show host Jonathan Ross (b.1960) is another History graduate. He began his media career young, as a child in a TV advertisement for a breakfast cereal. (His mother, an actor, put him forward for the role). Then, having studied Modern European History at London University’s School of Slavonic and East European Studies, Ross worked as a TV programme researcher behind the scenes, before eventually fronting the shows. Among his varied output, he’s written a book entitled Why Do I Say These Things? (2008). This title for his stream of reminiscences highlights the tensions involved in being a ‘media personality’. On the one hand, there’s the need to keep stoking the fires of fame; but, on the other, there’s an ever-present risk of going too far and alienating public opinion.

Similar tensions accompany the careers of two further History graduates, who are famed as sports journalists. The strain of never making a public slip must be enormous. John Inverdale (b.1957), a Southampton History graduate, and Nicky Campbell (b.1961), ditto from Aberdeen, have to cope not only with the immediacy of the sporting moment but also with the passion of the fans. Over the years, Inverdale racked up a number of gaffes. Some were unfortunate; none fatal. Nonetheless, readers of the Daily Telegraph in August 2016 were asked rhetorically, and obviously inaccurately: ‘Why Does Everyone Hate John Inverdale?’4 That sort of over-the-top response indicates the pressures of life in the public eye.

Alongside his career in media, meanwhile, Nicky Campbell used his research skills to study the story of his own adoption. His book Blue-Eyed Son (2011)5 sensitively traced his extended family roots among both Protestant and Catholic communities in Ireland. His current role as a patron of the British Association for Adoption and Fostering welds this personal experience into a public role.

The final exemplar cited here is one of the most notable pioneers among women TV broadcasters. Baroness Joan Bakewell (b.1933) has had what she describes as a ‘rackety’ career. She studied first Economics and then History at Cambridge. After that, she experienced periods of considerable TV fame followed by the complete reverse, in her ‘wilderness years’.6 Yet her media skills, her stubborn persistence, and her resistance to being publicly patronised for her good looks in the 1960s, have given Bakewell media longevity. She is not afraid of voicing her views, for example in 2008 criticising the absence of older women on British TV. In her own maturity, she can now enjoy media profiles such as that in 2019 which explains: ‘Why We Love Joan Bakewell’.7 No doubt, she takes the commendations with the same pinch of salt as she took being written off in her ‘wilderness years’.

Bakewell is also known as an author; and for her commitment to civic engagement. In 2011 she was elevated to the House of Lords as a Labour peer. And in 2014 she became President of Birkbeck College, London. In that capacity, she stresses the value – indeed the necessity – of studying History. Her public lecture on the importance of this subject urged, in timely fashion, that: ‘The spirit of enquiring, of evidence-based analysis, is demanding to be heard.’8

What do these History graduates in front of the cameras and mikes have in common? Their multifarious roles as journalists, presenters and cultural lodestars indicate that there’s no straightforward pathway to media success. These multi-skilled individuals work hard for their fame and fortunes, concealing the slog behind an outer show of relaxed affability. They’ve also learned to live with the relentless public eagerness to enquire into every aspect of their lives, from health to salaries, and then to criticise the same. Yet it may be speculated that their early immersion in the study of History has stood them in good stead. As already noted, they are trained in ‘thinking long’. And they are using that great art to ‘play things long’ in career terms as well. Multi-skilled History graduates work in a remarkable variety of fields; and, among them, some striking stars appear regularly in every household across the country, courtesy of today’s mass media.

ENDNOTES:

1 O. Bennett, A History of the Mass Media (1987); P.J. Fourie (ed.), Media Studies, Vol. 1: Media History, Media and Society (2nd edn., Cape Town, 2007); G. Rodman, Mass Media in a Changing World: History, Industry, Controversy (New York, 2008).

2 See Ofcom Report on Diversity and Equal Opportunities in Television (2018): https://www.ofcom.org.uk/__data/assets/pdf_file/0021/121683/diversity-in-TV-2018-report.PDF

3 Information from diverse sources, including esp. the invaluable survey by D. Nicholls, The Employment of History Graduates: A Report for the Higher Education Academy … (2005): https://www.heacademy.ac.uk/system/files/resources/employment_of_history_students_0.pdf; and the short summary by D. Nicholls, ‘Famous History Graduates’, History Today, 52/8 (2002), pp. 49-51.

4 See https://www.telegraph.co.uk/olympics/2016/08/15/why-does-everyone-hate-john-inverdale?

5 N. Campbell, Blue-Eyed Son: The Story of an Adoption (2011).

6 J. Bakewell, interviewed by S. Moss, in The Guardian, 4 April 2010: https://www.theguardian.com/lifeandstyle/2010/apr/04/joan-bakewell-harold-pinter-crumpet

7 https://www.bbc.co.uk/programmes/articles/1xZlS9nh3fxNMPm5h3DZjhs/why-we-love-joan-bakewell.

8 J. Bakewell, ‘Why History Matters: The Eric Hobsbawm Lecture’ (2014): http://joanbakewell.com/history.html.


MONTHLY BLOG 101, ARE YOU A LUMPER OR SPLITTER? HOW WELL DO YOU KNOW YOUR OWN CAST OF MIND?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)
The terminology, derived from Charles Darwin,1 is hardly elegant. Yet it highlights rival polarities in the intellectual cast of mind. ‘Lumpers’ seek to assemble fragments of knowledge into one big picture, while ‘splitters’ see instead complication upon complication. An earlier permutation of that dichotomy was popularised by Isaiah Berlin. In The Hedgehog and the Fox (1953), he distinguished between brainy foxes, who know many things, and intellectual hedgehogs, who apparently know one big thing.2

Fox from © Clipart 2019; Hedgehog from © GetDrawings.com (2019)

These animal embodiments of modes of thought derive from a fragmentary dictum of the classical Greek poet Archilochus; and they remain more fanciful than convincing. It’s not self-evident that a hedgehog’s mentality is really so overwhelmingly single-minded.3 Nor is it clear that the reverse syndrome applies particularly to foxes, which have a reputation for craft and guile.4 To make his point with reference to human thinkers, Berlin instanced the Russian novelist Leo Tolstoy as a classic ‘hedgehog’. Really? The small and prickly hedgehog hardly seems a good proxy for a grandly sweeping thinker like Tolstoy.

Those objections to Berlin’s categories, incidentally, are good examples of hostile ‘splitting’. They quibble and contradict. Sweeping generalisations are rejected. Such objections recall a dictum in a Poul Anderson sci-fi novella, when one character states gravely that: ‘I have yet to see any problem, which, when you looked at it in the right way, did not become still more complicated’.5

Arguments between aggregators/generalisers and disaggregators/sceptics, which occur in many subjects, have been particularly high-profile among historians. The lumping/splitting dichotomy was recycled in 1975 by the American J.H. Hexter.6 He accused the Marxist Christopher Hill not only of ‘lumping’ but, even worse, of deploying historical evidence selectively, to bolster a partisan interpretation. Hill replied relatively tersely.7 He rejected the charge that he did not play fair with the sources. But he proudly accepted that, through his research, he sought to find and explain meanings in history. The polarities of lumping/splitting were plain for all to see.

Historical ‘lumpers’ argue that all analysis depends upon some degree of sorting/processing/generalising, applied to disparate information. Merely itemising date after date, or fact after fact ad infinitum, would not tell anyone anything. On those dreadful occasions when lecturers do actually proceed by listing minute details one by one (for example, going through events year by year), the audience’s frustration very quickly becomes apparent.

So ‘lumpers’ like big broad interpretations. And they tend to write big bold studies, with clear long-term trends. Karl Marx’s panoramic brief survey of world history in nine pages in The Communist Manifesto was a classic piece of ‘lumping’.8 In the twentieth century, the British Marxist historian E.P. Thompson was another ‘lumper’ who sought the big picture, although he could be a combative ‘splitter’ about the faults of others.9

‘Splitters’ conversely point out that, if there were big broad-brush interpretations that were reliably apparent, they would have been discovered and accepted by now. However, the continual debates between historians in every generation indicate that grand generalisations are continually being attacked. The progression of the subject relies upon a healthy dose of disaggregation alongside aggregation. ‘Splitters’ therefore produce accounts of rich detail, complications, diversities, propounding singular rather than universal meanings, and stressing contingency over grand trends.

Sometimes critics of historical generalisations are too angry and acerbic. They can thus appear unduly negative and destructive. Yet one of the twentieth century’s most impressive historical ‘splitters’ was socially a witty and genial man. Intellectually, however, F.J. ‘Jack’ Fisher was widely feared for his razor-sharp and trenchant demolitions of any given historical analysis. Indeed, his super-critical cast of mind had the effect of limiting his own written output to a handful of brilliant interpretative essays rather than a ‘big book’.10 (Fisher was my research supervisor. His most caustic remark to me came after reading a draft chapter: ‘There is nothing wrong with this, other than a female desire to tell all and an Oxbridge desire to tell it chronologically.’ Ouch! Fisher was not anti-woman, although he was critical of Oxbridge, where I’d taken my first degree. But he used this formulation to grab my attention – and it certainly did).

Among research historians today, the temperamental/intellectual cast of mind often inclines them to ‘splitting’, partly because there are many simplistic generalisations about history in public circulation which call out for contradiction or complication. Of course, the precise distribution around the norm remains unknown. These days, I would guesstimate that the profession divides into roughly 45% ‘lumpers’, seeking big grand overviews, and 55% ‘splitters’, stressing detail, diversity and contingency. The classification, however, does depend partly on the occasion and type of output, since single-person expositions on TV and radio encourage generalisations, while round-tables and panels thrive on the disagreements where splitters come into their own.

Moreover, there are not only personal variations, depending upon circumstance, but also major oscillations in intellectual fashions within the discipline. In the later twentieth century, for example, there was a growing, though not universal, suspicion of so-called Grand Narratives (big through-time interpretations).11 The high tide of the sceptical trend known as ‘revisionism’ challenged many old generalisations and easy assumptions. Revisionists did not constitute one single school of thought. Many did favour conservative interpretations of history, but, as remains apparent today, there was and is more than one form of conservatism. That said, revisionists were generally agreed in rejecting both left-wing Marxist conflict models of revolutionary change via class struggles and liberal Whiggish linear models of evolving Progress via spreading education, constitutional rights and so forth.12

Yet the alignments were never simple (a splitterish comment from myself). Thus J.H. Hexter was a ‘splitter’ when confronting Marxists like Hill. But he was a ‘lumper’ when propounding his own Whig view of history as a process of evolving Freedom. So Hexter’s later strictures on revisionism were as fierce as was his earlier critique of Hill.13

Ideally, most research historians probably seek to find a judicious balance between ‘lumping’/‘splitting’. There is scope both for generalisations and for qualifications. After all, there is diversity within the human experience and within the cosmos. Yet there are also common themes, deep patterns, and detectable trends.

Ultimately, however, the dichotomous choice between either ‘lumping’ or ‘splitting’ is a completely false option, when pursued to its limits. Human thought, in all the disciplines, depends upon a continuous process of building/qualifying/pulling down/rebuilding/requalifying/ and so on, endlessly. With both detailed qualifications and with generalisations. An analysis built upon And+And+And+And+And would become too airy and generalised to have realistic meaning. Just as a formulation based upon But+But+But+But+But would keep negating its own negations. So, yes. Individually, it’s worth thinking about one’s own cast of mind and intellectual inclinations. (I personally enjoy both lumping and splitting, including criticising various outworn terminologies for historical periodisation).14 Furthermore, self-knowledge allows personal scope to make auto-adjustments, if deemed desirable. And then, better still, to weld the best features of ‘lumping’ and ‘splitting’ into original thought. And+But+And+Eureka.

ENDNOTES:

1 Charles Darwin in a letter dated August 1857: ‘It is good to have hair-splitters and lumpers’: see Darwin Correspondence Letter 2130 in https://www.darwinproject.ac.uk/.

2 I. Berlin, The Hedgehog and the Fox: An Essay on Tolstoy’s View of History (1953).

3 For hedgehogs, now an endangered species, see S. Coulthard, The Hedgehog Handbook (2018). If the species were to have one big message for humans today, it would no doubt be: ‘Stop destroying our habitat and support the Hedgehog Preservation Society’.

4 M. Berman, Fox Tales and Folklore (2002).

5 From P. Anderson, Call Me Joe (1957).

6 J.H. Hexter, ‘The Burden of Proof: The Historical Method of Christopher Hill’, Times Literary Supplement, 25 Oct. 1975, repr. in J.H. Hexter, On Historians: Reappraisals of Some of the Makers of Modern History (1979), pp. 227-51.

7 For Hill’s rebuttal, see The Times Literary Supplement, 7 Nov. 1975, p. 1333.

8 K. Marx and F. Engels, The Manifesto of the Communist Party (1848), Section I: ‘Bourgeois and Proletarians’, in D. McLennan (ed.), Karl Marx: Selected Writings (Oxford, 1977), pp. 222-31.

9 Among many overviews, see e.g. C. Efstathiou, E.P. Thompson: A Twentieth-Century Romantic (2015); P.J. Corfield, E.P. Thompson, Historian: An Appreciation (1993; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf45.

10 See P.J. Corfield, F.J. Fisher (1908-88) and the Dialectic of Economic History (1990; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf46.

11 See esp. J-F. Lyotard, The Postmodern Condition: A Report on Knowledge (Paris, 1979; in Eng. transl. 1984), p. 7, which detected ‘an incredulity toward meta-narratives’; and further discussions in G.K. Browning, Lyotard and the End of Grand Narratives (Cardiff, 2000); and A. Munslow, Narrative and History (2018). Earlier, Lawrence Stone, a classic historian ‘lumper’, had detected a return to narrative styles of exposition: see L. Stone, ‘The Revival of Narrative: Reflections on a New Old History’, Past & Present, 85 (1979), pp. 3-24. But in this essay Stone was detecting a decline in social-scientific styles of History-writing – not a return to old-style Grand Narratives.

12 Revisionism is sufficiently variegated to have avoided summary within one big study. But different debates are surveyed in L. Labedz (ed.), Revisionism: Essays on the History of Marxist Ideas (1962); J.M. Maddox, Hiroshima in History: The Myths of Revisionism (1974; 2011); L. Brenner, The Iron Wall: Zionist Revisionism from Jabotinsky to Shamir (1984); E. Longley, The Living Stream: Literature and Revisionism in Ireland (Newcastle upon Tyne, 1994); and M. Haynes and J. Wolfreys (eds), History and Revolution: Refuting Revisionism (2007).

13 J.H. Hexter (1910-96) founded in 1986 the Center for the History of Freedom at Washington University, USA, where he was Professor of the History of Freedom, and launched The Making of Modern Freedom series. For his views on revisionism, see J.H. Hexter, ‘Historiographical Perspectives: The Early Stuarts and Parliaments – Old Hat and the Nouvelle Vague’, Parliamentary History, 1 (1982), pp. 181-215; and analysis in W.H. Dray, ‘J.H. Hexter, Neo-Whiggism and Early Stuart Historiography’, History & Theory, 26 (1987), pp. 133-49.

14 See e.g. P.J. Corfield, ‘Primevalism: Saluting a Renamed Prehistory’, in A. Baysal, E.L. Baysal and S. Souvatzi (eds), Time and History in Prehistory (2019), pp. 265-82; and P.J. Corfield, ‘POST-Medievalism/ Modernity/ Postmodernity?’ Rethinking History, 14 (2010), pp. 379-404; also on http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf20.


MONTHLY BLOG 100, CONTROLLING STREET VIOLENCE & LEARNING FROM THE DEMISE OF DUELLING

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Young men carrying knives today can’t simply be equated with gentlemen duelling with rapiers in the eighteenth century. There are very many obvious differences. Nonetheless, the decline and disappearance of duelling has some relevant messages for later generations, when considering how to cope with an increase in violent street confrontations.

Both themes come under the broad rubric of controlling public expressions of male violence. By the way, such a proposition does not claim violence to be purely a masculine phenomenon. Still less does it imply that all men are prone to such behaviour. Yet it remains historically the case that weaponised acts of aggression in public and semi-public places tend to be undertaken by men – and, often, by young men at that.

Duelling developed in Europe from the sixteenth century onwards as a stylised form of combat between two aggrieved individuals.1 In terms of the technology of fighting, it was linked with the advent of the light flexible rapier, instead of the heavy old broadsword. And in terms of conflict management, the challenge to a duel took the immediate heat out of a dispute, by appointing a future date and time for the aggrieved parties to appear on the ‘field of honour’.

At the appointed hour, the meeting did not turn into an instant brawl but was increasingly codified into ritual. ‘Seconds’ accompanied the combatants, to enforce the set of evolving rules and to see fair play. They were there as friendly witnesses but also, to an extent, as referees.2 In the eighteenth century, too, surgeons were often engaged to attend, so that medical attention was available if required.

Sometimes, to be sure, there were variants in the fighting format. On one occasion in 1688 two aristocratic combatants arrived, each supported by two seconds. At a given signal, all six men launched into an uninhibited sword-fight, in which all were wounded and two of the seconds died. However, such escalations were exceptional. The seconds often began the encounter by trying to reconcile the antagonists. If successful, the would-be duellists then shook hands and declared honour to be satisfied. Hence an unknown number of angry challenges never turned into outright fighting. Would-be violence in such cases had been deflected and socially contained.

Duels certainly remained a topic of both social threat and titillating gossip. They were dramatic moments, when individual destiny appeared heightened by the danger of imminent death. Later romantic novelists and film script-writers embraced the melodrama with unwearied enthusiasm. Yet the number of real-life duels in seventeenth- and eighteenth-century Britain was tiny.

No accurate records are available, since such encounters were kept semi-clandestine. Nonetheless, contemporary legal records and newspaper reports provide some clues. Scrupulous research by the historian Robert Shoemaker has identified 236 duels in the metropolitan London area between 1660 and 1830.3 In other words, there were fewer than 1.5 duels per annum on average during these 170 years. The peak duelling decades were those of the later eighteenth century. Between 1776 and 1800, there were on average 4.5 duels per annum. Yet that total emerged from a ‘greater’ London with approximately one million inhabitants in 1801. Even taking Shoemaker’s figures as a minimum, they show that duelling was much rarer in practice than its legendary status implied.
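For readers who want to check the averages, the arithmetic is straightforward (a sketch only; the count and date-range are Shoemaker’s, as cited above):

```python
# Shoemaker's tally: 236 duels identified in metropolitan London, 1660-1830
duels, years = 236, 170

rate = duels / years
print(round(rate, 2))  # → 1.39, i.e. fewer than 1.5 duels per annum
```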

In fact, the question might be put the other way round: why were there so many duels at all, when the practice was officially deplored? The answer has relevance to today’s discussions about knife carrying. Duelling was sustained by a degree of socio-cultural acceptance by men in elite society, who were prepared to risk the legal penalties for unlawful fighting, wounding or killing. Its continuance paid tribute to the power of custom, against the law.

By the early nineteenth century in Britain, when the practice was disappearing, it was pretty much confined to young elite men of military background. However, there were three high-profile cases when very senior Tory politicians rashly took to the field. In 1798 Prime Minister William Pitt the Younger exchanged shots with his Treasurer of the Navy. (Both missed; but Pitt retired to his bed for three weeks, overcome by stress). In 1809 George Canning, the Foreign Secretary, duelled with his fellow Cabinet member, Viscount Castlereagh, Minister for War. (Castlereagh was wounded but not fatally). Most dramatically of all, in 1829 the ‘Iron Duke’ of Wellington, then Prime Minister, confronted the Earl of Winchelsea, in a row over Catholic Emancipation. (Neither was hurt; and the Duke immediately travelled to Windsor to reassure the king that his government was not suddenly leaderless).

These ill-judged episodes were signs of the acute vehemence of political confrontations in highly pressurised times. However, critics were immediately scathing. They asked, pertinently enough, why the populace should obey the laws when such eminent figures were potentially breaching the peace. At the very least, their rash behaviour did not encourage reverence for men in high office.

Fig.2 Equestrian statue of Duke of Wellington, located in Royal Exchange Square, Glasgow: capping the statue with a traffic cone has become a source of local amusement, despite continued disapproval from Glasgow City Council and police.

Public opinion was slowly shifting against duelling. There was no guarantee that the god of battle would give victory to the disputant who was truly in the right. Fighting empowered the bellicose over the irenic. Religious and civic authorities always opposed fighting as a means of conflict resolution. Lawyers were particularly hostile. Self-help administration of justice deprived them of the business of litigation and/or arbitration. Hence in 1822 a senior law lord defined duelling as ‘an absurd and shocking remedy for private insult’.

Other voices had long been arguing that case. In 1753 the novelist Samuel Richardson strove in Sir Charles Grandison to depict a good man who declined to fight a duel, despite being strongly provoked. True, many impatient readers found this saintly hero to be somewhat priggish. But Grandison stressed that killing or maiming a rival over a point of honour was actually the reverse of honourable.4 Bourgeois good sense was triumphing over aristocratic impetuosity, although the fictional Sir Charles had a title just to soothe any anxieties over his social respectability.

Another public declaration against duelling came from the down-to-earth American inventor and civic leader Benjamin Franklin (1706-90). In 1784 he rejected the practice as both barbaric and old-fashioned: ‘It is astonishing that the murderous practice of duelling … should continue so long in vogue’. His intervention was particularly notable, in that recourse to duelling was socially more widespread in the American colonies, with their ingrained gun culture.5 And Franklin stuck to his position, refusing to rise to sundry challenges.

The force of such interventions in Britain helped to render public opinion decreasingly sympathetic to duellists. One pertinent example came from 1796. Early one morning, two Americans faced each other to duel in Hyde Park. But ten swimmers in the nearby Serpentine – some of them naked – jumped out of the water and ran to stop the fight. In this particular case, they were too late; and one contestant died. Nonetheless, witnesses testified in the ensuing murder trial that the crowd, many of middling social origins, had spontaneously intervened. Public attitudes were becoming hostile. And it was that shift, rather than major changes in law or policing, which caused the practice slowly to disappear. The last fatal duel in Scotland took place in 1826; the last in England/Wales (between two exiled Frenchmen) in 1852. When Prime Minister Peel was challenged to a political duel in the 1840s he immediately refused, on the grounds that such behaviour would be ‘childish’ as well as wrong.

Viewed in terms of Britain’s historical sociology, the decline of duelling was part of a complex process of everyday demilitarisation, in the context of the slow shift from a rural to an urbanised society. Gentlemen decreasingly carried swords for other than ceremonial purposes. Canes and umbrellas came into vogue instead. Sheridan’s play The Rivals (1775) poked fun at impetuous young gentlemen who are ready to fight for their honour. Yet they are aware that ‘a sword seen in the streets of Bath would raise as great an alarm as a mad dog’, as one character remarks. The combative Irish adventurer Sir Lucius O’Trigger is lampooned – a nice touch of auto-critique from Sheridan who came from Dublin and twice fought duels himself. And the country bumpkin Bob Acres, who is egged on to fight his rival, tellingly finds his valour ‘oozing away’ when it gets to the point.6 Audiences are invited to laugh, but sympathetically.

Interestingly, by 1775 Sheridan’s play was already behind the times in terms of the technology of fighting. By the 1760s duels had come increasingly to be fought with pistols. The last known sword duel in Britain occurred in 1785. This technological updating, supplied by industrious Birmingham gun-makers, had two paradoxical effects. On the one hand, it demonstrated that the art of duelling was quick to move with the times.

On the other hand, the advent of the pistol inadvertently saved lives. The statistics collected by Robert Shoemaker showed that unequivocally. Duels with swords, among his 236 recovered examples, resulted in deaths in 22 per cent of all cases; and in woundings in another 25 per cent. By contrast, it was tricky to kill a man standing at a distance, especially with early pistols which lacked sights for precise aiming. Among Shoemaker’s 236 cases, as few as 7 per cent of duels with pistols resulted in death; while a further 22 per cent led to woundings.

Or, the point can be put the other way round. A massive 71 per cent of combatants were unharmed after an exchange of pistol shots, compared with 53 per cent of duellists who were unharmed after crossing swords. In neither case did a duel guarantee a bloodbath. But pistols were a safer bet, especially after conventions established that the combatants had to stand at a considerable distance from one another and had to wait for a signal, in the form of a dropping handkerchief, before taking aim and firing. No ‘jumping the gun’. Indeed, one test case in 1750 saw a duellist on trial for murder because he had fired before his opponent was ready. So the victim had testified, plaintively, on his deathbed.
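The ‘unharmed’ shares quoted here are simply the arithmetic remainders of Shoemaker’s death and wounding rates, as a quick check confirms (a sketch; the percentages are his, as cited above):

```python
# Outcome shares (per cent of cases) from Shoemaker's 236 recovered duels
sword = {"died": 22, "wounded": 25}
pistol = {"died": 7, "wounded": 22}

def unharmed(outcomes):
    """Share left unharmed: the remainder after deaths and woundings."""
    return 100 - outcomes["died"] - outcomes["wounded"]

print(unharmed(sword), unharmed(pistol))  # → 53 71
```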

It was the unavoidable proximity of the combatants rather than their martial skills which led to the greater proportion of killings by swordsmen than by gunsmen. That fact is relevant to the experience of knife-carrying today. The number of fatalities is not a sign of a special outcrop of wickedness but rather the consequence of the chosen technology. Knife-wielding in anger at close quarters is intrinsically dangerous, whatever the level of fighting expertise.

Needless to say, the moral of this history is not that combatants should switch to guns. The much-enhanced technology of gunfire today, including the easy firing of multiple rounds, makes that option ever less socially palatable, if it ever was.

Instead, the clear requirement is to separate combatants and to ritualise the expression of social and personal aggression. Achieving such policies must rely considerably upon systems of law and policing. Yet socio-cultural attitudes among the wider public are highly relevant too. As the history of duelling indicates, even august Prime Ministers allowed themselves upon occasion to be provoked into behaving in ways that put them at risk of criminal charges. But changing social mores eventually removed that option, even for the most combative and headstrong of politicians today. Community attitudes at first ritualised the personal resolution of conflicts and eventually withdrew support for such behaviour entirely.

So today multiple approaches are required. Police actions to discourage young men from carrying knives constitute an obvious and important step. Ditto effective policies to curb the drug culture. Equally crucial are strong and repeated expressions of community disapproval of violence and knife-carrying. Yet policing and public attitudes can’t work without complementary interventions to combat youth alienation and, especially, to provide popular non-violent outlets for energy and aggression. Leaving bored young people feeling fearful and at risk in public places is no recipe for social order.

How can energies and aggression be ritualised and/or channelled into other outlets? It’s for young people and community activists to specify. But many potential options spring to mind: youth clubs; youth theatre; participatory sports of all kinds; martial arts; adventure programmes; community and ecological projects; music-making festivals; dance; creative arts; church groups; … let alone continuing educational access via further education study grants. It’s true that all such plans involve constructive imagination, organisation, and expenditure. But their benefits are immense. Violence happens within societies; and so, very emphatically, does conflict resolution and, better still, the redirection of energies and aggression into constructive pathways.

1 See variously S. Banks, Duels and Duelling (Oxford, 2014); U. Frevert, Men of Honour: A Social and Cultural History of the Duel (Cambridge, 1995); V.G. Kiernan, The Duel in European History: Honour and the Reign of the Aristocracy (Oxford, 1988; 2016); M. Peltonen, The Duel in Early Modern England: Civility, Politeness and Honour (Cambridge, 2003); P. Spierenburg (ed.), Men and Violence: Gender, Honour and Rituals in Modern Europe and America (Columbus, Ohio, 1998).

2 S. Banks, ‘Dangerous Friends: The Second and the Later English Duel’, Journal of Eighteenth-Century Studies, 32 (2009), pp. 87-106.

3 R.G. Shoemaker, ‘The Taming of the Duel: Masculinity, Honour and Ritual Violence in London, 1660-1800’, Historical Journal, 45 (2002), pp. 525-45.

4 S. Richardson, The History of Sir Charles Grandison (1753; in Oxford 1986 edn), Bk.1, pp. 207-8.

5 B. Franklin, ‘On Duelling’ (1784), in R.L. Ketcham (ed.), The Political Thought of Benjamin Franklin (Indianapolis, Ind., 1965; 2003), p. 362. For context, see also W.O. Stevens, Pistols at Ten Paces: The Story of the Code of Honour in America (Boston, 1940); D. Steward, Duels and the Roots of Violence in Missouri (2000); and C. Burchfield, Choose Your Weapon: The Duel in California, 1847-61 (Fresno, CA., 2016).

6 R.B. Sheridan, The Rivals (1775), ed. E. Duthie (1979), Act V, sc. 2 + 3, pp. 105, 112. For the Irish context, see J. Kelly, ‘That Damn’d Thing Called Honour’: Duelling in Ireland, 1570-1860 (Cork, 1995).

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 100 please click here

MONTHLY BLOG 97, WHY IS THE REMARKABLE CHARLOTTE DESPARD NOT BETTER KNOWN?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Fig.1 Charlotte Despard speaking at an anti-fascist rally, Trafalgar Square, 12 June 1933:
photograph by James Jarché, Daily Herald Archive.

Charlotte Despard (1844-1939) was a remarkable – even amazing – woman. Don’t just take my word for it. Listen to Mahatma Gandhi (1869-1948). Visiting London in 1909, he met all the leading suffragettes. The one who impressed him most was Charlotte Despard. She is ‘a wonderful person’, he recorded. ‘I had long talks with her and admire her greatly’.1 They both affirmed their faith in the non-violent strategy of political protest by civil disobedience. Despard called it ‘spiritual resistance’.

What’s more, non-violent protest has become one of the twentieth-century’s greatest contributions to potent mass campaigning – without resorting to counter-productive violence. Associated with this strategy, the names of Henry Thoreau, Mahatma Gandhi and Martin Luther King, all controversial in their day, have become canonised.2 Yet Charlotte Despard, who was also controversial in her day, has been substantially dropped from the historical record.

Not entirely so. On 14 December 2018 Battersea Labour unveiled a blue plaque in her honour, exactly one hundred years after the date when she stood as the Labour Party candidate in North Battersea in the 1918 general election. She was one of the feminist pioneers, at an election in which no more than sixteen women stood. But Despard lost heavily to the Liberal candidate, even though industrial North Battersea was then emerging as a Labour stronghold.3

And one major reason for her loss helps to explain her disappearance from mainstream historical memory. Despard was a pacifist, who opposed the First World War and campaigned against conscription. Many patriotic voters in Battersea disagreed with this stance. In the immediate aftermath of war, emotions of relief and pride triumphed. Some months later, Labour swept the board in the 1919 Battersea municipal elections; but without Charlotte Despard on the slate.

Leading pacifists are not necessarily all neglected by history.4 But the really key point was that Charlotte Despard campaigned for many varied causes during her long life and, at every stage, weakened her links with previous supporters. Her radical trajectory made complete sense to her. She sought to befriend lame dogs and to champion outsiders. Yet as an independent spirit – and seemingly a psychological loner – she walked her own pathway.

Despard was by birth an upper crust lady of impeccable Anglo-Irish ancestry, with high-ranking military connections. For 40 years, she lived quietly, achieving a happy marriage and a career as a minor novelist. Yet, after being widowed at the age of 40, she had an extraordinary mid- and late-life flowering. She moved to Battersea’s Nine Elms, living among the poorest of the poor. And she then became a life-long radical campaigner. By the end of her career, she was penniless, having given all her funds to her chosen causes.

A convinced suffragette, Despard joined the Women’s Social and Political Union and was twice imprisoned for her public protests. In 1907, however, she was one of the leading figures to challenge the authoritarian leadership style of Christabel Pankhurst. Despard resigned and founded the rival Women’s Freedom League. This smaller group opposed the use of violence. Instead, its members took symbolic action, like unfurling banners in parliament. They also advocated passive resistance, like non-payment of taxation and non-cooperation with the census. (I recently discovered, thanks to the research of a family member, that my great-grandmother was a would-be WFL supporter. So the 1911 census enumerator duly noted that Mrs Matilda Corfield, living in Sheffield, had given information only ‘under Protest (she wants the vote)’.5 This particular example of resistance was very muffled and inconsequential. Nevertheless, it indicated how unknown women across the country tried to respond to WFL advice. It was one way of slowly changing the climate of public opinion.)

However, the energetic Charlotte Despard did not confine her efforts solely to the cause of female suffrage. Her life in Battersea radicalised her politically and she became a socialist. She was not good at detailed committee work. Her forte was activism. Indefatigably, she organised a local welfare system. She funded health centres for mothers and babies, exchange points for cots and equipment, youth clubs, and halls for local meetings. And the front room of her small premises in Nine Elms was made available to the public as a free reading room, stocked with books and newspapers. It was a one-woman exercise in practical philanthropy. What’s more, her 1918 election manifesto called for a minimum wage – something not achieved until 1998.

Among the Battersea workers, the tall, wiry, and invariably dignified Charlotte Despard cut an impressive figure. A lifelong vegetarian, she was always active and energetic. And she believed in the symbolic importance of dress. Thus she habitually wore sandals (or boots in winter) under long, flowing robes, a lace shawl, and a mantilla-like head-dress. The result was a timeless style, unconcerned with passing fashions. She looked like a secular sister of mercy.

Fig.2 Charlotte Despard in the poor tenements of Battersea’s Nine Elms, where she lived from 1890 to the early 1920s, instituting and funding local welfare services. Her visitors commented adversely on the notorious ‘Battersea smell’ of combined industrial effluent and smoke from innumerable coalfires; but Despard reportedly took no notice.

For a number of years, Despard worked closely with the newly founded Battersea Labour Party (1908- ), strengthening its global connections. She attended various international congresses; and she backed the Indian communist Shapurji Saklatvala as the Labour-endorsed candidate in Battersea North at the general election in 1922. (He won, receiving over 11,000 votes). Yet, as already noted, the Battersea electorate in 1918 had rebuffed her own campaign.

Then at a relatively loose end, Despard moved to Dublin in the early 1920s. She had already rejected her Irish Ascendancy background by converting to Catholicism. There she actively embraced the cause of Irish nationalism and republicanism. She became a close supporter of Maud Gonne, the charismatic exponent of Irish cultural and political independence. By the later 1920s, however, Despard was unhappy with the conservatism of Irish politics. In 1927 she was classed as a dangerous subversive by the Free State, for opposing the Anglo-Irish Treaty settlement. She eventually moved to Belfast and changed tack politically to endorse Soviet communism. She toured Russia and became secretary of the British Friends of the Soviet Union (FSU), which was affiliated to the International Organisation of the same name.

During this variegated trajectory, Despard in turn shocked middle-class suffragettes who disliked her socialism. She then offended Battersea workers who rejected her pacifism. She next infuriated English Protestants who hated her Irish nationalism. And she finally outraged Irish Catholics (and many Protestants as well) who opposed her support for Russian communism. In 1933, indeed, her Dublin house was torched and looted by an angry crowd of Irish anti-communists.6

In fact, Despard always had her personal supporters, as well as plentiful opponents. But she did not have one consistent following. She wrote no autobiography; no memorable tract of political theory. And she had no close family supporters to tend her memory. She remained on good terms with her younger brother throughout her life. But he was Sir John French, a leading military commander in the British Army and from 1918 onwards Lord Lieutenant of Ireland. The siblings disagreed politically on everything – although both shared the capacity to communicate on easy terms with people from many different backgrounds. To the Despards, ‘Aunt Lottie’ was thus an eccentric oddity. To other respectable family friends, she was ‘a witch’, and a dangerous one at that.7

These factors combined together to isolate Despard and to push her, after her death, into historical limbo. There are very few public monuments or memorials to her indomitable career. In north London, a pleasant pub on the Archway Road is named after her, on land which was owned by her husband Colonel Despard. On Battersea’s Doddington Estate, there is an avenue named after her, commemorating her welfare work in the area. And now there is the blue plaque outside the headquarters of Battersea Labour at 177 Lavender Hill, SW11. These memorials are fine but hardly enough.

Fig.3 Blue plaque to Charlotte Despard, outside 177 Lavender Hill, London SW11 5TE: installed 14 December 2018, on the precise centenary of her standing for parliament in 1918, as one of only 16 women pioneers to do so.

Why should she be remembered? The answer is not that everyone would have agreed (then or later) with all of Charlotte Despard’s political calls. As this account has shown, she was always controversial and, on Russia, self-deceived into thinking it much more of a workers’ paradise than it was (as were many though not all left-leaning intellectuals in the West). Nonetheless, she is a remarkable figure in the history of public feminism. She not only had views but she campaigned for them, using her combination of practical on-the-ground organisation, her call for symbolic non-violent protest and ‘spiritual resistance’, and her public oratory. And she did so for nigh on 50 years, into advanced old age.

Indomitability, peaceful but forceful, was her signature style. She quoted Shelley on the need for Love, Hope, and Endurance. When she was in her mid-sixties, she addressed a mass rally in Trafalgar Square (of course, then without a microphone). Her speeches were reportedly allusive and wide-ranging, seeking to convey inspiration and urgency. One onlooker remembered that her ‘thin, fragile body seemed to vibrate with a prophecy’.8

Appropriately for a radical campaigner, Charlotte Despard’s last major public appearance was on 12 June 1933, when she spoke passionately at a mass anti-fascist rally in Trafalgar Square. At that time, she was aged 89. It was still unusual then for women to speak out boldly in public. They often faced jeers and taunts for doing so. But the photographs of her public appearances show her as unflinching, even when she was the only woman amidst crowds of men. Above all, for the feminist feat of speaking at the mass anti-fascist rally at the age of 89, there is a good case for placing a statue on Trafalgar Square’s vacant fourth plinth, showing Despard in full oratorical flow. After all, she really was there. And, if not on that particular spot, then somewhere relevant in Battersea. Charlotte Despard, born 175 years ago and campaigning up until the start of the Second World War, was a remarkable phenomenon. Her civic and feminist commitment deserves public commemoration – and in a symbolic style worthy of the woman.

Figs 4 + 5: Photos showing Despard, speaking in Trafalgar Square, without a microphone:
(L) dated 1910 when she was 66, and (R) dated 1933 when she was aged 89.
Her stance and demeanour are identically rapt, justifying one listener’s appreciative remark:
‘Mrs Despard – she always gets a crowd’.

1 Quoted in M. Mulvihill, Charlotte Despard: A Biography (1989), p. 86. See also A. Linklater, An Unhusbanded Life: Charlotte Despard, Suffragette, Socialist and Sinn Feiner (1980); and, for Battersea context, P.J. Corfield in Battersea Matters (Autumn 2016), p. 11; and PJC with Mike Marchant, DVD: Red Battersea: One Hundred Years of Labour, 1908-2008 (2008).

2 A. Roberts and T. Garton Ash (eds), Civil Resistance and Power Politics: The Experience of Non-Violent Action from Gandhi to the Present (Oxford, 2009); R.L. Holmes and B.L. Gan (eds), Nonviolence in Theory and Practice (Long Grove, Illinois, 2012).

3 1918 general election result for North Battersea: Richard Morris, Liberal (11,231 = 66.6% of all voting); Charlotte Despard, Labour (5,634 = 33.4%). Turnout = 43.7%.

4 P. Brock and N. Young, Pacifism in the Twentieth Century (New York, 1999).

5 With thanks to research undertaken by Annette Aseriau.

6 Mulvihill, Charlotte Despard, p. 180.

7 Ibid., pp. 46-7, 78-9.

8 Account by Christopher St John, in Mulvihill, Charlotte Despard, p. 77.


MONTHLY BLOG 96, WHAT’S WRONG WITH PREHISTORY?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Arthur’s Stone, Herefordshire, dating from c.3000 BCE: photo © Tony Belton, 2016


What’s wrong with ‘prehistory’? Absolutely nothing but the name. People refer to ancient monuments as ‘prehistoric’ and everyone knows roughly what is meant. The illustration (above) shows an ancient burial tomb, known as Arthur’s Stone, dating from 3000 BCE, which I visited in Herefordshire on a summer day in 2016. It did and does indeed look truly venerable. So loose terms such as ‘prehistoric’ are passable enough if used casually.

But ‘prehistory’ as a scholarly term in application to a prolonged period of human history? Seriously misleading. It implies that the long aeons of foundational human history, before the advent of literacy, somehow occurred in a separate ante-chamber to the ‘real’ thing.

The acquiring of skills in reading and writing (which occurred in different parts of the world at different times) was in fact part of a lengthy process of human adaptation and invention. Before literacy, key developments included: the adoption of clothing; the taming of fire; the invention of tools; the refinement of tools and weapons with handles; the invention of the wheel; the arrival of speech; the advent of decorative arts; the formulation of burial rituals; the domestication of animals; the development of a calendrical consciousness; the capacity to cope with population fluctuations including survival during the Ice Age; the start of permanent settlements and farming; and the cumulative mental and cultural preparation for the invention of reading and writing. Some list! The pace of change was often slow; but the changes were absolutely foundational to human history.1

In practice, of course, the skilled and ingenious experts, who study pre-literate societies, do not consider their subject to be anything other than fully and deeply historical. They use ‘prehistory’ because it is a known term of art. (Often, indeed, they may start their lectures and books with a jovial disclaimer that such terminology should not be taken literally). The idea of ‘prehistory’ was crystallised by Victorian historians, who were developing a deep reverence for the importance of written sources for writing ‘real’ history. But the differences in prime source material, although methodologically significant, are not fundamental enough to deprive the foundational early years of the full status of history. And, in fact, these days historians of all periods study a range of sources. They are not just stuck in archives, reading documents – important as those are. If relevant to their theme, historians may examine buildings, art, artefacts, materials, bones, refuse, carbon datings, statistical extrapolations, and/or genetic evidence (etc etc), just as do archaeologists and ‘prehistorians’.

Moreover, conventional references to ‘prehistory’ have now been blind-sided by the recent return to diachronic (through-time) studies of what is known as Big History. This approach to the past takes as its remit either the whole of the cosmos or at least the whole lifespan of Planet Earth.2 It draws upon insights from cosmologists and astro-physicists, as well as from geologists and biologists. After all, a lot of history had indeed happened before the first humans began to walk. So what are the millennia before the advent of homo sapiens to be entitled? Pre-prehistory? Surely not. All these eras form part of what is sometimes known as ‘deep history’: a long time ago but still historical.

So why has the misleading term ‘prehistory’ survived for so long? One major reason lies in the force of inertia – or institutional continuity, to give it a kinder name. ‘Prehistory’ has prevailed as an academic terminology for over a century. It appears in the names of academic departments, research institutions, learned societies, job descriptions, teaching courses, examination papers, academic journals, books, blogs, conferences, publishers’ preferences for book titles, and popular usages – let alone in scholars’ self-definitions. Little wonder that renaming is not a simple matter. Nonetheless, subjects are continuously being updated – so why not a further step now?

I was prompted to write on this question when three congenial colleagues asked me, a couple of years ago, to contribute to a volume on Time & History in Prehistory (now available, with publication date 2019).3 I was keen to respond but hostile to the last word in their book title. My answer took the form of arguing that this specialist section of historical studies needs a new and better name. I am grateful for the editors’ forbearance in accepting my contribution. It contributes to debates elsewhere within the volume, since criticising the terminology of ‘prehistory’ is not new.

Apart from the lack of logic in apparently excluding the foundational experiences of the human species from ‘real’ history, my own further objection is that the division inhibits diachronic analysis of the long term. A surviving relic from ‘prehistoric’ times, like Arthur’s Stone, has a long and intriguing history which still continues. At some stage long before the thirteenth century CE, the modest monument, high on a ridge between the Wye and Golden Valleys, became associated in popular legend with the feats of King Arthur. (Did he win a battle there, rumour speculated, or slay a giant?) That invented linkage is in itself a fascinating example of the spread of the Arthurian legend.4

The site later witnessed some real-life dramas. In the fifteenth century, a knight was killed there in a fatal duel. And in September 1645 the embattled Charles I dined at the Stone with his royalist troops. Perhaps he intended the occasion as a symbolic gesture, although it did not confer upon him sufficient pseudo-Arthurian lustre to defeat Cromwell and the Roundheads.

For the villagers in nearby Dorstone and Bredwardine, Arthur’s Stone at some stage (chronology uncertain) became a venue for popular festivities, with dancing and ‘high jinks’ every midsummer. This long-standing tradition continued until well into Victorian times. As a sober counter-balance, too, the local Baptists in the nineteenth and twentieth centuries organised an ecumenical religious service there each June/July. Living witnesses remember these as occasions of fervent al fresco hymn-singing. Implicitly, they were acknowledging the Stone’s sacral nature, whilst simultaneously purging its pagan associations.

When visiting the Stone myself in 2016, I met by chance a local resident, named Ionwen Williams. In a stroke of research serendipity, we got chatting and she zestfully recounted her memories, as a child before World War II, of joining her schoolfellows to sing hymns at the site each midsummer. This experience and many later visits confirmed for her the special nature of the place. I did not for a moment doubt her memories; but, as a prudent historian, thought it helpful to cross-check – and found them corroborated.

It is abundantly clear that, throughout its five thousand years of existence, Arthur’s Stone has had multiple meanings for the witnessing generations. At one sad stage in the late nineteenth century, it was pillaged by builders taking stones for new constructions. But local objections put a stop to that; and it is now guarded by English Heritage. It is utterly historic, not separately ‘prehistoric’: and the same point applies to all long-surviving monuments, many of which are much bigger and more famous than Arthur’s Stone. Furthermore, deep continuities apply to many other aspects of human history – and not just to physical monuments. For example, there are many claims and counter-claims about the foundations of human behaviour which merit debate, without compartmentalising the eras of pre-literacy from those of post-literacy.

Lastly, what alternative nomenclature might apply? Having in the first draft of my essay rebuked the specialists known as ‘prehistorians’ for not changing their name, I was challenged by the editors to review other options. Obviously it’s not for one individual to decide. It was, however, a good challenge. In many ways, these early millennia might be termed ‘foundational’ in human history. That, after all, is what they were. On the other hand, ‘foundational history’ sounds like a first-year introduction course. Worthy but not very evocative. My essay reviews various options and plumps for ‘primeval’ history. That term not only sounds ancient but signals primacy: in human history, these years came first.5 The contributions within the volume as a whole are questioning and challenging throughout, as they analyse different aspects of Time and, yes, ‘History’. It is a pleasure to join these essays in thinking long.6

1 For an enticing introduction (apart from one word in its subtitle), see C. Gamble, Timewalkers: The Prehistory of Global Colonisation (Sutton: Stroud 1993).

2 For an introduction, see D.G. Christian, Maps of Time: An Introduction to Big History (U. of California Press: Berkeley, 2004).

3 S. Souvatzi, A. Baysal and E.L. Baysal (eds), Time and History in Prehistory (Routledge: Abingdon, 2019).

4 N.J. Lacy (ed.), The New Arthurian Encyclopaedia (Garland: New York, 1991).

5 P.J. Corfield, ‘Primevalism: Saluting a Renamed Prehistory’, in Souvatzi, Baysal and Baysal (eds), Time and History, pp. 265-82. My own interest in ‘long ago’ was sparked when, as a teenager, I read a study by Ivar Lissner, entitled The Living Past (Cape: London, 1957): for which see P.J. Corfield, ‘An Unknown Book Which Influenced Me’ BLOG no.14 (Nov. 2011).

6 On this theme, see J. Guldi and D. Armitage, The History Manifesto (Cambridge University Press: Cambridge, 2014); P.J. Corfield, ‘What on Earth is the “Temporal Turn” and Why is it Happening Now?’ BLOG no.49 (Jan. 2015); and idem, ‘Thinking Long: Studying History’, BLOG no.94 (Oct. 2018), all BLOGs available on www.penelopejcorfield.com/monthly-blogs.


MONTHLY BLOG 95, ‘WHAT IS THE GREATEST SIN IN THE WORLD?’ CHRISTOPHER HILL AND THE SPIRIT OF EQUALITY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Text of short talk given by PJC to introduce the First Christopher Hill Memorial Lecture, (given by Prof. Justin Champion) at Newark National Civil War Centre, on Saturday 3 November 2018.

Christopher Hill was not only a remarkable historian – he was also a remarkable person.1 All his life, he believed, simply and staunchly, in human equality. But he didn’t parade his beliefs on his sleeve. At first meeting, you would have found him a very reserved, very solid citizen. And that’s because he was very reserved – and he was solid in the best sense of that term. He was of medium height, so did not tower over the crowd. But he held himself very erect; had a notably sturdy, broad-shouldered Yorkshire frame; and was very fit, cycling and walking everywhere. And in particular, Christopher Hill had a noble head, with a high forehead, quizzical eyebrows, and dark hair which rose almost vertically – giving him, especially in his later years, the look of a wise owl.
Christopher-Hill-1-&-2

Christopher Hill (L) in his thirties and (R) in his seventies

By the way, he was not a flashy dresser. The Hill family motto was ‘No fuss’. And, if you compare the two portraits of him in his 30s and his 70s, you could be forgiven for thinking that he was wearing the same grey twill jacket in both. (He wasn’t; but he certainly stuck to the same style all his life).

Yet even while Christopher Hill was reserved and dignified, he was also a benign figure. He had no side. He did not pull rank. He did not demand star treatment. He was courteous to all – and always interested in what others had to say. That was a key point. As Master of Balliol, Hill gave famous parties, at which dons and students mingled; and he was often at the centre of a witty crowd. But just as much, he might be found in a corner of the room discussing the problems of the world with a shy unknown.

As I’ve already said, Christopher Hill believed absolutely in the spirit of equality. But he did know that it was a hard thing to achieve – and that was why he loved the radicals in the English civil wars of the mid-seventeenth century. They were outsiders who sought new ways of organising politics and religion. Indeed, they struggled not only to define equality – but to live it. And, although there was sometimes a comic side to their actions, he admired their efforts.

When I refer to unintentionally comic aspects, I am thinking of those Ranters, from the radical and distinctly inchoate religious group, who jumped up in church and threw off their clothes as a sign. The sign was that they were all God’s children, equal in a state of nature. Not surprisingly, such behaviour attracted a lot of criticism – and satirists had good fun at their expense.

Well, Christopher Hill was far too dignified to go around throwing off his clothes. But he grew up believing in a radical form of Methodism, which stressed that ‘we are all one in the eyes of the Lord’. As I’ve said, his egalitarianism came from within. But he was clearly influenced by his Methodist upbringing. His parents were kindly people, who lived simply and modestly (neither too richly nor too poorly). They didn’t drink, didn’t smoke, didn’t swear and didn’t make whoopee. Twice and sometimes even three times on Sundays, they rode their bikes for several miles to and from York’s Central Methodist Chapel; and then discussed the sermon over lunch.

In his mid-teens, Hill was particularly inspired by a radical Methodist preacher. He was named T.S. Gregory and he urged a passionate spiritual egalitarianism. Years later, Hill reproduced for me Gregory’s dramatic pulpit style. He almost threw himself across the lectern and spoke with great emphasis: ‘Go out into the streets – and look into the eyes of every fellow sinner, even the poorest beggar or the most abandoned prostitute; [today he would add look under the hoods of the druggies and youth gangs]; look into these outcast faces and in every individual you will see elements of the divine.’ The York Methodists, from respectable middle-class backgrounds, were nonplussed. But Hill was deeply stirred. For him, Gregory voiced a true Protestantism – which Hill defined as wine in contrast with what he saw as the vinegar and negativism of later Puritanism.

The influence of Gregory was, however, not enough to prevent Hill in his late teens from losing his religious faith. My mother, Christopher’s younger sister, was very pleased at this news as she welcomed his reinforcement. She herself had never believed in God, even though she too went regularly to chapel. But their parents were sincerely grieved. On one occasion, there was a dreadful family scene, when Christopher, on vacation from Oxford University, took his younger sister to the York theatre. Neither he nor my mother could later remember the show. But they both vividly recalled their parents’ horror: going to the theatre – abode of the devil! Not that the senior Hills shouted or rowed. That was not their way. But they conveyed their consternation in total silence … which was difficult for them all to overcome.

As he lost his faith, Hill converted to a secular philosophy, which had some elements of a religion to it. That was Marxism. Accordingly, he joined the British Communist Party. And he never wavered in his commitment to a broad-based humanist Marxism, even when he resigned from the CP in 1956. Hill was not at all interested in the ceremonies and ritual of religion. The attraction of Marxism for him was its overall philosophy. He was convinced that the revolutionary unfolding of history would eventually remove injustices in this world and usher in true equality. Hill sought what we would call a ‘holistic vision’. But the mover of change was now History rather than God.

On those grounds, Hill for many years supported Russian communism as the lead force in the unfolding of History. In 1956, however, the Soviet invasion of Hungary heightened a fierce internal debate within the British Communist Party. Hill and a number of his fellow Marxist historians struggled to democratise the CP. But they lost and most of them thereupon resigned.

This outcome was a major blow to Hill. Twice he had committed to a unifying faith and twice he found its worldly embodiment unworthy. Soviet Communism had turned from intellectual inspiration into a system based upon gulags, torture and terror. Hill never regretted his support for Soviet Russia during the Second World War; but he did later admit that, afterwards, he had supported Stalinism for too long. The mid-1950s was an unhappy time for him both politically and personally. But, publicly, he did not wail or beat his breast. Again, that was not the Hill way.

He did not move across the political spectrum, as some former communists did, to espouse right-wing causes. Nor did he become disillusioned or bitter. Nor indeed, did he drop everything to go and join a commune. Instead, Hill concentrated even more upon his teaching and writing. He did actually join the Labour Party. Yet, as you can imagine, his heart was not really in it.

It was through his historical writings, therefore, that Hill ultimately explored the dilemmas of how humans could live together in a spirit of equality. The seventeenth-century conflicts were for him seminal. Hill did not seek to warp history to fit his views. He could not make the radicals win, when they didn’t. But he celebrated their struggles. For Hill, the seventeenth-century religious arguments were not arid but were evidence of the sincere quest to read God’s message. He had once tried to do that himself. And the seventeenth-century political contests were equally vivid for him, as he too had been part of an organised movement which had struggled to embody the momentum of history.

As I say, twice his confidence in the worldly formulations of his cause failed. Yet his belief in egalitarianism did not. Personally, he became happy in his second marriage; and he immersed himself in his work as a historian. From being a scholar who wrote little, he became super-productive. Books and essays poured from his pen. Among those he studied was the one seventeenth-century radical who appealed to him above all others: Gerrard Winstanley, the Digger, who founded an agrarian commune in the Surrey hills. And the passage in Winstanley’s Law of Freedom (1652) that Hill loved best was dramatic in the best T.S. Gregory style. ‘What is the greatest sin in the world?’ demanded Winstanley. And he answered emphatically that it is for rich people to hoard gold and silver, while poor people suffer from hunger and want.

What Hill would say today, at the ever-widening inequalities across the world, is not hard to guess. But he would also say: don’t lose faith in the spirit of equality. It is a basic tenet of human life. And all who believe in fair dos for all, as part of true freedom, should strive to find their own best way, individually and/or collectively, to do their best for their fellow humans and to advance Hill’s Good Old Cause.

1 For documentation, see P.J. Corfield, ‘“We are all One in the Eyes of the Lord”: Christopher Hill and the Historical Meanings of Radical Religion’, History Workshop Journal, 58 (2004), pp. 110-27. Now posted on PJC personal website as Pdf5; see also further web-posted essays PJC Pdf47-50, all on www.penelopejcorfield.co.uk.
