If citing, please kindly acknowledge copyright © Penelope J. Corfield (2016)

Does the study of History ‘progress’? That verb is cited cautiously in inverted commas, because we are all wary of over-simplified claims for historical Progress which can be deceptive, even cruelly so. But the study of History is a highly pluralistic discipline. It’s undertaken around the world by countless specialists and generalists alike. ‘History’ does not peddle its own party line. Instead, the subject rejoices in disagreements and debates. If it does ‘progress’ towards a triumphant end-point (on a journey which never stops), then it does so through pluralistic efforts and zig-zagging routes.

A dulcet vision of the City Set on a Hill (2005): the ideal outcome, always sought, always elusive.

Here I argue that the study of the human past does progress, in the sense of collectively getting better sources, methodologies, agreed practices, advice handbooks, and theoretical investigations as well as smarter popularisations, text-books, and research publications – and deeper, better overall interpretations. The route, obviously, is not a step-by-step one, with each History book being better than the one before. On the contrary. I’m talking about a very long term process, between the generations, evolving since at least the eighteenth-century advent of a secular discipline of History-writing. Things, collectively, do get better.

On the way to substantiating that assertion, it’s helpful to answer two other related questions that are often raised by doubters, viz: Why do historians keep rewriting history? Why can’t they just tell it like it was and stop arguing? Two broad answers come to mind. Firstly, the debates and arguments are an essential part of the process of interrogating the past. Just as History belongs to everyone, so there is no limit to the number of historical interpretations – and a good thing too. Furthermore, in every generation, there are discoveries of new sources, or new ways of using old sources, or new technologies that encourage new methodologies – let alone new questions and new approaches from new researchers (one of the major sources of change) – and, not least, the new perspectives brought about by the unfolding of History through time. Since historical research is always focused upon a moving target, historical writing must move with it. Rewriting is, in other words, not a failing but a triumphant component of the discipline.

Even if no new evidence on a particular topic ever emerges, changing subsequent events introduce changed perspectives. For example, should Scotland leave the United Kingdom sometime soon, then interpretations of the 1707 Act of Union will change. It will no longer be regarded as a brilliant compromise settlement that gained longevity and permanence – but instead as a political expedient which had a prolonged but ultimately limited shelf-life of just over 300 years (not that long in the grand scheme of things).

Yet, if historical output is always being rewritten (and, by implication, the old stuff rejected or discarded), then how can History ‘progress’? Doesn’t that mean that each generation’s writings are only good for their own day – and, after that, as dead as the fabled dodo? But, in fact, old efforts are not all discarded. Some elements may be entirely refuted or rejected or simply forgotten. Others lie fallow and then may later be revived and re-examined. But most studies remain on hand, more or less actively, in intellectual parks (traditionally known as libraries, now supplemented by websites). There they are subjects for further circulation, consultation, debate, adaptation, modification, forgetting, retrieval and, yes, at times complete oblivion – though even a forgotten work may have influenced another which remains in circulation. The pathways to knowledge are multi-circuited.

Sites of stored learning, from libraries to websites, and interactively between them

Historians don’t start by rewriting the whole subject from scratch. Instead, they build broadly upon the work of earlier generations. For that reason, as they are engaging in a discipline that focuses upon the workings of Time, historians often begin their studies with a historiographical review, analysing the past and present state of their chosen field. Even if one individual researcher is keen to embark upon polemical warfare with an influential precursor, it is rarely the case that the polemicist rejects absolutely everything written in earlier times. The framework of dates, events and chief protagonists is (mainly) already fixed.

In effect, there is something like a continuous dialogue between the generations – except that it’s not a true dialogue, since earlier generations can’t answer back (and can’t adapt their views in the light of criticisms). So let me invent a word. There’s a plurilogue, across time and, simultaneously, between scholars from different cultures and traditions around the world.

But here’s an annoying discovery. I am not unique in my powers of linguistic invention. I’ve just googled ‘plurilogue’ to discover that it’s already the name of a recently-established international online journal, presenting reviews of philosophy and political science. In that case, I rally by claiming instead that it’s a word whose time has come. Its parallel invention is an example of plurilogue in action.

Which leads me to my last point. Historians, like the practitioners of all academic disciplines, build upon work that has gone before. Even refutations or corrections constitute a form of reconstruction. An example is the collective effort and sometimes fierce debates that it has taken, over two generations, to establish reasonably reliable figures for the extent of the state-directed murders during the Holocaust.3 Similar endeavours combine to seek accurate figures for mortality in wars, or through political purges, or as a result of epidemics – often with the useful side-effect of refuting rumours, legends and propaganda claims. In terms of knowledge, that’s progress.

All the work of previous generations provides a scaffolding, which allows for new growth, development, reconfiguration, pulling down and building up. And that assessment applies not only to the work of scholars but also to the crucial input of all who work in libraries, archives, museums, art galleries, heritage associations, and everywhere that resources are preserved and curated for the use of this and future generations. Today those who are digitising historical materials are carrying out the same essential tasks in a different medium, generating wider democratic access, as well as new challenges and endless possibilities. Certainly, were Denis Diderot and the Encyclopédistes living in today’s ‘Age of Wikipedia’, they would be leading the charge to put everything onto the web – and no doubt trying to enforce greater accuracy upon Wikipedia.

Access to raw data alone will not, of course, make a work of history. Historians still need to grapple with their sources, with their own ideas, and with each other – as well as with their precursors. There’s a famous maxim about ‘standing on the shoulders of giants’. Reality is not so glamorous. Historians stand on the shoulders of all who went before, giant or pygmy (reputations rise and fall retrospectively too). It’s a collective thing. Plurilogue is endlessly expanding, which makes it hard work but utterly enthralling.

1 Most historians focus upon the human past at various points during the many generations that have existed after the advent of basic literacy. But for some purposes, the subject can focus upon the entire history of the human species, incorporating the work of biologists, anthropologists, archaeologists and the misleadingly entitled ‘prehistorians’ (who study pre-literate societies), while for yet other purposes, the human past can be integrated into the history of the Earth and, indeed, the Cosmos. See e.g. D. Christian, Maps of Time: An Introduction to Big History (Berkeley, CA, 2004).


3 For an introduction, see the long list of secondary authorities cited there.


Monthly Blog 61


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

Is it a fact or a factoid? There are lots of those impostors around. Historical films perpetrate new examples daily and the web circulates them with impartial zeal. Items of information that can be verified and cross-checked with reference to other sources count as facts. But even apparently well-established truths can turn out to be no more than factoids. That useful noun was coined in 1973 by Norman Mailer when writing about Marilyn Monroe, about whom myths and legends still gather.1

Norman Mailer (1923-2007) – maverick American author who experimented with creative literature, confessional writing, journalism, biography and non-fiction.

A factoid is an item of information, which has gained by frequent repetition a fact-like status, even though it is actually erroneous. It may have been spawned by an outright invention or, more subtly, grown by an accretion of myth and repetition. So factoids are like lies or untruths, but they are not necessarily circulated as knowingly false. Instead these purported facts tend to be recycled again and again as non-controversial data that ‘everyone knows’. Thus factoids convey culturally-embedded information which people would like to be true or feel ought to be true. For that reason, these phoney-facts are hard to kill. And, even when slain, they may well rise and circulate again.

Eighteenth-century English history, like most periods, has generated some notable factoids of its own. One features the so-called Calendar Riots of September 1752. They have been frequently cited by historians; and one or two experts have even supplied details of their location (for example, in Bristol). I might have mentioned them in print myself, since I used to believe in their historical reality. But I didn’t commit myself publicly. That’s just as well, since there were no riots. It’s true that there was some popular grumbling and discontent in and after September 1752, when the old, lagging Julian calendar (until then standard in England and Wales) was officially jettisoned in favour of catching up with the astronomically more accurate Gregorian calendar (already in use in Scotland and across continental Europe). The gap was eleven days.

It was later myth which turned the grumbles of 1752 into riots. In one of his election satires, Hogarth included, casually amongst the chaos, an opposition poster demanding: ‘Give us our Eleven Days’. It was an irresistible formula: belligerent but anguished. Such an attitude matched what later generations rather snobbishly considered would be the ‘natural’ response of the uneducated masses to such calendrical reforms. Expanded into ‘Give us Back our Eleven Days!’, the phrase still has resonance: we have been robbed of our time. Moreover, once embroidered into a story of riots in the early nineteenth century, the tale gained weight and encrusted detail with continued retellings.2

William Hogarth’s satirical Election Entertainment (1755) shows a captured placard against calendar reform, casually discarded underfoot.

Yet, in reality, the masses in England and Wales proved quite capable of adapting to the change, which had parliamentary authority, trading convenience, congruence with Scotland, and scientific time-measurement on its side. As part of the reform process, 1 January was adopted as the start of the official year, instead of the old choice of 25 March (the quarterly Lady Day). But, in a nod to continuity, the inauguration of the tax year was left unchanged, although updated by eleven days from 25 March to 5 April (whence a later adjustment moved it to 6 April, where it still remains today). Thenceforth, England and Wales adhered without difficulty to the Gregorian calendar which, once synchronised across greater Europe, continued its long journey to becoming today’s global standard. The story is interesting enough without the addition of factoids. Instead, the significant fact is that the riotous English population did not riot upon this occasion.3

Another factoid features prominently in an oversimplified version of the history of English women before women’s liberation. It is an example which is fuelled by righteous indignation against men. Or rather, not against men individually, but against the traditional legal position of men vis-à-vis women. It takes the form of the bald assertion that husbands ‘owned’ their wives, under common law. According to this factoid history, married women were considered as legally on a par with domestic ‘chattels’ or household goods; they were thus the property of their husbands; in effect, legally slaves. But not so.

Certainly, the independence of a married woman was legally circumscribed. Hence the eighteenth-century joke that the only truly happy female state was to be a wealthy widow. As the cynical thief-taker Peachum explains to his daughter Polly in John Gay’s Beggar’s Opera (1728): ‘The comfortable Estate of Widow-hood, is the only Hope that keeps up a Wife’s Spirits’.4

All the same, married women were not legally defined as property, capable of being bought and sold. Instead, after marriage, the legal identity of a woman (with the exception of a Queen reigning in her own right) was merged with that of her husband. Under the common law of ‘coverture’, they were one person. It was a legal fiction, which meant that a husband could not sue or be sued by his wife (though they still had to behave lawfully to one another). The law of ‘coverture’ also meant that they shared their assets and debts, unless they had some separate pre-nuptial agreement (as a considerable number of women did). Both partners, in theory at least, gained a helpmeet and the social status that came with matrimony.

Regency print of The Proposal.

Needless to say, in practice there were plenty of provisos. Personalities always affected the de facto balance of power within a marriage. Friends, families and servants could keep an unofficial lookout to ward against unacceptable individual behaviour. Some women also had separate pre-nuptial financial arrangements, leaving them in charge of their own money.5 And a number of married businesswomen traded in their own right, if necessary going to the equity Court of Chancery to provide a way round the rigidities of common law.6 The doctrine of matrimonial unity was potent but remained a legal fiction, not a universal fact.

German print showing A Matrimonial Scene (1849)

Publicly and legally, the cards always remained stacked in the husbands’ favour. To make the legal fiction work, entrenched custom dictated that it was the male who acted on behalf of the couple. Hence the tongue-in-cheek dictum attributed to many a proudly married man: ‘My wife and I are one – and I am he’.7

Given this inequity at the heart of marriage according to traditional common law, there was a very good case for the legal liberation of married women, which happened piecemeal in the course of the nineteenth century.8  But the case didn’t and doesn’t need the support of a clunking factoid. Married women were not disposable property. Their plight was compared with that of slaves by some feminist reformers. That’s more or less understandable as campaign rhetoric, even if it significantly underplays the sufferings of slaves. But the factoid should not be mistaken for fact.

Real reforms are made more difficult if the target is misrepresented. Let’s keep an eye out for pseudo-history and reject it whenever possible.9 We don’t want to fetishise ‘facts and facts alone’, since much knowledge depends upon evaluating ideas/theories/experience/analysis/assumptions/intuitions/propositions/opinions/debates/probabilities/possibilities and all the evidence which lies between certainty and uncertainty.10 Yet, given all those complexities, we don’t need factoids muddying the water as well.

1 N. Mailer, Marilyn: A Biography (New York, 1973). The term is sometimes also used, chiefly in the USA, to refer to a trivial fact or ‘factlet’.

2 Historian Robert Poole provides an admirable analysis in R. Poole, Time’s Alteration: Calendar Reform in Early Modern England (1998), esp. pp. 1-18, 159-78; and idem, ‘“Give Us our Eleven Days!” Calendar Reform in Eighteenth-Century England’, Past & Present, 149 (1995), pp. 95-139.

3 See variously E.P. Thompson, ‘The Moral Economy of the English Crowd in the Eighteenth Century’, in his Customs in Common (1991), pp. 185-258, and ‘The Moral Economy Reviewed’, in ibid., pp. 259-351; J. Stevenson, Popular Disturbances in England, 1700-1870 (1979); A. Randall and A. Charlesworth (eds), Markets, Market Culture and Popular Protests in Eighteenth-Century Britain and Ireland (Liverpool, 1996); R.B. Shoemaker, The London Mob: Violence and Disorder in Eighteenth-Century England (2004); and J. Bohstedt, The Politics of Provision: Food Riots, Moral Economy and Market Transition in England, c.1550-1850 (Aldershot, 2010).

4 J. Gay, The Beggar’s Opera (1728), Act 1, sc. 10.

5 A.L. Erickson, Women and Property in Early Modern England (1993).

6 N. Phillips, Women in Business, 1700-1850 (Woodbridge, 2006).

7 E.O. Hellerstein, L.P. Hume and K.M. Offen (eds), Victorian Women: A Documentary Account of Women’s Lives in Nineteenth-Century England, France, and the United States (Stanford, Calif., 1981), Part 2, section 33, pp. 161-6: ‘“My Wife and I are One, and I am He”: The Laws and Rituals of Marriage’.

8 M.L. Shanley, Feminism, Marriage, and Law in Victorian England, 1850-95 (Princeton, 1989); A. Chernock, Men and the Making of Modern British Feminism (Stanford, Calif., 2010).

9 That’s why it’s good that these days freelance websites regularly highlight inaccuracies, omissions and inventions in historical films, before new factoids gain currency.

10 For opposition to the tyranny of facts, see Dickens’s critique of Mr Gradgrind in Hard Times (1854); L. Hudson, The Cult of the Fact (1972). With thanks to Tom Barney for a good conversation on this theme at the recent West London Local History Conference.


Monthly Blog 52



If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

The present ‘Temporal Turn’ in ideas and politics means reminding everyone, including all government policy-makers, that everything unfolds in historical context.1  There’s never a tabula rasa – a blank page on which to inscribe the future. The present comes from the past, and legacies from the past are all around us, let alone within us.

Well, that seems obvious enough. Yet insisting that we all have to look to history doesn’t advance things very far, especially since these days historians are (rightly) not giving out easy messages. It’s much easier to say that things are complex than to provide one-word answers.

Above all, historians collectively are not saying (as many Victorians did): be optimistic, Progress will win through. Partly that’s because it’s not clear exactly what constitutes historical improvement. When the supersonic Concorde first buzzed the skies over London, Paris, New York and Washington in the 1970s, protesters were firmly told off, with the snappy dictum: ‘You can’t stop Progress’. Yet, thirty years on, it’s Concorde that has gone; and it’s the urban protesters over aircraft noise who are slowly winning the battle to get the aviation industry to produce quieter planes. A different sort of Progress, it could be argued. But in the 1970s it was far from clear which version was going to succeed.

It’s a pertinent reminder that technology, which is often cited as the driver of historical change, does not hold all the trump cards. Innovations have to fit in with what humans collectively will accept, even though it may take time, and argument, for that decision to become apparent. So, no simple Progress. At best – or worst – a struggle or friction between conflicting interests. It’s what Marxists and Hegelians would call an example of dialectical contradiction in operation.
Image/1: Concorde – Was it Progress?

It flew elegantly – and faster than the speed of sound – in commercial service from 1976 to 2003.

But it was super-noisy when heard from below; it did not cater for mass transport; and, by the end, its own operational systems were becoming technologically outdated.

Likewise, historians today don’t generally tell the world that ‘it’s all really the Class Struggle’ (though some still do). Or ‘it’s all really the hand of God’ (though some, not usually professional historians, still do). Or ‘it’s all really biological/gender or racial or national destiny’ (ditto).

Instead, the mainstream messages about long-term history are complex, which reflects reality. Indeed, there is an in-built tendency towards finding complexity in professional research: the more one looks, the more one finds. That can be helpful. When talking about some historically-derived situation, the remark ‘Ah well, it’s all very complex!’ can certainly be a good first inoculation against over-simplified nostrums.

On the other hand, historians should be able to say more than that. The art of research is not only to find complexity but also to explain it. Hence if fascinating historical studies offer intricate detail but no overview in conclusion, readers are entitled to feel frustrated.

Sad to say, one erudite and fascinating study of three seventeenth-century women falls into that category. Natalie Zemon Davis’s Women on the Margins (1997)3 starts inventively with an imaginary conversation between the protagonists, who never met and knew nothing of each other. They are a Catholic, a Protestant, a Jew – and they don’t want to appear in the same book together. Yet Zemon Davis overrides their (imagined) objection. For her, there is evident analytical interest in studying their very different lives in conjunction. Yet, in her conclusion, she expressly declines to locate these case-studies within any wider history of women. Why not? Who could do that better than Zemon Davis? And if she won’t say, what are readers to conclude? That these micro-histories are individually fascinating but collectively meaningless?

Certainly, their stories are not uncomplicated tales of female advancement. But readers would surely welcome an assessment of the changing long-term balance between constraints and opportunities for women – a seventeenth-century dialectic which has hardly ceased in the twenty-first century.

When opening a discussion of these issues, one good exercise is to ask people to explore their own implicit assumptions. If you had to draw the shape of history as a diagram, what image would you draw? The outcome then requires discussion – and gives scope for people to have second, maybe deeper, thoughts.

When I ask my MA students to undertake this exercise – putting pen to blank paper and letting inspiration flow – they usually respond with bafflement, plus exasperation. One of them told me crossly: ‘I just don’t think like that, Penny’. In response, I urged: ‘Try’. A small minority (these days) draw a line, sometimes pointing upwards or downwards. They may explain their choices either as an expression of faith in Progress, in a distinctly Victorian style, or of deep-grained ecological pessimism. Another minority, rather more fatalistically, declare the answer to be a circle: ‘what goes round comes round’. Such images lead to fruitful discussions of the pros and cons of linear and cyclical views of history.4

But the majority (these days) scribble a confused mass, like a tangled ball of wool, and explain their choice with comments like: ‘Oh, it’s all a mess’. ‘It’s chaos’. ‘There’s no pattern to it’. ‘It’s too complex to explain’. ‘Unexpected things happen’. ‘Contingency rules’. ‘It’s just one accident after another’.
Image/2: Not progressive order but chaotic disorder.


The only Concorde crash, just outside Paris (July 2000), following accidental damage to the plane from debris on the runway.

Very shortly after this photo was taken, 113 people died, 109 airborne and 4 on the ground.

If testimony were needed to confirm the current absence of agreed Grand Narratives recounting the long-term course of history, then these responses would provide it. And they lead to good discussions, once these answers are further explored. Sometimes, the advocates of chaotic randomness are very firm in their views. Their arguments may verge upon the notorious Time-heresy: that Time itself lacks all continuity and that each one moment (however brief) is sundered from the following moment.5 At that point, I usually reply: ‘Well, if that’s the case, I won’t bother to mark your essays carefully. I’ll throw them into the air and those settling at the top of the heap will get top marks, and those at the bottom will be failed.’ To a man and woman, the students chorus: ‘But, Penny, that’s unfair’. So there is enough through-time coherence and order in the world to encourage people to expect a just assessment of their earlier efforts at some subsequent date.

In fact, those who see history as messy chaos don’t usually mean that there are absolutely no continuities or holding systems which operate through Time. But they do mean that things are so messy that they cannot be reduced to simplicity (except insofar as stating that ‘It’s all chaos’ is in itself a simple answer).

So we are back to encouraging historians, and all others interested in the long term, not just to report but to explain the complications. These are likely to feature an ever-changing mix not only of different forms of change and competing trends, but also deep continuities. As the physicist Stephen Hawking approvingly predicted in 2000: ‘The next [twenty-first] century will be the century of complexity’.6 For historians, the old simplicities of linear or cyclical history may have been outgrown. Yet the Temporal Turn commands us not only to engage in the study of the past (which stretches up to the present moment) but also to explain to the wider world its underlying logic. It’s a big challenge.

1 On the Temporal Turn, see P.J. Corfield, ‘What on Earth is the Temporal Turn and Why is it Happening Now?’ BLOG/49 (Jan. 2015) and idem, ‘What does the Temporal Turn mean in Practice – for Historians and Non-Historians Alike?’ BLOG/50 (Feb. 2015).

2 Following its first flight in 1969, the supersonic Concorde was used in commercial service from 1976 to 2003.

3 N. Zemon Davis, Women on the Margins: Three Seventeenth-Century Lives (Cambridge, Mass., 1997).

4 For an indication of the many possibilities, see E. Zerubavel, Time Maps: Collective Memory and the Social Shape of the Past (Chicago, 2003); and for linear and cyclical histories, see P.J. Corfield, Time and the Shape of History (2007), pp. 49-56, 80-8.

5 See, for example, a publication with an aptly fin-de-millennium title, J. Barbour, The End of Time: The Next Revolution in our Understanding of the Universe (1999).

6 S.W. Hawking, ‘“Unified Theory” is Getting Closer, Hawking Predicts’, interview in San Jose Mercury News (23 Jan. 2000), p. 29A, quoted in A. Sengupta (ed.), Chaos, Nonlinearity, Complexity: The Dynamical Paradigm of Nature (Berlin, 2006), p. vii. See also M. Gell-Mann, The Quark and the Jaguar: Adventures in the Simple and the Complex (New York, 1994).


Monthly Blog 51


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

The senior American policy-maker who claimed in 2004 that ‘When we act, we create our own reality’1 proved to be dangerously wrong – in Iraq as elsewhere across the world. Instead, it is history which provides the past and present reality. Hence the need to understand everything in its full historical context. That’s what the Temporal Turn2 really means: a turn to Time, whose effects are studied by historians, alongside the practitioners of many other long-span subjects, like archaeologists, astrophysicists, biologists, climatologists, geologists, or zoologists.

Paying fresh attention to Time calls for greater changes in the mind-set of non-historians than it does for historians. For us, it’s axiomatic that History deals with the very long term. But for other disciplines, it means making a fresh effort to ‘think long’. To reflect that the current parameters of your discipline may not remain the same for ever. To become aware of change and historical context, as an integral component, not just an add-on extra. But also to be aware of deep continuities, which may not be amenable to policies of instant reformation. Thus the Temporal Turn will encourage an intellectual shift in many disciplines across the board in the Arts, Social Sciences and Sciences, just as the Linguistic (or structural) Turn, as announced by Richard Rorty in 1967,3  affected philosophy (his prime target) as well as anthropology, social studies, theology, ethics, literary studies, and even, to an extent, history.4

Historians are debating quite what the Temporal Turn means for them too. Crusading zeal on behalf of the discipline, as expressed in the recent History Manifesto, makes for good copy and rousing appeals. Thus Jo Guldi and David Armitage end their polemical tract with a Marxist echo: ‘Historians of the world unite! There is a world to win – before it’s too late’.5 Yet some of the early responses from fellow historians are unexcited. In effect, they are saying that public history has already arrived: ‘We do this already’. In particular, Deborah Cohen and Peter Mandler criticise The History Manifesto for being wrong both in its descriptions and its prescriptions: ‘Historians aren’t soldiers, they don’t fight on a single front, and … they certainly don’t need to be led in one direction’.6 Cohen and Mandler specifically dislike Guldi and Armitage’s hopes that public policy debates can be resolved, or at the very least enlightened, by using ‘big data’, derived from massive long-span historical databases. Instead, they stress creative diversity within the discipline.

Who is right in this disagreement? In one sense, Cohen and Mandler are sure to be correct, in that historians can’t be told what to do and how to do it. Their subject is already hugely diversified; and, unlike many academic subjects, it overlaps with a huge semi- and non-academic world of freelance historians and do-it-yourself amateurs. This massive collective project, which has been developed over centuries, is not for speedy turning.

Clio, Goddess of History, c1770:

in Portland stone roundel (32in diameter), from Plas Llangoedmor, Cardigan, Wales.


On the other hand, The History Manifesto is importantly right in its general message, even if not necessarily in its specific preferences. It is one sign among many of the intellectual shift towards long-term analysis and away from short-termism. Urgent contemporary issues – like the search for long-term economic growth, or the challenge of resisting/coping with climate change – have long-term roots and demand a long-span historical perspective in response. Historians should be primed and ready to contribute. Indeed, more. Where necessary, historians themselves should be recasting the debates and the big questions.

That contribution can be done on the strength of insights and analysis from micro-history as well as from macro-history. The Temporal Turn does not mean that everyone must study millennia. There are virtues in short-term probes and in long-span narratives – and in the many way-stations in between. The length of periods studied should be dictated only by the research questions in play, as mediated by the source materials available.

Nonetheless, historians of all stripes should be ready to explain or at least to speculate on the bigger picture(s) revealed by their research. When asked something sweeping, it’s not enough to reply: ‘I’m sorry. It’s not my period’. Who other than historians are better placed to comment on historical trends? And there are plenty of ways in which attention to the diachronic can be strengthened in current History research and teaching – of which more in a future BLOG.

Chinese figurine of Shou Lao or ‘Old Longevity’, representing the power of Time.

Since he carries the scroll which records everyone’s date of death, his good favour is auspicious.


Immediately, three longitudinal insights from History are worth highlighting. (1) Covert change: there are aspects of behaviour which people often consider to be a permanent part of the human condition, but which may not really be so. (2) Covert continuity: there are big crises and upheavals in history which people often think of as ‘changing everything’, but which don’t necessarily do so. And, as a result, (3) change over time is much more than a simple binary process. People often entertain very schematic ideas of the past: before a certain date, everyone did X, whereas after that time, no-one did. In fact, there are multiple turning points, not always in synchronisation.

Long-term change can be insidious and gradual as well as turbulent and rapid. It is halted by continuity and yet hastened by revolutions. History is interestingly complex – but not inexplicable. Ask the historians; and, historians, tell the world.

1 Attributed to Karl Rove, George Bush’s Deputy Chief of Staff (2004-7). See M. Danner, ‘Words in a Time of War: On Rhetoric, Truth and Power’, in A. Szántó (ed.), What Orwell Didn’t Know: Propaganda and the New Face of American Politics (New York, 2007), p. 17.

2 See PJC, ‘What on Earth is the Temporal Turn and Why is it Happening Now?’ Monthly BLOG/49 (Jan. 2015), for which see

3 R. Rorty (ed.), The Linguistic Turn: Recent Essays in Philosophical Method (Chicago, 1967).

4 G.M. Spiegel, Practising History: New Directions in Historical Writing after the Linguistic Turn (New York, 2005).

5 J. Guldi and D. Armitage, The History Manifesto (Cambridge, 2014), p. 126.

6 D. Cohen and P. Mandler, ‘The History Manifesto: A Critique’, American Historical Review, opening paragraph

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 50 please click here


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

The ‘temporal turn’ is a grand phrase to name the current political and intellectual return to interpreting things explicitly within the very long term, otherwise known as history. It’s a new trend, which is gathering pace – and it’s an excellent one too. The name is borrowed from a phrase popularised by the American philosopher Richard Rorty in 1967. He then wrote of the ‘linguistic turn’ in twentieth-century philosophy, when fresh attention was paid to language as a factor significantly influencing or even determining meanings, rather than just conveying thought.1

Since then, an array of other analytical ‘turns’ have been announced. But none have had the same resonance – until now. The serious study of history and historical trends had not, of course, disappeared. So the ‘temporal turn’ is not news to historians. But let’s hope that it becomes a confirmed and sustained development. The ‘linguistic turn’ certainly had its merits. Much was learned about the power of language to frame and convey meaning at any given point in time. Yet the ‘linguistic turn’ was eventually overdone. Analysis of the synchronic moment was excessively privileged over the study of long-term (diachronic) history.

Such an outcome, however, proved detrimental to both perspectives, which are intertwined: ‘The synchronic is always in the diachronic’, just as ‘the diachronic is always in the synchronic’. Life is not composed just of self-contained instantaneous moments; they are linked together. Just as well-functioning gears mesh smoothly in the present in synchro-mesh, so the past meshes seamlessly with the present and future in diachro-mesh. As a result, it’s really not possible to divorce analytically ‘after’ from ‘before’. While some elements of the past can be properly defined as dead and gone, plenty of others persist through time.

One example of lengthy but not eternal continuity is the human genome. It’s analysed by geneticists as composed of three billion chemical bases of DNA (deoxyribonucleic acid), which contain the biological instructions to make a human. As such, the human genome frames our collective and individual genetic make-up, providing a core pattern plus individual variability. And its longevity is matched with that of our species.

Living History: The Human Genome – DNA split.


Recalling the genome’s long past and immediate present provides a reminder that studying the past (whether via biology or history or any other longitudinal subject) does not require a dualistic choice between either change or continuity. They are intertwined, like History and Geography, or Time and Space.

So the ‘temporal turn’ is welcome. And there are multiplying signs of its arrival, across many disciplines. Within the study of history, micro-histories are being balanced by new macro-histories. And the macro- can be very elongated indeed. Some diachronic studies now start with the origins of human society, others with the origins of Planet Earth, while others start with the origins of the cosmos.3

In practice, it’s far from easy to research and to teach on such a wide canvas, but the International Association of Big History (founded in 2010) advises and encourages practitioners. Some of us (myself included) laugh slightly at its terminology, which has a hint of Toad of Toad Hall: ‘My history is bigger than your history’. Other terms of art are ‘Deep Time’ or, with thanks to Fernand Braudel, the ‘longue durée’. But the name is not the most important point. The history of the long-term is indeed big; and it’s good that it’s returning to a range of new agendas, in everything from zoology to art.4

Lastly, why is this trend happening now? There are three big reasons, which, separately, would have had great impact – and in conjunction are commanding. But it took a combination of macro-crises to overcome ‘presentism’ and the quest for instant gratification, which is strongly entrenched in consumer culture. Nonetheless, external circumstances are forcing a rethink. One inescapable factor is climate change, especially in the context of demographic pressure and ecological degradation. This great topic for our time requires an understanding of past and present science, future prognostications, and current politics. Historians can contribute by studying how past communities have coped with ecological changes, both for good and for ill.5 Accordingly, David Armitage and Jo Guldi have just produced a stirring trumpet-blast, calling for historians to be included in all long-term planning teams organised by governments and international bodies.6

A second factor is the heightened global confrontation over a range of political and religious issues in the twenty-first century. The 2001 attack upon New York’s Twin Towers came as a huge surprise as well as a disaster. It triggered new calls, for eminently practical reasons, to comprehend the roots of conflict and the historic prospects of any countervailing forces of cooperation. Instant power-plays without a diachronic perspective have failed badly. Thus a hubristic assertion in 2004 by a senior American policy-maker that ‘We’re an empire now and, when we act, we create our own reality’, proved to be dangerously wrong.7 History has a habit of biting back – and it is still biting all the protagonists in numerous conflicts around the globe. These all call for diachronic assessment. They haven’t happened out of the blue. And they can’t be addressed cluelessly.

Thirdly, fresh thought is required in response to the unexpected 2008/9 global economic recession, whose ramifications are still unfolding. Knowledge of synchronic structures, networks, and meanings will explain only so much. The origins, treatment and prognosis of the crisis need analysis in long-term context. A sign of the times can be seen in campaigns by some economists and many students to revamp the study of economics. That subject has since the 1970s become highly technocratic, focused upon a neo-classical model, with a strictly quantitative methodology. It might be termed a structuralist or synchronic economics. Yet there are now calls to debate moral values as well as statistical assessments, and to re-incorporate the (wrongly) underrated insights of diachronic economic history.8

So the ‘temporal turn’ is very welcome. It is quietly killing the anti-history philosophy of post-modernism, which flourished in the later twentieth century.9 At last, here is an intellectual trend which historians can welcome wholeheartedly.

1 R. Rorty (ed.), The Linguistic Turn: Recent Essays in Philosophical Method (Chicago, 1967).

2 P.J. Corfield, Time and the Shape of History (London, 2007), p. xv.

3 Examples among a burgeoning field include D. Christian, Maps of Time: An Introduction to Big History (Berkeley, Calif., 2004); and C.S. Brown, Big History: From the Big Bang to the Present (New York, 2007).

4 C. Ross, The Past is the Present; It’s the Future Too: The Temporal Turn in Contemporary Art (London, 2014).

5 See M. Levene et al. (eds), History at the End of the World? History, Climate Change and the Possibility of Closure (Penrith, 2010); or J.L. Brooke, Climate Change and the Course of Global History: A Rough History (New York, 2014).

6 J. Guldi and D. Armitage, The History Manifesto (Cambridge, 2014).

7 Attributed to Karl Rove, George Bush’s Deputy Chief of Staff (2004-7). See M. Danner, ‘Words in a Time of War: On Rhetoric, Truth and Power’, in A. Szántó (ed.), What Orwell Didn’t Know: Propaganda and the New Face of American Politics (New York, 2007), p. 17.

8 See e.g. D. North, The Economic Crisis and the Return of History (Oak Park, Mich., 2011); T. Piketty, Capital in the Twenty-First Century, transl. by A. Goldhammer (Cambridge, Mass., 2014), pp. 31-3, 573-7; J. Madrick, Seven Bad Ideas: How Mainstream Economists have Damaged America and the World (New York, 2014); and students’ calls for reform, led by Manchester University’s Post-Crash Economics Society: see

9 For more, see P.J. Corfield, ‘History and the Temporal Turn: Returning to Causes, Effects and Diachronic Trends’, in J-F. Dunyach (ed.), Périodisations de l’histoire des mondes Britanniques: reflectures critiques (forthcoming Paris, 2015).

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 49 please click here


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

It really wasn’t done – for centuries. Women, respectable women especially, did not speak in public from public platforms. They do sometimes, anachronistically, in period films. So the script-writer of The Duchess (dir: Saul Dibb, 2008) decided that the famous eighteenth-century Duchess of Devonshire (played by Keira Knightley) should indicate her political commitment to the Whig reform cause by speaking at the public hustings for the 1784 Westminster election.

But the scene falls flat as a pancake. That’s no doubt partly because it never happened, giving the script-writer no historical documentation from which to work. The film is good at revealing the extent to which, as an aristocratic woman in the public eye, the Duchess is constrained by her social position. And then suddenly, she appears on a public balcony in her furs and feathers, delivering an impassioned election speech in favour of democracy to the London masses. There’s no sensation. No shock. There’s not even an angry husband, ordering her to desist. [See Fig.1a]

However, the script-writer knows, from evidence discussed in other scenes, that the Duchess was heavily satirised for her political affiliations. In 1784 she undertook the much milder action of canvassing in the Westminster constituency. She was young, charming, rich, high-ranking and a leader of fashion. Yet even she could not get away with it. She was socially pilloried in graphic prints which accused her of lewdly selling kisses to brutish plebeians for votes (see Fig.1b). Not only did the Duchess never venture publicly into politics again, but nor did other high-born ladies. They stuck to behind-the-scenes roles as political hostesses – not without influence, but not in the censorious public eye.

Fig.1a (L) The Duchess of Devonshire as imagined (2008) on the Westminster hustings Fig.1b (R) The Duchess as satirised in 1784 for canvassing the Westminster electors, in a print entitled ‘A New Way to Secure a Majority’

The reasons for this self-effacement were deeply rooted in Christian tradition. Women were seen as domestic helpmeets. They were expected to be modest, docile and, in public, silent. After all, St Paul enjoined that: ‘Let your women keep silence in the churches; for it is not permitted unto them to speak. But they are commanded to be under obedience … And if they will learn anything, let them ask their husbands at home.’1 And he further explained: ‘I suffer not a woman to teach, nor to usurp authority over the man, but to be in silence. For Adam was first formed, then Eve.’2 Christian feminist scholars today debate St Paul’s own personal attitudes. But the point was not so much his original intention as the meanings internalised by his followers over time. Women, formed from ‘Adam’s rib’, were subordinate beings. Like children, they should be ‘seen but not heard’.

This social convention began to dissipate only slowly in the later nineteenth century, with the campaign for the female franchise. As a result, it is hard to find any major speeches by a British woman on a public platform (especially an outdoor public platform) before the twentieth century. Queen Elizabeth I’s speech to her troops at Tilbury docks (August 1588) is the one great exception; and that famous event was legitimated not just by her royal status but by fears of imminent invasion at the time of the Armada.

Of course, there were daring women who did sometimes break with convention. Particularly in times of social tension and political upheaval, there was greater scope for direct action. It was not uncommon for women preachers, often from lower-class backgrounds, to emerge in radical religious movements, such as in the 1640s. If the spirit moved someone to ‘bear witness’, a sincere belief in divine calling could override the Pauline proscription. So early Methodism, which stressed the teachings of the heart, saw many women lay preachers playing an independent role in the 1780s and 1790s.3 One of them was Elizabeth Tomlinson. She became aunt by marriage to the novelist George Eliot, who later drew a highly sympathetic pen-portrait of a Methodist female evangelist in the form of Dinah Morris in Adam Bede (1859). However, the novel ends with Dinah’s withdrawal from public preaching. And the same happened in many real-life cases as nineteenth-century Methodism became more institutionalised and conservative.4

Nonetheless, radical religion and politics remained possible outlets for women speakers. John Wesley himself had expressed the view that treating women only as ‘agreeable playthings’ constituted ‘the deepest unkindness … horrid cruelty … mere Turkish barbarity’.5 By the later nineteenth century, with the spread of literacy and further education, increasing numbers of women began to reject the subordinate role. It was still notable, however, that a number of doughty feminists in the early days of the suffragette campaigns continued to express trepidation at speaking on public platforms. One who had no qualms was Charlotte Despard, shown in Fig.2 addressing a mass meeting in Trafalgar Square. She was, however, an exceptional person, emboldened not only by her Anglo-Irish upper-crust background but also, by the 1930s, by her venerable age, formidable personality and long political experience.6

Charlotte Despard at the age of 89, speaking at an anti-fascist rally in Trafalgar Square, 12 June 1933. Photo: James Jarché. © Daily Herald Archive, 1983-5236/11073

One reason for the continuing trepidation was that the art of public speaking does not depend solely on the nerve of the speaker. Successful oratory depends upon an unstated but very real reciprocity. The audience has to be prepared to listen and to respond. If those present are unwilling, then the result can be anything from hostile shouting, jeers, catcalls, obscenities, the throwing of missiles – or simply turning away. Social conventions, in other words, are policed not so much by law (though it may contribute) but by widely-shared conventional beliefs.

Before the twentieth century, the only example known to me of a real-life young woman who spoke publicly at a political rally occurred at the Norwich Guildhall in 1794. The orator was Amelia Alderson (later Opie), the daughter of a respected local physician and a social star among the radical intelligentsia. Her speech was reported in a private letter by a disapproving (if reluctantly admiring) older female witness, Sarah Scott.7 She herself was the author of Millennium Hall (1762), which advocated an elegant female-only community as a means of helping women to escape from domestic subordination. But even a proto-feminist like Scott disapproved of Alderson’s actions. Hence getting both men and women to accept female public speaking remains essential to achieve equality on the soap-box – and (a long-running good cause still not fully resolved today) in the pulpit. Down with biblical literalism! Speak up, everyone, and listen too!

1 Holy Bible, St Paul 1 Corinthians, 14: 34-35.

2 Holy Bible, 1 Timothy, 2: 12-13.

3 See D. Valenze, Prophetic Sons and Daughters: Female Preaching and Popular Religion in Industrial England (Princeton, 1985).

4 P.J. Corfield, Power and the Professions in Britain, 1700-1850 (1995), pp. 105-8.

5 See John Wesley’s Sermon 98: On Visiting the Sick (1786), sect. III, 7: ‘There is neither male nor female in Christ Jesus’: in

6 For Charlotte Despard, née French (1844-1939), see M. Mulvihill, Charlotte Despard: A Biography (1989).

7 J. Spencer, ‘Introduction’, in Sarah Scott, Millennium Hall (1762), ed. J. Spencer (1986), pp. ix-x, citing R. Blunt (ed.), Mrs Montagu, ‘Queen of the Blues’: Her Letters and Friendships from 1762 to 1800 (1923), Vol. 2, p. 304.

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 47 please click here


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

Not everyone shakes hands. But those who do are expressing an egalitarian relationship. As a form of greeting, the handshake differs completely in meaning from the bow or curtsey, which display deference from the ‘lowly’ to those on ‘high’. In one Jane Austen novel, a fearlessly ‘modern’ young woman extends her hand to a young man at a crowded party. Of course, it is Marianne Dashwood, the embodiment of ‘sensibility’. She has just re-encountered the errant Willoughby, long after he has ended their unofficial courtship. Marianne immediately holds out her hand, claiming him as an intimate friend. But he avoids her gesture. Marianne then exclaims ‘in a voice of the greatest emotion: “Good God! Willoughby, what is the meaning of this? … Will you not shake hands with me?”’. He cannot avoid doing so, but drops her hand quickly. After a few short exchanges, Willoughby then leaves ‘with a slight bow’.2 He has dropped her. Their body language says it all.

There is a particular poignancy in this scene. In this era, men and women who were not related to one another would not ordinarily touch hands as a form of greeting. But, of course, lovers might do so. No wonder that a mere touch was so powerful when it was so rare. (And it retains its appeal today in romantic mythology and countless pop songs: I Wanna hold your Hand!)3  Shakespeare, as ever, had known the scene. Romeo understands the intimacy implied when he takes Juliet’s hand in a dance, as does she: ‘And palm to palm is like holy palmer’s kiss’.4

Even more definitively, a couple would touch hands in a marriage ceremony (even allowing for the many varieties of ritual associated with weddings).5 The wording was clear. ‘Taking someone’s hand in marriage’ is an ultimate symbol of good faith, along with the exchange of rings, which remain visible on the hand. These are public signs of personal commitment. An earlier poetic expression also offered an endgame variant, in the form of a final handshake: Michael Drayton’s Sonnet LXI (1594), which starts ‘Since there’s no help, come let us kiss and part’, invites the parting lovers to ‘shake hands for ever, cancel all our vows’.

At the same time, a close handshake also has a set of commercial connotations. When two traders agree upon a contract, they may indicate the same by a handshake. However unequal they may be in wealth and commercial status, for the purposes of the deal they are equals, both pledging to fulfil the bargain. It constitutes a ‘gentleman’s agreement’ – upheld by personal honour. The same etiquette applies in making a bet.

Hence reneging upon a wager or deal sealed with a personal handshake is viewed as particularly heinous. The loser may even litigate for redress. Today the American Sports World News reports rumours that Charles Wang, the majority owner of the New York Islanders ice-hockey team, is being sued for $10 million by hedge-fund manager Andrew Barroway. Wang’s crime? He had allegedly reneged on a handshake pact to sell his Islanders franchise to Barroway.6

Typically, a handshake is a brief and routine affair, usually but not invariably with the right hand. True, there are variants. The prolonged handshake plus a clasp of the recipient’s upper arm by the shaker’s other hand is a gesture of special warmth – stereotypically undertaken by gregarious American politicians.7

Or there is the Masonic handshake. It gives a secret signal, allowing members of a separate society to identify one another. Apparently, there are many variants of the Masonic handshake, denoting differences in rank within the organisation. That information is rather depressing, since the handshake is, in principle, egalitarian. Nonetheless, it shows the potential for stylistic variation, from the firm muscular grip to the fleeting touch-and-drop.

Variations in styles of shaking hands are here caricatured as two gentlemen almost dance their mutual greetings; from, consulted 11 Oct. 2014.

Gradually, routine British styles of greeting began to incorporate the handshake. It was most common among civilian men of similar middle-class standing. By contrast, the toffs stuck with their traditional bowing and curtseying. Meanwhile, hand-shaking was rare among workers in ‘dirty’ trades and industries, because people in unavoidably grimy jobs usually tried to contain rather than to spread the dirt. The emblem of two clasped hands nonetheless appeared proudly on various trade union banners, as a pledge of solidarity.

The advent of the social handshake was thus not uniform across all periods and classes. But it could be found, between close male friends, in Britain from at least Shakespeare’s time. Yet its subsequent spread has taken a long time a-coming. For example, in 1828 the anonymous author of A Critique of the Follies and Vices of the Age was still expressing displeasure at the new popularity of the handshake, including between men and women.8

One reason for some snobbish hostility, among polite society in Britain, was the association of this custom with the republican USA, where its usage became increasingly common after American independence. There were also connotations of support for the hand-shaking citizens of republican France from 1793 onwards. English visitors to the USA, like the novelist and social commentator Frances Trollope, thus waxed somewhat critical of the local mores. In 1832, she deplored the habit of hand-shaking between both sexes and all classes (albeit excluding the non-free). For her, this form of greeting was too bodily intimate, especially as ‘the near approach of the gentleman [ironically] was always redolent of whiskey and tobacco’.9

Ultimately, however, the snobs were routed. Old-style bowing and curtseying has generally disappeared, although hat wearers may still doff their hats to ladies. However, the twentieth century also produced another twist in the tale. Just as the hand-shake was becoming quite widely adopted in Britain by the 1970s, it was suddenly challenged by a new custom, imported from overseas. It is the continental kiss, in the form of a light clasp of the upper arms and a peck on the cheek (or, for the physically fastidious, an air-kiss). Such a manoeuvre would give good scope to a later Marianne Dashwood, who might grip an errant Willoughby in order to kiss him warmly. Nonetheless, be warned: whatever the greeting style, body language always provides ways of signalling the rejection as well as the offering of friendship.

1  See P.J. Corfield, previous monthly BLOG 45 ‘Doffing One’s Hat’. And for fuller discussion, see PJC, ‘Dress for Deference & Dissent: Hats and the Decline of Hat Honour’, Costume: Journal of the Costume Society, 23 (1989), pp. 64-79; also transl. in K.Gerteis (ed.), Zum Wandel von Zeremoniell und Gesellschaftsritualen: Aufklärung, 6 (1991), pp. 5-18; and posted on PJC personal website as Pdf/8.

2  J. Austen, Sense and Sensibility (1st pub. London, 1811): chapter 28.

3 The Beatles (1963).

4  W. Shakespeare, Romeo and Juliet (written mid 1590s; 1597), Act 1, sc. 5. A palmer was a successful pilgrim, returning from the Holy Land bearing palms as a sign that the journey had been achieved.

5  A traditional ritual of ‘hand-fasting’, announcing a solemn public engagement, has also been updated for use today in pagan marriage ceremonies.

6 Sports World News online, 12 Aug. 2014, at, consulted 11 Oct. 2014.

7  See e.g. John Travolta’s film portrayal of a notably touchy-feely American presidential candidate, based upon Bill Clinton, in Primary Colors (dir. Mike Nichols, 1998).

8  Anon., Something New on Men and Manners: A Critique of the Follies and Vices of the Age … (Hailsham, Sussex, 1828), p. 174.

9 F. Trollope, Domestic Manners of the Americans (1832), ed. R. Mullen (Oxford, 1984), p. 83.

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 46 please click here


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

TV’s Pride and Prejudice (1995) provided many memorable images, not least Colin Firth as Mr Darcy diving into a pool to emerge reborn as a feeling, empathetic human being. This transformation gains extra impact when contrasted with the intense formality of his general deportment. When, after some months of absence, Darcy and Bingley re-enter the Bennet family home at Longbourn, they bow deeply in unison, whilst Mrs Bennet and all her daughters rise as one and bend their heads in synchronised response. Audiences may well sigh, admiringly or critically according to taste. What a contrast with our own casual manners. It satisfies a sense that the past must have been different – like a ‘foreign country’, in a much-cited phrase from L.P. Hartley.1

But did people in Georgian polite society actually greet each other like that on a day-to-day basis? There is good evidence for the required formality (and dullness) of Hanoverian court life on ceremonial occasions. A fashionable ball or high society dinner might also require exceptional courtesies. But ordinary life, even among the elite of Britain’s landed aristocrats and commercial plutocrats, was not lived strictly according to the etiquette books.

Instead, the eighteenth century saw an attenuation of the lavish old-style formalities, which were known as ‘hat honour’. In theory, men meeting their social superiors made a deep bow, removing their headgear with a visible flourish. Gentlemen greeting a ‘lady’ would also remove their hats with a courteous nod. For women, the comparable requirement was the low curtsey from the ‘inferior’ to the ‘superior’. Those who held their heads highest (and kept them hatted, in the case of men) were the more socially elevated, since lowering the head always signalled deference. This understanding underpins the custom of addressing monarchs as ‘Your Highness’.

Illustration 1 ‘The Hopes of the Family’ (1799) shows a young man being interviewed for University admission. A don presides, wearing his mortar board, whilst the nervous applicant and his eager father, an old-fashioned country gentleman, have both doffed their hats, which they carry under their arms. An undergraduate in his gown looks on nonchalantly, his hands in pockets. Yet he too remains bare-headed in the presence of a senior member of his College. Only the applicant’s mother, who is subject to the different rules of etiquette for women, covers her head with a rustic bonnet.


Illus 1: A gentle satire by Henry William Bunbury, entitled

The Hopes of the Family (1799) – © The Wellcome Library.

In accordance with this etiquette, King Charles I on trial before Parliament in 1648 wore a high black hat throughout the proceedings. It was a signal that, as the head of state, he would not uncover for any lower authority. The answer of his republican opponents was radical. Charles I was found guilty of warfare against his own people, as a ‘tyrant, traitor and murderer’. He was decapitated, beheading the old power structure very literally and publicly.

After the Restoration of the monarchy in 1660, there was some return to the old formalities (or at least hopes of the same). For example, in October 1661 the naval official and MP Samuel Pepys recorded his displeasure at what he considered to be the undue pride of his manservant, who kept his hat on in the house.2 Pepys expected deference from his ‘inferiors’, whilst being ready to accord it to his own ‘superiors’. But it was not always easy to judge. In July 1663, Pepys worried that he may have offended the Duke of York, by not uncovering when the two men were walking in sight of each other in St James’s Park.3 It was a tricky decision. Failure to doff one’s hat when close at hand would be rude, yet uncovering from too far away would seem merely servile.

Over the very long term, however, all these formalities began to attenuate. With the advent of brick buildings and roaring coal-fires, the habitual wearing of hats indoors generally disappeared – mob-caps and night-caps excepted. And in public, the old gestures continued but in an attenuated form. With commercial growth came the advent of many people of middling status. It was hard for them to calculate the precise gradations of status between one individual and another. The old-style mannerisms were also too slow for a fast-moving and urbanising world.

As a result, between men the deep bow began to change into a nod of the head. The elaborate flourish of the hat gradually turned into a quick lifting or pulling. And the respectful long tug of the forelock, on the part of those too poor to have any headgear, turned into a briefer touch to the head.4

A notable example of the abbreviation of hat honour was the codification of the military salute. It was impractical for rank-and-file soldiers to remove their headgear whenever encountering their officers. On the other hand, military discipline required the respecting of ranks. The answer was a symbolic gesture. ‘Inferiors’ greeted their ‘superiors’ by touching the hand to the head. Different regiments evolved their own traditions. Only in 1917 (well into World War I) did the British army decide that all salutes should be given right-handedly.

Meanwhile, the female greeting in the form of a low curtsey, holding out the dress, also evolved into a briefer bob or half-curtsey. It was expected from all lower-status women when meeting ‘superiors’. But hat honour was confined to men. On public occasions, women retained their hats, bonnets and feathers. Even in church, they did not copy men in baring their heads but respected St Paul’s Biblical dictum that it was not ‘comely’ for women to pray to God uncovered.5

These etiquette rules delight TV- and film-makers. In reality, however, the conventions were always in evolution. Rules were broken and/or fudged, as well as followed. Moreover, by the later eighteenth century in Britain a new form of interpersonal greeting had arrived: the egalitarian hand-shake. Jane Austen’s characters not only bowed and curtsied to each other. They also, in certain circumstances, shook hands. In one Austen novel, a fearlessly ‘modern’ young woman extends her hand to shake that of a young man at a public assembly. Anyone know the reference? Answer follows in next month’s BLOG on Handshaking.

1 L.P. Hartley, The Go-Between (1953), p. 1: ‘the past is a foreign country – they do things differently there’.

2 R. Latham and W. Matthews (eds), The Diary of Samuel Pepys, Vol. II: 1661 (1970), p. 199.

3 Ibid., Vol. IV: 1663 (1971), p. 252.

4 P.J. Corfield, ‘Dress for Deference & Dissent: Hats and the Decline of Hat Honour’, Costume: Journal of the Costume Society, 23 (1989), pp. 64-79; also transl. in K. Gerteis (ed.), Zum Wandel von Zeremoniell und Gesellschaftsritualen: Aufklärung, 6 (1991), pp. 5-18. Also posted on PJC personal website as Pdf/8.

5 Holy Bible, 1 Corinthians, 11:13.


To download Monthly Blog 45 please click here


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

What does it take for individuals to disappear from recorded history? Most people manage it. How is it done? The first answer is to die young. That deed has been achieved by far too many historic humans, especially in eras of highly infectious diseases. Any death before the age of (say) 21 erases immense quantities of potential ability.

After all, how many child prodigies or Wunderkinder have there been? Very few whose fame has outlasted the immediate fuss in their own day. A number of chess-masters and mathematicians have shown dramatic early abilities. But the prodigy of all prodigies is Wolfgang Amadeus Mozart, who began composing at the age of five and continued prolifically for the remaining thirty years of his life. His music is now more famous and more widely performed than it ever was in his own day. Mozart is, however, very much the exception – and his specialist field, music, is also distinctive in its ability to appeal across time and cultures.

A second way of avoiding the attentions of history is to live and die before the invention of writing. Multiple generations of humans did that, so that all details of their lifestyles, as inferred by archaeologists and palaeo-anthropologists, pertain to the generality rather than to individuals. Oblivion is particularly guaranteed when corpses have been cremated or have been buried in conditions that lead to total decay.

As it happens, a number of frozen, embalmed, or bog-mummified bodies from pre-literate times have survived for many thousands of years. Scholars can then study their way of life and death in unparalleled and fascinating detail. One example is Ötzi the Iceman, found in a high glacier on the Italian/Austrian border in 1991, and now on dignified display in the South Tyrol Museum of Archaeology at Bolzano, Italy. His clothing and weaponry reveal much about the technological abilities of Alpine hunters from over five thousand years ago, just as his bodily remains are informative about his diet, health, death, and genetic inheritance.1 Nonetheless, the world-views of individuals like Ötzi are matters of inference only. And the time-survivors from pre-literate eras are very few.2

Ötzi the Iceman, over 5000 years old but initially thought to be a recent cadaver when discovered in 1991:
now in the South Tyrol Museum of Archaeology, Bolzano, Italy.

The third way of avoiding historical attention is to live a quiet and secluded life, whether willingly or unwillingly. Most people in every generation constitute the rank-and-file of history. Their deeds might well be important, especially collectively. Yet they remain unknown individually. That oblivion applies especially to those who remain illiterate, even if they live in an era when reading and writing are known.

‘Full many a flower is born to blush unseen/ And waste its sweetness on the desert air’, as Thomas Gray put it eloquently in 1751 (in context, talking about humans, not horticulture).3 One might take his elegiac observation to constitute an oblique call for universal education (though he didn’t). Yet even in eras of widening or general literacy, it remains difficult for every viewpoint to be recorded and to survive. In nineteenth-century Britain, when more people than ever were writing personal letters, diaries and autobiographies, those who did so remained a minority. And most of their intimate communications, especially if unpublished, have been lost or destroyed.

Of course, past people were also known by many other forms of surviving evidence. The current vogue in historical studies (in which I participate) is to encourage the analysis of all possible data about as many as possible individuals, whether ‘high’ or ‘lowly’, by making the information available and searchable on-line.4 Nonetheless, historians, however determined and assiduous, cannot recover everybody. Nor can they make all recovered information meaningful. Sometimes past data is too fragmented or cryptic to have great resonance. It can also be difficult to link imperfect items of information together, with attendant risks: on the one hand, of making false linkages and, on the other hand, of missing real ones.

Moreover, there are still many people, even in well documented eras, whose lives left very little evidence. They were the unknowns who, in George Eliot’s much-quoted passage at the end of Middlemarch (1871/2): ‘lived faithfully a hidden life, and rest in unvisited tombs’.5 She did not intend to slight such blushing violets. On the contrary, Eliot hailed their quiet importance. ‘The growing good of the world is partly dependent on unhistoric acts’, she concluded. A realist might add that the same is true of the ‘bad of the world’ too. But again, many lives remain hidden from the historic record, even if the long-term impact of their collective actions and inactions does not.

Finally, there is concealment. Plenty of people then and now have reasons for hiding evidence – for example, pertaining to illegitimacy, adultery, addiction, crime, criminal conviction, or being on the losing side in warfare. And many people will have succeeded, despite the best efforts of subsequent scholar-sleuths. Today, however, those seeking to erase their public footprint face an uphill task. The replicating powers of the electronic media mean that evidence removed from one set of files returns, unbidden, in other versions or lurks in distant master files. ‘Delete’ does not mean absolute deletion.

Concluding the saga of The Mayor of Casterbridge (1886), the bipolar anti-hero Michael Henchard seeks to become a non-person after his death, leaving a savage will demanding ‘That I be not buried in consecrated ground & That no sexton be asked to toll the bell … & That no flowers be planted on my grave & That no man remember me’.6

Non-Person © (2014)

Today: yes, people can still be forgotten; or even fall through the administrative cracks and become a non-person. But to disappear from the record entirely is far from easy. Future historians of on-line societies are going to face the problems not of evidential dearth but of massive electronic glut. Still, don’t stop writing BLOGs, tweets, texts, emails, letters, books, graffiti. If we can’t disappear from the record, then everyone – whether famous, infamous, or unknown – can take action and ‘bear witness’.

1 For Ötzi, see:

2 See P.V. Glob, The Bog People: Iron-Age Man Preserved, transl. R. Bruce-Mitford (London, 1969); D.R. Brothwell, The Bog Man and the Archaeology of People (London, 1986).

3 T. Gray, ‘Elegy Written in a Country Churchyard’ (1751), lines 55-6.

4 See e.g. Proceedings of the Old Bailey Online, 1674-1913; London Lives, 1690-1800; Clergy of the Church of England Database, 1540-1835; and London Electoral History, 1700-1850.

5 G. Eliot [Mary Ann Evans], Middlemarch: A Study of Provincial Life (1871/2), ed. W.J. Harvey (Harmondsworth, 1969), p. 896.

6 T. Hardy, The Mayor of Casterbridge: The Life and Death of a Man of Character (1886), ed. K. Wilson (London, 2003), p. 321.


To download Monthly Blog 41 please click here


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

What does it take to get a long-surviving reputation? The answer, rather obviously, is somehow to get access to a means of endurance through time. To hitch a lift with history.

People in sports and the performing arts, before the advent of electronic storage/replay media, have an intrinsic problem. Their prowess is known at the time but is notoriously difficult to recapture later. The French actor Sarah Bernhardt (1844-1923), playing Hamlet on stage when she was well into her 70s and sporting an artificial limb after a leg amputation, remains an inspiration for all public performers, whatever their field.1 Yet performance glamour, even in legend, still fades fast.

Bernhardt in the 1880s as a romantic Hamlet

What helps to keep a reputation well burnished is an organisation that outlasts an individual. A memorable preacher like John Wesley, the founder of Methodism, impressed many different audiences, as he spoke at open-air and private meetings across eighteenth-century Britain. Admirers said that his gaze seemed to pick out each person individually. Having heard Wesley in 1739, one John Nelson, who later became a fellow Methodist preacher, recorded that effect: ‘I thought his whole discourse was aimed at me’.2

Yet there were plenty of celebrated preachers in Georgian Britain. What made Wesley’s reputation survive was not only his assiduous self-chronicling, via his journals and letters, but also the new religious organisation that he founded. Of course, the Methodist church was dedicated to spreading his ideas and methods for saving Christian souls, not to the enshrining of the founder’s own reputation. It did, however, forward Wesley’s legacy into successive generations, albeit with various changes over time. Indeed, for true longevity, a religious movement (or a political cause, come to that) has to have permanent values that outlast its own era but equally a capacity for adaptation.

There are some interesting examples of small, often millenarian, cults which survive clandestinely for centuries. England’s Muggletonians, named after the London tailor Lodowicke Muggleton, were a case in point. Originating during the mid-seventeenth-century civil wars, the small Protestant sect never recruited publicly and never grew to any size. But it lasted in secrecy from 1652 to 1979 – a staggering trajectory. It seems that the clue was a shared excitement in cultish secrecy and a sense of special salvation, in the expectation of the imminent end of the world. Muggleton himself was unimportant. And finally the movement’s secret magic failed to remain transmissible.3

In fact, the longer that causes survive, the greater the scope for the imprint of very many different personalities, different social demands, different institutional roles, and diverse, often conflicting, interpretations of the core theology. Throughout these processes, the original founders tend quickly to become ideal-types of mythic status, rather than actual individuals. It is their beliefs and symbolism, rather than their personalities, that live.

As well as beliefs and organisation, another reputation-preserver is the achievement of impressive deeds, whether for good or ill. Notorious and famous people alike often become national or communal myths, adapted by later generations to fit later circumstances. Picking through controversies about the roles of such outstanding figures is part of the work of historians, seeking to offer not anodyne but judicious verdicts on those ‘world-historical individuals’ (to use Hegel’s phrase) whose actions crystallise great historical moments or forces. They embody elements of history larger than themselves.

Hegel himself had witnessed one such giant personality, in the form of the Emperor Napoleon. It was just after the battle of Jena (1806), when the previously feared Prussian army had been routed by the French. The small figure of Napoleon rode past Hegel, who wrote: ‘It is indeed a wonderful sensation to see such an individual, who, concentrated here at a single point, astride a horse, reaches out over the world and masters it’.4

(L) The academic philosopher G.W.F. Hegel (1770-1831) and (R) the man of action, Emperor Napoleon (1769-1821), both present at Jena in October 1806

The means by which Napoleon’s posthumous reputation has survived are interesting in themselves. He did not found a long-lasting dynasty, so neither family piety nor institutionalised authority could help. He was, of course, deposed and exiled, dividing French opinion both then and later. Nonetheless, Napoleon left numerous enduring things, such as codes of law; systems of measurement; structures of government; and many physical monuments. One such was Paris’s Jena Bridge, built to celebrate the victorious battle.

Monuments, if sufficiently durable, can certainly long outlast individuals. They effortlessly bear diachronic witness to fame. Yet, at the same time, monuments can crumble or be destroyed. Or, even if surviving, they can outlast the entire culture that built them. Today a visitor to Egypt may admire the pyramids, without knowing the names of the pharaohs they commemorated, let alone anything specific about them. Shelley caught that aspect of vanished grandeur well, in his poem to the ruined statue of Ozymandias: the quondam ‘king of kings’, lost and unknown in the desert sands.6

So lastly what about words? They can outlast individuals and even cultures, provided that they are kept in a transmissible format. Even lost languages can later be deciphered, although experts have not yet cracked the ancient codes from Harappa in the Punjab.7 Words, especially in printed or nowadays digital format, have immense potential for endurance. Not only are they open to reinterpretation over time but also, via their messages, later generations can commune mentally with earlier ones.

In Jena, the passing Napoleon (then aged 37) was unaware of the watching academic (then aged 36), who was formulating his ideas about revolutionary historical changes through conflict. Yet, through the endurance of his later publications, Hegel, who was unknown in 1806, has now become the second notable personage who was present at the scene. Indeed, via his influence upon Karl Marx, it could even be argued that the German philosopher has become the historically more important figure of those two individuals in Jena on 13 October 1806. On the other hand, Marx’s impact, having been immensely significant in the twentieth century, is also fast fading.

Who from the nineteenth century will be the most famous in another century’s time? Napoleon? Hegel? Marx? (Shelley’s Ozymandias?) Time not only ravages but provides the supreme test.

1  R. Gottlieb, Sarah: The Life of Sarah Bernhardt (New Haven, 2010).

2 R.P. Heitzenrater, ‘John Wesley’s Principles and Practice of Preaching’, Methodist History, 37 (1999), p. 106. See also R. Hattersley, A Brand from the Burning: The Life of John Wesley (London, 2002).

3  W. Lamont, Last Witnesses: The Muggletonian History, 1652-1979 (Aldershot, 2006); C. Hill, B. Reay and W. Lamont, The World of the Muggletonians (London, 1983); E.P. Thompson, Witness against the Beast: William Blake and the Moral Law (Cambridge, 1993).

4 G.W.F. Hegel to F.I. Niethammer, 13 Oct. 1806, in C. Butler (ed.), The Letters: Georg Wilhelm Friedrich Hegel, 1770-1831 (Bloomington, 1984); also transcribed in, 2005.


6  P.B. Shelley (1792-1822), Ozymandias (1818).

7 For debates over the language or communication system in the ancient Indus Valley culture, see:


To download Monthly Blog 40 please click here