
MONTHLY BLOG 130, MEANINGS OF BEING PENELOPE

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2021)

Fig.1 A swatch of weaving,
illustrating the metaphor for History as ‘Penelope’s Web’
being constantly woven and unwoven by Penelope in Greek myth.

It’s a great name, Penelope. English. Greek. And very international. Recognised everywhere. Can be used in long majestic form. Or abbreviated into Penny, Pen, or P. It’s not too commonly used. Yet it’s very far from unknown, either.

In Greek myth, the foundational Penelope is the wife of the travelling Odysseus (Ulysses). She remains at home, weaving and waiting. And rejecting the many suitors for her hand. So the name has connotations of a woman of sexy desirability, who has great patience and perseverance while sticking at her own work, allied to a good knowledge of her own mind, and a degree of cunning in eventually getting what she wants. For me, a most attractive mix.

Perhaps British wives, waiting at home for their husbands to return from the Second World War, had visions of themselves as Penelope? Certainly a considerable number of baby daughters were then given that name. For instance, in 1940 the celebrated actor Penelope Keith was born in Sutton, to the wife of a serving army officer; and in 1946 her fellow actor, the admirable Penelope Wilton, was born in Scarborough, North Yorkshire. Since then, the name has become comparatively less common. The much-lauded Spanish film actor Penelope Cruz (b.1974) is a notable exception. And, of course, there are others, especially in Greece. Nonetheless, when I meet fellow Penelopes these days, there is a strong chance that we will all be post-WW2 baby boomers.

Interestingly, in Britain after the First World War, numerous baby girls were named ‘Irene’ – meaning peace. My mother (b.1919) was one of them. So it obviously seemed natural to her, after yet another grinding war, to reach for an expressive Greek name. During the fighting, she worked on the home front, deciphering captured letters for Military Intelligence, and dodging incendiary bombs on London. But her memories were chiefly of the anxiety of waiting for my father to return from active service in North Africa and Italy. So Penelope!

As a youngster, I was invariably known as Penny – and was happy enough to be teased about turning up like a ‘bad penny’; or, when I was naughty, being called ‘penny dreadful’. Such usages are broadly affectionate. And, with a long name in reserve, I never felt purely defined by the diminutive form.

Moreover, as I began to teach and then to publish, I realised the great advantage of having a public persona, which I can use alongside my private identity. These days I use Penelope daily – and some people address me only by that name. I positively enjoy it, though I would not have done when younger.

Furthermore, there is one metaphorical usage, which I do especially relish. The term ‘Penelope’s web’ refers originally to the shroud that the mythic Penelope weaves daily and unpicks secretly by night – thereby delaying a decision as to which of her suitors to choose. (They were not very bright and failed to see through her ruse, which she sustained for years). Penelope’s web can therefore simply refer to a major work which is always in progress and never done. (Ouch! Too many authors know that syndrome). Yet it is also used metaphorically for global history. That is a colossal work, which is always in progress, always being unpicked by critical historians, and then rewoven by others. As one of that tribe, I am proud to contribute to Penelope’s web.

By the way, I don’t feel any proprietorial interest over any other aspects of the mythology, though I admire both the academic deliberations1 and the contemporary retellings.2 Did Penelope secretly have sex with all 108 of the suitors, giving birth to an illegitimate son Pan? (as some versions suggest). I don’t know and don’t mind one way or the other. Did Penelope look on with blood-thirsty glee when Odysseus/Ulysses returned and slaughtered all the importunate suitors and her twelve loyal handmaids as well?3 I never knew about such details as a child, so had no idea that there were moral complexities in the story (as in global history, of course). To me, Penelope was/is simply a name of serenity and potency.

But I did discover, with time, one complexity of my own. From childhood, I was trained to write my short name as ‘Pene’: literally one half of Penelope. I view ‘Penny’ as a close variant, but not actually referring to me. However, then I met some Spaniards. They were highly excited to meet a woman named ‘Penis’. For a while, I simply laughed. After all, plenty of men manage with the penile nick-names: ‘Dick’, ‘John Thomas’, or ‘Johnson’, without exciting wild mirth. However, in my case the cross-gender dimension seemed to be too much. Soon I got bored with the kerfuffle, especially as my range of international contacts grew. Now I try to keep ‘Pene’ strictly for use between very old friends and family. I sign emails with the initial: P. And to the wider world, I’m very happily known as Penelope – a lovely Greek name with hidden depths.

ENDNOTES:

1 See e.g. M.A. Katz, Penelope’s Renown: Meaning and Indeterminacy in the Odyssey (Princeton, NJ, 1991); M. Janda, Odysseus und Penelope: Mythos und Namen (Innsbruck, 2015).

2 See esp. M. Atwood, The Penelopiad (2007).

3 Christopher Rush’s novel Penelope’s Web (Edinburgh, 2015) confronts the dramas and moral dilemmas both of her husband’s twenty-year absence and of his homecoming.

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 130 please click here

MONTHLY BLOG 127, World citizens in the twenty-first century are generating an ‘international sphere’ of public opinion, outside and beyond the control of national governments.

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2021)

Fig.1 Globe in Speech Bubble by Moilleadóir (2009):
from WikiMedia Commons
https://commons.wikimedia.org/wiki/File:WiktLogo-Bubble-WikiGlobe-red-1-.svg

There is today a growing international sphere of public opinion. It stretches well outside and beyond the control of national governments. It is purely informal; often fragmented; and lacking direct power. Nonetheless it is an identifiable liberal trend in world history – which is causing particular anxieties for repressive states. As a result, there are also hostile forces, working against the emergent international sphere. Yet the global advance of mass literacy since c.1800 is laying the foundation (in 2015, 86% of adults across the world were able to read and write);1 the diffusion of print continues to fan the fire; and the advent of personal computing, plus especially the invention of the world-wide web in 1989, has thrown (metaphorically) petrol on the blaze.

Not all, but many citizens are now sharing and debating ideas world-wide. The numbers participating are likely to grow. And, in time, the strength of global public opinion, when united, will increasingly influence governments. To take one example, there may well be international people-power calling for faster action to cope with climate change. Of course, global public opinion will not always agree – any more than does public opinion within any nation-state. But debates are part and parcel of all civic life. In other words, it’s better to have people arguing and voting rather than fighting and killing.

This collective arena has recently been identified as a ‘global civil order’.2 And others detect the operation of an ‘international sphere’.3 That latter terminology is a verbal adaptation from an earlier usage, popularised by the German social philosopher, Jürgen Habermas.4 Writing of western Europe in the eighteenth century, he identified the advent of a new ‘public sphere’ or civic arena, which he contrasted with the ‘private sphere’ of the domestic household. Details of his interpretation are disputed. The two spheres were not as separate and self-contained as Habermas assumed. And his dichotomy between the supposedly ‘male’ and ‘bourgeois’ civic sphere and the supposedly ‘female’ household was not nearly as clear cut either.5

Nonetheless, an adapted version of overlapping, rather than separate, spheres is a helpful one. In the course of the eighteenth century, an increasingly literate population across Britain joined in debating ideas and ideologies in books, newspapers, homes, schools, theatres, market-places, coffee-houses, and debating chambers – all the way from private societies to national legislatures.6 And today the debates are taking place not only in household, local and national spheres but also internationally. There is no need to choose between one civic forum and another: they interconnect and overlap. Individuals can thus share interests not only locally but also with others across Planet Earth.

One criticism of this emergent trend was voiced in Britain in 2016 by the then Conservative premier Theresa May. Those individuals who view themselves as ‘citizens of the world’ are really, she claimed, ‘citizens of nowhere’. She further implied that the would-be internationalists were talking just to other international elites, and were betraying their fellow citizens ‘who live down the road’.7 Some cheered. But many, including some of her fellow Conservatives, rebuked her myopia. People should be praised, not blamed, for taking seriously their responsibilities to the global community that lives on Planet Earth. Today, that point is being underlined, more emphatically than ever, by the Covid pandemic and by galloping climate change.

At this point, it’s worth stressing that the emergent international sphere is not in itself hostile to the world’s governments in general (even if specific governments may be strongly opposed). On the contrary, the global exchange of ideas and opinions depends upon a degree of international order. Chronic armed conflict between rival nations clearly does not promote reasoned discourse.

So the achievements of national governments, from the early twentieth century onwards, have been vital in establishing an institutional framework for international cooperation.8 It doesn’t always work. Crucially, however, this framework does exist. Key bodies include: the League of Nations (founded 1920), followed by the United Nations (1945); Interpol (1923); the World Bank (1944); the World Health Organisation (1948); the General Agreement on Tariffs and Trade (GATT: 1948), followed by the World Trade Organisation (1995);9 the Geneva Conventions on the conduct of warfare (1949); the International Telecommunication Union (1965); the Comprehensive Test Ban Treaty (1996); and, not least, the International Criminal Court (1998). Support for such initiatives came from national populations who backed governments in thinking internationally; and these changes in turn encouraged further international thinking among ordinary citizens.

All the ensuing non-governmental global conversations are thoroughly diverse. Some are initiated by individual activists. The role of Greta Thunberg, the youthful Swedish environmentalist, is one remarkable case in point, as she tours the world to highlight the need for urgent action on climate change.10

At the same time, many non-governmental links are sustained by an immense number of global organisations.11 Sporting associations had practical reasons for collating their rules. Leading the way in 1881 was the International Gymnastics Federation. Another leader was the Fédération Internationale de Football Association (FIFA; founded 1904). Other groups which think globally include the churches; trade unions; professions; academics; librarians; scientists; doctors; and many specialist occupational groups, such as investment bankers. All these, and many others, run international organisations. One venerable and still thriving body is Apimondia, founded by the world’s bee-keepers in 1897.12

There are also numerous international aid or development agencies (some with government funding; many without). These bodies indicate that the charitable impulse, found within most countries, is now being energetically applied world-wide.13 Significantly, too, global lobbying on contentious global issues has grown ever more vigorous. Founded in 2007, Avaaz, an American non-profit web-based organisation, rallies international support to advance a liberal-left (non-ideological) agenda, opposing climate change, corruption, poverty, and conflict – and supporting human rights and animal rights.14 By contrast, some international networks deliberately operate on the dark side: those of criminals, money-launderers and people-traffickers being prime cases.15 Unsurprisingly, these people do not contribute to the global discourse, but are instead the subject of earnest international debate, in the difficult quest to curb them.

Another admirable set of organisations are devoted to literary and cultural matters. One congenial case is the Robert Burns World Federation, founded in 1885. Run by enthusiasts, it is a charity that promotes and celebrates Scotland’s most famous poet and song-writer. And it provides organisational links for a world-wide network of Burns Clubs (numbering over 250 in 2013).16 The fact that this Federation has now flourished for well over a century is impressive.

Robert Burns has also proved to be a song-writer for the world. In 1788, he wrote Auld Lang Syne, celebrating friendship and remembrance. Set to a traditional Scottish tune, the song has now been translated into at least 41 languages. Not only is it sung at private parties, but it is regularly performed in many countries at graduations, passing-out army parades, and festivities at the turn of the Old Year/New Year.17 It has thus become the world’s most frequently sung song, giving the international sphere an unofficial anthem. (‘We’ll drink a cup of kindness then/ For the sake of auld lang syne’). Once on a visit in Japan, I gave an ad hoc rendering, only to be asked by my audience, with pleased surprise, how I knew this traditional Japanese song so well.18

These internationalist thoughts have been triggered by my participation in the International Society for Eighteenth-Century Studies/ Société internationale d’étude du dix-huitième siècle, of which I am currently President.19 This body, founded in 1963, is now nearing its 60th anniversary. It is run on a shoe-string, without any institutional backing, and has 35 affiliated national and regional societies (some more active than others). Together, its membership may be viewed as an update of the eighteenth-century scholars’ ecumenical Republic of Letters.20 And today the Society proudly contributes to the international sphere.

ENDNOTES:

1 See variously D. Vincent, The Rise of Mass Literacy: Reading and Writing in Modern Europe (Cambridge, 2020); M. Roser and E. Ortiz-Ospina, Literacy (2013) in website: https://ourworldindata.org/literacy.

2 See D. Laqua, W. Van Acker and C. Verbruggen (eds), International Associations and Global Civil Society: Histories of the Union of International Associations (2019).

3 See two recent book titles: B. Winter and L. Sorbera, Contending Legitimacy in World Politics: The State, Civil Society and the International Sphere in Twenty-First Century Politics (2018); and C.R. Alexander, Frontiers of Public Diplomacy: Hegemony, Morality and Power in the International Sphere (2021).

4 See J. Habermas, Strukturwandel der Bürgerlichen Öffentlichkeit (1963), in 4th edn. (Neuwied, 1969), transl. as The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society (Cambridge, Mass., 1989), p. 40.

5 For one pertinent critique among many, see J.A. Downie, ‘The Myth of the Bourgeois Public Sphere’, in C. Wall (ed.), A Concise Companion to the Restoration and Eighteenth Century (Oxford, 2004), pp. 58-79.

6 See e.g. H. Barker, Newspapers, Politics and Public Opinion in Late Eighteenth-Century Britain (Oxford, 1998); H. Kerr, D. Lemmings and R. Phiddian, Passions, Sympathy and Print Culture: Public Opinion and Emotional Authenticity in Eighteenth-Century Britain (Basingstoke, 2015); and M. Ellis (ed.), Eighteenth-Century Coffee-House Culture, Vols. 1-4 (2017).

7 For the full text of Theresa May’s speech to Conservative Party Conference on 5 October 2016, see The Spectator: https://www.spectator.co.uk/article/full-text-theresa-may-s-conference-speech.

8 See e.g. I. Trauschweizer, Temple of Peace: International Cooperation and Stability since 1945 (Athens, Ohio, 2021); and meditations on future prospects by D.R. Kelley, Understanding a Changing World: The Alternative Futures of the International System (Lanham, Md, 2021).

9 B. Spiesshofer, Responsible Enterprise: The Emergence of a Global Economic Order (Munich and Oxford, 2018).

10 See A. Chapman, Greta Thunberg and the Climate Crisis (2020), and a detailed summary, covering her achievements, her school-fellow colleagues, and her critics, in: https://en.wikipedia.org/wiki/Greta_Thunberg.

11 Listed in Laqua, Van Acker and Verbruggen (eds), International Associations, as cited above, n.2.

12 See https://www.apimondia.com/en/the-federation/history.

13 See S. Harland, D. Griffiths, and L. Walker (eds), The International Development Directory (2001); and Directory of International Development and Relief Agencies (2021), in https://www.guidestar.org/NonprofitDirectory.aspx?cat=6&subcat=32&p=8.

14 For details, see https://secure.avaaz.org.

15 See e.g. D.R. Liddick, The Global Underworld: Transnational Crime and the United States (2004); and M. Glenny, McMafia: A Journey through the Global Criminal Underworld (Toronto, 2009).

16 For further information, see http://www.rbwf.org.uk.

17 See https://en.wikipedia.org/wiki/Auld_Lang_Syne.

18 Translated as 蛍の光 / Hotaru no Hikari.

19 See the ISECS/SIEDS website, hosted by the University of Trois Rivières, Canada: https://oraprdnt.uqtr.uquebec.ca/pls/public/gscw031?owa_no_site=304&owa_no_fiche=11.

20 Among a large literature, see D. Goodman, The Republic of Letters: A Cultural History of the French Enlightenment (1994); A. Goldgar, Impolite Learning: Conduct and Community in the Republic of Letters, 1680–1750 (1995); G. Ostrander, Republic of Letters: The American Intellectual Community, 1776–1865 (Madison, Wis., 1999); J. Israel, Radical Enlightenment: Philosophy and the Making of Modernity, 1650–1750 (Oxford, 2001); S. Dalton, Engendering the Republic of Letters: Reconnecting Public and Private Spheres in Eighteenth-Century Europe (2003); and A. Lilti, The World of the Salons: Sociability and Worldliness in Eighteenth-Century Paris (Oxford 2015).


MONTHLY BLOG 124, BATTERSEA’S FEMALE PIONEERS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2021)


In mid-February 2021, Battersea’s Labour MP Marsha de Cordova set a good challenge to me and to my friend and fellow historian of Battersea, Jeanne Rathbone.1 We were asked to nominate 31 pioneering women with a connection to the area. No problem. And then to write Twitter-length summaries of their achievements, especially in the local context. Trickier, as many of these women had richly multi-faceted lives. Plus, trickiest of all, to find authenticated photos of them all.

One case was extreme. The philanthropist Mrs Theodore Russell Monroe followed the reticent Victorian custom of using in public not only her husband’s surname but his given name as well. A quick search on Google for ‘Russell Monroe’ provided lots of information about the 1953 film Gentlemen Prefer Blondes, starring Jane Russell and Marilyn Monroe. But absolutely nothing about the laudable woman who in 1896 funded Battersea Hospital as a headquarters of the Anti-Vivisection movement.2 As a result, we cite Mrs Theodore Russell Monroe in the Victorian style to which she was accustomed. And, without dates or photo, she remains a monument of self-effacement.

Once we had (largely) met the good challenge, the 31 names and short citations were published, day by day throughout March 2021, on Marsha de Cordova’s web platform.3 It constituted her salutation, on behalf of Battersea, to National Women’s History Month.

Interestingly but not surprisingly, very few of these women were actually born within the area itself. But Battersea, like the surrounding greater London, has always attracted incomers to share its jostling mix of wealth and poverty. One who not only made that move but wrote eloquently about it was the author Nell Dunn. In 1959 she moved from ‘posh’ Chelsea to ‘plebeian’ north Battersea; and her prize-winning Up the Junction (1963; filmed 1968) won applause for its mix of gritty realism with warm cross-class sympathy.

As a further celebration of International Women’s Day on 8 March, Marsha de Cordova also hosted a well-attended (virtual) public meeting. It provided an excellent chance to take stock of what has changed – and to highlight what changes are still needed. It takes collective as well as individual action to improve the lot of women. And – needless to say – we cannot assume that all changes will automatically be progressive ones. History does reveal the existence of some world-wide and long-term trends (such as the spread of mass literacy). Yet, on the way, there are always fluctuations and sometimes outright backlashes and reversals. So women need continually to work together – and with men – to keep momentum for the right sort of changes.

By way of introducing the 8 March meeting, I organised presentations of five individual Battersea women’s lives.4 They were specifically chosen to show the range of fields open to female endeavour: politics and protest; aviation and technology; sports; literature; entertainment. Some of these areas were more traditional to women. Others, such as aviation and marathon-running, less so. The point for these women, all associated with Battersea, was either to open new doors – or to push further through doors that were already opened. It’s not career novelty per se which was required – but confidence and staying power.

So who were the five exemplary women, emerging in successive generations? One was the long-lived and remarkable Charlotte Despard (1844-1939).5 She was a leading socialist reformer, suffragette campaigner, pacifist, supporter of Irish independence, and (in her later years) advocate of Russian communism. Never elected to parliament, she began her public career funding and personally running welfare projects in the industrial slums of Battersea’s Nine Elms. Gradually, she became a notable public figure, unworried as to whether she was in or out of political fashion. Among other things, she became a powerful stump orator, regularly addressing large outdoor meetings in an era when it was still rare for women to make public speeches.6 Above all, Despard developed her own philosophy of non-violent protest. And she influenced the young Mahatma Gandhi, who met Despard on a visit to London in summer 1914 and was highly impressed. ‘She is a wonderful woman’, he wrote.

In the following generation, Hilda Hewlett (1864-1943) took women into the skies.7 She became fascinated by flight. She rejected the view, held by many men, that ‘the fair sex’ did not have ‘the right kind of nerve’ for aviation. Hewlett was the first British woman to get a pilot’s licence; the first to open (with a partner) a flying school; and the first to open (with the same partner) a factory to manufacture aircraft. (Many were used in the First World War). This venture was initially located in north Battersea, where there was a large skilled industrial workforce on hand. Hewlett was not only a force for change in her own right, but she opened doors for others too. Thus she trained not only young men but also young women in the skills of aviation and engineering. She was clear that new technology should empower all.

Overcoming obstacles by direct action was also the modus operandi of Violet Piercy (1889-1972).8 She proved to be a natural athlete. Yet she was constrained by traditional taboos about women in competitive sport. So Piercy began to run unofficial marathons, in a very public style. In 1926, she ran from Windsor Castle to Battersea Town Hall, close to her home. Eventually, in 1936 she was allowed to run an official marathon route but not as part of the male racing pack. Her ‘record’ stood for decades, until women were allowed freely into all competitive sports. Piercy’s aim was simple: ‘I did it to prove that a woman’s stamina can be just as remarkable as a man’s’. And through the efforts of pioneers like her, the barriers to women in sport were one by one overthrown.

Penelope Fitzgerald (1916-2000)9 wrote lovely, lyrical, downbeat novels. She had an often hand-to-mouth downbeat life, far from what she might have initially expected from her affluent, well-educated family background. And her novels’ themes were often downbeat too. Her experiences showed that adversity could strike anyone. Her family, in straitened circumstances, moved frequently, living in cheap lodgings in Battersea and for a while on a houseboat, moored in the Thames. (It sank, twice). Her most famous novel Offshore (1979) offered a wry literary evocation of the riverside community. Yet Fitzgerald found in writing a means of escaping – or transcending – her own woes; and her ultimate message was that people must hold on firmly to life, whatever happens.

Another exceptional woman was Elsa Lanchester (1902-86), who became a star of stage, TV and film.10 She rose from an unusual Bohemian left-wing childhood in south London, including Battersea, to have an international career. And she died in Hollywood. However, while she was praised for her humour and her versatility, she never had a break-through to film greatness. Instead, she was best known for her marriage to an undeniable star actor, Charles Laughton. They were a ‘celebrity couple’, in the public eye. But Lanchester firmly refused to answer any intimate enquiries. Their private life remained private. Laughton’s experiences as a gay or bisexual man were part of the coming world of gender/sexual flexibility. Amidst the glitz and speculation, Lanchester was staunch and dignified. She was a working woman and made her own way.

If these Battersea pioneers were to translate their experiences into mottoes for the early twenty-first century, what would they say? The following suggestions are improvised from their lives and recorded words.

Charlotte Despard would urge: ‘Fight – peacefully – against life’s injustices – and just don’t stop!’ [Note the adverb: ‘peacefully’]. Hilda Hewlett would add practical encouragement: ‘Plan well before you start your projects – but, after that, the sky’s the limit’. Violet Piercy would agree. ‘Women: just get out there and show the world what we can do’. And she too would add: ‘Don’t ever give up! Keep right on to the end of the road’. Meanwhile, Penelope Fitzgerald might well think: ‘It’s not always that easy’. But if pressed, she’d state firmly: ‘Even in adversity, find courage!’ And Elsa Lanchester would advise women to find both a public face and an inner self-confidence: ‘Chin up! … Smile for the cameras … And be proud to be yourself’. Confident individuals and groups then make confident movements.

ENDNOTES:

1 See J. Rathbone, Twenty Inspiring Battersea Women (in preparation 2021); with warm thanks to Jeanne for generously sharing her research.

2 There is scope for a good history of the Battersea General Hospital (closed 1972) and a skilled researcher should be able to find more details about the Hospital’s first funder.

3 In a late reshuffle of which I was unaware, a change was made to the list to insert ‘Penny Corfield, historian’. I remain shell-shocked. Most names on the list are historical figures, since time allows scope for proper critical distance. However, I thank Marsha de Cordova and her team for the huge compliment.

4 These were: Charlotte Despard, presented by Penelope Corfield; Hilda Hewlett, presented by Jeanne Rathbone; Violet Piercy, presented by Sonya Davis; Penelope Fitzgerald, presented by Carole Maddern; and Elsa Lanchester, presented by Su Elliott.

5 For Charlotte Despard, née French (1844-1939), see M. Mulvihill, Charlotte Despard: A Biography (1989); and PJC, ‘Why is the Remarkable Charlotte Despard Not Better Known?’, BLOG/97 (Jan. 2019); also available in PJC website https://www.penelopejcorfield.com/global-themes/gender-history/4.3.5.

6 PJC, ‘Women and Public Speaking: And Why It Has Taken So Long to Get There’, Monthly BLOG/47 (Nov. 2014); also in PJC website, as above 4.3.2.

7 For Hilda Hewlett, née Herbert (1864-1943), see G. Hewlett, The Old Bird: The Irrepressible Mrs Hewlett (Leicester, 2010).

8 For Violet Piercy (1889-1972), see https://en.wikipedia.org/wiki/Violet_Piercy; and context in J. Hargreaves, Sporting Females: Critical Issues in the History and Sociology of Women’s Sports (1993).

9 For Penelope Fitzgerald, née Knox (1916-2000), see H. Lee, Penelope Fitzgerald: A Life (2013); and C.J. Knight, Penelope Fitzgerald and the Consolations of Fiction (2016).

10 Two indispensable sources are E. Lanchester, Charles Laughton and I (San Diego, 1938); and idem, Elsa Lanchester Herself (New York, 1984), while there remains scope for a thoughtful biography. See also C. Higham, Charles Laughton: An Intimate Biography (1976), with introduction by E. Lanchester.


MONTHLY BLOG 113, LIGHT FROM THE LAMP OF EXPERIENCE

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2020)

Fig.1, A hand-held eighteenth-century lantern, its lighted candle providing an immediate pool of light.

‘The Lamp of Experience’ is a marvellous phrase. A lantern throws light. It does not insist dogmatically but instead conveys sufficient illumination for good judgment. ‘Experience’ is also a vital component of the phrase. It implies not just a list of facts from history but also the capacity to cogitate about past events and to learn from them. Moreover, experience can be gleaned not just from each individual’s personal life but from the collective experiences of humanity as a whole.

During the current pandemic, for example, people can learn instructive lessons from comparable past global disasters. Factual histories provide suggestive evidence of what was done, what was not done, and what could have been done better.1 And imaginative literature allows people to share the range of subjective emotions and reactions which may be triggered by great and unexpected disasters.2 It allows for a sort of mental rehearsal. Needless to say, imaginative fiction is not written primarily for utilitarian purposes. And far from all happenings that can be conjectured will actually transpire. (Time Travel provides a pertinent example). Nonetheless, imaginative literature, even when imagining things that are technically impossible, contributes to the stock of human creativity. And thoughts and dreams, as much as deeds and misdeeds, all form part of the human experience.

There is additionally a pleasant irony in on-line references to ‘the Lamp of Experience’. Various web-lists of famous quotations attribute the dictum to Edward Gibbon (1737-94), Britain’s nonpareil historian. The full statement runs as follows: ‘I have but one lamp by which my feet are guided, and that is the lamp of experience. I know of no way of judging the future but by the past’. But that formulation does not accord with Gibbon’s impersonally magisterial and often ironic style. The words are spikier, and more personalised.

In fact, their true author is also credited on the web; and maybe with time the accurate citations will crowd out the error. True, the general observation does not lose its force by being misattributed. Yet credit should go where credit is due. The reference was first made in a celebrated speech by a Virginian planter-turned-lawyer, named Patrick Henry (1736-99).3 He was an exact contemporary of Gibbon. But they differed in their politics. Henry was an American critic of British rule. In 1765, he used his knowledge of legal precedents to argue that the Westminster government’s attempt at imposing the unpopular Stamp Tax upon the American colonists was unconstitutional.4

Lawyers, like historians, were accustomed to weighing and pondering evidence before making judgments. In this case, Henry was using the ‘lamp’ of past experience for radical purposes. His arguments, while rejected by Britain, were popular in the American colonies; and in 1776 Henry became the first Governor of Virginia post-Independence. Manifestly, his appeal to experience had not produced universal agreement. As already noted, studying history provides options, not a universal blueprint for what is to be done.

Fig.2 Engraved portrait of the intent figure of Patrick Henry (1736-99), his eye-glasses pushed up onto his lawyer’s wig: a Virginia planter who turned to law and politics, Patrick Henry served as first and also sixth post-colonial Governor of the State of Virginia.

What, then, is the appeal and power of the past? The truth is that Henry’s dictum, while evocative, does not go nearly far enough. Experience/history provides much, much more than a pool of light. It provides the entire bedrock of existence. Everything comes from the past. Everyone learns from the past. The cosmos, global biology, languages, thought-systems, the stock of knowledge, diseases, human existence … all arrive in the present from the past.5

All that is because Time is unidirectional. Humans live in the present but have to rely upon the collective databank of past human experience. That great resource is not just a lamp, sending out a single beam. Instead, collective experience provides the entire context and content of surviving successfully in Time. All humans, as living histories, are part of the process, and contribute their personal quota. The better, fuller and more accurate is that collective knowledge, the better the long-term prospects for the species.

Humans in history are restless problem creators. Yet they are also impressive problem solvers. It’s time, not just for renewed human escape from an obvious viral danger, but equally for urgent collective action to halt, and where possible to reverse, the accelerating environmental degradation, which is damaging the global climate and global biodiversity – let alone the global habitat of humans.

Now needed – not just a Lamp but a mental Sunburst, drawing upon experience and transmuting into sustained action. Stirring times! What comes from the past will have a mighty effect on the future. And decisions taken in the present contribute crucially too.
1 See e.g. M. Honigsbaum, A History of the Great Influenza Pandemics: Death, Panic and Hysteria, 1830-1920 (2013; pbk 2020).

2 D. Defoe, A Journal of the Plague Year (1722; and many later edns); A. Camus, La Peste (Paris, 1947), in Eng. transl. by S. Gilbert as The Plague (1960).

3 P. Henry, ‘Speech at 2nd Virginia Convention, 23 March 1775’, in L. Copeland and L.W. Lamm (eds), The World’s Great Speeches (New York, 1999), pp. 232-3; T.S. Kidd, Patrick Henry: First among Patriots (New York, 2011).

4 P.D.G. Thomas, British Politics and the Stamp Act Crisis: The First Phase of the American Revolution, 1763-9 (Oxford, 1975); E.S. and H.M. Morgan, The Stamp Act Crisis: Prologue to Revolution (1974; 1995).

5 P.J. Corfield, ‘All People are Living Histories’ (2007), available on PJC website www.penelopejcorfield.co.uk/essaysonwhatishistory/pdf1

To download Monthly Blog 113 please click here

MONTHLY BLOG 103, WHO KNOWS THESE HISTORY GRADUATES BEFORE THE CAMERAS AND MIKES IN TODAY’S MASS MEDIA?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Image © Shutterstock 178056255

Responding to the often-asked question, ‘What do History graduates do?’, I usually reply, truthfully, that they gain employment in an immense range of occupations. But this time I’ve decided to name a popular field and to cite some high-profile cases, to give specificity to my answer. The context is the labour-intensive world of the mass media. It is no surprise to find that numerous History graduates find jobs in TV and radio. They are familiar with a big subject of universal interest – the human past – which contains something for all audiences. They are simultaneously trained to digest large amounts of disparate information and ideas, before welding them into a show of coherence. And they have specialist expertise in ‘thinking long’. That hallmark perspective buffers them against undue deference to the latest fads or fashions – and indeed buffers them against the slings and arrows of both fame and adversity.

In practice, most History graduates in the mass media start and remain behind-the-scenes. They flourish as managers, programme commissioners, and producers, generally far from the fickle bright lights of public fame. Collectively, they help to steer the evolution of a fast-changing industry, which wields great cultural clout.1

There’s no one single route into such careers, just as there’s no one ‘standard’ career pattern once there. It’s a highly competitive world. And often, in terms of personpower, a rather traditionalist one. Hence there are current efforts by UK regulators to encourage a wider diversity in terms of ethnic and gender recruiting.2 Much depends upon personal initiative, perseverance, and a willingness to start at comparatively lowly levels, generally behind the scenes. It often helps as well to have some hands-on experience – whether in student or community journalism; in film or video; or in creative applications of new social media. But already-know-it-all recruits are not as welcome as those ready and willing to learn on the job.

Generally, there’s a huge surplus of would-be recruits over the number of jobs available. It’s not uncommon for History students (and no doubt many others) to dream, rather hazily, of doing something visibly ‘big’ on TV or radio. However, front-line media jobs in the public eye are much more difficult than they might seem. They require a temperament that is at once super-alert, good-humoured, sensitive to others, and quick to respond to immediate issues – and yet is simultaneously cool under fire, not easily sidetracked, not easily hoodwinked, and implacably immune from displays of personal pique and ego-grandstanding. Not an everyday combination.

It’s also essential for media stars to have a thick skin to cope with criticism. The immediacy of TV and radio creates the illusion that individual broadcasters are personally ‘known’ to the public, who therefore feel free to commend/challenge/complain with unbuttoned intensity.

Those impressive History graduates who appear regularly before the cameras and mikes are therefore a distinctly rare breed.3 (The discussion here refers to media presenters in regular employment, not to the small number of academic stars who script and present programmes while retaining full-time academic jobs – who constitute a different sort of rare breed).

Celebrated exemplars among History graduates include the TV news journalists and media personalities Kirsty Wark (b.1955) and Laura Kuenssberg (b.1976), who are both graduates of Edinburgh University. Both have had public accolades – Wark was elected as Fellow of the Royal Society of Edinburgh in 2017 – and both face much criticism. Kuenssberg in particular, as the BBC’s first woman political editor, is walking her way warily but effectively through the Gothic-melodrama-cum-Greek-tragedy-cum-high-farce, known as Brexit.

In a different sector of the media world, the polymathic TV and radio presenter, actor, film critic and chat-show host Jonathan Ross (b.1960) is another History graduate. He began his media career young, as a child in a TV advertisement for a breakfast cereal. (His mother, an actor, put him forward for the role). Then, having studied Modern European History at London University’s School of Slavonic & Eastern European Studies, Ross worked as a TV programme researcher behind the scenes, before eventually fronting the shows. Among his varied output, he’s written a book entitled Why Do I Say These Things? (2008). This title for his stream of reminiscences highlights the tensions involved in being a ‘media personality’. On the one hand, there’s the need to keep stoking the fires of fame; but, on the other, there’s an ever-present risk of going too far and alienating public opinion.

Similar tensions accompany the careers of two further History graduates, who are famed as sports journalists. The strain of never making a public slip must be enormous. John Inverdale (b.1957), a Southampton History graduate, and Nicky Campbell (b.1961), ditto from Aberdeen, have to cope not only with the immediacy of the sporting moment but also with the passion of the fans. Over the years, Inverdale racked up a number of gaffes. Some were unfortunate. None fatal. Nonetheless, readers of the Daily Telegraph in August 2016 were asked rhetorically, and obviously inaccurately: ‘Why Does Everyone Hate John Inverdale?’4 That sort of over-the-top response indicates the pressures of life in the public eye.

Alongside his career in media, meanwhile, Nicky Campbell used his research skills to study the story of his own adoption. His book Blue-Eyed Son (2011)5 sensitively traced his extended family roots among both Protestant and Catholic communities in Ireland. His current role as a patron of the British Association for Adoption and Fostering welds this personal experience into a public role.

The final exemplar cited here is one of the most notable pioneers among women TV broadcasters. Baroness Joan Bakewell (b.1933) has had what she describes as a ‘rackety’ career. She studied first Economics and then History at Cambridge. After that, she experienced periods of considerable TV fame followed by the complete reverse, in her ‘wilderness years’.6 Yet her media skills, her stubborn persistence, and her resistance to being publicly patronised for her good looks in the 1960s, have given Bakewell media longevity. She is not afraid of voicing her views, for example in 2008 criticising the absence of older women on British TV. In her own maturity, she can now enjoy media profiles such as that in 2019 which explains: ‘Why We Love Joan Bakewell’.7 No doubt, she takes the commendations with the same pinch of salt as she took being written off in her ‘wilderness years’.

Bakewell is also known as an author; and for her commitment to civic engagement. In 2011 she was elevated to the House of Lords as a Labour peer. And in 2014 she became President of Birkbeck College, London. In that capacity, she stresses the value – indeed the necessity – of studying History. Her public lecture on the importance of this subject urged, in timely fashion, that: ‘The spirit of enquiring, of evidence-based analysis, is demanding to be heard.’8

What do these History graduates in front of the cameras and mikes have in common? Their multifarious roles as journalists, presenters and cultural lodestars indicate that there’s no straightforward pathway to media success. These multi-skilled individuals work hard for their fame and fortunes, concealing the slog behind an outer show of relaxed affability. They’ve also learned to live with the relentless public eagerness to enquire into every aspect of their lives, from health to salaries, and then to criticise the same. Yet it may be speculated that their early immersion in the study of History has stood them in good stead. As already noted, they are trained in ‘thinking long’. And they are using that great art to ‘play things long’ in career terms as well. In short, History graduates work in a remarkable variety of fields. And, among them, some striking stars appear regularly in every household across the country, courtesy of today’s mass media.

ENDNOTES:

1 O. Bennett, A History of the Mass Media (1987); P.J. Fourie (ed.), Media Studies, Vol. 1: Media History, Media and Society (2nd edn., Cape Town, 2007); G. Rodman, Mass Media in a Changing World: History, Industry, Controversy (New York, 2008).

2 See Ofcom Report on Diversity and Equal Opportunities in Television (2018): https://www.ofcom.org.uk/__data/assets/pdf_file/0021/121683/diversity-in-TV-2018-report.PDF

3 Information from diverse sources, including esp. the invaluable survey by D. Nicholls, The Employment of History Graduates: A Report for the Higher Education Academy … (2005): https://www.heacademy.ac.uk/system/files/resources/employment_of_history_students_0.pdf; and short summary by D. Nicholls, ‘Famous History Graduates’, History Today, 52/8 (2002), pp. 49-51.

4 See https://www.telegraph.co.uk/olympics/2016/08/15/why-does-everyone-hate-john-inverdale?

5 N. Campbell, Blue-Eyed Son: The Story of an Adoption (2011).

6 J. Bakewell, interviewed by S. Moss, in The Guardian, 4 April 2010: https://www.theguardian.com/lifeandstyle/2010/apr/04/joan-bakewell-harold-pinter-crumpet

7 https://www.bbc.co.uk/programmes/articles/1xZlS9nh3fxNMPm5h3DZjhs/why-we-love-joan-bakewell.

8 J. Bakewell, ‘Why History Matters: The Eric Hobsbawm Lecture’ (2014): http://joanbakewell.com/history.html.

To download Monthly Blog 103 please click here

MONTHLY BLOG 101, ARE YOU A LUMPER OR SPLITTER? HOW WELL DO YOU KNOW YOUR OWN CAST OF MIND?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

The terminology, derived from Charles Darwin,1 is hardly elegant. Yet it highlights rival polarities in the intellectual cast of mind. ‘Lumpers’ seek to assemble fragments of knowledge into one big picture, while ‘splitters’ see instead complication upon complication. An earlier permutation of that dichotomy was popularised by Isaiah Berlin. In The Hedgehog and the Fox (1953), he distinguished between brainy foxes, who know many things, and intellectual hedgehogs, who apparently know one big thing.2

Fox from © Clipart 2019; Hedgehog from © GetDrawings.com (2019)

These animalian embodiments of modes of thought are derived from a fragmentary dictum from the classical Greek poet Archilochus; and they remain more fanciful than convincing. It’s not self-evident that a hedgehog’s mentality is really so overwhelmingly single-minded.3 Nor is it clear that the reverse syndrome applies particularly to foxes, which have a reputation for craft and guile.4 To make his point with reference to human thinkers, Berlin instanced the Russian novelist Leo Tolstoy as a classic ‘hedgehog’. Really? The small and prickly hedgehog hardly seems a good proxy for a grandly sweeping thinker like Tolstoy.

Those objections to Berlin’s categories, incidentally, are good examples of hostile ‘splitting’. They quibble and contradict. Sweeping generalisations are rejected. Such objections recall a dictum in a Poul Anderson sci-fi novella, when one character states gravely that: ‘I have yet to see any problem, which, when you looked at it in the right way, did not become still more complicated’.5

Arguments between aggregators/generalisers and disaggregators/sceptics, which occur in many subjects, have been particularly high-profile among historians. The lumping/splitting dichotomy was recycled in 1975 by the American J.H. Hexter.6 He accused the Marxist Christopher Hill not only of ‘lumping’ but, even worse, of deploying historical evidence selectively, to bolster a partisan interpretation. Hill replied relatively tersely.7 He rejected the charge that he did not play fair with the sources. But he proudly accepted that, through his research, he sought to find and explain meanings in history. The polarities of lumping/splitting were plain for all to see.

Historical ‘lumpers’ argue that all analysis depends upon some degree of sorting/processing/generalising, applied to disparate information. Merely itemising date after date, or fact after fact ad infinitum, would not tell anyone anything. On those dreadful occasions when lecturers do actually proceed by listing minute details one by one (for example, going through events year by year), the audience’s frustration very quickly becomes apparent.

So ‘lumpers’ like big broad interpretations. And they tend to write big bold studies, with clear long-term trends. Karl Marx’s panoramic brief survey of world history in nine pages in The Communist Manifesto was a classic piece of ‘lumping’.8 In the twentieth century, the British Marxist historian E.P. Thompson was another ‘lumper’ who sought the big picture, although he could be a combative ‘splitter’ about the faults of others.9

‘Splitters’ conversely point out that, if there were big broad-brush interpretations that were reliably apparent, they would have been discovered and accepted by now. However, the continual debates between historians in every generation indicate that grand generalisations are continually being attacked. The progression of the subject relies upon a healthy dose of disaggregation alongside aggregation. ‘Splitters’ therefore produce accounts of rich detail, complications, diversities, propounding singular rather than universal meanings, and stressing contingency over grand trends.

Sometimes critics of historical generalisations are too angry and acerbic. They can thus appear too negative and destructive. Yet one of the most impressive ‘splitters’ among twentieth-century historians was socially a witty and genial man. Intellectually, however, F.J. ‘Jack’ Fisher was widely feared for his razor-sharp and trenchant demolitions of any given historical analysis. Indeed, his super-critical cast of mind had the effect of limiting his own written output to a handful of brilliant interpretative essays rather than a ‘big book’.10 (Fisher was my research supervisor. His most caustic remark to me came after reading a draft chapter: ‘There is nothing wrong with this, other than a female desire to tell all and an Oxbridge desire to tell it chronologically.’ Ouch! Fisher was not anti-woman, although he was critical of Oxbridge where I’d taken my first degree. But he used this formulation to grab my attention – and it certainly did).

Among research historians today, the temperamental/intellectual cast of mind often inclines them to ‘splitting’, partly because there are many simplistic generalisations about history in public circulation which call out for contradiction or complication. Of course, the precise distribution around the norm remains unknown. These days, I would guesstimate that the profession would divide into roughly 45% ‘lumpers’, seeking big grand overviews, and 55% ‘splitters’, stressing detail, diversity, contingency. The classification, however, does depend partly on the occasion and type of output, since single-person expositions on TV and radio encourage generalisations, while round-tables and panels thrive on disagreement where splitters can come into their own.

Moreover, there are not only personal variations, depending upon circumstance, but also major oscillations in intellectual fashions within the discipline. In the later twentieth century, for example, there was a growing, though not universal, suspicion of so-called Grand Narratives (big through-time interpretations).11 The high tide of the sceptical trend known as ‘revisionism’ challenged many old generalisations and easy assumptions. Revisionists did not constitute one single school of thought. Many did favour conservative interpretations of history, but, as remains apparent today, there was and is more than one form of conservatism. That said, revisionists were generally agreed in rejecting both left-wing Marxist conflict models of revolutionary change via class struggles and liberal Whiggish linear models of evolving Progress via spreading education, constitutional rights and so forth.12

Yet the alignments were never simple (a splitterish comment from myself). Thus J.H. Hexter was a ‘splitter’ when confronting Marxists like Hill. But he was a ‘lumper’ when propounding his own Whig view of history as a process of evolving Freedom. So Hexter’s later strictures on revisionism were as fierce as was his earlier critique of Hill.13

Ideally, most research historians probably seek to find a judicious balance between ‘lumping’/‘splitting’. There is scope both for generalisations and for qualifications. After all, there is diversity within the human experience and within the cosmos. Yet there are also common themes, deep patterns, and detectable trends.

Ultimately, however, the dichotomous choice between either ‘lumping’ or ‘splitting’ is a completely false option, when pursued to its limits. Human thought, in all the disciplines, depends upon a continuous process of building/qualifying/pulling down/rebuilding/requalifying/ and so on, endlessly. With both detailed qualifications and with generalisations. An analysis built upon And+And+And+And+And would become too airy and generalised to have realistic meaning. Just as a formulation based upon But+But+But+But+But would keep negating its own negations. So, yes. Individually, it’s worth thinking about one’s own cast of mind and intellectual inclinations. (I personally enjoy both lumping and splitting, including criticising various outworn terminologies for historical periodisation).14 Furthermore, self-knowledge allows personal scope to make auto-adjustments, if deemed desirable. And then, better still, to weld the best features of ‘lumping’ and ‘splitting’ into original thought. And+But+And+Eureka.

ENDNOTES:

1 Charles Darwin in a letter dated August 1857: ‘It is good to have hair-splitters and lumpers’: see Darwin Correspondence Letter 2130 in https://www.darwinproject.ac.uk/.

2 I. Berlin, The Hedgehog and the Fox: An Essay on Tolstoy’s View of History (1953).

3 For hedgehogs, now an endangered species, see S. Coulthard, The Hedgehog Handbook (2018). If the species were to have one big message for humans today, it would no doubt be: ‘Stop destroying our habitat and support the Hedgehog Preservation Society’.

4 M. Berman, Fox Tales and Folklore (2002).

5 From P. Anderson, Call Me Joe (1957).

6 J.H. Hexter, ‘The Burden of Proof: The Historical Method of Christopher Hill’, Times Literary Supplement, 25 Oct. 1975, repr. in J.H. Hexter, On Historians: Reappraisals of Some of the Makers of Modern History (1979), pp. 227-51.

7 For Hill’s rebuttal, see The Times Literary Supplement, 7 Nov. 1975, p. 1333.

8 K. Marx and F. Engels, The Manifesto of the Communist Party (1848), Section I: ‘Bourgeois and Proletarians’, in D. McLellan (ed.), Karl Marx: Selected Writings (Oxford, 1977), pp. 222-31.

9 Among many overviews, see e.g. C. Efstathiou, E.P. Thompson: A Twentieth-Century Romantic (2015); P.J. Corfield, E.P. Thompson, Historian: An Appreciation (1993; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf45.

10 See P.J. Corfield, F.J. Fisher (1908-88) and the Dialectic of Economic History (1990; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf46.

11 See esp. J-F. Lyotard, The Postmodern Condition: A Report on Knowledge (Paris, 1979; in Eng. transl. 1984), p. 7, which detected ‘an incredulity toward meta-narratives’; and further discussions in G.K. Browning, Lyotard and the End of Grand Narratives (Cardiff, 2000); and A. Munslow, Narrative and History (2018). Earlier Lawrence Stone, a classic historian ‘lumper’, had detected a return to narrative styles of exposition: see L. Stone, ‘The Revival of Narrative: Reflections on a New Old History’, Past & Present, 85 (1979), pp. 3-24. But in this essay Stone was detecting a decline in social-scientific styles of History-writing – not a return to old-style Grand Narratives.

12 Revisionism is sufficiently variegated to have avoided summary within one big study. But different debates are surveyed in L. Labedz (ed.), Revisionism: Essays on the History of Marxist Ideas (1962); J.M. Maddox, Hiroshima in History: The Myths of Revisionism (1974; 2011); L. Brenner, The Iron Wall: Zionist Revisionism from Jabotinsky to Shamir (1984); E. Longley, The Living Stream: Literature and Revisionism in Ireland (Newcastle upon Tyne, 1994); and M. Haynes and J. Wolfreys (eds), History and Revolution: Refuting Revisionism (2007).

13 J.H. Hexter (1910-96) founded in 1986 the Center for the History of Freedom at Washington University, USA, where he was Professor of the History of Freedom, and launched The Making of Modern Freedom series. For his views on revisionism, see J.H. Hexter, ‘Historiographical Perspectives: The Early Stuarts and Parliaments – Old Hat and the Nouvelle Vague’, Parliamentary History, 1 (1982), pp. 181-215; and analysis in W.H. Dray, ‘J.H. Hexter, Neo-Whiggism and Early Stuart Historiography’, History & Theory, 26 (1987), pp. 133-49.

14 See e.g. P.J. Corfield, ‘Primevalism: Saluting a Renamed Prehistory’, in A. Baysal, E.L. Baysal and S. Souvatzi (eds), Time and History in Prehistory (2019), pp. 265-82; and P.J. Corfield, ‘POST-Medievalism/ Modernity/ Postmodernity?’ Rethinking History, 14 (2010), pp. 379-404; also on http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf20.

To download Monthly Blog 101 please click here

MONTHLY BLOG 99, WHY BOTHER TO STUDY THE RULEBOOK?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Joining a public committee of any kind? Before getting enmeshed in the details, I recommend studying the rulebook. Why on earth? Such advice seems arcane, indeed positively nerdy. But I have a good reason for this recommendation. Framework rules are the hallmark of a constitutionalist culture.

Fig.1 The handsome front cover of the first edition of Robert’s Rules of Order (1876): these model rules, based upon the practices of the US Congress, remain widely adopted across the USA, their updating being undertaken by the Robert’s Rules Association, most recently in 2011.

Once, many years ago, I was nominated by the London education authority – then in the form of the Inner London Education Authority or ILEA – onto a charitable trust in Battersea, where I live. I accepted, not with wild enthusiasm, but from a sense of civic duty. The Trust was tiny and then did not have much money. It was rumoured that a former treasurer in the 1930s had absconded with all the spare cash. But anyway in the early 1970s the Trust was pottering along and did not seem likely to be controversial.

My experience as a Trustee was, however, both depressing and frustrating. The Trust was then named Sir Walter St. John’s Trust; and it exists today in an updated and expanded guise as the Sir Walter St. John’s Educational Charity (www.swsjcharity.org.uk). It was founded in 1700 by Battersea’s local Lord of the Manor, after whom it is named. In the 1970s, the Trust didn’t do much business at all. The only recurrent item on the agenda was the question of what to do about a Victorian memorial window which lacked a home. The fate of the Bogle Smith Window (as it was known) had its faintly comic side. Surely somewhere could be found to locate it, within one or other of the two local state-sector grammar schools, for which the Trust was ground landowner? But soon the humour of wasting hours of debate on a homeless window palled.

I also found it irksome to be treated throughout with deep suspicion and resentment by most of my fellow Trustees. They were Old Boys from the two schools in question: Battersea Grammar School and Sir Walter St. John School. All the Trust business was conducted with outward calm. There were no rows between the large majority of Old Boys and the two women appointed by the ILEA. My fellow ILEA-nominee hardly ever attended; and said nothing, when she did. Yet we were treated with an unforgiving hostility, which I found surprising and annoying. A degree of misogyny was not unusual; yet often the stereotypical ‘good old boys’ were personally rather charming to women (‘the ladies, God bless ’em’) even while deploring their intrusion into public business.

But no, these Old Boys were not charming, or even affable. And their hostile attitude was not caused purely by misogyny. It was politics. They hated the Labour-run ILEA and therefore the two ILEA appointees on the Trust. It was a foretaste of arguments to come. By the late 1970s, the Conservatives in London, led by Councillors in Wandsworth (which includes Battersea) were gunning for the ILEA. And in 1990 it was indeed abolished by the Thatcher government.

More than that, the Old Boys on the Trust were ready to fight to prevent their beloved grammar schools from going comprehensive. (And in the event both schools later left the public sector to avoid that ‘fate’). So the Old Boys’ passion for their cause was understandable and, from their point of view, righteous. However, there was no good reason to translate ideological differences into such persistently rude and snubbing behaviour.

Here’s where the rulebook came into play. I was so irked by their attitude – and especially by the behaviour of the Trust’s Chair – that I resolved to nominate an alternative person for his position at the next Annual General Meeting. I wouldn’t have the votes to win; but I could publicly record my disapprobation. The months passed. More than a year passed. I requested to know the date of the Annual General Meeting. To a man, the Old Boys assured me that they never held such things, with something of a lofty laugh and sneer at my naivety. In reply, I argued firmly that all properly constituted civic bodies had to hold such events. They scoffed. ‘Well, please may I see the Trust’s standing orders?’ I requested, in order to check. In united confidence, the Old Boys told me that they had none and needed none. We had reached an impasse.

At this point, the veteran committee clerk, who mainly took no notice of the detailed discussions, began to look a bit anxious. He was evidently stung by the assertion that the Trust operated under no rules. After some wrangling, it was agreed that the clerk should investigate. Had I but known it, I should have cheered, or even jeered: I never saw any of the Old Boys again.

Several weeks after this meeting, I received through the post a copy of the Trust’s Standing Orders. They looked as though they had been typed in the late nineteenth century on an ancient typewriter. Nonetheless, the first point was crystal clear: all members of the Trust should be given a copy of the standing orders upon appointment. I was instantly cheered. But there was more, much more. Of course, there had to be an Annual General Meeting, when the Chair and officers were to be elected. And, prior to that, all members of the Trust had to be validly appointed, via an array of different constitutional mechanisms.

An accompanying letter informed me that the only two members of the Trust who were correctly appointed were the two ILEA nominees. I had more than won my point. It turned out that over the years the Old Boys had devised a system of co-options for membership among friends, which was constitutionally invalid. They were operating as an ad hoc private club, not as a public body. Their positions were automatically terminated; and they never reappeared.

In due course, the vacancies were filled by the various nominating bodies; and the Trust resumed its very minimal amount of business. Later, into the 1980s, the Trust did have some key decisions to make, about the future of the two schools. I heard that its sessions became quite heated politically. That news was not surprising to me, as I already knew how high feelings could run on such issues. These days, the Trust does have funds, from the eventual sale of the schools, and is now an active educational charity.

Personally, I declined to be renominated, once my first term of service on the Trust was done. I had wasted too much time on fruitless and unpleasant meetings. However, I did learn about the importance of the rulebook. Not that I believe in rigid adherence to rules and regulations. Often, there’s an excellent case for flexibility. But the flexibility should operate around a set of framework rules which are generally agreed and upheld between all parties.

Rulebooks are to be found everywhere in public life in constitutionalist societies. Parliaments have their own. Army regiments too. So do professional societies, church associations, trade unions, school boards, and public businesses. And many private clubs and organisations find them equally useful as well. Without a set of agreed conventions for the conduct of business and the constitution of authority, there’s no way of stopping arbitrary decisions – and arbitrary systems can eventually slide into dictatorships.

As it happens, the Old Boys on the Sir Walter St. John Trust were behaving only improperly, not evilly. I always regretted the fact that they simply disappeared from the meetings. They should at least have been thanked for their care for the Bogle Smith Window. And I would have enjoyed the chance to say, mildly but explicitly: ‘I told you so!’

Goodness knows what happened to these men in later years. I guess that they continued to meet as a group of friends, with a great new theme for huffing and puffing at the awfulness of modern womanhood, especially the Labour-voting ones. If they did pause to think, they might have realised that, had they been personally more pleasant to the intruders into their group, then there would have been no immediate challenge to their position. I certainly had no idea that my request to see the standing orders would lead to such an outcome.

Needless to say, the course of history does not hinge upon this story. I personally, however, learned three lasting lessons. Check to see what civic tasks involve before accepting them. Remain personally affable to all with whom you have public dealings, even if you disagree politically. And if you do join a civic organisation, always study the relevant rulebook. ‘I tried to tell them so!’ all those years ago – and I’m doing it now in writing. Moreover, the last of those three points is highly relevant today, when the US President and US Congress are locking horns over the interpretation of the US constitutional rulebook. May the rule of law prevail – and no prizes for guessing which side I think best supports that!

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 99 please click here

MONTHLY BLOG 94, THINKING LONG – STUDYING HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

History is a subject that deals in ‘thinking long’. The human capacity to think beyond the immediate instant is one of our species’ most defining characteristics. Of course, we live in every passing moment. But we also cast our minds, retrospectively and prospectively, along the thought-lines of Time, as we mull over the past and try to anticipate the future. It’s called ‘thinking long’.

Studying History (indicating the field of study with a capital H) is one key way to cultivate this capacity. Broadly speaking, historians focus upon the effects of unfolding Time. In detail, they usually specialise upon some special historical period or theme. Yet everything is potentially open to their investigations.

Sometimes indeed the name of ‘History’ is invoked as if it constitutes an all-seeing recording angel. So a controversial individual in the public eye, fearing that his or her reputation is under a cloud, may proudly assert that ‘History will be my judge’. Quite a few have made such claims. They express a blend of defiance and optimism. Google ‘History will justify me’ and a range of politicians, starting with Fidel Castro in 1963, come into view. However, there’s no guarantee that the long-term verdicts will be kinder than any short-term criticisms.

True, there are individuals whose reputations have risen dramatically over the centuries. The poet, painter and engraver William Blake (1757-1827), virtually unknown in his own lifetime, is a pre-eminent example. Yet the process can happen in reverse. So there are plenty of people, much praised at the start of their careers, whose reputations have subsequently nose-dived and continue that way. For example, some recent British Prime Ministers may fall into that category. Only Time (and the disputatious historians) will tell.

Fig. 1 William Blake’s Recording Angel has about him a faint air of an impish magician as he points to the last judgment. If this task were given to historians, there would be a panel of them, arguing amongst themselves.

In general, needless to say, those studying the subject of History do not define their tasks in such lofty or angelic terms. Their discipline is distinctly terrestrial and Time-bound. It is prone to continual revision and also to protracted debates, which may be renewed across generations. There’s no guarantee of unanimity. One old academic anecdote imagines the departmental head answering the phone with the majestic words: ‘History speaking’.1 These days, however, callers are likely to get no more than a tinny recorded message from a harassed administrator. And academic historians in the UK today are themselves being harried not to announce god-like verdicts but to publish quickly, in order to produce the required number of ‘units of output’ (in the assessors’ unlovely jargon) in a required span of time.

Nonetheless, because the remit of History is potentially so vast, practitioners and students have unlimited choices. As already noted, anything that has happened within unfolding Time is potentially grist to the mill. The subject resembles an exploding galaxy – or, rather, like the cosmos, the sum of many exploding galaxies.

Tempted by that analogy, some practitioners of Big History (a long-span approach to History which means what it says) do take the entire universe as their remit, while others stick merely to the history of Planet Earth.2 Either way, such grand approaches are undeniably exciting. They require historians to incorporate perspectives from a dazzling range of other disciplines (like astro-physics) which also study the fate of the cosmos. Thus Big History is one approach to the subject which very consciously encourages people to ‘think long’. Its analysis needs careful treatment to avoid being too sweeping and too schematic chronologically, as the millennia rush past. But, in conjunction with shorter in-depth studies, Big History gives advanced students a definite sense of temporal sweep.

Meanwhile, it’s also possible to produce longitudinal studies that cover one impersonal theme, without having to embrace everything. Thus there are stimulating general histories of the weather,3 as well as more detailed histories of weather forecasting, and/or of changing human attitudes to weather. Another overarching strand studies the history of all the different branches of knowledge that have been devised by humans. One of my favourites in this genre is entitled: From Five Fingers to Infinity.4 It’s a probing history of mathematics. Expert practitioners in this field usually stress that their subject is entirely ahistorical. Nonetheless, the fascinating evolution of mathematics throughout the human past to become one globally-adopted (non-verbal) language of communication should, in my view, be a theme to be incorporated into all advanced courses. Such a move would encourage debates over past changes and potential future developments too.

Overall, however, the great majority of historians and their courses in History take a closer focus than the entire span of unfolding Time. And it’s right that the subject should combine in-depth studies alongside longitudinal surveys. The conjunction of the two provides a mixture of perspectives that help to render intelligible the human past. Does that latter phrase suffice as a summary definition?5 Most historians would claim to study the human past rather than the entire cosmos.

Yet actually that common phrase does need further refinement. Some aspects of the human past – the evolving human body, for example, or human genetics – are delegated for study to specialist biologists, anatomists, geneticists, and so forth. So it’s clearer to say that most historians focus primarily upon the past of human societies in the round (i.e. including everything from politics to religion, from war to economics, from illness to health, etc., etc.). And that suffices as a definition, provided that insights from adjacent disciplines are freely incorporated into their accounts, wherever relevant. For example, big cross-generational studies by geneticists are throwing dramatic new light upon the history of human migration around the globe and also of intermarriage within the complex range of human species and the so-called separate ‘races’ within them.6 Their evidence amply demonstrates the power of longitudinal studies for unlocking both historical and current trends.

The upshot is that the subject of History can cover everything within the cosmos; that it usually concentrates upon the past of human societies, viewed in the round; and that it encourages the essential human capacity for thinking long. For that reason, it’s a study for everyone. And since all people themselves constitute living histories, they all have a head-start in thinking through Time.7

1 I’ve heard this story recounted of a formidable female Head of History at the former Bedford College, London University; and the joke is also associated with Professor Welch, the unimpressive senior historian in Kingsley Amis’s Lucky Jim: A Novel (1953), although upon a quick rereading today I can’t find the exact reference.

2 For details, see the website of Big History’s international learned society (founded 2010): www.ibhanet.org. My own study of Time and the Shape of History (2007) is another example of Big History, which, however, proceeds not chronologically but thematically.

3 E.g. E. Durschmied, The Weather Factor: How Nature has Changed History (2000); L. Lee, Blame It on the Rain: How the Weather has Changed History (New York, 2009).

4 F.J. Swetz (ed.), From Five Fingers to Infinity: A Journey through the History of Mathematics (Chicago, 1994).

5 For meditations on this theme, see variously E.H. Carr, What is History? (Cambridge, 1961; and many later edns); M. Bloch, The Historian’s Craft (in French, 1949; in English transl. 1953); B. Southgate, Why Bother with History? Ancient, Modern and Postmodern Motivations (Harlow, 2000); J. Tosh (ed.), Historians on History: An Anthology (2000; 2017); J. Black and D.M. MacRaild, Studying History (Basingstoke, 2007); H.P.R. Finberg (ed.), Approaches to History: A Symposium (2016).

6 See esp. L.L. Cavalli-Sforza and F. Cavalli-Sforza, The Great Human Diasporas: The History of Diversity and Evolution, transl. by S. Thomas (Reading, Mass., 1995); D. Reich, Who We Are and How We Got Here: Ancient DNA and the New Science of the Human Past (Oxford, 2018).

7 P.J. Corfield, ‘All People are Living Histories: Which is why History Matters’. A conversation-piece for those who ask: Why Study History? (2008) in London University’s Institute of Historical Research Project, Making History: The Discipline in Perspective www.history.ac.uk/makinghistory/resources/articles/why_history_matters.html; and also available on www.penelopejcorfield.co.uk/ Pdf1.


MONTHLY BLOG 92, HISTORIANS AT WORK THROUGH TIME

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Historians, who study the past, don’t undertake this exercise from some vantage point outside Time. They, like everyone else, live within an unfolding temporality. That’s very fundamental. Thus it’s axiomatic that historians, like their subjects of study, are all equally Time-bound.1

Nor do historians undertake the study of the past in one single moment in time. Postmodernist critics of historical studies sometimes write as though historical sources are culled once only from an archive and then adopted uncritically. The implied research process is one of plucking choice flowers and then pressing them into a scrap-book to some pre-set design.

On such grounds, critics of the discipline highlight the potential flaws in all historical studies. Sources from the past are biased, fallible and scrappy. Historians in their retrospective analysis are also biased, fallible and sometimes scrappy. And historical writings are literary creations only just short of pure fiction.2

Historians should welcome this dose of scepticism – always a useful corrective. Yet they entirely reject the proposition that trying to understand bygone eras is either impossible or worthless. Rebuttals to postmodernist scepticism have been expressed theoretically;3 and also directly, via pertinent case studies which cut through the myths and ‘fake news’ which often surround controversial events in history.4

When at work, historians should never take their myriad of source materials literally and uncritically. Evidence is constantly sought, interrogated, checked, cross-checked, compared and contrasted, as required for each particular research theme. The net is thrown widely or narrowly, again depending upon the subject. Everything is a potential source, from archival documents to art, architecture, artefacts and through the gamut to witness statements and zoological exhibits. Visual materials can be incorporated either as primary sources in their own right, or as supporting documentation. Information may be mapped and/or tabulated and/or statistically interrogated. Digitised records allow the easy selection of specific cases and/or the not-so-easy processing of mass data.

As a result, researching and writing history is a slow through-Time process – sometimes tediously so. It takes at least four years, from a standing start, to produce a big specialist, ground-breaking study of 100,000 words on a previously un-studied (or under-studied) historical topic. The exercise demands a high-level synthesis of many diverse sources, running to hundreds or even thousands. Hence the methodology is characteristically much more than a ‘reading’ of one or two key texts – although, depending upon the theme, at times a close reading of a few core documents (as in the history of political ideas) is essential too.

Mulling over meanings is an important part of the process too. History as a discipline encourages a constant thinking and rethinking, with sustained creative and intellectual input. It requires knowledge of the state of the discipline – and a close familiarity with earlier work in the chosen field of study. Best practice therefore enjoins writing, planning and revising as the project unfolds. For historical studies, ‘writing through’ is integral, rather than waiting until all the hard research graft is done and then ‘writing up’.5

The whole process is arduous and exciting, in almost equal measure. It’s constantly subject to debate and criticism from peer groups at seminars and conferences. And, crucially too, historians are invited to specify not only their own methodologies but also their own biases/assumptions/framework thoughts. This latter exercise is known as ‘self-reflexivity’. It’s often completed at the end of a project, although it’s then inserted near the start of the resultant book or essay. And that’s because writing serves to crystallise and refine (or sometimes to reject) the broad preliminary ideas, which are continually tested by the evidence.

One classic example of seriously through-Time writing comes from the historian Edward Gibbon. The first volume of his Decline & Fall of the Roman Empire appeared in February 1776. The sixth and final one followed in 1788. According to his autobiographical account, the gestation of his study dated from 1764. He was then sitting in the Forum at Rome, listening to Catholic monks singing vespers on Capitol Hill. The conjunction of ancient ruins and later religious commitments prompted his core theme, which controversially deplored the role of Christianity in the ending of Rome’s great empire. Hence the ‘present’ moments in which Gibbon researched, cogitated and wrote stretched over more than 20 years. When he penned the last words of the last volume, he recorded a sensation of joy – and then melancholy that his massive project was done.6 (Its fame and the consequent controversies live on today; and form part of the history of history.)

1 For this basic point, see PJC, ‘People Sometimes Say “We Don’t Learn from the Past” – and Why that Statement is Completely Absurd’, BLOG/91 (July 2018), to which this BLOG/92 is a companion-piece.

2 See e.g. K. Jenkins, ReThinking History (1991); idem (ed.), The Postmodern History Reader (1997); C.G. Brown, Postmodernism for Historians (Harlow, 2005); A. Munslow, The Future of History (Basingstoke, 2010).

3 J. Appleby, L. Hunt and M. Jacob, Telling the Truth about History (New York, 1994); R. Evans, In Defence of History (1997); J. Tosh (ed.), Historians on History (Harlow, 2000); A. Brundage, Going to the Sources: A Guide to Historical Research and Writing (Hoboken, NJ., 2017).

4 H. Shudo, The Nanking Massacre: Fact versus Fiction – A Historian’s Quest for the Truth, transl. S. Shuppan (Tokyo, 2005); Vera Schwarcz, Bridge across Broken Time: Chinese and Jewish Cultural Memory (New Haven, 1998).

5 PJC, ‘Writing Through a Big Research Project, not Writing Up’, BLOG/60 (Dec.2015); PJC, ‘How I Write as a Historian’, BLOG/88 (April 2018).

6 R. Porter, Gibbon: Making History (1989); D.P. Womersley, Gibbon and the ‘Watchmen of the Holy City’: The Historian and his Reputation, 1776-1815 (Oxford, 2002).


MONTHLY BLOG 91, PEOPLE SOMETIMES SAY: ‘WE DON’T LEARN FROM THE PAST’ AND WHY THAT STATEMENT IS COMPLETELY ABSURD

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

People sometimes say, dogmatically but absurdly: ‘We don’t learn from the Past’. Oh really? So what do humans learn from, then? We don’t learn from the Future, which has yet to unfold. We do learn in and from the Present. Yet every moment of ‘Now’ constitutes an infinitesimal micro-instant in an unfolding process. The Present is an unstable time-period, which is constantly morphing, nano-second by nano-second, into the Past. Humans don’t have time, in that split-second of ‘Now’, to comprehend and assimilate everything. As a result, we have, unavoidably, to learn from what has gone before: our own and others’ experiences, which are summed as everything before ‘Now’: the Past.

It’s worth reprising the status of those temporal categories. The Future, which has not yet unfolded, is not known or knowable in its entirety. That’s a definitional quality which springs from the unidirectional nature of Time. It does not mean that the Future is either entirely unknown or entirely unknowable. As an impending temporal state, it may beckon, suggest, portend. Humans are enabled to have considerable information and expectations about many significant aspects of the Future. For example, it’s clear from past experience that all living creatures will, sooner or later, die in their current corporeal form. We additionally know that tomorrow will come after today, because that is how we habitually define diurnal progression within unilinear Time. We also confidently expect that in the future two plus two will continue to equal four; and that all the corroborated laws of physics will still apply.

And we undertake calculations, based upon past data, which provide the basis for Future predictions or estimates. For example, actuarial tables, showing age-related life expectancy, indicate group probabilities, though not absolute certainties. Or, to take a different example, we know, from expert observation and calculation, that Halley’s Comet is forecast to return into sight from Earth in mid-2061. Many, though not all, people alive today will be able to tell whether that astronomical prediction turns out to be correct or not. And there’s every likelihood that it will be.

Commemorating a successful prediction, in the light of past experience: a special token struck in South America in 2010 to celebrate the predicted return to view from Planet Earth of Halley’s Comet, whose periodicity was first calculated by Edmond Halley (1656-1742)
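The logic of such a forecast – projecting forward from recorded past data – can be sketched in a few lines of Python. This is a rough illustration only, not the astronomers’ actual method (real orbital mechanics must allow for planetary perturbations, which is why professional forecasts are far more precise); the perihelion years used are well-documented past returns of Halley’s Comet.

```python
# Estimating Halley's Comet's next return from past perihelion years:
# a simple example of using the Past to predict the Future.
past_perihelia = [1835, 1910, 1986]  # documented return years

# Average interval between successive recorded returns
intervals = [later - earlier
             for earlier, later in zip(past_perihelia, past_perihelia[1:])]
mean_period = sum(intervals) / len(intervals)  # roughly 75-76 years

# Project the mean period forward from the most recent return
next_return = past_perihelia[-1] + mean_period
print(int(next_return))  # lands close to the forecast date of 2061
```

Even this crude average of past intervals lands on the right decade – a small demonstration that predictions about the Future are, as the paragraph above argues, drawn entirely from past experience, observation and calculation.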

Yet all this (and much more) useful information about the Future is, entirely unsurprisingly, drawn from past experience, observations and calculations. As a result, humans can use the Past to illuminate and to plan for the Future, without being able to foretell it with anything like total precision.

So how about learning from the Present? It’s live, immediate, encircling, inescapably ‘real’. We all learn in our own present times – and sometimes illumination may come in a flash of understanding. One example, as Biblically recounted, is the conversion of St Paul, who in his unregenerate days was named Saul: ‘And as he journeyed, he came near Damascus; and suddenly there shined round about him a light from heaven. And he fell to the earth, and heard a voice saying unto him, “Saul, Saul, why persecutest thou me?”’1 His eyes were temporarily blinded; but spiritually he was enlightened. Before then, Saul was one of the Christians’ chief persecutors, ‘breathing out threatening and slaughter’.2 Perhaps a psychologist might suggest that his intense hostility concealed some unexpressed fascination with Christianity. Nonetheless, there was no apparent preparation, so the ‘Damascene conversion’ which turned Saul into St Paul remains the classic expression of an instant change of heart. But then he had to rethink and grow into his new role, working with those he had been attempting to expunge.

A secular case of sudden illumination appears in the fiction of Jane Austen. In Emma (1815), the protagonist, a socially confident would-be match-maker, has remained in ignorance of her own heart. She encourages her young and humble protégé, Harriet Smith, to fancy herself in love. They enjoy the prospect of romance. Then Emma suddenly learns precisely who is the object of Harriet’s affections. The result is wonderfully described.3 Emma sits in silence for several moments, in a fixed attitude, contemplating the unpleasant news:

Why was it so much worse that Harriet should be in love with Mr Knightley, than with Frank Churchill? Why was the evil so dreadfully increased by Harriet’s having some hope of a return? It darted through her, with the speed of an arrow, that Mr Knightley must marry no one but herself!

I remember first reading this novel, as a teenager, when I was as surprised as Emma at this development. Since then, I’ve reread the story many times; and I can now see the prior clues which Austen scatters through the story to alert more worldly-wise readers that George Knightley and Emma Woodhouse are a socially and personally compatible couple, acting in concert long before they both (separately) realise their true feelings. It’s a well-drawn example of people learning from the past whilst ‘wising up’ in a single moment. Emma then undertakes some mortifying retrospection as she gauges her own past errors and blindness. But she is capable of learning from experience. She does; and so, rather more artlessly, does Harriet. It’s a comedy of trial-and-error as the path to wisdom.

As those examples suggest, the relationship of learning with Time is in fact a very interesting and complex one. Humans learn in their own present moments. Yet the process of learning and education as a whole has to be a through-Time endeavour. A flash of illumination needs to be mentally consolidated and ‘owned’. Otherwise it is just one of those bright ideas which can come and as quickly go. Effective learning thus entails making oneself familiar with a subject by repetition, cogitation, debating, and lots of practice. Such through-Time application applies whether people are learning physical or intellectual skills or both. The role of perspiration, as well as inspiration, is the stuff of many mottoes: ‘practice makes perfect’; ‘if at first you don’t succeed, try and try again’; ‘stick at it’; ‘never stop learning’; ‘trudge another mile’; ‘learn from experience’.

Indeed, the entire corpus of knowledge and experience that humans have assembled over many generations is far too huge to be assimilated in an instant. (It’s actually too huge for any one individual to master. So we have to specialise and share).

So that brings the discussion back to the Past. It stretches back through Time and onwards until ‘Now’. Of course, we learn from it. Needless to say, it doesn’t follow that people always agree on messages from former times, or act wisely in the light of such information. Hence when people say: ‘We don’t learn from the Past’, they probably mean that it does not deliver one guiding message, on which everyone agrees. And that’s right. It doesn’t and there isn’t.

One further pertinent point: there are rumbling arguments around the question – is the Past alive or dead? (With a hostile implication in the sub-text that nothing can really be learned from a dead and vanished Past.) But that’s not a helpful binary. In other words, it’s a silly question. Some elements of the past have conclusively gone, while many others persist through time.4 To take just a few examples, the human genome was not invented this morning; human languages have evolved over countless generations; and the laws of physics apply throughout.

Above all, therefore, the integral meshing between Past and Present means that we, individual humans, have also come from the Past. It’s in us as well as, metaphorically speaking, behind us. Thinking of Time as running along a pathway or flowing like a river is a common human conception of temporality. Other alternatives might envisage the Past as ‘above’, ‘below’, ‘in front’, ‘behind’, or ‘nowhere specific’. The metaphor doesn’t really matter as long as we realise that it pervades everything, including ourselves.

1 Holy Bible, Acts 9: 3-4.

2 Ibid, 9:1.

3 J. Austen, Emma: A Novel (1815), ed. R. Blythe (Harmondsworth, 1969), p. 398.

4 P.J. Corfield, ‘Is the Past Dead or Alive? And the Snares of Such Binary Questions’, BLOG/62 (Feb.2016).
