MONTHLY BLOG 108, Why must Humans beware the Midas Touch?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

PJC REVIEWS

CHARLES DICKENS

 A CHRISTMAS CAROL (1843)

ADAPTED FOR STAGE PERFORMANCE

BY LAURA TURNER (2010; updated 2019)

Viewed at Palace Theatre, Appleton Gate, Newark NG24 1JY

16 November 2019

Cast (alphabetically): The Chapterhouse Touring Company –

Gareth Cary; Matthew Christmas; Eliza Jade; Graham Hill; Alexandra Lansdale;

Amy Llewellyn; Zachery Price

Director: Antony Law


‘Bah! Humbug!’ With those great words, Scrooge launches an evening of festive entertainment, and a ripple of appreciation spreads through the audience. The central theme is set. When is it right to be frank, forthright and unsentimental? To speak the truth as one sees it? But when does such behaviour become surly, selfish and inhumane, dismissing genuine concerns as merely sentimental and confected? There is a special resonance to such questions right now, since an election campaign is in train, during which the Prime Minister seeking re-election has dismissed as ‘Humbug!’ the concerns about personal safety and the coarsened state of public discourse, as emotionally expressed by a female MP.

On stage, the youthful cast of seven actors throw themselves energetically into recreating the bustling life of mid-nineteenth-century London. All but one play multiple roles, including the ghost of Scrooge’s former partner Jacob Marley. Their parts are rather stereotyped; but they make an effective ensemble, under the skilful stage direction of Antony Law.

One character, however, has to undergo moral growth, changing from an old skinflint into a sentient, feeling human being. He is Ebenezer Scrooge, as played by Matthew Christmas, who is visibly youthful and good-looking. Does that matter? Surely not. Acting is make-believe. If Sarah Bernhardt in her 70s, with a wooden leg, could make audiences cry when she played Hamlet, then a young actor can play an old man – or woman, come to that. Christmas was stern and inflexible enough as Scrooge in the opening scenes; but perhaps he needed to convey more thoroughly that Scrooge had spent an entire, dreary lifetime amassing money, and doing nothing but that. His avarice should be imprinted in his visage. Anyhow, once Scrooge began to soften, Christmas played the role very well. His look of initial surprise at himself, on returning to the world of emotions, was excellent.

The outcome of the story as a whole, as Dickens had intended, is heart-warming. There is a danger that scenes involving ghosts (four appear during the play) can be unintentionally risible. This production avoided that outcome, by playing everything to the hilt, with full intensity. There is another danger that scenes involving youthful death – in this case the demise of the handicapped but perennially cheerful Tiny Tim – can become too sentimentalised and, as a result, also unintentionally comic. No danger in this production. The actors switched immediately into a clear and still rendering of an appropriate Christmas carol, unaccompanied. It was very moving. Indeed, they sang a number of carols throughout the play, underpinning the theme of festive cheer. What a bonus to find a troupe of good actors, all with excellent singing voices.

So what does the story of A Christmas Carol mean? In one sense, Dickens’s moral is clear and simple. People should care for their fellow humans. Heartless austerity is indeed heartless. Individuals should give personal help willingly, not just for the benefit of those in want but also because caring for others is a means of unlocking one’s own heart, which otherwise would remain frozen. To be complete, a human has to be part of society. Not necessarily married or dwelling within a group. But emphatically not living in chill segregation from others.

At the same time, there is a hidden power within the story in the lure of money. Dickens is well aware that it’s not just love which makes the world go round. Money provides the basic means of subsistence but can also effect so much more. It constitutes a great source of social status and esteem, as well as conferring the economic power of capital. Scrooge is an old skinflint. But he is also a respectable pillar of society and an employer, with the potential to give great happiness to others. Moreover, Scrooge’s diligence and his application are admirable qualities. Dickens is not encouraging people to live idly or without employment. Nor is he trying to envisage a different structure for society. He campaigned for reforms (for example, to the prison system), not revolutionary change. Unlike (say) his contemporaries Robert Owen or Karl Marx, Charles Dickens is not a visionary with alternative communitarian economic models in mind.

Instead, his challenge to the world is to re-infuse everyday transactions with moral values. People must work for money but not love it too much. Gold can corrode the heart, as in the classic tale of King Midas. If everything within touching range turns to gold, then nothing is left to eat and drink. Other people too become lifeless, just as King Midas killed his little daughter with a touch. Scrooge has, through his lifestyle, destroyed his own heart and feelings. He is outwardly rich and powerful, but inwardly tragic.

Capping the accumulation of immense wealth and undertaking a degree of social redistribution can thus be advocated as a moral as well as a political cause for democratic societies to embrace. These are the sort of economic policies that the very rich deride as the ‘politics of envy’. They certainly won’t like to hear that they must redeploy some of their wealth for their own good, as well as for the good of others. They will join Scrooge in further reiterations of ‘Bah! Humbug!’ So how are attitudes to change? It’s not enough to rely upon fictional Dickensian ghosts to create a moral awakening across society at large.

Is it too fanciful to consider that climate change will bring about a fundamental shift? In a sense, unprecedented floods, storms, heatwaves, fires and rising seas are signs from Planet Earth that humans are at risk of behaving like a collective King Midas: destroying with their touch the very things that they love the most. These thoughts are perhaps straying too far from the evening of collective good cheer provided by the youthful players on stage in Newark. They indicate, however, that Dickens’s fable – and Laura Turner’s dramatisation of its scenes of moral redemption – are genuinely thought-provoking. Don’t love money too much! Great wealth is a curse! Make friendships! Save Planet Earth! And enjoy the midwinter festival!


MONTHLY BLOG 107, Reasons for unrepentant (relative) Optimism about the coming of Green Politics

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Fig.1 Greta Thunberg (b. 2003),
Swedish environmental activist;
author of No One is Too Small to Make a Difference (2019)

In response to my October BLOG about Greener Cities, I got many queries asking how I could plausibly state that ‘I am an unrepentant optimist’. In fact, I should have said an ‘unrepentant (relative) optimist’, since it’s clear that not all is currently well with Planet Earth. Things would be better without today’s growing number of major fires, heatwaves, droughts, tempests, floods, icemelts, and rising seas. So I am far from taking the ultra-optimist’s view that all is for the best, in the best of all possible worlds.

But, short of adopting a totally Panglossian outlook, it is possible, indeed necessary, to remain optimistic that actions can be taken in time to control the adverse effects of global warming. Humans are not only problem-creators but also problem-solvers. In this case, the challenge is undeniably great. It will require significant changes not only from big business and big politics (using that term for the networks of national and international institutions) but also from individuals. Global patterns of transport, trade, energy generation and energy consumption will have to be fundamentally adapted. And at an individual level, people will have to think again about their food and drink; their clothing; their systems for warming houses; their transport; their sports; their holidays; and, indeed, everything. It is asking a lot. Especially as remedial actions will need to be adopted at both macro- and micro-levels simultaneously.

Nonetheless, here are four arguments for (relative) optimism. Governments and big businesses have paid attention to scientific warnings in the past, and then taken successful remedial action. In the 1970s, it was first reported that there was a widening gap in the ozone layer, which shields Planet Earth from harmful ultra-violet radiation. The culprits were chemicals known familiarly as CFCs (chlorofluorocarbons), which were used in aerosol sprays, refrigerators, and blowing agents for foams and packaging materials. An international agreement, known as the Montreal Protocol (1987), then launched decisive change. CFCs were banned.

Over time, all nations around the world have signed up to the Protocol. And in May 2018 a new scientific survey confirmed that the ozone hole has diminished significantly.1 Humans still have to remain vigilant, since the workings of the upper atmosphere are volatile and not easy to study.2 Nonetheless, collective action has been undertaken; and is working.

A second example can be taken from individual actions to renounce a social practice, which was once seen as a great source of personal pleasure. Smoking tobacco in cigars and cigarettes is disappearing. Not at the same rate in all countries around the world. Nor at the same rate among all social classes. Yet, globally, humans are entering into what has been well described as the ‘tobacco-endgame’.3 For example, in the case of Britain, it is hoped that the entire country may become smoke-free by 2030, according to a health report in July 2019.4 Progress in curbing smoking has been triggered by many factors. Medical warnings paved the way from the 1950s onwards, at first cautiously, and then, with more definitive research, more emphatically. Supportive government policies eventually helped too. Above all, however, the slow but eventually decisive shift in individual and communal attitudes was crucial.

Up to and including most of the 1960s, it was considered ‘cool’ to smoke and rude to refuse a friend’s offer of a cigarette. Over time, those attitudes have been completely reversed. Many older people can still remember their personal struggles to quit. Younger people, if they are lucky, never get caught by the habit in the first place. They have no memories of pubs, cinemas, tube trains and other public places being clogged with tobacco fumes – or of their hair and clothes reeking unpleasantly. Again, the battle against smoking is far from won. There are still skirmishes and diversionary tactics (as from e-cigarettes) along the way.5 Yet the trend is becoming clear. As is the crucial role of individual decision-making and active participation in the process.

The story of Prohibition in the USA in 1919 offers an instructive contrast. There the legislative ban on the manufacture, transportation and sale of alcohol was well intentioned. Drinking as such was never made illegal; and aggregate consumption was indeed reduced. However, the policy was introduced too abruptly and without widespread public support. The outcome was evasion on an epic scale, boosting illicit stills and bootlegging gangsters. Other side-effects included a boom in hypocrisy and contempt for the law. Campaigners for a more rational system managed to repeal the ban in 1933, leaving the different US states to adopt their own policies.6 The contrast between alcohol’s survival, despite Prohibition, and nicotine’s slow demise is telling. Government policies, health advisors and medical practitioners can and do play significant roles. But on big questions which affect people’s intimate personal behaviour on a day-by-day basis, structural policies have to work with, not against, public opinion. Hence the question of how that state-of-many-collective-minds is formed and sustained becomes crucial.

So here is a third reason for (relative) optimism on global warming. Public opinion, fuelled by young people like the Swedish activist Greta Thunberg, is being everywhere encouraged to turn in favour of urgent action. True, the mechanisms for channelling such attitudes into the political system are indirect and slow-working. However, what is happening now seems like part of a Zeitgeist shift of immense significance. The young are numerous, vocal, and willing to campaign. Furthermore, people of all ages know that the human species has no other domicile than Planet Earth. People of many different political persuasions are showing new interest in green policies. And people in all parts of the world are witnessing the increased incidence of freak weather. The voices of sceptics and deniers are waning.7 Getting collective action to harness this rising tide of opinion will depend upon big politics being able and willing to channel the tide successfully – and upon big business becoming aware and either adjusting its actions, or being made to do so. Big demands, which entail challenging big vested interests. Yet these demands are not impossible ones. Vigorous explorations are already being undertaken to find alternative technologies. Such game-changing innovations may alter the nature of the decisions that need to be made. Politicians need to show the same willingness to respond positively, in the face of an accumulating emergency.

And, lastly, a degree of activism (whether driven by pessimism or optimism) is needed from everyone, to add force to the changing Zeitgeist. The alternative is fatalism, which only makes a bad situation worse. True, being optimistic is easier for those with optimistic temperaments. Yet even those who feel nothing but gloom are called upon, in this climate emergency, to transmute their valid anxieties into pressure for change. Relative pessimism can be as great a goad to call for remedial action, as can relative optimism. ‘Climate change constitutes a global emergency!’ ‘Let’s take countervailing action!’ All can lend their voices to swell the tide of public opinion.

ENDNOTES:

1 S. Pereira, report on Ozone Layer dated 1/5/2018 for Newsweek 27 October 2019: https://www.newsweek.com/nasa-hole-earths-ozone-layer-finally-closing-humans-did-something-771922

2 E.A. Parson, Protecting the Ozone Layer: Science and Strategy (Oxford, 2003); S.O. Andersen and K.M. Sarma, Protecting the Ozone Layer: The United Nations History (2002).

3 [British Medical Journal], India: The Endgame for Tobacco Conference (2013).

4 S. Barr, report dated 23 July 2019 in The Independent: https://www.independent.co.uk/life-style/health-and-families/smoking-ban-uk-end-cigarettes-tobacco-health-green-paper-a9016636.html

5 S. Gabb, Smoking and its Enemies: A Short History of 500 Years of the Use and Prohibition of Tobacco (1990).

6 D. Okrent, Last Call: The Rise and Fall of Prohibition (New York, 2010); J.J. Binder, Al Capone’s Beer Wars: A Complete History of Organised Crime in Chicago during Prohibition (Amherst, 2017).

7 G.T. Farmer, Climate Change Science: A Modern Synthesis (Dordrecht, 2013); J. Fessmann (ed.), Strategic Climate Change Communications: Effective Approaches to Fighting Climate Change Denial (Wilmington, 2019); S. Maloney, H. Fuenfgeld and M. Gramberg, Local Action on Climate Change: Opportunities and Constraints (2017).


MONTHLY BLOG 106, Cities Greener Still and Greener

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Scrapbook Silhouettes: available on eBay (2019)

Towns and cities are wonderful human creations. They allow large numbers of people to live together at high density, reasonably successfully, without spreading all over Planet Earth. The mixture of collective and individual organisation that enables this process to happen and to sustain itself is impressive and admirable.1 And, without concentrated cities and towns as living spaces, it would be extremely difficult to accommodate the Earth’s 7.7 billion humans (at the latest count in September 2019) – a demanding species.

Now, however, it’s time for a major step-change in the characteristics of the prevailing urban environment worldwide. All towns and cities must go greener, much, much greener. It’s true that some urban places are already pleasantly green. And all, even the most concrete-based urban settlements, have at least some city parks and green spaces, acting as urban lungs.

Yet a programme of Cities Greener Still and Greener needs a complete urban revamp and restructuring. It is required partly for ecological reasons. The impending crisis of global overheating provides the immediate call to action on behalf of all species world-wide. And human-biological needs give a subsidiary impetus to all urban leaders and planners, now that much more is becoming known about the beneficial effects of greenery (trees, plants, wildlife) upon human health and wellbeing.2

A full programme for profound urban ecological change will require structural changes to transportation systems, domestic and industrial heating systems, and so much else besides. Improvements to air quality must be a priority. This BLOG, however, is not the place for a full manifesto on behalf of greenery and sustainable cities, although I have no doubt that green policies will, sooner or later, have to be adopted everywhere.3

These thoughts express some immediate hobby-horses, which a personal BLOG provides a chance to exercise. So here are four imperatives relating to planting urban trees/bushes/greenery; sowing/planting grasses and wild plants; restoring lost urban rivers and streams; and adopting permeable paving wherever possible. The aim is to banish unbroken swathes of concrete. That stern and rigid material has manifold uses; but, as currently adopted, it is stifling the earth, which is too important to be so mistreated. Not only does the making of concrete involve harmful processes which add to global warming, but the greyspread of concrete is destroying the natural infrastructure of the soil and hence seriously damaging processes of fertilisation, pollination, flood control, oxygen production and water purification. Environmentalists as well as architects and planners are beginning to warn that massy developments based upon this rigid and impermeable material are storing up problems for the not-very-distant future. ‘Simply pouring concrete is doing more harm than good’.4

Trees and Bushes: it’s not enough to give urban dwellers more opportunities to go out of town to visit forests, or to complain about the destruction of forests in the Amazon, although both of those are worthwhile campaigns. Instead, trees and bushes should be planted in the cities – everywhere. In every road and courtyard and backyard and corner. And, if there’s no room in the ground, then the trees and bushes should be in big tubs and planters. Absolutely everywhere. Millions and millions of trees and bushes. Old city centres, with a maze of small streets, are great places to walk around. Greater still with tubs of trees and bushes everywhere. New city avenues, boulevards, urban thoroughfares, bypasses – all need trees and greenery. Precisely which species can survive and thrive in each different town or city environment is a matter for tree specialists and urban landscapers to advise. But greenery is the universal requirement, for better air quality, better visual impact, and better lives for humans plus all forms of urban wildlife.

Tree-in-a-tub: from www.fromoldbooks.org (2019)

Sowing Grasses and Planting Plants: Suitably hardy plants, wild or cultivated, as well as grasses which grow well without careful tending, should be sown in every bit of earth: in unused large areas of neglected ground and in every small patch at the feet of trees or anywhere else, such as railway sidings. Plants and grasses are environmentally favourable for wildlife and insects, as well as pleasant for humans. And where there are opportunities for community gardening, those options should be embraced as well. Already some people spontaneously grow plants at the foot of trees in the roads where they live. And Lambeth Council in London has already begun a creative Biodiversity Policy across the Borough to the same effect (and more). The programme is bringing huge benefits – ecological, cultural, economic, health, and community – for a comparatively small outlay.

Such initiatives deserve not just congratulations but immediate imitation everywhere. In its support, Lambeth Council cites an eminent and idiosyncratic Victorian who lived in the Borough: John Ruskin, the pioneering critic of untrammelled industrialism and environmental degradation. He praised the restorative power of nature: ‘It is written on the arched sky; it looks out from every star. It is the poetry of nature; it is that which uplifts the spirit within us’.5  Well, it’s not the sort of language which is usually found in official documents; but entirely relevant. Ruskin would be proud.

Wild Flowers © Clipart 2019

Restoring Lost Streams and Rivers: Restoring lost rivers is trickier, since in many cases they flow in culverts under roads and buildings. Nonetheless, they are integral parts of the urban environmental ecology and should be respected, uncovered wherever possible, and enjoyed. It’s an excellent as well as urgent new challenge to the ingenuity of engineers and urban landscape designers. Such rethinking is part of a revised attitude to cities and their terrain, which should not be built over heedlessly.6 London is one of many places which have secret watery undergrounds. Its lost rivers have their own devotees; and people eagerly attend talks and join walks along their courses.7

There are also more ambitious plans for river restoration wherever possible. For example, in Lewisham’s Chinbrook Meadows a section of the River Quaggy (great name) has been uncovered, as something beneficial in itself but also as part of a wider water management project.8 The park has gained in amenities and popularity; wildlife has been assisted; and the wetland serves as an overflow area in time of flooding, protecting local homes and businesses. This creative feat of reverse engineering is an admirable portent for a future that is more nature- and human-friendly, as well as more practically sustainable. Not every urban river and stream will be easily restored; and town dwellers have to resolve not to throw litter into running waterways, once visible again. But these challenges are live ones, here and now!

The unremarked outfall of London’s Fleet River under Blackfriars Bridge: image from website for Paul Talling’s zestful exploration of London’s Lost Rivers (2011),
https://www.londonslostrivers.com/river-fleet.html

Permeable Paving: It’s depressing to realise that a very ingenious invention – one which uses concrete yet still allows cars to park without stifling the earth – has long been known but is not at all widely used. There are numerous forms of permeable paving. One takes the form of a concrete lattice, which allows grass to grow within the grid. (And if the climate does not encourage grass, then earth or sand fill the gaps.) Water drains simply and naturally into the ground; and excess runoffs at times of heavy rain or flooding are minimised. Needless to say, this system is not suitable for all terrains and climates; and there are practical limits to the quantity of traffic and load that porous paving can bear. Indeed, a number of alternatives are being developed concurrently, using plastic or asphalt.

So porous paving exists;9 but it is not (yet) used sufficiently. It seems clear that more urgent effort is needed to research and develop such systems, and to make them easier and cheaper to use. Builders and engineers who are currently accustomed to schemes for widespread concretisation (yes, the word exists) will have to rethink their ways. But they represent buccaneering professions which are used to facing challenges. The future now requires working with nature, not stifling or attempting to erase it – for the obvious reason that outraged nature has a very determined way of striking back.

Advertisement for Grasscrete ®: Concrete Paver System (2019)

Envoi: Where has this BLOG come from? I am an urban historian who loves towns and cities, and who has long been meditating on these themes. I first saw grass-crete in Switzerland in the mid-1980s and was sure that I had seen the future – only to find that the universal future of porous paving has been somewhat delayed. Today’s debates about the problems of excess water run-off from concreted land as well as the wider context of the accelerating climate emergency have triggered me into writing down my thoughts.

So where is this BLOG going? It’s my way of bearing witness, of joining the tide of protest at the present dire state of Planet Earth. I believe that human beings are noted problem solvers as well as problem creators. It’s true that urgent action on climate change is needed to accompany the fine words from many (not all) of today’s politicians.

Nonetheless, I am an unrepentant optimist that humans will react positively in response to collective danger.10 Today’s warnings from scientists, campaigners, and many thousands of young people globally cannot be ignored for much longer. Transformative action is needed, learning from past experience to new effect. And that includes local initiatives, in every town and city, where many green micro-improvements will together promote greener still macro-change.

ENDNOTES:

1 For a panoptic historical survey, see collectively the essays in P. Clark (ed.), The Oxford Handbook of Cities in World History (Oxford, 2013).

2 K. Nilsson and others (eds), Forests, Trees and Human Health (Elsevier, 2006; New York, 2010); [US. Dept. of Agriculture/ Forest Service], Urban Nature for Human Health and Well-Being: A Research Summary (Washington DC, 2018); Q. Li, Into the Forest: How Trees can help you Find Health and Happiness (2019).

3 See e.g. new thinking in T. Elkin, Reviving the City: Towards Sustainable Urban Development (1991); and recent work on ecological cities.

4 From J. Watts, ‘Concrete: The Most Destructive Material on Earth’, The Guardian, 25 Feb. 2019: see https://www.theguardian.com/cities/2019/feb/25/concrete-the-most-destructive-material-on-earth. See also a kinder but still warning analysis in A. Forty, Concrete and Culture: A Material History (2012).

5 See details of Lambeth Biodiversity Action Plan, 2019-24, in https://www.lambeth.gov.uk/sites/default/files/lpl-lambeth-biodiversity-action-plan-2019-20.pdf

6 K. Perini and P. Sabbon, Urban Sustainability and River Restoration: Green and Blue Infrastructure (2016); M. Knoll and others (eds), Rivers Lost, Rivers Regained: Rethinking City-River Relations (Pittsburgh, PA, 2017).

7 P. Talling, London’s Lost Rivers (2011) and associated website; T. Bolton, London’s Lost Rivers: A Walker’s Guide (Devizes, 2014).

8 Case Study: Quaggy Flood Alleviation Scheme (2013) in https://restorerivers.eu/wiki/index.php?title=Case_study%3AQuaggy_Flood_Alleviation_Scheme

9 B.K. Ferguson, Porous Pavements (Boca Raton, FL, 2005).

10 P.J. Corfield, ‘Climate Reds: Responding to Global Warming with Relative Optimism’, (2011) with companion piece by M. Levene, ‘Climate Blues: Or How Awareness of the Human End might Re-instil Ethical Purpose to the Writing of History’: PJC essay available on personal website www.penelopejcorfield.co.uk/Pdf21.


MONTHLY BLOG 105, Researchers, Do Your Ideas Have Impact? A Critique of Short-Term Impact Assessments

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Clenched Fist
© Victor-Portal-Fist (2019)

 Researchers, do your ideas have impact? Does your work produce ‘an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia’? Since 2014, that question has been addressed to all research-active UK academics during the assessments for the Research Excellence Framework (REF), which is the new ‘improved’ name for the older Research Assessment Exercise (RAE).1

From its first proposal, however, and long before implementation, the Impact Agenda has proved controversial.2 Each academic is asked to produce for assessment, within a specified timespan (usually seven years), four items of published research. These contributions may be long or short, major or minor. But, in the unlovely terminology of the assessment world, each one is termed a ‘Unit of Output’ and is marked separately. Then the results can be tallied for each researcher, for each Department or Faculty, and for each University. The process is mechanistic, putting the delivery of quantity ahead of quality. And now the REF’s whistle demands demonstrable civic ‘impact’ as well.

These changes add to the complexities of an already intricate and unduly time-consuming assessment process. But ‘Impact’ certainly sounds great. It’s punchy, powerful: Pow! When hearing criticisms of this requirement, people are prone to protest: ‘But surely you want your research to have impact?’ To which the answer is clearly ‘Yes’. No-one wants to be irrelevant and ignored.

However, much depends upon the definition of impact – and whether it is appropriate to expect measurable impact from each individual Unit of Output. Counting/assessing each individual tree is a methodology that will serve only to obscure sight of the entire forest. And will hamper its future growth.

In some cases, to be sure, immediate impact can be readily demonstrated. A historian working on a popular topic can display new results in a special exhibition, assuming that provision is made for the time and organisational effort required. Attendance figures can then be tallied and appreciative visitors’ comments logged. (Fortunately, people who make an effort to attend an exhibition usually reply ‘Yes’ when asked ‘Did you learn something new?’). Bingo. The virtuous circle is closed: new research → an innovative exhibition → gratified and informed members of the public → relieved University administrators → happy politicians and voters.

Yet not all research topics are suitable to generate, within the timespan of the research assessment cycle, the exhibitions, TV programmes, radio interviews, Twitterstorms, applied welfare programmes, environmental improvements, or any of the other multifarious means of bringing the subject to public attention and benefit.

The current approach focuses upon the short-term and upon the first applications of knowledge rather than upon the long-term and the often indirect slow-fuse combustion effects of innovative research. It fails to register that new ideas do not automatically have instant success. Some of the greatest innovations take time – sometimes a very long time – to become appreciated even by fellow researchers, let alone by the general public. Moreover, in many research fields, there has to be scope for ‘trial and error’. Short-term failures are part of the price of innovation for ultimate long-term gain. Unsurprisingly, therefore, the history of science and technology contains many examples of wrong turnings and mistakes, along the pathways to improvement.3

An Einstein, challenging the research fundamentals of his subject, would get short shrift in today’s assessment world. It took 15 years between the first publication of his paper on Special Relativity in 1905 and the wider scientific acceptance of his theory, once his predictions were confirmed experimentally. And it has taken another hundred years for the full scientific and cultural applications of the core concept to become both applied and absorbed.4 But even then, some of Einstein’s later ideas, in search of a Unified Field Theory to embrace analytically all the fundamental forces of nature, have not (yet) been accepted by his fellow scientists.5 Even a towering genius can err.

Knowledge is a fluid and ever-debated resource which has many different applications over time. Applied subjects (such as engineering; medicine; architecture; public health) are much more likely to have detectable and direct ‘impact’, although those fields also require time for development. ‘Pure’ or theoretical subjects (like mathematics), meanwhile, are more likely to achieve their effects indirectly. Yet technology and the sciences – let alone many other aspects of life – could not thrive without the calculative powers of mathematics, as the unspoken language of science. Moreover, it is not unknown for advances in ‘pure’ mathematics, which have no apparent immediate use, to become crucial many years subsequently. (An example is the role of abstract Number Theory for the later development of both cryptography and digital computing).6

Hence the Impact Agenda is alarmingly short-termist in its formulation. It is liable to discourage blue-skies innovation and originality, in the haste to produce the required volume of output with proven impact.

It is also fundamentally wrong that the assessment formula excludes from consideration the contribution of research to teaching, and vice versa. Historically, the proud boast of the Universities has been the integral link between those two activities. Academics are not just transmitting current know-how to the next generation of students; they (with the stimulus and often the direct cooperation of their students) are simultaneously working to expand, refine, debate, develop and apply the entire corpus of knowledge itself. Moreover, they are undertaking these processes within an international framework of shared endeavour. This comment does not imply, by the way, that all knowledge is originally derived from academics. It comes from multiple human sources, the unlearned as well as the learned. Yet increasingly it is the research Universities which play a leading role in collecting, systematising, testing, critiquing, applying, developing and advancing the entire corpus of human knowledge, which provides the essential firepower for today’s economies and societies.7

These considerations make the current Impact Agenda all the more disappointing. It ignores the combined impact of research upon teaching, and vice versa. It privileges ‘applied’ over ‘pure’ knowledge. It prefers instant contributions over long-term development. It discourages innovation, sharing and cooperation. And it entirely ignores the international context of knowledge development and its transmission. Instead, it encourages researchers to break down their output into bite-sized chunks; to be risk-averse; to try for crowd-pleasers; and to feel harried and unloved, as all sectors of the educational world are supposed to compete endlessly against one another.

No one gains from warped assessment systems. Instead, everyone loses, as civic trust is eroded. Accountability is an entirely ‘good thing’. But only when done intelligently and without discouraging innovation. ‘Trial and error’ contains the possibility of error, for the greater good. So the quest for instant and local impact should not be overdone. True impact entails a degree of adventure, which should be figured into the system. To repeat a dictum which is commonly attributed to Einstein (because it summarises his known viewpoint), original research requires an element of uncertainty: ‘If we knew what it was we were doing, it would not be called “research”, would it?’8

ENDNOTES:

1 See The Research Excellence Framework: Diversity, Collaboration, Impact Criteria, and Preparing for Open Access (Westminster, 2019); and historical context in https://en.wikipedia.org/wiki/Research_Assessment_Exercise.

2 See e.g. B.R. Martin, ‘The Research Excellence Framework and the “Impact Agenda”: Are We Creating a Frankenstein Monster?’ Research Evaluation, 20 (Sept. 2011), pp. 247-54; and other contributions in same issue.

3 S. Firestein, Failure: Why Science is So Successful (Oxford, 2015); [History of Science Congress Papers], Failed Innovations: Symposium (1992).

4 See P.C.W. Davies, About Time: Einstein’s Unfinished Revolution (New York, 1995); L.P. Williams (ed.), Relativity Theory: Its Origins and Impact on Modern Thought (Chichester, 1968); C. Christodoulides, The Special Theory of Relativity: Foundations, Theory, Verification, Applications (2016).

5 F. Finster and others (eds), Quantum Field Theory and Gravity: Conceptual and Mathematical Advances in the Search for a Unified Framework (Basel, 2012).

6 M.R. Schroeder, Number Theory in Science and Communications: With Applications in Cryptography, Physics, Biology, Digital Information and Computing (Berlin, 2008).

7 J. Mokyr, The Gifts of Athena: Historical Origins of the Knowledge Economy (Princeton, 2002); A. Valero and J. van Reenen, ‘The Economic Impact of Universities: Evidence from Across the Globe’ (CEP Discussion Paper No. 1444, 2016), in Vox: https://voxeu.org/article/how-universities-boost-economic-growth

8 For the common attribution and its uncertainty, see [D. Hirshman], ‘Adventures in Fact-Checking: Einstein Quote Edition’, https://asociologist.com/2010/09/04/adventures-in-fact-checking-einstein-quote-edition/


MONTHLY BLOG 104, Is it Time to Look beyond Separate Identities to Find Personhood?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Collectively, the 15th International Congress on the Enlightenment (ICE), focusing upon Enlightenment Identities, was a huge triumph. For five days in Edinburgh in July 2019 some 2000 international participants rushed from event to event. There were not only 477 learned panel presentations and five great plenaries but also sundry conducted walks, coach tours to special venues, a grand reception, a superb concert, a pub quiz, and an evening of energetic Highland dancing. So much was happening that heads spun, and not just from the jovial Edinburgh hospitality.

By way of introduction, I began the first plenary session, with its global array of speakers, by offering some basic definitions. The grand themes of the Congress were Enlightenment and Identities: Lumières et Identités. Powerful concepts, which are both much contested. Needless to say, the Congress organisers did not insist on single definitions of these grand themes, which were chosen precisely to promote debate.

In that spirit, the Congress logo displayed two iconic figures from the eighteenth century. Both are shown as questioning, as they flank the silhouette of the classic monument on Edinburgh’s Calton Hill to the philosopher Dugald Stewart. These two iconic figures may be considered as the Adam and Eve of the Congress, venturing out into the world to lead the collective intellectual journey.

The young woman was named Dido Belle Lindsay. She was aged 18 in 1778-9, when her portrait was painted alongside her fair-skinned cousin.2 By heritage, Dido Belle was an illegitimate African-Caribbean-Scot. Yet she was given a resonant first name which evoked the celebrated Queen of Carthage. And by life experiences, Dido Belle Lindsay had a protected and affluent upbringing in the household of her great-uncle, an eminent London lawyer. She later married a Frenchman and lived quietly in England with her family.

Meanwhile, the man, who drew his own brooding self-portrait at the age of 40, was a German Swiss named Heinrich Füssli.3 He had travelled to Italy, where he Italianised his surname to Fuseli and then made a successful career as an artist in London. There he married an Englishwoman. Both these individuals embodied the flexibility and fluidity of eighteenth-century identities. Neither their social milieux nor their individual life-histories were static.

As educated people, the Congress’s Adam and Eve might well have encountered, in their reading and conversations, various catch-phrases like ‘It’s an Age of Light’ or ‘This Age of Reason and Science’. Specifically, too, Fuseli as a German-speaking Swiss could have read in the original Immanuel Kant’s celebrated enquiry, published in 1784, Was ist Aufklärung? What is Enlightenment?

Moreover, Dido Belle Lindsay, the free daughter of a formerly enslaved African woman, would no doubt have appreciated the public appeal made by the leading African abolitionist Olaudah Equiano. He urged that slavery had no place in an age of ‘Light, Liberty, and Science’.4 He was thereby invoking the sense of a new Zeitgeist and new forms of knowledge. By contrast, the slave traders had custom and practice in their support, as well as financial vested interests. But, tellingly, the slave traders did NOT justify their business by saying ‘It’s an Age of Slave-Trading’, even though that was factually true. On this issue, the abolitionists were ‘seizing the narrative’, to put the point into twenty-first-century terminology.5

Nonetheless, the Congress’s Adam and Eve would not have thought about their era as one of fixity. They both lived long enough to see the emergence of conscious anti-Enlightenment thought, from the later eighteenth century onwards. Fuseli specifically contributed to Romanticism in his art, and expressed scepticism about the claims of cold rationality. So neither figure would have been surprised to learn that the concept of Enlightenment remains contested among historians, political theorists and social philosophers.

Responses today range from appreciation and appropriation through to rejection and outright denial. Scholars analyse national and regional variations; and they debate differences between mainstream and radical Enlightenments. Meanwhile, in the later twentieth century, hostile postmodernist critics attacked appeals to rationalist reforms, which they identified as a single and oppressive ‘Enlightenment Project’.8 Yet rival sceptics denied the existence of any cohesive movement at all. Plenty to debate.

To those complexities, moreover, may be added the further complications of ‘Identities’. The terminology is warm and positive. But its impact is not simple. Viewed schematically, the rise of identity studies in the last thirty years has matched the decline of research interest into historical class, and the rise of ‘identity politics’ in the wider world.10 This fashionable approach is personal, individualistic. It rejects economic determinism. Instead, the factors that influence identity are seen as endlessly fluid and flexible. They may include gender, sexuality, ethnicity, and yes, social class; but they extend to religion, nationality, region, language, politics, culture, brainpower – and the power of physical appearances.

Certainly the Congress’s Adam and Eve would have known about identity issues, although they would not have described them in such terms. Dido Belle Lindsay lived with her great-uncle, the liberal judge Lord Mansfield. It was he who, in 1772, heard the famous test case, when the captive African James Somersett sued for his freedom from the hold of an English ship in an English port. The case was an individual one. But the judge, when granting Somersett’s plea for liberty, pronounced publicly that the state of slavery was ‘odious’.11 Dido Belle Lindsay would surely have approved. As a result, Somersett gained the legal identity of a free man; and judicial disapproval was directed at the entire system of personal enslavement. The case became a landmark in the long (and still continuing) struggle to abolish unfree personal servitude in its many different guises.

However, there are criticisms to be made of identity histories, as there are of identity politics. There is a danger that personal classifications may be interpreted too rigidly. In reality, people then and now may have multiple and overlapping identities. They may move between them as they prefer: an eighteenth-century gentleman living in Northumbria might define himself as an Englishman when teasing a Scot from north of the border; but both might define themselves as Britons when opposing the French.

It’s also vital to recognise that identities are not always soft, liberal and inclusive. Group identities especially can become aggressive, bellicose, and coercive, formed in contra-distinction to ‘other’ groups. So identity politics may lead not to shared pluralism but to harsh conflict and polarisation. In sum, these big organising concepts may contain light – but also darkness.

Today it is surely time to look beyond the sub-divisions, not in blind denial but in awareness that there are also universals alongside diversities. In gender history, there is also a concept of personhood, beyond the rivalries of men and women.12 In terms of polymorphous human sexualities, there’s a potential for agreed boundaries of non-exploitative behaviour, beyond the rhetoric of individual sexual gratification. In the context of historical ‘racism’, there’s also significant movement towards a non-racialised understanding that all people are members of one human race.13 And, legally and politically, there is scope for a renewed endorsement of universalist human rights, as triumphantly if controversially expounded in the eighteenth-century Enlightenment, applying not to one section of the globe but to all – and applying in practice as well as in theory.14

These communal issues are becoming especially highlighted in the light of the global climate emergency.15  They make a huge agenda but a very human one, to be pursued with a spirit of unity which underlies diversity: avec l’esprit de l’unité, qui sous-tend la diversité …

ENDNOTES:

1 Edited text of presentation given to Edinburgh Congress Enlightenment Identities, on Monday 15 July 2019, introducing first Global Plenary. My esteemed colleagues on the panel were, in order of speaking, Deirdre Coleman (University of Melbourne); Sébastien Charles (Université du Québec à Trois-Rivières, Canada); Tatiana Artemyeva (Herzen State University of Russia); Sutapa Dutta (Gargi College, University of Delhi, India); and Toshio Kusamitsu (University of Tokyo, Japan).

2 For Dido Belle Lindsay (1761-1804), see P. Byrne, Belle: The True Story of Dido Belle (2014); and an intriguing outreach film Belle (dir. A. Asante, 2018).

3 For Henry Fuseli (1741-1825), see M. Myrone (ed.), Gothic Nightmares: Fuseli, Blake and the Romantic Imagination (2016).

4 O. Equiano, The Interesting Narrative: And Other Writings, ed. V. Carretta (1995), p. 233.

5 For a huge literature, follow leads in B. Carey and others (eds), Discourses of Slavery and Abolition: Britain and its Colonies, 1760-1838 (Basingstoke, 2004); and R.S. Newman, Abolitionism: A Very Short Introduction (Oxford, 2018).

6 See e.g. R. Porter and M. Teich (eds), The Enlightenment in National Context (Cambridge, 1981).

7 See e.g. J.I. Israel, Radical Enlightenment: Philosophy and the Making of Modernity, 1650-1750 (Oxford, 2001) and ensuing debates.

8 S-E. Liedman, The Postmodernist Critique of the Project of Enlightenment (Amsterdam, 1997); G. Sauer-Thompson and J. Wayne Smith, The Unreasonable Silence of the World: Universal Reason and the Wreck of the Enlightenment Project (2019).

9 G. Garrard, Counter-Enlightenments: From the Eighteenth Century to the Present (2004).

10 See e.g. critiques like W. Egginton, The Splintering of the American Mind: Identity Politics, Inequality and Community on Today’s College Campuses (New York, 2018).

11 For the complexities of the case, see https://en.wikipedia.org/wiki/Somerset_v_Stewart.

12 See e.g. commentary in P.J. Corfield, ‘Enlightenment Womanhood, Manhood, Sexualities and Personhood: Thematic Overview’, in L. Andries and M-A. Bernier (eds), L’Avenir des Lumières: The Future of Enlightenment (Paris, 2019), pp. 89-105; L. Appell-Warren, Personhood: An Examination of the History and Use of an Anthropological Concept (Lewiston, 2014).

13 For the shared genetic history of humankind, see L. Cavalli-Sforza and F. Cavalli-Sforza, The Great Human Diaspora: The History of Diversity and Evolution, transl. S. Thomas (Reading, MA, 1995).

14 Consult A. Brysk, The Future of Human Rights (Cambridge, 2018).

15 See calls for more urgent responses as in D. Spratt and P. Sutton, Climate Code Red: The Case for Emergency Action (Victoria, Australia, 2008); and many other publications.


MONTHLY BLOG 103, WHO KNOWS THESE HISTORY GRADUATES BEFORE THE CAMERAS AND MIKES IN TODAY’S MASS MEDIA?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Image © Shutterstock 178056255

Responding to the often-asked question, ‘What do History graduates do?’, I usually reply, truthfully, that they gain employment in an immense range of occupations. But this time I’ve decided to name a popular field and to cite some high-profile cases, to give specificity to my answer. The context is the labour-intensive world of the mass media. It is no surprise to find that numerous History graduates find jobs in TV and radio. They are familiar with a big subject of universal interest – the human past – which contains something for all audiences. They are simultaneously trained to digest large amounts of disparate information and ideas, before welding them into a show of coherence. And they have specialist expertise in ‘thinking long’. That hallmark perspective buffers them against undue deference to the latest fads or fashions – and indeed against the slings and arrows of both fame and adversity.

In practice, most History graduates in the mass media start and remain behind-the-scenes. They flourish as managers, programme commissioners, and producers, generally far from the fickle bright lights of public fame. Collectively, they help to steer the evolution of a fast-changing industry, which wields great cultural clout.1

There’s no one single route into such careers, just as there’s no one ‘standard’ career pattern once there. It’s a highly competitive world. And often, in terms of personpower, a rather traditionalist one. Hence there are current efforts by UK regulators to encourage a wider diversity in terms of ethnic and gender recruiting.2 Much depends upon personal initiative, perseverance, and a willingness to start at comparatively lowly levels, generally behind the scenes. It often helps as well to have some hands-on experience – whether in student or community journalism; in film or video; or in creative applications of new social media. But already-know-it-all recruits are not as welcome as those ready and willing to learn on the job.

Generally, there’s a huge surplus of would-be recruits over the number of jobs available. It’s not uncommon for History students (and no doubt many others) to dream, rather hazily, of doing something visibly ‘big’ on TV or radio. However, front-line media jobs in the public eye are much more difficult than they might seem. They require a temperament that is at once super-alert, good-humoured, sensitive to others, and quick to respond to immediate issues – and yet is simultaneously cool under fire, not easily sidetracked, not easily hoodwinked, and implacably immune from displays of personal pique and ego-grandstanding. Not an everyday combination.

It’s also essential for media stars to have a thick skin to cope with criticism. The immediacy of TV and radio creates the illusion that individual broadcasters are personally ‘known’ to the public, who therefore feel free to commend/challenge/complain with unbuttoned intensity.

Those impressive History graduates who appear regularly before the cameras and mikes are therefore a distinctly rare breed.3 (The discussion here refers to media presenters in regular employment, not to the small number of academic stars who script and present programmes while retaining full-time academic jobs – who constitute a different sort of rare breed).

Celebrated exemplars among History graduates include the TV news journalists and media personalities Kirsty Wark (b.1955) and Laura Kuenssberg (b.1976), who are both graduates of Edinburgh University. Both have had public accolades – Wark was elected as Fellow of the Royal Society of Edinburgh in 2017 – and both face much criticism. Kuenssberg in particular, as the BBC’s first woman political editor, is walking her way warily but effectively through the Gothic-melodrama-cum-Greek-tragedy-cum-high-farce known as Brexit.

In a different sector of the media world, the polymathic TV and radio presenter, actor, film critic and chat-show host Jonathan Ross (b.1960) is another History graduate. He began his media career young, as a child in a TV advertisement for a breakfast cereal. (His mother, an actor, put him forward for the role). Then, having studied Modern European History at London University’s School of Slavonic & East European Studies, Ross worked as a TV programme researcher behind the scenes, before eventually fronting the shows. Among his varied output, he’s written a book entitled Why Do I Say These Things? (2008). This title for his stream of reminiscences highlights the tensions involved in being a ‘media personality’. On the one hand, there’s the need to keep stoking the fires of fame; but, on the other, there’s an ever-present risk of going too far and alienating public opinion.

Similar tensions accompany the careers of two further History graduates, who are famed as sports journalists. The strain of never making a public slip must be enormous. John Inverdale (b.1957), a Southampton History graduate, and Nicky Campbell (b.1961), ditto from Aberdeen, have to cope not only with the immediacy of the sporting moment but also with the passion of the fans. Over the years, Inverdale racked up a number of gaffes. Some were unfortunate. None fatal. Nonetheless, readers of the Daily Telegraph in August 2016 were asked rhetorically, and obviously inaccurately: ‘Why Does Everyone Hate John Inverdale?’4 That sort of over-the-top response indicates the pressures of life in the public eye.

Alongside his career in media, meanwhile, Nicky Campbell used his research skills to study the story of his own adoption. His book Blue-Eyed Son (2011)5 sensitively traced his extended family roots among both Protestant and Catholic communities in Ireland. His current role as a patron of the British Association for Adoption and Fostering welds this personal experience into a public role.

The final exemplar cited here is one of the most notable pioneers among women TV broadcasters. Baroness Joan Bakewell (b.1933) has had what she describes as a ‘rackety’ career. She studied first Economics and then History at Cambridge. After that, she experienced periods of considerable TV fame followed by the complete reverse, in her ‘wilderness years’.6 Yet her media skills, her stubborn persistence, and her resistance to being publicly patronised for her good looks in the 1960s, have given Bakewell media longevity. She is not afraid of voicing her views, for example in 2008 criticising the absence of older women on British TV. In her own maturity, she can now enjoy media profiles such as that in 2019 which explains: ‘Why We Love Joan Bakewell’.7 No doubt, she takes the commendations with the same pinch of salt as she took being written off in her ‘wilderness years’.

Bakewell is also known as an author; and for her commitment to civic engagement. In 2011 she was elevated to the House of Lords as a Labour peer. And in 2014 she became President of Birkbeck College, London. In that capacity, she stresses the value – indeed the necessity – of studying History. Her public lecture on the importance of this subject urged, in timely fashion, that: ‘The spirit of enquiring, of evidence-based analysis, is demanding to be heard.’8

What do these History graduates in front of the cameras and mikes have in common? Their multifarious roles as journalists, presenters and cultural lodestars indicate that there’s no straightforward pathway to media success. These multi-skilled individuals work hard for their fame and fortunes, concealing the slog behind an outer show of relaxed affability. They’ve also learned to live with the relentless public eagerness to enquire into every aspect of their lives, from health to salaries, and then to criticise the same. Yet it may be speculated that their early immersion in the study of History has stood them in good stead. As already noted, they are trained in ‘thinking long’. And they are using that great art to ‘play things long’ in career terms as well. Multi-skilled History graduates work in a remarkable variety of fields. And, among them, some striking stars appear regularly in every household across the country, courtesy of today’s mass media.

ENDNOTES:

1 O. Bennett, A History of the Mass Media (1987); P.J. Fourie (ed.), Media Studies, Vol. 1: Media History, Media and Society (2nd edn., Cape Town, 2007); G. Rodman, Mass Media in a Changing World: History, Industry, Controversy (New York, 2008).

2 See Ofcom Report on Diversity and Equal Opportunities in Television (2018): https://www.ofcom.org.uk/__data/assets/pdf_file/0021/121683/diversity-in-TV-2018-report.PDF

3 Information from diverse sources, including esp. the invaluable survey by D. Nicholls, The Employment of History Graduates: A Report for the Higher Education Authority … (2005): https://www.heacademy.ac.uk/system/files/resources/employment_of_history_students_0.pdf; and short summary by D. Nicholls, ‘Famous History Graduates’, History Today, 52/8 (2002), pp. 49-51.

4 See https://www.telegraph.co.uk/olympics/2016/08/15/why-does-everyone-hate-john-inverdale?

5 N. Campbell, Blue-Eyed Son: The Story of an Adoption (2011).

6 J. Bakewell, interviewed by S. Moss, in The Guardian, 4 April 2010: https://www.theguardian.com/lifeandstyle/2010/apr/04/joan-bakewell-harold-pinter-crumpet

7 https://www.bbc.co.uk/programmes/articles/1xZlS9nh3fxNMPm5h3DZjhs/why-we-love-joan-bakewell.

8 J. Bakewell, ‘Why History Matters: The Eric Hobsbawm Lecture’ (2014): http://joanbakewell.com/history.html.


MONTHLY BLOG 102, ARE YOU AN OPTIMIST? HOW WELL DO YOU KNOW YOUR OWN TEMPERAMENT?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

The Cheshire Cat, famed for its indestructible grin …
from Lewis Carroll’s Alice’s Adventures in Wonderland,
as depicted by John Tenniel for the book’s classic 1865 edition.
© image in public domain

Are you an optimist? This question is one of my favourite opening gambits when launching into longish conversations with strangers. It’s a pleasant enquiry. It’s open-ended. It implies personal interest but it’s not overly intrusive. In response, people can talk about whatever they wish. They don’t have to reveal any secrets. Often, they talk about their health or work or families. In rare cases, frank individuals confide details of their hopes or fears for their love-life. And, increasingly these days, people take the question as an invitation to hold forth about politics, Brexit, and the state of the nation/world.

I’m also fond of asking questions that can go ‘round the table’, as it were. Those need to be open questions which don’t require a great deal of specialist information to answer. Getting a response from everyone, going round the group, is a great way of fostering a collective dynamic. (I enjoy this process not only in an educational context but socially too.) However, I have learned from experience that asking ‘Are you an optimist?’ really works best in one-to-one conversations. In groups, the cultural pressure to be up-beat in public militates against frank answers.1 Most people will claim, even if evasively, to be cheery – whilst allowing one or two individuals to seize the chance to play the dissident roles of ‘grumpy old men/women’. Their responses quickly lead everyone into debating ‘country going to the dogs’, Brexit, and the state of the nation/world.

However, such arguments have an increasingly stereotypical quality these days, which the question Are you an optimist? is designed to avoid. So it works best in one-to-one encounters, when there’s time to steer away from the perennial Brexit and to explore new terrain. By the way, when asking others to make whatever limited confidences they wish, it’s important to reciprocate. I have no desire to recount my life-story; but I do have some self-reflective comments about my own attitudes, which I am willing to share. Often, the question prompts an absorbing discussion, even with a newly-met stranger. It certainly is more probing than the standard gambit reportedly used by the Queen: ‘Have you come far?’ Or the academic’s predictable: ‘What’s your research field?’

Talking about optimism also encourages a quest for further definitions. What exactly is meant by the term? It covers a range of permutations from the mildly hopeful: ‘Well, something will turn up’ to an unshakable Panglossian faith that ‘all is for the best in the best of possible worlds’.2 And then people seek further clarification: optimistic over what sort of timespan: one year? five years? a lifetime? And with reference to what: oneself? one’s profession? one’s country? It’s very common these days for almost all educationalists across the spectrum to be deeply pessimistic about the state of the education system. By contrast, true believers who have just discovered a great good cause tend to be highly optimistic in the early days of their faith, although over time their hopes of rapid success may become muted as they encounter obstacles and opposition (for example to feminism or to environmentalism).

Generally, however, optimists tend to skate over the complexities. Their glasses are rose-tinted. Their glasses are half full, not half empty. They see the potential in everything. And they believe, if not quite in universal ‘Progress’, at least in the positive chances of progressive betterment.3 And, as they wait in hope for things to develop favourably (even if events don’t always oblige), optimists claim to get more enjoyment out of life than do neutral observers. Milton long ago praised such feelings in L’Allegro, his hymn to mirth, jollity, dancing, nut-brown ale, good fellowship and everything that unchains ‘the hidden soul of harmony’.4

Meanwhile, lurking within every discussion about optimism is the countervailing stance of pessimism. Milton was there too. ‘Hence, vain, deluding joyes …’, he urges in Il Penseroso, his rival hymn to meditative gloom: ‘Hail divinest Melancholy …’ Pessimism in turn embraces many possibilities. Options may range from mild scepticism through world-weary disillusionment and acidic negativism to despairing self-harm.

Many pessimists, however, don’t actually accept that self-description. They prefer to call themselves ‘realists’. Whilst optimists can often be disappointed when their high hopes don’t come true, pessimists can always claim not to be surprised at any outcome, short of ecstatic and universal bliss (which is undeniably rare). It’s true that waiting for disaster to strike can seem depressing. Yet serious pessimists positively enjoy their misery. And they certainly believe that they see life more clearly than do the blinkered optimists.

At its simplest, the optimist/pessimist dichotomy can be interpreted as a function of individual psychology and basic personality traits.5 However, it’s as well to recall that changing circumstances are also liable to affect people’s template attitudes. It’s hard to remain cheerful at all times when suffering from acute pain over a long period of time. And it’s difficult to remain perennially optimistic when suffering from a relentless torrent of externally-inflicted major disasters which are entirely beyond one’s own control. So the optimist/pessimist dichotomy is by no means a rigid one. People may be pessimistic about the state of their profession (for example), whilst remaining personally optimistic about (say) their life and loves.

Crucially, too, mental states are not dictated purely by emotions and personal psychology. Considered reason plays a significant role too. The greatest expression of that truth came from Antonio Gramsci (1891-1937), the Italian Marxist who died in a Fascist prison in Rome under Mussolini. While incarcerated, he continued with stoic fortitude to analyse the state of politics and the prospects for radical change.6 What was needed, he concluded, was: ‘pessimism of the intellect, optimism of the will’. It summarised powerfully the conscious yoking of reason and emotion. Gramsci’s formula can be applied to many causes, not just his own. Equally, it can be inverted by those who have optimistic intellects but suffer from pessimistic sapping of the will. Moreover, Gramsci’s formula can be reshuffled to allow room also for super-pessimists of both intellect/will as well as for super-optimists whose smile may outlast reality.

The Cheshire Cat faded
until nothing was left but the smile …

The significant factor, in all these permutations, is that reason is reinstated into human responses to their lives and times. Intellectual attitudes draw upon many sources, rational and emotional alike. For all analysts of the human condition, it’s as well to be aware of one’s own evolving template. A reflex optimism, for example, may lead one astray, unless tempered by rational cogitation and debate with others. I write as a perennial optimist who tries to make analytical adjustments to offset my biases. This process is based upon what I’ve learned from experience – and from many ad hoc conversations with others. So readers, should we be sitting together with a good chance of open-ended discussion, I’m liable to ask my favourite question: are you an optimist?

ENDNOTES:

1 For a polemic against mindless good cheer, see B. Ehrenreich, Bright-Sided: How the Relentless Promotion of Positive Thinking has Undermined America (New York, 2009), publ. in the UK as Smile or Die: How Positive Thinking Fooled America and the World (2009). See also S. Burnett, The Happiness Agenda: A Modern Obsession (New York, 2012).

2 Referencing Dr Pangloss in Voltaire’s satirical Candide: ou l’optimisme (Paris, 1759), immediately transl. into Eng. as Candide: Or, the Optimist.

3 See e.g. discussions in K.H.M. Creal, The Idea of Progress: The Origins of Modern Optimism (Toronto, 1970); W. Laqueur, Optimism in Politics: Reflections on Contemporary History (2017).

4 Compare J. Milton, L’Allegro with Il Penseroso (both written 1631; 1st publ. 1645), in J. Milton, The Poetical Works (Oxford, 1900), pp. 20-8.

5 There is a massive literature on these themes. See e.g. E. Fox, Rainy Brain, Sunny Brain: The New Science of Optimism and Pessimism (2012); P.B. Warr, The Psychology of Happiness (2019); W.C. Compton, Positive Psychology: The Science of Happiness and Flourishing (Los Angeles, 2019); plus countless manuals of self-help.

6 From A. Gramsci, Selections from the Prison Notebooks (1971). See also context in P.D. Thomas, The Gramscian Moment: Philosophy, Hegemony and Marxism (Leiden/Boston, 2009); A. Davidson, Antonio Gramsci: Towards an Intellectual Biography (1977; 2016); L. Kolakowski, Main Currents of Marxism, Vol. 3: The Breakdown (1971); N. Greaves, Gramsci’s Marxism: Reclaiming a Philosophy of History and Politics (Leicester, 2009).


MONTHLY BLOG 101, ARE YOU A LUMPER OR SPLITTER? HOW WELL DO YOU KNOW YOUR OWN CAST OF MIND?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

The terminology, derived from Charles Darwin,1 is hardly elegant. Yet it highlights rival polarities in the intellectual cast of mind. ‘Lumpers’ seek to assemble fragments of knowledge into one big picture, while ‘splitters’ see instead complication upon complications. An earlier permutation of that dichotomy was popularised by Isaiah Berlin. In The Hedgehog and the Fox (1953), he distinguished between brainy foxes, who know many things, and intellectual hedgehogs, who apparently know one big thing.2

Fox from © Clipart 2019; Hedgehog from © GetDrawings.com (2019)

These animalian embodiments of modes of thought are derived from a fragmentary dictum from the classical Greek poet Archilochus; and they remain more fanciful than convincing. It’s not self-evident that a hedgehog’s mentality is really so overwhelmingly single-minded.3 Nor is it clear that the reverse syndrome applies particularly to foxes, which have a reputation for craft and guile.4 To make his point with reference to human thinkers, Berlin instanced the Russian novelist Leo Tolstoy as a classic ‘hedgehog’. Really? The small and prickly hedgehog hardly seems a good proxy for a grandly sweeping thinker like Tolstoy.

Those objections to Berlin’s categories, incidentally, are good examples of hostile ‘splitting’. They quibble and contradict. Sweeping generalisations are rejected. Such objections recall a dictum in a Poul Anderson sci-fi novella, when one character states gravely that: ‘I have yet to see any problem, which, when you looked at it in the right way, did not become still more complicated’.5

Arguments between aggregators/generalisers and disaggregators/sceptics, which occur in many subjects, have been particularly high-profile among historians. The lumping/splitting dichotomy was recycled in 1975 by the American J.H. Hexter.6 He accused the Marxist Christopher Hill not only of ‘lumping’ but, even worse, of deploying historical evidence selectively, to bolster a partisan interpretation. Hill replied relatively tersely.7 He rejected the charge that he did not play fair with the sources. But he proudly accepted that, through his research, he sought to find and explain meanings in history. The polarities of lumping/splitting were plain for all to see.

Historical ‘lumpers’ argue that all analysis depends upon some degree of sorting/processing/generalising, applied to disparate information. Merely itemising date after date, or fact after fact ad infinitum, would not tell anyone anything. On those dreadful occasions when lecturers do actually proceed by listing minute details one by one (for example, going through events year by year), the audience’s frustration very quickly becomes apparent.

So ‘lumpers’ like big broad interpretations. And they tend to write big bold studies, with clear long-term trends. Karl Marx’s panoramic brief survey of world history in nine pages in The Communist Manifesto was a classic piece of ‘lumping’.8 In the twentieth century, the British Marxist historian E.P. Thompson was another ‘lumper’ who sought the big picture, although he could be a combative ‘splitter’ about the faults of others.9

‘Splitters’ conversely point out that, if there were big broad-brush interpretations that were reliably apparent, they would have been discovered and accepted by now. However, the continual debates between historians in every generation indicate that grand generalisations are continually being attacked. The progression of the subject relies upon a healthy dose of disaggregation alongside aggregation. ‘Splitters’ therefore produce accounts of rich detail, complications, diversities, propounding singular rather than universal meanings, and stressing contingency over grand trends.

Sometimes critics of historical generalisations are too angry and acerbic. They can thus appear too negative and destructive. However, one of the twentieth-century historians’ most impressive splitters was socially a witty and genial man. Intellectually, however, F.J. ‘Jack’ Fisher was widely feared for his razor-sharp and trenchant demolitions of any given historical analysis. Indeed, his super-critical cast of mind had the effect of limiting his own written output to a handful of brilliant interpretative essays rather than a ‘big book’.10 (Fisher was my research supervisor. His most caustic remark to me came after reading a draft chapter: ‘There is nothing wrong with this, other than a female desire to tell all and an Oxbridge desire to tell it chronologically.’ Ouch! Fisher was not anti-woman, although he was critical of Oxbridge where I’d taken my first degree. But he used this formulation to grab my attention – and it certainly did).

Among research historians today, the temperamental/intellectual cast of mind often inclines them to ‘splitting’, partly because there are many simplistic generalisations about history in public circulation which call out for contradiction or complication. Of course, the precise distribution around the norm remains unknown. These days, I would guestimate that the profession would divide into roughly 45% ‘lumpers’, seeking big grand overviews, and 55% ‘splitters’, stressing detail, diversity, contingency. The classification, however, does depend partly on the occasion and type of output, since single-person expositions on TV and radio encourage generalisations, while round-tables and panels thrive on disagreement where splitters can come into their own.

Moreover, there are not only personal variations, depending upon circumstance, but also major oscillations in intellectual fashions within the discipline. In the later twentieth century, for example, there was a growing, though not universal, suspicion of so-called Grand Narratives (big through-time interpretations).11 The high tide of the sceptical trend known as ‘revisionism’ challenged many old generalisations and easy assumptions. Revisionists did not constitute one single school of thought. Many did favour conservative interpretations of history, but, as remains apparent today, there was and is more than one form of conservatism. That said, revisionists were generally agreed in rejecting both left-wing Marxist conflict models of revolutionary change via class struggles and liberal Whiggish linear models of evolving Progress via spreading education, constitutional rights and so forth.12

Yet the alignments were never simple (a splitterish comment from myself). Thus J.H. Hexter was a ‘splitter’ when confronting Marxists like Hill. But he was a ‘lumper’ when propounding his own Whig view of history as a process of evolving Freedom. So Hexter’s later strictures on revisionism were as fierce as was his earlier critique of Hill.13

Ideally, most research historians probably seek to find a judicious balance between ‘lumping’/‘splitting’. There is scope both for generalisations and for qualifications. After all, there is diversity within the human experience and within the cosmos. Yet there are also common themes, deep patterns, and detectable trends.

Ultimately, however, the dichotomous choice between either ‘lumping’ or ‘splitting’ is a completely false option, when pursued to its limits. Human thought, in all the disciplines, depends upon a continuous process of building/qualifying/pulling down/rebuilding/requalifying/ and so on, endlessly. With both detailed qualifications and with generalisations. An analysis built upon And+And+And+And+And would become too airy and generalised to have realistic meaning. Just as a formulation based upon But+But+But+But+But would keep negating its own negations. So, yes. Individually, it’s worth thinking about one’s own cast of mind and intellectual inclinations. (I personally enjoy both lumping and splitting, including criticising various outworn terminologies for historical periodisation).14 Furthermore, self-knowledge allows personal scope to make auto-adjustments, if deemed desirable. And then, better still, to weld the best features of ‘lumping’ and ‘splitting’ into original thought. And+But+And+Eureka.

ENDNOTES:

1 Charles Darwin in a letter dated August 1857: ‘It is good to have hair-splitters and lumpers’: see Darwin Correspondence Letter 2130 in https://www.darwinproject.ac.uk/.

2 I. Berlin, The Hedgehog and the Fox: An Essay on Tolstoy’s View of History (1953).

3 For hedgehogs, now an endangered species, see S. Coulthard, The Hedgehog Handbook (2018). If the species were to have one big message for humans today, it would no doubt be: ‘Stop destroying our habitat and support the Hedgehog Preservation Society’.

4 M. Berman, Fox Tales and Folklore (2002).

5 From P. Anderson, Call Me Joe (1957).

6 J.H. Hexter, ‘The Burden of Proof: The Historical Method of Christopher Hill’, Times Literary Supplement, 25 Oct. 1975, repr. in J.H. Hexter, On Historians: Reappraisals of Some of the Makers of Modern History (1979), pp. 227-51.

7 For Hill’s rebuttal, see The Times Literary Supplement, 7 Nov. 1975, p. 1333.

8 K. Marx and F. Engels, The Manifesto of the Communist Party (1848), Section I: ‘Bourgeois and Proletarians’, in D. McLellan (ed.), Karl Marx: Selected Writings (Oxford, 1977), pp. 222-31.

9 Among many overviews, see e.g. C. Efstathiou, E.P. Thompson: A Twentieth-Century Romantic (2015); P.J. Corfield, E.P. Thompson, Historian: An Appreciation (1993; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf45.

10 See P.J. Corfield, F.J. Fisher (1908-88) and the Dialectic of Economic History (1990; 2018), in PJC website http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf46.

11 See esp. J-F. Lyotard, The Postmodern Condition: A Report on Knowledge (Paris, 1979; in Eng. transl. 1984), p. 7, which detected ‘an incredulity toward meta-narratives’; and further discussions in G.K. Browning, Lyotard and the End of Grand Narratives (Cardiff, 2000); and A. Munslow, Narrative and History (2018). Earlier Lawrence Stone, a classic historian ‘lumper’, had detected a return to narrative styles of exposition: see L. Stone, ‘The Revival of Narrative: Reflections on a New Old History’, Past & Present, 85 (1979), pp. 3-24. But in this essay Stone was detecting a decline in social-scientific styles of History-writing – not a return to old-style Grand Narratives.

12 Revisionism is sufficiently variegated to have avoided summary within one big study. But different debates are surveyed in L. Labedz (ed.), Revisionism: Essays on the History of Marxist Ideas (1962); J.M. Maddox, Hiroshima in History: The Myths of Revisionism (1974; 2011); L. Brenner, The Iron Wall: Zionist Revisionism from Jabotinsky to Shamir (1984); E. Longley, The Living Stream: Literature and Revisionism in Ireland (Newcastle upon Tyne, 1994); and M. Haynes and J. Wolfreys (eds), History and Revolution: Refuting Revisionism (2007).

13 J.H. Hexter (1910-96) founded in 1986 the Center for the History of Freedom at Washington University, USA, where he was Professor of the History of Freedom, and launched The Making of Modern Freedom series. For his views on revisionism, see J.H. Hexter, ‘Historiographical Perspectives: The Early Stuarts and Parliaments – Old Hat and the Nouvelle Vague’, Parliamentary History, 1 (1982), pp. 181-215; and analysis in W.H. Dray, ‘J.H. Hexter, Neo-Whiggism and Early Stuart Historiography’, History & Theory, 26 (1987), pp. 133-49.

14 See e.g. P.J. Corfield, ‘Primevalism: Saluting a Renamed Prehistory’, in A. Baysal, E.L. Baysal and S. Souvatzi (eds), Time and History in Prehistory (2019), pp. 265-82; and P.J. Corfield, ‘POST-Medievalism/ Modernity/ Postmodernity?’ Rethinking History, 14 (2010), pp. 379-404; also on http://www.penelopejcorfield.co.uk/PDF’s/CorfieldPdf20.


MONTHLY BLOG 100, CONTROLLING STREET VIOLENCE & LEARNING FROM THE DEMISE OF DUELLING

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Young men carrying knives today can’t simply be equated with gentlemen duelling with rapiers in the eighteenth century. There are very many obvious differences. Nonetheless, the decline and disappearance of duelling has some relevant messages for later generations, when considering how to cope with an increase in violent street confrontations.

Both themes come under the broad rubric of controlling public expressions of male violence. By the way, such a proposition does not claim violence to be purely a masculine phenomenon. Still less does it imply that all men are prone to such behaviour. Yet it remains historically the case that weaponised acts of aggression in public and semi-public places tend to be undertaken by men – and, often, by young men at that.

Duelling developed in Europe from the sixteenth century onwards as a stylised form of combat between two aggrieved individuals.1 In terms of the technology of fighting, it was linked with the advent of the light flexible rapier, instead of the heavy old broadsword. And in terms of conflict management, the challenge to a duel took the immediate heat out of a dispute, by appointing a future date and time for the aggrieved parties to appear on the ‘field of honour’.

At the appointed hour, the meeting did not turn into an instant brawl but was increasingly codified into ritual. ‘Seconds’ accompanied the combatants, to enforce the set of evolving rules and to see fair play. They were there as friendly witnesses but also, to an extent, as referees.2 In the eighteenth century, too, surgeons were often engaged to attend, so that medical attention was available if required.

Sometimes, to be sure, there were variants in the fighting format. On one occasion in 1688 two aristocratic combatants arrived, each supported by two seconds. At a given signal, all six men launched into an uninhibited sword-fight, in which all were wounded and two of the seconds died. However, such escalations were exceptional. The seconds often began the encounter by trying to reconcile the antagonists. If successful, the would-be duellists then shook hands and declared honour to be satisfied. Hence an unknown number of angry challenges never turned into outright fighting. Would-be violence in such cases had been deflected and socially contained.

Duels certainly remained a topic of both social threat and titillating gossip. They were dramatic moments, when individual destiny appeared heightened by the danger of imminent death. Later romantic novelists and film script-writers embraced the melodrama with unwearied enthusiasm. Yet the number of real-life duels in seventeenth- and eighteenth-century Britain was tiny.

No accurate records are available, since such encounters were kept semi-clandestine. Nonetheless, contemporary legal records and newspaper reports provide some clues. Scrupulous research by the historian Robert Shoemaker has identified 236 duels in the metropolitan London area between 1660 and 1830.3 In other words, there were fewer than 1.5 duels per annum on average during these 170 years. The peak duelling decades were those of the later eighteenth century. Between 1776 and 1800, there were on average 4.5 duels per annum. Yet that total emerged from a ‘greater’ London with approximately one million inhabitants in 1801. Even taking Shoemaker’s figures as a minimum, they show that duelling was much rarer in practice than its legendary status implied.

In fact, the question might be put the other way round: why were there so many duels at all, when the practice was officially deplored? The answer has relevance to today’s discussions about knife carrying. Duelling was sustained by a degree of socio-cultural acceptance by men in elite society, who were prepared to risk the legal penalties for unlawful fighting, wounding or killing. Its continuance paid tribute to the power of custom, against the law.

By the early nineteenth century in Britain, when the practice was disappearing, it was pretty much confined to young elite men of military background. However, there were three high-profile cases when very senior Tory politicians rashly took to the field. In 1798 Prime Minister William Pitt the Younger exchanged shots with his Treasurer of the Navy. (Both missed; but Pitt retired to his bed for three weeks, overcome by stress). In 1809 George Canning, the Foreign Secretary, duelled with his fellow Cabinet member, Viscount Castlereagh, Minister for War. (Castlereagh was wounded but not fatally). Most dramatically of all, in 1829 the ‘Iron Duke’ of Wellington, then Prime Minister, confronted the Earl of Winchelsea, in a row over Catholic Emancipation. (Neither was hurt; and the Duke immediately travelled to Windsor to reassure the king that his government was not suddenly leaderless).

These ill-judged episodes were signs of the acute vehemence of political confrontations in highly pressurised times. However, critics were immediately scathing. They asked pertinently enough why the populace should obey the laws when such eminent figures were potentially breaching the peace? At very least, their rash behaviour did not encourage reverence for men in high office.

Fig.2 Equestrian statue of Duke of Wellington, located in Royal Exchange Square, Glasgow: capping the statue with a traffic cone has become a source of local amusement, despite continued disapproval from Glasgow City Council and police.

Public opinion was slowly shifting against duelling. There was no guarantee that the god of battle would give victory to the disputant who was truly in the right. Fighting empowered the bellicose over the irenic. Religious and civic authorities always opposed fighting as a means of conflict resolution. Lawyers were particularly hostile. Self-help administration of justice deprived them of the business of litigation and/or arbitration. Hence in 1822 a senior law lord defined duelling as ‘an absurd and shocking remedy for private insult’.

Other voices had long been arguing that case. In 1753 the novelist Samuel Richardson strove in Sir Charles Grandison to depict a good man who declined to fight a duel, despite being strongly provoked. True, many impatient readers found this saintly hero to be somewhat priggish. But Grandison stressed that killing or maiming a rival over a point of honour was actually the reverse of honourable.4 Bourgeois good sense was triumphing over aristocratic impetuosity, although the fictional Sir Charles had a title just to soothe any anxieties over his social respectability.

Another public declaration against duelling came from the down-to-earth American inventor and civic leader Benjamin Franklin (1706-90). In 1784 he rejected the practice as both barbaric and old-fashioned: ‘It is astonishing that the murderous practice of duelling … should continue so long in vogue’. His intervention was particularly notable, in that recourse to duelling was socially more widespread in the American colonies, with their ingrained gun culture.5 And Franklin stuck to his position, refusing to rise to sundry challenges.

The force of such interventions in Britain helped to render public opinion decreasingly sympathetic to duellists. One pertinent example came from 1796. Early one morning, two Americans faced each other to duel in Hyde Park. But ten swimmers in the nearby Serpentine – some of them naked – jumped out of the water and ran to stop the fight. In this particular case, they were too late; and one contestant died. Nonetheless, witnesses testified in the ensuing murder trial that the crowd, many of middling social origins, had spontaneously intervened. Public attitudes were becoming hostile. And it was that shift, rather than major changes in law or policing, which caused the practice slowly to disappear. The last fatal duel in Scotland took place in 1826; the last in England/Wales (between two exiled Frenchmen) in 1852. When Prime Minister Peel was challenged to a political duel in the 1840s he immediately refused, on the grounds that such behaviour would be ‘childish’ as well as wrong.

Viewed in terms of Britain’s historical sociology, the decline of duelling was part of a complex process of everyday demilitarisation, in the context of the slow shift from a rural to an urbanised society. Gentlemen decreasingly carried swords for other than ceremonial purposes. Canes and umbrellas came into vogue instead. Sheridan’s play The Rivals (1775) poked fun at impetuous young gentlemen who are ready to fight for their honour. Yet they are aware that ‘a sword seen in the streets of Bath would raise as great an alarm as a mad dog’, as one character remarks. The combative Irish adventurer Sir Lucius O’Trigger is lampooned – a nice touch of auto-critique from Sheridan who came from Dublin and twice fought duels himself. And the country bumpkin Bob Acres, who is egged on to fight his rival, tellingly finds his valour ‘oozing away’ when it gets to the point.6 Audiences are invited to laugh, but sympathetically.

Interestingly, by 1775 Sheridan’s play was already behind the times in terms of the technology of fighting. By the 1760s duels had come increasingly to be fought with pistols. The last known sword duel in Britain occurred in 1785. This technological updating, supplied by industrious Birmingham gun-makers, had two paradoxical effects. On the one hand, it demonstrated that the art of duelling was quick to move with the times.

On the other hand, the advent of the pistol inadvertently saved lives. The statistics collected by Robert Shoemaker showed that unequivocally. Duels with swords, among his 236 recovered examples, resulted in deaths in 22 per cent of all cases; and woundings in another 25 per cent. By contrast, it was tricky to kill a man standing at a distance, especially with early pistols which lacked rifle sights for precise aiming. Among Shoemaker’s 236 cases, as few as 7 per cent of duels with pistols resulted in death; while a further 22 per cent led to woundings.

Or, the point can be put the other way round. A massive 71 per cent of combatants were unharmed after an exchange of pistol shots, compared with 53 per cent of duellists who were unharmed after crossing swords. In neither case did a duel guarantee a bloodbath. But pistols were a safer bet, especially after conventions established that the combatants had to stand at a considerable distance from one another and had to wait for a signal, in the form of a dropping handkerchief, before taking aim and firing. No ‘jumping the gun’. Indeed one test case in 1750 saw a duellist on trial for murder because he had fired before his opponent was ready. So the victim had testified, plaintively, on his deathbed.

It was the unavoidable proximity of the combatants rather than their martial skills which led to the greater proportion of killings by swordsmen than by gunsmen. That fact is relevant to the experience of knife-carrying today. The number of fatalities is not a sign of a special outcrop of wickedness but rather the consequence of the chosen technology. Knife-wielding in anger at close quarters is intrinsically dangerous, whatever the level of fighting expertise.

Needless to say, the moral of this history is not that combatants should switch to guns. The much-enhanced technology of gunfire today, including the easy firing of multiple rounds, makes that option ever less socially palatable, if it ever was.

Instead, the clear requirement is to separate combatants and to ritualise the expression of social and personal aggression. Achieving such policies must rely considerably upon systems of law and policing. Yet socio-cultural attitudes among the wider public are highly relevant too. As the history of duelling indicates, even august Prime Ministers allowed themselves upon occasion to be provoked into behaving in ways that put them at risk of criminal charges. But changing social mores eventually removed that option, even for the most combative and headstrong of politicians today. Community attitudes at first ritualised the personal resolution of conflicts and eventually withdrew support for such behaviour entirely.

So today multiple approaches are required. Police actions to discourage young men from carrying knives constitute an obvious and important step. Ditto effective policies to curb the drug culture. Equally crucial are strong and repeated expressions of community disapproval of violence and knife-carrying. Yet policing and public attitudes can’t work without complementary interventions to combat youth alienation and, especially, to provide popular non-violent outlets for energy and aggression. Leaving bored young people feeling fearful and at risk in public places is no recipe for social order.

How can energies and aggression be either ritualised and/or channelled into other outlets? It’s for young people and community activists to specify. But many potential options spring to mind: youth clubs; youth theatre; participatory sports of all kinds; martial arts; adventure programmes; community and ecological projects; music-making festivals; dance; creative arts; church groups; … let alone continuing educational access via further education study grants. It’s true that all such plans involve constructive imagination, organisation, and expenditure. But their benefits are immense. Violence happens within societies; and so, very emphatically, does conflict resolution and, better still, the redirection of energies and aggression into constructive pathways.

ENDNOTES:

1 See variously S. Banks, Duels and Duelling (Oxford, 2014); U. Frevert, Men of Honour: A Social and Cultural History of the Duel (Cambridge, 1995); V.G. Kiernan, The Duel in European History: Honour and the Reign of the Aristocracy (Oxford, 1988; 2016); M. Peltonen, The Duel in Early Modern England: Civility, Politeness and Honour (Cambridge, 2003); P. Spierenburg (ed.), Men and Violence: Gender, Honour and Rituals in Modern Europe and America (Columbus, Ohio, 1998).

2 S. Banks, ‘Dangerous Friends: The Second and the Later English Duel’, Journal of Eighteenth-Century Studies, 32 (2009), pp. 87-106.

3 R.G. Shoemaker, ‘The Taming of the Duel: Masculinity, Honour and Ritual Violence in London, 1660-1800’, Historical Journal, 45 (2002), pp. 525-45.

4 S. Richardson, The History of Sir Charles Grandison (1753; in Oxford 1986 edn), Bk.1, pp. 207-8.

5 B. Franklin, ‘On Duelling’ (1784), in R.L. Ketcham (ed.), The Political Thought of Benjamin Franklin (Indianapolis, Ind., 1965; 2003), p. 362. For context, see also W.O. Stevens, Pistols at Ten Paces: The Story of the Code of Honour in America (Boston, 1940); D. Steward, Duels and the Roots of Violence in Missouri (2000); and C. Burchfield, Choose Your Weapon: The Duel in California, 1847-61 (Fresno, CA., 2016).

6 R.B. Sheridan, The Rivals (1775), ed. E. Duthie (1979), Act V, sc. 2 + 3, pp. 105, 112. For the Irish context, see J. Kelly, ‘That Damn’d Thing Called Honour’: Duelling in Ireland, 1570-1860 (Cork, 1995).


MONTHLY BLOG 99, WHY BOTHER TO STUDY THE RULEBOOK?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Joining a public committee of any kind? Before getting enmeshed in the details, I recommend studying the rulebook. Why on earth? Such advice seems arcane, indeed positively nerdy. But I have a good reason for this recommendation. Framework rules are the hallmark of a constitutionalist culture.

Fig.1 The handsome front cover of the first edition of Robert’s Rules of Order (1876): these model rules, based upon the practices of the US Congress, remain widely adopted across the USA, their updating being undertaken by the Robert’s Rules Association, most recently in 2011.

Once, many years ago, I was nominated by the London education authority – then in the form of the Inner London Education Authority or ILEA – onto a charitable trust in Battersea, where I live. I accepted, not with wild enthusiasm, but from a sense of civic duty. The Trust was tiny and then did not have much money. It was rumoured that a former treasurer in the 1930s had absconded with all the spare cash. But anyway, in the early 1970s, the Trust was pottering along and did not seem likely to be controversial.

My experience as a Trustee was, however, both depressing and frustrating. The Trust was then named Sir Walter St. John’s Trust; and it exists today in an updated and expanded guise as the Sir Walter St. John’s Educational Charity (www.swsjcharity.org.uk). It was founded in 1700 by Battersea’s local Lord of the Manor, after whom it is named. In the 1970s, the Trust didn’t do much business at all. The only recurrent item on the agenda was the question of what to do about a Victorian memorial window which lacked a home. The fate of the Bogle Smith Window (as it was known) had its faintly comic side. Surely somewhere could be found to locate it, within one or other of the two local state-sector grammar schools, for which the Trust was ground landowner? But soon the humour of wasting hours of debate on a homeless window palled.

I also found it irksome to be treated throughout with deep suspicion and resentment by most of my fellow Trustees. They were Old Boys from the two schools in question: Battersea Grammar School and Sir Walter St. John School. All the Trust business was conducted with outward calm. There were no rows between the large majority of Old Boys and the two women appointed by the ILEA. My fellow ILEA-nominee hardly ever attended; and said nothing when she did. Yet we were treated with an unforgiving hostility, which I found surprising and annoying. A degree of misogyny was not unusual; yet often the stereotypical ‘good old boys’ were personally rather charming to women (‘the ladies, God bless ’em’) even while deploring their intrusion into public business.

But no, these Old Boys were not charming, or even affable. And their hostile attitude was not caused purely by misogyny. It was politics. They hated the Labour-run ILEA and therefore the two ILEA appointees on the Trust. It was a foretaste of arguments to come. By the late 1970s, the Conservatives in London, led by Councillors in Wandsworth (which includes Battersea), were gunning for the ILEA. And in 1990 it was indeed abolished by the Thatcher government.

More than that, the Old Boys on the Trust were ready to fight to prevent their beloved grammar schools from going comprehensive. (And in the event both schools later left the public sector to avoid that ‘fate’). So the Old Boys’ passion for their cause was understandable and, from their point of view, righteous. However, there was no good reason to translate ideological differences into such persistently rude and snubbing behaviour.

Here’s where the rulebook came into play. I was so irked by their attitude – and especially by the behaviour of the Trust’s Chair – that I resolved to nominate an alternative person for his position at the next Annual General Meeting. I wouldn’t have the votes to win; but I could publicly record my disapprobation. The months passed. More than a year passed. I requested to know the date of the Annual General Meeting. To a man, the Old Boys assured me that they never held such things, with something of a lofty laugh and sneer at my naivety. In reply, I argued firmly that all properly constituted civic bodies had to hold such events. They scoffed. ‘Well, please may I see the Trust’s standing orders?’ I requested, in order to check. In united confidence, the Old Boys told me that they had none and needed none. We had reached an impasse.

At this point, the veteran committee clerk, who mainly took no notice of the detailed discussions, began to look a bit anxious. He was evidently stung by the assertion that the Trust operated under no rules. After some wrangling, it was agreed that the clerk should investigate. Had I known the outcome, I should have cheered, or even jeered, at that very moment: I never saw any of the Old Boys again.

Several weeks after this meeting, I received through the post a copy of the Trust’s Standing Orders. They looked as though they had been typed in the late nineteenth century on an ancient typewriter. Nonetheless, the first point was crystal clear: all members of the Trust should be given a copy of the standing orders upon appointment. I was instantly cheered. But there was more, much more. Of course, there had to be an Annual General Meeting, when the Chair and officers were to be elected. And, prior to that, all members of the Trust had to be validly appointed, via an array of different constitutional mechanisms.

An accompanying letter informed me that the only two members of the Trust who were correctly appointed were the two ILEA nominees. I had more than won my point. It turned out that over the years the Old Boys had devised a system of co-options for membership among friends, which was constitutionally invalid. They were operating as an ad hoc private club, not as a public body. Their positions were automatically terminated; and they never reappeared.

In due course, the vacancies were filled by the various nominating bodies; and the Trust resumed its very minimal amount of business. Later, into the 1980s, the Trust did have some key decisions to make, about the future of the two schools. I heard that its sessions became quite heated politically. That news was not surprising to me, as I already knew how high feelings could run on such issues. These days, the Trust does have funds, from the eventual sale of the schools, and is now an active educational charity.

Personally, I declined to be renominated, once my first term of service on the Trust was done. I had wasted too much time on fruitless and unpleasant meetings. However, I did learn about the importance of the rulebook. Not that I believe in rigid adherence to rules and regulations. Often, there’s an excellent case for flexibility. But the flexibility should operate around a set of framework rules which are generally agreed and upheld between all parties.

Rulebooks are to be found everywhere in public life in constitutionalist societies. Parliaments have their own. Army regiments too. So do professional societies, church associations, trade unions, school boards, and public businesses. And many private clubs and organisations find them equally useful. Without a set of agreed conventions for the conduct of business and the constitution of authority, there’s no way of stopping arbitrary decisions – and arbitrary systems can eventually slide into dictatorships.

As it happens, the Old Boys on the Sir Walter St. John Trust were behaving only improperly, not evilly. I always regretted the fact that they simply disappeared from the meetings. They should at least have been thanked for their care for the Bogle Smith Window. And I would have enjoyed the chance to say, mildly but explicitly: ‘I told you so!’

Goodness knows what happened to these men in later years. I guess that they continued to meet as a group of friends, with a great new theme for huffing and puffing at the awfulness of modern womanhood, especially the Labour-voting ones. If they did pause to think, they might have realised that, had they been personally more pleasant to the intruders into their group, then there would have been no immediate challenge to their position. I certainly had no idea that my request to see the standing orders would lead to such an outcome.

Needless to say, the course of history does not hinge upon this story. I personally, however, learned three lasting lessons. Check to see what civic tasks involve before accepting them. Remain personally affable to all with whom you have public dealings, even if you disagree politically. And if you do join a civic organisation, always study the relevant rulebook. ‘I tried to tell them so!’ all those years ago – and I’m doing it now in writing. Moreover, the last of those three points is highly relevant today, when the US President and US Congress are locking horns over the interpretation of the US constitutional rulebook. May the rule of law prevail – and no prizes for guessing which side I think best supports that!
