
MONTHLY BLOG 157, HOW THE GEORGIANS CELEBRATED MIDWINTER (*)

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2024)

Variety was the spice of Midwinter festivities under the Georgians. There was no cultural pressure to conform to one standard format. Instead, people responded to diverse regional, religious and family traditions. And they added their own preferences too. Festivities thus ranged from drunken revelries to sober Puritan spiritual meditation, with all options in between.

It was the Victorians from the 1840s onwards – with the potent aid of Charles Dickens – who standardised Christmas as a midwinter family festivity. They featured Christmas trees, puddings, cards, presents, carol services, and ‘Father Christmas’. It’s a tradition that continues today, with some later additions. Thus, on Christmas Days in Britain since 1932, successive monarchs have recorded their seasonal greetings to the nation, by radio (and later TV).

Georgian variety, meanwhile, was produced by a continuance of older traditions, alongside the advent of new ones. Gift-giving at Christmas had the Biblical sanction of the Three Wise Men, bringing to Bethlehem gifts of gold, frankincense and myrrh. So the Georgians substituted their own luxury items. An appreciated gift, among the wealthy, was a present of fine-quality gloves. Embroidered gloves, made of lambskin, doeskin, or silk, were given to both men and women as Christmas or New Year gifts; these luxury items may be said therefore to have symbolised the hand of friendship. Interestingly, however, that custom, which was well established by 1700, was already on the wane by 1800 as fashions in clothing changed.

Fig.1: Add MS 78429, John Evelyn’s Doe-Skin Gloves,
17th century, British Library. Public domain.

The first illustration shows a fringed and embroidered glove once owned by the diarist John Evelyn. It was presented to him by the young Russian Tsar, Peter the Great, who, during his semi-clandestine stay in England in 1698, had resided in a property at Deptford owned by Evelyn. The headstrong visitor caused considerable damage. So Peter’s farewell gift to Evelyn might be seen not so much as a mark of friendship as something of a royal brush-off.

Presents can, after all, convey many messages. In the Georgian era, it was customary also for clients or junior officials to present gloves as Christmas or New Year gifts to their patrons or employers. The offering could be interpreted as thanks for past services rendered – or even as a bribe for future favours. That was especially the case if the gloves contained money, known in the early eighteenth century as ‘glove money’.

For example, the diarist Samuel Pepys, who worked for the Admiralty Board, had a pleasant surprise in 1664. A friendly contractor presented Pepys’ wife with gloves, which were found to contain within them forty pieces of gold. Pepys was overjoyed. (Today, by contrast, strict policies rightly regulate the reception of gifts or hospitality by civil servants and by MPs).

Meanwhile, individuals among the middling and lower classes in Georgian Britain did not usually give one another elaborate presents at Christmas. Not only did they lack funds, but the range of commercially available gifts and knick-knacks was then much smaller.

Instead, however, there was a flow of charitable giving from the wealthy to the ‘lower orders’. Churches made special Christmas collections for poor families. Many well-to-do heads of household gave financial gifts to their servants; as did employers to their workers. In order to add some grace to the transaction, such gifts of money were presented in boxes. Hence the Georgians named the day after Christmas ‘Boxing Day’ (later decreed a statutory holiday in 1871). Such activities provide a reminder that midwinter was – then as today – a prime time for thanking workers for past services rendered – as well as for general charitable giving.

Innovations were blended into older Midwinter traditions. House interiors in 1700 might well be festooned with old-style holly and ivy. By 1800, such decorations were still enjoyed. But, alongside, a new fashion was emerging. It was borrowed from German and Central European customs; and the best-known pioneer in Britain was George III’s Queen Charlotte of Mecklenburg-Strelitz. In 1800, she placed a small yew tree indoors and hung it with decorations. Later, a small fir was substituted, becoming the Victorians’ standard ‘Christmas Tree’, as it remains today.

Overlapping customs were, however, feted in the cheery Christmas carol, ‘Deck the Hall(s) with Boughs of Holly’. It was an ancient Welsh ballad, Nos Galan, habitually sung on New Year’s Day. Child singers were then treated with gifts of skewered apples, stuck with raisins. ‘Deck the Hall(s)’ was later given English lyrics in 1862 by a Scottish bard. And it’s still heartily sung – long after holly has lost its decorative primacy.

Many famous Christian hymns were also newly written in the Georgian era. They included: While Shepherds Watched … (1703); Hark! The Herald Angels Sing! (1739); and Adeste Fideles/ O Come All Ye Faithful (Latin verses 1751; English lyrics 1841). These all appeared in the 1833 publication of Christmas Carols, Ancient & Modern, edited by the antiquarian William Sandys. He had recovered many of these songs from the oral tradition. Now they were all recorded in print for future generations.

Notably, a number of the so-called Christmas carols were entirely secular in their message. Deck the Hall(s) with Boughs of Holly explained gleefully: ’Tis the season to be jolly/ Fa la la la la la la la la. No mention of Christ.

Similarly, the carol entitled The Twelve Days of Christmas (first published in London in 1780) records cumulative gifts from ‘my true love’ for the twelve-day festive period. They include ‘five gold rings; …  two turtle doves’ and a ‘partridge in a pear tree’. None are obviously Christian icons.

Fig.2: Anonymous (1780). Mirth without Mischief. London:
Printed by J. Davenport, George’s Court, for C. Sheppard, no. 8, Aylesbury Street, Clerkenwell.
pp. 5–16

And as for Santa Claus (first mentioned in English in the New York press, 1773), he was a secularised Northern European variant of Saint Nicholas, whose feast day falls on 6 December. But he had shed any spiritual role. Instead, he had become a plump ‘Father Christmas’, laughing merrily Ho! Ho! Ho! (Songs about his reindeer followed in the twentieth century).

Given this utterly eclectic mix of influences, it was not surprising that more than a few upright Christians were shocked by the secular and bacchanalian aspects of these midwinter festivities. Puritans in particular had long sought to purify Christianity from what they saw as ‘Popish’ customs. And at Christmas, they battled also against excesses of drinking and debauchery, which seemed pagan and un-Christian. One example was the rural custom of ‘wassailing’. On twelfth night, communities marched to orchards, banging pots and pans to make a hullabaloo. They then drank together from a common ‘wassail’ cup. The ritual, which did have pagan roots, was intended to encourage the spirits to ensure a good harvest in the coming year. Whether the magic worked or not, much merriment ensued.

Fig.3: A Fine and Rare 17th Century Charles II Lignum Vitae
Wassail Bowl, Museum Grade – Height: 21.5 cm (8.47 in); Diameter: 25 cm (9.85 in).
Sold by Alexander George, Antique Furniture Dealer, Faringdon, Oxfordshire:
https://alexandergeorgeantiques.com/17th-century-charles-ii-lignum-vitae-wassail-bowl-museum-grade/

For their opposition to such frolics, the Puritans were often labelled as ‘Kill-Joys’. But they strove sincerely to live sober, godly and upright lives. Moreover, there was no Biblical authority for licentious Christmas revelries. Such excesses were ‘an offence to others’ and, especially, a ‘great dishonour of God’. So declared a 1659 law in the Massachusetts Bay Colony, specifying penalties for engaging in such ‘superstitious’ festivities.

Zealous opposition to riotous Christmases was especially found among Nonconformist congregations such as the Presbyterians, Congregationalists, Baptists and Quakers. They treated 25 December, if it fell upon a weekday, just like any other day. People went soberly about their business. They fasted rather than feasted. Sober Christmases thus became customary in Presbyterian Scotland and in the Puritan colonies of New England. It was true that, over time, the strictest rules were relaxed. The Massachusetts ban was repealed in 1681 by a Royalist Governor of the colony. But ardent Puritans long distrusted all forms of ‘pagan’ Christmas excess.

One consequence was that people sought other outlets for midwinter revelry. A great example is Scotland’s joyous celebration of New Year’s Eve or Hogmanay. (The name’s origin is obscure). One ancient custom, known as ‘first footing’, declares that the first stranger to enter a house after midnight (or in the daytime on New Year’s Day) will be a harbinger of good or bad luck for the following year. An ideal guest would be a ‘tall dark stranger’, bearing a small symbolic gift for the household – such as salt, food, a lump of coal, or whisky. General festivities then ensue.

All these options allowed people to enjoy the ‘festive season’, whether for religious dedication – or to celebrate communally the midwinter and the hope of spring to come – or for a mixture of many motives.

No doubt, some Georgians then disliked the fuss. (Just as today, a persistent minority records a positive ‘hatred’ of Christmas). All these critics could share the words of Ebenezer Scrooge – the miser memorably evoked by Dickens in A Christmas Carol (1843). Scrooge’s verdict was: ‘Bah! Humbug!’

Yet many more give the salute: ‘Merry Christmas!’ Or on New Year’s Eve (but not before) ‘Happy Hogmanay!’ And, as for Scrooge: at the novel’s finale, he mellows and finally learns to love all his fellow humans. Ho! Ho! Ho!

ENDNOTES:

(*) First published in Yale University Press BLOG, December 2023: https://yalebooksblog.co.uk/2023/12/08/how-the-georgians-celebrated-christmas-by-penelope-j-corfield/


MONTHLY BLOG 140, A YEAR OF GEORGIAN CELEBRATIONS – 8: Annual Memorial Service at Bristol’s Arnos Vale Cemetery, to celebrate the life of India’s remarkable religious, social & educational reformer, Raja Ram Mohan Roy (1772-1833)

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2022)

Republic of India postage stamp (1964),
where his name appears with the alternative spelling ‘Mohun Roy’.

Eighteenth-century Britain witnessed a veritable ferment of ideas. Religious reformers and traditionalists within Christianity battled with one another, whilst religious sceptics, known as ‘freethinkers’, argued against all forms of revealed religion. It was a time for rethinking and renewal; and not just in Britain.

India’s age-old Hindu tradition was also witnessing its own upheavals. One of its most remarkable reformers was a Bengali thinker named Raja Ram Mohan Roy (1772-1833).1 He came from a well-established Brahmin family; and was well educated in an array of languages, although the precise details of his schooling remain disputed. He became familiar with Persian and Arabic studies, as well as with classical Sanskrit. Later he learned other languages, including Latin and Greek. Very evidently, he was a gifted linguist.

But Raja Ram Mohan Roy went further. During his formative years, he interacted with spiritual teachers from diverse religious traditions. One was a famous Baptist missionary in India, William Carey (1761-1834). A growing characteristic of Ram Mohan Roy’s own thought was his desire to see into the heart of religion, to find the one true source of godliness. And his companion wish was to purify religious observances, so that external conventions and rituals did not distract worshippers from the chance of a genuine religious experience.

Such an approach was very characteristic of fundamentalist religious reformers. However, Mohan Roy did not break from the Hindu faith to achieve his aims. He remained within its broad-based tradition, and tried to update its customs. Prominent among the targets which he sought to reform were polygamy; child marriage; and the caste system, whereby people were ‘allocated’ to one social position at birth and kept there by rigid custom. Roy was also vehement against the traditional practice of ‘sati’ or ‘suttee’, which required widows to sacrifice their lives on the funeral pyres of their deceased husbands.

These campaigns turned Ram Mohan Roy into not only a powerful religious moderniser but also a significant social and educational reformer. He was a liberal pioneer of women’s rights.2 He wrote prolifically. He founded educational institutions. He co-founded the Kolkata/Calcutta Unitarian Society and also founded the Brahma Samaj (a social reform group within Hinduism). He supported the use of English in Indian education. In some ways, then, he can be regarded as a vector for the spread of Western ideas into India, in that he wanted India to shed its outmoded customs and to become ‘modernised’, like its then rulers – the officials in the East India Company, for whom Roy had worked.3

At the same time, however, Ram Mohan Roy was also a great example of the rich eclecticism of the Hindu tradition. He believed in the inner ‘oneness’ of all religion. (This aspect of his thought appealed to many British and American Unitarians, who abjured Trinitarian Christianity to worship the one divine power). And Ram Mohan Roy clearly did not seek a personal redeemer. So for him there was little point in changing churches, when the divine can be worshipped everywhere: ‘God is one. He has no end. He exists in all the living things on the Earth’.

There were well-known later debates within India as to how far Roy was simply a ‘child of the West’. Yet that viewpoint misses the strength of his Hindu spiritualism. Moreover, he was enough of an Indian gentleman to accept the honorific title of ‘Raja’ (prince) from the Mughal Emperor Akbar II in 1830. A determined reformer, yes; but not a social revolutionary.

In September 1833, Raja Ram Mohan Roy was visiting Britain, as an imperial envoy from India. Staying at the small village of Stapleton, near Bristol, he fell ill suddenly and died of meningitis. He was initially buried quietly in the grounds of the house where he had died. But a decade later, his remains were re-interred at the new Arnos Vale Cemetery, at Brislington in East Bristol. This venue was not monopolised by any specific faith. It contains both an Anglican and a Nonconformist Mortuary Chapel; and the authorities made no objection to the inclusion of a devout Hindu.

Ram Mohan Roy’s grave, topped by an Indian Mausoleum, was a fitting component of this ecumenical resting place. At this spot, an annual commemoration of his life and teachings is held every September, at or near the date of his death. Dignitaries like the Mayor of Bristol and the Indian High Commissioner are joined by other Indians and Britons who wish to share in the remembrance service. There is also a fine statue of Mohan Roy on Bristol’s College Green. And at Stapleton, too, there is today a memorial plaque and a pedestrian walk, named in his honour.

He did not, of course, plan to die in Bristol. But for Raja Ram Mohan Roy, the apostle of spiritual oneness, there is a certain aptness in finding a peaceful resting-place among the dead of many faiths and none. For the history of Georgian Britain, too, Mohan Roy’s quest for spiritual enlightenment and social reform was part of the ferment of debates between believers and freethinkers.

Many globe-trotting Britons ventured to India in these years. Some were seeking colonial power and trading profits, while others, like William Carey, were intent on saving souls. Yet the exchange of ideas and peoples was not just one way. Where then are respects rightly paid to the remarkable Indian reformer, who was the ‘parent of the Bengal Renaissance’ and also a citizen of the world? Why, in Bristol’s Arnos Vale Cemetery, every September.

ENDNOTES:

1 See variously H.D. Sharma, Raja Ram Mohan Roy: The Renaissance Man (2002); D.C. Vyas, Biography of Raja Ram Mohan Roy (New Delhi, 2010); P. Kumari, Women, Social Customs and Raja Ram Mohan Roy (Patna, 2013).

2 Mohan Roy himself married three times. His first two wives predeceased him, and his third wife outlived him, without, of course, committing sati after his death.

3 Among a huge literature, see variously: J. Keay, The Honourable Company: A History of the East India Company (1991; 2017); S. Sen, Empire of Free Trade: The East India Company and Making of the Colonial Marketplace (Philadelphia, 1998); M. Chowdhury, Empire and Gunpowder: Military Industrialization and Ascendancy of the East India Company in India, 1757-1856 (New Delhi, 2022).

 


MONTHLY BLOG 92, HISTORIANS AT WORK THROUGH TIME

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Historians, who study the past, don’t undertake this exercise from some vantage point outside Time. They, like everyone else, live within an unfolding temporality. That’s very fundamental. Thus it’s axiomatic that historians, like their subjects of study, are all equally Time-bound.1

Nor do historians undertake the study of the past in one single moment in time. Postmodernist critics of historical studies sometimes write as though historical sources are culled once only from an archive and then adopted uncritically. The implied research process is one of plucking choice flowers and then pressing them into a scrap-book to some pre-set design.

On such grounds, critics of the discipline highlight the potential flaws in all historical studies. Sources from the past are biased, fallible and scrappy. Historians in their retrospective analysis are also biased, fallible and sometimes scrappy. And historical writings are literary creations only just short of pure fiction.2

Historians should welcome this dose of scepticism – always a useful corrective. Yet they entirely reject the proposition that trying to understand bygone eras is either impossible or worthless. Rebuttals to postmodernist scepticism have been expressed theoretically;3 and also directly, via pertinent case studies which cut through the myths and ‘fake news’ which often surround controversial events in history.4

When at work, historians should never take their myriad of source materials literally and uncritically. Evidence is constantly sought, interrogated, checked, cross-checked, compared and contrasted, as required for each particular research theme. The net is thrown widely or narrowly, again depending upon the subject. Everything is a potential source, from archival documents to art, architecture, artefacts and through the gamut to witness statements and zoological exhibits. Visual materials can be incorporated either as primary sources in their own right, or as supporting documentation. Information may be mapped and/or tabulated and/or statistically interrogated. Digitised records allow the easy selection of specific cases and/or the not-so-easy processing of mass data.

As a result, researching and writing history is a slow through-Time process – sometimes tediously so. It takes at least four years, from a standing start, to produce a big specialist, ground-breaking study of 100,000 words on a previously un-studied (or under-studied) historical topic. The exercise demands a high-level synthesis of many diverse sources, running to hundreds or even thousands. Hence the methodology is characteristically much more than a ‘reading’ of one or two key texts – although, depending upon the theme, at times a close reading of a few core documents (as in the history of political ideas) is essential too.

Mulling over meanings is an important part of the process too. History as a discipline encourages a constant thinking and rethinking, with sustained creative and intellectual input. It requires knowledge of the state of the discipline – and a close familiarity with earlier work in the chosen field of study. Best practice therefore enjoins writing, planning and revising as the project unfolds. For historical studies, ‘writing through’ is integral, rather than waiting until all the hard research graft is done and then ‘writing up’.5

The whole process is arduous and exciting, in almost equal measure. It’s constantly subject to debate and criticism from peer groups at seminars and conferences. And, crucially too, historians are invited to specify not only their own methodologies but also their own biases/assumptions/framework thoughts. This latter exercise is known as ‘self-reflexivity’. It’s often completed at the end of a project, although it’s then inserted near the start of the resultant book or essay. And that’s because writing serves to crystallise and refine (or sometimes to reject) the broad preliminary ideas, which are continually tested by the evidence.

One classic example of seriously through-Time writing comes from the historian Edward Gibbon. The first volume of his Decline & Fall of the Roman Empire appeared in February 1776. The sixth and final one followed in 1788. According to his autobiographical account, the gestation of his study dated from 1764. He was then sitting in the Forum at Rome, listening to Catholic monks singing vespers on Capitol Hill. The conjunction of ancient ruins and later religious commitments prompted his core theme, which controversially deplored the role of Christianity in the ending of Rome’s great empire. Hence the ‘present’ moments in which Gibbon researched, cogitated and wrote stretched over more than 20 years. When he penned the last words of the last volume, he recorded a sensation of joy. But then he was melancholic that his massive project was done.6 (Its fame and the consequent controversies last to this day; and form part of the history of history).

1 For this basic point, see PJC, ‘People Sometimes Say “We Don’t Learn from the Past” – and Why that Statement is Completely Absurd’, BLOG/91 (July 2018), to which this BLOG/92 is a companion-piece.

2 See e.g. K. Jenkins, ReThinking History (1991); idem (ed.), The Postmodern History Reader (1997); C.G. Brown, Postmodernism for Historians (Harlow, 2005); A. Munslow, The Future of History (Basingstoke, 2010).

3 J. Appleby, L. Hunt and M. Jacob, Telling the Truth about History (New York, 1994); R. Evans, In Defence of History (1997); J. Tosh (ed.), Historians on History (Harlow, 2000); A. Brundage, Going to the Sources: A Guide to Historical Research and Writing (Hoboken, NJ., 2017).

4 H. Shudo, The Nanking Massacre: Fact versus Fiction – A Historian’s Quest for the Truth, transl. S. Shuppan (Tokyo, 2005); Vera Schwarcz, Bridge across Broken Time: Chinese and Jewish Cultural Memory (New Haven, 1998).

5 PJC, ‘Writing Through a Big Research Project, not Writing Up’, BLOG/60 (Dec.2015); PJC, ‘How I Write as a Historian’, BLOG/88 (April 2018).

6 R. Porter, Gibbon: Making History (1989); D.P. Womersley, Gibbon and the ‘Watchmen of the Holy City’: The Historian and his Reputation, 1776-1815 (Oxford, 2002).


MONTHLY BLOG 91, PEOPLE SOMETIMES SAY: ‘WE DON’T LEARN FROM THE PAST’ AND WHY THAT STATEMENT IS COMPLETELY ABSURD

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

People sometimes say, dogmatically but absurdly: ‘We don’t learn from the Past’. Oh really? So what do humans learn from, then? We don’t learn from the Future, which has yet to unfold. We do learn in and from the Present. Yet every moment of ‘Now’ constitutes an infinitesimal micro-instant in an unfolding process. The Present is an unstable time-period, which is constantly morphing, nano-second by nano-second, into the Past. Humans don’t have time, in that split-second of ‘Now’, to comprehend and assimilate everything. As a result, we have, unavoidably, to learn from what has gone before: our own and others’ experiences, which are summed up as everything before ‘Now’: the Past.

It’s worth reprising the status of those temporal categories. The Future, which has not yet unfolded, is not known or knowable in its entirety. That’s a definitional quality which springs from the unidirectional nature of Time. It does not mean that the Future is either entirely unknown or entirely unknowable. As an impending temporal state, it may beckon, suggest, portend. Humans are enabled to have considerable information and expectations about many significant aspects of the Future. For example, it’s clear from past experience that all living creatures will, sooner or later, die in their current corporeal form. We additionally know that tomorrow will come after today, because that is how we habitually define diurnal progression within unilinear Time. We also confidently expect that in the future two plus two will continue to equal four; and that all the corroborated laws of physics will still apply.

And we undertake calculations, based upon past data, which provide the basis for Future predictions or estimates. For example, actuarial tables, showing age-related life expectancy, indicate group probabilities, though not absolute certainties. Or, to take a different example, we know, from expert observation and calculation, that Halley’s Comet is forecast to return into sight from Earth in mid-2061. Many, though not all, people alive today will be able to tell whether that astronomical prediction turns out to be correct or not. And there’s every likelihood that it will be.

Commemorating a successful prediction,
in the light of past experience:
a special token struck in South America in 2010 to celebrate
the predicted return to view from Planet Earth
of Halley’s Comet,
whose periodicity was first calculated by Edmond Halley (1656-1742)

Yet all this (and much more) useful information about the Future is, entirely unsurprisingly, drawn from past experience, observations and calculations. As a result, humans can use the Past to illuminate and to plan for the Future, without being able to foretell it with anything like total precision.

So how about learning from the Present? It’s live, immediate, encircling, inescapably ‘real’. We all learn in our own present times – and sometimes illumination may come in a flash of understanding. One example, as Biblically recounted, is the conversion of St Paul, who in his unregenerate days was named Saul: ‘And as he journeyed, he came near Damascus: and suddenly there shined round about him a light from heaven. And he fell to the earth, and heard a voice saying unto him, “Saul, Saul, why persecutest thou me?”’1 His eyes were temporarily blinded; but spiritually he was enlightened. Before then, Saul was one of the Christians’ chief persecutors, ‘breathing out threatenings and slaughter’.2 Perhaps a psychologist might suggest that his intense hostility concealed some unexpressed fascination with Christianity. Nonetheless, there was no apparent preparation, so the ‘Damascene conversion’ which turned Saul into St Paul remains the classic expression of an instant change of heart. But then he had to rethink and grow into his new role, working with those he had been attempting to expunge.

A secular case of sudden illumination appears in the fiction of Jane Austen. In Emma (1815), the protagonist, a socially confident would-be match-maker, has remained in ignorance of her own heart. She encourages her young and humble protégé, Harriet Smith, to fancy herself in love. They enjoy the prospect of romance. Then Emma suddenly learns precisely who is the object of Harriet’s affections. The result is wonderfully described.3 Emma sits in silence for several moments, in a fixed attitude, contemplating the unpleasant news:

Why was it so much worse that Harriet should be in love with Mr Knightley, than with Frank Churchill? Why was the evil so dreadfully increased by Harriet’s having some hope of a return? It darted through her, with the speed of an arrow, that Mr Knightley must marry no one but herself!

I remember first reading this novel, as a teenager, when I was as surprised as Emma at this development. Since then, I’ve reread the story many times; and I can now see the prior clues which Austen scatters through the story to alert more worldly-wise readers that George Knightley and Emma Woodhouse are a socially and personally compatible couple, acting in concert long before they both (separately) realise their true feelings. It’s a well drawn example of people learning from the past whilst ‘wising up’ in a single moment. Emma then undertakes some mortifying retrospection as she gauges her own past errors and blindness. But she is capable of learning from experience. She does; and so, rather more artlessly, does Harriet. It’s a comedy of trial-and-error as the path to wisdom.

As those examples suggest, the relationship of learning with Time is in fact a very interesting and complex one. Humans learn in their own present moments. Yet the process of learning and education as a whole has to be a through-Time endeavour. A flash of illumination needs to be mentally consolidated and ‘owned’. Otherwise it is just one of those bright ideas which can come and as quickly go. Effective learning thus entails making oneself familiar with a subject by repetition, cogitation, debating, and lots of practice. Such through-Time application applies whether people are learning physical or intellectual skills or both. The role of perspiration, as well as inspiration, is the stuff of many mottoes: ‘practice makes perfect’; ‘if at first you don’t succeed, try and try again’; ‘stick at it’; ‘never stop learning’; ‘trudge another mile’; ‘learn from experience’.

Indeed, the entire corpus of knowledge and experience that humans have assembled over many generations is far too huge to be assimilated in an instant. (It’s actually too huge for any one individual to master. So we have to specialise and share).

So that brings the discussion back to the Past. It stretches back through Time and onwards until ‘Now’. Of course, we learn from it. Needless to say, it doesn’t follow that people always agree on messages from former times, or act wisely in the light of such information. Hence when people say: ‘We don’t learn from the Past’, they probably mean that it does not deliver one guiding message, on which everyone agrees. And that’s right. It doesn’t and there isn’t.

One further pertinent point: there are rumbling arguments around the question – is the Past alive or dead? (With a hostile implication in the sub-text that nothing can really be learned from a dead and vanished Past.) But that’s not a helpful binary. In other words, it’s a silly question. Some elements of the past have conclusively gone, while many others persist through time.4 To take just a few examples, the human genome was not invented this morning; human languages have evolved over countless generations; and the laws of physics apply throughout.

Above all, therefore, the integral meshing between Past and Present means that we, individual humans, have also come from the Past. It’s in us as well as, metaphorically speaking, behind us. Thinking of Time as running along a pathway or flowing like a river is a common human conception of temporality. Other alternatives might envisage the Past as ‘above’, ‘below’, ‘in front’, ‘behind’, or ‘nowhere specific’. The metaphor doesn’t really matter as long as we realise that the Past pervades everything, including ourselves.

1 Holy Bible, Acts 9: 3-4.

2 Ibid, 9:1.

3 J. Austen, Emma: A Novel (1815), ed. R. Blythe (Harmondsworth, 1969), p. 398.

4 P.J. Corfield, ‘Is the Past Dead or Alive? And the Snares of Such Binary Questions’, BLOG/62 (Feb.2016).


MONTHLY BLOG 44, QUOTATIONS AND IRONY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

Quotations should never be mangled and should always be cited honestly, with due attention to context. Yes – absolutely yes.  It’s axiomatic for all scholarship – but also for proper communications. It does happen that words are taken out of context and twisted into another meaning. But it’s never right.

To take an example: if a theatre critic sees a controversial play and writes: ‘The very last thing that I’d say is that this production is brilliant’, then the theatre’s publicity team could put the critic’s name in lights alongside the quotation: ‘This production is brilliant’. Factually, those attributed words are correct. The critic did write them. Yet the truncated quotation gives the reverse meaning to that intended. Both the critic and any members of the audience, who were deceived into attending on the strength of the critic’s recommendation, have grounds for complaint.

Another potential for misunderstanding comes when heavy irony is taken at face value. In one of Shakespeare’s famous oratorical set-pieces, Mark Antony mourns the assassination of Caesar by Brutus and his allies with the repeated phrase: ‘And Brutus is an honourable man’ … [They are all] ‘honourable men’. The stress upon the repeated phrase, like a refrain, urges the Roman crowd to understand that the words mean the reverse of what they apparently say.

By the end, the citizens turn against the assassins: ‘They were traitors: honourable men!’1 On the face of it, Mark Antony has given Brutus a favourable character reference. In context, however, he stands condemned, not just as an assassin but as one who has basely betrayed his closest friend and colleague. ‘This was the unkindest cut of all’.

Nonetheless, there is a problem for anyone who uses irony. If the listeners or readers fail to get the implied message, then they will come to an erroneous conclusion. A Roman citizen who left the forum after the opening phrases of Antony’s speech (or who wasn’t listening carefully) could depart thinking: ‘I was sorry to hear of Caesar’s death but it must be acceptable as Brutus, a man of honour, explained why he had to do it, and Antony confirms that Brutus is an honourable man’.

Irony, then, is powerful but risky. It depends upon an attentive community between speaker/writer and audience/readers, which allows the words to be decoded successfully.

For historians, quoting from sources whose authors have long gone, there is always a challenge to understand meanings in their full context. When does a word or phrase in use mean its opposite? And did people in the past always get the hidden message?

When Jonathan Swift published his Modest Proposal for Preventing the Children of Poor People from being a Burthen to their Parents or Country, and for Making them Beneficial to the Publick (1729), he provided an exercise in sustained irony that revealed itself through the moral enormity of the proposed solution. ‘A young healthy child well nursed, is, at a year old, a most delicious nourishing and wholesome food.’ Poor parents would solve their financial problems by selling their children, who would provide good food for the rich. Infanticide? Cannibalism? Class callousness? Swift does not advocate these. Instead, his irony conveys outrage at the poverty of the poor and the indifference of the rich.

Jonathan Swift’s famous use of sustained irony in his Modest Proposal (1729)

Why am I writing about this now? Because I am currently thinking about the use of evidence and the dangers of inadvertent misinterpretation. The question really arises when using a lot of sources in a historical collage.

I have just done that in an essay, published in Social History, on eighteenth-century Britain as an ‘Age of Infidelity’.2  It cites at least 75 contemporary verdicts on the state of religion and irreligion. Many are book titles, some are declarations within books, some are printed texts reporting upon speeches and sermons.

A proportion of these works were clearly using overblown rhetoric, uttered in times of crisis. When John Bowdler agonised in 1798 that the British nation’s lack of faith seemed to portend nothing less than ‘the eradicating [of] Christianity in this Quarter of the World’,3 it is hard not to smile. Religion had more staying power than he was ready to admit. On the other hand, Bowdler’s deep anxiety was typical of many committed Christians in the later 1790s, when Britain was struggling in the prolonged war against France. Why such extreme danger? It could only be that God was angry with the nation for its irreligious ways.

Bowdler not only wrote to chastise the people but took practical steps to offer a remedy. He co-founded the Church Building Society, which provided new places of worship in the newly expanding towns.4 In my Social History essay, I am able to give further information about Bowdler, as he was a particularly notable contributor to the debates. His name on its own attracts interest. Two of his children, Thomas and Henrietta Bowdler, removed all the saucy bits from Shakespeare, in order to make the bard acceptable for respectable family reading. Their reward was much public ridicule – and the invention of a new verb ‘to bowdlerise’.5 Such contextual information illuminates the era’s culture wars, in which the Bowdlers were eager partisans.

But, in an essay of approximately 7,000 words, it’s not possible to devote equal attention to the other 74 eighteenth-century contemporaries – laypeople as well as clergymen – who expressed views on the state of religion. It would overrun the restricted length of a scholarly essay – and confuse the unfolding analysis. Naturally, I checked all the sources that I used, for both content and context. And I especially searched for rival tracts, arguing that the eighteenth century was an ‘Age of Faith’ or equivalent.

Is it possible that I missed some exercises in irony? Logically, yes, although I hope not. (Please check my sources, all duly footnoted!) Sustained Swiftian-style irony is comparatively rare. Moreover, people writing on the state of irreligion tended to be heated and passionate rather than coolly playing with double meanings.

What I do claim to have found is not a debate without the potential for irony but instead one which circulated a new eighteenth-century cliché. It stated that the era was ‘an Age of Infidelity’. By this phrase, the commentators did not refer to people’s unfaithfulness to their marriage vows. That constituted ‘conjugal infidelity’, plentiful enough but far from unique to the eighteenth century. Nor did the commentators refer to apostasy: Christians in this period were not turning into Islamic or Jewish or any other religious variety of ‘infidels’.

No, it was the spread of secularisation that was being noted, chiefly in alarm: the advent of a society, officially Christian, where people had the option of not going to church, not following Christian lifestyles, and (even) not sharing Christian beliefs. It is possible that some eighteenth-century references to the ‘Age of Infidelity’ were meant ironically. But, if all that the commentators left were the unvarnished words, then they are liable to be read literally.

Ironists beware. Unless your double meaning is suitably signalled, it will become lost in time.

1  W. Shakespeare, Julius Caesar (written 1599/1600), Act 3, scene 2.

2  P.J. Corfield, ‘“An Age of Infidelity”: Secularisation in Eighteenth-Century England’, Social History, 39 (2014), pp. 229-47; available via Taylor & Francis publishers online = www.tandfonline.com.

3  J. Bowdler, Reform or Ruin: Take Your Choice! (Dublin, 1798), p. 21.

4  For the CBS, now part of the National Churches Trust, see www.churchplansonline.org.

5  See Wikipedia, sub Thomas Bowdler (1754-1825): en.wikipedia.org.
