MONTHLY BLOG 30, BUT PEOPLE OFTEN ASK: HISTORY IS REALLY POLITICS, ISN’T IT? SO WHY SHOULDN’T POLITICIANS HAVE THEIR SAY ABOUT WHAT’S TAUGHT IN SCHOOLS?

 If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Two fascinating questions, to which my response to the first is: No – History is bigger than any specific branch of knowledge – it covers everything that humans have done, which includes lots besides Politics. Needless to say, such a subject lends itself to healthy arguments, including debates about ideologically-freighted religious and political issues.

But it would be dangerous if the study of History were to be forced into a strait-jacket by the adherents of particular viewpoints, buttressed by power of the state. (See my April 2013 BLOG). By the way, the first question can also be differently interpreted to ask whether all knowledge is really political? I return to that subtly different issue below.*

Meanwhile, in response to the second question: I agree that politicians could do with saying and knowing more about History. Indeed, there’s always more to learn. History is an open-ended subject, and all the better for it. Because it deals with humans in ever-unfolding Time, there is always more basic data to incorporate. And perspectives upon the past can gain significant new dimensions when reconsidered in the light of changing circumstances.

Yet the case for an improved public understanding of History is completely different from arguing that each incoming Education Secretary should re-write the Schools’ History syllabus. Politicians are elected to represent their constituents and to take legislative and executive decisions on their behalf – a noble calling. In democracies, they are also charged to preserve freedom of speech. Hence space for public and peaceful dissent is supposed to be safeguarded, whether the protesters be many or few.

The principled reason for opposing attempts at political control of the History syllabus is based upon the need for pluralism in democratic societies. No one ‘side’ or other should exercise control. There is a practical reason too. Large political parties are always, whether visibly or otherwise, based upon coalitions of people and ideas. They do not have one ‘standard’ view of the past. In effect, to hand control to one senior politician means endorsing one particular strand within one political party: a sort of internal warfare, waged not only against the wider culture but also against the wider reaches of his or her own political movement.

When I first began teaching, I encountered a disapproving professor of markedly conservative views. When I told him that the subject for my next class was Oliver Cromwell, he expressed double discontent. He didn’t like either my gender or my politics. He thought it deplorable that a young female member of the Labour party, and an elected councillor to boot, should be indoctrinating impressionable students with the ‘Labour line on Cromwell’. I was staggered. And laughed immoderately. Actually, I should have rebuked him but his view of the Labour movement was so awry that it didn’t seem worth pursuing. Not only do the comrades constantly disagree (at that point I was deep within the 1971 Housing Finance Act disputes) but too many Labour activists show a distressing lack of interest in History.

Moreover, Oliver Cromwell is hard to assimilate into a simplistic narrative of Labour populism. On the one hand, he was the ‘goodie’ who led the soldiers of the New Model Army against an oppressive king. On the other hand, he was the ‘baddie’ who suppressed the embryonic democrats known as the Levellers and whose record in Ireland was deeply controversial. Conservative history, incidentally, has the reverse problem. Cromwell was damned by the royalists as a Regicide – but simultaneously admired as a successful leader who consolidated British control in Ireland, expanded the overseas empire, and generally stood up to foreign powers.1

Interestingly, the statue of Oliver Cromwell, prominently sited in Westminster outside the Houses of Parliament, was proposed in 1895 by a Liberal prime minister (Lord Rosebery), unveiled in 1899 under a Conservative administration, and renovated in 2008 by a Labour government, despite a serious proposal in 2004 from a Labour backbencher (Tony Banks) that the statue be destroyed. As it stands, it highlights Cromwell the warrior, rather than (say) Cromwell the Puritan or Cromwell the man who brought domestic order after civil war. And, at his feet, there is a vigilant lion, whose British symbolism is hard to miss.2

Statue of Oliver Cromwell outside the Houses of Parliament, with lion at his feet
Or take the very much more recent case of Margaret Thatcher’s reputation. That is now beginning its long transition from political immediacy into the slow ruminations of History. Officially, the Conservative line is one of high approval, even, in some quarters, of untrammelled adulation. On the other hand, she was toppled in 1990 not by the opposition party but by her own Tory cabinet, in a famous act of ‘matricide’. There is a not-very concealed Conservative strand that rejects Thatcher outright. Her policies are charged with destroying the social cohesion that ‘true’ conservatism is supposed to nurture; and with strengthening the centralised state, which ‘true’ conservatism is supposed to resist.3 Labour’s responses are also variable, all the way from moral outrage to political admiration.

Either way, a straightforward narrative that Thatcher ‘saved’ Britain is looking questionable in 2013, when the national economy is obstinately ‘unsaved’. It may be that, in the long term, she will feature more prominently in the narrative of Britain’s conflicted relationship with Europe. Or, indeed, as a Janus-figure within the slow story of the political emergence of women. Emmeline Pankhurst (below L) would have disagreed with Thatcher’s policies but would have cheered her arrival in Downing Street. Thatcher, meanwhile, was never enthusiastic about the suffragettes but never doubted that a woman could lead.4

(L) Emmeline Pankhurst orating in Trafalgar Square; (R) statue of Margaret Thatcher orating in the Commons
Such meditations are a constituent part of the historians’ debates, as instant journalism moves into long-term analysis, and as partisan heat subsides into cooler judgment. All schoolchildren should know the history of their country and how to discuss its meanings. They should not, however, be pressurised into accepting one particular set of conclusions.

I often meet people who tell me that, in their school History classes, they were taught something doctrinaire – only to discover years later that there were reasonable alternatives to discuss. To that, my reply is always: well, bad luck, you weren’t well taught; but congratulations on discovering that there is a debate and deciding for yourself.

Even in the relatively technical social-scientific areas of History (such as demography) there are always arguments. And even more so in political, social, cultural, and intellectual history. But the arguments are never along simple party-political lines, because, as argued above, democratic political parties don’t have agreed ‘lines’ about the entirety of the past, let alone about the complexities of the present and recent-past.

Lastly,* how about broadening the opening question? Is all knowledge, including the study of History, really ‘political’ – not in the party-political sense – but as expressing an engaged worldview? Again, the answer is No. That extended definition of ‘political’ takes the term, which usefully refers to government and civics, too far.

Human knowledge, which does stem from, reflect and inform human worldviews, is hard-won: gained not from dogma but from research and debate, followed by more research and debate. It’s human, not just political. It’s shared down the generations. And between cultures. That’s why it’s vital that knowledge acquisition should not be dictated by any temporary power-holders, of any political-ideological or religious hue.

1 Christopher Hill has a good chapter on Cromwell’s Janus-faced reputation over time, in God’s Englishman: Oliver Cromwell and the English Revolution (1970), pp. 251-76.

2 Statue of Cromwell (1599-1658), erected outside Parliament in 1899 at the tercentenary of his birth: see www.flickr.com, kev747’s photostream, photo taken Dec. 2007.

3 Contrast the favourable but not uncritical account by C. Moore, Margaret Thatcher, the Authorised Biography, Vol. 1: Not for Turning (2013) with tough critiques from Christopher Hitchens and Karl Naylor: see www.Karl-Naylor.blogspot.co.uk, entry for 23 April 2013.

4 Illustrations (L) photo of Emmeline Pankhurst (1858-1928), suffragette leader, orating in Trafalgar Square; (R) statue of Margaret Thatcher (1925-2013), Britain’s first woman prime minister (1979-90), orating in the Commons: see www.parliament.uk.


MONTHLY BLOG 29, SHOULD EACH SECRETARY OF STATE FOR EDUCATION REWRITE THE UK SCHOOLS HISTORY SYLLABUS?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

The answer is unequivocally No. (Obvious really but worth saying still?)

History as a subject is far, far too important to become a political football. It teaches about conflict as well as compromise; but that’s not the same as being turned into a source of conflict in its own right. Direct intervention by individual politicians in framing the History syllabus is actively dangerous.


Rival supporters of King and Parliament in the 1640s civil wars, berating their opponents as ‘Roundhead curs’ and ‘Cavalier dogs’: the civil wars should certainly appear in the Schools History syllabus but they don’t provide a model for how the syllabus should be devised.

There are several different issues at stake. For a start, many people believe that the Schools curriculum, or prescriptive framework, currently allots too little time to the study of History. There should be more classes per week. And the subject should be compulsory to the age of sixteen.1  Those changes would in themselves greatly enhance children’s historical knowledge, reducing their recourse to a mixture of prevalent myths and cheerful ignorance.

A second issue relates to the balance of topics within the current History syllabus, which specifies the course contents. I personally do favour some constructive changes. There is a good case for greater attention to long-term narrative frameworks,2  alongside high-quality in-depth studies.

But the point here is: who should actually write the detailed syllabus? Not individual historians and, above all, not individual politicians. However well-intentioned such power-brokers may or may not be, writing the Schools History syllabus should be ultra vires: beyond their legal and political competence.

The need for wide consultation would seem obvious; and such a process was indeed launched. However, things have just moved into new territory. It is reported that the Education Secretary has unilaterally aborted the public discussions. Instead, the final version of the Schools History syllabus, revealed on 7 February 2013, bears little relation to previous drafts and discussions.3 It has appeared out of the (political) blue.

Either the current Education Secretary acted alone, or perhaps he had some unnamed advisers working behind the scenes. Where is the accountability in this mode of procedure? Even some initial supporters of syllabus revision have expressed their dismay and alarm.

Imagine what Conservative MPs would have said in 2002 if David Blunkett (to take the best known of Blair’s over-many Education Ministers) had not only inserted the teaching of Civics into the Schools curriculum as a separate subject;4 but had written the Civics syllabus as well. Or if Blunkett had chosen to rewrite the History syllabus at the same time?

Or imagine what Edmund Burke, the apostle of moderate Toryism, would have said. This eighteenth-century politician-cum-political theorist, who was reportedly identified in 2008 as ‘the greatest conservative ever’ by the current Education Secretary,5 was happy to accept the positive role of the state. Yet he consistently warned of the dangers of high-handed executive power. The authority of central government should not be untrammelled. It should not be used to smash through policies in an arbitrary manner. Instead Burke specifically praised the art of compromise or – a better word – of mutuality:

All government, indeed every human benefit and enjoyment, every virtue, and every prudent act, is founded on compromise and barter.6

An arbitrary determination of the Schools History syllabus further seems to imply that the subject not only can but ought to be moulded by political fiat. Such an approach puts knowledge itself onto a slippery slope. ‘Fixing’ subjects by political will (plus the backing of the state) leads to intellectual atrophy.

To take a notoriously extreme example, Soviet biology was frozen for at least two generations by Stalin’s doctrinaire endorsement of Lysenko’s environmental genetics.7 A dramatic rise in agrarian productivity was promised, without the need for fertilisers (or more scientific research). Stalin was delighted. Down with the egg-heads and their slow research. Lysenko’s critics were dismissed or imprisoned. But Lysenkoism did not work. And, after unduly long delays, his pseudo-science was finally discredited.

A rare photo of Stalin (back R) gazing approvingly at Trofim Lysenko (1898-1976)
speaking from the rostrum in the Kremlin, 1935

In this case, the Education Secretary is seeking to improve schoolchildren’s minds rather than to improve crop yields. But declaring the ‘right’ answer from the centre is no way to achieve enlightenment. Without the support of the ‘little platoons’ (to borrow another key phrase from Burke), the proposed changes may well prove counter-productive in the class-room. Many teachers, who have to teach the syllabus, are already alienated. And, given that History as a subject lends itself to debate and disagreement, pupils will often learn different lessons from those intended.

Intellectual interests in an Education Secretary are admirable. The anti-intellectualism of numerous past ministers (including too many Labour ones) has been horribly depressing. But intellectual confidence, tipped into arrogance, can be taken too far. Another quotation to that effect is often web-attributed to Edmund Burke, though it seems to come from Albert Einstein. He warned that powerful people should wisely appreciate the limits of their power:

Whoever undertakes to set himself up as a judge of Truth and Knowledge is shipwrecked by the laughter of the gods.8

1 That viewpoint was supported in my monthly BLOG no.23 ‘Why do Politicians Undervalue History in Schools’ (Oct. 2012): see www.penelopejcorfield.co.uk.

2 I proposed a long-span course on ‘The Peopling of Britain’ in History Today, 62/11 (Nov. 2012), pp. 52-3.

3 See D. Cannadine, ‘Making History: Opportunities Missed in Reforming the National Curriculum’, Times Literary Supplement, 15 March 2013, pp. 14-15; plus further responses and a link to the original proposals in www.historyworks.tv

4 For the relationships of History and Civics, see my monthly BLOG no.24 ‘History as the Staple of a Civic Education’, www.penelopejcorfield.co.uk.

5 Michael Gove speech to 2008 Conservative Party Annual Conference, as reported in en.wikipedia.org/wiki/Michael_Gove, consulted 3 April 2013.

6 Quotation from Edmund Burke (1729-97), Second Speech on Conciliation with America (1775). For further context, see D. O’Keeffe, Edmund Burke (2010); I. Kramnick, The Rage of Edmund Burke: Portrait of an Ambivalent Conservative (New York, 1977); and F. O’Gorman (ed.), British Conservatism: Conservative Thought from Burke to Thatcher (1986).

7 Z. Medvedev, The Rise and Fall of T.D. Lysenko (New York, 1969).

8 Albert Einstein (1879-1955), in Essays Presented to Leo Baeck on the Occasion of his Eightieth Birthday (1954), p. 26. The quotation is sometimes (but wrongly) web-attributed to Edmund Burke’s critique of Jacobin arrogance in his Preface to Brissot’s Address to his Constituents (1794).


MONTHLY BLOG 28, ANSWERING QUESTIONS POST SEMINAR PAPERS/ LECTURES

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

If post-seminar questions are less memorable than the papers or lectures which precede them, then the answers tend to be even less anecdotable. I can think of only a handful, among thousands of intellectual encounters, which remain in my memory.

Nevertheless, answers in an academic setting (as in a political one) need to meet certain criteria. They can enhance a good presentation. Wrongly handled, however, answers can backfire; at worst, they can ruin an apparently successful paper or lecture by failing to rebut a fundamental criticism.

Hence the overwhelming rule is to reply rather than to evade the question. Nothing is more annoying to an audience than detecting that the presenter is intellectually absconding. If the speaker can’t immediately answer (it happens to us all), the best reply is: ‘That’s a great question. I don’t know the answer off-hand; but I will check it out and get back to you’.

On rare occasions, it is acceptable to prevaricate. Queen Elizabeth I was once in a political quandary. In response to the strong advice of a parliamentary deputation in 1586 that she execute her close relative and fellow monarch, Mary Queen of Scots, Elizabeth equivocated by giving them what she herself honestly termed an ‘answer, answerless’.

In other words, she would not say.2 Yet very few scholars find themselves walking the same sort of political highwire upon which Elizabeth I walked coolly for years. Academic waffle is thus best avoided. I have done it myself but always felt suitably remorseful afterwards.

The academic cut-and-thrust is instead predicated upon an open exchange of views and, if need be, a frank confession of an inability to answer immediately, rather than a fudge-and-mudge.

But, while too much evasive verbiage can be disappointing, too much brevity can prove equally annoying. One terse response that I can remember came from Balliol’s Christopher Hill. It was in a series of interviews with senior historians,3 in which some staple questions had been supplied by the organisers. As the interviewer, I was allowed to improvise but also requested to cover the basics. Accordingly I asked politely: ‘Would you like to explain your methodology?’ It was a relevant question, since Hill had been sternly criticised in 1975 by his fellow historian J.H. Hexter for the alleged sin of being a ‘lumper’. Even more damagingly, Hexter accused Hill of being seriously unprofessional by quoting selectively from the sources, to support his big argument.4 ‘Lumpers’, by the way, lump everything together to form one big picture, while ‘splitters’ (of whom Hexter was a pre-eminent example) demur and say: ‘No, hang on – things are really much more complicated than that’.

Nonetheless, when invited to comment, Christopher Hill replied, gruffly: ‘No’. Like many of his generation, he bristled at the very word ‘methodology’. I laughed and continued to the next question, which was a mistake on my part. I should have changed the wording and tried again. In the event, the unsatisfactory exchange was cut from the final version of the interview. Not that there was any doubt that Christopher Hill was a ‘lumper’. Many (though probably not most) historians are. Yet Hill did not accept that he distorted or read sources selectively. In my view, it would have been best for him to restate a firm rebuttal of Hexter. But Hill would probably have responded, not ‘who cares?’ (he did), but ‘read my books and judge for yourselves’.

Single-word replies, of the ilk of ‘Yes’ and ‘No’, should thus be avoided as a general rule. They generate an initial laugh, especially when following an over-long and tedious question. Yet single-word replies are not playing fair with the questioner or the audience. They appear to give but don’t really. It is ok to start with a single brisk word, on the other hand, provided that the speaker then justifies that verdict.

So … not too short but also … not too lengthy. In my experience (and it’s a fault that I share) most answers are too long. It’s tempting to give a reprise of the paper or lecture. But that’s a mistake. A crisp reply, to the point and nothing more, is best. It also leaves time for more questions.

Three specific tips for respondents. When first listening to a question, it can be difficult to grasp the real point and simultaneously to formulate a good answer. The best way to cope is to start with a ‘holding’ reply: such as ‘That’s an interesting question’ or ‘I’m glad that you raised that point’. During the brief postponement, it’s amazing how often a reply formulates itself in one’s mind. But it’s best to use many variants of such ‘holding’ replies. It sounds too saccharine if every question is welcomed with the same apparent rapture. Incidentally, the reverse also sounds false. A former MP for Battersea was prone to start every reply with ‘I welcome your criticism’ even if none was offered. It eventually became something of a joke, which was counter-productive.

A second tip is to have a sheet of paper discreetly to hand and always to jot down a short note, summarising the topic that’s been raised. Having that reminder is especially useful in the event of two-pronged questions. When answering one half of a query, it’s too easy to forget about the other half. A short note concentrates the mind. In the long run, too, awareness of the points raised is personally invaluable. A free consultation with experts. Soon after every public presentation, I turn the list into a personal debriefing, noting all points that need clearer explication next time; and especially noting all criticisms of my main argument, so that I can decide how to refute them next time (or, sometimes, to amend my own case).

Which brings me to the third and most important piece of advice. It’s fine to give way graciously to challenges on all sorts of points, especially if one is in the wrong. Yet if the critique is focused upon the absolute core of one’s argument, it is essential to stand fast. I once heard the historian Lawrence Stone, another well-known ‘lumper’, confront a fundamental criticism of his latest publication.5 He began frankly: ‘Oh, dear, I think I’ve been holed below the water-line’. Then, with a cheerful laugh (shared with the audience), he rallied, with words to the effect that: ‘Your evidence/argument, although important, does not invalidate my central case’. Stone then, on the hoof, thought through his response to the fundamental (and valid) criticism, without rancour or any sign of being flustered. It was a sparkling moment.

Sometimes, there is not one single ‘right’ answer; but there is a right process of debate. That’s the aim. And it’s nice to win the argument as well. Which means keeping on one’s toes intellectually. Having given the presentation, don’t relax too soon. Keep replies crisp and pertinent. And, basically, enjoy the dialectic. Out of reasoned argument comes … knowledge.

1 From Icon Archive, at www.icongal.com: downloaded 22 February 2013.

2 Elizabeth I’s non-reply was nonetheless gracefully worded: ‘[I] pray you to accept my thankfulness, excuse my doubtfulness, and take in good part my answer, answerless.’

3 ‘Christopher Hill with Penelope Corfield’ (1986), in series DVD Video Interviews with Historians, available from London University’s online store: www.store.london.ac.uk.

4 J.H. Hexter, ‘The Historical Method of Christopher Hill’, Times Literary Supplement, 25 Oct. 1975, repr. in J.H. Hexter, On Historians: Reappraisals of Some of the Makers of Modern History (1979), pp. 227-51; with riposte by C. Hill, ‘The Burden of Proof’, in Times Literary Supplement, 7 Nov. 1975, p. 1333.

5 See Lawrence Stone (1919-1999) and J.C.F. Stone, An Open Elite? England, 1540-1880 (1984); and alternative view in S.E. Whyman, ‘Land and Trade Revisited: The Case of John Verney, London Merchant and Baronet, 1660-1720’, London Journal, 22 (1997), pp. 16-32.


MONTHLY BLOG 27, ASKING QUESTIONS POST SEMINAR PAPERS/LECTURES

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

What?

What? what? what? Always good to ask questions. Not always easy to manage a good one. In the debates following the thousands of public lectures and seminar papers that I’ve heard, a few examples stand out.

One was simplicity itself. It caught out a senior figure on a point of detail that refuted her argument – which she should have known but didn’t (or had forgotten). The question took five words: ‘What about the Quebec Act?’ Under this legislation (1774) Britain allowed freedom of worship to the French-speaking Quebec Catholics and enabled them to swear allegiance to the British crown without reference to Protestantism. It was a major factor in preventing the potentially rebellious province from joining the American colonial revolt. This flexibility ran contrary to the speaker’s stress upon the immovable Protestantism of eighteenth-century British state policy. There were various possible replies, such as: it was the exception to prove the rule. But she fell silent and the chair took the next question. Since then, I often think, when listening to a lecture: Is there a Quebec Act equivalent knock-down? Often there isn’t. But, if there is, it should always be done with great simplicity.

Another was a question that I asked after a public lecture (not necessarily the best; simply one that I remember). In fact, interventions from the floor are much more forgettable than the preceding oration, which is one reason not to worry too much about what to ask. In this case, a polemical speaker had castigated all historians who used anachronistic terms instead of sticking exclusively to the language of the relevant past period. Then, oblivious of his own strictures, he defined the eighteenth-century European states (including Britain) as ancien regimes. But – whether ‘ancien’ be translated as ‘old’ or ‘former’ – this descriptive term is clearly retrospective. From the floor, I argued that the historians’ art entails not only studying past societies but also communicating their findings about the past in the language of a later day. So yes to linguistic care and attention to definitions; but no to linguistic obscurantism and a quest for the impossible. Otherwise historians of pre-Conquest England would have to delete all words derived from Norman French; historians of the pre-speech era would have to grunt; and so forth. In the light of his own retrospective terminology, would the speaker like to reconsider his criticisms of others? He replied; but, it was generally agreed, not convincingly.

Those two examples reveal two possible approaches to asking questions: either working from prior knowledge; or generating a debating point from the content of the talk. Both approaches are equally valid. The point of asking questions is constructive: to probe the case that has been presented and to extend the collective discussion. A good debate helps speakers by giving them a free consultancy, allowing them to refine their arguments before bursting into print. And ditto: good discussions help listeners to stretch their minds; to learn how to joust intellectually; and to contribute to the advancement of knowledge.

Obviously enough, beginners giving their first paper should be treated comparatively gently, but not to the extent of allowing serious errors to pass unchallenged. And senior performers should be given the compliment of a bracing set of questions, which they will expect.

Most enquiries start from a wholesome quest for further information or clarification. What did you mean by statement A? How do you define concept B? Did you also check source C? … How good is the evidence for X? Can that proposition not be tested against Y? And what are the implications of Z? All of those approaches are useful. Another substantial range of questions focus upon the speakers’ methods of classification, selection, or organisation of research material. Challenges are especially required if the criteria have not been well explained in the presentation. Social classification systems, in particular, always benefit from debate, whether focusing upon class; ethnicity; nationality; or any other special identities. One phenomenon that is often under-studied is the extent of intermarriage between ostensibly different groups: ask about that.

Meanwhile, a minority of questions, which are often the best, take the form of a conceptual or philosophical depth-charge or counter-argument. Listen to the general argument and think: could the reverse or something very different be the case instead? That may mean playing devil’s advocate. But, intellectually, ‘opposition is true friendship’, to quote William Blake.1 Above all, it’s good to listen closely to the speakers, in order to identify their often-buried fundamental assumptions – and then challenge those. It’s rare that such interventions fail to stimulate. Sometimes speakers are surprised; sometimes indignant; but they are generally gratified to have been listened to with serious attention.


From William Blake’s Marriage of Heaven and Hell (Bodleian Library copy, 1790, fo. 20)
showing the writhing serpent of knowledge and the enigmatically faded words ‘Opposition is True Friendship’

My former supervisor, Jack Fisher, the economic history guru of LSE, was famed for provocative depth-charges, which he signalled with the opening words: ‘I know nothing about this but …’. However, his formula is best used sparingly. I have heard others bodge the same tactic, leading audiences to wonder why such a self-declared ignoramus is wasting everyone’s time with fatuous questions.

Given the above range of possibilities, postgraduate students should be encouraged to start with short, punchy wholesome-quests-for-information. In that way, they get used to the invariable stir of people turning round to look at the questioner, which can be disconcerting for beginners. Then, in time, students should progress to making longer enquiries and eventually to offering counter-arguments. My own system also requires that, after the first term at a new seminar, postgraduates ask at least one question per term, rising to a specified larger number as they move through their four years of study. That instruction sounds a bit mechanical. But it’s actually easier to ask a question when one has determined beforehand to do so. Otherwise, a lot of time is spent dithering: shall I, shan’t I? Yes, go for it.

Coda: I’ll end with a personal anecdote on heckling. It’s not something that I often do. But once I heckled, unintentionally, and found that I had posed a great question or, rather, prompted a great response. It happened in the early 1970s, at a public debate in the University of London’s Beveridge Hall, with perhaps two hundred dons in attendance. Two eminent historians, Keith Thomas and Hugh Trevor-Roper, had jousted fiercely in print about seventeenth-century witchcraft. They were invited to a special debate to continue the argument. But face-to-face, as often happens, the antagonists were very polite to each other. The occasion as a whole proved to be a damp squib.

There was, however, a moment of excitement. One of the speakers referred rather contemptuously to ‘useless old women’ and, without intending to do so, I found that I had cried out ‘Shame!’ Everyone around me recoiled. The speakers said nothing. But the chair of the meeting, the historian Joel Hurstfield, responded with aplomb: ‘Madam, contain your just indignation!’ His old-fashioned courtesy effectively rebuked my uncouthness. Yet he upheld my complaint, accepting that the tone of the debate had been too dismissive of the women accused of witchcraft. Immediately, the people around me smiled with relief and reversed their physical recoil. The debate was resumed, and I don’t suppose anyone else remembers the exchange. Nonetheless, I have waited ever since (both in politics and as an academic) for someone to heckle when I’m in the chair, to see if I can respond as brilliantly. It hasn’t happened yet; but maybe one day … In the meantime, let there be questions: what? what? what?

1 From William Blake, The Marriage of Heaven and Hell (1793), fol. 20.


MONTHLY BLOG 26, WORST AND BEST ACADEMIC LECTURES THAT I’VE HEARD

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2013)

Apart from the routine lectures that form the bread-and-butter of an academic’s job, we constantly give special lectures and/or papers. These presentations are made to a miscellany of research seminars, public meetings, specialist societies, academic conferences and other outlets, at home and overseas. From the early 1970s onwards my private log tells me that I’ve given almost 300 of these extra performances.

At the same time, I am a seasoned listener to presentations from fellow academics. During my career, I must have heard many thousands. Trained by my incisive supervisor to have a critical response up my sleeve, I decided early on always to ask a question. Which I do – almost invariably, provided that the event allows for audience participation. Preparing a range of potential questions, from a dolly to an underwater torpedo, keeps the mind focused. It’s not hard to respond to a good paper. But what’s the best way to critique a dull or weak or off-beam interpretation, without being rude or dismissive? It’s a good challenge.

Over time, the standard of papers and lectures has undoubtedly risen. People are more professional and time-keeping is much more reliable. There are things that still could get better. Talking from notes (but not reciting a list of points on a screen) is much more engaging for the audience than reading aloud from a prepared script. William Hogarth long ago indicated how boring a droned lecture-from-text can be.


William Hogarth’s Scholars at a Lecture, 1736

On the other hand, it can be hard for beginner-historians to manage without a script. They generally have to convey a great deal of factual information and quotations, which have to be accurate. So there is scope for progression. I usually recommend starting with full scripts but then, with greater experience, expanding the amount of free-speaking.

Ultimately, however, it’s not the style of an exposition but the content that counts. The two worst presentations were similar in format and outcome. Both were intended sincerely, by speakers who were so entranced by their material that they had lost sight of the need to explain it.

One was a seminar paper, given by an eminent professor of eighteenth-century political history, who decided to branch out into the history of political thought. As a first foray, it was not a success. Announced as ‘The Debate between Edmund Burke’s Conservatism and Thomas Paine’s Radicalism’, Professor Ian Christie itemised at length the differing views of these two hegemonic political thinkers. His conclusion was unequivocal. It consisted of the simple observation: ‘Well, there you are! Burke was right’. A deep silence fell. I felt very sorry for the chair. We struggled to coax a debate from the speaker. But he merely replied: ‘Well, you’ve heard Burke’s views’. The unsatisfactory session drew to an early close. Alone among those present, the speaker remained serenely happy.1

A second dreadful session was of the same ilk. A famously combative professor of the fifteenth-century English economy offered a seminar paper on ‘Continuity in History’. The title was one that I found especially attractive, since I love macro-sweep. Obviously others agreed, because crowds assembled. Tony Bridbury’s paper, however, consisted of a close exposition of the fifteenth-century history of the Paston family, buttressed by readings from the well-known Paston Letters.2  There were no new insights. We were supposed to understand that family life and the small concerns of daily existence are universal preoccupations. Even that point, however, was not stated explicitly. Nor was there any conclusion, other than a gleeful: ‘You see? Nothing changes’. The following discussion spluttered briefly but got nowhere.

A first select edition of the Paston Letters was published by John Fenn in 1787, with new edn by A. Ramsay (1849)

Was there anything that the seminar chairs could have done to retrieve these situations? Perhaps they might have organised rival groups from the audience, to argue the respective cases for and against the core propositions. That manoeuvre would have been possible in an established class, where the course director has more control over the format. In a seminar, with a changing attendance from session to session, it would have been more tricky. But worth a try. Certainly more positive than the disgruntlement that actually prevailed.

Needless to say, the seminar/lecture norm has always been much better than either of those examples. And I have heard many very good and some completely outstanding presentations. How to pick one from the pack?

My choice is a master-exposition by the historian E.P. Thompson. His first degree was actually in English at Cambridge. On this occasion, he regaled an adult education conference in Preston with a lecture which combined the English-literary technique of close-reading with a historian’s detective work and attention to context. It showcased Thompson’s distinctive style at its very best.

E.P. Thompson at Glastonbury Festival 1986, by Giacomino Parkinson, from www.glastonburyfestivals.co.uk.

In Preston, the lecture began with his quiet reading of a poem by William Blake: ‘The Garden of Love’ from The Songs of Experience.3  Thompson then launched into his analysis, entirely without notes. At the end, he recited the poem again, with added emphasis. The result was startling. In the second reading, all the meanings and allusions within the poem sprang intensely to life. It was like stepping from a monochrome world into a world of vivid colour. Whether his general exposition of Blake was sustainable remained to be tested when, later, Thompson published his Witness against the Beast: William Blake and the Moral Law.4  But, as a single lecture, it was exemplary in its entirely original mixture of literary detail and historical breadth.

The thirty-odd people who had assembled on a cold November afternoon in the mid-1980s for a routine local-history conference were challenged in true Blakeian style ‘to see the world in a grain of sand’. It was an inspiration that revealed what a great lecture can do.

1 This session, chaired by John Dinwiddy, occurred in the later 1970s. Subsequently Ian Christie (1919-98) amplified his study of the ‘intellectual repulse of revolution’ in his Ford Lectures, published as I.R. Christie, Stress and Stability: Reflections on the British Avoidance of Revolution (Oxford, 1984).

2 This session, chaired by F.J. Fisher, occurred in the early 1970s. In other contexts, A.R. Bridbury (1924- ) was happy to detect change: see variously his Economic Growth: England in the Later Middle Ages (1962; reissued Brighton, 1975); The English Economy from Bede to the Reformation (Woodbridge, 1992); and his Medieval England: Its Social and Economic Origins and Development (Leicester, 2008).

3 W. Blake, ‘The Garden of Love’, from his Songs of Experience (1794).

4 E.P. Thompson (1924-93), Witness against the Beast: William Blake and the Moral Law (Cambridge, 1993). For more on EPT and bibliographic references, see my earlier Blog/14, dated Dec. 2011.


MONTHLY BLOG 25, CHAMPIONING THE STUDY OF HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2012)

How do we champion (not merely defend) the study of History in schools and Universities? Against those who wrongly claim that the subject is not commercially ‘useful’.

Here are three recommendations. Firstly, we should stress the obvious: that a knowledge of history and an interconnected view of past and present (cause and consequence) is essential to the well-functioning not only of every individual but also of every society. The subject roots people successfully in time and place. Individuals with lost memories become shadowy, needing help and compassion. Communities with broken memories, for example through forced uprooting, exhibit plentiful signs of trauma, often handed down through successive generations. Civics as well as economics thus demands that people have a strong sense of a sustained past. That entails learning about the history of their own and other societies, in order to gain an understanding of the human condition. All knowledge comes from the past and remains essential in the present. Nothing could be more ‘useful’ than history, viewed broadly.

The second recommendation links with the first. We should define the subject as the study not of the ‘dead past’ but of ‘living history’.

In fact, there’s a good case for either usage. Historians often like to stress the many differences between past and present. That’s because studying the contrasts sets a good challenge – and also because an awareness of ‘otherness’ alerts students not simply to project today’s attitudes and assumptions backwards in time. The quotation of choice for the ‘difference’ protagonists comes from an elegiac novel, which looked back at England in 1900 from the vantage point of a saddened older man in the 1940s. Entitled The Go-Between by L.P. Hartley (1953), it began with the following words: ‘The past is a foreign country: they do things differently there’.

It’s an evocative turn of phrase that has inspired book titles.1 It’s also widely quoted, often in the variant form of ‘the past is another country’. These phrases draw their potency from the fact that other places can indeed be different – sometimes very much so. It is also true that numerous historic cultures are not just different but have physically vanished, leaving imperfect traces in the contemporary world. ‘Ancient Ur of the Chaldees is covered by the sands of southern Iraq. … And the site of the once-great Alexandrian port of Herakleion lies four miles off-shore, under the blue seas of the Mediterranean’.2

On the other hand, while some elements of history are ‘lost’, past cultures are not necessarily inaccessible to later study. Just as travellers can make an effort to understand foreign countries, so historians and archaeologists have found many ingenious ways to analyse the ‘dead past’.

There are common attributes of humanity that can be found everywhere. We all share a living human history.3 Ancient cultures may have vanished but plenty of their ideas, mathematics, traditions, religions, and languages survive and evolve. Anyone who divides a minute into sixty seconds, an hour into sixty minutes, and a circle into 360 degrees, is paying an unacknowledged tribute to the mathematics of ancient Babylon.4

So there is an alternative quotation of choice for those who stress the connectivity of past and present. It too comes from a novelist, this time from the American Deep South, who was preoccupied by the legacies of history. William Faulkner’s Requiem for a Nun (1951) made famous his dictum that:
The past is never dead. It’s not even past.

No doubt there are circumstances when such sentiments are dangerous. There are times when historic grievances have to be overcome. But, before reconciliation, it’s best to acknowledge the reality of such legacies, rather than dismissing them. As it happens, that was the argument of Barack Obama when giving a resonant speech in 2008 about America’s festering ethnic divisions.5

Historians rightly observe that history contains intertwined elements of life and death. But when campaigning for the subject, it’s best to highlight the elements that survive through time. That is not romanticising history, since hatreds and conflicts are among the legacies from the past. It’s just a good method for convincing the doubters. Since we are all part of living history, for good and ill, we all need to study the subject in all its complexity.

Thirdly and finally: historians must make common cause with champions of other subjects. Obvious allies come from the Arts and Humanities. But we should appeal especially to the Pure Sciences. They too fail to meet the test of immediate economic ‘usefulness’. There is no instant value in a new mathematical equation. No immediate gain from the study of String Theory in physics. (Indeed, some physicists argue that this entire field is turning into a blind alley).6 But the pure sciences need essential scope for creativity and theoretical innovation. Some new ideas have become ‘useful’ (or dangerous) only many years after the initial intellectual breakthrough. Others have as yet no direct application. And some may never have.

Humans, however, are capable of thinking long. It is one of our leading characteristics. So we must not be bullied into judging the value of subjects to study solely or even chiefly in terms of short-term criteria. The Pure Sciences, alongside the Arts and Humanities, must combat this blinkered approach. There are multiple values in a rounded education, combining the theoretical and the practical. In the case of History, the blend must include knowledge as well as skills. In the sciences, it must include the theoretical as well as the applied. One without the other will fail. And that in the long-term is not remotely useful. In fact, it’s positively dangerous. History confirms the long-term usefulness of the sciences. Let the scientists repay the compliment by joining those who reject crude utilitarianism – hence in turn championing the study of History.

1 Notably by David Lowenthal, The Past is a Foreign Country (Cambridge, 1983)

2 Quoting from an essay by myself, entitled ‘Cities in Time’, in Peter Clark (ed.), Oxford Handbook on Cities in World History (Oxford, forthcoming May 2013).

3 See Ivar Lissner, The Living Past (1957), transl. from German So Habt Ihr Gelebt = literally Thus Have They Lived; and my personal response in PJC Discussion-Point Nov. 2011.

4 For the social and intellectual context of Babylonian mathematics, see Eleanor Robson, Mathematics in Ancient Iraq: A Social History (Princeton, 2008).

5 For Barack Obama’s speech ‘A More Perfect Union’, delivered at Philadelphia, PA, 18 March 2008: see video on www.youtube.com.

6 See references to the usefulness or otherwise of pure maths in PJC Blog Oct. 2012.


MONTHLY BLOG 24, HISTORY AS THE STAPLE OF A CIVIC EDUCATION

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2012)

Politicians have a duty to attend to civics as well as to economics. Indeed, we all do. So talking about whether the study of History is ‘useful’ for the economy is a very partial way of approaching an essential component of humans’ collective living. We all need to be rooted in space and time. Politicians should therefore be advocating the study of History as the essential contribution to individual and social connectedness. In a word, civics in the full meaning of the term. Not just learning how to fill in a ballot paper – but learning how communities develop over time, how they cope with conflict and with conflict-resolution, and, incidentally, how they struggle to create truly fair and democratic societies.

Praise of the study of History as a means of learning essential skills is all very well. Lots of useful things are indeed achieved by this means. People learn to evaluate complex sources, to make and debate critical judgments based upon careful assessments of often contradictory evidence, and to understand continuity and change over the long term. So far, so good.

Yet it is seriously inadequate to recommend a subject only in terms of the skills it teaches and not in terms of its core content. It’s like (say) recommending learning to sing in order to strengthen the vocal cords and to improve lung capacity. Or (as the ad agency Saatchi & Saatchi notoriously did in 1988) recommending a visit to the Victoria & Albert Museum in order to enjoy a nice egg salad in its ‘ace caff’ – with some very valuable art objects attached.
By the way, so notorious has that advertisement become that it is strangely difficult to find the original image on the web. It seems to have been self-censored by both the Museum and the ad agency – probably in shame.

When recommending History, there is a crucial Knowledge agenda at stake as well as a supporting Skills agenda. Of course, the two are inextricably linked. Historical skills without historical Knowledge are poorly learned and quickly forgotten. But learning History has a greater and essential value purely in its own right. It is not ‘just’ a route to Skills but a subject of all-encompassing and thrilling importance.

All of human life is there; and all humans need access to this shared reservoir of knowledge about our shared past. People always glean some outline information by one means or another. They pick up myths and assumptions and bits and pieces from their families and communities.

But people learn more and better when they learn systematically: about the history of the country that they live in; and about the comparative history of other countries, both nearby and far away; and about how a myriad of different developments around the world fit into a long-term human history, which includes continuities as well as change.

Needless to say, these perceptions are hardly new. ‘Histories make men wise’, as Francis Bacon long ago observed. Thinkers and doers from classical Greece to Winston Churchill have agreed and recommended its study.
Why then has the subject matter of History been comparatively undervalued in recent years? It can’t just be the power of the Skills agenda and the influence of ministers fussing about every subject’s contribution to the economy.

Nor can it be that History teachers are ‘boring’ and that they teach students nothing but the dates of kings, queens and battles. Ofsted report after Ofsted report has stated otherwise. The subject is considered to be generally well and imaginatively conveyed. Moreover, the sizeable number of students choosing to take the subject, even once it has ceased to be compulsory, shows that there is a continuing human urge to understand the human past.

Nonetheless, the public reputation of History as a subject of study is currently poor. It is often dismissed as the ‘dead past’. Why should students need to know about things that have long gone? The pace of technological change in particular seems to point people ‘onwards’, not backwards. What can the experience of the older generation, who notoriously have trouble coping with shiny new gadgets, teach the adept and adaptable young?

Well, there are many answers to such rhetoric.

In the first place, things that are ‘dead’ are not necessarily lacking in interest. It is valuable to stretch the mind to learn about vanished cultures, as some indeed have. Impressively, archaeologists, historians, palaeontologists, biologists and language experts have together discovered much about the long evolution of our own species – often from the skimpiest bits of evidence. It’s a highly relevant story about adaptation and survival, often in hostile climes.

Meanwhile, there is a second answer too. It’s completely fallacious to assume that everything in the past is ‘dead’. Much – very much – survives and develops through time, to create a living history, which embraces everyone alive today. The human genome, for example, is an evolving inheritance from the past. So are the dynamic histories, languages and cultures that we have so variously created.

We need more long-term accounts of how such things continue, evolve and change over the very long term. The recent stress by historians upon close focus studies, looking at one period or great event in depth, has been fruitful. Yet it should not exclude long-term narratives. They help to frame the details and to fit the immediate complexities into bigger pictures. (My own suggestion for a secondary-schools course on ‘The Peopling of Britain’, in which everyone living in Britain has a stake, is published in the November issue of History Today).1  In sum, we all need to learn systematically – and to continue learning – about our own and other people’s histories. It’s a lifetime project, for individuals and for citizens.

• My December Blog will consider further how historians can advance the public case for studying History.

1 P.J. Corfield, ‘Our Island Stories – The Peopling of Britain’, History Today, vol. 62, issue 11 (Nov. 2012), pp. 52-3.


MONTHLY BLOG 23, WHY DO POLITICIANS UNDERVALUE HISTORY IN SCHOOLS ?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2012)

Isn’t it shocking that, in the UK, school-children can give up the study of History at the age of 14? Across Europe today, only Albania (it is claimed) shares that ignoble distinction with Britain. A strange pairing. Who knows? Perhaps the powers-that-be in both countries believe that their national histories are so culturally all-pervasive that children will learn them by osmosis. Perhaps Britons in particular are expected to imbibe with their mother’s milk the correct translation of Magna Carta?

Despite my unease at David Cameron’s embarrassing displays of historical ignorance, my complaint is not a party political one. As a Labour supporter, I’ve long been angry with successive Labour Education Ministers between 1997 and 2010, who have presided uncaringly over the long-running under-valuing of History. (Their lack of enthusiasm contrasts with continuing student demand, which indeed is currently booming).

For critics, the subject is thought to focus myopically upon dates, and upon kings, queens and battles. Students are believed to find the subject ‘boring’; ‘irrelevant’; ‘useless’. How can learning about the ‘dead past’ prepare them for the bright future?

New Labour, born out of discontent with Old Labour, was too easily tempted into fetishing ‘the new’. For a while, the party campaigned under a vacuous slogan, which urged: ‘The future, not the past’. Very unhistorical; completely unrealistic. It’s like saying ‘Watch the next wave, forget about the tides’. Yet time’s seamless flow means that the future always emerges from the past, into which today’s present immediately settles.

It seems that the undervaluing of studying the past stems from a glib utilitarianism. Knowledge is sub-divided into many little pieces, which are then termed economically ‘useful’ or the reverse. Charles Clarke as Labour Education Minister in 2003 summed up this viewpoint. He was reported as finding the study of Britain’s early history to be purely ‘ornamental’ and unworthy of state support. In fact, he quickly issued a clarification. It transpired that it was the ‘medieval’ ideal of the university as a community of scholars that Clarke considered to be obsolescent, not the study of pre-Tudor history as such.1

Yet this clarification made things worse, not better. Clarke had no sympathy for the value of open-ended learning, either for individuals or for society at large. The very idea of scholars studying to expand and transmit knowledge – let alone doing so in a community – was anathema. Clarke declared that Britain’s education system should be designed chiefly to contribute to the British economy. It was not just History, he implied, but all ‘unproductive’ subjects that should be shunned.

The well-documented reality that Britain’s Universities have an immensely positive impact upon the British economy2 was lost in the simplistic attempt to subdivide knowledge into its ‘useful’ and ‘useless’ components.

By the way, it’s this sceptical attitude which has pressurised the Universities, much against their better judgement, into the current Research Excellence Framework’s insistence on rating the economic impact of academic research. An applied engineer’s treatise on How to Build a Bridge counts as obviously ‘useful’. But a pure mathematician’s proof of a new theorem seems ‘pointless’.

How does contempt for learning originate in a political party whose leaders today are all graduates? It seems to stem from an imaginary workerism. Politicians without ‘real’ working-class roots invoke a plebeian caricature, as a sort of consolation – or covert apology. Give us the machine-tools, and leave effete book-learning for the toffs! They can waste their time chatting about ancient Greece, but we can build a locomotive.

Illustration 1: The male world of skilled railway engineering, proudly displayed in a 1937 poster from Crewe © National Railway Museum, 2012

Such attitudes, however, betray the earnest commitment of the historic Labour movement to the value of learning. The Chartists in the 1830s, the Mechanics’ Institutes, the Workers’ Educational Association, the trade unions’ educational programmes, the great tradition of working-class autodidacts, the campaigns for improved public education, up to and including Labour’s creation of the Open University in the 1960s: all have worked to extend education to the masses.

Illustration 2: Mechanics’ Institutes, like this 1860 edifice in the textile mill-town of Marsden, West Yorkshire, offered education to Britain’s unschooled workers. While not all had the time or will to respond, the principle of adult education was launched. In Marsden this fine landmark building was saved from demolition by local protest in the 1980s and reopened, after restoration, in 1991. © English Heritage 2012

No doubt, educational drives require constant renewal. In Britain from 1870 onwards, the state joined in, initially legislating for compulsory education for all children to the age of 10. And globally, similar long-term campaigns are working slowly, as education reforms do, to banish all illiteracy and to extend and deepen learning for all. It’s a noble cause, needed today as much as ever.

Knowledge meanwhile has its own seamless flow. It doesn’t always advance straightforwardly. At times, apparently fruitful lines of enquiry have turned out to be erroneous or even complete dead ends. Many eighteenth-century scientists, like the pioneer Joseph Priestley, wrongly believed in the theory of ‘phlogiston’ (the fire-principle) to explain the chemistry of combustion and oxidisation. Nonetheless, from the welter of speculation and experimentation came major discoveries in the identification of oxygen and hydrogen.3 Today, it may possibly be that superstring theory, which holds sway in particle physics, is leading into another blind alley.4 But, either way, it won’t be politicians who decide. It’s the hurly-burly of research – speculation, experiment, debate, and continuing cross-testing – that will adjudicate.

There’s an interesting parallel for History in the long-running debates about the usefulness of knowledge within mathematics. The ‘applied’ side of the subject is easy to defend, as constituting the language of science. ‘Pure’ maths, on the other hand …? But divisions between the abstract and the applied are never static. Some initially abstruse mathematical formulations have had major applications in later generations. For example, the elegant beauty of Number Theory, originally considered the height of abstraction, did not stop it from later being used for deciphering codes, in public-key cryptography.5
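
By way of illustration only, here is a minimal sketch (not drawn from the works cited in note 5) of the number-theoretic operation at the heart of public-key schemes such as Diffie-Hellman key exchange: modular exponentiation with a prime modulus. The numbers below are toy values chosen for demonstration; real systems use enormous primes.

```python
# Illustrative sketch only: modular exponentiation, the number-theoretic core
# of public-key cryptography, shown via a toy Diffie-Hellman key exchange.
# The tiny numbers are for demonstration; real systems use very large primes.

def power_mod(base: int, exponent: int, modulus: int) -> int:
    """Compute (base ** exponent) % modulus by repeated squaring."""
    result = 1
    base %= modulus
    while exponent > 0:
        if exponent & 1:                       # is the lowest bit of the exponent set?
            result = (result * base) % modulus
        base = (base * base) % modulus
        exponent >>= 1
    return result

p, g = 23, 5              # public prime modulus and generator (toy values)
a, b = 6, 15              # private keys held by the two parties
A = power_mod(g, a, p)    # public value sent by the first party
B = power_mod(g, b, p)    # public value sent by the second party

# Both parties derive the same shared secret without ever exchanging a or b.
assert power_mod(B, a, p) == power_mod(A, b, p)
```

The wider point is simply that modular arithmetic, once pursued by number theorists purely for its own sake, now underpins everyday encrypted communication.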

On the other hand, proof of the infinity of primes has (as yet) no practical application. Does that mean that this speculative field of study should be halted, as ‘useless’? Of course not.

My argument, in pursuing the ‘usefulness’ debates, seems to be drifting away from History. But not really. The mind-set that deplores the ‘useless’ Humanities would also reject the abstraction of the ‘pure’ sciences. But try building a functioning steam locomotive, without any knowledge of history or of formalised mathematics or of the science of mechanised motion, let alone the technology of iron and steel production. It couldn’t be done today. And we know from history that our ever-inventive ancestors didn’t do it in the Stone Age either.

1 Charles Clarke reported in The Guardian, 9 May 2003, with clarification in later edition on same date.

2 The Higher Education Funding Council for England (HEFCE) commissioned an independent report, which calculated that Britain’s Universities contributed at least £3.3bn to UK businesses in the 2010-11 academic year, as part of a much wider economic impact, both direct and indirect: see www.hefce.ac.uk/news/newsarchive, 23 July 2012.

3 J.B. Conant (ed.), The Overthrow of Phlogiston Theory: The Chemical Revolution of 1775-89 (Cambridge, Mass., 1950).

4 For criticisms, see L. Smolin, The Trouble with Physics: The Rise of String Theory, the Fall of a Science, and What Comes Next (New York, 2006); and P. Woit, Not Even Wrong: The Failure of String Theory and the Search for Unity in Physical Law (2006).

5 See the debates prompted by G.H. Hardy’s case for abstract mathematics in his A Mathematician’s Apology (1940); and ‘Pure Mathematics’ on Wikipedia, en.wikipedia.org.

  • My November Blog will discuss the relevance of History not only for economics but also for civics.
  • And my December Blog will consider how to ensure that all students study History to the age of 16.


MONTHLY BLOG 22, TO TRUST OR NOT TO TRUST?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2012)

When choosing Blog topics, I draw upon my professional experience as an academic historian and upon my grass-roots life as a long-term party activist and former Labour councillor. But today’s theme, to trust or not to trust, comes from both fields of endeavour. Can society trust people? Should we? How do we strike a balance between a lack of regulation, which may easily cloak fraud or incompetence, and an excess of petty regulation of individuals?

A culture of universal suspicion is bad for communal living. Trust is easy to lose, hard to build.

OK, it seems clear that big institutions do need to be audited regularly. Their intricate structures and wide-ranging responsibilities are otherwise too difficult for outsiders to assess. Depressingly, the impetus for such inspection always seems to come from some scandalous incompetence or crime.

Nonetheless, society should not lurch from excessive under-regulation to excessive over-regulation, especially when it comes to institutions regulating the actions of their own staff. Then it seems that the culture of suspicion has just been imported in order to let superiors tyrannise those below them in their local hierarchy, without actually controlling those at the top. What about some due proportionality?

I have two immediate examples of attempts at petty regulation. The first was foiled. It came from the examinations department of an ancient University, where I was the external examiner a few years ago. We were abruptly informed that we had to tick every page of every script, as proof that we had actually read the essays which we were supposed to be marking. But the instruction was simultaneously offensive and utterly pointless. A tick would prove only that the page had been ticked, not that its contents had been duly read and considered.

Examiners may well feel a sense of exhaustion when confronting their annual tasks. But infantilising the teaching workforce by imposing distracting and pointless extra requirements is the reverse of helpful.
Did the ancient University really lack trust in its own staff and its invited external examiner? In this case, common sense prevailed; and, after a protest, the instruction was withdrawn.

This case was, however, all too typical of the excess rules (often imposed abruptly and later altered as abruptly) that try to stipulate how academics should do their jobs. The motive seems to be the urge for control by middle management – and the result is cynicism and secret evasion.

The second example has just come into my in-tray. It is a bright idea from the Labour Party, but it might come from any political organisation. The aim is to control and monitor those who stand for office (whether local, national or European) by asking them to sign a quasi-legal contract. Of course, it’s essential to let candidates know what’s expected of them, in terms of attendance at meetings, responding to the electorate, managing publicity, canvassing and so forth. But signing a quasi-legal contract? Who is to monitor it? And who is to enforce it?

It’s the sort of spurious legalism that got Nick Clegg into so much trouble over his signed pledge not to raise University tuition fees.
Unfairness is written into the proposed contract from the start, by asking the elected members to attend a specified percentage of all public meetings in their constituencies. Those whose political patches contain many residents’ associations, neighbourhood watches, and other local gatherings will be required to jump over a much higher hurdle than those in sleepy Clochemerles, where nothing happens.

Judging by percentages leaves out all discretion on the part of the councillors, MPs, MEPs and the rest. It means mathematicising the non-mathematical; standardising what should be un-standardised; and removing spontaneity and good judgement from what should be the core of civic commitment.

Down with phoney legalism. It’s up to political parties to choose good candidates, and then for electorates to judge them. Yet what has been proposed is the ungracious folly of unenforceable quasi-contracts, made and adjudged by the candidates’ own political parties.

But – no! This sort of petty monitoring should be rejected. There are far more important and urgent problems facing politicians today than worrying over whether they have attended the right percentage of neighbourhood watch meetings this year. Trust is earned by good deeds not by percentage-pledges.


MONTHLY BLOG 21, HISTORICAL PERIODISATION – PART 1

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2012)

It was fascinating to meet with twenty-three others on a humid June afternoon to debate what might appear to be abstruse questions of Law & Historical Periodisation. We were attending a special conference at Birkbeck College, London University – an institution (founded in 1823 as the London Mechanics Institute) committed as always to extending the boundaries of knowledge. The participants came from the disciplines of law, history, philosophy, and literary studies. And many were students, including, laudably, some interested undergraduates who were attending in the vacation.

At stake was not the question of whether we can generalise about different and separate periods of the past. Obviously we can and must to some extent. Even the most determined advocate of history as ‘one and indivisible’ has to accept some sub-divisions for operative purposes, whether in terms of days, years, centuries or millennia.

But the questions really coalesce around temporal ‘stages’, such as the ‘mediaeval’ era. Are such concepts relevant and helpful? Is history rightly divided into successive stages? And do they follow in regular sequence in different countries, even if at different times? Or is there a danger of reifying these epochs – turning them into something more substantive and distinctive than was actually the case?

Studies like H.O. Taylor’s The Medieval Mind (1919 and many later edns), Benedicta Ward’s Miracles and the Medieval Mind (1982), William Manchester’s The Medieval Mind and the Renaissance (Boston, 1992), and Stephen Currie’s Miracles, Saints and Superstition: The Medieval Mind (2006), all imply that there were common properties to the mind-sets of millions of Europeans who lived between (roughly) the fifth-century fall of Rome and the fifteenth-century discovery of the New World – and that these mind-sets differed sharply from the ‘modern mind’. Yet are these historians justified in choosing this formula within their titles? Or partly justified? Or absolutely misleading? Are there common features within human consciousness and experiences that refute these periodic cut-off points? Or do we want to go to the other end of the spectrum, to endorse the view of those Evolutionary Psychologists who aver that human mentalities have not changed since the Stone Age? Forever he, whether Tarzan, Baldric or Kevin? Forever she, whether Jane, Elwisia or Tracey?

Two papers by Kathleen Davis (University of Rhode Island) and Peter Fitzpatrick (Birkbeck College) formed the core of the conference, both focusing upon the culture of jurisprudence and its standard definition of the medieval. Both gave stimulating critiques of conventional legal assumptions, based upon stark dichotomies. In bare summary, the ‘medieval’ is supposed to be Christianised, feudal, and customary, while the ‘modern’ is supposedly secular, rights-based, and centred upon the sovereign state. For good measure, the former is by implication backward and oppressive, while the latter is progressive and enlightened. Yet the long history of legal pluralism goes against any such dichotomy in practice. Historians like Helen Cam, who in 1941 wrote What of Medieval England is Alive in England Today?, would have rejoiced at these papers, and at the sharp questions from the conference participants.

For my part, I was asked to give a final summary, based upon my position as a critic of all simple stage theories of history.1 My first point was to stress again how difficult it is to rethink periodisation, because so many cardinal assumptions are built not only into academic language but also into academic structures. Many specialists name themselves after their periods – as ‘medievalists’, ‘modernists’ or whatever. Those who call themselves just ‘historians’ are seen as too vague – or suffering from folie de grandeur. There are mutterings about the fate of Arnold Toynbee, once hailed as the twentieth century’s greatest historian-philosopher – now virtually forgotten. Academic posts within departments of History and Literary Studies are generally defined by timespans. So are examination papers; many academic journals; many conferences; and so forth. Publishers in particular, who pay great attention to book titles, often endorse traditional nomenclature and stage divisions.

True, there are now increasing calls for change. My second point therefore highlighted the new diversity. Conferences and seminars are held not only across disciplinary boundaries but also across epochal divisions. An increasing number of books are published with unusual start and end dates; and the variety of dates attached to the traditional periods continues to multiply, often confusingly. In addition, some scholars now study ‘big’ (long-term) history from the start of the world, or at least from the start of human history. Their approaches do not always manage to avoid traditional schema, but the aim is to encourage a new diachronic sweep. And other pressures for change are coming from scholars in new fields of history, such as women’s history or (not the same thing) the history of sexuality.

Shedding the old period terminology is mentally liberating. So the Italian historian Massimo Montanari, previously a ‘medievalist’, wrote in 1994 of the happiness that followed his discarding of all the labels of ‘ancient’, ‘medieval’ and ‘modern’: ‘In the end, I felt freed as from a restrictive and artificial scaffolding …’2

Lastly, then, what of the future? The aim is not to replace one set of period terms and dates with another. Any rival set would run into the same difficulties of detecting precise cut-off points, and the same risk of stereotyping the different cultures and societies on either side of a period boundary. Such boundary-drawing is another example of dichotomous thinking, which glosses over the complexities of the past. Above all, stage theories fail to incorporate the elements of deep continuity within history (see my November 2010 discussion-point).

We need a new way of thinking about the intertwining of persistence and change within history. It is chiefly a matter of understanding. But it will also entail a change of language. I don’t personally endorse the Foucauldian view that language actually determines consciousness. For me, primacy in the relationship is the other way round: a changing consciousness can ultimately change language. Yet I do recognise the confining effects of existing concepts and terminology upon patterns of thought. Such an impact is another example of the power of continuity. With several bounds, however, historians can become free. With a new language, we can talk about epochs and continuities, intertwined and interacting in often changing ways. It’s fun to try, and fun too to try to convince others. Medievalists, arise. You have nothing to lose but an old name, which survives through inertia. There are more than three steps between ‘ancient’, ‘middle’ and ‘modern’, even in European history – let alone around the world. Try a different name to shake the stereotypes. And tell the lawyers too.

1 P.J. Corfield, Time and the Shape of History (2007); and P.J. Corfield, ‘POST-Medievalism/Modernity/Postmodernity?’, Rethinking History, vol. 14/3 (Sept. 2010), pp. 379-404; also available on the publisher’s website (Taylor & Francis), www.tandfonline.com, and on my personal website, www.penelopejcorfield.co.uk.

2 M. Montanari, The Culture of Food, transl. C. Ipsen (Oxford, 1994), p. xii.
